Higher Learning Commission

Dancing Backwards in High Heels: Small Institutions and Criterion 5.D.

Donna J. Lewis, Barbara D. Wright and Letha B. Zook

“Ginger Rogers did everything [Fred Astaire] did—only backwards and in high heels” (Thaves 1982). Small institutions can identify with Ms. Rogers, as they have the same operational, reporting and accreditation requirements as their larger counterparts but must carry them out with fewer human and other resources. Small institutions are too often “people poor.” That fact was brought home forcefully to the institutional self-study team at the University of Charleston (UC) when the Higher Learning Commission (HLC) published its revised Criteria for Accreditation in February 2012. The self-study team struggled with determining how to comply with the new Criterion 5.D., which mandated that the institution provide documentary evidence that it “works systematically to improve its performance” within the constraints of limited human and financial resources.

As is often the case at small institutions, many UC administrators have responsibility for multiple functions, and some are one-person departments. These individuals had little enthusiasm for initiating a process requiring an additional report and even less enthusiasm for the idea that their departments’ performance would be “judged” by others. No administrative office was willing to take responsibility for overseeing a new evaluation process or providing administrative support for the effort.

Faced with these challenges, the self-study team turned to the academic side of the institution for help implementing a new administrative assessment process. The assistant dean for Assessment, who was also co-chair of the institutional self-study team, was tasked with oversight of implementation using some guiding precepts:

  • Model the new process after the established process for assessing academic programs, adapting existing report templates and scoring rubrics for the administrative process.
  • Restrict the scope of the required report to a limited number of core departmental “outcomes” or “goals” to make the reporting task less burdensome.
  • Encourage departments to present data already being collected as evidence of performance.
  • Use the institution’s Chalk & Wire electronic portfolio system as a document depository.
  • Recruit current or former members of the Academic Assessment Committee to volunteer their time as an ad hoc Quality Assurance Committee to evaluate the reports.

Initially, administrative and academic support departments were asked only to submit a plan for assessment. The report template for the plan asked for the following information:

  • Departmental mission statement (must align with the institutional mission)
  • Department description (Identify key functions and list any quality indicators that can be supported with evidence, e.g., clean audits or an exemplary safety record.)
  • Department outcomes/goals (Goals must be measurable and suitable for trending over time. At least one goal must be focused on customer service or student retention.)
  • Data collection methods (Identify how and when progress toward goal attainment is measured and by whom.)
  • Employee credentials (evidence that departmental staff are appropriately qualified)
  • Vision for the future (including an analysis of strengths, weaknesses, opportunities and threats)

In subsequent years departments were asked to submit quality assurance (QA) reports. The QA reports document data collection and describe how the data are being used to make improvements to operations. Elements included in the QA report but not required in the plan document include the following:

  • Results of data collection and analysis of those results
  • Response to data analysis—documentation of changes made in response to data or to feedback from the Quality Assurance Committee
  • Effectiveness of changes made in the previous year
  • Plan for continuity of functions (Are employees cross-trained? Are procedure manuals in place?)

Reports are scored using a weighted rubric. Results are communicated to department heads and the vice president responsible for each department using a standard feedback sheet. A table reporting scores for all departments, selected feedback comments and identified resource needs is communicated to the board of trustees and the cabinet.

The QA process has proved itself useful for documenting evidence beyond that needed for Criterion 5.D. The QA report to the board of trustees and cabinet communicates resource needs for consideration in the budget process as required in Core Component 5.C.2. The questions about continuity of functions were initially included to identify and address areas of institutional risk related to employee turnover, a common problem at small institutions that may have only one or two staff members in a department. However, that question and the requirement to document employee credentials also provide evidence needed for Criterion 5.C. (institutional planning) and 3.C.6. and 5.A.4. (employee credentials).

The QA process is maturing each year, with better rubric language and more precise instructions. In fall 2015, UC formally institutionalized the QA process by creating a standing Quality Assurance Committee. Chaired by the assistant dean for Assessment, the committee membership includes key administrators, a vice president and a faculty officer. UC’s four vice presidents will rotate membership, serving a two-year term. Including a faculty officer is expected to improve understanding and communication between faculty members and administrators.

UC administrators are beginning to see value in the QA process. The process serves as an important institutional vehicle for publicly celebrating the accomplishments of the institution’s employees and offices. For example, UC’s Financial Aid Office documented significant decreases in cohort loan default rates, resulting in UC’s default rate being one-third that of the West Virginia average and one-half the national average. Facilities Services documented a significant increase in the tons of waste material sent for recycling. These accomplishments were noted in the report sent to the board of trustees and will be noted in the university’s Annual Assessment Report. In the past, these achievements would likely have been known to only a few people on campus.

The QA process has also resulted in departments identifying and eliminating tasks or initiatives that offer little return on their investment of time or money, streamlining operations or documenting resource needs. The Office of Advancement, for example, identified a particular event that consumed human and financial resources each spring but did not attract enough new donors to justify the investment.

In fall 2015 the time commitment for QA Committee members was approximately 25 hours. As department heads become more skilled at assessing their operations and collecting data, report preparation is taking less time. The director of the Academic Success Center commented that it took only 30 minutes to complete her 2015 report.

Lessons Learned

In December 2015, UC completed its fourth quality assurance cycle. The process is not yet perfect and may never be, as important lessons are being learned in the course of each cycle. Some of the more important advice UC can give to other institutions considering implementing a similar process includes the following points:

  • Avoid making the self-study team responsible for implementing the new initiative. In the minds of many at UC, the administrative assessment process became conflated with the collection of evidence for the institutional self-study, and those individuals were then surprised to be asked for a report in succeeding years.
  • Get full support for the process from institutional leadership at the beginning. UC’s initiative was implemented during a time of massive institutional change that demanded the full attention of the president and his cabinet. Though senior leadership recognized the need for the process to document continuous improvement, they had little interest in asking employees in their areas to undertake one more task. The self-study team therefore assumed responsibility for the process while having no authority to compel compliance. It was not until the third cycle, when a cabinet-level person joined the ad hoc committee, that all departments complied with the process.
  • Adapting the existing academic assessment process worked well for UC. The logistics were familiar—processes and documents did not have to be built from scratch. Because the academic and administrative processes are now parallel, they create a solid framework for assessing institutional effectiveness. In fact, the Academic Assessment Committee and the newly minted Quality Assurance Committee are now actually sub-committees of an Institutional Effectiveness Committee, which will meet at least annually to review all institutional assessment results.
  • Language matters. Documents adapted from the academic program assessment process initially contained a great deal of academic jargon that proved to be an impediment to implementation in the first cycle. For example, during that first year, the process was called “Administrative Assessment.” Assessment was not an everyday term for many department heads. Changing the name to “Quality Assurance” in the second cycle made the process sound less arcane. Changing the word outcomes to goals also had a positive effect.
  • Everyone is busy, all year round. There is no good time to implement the process and no reporting period that works well for all departments. No matter when reports are due, there will be at least one department that will have to deal with the issue of timing.
  • Emphasize the positives. Departments did not like the concept of being “judged.” Help them see the QA process as an opportunity to brag about departmental successes, identify ways to streamline and improve operations and bring their department’s needs to the resource allocation table.
  • Be nice. Word constructive feedback carefully, and be sure to include positive feedback wherever possible. Recognize accomplishments.
  • Take advantage of the process to document evidence for other HLC Criteria.

The University of Charleston is satisfied that the process put in place meets the demands of Criterion 5.D. and is realistically sustainable with current human and financial resources. UC also believes that similar opportunities to adapt existing processes exist at other small institutions and should be considered as a means of documenting continuous operational improvement.

Reference

Thaves, R. 1982. Fred Astaire Film Festival (cartoon). Frank & Ernest. Kansas City, MO: Newspaper Enterprise Association, May 3. http://www.frankandernest.com/search/index.php?kw=astaire&property=Frank+and+Ernest&id=13&opt=jqvxclneswf&css=http%3A%2F%2Fwww.frankandernest.com%2Fincludes%2Fiframe&fb_ref=888&submit=Search.


About the Authors

Donna J. Lewis is Assistant Dean for Assessment, Barbara D. Wright is Associate Dean for Curriculum, and Letha B. Zook is Provost at University of Charleston in West Virginia.


Copyright © 2017 - Higher Learning Commission

NOTE: The papers included in this collection offer the viewpoints of their authors. HLC recommends them for study and for the advice they contain, but they do not represent official HLC directions, rules or policies.

