Higher Learning Commission

2015 Collection of Papers

“Assessment’s Not My Job!”: Engaging Staff in Cocurricular Assessment

Erika Beseler Thompson and Jeremy Penn

One of the most popular segments on National Public Radio’s weekly quiz show “Wait Wait . . . Don’t Tell Me!” is called “Not My Job,” in which the host interviews a celebrity and asks a series of questions on a topic about which the celebrity presumably has little to no knowledge. For example, former CIA Officer Robert Baer was asked questions about actual bears; country singer Dale Watson was asked about Sherlock Holmes; and Daniel Radcliffe, known for playing Harry Potter, was asked about Chia Pets (hairy pottery). Too often the work of cocurricular assessment feels like “Not My Job,” in which faculty and staff members—experts in teaching, programming, and working with students—struggle to understand how to assess cocurricular learning.

There is an increasingly urgent call in higher education to provide evidence that cocurricular activities contribute to student learning in order to answer demands for accountability, justify the use of scarce resources, and meet the needs of a diversifying student body (Bresciani 2012; Bresciani, Gardner, and Hickmott 2009; Grund 2010; Schuh and Gansemer-Topf 2010; Suskie 2009). There is also a documented need to increase the capacity of Student Affairs staff members to plan, conduct, and use assessment. Banta and Associates (2002) noted that many Student Affairs practitioners lack experience in applying assessment methods to evaluate and improve programs. Furthermore, Seagraves and Dean (2010) found that a lack of training among Student Affairs practitioners and faculty members hinders assessment efforts at many institutions. VanDerLinden and Cope’s presentation at the 2014 Higher Learning Commission (HLC) Annual Conference reviewed assurance arguments written by Pathways pilot institutions for Criteria 3.E and 4.B of the HLC Criteria for Accreditation. They found that several institutions counted the number of clubs, participants, and so on, but did not provide evidence that these activities were actually supporting student learning. They concluded that much work remained to be done in this area and that opportunities existed for institutions to improve assessment of cocurricular programs. To clearly address the accreditation Criteria related to cocurricular assessment (specifically, Criteria 3.E and 4.B) and to ensure that effective assessment of cocurricular programs and services is taking place, institutions must be prepared to build a culture of evidence by communicating the importance of assessment activities and providing developmental opportunities to increase assessment competency among Student Affairs staff members.

This study was intended to (a) determine the current levels of expertise and support that exist for assessment activities among Student Affairs staff members at North Dakota State University (NDSU), (b) determine assessment-related attitudes and beliefs held by staff members, and (c) identify topic areas, delivery mode options, and particular audiences for additional assessment-related professional development opportunities.

Student Affairs Assessment Competency Areas

To identify the professional knowledge, skills, and attitudes expected of individuals in the field of Student Affairs, several professional organizations have defined professional competency areas for Student Affairs practitioners. These organizations include the National Association of Student Personnel Administrators (NASPA), the American College Personnel Association (ACPA), and the Council for the Advancement of Standards in Higher Education (CAS), among others. The most comprehensive and widely endorsed list of competencies is a set of standards developed jointly by ACPA and NASPA in 2010. The Professional Competency Areas for Student Affairs Practitioners (National Association of Student Personnel Administrators and American College Personnel Association 2010) provides an overview of three levels (basic, intermediate, and advanced) for 10 different competency areas. For the assessment, evaluation, and research competency, these levels have been operationalized to indicate specific knowledge, skills, and attitudes held by staff members at each level. At the basic level, practitioners understand the basic terminology of assessment, can facilitate data collection for their department or program, and understand the connection of assessment to learning outcomes and goals. At the intermediate level, practitioners design data collection efforts, contribute to colleagues’ understanding of the relationship of assessment to learning outcomes and goals, design surveys and other assessment instruments, and help create a departmental climate that supports assessment efforts. At the advanced level, practitioners lead the conceptualization and design of assessment efforts, including the use and dissemination of findings; create the expectation that assessment is central to professional practice; and effectively use assessment results in institutional decision making. These three levels guided instrument development to help determine the current level of competency among Student Affairs staff members for planning, implementing, and using assessment.

Instrument Development

The survey used in this study was intended to assess the competency level of Student Affairs staff members for various assessment activities, investigate staff members’ attitudes and beliefs related to assessment, and determine the topic areas, delivery mode options, and particular audiences for additional assessment-related training. The survey was divided into three sections: Assessment Experience and Opinions, Professional Development, and Professional Experience.

The Assessment Experience and Opinions questions focused primarily on levels of responsibility for assessment activities, levels of engagement in specific assessment activities, and attitudes and beliefs related to assessment as a whole. The assessment activities list and the levels of engagement were developed to align with the descriptions of the levels of assessment competency developed by NASPA and ACPA (2010). The attitude and belief questions were designed to determine the support for and attitudes toward assessment that exist among staff members and the overall culture for assessment in the division. The need to determine the overall culture for assessment is supported by Suskie (2004, 55), who stated:

Faculty and staff resistance is perhaps the major obstacle to the successful implementation of assessment programs, so an important step in planning an assessment program is to evaluate the campus culture for assessment and take appropriate steps to ensure a climate that promotes assessment.

The Professional Development section included questions about participants’ interest in training on various assessment topics and their preferred training formats. Finally, the Professional Experience section included questions about the number of years participants had worked in Student Affairs and whether they supervise full-time staff, graduate students, or undergraduate students. These questions were intended to identify whether staff members with longer tenures in Student Affairs, or those who supervise others, hold different competencies or attitudes related to assessment.

Results

In spring 2014, the Student Affairs Assessment Experiences, Expertise and Needs survey was administered electronically to Student Affairs staff members (n = 229) in three job bands: executive (n = 19), professional (n = 103), and paraprofessional/office support (n = 107). Response rates varied by job band: 89 percent for executive, 56 percent for professional, and 39 percent for paraprofessional/office support.
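
As a quick illustration of the arithmetic behind these figures, the short Python sketch below back-calculates the approximate number of respondents in each job band from the invitation counts and reported response rates. The respondent counts it prints are rounded estimates implied by the reported rates, not figures taken from the survey data itself.

```python
# Illustrative only: back-calculate approximate respondent counts per job
# band from the invitation counts and reported response rates above.
# The counts printed are rounded estimates, not published survey data.

invited = {
    "executive": 19,
    "professional": 103,
    "paraprofessional/office support": 107,
}
reported_rate = {
    "executive": 0.89,
    "professional": 0.56,
    "paraprofessional/office support": 0.39,
}

for band, n_invited in invited.items():
    # Implied respondents, rounded to the nearest whole person
    n_respondents = round(n_invited * reported_rate[band])
    rate = n_respondents / n_invited  # recompute the rate as a check
    print(f"{band}: ~{n_respondents} of {n_invited} invited ({rate:.0%})")

# Approximate output:
#   executive: ~17 of 19 invited (89%)
#   professional: ~58 of 103 invited (56%)
#   paraprofessional/office support: ~42 of 107 invited (39%)
```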

Assessment Attitudes and Beliefs

Results indicated that wide support for assessment exists and that staff members across all three job bands are engaged in assessment. Respondents shared the beliefs that assessment is everyone’s responsibility and that the primary purpose of assessment is to determine the impact of programs and services on student learning. Some respondents, however, particularly those in the executive band, reported concerns about not having enough time to conduct assessment, as well as beliefs that assessment results were not used as much as they could be.

Level of Engagement

The level of engagement in assessment activities varied by position, with those in executive/administrative positions reporting the highest levels of engagement, followed by professional staff and then paraprofessional staff. Staff members reported confusion about whether assessment is part of their official job duties, and a smaller-than-expected share of professional staff (38 percent) reported that assessment was among their official responsibilities. Furthermore, despite the overall positive attitude toward assessment, we were surprised to find that only 38 percent of staff members who engaged in assessment activities felt “prepared” or “very prepared” to meet the assessment responsibilities of their current position.

Staff Development: Needs and Interests

Across all job bands, the highest level of interest in professional development related to assessment was in using assessment results for program and service improvement. This finding was strongly supported by themes in the qualitative responses, especially among professional and paraprofessional staff members, who expressed a desire to see assessment results used to shape decisions and drive quality improvement. Regarding training format, 60- to 90-minute sessions were the most frequently preferred option, while full-day workshops, assessment conferences, and online videos were unpopular.

Recommendations

Study results indicated that additional communication and professional development related to assessing cocurricular learning were needed. It was also clear that too much of the work of assessment was being done by too few people and that many of those involved in cocurricular assessment did not feel sufficiently prepared. The following recommendations result from the study findings:

  • Consider implementing a similar study on your campus to identify relevant training topics and attitudes and beliefs regarding assessment. This study helped us focus training sessions on the use of assessment results based on reported attitudes and identified interest areas.
  • Use tailored approaches for assessment-related staff development, depending on the needs of various staff members. At NDSU, we created an intensive Assessment Academy for professional staff members to develop and implement their own assessment projects, while paraprofessional staff members were provided with short, monthly assessment workshops better suited to their role in the assessment process. Online modules and “just-in-time” training materials are also being developed and made available on the Student Affairs Assessment website.
  • Include assessment responsibilities in job descriptions. Our directors were encouraged to look closely at job descriptions and consider where assessment responsibilities should be officially included. This helps avoid those “Not My Job” conversations when it comes to conducting assessment.

REFERENCES

Banta, T. W., and Associates. 2002. Building a scholarship of assessment. San Francisco, CA: Jossey-Bass.

Bresciani, D. 2012. Time for an honest look in the mirror. Leadership Exchange 10 (3): 40.

Bresciani, M., M. Gardner, and J. Hickmott. 2009. Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs. Sterling, VA: Stylus Publishing.

Grund, N. 2010. Mapping the future of student affairs: Task force highlights, opportunities, and challenges. Leadership Exchange 8 (1): 11–15.

National Association of Student Personnel Administrators (NASPA) and American College Personnel Association (ACPA). 2010. ACPA and NASPA professional competency areas for student affairs practitioners. Washington, DC: Authors.

Schuh, J., and A. Gansemer-Topf. 2010. The role of student affairs in student learning assessment. NILOA Occasional Paper No. 7. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Seagraves, B., and L. Dean. 2010. Conditions supporting a culture of assessment in student affairs divisions at small colleges and universities. Journal of Student Affairs Research and Practice 47 (3): 307–324.

Suskie, L. 2004. Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.

_____. 2009. Assessing student learning: A common sense guide, 2nd ed. San Francisco, CA: Jossey-Bass.

VanDerLinden, K., and M. Cope. 2014. Assessing the co-curriculum: Implications for addressing core component 3.E. and 4.B. Presentation at the Higher Learning Commission Annual Conference, Chicago, IL.


Note: A full summary of results and a copy of the survey instrument are available at http://www.ndsu.edu/vpsa/assessment.

About the Authors

Erika Beseler Thompson is Associate Director, Student Success Programs, and Jeremy Penn is Director, Student Affairs Assessment, at North Dakota State University in Fargo.

Copyright © 2019 Higher Learning Commission

NOTE: The papers included in this collection offer the viewpoints of their authors. HLC recommends them for study and for the advice they contain, but they do not represent official HLC directions, rules or policies.

