The Academy for Assessment That Matters: Baking Assessment into the Department Culture

Scott F. Oates, Ph.D.
Director, Academic Integrity and Assessment
Virginia Commonwealth University


The Academy for Assessment That Matters is a nine-month program for academic departments that are ready to improve their assessment plans and practices. Assessment that matters is assessment designed to catalyze productive change in student learning within a degree program.[1] Departments and degree programs join the Academy through an application process in which they demonstrate readiness to transform assessment from a practice of compliance into a meaningful practice of inquiry into how well the curriculum supports student learning.

Funded by the Office of the Provost, the Academy provides teams with instruction, leadership, follow-up support, and institutional recognition for developing and implementing a project to build a meaningful and sustainable practice of assessment. The Academy convenes for four days in August, during which each team develops its assessment project and plans for implementing and evaluating it. For the remaining months, Academy activities include monthly progress reports; a mid-year summit to report on lessons learned and on adjustments to the project; and, in May, a poster session and hosted lunch with invited guests from departments, deans’ offices, and academic affairs leadership. At this final meeting, teams present their project goals, activities, and outcomes, and, most importantly, what they have learned about building and sustaining meaningful assessment practices.[2] Deliverables for the teams include a poster, a narrative case study, and any assessment protocols developed during the year.

The focus of the Academy workshop in August is on developing and planning a useful, efficient, and sustainable practice of assessment within the department or degree program. Toward this end, teams prepare a SWOT analysis (strengths, weaknesses, opportunities, threats) of authentic assessment in their programs; the findings of the SWOT analysis yield the information from which the year-long projects emerge. To develop and plan the implementation of the “assessment that matters” project, participants learn to use a logic model to articulate the goals, objectives, and expected outcomes and to design the departmental activities for achieving these expectations. (Academy workshop materials were developed from the University of Wisconsin–Extension website on program development and evaluation.)

Logic models help teams to “drill down” into colleagues’ understandings of and beliefs about assessment, the roots and sources of resistance (some imagined, some grounded in real conditions), and the routines and structures of committees. Identifying and analyzing these elements aids a team’s development of its project objectives and strategies.

With the emphasis that a logic model places on identifying the steps toward the larger goal, teams leave the August workshops with a plan full of these smaller steps. This plan of small steps functions as a framework for accountability. Specifically, monthly progress reports and a mid-year summit help teams stay on task, working toward the objectives they set and modifying the plans as necessary.

Examples from the first two cohorts include the following:

The History B.A. team transformed a senior internship into a capstone assessment project. Building support for this project included engaging faculty, students, and internship site administrators in discussions about the learning outcomes for a history degree and working in “the real world.”  The team developed a rubric and protocols for assessing the internship as a capstone for the History B.A. degree.

The Biostatistics Ph.D. team addressed a gap that assessment data revealed in preparing Ph.D. candidates for problem posing and problem solving. This project led to engaging research faculty in discussions about relevant learning for Ph.D. students. The outcome of this project was a revision of the curriculum that improved its alignment with the relevant learning outcomes.

The Mathematics B.S. team sought to align expectations for learning among multiple classes for first-year and transfer students. Team members planned a year-long series of structured discussions among the faculty about the kinds of mathematical literacy students need for success in degree paths other than mathematics, ways to align the various courses for these students, and how the faculty could best use the assessment data to see how well the courses are preparing students with the learning they need.

The Dentistry D.D.S. team used their Academy project to bring together the program’s didactic and clinical faculties. The team engaged the faculties in checking course alignment with program learning outcomes and identifying potential synergies between the didactic and clinical learning experiences. This work laid the foundation for developing a learning and assessment portfolio project: the portfolio would serve as a record of individual student learning and yield program-level assessment data.

Though each project is unique to its team’s degree program, in all cases the teams focused on building a meaningful and sustainable practice of assessment within their program cultures. As one Academy participant declared at the culminating poster presentations, “We are learning how to bake assessment into the culture of our programs.”


[1] See Kuh, G. D., et al. Using Evidence of Student Learning to Improve Higher Education. Jossey-Bass, 2015.

[2] See Maki, Peggy. Assessing for Learning: Building a Sustainable Commitment Across the Institution, 2nd ed. Stylus Publishing, 2010.
