Assessment & Program Review

Annual Program Learning Outcomes Assessment

Assessment is a process to ensure that we continue to create and maintain high-quality learning environments that promote student learning and a positive learning experience for all our students. This site is designed to support that work: it will help you learn relevant terminology, develop meaningful program learning outcomes, and sustain, use, and report your efforts to assess student learning.

The following figure illustrates the steps that take place during a typical PLO assessment cycle:

[Figure: PLO assessment cycle process]

Programs may enter any step in the cycle depending on their progress in sustaining the assessment process. To learn more about the steps in the PLO assessment cycle, click on any resource listed under Planning Your Annual Assessment Process below.

Planning Your Annual Assessment Process

  • Best Practices

    Our colleagues from the (now defunct) Academic Assessment Council recommend the following best practice in program learning outcomes assessment: having an assessment coordinator at the college level and an assessment facilitator at the department or program level ensures the process moves forward and assessment reports are submitted on time.

    College Assessment Coordinator
    A College Assessment Coordinator is appointed by the Dean of each college and has the following responsibilities:

    • Convene Program Assessment Facilitators within the college
    • Coordinate program reporting and archiving within the college
    • Provide support for facilitators (training, consultation, discussion, etc.)
    • Facilitate sharing of best practices, tools, and report information among programs and the college community, and train the incoming College Assessment Coordinator
    • Work with faculty and chairs to integrate the assessment and program review processes
    • Coordinate with the Dean and Chairs to advance assessment work and quality
    • Ensure timely reporting to the Dean and beyond as needed
    • Provide assessment report feedback to facilitators, chairs, and programs
    Program Assessment Facilitator/Facilitation
    A Program Assessment Facilitator may be selected in each program through a process determined by the program, or a program can take collective or shared responsibility for the facilitation of assessment tasks and responsibilities (e.g. through an Assessment Committee). Whether there is a designated person or the role is shared, the facilitator or equivalent lead has the following assessment responsibilities:
    • Convene program assessment meetings addressing short- and long-term assessment plans, involving faculty in the planning process, the assessment work itself, and the implementation and tracking of improvements
    • Facilitate semester or yearly assessment work in conjunction with faculty, which can include, but is not limited to, PLO selection, sample work identification and collection, tool selection or revision, data evaluation, data analysis, and discussion of improvements and implementation plans
    • Develop assessment knowledge and skills by taking advantage of campus resources that support program assessment activity
    • Attend university and college assessment-related events, workshops, and meetings
    • Work with the Department Chair, College Coordinator, and program/department faculty to advance the assessment process and its quality
    • Report on program findings using established templates, meeting established deadlines, and responding to additional requests for information
    • Identify which party is responsible for producing and delivering the final reports to the College Coordinator
    • Plan for and prepare a replacement Facilitator if needed. New facilitators are not responsible for reporting previous work unless an agreement was made indicating that expectation
  • Program Mission and Goals
    Mission Statement Defined

    A mission statement is a holistic vision of the values and philosophy of the program. The program mission needs to be consistent with the campus and/or college mission. The overall question that the mission statement should answer is: What is the overall, unique purpose of this program?

    Example from Business Information Systems

    We offer an innovative, technologically focused business program providing students with a foundation for life-long learning that leads to professional success. Our program is built on applied learning aligned with leading industry standards. We affirm the importance of ethical conduct, stakeholder participation, scholarship, and public service.

    Developing Goal Statements

    Program learning goals are statements that fundamentally answer the question: What do we expect graduates from this program to be able to do, know, or value? In other words, what are the exit skills, knowledge, and values that we want this program's graduates to have? Program goals are general statements about the knowledge, skills, attitudes, and values expected in graduates.

    Examples
    • Knowledge: Students know basic biological principles and concepts.
    • Skill: Students can use appropriate technology tools.
    • Value: Students respect the professional code of ethics for business practice.
  • Program Learning Outcomes

    Program Learning Outcome (PLO) statements take the program learning goals and focus on how students can demonstrate that the goals are being met. In other words, PLOs answer the question: How can graduates from this program demonstrate that they have the needed/stated knowledge, skills, and/or values?

    PLOs are clear, concise statements that describe how students can demonstrate their mastery of program learning goals. Each student learning outcome statement must be measurable. Measures are applied to student work and may include student assignments, work samples, tests, etc., that gauge student ability/skill, knowledge, or attitude/value.

    Criteria for Development

    The following criteria can be used to develop and evaluate student learning outcome statements:

    • Student Perspective: the statement focuses on what students learn, not on what the course covers.
    • Clarity: the statement is concise and understandable by all interested groups, including students, parents, faculty, and others specific to the program.
    • Potentially observable and/or measurable: the statement describes how students can demonstrate that they have the knowledge, skills, and/or attitudes/values specified in the goals.
    Examples
    • Knowledge:  Students can distinguish between science and pseudo-science.
    • Skill: Graduates can locate appropriate sources by searching electronic and traditional databases.  
    • Value: Graduates appreciate the need to vary counseling services to people who are different from themselves with respect to gender, age, culture, sexual orientation, and other significant characteristics.

    Tips for Writing Student Learning Outcomes (PDF)

  • Curriculum Matrix
    Building the Program's Curriculum Matrix

    Once the learning goals and program learning outcomes (PLOs) are developed for the program, the next step in program assessment is to develop the program's curriculum matrix. This matrix lists the goals and associated PLOs against the courses in the program. In each cell, the faculty identify where each PLO is introduced, practiced, or mastered.

    The curriculum matrix can be used as the basis of the assessment plan. In later tasks the faculty and/or assessment coordinator can determine what student artifact or work sample (signature assignment) can be used to measure progress towards the PLO and/or when the assessment will take place. In addition, the matrix needs to be reviewed with respect to balance and omissions. Questions to ask at this point include the following:

    • Are there learning outcomes that are not associated with any course?
    • Are there courses that do not contribute to the achievement of any PLO?  (Why do we need these courses in our curriculum?)
    • Do we have appropriate levels of the desired performance associated with each PLO with respect to 1) introducing the learning outcome, and  2) practicing the learning outcome before mastery is expected?
    • Do we have a place where we can measure mastery within the context of the program for each PLO?
    • If answers to the above are “no,” what changes do we need in our curriculum to achieve the desired results?

    Below are examples of curriculum matrices. Letters are used to represent where a PLO is introduced (I), practiced (P), or mastered (M). In some matrices a “D” is used instead of an “M” to indicate the course in which students demonstrate mastery of the PLO within the context of the program; mastery is defined in essentially the same way. In either case, the M or D level is where students can demonstrate the highest level of achievement with respect to the stated PLO within the context of the program.

    Curriculum Matrix Examples:
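
    As a purely hypothetical illustration (the course numbers, PLO labels, and placements below are invented, not drawn from any actual program), a simple curriculum matrix might look like this:

    PLO / Course                     | 101 | 210 | 305 | 410 (capstone)
    PLO 1: Disciplinary knowledge    |  I  |  P  |  P  |  M
    PLO 2: Written communication     |  I  |     |  P  |  M
    PLO 3: Ethical reasoning         |     |  I  |  P  |  M

    Reading across a row shows how a PLO develops through the curriculum; reading down a column shows which outcomes a given course contributes to.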
  • Strategies for Assessing Program Learning Outcomes

    In general, the assessment strategies describe how the assessment will be conducted and are a key component of the planning process. Strategies are classified as direct (where actual student behavior is measured or assessed) or indirect. Indirect measures include surveys, focus groups, and other activities that gather impressions or opinions about the program and/or its learning goals. Direct measures are most effective when they are also course-embedded, meaning the work done by the student counts toward the course grade. Most studies of assessment data show that students take the activity more seriously when their work also contributes to the grading criteria.

    While indirect measures can be useful, assessment of learning must rely mostly on direct measures; this is increasingly the mandate from accrediting agencies, including WSCUC. It is also acceptable, and often good, to have multiple measures for the same goal. For example, following a direct assessment of critical thinking, a focus group discussion could be used to learn more about how the students viewed the assignment.

    Examples of Direct Assessment include but are not limited to the following: 

    Written Work, Projects, Performances, or Presentations. According to Suskie (2009), these kinds of assignments typically have students demonstrate skills and are considered alternatives to objective exams or essays. Unlike traditional tests, they give students the opportunity to learn while working, and they are considered more authentic in that they tend to be more realistic and challenging, often requiring complex answers or products.

    Capstone Assignments. Similar kinds of course-embedded assignments include Signature Assignments and Performance Tasks. These assignments are typically driven by multiple program goals or student learning outcomes and challenge students to produce work that demonstrates a variety of outcomes. Advantages of course-embedded assessments include no additional assignments or work for students or faculty, a direct measure of progress on program-specific, mission-linked learning goals actually covered in the curriculum, increased involvement of faculty and students in assessment, and the ability to address deficiencies in individual student learning before graduation. The primary disadvantage is the time necessary to develop the assessment systems as well as the time to collect and analyze the resulting data.

    Portfolios.  According to Suskie (2009), “A portfolio is compelling evidence of what a student has learned. It assembles in one place evidence of many different kinds of learning and skills. It encourages students, faculty, and staff to examine student learning holistically – seeing how learning comes together – rather than through compartmentalized skills and knowledge. It shows not only the outcome of a course or program but also how the student has grown as a learner. It’s thus a richer record than test scores, rubrics, and grades alone” (p. 204).

    Examples of Indirect Assessment include but are not limited to the following: 

    • Assignment of Course Grades.
    • Surveys, such as satisfaction, attitudinal, feedback, employer or alumni perceptions.
    • Focus Groups.
    • Interviews.
    • Self-evaluations, such as student or alumni self-ratings of learning. 

    Suskie, L. (2009). Assessing Student Learning: A Common Sense Guide. San Francisco, CA: Jossey-Bass.

  • Developing a Plan
    Recap Assessment Planning Tasks
    1. Create program mission and goals
    2. Identify program student learning outcomes (PLOs) for each goal
    3. Build the outcomes/course curriculum matrix
    4. Develop the assessment plan
    5. Closing the loop – making assessment matter

    For Each PLO:

    • Refine the detailed rubric or measure (defined in the plan)
    • Conduct the actual evaluation activity (collect data)
    • Generate analysis – get evaluation results
    • Decide what action to take (if any)
    • Determine follow-up – when do we assess again
    Documenting the Plan
    • When and how often should this PLO be measured?
    • If we use course embedded measures (this is recommended) what student assignment or artifact can we use for the assessment?
    • How can we measure it?
    • Who needs to be involved in measuring it?
    • Who gets the data, reflects on the results?
    • How can we be sure that the results are used to make meaningful program changes that will improve student learning?

    Starting with the curriculum matrix (developed in the previous steps), add two columns to the Excel spreadsheet: one for the time frame to pilot (practice) the measurement and one for when you plan to collect and use the data. Since none of us is perfect at doing assessment, a pilot or practice run is a good idea; this can involve real data that you gather, evaluate, and act on, or it can simply be a trial run of the rubric. Also make sure that the student work is identified in the course where you want to do the assessment; it is helpful to include a copy of the assignment and scoring rubric as part of the plan. If you are using a rubric that has been developed and tested on similar assignments, piloting is less important than it is for a rubric or measure that is unique to the program or has not been used before by any other program.
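
    As a hypothetical illustration (the PLO, course, assignment, and semesters below are invented), one row of the resulting planning spreadsheet might look like this:

    PLO                          | Course (level) | Signature assignment  | Pilot     | Collect & use data
    PLO 2: Written communication | 410 (M)        | Final research report | Fall 2026 | Spring 2027

    The two added columns make the timing explicit: when the measure will be practiced and when the data will actually be collected, analyzed, and acted upon.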

    Lessons Learned
    1. It is better to do the full cycle of assessment on one or two PLOs than to try and do everything at once.
    2. It is better to start at the mastery level and see if there is a problem than to gather measures at each of the levels (introduced, practiced, and mastered).
    3. Keep it Simple (Use the KISS method.)
    4. It is better to try something – Just Do It – rather than waiting for things to be perfect.
    5. Involve the faculty who teach the courses in both the measurement and analysis of results.
    6. Don’t aggregate across traits or characteristics in the rubric. Results within each trait can lead you to an action plan; aggregating the data (useful for assigning a grade) loses the level of granularity needed for action plans (see the illustration after this list).
    7. Using the results is probably the most important step.
    8. Keep in mind this is not grading.
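
    To illustrate lesson 6, suppose a writing assignment is scored with a four-trait rubric on a 1–4 scale (all numbers below are invented for illustration). An aggregated average can hide exactly the trait that needs attention:

    Rubric trait            | Mean score (1–4) | Students scoring 3 or above
    Thesis and focus        | 3.6              | 85%
    Use of evidence         | 2.3              | 40%
    Organization            | 3.4              | 80%
    Mechanics               | 3.5              | 88%
    All traits (aggregated) | 3.2              | 73%

    The aggregate of 3.2 looks acceptable, but the trait-level results point to a specific action plan: strengthening instruction and practice in the use of evidence.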
  • Closing the Loop

    For many (perhaps most) programs, this step is the most difficult and is typically where the assessment effort gets derailed. If the analysis is not compelling and sufficiently granular, constituents frequently are unable to reach consensus on which actions the data indicate. When they cannot even agree on a set of possible actions, no action is taken and the program fails to “close the loop.” To be successful at this step, programs need to present the data to stakeholders who can take action (department chairs, program coordinators, deans, etc.) in a form that is sufficiently granular that a set of actions can be developed. Sometimes graphical data, or data broken down by relevant student characteristics, helps define the boundary for actions. Once a set of possible actions is compiled, each action can be screened based on criticality, cost, time, and other dimensions to create an agreed-upon subset to implement.

    Actions can be anything from concluding that student performance with respect to a learning goal meets expectations to major curriculum change. Other actions may include increasing admission requirements, remediation, adding prerequisites, increasing or changing specific assignments in existing courses, and providing support structures such as tutoring or help sessions. Another action could be to reevaluate whether the learning goal or expectations for performance on that goal are appropriate. Note that actions can also include any modifications to the program’s assessment schedule or any aspect of the program’s assessment process.

    Finally, the recommendations need to be implemented, and follow-up is required to see whether the implemented change actually made a difference. This is largely a management function, and there has to be some way to make the execution of actions and the follow-up part of the ongoing work of department chairs, program coordinators, and/or deans. Here is an example of how this is done in the College of Business.

Reporting Information

Please note the following for AY 2025-26:

  1. The Annual Program Assessment Report (APAR) submission deadline is February 16, the Census Date in Spring 2026. See the Report Guidelines and Template below for additional information.
  2. APARs must be completed for each program - departments with multiple programs are required to submit a separate APAR for each program offered.
  3. To be considered complete, each APAR submission must include all required components described in the Report Guidelines and Template below.
  • Report Guidelines and Template
  • Resources
  • Frequently Asked Questions

    1. What is the purpose of assessing program learning outcomes (PLOs) annually?

    The purpose of annual assessment is to ensure program quality through a process of continuous improvement in student learning. Such a process assures that PLOs are truly representative of our academic programs and substantiate the quality of the degrees we offer. Annual assessment certifies that our students are meeting or exceeding our stated PLOs. Our campus partners rely on our assessment work for external program accreditation and internal periodic program reviews, and to align with WSCUC accreditation standards and federal regulations. Ongoing assessment allows us to document, demonstrate, ensure, and affirm the excellent learning experienced by our students in the pursuit of their degrees.

    2. How should programs select PLOs for assessment each year?

    Programs typically adopt a rotational schedule to assess different PLOs over a multi-year cycle. This approach ensures that all outcomes are assessed periodically. Selection may also be influenced by areas of concern from previous assessments or program changes.

    3. What evidence is needed for assessing PLOs?

    Evidence for assessing PLOs should include direct and indirect measures. Examples of direct measures are:

    • Course-embedded assignments assessed with rubrics
    • Capstone projects
    • Comprehensive exams

    Indirect measures might include:

    • Surveys of student perceptions
    • Alumni feedback
    • Employer evaluations

    4. What is the role of faculty in the assessment process?

    Faculty play a central role by:

    • Defining and refining PLOs.
    • Designing and maintaining assessment methods.
    • Collecting and analyzing data.
    • Using the assessment results to make informed decisions about curriculum and pedagogy.

    5. How is assessment data collected and analyzed?

    Data collection should be systematic, using tools such as rubrics or standardized tests. Analysis involves identifying trends, strengths, and areas for improvement. This should be a collaborative process among faculty to ensure shared understanding and buy-in.

    6. How are assessment findings used?

    Findings should be used to inform decisions about:

    • Curriculum changes.
    • Changes to the assessment process.
    • Faculty development.
    • Resource allocation.

    This process, often referred to as "closing the loop," demonstrates a commitment to continuous improvement and accountability.

    7. What documentation is required for the annual assessment?

    It is important to make assessment an explicit part of the program faculty's culture and to document your efforts and outcomes. Each academic program is expected to have a mission statement, a list of PLOs, a curriculum or alignment matrix, and an assessment plan. As outlined in our Annual Program Assessment Report (APAR) template, each program should document:

    • the PLOs assessed
    • the methods and tools used
    • a summary of findings
    • actions taken or planned in response to the findings
    • evidence of "closing the loop"

    8. What happens if a program is not meeting its PLOs?

    If assessment findings indicate that PLOs are not being met, programs should:

    • Analyze potential causes (e.g., curriculum gaps, instructional methods, issues with the assessment process).
    • Develop and implement an action plan.
    • Reassess to evaluate the effectiveness of changes.

    This process should be documented to demonstrate accountability and commitment to improvement.

    9. What are common challenges in assessing PLOs, and how can they be addressed?

    • Challenge: Lack of faculty engagement.
      • Seek training and professional development opportunities and highlight the benefits of assessment.
      • Create an Assessment Committee for the program or department that will collaboratively lead the assessment efforts.
      • If your program/department already has an Assessment Committee, be sure to tightly couple it with your Curriculum Committee. One way to do this is to make the chair(s) of your Curriculum Committee automatic members of the Assessment Committee.
    • Challenge: Difficulty in measuring outcomes.
      • Use clear, specific rubrics and validated tools.
      • Set a meeting for the faculty teaching the course(s) where data for that outcome is gathered. Let them discuss and evaluate the effectiveness of the student deliverable or performance being used.
    • Challenge: Limited resources.
      • Reduce the number of PLOs assessed in each cycle, but ensure all PLOs are being assessed within the program review cycle.
      • Leverage institutional support and prioritize high-impact assessments.
    • Challenge: Small sample sizes.
      • Aggregate data from multiple semesters/years, and clarify that in the report.

    10. Who do I submit assessment reports to?

    The program faculty are expected to have reviewed and provided feedback on completed APARs before they are submitted. Additionally, it is recommended that the college office also review completed APARs before submission; this is typically the role of the associate dean. Our current APAR template includes acknowledgment and signatures from the department (or program) chair and the college dean or associate dean to ensure the appropriate distribution has taken place prior to submission. Completed APARs are submitted to the Director of Assessment and Program Review either by the associate dean of the program's home college or by the APAR report writer with a Cc to their college office.

    11. When are these reports due?

    Reports are due on the Census Date in Spring semesters for assessment data collected during the previous academic year. This due date was moved from the former September 30 deadline to allow programs maximum flexibility to plan and conduct assessment work, analyze the results, and discuss improvements during the academic year.

    12. How is all this assessment work connected to the program review process?

    APARs are used as evidence of continuous program improvement in the self-study report portion of the (internal or external) program review process.

    13. What resources are available to assist with the assessment process?

    In addition to the resources provided in this website, the following are also available:

    • Assessment workshops and training: either internally (contact the Director of Assessment and Program Review or the Office of Faculty Development for any opportunities) or externally (through WSCUC educational programs or workshops provided by external program accreditation agencies)
    • Our campus Institutional Research and Strategic Analytics (IRSA) office is available for data support.

    14. How are campus partners informed about assessment results?

    Assessment results should be shared with:

    • Faculty and program staff for decision-making.
    • Students to demonstrate program effectiveness.
    • External partners, such as accreditors and employers, as needed.

    Institutions may use dashboards, reports, or meetings to communicate findings.

    15. How does annual assessment align with WSCUC accreditation requirements?

    WSCUC accreditation emphasizes the importance of learning outcomes, assessment, and continuous improvement. Annual assessment helps demonstrate compliance with WSCUC’s standards for quality assurance, student achievement, and institutional effectiveness.

    16. How does this process connect to federal regulations?

    Federal regulations require institutions to demonstrate educational effectiveness, including setting goals for student achievement and measuring progress. The assessment of PLOs provides evidence that programs meet these requirements and ensure transparency and accountability.

    17. How does this process contribute to overall institutional effectiveness?

    By ensuring programs are meeting their intended outcomes, the assessment process supports broader institutional goals, including retention, graduation rates, and post-graduate success. It also fosters a culture of evidence-based decision-making and continuous improvement.

  • WSCUC Compliance
    Our Annual Program Learning Outcomes Assessment process and Annual Program Assessment Report (APAR) template address the following WSCUC 2023 Standards:
    • CFR 2.3: The institution clearly identifies and effectively implements student learning outcomes and expectations for achievement. These outcomes and expectations are reflected in and supported by academic programs, policies, and curricula, and provide the framework for academic advising, student support programs and services, and information and technology resources.
    • CFR 2.7: The faculty are responsible for creating and evaluating student learning outcomes and establishing standards of student performance.
    • CFR 2.9: The institution demonstrates that graduates consistently achieve stated learning outcomes and standards of performance. Faculty evaluate student work in terms of stated learning outcomes.
    Please visit our campus WSCUC website for information on our WSCUC accreditation.