Sustainable Assurance of Learning (AoL) at California State University, Chico

The year is 2002, the new AACSB Assurance of Learning (AoL) standards have just come out, and the College of Business at CSU, Chico has four years to comply. Beginning in 2006-07, schools would need to perform a “Full Monty” and demonstrate how the AoL process drove changes to the student learning process. Chico needed to develop an efficient and sustainable AoL process quickly. There was a strong desire on the part of the faculty to do AoL right, as opposed to doing something that would merely “check the box.” (This theme of “Doing the Right Thing” has carried over into the assurance of learning efforts of the whole University.)

Since the college also launched a new degree program in Business Information Systems (BSIS) in fall 2003, it seemed natural to link the assurance of learning process with the curriculum development of the BSIS program. The BSIS program in general, and the management information systems (MINS) option in particular, had a long history of involving students, faculty, and industry representatives in the curriculum development process. In fact, the whole BSIS degree was developed with industry input dating back to the mid-1990s. Thus, when assurance of learning was being discussed, it was relatively easy to get the active involvement of all stakeholders, and the college looked to the MINS faculty to develop and fine-tune the AoL process.

Since the AACSB standards call for input from “all appropriate constituencies,” items such as (1) defining student learning goals and (2) developing assessment rubrics were understandably added to meeting agendas. Faculty wanted to continue the tradition of obtaining and valuing industry perspectives on student skills, knowledge, and values. Similarly, industry partners expected to be consulted on major curriculum changes. The question was how to get them involved in the ongoing effort of assessing student learning. With standards 15 through 21 placing emphasis on “course-embedded” assessment activities with input from “all appropriate constituencies,” we decided to formally develop a set of system requirements related to assurance of learning. To get industry to participate on an ongoing basis, we knew we needed a Web-based interface with e-mail notification to and from industry partners.

In fall 2002, students in a systems analysis class used the assurance of learning context to develop a list of system requirements and put forward a prototype system that could be developed in the subsequent semester’s systems design course (spring 2003). The students gathered requirements from industry partners, faculty, students, and administrators. At the time, most of the student prototypes favored a portfolio approach. The students favored having industry feedback on their required work so they would know that what they were learning in college was, in fact, valued by potential employers. The analysis prototypes of STEPS (Student Tracking, Evaluation, and Portfolio System) placed a good deal of emphasis on a student-initiated portfolio.

However, before STEPS development began in spring 2003, it was clear that the faculty were none too happy with a completely student-initiated assessment effort in which all the artifacts available for assurance of learning by faculty, industry partners, and administrators were selected and controlled by the students. Issues of academic integrity were raised, and students openly admitted that if industry people were looking at “their work,” they wanted it to be the best it could be – after all, the portfolio artifacts were not being graded, only assessed by people other than the teaching faculty member. This faculty concern, along with the possibility that students might manipulate the artifacts in each portfolio to their advantage, led to the notion that assessment should be separated from the showcase portfolio activity and that a system of double-blind review was needed to assess student work. We retained the identity of the reviewer type, however, so that a student could know whether feedback came from faculty, another student, or an industry partner.
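The double-blind idea described above can be sketched in a few lines of code. This is a minimal illustration, not the actual STEPS implementation: the record fields and function names are hypothetical, but they show the key design choice – the reviewer’s identity is stripped from the feedback a student sees, while the reviewer’s type (faculty, student, or industry partner) is preserved.

```python
from dataclasses import dataclass

# Hypothetical review record of the kind a double-blind system might store.
# The reviewer never sees the student's name, and the student never sees
# the reviewer's identity -- only the reviewer's role.
@dataclass
class Review:
    artifact_id: str        # anonymized pointer to the student work
    reviewer_id: str        # kept internally for the audit trail, hidden from students
    reviewer_role: str      # "faculty", "student", or "industry"
    rubric_scores: dict     # e.g., {"writing": 3}
    comments: str

def feedback_for_student(review: Review) -> dict:
    """Strip the reviewer's identity but retain the reviewer type."""
    return {
        "artifact_id": review.artifact_id,
        "reviewer_role": review.reviewer_role,
        "rubric_scores": review.rubric_scores,
        "comments": review.comments,
    }
```

Keeping `reviewer_id` in the stored record (while omitting it from student-facing feedback) is what allows blindness toward the student and a full audit trail for administrators to coexist.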

The administration recognized the value in getting input on all student work deemed appropriate for assessment (not just the assignments included by some students in some portfolios). The administration further wanted some control over the process so that the assessment effort could produce an audit trail. In other words, if someone from AACSB comes to Chico and looks at the results, he or she can also drill down to see the actual student work and the evaluations submitted by each type of reviewer, and determine at a glance whether the results are, in fact, reproducible (i.e., a good research design). This turned out to be a great benefit going forward, because the STEPS system became a data warehouse of everything related to assurance of learning. The evaluations, student work, assignment and exam instructions, measurement instruments, etc. are all available, so the system can track student performance over time, individually or in groups. The system can also track changes to measurement instruments and reuse previously submitted work to assess a new learning goal. For example, one case write-up can be used to assess both critical thinking and writing, even though each of these skills requires a different rubric and potentially different reviewers. We developed formal links into the student data so we can look at the results by major, option, year in school, transfer status, etc. The goal was to make the data easy to analyze in order to get to actionable results as quickly as possible, so that the faculty spend their time developing interventions and curriculum changes rather than arguing over the results.
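The two analysis ideas in this paragraph – one artifact scored against multiple learning goals, and results sliced by enrollment attributes such as major – can be sketched with a flat record structure. The field names and data below are illustrative assumptions, not the actual STEPS schema.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical flat records of the kind an assessment warehouse might hold:
# each row ties one piece of student work to one learning goal, one rubric
# score, and the student's enrollment attributes. Note that the same
# artifact ("case-01") appears under two goals, each with its own rubric score.
evaluations = [
    {"artifact": "case-01", "goal": "critical thinking", "score": 3, "major": "BSIS"},
    {"artifact": "case-01", "goal": "writing",           "score": 2, "major": "BSIS"},
    {"artifact": "case-02", "goal": "critical thinking", "score": 4, "major": "ACCT"},
]

def mean_score_by(records, group_key, goal):
    """Average rubric score for one learning goal, grouped by a student attribute
    such as "major", "option", or "transfer_status"."""
    groups = defaultdict(list)
    for r in records:
        if r["goal"] == goal:
            groups[r[group_key]].append(r["score"])
    return {k: mean(v) for k, v in groups.items()}
```

For example, `mean_score_by(evaluations, "major", "critical thinking")` summarizes the critical-thinking results by major, while the underlying rows remain available for the kind of drill-down audit trail described above.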

Based on these observations, the dean at the time (May 2003) decided to invest in the working STEPS system to make it more robust and to expand its use beyond the BSIS program. Features such as more faculty control (where desired), a more user-friendly interface, and a greater focus on the assessment of required artifacts (though the portfolio functionality remains available in all versions of the STEPS product) were added in the second year of development. Starting in June 2003, the college, and eventually the university, began to support the ongoing development and enhancement of the STEPS product. Students were hired, largely from the MINS and computer science programs, and a faculty member was given a course release to supervise the work, enforce development standards, and prioritize needed changes to the system.

The University at large saw the potential for using STEPS to assess general education classes, and the system was expanded for use by the Department of Education for the TPA (teacher performance assessment) process. More modifications came during 2004 and 2005, making the system more flexible and more secure and adding features that supported all sorts of student work, not just work that was easily uploaded as an electronic document. We started exploring the use of STEPS by other universities, and by fall 2005 the University of Arkansas and the University of Houston-Downtown had become the first beta test schools for STEPS. At every stage of development we have made significant improvements, and the system is now used for general education assessment and Department of Education assessment, as well as by eight colleges of business.

The highlights of the system: it is secure (meeting FERPA requirements); links to student enrollment data for quick and accurate analysis; supports various types of sampling; uses different types of constituents as reviewers; can be controlled by the faculty (or not); is fully configurable for any set of learning goals, outcomes, and rubrics; and offers flexible data output. The STEPS system is low cost and, as such, provides a sustainable solution for assurance of learning activities at any university. This year the development effort includes enhancement of the showcase portfolio and more streamlined reporting. For an online tour or more information, please see the STEPS website.
