NCATE Accreditation

Assessment System and Unit Evaluation

2.1 Assessment System and Unit Evaluation

How does the unit use its assessment system to improve candidate performance, program quality and unit operations?

The Continuous Improvement System guides the unit in the improvement of candidate performance, program quality and unit operations. The unit uses this system’s four components (plan, implement, assess and reflect/report) to engage in purposeful self-examination and growth. Exhibits 2.4.a.1 and 2.4.a.2 describe the system, including the key requirements and assessments used at transition points.


The planning component of the model includes foundational elements that inform program design, such as the research base, mission, vision, standards and findings of previous assessment cycles. These inputs are unified around goals, which drive programmatic decisions. As part of the CSU Chico assessment process, initial and advanced degree-granting programs set goals based on student learning outcomes and data. In Annual Assessment Reports, programs report on progress toward these goals, including actions taken and the results of those actions (2.4.g.2-5).

All credential-granting programs are part of the seven-year accreditation cycle developed by the California Commission on Teacher Credentialing (CTC). Every two years, the unit submits a biennial report that presents candidate assessment and program effectiveness data, an analysis and discussion of the data, and programmatic modifications (2.4.h). The unit’s next biennial report will be submitted in September 2014; the accreditation cycle was postponed one year due to changes in the commission’s timeline. The goals identified in CSU Chico Assessment Reports and Biennial Reports are chosen thoughtfully to focus continuous improvement efforts on key areas of need. Unit goals are informed by and aligned with the goals chosen for all CSU campuses as part of the CSU System’s Improvement and Accountability Plan (IAP) (1.4.j.2).

Faculty are involved in identifying goals and planning change, which fosters collaborative ownership of continuous improvement efforts. For example, in 2011 all initial programs jointly chose the goal of preparing candidates to work in inclusive, general education classrooms. Over the next three years, faculty worked toward this goal by redesigning curriculum, co-teaching courses, and sharing faculty expertise across programs. Each year at the SOE faculty retreat, faculty revisited this goal and their plans, used data to measure their progress, and adjusted the goal accordingly (e.g., 2.4.j). Goals also drive unit-wide plans. Monthly EPP Unit coordinator meetings have led to fruitful collaborative discussions in which EPP programs across departments use data to identify joint goals, plan actions and implement change.

Standards, proficiencies and student learning outcomes frame the work of the unit. As part of the CTC’s seven-year cycle, each program responds to standards, using evidence such as syllabi to illustrate program offerings. As standards change, programs are updated. For example, in December of 2013, the Adapted Physical Education credential became an added authorization in California, with a revised set of standards. Faculty in the APE program articulated coursework and shifted candidate experiences to align with the new standards. At the P-12 level, the new Common Core State Standards led to changes in methods courses across the unit, as instructors added Common Core pedagogy.


In the Continuous Improvement System, data and goals drive program implementation (2.4.g.1). Since our last accreditation, it has become common for faculty to share actions that have helped them achieve goals. For example, in Spring of 2013, program coordinators in the EPP Unit discussed the need to better prepare candidates to teach students with special needs, and the group designed and implemented a plan. Since 2011, initial credential programs had been conducting a day-long workshop for candidates focused on assistive technology. Advanced programs decided to join this event, and in Spring of 2014 a workshop was conducted for all candidates in every program within the unit. After hearing from local and national experts, candidates applied their learning to case studies at tables with representatives from multiple programs in multiple departments. Candidates collaborated with each other to apply their best strategies to scenarios of students with a wide range of strengths and needs. According to post-workshop surveys, the 2014 workshop had both strengths and areas for improvement. Candidates appreciated the opportunity to work with others who had different areas of expertise. In response to survey feedback from Single Subject Program candidates, future workshops will include sessions co-taught by Single Subject and Education Specialist faculty.

The “implement” component of the system emphasizes the roles that research, professional development and collaboration play in continuous improvement. Unit faculty are teacher-scholars who use current research and theory to frame their work with candidates. They participate in professional development that supports the unit’s mission and vision, standards and proficiencies, and identified goals. For example, with the change in P-12 standards, faculty in the School of Education have engaged in professional development around the Common Core State Standards and the Smarter Balanced Assessment. Faculty also collaborate to achieve growth. In 2014, the Communication Sciences and Disorders program requested input in developing a diversity seminar for candidates, and faculty from the SOE assisted in designing and leading this seminar. Each of these examples shows how the assessment system works to improve the implementation of programs.


Assessment plays a central role in the Continuous Improvement System. The School of Education Assessment Coordinator (staff position) and the Director of Assessment and Accreditation (faculty position) work together to ensure that key assessments are administered effectively, data are collected and analyzed, and outcomes are shared. For both initial and advanced programs, data are collected upon program entry, at an in-program transition point, upon program completion, and post-program (2.4.a.2). For initial programs, the Assessment Coordinator uses STEPS (Student Tracking, Evaluation and Portfolio System), an online database, to ensure that data are collected and stored systematically. This web-based software, developed by the CSU Chico College of Business, provides an organized repository for data and streamlines the analysis and reporting processes. Advanced programs that reside both inside and outside of the SOE collect key assessment data and submit them annually to the SOE Assessment Coordinator to include with unit analyses. The Assessment Coordinator is responsible for analyzing data and works with the Director of Assessment and Accreditation to ensure that data are aggregated and disaggregated in ways that meaningfully inform the work of the unit and reporting to stakeholders.

As part of the university’s assessment process (Annual Program Assessment Report, 2.4.g.2-5), programs are asked to set goals that address their assessment instruments. Efforts are made to ensure that assessments are fair, accurate, consistent, and free of bias (2.4.c). Exhibit 2.4.g shows examples of changes to assessment instruments and gives a rationale for those changes. The SOE Assessment Committee is responsible for reviewing assessment tools and scoring guides and providing guidance in the development, revision and approval of assessment instruments. For example, in 2013-2014, the Assessment Committee reviewed and revised the Supervisor Survey, the Classroom Environment Survey and the TPE Field Placement Rubric. While housed in the SOE, this committee and its members consult on assessment across the unit. Members of this committee facilitate instrument development and are called upon to guide discussions about data in various meetings so that faculty might see the relevance of data to their daily work.

Program coordinators monitor candidate progress at all transition points, from admission to program completion (2.4.a.2). Sensitive information, including candidate concerns, is recorded electronically by the program coordinator and shared with the program’s administrator or college dean when deemed appropriate. Candidate complaints are handled through informal and formal grievance processes, and documentation of their resolution is kept in the candidate’s file and in the Office of Student Judicial Affairs (2.4.e). The School of Education Director, who also serves as the Associate Dean for CME, maintains a record of informal candidate complaints and their resolution (2.4.e, 2.4.f).


Meaningful data are essential to the reflecting/reporting stage of the Assessment System. The Data and Reporting Calendar (2.4.d) guides the flow of analysis and reporting to both internal and external stakeholders. Data are reviewed at multiple levels by pathway coordinators, program faculty, administrators and community partners. The Data and Reporting Calendar shows several types of reporting that ensure that stakeholders at all levels are included in, informed of, and contributing to the preparation of educators. For example, the Program Assessment Report is submitted each September by every degree-granting entity at CSU Chico. In September, the SOE Governance Council is consulted on the content of these reports and presented with the final versions. In October, the reports are shared with AURTEC (All University Responsibility for Teacher Education Committee), a group that provides feedback to the unit about program quality and ideas for continuous improvement. Results are also shared with advisory boards, the College Dean, and the Provost.

These four components of the Continuous Improvement System (plan, implement, assess and reflect/report) result in strong programs with positive candidate outcomes.