NCATE Accreditation

Areas of Improvement

2.2 Moving Toward Target or Continuous Improvement

2.2.b  Continuous Improvement. Summarize activities and changes based on data that have led to continuous improvement of candidate performance and program quality. Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard.

1) Unit assessment system.

With our community partners, we revised our assessment system to emphasize the process of using data for program improvement (2.4.a.1). Revisions of the Continuous Improvement System were shared at SOE Advisory Board Meetings from 2012-2014, and specific feedback was given. The revisions helped the unit build a culture of continuous improvement. Faculty in all programs engage with data for program improvement (2.4.g.1). Data are used consistently to refine programs or to develop new ones, such as the Master's in Agricultural Education, which was developed after conducting a needs assessment of stakeholders in Northern California. The newly developed Data and Reporting Calendar (2.4.d) guides the systematic collection and sharing of data. We added new points of access for faculty, who can now review disaggregated data on the server and on Blackboard Learn. We implemented monthly, unit-wide coordinator meetings in which coordinators of all unit pathways discuss collective goals, plans, and improvements. As an outcome of these meetings, programs from different departments collaborate to plan and deliver professional development to faculty and candidates across all pathways in the unit.

In 2013-2014, AURTEC (All University Responsibility for Teacher Education Committee) was reconstituted to provide assessment feedback on the health of the unit. Previously, this group provided feedback and approval for programs seeking approval from the CTC. Its revised mission includes evaluating the health of the unit by providing feedback on assessments and program quality. Future plans include continuing to draw upon expertise across the university to strengthen candidate preparation.

2) Support for assessment.

Since our initial accreditation visit in 2007, we have added an Assessment Coordinator (a full-time staff position) and a Director of Assessment and Accreditation (an assigned-time faculty position) to guide the implementation of the Continuous Improvement System. After the merger of the two education departments in 2011, the School of Education revised the charge of the SOE Assessment Committee, whose focus is now designing and revising data collection instruments. In addition, all surveys in the SOE must be approved by the committee to ensure that instruments are soundly constructed, collect meaningful data, and align with the Continuous Improvement Assessment System (2.4.k SOE).

3) New and revised assessments.

Since our initial accreditation visit, we have added instruments to assess candidate performance and program effectiveness. For example, students in initial programs in the School of Education complete the Performance Assessment for California Teachers (PACT), which provides data along 12 teaching dimensions. At the advanced level, the writing proficiency of graduate students is formally assessed at program entry and before advancement to candidacy. Calibrated scorers use a rubric to identify students for a course that targets academic writing.

Future efforts will focus on instruments and processes to collect data from advanced program finishers. Advanced programs, including those for teachers and for other school professionals, plan to develop an alumni survey for advanced programs in the unit. The unit will both develop the survey and work with the Career Center and Graduate Studies to track program finishers for the survey pool. In addition, initial programs will work to improve the response rate for the year-out survey by collaborating with county offices of education to expand the list of contacts.

There have also been many thoughtful changes to assessment and evaluation tools and processes (2.4.g.1, Part 2). Faculty have converted holistic rubrics to analytical rubrics that ask for ratings on several dimensions or traits. For example, in 2013, the MA in Education Writing Rubric was converted from a holistic rubric to an analytical rubric that provides more information about candidate performance and areas for program improvement. These efforts will continue. In particular, the programs for other school professionals, in collaboration with P-12 partners, will look for ways to revise and align instruments so that they collect finer-grained data to inform program improvement.

There have been many efforts to ensure the instruments are used accurately and consistently. The unit meets the state requirements for validity and reliability of the PACT and Content Area Tasks (CATs), including annual calibration, auditing, and double scoring. The School of Education holds calibration sessions for field evaluation instruments and the MA in Education writing rubric. However, the unit wishes to strengthen its calibration efforts across all programs to ensure fairness, accuracy, and consistency in measurement. Each fall, beginning in fall 2014, programs and pathways will engage in calibration exercises. With the potential of new faculty hires on the horizon, we want to ensure that new faculty have an opportunity to become calibrated before supervising candidates in fieldwork or clinical placements. We plan to develop online modules so that new faculty can be oriented and engage in calibration exercises effectively and efficiently.

4) Use of technology.

In recent years the unit has made progress with the use of technology to collect and analyze data regarding candidate progress and program quality. Communication Sciences and Disorders developed a data dashboard so that the program can easily provide information to, and collect data from, candidates and community partners (Exhibit 2.4.L). The School of Education expanded the use of STEPS (Student Tracking, Evaluation and Portfolio System). All field data from initial pathways are part of STEPS, including mid-program and final field evaluations, disposition data, and PACT scores. STEPS also tracks writing proficiency for advanced candidates in the MA in Education Program. We also use the learning management system Blackboard Learn to distribute candidate videos for scoring.

In 2012-2013, we began working with the Registrar's Office to adapt the university's Degree Audit Program (DAP) for all candidates in credential programs. DAP will allow the education office and credential analysts to monitor candidate performance from admission to recommendation for credential (2.4.i). The pilot implementation will occur in 2014-2015, with use in all initial programs by 2015-2016. Among its many features, one of immediate benefit is a place to keep secure electronic advising notes for candidates.

The unit also plans to enhance the use of electronic data management for key assessments in the assessment system. Our needs and uses are becoming more sophisticated, and we need a system that better supports and accommodates them. During 2013-2014, we have been meeting with the designer of STEPS to see what is possible with this system. We will also explore other options for the user interface, so that individuals (candidates, faculty, school partners, graduates) have easy access to evaluation and reporting features. The dashboard that Communication Sciences and Disorders designed holds promise for use across the unit. During 2014-2015, together with advisory committees and AURTEC, we will explore other data management systems.