STANDARD 1. CANDIDATE KNOWLEDGE, SKILLS, AND PROFESSIONAL DISPOSITIONS
Evidence for the BOE Team to validate during the onsite visit
1. What is the school psychology program’s NASP recognition status?
The school psychology program (PPS, Pupil Personnel Services) submitted its NASP report for continuing accreditation to AIMS in September 2014. As of the submission of this report, the report status in AIMS reads “in process.”
2. Could more information, perhaps just Website URLs to informational pages, be provided on CBEST and CSET? (Data are provided, but details on the substance of these admission assessments could not be located.)
The California Basic Educational Skills Test (CBEST) was developed to meet requirements of laws relating to credentialing and employment. This test requirement does not replace any of the other requirements of subject matter knowledge, professional preparation, and practice teaching or field experience applicable to the issuance of credentials. The CBEST is designed to test basic reading, mathematics, and writing skills found to be important for the job of an educator; the test is not designed to measure the ability to teach those skills.
The California Subject Examinations for Teachers (CSET) have been developed by the California Commission on Teacher Credentialing (CTC) for prospective teachers who choose to or are required to meet specific requirements for certification by taking examinations. Each test in the program is designed to measure a specific set of knowledge, skills, and/or abilities reflecting the subject area taught. The CSET examination is an approved route to subject matter competency for all initial credential programs. The sets of subject matter requirements for all examinations of the CSET were defined in conjunction with committees of California educators and approved by the CTC. Single Subject Program candidates may alternatively demonstrate their subject matter competence by successfully completing a CTC-approved subject matter program. Chico State has twelve approved subject matter programs.
3. A variety of acronyms and abbreviations are used throughout the narrative. Can an alphabetized list of these, including the acronyms/abbreviations and full titles, be provided for the onsite visit to help decipher data sets and narrative references?
A list of acronyms and abbreviations is provided in Exhibit A.1.3 Acronyms and Abbreviations.
4. Exhibit 1.4.g identifies the PACT and several course-specific assessments in initial programs used to determine impact on student learning. Can the unit provide onsite all data, rubrics, and uses of the assessments in programs that were noted in the IR as too substantive to include in the offsite review?
The key assessment and rubric files were too large to upload to AIMS. They are available on the unit’s NCATE accreditation website under the heading “Key Assessments.”
5. What credential is awarded for the RTR program? In AIMS, the program is identified as post-baccalaureate; however, the IR alludes to RTR being part of the MA in Education. As presented, the IR does not include specific references to RTR candidate performance. Can the unit provide more information to help determine whether the initial program data reported include RTR candidates?
RTR crosses three degree/credential areas. Candidates in RTR earn a Multiple Subject or Education Specialist teaching credential together with an MA in Education. Candidates participate in two groups of key assessments: those of the initial credential programs in the SOE and those of the MA in Education program. In the data chart (IR Exhibit 1.4.d.2 Unit Data Table), RTR data are located on tabs 2, 3, 4, and 7.
6. A comparison of the data sets in Exhibit 1.4.d.2 and the program inventory in AIMS presents some confusion about which programs (on the AIMS profile) lead to which credentials (in Exhibit 1.4.d.2). (Since the academic catalog hyperlink in Exhibit I.5.a.1 was broken, a clear understanding of the unit’s programs and how each is available is pending.) For example, Exhibit 1.4.d.2 presents data for, perhaps, two RTR program paths: one listed as “MS” and one listed as “SPED” within the Education Specialist program. Also, the same data set includes data for three paths of the Education Specialist (RTR SPED, Concurrent, and Education Specialist). Can the unit provide data to clarify how these three paths differ?
Exhibit A.1.6 Clarification of Initial Pathways clarifies the pathways and the credentials offered.
7. Can the system-wide first-year teacher evaluation tool be provided?
Exhibit A.1.7.a Year-Out Survey for Teachers is the survey used in 2013-14 for teachers who earned a Multiple Subject credential. Exhibit A.1.7.b Year-Out Survey for Employers is the survey completed by these teachers’ supervisors. Both surveys contain core questions answered by graduates of all programs and their employers, along with some specialized questions related to specific credentials.
8. The EDMA 600 key assessment rubric for the M.A. in Education program appears general. Can the unit provide information on criteria specific to the assessment description, and clarifying how candidate performance is rated? Can detail be provided on how this rubric is used, especially related to inter-rater reliability, and how it provides data specific enough to make judgment on candidate mastery?
The rubric submitted with the IR was a previous version. The updated rubric is shown in Exhibit A.1.8 EDMA 600 Rubric. The course was developed and approved in 2012-2013, and the rubric continues to be revised during the 2014-2015 academic year.
9. How do data presented in Exhibit 1.4.d.3 align with the rubrics in Exhibits 1.4.c.3 and 1.4.c.6? For example, the EDMA 600 rubric appears to include three criteria rated on a four-point scale. The data set in Exhibit 1.4.d.3 presents data organized by Level 1, Level 2, Level 3, and Level 4. The rubric, though, appears to have a point range up to 30. The reviewer cannot determine how the data set and rubric are aligned.
Exhibit A.1.8 EDMA 600 Equity Study Rubric clarifies the rubric levels that correspond to the EDMA 600 data. Other rubric levels have been clarified in the key assessment documents accessible on the unit’s NCATE accreditation website under the heading “Key Assessments.”
10. Can a matrix or additional notations in Exhibit 1.4.c.1 be provided to show a clear alignment between advanced programs’ key assessments and what is being assessed by each (i.e., content knowledge, pedagogical content knowledge, impact on student learning, or dispositions)?
Exhibit A.1.10 Key Assessments Aligned with Standard 1 shows key assessments and what is being assessed by each.
11. Data are presented in Exhibit 1.4.d.2 for the M.A. in Education program’s Writing Proficiency (Advance to Candidacy) assessment. Is this the data set for the Writing Proficiency: In-program Writing Proficiency key assessment identified in Exhibit 1.4.c.1?
The “In-program Writing Proficiency” and “Writing Proficiency (Advance to Candidacy)” assessments refer to the same data set.
12. Exhibit 1.4.c.6 includes descriptions and/or rubrics for assessments that are not identified as key assessments in Exhibit 1.4.c.1. Does one of the two documents contain misinformation?
In the IR, Exhibit 1.4.c.6 contains forms, assignments, and rubrics used in the EDAD program (Preliminary Administrative Services Credential). Some of these assignments are not used as key assessments for the purposes of data collection, even though they were labeled “key assessment.” A corrected compilation of key assessments can be found on the accreditation website.
13. The following table identifies key assessments in non-SPA advanced programs and whether rubrics and data were found for each.
The chart provided by the offsite team is reproduced below with two new columns, in which we have written responses clarifying the location of rubrics and data.
| Program (per AIMS) | Key Assessment (per Exhibit 1.4.c.1) | Rubric (Off-Site Review) | Rubric (Addendum Response) | Data (Off-Site Review) | Data (Addendum Response) |
| --- | --- | --- | --- | --- | --- |
| M.A. in Education | Initial writing | Y | ✓ | Y | ✓ |
| | In-program writing¹ | N | Same rubric used for Initial Writing and In-Program Writing | N | IR Figure 1.4.d.2, Tab 7, labeled “Advance to Candidacy” |
| | EDMA 600² | N | | N | Data submitted; see IR Figure 1.4.d.2, Tab 7 |
| | EDMA 602 | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 7 |
| | EDCI 601 | N | See accreditation website | Y | |
| | Culminating activity | Y | ✓ | Y | ✓ |
| | Exit survey | Y | ✓ | Y | ✓ |
| M.S. in Agricultural Education | Initial writing³ | N | See accreditation website | N | Data available at visit |
| | In-program writing³ | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 11 |
| | AGED 608 | N | See accreditation website | N | Data available at visit |
| | AGED 601 | N | See accreditation website | N | Data available at visit |
| | AGED 610 | N | See accreditation website | N | Data available at visit |
| | Culminating activity | N | See accreditation website | Y | ✓ |
| | Exit survey | N | See accreditation website | N | Data available at visit |
| Preliminary Administrative Services | Initial writing | Y | ✓ | Y | ✓ |
| | In-program writing | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 11 |
| | Mid-program portfolio | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 7, line 58 |
| | EDAD 615 | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 7, line 70 |
| | Final portfolio | N | See accreditation website | N | Data submitted; see IR Figure 1.4.d.2, Tab 7, line 65 |
| | Culminating activity | Y | ✓ | Y | ✓ |
| | Exit survey | Y | ✓ | Y | ✓ |
¹ See Question #13.
² See Question #11.
³ Exhibit 1.4.d.2 (Tab 11) includes data for Writing Assessment; however, which of these two assessments aligns with the data set is not identified.
⁴ See Question #14.
(Note: The Offsite Report did not have a question #14 or #15.)
16. Can the unit clarify whether the RTR program is required to earn state approval?
The Rural Teacher Residency (RTR) is an experimental pathway that was externally funded by a Teacher Quality Partnership grant through the Office of Innovation and Improvement, U.S. Department of Education, from October 2009 through September 2014. In 2010, we were told that the RTR program did not require a separate review process because we already had approval for both the Multiple Subject and Education Specialist programs (A.1.16 RTR Approval Email); instead, it would be approved in the regular accreditation cycle, beginning with the program narratives due in 2011. Please note that the last year of funding for the program is 2014-2015, with the last cohort of RTR candidates finishing in spring 2015. SOE faculty are working to integrate aspects of the RTR pathway (such as the co-teaching model) into other Multiple Subject and Education Specialist pathways.
17. Can the unit provide supporting documentation that it noted was too substantive to include for the offsite review? (This documentation relates to the examination of data, descriptions, and rubrics for all related assessments for all initial programs that will be conducted during the onsite review.)
Many of the key assessment and rubric files were too large to upload to AIMS. They are available on the unit’s NCATE accreditation website.
18. Can the unit detail specific evaluation criteria for the evaluation in Exhibits 1.4.i and 1.4.j.1 showing mean results on reflective evaluations from the first-year teachers (former candidates) and from evaluators?
The scale includes four levels describing one’s perception of preparation:
4 = Well prepared
3 = Adequately prepared
2 = Somewhat prepared
1 = Not at all prepared
The reported percentages reflect the collapsed levels 3 and 4, “adequately to well prepared.”
19. Can the unit provide additional information related to documentation in the IR and exhibits related to advanced programs?
Please see the answers to previous questions, which supplied additional information regarding advanced programs. Additional information related to advanced programs can also be found on the accreditation website.
20. Can the unit clarify which key assessments evaluate candidate (1) content knowledge, (2) pedagogical content knowledge, (3) impact on student learning, and (4) dispositions for each of the unique program areas?
Exhibit A.1.10 Key Assessments Aligned with Standard 1 shows key assessments and what is being assessed by each.