Assessment Plan 2012
CALIFORNIA STATE UNIVERSITY, CHICO
ANNUAL PROGRAM ASSESSMENT REPORT
BS Degree Program: Department of Mathematics and Statistics
Rick Ford, Program Assessment Coordinator
Department of Mathematics and Statistics, zip 0525
898-5063
rford@csuchico.edu
Date: June 17, 2012
I. Assessment of Student Learning Outcomes
1. Name and Contact Information of Program Assessment Coordinator
Rick Ford, Department of Mathematics and Statistics, zip 0525, 898-5063
2. Goal Statements and Student Learning Outcomes
[General Content] Graduates are proficient in performing basic operations on fundamental mathematical objects and have a working knowledge of the mathematical ideas and theories behind these operations.
GC1 Demonstrate basic skills and conceptual understanding of differential, integral, and multivariable calculus and of fundamental mathematical objects introduced in our degree core, such as sets, functions, equations, vectors, and matrices.
GC2 Demonstrate a basic understanding of probability and statistics relevant to their option.
GC3 Demonstrate more technical skills and a broader, more in-depth conceptual understanding in core mathematical areas (such as analysis, geometry/topology, algebra, applied mathematics, and statistics) relevant to their option in the major.
[Critical Thinking/Problem Solving] Graduates use critical thinking and problem-solving skills to analyze and solve mathematical and statistical problems.
PS Interpret and translate problems into appropriate mathematical language, then solve them by applying appropriate strategies and interpreting the results.
[Communication] Graduates communicate mathematics effectively in a manner appropriate to career goals and the mathematical maturity of the audience.
Com1 Demonstrate the ability to effectively and accurately write on mathematical topics relevant to their mathematics option and appropriate to their audience.
Com2 Demonstrate the ability to effectively and accurately speak on mathematical topics relevant to their mathematics option and appropriate to their audience.
[Proof Proficiency] Graduates have a basic proficiency in the comprehension and application of proofs.
PP Students can read mathematical proofs, extract the key ideas used in the proof, and convey the logic behind the proof; they can also write their own rigorous and logically correct proofs.
[Technology] Graduates know how to use technology tools (e.g., graphing calculators, computer algebra systems) appropriate to the context of the problem.
Tech Students use technology to manipulate mathematical objects (e.g., functions, equations, and data sets), to conduct mathematical explorations, to model problem scenarios, and to analyze mathematical objects.
[Lifelong Learner] Graduates are aware of the important role of mathematics and have the interest and ability to be independent learners and practitioners.
LL1 Students demonstrate the ability to apply mathematics and statistics to new contexts (e.g., in other classes, the workplace, graduate school, or classes they teach).
LL2 Students recognize and appreciate the role that mathematics can play in their futures and in society in general.
3. Course Alignment Matrix:
See attached Excel Spreadsheet (Appendix B)
4. Learning Outcome(s) Assessed in AY 2011-12 (Year 4 of Assessment Plan)
Com2 (PM) and GC3 (IPM) (GC3 was renumbered this year from GC4 as a result of combining GC1 and GC2).
5. Assessment Methodology Used
The SLOs were assessed at the Introductory, Practice, and Mastery (IPM) levels as indicated in the list above. The courses assessed this academic year were determined by the approved Assessment Plan, which called for MATH 435 (GC3 M), relating to assessment of our General option, and MATH 341 (GC3 IP), MATH 342 (GC3 M and Com2 PM), and MATH 337 (GC3 IP), all of which relate to our Mathematics Education option. One section of each of these courses was offered and assessed this academic year. A rubric was applied to determine performance at the Exemplary, Proficient, Acceptable, and Unacceptable levels. Each assessment item was scored independently by the instructor of record.
Assessment items this year included questions embedded in final and midterm exams, as well as scored student presentations. The assessment items, rubrics, raw scores, and sorted data for all class assessments are available in Appendix A. Student work is available upon request.
The rubrics applied varied slightly from course to course. In Math 435 the instructor applied a 3-point rubric (3 = Exemplary, 2 = Proficient, 1 = Not Acceptable). In Math 337 the instructor applied a 4-point rubric (4 = Exemplary, 3 = Proficient, 2 = Adequate, 1 = Inadequate). The rubrics for Math 341 and 342 followed similar rules.
The performance items and rubrics are available in the appendices.
6. Assessment Results
Table 1:

Student Learning Outcome | Where sample is from (size) | Measure                       | % Exemplary + Proficient by problem
-------------------------|-----------------------------|-------------------------------|------------------------------------
GC3 M                    | MATH 435 (5)                | Embedded final exam item 1    | 100%
GC3 M                    | MATH 435 (5)                | Embedded final exam item 2    | 60%
GC3 M                    | MATH 435 (5)                | Embedded final exam item 3    | 60%
GC3 M                    | MATH 435 (5)                | Embedded final exam item 4    | 80%
GC3 M                    | MATH 337 (11)               | Embedded final exam item 5    | 88.8%
GC3 P                    | MATH 337 (11)               | Embedded final exam item 6    | 76.7%
GC3 I                    | MATH 337 (11)               | Embedded final exam item 7    | 73.9%
GC3 IP                   | MATH 341 (21)               | Embedded final exam item 8    | 57.1%
GC3 P                    | MATH 342 (15)               | Embedded midterm exam item 9  | 46.7%
Com2 P                   | MATH 342 (15)               | Student presentation item 10  | 46.7%
Com2 M                   | MATH 342 (15)               | Student presentation item 11  | 60.0%
7. Analysis / Interpretation of Results
All of the courses assessed this year are upper-division courses designed specifically for our mathematics majors. The data presented in the table represent all of the students enrolled in those courses. The raw data are available in Appendix A.
Math 435: Items 1-4 in Table 1 pertain to Math 435, an advanced linear algebra course that carries elementary linear algebra as a prerequisite. The course is designed primarily for students enrolled in our General mathematics option but also serves as an elective for those in other options. All items were intended to assess SLO GC3 at the Mastery level.
Item 1: This item addresses the skill of testing whether a subset is a subspace and the concept of closure under addition and scalar multiplication. The concept is fundamental in linear algebra and should be mastered by all mathematics majors who complete this course successfully. The test data indicated that 100% of the students completed this question at the level of proficient or above.
Item 2: This item addresses the skill of proving the equivalence of three statements and the concepts of null space, range, and the relation between the dimensions of these spaces. The specific task required knowledge of linear transformations and the ability to provide a solid mathematical proof. While we expect all General mathematics majors to be able to prove a variety of results in linear algebra, we do not expect all of them to be able to prove all results. Thus this single proof item can neither qualify a student as having mastered proof nor disqualify one for failing this one item. This points to a fundamental weakness in this type of "item" approach to program assessment that will be discussed further in this report; it is a prime example of a case where a battery of assessment items is required to properly determine whether a student has mastered this particular SLO. The data presented in Table 1 indicate that 60% of the students were proficient or better on this single item.
Item 3: This item addresses the skill of proof and the concepts of eigenvector, eigenvalue, and the equivalence of a zero eigenvalue with a nontrivial null space. In combination with items 2 and 4, the relative mastery of proof by the students in the class can be inferred with a certain degree of confidence. On this assessment item, as on item 2, 60% of the students scored proficient or better. It should be noted that exactly the same students who scored proficient or better on item 2 did so on item 3.
Item 4: This item addresses the skill of computing eigenvectors and eigenvalues, another fundamental ability expected of all mathematics majors who successfully complete this course. The table indicates that 80% of the students, including all those who were proficient on the proof problems, were proficient on this item.
The data presented are intended to assess SLO GC3 at the Mastery level. What they show is that 60% of the students are meeting the expectations of the course and 40% are not. Our goal is obviously to educate 100% to the level of proficient or above. Does this data indicate some program deficiency that can be articulated and corrected? I think not.
With a sample size of only 5 students, there is simply no statistical significance that would merit program adjustments. No program improvement actions are planned based on these data, nor are revisions of metrics indicated.
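To make the small-sample caveat concrete, here is a minimal sketch (not part of the original assessment) of a 95% Wilson score confidence interval for the 60% (3 of 5) proficiency result; the function name and the choice of interval are ours, not the report's:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return max(0.0, center - half), min(1.0, center + half)

# Items 2 and 3 in Math 435: 3 of 5 students scored proficient or better.
low, high = wilson_interval(3, 5)
print(f"observed 60%; 95% interval roughly {low:.0%} to {high:.0%}")
```

The interval runs from roughly the low 20s to the high 80s in percent, which is why a single cohort of 5 cannot distinguish a struggling program from a healthy one.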
Math 337: Items 5-7 in Table 1 pertain to Math 337, an introduction to number theory required of majors in the Mathematics Education option. The course also serves as an elective for majors in other options, but it was selected for assessment relative to the Mathematics Education option. Each of the three items was designed to assess SLO GC3, but each at a different level.
Item 5: This item assesses GC3 (M). Students are asked to apply the standard proof technique of induction to the concept of modular arithmetic. Induction is a technique introduced in Math 119, precalculus, and then developed further in Math 330, methods of proof. Students in Math 337 are expected to demonstrate mastery of the technique. The data indicate that 88.8% of the students scored at the level of proficient or above.
Item 6: This item assesses GC3 (P) and asks students to solve a pair of modular arithmetic problems. While modular arithmetic is a relatively elementary concept, it falls into the "progressing" category because it is a more advanced topic not found in lower-division courses. The data indicate that 76.7% of the students tested proficient or above on this item.
Item 7: This item assesses GC3 (I) and asks students to apply their knowledge of public key encryption to decode a message. The application of advanced results from number theory is required to solve this problem. The method of public key encryption is first introduced in this course, and it is expected that students have never before been exposed to the concept; hence the concept is assessed at the introductory level. As indicated in Table 1, 73.9% of the students tested proficient or above.
What does all this data tell us about Math 337 and what that course contributes to our Mathematics Education option? It would seem to indicate that the large majority of students are doing just fine, and so no actions or adjustments seem necessary. But I would argue that the data really only tell us that these specific students did reasonably well on those three specific items in that specific course.

Number theory as an area of mathematics applies many different and intellectually accessible concepts to develop powerful analytic tools for discovering the structure of our natural number system. The topics and techniques illustrate to our students the amazing power of abstract thinking to develop very practical applications; they can serve to inspire students and prepare them to successfully tackle even more difficult and abstract material in our most challenging courses, Math 420 (Advanced Calculus) and Math 449 (Abstract Algebra). The true programmatic success of Math 337 is partially revealed by the subsequent success or failure of these students in those more advanced and challenging courses. Rather than assessing SLOs that are either too specific to relate to the larger picture or too broad to be measured with any real meaning, we should identify the elements of the scaffolding that lead our students to succeed with higher-level topics and concepts. Math 337 in its totality is part of that scaffolding. One authentic measure of the success of this course would be the correlation between the grades students earn in it and those they earn in the subsequent courses, Math 420 and Math 449. Neither Math 420 nor Math 449 carries Math 337 as a prerequisite, since Math 337 is required only for the Math Ed option. But if Math 337 were a prerequisite course, and students with A's in Math 337 ended up with F's in Math 420, then we would have evidence of a programmatic problem.

Assessing these SLOs in the narrow way we do, by looking at individual test items every five years, sheds little light on the quality of the structure, content, and delivery of our curriculum.
Math 341: Math 341 is a course designed exclusively for majors selecting the Mathematics Education option. It is not an elective for the other options, so the only students who take it are declared Math Ed majors. The department's program assessment matrix called for SLO GC3 to be assessed at both the introductory (I) and progressing (P) levels in this course this year. The instructor designed a single item to measure both levels and embedded it in the final exam. The item consisted of a time vs. velocity graph, and the students were asked to sketch the two possible corresponding time vs. distance graphs. The problem is highly conceptual and measures a student's comprehension of the connection between the rate of change of a quantity and the total amount of that quantity in a graphical setting. These are concepts that students are typically exposed to in their first-semester calculus course. It is possible for a student not to revisit the concepts for several semesters, but the level being assessed should probably still be characterized as (P) progressing. Details of the grading rubric are provided in the appendix. The instructor does not articulate how the (I) introductory level of assessment plays a role, other than to point out that the concept assessed is developed in Math 120.

Table 1 indicates that 57.1% of the students performed at or above the proficient level on the item. This level of proficiency is certainly less than one would expect, given both that students are expected to learn these ideas in Math 120 and that the instructor provided preparation consisting of related activities and review of these calculus ideas during class time. Furthermore, most of the students in this course are junior and senior math majors who should by now be quite familiar with this type of elementary, yet highly important, conceptual problem. The fact that they did relatively poorly may be evidence of some lurking program deficiency. Or it could simply mean that this particular class consists of more mathematically weak students than usual. The important question is this: How do we detect a programmatic weakness from a single item assessing a single SLO, particularly when it is done only once every five years?

Since we are all paying dearly for this program assessment, we decided to go the extra mile and research the strength of this particular class. The prerequisites included one other upper-division course, Math 330. Of the 22 students enrolled in Math 341 in the fall of 2011, 7 had been required to repeat the prerequisite course. An overall calculation of the GPA these students earned in the prerequisite course yielded an average of 1.73 grade points. With this additional information, the relatively poor showing of only 57.1% testing proficient or above becomes more understandable. It would appear that this prerequisite grade point information explains the lower-than-expected scores, rather than some program or curricular deficiency that needs to be corrected.

This analysis points to a fundamental principle of cohesive programs: prerequisite courses should be properly sequenced so that students who succeed in them are usually well-prepared and therefore also succeed in their subsequent courses. When students do poorly in a prerequisite course, it should be expected that they will also struggle in the subsequent courses. Thus one measure of proper mathematics program scaffolding is the relative grade distributions in sequential courses. We would expect that with proper preparation, success rates will increase in subsequent courses.
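The collective-GPA figure cited above is a straightforward average of grade points. A minimal sketch with a hypothetical roster follows (plus/minus grades omitted for simplicity; the actual Math 330 grades are in registrar records, not this report):

```python
# Standard 4-point scale; plus/minus grades omitted for simplicity.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def collective_gpa(grades: list[str]) -> float:
    """Average grade points over a roster of letter grades."""
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)

# A hypothetical roster whose average happens to land near the 1.73 reported above.
roster = ["C", "C", "D", "B", "C", "D", "F", "C", "B", "D", "C"]
print(round(collective_gpa(roster), 2))  # 1.73
```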
Math 342: Like Math 341, Math 342 is a course specifically designed for majors selecting the Mathematics Education option; it does not serve as a viable elective for any of the other options. The program assessment matrix called for SLO GC3 to be assessed at the progressing (P) level. General Content in the context of Math 342 is interpreted as the deeper understanding of the mathematics topics and content typically found in the secondary curriculum, a major focus of both Math 341 and Math 342. Item 9 in Table 1 was designed to provide this data. Students were asked, as part of their midterm exam, to give a partitive-model-based explanation for why dividing any number by a number less than 1 results in a larger number. A proper explanation requires the deeper content knowledge that teachers should have but are typically not taught as part of their secondary-level training. The standard 4-point rubric was applied (see appendix for details), and we see from the table that only 46.7% of the students performed at the proficient or above level. The instructor concludes in his remarks that this is likely an artifact of the way students are taught to divide in grade school: it is difficult to develop that deeper understanding when they already know an algorithm that produces the correct answer but don't really understand why the algorithm works.
Another explanation for the weak showing is that this is simply a weak class. Looking at this in greater depth by examining the roster, we found that 14 of the 15 students were the same students who were in the Math 341 class assessed the previous semester. The collective GPA earned by the 15 students in the prerequisite Math 330 course, however, was 2.42, considerably better than we saw in Math 341. Setting aside the fact that the sample size is only 15 and therefore carries little statistical significance, the data are consistent with the instructor's conclusion that more work will be required to develop the depth of knowledge of division that we would like to see in our prospective teachers.
Another goal of the Math 342 course is to develop students' abilities to communicate mathematics effectively. The program assessment matrix called for SLO Com2 to be assessed at both the progressing (P) and mastery (M) levels; items 10 and 11 in Table 1 were designed to provide that assessment. Both items called for students to deliver teaching presentations. For item 10 the presentation was to consist of the first 7-8 minutes of a lesson focused on some piece of conceptual knowledge; this assignment was conducted during the first third of the semester. Item 11 consisted of a more extensive teaching presentation, also focused on developing conceptual knowledge, but over a 20-25 minute period. The grading rubric for item 10 weighted heavily whether the student gave a conceptually oriented lesson; the most successful students would not only meet this requirement but also clearly develop significant mathematical ideas in the limited 7-8 minute time frame. As can be seen in Table 1, again only 46.7% of the students (7 out of 15) performed at the proficient or above level. The grading rubric for item 11 focused on three main points: (1) organization and execution, (2) content development and coherence, and (3) conceptual focus. This item was designed to measure SLO Com2 at the mastery level; for this assessment the class improved, and 60% of the students performed at the proficient or above level. In terms of assessing the degree program, these data tell us very little. The sample size of only 15 students is simply too small to support any valid judgments about the program.
8. Planned Program Improvement Actions Resulting from Outcomes:
No program improvements are planned as a result of the SLO assessment. However, the instructor in Math 341 expressed that he would like to share the results of the poor student showing on his assessment with the Math 120 instructors in hopes that they might reflect on how their students might develop better conceptual understanding. The instructor in Math 342 indicated that he would use his first student presentation as a teaching tool rather than an evaluative assignment.
9. Planned Revision of Measures or Metrics (if applicable)
The assessment coordinator has recommended that grade distributions be used to assess SLOs generally, rather than the current method, which carries such low statistical significance.
Other Changes
* We have made some progress in our attempt to institutionalize assessment. We added an Assessment Committee to our list of department committees. The committee has 2 members, and the term is 2 years. The duties of the Chair include writing the annual report and mentoring the new member for one year; the junior member becomes Chair during the second year of the term. We are hoping that by engaging more faculty in the assessment reporting process, they will learn to value assessment (or at least become more willing participants). The "apprentice" committee member this year is Nancy Carter, who also happens to be our resident expert statistician.
* The general consensus of the department faculty is that it is hard to find meaningful information coming out of the Degree Program Assessment Reports as they are currently done. This statement has been made every year in our report, and we have yet to receive any help or feedback on the concern. This and other comments in past reports that have seemingly gone unnoticed lead us to believe that these reports are not actually read by anyone. If the department doesn't find them useful, and nobody else is reading them, then a very serious waste of time has been going on for the past 4 years. Nevertheless, we are seriously interested in finding ways to assess our program and our classes meaningfully, in ways that produce results that help us improve them.
* Of course the biggest change this year is the creation of the Undergraduate Program Assessment Report, which utilizes statistically significant longitudinal grade distribution data to assess how well the SLOs are being met and to discover structural strengths and weaknesses in the program. With this methodology the SLOs are not addressed individually, so the degree to which they are being met must be inferred from overall student success in each course. An alternative that could produce the same type of statistically significant data would be to establish uniform assessment items for each SLO and each course in the curriculum and then collect data on student performance in every course over several years until a statistically valid picture emerges. We find this totally impractical; the grade distribution process, by contrast, holds promise because it can be executed relatively efficiently. Furthermore, the grade distribution methodology leverages the considerable efforts departments make to attract, develop, and retain outstanding faculty by assuming that these faculty take full account of the SLOs connected to their courses. It is safe and appropriate to assume that student grades produced by vetted tenure-track faculty reflect a comprehensive measure of how well the SLOs were achieved.
10. Planned Revisions to Program Objectives or Learning Outcomes (if applicable)
In 2011 we combined GC1 and GC2 into a single SLO (GC1) and renumbered GC3 and GC4 as GC2 and GC3, respectively. Each SLO now carries the designations I, P, and M, indicating the level of mathematical difficulty involved in the concepts, rather than being split into two different SLOs.
In 2012 we have suggested courses that could or should address SLO Com2. Until now we had not identified courses for each option that address this SLO, which relates to the oral communication of mathematics. It is becoming more common for students to present solutions and proofs in many of our courses. We therefore suggest that this practice be formalized in the following way:
Com2 I: Math 235 (core course) for all options.
Com2 P: Math 220 for all options; in addition, Math 342 for Math Ed.
Com2 M: Math 420 for all options; in addition, Math 449 for the General option, and both Math 449 and Math 342 for Math Ed.
This means that student presentations should become a regular part of the instruction routine in these courses.
We are still trying to determine how to obtain meaningful, measurable data for the Lifelong Learner SLO. Our current Assessment Plan calls for questionnaires to be sent to alumni, but we still do not have good contact information for our alums, and we do not yet have an acceptable questionnaire written. We can get a modest amount of information from exit interviews; this will be an issue for the 2012-13 Assessment Committee to work out.
11. Changes to Assessment Schedule (if applicable)
We propose using grade distribution data for all courses in the undergraduate curriculum every year to assess SLOs comprehensively in statistically meaningful ways.
12. Information for Next Year
What learning outcome(s) are you examining next year and who will be the contact person?
Assuming that program assessment survives the budget hatchet, and assuming the grade distribution method is rejected, 2012-2013 will be Year 5 of the Degree Program Assessment schedule. The schedule calls for collecting data in Math 360/361, Math 472/475/480, Math 356, and Math 350/456/457/458. The SLOs scheduled to be assessed in these courses are the following:
360/361: GC4 at the Introductory and Practice (IP) levels
472/475/480: GC4 at the Mastery (M) level
356: GC4 at the Introductory and Practice (IP) levels and PS at the Mastery (M) level
350/456/457/458: GC4 at the Mastery (M) level
In addition to the above, the exit interviews we normally conduct every year and the results of an alumni questionnaire are scheduled to be included in the fifth-year report.
The contact people for 2012-2013 will be Nancy Carter and Terry Kiser, along with an as-yet-unnamed (apprentice) Assessment Committee member.
II. Appendices (please include any of the following that are applicable to your program)
A. Assessment Data Summaries (Test Items, Rubrics, Raw Data)
Math 337
www.csuchico.edu/~rford/ProgramAssessment/DataAY2011_12/Math337AssessmentF11.pdf
Math 341
http://www.csuchico.edu/~rford/ProgramAssessment/DataAY2011_12/Math341DataF11.pdf
Math 342
http://www.csuchico.edu/~rford/ProgramAssessment/DataAY2011_12/Math342AssessmentDataS12.pdf
Math 435
http://www.csuchico.edu/~rford/ProgramAssessment/DataAY2011_12/Math435AssessDataF11.pdf
Math 435 Data
http://www.csuchico.edu/~rford/ProgramAssessment/DataAY2011_12/Math435DataF11.pdf
B. Mission, Goals, and SLOs, Assessment Schedule, and Course Alignment Matrix
Annual Program Assessment Update 20112012 (Math)
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/AnnualMathBSUpdate11_12.xls
Mission, Goals, and SLO’s
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/MissionGoalsSLOsS12.pdf
Assessment Schedule
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/FiveYrAssessPlanS12.pdf
Course Alignment Matrix
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/DegreeCoreAlignmentMatrix.pdf
Goal and SLO Alignment Matrix
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/GoalsSLOsMatrixS12.pdf
Course – SLO Matrix
http://www.csuchico.edu/~rford/ProgramAssessment/AssessmentPlan/SLO_CourseMatrixS12.pdf