Assessment Plan 2013

CALIFORNIA STATE UNIVERSITY, CHICO ANNUAL PROGRAM ASSESSMENT REPORT

BS, Department of Mathematics and Statistics (4 Options: Applied, Pure, Education, Statistics)

 

Date: 9/3/13

 

I.       Assessment of Student Learning Outcomes

  1. Name and Contact Information of Program Assessment Coordinator: Nancy Carter

Department of Mathematics and Statistics, zip 0525, 898-6562

ncarter@csuchico.edu

 

  2. Goal Statements and Student Learning Outcomes

[General Content] Graduates are proficient in performing basic operations on fundamental mathematical objects and have a working knowledge of the mathematical ideas and theories behind these operations.

 

GC1     Demonstrate basic skills and conceptual understanding of differential, integral, and multivariable calculus.

 

GC2     Demonstrate basic skills and conceptual understanding relating to fundamental mathematical objects introduced in our degree core, such as sets, functions, equations, vectors, and matrices.

 

GC3    Demonstrate more technical skills and deeper, broader conceptual understanding in core mathematical areas (such as analysis, geometry/topology, algebra, applied math, statistics) relevant to their option in the major.

 

[Critical Thinking/Problem Solving] Graduates use critical thinking and problem-solving skills to analyze and solve mathematical and statistical problems.

 

PS            Interpret and translate problems into appropriate mathematical language; then solve problems by applying appropriate strategies and interpreting the results.

[Communication] Graduates communicate mathematics effectively in a manner appropriate to career goals and the mathematical maturity of the audience.

Com1  Demonstrate the ability to effectively and accurately write on mathematical topics relevant to their mathematics option and appropriate to their audience.

 

Com2  Demonstrate the ability to effectively and accurately speak on mathematical topics relevant to their mathematics option and appropriate to their audience.

 

[Proof Proficiency] Graduates have a basic proficiency in the comprehension and application of proofs.

 

PP      Students can read mathematical proofs, extract the key ideas used in the proof, and convey the logic behind the proof; they can also write their own rigorous and logically correct proofs.

 

 

[Technology] Graduates know how to use technology tools (e.g., graphing calculators, computer algebra systems) appropriate to the context of the problem.

 

Tech       Students use technology to manipulate mathematical objects (e.g., functions, equations, data sets), to conduct mathematical explorations, to model problem scenarios, and to analyze mathematical objects.

 

[Life-long Learner] Graduates are aware of the important role of mathematics and have the interest and ability to be independent learners and practitioners.

 

LL1      Students demonstrate the ability to apply mathematics and statistics to new contexts (e.g., in other classes, the workplace, graduate school, or classes they teach).

 

LL2      Students recognize and appreciate the role that mathematics can play in their futures and in society in general.

 

3.      Course Alignment Matrix

 

See attached Excel Spreadsheet (Appendix B)

 

 

4.      Learning Outcome(s) Assessed in AY 2012-13 (Year 5 of Assessment Plan)

 

GC3-M, GC3-IP, PS-M, LL1-I, LL2-MP

 

5.      Assessment Methodology Used

The SLOs were assessed at the levels of mastery indicated in the list above (I, P, and M for Introductory, Practice, and Mastery), with the exceptions of LL1-I and LL2-MP, which are discussed later. Assessment items were embedded in exams for Math 315 and 458 and in projects for Math 480. In Math 315, the two instructors each constructed their own assessment questions, collaborating to ensure that the same topics (though not the exact same questions) were used for assessment in both classes. This approach was used because the two teaching styles were different and the timing of the topics was not the same for the two sections. There was only one instructor each for Math 361, 458, and 480. All five instructors constructed a rubric that was applied to rate performance at the Exemplary, Proficient, Acceptable, and Unacceptable levels. Each item was scored independently by the instructor.

 

Assessment items, rubrics, raw scores and sorted data for all class assessments are available in Appendix A. Student work is available upon request.

 

The rubrics applied varied from course to course. For MATH 315 and MATH 458, the instructors used points for each problem; the Exemplary, Proficient, Acceptable, and Unacceptable ratings were determined by the total points earned on the problem. The rubric for MATH 480 was based on perceived student grasp of concepts; understanding was categorized as Exemplary, Proficient, Acceptable, or Unacceptable. The performance items and rubrics are available in Appendix A.

 

6.      Assessment Results

 

 

| Student Learning Outcome | Sample source (size) | Measure | % Exemplary + Proficient by problem |
| --- | --- | --- | --- |
| GC3-IP | MATH 315 (4) | Embedded exam item 1, Section 1 | 75 |
| GC3-IP | MATH 315 (4) | Embedded exam item 2, Section 1 | 100 |
| PS-M | MATH 315 (4) | Embedded exam item 3, Section 1 | 100 |
| PS-M | MATH 315 (4) | Embedded exam item 4, Section 1 | 100 |
| PS-M | MATH 315 (4) | Embedded exam item 5, Section 1 | 25 |
| PS-M | MATH 315 (4) | Embedded exam item 6, Section 1 | 100 |
| PS-M | MATH 315 (4) | Embedded exam item 7, Section 1 | 75 |
| GC3-IP | MATH 315 (6) | Embedded exam item 1, Section 2 | 100 |
| GC3-IP | MATH 315 (6) | Embedded exam item 2, Section 2 | 100 |
| PS-M | MATH 315 (6) | Embedded exam item 3, Section 2 | 100 |
| PS-M | MATH 315 (6) | Embedded exam item 4, Section 2 | 100 |
| PS-M | MATH 315 (6) | Embedded exam item 5, Section 2 | 100 |
| PS-M | MATH 315 (6) | Embedded exam item 6, Section 2 | 100 |
| PS-M | MATH 315 (6) | Embedded exam item 7, Section 2 | 83 |
| GC3-M | MATH 458 (16) | Embedded exam item 1 | 81 |
| GC3-M | MATH 458 (16) | Embedded exam item 2 | 100 |
| GC3-M | MATH 458 (17) | Embedded exam item 3 | 77 |
| GC3-M | MATH 458 (17) | Embedded exam item 4 | 94 |
| GC3-M | MATH 458 (17) | Embedded exam item 5 | 82 |
| GC3-M | MATH 458 (17) | Embedded exam item 6 | 94 |
| GC3-M | MATH 480 (3) | Embedded project item 1 | 100 |
| GC3-M | MATH 480 (3) | Embedded project item 2a | 100 |
| GC3-M | MATH 480 (3) | Embedded project item 2b | 100 |
| GC3-M | MATH 480 (3) | Embedded project item 3 | 100 |
| GC3-M | MATH 480 (3) | Embedded project item 4 | 67 |
| GC3-IP | MATH 361 | Not done: the faculty member did not submit material | — |
| LL1-I | Exit interviews | Not done: the exit interviews had not collected the necessary data | — |
| LL2-MP | Alumni questionnaire | Not done: no database of graduate contact information was ever compiled and no questionnaire was ever developed | — |

 

 

 

7.      Analysis / Interpretation of Results

 

Results/Analysis/Interpretation

 

Math 315: Math 315 (Applied Statistics I) is a course for majors (particularly those in the statistics option) and also serves as a service course (mainly for students majoring in biology and nutrition). It has replaced Math 356 in the requirements for the statistics option; this was done to streamline the department's course offerings, since Math 356 often had a small enrollment. Math 315 introduces students to numerous basic and some advanced applied statistical procedures, but it is less theoretical than Math 356 since the majority of students are not statistics majors. The course was selected for assessment relative to the statistics option. Two sections were assessed. In both sections, students were separated into “majors” and “non-majors,” but only the assessments for the majors are included in the table in section 6 above. The raw data for majors, for non-majors, and for both groups combined are available in Appendix A.

 

Item 1 in section 1: This item addresses a student’s understanding of basic descriptive statistics and the interpretation of confidence intervals. Seventy-five percent of the majors scored at the Exemplary or Proficient level. This is what we should expect of our majors for this type of question.

 

Item 2 in section 1: This item addresses the student’s ability to interpret and translate problems into appropriate mathematical language and interpret the results. Specifically, students were asked to construct and interpret a confidence interval for a mean, and 100% of the sample scored at the Exemplary or Proficient level on this item. For majors, this should be fairly straightforward, and it was for this class.

 

Item 3 in section 1: This item addresses the skills of hypothesis testing and identifying Type I and Type II errors. It required the students to conduct a hypothesis test for a population mean, to state all parts of the test including the p-value, and to state their conclusion. It further required them to interpret Type I and Type II errors and to determine which type of error would have been possible for this problem. All four (100%) of the majors were rated as Exemplary or Proficient.
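
As a hedged illustration of the kind of one-sample test this item describes (using hypothetical numbers, not the actual exam data), the procedure can be sketched in Python:

```python
from math import sqrt

# Hypothetical sample and null value; the actual exam data are not reproduced here.
data = [52, 54, 48, 51, 55, 53, 49, 56, 52, 50]
mu0 = 50.0  # H0: the population mean equals 50

n = len(data)
mean = sum(data) / n
s = sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))  # sample standard deviation
t = (mean - mu0) / (s / sqrt(n))  # one-sample t statistic, df = n - 1

# Two-sided test at alpha = 0.05 with df = 9: the critical value is about 2.262.
reject = abs(t) > 2.262

# If H0 is rejected, only a Type I error (rejecting a true null) is now possible;
# if we fail to reject, only a Type II error is possible.
error_possible = "Type I" if reject else "Type II"
print(round(t, 2), reject, error_possible)
```

A full solution on the exam would also state the p-value and a conclusion in the context of the problem, as the item required.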

 

 

Item 4 in section 1: This item addresses a student’s skill for interpreting multiple comparisons. This is a crucial part of performing an analysis of variance. All (100%) of the four students did this at the Exemplary or Proficient level.

 

Item 5 in section 1: This item addresses a student’s skill at performing and interpreting simple linear regression and hypotheses connected to the regression. Only 25% of the four majors scored at the Exemplary or Proficient level. This is somewhat troubling, but the rubric allowed almost no margin of error, and some students struggle with interpreting the results of tests for the slope.

 

Item 6 in section 1: This item required the students to use a statistical package to run a backward multiple regression procedure, construct and interpret a residual plot and predict values for certain independent variables. This is very important for our majors and 100% of the math majors were rated as Exemplary or Proficient.

 

Item 7 in section 1: This item required the students to perform a two-factor analysis of variance, construct the interaction plot, test for significant interaction, interpret the results, and also test for one of the main effects. This is also basic and very important for our majors; 75% were rated as Exemplary or Proficient.

 

Item 1 in section 2: This item addresses a student’s understanding of basic descriptive statistics. All six (100%) of the majors scored at the Exemplary or Proficient level. (Similar to Item 1 in section 1)

 

Item 2 in section 2: This item addresses a student’s understanding of computation and interpretation of confidence intervals. All six (100%) of the majors were Exemplary or Proficient for this item. (Similar comments as for Item 2 in section 1)

 

Item 3 in section 2: This item addresses a student’s ability to interpret and translate problems into appropriate statistical language and interpret the results. Specifically, it addresses the skills of hypothesis testing and identifying Type I and Type II errors. It required the students to conduct a hypothesis test for a population mean, to state all parts of the test including the p-value, and to state their conclusion. It further required them to interpret Type I and Type II errors in the context of the problem. All six (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 4 in section 2: This item addresses a student’s ability to interpret and translate problems into appropriate statistical language and interpret the results. Specifically, it requires students to perform a single-factor analysis of variance, do the Tukey multiple comparisons, and also do the appropriate nonparametric equivalent (Kruskal-Wallis) procedure. This is a common type of applied statistical analysis, and it is important for our majors to be able to do it. All six (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 5 in section 2: This item also addresses a student’s ability to interpret and translate problems into appropriate statistical language and interpret the results. Specifically, it covers simple linear regression and its applications, including interpretation of the results. This is a common type of applied statistical analysis, and it is important for our majors to be able to do it. All six (100%) of the majors were rated as Exemplary or Proficient for this item. (Similar to Item 5 in section 1)

 

Item 6 in section 2: This item also addresses a student’s ability to interpret and translate problems into appropriate statistical language and interpret the results. Specifically, it addresses two-way analysis of variance along with Tukey multiple comparisons, factor plots, and interpretations. This is also a basic procedure for applied statistics. All six (100%) of the majors were rated as Exemplary or Proficient for this item. (Similar to Item 7 in section 1)

 

Item 7 in section 2: This item also addresses a student’s ability to interpret and translate problems into appropriate statistical language and interpret the results. Specifically, it addresses multiple regression including reading the computer output, interpreting the results and conducting the appropriate statistical tests. Five of the six majors (83%) were rated as Exemplary or Proficient. (Similar to Item 6 in section 1)

 

Items 1 and 2 in section 1 and items 1 and 2 in section 2 were intended to assess SLO GC3 at the introductory and practice (also called progressing) levels. Pooling the results of these items shows that approximately 95% of the item ratings for our majors were at the Exemplary or Proficient level, meaning our majors are meeting the expectations of the course. This is based on the results from the 10 majors enrolled in the two sections combined.

 

Items 3, 4, 5, 6 and 7 in both sections 1 and 2 were intended to address SLO PS at the mastery level. Combining the results from the two sections shows that approximately 90% of the time our majors are meeting this expectation for the course. Again, this is based on the results from the 10 majors that were enrolled in the two combined sections.
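
The pooled figures quoted above can be reproduced from the table in section 6. In the sketch below, the per-item counts are reconstructed from the reported percentages and sample sizes (e.g., 75% of 4 majors is 3 students); this is just the arithmetic behind the quoted rates, not new data:

```python
# (number rated Exemplary or Proficient, number of majors) for each item
gc3_items = [(3, 4), (4, 4), (6, 6), (6, 6)]         # items 1-2, sections 1 and 2
ps_items = [(4, 4), (4, 4), (1, 4), (4, 4), (3, 4),  # items 3-7, section 1
            (6, 6), (6, 6), (6, 6), (6, 6), (5, 6)]  # items 3-7, section 2

def pooled_rate(items):
    """Total Exemplary+Proficient ratings over total ratings, pooled across items."""
    return sum(p for p, _ in items) / sum(n for _, n in items)

gc3_rate = pooled_rate(gc3_items)  # 19/20 = 0.95, the "approximately 95%" figure
ps_rate = pooled_rate(ps_items)    # 45/50 = 0.90, the "approximately 90%" figure
print(gc3_rate, ps_rate)
```

Pooling counts in this way weights each item by its section's sample size, which matches the figures reported for Math 315.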

 

Math 458: Math 458 (Sampling Methods) is a course for majors (particularly those in the statistics option) but also serves as an introduction to sampling methods for students from other majors (e.g., biology, agriculture). There were 17 majors enrolled in this course. The course was selected for assessment relative to the statistics option.

 

Item 1: This item addresses estimating a population total based on a simple random sample and also determining the sample size required. Of the math majors, 81% were rated as Exemplary or Proficient.

 

Item 2: This item addresses estimating a population proportion based on a simple random sample and also determining the sample size required. Of the math majors, 100% were rated as Exemplary or Proficient.

 

Item 3: This item addresses estimating the mean, total and proportion from a stratified random sample. It also involves two types of sampling problems associated with stratified sampling. Of the math majors, 77% were rated as Exemplary or Proficient.

 

Item 4: This item asks students to construct ratio, regression and simple random sampling estimates for a population mean and to compare them to determine which is most appropriate. Of the math majors, 94% were rated as Exemplary or Proficient.

 

Item 5: This item addresses estimating the size of a population using capture-recapture methods. Of the math majors, 82% were rated as Exemplary or Proficient.
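
The exam item itself is not reproduced here, but capture-recapture estimation is commonly illustrated with the Lincoln-Petersen estimator; the sketch below uses hypothetical counts, not data from the course:

```python
def lincoln_petersen(marked_first, caught_second, recaptured):
    """Estimate population size as N = (M * C) / R, where M animals are marked
    and released, C are caught in a second sample, and R of those carry marks."""
    return marked_first * caught_second / recaptured

# Hypothetical example: 100 fish are tagged and released; a later sample of 60
# fish contains 15 tagged ones, so the tagged fraction 15/60 estimates 100/N.
estimate = lincoln_petersen(100, 60, 15)
print(estimate)  # 400.0 -- estimated population size
```

The estimator follows from assuming the proportion of marked animals in the second sample matches their proportion in the whole population.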

 

Item 6: This item addresses using a systematic sample to estimate a population mean and to interpret the result. Of the math majors, 94% were rated as Exemplary or Proficient.

 

All six items were intended to address SLO GC3 at the mastery level. Combining the results of the six items shows that approximately 88% of the time our majors are meeting this expectation for the course (based on a sample size of 17).

 

Math 480: Math 480 (Mathematical Modeling) involves the translation of real world phenomena into mathematical language. Students are required to do projects with applications such as mathematical theories of war, traffic flow, river pollution, water waves and tidal dynamics. The course has been selected for assessment relative to the applied mathematics option. All of the assessment items were embedded in student projects.

 

Item 1: This item addresses a student’s understanding of the diatomic molecule as a harmonic oscillator. All three (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 2a: This item addresses a student’s ability to provide a simplified description of the main processes in our body related to breathing. All three (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 2b: This item addresses a student’s ability to describe the motion of the diaphragm while breathing. All three (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 3: This item requires the student to give a physical interpretation of microscopic traffic flow and the main parameters of the model. All three (100%) of the majors were rated as Exemplary or Proficient for this item.

 

Item 4: This item requires demonstrated ability to solve the main equations of a macroscopic traffic flow model. For this item, 67% (2 of 3) of the majors were rated as Exemplary or Proficient.

All of the items were intended to address SLO GC3 at the mastery level. Based on the very small sample of three majors, our majors met the expectation for this course approximately 93.4% of the time (the average of the five item percentages). The sample size is so small that the results may be misleading.

 

Math 361: The faculty member who taught Math 361 in fall 2012 did not submit assessment material despite repeated reminders. As a result, this course will need to be assessed in the next cycle in addition to the courses that were to be assessed in 2013-14.

 

 

Exit Interviews: This SLO was to be assessed at the LL1 Introductory level but was not, since the exit interviews used by the department did not collect any information relating to assessment. The current assessment committee did not learn of this until June, too late to make any changes for the current cycle. Consequently, there was no data to evaluate for this SLO.

 

Alumni Questionnaire: This SLO was to be assessed at the LL2 practice and mastery levels. This was not done, since the department has still not compiled an address list for our alumni; without one, a questionnaire was not developed. Again, the current committee was not aware of this problem until it was too late to do anything this year, so there is no data to evaluate for this SLO.

 

 

8.  Planned Program Improvement Actions Resulting from Outcomes:

 

No program improvements are planned for Math 315, 458, 480 or 361 as a result of the SLO assessment.

 

9.  Planned Revision of Measures or Metrics:

 

There are no planned revisions to the measures or metrics for any of our course assessments. However, we do need to revise the exit interview so that it collects data for assessment, and the alumni questionnaire needs to be developed (and an alumni address list created) before we can gather data for the Lifelong Learning SLO.

Our current department chair, Terry Kiser, has already created a new exit interview; in particular, questions in section III of the exit interview address both LL1 and LL2 (see Appendix B). The incoming department chair, Rick Ford (the previous assessment chair), plans to make further changes to the exit interview form before administering it in 2013-2014. In particular, he plans to ask for contact information so that we can contact our recent graduates approximately one year after graduation to determine how they assess their knowledge, how they feel about what they learned or did not learn, and how that has impacted them.

Other Changes

 

* As stated in our previous reports, we have made some progress in our attempt to institutionalize assessment. We added an Assessment Committee to our list of Department committees. The committee has 2 members, and the term is 2 years. The duties of the Chair of the committee include writing the annual report, and also mentoring the new member for one year. The junior member becomes the Chair during the second year of the term. We are hoping that by engaging more faculty in the Assessment Reporting process, they will learn to value assessment (or at least be more willing participants). The “apprentice” committee member this year is Thomas Mattman.

 

* As also stated in last year’s report, the general consensus of the department faculty is that it is hard to find meaningful information coming out of the Degree Program Assessment Reports, as they are currently done. This particular statement gets made every year in our report and we have yet to receive any help or feedback on this concern. This and other comments that have been made in the reports that have seemingly gone unnoticed lead us to believe that these reports are not actually read by anyone. So if the department doesn’t find them useful and if nobody else is reading them, then there has been a very serious waste of time going on for the past 5 years.

Nevertheless, we are seriously interested in finding ways to assess our program and our classes meaningfully, in ways that produce results that help us improve them. We are still having problems getting some of our faculty to participate in assessment. (And even those who *do* participate do so more out of an innate willingness to cooperate than because they feel they will benefit from the process.) We believe the reason for this may be that it is hard to find meaningful information coming out of the assessment reports as they are currently done.

 

 

10.  Planned Revisions to Program Objectives or Learning Outcomes (if applicable)

 

As noted in last year’s report, in 2011 we combined GC1 and GC2 into a single SLO (GC1) and renumbered GC3 and GC4 as GC2 and GC3, respectively. Each SLO now carries the designations I, P, and M, indicating the level of mathematical difficulty involved in the concepts, rather than being split into two different SLOs.

 

In last year’s report (2012-2013 report), some possible courses that could or should address the SLO Com2 were suggested. As noted in that report, we have not yet identified courses for each option that address this SLO, which relates to the oral communication of mathematics. It is becoming more common for students to present solutions and proofs in many of our courses. We therefore suggest that this process be formalized in the following way:

 

Com2-I: Math 235 (core course) for all options.

Com2-P: Math 220 for all options and in addition, Math 342 for Math Ed.

Com2-M: Math 420 for all options, 449 for the General option, and both 449 and 342 for Math Ed.

 

This means that student presentations should become a regular part of the instruction routine in these courses.

 

As noted above in section 9, we are still trying to determine how to get meaningful, measurable data for the Lifelong Learning SLO. Our current Assessment Plan still calls for questionnaires to be sent to alumni, but we still do not have good contact information for our alums, and we do not yet have an acceptable questionnaire written. We plan to ask for contact information as part of the revised exit interview so we can begin to build a contact database. We also still plan to get some information from exit interviews using a new exit interview questionnaire (Appendix B contains a draft).

The new department chair (Rick Ford) has some ideas for improving the exit interview and for questions for the alumni questionnaire. He proposes that we conduct the alumni questionnaire every year by contacting the students who graduated the previous year. That is, we get information from our majors when they graduate (exit interview) and then contact them again the following year (after one year of graduate school or a job) to see if their impressions have changed. This would mean conducting and collecting information on LL1 and LL2 every year, which could be reported yearly or collected and summarized every five years. The department will need to make a decision about this; it will be an issue for the 2013-14 Assessment Committee. Since no data has been collected for the Lifelong Learning SLO to date, it might be a good idea to report the new exit interview data next year. The first alumni questionnaire data would then be collected the following year (2014-15) if Rick Ford’s suggestion is approved.

 

 

 

11.  Changes to Assessment Schedule (if applicable)

 

The only possible change would be to conduct the alumni questionnaire every year, as described in section 10 above; the LL1 and LL2 data could then be reported every year or summarized at the end of five years. Also, as noted earlier, Math 361 will need to be assessed out of sequence in 2013-14, but this is a one-time change and should not be considered a permanent schedule change.

 

 

12.    Information for Next Year

 

What learning outcome(s) are you examining next year and who will be the contact person?

 

2013-2014 will be Year 1 of a new assessment schedule. It is unclear whether we are to repeat the schedule followed for the previous 5 years or come up with a new one. If we do repeat the schedule, we will be able to see whether there have been any changes in the five years since data was originally collected, although we may still struggle with very small sample sizes for some courses. If we repeat the schedule, we will collect data in Math 120, Math 121, and Math 220. The SLOs that will be assessed from these classes are:

 

120  — Com1, GC1, PS, Tech, and Com2, all at the Introductory level

 

121  — Com1 at the Introductory level, GC1 at the Introductory and Practice levels, PS at the Practice level, and Tech at the Introductory level

 

220 —GC1 at the Practice level and PS at the Practice level

 

Also, since it was not completed in 2012-13, Math 361 — GC3 at the Introductory and Practice (IP) levels

 

 

The contact people will be Thomas Mattman and Rick Ford, and an as-yet-unnamed (apprentice) Assessment Committee member.

 

 

II.    Appendices (please include any of the following that are applicable to your program)

 

  A. Assessment Data Summaries (Test Items, Rubrics, Raw Data)

 

  B. Assessment Schedule and Course Alignment Matrix

 

 

 

 

Please submit completed reports electronically to your college assessment representative.