Assessment Methods

Description of Assessment Methods Used to Assess PLO(s)

When describing methods used to assess student achievement of the PLO(s):

  • Include the sources of data and the collection methods.
  • State whether the instruments are direct or indirect.
  • Describe how the data were analyzed and evaluated.

If a rubric-based assessment was used:

  • Describe the rubric development and scoring process.
  • Describe the sampling and collection of artifacts, including the sample size and the number of artifacts.
  • For the scoring session, describe the number of readers for each artifact, the calibration of readers, and the resulting level of inter-reader reliability.
  • Aim for inter-reader reliability of at least .80 during your calibration session before reading student work (one way to check this is sketched below).
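
As a rough illustration, the sketch below computes two common agreement measures for a calibration session: simple percent agreement and Cohen's kappa, which corrects for chance agreement. The rater scores are hypothetical, and whether a .80 target is read as raw agreement or as a chance-corrected statistic is a local decision.

    from collections import Counter

    def percent_agreement(rater_a, rater_b):
        """Proportion of artifacts on which the two raters gave the same score."""
        return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

    def cohens_kappa(rater_a, rater_b):
        """Two-rater agreement, corrected for agreement expected by chance."""
        n = len(rater_a)
        observed = percent_agreement(rater_a, rater_b)
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: summed products of each rater's category proportions.
        expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                       for c in set(rater_a) | set(rater_b))
        return (observed - expected) / (1 - expected)

    # Hypothetical calibration scores on a 1-4 rubric for ten sample artifacts.
    rater_a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
    rater_b = [3, 4, 2, 3, 2, 1, 4, 2, 3, 3]

    print(f"percent agreement = {percent_agreement(rater_a, rater_b):.2f}")  # 0.80
    print(f"Cohen's kappa     = {cohens_kappa(rater_a, rater_b):.2f}")       # 0.72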

Selecting Effective Assessment Methods

  • Recognize that not every educational objective can be assessed at any given time.
  • Identify those that you prize most highly and that can be meaningfully measured.
  • Good assessment, according to Mary Allen (2002), is meaningful, manageable, and sustainable. In other words, planning for assessment requires setting priorities; it is neither prudent nor productive to measure everything that moves.
  • Select carefully those outcomes that your department is most interested in having students achieve.
  • You may not be able to assess all of your important learning outcomes in a single year, but remember that outcomes assessment is an ongoing process.

Methods of Assessment

Methods will vary depending on the learning outcome(s) to be measured. Direct methods ask students to demonstrate that they have achieved a learning outcome or objective; indirect methods ask students (or others) to report perceptions of how well students have achieved an objective or outcome.

Examples of Direct Methods

  • Capstone Courses: could be a senior seminar or designated assessment course. Program learning outcomes can be integrated into assignments.
  • Case Studies: involve a systematic inquiry into a specific phenomenon, e.g. individual, event, program, or process. Data are collected via multiple methods often utilizing both qualitative and quantitative approaches.
  • Classroom Assessment: is often designed for individual faculty who wish to improve their teaching of a specific course. Data collected can be analyzed to assess student learning outcomes for a program.
  • Collective Portfolios: Faculty assemble samples of student work from various classes and use the "collective" to assess specific program learning outcomes. Portfolios can be assessed by using scoring rubrics; expectations should be clarified before portfolios are examined.
  • Content Analysis: is a procedure that categorizes the content of written documents. The analysis begins with identifying the unit of observation, such as a word, phrase, or concept, and then creating meaningful categories to which each item can be assigned. For example, a student's statement that "I learned that I could be comfortable with someone from another culture" could be assigned to the category of "Positive Statements about Diversity." The number of times this type of response occurs can then be quantified and compared with neutral or negative responses addressing the same category (a minimal tallying sketch follows this list).
  • Embedded Questions to Assignments: Questions related to program learning outcomes are embedded within course exams. For example, all sections of "research methods" could include a question or set of questions relating to your program learning outcomes. Faculty score and grade the exams as usual and then copy exam questions that are linked to the program learning outcomes for analysis. The findings are reported in the aggregate.
  • Locally Developed Essay Questions: Faculty develop essay questions that align with program learning outcomes. Performance expectations should be made explicit prior to obtaining results.
  • Locally Developed Exams with Objective Questions: Faculty create an objective exam that is aligned with program learning outcomes. Performance expectations should be made explicit prior to obtaining results.
  • Observations: can be of any social phenomenon, such as student presentations, students working in the library, or interactions at student help desks. Observations can be recorded as a narrative or in a highly structured format, such as a checklist, and they should be focused on specific program objectives.
  • Primary Trait Analysis: is a process of scoring student assignments by defining the primary traits that will be assessed, and then applying a scoring rubric for each trait.
  • Reflective Essays: generally are brief (five to ten minute) essays on topics related to identified learning outcomes, although they may be longer when assigned as homework. Students are asked to reflect on a selected issue. Content analysis is used to analyze results.
  • Scoring Rubrics: can be used to holistically score any product or performance, such as essays, portfolios, recitals, oral exams, research reports, etc. A detailed scoring rubric that delineates the criteria used to discriminate among levels is developed and used for scoring. Generally, two raters review each product, and a third rater resolves discrepancies (a minimal adjudication sketch follows this list).
  • Standardized Achievement and Self-Report Tests: Select standardized tests that are aligned to your specific program learning outcomes. Score, compile, and analyze data. Develop local norms to track achievement across time and use national norms to see how your students compare to those on other campuses.
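
To make the tallying step of content analysis concrete, here is a minimal sketch. It assumes readers have already coded each statement by hand; the code only counts coded statements per category and stance so that positive responses can be compared with neutral and negative ones. All categories and data are hypothetical.

    from collections import Counter

    # Each statement has already been hand-coded by a reader as
    # (category, stance); the coding itself remains a human judgment.
    coded_statements = [
        ("Diversity", "positive"),  # e.g. "I learned that I could be comfortable
        ("Diversity", "positive"),  # with someone from another culture"
        ("Diversity", "neutral"),
        ("Diversity", "negative"),
        ("Teamwork", "positive"),
    ]

    # Tally stances within each category so they can be compared.
    tallies = Counter(coded_statements)
    for (category, stance), count in sorted(tallies.items()):
        print(f"{category:10s} {stance:9s} {count}")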
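For the scoring rubric workflow, here is a minimal sketch of double scoring with third-rater adjudication. It assumes any disagreement between the first two raters counts as a discrepancy (some programs instead let scores within one point stand); artifact names and scores are hypothetical.

    def final_scores(paired_scores, adjudicate):
        """Keep the shared score when the two raters agree; otherwise ask a
        third rater (the adjudicate callback) to resolve the discrepancy."""
        results = {}
        for artifact, (a, b) in paired_scores.items():
            results[artifact] = a if a == b else adjudicate(artifact)
        return results

    # Hypothetical scores on a 1-4 rubric; "essay-03" needs a third reading.
    paired = {"essay-01": (3, 3), "essay-02": (4, 4), "essay-03": (2, 4)}
    print(final_scores(paired, adjudicate=lambda artifact: 3))
    # {'essay-01': 3, 'essay-02': 4, 'essay-03': 3}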

Examples of Indirect Methods

  • Exit Interviews: Students leaving the university, generally graduating students, are interviewed or surveyed to obtain feedback. The data obtained can address strengths and weaknesses of an institution or program and/or assess relevant concepts, theories, or skills.
  • Focus Groups: are a series of carefully planned discussions among homogeneous groups of 6-10 respondents who are asked a carefully constructed series of open-ended questions about their beliefs, attitudes, and experiences. The session is typically recorded, and the recording is later transcribed for analysis. The data are studied for major issues and recurring themes, along with representative comments.
  • Interviews: are conversations or direct questioning with an individual or group of people. The interviews can be conducted in person or on the telephone. The length of an interview can vary from 20 minutes to over an hour. Interviewers should be trained to follow agreed-upon procedures (protocols).
  • Matrices: are used to summarize the relationship between program objectives and courses, course assignments, or course syllabus objectives to examine congruence and to ensure that all objectives have been sufficiently structured into the curriculum.
  • Surveys: are commonly used with open-ended and closed-ended questions. Closed-ended questions require respondents to answer from a provided list of responses; typically the list is a progressive scale ranging from low to high, or from strongly agree to strongly disagree (a minimal summary sketch follows this list).
  • Transcript Analysis: transcripts are examined to see whether students followed expected enrollment patterns or to explore specific research questions, such as differences between transfer students and students who enrolled as freshmen.
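
As an illustration of summarizing closed-ended responses, the sketch below reports the sample size, mean, and score distribution for each item on a five-point scale (1 = strongly disagree, 5 = strongly agree). Question wording and responses are hypothetical.

    from collections import Counter
    from statistics import mean

    # Hypothetical responses to two closed-ended items on a 1-5 scale.
    responses = {
        "The program prepared me to analyze data": [5, 4, 4, 3, 5, 2, 4],
        "The program strengthened my writing": [3, 3, 4, 2, 3, 4, 3],
    }

    for question, scores in responses.items():
        print(question)
        print(f"  n = {len(scores)}, mean = {mean(scores):.2f}")
        print(f"  distribution = {dict(sorted(Counter(scores).items()))}")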

Source: Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes Assessment Handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.
