Guidelines

Guidelines for Program Assessment Process

Guidelines for Annual Program Learning Outcomes Assessment Reports

Step 1: Guidelines for Program Assessment Process

Student learning outcomes (SLOs) should describe the measurable and/or observable knowledge, skills, abilities, or values that students should be able to demonstrate upon completion of the degree and/or certificate program.

Guidelines/Recommendations

  • Articulate 3-5 program-level SLOs that are specific, measurable and/or observable, and attainable. Select SLOs that faculty deem, at a minimum, to be the most important for all majors of a given degree or certificate program to achieve upon completion of their studies.
  • Additional SLOs can certainly be documented if required by a program’s accrediting agencies and/or if faculty decide to document more than 5 SLOs in any given assessment cycle or academic year.
  • Graduate degree and/or certificate programs should have more advanced learning outcomes (and different measures and criteria) than undergraduate programs in the same discipline.

TIP:  Use action verbs in Bloom’s Taxonomy to describe learning outcomes (see examples below).

  • Remembering (Knowledge): Collect, Draw, Duplicate, Examine, Identify, Indicate, Label, List, Locate, Memorize, Name, Quote, Read, Recall, Recite, Record, Reproduce, Select, Show.
  • Understanding (Comprehension): Associate, Change, Classify, Compute, Conclude, Contrast, Convert, Demonstrate, Describe, Determine, Differentiate, Discuss, Distinguish, Draw, Estimate, Explain, Extend.
  • Applying (Application): Apply, Calculate, Change, Chart, Complete, Construct, Contribute, Demonstrate, Develop, Discover, Dramatize, Employ, Establish, Examine.
  • Analyzing (Analysis): Analyze, Break down, Appraise, Arrange, Conclude, Contract, Categorize, Classify, Compare, Connect, Contrast, Correlate, Criticize, Debate.
  • Evaluating (Evaluation): Assemble, Appraise, Argue, Assess, Choose, Compare, Conclude, Consider, Construct, Contrast, Convince, Create, Critique, Decide, Defend.
  • Creating (Synthesis): Appraise, Argue, Assemble, Build, Collaborate, Classify, Collect, Combine, Compile, Compose, Construct, Create, Deduce, Defend, Derive, Design.

Examples of student learning outcome statements:

  • Upon completion of the degree and/or certificate program, students will demonstrate an understanding and application of basic and advanced research methods used in education. (Special Education, PhD)
  • Apply and understand the process and nature of science (Microbiology, BS)
      1. Explain how scientific knowledge differs from other ways of knowing
      2. Recognize and formulate clear, testable hypotheses
      3. Design and perform an experiment to test a hypothesis

  • Upon completion of the B.S. in Chemistry (Professional), students should be able to demonstrate a thorough understanding of chemical principles that underlie the inner workings of the natural world. Specifically, students should be able to:
        a) demonstrate knowledge of the chemical principles that are involved in intramolecular and intermolecular interactions
        b) demonstrate knowledge of the particulate nature of matter, including the structure-and-function relationships that underlie chemical phenomena
        c) demonstrate knowledge of the energetics that govern transformations of matter in the natural world
  • Students will demonstrate fundamental knowledge and comprehension of the major concepts, theoretical perspectives, and empirical findings in psychology. (Psychology, BA)

Step 2: Identify Appropriate Assessment Methods

Below are essential elements of this section:

A. Curriculum Map

Curriculum maps are very helpful in demonstrating where in the program’s curriculum learning outcomes are being addressed.  In essence, a curriculum map consists of a table with two axes, one pertaining to program learning outcomes, the other pertaining to courses in the major.

Example of a curriculum map:

Courses Required for the Major*   LO 1   LO 2   LO 3   LO 4   LO 5
ABC 1234                           L                    L      L
ABC 2345                           M      M      L      M
ABC 3456                           M             H             H
ABC 4568 (Capstone)                H      H      H      H      H

(LO = Learning Outcome)

*Required courses:  Program Learning Outcomes should be assessed in required courses for the major since all majors in the program must take them.  Electives should not be used to assess any program-level student learning outcome as not all students in the major are required to take them.

Note: L, M, and H describe the extent to which students experience the learning outcome: L = low emphasis, M = moderate emphasis, and H = high emphasis. Every required course should contribute to addressing one or more learning outcomes.
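If a curriculum map is kept as data, the rule that every required course addresses at least one outcome (and that every outcome is covered somewhere) can be checked mechanically. A minimal sketch in Python, using the course codes and emphasis levels from the example map above; the data structure itself is an assumption for illustration:

```python
# Curriculum map: required course -> {learning outcome number: emphasis}.
# "L" = low, "M" = moderate, "H" = high emphasis (values from the example map).
curriculum_map = {
    "ABC 1234": {1: "L", 4: "L", 5: "L"},
    "ABC 2345": {1: "M", 2: "M", 3: "L", 4: "M"},
    "ABC 3456": {1: "M", 3: "H", 5: "H"},
    "ABC 4568": {1: "H", 2: "H", 3: "H", 4: "H", 5: "H"},  # capstone
}

# Every required course should address at least one learning outcome.
uncovered_courses = [course for course, m in curriculum_map.items() if not m]

# Every program learning outcome should appear in at least one required course.
for outcome in range(1, 6):
    levels = [m[outcome] for m in curriculum_map.values() if outcome in m]
    print(f"Outcome {outcome}: addressed in {len(levels)} courses {levels}")

print("Courses addressing no outcome:", uncovered_courses)
```

A program could extend the same check to flag outcomes that are never assessed at high emphasis before the capstone.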

B. Assessment Methods

Assessment methods describe how each student learning outcome is measured.  There are two types of assessment measures (direct and indirect).

1)  Direct Measures AND Number of Students Assessed:  This section requires description of direct measure(s) used to assess students’ progress toward achievement of each learning outcome. Direct measures require students to demonstrate acquired knowledge and skills.  Below are examples.

Assessment Methods

  • Major Course Assignments
  • Oral Presentations*
  • Embedded test items
  • Capstone projects
  • Portfolios
  • Pre/Post Testing
  • Defenses*
  • Proposals*
  • Research Projects/Papers*
  • Comprehensive Exams*
  • Qualifying Exams*
  • Thesis or Dissertations*
  • Licensure/Certification Exams*
  • National/Standardized Exams*
  • Internship/Practicum Evaluation*

*These measures are suitable for assessing graduate-level learning outcomes. Assessment of graduate learning outcomes should focus on formative assessments that prepare students for culminating experiences (e.g., comprehensive or general examinations, thesis and thesis defense, dissertation and dissertation defense).

Each student learning outcome should be measured using a direct measure, as that is the only way to determine the extent to which students are able to demonstrate the knowledge, skills, and abilities they have acquired in a course or program. Please remember to indicate the total number of students assessed using a given measure.

2)  Indirect Measures AND Number of Students Assessed:  Indirect measures are commonly used to seek student opinions regarding knowledge and skills acquired in the program.  Findings from indirect measures should be used to augment those of direct measures.  Examples include surveys, focus groups and interviews.  Please include the total number of students participating in each activity.

3)  Performance Target (a.k.a. Criteria for Success): This refers to the desired level of performance faculty want to see, based on a given measure or method of assessment, that represents success at achieving a given student learning outcome.

Additional Guidelines/Recommendations regarding assessment measures:

  1. Each learning outcome should be assessed using multiple measures, one of which must be a direct measure. Multiple measures are desirable for triangulation of results.
  2. Since findings from indirect measures are self-reported, they cannot be used as the sole method of assessing student learning outcomes. They should be used to augment or supplement findings from direct measures.
  3. To inform improvement efforts, select measures that will identify relative strengths and weaknesses in students’ (aggregate) achievement of the learning outcome. For example, by using oral presentations as a measure of students’ communication skills, faculty may learn that, collectively, students’ skills are weaker in delivery and organization and stronger in content and adaptation to audience. It would be much more difficult to identify such strengths and weaknesses using classroom discussion as a measure.
  4. Consider using rubrics to score subjective assessments. Rubrics give those doing the assessment detailed descriptions of what is and is not being learned, as well as students’ collective strengths and weaknesses.

Examples of well documented Assessment Methods or Measures:

Direct Measure(s) AND Number of Students Assessed:

Assessment of senior Capstone research papers (N=30) by a faculty panel using a locally constructed 5-point rating scale where 5=“excellent,” 4=“good,” and 3=“satisfactory.”   85% should score satisfactory or better, and the mean score should reflect a better than satisfactory performance for the graduating class.
(Undergraduate – Dept. of Communication)

Indirect Measure(s) AND Number of Students Completing Surveys:
Graduating seniors (N=30) will be surveyed to gather their perceptions of basic knowledge and background in the communication discipline using a 5-point Likert scale, where 1=very well, 2=fairly well, 3=somewhat well, 4=not very well, and 5=not at all. 80% of respondents will rate their basic knowledge and background in the communication discipline at “very well” or “fairly well.”
(Undergraduate – Dept. of Communication)*

______________________________

*Statement was revised for accuracy.
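Performance targets like the two documented above can be checked directly against raw scores. A minimal sketch in Python; the score and response lists (N=30 each) are hypothetical, while the thresholds are the ones stated in the examples:

```python
# Direct measure: capstone papers scored on a 5-point scale
# (5 = excellent, 4 = good, 3 = satisfactory). Hypothetical scores, N=30.
paper_scores = [5, 4, 3, 4, 5, 3, 4, 4, 2, 5, 3, 4, 5, 4, 3,
                4, 5, 3, 4, 4, 3, 5, 4, 2, 4, 3, 5, 4, 4, 3]

pct_satisfactory = sum(s >= 3 for s in paper_scores) / len(paper_scores) * 100
mean_score = sum(paper_scores) / len(paper_scores)

# Targets: 85% at satisfactory or better, mean above satisfactory (3.0).
direct_target_met = pct_satisfactory >= 85 and mean_score > 3

# Indirect measure: survey on a 5-point Likert scale
# (1 = very well, 2 = fairly well, ..., 5 = not at all). Hypothetical, N=30.
survey_responses = [1, 2, 2, 1, 3, 2, 1, 2, 4, 2, 1, 1, 2, 3, 2,
                    1, 2, 2, 1, 2, 5, 1, 2, 2, 1, 3, 2, 1, 2, 2]

# Target: 80% of respondents answer "very well" (1) or "fairly well" (2).
pct_positive = sum(r <= 2 for r in survey_responses) / len(survey_responses) * 100
indirect_target_met = pct_positive >= 80

print(f"Direct: {pct_satisfactory:.0f}% satisfactory+, mean {mean_score:.2f}, "
      f"target met: {direct_target_met}")
print(f"Indirect: {pct_positive:.0f}% positive, target met: {indirect_target_met}")
```

Recording the computation alongside the raw counts makes it easy to report both the percentage and the total number of students assessed, as the guidelines request.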

Step 3: Analyze Student Achievement Data and Interpret Findings

This step focuses on documentation of results of the analysis of assessment data to indicate how students actually performed in each learning outcome based upon the assessment methods faculty selected for each outcome. The following key questions can be used to guide analysis of data:

What do the findings tell us?

It is very important to analyze assessment results in order to learn whether or not the criteria for the student learning outcomes were met. Analysis of data may provide important information regarding the relationship between assessment outcomes and relevant program indicators such as course grades. Further, department faculty may be able to find out the extent to which students change over time and/or whether students meet specified program expectations.

How is assessment data analyzed?

Analyzing data should include organizing, synthesizing, interrelating, comparing, and presenting the assessment results. These processes should be based upon the nature of the assessment questions asked, the types of data available, and the needs of faculty, students, and the wider university community, including stakeholders. Since the results of data analysis lend themselves to multiple interpretations, it is often valuable to review the analyzed data with others, as different perspectives lead to greater understanding.

What can data be compared to?

Data can be compared to results from previous assessments, baseline data, existing criteria/standards, etc.  For instance, department faculty may be interested in finding out if their majors learned or developed more as a result of participating in a course or program than students who did not participate.
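Such comparisons are simple to express once results are recorded consistently across assessment cycles. A minimal sketch in Python; the yearly percentages, baseline, and criterion are all hypothetical values for illustration:

```python
# Percent of majors meeting the performance target for one learning
# outcome, by assessment cycle (hypothetical values).
results_by_year = {"2020-21": 72.0, "2021-22": 78.0, "2022-23": 84.0}
baseline = 70.0   # result from the first assessment cycle
criterion = 80.0  # existing criterion/standard for success

for year, pct in results_by_year.items():
    change = pct - baseline
    status = "met" if pct >= criterion else "not met"
    print(f"{year}: {pct:.0f}% ({change:+.0f} vs. baseline); criterion {status}")
```

Keeping prior-cycle results in this form lets faculty see both change relative to baseline and whether the criterion for success was reached in a given year.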

Step 4: Make Adjustments based on Assessment Findings

The express purpose of assessment is to continuously improve student learning. In light of this, it is very important that assessment results are analyzed, interpreted, reflected upon, and, most importantly, used by faculty to make programmatic changes in the context of continuous improvement. The assessment process is not complete without “closing the loop,” that is, using assessment results for program change and improvement; this is the most important part of the assessment process.

The key questions regarding this step: How do faculty intend to enhance student learning based on results of student performance?  Where are students performing well (as expected)? Where are they falling short? Which areas of the curriculum should be emphasized more?

Assessment results can be used in a variety of ways including, but not limited to the following:

  • Curriculum Review and Revisions
  • Planning and Budgeting
  • Accreditation Requirements (regional and discipline specific)
  • State Requirements
  • Student Learning Outcomes Review and Revisions
  • Program Promotion/Marketing
  • Conference Presentations
  • Research and Publications
  • Boost Retention and Graduation Rates
  • Recruitment/Retention Initiatives
  • Grant Applications
  • Advising Improvements
  • Professional Development Opportunities