
FEEDBACK FOR ANNUAL PROGRAM ASSESSMENT REPORTS


As part of our ongoing efforts to strengthen the quality of feedback provided on annual assessment reports, the reviews of the 2024–2025 academic year reports were enhanced across all steps of the program assessment process. Below, you will find practical examples of common observations and the corresponding feedback offered for each.

As you review the feedback from the Office of Academic Assessment in the Platform system, you may notice that some content approved in earlier cycles now includes recommendations for revision. This shift does not indicate a decline in the quality of the assessment reports. Instead, it reflects our continuous improvement efforts to ensure greater clarity, stronger alignment, and overall effectiveness in our program assessment practices.


Process for Accessing and Viewing Feedback for Programs in your Department/School

  1. Log into the Platform using your 4x4 and password.
  2. Click on the “Assessment Report Feedback” folder located at the bottom left of the screen.
  3. In the “Review Year” box to the right of the program title, click the down arrow and select “2024-2025”. The color-coded results from our reviews of your 2024-2025 AY assessment report(s) will appear under the “LEGEND” section. To view the feedback in the system (and download it, if you wish), navigate to “PROGRAM ASSESSMENT” and select “Assessment Reports Review & Feedback”. The feedback document in Word format will appear in the upper right corner of the page.
  4. Refer to the “LEGEND” for definitions of each color-code designation, also detailed in the Program Assessment Review Rubric. To view the rubric directly on the Platform, navigate to “PROGRAM ASSESSMENT” and select “Assessment Reports Review & Feedback”. The rubric will be displayed in the top right corner.

Access Permissions

  • Assessment Liaisons have both read and write access for the programs they are responsible for.
  • Chairs and Directors have read-only access to programs in their respective departments or schools.
  • Deans and Associate Deans have read-only access to programs in their respective colleges.

If you're unable to access the system, please email Felix Wao at wao@ou.edu.


Common Observations and Corresponding Feedback

Observation: Missing or incorrect information (e.g., providing the department/school mission instead of the mission or description of the specific program, listing the wrong Assessment Liaison, or leaving the entire section blank).

Recommendation: Please complete/update the “General Information Area”. For “Mission of the Program”, simply provide the mission or a brief description of the program itself, not the general mission of the college, department, or school.

Observation: Too broad, very brief, or vague statements (e.g., “Graduates will demonstrate thorough understanding of Nuclear Physics”).

Recommendation: The program student learning outcome (PSLO) is too broad, vague, and lacks specificity. It’s unclear what “thorough understanding of Nuclear Physics” means. Revise the PSLO to more accurately reflect the specific knowledge, skills, and abilities that graduates are expected to demonstrate upon completion of the program. In short, specify what faculty expect graduates of the program to know and be able to do.

 

Observation: Statements with “process” terms (e.g., “recognize”, “familiar”, “gain”), as opposed to outcome verbs (e.g., “apply”, “describe”, “evaluate”).

Recommendation: Using “have” (and similar terms such as “understand”, “recognize”, “appreciate”, “gain familiarity”) in a PSLO statement is highly discouraged because such terms are not only “process” oriented (i.e., not outcome focused) but also lack clarity and do not convey measurable or observable student behavior. Instead, in all PSLOs, use Bloom’s Taxonomy action verbs that clearly describe what students will be able TO DO as a result of their learning.

 

NB: “Process” terms may be used if the PSLOs containing them are prescribed by the program’s specialized, discipline-specific accrediting agency.

 

Observation: Lack of context or examples of knowledge and/or competency in the PSLO (e.g., “Students will understand engineering concepts.”)

Recommendation: Besides being vague and using a process-based term, the PSLO statement lacks context. Restating it and embedding examples of what constitutes “engineering concepts” would make it clearer. Below are recommended formats for revising the above PSLO:

Format 1

Students will apply advanced engineering principles, such as thermodynamics and fluid mechanics, to analyze and design solutions for complex mechanical systems. Specifically, students will:

  • Identify and correctly apply thermodynamic laws to real-world engineering problems.
  • Apply fluid mechanics principles in calculating flow rates and pressure drops in system designs.
  • Develop and justify design solutions using appropriate engineering software and tools.

Format 2

Graduates will apply advanced engineering principles, such as thermodynamics and fluid mechanics, to analyze and design solutions for complex mechanical systems. They will demonstrate this by correctly identifying and applying thermodynamic laws to real-world problems, applying fluid mechanics principles to calculate flow rates and pressure drops, and developing justified design solutions through appropriate engineering software and tools.

 

Observation: Identical PSLOs for multiple programs at the same or different levels (e.g., the same PSLOs for both the MS and PhD in the same discipline, or the same PSLOs for two different master’s or bachelor’s programs within the same department/school).

Recommendation: The Program Student Learning Outcomes (PSLOs) for the master’s and doctoral programs are identical. This is not acceptable, as programs within the same discipline must clearly differentiate the expected depth, complexity, and scope of student learning. While some shared competencies are reasonable—particularly in core knowledge and skills—each program must clearly define distinct PSLOs that reflect its specific disciplinary focus, methodologies, and professional expectations. The PhD program’s PSLOs should be revised to reflect the higher-level expectations appropriate to doctoral study—specifically, greater depth of knowledge, methodological and theoretical originality, independent research competence, and a substantive contribution to the discipline. These revisions should make the doctoral PSLOs demonstrably more advanced than those of the master’s program.

 

Observation: Reporting on only one or two PSLOs.

Recommendation: Per OU's program assessment guidelines, each degree program is expected to articulate and report (annually) on 3-5 measurable program student learning outcomes (PSLOs) that define the specific knowledge, skills, and abilities that graduates are expected to demonstrate upon completion of the program. Certificate programs should document 2-3 PSLOs.

 

Observation: Lack of description of the assessment method, including where assessments are conducted (e.g., simply stating “problem solving”, “capstone projects”, or “comprehensive exams” as assessment methods without providing details).

Recommendation: In addition to identifying the assessment method (e.g., “Thesis” or “Capstone Project”), please include a brief description of the specific projects, assignments, or activities and specify where assessments take place (i.e., the specific required courses in which they are conducted). Clearly articulate how student work will be evaluated by outlining the assessment criteria—such as a rubric—that defines the standards and performance levels faculty use to assess the quality of student work.

 

Observation: Lack of description of Performance Targets

Recommendation: Provide a statement that specifies the performance targets—or expected levels of achievement—for each PSLO. These targets should align with the planned assessments for the PSLO and reflect the criteria and rating scales (or a rubric) that will be used to evaluate the quality of student work. Be sure to attach the rubric on the Platform system.

 

Observation: Using end-of-course letter grades (as opposed to grades on a particular assessment, assignment, or project).

Recommendation: Because end-of-course letter grades are designed to summarize overall course performance, not to measure achievement of specific program-level student learning outcomes (PSLOs), their use as evidence of achievement of any aspect of a PSLO is highly discouraged. This is partly because they incorporate other indicators (such as attendance) that have nothing to do with student learning. Instead, rely exclusively on student performance in assessments, assignments, or graded projects directly associated with the course content.

 

Observation: Reporting student performance for ALL PSLOs from a single project/assessment in an undergraduate capstone course or graduate level culminating experiences (e.g., thesis, dissertation defense).

Recommendation (undergraduate): Relying solely on capstone course data limits the department’s ability to make informed improvements because it overlooks important insights from required courses earlier in the program and overemphasizes graduating seniors, who may be very few compared to all majors in the degree program. To gain a more complete and useful picture of student learning, we highly recommend refining the assessment process to include achievement data for the PSLOs from major assignments across all required courses, ensuring the assessment reflects the experiences of all students, not just those at the end of the program.

 

Recommendation (graduate): At present, all Program Student Learning Outcomes (PSLOs) rely exclusively on culminating experiences (e.g., thesis and oral defense) to collect data on student achievement. From a program assessment perspective, this approach limits the school's ability to make informed, inclusive decisions about continuous improvement.

By focusing only on end-of-program assessments, the program misses valuable insights from student performance and feedback gathered by faculty earlier in the curriculum. For the next report, integrate assessments from key assignments or projects in required courses, as well as other essential processes such as prospectus preparation, committee reviews, and advisement activities. Doing so will provide a more comprehensive and actionable understanding of student learning across the entire program and ensure the report reflects the experiences of all students in the degree program—not just those nearing graduation.

 

Observation: Lack of description of the assessment results (e.g., simply stating that “all students scored 100% in the final examination” or “all three students who defended their dissertations passed”).

Recommendation: Provide detailed descriptions of student performance for each Program Student Learning Outcome (PSLO), addressing specific aspects assessed in the PSLOs. Highlight areas where students demonstrated strong achievement, as well as aspects that may require improvement. Additionally, where applicable, identify any observable trends over time and note any unexpected results or patterns that emerged from the data.

Observation: Lack of details regarding recommendations and/or action plans undertaken by faculty for continuous improvement.

Recommendation: Provide details regarding the recommendations and/or action plans faculty in the program intend to implement for continuous improvement of student learning, instruction, or the overall effectiveness of the program. In next year’s report, include specific recommendations and/or action plans for continuous improvement related to the program’s PSLOs.

 

Observation: Selecting the option “No Changes Needed” for all PSLOs or the same PSLO every year.

Recommendation: Assessment is fundamentally about continuous improvement. Even when students perform well, there are always opportunities to strengthen learning, enhance instruction, and improve the overall program. For next year’s report, be sure to share recommendations and/or action plans for continuous improvement related to the program’s PSLOs.

 

Observation: Lack of progress regarding recommendations from previous cycles. This occurs when a program indicates that faculty will undertake specific actions (e.g., XYZ) during the current assessment year but then fails to report on the outcomes of those actions in subsequent assessment reports. As a result, there is no evidence of follow‑through on the recommendations.

Recommendation: For this year’s report, provide the outcomes and progress made in implementing the recommendations and action plans outlined in last year’s report.