Graduate Biennial Program Assessment Report

A printable Word document of this template can be found here.

Program Information: (Modify table as needed)

Degree(s) Assessed:

College or Administrative Division:

Department/School:

Report Submitted By:

Date Submitted:

Assessment Period:

Graduate assessment reports are to be submitted biennially. The report deadline is October 15th.


Biennial Graduate Assessment Process:

Every graduate program assessment must have the following key components:

  1. Program Description: The program plan (Plan A: Thesis; Plan B: Professional; or Plan C: Coursework) will define the nature of your PLOs. Ideally, the assessment plan would cover all plans the program offers, but that depends on the nature of your Master’s program.
  2. Program Learning Outcomes: PLOs are the accumulated knowledge, skills, and attitudes that students develop during a course of study in the program. Essentially, PLOs tell us what students will learn in the program. PLOs should be written as specific, measurable statements describing what students will be able to do upon completion of the program. Each PLO should contain an action verb and a learning statement. (For help in developing learning outcomes, see “Program Assessment Overview” under Resources on the Provost’s page: https://www.montana.edu/provost/assessment/program_assessment.html)
  3. Threshold Values: Along with program learning outcomes, program assessment reports should include threshold values that define the expected level of student achievement for each learning outcome.
  4. Methods of Assessment: Every assessment report needs evidence that demonstrates student learning at the program level. This evidence can be a direct or an indirect measure of student learning. Both direct and indirect assessment data must be associated with the program’s learning outcomes and collected within a timeframe determined by the program.
  5. Timeframe for Collecting and Analyzing Data: Please provide a multi-year assessment schedule showing when each program learning outcome will be assessed and by what criteria (data). Ideally, assessment data should be collected throughout the year on an annual basis. At a minimum, program faculty should schedule an annual meeting to review these data and discuss student progress toward the PLOs.
  6. Use of Assessment Data: The assessment report should identify who received the analyzed assessment data and how program faculty used the data for program improvement(s).
  7. Closing the Loop: Assessment reports should also reflect on previous assessments and program improvements. Based on assessments from previous years, please describe program-level changes that have led to improved outcomes.

 

1. Program Description:

 

 

2. Program Learning Outcomes, Assessment Schedule, and Methods of Assessment

ASSESSMENT PLANNING CHART

PROGRAM LEARNING OUTCOMES | 2016-2017 | 2017-2018 | 2018-2019 | 2019-2020 | Data Source*
--------------------------|-----------|-----------|-----------|-----------|-------------
                          |           |           |           |           |
                          |           |           |           |           |
                          |           |           |           |           |
                          |           |           |           |           |

3. Threshold Values for Program Learning Outcomes (please include assessment rubrics)

Threshold Values

PROGRAM LEARNING OUTCOME:
Example: Demonstrate oral and written communication skills to present and publish work in their disciplinary field.

Threshold Value:
First-year graduate students – 75% will demonstrate level 3 attainment of oral and written communication skills (as defined by the assessment rubric).
Terminal-year graduate students – 100% will demonstrate level 3–4 attainment.

Data Source:
Graduate seminar presentations; posters/publications.


PROGRAM LEARNING OUTCOME:

Threshold Value:

Data Source:

4. What Was Done


a) Was the completed assessment consistent with the plan provided?
YES_____ NO_____
If no, please explain why the plan was altered.

b) Please provide a rubric that demonstrates how your data were evaluated.
(The example provided below should be deleted before submission; your rubric may look very different – it just needs to explain the criteria used to evaluate student achievement.)
Example:

Indicators | Beginning (1) | Developing (2) | Competent (3) | Accomplished (4)
-----------|---------------|----------------|---------------|-----------------
Analysis of Information, Ideas, or Concepts | Identifies problem types | Focuses on difficult problems with persistence | Understands the complexity of a problem | Provides logical interpretations of data
Application of Information, Ideas, or Concepts | Uses standard solution methods | Provides a logical interpretation of the data | Employs creativity in search of a solution | Achieves clear, unambiguous conclusions from the data
Synthesis | Identifies the intermediate steps required to connect previous material | Recognizes and values alternative problem-solving methods | Connects ideas or develops solutions in a clear, coherent order | Develops multiple solutions, positions, or perspectives
Evaluation | Checks the solution against the issue | Identifies what the final solution should determine | Recognizes hidden assumptions and implied premises | Evaluates premises, their relevance to a conclusion, and the adequacy of support for the conclusion

 

5. What Was Learned: Results


Please include who received the analyzed assessment data and how program faculty used the data for program improvement(s).

a) Areas of strength

 

b) Areas that need improvement

6. How We Responded

a) Based on the faculty responses, will there be any curricular or assessment changes (such as plans for measurable improvements, or realignment of learning outcomes)?
YES______                         NO_______

 If yes, when will these changes be implemented?

Please include which outcome is targeted and how changes will be measured for improvement. If other criteria are used to recommend program changes, please explain how the responses are driving department or program decisions.

b) When will the changes be next assessed? 

7. Closing the Loop

 

a) If the program or curriculum has been changed in response to concerns from previous assessments, what impact (if any) have the changes had on achieving the desired level of student learning outcomes?

 

NOTE: Student names must not be included in data collection. Discussion of successful completions and the manner of assessment (publications, thesis/dissertation, or qualifying exam) may be presented in table format if they apply to learning outcomes. In programs where enrollment is very small and individuals could be identified, the focus should be on programmatic improvements rather than individual student success. Data should be collected throughout the year on an annual basis.

Submit report to [email protected]