Graduate Biennial Program Plan & Assessment Report Template

Program Information: (Modify table as needed)

Degree(s) Assessed  
College or Administrative Division  
Department/School  
Report Submitted By  
Date Submitted  
Assessment Period  

Graduate assessment reports are to be submitted biennially. The report deadline is September 15th.

 

Part 1: Assessment Plan

Every graduate report must have the following key components.

Program Learning Outcomes (PLOs): PLOs should be written as specific, measurable statements describing what students will be able to do upon completion of the program.  The assessment of PLOs provides feedback on the accumulated knowledge, skills, and attitudes that students develop as they progress through their graduate program.  Plans should include PLOs that cover each type of graduate program offered (i.e., Master's Thesis, Professional, Coursework, Doctoral Dissertation, or Certifications).

(For help developing learning outcomes, see "Program Assessment Overview" under Resources on the Provost's page: https://www.montana.edu/provost/assessment/program_assessment.html)

Threshold Values: Along with PLOs, plans should include threshold values: the minimums against which student achievement of each learning outcome is assessed.  A threshold value is an established criterion by which outcome achievement is determined to be met or not met.

Methods of Assessment & Data Source:  Assessment plans require evidence to demonstrate student learning at the program level.  This evidence can take the form of direct or indirect measures of student learning.  Both direct and indirect assessment data must be associated with the program's learning outcomes.  An assessment rubric must also be included that demonstrates how the data were evaluated to assess student achievement.

Timeframe for Collecting and Analyzing Data:  Provide a multi-year assessment schedule showing when each program learning outcome will be assessed.  Because graduate assessment reports are biennial, faculty review of assessment results may occur only every other year; however, an annual faculty meeting to review these data and discuss student progress may be beneficial.

Part 2: Program Assessment

The assessment report should identify how assessment was conducted, who received the analyzed assessment data, and how program faculty used it for program improvement(s).  Assessment reports should also reflect on prior assessment cycles by identifying previous program-level changes that have led to outcome improvements.

NOTE: Student names must not be included in data collection.  Discussion of successful completions and the manner of assessment (e.g., publications, thesis/dissertation, or qualifying exams) may be presented in table format where they apply to learning outcomes.  In programs where numbers are very small and individuals could be identified, focus should be on programmatic improvements rather than individual student success.  Data should be collected annually, throughout the year.

Part 1: Program Assessment Plan

A) Program Description (from catalog):

B) Program Learning Outcomes, Assessment Schedule, Methods of Assessment, & Threshold Values

ASSESSMENT PLANNING CHART

PROGRAM LEARNING OUTCOMES | 2016-2017 | 2017-2018 | 2018-2019 | 2019-2020 | Data Source | Threshold Value
Example: Demonstrate a substantive breadth of knowledge in the field of study. | x | x | x | x | Qualifying Exam | 80% of students will meet or exceed expectations on the first qualifying exam attempt.
 |  |  |  |  |  |

(Examples provided should be deleted before submission; add additional rows as needed.)

Part 2: Program Assessment Results

A) What Was Done
             1) Was the completed assessment consistent with the plan provided?        YES_____            NO_____
             If no, please explain why the plan was altered.

            2) Please provide a rubric that demonstrates how your data were evaluated.


Example: Rubric for the Qualifying Exam for MS Thesis & PhD Students

Component | Expectations Not Met | Meets Expectations | Exceeds Expectations
Motivating the work | The reasons for the work are not covered or are only minimally covered | Big picture presented; reasons for the research question laid out | Motivation is clear, and documentation and/or data are used to show the importance and need for the work
Defining the specific research question | Not clear what problem is going to be addressed | Clear what problem is being addressed | Clear what specific problem is being addressed
Experimental design and analysis | Experiments and analysis are not clear; experiments are not tied to the research question; alternatives are not presented | Clear experiments and analysis with specific anticipated results and alternatives tied to the research question | Rigorous design of experiments and analysis that not only includes alternatives but is designed so that a negative finding is still very informative
Integration with core material | Core material is not understood well or is not connected to the proposal | Core material is referenced and relevant parts are used to strengthen the proposal | Core material is used to gain new and potentially important insights into the field
Writing | Writing is unclear; organization is poor | Writing is clear; organization is logical | Writing is at the level of a fundable grant
Presentation | Slides hard to read; organization poor; speaker cannot be heard clearly | Slides are clear; presentation is organized; speaker projects | Presentation equivalent to a talk at national conferences
Questions | Does not understand questions and/or is not able to answer questions | Understands and answers questions, with potentially some clarifications | Understands and responds to questions, and gives context to larger issues around the questions

(Example provided should be deleted before submission; your rubric may be very different, but it needs to explain the criteria used for evaluating student achievement.)

B) What Was Learned: Results
Please include who received the analyzed assessment data and how program faculty used it for program improvement(s).

1) Who were the recipients of the analyzed assessment data?

2) Areas of strength

3) Areas that need improvement

4) What else was learned?

 

C) Use of Assessment Data

1) Based on the faculty responses, will there be any curricular or assessment changes (such as plans for measurable improvements or realignment of learning outcomes)?
                YES______                         NO_______

 If yes, when will these changes be implemented?

Please include which outcome is targeted and how changes will be measured for improvement.  If other criteria are used to recommend program changes, please explain how the responses are driving department or program decisions.

2) When will the changes be next assessed? 

3) What are your goals moving forward?

D) Closing the Loop
Reflect on previous assessment and program improvements by identifying previous program-level changes that have led to outcome improvements.

1) What was identified as an area for improvement from the last report?

2) What was implemented to improve these outcomes?

3) What impact have the changes had (if any) on achieving the desired level of student learning outcomes?

Submit report to [email protected]