
 

Assessment Plan – Year 0 Report

This template contains embedded instructions and explanations for your information. Please delete unnecessary instructions and explanations, including this text, from your final report.

The Year 0 Assessment Plan Report is due October 15.

Academic Year of Year 0 Plan:

College:
Department:
Submitted by:

 

Program(s) to be Assessed.

List all majors, minors, certificates, and/or options that are included in this new Assessment Plan.

Majors/Minors/Certificates           Options
_________________________________    _________________________________
_________________________________    _________________________________
_________________________________    _________________________________
_________________________________    _________________________________

Is this a new program?  Yes ___  No ___

Are you keeping existing outcomes?  Yes ___  No ___

If no, please identify all that apply: 

Consolidating PLOs ____ 

Rewriting PLOs to be more assessable ____ 

Rewriting PLOs to be more aligned with program objectives ____ 

 

Other: 

 

Part 1: Program Learning Outcomes (PLOs).

PLOs should be written as specific, measurable statements describing what students will be able to do upon completion of the program. The assessment of PLOs provides feedback on the expected knowledge, skills, and attitudes that students develop as they progress through their program. Ideally, a program will have no more than 5 PLOs. If you have more than 7 PLOs, you can expand the table, but consider consolidating outcomes. You will need to assess all PLOs and will want this process to be manageable.

 

List the Program Learning Outcomes (these should match what is in CIM)

PLO #   PLO Description
1       _____________________________________________________
2       _____________________________________________________
3       _____________________________________________________
4       _____________________________________________________
5       _____________________________________________________
6       _____________________________________________________
7       _____________________________________________________

 

Part 2: Development of Assessment Plan.

Each plan will require the following information:

a) Threshold Values.

Along with PLOs, plans should include threshold values: the minimums against which student achievement of each learning outcome is assessed. A threshold value is an established criterion that determines whether an outcome is met or not met.

 

b) Methods of Assessment & Data Sources.

Assessment plans require evidence to demonstrate student learning at the program level. This evidence can be in the form of a direct or indirect measure of student learning*.  Both direct and indirect assessment data must be associated with the program's learning outcomes.

 

*Data sources should be examples of direct evidence of student learning: specifically designed exam questions, written work, performances, presentations, projects (using a program-specific rubric – not a course grading rubric); scores and pass rates on licensure exams that assess key learning goals; observations of student skill or behavior; summaries from classroom response systems; and student reflections.

 

Indirect evidence of student learning includes course grades, grade distributions, assignment grades, retention and graduation rates, alumni perceptions, and questions on end-of-course evaluation forms related to the course rather than the instructor. These may help identify areas of learning that need more direct assessment but should NOT be used as primary sources of direct evidence of student learning.

 

c) Timeframe for Collecting and Analyzing Data.

Develop a multi-year assessment schedule that shows when all program learning outcomes will be assessed. Because graduate assessment reports are biennial, faculty review of assessment results may occur only every other year; however, an annual faculty meeting to review these data and discuss student progress may be beneficial.

 

d) Curriculum Map & Assessment Planning Chart.

Using the chart below, fill in the map. (This table can be recreated to make more room for PLOs.) All courses in a program should align with at least one PLO. Schedule assessment so that every PLO is assessed at least once every three years.

 

 

ASSESSMENT PLANNING CHART

Program Learning Outcomes | Course Alignments (include rubric, number, and course title) | Identification of Assessment Artifact | Year to be assessed (2023-2024 / 2024-2025 / 2025-2026 / 2026-2027 / 2027-2028)

(Rows left blank for completion.)

Part 3: What Will Be Done.

Explain how assessment will be conducted, who receives the analyzed assessment data, and how it will be used by program faculty for program improvement(s). 

a) How will assessment artifacts be identified?

b) How will they be collected (and by whom)?

c) Who will be assessing the artifacts?

 

Part 4: Assessment-Specific Rubrics. 

All plans must include program-specific assessment rubrics (the methodology for how student artifacts are to be assessed). These are different from course-specific rubrics. Program-specific rubrics are developed to create indicators (or criteria) for each PLO describing what student work should demonstrate to support the PLO(s) being assessed. In some cases, a program-assessment rubric can hold multiple PLOs and indicators that are assessed across the same student artifacts. A course-specific rubric may sometimes contain an indicator that also works for a program-specific rubric, but course-specific rubrics should never be used as program-specific rubrics for assessment: measuring whether students achieve the outcomes of a course is not the same as determining whether a course is achieving the outcomes of a program. Include a threshold for student success attainment. The chart below is an example of the information requested; you can configure your rubrics in different ways. Examples provided should be deleted before submission.

Example: PLO #1: Demonstrate a substantive breadth of knowledge in the field of study.

Threshold Values: 80% of students will meet or exceed Level 3 competency

Indicators or Criteria, by competency level:

Analysis of Information, Ideas, or Concepts
Level 1: Identifies problem types
Level 2: Focuses on difficult problems with persistence
Level 3: Understands the complexity of a problem
Level 4: Provides logical interpretations of data

Application of Information, Ideas, or Concepts
Level 1: Uses standard solution methods
Level 2: Provides a logical interpretation of the data
Level 3: Employs creativity in search of a solution
Level 4: Achieves clear, unambiguous conclusions from the data

Synthesis
Level 1: Identifies intermediate steps required that connect previous material
Level 2: Recognizes and values alternative problem-solving methods
Level 3: Connects ideas or develops solutions in a clear, coherent order
Level 4: Develops multiple solutions, positions, or perspectives

Evaluation
Level 1: Checks the solution against the issue
Level 2: Identifies what the final solution should determine
Level 3: Recognizes hidden assumptions and implied premises
Level 4: Evaluates premises, relevance to a conclusion, and adequacy of support for the conclusion

 

 

 

Part 5: Program Assessment Planning & Report Communication.

a) How will annual assessment be communicated to faculty within the department? How will faculty participating in the collection of assessment data (student work/artifacts) be notified?

 

b) When will the data be collected and reviewed, and by whom?

 

c) Who will be responsible for the writing of the report?

 

d) How, when, and by whom will the report be shared?

 

Part 6: Closing the Loop(s).

“Closing the Loop” is the self-reflective portion of assessment in which faculty evaluate how a PLO was assessed previously and compare those results with the findings in the current report. The goal of program assessment is continual improvement of student learning, even when thresholds have been met. How will Closing the Loop be documented going forward? How will past assessments be used to inform changes and improvements?

 

Other Comments:

 

Submit the report to [email protected]

Upload the Assessment Plan to the department website for future reference.