University Seminar Core Student Learning Outcome Assessment Report

 

Course Title:                                                  LS 101US: Ways of Knowing

 

Author of Report:                                          Bridget Kevane

 

Outcome Being Assessed:                         Written Communication

 

Semester and Year                                       Fall 2018

           

Course Enrollment:                                      104

 

Number of Course Sections:                      7 Fall Sections

 

Number of Assignments Assessed:           6 (5%)

 

Assessment Team:                          

Bridget Kevane, Director; Teresa Greenwood, Advisor/LS Instructor; Jennifer Storment, Program Manager; Sarah Coletta-Flynn, Online Program Manager/LS Instructor; John Townsend-Mehler, Online Program Advisor/LS Instructor

 

Method of Selecting Student Work:

The LS director asked each of the seven sections to submit two randomly chosen papers. A few sections did not submit the requested papers, but because the minimum of 5% was met, we assessed the papers we received. The assessment team scored each assignment using the Written Assessment rubric provided by the US Core Committee.

 

Method of Ensuring Inter-rater Reliability:

The assessment team met to discuss the rubric and the expectations for each ranking of students’ written work. To maintain objectivity, no one on the team assessed their own students’ work. To prevent scoring bias, no assessor could see the others’ rankings, and rankings were not recorded until everyone had submitted their completed rubrics.

 

Because five assessors reviewed each paper, some scores differed by more than one scoring category. For instance, on paper 2’s “Support” category, two assessors gave a “Meets Expectations” score, two gave “Above Expectations,” and one gave “Below Expectations.” Because the “Below Expectations” score was the outlier, it was discarded. This occurred five times, and in all five instances the outlier score was removed from the overall scores, allowing for a more consistent measure of overall student performance.

 

Notes about Scoring:

For each category, rubric scores were coded as “Below Expectations” (1), “Meets Expectations” (2), and “Above Expectations” (3). Assigning numeric values, with 3 the highest and 1 the lowest, allowed us to compute averages and compare final results more easily. See the tables below for reference.
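As a sketch only, the coding and outlier-removal procedure described above can be expressed in a few lines of Python. The function names and the sample scores are hypothetical, not part of the actual assessment workflow, and the tie-breaking rule for an evenly split pair of extremes is our assumption, since the report does not specify one.

```python
def drop_outlier(scores):
    """Drop a single outlying score when the scores span more than one
    rubric category (1 = Below, 2 = Meets, 3 = Above Expectations).
    The minority extreme is treated as the outlier."""
    scores = list(scores)
    if max(scores) - min(scores) <= 1:
        return scores  # all scores within one category: keep everything
    low, high = scores.count(min(scores)), scores.count(max(scores))
    # Remove one instance of whichever extreme has fewer votes.
    # (Assumption: on an exact tie we remove the high score; in practice
    # the team would likely discuss and resolve such a case by hand.)
    scores.remove(min(scores) if low < high else max(scores))
    return scores

def category_percentages(papers):
    """Pool the trimmed scores for one criterion across all papers and
    report the share of each ranking, rounded to one decimal place."""
    pooled = [s for paper in papers for s in drop_outlier(paper)]
    labels = {3: "Above Expectations",
              2: "Meets Expectations",
              1: "Below Expectations"}
    return {label: round(100 * pooled.count(code) / len(pooled), 1)
            for code, label in labels.items()}

# Hypothetical scores for one criterion: 6 papers x 5 assessors each.
support = [[2, 2, 3, 3, 1], [2, 2, 2, 2, 2], [1, 2, 2, 2, 2],
           [2, 2, 2, 3, 2], [3, 2, 2, 2, 2], [1, 1, 2, 2, 2]]
print(category_percentages(support))
```

In the first paper above, the lone “1” sits two categories from the two “3” scores, so it is dropped before the percentages are pooled; the remaining papers are left intact.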

  

Results:

 

Fall 2018 Writing Assessment Results

Criteria                     Above          Meets          Below          Outlier Scores Removed
                             Expectations   Expectations   Expectations   (not included in %)
Thesis or Claim              10.3%          48.3%          41.4%          1
Support                      10.3%          69.0%          20.7%          1
Exploration and Synthesis    10.7%          60.7%          28.6%          2
Language                     10.3%          75.9%          13.8%          1
Mechanics                    13.3%          83.3%          3.3%           0
Overall                      11.02%         67.43%         21.55%         n/a

 

 

 

Fall 2015 Writing Assessment Results (for comparison)

Criteria                     Above          Meets          Below          Not Applicable
                             Expectations   Expectations   Expectations   (assignment did not require this)
Thesis or Claim              12.5%          56.25%         31.25%
Support                      12.5%          37.5%          50%
Exploration and Synthesis    6.25%          31.25%         12.5%          50%
Language                     6.25%          50%            43.75%
Mechanics                    6.25%          56.25%         37.50%
Overall                      6.25%          46.25%         35%            10%

 

 

 

Action Plan for LS 101: It is difficult to compare the fall 2018 Writing Assessment results directly with the fall 2015 results for two reasons: (1) the number of papers assessed dropped from 16 in 2015 to 6 in 2018, and (2) the number of assessors increased from 2 in 2015 to 5 in 2018. In the future, we plan to collect more assignments so that the results are more robust and a single weaker or stronger paper does not dramatically alter the final results.

 

Regardless, “Thesis or Claim” remains the area of concern that has persisted since the 2015 assessment, with 41.4% of assessed scores falling “Below Expectations.” The other categories have far fewer scores in the “Below Expectations” column. Comparing fall 2018 with fall 2015, the additional work with students on “Language” and “Mechanics” appears to have paid off, as those categories ranked much higher in fall 2018. Again, however, because the fall 2018 sample of assignments was much smaller, these differences are difficult to compare accurately.

 

Strategies for Meeting the Learning Outcomes of the University Seminar:

We will encourage our instructors to give students more practice in the following area:

 

Area of Focus: Clear statement of a thesis and provision of relevant evidence to support it.

Action Plan: Over 40% (41.4%) of assessed scores fell in the “Below Expectations” category because no clear thesis or claim was made. We will review these results with our 101 instructors and discuss ways to better model thesis statements and how to provide support for one’s claims.