Academic Year Assessed

2022 - 2023

Program(s) Assessed

Department of English Writing Option

  • Past Assessment Summary. Briefly summarize the findings from the last assessment report conducted related to the PLOs being assessed this year. Include any findings that influenced this cycle’s assessment approach. Alternatively, reflect on the program assessment conducted last year, and explain how that impacted or informed any changes made to this cycle’s assessment plan.

 We are starting after our Year 0 plan submission, so this is the first assessment of the Writing option.

  •  Action Research Question. What question are you seeking to answer in this cycle’s assessment?

 How well are we implementing research instruction in our teaching?

  •  Assessment Plan, Schedule, and Data Source(s).
    • Please provide a multi-year assessment schedule that will show when all program learning outcomes will be assessed, and by what criteria (data).

ASSESSMENT PLANNING SCHEDULE CHART

PROGRAM LEARNING OUTCOME   COURSES MAPPED TO PLOs   2021-2022   2022-2023   2023-2024   2024-2025
#3                         WRIT: 300, 494                       X
#1                         WRIT: 300, 494                                   X
#2                         WRIT: 205, 494                                               X

    • What are the threshold values for which your program demonstrates student achievement?

Threshold Values

PROGRAM LEARNING OUTCOME: 3. Research: Students will be able to find, evaluate, and engage with various resources and methodologies as they compose texts that apply information literacy. Students will demonstrate an understanding of disciplinary research strategies, conventions, and integrity.

Threshold Value: 75% of students will meet or exceed Level 3 competency per the rubric below (1-4)

Data Source(s)*: Randomly selected student essays

 

What Was Done

Was the completed assessment consistent with the program’s assessment plan? If not, please explain the adjustments that were made.                               

       Yes                                                        

How were data collected and analyzed and by whom? Please include method of collection and sample size.

Participants in the reading workshop, led by two members of the department Assessment Committee who are also writing faculty, read all assessment artifacts and made notes on them prior to the workshop. The workshop was scheduled after the Spring semester during off-contract time, with the four participants compensated from assessment grant funds. Readers met for three hours to discuss and reach consensus ratings on artifacts for the three assessed courses, spending about one hour on the six artifacts from each course. As faculty in the Writing option, the readers found significant value in this approach for fostering wide-ranging discussion of 1) the role of each course in contributing to PLOs, 2) the different configurations a given course and its assignments take when the same course is taught by different faculty (as all of ours are), and 3) the need to be conscious of the impact of those differences.

Please provide a rubric that demonstrates how your data were evaluated.

PLO#3 Research: Students will be able to find, evaluate, and engage with various resources and methodologies as they compose texts that apply information literacy. Students will demonstrate an understanding of disciplinary research strategies, conventions, and integrity. 

Threshold Value: 75% of students will meet or exceed Level 3 competency

Level 1: Able to develop a research question. Shows awareness of resources available for addressing research questions. May not yet be effective at incorporating research sources.

Level 2: Able to utilize available resources to address a research question and able to assess sources for validity. Demonstrates appropriate citation conventions.

Level 3: Conducts a research project to address an original research question. Composes an original text engaging with appropriate external sources.

Level 4: Conducts an original, empirical research project to address an original research question.

 

 

What Was Learned

Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?

Note: While we are focusing on research in particular, which is the last to be discussed here, our conversation was much broader and we have included some of the notes from that below.

    • The assessment reading workshop created a site of rich conversation on curricular design and faculty values for student learning, as well as producing clear focuses for extending the conversation through the coming year. We found this conversation more valuable than the purely numerical outcomes that emerged from it (we comment on some limitations below) and are pleased with how the occasion of reviewing artifacts of student learning lets us ground curriculum-development conversations in concrete student performance as well as faculty’s own classroom experiences.  
    • Our PLOs – designed for use across all options in the English major – do not sufficiently address two outcomes that emerged in the assessment reading workshop as clear faculty values for student learning: 1) rhetorical awareness (sense of audience values, the uses to which writing is put, and the role of texts in presenting the writer’s face to a broader readership), and 2) emotional intelligence and the ability to speak in writerly ways about writing development and provide substantive feedback on writing to other writers (students). (We may also wish to consider a PLO related to writing self-efficacy as we continue assessing.)  
    • While we are committed to using end-of-course student learning reflections as our artifact in our sophomore-level course (WRIT 205), we need to adjust the reflection prompts we give students there to ensure that they are drawn to speak toward subjects, such as critical reading and research, centered by our PLOs. (In other words, better align the reflection prompt given in the class to the focus of our PLOs.) In this assessment, we found that one set of reflections in particular, while useful for establishing a baseline on many elements of student learning, didn’t address reading and research sufficiently to assess because students weren’t prompted to discuss those matters.  
    • We are pleased with the use of professional portfolios as the artifact of assessment from the 494 class, and articulating one or two PLOs that capture the breadth of achievement demonstrated by the portfolios will help us adjust teaching in earlier courses to more fully develop portfolios along the way. 
    • Because of the small numbers of students in assessed classes (usually 15-20) and the consequent small number of artifacts involved, percentage-based measures such as threshold values can very easily be skewed downward either by oversampling work from weaker students or by one artifact in six being non-assessable on a given PLO. We saw both these skews in our assessment data, and thus are not drawing strong conclusions from the numerical side of our assessment.  
    • Given our desired configuration of artifact collection (6 per course) and number of courses to draw artifacts from (3), a more meaningful threshold value for PLOs would be divisible by 3 (e.g., 66%) rather than 4 (e.g., 75%); we will consider adjusting the threshold values in coming assessment cycles. 
    • We deliberately assess the entire range of student performance by stratifying (rather than randomizing) artifact selection to collect work that instructors have seen as excellent, average, and weak. It is not possible to generalize, therefore, about class-by-class student achievement on PLOs; we see as much poor work from WRIT 494 as we do from WRIT 205, and as much strong work from each. What differs from course to course is the nature of the best performance we see related to each PLO. Some of our observations from this reading, organized by PLO: 
      • Reading: the WRIT 205 artifacts in which students discuss or demonstrate reading show a more limited conception of what can be done with a text than the WRIT 300 and WRIT 494 artifacts. The highest-achieving artifacts from those junior and senior courses are in highly mature conversation with texts and demonstrate reading ability and use of texts as high as MA level.  
      • Writing: Our writing PLO emphasizes purposefulness, audience awareness, and control of conceptual and textual elements spanning from organization to argumentation to editing. We are unsurprised to see 200- and even 300-level artifacts that demonstrate lack of audience awareness or editorial polish (rating 1). Seeing a 400-level portfolio that earned this rating in writing is disappointing, but not systemic. Across the course levels, we see the most pronounced growth in audience awareness and articulation of purpose (a sense of writerly “control” of, or explanation of, what they think readers should find useful in their writing). Our top Writing ratings, both in 300- and 400-level courses, are marking artifacts that genuinely impress us as writers and teachers of writing and show the writers to be well prepared for writing in professional or graduate sites. Particularly, the highest levels of performance suggest writers who will demonstrate flexibility / adaptability in approaching new / unfamiliar writing tasks. 
      • Research: In the two relevant courses for this outcome (WRIT 300 and WRIT 494), our students met the threshold in WRIT 300 with a 3.17 mean (artifact scores: 4, 3, 2, 3, 4, 3) and fell below threshold in WRIT 494 with a 2.8 mean (4, 3, 2, 1, NA [this student presented a creative writing portfolio], 4).
    • Our research PLO rubric – with a 4-rating requiring “Conducts an original, empirical research project to address an original research question” -- sets a very high bar in the opinion of many department faculty who are unsure such performance is possible prior to graduate studies. (We may modify this rubric in the coming assessment cycle.) Our WRIT 205 class is designed only to increase student awareness that such inquiry is one kind of research, not to give students practice in such work, and thus we are not able to assess this PLO in student reflections from that course. Our 300-level courses are designed to have students practice aspects of the PLO (such as developing a research question, and developing awareness of various disciplinary research methods), and our 494 course is designed to have students integrate all aspects of the PLO in an overarching research project. What we saw in this reading is that both 300-level and 494 student artifacts demonstrated a range of facility, from not yet what we wish to see, to higher performance than we could reasonably expect of all students. This range suggests that our PLO is indeed achievable and reached by many students, but currently among majors we don’t see the consistency of achievement that we would ideally like to.  
    • In order to ensure that we are reviewing the whole range of student achievement in our major, we ask instructors for three artifacts per class: one they consider high-achieving, one mid-achieving, and one low-achieving. However, we recognize the limitation that this sampling method does not ensure a representative sample of overall student performance on PLOs across the major. In comparison with course grade distributions, for example, our assessment sampling method oversampled low-achieving students. As a result, when our assessment readers concur with the selecting faculty that 1/3 of our sample is, by our request, achieving below threshold on PLOs, the resulting oversample of weaker students depresses achievement of threshold values.

 

What areas of strength in the program were identified from this assessment process?

    • The program’s decision (emerging from the Year 0 assessment plan) to standardize the use of a professional portfolio in all sections of WRIT 494, and to begin building instruction on / design of / maintenance of the portfolio into earlier coursework, including the introductory course to the major (WRIT 205), is proving to be a good plan now that we’ve launched it. It is providing students 1) a site for reflection that closes their own loop on learning and 2) a space to demonstrate the breadth of their abilities in reading, research, and writing that is uniquely reflective of their identities and emergent “brands.” While we need to adjust PLOs to take advantage of what the portfolio can show us as teachers, we are pleased with the breadth of ways the portfolios give us to recognize students’ learning and growth.
    • Our action research question for this cycle involved research, and we do see the kind of growth in student research abilities and products across coursework that we would want to.

 

What areas were identified that either need improvement or could be improved in a different way from this assessment process?

    • Assessment process itself: We’ve noted above ways we need to update our assessment process to make the most of our artifacts (modifying PLOs), as well as revising some of our writing prompts to provide more usable artifacts for our assessment. One example is the need to develop reflection prompts for WRIT 205 that emphasize (and build students’ familiarity with) more of our PLOs. 
    •  Our PLOs themselves also need improvement to better articulate the kinds of student learning that our assessment has shown us we’re seeking in their work. In conversation with Writing faculty, we will likely modify existing PLOs and/or add 1-2 new PLOs for our next assessment loop. 
    • We did not see any systemic gaps in student learning and development in this assessment loop, but we would always like to see more students achieve at higher levels. The assessment suggests to us that part of our curricular and teaching conversations for the coming cycle need to focus on helping students prioritize growth that expands their horizons rather than “doubling down” only on kinds of writing or abilities that are already their strength. 

How We Responded

Describe how “What Was Learned” was communicated to the department or program. How did faculty discussions re-imagine new ways program assessment might contribute to program growth/improvement/innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?

We have not yet had the opportunity to put these data in front of our faculty. We will do this in an upcoming (December or January) meeting.

How are the results of this assessment informing changes to enhance student learning in the program?

Please see above. These are in process.

Closing the Loop

This is our first round for the Writing option after the Year 0 report, so we do not yet have a loop to close.