Critical Thinking Essay Rubric Text


Tod Porter, Joseph Palardy, Angela Messenger, and Hillary Fuhrman

One recommendation of Youngstown State University's (YSU) 2008 Higher Learning Commission (HLC) site team report was that the university needed to show consistency of commitment to assessment practices that inform curricular and program development, especially in general education (Higher Learning Commission 2008). In an attempt to address the team's concerns and improve writing and critical thinking at the university, YSU designed the Repository of Assessment Documents (ROAD) project. Developed during YSU's participation in the HLC's Academy for the Assessment of Student Learning, the ROAD consists of two assessment cycles: a short-run cycle using cross-sectional data to evaluate the quality of writing instruction for first-year students, and a long-run cycle using longitudinal data to evaluate changes in students' performance over their academic careers. Making the project possible is an internally developed system for storing, retrieving, and scoring student writing samples. Since the project began, more than five thousand student documents have been uploaded to the database, and approximately thirteen hundred have been reviewed and scored.
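The two-cycle design amounts to two different selections over the same document repository. A minimal sketch of that distinction, using hypothetical record fields (the actual schema of the ROAD database is not described here):

```python
from collections import defaultdict

# Hypothetical document records: (student_id, semester_uploaded, rubric_score).
docs = [
    ("s1", "F2011", 2.5), ("s1", "S2014", 3.5),
    ("s2", "F2011", 2.0),
    ("s3", "F2011", 3.0), ("s3", "S2014", 3.0),
]

# Short-run (cross-sectional) cycle: all samples from a single semester,
# used to evaluate first-year writing instruction at one point in time.
cross_section = [d for d in docs if d[1] == "F2011"]

# Long-run (longitudinal) cycle: only students with samples from more than
# one semester, so change over the academic career can be measured.
by_student = defaultdict(list)
for student_id, semester, score in docs:
    by_student[student_id].append((semester, score))
longitudinal = {sid: samples for sid, samples in by_student.items()
                if len(samples) > 1}
```

The point of the split is that the cross-sectional cycle can report quickly each year, while the longitudinal cycle must wait for students to accumulate uploads.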

The analysis of the data has prompted discussion within the English department about changes to the composition program. Developing an internal system for general education assessment, rather than purchasing a commercial e-portfolio system, had multiple advantages, including cost savings. By building the document storage and scoring system within the university's Banner system, training and support costs could be minimized. The General Education Committee, chaired by the general education coordinator, developed a policy that called for student essays to be uploaded from the second course in the first-year composition sequence (Writing 2) and for each program to identify an upper-division writing assignment that would be uploaded to the database. To become a binding requirement and gain faculty support, the policy needed the approval of the Academic Senate, and three concerns emerged during that process. First, a number of departments, primarily in the sciences, stated that they typically did not assign essays and that grading multiple writing assignments would significantly increase their workload.

Second, a number of faculty members (again, in the sciences) felt that the rubric was not an appropriate instrument for evaluating their students' writing. Third, there were concerns that the student essays would be used to evaluate specific faculty members or departments. To address these concerns, the requirement was reduced: students would be asked to upload one document with a word count between 750 and 4,000 words in addition to the essay from Writing 2.

The rubric would be reviewed and changed as needed by a committee composed of representatives from each college. Finally, the policy explicitly stated that scores for individual students, individual classes, or the majors of individual programs would not be released. Beginning with the fall semester of 2011, the general education coordinator has analyzed the relationship between the scores for the writing samples and student characteristics. Implementing the ROAD involved designing the repository system and organizing the review process. The rubric that was adopted is based on the Association of American Colleges and Universities' (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics for written communication and critical thinking (Association of American Colleges and Universities 2014).

Readers train with anchor papers using the following criteria: context of and purpose for writing; content development; genre and disciplinary conventions; sources and evidence; control of syntax and mechanics; student's position; and conclusions and related outcomes (Youngstown State University 2011). All students' and instructors' names are removed from the files before review. Approximately 25 percent of the essays are reviewed by a second reader, and the writing center coordinator reviews those scores with the readers in an attempt to keep the readers aligned in their scoring. The dataset from the ROAD project consisted of two categories of variables: student characteristics and rubric scores.
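The double-scoring step can be sketched as follows. The 25 percent sampling rate comes from the text above, but the record shapes and the disagreement tolerance are hypothetical:

```python
import random

def select_for_second_read(essay_ids, rate=0.25, seed=0):
    """Randomly flag roughly 25% of essays for a second reader."""
    rng = random.Random(seed)
    return [eid for eid in essay_ids if rng.random() < rate]

def flag_misaligned(paired_scores, tolerance=1.0):
    """Return essays where the two readers' rubric scores differ by more
    than `tolerance`, for review with the writing center coordinator.
    `paired_scores` maps essay id -> (first score, second score)."""
    return [eid for eid, (s1, s2) in paired_scores.items()
            if abs(s1 - s2) > tolerance]
```

For example, `flag_misaligned({"e1": (3.0, 3.5), "e2": (2.0, 3.5)})` flags only `"e2"`, since the readers there differ by 1.5 rubric points.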

Student characteristics included gender, age, college and high school grade point averages, semester hours completed at upload, course grades, and two constructed variables: a placement test score and a binary late-registration variable. The placement test score was the average normalized score on the ACT English, SAT Verbal, YSU composition placement, and COMPASS reading tests, all of which are used to place students into the composition sequence. The late-registration variable indicated whether a student registered for the course less than two weeks before the start of the course.
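The two constructed variables can be sketched as below. The text says the placement score is an "average normalized score" without specifying the normalization, so the z-score here is an assumption, as are the function and field names:

```python
from datetime import date
from statistics import mean, pstdev

def normalize(score, cohort_scores):
    """Standardize one test score against the cohort (z-score); the
    normalization method is an assumption, not stated in the text."""
    mu, sd = mean(cohort_scores), pstdev(cohort_scores)
    return (score - mu) / sd if sd else 0.0

def placement_score(student_tests, cohort_tests):
    """Average the student's normalized scores across whichever of the
    ACT English / SAT Verbal / YSU placement / COMPASS reading tests
    the student took. Both arguments map test name -> score(s)."""
    return mean(normalize(s, cohort_tests[t])
                for t, s in student_tests.items())

def late_registration(registration_date, course_start):
    """1 if the student registered under two weeks before the course start."""
    return int((course_start - registration_date).days < 14)
```

Averaging normalized rather than raw scores keeps tests on different scales (ACT, SAT, COMPASS) from dominating the composite.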