Rubric Best Practices, Examples, and Templates

A rubric is a scoring tool that identifies the different criteria relevant to an assignment, assessment, or learning outcome and states the possible levels of achievement in a specific, clear, and objective way. Use rubrics to assess project-based student work including essays, group projects, creative endeavors, and oral presentations.

Rubrics can help instructors communicate expectations to students and assess student work fairly, consistently and efficiently. Rubrics can provide students with informative feedback on their strengths and weaknesses so that they can reflect on their performance and work on areas that need improvement.

How to Get Started

Best practices and Moodle how-to guides:

  • Workshop Recording (Spring 2024)
  • Workshop Registration

Step 1: Analyze the assignment

The first step in the rubric creation process is to analyze the assignment or assessment for which you are creating a rubric. To do this, consider the following questions:

  • What is the purpose of the assignment and your feedback? What do you want students to demonstrate through the completion of this assignment (i.e. what are the learning objectives measured by it)? Is it a summative assessment, or will students use the feedback to create an improved product?
  • Does the assignment break down into different or smaller tasks? Are all of these tasks equally important to the overall assignment?
  • What would an “excellent” assignment look like? An “acceptable” assignment? One that still needs major work?
  • How detailed do you want the feedback you give students to be? Do you want/need to give them a grade?

Step 2: Decide what kind of rubric you will use

Types of rubrics: holistic, analytic/descriptive, single-point

Holistic Rubric. A holistic rubric considers all the criteria (such as clarity, organization, mechanics, etc.) together in a single evaluation. With a holistic rubric, the rater or grader assigns a single score based on an overall judgment of the student’s work, using descriptions of each performance level to assign the score.

Advantages of holistic rubrics:

  • Can place an emphasis on what learners can demonstrate rather than what they cannot
  • Save grader time by minimizing the number of evaluations to be made for each student
  • Can be used consistently across raters, provided they have all been trained

Disadvantages of holistic rubrics:

  • Provide less specific feedback than analytic/descriptive rubrics
  • Can be difficult to choose a score when a student’s work is at varying levels across the criteria
  • Any weighting of criteria cannot be indicated in the rubric

Analytic/Descriptive Rubric. An analytic or descriptive rubric often takes the form of a table with the criteria listed in the left column and with levels of performance listed across the top row. Each cell contains a description of what the specified criterion looks like at a given level of performance. Each of the criteria is scored individually.

Advantages of analytic rubrics:

  • Provide detailed feedback on areas of strength or weakness
  • Each criterion can be weighted to reflect its relative importance

Disadvantages of analytic rubrics:

  • More time-consuming to create and use than a holistic rubric
  • May not be used consistently across raters unless the cells are well defined
  • May result in giving less personalized feedback

Single-Point Rubric. A single-point rubric breaks down the components of an assignment into different criteria, but instead of describing different levels of performance, only the “proficient” level is described. Feedback space is provided for instructors to give individualized comments to help students improve and/or show where they excelled beyond the proficiency descriptors.

Advantages of single-point rubrics:

  • Easier to create than an analytic/descriptive rubric
  • Students may be more likely to read the descriptors
  • Areas of concern and excellence are open-ended
  • May remove the focus from the grade/points
  • May increase student creativity in project-based assignments

Disadvantage of single-point rubrics: Requires more work for instructors when writing feedback

Step 3 (Optional): Look for templates and examples.

You might Google, “Rubric for persuasive essay at the college level” and see if there are any publicly available examples to start from. Ask your colleagues if they have used a rubric for a similar assignment. Some examples are also available at the end of this article. These rubrics can be a great starting point for you, but work through steps 4, 5, and 6 below to ensure that the rubric matches your assignment description, learning objectives, and expectations.

Step 4: Define the assignment criteria

Make a list of the knowledge and skills you are measuring with the assignment/assessment. Refer to your stated learning objectives, the assignment instructions, past examples of student work, etc. for help.

  Helpful strategies for defining grading criteria:

  • Collaborate with co-instructors, teaching assistants, and other colleagues
  • Brainstorm and discuss with students
  • Test each criterion: Can it be observed and measured? Is it important and essential? Is it distinct from the other criteria? Is it phrased in precise, unambiguous language?
  • Revise the criteria as needed
  • Consider whether some criteria are more important than others, and how you will weight them.
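Weighting criteria, as suggested above, is simple arithmetic: multiply each criterion’s score by its weight and sum the results. A minimal sketch in Python (the criteria names, weights, and 4-point scale are illustrative assumptions, not from this guide):

```python
# Weighted analytic rubric scoring. Criteria, weights, and scores are
# illustrative; weights should sum to 1.0.
weights = {"thesis": 0.40, "organization": 0.35, "mechanics": 0.25}
scores = {"thesis": 4, "organization": 3, "mechanics": 2}  # on a 4-point scale

# Weighted total on the 4-point scale
weighted = sum(weights[c] * scores[c] for c in weights)

# Convert to a percentage of the 4-point maximum
percent = 100 * weighted / 4

print(f"Weighted score: {weighted:.2f}/4 ({percent:.1f}%)")
```

Here strong thesis work (weighted 40%) pulls the total up more than weak mechanics (weighted 25%) pulls it down, which is exactly the trade-off a weighted rubric encodes.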

Step 5: Design the rating scale

Most rating scales include between three and five levels. Consider the following questions when designing your rating scale:

  • Given what students are able to demonstrate in this assignment/assessment, what are the possible levels of achievement?
  • How many levels would you like to include? (More levels mean more detailed descriptions.)
  • Will you use numbers and/or descriptive labels for each level of performance? (for example 5, 4, 3, 2, 1 and/or Exceeds expectations, Accomplished, Proficient, Developing, Beginning, etc.)
  • Don’t use too many columns, and recognize that some criteria can have more columns than others. The rubric needs to be comprehensible and organized. Pick the right number of columns so that the criteria flow logically and naturally across levels.

Step 6: Write descriptions for each level of the rating scale

Artificial intelligence tools like ChatGPT can be useful for creating a rubric. You will want to engineer the prompt you provide to the AI assistant to ensure you get what you want. For example, you might include the assignment description, the criteria you feel are important, and the number of levels of performance you want in your prompt. Use the results as a starting point, and adjust the descriptions as needed.

Building a rubric from scratch

For a single-point rubric, describe what would be considered “proficient,” i.e., B-level work. You might also include suggestions for students outside of the actual rubric about how they might surpass proficient-level work.

For analytic and holistic rubrics, create statements of expected performance at each level of the rubric.

  • Consider what descriptor is appropriate for each criterion, e.g., presence vs. absence, complete vs. incomplete, many vs. none, major vs. minor, consistent vs. inconsistent, always vs. never. If an indicator is described at one level, it will need to be described at each level.
  • You might start with the top/exemplary level. What does it look like when a student has achieved excellence for each/every criterion? Then, look at the “bottom” level. What does it look like when a student has not achieved the learning goals in any way? Then, complete the in-between levels.
  • For an analytic rubric , do this for each particular criterion of the rubric so that every cell in the table is filled. These descriptions help students understand your expectations and their performance in regard to those expectations.

Well-written descriptions:

  • Describe observable and measurable behavior
  • Use parallel language across the scale
  • Indicate the degree to which the standards are met

Step 7: Create your rubric

Create your rubric in a table or spreadsheet in Word, Google Docs, Sheets, etc., and then transfer it by typing it into Moodle. You can also use online tools to create the rubric, but you will still have to type the criteria, indicators, levels, etc., into Moodle. Rubric creators: Rubistar, iRubric

Step 8: Pilot-test your rubric

Prior to implementing your rubric in a live course, obtain feedback from:

  • Teaching assistants

Try out your new rubric on a sample of student work. After you pilot-test your rubric, analyze the results to evaluate its effectiveness, and revise accordingly. As you revise, keep these tips in mind:

  • Limit the rubric to a single page for reading and grading ease
  • Use parallel language. Use similar language and syntax/wording from column to column. Make sure that the rubric can be easily read from left to right or vice versa.
  • Use student-friendly language. Make sure the language is learning-level appropriate. If you use academic language or concepts, you will need to teach those concepts.
  • Share and discuss the rubric with your students. Students should understand that the rubric is there to help them learn, reflect, and self-assess. If students use a rubric, they will understand the expectations and their relevance to learning.
  • Consider scalability and reusability of rubrics. Create rubric templates that you can alter as needed for multiple assignments.
  • Maximize the descriptiveness of your language. Avoid words like “good” and “excellent.” For example, instead of saying “uses excellent sources,” describe what makes a source excellent so that students will know. You might also consider reducing the reliance on quantity, such as a number of allowable misspelled words. Focus instead, for example, on how distracting any spelling errors are.

Example of an analytic rubric for a final paper

| Criterion | Above Average (4) | Sufficient (3) | Developing (2) | Needs Improvement (1) |
| --- | --- | --- | --- | --- |
| Thesis supported by relevant information and ideas | The central purpose of the student work is clear and supporting ideas are always well-focused. Details are relevant and enrich the work. | The central purpose of the student work is clear and ideas are almost always focused in a way that supports the thesis. Relevant details illustrate the author’s ideas. | The central purpose of the student work is identified. Ideas are mostly focused in a way that supports the thesis. | The purpose of the student work is not well-defined. A number of central ideas do not support the thesis. Thoughts appear disconnected. |
| Sequencing of elements/ideas | Information and ideas are presented in a logical sequence which flows naturally and is engaging to the audience. | Information and ideas are presented in a logical sequence which is followed by the reader with little or no difficulty. | Information and ideas are presented in an order that the audience can mostly follow. | Information and ideas are poorly sequenced. The audience has difficulty following the thread of thought. |
| Correctness of grammar and spelling | Minimal to no distracting errors in grammar and spelling. | The readability of the work is only slightly interrupted by spelling and/or grammatical errors. | Grammatical and/or spelling errors distract from the work. | The readability of the work is seriously hampered by spelling and/or grammatical errors. |

Example of a holistic rubric for a final paper

  • 4 (Above Average): The audience is able to easily identify the central message of the work and is engaged by the paper’s clear focus and relevant details. Information is presented logically and naturally. There are minimal to no distracting errors in grammar and spelling.
  • 3 (Sufficient): The audience is easily able to identify the focus of the student work, which is supported by relevant ideas and supporting details. Information is presented in a logical manner that is easily followed. The readability of the work is only slightly interrupted by errors.
  • 2 (Developing): The audience can identify the central purpose of the student work with little difficulty, and supporting ideas are present and clear. The information is presented in an orderly fashion that can be followed with little difficulty. Grammatical and spelling errors distract from the work.
  • 1 (Needs Improvement): The audience cannot clearly or easily identify the central ideas or purpose of the student work. Information is presented in a disorganized fashion, causing the audience to have difficulty following the author’s ideas. The readability of the work is seriously hampered by errors.

Single-Point Rubric

| Advanced (evidence of exceeding standards) | Criteria (described at a proficient level) | Concerns (areas that need work) |
| --- | --- | --- |
|  | Criterion #1: Description reflecting achievement of proficient level of performance |  |
|  | Criterion #2: Description reflecting achievement of proficient level of performance |  |
|  | Criterion #3: Description reflecting achievement of proficient level of performance |  |
|  | Criterion #4: Description reflecting achievement of proficient level of performance |  |
| 90–100 points | 80–90 points | <80 points |

More examples:

  • Single Point Rubric Template (variation)
  • Analytic Rubric Template (make a copy to edit)
  • A Rubric for Rubrics
  • Bank of Online Discussion Rubrics in different formats
  • Mathematical Presentations Descriptive Rubric
  • Math Proof Assessment Rubric
  • Kansas State Sample Rubrics
  • Design Single Point Rubric

Technology Tools: Rubrics in Moodle

  • Moodle Docs: Rubrics
  • Moodle Docs: Grading Guide (use for single-point rubrics)

Tools with rubrics (other than Moodle)

  • Google Assignments
  • Turnitin Assignments: Rubric or Grading Form

Other resources

  • DePaul University (n.d.). Rubrics .
  • Gonzalez, J. (2014). Know your terms: Holistic, Analytic, and Single-Point Rubrics . Cult of Pedagogy.
  • Goodrich, H. (1996). Understanding rubrics. Teaching for Authentic Student Performance, 54(4), 14-17.
  • Miller, A. (2012). Tame the beast: tips for designing and using rubrics.
  • Ragupathi, K., Lee, A. (2020). Beyond Fairness and Consistency in Grading: The Role of Rubrics in Higher Education. In: Sanger, C., Gleason, N. (eds) Diversity and Inclusion in Global Higher Education. Palgrave Macmillan, Singapore.

Assessment Rubrics

A rubric is commonly defined as a tool that articulates the expectations for an assignment by listing criteria and, for each criterion, describing levels of quality (Andrade, 2000; Arter & Chappuis, 2007; Stiggins, 2001). Criteria are used in determining the level at which student work meets expectations. Markers of quality give students a clear idea about what must be done to demonstrate a certain level of mastery, understanding, or proficiency (i.e., "Exceeds Expectations" does xyz, "Meets Expectations" does only xy or yz, "Developing" does only x or y or z). Rubrics can be used for any assignment in a course, or for any way in which students are asked to demonstrate what they've learned. They can also be used to facilitate self- and peer-reviews of student work.

Rubrics aren't just for summative evaluation. They can be used as a teaching tool as well. When used as part of a formative assessment, they can help students understand both the holistic nature and the specific components of the learning expected, as well as the level of learning expected, and then make decisions about their current level of learning to inform revision and improvement (Reddy & Andrade, 2010).

Why use rubrics?

Rubrics help instructors:

Provide students with feedback that is clear, directed and focused on ways to improve learning.

Demystify assignment expectations so students can focus on the work instead of guessing "what the instructor wants."

Reduce time spent on grading and develop consistency in how you evaluate student learning across students and throughout a class.

Rubrics help students:

Focus their efforts on completing assignments in line with clearly set expectations.

Self- and peer-reflect on their learning, making informed changes to achieve the desired learning level.

Developing a Rubric

During the process of developing a rubric, instructors might:

Select an assignment for your course - ideally one you identify as time-intensive to grade, or one students report as having unclear expectations.

Decide what you want students to demonstrate about their learning through that assignment. These are your criteria.

Identify the markers of quality on which you feel comfortable evaluating students’ level of learning - often along with a numerical scale (e.g., "Accomplished," "Emerging," "Beginning" for a developmental approach).

Give students the rubric ahead of time. Advise them to use it in guiding their completion of the assignment.

It can be overwhelming to create a rubric for every assignment in a class at once, so start by creating one rubric for one assignment. See how it goes and develop more from there! Also, do not reinvent the wheel. Rubric templates and examples exist all over the Internet, or consider asking colleagues if they have developed rubrics for similar assignments. 

Sample Rubrics

Examples of holistic and analytic rubrics : see Tables 2 & 3 in “Rubrics: Tools for Making Learning Goals and Evaluation Criteria Explicit for Both Teachers and Learners” (Allen & Tanner, 2006)

Examples across assessment types : see “Creating and Using Rubrics,” Carnegie Mellon Eberly Center for Teaching Excellence & Educational Innovation

“VALUE Rubrics” : see the Association of American Colleges and Universities set of free, downloadable rubrics, with foci including creative thinking, problem solving, and information literacy. 

Andrade, H. (2000). Using rubrics to promote thinking and learning. Educational Leadership, 57(5), 13-18.

Arter, J., & Chappuis, J. (2007). Creating and recognizing quality rubrics. Upper Saddle River, NJ: Pearson/Merrill Prentice Hall.

Stiggins, R. J. (2001). Student-involved classroom assessment (3rd ed.). Upper Saddle River, NJ: Prentice-Hall.

Reddy, Y., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448.


15 Helpful Scoring Rubric Examples for All Grades and Subjects

In the end, they actually make grading easier.


When it comes to student assessment and evaluation, there are a lot of methods to consider. In some cases, testing is the best way to assess a student’s knowledge, and the answers are either right or wrong. But often, assessing a student’s performance is much less clear-cut. In these situations, a scoring rubric is often the way to go, especially if you’re using standards-based grading . Here’s what you need to know about this useful tool, along with lots of rubric examples to get you started.

What is a scoring rubric?

In the United States, a rubric is a guide that lays out the performance expectations for an assignment. It helps students understand what’s required of them, and guides teachers through the evaluation process. (Note that in other countries, the term “rubric” may instead refer to the set of instructions at the beginning of an exam. To avoid confusion, some people use the term “scoring rubric” instead.)

A rubric generally has three parts:

  • Performance criteria: These are the various aspects on which the assignment will be evaluated. They should align with the desired learning outcomes for the assignment.
  • Rating scale: This could be a number system (often 1 to 4) or words like “exceeds expectations, meets expectations, below expectations,” etc.
  • Indicators: These describe the qualities needed to earn a specific rating for each of the performance criteria. The level of detail may vary depending on the assignment and the purpose of the rubric itself.
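These three parts map naturally onto a nested data structure, which can be handy if you keep rubrics in a spreadsheet or script any part of your grading. A minimal sketch in Python (the criteria names and indicator text are invented for illustration, not taken from any rubric in this article):

```python
# A rubric as nested data: performance criteria -> rating scale levels ->
# indicator text. All names and descriptions below are illustrative.
rubric = {
    "organization": {
        4: "Ideas flow logically; transitions guide the reader throughout.",
        3: "Ideas are mostly ordered; occasional abrupt transitions.",
        2: "Order is inconsistent; the reader must work to follow.",
        1: "No discernible organization.",
    },
    "evidence": {
        4: "Every claim is backed by specific, relevant support.",
        3: "Most claims are supported; a few lack specifics.",
        2: "Support is thin or only loosely relevant.",
        1: "Claims are unsupported.",
    },
}

def score(ratings: dict[str, int]) -> int:
    """Total the points earned across all criteria in the rubric."""
    return sum(ratings[criterion] for criterion in rubric)

print(score({"organization": 3, "evidence": 4}))  # prints 7
```

Storing the indicators alongside the scale this way also makes it easy to print the matching descriptor next to each earned rating when returning feedback.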

Rubrics take more time to develop up front, but they help ensure more consistent assessment, especially when the skills being assessed are more subjective. A well-developed rubric can actually save teachers a lot of time when it comes to grading. What’s more, sharing your scoring rubric with students in advance often helps improve performance. This way, students have a clear picture of what’s expected of them and what they need to do to achieve a specific grade or performance rating.

Learn more about why and how to use a rubric here.

Types of Rubrics

There are three basic rubric categories, each with its own purpose.

Holistic Rubric

A holistic scoring rubric laying out the criteria for a rating of 1 to 4 when creating an infographic

Source: Cambrian College

This type of rubric combines all the scoring criteria in a single scale. They’re quick to create and use, but they have drawbacks. If a student’s work spans different levels, it can be difficult to decide which score to assign. They also make it harder to provide feedback on specific aspects.

Traditional letter grades are a type of holistic rubric. So are the popular “hamburger rubric” and “cupcake rubric” examples. Learn more about holistic rubrics here.

Analytic Rubric

Layout of an analytic scoring rubric, describing the different sections like criteria, rating, and indicators

Source: University of Nebraska

Analytic rubrics are much more complex and generally take a great deal more time up front to design. They include specific details of the expected learning outcomes, and descriptions of what criteria are required to meet various performance ratings in each. Each rating is assigned a point value, and the total number of points earned determines the overall grade for the assignment.

Though they’re more time-intensive to create, analytic rubrics actually save time while grading. Teachers can simply circle or highlight any relevant phrases in each rating, and add a comment or two if needed. They also help ensure consistency in grading, and make it much easier for students to understand what’s expected of them.

Learn more about analytic rubrics here.

Developmental Rubric

A developmental rubric for kindergarten skills, with illustrations to describe the indicators of criteria

Source: Deb’s Data Digest

A developmental rubric is a type of analytic rubric, but it’s used to assess progress along the way rather than determining a final score on an assignment. The details in these rubrics help students understand their achievements, as well as highlight the specific skills they still need to improve.

Developmental rubrics are essentially a subset of analytic rubrics. They leave off the point values, though, and focus instead on giving feedback using the criteria and indicators of performance.

Learn how to use developmental rubrics here.

Ready to create your own rubrics? Find general tips on designing rubrics here. Then, check out these examples across all grades and subjects to inspire you.

Elementary School Rubric Examples

These elementary school rubric examples come from real teachers who use them with their students. Adapt them to fit your needs and grade level.

Reading Fluency Rubric

A developmental rubric example for reading fluency

You can use this one as an analytic rubric by counting up points to earn a final score, or just to provide developmental feedback. There’s a second rubric page available specifically to assess prosody (reading with expression).

Learn more: Teacher Thrive

Reading Comprehension Rubric

Reading comprehension rubric, with criteria and indicators for different comprehension skills

The nice thing about this rubric is that you can use it at any grade level, for any text. If you like this style, you can get a reading fluency rubric here too.

Learn more: Pawprints Resource Center

Written Response Rubric


Rubrics aren’t just for huge projects. They can also help kids work on very specific skills, like this one for improving written responses on assessments.

Learn more: Dianna Radcliffe: Teaching Upper Elementary and More

Interactive Notebook Rubric

Interactive Notebook rubric example, with criteria and indicators for assessment

If you use interactive notebooks as a learning tool , this rubric can help kids stay on track and meet your expectations.

Learn more: Classroom Nook

Project Rubric

Rubric that can be used for assessing any elementary school project

Use this simple rubric as it is, or tweak it to include more specific indicators for the project you have in mind.

Learn more: Tales of a Title One Teacher

Behavior Rubric

Rubric for assessing student behavior in school and classroom

Developmental rubrics are perfect for assessing behavior and helping students identify opportunities for improvement. Send these home regularly to keep parents in the loop.

Learn more: Teachers.net Gazette

Middle School Rubric Examples

In middle school, use rubrics to offer detailed feedback on projects, presentations, and more. Be sure to share them with students in advance, and encourage them to use them as they work so they’ll know if they’re meeting expectations.

Argumentative Writing Rubric

An argumentative rubric example to use with middle school students

Argumentative writing is a part of language arts, social studies, science, and more. That makes this rubric especially useful.

Learn more: Dr. Caitlyn Tucker

Role-Play Rubric

A rubric example for assessing student role play in the classroom

Role-plays can be really useful when teaching social and critical thinking skills, but it’s hard to assess them. Try a rubric like this one to evaluate and provide useful feedback.

Learn more: A Question of Influence

Art Project Rubric

A rubric used to grade middle school art projects

Art is one of those subjects where grading can feel very subjective. Bring some objectivity to the process with a rubric like this.

Source: Art Ed Guru

Diorama Project Rubric

A rubric for grading middle school diorama projects

You can use diorama projects in almost any subject, and they’re a great chance to encourage creativity. Simplify the grading process and help kids know how to make their projects shine with this scoring rubric.

Learn more: Historyourstory.com

Oral Presentation Rubric

Rubric example for grading oral presentations given by middle school students

Rubrics are terrific for grading presentations, since you can include a variety of skills and other criteria. Consider letting students use a rubric like this to offer peer feedback too.

Learn more: Bright Hub Education

High School Rubric Examples

In high school, it’s important to include your grading rubrics when you give assignments like presentations, research projects, or essays. Kids who go on to college will definitely encounter rubrics, so helping them become familiar with them now will help in the future.

Presentation Rubric

Example of a rubric used to grade a high school project presentation

Analyze a student’s presentation both for content and communication skills with a rubric like this one. If needed, create a separate one for content knowledge with even more criteria and indicators.

Learn more: Michael A. Pena Jr.

Debate Rubric

A rubric for assessing a student's performance in a high school debate

Debate is a valuable learning tool that encourages critical thinking and oral communication skills. This rubric can help you assess those skills objectively.

Learn more: Education World

Project-Based Learning Rubric

A rubric for assessing high school project based learning assignments

Implementing project-based learning can be time-intensive, but the payoffs are worth it. Try this rubric to make student expectations clear and end-of-project assessment easier.

Learn more: Free Technology for Teachers

100-Point Essay Rubric

Rubric for scoring an essay with a final score out of 100 points

Need an easy way to convert a scoring rubric to a letter grade? This example for essay writing earns students a final score out of 100 points.

Learn more: Learn for Your Life

Drama Performance Rubric

A rubric teachers can use to evaluate a student's participation and performance in a theater production

If you’re unsure how to grade a student’s participation and performance in drama class, consider this example. It offers lots of objective criteria and indicators to evaluate.

Learn more: Chase March

How do you use rubrics in your classroom? Come share your thoughts and exchange ideas in the WeAreTeachers HELPLINE group on Facebook.

Plus, 25 of the best alternative assessment ideas.

Scoring rubrics help establish expectations and ensure assessment consistency. Use these rubric examples to help you design your own.


Writing Beginner

Writing Rubrics [Examples, Best Practices, & Free Templates]

Writing rubrics are essential tools for teachers.

Rubrics can improve both teaching and learning. This guide will explain writing rubrics, their benefits, and how to create and use them effectively.

What Is a Writing Rubric?



A writing rubric is a scoring guide used to evaluate written work.

It lists criteria and describes levels of quality from excellent to poor. Rubrics provide a standardized way to assess writing.

They make expectations clear and grading consistent.

Key Components of a Writing Rubric

  • Criteria: Specific aspects of writing being evaluated (e.g., grammar, organization).
  • Descriptors: Detailed descriptions of what each level of performance looks like.
  • Scoring Levels: Typically a range (e.g., 1-4 or 1-6) showing levels of mastery.

Example Breakdown

| Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor) |
| --- | --- | --- | --- | --- |
| Grammar | No errors | Few minor errors | Several errors | Many errors |
| Organization | Clear and logical | Mostly clear | Somewhat clear | Not clear |
| Content | Thorough and insightful | Good, but not thorough | Basic, lacks insight | Incomplete or off-topic |

Benefits of Using Writing Rubrics

Writing rubrics offer many advantages:

  • Clarity: Rubrics clarify expectations for students. They know what is required for each level of performance.
  • Consistency: Rubrics standardize grading. This ensures fairness and consistency across different students and assignments.
  • Feedback: Rubrics provide detailed feedback. Students understand their strengths and areas for improvement.
  • Efficiency: Rubrics streamline the grading process. Teachers can evaluate work more quickly and systematically.
  • Self-Assessment: Students can use rubrics to self-assess. This promotes reflection and responsibility for their learning.

Examples of Writing Rubrics

Here are some examples of writing rubrics.

Narrative Writing Rubric

| Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor) |
| --- | --- | --- | --- | --- |
| Story Elements | Well-developed | Developed, some details | Basic, missing details | Underdeveloped |
| Creativity | Highly creative | Creative | Some creativity | Lacks creativity |
| Grammar | No errors | Few minor errors | Several errors | Many errors |
| Organization | Clear and logical | Mostly clear | Somewhat clear | Not clear |
| Language Use | Rich and varied | Varied | Limited | Basic or inappropriate |

Persuasive Writing Rubric

| Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor) |
| --- | --- | --- | --- | --- |
| Argument | Strong and convincing | Convincing, some gaps | Basic, lacks support | Weak or unsupported |
| Evidence | Strong and relevant | Relevant, but not strong | Some relevant, weak | Irrelevant or missing |
| Grammar | No errors | Few minor errors | Several errors | Many errors |
| Organization | Clear and logical | Mostly clear | Somewhat clear | Not clear |
| Language Use | Persuasive and engaging | Engaging | Somewhat engaging | Not engaging |

Best Practices for Creating Writing Rubrics

Let’s look at some best practices for creating useful writing rubrics.

1. Define Clear Criteria

Identify specific aspects of writing to evaluate. Be clear and precise.

The criteria should reflect the key components of the writing task. For example, for a narrative essay, criteria might include plot development, character depth, and use of descriptive language.

Clear criteria help students understand what is expected and allow teachers to provide targeted feedback.

Insider Tip: Collaborate with colleagues to establish consistent criteria across grade levels. This ensures uniformity in expectations and assessments.

2. Use Detailed Descriptors

Describe what each level of performance looks like.

This ensures transparency and clarity. Avoid vague language. Instead of saying “good,” describe what “good” entails. For example, “Few minor grammatical errors that do not impede readability.”

Detailed descriptors help students gauge their performance accurately.

Insider Tip: Use student work samples to illustrate each performance level. This provides concrete examples and helps students visualize expectations.

3. Involve Students

Involve students in the rubric creation process. This increases their understanding and buy-in.

Ask for their input on what they think is important in their writing.

This collaborative approach not only demystifies the grading process but also fosters a sense of ownership and responsibility in students.

Insider Tip: Conduct a workshop where students help create a rubric for an upcoming assignment. This interactive session can clarify doubts and make students more invested in their work.

4. Align with Objectives

Ensure the rubric aligns with learning objectives. This ensures relevance and focus.

If the objective is to enhance persuasive writing skills, the rubric should emphasize argument strength, evidence quality, and persuasive techniques.

Alignment ensures that the assessment directly supports instructional goals.

Insider Tip: Regularly revisit and update rubrics to reflect changes in curriculum and instructional priorities. This keeps the rubrics relevant and effective.

5. Review and Revise

Regularly review and revise rubrics. Ensure they remain accurate and effective.

Solicit feedback from students and colleagues. Continuous improvement of rubrics ensures they remain a valuable tool for both assessment and instruction.

Insider Tip: After using a rubric, take notes on its effectiveness. Were students confused by any criteria? Did the rubric cover all necessary aspects of the assignment? Use these observations to make adjustments.

6. Be Consistent

Use the rubric consistently across all assignments.

This ensures fairness and reliability. Consistency in applying the rubric helps build trust with students and maintains the integrity of the assessment process.

Insider Tip: Develop a grading checklist to accompany the rubric. This can help ensure that all criteria are consistently applied and none are overlooked during the grading process.

7. Provide Examples

Provide examples of each performance level.

This helps students understand expectations. Use annotated examples to show why a particular piece of writing meets a specific level.

This visual and practical demonstration can be more effective than descriptions alone.

Insider Tip: Create a portfolio of exemplar works for different assignments. This can be a valuable resource for both new and experienced teachers to standardize grading.

How to Use Writing Rubrics Effectively

Here is how to use writing rubrics like the pros.

1. Introduce Rubrics Early

Introduce rubrics at the beginning of the assignment.

Explain each criterion and performance level. This upfront clarity helps students understand what is expected and guides their work from the start.

Insider Tip: Conduct a rubric walkthrough session where you discuss each part of the rubric in detail. Allow students to ask questions and provide examples to illustrate each criterion.

2. Use Rubrics as a Teaching Tool

Use rubrics to teach writing skills. Discuss what constitutes good writing and why.

This can be an opportunity to reinforce lessons on grammar, organization, and other writing components.

Insider Tip: Pair the rubric with writing workshops. Use the rubric to critique sample essays and show students how to apply the rubric to improve their own writing.

3. Provide Feedback

Use the rubric to give detailed feedback. Highlight strengths and areas for improvement.

This targeted feedback helps students understand their performance and learn how to improve.

Insider Tip: Instead of just marking scores, add comments next to each criterion on the rubric. This personalized feedback can be more impactful and instructive for students.

4. Encourage Self-Assessment

Encourage students to use rubrics to self-assess.

This promotes reflection and growth. Before submitting their work, ask students to evaluate their own writing against the rubric.

This practice fosters self-awareness and critical thinking.

Insider Tip: Incorporate self-assessment as a mandatory step in the assignment process. Provide a simplified version of the rubric for students to use during self-assessment.

5. Use Rubrics for Peer Assessment

Use rubrics for peer assessment. This allows students to learn from each other.

Peer assessments can provide new perspectives and reinforce learning.

Insider Tip: Conduct a peer assessment workshop. Train students on how to use the rubric to evaluate each other’s work constructively. This can improve the quality of peer feedback.

6. Reflect and Improve

Reflect on the effectiveness of the rubric. Make adjustments as needed for future assignments.

Continuous reflection ensures that rubrics remain relevant and effective tools for assessment and learning.

Insider Tip: After an assignment, hold a debrief session with students to gather their feedback on the rubric. Use their insights to make improvements.


Common Mistakes with Writing Rubrics

Creating and using writing rubrics can be incredibly effective, but there are common mistakes that can undermine their effectiveness.

Here are some pitfalls to avoid:

1. Vague Criteria

Vague criteria can confuse students and lead to inconsistent grading.

Ensure that each criterion is specific and clearly defined. Ambiguous terms like “good” or “satisfactory” should be replaced with concrete descriptions of what those levels of performance look like.

2. Overly Complex Rubrics

While detail is important, overly complex rubrics can be overwhelming for both students and teachers.

Too many criteria and performance levels can complicate the grading process and make it difficult for students to understand what is expected.

Keep rubrics concise and focused on the most important aspects of the assignment.

3. Inconsistent Application

Applying the rubric inconsistently can lead to unfair grading.

Ensure that you apply the rubric in the same way for all students and all assignments. Consistency builds trust and ensures that grades accurately reflect student performance.

4. Ignoring Student Input

Ignoring student input when creating rubrics can result in criteria that do not align with student understanding or priorities.

Involving students in the creation process can enhance their understanding and engagement with the rubric.

5. Failing to Update Rubrics

Rubrics should evolve to reflect changes in instructional goals and student needs.

Failing to update rubrics can result in outdated criteria that no longer align with current teaching objectives.

Regularly review and revise rubrics to keep them relevant and effective.

6. Lack of Examples

Without examples, students may struggle to understand the expectations for each performance level.

Providing annotated examples of work that meets each criterion can help students visualize what is required and guide their efforts more effectively.

7. Not Providing Feedback

Rubrics should be used as a tool for feedback, not just scoring.

Simply assigning a score without providing detailed feedback can leave students unclear about their strengths and areas for improvement.

Use the rubric to give comprehensive feedback that guides students’ growth.

8. Overlooking Self-Assessment and Peer Assessment

Self-assessment and peer assessment are valuable components of the learning process.

Overlooking these opportunities can limit students’ ability to reflect on their own work and learn from their peers.

Encourage students to use the rubric for self and peer assessment to deepen their understanding and enhance their skills.

What Is a Holistic Scoring Rubric for Writing?

A holistic scoring rubric for writing is a type of rubric that evaluates a piece of writing as a whole rather than breaking it down into separate criteria.

This approach provides a single overall score based on the general impression of the writing’s quality and effectiveness.

Here’s a closer look at holistic scoring rubrics.

Key Features of Holistic Scoring Rubrics

  • Single Overall Score: Assigns one score based on the overall quality of the writing.
  • General Criteria: Focuses on the overall effectiveness, coherence, and impact of the writing.
  • Descriptors: Uses broad descriptors for each score level to capture the general characteristics of the writing.

Example Holistic Scoring Rubric

Score | Description
5 | Exceptionally clear, engaging, and well-organized writing. Demonstrates excellent control of language, grammar, and style.
4 | Clear and well-organized writing. Minor errors do not detract from the overall quality. Demonstrates good control of language and style.
3 | Satisfactory writing with some organizational issues. Contains a few errors that may distract but do not impede understanding.
2 | Basic writing that lacks organization and contains several errors. Demonstrates limited control of language and style.
1 | Unclear and poorly organized writing. Contains numerous errors that impede understanding. Demonstrates poor control of language and style.

Advantages of Holistic Scoring Rubrics

  • Efficiency: Faster to use because it involves a single overall judgment rather than multiple criteria.
  • Flexibility: Allows for a more intuitive assessment of the writing’s overall impact and effectiveness.
  • Comprehensiveness: Captures the overall quality of writing, considering all elements together.

Disadvantages of Holistic Scoring Rubrics

  • Less Detailed Feedback: Provides a general score without specific feedback on individual aspects of writing.
  • Subjectivity: Can be more subjective, as it relies on the assessor’s overall impression rather than specific criteria.
  • Limited Diagnostic Use: Less useful for identifying specific areas of strength and weakness for instructional purposes.

When to Use Holistic Scoring Rubrics

  • Quick Assessments: When a quick, overall evaluation is needed.
  • Standardized Testing: Often used in standardized testing scenarios where consistency and efficiency are priorities.
  • Initial Impressions: Useful for providing an initial overall impression before more detailed analysis.

Free Writing Rubric Templates

Feel free to use the following writing rubric templates.

You can easily copy and paste them into a Word Document. Please do credit this website on any written, printed, or published use.

Otherwise, go wild.

Narrative Writing Rubric

Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor)
Story Elements | Well-developed, engaging, and clear plot, characters, and setting. | Developed plot, characters, and setting with some details missing. | Basic plot, characters, and setting; lacks details. | Underdeveloped plot, characters, and setting.
Creativity | Highly creative and original. | Creative with some originality. | Some creativity but lacks originality. | Lacks creativity and originality.
Grammar | No grammatical errors. | Few minor grammatical errors. | Several grammatical errors. | Numerous grammatical errors.
Organization | Clear and logical structure. | Mostly clear structure. | Somewhat clear structure. | Lacks clear structure.
Language Use | Rich, varied, and appropriate language. | Varied and appropriate language. | Limited language variety. | Basic or inappropriate language.

Persuasive Writing Rubric

Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor)
Argument | Strong, clear, and convincing argument. | Convincing argument with minor gaps. | Basic argument; lacks strong support. | Weak or unsupported argument.
Evidence | Strong, relevant, and well-integrated evidence. | Relevant evidence but not strong. | Some relevant evidence, but weak. | Irrelevant or missing evidence.
Grammar | No grammatical errors. | Few minor grammatical errors. | Several grammatical errors. | Numerous grammatical errors.
Organization | Clear and logical structure. | Mostly clear structure. | Somewhat clear structure. | Lacks clear structure.
Language Use | Persuasive and engaging language. | Engaging language. | Somewhat engaging language. | Not engaging language.

Expository Writing Rubric

Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor)
Content | Thorough, accurate, and insightful content. | Accurate content with some details missing. | Basic content; lacks depth. | Incomplete or inaccurate content.
Clarity | Clear and concise explanations. | Mostly clear explanations. | Somewhat clear explanations. | Unclear explanations.
Grammar | No grammatical errors. | Few minor grammatical errors. | Several grammatical errors. | Numerous grammatical errors.
Organization | Clear and logical structure. | Mostly clear structure. | Somewhat clear structure. | Lacks clear structure.
Language Use | Precise and appropriate language. | Appropriate language. | Limited language variety. | Basic or inappropriate language.

Descriptive Writing Rubric

Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor)
Imagery | Vivid and detailed imagery that engages the senses. | Detailed imagery with minor gaps. | Basic imagery; lacks vivid details. | Little to no imagery.
Creativity | Highly creative and original descriptions. | Creative with some originality. | Some creativity but lacks originality. | Lacks creativity and originality.
Grammar | No grammatical errors. | Few minor grammatical errors. | Several grammatical errors. | Numerous grammatical errors.
Organization | Clear and logical structure. | Mostly clear structure. | Somewhat clear structure. | Lacks clear structure.
Language Use | Rich, varied, and appropriate language. | Varied and appropriate language. | Limited language variety. | Basic or inappropriate language.

Analytical Writing Rubric

Criteria | 4 (Excellent) | 3 (Good) | 2 (Fair) | 1 (Poor)
Analysis | Insightful, thorough, and well-supported analysis. | Good analysis with some depth. | Basic analysis; lacks depth. | Weak or unsupported analysis.
Evidence | Strong, relevant, and well-integrated evidence. | Relevant evidence but not strong. | Some relevant evidence, but weak. | Irrelevant or missing evidence.
Grammar | No grammatical errors. | Few minor grammatical errors. | Several grammatical errors. | Numerous grammatical errors.
Organization | Clear and logical structure. | Mostly clear structure. | Somewhat clear structure. | Lacks clear structure.
Language Use | Precise and appropriate language. | Appropriate language. | Limited language variety. | Basic or inappropriate language.

Final Thoughts: Writing Rubrics

I have a lot more resources for teaching on this site.

Check out some of the blog posts I’ve listed below. I think you might enjoy them.

Read This Next:

  • Narrative Writing Graphic Organizer [Guide + Free Templates]
  • 100 Best A Words for Kids (+ How to Use Them)
  • 100 Best B Words For Kids (+How to Teach Them)
  • 100 Dictation Word Ideas for Students and Kids
  • 50 Tricky Words to Pronounce and Spell (How to Teach Them)

Center for Teaching Innovation

Resource Library

  • AAC&U VALUE Rubrics

Using rubrics

A rubric is a type of scoring guide that assesses and articulates specific components and expectations for an assignment. Rubrics can be used for a variety of assignments: research papers, group projects, portfolios, and presentations.  

Why use rubrics? 

Rubrics help instructors: 

  • Assess assignments consistently from student to student. 
  • Save time in grading, both short-term and long-term. 
  • Give timely, effective feedback and promote student learning in a sustainable way. 
  • Clarify expectations and components of an assignment for both students and course teaching assistants (TAs). 
  • Refine teaching methods by evaluating rubric results. 

Rubrics help students: 

  • Understand expectations and components of an assignment. 
  • Become more aware of their learning process and progress. 
  • Improve work through timely and detailed feedback. 

Considerations for using rubrics 

When developing rubrics consider the following:

  • Although it takes time to build a rubric, time will be saved in the long run as grading and providing feedback on student work will become more streamlined.  
  • A rubric can be a fillable pdf that can easily be emailed to students. 
  • They can be used for oral presentations. 
  • They are a great tool to evaluate teamwork and individual contribution to group tasks. 
  • Rubrics facilitate peer-review by setting evaluation standards. Have students use the rubric to provide peer assessment on various drafts. 
  • Students can use them for self-assessment to improve personal performance and learning. Encourage students to use the rubrics to assess their own work. 
  • Motivate students to improve their work by allowing them to resubmit it with the rubric feedback incorporated. 

Getting Started with Rubrics 

  • Start small by creating one rubric for one assignment in a semester.  
  • Ask colleagues if they have developed rubrics for similar assignments, or adapt rubrics that are available online. For example, the AAC&U has rubrics for topics such as written and oral communication, critical thinking, and creative thinking. RubiStar helps you develop your rubric based on templates. 
  • Examine an assignment for your course. Outline the elements or critical attributes to be evaluated (these attributes must be objectively measurable). 
  • Create an evaluative range for performance quality under each element; for instance, “excellent,” “good,” “unsatisfactory.” 
  • Avoid using subjective or vague criteria such as “interesting” or “creative.” Instead, outline objective indicators that would fall under these categories. 
  • The criteria must clearly differentiate one performance level from another. 
  • Assign a numerical scale to each level. 
  • Give a draft of the rubric to your colleagues and/or TAs for feedback. 
  • Train students to use your rubric and solicit feedback. This will help you judge whether the rubric is clear to them and will identify any weaknesses. 
  • Rework the rubric based on the feedback. 


Rubrics are a set of criteria to evaluate performance on an assignment or assessment. Rubrics can communicate expectations regarding the quality of work to students and provide a standardized framework for instructors to assess work. Rubrics can be used for both formative and summative assessment. They are also crucial in encouraging self-assessment of work and structuring peer-assessments. 

Why use rubrics?

Rubrics are an important tool to assess learning in an equitable and just manner. This is because they enable:

  • A common set of standards and criteria to be uniformly applied, which can mitigate bias
  • Transparency regarding the standards and criteria on which students are evaluated
  • Efficient grading with timely and actionable feedback 
  • Identifying areas in which students need additional support and guidance 
  • The use of objective, criterion-referenced metrics for evaluation 

Some instructors may be reluctant to provide a rubric to grade assessments under the perception that it stifles student creativity (Haugnes & Russell, 2018). However, sharing the purpose of an assessment and the criteria for success in the form of a rubric, along with relevant examples, has been shown to particularly improve the success of BIPOC, multiracial, and first-generation students (Jonsson, 2014; Winkelmes, 2016). Improved success in assessments is generally associated with an increased sense of belonging which, in turn, leads to higher student retention and more equitable outcomes in the classroom (Calkins & Winkelmes, 2018; Weisz et al., 2023). By not providing a rubric, faculty risk having students guess the criteria on which they will be evaluated. When students have to guess what the expectations are, it may unfairly disadvantage students who are first-generation, BIPOC, international, or otherwise have not been exposed to the cultural norms that have dominated higher-ed institutions in the U.S. (Shapiro et al., 2023). Moreover, in such cases, criteria may be applied inconsistently across students, leading to biases in the grades awarded.

Steps for Creating a Rubric

1. Clearly state the purpose of the assessment: which topic(s) learners are being tested on, the type of assessment (e.g., a presentation, essay, group project), the skills they are being tested on (e.g., writing, comprehension, presentation, collaboration), and the goal of the assessment for instructors (e.g., gauging formative or summative understanding of the topic). 

2. Determine the specific criteria or dimensions to assess. These criteria should align with the learning objectives or outcomes to be evaluated. They typically form the rows in a rubric grid and describe the skills, knowledge, or behavior to be demonstrated. The set of criteria may include, for example, the idea/content, quality of arguments, organization, grammar, citations, and/or creativity in writing. These criteria may form separate rows or be compiled in a single row depending on the type of rubric.

(See the row headers of Figure 1)

3. Create a scale of performance levels that describe the degree of proficiency attained for each criterion. The scale typically has 4 to 5 levels (although there may be fewer levels depending on the type of rubric used). The levels should also have meaningful labels (e.g., not meeting expectations, approaching expectations, meeting expectations, exceeding expectations). When assigning levels of performance, use inclusive language that can inculcate a growth mindset among students, especially when work may otherwise be deemed not to meet the mark. Some examples include “Does not yet meet expectations,” “Considerable room for improvement,” “Progressing,” “Approaching,” “Emerging,” or “Needs more work,” instead of terms like “Unacceptable,” “Fails,” “Poor,” or “Below Average.”

(See the column headers of Figure 1)

4. Develop a clear and concise descriptor for each combination of criterion and performance level. These descriptors should provide examples or explanations of what constitutes each level of performance for each criterion. Typically, instructors should start by describing the highest and lowest levels of performance for a criterion and then describe the intermediate levels. It is important to keep the language uniform across all columns, e.g., use syntax and words that are aligned in each column for a given criterion. 

(See the cells of Figure 1)

5. Consider how each criterion is weighted, so that the weights reflect the importance of the learning objectives being tested. For example, if the primary goal of a research proposal is to test mastery of content and application of knowledge, these criteria should be weighted more heavily than others (e.g., grammar, style of presentation). This can be done by associating a different scoring system with each criterion (e.g., a scale of 8-6-4-2 points per performance level for higher-weight criteria and 4-3-2-1 points per level for lower-weight criteria). Further, the number of points awarded across levels of performance should be evenly spaced (e.g., 10-8-6-4 instead of 10-6-3-1). Finally, if there is a letter grade associated with a particular assessment, consider how it relates to scores. For example, instead of having students receive an A only if they reach the highest level of performance on every criterion, consider assigning an A to a range of scores (28-30 total points) or a combination of levels of performance (e.g., exceeds expectations on higher-weight criteria and meets expectations on others). 

(See the numerical values in the column headers of Figure 1)
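The weighted scoring scheme described above can be sketched in a few lines of code. This is a minimal illustration only: the criterion names, the choice of which criteria carry more weight, and the sample student are hypothetical, while the 8-6-4-2 and 4-3-2-1 point scales follow the example in the text.

```python
# Points awarded at each performance level (4, 3, 2, 1), per criterion.
# Higher-weight criteria use an 8-6-4-2 scale; lower-weight ones use 4-3-2-1.
# Criterion names here are hypothetical examples.
CRITERIA = {
    "content":      (8, 6, 4, 2),
    "application":  (8, 6, 4, 2),
    "grammar":      (4, 3, 2, 1),
    "presentation": (4, 3, 2, 1),
}

def total_score(levels):
    """Sum the points for a student, given a performance level (4..1) per criterion."""
    total = 0
    for criterion, level in levels.items():
        scale = CRITERIA[criterion]
        total += scale[4 - level]  # level 4 -> index 0, ..., level 1 -> index 3
    return total

# Exceeds expectations on the heavily weighted criteria, meets them elsewhere:
student = {"content": 4, "application": 4, "grammar": 3, "presentation": 3}
print(total_score(student))  # 22 out of a possible 24
```

A letter grade can then be attached to a score range (e.g., an A for 22-24 total points) rather than requiring the top level on every criterion.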

Figure 1: Graphic describing the five basic elements of a rubric

Note: Consider using a template rubric that can be used to evaluate similar activities in the classroom, to avoid the fatigue of developing multiple rubrics. Some tools include RubiStar or iRubric, which provide suggested wording for each criterion depending on the type of assessment. Additionally, the above format can be incorporated into rubrics added directly in Canvas or in the grid view of rubrics in Gradescope, which are common grading tools. Alternatively, tables within a word processor or spreadsheet may also be used to build a rubric. You may also adapt the example rubrics provided below to the specific learning goals for the assessment using the blank template rubrics we have provided against each type of rubric. Watch the linked video for a quick introduction to designing a rubric. Word document (docx) files linked below will automatically download to your device, whereas pdf files will open in a new tab.

Types of Rubrics

Analytic Rubrics

In these rubrics, one specifies at least two criteria and provides a separate score for each criterion. The steps outlined above for creating a rubric are typical for an analytic-style rubric. Analytic rubrics are used to provide detailed feedback to students and help identify strengths as well as particular areas in need of improvement. They can be particularly useful when providing formative feedback to students, for student peer and self-assessments, or for project-based summative assessments that evaluate student learning across multiple criteria. You may use a blank analytic rubric template (docx) or adapt an existing sample of an analytic rubric (pdf).

Fig 2: Graphic describing a sample analytic rubric (adapted from George Mason University, 2013)

Developmental Rubrics

These are a subset of analytic rubrics that are typically used to assess student performance and engagement during a learning period rather than the end product. Such rubrics typically assess soft skills and behaviors that are less tangible (e.g., intercultural maturity, empathy, collaboration skills). These rubrics are useful in assessing the extent to which students develop a particular skill, ability, or value in experiential learning-based programs, and they are grounded in the theory of development (King, 2005). Examples include an intercultural knowledge and competence rubric (docx) and a global learning rubric (docx).

Holistic Rubrics

These rubrics consider all criteria on one scale, providing a single score that gives an overall impression of a student’s performance on an assessment. These rubrics also emphasize the overall quality of a student’s work rather than delineating the shortfalls of that work. However, a limitation of holistic rubrics is that they are not useful for providing specific, nuanced feedback or for identifying areas of improvement. Thus, they might be useful when grading summative assessments in which students have previously received detailed feedback using analytic or single-point rubrics. They may also be used to provide quick formative feedback for smaller assignments where no more than 2-3 criteria are being tested at once. Try using our blank holistic rubric template (docx) or adapt an existing sample of a holistic rubric (pdf).

Fig 3: Graphic describing a sample holistic rubric (adapted from Teaching Commons, DePaul University)

Checklist Rubrics

These rubrics contain only two levels of performance (e.g., yes/no, present/absent) across a longer list of criteria. Checklist rubrics have the advantage of providing a quick assessment, since each criterion is either met or not met. Consequently, they are preferable when initiating self- or peer-assessments of learning: because each criterion can elicit only one of two responses, evaluations are more objective, and grading is uniform and quick. For similar reasons, such rubrics are useful for faculty in providing quick formative feedback, since they immediately highlight the specific criteria to improve on. Checklist rubrics are also used to grade summative assessments in courses using alternative grading systems such as specifications grading, contract grading, or a credit/no-credit grading system, wherein a minimum threshold of performance must be met. That said, developing checklist rubrics from existing analytic rubrics may require considerable upfront investment, given that criteria must be phrased so that they can elicit only binary responses. Here is a link to the checklist rubric template (docx).
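The binary met/not-met logic of a checklist rubric under a credit/no-credit scheme with a minimum threshold, as in specifications grading, can be sketched as follows. The criteria and the threshold of four are hypothetical examples, not taken from any particular course:

```python
# Hypothetical checklist criteria for an essay assignment.
CHECKLIST = [
    "thesis stated in first paragraph",
    "at least three pieces of supporting evidence",
    "sources cited consistently",
    "fewer than five grammatical errors",
    "meets minimum length",
]

def grade(met, threshold=4):
    """Award credit if at least `threshold` of the binary criteria are met."""
    met_count = sum(1 for criterion in CHECKLIST if criterion in met)
    return "credit" if met_count >= threshold else "no credit"

# Four of the five criteria met clears the example threshold:
print(grade({"thesis stated in first paragraph",
             "at least three pieces of supporting evidence",
             "sources cited consistently",
             "meets minimum length"}))  # credit
```

Because each criterion resolves to yes or no, two graders applying the same checklist should reach the same result, which is what makes this format quick and uniform.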

Fig. 4: Graphic describing a sample checklist rubric

Single-Point Rubrics

A single-point rubric is a modified version of a checklist-style rubric, in that it specifies a single column of criteria. However, rather than only indicating whether expectations are met or not, as happens in a checklist rubric, a single-point rubric allows instructors to specify ways in which work exceeds or falls short of expectations. Here the criteria to be tested are laid out in a central column describing the average expectation for the assignment. Instructors indicate areas of improvement on the left side of the criteria, whereas areas of strength in student performance are indicated on the right side. These types of rubrics provide flexibility in scoring and are typically used in courses with alternative grading systems such as ungrading or contract grading. However, they do require instructors to provide detailed feedback for each student, which can be infeasible for assessments in large classes. Here is a link to the single point rubric template (docx).

Fig. 5: Graphic describing a single point rubric (adapted from Teaching Commons, DePaul University)

Best Practices for Designing and Implementing Rubrics

When designing the rubric format, descriptors and criteria should be presented in a way that is compatible with screen readers and reading assistive technology. For example, avoid using only color, jargon, or complex terminology to convey information. In case you do use color, pictures or graphics, try providing alternative formats for rubrics, such as plain text documents. Explore resources from the CU Digital Accessibility Office to learn more.

Co-creating rubrics can help students engage in higher-order thinking skills such as analysis and evaluation. Further, it allows students to take ownership of their own learning by determining the criteria of their work they aspire towards. For graduate classes or upper-level students, one way of doing this may be to provide the learning outcomes of the project and let students develop the rubric on their own. However, students in introductory classes may need more scaffolding, such as providing them a draft and leaving room for modification (Stevens & Levi, 2013). Watch the linked video for tips on co-creating rubrics with students. Further, involving teaching assistants in designing a rubric can help in getting feedback on expectations for an assessment prior to implementing and norming a rubric. 

When first designing a rubric, it is important to compare grades awarded for the same assessment by multiple graders, to make sure the criteria are applied uniformly and reliably for the same level of performance. Further, ensure that the levels of performance in student work can be adequately distinguished using the rubric. It is particularly important to repeat such a norming protocol at the start of any course in which multiple graders use the same rubric to grade an assessment (e.g., recitation sections, lab sections, teaching team). Here, instructors may select a subset of assignments that all graders evaluate using the same rubric, followed by a discussion to identify any discrepancies in the criteria applied and ways to address them. Such strategies can make rubrics more reliable, effective, and clear.

Sharing the rubric with students before an assessment familiarizes them with the instructor’s expectations. This can help students master the learning outcomes by guiding their work in the appropriate direction and can increase student motivation. Further, providing the rubric encourages metacognition and the ability to self-assess learning.

Sample Rubrics

Below are links to rubric templates designed by a team of experts assembled by the Association of American Colleges and Universities (AAC&U) to assess 16 major learning goals. These goals are part of the Valid Assessment of Learning in Undergraduate Education (VALUE) program. All of these examples are analytic rubrics with detailed criteria for assessing specific skills. However, since any given assessment typically tests multiple skills, instructors are encouraged to develop their own rubric by combining criteria from the rubrics linked below.

  • Civic knowledge and engagement-local and global
  • Creative thinking
  • Critical thinking
  • Ethical reasoning
  • Foundations and skills for lifelong learning
  • Information literacy
  • Integrative and applied learning
  • Intercultural knowledge and competence
  • Inquiry and analysis
  • Oral communication
  • Problem solving
  • Quantitative literacy
  • Written communication

Note: Clicking the links above will automatically download the files to your device in Microsoft Word format. These links were created and are hosted by Kansas State University. Additional information regarding the VALUE rubrics may be found on the AAC&U homepage.

Below are links to sample rubrics that have been developed for different types of assessments. These rubrics follow the analytic rubric template unless mentioned otherwise. However, they can be modified into other types of rubrics (e.g., checklist, holistic, or single-point rubrics) based on the grading system and the goal of the assessment (e.g., formative or summative). As mentioned previously, these rubrics can be modified using the blank template provided.

  • Oral presentations  
  • Painting Portfolio (single-point rubric)
  • Research Paper
  • Video Storyboard

Additional information:

Office of Assessment and Curriculum Support. (n.d.). Creating and using rubrics. University of Hawai’i at Mānoa.

Calkins, C., & Winkelmes, M. A. (2018). A teaching method that boosts UNLV student retention. UNLV Best Teaching Practices Expo, 3.

Fraile, J., Panadero, E., & Pardo, R. (2017). Co-creating rubrics: The effects on self-regulated learning, self-efficacy and performance of establishing assessment criteria with students. Studies in Educational Evaluation, 53, 69-76.

Haugnes, N., & Russell, J. L. (2016). Don’t box me in: Rubrics for artists and designers. To Improve the Academy, 35(2), 249-283.

Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, 39(7), 840-852.

McCartin, L. (2022, February 1). Rubrics! An equity-minded practice. University of Northern Colorado.

Shapiro, S., Farrelly, R., & Tomaš, Z. (2023). Chapter 4: Effective and equitable assignments and assessments. In Fostering international student success in higher education (pp. 61-87, 2nd ed.). TESOL Press.

Stevens, D. D., & Levi, A. J. (2013). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning (2nd ed.). Sterling, VA: Stylus.

Teaching Commons (n.d.). Types of rubrics. DePaul University.

Teaching Resources (n.d.). Rubric best practices, examples, and templates. NC State University.

Winkelmes, M., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 8(1/2), 31-36.

Weisz, C., Richard, D., Oleson, K., Winkelmes, M. A., Powley, C., Sadik, A., & Stone, B. (in progress, 2023). Transparency, confidence, belonging and skill development among 400 community college students in the state of Washington.

Association of American Colleges and Universities. (2009). Valid Assessment of Learning in Undergraduate Education (VALUE).

Canvas Community. (2021, August 24). How do I add a rubric in a course? Canvas LMS Community.

Center for Teaching & Learning. (2021, March 3). Overview of rubrics. University of Colorado, Boulder.

Center for Teaching & Learning. (2021, March 18). Best practices to co-create rubrics with students. University of Colorado, Boulder.

Chase, D., Ferguson, J. L., & Hoey, J. J. (2014). Assessment in creative disciplines: Quantifying and qualifying the aesthetic. Common Ground Publishing.

Feldman, J. (2018). Grading for equity: What it is, why it matters, and how it can transform schools and classrooms. Corwin Press, CA.

Gradescope (n.d.). Instructor: Assignment - Grade submissions. Gradescope Help Center.

Henning, G., Baker, G., Jankowski, N., Lundquist, A., & Montenegro, E. (Eds.). (2022). Reframing assessment to center equity. Stylus Publishing.

King, P. M., & Baxter Magolda, M. B. (2005). A developmental model of intercultural maturity. Journal of College Student Development, 46(2), 571-592.

Selke, M. J. G. (2013). Rubric assessment goes to college: Objective, comprehensive evaluation of student work. Lanham, MD: Rowman & Littlefield.

The Institute for Habits of Mind. (2023, January 9). Creativity rubrics - The Institute for Habits of Mind.


How to Use Rubrics

A rubric is a document that describes the criteria by which students’ assignments are graded. Rubrics can be helpful for:

  • Making grading faster and more consistent (reducing potential bias). 
  • Communicating your expectations for an assignment to students before they begin. 

Moreover, for assignments whose criteria are more subjective, the process of creating a rubric and articulating what it looks like to succeed at an assignment provides an opportunity to check for alignment with the intended learning outcomes and modify the assignment prompt, as needed.

Why rubrics?

Rubrics are best for assignments or projects that require evaluation on multiple dimensions. Creating a rubric makes the instructor’s standards explicit to both students and other teaching staff for the class, showing students how to meet expectations.

Additionally, the more comprehensive a rubric is, the more it allows for grading to be streamlined—students will get informative feedback about their performance from the rubric, even if they don’t have as many individualized comments. Grading can be more standardized and efficient across graders.

Finally, rubrics allow for reflection, as the instructor has to think about their standards and outcomes for the students. Using rubrics can help with self-directed learning in students as well, especially if rubrics are used to review students’ own work or their peers’, or if students are involved in creating the rubric.

How to design a rubric

1. Consider the desired learning outcomes

What learning outcomes is this assignment reinforcing and assessing? If the learning outcome seems “fuzzy,” iterate on the outcome by thinking about the expected student work product. This may help you more clearly articulate the learning outcome in a way that is measurable.  

2. Define criteria

What does a successful assignment submission look like? As described by Allen and Tanner (2006), it can help to develop an initial list of categories in which students should demonstrate proficiency by completing the assignment. These categories should correlate with the intended learning outcomes you identified in Step 1, although they may be more granular in some cases. For example, if the task assesses students’ ability to formulate an effective communication strategy, what components of their communication strategy will you look for? Talking with colleagues or looking at existing rubrics for similar tasks may suggest categories to consider for evaluation.

If you have assigned this task to students before and have samples of student work, it can help to create a qualitative observation guide. This approach is described in Linda Suskie’s book Assessing Student Learning, where she suggests thinking about what made you give one assignment an A and another a C, taking notes while grading, and looking for common patterns. The themes you comment on repeatedly may reveal your goals and expectations for students. An example of an observation guide used to take notes on predetermined areas of an assignment is shown here.

In summary, consider the following list of questions when defining criteria for a rubric (O’Reilly and Cyr, 2006):

  • What do you want students to learn from the task?
  • How will students demonstrate that they have learned?
  • What knowledge, skills, and behaviors are required for the task?
  • What steps are required for the task?
  • What are the characteristics of the final product?

After developing an initial list of criteria, prioritize the most important skills you want to target, eliminating unessential criteria or combining similar skills into one group. Most rubrics have between 3 and 8 criteria. Rubrics that are too lengthy are difficult to grade and make it challenging for students to understand the key skills they need to achieve for a given assignment.

3. Create the rating scale

According to Suskie, you will want at least three performance levels: at a minimum, one for adequate and one for inadequate performance, plus an exemplary level to motivate students to strive for even better work. Rubrics often contain five levels, adding one level between adequate and exemplary and another between adequate and inadequate. Usually no more than five levels are needed, as too many rating levels makes it hard to consistently distinguish which rating to give an assignment (such as between a 6 or a 7 out of 10). Suskie also suggests labeling each level with a name to clarify which level represents the minimum acceptable performance. Labels will vary by assignment and subject, but some examples are:

  • Exceeds standard, meets standard, approaching standard, below standard
  • Complete evidence, partial evidence, minimal evidence, no evidence

4. Fill in descriptors

Fill in descriptors for each criterion at each performance level. Expand on the list of criteria you developed in Step 2. Begin to write full descriptions, thinking about what an exemplary example would look like for students to strive towards. Avoid vague terms like “good” and make sure to use explicit, concrete terms to describe what would make a criterion good. For instance, a criterion called “organization and structure” would be more descriptive than “writing quality.” Describe measurable behavior and use parallel language for clarity; the wording for each criterion should be very similar, except for the degree to which standards are met. For example, in a sample rubric from Chapter 9 of Suskie’s book, the criterion of “persuasiveness” has the following descriptors:

  • Well Done (5): Motivating questions and advance organizers convey the main idea. Information is accurate.
  • Satisfactory (3-4): Includes persuasive information.
  • Needs Improvement (1-2): Includes persuasive information with few facts.
  • Incomplete (0): Information is incomplete, out of date, or incorrect.

These sample descriptors generally have the same sentence structure that provides consistent language across performance levels and shows the degree to which each standard is met.

5. Test your rubric

Test your rubric against a range of student work to see if it is realistic. You may also consider leaving room for aspects of the assignment, such as effort, originality, and creativity, to encourage students to go beyond the rubric. If multiple instructors will be grading, it is important to calibrate the scoring by having all graders apply the rubric to a selected set of student work and then discuss any differences in the scores. This process develops consistency and makes grading more valid and reliable.

Types of Rubrics

If you would like to dive deeper into rubric terminology, this section is dedicated to discussing some of the different types of rubrics. However, regardless of the type of rubric you use, it’s still most important to focus first on your learning goals and think about how the rubric will help clarify students’ expectations and measure student progress towards those learning goals.

Depending on the nature of the assignment, rubrics can come in several varieties (Suskie, 2009):

Checklist Rubric

This is the simplest kind of rubric: it lists specific features or aspects of the assignment that may be present or absent, and it does not involve a rating scale with descriptors. See example from 18.821 project-based math class.

Rating Scale Rubric

This is like a checklist rubric, but instead of merely noting the presence or absence of a feature or aspect of the assignment, the grader also rates quality (often on a graded or Likert-style scale). See example from 6.811 assistive technology class .

Descriptive Rubric

A descriptive rubric is like a rating scale rubric, but it includes descriptions of what performance at each level on each scale looks like. Descriptive rubrics are particularly useful for communicating instructors’ expectations of performance to students and for creating consistency when multiple graders score an assignment. This kind of rubric is probably what most people think of when they imagine a rubric. See example from 15.279 communications class.

Holistic Scoring Guide

Unlike the first three types of rubrics, a holistic scoring guide describes performance at different levels (e.g., A-level performance, B-level performance) holistically, without breaking the assignment down into several different scales. This kind of rubric is particularly useful when there are many assignments to grade and a moderate to high degree of subjectivity in assessing quality. Consistency across scores can be difficult to achieve, so holistic scoring guides are most helpful for making decisions quickly rather than for providing detailed feedback to students. See example from 11.229 advanced writing seminar.

The kind of rubric that is most appropriate will depend on the assignment in question.

Implementation tips

Rubrics are also available to use for Canvas assignments. See this resource from Boston College for more details and guides from Canvas Instructure.

Allen, D., & Tanner, K. (2006). Rubrics: Tools for Making Learning Goals and Evaluation Criteria Explicit for Both Teachers and Learners. CBE—Life Sciences Education, 5 (3), 197-203. doi:10.1187/cbe.06-06-0168

Cherie Miot Abbanat. 11.229 Advanced Writing Seminar. Spring 2004. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu . License: Creative Commons BY-NC-SA .

Haynes Miller, Nat Stapleton, Saul Glasman, and Susan Ruff. 18.821 Project Laboratory in Mathematics. Spring 2013. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu . License: Creative Commons BY-NC-SA .

Lori Breslow, and Terence Heagney. 15.279 Management Communication for Undergraduates. Fall 2012. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu . License: Creative Commons BY-NC-SA .

O’Reilly, L., & Cyr, T. (2006). Creating a Rubric: An Online Tutorial for Faculty. Retrieved from https://www.ucdenver.edu/faculty_staff/faculty/center-for-faculty-development/Documents/Tutorials/Rubrics/index.htm

Suskie, L. (2009). Using a scoring guide or rubric to plan and evaluate an assessment. In Assessing student learning: A common sense guide (2nd ed., pp. 137-154). Jossey-Bass.

William Li, Grace Teo, and Robert Miller. 6.811 Principles and Practice of Assistive Technology. Fall 2014. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu . License: Creative Commons BY-NC-SA .

Rubric Design

Articulating Your Assessment Values

Reading, commenting on, and then assigning a grade to a piece of student writing requires intense attention and difficult judgment calls. Some faculty dread “the stack.” Students may share the faculty’s dim view of writing assessment, perceiving it as highly subjective. They wonder why one faculty member values evidence and correctness before all else, while another seeks a vaguely defined originality.

Writing rubrics can help address the concerns of both faculty and students by making writing assessment more efficient, consistent, and public. Whether it is called a grading rubric, a grading sheet, or a scoring guide, a writing assignment rubric lists criteria by which the writing is graded.

Why create a writing rubric?

  • It makes your tacit rhetorical knowledge explicit
  • It articulates community- and discipline-specific standards of excellence
  • It links the grade you give the assignment to the criteria
  • It can make your grading more efficient, consistent, and fair as you can read and comment with your criteria in mind
  • It can help you reverse engineer your course: once you have the rubrics created, you can align your readings, activities, and lectures with the rubrics to set your students up for success
  • It can help your students produce writing that you look forward to reading

How to create a writing rubric

Create a rubric at the same time you create the assignment. It will help you explain to the students what your goals are for the assignment.

  • Consider your purpose: do you need a rubric that addresses the standards for all the writing in the course? Or do you need to address the writing requirements and standards for just one assignment? Task-specific rubrics are written to help teachers assess individual assignments or genres, whereas generic rubrics are written to help teachers assess multiple assignments.
  • Begin by listing the important qualities of the writing that will be produced in response to a particular assignment. It may be helpful to have several examples of excellent versions of the assignment in front of you: what writing elements do they all have in common? Among other things, these may include features of the argument, such as a main claim or thesis; use and presentation of sources, including visuals; and formatting guidelines such as the requirement of a works cited.
  • Then consider how the criteria will be weighted in grading. Perhaps all criteria are equally important, or perhaps there are two or three that all students must achieve to earn a passing grade. Decide what best fits the class and requirements of the assignment.

Consider involving students in Steps 2 and 3. A class session devoted to developing a rubric can provoke many important discussions about the ways the features of the language serve the purpose of the writing. And when students themselves work to describe the writing they are expected to produce, they are more likely to achieve it.

At this point, you will need to decide if you want to create a holistic or an analytic rubric. There is much debate about these two approaches to assessment.

Comparing Holistic and Analytic Rubrics

Holistic Scoring

Holistic scoring aims to rate overall proficiency in a given student writing sample. It is often used in large-scale writing program assessment and impromptu classroom writing for diagnostic purposes.

General tenets of holistic scoring:

  • Responding to drafts is part of evaluation
  • Responses do not focus on grammar and mechanics during drafting and there is little correction
  • Marginal comments are kept to 2-3 per page with summative comments at end
  • End commentary attends to students’ overall performance across learning objectives as articulated in the assignment
  • Response language aims to foster students’ self-assessment

Holistic rubrics emphasize what students do well and generally increase efficiency; they may also be more valid because scoring includes the authentic, personal reaction of the reader. But holistic scores won’t tell a student how they’ve progressed relative to previous assignments and may be rater-dependent, reducing reliability. (For a summary of advantages and disadvantages of holistic scoring, see Becker, 2011, p. 116.)

Here is an example of a partial holistic rubric:

The summary meets all the criteria. The writer understands the article thoroughly. The main points in the article appear in the summary, with all main points proportionately developed. The summary should be as comprehensive as possible and should read smoothly, with appropriate transitions between ideas. Sentences should be clear, without vagueness or ambiguity and without grammatical or mechanical errors.

A complete holistic rubric for a research paper (authored by Jonah Willihnganz) can be  downloaded here.

Analytic Scoring

Analytic scoring makes explicit the contribution to the final grade of each element of writing. For example, an instructor may choose to give 30 points for an essay whose ideas are sufficiently complex, that marshals good reasons in support of a thesis, and whose argument is logical; and 20 points for well-constructed sentences and careful copy editing.
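As a sketch only (the criterion names and point values below are hypothetical, loosely modeled on the example in the paragraph above), the arithmetic of analytic scoring looks like this:

```python
# Analytic scoring: each element of the writing carries its own point
# value, and the assignment grade is the sum of the weighted ratings.

RUBRIC = {
    # criterion: maximum points (hypothetical weights)
    "complexity of ideas and logic of argument": 30,
    "support for thesis": 25,
    "organization": 25,
    "sentence construction and copy editing": 20,
}

def analytic_score(ratings):
    """ratings maps each criterion to the fraction of its maximum
    points earned (0.0 to 1.0)."""
    return sum(RUBRIC[c] * ratings[c] for c in RUBRIC)

essay = {"complexity of ideas and logic of argument": 0.9,
         "support for thesis": 0.8,
         "organization": 1.0,
         "sentence construction and copy editing": 0.75}
print(analytic_score(essay))  # → 87.0 out of a possible 100
```

Making the weights explicit this way is exactly what the analytic approach promises: students can see which element of the writing earned or lost points.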

General tenets of analytic scoring:

  • Reflect emphases in your teaching and communicate the learning goals for the course
  • Emphasize student performance across criteria, which are established as central to the assignment in advance, usually on an assignment sheet
  • Typically take a quantitative approach, providing a scaled set of points for each criterion
  • Make the analytic framework available to students before they write  

Advantages of an analytic rubric include ease of training raters and improved reliability. Meanwhile, writers often can more easily diagnose the strengths and weaknesses of their work. But analytic rubrics can be time-consuming to produce, and raters may judge the writing holistically anyway. Moreover, many readers believe that writing traits cannot be separated. (For a summary of the advantages and disadvantages of analytic scoring, see Becker, 2011, p. 115.)

For example, a partial analytic rubric for a single trait, “addresses a significant issue”:

  • Excellent: Elegantly establishes the current problem, why it matters, to whom
  • Above Average: Identifies the problem; explains why it matters and to whom
  • Competent: Describes topic but relevance unclear or cursory
  • Developing: Unclear issue and relevance

A  complete analytic rubric for a research paper can be downloaded here.  In WIM courses, this language should be revised to name specific disciplinary conventions.

Whichever type of rubric you write, your goal is to avoid pushing students into prescriptive formulas and limiting thinking (e.g., “each paragraph has five sentences”). By carefully describing the writing you want to read, you give students a clear target, and, as Ed White puts it, “describe the ongoing work of the class” (75).

Writing rubrics contribute meaningfully to the teaching of writing. Think of them as a coaching aide. In class and in conferences, you can use the language of the rubric to help you move past generic statements about what makes good writing good to statements about what constitutes success on the assignment and in the genre or discourse community. The rubric articulates what you are asking students to produce on the page; once that work is accomplished, you can turn your attention to explaining how students can achieve it.

Works Cited

Becker, Anthony. “Examining Rubrics Used to Measure Writing Performance in U.S. Intensive English Programs.” The CATESOL Journal 22.1 (2010/2011): 113-30. Web.

White, Edward M.  Teaching and Assessing Writing . Proquest Info and Learning, 1985. Print.

Further Resources

CCCC Committee on Assessment. “Writing Assessment: A Position Statement.” November 2006 (Revised March 2009). Conference on College Composition and Communication. Web.

Gallagher, Chris W. “Assess Locally, Validate Globally: Heuristics for Validating Local Writing Assessments.” Writing Program Administration 34.1 (2010): 10-32. Web.

Huot, Brian.  (Re)Articulating Writing Assessment for Teaching and Learning.  Logan: Utah State UP, 2002. Print.

Kelly-Reilly, Diane, and Peggy O’Neil, eds. Journal of Writing Assessment. Web.

McKee, Heidi A., and Dànielle Nicole DeVoss, eds. Digital Writing Assessment & Evaluation. Logan, UT: Computers and Composition Digital Press/Utah State University Press, 2013. Web.

O’Neill, Peggy, Cindy Moore, and Brian Huot.  A Guide to College Writing Assessment . Logan: Utah State UP, 2009. Print.

Sommers, Nancy.  Responding to Student Writers . Macmillan Higher Education, 2013.

Straub, Richard. “Responding, Really Responding to Other Students’ Writing.” The Subject is Writing: Essays by Teachers and Students. Ed. Wendy Bishop. Boynton/Cook, 1999. Web.

White, Edward M., and Cassie A. Wright.  Assigning, Responding, Evaluating: A Writing Teacher’s Guide . 5th ed. Bedford/St. Martin’s, 2015. Print.


Assessment and Curriculum Support Center

Creating and Using Rubrics

Last Updated: 4 March 2024.

On this page:

  • What is a rubric?
  • Why use a rubric?
  • What are the parts of a rubric?
  • Developing a rubric
  • Sample rubrics
  • Scoring rubric group orientation and calibration
  • Suggestions for using rubrics in courses
  • Equity-minded considerations for rubric development
  • Tips for developing a rubric
  • Additional resources & sources consulted

Note: The information and resources contained here serve only as a primer to the exciting and diverse perspectives in the field today. This page will be continually updated to reflect shared understandings of equity-minded theory and practice in learning assessment.

1. What is a rubric?

A rubric is an assessment tool often shaped like a matrix, which describes levels of achievement in a specific area of performance, understanding, or behavior.

There are two main types of rubrics:

Analytic Rubric: An analytic rubric specifies at least two characteristics to be assessed at each performance level and provides a separate score for each characteristic (e.g., a score on “formatting” and a score on “content development”).

  • Advantages: provides more detailed feedback on student performance; promotes consistent scoring across students and between raters
  • Disadvantages: more time-consuming than applying a holistic rubric

Use an analytic rubric when:

  • You want to see strengths and weaknesses.
  • You want detailed feedback about student performance.

Holistic Rubric: A holistic rubric provides a single score based on an overall impression of a student’s performance on a task.

  • Advantages: quick scoring; provides an overview of student achievement; efficient for large group scoring
  • Disadvantages: does not provide detailed information; not diagnostic; may be difficult for scorers to decide on one overall score

Use a holistic rubric when:

  • You want a quick snapshot of achievement.
  • A single dimension is adequate to define quality.

2. Why use a rubric?

  • A rubric creates a common framework and language for assessment.
  • Complex products or behaviors can be examined efficiently.
  • Well-trained reviewers apply the same criteria and standards.
  • Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, “Did the student meet the criteria for level 5 of the rubric?” rather than “How well did this student do compared to other students?”
  • Using rubrics can lead to substantive conversations among faculty.
  • When faculty members collaborate to develop a rubric, it promotes shared expectations and grading practices.

Faculty members can use rubrics for program assessment. Examples:

The English Department collected essays from students in all sections of English 100. A random sample of essays was selected, and a team of faculty members evaluated them by applying an analytic scoring rubric. Before applying the rubric, they “normed”; that is, they agreed on how to apply the rubric by scoring the same set of essays and discussing them until consensus was reached (see below: “6. Scoring rubric group orientation and calibration”).

Biology laboratory instructors agreed to use a “Biology Lab Report Rubric” to grade students’ lab reports in all Biology lab sections, from 100- to 400-level. At the beginning of each semester, instructors met, discussed sample lab reports, and agreed on how to apply the rubric and on their expectations for an “A,” “B,” “C,” etc., report in 100-level, 200-level, and 300- and 400-level lab sections. Every other year, a random sample of students’ lab reports is selected from 300- and 400-level sections. Each of those reports is then scored by a Biology professor, and the score given by the course instructor is compared to the score given by the professor. In addition, the scores are reported as part of the program’s assessment report. In this way, the program determines how well it is meeting its outcome, “Students will be able to write biology laboratory reports.”

3. What are the parts of a rubric?

Rubrics are composed of four basic parts. In its simplest form, a rubric includes:

  • A task description. The outcome being assessed or the instructions students received for an assignment.
  • The characteristics to be rated (rows). The skills, knowledge, and/or behavior to be demonstrated.
  • Levels of mastery/scale (columns). Labels vary; examples include:
      • Beginning, approaching, meeting, exceeding
      • Emerging, developing, proficient, exemplary
      • Novice, intermediate, intermediate high, advanced
      • Beginning, striving, succeeding, soaring
  • The description of each characteristic at each level of mastery/scale (cells). Also called a “performance description.” Explains what a student will have done to demonstrate they are at a given level of mastery for a given characteristic.

4. Developing a rubric

Step 1: Identify what you want to assess

Step 2: Identify the characteristics to be rated (rows). These are also called “dimensions.”

  • Specify the skills, knowledge, and/or behaviors that you will be looking for.
  • Limit the characteristics to those that are most important to the assessment.

Step 3: Identify the levels of mastery/scale (columns).

Tip: Aim for an even number (4 or 6) because when an odd number is used, the middle tends to become the “catch-all” category.

Step 4: Describe each level of mastery for each characteristic/dimension (cells).

  • Describe the best work you could expect using these characteristics. This describes the top category.
  • Describe an unacceptable product. This describes the lowest category.
  • Develop descriptions of intermediate-level products for intermediate categories.
Important: Each description and each characteristic should be mutually exclusive.

Step 5: Test rubric.

  • Apply the rubric to an assignment.
  • Share with colleagues.
Tip: Faculty members often find it useful to establish the minimum score needed for student work to be deemed passable. For example, faculty members may decide that a "1" or "2" on a 4-point scale (4=exemplary, 3=proficient, 2=marginal, 1=unacceptable) does not meet minimum quality expectations. We encourage a standard-setting session to set the score needed to meet expectations (also called a "cutscore"). Monica has posted materials from standard-setting workshops, one offered on campus and the other at a national conference (includes speaker notes with the presentation slides). Faculty may then set their criterion for success, e.g., 90% of students must score 3 or higher. If assessment study results fall short of this criterion, action will need to be taken.
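As a sketch of the success-criterion check described in the tip above (the student scores are hypothetical; the 4-point scale, cutscore of 3, and 90% threshold come from the example):

```python
# Check a program's success criterion: a given fraction of students must
# score at or above the cutscore on the rubric scale.
# The scores below are hypothetical illustration data.

def meets_criterion(scores, cutscore=3, required_fraction=0.90):
    """Return the fraction at/above the cutscore and whether it meets the target."""
    fraction = sum(s >= cutscore for s in scores) / len(scores)
    return fraction, fraction >= required_fraction

scores = [4, 3, 3, 2, 4, 3, 3, 3, 4, 1]   # ten hypothetical student scores
fraction, ok = meets_criterion(scores)
print(f"{fraction:.0%} scored 3 or higher; criterion met: {ok}")
```

If the criterion is not met, the results would feed into the program's discussion of what action to take.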

Step 6: Discuss with colleagues. Review feedback and revise.

Important: When developing a rubric for program assessment, enlist the help of colleagues. Rubrics promote shared expectations and consistent grading practices which benefit faculty members and students in the program.

5. Sample rubrics

Rubrics are on our Rubric Bank page and in our Rubric Repository (Graduate Degree Programs). More are available at the Assessment and Curriculum Support Center in Crawford Hall (hard copy).

These open as Word documents and are examples from outside UH.

  • Group Participation (analytic rubric)
  • Participation (holistic rubric)
  • Design Project (analytic rubric)
  • Critical Thinking (analytic rubric)
  • Media and Design Elements (analytic rubric; portfolio)
  • Writing (holistic rubric; portfolio)

6. Scoring rubric group orientation and calibration

When using a rubric for program assessment purposes, faculty members apply the rubric to pieces of student work (e.g., reports, oral presentations, design projects). To produce dependable scores, each faculty member needs to interpret the rubric in the same way. The process of training faculty members to apply the rubric is called “norming.” It’s a way to calibrate the faculty members so that scores are accurate and consistent across the faculty. Below are directions for an assessment coordinator carrying out this process.

Suggested materials for a scoring session:

  • Copies of the rubric
  • Copies of the “anchors”: pieces of student work that illustrate each level of mastery. Suggestion: have 6 anchor pieces (2 low, 2 middle, 2 high)
  • Score sheets
  • Extra pens, tape, post-its, paper clips, stapler, rubber bands, etc.

Hold the scoring session in a room that:

  • Allows the scorers to spread out as they rate the student pieces
  • Has a chalk or white board, smart board, or flip chart

Suggested steps for conducting the session:

  • Describe the purpose of the activity, stressing how it fits into program assessment plans. Explain that the purpose is to assess the program, not individual students or faculty, and describe ethical guidelines, including respect for confidentiality and privacy.
  • Describe the nature of the products that will be reviewed, briefly summarizing how they were obtained.
  • Describe the scoring rubric and its categories. Explain how it was developed.
  • Analytic: Explain that readers should rate each dimension of an analytic rubric separately, and they should apply the criteria without concern for how often each score (level of mastery) is used. Holistic: Explain that readers should assign the score or level of mastery that best describes the whole piece; some aspects of the piece may not appear in that score and that is okay. They should apply the criteria without concern for how often each score is used.
  • Give each scorer a copy of several student products that are exemplars of different levels of performance. Ask each scorer to independently apply the rubric to each of these products, writing their ratings on a scrap sheet of paper.
  • Once everyone is done, collect everyone’s ratings and display them so everyone can see the degree of agreement. This is often done on a blackboard, with each person in turn announcing his/her ratings as they are entered on the board. Alternatively, the facilitator could ask raters to raise their hands when their rating category is announced, making the extent of agreement very clear to everyone and making it very easy to identify raters who routinely give unusually high or low ratings.
  • Guide the group in a discussion of their ratings. There will be differences. This discussion is important to establish standards. Attempt to reach consensus on the most appropriate rating for each of the products being examined by inviting people who gave different ratings to explain their judgments. Raters should be encouraged to explain by making explicit references to the rubric. Usually consensus is possible, but sometimes a split decision emerges, e.g., the group may agree that a product is a "3-4" split because it has elements of both categories. This is usually not a problem. You might allow the group to revise the rubric to clarify its use, but avoid allowing the group to drift away from the rubric and learning outcome(s) being assessed.
  • Once the group is comfortable with how the rubric is applied, the rating begins. Explain how to record ratings using the score sheet and explain the procedures. Reviewers begin scoring.

After the scoring session, discuss the results and their implications:

  • Are results sufficiently reliable?
  • What do the results mean? Are we satisfied with the extent of students’ learning?
  • Who needs to know the results?
  • What are the implications of the results for curriculum, pedagogy, or student support services?
  • How might the assessment process, itself, be improved?
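One simple way to begin answering the reliability question above is to compute exact- and adjacent-agreement rates between pairs of raters who scored the same set of products. A minimal sketch (all scores are hypothetical):

```python
# Percent exact agreement (identical score) and adjacent agreement (within
# one scale point) between two raters scoring the same papers.
# The scores below are hypothetical illustration data.

def agreement(rater_a, rater_b):
    """Return (exact_agreement, adjacent_agreement) as fractions."""
    pairs = list(zip(rater_a, rater_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

a = [4, 3, 2, 4, 1, 3, 3, 2]   # rater A's scores on eight papers
b = [4, 3, 3, 4, 2, 3, 2, 2]   # rater B's scores on the same papers
exact, adjacent = agreement(a, b)
print(f"exact: {exact:.1%}, adjacent: {adjacent:.1%}")
```

Low agreement would suggest revisiting the norming discussion or clarifying the rubric language before trusting the scores.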

7. Suggestions for using rubrics in courses

  • Use the rubric to grade student work. Hand out the rubric with the assignment so students will know your expectations and how they’ll be graded. This should help students master your learning outcomes by guiding their work in appropriate directions.
  • Use a rubric for grading student work and return the rubric with the grading on it. Faculty save time writing extensive comments; they just circle or highlight relevant segments of the rubric. Some faculty members include room for additional comments on the rubric page, either within each section or at the end.
  • Develop a rubric with your students for an assignment or group project. Students can then monitor themselves and their peers using agreed-upon criteria that they helped develop. Many faculty members find that students will create higher standards for themselves than faculty members would impose on them.
  • Have students apply your rubric to sample products before they create their own. Faculty members report that students are quite accurate when doing this, and this process should help them evaluate their own projects as they are being developed. The ability to evaluate, edit, and improve draft documents is an important skill.
  • Have students exchange paper drafts and give peer feedback using the rubric. Then, give students a few days to revise before submitting the final draft to you. You might also require that they turn in the draft and peer-scored rubric with their final paper.
  • Have students self-assess their products using the rubric and hand in their self-assessment with the product; then, faculty members and students can compare self- and faculty-generated evaluations.

8. Equity-minded considerations for rubric development

Ensure transparency by making rubric criteria public, explicit, and accessible

Transparency is a core tenet of equity-minded assessment practice. Students should know and understand how they are being evaluated as early as possible.

  • Ensure the rubric is publicly available & easily accessible. We recommend publishing on your program or department website.
  • Have course instructors introduce and use the program rubric in their own courses. Instructors should explain to students connections between the rubric criteria and the course and program SLOs.
  • Write rubric criteria using student-focused and culturally-relevant language to ensure students understand the rubric’s purpose, the expectations it sets, and how criteria will be applied in assessing their work.
  • For example, instructors can provide annotated examples of student work using the rubric language as a resource for students.

Meaningfully involve students and engage multiple perspectives

Rubrics created by faculty alone risk perpetuating unseen biases as the evaluation criteria used will inherently reflect faculty perspectives, values, and assumptions. Including students and other stakeholders in developing criteria helps to ensure performance expectations are aligned between faculty, students, and community members. Additional perspectives to be engaged might include community members, alumni, co-curricular faculty/staff, field supervisors, potential employers, or current professionals. Consider the following strategies to meaningfully involve students and engage multiple perspectives:

  • Have students read each evaluation criterion and talk out loud about what they think it means. This will allow you to identify which language is clear and where there is still confusion.
  • Ask students to use their own language to interpret the rubric and provide a student version of the rubric. If you use this strategy, it is essential to create an inclusive environment where students and faculty have equal opportunity to provide input.
  • Be sure to incorporate feedback from faculty and instructors who teach diverse courses, levels, and in different sub-disciplinary topics. Faculty and instructors who teach introductory courses have valuable experiences and perspectives that may differ from those who teach higher-level courses.
  • Engage multiple perspectives including co-curricular faculty/staff, alumni, potential employers, and community members for feedback on evaluation criteria and rubric language. This will ensure evaluation criteria reflect what is important for all stakeholders.
  • Elevate historically silenced voices in discussions on rubric development. Ensure stakeholders from historically underrepresented communities have their voices heard and valued.

Honor students’ strengths in performance descriptions

When describing students’ performance at different levels of mastery, use language that describes what students can do rather than what they cannot do. For example:

  • Instead of: Students cannot make coherent arguments consistently.
  • Use: Students can make coherent arguments occasionally.

9. Tips for developing a rubric

  • Find and adapt an existing rubric! It is rare to find a rubric that is exactly right for your situation, but you can adapt an already existing rubric that has worked well for others and save a great deal of time. A faculty member in your program may already have a good one.
  • Evaluate the rubric . Ask yourself: A) Does the rubric relate to the outcome(s) being assessed? (If yes, success!) B) Does it address anything extraneous? (If yes, delete.) C) Is the rubric useful, feasible, manageable, and practical? (If yes, find multiple ways to use the rubric: program assessment, assignment grading, peer review, student self assessment.)
  • Collect samples of student work that exemplify each point on the scale or level. A rubric will not be meaningful to students or colleagues until the anchors/benchmarks/exemplars are available.
  • Expect to revise.
  • When you have a good rubric, SHARE IT!

10. Additional resources & sources consulted:

Rubric examples:

  • Rubrics primarily for undergraduate outcomes and programs
  • Rubric repository for graduate degree programs

Workshop presentation slides and handouts:

  • Workshop handout (Word document)
  • How to Use a Rubric for Program Assessment (2010)
  • Techniques for Using Rubrics in Program Assessment by guest speaker Dannelle Stevens (2010)
  • Rubrics: Save Grading Time & Engage Students in Learning by guest speaker Dannelle Stevens (2009)
  • Rubric Library , Institutional Research, Assessment & Planning, California State University-Fresno
  • The Basics of Rubrics [PDF], Schreyer Institute, Penn State
  • Creating Rubrics , Teaching Methods and Management, TeacherVision
  • Allen, Mary – University of Hawai’i at Manoa Spring 2008 Assessment Workshops, May 13-14, 2008 [available at the Assessment and Curriculum Support Center]
  • Mertler, Craig A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation , 7(25).
  • NPEC Sourcebook on Assessment: Definitions and Assessment Methods for Communication, Leadership, Information Literacy, Quantitative Reasoning, and Quantitative Skills . [PDF] (June 2005)

Contributors: Monica Stitt-Bergh, Ph.D., TJ Buckley, Yao Z. Hill Ph.D.

Alliant International University Center for Teaching Excellence

Rubrics for Written Assignments

Introduction.

Most graduate courses require students to produce written work, although these products differ in purpose and required parameters (e.g., format, length, or tone). Thus, a faculty member might be called on to evaluate short reflection papers, longer lab reports, or still longer term papers. In evaluating a written product, it is important to choose or develop a rubric in order to bring consistency, fairness, and clarity to the task.

Source: Creating Rubrics

An analytic rubric is a scoring guide used to evaluate performance, a product, or a project. It has three parts: 1) performance criteria; 2) rating scale; and 3) indicators.

Source: How to Develop a Rubric

Using a rubric to evaluate student written work is helpful for both faculty and students. For faculty, rubrics

  • Reduce the time spent grading by allowing instructors to refer to a substantive description without writing long comments
  • Help to identify strengths and weaknesses across an entire class and adjust instruction appropriately
  • Help to ensure consistency across time and across graders
  • Reduce the uncertainty that can accompany grading
  • Discourage complaints about grades

Rubrics help students to

  • Understand instructors’ expectations and standards
  • Use instructor feedback to improve their performance
  • Monitor and assess their own progress
  • Recognize their strengths and weaknesses and direct their efforts accordingly

Source: Benefitting from Rubrics

Developing a Rubric

Developing a rubric entails the following steps:

  • List all the possible criteria students should demonstrate in the assignment.
  • Decide which of those criteria are crucial. Ideally, the rubric will have three to five performance criteria.
  • Criteria should be: unambiguous, clearly stated, measurable, precise, and distinct.
  • Prioritize the criteria by relating them to the learning objectives for the unit and determining which skills are essential at the competent or proficient level for the assignment.
  • Develop the rating scale (levels of mastery). Commonly used scale labels include:
      • Basic, Developing, Accomplished, Exemplary
      • Poor, Below Average, Average, Above Average, Excellent
      • Below Expectations, Basic, Proficient, Outstanding
      • Unsatisfactory, Basic, Competent, Distinguished
      • Developing, Acceptable, Target
      • Does Not Meet Expectations, Meets Expectations, Exceeds Expectations
      • 5, 4, 3, 2, 1
      • Low Mastery, Average Mastery, High Mastery
      • Missing, Unclear, Clear, Thorough
      • Never, Rarely, Sometimes, Often, Always
      • Novice, Apprentice, Proficient, Master
  • Develop indicators of quality. Define the performance expected of the ideal assessment for each criterion. Begin with the highest level of the scale to define top quality performance and create indicators for all performance levels.
  • Discuss the rubric with students so that they are clear on the expectations. Students can even help create the rubric.
  • Evaluate the rubric by asking:
      • Does the rubric relate to the outcome(s) being measured?
      • Does it cover important criteria for student performance?
      • Does the top end of the rubric reflect excellence?
      • Are the criteria and scales well-defined?
  • Share the rubric with colleagues, students, and experts
  • Test the rubric on samples of student work
  • If multiple raters are being used, discuss common definitions, standards, and expectations for quality and practice using the rubric and comparing ratings to determine consistency in judgments across raters.

Rubrics for Written Work

There are, of course, many types of student papers, which differ in the learning outcomes they represent and the skills they are meant to develop. Ideally, an instructor will develop a unique rubric for each assignment, based on the intent of the assignment and the relevant learning objectives as well as the overall learning objectives for the course. When creating a rubric to evaluate a written assignment, an instructor should be able to answer the following questions:

  • What will distinguish the best papers from the least effective?
  • What skills is this task meant to teach that should be evaluated with the rubric?
  • What is the paper supposed to accomplish, and what is the process that the writer should go through to accomplish those goals?
  • How will I know if they have learned what the task calls for them to learn?

Designing and Using Rubrics

A review of a sample of rubrics for evaluating papers indicates that they vary in both the number of dimensions and the content of the dimensions included; however, it is possible to extract several common dimensions for evaluation. These may include the following:

  • Thoroughness/completeness
  • Currency/recency

Organization/structure

  • Thesis statement/argument
  • Supporting evidence
  • Logic/coherence
  • Cohesiveness

Presentation of ideas

  • Integration/synthesis
  • Evaluation
  • Creativity/originality

Writing style

  • Conciseness
  • Punctuation
  • Word choice
  • Sentence structure
  • Use of APA style in text
  • Use of APA style in references

An instructor creating a rubric should consider these dimensions and determine which ones are pertinent to the purpose of the assignment being evaluated. It is also possible to adopt or adapt existing rubrics. One common source is the Association of American Colleges and Universities Value Rubrics: Written Communication.

AACU Value Rubrics: Written Communication

Other examples of specific rubrics include the following:

Examples of Rubrics for Research Papers

Research Paper Rubric Cornell College Cole Library

Rubric for Research Paper Kansas State Assessment Toolkit

Rubric for Research Paper University of Florida Center for Teaching Excellence

Writing Rubric for Psychology Middlebury College Academics

Rubrics for Essays

Grading Rubrics: Essays Brandeis University Writing Program

Analytic and Critical Thinking Mount Holyoke College Teaching & Learning Initiative

Argument Essay Grading Rubric Saint Paul College Academic Effectiveness and Innovation

Rubrics for Class Papers

College Level Writing Rubric Virginia Union University

Grading Rubric for Papers St. John’s University

Grading Rubric for Writing Assignment The American University of Rome

Rubrics for Reflection Papers

Reflection Writing Rubric Carnegie Mellon University Eberly Center for Teaching Excellence

Reflective Essay University of Florida Center for Teaching Excellence

Grading Rubric for Reflective Essay Mount Holyoke College Teaching & Learning Initiative

Creating Rubrics University of Texas/Austin Faculty Innovation Center

Evaluating Rubrics DePaul University Teaching Commons

Using Rubrics University of North Carolina/Chapel Hill Office of Institutional Research and Assessment

Building A Rubric Columbia University Center for Teaching and Learning

Designing & Using Rubrics University of Michigan Sweetland Center for Writing

Grading with Rubrics Western University Center for Teaching and Learning

Grading Rubrics Berkeley Graduate Division Graduate Student Instructor Teaching & Resource Center

Berkeley Graduate Division

  • Basics for GSIs
  • Advancing Your Skills

Examples of Rubric Creation

Creating a rubric takes time and requires thought and experimentation. Here you can see the steps used to create two kinds of rubric: one for problems in a physics exam for a small, upper-division physics course, and another for an essay assignment in a large, lower-division sociology course.

Physics Problems

In STEM disciplines (science, technology, engineering, and mathematics), assignments tend to be analytical and problem-based. Holistic rubrics can be an efficient, consistent, and fair way to grade a problem set. An analytic rubric often gives a clearer picture of where a student should direct future learning efforts. Since holistic rubrics assign a single label to overall understanding, they can lead to more regrade requests than an analytic rubric with more explicit criteria.

When starting to grade a problem, it is important to think about the relevant conceptual ingredients in the solution. Then look at a sample of student work to get a feel for common student mistakes. Decide what rubric you will use (e.g., holistic or analytic, and how many points). Apply the holistic rubric by marking comments and sorting the students' assignments into stacks (e.g., five stacks if using a five-point scale). Finally, check the stacks for consistency and mark the scores. The following is a sample homework problem from a UC Berkeley Physics Department undergraduate course in mechanics.
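The stack-sorting step described above can be sketched as grouping submissions by their holistic score (the student names and scores here are hypothetical):

```python
# Sort graded submissions into "stacks" by holistic score, one stack per
# scale level, then review each stack for consistency before recording
# scores. All data is hypothetical.
from collections import defaultdict

def sort_into_stacks(holistic_scores):
    """Map each score level to the list of students who received it."""
    stacks = defaultdict(list)
    for student, score in holistic_scores.items():
        stacks[score].append(student)
    return dict(stacks)

scores = {"Ana": 5, "Ben": 3, "Cho": 4, "Dee": 3, "Eli": 1}
stacks = sort_into_stacks(scores)
for level in sorted(stacks, reverse=True):
    print(level, stacks[level])
```

Checking each stack for internal consistency (do all the 3s really look alike?) mirrors the consistency pass the text recommends before marking final scores.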

Homework Problem

Learning objective.

Solve for position and speed along a projectile’s trajectory.

Desired Traits: Conceptual Elements Needed for the Solution

  • Decompose motion into vertical and horizontal axes.
  • Identify that the maximum height occurs when the vertical velocity is 0.
  • Apply kinematics equation with g as the acceleration to solve for the time and height.
  • Evaluate the numerical expression.
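The four steps above can be combined into a short worked derivation. (The launch speed $v_0$ and launch angle $\theta$ are assumed symbols; the original problem statement is not reproduced here.)

```latex
% Vertical component of the launch velocity:
v_{y,0} = v_0 \sin\theta
% At the maximum height the vertical velocity is zero:
v_y(t^*) = v_0 \sin\theta - g t^* = 0
  \quad\Rightarrow\quad t^* = \frac{v_0 \sin\theta}{g}
% Height reached at that time:
h_{\max} = v_0 \sin\theta \, t^* - \tfrac{1}{2} g (t^*)^2
         = \frac{(v_0 \sin\theta)^2}{2g}
```

Evaluating this expression numerically for the given values completes the final trait on the list.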

A note on analytic rubrics: If you decide you feel more comfortable grading with an analytic rubric, you can assign a point value to each concept. The drawback to this method is that it can sometimes unfairly penalize a student who has a good understanding of the problem but makes a lot of minor errors. Because the analytic method tends to have many more parts, the method can take quite a bit more time to apply. In the end, your analytic rubric should give results that agree with the common-sense assessment of how well the student understood the problem. This sense is well captured by the holistic method.

Holistic Rubric

A holistic rubric, closely based on a rubric by Bruce Birkett and Andrew Elby:

  • The student clearly understands how to solve the problem. Minor mistakes and careless errors can appear insofar as they do not indicate a conceptual misunderstanding.
  • The student understands the main concepts and problem-solving techniques, but has some minor yet non-trivial gaps in their reasoning.
  • The student has partially understood the problem. The student is not completely lost, but requires tutoring in some of the basic concepts. The student may have started out correctly, but gone on a tangent or not finished the problem.
  • The student has a poor understanding of the problem. The student may have gone in a not-entirely-wrong but unproductive direction, or attempted to solve the problem using pattern matching or by rote.
  • The student did not understand the problem. They may have written some appropriate formulas or diagrams, but nothing further. Or they may have done something entirely wrong.
  • The student wrote nothing or almost nothing.

[a] This policy especially makes sense on exam problems, for which students are under time pressure and are more likely to make harmless algebraic mistakes. It would also be reasonable to have stricter standards for homework problems.

Analytic Rubric

The following is an analytic rubric that takes the desired traits of the solution and assigns point values to each of the components. Note that the relative point values should reflect the importance in the overall problem. For example, the steps of the problem solving should be worth more than the final numerical value of the solution. This rubric also provides clarity for where students are lacking in their current understanding of the problem.
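As a sketch of such a point allocation (the point values below are hypothetical, chosen so that the problem-solving steps outweigh the final numerical answer, as the text suggests):

```python
# Hypothetical point allocation for the projectile problem's analytic rubric.
# The problem-solving steps carry more weight than the final number.
RUBRIC_POINTS = {
    "decompose motion into components": 2,
    "identify maximum-height condition": 2,
    "apply kinematics equations": 4,
    "correct numerical answer with units": 1,
}

def total_score(earned):
    """Sum earned points, capping each component at its maximum."""
    return sum(min(earned.get(part, 0), maximum)
               for part, maximum in RUBRIC_POINTS.items())

# A student with a small algebra slip in the kinematics step:
student = {
    "decompose motion into components": 2,
    "identify maximum-height condition": 2,
    "apply kinematics equations": 3,
    "correct numerical answer with units": 0,
}
print(total_score(student), "out of", sum(RUBRIC_POINTS.values()))
```

A sanity check is to confirm, as the text advises, that totals computed this way agree with a common-sense holistic assessment of each paper.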

Decompose motion into vertical and horizontal axes

  • Student decomposes the velocity (a vector quantity) into its vertical component.
  • Student realizes that the motion should be decomposed, but does not arrive at the correct expression for the vertical component.
  • No attempt at decomposing the 2D motion into its vertical component.

Identify the maximum height condition

  • Student successfully translates the physical question (the highest point of the ball) into an equation that can be used to help solve the motion (the vertical velocity is zero at the peak).
  • Student identifies the maximum height condition with minor mistakes.
  • Incorrect or missing identification of the maximum height condition.

Apply the kinematics equations

  • Applies the kinematic equations to yield a correct expression for the height in terms of the given variables. Solution uses the fact that the vertical motion has a constant downward acceleration due to gravity. The sequence of steps clearly demonstrates the thought process. Most likely, the solution includes solving for the time it takes to reach the top and then uses that time to see how far up the ball traveled.
  • Mostly correct application with minor errors (e.g., algebraic mistakes or incorporating extraneous equations).
  • Equations include relevant parameters from the problem, but the student does not isolate the relevant variables being solved for (such as time or distance).
  • Some kinematics formulas are written down, but they are not connected with the information in the problem.
  • No attempt.

Evaluate the numerical expression

  • Correct numerical answer with appropriate units.
  • Mostly correct answer but with a few minor errors. Still a physically sensible answer (e.g., units and numerical values are reasonable).
  • No attempt, or a physically unreasonable answer (e.g., a negative maximum height or reporting the height in units of seconds).

Try to avoid penalizing multiple times for the same mistake by choosing your evaluation criteria to be related to distinct learning outcomes. In designing your rubric, you can decide how finely to evaluate each component. Having more possible point values on your rubric can give more detailed feedback on a student’s performance, though it typically takes more time for the grader to assess.

Of course, problems can, and often do, feature the use of multiple learning outcomes in tandem. When a mistake could be assigned to multiple criteria, it is advisable to check that the overall problem grade is consistent with the student's mastery of the problem. Not having to decide how particular mistakes should be deducted from the analytic rubric is one advantage of the holistic rubric. When designing problems, it can be beneficial to avoid problems with several subparts that rely on prior answers; these tend to disproportionately skew the grades of students who miss an ingredient early on. When possible, consider writing independent problems for testing different learning outcomes.

Sociology Research Paper

An introductory-level, large-lecture course is a difficult setting for managing a student research assignment. With the assistance of an instructional support team that included a GSI teaching consultant and a UC Berkeley librarian [b] , sociology lecturer Mary Kelsey developed the following assignment:

This was a lengthy and complex assignment worth a substantial portion of the course grade. Since the class was very large, the instructor wanted to minimize the effort it would take her GSIs to grade the papers in a manner consistent with the assignment’s learning objectives. For these reasons Dr. Kelsey and the instructional team gave a lot of forethought to crafting a detailed grading rubric.

Desired Traits

  • Argument
  • Use and interpretation of data
  • Reflection on personal experiences
  • Application of course readings and materials
  • Organization, writing, and mechanics

For this assignment, the instructional team decided to grade each trait individually because there seemed to be too many independent variables to grade holistically. They could have used a five-point scale, a three-point scale, or a descriptive analytic scale. The choice depended on the complexity of the assignment and the kind of information they wanted to convey to students about their work.

Below are three of the analytic rubrics they considered for the Argument trait and a holistic rubric for all the traits together. Lastly you will find the entire analytic rubric, for all five desired traits, that was finally used for the assignment. Which would you choose, and why?

Five-Point Scale

5 Argument pertains to relationship between social factors and educational opportunity and is clearly stated and defensible.
4 Argument pertains to relationship between social factors and educational opportunity and is defensible, but it is not clearly stated.
3 Argument pertains to relationship between social factors and educational opportunity but is not defensible using the evidence available.
2 Argument is presented, but it does not pertain to relationship between social factors and educational opportunity.
1 Social factors and educational opportunity are discussed, but no argument is presented.

Three-Point Scale

Argument pertains to relationship between social factors and educational opportunity and is clearly stated and defensible.
Argument pertains to relationship between social factors and educational opportunity but may not be clear or sufficiently narrow in scope.
Social factors and educational opportunity are discussed, but no argument is presented.

Simplified Three-Point Scale, numbers replaced with descriptive terms

Argument pertains to relationship between social factors and educational opportunity and is clearly stated and defensible      

For some assignments, you may choose to use a holistic rubric, or one scale for the whole assignment. This type of rubric is particularly useful when the variables you want to assess just cannot be usefully separated. We chose not to use a holistic rubric for this assignment because we wanted to be able to grade each trait separately, but we’ve completed a holistic version here for comparative purposes.

A: The paper is driven by a clearly stated, defensible argument about the relationship between social factors and educational opportunity. Sufficient data is used to defend the argument, and the data is accurately interpreted to identify each school's position within a larger social structure. Personal educational experiences are examined thoughtfully and critically to identify significance of external social factors and support the main argument. Paper reflects solid understanding of the major themes of the course, using course readings to accurately define sociological concepts and to place the argument within a broader discussion of the relationship between social status and individual opportunity. Paper is clearly organized (with an introduction, transition sentences to connect major ideas, and conclusion) and has few or no grammar or spelling errors. Scholarly ideas are cited correctly using the ASA style guide.
B: The paper is driven by a defensible argument about the relationship between social factors and public school quality, but it may not be stated as clearly and consistently throughout the essay as in an "A" paper. The argument is defended using sufficient data, reflection on personal experiences, and course readings, but the use of this evidence does not always demonstrate a clear understanding of how to locate the school or community within a larger class structure, how social factors influence personal experience, or the broader significance of course concepts. Essay is clearly organized, but might benefit from more careful attention to transitional sentences. Scholarly ideas are cited accurately, using the ASA style sheet, and the writing is polished, with few grammar or spelling errors.
C: The paper contains an argument about the relationship between social factors and public school quality, but the argument may not be defensible using the evidence available. Data, course readings, and personal experiences are used to defend the argument, but in a perfunctory way, without demonstrating an understanding of how social factors are identified or how they shape personal experience. Scholarly ideas are cited accurately, using the ASA style sheet. Essay may have either significant organizational or proofreading errors, but not both.
D: The paper does not have an argument, or is missing a major component of the evidence requested (data, course readings, or personal experiences). Alternatively, or in addition, the paper suffers from significant organizational and proofreading errors. Scholarly ideas are cited, but without following ASA guidelines.
F: The paper does not provide an argument and contains only one component of the evidence requested, if any. The paper suffers from significant organizational and proofreading errors. If scholarly ideas are not cited, paper receives an automatic "F."

Final Analytic Rubric

This is the rubric the instructor finally decided to use. It rates five major traits, each on a five-point scale. This allowed for fine but clear distinctions in evaluating the students’ final papers.

Trait 1: Argument
  • Score 5: Argument pertains to the relationship between social factors and educational opportunity and is clearly stated and defensible.
  • Score 4: Argument pertains to the relationship between social factors and educational opportunity and is defensible, but it is not clearly stated.
  • Score 3: Argument pertains to the relationship between social factors and educational opportunity but is not defensible using the evidence available.
  • Score 2: An argument is presented, but it does not pertain to the relationship between social factors and educational opportunity.
  • Score 1: Social factors and educational opportunity are discussed, but no argument is presented.

Trait 2: Use of Data
  • Score 5: The data is accurately interpreted to identify each school’s position within a larger social structure, and sufficient data is used to defend the main argument.
  • Score 4: The data is accurately interpreted to identify each school’s position within a larger social structure, and data is used to defend the main argument, but it might not be sufficient.
  • Score 3: Data is used to defend the main argument, but it is not accurately interpreted to identify each school’s position within a larger social structure, and it might not be sufficient.
  • Score 2: Data is used to defend the main argument, but it is insufficient, and no effort is made to identify the school’s position within a larger social structure.
  • Score 1: Data is provided, but it is not used to defend the main argument.

Trait 3: Personal Educational Experiences
  • Score 5: Personal educational experiences are examined thoughtfully and critically to identify the significance of external social factors and support the main argument.
  • Score 4: Personal educational experiences are examined thoughtfully and critically to identify the significance of external social factors, but their relation to the main argument may not be clear.
  • Score 3: Personal educational experiences are examined, but not in a way that reflects understanding of the external factors shaping individual opportunity. Their relation to the main argument also may not be clear.
  • Score 2: Personal educational experiences are discussed, but not in a way that reflects understanding of the external factors shaping individual opportunity. No effort is made to relate experiences back to the main argument.
  • Score 1: Personal educational experiences are mentioned, but in a perfunctory way.

Trait 4: Course Readings and Themes
  • Score 5: Demonstrates solid understanding of the major themes of the course, using course readings to accurately define sociological concepts and to place the argument within a broader discussion of the relationship between social status and individual opportunity.
  • Score 4: Uses course readings to define sociological concepts and place the argument within a broader framework, but does not always demonstrate solid understanding of the major themes.
  • Score 3: Uses course readings to place the argument within a broader framework, but sociological concepts are poorly defined or not defined at all.
  • Score 2: Course readings are used, but the paper does not place the argument within a broader framework or define sociological concepts.
  • Score 1: Course readings are only mentioned, with no clear understanding of the relationship between the paper and course themes.

Trait 5: Organization, Mechanics, and Citation
  • Score 5: Clear organization and natural “flow” (with an introduction, transition sentences to connect major ideas, and a conclusion), with few or no grammar or spelling errors. Scholarly ideas are cited correctly using the ASA style guide.
  • Score 4: Clear organization (introduction, transition sentences to connect major ideas, and conclusion), but the writing might not always be fluid and might contain some grammar or spelling errors. Scholarly ideas are cited correctly using the ASA style guide.
  • Score 3: Organization is unclear, or the paper is marred by significant grammar or spelling errors (but not both). Scholarly ideas are cited correctly using the ASA style guide.
  • Score 2: Organization is unclear and the paper is marred by significant grammar and spelling errors. Scholarly ideas are cited correctly using the ASA style guide.
  • Score 1: An effort to cite is made, but the scholarly ideas are not cited correctly. (Automatic “F” if ideas are not cited at all.)

[b] These materials were developed during UC Berkeley’s 2005–2006 Mellon Library/Faculty Fellowship for Undergraduate Research program. Members of the instructional team who worked with Lecturer Kelsey in developing the grading rubric included Susan Haskell-Khan, a GSI Center teaching consultant and doctoral candidate in history, and Sarah McDaniel, a teaching librarian with the Doe/Moffitt Libraries.

NC State

Grading Rubric for Written Reports

60 points:
  • Topic mastery, including technical correctness
  • Appropriate level of detail and thoroughness of documentation

15 points:
  • Clearly identified purpose and approach
  • Content is clearly organized and supports the objective
  • Transitions between topics

15 points:
  • Easy to read
  • Grammatically and stylistically correct
  • Uniform writing style

10 points:
  • Consistent presentation of graphics
  • Uniform document design and layout
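As a quick illustration of how point weights like those above combine into a final score, here is a minimal Python sketch. Only the point weights (60/15/15/10, totaling 100) come from the rubric; the category labels and the example fractions earned are illustrative assumptions, not part of the NC State rubric.

```python
# Minimal sketch: combining weighted rubric categories into a final score.
# Category labels are illustrative stand-ins for the four weighted groups.
WEIGHTS = {
    "content": 60,        # topic mastery, detail, documentation
    "organization": 15,   # purpose, structure, transitions
    "style": 15,          # readability, grammar, uniform style
    "presentation": 10,   # graphics, document design and layout
}

def final_score(fractions):
    """Combine per-category fractions earned (0.0-1.0) into a 100-point score."""
    return sum(WEIGHTS[cat] * fractions.get(cat, 0.0) for cat in WEIGHTS)

score = final_score({"content": 0.9, "organization": 0.8,
                     "style": 1.0, "presentation": 0.5})
# 0.9*60 + 0.8*15 + 1.0*15 + 0.5*10 = 86 points out of 100
```

Because the weights sum to 100, each fraction can be read directly as the share of that category’s points a student earned.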

Assessment Rubric Design

The most updated lab writing instructional modules are available: engineeringlabwriting.org

Faculty Writing Instruction Guide 2: Designing Lab Report Assessment Rubrics

Well-designed assessment instruments help instructors communicate their expectations to students and assess student lab reports fairly and efficiently. A rubric can be an excellent assessment instrument for engineering lab reports. It identifies the instructor’s expectations for an assigned lab report and then explicitly states the possible levels of achievement along a continuum (poor to excellent, or novice to expert).

The rubric can be constructed for individual labs or an entire course.

Step 1: Define the purpose of the lab report assignment/assessment for which you are creating a rubric

The first step is to clarify the purpose of the assignment and identify the student learning outcome(s) the lab report is meant to measure.

Step 2: Choose a Rubric Type: Analytical vs. Holistic

Instructors need to select one of two rubric types: analytical or holistic. See Figures 1 and 2 for the differences between the two types. An analytical rubric was chosen for the further rubric development on lab data presentation.

Step 3: Define the Criteria

For this example, the criteria follow the sections of the lab report: introduction, methods, results, discussion, and conclusion (IMRDC).

Step 4: Design the Rating Scale

  • Instructors need to define the levels of quality for student performance. Most rating scales use 3 levels (below; meets; above), 4 (fail; fair; pass; exceed), 5 (never; sometimes; usually; mostly; always), or 6 (limited-low; limited-high; acceptable-low; acceptable-high; proficient-low; proficient-high).
  • More rating points can provide more detail; however, they can also make grading more difficult and time-consuming.
  • This example rubric will use a three-point scale:
  • Score 1: Below expectations.
  • Score 2: Meets expectations.
  • Score 3: Exceeds expectations.

These three levels can be expanded to six by splitting each level into a high and a low sublevel.

Step 5: Write Performance Descriptors for Each Rating (Step 3 + Step 4)

Step 6: Build and Revise the Rubric
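The high/low split of the three-point scale described above can be sketched in a few lines of Python; the label strings are illustrative.

```python
# Splitting each of the 3 levels (below / meets / exceeds expectations)
# into low and high sublevels yields a 6-point scale.
THREE_LEVELS = ["below expectations", "meets expectations", "exceeds expectations"]

SIX_LEVELS = [f"{level} ({half})" for level in THREE_LEVELS for half in ("low", "high")]
# SIX_LEVELS runs from "below expectations (low)" up to "exceeds expectations (high)"
```

The ordering keeps the scale monotonic: each original level expands into two adjacent sublevels without reshuffling the ranking.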

Rubric Examples:

Intro: Analyze the technical audience’s expectations and the context for the lab report.
  • Exceeds expectations: Provides purpose, context, and technical background proficiently.
  • Meets expectations: The writer’s understanding of the context and audience supports a generally successful report. Attention to purpose, context, and technical background is generally appropriate, with some lapses.
  • Below expectations: Little to no awareness of the audience’s needs and the context. The purpose, context, and technical background provided are too basic or inadequate.

Methods:
  • Exceeds expectations: Lab processes are presented accurately and concisely so that the reader can repeat the lab from the description. Graphics, such as photographs, are used effectively.
  • Meets expectations: The presentation of the lab processes is accurate; however, it is overly wordy or unnecessarily detailed. Graphics, such as photographs, are used but lack clarity.
  • Below expectations: The reader cannot repeat the lab from the presentation. The lab processes are overly brief, simplistic, or not well organized.

Results:
  • Exceeds expectations: The writer uses effective strategies for graphic/table forms when communicating lab data/results. Graphics and tables are stand-alone and professional. They contain all required features, follow standard conventions, and include useful captions. Figures, tables, and illustrations are correctly and usefully labeled.
  • Meets expectations: Strategies for using graphic/table forms when communicating lab data/results are generally appropriate, with lapses. Graphics and tables are generally appropriate; however, they contain minor errors. Figures, tables, and illustrative materials are labeled.
  • Below expectations: The writer fails to use effective graphic/table forms when communicating lab data/results. Graphics and tables contain few or none of the required features, and multiple errors are found in them. Figures, tables, and illustrative materials are not labeled.

Results/Discussion (data analysis):
  • Exceeds expectations: The writer analyzes lab data professionally using appropriate methods (statistical, comparative, uncertainty, etc.) and draws significant technical knowledge from an in-depth analysis consistent with the complexity of the experimentation.
  • Meets expectations: Lab data analysis is generally appropriate; however, the analysis methods have some lapses, or the analysis results are not well aligned with the complexity of the experimentation.
  • Below expectations: The writer fails to analyze lab data. The analysis is limited, and the methods have significant errors; the writer may simply “let the data do the talking.”

Results/Discussion (data interpretation):
  • Exceeds expectations: The writer interprets lab data appropriately using factual and quantitative evidence, drawing on existing knowledge (engineering principles or outside reference data/information as secondary sources) to connect to the in-depth analysis of the lab data (the primary source).
  • Meets expectations: The writer interprets lab data using secondary sources; however, the explanation of the meaning of the lab data has some lapses, and the connections to existing knowledge are limited.
  • Below expectations: The writer fails to interpret the lab data. The explanation of the meaning of the lab data is wrong or not based on factual and/or quantitative evidence.

Conclusion:
  • Exceeds expectations: The writer draws meaningful conclusions and reflects on the experiment as a whole in ways that provide closure and bring the analysis to a satisfying ending.
  • Meets expectations: The writer provides closure by summarizing the analysis but may draw limited or inconsistent conclusions from it.
  • Below expectations: The writer fails to close the report. The conclusion is inconsistent with the report’s purpose and the contents of the other sections (intro and body).

IMRDC (reasoning and flow):
  • Exceeds expectations: The writer communicates ideas effectively through reasoning and productive patterns, using appropriate strategies (claim-evidence-reasoning, cause-effect, compare-contrast, advantages-disadvantages, problem-solution, etc.) to make logical arguments to the audience with a proper flow.
  • Meets expectations: The writer communicates ideas through reasoning and productive patterns with some lapses. The paper generally has a well-constructed flow but sometimes wanders from one idea to another.
  • Below expectations: The writer fails to use reasoning and productive patterns to make arguments. No strategies are used when making arguments or describing factual evidence; connections of ideas within or across paragraphs are disjointed.

IMRDC (structure):
  • Exceeds expectations: The writer provides a purposeful structure that clearly articulates the experiment’s purpose across the whole document. The report has a well-structured introduction, body, and conclusion, and each of these three parts functions well within the report.
  • Meets expectations: The writer provides a structure (intro, body, and conclusion) generally appropriate for a lab report. Generally, each part relates to the primary purpose of the report.
  • Below expectations: The report’s structure (intro, body, conclusion) may be inappropriate, incomplete, or missing. The writer has made significant errors in the functions of these three parts.

IMRDC (mechanics and citation):
  • Exceeds expectations: The writer provides an error-free document. Style, tone, tense, and voice are appropriate for a lab report. Errors in mechanics and grammar are minimal and highly infrequent. The report employs syntax and diction appropriate to the lab report genre. Citations of source material are clear and consistent, and the citation style is appropriate.
  • Meets expectations: Style, tone, tense, and voice are generally appropriate, with some lapses. Errors in mechanics and grammar are generally minor but may be frequent enough to distract a reader. The writer’s diction and syntax are sometimes effective. Source citations are uniformly included but may be incomplete. Figures, tables, and other illustrative materials are generally well formatted and labeled.
  • Below expectations: Choices of style, diction, tone, tense, and voice are inconsistent with or inappropriate for a lab report; the writer’s stylistic choices may seem random. Errors are frequent and seriously detract from meaning or prevent the reader from understanding the writer’s meaning. The writer omits some citations for sources and may inconsistently label tables, figures, and other visual material.
  • Designing Grading Rubrics, Brown University, https://www.brown.edu/sheridan/teaching-learning-resources/teaching-resources/course-design/classroom-assessment/grading-criteria/designing-rubrics
  • Assessment: What is a Rubric?, DePaul University’s Office for Teaching, Learning and Assessment, https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/rubrics/Pages/default.aspx
  • Creating a Rubric: an Online Tutorial for Faculty, the Center for Faculty Development at the University of Colorado Denver, https://www.ucdenver.edu/centers/cfda
  • Dannelle D. Stevens & Antonia J. Levi, An Introduction to Rubrics (Sterling, VA: Stylus, 2005).

Sample Assignment Rubrics
An assignment rubric can serve multiple purposes:

  • communicate to your students the exact aspects of the assignment they will be graded on
  • model scholarly practice to students
  • help you grade submitted assignments in an efficient and transparent way.

Here are some samples:

  • Annotated Bibliography Evaluation Rubric
  • Research Paper Rubric (Cornell College)
  • Rubric for Research Paper (Kansas St)
  • Sample Grading and Performance Rubrics (Carnegie Mellon)

If you are interested in creating your own research assignment or paper rubric, the library can help you with the library and information literacy aspects. Email [email protected] for help with this.

Source: SUNY Empire State University Library Toolbox, https://subjectguides.sunyempire.edu/facultystafftoolbox (last updated Jul 26, 2024).

How can I use AI as an instructor?

AI can be a powerful ally in content creation. Various tools and platforms, often user-friendly and accessible, empower instructors to generate diverse and creative content. From learning outcomes to writing assignments to lesson plans, generative AI can provide inspiration and assist in overcoming creative blocks.

The sections below provide instructions and examples for generating each type of item.

Student learning outcomes

  • Define learning objectives and key concepts: Begin by clearly outlining the learning objectives and key concepts you want to address. For instance, if the goal is to understand the principles of physics, specify concepts such as motion, energy, and forces.
  • Generate general learning outcomes: Utilize a generative AI tool to create overarching learning outcomes. Example: “Generate student learning outcomes for a physics course covering motion, energy, and forces.” AI-generated outcomes may include statements like “Students will explain the relationship between force and motion” or “Students will apply the principles of energy conservation to solve real-world problems.”
  • Refine and tailor outcomes: Review the AI-generated outcomes and refine them to align with specific course objectives. Tailor the language to fit the educational context and the level of understanding expected from students.
  • Create specific learning objectives: Use generative AI to create more granular learning objectives for each key concept. Example: “Generate specific learning objectives for understanding motion in a physics course.” AI-generated objectives may include statements like “Students will calculate speed and acceleration using appropriate formulas” or “Students will analyze graphs to interpret an object’s motion.”
  • Ensure measurability and clarity: Review each learning outcome to ensure it is measurable and clear. Adjust language or add specificity as needed to enhance clarity. For instance, if an outcome states “Students will understand,” consider revising it to “Students will demonstrate understanding through practical applications.”

Lecture plans or lecture notes

  • Define lecture topics and objectives: Identify the main topics and objectives you want to cover in your lecture. For instance, if the lecture is on “Introduction to Artificial Intelligence,” outline key subtopics such as AI history, applications, and ethical considerations.
  • Generate lecture overview: Prompt the generative AI with a request for a general lecture overview. Example: “Generate an overview for a lecture on ‘Introduction to Artificial Intelligence’ covering history, applications, and ethical considerations.” The AI might produce a concise summary, outlining key points and subtopics.
  • Outline main points and subtopics: Use the AI to help outline the main points and subtopics for each section of your lecture. Example: “Outline key points for the section on AI history in the ‘Introduction to Artificial Intelligence’ lecture.” The AI could generate an organized list of historical milestones and developments in AI.
  • Expand on subtopics with details: To add depth to your lecture, use the AI to generate detailed information for each subtopic. Example: “Provide detailed information on the ethical considerations related to AI in the ‘Introduction to Artificial Intelligence’ lecture.” The AI might generate content on topics such as bias in AI algorithms, privacy concerns, and responsible AI practices.
  • Incorporate relevant examples and case studies: Enhance your lecture with real-world examples and case studies. Example: “Generate examples illustrating the applications of AI in the ‘Introduction to Artificial Intelligence’ lecture.” The AI might offer instances of AI in healthcare, finance, or autonomous vehicles, providing concrete illustrations for your lecture.
  • Craft engaging lecture introductions and transitions: Use the AI to help craft engaging introductions for each section and smooth transitions between topics. Example: “Create an engaging introduction for the section on AI applications in the ‘Introduction to Artificial Intelligence’ lecture.” The AI might generate an attention-grabbing anecdote or a relevant statistic to captivate your audience.
  • Ensure coherence and flow: Review the content generated by the AI to ensure coherence and flow between different sections of your lecture. Adjust language or reorder information as needed to create a seamless and logical progression of ideas.

Assignment or discussion prompts

Generative AI, with its ability to understand context and generate diverse responses, can assist instructors in formulating open-ended questions that encourage deeper understanding and analysis.

  • Define assignment or discussion topics: Start by clearly defining the topics or themes for the assignment or discussion. For instance, if the focus is on literature, identify specific works, themes, or concepts you want students to explore.
  • Generate overall assignment or discussion prompt: Prompt the generative AI with a request for a general assignment or discussion prompt. Example: “Generate an assignment prompt for analyzing symbolism in ‘The Great Gatsby’ by F. Scott Fitzgerald.” The AI may provide a broad question like, “Explore and analyze the use of symbolism in ‘The Great Gatsby’ and its impact on character development and themes.”
  • Break down into specific tasks or questions: Use the AI to help break down the overall prompt into specific tasks or questions. Example: “Generate specific questions for analyzing symbolism in ‘The Great Gatsby’.” The AI might produce questions such as, “How does the green light symbolize the American Dream?” or “Examine the symbolic significance of the Valley of Ashes in the novel.”
  • Tailor for different learning levels: Adjust the complexity of the prompts based on the learning level of your students. Example: “Create discussion prompts on genetic inheritance suitable for high school biology students.” The AI may generate questions like, “Explain the principles of Mendelian inheritance” or “Predict the outcomes of genetic crosses involving specific traits.”
  • Include real-world applications: Enhance the relevance of the assignment or discussion by incorporating real-world applications. Example: “Generate prompts for a business ethics assignment discussing the ethical implications of a real-world corporate case.” The AI might provide questions like, “Analyze the ethical decisions made by the company in response to environmental concerns” or “Propose alternative strategies that prioritize ethical considerations.”
  • Encourage critical thinking and reflection: Prompt the AI to generate prompts that encourage critical thinking and reflection. Example: “Create discussion prompts for a philosophy class exploring existentialism.” The AI might generate questions like, “How does existentialism influence personal identity?” or “Reflect on the existential themes in a contemporary work of literature or film.”
  • Check for clarity and coherence: Review the prompts generated by the AI to ensure clarity and coherence. Adjust language or rephrase questions as needed to ensure that students can easily understand and respond to the prompts.

Grading rubrics

Rubric design is a critical aspect of assessing student work, and generative AI can streamline the process by helping instructors create fair and comprehensive rubrics tailored to specific assignments and projects, with more nuanced and objective evaluation criteria.

  • Identify assessment criteria: Define the assessment criteria for the assignment or project. For example, if grading a research paper, criteria might include thesis clarity, evidence quality, organization, and writing mechanics.
  • Generate overall rubric structure: Prompt the generative AI with a request for an overall rubric structure. Example: “Generate a rubric structure for grading a persuasive essay.” The AI may provide a general format, including categories like “Thesis Statement,” “Supporting Evidence,” and “Organization.”
  • Break down categories into subcriteria: Use the AI to break down each category into specific subcriteria. Example: “Generate subcriteria for assessing ‘Thesis Statement’ in a persuasive essay rubric.” The AI might generate subcriteria such as “Clarity of thesis,” “Relevance to the prompt,” and “Originality.”
  • Determine scoring levels: Define the scoring levels for each subcriterion. Example: “Generate scoring levels for ‘Clarity of thesis’ in a persuasive essay rubric.” The AI might provide levels like “Excellent (4): Clearly stated and focused,” “Good (3): Mostly clear,” “Satisfactory (2): Somewhat clear,” and “Needs Improvement (1): Unclear or vague.”
  • Tailor for assignment specifics: Adjust the rubric based on the specific requirements of the assignment. Example: “Customize a rubric for assessing creativity in a student presentation.” The AI could generate categories like “Creativity in Content,” “Presentation Style,” and “Engagement with the Audience.”
  • Include weighting for importance: Prompt the AI to generate weighting for each category or subcriterion based on their relative importance. Example: “Generate weightings for categories in a coding project rubric.” The AI might suggest higher weight for categories like “Functionality” and “Efficiency” compared to “Aesthetics.”
  • Check for clarity and objectivity: Review the rubric generated by the AI to ensure clarity and objectivity. Adjust language or modify criteria as needed to ensure that the rubric provides clear guidance for both instructors and students.
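The prompting steps above can also be scripted. The sketch below only assembles a rubric-generation prompt from assessment criteria (step 1) and scoring levels (step 4); it calls no AI service, and the function name, example inputs, and exact wording are illustrative assumptions.

```python
# Sketch: assembling a rubric-generation prompt from criteria and scoring
# levels. Paste the resulting string into the generative AI tool of your choice.
def rubric_prompt(assignment, criteria, levels):
    return (
        f"Generate a grading rubric for {assignment}. "
        f"Assess these criteria: {', '.join(criteria)}. "
        f"For each criterion, write a performance descriptor for each of "
        f"these scoring levels: {', '.join(levels)}."
    )

prompt = rubric_prompt(
    "a persuasive essay",
    ["thesis clarity", "supporting evidence", "organization", "writing mechanics"],
    ["Needs Improvement (1)", "Satisfactory (2)", "Good (3)", "Excellent (4)"],
)
```

Keeping the prompt in a function makes it easy to regenerate variants, for example after tailoring the criteria or weightings for a specific assignment.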

This content was developed with the assistance of OpenAI’s ChatGPT.

Mostly Sunny

Red Sox reliever’s FB at 93-94 mph, rehab assignment possible this weekend

  • Updated: Aug. 14, 2024, 10:50 p.m.
  • | Published: Aug. 14, 2024, 5:54 p.m.

New York Yankees v Boston Red Sox

Red Sox' Liam Hendriks could be sent out on a rehab assignment this weekend. (Photo by Maddie Malhotra/Boston Red Sox/Getty Images) Getty Images

BOSTON — Liam Hendriks ’ fastball was up to 93-94 mph during the live batting practice he threw Wednesday here at Fenway Park, manager Alex Cora said.

It marked his third and final scheduled live batting practice before potentially heading out on a rehab assignment. He is making his way back from Tommy John surgery that he underwent last Aug 2.

“The slider was good actually against lefties,” Cora said. “Today he got hit around a little bit but it’s encouraging, man. It’s a good fastball. It’s a different fastball than what we have in the bullpen. I’m not saying he’s going to be a guy that will come here and be the savior of the bullpen but he will contribute. If this happens, he will contribute. We’ll take care of him like we always do with our guys coming from injuries. We still have to be patient but we are almost there.”

Cora said the Red Sox need to see how Hendriks feels Thursday before deciding whether he’ll pitch in a rehab game this weekend.

“Won’t be surprised if he goes on a rehab assignment over the weekend,” Cora said.

The 35-year-old righty won two AL Rivera Reliever awards and earned three All-Star selections while recording 114 saves from 2019-22.

But he has pitched only five innings since the start of 2023. He first battled and overcame Stage 4 non-Hodgkin’s lymphoma in ‘23, then returned to appear in five games before tearing his ulnar collateral ligament.

The Red Sox signed him Feb. 20 for two years and $10 million in guaranteed money with an additional $10 million available in performance bonuses.

“He’s been pushing,” Cora said. “He’s working. He has the will to do it. ... This is nothing for him. He’s been through a lot. You see him on the mound and it’s kind of like significant and ironic that his last live BP to get ready probably to go on the rehab assignment is when we’re having the telethon (for) the Jimmy Fund. Use that as inspiration and I think he’s going to contribute. He’s going to help us.”

Triston Casas not ready yet

Triston Casas will continue his rehab assignment in Triple-A Worcester . Asked if there’s any chance Casas will be activated Thursday in Baltimore, Cora said, “As of now, no.”

The first baseman has a .294/.400/.559/.959 line with one homer and six doubles in nine games (40 plate appearances) on his rehab assignment for the WooSox.

“If the player needs more, he needs more,” Cora said. “I’m not there to feel what he feels. Obviously there’s a few checkpoints that he has in mind. He’s been in touch with the medical staff and (chief baseball officer) Craig (Breslow). He feels like he needs more at-bats. He’ll get more at-bats. At one point, he’s going to be here. It’s just a matter of us being patient.”

Casas is on the 60-day IL with some fractured cartilage which connects his ribcage to his sternum . He hasn’t appeared in a game for the Red Sox since April 20.

Cam Booser placed on 15-day IL

The Red Sox made four roster moves Wednesday, including placing left-handed reliever Cam Booser on the 15-day IL with elbow inflammation .

Booser struggled in Tuesday’s game, allowing a hit and two walks without recording an out.

“After his outing yesterday he came in and he was talking to the trainers and he said that he felt it right away in the first at-bat,” Cora said. “He tried to grind through it. Obviously not good for him, not good for us. But I respect the fact that he tried. In those cases, sometimes when you try to push, it makes it worse for the individual. So just get some treatment. Hopefully, it’s something where he’ll be back sooner rather than later.”

Criswell feeling well

Righty Cooper Criswell is feeling well and could rejoin the Red Sox starting rotation when he’s eligible to return from the COVID-related IL on Friday in Baltimore.

“He felt good in the bullpen,” Cora said. “(Pitching coach Andrew Bailey) feels like he threw the ball well. Energy-wise, he feels good. Strength-wise, too. Hopefully, he joins us in Baltimore.”


Angels Designate Veteran For Assignment, Call Up Ex-Red Sox Infielder: Report

Maren Angus-Coombs | 3 hours ago


The Los Angeles Angels are designating first baseman/outfielder Willie Calhoun for assignment, according to Alden González of ESPN on X.

In a corresponding move, former Boston Red Sox infielder/outfielder Jack López is being called up, per Jorge Castillo of ESPN on X.

Calhoun signed a minor league deal with the Angels prior to spring training and was selected to the big league roster on May 1. He accumulated 254 plate appearances over 68 games and is slashing .245/.315/.380 with five home runs and 20 runs batted in.

The Angels are calling up infielder Jack López, per source. https://t.co/lHfMCsGrDb — Jorge Castillo (@jorgecastillo) August 15, 2024

Calhoun was once considered one of the game's top prospects after a 21-home run season in 2019, but he couldn't find a way to remain in the big leagues for a significant amount of time and ran out of options in 2022.

Once his options were gone, he bounced around the league, playing for the Texas Rangers, the San Francisco Giants, the New York Yankees and the Angels.

Calhoun could join his fifth organization since 2022 via the waiver process in the coming days, and finish out the season on a major league roster — perhaps even latch on with a contending team.

Calhoun came up through the Los Angeles Dodgers farm system and became the key piece in the 2017 trade that sent pitcher Yu Darvish from the Texas Rangers to the Dodgers.

Back then, he was trying to unlock his power, which resulted in an inconsistent approach at the plate.

“I was always a gap-to-gap guy,” Calhoun told Jeff Fletcher of the Orange County Register in mid-May. “I lost that approach when I got (to the majors with the Rangers) and started trying to chase power instead of letting it naturally flow. … It really obviously never worked out for me.”

López re-signed with the Angels on a minor league deal after spending the entire 2023 season with Triple-A Salt Lake. He has played 104 Triple-A games this year, hitting 12 home runs and slashing .274/.333/.421.

The 31-year-old doesn't have much major league experience, appearing in only seven big league games with the 2021 Red Sox and hitting .154/.214/.308. He does, however, have a ton of minor league experience.

He first reached Triple-A in 2017 and has played 687 minor league games since then. He has a career slash line of .256/.306/.386.

Calhoun served as the Angels' designated hitter during his short-lived stint in Anaheim; López, however, has the versatility to play all over the field. That depth gives the Halos options at DH going forward.

Maren Angus-Coombs


Sean Reid-Foley’s Mets rehab assignment stalls as struggles continue

The Mets don’t know what is going on with Sean Reid-Foley, who officially has been returned from his rehab assignment.

The righty reliever, who is making his way back from a right shoulder impingement that has sidelined him since June 19, has struggled in six minor league games, walking eight in 4 ²/₃ innings.

Reid-Foley most recently pitched Tuesday with Triple-A Syracuse and allowed two hits, a walk and two runs in 1 ¹/₃ innings. Mets manager Carlos Mendoza said Reid-Foley likely won’t throw in a game again “in the next few days.”

Sean Reid-Foley, throwing a pitch in a game earlier this season, has had his rehab assignment stalled due to struggles on the mound and with velocity.

“One of those [situations] where he’s not sure if it’s mechanics or what, but the ball’s not coming out the way he would like it to,” Mendoza said before the Mets’ 9-1 skid-busting win over the A’s . “So we got to figure out what we’re dealing here with.”

Reid-Foley feels fine physically, Mendoza added, though his velocity has not fully returned. Reid-Foley, whose four-seamer averaged 94.9 mph in 23 games this season, was throwing 92-93 mph Tuesday, Mendoza said.

The Mets formally shut down his rehab assignment, which buys them more time. Pitchers are allowed 30 days to rehab before they must be activated, optioned or otherwise moved.

Reid-Foley, a 28-year-old who has been with the Mets since 2021, was having his best season before his shoulder bothered him.

In 23 games, he owns a 1.66 ERA and 25 strikeouts in 21 ²/₃ innings.

Sean Reid-Foley is looking to come back from a right shoulder impingement.

J.D. Martinez, who left Tuesday’s loss after a 99.6-mph fastball drilled his left elbow, was back in the lineup a day later and hit the ball hard repeatedly.

He went 1-for-4 with a double and a walk.

Mendoza said he originally did not anticipate Martinez being ready yet, but the DH told him that he was feeling well.

Martinez wore a new elbow pad during his at-bats.

Dedniel Nunez (right pronator strain) looked “really good” in his Tuesday bullpen session, Mendoza said, and was up to 94 mph.

The righty will throw another bullpen session or live batting practice Friday.

Top prospect Brandon Sproat was ejected from his second start with Triple-A Syracuse after arguing a play at the plate.

Sproat allowed one run on two hits and two walks in three innings in which he struck out two.


  1. Rubric Best Practices, Examples, and Templates

    Rubric Best Practices, Examples, and Templates. A rubric is a scoring tool that identifies the different criteria relevant to an assignment, assessment, or learning outcome and states the possible levels of achievement in a specific, clear, and objective way. Use rubrics to assess project-based student work including essays, group projects ...

  2. Assessment Rubrics

Assessment Rubrics. A rubric is commonly defined as a tool that articulates the expectations for an assignment by listing criteria and, for each criterion, describing levels of quality (Andrade, 2000; Arter & Chappuis, 2007; Stiggins, 2001). Criteria are used in determining the level at which student work meets expectations.

  3. 15 Helpful Scoring Rubric Examples for All Grades and Subjects

    15 Helpful Scoring Rubric Examples for All Grades and Subjects. In the end, they actually make grading easier. By Jill Staake, B.S., Secondary ELA Education. Jun 16, 2023. When it comes to student assessment and evaluation, there are a lot of methods to consider. In some cases, testing is the best way to assess a student's knowledge, and the ...

  4. PDF Grading Rubric for Written Assignments

    GRADING RUBRIC FOR WRITTEN ASSIGNMENTS. Exceeds Expectations. Central idea is well developed; clarity of purpose clearly exhibited throughout paper. Abundance of evidence of critical, careful thought to support main ideas, evidence and examples are vivid and specific, while focus on topic remains tight, ideas work together as a unified whole.

  5. PDF Writing Assessment and Evaluation Rubrics

    Holistic scoring is a quick method of evaluating a composition based on the reader's general impression of the overall quality of the writing—you can generally read a student's composition and assign a score to it in two or three minutes. Holistic scoring is usually based on a scale of 0-4, 0-5, or 0-6.

  6. Writing Rubrics [Examples, Best Practices, & Free Templates]

    Use the rubric consistently across all assignments. This ensures fairness and reliability. Consistency in applying the rubric helps build trust with students and maintains the integrity of the assessment process. Insider Tip: Develop a grading checklist to accompany the rubric. This can help ensure that all criteria are consistently applied and ...

  7. Using rubrics

    Rubrics help instructors: Assess assignments consistently from student-to-student. Save time in grading, both short-term and long-term. Give timely, effective feedback and promote student learning in a sustainable way. Clarify expectations and components of an assignment for both students and course teaching assistants (TAs).

  8. Rubrics

    Rubrics are a set of criteria to evaluate performance on an assignment or assessment. Rubrics can communicate expectations regarding the quality of work to students and provide a standardized framework for instructors to assess work. Rubrics can be used for both formative and summative assessment. They are also crucial in encouraging self ...

  9. How to Use Rubrics

    A rubric is a document that describes the criteria by which students' assignments are graded. Rubrics can be helpful for: Making grading faster and more consistent (reducing potential bias). Communicating your expectations for an assignment to students before they begin. Moreover, for assignments whose criteria are more subjective, the ...

  10. Rubric Design

    Writing rubrics can help address the concerns of both faculty and students by making writing assessment more efficient, consistent, and public. Whether it is called a grading rubric, a grading sheet, or a scoring guide, a writing assignment rubric lists criteria by which the writing is graded.

  11. Writing Rubrics: How to Score Well on Your Paper

    A writing rubric is a clear set of guidelines on what your paper should include, often written as a rating scale that shows the range of scores possible on the assignment and how to earn each one. Professors use writing rubrics to grade the essays they assign, typically scoring on content, organization, mechanics, and overall understanding.

  12. Creating and Using Rubrics

    They agreed on how to apply the rubric and their expectations for an "A," "B," "C," etc., report in 100-level, 200-level, and 300- and 400-level lab sections. Every other year, a random sample of students' lab reports are selected from 300- and 400-level sections. ... Hand out the rubric with the assignment so students will know ...

  13. Rubrics for Written Assignments

An analytic rubric is a scoring guide used to evaluate performance, a product, or a project. It has three parts: 1) performance criteria; 2) rating scale; and 3) indicators. Using a rubric to evaluate student written work is helpful for both faculty and students.

  14. PDF Grading Rubric for Writing Assignment

    Your professor may use a slightly different rubric, but the standard rubric at AUR will assess your writing according to the following standards: A (4) B (3) C (2) D/F (1/0) Focus: Purpose. Purpose is clear. Shows awareness of purpose. Shows limited awareness of purpose.

  15. Examples of Rubric Creation

    Examples of Rubric Creation. Creating a rubric takes time and requires thought and experimentation. Here you can see the steps used to create two kinds of rubric: one for problems in a physics exam for a small, upper-division physics course, and another for an essay assignment in a large, lower-division sociology course.

  16. Example 1

    Example 1 - Research Paper Rubric. Characteristics to note in the rubric: Language is descriptive, not evaluative. Labels for degrees of success are descriptive ("Expert" "Proficient", etc.); by avoiding the use of letters representing grades or numbers representing points, there is no implied contract that qualities of the paper will ...

  17. Writing an Assignment Prompt and Rubric

    An assignment prompt is a set of instructions for a written assignment. It gives students topics or questions to then address in writing. The assignment prompt gives students a starting point for what to write about, and often provides expectations for the written work. The purpose of the prompt is to provide students with clear understanding ...

  18. PDF ASSESSMENT RUBRIC FOR RESEARCH REPORT WRITING: A TOOL FOR ...

    • A rubric provides a common framework and criteria for performance assessment. • A rubric provides standards of transparency and objectivity for all students in a course in which students understand their learning target(s) and the quality standards of a given assignment. • The use of a rubric facilitates the efficient examination

  19. PDF Rubric for evaluating NEWS REPORTS, EOSC 310

    Rubric for evaluating NEWS REPORTS, EOSC 310 Use this rubric as a guide. Write the categories (left side) on your index card. Evaluate each category on a scale of 0-4. Write comments on reverse side of card. Category Excellent (4) Good (3) Adequate (2) Inadequate (1) Opening & intro Clearly, quickly established the focus of the presentation,

  20. Grading Rubric for Written Reports

    Contact Information. Dr. Lisa Bullard, Teaching Professor. Department of Chemical and Biomolecular Engineering. North Carolina State University. Raleigh, NC 27695-7905 USA. Phone: 919-515-7455. Email: [email protected].

  21. PDF Technical Report Evaluation Rubric

    Technical Report Evaluation Rubric 1 Writing Performance Levels Purpose: Evaluate a student's ability to write a technical report. Student Name: Evaluator: Ranking: On a scale from 1 (lowest performance) to 10 (highest performance), assign points to each dimension based on the criteria below. Writing Dimensions/ Weight Does Not Meet Expectations

  22. Assessment Rubric Design

Step 1: Define the purpose of the lab report assignment/assessment for which you are creating a rubric; Instruction Rubric development example. The first step is to clarify the purpose of the assignment and identify the student's learning outcome(s) from lab report writing. Assume an instructor focuses on data presentation in one lab report assignment.

  23. Sample Assignment Rubrics

    An assignment rubric can serve multiple purposes: communicate to your students the exact aspects of the assignment they will be graded on; model scholarly practice to students; help you grade submitted assignments in an efficient and transparent way. Here are some samples: Annotated Bibliography Evaluation Rubric. Research Paper Rubric (Cornell ...

  24. Creating an assignment in Feedback Studio using Moodle Direct V2

    A similarity report will still be generated for paper submissions, but your students' papers will not be stored in the Turnitin standard paper repository or the institution's paper repository for future comparison. ... GradeMark options allows you to attach a rubric to an assignment. You can do this by selecting a rubric from the dropdown list ...

  25. Rubric Enhancements Issue for Students

    We enabled rubric enhancements today and are seeing issues with students' ability to see the rubric details on the assignment submission page. The criteria are listed, but the descriptions are missing. This has been tested in two browsers. We have disabled rubric enhancements in BETA and the rubric appears correctly on the submission page.

  26. How Can I Use AI as an Instructor?

    AI can be a powerful ally in content creation. Various tools and platforms, often user-friendly and accessible, empower instructors to generate diverse and creative content. From learning outcomes to writing assignments to lesson plans, generative AI can provide inspiration and assist in overcoming ...

  27. PDF IEERB 2024 CBA COMPLIANCE RUBRIC

    The Rubric is designed to assist parties in developing a compliant CBA. Statutory changes have been ... Report and Recommendation, to ensure that their 2024 CBA is compliant. Unfortunately, time and staffing ... Assignment of instructional leadership roles Academic needs of students in the corporation

  28. Red Sox reliever's FB at 93-94 mph, rehab assignment ...

    The first baseman has a .294/.400/.559/.959 line with one homer and six doubles in nine games (40 plate appearances) on his rehab assignment for the WooSox. "If the player needs more, he needs ...

  29. Angels Designate Veteran For Assignment, Call Up Ex-Red Sox Infielder

    The Los Angeles Angels are designating first baseman/outfielder Willie Calhoun for assignment, according to Alden González of ESPN on X. In a corresponding move, former Boston Red Sox infielder ...

  30. Sean Reid-Foley's Mets rehab assignment stalls

    Sean-Reid Foley, throwing a pitch in a game earlier this season, has his rehab assignment stalled due to struggles on the mound and with velocity. Corey Sipkin for New York Post