Learning Rubric Criterion

Introduction

A Learning Rubric Criterion represents one element or dimension within a Rubric. For instance, if your rubric evaluates an essay’s clarity, grammar, and argument quality, each of those aspects is a separate criterion. Each criterion can have multiple performance levels (e.g., “Excellent,” “Good,” “Fair,” “Poor”) with associated descriptors and point values.

Key highlights:

  • Granular Assessment: Break down performance into distinct attributes.

  • Flexible Levels: Up to five performance levels let you define different gradations of quality or mastery.

  • Weighted Scoring: You can assign weights to each criterion so that some aspects matter more than others.

Properties

  • LearningRubricId: Links this criterion to the broader rubric. You might have several criteria under one rubric for a comprehensive scoring approach.

  • Weight: (Optional) A numeric weight for this criterion if certain elements are more important than others. For example, “Argument Quality” might carry more weight than “Grammar” if argumentation is central to your learning outcome.

  • Sequence: Defines the display order of the criteria. If you want “Thesis Statement” to appear first, set this to 1; “Evidence and Support” might be 2, and so on.

  • Name: A short label for the criterion (e.g., “Thesis Statement,” “Research Depth,” “Formatting”).

  • Level1Name through Level5Name: The labels for each performance tier (e.g., “Excellent,” “Good,” “Fair,” “Poor,” “Very Poor”).

  • Level1Description through Level5Description: Narratives explaining what exactly “Excellent” or “Good” means in terms of learner performance. These help both graders and learners understand expectations.

  • Level1Score through Level5Score: The points assigned to each level (e.g., 4 points for Excellent, 3 for Good). This ensures uniform scoring across graders.

  • Level3Weight: (Optional) A specialized weight for a particular level, for scoring models that are more complex.

  • CreatedDate: Automatically set when the criterion is added to the system.

  • ModifiedDate: Automatically updated when changes occur in this criterion.
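
When you retrieve criteria programmatically, you will usually want them in Sequence order. Below is a minimal Apex sketch; the object and field API names (LearningRubricCriterion__c, LearningRubricId__c, and so on) are assumptions for illustration, so substitute the actual API names from your org’s schema.

    // Sketch: fetch a rubric's criteria in display order.
    // API names are assumed, not confirmed -- check your org's schema.
    // rubricId is the record Id of the parent Learning Rubric.
    List<LearningRubricCriterion__c> criteria = [
        SELECT Name, Weight__c, Sequence__c,
               Level1Name__c, Level1Description__c, Level1Score__c
        FROM LearningRubricCriterion__c
        WHERE LearningRubricId__c = :rubricId
        ORDER BY Sequence__c ASC
    ];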

Example

You define a criterion named “Clarity of Argument.” Level1Name is “Excellent” with Level1Score set to 5, and Level1Description reads “The argument is clearly stated, logically structured, and thoroughly defended with evidence.” Level2Name might be “Good” with Level2Score set to 4, and so forth. If you want it to appear first among the set, set Sequence to 1.
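
Expressed in Apex, the example above might look like the following sketch. The object and field API names (LearningRubricCriterion__c, Level1Name__c, and so on) are hypothetical; use the actual names from your org.

    // Sketch only: object/field API names are assumed, not confirmed.
    LearningRubricCriterion__c criterion = new LearningRubricCriterion__c(
        LearningRubricId__c  = rubricId,   // parent rubric record Id
        Name                 = 'Clarity of Argument',
        Sequence__c          = 1,          // appears first among the set
        Level1Name__c        = 'Excellent',
        Level1Score__c       = 5,
        Level1Description__c = 'The argument is clearly stated, logically '
                             + 'structured, and thoroughly defended with evidence.',
        Level2Name__c        = 'Good',
        Level2Score__c       = 4
    );
    insert criterion;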

Use Case

For a “Final Research Paper Rubric,” you add three criteria: “Thesis Statement,” “Evidence and Support,” and “Analysis of Data.” Each criterion outlines multiple levels, so students can see precisely what’s expected at each one. If “Evidence and Support” is crucial, give it a higher Weight so it counts for more in the final score.
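
The model does not prescribe how Weight combines with the level scores, but one common scheme is to multiply each criterion’s awarded score by its weight and sum the products. A minimal Apex sketch, assuming the criteria list fetched earlier and a hypothetical Map<Id, Decimal> of grader-selected scores keyed by criterion Id:

    // One possible weighting scheme -- not the only valid one.
    // awardedScores is a hypothetical Map<Id, Decimal> of the level
    // score the grader selected for each criterion.
    Decimal total = 0;
    for (LearningRubricCriterion__c c : criteria) {
        Decimal weight  = (c.Weight__c == null) ? 1 : c.Weight__c;  // unweighted = 1
        Decimal awarded = awardedScores.get(c.Id);
        total += weight * awarded;
    }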
