
Learning Assessment

Introduction

A Learning Assessment is your formal evaluation instrument, such as a quiz, exam, or project. It can be attached to either a Course or a Module, and can be Formative (to check understanding along the way) or Summative (to gauge final mastery). You can also enable AI assistance for marking or feedback.

Key highlights:

  • Flexible Configuration: Set open/close times, pass thresholds, or AI aggregation.

  • AssessingType: Decide whether AI marks automatically, suggests marks for instructor approval, or is disabled so all marking is done manually.

  • Submission Windows: Control the times when learners can submit tasks, ensuring fairness and punctuality.

Properties

  • LearningCourseId / LearningModuleId: Links the assessment to either a single course or module, indicating the context in which learners take it (like “Midterm for Module 2”).
  • Name: A clear label, e.g., “Midterm Exam,” “Project Proposal,” or “Quiz #3.” This helps learners and staff see at a glance what the assessment is about.
  • Type: Classifies the assessment as “Formative” or “Summative.” Formative might be frequent small quizzes; Summative is typically a final or high-stakes exam.
  • AssessingType: Determines how final marks are produced: “Disable AI,” “AI Aggregated Marking and Approval,” or “AI Aggregated Marking for Manual Approval.” Lets you blend automation with instructor expertise.
  • SubmissionOpenDate: A date/time after which learners can start submitting. Great for scheduling (e.g., the test window opens on March 1 at 9:00 AM).
  • SubmissionPeriod: The number of hours submissions remain open. For instance, if set to 48, the window closes two days after SubmissionOpenDate.
  • Status: “Inactive,” “Open,” or “Closed.” If set to Open, learners can see and submit the assessment; when Closed, no new submissions can be made.
  • MinimumMarkRequired: A numeric pass threshold. If the total mark is below this, the learner is considered not to have met the requirement.
  • LearningActivityDefinitionId: An optional link to external analytics or tracking systems.
  • ECLearningOutcomeItemId: (Optional) If you integrate with Education Cloud, point to an existing outcome item so both systems stay in sync regarding learner progress.
  • CreatedDate: System-managed timestamp set when this assessment is created.
  • ModifiedDate: System-managed timestamp of the latest update to the assessment record.
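SubmissionOpenDate, SubmissionPeriod, and Status work together: the submission window closes SubmissionPeriod hours after SubmissionOpenDate, and the assessment must also be Open for submissions to be accepted. The Apex sketch below illustrates that calculation; the object and field API names (LearningAssessment__c, SubmissionOpenDate__c, SubmissionPeriod__c, Status__c) are placeholders and may differ from the actual schema or namespace in your org.

```apex
// Minimal sketch, not the packaged implementation: assumes the assessment is
// exposed as a custom object LearningAssessment__c with SubmissionOpenDate__c
// (Date/Time), SubmissionPeriod__c (Number of hours), and Status__c (Picklist).
// Adjust the API names and any namespace prefix to match your org.
LearningAssessment__c quiz = [
    SELECT SubmissionOpenDate__c, SubmissionPeriod__c, Status__c
    FROM LearningAssessment__c
    WHERE Name = 'End-of-Module Quiz'
    LIMIT 1
];

// The window closes SubmissionPeriod hours after SubmissionOpenDate.
Datetime closesAt = quiz.SubmissionOpenDate__c
    .addHours(quiz.SubmissionPeriod__c.intValue());

// A submission is accepted only while the assessment is Open and inside the window.
Boolean acceptingSubmissions =
    quiz.Status__c == 'Open'
    && Datetime.now() >= quiz.SubmissionOpenDate__c
    && Datetime.now() < closesAt;

System.debug('Window closes at ' + closesAt
    + '; accepting submissions: ' + acceptingSubmissions);
```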

Example

You create an “End-of-Module Quiz” for Module 2 in your “Data Analytics” course. You set Type to “Formative” for a low-stakes check. You specify it opens next Monday at 9:00 AM (SubmissionOpenDate) and remains open for 24 hours. You enable “AI Aggregated Marking for Manual Approval” so the AI suggests scores, but an instructor finalizes them.
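Expressed in Apex, that configuration might look like the sketch below. The object and field API names follow the property list above but are assumptions (LearningAssessment__c, LearningModuleId__c, and a LearningModule__c lookup are illustrative), and the open date is an example value standing in for “next Monday at 9:00 AM”; verify the actual names, namespace, and picklist values against your org’s schema.

```apex
// Sketch only: API names, namespace, and picklist values are assumptions
// derived from the property list above; check your org's schema before use.
Id module2Id = [SELECT Id FROM LearningModule__c
                WHERE Name = 'Module 2' LIMIT 1].Id;  // hypothetical Module 2 record

LearningAssessment__c quiz = new LearningAssessment__c(
    Name                  = 'End-of-Module Quiz',
    LearningModuleId__c   = module2Id,
    Type__c               = 'Formative',
    AssessingType__c      = 'AI Aggregated Marking for Manual Approval',
    SubmissionOpenDate__c = Datetime.newInstance(2025, 3, 3, 9, 0, 0), // example "next Monday, 9:00 AM"
    SubmissionPeriod__c   = 24,          // remains open for 24 hours
    Status__c             = 'Inactive'   // set to 'Open' when the quiz should go live
);
insert quiz;
```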

Use Case

A university runs an online “Principles of Finance” course. They schedule weekly quizzes as Formative to keep students on track, plus a Summative final exam that closes on a specific date. The final exam is set to “AI Aggregated Marking and Approval” to expedite grading for large cohorts, but the instructor can still review borderline cases or anomalies.
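Under the same schema assumptions as the sketches above (and with illustrative dates and a ten-week term), the whole schedule could be created in a single bulk insert:

```apex
// Sketch only: LearningCourse__c, LearningCourseId__c, and the other API names
// are assumptions; the dates and ten-week term length are illustrative.
Id courseId = [SELECT Id FROM LearningCourse__c
               WHERE Name = 'Principles of Finance' LIMIT 1].Id;

List<LearningAssessment__c> schedule = new List<LearningAssessment__c>();
Datetime firstQuizOpens = Datetime.newInstance(2025, 9, 1, 9, 0, 0);

// Weekly Formative quizzes to keep students on track.
for (Integer week = 1; week <= 10; week++) {
    schedule.add(new LearningAssessment__c(
        Name                  = 'Week ' + week + ' Quiz',
        LearningCourseId__c   = courseId,
        Type__c               = 'Formative',
        SubmissionOpenDate__c = firstQuizOpens.addDays(7 * (week - 1)),
        SubmissionPeriod__c   = 48,          // each quiz stays open for two days
        Status__c             = 'Inactive'
    ));
}

// Summative final exam: it closes a fixed number of hours after it opens,
// with AI marking fully automated to expedite grading for large cohorts.
schedule.add(new LearningAssessment__c(
    Name                  = 'Final Exam',
    LearningCourseId__c   = courseId,
    Type__c               = 'Summative',
    AssessingType__c      = 'AI Aggregated Marking and Approval',
    SubmissionOpenDate__c = Datetime.newInstance(2025, 12, 8, 9, 0, 0),
    SubmissionPeriod__c   = 72,              // closes three days later
    Status__c             = 'Inactive'
));

insert schedule;
```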
