Your Common Assessment Class Report gives you a clear picture of how your class performed on a district-wide assessment. Use it to review participation, accuracy, performance bands, and individual student results - all in one place. This article walks you through every section of the report and how to use it effectively.
How to Access Your Class Report
Go to your Wayground dashboard and click on ‘Sessions’ > ‘Common Assessments’ in the left navigation pane.
Find the Common Assessment you want to review. It will appear under the ‘Active’ or ‘Ended’ tab.
Click on the Common Assessment name to open the details page.
Your class sessions will appear under the ‘My Sessions’ section.
Click on ‘Class report’ to open your class report.
Note: The class report is available as soon as the assessment is assigned to the class. Student data appears once students submit their assessment.
Understanding the Report Overview
The top of your Class Report shows a high-level summary of class performance.
Participation & Accuracy Metrics
Your report displays two key metrics at the top:
Participants: The number of unique students in your class who submitted at least one attempt. Each student is counted only once, even if they attempted the assessment multiple times.
Class Accuracy: The overall accuracy percentage for your class, calculated using each student’s best attempt.

Tip: Accuracy metrics use each student’s best attempt. If a student took the assessment multiple times, only their highest-scoring attempt counts toward class accuracy, item accuracy, and standard accuracy.
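The best-attempt rule above can be sketched in a few lines. This is an illustrative example only, not Wayground's actual calculation; the data shape (a mapping from student ID to a list of attempt scores, each a percentage) is an assumption made for the sketch.

```python
def class_accuracy(attempts_by_student):
    """Average each student's best attempt score (0-100).

    attempts_by_student maps a student ID to a list of attempt
    scores; students with no attempts are ignored.
    """
    best_scores = [max(scores) for scores in attempts_by_student.values() if scores]
    if not best_scores:
        return 0.0
    return sum(best_scores) / len(best_scores)

# One student retook the assessment; only their higher score (90) counts.
print(class_accuracy({"s1": [60, 90], "s2": [75]}))  # → 82.5
```

Because only the best attempt counts, a retake can raise class accuracy but never lower it.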
Performance Bands
Below the accuracy metric, a color-coded bar shows how your students are distributed across four performance bands:
Did Not Meet (Red, <40%): Student performance falls below the expected threshold.
Partially Met (Yellow, 40–70%): Student shows emerging understanding but hasn’t fully met expectations.
Met (Green, 70–90%): Student demonstrates solid understanding of the assessed concepts.
Exceeded (Blue, >90%): Student exceeds expectations with strong accuracy and depth.
Note: Performance band ranges are configured by the CA creator when setting up the Common Assessment. The ranges above are defaults — your district may use different thresholds.
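Using the default thresholds above, the band assignment works roughly like this. The exact handling of boundary values (a score of exactly 70% or 90%) is an assumption here, since the ranges overlap at their edges and your district may configure different thresholds.

```python
def performance_band(accuracy):
    """Map an accuracy percentage to a default performance band.

    Boundary handling (70 -> Met, 90 -> Met) is assumed, not documented.
    """
    if accuracy < 40:
        return "Did Not Meet"   # Red
    elif accuracy < 70:
        return "Partially Met"  # Yellow
    elif accuracy <= 90:
        return "Met"            # Green
    else:
        return "Exceeded"       # Blue
```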
School & District Averages
Your Class Report displays your class accuracy alongside your school average and the district average. These benchmarks appear at three levels:
Assessment level: In the overview section, compare your overall class accuracy against school and district.
Standard level: In the Standards tab, see how your class performed on each standard relative to the school and district.
Item level: In the Items tab, see per-question accuracy compared to school and district benchmarks.
This helps you quickly identify whether a performance gap is specific to your class or reflects a broader trend across the school or district.
Reviewing Student Results
The Participants tab shows a detailed view of every student in your class and their assessment status.
A status badge next to the assessment title at the top of your Class Report indicates the current state of the Common Assessment:
‘Active’: The assessment is within its scheduled start and end dates. Teachers can administer sessions and students can participate.
‘Ended’: The assessment has passed its end date. No new sessions can be started, and report data is final unless the CA creator reopens the assessment.
Student Status Tags
Each student record displays one of the following status tags:
Pending Evaluation: The student has submitted their assessment, but one or more open-ended responses still need to be manually evaluated by you. Until you evaluate these, the student’s results are excluded from accuracy roll-ups.
Not Submitted: The student started an attempt but has not yet submitted it; the attempt is still in progress.
Not Started: The student has not begun the assessment.
Points displayed (e.g., 18/20): The student’s assessment is fully evaluated and their best attempt data is shown.
[Screenshot: The Participants tab showing 4–5 student rows with different status tags. Highlight each status type with annotations.]
Ending Ongoing Attempts
If students have active attempts from a previous session that they never submitted, you’ll see an End Ongoing Attempts button on your report.
Clicking this button:
Saves the responses of all in-progress students
Submits the assessment on their behalf
Makes their responses available for review and scoring
Note: This button appears only when one or more students have active but unsubmitted attempts. Once all attempts are ended, the button disappears. Use this when the testing window has closed and you need to finalize all student records.
[Screenshot: The report page showing the “End Ongoing Attempts” button in context. Use an arrow to annotate the button.]
Evaluating Open-Ended Questions
If your Common Assessment includes open-ended questions (text response, audio, video, or draw), you need to evaluate them manually. This ensures accuracy metrics reflect real teacher judgment.
Manual Evaluation Workflow
On the Participants tab, look for students with the Pending Evaluation status tag.
Click on the student’s name to open their submission.
Review their open-ended response(s).
Assign a score for each open-ended question. The points field will be blank until you evaluate.
Save your evaluation. The student’s status will update, and their results will be included in accuracy calculations.
[GIF: Demonstrate the full evaluation flow — select a student with “Pending Evaluation” status → view the response → enter a score → save the evaluation → confirm that the status updates to “Scored.”]
Using AI Evaluation
If AI Evaluation is enabled for the assessment, you can speed up the process:
Open a student’s pending submission.
Click Apply AI Evaluation Points.
Review the AI-assigned scores to ensure they’re appropriate.
Adjust if needed, then save.
Tip: AI Evaluation is a time-saver, but it’s a starting point — not a final grade. Always review AI-assigned scores before relying on them for instructional decisions.
[GIF: Show the process of clicking “Apply AI Evaluation Points,” displaying the scores populating automatically, and the teacher reviewing and saving the evaluation.]
How Pending Evaluations Affect Accuracy
Until all open-ended responses are evaluated, students with pending evaluations are excluded from the affected accuracy metrics:
Assessment-level accuracy: Only students who are fully evaluated are included.
Standard-level accuracy: A student is excluded from a standard’s accuracy until all their responses for items within that standard are evaluated.
Item-level accuracy: A student is excluded from an item’s accuracy until their response to that item is evaluated.
Multiple attempts: If a student has multiple attempts, the system cannot determine their best attempt until all responses across all attempts are fully evaluated. Until then, the student is excluded from all accuracy aggregates.
Tip: For the most accurate class-wide data, evaluate all open-ended responses before analyzing your standards and item accuracy. This ensures no students are excluded from the calculations.
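The item-level exclusion rule can be sketched as follows: a response only counts toward an item's accuracy once it has been evaluated. This is a hypothetical illustration; the response structure (score, maximum points, evaluated flag) is an assumption made for the sketch, not Wayground's data model.

```python
def item_accuracy(responses):
    """Accuracy for one item, ignoring unevaluated responses.

    responses: list of (score, max_points, evaluated) tuples,
    one per student, where score may be None until evaluated.
    """
    graded = [(score, max_pts) for score, max_pts, done in responses if done]
    if not graded:
        return None  # nothing evaluated yet, so no accuracy to report
    return 100 * sum(s for s, _ in graded) / sum(m for _, m in graded)

# The pending (unevaluated) response is simply left out of the roll-up:
print(item_accuracy([(2, 2, True), (1, 2, True), (None, 2, False)]))  # → 75.0
```

Once the pending response is scored and its flag flips to True, it joins the denominator, which is why completing evaluations can shift your item and standard accuracy.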
Viewing Performance by Standard
The Standards tab breaks down your class accuracy by each assessed standard. For each standard, you’ll see:
The standard name and number of items mapped to it
Your class accuracy for that standard
School and district averages for comparison
Performance band distribution showing how many students fall in each band
Click on any standard to drill deeper into item-level details for that standard.
[Screenshot: Show the Standards tab with 2–3 standards listed, each showing accuracy percentage, band distribution bar, and school/district averages. Annotate key elements.]
Viewing Performance by Item
The Items tab shows performance on each individual question. For each item, you can see:
The question accuracy percentage
School and district averages for that item
Whether the item was among the hardest or most-missed for your class
This helps you pinpoint specific concepts or questions where your students struggled and decide whether to reteach or revisit that material.
Tips & Best Practices
Evaluate open-ended questions first. Before drawing conclusions from your class data, complete all pending evaluations so that every student’s results are included in the accuracy calculations.
Use benchmarks for context. If your class accuracy is below the school or district average on a particular standard, consider collaborating with colleagues who are above average to share instructional strategies.
Focus on performance bands. Identify students in the “Did Not Meet” and “Partially Met” bands for targeted intervention, and challenge students in the “Exceeded” band with enrichment opportunities.
End ongoing attempts promptly. If the testing window has closed, use the End Ongoing Attempts button to finalize student records so their data is captured.