TABLE OF CONTENTS
- Who can access this report?
- How do I get to the report?
- What’s the big picture?
- Which standards need attention?
- How are my schools performing?
- How are individual classes doing?
- Are there equity gaps?
- How can I explore performance patterns? (Deep Dive)
- How did students perform on each question?
- Are we meeting our accountability targets?
- How do I share this data with my team?
- What does "Pending Evaluation" mean?
- How are the numbers calculated?
- Quick reference: report sections at a glance
- Using the filter
You’ve administered a Common Assessment across your district. Students have completed it, teachers have proctored it — and the data is in. The District Report is where you turn that data into action.
This report gives you a clear, organized view of how students performed across standards, schools, classes, and demographic subgroups. Use it to prepare for your next Professional Learning Community meeting, identify schools that need support, surface equity gaps, track progress against your district’s accountability targets, and pinpoint individual students for intervention or enrichment.
This article is for: System Admins, School Admins, Assessment Coordinators, Department Heads, and Instructional Coaches.
Who can access this report?
Different roles see different levels of data. Here’s what each role can access:
How do I get to the report?
1. Click Common Assessments in the left navigation.
2. Click the assessment you want to review.
3. On the Common Assessment Details page, click View Report.
The report opens to the Summary tab by default. Data updates once daily overnight, so recent submissions may take up to a day to appear. Check the "Last updated" timestamp in the report header before presenting data in meetings.
What’s the big picture?
Overall accuracy
A single percentage representing district-wide performance — the total points earned by all participating students, divided by the total points possible across those participants, expressed as a percentage.
Accuracy = (Sum of points earned by all participants) ÷ (Maximum possible points × Number of participants)
The formula uses each student’s best attempt only and counts only unique participants who submitted at least one attempt.
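As a worked example, here is that formula in a few lines of Python. The point values below are made up for illustration:

```python
# Worked example of the accuracy formula with made-up numbers:
# 3 participants, a 20-point assessment, best attempts of 14, 18, and 11.
best_attempt_points = [14, 18, 11]
max_points = 20
participants = len(best_attempt_points)

# (Sum of points earned) / (Max possible points x Number of participants)
accuracy = sum(best_attempt_points) / (max_points * participants) * 100
print(f"{accuracy:.1f}%")  # 43 / 60 -> 71.7%
```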
Performance band distribution
A color-coded bar showing what share of students fall into each proficiency level:
Below the bar, you’ll see the participant count for each band (e.g., "24 participants - Did not meet," "267 participants - Partially met"). Hover over any segment to see details.
Participant count
The total number of unique students included in the report (e.g., "496 participants").
Report header information
The header displays:
Assessment title and metadata (Grade, Subject, Date range)
Creator name
Status indicator (e.g., "Ended")
Last updated timestamp — check this before presenting data in meetings
Report header actions
See "Pending Evaluation" in the legend? Some students have Open-Ended questions their teachers haven’t scored yet. Those students are temporarily excluded from accuracy calculations. See "What does 'Pending Evaluation' mean?" below.
Which standards need attention?
Go to Summary > Standards. Each standard shows:
Standard code and description (e.g., "TEKS.MATH.6.3D")
Number of items tagged to this standard (e.g., "4 items")
Accuracy percentage in a circular indicator
Participant distribution bar (color-coded by performance band)
Look for standards where a large share of students fall into "Did Not Meet" or "Partially Met" — these are your priority areas.
Drill down into a standard
Click on any standard row to open its drilldown panel. This lets you move from "this standard is underperforming" to "here’s exactly where the issue is."
The Standards drilldown shows:
Header: Standard code and description, Standard Accuracy percentage, Participant Distribution bar (color-coded by performance band, with counts).
Three sub-tabs:
Items section:
Lists each question tagged to this standard
Shows "% Students Awarded Points" for each item
If an accountability goal is set, the panel organizes schools and classes into clear sections:
"Meeting accountability goal" — Lists cohorts above your target, with count (e.g., "8 of 8 schools")
"Not meeting accountability goal" — Lists cohorts below your target, with count and a "View all →" link
How are my schools performing?
Go to Summary > Schools. Each school shows:
School name and participant count
Accuracy percentage displayed as a horizontal bar
District Average reference line (vertical dashed line) for quick comparison
Schools are ranked by accuracy, making it easy to see which are above or below the district average.
Drill down into a school
Click on any school to open its drilldown panel. This is where you go from "this school is below average" to "here’s exactly which standards and classes are driving the gap."
The Schools drilldown shows:
Header: School name and participant count, School Accuracy percentage with comparison to district (e.g., "↑ 6% above district average"), Participant Distribution bar (color-coded by performance band).
"View school report →" link: Opens the school-level report in a new tab — great for sharing with principals.
Three sub-tabs:
How are individual classes doing?
Go to Summary > Classes. Each class shows:
Class name (format: TeacherName Grade+Section SUBJECT SchoolCode)
Participant count
Accuracy as a horizontal bar
District Average reference line for comparison
Use this when: Planning coaching conversations, identifying high-performing classes whose instructional practices might be worth replicating, or spotting classes that may benefit from additional coaching or support.
Are there equity gaps?
Go to Summary > Sub groups. Each demographic group shows:
Group name (e.g., Asian, Hispanic, Two or More, English Learners, Special Education)
Accuracy percentage in a circular indicator
Participant distribution bar (color-coded by performance band)
Click on any group to drill down and see school-by-school and class-by-class performance for that subgroup.
Sub group data is also available throughout the rest of the report:
Filter — Narrow the entire report (Summary, Deep Dive, Items) to a single sub group.
Drilldowns — Sub group data is reflected in Standards and Schools drilldowns.
Deep Dive — Filter the heatmap by sub group for focused analysis.
Use this when: Preparing equity reports, addressing board questions about achievement gaps, or prioritizing intervention resources.
How can I explore performance patterns? (Deep Dive)
Click the Deep Dive tab to access this view.
What you’ll see:
Sub-tabs: Standard Accuracy | Item Accuracy
"Group by" dropdown: Schools (default) | Classes | Students
Sort dropdown: Ascending | Descending
Heatmap grid: Rows are schools (or classes/students), columns are standards (or items)
Reading the heatmap:
The first row shows the District Average with overall accuracy
Each cell shows accuracy and is color-coded by performance band: Red = Did Not Meet, Yellow = Partially Met, Green = Met, Blue = Exceeded
Pattern interpretation:
Use Filter (top-right) to narrow the Deep Dive to a specific subgroup before analyzing patterns.
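If you want to reproduce the heatmap’s color coding in your own analysis, a minimal sketch might look like this. Note that the band cutoffs below are hypothetical placeholders — the report’s actual thresholds are not documented here and may be configured per assessment:

```python
# Map an accuracy percentage to a performance band, mirroring the
# heatmap's color scheme. The cutoffs (60/75/90) are HYPOTHETICAL
# placeholders, not the report's actual thresholds.
def performance_band(accuracy: float) -> str:
    if accuracy < 60:
        return "Did Not Meet"   # red
    elif accuracy < 75:
        return "Partially Met"  # yellow
    elif accuracy < 90:
        return "Met"            # green
    return "Exceeded"           # blue

print(performance_band(82))  # Met
```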
How did students perform on each question?
Go to the Items tab to view question-by-question results.
What you’ll see:
Each question with its accuracy percentage
Questions tagged to specific standards
Response patterns across the district
How to interpret patterns:
Scroll down or click on the Response analysis area for any item to see:
The question type, the standard it was tagged to, and the points
The percentage and number of students who selected each option
You can also access item-level data through the Standards drilldown — each standard’s panel shows the items tagged to it with "% Students Awarded Points."
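If you export the raw data, you can tally the option-level breakdown yourself. This sketch assumes a simple list of selected options per student — the real export’s layout and column names may differ:

```python
from collections import Counter

# Hypothetical raw responses for one multiple-choice item.
responses = ["A", "B", "A", "C", "A", "B", "A", "D"]

counts = Counter(responses)
total = len(responses)
for option in sorted(counts):
    pct = counts[option] / total * 100
    print(f"Option {option}: {counts[option]} students ({pct:.0f}%)")
```

A heavy cluster on one wrong option usually points to a shared misconception rather than random guessing.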
Are we meeting our accountability targets?
If an accountability goal has been set for this Common Assessment, you see it reflected throughout the Summary tabs and drilldowns:
"Meeting accountability goal" sections show cohorts above your target with a count (e.g., "8 of 8 schools")
"Not meeting accountability goal" sections show cohorts below your target with a count and "View all →" link
Each cohort displays its Students in Met + Exceeded percentage, so you can see exactly how close (or far) each one is from your target
Setting or updating goals: Accountability goals are configured during Common Assessment creation under Reporting > Accountability Goals. You can update them at any time — changes reflect in reports within one day.
Use this when: Preparing for a school board presentation, updating your district improvement plan, or running a data review meeting where you need to show progress against targets.
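The "Students in Met + Exceeded" comparison boils down to a simple percentage check per cohort. A sketch, using a hypothetical 70% target and made-up school numbers:

```python
# Compare each cohort's Met + Exceeded share against an accountability
# target. The target and all cohort numbers here are hypothetical.
target_pct = 70

cohorts = {
    # school: (students in Met + Exceeded, total participants)
    "Lincoln Elementary": (58, 80),
    "Roosevelt Middle":   (41, 65),
}

for school, (met_exceeded, total) in cohorts.items():
    pct = met_exceeded / total * 100
    status = "meeting" if pct >= target_pct else "not meeting"
    print(f"{school}: {pct:.1f}% ({status} accountability goal)")
```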
How do I share this data with my team?
Click the Download button (top-right) and choose:
PDF — A Wayground-branded report with summaries across Standards, Schools, Classes, Subgroups, and Item Analysis. Ready for leadership meetings and improvement plans.
CSV — Raw data for custom analysis in Excel or Google Sheets.
Great for: Professional Learning Community discussions, leadership team meetings, board presentations, school improvement plan documentation, and sharing with principals who want a quick overview.
Best practice: Use the PDF for stakeholder presentations and Professional Learning Community discussions. Use the CSV when you need to combine Common Assessment data with other sources or for custom analytics. For deeper interactive exploration (like drilldowns and Deep Dive), team members can log in to Wayground and use the in-product report.
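For example, once you have the CSV export, flagging schools below the district average takes only a few lines of Python. The column names here ("school", "accuracy") and the sample rows are assumptions for this sketch — check them against your actual export:

```python
import csv
import io

# Stand-in for the downloaded CSV. The real export's column names
# may differ; "school" and "accuracy" are assumptions.
export = io.StringIO(
    "school,accuracy\n"
    "Lincoln Elementary,78\n"
    "Roosevelt Middle,64\n"
    "Washington High,71\n"
)

rows = list(csv.DictReader(export))
district_avg = sum(float(r["accuracy"]) for r in rows) / len(rows)

below = [r["school"] for r in rows if float(r["accuracy"]) < district_avg]
print(f"District average: {district_avg:.1f}%")
print("Below average:", below)
```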
What does "Pending Evaluation" mean?
Some assessments include Open-Ended questions (text, audio, video, or drawing responses). These require manual scoring by teachers before they can factor into accuracy calculations.
If you see "Pending evaluation" in the report:
How are the numbers calculated?
Accuracy formula:
Accuracy = (Sum of points earned by participants) ÷ (Max points possible × Number of participants)
Calculation rules:
Only each student’s best attempt counts toward accuracy
Only unique participants who submitted at least one attempt are included
Students with unscored Open-Ended questions are temporarily excluded (see "Pending Evaluation" above)
Why this matters:
Multiple retries don’t artificially inflate or deflate scores
Participation counts reflect actual students, not duplicate attempts
Open-Ended question integrity is maintained through teacher evaluation
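A sketch of those rules in code — best attempt only, unique participants, and pending Open-Ended scores excluded. The attempt data layout below is hypothetical:

```python
# Apply the calculation rules to hypothetical attempt data:
# keep each student's best attempt, count unique participants,
# and exclude students with unscored Open-Ended responses.
attempts = [
    # (student_id, points_earned, has_pending_open_ended)
    ("s1", 12, False),
    ("s1", 16, False),  # retry: only the best attempt counts
    ("s2", 18, False),
    ("s3", 10, True),   # pending evaluation: excluded from accuracy
]
MAX_POINTS = 20

best = {}
for student, points, has_pending in attempts:
    if has_pending:
        continue  # wait for teacher evaluation
    best[student] = max(best.get(student, 0), points)

participants = len(best)  # unique students, not attempts
accuracy = sum(best.values()) / (MAX_POINTS * participants) * 100
print(f"{participants} participants, accuracy {accuracy:.1f}%")  # 2 participants, 85.0%
```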
Quick reference: report sections at a glance
Using the filter
Click Filter (top-right) to narrow the report to specific cohorts.
Filter options:
Use the search box to quickly find specific schools or classes. Click Apply Filters to update the report, or Reset Filters to clear selections.
Filters apply across all tabs (Summary, Deep Dive, Items), so you can analyze a specific cohort throughout the entire report.