Analytics Dashboard
The analytics feature in VEGA AI is designed to help teachers track student performance in a simple and effective way. It gives detailed insights into how students are performing, what areas they are struggling with, and where they are excelling. Think of this as a dashboard that tells you the health of your classroom, similar to how a doctor uses charts to understand a patient’s health.
Tabs in Analytics
At the top of the page, you will see three main tabs:
Overall Analytics – This tab gives a comprehensive view of how all students in your course are performing. You can think of this as the "big picture" view.
Batch Analytics – Here, you can focus on specific groups of students (batches). It’s like checking how different teams in a competition are performing.
Scheduled Tests Analytics – This tab helps you analyze the results of specific tests you’ve scheduled for students.
Overall Analytics Overview
Under this tab, you will first see a table of student scores. Each student’s name, projected SAT score, and a button to view their detailed analytics are displayed here.
Red Zone, Yellow Zone, Green Zone: Students are grouped into these zones based on their scores. Think of this like a traffic light:
Red Zone means students need urgent help.
Yellow Zone indicates moderate progress.
Green Zone shows students who are doing well. If the majority of your students fall in the Red Zone, that signals a need for immediate attention (see the sketch after this list for how such a grouping can be computed).
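VEGA AI does not publish its exact score cut-offs on this page, but conceptually the zones work like a simple threshold check on the projected SAT score. The sketch below uses hypothetical thresholds (450 and 600) and made-up scores purely for illustration; it is not VEGA AI’s actual logic.

```python
# Minimal sketch of zone grouping by projected SAT score.
# The 450/600 thresholds and the student scores are illustrative assumptions.

def zone_for_score(projected_sat: int) -> str:
    """Map a projected SAT score to a traffic-light zone."""
    if projected_sat < 450:
        return "Red"      # needs urgent help
    if projected_sat < 600:
        return "Yellow"   # moderate progress
    return "Green"        # doing well

students = {"Asha": 430, "Raja Kumar": 540, "Meera": 680}  # example data

for name, score in students.items():
    print(f"{name}: {score} -> {zone_for_score(score)} Zone")
```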
Error Frequency Analysis
Below the table, you’ll find a bar graph showing the Frequency of Errors in Reading, Writing, and Math. For example:
If 90% of errors are in “Central Idea and Details,” it’s clear students struggle with understanding the main points in passages. This gives teachers a direct hint to revisit this concept during class.
There’s also a list of Top Three Areas for Improvement generated by AI. For example, if the areas include “Text Structure & Purpose,” you know students need help understanding how authors organize information.
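Under the hood, this kind of chart is essentially an error count divided by the number of attempts for each topic, with the highest percentages surfacing as the top areas for improvement. The sketch below shows the idea with made-up attempt data; it is not VEGA AI’s actual computation.

```python
# Minimal sketch: error frequency per topic as a percentage of attempts.
# Topic names match the dashboard; the attempt data is made up.

from collections import defaultdict

# (topic, was_answer_correct) pairs for every attempted question
attempts = [
    ("Central Idea and Details", False),
    ("Central Idea and Details", False),
    ("Central Idea and Details", True),
    ("Text Structure & Purpose", False),
    ("Words in Context", True),
]

totals = defaultdict(int)
errors = defaultdict(int)
for topic, correct in attempts:
    totals[topic] += 1
    if not correct:
        errors[topic] += 1

error_rates = {t: 100 * errors[t] / totals[t] for t in totals}

# Top three areas for improvement = highest error rates
for topic, rate in sorted(error_rates.items(), key=lambda x: x[1], reverse=True)[:3]:
    print(f"{topic}: {rate:.0f}% errors")
```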
Time Management Insights
The next section focuses on Time Management, showing how long students take on specific types of questions. For instance, if students spend only 8 seconds on “Inference” questions and still get them wrong, they are likely rushing through these questions and need more practice.
Similarly, if students spend 2 seconds on “Words in Context” but perform poorly, this also suggests they are rushing and should slow down to consider these questions carefully (the sketch after the list below shows how such patterns can be flagged).
The Top Three Areas for Weakness in Time Management are also highlighted. For example:
Command of Evidence
Inferences
Words in Context
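One way to think about these flags: a topic stands out when the average time spent is low and the error rate is high at the same time. The sketch below illustrates that idea with made-up records; the 10-second and 50% thresholds are assumptions, not VEGA AI’s actual rules.

```python
# Minimal sketch: average time per topic plus a "rushing" flag.
# A topic is flagged when students answer quickly but the error rate is high.
# The thresholds (10 seconds, 50%) and the records are illustrative.

records = [  # (topic, seconds_spent, was_correct) -- made-up data
    ("Inference", 8, False),
    ("Inference", 9, False),
    ("Words in Context", 2, False),
    ("Words in Context", 3, True),
    ("Command of Evidence", 45, True),
]

for topic in sorted({t for t, _, _ in records}):
    rows = [r for r in records if r[0] == topic]
    avg_time = sum(sec for _, sec, _ in rows) / len(rows)
    error_rate = 100 * sum(not ok for _, _, ok in rows) / len(rows)
    rushed = avg_time < 10 and error_rate > 50
    flag = "  <- likely rushing" if rushed else ""
    print(f"{topic}: {avg_time:.1f}s avg, {error_rate:.0f}% errors{flag}")
```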
LQ Mastery Score
This section shows the Learning and Question Mastery Levels (LQ Mastery) for different topics within Reading, Writing, and Math. It’s like a report card that breaks down a student’s understanding into smaller sections so you can see exactly where they stand.
Topics and Levels: Each topic (e.g., “Central Idea and Details” or “Command of Evidence”) is listed with its corresponding mastery level, such as Novice, Intermediate, or Advanced.
Example: If “Command of Evidence” shows 71% (Advanced) but “Text Structure & Purpose” is 0% (Novice), this indicates students excel at finding evidence but struggle with understanding how authors organize their text.
Top Areas for Improvement: On the right side, the AI highlights the top three weak areas for students, based on their mastery levels. For instance:
Text Structure & Purpose
Cross Text & Connections
Rhetorical Synthesis
These insights are incredibly valuable because they allow teachers to focus their efforts on areas that need the most attention.
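The dashboard does not state the exact boundaries between Novice, Intermediate, and Advanced. The sketch below assumes 40% and 70% cut-offs purely to illustrate how a mastery percentage could map to a level label and how the weakest topics might be surfaced; the boundaries are assumptions.

```python
# Minimal sketch: mapping a mastery percentage to a level label.
# The 40%/70% boundaries are assumptions for illustration only.

def mastery_level(pct: float) -> str:
    if pct < 40:
        return "Novice"
    if pct < 70:
        return "Intermediate"
    return "Advanced"

scores = {"Command of Evidence": 71, "Text Structure & Purpose": 0}
for topic, pct in scores.items():
    print(f"{topic}: {pct}% ({mastery_level(pct)})")

# Top areas for improvement = lowest mastery percentages
weakest = sorted(scores, key=scores.get)[:3]
print("Top areas for improvement:", weakest)
```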
Strengths and Weaknesses Analysis
This tool helps teachers dive even deeper by letting them select a specific topic and a mastery level to examine how students are performing in that area.
Table Breakdown: The table lists:
Student Name
Mastery Level for Full-Length Tests
Mastery Level for Play & Practice (games or activities within VEGA AI)
Time Spent Training
Total Questions Correct/Attempted
A link to "Detailed Analytics" for a more focused view.
Example: Suppose you select the topic “Command of Evidence” and the mastery level “Novice.” The table will show all students who are struggling in this area. If a student has spent 0 minutes practicing and their questions are all incorrect, this signals the need for more directed practice in this topic.
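Conceptually, this view is a filter applied to the student records: pick a topic and a mastery level, and only the matching rows remain. The sketch below shows that filtering with made-up records and assumed field names; it is not VEGA AI’s data model.

```python
# Minimal sketch: filter students by a chosen topic and mastery level,
# as the Strengths and Weaknesses table does. Field names and data are assumptions.

students = [
    {"name": "Asha", "topic": "Command of Evidence", "level": "Novice",
     "minutes_trained": 0, "correct": 0, "attempted": 5},
    {"name": "Raja Kumar", "topic": "Command of Evidence", "level": "Advanced",
     "minutes_trained": 35, "correct": 9, "attempted": 10},
]

def filter_students(rows, topic, level):
    return [r for r in rows if r["topic"] == topic and r["level"] == level]

for row in filter_students(students, "Command of Evidence", "Novice"):
    print(f'{row["name"]}: {row["minutes_trained"]} min trained, '
          f'{row["correct"]}/{row["attempted"]} correct')
```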
After exploring the Overall Analytics, VEGA AI provides additional tabs for Batch Analytics and Scheduled Test Analytics. These features are tailored to give a more focused view of performance based on student groups (batches) and specific scheduled tests.
Batch Analytics
This tab focuses on performance within specific student groups (batches). Teachers can monitor how well a group of students assigned to a course or program is performing collectively.
Selecting a Batch
Use the dropdown menu at the top left to select a specific batch. For example, if you choose “LearnQ B1,” only the students in that batch will appear in the analysis.
Student Data
A table displays all students in the batch along with:
Average Projected SAT Score: This is a prediction of how well each student might perform based on their current progress.
Detailed Analytics Link: Clicking this opens an individual report for the selected student.
Example: If “Raja Kumar” shows a score of 540, you can click "View Analytics" to get an in-depth breakdown of their strengths, weaknesses, and time management.
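Conceptually, Batch Analytics runs the same calculations on just the students in the selected batch. The sketch below shows that filtering step and a batch-average projected score; apart from the batch name “LearnQ B1” and the score 540 mentioned above, the records are made up.

```python
# Minimal sketch: restrict the analysis to one batch, then compute the
# batch's average projected SAT score. Most of this data is illustrative.

students = [
    {"name": "Raja Kumar", "batch": "LearnQ B1", "projected_sat": 540},
    {"name": "Asha",       "batch": "LearnQ B1", "projected_sat": 470},
    {"name": "Meera",      "batch": "LearnQ B2", "projected_sat": 660},
]

selected_batch = "LearnQ B1"
batch_students = [s for s in students if s["batch"] == selected_batch]

average = sum(s["projected_sat"] for s in batch_students) / len(batch_students)
print(f"{selected_batch}: {len(batch_students)} students, "
      f"average projected SAT {average:.0f}")
```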
Red, Yellow, and Green Zones
The right panel categorizes students into performance zones:
Red Zone: Students who need immediate attention.
Yellow Zone: Students with moderate performance who need improvement.
Green Zone: High-performing students.
Example: If one student is in the Red Zone, you can prioritize that student with targeted strategies.
Frequency of Errors Analysis
Just like in Overall Analytics, this chart displays the percentage of errors for each topic in Reading, Writing, and Math.
Example: If students in the batch have a 78% error rate in “Cross Text & Connections,” this is flagged as a critical weak area.
Time Management
Average time spent solving questions is shown per topic. If students in this batch spend only 2 seconds on “Command of Evidence” questions and the error rate is high, it signals they are not giving these questions enough time.
The Top Three Areas for Improvement in Time Management are also listed to guide teachers.
LQ Mastery Score
Similar to Overall Analytics, this section lists the batch’s mastery levels for various topics. It helps compare collective performance with individual student analytics.
Strengths and Weaknesses Analysis
Teachers can filter by topic and mastery level for the entire batch. For instance, selecting “Text Structure & Purpose” and “Novice” will highlight which students are struggling in that specific area.
Scheduled Test Analytics
This tab provides insights specifically for tests that were scheduled for students, giving teachers a way to evaluate performance on structured assessments.
Selecting a Scheduled Test
Use the dropdown menu to select a test from the list. Details for the selected test will appear, including:
Test Name
Test Category: For example, Digital SAT.
Test Type: Whether it’s Full-Length or Diagnostic.
Test Date and Time
Assigned Batches or Students
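Internally, each scheduled test can be thought of as a small record carrying these details. The sketch below is one possible representation with assumed field names and example values; it is not VEGA AI’s actual schema.

```python
# One possible representation of a scheduled test's details.
# Field names and values are illustrative assumptions, not VEGA AI's schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ScheduledTest:
    name: str
    category: str          # e.g. "Digital SAT"
    test_type: str         # "Full-Length" or "Diagnostic"
    date_time: str         # scheduled date and time
    assigned_batches: List[str] = field(default_factory=list)

test = ScheduledTest(
    name="Practice Test 3",          # example value
    category="Digital SAT",
    test_type="Full-Length",
    date_time="2024-05-10 09:00",    # example value
    assigned_batches=["LearnQ B1"],
)
print(test)
```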
Student Data
This section displays a table of students who participated in the test, showing their:
Average Projected SAT Score
A “View Analytics” link for detailed insights.
Example: If no students have taken the test yet, the table will indicate “No students found.”
Red, Yellow, and Green Zones
Similar to Batch Analytics, the right panel categorizes students based on their performance during the test.
Frequency of Errors Analysis
This section analyzes how students performed in different topics of Reading, Writing, and Math during the test.
If data is unavailable (e.g., no students completed the test), this section will show a “No data available” message.
Time Management
Like the other analytics sections, this tab highlights the average time spent on different topics during the test. It also flags the top three areas where students need improvement in time management.
Question by Question Analysis
Teachers can view a detailed breakdown of how students performed on each question in the test.
The tab is divided by subject (Math, Reading, Writing) and further into difficulty modules (e.g., Easy, Medium, Hard).
Example: If Question 10 from “Math Module 1” has a 100% error rate, this question is flagged as challenging, and teachers can review its details.
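As a rough illustration, such a breakdown can be computed by tallying an error rate per question across student responses, grouped by subject and module. The sketch below uses made-up responses and flags questions at a 100% error rate, mirroring the example above; the data and flag rule are assumptions.

```python
# Minimal sketch: per-question error rates for one test, grouped by module.
# The response data and the 100% flag threshold are illustrative assumptions.

from collections import defaultdict

responses = [  # (subject, module, question_no, was_correct)
    ("Math", "Module 1", 10, False),
    ("Math", "Module 1", 10, False),
    ("Math", "Module 1", 11, True),
    ("Reading", "Module 1", 1, True),
]

totals = defaultdict(int)
errors = defaultdict(int)
for subject, module, q, correct in responses:
    key = (subject, module, q)
    totals[key] += 1
    if not correct:
        errors[key] += 1

for key in sorted(totals):
    rate = 100 * errors[key] / totals[key]
    flag = "  <- flagged as challenging" if rate == 100 else ""
    print(f"{key[0]} {key[1]}, Q{key[2]}: {rate:.0f}% error rate{flag}")
```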