Build or Generate Tests


VEGA AI provides two types of tests for educators: Practice Tests and Structured Tests. Both are designed to meet specific assessment needs, from quick evaluations to comprehensive exams. This guide explains every available feature and option in detail so educators can use both test types effectively.


Practice Tests

Overview

Practice tests are designed to help students strengthen specific skills or concepts. VEGA AI offers two ways to create practice tests:

  1. Generate Practice Tests Using AI: Automatically create tests by specifying topics, difficulty levels, or other parameters.

  2. Build Practice Tests Manually: Customize your test by selecting specific questions from the Question Database.

Practice tests include options like:

  • Exit Ticket Generator: A quick assessment tool to measure students' understanding at the end of a lesson.

  • Quiz Generator: Create quizzes to assess knowledge on specific topics.

  • Assignment Generator: Design detailed assignments covering multiple concepts.

  • Homework Generator: Create tasks for students to practice at home.

Step-by-Step Guide for Practice Tests

Access the Practice Tests Section:

  • Go to the "Practice Tests" tab on the course dashboard. This section shows the test creation options and a list of previously created tests (if any).

Create a New Practice Test:

  • You can either:

    • Generate with AI: Click on one of the test generators (e.g., Quiz Generator) to create a test automatically based on your inputs.

    • Build Manually: Use the "Build Manually" option to create a test by selecting specific questions from the Question Database.

Generate a Test Using AI:

  • Choose the type of test (Quiz, Assignment, etc.).

  • Select the question type (MCQs, Passage-based, or Open-ended Questions).

  • Specify the number of questions.

  • Optionally, upload reference material like syllabi or notes or provide a brief description of the topic and desired difficulty level.

  • VEGA AI will generate a test based on your inputs. You can preview, edit, or finalize the questions before saving the test.
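
To make these inputs concrete, here is a minimal sketch of the information the generator collects. The field names below are hypothetical and for illustration only; they do not reflect VEGA AI's actual API.

```typescript
// Hypothetical sketch: illustrative names only, not VEGA AI's real API.
interface PracticeTestRequest {
  testType: "quiz" | "assignment" | "homework" | "exit_ticket";
  questionType: "mcq" | "passage_based" | "open_ended";
  questionCount: number;
  topicDescription?: string; // brief topic and desired difficulty
  referenceFiles?: string[]; // e.g., uploaded syllabi or notes
}

const quizRequest: PracticeTestRequest = {
  testType: "quiz",
  questionType: "mcq",
  questionCount: 10,
  topicDescription: "Photosynthesis basics, medium difficulty",
};
```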

Build a Test Manually:

  • Click on "Build Manually" and provide a name for the test.

  • Select the type of test (Assignment, Quiz, Homework, or Exit Ticket).

  • Use the Question Database to pick specific questions:

    • Filters Available: You can filter questions by difficulty level, topic, type (MCQ, Open-ended), and verification status.

    • Preview Questions: Before adding, you can preview each question to ensure it fits your requirements.

    • Add questions one by one to the test.

Finalize the Practice Test:

  • After adding questions, review the list in the "Selected Questions" section.

  • You can remove any question by clicking the delete icon next to it.

  • Once the test is finalized, click "Create" to save it.

Assign the Practice Test:

  • Assign the test to a batch of students or specific individuals.

  • Set the start and end dates for the test.

  • Decide if you want the test to be available for free to all students.


Structured Tests

Overview

The Structured Test feature on VEGA AI empowers academic institutions and educators to design summative assessments that simulate real-world exams such as the SAT, GRE, or institutional mock tests. These tests are modular, customizable, and ideal for benchmarking learner performance across multiple skill areas.

This guide walks you through the full lifecycle of a structured test: creating the test, selecting questions, configuring modules, and applying performance analytics, all within VEGA Prep.

Step-by-Step Guide for Structured Tests

Step 1: Navigate to the Structured Test Section

From the left sidebar, navigate to Courses → Structured Tests. This section lists all your existing structured tests with metadata such as section and module counts.

Click on the “Create New Test” button on the top-right to begin test creation.

Step 2: Add Basic Test Info

You’ll land on a two-step setup process:

  1. Add Basic Test Info

  2. Define Metrics (Optional)

In the first step, fill out the following:

  • Test Name: Use a clear naming convention like SAT Reading – Passage 1.

  • Test Description: Briefly describe the purpose of the test, number of sections/modules, and what students can expect.

    Example: This SAT Reading section includes a passage-based comprehension test with 1 section and 1 timed module.

  • Test Instructions: Provide guidance to the learner. Markdown is supported (see the sketch after this list).

    Example: Read the passage thoroughly. Answer the questions based on textual evidence, vocabulary in context, and author’s intent.

  • Number of Sections: Default is 1. Increase if building a full-length test.
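
To illustrate how these fields fit together (including Markdown in the instructions), here is a hedged sketch; the field names are hypothetical and simply mirror the form above, not VEGA AI's internal data model.

```typescript
// Hypothetical field names: illustrative only, not VEGA AI's data model.
const basicTestInfo = {
  testName: "SAT Reading – Passage 1",
  testDescription:
    "Passage-based comprehension test with 1 section and 1 timed module.",
  // Markdown is supported in the instructions field:
  testInstructions:
    "**Read the passage thoroughly.** Answer based on:\n" +
    "- textual evidence\n" +
    "- vocabulary in context\n" +
    "- author's intent",
  sectionCount: 1,
};
```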

Click Create to proceed to section setup.


Step 3: Configure Section Details

Each section defines the layout and rules for one logical segment of the test.

Fill in:

  • Section Name: e.g., “Reading Comprehension”

  • Section Instructions: Custom instructions specific to the section.

  • Section Reference Information: Use this to describe any context or passage attached.

  • Section Reference Image (Optional): Upload any diagrams or visual context.

  • Is this section adaptive? Select “No” for SAT-style static modules.

Module Setup:

Each section supports 1–2 modules.

  • Module Name: e.g., “Module 1: Passage A”

  • Module Duration: Typically 20 minutes for 10 questions (about two minutes per question).

  • Question Count: Set to 10 for a mini-assessment.

  • Access to Calculator: “No” for Reading and Writing modules.

  • Section Interval: Enable this if you want a pause between sections (the default gap is 10 seconds).
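
Putting the section and module settings together, a configured section might look like the following sketch. The field names are hypothetical and do not reflect VEGA AI's internal schema.

```typescript
// Hypothetical sketch: illustrative field names, not VEGA AI's schema.
const readingSection = {
  name: "Reading Comprehension",
  instructions: "Answer every question using evidence from the passage.",
  adaptive: false,              // "No" for SAT-style static modules
  sectionIntervalSeconds: 10,   // pause before the next section, if enabled
  modules: [
    {
      name: "Module 1: Passage A",
      durationMinutes: 20,      // about two minutes per question
      questionCount: 10,
      calculatorAllowed: false, // "No" for Reading and Writing modules
    },
  ],
};
```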

Click Next after setting up the modules.

Step 4: Add Questions to the Module

Once your module is defined:

  1. Click into the module and select “Choose Questions.”

  2. Use filters like Question Type, Subject, Topic, and Difficulty (see the sketch after these steps).

  3. Select relevant questions from the database until you reach the module's question count (10 in this example).

  4. Click Done to save your selections.
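
For instance, a filter combination for the module above might look like the following sketch; the field names and values are hypothetical.

```typescript
// Hypothetical filter combination: illustrative names and values only.
const questionFilter = {
  questionType: "mcq",
  subject: "Reading",
  topic: "Main idea and purpose",
  difficulty: "medium",
};
```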

Questions are now linked to your module. You’re ready to move on to performance analytics.

Adding Test Metrics & Analytics

Once your test structure is in place, VEGA AI allows you to embed data-rich performance analytics through Ring and Box visualizations. These metrics help educators, students, and institute admins assess test performance with clarity, precision, and insight.

You can access this by proceeding to the “Define Metrics (optional)” stage of the test creation flow.


Why Metrics Matter

Structured Tests are not just static assessments—they’re diagnostic tools. With the help of metrics, VEGA AI:

  • Visualizes learner performance using smart charts

  • Provides scaled scores & percentile benchmarks

  • Enables mastery tracking

  • Helps educators intervene with precision

There are two metric sections:

  1. Ring Section – Visual analytics (Apple Watch-style donut charts)

  2. Box Section – Tabular breakdowns for detailed reporting

Let’s break down how to configure both.


Ring Section: Visual Summary Analytics

What It Is:

The Ring Section displays up to 4 key metrics in a concentric circle chart. This gives learners and educators a snapshot of how the student performed overall.

How to Configure:

  • Section Name: e.g., Ring Section 1

  • Select Metrics: Click the dropdown and choose up to four.

For example, recommended metrics for SAT-style tests:

  • Digital SAT Scaled Score: Combined score out of 1600 (based on reading, writing, math)

  • Digital SAT Reading and Writing Scaled Score: Scaled performance in verbal sections

  • Digital SAT Mathematics Scaled Score: Scaled performance in math sections

  • Digital SAT Scaled Score and Percentile: Includes both score and percentile relative to others

Tip: The ring section works best when used to summarize top-level performance, so select metrics that give learners an immediate pulse check.


Box Sections: Detailed Score Reports

What It Is:

Box Sections provide a grid-style performance breakdown. You can add multiple box sections if needed.

How to Configure:

  • Box Section Name: e.g., Box Section 1: Short Diagnostics

  • Select Metrics: Choose from the same list as Ring Section but structure them across row/column pairs for precise analytics.

Suggested Use:

  • Box Section 1: Metric: Digital SAT Short Test Diagnostics Scaled Score and Percentile

  • Box Section 2: Metric: Digital SAT Short Diagnostics Test Math Scaled Score and Percentile

You can even break metrics down by subject, time spent per question, mastery levels, or custom institutional scores—depending on the sophistication of your diagnostic strategy.
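
Putting the Ring and Box sections together, a complete metrics layout might look like this sketch. The structure and field names are hypothetical; the metric names are taken from the recommended list above.

```typescript
// Hypothetical layout: illustrative structure; metric names come from the
// recommended SAT-style list earlier in this guide.
const metricsConfig = {
  ringSection: {
    name: "Ring Section 1",
    metrics: [ // up to four, rendered as concentric rings
      "Digital SAT Scaled Score",
      "Digital SAT Reading and Writing Scaled Score",
      "Digital SAT Mathematics Scaled Score",
      "Digital SAT Scaled Score and Percentile",
    ],
  },
  boxSections: [
    {
      name: "Box Section 1: Short Diagnostics",
      metrics: ["Digital SAT Short Test Diagnostics Scaled Score and Percentile"],
    },
    {
      name: "Box Section 2: Math Diagnostics",
      metrics: ["Digital SAT Short Diagnostics Test Math Scaled Score and Percentile"],
    },
  ],
};
```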


🎯 Best Practices for Metrics

  • Always include at least one scaled score metric in the Ring Section

  • Use Box Sections for detailed insights, especially for instructors during review

  • Name your sections clearly so students can interpret the charts without confusion

  • Avoid adding metrics you don’t plan to explain or act on

Once metrics are set, click Save to finalize your test.


Once your structured test is built and metrics are in place, the final phase involves publishing, assigning, and tracking test delivery. VEGA AI ensures that this process is smooth, time-bound, and completely auditable.

This section covers:

  • Test dashboard management

  • Student preview workflow

  • Assignment flow & access control

  • Important post-assignment constraints


Test Dashboard Overview

From the Structured Tests panel under any course (e.g., DIGITAL_SAT), you’ll see a centralized list of all created tests, with critical summary data:

  • Test Name: Internal title used for filtering/searching

  • Sections: Total logical parts in the test

  • Modules: Total timed blocks inside sections

  • Type: STRUCTURED TEST

  • Action Buttons: Preview, Assign, and Edit options


Previewing the Student Experience

Before deploying, you can always simulate the student view by clicking the Preview button:

  • See each section and module as a learner would

  • Confirm the number of questions per module

  • Ensure instructional copy and layout are clean

  • Identify and resolve any setup issues before it goes live

This is crucial for maintaining test integrity and UX quality.


📤 Assigning the Test

Click Assign to open the assignment modal. This is where you set:

1. Batch/Student Selection

  • Assign to entire batches or individual students

  • Use dropdowns to search & select

  • Ideal for staggered release to different cohorts

2. Date & Time Window

  • Start/End Dates: Define a testing window during which students can access the test

  • Start/End Time: Controls hourly access per day

  • Timezone: Defaults to admin locale; adjust as needed

3. Critical Note

Once assigned, the test is locked and no further changes can be made. This ensures data integrity and consistency across performance tracking.
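
For reference, the settings collected in the assignment modal map onto something like the following sketch; the field names are hypothetical.

```typescript
// Hypothetical sketch: illustrative field names only.
const testAssignment = {
  batches: ["DSAT Cohort A"],    // or individual student IDs
  startDate: "2025-06-01",       // testing window opens
  endDate: "2025-06-07",         // testing window closes
  dailyStartTime: "09:00",       // hourly access within each day
  dailyEndTime: "21:00",
  timezone: "America/New_York",  // defaults to the admin's locale
  locked: true,                  // no edits possible once assigned
};
```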


✅ Recap: Assignment Best Practices

Do:

  • Preview every test before assigning

  • Use clear test titles (e.g., DSAT Mock 12)

  • Match metric titles using recommended patterns

  • Assign in advance with a defined access window

Avoid:

  • Assigning without a time window

  • Renaming tests after assignment

  • Using ambiguous or non-standard metric names

  • Assigning to the wrong cohort without review


With the test created, assigned, and metrics live, you're fully equipped to analyze performance and deliver high-quality assessments that are standardized, scalable, and student-centric.