
Frontend Test Cases Guide

THIS DOCUMENT IS STILL A WIP…

The test-cases directory inside apps/app contains structured test specifications that define the expected behavior of our application. These specifications serve as a bridge between product requirements and automated tests.

Overview

The test cases are defined in JSON files and follow a strict schema (defined in schema.json). Each test case is identified by a unique ID and contains detailed information about what needs to be tested.

File Structure

  • schema.json - Defines the structure and validation rules for test case specifications
  • onboarding.spec.json - Test specifications for user onboarding flows
  • Additional .spec.json files for other features

Test Case Schema

Each test case follows this structure:

{
  "TEST-001": {
    "title": "Test case title",
    "priority": "P0",
    "preconditions": ["List of conditions that must be met"],
    "steps": ["Step-by-step test instructions"],
    "acceptance_criteria": ["List of criteria that must be met"]
  }
}

Fields Explained

  • title: A descriptive name for the test case
  • priority: Importance level (P0-P3)
    • P0: Critical path, must not break and must be covered by automated tests
    • P1: Core functionality
    • P2: Important but not critical
    • P3: Nice to have
  • preconditions: Required setup or state before running the test
  • steps: Detailed test steps
  • acceptance_criteria: What must be true for the test to pass
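For illustration, a complete entry following this schema might look like the following (the ID matches the example used later in this guide; the field contents are hypothetical):

```json
{
  "ONB-001": {
    "title": "User Registration - with email",
    "priority": "P0",
    "preconditions": ["User is logged out", "No account exists for the test email"],
    "steps": [
      "Open the signup page",
      "Enter a valid email and password",
      "Submit the form"
    ],
    "acceptance_criteria": [
      "A confirmation email is sent",
      "The user lands on the onboarding screen"
    ]
  }
}
```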

Priority Levels

  • P0: Critical business flows (e.g., user registration, login)
  • P1: Core features that significantly impact user experience
  • P2: Secondary features that enhance user experience
  • P3: Edge cases and nice-to-have features

Updating Test Cases

  1. Adding a New Test Case:

    • Choose an appropriate spec file (or create a new one for new features)
    • Add a new entry with a unique ID (format: XXX-###)
    • Fill in all required fields according to the schema
    • Validate against schema.json
  2. Modifying Existing Test Cases:

    • Update the relevant fields
    • Ensure changes are reflected in the corresponding automated tests
    • Keep the test ID unchanged
  3. Best Practices:

    • Keep steps clear and actionable
    • Write acceptance criteria that can be automated
    • Include edge cases and error scenarios
    • Document dependencies between test cases
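The "validate against schema.json" step can be automated. The sketch below is a simplified stand-in that only checks field presence and shapes; a real setup would validate spec files against schema.json with a proper JSON Schema library (e.g. ajv). The spec entry shown is hypothetical.

```javascript
// Minimal sanity check of a spec object against the required fields
// described above. This is NOT full JSON Schema validation -- it only
// verifies ID format, required fields, priority values, and array shapes.
const REQUIRED = ["title", "priority", "preconditions", "steps", "acceptance_criteria"];
const PRIORITIES = new Set(["P0", "P1", "P2", "P3"]);

function checkSpec(spec) {
  const errors = [];
  for (const [id, tc] of Object.entries(spec)) {
    // IDs follow the XXX-### format (e.g. ONB-001)
    if (!/^[A-Z]+-\d{3}$/.test(id)) errors.push(`${id}: bad ID format`);
    for (const field of REQUIRED) {
      if (!(field in tc)) errors.push(`${id}: missing "${field}"`);
    }
    if (tc.priority && !PRIORITIES.has(tc.priority)) errors.push(`${id}: invalid priority`);
    for (const field of ["preconditions", "steps", "acceptance_criteria"]) {
      if (field in tc && !Array.isArray(tc[field])) errors.push(`${id}: "${field}" must be an array`);
    }
  }
  return errors;
}

// Example usage with an in-memory (hypothetical) spec entry:
const spec = {
  "ONB-001": {
    title: "User Registration - with email",
    priority: "P0",
    preconditions: ["User is logged out"],
    steps: ["Open the signup page", "Submit a valid email and password"],
    acceptance_criteria: ["User lands on the onboarding screen"],
  },
};
console.log(checkSpec(spec)); // []
```

A check like this can run in CI so that a malformed spec file fails the build before any automated test is written against it.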

Integration with Automated Tests

The test specifications in this directory serve as a source of truth for our automated tests. The relationship works as follows:

  1. Test specs define WHAT needs to be tested
  2. Automated tests implement HOW to test it
  3. Each automated test should reference the ID of the test case it implements

Example:

describe("ONB-001: User Registration - with email", () => {
  it("should complete registration flow successfully", async () => {
    // Test implementation
  });
});

Maintaining Test Coverage

  1. Every new feature should have corresponding test cases
  2. Test cases should be reviewed along with code changes
  3. Regular audits ensure test coverage matches specifications
  4. Update or deprecate test cases when features change
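One way to sketch the audit in step 3: since every automated test references its test case ID, a script can flag spec IDs that never appear in any test source. The inputs here are inlined for illustration; a real audit would read the IDs from apps/app/test-cases/*.spec.json and scan the test directory on disk.

```javascript
// Return the spec IDs that no test source file mentions.
// specIds: array of test case IDs defined in the spec files.
// testSources: contents of the automated test files.
function auditCoverage(specIds, testSources) {
  const combined = testSources.join("\n");
  return specIds.filter((id) => !combined.includes(id));
}

// Hypothetical example: ONB-001 is covered, ONB-002 is not.
const specIds = ["ONB-001", "ONB-002"];
const testSources = [
  'describe("ONB-001: User Registration - with email", () => {});',
];
console.log(auditCoverage(specIds, testSources)); // ["ONB-002"]
```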