# Frontend Test Cases Guide
> **Note:** This document is a work in progress.
The `test-cases` directory inside `apps/app` contains structured test specifications that define the expected behavior of our application. These specifications serve as a bridge between product requirements and automated tests.
## Overview
The test cases are defined in JSON files and follow a strict schema (defined in `schema.json`). Each test case is identified by a unique ID and contains detailed information about what needs to be tested.
## File Structure
- `schema.json` - Defines the structure and validation rules for test case specifications
- `onboarding.spec.json` - Test specifications for user onboarding flows
- Additional `.spec.json` files for other features
## Test Case Schema
Each test case follows this structure:
```json
{
  "TEST-001": {
    "title": "Test case title",
    "priority": "P0",
    "preconditions": ["List of conditions that must be met"],
    "steps": ["Step-by-step test instructions"],
    "acceptance_criteria": ["List of criteria that must be met"]
  }
}
```
### Fields Explained
- `title`: A descriptive name for the test case
- `priority`: Importance level (P0-P3)
  - P0: Critical path, must not break and must be covered by automated tests
  - P1: Core functionality
  - P2: Important but not critical
  - P3: Nice to have
- `preconditions`: Required setup or state before running the test
- `steps`: Detailed test steps
- `acceptance_criteria`: What must be true for the test to pass
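For readers who prefer types over prose, the shape described above can be mirrored in TypeScript. This is an illustrative sketch only; `schema.json` remains the source of truth, and the type and example names here are hypothetical.

```typescript
// Hypothetical TypeScript mirror of the test case structure.
// schema.json is authoritative; this is for illustration only.
type Priority = "P0" | "P1" | "P2" | "P3";

interface TestCase {
  title: string;
  priority: Priority;
  preconditions: string[];
  steps: string[];
  acceptance_criteria: string[];
}

// A spec file maps test case IDs (e.g. "ONB-001") to test cases.
type SpecFile = Record<string, TestCase>;

// Example entry (contents are made up for illustration).
const example: SpecFile = {
  "ONB-001": {
    title: "User Registration - with email",
    priority: "P0",
    preconditions: ["User is not logged in"],
    steps: ["Open the sign-up page", "Submit a valid email and password"],
    acceptance_criteria: ["User lands on the onboarding screen"],
  },
};
```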
## Priority Levels
- P0: Critical business flows (e.g., user registration, login)
- P1: Core features that significantly impact user experience
- P2: Secondary features that enhance user experience
- P3: Edge cases and nice-to-have features
## Updating Test Cases
1. **Adding a New Test Case:**
   - Choose an appropriate spec file (or create a new one for new features)
   - Add a new entry with a unique ID (format: `XXX-###`)
   - Fill in all required fields according to the schema
   - Validate against `schema.json`
2. **Modifying Existing Test Cases:**
   - Update the relevant fields
   - Ensure changes are reflected in the corresponding automated tests
   - Keep the test ID unchanged
3. **Best Practices:**
   - Keep steps clear and actionable
   - Write acceptance criteria that can be automated
   - Include edge cases and error scenarios
   - Document dependencies between test cases
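The "validate against `schema.json`" step can be automated. A full check would run a JSON Schema validator (such as Ajv) against `schema.json`; as a minimal, dependency-free sketch, the structural rules described above might look like this. The field list and the ID pattern (a loose reading of `XXX-###`) are assumptions, not taken from `schema.json` itself.

```typescript
// Minimal structural check, as a sketch. Real validation should run
// schema.json through a JSON Schema validator; this only mirrors the
// rules described in this guide.
const REQUIRED_FIELDS = [
  "title",
  "priority",
  "preconditions",
  "steps",
  "acceptance_criteria",
] as const;

// Loose reading of the `XXX-###` ID format: letters, a dash, digits.
const ID_PATTERN = /^[A-Z]+-\d+$/;

function checkSpec(spec: Record<string, Record<string, unknown>>): string[] {
  const errors: string[] = [];
  for (const [id, entry] of Object.entries(spec)) {
    if (!ID_PATTERN.test(id)) errors.push(`${id}: invalid ID format`);
    for (const field of REQUIRED_FIELDS) {
      if (entry[field] === undefined) {
        errors.push(`${id}: missing field "${field}"`);
      }
    }
  }
  return errors;
}
```

Running such a check in CI would catch malformed entries before they drift out of sync with the automated tests.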
## Integration with Automated Tests
The test specifications in this directory serve as a source of truth for our automated tests. The relationship works as follows:
- Test specs define WHAT needs to be tested
- Automated tests implement HOW to test it
- Automated tests written for a test case should reference its corresponding test case ID
Example:
```typescript
describe("ONB-001: User Registration - with email", () => {
  it("should complete registration flow successfully", async () => {
    // Test implementation
  });
});
```
## Maintaining Test Coverage
- Every new feature should have corresponding test cases
- Test cases should be reviewed along with code changes
- Regular audits ensure test coverage matches specifications
- Update or deprecate test cases when features change
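Because automated tests reference their test case IDs, a coverage audit can be scripted: collect the IDs from the spec files and flag any that never appear in a test source. The function and sample data below are illustrative, not part of the repo's tooling.

```typescript
// Sketch of a coverage audit: flag spec IDs with no referencing test.
// Assumes tests embed the ID in their describe() title, as in the
// example above. Names and data here are illustrative.
function findUncoveredIds(specIds: string[], testSources: string[]): string[] {
  const allTests = testSources.join("\n");
  return specIds.filter((id) => !allTests.includes(id));
}

const specIds = ["ONB-001", "ONB-002"];
const testSources = [
  `describe("ONB-001: User Registration - with email", () => {});`,
];
findUncoveredIds(specIds, testSources); // → ["ONB-002"]
```

In practice the two inputs would be gathered by reading the `.spec.json` files and the test files from disk; the lookup itself stays this simple.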