ss-tools/.specify/templates/test-docs-template.md
---
description: Test documentation template for feature implementation
---

# Test Documentation: [FEATURE NAME]

**Feature**: [Link to spec.md] | **Created**: [DATE] | **Updated**: [DATE] | **Tester**: [Agent/User Name]


## Overview

[Brief description of what this feature does and why testing is important]

**Test Strategy**:

- Unit Tests (co-located in `__tests__/` directories)
- Integration Tests (if needed)
- E2E Tests (if critical user flows)
- Contract Tests (for API endpoints and semantic contract boundaries)
- Semantic Contract Verification (`@PRE`, `@POST`, `@SIDE_EFFECT`, `@DATA_CONTRACT`, `@TEST_*`)
- UX Contract Verification (`@UX_STATE`, `@UX_FEEDBACK`, `@UX_RECOVERY`, `@UX_REACTIVITY`)
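As a sketch of what semantic contract verification can look like in practice (the `apply_discount` function and its tags are hypothetical, invented here for illustration), a `@PRE` guard and a `@POST` bound can each be exercised by a dedicated unit test:

```python
def apply_discount(price: float, rate: float) -> float:
    """Apply a percentage discount to a price.

    @PRE: 0.0 <= rate <= 1.0
    @POST: result <= price
    """
    if not 0.0 <= rate <= 1.0:  # enforce the @PRE guard explicitly
        raise ValueError("rate must be between 0.0 and 1.0")
    return price * (1.0 - rate)


def test_pre_rejects_out_of_range_rate():
    # A @PRE violation must raise, not silently compute a wrong value
    try:
        apply_discount(100.0, 1.5)
    except ValueError:
        return
    raise AssertionError("expected ValueError for rate > 1.0")


def test_post_result_never_exceeds_price():
    # @POST: the discounted price is bounded by the original price
    assert apply_discount(100.0, 0.25) <= 100.0
```

Each contract tag maps to at least one test case, which is what the coverage matrix below is meant to make visible.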

## Test Coverage Matrix

| Module | File | Unit Tests | Coverage % | Status |
|--------|------|------------|------------|--------|
| [Module Name] | `path/to/file.py` | [x] | [XX%] | [Pass/Fail] |
| [Module Name] | `path/to/file.svelte` | [x] | [XX%] | [Pass/Fail] |

## Test Cases

### [Module Name]

**Target File**: `path/to/module.py`

| ID | Test Case | Type | Expected Result | Status |
|----|-----------|------|-----------------|--------|
| TC001 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |
| TC002 | [Description] | [Unit/Integration] | [Expected] | [Pass/Fail] |

## Test Execution Reports

### Report [YYYY-MM-DD]

**Executed by**: [Tester] | **Duration**: [X] minutes | **Result**: [Pass/Fail]

**Summary**:

- Total Tests: [X]
- Passed: [X]
- Failed: [X]
- Skipped: [X]

**Failed Tests**:

| Test | Error | Resolution |
|------|-------|------------|
| [Test Name] | [Error Message] | [How Fixed] |

## Anti-Patterns & Rules

### DO

1. Write tests BEFORE implementation when the workflow permits it
2. Use co-location: `src/module/__tests__/test_module.py`
3. Use `MagicMock` for external dependencies (DB, auth, APIs)
4. Trace tests to semantic contracts and DTO boundaries, not just filenames
5. Test edge cases and error conditions
6. Test UX contracts for Svelte components (`@UX_STATE`, `@UX_FEEDBACK`, `@UX_RECOVERY`, `@UX_REACTIVITY`)
7. For Complexity 5 boundaries, verify `@DATA_CONTRACT`, invariants, and declared `@TEST_*` metadata
8. For Complexity 4/5 Python flows, verify behavior around guards, side effects, and belief-state-driven logging paths where applicable
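A minimal sketch of rules 2–3 (the `UserService` class and its repository interface are hypothetical, not from any specific module): the external dependency is replaced with a `MagicMock`, so the unit test never touches a real database.

```python
from unittest.mock import MagicMock


# Hypothetical service under test; per rule 2 it would live in
# src/users/service.py with tests in src/users/__tests__/test_service.py.
class UserService:
    def __init__(self, repo):
        self.repo = repo  # external dependency (DB access)

    def display_name(self, user_id: int) -> str:
        user = self.repo.get_user(user_id)
        return user["name"].title() if user else "Unknown"


def test_display_name_formats_repo_result():
    repo = MagicMock()
    repo.get_user.return_value = {"name": "ada lovelace"}  # canned DB row
    service = UserService(repo)

    assert service.display_name(1) == "Ada Lovelace"
    repo.get_user.assert_called_once_with(1)  # verify the interaction


def test_display_name_handles_missing_user():
    repo = MagicMock()
    repo.get_user.return_value = None  # simulate a DB miss (rule 5)
    assert UserService(repo).display_name(42) == "Unknown"
```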

### DON'T

1. Delete existing tests (only update them if they fail)
2. Duplicate tests (check for existing tests first)
3. Test implementation details instead of behavior
4. Use real external services in unit tests
5. Skip error-handling tests
6. Skip UX contract tests for critical frontend components
7. Treat a legacy `@TIER` tag as sufficient proof of test scope without checking actual complexity and contract metadata
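To illustrate DON'T #3 with a hypothetical `Cart` class: the brittle variant pins an internal attribute and breaks on any refactor of the storage, while the behavioral variant checks only the observable result through the public API.

```python
class Cart:
    """Hypothetical cart; `_items` is an internal detail that may change."""

    def __init__(self):
        self._items = []

    def add(self, price: float) -> None:
        self._items.append(price)

    def total(self) -> float:
        return sum(self._items)


def test_total_brittle():
    # DON'T: reaches into _items; fails if storage becomes a dict or DB row
    cart = Cart()
    cart.add(2.5)
    assert cart._items == [2.5]


def test_total_behavioral():
    # DO: asserts only the observable behavior of the public API
    cart = Cart()
    cart.add(2.5)
    cart.add(1.5)
    assert cart.total() == 4.0
```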

## UX Contract Testing (Frontend)

### UX States Coverage

| Component | @UX_STATE | @UX_FEEDBACK | @UX_RECOVERY | Tests |
|-----------|-----------|--------------|--------------|-------|
| [Component] | [states] | [feedback] | [recovery] | [status] |

### UX Test Cases

| ID | Component | UX Tag | Test Action | Expected Result | Status |
|----|-----------|--------|-------------|-----------------|--------|
| UX001 | [Component] | `@UX_STATE: Idle` | [action] | [expected] | [Pass/Fail] |
| UX002 | [Component] | `@UX_FEEDBACK` | [action] | [expected] | [Pass/Fail] |
| UX003 | [Component] | `@UX_RECOVERY` | [action] | [expected] | [Pass/Fail] |

### UX Test Examples

```js
// Assumes Vitest + @testing-library/svelte + @testing-library/jest-dom;
// FormComponent is a placeholder for the component under test.
import { render, fireEvent, screen } from '@testing-library/svelte';
import { it, expect } from 'vitest';
import '@testing-library/jest-dom';
import FormComponent from './FormComponent.svelte';

// Testing @UX_STATE transition
it('should transition from Idle to Loading on submit', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByTestId('form')).toHaveClass('loading');
});

// Testing @UX_FEEDBACK
it('should show error toast on validation failure', async () => {
  render(FormComponent);
  await fireEvent.click(screen.getByText('Submit'));
  expect(screen.getByRole('alert')).toHaveTextContent('Validation error');
});

// Testing @UX_RECOVERY
it('should allow retry after error', async () => {
  render(FormComponent);
  // Trigger error state
  await fireEvent.click(screen.getByText('Submit'));
  // Click retry
  await fireEvent.click(screen.getByText('Retry'));
  expect(screen.getByTestId('form')).not.toHaveClass('error');
});
```

## Notes

- [Additional notes about testing approach]
- [Known issues or limitations]
- [Recommendations for future testing]