QA Checklist with MCP: Generate Test Steps from Tasks
Generate comprehensive test cases from task descriptions using MCP. This QA workflow shows you how to read a task, create test steps covering functionality and edge cases, and add them as a comment—with variants for regression tests, smoke tests, and integration tests.
What This Workflow Enables
The QA checklist workflow with MCP helps you create thorough test coverage:
Key Outcomes
- Test case generation: Create comprehensive test steps from task descriptions
- Coverage analysis: Identify what needs to be tested (happy path, edge cases, error handling)
- Test type variants: Generate regression, smoke, and integration test cases
- Documentation: Add test cases as comments for QA tracking
- Quality assurance: Ensure tasks have clear, testable outcomes
Prerequisites
Before using this workflow, ensure you have:
- Corcava MCP server configured in your AI assistant
- API key with read access to tasks and comments
- Tasks with clear descriptions and acceptance criteria
- For posting comments: API key with write permissions
Step-by-Step Workflow
Step 1: Read the Task
Start by getting the full task context:
Task Reading Prompt
What the AI does:
- Calls `get_task` to retrieve task details
- Calls `list_task_comments` to get existing context
- Analyzes task description and acceptance criteria
- Presents task context for test case generation
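Under the hood, the assistant issues MCP tool calls to gather this context. The sketch below is illustrative only: `call_tool` is a hypothetical stand-in for your MCP client's tool-invocation method, and the fixture data is invented for the example.

```python
# Hypothetical sketch of Step 1: gather task context via MCP tool calls.
# `call_tool` stands in for a real MCP client dispatcher; here it returns
# canned fixture data so the sketch is runnable on its own.

def call_tool(name, arguments):
    """Stub dispatcher; a real MCP client sends this over the MCP transport."""
    fixtures = {
        "get_task": {
            "id": 123,
            "title": "Implement user authentication",
            "acceptance_criteria": ["User can register", "User can login"],
        },
        "list_task_comments": [
            {"author": "pm", "text": "Scope: email/password auth only"}
        ],
    }
    return fixtures[name]

def read_task_context(task_id):
    """Read the task, then its comments, and bundle both for generation."""
    task = call_tool("get_task", {"task_id": task_id})
    comments = call_tool("list_task_comments", {"task_id": task_id})
    return {"task": task, "comments": comments}

context = read_task_context(123)
```

The assistant then reasons over `context` to propose test cases; nothing is written back at this stage.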
Step 2: Generate Test Cases
Create comprehensive test steps:
Test Case Generation Prompt
What the AI does:
- Analyzes task requirements
- Identifies test scenarios
- Creates detailed test steps
- Formats test cases clearly
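The generation itself is done by the AI, but the target structure is easy to pin down. A minimal sketch, assuming a hypothetical `scaffold_test_cases` helper, of how acceptance criteria map onto the three coverage buckets (happy path, edge cases, error handling):

```python
# Hypothetical sketch: the skeleton the AI fills in when generating test cases.
# One happy-path case per acceptance criterion, plus shared edge/error buckets.

def scaffold_test_cases(criteria):
    """Build empty-but-structured test cases from acceptance criteria."""
    cases = {"happy_path": [], "edge_cases": [], "error_handling": []}
    for criterion in criteria:
        cases["happy_path"].append({
            "name": criterion,
            "steps": f"Exercise the flow for: {criterion}",
            "expected": f"Criterion is met: {criterion}",
        })
    cases["edge_cases"].append({
        "name": "Boundary inputs",
        "expected": "Graceful handling, no crash",
    })
    cases["error_handling"].append({
        "name": "Invalid inputs",
        "expected": "Clear validation error shown",
    })
    return cases

cases = scaffold_test_cases(["User can register", "User can login"])
```

Each entry carries a name, steps, and an expected result, which is the same shape the example session later in this page uses.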
Step 3: Review Test Cases
Review the generated test cases:
Review Prompt
What the AI does:
- Formats test cases for comment
- Ensures all criteria are covered
- Presents formatted test case list
- Waits for approval before posting
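"Ensures all criteria are covered" can be made concrete: check that every acceptance criterion is mentioned by at least one test case. A rough sketch (the substring match is a deliberate simplification, not how the AI actually reviews coverage):

```python
# Hypothetical sketch of the coverage check in Step 3: flag acceptance
# criteria that no generated test case mentions (naive substring match).

def find_uncovered_criteria(criteria, test_cases):
    """Return the criteria that appear in no test case name or steps."""
    covered_text = " ".join(
        tc["name"] + " " + tc.get("steps", "") for tc in test_cases
    ).lower()
    return [c for c in criteria if c.lower() not in covered_text]

criteria = ["User can register", "User can logout"]
test_cases = [{"name": "User can register", "steps": "Submit the signup form"}]
missing = find_uncovered_criteria(criteria, test_cases)
```

A non-empty `missing` list is the signal to ask the AI for additional test cases before approving the comment.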
Step 4: Add Test Cases as Comment (After Approval)
Post the test cases after confirmation:
Post Test Cases Prompt (Requires Approval)
What the AI does:
- Drafts comment with test cases
- Shows preview for approval
- Waits for confirmation token
- Calls `add_task_comment` after approval
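The confirmation-token gate is the key safety property of this step: nothing is written until the user replies with an exact token. A minimal sketch, with a hypothetical `post_fn` standing in for the real `add_task_comment` call:

```python
# Hypothetical sketch of the Step 4 approval gate: the comment is only
# posted after an exact confirmation token, never on a vague "ok".

CONFIRMATION_TOKENS = {"CONFIRM", "YES, ADD COMMENT"}

def post_test_cases(task_id, comment_body, user_reply, post_fn):
    """Post the drafted comment only if the reply is an exact token."""
    if user_reply.strip().upper() not in CONFIRMATION_TOKENS:
        return "Not posted: reply with 'CONFIRM' or 'YES, ADD COMMENT'."
    post_fn(task_id, comment_body)  # stands in for add_task_comment
    return f"Posted QA test cases comment to task #{task_id}"

posted = []
result = post_test_cases(
    123, "QA Test Cases: ...", "CONFIRM",
    lambda tid, body: posted.append((tid, body)),
)
```

Note that a casual "ok" or "sure" does not pass the gate; this is why the troubleshooting section below tells you to use the exact confirmation wording.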
Complete Workflow Prompts
Here are complete, copy-paste ready prompts:
Full QA Test Case Generation
Use this when: You want comprehensive test case generation
Quick Test Case Generation
Use this when: You want quick test case generation
Workflow Variants by Test Type
Variant 1: Regression Test Cases
Generate regression test cases that verify existing functionality:
What happens: AI generates regression-focused test cases
Variant 2: Smoke Test Cases
Generate quick smoke test cases for basic verification:
What happens: AI generates minimal smoke test cases
Variant 3: Integration Test Cases
Generate integration test cases for system interactions:
What happens: AI generates integration-focused test cases
Variant 4: End-to-End Test Cases
Generate end-to-end test cases for complete user workflows:
What happens: AI generates comprehensive end-to-end test cases
Example: Complete QA Test Case Session
Here's a real example of how a QA test case generation session flows:
User Prompt
AI Response (Task Details)
Task #123: Implement user authentication
Description: Add user login/logout functionality with email and password authentication.
Acceptance Criteria:
- User can register with email and password
- User can login with credentials
- User can logout
- Session management works correctly
AI Generated Test Cases
QA Test Cases (Ready to Post):
Happy Path Tests:
- User Registration: Steps: Navigate to register page, enter valid email and password, submit. Expected: User account created, redirect to login.
- User Login: Steps: Enter registered email and password, click login. Expected: User logged in, session created, redirect to dashboard.
- User Logout: Steps: Click logout button. Expected: Session terminated, redirect to login page.
Edge Cases:
- Duplicate Email Registration: Steps: Register with existing email. Expected: Error message, account not created.
- Invalid Login Credentials: Steps: Enter wrong password. Expected: Error message, no session created.
- Session Expiry: Steps: Wait for session timeout, try to access protected page. Expected: Redirect to login, session expired message.
Error Handling:
- Invalid Email Format: Steps: Enter invalid email format. Expected: Validation error, registration/login blocked.
- Weak Password: Steps: Enter password that doesn't meet requirements. Expected: Validation error, password requirements shown.
Type 'CONFIRM' or 'YES, ADD COMMENT' to post these test cases.
User Approval
AI Action (After Approval)
✅ Posted QA test cases comment to task #123
Test cases have been added to the task for QA tracking.
Safety Patterns
✅ Preview and Approval Workflow
This workflow uses preview and approval for all write operations:
- Read-first: Always read task details before generating test cases
- Preview test cases: Show all test cases before posting
- Confirmation tokens: Require "CONFIRM" or "YES, ADD COMMENT" before posting
- Review step: Allow review and editing before posting
- Clear formatting: Test cases formatted clearly for readability
Troubleshooting
Task Not Found
Symptom: Cannot retrieve task details
Possible causes:
- Task ID incorrect
- Task in different workspace
- API key lacks read access
Fix: Verify task ID and API key permissions. Check workspace access.
Test Cases Too Generic
Symptom: Generated test cases are too vague
Possible causes:
- Task description lacks detail
- Acceptance criteria missing
Fix: Ask for more specific test cases: "Generate detailed test cases with specific steps and expected results"
Comments Not Being Posted
Symptom: AI doesn't post comments even after approval
Possible causes:
- Confirmation token not recognized
- API key lacks write permissions
- Task ID incorrect
Fix: Use exact confirmation: "CONFIRM" or "YES, ADD COMMENT". Verify API key has write access.
Related Tools
This workflow uses these Corcava MCP tools:
- `get_task`: Read task details and requirements
- `list_task_comments`: Review existing comments for context
- `add_task_comment`: Post test cases as comments
Related Use Cases
- Acceptance Criteria: Generate acceptance criteria before test cases
- Bug Triage: Use test cases to verify bug fixes
