Azure DevOps Test Reports Explained (With Examples)
A complete guide to Azure DevOps test reports — pipeline test results, Azure Test Plans reports, progress reports, and requirements coverage. Includes examples and how to use reports to make release decisions.
Azure DevOps produces test reports from two sources: Azure Pipelines (automated test results) and Azure Test Plans (manual test execution). Understanding both — and how they combine — gives you a complete quality picture for every sprint.
Pipeline test results
Every pipeline run with a PublishTestResults task shows results in the Tests tab.
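A minimal publishing step looks like the sketch below. The test command and the `testResultsFiles` glob are assumptions — adjust them to whatever your framework runs and wherever it writes its report:

```yaml
steps:
  # Run the tests first; continue on failure so results still get published
  - script: npm test -- --reporter junit --reporter-options output=results/junit.xml
    displayName: Run tests
    continueOnError: true

  # Publish the JUnit XML so the Tests tab is populated
  - task: PublishTestResults@2
    condition: succeededOrFailed()   # publish even when tests fail
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: '**/junit.xml'
      searchFolder: '$(System.DefaultWorkingDirectory)'
      testRunTitle: 'Sprint regression'
```

The `condition: succeededOrFailed()` line matters: without it, a failing test step skips the publish step and the Tests tab stays empty for exactly the runs you most want to inspect.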
What the Tests tab shows
```
Tests tab — Sprint 9 Regression — Build #1047

Total tests:  284
Passed:       279 (98.2%)
Failed:       3
Not executed: 2
Duration:     12m 34s

Failed tests:
  ✗ checkout › Apply discount code › Expired code error message
      Expected: "This code has expired" | Got: "Invalid code"
  ✗ auth › Password reset › Reset email sent confirmation
      Timeout: Element not visible after 15000ms
  ✗ profile › Avatar upload › Reject oversized file
      Expected status 413 | Got 500
```
Test analytics (trend over time)
Go to Pipelines → [Pipeline] → Analytics → Tests for trend charts:
- Pass rate over the last 30 days
- Test count trend (growing? stable?)
- Top failing tests (consistently failing tests)
- Slowest tests (candidates for optimisation)
- Flaky tests (fail rate between 5% and 95%)
The Top Failing Tests view is the most actionable — it shows which tests fail most often, with a direct link to each failure's history.
Azure Test Plans reports
Progress Report
The Progress Report shows manual test execution status for a test plan.
Go to Test Plans → [Plan] → Progress Report:
```
Sprint 9 Test Plan Progress

Checkout Suite:
  Requirements: 5 user stories
  Test cases:   32
  Executed:     32 (100%)
  Passed:       30 (93.8%)
  Failed:       2 → 2 open bugs (P2)

Auth Suite:
  Requirements: 3 user stories
  Test cases:   18
  Executed:     18 (100%)
  Passed:       18 (100%)

Regression Suite:
  Test cases:   46
  Executed:     46 (100%)
  Passed:       46 (100%)

Sprint 9 Summary:
  Total test cases:  96
  Executed:          96 (100%)
  Overall pass rate: 97.9%
  Open bugs:         2 (both P2, deferred to Sprint 10)
  Sign-off:          ✓ APPROVED
```
Requirements Coverage Report
Go to Test Plans → [Plan] → Requirements tab:
| User Story | Status | Test Cases | Passed | Failed | Open Bugs |
|---|---|---|---|---|---|
| #312 Wishlist | Active | 12 | 12 | 0 | 0 |
| #315 Checkout | Active | 8 | 6 | 2 | 2 |
| #318 Auth | Closed | 6 | 6 | 0 | 0 |
This is your requirements traceability matrix. Export to Excel for stakeholders.
Test Results Trend widget
Add the Test Results Trend widget to your team dashboard:
- Go to Overview → Dashboards → [Dashboard] → Edit
- Click + Add widget → Test Results Trend (Advanced)
- Configure:
  - Pipeline: your regression pipeline
  - Date range: last 30 runs
  - Metrics: pass rate, failed count
This widget shows at a glance whether quality is improving or declining over time.
Making release decisions from reports
Go / No-Go checklist
Before releasing to production, QA engineers review:
```
Release: v2.5 — Sprint 9

☑ All P1/P2 test cases executed and passed
☑ Regression suite: 46/46 passed (100%)
☑ New feature test cases: 50/52 passed (96.2%)
☑ P1/P2 bugs: 0 open
☐ P3 bugs: 2 open (deferred — PO approval documented)
☑ Performance test: page load < 2s ✓
☑ Pipeline pass rate: 98.2% (last 3 runs all green)

Decision: APPROVED FOR RELEASE
Sign-off: Jane Smith (QA Lead) — 2025-09-14
```
Document the sign-off as a comment on the release work item in Azure Boards.
Common errors and fixes
Error: Test results don't appear in the Tests tab after pipeline runs
Fix: The PublishTestResults task must both find the XML file and parse it as valid JUnit format. Test with a minimal file:

```xml
<testsuites>
  <testsuite name="test" tests="1">
    <testcase name="example" classname="test"/>
  </testsuite>
</testsuites>
```
Error: Progress Report shows 0% even after manual test execution
Fix: Tests must be run through Test Runner (not manually marked). Test Runner creates a Test Run record that the Progress Report reads; marking test cases directly in the grid does not create a run.
Error: Analytics tab shows no data for the pipeline
Fix: Analytics requires at least 2 pipeline runs with published test results. Wait for a second run. Also check that testResultsFormat matches your actual XML format (JUnit vs NUnit vs TRX).
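When the format is mismatched, the task often publishes zero results without raising an error, so the symptom is silence rather than a failure. A sketch of the common mappings (the file globs are illustrative — match them to your framework's actual output):

```yaml
# Pick the testResultsFormat that matches what your framework emits
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'       # Jest, pytest, Maven Surefire, etc.
    testResultsFiles: '**/junit*.xml'

# NUnit XML output:
#   testResultsFormat: 'NUnit'
#   testResultsFiles: '**/TestResult.xml'

# Visual Studio / dotnet test .trx files:
#   testResultsFormat: 'VSTest'
#   testResultsFiles: '**/*.trx'
```

Note that TRX files use the `VSTest` format value, not a `TRX` one — a frequent source of empty Tests tabs in .NET pipelines.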
Error: Requirements tab is empty despite having linked test cases
Fix: Check that the test plan's Area Path and Iteration match the user stories. Requirements in different areas or sprints don't appear in the Coverage tab.