Get a high-level view of test health across all workspaces in your account on the account-level coverage dashboard. This dashboard aggregates quality metrics from each workspace, helping you identify trends and problem areas without switching between workspaces.
Accessing the dashboard
To access the account-level coverage dashboard:
1. Open the workspace dropdown and click on your company dashboard.
2. Select an account.
3. Click the Quality Metrics tab.
For more information on accessing the company dashboard, see the company dashboard article.
Filtering
To focus on specific workspaces or time periods, expand the Filters section and apply filters as needed:
- Workspace: Select a workspace to include in the dashboard metrics. By default, all workspaces are included.
- Date range: Select a time period for historical data. The default is 30 days, and you can view up to 100 days of data.
Pass rate
Use the Pass rate card to understand the overall health of your test suite at a glance. This metric shows the percentage of test runs that passed out of all executions for the selected date range, aggregated across all workspaces (or filtered workspaces).
The trend indicator on the Pass rate card shows whether the pass rate is improving or declining. A positive trend suggests that your test suite is stabilizing. A negative trend may indicate recent application changes that introduced failures or new flaky tests.
mabl calculates the trend indicator by comparing the average pass rate in the second half of your selected date range to the average in the first half. For example, consider the following 30-day range:
- First 15 days: 88% average pass rate
- Last 15 days: 92% average pass rate
The trend for the 30-day range is 4% (improving).
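As a minimal sketch of this calculation (illustrative only, not mabl's actual implementation), assume a list of daily pass-rate percentages ordered from oldest to newest:

```python
# Illustrative sketch only; mabl computes this server-side.
def pass_rate_trend(daily_pass_rates: list[float]) -> float:
    """Compare the second half of the date range to the first half.

    Positive result = improving, negative = declining.
    """
    mid = len(daily_pass_rates) // 2
    first_avg = sum(daily_pass_rates[:mid]) / mid
    second_avg = sum(daily_pass_rates[mid:]) / (len(daily_pass_rates) - mid)
    return second_avg - first_avg

# The 30-day example above: 15 days at 88%, then 15 days at 92%
rates = [88.0] * 15 + [92.0] * 15
print(pass_rate_trend(rates))  # 4.0 (improving)
```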
Workspace information
The Workspace information section includes Health and Activity tabs. This section appears only when viewing all workspaces (no workspace filter applied).
Use these tables to compare testing across teams and identify which workspaces need attention.
Health
The Health tab shows pass rate and test volume for each workspace for the selected date range:
- Pass rate: percentage of all test runs that passed
- Test runs: total number of test executions
- Failing tests: total number of failed runs
Activity
Use the Activity tab to understand which teams are actively building and maintaining tests:
- Active users: total number of unique users in the workspace
- Unique tests: total number of distinct tests that ran during the date range
- New tests: total number of tests created during the date range
- Updated tests: total number of existing tests that were modified during the date range
For more information on how mabl defines test authoring activity, see the article on workspace usage.
Test run history by type
Use the Test run history by type table to understand the composition of your testing efforts across the account:
- Browser tests
- API tests
- Mobile tests
Categorized failures
The Categorized failures chart shows the most common failure categories across all workspaces.
If your team consistently applies failure reasons to failed tests, you can use this chart to identify systemic issues across your workspaces.
Unique tests run
The Unique tests run chart tracks how many distinct tests ran over the selected date range. Use this chart to monitor the breadth of your test coverage. For example, if your account has 500 total tests, but the chart indicates that you run no more than 200 tests on a given day, you may want to investigate why the other tests aren’t being executed.
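To illustrate the distinct-count semantics (a sketch with hypothetical run records, not mabl's implementation), repeated runs of the same test on the same day count once:

```python
from collections import defaultdict

# Hypothetical run records: (run date, test identifier).
runs = [
    ("2024-05-01", "login"),
    ("2024-05-01", "checkout"),
    ("2024-05-01", "login"),  # repeat run of the same test: counted once
    ("2024-05-02", "login"),
]

# Count distinct tests per day with a set, which ignores duplicate runs.
tests_by_day = defaultdict(set)
for run_date, test_id in runs:
    tests_by_day[run_date].add(test_id)

for run_date in sorted(tests_by_day):
    print(run_date, len(tests_by_day[run_date]))
# 2024-05-01 2
# 2024-05-02 1
```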
Active users over time
The Active users chart shows the number of automators and participants over time, aggregated across workspaces:
- Automators: users with 30 or more test authoring activities in a given month
- Participants: users with fewer than 30 test authoring activities in a given month
An increasing number of automators suggests that more team members are actively building and maintaining tests, while a declining trend may indicate that testing responsibilities are consolidating to fewer people.
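As a minimal sketch of this classification rule, using the 30-activity threshold defined above (illustrative only; mabl applies this logic internally):

```python
# Illustrative only; the 30-activity threshold comes from the
# automator/participant definitions above.
AUTOMATOR_THRESHOLD = 30

def classify_user(authoring_activities_this_month: int) -> str:
    """Classify a user for a given month by authoring activity count."""
    if authoring_activities_this_month >= AUTOMATOR_THRESHOLD:
        return "automator"
    return "participant"

print(classify_user(42))  # automator
print(classify_user(12))  # participant
```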
For more details on how mabl calculates user activity at the account level, see account billing and usage.
Test authoring activity
Review how many tests were created and updated over the selected date range to identify spikes in editing activity and potential maintenance bottlenecks.
Test execution time
Review the time your tests spend executing in the mabl cloud:
- Average: total execution time divided by the number of test runs
- Total: sum of execution time across all workspaces (or filtered workspaces)
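As a minimal sketch of how these two metrics relate (illustrative only, with made-up durations):

```python
# Illustrative only: made-up per-run execution durations, in seconds.
durations_seconds = [120.0, 95.5, 210.0, 60.5]

total = sum(durations_seconds)            # "Total" card
average = total / len(durations_seconds)  # "Average" card
print(f"Total: {total:.1f}s, Average: {average:.1f}s")
# Total: 486.0s, Average: 121.5s
```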