The following example provides some bash scripts for using the mabl Reporting API batch results endpoint to retrieve a set of test run results and transform them into a CSV format.
Setup
To execute the example scripts, you need a command line capable of running bash scripts, with a couple of common utilities installed:
- curl
- sed
- jq: available for a variety of platforms
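As a quick sanity check, you can verify that the required utilities are on your PATH before running the scripts. This is a minimal sketch; the loop simply reports any missing command:

```shell
# Report any of the required utilities that are missing from the PATH
for cmd in curl sed jq; do
  command -v "$cmd" >/dev/null 2>&1 || echo "Missing required utility: $cmd"
done
```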
Retrieving test results
Save the following bash script as mabl_results_to_csv.sh
This script transforms the JSON results of the mabl batch results endpoint into a CSV format. You can edit the fields list in the script if you do not want all fields or want to change their order.
#!/bin/bash

# Check if the -h flag was provided to get usage help
if [[ "$1" == "-h" || "$1" == "-help" || "$1" == "--help" || "$1" == "help" ]]; then
  echo "Transform mabl batch results JSON (provided on STDIN) into a CSV format with an optional header row (included if -p is provided as an argument)"
  echo "Example usage: cat results.json | $0 -p > results.csv"
  exit 0
fi

# Check whether the -p flag was provided to print the header
if [ "$1" == "-p" ]; then
  print_header=true
else
  print_header=false
fi
# Extract the test_results array using jq
results=$(jq -r '.test_results')
# Define the list of variable names to extract
fields=(
'application_id'
'application_name'
'environment_id'
'environment_name'
'initial_url'
'scenario_name'
'browser'
'browser_version'
'execution_runner_type'
'plan_id'
'plan_name'
'plan_run_id'
'test_id'
'test_version'
'test_name'
'test_type'
'branch'
'test_run_id'
'test_run_app_url'
'is_ad_hoc_run'
'failure_category'
'start_time'
'end_time'
'run_time'
'status'
'success'
'trigger_type'
'triggering_deployment_event_id'
'emulation_mode'
'metrics.cumulative_speed_index'
'metrics.cumulative_api_response_time'
'metrics.accessibility_rule_violations.critical'
'metrics.accessibility_rule_violations.serious'
'metrics.accessibility_rule_violations.moderate'
'metrics.accessibility_rule_violations.minor'
)
# Print the header as a comma-separated list of field names
if [ "$print_header" == true ]; then
  echo "${fields[*]}" | tr ' ' ','
fi
# Loop through each result in the array and output a flat CSV row. Each result is
# base64-encoded and decoded inside the loop so that spaces and special characters
# in the JSON do not break the word splitting of the for loop.
for row in $(echo "${results}" | jq -r '.[] | @base64'); do
  decoded_row=$(echo "${row}" | base64 --decode)
  fields_values=()
  for field in "${fields[@]}"; do
    # Extract the field, printing an empty string instead of "null" for missing values
    # (a plain "// empty" would also swallow legitimate false values, such as success)
    value=$(echo "$decoded_row" | jq -r ".$field | if . == null then \"\" else . end")
    value=$(echo "$value" | sed 's/,/_/g') # replace commas with underscores
    fields_values+=("$value")
  done
  # Join field values with comma separators
  printf -v joined '%s,' "${fields_values[@]}"
  echo "${joined%,}"
done
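The base64 round trip in the loop above can look opaque, so here is a minimal standalone illustration of the same technique with a made-up one-element result array (the field names are illustrative, not the full mabl schema):

```shell
# Encode each JSON object as base64 so that for-loop word splitting cannot break it,
# then decode inside the loop and extract fields with jq
rows='[{"test_name":"Login test, smoke","status":"completed"}]'
for row in $(echo "$rows" | jq -r '.[] | @base64'); do
  decoded=$(echo "$row" | base64 --decode)
  name=$(echo "$decoded" | jq -r '.test_name' | sed 's/,/_/g') # commas become underscores
  status=$(echo "$decoded" | jq -r '.status')
  echo "$name,$status"
done
# prints: Login test_ smoke,completed
```

Without the base64 step, the space inside "Login test, smoke" would split the JSON object across multiple loop iterations.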
After saving the script, you can make it executable from the bash command prompt with: chmod +x mabl_results_to_csv.sh
This script is not very fast because of the repeated jq invocations; expect it to take around one to two minutes per 100 run results.
Call the script from a bash command line to process results from the mabl Reporting API and save them to a CSV. Replace <WORKSPACE_ID> with your workspace ID and <API_KEY> with a mabl Viewer API key.
curl "https://api.mabl.com/results/workspace/<WORKSPACE_ID>/testRuns?advanced_metrics=true" \
  -u "key:<API_KEY>" \
  | ./mabl_results_to_csv.sh -p \
  > results.csv
Retrieving additional results
To dump a large quantity of results into a file, you have to call the endpoint repeatedly to get multiple pages of results. Save the following bash script as mabl_get_batch_results.sh; it uses the first script and does this paging for you:
#!/bin/bash

# Check the number of arguments and print usage help if incorrect
if [[ $# -lt 3 || $# -gt 4 ]]; then
  echo "Usage: $0 <workspace ID> <API key> <output filename> [query parameter string]"
  echo "Example: $0 'MY-WORKSPACE-ID-w' 'API-KEY' 'results.csv' 'test_id=MY-TEST-ID-j&earliest_run_start_time=1677587852400'"
  exit 1
fi
workspace_id=$1
api_key=$2
output_file=$3
endpoint="https://api.mabl.com/results/workspace/$workspace_id/testRuns?"
# Add given query parameters to the API endpoint URL
if [ "$4" ]; then
  endpoint+="$4"
fi
# Set initial cursor value to null
cursor="null"
# Write the header row, starting a fresh output file
echo "" | ./mabl_results_to_csv.sh -p > "$output_file"
# Loop until there is no more data to retrieve
while : ; do
  echo "Getting batch of results with cursor = $cursor"
  # Make the API call with the current cursor value
  if [[ $cursor != "null" ]]; then
    response=$(curl -s "$endpoint&cursor=$cursor" -u "key:$api_key")
  else
    response=$(curl -s "$endpoint" -u "key:$api_key")
  fi
  # Extract the cursor for the next page from the API response
  cursor=$(echo "$response" | jq -r '.cursor')
  # Process data from the API response
  echo "Processing batch of results with cursor = $cursor"
  echo "$response" | ./mabl_results_to_csv.sh >> "$output_file"
  echo "Finished processing batch of results with cursor = $cursor"
  # Stop once the API no longer returns a cursor
  [[ $cursor != "null" ]] || break
done
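The cursor loop's termination logic can be exercised without calling the real API. This sketch replaces the curl call with a made-up fetch_page function that returns a cursor for two pages and then null, mimicking the shape of the paged response:

```shell
# Simulate a paged API: two pages with cursors, then a final page with cursor = null
page=0
fetch_page() {
  page=$((page + 1))
  if [ "$page" -lt 3 ]; then
    echo "{\"cursor\":\"cursor-$page\",\"test_results\":[]}"
  else
    echo '{"cursor":null,"test_results":[]}'
  fi
}

cursor="null"
while : ; do
  response=$(fetch_page)
  cursor=$(echo "$response" | jq -r '.cursor')
  echo "fetched page with next cursor = $cursor"
  [[ $cursor != "null" ]] || break # stop once the API stops returning a cursor
done
```

jq's -r flag prints the JSON null as the literal string "null", which is why the loop compares against that string.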
After saving the script, make it executable from a bash command prompt with: chmod +x mabl_get_batch_results.sh
Call the script from the command line to retrieve multiple pages of results from the mabl Reporting API and save them to a CSV file:
./mabl_get_batch_results.sh <WORKSPACE_ID> <API_KEY> <OUT_FILE> '<QUERY_PARAMETER_STRING>'
The script takes the following arguments:
- <WORKSPACE_ID>: replace with your workspace ID
- <API_KEY>: have a workspace owner create a mabl Viewer API key (or use an existing one) and replace <API_KEY> with the key secret
- <OUT_FILE>: replace with the name of the output file, such as results.csv
- <QUERY_PARAMETER_STRING>: use this optional argument to filter your results. See the API documentation for the batch run results endpoint to find the available query parameters and build an appropriate query parameter string to replace <QUERY_PARAMETER_STRING>.
For example, to specify all runs within a time range, use https://www.epochconverter.com/ to get the desired start and end timestamps in milliseconds and use them in a query parameter string as follows: earliest_run_start_time=1677506400000&latest_run_start_time=1677592800000
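If you prefer to compute the timestamps on the command line instead, GNU date can produce epoch seconds, which you then multiply by 1000. This assumes GNU date, as found on most Linux systems; the BSD/macOS date flags differ:

```shell
# Build a time-range query parameter string from human-readable UTC dates (GNU date)
start_ms=$(( $(date -u -d '2023-02-27 00:00:00' +%s) * 1000 ))
end_ms=$(( $(date -u -d '2023-02-28 00:00:00' +%s) * 1000 ))
echo "earliest_run_start_time=$start_ms&latest_run_start_time=$end_ms"
# prints: earliest_run_start_time=1677456000000&latest_run_start_time=1677542400000
```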