Troubleshooting API tests

Tips for troubleshooting unexpected results in API tests

If your API test failed or produced an unexpected outcome, here are some steps you can take to isolate the issue:

Investigate failed steps

By clicking on failing steps on the Test Output page, you can get more information on what happened:

  • The response status appears at the top
  • The Assertions tab shows which assertions failed
  • The Errors tab shows any errors

Inspecting individual steps on the Test Output page

Some questions to guide your investigation include:

  1. What is the expected result of this step?
  2. Does the step use variables generated from a previous step? If so, what happened in the previous step?
  3. Does the API have required fields in the header or body?


Double check variables and assertions

If a step fails on an assertion or variable assignment from a JSON response body, open the API Test Editor and check that the JSON path is correct.
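To sanity-check a JSON path outside the editor, you can walk it against a sample response body. A minimal sketch (the helper function, dotted-path syntax, and response shape below are illustrative assumptions, not mabl's actual path evaluator):

```javascript
// Hypothetical helper: resolve a dotted JSON path (e.g. "data.items.0.id")
// against a parsed response body to confirm the path exists before
// relying on it in an assertion or variable assignment.
function resolveJsonPath(body, path) {
  return path.split('.').reduce(
    (node, key) => (node == null ? undefined : node[key]),
    body
  );
}

// A made-up response body a step might return
const response = { data: { items: [{ id: 'abc-123', status: 'active' }] } };

console.log(resolveJsonPath(response, 'data.items.0.id')); // 'abc-123'
console.log(resolveJsonPath(response, 'data.item.0.id'));  // undefined (typo in path)
```

If the path resolves to `undefined` against a real captured response, the assertion or variable assignment built on it will fail the same way.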

To check the initial and final values of variables in a cloud run, click on View all on the Test Output page.

Find the last passing test

If a failed API test was previously passing, you can use the last passing run as a point of reference. Investigate individual steps to identify differences that might explain the change in outcome between the two runs.

Add console log statements

If you want to get more detail about what is happening in a step, you can add console log statements for debugging purposes.
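For example, a step script might log the values it derives from the response so they appear in the step's output. A hedged sketch (the variable names and response shape below are placeholders, not from any real test):

```javascript
// Placeholder for a parsed response body in a step script
const responseBody = { order: { id: 42, total: 19.99 } };

// Value the step would assign to a variable
const orderId = responseBody.order.id;

// Log intermediate values so they show up when debugging the step
console.log('parsed response body:', JSON.stringify(responseBody));
console.log('orderId assigned from response:', orderId);
```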

Compare local and cloud runs

If an API test yields different results locally than in the cloud, you should check whether mabl can access the URL endpoint.

If a URL endpoint cannot be accessed from the public Internet, here are some options for troubleshooting the networking configuration:

If your workspace uses mabl Link to access the endpoint:

  1. Validate that the link agent is active
    • Go to Settings > Networking to check that the link agent is active.
  2. Ensure that the environment is configured to use the link agent
    • Go to Configuration > Applications
    • Click on the pencil icon next to the environment that should be using the link agent. The Edit Environment page will appear.
    • In the Advanced section, ensure that the "Use link agent" box has been checked and the link agent has been selected from the dropdown.
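One quick way to anticipate this class of failure is to check whether the endpoint's host is a private (RFC 1918) or loopback address, since those are never reachable from the public Internet without a link agent. A rough sketch that handles only IPv4 literals and localhost (an assumption for illustration, not a complete reachability check):

```javascript
// Flag hosts that are only reachable from inside a private network
// (RFC 1918 ranges plus localhost). Such endpoints need a link agent
// for cloud runs to reach them.
function looksPrivate(hostname) {
  if (hostname === 'localhost' || hostname === '127.0.0.1') return true;
  const octets = hostname.split('.').map(Number);
  if (octets.length !== 4 || octets.some(Number.isNaN)) return false; // not an IPv4 literal
  const [a, b] = octets;
  return (
    a === 10 ||                          // 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168)             // 192.168.0.0/16
  );
}

console.log(looksPrivate('10.1.2.3'));        // true: needs a link agent
console.log(looksPrivate('api.example.com')); // false: not a private IPv4 literal
```

Note that a public DNS name can still resolve to a private address, so this is only a first-pass check.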

Compare plan settings to ad-hoc run settings

If an API test is passing in ad-hoc cloud runs but failing in plan runs, compare the plan settings to the ad-hoc run settings.

Plan settings

If a test was part of a plan run, the plan is listed in the Plans column at the top. You can click on the plan name to see more plan details.


Identifying the plan on the Test Output page

Ad-hoc settings

If the API test was run ad hoc, the Test Output page will show "Ad hoc run" in the Plans column. Click on the View all link to view settings for the ad-hoc run.


Viewing settings for an ad-hoc run

Some questions to guide your investigation include:

  • Is the test associated with a DataTable in the plan? (Does this setting differ in ad-hoc runs?)
  • Does the plan use the same api.url as the ad-hoc run?
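The comparison above can be mechanized by diffing the two sets of settings. A sketch assuming you have both captured as plain JSON objects (the keys shown are illustrative, not mabl's actual settings schema):

```javascript
// Diff two flat settings objects (e.g. plan settings vs ad-hoc run
// settings) and surface the keys whose values differ.
function diffSettings(planSettings, adHocSettings) {
  const keys = new Set([
    ...Object.keys(planSettings),
    ...Object.keys(adHocSettings),
  ]);
  const diffs = {};
  for (const key of keys) {
    if (planSettings[key] !== adHocSettings[key]) {
      diffs[key] = { plan: planSettings[key], adHoc: adHocSettings[key] };
    }
  }
  return diffs;
}

// Made-up settings for illustration
const plan  = { 'api.url': 'https://staging.example.com', dataTable: 'orders' };
const adHoc = { 'api.url': 'https://prod.example.com' };

console.log(diffSettings(plan, adHoc));
// Shows that api.url differs and dataTable is only set on the plan
```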

Compare to results in Postman

If a collection was imported from Postman, confirm whether the test is passing there. If it is, identify the differences between the Postman run and the mabl run.