Swagger contracts & Postman schema validation

As developers or testers, we often find inconsistencies between what one party develops and what the other party expects. Looking at APIs, it can happen that the providing party returns a decimal where the consuming party expects an integer, leading to differences in the available data or, in the worst case, uncaught errors. To fight this, the providing party usually plans to finish its work before the consuming party starts developing. This way, testing can be done, expectations can be met and bugs can be fixed. But… this way of working takes time. If it concerns a pressing issue or an important business feature, we want to release it to our customers as soon as possible. This blog explains how to use Swagger contracts and Postman schema validation so both parties can work from the same agreement.

Table of Contents

Contracts

Where do we start when writing up a contract?

Creating a JSON Schema

Adding Schema validation to Postman tests

Create a new GET Request

Create JSON Schema(s) in Pre-request Script

Use the JSON Schema(s) to validate the contract in your tests

Watch it pass!

Make it fail!

Watch it fail!

Recap

Contracts

So, what can be done? A possible solution to these challenges would be contract based testing. Contract based testing starts with, duh, a contract. Before either party starts developing, analysts and developers (or a selection of them) start drawing what the API should look like. What responses can we expect? What attributes can we expect? In what format will they be? Will they be arrays or objects?

Once all is decided, this concept contract is discussed and assessed by both parties (or more, if more parties are involved). Once the contract is agreed, this contract can be used as a basis for all parties involved. We know what will be requested, we know what will be returned. And if anything differs, we have the contract as a solid agreement and either of the parties will have to make changes to abide by the contract once again.

Where do we start when writing up a contract?

My team and I quickly turn to Swagger.io, an online editor that shows the contract code on the left-hand side of the screen and a visual interpretation (plus possible errors, if the tool finds any) on the right-hand side. We create OpenAPI3.0 documentation with Swagger because our backend (and endpoints) are JSON based. Swagger provides a great overview and interface to work with. And in my experience, anyone who has an example file to play with (which I will provide further on in this blog) can edit, extend (and copy-paste) it to create a contract that suits their own endpoint(s).

For the examples we are using the free-to-use Catfacts API.

Now, why don’t we head to Swagger.io and paste the code below into the left-hand panel:

Download the example code CatfactsExample.YAML right here — Note that, as with the other files I wanted to upload here, the file format is currently *.txt. You can manually change it to *.yaml if you want to use it on Confluence with the Swagger UI macro.

Open the file (with Notepad++ for example).

Preview snippet of the code in this file (this is just part of the code, the entire code for copy-paste can be downloaded above):

openapi: 3.0.0

info:
  title: CatFacts API
  description: CatFacts API returns cat facts
  version: 0.1.0

servers:
  - url: https://cat-fact.herokuapp.com
    description: catfacts URL

paths:
  /facts:
    get:
      summary: This API returns a list of cat facts
      tags:
        - CatFacts API
      responses:
        '200':    # Successful response
          description: Success
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/catfacts'

.....
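For orientation, the elided components section that the `$ref` above points to could look something like the sketch below. This is a trimmed illustration based on the attributes discussed in this blog (status, type, text), not the full contract — the downloadable file remains the authoritative version:

```yaml
components:
  schemas:
    catfacts:
      type: array
      items:
        $ref: '#/components/schemas/catfact'
    catfact:
      type: object
      required:
        - status
        - type
        - text
      properties:
        status:
          $ref: '#/components/schemas/status'
        type:
          type: string
        text:
          type: string
    status:
      type: object
      properties:
        verified:
          type: boolean
        sentCount:
          type: integer
```

Note how the three schemas nest: the catfacts array contains catfact objects, which in turn contain a status object. We will meet these same three schemas again when setting up the Postman validation.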

If all goes well, you will see the following:

Creating a JSON Schema

This documentation is written in OpenAPI3.0 and can either be saved as a YAML file (File -> Save as YAML), or as JSON (File -> Convert and save as JSON). My team uses the YAML files in combination with the Swagger UI macro on Confluence. This Confluence plugin visualizes the documentation the same way Swagger.io does in its right-hand panel, which makes it a lot more readable for other parties than the code in the YAML file itself. We use the JSON file to create the JSON schemas we test with. The Swagger UI macro will make the YAML file on Confluence look like this (source: marketplace-cdn.atlassian.net):

In order to create a JSON schema from the documentation, after saving the OpenAPI3.0 document as JSON, we take out the snippets of the response we want to validate. In this case, that is the catfacts array, which contains the other 2 schemas. The result:

Download the example JSON Schema CatfactsExampleSchema.txt here (I wanted to name it *.json but apparently that’s a risk on WordPress 😉 )

Preview snippet of the code in this file (this is just part of the code, the entire code to copy-paste can be downloaded above):

{
    "type": "array",
    "items": {
        "catfact": {
            "type": "object",
            "required": [
                "status",
                "type",
                "text"
            ],
            "properties": {
                "status": {
........

This quite elaborate JSON schema contains all the restrictions and possible attributes that are also present in the YAML version of our OpenAPI3.0 document. With the help of this schema, we can add schema validation to our Postman tests. The benefit of having tests with schema validation is not having to look at the content of each of the attributes individually, but knowing that they adhere to the contract. The actual data can of course be validated as well, but for contract testing this is not the main priority. The main priority is to detect any possible mishaps in code, resulting in the response not adhering to the schema. The second priority (the main priority when using these validations in regression runs) is to check for any breaking changes. Does the schema validation at some point fail? Then that means that the providing party has introduced a change that broke the contract. In that case, we need to review the current contract, and possibly agree on a new contract that contains the introduced change.
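As an illustration, a minimal standalone JSON Schema for a single cat fact object could look like the sketch below. It is derived from the attributes mentioned in this blog (the downloadable schema file is the authoritative, more elaborate version):

```json
{
    "type": "object",
    "required": ["status", "type", "text"],
    "properties": {
        "status": {
            "type": "object",
            "properties": {
                "verified": { "type": "boolean" },
                "sentCount": { "type": "integer" }
            }
        },
        "type": { "type": "string" },
        "text": { "type": "string" }
    }
}
```

Any response object missing one of the required attributes, or carrying an attribute of the wrong type, will fail validation against this schema.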

Adding Schema validation to Postman tests

Now, how do we implement this schema into our Postman tests? It’s actually not that difficult, but you will need a few tries to make sure it works (e.g., the first time your test says the schema validation passed, you want to see it fail to verify that the mechanism works). For this blog, we assume that it all works as intended, as I have tested with the provided YAML, JSON schema and endpoint.

Create a new GET Request

First, we create a new GET request in Postman, using the free-of-charge CatFacts API:
https://cat-fact.herokuapp.com/facts
Note that we don’t need to provide any environment, because we don’t have any environment variables. We will add one for the schema, but we will do this in our pre-request script, so the variable is always available and correctly set when we wish to test.


Create JSON Schema(s) in Pre-request Script

Secondly, we add the JSON Schema to our environment variables, so that we can re-use it in our tests. We do this by pasting a snippet of code into the “Pre-request Script” tab of our Postman request.


Basically, it’s the JSON Schema that we created earlier, but now wrapped in pm.environment.set("CatfactsSCHEMA", ...).
This line of code sets a new environment variable called CatfactsSCHEMA each time the GET request is called. Note that we need to set 3 (well, at least 2, but we use 3) schemas to validate our response in Postman. This is because when the AJV tool finds an array, it will check it as a tuple (used to store multiple items in a single variable): it finds the first item in the array and stops validating there. By creating at least one more validation schema (for the catfact object within the catfacts array), we can validate whether the objects in the array adhere to our contract.
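To see why an array-level check alone is not enough, here is a minimal standalone sketch. It runs in plain Node.js, without Postman or AJV, and all names in it are illustrative:

```javascript
// A naive array-level check only confirms that we received an array at all.
function isValidArray(data) {
    return Array.isArray(data);
}

// A per-object check confirms each cat fact carries the contracted fields
// with the contracted types (status object, type string, text string).
function isValidCatfact(obj) {
    return typeof obj === 'object' && obj !== null &&
        typeof obj.status === 'object' &&
        typeof obj.type === 'string' &&
        typeof obj.text === 'string';
}

// Hypothetical response: the second item breaks the contract (text is a number).
const response = [
    { status: { verified: true, sentCount: 1 }, type: 'cat', text: 'Cats sleep a lot.' },
    { status: { verified: true, sentCount: 1 }, type: 'cat', text: 42 }
];

console.log(isValidArray(response));         // true  — the array check alone misses the bad item
console.log(response.every(isValidCatfact)); // false — the per-object check catches it
```

This is exactly the gap the extra schemas close: one schema for the array, one for the objects inside it.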

Download the example Pre-request Script code here

Preview snippet of the code in this file (this is just part of the code, the entire code to copy-paste can be downloaded above):

pm.environment.set("CatfactsSCHEMA",
{
    "type": "array",
    "items": {
    }
})

pm.environment.set("CatfactSCHEMA",
{
    "type": "object",
    "required": [
        "status",
......

Use the JSON Schema(s) to validate the contract in your tests

Thirdly, we want to use the created variables containing our JSON Schemas in our tests, to validate the response against our schemas. In order to do this, we copy-paste a snippet into the “Tests” tab of our Postman GET request.

var Ajv = require('ajv');
var ajv = new Ajv({logger: console});

var catfacts = pm.environment.get("CatfactsSCHEMA");
var catfact = pm.environment.get("CatfactSCHEMA");
var status = pm.environment.get("StatusSCHEMA");
                
var data = pm.response.json();
var firstObject = data[0];
var firstObjectStatus = data[0].status;

pm.test('Catfacts array is valid', function() {
    pm.expect(ajv.validate(catfacts, data)).to.be.true;
});
pm.test('Catfact object is valid', function() {
    pm.expect(ajv.validate(catfact, firstObject)).to.be.true;
});
pm.test('Status object is valid', function() {
    pm.expect(ajv.validate(status, firstObjectStatus)).to.be.true;
});

The abbreviation AJV (the tool we use in Postman to validate the JSON Schemas) stands for “Another JSON Schema Validator”. This implies there are more ways of validating schemas in Postman, but we use AJV in this example. Basically, this test script loads the schemas (using the CatfactsSCHEMA, CatfactSCHEMA and StatusSCHEMA environment variables that we set in the pre-request script), parses the JSON response body (into the variable data), and then executes the tests, expecting the response to adhere to the schemas. If it does, the tests report the schemas as valid. If not, they fail.

Watch it pass!

If we now Send the GET request, with our pre-request script and tests in place, we should be seeing this result:

Make it fail!

To check whether our schema validation actually works, let’s change one of the expected response attributes.
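One way to do this (an illustrative change; the attribute and its type come from the downloadable schema, where text is typed as a String): in the pre-request script, change the type of the text attribute to a Boolean:

```json
"text": {
    "type": "boolean"
}
```

The API still returns text as a String, so the response no longer matches the schema.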

Watch it fail!

Now that the response contains a String value while our schema states that it expects a Boolean value, we expect the test to fail.

Voila! We have now created an OpenAPI3.0 YAML Contract, created a JSON Schema (well, actually multiple), and used these schemas to assess whether the response body of our API call adheres to our contract!

Recap

So, you’ve reached the end of this blog. Thanks for sticking with me so far! I know it’s been quite a technical journey, but I do hope this helps when working with multiple teams, creating and consuming APIs, all with different needs and expectations. If you’re reading this, then unless you skipped to the end, you can now:

+ Create an OpenAPI3.0 contract using Swagger
+ Export the OpenAPI3.0 contract as YAML and as JSON file
++ In case of YAML, you can use this file for documenting on Confluence using the Swagger UI macro
++ In case of JSON, you can use this file for creating JSON validation schemas
+ Create JSON Schema variables using Pre-request Scripts in Postman
+ Use these JSON Schema variables in your Tests in Postman
+ Double check whether the validation works by making your validation fail

I hope you all enjoyed reading, and I sincerely hope this blog can help you all create better software, create better tests, deliver better quality. As long as everyone adheres to the contract, you should all be on the same page.