Using mocks when unit testing your TypeScript code

Unit testing is a very important part of developing applications. After all, you want to make sure your code is doing what it’s supposed to. In a unit test, you only want to test your own function’s code, not any of the other functions your function may call. This is where mocks come in. I know from experience that writing thorough unit tests with mocks can present quite a few challenges over time. That’s why I decided to write a blog series about the different challenges I faced when using mocks while unit testing my TypeScript code. Let’s start off with an introduction to nock.

One of the things your function may do is call an HTTP(S) endpoint. To mock such a call, you can use nock. In my example, the call runs through the AWS SDK, but the same holds true for direct HTTP calls. All of the code is written in TypeScript and runs on Node.js.

Below is our function, in file example.ts.

import * as AWS from 'aws-sdk';

const documentClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

export async function getItemFromTable(itemKey: string) {
    const result: { code: number; body?: string; message?: string; detail?: string; } = { code: 0 };
    try {
        const getItemInput: AWS.DynamoDB.DocumentClient.GetItemInput = {
            TableName: 'my-table',
            Key: { itemKey }
        };

        const getItemResponse = await documentClient.get(getItemInput).promise();

        if (getItemResponse.Item) {
            result.code = 200;
            result.body = JSON.stringify(getItemResponse.Item);
        } else {
            result.code = 404;
            result.message = 'Not Found';
        }
    } catch (error) {
        result.code = 500;
        // I know this looks weird, but the AWSError datatype uses 'code' and 'message'
        const awsError = error as AWS.AWSError;
        result.message = awsError.code;
        result.detail = awsError.message;
    }
    return result;
}

We make the call to DynamoDB through the AWS SDK’s DocumentClient, in documentClient.get(getItemInput). This is the call we’re going to mock in our unit test. First, make sure to run npm i -D nock so you can use nock. (The examples below assume a test runner such as Mocha is providing describe and it.)

import nock from 'nock';
import { getItemFromTable } from './example';

describe('getItemFromTable', () => {

    const ITEM_KEY = 'item1';
    afterEach(() => nock.cleanAll());

    it('UT001 - getItemFromTable with valid response', async () => {
        nock.recorder.rec();

        await getItemFromTable(ITEM_KEY);
    });
});

This is the beginning of our unit test. To see what your call looks like, you can use nock.recorder.rec(). Be aware, though, that this will turn off mocking, so the endpoint is actually receiving the call! afterEach(() => nock.cleanAll()) can be used to make sure all mocks are cleaned up after each test case.

When running this test, you’ll get the recorded call in your logs (provided the table and item actually exist).
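The exact output depends on your nock version, but the recording will look roughly like this (a reconstruction based on the request and mock shown below, not a literal copy of the log):

nock('https://dynamodb.us-east-1.amazonaws.com:443')
    .post('/', { TableName: 'my-table', Key: { itemKey: { S: 'item1' } } })
    .reply(200, { Item: { itemKey: { S: 'item1' } } });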

We can see here what the endpoint is, what it’s receiving, and what it’s returning. This is pretty nifty, as we now know exactly where to point our mock to, and also what response to return. Now we can edit our test and let nock take care of the call.

import * as AWS from 'aws-sdk';
import nock from 'nock';
import { getItemFromTable } from './example';

describe('getItemFromTable', () => {

    const ITEM_KEY = 'getItemFromTableTest';
    afterEach(() => nock.cleanAll());

    it('UT001 - getItemFromTable with valid response', async () => {

        // nock intercepts at the HTTP layer, so the mock body is DynamoDB's low-level
        // wire format; the DocumentClient unmarshalls it before our function sees it
        const getItemMockResponse: AWS.DynamoDB.GetItemOutput = { Item: { itemKey: { S: ITEM_KEY } } };
        nock('https://dynamodb.us-east-1.amazonaws.com:443')
            .post('/')
            .reply(200, getItemMockResponse);

        await getItemFromTable(ITEM_KEY);
    });
});

You can put anything in ITEM_KEY, since we’re talking to a mock instead of real data. In getItemMockResponse, we define the mock response that nock will return; the logging from before shows us exactly what a response from DynamoDB looks like. After that, we define our nock interceptor, again using the endpoint and path from our earlier recording.

This seems to work, but our unit test is not actually testing anything yet. We’re going to add some expectations to check whether our response is as expected. I like to use the chai expect interface to write these, because it has a very readable syntax. Don’t forget to run npm i -D chai @types/chai first!

import { expect } from 'chai';
import * as AWS from 'aws-sdk';
import nock from 'nock';
import { getItemFromTable } from './example';

describe('getItemFromTable', () => {

    const ITEM_KEY = 'getItemFromTableTest';
    afterEach(() => nock.cleanAll());

    it('UT001 - getItemFromTable with valid response', async () => {

        const getItemMockResponse: AWS.DynamoDB.GetItemOutput = { Item: { itemKey: { S: ITEM_KEY } } };
        nock('https://dynamodb.us-east-1.amazonaws.com:443')
            .post('/')
            .reply(200, getItemMockResponse);

        const response = await getItemFromTable(ITEM_KEY);

        expect(response.code).to.equal(200);
        expect(JSON.parse(response.body!)).to.deep.equal({ itemKey: ITEM_KEY });
        expect(response.message).to.not.exist;
        expect(response.detail).to.not.exist;
    });
});

It’s also good practice to check whether your mock is actually being called. To do this, you can assign your nock to a variable, which will be of type nock.Scope. Then you can call .isDone() on this variable and add an expectation to verify that your call was indeed intercepted by nock. Our test will then look like this:

import { expect } from 'chai';
import * as AWS from 'aws-sdk';
import nock from 'nock';
import { getItemFromTable } from './example';

describe('getItemFromTable', () => {

    const ITEM_KEY = 'getItemFromTableTest';
    afterEach(() => nock.cleanAll());

    it('UT001 - getItemFromTable with valid response', async () => {

        const getItemMockResponse: AWS.DynamoDB.GetItemOutput = { Item: { itemKey: { S: ITEM_KEY } } };
        const getItemRequestNock = nock('https://dynamodb.us-east-1.amazonaws.com:443')
            .post('/')
            .reply(200, getItemMockResponse);

        const response = await getItemFromTable(ITEM_KEY);

        expect(getItemRequestNock.isDone()).to.be.true;

        expect(response.code).to.equal(200);
        expect(response.message).to.not.exist;
        expect(response.detail).to.not.exist;
        expect(JSON.parse(response.body!)).to.deep.equal({ itemKey: ITEM_KEY });
    });
});

So now we’ve learned the basics of using nock. Easy, right? In my next blog, we’ll take a look at how to expand our test so we can also check whether the request nock receives is as expected.

Setting up TypeScript

In a previous blog I wrote about getting started with Serverless. We built a hello-world kind of app and deployed it to AWS, focusing on setting up the serverless.yml.

So now let’s have a look at setting up the development environment, with things like initializing npm and TypeScript on this project. We start with an npm init command on the command line. After answering some questions (or accepting the defaults), you’ll see a summary; hit enter (yes) to confirm.

The main thing that happened is that you now have a package.json file in your folder structure, which means the npm tooling is available. I like to also init TypeScript, since I prefer TypeScript over plain JavaScript in my projects. First, install it with npm i -D typescript. After installing TypeScript, we run the init on the project with npx tsc --init. This generates a tsconfig.json file.

After the two initialisations on the project, we see that a couple of files have been added, most importantly package.json and tsconfig.json.

The tsconfig.json file holds the TypeScript configuration. Let’s change something small here and set outDir to ‘dist’. Now it is time to start using TypeScript instead of JavaScript. I also like to add a ‘src’ folder to hold my source files. We move handler.js into this folder and change its extension from .js to .ts. In the message, I added ‘Your typescript function executed successfully!‘ so we can see that it worked.
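For reference, the relevant part of tsconfig.json could look like this; only outDir was changed, the other values are examples of what tsc --init may generate and can differ per TypeScript version:

{
    "compilerOptions": {
        "target": "es5",
        "module": "commonjs",
        "outDir": "dist",
        "strict": true,
        "esModuleInterop": true
    }
}

And src/handler.ts at this stage might look roughly like this, based on the Serverless hello-world template (which may differ per version):

module.exports.hello = async (event: any) => {
    return {
        statusCode: 200,
        body: JSON.stringify(
            {
                message: 'Go Serverless v1.0! Your typescript function executed successfully!',
                input: event,
            },
            null,
            2,
        ),
    };
};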

With the command tsc (the TypeScript compiler) you can compile your TypeScript code to .js files. However, we would prefer to use the more common npm run build command. To do this, we can add a build script in the package.json: in the scripts section we add "build": "tsc". Now we can run npm run build on the command line.
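The scripts section of package.json then contains something like this (any scripts npm init generated, such as test, can stay alongside it):

"scripts": {
    "build": "tsc"
}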

This will run tsc and create a dist folder with the .js file.

Now remember that in the serverless.yml we have set the handler to handler.hello. Since the compiled code now ends up in the dist folder, we need to add that directory in front, so this should be changed to dist/handler.hello.
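In serverless.yml, that comes down to something like this (the rest of the function definition from the previous blog is left out):

functions:
    hello:
        handler: dist/handler.hello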

To see if everything is still working correctly, we do an sls deploy to redeploy our serverless project to AWS. After this, we do a curl to the endpoint and get our success message.
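The endpoint URL comes from the sls deploy output; everything in angle brackets below is a placeholder for your own values:

curl https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/<path>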

Last but not least, it is good practice to set up linting. For this we start by installing ESLint with npm i -D eslint, after which we do an npx eslint --init to initialize ESLint on our project. The npx command makes sure you run the local version from your node_modules.

Answer the questions any way you like; my choices are just an example, not necessarily a recommendation. Depending on the config format you selected (in this specific case I decided on JSON), a ‘.eslintrc.json’ file is added to the folder structure.

To run ESLint on all TypeScript files in the project, you can use the command npx eslint src --ext .ts. But I’d rather put this in a script as well, so I add a script with the name eslint to the package.json. The beauty of putting this in the package.json is that we can now call the eslint script from the build command as well, making sure that when we build our code it is also linted. I changed the build script to "build": "tsc && npm run eslint". And now we can rerun the npm run build command.
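The scripts section now looks something like this:

"scripts": {
    "eslint": "eslint src --ext .ts",
    "build": "tsc && npm run eslint"
}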

This shows us three problems, and the build fails! You can manually tweak some rules if you disagree with them, or you can change your code. To make the code compliant again, you can use the following:

module.exports.hello = async (event: any) => ({
  statusCode: 200,
  body: JSON.stringify(
    {
      message: 'Go Serverless v1.0! Your linted typescript function executed successfully!',
      input: event,
    },
    null,
    2,
  ),
});

Just to be totally sure everything works as expected, after the npm run build we can do an sls deploy again. Or, if you like, you can create a deploy script within the package.json that runs sls deploy, so you can run npm run deploy.
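The complete scripts section would then come down to something like this:

"scripts": {
    "eslint": "eslint src --ext .ts",
    "build": "tsc && npm run eslint",
    "deploy": "sls deploy"
}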

After this, we do another curl to the endpoint and we see the linted code has been deployed.