Automation testing guidance

Introduction

Automation testing is the use of automation tools to write scripts that execute test cases against software applications.

As of February 2023, there are two types of automation tests in the LiteFarm code base: API unit tests written with Jest, and an e2e happyPath test written with Cypress. Both are integrated into the project's CI/CD pipeline and run on each pull request to the integration branch of the LiteFarm repository. At present, they cover just under 60 percent of statements in the repo. You can check this here:


Automation testing strategy

Going forward, automation testing should be used in the project as follows:

Local testing

Jest unit tests must be written for any changes to the API, and the owner of each ticket must ensure that all existing tests, and any new tests applicable to the changed requirements, pass before making a PR against integration.
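For illustration, here is a minimal sketch of what such an API unit test could look like. It assumes a hypothetical GET /farm/:farm_id endpoint, a hypothetical app export, and the supertest helper; the actual tests in the repo have their own fixtures, helpers, and authentication mocking, so treat this only as a shape to follow.

```ts
// A minimal sketch of an API unit test, not the project's actual setup.
// The endpoint, import path, and auth header below are hypothetical.
import request from 'supertest';
import app from '../src/server'; // hypothetical path to the Express app

describe('GET /farm/:farm_id', () => {
  it('returns 200 and the farm record for a valid id', async () => {
    const res = await request(app)
      .get('/farm/1') // hypothetical farm id
      .set('Authorization', 'Bearer test-token'); // hypothetical auth header

    expect(res.status).toBe(200);
    expect(res.body).toHaveProperty('farm_name');
  });

  it('returns 404 for a farm that does not exist', async () => {
    const res = await request(app)
      .get('/farm/999999')
      .set('Authorization', 'Bearer test-token');

    expect(res.status).toBe(404);
  });
});
```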

Engineers must also ensure that the happyPath Cypress e2e test passes locally before making a PR against integration. Since the e2e tests take several minutes to run, we suggest running them only just before opening the PR, rather than as a regular part of your development process.
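For context, a happy-path style Cypress test drives the UI through a core user flow end to end. The sketch below is not the repo's actual happyPath spec; the route, selectors, and credentials are hypothetical placeholders.

```ts
// Not the actual happyPath spec: selectors, route, and credentials are
// hypothetical placeholders for illustration only.
describe('happy path (sketch)', () => {
  it('signs a user in and reaches the farm home screen', () => {
    cy.visit('/'); // assumes baseUrl is set in the Cypress config
    cy.get('[data-cy=email]').type('tester@example.com'); // hypothetical selector
    cy.get('[data-cy=continue]').click();
    cy.get('[data-cy=password]').type('test-password');
    cy.get('[data-cy=signin]').click();
    cy.contains('Farm').should('be.visible'); // hypothetical assertion
  });
});
```

Locally, the suite can usually be run headlessly with npx cypress run, or interactively with npx cypress open; check the repository's scripts for the project's preferred invocation.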

Peer review process

Jest API unit tests currently run in CI on each pull request against integration and are one of the checks in the peer review process. Not only must these pass during CI, but QA will also ensure that unit tests implement the right test cases as described here.

Given time constraints, it has been suggested that the happyPath test provides the most value, since it gives some confidence that the app is working as expected. Building on this, I would suggest adding more coverage to that test to increase confidence: a reasonable goal is 75 percent statement coverage by the end of the second quarter of 2023, with the main focus on hitting all the user flows in the app. Throughout, the tests should not exceed a reasonable execution time of 30 minutes.

Deploy

To give early warning of bugs that may not have been detected by automated testing at peer review, I think there may be value in running the entire suite of Cypress tests against the beta app biweekly, at the end of each sprint. This can be done by creating a manually triggered GitHub Action, similar to the one currently triggered on pull requests to the integration branch, that runs the full suite of Cypress user-flow tests.
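A workflow along these lines might look roughly like the sketch below. This is not the project's existing workflow; the job steps, Node version, and the BETA_APP_URL secret are assumptions and would need to be adapted to the repo's real setup.

```yaml
# A sketch of a manually triggered workflow, not the project's existing one.
# Steps, Node version, and the BETA_APP_URL secret are hypothetical.
name: Full Cypress suite against beta
on:
  workflow_dispatch:           # manual trigger from the Actions tab
  # schedule:                  # optionally, run automatically roughly every two weeks
  #   - cron: '0 6 */14 * *'   # approximate biweekly schedule (UTC)

jobs:
  cypress-beta:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - run: npm ci
      - run: npx cypress run --config baseUrl=${{ secrets.BETA_APP_URL }} # hypothetical secret
```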

Release processes

Ideally, every feature being released should have automation tests written for it: API unit tests must be in place, and at the very least happy-path Cypress tests. I would suggest spinning up a temporary server with the release candidate and requiring that all of these tests run and pass before any release.
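One way to point the Cypress suite at such a temporary server is to make the base URL configurable. The sketch below assumes the Cypress 10+ config format and a hypothetical RELEASE_CANDIDATE_URL environment variable; the project's actual config may differ.

```ts
// cypress.config.ts - a sketch only; the environment variable name and the
// local fallback URL are hypothetical.
import { defineConfig } from 'cypress';

export default defineConfig({
  e2e: {
    // Point the suite at the temporary release-candidate server when the
    // variable is set; otherwise fall back to the local dev server.
    baseUrl: process.env.RELEASE_CANDIDATE_URL || 'http://localhost:3000',
  },
});
```

The same effect can also be achieved at run time, since Cypress accepts a --config baseUrl=... override on the command line.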