Integration tests

The integration tests are a suite of automated tests for this repository that exercise it in a way similar to how it is used in production.

They use Docker containers to run containerised instances of Kafka and other components.

Usage

Requires Python 3.6+.

  • [optional] Set up a Python virtual environment and activate it

  • Install Docker

  • Install the requirements using pip: pip install -r integration-tests/requirements.txt

  • Stop and remove any containers that may interfere with the integration tests, e.g. IOC or Kafka containers and containers from previous runs. To stop and remove all containers, use docker stop $(docker ps -a -q) && docker rm $(docker ps -a -q)

  • If you have a local Conan server, set the environment variable local_conan_server=<ADDRESS OF SERVER>. If the server holds pre-built binaries for the Conan packages, this greatly reduces the time taken to build the file writer Docker image.

  • Run python -m pytest -s . --writer-binary=<PATH_TO_BUILD_DIR> from the integration-tests/ directory. The integration tests spawn instances of the file-writer on the host and therefore need to know the location of the binary you want to test (see the sketch after this list). Note that <PATH_TO_BUILD_DIR> is the path to the directory containing the bin directory that holds the kafka-to-nexus executable.

  • To run a single test, use the -k argument, e.g. python -m pytest -s . -k 'test_two_different_writer_modules_with_same_flatbuffer_id'.

  • If you want to manually start a version of the file-writer yourself (e.g. one built with debug symbols), use the command line argument --start-no-filewriter=true. This argument is mutually exclusive with --writer-binary. If you wait too long before manually starting a file-writer instance, the integration test will fail.
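The tests launch the file-writer themselves from the binary passed via --writer-binary. The snippet below is a minimal, hypothetical sketch of what that amounts to; the helper name and extra arguments are illustrative only, and the real fixtures pass the appropriate config options.

```python
# Hypothetical sketch: launching the file-writer binary found under
# <PATH_TO_BUILD_DIR>/bin on the host. Helper name and arguments are
# illustrative; the real fixtures supply the ini config and broker options.
import os
import subprocess


def start_filewriter(build_dir, extra_args=()):
    binary = os.path.join(build_dir, "bin", "kafka-to-nexus")
    # Start the file-writer as a host process; the caller is responsible
    # for terminating it when the test is done.
    return subprocess.Popen([binary, *extra_args])
```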

General Architecture

The integration tests use pytest as the test runner, with separate fixtures for different configurations of the file-writer.

The Kafka and Zookeeper containers are started with docker-compose and persist throughout all of the tests; they are stopped and removed once the test run has finished.
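As a rough illustration, this lifecycle could be expressed as a session-scoped pytest fixture along the following lines; the fixture name, compose file path and exact commands are assumptions, not the actual contents of conftest.py.

```python
# Hypothetical sketch of a session-scoped fixture that keeps Kafka and
# Zookeeper running for the whole test session and removes them afterwards.
import subprocess

import pytest

COMPOSE_FILE = "compose/docker-compose.yml"  # assumed location


@pytest.fixture(scope="session", autouse=True)
def kafka_and_zookeeper():
    # Start the containers in the background.
    subprocess.check_call(["docker-compose", "-f", COMPOSE_FILE, "up", "-d"])
    yield
    # Stop and remove the containers once all tests have finished.
    subprocess.check_call(["docker-compose", "-f", COMPOSE_FILE, "down"])
```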

The first test starts the file-writer with an ini config file found in /config-files. Each test uses a separate JSON config/command file found in commands.

Most tests check that the NeXus file created by the file-writer contains the correct static and streamed data; however, some tests instead verify that the status of the file-writer matches expectations by consuming status messages from Kafka.
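For example, a file-content check might look roughly like the sketch below; the output file name, dataset path and expected value are purely illustrative.

```python
# Hypothetical sketch of a test that checks a static dataset in the NeXus
# (HDF5) file produced by the file-writer. Paths and values are illustrative.
import h5py


def test_static_data_reaches_file(docker_compose):
    with h5py.File("output-files/example_output.nxs", "r") as nexus_file:
        # Assert that the static dataset was written with the expected value.
        assert nexus_file["/entry/title"][()] == b"example run"
```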

Log files are placed in the logs folder in integration-tests, provided that the ini file uses the --log-file flag and the docker-compose file mounts the logs directory.

Creating tests

To create a new fixture, add a new function in conftest.py, as well as a docker compose file in compose/ and a startup ini config file. The test itself should be placed in a file with the test_ prefix, for example test_idle_pv_updates, so that pytest can pick it up.

The fixture name must be used as the first parameter to the test like so: def test_data_reaches_file(docker_compose):
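Putting the pieces together, a new fixture and test might be outlined as follows; the names and bodies are placeholders, since the real fixtures also start the file-writer and wait for the containers to be ready.

```python
# Hypothetical outline only; names and bodies are illustrative placeholders.

# conftest.py
import pytest


@pytest.fixture(scope="module")
def docker_compose_idle_pv():
    # The real fixture would bring up the containers from the new compose file
    # in compose/, start the file-writer with the matching startup ini config
    # file, and tear everything down once the module's tests have finished.
    ...


# test_idle_pv_updates.py
def test_pv_value_reaches_file(docker_compose_idle_pv):
    # The fixture name is the first parameter, so pytest injects the fixture.
    ...
```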

Formatting

The black formatting tool should be used to keep the formatting of the integration-test scripts consistent. A specific version of black is pinned in the requirements file. It can be run from the root of the repository like this:

black integration-tests

pyproject.toml has been configured to tell black to exclude the Python scripts generated by flatc, so that newly generated files can be compared against them if necessary.