# Test the Docker image
Now we want to check that everything works as expected, which means:

- Launching the Docker container and checking that it responds to HTTP requests
- Checking that the process we just deployed behaves correctly
Fortunately, PESTO features a test/usage framework, which is the purpose of the `pesto/test` folder.
## Booting up the container & first HTTP requests
If the build succeeds, you should be able to see your image with:

```
docker image ls
REPOSITORY     TAG          IMAGE ID       CREATED         SIZE
algo-service   1.0.0.dev0   f04d96bb57f4   4 minutes ago   1.16GB
```
First, we can verify that we are able to start the container and send very basic requests to it:

```bash
docker run --rm -p 4000:8080 algo-service:1.0.0.dev0
```

This should start the container so that it can be accessed from http://localhost:4000.
In your browser (or using curl) you can send basic GET requests to your container:
### Health

```bash
curl -X GET http://localhost:4000/api/v1/health
OK
```
### Describe

```bash
curl -X GET http://localhost:4000/api/v1/describe
```
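If you prefer scripting these checks over curl, the same requests can be sent from Python. This is a minimal sketch assuming the container above is running and the `requests` library is installed (an assumption of this example, not a PESTO dependency):

```python
# Minimal sketch: probe the two GET endpoints shown above with curl.
import requests  # pip install requests

print(requests.get("http://localhost:4000/api/v1/health").text)    # expected: OK
print(requests.get("http://localhost:4000/api/v1/describe").json())
```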
## Using the `pesto test` command

The first way of testing your service is to call the `pesto test` utility the same way you called `pesto build`.
In order, this command will:

- Run the Docker container (the same way we did previously)
- Send requests to `api/v1/describe` and compare the response with `expected_describe.json`
- Send process payloads to `api/v1/process` and compare the responses to the desired outputs

The inputs and desired outputs have to be configured in the test resources directory.
## Defining Test Resources
Let's take a look at the `pesto/test` directory:

```
tests
├── README.md
└── resources/
    ├── expected_describe.json
    ├── test_1/
    └── test_2/
```
The `resources` folder will be used by the PESTO test API: its contents are converted to processing requests that are sent to `/api/v1/process` with the right format. Each response is then compared to the expected response, so these resources act as unit tests.
The first file of interest is `expected_describe.json`. This file will be compared to the JSON document returned by the API at http://localhost:4000/api/v1/describe. This description file can be used to parse information about the API (input/output schemas, description, etc.).

You will learn in time how to manually create an `expected_describe.json` from the `pesto/api` folder; for now, it is best to copy the `describe.json` file that we generated earlier and save it as `expected_describe.json`. You can compare this file to the default `expected_describe.json` and notice how the differences mirror your changes to the default processing.
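As an illustration, here is a minimal sketch of the describe comparison that `pesto test` performs. The resource path and the plain equality check are assumptions for this example; `pesto test` reports structured differences rather than a single boolean:

```python
# Minimal sketch: compare the live describe document with the expected one.
# The resource path is an assumption based on the tree above; adjust it to
# your project layout. pesto test does a finer-grained comparison.
import json

import requests

with open("tests/resources/expected_describe.json") as f:
    expected = json.load(f)

actual = requests.get("http://localhost:4000/api/v1/describe").json()

print("NoDifference:", actual == expected)
```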
Now, there are several folders named `test_*`. In each of these test folders, the input payload files go in `input` and the expected response goes in `output`.
Let's take a look at the test folder:
```
test_1
├── input
│   ├── dict_parameter.json
│   ├── image.png
│   ├── integer_parameter.int
│   ├── number_parameter.float
│   ├── object_parameter.json
│   └── string_parameter.string
└── output
    ├── areas
    │   ├── 0.json
    │   └── 1.json
    ├── dict_output.json
    ├── geojson.json
    ├── image_list
    │   ├── 0.png
    │   └── 1.png
    ├── image.png
    ├── integer_output.integer
    ├── number_output.float
    └── string_output.string
```
You can see that both `input` and `output` contain files whose extensions correspond to the input and output types (see `pesto test`). The filenames are matched with the JSON payload keys.
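To make that mapping concrete, here is a hedged sketch of the kind of payload that gets built from `test_1/input/` and posted to the service. The key names come from the filenames above; encoding the image as a base64 string is an assumption about the REST payload format, and `pesto test` handles this conversion for you:

```python
# Illustrative sketch: build a process payload from the test_1 input files.
# Key names match the filenames above; the base64 image encoding is an
# assumption about the REST payload format.
import base64
import json
from pathlib import Path

import requests

input_dir = Path("tests/resources/test_1/input")

payload = {
    "dict_parameter": json.loads((input_dir / "dict_parameter.json").read_text()),
    "image": base64.b64encode((input_dir / "image.png").read_bytes()).decode(),
    "integer_parameter": int((input_dir / "integer_parameter.int").read_text()),
    "number_parameter": float((input_dir / "number_parameter.float").read_text()),
    "object_parameter": json.loads((input_dir / "object_parameter.json").read_text()),
    "string_parameter": (input_dir / "string_parameter.string").read_text(),
}

response = requests.post("http://localhost:4000/api/v1/process", json=payload)
print(response.json())
```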
## Run pesto test
To run the tests on the configured inputs/outputs, use the `pesto test` command:

```bash
pesto test {PESTO_PROJECT_ROOT}
```
The logs should show the different steps being processed.

You can check the responses and the differences between dictionaries in the `.pesto` workspace: `/home/$USER/.pesto/tests/xxx-service/1.0.0.dev0`. There you will find the results and responses of all the requests, including describe and processing requests. This is a useful folder for debugging potential differences.
### results.json
Should everything go well, the `results.json` file should look like this:
```json
{
  "describe": {
    "NoDifference": true
  },
  "test_1": {
    "NoDifference": true
  },
  "test_2": {
    "NoDifference": true
  }
}
```
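If you want to check this file programmatically, a small sketch like the following will do. The workspace path keeps the `xxx-service` placeholder from above; substitute your own service name and version:

```python
# Minimal sketch: report any entry in results.json that shows a difference.
# The path keeps the placeholder service name from the documentation;
# replace it with your service name and version.
import json
from pathlib import Path

results_path = Path.home() / ".pesto/tests/xxx-service/1.0.0.dev0/results.json"
results = json.loads(results_path.read_text())

failed = [name for name, result in results.items() if not result.get("NoDifference")]
print("all tests passed" if not failed else f"differences found in: {failed}")
```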
## Bonus: Using Pytest & unit testing
Once you're sure everything works and have debugged properly, you can write or edit unit tests in `{PESTO_PROJECT_ROOT}/tests/` (check the autogenerated file `tests/test_service.py`) and run them with `pytest tests` from your project root.

This can be used to ensure non-regression on further edits, or if you want to do test-driven development.
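As a starting point, here is a hedged sketch of what such tests could look like, assuming the container is already running on http://localhost:4000; the autogenerated `tests/test_service.py` remains the authoritative template:

```python
# tests/test_service_http.py: illustrative sketch only, assuming a container
# started with: docker run --rm -p 4000:8080 algo-service:1.0.0.dev0
import requests

BASE_URL = "http://localhost:4000/api/v1"


def test_health():
    # The health endpoint returns a plain "OK" (see the curl example above).
    response = requests.get(f"{BASE_URL}/health")
    assert response.status_code == 200


def test_describe_returns_json():
    # The describe endpoint returns the service description as JSON.
    response = requests.get(f"{BASE_URL}/describe")
    assert response.status_code == 200
    assert isinstance(response.json(), dict)
```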
## Bonus: Using the PESTO Python API to run tests & send requests to the model
Should you want to use your services in a non-scalable way or test them further, you can have a look at the `{PESTO_PROJECT_ROOT}/scripts/example_api_usage.py` file, which exposes the low-level Python API used by `pesto test`:

- The `ServiceManager` class is used as a proxy for the Python Docker API, to pull/run/attach/stop the containers
- The `PayloadGenerator` class is used to translate files into the actual JSON payloads for the REST API
- The `EndpointManager` manages the various endpoints of the processes and acts as a front for POST/GET requests
- The `ServiceTester` is used to validate payloads & responses against their expected values
**Note**

This API is a simple example of how to use services packaged with PESTO in Python scripts. We encourage you to copy/paste and modify the classes should you feel the need for specific use cases, but neither this API nor `pesto test` is designed for robustness and scalability.

We consider the target audience of the `pesto test` capabilities to be the data scientist; integration testing & scalability should be handled at the production level.