Backend for DataAmazon.

Prerequisites:
- Docker
- Docker Compose
Use VS Code and set the Python version from .python-version via the Create Environment command. Follow the tutorial at https://code.visualstudio.com/docs/python/environments#_creating-environments
Use Makefile targets to make your life easier!
- Start docker containers
make ENV_FILE_PATH={env_file_path} docker-run
The env_file_path is the path to the {env-name}.env file in your project. A local.env file with a local configuration is provided, which you can use to set up your local environment.
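For example, with the provided local configuration (assuming local.env sits at the project root), this would be make ENV_FILE_PATH=local.env docker-run.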
- Access pgAdmin in your browser at http://localhost:5050 to connect to the PostgreSQL database
- Log in using the admin credentials defined in the docker-compose-database.yaml file (which reads the .env file), under the gatekeeper-pgadmin service
- Double-click Servers > Gatekeeper DB and enter the password from docker-compose-database.yaml
- Create a virtual environment and activate it
make ENV_FILE_PATH={env_file_path} python-env
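Note that a Makefile target runs in its own shell, so you still need to activate the environment afterwards; assuming it creates a venv directory like the deployment steps below, that would be . venv/bin/activate.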
- Install project dependencies
If you are running macOS, install OpenSSL first:
brew install openssl
make ENV_FILE_PATH={env_file_path} python-pip-install
- Run database migrations
make ENV_FILE_PATH={env_file_path} db-upgrade
- Start the application by running
make ENV_FILE_PATH={env_file_path} python-run
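Once the application is up, the interactive API docs should be reachable at http://localhost:9092/api/v1/docs (the same address used in the local user creation steps below).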
- If you need to delete the docker containers
make ENV_FILE_PATH={env_file_path} docker-down
Note: this deletes the containers, but not the generated images nor the database data, since a Docker volume is used to store the data persistently.
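If you also need to wipe the persisted database data, remove the corresponding Docker volume: list the volumes with docker volume ls, then remove the database volume with docker volume rm <volume_name> (the exact volume name depends on docker-compose-database.yaml).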
We are using unittest. See examples at https://docs.python.org/3/library/unittest.html
To run all tests from the command line, use:
pytest
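A minimal sketch of a test in the unittest style (the file name, class, and assertion below are illustrative, not taken from the actual test suite):
# tests/test_example.py - hypothetical example; pytest discovers unittest.TestCase subclasses automatically
import unittest

class ExampleTestCase(unittest.TestCase):
    def test_addition(self):
        # Replace with assertions against real application code
        self.assertEqual(1 + 1, 2)

if __name__ == "__main__":
    unittest.main()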
To generate the code coverage reports, use:
# Run the tests and generate the .coverage file
coverage run -m pytest
# Print the coverage report on console
coverage report
# Generate the coverage report in html
coverage html
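By default, coverage html writes the report to the htmlcov/ directory; open htmlcov/index.html in your browser to view it.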
To create new migrations, follow the steps below.
- Map your new table in a new file in models/{your_new_model}.py (see the model sketch after this list)
- Import your new database model in migrations/env.py so Alembic maps the file
- Run
make ENV_FILE_PATH={ENV_FILE_PATH} MESSAGE="{MESSAGE}" db-create-migration
- Check the generated file under migrations/versions/<generated_file>.py and see if any fix is needed
- Run
make ENV_FILE_PATH={env_file_path} db-upgrade
- Check the database; if there's any problem, run
make ENV_FILE_PATH={env_file_path} db-downgrade
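A minimal sketch of a new model file, assuming the project maps tables with SQLAlchemy's declarative system (the base class, table, and columns below are illustrative; reuse the Base that the existing models share instead of creating a new one):
# models/example_item.py - hypothetical model, adjust names and imports to the project
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()  # the project most likely exposes its own Base; import and reuse that instead

class ExampleItem(Base):
    __tablename__ = "example_item"

    id = Column(Integer, primary_key=True)
    name = Column(String(255), nullable=False)
After importing the model in migrations/env.py, the db-create-migration target should generate a migration containing the matching create_table statement.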
WARNING: Sometimes a new migration tries to delete the casbin_rule table. This is not intended and should be investigated. For now, check the migration file to see whether the upgrade and downgrade functions contain a create and/or drop table statement for casbin_rule. If so, just delete those statements from both upgrade and downgrade.
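In the generated Alembic file, these statements typically appear as op.drop_table('casbin_rule') in upgrade and a matching op.create_table('casbin_rule', ...) in downgrade.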
WARNING: The current deployment process causes downtime for services.
# Connect to USP infra
ssh [email protected] -p 5010
# Navigate to the project folder
cd gatekeeper
# Get the latest version of the main branch
git pull
# Create and activate the Python virtual environment
python3 -m venv venv
. venv/bin/activate
# Install libraries
make ENV_FILE_PATH={env_file_path} python-pip-install
# Run db migrations
make ENV_FILE_PATH={env_file_path} db-upgrade
# Deactivate python virtual env
deactivate
# Refresh and deploy the latest Docker image
make ENV_FILE_PATH={env_file_path} docker-deployment
- Frontend:
https://datamap.pcs.usp.br/
- Backend:
https://datamap.pcs.usp.br/api/docs
- pgAdmin:
http://datamap.pcs.usp.br/pgadmin
- MinIO:
https://datamap.pcs.usp.br/minio/ui/
- TUSd:
https://datamap.pcs.usp.br/files/
- For the first client, the easiest way is to remove the authorize interceptor from the client creation endpoint
- Create with
curl -X POST http://localhost:9092/api/v1/clients -H 'Content-Type: application/json' -d '{"name": "DataAmazon Local Client", "secret": "{secret}"}'
- Test with
curl -X GET -H "X-Api-Key: {generated-api-key}" -H "X-Api-Secret: {defined-api-secret}" localhost:9092/api/v1/datasets
- Change the database host, API key, and API secret in tools/import_dataset.py
- Remove the authorization interceptor for the dataset creation route
- Run the import script
python tools/import_dataset.py
- Open http://localhost:9092/api/v1/docs in the browser
- Under the "Authorize" button (top right corner), paste the API key and secret
- Execute the POST for users route and create a new user
- Open pgAdmin at http://localhost:9092/pgadmin and log in
- Execute the SQL in app/resources/casbin_seed_policies.sql and app/resources/tenancy_seed.sql in the gatekeeper database
- Add your own user to the admin role, so you can test everything:
INSERT INTO public.casbin_rule (ptype, v0, v1, v2, v3, v4, v5) VALUES ('g', '{user_id}', 'admin', NULL, NULL, NULL, NULL);
Replace {user_id} with the id of the user you created above.
This project uses Ruff to manage code style, linting, and formatting.
- To check code style problems:
ruff check
- To auto-fix some problems:
ruff check --fix
- To format files:
ruff format
- If you hit the psycopg2 build problem (usually a missing pg_config), run
brew install postgresql
- If you have the "Failed to build dependency-injector", use Python 3.10.4