- clone this repo
- install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- create a PostgreSQL user `municipal_finance` with the password `municipal_finance`:

  ```bash
  createuser municipal_finance -P
  ```

- create a database owned by that user:

  ```bash
  createdb municipal_finance -O municipal_finance
  ```

- install data from somewhere :)
- run it:

  ```bash
  python manage.py runserver
  ```
Configure the production app on dokku:

```bash
dokku config:set municipal-finance DJANGO_DEBUG=False \
    DISABLE_COLLECTSTATIC=1 \
    DJANGO_SECRET_KEY=... \
    NEW_RELIC_APP_NAME=municipal_finance \
    NEW_RELIC_LICENSE_KEY=... \
    DATABASE_URL=postgres://municipal_finance:...@.../municipal_finance
```
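Dokku builds and deploys on git push; a minimal sketch, assuming a git remote named `dokku` pointing at the dokku host (the remote name is an assumption, not something defined in this repo):

```bash
# Deploy by pushing to the dokku remote; "dokku" is a hypothetical remote name.
git push dokku master
```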
Data import is still a fairly manual process leveraging the DB and a few SQL scripts to do the hard work. This is usually done against a local DB, sanity-checked with a locally-running instance of the API and some tools built on it, and if everything looks ok, dumped table-by-table with something like

```bash
pg_dump "postgres://municipal_finance@localhost/municipal_finance" --table=audit_opinions -O -c --if-exists > audit_opinions.sql
```

and then loaded into the production database.
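Loading a dump into production is the mirror of the dump step; a minimal sketch, assuming you have a connection URL for the production DB (elided here):

```bash
# Apply the table dump; since it was created with -c --if-exists, it drops
# and recreates the table before loading.
psql "postgres://municipal_finance:...@.../municipal_finance" < audit_opinions.sql
```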
- Create the table with the file in the `sql` dir with the table's name, e.g. `audit_opinions.sql`
- Import the first few columns, which are supplied by National Treasury.
- Run the relevant `add_labels_`-prefixed SQL file to add the remaining labels.
  - These should be idempotent so they can simply be run again when data is added (see the first SQL sketch after this list).
- Make sure `create_indices.sql` and its indices are up to date
  - create it with the python module `municipal_finance.data_import.create_indices`
  - add it to git and run it if it was changed
  - the prod DB doesn't support `CREATE INDEX IF NOT EXISTS` yet, so ignore errors for existing indices, unless their columns changed, in which case they need to be manually dropped and recreated (see the second sketch after this list).
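To illustrate the idempotency requirement, a hypothetical `add_labels_`-style statement (the table and column names are invented for this sketch, not taken from the repo):

```sql
-- Only touch rows that still need the label, so a second run is a no-op.
UPDATE audit_opinions
SET opinion_label = 'Unqualified - No findings'
WHERE opinion_code = 'unqualified'
  AND opinion_label IS DISTINCT FROM 'Unqualified - No findings';
```

`IS DISTINCT FROM` also matches rows where the label is still NULL, which a plain `<>` comparison would skip.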
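If ignoring the errors gets noisy, one standard workaround on servers without `CREATE INDEX IF NOT EXISTS` (added in PostgreSQL 9.5) is a `DO` block that consults `pg_indexes` first; the index, table, and column names here are hypothetical:

```sql
DO $$
BEGIN
  -- Create the index only if no index with this name exists yet.
  IF NOT EXISTS (
    SELECT 1 FROM pg_indexes
    WHERE schemaname = 'public'
      AND indexname = 'audit_opinions_demarcation_code_idx'
  ) THEN
    CREATE INDEX audit_opinions_demarcation_code_idx
      ON audit_opinions (demarcation_code);
  END IF;
END
$$;
```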
Remember to run `VACUUM ANALYSE` or `REINDEX` on tables after significant changes, so that statistics are up to date and the indices are used properly.
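For example, after reloading a table (table name hypothetical):

```sql
VACUUM ANALYSE audit_opinions;  -- refresh planner statistics
REINDEX TABLE audit_opinions;   -- rebuild the table's indices
```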
MIT License