* Implement new schema and management - nothing is using it yet!
* Start getting update process onto new schema - incomplete work in progress. Includes some product->layer name refactoring.
* Writing to the new layer-based schema, with batch caching.
* Reading from the new layer range table. More product->layer renaming.
* Passing mypy, failing tests.
* Passing unit tests, server initialising. Integration tests still failing.
* Passing integration tests.
* Make datacube/env handling more generic (one step closer to multi-db) and passing mypy.
* Passing all tests.
* Add new tests and fix broken tests.
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Lintage.
* Lintage.
* Don't rely on DEA Explorer.
* Update db to postgres 16 and use DB_URL.
* Revert main docker-compose.yaml.
* Need port as well.
* Fix nodb test fixture for GH.
* Oops - used non-raw github link.
* Fix ows-update call in GHA test prep script.
* Update documentation.
* Fix spelling or add (non-)words to wordlist.
* Various fixes/cleanups found on self-review.
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Make no_db in test_no_db_routes a proper fixture.
* Documentation edits.
* Some cleanup in wms_utils.py.
* Some cleanup in update_ranges_impl.py.
* Make access in initialiser more consistent.
* Provide better examples of role granting in scripts and documentation.
* Fix inconsistent indentation.
* Typo.

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent: c8214a7
Commit: d1ac065
Showing 87 changed files with 963 additions and 1,017 deletions.
@@ -3,23 +3,20 @@ datacube-ows
 ============

 .. image:: https://github.com/opendatacube/datacube-ows/workflows/Linting/badge.svg
-    :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ALinting
+    :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ACode%20Linting

 .. image:: https://github.com/opendatacube/datacube-ows/workflows/Tests/badge.svg
     :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ATests

 .. image:: https://github.com/opendatacube/datacube-ows/workflows/Docker/badge.svg
-    :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ADocker
+    :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ADockerfile%20Linting

 .. image:: https://github.com/opendatacube/datacube-ows/workflows/Scan/badge.svg
     :target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3A%22Scan%22

 .. image:: https://codecov.io/gh/opendatacube/datacube-ows/branch/master/graph/badge.svg
     :target: https://codecov.io/gh/opendatacube/datacube-ows

 .. image:: https://img.shields.io/pypi/v/datacube?label=datacube
     :alt: PyPI

 Datacube Open Web Services
 --------------------------
@@ -41,7 +38,7 @@ Features
 System Architecture
 -------------------

-.. image:: docs/diagrams/ows_diagram.png
+.. image:: docs/diagrams/ows_diagram1.9.png
     :width: 700

 Community
@@ -141,14 +138,14 @@ To run the standard Docker image, create a docker volume containing your ows config
     -e AWS_DEFAULT_REGION=ap-southeast-2 \  # AWS Default Region (supply even if NOT accessing files on S3! See Issue #151)
     -e SENTRY_DSN=https://[email protected]/projid \  # Key for Sentry logging (optional)
     \  # Database connection URL: postgresql://<username>:<password>@<hostname>:<port>/<database>
-    -e ODC_DEFAULT_DB_URL=postgresql://cube:DataCube@172.17.0.1:5432/datacube \
+    -e ODC_DEFAULT_DB_URL=postgresql://myuser:mypassword@172.17.0.1:5432/mydb \
     -e PYTHONPATH=/code  # The default PATH is under env, change this to target /code
     -p 8080:8000 \  # Publish the gunicorn port (8000) on the Docker
     \  # container at port 8008 on the host machine.
     --mount source=test_cfg,target=/code/datacube_ows/config \  # Mount the docker volume where the config lives
     name_of_built_container

-The image is based on the standard ODC container.
+The image is based on the standard ODC container and an external database

 Installation with Conda
 ------------
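For orientation, the docker run hunk above shows only the options around the change; a complete invocation under the new ODC_DEFAULT_DB_URL scheme might look like the sketch below. This is an illustration rather than the project's documented command: the image name, volume name, credentials and host IP are placeholders carried over from the README example, the optional SENTRY_DSN is omitted, and any environment variables set earlier in the README's full command are not repeated here.

    docker run -d \
        -e AWS_DEFAULT_REGION=ap-southeast-2 \
        -e ODC_DEFAULT_DB_URL=postgresql://myuser:mypassword@172.17.0.1:5432/mydb \
        -e PYTHONPATH=/code \
        -p 8080:8000 \
        --mount source=test_cfg,target=/code/datacube_ows/config \
        name_of_built_container

The single connection URL presumably replaces the older per-field database variables, in line with the commit's "use DB_URL" item.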
@@ -157,7 +154,7 @@ The following instructions are for installing on a clean Linux system.

 * Create a conda python 3.8 and activate conda environment::

-    conda create -n ows -c conda-forge python=3.8 datacube pre_commit postgis
+    conda create -n ows -c conda-forge python=3.10 datacube pre_commit postgis
     conda activate ows

 * install the latest release using pip install::
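Putting the updated pin together with the pip step that follows in the README, the environment setup becomes roughly the sketch below (assuming the package is published on PyPI under the name datacube-ows; check the project's release notes for the exact install command):

    # create the environment with the updated Python 3.10 pin and activate it
    conda create -n ows -c conda-forge python=3.10 datacube pre_commit postgis
    conda activate ows

    # install the latest datacube-ows release (package name assumed)
    pip install datacube-ows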
@@ -186,7 +183,7 @@ The following instructions are for installing on a clean Linux system.
     # to create schema, tables and materialised views used by datacube-ows.

     export DATACUBE_OWS_CFG=datacube_ows.ows_cfg_example.ows_cfg
-    datacube-ows-update --role ubuntu --schema
+    datacube-ows-update --write-role ubuntu --schema


 * Create a configuration file for your service, and all data products you wish to publish in
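Only the flag name changes in this hunk (--role becomes --write-role); the surrounding workflow stays the same. A minimal end-to-end sketch, reusing the example configuration and the ubuntu role shown above, would be:

    # point OWS at the example configuration shipped with datacube-ows
    export DATACUBE_OWS_CFG=datacube_ows.ows_cfg_example.ows_cfg

    # create the OWS schema, tables and materialised views, granting
    # write access to the "ubuntu" role (formerly --role)
    datacube-ows-update --write-role ubuntu --schema

    # populate the layer extent tables for the configured layers
    datacube-ows-update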
@@ -253,8 +250,9 @@ Local Postgres database
     | xargs -n1 -I {} datacube dataset add s3://deafrica-data/{}

 5. Write an ows config file to identify the products you want available in ows, see example here: https://github.com/opendatacube/datacube-ows/blob/master/datacube_ows/ows_cfg_example.py
-6. Run `datacube-ows-update --schema --role <db_read_role>` to create ows specific tables
-7. Run `datacube-ows-update` to generate ows extents.
+6. Run ``datacube-ows-update --schema --read-role <db_read_role> --write-role <db_write_role>`` as a database
+   superuser role to create ows specific tables and views
+7. Run ``datacube-ows-update`` as ``db_write_role`` to populate ows extent tables.

 Apache2 mod_wsgi
 ----------------
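The rewritten steps 6 and 7 split the old single --role into separate read and write roles. As a sketch only, with hypothetical role names ows_reader and ows_writer standing in for <db_read_role> and <db_write_role>, the sequence would be:

    # step 6: as a database superuser, create the OWS tables and views and
    # grant privileges (ows_reader and ows_writer are hypothetical role names)
    datacube-ows-update --schema --read-role ows_reader --write-role ows_writer

    # step 7: connect as the write role to populate the OWS extent tables
    datacube-ows-update

Presumably the split lets the web service run with read-only credentials while only the update job needs write access, which matches the commit's "Provide better examples of role granting" item.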