-docker-compose.yml
-------------------
+Setup
+-----
-.. code-block:: yaml
+Clone the repository and ``cd`` into the project directory.
- # replace this
- image: live_data:dev
- # with this
- build:
- network: host
- context: .
+Create a conda environment ``livedata`` containing all the dependencies:
+
+.. code-block:: bash
+
+ conda env create -f environment.yml
+ conda activate livedata
-This will build from our local source instead of pulling an image online.
+To deploy this application locally, you will need to set a number of environment variables,
+for example (bash):
+
+.. code-block:: bash
+
-Settings.py
------------
+ export DATABASE_NAME=livedatadb
+ export DATABASE_USER=livedatauser
+ export DATABASE_PASS=livedatapass
+ export DATABASE_HOST=db
+ export DATABASE_PORT=5432
+ export LIVE_PLOT_SECRET_KEY="secretKey"
-.. code-block:: python
+ # These need to be set for `pytest`,
+ # but are not used in the docker compose
+ export DJANGO_SUPERUSER_USERNAME=$DATABASE_USER
+ export DJANGO_SUPERUSER_PASSWORD=$DATABASE_PASS
+
+
+*NOTES*:
+
+- The ``DATABASE_PORT`` **must** be set to ``5432``, as Postgres is configured to listen on that port by default.
+ If you need to change the port, you will need to modify the ``docker-compose.yml`` file accordingly.
- # replace this
- ALLOWED_HOSTS = ['livedata.sns.gov']
- # with this
- ALLOWED_HOSTS = ['*']
+- It is recommended to save these variables in an ``.envrc`` file, which can be managed by `direnv <https://direnv.net>`_.
+ direnv will automatically load the variables when you ``cd`` into the project directory.
+After the secrets are set, you can start the server with:
+
+.. code-block:: bash
+
+ make docker/compose/local
+
+This command will copy ``config/docker-compose.envlocal.yml`` into ``./docker-compose.yml`` before composing all the services.
+
+| Run ``make help`` to learn about other macros available as make targets.
+| For instance, ``make docker/pruneall`` will stop all containers, then remove all containers, images, networks, and volumes.
-This setting is meant for production where its actually hosted on livedata.sns.gov.
-Changing it to a wildcard lets us ping it as local host and not get a 400 error.
+Testing
+-------
+
+Once the setup is complete and the server is running, you can verify it with ``pytest``:
+
+.. code-block:: bash
+
+ # run all tests
+ pytest
+ # or run a specific test
+ pytest tests/test_post_get.py
-You should now be able to interact with the api on `localhost:9999` but there's a little more.
-You need to add a user that you can use for your post requests,
+*NOTE:*
+The environment variables ``DJANGO_SUPERUSER_USERNAME`` and ``DJANGO_SUPERUSER_PASSWORD`` are defined in the ``docker-compose.envlocal.yml`` file, but ``pytest`` does not read this file.
+You must either export them in the shell where ``pytest`` runs, as described above, or prepend them to the ``pytest`` command, e.g.:
+
.. code-block:: bash
- docker exec -it live_data_server_livedata_1 /bin/bash
- cd live_data_server
- python manage.py createsuperuser
+ DJANGO_SUPERUSER_USERNAME=***** DJANGO_SUPERUSER_PASSWORD=***** pytest
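The test classes later in this diff build their request credentials from these same two variables. A minimal plain-Python sketch of that lookup (the helper name is illustrative; the variable names come from the docs above):

```python
import os


def get_test_credentials(environ=os.environ):
    """Return the POST payload the tests send for authentication.

    pytest does not read docker-compose.envlocal.yml, so these two
    variables must be exported in the shell before the tests run.
    """
    username = environ.get("DJANGO_SUPERUSER_USERNAME")
    password = environ.get("DJANGO_SUPERUSER_PASSWORD")
    if not username or not password:
        raise RuntimeError(
            "export DJANGO_SUPERUSER_USERNAME and DJANGO_SUPERUSER_PASSWORD before running pytest"
        )
    return {"username": username, "password": password}
```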
+API
+---
I personally recommend using `Postman <https://www.postman.com>`_ when interacting with the API.
-If you do, set the request body to `form-data`!
+If you do, set the request body to ``form-data``!
Some relevant form-data field keys:
diff --git a/docs/developer/index.rst b/docs/developer/index.rst
index 7c9f2cb..514626b 100644
--- a/docs/developer/index.rst
+++ b/docs/developer/index.rst
@@ -5,5 +5,6 @@ Development Guide
:maxdepth: 2
config_for_local_use
+ updating_data_models
service_through_apache
troubleshoot/index
diff --git a/docs/developer/updating_data_models.rst b/docs/developer/updating_data_models.rst
new file mode 100644
index 0000000..045dedc
--- /dev/null
+++ b/docs/developer/updating_data_models.rst
@@ -0,0 +1,38 @@
+=============================================
+Updating Data Models
+=============================================
+
+| There may be times when you need to update the data models used by Django.
+| This can be done by following these steps:
+
+#. Make the necessary changes to the models in ``src/live_data_server/plots/models.py``.
+#. Generate the Django migration file(s):
+
+ .. code-block:: bash
+
+ cd src/live_data_server
+ python manage.py makemigrations
+
+The migration(s) will be created in the ``src/live_data_server/plots/migrations/`` directory.
+First check the migration(s) to ensure they are correct. If they are, apply the migration(s):
+
+From within the live_data_server Docker container:
+
+.. code-block:: bash
+
+ python manage.py migrate
+
+ # or if you are not in the container
+ docker exec -i live_data_server-livedata-1 bash -ic '
+ conda activate livedata
+ cd app
+ python manage.py migrate
+ '
+
+If the migration(s) are not correct, you can delete them and start again:
+
+.. code-block:: bash
+
+ python manage.py migrate plots zero
+ python manage.py makemigrations
+ python manage.py migrate
diff --git a/pyproject.toml b/pyproject.toml
index a53bfa1..47b2624 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -9,15 +9,10 @@ dependencies = [
license = { text = "BSD-3" }
[project.urls]
-homepage = "https://livedata-ornl.readthedocs.io" # if no homepage, use repo url
+homepage = "https://livedata-ornl.readthedocs.io" # if no homepage, use repo url
[build-system]
-requires = [
- "setuptools >= 40.6.0",
- "wheel",
- "toml",
- "versioningit"
-]
+requires = ["setuptools >= 40.6.0", "wheel", "toml", "versioningit"]
build-backend = "setuptools.build_meta"
[tool.black]
@@ -43,14 +38,24 @@ where = ["src"]
exclude = ["tests*", "scripts*", "docs*"]
[tool.pytest.ini_options]
-pythonpath = [
- ".", "src", "scripts"
-]
+pythonpath = [".", "src", "scripts"]
testpaths = ["tests"]
python_files = ["test*.py"]
[tool.ruff]
line-length = 120
-select = ["A", "ARG","ASYNC","BLE","C90", "E", "F", "I", "N", "UP032", "W"]
+lint.select = [
+ "A",
+ "ARG",
+ "ASYNC",
+ "BLE",
+ "C90",
+ "E",
+ "F",
+ "I",
+ "N",
+ "UP032",
+ "W",
+]
# Add additional 3rd party tool configuration here as needed
diff --git a/src/live_data_server/__init__.py b/src/live_data_server/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/live_data_server/live_data_server/settings.py b/src/live_data_server/live_data_server/settings.py
index c125fb6..b314a85 100644
--- a/src/live_data_server/live_data_server/settings.py
+++ b/src/live_data_server/live_data_server/settings.py
@@ -140,7 +140,6 @@
USE_TZ = True
-
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.9/howto/static-files/
@@ -148,10 +147,17 @@
STATIC_ROOT = "/var/www/livedata/static/"
# Add secret key to settings only if there's a non-empty environment variable with same name
-if "LIVE_PLOT_SECRET_KEY" in os.environ:
- secret_key = os.environ.get("LIVE_PLOT_SECRET_KEY")
- if bool(secret_key):
- LIVE_PLOT_SECRET_KEY = os.environ.get("LIVE_PLOT_SECRET_KEY")
+secret_key = os.environ.get("LIVE_PLOT_SECRET_KEY")
+if secret_key:
+ LIVE_PLOT_SECRET_KEY = secret_key
+
+# Set expiration time for live plots to 3 years if not set
+expiration_time = os.environ.get("LIVE_PLOT_EXPIRATION_TIME")
+if expiration_time:
+ LIVE_PLOT_EXPIRATION_TIME = int(expiration_time)
+else:
+ LIVE_PLOT_EXPIRATION_TIME = 365 * 3
+
# Import local settings if available
try:
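Both lookups in this hunk follow the same env-with-fallback pattern. A standalone sketch of the expiration fallback (the helper name is illustrative; three years is expressed as ``365 * 3`` days, as in the hunk):

```python
import os


def resolve_expiration_days(environ=os.environ):
    # Use LIVE_PLOT_EXPIRATION_TIME (in days) when set and non-empty,
    # otherwise fall back to roughly three years, as settings.py does.
    value = environ.get("LIVE_PLOT_EXPIRATION_TIME")
    return int(value) if value else 365 * 3
```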
diff --git a/src/live_data_server/plots/admin.py b/src/live_data_server/plots/admin.py
index a1e54b9..be4eebf 100644
--- a/src/live_data_server/plots/admin.py
+++ b/src/live_data_server/plots/admin.py
@@ -1,5 +1,4 @@
from django.contrib import admin
-
from plots.models import DataRun, Instrument, PlotData
@@ -9,7 +8,14 @@ class PlotDataAdmin(admin.ModelAdmin):
class DataRunAdmin(admin.ModelAdmin):
- list_display = ("id", "run_number", "run_id", "instrument", "created_on")
+ list_display = (
+ "id",
+ "run_number",
+ "run_id",
+ "instrument",
+ "created_on",
+ "expiration_date",
+ )
admin.site.register(DataRun, DataRunAdmin)
diff --git a/src/live_data_server/plots/management/commands/purge_expired_data.py b/src/live_data_server/plots/management/commands/purge_expired_data.py
new file mode 100644
index 0000000..db17068
--- /dev/null
+++ b/src/live_data_server/plots/management/commands/purge_expired_data.py
@@ -0,0 +1,13 @@
+from django.core.management.base import BaseCommand
+from django.utils import timezone
+from plots.models import DataRun
+
+
+class Command(BaseCommand):
+ help = "Delete expired runs and related plots"
+
+ def handle(self, *args, **options): # noqa: ARG002
+ runs = DataRun.objects.all()
+ for run in runs:
+ if run.expiration_date < timezone.now():
+ run.delete()
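The command deletes expired runs one at a time. Its selection logic, reduced to plain Python over ``(id, expiration_date)`` pairs for illustration (the helper name is hypothetical):

```python
from datetime import datetime, timedelta, timezone


def select_expired(runs, now):
    """Return the ids of runs whose expiration_date has passed.

    `runs` is an iterable of (id, expiration_date) pairs; this mirrors
    the comparison in purge_expired_data without the ORM.
    """
    return [run_id for run_id, expires in runs if expires < now]
```

A single-query alternative would be ``DataRun.objects.filter(expiration_date__lt=timezone.now()).delete()``; the per-row loop is equivalent here since the ``DataRun`` model shown in this diff does not override ``delete()``.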
diff --git a/src/live_data_server/plots/migrations/0002_datarun_expiration_date.py b/src/live_data_server/plots/migrations/0002_datarun_expiration_date.py
new file mode 100644
index 0000000..8f1470a
--- /dev/null
+++ b/src/live_data_server/plots/migrations/0002_datarun_expiration_date.py
@@ -0,0 +1,22 @@
+# Generated by Django 5.1 on 2024-08-08 18:55
+
+import datetime
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+ dependencies = [
+ ("plots", "0001_initial"),
+ ]
+
+ operations = [
+ migrations.AddField(
+ model_name="datarun",
+ name="expiration_date",
+ field=models.DateTimeField(
+ default=datetime.datetime(2027, 8, 8, 18, 55, 41, 999298, tzinfo=datetime.timezone.utc),
+ verbose_name="Expires",
+ ),
+ ),
+ ]
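Note the literal ``datetime(2027, 8, 8, ...)`` baked into this migration. The model (later in this diff) passes ``timezone.now() + timedelta(...)`` as ``default`` (a value, not a callable), so the expression is evaluated once at import time and ``makemigrations`` serializes that single instant; rows created without an explicit expiration would all get the same 2027 timestamp rather than "now plus three years". A plain-Python illustration of frozen versus callable defaults (names are illustrative):

```python
import time
from datetime import datetime, timedelta, timezone

# Computed once, when the module is imported: every consumer sees this instant.
FROZEN_DEFAULT = datetime.now(timezone.utc) + timedelta(days=365 * 3)


def callable_default():
    # Computed on every call, so each new row would get a fresh expiration.
    return datetime.now(timezone.utc) + timedelta(days=365 * 3)


def apply_default(default):
    """Simulate how an ORM fills a missing column: call callables, use values as-is."""
    return default() if callable(default) else default
```

The views in this diff pass an explicit ``expiration_date`` when storing data, which sidesteps the issue for uploads; a module-level function passed as ``default=`` would make the fallback dynamic as well.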
diff --git a/src/live_data_server/plots/models.py b/src/live_data_server/plots/models.py
index 8a53c8c..0f16ac4 100644
--- a/src/live_data_server/plots/models.py
+++ b/src/live_data_server/plots/models.py
@@ -4,17 +4,18 @@
import logging
import sys
+from datetime import timedelta
+from django.conf import settings
from django.db import models
+from django.utils import timezone
DATA_TYPES = {"json": 0, "html": 1, "div": 1}
DATA_TYPE_INFO = {0: {"name": "json"}, 1: {"name": "html"}}
class Instrument(models.Model):
- """
- Table of instruments
- """
+ """Table of instruments"""
name = models.CharField(max_length=128, unique=True)
run_id_type = models.IntegerField(default=0)
@@ -24,25 +25,32 @@ def __str__(self):
class DataRun(models.Model):
- """
- Table of runs
+ """Table of runs.
+
+ A run is a collection of plots that are all related to a single data set.
+
+ Attributes:
+ run_number (int): Run number
+ run_id (str): Optional run identifier
+ instrument (Instrument): Instrument object
+ created_on (datetime): Timestamp
+ expiration_date (datetime): Expiration date
"""
run_number = models.IntegerField()
- # Optional free-form run identifier
run_id = models.TextField()
-
instrument = models.ForeignKey(Instrument, on_delete=models.deletion.CASCADE)
created_on = models.DateTimeField("Timestamp", auto_now_add=True)
+ expiration_date = models.DateTimeField(
+ "Expires", default=timezone.now() + timedelta(days=(settings.LIVE_PLOT_EXPIRATION_TIME))
+ )
def __str__(self):
return f"{self.instrument}_{self.run_number}_{self.run_id}"
class PlotData(models.Model):
- """
- Table of plot data. This data can either be json or html
- """
+ """Table of plot data. This data can either be json or html"""
## DataRun this run status belongs to
data_run = models.ForeignKey(DataRun, on_delete=models.deletion.CASCADE)
@@ -60,8 +68,8 @@ def __str__(self):
return str(self.data_run)
def is_data_type_valid(self, data_type):
- """
- Verify that a given data type matches the stored data
+ """Verify that a given data type matches the stored data
+
@param data_type: data type to check
"""
try:
@@ -73,8 +81,8 @@ def is_data_type_valid(self, data_type):
@classmethod
def get_data_type_from_data(cls, data):
- """
- Inspect the data to guess what type it is.
+ """Inspect the data to guess what type it is.
+
@param data: block of text to store
"""
        if data.startswith("<"):
    re_path(r"^(?P<user>[\w]+)/upload_user_data/$", views.upload_user_data, name="upload_user_data"),
    re_path(r"^(?P<instrument>[\w]+)/list/$", views.get_data_list, name="get_data_list"),
+    # re_path(r"^(?P<instrument>[\w]+)/list_extra/$", views.get_data_list, name="get_data_list"),
]
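The ``re_path`` patterns above route per-name endpoints. A quick ``re`` check of what the list route accepts (the pattern is a reconstruction of the stripped urls.py text; the ``instrument`` group name is inferred from ``get_data_list(request, instrument)`` later in this diff):

```python
import re

# Assumed reconstruction of the urls.py list pattern.
LIST_PATTERN = re.compile(r"^(?P<instrument>[\w]+)/list/$")


def match_instrument(path):
    """Return the captured instrument name, or None if the path does not match."""
    m = LIST_PATTERN.match(path)
    return m.group("instrument") if m else None
```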
diff --git a/src/live_data_server/plots/view_util.py b/src/live_data_server/plots/view_util.py
index 2d128a9..0428f01 100644
--- a/src/live_data_server/plots/view_util.py
+++ b/src/live_data_server/plots/view_util.py
@@ -5,11 +5,12 @@
import hashlib
import logging
import sys
+from datetime import datetime
+from typing import Optional
from django.conf import settings
from django.http import HttpResponse
from django.utils import timezone
-
from plots.models import DataRun, Instrument, PlotData
@@ -62,7 +63,12 @@ def request_processor(request, instrument, run_id):
return request_processor
-def get_or_create_run(instrument, run_id, create=True):
+def get_or_create_run(
+ instrument,
+ run_id,
+ expiration_date: Optional[datetime] = None,
+ create: bool = True,
+):
"""
Retrieve a run entry, or create it.
@param instrument: instrument name
@@ -89,6 +95,7 @@ def get_or_create_run(instrument, run_id, create=True):
run_obj = DataRun()
run_obj.instrument = instrument_obj
run_obj.run_number = run_id
+ run_obj.expiration_date = expiration_date
run_obj.save()
else:
return None
@@ -113,10 +120,10 @@ def get_plot_data(instrument, run_id, data_type=None):
return None
-def store_user_data(user, data_id, data, data_type):
+def store_user_data(user, data_id, data, data_type, expiration_date: Optional[datetime] = None):
"""
- Store plot data and associate it to a user identifier (a name, not
- an actual user since users don't log in to this system).
+ Store plot data and associate it to a user identifier
+ (a name, not an actual user since users don't log in to this system).
"""
# Get or create the instrument
instrument_list = Instrument.objects.filter(name=user.lower())
@@ -135,9 +142,10 @@ def store_user_data(user, data_id, data, data_type):
run_obj.instrument = instrument_obj
run_obj.run_number = 0
run_obj.run_id = data_id
+ run_obj.expiration_date = expiration_date
run_obj.save()
# Since user data have no run number, force the run number to be the PK,
- # which is unique and will allow use to retrieve the data live normal
+ # which is unique and will allow user to retrieve the data like normal
# instrument data.
run_obj.run_number = run_obj.id
run_obj.save()
@@ -157,7 +165,7 @@ def store_user_data(user, data_id, data, data_type):
plot_data.save()
-def store_plot_data(instrument, run_id, data, data_type):
+def store_plot_data(instrument, run_id, data, data_type, expiration_date: Optional[datetime] = None):
"""
Store plot data
@param instrument: instrument name
@@ -165,7 +173,7 @@ def store_plot_data(instrument, run_id, data, data_type):
@param data: data to be stored
@param data_type: requested data type
"""
- run_object = get_or_create_run(instrument, run_id)
+ run_object = get_or_create_run(instrument, run_id, expiration_date)
# Look for a data file and treat it differently
data_entries = PlotData.objects.filter(data_run=run_object)
diff --git a/src/live_data_server/plots/views.py b/src/live_data_server/plots/views.py
index 3d7e734..ac1399b 100644
--- a/src/live_data_server/plots/views.py
+++ b/src/live_data_server/plots/views.py
@@ -4,6 +4,7 @@
import json
import logging
+from datetime import timedelta
from django.conf import settings
from django.contrib.auth import authenticate, login
@@ -12,7 +13,6 @@
from django.utils import dateformat, timezone
from django.views.decorators.cache import cache_page
from django.views.decorators.csrf import csrf_exempt
-
from plots.models import DataRun, Instrument, PlotData
from . import view_util
@@ -97,11 +97,15 @@ def _store(request, instrument, run_id=None, as_user=False):
raw_data = request.FILES["file"].read().decode("utf-8")
data_type_default = PlotData.get_data_type_from_data(raw_data)
data_type = request.POST.get("data_type", default=data_type_default)
+ expiration_date = request.POST.get(
+ "expiration_date", default=timezone.now() + timedelta(days=settings.LIVE_PLOT_EXPIRATION_TIME)
+ )
+
if as_user:
data_id = request.POST.get("data_id", default="")
- view_util.store_user_data(instrument, data_id, raw_data, data_type)
+ view_util.store_user_data(instrument, data_id, raw_data, data_type, expiration_date)
else:
- view_util.store_plot_data(instrument, run_id, raw_data, data_type)
+ view_util.store_plot_data(instrument, run_id, raw_data, data_type, expiration_date)
else:
return HttpResponse(status=400)
@@ -129,22 +133,27 @@ def upload_user_data(request, user):
@csrf_exempt
@check_credentials
-def get_data_list(_, instrument):
+def get_data_list(request, instrument):
"""
Get a list of user data
"""
instrument_object = get_object_or_404(Instrument, name=instrument.lower())
data_list = []
+ get_extra = request.POST.get("extra", default=False)
for item in DataRun.objects.filter(instrument=instrument_object):
- localtime = timezone.localtime(item.created_on)
- df = dateformat.DateFormat(localtime)
- data_list.append(
- dict(
- id=item.id,
- run_number=str(item.run_number),
- run_id=item.run_id,
- timestamp=item.created_on.isoformat(),
- created_on=df.format(settings.DATETIME_FORMAT),
- )
+ timestamp_local = timezone.localtime(item.created_on)
+ timestamp_formatted = dateformat.DateFormat(timestamp_local).format(settings.DATETIME_FORMAT)
+ data = dict(
+ id=item.id,
+ run_number=str(item.run_number),
+ run_id=item.run_id,
+ timestamp=item.created_on.isoformat(),
+ created_on=timestamp_formatted,
)
+ if get_extra:
+ expiration_local = timezone.localtime(item.expiration_date)
+ expiration_formatted = dateformat.DateFormat(expiration_local).format(settings.DATETIME_FORMAT)
+ data["expiration_date"] = expiration_formatted
+ data["expired"] = expiration_local < timezone.now()
+ data_list.append(data)
return JsonResponse(data_list, safe=False)
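The ``extra`` flag adds two derived fields per run. The shape of one list entry, rebuilt in plain Python (field names follow the view above; plain ``isoformat()`` stands in for Django's ``dateformat`` and localtime handling):

```python
from datetime import datetime, timedelta, timezone


def serialize_run(pk, run_number, run_id, created_on, expiration_date, now, extra=False):
    """Mirror of get_data_list's per-run dict, without the ORM.

    `extra` adds the expiration date and an `expired` boolean, as the
    view does when the POST body contains extra=True.
    """
    data = {
        "id": pk,
        "run_number": str(run_number),
        "run_id": run_id,
        "timestamp": created_on.isoformat(),
        "created_on": created_on.isoformat(),
    }
    if extra:
        data["expiration_date"] = expiration_date.isoformat()
        data["expired"] = expiration_date < now
    return data
```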
diff --git a/tests/conftest.py b/tests/conftest.py
index a1044ad..c684fd7 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -1,5 +1,3 @@
-# 3rd-party imports
-# standard imports
import os
import sys
@@ -10,7 +8,7 @@
@pytest.fixture(scope="module")
def data_server():
- r"""Object containing info and functionality for data files.
+ """Object containing info and functionality for data files.
It assumes the data files are stored under directory `data/`, located
under the same directory as this module.
@@ -21,14 +19,14 @@ class _DataServe(object):
@property
def directory(self):
- r"""Directory where to find the data files"""
+ """Directory where to find the data files"""
return self._directory
def path_to(self, basename):
- r"""Absolute path to a data file"""
+ """Absolute path to a data file"""
file_path = os.path.join(self._directory, basename)
if not os.path.isfile(file_path):
- raise IOError("File {basename} not found in data directory {self._directory}")
+ raise IOError(f"File {basename} not found in data directory {self._directory}")
return file_path
return _DataServe()
diff --git a/tests/test_expiration.py b/tests/test_expiration.py
new file mode 100644
index 0000000..abc3185
--- /dev/null
+++ b/tests/test_expiration.py
@@ -0,0 +1,147 @@
+import json
+import os
+import subprocess
+from datetime import datetime, timedelta, timezone
+
+import psycopg
+import requests
+
+TEST_URL = "http://127.0.0.1"
+HTTP_OK = requests.status_codes.codes["OK"]
+HTTP_UNAUTHORIZED = requests.status_codes.codes["unauthorized"]
+HTTP_NOT_FOUND = requests.status_codes.codes["NOT_FOUND"]
+HTTP_BAD_REQUEST = requests.status_codes.codes["BAD_REQUEST"]
+
+
+class TestLiveDataServer:
+ # authenticate with username and password
+ username = os.environ.get("DJANGO_SUPERUSER_USERNAME")
+ password = os.environ.get("DJANGO_SUPERUSER_PASSWORD")
+ user_data = {"username": username, "password": password}
+
+ @classmethod
+ def setup_class(cls):
+ """Clean the database before running tests"""
+ conn = psycopg.connect(
+ dbname=os.environ.get("DATABASE_NAME"),
+ user=os.environ.get("DATABASE_USER"),
+ password=os.environ.get("DATABASE_PASS"),
+ port=os.environ.get("DATABASE_PORT"),
+ host="localhost",
+ )
+ cur = conn.cursor()
+ cur.execute("DELETE FROM plots_plotdata")
+ cur.execute("DELETE FROM plots_datarun")
+ cur.execute("DELETE FROM plots_instrument")
+ conn.commit()
+ conn.close()
+
+ def test_expiration_plot(self, data_server):
+ """Test the expiration field on DataRun model for upload_plot_data"""
+
+ instrument = "TEST_INST"
+
+ # request data
+ filename = "reflectivity.html"
+ files = {"file": open(data_server.path_to(filename)).read()}
+ request_data = {
+ **self.user_data,
+ "data_id": filename,
+ }
+
+ # create a new run
+ run_id = 12345
+ request = requests.post(
+ f"{TEST_URL}/plots/{instrument}/{run_id}/upload_plot_data/", data=request_data, files=files, verify=True
+ )
+ assert request.status_code == HTTP_OK
+
+ # create expired run
+ run_id += 1
+ expiration_date = datetime.now(tz=timezone.utc) - timedelta(days=365 * 3)
+ request_data["expiration_date"] = expiration_date
+ request = requests.post(
+ f"{TEST_URL}/plots/{instrument}/{run_id}/upload_plot_data/",
+ data=request_data,
+ files=files,
+ verify=True,
+ )
+ assert request.status_code == HTTP_OK
+
+ request = requests.post(
+ f"{TEST_URL}/plots/{instrument}/list/",
+ data={**self.user_data, "extra": True},
+ )
+ assert request.status_code == HTTP_OK
+
+ r = request.json()
+ assert r[0]["expired"] is False
+ assert r[1]["expired"] is True
+
+ def test_expiration_user(self, data_server):
+ """Test the expiration field on DataRun model for upload_user_data"""
+
+ filename = "reflectivity.json"
+ with open(data_server.path_to(filename), "r") as file_handle:
+ files = {"file": json.dumps(json.load(file_handle))}
+ request_data = {
+ **self.user_data,
+ "data_id": filename,
+ }
+
+ # create a new run
+ request = requests.post(
+ f"{TEST_URL}/plots/{self.username}/upload_user_data/", data=request_data, files=files, verify=True
+ )
+ assert request.status_code == HTTP_OK
+
+ # create expired run
+ expiration_date = datetime.now(tz=timezone.utc) - timedelta(days=365 * 3)
+ request_data["data_id"] = "reflectivity_expired.json"
+ request_data["expiration_date"] = expiration_date
+ request = requests.post(
+ f"{TEST_URL}/plots/{self.username}/upload_user_data/", data=request_data, files=files, verify=True
+ )
+ assert request.status_code == HTTP_OK
+
+ request = requests.post(
+ f"{TEST_URL}/plots/{self.username}/list/",
+ data={**self.user_data, "extra": True},
+ )
+ assert request.status_code == HTTP_OK
+
+ # check that expiration field for runs are marked correctly
+ r = request.json()
+ assert r[0]["expired"] is False
+ assert r[1]["expired"] is True
+
+ def test_deleting_expired(self):
+ """Test the purge_expired_data command"""
+ command = "docker exec -i live_data_server-livedata-1 bash -ic"
+ subcommand = "conda activate livedata && cd app && coverage run manage.py purge_expired_data"
+ # subcommand = "conda activate livedata && cd app && python manage.py purge_expired_data"
+ output = subprocess.check_output([*command.split(" "), subcommand])
+ print(output)
+
+ # Ensure the above ran and worked
+ conn = psycopg.connect(
+ dbname=os.environ.get("DATABASE_NAME"),
+ user=os.environ.get("DATABASE_USER"),
+ password=os.environ.get("DATABASE_PASS"),
+ port=os.environ.get("DATABASE_PORT"),
+ host="localhost",
+ )
+ cur = conn.cursor()
+
+ cur.execute("SELECT * FROM plots_datarun")
+ results = cur.fetchall()
+ print(f"Runs after purge: {len(results)}")
+ for i in results:
+ print(i)
+ assert len(results) == 2
+
+ # Plots after purge
+ cur.execute("SELECT * FROM plots_plotdata")
+ results = cur.fetchall()
+ print(f"Plots after purge: {len(results)}")
+ assert len(results) == 2
diff --git a/tests/test_post_get.py b/tests/test_post_get.py
index 43f3256..bf6dc2b 100644
--- a/tests/test_post_get.py
+++ b/tests/test_post_get.py
@@ -1,4 +1,3 @@
-# standard imports
import hashlib
import json
import os
@@ -14,6 +13,11 @@
class TestLiveDataServer:
+ # authenticate with username and password
+ username = os.environ.get("DJANGO_SUPERUSER_USERNAME")
+ password = os.environ.get("DJANGO_SUPERUSER_PASSWORD")
+ user_data = {"username": username, "password": password}
+
@classmethod
def setup_class(cls):
"""Clean the database before running tests"""
@@ -32,93 +36,90 @@ def setup_class(cls):
conn.close()
def test_post_request(self, data_server):
- username = os.environ.get("DJANGO_SUPERUSER_USERNAME")
- monitor_user = {"username": username, "password": os.environ.get("DJANGO_SUPERUSER_PASSWORD")}
-
# load html plot as autoreduce service
- file_name = "reflectivity.html"
- files = {"file": open(data_server.path_to(file_name)).read()}
- monitor_user["data_id"] = file_name
+ filename = "reflectivity.html"
+ files = {"file": open(data_server.path_to(filename)).read()}
+ request_data = {
+ **self.user_data,
+ "data_id": filename,
+ }
- http_request = requests.post(
- TEST_URL + "/plots/REF_L/12345/upload_plot_data/", data=monitor_user, files=files, verify=True
+ request = requests.post(
+ f"{TEST_URL}/plots/TEST_INST/12345/upload_plot_data/", data=request_data, files=files, verify=True
)
- assert http_request.status_code == HTTP_OK
+ assert request.status_code == HTTP_OK
# load json plot a user "someuser" of the web-reflectivity app
- file_name = "reflectivity.json"
- with open(data_server.path_to(file_name), "r") as file_handle:
+ filename = "reflectivity.json"
+ with open(data_server.path_to(filename), "r") as file_handle:
files = {"file": json.dumps(json.load(file_handle))}
- monitor_user["data_id"] = file_name
+ request_data["data_id"] = filename
- http_request = requests.post(
- TEST_URL + "/plots/" + username + "/upload_user_data/", data=monitor_user, files=files, verify=True
+ request = requests.post(
+ f"{TEST_URL}/plots/{self.username}/upload_user_data/", data=request_data, files=files, verify=True
)
- assert http_request.status_code == HTTP_OK
+ assert request.status_code == HTTP_OK
- monitor_user.pop("data_id")
# get all plots for an instrument
- http_request = requests.post(TEST_URL + "/plots/REF_L/list/", data=monitor_user, files={}, verify=True)
- assert http_request.status_code == HTTP_OK
+ request = requests.post(f"{TEST_URL}/plots/TEST_INST/list/", data=self.user_data, files={}, verify=True)
+ assert request.status_code == HTTP_OK
# get all plots from someuser
- http_request = requests.post(
- TEST_URL + "/plots/" + username + "/list/", data=monitor_user, files={}, verify=True
- )
- assert http_request.status_code == HTTP_OK
+ request = requests.post(f"{TEST_URL}/plots/{self.username}/list/", data=self.user_data, files={}, verify=True)
+ assert request.status_code == HTTP_OK
def test_get_request(self, data_server):
"""Test GET request for HTML data like from monitor.sns.gov"""
instrument = "REF_M"
run_number = 12346
- # upload the run data using POST (authenticate with username and password)
- username = os.environ.get("DJANGO_SUPERUSER_USERNAME")
- monitor_user = {"username": username, "password": os.environ.get("DJANGO_SUPERUSER_PASSWORD")}
# load html plot as autoreduce service
- file_name = "reflectivity.html"
- files = {"file": open(data_server.path_to(file_name)).read()}
- monitor_user["data_id"] = file_name
+ filename = "reflectivity.html"
+ files = {"file": open(data_server.path_to(filename)).read()}
+ request_data = {
+ **self.user_data,
+ "data_id": filename,
+ }
- http_request = requests.post(
+ request = requests.post(
f"{TEST_URL}/plots/{instrument}/{run_number}/upload_plot_data/",
- data=monitor_user,
+ data=request_data,
files=files,
verify=True,
)
- assert http_request.status_code == HTTP_OK
+ assert request.status_code == HTTP_OK
base_url = f"{TEST_URL}/plots/{instrument}/{run_number}/update/html/"
# test GET request - authenticate with secret key
url = f"{base_url}?key={_generate_key(instrument, run_number)}"
- http_request = requests.get(url)
- assert http_request.status_code == HTTP_OK
- assert http_request.text == files["file"]
+ request = requests.get(url)
+ assert request.status_code == HTTP_OK
+ assert request.text == files["file"]
# test that getting the json should return not found
- http_request = requests.get(
+ request = requests.get(
f"{TEST_URL}/plots/{instrument}/{run_number}/update/json/?key={_generate_key(instrument, run_number)}"
)
- assert http_request.status_code == HTTP_NOT_FOUND
- assert http_request.text == "No data available for REF_M 12346"
+ assert request.status_code == HTTP_NOT_FOUND
+ assert request.text == "No data available for REF_M 12346"
# test GET request - no key
url = base_url
- http_request = requests.get(url)
- assert http_request.status_code == HTTP_UNAUTHORIZED
+ request = requests.get(url)
+ assert request.status_code == HTTP_UNAUTHORIZED
# test GET request - wrong key
url = f"{base_url}?key=WRONG-KEY"
- http_request = requests.get(url)
- assert http_request.status_code == HTTP_UNAUTHORIZED
+ request = requests.get(url)
+ assert request.status_code == HTTP_UNAUTHORIZED
# test GET request - wrong key
- http_request = requests.get(
+ request = requests.get(
base_url,
headers={"Authorization": "WRONG-KEY"},
)
- assert http_request.status_code == HTTP_UNAUTHORIZED
+ assert request.status_code == HTTP_UNAUTHORIZED
def test_upload_plot_data_json(self):
# test that when you upload json you can get back the same stuff
@@ -136,12 +137,12 @@ def test_upload_plot_data_json(self):
assert response.status_code == HTTP_NOT_FOUND
# now upload json data
- http_request = requests.post(
+ request = requests.post(
f"{TEST_URL}/plots/{instrument}/{run_number}/upload_plot_data/",
data=monitor_user,
files={"file": json.dumps(data)},
)
- assert http_request.status_code == HTTP_OK
+ assert request.status_code == HTTP_OK
# check list of data
response = requests.post(f"{TEST_URL}/plots/{instrument}/list/", data=monitor_user)
@@ -178,19 +179,19 @@ def test_bad_request(self):
}
# missing files
- http_request = requests.post(
+ request = requests.post(
f"{TEST_URL}/plots/{instrument}/{run_number}/upload_plot_data/",
data=monitor_user,
)
- assert http_request.status_code == HTTP_BAD_REQUEST
+ assert request.status_code == HTTP_BAD_REQUEST
# used filename instead of file in files
- http_request = requests.post(
+ request = requests.post(
f"{TEST_URL}/plots/{instrument}/{run_number}/upload_plot_data/",
data=monitor_user,
files={"filename": ""},
)
- assert http_request.status_code == HTTP_BAD_REQUEST
+ assert request.status_code == HTTP_BAD_REQUEST
def test_unauthorized(self):
# test get request unauthorized