Commit f080791

Merge branch 'main' into logger-config

2 parents 872ecde + 4159431

95 files changed: +1056 −1311 lines
.flake8

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
+[flake8]
+max-line-length = 100
+extend-ignore = E203,E701
+exclude = */proto/*_pb2*.py

.github/workflows/docker-publish.yml

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python-version: ['3.9', '3.10', '3.11', '3.12']
+        python-version: ['3.9', '3.10', '3.11', '3.12', '3.13']

     permissions:
       contents: read

.github/workflows/tests.yml

Lines changed: 17 additions & 16 deletions
@@ -5,24 +5,25 @@ on:
   - pull_request

 jobs:
-  build:
+  generate-jobs:
+    runs-on: ubuntu-latest
+    outputs:
+      session: ${{ steps.set-matrix.outputs.session }}
+    steps:
+      - uses: actions/checkout@v4
+      - uses: wntrblm/nox@main
+      - id: set-matrix
+        shell: bash
+        run: echo session=$(nox --json -l -s tests | jq -c '[.[].session]') | tee --append $GITHUB_OUTPUT
+  checks:
+    name: Session ${{ matrix.session }}
+    needs: [generate-jobs]
     runs-on: ubuntu-latest
     strategy:
+      fail-fast: false
       matrix:
-        python-version: ["3.9", "3.10", "3.11", "3.12"]
-
+        session: ${{ fromJson(needs.generate-jobs.outputs.session) }}
     steps:
       - uses: actions/checkout@v4
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v5
-        with:
-          python-version: ${{ matrix.python-version }}
-      - name: Install dependencies
-        run: |
-          python -m pip install --upgrade pip
-          python -m pip install tox tox-gh-actions
-      # TODO: when calling pytest, it build a new dev image, but it should build the image
-      # using the specific python version. Before it was working cause the compose was not
-      # building the image, rather the image was built here with the {{ matrix.python-version}}
-      - name: Test with tox
-      run: tox
+      - uses: wntrblm/nox@main
+      - run: nox -s "${{ matrix.session }}"

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -15,6 +15,8 @@ tests/mock/client.properties
 /setup.py.orig
 /.project
 .tox
+.nox
+.mypy_cache
 *__pycache__*
 /.idea/
 /venv*/

docs/contributing.rst

Lines changed: 9 additions & 8 deletions
@@ -1,7 +1,8 @@
 Contributing
 ============

-dataClay is a `BSC <https://www.bsc.es/research-and-development/software-and-apps/software-list/dataclay>`_
+The dataClay distributed data store is a
+`BSC <https://www.bsc.es/research-and-development/software-and-apps/software-list/dataclay>`_
 project under the `BSD License <https://github.com/bsc-dom/dataclay/blob/main/LICENSE.txt>`_
 and we happily accept contributions.

@@ -16,7 +17,7 @@ If you wish to add a new feature or fix a bug:
 #. Write a test which shows that the bug was fixed or that the feature works
    as expected.
 #. Format your changes with `Black <https://black.readthedocs.io/en/stable/>`_ using the
-   command `tox -e format` and lint your changes using the command `tox -e lint`.
+   command `nox -s format` and lint your changes using the command `nox -s lint`.
 #. Send a pull request and follow up with the maintainer until it gets merged and published.

 .. #. Add a `changelog entry
@@ -25,14 +26,14 @@ If you wish to add a new feature or fix a bug:
 Setting up your development environment
 ---------------------------------------

-To set up your development environment, you will need `tox`_ installed on your machine:
+To set up your development environment, you will need `nox`_ installed on your machine:

 .. code-block:: console

-   $ python -m pip install --user --upgrade tox
+   $ python -m pip install --user --upgrade nox

 You will also need to have `docker engine <https://docs.docker.com/engine/install/ubuntu/>`_ installed
-for `tox`_ to use `pytest-docker <https://pypi.org/project/pytest-docker/>`_.
+for `nox`_ to use `pytest-docker <https://pypi.org/project/pytest-docker/>`_.

 Install dataClay in editable mode with the ``dev`` extra requirement:

@@ -44,11 +45,11 @@ Running the tests
 -----------------

 When running the test suite, we use external dependencies, multiple interpreters, and code coverage analysis.
-Our `tox.ini <https://github.com/bsc-dom/dataclay/blob/main/tox.ini>`_ file handles much of this for you:
+Our `noxfile.py <https://github.com/bsc-dom/dataclay/blob/main/noxfile.py>`_ file handles much of this for you:

 .. code-block:: console

-   $ tox
+   $ nox

-.. _tox: https://tox.wiki/en/stable/
+.. _nox: https://nox.thea.codes/en/stable/

docs/index.rst

Lines changed: 11 additions & 3 deletions
@@ -10,8 +10,6 @@ dataClay
    main-concepts
    alien-objects
    advanced-usage
-   telemetry
-   hpc-tracing
    logging
    examples/index

@@ -26,12 +24,22 @@ dataClay
    deployment/hpc-manual-deployment
    deployment/compile-redis

+.. toctree::
+   :hidden:
+   :caption: Telemetry
+
+   telemetry/configuration
+   telemetry/offline
+   telemetry/real-time
+   telemetry/prometheus
+   telemetry/hpc-tracing
+
 .. toctree::
    :hidden:
    :caption: Release Notes

-   releasenotes/3-x
    releasenotes/4-x
+   releasenotes/3-x

 .. toctree::
    :hidden:

docs/telemetry.rst

Lines changed: 0 additions & 28 deletions
This file was deleted.

docs/telemetry/configuration.rst

Lines changed: 41 additions & 0 deletions
@@ -0,0 +1,41 @@
+Telemetry Configuration
+=======================
+
+dataClay is instrumented with `OpenTelemetry <https://opentelemetry.io/>`_ to allow observability of
+distributed traces, metrics, and logs. You can configure tracing to export telemetry data either in
+real time or for post-mortem analysis. Visualizations can be performed in Grafana.
+
+Configuration
+-------------
+
+To activate tracing in dataClay, the following environment variables need to be set:
+
+- **`DATACLAY_TRACING`**: Set to `true` to enable tracing.
+- **`DATACLAY_TRACING_EXPORTER`**: Export traces to the OpenTelemetry Collector (`otlp`) or print traces to the console (`console`). The default is `otlp`.
+- **`DATACLAY_TRACING_HOST`**: Host of the OpenTelemetry Collector (default: `localhost`).
+- **`DATACLAY_TRACING_PORT`**: Port of the OpenTelemetry Collector (default: `4317`).
+- **`DATACLAY_SERVICE_NAME`**: The service name, which identifies dataClay components in trace data.
+
+Metrics
+-------
+
+.. list-table::
+   :header-rows: 1
+
+   * - Metric
+     - Description
+     - Service
+   * - dataclay_inmemory_objects
+     - Number of objects in memory
+     - backend, client
+   * - dataclay_loaded_objects
+     - Number of loaded objects
+     - backend
+   * - dataclay_stored_objects
+     - Number of stored objects
+     - backend
+   * - dataclay_inmemory_misses_total
+     - Number of in-memory misses
+     - backend, client
+   * - dataclay_inmemory_hits_total
+     - Number of in-memory hits
+     - backend, client
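The environment variables documented in the new configuration page can be exported before starting a dataClay process. A minimal sketch, assuming a local OpenTelemetry Collector on the default OTLP gRPC port (the service name `client` is an illustrative choice, not mandated by dataClay):

```shell
# Hedged sketch: enable OTLP tracing for one dataClay component.
export DATACLAY_TRACING=true
export DATACLAY_TRACING_EXPORTER=otlp       # or "console" to print traces to stdout
export DATACLAY_TRACING_HOST=localhost      # OpenTelemetry Collector host
export DATACLAY_TRACING_PORT=4317           # Collector OTLP gRPC port (default)
export DATACLAY_SERVICE_NAME=client         # label identifying this component in traces
echo "tracing=$DATACLAY_TRACING exporter=$DATACLAY_TRACING_EXPORTER"
```

In a `docker compose` deployment the same variables would go under each service's `environment:` key instead.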

docs/hpc-tracing.rst renamed to docs/telemetry/hpc-tracing.rst

Lines changed: 4 additions & 6 deletions
@@ -1,12 +1,9 @@
-===========
 HPC Tracing
 ===========

-How to generate paraver traces in MN5
-=====================================
+How to generate paraver traces in MN5 using COMPSs
+--------------------------------------------------

-Using COMPSs
-------------
 In order to get the traces we will create a script.

 - First we have to import the COMPSs and DataClay modules in order to be able to use them, as well as defining which python version we will be using:

@@ -57,7 +54,8 @@ In order to generate the paraver files, we will call another COMPSs script, "com
 If we run this script in the same directory where we found the traces ($HOME/.COMPSs/[SLURM_JOB_ID]/trace/), the paraver files will appear.

 How to inspect the traces in Paraver
-====================================
+------------------------------------
+
 To be able to see these files we will have to open them using the following commands:

 .. code-block:: bash

docs/telemetry/offline.rst

Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
+Offline Telemetry Example
+=========================
+
+This example demonstrates exporting OpenTelemetry traces to a JSON file for post-mortem analysis in Grafana.
+
+1. **Activate tracing** by setting environment variables as described in the `telemetry configuration <https://dataclay.bsc.es/docs/telemetry/configuration>`_.
+2. **Generate traces**:
+
+   - Navigate to the `json-exporter` folder in the `offline telemetry example JSON exporter <https://github.com/bsc-dom/dataclay/tree/telemetry-doc/examples/telemetry/offline/json-exporter>`_.
+   - Start the dataClay and OpenTelemetry Collector services:
+
+     .. code-block:: bash
+
+        docker compose up
+
+   - Run the dataClay client:
+
+     .. code-block:: bash
+
+        python3 client.py
+
+   - Traces are exported to the `traces` folder. You can visualize the JSON traces in Grafana.
+
+3. **Visualize in Grafana**:
+
+   - Navigate to the `json-post-mortem` folder in the `offline telemetry example post-mortem <https://github.com/bsc-dom/dataclay/tree/telemetry-doc/examples/telemetry/offline/json-post-mortem>`_.
+   - Start the OpenTelemetry Collector, Tempo, and Grafana services:
+
+     .. code-block:: bash
+
+        docker compose up
+
+   - Open Grafana at <http://localhost:3000> (default username/password: `admin`/`admin`).
+   - In the `Explore` section, select `Tempo` as the data source and use the `Trace ID` field to query traces.
+
+4. **Alternative Trace Export**:
+
+   - Run the OpenTelemetry Collector manually:
+
+     .. code-block:: bash
+
+        docker run \
+          -v ./config/otel-collector.yaml:/etc/otel-collector.yaml \
+          otel/opentelemetry-collector-contrib \
+          "--config=/etc/otel-collector.yaml"
+
+5. **Copy Traces from MareNostrum 5**:
+
+   - To analyze traces from MareNostrum 5, copy them locally:
+
+     .. code-block:: bash
+
+        scp transfer1.bsc.es:~/.dataclay/otel-traces.json ./traces/otel-traces.json
+
+6. **Troubleshooting**:
+
+   - If permission issues arise for the `traces` folder, adjust permissions:
+
+     .. code-block:: bash
+
+        sudo chmod -R 777 traces
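The new page above revolves around a JSON trace file consumed post-mortem. As a quick sanity check before loading such a file into Grafana, spans can be counted with the standard library alone; this is a hedged sketch where the nested `resourceSpans`/`scopeSpans`/`spans` shape follows the common OTLP-JSON layout, and the sample content (span names included) is invented for illustration:

```python
import json

# Illustrative stand-in for one line of an OTLP-JSON trace export;
# real files from a collector file exporter hold many such batches.
sample_line = json.dumps({
    "resourceSpans": [
        {"scopeSpans": [{"spans": [{"name": "get_object"}, {"name": "put_object"}]}]}
    ]
})

batch = json.loads(sample_line)
# Walk resourceSpans -> scopeSpans -> spans and count the leaves.
span_count = sum(
    len(scope.get("spans", []))
    for rs in batch.get("resourceSpans", [])
    for scope in rs.get("scopeSpans", [])
)
print(span_count)  # 2
```

An empty count usually means tracing was not enabled (see the `DATACLAY_TRACING` variables) rather than a Grafana problem.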
