Merged (43 commits)
0cd32ae
Added clustering and mode tracking
Oct 21, 2025
1f711db
Added clustering and mode tracking
Oct 21, 2025
d34018c
Merge branch 'main' of https://github.com/au650680/example-shm
Oct 21, 2025
a487753
Merge branch 'main' into clustering_and_mode_tracking
Oct 21, 2025
62bddcd
Edits to last pull
Oct 28, 2025
aedc878
Bug fix and small change to plot_cluster
Oct 31, 2025
802139e
Small changes
Nov 5, 2025
9d145fd
Small changes
Nov 5, 2025
ee4f49a
Model update module and other changes
Nov 11, 2025
1e08818
Update README.md
Nov 11, 2025
c8ba534
Changed the typing of some arguments in some functions
Nov 11, 2025
e7e0926
Merge branch 'clustering_and_mode_tracking' into fix_clustering_and_m…
Nov 12, 2025
708df2c
Delete init file
Nov 12, 2025
8e8858a
Multiple changes and fixes
Nov 18, 2025
dedbf22
Simplifications to commit
Nov 21, 2025
23e1f4d
TopicsToPublish placeholder names
Nov 21, 2025
47da3c7
New record and replay data
Nov 21, 2025
bbdc720
Minor changes to comments
Nov 21, 2025
bbf56aa
Updated record and replay functions
Nov 24, 2025
d83cc29
Update model_update.py
Nov 24, 2025
0600840
Minor changes
Nov 25, 2025
afd27e1
Fixes and looping replay function
Nov 25, 2025
f9b3e39
Revert back to "sysid" key in configurations
Nov 25, 2025
8acef14
YAFEM model function is added to constants.py
Nov 27, 2025
67ac538
Refactor MQTT record/replay and change to model update information
Nov 27, 2025
e887bd4
Change recording topic to match with subscribe topic.
Nov 27, 2025
0875f7b
New topic names and origon added to replay config
Nov 28, 2025
32f667a
Hotfix for model update functions
Nov 28, 2025
ad7a8fc
Updated poetry.lock
Nov 28, 2025
c8b7599
Updated USERGUIDE
Nov 28, 2025
ca522ac
Added record and replay guide
Nov 28, 2025
89f3239
Small improvements
Feb 25, 2026
2e5c96b
Replay function edit and beam experiment data added
Mar 5, 2026
d348023
Changes to replay function
Mar 5, 2026
0b8a54e
Merge remote-tracking branch 'upstream/main' into fix_clustering_and_…
Mar 9, 2026
f6b2e21
Updated plot_sysid.py
Mar 9, 2026
0900daa
Added readme file, reference results for beam experiment, bug fixes
Mar 11, 2026
7c34e4e
Update beam import
Mar 11, 2026
14f6c16
replay functionality added as example
Mar 11, 2026
90e5d7a
Included record folder to project
Mar 11, 2026
4fe7578
Change to file path in replay.py
Mar 11, 2026
7962ed8
Removed os library
Mar 11, 2026
270dc7b
Removed os pathing from MU
Mar 11, 2026
299 changes: 241 additions & 58 deletions USERGUIDE.md

This guide outlines configuration and execution of the _example-shm_ application. It provides detailed information on the configuration parameters, command-line options, and execution procedures.

There are two ways to run the package: directly from the repository code as a developer, or as a built package. The latter is described further down in section 2.

## 1. Run the package with the developer setup

### Step 1.1: Initiate a virtual environment with poetry

```bash
python -m venv .venv
.\.venv\Scripts\Activate.ps1   # On Windows
source .venv/bin/activate      # On Linux/macOS

pip install poetry   # install poetry into the virtual environment
poetry install       # install all required Python packages

# If you already have poetry installed globally:
poetry env activate  # prints the command to activate the venv
```
### Step 1.2: Set up the configuration file

The `config/` directory contains a set of standard configuration templates:

* `production.json.template`: for general digital twin purposes
* `replay.json.template`: for recording and replaying data
* `replay_production.json.template`: for running function blocks with replayed data

Copy the config files you need and remove the `.template` suffix to use them.

Add the MQTT connection information, credentials, and topics.
For production purposes, the MQTT topic structure follows the pattern:

```txt
cpsens/DAQ_ID/MODULE_ID/CH_ID/PHYSICS/ANALYSIS/DATA_TYPE
```

where `DATA_TYPE` is either metadata or plain data.
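As an illustration, such a topic can be built and split programmatically (all IDs below are invented placeholders, not values from the project):

```python
# Build and parse a production MQTT topic following the documented pattern.
# Every ID used here is a hypothetical placeholder.
PATTERN_FIELDS = ["daq_id", "module_id", "ch_id", "physics", "analysis", "data_type"]

def build_topic(daq_id, module_id, ch_id, physics, analysis, data_type):
    return "/".join(["cpsens", daq_id, module_id, ch_id, physics, analysis, data_type])

def parse_topic(topic):
    parts = topic.split("/")
    assert parts[0] == "cpsens" and len(parts) == 7
    return dict(zip(PATTERN_FIELDS, parts[1:]))

topic = build_topic("daq1", "mod1", "ch1", "acc", "raw", "data")
print(topic)                           # cpsens/daq1/mod1/ch1/acc/raw/data
print(parse_topic(topic)["data_type"])  # data
```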


### Step 1.3: Change settings

There are some settings that can be changed for the specific use case.

* **Sample time**
This is how many samples should be used for sysid. The value is given as a time in minutes and, together with the sampling frequency `fs`, determines the number of samples.

Inside `examples/run_sysid.py`, `examples/run_mode_clustering.py`, `examples/run_mode_tracking.py`, and `examples/run_model_update.py`
the sample time can be changed via `number_of_minutes`.

* **Parameters**
Inside `method/constants.py` the parameters for system identification, mode clustering, mode tracking and model updating
can be changed.

* **Model**
A digital YAFEM model can be added under `models/<your_model>`.
Inside `method/constants.py` the model parameters can be set, together with the paths to the model folder and the model function file.
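The sample-time setting above reduces to a simple conversion; as a sketch with assumed values (the real scripts may compute it differently):

```python
fs = 100                # sampling frequency in Hz (assumed value for illustration)
number_of_minutes = 10  # the sample-time setting from the example scripts

# Minutes times 60 seconds times samples per second gives the sample count.
number_of_samples = int(number_of_minutes * 60 * fs)
print(number_of_samples)  # 60000
```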


### Step 1.4: Run examples
The following experiments can be run using the package.

* **acceleration_readings** demonstrates the use of the `Accelerometer` class to extract
accelerometer measurements from an MQTT data stream.
* **aligning_readings** demonstrates the use of the `Aligner` class to collect and
align accelerometer measurements from multiple MQTT data streams.

* **sysid** demonstrates the use of `sysid` with four cases:
1. **sysid-and-plot**: plots natural frequencies.
2. **sysid-and-print**: prints sysid output to the console.
3. **sysid-and-publish**: publishes one set of sysid output via MQTT to the topics given under the `sysid` config section.
4. **live-sysid-and-publish**: continuously publishes sysid output via MQTT to the topics given under the `sysid` config section.

* **Clustering** demonstrates the use of `clustering` with four cases:
1. **clustering-with-local-sysid**: gets the sysid output by running sysid
locally, then runs the mode clustering.
2. **clustering-with-remote-sysid**: gets the sysid output by subscribing,
then runs the mode clustering. This is a one-time operation.
3. **live-clustering-with-remote-sysid**: gets the sysid output by subscribing,
then runs the mode clustering. This operation runs in a loop.
4. **live-clustering-with-remote-sysid-and-publish**: gets the sysid output by subscribing,
then runs the mode clustering and publishes the cluster results. This operation runs in a loop.

* **mode-tracking** demonstrates the use of `mode_tracking` with three cases:
1. **mode-tracking-with-local-sysid**: gets the sysid output by running sysid
locally, then runs mode clustering and mode tracking.
2. **mode-tracking-with-remote-sysid**: gets the sysid output by subscribing,
then runs mode clustering and mode tracking. This is a one-time operation.
3. **live-mode-tracking-with-remote-sysid**: gets the sysid output by subscribing,
then runs mode clustering and mode tracking. This operation runs in a loop.

* **model-update** demonstrates the use of `model_update` with three cases:
1. **model-update-local-sysid**: gets the sysid output, then uses it to
update the model and obtain updated system parameters.
2. **live-model-update-with-remote-sysid**: gets the sysid output by subscribing to an
MQTT topic, then runs mode clustering and uses the result to update the model and obtain updated system parameters.
3. **live-model-update-with-remote-clustering**: gets the mode clustering output by subscribing to an
MQTT topic, then uses it to update the model and obtain updated system parameters.

```bash
python .\src\examples\example.py align-readings  # run an experiment with real data (needs a production.json config)
```

To run the examples with a specified config, use:
```bash
python .\src\examples\example.py --config <path>\production.json align-readings
```

For example:

```bash
python .\src\examples\example.py --config .\config\production.json align-readings
```
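All of the live-* cases listed in Step 1.4 follow the same subscribe, process, publish pattern; here is a broker-free sketch of that loop (`live_loop` and its stubs are illustrative placeholders, not the project's API):

```python
def live_loop(subscribe, process, publish, iterations=None):
    """Generic pattern behind the live-* commands: fetch the latest
    upstream result, process it, publish it downstream, repeat."""
    count = 0
    while iterations is None or count < iterations:
        message = subscribe()          # e.g. the latest sysid output from MQTT
        if message is not None:
            publish(process(message))  # e.g. cluster results to the publish topic
        count += 1

# Stubbed usage: the "broker" is just two in-memory lists.
inbox = [{"frequencies": [1.2, 3.4]}]
outbox = []
live_loop(lambda: inbox.pop() if inbox else None,
          lambda m: {**m, "clustered": True},
          outbox.append,
          iterations=2)
print(outbox)  # [{'frequencies': [1.2, 3.4], 'clustered': True}]
```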



## 2. Install the Package from Poetry Build

To install the package built using `poetry`, follow these steps:

### Step 2.1: Build the Package

```bash
poetry build
```

This will create a `.whl` file in the `dist/` directory,
e.g., `dist/cp_sens-0.6.0-py3-none-any.whl`.

### Step 2.2: Create and Activate a Virtual Environment

```bash
python -m venv .venv
source .venv/bin/activate # On Linux/macOS
.\.venv\Scripts\Activate.ps1 # On Windows
```

### Step 2.3: Install the Built Package

```bash
pip install dist/example_shm-<version>-py3-none-any.whl
```

Replace `<version>` with the version number found in the `.whl`
filename (e.g., `0.6.0`).

### Step 2.4: Create Configuration

The package requires a JSON configuration file and access to an MQTT broker.
The format of the configuration file is:

```json
{
    "MQTT": {
        "sysid": {
            "host": "",
            "port": 0,
            "userId": "",
            "password": "",
            "ClientID": "NOT_NEEDED",
            "QoS": 1,
            "MetadataToSubscribe": ["sensors/1/acc/raw/metadata"],
            "TopicsToSubscribe": [
                "sensors/1/acc/raw/data",
                "sensors/1/acc/raw/metadata",
                "sensors/2/acc/raw/data",
                "sensors/3/acc/raw/data",
                "sensors/4/acc/raw/data"
            ],
            "TopicsToPublish": ["sensors/1/acc/sysid/data"]
        },

        "mode_cluster": {
            "host": "",
            "port": 0,
            "userId": "",
            "password": "",
            "ClientID": "NOT_NEEDED",
            "QoS": 2,
            "TopicsToSubscribe": ["sensors/1/acc/sysid/data"],
            "TopicsToPublish": ["sensors/1/acc/mode_cluster/data"]
        },

        "model_update": {
            "host": "",
            "port": 0,
            "userId": "",
            "password": "",
            "ClientID": "NOT_NEEDED",
            "QoS": 2,
            "TopicsToSubscribe": ["sensors/1/acc/mode_cluster/data"],
            "TopicsToPublish": ["sensors/1/acc/model_update/data"]
        }
    }
}
```
Here the MQTT topic structure follows the pattern
`PROJECT/CH_ID/PHYSICS/ANALYSIS/DATA_TYPE`,
where `DATA_TYPE` is either metadata or plain data.

Save the file as `config/production.json`; this is where the application looks for its configuration.
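A minimal sketch of loading and sanity-checking such a configuration (the required-key list is an assumption based on the template above, and `load_mqtt_config` is not a function from the project):

```python
import json

def load_mqtt_config(path):
    """Load the JSON configuration and return its "MQTT" section,
    checking that each block carries the minimum connection fields."""
    with open(path) as f:
        config = json.load(f)
    mqtt = config["MQTT"]
    # Assumed minimum fields; adjust to match the actual template in use.
    for name, block in mqtt.items():
        for key in ("host", "port", "QoS", "TopicsToSubscribe"):
            if key not in block:
                raise KeyError(f"config block '{name}' is missing '{key}'")
    return mqtt
```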

## Use
## Step 2.5: Use

Launch the bridge with the default configuration:

```bash
example-shm
```

To use a custom configuration, launch it
using

```bash
example-shm --config <config-file>
```

You can find the available experiments by running the program:

```bash
$ example-shm
...
Options:
  ...
Commands:
accelerometers
align-readings
align-readings-plot
clustering-with-local-sysid
clustering-with-remote-sysid
live-clustering-with-remote-sysid
live-clustering-with-remote-sysid-and-publish
live-mode-tracking-with-remote-sysid
live-model-update-remote-sysid
live-model-update-with-remote-clustering
live-model-update-with-remote-sysid
live-sysid-publish
mode-tracking-with-local-sysid
mode-tracking-with-remote-sysid
model-update-local-sysid
model-update-with-local-sysid
sysid-and-plot
sysid-and-print
sysid-and-publish
```

To run the examples with a custom config, use:

```bash
$ example-shm
Usage: example-shm --config <config-file> accelerometers
```

## Distributed Setup Overview

This section explains the setup needed to run the distributed version of the example-shm pipeline.

## Machine 1: Edge Layer – Raspberry Pi with Accelerometers

This machine connects to ADXL375 sensors and is responsible for acquiring raw sensor data.
It performs calibration and continuously publishes sensor data over MQTT.

**Step 1**: Run calibration to find sensor offsets

```bash
poetry run python src/scripts/find_offset.py
```
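Conceptually, offset calibration averages axis readings captured while the sensor is at rest; a sketch under that assumption (`find_offsets` is illustrative, not the actual script's API):

```python
def find_offsets(samples):
    """Estimate per-axis offsets from stationary accelerometer samples.

    samples: list of (x, y, z) tuples captured while the sensor is at rest.
    Returns the mean of each axis, i.e. the value to subtract from raw
    readings so a resting sensor reads (0, 0, 0) after calibration.
    """
    n = len(samples)
    return tuple(sum(axis) / n for axis in zip(*samples))

offsets = find_offsets([(1.0, 2.0, 9.0), (3.0, 4.0, 11.0)])
print(offsets)  # (2.0, 3.0, 10.0)
```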

**Step 2**: Start publishing raw accelerometer data

```bash
poetry run python src/scripts/publish_samples.py
```

**Record and replay**
Recording and replaying data is also possible. A `config/replay.json` config file must be defined beforehand.

Inside `record/record.py` some parameters must be specified:

```py
config_path = "config/replay.json"
config = load_config(config_path)
MQTT_CONFIG = config["MQTT"]

RECORDINGS_DIR = "record/mqtt_recordings"
FILE_NAME = "recording2.jsonl"

DURATION_SECONDS = 20  # how many seconds of data to record
```
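Schematically, each received message is appended as one JSON line to the recording file; a sketch under that assumption (`record_message` and its field names are illustrative, not taken from record.py):

```python
import json
import time
from pathlib import Path

def record_message(topic, payload, recordings_dir="record/mqtt_recordings",
                   file_name="recording2.jsonl"):
    """Append one received MQTT message to the JSON-lines recording file.

    The stored field names are assumptions for illustration; the real
    record.py may store its messages differently.
    """
    path = Path(recordings_dir) / file_name
    path.parent.mkdir(parents=True, exist_ok=True)
    entry = {"timestamp": time.time(), "topic": topic, "payload": payload}
    with path.open("a") as f:
        f.write(json.dumps(entry) + "\n")
```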

The same goes for `record/replay.py`:

```py
# MQTT Configuration
CONFIG_PATH = "config/replay.json"

RECORDINGS_DIR = "record/mqtt_recordings"
FILE_NAME = "recording.jsonl"

REPLAY_SPEED = 1 # Multiplier for replay speed
```
At the bottom of the file, the number of times to loop the replay function can be set: `replay_mqtt_messages(loop=10)  # times to loop`.

The files can then be run with:

```bash
poetry run python record/record.py
poetry run python record/replay.py
```
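The looping replay can be pictured as reading the JSON-lines file once and re-publishing each message with its original spacing; a sketch under assumed field names (the real replay.py may differ):

```python
import json
import time

def replay_mqtt_messages(path="record/mqtt_recordings/recording.jsonl",
                         speed=1.0, loop=1, publish=print):
    """Replay a JSON-lines recording, preserving inter-message delays.

    Assumes each line holds {"timestamp": <seconds>, "topic": ..., "payload": ...};
    publish is any callable taking (topic, payload), e.g. an MQTT client wrapper.
    """
    with open(path) as f:
        messages = [json.loads(line) for line in f if line.strip()]
    for _ in range(loop):
        previous = None
        for msg in messages:
            if previous is not None:
                # Scale the recorded gap by the replay speed multiplier.
                time.sleep(max(0.0, (msg["timestamp"] - previous) / speed))
            previous = msg["timestamp"]
            publish(msg["topic"], msg["payload"])
```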

## Machine 2: Fog Layer – Data Alignment and System Identification

This machine subscribes to MQTT topics from Machine 1. It aligns multi-channel data, runs system identification,
and publishes the pyOMA output.

Run the aligner and system identification pipeline

```bash
poetry run python src/examples/example.py sysid-and-publish
```

## Machine 3: Cloud Layer – Mode Tracking and Model Update

This machine subscribes to the pyOMA output, performs mode clustering and updates the structural model.

Run mode clustering and model update

```bash
poetry run python src/examples/example.py model-update-with-remote-sysid
```