afib-tissue-activation

This repository provides high‑level pseudo‑code showing how coMAP inference is run. It intentionally omits several preprocessing steps and does not include trained model weights.

The model can be tried via our demo portal.

What’s included

  • Pipeline wrapper: stitches together model loading, feature computation, inference, and binarization.
  • Illustrative CLI: shows how to invoke the pipeline on a single epoch.

Key files

  • run_analysis.py: Minimal CLI to run coMAP on a single epoch array; loads inputs, calls stage_coMAP, and illustrates saving outputs.
  • comap_wrapper.py: Orchestrates the pipeline: load models → generate features → run inference → binarize activations; returns dense predictions and a sparse activation matrix (see the sketch after this list).
  • load_build_model.py: Defines an example Keras model, loads intermediate weights, and loads a secondary classifier; exposes build_model, load_intermediate_weights, load_model.
  • step_features.py: Feature generation entry point; compute_features is intentionally left unimplemented as preprocessing is proprietary.
  • step_inference.py: Runs the RNN and extracts intermediate features (dense_1) to feed a secondary classifier; returns per‑sample probabilities.
  • step_binarization.py: Converts probabilities to a binary activation signal using a threshold and a refractory period; binarize_prediction is intentionally left unimplemented.
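
The orchestration in comap_wrapper.py might look roughly like the following. This is a minimal sketch built from the function names listed above; the exact signatures, argument order, and the helper name run_inference are assumptions for illustration, not the repository's actual implementation.

# Illustrative sketch of comap_wrapper.stage_coMAP; signatures and the
# helper name run_inference are assumptions, not the actual code.
import pickle

from load_build_model import build_model, load_intermediate_weights, load_model
from step_features import compute_features
from step_inference import run_inference        # hypothetical function name
from step_binarization import binarize_prediction


def stage_coMAP(epoch_data, model_info_path, rnn_weights_path,
                secondary_model_path, refractory_period=50):
    """Load models -> generate features -> run inference -> binarize."""
    with open(model_info_path, "rb") as f:
        model_info = pickle.load(f)

    # Build the RNN, load its weights, and load the secondary classifier.
    rnn = build_model()
    rnn = load_intermediate_weights(rnn, rnn_weights_path)
    secondary_model = load_model(secondary_model_path)

    # Feature generation (proprietary; compute_features is a placeholder).
    features = compute_features(epoch_data, model_info["features_list"],
                                model_info["sampling_freq"])

    # RNN inference; intermediate dense_1 features feed the secondary
    # classifier, which yields per-sample probabilities.
    probabilities = run_inference(rnn, secondary_model, features)

    # Threshold + refractory period -> sparse binary activation matrix.
    activations = binarize_prediction(probabilities, model_info["threshold"],
                                      refractory_period)
    return probabilities, activations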

Notes and limitations

  • Several functions are placeholders and need project‑specific implementations.
  • Expected inputs (e.g., epoch_data) are NumPy arrays; model_info is a pickled dict with keys such as features_list, sampling_freq, and threshold (see the sketch after this list).
  • Without the proprietary preprocessing and weights, this repository is intended for educational/demo purposes only.
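
For concreteness, a model_info.pkl satisfying the keys mentioned above could be produced and read back as sketched below. The values shown are placeholders, not the production configuration.

import pickle

# Placeholder values only; the real features_list and threshold are part
# of the proprietary configuration and are not included in this repository.
model_info = {
    "features_list": ["feature_a", "feature_b"],  # hypothetical feature names
    "sampling_freq": 1000,                        # Hz, placeholder
    "threshold": 0.5,                             # probability cutoff, placeholder
}

with open("model_info.pkl", "wb") as f:
    pickle.dump(model_info, f)

# The pipeline would later load it with:
with open("model_info.pkl", "rb") as f:
    model_info = pickle.load(f)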

System Requirements

Hardware requirements

  • A standard computer with enough RAM to hold NumPy arrays used during feature computation and inference.
  • No non‑standard hardware is required for this repository. The production demo runs as a web app.

OS requirements

  • macOS, Linux, Windows. Example environments:
    • macOS: Ventura (13) or later
    • Linux: Ubuntu 20.04 or later
    • Windows: Windows 10 or 11

Tested versions

  • Not applicable for this repository (pseudo‑code only). The runnable demo is provided via the web app.

Python

  • Python 3.11+ recommended.

Python dependencies

  • numpy (≥1.24)
  • scipy (≥1.10)
  • scikit-learn (≥1.3)
  • tensorflow (2.x) with Keras API
  • Optional: pandas (≥1.5), seaborn (≥0.12)
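
For a local environment, the constraints above could be captured in a requirements.txt such as:

numpy>=1.24
scipy>=1.10
scikit-learn>=1.3
tensorflow>=2,<3
# optional
pandas>=1.5
seaborn>=0.12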

Note: This repository provides pseudo-code; end‑to‑end execution requires proprietary preprocessing and model weights that are not included.

Installation guide

Important

  • This repository is pseudo‑code; there is nothing to install to run the demo. Use the web app instead.

If you still want a local environment for reading/experimentation

  • Create and activate a virtual environment.
  • Install the scientific stack and ML libraries listed above.

Example (illustrative)

python -m venv .venv && source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install --upgrade pip
pip install numpy scipy scikit-learn tensorflow pandas seaborn

Typical install time

  • ~5–15 minutes on a normal desktop with a stable internet connection.

Demo

Important

  • The runnable demo is on the web app; this repository cannot run an end‑to‑end demo as‑is.

If you wish to simulate locally (illustrative only)

  • Prepare inputs: epoch_data.npy, model_info.pkl, RNN weights.h5, and a pickled secondary_model.pkl.
  • Implement compute_features and binarize_prediction, or stub them to return plausible arrays (a stub sketch follows the example command below).
  • Run a command similar to:
python run_analysis.py \
  --epoch_data_path path/to/epoch_data.npy \
  --model_info_path path/to/model_info.pkl \
  --rnn_weights_path path/to/weights.h5 \
  --secondary_model_path path/to/secondary_model.pkl \
  --refractory_period 50
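
For a purely mechanical dry run, the two unimplemented functions could be stubbed to return arrays of plausible shape. These stubs are assumptions about shapes only and carry no physiological meaning:

import numpy as np

# Hypothetical stand-in for step_features.compute_features: one row of
# random "features" per sample, just to keep downstream shapes consistent.
def compute_features(epoch_data, features_list, sampling_freq):
    n_samples = epoch_data.shape[0]
    return np.random.rand(n_samples, len(features_list)).astype(np.float32)

# Hypothetical stand-in for step_binarization.binarize_prediction:
# plain thresholding, ignoring the refractory period.
def binarize_prediction(probabilities, threshold, refractory_period):
    return (np.asarray(probabilities) > threshold).astype(np.int8)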

Expected output

  • Dense per‑sample probabilities and a sparse activation matrix representation.

Expected run time

  • A few seconds, depending on array sizes and library versions.

Instructions for use

Important

  • For practical use, please upload data and run inference in the web app.

Adapting locally (advanced; pseudo‑code)

  • Implement step_features.compute_features to produce the model’s expected feature tensors.
  • Implement step_binarization.binarize_prediction to map probabilities to binary activations (see the sketch after this list).
  • Provide a model_info.pkl with keys like features_list, sampling_freq, threshold.
  • Load weights and secondary model files (see load_build_model.py).
  • Call comap_wrapper.stage_coMAP with your arrays and file paths.
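
One plausible reading of the binarization step, assuming the refractory period is expressed in samples: accept a threshold crossing only if it occurs at least refractory_period samples after the previous accepted activation, and store the result sparsely. This is a sketch of the idea, not the project's actual logic:

import numpy as np
from scipy.sparse import csr_matrix

def binarize_prediction(probabilities, threshold, refractory_period):
    """Threshold per-sample probabilities and enforce a refractory period
    (in samples) between consecutive activations. Illustrative only."""
    probs = np.asarray(probabilities).ravel()
    active = np.zeros(probs.shape[0], dtype=np.int8)
    last_activation = -refractory_period  # allow an activation at index 0
    for i, p in enumerate(probs):
        if p > threshold and (i - last_activation) >= refractory_period:
            active[i] = 1
            last_activation = i
    # Store sparsely, since most samples are expected to be inactive.
    return csr_matrix(active.reshape(1, -1))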

Reproduction instructions (optional)

Not applicable

  • Full reproduction is not possible from this repository because proprietary preprocessing steps and model weights are not included. The web demo provides an environment to interact with the model.

License

This repository is licensed under the Apache License, Version 2.0. See LICENSE for details.
