Jakob E. Bardram edited this page Nov 20, 2023 · 116 revisions

This is the wiki page for the CACHET Research Platform (CARP) Mobile Sensing (CAMS) software. An overview of, and access to, the software libraries are available on the main GitHub site.

CARP Mobile Sensing Framework in Flutter

The CARP Mobile Sensing (CAMS) Flutter package carp_mobile_sensing is a programming framework for adding digital phenotyping capabilities to your mobile (health) app. CARP Mobile Sensing is designed to collect three main types of data:

  • passive sensing from onboard sensors on the phone (e.g., light, location, etc.)
  • collection of data from (wearable) devices and online services connected to the phone (e.g., heart rate, weather information, etc.)
  • active collection of data from the user (e.g., surveys, EMAs, questionnaires, etc.)
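In CAMS, each of these data types is configured as part of a study protocol. As an illustrative sketch (the class and constant names below follow the CAMS API, but should be checked against the current release), passive onboard sensing and active user tasks might be added to a protocol like this:

```dart
import 'package:carp_core/carp_core.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

void addDataCollection(SmartphoneStudyProtocol protocol, Smartphone phone) {
  // Passive sensing from an onboard sensor (first category):
  // start sampling ambient light as soon as the study is deployed.
  protocol.addTaskControl(
    ImmediateTrigger(),
    BackgroundTask(measures: [
      Measure(type: SensorSamplingPackage.AMBIENT_LIGHT),
    ]),
    phone,
    Control.Start,
  );

  // Active collection from the user (third category): an AppTask the
  // user is asked to complete once a day. Surveys and questionnaires
  // are supported via add-on packages; here the task simply wraps a
  // one-time background sampling of battery state as a placeholder.
  protocol.addTaskControl(
    PeriodicTrigger(period: const Duration(days: 1)),
    AppTask(
      type: BackgroundSensingUserTask.ONE_TIME_SENSING_TYPE,
      title: 'Daily check-in',
      measures: [Measure(type: DeviceSamplingPackage.BATTERY_STATE)],
    ),
    phone,
    Control.Start,
  );
}
```

Data from external (wearable) devices (the second category) follows the same pattern, but uses a device descriptor from the relevant sampling package instead of the phone.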

These wiki pages contain the overall documentation of the framework's main software architecture and domain model, describe how to use and extend it, and include a set of appendices providing details on measure types, data formats, and data backends.

Table of Contents

  1. Software Architecture – the overall picture.
  2. Domain Model – the detailed picture of the data model(s).
  3. Using CARP Mobile Sensing – how to configure passive sensing in your app.
  4. The AppTask Model – how to collect data from the user in your app.
  5. Extending CARP Mobile Sensing – how to extend the framework, with a focus on new sensing capabilities and external (wearable) devices.
  6. Best Practice – tips and tricks, especially related to differences between Android and iOS.

Appendices

Purpose and Goals

The overall goal of CAMS is to provide a programming framework that helps developers build custom mobile sensing applications.

The basic scenario we want to support is to allow programmers to design and implement a custom mHealth app, e.g., for cardiovascular disease or diabetes. Such an app would focus mainly on providing application-specific functionality for patients with either heart rhythm problems or diabetes, which are two rather distinct application domains.

However, the goal of a mobile sensing programming framework would be to enable the programmers to add mobile sensing capabilities in a 'flexible and simple' manner. This would include adding support for collecting data like ECG, location, activity, and step counts; to format this data according to different health data formats (like the Open mHealth formats); to use this data in the app (e.g. showing it to the user); and to upload it to a specific server, using a specific API (e.g. REST), in a specific format. The framework should be able to support different wearable devices for ECG or glucose monitoring. Hence, focus is on software engineering support in terms of a sound programming API and runtime execution environment, which is being maintained as the underlying mobile phone operating systems are evolving. Moreover, focus is on providing an extensible API and runtime environment, which allow for adding application-specific data sampling, wearable devices, data formatting, data management, and data uploading functionality.
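For instance, where collected data is stored can be specified directly on the study protocol via a data endpoint. The following is a minimal, hedged sketch assuming the FileDataEndPoint available in carp_mobile_sensing (parameter names should be checked against the current API):

```dart
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

void configureLocalStorage(SmartphoneStudyProtocol protocol) {
  // Buffer measurements and store them in local, zipped JSON files.
  protocol.dataEndPoint = FileDataEndPoint(
    bufferSize: 500 * 1000, // start a new file when ~500 KB is buffered
    zip: true,
    encrypt: false,
  );
}
```

Other endpoint types (e.g., uploading to a CARP server via REST) follow the same pattern and are provided by add-on backend packages.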

Initial Example

The code listing below shows a simple Dart example of how to add sampling to a Flutter app. This basic example illustrates how sampling is configured, deployed, initialized, and used in three basic steps:

  1. a study protocol is defined;
  2. the runtime environment is created and initialized, and
  3. sensing is started and the stream of measurements collected can be used in the app.
import 'package:carp_core/carp_core.dart';
import 'package:carp_mobile_sensing/carp_mobile_sensing.dart';

/// This is an example of how to set up a minimal study
Future<void> example() async {
  // Create a study protocol
  SmartphoneStudyProtocol protocol = SmartphoneStudyProtocol(
    ownerId: 'AB',
    name: 'Track patient movement',
  );

  // Define which devices are used for data collection.
  // In this case, it's only this smartphone
  var phone = Smartphone();
  protocol.addPrimaryDevice(phone);

  // Automatically collect step count, ambient light, screen activity, and
  // battery level. Sampling is delayed by 10 seconds.
  protocol.addTaskControl(
    DelayedTrigger(delay: const Duration(seconds: 10)),
    BackgroundTask(measures: [
      Measure(type: SensorSamplingPackage.STEP_COUNT),
      Measure(type: SensorSamplingPackage.AMBIENT_LIGHT),
      Measure(type: DeviceSamplingPackage.SCREEN_EVENT),
      Measure(type: DeviceSamplingPackage.BATTERY_STATE),
    ]),
    phone,
    Control.Start,
  );

  // Create and configure a client manager for this phone, and
  // create a study based on the protocol.
  SmartPhoneClientManager client = SmartPhoneClientManager();
  await client.configure();
  Study study = await client.addStudyProtocol(protocol);

  // Get the study controller and try to deploy the study.
  SmartphoneDeploymentController? controller = client.getStudyRuntime(study);
  await controller?.tryDeployment();

  // Configure the controller.
  await controller?.configure();

  // Start the data sampling
  controller?.start();

  // Listen to the stream of measurements and print each one
  controller?.measurements.forEach(print);
}

Section 3 on Using CARP Mobile Sensing contains more details on this flow. But before digging into this, read Section 2 on the Domain Model. Section 1 provides some technical details on the CAMS architecture, but reading it should not be necessary before using CAMS. Section 5 provides details on how to extend CAMS.