Activities

actimotus.Activities dataclass

Processes extracted features to perform Human Activity Recognition (HAR).

This class ingests features from multiple sensors (required: thigh; optional: trunk, calf, arm) and produces a time-series of recognized activities at 1-second resolution.

Key capabilities include automatic sensor orientation detection (correcting for upside-down or inside-out flipped devices), vendor-specific signal corrections, and configurable activity recognition thresholds.

Attributes:

  • system_frequency (int): The target frequency (in Hz) used for internal calculations. Defaults to 30 Hz.

  • vendor (Literal['Sens', 'Other']): The hardware vendor of the sensor. If set to 'Sens', vendor-specific signal corrections are applied; use 'Other' for generic devices.

  • orientation (bool): If True, automatically detects and corrects the sensor orientation (e.g., if the device was worn upside down or flipped inside out).

  • chunks (bool): If True, processes data in overlapping chunks to simulate cloud/streaming infrastructure.

  • size (str | timedelta): The duration of each processing chunk. Accepts a timedelta object or a pandas-style string alias (e.g., '1d', '1h').

  • overlap (str | timedelta): The duration of overlap between consecutive chunks. Accepts a timedelta object or a string alias (e.g., '15min').

  • config (dict[str, Any] | Literal['DEFAULT', 'LEGACY']): The configuration for activity recognition thresholds. Either a dictionary of custom parameters or a preset string:
      ◦ 'DEFAULT': Standard thresholds for the general population.
      ◦ 'LEGACY': Older threshold values for backward compatibility.
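Since size and overlap accept either a timedelta object or a pandas-style alias, the two forms are interchangeable. A quick illustration of the equivalence, using pandas directly and independent of actimotus:

```python
import pandas as pd

# A pandas-style alias and an explicit Timedelta describe the same duration,
# so either form can express a chunk size or an overlap.
assert pd.Timedelta('1d') == pd.Timedelta(days=1)
assert pd.Timedelta('15min') == pd.Timedelta(minutes=15)

# An overlap is normally shorter than the chunk it applies to.
assert pd.Timedelta('15min') < pd.Timedelta('1d')
```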

Examples:

Standard usage with default configuration:

>>> model = Activities()
>>> # activities, references = model.compute(features)

Usage for 'Sens' devices with legacy thresholds and automatic flip detection:

>>> model = Activities(
...     vendor='Sens',
...     config='LEGACY',
...     orientation=True
... )

compute

compute(
    thigh: DataFrame,
    *,
    trunk: DataFrame | None = None,
    calf: DataFrame | None = None,
    arm: DataFrame | None = None,
    references: dict[str, Any] | None = None
) -> tuple[pd.DataFrame, dict[str, Any]]

Executes the activity recognition pipeline on the provided sensor data.

This method synchronizes inputs from the thigh (primary) and optional secondary sensors.

Parameters:

  • thigh (DataFrame, required): The primary accelerometer feature data. Must contain a DatetimeIndex. This sensor is mandatory for the pipeline.

  • trunk (DataFrame | None, default None): Optional feature data from a trunk sensor.

  • calf (DataFrame | None, default None): Optional feature data from a calf sensor.

  • arm (DataFrame | None, default None): Optional feature data from an arm sensor.

  • references (dict[str, Any] | None, default None): A dictionary containing reference data (individual reference angles, calibration intervals).

Returns:

  • tuple[pd.DataFrame, dict[str, Any]]: A tuple (activities, updated_references) containing:
      1. activities (pd.DataFrame): The recognized activities, resampled to 1-second epochs.
      2. updated_references (dict): The updated state dictionary, containing new reference angles calculated during processing.

Examples:

Basic usage with only the mandatory thigh sensor:

>>> model = Activities()
>>> activities, references = model.compute(features)

Usage with multiple sensors and existing references. Note that secondary sensors must be passed as keyword arguments:

>>> previous_references = {
...     'thigh': {
...         'value': -0.201,
...         'expires': '2024-09-03 12:05:51+00:00',
...     },
...     'calibrations': [
...         {
...             'start': '2024-09-03 08:08:51+00:00',
...             'end': '2024-09-03 08:09:11+00:00',
...             'ttl': '24h',
...         },
...     ],
... }
>>> activities, new_references = model.compute(
...     thigh_df,
...     trunk=trunk_df,
...     references=previous_references,
... )