FI20196079A1 - Sensor data management - Google Patents

Sensor data management

Info

Publication number
FI20196079A1
Authority
FI
Finland
Prior art keywords
sensor data
data elements
labels
sequences
sequence
Prior art date
Application number
FI20196079A
Other languages
Finnish (fi)
Swedish (sv)
Other versions
FI129882B (en)
Inventor
Tuomas Hapola
Mikko Martikka
Timo Eriksson
Erik Lindman
Original Assignee
Amer Sports Digital Services Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/228,981 external-priority patent/US20190142307A1/en
Application filed by Amer Sports Digital Services Oy filed Critical Amer Sports Digital Services Oy
Publication of FI20196079A1 publication Critical patent/FI20196079A1/en
Application granted granted Critical
Publication of FI129882B publication Critical patent/FI129882B/en

Links

Classifications

    • A63B24/0062 Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • G06F18/25 Fusion techniques (pattern recognition; analysing)
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B2562/0223 Magnetic field sensors
    • A61B2562/029 Humidity sensors
    • A61B5/002 Monitoring the patient using a local or closed circuit, e.g. in a room or building
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A61B5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/389 Electromyography [EMG]
    • A61B5/7242 Details of waveform analysis using integration
    • A61B5/7257 Details of waveform analysis characterised by using transforms, using Fourier transforms
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
    • A63B2024/0071 Distinction between different activities, movements, or kind of sports performed
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08 Feature extraction
    • G06F2218/12 Classification; Matching
    • G06F2218/16 Classification; Matching by matching signal segments

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Theoretical Computer Science (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Dentistry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Social Psychology (AREA)
  • Evolutionary Biology (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Signal Processing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

According to an example aspect of the present invention, there is provided a personal multi-sensor apparatus (300) comprising a memory (320) configured to store plural sequences of sensor data elements and at least one processing core (310) configured to: derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.

Description

SENSOR DATA MANAGEMENT

FIELD
[0001] The present invention relates to managing user data generated from sensor devices.
BACKGROUND
[0002] User sessions, such as activity sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.
[0003] Personal sensor devices, such as, for example, sensor buttons, smart watches, smartphones or smart jewellery, may be configured to produce sensor data for session records. Such recorded sessions may be useful in managing physical training, child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, wandering, or assisting the elderly.
[0004] Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files on a personal computer may be protected using passwords and/or encryption, for example.
[0005] Personal devices may be furnished with sensors, which may be used, for example, in determining a location, acceleration, or rotation of the personal device. For example, a satellite positioning sensor may receive positioning information from a satellite constellation, and deduce therefrom where the personal device is located. A recorded training session may comprise a route determined by repeatedly determining the location of the personal device during the training session. Such a route may be later observed using a personal computer, for example.
SUMMARY OF THE INVENTION
[0006] The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.
[0007] According to a first aspect of the present invention, there is provided a personal multi-sensor apparatus comprising a memory configured to store plural sequences of sensor data elements and at least one processing core configured to: derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
[0008] According to a second aspect of the present invention, there is provided a method in a personal multisensor apparatus, comprising storing plural sequences of sensor data elements, deriving, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assigning a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
[0009] According to a third aspect of the present invention, there is provided a server apparatus comprising a receiver configured to receive a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels, and at least one processing core configured to determine, based on the sequence of labels, an activity type a user has engaged in.
[0010] According to a fourth aspect of the present invention, there is provided a method in a server apparatus, comprising receiving a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels, and determining, based on the sequence of labels, an activity type a user has engaged in.
[0011] According to a fifth aspect of the present invention, there is provided a computer program configured to cause a method in accordance with at least one of the second and fourth aspects to be performed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention;
[0013] FIGURE 2A illustrates an example multisensorial time series;
[0014] FIGURE 2B illustrates a second example multisensorial time series;
[0015] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention;
[0016] FIGURE 4 illustrates signalling in accordance with at least some embodiments of the present invention, and
[0017] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention.

EMBODIMENTS

[0018] Sensor data produced in a user device may consume resources in storing or processing it due to its large volume. Consequently, reducing the volume of such sensor data is of interest. Reducing the volume of the sensor data should aim to reduce the sensor data volume while maintaining a usability of the sensor data. Described herein are methods to replace raw sensor data with semantic interpretations of the raw sensor data, in the form of labels assigned to segments of the sensor data, greatly reducing the volume of the data while maintaining its meaning.
[0019] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention. The system comprises device 110, which may comprise, for example, a multi-sensor device, such as a personal multi-sensor device, for example a personal biosensor apparatus such as a smart watch, digital watch, sensor button, or another type of suitable device. In general, a biosensor apparatus may comprise a fitness sensor apparatus or a therapy sensor apparatus, for example. In the illustrated example, device 110 is attached to the user's ankle, but it may equally be otherwise associated with the user, for example by being worn around the wrist. A sensor button is a device comprising a set of sensors and a communications interface, configured to produce from each sensor a sequence of sensor data elements. A sensor button may be powered by a battery, or it may gain its energy from movements of the user, for example.
The multi-sensor device may comprise an internet of things, IoT, device, for example.
[0020] The sensors may be configured to measure acceleration, rotation, moisture, pressure and/or other variables, for example. In one specific embodiment, the sensors are configured to measure acceleration along three mutually orthogonal axes and rotation about three mutually orthogonal axes. The sensors may comprise single- or multi-axis magnetic field sensors, skin signal EMG, ECG, heartbeat and/or optical pulse sensors. Additionally or alternatively, human activity may be sensed via motion or use of sport utensils, tools, machinery and/or devices. In all, such sensors would produce six sequences of sensor data elements, such that in each sequence the sensor data elements are in chronological order, obtained once per sampling interval. The sampling intervals of the sensors do not need to be the same.
[0021] Device 110 may be communicatively coupled, directly or indirectly, with a communications network. For example, in FIGURE 1 device 110 is coupled, via wireless link 112, with base station 120. Base station 120 may comprise a cellular or non-cellular base station, wherein a non-cellular base station may be referred to as an access point. Examples of cellular technologies include wideband code division multiple access, WCDMA, and long term evolution, LTE, while examples of non-cellular technologies include wireless local area network, WLAN, and worldwide interoperability for microwave access, WiMAX. Base station 120 may be coupled with network node 130 via connection 123. Connection 123 may be a wire-line connection, for example. Network node 130 may comprise, for example, a controller or gateway device. Network node 130 may interface, via connection 134, with network 140, which may comprise, for example, the Internet or a corporate network. Network 140 may be coupled with further networks via connection 141. Network 140 may comprise, or be communicatively coupled, with a back-end server, for example.
[0022] Device 110 may be configured to receive, directly or indirectly, from satellite constellation 150, satellite positioning information via satellite link 151. The satellite constellation may comprise, for example, the global positioning system, GPS, or the Galileo constellation. Satellite constellation 150 may comprise more than one satellite, although only one satellite is illustrated in FIGURE 1 for the sake of clarity. Likewise, receiving the positioning information over satellite link 151 may comprise receiving data from more than one satellite.
[0023] Where device 110 is indirectly coupled with the communications network and/or satellite constellation 150, it may be arranged to communicate with a personal device of user 101, such as a smartphone, which has connectivity with the communications network and/or satellite constellation 150. Device 110 may communicate with the personal device via, for example, a short-range communication technology such as the Bluetooth or Wibree technologies, or, indeed, via a cable. The personal device and device 110 may be considered to form a personal area network, PAN.
[0024] Alternatively or additionally to receiving data from a satellite constellation, device 110 or the personal device may obtain positioning information by interacting with a network in which base station 120 is comprised. For example, cellular networks may employ various ways to position a device, such as trilateration, multilateration or positioning based on an identity of a base station with which attachment is possible or ongoing. Likewise a non-cellular base station, or access point, may know its own location and provide it to device 110 or the personal device, enabling device 110 and/or the personal device to position itself within communication range of this access point. Device 110 or the personal device may be configured to obtain a current time from satellite constellation 150, base station 120 or by requesting it from the user, for example.
[0025] Device 110 or the personal device may be configured to provide an activity session. An activity session may be associated with an activity type. Examples of activity types include rowing, paddling, cycling, jogging, walking, hunting, swimming and paragliding. In a simple form, an activity session may comprise storing sensor data produced with sensors comprised in device 110, the personal device or a server, for example. An activity session may be determined to have started and ended at certain points in time, such that the determination takes place afterward or concurrently with the starting and/or ending. In other words, device 110 may store sensor data to enable subsequent identification of activity sessions based at least partly on the stored sensor data.
[0026] An activity session may enhance a utility a user can obtain from the activity; for example, where the activity involves movement outdoors, the activity session may provide a recording of the activity session. A recording of an activity session may, in some embodiments, provide the user with contextual information. Such contextual information may comprise, for example, locally relevant weather information, received via base station 120, for example. Such contextual information may comprise at least one of the following: a rain warning, a temperature warning, an indication of time remaining before sunset, an indication of a nearby service that is relevant to the activity, a security warning, an indication of nearby users and an indication of a nearby location where several other users have taken photographs. Contextual information may be presented during an activity session.
[0027] A recording of an activity session may comprise information on at least one of the following: a route taken during the activity session, a metabolic rate or metabolic effect of the activity session, a time the activity session lasted, a quantity of energy consumed during the activity session, a sound recording obtained during the activity session and an elevation map along the length of the route taken during the activity session. A route may be determined based on positioning information, for example. Metabolic effect and consumed energy may be determined, at least partly, based on sensor data obtained from user 101 during the activity session. A recording may be stored in device 110, the personal device, or in a server or other cloud data storage service. A recording stored in a server or cloud may be encrypted prior to transmission to the server or cloud, to protect privacy of the user. A recording may be produced even if the user has not indicated an activity session has started, since a beginning and ending of an activity session may be determined after the session has ended, for example based, at least partly, on sensor data.
S O [0028] After an activity has ended, device 110 may have stored therein, or in a > 30 memory to which device 110 has access, plural sequences of sensor data elements. The stored seguences of sensor data elements may be stored in chronological order as a time series that spans the activity session as well as time preceding and/or succeeding theactivity session. The beginning and ending points in time of the activity session may be selected from the time series by the user, or dynamically by device 110. For example, where, in the time series, acceleration sensor data begins to indicate more active movements of device 110, a beginning point of an activity session may be selected. Such a change may correspond to a time in the time series when the user stopped driving a car and began jogging, for example. Likewise, a phase in the time series where the more active movements end may be selected as an ending point of the activity session.
[0029] As described above, the plural sequences of sensor data elements may comprise data from more than one sensor, wherein the more than one sensor may comprise sensors of at least two distinct types. For example, plural sequences of sensor data elements may comprise sequences of acceleration sensor data elements and rotation sensor data elements. Further examples are sound volume sensor data, moisture sensor data and electromagnetic sensor data. In general, each sequence of sensor data elements may comprise data from one and only one sensor.
[0030] An activity type may be determined based, at least partly, on the sensor data elements. This determining may take place when the activity is occurring, or afterwards, when analysing the sensor data. The activity type may be determined by device 110 or by a server-side computer that has access to the sensor data, for example, or a server that is provided access to the sensor data. Where a server is given access to the sensor data, or, in some embodiments, when activity type detection is performed on device 110 or the personal device, the sensor data may be processed into a sequence of labels.
[0031] A sequence of labels may characterize the content of sensor data. For example, where the sensor data elements are numerical values obtained during jogging, a sequence of labels derived from those sensor data elements may comprise a sequence of labels: {jog-step, jog-step, jog-step, jog-step, jog-step, ...}. Likewise, where the sensor data elements are numerical values obtained during a long jump, a sequence of labels derived from those sensor data elements may comprise a sequence of labels: {sprint-step, sprint-step, sprint-step, sprint-step, sprint-step, leap, stop}. Likewise, where the sensor data elements are numerical values obtained during a triple jump, a sequence of labels derived from those sensor data elements may comprise a sequence of labels: {sprint-step, sprint-step, sprint-step, sprint-step, leap, leap, leap, stop}. The sequences of labels are thus usable in identifying the activity type, for example differentiating between long jump and triple jump based on the number of leaps.
[0033] To process the sequences of sensor data elements into a sequence of labels, sensor data segments may be derived from the sequences of sensor data elements. Each sensor data segment may then be associated with an exercise primitive and assigned a label, to obtain the sequence of labels. Each sensor data segment may comprise time- aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements. In other words, segments of sensor data are derived, each such segment comprising a time slice of original sequences of sensor data elements. This may be conceptualized as time-slicing a multi-sensor data stream captured during jogging into the individual steps that make up the jogging session. Likewise other activity sessions may be — time-sliced into exercise primitives which make up the activity.
[0034] To derive the segments, device 110 or another device may be configured to analyse the sequences of sensor data elements to identify therein units. Each segment may O comprise slices of the seguences of sensor data elements, the slices being time-aligned, that N is, obtained at the same time from the respective sensors.
N N 25 [0035] For example, steps in running are repetitive in nature, wherefore identifying a = pattern in the sequences of sensor data elements which repeats at a certain frequency is a * clue the sequences may be segmented according to this frequency. A frequency may be S identified, for example, by performing a fast fourier transform, FFT, on each of the 2 sequences of sensor data elements, and then averaging the resulting spectrum, to obtain an N 30 overall frequency characteristic of the sequences of sensor data elements.
[0036] In case of motion, one way to segment the sensor data is to try to construct a relative trajectory of the sensor device. One way to estimate this trajectory is to double integrate the x-, y-, and z-components of acceleration sensor outputs. In this process one may remove gravity induced biases. Mathematically this can be done by calculating the baseline of each output. One way is to filter the data as in the next equation.
[0037] acc_i_baseline = acc_i_baseline + coeff_a * (acc_i - acc_i_baseline)
[0038] acc above refers to the acceleration measurement and i refers to its components x, y, and z. These filtered values can be subtracted from the actual measurements: acc_i_without_G = acc_i - acc_i_baseline. This is a rough estimate of the true linear acceleration, but still a fast and robust way to estimate it. The integration of these linear acceleration values leads to the estimate of the velocity of the sensor device in three-dimensional, 3D, space. The velocity components have biases due to the incomplete linear acceleration estimate. These biases may be removed as in the previous equation:
[0039] v_i_baseline = v_i_baseline + coeff_v * (v_i - v_i_baseline)
[0040] v above refers to the velocity estimate and i refers to its components x, y, and z. These velocity components are not true velocities of the sensor device, but easily and robustly calculated estimates of them. The baseline components may be subtracted from the velocity estimates before integration: v_i_wo_bias = v_i - v_i_baseline. Since the method so far is incomplete, the integrals of the velocity components produce biased position estimates p_x, p_y, and p_z. Therefore these biases need to be removed as in the previous equations:
[0041] p_i_baseline = p_i_baseline + coeff_p * (p_i - p_i_baseline)
[0042] p above refers to the position estimate and i refers to its components. Since this procedure effectively produces zero-mean values, the natural reference of position is p_x_ref = 0, p_y_ref = 0, and p_z_ref = 0. The Euclidean distances of the measured values, sqrt(p_x_ti**2 + p_y_ti**2 + p_z_ti**2), form a time series varying from 0 to some maximum value, where ti refers to the index in the time series. These maximum values can be detected easily. The moment in time of one maximum value starts a segment and the next maximum value ends it (and starts the next segment). The detection of the maximum value can be conditional, i.e. the maximum value is accepted as a start/stop marker only when it exceeds a certain level.
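The recipe of paragraphs [0037] to [0042] can be sketched in Python as follows. This is a minimal sketch: the filter coefficients, time step and acceptance level are illustrative assumptions, as the text leaves their values open.

import math

def remove_baseline(samples, coeff):
    """First-order IIR baseline tracking: baseline += coeff * (x - baseline)."""
    baseline, out = 0.0, []
    for x in samples:
        baseline += coeff * (x - baseline)
        out.append(x - baseline)
    return out

def integrate(samples, dt):
    """Cumulative rectangular integration."""
    total, out = 0.0, []
    for x in samples:
        total += x * dt
        out.append(total)
    return out

def relative_positions(acc_xyz, dt, coeff_a=0.05, coeff_v=0.05, coeff_p=0.05):
    """acc_xyz: three equal-length lists of accelerometer samples (x, y, z)."""
    positions = []
    for acc in acc_xyz:
        lin_acc = remove_baseline(acc, coeff_a)              # drop gravity bias
        vel = remove_baseline(integrate(lin_acc, dt), coeff_v)
        pos = remove_baseline(integrate(vel, dt), coeff_p)   # zero-mean position
        positions.append(pos)
    return positions

def segment_markers(acc_xyz, dt, level=0.1):
    """Indices of local maxima of the distance series that exceed 'level'."""
    px, py, pz = relative_positions(acc_xyz, dt)
    dist = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(px, py, pz)]
    return [i for i in range(1, len(dist) - 1)
            if dist[i] > level and dist[i - 1] < dist[i] > dist[i + 1]]

Consecutive markers then delimit the sensor data segments described above; the conditional level keeps small, noise-induced maxima from splitting a segment.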
[0043] Also, the above described procedure to calculate the relative trajectory can be made more precise by utilizing the gyroscopes and using, e.g., complementary filtering.
[0044] Other ways to segment the data, that is, derive the segments, may include fitting to a periodic model, using a suitably trained artificial neural network or using a separate segmenting signal provided over a radio or wire-line interface, for example. The segmenting signal may be correlated in time with the sequences of sensor data elements, to obtain the segments. A segmenting signal may be transmitted or provided by a video recognition system or pressure pad system, for example. Such a video recognition system may be configured to identify steps, for example.
[0045] Once the segments have been derived, each segment may be assigned a label. Assigning the label may comprise identifying the segment. The identification may comprise comparing the sensor data comprised in the segment to a library of reference segments, for example in a least-squares sense, and selecting from the library of reference segments a reference segment which most resembles the segment to be labelled. The label assigned to the segment will then be a label associated with the closest reference segment in the library of reference segments.
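A sketch of this least-squares labelling; the reference library contents are placeholders, and segments and references are assumed to be equal-length element lists.

def distance(segment, reference):
    """Sum of squared differences between equal-length element lists."""
    return sum((a - b) ** 2 for a, b in zip(segment, reference))

def assign_label(segment, reference_library):
    """reference_library: list of (label, reference_elements) pairs."""
    return min(reference_library, key=lambda ref: distance(segment, ref[1]))[0]

library = [("jog-step", [0.1, 0.8, 0.2]), ("leap", [0.9, 1.5, 0.3])]
print(assign_label([0.2, 0.9, 0.1], library))  # jog-step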
[0046] In some embodiments, a plurality of reference segment libraries is used, such that a first phase of the identification is selection of a reference segment library. For example, where two reference segment libraries are used, one of them could be used for continuous activity types and a second one of them could be used for discontinuous activity types. The continuous activity type is selected where the sequences of sensor data elements reflect a repetitive action which repeats a great number of times, such as jogging, walking, cycling or rowing. The discontinuous activity type is selected when the activity is characterized by brief sequences of action which are separated from each other in time, the afore-mentioned triple jump or pole vault being examples. Once the reference segment library is chosen, all the segments are labelled with labels from the selected reference segment library.
[0047] A benefit of first selecting a reference segment library is obtained in more effective labelling, as there is a lower risk that segments are assigned incorrect labels. This is so, since the number of reference segments the sensor data segments are compared to is lower, increasing the chances a correct one is chosen.
[0048] Once the segments have been labelled, a syntax check may be made wherein it is assessed if the sequence of labels makes sense. For example, if the sequence of labels is consistent with known activity types, the syntax check is passed. On the other hand, if the sequence of labels comprises labels which do not fit together, a syntax error may be generated. As an example, a sequence of jogging steps which comprises mixed therein a few paddling motions would generate a syntax error, since the user cannot really be jogging and paddling at the same time. In some embodiments, a syntax error may be resolved by removing from the sequence of labels the labels which do not fit in, in case they occur in the sequence of labels only rarely, for example at a rate of less than 2%.
[0049] The reference segment libraries may comprise indications as to which labels fit together, to enable handling syntax error situations.
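One possible reading of this check as code; the compatibility table and the majority-label rule are assumptions for illustration, since the text does not fix how labels are judged to fit together.

def syntax_check(labels, compatible_with, rare_rate=0.02):
    """Pass, repair (drop rare outliers below rare_rate), or flag an error."""
    majority = max(set(labels), key=labels.count)
    allowed = compatible_with[majority]
    outliers = [lbl for lbl in labels if lbl not in allowed]
    if not outliers:
        return labels, True                                      # check passed
    if len(outliers) / len(labels) < rare_rate:
        return [lbl for lbl in labels if lbl in allowed], True   # error resolved
    return labels, False                                         # syntax error

compatible_with = {"jog-step": {"jog-step", "stop"}}  # assumed fit-together table
labels = ["jog-step"] * 99 + ["paddle-stroke"]        # 1% outlier rate
cleaned, ok = syntax_check(labels, compatible_with)   # outlier removed, ok True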
[0050] Different exercise primitives may be associated with different characteristic frequencies. For example, acceleration sensor data may reflect a higher characteristic frequency when the user has been running, as opposed to walking. Thus the labelling of the segments may be based, in some embodiments, at least partly, on deciding which reference segment has a characteristic frequency that most closely matches a characteristic frequency of a section of the sequence of sensor data elements under investigation. Alternatively or in addition, acceleration sensor data may be employed to determine a characteristic movement amplitude.
[0051] The reference segment libraries may comprise reference datasets that are multi-sensorial in nature in such a way that each reference segment comprises data that may be compared to each sensor data type that is available. For example, where device 110 is configured to compile a time series of acceleration and sound sensor data types, the reference segments may comprise reference datasets, each reference segment corresponding to a label, wherein each reference segment comprises data that may be compared with the acceleration data and data that may be compared with the sound data, for example. The determined label may be determined as the label that is associated with the multi-sensorial reference segment that most closely matches the segment stored by device 110, for example. Device 110 may comprise, for example, microphones and cameras. Furthermore a radio receiver may, in some cases, be configurable to measure electric or magnetic field properties. Device 110 may comprise a radio receiver, in general, where device 110 is furnished with a wireless communication capability.
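A sketch of both frequency ideas: the FFT-based overall frequency characteristic of paragraph [0035] and the closest-characteristic-frequency labelling described here. NumPy, equal-length sequences and a shared sampling rate are assumed; the reference values are placeholders.

import numpy as np

def dominant_frequency(sequences, sample_rate_hz):
    """Average the magnitude spectra of several sequences, return the peak."""
    spectra = [np.abs(np.fft.rfft(np.asarray(s) - np.mean(s))) for s in sequences]
    mean_spectrum = np.mean(spectra, axis=0)       # overall characteristic
    freqs = np.fft.rfftfreq(len(sequences[0]), d=1.0 / sample_rate_hz)
    return freqs[int(np.argmax(mean_spectrum[1:])) + 1]  # skip the DC bin

def closest_by_frequency(section_freq_hz, references):
    """references: list of (label, characteristic_frequency_hz) pairs."""
    return min(references, key=lambda ref: abs(ref[1] - section_freq_hz))[0]

t = np.arange(0, 4, 0.02)                 # 50 Hz sampling
acc = np.sin(2 * np.pi * 2.5 * t)         # a 2.5 Hz step-like oscillation
cadence = dominant_frequency([acc, acc], 50.0)
print(closest_by_frequency(cadence, [("walking-step", 1.8), ("running-step", 2.7)]))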
[0052] An example of activity type identification by segmenting and labelling is swimming, wherein device 110 stores sequences of sensor data elements that comprise moisture sensor data elements and magnetic field sensor data elements. The moisture sensor data elements indicating presence of water would cause a water-sport reference segment library to be used. Swimming may involve elliptical movements of an arm, to which device 110 may be attached, which may be detectable as periodically varying magnetic field data. In other words, the direction of the Earth’s magnetic field may vary from the point of view of the magnetic field sensor in a periodic way in the time series. This would enable labelling the segments as, for example, breast-stroke swimming motions.
[0053] Overall, a determined, or derived, activity type may be considered an estimated activity type until the user has confirmed the determination is correct. In some embodiments, a few, for example two or three, most likely activity types may be presented to the user as estimated activity types for the user to choose the correct activity type from. Using two or more types of sensor data increases a likelihood the estimated activity type is correct. Once the user confirms or selects a specific activity type, labelling of segments may be enforced to be compliant with this activity type. This may mean, for example, that the set of reference segments the sensor data segments are compared to is limited to reference data segments consistent with this activity type.
[0054] Where device 110 or a personal device assigns the labels, the sequence of labels may be transmitted to a network server, for example, for storage. Device 110, the personal device or the server may determine an overall activity type the user is engaged in, based on the labels. This may be based on a library of reference label sequences, for example.
[0055] In general, device 110 or the personal device may receive a machine readable instruction, such as an executable program or executable script, from the server or another network entity. The machine readable instruction may be usable in determining activity type from the sequence of labels, and/or in assigning the labels to sensor data segments. In the latter case, the machine readable instruction may be referred to as a labelling instruction.
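A server-side sketch of determining the overall activity type from a received sequence of labels. The reference label sets and the fraction-based score are illustrative assumptions standing in for the library of reference label sequences.

def determine_activity(labels, reference_label_sets):
    """Pick the activity whose expected label set covers most of the labels."""
    def score(expected):
        return sum(1 for lbl in labels if lbl in expected) / len(labels)
    return max(reference_label_sets, key=lambda item: score(item[1]))[0]

reference_label_sets = [
    ("jogging", {"jog-step"}),
    ("triple jump", {"sprint-step", "leap", "stop"}),
]
print(determine_activity(["sprint-step"] * 4 + ["leap"] * 3 + ["stop"],
                         reference_label_sets))  # triple jump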
[0056] The process may adaptively learn, based on the machine readable instructions, how to more accurately assign labels and/or determine activity types. A server may have access to information from a plurality of users, and high processing capability, and thus be more advantageously placed to update the machine-readable instructions than device 110, for example.
[0057] The machine readable instructions may be adapted by the server. For example, a user who first obtains a device 110 may initially be provided, responsive to messages sent from device 110, with machine readable instructions that reflect an average user population. Thereafter, as the user engages in activity sessions, the machine readable instructions may be adapted to more accurately reflect use by this particular user. For example, limb length may affect periodical properties of sensor data captured while the user is swimming or running. To enable the adapting, the server may request sensor data from device 110, for example periodically, and compare sensor data so obtained to the machine readable instructions, to hone the instructions for future use with this particular user. Thus a beneficial effect is obtained in fewer incorrectly labelled segments, and more effective and accurate compression of the sensor data.
[0058] FIGURE 2A illustrates an example of plural sequences of sensor data elements. On the upper axis, 201, is illustrated a sequence of moisture sensor data elements 210 while the lower axis, 202, illustrates a time series 220 of deviation of magnetic north from an axis of device 110, that is, a sequence of magnetic sensor data elements.
[0059] The moisture sequence 210 displays an initial portion of low moisture, followed by a rapid increase of moisture that then remains at a relatively constant, elevated, level before beginning to decline, at a lower rate than the increase, as device 110 dries.
S O [0061] A swimming activity type may be determined as an estimated activity type, > beginning from point 203 and ending in point 205 of the seguences. In detail, the sequences may be segmented into two segments, firstly from point 203 to point 204, and secondly from point 204 to point 205. As the moisture sensor indicates water sports, awater sports reference segment library is used to label the segments as, for example, freestroke swimming segments. The sequence of labels would thus be {freestroke, freestroke}. Of course, in actual swimming the number of segment would be much higher, but two segments are illustrated in FIGURE 2 for the sake of simplicity. Overall, the two sensor data segments, from 203 to 204 and from 204 to 205, both comprise time-aligned sensor data element sub-sequences from sequences 210 and 220.
[0062] FIGURE 2B illustrates a second example of plural sequences of sensor data elements. In FIGURE 2B, like numbering denotes like elements as in FIGURE 2A. Unlike in FIGURE 2A, not one but two activity sessions are determined in the time series of FIGURE 2B. Namely, a cycling session is determined to start at beginning point 207 and to end at point 203, when the swimming session begins. Thus the compound activity session may relate to triathlon, for example. In cycling, moisture remains low, and magnetic deviation changes only slowly, for example as the user cycles in a velodrome. The segments would thus comprise two segments between points 207 and 203, and three segments between points 203 and 205. The sequence of labels could be {cycling, cycling, freestroke, freestroke, freestroke}. Again, the number of segments is dramatically reduced for the sake of clarity of illustration.
[0063] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention. Illustrated is device 300, which may comprise, for example, device 110 of FIGURE 1. Comprised in device 300 is processor 310, which may comprise, for example, a single- or multi-core processor wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 310 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core designed by ARM Holdings or an Excavator processing core produced by Advanced Micro Devices Corporation. Processor 310 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 310 may comprise at least one application-specific integrated circuit, ASIC. Processor 310 may comprise at least one field-programmable gate array, FPGA. Processor 310 may be means for performing method steps in device 300. Processor 310 may be configured, at least in part by computer instructions, to perform actions.
[0064] Device 300 may comprise memory 320. Memory 320 may comprise random-access memory and/or permanent memory. Memory 320 may comprise at least one RAM chip. Memory 320 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 320 may be at least in part accessible to processor 310. Memory 320 may be at least in part comprised in processor 310. Memory 320 may be means for storing information. Memory 320 may comprise computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 overall is configured to run under the direction of processor 310 using computer instructions from memory 320, processor 310 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 320 may be at least in part external to device 300 but accessible to device 300.
[0065] Device 300 may comprise a transmitter 330. Device 300 may comprise a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 330 may comprise more than one transmitter. Receiver 340 may comprise more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with global system for mobile communication, GSM, wideband code division multiple access, WCDMA, long term evolution, LTE, IS-95, wireless local area network, WLAN, Ethernet and/or worldwide interoperability for microwave access, WiMAX, standards, for example.
[0066] Device 300 may comprise a near-field communication, NFC, transceiver 350. NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree or similar technologies.
[0067] Device 300 may comprise user interface, UI, 360. UI 360 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 300 to vibrate, a speaker and a microphone. A user may be able to operate device 300 via UI 360, for example to manage activity sessions.
[0068] Device 300 may comprise or be arranged to accept a user identity module 370. User identity module 370 may comprise, for example, a subscriber identity module, SIM, card installable in device 300. A user identity module 370 may comprise information identifying a subscription of a user of device 300. A user identity module 370 may comprise cryptographic information usable to verify the identity of a user of device 300 and/or to facilitate encryption of communicated information and billing of the user of device 300 for communication effected via device 300.
[0070] Device 300 may comprise further devices not illustrated in FIGURE 3. For — example, where device 300 comprises a smartphone, it may comprise at least one digital camera. Some devices 300 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front- facing camera for video telephony. Device 300 may comprise a fingerprint sensor arranged to authenticate, at least in part, a user of device 300. In some embodiments, device 300 — lacks at least one device described above. For example, some devices 300 may lack a NFC transceiver 350 and/or user identity module 370.
[0071] Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver O 350, UI 360 and/or user identity module 370 may be interconnected by electrical leads N internal to device 300 in a multitude of different ways. For example, each of the = 25 aforementioned devices may be separately connected to a master bus internal to device = 300, to allow for the devices to exchange information. However, as the skilled person will E appreciate, this is only one example and depending on the embodiment various ways of O interconnecting at least two of the aforementioned devices may be selected without 3 departing from the scope of the present invention.
O N 30 [0072] FIGURE 4 illustrates signalling in accordance with at least some embodiments of the present invention. On the vertical axes are disposed, on the left, device 110 of FIGURE 1, and on the right, a server SRV. Time advances from the top toward thebottom. Initially, in phase 410, device 110 obtains sensor data from at least one, and in some embodiments from at least two sensors. The sensor data may comprise sequences of sensor data elements, as described herein above. The sensor or sensors may be comprised in device 110, for example. The sensor data may be stored in a time series, for example at a sampling frequency of 1 Hz, 10 Hz, 1 Khz or indeed another sampling interval. The sampling interval need not be the same in the various sequences of sensor data elements.
[0073] Phase 410 may comprise one or more activity sessions of at least one activity type. Where multiple activity sessions are present, they may be of the same activity type or different activity types. The user need not, in at least some embodiments, indicate to device 110 that activity sessions are ongoing. During phase 410, device 110 may, but in some embodiments need not, identify activity types or sessions. The sequences of sensor data elements compiled during phase 410 may last 10 minutes or 2 hours, for example. As a specific example, the time series may last from the previous time sensor data was downloaded from device 110 to another device, such as, for example, personal computer PCI.
[0074] Further, in phase 410, device 110 segments the sequences of sensor data elements to plural sensor data segments, as described herein above. These segments are then assigned labels to obtain a conversion of the sequences of sensor data elements to a sequence of labels.
[0075] In phase 420, the sequence of labels is provided, at least partly, to server SRV. This phase may further comprise providing to server SRV optional activity and/or event reference data. The providing may proceed via base station 120, for example. The © sequence of labels may be encrypted en route to the server to protect the user’s privacy. & A [0076] In phase 430, server SRV may determine, based at least partly on the A 25 sequence of labels in the message of phase 420, an associated machine readable = instruction. The machine readable instruction may relate, for example, to improved * labelling of segments relating to activities related to the labels in the seguence of labels S received in server SRV from device 110 in phase 420. > I [0077] In phase 440 the machine readable instruction determined in phase 430 is provided to device 110, enabling, in phase 450, a more accurate labelling of segments of sensor data.
[0078] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention. The phases of the illustrated method may be performed in device 110, an auxiliary device or a personal computer, for example, or in a control device configured to control the functioning thereof, when implanted therein.
[0079] Phase 510 comprises storing plural seguences of sensor data elements. Phase 520 comprises deriving, from the plural seguences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub- seguences from at least two of the seguences of sensor data elements. Finally, phase 530 comprises assigning a label to at least some of the sensor data segment based on the sensor — data elements comprised in the respective sensor data segment, to obtain a seguence of labels.
[0080] It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to eguivalents thereof as would be recognized by those ordinarily skilled in the — relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
[0081] Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed. oO > [0082] As used herein, a plurality of items, structural elements, compositional N 25 elements, and/or materials may be presented in a common list for convenience. However, N these lists should be construed as though each member of the list is individually identified I as a separate and unique member. Thus, no individual member of such list should be > construed as a de facto equivalent of any other member of the same list solely based on S their presentation in a common group without indications to the contrary. In addition, 2 30 — various embodiments and example of the present invention may be referred to herein along N with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of thepresent invention.
[0083] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
[0084] While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
[0085] The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of "a" or "an", that is, a singular form, throughout this document does not exclude a plurality.
INDUSTRIAL APPLICABILITY

[0086] At least some embodiments of the present invention find industrial application in facilitating analysis of sensor data.
ACRONYMS LIST
GPS	Global Positioning System
LTE	Long Term Evolution
NFC	Near-Field Communication
WCDMA	Wideband Code Division Multiple Access
WiMAX	Worldwide Interoperability for Microwave Access
WLAN	Wireless Local Area Network
REFERENCE SIGNS LIST

Network Node
Satellite Constellation
201, 202	Axes in FIGURE 2
203, 205, 207	Activity session endpoints in FIGUREs 2 and 2B
210, 220	Sensor data time series in FIGUREs 2 and 2B
310-370	Structure illustrated in FIGURE 3
410-450	Phases of the method of FIGURE 4
510-530	Phases of the method of FIGURE 5
Claims (17)

CLAIMS:
1. A personal multi-sensor apparatus comprising:
— a memory configured to store plural sequences of sensor data elements, and
— at least one processing core configured to:
	▪ derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and
	▪ assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels,
— wherein the apparatus is further configured to determine, based on the sequence of labels, an activity type a user has engaged in while the sequences of sensor data have been obtained.
2. The apparatus according to claim 1, wherein the apparatus is further configured to transmit the sequence of labels to a node in a network.
3. The apparatus according to claim 1, wherein the apparatus is configured to receive, from a node in a network, a machine readable instruction, and to employ the machine readable instruction in determining the activity type.
4. The apparatus according to claim 3, wherein the machine readable instruction comprises at least one of the following: an executable program and an executable script.
5. The apparatus according to any of claims 1 — 4, wherein the apparatus is configured to receive, from a network, at least one machine readable labelling instruction, and to employ the at least one machine readable labelling instruction in the assigning of the label to each sensor data segment.
6. The apparatus according to claim 5, wherein the machine readable labelling instruction comprises at least one of the following: an executable program and an executable script.
7. The apparatus according to any of claims 1 — 6, wherein each of the plural sequences of sensor data elements comprises sensor data elements originating in exactly one sensor.
8. The apparatus according to any of claims 1 — 7, wherein the plural sequences of sensor data elements comprise at least three sequences of sensor data elements.
9. The apparatus according to any of claims 1 — 8, wherein the plural sequences of sensor data elements comprise at least nine sequences of sensor data elements.
10. The apparatus according to any of claims 1 — 9, wherein the apparatus is configured to derive the plural sensor data segments using, at least in part, a suitably trained artificial neural network.
11. A method in a personal multi-sensor apparatus, comprising:
— storing plural sequences of sensor data elements;
— deriving, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements,
— assigning a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels, and
— determining, based on the sequence of labels, an activity type a user has engaged in while the sequences of sensor data have been obtained.
12. The method according to claim 11, further comprising transmitting the sequence of labels to a node in a network.
13. The method according to claim 11, further comprising receiving, from a node in a network, a machine readable instruction, and employing the machine readable instruction in determining the activity type.
14. A server apparatus comprising:
— a receiver configured to receive a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels and the labels not comprising the sensor data elements, and
— at least one processing core configured to:
	▪ determine, based on the sequence of labels, an activity type a user has engaged in while the sensor data elements have been obtained.
15. The server apparatus according to claim 14, wherein the server apparatus is configured to determine the activity type based on comparing the received sequence of labels with a list of label sequences stored in the server apparatus, and by selecting an activity type which is associated with a sequence of labels in the list which matches the received sequence of labels.
16. A method in a server apparatus, comprising: — receiving a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels and the labels not comprising the sensor data elements, and — determining, based on the sequence of labels, an activity type a user has engaged in while the sensor data elements have been obtained.
17. A computer program configured to cause a method in accordance with at least one of claims 11 — 13 or 16 to be performed.
FI20196079A 2018-12-21 2019-12-12 Sensor data management FI129882B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/228,981 US20190142307A1 (en) 2015-12-21 2018-12-21 Sensor data management

Publications (2)

Publication Number Publication Date
FI20196079A1 true FI20196079A1 (en) 2020-06-22
FI129882B FI129882B (en) 2022-10-14

Family

ID=69147143

Family Applications (1)

Application Number Title Priority Date Filing Date
FI20196079A FI129882B (en) 2018-12-21 2019-12-12 Sensor data management

Country Status (5)

Country Link
CN (1) CN111351524A (en)
DE (1) DE102019008548A1 (en)
FI (1) FI129882B (en)
GB (1) GB2581014B (en)
TW (1) TWI729596B (en)

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2425180B (en) * 2005-04-14 2009-03-18 Justin Pisani Monitoring system
JP5028751B2 (en) * 2005-06-09 2012-09-19 Sony Corporation Action recognition device
US8187182B2 (en) * 2008-08-29 2012-05-29 Dp Technologies, Inc. Sensor fusion for activity identification
WO2010083562A1 (en) * 2009-01-22 2010-07-29 National ICT Australia Limited Activity detection
EP2539837A4 (en) * 2010-02-24 2016-05-25 Jonathan Edward Bell Ackland Classification system and method
EP2603870B1 (en) * 2010-08-09 2020-03-18 NIKE Innovate C.V. Monitoring fitness using a mobile device
US8774499B2 (en) * 2011-02-28 2014-07-08 Seiko Epson Corporation Embedded optical flow features
US20150119728A1 (en) * 2011-12-02 2015-04-30 Fitlinxx, Inc. Health monitor
WO2014118767A1 (en) * 2013-02-03 2014-08-07 Sensogo Ltd. Classifying types of locomotion
JP5803962B2 (en) * 2013-03-22 2015-11-04 Sony Corporation Information processing apparatus, sensor apparatus, information processing system, and recording medium
KR101500662B1 (en) * 2013-10-18 2015-03-09 Kyung Hee University Industry-Academic Cooperation Foundation Apparatus and method for activity recognition using a mobile device
CN104680046B (en) * 2013-11-29 2018-09-07 Huawei Technologies Co., Ltd. User activity recognition method and device
EP3077937B1 (en) * 2013-12-02 2020-07-15 NIKE Innovate C.V. Determination of flight time of an athlete
CN103970271B (en) * 2014-04-04 2017-06-20 Zhejiang University Daily activity recognition method fusing motion and physiological sensing data
CN116584928A (en) * 2014-09-02 2023-08-15 Apple Inc. Physical activity and fitness monitor
WO2016087381A1 (en) * 2014-12-02 2016-06-09 Koninklijke Philips N.V. System and method for generating health data using measurements of wearable device
US9654234B2 (en) * 2015-08-28 2017-05-16 Focus Ventures, Inc. System and method for automatically time labeling repetitive data
CN105242779B (en) * 2015-09-23 2018-09-04 Goertek Inc. Method and intelligent mobile terminal for recognizing user motion
US20170232294A1 (en) * 2016-02-16 2017-08-17 SensorKit, Inc. Systems and methods for using wearable sensors to determine user movements
US9830516B1 (en) * 2016-07-07 2017-11-28 Videoken, Inc. Joint temporal segmentation and classification of user activities in egocentric videos

Also Published As

Publication number Publication date
GB201917731D0 (en) 2020-01-15
GB2581014A (en) 2020-08-05
GB2581014B (en) 2021-09-22
TWI729596B (en) 2021-06-01
TW202032327A (en) 2020-09-01
DE102019008548A1 (en) 2020-06-25
CN111351524A (en) 2020-06-30
FI129882B (en) 2022-10-14

Similar Documents

Publication Publication Date Title
US11607144B2 (en) Sensor based context management
US10433768B2 (en) Activity intensity level determination
US20230112041A1 (en) Music Selection Based on Exercise Detection
US10327673B2 (en) Activity intensity level determination
US10488527B2 (en) Automatic tracking of geolocation data for exercises
US9730027B2 (en) Back-filling of geolocation-based exercise routes
US10856776B2 (en) Activity intensity level determination
US8990011B2 (en) Determining user device's starting location
CN107113760A (en) Determining network synchronization state
US20180048996A1 (en) Location and activity aware content delivery system
EP3459271B1 (en) Back-filling of geolocation-based exercise routes
CN106454723A (en) Child monitoring method based on a mobile phone accelerometer
WO2016027001A1 (en) Handling sensor information
US20190142307A1 (en) Sensor data management
US11587484B2 (en) Method for controlling a display
FI129882B (en) Sensor data management
EP3431002B1 (en) RF based monitoring of user activity
EP2751704B1 (en) Method and apparatus for determining environmental context utilizing features obtained by multiple radio receivers
CN114912065A (en) Method and device for calculating movement distance, wearable device and medium
FI20206293A1 (en) Method for controlling a display
GB2579998A (en) Sensor Based context management
CN118447999A (en) Method for monitoring motion information and related equipment
CN116186552A (en) Providing unlabeled training data for training a computational model

Legal Events

Date Code Title Description
PC Transfer of assignment of patent

Owner name: SUUNTO OY

FG Patent granted

Ref document number: 129882

Country of ref document: FI

Kind code of ref document: B