GB2488521A - Activity recognition in living species using tri-axial acceleration data - Google Patents
- Publication number
- GB2488521A (application GB201102714A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- algorithm
- subject
- data
- activity
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
- A61B5/067—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe using accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/40—Animals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/028—Microscale sensors, e.g. electromechanical sensors [MEMS]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Bio-feedback
Abstract
An electronic device is worn by a living species and comprises sensors such as accelerometers and gyroscopes which collect data relating to the movement of the species. This data is processed by an algorithm which identifies the type of activity. The output of the algorithm is also a quantitative and qualitative measure of the subject's activity. The algorithm may use adaptive feature extraction and classification on the sensor outputs. The device may be worn by a dog for a period of time to gather information about the subject's activity during the time period. The algorithm may be enhanced by supervised learning. The invention may be used to determine the changing state of mobility and health of the subject based on sensor data. The invention may provide fine-grained analysis of data, which in turn can be used for simple data display, feedback, decision-making, control or automation.
Description
Worn Activity Recognition Device - WARD
FIELD OF THE INVENTION
The present invention pertains generally to the field of activity recognition and prediction in living species. More specifically, the invention leverages gathered tri-axial acceleration data as a basis for making such activity recognitions.
BACKGROUND OF THE INVENTION
Activity recognition in bipeds and quadrupeds has been attempted for many reasons over the years. One of the most motivating applications lies in the field of health care. The diagnosis and detection of symptoms may be automated if movements and activities can be accurately detected and compared to a normal baseline. The aim of this step is not to remove the need for skilled professionals to make diagnoses, but to give them a more definite set of information rather than the anecdotal accounts they are often faced with.
In addition to collecting the movement and activity data, the subject's behaviour can also be analysed. This data can be used to assess the risk of future health complications and aid in recommending suitable lifestyle changes to avert them.
The data can also be analysed in real time and used with a suitable cue to perform behaviour modifications.
To perform any sort of activity recognition it is necessary to start with a set of measurements: geospatial, video, angular velocity and acceleration are popular choices because of the availability of suitable sensors. The sensor data is usually analysed by an algorithm that detects features and correlates these to actions: sitting, walking, running, etc. This invention comprises a miniature electronic device collecting sensor data, an algorithm that interprets the sensor data and a means of relaying this data to the user or subject. In the case of behaviour modification, this process may use a stimulus to reinforce or discourage the action.
Throughout this document, the example of using the device for quadrupeds is used, however, those skilled in the art will recognise that with subtle changes the device would also be suitable for similar applications in bipeds including humans.
SUMMARY OF THE INVENTION
The invention is best described as a system comprising a set of block components. The exact embodiment and implementation platform of each block is secondary to its input, output and function. Those skilled in the art will note that each block may be implemented in a variety of ways; however, for clarity and conciseness only one implementation is described. For further clarity this implementation is limited to activity recognition upon quadrupeds.
The first block is a worn electronic device that is placed on a moving part of the quadruped. The device is miniature in size and battery operated and its primary function is to gather measurements relating to the body part it is placed upon.
This set of measurements is then relayed onto the next block.
The next block is a computational algorithm taking measurements from the previous block as inputs. The algorithm uses principal component analysis to extract the information from the measurements that is relevant for the subsequent analysis. This analysis comprises the automatic detection and classification of the quadruped's activities of interest (denoted as recognition). The recognition system is initially trained on annotated sample data and then applied to the recorded sensor data. The output of the block is a set of decisions in the time domain about the input data and the activity it most likely pertains to. The algorithm can either be embedded directly onto the hardware of the first block or may be implemented as a function of the last block.
The final block of the system is an alert mechanism. The purpose of this block is to alert an interested party to the findings of the algorithm. This block can be implemented in various modalities, such as sounds or vibrations, but perhaps the most obvious implementation is as a set of visual cues. While these cues may be embedded into the hardware of the first block, the example invention description herein realises them running on a computer.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of one embodiment of the invention hardware. The device consists of several interconnected electronic components on a support substrate such as a printed circuit board. At the core of the hardware is a microcontroller or microprocessor (2) which controls the behaviour of the other system components by executing instructions contained within its internal firmware. The device has inputs from accelerometer(s) (1), gyroscope(s) (3), temperature sensor(s) (13), a real time clock (10) and buttons (7). The device also has a display component (5) that can display messages and information to the user. One or more LEDs (6) are also featured to relay simpler messages to the user; this could conveniently be a single multi-colour LED. A computer connection (9) is also provided which can conveniently use the USB (universal serial bus) standards. The computer connection can access the other system blocks by communicating with the microcontroller/microprocessor (2); this is essential to allow the computer to access the data stored on the device's memory storage (4). The computer connection (9) can also be used to provide power to the device and, by using the charge control circuit (12), the connection allows charging of the device's battery (11), which could conveniently be a lithium polymer rechargeable cell.
Figure 2 is a flow diagram of the data processing algorithm. The process consists of multiple independent parts that exchange information as indicated by the arrows in the figure. Each part could be implemented in hardware or in software. Elliptical elements in the figure describe sub-processes that require multiple steps. Rectangular shapes indicate entities, such as the device (15) and data stream (17), that are either physical or present in digital form and can be stored and communicated. The device (15), as described in figure 1, provides the means to access the recorded data, which is passed to the pre-processing routine (16) where basic transformations such as amplitude and frequency normalisation, and basic analyses such as step-count, distance estimation and energy-expenditure estimation, are performed. The resulting data stream (17) is analysed by the feature extraction (18) to obtain a set of features (19), a compact and suitable representation of the data for further processing. Based on the extracted features (19) an activity classification (20) is performed to provide basic results (21) about the subject's activities. The basic results (21) are then displayed to the user of the system using intuitive forms such as graphs, statistics, tables, informative graphics and text in the visualisation process (22). Interesting parts of the data stream (17) and the features (19) (i.e. outliers found via statistical analysis, and representative instances for a set of activities) are made anonymous and processed in system adaptation (24). Parameters are chosen that can affect the pre-processing (16), the feature extraction (18) and the classification (20), based on statistics about already collected data and the subject's profile (23).
A further, detailed analysis (25) is performed on the extracted features (19) to assess fine-grained details beyond activities such as motor performance, general health, rehabilitation progress and impact on medication or nutrition among others.
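The step-count performed in the pre-processing routine (16) can be illustrated with a short Python sketch. The patent does not specify an implementation, so the peak-detection approach and the threshold and minimum-gap values below are illustrative assumptions only:

```python
def count_steps(magnitude, threshold=1.2, min_gap=10):
    """Count steps as local maxima of the acceleration magnitude (in g)
    that exceed `threshold`, requiring at least `min_gap` samples between
    consecutive steps. Both parameters are illustrative assumptions and
    would be tuned per subject and sensor placement in practice."""
    steps = 0
    last_step = -min_gap
    for i in range(1, len(magnitude) - 1):
        is_peak = (magnitude[i] >= magnitude[i - 1]
                   and magnitude[i] > magnitude[i + 1])
        if is_peak and magnitude[i] > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps
```

A distance estimate could then follow by multiplying the step count by a stride length held in the subject's profile (23).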
DETAILED DESCRIPTION OF THE DRAWINGS
FIGURE 1 Figure 1 shows a block diagram of one particular configuration of the hardware associated with this patent. Those skilled in the art will realise that a hardware device as described here could be realised in many permutations. The aim of this section is not to limit the device to a single possible configuration but to describe in detail one such permutation to allow recreation of the system by a person skilled in the art.
The device is made by attaching electronic components to a substrate known as a printed circuit board (PCB); this is the standard way to produce electronic devices and is described only briefly since no new information is claimed in this field.
The PCB is likely to be a multi-layer glass laminate suitable for population with modern surface-mount components such as the ones mentioned in the following paragraphs.
The principal component of the system is the microcontroller or microprocessor (2) and the instructional code contained within its program memory. This code, known as the firmware, contains the required logic sequences to make the device behave in the desired way. The firmware is loaded onto the device much like a computer program on any computer; that is to say, the device cannot function without it. The firmware is part of the microcontroller (2) in the system block diagram. The microcontroller communicates with the other sensor outputs using electrical interconnects on the PCB. There are several predefined ways in which this occurs, such as serial peripheral interfaces (SPI), I2C interfaces and parallel clocked communication. The details of these are not described further since no new innovations are claimed.
The most important sensors in this typical example device are the accelerometer (1) and the gyroscope (3). In this example a tri-axial accelerometer is used, such as the ADXL330 by Analog Devices. The gyroscope is an L3G4200 device by STMicroelectronics, which also senses motion on three axes. The output from these sensors is aggregated by the microcontroller and either (or both) analysed by the device or stored in the device's memory (4). In the former case, the device must have an activity recognition algorithm in its firmware. In the latter case, the device simply needs to store the data (using the device's memory (4)) for post analysis on a computer. The memory must be capable of storing a large quantity of data to allow data logging to take place for extended periods; there are many suitable memory alternatives and in this example a micro secure digital (SD) card has been used.
The device may use a display (5) to relay information to the subject or the user and provide feedback while the user is completing tasks such as setting the time, turning the device on or off, or requesting the status of battery or memory longevity. The display (5) is used in combination with the buttons (7) and the multi-colour LED (6). In the case where the device is monitoring the subject's activity levels, the display can also chart the user's statistics such as calories expended, number of steps, percentage of the day spent walking, distance covered, etc. The exact information presented on the display depends entirely on the deployment scenario and end application. The display component itself is likely to be a monochrome or colour liquid crystal display or, if battery longevity is of lesser concern, an OLED or colour graphic display capable of more impressive graphics. In this example device, the display is a 128x64 pixel transflective monochrome LCD.
The device is for much of the time powered by its battery (8), which makes power consumption important. The battery is a rechargeable lithium polymer cell with a typical capacity of 200 mAh. It is managed by a charge controller circuit (11). The battery is connected to a regulator circuit (12) which provides a regulated voltage of 3.3 V to the peripheral components and the microcontroller.
The charge controller takes current from the micro USB connector (9) when the device is attached to a charger or host computer.
Several other peripherals may be incorporated into the device; in this example a temperature sensor (13) and a real time clock (10) have been added. These provide additional qualifying data that augments the movement data from the accelerometer (1) and gyroscope (3). When connected to a host computer, the device begins charging its battery whilst allowing access to the data that has been collected in its memory (4). During this connection the device communicates with an application running on the host computer, allowing the user to interact with the data through the computer's richer multimedia environment (e.g. using the computer's full colour monitor, speakers and keyboard/mouse). In this example the device makes use of the capabilities of the operating system to access the data in the form of a file system (mass storage drive). The application software communicates with the device via a separate standard known as a communication device class. This allows simultaneous access to the data and device settings/sensor outputs whilst charging the battery.
Finally, the device should be housed in a suitable protective enclosure to prevent damage to the battery or shorting of the electrical contacts. The housing must also not obscure the display (5) or prevent operation of the push buttons (7). The housing specification is very application specific and is not described further.
FIGURE 2 Figure 2 shows a flow diagram of the signal processing and activity recognition algorithm. Those skilled in the art will realise that the basic procedure (left half of Figure 2: 15, 16, 17, 18, 19, 20, 21) corresponds to a classical statistical pattern recognition approach for sequentially gathered sensor data. It is based on a supervised training procedure that exploits annotated sample data and statistical models for the activities of interest (e.g. running, walking, lying, eating, swimming). This approach is used for automatic analysis of the participant's behaviour. In addition, the procedure provides means for automatic adaptation towards the idiosyncrasies of the particular monitored subject (right half of Figure 2: 23, 24, 25, 26). Both the feature extraction module (18) and the activity classification module (20), which are pre-trained on species-related but not individual-related sample data, are automatically adapted towards the best fit with the particular subject to be monitored. To this end, background knowledge about the specific subject is incorporated (23) and automatic, i.e. unsupervised, analysis methods are applied for the refinement of the feature extraction (18) and the activity classification itself (20). System adaptation (24) is based on automatically analysed data collected from the subject wearing the device (15) and utilises the pre-trained analysis system.
Data gathering is done using the device described in Figure 1, which the participant wears during its regular activities. The raw sensor data is then pre-processed (16), i.e. normalised (with respect to device orientation, gravity and value range) and filtered (de-noising, outlier removal, band-pass filtering). All pre-processing procedures correspond to basic analytic manipulations of either scalar or vector data, which can be implemented as a hardware module.
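A minimal Python sketch of these normalisation and filtering steps, assuming raw samples arrive as (x, y, z) tuples in units of g; the window length is an illustrative choice, and a real implementation would use a proper band-pass filter rather than a moving average:

```python
import math

def preprocess(samples, window=5):
    """Convert raw tri-axial samples into an orientation-independent
    magnitude signal, subtract 1 g of static gravity, then smooth with
    a moving average (a crude stand-in for low-pass filtering)."""
    mags = [math.sqrt(x * x + y * y + z * z) - 1.0 for x, y, z in samples]
    half = window // 2
    smoothed = []
    for i in range(len(mags)):
        lo, hi = max(0, i - half), min(len(mags), i + half + 1)
        smoothed.append(sum(mags[lo:hi]) / (hi - lo))
    return smoothed
```

Working on the magnitude rather than the individual axes is one simple way to obtain the orientation independence the pre-processing aims for.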
The resulting filtered data (17) is then fed into the feature extraction module (18). The purpose of feature calculation is to extract the information that pertains to the activities of interest for recognition, i.e. dimensionality reduction. Sliding an analysis window of fixed length along the sensor data sequence extracts frames of contiguous samples, which are converted into feature vectors (19). For generalisation purposes, i.e. independence of the procedure with respect to inter- and intra-species differences, and for efficiency reasons, feature extraction (18) is based on a tailored Principal Component Analysis (PCA) approach. The latter corresponds to a matrix multiplication, which can easily be implemented as a hardware module of the device (15) itself, thereby providing rapid feature extraction for real-time analysis and efficient on-board data storage.
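The windowing and PCA projection described above can be sketched as follows; the frame length, step and number of retained components are illustrative assumptions, and the projection would in practice be estimated offline on species-level training data before being loaded onto the device:

```python
import numpy as np

def make_frames(signal, frame_len=32, step=16):
    """Slide a fixed-length analysis window along a 1-D signal,
    extracting frames of contiguous samples."""
    return np.array([signal[i:i + frame_len]
                     for i in range(0, len(signal) - frame_len + 1, step)])

def fit_pca(frames, n_components=4):
    """Offline training step: estimate a PCA projection from sample frames."""
    mean = frames.mean(axis=0)
    cov = np.cov(frames - mean, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)               # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_components]  # keep the largest components
    return mean, vecs[:, order]

def extract_features(frames, mean, components):
    """On-device step: one matrix multiplication per frame."""
    return (frames - mean) @ components
```

The on-device part reduces to a single matrix multiply per frame, which is what makes the hardware implementation mentioned above straightforward.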
Based on the feature representation of the sensor data (19) the actual classification of the activities of interest is performed (20). The recognition process (the term is used synonymously with classification) consists of assigning identities of activities of interest (plus unknown or idle) to dedicated portions of the sensor data streams. The recognition hence consists of a detection step (i.e. determining when something of interest is happening) and an identification step (i.e. determining what is happening). In order to capture the variance of the activities of interest, statistical models are employed for the recognition step. Those skilled in the art of pattern recognition and machine learning will realise that there are a number of approaches to sequential data analysis. The activity classification module (20) is derived from a supervised training procedure exploiting annotated sample data.
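The patent leaves the choice of statistical model open, so the sketch below uses a deliberately simple nearest-centroid classifier as a stand-in, with a distance threshold playing the role of the detection step (rejecting a frame as "unknown"); all names and parameters are illustrative:

```python
def train_centroids(features, labels):
    """Supervised training: average the annotated feature vectors
    into one centroid per activity of interest."""
    centroids = {}
    for lab in set(labels):
        rows = [f for f, l in zip(features, labels) if l == lab]
        centroids[lab] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def classify(feature, centroids, reject_dist=None):
    """Identification: pick the nearest centroid. Detection: reject as
    'unknown' when no centroid is closer than `reject_dist`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best = min(centroids, key=lambda lab: dist(feature, centroids[lab]))
    if reject_dist is not None and dist(feature, centroids[best]) > reject_dist:
        return "unknown"
    return best
```

A production system would replace the centroids with the statistical sequence models the description refers to, but the train-then-classify structure is the same.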
The basic results of this recognition step (21) (e.g. information about on-going activities of the monitored subject, activity statistics based on historical data, and estimated health status based on the analysis of energy expenditure incorporating background knowledge about the particular participant's requirements (23)) are forwarded to the visualisation module (22), either on the device (for real-time visualisation) or to accompanying analysis software for offline analysis and storage.
For the fine-grained and individualised analysis of the behaviour of a particular quadruped, background information about the species of interest (23) is incorporated into the evaluation process. This subject profile is used for the calibration of specialised calculation schemes such as energy expenditure or pedometer analysis, both of which depend on factors such as the size of the animal/subject. Furthermore, the system adaptation module (24) continuously updates the feature extraction (18) and activity classification (20) modules by modifying the underlying statistical procedures (PCA for feature extraction and statistical sequence modelling for activity classification) using, for example, linear transformations estimated on species-specific data, which are automatically annotated using the existing analysis system.
The data processing procedure provides insight into the quality of the activities, in addition to the basic results covering the type of activities a monitored subject performs and statistics about those activities as they are performed over a certain period of time (typically a day). This detailed analysis (25) is based on unsupervised time-series analysis techniques, which uncover signal quality breakdowns and, based on this, infer metric information for a quantitative analysis of the monitored activities, which is provided as detailed results (26) to the visualisation module (offline).
ENHANCEMENTS
The primary function of this invention is a system for identifying activities made by a tagged person or animal and presenting the results to an observer. While this task remains a fundamental component of the invention, many enhancements and peripheral functions may be added to extend this basic functionality or make it accessible across different mediums. One such enhancement might be to add a transceiver module to the device. This would enable real-time analysis to be carried out on the gathered data instead of a post-analysis approach.
Another enhancement might be to add an input mechanism that enables the gathering of data to be instigated upon a given condition, or the gathered data to be contextualised with environmental data. The scope for implementing such input mechanisms is broad and application specific; however, GPS, infrared proximity sensors, switches and timing devices are some examples.
In many applications, when an action or activity is detected by the computational algorithm block, an alert or indication to an interested party may be required.
While in some cases it might be sufficient to make the alert via software running on a PC, this might not always be appropriate. One example illustrating this case is where the results of the activity recognition step are used to trigger an event.
An application such as an automatic animal feeder may release food upon detecting certain behaviour patterns. In this case the detection of the activity should be acted on immediately rather than deliberated over at the end of the data gathering session. Many such examples exist that rely on the described invention for correct operation but do not follow the exact embodiment described.
BENEFITS OF THE INVENTION
The invention can be used to determine the changing state of mobility and health of the subject based on the sensor data. The findings can be used to diagnose recovery or deterioration in health. Currently this is performed using anecdotal data, which is often unreliable, or with annotated video data, which is labour intensive to gather (and often intrusive).
Extended analysis of the sensor data can reveal insights into the subject's lifestyle.
Health care professionals can use this information to foresee long-term risks of future health problems. Many of these may be avoided with suitable advice and lifestyle changes.
The invention can be used to quantitatively measure the effectiveness of behavioural modification methods by providing historic measures of the behaviour of interest. For example, an individual who bites their nails can see the effectiveness of an anti-nail-biting product by looking at the frequency of nail biting over the course of treatment.
In the example of nail biting, the invention could be fitted with a cueing method, such as a vibration motor, that prompts the subject when the behaviour in question is detected: in situ prompting. In this way the invention may also augment or replace the treatment method under scrutiny.
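In-situ prompting can be sketched as firing the cue on detection while suppressing repeat prompts within a short refractory window, so the motor does not buzz continuously during one episode. The `vibrate` callback and the refractory interval are assumptions of this sketch, not part of the disclosed embodiment:

```python
def cue_on_detection(now, detected, last_cue, vibrate, refractory=5.0):
    """Prompt the subject when the behaviour is detected.

    `now` is the current time in seconds, `last_cue` the time of the previous
    prompt (or None), and `vibrate` a stand-in for the motor driver. Repeat
    prompts within `refractory` seconds are suppressed. Returns the updated
    last-cue time.
    """
    if detected and (last_cue is None or now - last_cue >= refractory):
        vibrate()
        return now
    return last_cue
```

The device would call this once per classification window, threading the returned timestamp through.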
In a scenario where a group of subjects are aiming to make similar lifestyle changes, the invention allows cross comparison of progress. This information can be fed back to the subject group to provide motivation. This is an example of using the invention to compare subjects; it could be extended by broadcasting each subject's progress to a social network, where non-participating interested parties could view and comment on it.
OTHER EMBODIMENTS
From the foregoing description it will thus be evident that the present invention provides a design for realising a worn device capable of gathering sensor readings, together with algorithms capable of inferring predefined behaviour and activity events from the gathered data. As various changes can be made in the above embodiments and operating methods without departing from the scope of the following claims, it is intended that all matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense.
Variations or modifications to the design and construction of this invention, within the scope of the appended claims, may occur to those skilled in the art upon reviewing the disclosure herein (especially to those using computer aided design systems). Such variations or modifications are intended to be encompassed within the scope of any claims to patent protection issuing upon this invention.
Claims (14)
- CLAIMS The embodiments of the invention in which I claim an exclusive property or privilege are defined as follows: 1) An electronic device with sensor inputs, which may or may not be part of the device, that is attached to a subject, which may be human, animal or other, and gathers input data from the sensors for analysis, either in real time or at a later time, by an algorithm which may be part of the device or part of another system in communication with the device, where the output of the algorithm is a quantitative and/or qualitative measure of the subject's activity.
- 2) A device as described in claim 1 where the sensors are motion sensors with one or more accelerometers and/or gyroscopes capable of measuring linear acceleration and angular velocity respectively.
- 3) A device as in claim 2 that is attached to a quadruped subject to quantitatively measure the subject's activity using activity recognition algorithms which apply adaptive feature extraction and classification to the sensor outputs.
- 4) A device as described in claim 3 where the subject is a dog and the device is worn by the subject for a period of time to gather information about the subject's activity during that period.
- 5) A device as described in claim 1 but with the addition of a method for stimulating the subject, or alerting another system or device, in response to sensed activities, such as, but not limited to, by electric shock or vibration motor, for the purpose of reinforcing or discouraging the detected activities so that the subject's behaviour may be altered by the device.
- 6) A method for detecting activities from the output of a group of sensors, either in real time or on stored data, with an algorithm which may be part of, but not limited to, a portable system, a personal computer or an externally hosted application.
- 7) A method as described in claim 6 with sensor inputs from one or more accelerometers and/or gyroscopes where the algorithm is enhanced by using supervised training.
- 8) A method as described in claim 7 where the algorithm is further enhanced by using either supervised or unsupervised feature learning.
- 9) A method as described in claim 8 where the algorithm uses adaptable feature extraction and classification for quantitative analysis of the subject's behaviour.
- 10) A system comprising an algorithm and an electronic device that is attached to a subject to gather sensor data from local and/or remote sensors on or around the subject, where the sensor outputs are digitally analysed by the algorithm to identify and record the subject's activity.
- 11) A system as described in claim 10 where the subject is a quadruped and the device and sensors are attached to the subject to gather movement data from the sensors which may be, but not limited to, a group of one or more accelerometers and/or gyroscopes.
- 12) A system as described in claim 11 where the sensors are one or more accelerometers and/or gyroscopes and the subject is a dog and the sensor data is recorded on the device for analysis by a computer.
- 13) A system as described in claim 11 where the sensors are one or more gyroscopes and/or accelerometers and the sensor data is analysed by the algorithm within the device to measure the subject's current activity, whilst also having the ability to record the sensor outputs, if required, along with the current detected activity.
- 14) A system as described in claim 13 but with the addition of a stimulating device, which may be, but not limited to, an electric shock generator or vibration motor, so that the subject may be stimulated based on the outcome of the activity detection to allow behaviour change through reinforcement or discouragement.
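The feature extraction and classification named in claims 3 and 9 are claimed only abstractly; one plausible instance, sketched here purely for illustration, is per-window summary features on the tri-axial accelerometer stream followed by nearest-centroid classification against centroids fitted during the supervised training of claim 7. The specific features, labels and centroid values below are assumptions, not the claimed algorithm:

```python
import math


def window_features(ax, ay, az):
    """Extract simple features from one window of tri-axial accelerometer
    samples: per-axis mean plus signal magnitude area (SMA), one common
    choice for activity recognition (an assumption of this sketch)."""
    n = len(ax)
    mean = lambda xs: sum(xs) / n
    sma = sum(abs(x) + abs(y) + abs(z) for x, y, z in zip(ax, ay, az)) / n
    return (mean(ax), mean(ay), mean(az), sma)


def classify(features, centroids):
    """Nearest-centroid classification: return the activity label whose
    trained centroid lies closest in feature space."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))
```

In use, labelled training windows would be averaged per activity to produce `centroids`, after which each new window is featurised and classified.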
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB201102714A GB2488521A (en) | 2011-02-16 | 2011-02-16 | Activity recognition in living species using tri-axial acceleration data |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201102714D0 GB201102714D0 (en) | 2011-03-30 |
GB2488521A true GB2488521A (en) | 2012-09-05 |
Family
ID=43859514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB201102714A Withdrawn GB2488521A (en) | 2011-02-16 | 2011-02-16 | Activity recognition in living species using tri-axial acceleration data |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2488521A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016100296A1 (en) * | 2014-12-15 | 2016-06-23 | i4c Innovations Inc. | Animal caloric output tracking methods |
EP3255551A4 (en) * | 2015-02-06 | 2018-11-14 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
EP3603388A4 (en) * | 2017-03-31 | 2020-12-16 | NTT Technocross Corporation | Behavior specifying device, behavior specifying method and program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006113804A2 (en) * | 2005-04-20 | 2006-10-26 | Vivometrics, Inc. | Systems and methods for non-invasive physiological monitoring of non-human animals |
US20100176952A1 (en) * | 2008-12-04 | 2010-07-15 | The Regents Of The University Of California | System for detection of body motion |
WO2010090867A2 (en) * | 2009-01-21 | 2010-08-12 | SwimSense, LLC | Multi-state performance monitoring system |
WO2010140117A1 (en) * | 2009-06-04 | 2010-12-09 | Koninklijke Philips Electronics, N.V. | Method and system for providing behavioural therapy for insomnia |
Also Published As
Publication number | Publication date |
---|---|
GB201102714D0 (en) | 2011-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11468976B2 (en) | Apparel and location information system | |
Cornacchia et al. | A survey on activity detection and classification using wearable sensors | |
CN106956271B (en) | Predict the method and robot of affective state | |
RU2672684C2 (en) | Sensory stimuli to increase accuracy of sleep staging | |
Ortiz | Smartphone-based human activity recognition | |
CN107708412B (en) | Intelligent pet monitoring system | |
CN106456000B (en) | The based drive estimation of biometric signal | |
US20180348187A1 (en) | Method, Apparatus and System for Food Intake and Physical Activity Assessment | |
Mikos et al. | A wearable, patient-adaptive freezing of gait detection system for biofeedback cueing in Parkinson's disease | |
Cook et al. | Mining the home environment | |
CN110291489A (en) | The efficient mankind identify intelligent assistant's computer in calculating | |
CN105792740A (en) | Detection and calculation of heart rate recovery in non-clinical settings | |
CN104808783A (en) | Mobile terminal and method of controlling the same | |
JP6742380B2 (en) | Electronic device | |
Pasquet et al. | Wireless inertial measurement of head kinematics in freely-moving rats | |
JP2016008818A (en) | Detection apparatus and method, and program | |
CN111986530A (en) | Interactive learning system based on learning state detection | |
Rad et al. | Stereotypical motor movement detection in dynamic feature space | |
GB2488521A (en) | Activity recognition in living species using tri-axial acceleration data | |
US20210085233A1 (en) | Wearable Device for Determining and Monitoring Emotional States of a User, and a System Thereof | |
US20230004795A1 (en) | Systems and methods for constructing motion models based on sensor data | |
US20220323189A1 (en) | Jaw movement analysis system | |
KR102328182B1 (en) | Wearable device for counting physcial contact between multiful users and system for improving their relationships using the same | |
Gabrielli et al. | Action recognition to estimate Activities of Daily Living (ADL) of elderly people | |
Sec et al. | System for detailed monitoring of dog’s vital functions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |