GB2618780A - Health tracking method and system - Google Patents

Health tracking method and system

Info

Publication number
GB2618780A
Authority
GB
United Kingdom
Prior art keywords
data
diagnosis
subject animal
sensor data
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2206928.0A
Other versions
GB202206928D0 (en)
Inventor
Hewitt James
Takki Jimmy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dassiet Oy
Original Assignee
Dassiet Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dassiet Oy
Priority to GB2206928.0A
Publication of GB202206928D0
Publication of GB2618780A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02405 Determining heart rate variability
    • A61B 5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0816 Measuring devices for examining respiratory frequency
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1128 Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B 5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B 5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B 5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data involving training the classification device
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A61B 2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/40 Animals
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 5/00 Computing arrangements using knowledge-based models
    • G06N 5/01 Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Cardiology (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Fuzzy Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Computational Linguistics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dermatology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A method for diagnosing and/or predicting a health condition includes receiving sensor data 201 describing one or more aspects of a subject animal's health status, extracting features 203 from the sensor data, the extracted features corresponding to the health status of the subject animal, and determining at least one prediction or diagnosis based on the one or more extracted features 204, wherein the determining is performed using collaborative filtering in a recommender system based on features and associated diagnoses for a plurality of different animals. The method may use visual media as sensor data, and the mobility of the subject animal may be an aspect of the physical health status. This may involve the use of musculoskeletal markers corresponding to joints.

Description

HEALTH TRACKING METHOD AND SYSTEM
Technical Field
The invention relates to a method and system for tracking the health status of an animal.
Background
The number of pets has increased rapidly, and all pets, as well as livestock, are at risk of injury and disease. Pets in particular are living longer, and most will experience age-related degradation of their health as they age. Concerns over the wellbeing of livestock are now a priority for those involved in livestock farming.
Summary of the Invention
A first aspect of the invention relates to a method. The method comprises receiving sensor data describing one or more aspects of a subject animal's health status, extracting features from the sensor data, the extracted features corresponding to the health status of the subject animal, and determining at least one prediction or diagnosis based on the one or more extracted features, wherein the determining is performed using collaborative filtering in a recommender system based on features and associated diagnoses for a plurality of different animals.
The health status may include a physical health status.
The plurality of different animals may comprise one or more of: animals of the same species, animals of the same species sub-type, animals of the same strain, or animals of the same limb type.
One of the aspects of the physical health status of the subject animal may be mobility of the subject animal, and the sensor data may comprise at least one visual media item, the visual media item depicting movement of the subject animal and comprising one or more of a video and a sequence of still images.
The method may further comprise receiving at least one annotation of the visual media item, the annotation comprising one or more of:
  • a range corresponding to a time range in the video, or to images in the sequence of still images, in which features corresponding to movement of the subject animal are to be extracted;
  • musculoskeletal markers corresponding to joints of the subject animal and, optionally, lines connecting the musculoskeletal markers;
  • a highlighted area in the video or images in the sequence of still images from which features corresponding to movement of the subject animal are to be extracted; and
  • text-based tags associated with positions and/or times within the video, or with positions within the still images in the sequence of still images.
Extracting features from the sensor data may further comprise one or more of:
  • extracting features from the annotations; and
  • limiting the extraction of features from the sensor data based on the annotations.
The method may further comprise outputting a video comprising the visual media item depicting movement of the subject animal and the annotations.
The method may further comprise capturing the visual media item with a smartphone, tablet, webcam or camera.
The method may further comprise processing the visual media item to detect any visible injuries on the subject animal.
The health status may include a mental, psychological or behavioural status.
The sensor data may comprise one or more of:
  • audio data;
  • LIDAR data;
  • accelerometer data;
  • heart rate data;
  • heart rate variability data;
  • inter-beat interval data;
  • RR interval data;
  • temperature data;
  • respiratory rate data;
  • location data;
  • light sensor data;
  • sleep data;
  • food and/or water consumption data;
  • smart cat-flap/dog-flap data; and
  • smart bed/basket data.
The method may further comprise receiving medical data for the subject animal and determining at least one diagnosis may be further based on the received medical data.
The medical data may comprise one or more of:
  • electronic medical records;
  • free text input; and
  • medical observations obtained from a decision tree.
Determining at least one prediction or diagnosis may further comprise using an animal-specific model in the diagnosis recommender system based on historical features for the subject animal.
When the method comprises determining at least one diagnosis, the method may further comprise receiving a veterinarian confirmed diagnosis, and training the recommender system based on the received veterinarian confirmed diagnosis.
When the method comprises determining at least one prediction, the method may further comprise receiving further sensor data describing one or more aspects of a subject animal's health status, extracting features from the further sensor data, the extracted features corresponding to the health status of the subject animal, determining at least one diagnosis based on the one or more extracted features, wherein the determining is performed using collaborative filtering in a recommender system, and training the recommender system based on the correlation between the at least one prediction and the at least one diagnosis.
The method may further comprise identifying a treatment plan using a treatment plan recommender system based on the diagnosis and extracted features.
The method may further comprise receiving feedback on the condition of the subject animal following the treatment plan. The feedback may comprise one or more of:
  • veterinarian or physician observations;
  • pet owner observations;
  • electronic medical record updates; and
  • sensor data.
The method may further comprise updating the treatment plan recommender system based on the identified diagnosis, extracted features and received feedback.
The treatment plan recommender system may be a matrix factorization recommender system or a restricted Boltzmann machine.
The method may further comprise outputting the identified treatment plan.
The method may further comprise outputting the determined diagnosis or prediction.
Outputting the identified treatment plan and/or the determined diagnosis may comprise transmitting the identified treatment plan and/or the determined diagnosis over a network to a remote computing device and displaying the identified treatment plan and/or the determined diagnosis on a screen associated with the remote computing device.
Sensor data, annotations, and/or medical data may be received from a remote computing device via a network.
The recommender system may be a matrix factorization recommender system or a restricted Boltzmann machine.
The subject animal may be a mammal, such as a human being, dog or cat.
According to a second aspect of the invention, a data processing system is provided, the data processing system comprising at least one processor configured to perform the method described above.
According to a third aspect of the invention, a computer program is provided. The computer program comprises instructions which, when the program is executed by a computer, cause the computer to carry out the method described above.
According to a fourth aspect of the invention, a computer-readable medium is provided.
The computer-readable medium comprises instructions which, when executed by a computer, cause the computer to carry out the method described above.
Brief Description of the Drawings
Figure 1 A schematic diagram depicting a system according to an embodiment of the invention.
Figure 2 A flow chart depicting a method of determining a prediction or diagnosis according to an embodiment of the present invention.
Figure 3 A flow chart depicting a method of determining a prediction or diagnosis of a mobility-related condition according to an embodiment of the present invention.
Figure 4 A flow chart depicting a method of determining a prediction or diagnosis and treatment plan according to an embodiment of the present invention.
Detailed Description of the Invention
The invention relates to a system and method for tracking the health status of an animal.
In this context, the term "animal" encompasses organisms in the biological kingdom animalia, including animals kept as pets, livestock or working animals, as well as human beings. Tracking the health status of the animal includes providing predictions and diagnoses of health conditions based on sensor data describing one or more aspects of the animal's health status. The sensor data may be actively collected from the animal as part of the method, or the sensor data may have been captured prior to the beginning of the method and may be merely provided as an input to the method, which is limited to subsequent processing of the sensor data.
The health status of the animal can include the animal's overall physical status and/or one or more aspects of the animal's physical status, such as mobility, respiration, and chronic disease progression or regression. The health status may additionally or alternatively include mental, psychological or behavioural health.
Figure 1 is a schematic diagram depicting a system according to an embodiment of the present invention. The system includes at least a computer 100, which may be any general purpose computer, such as a server. The computer 100 includes a CPU 101, network interface 102, random access memory (RAM) 103 and non-volatile storage 104.
The computer 100 may be located in a data centre and be accessible via a network 150, such as the internet. The computer need not be dedicated hardware, but may be a virtual machine or the method, described in more detail below, may be executed as a containerized application using OS-level virtualization.
The system may also include a plurality of sensors 120a-120c, which may include one or more of cameras, microphones, LIDAR sensors, accelerometers, gyroscopes, heart rate sensors, thermometers, respiratory sensors such as blood oxygen meters, location sensors, light sensors, food and/or water consumption sensors, smart cat-flap/dog-flap sensors and smart bed/basket (also referred to as instrumented bed or basket) sensors. The sensors 120a-120c communicate with the computer 100 via the network 150. The sensors 120a-120c may be connected to the network 150 directly, e.g. via a cellular data connection, or indirectly, e.g. via intermediate networks such as an internet of things (IoT) network. The sensors 120a-120c may also be connected to the network 150 via an intermediate computer (not shown) which is itself connected to the network. Different ones of the sensors 120a-120c may be connected in different ways; some may be connected directly to the network 150, others may be connected via an intermediate IoT network, and others may be connected via an intermediate computer.
The system may also include a mobile device 130, which may be a smartphone, tablet computer or other mobile computing device. The mobile device 130 may include a camera, microphone and possibly other sensors, such as LIDAR sensors, from which sensor data describing the health status of the animal may also or alternatively be received and provided to the computer 100. The mobile device 130 is connected to the network 150 directly or indirectly and can communicate with the computer 100 via the network 150. The mobile device 130 may also be used to access information about the health status of the animal, and predictions or diagnoses generated by the computer 100 may be delivered to the mobile device 130 and output via the mobile device 130. The mobile device 130 may also store medical data related to the animal. The medical data may be medical data that was input via the mobile device 130 or may have been received from another device. The medical data may be transmitted via the network 150 to the computer 100.
The system may also include a second computer 140, such as a desktop computer, laptop computer or thin client, which is connected to the network 150 and can communicate with the computer 100 via the network 150. The second computer 140 may be connected to sensors such as cameras, microphone and others as described above, from which sensor data describing the health status of the animal may also or alternatively be received and provided to the computer 100. Sensor data may be stored on the second computer 140 and subsequently transmitted to the computer 100 for processing. The second computer 140 may also be used to access information about the health status of the animal, and predictions or diagnoses generated by the computer 100 may be delivered to the second computer 140 and output via the second computer 140. The second computer 140 may also store medical data related to the animal. The medical data may be medical data that was input via the second computer 140 or may have been received from another device. The medical data may be transmitted via the network 150 to the computer 100.
Figure 2 is a flow chart depicting a method 200 of determining a prediction or diagnosis according to an embodiment of the present invention. The method 200 may run on the computer 100 shown in Figure 1. The method 200 includes step 201, in which sensor data describing one or more aspects of a subject animal's health status is received. The sensor data may be received from any combination of sensors 120a-120c, mobile device 130, and second computer 140. The sensor data may include one or more of video data, image data, audio data, LIDAR data, accelerometer data, heart rate data, for example pulse rate, heart rate variability, inter-beat intervals, and RR intervals, temperature data, respiratory rate data, activity data, location data, light sensor data, sleep data, food and/or water consumption data, smart cat-flap/dog-flap data and smart bed/basket data.
At optional step 202, medical data of the subject animal is received. The medical data may include one or more of electronic medical records, free text input, and medical observations obtained from a decision tree. The medical data includes subjective data describing the health status of the subject animal as recorded by a veterinarian, doctor or other professional, or by the owner of a pet, working animal or livestock animal. The medical data may additionally or alternatively include objective data, such as sensor or other measurement data, that has already been stored in a medical record.
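By way of a non-limiting illustration, collecting medical observations from a decision tree, as mentioned above, might be sketched as follows; the questions, answer keys and observation labels are hypothetical examples, not taken from the patent:

```python
# Hypothetical yes/no decision tree for collecting a structured
# medical observation from, e.g., a pet owner.
TREE = {
    "question": "Is the animal limping?",
    "yes": {
        "question": "Is the limp worse after rest?",
        "yes": {"observation": "stiffness after rest"},
        "no": {"observation": "constant limp"},
    },
    "no": {"observation": "no mobility complaint"},
}

def collect_observation(tree, answers):
    """Walk the tree with a sequence of 'yes'/'no' answers and return
    the observation at the leaf; answers are assumed to reach a leaf."""
    node = tree
    for ans in answers:
        if "observation" in node:
            break
        node = node[ans]
    return node["observation"]

obs = collect_observation(TREE, ["yes", "yes"])
# obs -> "stiffness after rest"
```

The resulting observation string could then be stored alongside the electronic medical record and passed to feature extraction in step 203.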
At step 203, the received sensor data is processed to extract features corresponding to the health status of the subject animal. If medical data was also received, the medical data is also processed to extract features corresponding to the health status of the subject animal. As mentioned above, the health status may relate to physical health and/or mental, psychological or behavioural health.
Feature extraction transforms the input sensor and/or medical data into a set of features upon which subsequent analysis can be performed. Feature extraction may be performed using any suitable feature extraction algorithm known in the art for a given data type. For example, features may be extracted from most data types, including video, image or LIDAR data, using convolutional neural networks; features may be extracted from audio data based on mel-frequency cepstral coefficients; and features may be extracted from text data using a bag-of-words method or other natural language processing techniques.
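As a non-limiting sketch of one of the techniques mentioned above, a minimal bag-of-words extractor for free-text input might look like the following; the vocabulary and example text are illustrative assumptions:

```python
from collections import Counter

def bag_of_words_features(text, vocabulary):
    """Map free text onto a fixed-length count vector over a known
    vocabulary (one count per vocabulary term)."""
    counts = Counter(text.lower().split())
    return [counts[term] for term in vocabulary]

# Hypothetical vocabulary of clinically relevant terms.
vocab = ["limping", "lethargic", "coughing", "scratching"]
features = bag_of_words_features(
    "Dog limping on hind leg and limping worse after rest", vocab
)
# features -> [2, 0, 0, 0]
```

In practice the vocabulary would be far larger and tokenisation more careful (punctuation, stemming), but the output is the same kind of fixed-length feature vector consumed by the recommender system in step 204.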
At step 204, at least one prediction or diagnosis is determined based on the one or more extracted features from step 203. A prediction is an indication that the subject animal is at risk of developing a certain condition or is likely to develop the condition without intervention. A diagnosis is an indication that the subject animal is presently suffering from a certain condition.
The determining is performed using collaborative filtering in a recommender system that has been trained based on features and associated diagnoses for a plurality of different animals. Recommender systems and collaborative filtering are well known in the field of media content recommendation, where they are trained on ratings of media content items provided by users. A given user's own ratings can then be used to predict that user's ratings for unrated media content items. Examples of algorithms used by such recommender systems include matrix factorization and restricted Boltzmann machines. In the system and method of the present invention, different health statuses can be considered to be analogous to the media content items, and the features extracted from the sensor and/or medical data are analogous to the ratings.
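A minimal sketch of matrix factorization for collaborative filtering is given below, assuming a small animals-by-conditions matrix in which observed condition scores play the role of ratings; the data, dimensions and hyperparameters are illustrative assumptions, not values from the patent:

```python
import numpy as np

def matrix_factorization(R, k=2, steps=2000, lr=0.01, reg=0.02):
    """Factor an animals-by-conditions matrix R into latent factors
    P (animals) and Q (conditions) by stochastic gradient descent,
    learning only from observed entries; -1 marks missing entries."""
    rng = np.random.default_rng(0)
    n, m = R.shape
    P = rng.normal(scale=0.1, size=(n, k))
    Q = rng.normal(scale=0.1, size=(m, k))
    observed = [(i, j) for i in range(n) for j in range(m) if R[i, j] >= 0]
    for _ in range(steps):
        for i, j in observed:
            err = R[i, j] - P[i] @ Q[j]
            P[i] += lr * (err * Q[j] - reg * P[i])
            Q[j] += lr * (err * P[i] - reg * Q[j])
    return P, Q

# Rows: animals; columns: condition scores; -1 marks an
# unobserved animal/condition pair to be filled in.
R = np.array([[1.0, 0.0, -1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
P, Q = matrix_factorization(R)
predicted = P @ Q.T  # completed matrix, including the -1 entry
```

Because animal 0's observed scores resemble animal 1's, the completed entry `predicted[0, 2]` tends toward animal 1's score for that condition, which is exactly the collaborative-filtering behaviour described above.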
Matrix factorization is described in more detail in Handbook of Statistics, Volume 43: Principles and Methods for Data Science (2020), Chapter 3, Machine learning algorithms, applications, and practices in data science, by Kalidas Yeturu. Restricted Boltzmann machines are described in more detail in "An Introduction to Restricted Boltzmann Machines" by Asja Fischer and Christian Igel, published in Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications (pp. 14-36). Restricted Boltzmann machines may be trained as described in "A Practical Guide to Training Restricted Boltzmann Machines" by Geoffrey Hinton, published as Technical Report UTML TR 2010-003, University of Toronto (Vol. 9).
Some types of health condition are specific to, or express themselves in a unique way in, a particular species of animal, or even a particular species sub-type or strain, such as a specific breed. For example, German Shepherd Dogs may suffer from physical issues in the hind legs or lower back caused by an excessively roached back. Other health conditions may be common to, or commonly expressed among, species sub-types; for example, brachycephalic dogs are susceptible to brachycephalic obstructive airway syndrome and certain eye conditions. Other health conditions may be specific to a species or may be expressed in different ways in different species. Others may be common to a broader category of animals, for example based on limb type, i.e. plantigrade, digitigrade or ungulate, where mobility issues may present as similar gaits for animals of the same limb type. Consequently, the recommender system may be trained based on different sub-sets of animals and associated diagnosed conditions depending on the condition and its relevance across species groups, specific species, sub-types, and strains. For example, cruciate ligament injuries in dogs and horses will result in different changes in the respective animals' gaits. Horse cruciate ligament injury and dog cruciate ligament injury may therefore be considered to be separate conditions when training the recommender system.
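Partitioning the training records into per-group subsets, as described above, might be sketched as follows; the record fields, species names and diagnoses are hypothetical examples:

```python
from collections import defaultdict

# Hypothetical training records (in practice each would also carry
# the extracted feature vector for that animal).
records = [
    {"species": "dog", "limb_type": "digitigrade",
     "diagnosis": "cruciate ligament injury"},
    {"species": "horse", "limb_type": "ungulate",
     "diagnosis": "cruciate ligament injury"},
    {"species": "cat", "limb_type": "digitigrade",
     "diagnosis": "arthritis"},
]

def partition(records, key):
    """Split training records into per-group subsets so that a
    recommender can be trained on, e.g., only animals of one
    species or one limb type."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r)
    return dict(groups)

by_limb = partition(records, "limb_type")
# by_limb["digitigrade"] holds the dog and cat records;
# by_limb["ungulate"] holds only the horse record.
```

A separate recommender (or separate condition labels, as with the horse versus dog cruciate ligament example) could then be trained on each subset.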
At step 204, any number of predictions and/or diagnoses may be output. For example, if there are currently no likely predictions or diagnoses provided by the recommender system, e.g. because the animal is fundamentally healthy, then no predictions or diagnoses of specific conditions may be determined, and a diagnosis of "healthy" may be output instead. Similarly, where the recommender system determines that the animal is not currently suffering from a condition, i.e. there is no diagnosis, it may still output a prediction if the animal is determined to be at risk of a condition, or showing signs that it may develop a condition without intervention. Where the recommender system determines that the animal is suffering from a particular condition, it may not determine a risk or likelihood of any further conditions, in which case only a diagnosis may be determined, not a prediction. Multiple predictions may be provided without limitation, where they are determined. When multiple likely diagnoses are determined, only a single most-likely diagnosis per category, e.g. one for mobility, one for respiratory issues etc., may be output by the recommender system. Alternatively, all determined diagnoses, even those that are overlapping and unlikely to be simultaneously present, may be output for information along with their relative likelihoods.
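Selecting a single most-likely diagnosis per category, as described above, might be sketched as follows; the categories, condition names and likelihood values are illustrative assumptions:

```python
def most_likely_per_category(candidates):
    """Given (category, condition, likelihood) candidates, keep only
    the single most likely diagnosis in each category."""
    best = {}
    for category, condition, likelihood in candidates:
        if category not in best or likelihood > best[category][1]:
            best[category] = (condition, likelihood)
    return best

# Hypothetical recommender output with relative likelihoods.
candidates = [
    ("mobility", "hip dysplasia", 0.7),
    ("mobility", "cruciate ligament injury", 0.2),
    ("respiratory", "brachycephalic obstructive airway syndrome", 0.6),
]
best = most_likely_per_category(candidates)
# best -> one (condition, likelihood) pair per category
```

The alternative behaviour described above, outputting every candidate with its relative likelihood, corresponds to simply returning the full `candidates` list.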
Figure 3 is a flow chart depicting a method of determining one or more predictions and/or diagnoses based on one or more visual media items depicting movement of the subject animal. The method depicted in Figure 3 is a specific example of the method depicted in Figure 2. Like the method 200, method 300 may run on the computer 100 shown in Figure 1.
At step 301, sensor data describing one or more aspects of a subject animal's health status is received. At least one of the aspects of the subject animal's health status described by the sensor data is the mobility of the subject animal. The sensor data includes at least one visual media item, e.g. a video and/or a sequence of still images, depicting movement of the subject animal. The visual media items may be or may have been captured with a smartphone, tablet, webcam or camera. The sensor data may also include other data types that can be used to monitor movement of the subject animal, for example LIDAR data, which may be captured by a tablet computer with a LIDAR sensor, and accelerometer data from accelerometers attached to the animal.
Furthermore, it will be appreciated that sensor data may be received from any combination of sensors 120a-120c, mobile device 130, and second computer 140. The sensor data may also include one or more of audio data, heart rate data, for example pulse rate, heart rate variability, inter-beat intervals, and RR intervals, temperature data, respiratory rate data, location data, light sensor data, sleep data, food and/or water consumption data, smart cat-flap/dog-flap data and smart bed/basket data. Such sensor data may be used to supplement the visual media item(s) and any other movement-related sensor data in predicting and/or diagnosing mobility related conditions. The sensor data may also be used to diagnose other conditions not directly related to mobility.
At optional step 302, medical data of the subject animal is received. As explained with respect to step 202 of Figure 2, the medical data may include one or more of electronic medical records, free text input, and medical observations obtained from a decision tree.
The medical data includes subjective data describing the health status of the subject animal as recorded by a veterinarian, doctor or other professional, or by the owner of a pet, working animal or livestock animal. The medical data may additionally or alternatively include objective data, such as sensor or other measurement data, that has already been stored in a medical record.
At step 303, at least one annotation of the visual media item is received. The annotations are information provided by a human that supplements the visual media item. Examples of annotations that may be received include a range corresponding to a time range in the video or images in the sequence of still images in which features corresponding to movement of the subject animal are to be extracted; musculoskeletal markers corresponding to joints of the subject animal and, optionally, lines connecting the musculoskeletal markers; a highlighted area in the video or images in the sequence of still images from which features corresponding to movement of the subject animal are to be extracted; text-based tags associated with positions and/or times within the video; or position within the still images in the sequence of still images. The annotations may be received from the mobile device 130 or computer 140, as shown in Figure 1. The annotations may be received simultaneously with the one or more visual media items, e.g. when the visual media item is captured by a camera of mobile device 130, the visual media item may be annotated before being transmitted with the annotations to the computer 140. Alternatively, the visual media item(s) may be received first and subsequently transmitted from the computer 100 to mobile device 130 or computer 140 where annotations are created and subsequently transmitted to the computer 100.
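The annotation types listed above could be held together in a simple record. The following is a purely illustrative in-memory sketch, not part of the described method; all field names are assumptions chosen for readability.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Annotation:
    """Illustrative container for the annotation types described above.
    Field names and shapes are assumptions, not part of the claimed method."""
    time_range: Optional[tuple] = None                 # (start_s, end_s) within a video
    joint_markers: list = field(default_factory=list)  # (joint_name, x, y) tuples
    marker_links: list = field(default_factory=list)   # pairs of joint names to connect
    highlighted_area: Optional[tuple] = None           # (x, y, width, height) in pixels
    tags: list = field(default_factory=list)           # (text, time_s or position) pairs
```

Such a record could be populated on the mobile device 130 or computer 140 and transmitted alongside the visual media item.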
At step 304, the received sensor data and annotations are processed to extract features corresponding to the health status of the subject animal. If medical data was also received, the medical data is also processed to extract features corresponding to the health status of the subject animal. Feature extraction transforms the input sensor data, annotations and/or medical data into a set of features upon which subsequent analysis can be performed. Feature extraction may be performed using any suitable feature extraction algorithm known in the art for a given data type.
Feature extraction may include first extracting features from the annotations and subsequently limiting the extraction of features from the sensor data based on the annotations. For example, when the annotations indicate a time range within a video file where mobility issues are visible, the extraction of features from the video file may be limited to the indicated time range.
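The range-limiting idea above can be sketched as follows. This is a minimal illustration, not the claimed feature extractor: frames are represented by single numbers, and the "motion" feature is a crude frame-to-frame difference standing in for any suitable algorithm known in the art.

```python
def extract_motion_features(frames, annotation_range=None):
    """frames: list of (timestamp_s, value) pairs, where value stands in for a
    decoded video frame (a single number here, purely for illustration).
    annotation_range: optional (start_s, end_s) taken from a human annotation."""
    if annotation_range is not None:
        start, end = annotation_range
        # limit extraction to the annotated time range, as described above
        frames = [(t, v) for t, v in frames if start <= t <= end]
    # crude stand-in "motion" feature: absolute change between consecutive frames
    return [abs(b - a) for (_, a), (_, b) in zip(frames, frames[1:])]
```

With an annotated range of (1.0, 3.0), frames outside that window contribute no features, which reduces both compute and irrelevant signal.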
At step 305, at least one prediction or diagnosis is determined based on the one or more extracted features from step 304. As explained above, a prediction is an indication that the subject animal is at risk of developing a certain condition or is likely to develop the condition without intervention. A diagnosis is an indication that the subject animal is presently suffering from a certain condition.
As in the method depicted in Figure 2, the determining is performed using collaborative filtering in a recommender system that has been trained based on features and associated diagnoses for a plurality of different animals. Recommender systems and collaborative filtering are known in the field of media content recommendation, where they are trained on ratings of media content items provided by users. A given user's own ratings can then be used to predict that user's ratings for unrated media content items. Examples of algorithms used by such recommender systems include matrix factorization and restricted Boltzmann machines. In the system and method of the present invention, different health statuses can be considered to be analogous to the media content items, and the features extracted from the sensor and/or medical data are analogous to the ratings.
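To make the analogy concrete, the following sketch shows neighborhood-style collaborative filtering in miniature: the subject animal's feature vector is compared (by cosine similarity) with feature vectors of previously diagnosed animals, and condition scores are similarity-weighted averages of the neighbors' diagnoses. This is one simple collaborative-filtering variant chosen for illustration; the document itself names matrix factorization and restricted Boltzmann machines as example algorithms, and all names here are assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def predict_condition_scores(subject_features, cohort):
    """cohort: list of (feature_vector, {condition: 0 or 1}) pairs for
    previously diagnosed animals. Returns {condition: score in [0, 1]},
    a similarity-weighted vote over the cohort."""
    scores, weights = {}, {}
    for features, diagnoses in cohort:
        sim = cosine(subject_features, features)
        if sim <= 0:
            continue  # ignore dissimilar or orthogonal animals
        for condition, present in diagnoses.items():
            scores[condition] = scores.get(condition, 0.0) + sim * present
            weights[condition] = weights.get(condition, 0.0) + sim
    return {c: scores[c] / weights[c] for c in scores if weights[c]}
```

An animal whose extracted features closely match those of animals diagnosed with a given condition receives a high score for that condition, which could then be thresholded into a prediction or diagnosis.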
As explained above, some types of health conditions are specific to or express themselves in a unique way based on a category of animals, e.g. limb type, species of animal, or even the same species sub-type or strain, such as the specific breed.
Consequently, the recommender system may be trained based on different sub-sets of animals and associated diagnosed conditions depending on the condition and its relevance across species groups, specific species, sub-types, and strains.
At step 305, while the system and method are configured for receiving and processing visual media data to identify mobility issues, any number of predictions and/or diagnoses may be output, including non-mobility related predictions and diagnoses if identifiable from the data. For example, if the recommender system currently provides no likely predictions or diagnoses, e.g. because the animal is fundamentally healthy, then no predictions or diagnoses of specific conditions may be determined and a diagnosis of "healthy" may be output instead. Similarly, where the recommender system determines that the animal is not currently suffering from a condition, i.e. there is no diagnosis, it may still output a prediction if the animal is determined to be at risk of a condition, or showing signs that it may develop a condition without intervention. Where the recommender system determines that the animal is suffering from a particular condition, it may not determine a risk or likelihood of any further conditions, in which case only a diagnosis may be determined, not a prediction. Multiple predictions may be provided without limitation, where they are determined. When multiple likely diagnoses are determined, only a single most-likely diagnosis per category, e.g. one for mobility, one for respiratory issues etc., may be output by the recommender system. Alternatively, all determined diagnoses, even those that are overlapping and unlikely to be simultaneously present, may be output for information along with their relative likelihoods.
The method 300 may further comprise a step of outputting a video comprising the visual media item depicting movement of the subject animal and the annotations. For example, where the annotations include musculoskeletal markers corresponding to joints of the subject animal, the musculoskeletal markers may be overlaid on the video.
The method 300 may also comprise processing the visual media item to detect any visible injuries on the subject animal, such as skin conditions or wounds.
In both the methods 200 and 300, the step of determining at least one prediction or diagnosis may also include using an animal-specific model in the recommender system based on historical features for the subject animal. The animal-specific model may relate to both previously recorded sensor data and the features extracted from that sensor data. Thus, in addition to using recent sensor data in the recommender system, historical data may also be taken into account.
Furthermore, the methods 200 and 300 may also include additional steps of receiving a veterinarian confirmed diagnosis and training the recommender system based on the received veterinarian confirmed diagnosis. In this way, the recommender system becomes more accurate over time. Depending on the recommender system algorithm, training may be performed in different ways. For example, where the recommender system is based on matrix factorization, when a diagnosis is confirmed by a veterinarian, it may be added to the body of data on which the recommender system is based, i.e. the features and associated diagnoses for a plurality of different animals. Where the recommender system is based on a restricted Boltzmann machine, the features and veterinarian confirmed diagnosis may be added to the training data.
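The "add the confirmed case to the body of data" step above amounts to folding a verified (features, diagnosis) pair into the corpus before the next training pass. A minimal, purely illustrative sketch (the function name and corpus shape are assumptions):

```python
def retrain_with_confirmation(corpus, features, confirmed_diagnosis):
    """Return a new training corpus with a veterinarian-confirmed
    (features, diagnosis) pair folded in. The recommender would then be
    refitted on the enlarged corpus (e.g. the factorized matrix rebuilt,
    or the restricted Boltzmann machine retrained)."""
    return list(corpus) + [(tuple(features), confirmed_diagnosis)]
```

Returning a new list rather than mutating the argument keeps the previous training snapshot intact, which is convenient if the refit fails and the system must fall back to the prior model.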
Methods 200 and 300 may also include additional steps for improving the accuracy of generated predictions. These steps comprise receiving further sensor data, extracting features from the further sensor data, and determining at least one diagnosis based on the one or more extracted features. If the diagnosis confirms an earlier-determined prediction for the subject animal, the recommender system may be trained based on the correlation between the at least one prediction and the at least one diagnosis.
In both methods 200 and 300, the method may also include outputting the determined diagnosis or prediction. Outputting the determined diagnosis or prediction may include transmitting the diagnosis or prediction to the mobile device 130 or computer 140 to be displayed on a screen.
Figure 4 is a flow chart depicting a further method 400 of determining a prediction or diagnosis according to an embodiment of the present invention. The steps 401 to 404 correspond to steps 201 to 204 of the method 200 described above. The additional step 405 depicted in Figure 4 may also follow the method 300 depicted in Figure 3 and described above.
The additional step 405 comprises determining a treatment plan. The treatment plan is determined using a treatment plan recommender system, based on the diagnosis and extracted features. A treatment plan includes one or more treatment steps to be taken to treat, alleviate or manage a health condition. A treatment plan may include optional or alternative steps as well as indications of expected effects and side effects of the steps. A treatment plan may also include a sequence of steps, where subsequent steps are to be taken if previous steps do not effectively treat, alleviate or manage the condition. In this respect, after a treatment plan has been determined by the method 400, in subsequent iterations the treatment plan recommender system may determine and output subsequent steps in the same treatment plan, rather than proposing a completely new treatment plan.
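The sequence-of-steps behaviour described above can be sketched as a small state machine: the plan stays on the current step while it is effective and advances only when a step is reported ineffective. This is an illustrative data structure only; the class, field, and step names are assumptions, not part of the described method.

```python
from dataclasses import dataclass

@dataclass
class TreatmentPlan:
    """Illustrative ordered plan: later steps are tried only when earlier
    steps do not effectively treat, alleviate or manage the condition."""
    condition: str
    steps: list   # ordered treatment steps
    current: int = 0

    def next_step(self, previous_step_effective=None):
        """Return the step to take next. Advance through the sequence only
        when the previous step is reported ineffective; otherwise stay."""
        if previous_step_effective is False and self.current + 1 < len(self.steps):
            self.current += 1
        return self.steps[self.current]
```

On each follow-up, feedback on the previous step drives `next_step`, so the system continues the same plan rather than proposing a completely new one.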
The treatment plan recommender system may also be based on a matrix factorization algorithm or a restricted Boltzmann machine, as described above for the recommender system used to generate predictions and diagnoses. In the context of the treatment plan recommender system, different treatment plans can be considered to be analogous to the media content items and the diagnoses generated at step 204, 305 or 404 and features extracted from the sensor data, annotations and/or medical data are analogous to the ratings. The recommender system may have been trained using confirmed diagnoses and positive changes in the health status of the animal following a particular treatment plan.
The method may include a further step of receiving feedback on the condition of the subject animal following the treatment plan and updating the treatment plan recommender system based on the identified diagnosis, extracted features and received feedback. The feedback may include veterinarian or physician observations, pet owner observations, electronic medical record updates, and sensor data. Where the feedback indicates that the condition of the subject animal has improved following implementation of the treatment plan, the successful treatment may be added to the training data for the recommender system.
The method 400 may also include a step of outputting the identified treatment plan, for example by transmitting the identified treatment plan over a network to a remote computing device, such as mobile device 130 or computer 140, and displaying the identified treatment plan on a screen.

Claims (31)

  1. A method comprising: receiving sensor data describing one or more aspects of a subject animal's health status; extracting features from the sensor data, the extracted features corresponding to the health status of the subject animal; and determining at least one prediction or diagnosis based on the one or more extracted features, wherein the determining is performed using collaborative filtering in a recommender system based on features and associated diagnoses for a plurality of different animals.
  2. The method of claim 1, wherein the health status includes a physical health status.
  3. The method of claim 2, wherein the plurality of different animals comprises one or more of animals of the same species, animals of the same species sub-type, animals of the same strain, or animals of the same limb type.
  4. The method of claim 2 or 3, wherein one of the aspects of the physical health status of the subject animal is mobility of the subject animal, and wherein the sensor data comprises at least one visual media item, the visual media item depicting movement of the subject animal and comprising one or more of a video and a sequence of still images.
  5. The method of claim 4, wherein the method further comprises receiving at least one annotation of the visual media item, the annotation comprising one or more of: a range corresponding to a time range in the video or images in the sequence of still images in which features corresponding to movement of the subject animal are to be extracted; musculoskeletal markers corresponding to joints of the subject animal and, optionally, lines connecting the musculoskeletal markers; a highlighted area in the video or images in the sequence of still images from which features corresponding to movement of the subject animal are to be extracted; text-based tags associated with positions and/or times within the video, or position within the still images in the sequence of still images.
  6. The method of claim 5, wherein extracting features from the sensor data further comprises one or more of: extracting features from the annotations; and limiting the extraction of features from the sensor data based on the annotations.
  7. The method of claim 6, wherein the method further comprises outputting a video comprising the visual media item depicting movement of the subject animal and the annotations.
  8. The method of any of claims 4 to 7, further comprising capturing the visual media item with a smartphone, tablet, webcam or camera.
  9. The method of any of claims 4 to 8, wherein the method further comprises processing the visual media item to detect any visible injuries on the subject animal.
  10. The method of any preceding claim, wherein the health status includes a mental, psychological or behavioural status.
  11. The method of any preceding claim, wherein the sensor data further comprises one or more of audio data; LIDAR data; accelerometer data; heart rate data; heart rate variability data; inter-beat interval data; RR interval data; temperature data; respiratory rate data; location data; light sensor data; sleep data; food and/or water consumption data; smart cat-flap/dog-flap data; and smart bed/basket data.
  12. The method of any preceding claim, wherein the method further comprises receiving medical data for the subject animal and wherein determining at least one diagnosis is further based on the received medical data.
  13. The method of claim 12, wherein the medical data comprises one or more of: electronic medical records; free text input; and medical observations obtained from a decision tree.
  14. The method of any preceding claim, wherein determining at least one prediction or diagnosis further comprises using an animal-specific model in the recommender system based on historical features for the subject animal.
  15. The method of any preceding claim, wherein the method comprises determining at least one diagnosis and wherein the method further comprises: receiving a veterinarian confirmed diagnosis; and training the recommender system based on the received veterinarian confirmed diagnosis.
  16. The method of any preceding claim, wherein the method comprises determining at least one prediction and wherein the method further comprises: receiving further sensor data describing one or more aspects of a subject animal's health status; extracting features from the sensor data, the extracted features corresponding to the health status of the subject animal; determining at least one diagnosis based on the one or more extracted features, wherein the determining is performed using collaborative filtering in a recommender system; and training the recommender system based on the correlation between the at least one prediction and at least one diagnosis.
  17. The method of any preceding claim, wherein the method further comprises identifying a treatment plan using a treatment plan recommender system based on the diagnosis and extracted features.
  18. The method of claim 17, wherein the method further comprises receiving feedback on the condition of the subject animal following the treatment plan.
  19. The method of claim 18, wherein feedback comprises one or more of: veterinarian or physician observations; pet owner observations; electronic medical record updates; and sensor data.
  20. The method of claim 18 or 19, wherein the method further comprises updating the treatment plan recommender system based on the identified diagnosis, extracted features and received feedback.
  21. The method of any of claims 17 to 20, wherein the treatment plan recommender system is a matrix factorization recommender system or a restricted Boltzmann machine.
  22. The method of any of claims 17 to 21, wherein the method further comprises outputting the identified treatment plan.
  23. The method of any preceding claim, wherein the method further comprises outputting the determined diagnosis or prediction.
  24. The method of claim 22 or 23, wherein outputting the identified treatment plan and/or the determined diagnosis comprises transmitting the identified treatment plan and/or the determined diagnosis over a network to a remote computing device and displaying the identified treatment plan and/or the determined diagnosis on a screen associated with the remote computing device.
  25. The method of any preceding claim, wherein sensor data, annotations, and/or medical data are received from a remote computing device via a network.
  26. The method of any preceding claim, wherein the recommender system is a matrix factorization recommender system or a restricted Boltzmann machine.
  27. The method of any preceding claim, wherein the subject animal is a mammal.
  28. The method of claim 27, wherein the animal is a human being, dog or cat.
  29. A data processing system comprising at least one processor configured to perform the method of any preceding claim.
  30. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of claims 1 to 28.
  31. A computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method of any of claims 1 to 28.
GB2206928.0A 2022-05-12 2022-05-12 Health tracking method and system Pending GB2618780A (en)

Publications (2)

Publication Number Publication Date
GB202206928D0 GB202206928D0 (en) 2022-06-29
GB2618780A true GB2618780A (en) 2023-11-22



