EP2185071A1 - System and method for displaying anonymously annotated physical exercise data - Google Patents

System and method for displaying anonymously annotated physical exercise data

Info

Publication number
EP2185071A1
Authority
EP
European Patent Office
Prior art keywords
data
person
physical
physical exercise
processing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08789619A
Other languages
German (de)
French (fr)
Inventor
Gerd Lanfermann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Philips Intellectual Property and Standards GmbH
Koninklijke Philips NV
Original Assignee
Philips Intellectual Property and Standards GmbH
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Philips Intellectual Property and Standards GmbH and Koninklijke Philips Electronics NV
Priority to EP08789619A
Publication of EP2185071A1
Legal status: Withdrawn

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1127Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/22Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/221Ergometry, e.g. by using bicycle type apparatus
    • A61B5/222Ergometry, e.g. by using bicycle type apparatus combined with detection or measurement of physiological parameters, e.g. heart rate
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/36Training appliances or apparatus for special sports for golf
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/003Repetitive work cycles; Sequence of movements
    • G09B19/0038Sports
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • A61B5/744Displaying an avatar, e.g. an animated cartoon character
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012Comparing movements or motion sequences with a registered reference
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A63B2024/0096Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load using performance related parameters for controlling electronic or video games or avatars
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638Displaying moving images of recorded environment, e.g. virtual environment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647Visualisation of executed movements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/065Visualisation of specific exercise parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/05Image processing for measuring physical parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/40Acceleration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/803Motion sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/806Video cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/83Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/833Sensors arranged on the exercise apparatus or sports implement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/15Miscellaneous features of sport apparatus, devices or equipment with identification means that can be read by electronic means
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/20Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50Wireless data transmission, e.g. by radio transmitters or telemetry
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/04Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/20Measuring physiological parameters of the user blood composition characteristics
    • A63B2230/202Measuring physiological parameters of the user blood composition characteristics glucose
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/20Measuring physiological parameters of the user blood composition characteristics
    • A63B2230/207P-O2, i.e. partial O2 value
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Business, Economics & Management (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Educational Administration (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Cardiology (AREA)
  • Educational Technology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises. Based on physical exercise data, the physical exercise data is annotated at a physically separate annotation unit. At the location of the person, visual recordings of the person undertaking exercises together with synchronized annotation information are displayed to the person. A system for performing the method comprises a physical data processing unit (1), a display device (2), at least one posture recording device (3, 3'), a visual recording device (4), a data storage unit (5) and a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).

Description

System and method for displaying anonymously annotated physical exercise data
BACKGROUND OF THE INVENTION
The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Home rehabilitation exercises for persons suffering from a medical condition like a stroke, or home training exercises for persons wishing to improve body motions like a golf swing, can be recorded via sensors. The exercises can also be evaluated by a professional such as a physiotherapist or a golf instructor in order to give the person direct feedback. If the professional performing the review is not present during the exercise, video camera recordings could be sent to him. These recordings could be reviewed intuitively by the professional, and the commented recordings could be understood intuitively by the person undertaking the exercise. However, these recordings, especially when sent away to a remote professional, could breach the privacy of the person. Furthermore, completely automatic processing of such recorded images to provide meaningful feedback is a demanding task. Transmitting only the data from the sensors, by contrast, would not violate the privacy of the person.
In this respect, US 6,817,979 B2 relates to a system and method which provide for interacting with a virtual physiological model of a user with the use of a mobile communication device. Physiological data associated with the user is acquired from the user. The physiological data is transmitted to the mobile communication device, preferably with the use of a wireless communication protocol. The methodology further involves using the mobile communication device to communicate the physiological data to a network server. The physiological data is integrated into the virtual physiological model of the user. The user can access data and depictions of the user developed from the physiological data. By way of example, a user can create an avatar representative of the current physical state of the user. The user can adjust the avatar to change its appearance to a more desired appearance. For example, the anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper arm and thigh dimensions. Given the differences between the desired avatar features and the present avatar features, various training, diet and related fitness recommendations can be developed to establish a training regimen most suited to help the user achieve the desired fitness goals. Physiological data is subsequently acquired and applied to the user's avatar, and compared to the desired avatar's data to determine whether the training regimen is effective in achieving the desired fitness goals. However, interpreting sensor signals in the front end generally leads to difficulties on the part of the user: it is hard to relate to an abstract rendering of an artificial screen character.
Despite these efforts, there still exists a need in the art for a system and a method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
SUMMARY OF THE INVENTION
To achieve this and other objects the present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of: a) gathering physical exercise data from a person undertaking exercises; b) synchronously gathering visual recordings of the person undertaking exercises; c) transmitting the physical exercise data to a physically separate annotation unit; d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit; e) transmitting the annotation information to a display and processing unit for review of the person undertaking exercises; f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
DETAILED DESCRIPTION OF THE INVENTION
Before the invention is described in detail, it is to be understood that this invention is not limited to the particular component parts of the devices described or process steps of the methods described, as such devices and methods may vary. It is also to be understood that the terminology used herein is for purposes of describing particular embodiments only, and is not intended to be limiting. It must be noted that, as used in the specification and the appended claims, the singular forms "a," "an" and "the" include singular and/or plural referents unless the context clearly dictates otherwise. In the context of the present invention, the term 'anonymously annotated data' denotes data where a third person performing the annotation has no knowledge about the identity of the person whose data he is annotating. In particular, the data does not allow for a recognition of the person. One way of achieving the anonymization is by assigning identification numbers to the data. 'Physical exercise data' is data relating to movements or other exercises of a person.
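A minimal Python sketch of such an identification-number scheme, purely for illustration (the class and method names below are hypothetical and not part of the patent), could look as follows:

    import uuid

    class Pseudonymizer:
        """Maps a person's identity to an opaque identification number
        before any data leaves the local unit; the table never travels
        to the annotation unit, so the reviewer cannot recognize the person."""

        def __init__(self):
            self._ids = {}

        def anonymous_id(self, person_name):
            # Assign a random identifier the first time a person is seen.
            if person_name not in self._ids:
                self._ids[person_name] = uuid.uuid4().hex
            return self._ids[person_name]

        def resolve(self, anonymous_id):
            # Only the local unit can map the identifier back to the person.
            for name, aid in self._ids.items():
                if aid == anonymous_id:
                    return name
            raise KeyError(anonymous_id)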
The first two steps of the method describe how two different sets of information about the exercise of the person are gathered. Firstly, physical exercise data is gathered, for example by continuously monitoring sensor signals from the person. At the same time, visual recordings are gathered, for example by using a digital video camera. By synchronously gathering this data it is ensured that later on, a certain portion of the video stream can be attributed to a certain portion of the sensor signal stream and vice versa.
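For illustration only, the synchronization might be implemented by timestamping both streams against a common clock and looking up, for any video instant, the nearest sensor sample; this is a sketch under that assumption, with invented names:

    from bisect import bisect_left

    def nearest_sample(timestamps, values, t):
        """Return the sensor value recorded closest to time t (seconds).
        timestamps must be sorted and share the clock of the video stream."""
        i = bisect_left(timestamps, t)
        if i == 0:
            return values[0]
        if i == len(timestamps):
            return values[-1]
        before, after = timestamps[i - 1], timestamps[i]
        return values[i] if after - t < t - before else values[i - 1]

    # e.g. match the video frame shown at t = 260.0 s to the acceleration
    # sample that was recorded closest to that instant.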
As the visual recordings and the physical exercise data are separate entities, the physical exercise data can then be transmitted to a physically separate annotation unit. The physical separation of the annotation unit provides for an anonymization of the data. At the annotation unit the physical exercise data can be processed into representations of the exercise for review by a third person. The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from motion templates. Furthermore, the third person can include comments and suggestions to provide helpful feedback to the person performing the exercise. Afterwards, the annotation information is transmitted to a display and processing unit at the site of the person performing the exercise. Here, the annotation information is joined with the visual recordings. The recordings of the person undertaking exercises are then displayed to the person together with the synchronized annotation information. The synchronization provides for displaying the annotation at the correct time so the person can directly understand what has caught the attention of the reviewer or the automatic reviewing system.
In summary, with the method according to the present invention an exercise of a person can be reviewed anonymously and feedback can be given to the person. The anonymization allows for the sharing of professional resources, making the reviewing process more efficient. At the same time, when the person receives the feedback it is very clearly shown to him, via the visual recordings, which part of the exercise has prompted the feedback.
In one embodiment of the invention, at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data. For the purposes of this invention, the term 'avatar' shall denote a computer-generated abstract rendering which represents the posture or motions of a person. In simple cases, the avatar may be a stick figure. In more sophisticated cases, the avatar may represent additional information like the pulse rate, the amount of sweating, muscle fatigue and the like. An advantage of using an avatar representation is that the avatar can be rotated on the screen of the annotation unit while representing the exercise. This enables the reviewer to choose the best viewing angle for assessing the exercise.
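As a purely illustrative sketch of the simplest case named above, a stick-figure avatar can be reduced to a set of joint positions and the line segments (bones) connecting them; the joint names and coordinates here are invented:

    # Joint positions estimated from the physical exercise data (2-D, metres).
    joints = {
        "head": (0.0, 1.7), "shoulders": (0.0, 1.4), "hip": (0.0, 0.9),
        "l_hand": (-0.5, 1.4), "r_hand": (0.5, 1.4),
        "l_foot": (-0.2, 0.0), "r_foot": (0.2, 0.0),
    }
    bones = [("head", "shoulders"), ("shoulders", "hip"),
             ("shoulders", "l_hand"), ("shoulders", "r_hand"),
             ("hip", "l_foot"), ("hip", "r_foot")]

    def stick_figure_segments(joints, bones):
        """Return the line segments a renderer would draw for one avatar frame."""
        return [(joints[a], joints[b]) for a, b in bones]

    print(stick_figure_segments(joints, bones))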
In a further embodiment of the invention step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person. In summary, the person will then see the visual recording of his exercise, the annotations and the avatar. This is advantageous as the avatar may depict more clearly the motions of the person if they are obscured in the visual recording by baggy clothing or if they have not been recorded correctly on camera. Again, the avatar may be rotated to achieve the best viewing perspective. Another option is to provide multiple viewing angles with one or more avatars.
In a further embodiment of the invention transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet. This allows a remotely located person to perform the review and the annotation. Suitable protocols include those of the TCP/IP protocol suite.
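As a sketch of one possible transport (the length-prefixed JSON framing is an assumption, not specified by the patent), a block of exercise data could be pushed over a TCP connection like this:

    import json
    import socket

    def send_exercise_data(host, port, payload):
        """Send one JSON-encoded block of physical exercise data over TCP/IP."""
        data = json.dumps(payload).encode("utf-8")
        with socket.create_connection((host, port)) as sock:
            sock.sendall(len(data).to_bytes(4, "big"))  # simple length prefix
            sock.sendall(data)

    # The annotation information can travel back to the display and
    # processing unit over the same kind of connection.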
In a further embodiment of the invention the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate. Some of these data types, such as motion and posture data, relate to the exercise itself. Others relate to the overall condition or physical fitness of the person. Knowledge about this can give valuable insight into the effectiveness of rehabilitation or training measures. For example, it may be inferred whether the person is in the supercompensation phase after a training stimulus.
In a further embodiment of the invention the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings. Visual information can be in the form of markings, such as arrows pointing out a specific issue, that are inserted into the images of the avatar. Additionally, small video clips can be inserted to show the correct execution of the exercise. Other visual information can be written comments or graphs showing statistics of data like electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or the respiratory rate. This enables the reviewer to assess the situation of the person performing the exercise at a glance. Audio signals can be simple beeps emitted when a movement is not performed correctly. Recorded speech comments can be added by the reviewer when this is the simplest way of explaining an exercise.
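Purely for illustration, the different kinds of annotation information could be carried in a small tagged record such as the following (the field names are invented, not taken from the patent):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Annotation:
        """One piece of annotation information sent back to the person."""
        time_s: float                  # offset into the exercise, e.g. 260.0 for 4:20
        kind: str                      # "marker", "text", "graph", "beep" or "speech"
        text: Optional[str] = None     # written comment, if any
        media: Optional[bytes] = None  # encoded speech clip or demonstration video

    feedback = [
        Annotation(time_s=260.0, kind="marker", text="Keep the right arm straight"),
        Annotation(time_s=260.0, kind="speech", media=b"<encoded audio>"),
    ]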
The present invention is further directed towards a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising: a physical data processing unit; a display device in communication with the physical data processing unit; at least one posture recording device assigned to the person undertaking exercises and in communication with the physical data processing unit; a visual recording device in communication with the physical data processing unit; a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device, the data storage unit being in communication with the physical data processing unit; a physically separate annotation unit in connection with the physical data processing unit, the connection being via an interconnected computer network.
In one embodiment of the invention the at least one posture recording device comprises a motion sensor on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors. The motion sensors can be worn on the body of the person at selected locations like the upper arm, lower arm, upper leg, lower leg or torso. They can be commercially available highly integrated solid state sensors. The transmission of the sensor signals to the posture assessment unit can be undertaken via wire, wirelessly or in a body area network using the electrical conductivity of the human skin. After calculation of the person's posture, the result can be presented in the form of an avatar.
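To illustrate how such sensors can contribute to a posture estimate, the tilt of a limb segment can be derived from the gravity component of a static 3-axis acceleration reading; this is a simplified sketch that ignores motion and sensor fusion:

    import math

    def tilt_from_gravity(ax, ay, az):
        """Pitch and roll (degrees) of a sensor from a static acceleration
        reading, i.e. from the measured direction of gravity. Only valid
        while the limb is not accelerating; a real system would also fuse
        gyroscope data."""
        pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    # A sensor lying flat reports roughly (0, 0, 9.81) m/s^2 -> pitch 0, roll 0.
    print(tilt_from_gravity(0.0, 0.0, 9.81))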
In a further embodiment of the invention the at least one posture recording device comprises an optical mark on the person undertaking exercises. The posture recording device then employs an optical tracking system for tracking the at least one optical mark.
Based on the signals of the optical tracking system a representation of the person's posture is then calculated. The optical marks can be borne on the body of the person at selected locations like the upper arm, lower arm, upper leg, lower leg or torso. The tracking of the marks can be effected with a single camera or a multitude of cameras. When a stereo camera is used, three-dimensional posture and movement data is generated. After image processing and calculation of the person's posture, the result can be presented in the form of an avatar.
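As an illustration of the stereo case, once the same optical mark has been located in both rectified camera images, its depth follows from the standard disparity relation Z = f·B/d; the numeric values below are placeholders, not values from the patent:

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Depth of an optical mark seen by a calibrated, rectified stereo camera."""
        if disparity_px <= 0:
            raise ValueError("mark must be visible in both images")
        return focal_px * baseline_m / disparity_px

    # A 700 px focal length, 0.10 m baseline and 35 px disparity give 2.0 m.
    print(depth_from_disparity(700.0, 0.10, 35.0))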
It is also possible to combine several posture monitoring principles. For example, a combination of motion sensors and optical tracking may provide complementary data to better calculate the posture of the person.
A further aspect of the present invention is the use of a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will become more readily understood with reference to the following drawings, wherein
Fig. 1 shows a system according to the present invention;
Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data;
Fig. 3 shows a flowchart of a method according to the present invention;
Fig. 4 shows modules for performing a method according to the present invention.
DETAILED DESCRIPTION OF THE DRAWINGS
Fig. 1 shows a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises. As posture recording devices, the person has motion sensors 3 situated on his thighs and his ankles. Furthermore, optical marks 3' are located on the wrist and the torso. The signals of the motion sensors 3, constituting physical exercise data, are transmitted wirelessly to the physical data processing unit 1, where the raw sensor signals are processed into motion and posture data. A video camera 4 records the motions of the person. Furthermore, the physical data processing unit 1 performs optical tracking operations on the video stream of the camera 4 to identify the position and the movement of the optical marks 3'. This information is also processed into motion and posture data and complements the data obtained from the motion sensors 3. The raw or processed sensor signals and the positional information from the optical marks 3' are stored in a data storage unit 5. The video stream of the person performing the exercise is also stored there. The data in the data storage unit 5 is stored together with information about the time of recording. This makes it possible to correlate or synchronize the information, for example to know which position as indicated by the posture recording devices 3, 3' corresponds to which frame of a video clip of the person performing the exercise.
Using an interconnected computer network such as the internet 7, the physical data processing unit 1 transmits the processed signals of the sensors 3 and the positional information from the optical marks 3' to a physically separate annotation unit 6. Temporal information is also transmitted. The annotation unit then calculates a visual representation, such as an avatar, from the received physical data. A physical therapist views the motion of the visual representation on his terminal 8 and comments on individual sequences, thus performing the annotation. The annotation, together with the time within the exercise at which it was made, is transmitted back to the physical data processing unit 1 at the location of the person undertaking exercises. Again, the transmission is achieved over an interconnected computer network such as the internet 7.
The physical data processing unit 1 then accesses the data storage unit 5 and retrieves the recorded data and video clips of the particular exercise that has been annotated. A movie sequence is generated for viewing by the person and displayed on display 2. In this case, the video stream of the person and an avatar calculated from the recorded data are shown simultaneously. At the appropriate time, the comments of the physical therapist are also displayed or voiced to the person.
Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data. A person has been performing an exercise. Physical data representing his motions has been recorded and used for the calculation of an avatar representation. The avatar's motion has been time-resolved and split into a stream of individual frames 20. Likewise, the person's movements have been recorded by a video camera. This video image sequence has also been time-resolved and split into a stream of individual frames 21. As the physical exercise data and the visual recordings have been gathered simultaneously, one common time line can be assigned to them. The time line in Fig. 2 beneath the frame streams arbitrarily begins at 4:16 minutes and ends at 4:21 minutes. In the exercise of Fig. 2, the person starts with both of his arms stretched and lowered. In the images, the left arm is kept stretched and raised along the coronal plane until the hand is above the person's head. The arm is kept in this position while the same movement is supposed to be performed with the right arm. At 4:20, the person is not able to keep his right arm outstretched in the horizontal position. The arm is bent at the elbow. This makes it much easier to lift the arm, so at this point no therapeutic benefit is gained. A physical therapist remotely reviewing the avatar frames 20 can then single out the frame at 4:20 minutes and add a visual or verbal comment. This comment, together with the information that it is to be shown at 4:20 minutes into the exercise, is transmitted to the person for later review. At the person's location the annotation can then be combined with the visual recordings 21 so that the person can relate more directly to the exercise and contemplate his errors in performing it.
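As a sketch of the bookkeeping behind this example (the frame rate and start time are assumptions, not taken from the patent), the annotation timestamp can be mapped to a video frame index as follows:

    def frame_index(annotation_time_s, video_start_s=0.0, fps=25.0):
        """Map an annotation timestamp to the video frame it belongs to."""
        return round((annotation_time_s - video_start_s) * fps)

    # The comment made at 4:20 min into the exercise (260 s) lands on
    # frame 6500 of a 25 fps recording that started at t = 0.
    print(frame_index(4 * 60 + 20))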
Fig. 3 shows a flowchart of a method according to the present invention. The first step 30 is to record the exercise a person is performing visually, using a camera, and via posture data, using sensors. The visual recordings are stored 31 and the posture recordings are transmitted to an annotation system 32. Using the annotation system, a person reviews the posture recordings and adds his comments and markers 33. These annotations are transmitted back to the patient system 34, wherein 'patient' denotes the person performing an exercise. On the patient side, the stored visual recordings are retrieved 35 and combined with the annotations 36 in order to give the person comprehensive feedback that still does not compromise his anonymity.
Fig. 4 shows modules for performing a method according to the present invention, complementing the depiction of a system in Fig. 1. A sensor receiver 40 receives signals from motion sensors or information from the tracking of optical marks. This sensor receiver 40 communicates its data to a movement transmission module 41. Synchronously with the sensor receiver 40, a camera 42 captures a video sequence of the person performing exercises. These video sequences are stored in a storage facility 43. The movement transmission module 41 transmits its data to a remotely located movement receiver 45; the remote location is symbolized by the barrier 44 separating the two sub-groups of modules.
The movement receiving module 45 passes the data on to a movement annotator 46, where the data is transformed into a processible form and annotated by a reviewer. The annotation, together with information on its temporal position within the exercise, is passed on to an annotation transmission module 47. This annotation transmission module 47 transmits the information to an annotation receiver 48 located in the sub-group of modules assigned to the person performing the exercise. The annotation information reaches a processing and overlay module 49, which accesses video sequences from the storage module 43 and combines them with the annotation so that the annotation appears at the appropriate time of the video sequence. Finally, via a rendering module 50, the overlaid video sequence is displayed to the person who has performed the exercise. To provide a comprehensive disclosure without unduly lengthening the specification, the applicant hereby incorporates by reference each of the patents and patent applications referenced above.
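The division of labour between the two sub-groups of modules in Fig. 4 can be sketched with an in-memory channel standing in for barrier 44; the class layout and the review rule below are assumptions for illustration, and only movement data and annotations ever cross the channel while the video sequences remain in the patient-side storage.

import queue

class MovementChannel:
    """Stands in for barrier 44: movement data flows out, annotations flow back."""
    def __init__(self):
        self.outgoing = queue.Queue()   # movement transmission module 41 -> movement receiver 45
        self.incoming = queue.Queue()   # annotation transmission module 47 -> annotation receiver 48

class PatientSideModules:
    def __init__(self, channel):
        self.channel = channel
        self.video_store = []           # storage module 43; its contents are never transmitted

    def capture(self, movement_sample, video_frame):
        self.video_store.append(video_frame)        # camera 42 writes to storage 43
        self.channel.outgoing.put(movement_sample)  # sensor receiver 40 feeds transmitter 41

    def render(self):
        # processing and overlay module 49 plus rendering module 50
        overlays = {}
        while not self.channel.incoming.empty():
            note = self.channel.incoming.get()
            overlays[note["frame"]] = note["text"]
        return [(frame, overlays.get(i)) for i, frame in enumerate(self.video_store)]

class ReviewerSideModules:
    def __init__(self, channel):
        self.channel = channel

    def annotate(self):
        # movement receiver 45, movement annotator 46 and annotation transmitter 47
        frame = 0
        while not self.channel.outgoing.empty():
            sample = self.channel.outgoing.get()
            if sample["elbow_angle"] < 165:          # illustrative review criterion
                self.channel.incoming.put({"frame": frame, "text": "Straighten the elbow."})
            frame += 1

channel = MovementChannel()
patient, reviewer = PatientSideModules(channel), ReviewerSideModules(channel)
for i in range(5):
    patient.capture({"elbow_angle": 175 - 5 * i}, f"video_frame_{i}")
reviewer.annotate()
print(patient.render())

In an actual deployment, the in-memory queues would be replaced by a connection over the interconnected computer network 7, with the same restriction that video never crosses it.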
The particular combinations of elements and features in the above detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this and the patents/applications incorporated by reference are also expressly contemplated. As those skilled in the art will recognize, variations, modifications, and other implementations of what is described herein can occur to those of ordinary skill in the art without departing from the spirit and the scope of the invention as claimed. Accordingly, the foregoing description is by way of example only and is not intended as limiting. The invention's scope is defined in the following claims and the equivalents thereto. Furthermore, reference signs used in the description and claims do not limit the scope of the invention as claimed.

Claims

CLAIMS:
1. A method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of:
a) gathering physical exercise data from a person undertaking exercises;
b) synchronously gathering visual recordings of the person undertaking exercises;
c) transmitting the physical exercise data to a physically separate annotation unit;
d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit;
e) transmitting the annotation information to a display and processing unit for review of the person undertaking exercises;
f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
2. Method according to claim 1, wherein at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data.
3. Method according to claims 1 or 2, wherein step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person.
4. Method according to claims 1 to 3, wherein transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet.
5. Method according to claims 1 to 4, wherein the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate.
6. Method according to claims 1 to 5, wherein the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings.
7. A system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
- a physical data processing unit (1);
- a display device (2) in communication with the physical data processing unit (1);
- at least one posture recording device (3, 3') assigned to the person undertaking exercises and in communication with the physical data processing unit (1);
- a visual recording device (4) in communication with the physical data processing unit (1);
- a data storage unit (5) for storing and retrieving data from the physical data processing unit (1) and the visual recording device (4); the data storage means (5) being in communication with the physical data processing unit (1);
- a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).
8. System according to claim 7, wherein the at least one posture recording device (3, 3') comprises a motion sensor (3) on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors.
9. System according to claim 7, wherein the at least one posture recording device (3, 3') comprises an optical mark (3') on the person undertaking exercises.
10. Use of a system according to claims 7 to 9 for displaying anonymously annotated physical exercise data to a person undertaking exercises.
EP08789619A 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data Withdrawn EP2185071A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP08789619A EP2185071A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP07114912 2007-08-24
PCT/IB2008/053386 WO2009027917A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data
EP08789619A EP2185071A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Publications (1)

Publication Number Publication Date
EP2185071A1 true EP2185071A1 (en) 2010-05-19

Family

ID=40122948

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08789619A Withdrawn EP2185071A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Country Status (5)

Country Link
US (1) US20110021317A1 (en)
EP (1) EP2185071A1 (en)
JP (1) JP2010536459A (en)
CN (1) CN101784230A (en)
WO (1) WO2009027917A1 (en)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US9297709B2 (en) 2013-03-15 2016-03-29 Nike, Inc. System and method for analyzing athletic activity
WO2009152456A2 (en) 2008-06-13 2009-12-17 Nike, Inc. Footwear having sensor system
US8500604B2 (en) * 2009-10-17 2013-08-06 Robert Bosch Gmbh Wearable system for monitoring strength training
WO2012050969A1 (en) 2010-09-29 2012-04-19 Quentiq AG Automated health data acquisition, processing and communication system
BR112013011690A2 (en) 2010-11-10 2016-08-09 Nike International Ltd systems and methods for measuring and displaying time-based sport activity
US9011293B2 (en) * 2011-01-26 2015-04-21 Flow-Motion Research And Development Ltd. Method and system for monitoring and feed-backing on execution of physical exercise routines
WO2012112938A2 (en) 2011-02-17 2012-08-23 Nike International Ltd. Footwear having sensor system
US9411940B2 (en) 2011-02-17 2016-08-09 Nike, Inc. Selecting and correlating physical activity data with image data
EP3662829A1 (en) 2011-02-17 2020-06-10 NIKE Innovate C.V. Footwear having sensor system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
US9378336B2 (en) 2011-05-16 2016-06-28 Dacadoo Ag Optical data capture of exercise data in furtherance of a health score computation
CN102440774A (en) * 2011-09-01 2012-05-09 东南大学 Remote measurement module for related physiological information in rehabilitation training process
US20130178960A1 (en) * 2012-01-10 2013-07-11 University Of Washington Through Its Center For Commercialization Systems and methods for remote monitoring of exercise performance metrics
ITGE20120011A1 (en) * 2012-01-27 2013-07-28 Paybay Networks S R L PATIENT REHABILITATION SYSTEM
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US20130213147A1 (en) 2012-02-22 2013-08-22 Nike, Inc. Footwear Having Sensor System
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US9652992B2 (en) 2012-10-09 2017-05-16 Kc Holdings I Personalized avatar responsive to user physical state and context
US9501942B2 (en) 2012-10-09 2016-11-22 Kc Holdings I Personalized avatar responsive to user physical state and context
JP5811360B2 (en) * 2012-12-27 2015-11-11 カシオ計算機株式会社 Exercise information display system, exercise information display method, and exercise information display program
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
JP2014199613A (en) * 2013-03-29 2014-10-23 株式会社コナミデジタルエンタテインメント Application control program, application control method, and application control device
US20150133820A1 (en) * 2013-11-13 2015-05-14 Motorika Limited Virtual reality based rehabilitation apparatuses and methods
EP3096685A1 (en) * 2014-01-24 2016-11-30 Icura ApS System and method for mapping moving body parts
US10484437B2 (en) * 2015-01-21 2019-11-19 Logmein, Inc. Remote support service with two-way smart whiteboard
WO2016196217A1 (en) * 2015-05-29 2016-12-08 Nike Innovate C.V. Enhancing exercise through augmented reality
WO2017055080A1 (en) * 2015-09-28 2017-04-06 Koninklijke Philips N.V. System and method for supporting physical exercises
CN105641900B (en) * 2015-12-28 2019-07-26 联想(北京)有限公司 A kind of respiratory state based reminding method and electronic equipment and system
KR102511518B1 (en) * 2016-01-12 2023-03-20 삼성전자주식회사 Display apparatus and control method of the same
CN105615852A (en) * 2016-03-17 2016-06-01 北京永数网络科技有限公司 Blood pressure detection system and method
JP7009955B2 (en) * 2017-11-24 2022-01-26 トヨタ自動車株式会社 Medical data communication equipment, servers, medical data communication methods and medical data communication programs
US11331538B2 (en) * 2018-08-07 2022-05-17 Interactive Strength, Inc. Interactive exercise machine data architecture
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5679004A (en) * 1995-12-07 1997-10-21 Movit, Inc. Myoelectric feedback system
DE69736622T2 (en) * 1996-07-03 2007-09-13 Hitachi, Ltd. Motion detection system
JP3469410B2 (en) * 1996-11-25 2003-11-25 三菱電機株式会社 Wellness system
US20060247070A1 (en) * 2001-06-11 2006-11-02 Recognition Insight, Llc Swing position recognition and reinforcement
US20030054327A1 (en) * 2001-09-20 2003-03-20 Evensen Mark H. Repetitive motion feedback system and method of practicing a repetitive motion
US6817979B2 (en) * 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20060025229A1 (en) * 2003-12-19 2006-02-02 Satayan Mahajan Motion tracking and analysis apparatus and method and system implementations thereof
KR20070095407A (en) * 2005-01-26 2007-09-28 벤틀리 키네틱스 인코포레이티드 Method and system for athletic motion analysis and instruction
US20060183980A1 (en) * 2005-02-14 2006-08-17 Chang-Ming Yang Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing
WO2006103676A2 (en) * 2005-03-31 2006-10-05 Ronen Wolfson Interactive surface and display system
US20090299232A1 (en) * 2006-07-12 2009-12-03 Koninklijke Philips Electronics N.V. Health management device
ATE505242T1 (en) * 2007-08-22 2011-04-15 Koninkl Philips Electronics Nv SYSTEM AND METHOD FOR DISPLAYING SELECTED INFORMATION TO A PERSON PERFORMING TRAINING EXERCISES

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2009027917A1 *

Also Published As

Publication number Publication date
JP2010536459A (en) 2010-12-02
US20110021317A1 (en) 2011-01-27
WO2009027917A1 (en) 2009-03-05
CN101784230A (en) 2010-07-21

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100324

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20101223

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110503