WO2022250099A1 - Information processing device, electronic device, information processing system, information processing method, and program
- Publication number
- WO2022250099A1 (PCT/JP2022/021462)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
Definitions
- the present disclosure relates to an information processing device, an electronic device, an information processing system, an information processing method, and a program.
- a technique for estimating a person's walking state is known (for example, Patent Document 1).
- the motion analysis device described in Patent Document 1 includes detection means attached to a human body and analysis means.
- the analysis means analyzes the walking state and/or the running state based on the signal from the detection means.
- the interpretation described in Non-Patent Document 1 is known as one interpretation of a good way of walking.
- An information processing device includes a control unit that acquires sensor data indicating movement of a body part of a user from at least one sensor device worn on the body part, and estimates, based on the acquired sensor data and a learning model, a state of a body part of the user that is different from the body part on which the sensor device is worn.
- An electronic device includes a notification unit that notifies information about the state of the body part estimated by the information processing device.
- An information processing system includes at least one sensor device worn on a user's body part, and an information processing device that acquires sensor data indicating movement of the body part from the sensor device and estimates, based on the acquired sensor data and a learning model, a state of a body part of the user that is different from the body part on which the sensor device is worn.
- An information processing method includes acquiring sensor data indicating movement of a body part of a user from at least one sensor device worn on the body part, and estimating, based on the acquired sensor data and a learning model, a state of a body part of the user that is different from the body part on which the sensor device is worn.
- A program causes a computer to acquire sensor data indicating movement of a body part of a user from at least one sensor device worn on the body part, and to estimate, based on the acquired sensor data and a learning model, a state of a body part of the user that is different from the body part on which the sensor device is worn.
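The claimed flow above (acquire sensor data from a worn device, then estimate the state of a *different* body part with a learning model) can be sketched as a minimal illustration. All function names and the toy model below are assumptions for illustration, not from the patent.

```python
def acquire_sensor_data(sensor_device):
    """Return the motion samples reported by a worn sensor device (stubbed)."""
    return sensor_device["samples"]

def estimate_other_part_state(samples, model):
    """Apply a learned model to estimate the state of a different body part."""
    return model(samples)

def toy_model(samples):
    """Stand-in for a trained model: maps swing amplitude to a trunk-state label."""
    amplitude = max(samples) - min(samples)
    return "stable" if amplitude < 1.0 else "swaying"

# Sensor data from a device worn on the forearm is used to estimate the
# state of another body part (here, a trunk-state label).
wrist_device = {"part": "forearm", "samples": [0.1, 0.3, 0.2, 0.4]}
samples = acquire_sensor_data(wrist_device)
state = estimate_other_part_state(samples, toy_model)
```

In an actual system, `toy_model` would be replaced by a model obtained through machine learning, as the later sections describe.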
- FIG. 1 is a diagram showing a schematic configuration of an information processing system according to an embodiment of the present disclosure
- FIG. 2 is a diagram for explaining the local coordinate system and the global coordinate system
- a functional block diagram showing the configuration of the information processing system shown in FIG. 1
- a diagram showing a schematic configuration of a learning model according to an embodiment of the present disclosure
- a diagram illustrating an example score according to an embodiment of the present disclosure
- a diagram illustrating an example of associations according to an embodiment of the present disclosure
- graphs showing the accuracy of the learning model
- a flowchart showing the operations of the evaluation process executed by the electronic device shown in FIG. 1
- a functional block diagram showing the configuration of an information processing system according to another embodiment of the present disclosure
- FIG. 11 is a sequence diagram showing the operations of the evaluation process executed by the information processing system shown in FIG. 10
- a new technique for estimating the state of the user's body parts while walking is desired. According to the present disclosure, it is possible to provide a novel technique for estimating the state of a user's body part while walking.
- An information processing system 1 as shown in FIG. 1 can estimate the state of any one of a plurality of body parts of a walking user. By using the information processing system 1, the user can grasp whether the state of his or her body part during walking is good or bad.
- the information processing system 1 may be used for any purpose that requires understanding of the state of the user's body part while walking.
- the information processing system 1 may be used to grasp the state of the user's body parts when, for example, the user walks for exercise, practices walking as a model, practices walking for mountain climbing, or practices race walking.
- the information processing system 1 includes a sensor device 10A, a sensor device 10B, a sensor device 10C, a sensor device 10D, and an electronic device 20. However, the information processing system 1 does not have to include all of the sensor device 10A, the sensor device 10B, the sensor device 10C, and the sensor device 10D. The information processing system 1 may include at least one of the sensor device 10A, the sensor device 10B, the sensor device 10C, and the sensor device 10D.
- sensor devices 10A to 10D are also collectively referred to as the "sensor device 10".
- the sensor device 10 and the electronic device 20 can communicate via a communication line.
- the communication line includes at least one of wired and wireless.
- the sensor device 10 is worn on the user's body part.
- the sensor device 10 detects sensor data indicating movement of the body part of the user wearing the sensor device 10.
- the sensor data are data in the local coordinate system.
- the local coordinate system is a coordinate system based on the position of the sensor device 10, as shown in FIG. 2. In FIG. 2, as an example of the position of the sensor device 10, the position of the sensor device 10A is indicated by a dashed line.
- the local coordinate system is composed of, for example, x-, y-, and z-axes.
- the x-axis, y-axis, and z-axis are orthogonal to each other.
- the x-axis is parallel to the front-rear direction as seen from the sensor device 10.
- the y-axis is parallel to the left-right direction as seen from the sensor device 10.
- the z-axis is parallel to the vertical direction as seen from the sensor device 10.
- a direction parallel to the x-axis of the local coordinate system is also described as the "front-rear direction of the local coordinate system".
- a direction parallel to the y-axis of the local coordinate system is also described as the "left-right direction of the local coordinate system".
- a direction parallel to the z-axis of the local coordinate system is also described as the "vertical direction of the local coordinate system".
- the global coordinate system is a coordinate system based on a position in the space where the user walks, as shown in FIG. 2.
- the global coordinate system is composed of, for example, X, Y and Z axes.
- the X-axis, Y-axis, and Z-axis are orthogonal to each other.
- the X-axis is parallel to the front-rear direction as viewed by the user.
- the Y-axis is parallel to the left-right direction viewed from the user.
- the Z-axis is parallel to the up-down direction viewed from the user.
- a direction parallel to the X-axis of the global coordinate system is also described as the "forward-backward direction of the global coordinate system".
- a direction parallel to the Y-axis of the global coordinate system is also described as a "left-right direction of the global coordinate system".
- a direction parallel to the Z-axis of the global coordinate system is also described as the "vertical direction of the global coordinate system".
- the front-back direction of the global coordinate system can also be called the front-back direction of the user.
- the left-right direction of the global coordinate system can also be called the left-right direction of the user.
- the vertical direction of the global coordinate system can also be called the vertical direction of the user.
- the sagittal plane is a plane that symmetrically divides the user's body or a plane parallel to the plane that symmetrically divides the user's body.
- the frontal plane is the plane that divides the user's body into ventral and dorsal sides or is parallel to the plane that divides the user's body into ventral and dorsal sides.
- a horizontal plane is a plane that divides the user's body into upper and lower parts or a plane that is parallel to a plane that divides the user's body into upper and lower parts.
- the sagittal, frontal and horizontal planes are perpendicular to each other.
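The relationship between the local frame of a sensor device and the global frame can be shown with a small numerical sketch. This is not from the patent: it assumes the only misalignment between the two frames is a yaw rotation about the shared vertical axis, which is a simplification of the general case.

```python
import math

def local_to_global(v_local, yaw_rad):
    """Rotate a local-frame vector (x, y, z) into the global frame,
    assuming the frames differ only by a yaw angle about the vertical axis."""
    x, y, z = v_local
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    # Standard 2D rotation in the horizontal plane; z (vertical) is unchanged.
    return (c * x - s * y, s * x + c * y, z)

# A forward-pointing local vector, with the sensor yawed 90 degrees,
# maps onto the global left-right axis.
gx, gy, gz = local_to_global((1.0, 0.0, 0.0), math.pi / 2)
```

In practice the full orientation (roll, pitch, and yaw) of each device would be needed; this sketch only illustrates why distinguishing the two coordinate systems matters.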
- the sensor device 10A is worn on the user's head.
- the sensor device 10A is worn on the user's ear.
- the sensor device 10A may be a wearable device.
- the sensor device 10A may be an earphone.
- Sensor device 10A may be included in an earphone.
- the sensor device 10A may be a device that can be retrofitted to existing glasses, earphones, or the like.
- the sensor device 10A may be worn on the user's head by any method.
- the sensor device 10A may be attached to the user's head by being attached to a hair accessory such as a hair band or hairpin, or to an earring, a helmet, a hat, a hearing aid, false teeth, an implant, or the like.
- the front-back direction viewed from the sensor device 10A matches the front-back direction of the head viewed from the user
- the left-right direction viewed from the sensor device 10A matches the left-right direction of the head viewed from the user
- the sensor device 10A may be worn on the user's head so that the vertical direction viewed from the sensor device 10A coincides with the vertical direction of the head viewed from the user. That is, the sensor device 10A may be worn on the user's head so that the x-axis of the local coordinate system based on the position of the sensor device 10A is parallel to the front-rear direction of the head viewed from the user, the y-axis of the local coordinate system is parallel to the left-right direction of the head viewed from the user, and the z-axis of the local coordinate system is parallel to the up-down direction of the head viewed from the user.
- each of the front-back direction, left-right direction, and up-down direction viewed from the sensor device 10A does not necessarily correspond to each of the front-back direction, left-right direction, and up-down direction of the head viewed from the user.
- the orientation of the sensor device 10A relative to the user's head may be initialized or made known as appropriate. The relative orientation may be initialized or made known using information on the shape of a jig for attaching the sensor device 10A to the user's head, or using image information generated by imaging the user's head with the sensor device 10A mounted.
- the sensor device 10A detects sensor data indicating the movement of the user's head.
- the sensor data detected by the sensor device 10A includes, for example, data indicating at least one of the velocity of the user's head, the acceleration of the user's head, the angle of the user's head, the angular velocity of the user's head, the temperature of the user's head, and the geomagnetism at the location of the user's head.
- the sensor device 10B is worn on the user's forearm.
- the sensor device 10B is worn on the user's wrist.
- the sensor device 10B may be a wristwatch-type wearable device.
- the sensor device 10B may be worn on the user's forearm by any method.
- the sensor device 10B may be worn on the user's forearm by being attached to a band, a bracelet, a misanga (friendship bracelet), a glove, a ring, a false nail, an artificial hand, or the like.
- the bracelet may be one the user wears for decorative purposes, or one used to attach a locker key or the like to the wrist.
- the front-back direction viewed from the sensor device 10B matches the front-back direction of the wrist viewed from the user
- the left-right direction viewed from the sensor device 10B matches the left-right direction of the wrist viewed from the user
- the rotation direction of the wrist is the direction in which the wrist twists and rotates. That is, the sensor device 10B may be worn on the user's forearm so that the x-axis of the local coordinate system based on the position of the sensor device 10B is parallel to the front-rear direction of the wrist as seen from the user, the y-axis of the local coordinate system is parallel to the left-right direction of the wrist as seen from the user, and the z-axis of the local coordinate system is parallel to the rotation direction of the wrist as seen from the user.
- the sensor device 10B detects sensor data indicating the movement of the user's forearm.
- the sensor data detected by the sensor device 10B includes, for example, data indicating at least one of the velocity of the user's forearm, the acceleration of the user's forearm, the angle of the user's forearm, the angular velocity of the user's forearm, the temperature of the user's forearm, and the geomagnetism at the location of the user's forearm.
- the sensor device 10C is worn on the user's thigh.
- the sensor device 10C may be a wearable device.
- the sensor device 10C may be worn on the user's thigh by any method.
- the sensor device 10C may be worn on the user's thigh by a belt.
- the sensor device 10C may be worn on the thigh by being put in a pocket near the thigh of pants worn by the user.
- the sensor device 10C may be worn on the user's thigh by being installed on pants, underwear, shorts, a supporter, an artificial leg, an implant, or the like.
- the front-rear direction viewed from the sensor device 10C matches the front-rear direction of the thigh viewed from the user
- the left-right direction viewed from the sensor device 10C matches the left-right direction of the thigh viewed from the user.
- the rotation direction of the thigh is the direction in which the thigh twists and rotates.
- that is, the sensor device 10C may be worn on the user's thigh so that the x-axis of the local coordinate system based on the position of the sensor device 10C is parallel to the front-rear direction of the thigh seen from the user, the y-axis of the local coordinate system is parallel to the left-right direction of the thigh seen from the user, and the z-axis of the local coordinate system is parallel to the rotation direction of the thigh seen from the user.
- the sensor device 10C detects sensor data indicating the movement of the user's thighs.
- the sensor data detected by the sensor device 10C includes, for example, data indicating at least one of the velocity of the user's thigh, the acceleration of the user's thigh, the angle of the user's thigh, the angular velocity of the user's thigh, the temperature of the user's thigh, and the geomagnetism at the location of the user's thigh.
- the sensor device 10D is worn on the foot of the user.
- the foot is the portion from the user's ankle to the toe.
- the sensor device 10D may be a shoe-type wearable device.
- the sensor device 10D may be worn on the user's foot by any method.
- the sensor device 10D may be provided on the shoe.
- the sensor device 10D may be attached to the user's foot by being attached to an anklet, band, misanga, false nail, tattoo sticker, supporter, cast, sock, insole, artificial leg, ring, implant, or the like.
- the front-rear direction viewed from the sensor device 10D matches the front-rear direction of the foot viewed from the user
- the left-right direction viewed from the sensor device 10D matches the left-right direction of the foot viewed from the user
- the sensor device 10D may be worn on the foot of the user so that the vertical direction viewed from the sensor device 10D coincides with the vertical direction of the foot viewed from the user. That is, the sensor device 10D may be worn on the user's foot so that the x-axis of the local coordinate system based on the position of the sensor device 10D is parallel to the front-rear direction of the foot viewed from the user, the y-axis of the local coordinate system is parallel to the left-right direction of the foot viewed from the user, and the z-axis of the local coordinate system is parallel to the up-down direction of the foot viewed from the user.
- the sensor device 10D detects sensor data indicating the movement of the user's foot.
- the sensor data detected by the sensor device 10D includes, for example, data indicating at least one of the velocity of the user's foot, the acceleration of the user's foot, the angle of the user's foot, the angular velocity of the user's foot, the temperature of the user's foot, and the geomagnetism at the location of the user's foot.
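As a concrete, hypothetical illustration of the quantities each sensor device reports, one sample could be modeled as below. The field names and units are assumptions for illustration and are not defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One sample of sensor data from a worn device (illustrative only)."""
    part: str                 # e.g. "head", "forearm", "thigh", "foot"
    velocity: tuple           # m/s along the local x/y/z axes
    acceleration: tuple       # m/s^2 along the local x/y/z axes
    angle: tuple              # rad about the local x/y/z axes
    angular_velocity: tuple   # rad/s about the local x/y/z axes
    temperature: float        # degrees Celsius at the body part
    geomagnetism: tuple       # uT along the local x/y/z axes

# Example: a foot-mounted device at rest, reading gravity on its z-axis.
sample = SensorSample(
    part="foot",
    velocity=(0.0, 0.0, 0.0),
    acceleration=(0.0, 0.0, 9.8),
    angle=(0.0, 0.0, 0.0),
    angular_velocity=(0.0, 0.0, 0.0),
    temperature=32.5,
    geomagnetism=(20.0, 0.0, 40.0),
)
```

A real implementation would also carry a timestamp so that samples from multiple devices can be aligned, as the synchronization discussion later suggests.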
- the electronic device 20 is carried by, for example, a walking user.
- the electronic device 20 functions as an information processing device, and can estimate the state of any one of a plurality of body parts of the user based on the sensor data detected by the sensor device 10.
- the electronic device 20 determines the evaluation of the state of the user's body part by estimating the state of the user's body part.
- the electronic device 20 is, for example, a mobile device such as a mobile phone, a smart phone, or a tablet.
- the sensor device 10 includes at least a communication section 11 and a sensor section 12.
- the sensor device 10 may further include a notification unit that reports information, a storage unit 14, and a control unit 15.
- the notification unit is the output unit 13.
- the notification unit is not limited to the output unit 13.
- the sensor device 10C and the sensor device 10D do not have to include the output unit 13.
- the communication unit 11 includes at least one communication module capable of communicating with the electronic device 20 via a communication line.
- the communication module is a communication module conforming to the communication line standard.
- Standards for communication lines are short-range wireless communication standards including, for example, Bluetooth (registered trademark), Wi-Fi (registered trademark), infrared rays, and NFC (Near Field Communication).
- the sensor unit 12 includes arbitrary sensors corresponding to the sensor data to be detected by the sensor device 10.
- the sensor unit 12 includes at least one of, for example, a 3-axis motion sensor, a 3-axis acceleration sensor, a 3-axis speed sensor, a 3-axis gyro sensor, a 3-axis geomagnetic sensor, a temperature sensor, an air pressure sensor, a camera, and the like.
- the data detected by each of the acceleration sensor and the geomagnetic sensor may be used to calculate the initial angle of the body part to be detected by the sensor device 10. Further, the data detected by each of the acceleration sensor and the geomagnetic sensor may be used to correct the data indicating the angle detected by the sensor device 10.
- the angle of the body part to be detected by the sensor device 10 may be calculated by time-integrating the angular velocity detected by the gyro sensor.
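The gyro-integration step described above can be sketched numerically. The sketch assumes simple rectangular integration at a fixed sampling interval, seeded with an initial angle obtained elsewhere (e.g. from the accelerometer and magnetometer); the patent does not specify the integration scheme.

```python
def integrate_angle(initial_angle, angular_velocities, dt):
    """Accumulate an angle (rad) by time-integrating angular-velocity
    samples (rad/s) taken dt seconds apart (rectangular rule)."""
    angle = initial_angle
    for omega in angular_velocities:
        angle += omega * dt
    return angle

# 10 samples of 1 rad/s at 0.1 s spacing add 1 rad to the initial 0.5 rad.
final = integrate_angle(0.5, [1.0] * 10, 0.1)
```

Because pure integration drifts over time, a real device would periodically correct the result with the accelerometer/magnetometer data, as the preceding bullet notes.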
- the data detected by the air pressure sensor may be used when the control unit 26 of the electronic device 20, which will be described later, evaluates the state of the user's body part.
- the output unit 13 can output data.
- the output unit 13 includes at least one output interface capable of outputting data.
- the output interface is, for example, a display or speaker.
- the display is, for example, an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display.
- the output unit 13 may include a speaker when included in the sensor device 10A. Moreover, the output unit 13 may include a display when included in the sensor device 10B.
- the storage unit 14 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or a combination of at least two of them.
- the semiconductor memory is, for example, RAM (Random Access Memory) or ROM (Read Only Memory).
- the RAM is, for example, SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory).
- the ROM is, for example, EEPROM (Electrically Erasable Programmable Read Only Memory) or the like.
- the storage unit 14 may function as a main storage device, an auxiliary storage device, or a cache memory.
- the storage unit 14 stores data used for the operation of the sensor device 10 and data obtained by the operation of the sensor device 10.
- the storage unit 14 stores system programs, application programs, embedded software, and the like.
- the control unit 15 includes at least one processor, at least one dedicated circuit, or a combination thereof.
- the processor is a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a dedicated processor specialized for specific processing.
- the dedicated circuit is, for example, FPGA (Field-Programmable Gate Array) or ASIC (Application Specific Integrated Circuit).
- the control unit 15 executes processing related to the operation of the sensor device 10 while controlling each unit of the sensor device 10 .
- the control unit 15 receives, via the communication unit 11, a signal from the electronic device 20 instructing the start of data detection. Upon receiving this signal, the control unit 15 starts data detection. For example, the control unit 15 acquires data detected by the sensor unit 12 from the sensor unit 12, and transmits the acquired data as sensor data to the electronic device 20 through the communication unit 11. The signal instructing the start of data detection is transmitted from the electronic device 20 to the plurality of sensor devices 10 as a broadcast signal. By transmitting the signal as a broadcast signal, the plurality of sensor devices 10 can start data detection simultaneously.
- the control unit 15 acquires data from the sensor unit 12 at preset time intervals, and transmits the acquired data as sensor data through the communication unit 11.
- This time interval may be set based on a typical user's walking speed or the like. This time interval may be the same for each of the plurality of sensor devices 10 . Since this time interval is the same for the plurality of sensor devices 10, it is possible to synchronize the timing at which each of the plurality of sensor devices 10 detects data.
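The broadcast-start behaviour described above — one signal causing every sensor device to begin detection on the same schedule — can be sketched as follows. The class, attribute, and message names are illustrative assumptions, not from the patent.

```python
class SensorDevice:
    """Toy stand-in for a worn sensor device that reacts to broadcasts."""
    def __init__(self, name, interval_s):
        self.name = name
        self.interval_s = interval_s  # shared sampling interval
        self.sampling = False

    def on_broadcast(self, message):
        # Start detection only when the broadcast start signal arrives.
        if message == "start_detection":
            self.sampling = True

def broadcast(devices, message):
    """Deliver one message to all devices, as a broadcast signal would."""
    for d in devices:
        d.on_broadcast(message)

# Four devices (head, forearm, thigh, foot) share the same interval, so
# after one broadcast their detection timings stay aligned.
devices = [SensorDevice(n, 0.1) for n in ("10A", "10B", "10C", "10D")]
broadcast(devices, "start_detection")
started = all(d.sampling for d in devices)
```

The shared `interval_s` mirrors the point that using one interval across devices keeps their detection timings synchronized.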
- the electronic device 20 includes a communication unit 21, an input unit 22, a notification unit that notifies information, a storage unit 25, and a control unit 26.
- the notification unit consists of the output unit 23 and the vibration unit 24.
- the notification unit is not limited to the output unit 23 and the vibration unit 24.
- the output unit 23 and the vibration unit 24 may be mounted on the electronic device 20, or may be arranged near any one of the sensor devices 10B, 10C, and 10D.
- the communication unit 21 includes at least one communication module capable of communicating with the sensor device 10 via a communication line.
- the communication module is at least one communication module compatible with the communication line standard.
- the communication line standard is, for example, a short-range wireless communication standard including Bluetooth (registered trademark), Wi-Fi (registered trademark), infrared rays, NFC, and the like.
- the communication unit 21 may further include at least one communication module connectable to the network 2 as shown in FIG. 10 which will be described later.
- the communication module is, for example, a communication module compatible with mobile communication standards such as LTE (Long Term Evolution), 4G (4th Generation), or 5G (5th Generation).
- the input unit 22 can accept input from the user.
- the input unit 22 includes at least one input interface capable of accepting input from the user.
- the input interface is, for example, a physical key, a capacitive key, a pointing device, a touch screen provided integrally with the display, or a microphone.
- the output unit 23 can output data.
- the output unit 23 includes at least one output interface capable of outputting data.
- the output interface is, for example, a display or speaker.
- the display is, for example, an LCD or an organic EL display.
- the vibration unit 24 can vibrate the electronic device 20.
- the vibration unit 24 includes a vibrating element.
- the vibrating element is, for example, a piezoelectric element.
- the storage unit 25 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or a combination of at least two of them.
- a semiconductor memory is, for example, a RAM or a ROM.
- RAM is, for example, SRAM or DRAM.
- ROM is, for example, EEPROM or the like.
- the storage unit 25 may function as a main storage device, an auxiliary storage device, or a cache memory.
- the storage unit 25 stores data used for the operation of the electronic device 20 and data obtained by the operation of the electronic device 20.
- the storage unit 25 stores system programs, application programs, embedded software, and the like.
- the storage unit 25 stores a learning model, which will be described later, and the association shown in FIG. 6, which will be described later.
- the control unit 26 includes at least one processor, at least one dedicated circuit, or a combination thereof.
- a processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for a particular process.
- the dedicated circuit is, for example, FPGA or ASIC.
- the control unit 26 executes processing related to the operation of the electronic device 20 while controlling each unit of the electronic device 20 .
- the control unit 26 receives, via the input unit 22, an input instructing execution of walking evaluation.
- This input is an input that causes electronic device 20 to execute determination processing for determining the evaluation of the state of the body part.
- This input is input from the input unit 22 by a user wearing the sensor device 10, for example.
- the user inputs this input from the input unit 22, for example, before starting walking.
- the control unit 26 transmits, via the communication unit 21, a signal instructing the start of data detection as a broadcast signal to the plurality of sensor devices 10. After the signal is transmitted to the plurality of sensor devices 10, sensor data is transmitted from at least one sensor device 10 to the electronic device 20.
- the control unit 26 receives sensor data from at least one sensor device 10 via the communication unit 21 .
- the control unit 26 acquires sensor data from the sensor device 10 by receiving the sensor data from the sensor device 10.
- the control unit 26 determines an evaluation of the state of the user's body part based on the acquired sensor data and the learning model.
- the content of the evaluation of the state of the body part may be set based on an interpretation of what is generally considered a good way of walking. One such interpretation is, for example, ASICS Sports Research Institute, "Ultimate Way of Walking", Kodansha Gendai Shinsho, September 2019, pp. 92, 94, 95. Other interpretations also exist.
- the control unit 26 may determine an evaluation for at least some of (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet.
- Head state: an evaluation for the state of the head may be determined.
- the user's body should shake as little as possible while walking.
- when the shaking of the user's body while walking is small, the shaking of the user's head is also small. Therefore, when the shaking of the user's head while walking is small, a higher evaluation for the state of the head may be determined than when the shaking of the user's head while walking is large.
- The state of the user's arms while walking is good when the arms are pulled backward. Therefore, when the user's arms are pulled backward while walking, a higher evaluation for the state of the arms may be determined than when the arms are not pulled backward.
- Knee condition: An evaluation for the state of the knees may be determined.
- The state of the user's knees while walking is good when the knees are not bent. Therefore, when the user's knees are not bent while walking, a higher evaluation for the state of the knees may be determined than when the knees are bent.
- Foot condition: An evaluation for the state of the feet may be determined.
- The user's stride while walking should be as wide as possible. Therefore, when the user's stride while walking is wide, a higher evaluation for the state of the feet may be determined than when the stride is narrow.
- the control unit 26 determines an evaluation of the state of the user's body part based on the sensor data and the learning model.
- The learning model is machine-learned so that, when sensor data or feature data is input, it outputs information on the evaluation of the state of a predetermined body part of the user.
- the control unit 26 inputs sensor data or feature data into the learning model, acquires information on the evaluation of the state of the user's body part from the learning model, and determines the evaluation of the state of the user's body part.
- the feature data is data indicating features of movement of the user's body part to which the sensor device 10 is attached.
- the control unit 26 acquires feature data from sensor data. An example of feature data will be described later.
- The learning model can be machine-learned so that, when sensor data or feature data is input, it outputs information regarding the evaluation of the state of a body part of the user different from the body part on which the sensor device 10 is worn. This is possible because multiple body parts of the user move while influencing each other during walking.
- The control unit 26 can determine an evaluation of the state of a body part different from a body part on which the user wears a sensor device 10.
- The control unit 26 can determine evaluations of the states of more body parts than the number of body parts on which the user wears sensor devices 10.
- the learning model according to this embodiment is machine-learned so that when feature data is input, a score is output as information for evaluating the state of a predetermined body part.
- a score indicates an assessment of the condition of a given body part. The higher the score, the higher the evaluation for the state of the predetermined body part corresponding to the score.
- the control unit 26 determines the evaluation for the state of the predetermined body part corresponding to the score.
- the feature data may be data indicating statistical values of sensor data. Since the feature data are statistical values of the sensor data, the feature data can indicate the motion features of the body parts.
- the feature data is the maximum value, minimum value, average value, variance, or the like of sensor data in a predetermined period.
- the predetermined period is, for example, the walking cycle of the user or a partial period of the walking cycle.
- the walking cycle is, for example, the period from when one of the two feet of the user lands on the ground to when it lands on the ground again.
- the partial period of the walking cycle is, for example, a stance phase or a swing phase.
- the stance phase is, for example, the period from when one of the user's two feet touches the ground until it leaves the ground.
- The swing phase is, for example, the period from when one of the user's two feet leaves the ground until it lands on the ground again.
- the control unit 26 may detect the user's walking cycle and a partial period of the walking cycle by analyzing the sensor data.
- the control unit 26 may acquire feature data from the sensor data by performing calculations on the sensor data.
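The statistical feature extraction described above can be sketched as follows (a minimal illustration; the function name and sample values are assumptions, and the statistics shown are the maximum value, minimum value, average value, and variance that the embodiment lists as examples):

```python
import numpy as np

def extract_features(samples):
    """Compute the statistical feature data listed above (maximum,
    minimum, average, variance) from sensor samples covering one
    predetermined period, e.g. one walking cycle."""
    samples = np.asarray(samples, dtype=float)
    return {
        "max": float(np.max(samples)),
        "min": float(np.min(samples)),
        "mean": float(np.mean(samples)),
        "variance": float(np.var(samples)),
    }

# Hypothetical vertical head-angle samples over one walking cycle (degrees).
cycle = [1.2, 2.5, 3.1, 2.0, 0.8, -0.5, 0.3, 1.1]
features = extract_features(cycle)
```

In practice the control unit 26 would apply such a computation to the sensor data of each detected walking cycle or partial period.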
- the feature data may be sensor data at a predetermined timing. Since the feature data is sensor data at a predetermined timing, the feature data can indicate the feature of the movement of the body part. For example, the feature data is sensor data at the landing timing of the user. The landing timing is the timing at which the user's foot lands on the ground. The control unit 26 may detect the landing timing by analyzing sensor data.
- the feature data may be data in any coordinate system.
- the feature data may be data in a local coordinate system or data in a global coordinate system.
- the control unit 26 acquires the feature data in the global coordinate system by executing coordinate transformation on the sensor data in the local coordinate system.
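The coordinate transformation from the local coordinate system to the global coordinate system can be sketched as a rotation of the sensor data, assuming the sensor's orientation is available as Z-Y-X Euler angles (the angle convention and function name are assumptions; the embodiment does not specify how the orientation is obtained):

```python
import numpy as np

def local_to_global(vec_local, roll, pitch, yaw):
    """Rotate a local-frame sensor vector into the global frame using
    Z-Y-X Euler angles in radians. The angle convention is an
    assumption; the embodiment only states that a coordinate
    transformation is executed on local-coordinate sensor data."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx @ np.asarray(vec_local)

# A sensor pitched 90 degrees: its local x-axis points along global -z.
v = local_to_global([1.0, 0.0, 0.0], roll=0.0, pitch=np.pi / 2, yaw=0.0)
```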
- a learning model 30 as shown in FIG. 4 is a neural network learning model.
- The learning model 30 includes an input layer 31, a hidden layer 32, a hidden layer 33, and an output layer 34.
- the learning model 30 outputs one score from the output layer 34 when three feature data are input from the input layer 31 .
- the input layer 31 includes 3 neurons. Feature data is input to each of the three neurons of the input layer 31 .
- Hidden layer 32 and hidden layer 33 each contain 64 neurons.
- The output layer 34 contains one neuron. A score is output from the neuron of the output layer 34.
- Between each pair of adjacent layers among the input layer 31, the hidden layer 32, the hidden layer 33, and the output layer 34, the neurons of one layer are connected to the neurons of the other layer.
- A weighting factor corresponding to the strength of the connection between each pair of connected neurons is adjusted.
- the number of neurons included in each of the input layer 31, hidden layer 32, hidden layer 33, and output layer 34 may be adjusted according to the number of feature data used.
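The network of FIG. 4 (3 input neurons, two hidden layers of 64 neurons each, 1 output neuron) can be sketched as follows. The activation functions and the random initial weights are assumptions for illustration; a trained learning model would instead use the weighting factors adjusted by machine learning:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from FIG. 4: 3 input neurons, 64 + 64 hidden, 1 output.
sizes = [3, 64, 64, 1]
# Random weights stand in for the weighting factors adjusted by learning.
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(feature_data):
    """Forward pass: ReLU on the hidden layers, linear output neuron.
    The activation functions are assumptions; the embodiment only
    specifies the layer structure and neuron counts."""
    x = np.asarray(feature_data, dtype=float)
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)  # hidden layers 32 and 33
    return float((x @ weights[-1] + biases[-1])[0])  # output layer 34

# Three pieces of feature data in, one score out.
score = forward([1.31, 4.2, 0.9])
```

Adjusting the list `sizes` corresponds to changing the number of neurons according to the number of pieces of feature data used.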
- FIG. 5 shows an example of scores according to an embodiment of the present disclosure.
- the control unit 26 acquires scores as shown in FIG. 5 by inputting the feature data into the learning model.
- The control unit 26 uses five learning models to acquire a score for each of (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet.
- the score is a numerical value from 1 to 5.
- the control unit 26 acquires a score of 5 as an evaluation for (1) the state of the head.
- the control unit 26 acquires a score of 4 as an evaluation for (2) the state of the arm.
- the control unit 26 acquires a score of 3 as an evaluation for (3) the state of the trunk.
- the control unit 26 acquires a score of 1 as an evaluation for (4) the state of the knee.
- the control unit 26 obtains a score of 2 as an evaluation for (5) the state of the foot.
- the control unit 26 may generate an evaluation signal according to the determined evaluation. When determining a plurality of evaluations, the control unit 26 may generate an evaluation signal corresponding to at least one of the determined evaluations.
- The evaluation signal may be a signal indicating praise for the user if the determined evaluation is higher than an evaluation threshold.
- The evaluation signal may be a signal indicating advice for the user if the determined evaluation is lower than the evaluation threshold.
- the evaluation threshold may be set based on, for example, an average value of general user evaluations.
- the evaluation threshold may be the average score of typical users when a learning model is used.
- The content of the praise for the user and the content of the advice may be set based on the interpretation of generally good walking form described above.
- the evaluation threshold is a score of 3.
- "Good" indicates that the evaluation is higher than the evaluation threshold.
- "Poor" indicates that the evaluation is lower than the evaluation threshold.
- "Average" indicates that the evaluation is equal to the evaluation threshold.
- the evaluation for each of (1) the state of the head and (2) the state of the arm is higher than the evaluation threshold.
- The control unit 26 generates, as evaluation signals, signals indicating content that praises the user for each of (1) the state of the head and (2) the state of the arms. For example, as the evaluation signal for (1) the state of the head, the control unit 26 generates a signal indicating that the user's head sway is small and the head is in good condition. For example, as the evaluation signal for (2) the state of the arms, the control unit 26 generates a signal indicating that the user's arm swing is large, the arms are pulled backward, and the arms are in good condition.
- the evaluation for each of (4) the state of the knee and (5) the state of the foot is lower than the evaluation threshold.
- The control unit 26 generates a signal indicating advice to the user as the evaluation signal for each of (4) the state of the knees and (5) the state of the feet. For example, as the evaluation signal for (4) the state of the knees, the control unit 26 generates a signal indicating advice not to bend the knees. For example, as the evaluation signal for (5) the state of the feet, the control unit 26 generates a signal indicating advice to widen the stride.
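The threshold logic described above (praise when the evaluation exceeds the evaluation threshold of 3, advice when it is below) can be sketched as follows; the message strings and part names are hypothetical illustrations of the content the evaluation signal indicates:

```python
# The evaluation threshold is the score of 3 used in the example above.
EVALUATION_THRESHOLD = 3

# Hypothetical message strings; the embodiment only describes their gist.
PRAISE = {"head": "Head sway is small; the head is in good condition."}
ADVICE = {"knee": "Try not to bend your knees.",
          "foot": "Try to widen your stride."}

def make_evaluation_signal(part, score):
    """Praise if the score exceeds the threshold, advice if it is
    below, and no signal for an average score."""
    if score > EVALUATION_THRESHOLD:
        return PRAISE.get(part, part + ": good")
    if score < EVALUATION_THRESHOLD:
        return ADVICE.get(part, part + ": needs improvement")
    return None  # equal to the threshold: neither praise nor advice

# Scores from the FIG. 5 example: head 5, trunk 3, knee 1, foot 2.
signals = {p: make_evaluation_signal(p, s)
           for p, s in {"head": 5, "trunk": 3, "knee": 1, "foot": 2}.items()}
```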
- the control unit 26 may transmit the generated evaluation signal to the external device through the communication unit 21.
- the control unit 26 may transmit the evaluation signal through the communication unit 21 to any sensor device 10 having the output unit 13 as an external device.
- The control section 15 receives the evaluation signal through the communication section 11.
- the control unit 15 causes the output unit 13 as a notification unit to notify the content indicated by the received evaluation signal.
- the control unit 15 causes the output unit 13 to output the content indicated by the evaluation signal.
- the control unit 26 may transmit the evaluation signal to the earphone as an external device through the communication unit 21.
- the control section 15 receives the evaluation signal through the communication section 11 .
- the control unit 15 causes the output unit 13 as a notification unit to notify the content indicated by the evaluation signal.
- the control unit 15 notifies by outputting the content indicated by the evaluation signal from the speaker of the output unit 13 as voice.
- the control unit 26 may notify the user of the content indicated by the generated evaluation signal by means of the notification unit. As an example of notification, the control unit 26 may cause the output unit 23 to output the content indicated by the generated evaluation signal. As another example of notification, the control section 26 may vibrate the vibrating section 24 in a vibration pattern according to the determined evaluation.
- the control unit 26 may select a learning model to be used in the above-described evaluation determination process from among a plurality of learning models according to the type of the sensor device 10 that has transmitted the sensor data to the electronic device 20 .
- the control unit 26 may refer to the association shown in FIG. 6 stored in the storage unit 25 and select a learning model to be used in the evaluation determination process.
- the learning model as shown in FIG. 6 is generated by the learning model generation method described later.
- Numerical values in parentheses shown together with each learning model are the accuracy and precision of the learning model calculated by the learning model generation method described later.
- A method of calculating the accuracy and precision of a learning model will be described later.
- The number on the left in the parentheses is the accuracy of the learning model.
- The number on the right in the parentheses is the precision of the learning model.
- A learning model marked with a double circle is a learning model with an accuracy of 90% or more and a precision of 70% or more.
- A learning model marked with a single circle is a learning model that does not satisfy the accuracy and precision conditions of a learning model marked with a double circle.
- A learning model marked with a single circle has an accuracy of 80% or more and a precision of 60% or more.
- A learning model marked with a triangle is a learning model with an accuracy of less than 80% or a precision of less than 60%.
- The control unit 26 may determine evaluations for only some of the states from (1) the state of the head to (5) the state of the feet according to the accuracy and precision of the learning models, or may determine evaluations for all of the states from (1) the state of the head to (5) the state of the feet.
- control unit 26 selects a learning model to be used for evaluation determination processing by selecting case C1, case C2, case C3, case C4, or case C5.
- Cases C1 to C5 associate learning models used in the evaluation determination process with types of sensor devices 10 for acquiring feature data to be input to the learning models.
- The learning model is not limited to those shown in FIG. 6. A learning model using any combination of sensor data or feature data of the sensor devices 10 may be employed.
- the control unit 26 may select one of cases C1 to C5 according to the type of sensor device 10 that transmits sensor data to the electronic device 20. As an example, the control unit 26 may select an arbitrary combination of sensor devices 10 from a plurality of sensor devices 10 that have transmitted sensor data to the electronic device 20 . The control unit 26 may select a case corresponding to the combination of the selected sensor devices 10 from cases C1 to C5. Information of feature data used in cases C1 to C5, which will be described below, may be stored in the storage unit 25 in association with cases C1 to C5.
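The selection of a case according to the combination of sensor devices 10 that transmitted sensor data can be sketched as a lookup table. Only the combinations explicitly described in the text (case C1: 10A alone; case C2: 10A and 10D; case C3: 10A and 10C) are registered here, and the table itself is an illustrative assumption rather than the stored association of FIG. 6:

```python
# Hypothetical lookup from device combinations to cases; only the
# combinations explicitly described in the text are registered.
CASE_BY_DEVICES = {
    frozenset({"10A"}): "C1",
    frozenset({"10A", "10D"}): "C2",
    frozenset({"10A", "10C"}): "C3",
}

def select_case(devices):
    """Return the case associated with the set of sensor devices that
    transmitted sensor data, or None if no registered combination
    matches (cases C4 and C5 would be registered analogously)."""
    return CASE_BY_DEVICES.get(frozenset(devices))

case = select_case({"10A", "10D"})  # -> "C2"
```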
- The control unit 26 may select case C1. For example, if the only sensor device 10 worn by the user is the sensor device 10A, the only sensor device 10 that transmits sensor data to the electronic device 20 is the sensor device 10A. Alternatively, when the control unit 26 selects the sensor device 10A from among the plurality of sensor devices 10 that have transmitted sensor data to the electronic device 20, the control unit 26 may select case C1.
- In case C1, the control unit 26 selects learning models 30A, 30B, 30C, 30D, and 30E for evaluating (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet, respectively.
- The feature data input to the learning models 30A to 30E are feature data indicating features of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system.
- the feature data input to the learning models 30A to 30E may include feature data indicating the features of the movement of the head in at least one of the front-rear direction, left-right direction, and up-down direction of the global coordinate system.
- control unit 26 inputs three feature data to each of the learning models 30A to 30E.
- One of the three pieces of feature data input to the learning models 30A to 30E is feature data indicating a feature of the movement of the user's head in the vertical direction of the global coordinate system.
- This piece of feature data is the average value of the angle of the user's head in the vertical direction of the global coordinate system.
- This one piece of feature data is acquired from sensor data indicating the movement of the user's head detected by the sensor device 10A.
- the average value of the angles of the user's head in the feature data may be the average value of the angles of the head in the walking cycle of the user.
- The other two pieces of feature data are feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are the maximum value of the angle of the user's head in the left-right direction of the global coordinate system and the average value of the angle of the user's head in the left-right direction of the global coordinate system.
- These two feature data are acquired from sensor data indicating the movement of the user's head detected by the sensor device 10A.
- the maximum value and average value of the angle of the user's head in the feature data may be the maximum value and average value of the angle of the head in the walking cycle of the user, respectively.
- the control unit 26 may select case C2. Alternatively, when the control unit 26 selects the sensor device 10A and the sensor device 10D from among the plurality of sensor devices 10 that have transmitted the sensor data to the electronic device 20, the control unit 26 may select case C2.
- In case C2, the control unit 26 selects learning models 30F, 30G, 30H, 30I, and 30J for evaluating (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet, respectively.
- The feature data input to the learning models 30F to 30J include, as in case C1, feature data indicating features of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system. Furthermore, the feature data input to the learning models 30F to 30J include feature data indicating a feature of the movement of the user's foot.
- control unit 26 inputs three feature data to each of the learning models 30F to 30J.
- Two of the three pieces of feature data input to the learning models 30F to 30J are feature data indicating a feature of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating a feature of the movement of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are the average value of the angle of the user's head in the vertical direction of the global coordinate system and the maximum value of the angle of the user's head in the horizontal direction of the global coordinate system.
- These two feature data are acquired from the data indicating the movement of the user's head detected by the sensor device 10A.
- the average value and the maximum value of the angle of the user's head in the feature data may be the average value and the maximum value of the angle of the head in the walking cycle of the user, respectively.
- One of the three pieces of feature data input to the learning models 30F to 30J is feature data indicating a feature of the movement of the user's foot.
- This piece of feature data is the maximum value of the acceleration of the user's foot in the vertical direction of the local coordinate system with the position of the sensor device 10D as a reference.
- This one piece of feature data is obtained from sensor data indicating the movement of the user's foot detected by the sensor device 10D.
- the maximum value of acceleration of the foot of the user in the feature data may be the maximum value of acceleration of the foot during the user's walking cycle.
- the control unit 26 may select case C3. Alternatively, when the control unit 26 selects the sensor device 10A and the sensor device 10C from the plurality of sensor devices 10 that have transmitted sensor data to the electronic device 20, the control unit 26 may select case C3.
- In case C3, the control unit 26 selects learning models 30K, 30L, 30M, 30N, and 30O for evaluating (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet, respectively.
- The feature data input to the learning models 30K to 30O include, as in case C1, feature data indicating features of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system. Furthermore, the feature data input to the learning models 30K to 30O include feature data indicating a feature of the movement of the user's thigh.
- control unit 26 inputs three feature data to each of the learning models 30K-30O.
- Two of the three pieces of feature data input to the learning models 30K to 30O are feature data indicating a feature of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating a feature of the movement of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are the average value of the angle of the user's head in the vertical direction of the global coordinate system and the average value of the angle of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are obtained from sensor data indicating the movement of the user's head detected by the sensor device 10A.
- the average value of the angles of the user's head in the feature data may be the average value of the angles of the head in the walking cycle of the user.
- One of the three pieces of feature data input to the learning models 30K to 30O is feature data indicating a feature of the movement of the user's thigh.
- This piece of feature data is the variance, during the stance phase, of the angular velocity of the user's thigh in the left-right direction of the local coordinate system with the position of the sensor device 10C as a reference.
- This piece of feature data is obtained from sensor data indicating the movement of the user's thigh detected by the sensor device 10C.
- In case C4, the control unit 26 selects learning models 30P, 30Q, 30R, 30S, and 30T for evaluating (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet, respectively.
- The feature data input to the learning models 30P to 30T include, as in case C1, feature data indicating features of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system. Furthermore, the feature data input to the learning models 30P to 30T include feature data indicating a feature of the movement of the user's forearm.
- control unit 26 inputs three feature data to each of the learning models 30P to 30T.
- Two of the three pieces of feature data input to the learning models 30P to 30T are feature data indicating a feature of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating a feature of the movement of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are the average value of the angle of the user's head in the vertical direction of the global coordinate system and the angle of the user's head in the horizontal direction of the global coordinate system at the landing timing.
- These two pieces of feature data are obtained from sensor data indicating the movement of the user's head detected by the sensor device 10A.
- the average value of the angles of the user's head in the feature data may be the average value of the angles of the head in the walking cycle of the user.
- One of the three pieces of feature data input to the learning models 30P to 30T is feature data indicating a feature of the movement of the user's forearm.
- This piece of feature data is the variance of the acceleration of the user's forearm in the front-rear direction of the local coordinate system with the position of the sensor device 10B as a reference.
- This one piece of feature data is obtained from sensor data indicating the movement of the user's forearm detected by the sensor device 10B.
- the variance of the acceleration of the user's forearm in the feature data may be the variance of the acceleration of the forearm in the walking cycle of the user.
- In case C5, the control unit 26 selects learning models 30U, 30V, 30W, 30X, and 30Y for evaluating (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet, respectively.
- The feature data input to the learning models 30U to 30Y include, as in case C1, feature data indicating features of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating features of the movement of the user's head in the horizontal direction of the global coordinate system. Furthermore, the feature data input to the learning models 30U to 30Y include feature data indicating a feature of the movement of the user's forearm and feature data indicating a feature of the movement of the user's foot.
- control unit 26 inputs four feature data to each of the learning models 30U to 30Y.
- Two of the four pieces of feature data input to the learning models 30U to 30Y are feature data indicating a feature of the movement of the user's head in the vertical direction of the global coordinate system and feature data indicating a feature of the movement of the user's head in the horizontal direction of the global coordinate system.
- These two pieces of feature data are the average value of the angle of the user's head in the vertical direction of the global coordinate system and the angle of the user's head in the horizontal direction of the global coordinate system at the landing timing.
- These two pieces of feature data are obtained from sensor data indicating the movement of the user's head detected by the sensor device 10A.
- the average value of the angles of the user's head in the feature data may be the average value of the angles of the head in the walking cycle of the user.
- One of the four pieces of feature data input to the learning models 30U to 30Y is feature data indicating a feature of the movement of the user's forearm.
- This piece of feature data is the maximum value of the acceleration of the user's forearm in the front-rear direction of the local coordinate system with the position of the sensor device 10B as a reference.
- This one piece of feature data is obtained from sensor data indicating the movement of the user's forearm detected by the sensor device 10B.
- the maximum value of the acceleration of the forearm of the user in the feature data may be the maximum value of the acceleration of the forearm in the walking cycle of the user.
- One of the four pieces of feature data input to the learning models 30U to 30Y is feature data indicating a feature of the movement of the user's foot.
- This piece of feature data is the maximum value of the acceleration of the user's foot in the vertical direction of the local coordinate system with the position of the sensor device 10D as a reference.
- This piece of feature data is obtained from sensor data indicating the movement of the user's foot detected by the sensor device 10D.
- The maximum value of the acceleration of the user's foot in the feature data may be the maximum value of the acceleration of the foot during the user's walking cycle.
- a method of generating a learning model will be described below.
- a gait database of subjects was used to generate the learning model.
- As the walking database of subjects, the data provided in Yoshiyuki Kobayashi, Naoto Hida, Kanako Nakajima, Masahiro Fujimoto, Masaaki Mochimaru, "2019: AIST Walking Database 2019", [Online], [Searched on May 24, 2021], Internet <https://unit.aist.go.jp/harc/ExPART/GDB2019_e.html> were used.
- the gait data of a plurality of subjects are registered in this gait database.
- the subject's gait data was detected by a motion capture system and a floor reaction force meter.
- the subject's gait in the gait database was evaluated by an instructor who instructs gait.
- The instructor evaluated (1) the state of the head, (2) the state of the arms, (3) the state of the trunk, (4) the state of the knees, and (5) the state of the feet of each subject during walking by assigning a numerical score from 1 to 5.
- The instructor rated each subject's walking based on the interpretation of generally good walking form described above.
- Feature data was obtained from the data indicating the subject's movements detected by the motion capture system.
- a dataset was generated by matching the feature data with the scores given by the instructor.
- 980 data sets were generated by using the walking data of 10 steps of 98 subjects.
- a learning model was generated by cross-validation using this dataset. In the cross-validation method, 800 data sets corresponding to 80% of the total 980 data sets were assigned to training data sets for the learning model. Of the 980 data sets, 180 data sets corresponding to 20% of the total were assigned to the data sets for learning model evaluation.
- the 980 datasets were split into 80% training datasets and 20% evaluation datasets in 10 ways. The total number of trials was 846,000.
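The 80%/20% split used in the cross-validation above can be sketched as follows; the shuffling scheme and seed are assumptions, since the text only gives the subset sizes (800 training and 180 evaluation data sets out of 980):

```python
import random

def split_datasets(n_total=980, n_train=800, seed=0):
    """Assign n_train data-set indices to training and the rest to
    evaluation, matching the 800/180 split described above. The
    shuffling scheme and seed are assumptions for illustration."""
    indices = list(range(n_total))
    random.Random(seed).shuffle(indices)
    return indices[:n_train], indices[n_train:]

train_idx, eval_idx = split_datasets()
```

Repeating the split with different shufflings corresponds to the ten ways of dividing the data sets described above.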
- The accuracy and precision of the generated learning models were calculated. The accuracy and precision of a learning model were calculated from the numbers of correct and incorrect estimation results of the learning model. In determining whether an estimation result of the learning model is correct or incorrect, the scores were divided into three grades: scores greater than 3, scores equal to 3, and scores less than 3. A score greater than 3 is also described as "Good". A score equal to 3 is also described as "Average". A score less than 3 is also described as "Poor". An estimation result of the learning model was determined to be correct when the grade ("Good" or "Poor") of the score output by the learning model matched the grade ("Good" or "Poor") of the score given by the instructor.
- the accuracy of the learning model was calculated by dividing the number of correct estimation results by the sum of the number of correct estimation results and the number of incorrect estimation results. For example, the accuracy of the learning model was calculated by Equation (1).
- Accuracy of learning model = CR / (CR + ICR) ... Equation (1)
- CR is the number of correct estimation results.
- ICR is the number of incorrect estimation results.
- The precision of the learning model was calculated by dividing the number of correct estimation results by the number of all estimation results. For example, the precision of the learning model was calculated by Equation (2).
- Precision of learning model = CR / (CR + ICR + NR) ... Equation (2)
- CR is the number of correct estimation results.
- ICR is the number of incorrect estimation results.
- NR is the number of estimation results of the learning model that are neither correct nor incorrect, that is, the number of estimation results with a score of 3 ("Average").
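Equations (1) and (2), together with the three-grade binning described above, can be sketched as follows (treating an estimate graded "Average" as neither correct nor incorrect follows the definition of NR; the function names and example scores are illustrative):

```python
def grade(score):
    """Three-grade binning: greater than 3 is "Good", equal to 3 is
    "Average", less than 3 is "Poor"."""
    if score > 3:
        return "Good"
    if score < 3:
        return "Poor"
    return "Average"

def accuracy_and_precision(estimated, labelled):
    """Accuracy = CR / (CR + ICR), Equation (1);
    precision = CR / (CR + ICR + NR), Equation (2).
    An estimate graded "Average" counts toward NR; otherwise it is
    correct when its grade matches the grade of the instructor's
    score, and incorrect when it does not."""
    cr = icr = nr = 0
    for est, lab in zip(estimated, labelled):
        g = grade(est)
        if g == "Average":
            nr += 1
        elif g == grade(lab):
            cr += 1
        else:
            icr += 1
    return cr / (cr + icr), cr / (cr + icr + nr)

# Hypothetical model scores versus instructor scores.
acc, prec = accuracy_and_precision([4, 2, 3, 5], [5, 4, 3, 4])
```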
- The inventors set the number of pieces of feature data input to the learning model to at least three and, while changing the combination of feature data input to the learning model, searched for combinations of feature data that increase the accuracy and precision of the learning model. The inventors obtained the results shown in FIGS. 7 and 8.
- FIG. 7 shows a graph of the accuracy of the learning models.
- FIG. 8 shows a graph of the precision of the learning models.
- Cases C1 to C5 shown in FIGS. 7 and 8 are the same as Cases C1 to C5 described above with reference to FIG. 6, respectively.
- 7 and 8 show the accuracy and accuracy of the learning model for each of (1) head state to (5) foot state.
- the precision of the learning model for (1) the state of the head was the highest among the precisions of the learning model for each of (1) the state of the head to (5) the state of the foot.
- the accuracy of the learning model for (1) the state of the head was the highest among the accuracies of the learning model for each of (1) the state of the head to (5) the state of the foot.
- the feature data that is input to the learning model is feature data that indicates the feature of the subject's (user's) head movement.
- the movement of the user's head in the vertical and horizontal directions of the global coordinate system reflects the movement of body parts other than the user's head during walking. For example, when a walking user swings an arm or kicks out a leg, the user's body moves in the vertical and horizontal directions of the global coordinate system. When the user's body moves in the vertical and horizontal directions of the global coordinate system, the user's head also moves in the vertical and horizontal directions of the global coordinate system.
- the movement of the user's head in the vertical and horizontal directions of the global coordinate system reflects the movement of body parts other than the user's head during walking. Therefore, if feature data indicating the features of the movement of the user's head in the vertical and horizontal directions of the global coordinate system are used as the feature data input to the learning model, it is presumed that the states of body parts other than the user's head during walking can be evaluated. As described above, in case C1, feature data indicating the features of the movement of the user's head in the vertical and horizontal directions of the global coordinate system are used as the feature data input to the learning model. With such a configuration, it is presumed that, in case C1, the states of body parts other than the user's head could be evaluated with a certain degree of precision.
- in case C2, as the feature data input to the learning model, feature data indicating the features of the movements of more body parts of the user than in case C1 are used.
- in case C2, the same or similar feature data as in case C1, indicating the features of the movement of the user's head in each of the vertical and horizontal directions of the global coordinate system, are used.
- in case C2, in addition to these feature data, feature data indicating the features of the movement of the user's foot are used.
- it is presumed that the precision and accuracy of the learning model became higher because more feature data indicating the features of the movements of the user's body parts are used than in case C1.
- in case C3, as the feature data input to the learning model, feature data indicating the movements of more body parts of the user than in case C1 are used.
- in case C3, the same or similar feature data as in case C1, indicating the features of the movement of the user's head in each of the vertical and horizontal directions of the global coordinate system, are used.
- in case C3, in addition to these feature data, feature data indicating the features of the movement of the user's thigh are used.
- it is presumed that the precision and accuracy of the learning model became higher because more feature data indicating the features of the movements of the user's body parts are used than in case C1.
- in case C4, as the feature data input to the learning model, feature data indicating the movements of more body parts of the user than in case C1 are used.
- in case C4, the same or similar feature data as in case C1, indicating the features of the movement of the user's head in each of the vertical and horizontal directions of the global coordinate system, are used.
- in case C4, in addition to these feature data, feature data indicating the features of the movement of the user's forearm are used.
- it is presumed that the precision and accuracy of the learning model became higher because more feature data indicating the features of the movements of the user's body parts are used than in case C1.
- in case C5, as the feature data input to the learning model, feature data indicating the features of the movements of more body parts of the user than in cases C1 to C4 are used. It is presumed that the precision and accuracy of the learning model became higher because more feature data indicating the features of body-part movements are used in case C5 than in cases C1 to C4.
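The feature-data combinations of cases C1 to C5 described above can be summarized as a small lookup table. The feature labels are illustrative, and the composition of case C5 (head, forearm, and foot) is inferred from the sensor devices that case uses rather than stated at this point in the description.

```python
# Illustrative summary of which feature data each case feeds to its learning model.
CASE_FEATURES = {
    "C1": {"head_vertical", "head_horizontal"},                      # head only
    "C2": {"head_vertical", "head_horizontal", "foot"},              # + foot
    "C3": {"head_vertical", "head_horizontal", "thigh"},             # + thigh
    "C4": {"head_vertical", "head_horizontal", "forearm"},           # + forearm
    "C5": {"head_vertical", "head_horizontal", "forearm", "foot"},   # most parts
}
```

Every case includes the two head-motion features of case C1, and case C5 uses the largest number of body parts, consistent with its higher reported precision and accuracy.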
- FIG. 9 is a flowchart showing operations of evaluation processing executed by the electronic device 20 shown in FIG. This operation corresponds to an example of the information processing method according to this embodiment. For example, when an input instructing execution of walking evaluation is received by the input unit 22, the control unit 26 starts the evaluation process from the process of step S10.
- the control unit 26 receives an input instructing execution of walking evaluation through the input unit 22 (step S10). This input is made from the input unit 22 by the user wearing the sensor device 10.
- the control unit 26 transmits a signal instructing the start of data detection as a broadcast signal to the plurality of sensor devices 10 through the communication unit 21 (step S11). After the process of step S11 is executed, sensor data is transmitted from at least one sensor device 10 to the electronic device 20.
- the control unit 26 receives sensor data from at least one sensor device 10 through the communication unit 21 (step S12).
- the control unit 26 refers to the association shown in FIG. 6, for example, and selects a learning model from among a plurality of learning models according to the type of the sensor device 10 that has transmitted the sensor data to the electronic device 20 (step S13).
- the control unit 26 acquires feature data from the sensor data received in the process of step S12 (step S14).
- the control unit 26 inputs the feature data acquired in the process of step S14 to the learning model selected in the process of step S13, thereby acquiring scores such as those shown in FIG. 5 (step S15). By obtaining the score, the control unit 26 determines the evaluation of the state of the body part corresponding to the score.
- the control unit 26 generates an evaluation signal according to the determined evaluation (step S16).
- the control unit 26 transmits the evaluation signal generated in the process of step S16 to the external device through the communication unit 21 (step S17). After executing the process of step S17, the control unit 26 ends the evaluation process.
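The flow of steps S12 to S16 above can be sketched as follows. The data shapes, the association table keyed by transmitting device types, and the model interface are illustrative assumptions, not the actual API of the electronic device 20.

```python
def evaluate_walk(sensor_data, models, association, extract_features):
    # S13: select a learning model according to the types of the sensor
    # devices that transmitted the sensor data (cf. the FIG. 6 association)
    device_types = frozenset(d["device"] for d in sensor_data)
    model = models[association[device_types]]
    # S14: acquire feature data from the received sensor data
    features = extract_features(sensor_data)
    # S15: input the feature data to obtain a score for the body-part state
    score = model(features)
    # S16: generate an evaluation signal according to the determined evaluation
    signal = "Good" if score > 3 else ("Average" if score == 3 else "Poor")
    return score, signal
```

A usage example with a stub model: `evaluate_walk([{"device": "10A"}], {"C1": lambda f: 4.2}, {frozenset({"10A"}): "C1"}, lambda d: d)` returns the score 4.2 and the signal "Good".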
- the control unit 26 may execute the evaluation process again when the user walks the set number of steps.
- the set number of steps may be input in advance from the input unit 22 by the user.
- the control unit 26 may start from the process of step S11.
- the control unit 26 may repeatedly execute the evaluation process each time the user walks the set number of steps until an input instructing termination of the evaluation process is received from the input unit 22 .
- the input instructing the end of the evaluation process is input from the input unit 22 by the user, for example. For example, when the user finishes walking, the user inputs an instruction to finish the evaluation process from the input unit 22 .
- the control unit 26 can estimate the state of a body part of the user that is different from the body part on which the sensor device 10 is worn, using a learning model.
- the control unit 26 can determine the evaluation of the state of the body part by estimating the state of the body part of the user that is different from the body part on which the sensor device 10 is worn. For example, the control unit 26 selects case C1 as shown in FIG. 6 and can determine an evaluation for each of the states of body parts different from the head on which the sensor device 10A is worn, from (2) the state of the arm to (5) the state of the foot.
- the state of any body part can be estimated without being limited to the body part of the user wearing the sensor device 10 .
- walking has attracted attention as a simple form of exercise.
- a walking user is required to pay attention to obstacles in front of or nearby.
- a walking user may be unable to pay attention to his or her posture because of the need to pay attention to obstacles in front of or near him or her. If the user cannot pay attention to his or her posture while walking, the user may walk with an incorrect posture without realizing it. If the user walks with an incorrect posture, the exercise effect of walking may decrease.
- walking is often a familiar exercise for the user, and it is often difficult for the user to correct his or her posture while walking.
- the control unit 26 can estimate the state of the user's body part. Such a configuration may give the user an opportunity to correct his or her posture while walking. Given that opportunity, the user can walk with a correct posture, and by walking with a correct posture, the exercise effect of walking can be enhanced.
- a novel technique is provided for estimating the state of the user's body part while walking.
- the control unit 26 may estimate the states of more body parts than the number of body parts of the user on which the sensor devices 10 are worn, by using the learning model. For example, it is assumed that a user wears N (N is an integer equal to or greater than 1) sensor devices 10. In this case, the control unit 26 may acquire N pieces of sensor data from the N sensor devices 10 and determine evaluations of the states of N+1 or more body parts of the user.
- the control unit 26 acquires one piece of sensor data from the sensor device 10A.
- the control unit 26 selects case C1 as shown in FIG. 6 and, based on the one piece of acquired sensor data, determines evaluations of the states of two or more body parts, for example, the five body parts from (1) the state of the head to (5) the state of the foot.
- the control unit 26 acquires two pieces of sensor data from each of the sensor device 10A and the sensor device 10D.
- the control unit 26 selects case C2 as shown in FIG. 6 and, based on the two pieces of acquired sensor data, determines evaluations of the states of three or more body parts, for example, the five body parts from (1) the state of the head to (5) the state of the foot.
- the electronic device 20 is highly convenient for the user.
- At least one sensor device 10 included in the information processing system 1 may include a sensor device 10A to be worn on the user's head.
- the control unit 26 may acquire sensor data indicating the movement of the user's head from the sensor device 10A.
- the control unit 26 may select case C1 as shown in FIG. 6. That is, the feature data include feature data indicating the features of the movement of the head in the vertical direction of the user, that is, the vertical direction of the global coordinate system, and feature data indicating the features of the movement of the head in the horizontal direction of the user, that is, the horizontal direction of the global coordinate system.
- even if the only sensor device 10 worn by the user is the sensor device 10A, it is possible to determine an evaluation of the state of the user's body part. In other words, the user only has to wear the sensor device 10A. Therefore, user convenience can be improved. Furthermore, if the sensor device 10A is an earphone or is included in an earphone, the user can easily wear the sensor device 10A on the head. User convenience can be further improved by allowing the user to easily wear the sensor device 10A on the head. In addition, because only the sensor data detected by the sensor device 10A are used, it is not necessary to synchronize the timings at which each of a plurality of sensor devices 10 detects data. Since such synchronization is unnecessary, the evaluation of the state of the user's body part can be determined more easily.
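Case C1 needs only features of head motion in the two directions of the global coordinate system. A minimal sketch of such feature extraction, assuming (hypothetically, since the description does not define the features) that the feature data are the standard deviations of the head's vertical and horizontal acceleration samples:

```python
import statistics

def head_feature_data(vertical_acc, horizontal_acc):
    # Hypothetical case C1 features: variability of head motion in the
    # vertical and horizontal directions of the global coordinate system.
    return {
        "head_vertical": statistics.pstdev(vertical_acc),
        "head_horizontal": statistics.pstdev(horizontal_acc),
    }
```

A perfectly steady head (constant vertical samples) yields a vertical feature of 0.0, while varying horizontal samples yield a positive horizontal feature.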
- At least one sensor device 10 included in the information processing system 1 may include a sensor device 10A worn on the user's head and a sensor device 10D worn on the user's foot.
- the control unit 26 may acquire sensor data indicating movement of the user's head from the sensor device 10A, and may acquire sensor data indicating movement of the user's leg from the sensor device 10D.
- the control unit 26 may select case C2 as shown in FIG. 6. That is, the feature data include feature data indicating the features of head movement in the vertical direction of the global coordinate system, feature data indicating the features of head movement in the horizontal direction of the global coordinate system, and feature data indicating the features of the movement of the foot. As described above with reference to FIGS. 7 and 8, the precision and accuracy in case C2 were higher than in case C1.
- therefore, the control unit 26 can more accurately determine the evaluation of the state of the user's body part.
- the sensor device 10A is an earphone or is included in an earphone, the user can easily wear the sensor device 10A on the head.
- if the sensor device 10D is a shoe-type wearable device, the user can easily wear the sensor device 10D on the foot. Therefore, user convenience can be improved.
- At least one sensor device 10 included in the information processing system 1 may include a sensor device 10A worn on the user's head and a sensor device 10C worn on the user's thigh.
- the control unit 26 may acquire sensor data indicating the movement of the user's head from the sensor device 10A, and may acquire sensor data indicating the movement of the user's thigh from the sensor device 10C.
- the control unit 26 may select case C3 as shown in FIG. 6. That is, the feature data include feature data indicating the features of head movement in the vertical direction of the global coordinate system, feature data indicating the features of head movement in the horizontal direction of the global coordinate system, and feature data indicating the features of the movement of the thigh. As described above with reference to FIGS. 7 and 8, the precision and accuracy in case C3 were higher than in case C1.
- therefore, the control unit 26 can more accurately determine the evaluation of the state of the user's body part. Also, if the sensor device 10A is an earphone or is included in an earphone, the user can easily wear the sensor device 10A on the head. With such a configuration, user convenience can be improved.
- At least one sensor device 10 included in the information processing system 1 may include a sensor device 10A worn on the user's head and a sensor device 10B worn on the user's forearm.
- the control unit 26 may acquire sensor data representing the movement of the user's head from the sensor device 10A, and acquire sensor data representing the motion of the user's forearm from the sensor device 10B.
- the control unit 26 may select case C4 as shown in FIG. 6. That is, the feature data include feature data indicating the features of head movement in the vertical direction of the global coordinate system, feature data indicating the features of head movement in the horizontal direction of the global coordinate system, and feature data indicating the features of the movement of the forearm. As described above with reference to FIGS. 7 and 8, the precision and accuracy in case C4 were higher than in case C1.
- therefore, the control unit 26 can more accurately determine the evaluation of the state of the user's body part.
- the sensor device 10A is an earphone or is included in an earphone, the user can easily wear the sensor device 10A on the head.
- the sensor device 10B is a wristwatch-type wearable device, the user can easily wear the sensor device 10B on the forearm. With such a configuration, user convenience can be improved.
- At least one sensor device 10 included in the information processing system 1 may include a sensor device 10A worn on the user's head, a sensor device 10B worn on the user's forearm, and a sensor device 10D worn on the user's foot.
- the control unit 26 may acquire sensor data indicating the movement of the user's head from the sensor device 10A, sensor data indicating the movement of the user's forearm from the sensor device 10B, and sensor data indicating the movement of the user's foot from the sensor device 10D.
- the control unit 26 may select case C5 as shown in FIG. 6.
- the feature data may include feature data indicating the features of head movement in the vertical direction of the global coordinate system and feature data indicating the features of head movement in the horizontal direction of the global coordinate system. Furthermore, the feature data may include feature data indicating the features of the movement of the forearm and feature data indicating the features of the movement of the foot. As described above with reference to FIGS. 7 and 8, in case C5, the precision and accuracy for each of (1) the state of the head to (5) the state of the foot were higher than in case C1. Therefore, the control unit 26 can more accurately determine the evaluation of the state of the user's body part.
- FIG. 10 is a functional block diagram showing the configuration of an information processing system 101 according to another embodiment of the present disclosure.
- the information processing system 101 includes a sensor device 10, an electronic device 20, and a server 40.
- the server 40 functions as an information processing device and estimates the state of the user's body part.
- the electronic device 20 and the server 40 can communicate via the network 2.
- the network 2 may be any network including mobile communication networks, the Internet, and the like.
- the control unit 26 of the electronic device 20 receives sensor data from the sensor device 10 via the communication unit 21 in the same or similar manner as the information processing system 1 .
- the control unit 26 transmits sensor data to the server 40 via the network 2 using the communication unit 21 .
- the server 40 is, for example, a server belonging to a cloud computing system or other computing system.
- the server 40 has a communication unit 41, a storage unit 42, and a control unit 43.
- the communication unit 41 includes at least one communication module connectable to the network 2.
- the communication module is, for example, a communication module conforming to a standard such as wired LAN (Local Area Network) or wireless LAN.
- the communication unit 41 is connected to the network 2 via a wired LAN or wireless LAN by a communication module.
- the storage unit 42 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or a combination of at least two of them.
- a semiconductor memory is, for example, a RAM or a ROM.
- RAM is, for example, SRAM or DRAM.
- ROM is, for example, EEPROM or the like.
- the storage unit 42 may function as a main storage device, an auxiliary storage device, or a cache memory.
- the storage unit 42 stores data used for the operation of the server 40 and data obtained by the operation of the server 40 .
- the storage unit 42 stores system programs, application programs, embedded software, and the like.
- the storage unit 42 stores the learning model, the associations shown in FIG. 6, and the like.
- the control unit 43 includes at least one processor, at least one dedicated circuit, or a combination thereof.
- a processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for a particular process.
- the dedicated circuit is, for example, FPGA or ASIC.
- the control unit 43 executes processing related to the operation of the server 40 while controlling each unit of the server 40 .
- the control unit 43 receives sensor data from the electronic device 20 via the network 2 using the communication unit 41 .
- the control unit 43 estimates the state of the user's body part based on the sensor data by executing the same or similar processing as the processing by the control unit 26 of the electronic device 20 described above.
- FIG. 11 is a sequence diagram showing operations of evaluation processing executed by the information processing system 101 shown in FIG. This operation corresponds to an example of the information processing method according to this embodiment.
- the electronic device 20 receives an input instructing execution of walking evaluation
- the information processing system 101 starts evaluation processing from step S20.
- the learning model is machine-learned so as to output a score when feature data is input.
- control unit 26 receives an input instructing execution of walking evaluation by the input unit 22 (step S20).
- the control unit 26 transmits a signal instructing the start of data detection as a broadcast signal to the plurality of sensor devices 10 through the communication unit 21 (step S21).
- the control unit 15 of the sensor device 10 receives the signal instructing the start of data detection from the electronic device 20 through the communication unit 11 (step S22). Upon receiving this signal, the control unit 15 starts data detection. The control unit 15 acquires data detected by the sensor unit 12 from the sensor unit 12 and transmits the acquired data as sensor data to the electronic device 20 through the communication unit 11 (step S23).
- control unit 26 receives the sensor data from the sensor device 10 through the communication unit 21 (step S24).
- the control unit 26 transmits the sensor data to the server 40 via the network 2 using the communication unit 21 (step S25).
- the control unit 43 receives the sensor data from the electronic device 20 via the network 2 by the communication unit 41 (step S26).
- the control unit 43 selects, from among the plurality of learning models, a learning model corresponding to the type of the sensor device 10 that has transmitted the sensor data to the server 40 via the electronic device 20 (step S27).
- the control unit 43 acquires feature data from the sensor data received in the process of step S26 (step S28).
- the control unit 43 acquires a score from the learning model by inputting the feature data acquired in the process of step S28 to the learning model selected in the process of step S27 (step S29). By acquiring the score, the control unit 43 determines the evaluation of the state of the part corresponding to the score.
- in the server 40, the control unit 43 generates an evaluation signal according to the determined evaluation (step S30). The control unit 43 transmits the evaluation signal generated in the process of step S30 to the electronic device 20 as an external device via the network 2 by the communication unit 41 (step S31).
- the control unit 26 receives the evaluation signal from the server 40 via the network 2 by the communication unit 21 (step S32).
- the control unit 26 causes the notification unit to notify the content indicated by the evaluation signal (step S33).
- the control unit 26 may cause the output unit 23 to output the content indicated by the evaluation signal.
- the control section 26 may vibrate the vibration section 24 in a vibration pattern according to the evaluation signal.
- the control unit 26 may transmit the evaluation signal to the sensor device 10 through the communication unit 21 and cause the sensor device 10 to notify the content indicated by the evaluation signal.
- the control unit 15 may output the content indicated by the evaluation signal to the output unit 13 as the notification unit.
- the control unit 15 may cause the speaker of the output unit 13 to output the contents indicated by the evaluation signal as voice.
- after executing the process of step S33, the information processing system 101 ends the evaluation process.
- the information processing system 101 may execute the evaluation process again when the user walks the set number of steps.
- the information processing system 101 may start from the process of step S23.
- the information processing system 101 may repeatedly execute the evaluation process each time the user walks the set number of steps until the electronic device 20 receives an input instructing to end the evaluation process from the input unit 22 .
- the information processing system 101 can achieve the same or similar effects as the information processing system 1.
- the learning model may be learned so that when sensor data or feature data is input, information on the state of the user's body part different from the body part on which the sensor device 10 is worn is output.
- the control unit 26 estimates the state of the body part of the user, which is different from the body part on which the sensor device 10 is worn, based on the sensor data and the learning model.
- the feature data input to the learning model are not limited to those described above.
- as the types of feature data indicating the features of the movements of body parts input to the learning model increase, the precision and accuracy of the learning model increase. Therefore, in order to increase the precision and accuracy of the learning model, the number of feature data input to the learning model may be increased by increasing the number of sensor devices 10 worn on the user's body parts.
- the electronic device 20 may be a glasses-type wearable device.
- the output unit 23 may include a projector that projects an image onto lenses of glasses.
- the control unit 26 may cause the output unit 23 to output the evaluation of the determined state of the body part as an image.
- This image may include, for example, an image showing an ideal motion of a body part with a low evaluation among a plurality of body parts of the user.
- the evaluation threshold may be set based on the user's age and gender.
- the content of the user's praise and the content of the advice may be set according to the user's age, gender, and the like.
- the storage unit 25 of the electronic device 20 may store a learning model for each physical data that can distinguish the physical characteristics of multiple users.
- Physical data includes, for example, at least one of age, sex, height and weight.
- the control unit 26 may receive an input indicating physical data of the user from the input unit 22 .
- the control unit 26 may select a learning model corresponding to the received physical data of the user from among the plurality of learning models stored in the storage unit 25 . With such a configuration, it is possible to evaluate the state of the body part that matches the physical data of the individual.
- the storage unit 42 of the server 40 may store a learning model for each physical data.
- the control unit 26 of the electronic device 20 may receive an input indicating physical data of the user from the input unit 22 .
- the control unit 26 may transmit a signal indicating the user's physical data to the server 40 via the network 2 by the communication unit 21.
- the control unit 43 receives the signal indicating the physical data of the user from the electronic device 20 via the network 2 by the communication unit 41 .
- the control unit 43 may select a learning model corresponding to the user's physical data from among the plurality of learning models stored in the storage unit 42, based on the received signal indicating the user's physical data. With such a configuration, it is possible to evaluate the state of the body part in a manner that matches the physical data of the individual.
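A minimal sketch of such a per-physical-data model lookup. The keying by an age band and sex is a hypothetical choice for illustration; the description only says that models may be stored per physical data such as age, sex, height, and weight.

```python
def select_model(models_by_physical_data, age, sex):
    # Hypothetical lookup: one learning model per (age band, sex) pair.
    # The 40-year boundary is illustrative, not from the description.
    band = "under40" if age < 40 else "40plus"
    return models_by_physical_data[(band, sex)]
```

For example, with models keyed `("under40", "F")` and `("40plus", "F")`, a 25-year-old female user is routed to the former and a 60-year-old female user to the latter.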
- control unit 26 of the electronic device 20 or the control unit 43 of the server 40 may calculate the user's body movement rhythm, the user's stride length, and the user's walking speed based on the sensor data.
- the control unit 26 of the electronic device 20 or the control unit 43 of the server 40 may calculate the walking time of the user based on the sensor data.
- the control unit 26 of the electronic device 20 or the control unit 43 of the server 40 may calculate the walking distance of the user based on the sensor data.
- the control unit 26 of the electronic device 20 or the control unit 43 of the server 40 may generate a signal prompting a break or a signal prompting the end of walking when the walking time exceeds a time threshold or the walking distance exceeds a distance threshold.
- the control section 26 may transmit the generated signal to the external device as described above through the communication section 21 .
- the control unit 43 may transmit the generated signal to the electronic device 20 or the sensor device 10 via the network 2 by the communication unit 41 .
- the time threshold may be set based on, for example, the average walking time of a typical user.
- the distance threshold may be set based on, for example, the average distance that a typical user walks at one time.
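The threshold check described above can be sketched as follows; the function name, units, and return values are illustrative assumptions.

```python
def rest_signal(walk_time_s, walk_dist_m, time_threshold_s, dist_threshold_m):
    # Generate a signal prompting a break or the end of walking when the
    # walking time exceeds the time threshold or the walking distance
    # exceeds the distance threshold; otherwise no signal is generated.
    if walk_time_s > time_threshold_s or walk_dist_m > dist_threshold_m:
        return "prompt_break_or_end"
    return None
```

With a one-hour time threshold and a 5 km distance threshold, exceeding either limit produces the signal, while a short walk produces none.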
- the communication unit 21 of the electronic device 20 may include at least one reception module compatible with the satellite positioning system.
- the reception module is, for example, a reception module compatible with GPS (Global Positioning System).
- the receiving module is not limited to this.
- the receiving module may be a receiving module compatible with any satellite positioning system.
- the storage unit 25 may store map data.
- the control unit 26 may acquire the user's location information through the communication unit 21 .
- the control unit 26 may cause the output unit 23 to output the user's location information and the map data.
- the communication unit 11 of the sensor device 10 may further include at least one communication module connectable to the network 2 as shown in FIG.
- the communication module is, for example, a communication module compatible with mobile communication standards such as LTE, 4G, or 5G.
- the control unit 15 of the sensor device 10 may transmit data detected by the sensor device 10 to the server 40 via the network 2 using the communication unit 11 .
- the control unit 26 of the electronic device 20 may estimate the overall state of a combination of two or more body parts of the user.
- the control unit 26 of the electronic device 20 or the control unit 43 of the server 40 may determine an overall evaluation combining two or more of the above-described (1) state of the head, (2) state of the arm, (3) state of the trunk, (4) state of the knee, and (5) state of the foot.
- a general-purpose computer functions as the electronic device 20 according to this embodiment.
- a program describing processing details for realizing each function of the electronic device 20 according to this embodiment is stored in the memory of a general-purpose computer, and the program is read and executed by the processor. Therefore, the configuration according to this embodiment can also be implemented as a program executable by a processor or a non-transitory computer-readable medium that stores the program.
- Reference Signs List 1, 101 information processing system 2 network 10, 10A, 10B, 10C, 10D sensor device 11 communication unit 12 sensor unit 13 output unit 14 storage unit 15 control unit 20 electronic device 21 communication unit 22 input unit 23 output unit 24 vibration unit 25 storage unit 26 control unit 30, 30A to 30E, 30F to 30J, 30K to 30O, 30P to 30T, 30U to 30Y learning model 31 input layer 32 hidden layer 33 hidden layer 34 output layer 40 server 41 communication unit 42 storage unit 43 control unit
Abstract
Description
An information processing device includes a control unit that acquires, from at least one sensor device worn on a body part of a user, sensor data indicating the movement of the body part, and estimates, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn.
An information processing system includes: at least one sensor device worn on a body part of a user; and an information processing device that acquires, from the sensor device, sensor data indicating the movement of the body part and estimates, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn.
An information processing method includes: acquiring, from at least one sensor device worn on a body part of a user, sensor data indicating the movement of the body part; and estimating, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn.
A program causes a computer to execute: acquiring, from at least one sensor device worn on a body part of a user, sensor data indicating the movement of the body part; and estimating, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn.
The information processing system 1 shown in FIG. 1 can estimate the state of any of a plurality of body parts of a user while the user is walking. By using the information processing system 1, the user can grasp whether the state of each of his or her body parts during walking is good or bad.
An evaluation of the state of the head may be determined.
An evaluation of the state of the arms may be determined.
An evaluation of the state of the trunk may be determined.
An evaluation of the state of the knees may be determined.
An evaluation of the state of the feet may be determined.
The control unit 26 determines an evaluation of the state of the user's body part from the sensor data and a learning model. The learning model has been machine-trained so that, when sensor data or feature data is input, it outputs evaluation information for the state of a given body part of the user. That is, the control unit 26 determines the evaluation by inputting the sensor data or feature data into the learning model and obtaining the evaluation information from the learning model. The feature data is data indicating features of the movement of the body part of the user on which the sensor device 10 is worn; the control unit 26 derives it from the sensor data. An example of feature data is described later. Here, the learning model can be machine-trained so that, when sensor data or feature data is input, it outputs evaluation information for the state of a body part different from the one on which the sensor device 10 is worn. This is possible because, in the motion of walking, the user's body parts move while influencing one another. With such a learning model, the control unit 26 can determine evaluations for body parts of the user other than those on which the sensor devices 10 are worn, and can therefore determine evaluations for more body parts than the number of body parts on which sensor devices 10 are worn.
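As a rough sketch of this inference step — with invented names, fabricated sample data, and a hand-written toy rule standing in for the machine-learned model — the flow from sensor data through feature data to an evaluation of a body part the sensor is *not* worn on might look like this:

```python
from statistics import mean, pstdev

def extract_features(accel_z):
    """Reduce a raw vertical-acceleration trace to simple movement features."""
    return {"mean": mean(accel_z), "sway": pstdev(accel_z)}

class StandInModel:
    """Hand-written stand-in for the machine-learned model: scores the state
    of the trunk (1-5) from features of *head* movement."""
    def predict(self, feats):
        # Toy rule: large vertical sway of the head -> poor trunk score.
        return 5 if feats["sway"] < 0.5 else 2

head_accel_z = [0.10, 0.20, 0.10, 0.15, 0.12]  # fabricated head-sensor data
feats = extract_features(head_accel_z)
trunk_score = StandInModel().predict(feats)    # a body part with no sensor
print(trunk_score)
```

A real implementation would replace `StandInModel` with a trained network whose output reflects how the user's body parts influence one another during walking.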
When the control unit 26 has determined an evaluation of the state of the user's body part, it may generate an evaluation signal corresponding to the determined evaluation. When multiple evaluations have been determined, the control unit 26 may generate an evaluation signal corresponding to at least one of them. The evaluation signal may indicate content that praises the user when the determined evaluation is higher than an evaluation threshold, and may indicate advice to the user when the determined evaluation is lower than the threshold. The evaluation threshold may be set based on, for example, the average evaluation of typical users; when a learning model is used, it may be the average score of typical users. The praise and advice content may be set based on interpretations of generally accepted good walking form, as described above.
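A minimal sketch of this evaluation-signal logic, assuming a numeric score scale, an invented threshold value, and invented message wording (none of which come from the publication):

```python
# EVALUATION_THRESHOLD stands in for the average score of typical users;
# the score scale and message wording are invented for this sketch.
EVALUATION_THRESHOLD = 3

def make_evaluation_signal(score, body_part):
    """Return praise above the threshold, advice below it, nothing at it."""
    if score > EVALUATION_THRESHOLD:
        return f"Nice work - your {body_part} moves well while you walk."
    if score < EVALUATION_THRESHOLD:
        return f"Try to keep your {body_part} steadier while you walk."
    return None

print(make_evaluation_signal(5, "head"))
print(make_evaluation_signal(1, "trunk"))
```

In the system described here, such a signal could then be sent to an external device (for example, an earphone) for output to the user.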
The control unit 26 may select, from among a plurality of learning models, the learning model to use for the evaluation determination described above, according to the type of sensor device 10 that transmitted the sensor data to the electronic device 20. To do so, the control unit 26 may refer to a mapping such as that shown in FIG. 6, stored in the storage unit 25.
The control unit 26 may select case C1 when the only sensor device 10 that transmits sensor data to the electronic device 20 is the sensor device 10A. For example, when the sensor device 10A is the only sensor device 10 the user wears, it is the only one that transmits sensor data to the electronic device 20. Alternatively, the control unit 26 may select case C1 when it has selected the sensor device 10A from among the plurality of sensor devices 10 that transmitted sensor data to the electronic device 20.
The control unit 26 may select case C2 when the only sensor devices 10 that transmitted sensor data to the electronic device 20 are the sensor devices 10A and 10D. Alternatively, the control unit 26 may select case C2 when it has selected the sensor devices 10A and 10D from among the plurality of sensor devices 10 that transmitted sensor data to the electronic device 20.
The control unit 26 may select case C3 when the only sensor devices 10 that transmitted sensor data to the electronic device 20 are the sensor devices 10A and 10C. Alternatively, the control unit 26 may select case C3 when it has selected the sensor devices 10A and 10C from among the plurality of sensor devices 10 that transmitted sensor data to the electronic device 20.
The control unit 26 may select case C4 when the only sensor devices 10 that transmitted sensor data to the electronic device 20 are the sensor devices 10A and 10B. Alternatively, the control unit 26 may select case C4 when it has selected the sensor devices 10A and 10B from among the plurality of sensor devices 10 that transmitted sensor data to the electronic device 20.
The control unit 26 may select case C5 when the only sensor devices 10 that transmitted sensor data to the electronic device 20 are the sensor devices 10A, 10B, and 10D. Alternatively, the control unit 26 may select case C5 when it has selected the sensor devices 10A, 10B, and 10D from among the plurality of sensor devices 10 that transmitted sensor data to the electronic device 20.
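The case selection described above behaves like a lookup from the set of transmitting sensor devices to a learning-model case. Below is a minimal, hypothetical Python sketch of that selection; the device names and case labels follow the text, while the dictionary itself is an assumption about how a mapping like FIG. 6 might be encoded:

```python
# Assumed encoding of the FIG. 6 mapping: the set of sensor devices that
# transmitted data determines which learning-model case (C1-C5) to use.
CASE_BY_DEVICES = {
    frozenset({"10A"}): "C1",
    frozenset({"10A", "10D"}): "C2",
    frozenset({"10A", "10C"}): "C3",
    frozenset({"10A", "10B"}): "C4",
    frozenset({"10A", "10B", "10D"}): "C5",
}

def select_case(devices):
    """Return the learning-model case for the transmitting sensor devices,
    or None if the combination has no entry in the table."""
    return CASE_BY_DEVICES.get(frozenset(devices))

print(select_case(["10A", "10D"]))  # → C2
```

Using a `frozenset` key makes the lookup independent of the order in which the devices happened to transmit.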
The method of generating the learning models is described below. A gait database of test subjects was used to generate the learning models. As the gait database, the data provided in Yoshiyuki Kobayashi, Naoto Hida, Kanako Nakajima, Masahiro Fujimoto, and Masaaki Mochimaru, "2019: AIST Gait Database 2019", [Online], [retrieved May 24, 2021], Internet <https://unit.aist.go.jp/harc/ExPART/GDB2019_e.html> was used. This gait database contains gait data for a plurality of subjects, recorded with a motion capture system and force plates.
Accuracy of the learning model = CR / (CR + ICR)   Equation (1)
In Equation (1), CR is the number of estimation results that are correct, and ICR is the number of estimation results that are incorrect.
Certainty of the learning model = CR / (CR + ICR + NR)   Equation (2)
In Equation (2), CR is the number of estimation results that are correct, ICR is the number of estimation results that are incorrect, and NR is the number of estimation results of the learning model that are neither correct nor incorrect, that is, the number of results for which the learning model's score was 3.
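Equations (1) and (2) can be computed directly. A small sketch (the function names are mine, not the publication's) — note that because certainty also counts the "neither" results NR in its denominator, certainty can never exceed accuracy:

```python
def model_accuracy(cr, icr):
    """Equation (1): correct results over correct plus incorrect results."""
    return cr / (cr + icr)

def model_certainty(cr, icr, nr):
    """Equation (2): correct results over all results, including the NR
    results that were neither correct nor incorrect (score of 3)."""
    return cr / (cr + icr + nr)

print(model_accuracy(80, 20))       # → 0.8
print(model_certainty(80, 20, 25))  # → 0.64
```

With NR = 0, the two metrics coincide, which matches the definitions term by term.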
FIG. 9 is a flowchart showing the operation of the evaluation process executed by the electronic device 20 shown in FIG. 1. This operation corresponds to an example of the information processing method according to this embodiment. When the control unit 26 receives, via the input unit 22, an input instructing execution of a gait evaluation, for example, it starts the evaluation process from step S10.
FIG. 10 is a functional block diagram showing the configuration of an information processing system 101 according to another embodiment of the present disclosure.
FIG. 11 is a sequence diagram showing the operation of the evaluation process executed by the information processing system 101 shown in FIG. 10. This operation corresponds to an example of the information processing method according to this embodiment. When the electronic device 20 receives an input instructing execution of a gait evaluation, the information processing system 101 starts the evaluation process from step S20. In the following, the learning model is assumed to have been machine-trained to output a score when feature data is input.
2 network
10, 10A, 10B, 10C, 10D sensor device
11 communication unit
12 sensor unit
13 output unit
14 storage unit
15 control unit
20 electronic device
21 communication unit
22 input unit
23 output unit
24 vibration unit
25 storage unit
26 control unit
30, 30A to 30E, 30F to 30J, 30K to 30O, 30P to 30T, 30U to 30Y learning model
31 input layer
32 hidden layer
33 hidden layer
34 output layer
40 server
41 communication unit
42 storage unit
43 control unit
Claims (18)
- An information processing device comprising a control unit that acquires, from at least one sensor device worn on a body part of a user, sensor data indicating movement of the body part, and
estimates, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn. - The information processing device according to claim 1, wherein the control unit uses the learning model to estimate the states of more body parts than the number of body parts of the user on which the sensor devices are worn.
- The information processing device according to claim 1 or 2, wherein the learning model outputs, when the sensor data is input, evaluation information for the state of a body part of the user different from the body part on which the sensor device is worn, and
the control unit, as estimating the state of the user's body part, determines an evaluation of the state of the user's body part by obtaining the evaluation information from the learning model. - The information processing device according to claim 1 or 2, wherein the learning model outputs, when feature data indicating features of the movement of the body part is input, evaluation information for the state of a body part of the user different from the body part on which the sensor device is worn, and
the control unit
obtains the feature data from the sensor data, and,
as estimating the state of the user's body part, determines an evaluation of the state of the user's body part by obtaining the evaluation information from the learning model. - The information processing device according to claim 4, wherein the at least one sensor device includes a sensor device to be worn on the user's head,
the control unit acquires the sensor data indicating movement of the user's head, and
the feature data includes feature data indicating features of the head movement in at least one of the user's front-back, left-right, and up-down directions. - The information processing device according to claim 5, wherein the feature data includes feature data indicating features of the head movement in the user's up-down direction and feature data indicating features of the head movement in the user's left-right direction.
- The information processing device according to claim 6, wherein the at least one sensor device further includes a sensor device to be worn on the user's foot,
the control unit further acquires the sensor data indicating movement of the user's foot, and
the feature data further includes feature data indicating features of the foot movement. - The information processing device according to claim 6, wherein the at least one sensor device further includes a sensor device to be worn on the user's thigh,
the control unit further acquires the sensor data indicating movement of the user's thigh, and
the feature data further includes feature data indicating features of the thigh movement. - The information processing device according to claim 6, wherein the at least one sensor device further includes a sensor device to be worn on the user's forearm,
the control unit further acquires the sensor data indicating movement of the user's forearm, and
the feature data further includes feature data indicating features of the forearm movement. - The information processing device according to claim 6, wherein the at least one sensor device further includes a sensor device to be worn on the user's forearm and a sensor device to be worn on the user's foot,
the control unit further acquires the sensor data indicating movement of the user's forearm and the sensor data indicating movement of the user's foot, and
the feature data further includes feature data indicating features of the forearm movement and feature data indicating features of the foot movement. - The information processing device according to any one of claims 3 to 10, wherein the control unit determines an evaluation of at least one of the state of the user's head, the state of the arms, the state of the trunk, the state of the knees, and the state of the feet.
- The information processing device according to any one of claims 3 to 11, wherein the control unit selects, from among a plurality of the learning models, the learning model from which the evaluation information is obtained, according to the type of the sensor device that transmitted the sensor data to the information processing device.
- The information processing device according to any one of claims 3 to 12, further comprising a communication unit,
wherein the control unit
generates an evaluation signal corresponding to at least one of the evaluations, and
transmits the generated evaluation signal to an external device via the communication unit. - The information processing device according to claim 13, wherein the external device is an earphone.
- An electronic device comprising a notification unit that reports information on the state of the body part estimated by the information processing device according to any one of claims 1 to 14.
- An information processing system including: at least one sensor device worn on a body part of a user; and
an information processing device that acquires, from the sensor device, sensor data indicating movement of the body part, and estimates, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn. - An information processing method including: acquiring, from at least one sensor device worn on a body part of a user, sensor data indicating movement of the body part; and
estimating, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn. - A program for causing a computer to execute:
acquiring, from at least one sensor device worn on a body part of a user, sensor data indicating movement of the body part; and
estimating, from the acquired sensor data and a learning model, the state of a body part of the user different from the body part on which the sensor device is worn.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22811369.2A EP4349256A1 (en) | 2021-05-28 | 2022-05-25 | Information processing device, electronic equipment, information processing system, information processing method, and program |
JP2023524219A JPWO2022250099A1 (ja) | 2021-05-28 | 2022-05-25 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-090695 | 2021-05-28 | ||
JP2021090695 | 2021-05-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022250099A1 true WO2022250099A1 (ja) | 2022-12-01 |
Family
ID=84229884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/021462 WO2022250099A1 (ja) | 2021-05-28 | 2022-05-25 | 情報処理装置、電子機器、情報処理システム、情報処理方法及びプログラム |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4349256A1 (ja) |
JP (1) | JPWO2022250099A1 (ja) |
WO (1) | WO2022250099A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2015058167A (ja) * | 2013-09-19 | 2015-03-30 | Casio Computer Co., Ltd. | Exercise support device, exercise support method, and exercise support program |
- JP2016150193A (ja) | 2015-02-19 | 2016-08-22 | Kochi Prefectural University Corporation | Motion analysis device |
- WO2017026148A1 (ja) * | 2015-08-12 | 2017-02-16 | Sony Corporation | Information processing device, information processing method, and program |
- WO2019203188A1 (ja) * | 2018-04-17 | 2019-10-24 | Sony Corporation | Program, information processing device, and information processing method |
- JP6741892B1 (ja) * | 2020-02-28 | 2020-08-19 | Mitsubishi Chemical Holdings Corporation | Measurement system, method, and program |
- JP2020151267A (ja) * | 2019-03-20 | 2020-09-24 | Toyota Motor Corporation | System for estimating body motion of person or the like |
Non-Patent Citations (2)
Title |
---|
ASICS INSTITUTE OF SPORT SCIENCE: "The Ultimate Walk", KODANSHA GENDAI SHINSHO, September 2019 (2019-09-01), pages 92 |
YOSHIYUKI KOBAYASHI, NAOTO HIDA, KANAKO NAKAJIMA, MASAHIRO FUJIMOTO, MASAAKI MOCHIMARU, AIST GAIT DATABASE 2019, 2019, Retrieved from the Internet <URL:https://unit.aist.go.jp/harc/ExPART/GDB2019_e.html>
Also Published As
Publication number | Publication date |
---|---|
EP4349256A1 (en) | 2024-04-10 |
JPWO2022250099A1 (ja) | 2022-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- TWI614514B (zh) | Method and apparatus for use in a mobile device, mobile device, and non-transitory computer-readable medium | |
- CN105688396B (zh) | Exercise information display system and exercise information display method | |
- CN107617201B (zh) | Method for automatically configuring sensors, electronic device, and recording medium | |
US10271790B2 (en) | Methods and systems for training proper gait of a user | |
- WO2016006479A1 (ja) | Activity amount measuring device, activity amount measuring method, and activity amount measuring program | |
US20230263419A1 (en) | Method performed by an electronics arrangement for a wearable article | |
- JP2020146344A (ja) | State detection device, state detection method, and program | |
- WO2022250099A1 (ja) | Information processing device, electronic equipment, information processing system, information processing method, and program | |
US20200001159A1 (en) | Information processing apparatus, information processing method, and program | |
- WO2023042868A1 (ja) | Information processing device, electronic equipment, information processing system, information processing method, and program | |
US20220183591A1 (en) | Biomechanical modelling of motion measurements | |
- WO2023106382A1 (ja) | Information processing device, electronic equipment, information processing system, information processing method, and program | |
- WO2022250098A1 (ja) | Information processing device, electronic equipment, information processing system, information processing method, and program | |
WO2021094777A1 (en) | Method and electronics arrangement for a wearable article | |
KR20210040671A (ko) | Apparatus for estimating a dynamically changing center-of-gravity trajectory of the human body, and method therefor | |
CN114344873B (zh) | Pre-swing action type recognition method, system and device | |
US20240130691A1 (en) | Measurement device, measurement system, measurement method, and recording medium | |
WO2023163104A1 (ja) | Joint angle learning estimation system, joint angle learning system, joint angle estimation device, joint angle learning method, and computer program | |
US20230263420A1 (en) | Electronics arrangement for a wearable article | |
CN115587282 (zh) | Landing feature analysis method, system and device | |
CN115587283 (zh) | Leg extension action analysis method, system and device | |
CN115530807 (zh) | Abdominal extension action analysis method, system and device | |
CN115153520 (zh) | Enhanced motion recognition method and device based on a wearable device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22811369 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023524219 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18564941 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022811369 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022811369 Country of ref document: EP Effective date: 20240102 |