KR101722131B1 - Posture and Space Recognition System of a Human Body Using Multimodal Sensors - Google Patents


Info

Publication number
KR101722131B1
KR101722131B1 (application KR1020150165438A; publication KR20150165438A)
Authority
KR
South Korea
Prior art keywords
information
sensor
sensor unit
resident
unit
Prior art date
Application number
KR1020150165438A
Other languages
Korean (ko)
Inventor
차주헌
Original Assignee
국민대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 국민대학교 산학협력단 filed Critical 국민대학교 산학협력단
Priority to KR1020150165438A
Application granted
Publication of KR101722131B1

Classifications

    • G06K9/00335
    • G06K9/00369
    • G06K9/6292
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/35 Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The posture and space recognition system using multi-mode sensors of the present invention comprises: a tilt sensor unit, configured as a set of two tilt sensors, which generates angle information about the rotation directions of the two-dimensional normal vector (x, y) of a resident in the space and transmits it to a server; an altimeter sensor unit which generates elevation information of the resident and transmits it to the server; a server which analyzes the information received from the sensor unit to determine whether the resident is standing, sitting, or lying, and provides information on the determination result; and a network connecting the sensor unit and the server.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a posture and space recognition system using multimodal sensors.

[0002] More specifically, the present invention relates to a method and apparatus for collecting the postures and behavior patterns of a resident living in a space using a plurality of sensors, and for grasping and providing the resident's posture and movement. In general, the posture and behavior pattern of a resident in a space can be grasped using a tilt sensor and an altimeter sensor. The tilt sensor and the altimeter sensor can be used to accurately distinguish the occupant's standing, sitting, and lying positions, and, by applying the differences between them, to distinguish among the living room, bedroom, kitchen, and bathroom.

The prior art related to the present invention is disclosed in Korean Patent No. 10-0951890 (published on Apr. 12, 2010). FIG. 1 is a block diagram of a conventional real-time object recognition and attitude estimation system with situation monitoring. As shown in FIG. 1, the conventional system includes an image acquisition unit 100, a real-time environment monitoring unit 200, a multiple evidence extraction unit 300, an evidence selection and collection unit 400, and a probabilistic information fusion unit 500. The image acquisition unit 100 acquires two-dimensional and three-dimensional image information by continuously photographing, in real time, a scene containing the object to be recognized and its actual surrounding environment. The real-time environment monitoring unit 200 receives the two-dimensional and three-dimensional image information from the image acquisition unit 100 and calculates and collects real-time environment information. The multiple evidence extraction unit 300 receives real-time image information from the image acquisition unit 100, extracts various kinds of evidence, and compares the extracted evidence with model information to generate candidate positions and attitudes for various objects. At the same time, the multiple evidence extraction unit 300 receives the evidence selected by the evidence selection and collection unit 400 and calibrates the generated object positions and attitudes.
The evidence selection and collection unit 400 uses the real-time environment information from the real-time environment monitoring unit 200, together with the model information, to select the evidence most suitable for the object and environment. The probabilistic information fusion unit 500 estimates the candidate positions and attitudes generated by the multiple evidence extraction unit 300 through particle filtering, and determines the position and attitude of the object based on an arbitrary distribution of particles.

The conventional real-time object recognition and attitude estimation system described above estimates the position and attitude of a robot, and therefore cannot be applied to a resident moving about a living space. Moreover, because the conventional art extracts and evaluates image information, it is expensive and has a high error probability. SUMMARY OF THE INVENTION Accordingly, the present invention has been made in view of the above problems, and an object of the present invention is to provide a posture and space recognition system using multi-mode sensors. Another object of the present invention is to reduce construction cost by determining the posture of a space occupant from information received from simple sensors. A further object is to accurately determine the occupant's posture in the space by applying experimentally obtained reference information for each posture, such as standing, sitting, and lying.

To achieve the above objects, the posture and space recognition system using multi-mode sensors of the present invention comprises: a sensor unit including a tilt sensor unit, configured as a set of two tilt sensors, which generates angle information about the rotation directions of the two-dimensional normal vector (x, y) of a resident in the space and transmits it to a server, and an altimeter sensor unit which generates elevation information of the resident and transmits it to the server; a server which analyzes the information received from the sensor unit to determine standing, sitting, and lying positions and provides information on the determination result; and a network connecting the sensor unit and the server.

The posture and space recognition system using multi-mode sensors of the present invention, configured as described above, can accurately determine the posture of a resident in the space. Another effect of the present invention is that, when the resident is an elderly person, the resident's dynamics, such as a prolonged absence of movement or departure from the bed, can be grasped by judging the posture information in real time.

FIG. 1 is a block diagram of a conventional real-time object recognition and attitude estimation system with situation monitoring;
FIG. 2 is a view showing the overall configuration of the posture and space recognition system using multi-mode sensors according to the present invention;
FIG. 3 is a detailed block diagram of the sensor unit according to the present invention;
FIG. 4 is a detailed configuration diagram of the server to which the present invention is applied;
FIG. 5 is a control flowchart of the posture and space recognition method using multi-mode sensors according to the present invention;
FIG. 6 is a configuration diagram of the lying posture as applied to the present invention;
FIG. 7 is a graph showing altitude data when the height of a resident's chest is 20 cm to 100 cm above the floor surface of the space;
FIG. 8 is a graph showing the output voltage of the tilt sensor according to the tilt of a resident in the space;
FIG. 9 is a photograph showing the place where the experiment of the present invention was performed and the measurement targets;
FIG. 10 is a graph showing the rotation angle of the normal vector obtained from the tilt sensor and the output value of the altimeter sensor when changing from a sitting to a lying position on the floor;
FIG. 11 is a graph showing the rotation angle of the normal vector obtained from the tilt sensor and the output value of the altimeter sensor when changing from a sitting to a lying position in bed;
FIG. 12 is a graph showing the output values of the altimeter sensor and the tilt sensor when the occupant sits on a chair from a standing position;
FIG. 13 is a graph showing the output values of the tilt sensor and the altimeter sensor when the occupant sits on a sofa from a standing position;
FIG. 14 is a graph showing the output values of the altimeter sensor and the tilt sensor when the occupant washes their face at the sink.

The posture and space recognition system using multi-mode sensors according to the present invention will now be described with reference to FIGS. 2 to 14 and Tables 1 to 3.

FIG. 2 is a diagram showing the overall configuration of the posture and space recognition system using multi-mode sensors according to the present invention. As shown in FIG. 2, the system comprises: a sensor unit 10 including a tilt sensor unit, configured as a set of two tilt sensors, which generates angle information about the rotation directions of the two-dimensional normal vector (x, y) of a resident in the space, and an altimeter sensor unit which generates elevation information of the resident, both transmitting their information to the server; a server 20 which analyzes the information received from the sensor unit to determine standing, sitting, and lying positions and provides information on the determination result; and a wired/wireless network 30 connecting the sensor unit and the server. In the example above, the angle information about the rotation directions of the two-dimensional normal vector (x, y) is generated by the tilt sensors and transmitted to the server; however, the angle information may instead be obtained and transmitted using a gyro sensor.

FIG. 3 is a detailed configuration diagram of the sensor unit applied to the present invention. As shown in FIG. 3, the sensor unit comprises: a tilt sensor unit 12 which generates angle information about the rotation directions of the two-dimensional normal vector (x, y) of a resident in the space and transmits it to the MCU; an altimeter sensor unit 14 which generates the resident's elevation information and transmits it to the MCU; an MCU 16 which converts the analog signals received from the tilt sensor unit and the altimeter sensor unit into digital signals, calculates output values, and transmits them to the server through a transmission/reception module; and a transmission/reception module 18 which transmits the output values to the server under the control of the MCU.

FIG. 4 is a detailed configuration diagram of the server applied to the present invention. As shown in FIG. 4, the server 20 comprises: a transmission/reception unit 22 which receives the tilt information and altitude information from the sensor unit 10; a control unit 25 which compares the received tilt information and altitude information with the reference information stored in the memory unit to determine whether the resident in the space is standing, sitting, or lying; a memory unit 24 which stores reference tilt information and altitude information for postures such as standing, sitting, and lying; and a display unit 28 which provides the posture information determined by the control unit.

FIG. 5 is a control flowchart of the posture and space recognition method using multi-mode sensors according to the present invention. As shown in FIG. 5, the method comprises: a step S11 in which the MCU receives the sensor information sensed by the sensor unit in real time; a step S12 in which the MCU converts the received sensor information into a digital signal and transmits it to the server; a step S13 in which the server stores the received sensor information; a step S14 in which the server compares the received posture information with the reference posture information to judge changes in the resident's posture; and a step S15 in which the determined posture information is provided through the display unit. Here, the reference posture information consists of tilt information and altitude information for various postures such as standing, sitting, lying, and washing.
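Assuming Python purely for illustration, the S11-S15 flow above can be sketched as follows. The function names, the 10-bit ADC model, and the toy classification rule passed in by the caller are assumptions, not part of the patent.

```python
def adc_to_digital(analog_v, v_ref=5.0, levels=1023):
    """Step S12: convert an analog sensor voltage into a digital value (assumed 10-bit ADC)."""
    return round(analog_v / v_ref * levels)

def server_loop(samples, classify, display):
    """Steps S11 and S13-S15: receive, store, compare with reference info, display changes."""
    stored = []                         # step S13: the server stores received sensor information
    last = None
    for analog in samples:              # step S11: sensor information received in real time
        value = adc_to_digital(analog)  # step S12: analog-to-digital conversion in the MCU
        stored.append(value)
        posture = classify(value)       # step S14: compare against reference posture information
        if posture != last:             # report only posture *changes*, as in step S14
            display(posture)
            last = posture
    return stored
```

For example, feeding the loop a toy rule such as `lambda v: "lying" if v < 300 else "standing"` reports each posture once when it changes, rather than on every sample.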

FIG. 6 is a configuration diagram of the lying posture as applied to the present invention. As shown in FIG. 6, the normal vector direction of the median plane in the lying posture is defined as the x axis, and the normal direction of the plane perpendicular to it as the y axis; each takes a value in the range of -90° to +90° according to the direction of rotation about its axis. Using the altimeter sensor, the altitude can be calculated, and the three movements of standing, sitting, and lying are defined based on the attachment position of the sensor (the chest area).

FIG. 7 is a graph showing altitude data when the height of a resident's chest, lying down, is 20 cm to 100 cm above the floor surface. FIG. 7 shows the output value of the altimeter sensor when the resident lies on the floor of the space: as the height of the lying posture changes from 20 cm to 100 cm, the output value increases continuously, though not in direct proportion. Therefore, the height of the occupant in the space can be determined from the output data of the altimeter sensor.
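Since the altimeter output is described as increasing monotonically, though not proportionally, with the lying height, the height can be recovered by inverting a calibration curve. In the sketch below the calibration pairs are invented for illustration; only the monotone-inversion idea comes from the text.

```python
# Hypothetical calibration pairs (sensor output in volts, chest height in cm).
CALIBRATION = [(0.8, 20), (1.1, 40), (1.5, 60), (2.0, 80), (2.7, 100)]

def height_from_output(v):
    """Piecewise-linear inversion of the (assumed) monotone calibration curve."""
    if v <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    # Walk consecutive calibration segments and interpolate within the matching one.
    for (v0, h0), (v1, h1) in zip(CALIBRATION, CALIBRATION[1:]):
        if v <= v1:
            return h0 + (h1 - h0) * (v - v0) / (v1 - v0)
    return CALIBRATION[-1][1]
```

Because the curve is monotone, this lookup is well defined even though the relationship is not linear.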

FIG. 8 is a graph showing the output voltage of the tilt sensor according to the tilt of a resident in the space. FIG. 8 shows the output value of the tilt sensor when the occupant tilts the upper body through slopes of -90° to +90°. In FIG. 8, the output of the tilt sensor somewhat lacks linearity, but it can be interpolated with a quadratic curve. The interpolated output value of the tilt sensor can be expressed by the following Equation (1).

Figure 112015115061688-pat00001

Here, y represents the output voltage of the tilt sensor and x represents the tilt angle of the occupant.

Therefore, the rotation angle of the normal vector of a resident in the space can be obtained from the output value of the tilt sensor, and since the rotation angle of the normal vector differs for each posture, as shown in Table 1 below, the posture of the occupant can be determined from it.
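Equation (1) is only available as an image in this copy, so its coefficients are unknown. The sketch below illustrates the stated idea of interpolating the tilt output with a quadratic curve, fitting y = a·x² + b·x + c through three calibration points; the calibration values themselves are invented.

```python
def fit_quadratic(p0, p1, p2):
    """Solve for (a, b, c) of y = a*x**2 + b*x + c through three points (Newton divided differences)."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    d1 = (y1 - y0) / (x1 - x0)
    d2 = (y2 - y1) / (x2 - x1)
    a = (d2 - d1) / (x2 - x0)
    b = d1 - a * (x0 + x1)
    c = y0 - a * x0 * x0 - b * x0
    return a, b, c

# Hypothetical calibration: tilt sensor output voltage at -90°, 0°, and +90°.
a, b, c = fit_quadratic((-90, 1.0), (0, 2.5), (90, 4.5))

def tilt_voltage(angle):
    """Interpolated output voltage for a given tilt angle, per the quadratic model."""
    return a * angle**2 + b * angle + c
```

In practice the fitted curve would be inverted, mapping a measured output voltage back to the occupant's normal vector rotation angle.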

FIG. 9 is a photograph showing the place where the experiment of the present invention was performed: a tilt sensor and an altimeter sensor were attached to a resident, and measurements were taken for one hour at a bed, a toilet, a sofa, and a chair.

FIG. 10 is a graph showing the rotation angle of the normal vector obtained from the tilt sensor and the output value of the altimeter sensor when changing from a sitting to a lying position on the floor. FIG. 10 shows the changes in the output values of the tilt sensor and the altimeter sensor as the occupant changes from sitting to lying on the floor. Upon receiving these output values, the occupant's posture on the floor can be judged in real time using Table 1. That is, if the x-axis rotation angle from the tilt sensor is within -10° to +10°, the y-axis rotation angle is within -20° to +20°, and the output value of the altimeter sensor is within 0 to 40 cm, the posture is judged to be 'lying'. In FIG. 10, the horizontal axis represents time. It can also be seen that the altimeter sensor output value at the floor is close to 20 cm and drops to 10 cm.

FIG. 11 is a graph showing the rotation angle of the normal vector obtained from the tilt sensor and the output value of the altimeter sensor when changing from a sitting to a lying position in bed. FIG. 11 shows these output values as applied to the present invention; it can be seen that the data output value of the altimeter sensor falls from 30 cm to 20 cm in bed. Thus, as can be seen from FIGS. 10 and 11, data that can distinguish the floor from the bed can be obtained through the output value of the altimeter sensor.

FIG. 12 is a graph showing the output values of the altimeter sensor and the tilt sensor when the occupant sits on a chair from a standing position. FIG. 12 shows the posture in which the occupant sits on the chair from a standing state; the output value of the altimeter sensor is lowered by the height difference that arises when the standing occupant sits down, giving an output of about 30 cm. In addition, the tilt data for sitting on the chair corresponds to an angle of about 90° - 15° = 75°, which falls within the x-axis angle range of +70° to +110° defined for sitting in Table 1.

FIG. 13 is a graph showing the output values of the tilt sensor and the altimeter sensor when the occupant sits on a sofa from a standing position. The output value of the altimeter sensor shows about 30 cm due to the height difference that arises when the standing occupant sits on the sofa, and the tilt data of the sitting posture on the sofa corresponds to about 90° - 20° = 70°, which is close to the +20° to +60° x-axis angle range defined in Table 1 for sitting while leaning back.

Therefore, as can be seen from FIGS. 12 and 13, it can be confirmed from the output value of the altimeter sensor that the occupant is on a chair or a sofa, and the data of the tilt sensor can be used to determine whether the occupant is currently on a chair (study) or a sofa (living room); in this way, it is possible to infer where in the house the occupant is currently located.

FIG. 14 is a graph showing the output values of the altimeter sensor and the tilt sensor when the occupant washes their face at the sink. Referring to FIG. 14, when the face is washed at the sink, the data value of the altimeter sensor goes down from 40 cm to 30 cm and then returns. That is, it can be judged that, while the occupant was standing, the upper body was lowered for a certain period and then raised again. Also, the data of the tilt sensor stays between +50° and +90°, showing that the upper body is bent by about +50° for a certain time. Such repeated chest-bending behavior can be taken to indicate that the occupant is currently washing at the sink.
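The washing pattern described above, with the altimeter reading dipping from about 40 cm to 30 cm and then returning, amounts to detecting a dip-and-recover sequence in the altitude series. A minimal sketch, with thresholds and function name assumed for illustration:

```python
def detect_dip(altitudes_cm, high=38, low=32):
    """Return True if the series goes high -> low -> high (a washing-like dip)."""
    state = "high"
    saw_dip = False
    for a in altitudes_cm:
        if state == "high" and a <= low:
            state = "low"               # upper body lowered toward the sink
        elif state == "low" and a >= high:
            saw_dip = True              # upper body raised again: one full dip
            state = "high"
    return saw_dip
```

Combined with a check that the tilt angle stays bent (around +50°) during the low phase, repeated dips would mark the washing behavior.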

As described above, according to the present invention, the posture and behavior information of a resident in a residential space can be grasped and recognized using a two-dimensional tilt sensor and an altimeter sensor: by comparing the output values of the tilt sensor and the altimeter sensor with the reference information, the occupant's posture and behavior pattern can be determined.

Table 1 below shows the normal vector rotation angle information and height information for each posture, which is stored in the memory unit according to the present invention. It defines the three basic postures together with the tilt values of the two axes and the height data values; the tilt value and height value defined for each posture can be used as reference information for determining that posture.

 [Table 1]

Figure 112015115061688-pat00002
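Table 1 itself appears only as an image here, so its exact entries cannot be reproduced. The sketch below reconstructs a plausible lookup from the ranges quoted elsewhere in the description (lying: x -10° to +10°, y -20° to +20°, 0 to 40 cm; sitting: x +70° to +110°, y -20° to +20°, 40 to 60 cm); the standing row is purely an assumed placeholder.

```python
# Reconstructed reference table; the "standing" altitude range is an assumption.
TABLE1 = {
    "lying":    {"x": (-10, 10), "y": (-20, 20), "alt_cm": (0, 40)},
    "sitting":  {"x": (70, 110), "y": (-20, 20), "alt_cm": (40, 60)},
    "standing": {"x": (70, 110), "y": (-20, 20), "alt_cm": (60, 200)},
}

def lookup_posture(x_angle, y_angle, alt_cm):
    """Return the first posture whose x, y, and altitude ranges all match."""
    for posture, r in TABLE1.items():
        if (r["x"][0] <= x_angle <= r["x"][1]
                and r["y"][0] <= y_angle <= r["y"][1]
                and r["alt_cm"][0] <= alt_cm <= r["alt_cm"][1]):
            return posture
    return "unknown"
```

Note that sitting and standing share the same tilt ranges in this reconstruction and are separated only by altitude, which matches how the chair and sofa examples above rely on the altimeter reading.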

Table 2 below is a behavior information table for each space. To recognize the living space of a resident, it associates specific activities with specific places, defines the actions possible in each place, and defines the posture corresponding to each action in each space where the resident lives; from the defined postures, the space where the resident is located can be inferred.

 [Table 2]

Figure 112015115061688-pat00003

Table 3 below is a behavior information table for each posture as applied to the present invention, summarizing the postures that can appear in each place. Table 3 shows that the same behavior can occur at different places: sitting, sleeping, and leaning can occur similarly in both the living room and the bedroom. Likewise, the expected posture for different behaviors may be similar: studying, eating, and using the toilet can be viewed as the same posture occurring in different places.

 [Table 3]

Figure 112015115061688-pat00004

For example, if sitting is defined as a static posture, it is a posture that can occur in the living room, bedroom, kitchen, and bathroom, and the occupant may sit on the floor, on a chair, or on a sofa. The x-axis output value of the tilt sensor for sitting is +70° to +110°, the y-axis output value is within -20° to +20°, and the output value of the altimeter sensor is in the range of 40 cm to 60 cm.

10: sensor unit, 12: tilt sensor unit,
14: altimeter sensor unit, 20: server,
24: memory unit, 28: display unit,
30: wired/wireless network

Claims (8)

1. (Deleted)

2. In a resident posture determination system in a residential space,
the resident posture determination system comprising:
a sensor unit (10), attached to the resident, for generating posture information of a resident residing in the space and transmitting the information to a server;
a server (20) comprising: a transmission/reception unit (22) for receiving tilt information and altitude information from the sensor unit (10); a control unit (25) for receiving the information through the transmission/reception unit and comparing it with the reference tilt information and altitude information for postures such as standing, sitting, and lying stored in a memory unit (24), to determine whether the resident in the space is standing, sitting, or lying; the memory unit (24) storing the tilt information and altitude information for postures such as standing, sitting, and lying; and a display unit (28) for providing the posture information determined by the control unit; and
a wired/wireless network (30) connecting the sensor unit and the server.
3. The system of claim 2, wherein
The sensor unit includes:
A tilt sensor unit 12 configured to generate angle information about a rotation direction of a two-dimensional normal vector (x, y) of a resident in the space and transmit the generated angle information to the MCU;
An altimeter sensor unit 14 for generating elevation information of a resident and transmitting the information to the MCU;
An MCU (16) for converting an analog signal received from the tilt sensor unit and the altimeter sensor unit into a digital signal to calculate an output value and transmitting the output value to a server through a transmission / reception module;
And a transmission / reception module (18) for transmitting an output value to the server under the control of the MCU.
4. The system of claim 3,
wherein the tilt sensor unit comprises tilt sensors or gyro sensors, and is constituted by two tilt sensors or two gyro sensors.

5. (Deleted)
6. (Deleted)
7. (Deleted)
8. (Deleted)
KR1020150165438A 2015-11-25 2015-11-25 Posture and Space Recognition System of a Human Body Using Multimodal Sensors KR101722131B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150165438A KR101722131B1 (en) 2015-11-25 2015-11-25 Posture and Space Recognition System of a Human Body Using Multimodal Sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150165438A KR101722131B1 (en) 2015-11-25 2015-11-25 Posture and Space Recognition System of a Human Body Using Multimodal Sensors

Publications (1)

Publication Number Publication Date
KR101722131B1 true KR101722131B1 (en) 2017-03-31

Family

ID=58500739

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150165438A KR101722131B1 (en) 2015-11-25 2015-11-25 Posture and Space Recognition System of a Human Body Using Multimodal Sensors

Country Status (1)

Country Link
KR (1) KR101722131B1 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140032082A (en) * 2012-09-05 2014-03-14 재단법인대구경북과학기술원 Monitoring system for sitting posture of user

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108392207A (en) * 2018-02-09 2018-08-14 西北大学 A kind of action identification method based on posture label
CN108392207B (en) * 2018-02-09 2020-12-11 西北大学 Gesture tag-based action recognition method
CN110334631A (en) * 2019-06-27 2019-10-15 西安工程大学 A kind of sitting posture detecting method based on Face datection and Binary Operation
CN110334631B (en) * 2019-06-27 2021-06-15 西安工程大学 Sitting posture detection method based on face detection and binary operation
CN112861563A (en) * 2019-11-12 2021-05-28 北京君正集成电路股份有限公司 Sitting posture detection method and system
CN112183347A (en) * 2020-09-28 2021-01-05 中国平安人寿保险股份有限公司 Depth space gradient-based in-vivo detection method, device, equipment and medium
CN114612939A (en) * 2022-03-25 2022-06-10 珠海视熙科技有限公司 Sitting posture identification method and device based on TOF camera and intelligent desk lamp
CN114612939B (en) * 2022-03-25 2023-01-10 珠海视熙科技有限公司 Sitting posture identification method and device based on TOF camera and intelligent desk lamp

Similar Documents

Publication Publication Date Title
KR101722131B1 (en) Posture and Space Recognition System of a Human Body Using Multimodal Sensors
CN105283129B (en) Information processor, information processing method
Feng et al. Floor pressure imaging for fall detection with fiber-optic sensors
Leone et al. Detecting falls with 3D range camera in ambient assisted living applications: A preliminary study
EP3468180B1 (en) Display control device, display control system, display control method, display control program, and recording medium
JP6814220B2 (en) Mobility and mobility systems
US20130116602A1 (en) Automatic orientation calibration for a body-mounted device
JP2016104074A (en) Posture determination device, posture determination system, and program
JP6720909B2 (en) Action detection device, method and program, and monitored person monitoring device
JP2013537618A (en) Object tracking and recognition method and apparatus
US20170053401A1 (en) Posture estimation device, posture estimation system, posture estimation method, posture estimation program, and computer-readable recording medium on which posture estimation program is recorded
JP6086468B2 (en) Subject monitoring system
EP4112372B1 (en) Method and system for driver posture monitoring
Wiedemann et al. Performance evaluation of joint angles obtained by the Kinect v2
JP2020123239A (en) Posture estimation device, behavior estimation device, posture estimation program, and posture estimation method
Kido et al. Fall detection in toilet rooms using thermal imaging sensors
KR20170082742A (en) Physical Movement Judgment by Using Multi-Modal Sensor and System thereof
KR101694489B1 (en) Load Cell Matric system, and method for posture reform using the same
CN113516008A (en) Human body movement abnormity monitoring system based on human body skeleton key points
JP7243725B2 (en) Target object detection program and target object detection device
CN107563320B (en) Human body sitting posture appearance testing method and system based on spatial position information
JP2020134971A (en) Site learning evaluation program, site learning evaluation method and site learning evaluation unit
Daher et al. Multi-sensory assistive living system for elderly in-home staying
Song et al. Validation of attitude and heading reference system and microsoft kinect for continuous measurement of cervical range of motion compared to the optical motion capture system
US20220020287A1 (en) Information processing system, information processing apparatus, and non-transitory storage medium

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant