WO2022215313A1 - Information processing method, information processing device, and program - Google Patents
- Publication number
- WO2022215313A1 (PCT Application No. PCT/JP2022/000903)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- rotation angle
- angle
- information processing
- information
- unit
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims abstract description 72
- 238000003672 processing method Methods 0.000 title claims abstract description 29
- 238000012937 correction Methods 0.000 claims abstract description 68
- 238000012545 processing Methods 0.000 claims description 71
- 230000006870 function Effects 0.000 claims description 38
- 230000005484 gravity Effects 0.000 claims description 29
- 238000000034 method Methods 0.000 claims description 16
- 238000012360 testing method Methods 0.000 claims description 7
- 238000013459 approach Methods 0.000 claims description 4
- 238000011156 evaluation Methods 0.000 claims description 2
- 230000033001 locomotion Effects 0.000 abstract description 25
- 230000036544 posture Effects 0.000 description 85
- 238000010586 diagram Methods 0.000 description 33
- 238000004891 communication Methods 0.000 description 19
- 238000012986 modification Methods 0.000 description 11
- 230000004048 modification Effects 0.000 description 11
- 230000001133 acceleration Effects 0.000 description 10
- 210000000988 bone and bone Anatomy 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 5
- 230000008569 process Effects 0.000 description 4
- 230000008859 change Effects 0.000 description 3
- 230000006872 improvement Effects 0.000 description 3
- 230000007774 longterm Effects 0.000 description 3
- 241000282412 Homo Species 0.000 description 2
- 241001465754 Metazoa Species 0.000 description 2
- 210000003423 ankle Anatomy 0.000 description 2
- 230000003190 augmentative effect Effects 0.000 description 2
- 238000004590 computer program Methods 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 239000003550 marker Substances 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000003825 pressing Methods 0.000 description 2
- 239000013589 supplement Substances 0.000 description 2
- 210000000707 wrist Anatomy 0.000 description 2
- 238000009825 accumulation Methods 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 239000000470 constituent Substances 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 238000012217 deletion Methods 0.000 description 1
- 230000037430 deletion Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 210000003414 extremity Anatomy 0.000 description 1
- 230000002650 habitual effect Effects 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 230000001151 other effect Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
Classifications
- G—PHYSICS
  - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
  - G01C25/005—Initial alignment, calibration or starting-up of inertial devices
- A—HUMAN NECESSITIES
  - A61B5/0022—Monitoring a patient using a global network, e.g. telephone networks, internet
  - A61B5/1071—Measuring physical dimensions: measuring angles, e.g. using goniometers
  - A61B5/1116—Determining posture transitions
  - A61B5/6814—Sensors specially adapted to be attached to the head
  - A61B5/6823—Sensors specially adapted to be attached to the trunk, e.g. chest, back, abdomen, hip
  - A61B5/6824—Sensors specially adapted to be attached to the arm or wrist
  - A61B5/6829—Sensors specially adapted to be attached to the foot or ankle
  - A61B5/744—Displaying an avatar, e.g. an animated cartoon character
  - A61B2560/0223—Operational features of calibration, e.g. protocols for calibrating sensors
  - A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
  - A61B2562/0223—Magnetic field sensors
Definitions
- the present disclosure relates to an information processing method, an information processing device, and a program.
- motion capture technology for acquiring motion information that indicates user motion has been actively developed.
- Acquired motion information is used, for example, in sports to improve form, or in applications such as VR (Virtual Reality) or AR (Augmented Reality).
- an avatar video that mimics the user's motion is generated and the avatar video is distributed.
- Patent Literature 1 discloses a motion capture technique realized by a sensor system.
- drift errors accumulate due to long-term use or user movement, and the avatar video can become unnatural.
- when drift errors accumulate in this way, it is conceivable to design processing that reduces the effects of the drift errors based on the user's judgment and operation, but there is room for improvement in terms of user convenience.
- the present disclosure proposes a new and improved information processing method, information processing apparatus, and program capable of improving the user's convenience regarding drift error correction.
- an information processing method comprising: estimating posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and correcting the first rotation angle when a condition regarding the second rotation angle is satisfied.
- an information processing device comprising: an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
- a program that causes a computer to function as: an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
- FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure.
- FIG. 2 is an explanatory diagram showing a specific example of an avatar video V displayed on a viewing user terminal 40.
- FIG. 3 is an explanatory diagram showing the configuration of a distribution user terminal 20 according to an embodiment of the present disclosure.
- FIG. 4 is an explanatory diagram showing functions of a base tool 250.
- FIG. 5 is an explanatory diagram showing a yaw angle, a pitch angle, and a roll angle.
- FIG. 6 is an explanatory diagram showing a specific example of posture information stored in a first storage unit 253.
- FIG. 7 is an explanatory diagram showing a specific example of generating skeleton data.
- FIG. 8 is an explanatory diagram showing functions of an application unit 260.
- FIG. 9 is an explanatory diagram showing a specific example of a guidance screen.
- FIG. 10 is an explanatory diagram showing a specific example of an avatar display screen.
- FIG. 11 is a flowchart showing a flow up to registration of posture information in the information processing system according to an embodiment of the present disclosure.
- FIG. 12 is a flowchart showing correction processing A performed when automatic correction is enabled.
- FIG. 13 is a flowchart showing correction processing B performed when automatic correction is not enabled.
- FIG. 14 is a flowchart showing a registration operation according to a first modified example.
- FIG. 15 is an explanatory diagram showing a second configuration example of the information processing system.
- FIG. 16 is an explanatory diagram showing a third configuration example of the information processing system.
- FIG. 17 is an explanatory diagram showing a fourth configuration example of the information processing system.
- FIG. 18 is an explanatory diagram showing a fifth configuration example of the information processing system.
- FIG. 19 is an explanatory diagram showing a sixth configuration example of the information processing system.
- FIG. 20 is a block diagram showing the hardware configuration of the distribution user terminal 20.
- Skeleton data is expressed by a skeleton structure representing the structure of the body, for example.
- Skeleton data includes information about parts and bones, which are line segments connecting parts.
- the parts in the skeleton structure correspond to, for example, terminal parts and joint parts of the body.
- the bones in the skeleton structure can correspond to, for example, human bones, but the positions and numbers of the bones do not necessarily match the actual human skeleton.
- the position of parts in skeleton data can be obtained by various motion capture technologies.
- technologies for obtaining the positions of parts include camera-based technologies, in which a marker is attached to each part of the body and the position of the marker is acquired using an external camera or the like, and sensor-based technologies, in which motion sensors are attached to parts of the body and the position information of the motion sensors is acquired based on the sensor data they produce.
- time-series data of skeleton data is used for form improvement in sports, and for applications such as VR (Virtual Reality) or AR (Augmented Reality).
- time-series data of skeleton data is used to generate an avatar image imitating a user's movement, and the avatar image is distributed.
- FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure.
- an information processing system according to an embodiment of the present disclosure has six sensor devices 10A-10F, a distribution user terminal 20, a distribution server 30, and a viewing user terminal 40.
- User U1 shown in FIG. 1 is a distribution user who distributes avatar videos, and users U2 and U3 are viewing users who view avatar videos.
- Network 12 is a wired or wireless transmission path for information transmitted from devices connected to the network 12.
- the network 12 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
- the network 12 may also include a dedicated line network such as IP-VPN (Internet Protocol-Virtual Private Network).
- the sensor device 10 includes an inertial measurement unit (IMU) comprising an acceleration sensor that acquires acceleration and a gyro sensor (angular velocity sensor) that acquires angular velocity.
- the sensor device 10 may also include sensors such as a geomagnetic sensor, an ultrasonic sensor, and an atmospheric pressure sensor.
- the sensor devices 10A to 10F are desirably attached to reference joints of the body (for example, the waist and head) or near extremities of the body (wrists, ankles, head, etc.).
- in the example of FIG. 1, the sensor device 10A is worn on the waist of the distribution user U1, the sensor devices 10B and 10E on the wrists, the sensor devices 10C and 10D on the ankles, and the sensor device 10F on the head.
- the part of the body to which the sensor device 10 is attached may also be referred to as the attachment part.
- the number of sensor devices 10 and their mounting positions are not limited to the example shown in FIG. 1.
- Such a sensor device 10 acquires the acceleration or angular velocity of the mounting site as sensor data, and transmits the sensor data to the distribution user terminal 20.
- the distribution user terminal 20 is an example of an information processing device used by the distribution user U1.
- the distribution user terminal 20 receives the sensor data from the sensor device 10, and uses the received sensor data to generate an avatar image of the distribution user U1.
- the distribution user terminal 20 acquires mounting site information indicating the position and posture of each mounting site based on the sensor data, and generates skeleton data including position information and posture information of each part in the skeleton structure based on the mounting site information.
- the distribution user terminal 20 generates an avatar image having the posture indicated by the skeleton data.
- the distribution user terminal 20 transmits the generated avatar video to the distribution server 30 and requests the distribution server 30 to distribute the avatar video.
- although FIG. 1 shows a notebook PC (Personal Computer) as the distribution user terminal 20, the distribution user terminal 20 may be another information processing device such as a smartphone or a desktop PC.
- the distribution server 30 distributes the avatar video to the viewing user terminal 40 based on the request from the distribution user terminal 20.
- although FIG. 1 shows one distribution server 30 that implements a distribution service provided by a certain business operator, there may be a plurality of business operators providing distribution services, and a plurality of distribution servers 30.
- the distribution user terminal 20 can request the distribution server 30, which provides the distribution service specified by the distribution user U1, to distribute the avatar video.
- the viewing user terminal 40 is an information processing device used by viewing users (for example, user U2 and user U3 shown in FIG. 1).
- the viewing user terminal 40 has a display unit that displays various screens, an operation unit that detects the operation of the viewing user, and a control unit that controls the overall operation of the viewing user terminal 40.
- the viewing user terminal 40 requests the distribution server 30 to distribute the avatar video of the distribution user U1 based on the operation of the viewing user, and displays the avatar video distributed from the distribution server 30.
- FIG. 2 is an explanatory diagram showing a specific example of the avatar video V displayed on the viewing user terminal 40.
- in FIG. 2, a video of a two-dimensional character is displayed as the avatar video V on the viewing user terminal 40, for example.
- the posture of the avatar video V reflects the posture of distribution user U1. That is, the avatar video V changes according to the movement of the distribution user U1.
- Motion capture techniques implemented using motion sensors as described above may accumulate drift errors due to long-term use or user movement, resulting in inaccurate skeleton data. If the skeleton data is inaccurate, the distributed avatar video may also look unnatural.
- recalibration is processing performed on the posture information obtained by processing the sensor data, and reset processing is processing for adjusting the parameters used to generate skeleton data. In either case, the distribution user must assume the initial position and posture.
- the inventors have come to create an embodiment of the present disclosure by focusing on the above circumstances.
- An information processing system according to an embodiment of the present disclosure can easily eliminate drift errors while avoiding content from becoming unnatural.
- the configuration and operation of the distribution user terminal 20 according to an embodiment of the present disclosure will be sequentially described in detail.
- FIG. 3 is an explanatory diagram showing the configuration of the distribution user terminal 20 according to one embodiment of the present disclosure.
- the distribution user terminal 20 includes an operation unit 216, a display unit 220, a communication unit 230, and a control unit 240.
- the operation unit 216 is configured to be operated by the distribution user for inputting instructions or information to the distribution user terminal 20.
- the display unit 220 displays various display screens. For example, the display unit 220 displays a display screen including the avatar image generated by the control unit 240.
- the communication unit 230 communicates with the distribution server 30 via the network 12. For example, the communication unit 230 transmits the avatar video generated by the control unit 240 to the distribution server 30 via the network 12.
- the control unit 240 controls the overall operation of the distribution user terminal 20.
- the control unit 240 according to an embodiment of the present disclosure has a function of generating skeleton data of the distribution user based on sensor data received from the sensor device 10, and generating an avatar video having the posture indicated by the skeleton data.
- Such functions of the control unit 240 are implemented by the base tool 250 and the application unit 260 shown in FIG.
- the base tool 250 has a function of generating skeleton data from sensor data.
- the base tool 250 supplies the generated skeleton data to the application unit 260.
- the base tool 250 according to an embodiment of the present disclosure has a function of correcting posture information (in particular, yaw angle) of each part of the distribution user calculated from sensor data, as will be described in detail later.
- the application unit 260 realizes various functions in cooperation with the base tool 250.
- the application unit 260 generates an avatar image based on the skeleton data supplied from the base tool 250 and requests the distribution server 30 to distribute the avatar image.
- the application unit 260 may request the distribution server 30 to distribute a combination of the avatar video and other content data.
- Other content data includes, for example, background data and music data.
- the developer of the base tool 250 and the developer of the application unit 260 may be the same or different. The functions of the base tool 250 and application unit 260 will be described in more detail below.
- FIG. 4 is an explanatory diagram showing the functions of the base tool 250.
- the base tool 250 has a sensor data processing unit 251, a calibration unit 252, a first storage unit 253, a yaw angle correction unit 254, a skeleton data generation unit 255, and an application interface 256.
- the sensor data processing unit 251 acquires sensor data indicating the acceleration or angular velocity of the wearing part from the sensor device 10, and estimates position information and posture information of each wearing part based on the sensor data. For example, the sensor data processing unit 251 integrates the acceleration data to estimate the position information of each attachment site, and integrates the angular velocity data to estimate the posture information including rotation angles such as the yaw angle, pitch angle, and roll angle of each attachment site.
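- the dead-reckoning described above can be sketched as follows. This is a minimal, illustrative integration loop — the function name, array layout, and small-angle treatment of rotation are assumptions, and a real pipeline would subtract gravity and fuse sensors to limit the drift discussed later:

```python
import numpy as np

def integrate_imu(accels, gyros, dt):
    """Naively integrate IMU samples: acceleration -> velocity -> position,
    angular velocity -> rotation angles (yaw, pitch, roll).
    Sketch only; errors in each step accumulate as drift."""
    velocity = np.zeros(3)
    position = np.zeros(3)
    angles = np.zeros(3)  # yaw, pitch, roll in radians
    for a, w in zip(accels, gyros):
        velocity += np.asarray(a, dtype=float) * dt   # a: m/s^2
        position += velocity * dt                     # double integration
        angles += np.asarray(w, dtype=float) * dt     # small-angle approximation
    return position, angles
```

Because every sample's noise is summed, the estimated angles slowly diverge from the true posture — which is exactly the drift error the yaw angle correction unit 254 later compensates for.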
- FIG. 5 is an explanatory diagram showing the yaw angle, pitch angle and roll angle.
- the yaw angle is an example of the first rotation angle, which is the rotation angle around the axis in the direction of gravity (the z-axis in FIG. 5).
- One of the pitch angle and the roll angle is an example of the second rotation angle, which is the rotation angle around the axis (x-axis in FIG. 5) perpendicular to the direction of gravity.
- the other of the pitch angle and the roll angle is an example of a third rotation angle, which is a rotation angle around an axis (y-axis in FIG. 5) perpendicular to both the axis in the direction of gravity and the axis perpendicular to the direction of gravity.
- the yaw angle of each attachment site in this specification is a relative rotation angle based on the yaw angle of the distribution user's waist.
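- the relative-yaw convention above might look like this in code. This is a hypothetical helper; the wrapping range (-180, 180] is an assumption, not stated in the source:

```python
def relative_yaw(site_yaw_deg, waist_yaw_deg):
    """Yaw of an attachment site expressed relative to the waist yaw,
    wrapped into (-180, 180] degrees."""
    d = (site_yaw_deg - waist_yaw_deg) % 360.0
    return d - 360.0 if d > 180.0 else d
```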
- the calibration unit 252 calibrates the posture information output from the sensor data processing unit 251.
- the calibration unit 252 may perform the calibration when starting to use the base tool 250, or may perform the calibration according to the operation by the distribution user. At the time of calibration, the distribution user assumes an initial position and posture, for example, a standing posture with both arms lowered.
- the first storage unit 253 stores various information used for the operation of the base tool 250.
- the first storage unit 253 stores posture information for registration estimated by the sensor data processing unit 251 while the distribution user is taking a posture for registration.
- FIG. 6 is an explanatory diagram showing a specific example of posture information stored in the first storage unit 253.
- the first storage unit 253 stores the posture information estimated by the sensor data processing unit 251 while the distribution user is taking the registration posture for each wearing part.
- as the posture information, for example, the yaw angle is stored as the first reference angle, the pitch angle as the second reference angle, and the roll angle as the third reference angle.
- it is desirable that this posture information be acquired within a predetermined period after calibration, during which drift error accumulation is considered to be small.
- the posture for registration may be a posture unique to the distribution user, or may be a natural pose.
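- the registration step could be sketched as follows, under the suggestion above that reference angles are only recorded within a short window after calibration while drift is still small. All names and the 60-second window are illustrative assumptions:

```python
import time

def register_posture(store, site_id, yaw, pitch, roll,
                     calibrated_at, now=None, window_s=60.0):
    """Record the angles estimated while the user holds the registration
    posture as the first/second/third reference angles for one attachment
    site, but only within `window_s` seconds of calibration, when drift
    error accumulation is presumably still small."""
    now = time.time() if now is None else now
    if now - calibrated_at > window_s:
        return False  # too long since calibration; drift may be large
    store[site_id] = {"first": yaw, "second": pitch, "third": roll}
    return True
```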
- drift errors in the pitch angle and roll angle estimated by the sensor data processing unit 251 are reduced by gravity correction.
- when the differences between the pitch angle and roll angle of a certain attachment site and the second and third reference angles stored in the first storage unit 253 are equal to or less than a threshold, the posture of that attachment site is possibly close to the registration posture. That is, the true yaw angle of the attachment site is considered to be close to or equal to the first reference angle stored in the first storage unit 253.
- the yaw angle correction unit 254 corrects the yaw angle input from the sensor data processing unit 251 for a certain attachment site when conditions regarding the pitch angle and roll angle input for that site are satisfied. Specifically, when the differences between the pitch angle and roll angle of a given attachment site and the second and third reference angles of that site are equal to or less than a threshold value, the yaw angle correction unit 254 estimates that the true yaw angle of the site is the first reference angle, and corrects the yaw angle input from the sensor data processing unit 251 for that site using the first reference angle.
- the yaw angle correction unit 254 may correct the yaw angle input from the sensor data processing unit 251 so that it becomes the first reference angle.
- the yaw angle may be corrected to approach the first reference angle over time.
- yaw angle correction may be performed for each attachment site that satisfies the conditions.
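- putting the correction rule together, a minimal sketch might look like this. The threshold, blend factor, and per-site reference table are illustrative assumptions, and the gradual blend corresponds to the "approach the first reference angle over time" variant mentioned above:

```python
def correct_yaw(yaw, pitch, roll, ref, threshold_deg=5.0, blend=0.1):
    """If the pitch and roll are both within threshold_deg of the second
    and third reference angles, nudge the yaw toward the first reference
    angle; otherwise leave it unchanged. A gradual blend avoids a visible
    jump in the avatar. `ref` maps 'first'/'second'/'third' to the
    registered reference angles (illustrative structure)."""
    if (abs(pitch - ref["second"]) <= threshold_deg
            and abs(roll - ref["third"]) <= threshold_deg):
        return yaw + blend * (ref["first"] - yaw)
    return yaw
```

Applying this each frame while the condition holds makes the yaw converge toward the reference angle exponentially, rather than snapping to it in a single step.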
- the skeleton data generation unit 255 generates skeleton data including position information and posture information of each part in the skeleton structure, based on the position information of each attachment site estimated by the sensor data processing unit 251 and the posture information of each attachment site including the yaw angle corrected by the yaw angle correction unit 254.
- generation of skeleton data will be described more specifically with reference to FIG.
- FIG. 7 is an explanatory diagram showing a specific example of generating skeleton data.
- the skeleton data generation unit 255 acquires mounting site information PD100 including position information and posture information of the mounting sites P101 to P106 on which the sensor devices 10A to 10F are mounted.
- the skeleton data generator 255 generates skeleton data SD100 including position information and posture information of each part in the skeleton structure, as shown in the right diagram of FIG. 7.
- the skeleton data SD100 includes not only information on the mounting part SP101 corresponding to the mounting part P101 and the mounting part SP102 corresponding to the mounting part P102, but also information on the non-mounting part SP107.
- skeleton data can also include bone information (position information, posture information, etc.).
- skeleton data SD100 may include information on bone SB101.
- the skeleton data generator 255 can identify bone information between parts based on position information and posture information of parts in the skeleton structure.
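The skeleton data described above (mounted parts, non-mounted parts, and bones connecting them) can be pictured as a simple data structure. This is a hypothetical sketch; the class and field names are assumptions, not the format actually used by the skeleton data generation unit 255:

```python
from dataclasses import dataclass, field


@dataclass
class Part:
    name: str
    position: tuple      # (x, y, z)
    orientation: tuple   # (yaw, pitch, roll) in degrees


@dataclass
class Bone:
    head: str  # name of the parent part
    tail: str  # name of the child part


@dataclass
class SkeletonData:
    parts: dict = field(default_factory=dict)
    bones: list = field(default_factory=list)

    def add_part(self, part):
        self.parts[part.name] = part

    def connect(self, head, tail):
        # bone position/posture can be derived from the two end parts,
        # so only the topology is stored here
        self.bones.append(Bone(head, tail))
```

For example, a mounted part SP101 and a non-mounted part SP107 can both be held in `parts`, with a bone such as SB101 stored only as a head/tail pair, since its position and posture are identifiable from the parts at its ends.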
- Application interface 256 is an interface with application unit 260 .
- the application interface 256 may be configured as an API (Application Programming Interface).
- the application interface 256 returns skeleton data of the delivery user to the application section 260 in response to a request from the application section 260 .
- the application interface 256 receives from the application unit 260 an instruction to register posture information in the posture for registration.
- FIG. 8 is an explanatory diagram showing the functions of the application unit 260.
- the application section 260 has a base tool plug-in 261 , a second storage section 262 , a retargeting section 265 , a display control section 267 and a distribution control section 268 .
- Base tool plug-in 261 is an interface with base tool 250 .
- the base tool plug-in 261 receives data from the base tool 250 and converts the data into a format that can be handled by the application section 260.
- the base tool plug-in 261 receives skeleton data and a notification from the base tool 250 indicating that the above-described yaw angle correction has been performed.
- the second storage unit 262 stores various information used for the operation of the application unit 260 .
- the second storage unit 262 may store information related to registration posture information stored in the first storage unit 253 of the base tool 250 .
- the retargeting unit 265 receives the distribution user's skeleton data from the base tool plug-in 261 and retargets the skeleton data to generate an avatar image having the posture or movement indicated by the skeleton data.
- the display control unit 267 generates various display screens and causes the display unit 220 to display the generated display screens. For example, the display control unit 267 generates an avatar display screen including the avatar video generated by the retargeting unit 265 and causes the display unit 220 to display the avatar display screen. Further, the display control unit 267 generates a guidance screen for guiding the distribution user to take the posture for registration described above, based on the distribution user's operation on the operation unit 216 or on completion of the calibration, and causes the display unit 220 to display the guidance screen.
- FIG. 9 is an explanatory diagram showing a specific example of the guidance screen.
- the guidance screen includes guidance message 42 and skeleton data display 44 .
- the guidance message 42 is a message requesting the distribution user to stand still, as in the example shown in FIG. 9.
- the skeleton data display 44 is a display showing skeleton data generated by the skeleton data generator 255 of the base tool 250 for the distribution user.
- FIG. 9 shows an example in which the distribution user stands still in a salute posture as a unique posture.
- the posture information estimated by the sensor data processing unit 251 of the base tool 250 while the distribution user is stationary is stored in the first storage unit 253 of the base tool 250 as described above.
- the yaw angle is automatically corrected when the distribution user's posture becomes close to the registration posture; alternatively, the yaw angle may be corrected based on a predetermined operation input.
- the display control unit 267 may generate an avatar display screen including the avatar video V for the distribution user, and arrange the correction button 46 on the avatar display screen. Then, when the correction button 46 is operated by the distribution user, the application unit 260 sends a yaw angle correction execution instruction to the base tool 250, and the yaw angle correction unit 254 of the base tool 250 may perform the yaw angle correction based on the execution instruction.
- the distribution control unit 268 transmits the avatar video generated by the retargeting unit 265 to the distribution server 30 and requests the distribution server 30 to distribute the avatar video. As a result, the avatar video is distributed to the viewing user terminal 40 and displayed on the viewing user terminal 40 .
- FIG. 11 is a flow chart showing the flow up to registration of posture information in the information processing system according to an embodiment of the present disclosure.
- the display control unit 267 causes the display unit 220 to display an automatic correction selection screen for selecting whether or not to enable automatic correction of the yaw angle, and the distribution user selects on that screen whether to enable automatic correction (S308). When enabling the automatic yaw angle correction is selected (S308/Yes), the display control unit 267 causes the display unit 220 to display a posture registration selection screen for selecting whether or not to register a unique posture, and the distribution user selects on the posture registration selection screen whether or not to register the unique posture (S312).
- the display control unit 267 displays the guidance screen shown in FIG. 9 on the display unit 220 (S316).
- the distribution user maintains stillness in a unique posture according to the guidance screen.
- the first storage unit 253 stores the posture information of each wearing part of the distribution user estimated by the sensor data processing unit 251 while the distribution user stands still in the unique posture (S320).
- when registration of the unique posture is not selected, the first storage unit 253 stores the posture information of each wearing part of the distribution user estimated by the sensor data processing unit 251 while the distribution user is stationary for a long time (that is, while the distribution user is taking a natural pose) (S324). Note that in both S320 and S324, the first storage unit 253 may store a plurality of pieces of posture information. After S320 and S324, correction processing A, which will be described with reference to FIG. 12, is performed.
- when enabling the automatic yaw angle correction is not selected (S308/No), the display control unit 267 causes the display unit 220 to display the posture registration selection screen for selecting whether or not to register the unique posture, and the distribution user selects on that screen whether or not to register the unique posture (S328).
- the display control unit 267 displays the guidance screen shown in FIG. 9 on the display unit 220 (S332).
- the distribution user maintains stillness in a unique posture according to the guidance screen.
- the first storage unit 253 stores the posture information of each wearing part of the distribution user estimated by the sensor data processing unit 251 while the distribution user stands still in the unique posture (S336).
- FIG. 12 is a flowchart showing correction processing A that is performed when automatic correction is enabled.
- the sensor data processing unit 251 acquires sensor data from each sensor device 10 (S404). Then, the sensor data processing unit 251 performs acceleration integration and gravity correction on the sensor data acquired from each sensor device 10, and estimates posture information including the yaw angle, pitch angle, and roll angle of each attachment site (S408).
- the yaw angle correction unit 254 determines whether or not the conditions for yaw angle correction are satisfied (S412). Specifically, the yaw angle correction unit 254 determines whether or not the differences between the pitch angle and roll angle of a given attachment site and the second reference angle and third reference angle of that site are each equal to or less than the threshold.
- when the conditions are satisfied (S412/Yes), the yaw angle correction unit 254 reads the first reference angle registered in the first storage unit 253 in association with the second reference angle and the third reference angle (S416). Then, the yaw angle correction unit 254 corrects the current yaw angle of the attachment site using the first reference angle (S420). After that, the processing from S404 is repeated until the operation of the base tool 250 ends (S424).
- FIG. 13 is a flowchart showing correction processing B that is performed when automatic correction is not enabled.
- the sensor data processing unit 251 acquires sensor data from each sensor device 10 (S504).
- the sensor data processing unit 251 performs acceleration integration and gravity correction on the sensor data acquired from each sensor device 10, and estimates posture information including the yaw angle, pitch angle, and roll angle of each attachment site (S508).
- when the correction button 46 is pressed on the avatar display screen described above, the yaw angle correction unit 254 reads the first reference angle from the first storage unit 253 (S516).
- the yaw angle correction unit 254 may read the first reference angle associated with the second reference angle and third reference angle that differ least from the current pitch angle and roll angle.
- the yaw angle correction unit 254 corrects the current yaw angle of the attachment site using the first reference angle (S520). After that, the processing from S504 is repeated until the operation of the base tool 250 ends (S524).
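When a plurality of registration postures is stored, the read-out described above can be sketched as a nearest-reference lookup. This is a minimal illustration assuming references are stored as (yaw, pitch, roll) triples; the function name and data layout are assumptions:

```python
def nearest_reference(current_pitch, current_roll, references):
    """references: (first, second, third) reference-angle triples, i.e.
    (yaw, pitch, roll), one per registered posture. Returns the first
    reference angle (yaw) whose associated pitch/roll pair differs least
    from the current pitch and roll."""
    best = min(references,
               key=lambda r: (r[1] - current_pitch) ** 2
                             + (r[2] - current_roll) ** 2)
    return best[0]
```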
- the yaw angle can be corrected without the distribution users taking unnatural postures.
- the yaw angle can be corrected for each attachment site. Therefore, even if only some of the attachment sites are in the natural pose or the initial position/posture, the yaw angles of those attachment sites can be corrected.
- since a guidance screen for guiding the distribution user is displayed, the distribution user can easily register a unique posture.
- FIG. 14 is a flowchart showing the registration operation according to the first modified example.
- the base tool 250 first acquires a plurality of learning data including the yaw angle, pitch angle and roll angle for each attachment site (S604).
- the learning data may be distribution user posture information obtained in a period of about one minute immediately after calibration when the influence of a drift error is small, or may be a general-purpose data set.
- the base tool 250 builds a relationship model between the pitch angle, the roll angle, and the yaw angle for each mounting site by learning based on the learning data, and registers the relationship model in the first storage unit 253 ( S608).
- the base tool 250 evaluates the relationship model using a plurality of test data including the yaw angle, pitch angle, and roll angle for each attachment site (S612). For example, the base tool 250 inputs the pitch angle and roll angle included in the test data into the relationship model, and calculates the difference between the output yaw angle and the yaw angle included in the test data as an estimation error.
- the base tool 250 identifies, for each attachment site, the pitch angle and roll angle at which the yaw angle estimation error is less than the threshold, and registers the pitch angle and roll angle in the first storage unit 253 as the second reference angle and the third reference angle (S616).
- when the condition that the differences between the pitch angle and roll angle of a given attachment site and the second reference angle and third reference angle of that site are equal to or less than the threshold is satisfied, the yaw angle correction unit 254 estimates the yaw angle using the relationship model. Then, the yaw angle correction unit 254 corrects the current yaw angle of the attachment site using the estimated yaw angle.
- the first modification is useful in that, especially when a general-purpose data set is used as learning data for constructing a relationship model, distribution users do not have to stand still in the same posture to register posture information.
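The learn, evaluate, and register flow of the first modification (S604, S608, S612, S616) might be pictured with a simple lookup table standing in for the relationship model. The embodiment does not specify a model class, so the binning scheme, thresholds, and all names below are assumptions for illustration only:

```python
from collections import defaultdict


def train_model(samples, bin_size=10):
    """samples: (yaw, pitch, roll) triples of learning data (S604/S608).
    The 'relationship model' here is a lookup table mapping a
    (pitch, roll) bin to the mean yaw observed in that bin."""
    bins = defaultdict(list)
    for yaw, pitch, roll in samples:
        key = (round(pitch / bin_size), round(roll / bin_size))
        bins[key].append(yaw)
    return {k: sum(v) / len(v) for k, v in bins.items()}


def estimate_yaw(model, pitch, roll, bin_size=10):
    """Estimate the yaw angle from the current pitch and roll."""
    return model.get((round(pitch / bin_size), round(roll / bin_size)))


def reliable_references(model, test_samples, err_threshold=5.0, bin_size=10):
    """S612/S616: keep only the pitch/roll regions whose yaw estimation
    error on the test data is below the threshold."""
    ok = set()
    for yaw, pitch, roll in test_samples:
        est = estimate_yaw(model, pitch, roll, bin_size)
        if est is not None and abs(est - yaw) < err_threshold:
            ok.add((round(pitch / bin_size), round(roll / bin_size)))
    return ok
```

Only pitch/roll regions that survive `reliable_references` would then be registered as second and third reference angles, mirroring how the modification restricts correction to regions where the model estimates yaw accurately.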
- <Second modification> An example has been described above in which the yaw angle correction unit 254 estimates the yaw angle based on the current pitch angle and roll angle, and corrects the current yaw angle using the estimated yaw angle. However, the yaw angle correction unit 254 can also estimate the yaw angle using another method.
- the base tool 250 registers the position information of each attachment site and the yaw angle of each attachment site estimated by the sensor data processing unit 251 while the distribution user is taking the posture for registration in the first storage unit 253.
- the position information may be information indicating a relative position viewed from the position of the distribution user's waist.
- the sensor data processing unit 251 may estimate the position information based on the acceleration obtained by the sensor device 10, or may estimate the position information from the image of the distribution user obtained by the external camera.
- when the condition that the difference between the current position information of a given attachment site and the position information registered in the first storage unit 253 is less than the threshold is satisfied, the yaw angle correction unit 254 may correct the current yaw angle using the yaw angle registered in the first storage unit 253.
- the conditions may include posture information conditions such as the pitch angle and the roll angle, as in the above-described embodiment, in addition to the position information condition; in that case, the yaw angle may be corrected when both conditions are satisfied.
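The position condition of this second modification can be sketched as a simple distance test (a hypothetical illustration; the function name and the threshold are assumptions, and positions are taken as relative coordinates, e.g. relative to the waist, as described above):

```python
def position_condition_met(current_pos, ref_pos, threshold=0.05):
    """Second-modification condition: the current relative position of an
    attachment site must be within a threshold distance of the position
    registered while the registration posture was held."""
    dist2 = sum((c - r) ** 2 for c, r in zip(current_pos, ref_pos))
    return dist2 < threshold ** 2
```

A combined check would trigger the yaw correction only when this returns true and the pitch/roll condition of the main embodiment also holds.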
- the sensor device 10 may include a geomagnetic sensor, and the base tool 250 may register in the first storage unit 253 the geomagnetism value of each attachment site obtained by the geomagnetic sensor and the yaw angle of each attachment site while the distribution user is taking the posture for registration.
- the geomagnetism value may be information indicating a relative value with respect to the geomagnetism value of the distribution user's waist.
- when the condition that the difference between the current geomagnetism value of a given attachment site and the geomagnetism value registered in the first storage unit 253 is less than the threshold is satisfied, the yaw angle correction unit 254 may correct the current yaw angle using the yaw angle registered in the first storage unit 253.
- the conditions may include, in addition to the geomagnetism value condition, attitude information conditions such as the pitch angle and roll angle, position information conditions, and the like, as in the above-described embodiment.
- FIG. 15 is an explanatory diagram showing a second configuration example of the information processing system.
- the information processing system according to the second configuration example has a distribution user terminal 20-2 and a processing terminal 50-2.
- the distribution user terminal 20-2 and the processing terminal 50-2 are connected via the network 12.
- The distribution user terminal 20-2 has the base tool 250 and does not have the application section 260.
- the application unit 260 is installed in the processing terminal 50-2.
- the distribution user terminal 20-2 transmits skeleton data to the processing terminal 50-2.
- the application unit 260 of the processing terminal 50 - 2 generates an avatar image from the skeleton data, and distributes the avatar image to the viewing user terminal 40 via the distribution server 30 .
- the developer of the base tool 250 and the developer of the application unit 260 may be the same or different.
- FIG. 16 is an explanatory diagram showing a third configuration example of the information processing system.
- the information processing system according to the third configuration example has a distribution user terminal 20-3 and a processing terminal 50-3.
- the distribution user terminal 20-3 and the processing terminal 50-3 are connected via the network 12.
- The distribution user terminal 20-3 has a base tool 250 and an application section 260-3.
- the application unit 260-3 corresponds to the application unit 260 described with reference to FIG. 8, but without the retargeting unit 265 and the distribution control unit 268. Instead, the processing terminal 50-3 has the retargeting unit 265 and the distribution control unit 268.
- the distribution user terminal 20-3 transmits skeleton data to the processing terminal 50-3. Then, the retargeting unit 265 of the processing terminal 50-3 generates an avatar image from the skeleton data, and the distribution control unit 268 distributes the avatar image to the viewing user terminal 40 via the distribution server 30.
- the developer of the base tool 250, the developer of the application unit 260-3, the developer of the retargeting unit 265, and the developer of the distribution control unit 268 may be the same or different.
- FIG. 17 is an explanatory diagram showing a fourth configuration example of the information processing system.
- the information processing system according to the fourth configuration example has a distribution user terminal 20-4 and a processing terminal 50-4.
- the delivery user terminal 20-4 and the processing terminal 50-4 are connected via the network 12.
- The distribution user terminal 20-4 has the base tool 250.
- The processing terminal 50-4 has an application section 260-4.
- the application unit 260-4 does not include the function of the distribution control unit 268, and the processing terminal 50-4 has the function of the distribution control unit 268 separately.
- the distribution user terminal 20-4 transmits skeleton data to the processing terminal 50-4. Then, the application unit 260-4 of the processing terminal 50-4 generates an avatar image from the skeleton data, and the distribution control unit 268 distributes the avatar image to the viewing user terminal 40 via the distribution server 30.
- the developer of the base tool 250, the developer of the application unit 260-4, and the developer of the distribution control unit 268 may be the same or different.
- FIG. 18 is an explanatory diagram showing a fifth configuration example of the information processing system.
- the information processing system according to the fifth configuration example has a distribution user terminal 20-5 and a processing terminal 50-5.
- the distribution user terminal 20-5 and the processing terminal 50-5 are connected via the network 12.
- The distribution user terminal 20-5 has the base tool 250.
- The processing terminal 50-5 has an application section 260-5.
- the application unit 260-5 does not include the functions of the retargeting unit 265 and the distribution control unit 268, and the processing terminal 50-5 has the functions of the retargeting unit 265 and the distribution control unit 268 separately.
- the distribution user terminal 20-5 transmits skeleton data to the processing terminal 50-5. Then, the application unit 260-5 supplies the skeleton data to the retargeting unit 265, the retargeting unit 265 generates an avatar video from the skeleton data, and the distribution control unit 268 distributes the avatar video to the viewing user terminal via the distribution server 30. Deliver to 40.
- the developer of the base tool 250, the developer of the application unit 260-5, the developer of the retargeting unit 265, and the developer of the distribution control unit 268 may be the same or different.
- FIG. 19 is an explanatory diagram showing a sixth configuration example of the information processing system.
- the information processing system according to the sixth configuration example has a first mobile terminal 61 , a second mobile terminal 62 and a third mobile terminal 63 .
- the functions of the control unit 240, that is, the functions of the base tool 250 and the application unit 260, are implemented in the first mobile terminal 61.
- the first mobile terminal 61 also has a communication section for communicating with other second mobile terminals 62 and third mobile terminals 63 .
- First mobile terminal 61 generates an avatar image of distribution user U1 based on sensor data acquired from the sensor device 10, and transmits the avatar image to the second mobile terminal 62 and the third mobile terminal 63.
- although FIG. 19 shows an example in which the first mobile terminal 61, the second mobile terminal 62, and the third mobile terminal 63 communicate via the network 12, the first mobile terminal 61, the second mobile terminal 62, and the third mobile terminal 63 may communicate directly without going through the network 12.
- the functions of the display unit 220 and the communication unit 230 are implemented in the second mobile terminal 62 .
- the second mobile terminal 62 receives the avatar image from the first mobile terminal 61 and displays a display screen including the avatar image on the display unit 220 . Thereby, the user U4 using the second mobile terminal 62 can check the avatar video.
- the functions of the operation unit 216 and the communication unit 230 are implemented in the third mobile terminal 63 .
- a user U5 who uses the third portable terminal 63 performs an operation of instructing the operation unit 216 to perform calibration, an operation of selecting whether to enable automatic correction, an operation of pressing a correction button, or the like. Then, the third mobile terminal 63 transmits information indicating the operation to the first mobile terminal 61 .
- the third mobile terminal 63 may also have the function of the display unit 220 that displays a display screen including an avatar image for the above operation.
- the functions of the second mobile terminal 62 and the functions of the third mobile terminal 63 may be collectively implemented in one mobile terminal.
- the second mobile terminal 62 and the third mobile terminal 63 may also have the function of the application section 260 .
- the first mobile terminal 61 may transmit skeleton data to the second mobile terminal 62 and the third mobile terminal 63 instead of the avatar video, and the second mobile terminal 62 and the third mobile terminal 63 may generate and display the avatar video from the skeleton data.
- a part or all of the functions of the application unit 260 may be implemented in each mobile terminal.
- Use cases of the sixth configuration example of the information processing system include, for example, shooting outdoors, shooting while moving, and shooting in a specific environment.
- the use of a mobile terminal eliminates the need to secure a power supply and transport equipment, making it possible to perform motion capture and data processing with lighter clothing.
- a distribution user U1 who is a performer carries a first mobile terminal 61, and the first mobile terminal 61 transmits skeleton data or an avatar video to each of the second mobile terminals 62 owned by a plurality of users such as producers and directors. This makes it possible to immediately check skeleton data or avatar images in multiple environments.
- FIG. 20 is a block diagram showing the hardware configuration of the distribution user terminal 20.
- the distribution user terminal 20 comprises a CPU (Central Processing Unit) 201 , a ROM (Read Only Memory) 202 , a RAM (Random Access Memory) 203 and a host bus 204 .
- the distribution user terminal 20 also includes a bridge 205 , an external bus 206 , an interface 207 , an input device 208 , an output device 210 , a storage device (HDD) 211 , a drive 212 and a communication device 215 .
- the CPU 201 functions as an arithmetic processing device and a control device, and controls overall operations within the distribution user terminal 20 according to various programs.
- the CPU 201 may be a microprocessor.
- the ROM 202 stores programs, calculation parameters, and the like used by the CPU 201 .
- the RAM 203 temporarily stores programs used in the execution of the CPU 201, parameters that change as appropriate during the execution, and the like. These are interconnected by a host bus 204 comprising a CPU bus or the like. The functions of the base tool 250 and the application unit 260 described with reference to FIGS. 4 and 8 are realized by cooperation of the CPU 201, the ROM 202, the RAM 203, and software.
- the host bus 204 is connected via a bridge 205 to an external bus 206 such as a PCI (Peripheral Component Interconnect/Interface) bus.
- the input device 208 includes input means for the user to input information, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 201.
- the user of the distribution user terminal 20 can input various data to the distribution user terminal 20 and instruct processing operations.
- the output device 210 includes, for example, a display device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and a lamp.
- output device 210 includes audio output devices such as speakers and headphones.
- the output device 210 outputs reproduced content, for example.
- the display device displays various information such as reproduced video data as text or images.
- the audio output device converts reproduced audio data and the like into audio and outputs the audio.
- the storage device 211 is a data storage device configured as an example of the storage unit of the distribution user terminal 20 according to this embodiment.
- the storage device 211 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 211 is composed of, for example, an HDD (Hard Disk Drive).
- the storage device 211 drives a hard disk and stores programs executed by the CPU 201 and various data.
- the drive 212 is a reader/writer for storage media, and is built in or externally attached to the distribution user terminal 20 .
- the drive 212 reads out information recorded in the attached removable storage medium 24 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 203 .
- Drive 212 can also write information to removable storage medium 24 .
- the communication device 215 is, for example, a communication interface configured with a communication device or the like for connecting to the network 12 .
- the communication device 215 may be a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wired communication device that performs wired communication.
- each functional block in the base tool 250 described with reference to FIG. 4 may be distributed and implemented in a plurality of terminals.
- each functional block in the application unit 260 described with reference to FIG. 8 may be distributed in multiple terminals.
- each step in the processing of the distribution user terminal 20 in this specification does not necessarily have to be processed in chronological order according to the order described as the flowchart.
- each step in the process of the distribution user terminal 20 may be processed in an order different from the order described as the flowchart, or may be processed in parallel.
- (3) The information processing method according to (2) above, wherein the predetermined relationship includes a relationship in which a difference between the second rotation angle and the second reference angle is equal to or less than a threshold.
- (4) The information processing method according to (2) or (3) above, further comprising: calibrating the gyro sensor; estimating posture information for registration of the moving body based on the output obtained from the gyro sensor while the moving body is taking the posture for registration after the calibration; and registering the first rotation angle included in the posture information for registration as the first reference angle and the second rotation angle included in the posture information for registration as the second reference angle.
- (5) The information processing method according to (4) above, including displaying, after the calibration, a guidance screen for guiding the moving body to take the posture for registration.
- the posture information of the moving object includes a third rotation angle around an axis orthogonal to both the axis in the direction of gravity and the axis orthogonal to the direction of gravity;
- (12) further comprising estimating position information indicating the position of the moving body;
- a geomagnetic sensor is attached to the moving body,
- (14) An information processing device comprising: an estimating unit that estimates posture information of a moving body including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on an output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
- (15) A program for causing a computer to function as: an estimating unit that estimates posture information of a moving body including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on an output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
- 20 Distribution user terminal
- 216 Operation unit
- 220 Display unit
- 230 Communication unit
- 250 Base tool
- 251 Sensor data processing unit
- 252 Calibration unit
- 253 First storage unit
- 254 Yaw angle correction unit
- 255 Skeleton data generation unit
- 256 Application interface
- 260 Application unit
- 261 Base tool plug-in
- 262 Second storage unit
- 265 Retargeting unit
- 267 Display control unit
- 268 Distribution control unit
- 30 Distribution server
- 40 Viewing user terminal
Description
1. Overview of the information processing system
2. Configuration of the distribution user terminal
2-1. Overall configuration
2-2. Functions of the base tool
2-3. Functions of the application unit
3. Operation
4. Summary
5. Modifications
4-1. First modification
4-2. Second modification
6. Other configuration examples of the information processing system
5-1. Second configuration example
5-2. Third configuration example
5-3. Fourth configuration example
5-4. Fifth configuration example
5-5. Sixth configuration example
7. Hardware configuration
8. Supplement
To visualize the movement of a moving body such as a human or an animal, skeleton data expressed by a skeleton structure representing the structure of the body is used, for example. Skeleton data includes information on parts and bones, which are line segments connecting parts. The parts in the skeleton structure correspond, for example, to end parts and joint parts of the body. The bones in the skeleton structure may correspond to human bones, for example, but the positions and number of bones do not necessarily have to match the actual human skeleton.
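As a minimal sketch of such skeleton data (class and field names are hypothetical, not taken from the disclosure), parts can carry position and posture while bones are simply pairs of part indices:

```python
from dataclasses import dataclass, field

@dataclass
class Part:
    # A body part (end part or joint part) with position and posture.
    name: str
    position: tuple  # (x, y, z)
    rotation: tuple  # (yaw, pitch, roll) in degrees

@dataclass
class Skeleton:
    parts: list = field(default_factory=list)
    bones: list = field(default_factory=list)  # pairs of part indices

    def add_bone(self, parent_idx: int, child_idx: int) -> None:
        # A bone is just a line segment connecting two parts; it need not
        # match a real anatomical bone one-to-one.
        self.bones.append((parent_idx, child_idx))

skel = Skeleton()
skel.parts.append(Part("hip", (0.0, 0.9, 0.0), (0.0, 0.0, 0.0)))
skel.parts.append(Part("head", (0.0, 1.7, 0.0), (0.0, 0.0, 0.0)))
skel.add_bone(0, 1)
```

This mirrors the point above that bones need not map one-to-one onto anatomical bones; they only connect parts.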
The sensor device 10 includes an inertial sensor (IMU: Inertial Measurement Unit) such as an acceleration sensor that acquires acceleration and a gyro sensor (angular velocity sensor) that acquires angular velocity. The sensor device 10 may also include sensors such as a geomagnetic sensor, an ultrasonic sensor, and a barometric pressure sensor.
The distribution user terminal 20 is an example of an information processing device used by the distribution user U1. The distribution user terminal 20 receives sensor data from the sensor devices 10 and generates an avatar video of the distribution user U1 using the received sensor data. As will be described in detail later, the distribution user terminal 20 acquires attachment-part information indicating the position and posture of each attachment part based on the sensor data, and generates, based on the attachment-part information, skeleton data including position information and posture information of each part in the skeleton structure. Furthermore, the distribution user terminal 20 generates an avatar video having the posture indicated by the skeleton data. The distribution user terminal 20 transmits the generated avatar video to the distribution server 30 and requests the distribution server 30 to distribute the avatar video.
The distribution server 30 distributes the avatar video to the viewing user terminals 40 based on a request from the distribution user terminal 20. Although FIG. 1 shows a single distribution server 30 realizing a distribution service provided by one business operator, there may be multiple business operators providing distribution services and multiple distribution servers 30. In that case, the distribution user terminal 20 can request distribution of the avatar video from the distribution server 30 that provides the distribution service specified by the distribution user U1.
The viewing user terminal 40 is an information processing device used by viewing users (for example, users U2 and U3 shown in FIG. 1). The viewing user terminal 40 has a display unit that displays various screens, an operation unit that detects operations by the viewing user, and a control unit that controls the overall operation of the viewing user terminal 40. For example, the viewing user terminal 40 requests the distribution server 30 to distribute the avatar video of the distribution user U1 based on an operation by the viewing user, and displays the avatar video distributed from the distribution server 30.
In motion capture technology realized using the motion sensors described above, drift error accumulates through long-term use or user movement, and the generated skeleton data may become inaccurate. When the skeleton data is inaccurate, the distributed avatar video may also become unnatural.
<2-1. Overall configuration>
FIG. 3 is an explanatory diagram showing the configuration of the distribution user terminal 20 according to an embodiment of the present disclosure. As shown in FIG. 3, the distribution user terminal 20 according to an embodiment of the present disclosure includes an operation unit 216, a display unit 220, a communication unit 230, and a control unit 240.
FIG. 4 is an explanatory diagram showing the functions of the base tool 250. As shown in FIG. 4, the base tool 250 includes a sensor data processing unit 251, a calibration unit 252, a first storage unit 253, a yaw angle correction unit 254, a skeleton data generation unit 255, and an application interface 256.
The sensor data processing unit 251 acquires, from the sensor device 10, sensor data indicating the acceleration, angular velocity, and the like of an attachment part, and estimates position information and posture information of each attachment part based on the sensor data. For example, the sensor data processing unit 251 estimates the position information of each attachment part by integrating the acceleration data, and estimates the posture information, including rotation angles such as the yaw angle, pitch angle, and roll angle of each attachment part, by integrating the angular velocity data.
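As a minimal sketch of the angular velocity integration step (an illustration only, with hypothetical names; a production implementation would use quaternion or rotation-matrix updates rather than treating the Euler axes as independent), gyro samples can be accumulated over each sample interval:

```python
def integrate_orientation(angles, gyro_samples, dt):
    """Advance (yaw, pitch, roll) in degrees by integrating angular
    velocity samples (deg/s) over a fixed sample interval dt (s).
    Simplified: treats the three axes as independent, which only
    approximately holds for small rotations."""
    yaw, pitch, roll = angles
    for wz, wy, wx in gyro_samples:
        yaw += wz * dt    # rotation about the gravity axis (drift-prone)
        pitch += wy * dt  # rotation about an axis orthogonal to gravity
        roll += wx * dt
    return (yaw, pitch, roll)

# One second of a constant 10 deg/s yaw rotation sampled at 100 Hz
angles = integrate_orientation((0.0, 0.0, 0.0), [(10.0, 0.0, 0.0)] * 100, 0.01)
```

Because the yaw angle has no gravity reference to correct against, small per-sample errors in this loop accumulate, which is exactly the drift the correction below addresses.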
The calibration unit 252 calibrates the posture information output from the sensor data processing unit 251. The calibration unit 252 may execute calibration when use of the base tool 250 starts, or may execute calibration in accordance with an operation by the distribution user. During calibration, the distribution user assumes an initial position and posture, for example standing upright with both arms lowered.
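One simple way to picture calibration at a known initial pose (a hypothetical sketch; the disclosure does not specify the calibration algorithm) is to compare the measured angles with the expected initial pose and store per-axis offsets:

```python
def calibrate(measured_angles, expected_angles):
    """Compute per-axis offsets so that the measured pose maps onto the
    known initial pose (e.g. standing upright with arms lowered)."""
    return tuple(m - e for m, e in zip(measured_angles, expected_angles))

def apply_calibration(raw_angles, offsets):
    # Subtract the stored offsets from every subsequent raw estimate.
    return tuple(r - o for r, o in zip(raw_angles, offsets))

# Measured (yaw, pitch, roll) while the user holds the initial pose (0, 0, 0)
offsets = calibrate((2.0, -1.0, 0.5), (0.0, 0.0, 0.0))
```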
The first storage unit 253 stores various kinds of information used for the operation of the base tool 250. For example, the first storage unit 253 stores registration posture information estimated by the sensor data processing unit 251 while the distribution user holds the registration posture.
Drift error in the pitch angle and roll angle estimated by the sensor data processing unit 251 is reduced by gravity correction. In addition, when the differences between the pitch angle and roll angle of a given attachment part and the second and third reference angles stored in the first storage unit 253 are equal to or less than a threshold, the posture of that attachment part is likely close to the registration posture. That is, the true yaw angle of that attachment part can be considered close or equal to the first reference angle stored in the first storage unit 253.
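This condition can be sketched as follows (a minimal sketch with hypothetical function names and threshold; the snap-back and the gradual variant illustrate the two correction behaviors discussed in this disclosure, not its exact implementation):

```python
def correct_yaw(yaw, pitch, roll, ref_yaw, ref_pitch, ref_roll, threshold_deg=5.0):
    """If pitch and roll are both within threshold_deg of their reference
    angles, the part is likely in the registration posture, so snap the
    drift-prone yaw back to the registered reference yaw."""
    if abs(pitch - ref_pitch) <= threshold_deg and abs(roll - ref_roll) <= threshold_deg:
        return ref_yaw  # condition met: correct the first rotation angle
    return yaw          # condition not met: leave the yaw untouched

def correct_yaw_gradual(yaw, ref_yaw, alpha=0.1):
    # Alternative: bring yaw toward the reference over time instead of
    # snapping, so the avatar does not jump visibly in a single frame.
    return yaw + alpha * (ref_yaw - yaw)

# Drifted yaw of 37 deg while pitch/roll sit near the registered pose
snapped = correct_yaw(37.0, pitch=1.2, roll=-0.8, ref_yaw=0.0, ref_pitch=0.0, ref_roll=0.0)
# Pitch far from the reference: the condition fails, yaw is kept
untouched = correct_yaw(37.0, pitch=30.0, roll=0.0, ref_yaw=0.0, ref_pitch=0.0, ref_roll=0.0)
```

The gravity-corrected pitch and roll serve as the trustworthy signal here: only when they indicate the registered posture is the unverifiable yaw pulled back to its reference.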
The skeleton data generation unit 255 generates skeleton data including position information and posture information of each part in the skeleton structure, based on the position information of each attachment part estimated by the sensor data processing unit 251 and the posture information of each attachment part including the yaw angle corrected by the yaw angle correction unit 254. The generation of skeleton data will be described more specifically below with reference to FIG. 7.
The application interface 256 is an interface with the application unit 260. The application interface 256 may be configured as an API (Application Programming Interface). For example, the application interface 256 returns the skeleton data of the distribution user to the application unit 260 in response to a request from the application unit 260. The application interface 256 also receives, from the application unit 260, an instruction to register the posture information of the registration posture.
The functions of the base tool 250 have been described above. Next, the functions of the application unit 260 will be described with reference to FIG. 7.
The base tool plug-in 261 is an interface with the base tool 250. The base tool plug-in 261 receives data from the base tool 250 and converts the data into a format that can be handled by the application unit 260. For example, the base tool plug-in 261 receives, from the base tool 250, skeleton data and notifications such as one indicating that the yaw angle correction described above has been performed.
The second storage unit 262 stores various kinds of information used for the operation of the application unit 260. For example, the second storage unit 262 may store information related to the registration posture information stored in the first storage unit 253 of the base tool 250.
The retargeting unit 265 receives the skeleton data of the distribution user from the base tool plug-in 261 and retargets the skeleton data, thereby generating an avatar video having the posture or movement indicated by the skeleton data.
The display control unit 267 generates various display screens and causes the display unit 220 to display them. For example, the display control unit 267 generates an avatar display screen including the avatar video generated by the retargeting unit 265 and causes the display unit 220 to display it. The display control unit 267 also generates, based on an operation of the operation unit 216 by the distribution user or on completion of calibration, a guidance screen guiding the distribution user to take the registration posture described above, and causes the display unit 220 to display the guidance screen.
The distribution control unit 268 transmits the avatar video generated by the retargeting unit 265 to the distribution server 30 and requests the distribution server 30 to distribute the avatar video. As a result, the avatar video is distributed to and displayed on the viewing user terminals 40.
The configuration of the information processing system according to an embodiment of the present disclosure has been described above. Next, the operation of the information processing system according to an embodiment of the present disclosure will be described.
FIG. 11 is a flowchart showing the flow up to registration of posture information in the information processing system according to an embodiment of the present disclosure. First, with the distribution user wearing a sensor device 10 on each body part and holding the initial position and posture, the calibration unit 252 of the distribution user terminal 20 calibrates each sensor device 10 (S304).
FIG. 12 is a flowchart showing correction processing A, which is performed when automatic correction is enabled. As shown in FIG. 12, first, the sensor data processing unit 251 acquires sensor data from each sensor device 10 (S404). Then, the sensor data processing unit 251 performs acceleration integration, gravity correction, and the like on the sensor data acquired from each sensor device 10, and estimates posture information including the yaw angle, pitch angle, and roll angle of each attachment part (S408).
FIG. 13 is a flowchart showing correction processing B, which is performed when automatic correction is not enabled. As shown in FIG. 13, first, the sensor data processing unit 251 acquires sensor data from each sensor device 10 (S504). Then, the sensor data processing unit 251 performs acceleration integration, gravity correction, and the like on the sensor data acquired from each sensor device 10, and estimates posture information including the yaw angle, pitch angle, and roll angle of each attachment part (S508).
According to the embodiment of the present disclosure described above, various effects are obtained. For example, when automatic correction is enabled, the yaw angle of each attachment part can be corrected automatically without requiring judgment or operation by the distribution user. Such an information processing system improves convenience for the distribution user.
An embodiment of the present disclosure has been described above. Below, several modifications of the embodiment are described. Each modification described below may be applied to the embodiment alone or in combination. Each modification may also be applied in place of the configuration described above, or in addition to it.
In the above, an example was described in which the base tool 250 registers the distribution user's posture information in advance for yaw angle correction; however, the base tool 250 may instead register other information for yaw angle correction, such as a relationship model between the yaw angle and the pitch and roll angles. The registration operation according to the first modification will be described below with reference to FIG. 14.
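One way to picture such a relationship model (a deliberately simple instance-based stand-in with hypothetical names; the disclosure does not fix a model class) is a table of learned (pitch, roll) to yaw pairs, queried by nearest neighbour:

```python
import math

def build_relation_table(samples):
    """Build a lookup table from training samples of (yaw, pitch, roll):
    each entry maps an observed (pitch, roll) pair to its yaw."""
    return [(pitch, roll, yaw) for (yaw, pitch, roll) in samples]

def estimate_yaw(table, pitch, roll):
    """Estimate the yaw from the current (pitch, roll) by nearest
    neighbour over the learned table, standing in for the learned
    relationship model between yaw and pitch/roll."""
    p, r, yaw = min(table, key=lambda t: math.hypot(t[0] - pitch, t[1] - roll))
    return yaw

# Training samples of (yaw, pitch, roll) collected in advance
table = build_relation_table([(0.0, 0.0, 0.0), (90.0, 45.0, 0.0), (180.0, 0.0, 45.0)])
```

The estimated yaw can then be used as the first reference angle for the correction described earlier, in place of a single registered posture.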
In the above, an example was described in which the yaw angle correction unit 254 estimates the yaw angle based on the current pitch angle and roll angle and corrects the current yaw angle using the estimated yaw angle. However, the yaw angle correction unit 254 can also estimate the yaw angle by other methods.
In the above, as the first configuration example of the information processing system, a configuration was described in which the distribution user terminal 20 has the base tool 250 and the application unit 260. However, other configurations of the information processing system of the present disclosure are also conceivable. Other configuration examples of the information processing system are described below.
FIG. 15 is an explanatory diagram showing a second configuration example of the information processing system. As shown in FIG. 15, the information processing system according to the second configuration example includes a distribution user terminal 20-2 and a processing terminal 50-2, connected via the network 12. The distribution user terminal 20-2 has the base tool 250 but does not have the application unit 260. The application unit 260 is implemented in the processing terminal 50-2.
FIG. 16 is an explanatory diagram showing a third configuration example of the information processing system. As shown in FIG. 16, the information processing system according to the third configuration example includes a distribution user terminal 20-3 and a processing terminal 50-3, connected via the network 12. The distribution user terminal 20-3 has the base tool 250 and an application unit 260-3. Of the configuration of the application unit 260 described with reference to FIG. 8, the application unit 260-3 lacks the retargeting unit 265 and the distribution control unit 268. Instead, the processing terminal 50-3 has the retargeting unit 265 and the distribution control unit 268.
FIG. 17 is an explanatory diagram showing a fourth configuration example of the information processing system. As shown in FIG. 17, the information processing system according to the fourth configuration example includes a distribution user terminal 20-4 and a processing terminal 50-4, connected via the network 12. The distribution user terminal 20-4 has the base tool 250. The processing terminal 50-4 has an application unit 260-4. The application unit 260-4 does not include the function of the distribution control unit 268; the processing terminal 50-4 has the function of the distribution control unit 268 separately.
FIG. 18 is an explanatory diagram showing a fifth configuration example of the information processing system. As shown in FIG. 18, the information processing system according to the fifth configuration example includes a distribution user terminal 20-5 and a processing terminal 50-5, connected via the network 12. The distribution user terminal 20-5 has the base tool 250. The processing terminal 50-5 has an application unit 260-5. The application unit 260-5 does not include the functions of the retargeting unit 265 and the distribution control unit 268; the processing terminal 50-5 has those functions separately.
In the above, an example was mainly described in which functions such as the operation unit 216, the display unit 220, the communication unit 230, and the control unit 240 are implemented in a PC-type distribution user terminal 20, but these functions may also be implemented in a mobile terminal such as a smartphone. The functions may also be implemented in a distributed manner across multiple mobile terminals, or in a distributed and redundant manner. With reference to FIG. 19, an example in which the functions are implemented in a distributed manner across multiple mobile terminals is described as a sixth configuration example.
The embodiments of the present disclosure have been described above. Information processing such as the yaw angle correction and skeleton data generation described above is realized through cooperation between software and the hardware of the distribution user terminal 20 described below.
Although preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, the present disclosure is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field to which the present disclosure belongs could conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure.
(1)
An information processing method including:
estimating posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and
correcting the first rotation angle when a condition regarding the second rotation angle is satisfied.
(2)
The information processing method according to (1), wherein the condition includes the second rotation angle having a predetermined relationship with a second reference angle, and
correcting the first rotation angle includes correcting the first rotation angle using a first reference angle corresponding to the second reference angle.
(3)
The information processing method according to (2), wherein the predetermined relationship includes a relationship in which a difference between the second rotation angle and the second reference angle is equal to or less than a threshold.
(4)
The information processing method according to (2) or (3), further including:
calibrating the gyro sensor;
estimating registration posture information of the moving body based on output obtained from the gyro sensor while the moving body holds a registration posture after the calibration; and
registering the first rotation angle included in the registration posture information as the first reference angle, and registering the second rotation angle included in the registration posture information as the second reference angle.
(5)
The information processing method according to (4), including displaying, after the calibration, a guidance screen that guides the moving body to take the registration posture.
(6)
The information processing method according to any one of (2) to (4), wherein correcting the first rotation angle is performing correction that brings the first rotation angle closer to the first reference angle over time.
(7)
The information processing method according to any one of (1) to (6), further including correcting the first rotation angle based on a predetermined operation input having been performed.
(8)
The information processing method according to (2) or (3), further including constructing, by learning, a relationship model between the first rotation angle and the second rotation angle based on a plurality of pieces of training data including the first rotation angle and the second rotation angle,
wherein the first reference angle is a first rotation angle estimated, based on the relationship model, from the second rotation angle of the moving body.
(9)
The information processing method according to (8), further including:
evaluating the relationship model using a plurality of pieces of test data including the first rotation angle and the second rotation angle; and
extracting, as the second reference angle, a second rotation angle for which the estimation error of the first rotation angle was equal to or less than a threshold in the evaluation using the plurality of pieces of test data.
(10)
The information processing method according to any one of (1) to (9), wherein the gyro sensor is attached to each of a plurality of parts of the moving body, and
correcting the first rotation angle includes correcting the first rotation angle of a given part when the condition regarding the second rotation angle of that part is satisfied.
(11)
The information processing method according to any one of (1) to (10), wherein the posture information of the moving body includes a third rotation angle about an axis orthogonal to both the axis in the direction of gravity and the axis orthogonal to the direction of gravity, and
the condition is a condition regarding the second rotation angle and the third rotation angle.
(12)
The information processing method according to any one of (1) to (11), further including estimating position information indicating a position of the moving body,
wherein the condition includes a condition regarding the position of the moving body.
(13)
The information processing method according to any one of (1) to (12), wherein a geomagnetic sensor is attached to the moving body, and
the condition includes a condition regarding a geomagnetic value obtained from the geomagnetic sensor.
(14)
An information processing device including:
an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and
a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
(15)
A program for causing a computer to function as:
an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and
a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
216 Operation unit
220 Display unit
230 Communication unit
250 Base tool
251 Sensor data processing unit
252 Calibration unit
253 First storage unit
254 Yaw angle correction unit
255 Skeleton data generation unit
256 Application interface
260 Application unit
261 Base tool plug-in
262 Second storage unit
265 Retargeting unit
267 Display control unit
268 Distribution control unit
30 Distribution server
40 Viewing user terminal
Claims (15)
- An information processing method including: estimating posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and correcting the first rotation angle when a condition regarding the second rotation angle is satisfied.
- The information processing method according to claim 1, wherein the condition includes the second rotation angle having a predetermined relationship with a second reference angle, and correcting the first rotation angle includes correcting the first rotation angle using a first reference angle corresponding to the second reference angle.
- The information processing method according to claim 2, wherein the predetermined relationship includes a relationship in which a difference between the second rotation angle and the second reference angle is equal to or less than a threshold.
- The information processing method according to claim 2, including: calibrating the gyro sensor; estimating registration posture information of the moving body based on output obtained from the gyro sensor while the moving body holds a registration posture after the calibration; and registering the first rotation angle included in the registration posture information as the first reference angle, and registering the second rotation angle included in the registration posture information as the second reference angle.
- The information processing method according to claim 4, including displaying, after the calibration, a guidance screen that guides the moving body to take the registration posture.
- The information processing method according to claim 2, wherein correcting the first rotation angle is performing correction that brings the first rotation angle closer to the first reference angle over time.
- The information processing method according to claim 1, further including correcting the first rotation angle based on a predetermined operation input having been performed.
- The information processing method according to claim 2, further including constructing, by learning, a relationship model between the first rotation angle and the second rotation angle based on a plurality of pieces of training data including the first rotation angle and the second rotation angle, wherein the first reference angle is a first rotation angle estimated, based on the relationship model, from the second rotation angle of the moving body.
- The information processing method according to claim 8, further including: evaluating the relationship model using a plurality of pieces of test data including the first rotation angle and the second rotation angle; and extracting, as the second reference angle, a second rotation angle for which the estimation error of the first rotation angle was equal to or less than a threshold in the evaluation using the plurality of pieces of test data.
- The information processing method according to claim 1, wherein the gyro sensor is attached to each of a plurality of parts of the moving body, and correcting the first rotation angle includes correcting the first rotation angle of a given part when the condition regarding the second rotation angle of that part is satisfied.
- The information processing method according to claim 1, wherein the posture information of the moving body includes a third rotation angle about an axis orthogonal to both the axis in the direction of gravity and the axis orthogonal to the direction of gravity, and the condition is a condition regarding the second rotation angle and the third rotation angle.
- The information processing method according to claim 1, further including estimating position information indicating a position of the moving body, wherein the condition includes a condition regarding the position of the moving body.
- The information processing method according to claim 1, wherein a geomagnetic sensor is attached to the moving body, and the condition includes a condition regarding a geomagnetic value obtained from the geomagnetic sensor.
- An information processing device including: an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
- A program for causing a computer to function as: an estimation unit that estimates posture information of a moving body, including a first rotation angle about an axis in the direction of gravity and a second rotation angle about an axis orthogonal to the direction of gravity, based on output from a gyro sensor attached to the moving body; and a rotation angle correction unit that corrects the first rotation angle when a condition regarding the second rotation angle is satisfied.
Priority Applications (4)

| Application Number | Publication | Priority Date | Filing Date | Title |
|---|---|---|---|---|
| EP22784300.0A | EP4321973A1 (en) | 2021-04-08 | 2022-01-13 | Information processing method, information processing device, and program |
| CN202280026569.8A | CN117120961A (zh) | 2021-04-08 | 2022-01-13 | 信息处理方法、信息处理装置和程序 |
| US 18/285,130 | US20240175717A1 (en) | 2021-04-08 | 2022-01-13 | Information processing method, information processing apparatus, and program |
| JP2023512824A | JPWO2022215313A1 (ja) | 2021-04-08 | 2022-01-13 | |

Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021065822 | 2021-04-08 | | |
| JP2021-065822 | 2021-04-08 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2022215313A1 (ja) | 2022-10-13 |
Family
ID=83546312
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/000903 (WO2022215313A1, ja) | Information processing method, information processing device, and program | | 2022-01-13 |
Country Status (5)

| Country | Link |
|---|---|
| US (1) | US20240175717A1 (ja) |
| EP (1) | EP4321973A1 (ja) |
| JP (1) | JPWO2022215313A1 (ja) |
| CN (1) | CN117120961A (zh) |
| WO (1) | WO2022215313A1 (ja) |
Citations (6)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009503530A * | 2005-08-01 | 2009-01-29 | トヨタ自動車株式会社 | Acceleration sensor correction device and acceleration sensor output value correction method |
| JP2009250778A * | 2008-04-07 | 2009-10-29 | Alpine Electronics Inc | Iterative computation control method and apparatus in Kalman filter processing |
| JP2011257342A * | 2010-06-11 | 2011-12-22 | Nsk Ltd | Head tracking device and head tracking method |
| JP2015149051A * | 2014-01-08 | 2015-08-20 | 富士通株式会社 | Input device, input method, and input program |
| WO2019203188A1 | 2018-04-17 | 2019-10-24 | ソニー株式会社 | Program, information processing device, and information processing method |
| JP2019211864A * | 2018-05-31 | 2019-12-12 | 株式会社コロプラ | Computer program, information processing device, and information processing method |
2022
- 2022-01-13: WO application PCT/JP2022/000903 (WO2022215313A1, ja), active, Application Filing
- 2022-01-13: EP application EP22784300.0A (EP4321973A1, en), pending
- 2022-01-13: US application 18/285,130 (US20240175717A1, en), pending
- 2022-01-13: CN application CN202280026569.8A (CN117120961A, zh), pending
- 2022-01-13: JP application JP2023512824A (JPWO2022215313A1, ja), pending
Also Published As

| Publication number | Publication date |
|---|---|
| US20240175717A1 (en) | 2024-05-30 |
| CN117120961A (zh) | 2023-11-24 |
| EP4321973A1 (en) | 2024-02-14 |
| JPWO2022215313A1 (ja) | |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22784300; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2023512824; Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document number: 18285130; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2022784300; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2022784300; Country of ref document: EP; Effective date: 20231108 |