WO2023276216A1 - Information processing device, information processing method, and program - Google Patents
- Publication number: WO2023276216A1 (application PCT/JP2022/003202)
- Authority: WIPO (PCT)
Classifications
- G06T19/00 — Manipulating 3D models or images for computer graphics
- G06T19/006 — Mixed reality
- G01B11/00 — Measuring arrangements characterised by the use of optical techniques
- G01B11/27 — Measuring arrangements characterised by the use of optical techniques for testing the alignment of axes
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/012 — Head tracking input arrangements
- G06F3/0346 — Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Description
- The present invention relates to an information processing device, an information processing method, and a program.
- As an invention for determining contact with an object in virtual space, there is, for example, the information processing device disclosed in Patent Document 1.
- This information processing device identifies the length of the user's arm and sets the reachable range of the collision area of an object based on the collision area set for the object held by the user's avatar and the range within which the user can move the hand. Because the reachable range of the collision area is set according to the user's arm length, the reachable range remains appropriate even if the user changes.
- GUI is an abbreviation of Graphical User Interface.
- However, a user corresponding to an avatar may crouch or tilt their head in the space, and depending on the user's posture the reachable range of the avatar may change, resulting in poor operability and usability.
- The present disclosure proposes an information processing device, an information processing method, and a program capable of suppressing deterioration of usability.
- According to the present disclosure, there is provided an information processing device comprising: a position processing unit that estimates the position of a user's head and estimates the position of a predetermined part of the user's body based on the estimated head position; a correction unit that corrects the estimated position based on the head position and the inclination angle of the user's head; and a display control unit that performs a process of superimposing and displaying an image operated by the user in the space visually recognized by the user, based on the position corrected by the correction unit. The present disclosure further provides an information processing method in which the information processing of the information processing device is executed by a computer, and a program for causing a computer to implement the information processing of the information processing device.
- FIG. 1 is a diagram showing devices that constitute an information processing system according to an embodiment.
- FIG. 2 is a diagram showing an example of a work place and a GUI visually recognized by a user via a terminal device.
- FIG. 3 is a diagram for explaining a method of calculating the waist position of the user.
- FIG. 4 is a diagram for explaining a method of correcting the waist position of the user.
- FIG. 5 is a diagram showing an example of a display mode of operation buttons according to the embodiment.
- FIG. 6 is a diagram showing an example of a work place and a GUI visually recognized by the user through the terminal device.
- FIG. 7 is a diagram illustrating an example of a display mode of operation buttons and a virtual hand according to the embodiment.
- FIG. 8 is a block diagram showing the hardware configuration and functional configuration of the information processing apparatus.
- FIG. 9 is a block diagram showing the hardware configuration and functional configuration of the terminal device.
- FIG. 10 is a block diagram showing the hardware configuration of the information providing device.
- FIG. 11 is a flowchart showing the flow of processing executed when displaying operation buttons.
- FIG. 12 is a flowchart showing the flow of processing for changing the display mode of operation buttons.
- FIG. 13 is a flowchart showing the flow of processing for changing the display mode of the operation buttons.
- FIG. 14 is a hardware configuration diagram of an example of a computer that implements the functions of the information processing apparatus.
- When the operator moves the line of sight or moves within the 3D space, the GUI may end up positioned outside the field of view.
- In that case, it is first necessary to search for the GUI that has moved out of the field of view, which takes time and effort.
- The present disclosure therefore proposes an information processing device, an information processing method, and a program with good operability.
- In the following, field workers are appropriately referred to as "users".
- The user may also be a person who experiences, through AR, the experience of a worker in a field.
- In the embodiment, the user is an actual field worker.
- FIG. 1 is a diagram showing devices constituting an information processing system 1.
- The information processing system 1 includes an information processing device 10, a terminal device 20, an information providing device 30, and a sensor group 40.
- The information processing device 10 is connected to the communication line N by wire, but may instead be connected wirelessly.
- Various devices can be connected to the information processing device 10.
- A terminal device 20 and an information providing device 30 are connected to the information processing device 10 via the communication line N, and information is linked between the devices.
- The terminal device 20, the information providing device 30, and the sensor group 40 are also connected to the communication line N by wire or wirelessly.
- The wireless connection of the terminal device 20, the information providing device 30, and the sensor group 40 to the communication line N is, for example, a connection via a wireless LAN, but is not limited to a wireless LAN; another wireless connection method may be used.
- The terminal device 20 is, for example, an optical see-through head-mounted display capable of AR display, such as HoloLens (registered trademark) or HoloLens 2.
- The terminal device 20 may instead be a terminal device such as a smartphone capable of AR display using ARCore (registered trademark), ARKit (registered trademark), or the like.
- The terminal device 20 may also be a video see-through AR or XR device such as the Varjo (registered trademark) XR-1.
- The terminal device 20 may further be a VR device such as a head-mounted display capable of VR display.
- The terminal device 20 treats the range that the user can visually recognize through it as the target range and, based on, for example, the visualization information provided from the information processing device 10, AR-displays various information related to the target range visually recognized in the field.
- The terminal device 20 is an example of an information processing device according to the present disclosure.
- The sensor group 40 includes sensors that measure various types of information about the field, such as a camera that photographs the field, a sensor that measures the amount of sunshine in the field, and a sensor that measures the moisture content of the soil in the field.
- The information providing device 30 is an information processing device that provides the information processing device 10 with various types of information regarding the target range.
- The information providing device 30 acquires and stores various types of information, such as the amount of sunshine, the moisture content of the soil, and images of the field, from the sensor group 40 installed in the field.
- The information providing device 30 provides the information processing device 10 with the stored information about the target range in response to a request.
- The information providing device 30 is implemented by a PC (Personal Computer), a WS (Workstation), or the like, but is not limited to a PC, WS, or the like.
- The information processing device 10 performs processing for providing the terminal device 20 with information to be AR-displayed on the terminal device 20. Specifically, the information processing device 10 acquires sensor information, described later, from the terminal device 20. The information processing device 10 requests from the information providing device 30 various information, such as the amount of sunshine and the moisture content of the soil, for the target range specified using the acquired sensor information, and acquires the information supplied in response to the request. Then, based on the information acquired from the information providing device 30, the information processing device 10 provides visualization information for displaying the amount of sunshine and the moisture content of the soil for the target range.
- The information processing device 10 is implemented by a PC, a WS, or the like, but is not limited to these; for example, the functions of the information processing device 10 may be implemented as an application running on such a device.
- An example of use of the information processing system 1 will now be described.
- The user wears the terminal device 20, which is a see-through head-mounted display.
- In the embodiment, it is assumed that there is a work place in the farm field.
- FIG. 2 is a diagram showing an example of a work place viewed by the user through the terminal device 20 and an AR-displayed GUI displayed on the terminal device 20 and viewed by the user.
- The vegetation visually recognized by the user at the work place includes a tomato V11, a carrot V12, a Chinese chive V13, and the like.
- The AR display visually recognized by the user includes operation buttons B11 to B13 as an example of a GUI, a virtual hand HL, which is a virtual object visualizing the worker's left hand sensed by the terminal device 20, and a virtual hand HR, which is a virtual object visualizing the worker's sensed right hand.
- The operation buttons B11 to B13 are displayed based on the position of the user's waist.
- The operation button B11 is a button for controlling the AR display of information on the amount of sunshine at the work place.
- The operation button B12 is a button for controlling the AR display of the moisture content of the soil at the work place.
- The operation button B13 is a button for controlling the AR display of the complexity of vegetation at the work place.
- FIG. 3 is a diagram for explaining a method of calculating the waist position of the user.
- The terminal device 20 senses its position relative to the origin (0, 0, 0) of a left-handed three-dimensional orthogonal coordinate system whose origin is a predetermined position in the field, and takes the coordinates of the sensed head position P1 as the user's head coordinates (xh, yh, zh).
- The terminal device 20 identifies the coordinates (xp, yp, zp) of the user's waist position P2 based on the head position P1.
- First, consider the case where the vertical height h from the ground at the field origin to the head position P1 is greater than a threshold H, which is a predetermined reference height.
- In this case, the user's waist position P2 is positioned below the head position P1 by a head-to-waist distance α. If the coordinates of the head position P1 are (xh, yh, zh), the coordinates (xp, yp, zp) of the waist position P2 are (xh, yh − α, zh).
- The distance α may be a distance based on a predetermined human body model, or a distance obtained by measuring in advance the distance from the head to the waist while the user stands upright.
- When the user assumes a crouching posture, for example as shown in FIG. 3, and the height h is equal to or less than the threshold H, the terminal device 20 sets the user's waist position P2 to a position below the head position P1 by a distance β. The distance β is shorter than the distance α. If the coordinates of the head position P1 are (xh, yh, zh), the coordinates (xp, yp, zp) of the waist position P2 are (xh, yh − β, zh).
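As a concrete illustration, the estimation above can be sketched as follows. The numeric values of the threshold H and of the offsets α (standing) and β (crouching) are assumptions made for illustration; the text only requires that α come from a body model or a prior measurement, and that β be shorter than α.

```python
# Sketch of the waist-position estimation described above.
# H, ALPHA and BETA are illustrative assumptions, not values from the text.
H = 1.2        # reference head-height threshold, in metres (assumed)
ALPHA = 0.65   # head-to-waist offset when standing (assumed)
BETA = 0.35    # head-to-waist offset when crouching; BETA < ALPHA

def estimate_waist_position(head, ground_y=0.0):
    """Return the waist coordinates (xp, yp, zp) for head P1 = (xh, yh, zh).

    The head height h is measured from the ground plane containing the
    field origin; the waist is placed directly below the head by ALPHA
    when h exceeds the threshold H, and by BETA otherwise.
    """
    xh, yh, zh = head
    h = yh - ground_y
    offset = ALPHA if h > H else BETA
    return (xh, yh - offset, zh)
```

The same function covers both the standing case (yh − α) and the crouching case (yh − β) of the text, switching on the head height.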
- Having identified the waist position P2, the terminal device 20 corrects the user's waist position P2 according to the inclination of the terminal device 20.
- FIG. 4 is a diagram for explaining a method of correcting the waist position P2 of the user.
- The terminal device 20 sets a left-handed three-dimensional orthogonal coordinate system with the head coordinates as its origin.
- The terminal device 20 corrects the user's waist position P2 when the pitch angle θ around the x-axis of the left-handed three-dimensional orthogonal coordinate system with the head coordinates as the origin is not within a predetermined range.
- FIG. 4(a) shows a state in which the terminal device 20 is not rotated around the x-axis of the left-handed three-dimensional orthogonal coordinate system with the head coordinates as the origin, with the user standing.
- The terminal device 20 senses the tilt angle θ of the terminal device 20 about the x-axis from the horizontal state shown in FIG. 4(a), and the vertical movement distance Δh of the head position P1 from the standing state.
- When the terminal device 20 is tilted downward from the horizontal, the coordinates of the waist position after correction are calculated by the following equations (1) to (3):
- xp (after correction) = xp (before correction) (1)
- yp (after correction) = yp (before correction) + Δh (2)
- zp (after correction) = zp (before correction) − Δh · tan θ (3)
- When the terminal device 20 is tilted upward from the horizontal, the coordinates are calculated by the following equations (4) to (6):
- xp (after correction) = xp (before correction) (4)
- yp (after correction) = yp (before correction) − Δh (5)
- zp (after correction) = zp (before correction) + Δh · tan θ (6)
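Equations (1) to (6) can be sketched in code as follows. Treating θ > 0 as "tilted downward" is an assumed sign convention; the text only distinguishes the downward case (equations (1) to (3)) from the upward case (equations (4) to (6)).

```python
import math

def correct_waist_position(waist, delta_h, theta_deg):
    """Apply equations (1)-(6) to the pre-correction waist (xp, yp, zp).

    delta_h is the vertical movement distance of the head from the
    standing state, and theta_deg the pitch of the terminal device from
    the horizontal, with theta_deg > 0 assumed to mean tilted downward.
    """
    xp, yp, zp = waist
    shift = delta_h * math.tan(math.radians(abs(theta_deg)))
    if theta_deg > 0:
        # Tilted downward: move the waist upward and backward, eqs (1)-(3).
        return (xp, yp + delta_h, zp - shift)
    # Tilted upward: move the waist downward and forward, eqs (4)-(6).
    return (xp, yp - delta_h, zp + shift)
```

When θ stays inside the tolerated range, the flowchart skips this correction entirely (step S107).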
- FIG. 5 is a diagram showing an example of a display mode of the operation button B11.
- The operation button B11 has a transparent hemispherical operated portion C1, an annular indicator portion C2, and a function display portion C3 containing an icon representing the function corresponding to the operation button B11.
- When the virtual hand HL or the virtual hand HR contacts a button, the terminal device 20 executes the process corresponding to that button.
- For example, when the operation button B11 is operated, the terminal device 20 requests from the information processing device 10 the visualization information of the amount of sunshine in the target range visually recognized by the user.
- The terminal device 20 then AR-displays a graph GR of the amount of sunshine as shown in FIG.
- When the terminal device 20 determines that the virtual hand HL or the virtual hand HR touches the operated portion C1, the operated portion C1 is displayed lower than in the untouched state, matched to the height of the touching virtual hand. It may instead be displayed in another display manner.
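The contact determination itself is not specified in the text; a minimal sketch, assuming a simple distance test against the hemisphere of the operated portion C1 and a lowered display height while touched, might look like this (the radius value and the test geometry are assumptions):

```python
import math

def hand_touches_button(hand_pos, button_center, radius=0.05):
    """Return True when a virtual-hand point lies inside the upper
    hemisphere of the operated portion C1. The distance test and the
    default radius are assumptions for illustration."""
    dx = hand_pos[0] - button_center[0]
    dy = hand_pos[1] - button_center[1]
    dz = hand_pos[2] - button_center[2]
    within_sphere = math.sqrt(dx * dx + dy * dy + dz * dz) <= radius
    return within_sphere and dy >= 0.0   # upper half of the sphere only

def displayed_height(button_y, hand_y, touching):
    """While touched, C1 is shown lowered to the touching hand's height."""
    return min(button_y, hand_y) if touching else button_y
```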
- FIG. 8 is a block diagram showing the functional configuration of the information processing device 10. As shown in FIG. 8, the information processing device 10 includes a control unit 100, a communication unit 110, and a storage unit 120.
- The communication unit 110 has a function of communicating with an external device.
- In communication with an external device, the communication unit 110 supplies the information received from that device to the control unit 100.
- Specifically, the communication unit 110 supplies information received from the information providing device 30 and information received from the terminal device 20 to the control unit 100.
- The communication unit 110 also transmits information supplied from the control unit 100 to an external device.
- Specifically, the communication unit 110 acquires from the control unit 100 the visualization information about the target range generated by the control unit 100 based on the information supplied from the information providing device 30, and transmits the acquired visualization information to the terminal device 20.
- The communication unit 110 also acquires from the control unit 100 the information representing the target range generated by the control unit 100 based on the sensor information supplied from the terminal device 20, and transmits the acquired information to the information providing device 30.
- The storage unit 120 is implemented by, for example, a semiconductor memory device such as a RAM or flash memory, or a storage device such as a hard disk or an optical disk.
- The storage unit 120 has a function of storing information regarding processing in the information processing device 10.
- The storage unit 120 stores, for example, information about the target range supplied from the information providing device 30.
- The position specifying unit 1021 has a function of specifying the target range.
- The position specifying unit 1021 specifies the position of the terminal device 20 and the direction in which the user is facing, based on the sensor information acquired from the terminal device 20.
- The position specifying unit 1021 specifies the target range that the user can visually recognize in the field through the terminal device 20, based on the specified position and direction.
- The sensor unit 250 has a head position measurement unit 251, a hand posture measurement unit 252, and a voice acquisition unit 253.
- The head position measurement unit 251 has an acceleration sensor 251a, an orientation sensor 251b, a depth sensor 251c, a gyro sensor 251d, a SLAM 251e, and a GPS module 251f.
- The acceleration sensor 251a is, for example, a triaxial acceleration sensor.
- The acceleration sensor 251a outputs acceleration information representing the measured acceleration.
- The orientation sensor 251b is a sensor that measures geomagnetism and detects the direction in which the terminal device 20 is facing.
- The orientation sensor 251b outputs orientation information representing the detected orientation.
- The depth sensor 251c is a sensor that measures the distance from the terminal device 20 to a person or object within the target range.
- The depth sensor 251c outputs depth information representing the measured distance.
- The gyro sensor 251d is a sensor that measures the angular velocity of the terminal device 20.
- The gyro sensor 251d outputs angular velocity information representing the measured angular velocity.
- The SLAM 251e is, for example, a Lidar (Light Detection And Ranging) SLAM (Simultaneous Localization and Mapping) equipped with a laser scanner, or a Visual SLAM equipped with a camera.
- The SLAM 251e senses the target range and outputs map information representing an environmental map of the target range.
- The GPS module 251f receives radio waves from satellites of a satellite positioning system and measures the position of the terminal device 20.
- The GPS module 251f outputs position information representing the measured position.
- The hand posture measurement unit 252 has a depth sensor 252a and an infrared camera 252b.
- The infrared camera 252b emits infrared light, receives the infrared light reflected by the user's hand, and photographs the user's hand.
- The depth sensor 252a measures the distance to the user's hand based on the image of the user's hand generated by the infrared camera 252b.
- The hand posture measurement unit 252 outputs hand posture information including the measured distance to the user's hand and the image of the user's hand.
- The voice acquisition unit 253 has a microphone 253a.
- The microphone 253a collects sounds around the terminal device 20 and outputs audio information representing the collected sounds.
- The storage unit 210 is implemented by, for example, a semiconductor memory device such as a RAM or flash memory.
- The storage unit 210 has a function of storing information regarding processing in the terminal device 20.
- The storage unit 210 stores, for example, visualization information supplied from the information processing device 10.
- The storage unit 210 also stores application programs executed by the terminal device 20.
- An application program stored in the storage unit 210 is, for example, a program that allows the user to visually recognize, through AR display, graphs of the amount of sunshine in a field, the moisture content of the soil, the complexity of vegetation, and the like.
- The control unit 200 is implemented by executing an application program stored in the storage unit 210.
- The control unit 200 has a position processing unit 201, a hand posture processing unit 202, a correction unit 203, a display processing unit 204, a display control unit 205, and a communication control unit 206, as shown in FIG.
- Based on the sensor information output from the sensor unit 250, the position processing unit 201 identifies the aforementioned head position P1, the waist position P2, which is an example of a predetermined part of the body, the inclination angle θ of the terminal device 20, and the like.
- The hand posture processing unit 202 identifies the position and posture of the user's hands based on the hand posture information output from the hand posture measurement unit 252.
- The correction unit 203 corrects the waist position P2 based on the head position P1 and the angle θ.
- The display processing unit 204 generates AR-display images of the operation buttons B11 to B13 and the like, and generates images of the virtual hand HL and the virtual hand HR based on the positions and postures identified by the hand posture processing unit 202.
- The display processing unit 204 also generates images for AR display, such as the graph GR, based on the visualization information provided from the information processing device 10.
- The display control unit 205 controls the video output unit 220 so that the images of the operation buttons B11 to B13 generated by the display processing unit 204 are AR-displayed at the waist position P2 corrected by the correction unit 203.
- The display control unit 205 also controls the video output unit 220 so that the images of the virtual hand HL and the virtual hand HR are AR-displayed at the hand positions identified by the hand posture processing unit 202.
- The display control unit 205 further controls the video output unit 220 so that the images generated in the display processing unit 204 based on the visualization information are AR-displayed in the target range.
- The communication control unit 206 controls the external communication unit 240 to transmit information to, and receive information from, the information processing device 10.
- The video output unit 220 displays, on a half mirror, the AR image output from the control unit 200 so that the user can view it.
- The audio output unit 230 includes a speaker and outputs sounds represented by audio signals supplied from an external device.
- The external communication unit 240 has a function of communicating with an external device. For example, in communication with an external device, the external communication unit 240 supplies the information received from that device to the control unit 200. Specifically, the external communication unit 240 supplies visualization information received from the information processing device 10 to the control unit 200. The external communication unit 240 also transmits information supplied from the control unit 200 to the external device. Specifically, the external communication unit 240 transmits to the information processing device 10 the sensor information that the sensor unit 250 outputs to the control unit 200.
- FIG. 10 is a block diagram showing the hardware configuration of the information providing device 30. As shown in FIG. 10, the information providing device 30 includes a control unit 300, a storage unit 310, and a communication unit 320.
- The communication unit 320 has a function of communicating with an external device. For example, in communication with an external device, the communication unit 320 outputs the information received from that device to the control unit 300. Specifically, the communication unit 320 outputs information received from the information processing device 10 to the control unit 300; for example, it outputs the information representing the target range transmitted from the information processing device 10. The communication unit 320 also transmits information about the target range supplied from the control unit 300 to the information processing device 10.
- The control unit 300 has a function of controlling the operation of the information providing device 30.
- The control unit 300 transmits various types of information regarding the target range to the information processing device 10 via the communication unit 320.
- Specifically, the control unit 300 accesses the storage unit 310 and transmits to the information processing device 10 the amount of sunshine, the moisture content of the soil, and the complexity of vegetation in the target range.
- The storage unit 310 is implemented by, for example, a semiconductor memory device such as a RAM or flash memory, or a storage device such as a hard disk or an optical disk.
- The storage unit 310 has a function of storing data related to processing in the information providing device 30.
- The storage unit 310 stores, for example, information on the amount of sunshine and the moisture content of the soil acquired from the sensor group 40 installed in the field, images of the field, and the like.
- FIG. 11 is a flow chart showing the flow of processing executed when the operation buttons B11 to B13 are displayed on the terminal device 20.
- First, the terminal device 20 sets a predetermined position in the field as the origin (0, 0, 0) (step S101).
- Next, the terminal device 20 specifies the coordinates of the head position P1 when the user is standing, based on the information output from the sensor unit 250 (step S102).
- The terminal device 20 also specifies, based on the information output from the sensor unit 250, the tilt angle θ of the terminal device 20 from the horizontal state and the vertical movement distance Δh of the terminal device 20 from the standing state (step S103).
- Next, the terminal device 20 determines whether the height of the head position P1 is greater than the aforementioned threshold value H, based on the coordinates specified in step S102 (step S104). When the height of the head position P1 is greater than the threshold value H (Yes in step S104), the terminal device 20 sets the user's waist position P2 to a position lower than the head position P1 by the distance Δ (step S105). On the other hand, when the height of the head position P1 is equal to or less than the threshold value H (No in step S104), the terminal device 20 sets the user's waist position P2 to a position below the head position P1 by the distance Δ (step S106).
- Next, the terminal device 20 determines whether the tilt angle θ from the horizontal state specified in step S103 is within a predetermined range (−A ≤ θ ≤ A) (step S107). When the tilt angle θ from the horizontal state is within the predetermined range (Yes in step S107), the terminal device 20 shifts the flow of processing to step S111.
- When the tilt angle θ is outside the predetermined range (No in step S107), the terminal device 20 determines whether the terminal device 20 is tilted downward from the horizontal plane (step S108). When the terminal device 20 is tilted downward from the horizontal plane (Yes in step S108), the terminal device 20 corrects the coordinates of the waist position P2 upward and backward (step S109). Here, as described above, the terminal device 20 calculates the coordinates of the corrected waist position P2 using the above equations (1) to (3). When the terminal device 20 is tilted upward from the horizontal plane (No in step S108), it corrects the coordinates of the waist position P2 downward and forward (step S110).
- In this case, the terminal device 20 calculates the coordinates of the corrected waist position P2 using the above equations (4) to (6). After the processing of step S109 or step S110, the terminal device 20 shifts the flow of processing to step S111.
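The branch on the tilt angle θ and the corrections of equations (1) to (3) and (4) to (6) can be sketched as follows. This is an illustrative reading, not the patent's implementation: the function name, the parameter defaults (the distance Δ and the range half-width A), and the sign convention for θ (negative for a downward tilt) are all assumptions.

```python
import math

def estimate_waist_position(head, theta_deg, delta_h,
                            delta=0.8, a_deg=10.0):
    """Sketch of steps S105-S110 in FIG. 11 (names and defaults assumed).

    head      -- (x, y, z) coordinates of the head position P1, y vertical
    theta_deg -- tilt angle theta of the terminal device from the horizontal
                 state, in degrees; negative means tilted downward
    delta_h   -- vertical movement distance from the standing state
    delta     -- assumed head-to-waist distance (the patent's distance)
    a_deg     -- assumed half-width A of the no-correction range -A..A
    """
    x, y, z = head
    # Steps S105/S106: place the waist the distance delta below the head.
    xp, yp, zp = x, y - delta, z

    # Step S107: within -A <= theta <= A no correction is applied.
    if -a_deg <= theta_deg <= a_deg:
        return (xp, yp, zp)

    tan_theta = math.tan(math.radians(abs(theta_deg)))
    if theta_deg < 0:
        # Step S109, equations (1)-(3): downward tilt -> up and backward.
        yp = yp + delta_h
        zp = zp - delta_h / tan_theta
    else:
        # Step S110, equations (4)-(6): upward tilt -> down and forward.
        yp = yp - delta_h
        zp = zp + delta_h / tan_theta
    return (xp, yp, zp)
```

For a head at (0, 1.6, 0) with a 30-degree downward tilt and Δh = 0.1, this moves the waist up by 0.1 and backward by 0.1 ÷ tan 30°, matching the direction of correction described for step S109.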
- Next, the terminal device 20 determines whether the origin has moved (step S111). If the origin has moved (Yes in step S111), the terminal device 20 ends the process shown in FIG. 11. If the origin has not moved (No in step S111), the terminal device 20 displays the operation buttons B11 to B13 at the waist position P2 (step S112). Here, when the determination in step S107 is Yes, the terminal device 20 displays the operation buttons B11 to B13 based on the waist position P2 set in step S105 or step S106. When the determination in step S107 is No, it displays the operation buttons B11 to B13 based on the waist position P2 corrected in step S109 or step S110. Next, the terminal device 20 updates the coordinates of the head position P1 based on the information output from the sensor unit 250 (step S113) and returns the flow of processing to step S103.
- FIG. 12 is a flow chart showing the flow of processing for changing the display mode of the operation buttons B11 to B13.
- First, the terminal device 20 initializes the contact time t between the virtual hand and the operation buttons before displaying the operation buttons B11 to B13 (step S201).
- Next, the terminal device 20 displays the operated part C1 of each of the operation buttons B11 to B13 in a hemispherical shape based on the waist position P2 specified in the processing of FIG. 11 (step S202).
- Next, the terminal device 20 determines whether the measured contact time t is less than the threshold value T (step S207).
- When the contact time t is less than the threshold value T (Yes in step S207), the terminal device 20 displays the operated part C1 with which the virtual hand HL or the virtual hand HR is in contact in a contact-state display mode whose height is lower than the hemispherical shape, as shown in FIG. 5(b) (step S208).
- The terminal device 20 also changes the color of the indicator part C2 according to the contact time t, as shown in FIGS. 5(b) to 5(d) (step S209). Then, the terminal device 20 returns the flow of processing to step S203.
- When the contact time t has reached the threshold value T (No in step S207), the terminal device 20 displays the operated part C1 in the hemispherical non-contact-state display mode as shown in FIG. 5(a) (step S210). Next, the terminal device 20 executes the process corresponding to the operation button with which the virtual hand HL or the virtual hand HR is in contact (step S211), and ends the process.
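The loop of FIG. 12 amounts to a dwell-time button: contact shorter than the threshold T only changes the display mode, while contact reaching T fires the button's function and restores the non-contact display. A minimal sketch, assuming the class name, the display-mode strings, the default value of T, and a caller that supplies per-frame contact state and elapsed time (none of which are specified in the patent):

```python
class DwellButton:
    """Sketch of steps S201-S211 of FIG. 12 (all names are assumptions)."""

    def __init__(self, on_activate, threshold_t=1.0):
        self.on_activate = on_activate   # function of the operation button
        self.threshold_t = threshold_t   # the patent's threshold T (assumed value)
        self.contact_time = 0.0          # step S201: initialize t
        self.mode = "non-contact"        # hemispherical display of C1

    def update(self, in_contact, dt):
        """Called once per frame; returns True when the button fires."""
        if not in_contact:
            self.contact_time = 0.0
            self.mode = "non-contact"
            return False
        self.contact_time += dt
        if self.contact_time < self.threshold_t:      # step S207: t < T
            # Steps S208-S209: lower the operated part C1 and advance
            # the indicator C2 while contact continues.
            self.mode = "contact"
            return False
        # Steps S210-S211: restore the display and execute the function.
        self.mode = "non-contact"
        self.contact_time = 0.0
        self.on_activate()
        return True
```

Resetting t whenever contact is lost mirrors the initialization in step S201, so a brief accidental touch never triggers the button's function.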
- First, the terminal device 20 transmits sensor information to the information processing device 10.
- The information processing device 10 identifies the user's visible range based on the sensor information transmitted from the terminal device 20.
- The information processing device 10 transmits information representing the identified visible range to the information providing device 30.
- The information providing device 30 transmits, to the information processing device 10, information on the amount of sunshine in the visible range represented by the information transmitted from the information processing device 10.
- The information processing device 10 generates an AR image representing a three-dimensional graph of the amount of sunshine in the target range based on the information transmitted from the information providing device 30, and transmits the information of the generated AR image to the terminal device 20 as visualization information.
- The terminal device 20 receives the visualization information transmitted from the information processing device 10 and displays the graph GR of the AR image represented by the received visualization information, as shown in FIG. 6, for example.
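The exchange above can be summarized in code. This is only a sketch of the message flow among the three devices; the function, the method names, and the duck-typed device objects are assumptions for illustration, not an API defined by the patent:

```python
def display_sunshine_graph(terminal, processor, provider):
    """Sketch of the visualization sequence: terminal device 20 ->
    information processing device 10 -> information providing device 30
    and back (all method names are assumed)."""
    sensor_info = terminal.read_sensors()
    visible_range = processor.identify_visible_range(sensor_info)
    sunshine = provider.sunshine_in(visible_range)
    ar_graph = processor.build_3d_graph(sunshine)
    terminal.display(ar_graph)
```

The point of the round trip is that the terminal device only senses and displays; identifying the visible range and building the three-dimensional graph stay on the information processing device, and the field data stays on the information providing device.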
- FIG. 13 is a flow chart showing the flow of processing for changing the display mode of the operation buttons B11 to B13 as shown in FIG. 7 according to the overlapping of the virtual hand HR or the virtual hand HL and the operated part C1.
- First, the terminal device 20 identifies the position and posture of the virtual hand HR or the virtual hand HL based on the hand posture information output from the hand posture measurement unit 252 (step S301).
- When a portion of the virtual hand HR or the virtual hand HL is positioned inside the operated part C1 (Yes in step S302), the terminal device 20 identifies the area of the operated part C1 in which the virtual hand HR or the virtual hand HL is positioned (step S303). Then, the terminal device 20 changes the color of the area identified in step S303, for example as shown in FIG. 7 (step S304), and returns the flow of processing to step S301.
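Steps S302 to S303 reduce to a point-in-hemisphere test over sampled points of the virtual hand. Below is a sketch under two assumptions the patent does not state: the hand is approximated by sample points, and the operated part C1 is an upward-facing hemisphere sitting on a flat base.

```python
import math

def hand_points_inside_hemisphere(hand_points, center, radius):
    """Return the hand sample points lying inside the hemispherical
    operated part C1 (center is the middle of the flat base, y axis up;
    both the sampling and the geometry are illustrative assumptions)."""
    cx, cy, cz = center
    inside = []
    for (px, py, pz) in hand_points:
        dx, dy, dz = px - cx, py - cy, pz - cz
        # Inside the sphere of the given radius and on or above the
        # base plane -> inside the upward-facing hemisphere.
        if dy >= 0.0 and math.sqrt(dx * dx + dy * dy + dz * dz) <= radius:
            inside.append((px, py, pz))
    return inside
```

The returned points would delimit the area whose color is changed in step S304.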
- In the present embodiment, the display positions of the operation buttons B11 to B13 may be corrected according to the length of the user's arm. The display positions of the operation buttons B11 to B13 may also be corrected according to the user's motion state. The user's posture state may be estimated from the trajectory of the user's posture specified by the acceleration sensor 251a and the gyro sensor 251d, and the waist position P2 may be corrected according to the estimated posture state. Furthermore, when an object is detected by the sensor unit 250 and the position of the detected object overlaps the display positions of the operation buttons B11 to B13, the display positions of the operation buttons B11 to B13 may be corrected so that they do not overlap the detected object.
- The elapse of the contact time may be notified to the user by vibration or sound.
- The above-described threshold value H and threshold value T may be values determined for each user.
- The contact of the virtual hand HL and the virtual hand HR may be determined over a wider range than the displayed operated part C1.
- In the present embodiment, the terminal device 20 displays the operation buttons B11 to B13 around the user's waist, but the positions where the operation buttons B11 to B13 are displayed are not limited to the waist position. For example, they may be around the user's chest.
- The above-described embodiment is configured to perform AR display corresponding to Synecoculture, but the AR display produced by the information processing system 1 is not limited to one corresponding to Synecoculture. AR display corresponding to work in a factory or work at a construction site may be performed.
- FIG. 14 is a block diagram illustrating an example hardware configuration of a computer that implements the functions of the information processing apparatus according to the embodiment.
- The information processing device 900 shown in FIG. 14 can realize, for example, the information processing device 10, the terminal device 20, and the information providing device 30 shown in FIG. 1.
- Information processing by the information processing device 10, the terminal device 20, and the information providing device 30 according to the embodiment is realized by cooperation between software and the hardware described below.
- The CPU 901, the ROM 902, and the RAM 903 are interconnected, for example, via a host bus 904a capable of high-speed data transmission.
- The host bus 904a is connected, via a bridge 904, to an external bus 904b having a relatively low data transmission speed, for example.
- The external bus 904b is connected to various components via an interface 905.
- The input device 906 is implemented by devices through which information is input, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers.
- The input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports the operation of the information processing device 900.
- The input device 906 may include, for example, an input control circuit that generates an input signal based on the information input using the above input means and outputs the signal to the CPU 901.
- A user of the information processing device 900 can input various data to the information processing device 900 and instruct processing operations by operating the input device 906.
- The input device 906 may also be formed by devices that detect the user's position.
- For example, the input device 906 may include an image sensor (e.g., a camera), a depth sensor (e.g., a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a ranging sensor (e.g., a ToF (Time of Flight) sensor), a force sensor, and the like.
- The input device 906 may also acquire information about the state of the information processing device 900 itself, such as the attitude and moving speed of the information processing device 900, and information about the space around the information processing device 900, such as the brightness and noise around the information processing device 900.
- The input device 906 may include a GNSS (Global Navigation Satellite System) module that receives GNSS signals from GNSS satellites (for example, GPS signals from GPS (Global Positioning System) satellites) and measures position information including the latitude, longitude, and altitude of the device.
- As for position information, the input device 906 may detect the position through Wi-Fi (registered trademark), transmission to and reception from a mobile phone, PHS, or smartphone, or short-range communication.
- The input device 906 can implement, for example, the functions of the sensor unit 250 described with reference to FIG. 9.
- The output device 907 is formed by a device capable of visually or audibly notifying the user of acquired information.
- Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; sound output devices such as speakers and headphones; and printer devices.
- The output device 907 outputs, for example, results obtained by various kinds of processing performed by the information processing device 900.
- Specifically, the display device visually displays the results obtained by various kinds of processing performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
- The audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
- The output device 907 can implement, for example, the functions of the video output unit 220 and the audio output unit 230 described with reference to FIG. 9.
- The storage device 908 is a data storage device formed as an example of the storage unit of the information processing device 900.
- The storage device 908 is implemented by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- The storage device 908 stores programs executed by the CPU 901, various data, and various data acquired from the outside.
- The storage device 908 can realize, for example, the functions of the storage unit 120, the storage unit 210, and the storage unit 310 described with reference to FIGS. 8 to 10.
- The connection port 910 is, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or a port for connecting an externally connected device such as an optical audio terminal.
- The communication network 920 is a wired or wireless transmission path for information transmitted from devices connected to the communication network 920.
- For example, the communication network 920 may include a public line network such as the Internet, a telephone line network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks).
- The communication network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- As described above, the terminal device 20 according to the embodiment performs processing for correcting the display position of the AR-displayed GUI based on the vertical position of the terminal device 20 and the tilt of the terminal device 20.
- As a result, the operation buttons B11 to B13 are always displayed around the position of the user's waist, so the user can find the operation buttons B11 to B13 immediately even after changing position or posture, which promotes improved usability.
- Since the waist position is calculated with reference to the position of the user's body, the operation buttons B11 to B13 can always be operated at the same waist position.
- Since the operation buttons B11 to B13 are always displayed with reference to the user's waist, the user can learn the positional relationship between the operation buttons B11 to B13 and his or her own body as the user becomes familiar with the operation. Therefore, even without bringing the operation buttons B11 to B13 into the center of the field of view, the user can operate the operation buttons B11 to B13 by the sense of body movement, and can also operate them with one hand while working.
- Since the display mode of the operated part C1 and the indicator part C2 changes according to the operation, the user can notice the change in the display mode and can operate the operation buttons B11 to B13 even while performing other work. Further, according to the display mode shown in FIG. 7, the user can be made to recognize that his or her hand is in contact with the operated part C1, so the user can easily recognize the contact.
- (7) The information processing apparatus according to (6), wherein the display processing unit changes the display mode of the image according to an elapsed time after the user starts operating the image.
- (8) The information processing apparatus according to (6) or (7), wherein the position processing unit estimates the position of the user's hand, the display control unit superimposes and displays an image of the user's hand on the space viewed by the user based on the position estimated by the position processing unit, and the display processing unit changes the display mode of the image of the hand according to an overlap with the image operated by the user.
- (9) An information processing method comprising: a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position; a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
- (10) A program causing a computer to execute: a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position; a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
- information processing system 10 information processing device 20 terminal device 30 information providing device 40 sensor group 100 control unit 101 acquisition unit 102 processing unit 1021 position specifying unit 1022 generation unit 103 output unit 110 communication unit 120 storage unit 200 control unit 201 position processing unit 202 hand posture processing unit 203 correction unit 204 display processing unit 205 display control unit 206 communication control unit 210 storage unit 220 video output unit 230 audio output unit 240 external communication unit 250 sensor unit 251 head position measurement unit 251a acceleration sensor 251b direction sensor 251c depth sensor 251d gyro sensor 251e SLAM 251f GPS module 252 hand posture measurement unit 252a depth sensor 252b infrared camera 253 voice acquisition unit 253a microphone 300 control unit 310 storage unit 320 communication unit B11 to B13 operation button C1 operated unit C2 indicator unit C3 function display unit
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Overview of an embodiment of the present disclosure
1.1. Introduction
1.2. Overview of the information processing system
2. Examples of the information processing system
2.1. AR display at a work site in a field
2.2. Method of specifying the position of the user's waist
2.3. Display modes of the operation buttons
3. Configuration of the information processing system
3.1. Configuration of the information processing device
3.2. Configuration of the terminal device
3.3. Configuration of the information providing device
3.4. Processing of the information processing system
3.5. Variations of the processing
4. Hardware configuration example
5. Summary
<1.1. Introduction>
In Synecoculture (registered trademark), attention has been drawn to techniques for visualizing and displaying to a worker, using AR (Augmented Reality) technology, various kinds of information related to the ecosystem of plants and the like in a field (for example, the amount of sunshine and the amount of water in the soil). As a method of displaying such information, there is, for example, a method of displaying a GUI such as buttons for displaying information on a terminal device by AR and operating the displayed GUI.
An overview of the information processing system 1 according to the embodiment will be described. FIG. 1 is a diagram showing the devices constituting the information processing system 1. As shown in FIG. 1, the information processing system 1 includes an information processing device 10, a terminal device 20, an information providing device 30, and a sensor group 40. For example, the information processing device 10 is connected to a communication line N by wire, but may be connected wirelessly. Various devices can be connected to the information processing device 10. For example, the terminal device 20 and the information providing device 30 are connected to the information processing device 10 via the communication line N, and information is exchanged between the devices. The terminal device 20, the information providing device 30, and the sensor group 40 are also connected to the communication line N by wire or wirelessly. The wireless connection of the terminal device 20, the information providing device 30, and the sensor group 40 to the communication line N is, for example, a connection via a wireless LAN, but is not limited to a wireless LAN and may be, for example, a connection using Bluetooth (registered trademark).
Next, examples of the information processing system 1 will be described. In the examples, it is assumed that the user wears the terminal device 20, which is a see-through head-mounted display, and that there is a work site in the field.
FIG. 2 is a diagram showing an example of the work site that the user views through the terminal device 20 and the AR-displayed GUI displayed on the terminal device 20 and viewed by the user. As shown in FIG. 2, the vegetation viewed by the user at the work site includes tomato V11, carrot V12, Chinese chive V13, and the like. The AR display viewed by the user includes the operation buttons B11 to B13, which are an example of a GUI, a virtual hand HL, which is a virtual object visualizing the worker's left hand sensed by the terminal device 20, and a virtual hand HR, which is a virtual object visualizing the worker's right hand sensed by the terminal device 20.
FIG. 3 is a diagram for explaining a method of calculating the position of the user's waist. For example, in a left-handed three-dimensional orthogonal coordinate system whose origin is a predetermined position in the field, the terminal device 20 senses its position relative to the origin coordinates (0, 0, 0), and sets the coordinates of the sensed head position P1 as the user's head coordinates (xh, yh, zh).
Corrected xp = uncorrected xp ... (1)
Corrected yp = uncorrected yp + Δh ... (2)
Corrected zp = uncorrected zp − Δh ÷ tan θ ... (3)
Corrected xp = uncorrected xp ... (4)
Corrected yp = uncorrected yp − Δh ... (5)
Corrected zp = uncorrected zp + Δh ÷ tan θ ... (6)
FIG. 5 is a diagram showing an example of the display mode of the operation button B11. The operation button B11 has a transparent hemispherical operated part C1, a ring-shaped indicator part C2, and a function display part C3 containing an icon that represents the function corresponding to the operation button B11.
Next, the configuration of the information processing system 1 will be described.
FIG. 8 is a block diagram showing the functional configuration of the information processing device 10. As shown in FIG. 8, the information processing device 10 includes a control unit 100, a communication unit 110, and a storage unit 120.
FIG. 9 is a block diagram showing the hardware configuration and functional configuration of the terminal device 20. The terminal device 20 has a control unit 200, a storage unit 210, a video output unit 220, an audio output unit 230, an external communication unit 240, and a sensor unit 250.
FIG. 10 is a block diagram showing the hardware configuration of the information providing device 30. As shown in FIG. 10, the information providing device 30 includes a control unit 300, a storage unit 310, and a communication unit 320.
Next, the processing performed in the information processing system 1 will be described. FIG. 11 is a flowchart showing the flow of processing executed when the operation buttons B11 to B13 are displayed on the terminal device 20. First, the terminal device 20 sets a predetermined position in the field as the origin (0, 0, 0) (step S101). Next, based on the information output from the sensor unit 250, the terminal device 20 specifies the coordinates of the head position P1 when the user is standing (step S102). The terminal device 20 also specifies, based on the information output from the sensor unit 250, the tilt angle θ of the terminal device 20 from the horizontal state and the vertical movement distance Δh of the terminal device 20 from the standing state (step S103).
Next, variations of the processing of the present embodiment will be described. The variations described below may be applied to the present embodiment individually or in combination. The variations may be applied in place of the configuration described in the present embodiment, or may be applied in addition to it.
Next, an example of the hardware configuration of the information processing device according to the embodiment will be described with reference to FIG. 14. FIG. 14 is a block diagram showing an example of the hardware configuration of a computer that realizes the functions of the information processing device according to the embodiment. The information processing device 900 shown in FIG. 14 can realize, for example, the information processing device 10, the terminal device 20, and the information providing device 30 shown in FIG. 1. Information processing by the information processing device 10, the terminal device 20, and the information providing device 30 according to the embodiment is realized by cooperation between software and the hardware described below.
As described above, the terminal device 20 according to the embodiment performs processing for correcting the display position of the AR-displayed GUI based on the vertical position of the terminal device 20 and the tilt of the terminal device 20. As a result, the operation buttons B11 to B13 are always displayed around the position of the user's waist, so the user can immediately find the operation buttons B11 to B13 even after changing position or posture, which promotes improved usability. Also, in the present embodiment, since the waist position is calculated with reference to the position of the user's body, the operation buttons B11 to B13 can always be operated at the same waist position. Furthermore, since the operation buttons B11 to B13 are always displayed with reference to the user's waist, the user can learn the positional relationship between the operation buttons B11 to B13 and his or her own body as the user becomes familiar with the operation. Therefore, even without bringing the operation buttons B11 to B13 into the center of the field of view, the user can operate them by the sense of body movement, and can also operate them with one hand while working.
(1)
An information processing apparatus comprising:
a position processing unit that estimates the position of a user's head and estimates the position of a predetermined part of the user's body based on the estimated head position;
a correction unit that corrects the position of the predetermined part estimated by the position processing unit based on the head position and the tilt angle of the user's head; and
a display control unit that performs a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected by the correction unit.
(2)
The information processing apparatus according to (1), wherein the correction unit corrects the vertical position of the predetermined part based on a reference height.
(3)
The information processing apparatus according to (1) or (2), wherein the correction unit corrects the position of the predetermined part in the front-rear direction of the user when the tilt angle of the head from the horizontal state is outside a predetermined range.
(4)
The information processing apparatus according to (3), wherein the correction unit corrects the position of the predetermined part rearward of the user when the head tilts downward from the horizontal state.
(5)
The information processing apparatus according to (3) or (4), wherein the correction unit corrects the position of the predetermined part forward of the user when the head tilts upward from the horizontal state.
(6)
The information processing apparatus according to (1), further comprising a display processing unit that changes the display mode of the image when the image is operated by the user.
(7)
The information processing apparatus according to (6), wherein the display processing unit changes the display mode of the image according to an elapsed time after the user starts operating the image.
(8)
The information processing apparatus according to (6) or (7), wherein
the position processing unit estimates the position of the user's hand,
the display control unit superimposes and displays an image of the user's hand on the space viewed by the user based on the position estimated by the position processing unit, and
the display processing unit changes the display mode of the image of the hand according to an overlap with the image operated by the user.
(9)
An information processing method comprising:
a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position;
a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and
a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
(10)
A program that causes a computer to execute:
a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position;
a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and
a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
10 Information processing device
20 Terminal device
30 Information providing device
40 Sensor group
100 Control unit
101 Acquisition unit
102 Processing unit
1021 Position specifying unit
1022 Generation unit
103 Output unit
110 Communication unit
120 Storage unit
200 Control unit
201 Position processing unit
202 Hand posture processing unit
203 Correction unit
204 Display processing unit
205 Display control unit
206 Communication control unit
210 Storage unit
220 Video output unit
230 Audio output unit
240 External communication unit
250 Sensor unit
251 Head position measurement unit
251a Acceleration sensor
251b Direction sensor
251c Depth sensor
251d Gyro sensor
251e SLAM
251f GPS module
252 Hand posture measurement unit
252a Depth sensor
252b Infrared camera
253 Voice acquisition unit
253a Microphone
300 Control unit
310 Storage unit
320 Communication unit
B11 to B13 Operation buttons
C1 Operated part
C2 Indicator part
C3 Function display part
Claims (10)
- An information processing apparatus comprising: a position processing unit that estimates the position of a user's head and estimates the position of a predetermined part of the user's body based on the estimated head position; a correction unit that corrects the position of the predetermined part estimated by the position processing unit based on the head position and the tilt angle of the user's head; and a display control unit that performs a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected by the correction unit.
- The information processing apparatus according to claim 1, wherein the correction unit corrects the vertical position of the predetermined part based on a reference height.
- The information processing apparatus according to claim 1, wherein the correction unit corrects the position of the predetermined part in the front-rear direction of the user when the tilt angle of the head from the horizontal state is outside a predetermined range.
- The information processing apparatus according to claim 3, wherein the correction unit corrects the position of the predetermined part rearward of the user when the head tilts downward from the horizontal state.
- The information processing apparatus according to claim 3, wherein the correction unit corrects the position of the predetermined part forward of the user when the head tilts upward from the horizontal state.
- The information processing apparatus according to claim 1, further comprising a display processing unit that changes the display mode of the image when the image is operated by the user.
- The information processing apparatus according to claim 6, wherein the display processing unit changes the display mode of the image according to an elapsed time after the user starts operating the image.
- The information processing apparatus according to claim 6, wherein the position processing unit estimates the position of the user's hand, the display control unit superimposes and displays an image of the user's hand on the space viewed by the user based on the position estimated by the position processing unit, and the display processing unit changes the display mode of the image of the hand according to an overlap with the image operated by the user.
- An information processing method comprising: a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position; a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
- A program that causes a computer to execute: a position processing step of estimating the position of a user's head and estimating the position of a predetermined part of the user's body based on the estimated head position; a correction step of correcting the position of the predetermined part estimated in the position processing step based on the head position and the tilt angle of the user's head; and a display control step of performing a process of superimposing and displaying an image operated by the user on a space viewed by the user based on the position corrected in the correction step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/568,110 US20240212293A1 (en) | 2021-06-29 | 2022-01-28 | Information processing apparatus, information processing method, and program |
CN202280044639.2A CN117561490A (zh) | 2021-06-29 | 2022-01-28 | 信息处理装置、信息处理方法和程序 |
JP2023531358A JPWO2023276216A1 (ja) | 2021-06-29 | 2022-01-28 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-108086 | 2021-06-29 | ||
JP2021108086 | 2021-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023276216A1 true WO2023276216A1 (ja) | 2023-01-05 |
Family
ID=84691102
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/003202 WO2023276216A1 (ja) | 2021-06-29 | 2022-01-28 | 情報処理装置、情報処理方法およびプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240212293A1 (ja) |
JP (1) | JPWO2023276216A1 (ja) |
CN (1) | CN117561490A (ja) |
WO (1) | WO2023276216A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010534895A (ja) * | 2007-07-27 | 2010-11-11 | ジェスチャー テック,インコーポレイテッド | 高度なカメラをベースとした入力 |
US20170140552A1 (en) * | 2014-06-25 | 2017-05-18 | Korea Advanced Institute Of Science And Technology | Apparatus and method for estimating hand position utilizing head mounted color depth camera, and bare hand interaction system using same |
JP2017182216A (ja) * | 2016-03-28 | 2017-10-05 | 株式会社バンダイナムコエンターテインメント | シミュレーション制御装置及びシミュレーション制御プログラム |
JP2018205913A (ja) * | 2017-05-31 | 2018-12-27 | 株式会社コロプラ | 仮想空間を提供するためにコンピュータで実行される方法、プログラム、および、情報処理装置 |
WO2019038875A1 (ja) * | 2017-08-24 | 2019-02-28 | マクセル株式会社 | ヘッドマウントディスプレイ |
JP2020003898A (ja) * | 2018-06-26 | 2020-01-09 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2020107123A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社コロプラ | プログラム、情報処理装置、および方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023276216A1 (ja) | 2023-01-05 |
CN117561490A (zh) | 2024-02-13 |
US20240212293A1 (en) | 2024-06-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22832384 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023531358 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18568110 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280044639.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22832384 Country of ref document: EP Kind code of ref document: A1 |