WO2017138278A1 - Image display device and image display method - Google Patents
Image display device and image display method
- Publication number
- WO2017138278A1 (PCT/JP2017/000242, JP2017000242W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- unit
- image
- state
- image display
- Prior art date
Classifications
- G02B27/0101—Head-up displays characterised by optical features
- G02B27/017—Head-up displays, head mounted
- G02B27/0172—Head mounted, characterised by optical features
- B60K35/00—Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/23—Output arrangements using visual output: head-up displays [HUD]
- B60K35/28—Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information, or by the purpose of the output information, e.g. for attracting the attention of the driver
- B60K35/81—Arrangements for controlling instruments: controlling displays
- G06F3/0346—Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI], based on specific properties of the displayed interaction object
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- H04N5/64—Constructional details of television receivers, e.g. cabinets or dust covers
- B60K2360/1464—Instrument input by gesture: 3D-gesture
- B60K2360/166—Type of output information: navigation
- B60K2360/167—Type of output information: vehicle dynamics information
- B60K2360/169—Type of output information: remaining operating distance or charge
- B60K2360/177—Type of output information: augmented reality
- G02B2027/0141—Head-up displays characterised by the informative content of the display
- G02B2027/0178—Head mounted: eyeglass type
- G02B2027/0181—Display position adjusting means: adaptation to the pilot/driver
Definitions
- the present invention relates to an image display apparatus and an image display method.
- Priority is claimed on Japanese Patent Application No. 2016-025244, filed Feb. 12, 2016, the content of which is incorporated herein by reference.
- Image display devices are of a non-transmissive type, which covers the eyes, and a transmissive type, which does not cover the eyes.
- In the non-transmissive type, only the image is displayed on the display unit.
- In the transmissive type, the display unit is, for example, a half mirror.
- The user can visually recognize both the displayed image and the outside world.
- Such an image display apparatus is worn on the user's head, so when the user moves the head, the outside world or the image may become difficult to see.
- There is an image display apparatus capable of reliably conveying character information, which is an image, even when the user moves the head (see, for example, Patent Document 1).
- An aspect of the present invention has been made in view of the above point, and aims to provide an image display apparatus and an image display method that can change the displayed information according to the motion of the image display apparatus.
- To achieve the above object, an image display device according to an aspect of the present invention includes: an acquisition unit that acquires two or more pieces of information; a detection unit that detects at least one of the acceleration, azimuth, and angular velocity of the image display device; an image changing unit that determines the state of the image display device based on the result detected by the detection unit, switches the information selected from the two or more pieces of information acquired by the acquisition unit according to the determined state of the image display device, and generates display data based on the switched information; and a display unit that displays the display data.
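The claim above describes a pipeline of acquisition, detection, state determination, information switching, and display. The minimal Python sketch below illustrates that flow; the state names, threshold value, and information keys are illustrative assumptions, not values taken from the patent.

```python
# Minimal sketch of the claimed pipeline: a detection result is mapped to a
# device state, and the state selects which of the acquired pieces of
# information becomes display data. All names and values here are
# illustrative assumptions, not taken from the patent.

def determine_state(pitch_deg):
    """Classify the device posture from a detected pitch angle (degrees)."""
    if pitch_deg < -20.0:          # assumed threshold for "facing downward"
        return "downward"
    return "normal"

def select_information(state, acquired):
    """Switch the information selected from two or more acquired pieces."""
    if state == "downward":
        return acquired["vehicle"]      # e.g. information on the vehicle
    return acquired["navigation"]       # e.g. route guidance

acquired = {"navigation": "Turn right in 200 m",
            "vehicle": "Fuel: 32 km remaining"}
print(select_information(determine_state(-30.0), acquired))  # vehicle info
print(select_information(determine_state(0.0), acquired))    # navigation info
```

Tilting the head down swaps the rendered information without any manual input, which is the effect the claim attributes to the image changing unit.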
- In the above image display device, the image changing unit may determine the state of the image display device with respect to at least one of the upward, downward, left, and right directions relative to a reference state of the image display device, based on the result detected by the detection unit.
- In the above image display device, the image changing unit may generate additional information when the image display device reciprocates between the reference posture and a posture outside the reference posture within a predetermined time.
- the additional information may be information according to a situation in which the image display device is used.
- In the above image display device, the acquisition unit may acquire information on a vehicle from the vehicle, and the additional information may be the information on the vehicle.
- An image display method according to an aspect of the present invention is an image display method for an image display apparatus, and includes: an acquisition procedure for acquiring two or more pieces of information; a detection procedure for detecting at least one of the acceleration, azimuth, and angular velocity of the image display apparatus; an information change procedure that determines the state of the image display apparatus based on the result detected by the detection procedure, switches the information selected from the two or more pieces of information acquired by the acquisition procedure according to the determined state, and generates display data based on the switched information; and a display procedure for displaying the display data.
- According to the above aspect (1) or (6), a plurality of pieces of display data can be changed by a simple operation of the image display device.
- In addition, more display data can be selectively displayed by a simple operation of the image display device.
- Further, different information can be displayed according to the operation pattern.
- Further, information on the vehicle can be displayed on the image display device, and the information to be displayed can be switched according to the operation of the image display device.
- HMD: glasses-type head-mounted display
- FIG. 1 is a block diagram showing a schematic configuration of the HMD 1 according to the present embodiment.
- the HMD 1 includes an operation unit 10, an acquisition unit 11, a detection unit 12, an information change unit 13, and a display unit 14.
- the detection unit 12 further includes a magnetic sensor 121 (detection unit), an acceleration sensor 122 (detection unit), and an angular velocity sensor 123 (detection unit).
- the information change unit 13 also includes a storage unit 131, a control unit 132, and an image change unit 133.
- the operation unit 10 includes, for example, a mechanical switch, a touch panel switch, and the like.
- the operation unit 10 detects the result of the user's operation, and outputs the detected operation instruction to the control unit 132.
- the HMD 1 may not have the operation unit 10.
- the HMD 1 may be configured such that the acquisition unit 11 acquires an operation instruction from an external device (not shown) and outputs the acquired operation instruction to the control unit 132.
- the external device is, for example, a portable terminal such as a smartphone, a remote controller, or the like.
- The operation instruction is, for example, an instruction to turn the power of the HMD 1 on or off, an instruction to execute the learning mode, or an instruction setting which information is displayed on the display unit 14 when the HMD 1 is in a given state.
- The learning mode is a mode in which the thresholds and predetermined times for determining the downward state, the vertical swing operation, and the horizontal swing operation are learned.
- The downward state is a state in which the user faces downward.
- The vertical swing operation is an operation in which the user shakes the head vertically several times, as when nodding.
- The horizontal swing operation is an operation in which the user shakes the head several times in the lateral direction.
- the acquisition unit 11 includes at least one of a wired communication scheme and a wireless communication scheme.
- the acquisition unit 11 acquires information from an external device, and outputs the acquired information to the image changing unit 133.
- the information acquired from the external device is, for example, navigation information to a destination, information on a weather forecast of a current location or a destination, information on a schedule of today, and the like.
- the acquisition unit 11 outputs the acquired operation instruction to the control unit 132.
- the detection unit 12 detects the state of the HMD 1 and outputs the detected state information to the control unit 132.
- The state of the HMD 1 refers to, for example, the inclination, movement, and orientation of the HMD 1.
- the state information includes at least one of information indicating an azimuth, information indicating an acceleration direction, and information indicating an angular velocity.
- the magnetic sensor 121 is, for example, a geomagnetic sensor, and detects the orientation of the HMD 1.
- the magnetic sensor 121 outputs the detected detection value to the control unit 132.
- the acceleration sensor 122 is, for example, a three-axis sensor, and detects the inclination of the HMD 1.
- the acceleration sensor 122 detects the direction of acceleration based on the gravitational acceleration, and outputs the detected value to the control unit 132.
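The inclination mentioned here can be derived from the gravitational-acceleration components of a three-axis sensor. The sketch below uses the axis convention given later in the text (x = left-right, y = front-rear, z = vertical); the formula is the standard accelerometer tilt estimate, not a computation specified by the patent.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer that is
    measuring gravitational acceleration. Axis convention assumed here:
    x = left-right, y = front-rear, z = vertical."""
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    return pitch, roll

# Head level: gravity lies entirely on the z axis, so there is no tilt.
print(tilt_from_gravity(0.0, 0.0, -9.81))                 # (0.0, 0.0)
# Head pitched forward 45 degrees: gravity splits between the y and z axes.
print(round(tilt_from_gravity(0.0, -6.937, -6.937)[0]))   # -45
```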
- the angular velocity sensor 123 is, for example, a gyro sensor, and detects the rotation of the HMD 1.
- the angular velocity sensor 123 outputs the detected detection value to the control unit 132.
- The information changing unit 13 determines the state of the HMD 1 (for example, a stationary state or an operating state) based on the result detected by the detection unit 12, switches the information selected from the two or more pieces of information acquired by the acquisition unit 11 according to the determined state of the HMD 1, and generates display data based on the switched information.
- The storage unit 131 stores a threshold and a predetermined time for detecting the downward state, a threshold and a predetermined time for detecting the vertical swing operation, and a threshold and a predetermined time for detecting the horizontal swing operation.
- The storage unit 131 also stores information indicating the type of image to be displayed in the normal state, in the downward state, during the vertical swing operation, and during the horizontal swing operation.
- the normal state is a state in which the user wears the HMD 1 on the head and faces, for example, the front direction.
- the control unit 132 acquires the operation instruction output from the operation unit 10 or the acquisition unit 11.
- the control unit 132 acquires the detection value output by the detection unit 12.
- the control unit 132 performs learning of the threshold and each predetermined time for each state (including each operation) of the HMD 1 using the acquired detected value in accordance with the instruction to perform the learning mode included in the acquired operation instruction.
- the storage unit 131 stores the predetermined threshold and the predetermined time.
- The state of the HMD 1 refers to the posture of the user wearing the HMD 1 on the head: a state facing upward (hereinafter, the upward state), a state facing the front (the front state), a state facing left (the left state), a state facing right (the right state), a state facing upper left (the upper-left state), a state facing upper right (the upper-right state), a state facing lower left (the lower-left state), and a state facing lower right (the lower-right state).
- The motion of the HMD 1 is, for example, a motion in which the user wearing the HMD 1 on the head turns from the front to a downward direction, a motion from upward to the front, a motion from upward to downward, or a motion from the front to the left.
- the process in the learning mode will be described later.
- When the control unit 132 acquires an instruction to enable image display, it generates a display instruction to display, on the display unit 14, an image based on the display information acquired from the terminal 3, and outputs the generated display instruction to the image changing unit 133.
- When the control unit 132 acquires an instruction to disable image display, it generates a non-display instruction not to display, on the display unit 14, the image based on the display information acquired from the terminal 3, and outputs the non-display instruction to the image changing unit 133.
- control unit 132 determines the state (including the operation) of the HMD 1 based on the acquired detection value, the threshold stored in the storage unit 131, and the predetermined time.
- the control unit 132 calculates a change in state (direction) using the detection value of the magnetic sensor 121 and the detection value of the angular velocity sensor 123.
- the control unit 132 determines the operation based on the time change of the azimuth calculated using the detection value of the magnetic sensor 121 and the detection value of the angular velocity sensor 123.
- the control unit 132 may determine the operation using a detection value of the angular velocity sensor 123.
- When the corresponding condition is satisfied, the control unit 132 determines that the HMD 1 is in the downward state, and outputs the determination result to the image changing unit 133.
- Similarly, when an operation condition is satisfied, the control unit 132 determines that the corresponding operation has been performed, and outputs the determination result to the image changing unit 133.
- When the control unit 132 detects the operation of turning from the front to the right and the operation of turning from the front to the left a second predetermined number of times within a third predetermined time, it determines that a horizontal swing operation has been performed, and outputs the determination result to the image changing unit 133.
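The swing decisions above amount to counting repeated direction changes inside a time window. A hypothetical sketch follows; the event representation, counts, and times are our own, since the patent leaves the concrete values to the learned thresholds.

```python
# Hypothetical sketch of the swing decision: a swing operation is recognised
# when a predetermined number of detected direction changes fall within a
# predetermined time. The event format and values are illustrative only.

def is_swing(events, predetermined_count, predetermined_time):
    """events: list of (timestamp_s, direction) tuples for detected direction
    changes. True if predetermined_count of them occur within any window of
    predetermined_time seconds."""
    times = [t for t, _ in events]
    for i in range(len(times)):
        j = i
        while j < len(times) and times[j] - times[i] <= predetermined_time:
            j += 1
        if j - i >= predetermined_count:
            return True
    return False

# Four nod-like direction changes within 1.2 s.
nods = [(0.0, "down"), (0.4, "up"), (0.8, "down"), (1.2, "up")]
print(is_swing(nods, predetermined_count=4, predetermined_time=2.0))  # True
print(is_swing(nods, predetermined_count=4, predetermined_time=0.5))  # False
```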
- The image changing unit 133 acquires the information output by the acquisition unit 11. In addition, the image changing unit 133 acquires the determination result output by the control unit 132. The image changing unit 133 selects at least one piece of the acquired information in accordance with the acquired determination result, and generates an image to be displayed on the display unit 14 using the selected information. When the control unit 132 outputs a display instruction, the image changing unit 133 outputs the generated image to the display unit 14. When the control unit 132 outputs the non-display instruction, the image changing unit 133 does not output the generated image to the display unit 14.
- the display unit 14 includes a projection unit that projects an image, and a transmissive display that uses, for example, a hologram.
- the display unit 14 transmits external light, and displays the image output from the image changing unit 133 using a hologram.
- The display unit 14 may be provided for both the left and right eyes, or for only one of them.
- FIG. 2 is a view showing an example of the appearance of the HMD 1 according to the present embodiment.
- The coordinates of the acceleration sensor 122 are defined with the vertical direction as the z-axis, the left-right direction as the x-axis, and the front-rear direction as the y-axis.
- The acceleration sensor 122 is attached, for example, such that the detection value in the z-axis direction is in the negative direction.
- the HMD 1 of the present embodiment is a glasses type.
- the HMD 1 includes display portions 14R and 14L, nose pads 102R and 102L, a bridge 103, and temples 101R and 101L on the left and right.
- the detection unit 12 is attached in the left and right temples 101R and 101L, and the operation unit 10, the acquisition unit 11, the storage unit 131, the control unit 132, and the image changing unit 133 are attached in the left temple 101L.
- The configuration illustrated in FIG. 2 is an example, and the locations where the operation unit 10, the acquisition unit 11, the detection unit 12, the storage unit 131, the control unit 132, and the image changing unit 133 are attached are not limited to these.
- In the learning mode, the control unit 132 first prompts the user to take each state, and acquires the detection values of the detection unit 12 in the upward state, the front state, the left state, the right state, the upper-left state, the upper-right state, the lower-left state, and the lower-right state.
- the control unit 132 learns the detection value of each state to determine the threshold of the detection value of the magnetic sensor 121 and the predetermined time in each state, and the threshold of the detection value of the acceleration sensor 122 and the predetermined time.
- the control unit 132 may also determine the threshold of the angular velocity sensor 123 in each state and the predetermined time.
- For example, the control unit 132 outputs, to the image changing unit 133, an instruction prompting an action regarding the state of the HMD 1, for example an instruction prompting the upward state.
- The image changing unit 133 causes the display unit 14 to display information based on the instruction prompting the upward state.
- The information based on the instruction prompting the upward state is, for example, "Please turn your head up."
- the user continues the posture according to the information displayed on the display unit 14 for a predetermined time (for example, 2 seconds or more).
- the control unit 132 acquires the detection value of each sensor in the period in which the instruction is displayed on the display unit 14 and the time during which each state is maintained.
- The control unit 132 prompts, for example, five trials for each state, and learns the detection values of each state using the detection values acquired during the five trials, thereby determining the thresholds of the detection values of the magnetic sensor 121 and the acceleration sensor 122 for each state.
- The control unit 132 determines the predetermined time for determining each state based on the time during which the state is maintained. The control unit 132 may determine this time based on the user operating the operation unit 10 at the start and end of state learning, or may use the period during which the detection value of each sensor remains unchanged.
- When an instruction to execute the learning mode is acquired, the control unit 132 next prompts the user to perform each motion, and acquires the detection values of the detection unit 12 for each of the motion of turning from the front to a downward direction, the motion from upward to the front, the motion from upward to downward, and the motion from the front to the left, with the HMD 1 worn on the head.
- the control unit 132 learns the detection value of each state to determine the threshold of the detection value of the angular velocity sensor 123 and the predetermined time in each state.
- the control unit 132 may determine the threshold of the detection value of the magnetic sensor 121 in each state and the predetermined time, and the threshold of the detection value of the acceleration sensor 122 and the predetermined time.
- For example, the control unit 132 outputs, to the image changing unit 133, an instruction prompting the motion of turning from the front to a downward direction.
- The image changing unit 133 displays, on the display unit 14, information based on that instruction.
- The user repeatedly performs the motion according to the information displayed on the display unit 14 within a predetermined time (for example, 2 seconds or more).
- the control unit 132 acquires the detection value of each sensor in the period in which the instruction is displayed on the display unit 14.
- The control unit 132 prompts, for example, five sets of motions for each operation, and learns the detection values of each operation using the detection values acquired during the five sets, thereby determining the threshold of the detection values of the angular velocity sensor 123. Note that one set of motion is the motion of turning from the front to a downward direction performed multiple times within a predetermined time. Further, the control unit 132 extracts, for example, the maximum value among the detection values of each sensor, and determines the predetermined time for determining each operation based on the period of the extracted maximum values.
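The learning step above can be summarised as deriving a threshold from readings collected over repeated trials. The statistic below (mean of the sampled peaks minus a 10% margin) is an assumption for illustration; the patent does not specify the exact computation.

```python
# Sketch of learning a detection threshold from readings collected over
# repeated trials (e.g. five sets of motions). Using the mean minus a margin
# is an assumed statistic; the patent does not fix the exact computation.

def learn_threshold(samples, margin=0.1):
    """Derive a threshold slightly below the mean of the sampled peaks, so
    that future motions of similar magnitude are still detected."""
    mean = sum(samples) / len(samples)
    return mean * (1.0 - margin)

# e.g. peak angular-velocity magnitudes (deg/s) from five prompted motions
trials = [10.2, 9.8, 10.5, 9.9, 10.1]
print(round(learn_threshold(trials), 2))  # 9.09
```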
- FIG. 3 is a diagram illustrating an example of detection values regarding states stored in the storage unit 131 according to the present embodiment.
- the storage unit 131 stores, in the state, the threshold of the direction calculated using the detection values of the magnetic sensor 121 and the acceleration sensor 122 and the predetermined time in association with each other.
- For example, the storage unit 131 stores, for the upward state, (θ12, ψ12) as the azimuth threshold and t11 as the predetermined time in association with each other.
- The storage unit 131 stores, for the front state, (θ22, ψ22) as the azimuth threshold and t21 as the predetermined time in association with each other.
- Here, θ12 and θ22 are azimuths in the horizontal direction, and ψ12 and ψ22 are azimuths in the vertical direction. Further, θ12, ψ12, θ22, and ψ22 may be values having a range. The predetermined times t11 to t91 may be the same value.
- FIG. 4 is a diagram showing an example of detection values regarding the operation stored in the storage unit 131 according to the present embodiment.
- the storage unit 131 stores, for each operation, a threshold of the amount of change in the azimuth and a predetermined time.
- the storage unit 131 stores (φ101, θ101) as the threshold of the amount of change in the azimuth and t101 as the predetermined time in association with the operation of facing downward from the front.
- the storage unit 131 stores (φ201, θ201) as the threshold of the amount of change in the azimuth and t201 as the predetermined time in association with the operation of facing the front from upward.
- the control unit 132 determines that the HMD is in the downward state when the state in which the azimuth (φ, θ) calculated using the detection value of the magnetic sensor 121 and the detection value of the acceleration sensor 122 is equal to or greater than the azimuth threshold (φ32, θ32) continues for the first predetermined time t31 or longer.
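This sustained-state determination can be sketched as follows. The sketch assumes time-stamped azimuth samples with horizontal and vertical components; the function name and sample layout are illustrative assumptions, not part of the embodiment.

```python
def is_in_state(samples, threshold, t_min):
    """Return True when both azimuth components stay at or above the state's
    threshold for at least t_min (the first predetermined time).

    samples: list of (time, phi, theta) azimuth samples in time order.
    threshold: (phi_th, theta_th) for the horizontal and vertical components.
    """
    phi_th, theta_th = threshold
    run_start = None
    for t, phi, theta in samples:
        if phi >= phi_th and theta >= theta_th:
            if run_start is None:
                run_start = t          # start of a run above the threshold
            if t - run_start >= t_min:  # sustained long enough
                return True
        else:
            run_start = None           # the run was broken; start over
    return False
```

Requiring the condition to persist for t_min filters out momentary glances downward.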
- the control unit 132 determines that a vertical swing operation has been performed when a period in which the azimuth change amount is equal to or greater than at least one of (φ101, θ101) (operation downward from the front), (φ201, θ201) (operation from upward to the front), and (φ301, θ301) (operation from upward to downward) occurs a first predetermined number of times within the corresponding second predetermined time t101, t201, or t301. The control unit 132 may determine in the learning mode which combination of the azimuth change amount threshold and the predetermined time is to be used.
- note that the control unit 132 may determine that a vertical swing operation has been performed when the angular velocity ω calculated using the detection value of the angular velocity sensor 123 is equal to or greater than at least one of the angular velocity thresholds ω11, ω21, and ω31 a second predetermined number of times within the second predetermined times (t101, t201, t301).
- the control unit 132 determines that a horizontal swing operation has been performed when a period in which the azimuth change amount is equal to or greater than at least one of (φ401, θ401) (operation from the front to the right) and (φ501, θ501) (operation from the front to the left) occurs a second predetermined number of times within the corresponding third predetermined time t401 or t501.
- the control unit 132 may determine in the learning mode which combination of the azimuth change amount threshold and the predetermined time is to be used.
- note that the control unit 132 may determine that a horizontal swing operation has been performed when the angular velocity ω calculated using the detection value of the angular velocity sensor 123 is equal to or greater than at least one of the angular velocity thresholds ω41 and ω51 a second predetermined number of times or more within the third predetermined times (t401, t501).
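The swing determinations above count threshold exceedances within a time window, which can be sketched as follows. The sketch assumes time-stamped azimuth change amounts as input; the name `detect_swing` is an illustrative assumption.

```python
def detect_swing(events, delta_th, window, min_count):
    """Return True when the azimuth change amount reaches delta_th at least
    min_count times within one window (the predetermined time).

    events: list of (time, delta) in time order, where delta is the
    azimuth change amount observed at that time.
    """
    # times at which the change amount exceeded its threshold
    hits = [t for t, d in events if d >= delta_th]
    # slide over the exceedance times looking for min_count hits in one window
    for i in range(len(hits) - min_count + 1):
        if hits[i + min_count - 1] - hits[i] <= window:
            return True
    return False
```

The same counting shape applies whether the exceedances are azimuth change amounts or angular velocities.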
- FIG. 5 is a diagram showing an example of information displayed on the display unit 14 according to the present embodiment.
- the image g101 is an example of an image displayed on the display unit 14 when the user wears the HMD 1 on the head and faces the front (normal state).
- the image displayed in the normal state is an image in which the time information of the area surrounded by the dashed line g111 is superimposed on the image of the outside world.
- the image of the outside world is not an image created by the image changing unit 133 but an image visually recognized by the user through the display unit 14.
- the information of the area surrounded by the dashed line g111 is information which is set in advance as information to be displayed when the user is in the normal state, and is, for example, time information.
- the image displayed in the downward state is an image in which the information of the area surrounded by the dashed line g121 is superimposed on the image of the outside world.
- the information of the area surrounded by the dashed line g121 is an image based on information set in advance as information to be displayed when the user is in the downward state, and is, for example, information on the weather forecast for the current location or the destination. That is, in the present embodiment, when the state of the HMD 1 changes, the control unit 132 switches and changes the information displayed on the display unit 14.
- when the HMD 1 changes from the normal state to the vertical swing operation, an image g103 is displayed.
- the image displayed in the vertical swing operation is an image in which the information of the area surrounded by the dashed line g131 is superimposed on the external image, in addition to the information of the area surrounded by the dashed line g111.
- the information of the area surrounded by the dashed line g131 is information set in advance as information to be displayed when the user performs the vertical swing operation, and is, for example, information related to navigation such as a route from the current location to a destination.
- when the HMD 1 changes to the horizontal swing operation, an image g104 is displayed.
- the image displayed in the horizontal swing operation is an image in which the information of the area surrounded by the dashed line g141 is superimposed on the external image, in addition to the information of the area enclosed by the dashed line g111.
- the information of the area surrounded by the dashed line g141 is information set in advance as information to be displayed when the user performs the horizontal swing operation, and is, for example, information on a schedule. That is, in the present embodiment, when the operation of the HMD 1 changes, the control unit 132 adds to and changes the information displayed on the display unit 14.
- for example, when the HMD 1 changes from the vertical swing operation to the horizontal swing operation, the control unit 132 changes the displayed information by adding the schedule information to the information displayed in the image g103 (time information and navigation information).
- as described above, the HMD 1 of the present embodiment causes the additional information to be displayed when the HMD 1 reciprocates between the normal state (reference posture) and a posture outside the normal state within a predetermined time.
- postures outside the normal state (reference posture) include the downward posture, upward posture, leftward posture, rightward posture, upper-left posture, upper-right posture, lower-left posture, and lower-right posture.
- the additional information is, for example, navigation information, schedule information, and the like.
- however, the user does not always want information to be displayed in the downward state, the vertical swing operation, or the horizontal swing operation. Therefore, when the user wants information to be displayed on the HMD 1, the user operates the operation unit 10 to instruct the HMD 1 to display information, and when the user does not want information to be displayed on the HMD 1, the user operates the operation unit 10 to instruct the HMD 1 not to display information.
- the control unit 132 generates a display instruction or a non-display instruction according to the instruction information acquired by the operation unit 10.
- when the control unit 132 outputs the display instruction, the image changing unit 133 displays, on the display unit 14, an image based on the information output from the acquisition unit 11, in accordance with the detection value of the detection unit 12.
- when the control unit 132 outputs the non-display instruction, the image changing unit 133 does not display an image based on the information output by the acquisition unit 11 on the display unit 14.
- in the example described above, the time information is displayed on the display unit 14 in the normal state, the time information is changed to other information when the state changes, and further information is added when the operation changes.
- note that the control unit 132 may control the display so that blank information is displayed in the normal state, that is, no information is displayed, other information is displayed when the state changes, and further information is added to the blank information when the operation changes.
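The switching and adding behavior above can be sketched as follows. The mapping of states and operations to information items (time, weather, navigation, schedule) follows the Fig. 5 example; the function itself is an illustrative assumption, not part of the embodiment.

```python
def displayed_info(state="normal", operation=None):
    """Return the list of information items to display for a given state
    and operation, mirroring the Fig. 5 example."""
    if operation == "vertical_swing":
        # an operation change adds navigation to the normal-state information
        return ["time", "navigation"]
    if operation == "horizontal_swing":
        # a further operation change adds the schedule information
        return ["time", "navigation", "schedule"]
    if state == "downward":
        # a state change switches the information rather than adding to it
        return ["weather"]
    return ["time"]  # normal state
```

The sketch makes the distinction explicit: state changes switch information, while operation changes accumulate it.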
- FIG. 6 is a flowchart of the threshold learning and display image change processing according to the present embodiment. In the following processing, it is assumed that the user sets in advance the information to be displayed in the downward state, the information to be displayed in the vertical swing operation, and the information to be displayed in the horizontal swing operation.
- (Step S1) The control unit 132 acquires an operation instruction from the acquisition unit 11. Subsequently, the control unit 132 determines whether the acquired operation instruction includes an instruction indicating the learning mode. If the control unit 132 determines that an instruction indicating the learning mode is included (step S1; YES), the process proceeds to step S2; if the control unit 132 determines that no instruction indicating the learning mode is included (step S1; NO), the process proceeds to step S5.
- (Step S2) The control unit 132 acquires the detection value of the magnetic sensor 121 and the detection value of the acceleration sensor 122 in each state and each operation while the user hu wears the HMD 1 on the head. Subsequently, the control unit 132 calculates the azimuth and the amount of change in the azimuth using a known method, using the acquired detection values of the magnetic sensor 121 and the acceleration sensor 122. Note that the control unit 132 may also acquire detection values of the angular velocity sensor 123 in each operation while the user hu wears the HMD 1 on the head, and calculate the angular velocity from the acquired detection values using a known method.
- (Step S3) The control unit 132 sets the azimuth thresholds identifying each state and the azimuth change amount thresholds identifying each operation, using the calculated azimuth and azimuth change amount of each state and each operation. Subsequently, in the learning mode, the control unit 132 measures the time during which each state and each operation is performed, and sets each predetermined time based on the measured times. Subsequently, the control unit 132 writes, in the storage unit 131, the azimuth threshold for identifying the state of the HMD 1 and the predetermined time in association with the state of the HMD 1.
- in addition, the control unit 132 writes, in the storage unit 131, the threshold of the amount of change of the azimuth for identifying the operation of the HMD 1 and the predetermined time in association with the operation of the HMD 1.
- note that the control unit 132 may set the angular velocity thresholds for identifying each operation using the calculated angular velocity of each operation, measure the time taken for each operation in the learning mode, and set each predetermined time based on the measured time.
- in addition, the control unit 132 may write, in the storage unit 131, the threshold of the angular velocity for identifying the operation of the HMD 1 and the predetermined time in association with the operation of the HMD 1. After these processes end, the control unit 132 returns to the process of step S1.
- (Step S5) The acquisition unit 11 acquires information from an external device and outputs the acquired information to the image changing unit 133. Subsequently, while the user hu wears the HMD 1 on the head, the control unit 132 acquires the detection values of the magnetic sensor 121 and the acceleration sensor 122. Subsequently, the control unit 132 calculates the azimuth and the amount of change in the azimuth using a known method, using the acquired detection values of the magnetic sensor 121 and the acceleration sensor 122.
- (Step S6) The control unit 132 compares the azimuth and the azimuth change amount calculated in step S5 with the thresholds stored in the storage unit 131.
- (Step S7) The control unit 132 determines whether the HMD is in the downward state according to whether the period during which the azimuth calculated in step S5 is equal to or greater than the downward-direction threshold stored in the storage unit 131 continues for the first predetermined time t31, which is the threshold of the predetermined time, or longer. If the control unit 132 determines that the HMD is in the downward state (step S7; YES), the process proceeds to step S8; if not (step S7; NO), the process proceeds to step S9. (Step S8) The control unit 132 switches the information to be displayed to the information used for display in the downward state. The image displayed by the display unit 14 is based on at least one piece of the information acquired by the acquisition unit 11. The control unit then advances the process to step S13.
- (Step S9) The control unit 132 determines whether a vertical swing operation has been performed according to whether the period during which the azimuth change amount calculated in step S5 is equal to or greater than the threshold of the change amount of at least one of the directions downward from the front, to the front from upward, and downward from upward stored in the storage unit 131 has occurred a first predetermined number of times or more within the second predetermined time (t101, t201, t301), which is the threshold of the predetermined time. If the control unit 132 determines that a vertical swing operation has been performed (step S9; YES), the process proceeds to step S10; if not (step S9; NO), the process proceeds to step S11. (Step S10) The control unit 132 switches the information to be displayed to the information used for display in the vertical swing operation. The control unit then advances the process to step S13.
- (Step S11) The control unit 132 determines whether a horizontal swing operation has been performed according to whether the period during which the azimuth change amount calculated in step S5 is equal to or greater than the threshold of the change amount of at least one direction from the front stored in the storage unit 131 has occurred a second predetermined number of times within the third predetermined time (t401, t501), which is the threshold of the predetermined time. If the control unit 132 determines that a horizontal swing operation has been performed (step S11; YES), the process proceeds to step S12; if not (step S11; NO), the process returns to step S5. (Step S12) The control unit 132 switches the information to be displayed to the information used for display in the horizontal swing operation. The control unit then advances the process to step S13.
- (Step S13) The image changing unit 133 generates a display image using the information switched by the control unit 132 in step S8, S10, or S12. Subsequently, the display unit 14 displays the image generated by the image changing unit 133. This concludes the threshold learning and display image change processing.
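The ordering of the determinations in steps S7, S9, and S11 can be sketched as follows. The boolean inputs stand in for the threshold comparisons described above; the function name is an illustrative assumption.

```python
def select_display_mode(downward, vertical_swing, horizontal_swing):
    """Mirror the flowchart ordering: the downward-state check (S7) takes
    precedence over the vertical swing check (S9), which takes precedence
    over the horizontal swing check (S11)."""
    if downward:
        return "downward"          # step S8: switch to downward-state info
    if vertical_swing:
        return "vertical_swing"    # step S10: switch to vertical-swing info
    if horizontal_swing:
        return "horizontal_swing"  # step S12: switch to horizontal-swing info
    return None                    # no match: return to step S5
```

Because the checks are ordered, a motion that satisfies more than one condition is resolved in favor of the earlier step.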
- according to the present embodiment, it is possible to change the information to be displayed according to the movement of the image display device, that is, the movement (state, operation) of the user's head. Further, according to the present embodiment, a plurality of pieces of display data can be changed by simple operations of the HMD 1 such as facing upward, nodding, and horizontal head movement. Further, according to the present embodiment, display data can be selectively displayed by such simple operations of the HMD 1. Moreover, according to the present embodiment, different information can be displayed according to the operation pattern.
- FIG. 7 is a block diagram showing a schematic configuration of the HMD 1A according to the present embodiment.
- the HMD 1A includes an operation unit 10, an acquisition unit 11A, a detection unit 12, an information change unit 13A, a display unit 14, and a transmission unit 15.
- the information change unit 13A includes a storage unit 131, a control unit 132A, and an image change unit 133.
- the same reference numerals are used for functional units having the same functions as the HMD 1 in the first embodiment, and the description will be omitted.
- the HMD 1A communicates with the vehicle 2 and the terminal 3.
- the HMD 1A communicates with the vehicle 2 and the terminal 3 using, for example, a communication method of Bluetooth (registered trademark) LE (Low Energy) (hereinafter, referred to as BLE) standard.
- the vehicle 2 includes a detection unit 21, a radar 22, and a transmission unit 23.
- the vehicle 2 is provided with a vehicle body frame, an engine, a steering wheel, and the like (not shown). Further, the detection unit 21 includes a remaining amount detection unit 211 and a vehicle speed detection unit 212.
- the radar 22 further includes a signal generation unit 221, a transmission unit 222, a reception unit 223, an other-vehicle detection unit 224, and a human detection unit 225.
- the terminal 3 includes a receiving unit 31, an operation unit 32, a control unit 33, a display unit 34, a display information generating unit 35, and a transmitting unit 36.
- the vehicle 2 is, for example, a four-wheeled vehicle, a saddle-ride type vehicle, a motorcycle, or the like.
- the vehicle 2 transmits the information detected by the detection unit 21 to the terminal 3 by the transmission unit 23. Further, the vehicle 2 transmits the information detected by the radar 22 to the terminal 3 by the transmitting unit 23.
- the detection unit 21 detects the state of the vehicle 2 and outputs information indicating the detected state to the transmission unit 23.
- the remaining amount detection unit 211 detects the remaining amount of fuel (gasoline, power, etc.) that is the state of the vehicle 2, and outputs information indicating the detected remaining amount to the transmitting unit 23.
- the vehicle speed detection unit 212 detects the vehicle speed which is the state of the vehicle 2, and outputs information indicating the detected vehicle speed to the transmission unit 23.
- the radar 22 is, for example, a MIMO (Multiple-Input Multiple-Output) radar using millimeter waves; it detects other vehicles and people (including pedestrians) and outputs the detection results to the transmission unit 23 as information indicating the radar detection results.
- the radar 22 may detect another vehicle or a person using an infrared ray, an ultrasonic wave, an image captured by an imaging device, or the like.
- the signal generation unit 221 generates a transmission signal, and outputs the generated transmission signal to the transmission unit 222, the other vehicle detection unit 224, and the human detection unit 225.
- the transmission unit 222 includes a transmission antenna, converts the transmission signal output from the signal generation unit 221 into a transmission wave, and transmits the converted signal from the transmission antenna.
- the receiving unit 223 includes a receiving antenna, receives a reception wave that is a reflected wave that is reflected by an object (another vehicle, a pedestrian, or the like), and converts the received wave into a reception signal. Further, the receiving unit 223 outputs the received signal to the other vehicle detecting unit 224 and the human detecting unit 225.
- the other-vehicle detection unit 224 detects another vehicle from the transmission signal output from the signal generation unit 221 and the reception signal output from the reception unit 223 using a known method, and outputs information indicating the detection result to the transmission unit 23 when another vehicle is detected.
- the person detection unit 225 detects a person from the transmission signal output from the signal generation unit 221 and the reception signal output from the reception unit 223 using a known method, and outputs information indicating the detection result to the transmission unit 23.
- the transmitting unit 23 transmits, to the terminal 3, the information indicating the state of the vehicle 2 output from the detecting unit 21 and the information indicating the radar detection result output from the radar 22.
- the terminal 3 is, for example, a smartphone, a tablet terminal or the like.
- the terminal 3 detects a user's operation instruction, and when the detected operation instruction includes an instruction to the HMD 1A, the terminal 3 extracts the instruction and transmits the extracted instruction to the HMD 1A.
- the terminal 3 switches the information to be selected from the information received from the vehicle 2 according to the information selection instruction, which is an instruction output by the HMD 1A to select the information to be displayed on the display unit 14, generates display information based on the switched information, and transmits the generated display information to the HMD 1A.
- the receiving unit 31 receives the information indicating the state of the vehicle 2 and the information indicating the radar detection result transmitted by the vehicle 2, and outputs the received information indicating the state of the vehicle 2 and the information indicating the radar detection result to the display information generating unit 35. Further, the receiving unit 31 outputs the information selection instruction output from the HMD 1A to the control unit 33. In addition, the receiving unit 31 acquires navigation information to a destination received via a network (not shown), and outputs the acquired navigation information to the control unit 33. Note that the receiving unit 31 also acquires the weather forecast, information indicating the current location, information on shops around the current location, congestion information on the route of the vehicle, and the like via the network, and outputs the acquired information to the control unit 33.
- the operation unit 32 is, for example, a touch panel sensor attached on the display unit 34, detects a user operation, and outputs the detected operation result to the control unit 33.
- the operation results include an instruction to start the learning mode, an information display instruction for enabling information display, an information non-display instruction for disabling information display, information indicating a departure place, information indicating a destination, an instruction to acquire navigation information, and the like.
- the control unit 33 generates a switching instruction for switching the information to be displayed on the display unit 14 of the HMD 1A from the information received by the receiving unit 31 in accordance with the information selection instruction output by the receiving unit 31.
- the control unit 33 outputs the generated switching instruction to the display information generating unit 35.
- the control unit 33 extracts a learning mode start instruction, an information display instruction, and an information non-display instruction from the operation result output from the operation unit 32.
- the control unit 33 outputs the extracted start instruction of the learning mode, the information display instruction, and the information non-display instruction to the transmission unit 36.
- note that the control unit 33 may output, to the transmission unit 36, an instruction not to transmit the display information to the HMD 1A when the information non-display instruction is included in the operation result output from the operation unit 32.
- the control unit 33 outputs, to the transmission unit 36, the information indicating the departure place output from the operation unit 32, the information indicating the destination, an instruction to acquire navigation information, an instruction to acquire traffic jam information, and the like.
- the display unit 34 is, for example, a liquid crystal display panel, and includes a backlight.
- the display unit 34 displays the image information output by the control unit 33.
- the image information includes an image for setting and operating the HMD 1A, an image for setting communication with the vehicle 2, and the like.
- the display information generation unit 35 switches the information output by the reception unit 31 to information to be transmitted to the HMD 1A in accordance with the switching instruction output by the control unit 33.
- the display information generation unit 35 selects information from the information output by the reception unit 31 based on the switched result, and generates display information to be displayed on the display unit 14 of the HMD 1A based on the selected information.
- the display information generation unit 35 outputs the generated display information to the transmission unit 36.
- the display information is at least one of vehicle speed information of the vehicle 2, fuel remaining amount information of the vehicle 2, information indicating a radar detection result, navigation information, traffic jam information, and current location information.
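A minimal sketch of the selection performed by the display information generation unit 35 follows, assuming the received information is held as key-value pairs and the information selection instruction names the keys to keep; both assumptions are illustrative, not taken from the embodiment.

```python
def generate_display_info(received, selection_instruction):
    """Filter the information received from the vehicle and the network down
    to the items named in the information selection instruction.

    received: dict of all information items available at the terminal.
    selection_instruction: iterable of item names the HMD should display.
    """
    # keep only the requested items that are actually available
    return {key: received[key] for key in selection_instruction if key in received}
```

The resulting dict would then be transmitted to the HMD 1A as the display information.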
- the transmission unit 36 transmits the instruction to start the learning mode, the information display instruction, and the information non-display instruction output from the control unit 33 to the HMD 1A.
- the transmission unit 36 also transmits display information to the HMD 1A.
- the acquisition unit 11A acquires the start instruction of the learning mode transmitted by the terminal 3, the information display instruction, the information non-display instruction, and the display information.
- the acquisition unit 11A outputs the acquired display information to the image change unit 133.
- the acquisition unit 11A also outputs the acquired instruction to start the learning mode, the information display instruction, and the information non-display instruction to the control unit 132A.
- the control unit 132A acquires the learning mode start instruction, the information display instruction, and the information non-display instruction output from the acquisition unit 11A.
- the control unit 132A acquires the detection value output from the detection unit 12.
- the control unit 132A learns the threshold as in the case of the control unit 132, and stores the learned threshold and the predetermined time in the storage unit 131.
- the control unit 132A determines the state of the HMD 1A (normal state, downward state, vertical swing operation, horizontal swing operation) using the detection value output from the detection unit 12, the threshold value stored in the storage unit 131, and the predetermined time, in the same manner as the control unit 132, and generates an information selection instruction based on the determination result.
- the control unit 132A outputs the generated information selection instruction to the transmission unit 15. Further, the control unit 132A generates a display instruction or a non-display instruction according to the acquired information, and outputs the generated display instruction or the non-display instruction to the image changing unit 133.
- the transmitting unit 15 transmits the information selection instruction output from the control unit 132A to the terminal 3.
- FIG. 8 is a diagram showing an example of information displayed on the display unit 14 according to the present embodiment.
- the image g201 is an example of an image displayed on the display unit 14 when the user wears the HMD 1A on the head and faces the front (normal state).
- the image displayed in the normal state is an image in which the information of the area surrounded by the dashed line g211 is superimposed on the image of the outside world.
- the information of the area enclosed by the dashed line g211 is navigation information, and includes, for example, an image showing the route, an image showing the current intersection name, an image showing the next intersection name, and an image showing the distance to the next intersection.
- the display position of the navigation information may be set by the user operating the terminal 3 or the HMD 1A, or may be a predetermined position.
- the predetermined position is preferably, for example, a position where it is assumed that the road surface is viewed in the field of view of the user.
- when the HMD 1A changes from the normal state to the downward state, the image g202 is displayed.
- the image displayed in the downward state is an image in which the information of the area surrounded by the dashed line g221 is superimposed on the image of the outside world.
- the information of the area enclosed by the dashed line g221 includes, for example, information indicating the current time, the distance to the destination, the scheduled arrival time at the destination, and the estimated remaining travel time to the destination.
- thus, when changing to the downward state, the HMD 1A switches the display information from the navigation information to information such as the distance to the destination and displays it. The position where the information is displayed may be set by the user operating the terminal 3 or the HMD 1A.
- when the HMD 1A changes from the normal state to the vertical swing operation, the image g203 is displayed.
- the image displayed in the vertical swing operation is an image in which the information of the area surrounded by the dashed lines g231 to g233 is superimposed on the external image, in addition to the dashed line g211 indicating the navigation information.
- the information of the area enclosed by the dashed line g231 includes information indicating the position of a pedestrian, the information of the area enclosed by the dashed line g232 includes, for example, information indicating the vehicle speed, and the information of the area enclosed by the dashed line g233 includes information indicating the remaining amount of fuel.
- thus, when the HMD 1A changes from the normal state to the vertical swing operation, the HMD 1A switches the display information to information obtained by adding information such as the vehicle speed and the remaining amount of fuel to the navigation information.
- FIG. 8 shows an example in which, when changing from the normal state to the vertical swing operation, information such as the vehicle speed and the remaining amount of fuel is displayed in addition to the navigation information displayed in the normal state, but the display is not restricted to this.
- the navigation information may instead be switched to information such as the vehicle speed and the remaining amount of fuel.
- the display position of the information in the area surrounded by the dashed lines g231 to g233 may be a preset position, or may be a position set by the user operating the terminal 3 or the HMD 1A.
- when the HMD 1A changes to the horizontal swing operation, the image g204 is displayed.
- the image displayed during the horizontal swing operation is an image in which, in addition to the dashed line g211 indicating the navigation information and the information displayed during the vertical swing operation (vehicle speed, remaining amount of fuel), the information of the areas surrounded by the dashed lines g241 to g245 is superimposed on the image of the outside world. The information of the area surrounded by the dashed line g241 includes information indicating the predicted movement of a person detected in the traveling direction of the vehicle 2.
- the information on the area enclosed by the dashed line g 242 includes information indicating the predicted movement of the other vehicle in the traveling direction of the vehicle 2.
- the information on the area enclosed by the dashed line g243 includes information indicating the speed limit of the road on which the vehicle 2 is traveling.
- the information in the area enclosed by the dashed line g244 includes an image indicating the name of the destination, and the area enclosed by the dashed line g245 includes an image indicating the distance the vehicle is predicted to be able to travel before refueling is needed.
- when the HMD 1A changes from the horizontal swing operation to the vertical swing operation, the HMD 1A adds to the display information both the information to be displayed in the normal state and the information to be displayed during the vertical swing operation.
- the present invention is not limited to this.
- the display position of the information in the areas enclosed by the dashed lines g241 to g245 may be a preset position, or may be a position set by the user operating the terminal 3 or the HMD 1A.
- the HMD 1A of the present embodiment displays the additional information when the HMD 1A performs a reciprocating operation between the normal state (reference posture) and outside the normal state (reference posture) within a predetermined time.
- outside the normal state (reference posture) refers to the downward posture, upward posture, leftward posture, rightward posture, upper-left posture, upper-right posture, lower-left posture, and lower-right posture.
- the additional information is, for example, vehicle speed information, fuel remaining amount information, other vehicle information, interpersonal information, destination information and the like.
- the additional information is information about the vehicle 2 and information about the traveling of the vehicle 2; that is, information corresponding to the situation in which the HMD 1A is used.
- the control unit 132A may acquire the detection values of the detection unit 12 while the vehicle 2 is traveling, based on the vehicle speed of the vehicle 2, and may correct the thresholds of each state and each operation using the acquired detection values.
- the normal state when the user is on the vehicle 2 may involve leaning further forward than the front-facing posture assumed in the learning mode. In such a state, even if the user turns the head downward or performs the vertical swing operation, each state and each operation may not be detectable with the thresholds determined in the learning mode, which was performed in the normal state. Therefore, the control unit 132A may correct the thresholds using the detection values acquired while traveling. Likewise, since the swing operations cannot be detected with the thresholds set in the learning mode when the user is wearing a helmet, the control unit 132A may correct the thresholds using the detection values acquired while traveling.
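One way the correction described above could work is to re-bias every learned orientation threshold by the offset between the reference posture observed in the learning mode and the one observed while the vehicle is judged to be traveling (e.g. leaning forward, helmet on). The following is only a sketch under that assumption; the specification does not state the correction formula, and all names are illustrative:

```python
# Hedged sketch: shift each learned (theta, phi) threshold by the offset
# between the at-rest reference posture and the posture observed while
# the vehicle is judged (from its speed) to be traveling.
# Function and variable names are assumptions, not from the patent.

def correct_thresholds(thresholds, learned_ref, riding_ref):
    """thresholds  -- {state_name: (theta, phi)} learned in learning mode
    learned_ref -- (theta, phi) of the front-facing posture in learning mode
    riding_ref  -- (theta, phi) of the posture observed while traveling
    Returns a corrected copy of the threshold table."""
    d_theta = riding_ref[0] - learned_ref[0]
    d_phi = riding_ref[1] - learned_ref[1]
    return {
        state: (theta + d_theta, phi + d_phi)
        for state, (theta, phi) in thresholds.items()
    }

# e.g. the rider leans 12 degrees further down than in learning mode
corrected = correct_thresholds({"downward": (0.0, 30.0)},
                               learned_ref=(0.0, 0.0),
                               riding_ref=(0.0, 12.0))
```

A uniform offset is only one plausible choice; the specification says only that the thresholds "may be corrected" using the detection values acquired while traveling.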
- the present invention is not limited to this.
- examples of other information include: information on the current location (road information, nearby store information), landmark names on the route, the reading of the water temperature gauge of the vehicle 2, the travel distance of the vehicle (trip meter), the engine control mode, the suspension control mode, the presence or absence of an emergency vehicle, the presence or absence of road construction, and the presence or absence of a traffic jam ahead on the route.
- the terminal 3 may acquire these pieces of information via the communication line and transmit the information to the HMD 1A, and the vehicle 2 may acquire road information and transmit the information to the HMD 1A.
- the control unit 132 or 132A may switch the information to be displayed when it detects the upward state, the leftward state, the rightward state, the upper-left state, the upper-right state, the lower-left state, or the lower-right state.
- that is, the control unit 132 or 132A may determine at least one of the vertical, horizontal, and diagonal directions with respect to the normal state (reference state) of the HMD 1 or 1A using the detection result of the detection unit 12.
- according to the present embodiment, the display of the additional information can be switched according to the situation in which the HMD 1A is used, in response to the operation (including the state) of the HMD 1A. Moreover, according to the present embodiment, information about the vehicle 2 can be displayed on the HMD 1A, and the information to be displayed can be switched according to the operation.
- a program for realizing the functions of the HMD 1 and 1A in the present invention may be recorded in a computer-readable recording medium, and a computer system may read and execute the program recorded in the recording medium to perform the switching of information, the display of an image based on the display information, and the like.
- the “computer system” includes an OS and hardware such as peripheral devices.
- the “computer system” also includes a WWW system provided with a homepage providing environment (or display environment).
- the “computer-readable recording medium” means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system.
- the “computer-readable recording medium” also includes a medium that holds the program for a certain period of time, such as the volatile memory (RAM) in a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- the program may be transmitted from a computer system in which the program is stored in a storage device or the like to another computer system via a transmission medium or by transmission waves in the transmission medium.
- the “transmission medium” for transmitting the program is a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
- the program may realize only a part of the functions described above. Furthermore, it may be a so-called difference file (difference program) that realizes the above-described functions in combination with a program already recorded in the computer system.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- Transportation (AREA)
- Combustion & Propulsion (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Instrument Panels (AREA)
Abstract
Description
This application claims priority based on Japanese Patent Application No. 2016-025244 filed on February 12, 2016, the contents of which are incorporated herein by reference.
(1) An image display device according to one aspect of the present invention includes: an acquisition unit that acquires two or more pieces of information; a detection unit that detects at least one of the acceleration, orientation, and angular velocity of the image display device; an image changing unit that determines the state of the image display device based on the result detected by the detection unit, switches the information selected from the two or more pieces of information acquired by the acquisition unit according to the determined state of the image display device, and generates display data based on the switched information; and a display unit that displays the display data.
(4) In the aspect of (3) above, the additional information may be information corresponding to the situation in which the image display device is being used.
In the case of (2) above, more display data can be selectively displayed by a simple operation of the image display device.
In the case of (3) above, different information can be displayed according to the operation pattern.
In the case of (4) above, the display of the additional information corresponding to the situation in which the image display device is being used can be switched according to the operation of the image display device.
In the case of (5) above, information about the vehicle can be displayed on the image display device, and the information to be displayed can be switched according to the operation of the image display device.
FIG. 1 is a block diagram showing the schematic configuration of the HMD 1 according to the present embodiment.
As shown in FIG. 1, the HMD 1 includes an operation unit 10, an acquisition unit 11, a detection unit 12, an information changing unit 13, and a display unit 14. The detection unit 12 includes a magnetic sensor 121 (detection unit), an acceleration sensor 122 (detection unit), and an angular velocity sensor 123 (detection unit).
The information changing unit 13 includes a storage unit 131, a control unit 132, and an image changing unit 133.
Hereinafter, when the user stands upright with respect to the ground and wears the HMD 1 on the head, the coordinates of the acceleration sensor 122 are defined with the vertical direction as seen from the user as the z-axis, the left-right direction as the x-axis, and the front-rear direction as the y-axis. As shown in FIG. 2, the acceleration sensor 122 is mounted so that, for example, the detected value in the z-axis direction is in the negative direction.
As shown in FIG. 2, the HMD 1 of the present embodiment is of an eyeglass type. The HMD 1 includes display units 14R and 14L on the left and right, nose pads 102R and 102L, a bridge 103, and temples 101R and 101L. The detection unit 12 is mounted in the left and right temples 101R and 101L, and the operation unit 10, the acquisition unit 11, the storage unit 131, the control unit 132, and the image changing unit 133 are mounted in the left temple 101L. The configuration shown in FIG. 2 is an example, and the locations where the operation unit 10, the acquisition unit 11, the detection unit 12, the storage unit 131, the control unit 132, and the image changing unit 133 are mounted are not limited to these.
Next, the processing in the learning mode and the state determination of the HMD 1 will be described.
When an instruction to perform the learning mode is acquired, the control unit 132 first prompts the user to assume each state, and acquires the detected values of the detection unit 12 for each of the upward, front-facing, leftward, rightward, upper-left, upper-right, lower-left, and lower-right states. By learning the detected values of each state, the control unit 132 determines the threshold and predetermined time for the detected values of the magnetic sensor 121 and the threshold and predetermined time for the detected values of the acceleration sensor 122 in each state. The control unit 132 may also determine the threshold and predetermined time of the angular velocity sensor 123 in each state.
The control unit 132 outputs to the image changing unit 133 an instruction prompting an operation related to a state of the HMD 1, for example an instruction prompting the upward state. The image changing unit 133 displays information based on the instruction prompting the upward state on the display unit 14; such information is, for example, "Please turn your head upward." The user maintains the posture corresponding to the information displayed on the display unit 14 for a predetermined time (for example, two seconds or more). The control unit 132 acquires the detected values of each sensor and the time each state is maintained during the period in which the instruction is displayed on the display unit 14. For learning, the control unit 132 prompts, for example, five repetitions of each state, and by learning the detected values acquired during the five repetitions, determines the thresholds of the detected values of the magnetic sensor 121 and the acceleration sensor 122 in each state. The control unit 132 also determines the predetermined time for determining each state based on the time each state was maintained.
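As an illustration only, the threshold learning described above (several prompted repetitions per state, yielding an orientation threshold and a hold time) might be sketched as follows. The function name, the averaging of the repetitions, and the use of the shortest observed hold are assumptions made for the sketch; the specification states only that the thresholds and predetermined times are determined by learning the detected values:

```python
# Hedged sketch of per-state threshold learning from prompted repetitions.
# All names and the averaging/minimum rules are illustrative assumptions.

def learn_state_threshold(samples, hold_times):
    """Derive an orientation threshold (theta, phi) and a hold duration
    from e.g. five prompted repetitions of one head state.

    samples    -- list of (theta, phi) orientation readings, one per repetition
    hold_times -- list of seconds each repetition was held
    """
    n = len(samples)
    theta = sum(s[0] for s in samples) / n   # mean horizontal orientation
    phi = sum(s[1] for s in samples) / n     # mean vertical orientation
    t_hold = min(hold_times)                 # shortest hold that still counted
    return (theta, phi), t_hold

thresholds = {}
# e.g. the "downward" state learned from five repetitions
thresholds["downward"] = learn_state_threshold(
    [(0.0, -31.0), (1.0, -29.0), (-1.0, -30.5), (0.5, -30.0), (-0.5, -29.5)],
    [2.1, 2.3, 2.0, 2.4, 2.2],
)
```

The resulting pair corresponds to the orientation threshold and predetermined time that FIG. 3 shows being stored per state in the storage unit 131.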
The control unit 132 may make this determination based on the result of the user operating the operation unit 10 at the start and end of learning a state, or may treat the period during which the detected values of each sensor are maintained as the period during which the state is maintained.
The control unit 132 outputs to the image changing unit 133, for example, an instruction prompting a motion of turning downward from the front. The image changing unit 133 displays information based on that instruction on the display unit 14. The user repeatedly performs the motion corresponding to the displayed information within a predetermined time (for example, two seconds or more). The control unit 132 acquires the detected values of each sensor during the period in which the instruction is displayed. For learning, the control unit 132 prompts, for example, five sets of the motion, and by learning the detected values acquired during the five sets, determines the threshold of the detected values of the angular velocity sensor 123. One set of the motion is the motion of turning downward from the front performed a plurality of times within the predetermined time. The control unit 132 also extracts, for example, local maxima from the detected values of each sensor, and determines the predetermined time for determining each operation based on the period of the extracted local maxima.
FIG. 3 is a diagram showing an example of detected values related to states stored in the storage unit 131 according to the present embodiment. As shown in FIG. 3, the storage unit 131 stores each state in association with an orientation threshold calculated using the detected values of the magnetic sensor 121 and the acceleration sensor 122 and a predetermined time. For example, the storage unit 131 stores the upward state in association with (θ12, φ12) as the orientation threshold and t11 as the predetermined time, and stores the front-facing state in association with (θ22, φ22) as the orientation threshold and t21 as the predetermined time. Here, θ12 and θ22 are horizontal orientations, and φ12 and φ22 are vertical orientations. θ12, φ12, θ22, and φ22 may be values having a range, and the predetermined times t11 to t91 may be the same value.
Next, the methods for detecting the downward state, the vertical swing operation, and the horizontal swing operation will be described.
The control unit 132 determines that the HMD is in the downward state when the orientation (θ, φ) calculated using the detected values of the magnetic sensor 121 and the acceleration sensor 122 remains at or beyond the orientation threshold (θ32, φ32) for a first predetermined time t31 or longer.
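A minimal sketch of this duration-based determination, assuming the calculated orientation arrives as periodic samples; the function name and sampling scheme are illustrative, not taken from the specification:

```python
# Hedged sketch: the downward state is recognized once the calculated
# orientation stays at or beyond the threshold for the predetermined time.

def detect_downward(orientation_stream, threshold, t_hold, dt):
    """orientation_stream -- iterable of (theta, phi) samples, one per dt seconds
    threshold           -- (theta32, phi32) orientation threshold
    t_hold              -- first predetermined time t31 in seconds
    Returns True once the threshold has been continuously exceeded for t_hold."""
    held = 0.0
    for theta, phi in orientation_stream:
        if theta >= threshold[0] and phi >= threshold[1]:
            held += dt
            if held >= t_hold:
                return True
        else:
            held = 0.0  # any excursion below the threshold resets the timer
    return False
```

The same pattern, with the angular velocity sensor's threshold in place of the orientation threshold, would cover the vertical and horizontal swing determinations.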
Next, examples of images displayed on the display unit 14 of the HMD 1 will be described.
FIG. 5 is a diagram showing an example of information displayed on the display unit 14 according to the present embodiment.
Image g101 is an example of an image displayed on the display unit 14 when the user wears the HMD 1 on the head and faces the front (the normal state). The image displayed in the normal state is an image in which the time information in the area enclosed by the chain line g121 is superimposed on the image of the outside world. The image of the outside world is not an image created by the image changing unit 133 but the image the user views through the display unit 14. The information in the area enclosed by the chain line g111 is information the user has set in advance to be displayed in the normal state, for example time information.
That is, in the present embodiment, when the operation of the HMD 1 changes, the control unit 132 changes the information displayed on the display unit 14 by adding to it. When the operation of the HMD 1 changes from the vertical swing operation to the horizontal swing operation, the control unit 132 changes the displayed information by adding schedule information to the information displayed in image g103 (time information and navigation information). In other words, the HMD 1 of the present embodiment displays additional information when the HMD 1 performs a reciprocating operation between the normal state (reference posture) and outside the normal state (reference posture) within a predetermined time. Outside the normal state (reference posture) refers to the downward, upward, leftward, rightward, upper-left, upper-right, lower-left, and lower-right postures. The additional information is, for example, navigation information, schedule information, and the like.
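The reciprocating-operation determination, leaving the reference posture and returning to it within a predetermined time, could be sketched as follows, assuming a stream of timestamped state transitions; all names are illustrative, not from the specification:

```python
# Hedged sketch: detect a round trip away from and back to the reference
# posture within a time window, which triggers the additional information.

def is_reciprocation(events, window):
    """events -- list of (timestamp_seconds, state_name) transitions
    window -- predetermined time in seconds
    Returns True if the device left the "normal" reference posture and
    returned to it within `window` seconds."""
    left_at = None
    for t, state in events:
        if state != "normal" and left_at is None:
            left_at = t          # moment the reference posture was left
        elif state == "normal" and left_at is not None:
            return (t - left_at) <= window
    return False
```

Under this sketch, a quick nod (normal → downward → normal in 1.5 s against a 2 s window) would qualify, while lingering in the downward posture would not.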
Next, the procedure for learning the thresholds and changing the displayed image will be described.
FIG. 6 is a flowchart of the threshold learning and display-image changing processing according to the present embodiment.
In the following processing, it is assumed that the user has set in advance the information to be displayed in the downward state, the information to be displayed during the vertical swing operation, and the information to be displayed during the horizontal swing operation.
When the control unit 132 determines that an instruction indicating the learning mode is included (step S1; YES), it proceeds to step S2; when it determines that no instruction indicating the learning mode is included (step S1; NO), it proceeds to step S5.
The control unit 132 may use the calculated angular velocity of each operation to set the angular velocity thresholds for identifying each operation, measure the time each operation was performed in the learning mode, and set each predetermined time based on the measured time. The control unit 132 may then write to the storage unit 131 each operation of the HMD 1 in association with the angular velocity threshold and predetermined time for identifying that operation. After the processing ends, the control unit 132 returns to step S1.
When the control unit 132 determines that the HMD is in the downward state (step S7; YES), it proceeds to step S8; when it determines that the HMD is not in the downward state (step S7; NO), it proceeds to step S9.
(Step S8) The control unit 132 switches the information to be displayed to the information to be used for display in the downward state. The image displayed by the display unit 14 is based on at least one piece of the information acquired by the acquisition unit 11. The control unit proceeds to step S13.
(Step S10) The control unit 132 switches the information to be displayed to the information to be used for display during the vertical swing operation. The control unit proceeds to step S13.
(Step S12) The control unit 132 switches the information to be displayed to the information to be used for display during the horizontal swing operation. The control unit proceeds to step S13.
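Steps S8, S10, and S12 amount to selecting a preset information set for each judged state or operation. A minimal sketch follows; the information sets are chosen only for illustration (the user configures them in advance), and the names are not from the specification:

```python
# Hedged sketch of the switching in steps S8/S10/S12: each judged
# state or operation selects a preset information set.

DISPLAY_SETS = {
    "normal": ["time"],
    "downward": ["time", "navigation"],                      # step S8
    "vertical_swing": ["time", "navigation"],                # step S10
    "horizontal_swing": ["time", "navigation", "schedule"],  # step S12
}

def select_display_info(state):
    """Return the information set for the judged state; fall back to the
    normal-state set when the state is not recognized."""
    return DISPLAY_SETS.get(state, DISPLAY_SETS["normal"])
```

In the device itself, the selected names would index into the information acquired by the acquisition unit 11, from which the image changing unit 133 generates the display data.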
This completes the threshold learning and display-image changing processing.
FIG. 7 is a block diagram showing the schematic configuration of the HMD 1A according to the present embodiment.
As shown in FIG. 7, the HMD 1A includes an operation unit 10, an acquisition unit 11A, a detection unit 12, an information changing unit 13A, a display unit 14, and a transmission unit 15. The information changing unit 13A includes a storage unit 131, a control unit 132A, and an image changing unit 133. Functional units having the same functions as those of the HMD 1 in the first embodiment are given the same reference numerals, and their description is omitted. As shown in FIG. 7, the HMD 1A communicates with the vehicle 2 and the terminal 3, for example using a communication method of the Bluetooth (registered trademark) LE (Low Energy) (hereinafter, BLE) standard.
The vehicle 2 is, for example, a four-wheeled vehicle, a saddle-type vehicle, a motorcycle, or the like. The vehicle 2 transmits the information detected by the detection unit 21 to the terminal 3 via the transmission unit 23, and also transmits the information detected by the radar 22 to the terminal 3 via the transmission unit 23.
The transmission unit 222 includes a transmission antenna, converts the transmission signal output by the signal generation unit 221 into a transmission wave, and transmits it from the transmission antenna.
The reception unit 223 includes a reception antenna, receives a reception wave that is a wave reflected from an object (another vehicle, a pedestrian, or the like), and converts the received wave into a reception signal. The reception unit 223 outputs the reception signal to the other-vehicle detection unit 224 and the person detection unit 225.
The person detection unit 225 detects a person using a well-known method from the transmission signal output by the signal generation unit 221 and the reception signal output by the reception unit 223, and when a person is detected, outputs information indicating the detection result to the transmission unit 23.
The transmission unit 23 transmits to the terminal 3 the information indicating the state of the vehicle 2 output by the detection unit 21 and the information indicating the radar detection result output by the radar 22.
The terminal 3 is, for example, a smartphone, a tablet terminal, or the like. The terminal 3 detects the user's operation instruction, and when the detected operation instruction includes an instruction for the HMD 1A, extracts the instruction and transmits it to the HMD 1A. In response to an information selection instruction, which is an instruction output by the HMD 1A to select the information to be displayed on the display unit 14, the terminal 3 switches the information received from the vehicle, generates display information based on the switched information, and transmits the generated display information to the HMD 1A.
In response to the switching instruction output by the control unit 33, the display information generation unit 35 switches the information output by the reception unit 31 to the information to be transmitted to the HMD 1A. Based on the result of the switching, the display information generation unit 35 selects information from the information output by the reception unit 31, generates display information to be displayed on the display unit 14 of the HMD 1A based on the selected information, and outputs the generated display information to the transmission unit 36. The display information is at least one of the vehicle speed information of the vehicle 2, the remaining fuel information of the vehicle 2, information indicating the radar detection result, navigation information, traffic jam information, and current location information.
The acquisition unit 11A acquires the learning mode start instruction, the information display instruction, the information non-display instruction, and the display information transmitted by the terminal 3. The acquisition unit 11A outputs the acquired display information to the image changing unit 133, and outputs the acquired learning mode start instruction, information display instruction, and information non-display instruction to the control unit 132A.
In the case of a learning mode start instruction, the control unit 132A learns the thresholds in the same manner as the control unit 132, and stores the learned thresholds and predetermined times in the storage unit 131. Using the detected values output by the detection unit 12 and the thresholds and predetermined times stored in the storage unit 131, the control unit 132A determines the state of the HMD 1A (normal state, downward state, vertical swing operation, horizontal swing operation) in the same manner as the control unit 132, and generates an information selection instruction based on the determination result. The control unit 132A outputs the generated information selection instruction to the transmission unit 15. The control unit 132A also generates a display instruction or a non-display instruction according to the acquired information, and outputs the generated display instruction or non-display instruction to the image changing unit 133.
Next, examples of information displayed on the display unit 14 of the HMD 1A will be described.
FIG. 8 is a diagram showing an example of information displayed on the display unit 14 according to the present embodiment.
Image g201 is an example of an image displayed on the display unit 14 when the user wears the HMD 1A on the head in the normal state. The image displayed in the normal state is the image of the outside world with the information in the area enclosed by the chain line g211. The information in the area enclosed by the chain line g211 is navigation information, and includes, for example, an image showing the course, an image showing the route, an image showing the name of the current intersection, an image showing the name of the next intersection, and an image showing the distance to the next intersection. The display position of the navigation information may be set by the user operating the terminal 3 or the HMD 1A, or may be a predetermined position. The predetermined position is preferably, for example, a position in the user's field of view where the road surface is assumed to be visible.
Claims (6)
- An image display device comprising:
an acquisition unit that acquires two or more pieces of information;
a detection unit that detects at least one of the acceleration, orientation, and angular velocity of the image display device;
an image changing unit that determines a state of the image display device based on a result detected by the detection unit, switches information selected from the two or more pieces of information acquired by the acquisition unit according to the determined state of the image display device, and generates display data based on the switched information; and
a display unit that displays the display data.
- The image display device according to claim 1, wherein the image changing unit determines the state of the image display device with respect to at least one of the vertical, horizontal, and diagonal directions relative to a reference state of the image display device, based on the result detected by the detection unit.
- The image display device according to claim 1 or 2, wherein the image changing unit generates additional information when the image display device performs a reciprocating operation between a reference posture and outside the reference posture within a predetermined time.
- The image display device according to claim 3, wherein the additional information is information corresponding to a situation in which the image display device is being used.
- The image display device according to claim 3 or 4, wherein, when a user of the image display device is riding in a vehicle, the acquisition unit acquires information about the vehicle from the vehicle, and the additional information is the information about the vehicle.
- An image display method of an image display device, comprising:
an acquisition step of acquiring two or more pieces of information;
a detection step of detecting at least one of the acceleration, orientation, and angular velocity of the image display device;
an image changing step of determining a state of the image display device based on a result detected in the detection step, switching information selected from the two or more pieces of information acquired in the acquisition step according to the determined state of the image display device, and generating display data based on the switched information; and
a display step of displaying the display data.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/066,683 US10642033B2 (en) | 2016-02-12 | 2017-01-06 | Image display device and image display method |
EP17750003.0A EP3416159B1 (en) | 2016-02-12 | 2017-01-06 | Image display device and image display method |
JP2017566547A JP6598317B2 (ja) | 2016-02-12 | 2017-01-06 | 画像表示装置および画像表示方法 |
CN201780006905.1A CN108475495B (zh) | 2016-02-12 | 2017-01-06 | 图像显示装置及图像显示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016025244 | 2016-02-12 | ||
JP2016-025244 | 2016-02-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017138278A1 true WO2017138278A1 (ja) | 2017-08-17 |
Family
ID=59563614
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/000242 WO2017138278A1 (ja) | 2016-02-12 | 2017-01-06 | 画像表示装置および画像表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10642033B2 (ja) |
EP (1) | EP3416159B1 (ja) |
JP (1) | JP6598317B2 (ja) |
CN (1) | CN108475495B (ja) |
WO (1) | WO2017138278A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110032969A (zh) * | 2019-04-11 | 2019-07-19 | 北京百度网讯科技有限公司 | 用于检测图像中的文本区域的方法、装置、设备以及介质 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113196377B (zh) * | 2018-12-20 | 2024-04-12 | Ns西日本株式会社 | 显示光射出装置、平视显示装置、图像显示***及头盔 |
US11733951B2 (en) * | 2018-12-27 | 2023-08-22 | Honda Motor Co., Ltd. | Image display apparatus, image display system and image display method |
JP7287257B2 (ja) * | 2019-12-06 | 2023-06-06 | トヨタ自動車株式会社 | 画像処理装置、表示システム、プログラムおよび画像処理方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09309372A (ja) * | 1996-05-24 | 1997-12-02 | Nissan Motor Co Ltd | 搭乗姿勢検出装置及び情報選択装置 |
JP2003345492A (ja) * | 2002-05-27 | 2003-12-05 | Sony Corp | 携帯電子機器 |
JP2005099908A (ja) * | 2003-09-22 | 2005-04-14 | Matsushita Electric Ind Co Ltd | カード機能を有する情報記憶装置と情報処理装置 |
JP2010097393A (ja) * | 2008-10-16 | 2010-04-30 | Casio Hitachi Mobile Communications Co Ltd | 画像表示装置及びプログラム |
JP2013083731A (ja) * | 2011-10-06 | 2013-05-09 | Murata Mach Ltd | 画像表示システム |
JP2013108841A (ja) * | 2011-11-21 | 2013-06-06 | Denso Corp | 車両用表示装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3403361B2 (ja) * | 1999-09-30 | 2003-05-06 | 川崎重工業株式会社 | ヘッド・マウント・ディスプレイ装置 |
EP1517277A3 (en) | 2003-09-22 | 2006-10-25 | Matsushita Electric Industrial Co., Ltd. | Secure device and information processing unit |
JP2007134785A (ja) | 2005-11-08 | 2007-05-31 | Konica Minolta Photo Imaging Inc | 頭部装着型の映像表示装置 |
EP2619749A4 (en) | 2010-09-21 | 2017-11-15 | 4IIII Innovations Inc. | Head-mounted peripheral vision display systems and methods |
US20120095643A1 (en) * | 2010-10-19 | 2012-04-19 | Nokia Corporation | Method, Apparatus, and Computer Program Product for Modifying a User Interface Format |
US20140098008A1 (en) * | 2012-10-04 | 2014-04-10 | Ford Global Technologies, Llc | Method and apparatus for vehicle enabled visual augmentation |
US9204288B2 (en) | 2013-09-25 | 2015-12-01 | At&T Mobility Ii Llc | Intelligent adaptation of address books |
US10386921B2 (en) * | 2013-12-03 | 2019-08-20 | Nokia Technologies Oy | Display of information on a head mounted display |
DE102014004178A1 (de) * | 2014-03-22 | 2015-09-24 | Audi Ag | Verfahren zum Darstellen fahrtbezogener und/oder fahrzeugspezifischer Daten |
US9767373B2 (en) * | 2014-09-05 | 2017-09-19 | Ford Global Technologies, Llc | Head-mounted display head pose and activity estimation |
US9588593B2 (en) * | 2015-06-30 | 2017-03-07 | Ariadne's Thread (Usa), Inc. | Virtual reality system with control command gestures |
-
2017
- 2017-01-06 EP EP17750003.0A patent/EP3416159B1/en active Active
- 2017-01-06 JP JP2017566547A patent/JP6598317B2/ja active Active
- 2017-01-06 US US16/066,683 patent/US10642033B2/en active Active
- 2017-01-06 WO PCT/JP2017/000242 patent/WO2017138278A1/ja active Application Filing
- 2017-01-06 CN CN201780006905.1A patent/CN108475495B/zh active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3416159A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110032969A (zh) * | 2019-04-11 | 2019-07-19 | 北京百度网讯科技有限公司 | 用于检测图像中的文本区域的方法、装置、设备以及介质 |
CN110032969B (zh) * | 2019-04-11 | 2021-11-05 | 北京百度网讯科技有限公司 | 用于检测图像中的文本区域的方法、装置、设备以及介质 |
Also Published As
Publication number | Publication date |
---|---|
JP6598317B2 (ja) | 2019-10-30 |
CN108475495A (zh) | 2018-08-31 |
CN108475495B (zh) | 2021-01-19 |
US10642033B2 (en) | 2020-05-05 |
EP3416159A4 (en) | 2019-07-24 |
EP3416159A1 (en) | 2018-12-19 |
JPWO2017138278A1 (ja) | 2018-10-11 |
EP3416159B1 (en) | 2020-08-26 |
US20190155024A1 (en) | 2019-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017138278A1 (ja) | 画像表示装置および画像表示方法 | |
KR101659027B1 (ko) | 이동 단말기 및 차량 제어 장치 | |
US20190130878A1 (en) | Systems and Methods for Presenting Virtual Content in a Vehicle | |
US10108018B2 (en) | Image display apparatus for displaying an image captured by a mobile apparatus | |
JP2018190217A (ja) | 運転者監視装置、及び運転者監視方法 | |
US10839623B2 (en) | Vehicle, image display device, vehicle control method, and image display method | |
US11110933B2 (en) | Driving support device, wearable device, driving support system, driving support method, and computer-readable recording medium | |
WO2012029647A1 (ja) | 携帯型情報処理装置、携帯型情報処理装置のためのコンピュータプログラム、及び表示制御方法 | |
CN105116544A (zh) | 头操作的电子眼镜 | |
KR20240065182A (ko) | 개인 이동성 시스템의 ar 기반 성능 변조 | |
KR101659033B1 (ko) | 이동 단말기 및 이의 제어 방법 | |
US10935789B2 (en) | Image display apparatus and image display method | |
Li et al. | The design of a segway AR-Tactile navigation system | |
KR101578741B1 (ko) | 이동 단말기 및 그 제어 방법 | |
WO2021220407A1 (ja) | ヘッドマウントディスプレイ装置および表示制御方法 | |
JP5892309B2 (ja) | 車両用視線誘導装置 | |
CN110347163B (zh) | 一种无人驾驶设备的控制方法、设备及无人驾驶控制*** | |
US11900550B2 (en) | AR odometry using sensor data from a personal vehicle | |
US10563989B2 (en) | Visual and lateralized navigation assistance system | |
US20230215106A1 (en) | Ar-enhanced detection and localization of a personal mobility device | |
KR20160023755A (ko) | 이동 단말기 및 그 제어 방법 | |
CN117234340A (zh) | 头戴式xr设备用户界面显示方法及设备 | |
WO2023122459A1 (en) | An odometry using sensor data from a personal vehicle | |
CN113085884A (zh) | 移动体控制装置、移动体控制方法及存储介质 | |
JP2019057009A (ja) | 情報処理装置及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17750003 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017566547 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017750003 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017750003 Country of ref document: EP Effective date: 20180912 |