WO2010107072A1 - Head-Mounted Display - Google Patents

Head-Mounted Display

Info

Publication number: WO2010107072A1
Authority: WO (WIPO (PCT))
Prior art keywords: user, image, display, unit, head
Application number: PCT/JP2010/054585
Other languages: English (en), Japanese (ja)
Inventor: 佐藤知裕
Original Assignee: ブラザー工業株式会社 (Brother Industries, Ltd.)
Priority date: 2009-03-18 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2010-03-17
Publication date: 2010-09-23
Application filed by ブラザー工業株式会社
Publication of WO2010107072A1

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
      • G02: OPTICS
        • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B 26/00: Optical devices or arrangements for the control of light using movable or deformable optical elements
            • G02B 26/08: Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
              • G02B 26/10: Scanning systems
                • G02B 26/105: Scanning systems with one or more pivoting mirrors or galvano-mirrors
          • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
            • G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
            • G02B 27/01: Head-up displays
              • G02B 27/017: Head mounted
              • G02B 27/0101: Head-up displays characterised by optical features
                • G02B 2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
                • G02B 2027/014: Head-up displays characterised by optical features comprising information/image processing systems
              • G02B 27/0179: Display position adjusting means not related to the information to be displayed
                • G02B 2027/0187: Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • The present invention relates to a head-mounted display.
  • A head-mounted display (HMD) is an image display device that includes display means for causing image light corresponding to image information to be incident on the user's eyes, allowing the user to visually recognize an image corresponding to that image information.
  • Because the HMD is worn on the user's head, it allows hands-free viewing of images and is well suited to being carried around.
  • Conventionally, such an HMD includes display means arranged so as to be positioned in front of the user's eyes for displaying an image,
  • external information acquisition means, such as imaging means, for acquiring external information,
  • and control means for controlling the operation of the display means based on the information acquired by the external information acquisition means (see, for example, Patent Document 1).
  • By identifying what object has been imaged as external information from the image information acquired by the imaging means (external information acquisition means) and associating the identified object with predetermined information, such an HMD can be made to perform a predetermined operation without the user operating operation keys or the like.
  • The HMD of the present invention includes display means, imaging means, head state detecting means, part specifying means, and control processing means.
  • The display means causes image light corresponding to image information to enter the user's eyes and displays an image corresponding to that image information.
  • The imaging means images a predetermined range in the visual field direction of the user.
  • The head state detecting means detects the angle of the user's head.
  • The part specifying means specifies a part of the user's body in the visual field direction of the user, based on the image captured by the imaging means and the detected angle of the user's head.
  • The control processing means executes an operation according to information on the part of the user's body specified by the part specifying means.
  • In this way, a part of the user's body is specified from the captured image and the detected head angle, and an operation corresponding to the specified part is executed, so hands-free operation can be executed more reliably. Associating information with body parts is a well-known, effective memory technique, so it is easy for the user to remember which body part corresponds to which operation. When the user indicates a body part to perform a desired operation, the user can therefore do so smoothly, without hesitation or mistakes, and can execute hands-free operations reliably and quickly without relying on menus or guidance.
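  • As a concrete picture of this flow, the following is a minimal Python sketch of the pipeline the means above describe. All names here (capture_frame, read_head_angle, identify_part, the OPERATIONS table and its entries) are illustrative assumptions, not interfaces taken from the patent.

```python
# Minimal sketch of the claimed pipeline: captured image + head angle
# -> body-part identification -> table lookup -> operation.
# Every name here is an illustrative placeholder, not patent text.

OPERATIONS = {
    "left elbow": "save_image_to_elbow_folder",   # table: part -> operation info
    "left arm":   "save_image_to_arm_folder",
    "left hand":  "save_image_to_hand_folder",
}

def hands_free_step(camera, angle_sensor, recognizer, execute):
    frame = camera.capture_frame()                 # imaging means
    tilt = angle_sensor.read_head_angle()          # head state detecting means
    part = recognizer.identify_part(frame, tilt)   # part specifying means
    if part in OPERATIONS:                         # control processing means:
        execute(OPERATIONS[part])                  # run the associated operation
```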
  • The HMD may further include part state determining means for determining the state of the body part specified by the part specifying means; the control processing means may then execute, among the operations corresponding to the specified part, the operation corresponding to the determined state of that part.
  • Such a configuration makes hands-free operation more reliable and allows a wider variety of operation types.
  • As the state of the specified body part, the part state determining means may determine whether the user's hand is in contact with or close to that part.
  • This configuration makes it possible to identify the body part accurately.
  • The part state determining means may detect the position of the user's hand by detecting a wearing tool attached to the hand.
  • The HMD may further include storage means that stores a table in which operation information is associated with parts of the user's body; the control processing means may execute an operation by reading from the storage means the operation information corresponding to the specified part and its determined state.
  • With this configuration, the control processing means requires no complicated calculation, so the processing speed can be increased and the load on the control processing means reduced.
  • The operations stored in the table may be operations for image display processing on the display means.
  • The HMD may further include second storage means that stores a second table in which operation information is associated with states of the user's body parts; based on the second table, the control processing means may further execute an operation for image display on the display means according to the state of the user's body part.
  • This improves usability as an HMD while still requiring no complicated calculation, so the processing speed can be increased and the load on the control processing means reduced.
  • The operation information may be set in the table according to the part of the user's body specified by the part specifying means.
  • The HMD may include calibration means for extracting information for specifying the user's body parts from images captured by the imaging means and storing that information in the table.
  • Such a configuration can accommodate individual differences among users and improve the accuracy of hands-free operation.
  • The display means may make the image light incident on the user's eye together with external light, displaying the image superimposed on the external scene.
  • As described above, the HMD of the present invention includes head state detecting means for detecting the angle of the user's head, specifies a part of the user's body based on the image captured by the imaging means and the head angle detected by the head state detecting means, and executes an operation according to information on the specified part. Hands-free operation can therefore be executed more reliably.
  • Moreover, since associating information with body parts is an effective memory technique, it is easy for the user to remember which body part corresponds to which operation; the user can indicate a part smoothly, without hesitation or mistakes, and execute hands-free operations reliably and quickly without relying on menus or guidance.
  • HMD (head-mounted display)
  • The HMD includes a see-through type display unit 1 as the display means.
  • The display unit 1 reflects image light 500 corresponding to image information with a half mirror 10 and projects it onto an eye 101 of the user 100 while part of the external light 400 also reaches at least that eye, causing the user 100 to visually recognize a display image 200 superimposed on the outside scene.
  • The HMD includes a housing 13 and a control unit 3.
  • The housing 13 is attached to a mounting portion 11, shaped roughly like a pair of glasses, that is worn on the head 102 of the user 100.
  • The control unit 3 is connected to the housing 13 via a connection cable 52.
  • As described in detail later, the display unit 1 of the HMD according to this embodiment includes a projection unit 1a housed in the housing 13 and a light source unit 1b housed in the control unit 3.
  • If the user 100 stores the control unit 3 in a pocket or the like and wears the HMD on the head 102, the user 100 can carry it hands-free and, indoors or outdoors, view various display images 200 while viewing the outside scene.
  • The HMD according to the present embodiment is a retinal scanning type: it two-dimensionally scans image light generated on the basis of an image signal and projects the scanned image light onto the user's eye to form an image on the retina.
  • It is also a see-through type, in which the image light enters the user's eyes together with external light so that the image is displayed superimposed on the external scene.
  • However, the configuration of the HMD is not limited to this, as long as it includes display means that causes image light corresponding to image information to enter the user's eyes and displays an image corresponding to that information.
  • The HMD according to the present embodiment includes a CCD camera 2 as imaging means that images a predetermined range in the visual field direction of the user 100.
  • The CCD camera 2 is a small camera attached to the housing 13 of the display unit 1, as shown in the figures.
  • The CCD camera 2 can image the field of view of the user 100, and by analyzing the image data it can be determined in which direction the user 100 is facing. The user 100 can also operate the shutter of the CCD camera 2 and view the captured image as the display image 200 on the display unit 1.
  • The image information for the display image 200 includes, in addition to image files captured by the CCD camera 2, content data created mainly on a personal computer (hereinafter "PC") or acquired from Internet sites via a PC.
  • The display unit 1 can display not only still images but also moving images such as movies as the display image 200, and the HMD according to the present embodiment can also reproduce accompanying audio data through an earphone 9.
  • The HMD according to the present embodiment includes an angle sensor 4 as head state detecting means that detects the angle of the head 102 of the user 100.
  • As the angle sensor 4, a magnetic sensor using geomagnetism, a tilt sensor that detects the direction of gravity, a gyro sensor, or the like is used, alone or in combination.
  • The angle sensor 4 is provided inside the housing 13, as shown in FIG. 1, and can detect the inclination of the head 102 of the user 100 three-dimensionally; that is, it can determine in which direction, and by how much, the head 102 is inclined, in terms of a vertical angle and a horizontal angle.
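  • One way to picture the angle-based narrowing used later is the sketch below; it assumes calibration (the FIG. 6 table, described later) has stored one (vertical, horizontal) head-tilt pair per body part, and the angle values and tolerance are invented for illustration.

```python
import math

# Hypothetical calibrated head tilts per body part, in degrees
# (vertical, horizontal); real values would come from per-user calibration.
CALIBRATED_TILT = {
    "left arm":   (-35.0, 10.0),
    "left elbow": (-30.0, 20.0),
    "left hand":  (-25.0, 30.0),
}

def narrow_candidates(vertical, horizontal, tolerance_deg=8.0):
    """Return the body parts whose calibrated tilt lies near the detected tilt."""
    return [part for part, (v, h) in CALIBRATED_TILT.items()
            if math.hypot(vertical - v, horizontal - h) <= tolerance_deg]

# e.g. narrow_candidates(-29.0, 21.5) -> ["left elbow"]
```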
  • The HMD can display images according to image information such as image files captured by the CCD camera 2 or content data acquired via a PC.
  • When the HMD is started, a menu screen is first displayed as the display image 200, as shown in FIG. 2, from which the camera mode or the browsing mode can be selected.
  • In the camera mode, captured images can be displayed and stored; in the browsing mode, content data can be played back and stopped.
  • Gestures indicating the user's intention are set in advance for selecting among options displayed by the display unit 1, and for answering "Yes" or "No" to a displayed question. For example, "the user touches the abdomen with a hand and looks at that hand" is preset as the gesture meaning "Yes", and a different body-part gesture is preset as the gesture meaning "No".
  • To realize such operations, the HMD includes part specifying means and control processing means in addition to the CCD camera 2 and the angle sensor 4 described above.
  • The part specifying means specifies a part of the body of the user 100 located in the visual field direction of the user 100 (hereinafter simply a "body part"), based on the image captured by the CCD camera 2 and the angle of the head 102 detected by the angle sensor 4.
  • The control processing means executes an operation according to information on the body part of the user 100 specified by the part specifying means.
  • The user 100 can therefore make the HMD perform a predetermined operation accurately, without operating keys or the like, merely by looking at a part of his or her body.
  • The functions of the part specifying means and the control processing means are performed by a control unit 30 that controls substantially the entire HMD.
  • The control unit 30 includes storage means, as illustrated, and the storage means stores a table in which operation information is associated with body parts of the user 100.
  • Here, an operation is an operation for image display processing on the display unit 1, a concept that includes commands for displaying, saving, and erasing images, and for playing and stopping moving images.
  • The control unit 30 also includes second storage means storing a second table in which operation information is associated with states of the body parts of the user 100; based on the second table, the control unit 30 further executes operations for image display on the display unit 1 according to the state of the body part. The functions of the storage means and the second storage means are performed by the flash memory 42, described in detail later.
  • When the user 100 looks at a part of his or her body, such as a hand, arm, elbow, or knee, the HMD according to the present embodiment identifies which body part is being viewed: an arm as an arm, an elbow as an elbow.
  • The HMD then executes the operation associated with the identified part; in the camera mode, for example, it can display an image captured by the CCD camera 2 as the display image 200 or store the image in association with the body part.
  • In the browsing mode, the HMD can play or stop a moving image.
  • Such operations are performed by the control unit 30 (control processing means) as follows.
  • The control unit 30 causes the display unit 1 to display the captured image as the display image 200 together with the message "Do you want to save this image?".
  • If the user 100 wants to store the displayed captured image in association with the elbow, the user 100 moves the head 102 and looks straight at his or her elbow.
  • The landscape image captured by the user 100 (the current display image 200) is then stored in a predetermined area of the storage means, in this case a virtually defined "elbow folder".
  • File storage destinations can be associated with a plurality of parts of the body of the user 100 in the same way, such as an "arm folder" or a "hand folder".
  • The control unit 30 (control processing means) causes the display unit 1 to display the captured image as the display image 200.
  • The CCD camera 2 is attached so that it can image a range centered on the normal line of sight within the visual field of the user 100; therefore, when the user 100 looks straight at the elbow to select the elbow folder, a captured image with the elbow of the user 100 at the image center is obtained, as shown in FIG. 4A.
  • The control unit 30 (control processing means) first determines, from the detection value of the angle sensor 4, in which direction and by how much the head 102 is inclined, and from this determines which of the body parts of the user 100 the face is turned toward. This makes it possible to distinguish whether an elbow imaged by the CCD camera 2 is the user's own elbow or another person's elbow that happens to lie in the direction of the head, preventing malfunction.
  • In this case, the body part the user 100 is looking at is narrowed down to "the central part of the left arm".
  • The control unit 30 then calculates the color temperature from the RGB values of the pixels in the captured image data and samples the color temperature of the pixel located at the center of a defined region 250 set at the center of the image.
  • The control unit 30 recognizes a part shape from the outline of the region in which pixels whose color temperature lies within a predetermined tolerance of the sampled color temperature are gathered.
  • The control unit 30 then determines which of a plurality of model shapes, stored in advance as shapes indicating the respective body parts, the recognized part shape corresponds to. When it corresponds to the model shape indicating "elbow", the control unit 30 determines that the user 100 is looking at the elbow.
  • Alternatively, the control unit 30 determines which of a plurality of model areas, stored in advance as area values indicating the respective body parts, the area of the region where pixels of skin-color temperature are gathered (hereinafter the "skin color area") corresponds to. When the skin color area corresponds to the model area indicating "elbow", the control unit 30 can likewise determine that the user 100 is looking at the elbow.
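  • The area-matching variant just described can be sketched as below. For brevity the sketch substitutes a plain RGB distance for the color-temperature calculation in the text, and the region size, tolerance, and model areas are invented illustration values.

```python
import numpy as np

def specify_part_by_area(rgb_image, model_areas, tolerance=30.0, half=40):
    """Sample the colour at the centre of the defined region, count the
    pixels whose colour lies within a tolerance of that sample (the
    'skin colour area'), and return the model part with the closest
    stored area. model_areas maps part name -> expected pixel count."""
    h, w, _ = rgb_image.shape
    cy, cx = h // 2, w // 2                                   # defined region 250
    region = rgb_image[cy - half:cy + half, cx - half:cx + half].astype(float)
    sample = region[half, half]                               # centre pixel sample
    dist = np.linalg.norm(region - sample, axis=2)            # colour distance
    skin_area = int((dist <= tolerance).sum())
    return min(model_areas, key=lambda part: abs(model_areas[part] - skin_area))
```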
  • The control unit 30 then stores (saves) the image file in the area of the storage means, the flash memory 42, corresponding to the elbow folder.
  • The control unit 30 also causes the display unit 1 to display the display image 200 including a message such as "Image 09001.JPG was saved in the elbow folder", informing the user 100 that the captured image has been saved.
  • The control unit 30 also functions as part state determining means that determines the state of the body part specified by the part specifying means (FIG. 3). When functioning as control processing means, the control unit 30 executes, among the operations corresponding to the specified part, the operation corresponding to the state determined by the part state determining means. This makes hands-free operation of the HMD more reliable and diversifies the available operations.
  • The state of a part includes, for example, a state in which a hand of the user 100 is in contact with or close to that body part.
  • Such a state can be detected and determined, for example, from the shape of the hand in contact with or close to the part being added to the part shape detected for recognizing the body part.
  • Suppose, for example, that a plurality of image files captured so far by the user 100 with the CCD camera 2 are stored by category in a plurality of folders (elbow folder, arm folder, hand folder, and so on),
  • and that the desired image file is a picture of a person and is stored in the hand folder.
  • When the user 100 looks at the hand, the control unit 30 converts the image files stored in association with the hand into thumbnails and displays them on the display unit 1.
  • The control unit 30 highlights the thumbnails sequentially, for example by thickening the frame of the current image.
  • When the image file to be displayed is highlighted, the user 100 places the right hand on the left hand being watched; that is, the state of the body part changes as the right hand overlaps the left hand.
  • The control unit 30 determines that the currently highlighted image file has been selected when the contour of the second hand is added to the contour shape indicating one hand and the overall shape changes. The control unit 30 then displays that image file over the entire area of the display image 200, as illustrated.
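  • The selection step just described can be pictured as below; show_highlight and skin_area stand in for the display and image-analysis stages, and the 1.5 growth ratio and one-second highlight period are invented illustration values.

```python
import itertools
import time

def select_by_hand_overlap(thumbnails, show_highlight, skin_area, period=1.0):
    """Cycle the highlight through the thumbnails and return the one that
    is highlighted when the watched hand's skin-colour area grows enough,
    i.e. when the other hand is placed on it."""
    baseline = skin_area()                    # left hand alone
    for thumb in itertools.cycle(thumbnails):
        show_highlight(thumb)                 # e.g. thicken this thumbnail's frame
        time.sleep(period)
        if skin_area() >= 1.5 * baseline:     # right hand added: state change
            return thumb
```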
  • The control unit 30 also includes calibration means so that operations corresponding to the body parts of the user 100 and their states can be executed correctly (FIG. 3).
  • The calibration means extracts information specifying the body parts of the user 100 from images captured by the CCD camera 2 and stores it in the table.
  • FIG. 6 shows an example of a calibrated data table.
  • The table shown in FIG. 6 associates, in advance and through calibration, the virtual folders storing captured image files with body parts, and is stored in the flash memory 42.
  • Here, "left arm", "left elbow", and "left hand" are used as example body parts.
  • When the user 100 actually assumes the posture described above and moves the head 102 to look at the "arm", "elbow", and "hand", the detection value of the angle sensor 4 and the image captured by the CCD camera 2 at that head angle are stored, as the head angle and pattern image, in association with the corresponding body part ("arm", "elbow", "hand").
  • The detection values of the angle sensor 4 are obtained as a vertical angle and a horizontal angle.
  • The table shown in FIG. 7 corresponds to the second table, in which operation information is associated with states of the body parts of the user 100; it is also stored in the flash memory 42.
  • As illustrated, the second table associates each pattern in which the other hand of the user 100 is in contact with or close to the "arm", "hand", or "elbow" on one side (here the left side) with a command for displaying the captured image over the entire display.
  • Referring to the table shown in FIG. 7, the control unit 30 displays the corresponding person image as the display image 200 over the entire display area.
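  • Read as plain data, the two tables might look like the sketch below; the angle values and file names are invented examples, and in a real implementation the pattern image would hold the calibration photograph rather than a file name.

```python
# FIG. 6 analogue: body part -> calibrated head angle, pattern image, folder.
calibration_table = {
    "left arm":   {"head_angle": (-35.0, 10.0), "pattern_image": "arm.png",
                   "folder": "arm_folder"},
    "left elbow": {"head_angle": (-30.0, 20.0), "pattern_image": "elbow.png",
                   "folder": "elbow_folder"},
    "left hand":  {"head_angle": (-25.0, 30.0), "pattern_image": "hand.png",
                   "folder": "hand_folder"},
}

# FIG. 7 analogue: (part, state) -> operation command.
second_table = {
    ("left arm",   "other hand touching or close"): "display_entire_image",
    ("left elbow", "other hand touching or close"): "display_entire_image",
    ("left hand",  "other hand touching or close"): "display_entire_image",
}
```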
  • The position of the hand of the user 100 can also be detected by detecting a wearing tool 110 attached to the hand, which allows the body part to be specified accurately.
  • For example, the user 100 puts a wearing tool 110 of a specific, easily recognized color on a finger of the right hand.
  • When the wearing tool 110 is located within the defined region 250 set at the center of the image captured by the CCD camera 2, the pixel region of the wearing tool's specific color can be recognized, and the control unit 30 can determine, for example, that the user 100 is watching the center of the arm, that is, the elbow.
  • The wearing tool 110 is not limited to the ring type shown in the figure; it may be anything that can be worn on the hand, including a glove.
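  • Locating the wearing tool by its color could look like the sketch below; the magenta tool color, the tolerance, and the region size are invented illustration values.

```python
import numpy as np

def find_wearing_tool(rgb_image, tool_rgb=(255, 0, 180), tolerance=40.0, half=60):
    """Return the image coordinates of the wearing tool 110 inside the
    defined region at the image centre, or None if it is not visible."""
    h, w, _ = rgb_image.shape
    top, left = h // 2 - half, w // 2 - half          # defined region 250
    region = rgb_image[top:top + 2 * half, left:left + 2 * half].astype(float)
    mask = np.linalg.norm(region - np.array(tool_rgb), axis=2) <= tolerance
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (top + ys.mean(), left + xs.mean())        # centroid of the tool colour
```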
  • As described above, when the user 100 wears the HMD according to the present embodiment and selects the camera mode, the user 100 can accurately display and save images captured by the CCD camera 2 in a hands-free manner.
  • FIG. 9 shows the external configuration of the HMD.
  • The HMD includes the housing 13, attached to the mounting portion 11 worn on the head 102 of the user 100, and the control unit 3, connected to the housing 13 via the connection cable 52, which bundles an optical fiber cable 50, a transmission cable 51, an audio cable 99, and the like.
  • As described above, the display unit 1 includes the projection unit 1a housed in the housing 13 and the light source unit 1b housed in the control unit 3.
  • The mounting portion 11 is a support member shaped roughly like a pair of glasses and can be put on as easily as glasses.
  • The mounting portion 11 is configured so that, when worn on the head 102, the housing 13 containing the projection unit 1a is positioned at the front left of the user 100.
  • The housing 13 is formed in a substantially L shape in plan view so as to cover the left-eye portion and left temple of the user 100 when the mounting portion 11 is worn on the head 102.
  • The CCD camera 2 is disposed on the upper surface 13a side of the housing 13.
  • The CCD camera 2 is connected to the control unit 30 of the control unit 3 via the transmission cable 51.
  • The optical axis of the CCD camera 2 is set so that the camera focuses on the body part captured by the line of sight of the user 100 when the user 100 looks at a body part such as a hand, arm, or leg.
  • The CCD camera 2 has a shutter button (not shown), and an image can be captured by operating it.
  • Alternatively, the control unit 3 may be provided with an operation button for imaging, and the CCD camera 2 may capture images manually when that button is pressed.
  • The half mirror 10 is attached to the front end side of the housing 13 so as to be positioned in front of the left eye 101. External light 400 passes through the half mirror 10 and enters the left eye 101 of the user 100, while image light 500 is reflected by the half mirror 10 and enters the same eye (FIG. 1). The external light 400 naturally also enters the right eye 101, so the user 100 can see the outside world while viewing the display image 200, whether both eyes are open or the right eye 101 is closed.
  • The control unit 3 is sized so that the user 100 can carry it in a clothes pocket (see FIG. 2).
  • A power switch 7 and a power lamp 8 are provided on the case surface of the control unit 3.
  • The control unit 30, housed in the control unit 3 together with the light source unit 1b, can acquire image information and the like from an external PC or similar device via a communication I/F 39, and transmits the acquired image information to the image signal supply circuit 6 of the light source unit 1b. The connection to the external device can be either wireless or wired.
  • The image signal supply circuit 6 of the light source unit 1b forms image light 500 by modulating the intensity of the acquired image information in units of pixels, and transmits the image light 500 to the projection unit 1a via the optical fiber cable 50.
  • The projection unit 1a scans the transmitted image light 500, causing the user 100 to visually recognize the display image 200.
  • An audio circuit 45 provided in the control unit 30 can convert audio data included in content data acquired from a PC or the like into an audio signal and transmit it to the earphone 9 via the audio cable 99.
  • The HMD includes the display unit 1, the CCD camera 2, the angle sensor 4, and the earphone 9.
  • The display unit 1 includes the projection unit 1a provided in the mounting portion 11 and the light source unit 1b provided in the control unit 3.
  • The control unit 3 is also provided with the control unit 30, which controls the overall operation of the HMD.
  • The light source unit 1b reads the image information in units of pixels from the image signal S supplied from the control unit 30, and, based on the read pixel information, generates and emits laser light whose intensity is modulated for each of the colors R (red), G (green), and B (blue).
  • The light source unit 1b may be provided in the projection unit 1a instead of in the control unit 3.
  • The display unit 1 is described here taking as an example a retinal scanning display that projects two-dimensionally scanned laser light into the eye 101 of the user 100 and forms an image on the retina 101b.
  • However, a liquid crystal display unit can also be used: for example, a transmissive liquid crystal panel is irradiated with light from a light source, and the light transmitted through the panel enters the user's eyes as image light.
  • The light source unit 1b is provided with the image signal supply circuit 6, which generates the signals serving as elements for composing an image.
  • When image data supplied from the PC is input, the control unit 30 generates an image signal S based on it and sends it to the image signal supply circuit 6.
  • Based on the image signal S, the image signal supply circuit 6 generates, in units of pixels, each signal serving as an element for forming the display image 200: an R (red) image signal 60r, a G (green) image signal 60g, and a B (blue) image signal 60b.
  • The image signal supply circuit 6 also outputs a horizontal drive signal 61 used in a horizontal scanning unit 80 and a vertical drive signal 62 used in a vertical scanning unit 90.
  • The light source unit 1b emits, as the image light 500, laser light (also called a "light beam") whose intensity is modulated in units of pixels based on the R image signal 60r, G image signal 60g, and B image signal 60b output from the image signal supply circuit 6.
  • The light source unit 1b is provided with an R laser driver 66, a G laser driver 67, and a B laser driver 68 for driving an R laser 63, a G laser 64, and a B laser 65, respectively.
  • Each laser 63, 64, 65 can be configured, for example, as a semiconductor laser or as a solid-state laser with a harmonic generation mechanism.
  • With a semiconductor laser, the drive current can be directly modulated to modulate the intensity of the laser light; with a solid-state laser, each laser needs to be equipped with an external modulator to modulate the intensity.
  • The light source unit 1b also has collimating optical systems 71, 72, and 73, dichroic mirrors 74, 75, and 76, and a coupling optical system 77.
  • The collimating optical systems 71, 72, and 73 collimate the laser beams emitted from the lasers 63, 64, and 65 into parallel light.
  • The dichroic mirrors 74, 75, and 76 combine the laser beams collimated by the collimating optical systems 71, 72, and 73.
  • The coupling optical system 77 guides the laser light combined by the dichroic mirrors 74, 75, and 76 to the optical fiber cable 50.
  • The laser beams emitted from the lasers 63, 64, and 65 are collimated by the collimating optical systems 71, 72, and 73 and then enter the dichroic mirrors 74, 75, and 76, where each beam is selectively reflected or transmitted according to wavelength.
  • The three primary-color laser beams incident on the three dichroic mirrors 74, 75, and 76 are thus combined wavelength-selectively, reach the coupling optical system 77, and are collected and output to the optical fiber cable 50.
  • The projection unit 1a, located between the light source unit 1b and the eye 101 of the user 100, includes a collimating optical system 79, the horizontal scanning unit 80, the vertical scanning unit 90, a first relay optical system 85, and a second relay optical system 95.
  • The collimating optical system 79 collimates the laser light generated by the light source unit 1b and emitted through the optical fiber cable 50.
  • The horizontal scanning unit 80 reciprocally scans the collimated laser light in the horizontal direction for image display.
  • The vertical scanning unit 90 scans the horizontally scanned laser light in the vertical direction.
  • The first relay optical system 85 is provided between the horizontal scanning unit 80 and the vertical scanning unit 90.
  • The second relay optical system 95 emits the laser light thus scanned in the horizontal and vertical directions toward the pupil 101a.
  • Together, the horizontal scanning unit 80 and the vertical scanning unit 90 scan the laser light incident from the optical fiber cable 50 in the horizontal and vertical directions so that it can be projected as an image onto the retina 101b of the user 100.
  • The horizontal scanning unit 80 includes a resonance type deflection element 81 having a deflection surface for scanning the laser light in the horizontal direction, and a horizontal scanning drive circuit 82 that generates, based on the horizontal drive signal 61, a drive signal for resonating the deflection element 81 and swinging its deflection surface.
  • The vertical scanning unit 90 includes a non-resonance type deflection element 91 having a deflection surface for scanning the laser light in the vertical direction, and a vertical scanning control circuit 92 that generates, based on the vertical drive signal 62, a drive signal for swinging the deflection surface of the deflection element 91 non-resonantly.
  • The vertical scanning unit 90 vertically scans the image-forming laser light from the first horizontal scanning line toward the last horizontal scanning line for each frame of the image to be displayed.
  • Here, a "horizontal scanning line" means one horizontal sweep by the horizontal scanning unit 80.
  • For example, galvanometer mirrors are used as the deflection elements 81 and 91.
  • Any driving method, such as electrostatic driving, may be used to swing their deflection surfaces.
  • Here, a resonance type deflection element is used for the horizontal scanning unit 80 and a non-resonance type for the vertical scanning unit 90, but a resonance type deflection element may be used for the vertical scanning unit 90 as well.
  • The first relay optical system 85, which relays the laser light between the horizontal scanning unit 80 and the vertical scanning unit 90, converges the horizontally scanned laser light from the deflection surface of the deflection element 81 onto the deflection surface of the deflection element 91. The laser light is then scanned vertically by the deflection surface of the deflection element 91, passes through the second relay optical system 95, in which two lenses 95a and 95b of positive refractive power are arranged in series, is reflected by the half mirror 10 positioned in front of the eye 101, and enters the pupil 101a of the user 100.
  • In this way, the display image 200 corresponding to the image signal S is projected onto the retina 101b.
  • The user 100 recognizes the laser light, the image light 500 entering the pupil 101a, as the display image 200 (see FIGS. 1 and 2).
  • In the second relay optical system 95, the laser beams are made substantially parallel to each other and converted into convergent light by the lens 95a; the lens 95b then converts them into substantially parallel beams again while converting their center lines so as to converge on the pupil 101a of the user 100.
  • The control unit 30 executes predetermined processing according to a control program stored in it, thereby functioning as the head state detecting means, part specifying means, control processing means, part state determining means, and calibration means described above.
  • As illustrated, the control unit 30 includes the controllers 31, 32, and 36, the VRAMs (Video Random Access Memory) 33 and 37, a peripheral device interface ("I/F" in the figure) 38, and the communication I/F 39.
  • The main controller 31 includes a CPU (Central Processing Unit) 40, a program ROM (Read Only Memory) 41, a flash memory 42 (a nonvolatile memory), and a RAM (Random Access Memory) 43.
  • These are connected to a data communication bus, through which they transmit and receive various information, including the head angle from the angle sensor 4.
  • The CPU 40 is an arithmetic processing unit that, by executing the control program stored in the program ROM 41, operates the various units of the HMD and executes its various functions as the main controller 31.
  • The flash memory 42 stores image data output from the CCD camera 2, the various tables necessary for operation control of the HMD, setting values such as the luminance of the display image 200, and so on.
  • The tables stored in the flash memory 42 include the tables in which operation information is associated with the user's body parts and their states, as shown in FIGS. 6, 7, 18, and 19.
  • The CPU 40 can specify the body part of the user 100 in the visual field direction of the user 100, based on the image captured by the CCD camera 2 and the angle of the head 102 detected by the angle sensor 4.
  • The HMD controller 32 controls the display unit 1 (display means) in response to requests from the main controller 31, supplying to the display unit 1 the image signal S based on the image data stored in the HMD VRAM 33 by the main controller 31.
  • When the image signal S is input from the HMD controller 32, the display unit 1 generates and scans laser light of each color, intensity-modulated based on the image signal S, emits it to the eye 101 of the user 100, and projects the image corresponding to the image signal S onto the retina 101b.
  • In this way, the main controller 31 performs control to display images.
  • The camera controller 36 controls the CCD camera 2 (imaging means).
  • The camera VRAM 37 temporarily stores image data output from the CCD camera 2.
  • The main controller 31 acquires the image data output from the CCD camera 2 by having it output to the HMD VRAM 33 via the camera VRAM 37.
  • The main controller 31 controls the CCD camera 2 via the camera controller 36 in order to specify the body part of the user 100 in the visual field direction.
  • The main controller 31 causes the CCD camera 2 to image the object lying in the direction the face of the user 100 is turned, within the visual field of the user 100, so that the object appears at approximately the center of the image display area of the display unit 1.
  • The main controller 31 then acquires the image data output from the CCD camera 2 via the camera VRAM 37, obtaining the image captured by the CCD camera 2.
  • The main controller 31 determines, from the detection value of the angle sensor 4, in which direction and by how much the head 102 is tilted, and from this determines which body part of the user 100 the face is turned toward; that is, the head tilt narrows the candidate body parts down to, for example, "the central part of the left arm" or "the tip of the left arm".
  • The image captured by the CCD camera 2 is then analyzed to detect a part shape, and the detected shape is compared with prestored model shapes to identify the body part of the user 100.
  • At this time, a hand in contact with or close to the body part of the user 100 is also recognized.
  • Image processing can be performed at high speed by implementing a separate image processing unit within the main controller 31 in hardware.
  • The peripheral device I/F 38 is an interface for connecting peripheral devices 5, such as the power switch 7, a content display switch, and lamps (not shown), to the main controller 31.
  • The main controller 31 receives operation information from switches such as the power switch 7 and the content display switch via the peripheral device I/F 38, and supplies lamp lighting information to the lamps via the same interface.
  • The communication I/F 39 enables the main controller 31 and the PC to communicate with each other.
  • The main controller 31 requests the PC to supply image data via the communication I/F 39, and the HMD controller 32 supplies to the display unit 1 the image signal S based on the image data received from the PC via the communication I/F 39.
  • In the present embodiment the CCD camera 2 is used as the imaging means, but the present invention is not limited to this; a CMOS camera or the like may be used.
  • FIG. 13 shows the flow of the imaging process, one example of camera mode processing, and FIG. 14 shows the flow of the image data display process, another example.
  • It is assumed that the power switch 7 of the HMD is already on, that all initial setting after power-on is complete, and that the camera mode has already been selected for the processes of FIGS. 13 and 14 (see FIG. 2).
  • First, in the calibration process, the CPU 40 of the main controller 31 causes the display unit 1 to display an instruction comment telling the user 100 to look at a specific body part (here, one of the left elbow, left hand, and left arm) and to input a determination trigger (step S101).
  • The user 100 looks at the instructed body part and inputs the determination trigger, for example by pressing an operation button (not shown) provided on the control unit 3 for that purpose.
  • Instead of a button operation, the determination trigger may be that the inclination of the head 102 of the user 100 detected by the angle sensor 4 has reached a predetermined inclination.
  • The CPU 40 acquires the determination trigger input by the user 100 (step S102).
  • The CPU 40 then associates the head tilt data at the moment the determination trigger was input, together with the image data captured by the CCD camera 2 at that moment, with the name of the body part the user was instructed to look at in step S101, and stores them in the table (step S103).
  • The head tilt data at the time the determination trigger is input is the value detected by the angle sensor 4.
  • In step S104, the CPU 40 determines whether all the necessary body parts have been associated.
  • The CPU 40 repeats the association process until it is complete for all the necessary body parts, then ends the calibration process. In this way, the table shown in FIG. 6 is generated.
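  • Steps S101 to S104 can be summarized as in the sketch below; the device interfaces (display, camera, angle sensor, trigger) are illustrative placeholders for what the text describes.

```python
def run_calibration(display, camera, angle_sensor, wait_for_trigger, parts):
    """Build the FIG. 6 style table: for each body part, instruct the user,
    wait for the determination trigger, then record head tilt and image."""
    table = {}
    for part in parts:                  # e.g. ["left elbow", "left hand", "left arm"]
        display.show(f"Look at your {part} and input the determination trigger")  # S101
        wait_for_trigger()              # S102: button press or preset head tilt
        table[part] = {
            "head_angle": angle_sensor.read(),     # S103: tilt at trigger time...
            "pattern_image": camera.capture(),     # ...and the image at that angle
        }
    return table                        # S104: repeated until all parts are set
```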
  • Next, the imaging process by the CCD camera 2 is described as an example of camera mode processing, with reference to FIG. 13. As shown in FIG. 13, the CPU 40 first determines whether the shutter of the CCD camera 2 has been operated (step S201).
  • When the shutter has been operated (step S201: Yes), the CPU 40 causes the display unit 1 to display the captured image as illustrated in FIG. 4A (step S202), together with a message asking whether to save the image.
  • The user 100 then looks at the body part where the image should be stored (saved):
  • landscapes go to the elbow (elbow folder), people to the hand (hand folder), and everything else to the arm (arm folder).
  • In step S203, the CPU 40 identifies the body part the user 100 is looking at: it detects the tilt of the head 102 from the detection value of the angle sensor 4, narrows down the candidate body parts from that tilt, acquires image data from the CCD camera 2 mounted on the head 102, and obtains from the table in the flash memory 42 the information of the body part for which the narrowed-down candidates and the captured image match.
  • The CPU 40 then determines, from the images captured by the CCD camera 2, whether the identified part is viewed continuously for a predetermined time, for example 2 seconds (step S204); it judges that the line of sight of the user 100 is fixed when the imaging data does not change during that time.
  • When the CPU 40 determines that the identified part is viewed continuously for the predetermined time (step S204: Yes), the process proceeds to step S205; otherwise (step S204: No), it moves to step S206.
  • In step S205, the CPU 40 saves the captured image file in the folder associated with the identified part, that is, in the corresponding predetermined area of the flash memory 42.
  • If the CPU 40 determines in step S204 that no part is being viewed continuously, it causes the display unit 1 to display a comment asking whether to delete the captured image file (step S206).
  • If the CPU 40 then determines from the detection value of the angle sensor 4 that the head 102 of the user 100 has been shaken up and down (step S206: Yes), it deletes the captured image file (step S207); if it determines that the head 102 has been shaken from side to side (step S206: No), the process returns to step S203. At this time, the CPU 40 may cause the display unit 1 to display a comment such as "please keep looking at the place to save for at least two seconds".
  • In this way, the user 100 can store images captured by the CCD camera 2 hands-free, without button or switch operations.
  • Moreover, because each save destination is associated with a body part, the user 100 can easily remember the save destination folder.
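  • The imaging flow of FIG. 13 can be condensed to the sketch below; every interface and the two-second dwell constant are illustrative placeholders for what the text describes.

```python
DWELL_SECONDS = 2.0   # fixation time from the text

def camera_mode_step(display, camera, angle_sensor, recognizer, storage):
    """Sketch of steps S201-S207: save the shot to the folder of the body
    part the user keeps looking at, or delete it on a vertical head shake."""
    if not camera.shutter_operated():                       # S201
        return
    image = camera.capture()
    display.show(image, "Do you want to save this image?")  # S202
    while True:
        part = recognizer.identify_part(camera.capture(),   # S203: tilt + image
                                        angle_sensor.read())
        if part and recognizer.gaze_fixed_for(DWELL_SECONDS):   # S204
            storage.save(image, folder=part)                # S205: e.g. elbow folder
            display.show(f"Saved to the {part} folder")
            return
        if angle_sensor.head_shaken_vertically():           # S206: delete?
            storage.discard(image)                          # S207
            return
        # a sideways shake means "keep looking": fall through to S203 again
```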
  • Next, in the image data display process, the CPU 40 detects the inclination of the head 102 of the user 100 from the detection value of the angle sensor 4, as shown in FIG. 14 (step S301), narrows down the body parts the user 100 is estimated to be viewing from the detected inclination (step S302), and acquires imaging data from the CCD camera 2 mounted on the head 102 (step S303). The order of steps S301 to S303 is not fixed.
  • The CPU 40 determines whether the table in the flash memory 42 contains a body part for which the narrowed-down candidates and the captured image match (step S304). If not (step S304: No), the process returns to step S301; that is, the user 100 finely adjusts the inclination of the head 102, and steps S301 to S304 are repeated until a body part in the table is found.
  • When the CPU 40 determines that the table contains a matching body part (step S304: Yes), it displays the image (information) corresponding to the detected body part (step S305).
  • For example, when the body part is the hand, the CPU 40 displays a list of the person images stored in the hand folder, a virtual folder, as thumbnails.
  • When the body part is the elbow, landscape images are displayed as thumbnails; for other parts, the images of the corresponding category are displayed as thumbnails.
  • During thumbnail display, the CPU 40 highlights the thumbnails one by one in sequence.
  • To select an image from the thumbnails, the user 100 brings a hand (for example, the right hand) into contact with or close to the body part associated with the image's category at the moment the desired image is highlighted.
  • In step S306, the CPU 40 determines whether the right hand is now in contact with or close to the body part, that is, whether the state of the body part has changed because the contour shape of the right hand overlaps that of the left hand. In the case of the hand, the CPU 40 can also judge the state change from whether the skin color area has increased as the skin color area of the right hand enters the image captured by the CCD camera 2.
  • When it is not determined that the right hand is in contact with or close to the body part (step S306: No), the CPU 40 returns the process to step S305; at this time it is preferable for the CPU 40 to have the display unit 1 show the user 100 a message such as "Please shift the position of the right hand slightly forward / backward / left / right".
  • When it is determined that the right hand is in contact with or close to the body part (step S306: Yes), the CPU 40 displays on the display unit 1 the entire image corresponding to the thumbnail highlighted at that moment, that is, the image selected by the user 100 (step S307), and ends the image display process.
  • In this way, the user 100 can display a desired image hands-free, without any special button or switch operation, and by repeating steps S301 to S307 images can be displayed continuously.
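  • The display flow of FIG. 14 could be sketched as below, reusing select_by_hand_overlap from the earlier sketch; all other interfaces are illustrative placeholders.

```python
def display_mode(display, camera, angle_sensor, recognizer, storage):
    """Sketch of steps S301-S307: find the gazed body part, show that
    folder's thumbnails, and enlarge the one selected by hand overlap."""
    while True:
        tilt = angle_sensor.read()                          # S301
        candidates = recognizer.narrow_by_tilt(tilt)        # S302
        frame = camera.capture()                            # S303
        part = recognizer.match_table(candidates, frame)    # S304: table lookup
        if part:
            break           # otherwise the user fine-tunes the head tilt
    thumbs = storage.thumbnails(folder=part)                # S305: e.g. hand folder
    chosen = select_by_hand_overlap(thumbs, display.highlight,
                                    recognizer.skin_area)   # S306
    display.show_fullscreen(chosen)                         # S307
```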
  • Thus, after selecting the camera mode, the user 100 can perform both the operation of saving a captured image and the operation of recalling a saved image for display on the display unit 1 hands-free, without button or switch operations.
  • FIGS. 15 to 17 are explanatory diagrams showing examples of usage in the browsing mode of the HMD.
  • To view content such as a movie, the user 100 selects the browsing mode from the menu in the display image 200 displayed by the display unit 1, as shown in FIG. 16A.
  • This selection, too, can be performed hands-free, with the angle sensor 4 detecting the tilt of the head 102 as described above.
  • One title is stored in each virtual folder on the flash memory 42 associated with a body part. To watch a movie stored in the elbow folder, for example, the user 100 looks at the left elbow for a predetermined time (for example, 2 seconds), as shown in FIG. 16B.
  • The control unit 30 determines that the user 100 is looking at the "elbow" from the inclination of the head 102 and the skin color area within the defined region 250 set at the center of the image, and causes the display unit 1 to display the movie title stored in the elbow folder (for example, "Seven Bouncers").
  • When the user 100 then looks for a predetermined time (for example, 2 seconds) at the arm, to which the playback command is assigned, the display unit 1 plays the movie "Seven Bouncers".
  • In this way, the user 100 can enjoy content such as a desired movie even on a train, without it being seen by neighboring passengers.
  • And because playback and stopping are hands-free, operating the display unit 1 is not hindered even while standing and holding a strap.
  • For this browsing mode, too, calibration is performed as in the camera mode: the body parts and the operation information corresponding to them are matched to the physique and other characteristics of the user 100 who actually uses the HMD.
  • FIG. 18 is an example of a calibrated data table, and FIG. 19 is an example of a second table in which operation commands are associated with the states of the body parts of the user 100; both tables are stored in the flash memory 42.
  • In FIG. 18, the virtual folders storing the moving image files indicated by movie titles are associated in advance with body parts by calibration.
  • Here too, "arm", "elbow", and "hand" are used as the body parts, identified in the posture shown in FIG. 16B, with the left arm extended forward and bent at a substantially right angle at the elbow.
  • The detection value of the angle sensor 4 and the image captured by the CCD camera 2 are stored as the head angle and pattern image in association with each body part ("arm", "elbow", "hand"): the detection value is the one obtained when the user 100, actually in that posture, moves the head 102 to look at each part, and the captured image is the one taken by the CCD camera 2 at that head angle.
  • In FIG. 19, an operation command is associated with the state of each body part of the user 100: the "arm" with a playback command, the "elbow" with a stop command, and the "hand" with a pause command.
  • The user 100 can therefore control movie playback hands-free by staring at the hand, the elbow, or the arm.
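  • The FIG. 19 mapping amounts to a small dispatch table, as in the sketch below; the player object and the dwell check are illustrative placeholders.

```python
# FIG. 19 analogue: gazed body part -> playback command.
PLAYBACK_COMMANDS = {"left arm": "play", "left elbow": "stop", "left hand": "pause"}

def dispatch_playback(part, player, gaze_dwelled):
    """Issue the playback command assigned to a body part stared at for ~2 s."""
    if gaze_dwelled and part in PLAYBACK_COMMANDS:
        getattr(player, PLAYBACK_COMMANDS[part])()   # player.play()/stop()/pause()
```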
  • As described above, when the user 100 wears the HMD according to the present embodiment and selects the browsing mode, operations related to watching a movie can be performed accurately and hands-free.
  • In the present embodiment a moving image file such as a movie is viewed, but still images may of course be viewed as well.
  • 1 Display unit (display means)
  • 1a Projection unit
  • 1b Light source unit
  • 2 CCD camera (imaging means)
  • 3 Control unit
  • 4 Angle sensor (head state detecting means)
  • 30 Control unit (part specifying means, part state determining means, control processing means, calibration means)
  • 31 Main controller
  • 40 CPU
  • 42 Flash memory (storage means, second storage means)
  • 100 User
  • 102 Head

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a head-mounted display comprising: display means for making image light corresponding to image information incident on an eye of a user and displaying an image corresponding to the image information; image capture means for capturing an image of a predetermined range in the user's visual direction; head state detection means for detecting an angle of the user's head; part specifying means for specifying a part of the user's body located in the user's visual direction, on the basis of the image captured by the image capture means and the detected angle of the user's head; and control processing means for executing an operation corresponding to information on the part of the user's body specified by the part specifying means.
PCT/JP2010/054585 2009-03-18 2010-03-17 Head-mounted display WO2010107072A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-066514 2009-03-18
JP2009066514A JP5272827B2 (ja) Head-mounted display

Publications (1)

Publication Number Publication Date
WO2010107072A1 (fr) 2010-09-23

Family

ID=42739731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/054585 WO2010107072A1 (fr) Head-mounted display

Country Status (2)

Country Link
JP (1) JP5272827B2 (fr)
WO (1) WO2010107072A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9213405B2 (en) * 2010-12-16 2015-12-15 Microsoft Technology Licensing, Llc Comprehension and intent-based content for augmented reality displays
US9153195B2 (en) 2011-08-17 2015-10-06 Microsoft Technology Licensing, Llc Providing contextual personal information by a mixed reality device
WO2013028908A1 (fr) 2011-08-24 2013-02-28 Microsoft Corporation Repères tactiles et sociaux faisant office d'entrées dans un ordinateur
JP5821464B2 (ja) * 2011-09-22 2015-11-24 Seiko Epson Corporation Head-mounted display device
JP2013206412A (ja) * 2012-03-29 2013-10-07 Brother Ind Ltd Head-mounted display and computer program
JP6201990B2 (ja) * 2012-06-18 2017-09-27 Sony Corporation Image display device, image display program, and image display method
WO2014141504A1 (fr) * 2013-03-11 2014-09-18 NEC Solution Innovators, Ltd. Three-dimensional user interface device and three-dimensional operation processing method
US9335547B2 (en) * 2013-03-25 2016-05-10 Seiko Epson Corporation Head-mounted display device and method of controlling head-mounted display device
JP6169462B2 (ja) * 2013-09-30 2017-07-26 NTT Docomo, Inc. Information processing device and information processing method
JP6065960B2 (ja) * 2015-10-08 2017-01-25 Seiko Epson Corporation Head-mounted display device
JP7301615B2 (ja) * 2019-06-17 2023-07-03 Canon Inc. Electronic device and control method therefor
JP2021157231A (ja) * 2020-03-25 2021-10-07 Casio Computer Co., Ltd. Electronic device, measurement system, operation instruction method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH086708A (ja) * 1994-04-22 1996-01-12 Canon Inc Display device
JP2000148381A (ja) * 1998-11-05 2000-05-26 Telecommunication Advancement Organization Of Japan Input image processing method, input image processing device, and recording medium storing an input image processing program
JP2004013326A (ja) * 2002-06-04 2004-01-15 Canon Inc Image processing device, control method therefor, computer program, and computer-readable storage medium
JP2008033891A (ja) * 2006-06-27 2008-02-14 Matsushita Electric Ind Co Ltd Display device and control method therefor
JP2008040832A (ja) * 2006-08-07 2008-02-21 Canon Inc Mixed reality presentation system and control method therefor

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014010502A (ja) * 2012-06-27 2014-01-20 Shunji Sugaya Message transmission system, message transmission method, and program
WO2014129105A1 (fr) * 2013-02-22 2014-08-28 Sony Corporation Head-mounted display system, head-mounted display, and control program for head-mounted display
US9829997B2 (en) 2013-02-22 2017-11-28 Sony Corporation Head-mounted display system, head-mounted display, and head-mounted display control program
JP2014174747A (ja) * 2013-03-08 2014-09-22 Sony Corp Information processing device, information processing method, and program
CN105874528A (zh) * 2014-01-15 2016-08-17 Hitachi Maxell, Ltd. Information display terminal, information display system, and information display method
CN105874528B (zh) * 2014-01-15 2018-07-20 Maxell, Ltd. Information display terminal, information display system, and information display method
JP2016186658A (ja) * 2016-07-14 2016-10-27 Seiko Epson Corporation Head-mounted display device and method
WO2018079446A1 (fr) * 2016-10-27 2018-05-03 NEC Corporation Information input device and information input method
JPWO2018079446A1 (ja) * 2016-10-27 2019-09-19 NEC Corporation Information input device and information input method
US10955971B2 (en) 2016-10-27 2021-03-23 Nec Corporation Information input device and information input method

Also Published As

Publication number Publication date
JP5272827B2 (ja) 2013-08-28
JP2010218405A (ja) 2010-09-30

Similar Documents

Publication Publication Date Title
JP5272827B2 (ja) Head-mounted display
JP6693060B2 (ja) Display system, display device, display device control method, and program
CN107430278B (zh) Context-sensitive hologram reaction based on the user
US9967487B2 Preparation of image capture device in response to pre-image-capture signal
KR102179142B1 (ko) Wearable food nutrition feedback system
CN103984097B (zh) Head-mounted display device, control method for head-mounted display device, and image display system
JP5168161B2 (ja) Head-mounted display
JP5201015B2 (ja) Head-mounted display
JP6094305B2 (ja) Head-mounted display device and control method for head-mounted display device
WO2010073879A1 (fr) Head-mounted display
JP2017102768A (ja) Information processing device, display device, information processing method, and program
GB2495159A (en) A head-mounted somatosensory control and display system based on a user's body action
JP6459380B2 (ja) Head-mounted display device, control method for head-mounted display device, and computer program
JP2014132719A (ja) Display device and display device control method
JP6554948B2 (ja) Display device, display device control method, and program
JP6303274B2 (ja) Head-mounted display device and control method for head-mounted display device
JP7243193B2 (ja) Display system, display system control method, information processing device, and control program for information processing device
JP2011070458A (ja) Head-mounted display and imaging data utilization system including the head-mounted display
JP2018124721A (ja) Head-mounted display device and control method for head-mounted display device
JP6337534B2 (ja) Head-mounted display device and control method for head-mounted display device
JP2017120488A (ja) Display device, display system, display device control method, and program
JP6740613B2 (ja) Display device, display device control method, and program
JP2018042004A (ja) Display device, head-mounted display device, and display device control method
JP2016090853A (ja) Display device, display device control method, and program
WO2010082270A1 (fr) Head-mounted display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10753563

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10753563

Country of ref document: EP

Kind code of ref document: A1