WO2016194844A1 - Wearable Device - Google Patents
Wearable Device
- Publication number: WO2016194844A1 (application PCT/JP2016/065810)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wearable device
- display
- image
- smartphone
- electronic device
Classifications
- G02B27/017: Head-up displays; head mounted
- G02B27/0101: Head-up displays characterised by optical features
- G06F1/163: Wearable computers, e.g. on a belt
- G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object
- G06F3/0482: Interaction with lists of selectable items, e.g. menus
- G06F3/0485: Scrolling or panning
- G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- H04W4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
- G02B2027/0138: Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014: Head-up displays comprising information/image processing systems
- G02B2027/0178: Head mounted displays of eyeglass type
- G02B2027/0187: Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- H04N5/64: Constructional details of television receivers, e.g. cabinets or dust covers
- H04W4/02: Services making use of location information
Definitions
- This application relates to a wearable device that can be worn on a user's head.
- Patent Document 1 discloses an information processing apparatus that includes a display unit having a screen arranged to enter the user's field of view, a communication unit that communicates with a portable terminal carried by the user, a detection unit that detects a user operation performed on the portable terminal, and a control unit that controls display of a scroll item that automatically scrolls in a first direction within the screen in accordance with the detected user operation.
- Patent Document 2 discloses an information processing apparatus that includes a display control unit which recognizes, in a captured image, an external device present in the real space reflected in the user's field of view, and arranges display items so that the recognized external device is not hidden by them.
- The purpose of this application is to provide a wearable device with improved operability.
- In one aspect, the wearable device includes a display unit disposed in front of the eyes and a detection unit that detects whether another electronic device is present in a predetermined space in front of the wearable device. When the other electronic device is in the predetermined space and is performing a predetermined display, additional information related to that display is shown on the display unit.
- In another aspect, the wearable device includes a display unit disposed in front of the eyes, a detection unit that detects a predetermined object existing in the real space, and a control unit that, when the predetermined object is in a predetermined space in front of the user, displays additional information related to the object at a position that does not overlap the object within the display area of the display unit. When the predetermined object moves and is detected to be superimposed on the area where the additional information is displayed, a selection process for the additional information is executed.
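The overlap-triggered selection described above can be sketched as a simple geometric test per frame: if the detected object's bounding box intersects the region where the additional information is drawn, the selection process runs. This is an illustrative sketch only; the names `Rect` and the return strings are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the selection behaviour: when the detected object's
# bounding box comes to overlap the additional-information region, selection
# is triggered.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        # Axis-aligned rectangle intersection test.
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)

def update(object_box: Rect, info_box: Rect) -> str:
    """Return the action the control unit would take for this frame."""
    if object_box.overlaps(info_box):
        return "select_additional_info"   # selection process is executed
    return "keep_displaying"              # info stays at a non-overlapping position

info = Rect(100, 0, 50, 50)
print(update(Rect(0, 0, 20, 20), info))     # far away -> keep_displaying
print(update(Rect(110, 10, 20, 20), info))  # overlapping -> select_additional_info
```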
- In another aspect, the wearable device includes a display unit disposed in front of the eyes and a detection unit that detects a predetermined object existing in the real space; when the predetermined object is in a predetermined space in front of the user, an image including additional information related to the predetermined object is displayed so that the image encloses the predetermined object within the display area of the display unit.
- In another aspect, the wearable device includes a display unit arranged in front of the eyes and a detection unit that detects whether another electronic device is present in the real space. When, from the detection result, the display screen of the other electronic device is inclined at a predetermined angle or more with respect to the display surface of the wearable device's display unit, an image related to the display of the other electronic device is displayed superimposed on that device.
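The tilt test above can be sketched by comparing the surface normals of the two screens, assuming the detection unit can estimate both orientations. The 30-degree threshold and the function names are illustrative assumptions.

```python
# Minimal sketch of the tilt-based superimpose decision. The "predetermined
# angle" threshold is an assumed value, not specified in the patent.
import math

def tilt_deg(n_wearable, n_device):
    """Angle in degrees between the two display surfaces' normal vectors."""
    dot = sum(a * b for a, b in zip(n_wearable, n_device))
    norm = (math.sqrt(sum(a * a for a in n_wearable)) *
            math.sqrt(sum(b * b for b in n_device)))
    # Clamp to guard against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def should_superimpose(n_wearable, n_device, threshold_deg=30.0):
    # Superimpose the related image only when the other device's screen is
    # inclined at or beyond the predetermined angle.
    return tilt_deg(n_wearable, n_device) >= threshold_deg

print(should_superimpose((0, 0, 1), (0, 0, 1)))  # parallel screens -> False
print(should_superimpose((0, 0, 1), (0, 1, 1)))  # ~45 degrees tilt -> True
```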
- In another aspect, the wearable device includes a display unit disposed in front of the user's eyes and a detection unit that detects other electronic devices existing in a predetermined space in front of the user. When an image is displayed on the display unit and the detection unit detects that another electronic device has entered the predetermined space, the size of the image is changed following the movement of the position of the entering device; when the moving speed of the device within the predetermined space falls below a predetermined value, the change of the image size is completed.
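The resize-following behaviour above can be sketched as a loop over successive detected positions: the image width tracks the device's position until the per-frame speed drops below a threshold, at which point the resize is finalised. The 1-D position model, the threshold value, and the width formula are all illustrative assumptions.

```python
# Sketch of resize-following: the image size follows the device's movement
# and the change completes once the device's speed falls below a
# predetermined value. Units and threshold are assumed for illustration.
SPEED_THRESHOLD = 5.0  # assumed "predetermined value" (units per frame)

def follow_resize(positions, base_width=100.0):
    """Return (final_width, finished) after tracking a list of 1-D positions."""
    width = base_width
    finished = False
    for prev, cur in zip(positions, positions[1:]):
        if finished:
            break                          # resize already completed; ignore further motion
        width = base_width + cur           # width follows the device position
        if abs(cur - prev) < SPEED_THRESHOLD:
            finished = True                # movement slowed below the threshold
    return width, finished

# Device moves quickly, then nearly stops; resizing completes on the slow step.
w, done = follow_resize([0.0, 20.0, 40.0, 42.0])
print(w, done)  # 142.0 True
```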
- In another aspect, the wearable device includes a display unit that is disposed in front of the user's eyes and displays an image including a plurality of display elements, and a detection unit that detects a predetermined object existing in a predetermined space in front of the user. When the detection unit detects that the predetermined object has entered the predetermined space while the image is displayed, the wearable device changes the size of the image following the movement of the position of the object; when the width of the image in a predetermined direction becomes equal to or less than a predetermined length, at least one of the display elements included in the image is split off and displayed outside the area of the image.
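The overflow behaviour in this aspect can be sketched as a partition step: once the image is shrunk below a predetermined width, elements that no longer fit are moved outside the image area. The element width, threshold, and element names are illustrative assumptions.

```python
# Sketch of splitting display elements out of a shrunken image. The
# "predetermined length" and per-element width are assumed values.
MIN_WIDTH = 60  # assumed "predetermined length"

def layout(elements, image_width, element_width=25):
    """Partition display elements into those kept inside the image and those
    moved outside it once the image becomes too narrow."""
    if image_width > MIN_WIDTH:
        return elements, []                        # everything fits inside
    capacity = max(1, image_width // element_width)
    return elements[:capacity], elements[capacity:]  # the rest is shown outside

inside, outside = layout(["icon_a", "icon_b", "icon_c"], image_width=50)
print(inside, outside)  # ['icon_a', 'icon_b'] ['icon_c']
```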
- FIG. 1 is a perspective view of a wearable device.
- FIG. 2 is a block diagram of the wearable device.
- FIG. 3A is a diagram for explaining a relationship between a detection range of the detection unit and a display area of the display unit.
- FIG. 3B is a diagram for explaining the relationship between the detection range of the detection unit and the display area of the display unit.
- FIG. 3C is a diagram for explaining the relationship between the detection range of the detection unit and the display area of the display unit.
- FIG. 4 is a schematic diagram of a network system formed from a wearable device and another electronic device.
- FIG. 5 is a diagram illustrating an example of a determination flow for applying any of various functions by the wearable device.
- FIG. 6 is a diagram illustrating another example different from the case of FIG. 5.
- FIG. 7 is a diagram illustrating a first example of processing executed by the wearable device according to the present embodiment.
- FIG. 8 is a diagram illustrating a second example of processing executed by the wearable device according to the present embodiment.
- FIG. 9 is a diagram illustrating a third example of processing executed by the wearable device according to the present embodiment.
- FIG. 10 is a diagram illustrating a fourth example of processing executed by the wearable device according to the present embodiment.
- FIG. 11 is a diagram illustrating a fifth example of processing executed by the wearable device according to the present embodiment.
- FIG. 12 is a diagram illustrating a sixth example of processing executed by the wearable device according to the present embodiment.
- FIG. 13 is a diagram continuing the explanation of the sixth example of the process executed by the wearable device according to the present embodiment.
- FIG. 14 is a diagram further continuing the explanation of the sixth example of the process executed by the wearable device according to the present embodiment.
- FIG. 15 is a diagram illustrating a seventh example of processing executed by the wearable device according to the present embodiment.
- FIG. 16 is a diagram illustrating an eighth example of processing executed by the wearable device according to the present embodiment.
- FIG. 17 is a diagram illustrating a ninth example of the process executed by the wearable device according to the present embodiment.
- FIG. 18 is a diagram illustrating a tenth example of processing executed by the wearable device according to the present embodiment.
- FIG. 19 is a diagram illustrating an eleventh example of processing executed by the wearable device according to the present embodiment.
- FIG. 20 is a diagram illustrating a twelfth example of processing executed by the wearable device according to the present embodiment.
- FIG. 1 is a perspective view of the wearable device 1.
- the wearable device 1 is a head mount type device that is worn on a user's head.
- the wearable device 1 has a front surface portion 1a, a side surface portion 1b, and a side surface portion 1c.
- Front part 1a is arranged in front of the user so as to cover both eyes of the user when worn.
- the side surface portion 1b is connected to one end portion of the front surface portion 1a.
- Side part 1c is connected to the other end of front part 1a.
- The side surface portion 1b and the side surface portion 1c are supported by the user's ears like the temples of eyeglasses when worn, and stabilize the wearable device 1.
- the side surface portion 1b and the side surface portion 1c may be configured to be connected to the back surface of the user's head when worn.
- the front part 1a includes a display part 2a and a display part 2b on the surface facing the user's eyes when worn.
- the display unit 2a is disposed at a position facing the user's right eye when worn.
- the display unit 2b is disposed at a position facing the user's left eye when worn.
- the display unit 2a displays an image for the right eye
- the display unit 2b displays an image for the left eye.
- The display unit 2a and the display unit 2b are a pair of semi-transparent displays, but are not limited thereto.
- the display unit 2a and the display unit 2b may include lenses such as an eyeglass lens, a sunglasses lens, and an ultraviolet cut lens, and the display unit 2a and the display unit 2b may be provided separately from the lenses.
- the display unit 2a and the display unit 2b may be configured by a single display device as long as different images can be independently provided to the user's right eye and left eye.
- The front surface portion 1a is provided with an imaging unit 3 (also referred to as an out-camera).
- the imaging unit 3 is disposed at the central portion of the front surface portion 1a.
- the imaging unit 3 acquires an image in a predetermined range in the scenery in front of the user.
- the imaging unit 3 can also acquire an image in a range corresponding to the user's field of view.
- the field of view here is a field of view when the user is looking at the front, for example.
- The imaging unit 3 may include an imaging unit disposed near one end portion of the front surface portion 1a (corresponding to the right eye side when worn) and another imaging unit disposed near the other end portion (corresponding to the left eye side when worn).
- In this case, the imaging unit near the right-eye-side end portion can acquire an image in a range corresponding to the field of view of the user's right eye, and the imaging unit near the left-eye-side end portion can acquire an image in a range corresponding to the field of view of the user's left eye.
- The wearable device 1 has a function of allowing the user to visually recognize various information superimposed on the foreground that the user is viewing.
- the foreground includes a landscape in front of the user.
- The wearable device 1 allows the user to visually recognize the foreground through the display unit 2a and the display unit 2b, together with the contents displayed on them.
- An imaging unit 4 (also referred to as an in-camera) is provided on the front surface portion 1a, arranged so as to face the user when worn.
- The imaging unit 4 acquires an image of the user's face, for example, of an eye.
- The front surface portion 1a is provided with a detection unit 5.
- The side surface portion 1c is provided with an operation unit 6. The detection unit 5 and the operation unit 6 will be described later.
- FIG. 1 shows an example in which the wearable device 1 has a shape like glasses, but the shape of the wearable device 1 is not limited to this.
- the wearable device 1 may have a goggle shape.
- the wearable device 1 may be configured to be connected to an external device such as an information processing device or a battery device in a wired or wireless manner.
- FIG. 2 is a block diagram of the wearable device 1.
- The wearable device 1 includes a display unit 2a and a display unit 2b, an imaging unit 3 (out-camera), an imaging unit 4 (in-camera), a detection unit 5, an operation unit 6, a control unit 7, a communication unit 8, and a storage unit 9.
- The display unit 2a and the display unit 2b include a semi-transparent display device such as a liquid crystal display (LCD) or an organic EL (Organic Electro-Luminescence) panel.
- the display unit 2a and the display unit 2b display various types of information as images in accordance with control signals input from the control unit 7.
- the display unit 2a and the display unit 2b may be a projection device that projects an image on a user's retina using a light source such as a laser beam.
- The display unit 2a and the display unit 2b may be configured such that a half mirror is installed on the lens portion of the glasses-shaped wearable device 1 and an image emitted from a separately provided projector is projected onto it (in the example shown in FIG. 1, the display unit 2a and the display unit 2b are rectangular half mirrors).
- As described above, the display unit 2a and the display unit 2b may display various information three-dimensionally, and may display information as if it existed in front of the user (or at a position away from the user).
- As a method for displaying information in this way, for example, a frame sequential method, a polarization method, a top-and-bottom method, a side-by-side method, an anaglyph method, a lenticular method, a parallax barrier method, a liquid crystal parallax barrier method, or a multi-parallax method such as a two-parallax method may be employed.
- The imaging unit 3 and the imaging unit 4 electronically capture images using an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, convert the captured image into a signal, and output the signal to the control unit 7.
- the detection unit 5 detects a real object that exists in the foreground of the user.
- the detection unit 5 detects, for example, a pre-registered object or an object that matches a pre-registered shape among real objects.
- the pre-registered object includes, for example, a human hand or finger, or a portable electronic device such as a smartphone or a wristwatch type terminal.
- the shape registered in advance includes, for example, the shape of a human hand or finger, or the shape of a portable electronic device such as a smartphone or a wristwatch type terminal.
- The detection unit 5 may also be configured to detect the range (for example, the shape and size) of a real object in the image based on pixel brightness, saturation, hue, edges, and the like, even for an object whose shape is not registered in advance. In this specification, such a real object is referred to as a predetermined object.
- the detection unit 5 includes a sensor that detects an actual object (also referred to as a predetermined object).
- the sensor is, for example, a sensor that detects an actual object using at least one of visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, and capacitance.
- the imaging unit 3 (out camera) may also serve as the detection unit 5. That is, the imaging unit 3 detects an object (also referred to as a predetermined object) within the imaging range by analyzing the captured image.
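How the out-camera could double as the detection unit can be sketched as a search of the captured frame for a pre-registered shape. A real implementation would use proper image processing; this toy exact-match search over a binary grid (a hypothetical stand-in for, e.g., a smartphone outline) only illustrates the control flow.

```python
# Illustrative sketch: scan the captured frame for a pre-registered binary
# template. Template and frame contents are invented for the example.
def find_object(frame, template):
    """Return (row, col) of the first template match in the frame, else None."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            if all(frame[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None  # no pre-registered object within the imaging range

smartphone = [[1, 1],
              [1, 1]]
frame = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
print(find_object(frame, smartphone))  # (1, 1)
```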
- the operation unit 6 includes, for example, a touch sensor disposed on the side surface portion 1c.
- the touch sensor can detect a user's contact, and accepts basic operations such as starting and stopping the wearable device 1 and changing the operation mode according to the detection result.
- The operation unit 6 may instead be disposed on the side surface portion 1b, or on both the side surface portion 1b and the side surface portion 1c.
- The control unit 7 includes a CPU (Central Processing Unit) as computation means and a memory as storage means, and implements various functions by executing programs using these hardware resources.
- Specifically, the control unit 7 reads a program or data stored in the storage unit 9, expands it in the memory, and causes the CPU to execute the instructions contained in the expanded program. The control unit 7 also reads and writes data from and to the memory and the storage unit 9 as needed.
- the communication unit 8 communicates wirelessly.
- the wireless communication standards supported by the communication unit 8 include, for example, cellular phone communication standards such as 2G, 3G, and 4G, and short-range wireless communication standards.
- Cellular phone communication standards include, for example, LTE (Long Term Evolution), W-CDMA (Wideband Code Division Multiple Access), WiMAX (Worldwide Interoperability for Microwave Access), GSM (registered trademark) (Global System for Mobile Communications), and PHS (Personal Handy-phone System).
- Examples of short-range wireless communication standards include IEEE 802.11, Bluetooth (registered trademark), IrDA (Infrared Data Association), NFC (Near Field Communication), and WPAN (Wireless Personal Area Network).
- As a communication standard of WPAN for example, there is ZigBee (registered trademark).
- the communication unit 8 may support one or more of the communication standards described above.
- When the wearable device 1 is worn by a user, the wearable device 1 may transmit and receive various signals by communicating with a portable electronic device (for example, a smartphone or a wristwatch-type terminal) carried by the user via the short-range wireless method.
- The wearable device 1 may include a connector to which another electronic device is connected.
- the connector may be a general-purpose terminal such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Light Peak (Thunderbolt (registered trademark)), or an earphone microphone connector.
- the connector may be a dedicated terminal such as a dock connector.
- the connector may be connected to any device including an external storage, a speaker, and a communication device, for example.
- the storage unit 9 stores various programs and data.
- the storage unit 9 may include a nonvolatile storage device such as a flash memory.
- the program stored in the storage unit 9 includes a control program 90.
- the storage unit 9 may be configured by a combination of a portable storage medium such as a memory card and a read / write device that reads from and writes to the storage medium.
- the control program 90 may be stored in a storage medium.
- the control program 90 may be acquired from another portable electronic device such as a server device, a smartphone, a wristwatch type terminal, or the like by wireless communication or wired communication.
- the control program 90 provides functions related to various controls for operating the wearable device 1.
- The functions provided by the control program 90 include a function for detecting a real object (predetermined object) existing in the scenery in front of the user from the detection result of the detection unit 5, and a function for controlling the display of the display unit 2a and the display unit 2b.
- the control program 90 includes a detection processing program 90a and a display control program 90b.
- the detection processing program 90a provides a function for detecting a predetermined object existing in the foreground of the user from the detection result of the detection unit 5.
- the detection processing program 90a provides a function of estimating the position of a predetermined object in the user's foreground from the detection result of the detection unit 5.
- the detection processing program 90a provides a function for detecting a predetermined object existing in the scenery in front of the user from the detection result of the imaging unit 3.
- the display control program 90b provides a function of displaying information related to a predetermined object existing in the scenery in front of the user.
- the display control program 90b provides a function for displaying, on a display unit, additional information related to the display when the predetermined object is an electronic device having a display function, such as a smartphone or a wristwatch-type terminal, and the electronic device performs a predetermined display.
- FIGS. 3A to 3C are diagrams illustrating the relationship between the detection range 51 of the detection unit 5 and the display area 21 of the display unit 2a and the display unit 2b.
- the detection unit 5 will be described as a sensor that detects an actual predetermined object using infrared rays.
- the detection unit 5 is described as consisting of an infrared irradiation unit that irradiates infrared rays and an infrared imaging unit that can receive infrared rays reflected from an actual predetermined object (that is, has infrared sensitivity). In other words, the control unit 7 detects an actual predetermined object from the image captured by the infrared imaging unit.
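The infrared detection principle described here (irradiate infrared rays, then find the reflection in the infrared imaging unit's image) can be illustrated with a minimal sketch. This is not the patented implementation; the intensity threshold and minimum blob area are assumptions chosen purely for illustration:

```python
import numpy as np

def detect_object(ir_image, intensity_threshold=200, min_area=50):
    """Detect a reflecting object in an infrared image.

    ir_image: 2-D array of infrared pixel intensities (0-255).
    Returns the centroid (row, col) of the bright region, or None.
    The threshold and minimum-area values are illustrative assumptions.
    """
    mask = ir_image >= intensity_threshold  # pixels lit by reflected infrared
    if mask.sum() < min_area:               # too few bright pixels: nothing detected
        return None
    rows, cols = np.nonzero(mask)
    return (rows.mean(), cols.mean())       # centroid of the bright region
```

In practice the detection unit 5 would evaluate each captured frame and track the centroid over time, which is what allows the motion of the predetermined object to be detected as well.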
- the display image is displayed as if the display unit 2a and the display unit 2b exist at positions away from the wearable device 1.
- FIG. 3A is a perspective view schematically showing the detection range 51 of the detection unit 5 and the display area 21 of the display unit 2a and the display unit 2b.
- FIG. 3B is a top view of FIG. 3A.
- FIG. 3C is a front view of FIG. 3A.
- FIGS. 3A to 3C define a three-dimensional orthogonal coordinate system including an X axis, a Y axis, and a Z axis.
- the Y-axis direction is the vertical direction.
- the Z-axis direction is the user's front-rear direction.
- the X-axis direction is a direction orthogonal to both the Y-axis direction and the Z-axis direction.
- FIG. 3C corresponds to the field of view when the user visually recognizes the front.
- the detection range 51 is a three-dimensional space. That is, the detection unit 5 can detect a predetermined object in the detection range 51, which is a three-dimensional space, by detecting with the infrared imaging unit the infrared rays irradiated from the infrared irradiation unit. The detection unit 5 can also detect the motion of a predetermined object that exists in the detection range 51. For example, when the predetermined object is a user's arm, hand, or finger, or a combination of these (generally referred to as an upper limb), the detection unit 5 can detect operations such as bending and extension of the fingers, bending of the wrist, and rotation of the forearm (for example, pronation and supination).
- the detection unit 5 may detect that the position of the specific portion of the upper limb moves within the detection range 51.
- the detection unit 5 may detect the shape of the upper limb.
- the detection unit 5 may detect, for example, a form in which the other fingers are folded while the thumb is extended upward (a thumbs-up gesture).
- even when the imaging unit 3 (out-camera) is applied as the detection unit, the wearable device 1 can detect a predetermined object within the detection range (or within the imaging range), and the motion of the predetermined object, similarly to the detection unit 5.
- the display unit 2a and the display unit 2b display an image in the display area 21 that is located away from the wearable device 1, not in the actually provided wearable device 1.
- the images displayed by the display units 2a and 2b may be referred to as display images).
- the display unit 2a and the display unit 2b may display the display image as a stereoscopic 3D object having a depth. Note that the depth corresponds to the thickness in the Z-axis direction.
- FIG. 4 is a schematic diagram of a network system 100 formed from the wearable device 1 and other electronic devices.
- the wearable device 1 communicates, by a wired or wireless connection, with other electronic devices such as a smartphone A possessed by the user or a wristwatch-type terminal B worn on the user's wrist.
- the wearable device 1 communicates with other electronic devices by the control unit 7 controlling the communication unit 8.
- the communication partner may be identified based on identification information (for example, ID data) included in a signal transmitted from any of the plurality of electronic devices.
- the wearable device 1 can display an image to be displayed by another electronic device on the display unit 2a and the display unit 2b of the wearable device 1.
- for example, when the smartphone A, which is another electronic device, receives a mail, the smartphone A transmits to the wearable device 1 a control signal for displaying that the mail has been received, or a control signal for displaying the mail content (for example, a message).
- the wearable device 1 that has received the control signal displays, on the display unit 2a and the display unit 2b, that the smartphone A has received the mail or the mail content based on the control signal.
- the second function of operating the wearable device 1 is realized based on the contact operation with respect to other electronic devices possessed by the user.
- for example, the wearable device 1 may receive from the smartphone A a signal including information on contact with the smartphone A, and may perform an operation accompanied by a change of the display based on the signal.
- likewise, the wearable device 1 may receive from the wristwatch-type terminal B a signal based on the detected motion of the upper limb, and may perform an operation accompanied by a change of the display based on the signal.
- the wearable device 1 displays the additional information related to the display on the display unit 2a and the display unit 2b of the wearable device 1 when another electronic device has a display function.
- the wearable device 1 may receive a signal related to display control from the smartphone A and display an image based on the signal in a space that can be visually recognized by the user.
- FIG. 5 is a diagram illustrating an example of a determination flow for applying any one of various functions by the wearable device 1. The processing based on this determination flow is realized by the control unit 7 of the wearable device 1 executing the control program 90 stored in the storage unit 9. In the example illustrated in FIG. 5, the description will be made assuming that the other electronic device is the smartphone A.
- the control unit 7 determines whether or not a signal is received from the smartphone A (step ST101).
- when no signal is received from the smartphone A as a result of the determination (step ST101, No), the control unit 7 repeats the same determination.
- when a signal is received from the smartphone A (step ST101, Yes), the control unit 7 determines, from information based on the received signal, whether the smartphone A is held by the user (step ST102).
- when the control unit 7 determines that the smartphone A is not grasped by the user as a result of the determination (step ST102, No), it applies the first function realized by the network system 100 (step ST103).
- the first function includes, for example, a function of notifying information that the smartphone A intends to notify the user using the wearable device 1 instead of the other electronic device.
- when the control unit 7 recognizes that the smartphone A is grasped by the user as a result of the determination in step ST102 (step ST102, Yes), it causes the detection unit 5 to start detection processing (step ST104).
- the control unit 7 determines whether or not the smartphone A is present in a predetermined space in front of the user from the detection result of the detection unit 5 (step ST105).
- the predetermined space in front of the user may be a space that the user can visually recognize, for example.
- a space that can be visually recognized by the user is appropriately defined, and may be defined based on, for example, a standard human viewing angle of about 120 degrees.
- the predetermined space in front of the user may be a space that can be superimposed on the display area when the user visually recognizes the display area 21 while wearing the wearable device 1.
- the predetermined space may be defined in any way other than the above.
- when the smartphone A is not present in the predetermined space as a result of the determination (step ST105, No), the control unit 7 applies the second function realized by the network system 100.
- the second function includes, for example, a function of operating the wearable device 1 based on a contact operation with respect to the smartphone A.
- in this way, the wearable device 1 may be configured to determine whether or not the smartphone A is gripped by the user and, when the smartphone A is gripped, to enable execution of the function of operating the wearable device 1 by a contact operation on the smartphone A. When the smartphone A is not grasped by the user, the wearable device 1 may regard the user as unable to immediately recognize the smartphone A, and may be configured to enable execution of the function of notifying, in place of the smartphone A, the information that the smartphone A is trying to report.
- when the smartphone A is present in the predetermined space as a result of the determination (step ST105, Yes), the control unit 7 applies the third function realized by the network system 100.
- the third function includes, for example, a function of displaying additional information related to display contents displayed by the smartphone A on the display unit 2a and the display unit 2b of the wearable device 1.
- in this way, the wearable device 1 may be configured to determine whether or not the smartphone A is present in a predetermined space (for example, a space that can be visually recognized by the user) and, when it is present, to enable execution of the function of displaying, on the display unit 2a and the display unit 2b, additional information that supplements the display content displayed by the smartphone A.
- when the smartphone A is not present in the predetermined space, the wearable device 1 may regard the user as not viewing the smartphone A, and may be configured to enable execution of the function of operating the wearable device 1 by a contact operation on the smartphone A.
- the wearable device 1 can execute a more convenient function depending on the usage state of the smartphone A.
- when the wearable device 1 is operated by a contact operation on the smartphone A and the smartphone A includes a touch panel, the wearable device 1 may execute, based on the movement of the contact position on the touch panel, a function as a cursor for designating a position in the display area (the XY plane in FIG. 3).
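The determination flow of FIG. 5 (steps ST101 to ST105) reduces to three nested checks. A minimal sketch, where the three boolean arguments stand in for the results of steps ST101, ST102, and ST105 (the function and argument names are illustrative, not from the embodiment):

```python
def select_function(signal_received, is_gripped, in_front_space):
    """Mirror the FIG. 5 flow: decide which of the three network-system
    functions the wearable device 1 applies.

    The booleans stand in for the results of steps ST101, ST102,
    and ST105; the returned labels name the applied function.
    """
    if not signal_received:      # step ST101, No: keep waiting
        return "wait"
    if not is_gripped:           # step ST102, No
        return "first"           # notify on behalf of the smartphone
    if not in_front_space:       # step ST105, No
        return "second"          # operate wearable via contact on the phone
    return "third"               # display additional information
```

The same skeleton makes it easy to see why the three functions are mutually exclusive for any given state of the smartphone.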
- FIG. 6 is a diagram showing another example different from the case of FIG. 5 in the determination flow for applying any of various functions by the wearable device 1.
- the processing based on this determination flow is realized by the control unit 7 of the wearable device 1 executing the control program 90 stored in the storage unit 9.
- in the example illustrated in FIG. 6, the description will be made assuming that the other electronic device is the smartphone A.
- the control unit 7 causes the detection unit 5 to start detection processing (step ST201).
- the control unit 7 then determines whether or not the smartphone A is present in the detection range 51 from the detection result of the detection unit 5 (step ST202).
- when the smartphone A is not present in the detection range 51 as a result of the determination (step ST202, No), the control unit 7 applies the first function realized by the network system 100 (step ST203).
- the first function includes, for example, a function of notifying information that the smartphone A intends to notify the user using the wearable device 1 instead of the other electronic device.
- when the smartphone A is present in the detection range 51 as a result of the determination in step ST202, that is, when the detection unit 5 detects the smartphone A (step ST202, Yes), the control unit 7 advances the process to step ST204.
- in step ST204, the control unit 7 determines whether or not the smartphone A is present in the predetermined space in front of the user.
- the predetermined space may be defined similarly to the case of FIG.
- when the smartphone A is not present in the predetermined space as a result of the determination (step ST204, No), the control unit 7 applies the second function realized by the network system 100.
- the second function includes, for example, a function of operating the wearable device 1 based on a contact operation with respect to the smartphone A.
- when the smartphone A is present in the predetermined space as a result of the determination (step ST204, Yes), the control unit 7 applies the third function realized by the network system 100.
- the third function includes, for example, a function of displaying additional information related to display contents displayed by the smartphone A on the display unit 2a and the display unit 2b of the wearable device 1.
- the wearable device 1 can apply various functions realized by the network system 100 in a timely manner based on various conditions that can be set as appropriate.
- the wearable device 1 according to this embodiment is not limited to such a configuration, and the application of the above-described various functions may be set by the user.
- the wearable apparatus 1 does not necessarily need to have all the various functions described above.
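The FIG. 6 variant (steps ST201 to ST204) replaces the signal-reception and grip checks of FIG. 5 with a detection-range check. A sketch, with the boolean arguments standing in for the results of steps ST202 and ST204 (names are illustrative):

```python
def select_function_fig6(in_detection_range, in_front_space):
    """Mirror the FIG. 6 flow: detection range 51 is checked first
    (step ST202), then the predetermined space in front (step ST204)."""
    if not in_detection_range:   # step ST202, No
        return "first"           # notify instead of the smartphone (step ST203)
    if not in_front_space:       # step ST204, No
        return "second"          # contact-operation function
    return "third"               # additional-information function
```

Comparing the two sketches shows the design choice: FIG. 6 relies purely on the detection unit 5, with no signal from the smartphone needed before detection starts.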
- FIG. 7 is a diagram illustrating a first example of processing executed by the wearable device 1 according to the present embodiment.
- FIG. 7 shows the display unit 2a (2b) of the wearable device 1 (hereinafter sometimes simply referred to as the display unit 2).
- the illustration of other members and other functional units other than the display unit 2a of the wearable device 1 is omitted.
- FIG. 7 is a diagram schematically showing, as a two-dimensional plane, a space corresponding to the user's field of view. FIG. 7 shows the display area 21 of the display unit 2 and a predetermined space 51a in front of the user that can be detected by the detection unit 5 (in the example of FIG. 7, the area surrounded by a dotted line).
- the predetermined space 51a may be variously defined as described above. Further, the predetermined space in front of the user is a predetermined space in front of the wearable device 1.
- the predetermined space 51a is illustrated as a plane, but in practice, it is defined as a three-dimensional space having a depth in the front-rear direction of the user (for example, the Z-axis direction in FIG. 3).
- the wearable device 1 has activated the detection unit 5 and is in a state where a predetermined object can be detected in a detection range 51 (not shown).
- FIG. 7 shows a smartphone A (another electronic device) held by the user's right hand R.
- the smartphone A includes a display unit and a touch sensor provided to be superimposed on the display unit.
- by the user touching a predetermined position in the display area of the display unit, the smartphone A can perform processing based on the image displayed at that position.
- a plurality of objects OB (also referred to as icons) for executing a predetermined function by touching are displayed on the smartphone A (such a screen may be referred to as a home screen).
- the plurality of objects OB includes an object OB1.
- the object OB1 is an image indicating that the web browser can be activated by being touched.
- in step S1, the smartphone A is outside the predetermined space 51a. In this state, when the user moves the right hand to the left, the smartphone A moves into the predetermined space 51a as shown in step S2. Note that the smartphone A is also present within the predetermined space 51a in the user's Z-axis direction (in other words, the direction orthogonal to the XY plane).
- the wearable device 1 starts communication connection with the smartphone A when the detection unit 5 detects that the smartphone A is in the predetermined space 51a in front of the user.
- Wearable device 1 receives, from smartphone A, a display control signal for executing a predetermined display on display unit 2 of wearable device 1.
- the wearable device 1 displays a new screen SC1 on the display unit 2 based on the received display control signal (step S3).
- the screen SC1 includes, for example, additional information related to the display content displayed when the smartphone A is detected to be in the predetermined space 51a.
- on the screen SC1, for example, information explaining the functions of the objects OB (icons) on the home screen, or a memo or the like previously created by the user, may be displayed as additional information.
- the wearable device 1 includes the detection unit 5 that detects whether or not the smartphone A (another electronic device) is present in the predetermined space 51a in front of the wearable device.
- the wearable device 1 has a configuration in which additional information related to the display is displayed on the display unit 2 when the smartphone A is in the predetermined space 51a and the smartphone A performs a predetermined display. According to such a configuration, the wearable device 1 can provide the user at one time with a larger amount of information than the smartphone A alone can display at one time, which makes the wearable device 1 easy for the user to use.
- although the wearable device 1 has been described as starting the communication connection with the smartphone A when it is detected that the smartphone A is in the predetermined space 51a, it is not limited to this.
- for example, the communication connection may be triggered by detecting that the smartphone A is grasped by the user.
- the fact that the smartphone A is gripped by the user may be detected by, for example, a touch sensor included in the smartphone A.
- the wearable device 1 may be configured to start communication connection when receiving a communication connection request from the smartphone A to the wearable device 1.
- although the wearable device 1 displays the screen SC1 when the entire smartphone A shifts from a state outside the predetermined space 51a to a state where the entire smartphone A is within the predetermined space 51a, it is not limited to such a configuration.
- for example, the wearable device 1 may determine that the smartphone A is in the predetermined space 51a when the ratio of the area of the smartphone A included in the predetermined space 51a to the entire area of the smartphone A becomes equal to or greater than a predetermined value.
- alternatively, the wearable device 1 may determine that the smartphone A is in the predetermined space 51a when any part of the smartphone A is detected in the predetermined space 51a.
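The area-ratio criterion can be sketched with axis-aligned rectangles. The 2-D rectangle model and the 0.5 threshold are illustrative assumptions, not values from the embodiment:

```python
def in_space_by_ratio(phone_rect, space_rect, threshold=0.5):
    """Decide that the smartphone is 'in the predetermined space 51a'
    when its overlapping area divided by its total area is at least
    `threshold`. Rectangles are (x_min, y_min, x_max, y_max) tuples;
    the 0.5 default threshold is an illustrative assumption."""
    px1, py1, px2, py2 = phone_rect
    sx1, sy1, sx2, sy2 = space_rect
    ox = max(0.0, min(px2, sx2) - max(px1, sx1))   # overlap width
    oy = max(0.0, min(py2, sy2) - max(py1, sy1))   # overlap height
    phone_area = (px2 - px1) * (py2 - py1)
    return (ox * oy) / phone_area >= threshold
```

Setting the threshold near 0 approximates the "any part detected" variant, and a threshold of 1 approximates the "entire smartphone inside" variant described above.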
- the wearable device 1 may detect that the smartphone A is present in the predetermined space 51a in a state where the angle at which the display surface of the smartphone A and the front surface of the wearable device 1 (or the display surface of the wearable device 1) intersect is smaller than a predetermined angle. The wearable device 1 may be configured to determine, by this detection, that the smartphone A is present in the predetermined space 51a in front of the user, and to display the screen SC1.
- the wearable device 1 may detect that the smartphone A is present in the predetermined space 51a in a state where the angle at which the user's line-of-sight direction, identified from the image of the user's eye captured by the imaging unit 4 (in-camera), intersects the display surface of the smartphone A is larger than a predetermined angle. The wearable device 1 may be configured to determine, by this detection, that the smartphone A is present in the predetermined space 51a in front of the user, and to display the screen SC1.
- the control unit 7 may extract a predetermined region including the user's eyeball from the user's image captured by the imaging unit 4 and specify the line-of-sight direction based on the positional relationship between the inner corner of the eye and the iris.
- alternatively, the control unit 7 may store, as reference images, images of the eyeball captured while the user is viewing various positions in the display area, and may detect the gaze position by collating the reference images with the image of the user's eyeball acquired as a determination target.
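The reference-image collation described here can be realized, for example, as a nearest-neighbor match between the captured eye image and the stored reference images. The sum-of-squared-differences similarity metric is an assumption for illustration, not the patented method:

```python
import numpy as np

def estimate_gaze(eye_image, reference_images):
    """Return the gaze label whose stored reference eye image is most
    similar to the captured one (smallest sum of squared differences).

    reference_images: dict mapping a gaze label (e.g. a display-area
    position) to a 2-D intensity array of the user's eye looking there.
    The SSD metric is an illustrative assumption."""
    best_label, best_score = None, float("inf")
    for label, ref in reference_images.items():
        diff = eye_image.astype(float) - ref.astype(float)
        score = float(np.sum(diff ** 2))     # sum of squared differences
        if score < best_score:
            best_label, best_score = label, score
    return best_label
```

A real in-camera pipeline would normalize for illumination and head pose before comparing, but the collation step itself is this simple lookup.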
- alternatively, it may be determined that the smartphone A is in the predetermined space when, for example, the wearable device 1, the user's face, or the user's eyes are detected in the image captured by the imaging unit included in the smartphone A.
- that is, the smartphone A has an imaging unit exposed on a surface parallel to the display surface (or a surface flush with the display surface), and when the wearable device 1 (or the user's face or eyes) is detected in the captured image, it is assumed that the user is viewing the display surface of the smartphone A. The smartphone A then starts a communication connection with the wearable device 1 when the wearable device 1 (or the user's face or eyes) is detected in the captured image. The wearable device 1 recognizes that the smartphone A is in the predetermined space 51a based on the signal received from the smartphone A, and displays the screen SC1.
- although the wearable device 1 has been described as establishing a communication connection with the smartphone A (another electronic device) and displaying the screen SC1 (additional information) based on the display control signal received from the smartphone A, it is not limited to this.
- for example, the wearable device 1 may store in advance, in the storage unit 9, additional information related to the display content of the smartphone A, detect the display content of the smartphone A present in the predetermined space 51a from the image captured by the imaging unit 3, and display, as the screen SC1, the additional information related to that display content by referring to the storage unit 9.
- the wearable device 1 is configured to display the screen SC1 based on the detection of the smartphone A (another electronic device) in the predetermined space 51a, but is not limited thereto.
- the wearable device 1 may be configured to display the screen SC1 based on the detection of the right hand R that holds the smartphone A (another electronic device) in the predetermined space 51a.
- the wearable device 1 may hide other screens displayed on the display unit 2 when displaying the screen SC1; that is, it may be configured to hide the images that the display unit 2 had been displaying before the screen SC1 is displayed.
- the screen SC1 is displayed outside the substantially rectangular outline A1 of the smartphone A.
- that is, the wearable device 1 detects the outline A1 of the smartphone A (another electronic device) in the predetermined space 51a based on the detection result of the detection unit 5, and displays the screen SC1 (for example, an image including additional information) on the display unit 2 so that the screen SC1 is outside the outline A1. Thereby, the screen SC1 does not hinder the visual recognition of the display of the smartphone A, and usability is improved.
- the screen SC1 is arranged and displayed so that one side of the screen SC1 is substantially parallel to one side of the smartphone A.
- thereby, the user can easily recognize that the additional information included in the screen SC1 is information related to the display content of the smartphone A.
- the wearable device 1 may associate the spatial coordinates of the display area 21 with the spatial coordinates of the predetermined space 51a in advance. Thereby, the wearable device 1 can display the screen SC1 with an appropriate shape, size, and angle at an appropriate position, as described above, based on the position and angle of the smartphone A in the predetermined space 51a.
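Once the display-area coordinates and the coordinates of the predetermined space 51a are associated, placing the screen SC1 outside the outline A1 with one side parallel to a side of the smartphone reduces to simple geometry. A sketch for the axis-aligned case only; the choice of the right edge and the margin value are assumptions for illustration:

```python
def place_screen(phone_rect, screen_size, margin=10):
    """Place the screen SC1 just outside the right edge of the
    smartphone's outline A1, with one side parallel to the phone's
    side (axis-aligned case only; the margin is an assumption).

    phone_rect: (x_min, y_min, x_max, y_max) of outline A1 in
    display-area coordinates; screen_size: (width, height).
    Returns the screen's rectangle (x_min, y_min, x_max, y_max)."""
    px1, py1, px2, py2 = phone_rect
    w, h = screen_size
    x1 = px2 + margin          # start beyond the phone's right edge: no overlap
    y1 = py1                   # top edges aligned, hence parallel sides
    return (x1, y1, x1 + w, y1 + h)
```

Handling an arbitrarily rotated outline A1 would additionally apply the phone's detected rotation angle to the placement, which is what keeps one side of SC1 substantially parallel to one side of the smartphone.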
- FIG. 8 is a diagram illustrating a second example of processing executed by the wearable device 1 according to the present embodiment.
- step S11 a plurality of objects OB are displayed on the smartphone A.
- the plurality of objects OB includes, for example, an object OB1 indicating that the web browser can be activated.
- the smartphone A is outside the predetermined space 51a. When the user moves the right hand in the left direction in such a state, the smartphone A moves into the predetermined space 51a as shown in step S12.
- when the wearable device 1 detects that the smartphone A is in the predetermined space 51a in front of the user, the wearable device 1 activates the imaging unit 3 (out-camera) and displays the captured image captured by the imaging unit 3 on the display unit 2 (step S13). The captured image displayed on the display unit 2 is referred to as a preview window PW.
- the preview window PW includes video data configured by sequentially transmitting images captured by the imaging unit 3.
- when the wearable device 1 detects that the smartphone A is in the predetermined space 51a in front of the user, the wearable device 1 also starts a communication connection with the smartphone A and receives from the smartphone A a display control signal for executing a display on the display unit 2 of the wearable device 1. Then, the wearable device 1 displays a plurality of screens SC2 on the display unit 2 so as to overlap the preview window PW based on the received display control signal (step S13).
- the screen SC2 includes additional information related to the display content displayed at the time when the smartphone A is detected to be in the predetermined space 51a.
- the screen SC2 includes information for explaining the function of the object OB, for example.
- the plurality of screens SC2 are displayed as images including balloon portions starting from the respective objects OB corresponding to the additional information included therein.
- the wearable device 1 can recognize, in the preview window PW, the smartphone A and the display content displayed by the smartphone A. Thereby, the wearable device 1 can display each screen SC2 such that the starting point of its balloon portion matches the display position of the corresponding object OB in the preview window PW. With such a configuration, the wearable device 1 can reliably associate the position of each object OB with the position of the corresponding screen SC2.
- FIG. 9 is a diagram illustrating a third example of processing executed by the wearable device 1 according to the present embodiment.
- the state of step S21 is the same as the state of step S3 shown in FIG. 7.
- the smartphone A is present in the predetermined space 51a, and additional information related to the object OB displayed by the smartphone A is displayed on the display unit 2 as the screen SC1.
- in step S21, the thumb of the right hand R of the user holding the smartphone A is located on a portion of the display surface of the smartphone A where no object OB is displayed.
- when the user brings the thumb of the right hand R into contact with the display position of the object OB1, the wearable device 1 displays the screens SC3 to SC5 instead of the screen SC1, as shown in step S22.
- the screens SC3 to SC5 displayed in step S22 include a plurality of web pages related to the object OB1 (for example, images related to the web browser function). The plurality of web pages may be, for example, web pages that were open when the browser application was previously executed, or web pages registered (bookmarked) in advance by the user. On the screen SC3, for example, a “search page” registered in advance by the user is displayed.
- step S22 when the wearable device 1 detects that the smartphone A is in the predetermined space 51a, the wearable device 1 starts communication connection with the smartphone A and receives a display control signal from the smartphone A.
- the smartphone A communicates with the wearable device 1, the smartphone A detects the contact position of the user's right hand with respect to the display screen.
- the smartphone A generates and transmits a display control signal for causing the wearable device 1 to display information related to the web browser function based on the contact position being the display position of the object OB1 (web browser function).
- the wearable device 1 can display different additional information on the display unit 2 in accordance with the contact position of the user with respect to the display screen of the smartphone A (another electronic device). Thereby, since the wearable apparatus 1 can provide a user with the information suitable for the state of operation with respect to the smart phone A as additional information, it is convenient.
- the display control signal received from the smartphone A may include the display direction of the additional information.
- for example, the display control signal includes signals for controlling the screen SC3 to be displayed to the left of the smartphone A, the screen SC4 to be displayed above the smartphone A, and the screen SC5 to be displayed to the right of the smartphone A.
- based on the received display control signal, the wearable device 1 displays the screen SC3 on the left side of the smartphone A, the screen SC4 on the upper side of the smartphone A, and the screen SC5 on the right side of the smartphone A, as shown in step S22.
- in step S22, the thumb of the user's right hand R remains touching the object OB1 (web browser function) displayed by the smartphone A.
- in this state, the smartphone A activates the web browser function and changes the display to a predetermined web page (this is also referred to as a first process).
- on the other hand, when the smartphone A detects an operation of moving the contact position in a certain direction (for example, the direction of the dotted arrow in step S22), in other words a slide, the smartphone A may execute a process different from the first process.
- for example, from the state in which the thumb of the right hand R is in contact with the display position of the object OB1, the smartphone A detects an operation of moving the contact position in the direction D1 (left direction) toward the screen SC3 displayed by the wearable device 1.
- the smartphone A activates the web browser function and shifts to the “search page (OB2)” displayed as the screen SC3, which is different from the predetermined web page (step S23).
- in step S23, when the display shifts to the “search page”, the smartphone A transmits a display control signal related to the “search page” to the wearable device 1.
- Wearable device 1 displays additional information related to “search page” (for example, web pages related to past search histories 1 to 3) as screens SC6 to SC8 based on the received display control signal.
- in this way, the display control signal received from the smartphone A includes the display direction of the additional information, and based on the display control signal, the control unit 7 of the wearable device 1 displays the additional information on the display-direction side of the other electronic device.
- when the smartphone A detects that the contact position of the user on the display surface of the smartphone A has moved toward the display-direction side of the additional information, the smartphone A regards the additional information as having been selected, and transitions to a display screen based on the additional information.
- in this way, the screen SC (also referred to as a display image) displayed by the wearable device 1 can be selected by a touch operation on the smartphone A, and the wearable device 1 can realize an operation mode in which a screen SC is selected by an operation of moving the contact position in the direction toward the display location of that screen SC. Thereby, the wearable device 1 improves convenience for the user.
- The wearable device 1 may detect the right hand R holding the smartphone A from the detection result of the detection unit 5 or from the captured image captured by the imaging unit 3, and may detect the movement of the thumb (in other words, the upper limb) of the right hand R.
- the wearable device 1 detects the movement of the thumb (upper limb) of the right hand R
- the wearable device 1 estimates which display direction of the plurality of screens SC displayed on the display unit 2 matches the direction of the movement.
- Wearable device 1 transmits to smartphone A a signal indicating that screen SC estimated to match is selected. Then, the smartphone A makes a transition to the display based on the selected screen SC based on the signal received from the wearable device 1.
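- The estimation in the preceding steps (matching the detected upper-limb motion against the display directions of the screens SC) can be sketched as follows. This is an illustrative sketch only: the function name `select_screen_by_motion`, the cosine-similarity rule, and the threshold value are assumptions, not part of the disclosed embodiment.

```python
import math

def select_screen_by_motion(motion, screen_directions, threshold=0.7):
    """Return the id of the screen SC whose display direction best matches the
    detected thumb (upper-limb) motion vector, or None if nothing is close.

    motion            -- (dx, dy) displacement of the thumb in display coordinates
    screen_directions -- {screen_id: (dx, dy)} vectors from the device toward
                         each screen SC displayed by the wearable device
    """
    norm = math.hypot(*motion)
    if norm == 0:
        return None
    best_id, best_cos = None, threshold
    for screen_id, d in screen_directions.items():
        d_norm = math.hypot(*d)
        # cosine similarity between motion and the screen's display direction
        cos = (motion[0] * d[0] + motion[1] * d[1]) / (norm * d_norm)
        if cos > best_cos:
            best_id, best_cos = screen_id, cos
    return best_id
```

The matched screen id would then be sent to the smartphone A as the "selected screen" signal described above.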
- FIG. 10 is a diagram illustrating a fourth example of processing executed by the wearable device 1 according to the present embodiment.
- The smartphone A has an imaging function. As shown in step S31, the smartphone A is executing the imaging function, and a preview image is displayed on its display screen (the sequentially captured image is the shaded portion illustrated in FIG. 10). The smartphone A also displays, on the display screen, an object OB3 (for example, a shutter button) for storing the sequentially captured images.
- In step S31, the smartphone A is outside the predetermined space 51a.
- the smartphone A moves into the predetermined space 51a as shown in step S32.
- When the wearable device 1 detects that the smartphone A is in the predetermined space 51a, the wearable device 1 displays a plurality of screens SC9 in the display area 21, as shown in step S32.
- the screen SC9 includes, for example, additional information related to the imaging function being executed by the smartphone A.
- the screen SC9 may be defined as, for example, additional information regarding the display content displayed by the smartphone A.
- the display content includes, for example, a preview display or a shutter button.
- the screen SC9 may be a list of functions related to imaging that can be realized by the smartphone A, for example.
- the list of functions includes shooting modes suitable for each shooting scene, such as a person, landscape, backlight, night view, and the like.
- the screen SC9 may be a list of photograph data that has been captured and stored previously. As shown in step S32, each of the plurality of screens SC9 is displayed in parallel along the side of the smartphone A.
- In step S32, the user extends the index finger of the left hand L forward.
- the index finger of the left hand L extended forward is located next to the plurality of screens SC9 in the predetermined space 51a.
- the user moves the index finger of the left hand L downward (for example, the direction indicated by the broken line arrow in FIG. 10) along the direction in which the plurality of screens SC9 are arranged.
- the detection unit 5 detects that the index finger has moved downward, and based on this, each of the plurality of screens SC9 is scrolled downward.
- the positions of the screens indicated by hatching in the plurality of screens SC9 are changed downward when the process proceeds from step S32 to step S33.
- That is, the wearable device 1 is configured to scroll the screens SC9 in a direction based on the direction of movement when the index finger of the left hand L moves in a predetermined direction at a position where it does not overlap the screens SC9.
- On the other hand, when the index finger of the left hand L is superimposed on any of the plurality of screens SC9, the wearable device 1 may regard the superimposed screen SC9 as selected by the user.
- wearable device 1 may transmit a control signal for executing a function based on selected screen SC9 to smartphone A, for example.
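- The per-frame scroll-or-select decision described in steps S32 and S33 could be sketched as below. The function name and the decision structure are illustrative assumptions; the embodiment itself does not specify an implementation.

```python
def handle_fingertip(finger_pos, screens, motion):
    """Decide, for one detection frame, whether the fingertip selects a screen
    SC9 or scrolls the list.

    finger_pos -- (x, y) fingertip position in the display area 21
    screens    -- {screen_id: (x0, y0, x1, y1)} rectangles of the screens SC9
    motion     -- (dx, dy) fingertip displacement since the previous frame
    """
    x, y = finger_pos
    for screen_id, (x0, y0, x1, y1) in screens.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return ("select", screen_id)   # superimposed -> regarded as selected
    if motion[1] != 0:
        return ("scroll", motion[1])       # beside the screens -> scroll by dy
    return ("none", None)
```

On a "select" result, a control signal for executing the function of the selected screen SC9 would be transmitted to the smartphone A, as described above.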
- FIG. 11 is a diagram illustrating a fifth example of processing executed by the wearable device 1 according to the present embodiment.
- the smartphone A stores a plurality of moving image files.
- thumbnail images (objects OB4) of a plurality of moving image files are displayed on the smartphone A.
- the smartphone A is in the predetermined space 51a.
- The wearable device 1 is in a state where nothing is displayed on the display unit 2; alternatively, the wearable device 1 may display additional information regarding the display content of the smartphone A.
- In step S41, the thumb of the user's right hand R touches (in other words, selects) one of the thumbnail images displayed by the smartphone A (for example, the image located in the lower center).
- the thumb of the user's right hand R is moving the contact position in the direction toward the outside of the smartphone A while touching the thumbnail image OB4.
- Such an operation may give the user an operational feeling as if the selected thumbnail image OB4 were dragged from the smartphone A into the predetermined space 51a outside the smartphone A (in other words, into the foreground visible to the user).
- When the smartphone A detects such a user action, the smartphone A transfers the moving image data corresponding to the selected thumbnail image OB4 to the wearable device 1.
- the smartphone A changes the display mode of the selected thumbnail image OB4.
- When the wearable device 1 receives the moving image data corresponding to the thumbnail image OB4 from the smartphone A, the wearable device 1 displays the moving image reproduction screen SC10 (for example, additional information) on the display unit 2 based on that data, and executes the moving image reproduction (step S42). At this time, the moving image reproduction screen SC10 is displayed in a reduced size at a position where it is not visually recognized as superimposed on the smartphone A.
- In this way, when a contact operation is performed such that an image selected by contact with the smartphone A is brought out of the smartphone A, the wearable device 1 can display the additional information regarding the selected image in the visible predetermined space 51a.
- the user can refer to the additional information by the wearable device 1 only at a desired timing.
- Such an operation is an intuitive and simple operation.
- Note that, instead of displaying additional information regarding the selected image in the predetermined space 51a in response to the contact operation that brings the image selected on the smartphone A out of the smartphone A, the wearable device 1 may display the image itself in the predetermined space 51a.
- The wearable device 1 recognizes the operation of moving (in other words, dragging) an image selected by touching the smartphone A in a direction toward the outside of the smartphone A based on detection of the movement of the contact position; alternatively, the wearable device 1 may detect the operation of moving the selected image of the smartphone A by the detection unit 5 or the imaging unit 3 (out camera). That is, when the wearable device 1 detects that the smartphone A (another electronic device) is in the predetermined space 51a and that an object touching the smartphone A has performed a predetermined contact operation, the wearable device 1 displays the moving image playback screen SC10 (additional information) on the display unit 2.
- The wearable device 1 may activate the imaging unit 3 using detection of the contact operation as a trigger.
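- The "drag out of the smartphone" contact operation described above can be sketched as a check on the recorded touch trajectory. This is an illustrative sketch; the function name and the start-inside/end-outside rule are assumptions about one plausible recognition criterion.

```python
def detect_drag_out(touch_path, device_rect):
    """Return True when a touch that started inside the smartphone's outline
    (e.g. on a thumbnail image OB4) ends outside it.

    touch_path  -- list of (x, y) contact positions, oldest first
    device_rect -- (x0, y0, x1, y1) outline of the smartphone display
    """
    def inside(p):
        x0, y0, x1, y1 = device_rect
        return x0 <= p[0] <= x1 and y0 <= p[1] <= y1
    # At least two samples, starting inside the outline and ending outside it.
    return len(touch_path) >= 2 and inside(touch_path[0]) and not inside(touch_path[-1])
```

A positive result would trigger the transfer of the corresponding moving image data to the wearable device, as in step S41.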
- In step S42, when the wearable device 1 displays the moving image reproduction screen SC10 on the display unit 2, operation screens SC11 to SC13 for operations related to the moving image reproduction screen SC10 are also displayed on the display unit 2.
- The operation screens SC11 to SC13 may be displayed based on a display control signal transmitted from the smartphone A. Further, the display control signal transmitted from the smartphone A may include information related to the display directions of the operation screens SC11 to SC13. In this case, based on this information, the wearable device 1 displays the operation screen SC11 to the left of the smartphone A, the operation screen SC12 above the smartphone A, and the operation screen SC13 to the right of the smartphone A.
- When the thumb of the right hand R touching the display screen of the smartphone A moves in the direction in which any of the operation screens SC11 to SC13 is displayed, the wearable device 1 regards the operation screen corresponding to the moving direction as selected. In this case, the wearable device 1 executes an operation corresponding to that operation screen.
- The movement of the thumb of the right hand R may be recognized by the touch panel (or contact detection unit) of the smartphone A, or may be detected by the wearable device 1 using the detection unit 5 or the imaging unit 3 (out camera).
- For example, when the operation screen SC12 displayed above the smartphone A (indicating, for example, a moving image play/pause instruction) is selected, the smartphone A recognizes that the screen for executing the pause instruction has been selected. The smartphone A then generates a control signal for pausing the moving image when the moving image is being played on the display unit 2 of the wearable device 1, or for playing the moving image when it is paused, and transmits the control signal to the wearable device 1.
- the wearable device 1 detects that the thumb of the right hand R has moved upward on the display screen of the smartphone A (for example, the direction in which the operation screen SC12 is displayed from the smartphone A) by the detection unit 5. In this case, wearable device 1 transmits a signal indicating that operation screen SC12 has been selected to smartphone A.
- When the smartphone A receives the signal from the wearable device 1, the smartphone A generates a control signal for controlling the wearable device 1 based on the received signal, and transmits the control signal to the wearable device 1.
- For example, the smartphone A can generate a control signal for pausing the moving image if it is being played, or a control signal for resuming playback if it is paused.
- Next, in the state where the moving image reproduction screen SC10 (additional information) is displayed on the display unit 2 as shown in step S42, the user moves the right hand R so that the smartphone A moves from inside the predetermined space 51a to outside the predetermined space 51a.
- When the wearable device 1 detects that the smartphone A has moved out of the predetermined space 51a, the wearable device 1 displays the moving image reproduction screen SC10 in an enlarged manner as shown in step S43.
- In step S43, the moving image playback screen SC10 may be enlarged so as to occupy substantially the entire display area 21 of the display unit 2.
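- The reduced-while-inside / enlarged-when-outside behavior of steps S42 and S43 amounts to a simple state rule; a hypothetical sketch (names and rectangle convention are assumptions) is:

```python
def playback_screen_rect(device_in_space, display_rect, reduced_rect):
    """Choose the rectangle of the moving image reproduction screen SC10.

    While the smartphone is inside the predetermined space 51a, the screen is
    shown reduced (placed so it is not superimposed on the device); once the
    device leaves the space, the screen is enlarged to substantially the whole
    display area 21. Rectangles are (x0, y0, x1, y1) in display coordinates.
    """
    if device_in_space:
        return reduced_rect
    return display_rect
```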
- FIG. 12 is a diagram illustrating a sixth example of the process executed by the wearable device 1 according to the present embodiment.
- the smartphone A is outside the predetermined space 51a in front of the user.
- the smartphone A displays character information OB5a and OB5b having different character sizes.
- the character information OB5a is composed of characters larger than the characters of the character information OB5b.
- the wearable device 1 displays the moving image reproduction screen SC14 in an area that occupies substantially the entire display area 21 of the display unit 2.
- the moving image reproduction screen SC14 includes a plurality of display elements (hereinafter also simply referred to as elements) of a moving image SC14a and character information SC14b (for example, character information related to the moving image SC14a).
- the wearable device 1 changes the size of the video playback screen SC14 according to the position of the smartphone A that has entered the predetermined space 51a (step S52).
- In step S52, the moving image SC14a, which is one of the elements of the moving image playback screen SC14, is reduced, without changing its aspect ratio from that in step S51, to a size that fits between one side of the display area 21 (the left side in the example of FIG. 12) and one side of the outline of the smartphone A (the left side in the example of FIG. 12).
- The wearable device 1 may also make the reduction direction of the moving image playback screen SC14 (image) the same as the entry direction of the smartphone A, following the movement of the position of the smartphone A (another electronic device) that has entered. For example, when the smartphone A enters the predetermined space 51a across its right edge, that is, when the approach direction of the smartphone A is leftward, the wearable device 1 may reduce the video playback screen SC14 toward the left.
- In other words, the size of the moving image playback screen SC14 is changed so that the screen fits within the display area 21 of the display unit 2 and within the area outside the outline of the smartphone A (another electronic device).
- the wearable device 1 can prevent the moving image reproduction screen SC14 from being superimposed on the display screen of the smartphone A and making the display screen difficult to view.
- In step S52, the character information SC14b is divided off and displayed outside the area of the moving image playback screen SC14.
- the character information SC14b displayed outside the area of the moving image reproduction screen SC14 is displayed with a character size that is visible to the user.
- the character information SC14b may be, for example, the same size as the character size (step S51) of the character information SC14b included in the moving image reproduction screen SC14 before being reduced. Further, the size of the character information SC14b may be changed to be the same size as the size of the character information (for example, OB5a or OB5b) displayed by the smartphone A existing in the predetermined space 51a.
- For example, if the size of the character information SC14b is changed to match that of the character information OB5b, the smaller of the character information displayed by the smartphone A, the character information SC14b is still highly likely to be legible to the user.
- In this manner, the wearable device 1 divides at least one element among the plurality of elements included in the image and displays it outside the area of the image.
- In step S52, the character information SC14b, which is an element divided from the moving image reproduction screen SC14, is displayed at a position that does not overlap the smartphone A in the display area 21 of the display unit 2.
- When the wearable device 1 changes the size of the moving image playback screen SC14 (image) and the width of the image in a predetermined direction becomes smaller than a predetermined length (the image is reduced below a predetermined size), the wearable device 1 may hide at least one element (for example, the character information SC14b) of the plurality of elements included in the image.
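- The element handling described above (keep the elements combined, split the character information off, or hide it) depends only on the screen width relative to two thresholds. A hypothetical sketch, with assumed names and threshold ordering:

```python
def layout_elements(screen_width, split_width, hide_width):
    """Decide how the elements of the playback screen SC14 (the moving image
    SC14a and the character information SC14b) are laid out as the screen
    shrinks. Assumes hide_width < split_width.
    """
    if screen_width < hide_width:
        return "hide_text"    # width below the predetermined length: hide SC14b
    if screen_width < split_width:
        return "split_text"   # display SC14b divided off, outside the image area
    return "combined"         # both elements inside one image
```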
- Also in step S52, when the smartphone A is in the predetermined space 51a, the wearable device 1 displays the operation screens SC11 to SC13 on the display unit 2. Thereby, the wearable device 1 allows the moving image reproduction screen SC14 to be operated by the right hand R holding the smartphone A.
- FIG. 13 is a diagram for continuing to explain a sixth example of processing executed by the wearable device 1 according to the present embodiment.
- Above, the configuration in which the size of the moving image playback screen SC14 is changed so that the screen fits within the display area 21 of the display unit 2 and the area outside the outline of the smartphone A has been described. FIG. 13 illustrates this configuration more specifically.
- For the smartphone A, side edges V (VL1, VL2, VH1, VH2 in the example of FIG. 13) are defined. A side edge V is defined as a concept that includes the virtual line passing through the corresponding side of the smartphone A.
- The wearable device 1 may change the size as shown in step S62(a). That is, the size of the moving image reproduction screen SC14 may be changed so that the screen fits between one side (for example, the left side DL) of the display area 21 and, among the plurality of side edges V of the smartphone A (another electronic device), the side edge VL1 closest to the left side DL.
- Here, the side edge V closest to one side of the display area 21 is defined as the side edge that is closest to that side of the display area 21 and parallel to it (or the side edge whose angle of intersection with that side of the display area 21 is smallest).
- When the shape of the moving image reproduction screen SC14 is changed, the wearable device 1 may change the arrangement of the moving image SC14a and the character information SC14b, which are elements of the moving image reproduction screen SC14. That is, when the size of the image is changed and the width of the image in a predetermined direction becomes smaller than a predetermined length, the wearable device 1 may deform the image, divide the plurality of elements included in the image, and rearrange and display them inside the deformed image.
- Alternatively, the wearable device 1 may change the size as shown in step S62(b). That is, the size of the video playback screen SC14 may be changed so that the screen fits between one side (for example, the upper side TL) of the display area 21 and, among the plurality of side edges V of the smartphone A, the side edge VH1 closest to the upper side TL.
- Alternatively, the wearable device 1 may change the size as shown in step S62(c). That is, the size of the moving image reproduction screen SC14 may be changed so that the screen fits between one side (for example, the left side DL) of the display area 21 and, among the plurality of side edges V of the smartphone A (another electronic device), the side edge closest to the left side DL.
- In this case, the character information SC14b, which is an element of the moving image reproduction screen SC14, may be hidden.
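- The fit computation of step S62(a), resizing the screen so it fills the gap between the left side DL of the display area and the nearest parallel side edge VL1 while keeping its aspect ratio, can be sketched as follows. The function name and argument convention are assumptions for illustration.

```python
def fit_left_of_device(display_w, display_h, device_left_x, aspect):
    """Largest size (w, h) of the playback screen SC14 that fits between the
    left side DL of the display area 21 and the device side edge VL1 closest
    (and parallel) to it, keeping the image aspect ratio (w / h).

    device_left_x -- x coordinate of the side edge VL1 in display coordinates
    """
    avail_w = max(0.0, min(device_left_x, display_w))  # horizontal gap DL..VL1
    w = min(avail_w, display_h * aspect)               # also bounded by height
    return (w, w / aspect)
```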
- FIG. 14 is a diagram for continuously explaining a sixth example of processing executed by the wearable device 1 according to the present embodiment.
- The state shown in step S71 is the same as the state shown in step S52 of FIG. 12, and the moving image SC14a is displayed at a size based on the position of the smartphone A that has entered the predetermined space 51a.
- In FIG. 14, a description of the display content displayed on the display screen of the smartphone A is omitted.
- The wearable device 1 detects the distance d by which the smartphone A is separated from the moving image SC14a (or the distance between the smartphone A and the moving image SC14a). Then, the wearable device 1 changes the size of the moving image SC14a based on the position of the smartphone A at the time when the distance d becomes larger than a predetermined length (in other words, when the smartphone A is separated from the moving image SC14a by the predetermined length or more) (step S73).
- When the distance d again becomes larger than the predetermined length, the wearable device 1 may change the size of the moving image SC14a again based on the position of the smartphone A at that time. That is, the wearable device 1 may change the size of the moving image SC14a (image) following the movement of the position of the smartphone A (another electronic device) that has entered the predetermined space 51a and, after this change is completed, change the size of the image stepwise in accordance with the movement of the position of the smartphone A within the predetermined space 51a.
- The wearable device 1 changes the size of the moving image SC14a stepwise, and when the size of the image becomes larger than a predetermined size, the wearable device 1 may combine the divided moving image SC14a and character information SC14b into a single image.
- When the smartphone A moves out of the predetermined space 51a, the moving image SC14a may return to the size it had before the smartphone A entered. That is, when the wearable device 1 detects that the smartphone A has moved out of the predetermined space 51a after the change of the size of the moving image SC14a (image) based on the position of the smartphone A is completed, the wearable device 1 may return the image to the size it had before the change.
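- The stepwise enlargement driven by the separation distance d can be sketched as below. The step width and growth rate are invented parameters; only the "one step per fixed increase in d" behavior comes from the description above.

```python
def stepped_size(distance, base_size, step=50.0, growth=0.1):
    """Size of the moving image SC14a after the follow-up resizing is done:
    each time the separation distance d from the smartphone grows by `step`,
    enlarge the image by a fraction `growth` of its base size.
    """
    n = max(0, int(distance // step))   # number of completed distance steps
    scale = 1.0 + growth * n
    return (base_size[0] * scale, base_size[1] * scale)
```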
- In the above, when the smartphone A exists in the predetermined space 51a, the configuration in which the wearable device 1 displays the additional information related to the display of the smartphone A at a position based on the position of the smartphone A (for example, beside the smartphone A, as shown in FIG. 7) has been illustrated, but this is not limiting. The wearable device 1 may define the display position of the additional information based on other elements.
- FIG. 15 is a diagram illustrating a seventh example of processing executed by the wearable device 1 according to the present embodiment.
- The smartphone A has a route guidance function. As shown in step S81, the smartphone A is executing the route guidance function, and an input screen on which a departure point and a destination point can be entered is displayed. By entering the desired departure point name and destination point name, the user can refer to a guide route, such as the shortest route, from the departure point to the destination point.
- step S81 the user visually recognizes the scenery in front of the user through the display unit 2 of the wearable device 1.
- Assume that a road that can serve as the above-mentioned guide route is included in the scenery ahead.
- the user inputs the starting point and the destination point by moving the smartphone A into the predetermined space 51a ahead and operating the smartphone A.
- When the departure point and destination point are input, the wearable device 1 receives the input route information from the departure point to the destination point from the smartphone A, and detects the current position of the user and the direction the user is facing. Thereby, the wearable device 1 estimates a symbol SC15 (for example, an arrow) indicating the route the user should take from the current location, and displays the symbol SC15 superimposed on the foreground visually recognized by the user (step S82).
- When the route to be taken is changed, for example by re-entering a destination different from the entered destination point, or by changing the search condition from a walking-only route to a transit-plus-walking route, the wearable device 1 may display a different symbol SC in accordance with the change.
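- Estimating the direction in which the arrow symbol SC15 should point, from the user's current position, the next point on the guide route, and the detected user heading, reduces to a bearing calculation. This sketch assumes a flat local coordinate plane and compass-style headings (0 = straight ahead along +y, clockwise positive); the function name is invented.

```python
import math

def heading_arrow(current_pos, next_waypoint, user_heading_deg):
    """Angle (degrees, clockwise) at which the route symbol SC15 should point,
    relative to the direction the user is currently facing."""
    dx = next_waypoint[0] - current_pos[0]
    dy = next_waypoint[1] - current_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0  # bearing of the route leg
    return (bearing - user_heading_deg) % 360.0         # relative arrow angle
```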
- In the above, configurations in which additional information related to the display of an electronic device is displayed on the display unit of the wearable device 1 have been illustrated; however, the target of the additional information is not limited to an electronic device.
- Also, in the above, the wearable device 1 selects the additional information when the finger touching the smartphone A is slid in the direction in which the additional information is displayed; however, the wearable device 1 may be configured to select additional information by an operation other than that of a finger touching the smartphone A.
- FIG. 16 is a diagram illustrating an eighth example of processing executed by the wearable device 1 according to the present embodiment.
- the user wears the wearable device 1 and possesses a carrot CA.
- The wearable device 1 detects the shape information and color information of the object held in the user's right hand R from the captured image captured by the imaging unit 3, and recognizes, by analyzing that information, that the object is a carrot CA.
- When the wearable device 1 recognizes that the carrot CA is in the predetermined space 51a, the wearable device 1 displays, for example, recipe list images SC16 to SC18 of dishes using the carrot CA in the display area 21, as shown in step S92. Each of the images SC16 to SC18 is displayed at a position where it is not visually recognized as superimposed on the carrot CA.
- In step S93, in the state where the images SC16 to SC18 are displayed, the user superimposes a part of the carrot CA on one of the images SC16 to SC18 (the image SC17 in the example of FIG. 16).
- In this case, the wearable device 1 regards the image SC17 on which the carrot CA is superimposed as selected.
- the wearable device 1 may transition to a cooking recipe display screen based on the image SC17, for example.
- As described above, the wearable device 1 includes the display unit 2 disposed in front of the eyes, the detection unit 5 (or the imaging unit 3) that detects a predetermined object (the carrot CA in the above example) existing in the real space, and the control unit 7 that displays additional information related to the predetermined object (for example, the images SC16 to SC18) and that, when it is detected that the predetermined object has moved and is superimposed on the area where the additional information is displayed, executes a selection process for that additional information.
- the predetermined object is not limited to the carrot CA but may be an electronic device such as the smartphone A or the wristwatch type terminal B.
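- The superposition-based selection process can be sketched as an overlap test between the detected object's bounding box and the display areas of the additional-information images. The function names and the overlap-ratio threshold are illustrative assumptions.

```python
def overlap_ratio(obj_rect, info_rect):
    """Fraction of the detected object's bounding box that overlaps the area
    where an additional-information image (e.g. SC16 to SC18) is displayed."""
    ax0, ay0, ax1, ay1 = obj_rect
    bx0, by0, bx1, by1 = info_rect
    w = max(0, min(ax1, bx1) - max(ax0, bx0))
    h = max(0, min(ay1, by1) - max(ay0, by0))
    area = (ax1 - ax0) * (ay1 - ay0)
    return (w * h) / area if area else 0.0

def select_by_superposition(obj_rect, info_rects, min_ratio=0.2):
    """Return the id of the first additional-information image on which the
    predetermined object is superimposed enough to be regarded as selected."""
    for info_id, rect in info_rects.items():
        if overlap_ratio(obj_rect, rect) >= min_ratio:
            return info_id
    return None
```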
- In this case, the wearable device 1 may receive a control signal containing the recipe contents.
- In the above, the configuration in which the additional information is displayed on the left side, the upper side, and the right side of the smartphone A based on the display direction has been illustrated, but the display position of the additional information is not limited thereto.
- FIG. 17 is a diagram illustrating a ninth example of the process executed by the wearable device 1 according to the present embodiment.
- In the above, the additional information is selected by sliding the finger touching the smartphone A in the direction toward the additional information in the state where the additional information is displayed at a position outside the outline of the smartphone A. Such a slide operation may conflict with a normal operation on the smartphone A, which is different from the operation of selecting additional information. Therefore, in the ninth example, a configuration is illustrated in which the operation of selecting additional information and a normal operation on the smartphone A are distinguished by the sliding direction of the slide operation on the smartphone A.
- As shown in step S101, the area (space) around the smartphone A is divided into eight areas by the four side edges V of the smartphone A (where a side edge V is a concept that includes the virtual line passing through it).
- Since screens on the smartphone A are often scrolled or switched by slide operations in the vertical or horizontal direction on the touch panel of the smartphone, the diagonal areas among the divided areas are defined as selection areas F for selecting additional information.
- Additional information is selected when the extension line of the slide trajectory corresponds to a direction passing through a selection area F (the direction D2 in step S101). When the extension line of the slide trajectory corresponds to a direction that does not pass through a selection area F (the direction D3 in step S101), a normal operation on the smartphone A based on that direction is performed.
- When the slide trajectory draws a curve, it may be determined whether the direction of the line connecting the start point and the end point of the trajectory passes through a selection area.
- Note that how the area around the smartphone A is divided may be freely determined by those skilled in the art, and which of the divided areas is used as the selection area for additional information may also be freely determined.
- When the smartphone A moves into the predetermined space 51a in the state shown in step S101 (step S102), the wearable device 1 displays the additional information SC19 and SC20 at positions within the selection areas F in the display area 21 (step S103). In this way, the user can select the additional information SC19 or SC20 by sliding the finger touching the smartphone A in the direction in which that additional information is displayed. Furthermore, the user can perform a normal operation on the smartphone A by sliding the finger in a direction different from the directions in which the additional information SC19 and SC20 are displayed.
- In step S102, when the smartphone A recognizes that it is in the predetermined space 51a, the smartphone A may generate a control signal for causing the wearable device 1 to recognize the defined selection areas F, and transmit it to the wearable device 1.
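- The direction-based distinction of the ninth example, axis-aligned slides as normal operations versus diagonal slides into a selection area F, can be sketched from the line connecting the slide's start and end points. The tolerance angle and function name are assumptions.

```python
import math

def classify_slide(start, end, tolerance_deg=20.0):
    """Classify a slide on the smartphone's touch panel.

    Slides within `tolerance_deg` of vertical or horizontal are treated as
    normal smartphone operations (scrolling, screen switching); diagonal
    slides head into a selection area F and select the additional
    information displayed there.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    if dx == 0 and dy == 0:
        return "none"
    angle = math.degrees(math.atan2(dy, dx)) % 90.0   # fold into one quadrant
    off_axis = min(angle, 90.0 - angle)               # deviation from an axis
    return "normal" if off_axis <= tolerance_deg else "select"
```

For a curved trajectory, passing its start and end points (as in the curve-handling note above) gives the same classification rule.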
- In the ninth example, the configuration in which the operation of selecting additional information and other operations are distinguished by dividing the area around the smartphone A has been illustrated, but the method of distinguishing them is not limited to this.
- FIG. 18 is a diagram illustrating a tenth example of processing executed by the wearable device 1 according to the present embodiment.
- the smartphone A is possessed by the user.
- the wearable device 1 displays the additional information SC21 related to the smartphone A (step S112).
- When the smartphone A detects a predetermined operation, the smartphone A transitions to a state in which an operation on the additional information SC21 (or an operation on the display of the wearable device 1) is possible (step S113).
- When the transition to the state in which the operation on the additional information SC21 is possible is made, the wearable device 1 displays the visual effect OB6 at the touch position on the touch panel in order to make the user recognize this (step S113). With such a configuration, the user can switch between an operation on the additional information SC21 (or an operation on the display of the wearable device 1) and another operation by performing an intended operation on the smartphone A.
- FIG. 19 is a diagram illustrating an eleventh example of processing executed by the wearable device 1 according to the present embodiment.
- When the smartphone A moves into the predetermined space 51a (step S121), the wearable device 1 detects that the smartphone A is in the predetermined space 51a and activates the imaging unit 3 (out camera) (step S122). Then, the captured image captured by the imaging unit 3 is displayed on the display unit 2 (the captured image displayed on the display unit 2 is referred to as a preview window PW2).
- the wearable device 1 starts communication connection with the smartphone A, and receives a display control signal for executing a predetermined display on the display unit 2 of the wearable device 1 from the smartphone A.
- the wearable device 1 calculates a range PW2a including the smartphone A displayed in the preview window PW2 by analyzing the captured image.
- The wearable device 1 displays the screen SC21 including additional information based on the received display control signal as an image that is the same size as or larger than the range PW2a (step S123).
- the screen SC21 may be the same image as the display image displayed when the smartphone A (electronic device) is in the predetermined space 51a.
- the screen SC21 is preferably an opaque image.
- the wearable device 1 may display the visual effect SC22 so that the user can grasp which position on the display screen of the smartphone A is touched.
- In the above, the wearable device 1 is configured to display the screen SC21 so as to include the smartphone A displayed in the preview window PW2, but the configuration is not limited to this.
- When the wearable device 1 detects that the smartphone A is in the predetermined space 51a, the wearable device 1 may detect the position and shape of the smartphone A, estimate, based on the detection result, the outline of the smartphone A visually recognized through the display unit 2 (the outline of the smartphone A in the display area 21), and display the screen SC21 so as to include the estimated outline of the smartphone A.
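- Computing the range PW2a, a rectangle that contains the smartphone as it appears in the preview window so the opaque screen SC21 can cover it, can be sketched from the detected device corners. The margin parameter is an invented detail.

```python
def covering_rect(device_corners, margin=8):
    """Axis-aligned rectangle (x0, y0, x1, y1) containing every detected
    corner of the smartphone in the preview window PW2, expanded by a margin
    so the opaque screen SC21 fully hides the device behind it."""
    xs = [p[0] for p in device_corners]
    ys = [p[1] for p in device_corners]
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)
```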
- As described above, the wearable device 1 has a configuration including a display unit arranged in front of the eyes and a detection unit (or an imaging unit) that detects a predetermined object existing in the real space, and, when the predetermined object is in a predetermined space in front of the user, an image including additional information related to the predetermined object is displayed so that the image encompasses the predetermined object in the display area of the display unit (the predetermined object shown in the preview window captured by the imaging unit 3).
- When the screen SC21 is displayed so as to include the predetermined object, the screen SC21 may be displayed in an orientation that corrects the inclination of the predetermined object with respect to the outline of the display area 21 of the display unit 2, as shown in steps S122 and S123.
- With this, even when the smartphone A is viewed obliquely as shown in step S122, the wearable device 1 can display the additional information related to the smartphone A as a screen whose inclination has been corrected, which makes it easy to see.
- The predetermined object may be an object different from an electronic device such as the smartphone A. Examples of such different objects include a paper medium on which characters are printed and an electronic document such as a PDF file.
- In such a case, when a part of the predetermined object has an orientation different from the rest, the wearable device 1 can correct the orientation of that part to match the other part, so that it can be displayed in an easily viewable manner.
- FIG. 20 is a diagram illustrating a twelfth example of processing executed by the wearable device 1 according to the present embodiment.
- Here, the smartphone A is in a state of being inclined at a predetermined angle or more with respect to the display surface of the display unit 2 of the wearable device 1.
- the wearable device 1 detects that the smartphone A is in the predetermined space 51a.
- the wearable device 1 calculates the tilt angle of the display screen of the smartphone A with respect to the display surface of the display unit 2 of the wearable device 1 based on the detection result of the detection unit 5. If the tilt angle is larger than the predetermined angle, the wearable device 1 activates the imaging unit 3 (out camera) and displays the captured image captured by the imaging unit 3 on the display unit 2 as shown in step S132.
- the wearable device 1 calculates a range PW3a including the smartphone A displayed in the preview window PW3 by analyzing the captured image.
- Then, the wearable device 1 displays, on the display unit 2, a screen SC23 including additional information based on the display control signal received from the smartphone A, as an image having the same size as or larger than the range PW3a (step S133).
- the wearable device 1 may display the visual effect SC22 so that the user can grasp which position on the display screen of the smartphone A is touched.
- As described above, when the wearable device 1 determines, based on the detection result of the detection unit 5, that the display screen of the smartphone A (another electronic device) is inclined more than a predetermined angle with respect to the display surface of the display unit 2 of the wearable device 1, the wearable device 1 displays an image related to the display of the smartphone A superimposed on the smartphone A.
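The tilt test in this twelfth example can be sketched as follows, assuming the detection unit 5 can supply unit normal vectors for the wearable display surface and for the other device's screen. The 30-degree threshold and all function names are illustrative assumptions, not values from the patent.

```python
import math

def tilt_angle_deg(n_display, n_device):
    """Angle between the wearable display surface normal and the other
    device's screen normal, both given as unit 3-vectors."""
    dot = sum(a * b for a, b in zip(n_display, n_device))
    dot = max(-1.0, min(1.0, dot))  # clamp against rounding error
    return math.degrees(math.acos(dot))

def should_superimpose(n_display, n_device, threshold_deg=30.0):
    """Superimpose the display-related image only when the device screen
    is tilted more than the (hypothetical) threshold angle."""
    return tilt_angle_deg(n_display, n_device) > threshold_deg
```

A device screen facing the wearable squarely (parallel normals) yields an angle near zero and no superimposed image; an obliquely held device exceeds the threshold and triggers the superimposed display.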
- The embodiment described above may also be characterized as a network system formed from a wearable device and another electronic device.
- That is, the network system is a network system formed by a wearable device including a display unit arranged in front of the eyes and another electronic device having a display function. The wearable device includes a detection unit that detects whether or not the other electronic device is present in a predetermined space in front of the wearable device and, upon detecting that the other electronic device is present in the predetermined space, transmits information to that effect to the other electronic device. When the other electronic device receives the information, it transmits to the wearable device a display control signal for causing the wearable device to display additional information related to the display content displayed by the other electronic device. When the wearable device receives the display control signal, the wearable device displays the additional information based on the display control signal.
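The message exchange of this network system can be sketched as a minimal sequence of steps (detect, notify, receive a display control signal, display). The classes and the dictionary-based "signal" below are illustrative stand-ins for the actual communication units, not an implementation from the patent.

```python
class Device:
    """Minimal stand-in for the other electronic device."""
    def __init__(self, display_content):
        self.display_content = display_content

    def on_detected(self):
        # On being told it is in the wearable's predetermined space,
        # reply with a display control signal carrying additional info.
        return {"type": "display_control",
                "additional_info": f"related to: {self.display_content}"}

class Wearable:
    """Minimal stand-in for the wearable device side of the exchange."""
    def __init__(self):
        self.displayed = None

    def notify_and_display(self, device):
        # 1. detection -> notify the device; 2. receive the control
        # signal; 3. display the additional information it carries.
        signal = device.on_detected()
        if signal["type"] == "display_control":
            self.displayed = signal["additional_info"]
        return self.displayed
```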
- the wearable device 1 has a glasses shape
- the shape of the wearable device 1 is not limited to this.
- the wearable device 1 may have a helmet-type shape that covers substantially the upper half of the user's head.
- the wearable device 1 may have a mask type shape that covers almost the entire face of the user.
- Although the display unit 2 has been illustrated as having a pair of display parts 2a and 2b provided in front of the user's left and right eyes, it is not limited to this; the display unit 2 may have a single display part provided in front of one of the user's left and right eyes.
- Although the edge portion of the front part has been illustrated as surrounding the entire perimeter of the edge of the display area 21 of the display unit 2, it is not limited to this; it may surround only a part of the edge of the display area 21 of the display unit 2.
- In the above embodiments, a hand or finger is detected as the user's upper limb by the imaging unit 3 and the imaging unit 4 (or the detection unit), but a hand or finger can be detected in the same manner even when it is covered with a glove or the like.
- the configuration and operation of the wearable device 1 have been described.
- the present invention is not limited to this and may be configured as a method or program including each component.
- the network system may be configured by the wearable device 1 and other electronic devices.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
FIG. 4 is a schematic diagram of a network system 100 formed from the wearable device 1 and other electronic devices. As shown in FIG. 4, the wearable device 1 communicates with other electronic devices, for example a smartphone A carried by the user and a wristwatch-type terminal B worn on the user's arm, via a wired or wireless connection. The wearable device 1 communicates with other electronic devices by having the control unit 7 control the communication unit 8. In a situation where the wearable device 1 can communicate with a plurality of electronic devices (for example, the smartphone A and the wristwatch-type terminal B), the communication partner may be identified based on identification information (for example, data indicating an ID) included in a signal transmitted from any of the plurality of electronic devices.
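The partner-identification step above (matching the ID carried in a received signal against known devices) can be sketched as a simple lookup. The signal shape and ID strings here are illustrative assumptions, not from the patent.

```python
def identify_partner(signal, known_devices):
    """Return the registered device whose ID matches the identification
    data carried in the received signal, or None if unknown."""
    return known_devices.get(signal.get("id"))

# Hypothetical registry of communicable devices, keyed by their IDs.
known = {"id-A": "smartphone A", "id-B": "wristwatch-type terminal B"}
```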
With reference to FIGS. 7 to 20, embodiments of a function for displaying, on the wearable device 1, additional information related to the display content of another electronic device will be described.
The embodiments have been described above. Various modifications are shown below.
1a Front part
1b Side part
1c Side part
2a Display part
2b Display part
3 Imaging unit
4 Imaging unit
5 Detection unit
6 Operation unit
7 Control unit
8 Communication unit
9 Storage unit
Claims (27)
- A wearable device comprising a display unit arranged in front of the eyes, the wearable device comprising:
a detection unit that detects whether another electronic device is present in a predetermined space in front of the wearable device,
wherein, when the other electronic device is in the predetermined space and the other electronic device is performing a predetermined display, additional information related to the display is displayed on the display unit.
- The wearable device according to claim 1, further comprising a control unit that detects an outline of the other electronic device in the predetermined space based on a detection result of the detection unit,
wherein the control unit displays an image including the additional information on the display unit so that the image lies outside the outline of the other electronic device.
- The wearable device according to claim 2, wherein the control unit displays the image including the additional information on the display unit so that one side of the image runs substantially parallel to one side of the outline.
- The wearable device according to claim 2 or 3, wherein the control unit displays different additional information on the display unit depending on the position at which the user touches the display screen of the other electronic device.
- The wearable device according to any one of claims 2 to 4, further comprising a communication unit capable of communicating with the other electronic device,
wherein the additional information is displayed on the display unit based on a display control signal received from the other electronic device.
- The wearable device according to claim 5, wherein the display control signal includes a display direction of the additional information, and
the control unit displays the additional information on the display-direction side of the other electronic device based on the display control signal.
- The wearable device according to claim 6, wherein, upon detecting from the detection result of the detection unit a motion of moving the user's upper limb in the display direction, the control unit transmits, to the other electronic device, a signal indicating that the image including the additional information has been selected.
- The wearable device according to claim 1, further comprising a control unit that displays the additional information on the display unit upon detecting that an object in contact with the other electronic device has performed a predetermined contact operation while the other electronic device is in the predetermined space.
- The wearable device according to claim 8, wherein, when the control unit displays the additional information on the display unit, the control unit displays, on the display unit, an operation screen for an operation related to the additional information.
- The wearable device according to claim 9, further comprising a communication unit capable of communicating with the other electronic device,
wherein, upon detecting from the detection result of the detection unit a motion in which the user's upper limb moves from the other electronic device toward the direction in which the operation screen is displayed, the control unit transmits, to the other electronic device, a signal indicating that the operation screen has been selected.
- The wearable device according to any one of claims 1 to 10, wherein, upon detecting that the other electronic device has moved from inside the predetermined space to outside the predetermined space while the additional information is displayed on the display unit, the additional information is displayed in an enlarged manner.
- A wearable device comprising:
a display unit arranged in front of the eyes;
a detection unit that detects a predetermined object existing in real space; and
a control unit that, when the predetermined object is in a predetermined space in front of the user, displays additional information related to the predetermined object at a position in the display area of the display unit that does not overlap the predetermined object,
wherein, upon detecting that the predetermined object has moved and come to overlap the area in which the additional information is displayed, a selection process for the additional information is executed.
- A wearable device comprising:
a display unit arranged in front of the eyes; and
a detection unit that detects a predetermined object existing in real space,
wherein, when the predetermined object is in a predetermined space in front of the user, an image including additional information related to the predetermined object is displayed so that the image encompasses the predetermined object in the display area of the display unit.
- The wearable device according to claim 13, wherein the image is an opaque image.
- The wearable device according to claim 13 or 14, wherein the predetermined object is an electronic device having a display function, and
the image is the same image as the display image being displayed by the electronic device while the electronic device is in the predetermined space.
- A wearable device comprising a display unit arranged in front of the eyes, the wearable device comprising:
a detection unit that detects whether another electronic device is present in real space,
wherein, when the detection result of the detection unit indicates that a display screen of the other electronic device is inclined at a predetermined angle or more with respect to a display surface of the display unit of the wearable device, an image related to the display of the other electronic device is displayed superimposed on the other electronic device.
- A wearable device comprising:
a display unit arranged in front of the user's eyes; and
a detection unit that detects another electronic device existing in a predetermined space in front of the user,
wherein, when the detection unit detects that another electronic device has entered the predetermined space while an image is being displayed on the display unit, the size of the image is changed following the movement of the position of the entering electronic device, and
the change in the size of the image is completed when the moving speed of the other electronic device in the predetermined space falls below a predetermined value.
- The wearable device according to claim 17, wherein the size of the image is changed so that the image fits within the display area of the display unit and within a region outside the outline of the other electronic device.
- The wearable device according to claim 18, wherein the size of the image is changed so that the image fits between one side of the display area and, among the plurality of sides of the other electronic device, the side closest to that one side.
- The wearable device according to claim 19, wherein the size of the image is changed so that the image fits between another side of the display area that intersects the one side at a predetermined angle or more and, among the plurality of sides of the other electronic device, the side closest to that other side.
- The wearable device according to any one of claims 17 to 20, wherein, when the image is reduced following the movement of the position of the entering electronic device, the reduction direction is made the same as the entry direction of the other electronic device.
- The wearable device according to any one of claims 17 to 21, wherein, after the change in the size of the image is completed, the size of the image is changed stepwise in accordance with the movement of the position of the other electronic device in the space.
- The wearable device according to any one of claims 17 to 22, wherein, after the change in the size of the image is completed, the size of the image is returned to the size before the change based on detecting that the other electronic device has moved out of the predetermined space.
- The wearable device according to any one of claims 17 to 23, wherein the image includes a plurality of display elements, and
when the size of the image is changed and the width of the image in a predetermined direction becomes equal to or less than a predetermined length, the wearable device divides at least one display element among the plurality of display elements included in the image and displays it outside the area of the image.
- The wearable device according to any one of claims 17 to 24, wherein the image includes a plurality of display elements, and
when the size of the image is changed and the width of the image in a predetermined direction becomes equal to or less than a predetermined length, the wearable device deforms the image, divides the plurality of display elements, and rearranges and displays them in the deformed image.
- A wearable device comprising:
a display unit that is arranged in front of the user's eyes and displays an image including a plurality of display elements; and
a detection unit that detects a predetermined object existing in a predetermined space in front of the user,
wherein, when the detection unit detects that a predetermined object has entered the predetermined space while the image is being displayed on the display unit, the size of the image is changed in accordance with the movement of the position of the entering object, and
when the size of the image is changed and the width of the image in a predetermined direction becomes equal to or less than a predetermined length, at least one display element among the plurality of display elements included in the image is divided and displayed outside the area of the image.
- The wearable device according to claim 26, wherein the divided at least one display element is displayed at a position in the display area of the display unit that does not overlap the predetermined object.
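The resizing behavior recited above (change the image size while the entering device keeps moving; complete the change once its speed drops below a threshold) can be sketched as a simple per-frame update function. All numeric values and names are illustrative assumptions, not from the claims.

```python
def update_overlay_width(current_width, device_speed, step=8.0,
                         min_width=40.0, speed_threshold=5.0):
    """One update of the displayed image's width while another device is
    in the predetermined space. Shrinks the image stepwise while the
    device keeps moving; freezes (completes) the change once the device's
    speed falls below the threshold. Returns (new_width, completed)."""
    if device_speed < speed_threshold:
        return current_width, True   # change in size is completed
    return max(min_width, current_width - step), False
```

Called once per detection frame, the function keeps reducing the width while the device moves and reports completion as soon as the motion settles.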
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017521921A JP6400197B2 (ja) | 2015-05-29 | 2016-05-27 | Wearable device |
US15/577,657 US10591729B2 (en) | 2015-05-29 | 2016-05-27 | Wearable device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015109498 | 2015-05-29 | ||
JP2015-109498 | 2015-05-29 | ||
JP2015109497 | 2015-05-29 | ||
JP2015-109497 | 2015-05-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016194844A1 true WO2016194844A1 (ja) | 2016-12-08 |
Family
ID=57441112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/065810 WO2016194844A1 (ja) | 2015-05-29 | 2016-05-27 | ウェアラブル装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10591729B2 (ja) |
JP (1) | JP6400197B2 (ja) |
WO (1) | WO2016194844A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2018181261A (ja) * | 2017-04-21 | 2018-11-15 | Kyocera Document Solutions Inc. | Display device |
- JP2018190395A (ja) * | 2018-04-02 | 2018-11-29 | Colopl, Inc. | Method for providing a virtual space, program for causing a computer to execute the method, and information processing device for executing the program |
- JP2018190335A (ja) * | 2017-05-11 | 2018-11-29 | Colopl, Inc. | Method for providing a virtual space, program for causing a computer to execute the method, and information processing device for executing the program |
- JP2019105678A (ja) * | 2017-12-11 | 2019-06-27 | Kyocera Document Solutions Inc. | Display device and image display method |
- WO2020003361A1 (ja) * | 2018-06-25 | 2020-01-02 | Maxell, Ltd. | Head-mounted display, head-mounted display linking system, and method for same |
- JP2020520494A (ja) * | 2017-05-04 | 2020-07-09 | Sony Interactive Entertainment Europe Ltd. | Head mounted display and method |
- JP2020135544A (ja) * | 2019-02-21 | 2020-08-31 | Seiko Epson Corporation | Display system, control program for information processing device, and control method for information processing device |
- JP2020197932A (ja) * | 2019-06-03 | 2020-12-10 | Mitsubishi Electric Corporation | Software operation support system |
- JP6892960B1 (ja) * | 2020-09-29 | 2021-06-23 | KDDI Corporation | Control device, information processing system, control method, and program |
US11211008B2 (en) | 2016-12-21 | 2021-12-28 | Samsung Display Co., Ltd. | Display device and driving method thereof |
US11302037B2 (en) | 2018-12-18 | 2022-04-12 | Samsung Electronics Co., Ltd. | Electronic device for adaptively altering information display area and operation method thereof |
- WO2022244052A1 (ja) * | 2021-05-17 | 2022-11-24 | Maxell, Ltd. | Head-mounted display device |
- WO2022269753A1 (ja) * | 2021-06-22 | 2022-12-29 | Maxell, Ltd. | Information processing system, information processing device, and image display device |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2017021461A (ja) * | 2015-07-08 | 2017-01-26 | Sony Interactive Entertainment Inc. | Operation input device and operation input method |
- KR102548943B1 (ko) * | 2015-11-17 | 2023-06-29 | Samsung Electronics Co., Ltd. | Electronic device and method for providing a screen according to the position of the electronic device |
- CN106126144B (zh) * | 2016-06-28 | 2019-03-29 | Lenovo (Beijing) Co., Ltd. | Information display method and electronic device |
US11036284B2 (en) | 2018-09-14 | 2021-06-15 | Apple Inc. | Tracking and drift correction |
USD918952S1 (en) * | 2018-10-19 | 2021-05-11 | Beijing Xiaomi Mobile Software Co., Ltd. | Electronic device with graphical user interface |
US11450069B2 (en) * | 2018-11-09 | 2022-09-20 | Citrix Systems, Inc. | Systems and methods for a SaaS lens to view obfuscated content |
EP3712759B1 (en) * | 2019-03-18 | 2023-07-19 | Apple Inc. | Virtual paper |
USD1009050S1 (en) * | 2019-06-19 | 2023-12-26 | F. Hoffman-La Roche Ag | Display screen with transitional graphical user interface |
- JP2021043300A (ja) * | 2019-09-10 | 2021-03-18 | Seiko Epson Corporation | Display system, control program for information processing device, control method for information processing device, and display device |
US11379033B2 (en) * | 2019-09-26 | 2022-07-05 | Apple Inc. | Augmented devices |
US11169600B1 (en) * | 2019-12-06 | 2021-11-09 | Snap Inc. | Virtual object display interface between a wearable device and a mobile device |
US11544415B2 (en) | 2019-12-17 | 2023-01-03 | Citrix Systems, Inc. | Context-aware obfuscation and unobfuscation of sensitive content |
US11539709B2 (en) | 2019-12-23 | 2022-12-27 | Citrix Systems, Inc. | Restricted access to sensitive content |
US11582266B2 (en) | 2020-02-03 | 2023-02-14 | Citrix Systems, Inc. | Method and system for protecting privacy of users in session recordings |
US11361113B2 (en) | 2020-03-26 | 2022-06-14 | Citrix Systems, Inc. | System for prevention of image capture of sensitive information and related techniques |
USD959483S1 (en) * | 2020-04-01 | 2022-08-02 | Mitsubishi Electric Building Techno-Service Co., Ltd. | Display screen with graphical user interface |
US20210358294A1 (en) * | 2020-05-15 | 2021-11-18 | Microsoft Technology Licensing, Llc | Holographic device control |
WO2022041163A1 (en) | 2020-08-29 | 2022-03-03 | Citrix Systems, Inc. | Identity leak prevention |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2014071756A (ja) * | 2012-09-28 | 2014-04-21 | Brother Ind Ltd | Work assistance system and program |
- JP2014071811A (ja) * | 2012-10-01 | 2014-04-21 | Sony Corp | Information processing device, display control method, and program |
- JP2014093036A (ja) * | 2012-11-06 | 2014-05-19 | Konica Minolta Inc | Guidance information display device |
- JP2014157482A (ja) * | 2013-02-15 | 2014-08-28 | Konica Minolta Inc | Operation display system |
- JP2014186361A (ja) * | 2013-03-21 | 2014-10-02 | Sony Corp | Information processing device, operation control method, and program |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- TWI423112B (zh) * | 2009-12-09 | 2014-01-11 | Ind Tech Res Inst | Portable virtual input operation device and operation method thereof |
US20120102438A1 (en) * | 2010-10-22 | 2012-04-26 | Robinson Ian N | Display system and method of displaying based on device interactions |
WO2013028813A1 (en) * | 2011-08-23 | 2013-02-28 | Microsoft Corporation | Implicit sharing and privacy control through physical behaviors using sensor-rich devices |
- KR101861380B1 (ko) * | 2012-07-16 | 2018-05-28 | Microsoft Technology Licensing, LLC | Method of outputting content using a head mounted display and head mounted display therefor |
- JP5962403B2 (ja) * | 2012-10-01 | 2016-08-03 | Sony Corporation | Information processing device, display control method, and program |
US20160292922A1 (en) * | 2013-05-21 | 2016-10-06 | Sony Corporation | Display control device, display control method, and recording medium |
US20150302653A1 (en) * | 2014-04-22 | 2015-10-22 | Cherif Atia Algreatly | Augmented Digital Data |
US9858720B2 (en) * | 2014-07-25 | 2018-01-02 | Microsoft Technology Licensing, Llc | Three-dimensional mixed-reality viewport |
- WO2016021252A1 (ja) * | 2014-08-05 | 2016-02-11 | Sony Corporation | Information processing device, information processing method, and image display system |
- KR20160022147A (ko) * | 2014-08-19 | 2016-02-29 | LG Electronics Inc. | Mobile terminal, glasses-type terminal, and interworking method through screen sharing thereof |
US10477090B2 (en) * | 2015-02-25 | 2019-11-12 | Kyocera Corporation | Wearable device, control method and non-transitory storage medium |
- 2016
- 2016-05-27 JP JP2017521921A patent/JP6400197B2/ja active Active
- 2016-05-27 US US15/577,657 patent/US10591729B2/en active Active
- 2016-05-27 WO PCT/JP2016/065810 patent/WO2016194844A1/ja active Application Filing
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11211008B2 (en) | 2016-12-21 | 2021-12-28 | Samsung Display Co., Ltd. | Display device and driving method thereof |
TWI761407B (zh) * | 2016-12-21 | 2022-04-21 | 南韓商三星顯示器有限公司 | 顯示裝置 |
- JP2018181261A (ja) * | 2017-04-21 | 2018-11-15 | Kyocera Document Solutions Inc. | Display device |
US11590415B2 (en) | 2017-05-04 | 2023-02-28 | Sony Interactive Entertainment Inc. | Head mounted display and method |
- JP2020520494A (ja) * | 2017-05-04 | 2020-07-09 | Sony Interactive Entertainment Europe Ltd. | Head mounted display and method |
- JP7191853B2 (ja) | 2017-05-04 | 2022-12-19 | Sony Interactive Entertainment Inc. | Head mounted display and method |
- JP2018190335A (ja) * | 2017-05-11 | 2018-11-29 | Colopl, Inc. | Method for providing a virtual space, program for causing a computer to execute the method, and information processing device for executing the program |
- JP2019105678A (ja) * | 2017-12-11 | 2019-06-27 | Kyocera Document Solutions Inc. | Display device and image display method |
- JP2018190395A (ja) * | 2018-04-02 | 2018-11-29 | Colopl, Inc. | Method for providing a virtual space, program for causing a computer to execute the method, and information processing device for executing the program |
- WO2020003361A1 (ja) * | 2018-06-25 | 2020-01-02 | Maxell, Ltd. | Head-mounted display, head-mounted display linking system, and method for same |
US11921293B2 (en) | 2018-06-25 | 2024-03-05 | Maxell, Ltd. | Head-mounted display, head-mounted display linking system, and method for same |
- JP7379734B2 (ja) | 2018-06-25 | 2023-11-14 | Maxell, Ltd. | Head-mounted display |
- JP2021192116A (ja) * | 2018-06-25 | 2021-12-16 | Maxell, Ltd. | Display device |
US11567333B2 (en) | 2018-06-25 | 2023-01-31 | Maxell, Ltd. | Head-mounted display, head-mounted display linking system, and method for same |
US11378805B2 (en) | 2018-06-25 | 2022-07-05 | Maxell, Ltd. | Head-mounted display, head-mounted display linking system, and method for same |
- JP7209786B2 (ja) | 2018-06-25 | 2023-01-20 | Maxell, Ltd. | Display device |
US11302037B2 (en) | 2018-12-18 | 2022-04-12 | Samsung Electronics Co., Ltd. | Electronic device for adaptively altering information display area and operation method thereof |
- JP7238456B2 (ja) | 2019-02-21 | 2023-03-14 | Seiko Epson Corporation | Display system, control program for information processing device, and control method for information processing device |
- JP2020135544A (ja) * | 2019-02-21 | 2020-08-31 | Seiko Epson Corporation | Display system, control program for information processing device, and control method for information processing device |
- JP7278152B2 (ja) | 2019-06-03 | 2023-05-19 | Mitsubishi Electric Corporation | Software operation support system |
- JP2020197932A (ja) * | 2019-06-03 | 2020-12-10 | Mitsubishi Electric Corporation | Software operation support system |
- JP2022055697A (ja) * | 2020-09-29 | 2022-04-08 | KDDI Corporation | Control device, information processing system, control method, and program |
- JP6892960B1 (ja) * | 2020-09-29 | 2021-06-23 | KDDI Corporation | Control device, information processing system, control method, and program |
- WO2022244052A1 (ja) * | 2021-05-17 | 2022-11-24 | Maxell, Ltd. | Head-mounted display device |
- WO2022269753A1 (ja) * | 2021-06-22 | 2022-12-29 | Maxell, Ltd. | Information processing system, information processing device, and image display device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016194844A1 (ja) | 2018-01-18 |
US20180164589A1 (en) | 2018-06-14 |
JP6400197B2 (ja) | 2018-10-03 |
US10591729B2 (en) | 2020-03-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP6400197B2 (ja) | Wearable device | |
US11262835B2 (en) | Human-body-gesture-based region and volume selection for HMD | |
- JP6510648B2 (ja) | Wearable device, control method, and control program | |
- JP6393367B2 (ja) | Tracking display system, tracking display program, tracking display method, wearable device using them, tracking display program for a wearable device, and method of operating a wearable device | |
- JP6652613B2 (ja) | Wearable device, control method, control program, and imaging device | |
US9886086B2 (en) | Gesture-based reorientation and navigation of a virtual reality (VR) interface | |
- KR102494698B1 (ko) | Method and apparatus for changing the focus of a camera | |
- JP6595597B2 (ja) | Wearable device, control method, and control program | |
EP4246287A1 (en) | Method and system for displaying virtual prop in real environment image, and storage medium | |
- KR20140070326A (ko) | Mobile device providing a three-dimensional interface and gesture control method thereof | |
- KR102110208B1 (ko) | Glasses-type terminal and control method therefor | |
- JP7459798B2 (ja) | Information processing device, information processing method, and program | |
US20230336865A1 (en) | Device, methods, and graphical user interfaces for capturing and displaying media | |
- JP7499819B2 (ja) | Head-mounted display | |
- KR20240091221A (ko) | Devices, methods, and graphical user interfaces for capturing and displaying media | |
- JP2023160103A (ja) | Electronic device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16803279 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017521921 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15577657 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16803279 Country of ref document: EP Kind code of ref document: A1 |