KR20160137253A - Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device - Google Patents

Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device

Info

Publication number
KR20160137253A
Authority
KR
South Korea
Prior art keywords
gesture
command
augmented reality
user
wearable device
Prior art date
Application number
KR1020150072133A
Other languages
Korean (ko)
Inventor
허기수
이동우
신형철
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 (Electronics and Telecommunications Research Institute)
Priority to KR1020150072133A
Publication of KR20160137253A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G06K 9/00389
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an augmented reality device and a user interaction mechanism for the device. The augmented reality device according to an embodiment of the present invention comprises: a camera; a display that provides augmentation information to a user; an image-based gesture recognition module that recognizes a gesture of the user from an image obtained by the camera and generates a first gesture command based on the recognition result; and a control command generation module that generates a control command for the augmentation information by selecting one of the first gesture command and a second gesture command generated from a gesture recognition result of a wearable device, or by combining the first and second gesture commands.

Description

Technical Field [0001] The present invention relates to an augmented reality device, and to a user interaction apparatus and method for the augmented reality device.

The present invention relates to an augmented reality device and a user interaction mechanism used in the augmented reality device.

Various interface technologies, such as touch pads, wearable devices, speech recognition, and image recognition, have been proposed for user interaction with augmented reality content provided through an augmented reality device such as smart glasses (a see-through HMD). Among the best-known smart glasses is Google Glass, which uses speech recognition or card sliding for simple menu manipulation. However, the card-sliding method causes user fatigue, because the user must repeatedly swipe or tap the temple of the glasses to control menus, and it is limited when menu control becomes complex.

On the other hand, image recognition using a camera mounted on smart glasses has the advantage that augmentation information in the real world can be directly touched, selected, and controlled by hand. Its disadvantages are that the hand can obscure the field of view, that the user's arm fatigue during control is significant, and that the gestures have low social acceptability or look awkward. In addition, image recognition has the drawback that its recognition rate is sensitive to changes in the surrounding environment, such as background and illumination, and that its computational load is large. The advantages and disadvantages of interaction using a wearable device such as a smart watch are roughly the converse of those of interaction using image recognition.

The background art of the present invention is disclosed in Korean Patent Application Nos. 10-2015-0013277, 10-2014-0163318, and 10-2014-0057774.

Embodiments of the present invention provide an optimal user interaction mechanism for an augmented reality device by combining image recognition using a camera of an augmented reality device and motion recognition of a wearable device.

According to an aspect of the present invention, an augmented reality device includes: a camera; a display that provides augmentation information to the user; an image-based gesture recognition module that recognizes the user's gesture from an image acquired through the camera and generates a first gesture command based on the recognition result; and a control command generation module that selects one of the first gesture command and a second gesture command generated based on a gesture recognition result from a wearable device, or combines the first and second gesture commands, to generate a control command for the augmentation information.

Here, the augmented reality device may be smart glasses.

In one embodiment, the image-based gesture recognition module includes: a hand region detection sub-module that detects hand region data of the user from the image obtained through the camera; a hand information extraction sub-module that extracts hand information from the detected hand region data; and a first gesture command generation sub-module that generates the first gesture command based on the hand information.

In one embodiment, the first gesture command may be divided into a finger mode and a palm mode according to the number of fingers included in the hand information.

In one embodiment, the wearable device may be a device worn by the user independently of the augmented reality device.

In one embodiment, the augmented reality device may further include a wearable device interface that receives a gesture recognition result from the wearable device, and a wearable device gesture recognition module that generates the second gesture command based on the gesture recognition result received through the wearable device interface.

In one embodiment, when one of the first gesture command and the second gesture command is selected, the control command generation module may send the selection result to at least one of the image-based gesture recognition module and the wearable device gesture recognition module.

In one embodiment, the control command generation module may select the second gesture command when the gesture initiation operation is recognized by the wearable device outside the camera's field of view.

In one embodiment, the augmented reality device may further include a menu control module for providing a user menu optimized for the gesture command selected by the control command generation module through the display.

According to another embodiment of the present invention, there is provided a user interaction apparatus for an augmented reality device having a camera and a display for providing augmentation information. The apparatus comprises: an image-based gesture recognition module that recognizes the user's gesture from an image acquired through the camera and generates a first gesture command based on the recognition result; and a control command generation module that selects one of the first gesture command and a second gesture command generated based on a gesture recognition result from a wearable device, or combines the first and second gesture commands, to generate a control command for the augmentation information.

According to another embodiment of the present invention, there is provided a user interaction method for an augmented reality device having a camera and a display for providing augmentation information. The method comprises: recognizing the user's gesture from an image acquired through the camera and generating a first gesture command based on the recognition result; generating a second gesture command based on a gesture recognition result from a wearable device; and selecting one of the first gesture command and the second gesture command, or combining the first and second gesture commands, to generate a control command for the augmentation information.

According to various embodiments of the present invention, an optimal user interaction mechanism for augmented reality devices such as smart glasses can be provided by combining image recognition using a camera and motion recognition of a wearable device.

FIG. 1 shows an example of augmentation information provided through an augmented reality device.
FIG. 2 shows an example of user interaction with the augmentation information of FIG. 1.
FIG. 3 is a block diagram illustrating a configuration of an augmented reality device according to an embodiment of the present invention.
FIG. 4 is a block diagram illustrating a detailed configuration of an image-based gesture recognition module 320 according to an exemplary embodiment of the present invention.
FIGS. 5A-5C illustrate examples of finger-mode gestures identified from image-based gesture recognition results.
FIGS. 6A-6C illustrate examples of palm-mode gestures identified from image-based gesture recognition results.
FIG. 7 shows an example of a wearable device usable in the present invention.
FIG. 8 is a diagram illustrating examples of basic gestures recognized using a wearable device.
FIG. 9 shows an example of a motion trajectory generated by combining the basic motion trajectories of FIG. 8.
FIG. 10 is a flowchart illustrating a user interaction method for an augmented reality device according to an embodiment of the present invention.

While the present invention has been described in connection with certain exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS In the following description, detailed descriptions of known related art are omitted where they could unnecessarily obscure the gist of the present invention.

In addition, numerals used in the description of the present invention are merely identifiers for distinguishing one component from another.

In addition, singular expressions used in the present specification and claims should generally be interpreted to mean "one or more" unless otherwise stated.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. To facilitate a thorough understanding of the present invention, the same reference numerals are used for the same elements throughout the drawings.

FIG. 1 shows an example of augmentation information provided through an augmented reality device. As shown, when a user wearing an augmented reality device (e.g., smart glasses) views an audio device 110 in the real world, the augmented reality device recognizes the audio device and displays a virtual volume dial and a virtual power button (130), so that the user can simultaneously view the real object and the augmentation information.

In this situation, when the user wants to control the audio volume by interacting with the virtual volume dial, the volume can in principle be controlled using either camera-based image gesture recognition or wearable-device-based gesture recognition alone. Image-based gesture recognition has the advantage that the user can intuitively select the virtual volume dial from his or her own point of view, but its operation is less intuitive and less accurate for the volume control itself. Conversely, when only wearable-device-based gesture recognition is used, it is difficult to intuitively select the target augmentation information when a large amount of augmentation information is present. Accordingly, embodiments of the present invention use image-based gesture recognition and wearable-device-based gesture recognition selectively or in combination, as needed, to provide more intuitive interaction.

For example, when the user wishes to adjust the volume of the audio device, as shown in FIG. 2, a gesture of pointing at the visible volume dial with a finger to select the augmented virtual volume dial is recognized (210) based on the camera image, a gesture in which the user grasps the dial and twists the wrist to the right is recognized (220) through the wearable device, and the volume is controlled through the combination of these two gestures, thereby providing more intuitive interaction to the user.
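As a concrete illustration of how such a combined command stream could be consumed, the following Python sketch merges an image-based selection command with a wearable twist command into a single volume control action. It is a minimal sketch only; the class names, the target identifier, and the degrees-to-volume scaling are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class PointSelect:
    """First gesture command (image-based): pointing at augmentation info (210)."""
    target_id: str          # hypothetical identifier, e.g. "virtual_volume_dial"

@dataclass
class WristTwist:
    """Second gesture command (wearable-based): grasp-and-twist of the wrist (220)."""
    degrees: float          # positive = twist to the right

class VolumeController:
    """Combines the two commands: a selection must precede a twist."""
    def __init__(self):
        self.selected = None

    def on_command(self, cmd):
        if isinstance(cmd, PointSelect):
            self.selected = cmd.target_id                     # step 210: select dial
        elif isinstance(cmd, WristTwist) and self.selected == "virtual_volume_dial":
            return ("set_volume_delta", cmd.degrees / 10.0)   # step 220: turn dial
        return None

ctrl = VolumeController()
ctrl.on_command(PointSelect("virtual_volume_dial"))
print(ctrl.on_command(WristTwist(30.0)))   # -> ('set_volume_delta', 3.0)
```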

FIG. 3 is a block diagram illustrating a configuration of an augmented reality device according to an embodiment of the present invention.

In one embodiment, the augmented reality device 300 includes a camera 310, an image-based gesture recognition module 320, a control command generation module 330, a menu control module 340, a display 350, a wearable device interface 360, and a wearable device gesture recognition module 370.

Smart glasses are a representative example of an augmented reality device. However, embodiments of the present invention are not necessarily limited to smart glasses, and may be applied to various types of augmented reality devices capable of providing augmentation information to a user.

The camera 310 may be a color (RGB) camera, a stereo camera, a depth camera, or a combined color-and-depth (RGB-D) camera, and may be a video camera capable of capturing moving images. In one embodiment, the camera 310 acquires an image of the user's hand for the gesture recognition on which user interaction is based.

The image-based gesture recognition module 320 recognizes a single gesture or a series of gestures of the user from an image (e.g., an image of the user's hand) obtained through the camera 310, and generates a first gesture command based on the recognition result.

The control command generation module 330 selects one of the first gesture command and a second gesture command generated based on the gesture recognition result from the wearable device 380, or combines the first and second gesture commands, to generate a control command for the augmentation information provided through the display 350.

In one embodiment, the control command generation module 330 may select either the first gesture command, based on gesture recognition from the image, or the second gesture command, based on gesture recognition through the wearable device, according to various policies. For example, the second gesture command may be selected instead of the first gesture command when the user's gesture start operation is recognized through the wearable device 380 outside the field of view of the camera 310. That is, to control the augmentation information through the wearable device 380, the user starts the gesture command outside the camera image; once the gesture command has started, the gesture is treated as a continuous wearable-device gesture even if the user's hand subsequently enters the camera image. Conversely, to control the content through the camera image, the user's hand should enter the camera image without the gesture start operation (e.g., a clenched fist) in effect; after such an entry, even a gesture start operation through the wearable device 380 is treated as part of the ongoing series of image-based gestures.
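Read this way, the policy is a small state machine keyed on where the gesture started and whether a gesture is already in progress. The sketch below is one plausible rendering under that reading; the method names, the pause/resume feedback calls, and the stub recognizers are assumptions, not the patent's code.

```python
class RecognizerStub:
    """Stand-in for the image-based or wearable gesture recognition module."""
    def pause(self): print("recognizer paused")
    def resume(self): print("recognizer resumed")

class ControlCommandSelector:
    """Selects the command source: a gesture started by the wearable outside
    the camera's field of view stays a wearable gesture even if the hand later
    enters the image, and vice versa for image-started gestures."""
    def __init__(self, image_module, wearable_module):
        self.image_module, self.wearable_module = image_module, wearable_module
        self.active = None   # "image" | "wearable" | None

    def on_gesture_start(self, source, hand_in_camera_view):
        if self.active is not None:
            return self.active                  # ongoing gesture keeps its modality
        if source == "wearable" and not hand_in_camera_view:
            self.active = "wearable"
            self.image_module.pause()           # feedback: stop the unused recognizer
        elif source == "image" and hand_in_camera_view:
            self.active = "image"
            self.wearable_module.pause()        # reduces computation, saves battery
        return self.active

    def on_gesture_end(self):
        self.active = None
        self.image_module.resume()
        self.wearable_module.resume()

sel = ControlCommandSelector(RecognizerStub(), RecognizerStub())
print(sel.on_gesture_start("wearable", hand_in_camera_view=False))  # -> 'wearable'
```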

Alternatively, the control command generation module 330 may combine the first gesture command based on the image recognition and the second gesture command based on the wearable device to generate a final control command.

In addition, when the first gesture command or the second gesture command is selected, the control command generation module 330 may feed the selection result back to at least one of the image-based gesture recognition module 320 and the wearable device gesture recognition module 370. The unselected recognition module 320 or 370 then stops its gesture recognition process, reducing the amount of computation and saving battery power.

The menu control module 340 may provide, through the display 350, a user menu optimized for the gesture command selected by the control command generation module 330, further increasing the user's interaction convenience.
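The patent does not spell out what an "optimized" menu looks like, so the following is purely an assumption about how such a mapping might be expressed: large, pointable targets when image-based gestures are active, and deeper, trajectory-scrollable lists when the wearable is active.

```python
# Hypothetical menu presets per input modality; all values are assumptions.
MENU_PRESETS = {
    "image":    {"item_size": "large", "layout": "grid", "depth": 1},  # easy to point at
    "wearable": {"item_size": "small", "layout": "list", "depth": 3},  # trajectory scrolling
}

def menu_for(selected_modality: str) -> dict:
    """Return the menu preset for the currently selected gesture modality."""
    return MENU_PRESETS.get(selected_modality, MENU_PRESETS["image"])
```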

The display 350 displays the enhancement information so that the user can view the enhancement information together with the physical real world.

The wearable device interface 360 is responsible for communication between the wearable device 380 and the augmented reality device 300. Here, the wearable device 380 is a device worn by the user independently of the augmented reality device 300. Representative examples of the wearable device 380 include smart bands and smart watches, but it should be understood that the present invention is not limited to any specific type of wearable device.

In one embodiment, the wearable device gesture recognition module 370 generates the second gesture command based on the gesture recognition result received through the wearable device interface 360. Gesture recognition using a wearable device is disclosed in Korean Patent Application Nos. 10-2015-0013277 and 10-2014-0163318 (entitled "Method for detecting hand motion using an optical sensor"), filed by the same applicant as the present application.

Although the wearable device gesture recognition module 370 is illustrated in FIG. 3 as implemented within the augmented reality device 300, according to another embodiment the wearable device gesture recognition module 370 may be implemented in the wearable device 380. In this case, the wearable device gesture recognition module 370 implemented in the wearable device 380 generates the second gesture command based on the gesture recognition result of the wearable device and transmits it to the augmented reality device 300 through the wearable device interface 360.
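In the wearable-side variant, the wearable device interface only has to carry a compact command message. A hedged sketch of such a payload follows; the field names and the JSON encoding are assumptions, since the patent does not define a wire format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SecondGestureCommand:
    """Hypothetical payload a wearable-side recognizer might send to the
    augmented reality device 300 through the wearable device interface 360."""
    gesture: str        # e.g. "right_down"
    confidence: float   # recognizer confidence in [0, 1]
    timestamp_ms: int   # for ordering against image-based commands

def encode(cmd: SecondGestureCommand) -> bytes:
    return json.dumps(asdict(cmd)).encode("utf-8")

def decode(raw: bytes) -> SecondGestureCommand:
    return SecondGestureCommand(**json.loads(raw.decode("utf-8")))

msg = encode(SecondGestureCommand("right_down", 0.92, 1234))
print(decode(msg))
```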

FIG. 3 is a functional view of each component; it will be obvious to those skilled in the art that the device 300 may also include hardware components such as a processor and a memory. Accordingly, the image-based gesture recognition module 320, the control command generation module 330, the menu control module 340, and the wearable device gesture recognition module 370 described above may be written as programs, stored in the memory, and executed by the processor.

FIG. 4 is a block diagram illustrating a detailed configuration of an image-based gesture recognition module 320 according to an exemplary embodiment of the present invention. As shown, the image-based gesture recognition module 320 may include a hand region detection sub-module 321, a hand information extraction sub-module 322, and a first gesture command generation sub-module 323.

The hand region detection sub-module 321 detects the user's hand region data from the image obtained through the camera.

The hand information extraction sub-module 322 extracts hand information from the detected hand region data. Examples of the hand information include the direction of the palm center, the positions of the hand ends (fingertips), the number of fingers, and the like. For the hand information extraction method, the method proposed in Korean Patent Application No. 10-2014-0057774 (entitled "Apparatus for detecting a user's hand region and operating method thereof"), filed by the same applicant as the present application, can be used.

The first gesture command generation sub-module 323 generates the first gesture command based on the extracted hand information.

In one embodiment, the first gesture command can be classified into a finger mode (Finger UI Mode) and a palm mode (Palm UI Mode) according to the number of fingers included in the hand information. For example, when the number of fingers is two or fewer, the command is classified as finger mode; otherwise, it is classified as palm mode.
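The actual extraction method is the subject of the referenced application; as a generic stand-in, the classic OpenCV skin-segmentation and convexity-defect approach below shows how a finger count can be obtained from a camera frame and mapped onto the two UI modes. The color range, the defect-depth threshold, and the two-finger cutoff are assumptions for illustration.

```python
import cv2
import numpy as np

FINGER_MODE_MAX_FINGERS = 2   # per the embodiment: two or fewer fingers -> finger mode

def count_fingers(bgr_image):
    """Estimate extended fingers via skin segmentation + convexity defects.
    Returns (finger_count, hand_contour); (0, None) if no hand-like blob."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Coarse skin-color range; a real system calibrates for user and illumination.
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0, None
    hand = max(contours, key=cv2.contourArea)        # largest blob = hand region
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0, hand
    valleys = 0
    for s, e, f, depth in defects[:, 0]:
        a = np.linalg.norm(hand[s][0] - hand[f][0])
        b = np.linalg.norm(hand[e][0] - hand[f][0])
        c = np.linalg.norm(hand[s][0] - hand[e][0])
        angle = np.arccos(np.clip((a**2 + b**2 - c**2) / (2 * a * b + 1e-6), -1.0, 1.0))
        if angle < np.pi / 2 and depth > 10000:      # sharp, deep valley between fingers
            valleys += 1
    # NB: a single extended finger produces no valley; real systems special-case this.
    return (valleys + 1 if valleys else 0), hand

def classify_ui_mode(finger_count):
    """Finger UI mode for two or fewer fingers, palm UI mode otherwise."""
    return "finger_mode" if finger_count <= FINGER_MODE_MAX_FINGERS else "palm_mode"
```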

FIGS. 5A-5C illustrate examples of finger-mode gestures identified from image-based gesture recognition results.

FIG. 5A shows an example of a selection gesture using a single finger: by intuitively touching an object displayed in the user's view with one finger, the user can select augmentation information in the image projected on the smart glasses.

FIG. 5B shows a "back" gesture using one finger, in which the thumb points to the left. As an example, it can be used as an input for a command to return to the home screen or to exit an app, as on a smartphone.

FIG. 5C shows a V gesture using two fingers, which can be used as an input for a camera shooting command.

FIGS. 6A-6C illustrate examples of palm-mode gestures identified from image-based gesture recognition results.

Specifically, the gesture shown in FIG. 6A consists of a series of movements that bring the hand onto the screen, and can be used as an input for a menu display command or the like.

The gesture shown in FIG. 6B consists of a series of movements that remove the hand from the screen, and can be used as an input for a menu hiding command.

FIG. 6C shows a left-right tilting gesture of the hand together with a further hand gesture. The left-right tilting gesture can be used as an input for a command to move the menu left and right, and the accompanying gesture can likewise be used as an input for a corresponding command.

FIG. 7 shows an example of a wearable device usable in the present invention.

As shown in FIG. 7, the wearable device 700 includes a main body 710 with a built-in computing unit, an automatic locking band 720, a signal bus line 730, and a detection unit 740 in which light-emitting/light-receiving elements are arranged. The wearable device 700 has a band or ring shape and can detect changes in the movement of the wrist muscles as the fist is clenched and unclenched. In addition, the wearable device 700 includes a 9-axis motion sensor module comprising a gyro sensor, an acceleration sensor, a geomagnetic sensor, and the like, so that not only hand gestures but also the trajectory of a gesture can be tracked.

In one embodiment, the user wearing the wearable device 700 performs a predetermined gesture start operation (e.g., clenching the fist), moves the arm to create the hand motion trajectory corresponding to the desired gesture, and then performs a predetermined gesture termination operation (e.g., opening the fist), so that the wearable device 700 can recognize the gesture more accurately. In other words, by clenching the fist at the start of the desired gesture command and opening it at the end, the user clearly delimits the start and end of the command, allowing the wearable device 700 to accurately recognize the hand trajectory. Noise that might otherwise occur during gesture recognition can thereby be removed.
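This clench-to-start, open-to-end delimiting is straightforward to sketch: motion samples are accumulated only between the two muscle events, and everything outside is discarded as noise. The event names and the sample layout below are assumptions, not the device's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class TrajectorySegmenter:
    """Keeps 9-axis motion samples only between a fist-clench start event
    and a fist-open end event, discarding all other arm movement."""
    recording: bool = False
    samples: list = field(default_factory=list)

    def on_muscle_event(self, event):
        if event == "fist_clenched":                       # gesture start delimiter
            self.recording, self.samples = True, []
        elif event == "fist_opened" and self.recording:    # gesture end delimiter
            self.recording = False
            return self.samples        # completed trajectory, ready for recognition
        return None

    def on_motion_sample(self, accel, gyro, mag):
        if self.recording:             # motion outside the delimiters is ignored
            self.samples.append((accel, gyro, mag))

seg = TrajectorySegmenter()
seg.on_muscle_event("fist_clenched")
seg.on_motion_sample((0.0, 0.1, 9.8), (0.01, 0.0, 0.0), (22.0, 5.0, -40.0))
print(len(seg.on_muscle_event("fist_opened")))   # -> 1
```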

FIG. 8 is a diagram illustrating examples of basic gestures recognized using a wearable device.

The basic gestures shown in FIG. 8 can be combined with one another or extended by adding new trajectories. In addition, the trajectory of a gesture may be divided into a plurality of segments according to the settings, and a corresponding gesture command may be generated for each. For example, the right gesture command in FIG. 8 can be divided into a plurality of right gesture commands according to a motion-variation setting value.

FIG. 9 shows an example of a motion trajectory generated by combining the basic motion trajectories of FIG. 8. As shown, a right-down gesture corresponding to a new command can be generated by combining the right gesture and the down gesture of FIG. 8, and a right-up gesture can be generated by combining the right gesture and the up gesture.
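A toy rendering of this composition, in the spirit of FIG. 9: basic trajectory primitives are looked up as sequences, with known composites taking precedence over single primitives. The primitive names and the command table are assumptions.

```python
BASIC_GESTURES = {"left", "right", "up", "down"}

# Composite commands formed from basic trajectories, as in FIG. 9.
COMPOSITE_COMMANDS = {
    ("right", "down"): "right_down",
    ("right", "up"):   "right_up",
}

def compose(sequence):
    """Map a sequence of recognized basic gestures to a command:
    a known composite first, else the single basic gesture, else None."""
    key = tuple(sequence)
    if key in COMPOSITE_COMMANDS:
        return COMPOSITE_COMMANDS[key]
    if len(sequence) == 1 and sequence[0] in BASIC_GESTURES:
        return sequence[0]
    return None

print(compose(["right", "down"]))   # -> right_down
```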

FIG. 10 is a flowchart illustrating a user interaction method for an augmented reality device according to an embodiment of the present invention.

In step S1010, the user's gesture is recognized from the image obtained through the camera provided in the augmented reality device, and a first gesture command is generated based on the recognition result. More specifically, the user's hand region data can be detected from the image obtained through the camera, hand information can be extracted from the detected hand region data, and the first gesture command can be generated based on the extracted hand information.

In one embodiment, the first gesture command may be divided into a finger mode and a palm mode according to the number of fingers included in the hand information.

In step S1020, a second gesture command is generated based on the result of the gesture recognition by the wearable device. In this case, the wearable device may be a device worn by the user independently of the augmented reality device.

Steps S1010 and S1020 described above are shown sequentially in FIG. 10, but this is for convenience of explanation; these steps can be performed in parallel.

Next, in step S1030, one of the first gesture command generated through image-based gesture recognition and the second gesture command generated through wearable-device-based gesture recognition is selected, or the first and second gesture commands are combined, to generate a control command for the augmented reality content.

In one embodiment, when gesture recognition is initiated by the wearable device outside the camera's field of view, the second gesture command may be selected.

In one embodiment, when one of the first gesture command and the second gesture command is selected, generation of the unselected gesture command is stopped.

In one embodiment, when one of the first gesture command and the second gesture command is selected, a user menu optimized for the selected gesture command may be provided through the display of the augmented reality device, further increasing the user's interaction convenience.

The method according to the above-described embodiments of the present invention may be implemented in the form of program instructions executable through various computer means, recorded on a storage medium, stored in a memory of the augmented reality device, and executed by a processor. The storage medium may include program instructions, data files, data structures, and the like, alone or in combination.

The program instructions recorded on the storage medium may be those specially designed and constructed for the present invention, or those available to those skilled in software. Examples of storage media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. The medium may also be a transmission medium, such as an optical or metal wire or a waveguide, including a carrier wave transmitting a signal designating program instructions, data structures, and the like. Examples of program instructions include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

In addition, the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention as defined in the appended claims, and such modifications and variations fall within the scope of the invention.

Claims (16)

1. An augmented reality device comprising:
a camera;
a display that provides augmentation information to a user;
an image-based gesture recognition module that recognizes a gesture of the user from an image acquired through the camera and generates a first gesture command based on the recognition result; and
a control command generation module that selects one of the first gesture command and a second gesture command generated based on a gesture recognition result from a wearable device, or combines the first and second gesture commands, to generate a control command for the augmentation information.

2. The augmented reality device of claim 1, wherein the augmented reality device is smart glasses.

3. The augmented reality device of claim 1, wherein the image-based gesture recognition module comprises:
a hand region detection sub-module that detects hand region data of the user from the image obtained through the camera;
a hand information extraction sub-module that extracts hand information from the detected hand region data; and
a first gesture command generation sub-module that generates the first gesture command based on the hand information.

4. The augmented reality device of claim 3, wherein the first gesture command is classified into a finger mode and a palm mode according to the number of fingers included in the hand information.

5. The augmented reality device of claim 1, wherein the wearable device is worn by the user independently of the augmented reality device.

6. The augmented reality device of claim 1, further comprising: a wearable device interface that receives a gesture recognition result from the wearable device; and a wearable device gesture recognition module that generates the second gesture command based on the gesture recognition result received through the wearable device interface.

7. The augmented reality device of claim 6, wherein, when one of the first gesture command and the second gesture command is selected, the control command generation module sends the selection result to at least one of the image-based gesture recognition module and the wearable device gesture recognition module.

8. The augmented reality device of claim 1, wherein the control command generation module selects the second gesture command when a gesture start operation is recognized by the wearable device outside the camera's field of view.

9. The augmented reality device of claim 1, further comprising a menu control module that provides, through the display, a user menu optimized for the gesture command selected by the control command generation module.

10. A user interaction apparatus for an augmented reality device having a camera and a display for providing augmentation information, the apparatus comprising:
an image-based gesture recognition module that recognizes the user's gesture from an image acquired through the camera and generates a first gesture command based on the recognition result; and
a control command generation module that selects one of the first gesture command and a second gesture command generated based on a gesture recognition result from a wearable device, or combines the first and second gesture commands, to generate a control command for the augmentation information.

11. A user interaction method for an augmented reality device having a camera and a display for providing augmentation information, the method comprising:
recognizing the user's gesture from an image obtained through the camera and generating a first gesture command based on the recognition result;
generating a second gesture command based on a gesture recognition result from a wearable device; and
selecting one of the first gesture command and the second gesture command, or combining the first and second gesture commands, to generate a control command for the augmentation information.

12. The method of claim 11, wherein generating the first gesture command comprises:
detecting hand region data of the user from the image obtained through the camera;
extracting hand information from the detected hand region data; and
generating the first gesture command based on the hand information.

13. The method of claim 11, wherein the wearable device is worn by the user independently of the augmented reality device.

14. The method of claim 11, wherein, in generating the control command, generation of the unselected gesture command is stopped when one of the first gesture command and the second gesture command is selected.

15. The method of claim 14, wherein, in generating the control command, the second gesture command is selected when gesture recognition is started by the wearable device outside the camera's field of view.

16. The method of claim 11, further comprising providing, through the display, a user menu optimized for the selected gesture command when one of the first gesture command and the second gesture command is selected.


KR1020150072133A 2015-05-22 2015-05-22 Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device KR20160137253A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150072133A KR20160137253A (en) 2015-05-22 2015-05-22 Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150072133A KR20160137253A (en) 2015-05-22 2015-05-22 Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device

Publications (1)

Publication Number Publication Date
KR20160137253A (en) 2016-11-30

Family

ID=57707309

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150072133A KR20160137253A (en) 2015-05-22 2015-05-22 Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device

Country Status (1)

Country Link
KR (1) KR20160137253A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210029753A (en) * 2018-10-25 2021-03-16 울산대학교 산학협력단 Method and apparatus for recognizing gesture
WO2021230568A1 (en) * 2020-05-13 2021-11-18 삼성전자 주식회사 Electronic device for providing augmented reality service and operating method thereof
KR102321359B1 (en) * 2020-06-02 2021-11-02 국민대학교산학협력단 Portable terminal and vr viewer system for controlling virtual reality content based on hand shape and illumination sensor
WO2021246574A1 (en) * 2020-06-02 2021-12-09 국민대학교산학협력단 Portable terminal for controlling virtual reality content on basis of hand shape and illumination sensor, and vr viewer system including same
KR20220077361A (en) * 2020-12-02 2022-06-09 (주)유비컴 Gesture recognition device using 3D virtual space and wrist band and recognition method using the same
WO2023282446A1 (en) * 2021-07-09 2023-01-12 주식회사 피앤씨솔루션 Wearable augmented reality device for providing interface by using hand joint recognition, and method for providing interface by wearable augmented reality device using hand joint recognition
WO2023063515A1 (en) * 2021-10-14 2023-04-20 주식회사 피앤씨솔루션 Method for operating app library by sequentially recognizing wrist and trigger gestures, and head mounted display device in which app library operates
WO2023196161A1 (en) * 2022-04-04 2023-10-12 Snap Inc. Gesture-based application invocation

Similar Documents

Publication Publication Date Title
KR20160137253A (en) Augmented Reality Device, User Interaction Apparatus and Method for the Augmented Reality Device
EP3332311B1 (en) Hover behavior for gaze interactions in virtual reality
JP6400197B2 (en) Wearable device
TWI528227B (en) Ring-type wireless finger sensing controller, control method and control system
KR101844390B1 (en) Systems and techniques for user interface control
US11625103B2 (en) Integration of artificial reality interaction modes
CN116719452A (en) Method for interacting with virtual controls and/or affordances for moving virtual objects in a virtual environment
US20140157206A1 (en) Mobile device providing 3d interface and gesture controlling method thereof
US10885322B2 (en) Hand-over-face input sensing for interaction with a device having a built-in camera
US20140028567A1 (en) Display device and control method thereof
CN110546601B (en) Information processing device, information processing method, and program
EP3811183B1 (en) Methods and apparatuses for providing input for head-worn image display devices
WO2017057107A1 (en) Input device, input method, and program
Matulic et al. Phonetroller: Visual representations of fingers for precise touch input with mobile phones in vr
KR20170133754A (en) Smart glass based on gesture recognition
CN108369451B (en) Information processing apparatus, information processing method, and computer-readable storage medium
US20230093979A1 (en) Devices, methods, and graphical user interfaces for content applications
KR101370027B1 (en) Mouse apparatus for eye-glass type display device and operating method for the same
CN114201030A (en) Device interaction method, electronic device and interaction system
KR102539045B1 (en) Dashboard control apparatus and method for wearable augmented reality device
US20240036698A1 (en) Xr manipulation feature with smart watch
US20240103680A1 (en) Devices, Methods, and Graphical User Interfaces For Interacting with Three-Dimensional Environments
CN115877941A (en) Head-mounted interactive device and interactive method for same
JP2024018908A (en) Controlling user interface by using trackpad and smart watch
CN116166161A (en) Interaction method based on multi-level menu and related equipment