US11762476B2 - Device and method for hand-based user interaction in VR and AR environments

Info

Publication number
US11762476B2
Authority
US
United States
Prior art keywords
user, overlay, positions, actions, action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/761,056
Other versions
US20220342485A1 (en)
Inventor
Sylvain Lelievre
Philippe Schmouker
Jean-Eudes Marvie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
InterDigital CE Patent Holdings SAS
Original Assignee
InterDigital CE Patent Holdings SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by InterDigital CE Patent Holdings SAS filed Critical InterDigital CE Patent Holdings SAS
Assigned to INTERDIGITAL CE PATENT HOLDINGS, SAS. Assignment of assignors interest (see document for details). Assignors: SCHMOUKER, PHILIPPE
Assigned to INTERDIGITAL CE PATENT HOLDINGS, SAS. Assignment of assignors interest (see document for details). Assignors: LELIEVRE, SYLVAIN; MARVIE, JEAN-EUDES
Publication of US20220342485A1
Application granted
Publication of US11762476B2
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques using icons
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 - Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G06F 3/04845 - Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04804 - Transparency, e.g. transparent or translucent windows
    • G06F 2203/04806 - Zoom, i.e. interaction techniques or interactors for controlling the zooming operation


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A virtual reality (VR) or augmented reality (AR) system detects through a camera when a user's hand positions match a predefined position and in response thereto renders an overlay including a crown with which the user can interact to lock onto an object or scene. The system then detects and renders a centre indicator of the crown that tracks the user's hands, enables or disables actions depending on the position of the centre indicator relative to a neutral zone, responds to user hand movement to perform enabled actions, and also detects unlock of the object or scene.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This application is a U.S. National Stage Application under 35 U.S.C. 371 of International Patent Application No. PCT/EP2020/074950, filed Sep. 7, 2020, which is incorporated herein by reference in its entirety.
This application claims the benefit of European Patent Application No. 19306153, filed Sep. 20, 2019, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present disclosure relates generally to user interfaces and in particular to a user interface for Virtual Reality (VR) and Augmented Reality (AR) environments.
BACKGROUND
This section is intended to introduce the reader to various aspects of art, which may be related to various aspects of the present disclosure that are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
VR and AR technology provides an immersive experience to users running such applications on suitable displays such as smartTVs, PCs, projectors, head-mounted displays (HMDs), etc. Through a user interface, the user is typically able to interact with the VR or AR environment, for example to handle objects therein. To do this, it is necessary to receive user input, which for example can be done by user tracking systems using sensors (wearable or not) or cameras that detect colour and depth. For instance, a connected smartTV equipped with a colour/depth camera could track the shape and the position of each element of the user's body, which could enable the use of hand gestures in a natural interaction system. This could give a user a way of interacting with an AR or VR scene or object in, for example, a broadcast TV program, a content item from the Internet or content shared during a video conference.
Conventional user interfaces can overlay images of the user's hands, which makes interaction with buttons and other menu items relatively straightforward. Interaction with objects is more difficult since the objects do not always react as in the real world.
Another conventional solution provides on-screen indications of the user's hand positions and enables the user to interact with a virtual object to zoom in/out and rotate the object.
However, such user interfaces tend to be difficult to use, in particular when interacting with objects, and especially for new users, as they are not always intuitive.
It will thus be appreciated that there is a desire for a solution that addresses at least some of the shortcomings of user interfaces for AR and VR. The present principles provide such a solution.
SUMMARY OF DISCLOSURE
In a first aspect, the present principles are directed to a method in a device for rendering a virtual reality or augmented reality environment. The method comprises in at least one hardware processor of the device rendering an overlay on a display, calculating a position of a centre indicator of the overlay based on detected positions of a user's hands, enabling at least one action based on the position of the centre indicator relative to a neutral zone, and performing, in response to user hand movement, an enabled action.
In a second aspect, the present principles are directed to a device for rendering a virtual reality or augmented reality environment, the device comprising at least one hardware processor configured to render an overlay on a display, calculate a position of a centre indicator of the overlay based on positions of a user's hands, enable at least one action based on the position of the centre indicator relative to a neutral zone, and perform, in response to user hand movement, an enabled action.
In a third aspect, the present principles are directed to a computer program product which is stored on a non-transitory computer readable medium and includes program code instructions executable by a processor for implementing the steps of a method according to any embodiment of the first aspect.
BRIEF DESCRIPTION OF DRAWINGS
Features of the present principles will now be described, by way of non-limiting example, with reference to the accompanying drawings, in which:
FIG. 1 illustrates a Virtual reality (VR)/augmented reality (AR) system according to an embodiment of the present principles;
FIG. 2 illustrates a method according to an embodiment of the present principles;
FIG. 3 illustrates a first example of such an overlay according to the present principles; and
FIG. 4 illustrates a second example of such an overlay according to the present principles.
DESCRIPTION OF EMBODIMENTS
FIG. 1 is a depiction of a virtual reality (VR) or augmented reality (AR) interactive system 100 in accordance with an illustrative embodiment. The interactive system 100 comprises a rendering device 110, a display 120 and a camera 130.
The rendering device 110, the display 120 and the camera 130 have been depicted as separate entities, but in different variants they can be combined in any way. For example, the three entities can be incorporated into a single device such as a television; the display 120 and the rendering device 110 can be combined while the camera 130 is separate; or the rendering device 110 can be separate from a display that includes a camera.
The display 120 is operative to display to a user (not shown) video received from the rendering device. The display 120 can for example be a television, a computer screen, a head-mounted display (HMD), a tablet or a smartphone.
The camera 130, which can be a depth and colour camera, is operative to capture images and/or video of whatever is in front of its aperture. In interactive system 100, the camera 130 will typically be arranged to capture an environment including the user. Other gesture detectors that detect the positions of a user's hands are also possible, such as one or more sensors worn by the user.
The rendering device 110 can for example be a computer, a set-top box, a gaming console, a tablet, a smartphone or the like. The rendering device 110 typically includes a user input interface 111, at least one hardware processor (“processor”) 112, memory 113, a network interface 114 and an output interface, for example a display interface 115 or a speaker. The interfaces are at least partly implemented in hardware. It will be appreciated that parts of the device 110 that are not useful for understanding the present principles are not illustrated for the sake of brevity of description.
The user input interface 111 can be wireless and configured to receive input from an input device (not shown), but it can also be a touch screen, an interface to a keyboard and/or a mouse, and the like. The user input interface is configured to provide received input to the processor 112. The processor 112 is configured to execute program code instructions to perform a method according to the present principles described with reference to FIG. 2. The memory 113 is configured to store the program code instructions for execution by the processor 112 and to store data. The network interface 114 can be wired or wireless and is configured to allow communication between the processor 112, via connection 140, and at least one device such as a server in an external network (not shown) to receive computer programs or content for display to a user. Such content may be video programs, which will be used as a non-limitative example. The output interface can be wired or wireless and configured to output content for display on the display 120. As already mentioned, in the case where the device 110 is a television, the display 120 is typically included in the device. It will be noted that a television may also have a display interface for displaying information on a separate screen (not shown).
A salient aspect of the present principles is the use of an overlay on top of the VR/AR scene.
FIG. 2 illustrates a method according to an embodiment of the present principles. It is assumed that the user is “inside” the AR/VR scene, i.e. that the user is able to interact with the scene. For exemplary purposes, it is also assumed that the user has not interacted with any object in the scene.
In step S210 the user interface (120 in FIGS. 1 and 2) of the interactive system (100 in FIG. 1) tracks the positions of the user's hands to detect when they match a predefined position. The predefined position can be one out of a plurality of positions having the same meaning. An example of such a predefined position is the user extending both hands in front of the body.
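The patent does not specify how the predefined position is recognized. The following is a minimal Python sketch, assuming a skeleton-tracking camera that reports 3D joint positions in metres, and treating "hands extended in front of the body" as both hands being clearly in front of the torso at roughly torso height. The Joint class, the thresholds and the helper name hands_extended_forward are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # camera space, metres: +x right, +y up, +z away from the camera
    y: float
    z: float

def hands_extended_forward(left_hand: Joint, right_hand: Joint, torso: Joint,
                           min_reach: float = 0.35, max_height_diff: float = 0.25) -> bool:
    """Return True when both hands are held out in front of the body.

    'In front' is approximated as each hand being at least min_reach metres
    closer to the camera than the torso, and roughly at torso height.
    """
    for hand in (left_hand, right_hand):
        reach = torso.z - hand.z  # positive when the hand is in front of the torso
        if reach < min_reach or abs(hand.y - torso.y) > max_height_diff:
            return False
    return True

# Hands 0.5 m in front of the torso at roughly chest height -> True
print(hands_extended_forward(Joint(-0.2, 0.0, 1.5), Joint(0.2, 0.05, 1.5), Joint(0.0, 0.0, 2.0)))
```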
Upon detection of a match, the user interface renders, in step S220, an overlay on the AR/VR scene. FIG. 3 illustrates a first example of such an overlay 300 according to the present principles. The overlay 300, which can be at least partly semi-transparent, is in this example intended to be interacted with using two hands.
The overlay 300 can include a centre indicator 310, a neutral zone 320 (diagonally hatched in grey) and a crown 330 that includes a first and a second lock position 341, 342.
The centre indicator 310 indicates the centre of the crown 330. When the centre indicator 310 is located inside the neutral zone 320, certain user actions are inhibited, as will be described. Conversely, actions can be enabled only when the centre indicator 310 is located inside the neutral zone 320. The neutral zone 320 can be invisible (or more transparent) when the centre indicator 310 is located inside, and made visible or less transparent when the centre indicator 310 is outside.
The lock positions 341, 342, preferably hand-shaped, indicate where the user has to put their hands to lock or unlock a scene or object, as will be described.
The positions of the user's hands are represented by a right hand indicator 351 and a left hand indicator 352, illustrated in exemplary positions.
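A possible in-memory representation of these overlay components is sketched below. The disclosure does not prescribe a geometry, so the circular neutral zone, the crown radius and the placement of the two lock positions on the crown's horizontal diameter are assumptions made for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

@dataclass
class Overlay:
    """Overlay state in screen coordinates (centre indicator 310, neutral zone 320, crown 330)."""
    centre: Point                 # centre indicator, follows the user's hands
    neutral_zone_centre: Point    # the neutral zone stays where the overlay was first rendered
    neutral_zone_radius: float
    crown_radius: float
    locked: bool = False
    visible: bool = True

    def lock_positions(self):
        """Lock positions 341/342, assumed here to sit on the crown's horizontal diameter."""
        return (Point(self.centre.x - self.crown_radius, self.centre.y),
                Point(self.centre.x + self.crown_radius, self.centre.y))

    def centre_in_neutral_zone(self) -> bool:
        dx = self.centre.x - self.neutral_zone_centre.x
        dy = self.centre.y - self.neutral_zone_centre.y
        return math.hypot(dx, dy) <= self.neutral_zone_radius
```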
In step S230, the user interface waits for the user to lock on to an object by a specific hand gesture, which in the present example is putting the hands over (or at least essentially over) the lock positions 341, 342 for a given period of time; in other words, to place the hand indicators 351, 352 on top of the lock positions 341, 342 for a given period of time, which can be 0.5 seconds, 1 second, 1.5 seconds and so on.
In an embodiment, the lock positions 341, 342 blink when there is no lock. In addition, the user interface can render text to inform the user how to lock on to an object.
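A sketch of the dwell-based lock/unlock toggle used in steps S230 and S270 follows, assuming the hand indicators and lock positions are available as (x, y) screen coordinates each frame. The pixel tolerance, the one-second default dwell and the re-arming behaviour (requiring the hands to leave the lock positions before the next toggle) are assumptions.

```python
import math
import time

def near(p, q, tolerance: float = 40.0) -> bool:
    """True when two (x, y) screen points are within tolerance pixels of each other."""
    return math.hypot(p[0] - q[0], p[1] - q[1]) <= tolerance

class DwellLock:
    """Toggles the lock state when both hand indicators cover the lock positions for dwell_s seconds."""

    def __init__(self, dwell_s: float = 1.0):
        self.dwell_s = dwell_s
        self._since = None   # time at which both hands started covering the lock positions
        self._armed = True   # hands must leave the lock positions before the next toggle

    def update(self, left_hand, right_hand, left_lock, right_lock, locked, now=None):
        """Call once per frame; returns the (possibly toggled) lock state."""
        now = time.monotonic() if now is None else now
        covering = near(left_hand, left_lock) and near(right_hand, right_lock)
        if not covering:
            self._since, self._armed = None, True
            return locked
        if not self._armed:
            return locked
        if self._since is None:
            self._since = now
            return locked
        if now - self._since >= self.dwell_s:
            self._since, self._armed = None, False
            return not locked
        return locked
```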
Upon lock on to the object, the user interface detects and renders, preferably essentially continuously, in step S240, the position of the centre indicator 310. In particular, the user interface can detect whether the centre indicator 310 lies inside or outside the neutral zone 320. When locked, the centre indicator 310 typically corresponds, possibly with some lag, to the centre of the user's two hands, i.e. the centre of the hand indicators 351, 352. It is noted that the crown 330 typically follows the movement of the centre indicator 310.
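As a small illustration of this centre-indicator tracking, the centre can be taken as the midpoint of the two hand indicators, optionally smoothed to produce the slight lag mentioned above; the smoothing factor is an assumption.

```python
def centre_of_hands(left, right):
    """Midpoint of the two hand indicators, used as the centre indicator position."""
    return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)

def smoothed(previous, target, alpha: float = 0.3):
    """Exponential smoothing; produces the slight lag mentioned above (alpha is an assumption)."""
    return (previous[0] + alpha * (target[0] - previous[0]),
            previous[1] + alpha * (target[1] - previous[1]))

# Per frame: the crown is drawn around the smoothed centre, which may leave the neutral zone.
centre = smoothed((400.0, 300.0), centre_of_hands((250.0, 320.0), (650.0, 280.0)))
print(centre)  # (415.0, 300.0)
```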
FIG. 4 illustrates a second example of an overlay 400 according to the present principles. The components of the overlay 400 are the same as those illustrated in the overlay 300 in FIG. 3, the differences being that the lock positions 341, 342 have been filled to indicate lock on to the object or scene, and that the centre indicator 310 and the crown 330 have followed the user's hands so that the centre indicator 310 is now outside of the neutral zone 320 (it is noted that, in the Figure, it appears as if the neutral zone has moved).
In step S250, the user interface enables and/or disables user actions depending on the position of the centre indicator 310 relative to the neutral zone 320.
For example, when inside, enabled user actions can include:
    • zoom in or out (i.e. bring an object closer or farther away) by respectively spreading the hands apart or bringing them closer together (possibly limited to when the hand indicators 351, 352 are at essentially the same height), and
    • rotating the object through a corresponding circular movement of the hands (as if they were on a steering wheel), possibly limited to when the hand indicators 351, 352 are inside the crown 330; and
      when outside, enabled user actions can include:
    • rotating an object horizontally by moving the hands in a corresponding horizontal direction, or vertically by moving the hands in a corresponding vertical direction, or
    • moving the camera (i.e. user's viewpoint) around in the scene.
As will be appreciated, the neutral zone can make the AR/VR scene less prone to for example accidental rotations since small horizontal or vertical hand movements will have no effect while the centre indicator 310 is in the neutral zone 320.
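The inside/outside split can be expressed as a single per-frame interpretation function. The sketch below follows the actions listed above, but the thresholds, the sign conventions and the function name interpret_hands are assumptions; the disclosure only states which actions are enabled on each side of the neutral zone.

```python
import math

def interpret_hands(centre, neutral_centre, neutral_radius,
                    left, right, prev_left, prev_right, prev_spread):
    """Map the current hand positions to at most one enabled action.

    All positions are (x, y) screen coordinates. Returns (action, amount) or None.
    """
    inside = math.hypot(centre[0] - neutral_centre[0],
                        centre[1] - neutral_centre[1]) <= neutral_radius
    spread = math.hypot(right[0] - left[0], right[1] - left[1])
    if inside:
        # Zoom in/out: hands spreading apart or coming together while roughly level.
        if abs(left[1] - right[1]) < 60 and abs(spread - prev_spread) > 5:
            return ("zoom", spread - prev_spread)          # positive amount = zoom in
        # Rotation in the display plane: the left->right vector turns like a steering wheel.
        angle = math.atan2(right[1] - left[1], right[0] - left[0])
        prev_angle = math.atan2(prev_right[1] - prev_left[1], prev_right[0] - prev_left[0])
        if abs(angle - prev_angle) > 0.02:
            return ("rotate_in_plane", angle - prev_angle)
        return None                                        # small movements have no effect
    # Outside the neutral zone: horizontal or vertical rotation driven by the centre's offset.
    dx = centre[0] - neutral_centre[0]
    dy = centre[1] - neutral_centre[1]
    return ("rotate_horizontally", dx) if abs(dx) >= abs(dy) else ("rotate_vertically", dy)
```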
The user interface can indicate the possibilities by for example rendering icons indicative of enabled actions. Additionally, icons can be rendered in a different aspect (for example greyed or faded) for disabled actions, as is conventional in user interfaces. The icons can be rendered for a limited time period before disappearing or fading away. Preferably, the icons are rendered in a peripheral area so as not to hide much of the scene or object.
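For the icon feedback, a minimal sketch: enabled actions get full-opacity icons while disabled ones are faded, to be drawn in a peripheral area. The action names and opacity values are placeholders.

```python
def action_icons(enabled_actions,
                 all_actions=("zoom", "rotate_in_plane", "rotate_horizontally", "rotate_vertically")):
    """Return (action, opacity) pairs: full opacity for enabled actions, faded for disabled ones."""
    return [(action, 1.0 if action in enabled_actions else 0.3) for action in all_actions]

print(action_icons({"zoom", "rotate_in_plane"}))
```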
It is noted that other user actions, such as moving an object, can be implemented by the user interface.
In step S260, the user interface responds to user actions tracked through hand position detection. The user actions can include the ones described with reference to step S250. Further actions can include unlocking (to be described with reference to step S270), and hiding and displaying the overlay 300 when the user crosses the hands, i.e. brings the right hand to the left of the left hand (it is noted that this can be disabled when the user performs a rotation through circular movement of the hands, so as to enable large circular rotations, e.g. 180°). Alternatively, the hand indicators 351, 352 can be prevented from switching sides so that the left is always to the left of or at the same vertical position as the right (except, for example, when performing a circular movement).
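A sketch of the crossed-hands show/hide toggle and of the alternative side-clamping behaviour, assuming crossing is detected from the x coordinates of the hand indicators; the edge-triggering and the rotating_in_plane suppression flag are assumptions about how an implementation might avoid repeated toggling and conflicts with circular rotation.

```python
def hands_crossed(left, right) -> bool:
    """True when the right-hand indicator is to the left of the left-hand indicator."""
    return right[0] < left[0]

def update_overlay_visibility(visible, crossed_now, crossed_before, rotating_in_plane):
    """Toggle overlay visibility on the crossing edge, except during a steering-wheel rotation."""
    if rotating_in_plane or not (crossed_now and not crossed_before):
        return visible
    return not visible

def clamp_hand_sides(left, right):
    """Alternative behaviour: prevent the indicators from switching sides by clamping the right one."""
    if right[0] < left[0]:
        right = (left[0], right[1])
    return left, right
```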
It is noted that horizontal or vertical rotation of an object can be temporarily disabled as the centre indicator 310 passes through the neutral zone 320 on its way to the other side.
In step S270, the user interface detects unlock of the object or scene. This can be done by the user putting the hand indicators 351, 352 on top of the lock positions 341, 342 for a given time; in other words, the user can do the same as for locking on to the object or scene. The unlock may also require that the centre indicator 310 is in the neutral zone 320.
In a variant suitable for one-hand use, the overlay can include a single lock position over which the user should put a hand indicator for a given time in order to lock the scene or object, while unlocking is done the same way. In the variant, enabled user actions can be limited to moving an object.
It will thus be appreciated that the present principles can be used to provide an AR/VR system that provides a user with information on enabled user actions and also, through the use of the neutral zone, to provide an AR/VR experience that is less subject to ‘jitter' caused by small hand movements.
It should be understood that the elements shown in the figures may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its scope.
All examples and conditional language recited herein are intended for educational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage.
Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.

Claims (17)

The invention claimed is:
1. A method in a device for rendering a virtual reality or augmented reality environment, the method comprising in at least one hardware processor of the device:
rendering an overlay on a display, wherein the overlay comprises a center indicator, a neutral zone, a crown with first and second lock positions, and right and left hand indicators, the first and second lock positions indicating where the user should superpose the right and left hand indicators to lock onto an object or a scene, or to unlock a locked onto object or scene; and
enabling at least one user action of a plurality of user actions based on whether a position of the center indicator of the overlay lies inside or outside the neutral zone of the overlay, the position of the center indicator calculated as a center of detected positions of a user's hands.
2. The method of claim 1, further comprising rendering the overlay in response to detection of a match between detected hand positions and at least one predefined position.
3. The method of claim 1, wherein the overlay is at least partly semi-transparent.
4. The method of claim 1, wherein a first set of actions is enabled only when the position of the center indicator lies inside the neutral zone, and wherein a second set of actions is enabled only when the position of the center indicator lies outside the neutral zone.
5. The method of claim 4, wherein the first action is one of zoom in, zoom out and rotation in a plane of the display, and the second action is one of horizontal rotation, vertical rotation, and movement.
6. The method of claim 1, further comprising rendering first icons, having a first visual aspect, indicative of enabled actions, and second icons, having a second visual aspect, for disabled actions.
7. A device for rendering a virtual reality or augmented reality environment, the device comprising at least one hardware processor configured to:
render an overlay on a display, wherein the overlay comprises a center indicator, a neutral zone, a crown with first and second lock positions, and right and left hand indicators, the first and second lock positions indicating where the user should superpose the right and left hand indicators to lock onto an object or a scene, or to unlock a locked onto object or scene; and
enable at least one user action of a plurality of user actions based on whether a position of the center indicator of the overlay lies inside or outside the neutral zone of the overlay, the position of the center indicator calculated as a center of detected positions of a user's hands.
8. The device of claim 7, wherein the at least one hardware processor is further configured to render the overlay in response to detection of a match between detected hand positions and at least one predefined position.
9. The device of claim 7, wherein the at least one hardware processor is configured to enable a first action only when the position of the center indicator lies inside the neutral zone and to enable a second action only when the position of the center indicator lies outside the neutral zone.
10. The device of claim 9, wherein the first action is one of zoom in, zoom out and rotation in a plane of the display, and the second action is one of horizontal rotation, vertical rotation, and movement.
11. The device of claim 7, wherein the at least one hardware processor is further configured to render first icons, having a first visual aspect, indicative of enabled actions, and second icons, having a second visual aspect, for disabled actions.
12. The device of claim 7, wherein the device comprises at least one of the display and a camera configured to capture the positions of the user's hands.
13. A non-transitory computer readable medium storing program code instructions that, when executed by a processor, implement the steps of a method according to claim 1.
14. The method of claim 1, further comprising performing, in response to detected user hand movement, an enabled user action.
15. The device of claim 7, wherein the at least one hardware processor is further configured to calculate the position of the center indicator based on detected positions of a user's hands.
16. The device of claim 7, wherein the at least one hardware processor is further configured to perform an enabled user action in response to detected user hand movement.
17. The method of claim 1, further comprising calculating the position of the center indicator of the overlay based on the detected positions of the user's hands.
US17/761,056 2019-09-20 2020-09-07 Device and method for hand-based user interaction in VR and AR environments Active US11762476B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
EP19306153 2019-09-20
EP19306153.8 2019-09-20
PCT/EP2020/074950 WO2021052800A1 (en) 2019-09-20 2020-09-07 Device and method for hand-based user interaction in vr and ar environments

Publications (2)

Publication Number Publication Date
US20220342485A1 US20220342485A1 (en) 2022-10-27
US11762476B2 true US11762476B2 (en) 2023-09-19

Family

ID=68242597

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/761,056 Active US11762476B2 (en) 2019-09-20 2020-09-07 Device and method for hand-based user interaction in VR and AR environments

Country Status (6)

Country Link
US (1) US11762476B2 (en)
EP (1) EP4031956A1 (en)
JP (1) JP2022548390A (en)
CN (1) CN114424151A (en)
MX (1) MX2022003336A (en)
WO (1) WO2021052800A1 (en)

Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5568603A (en) * 1994-08-11 1996-10-22 Apple Computer, Inc. Method and system for transparent mode switching between two different interfaces
US20030020755A1 (en) * 1997-04-30 2003-01-30 Lemelson Jerome H. System and methods for controlling automatic scrolling of information on a display or screen
US20030210227A1 (en) * 2002-05-09 2003-11-13 Gateway, Inc. Pointing device dwell time
US6874126B1 (en) * 2001-11-30 2005-03-29 View Space Technologies Method and apparatus for controlling content display by the cursor motion
US20080018595A1 (en) * 2000-07-24 2008-01-24 Gesturetek, Inc. Video-based image control system
US20130055150A1 (en) 2011-08-24 2013-02-28 Primesense Ltd. Visual feedback for tactile and non-tactile user interfaces
US20130103446A1 (en) * 2011-10-20 2013-04-25 Microsoft Corporation Information sharing democratization for co-located group meetings
US20130127825A1 (en) * 2011-07-28 2013-05-23 Pushkar P. Joshi Methods and Apparatus for Interactive Rotation of 3D Objects Using Multitouch Gestures
US20130135353A1 (en) * 2011-11-28 2013-05-30 Google Inc. Head-Angle-Trigger-Based Action
US20130246967A1 (en) * 2012-03-15 2013-09-19 Google Inc. Head-Tracked User Interaction with Graphical Interface
US20140282274A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US20150156028A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for sharing information between users of augmented reality devices
US20160011724A1 (en) * 2012-01-06 2016-01-14 Google Inc. Hands-Free Selection Using a Ring-Based User-Interface
US20160091982A1 (en) 2011-12-23 2016-03-31 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US9317130B2 (en) 2011-06-16 2016-04-19 Rafal Jan Krepec Visual feedback by identifying anatomical features of a hand
US9383894B2 (en) 2014-01-08 2016-07-05 Microsoft Technology Licensing, Llc Visual feedback for level of gesture completion
US20160224123A1 (en) * 2015-02-02 2016-08-04 Augumenta Ltd Method and system to control electronic devices through gestures
US20160274732A1 (en) 2015-03-16 2016-09-22 Elliptic Laboratories As Touchless user interfaces for electronic devices
US20160306431A1 (en) 2015-04-15 2016-10-20 Sony Computer Entertainment Inc. Pinch And Hold Gesture Navigation On A Head-Mounted Display
US9529513B2 (en) 2013-08-05 2016-12-27 Microsoft Technology Licensing, Llc Two-hand interaction with natural user interface
US9582091B2 (en) 2013-08-23 2017-02-28 Samsung Medison Co., Ltd. Method and apparatus for providing user interface for medical diagnostic apparatus
US20170097687A1 (en) * 2012-07-13 2017-04-06 Softkinetic Software Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
US9829989B2 (en) 2015-06-19 2017-11-28 Microsoft Technology Licensing, Llc Three-dimensional user input
US20170354883A1 (en) * 2016-06-13 2017-12-14 Sony Interactive Entertainment LLC Method and system for directing user attention to a location based game play companion application
US20180088677A1 (en) 2016-09-29 2018-03-29 Alibaba Group Holding Limited Performing operations based on gestures
WO2018071004A1 (en) 2016-10-11 2018-04-19 Hewlett-Packard Development Company, L.P. Visual cue system
US20180143693A1 (en) * 2016-11-21 2018-05-24 David J. Calabrese Virtual object manipulation
US20180217672A1 (en) * 2015-07-29 2018-08-02 Kyocera Corporation Wearable device, control method, and control program
US20190079589A1 (en) * 2017-09-11 2019-03-14 Barco Nv Method and system for efficient gesture control of equipment
US10254846B1 (en) * 2017-03-15 2019-04-09 Meta Company Systems and methods to facilitate interactions with virtual content in an augmented reality environment
US10261594B2 (en) * 2015-02-13 2019-04-16 Leap Motion, Inc. Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
US20190371066A1 (en) * 2018-06-05 2019-12-05 IMEX Media, Inc. Systems and Methods for Providing Virtual Reality Musical Experiences
US10692237B2 (en) * 2018-07-09 2020-06-23 Mehul Sompura Ring size measurement system and method for digitally measuring ring size
US10852913B2 (en) * 2016-06-21 2020-12-01 Samsung Electronics Co., Ltd. Remote hover touch system and method
US20210248371A1 (en) * 2020-02-10 2021-08-12 Fuji Xerox Co., Ltd. Systems and methods for augmented reality application for annotations and adding interfaces to control panels and screens
US11334212B2 (en) * 2019-06-07 2022-05-17 Facebook Technologies, Llc Detecting input in artificial reality systems based on a pinch and pull gesture
US20220301231A1 (en) * 2021-03-16 2022-09-22 Snap Inc. Mirroring device with whole-body outfits
US20230038709A1 (en) * 2021-07-28 2023-02-09 Purdue Research Foundation System and Method for Authoring Freehand Interactive Augmented Reality Applications

Patent Citations (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5568603A (en) * 1994-08-11 1996-10-22 Apple Computer, Inc. Method and system for transparent mode switching between two different interfaces
US20030020755A1 (en) * 1997-04-30 2003-01-30 Lemelson Jerome H. System and methods for controlling automatic scrolling of information on a display or screen
US20080018595A1 (en) * 2000-07-24 2008-01-24 Gesturetek, Inc. Video-based image control system
US8274535B2 (en) * 2000-07-24 2012-09-25 Qualcomm Incorporated Video-based image control system
US6874126B1 (en) * 2001-11-30 2005-03-29 View Space Technologies Method and apparatus for controlling content display by the cursor motion
US20030210227A1 (en) * 2002-05-09 2003-11-13 Gateway, Inc. Pointing device dwell time
US9317130B2 (en) 2011-06-16 2016-04-19 Rafal Jan Krepec Visual feedback by identifying anatomical features of a hand
US20130127825A1 (en) * 2011-07-28 2013-05-23 Pushkar P. Joshi Methods and Apparatus for Interactive Rotation of 3D Objects Using Multitouch Gestures
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US20130055150A1 (en) 2011-08-24 2013-02-28 Primesense Ltd. Visual feedback for tactile and non-tactile user interfaces
US20130103446A1 (en) * 2011-10-20 2013-04-25 Microsoft Corporation Information sharing democratization for co-located group meetings
US20130135353A1 (en) * 2011-11-28 2013-05-30 Google Inc. Head-Angle-Trigger-Based Action
US20160091982A1 (en) 2011-12-23 2016-03-31 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US20160011724A1 (en) * 2012-01-06 2016-01-14 Google Inc. Hands-Free Selection Using a Ring-Based User-Interface
US20130246967A1 (en) * 2012-03-15 2013-09-19 Google Inc. Head-Tracked User Interaction with Graphical Interface
US20170097687A1 (en) * 2012-07-13 2017-04-06 Softkinetic Software Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
US20140282274A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a gesture performed with at least two control objects
US9529513B2 (en) 2013-08-05 2016-12-27 Microsoft Technology Licensing, Llc Two-hand interaction with natural user interface
US9582091B2 (en) 2013-08-23 2017-02-28 Samsung Medison Co., Ltd. Method and apparatus for providing user interface for medical diagnostic apparatus
US20160350971A1 (en) * 2013-12-01 2016-12-01 Apx Labs, Llc Systems and methods for controlling operation of an on-board component
US9996221B2 (en) * 2013-12-01 2018-06-12 Upskill, Inc. Systems and methods for look-initiated communication
US20150153922A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for unlocking a wearable device
US20150153913A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for interacting with a virtual menu
US20160098579A1 (en) * 2013-12-01 2016-04-07 Apx Labs, Inc. Systems and methods for unlocking a wearable device
US20150153912A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for accessing a nested menu
US10466858B2 (en) * 2013-12-01 2019-11-05 Upskill, Inc. Systems and methods for interacting with a virtual menu
US10254920B2 (en) * 2013-12-01 2019-04-09 Upskill, Inc. Systems and methods for accessing a nested menu
US10558325B2 (en) * 2013-12-01 2020-02-11 Upskill, Inc. Systems and methods for controlling operation of an on-board component
US9460314B2 (en) * 2013-12-01 2016-10-04 Apx Labs, Inc. Systems and methods for providing task-based instructions
US20180284962A1 (en) * 2013-12-01 2018-10-04 Upskill, Inc. Systems and methods for look-initiated communication
US20150156803A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for look-initiated communication
US20150153826A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for providing a virtual menu
US20170017361A1 (en) * 2013-12-01 2017-01-19 Apx Labs, Inc. Systems and methods for providing task-based instructions
US20150153571A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for providing task-based instructions
US20150156028A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for sharing information between users of augmented reality devices
US9727211B2 (en) * 2013-12-01 2017-08-08 Upskill, Inc. Systems and methods for unlocking a wearable device
US9229235B2 (en) * 2013-12-01 2016-01-05 Apx Labs, Inc. Systems and methods for unlocking a wearable device
US9383894B2 (en) 2014-01-08 2016-07-05 Microsoft Technology Licensing, Llc Visual feedback for level of gesture completion
US20160224123A1 (en) * 2015-02-02 2016-08-04 Augumenta Ltd Method and system to control electronic devices through gestures
US10261594B2 (en) * 2015-02-13 2019-04-16 Leap Motion, Inc. Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
US20160274732A1 (en) 2015-03-16 2016-09-22 Elliptic Laboratories As Touchless user interfaces for electronic devices
US20160306431A1 (en) 2015-04-15 2016-10-20 Sony Computer Entertainment Inc. Pinch And Hold Gesture Navigation On A Head-Mounted Display
US20190107896A1 (en) * 2015-04-15 2019-04-11 Sony Interactive Entertainment Inc. Pinch and Hold Gesture Navigation on A Head-Mounted Display
US9829989B2 (en) 2015-06-19 2017-11-28 Microsoft Technology Licensing, Llc Three-dimensional user input
US20180217672A1 (en) * 2015-07-29 2018-08-02 Kyocera Corporation Wearable device, control method, and control program
US10551932B2 (en) * 2015-07-29 2020-02-04 Kyocera Corporation Wearable device, control method, and control program
US20170354883A1 (en) * 2016-06-13 2017-12-14 Sony Interactive Entertainment LLC Method and system for directing user attention to a location based game play companion application
US10852913B2 (en) * 2016-06-21 2020-12-01 Samsung Electronics Co., Ltd. Remote hover touch system and method
US20180088677A1 (en) 2016-09-29 2018-03-29 Alibaba Group Holding Limited Performing operations based on gestures
WO2018071004A1 (en) 2016-10-11 2018-04-19 Hewlett-Packard Development Company, L.P. Visual cue system
US20180143693A1 (en) * 2016-11-21 2018-05-24 David J. Calabrese Virtual object manipulation
US10254846B1 (en) * 2017-03-15 2019-04-09 Meta Company Systems and methods to facilitate interactions with virtual content in an augmented reality environment
US20190079589A1 (en) * 2017-09-11 2019-03-14 Barco Nv Method and system for efficient gesture control of equipment
US20190371066A1 (en) * 2018-06-05 2019-12-05 IMEX Media, Inc. Systems and Methods for Providing Virtual Reality Musical Experiences
US10692237B2 (en) * 2018-07-09 2020-06-23 Mehul Sompura Ring size measurement system and method for digitally measuring ring size
US11334212B2 (en) * 2019-06-07 2022-05-17 Facebook Technologies, Llc Detecting input in artificial reality systems based on a pinch and pull gesture
US20210248371A1 (en) * 2020-02-10 2021-08-12 Fuji Xerox Co., Ltd. Systems and methods for augmented reality application for annotations and adding interfaces to control panels and screens
US11475661B2 (en) * 2020-02-10 2022-10-18 Fujifilm Business Innovation Corp. Systems and methods for augmented reality application for annotations and adding interfaces to control panels and screens
US20230139977A1 (en) * 2020-02-10 2023-05-04 Fujifilm Business Innovation Corp. Systems and methods for augmented reality application for annotations and adding interfaces to control panels and screens
US20220301231A1 (en) * 2021-03-16 2022-09-22 Snap Inc. Mirroring device with whole-body outfits
US20230038709A1 (en) * 2021-07-28 2023-02-09 Purdue Research Foundation System and Method for Authoring Freehand Interactive Augmented Reality Applications

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Anonymous, "Interaction Engine 1.0—Object Interactions, UI Toolkit, Handheld Controller Support and More", Leap Motion Blog, Jun. 7, 2017, URL: http://blog.leapmotion.com/interaction-engine/, 18 pages.
Blaga et al., "[POSTER] Usability Analysis of an Off-the-Shelf Hand Posture Estimation Sensor for Freehand Physical Interaction in Egocentric Mixed Reality", Institute of Electrical and Electronics Engineers (IEEE), 2017 IEEE International Symposium on Mixed and Augmented Reality Adjunct Proceedings, Nantes, France, Oct. 9, 2017, 4 pages.
Hingorani Mohit, "Gesture based Data Interaction using the Kinect", May 16, 2014, YouTube, https://www.youtube.com/watch?v=Ew29FzodkxE, 2 pages.
Majd Jordan, "Hand-tracked-natural-interaction-in-VR", Blue Penguin Blog, Apr. 24, 2017, URL: https://blog.bluepengu.in/hand-tracked-natural-interaction-in-vr-5e5ae43ca3ff, 6 pages.

Also Published As

Publication number Publication date
MX2022003336A (en) 2022-05-06
WO2021052800A1 (en) 2021-03-25
US20220342485A1 (en) 2022-10-27
JP2022548390A (en) 2022-11-18
CN114424151A (en) 2022-04-29
EP4031956A1 (en) 2022-07-27

Similar Documents

Publication Title
US10890983B2 (en) Artificial reality system having a sliding menu
US20190279407A1 (en) System and method for augmented reality interaction
US9724609B2 (en) Apparatus and method for augmented reality
KR20220100102A (en) Gaze-based user interactions
CN111566596B (en) Real world portal for virtual reality displays
US9123272B1 (en) Realistic image lighting and shading
US9268410B2 (en) Image processing device, image processing method, and program
US20170195664A1 (en) Three-dimensional viewing angle selecting method and apparatus
CN107408026A (en) Message processing device, information processing method and computer program
EP2853986B1 (en) Image processing device, image processing method, and program
TW202206978A (en) Private control interfaces for extended reality
TW202221380A (en) Obfuscated control interfaces for extended reality
US20200388247A1 (en) Corner-identifying gesture-driven user interface element gating for artificial reality systems
KR20220018561A (en) Artificial Reality Systems with Personal Assistant Element for Gating User Interface Elements
WO2018198499A1 (en) Information processing device, information processing method, and recording medium
US11521346B2 (en) Image processing apparatus, image processing method, and storage medium
US11195320B2 (en) Feed-forward collision avoidance for artificial reality environments
KR20150026396A (en) Method for object composing a image and an electronic device thereof
US11762476B2 (en) Device and method for hand-based user interaction in VR and AR environments
US20170031583A1 (en) Adaptive user interface
US20200035208A1 (en) Electronic device and control method thereof
US20140043326A1 (en) Method and system for projecting content to have a fixed pose
US20220150457A1 (en) Image processing apparatus, image processing method, and storage medium
CN113190110A (en) Interface element control method and device of head-mounted display equipment
CN116159308A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium

Legal Events

Code Title Description

FEPP  Fee payment procedure
  Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS  Assignment
  Owner name: INTERDIGITAL CE PATENT HOLDINGS, SAS, FRANCE
  Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHMOUKER, PHILIPPE;REEL/FRAME:060403/0420
  Effective date: 20220503

AS  Assignment
  Owner name: INTERDIGITAL CE PATENT HOLDINGS, SAS, FRANCE
  Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LELIEVRE, SYLVAIN;MARVIE, JEAN-EUDES;REEL/FRAME:060403/0326
  Effective date: 20220428

STPP  Information on status: patent application and granting procedure in general
  Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP  Information on status: patent application and granting procedure in general
  Free format text: NON FINAL ACTION MAILED

STPP  Information on status: patent application and granting procedure in general
  Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP  Information on status: patent application and granting procedure in general
  Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF  Information on status: patent grant
  Free format text: PATENTED CASE