CN108563335B - Virtual reality interaction method and device, storage medium and electronic equipment - Google Patents

Virtual reality interaction method and device, storage medium and electronic equipment

Info

Publication number
CN108563335B
Authority
CN
China
Prior art keywords
virtual
virtual reality
reality interaction
interaction method
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810374739.7A
Other languages
Chinese (zh)
Other versions
CN108563335A (en)
Inventor
李瑶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201810374739.7A
Publication of CN108563335A
Application granted
Publication of CN108563335B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of virtual reality technologies, and in particular, to a virtual reality interaction method, a virtual reality interaction apparatus, a computer-readable storage medium, and an electronic device. The virtual reality interaction method comprises: acquiring the operation forms and motion parameters of the two hands of the user in response to a virtual object activation message; determining an operation type according to the operation form; and adjusting the physical state of the virtual object according to the operation type and the motion parameters. The method improves the accuracy of motion recognition and the continuity of motion switching during interaction, and significantly improves the user experience.

Description

Virtual reality interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of virtual reality technologies, and in particular, to a virtual reality interaction method, a virtual reality interaction apparatus, a computer-readable storage medium, and an electronic device.
Background
VR (Virtual Reality) technology is a computer simulation technology for creating and experiencing virtual worlds. It uses a computer to generate a simulated environment into which the user is immersed, based on a system-level simulation of interactive three-dimensional dynamic views and physical behaviors with multi-source information fusion.
The interactions between the user and the simulated environment, including the various virtual objects within it, are an important component of VR technology. In the related art, gesture operations are generally used as the main interaction mode. As shown in fig. 1, a typical gesture operation identifies the movements of the user's fingers to determine the user's operation intention toward a virtual object. Because movements in the simulated environment are not spatially constrained, the user's gesture operations are highly arbitrary, and a virtual object cannot be accurately controlled along a designated axis or within a designated spatial range. For example, existing gesture operation modes make quick unequal scaling of a virtual object difficult to realize, and motions are often misjudged.
Therefore, a virtual reality interaction method with higher accuracy and richer operation types is needed.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The purpose of the present disclosure is to provide a virtual reality interaction method, a virtual reality interaction apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, technical problems of existing virtual reality interaction methods such as low motion-recognition accuracy and limited operation types.
According to one aspect of the present disclosure, a virtual reality interaction method is provided, which is characterized by comprising:
acquiring the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
determining a virtual movement space with a specified dimension according to the operation form;
moving the virtual object within the virtual movement space in accordance with the motion parameters.
In an exemplary embodiment of the present disclosure, the operation form includes an operation hand and a finger form of the operation hand.
In an exemplary embodiment of the present disclosure, the motion parameters include a moving direction and a moving distance of the operation hand.
According to one aspect of the present disclosure, a virtual reality interaction apparatus is provided, which is characterized by comprising:
an action acquisition unit, configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
an operation determining unit, configured to determine a virtual movement space having a specified dimension according to the operation form;
a movement control unit, configured to move the virtual object within the virtual movement space according to the motion parameters.
According to one aspect of the present disclosure, a virtual reality interaction method is provided, which is characterized by comprising:
acquiring the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
determining one or more zoom axes according to the operation form;
and zooming the virtual object along the zoom axes according to the motion parameters.
In an exemplary embodiment of the present disclosure, the operation form includes an operation hand and a finger form of the operation hand.
In an exemplary embodiment of the present disclosure, the motion parameters include a moving direction and a moving distance of the operation hand.
According to one aspect of the present disclosure, a virtual reality interaction apparatus is provided, which is characterized by comprising:
an action acquisition unit, configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
an operation determining unit, configured to determine one or more zoom axes according to the operation form;
and a zoom control unit, configured to zoom the virtual object along the zoom axes according to the motion parameters.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the virtual reality interaction method described in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any of the virtual reality interaction methods described above via execution of the executable instructions.
In the virtual reality interaction method provided by the embodiments of the present disclosure, the operation type is determined from the operation forms of the user's two hands, and the physical state of the virtual object is adjusted according to the operation type and the motion parameters of the two hands. This realizes accurate recognition of, and rapid switching between, the different operations performed on the virtual object, improves the accuracy of motion recognition during interaction, enriches the types of control available for the virtual object, and significantly improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically illustrates a virtual reality interaction interface based on gesture operation in the prior art.
Fig. 2 schematically illustrates a flowchart of steps of a virtual reality interaction method in an exemplary embodiment of the present disclosure.
Fig. 3A schematically illustrates a display interface for performing a moving operation on a virtual object within a two-dimensional virtual moving space in an exemplary embodiment of the present disclosure.
Fig. 3B schematically illustrates a display interface for performing a moving operation on a virtual object within a one-dimensional virtual moving space in an exemplary embodiment of the present disclosure.
Fig. 4 schematically illustrates a block diagram of a virtual reality interaction device according to an exemplary embodiment of the present disclosure.
Fig. 5 schematically shows a flowchart of steps of a virtual reality interaction method in an exemplary embodiment of the present disclosure.
Fig. 6A schematically illustrates a display interface for performing an isometric scaling operation on a virtual object in an exemplary embodiment of the present disclosure.
Fig. 6B schematically illustrates a display interface for performing an unequal scaling operation on a virtual object in an exemplary embodiment of the present disclosure.
Fig. 7 schematically illustrates a block diagram of a virtual reality interaction device according to an exemplary embodiment of the present disclosure.
Fig. 8 schematically illustrates a module diagram of an electronic device in an exemplary embodiment of the disclosure.
Fig. 9 schematically illustrates a schematic diagram of a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
An exemplary embodiment of the present disclosure first provides a virtual reality interaction method, which may be applied to various virtual reality devices, platforms, or systems capable of providing a virtual reality environment. As shown in fig. 2, the virtual reality interaction method in the present exemplary embodiment mainly includes the following steps:
and S10, responding to the virtual object activation message to acquire the operation forms and the action parameters of the two hands of the user.
In the virtual reality environment to which the present exemplary embodiment is applied, there are several virtual objects that can interact with the user, and this step may activate one or more of them in response to the virtual object activation message. The virtual object activation message may be an external message received by the virtual reality environment, or a message generated or transmitted within it. It may be issued directly by the user through a trigger operation, or generated automatically when a specific condition is satisfied. For example, the user may perform any designated operation, such as clicking or pressing on a virtual button or on the virtual object itself, to issue a virtual object activation message; the virtual reality environment may also receive the activation message directly via an external input device such as a microphone.
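As a concrete illustration of this message flow, the following minimal Python sketch shows one way such an activation message could be represented and handled; the class name, fields, and message sources are assumptions for illustration, not part of the disclosed method.

```python
# Minimal sketch of message-driven activation (hypothetical names). The
# message may come from a user trigger (click/press on a virtual button or
# object) or be generated automatically when a condition is met.
from dataclasses import dataclass

@dataclass
class VirtualObjectActivationMessage:
    object_id: str   # which virtual object(s) to activate
    source: str      # e.g. "user_trigger", "auto_condition", "voice"

class InteractionSession:
    def __init__(self):
        self.active_objects = set()

    def on_activation(self, msg: VirtualObjectActivationMessage):
        # Entry point of step S10: activate the object, then start sampling
        # the operation form and motion parameters of the user's hands.
        self.active_objects.add(msg.object_id)
        print(f"activated {msg.object_id} via {msg.source}")

session = InteractionSession()
session.on_activation(VirtualObjectActivationMessage("cube_01", "user_trigger"))
```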
The operation forms and motion parameters of the two hands of the user can be acquired by detecting the user's hands. The operation form may include the operation hand of the user and the finger form of the operation hand; the operation hand can be subdivided into one-handed or two-handed operation, and the finger form includes the straightened or curled state of each finger. The motion parameters may include the moving direction and moving distance of the operation hand in the corresponding operation form. The operation forms and motion parameters can be acquired in various ways: for example, the user may wear or hold sensor devices with information acquisition capabilities so that hand information is collected directly, or multi-directional cameras may photograph the user's hands from multiple angles so that hand information is obtained indirectly through image acquisition and analysis.
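The operation form and motion parameters described here can be modeled with simple data structures. The sketch below is one assumed representation (the patent does not prescribe any concrete format): which hand or hands are operating, which fingers are straightened, and the direction and distance of the hand's movement.

```python
# Assumed data model for "operation form" and "motion parameters"; field
# names and types are illustrative only.
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Set, Tuple

class Hand(Enum):
    LEFT = "left"
    RIGHT = "right"

@dataclass
class OperationForm:
    hands: List[Hand]                          # one- or two-handed operation
    extended_fingers: Dict[Hand, Set[str]] = field(default_factory=dict)
    # e.g. {Hand.RIGHT: {"index", "middle"}} -> two straightened fingers

@dataclass
class MotionParameters:
    direction: Tuple[float, float, float]      # unit vector of hand movement
    distance: float                            # movement distance of the hand
    finger_displacement: float = 0.0           # relative displacement between fingers

form = OperationForm(hands=[Hand.RIGHT],
                     extended_fingers={Hand.RIGHT: {"index", "middle"}})
params = MotionParameters(direction=(0.0, 0.0, 1.0), distance=0.25)
```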
Step S20: determining a virtual movement space with a specified dimension according to the operation form.
The type of operation performed by the user on the virtual object may be any of various operations on the physical state of the virtual object displayed in the virtual reality environment, such as moving, rotating, or zooming. For example, when the detected operation hand of the user is a single hand, the operation type may be determined to be a moving operation, and the moving range and moving direction of the moving operation may be further determined according to the finger form of the operation hand; when the detected operation hands of the user are both hands, the operation type may be determined to be another operation such as rotation or zooming. The correspondence between specific operation forms and operation types may be preset, or may be set and adjusted according to the user's needs.
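A minimal sketch of such a form-to-type mapping follows; the default table (one hand selects a move, two hands select rotation or zooming) mirrors the example above, and the override mechanism stands in for the user-adjustable correspondence. All names are assumptions.

```python
from typing import Dict, Optional

def classify_operation(num_hands: int,
                       overrides: Optional[Dict[int, str]] = None) -> str:
    """Map the operating-hand count to an operation type."""
    table = {1: "move", 2: "rotate_or_zoom"}   # preset correspondence
    if overrides:
        table.update(overrides)                # user-adjusted correspondence
    return table.get(num_hands, "unknown")

assert classify_operation(1) == "move"
assert classify_operation(2, overrides={2: "rotate"}) == "rotate"
```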
When the operation type is determined to be a moving operation, this step may determine a virtual movement space with a specified dimension in the virtual reality environment according to the finger form of the user's operation hand; the virtual movement space defines the moving direction and range of the virtual object. In the present exemplary embodiment, the index finger may correspond to a three-dimensional virtual movement space, the index finger plus the middle finger to a two-dimensional virtual movement space, and the index finger plus the middle finger plus the ring finger to a one-dimensional virtual movement space.
Specifically, when the user drags the virtual object with the index finger, a three-dimensional virtual movement space can be determined in the virtual reality environment; that is, the virtual object can be moved in any direction.
When the user drags the virtual object with the index finger and the middle finger together, a two-dimensional virtual movement space can be determined in the virtual reality environment, which is equivalent to determining a plane. The extension direction of the plane may be related to the specific form of the fingers: for example, the lines along the user's index finger and middle finger may be obtained, and the plane defined by these two lines taken as the two-dimensional virtual movement space; alternatively, the fingertip positions of the index finger and the middle finger may be detected, and the plane determined by combining the line connecting the two fingertips with the line along the index finger or the middle finger. Other information related to the finger form, such as the orientation of the user's fingers, may also be detected to determine the extension direction of the plane. In the three-dimensional rectangular coordinate system shown in fig. 3A, the plane is the one defined by the x-axis and the z-axis, and the virtual object can move only within this plane; in other exemplary embodiments, the plane may instead be defined by the x-axis and the y-axis, or by the y-axis and the z-axis.
When the user drags the virtual object with the index finger, the middle finger, and the ring finger together, a one-dimensional virtual movement space can be determined in the virtual reality environment, which is equivalent to determining a straight line. The extension direction of the line may be related to the specific form of the fingers: for example, the line along a designated finger may be obtained and taken as the one-dimensional virtual movement space. Other information related to the finger form, such as the orientation of the user's fingers, may also be detected to determine the extension direction of the line. In the three-dimensional rectangular coordinate system shown in fig. 3B, the line is the one along the z-axis, and the virtual object can move only along its extension direction; in other exemplary embodiments, the line may point in any direction.
In other exemplary embodiments, the finger form may have any correspondence with the dimension of the virtual movement space and with its position in the virtual reality environment, which is not limited by the present disclosure.
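To make the geometry concrete, here is a hedged Python sketch of step S20 under the finger-to-dimension table used in the example above (index gives 3D, index plus middle gives a plane, index plus middle plus ring gives a line); the vector construction is one possible realization, not the only one the text allows.

```python
import numpy as np

def movement_space(finger_dirs):
    """finger_dirs: unit vectors along the straightened fingers (assumed input)."""
    n = len(finger_dirs)
    if n == 1:
        return {"dim": 3}                         # free movement in any direction
    if n == 2:
        # Two finger lines span a plane; keep its unit normal.
        normal = np.cross(finger_dirs[0], finger_dirs[1])
        return {"dim": 2, "normal": normal / np.linalg.norm(normal)}
    # Three straightened fingers: movement confined to a line, here taken
    # along a designated finger (the first one).
    axis = np.asarray(finger_dirs[0], dtype=float)
    return {"dim": 1, "axis": axis / np.linalg.norm(axis)}

# Index + middle finger along x and z: a 2-D space, i.e. the x-z plane of
# figure 3A (normal along the y-axis).
print(movement_space([np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]))
```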
Step S30: moving the virtual object within the virtual movement space according to the motion parameters.
In the virtual movement space determined in step S20, the virtual object is moved according to the motion parameters of the user's hand. The specific movement parameters may be identical to the hand's motion parameters, or may be the product of the motion parameters and a preset coefficient. In addition, the motion parameters of the operation hand, such as the moving direction and moving distance, may be calculated or adjusted and then converted into other related parameters for moving the virtual object.
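One assumed realization of step S30 is to project the hand displacement onto the determined space and scale it by the preset coefficient, as sketched below (it reuses the space dictionaries of the previous sketch; all names are illustrative).

```python
import numpy as np

def constrained_move(position, hand_delta, space, coeff=1.0):
    """Apply the hand displacement to the object, restricted to the space."""
    d = np.asarray(hand_delta, dtype=float) * coeff   # optional preset scaling
    if space["dim"] == 2:
        n = space["normal"]
        d = d - np.dot(d, n) * n      # drop the component leaving the plane
    elif space["dim"] == 1:
        a = space["axis"]
        d = np.dot(d, a) * a          # keep only the along-axis component
    return np.asarray(position, dtype=float) + d

# Movement confined to the z-axis line of figure 3B:
space = {"dim": 1, "axis": np.array([0.0, 0.0, 1.0])}
print(constrained_move([0, 0, 0], [1, 1, 1], space))   # -> [0. 0. 1.]
```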
The virtual reality interaction method provided by this exemplary embodiment adjusts the physical state of the virtual object based on the operation forms and motion parameters of the user's two hands, realizing accurate recognition of, and rapid switching between, different operations performed on the virtual object, and improving the accuracy of motion recognition and the continuity of motion switching during interaction. In addition, identifying the operation type from the operation hand and finger form improves recognition accuracy and expands the ways in which the virtual object can be controlled.
Corresponding to the above method embodiment, the present disclosure further provides a virtual reality interaction apparatus, which may be used to execute the above method embodiment.
As shown in fig. 4, the virtual reality interaction apparatus 40 may mainly include an action acquisition unit 41, an operation determining unit 42, and a movement control unit 43. The action acquisition unit 41 is configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message; the operation determining unit 42 is configured to determine a virtual movement space with a specified dimension according to the operation form; and the movement control unit 43 is configured to move the virtual object within the virtual movement space according to the motion parameters.
The details of the virtual reality interaction apparatus are already described in detail in the corresponding virtual reality interaction method, and therefore are not described herein again.
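For orientation, the three units can be wired together as a simple pipeline; this sketch is an assumed composition (the patent defines the units of apparatus 40 functionally, not as concrete classes).

```python
class VirtualRealityInteractionApparatus:
    """Assumed wiring of the units of apparatus 40 (fig. 4)."""
    def __init__(self, acquire, determine, move):
        self.acquire = acquire       # action acquisition unit 41
        self.determine = determine   # operation determining unit 42
        self.move = move             # movement control unit 43

    def handle(self, activation_msg):
        form, params = self.acquire(activation_msg)
        space = self.determine(form)
        return self.move(space, params)

# Stub units just to show the data flow:
apparatus = VirtualRealityInteractionApparatus(
    acquire=lambda msg: ("index_finger", {"delta": (0.0, 0.0, 0.5)}),
    determine=lambda form: {"dim": 3},
    move=lambda space, params: params["delta"],
)
print(apparatus.handle("activate cube_01"))
```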
Further, in another exemplary embodiment of the present disclosure, the adjustment of the physical state of the virtual object may be a zoom operation performed on it. As shown in fig. 5, the virtual reality interaction method in the present exemplary embodiment may include the following steps:
Step S10: acquiring the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
Step S40: determining one or more zoom axes according to the operation form;
Step S50: zooming the virtual object along the zoom axes according to the motion parameters.
The virtual object activation message and the response to it, the operation forms and motion parameters of the two hands of the user, and the ways of acquiring them in step S10 have been described in detail in the foregoing embodiment and are not repeated here. In step S40, one or more zoom axes are determined in the virtual reality environment according to the finger form of the user's operation hands; in step S50, the virtual object is zoomed along the designated zoom axes according to the motion parameters of the operation hands. In the present exemplary embodiment, the scaling factors on different zoom axes may be the same or different.
Referring to fig. 6A, taking the sphere shown in the figure as the virtual object, when the user grasps the virtual object with the index finger and thumb of both hands, three mutually perpendicular zoom axes (corresponding to the three coordinate axes of a three-dimensional rectangular coordinate system) can be determined, with the same scaling factor on each axis; the virtual object is then scaled equally (isometric scaling) according to the motion parameters of the user's operation hands, which may include the moving direction, the moving distance, and the relative displacement between the fingers.
Referring to fig. 6B, again taking the sphere shown in the figure as the virtual object, when the user grasps the virtual object with the index finger, middle finger, and thumb of both hands, three mutually perpendicular zoom axes can likewise be determined; if the scaling factors on the three axes are not all the same, the virtual object is scaled unequally according to the motion parameters of the user's operation hands.
In other exemplary embodiments, one, two, or another number of zoom axes may be determined according to the operation forms of the two hands of the user, and the scaling relationships between the zoom axes may likewise be specified by distinguishing the operation hand and the finger form.
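The following sketch illustrates steps S40/S50 under assumed conventions: the grip form selects whether scaling is uniform, and per-axis scale factors are derived from the relative displacement between the fingers. The function name, grip labels, and sensitivity parameter are all illustrative.

```python
import numpy as np

def scale_object(extents, grip, finger_displacement, sensitivity=1.0):
    """extents: (sx, sy, sz) bounding-box sizes of the virtual object.
    grip 'index_thumb' -> equal scaling on three orthogonal zoom axes;
    any other grip     -> per-axis (unequal) scaling."""
    extents = np.asarray(extents, dtype=float)
    disp = np.asarray(finger_displacement, dtype=float)
    if grip == "index_thumb":
        factors = np.full(3, 1.0 + sensitivity * float(np.mean(disp)))
    else:
        factors = 1.0 + sensitivity * disp     # one factor per zoom axis
    return extents * factors

print(scale_object([1, 1, 1], "index_thumb", [0.2, 0.2, 0.2]))          # equal
print(scale_object([1, 1, 1], "index_middle_thumb", [0.2, 0.0, -0.1]))  # unequal
```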
Corresponding to the above method embodiment, the present disclosure further provides a virtual reality interaction apparatus, which may be used to execute the above method embodiment.
As shown in fig. 7, the virtual reality interaction apparatus 70 may mainly include an action acquisition unit 41, an operation determining unit 72, and a zoom control unit 73. The action acquisition unit 41 is configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message; the operation determining unit 72 is configured to determine one or more zoom axes according to the operation form; and the zoom control unit 73 is configured to zoom the virtual object along the zoom axes according to the motion parameters.
The details of the virtual reality interaction apparatus are already described in detail in the corresponding virtual reality interaction method, and therefore are not described herein again.
It should be noted that although the above exemplary embodiments describe the various steps of the methods of the present disclosure in a particular order, this does not require or imply that these steps must be performed in that particular order, or that all of the steps must be performed, to achieve the desired results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, there is also provided an electronic device comprising at least one processor and at least one memory for storing executable instructions of the processor; wherein the processor is configured to perform the method steps in the above-described exemplary embodiments of the disclosure via execution of the executable instructions.
The electronic device 800 in the present exemplary embodiment is described below with reference to fig. 8. The electronic device 800 is only one example and should not impose any limitation on the functionality or scope of use of embodiments of the present disclosure.
Referring to FIG. 8, an electronic device 800 is shown in the form of a general purpose computing device. The components of the electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one memory unit 820, a bus 830 connecting the various system components including the processing unit 810 and the memory unit 820, and a display unit 840.
Wherein the storage unit 820 stores program code that can be executed by the processing unit 810 such that the processing unit 810 performs the method steps in the above-described exemplary embodiments of the present disclosure.
The storage unit 820 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 821 and/or a cache memory unit 822, and may further include a read-only memory unit (ROM) 823.
Storage unit 820 may also include a program/utility 824 having a set (at least one) of program modules 825, such program modules including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 830 may be any of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 600 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that allow a user to interact with the electronic device 800, and/or with any devices (e.g., router, modem, etc.) that allow the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 860. As shown in fig. 8, the network adapter 860 may communicate with other modules of the electronic device 800 via the bus 830. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment of the present disclosure, there is also provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, can implement the above-mentioned virtual reality interaction method of the present disclosure. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code; the program product may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, or a removable hard disk, etc.) or on a network; when the program product is run on a computing device (which may be a personal computer, a server, a terminal apparatus, or a network device, etc.), the program code is configured to cause the computing device to perform the method steps in the above exemplary embodiments of the disclosure.
Referring to fig. 9, a program product 90 for implementing the above method according to an embodiment of the present disclosure may employ a portable compact disc read only memory (CD-ROM) and include program code, and may run on a computing device (e.g., a personal computer, a server, a terminal device, or a network device, etc.). However, the program product of the present disclosure is not limited thereto. In the exemplary embodiment, the computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium.
The readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the C language. The program code may execute entirely on the user's computing device, partly on the user's computing device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN); alternatively, it may be connected to an external computing device, for example through the Internet using an Internet service provider.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software, which may generally be referred to herein as a "circuit", "module", or "system".
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments, and the features discussed in connection with the embodiments are interchangeable, if possible. In the above description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.

Claims (10)

1. A virtual reality interaction method is characterized by comprising the following steps:
acquiring the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
determining a virtual movement space with a specified dimension according to the operation form, wherein the virtual movement space with the specified dimension comprises a one-dimensional virtual movement space, a two-dimensional virtual movement space, and a three-dimensional virtual movement space;
moving the virtual object within the virtual movement space in accordance with the motion parameters.
2. The virtual reality interaction method according to claim 1, wherein the operation form comprises an operation hand and a finger form of the operation hand.
3. The virtual reality interaction method of claim 2, wherein the motion parameters comprise a moving direction and a moving distance of the operation hand.
4. A virtual reality interaction device, comprising:
an action acquisition unit, configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message;
an operation determining unit, configured to determine a virtual movement space with a specified dimension according to the operation form, wherein the virtual movement space with the specified dimension comprises a one-dimensional virtual movement space, a two-dimensional virtual movement space, and a three-dimensional virtual movement space;
a movement control unit, configured to move the virtual object within the virtual movement space according to the motion parameters.
5. A virtual reality interaction method is characterized by comprising the following steps:
acquiring the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message, wherein the motion parameters comprise relative displacement between the fingers;
determining one or more zoom axes according to the operation form;
and zooming the virtual object along the zoom axes according to the motion parameters.
6. The virtual reality interaction method according to claim 5, wherein the operation form comprises an operation hand and a finger form of the operation hand.
7. The virtual reality interaction method of claim 6, wherein the motion parameters comprise a moving direction and a moving distance of the operation hand.
8. A virtual reality interaction device, comprising:
an action acquisition unit, configured to acquire the operation forms and motion parameters of the two hands of the user in response to the virtual object activation message, wherein the motion parameters comprise relative displacement between the fingers;
an operation determining unit, configured to determine one or more zoom axes according to the operation form;
and a zoom control unit, configured to zoom the virtual object along the zoom axes according to the motion parameters.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the virtual reality interaction method of any one of claims 1 to 3 or 5 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual reality interaction method of any of claims 1-3 or 5-7 via execution of the executable instructions.
CN201810374739.7A 2018-04-24 2018-04-24 Virtual reality interaction method and device, storage medium and electronic equipment Active CN108563335B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810374739.7A CN108563335B (en) 2018-04-24 2018-04-24 Virtual reality interaction method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810374739.7A CN108563335B (en) 2018-04-24 2018-04-24 Virtual reality interaction method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108563335A CN108563335A (en) 2018-09-21
CN108563335B (en) 2021-03-23

Family

ID=63536729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810374739.7A Active CN108563335B (en) 2018-04-24 2018-04-24 Virtual reality interaction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108563335B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050212753A1 (en) * 2004-03-23 2005-09-29 Marvit David L Motion controlled remote controller

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107430437A (en) * 2015-02-13 2017-12-01 厉动公司 The system and method that real crawl experience is created in virtual reality/augmented reality environment
CN107533373A (en) * 2015-08-04 2018-01-02 谷歌有限责任公司 Via the input of the sensitive collision of the context of hand and object in virtual reality
CN106125938A (en) * 2016-07-01 2016-11-16 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN106249879A (en) * 2016-07-19 2016-12-21 深圳市金立通信设备有限公司 The display packing of a kind of virtual reality image and terminal
CN106980362A (en) * 2016-10-09 2017-07-25 阿里巴巴集团控股有限公司 Input method and device based on virtual reality scenario

Also Published As

Publication number Publication date
CN108563335A (en) 2018-09-21

Similar Documents

Publication Publication Date Title
Kim et al. Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
KR101872426B1 (en) Depth-based user interface gesture control
CN109771941B (en) Method, device, equipment and medium for selecting virtual object in game
CN108037888B (en) Skill control method, skill control device, electronic equipment and storage medium
CN107329690B (en) Virtual object control method and device, storage medium and electronic equipment
KR101318244B1 (en) System and Method for Implemeting 3-Dimensional User Interface
CN107562201B (en) Directional interaction method and device, electronic equipment and storage medium
CN110568929B (en) Virtual scene interaction method and device based on virtual keyboard and electronic equipment
CN110237534B (en) Game object selection method and device
CN110075519B (en) Information processing method and device in virtual reality, storage medium and electronic equipment
CN109710066B (en) Interaction method and device based on gesture recognition, storage medium and electronic equipment
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US20190155378A1 (en) Analysis of User Interface Interactions Within a Virtual Reality Environment
Matlani et al. Virtual mouse using hand gestures
Raees et al. VEN-3DVE: vision based egocentric navigation for 3D virtual environments
CN111832648A (en) Key point marking method and device, electronic equipment and storage medium
US10769824B2 (en) Method for defining drawing planes for the design of a 3D object
He et al. Ubi Edge: Authoring Edge-Based Opportunistic Tangible User Interfaces in Augmented Reality
CN108355352B (en) Virtual object control method and device, electronic device and storage medium
CN112527110A (en) Non-contact interaction method and device, electronic equipment and medium
Kerefeyn et al. Manipulation of virtual objects through a LeapMotion optical sensor
CN108563335B (en) Virtual reality interaction method and device, storage medium and electronic equipment
Ong et al. 3D bare-hand interactions enabling ubiquitous interactions with smart objects
Liu et al. COMTIS: Customizable touchless interaction system for large screen visualization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant