CN111694432B - Virtual hand position correction method and system based on virtual hand interaction - Google Patents

Virtual hand position correction method and system based on virtual hand interaction

Info

Publication number
CN111694432B
CN111694432B · CN202010530219.8A · CN202010530219A
Authority
CN
China
Prior art keywords
hand
virtual
virtual hand
real
axis
Prior art date
Legal status
Active
Application number
CN202010530219.8A
Other languages
Chinese (zh)
Other versions
CN111694432A (en)
Inventor
冯志全
曾波涛
Current Assignee
University of Jinan
Original Assignee
University of Jinan
Priority date
Filing date
Publication date
Application filed by University of Jinan
Priority to CN202010530219.8A
Publication of CN111694432A
Application granted
Publication of CN111694432B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107: Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a virtual hand position correction method and system based on virtual hand interaction. Under the three-dimensional perspective rendering of a virtual scene, objects in the interactive scene appear larger when near and smaller when far, and objects close to the edge of the visible area are visually stretched and deformed. To address this, the position of the virtual hand is corrected according to its position in the scene by correcting the virtual manipulation vector, so that the virtual hand coincides with the real hand in both space and viewing angle. This corrects the sense of misalignment users feel during interaction, improves the realism of the interactive experience, and improves the interaction success rate and efficiency to a certain extent.

Description

Virtual hand position correction method and system based on virtual hand interaction
Technical Field
The invention relates to the technical field of virtual reality interaction, in particular to a virtual hand position correction method and system based on virtual hand interaction.
Background
Human-computer interaction technology is one of the key technologies in virtual reality applications. Its aim is to realize natural and harmonious interaction between humans and virtual scenes through various interaction devices and means, and to enhance the immersion of virtual reality interaction. With the development of virtual reality technology, the early interaction mode based on a mouse and a graphical interface can hardly meet the requirements of virtual interaction. Virtual reality systems based on multiple sensory channels bring new interaction experiences to users. In particular, a multi-channel fused virtual experiment environment, with its highly immersive character, lets users interact naturally with the virtual environment, and such systems have advantages other technologies cannot match when it comes to collecting users' behavior data.
In research on natural human-computer interaction, the hand, as the main limb through which humans interact with objects in the real world, has advantages that other interaction modes can hardly surpass. Researchers have conducted a great deal of work on hand-based virtual interaction, which has been proposed as a more natural way of interacting in virtual reality scenes. How to make the real hand resonate with the virtual hand in the virtual scene, controlling the virtual hand with the real hand, has become a research hotspot. Early work mainly relied on auxiliary hardware: the data glove invented by Grimes uses bending sensors to acquire the motion and degree-of-freedom information of each finger joint, and Davies distinguished 7 different gestures by attaching markers to the hand to obtain gesture data. However, wearable auxiliary devices increase the user's interaction burden; interaction modes in which the user wears devices on the hand have gradually fallen out of favor as technology has developed, and people now tend to interact bare-handed. The Kinect is a sensing device that can acquire human depth information, including coarse hand information, and bare-hand interaction based on the Kinect has developed rapidly. But compared with wearable devices, which can obtain accurate information for each joint of the palm, the Kinect's coarse information cannot accurately map the real hand, which changes in real time, onto the virtual hand, and this limits the realism of the virtual reality interaction mode.
Virtual hand interaction is commonly used in applications with a teaching character, such as virtual assembly, virtual surgery and virtual experiment teaching, yet it still suffers from many problems. In particular, the three-dimensional perspective of the virtual scene causes the virtual hand to drift relative to the real hand during interaction: the position of the real hand does not correspond to the position of the virtual hand in the scene, and the user's target position does not correspond to the actual position, which greatly reduces the user's operating experience and efficiency.
Disclosure of Invention
The invention aims to provide a virtual hand position correction method and system based on virtual hand interaction, so as to solve the prior-art problem that the virtual hand position does not correspond to the real hand position during virtual interaction, to correct the sense of misalignment users experience during interaction, and to improve the realism of the interactive experience.
In order to achieve the technical purpose, the invention provides a virtual hand position correction method based on virtual hand interaction, which comprises the following operations:
acquiring position information of a real hand through the Kinect, mapping the real hand to a virtual space, and acquiring the attribute of a virtual hand;
in the interaction process, when the real hand of the user moves left and right, normal vector correction is carried out on the virtual hand in real time, and a distance parameter between the real hand and a screen is added in the calculation process of the angle of the virtual hand around the y axis;
collecting statistics on the virtual manipulation vector data during the movement to obtain the relation between the angle about the y axis and the virtual hand attributes, and correcting the position of the virtual hand according to this relation.
Preferably, the attributes of the virtual hand are set as follows:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
p_x, p_y, p_z are the positions along the x, y and z axes, and r_x, r_y, r_z are the rotational degrees of freedom about the x, y and z axes.
Preferably, the relationship between the angle around the y-axis and the virtual hand attribute is:
[Equation image not reproduced: σ expressed as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
b is a gain on the real hand's operation range; ρ is a constant, obtained by substituting the initial values of σ and p_x into the formula; VE_Width is the width of the virtual space, and σ_max is the maximum angle.
Preferably, the maximum angle is calculated as follows:
[Equation image not reproduced: σ_max computed from VE_Depth, μ and D.]
VE_Depth is the depth of the virtual space; μ is a constant; D is the distance from the real hand to the screen.
The invention also provides a virtual hand position correction system based on virtual hand interaction, which comprises:
the virtual hand mapping module is used for acquiring the position information of the real hand through the Kinect, mapping the real hand to a virtual space and acquiring the attribute of the virtual hand;
the normal vector correction module is used for performing normal vector correction on the virtual hand in real time when the real hand of the user moves left and right in the interaction process, and adding a distance parameter between the real hand and the screen in the angle calculation process of the virtual hand around the y axis;
and the normal vector statistics module is used for collecting statistics on the virtual manipulation vector data during movement to obtain the relation between the angle about the y axis and the virtual hand attributes, and for correcting the position of the virtual hand according to this relation.
Preferably, the attributes of the virtual hand are set as follows:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
p_x, p_y, p_z are the positions along the x, y and z axes, and r_x, r_y, r_z are the rotational degrees of freedom about the x, y and z axes.
Preferably, the relationship between the angle around the y-axis and the virtual hand attribute is:
[Equation image not reproduced: σ expressed as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
b is a gain on the real hand's operation range; ρ is a constant, obtained by substituting the initial values of σ and p_x into the formula; VE_Width is the width of the virtual space, and σ_max is the maximum angle.
Preferably, the maximum angle is calculated as follows:
[Equation image not reproduced: σ_max computed from VE_Depth, μ and D.]
VE_Depth is the depth of the virtual space; μ is a constant; D is the distance from the real hand to the screen.
The effects stated in this summary are only those of the embodiments, not all effects of the invention. The above technical solution has the following advantages or beneficial effects:
compared with the prior art, the invention aims at the problem that the object in the interactive scene is displayed to be large and small in size and the object close to the edge of the visual region is stretched and deformed in vision by the three-dimensional visual angle display mode of the virtual scene, and realizes the position correction of the object per se by correcting the vector of the virtual hand according to the position of the virtual hand in the scene, so that the virtual hand and the real hand are superposed in space and visual angle, thereby correcting the dislocation sense of the user during interaction, improving the interactive true experience and improving the interaction success rate and the interaction efficiency to a certain extent.
Drawings
Fig. 1 is a flowchart of a virtual hand position correction method based on virtual hand interaction according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a virtual hand interaction space provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating the rotation attribute of a virtual hand according to an embodiment of the present invention;
fig. 4 is a block diagram of a virtual hand position correction system based on virtual hand interaction according to an embodiment of the present invention.
Detailed Description
In order to clearly explain the technical features of the present invention, the following detailed description is provided with reference to the accompanying drawings. The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. To simplify the disclosure, the components and arrangements of specific examples are described below. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples. This repetition is for simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. It should be noted that the components illustrated in the figures are not necessarily drawn to scale. Descriptions of well-known components, processing techniques and procedures are omitted so as not to unnecessarily limit the invention.
The following describes a virtual hand position correction method and system based on virtual hand interaction according to embodiments of the present invention in detail with reference to the accompanying drawings.
As shown in fig. 1, the present invention discloses a virtual hand position correction method based on virtual hand interaction, the method includes the following operations:
acquiring position information of a real hand through Kinect, mapping the real hand to a virtual space, and acquiring the attribute of a virtual hand;
in the interaction process, when the real hand of the user moves left and right, normal vector correction is carried out on the virtual hand in real time, and a distance parameter between the real hand and a screen is added in the calculation process of the angle of the virtual hand around the y axis;
collecting statistics on the virtual manipulation vector data during the movement to obtain the relation between the angle about the y axis and the virtual hand attributes, and correcting the position of the virtual hand according to this relation.
With the hand position information obtained by the Kinect, the real hand can be conveniently mapped into the virtual space so that the user directly controls the virtual hand. The positions along the three axes constitute three of the virtual hand's degrees of freedom; the other three degrees of freedom are the rotations about the three axes. The attributes of the virtual hand are therefore:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
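As an illustrative aside (not part of the patent text), this six-degree-of-freedom attribute set can be held in a small Python structure; the class and field names are assumptions made for the sketch:

```python
from dataclasses import dataclass

@dataclass
class VRHand:
    # position along the x, y and z axes
    p_x: float
    p_y: float
    p_z: float
    # rotational degrees of freedom about the x, y and z axes (in degrees)
    r_x: float
    r_y: float
    r_z: float
```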
if the depth coordinate of the real hand is
Figure BDA0002535121750000061
Then the coordinate mapped to virtual space &>
Figure BDA0002535121750000062
Can be expressed as: />
Figure BDA0002535121750000063
T in the matrix T x ,t y ,t z And the conversion scale of the real hand coordinate is expressed and set according to the size of the actual virtual interaction space.
Figure BDA0002535121750000064
The offset vector for the transformed coordinates is an empirical value.
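Since the patent gives the conversion scales and the offset only as scene-dependent empirical quantities, the following is a minimal sketch of this mapping under the assumption of a diagonal scale matrix; the function name and the sample values are illustrative only:

```python
import numpy as np

def map_real_to_virtual(p_real, scales, offset):
    """Map a real-hand depth coordinate from the Kinect into virtual space.

    p_real : (x_r, y_r, z_r) depth coordinate of the real hand.
    scales : conversion scales (t_x, t_y, t_z), set according to the
             size of the actual virtual interaction space.
    offset : empirical offset vector of the transformed coordinates.
    """
    T = np.diag(scales)  # conversion-scale matrix T
    return T @ np.asarray(p_real, dtype=float) + np.asarray(offset, dtype=float)

# Illustrative values only; the patent leaves these to be tuned per scene.
p_virtual = map_real_to_virtual((0.2, 0.1, 1.0),
                                scales=(2.0, 2.0, 1.5),
                                offset=(0.0, 1.0, 0.0))
```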
Because the Kinect provides only four items of hand depth data, which do not include the individual joint nodes of the hand, a gesture recognition method is used to recognize the user's gesture; the form of the virtual hand is adjusted according to the gesture, and the states of the virtual hand and the target are then changed. In this process, because the virtual interaction space renders near objects large and far objects small, the virtual hand appears stretched and misaligned with the real hand when it is located at the edge of the interaction space; at this point the position of the virtual hand must be corrected with a position correction algorithm to keep it consistent with the real hand.
When a user interacts using the virtual hand, the relevant variables of the whole interaction process are as shown in fig. 2. VE_Space denotes the whole virtual environment space; all virtual objects, including the virtual hand model, lie in this space. VE_Width denotes the effective interaction width of the virtual space; the user's operations on the virtual space cannot exceed this range. VE_Depth denotes the distance from the target object operated by the virtual hand to the screen; beyond this range the virtual hand is invisible to the user. The virtual hand movement plane denotes the plane in which the virtual hand moves left and right in the virtual space; its width equals VE_Width. The real hand movement plane denotes the translation-range plane in which the user's hand operates in front of the user.
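For concreteness, these scene quantities can be gathered into one configuration object. This is a sketch under assumed names, with μ = 30 and D = 1 taken from the description below and the remaining sample values invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class SceneConfig:
    ve_width: float    # effective interaction width of the virtual space
    ve_depth: float    # distance from the operated target object to the screen
    mu: float = 30.0   # empirical constant used in the angle correction (see below)
    d: float = 1.0     # typical real-hand-to-screen distance in Unity3D meters

# Sample values for illustration only.
cfg = SceneConfig(ve_width=4.0, ve_depth=3.0)
```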
During interaction, the user's position corresponds to the center of the virtual space. When the user's right hand moves to the left, the palm-center normal vector inevitably shifts leftward away from pointing at the screen, and common virtual-hand interaction does not correct for this. To keep the virtual manipulation vector consistent with the real one, the normal vector of the virtual hand must be adjusted in real time. The virtual hand has six attributes, including three rotation attributes, corresponding to rotations about the x, y and z axes of the virtual hand's local coordinate system; the rotation attributes are as shown in fig. 3. The XOY plane of the initialized virtual hand is parallel to the XOY plane of the virtual scene, so during left-right movement only the virtual hand's angle about the y axis needs to be corrected. Let this angle be σ, where:
[Equation image not reproduced: σ computed from the virtual hand position and the constant μ.]
μ is a constant, empirically μ = 30; however, the virtual hand adjusted according to this value still does not match the real hand vector, leaving a deviation.
Since the observation perspective depends on the distance from the user's eyes to the virtual hand plane, the distance D from the real hand to the screen should also be taken into account. In a Unity3D scene the position unit of an object is the meter, and in actual interaction the distance from the human hand to the screen typically takes the value D = 1, giving the corrected rotation angle value:
[Equation image not reproduced: the corrected σ incorporating the distance D.]
Updating the normal vector of the virtual hand in real time and collecting statistics on the actual data shows that the rotation angle σ and the virtual hand's p_x are linearly related in the virtual space:
[Equation image not reproduced: σ as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
where b = 1; since the real person's right hand lies to the right of the user's center, b acts as a gain on the right-hand operation range. ρ is a constant, obtained by substituting the initial values of σ and p_x into the above formula.
The σ obtained from the above formula corresponds to r_y in the virtual hand attributes; modifying the virtual hand's r_y accordingly keeps the real hand and virtual hand actions consistent.
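The correction formulas survive in this record only as equation images, so the following Python sketch is a hedged reading of the pipeline just described: it assumes a linear σ-p_x relation clipped to ±σ_max, takes σ_max as a precomputed input (the patent derives it from VE_Depth, μ and D in an unreproduced formula), and reuses the VRHand sketch from earlier. The exact linear form and the handling of b and ρ are assumptions:

```python
import numpy as np

def corrected_y_rotation(p_x, ve_width, sigma_max, b=1.0, rho=0.0):
    """Return the corrected rotation of the virtual hand about the y axis.

    Assumes the linear sigma/p_x relation described above: p_x across the
    interaction width maps linearly onto an angle bounded by sigma_max.
    b is the gain on the hand's operation range (b = 1 for the right hand),
    and rho is calibrated by substituting initial values of sigma and p_x.
    """
    sigma = sigma_max * 2.0 * (b * p_x - rho) / ve_width  # assumed linear form
    return float(np.clip(sigma, -sigma_max, sigma_max))   # never exceed the max angle

# Usage: write the corrected angle back into the virtual hand's r_y attribute
# (VRHand as sketched earlier), keeping virtual and real hand vectors aligned.
hand = VRHand(p_x=1.2, p_y=0.0, p_z=2.0, r_x=0.0, r_y=0.0, r_z=0.0)
hand.r_y = corrected_y_rotation(hand.p_x, ve_width=4.0, sigma_max=30.0)
```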
In the embodiment of the invention, right-handed operation is taken as an example, and the formula is applicable to both left-handed operation and right-handed operation.
The embodiment of the invention addresses the problem that, under the three-dimensional perspective rendering of the virtual scene, objects in the interactive scene appear larger when near and smaller when far, and objects close to the edge of the visible area are visually stretched and deformed. According to the position of the virtual hand in the scene, the virtual hand's position is corrected by correcting its vector, so that it coincides with the real hand in both space and viewing angle. This corrects the user's sense of misalignment during interaction, improves the realism of the interactive experience, and raises the interaction success rate and efficiency to a certain extent.
As shown in fig. 4, an embodiment of the present invention further discloses a virtual hand position correction system based on virtual hand interaction, where the system includes:
the virtual hand mapping module is used for acquiring the position information of the real hand through the Kinect, mapping the real hand to a virtual space and acquiring the attribute of the virtual hand;
the normal vector correction module is used for performing normal vector correction on the virtual hand in real time when the real hand of the user moves left and right in the interaction process, and adding a distance parameter between the real hand and the screen in the angle calculation process of the virtual hand around the y axis;
and the normal vector statistics module is used for collecting statistics on the virtual manipulation vector data during movement to obtain the relation between the angle about the y axis and the virtual hand attributes, and for correcting the position of the virtual hand according to this relation.
With the hand position information obtained by the Kinect, the real hand can be conveniently mapped into the virtual space so that the user directly controls the virtual hand. The positions along the three axes constitute three of the virtual hand's degrees of freedom; the other three degrees of freedom are the rotations about the three axes. The attributes of the virtual hand are therefore:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
If the depth coordinate of the real hand is P_hand^real = (x_r, y_r, z_r)^T, then the coordinate P_hand^virtual mapped into the virtual space can be expressed as:
P_hand^virtual = T · P_hand^real + Δ
where t_x, t_y, t_z in the (diagonal) matrix T are the conversion scales of the real hand coordinates, set according to the size of the actual virtual interaction space, and the offset vector Δ of the transformed coordinates is an empirical value.
Because the Kinect provides only four items of hand depth data, which do not include the individual joint nodes of the hand, a gesture recognition method is used to recognize the user's gesture; the form of the virtual hand is adjusted according to the gesture, and the states of the virtual hand and the target are then changed. In this process, because the virtual interaction space renders near objects large and far objects small, the virtual hand appears stretched and misaligned with the real hand when it is located at the edge of the interaction space; at this point the position of the virtual hand must be corrected with a position correction algorithm to keep it consistent with the real hand.
When a user interacts using the virtual hand, VE_Space denotes the whole virtual environment space; all virtual objects, including the virtual hand model, lie in this space. VE_Width denotes the effective interaction width of the virtual space; the user's operations on the virtual space cannot exceed this range. VE_Depth denotes the distance from the target object operated by the virtual hand to the screen; beyond this range the virtual hand is invisible to the user. The virtual hand movement plane denotes the plane in which the virtual hand moves left and right in the virtual space; its width equals VE_Width. The real hand movement plane denotes the translation-range plane in which the user's hand operates in front of the user.
During interaction, the user's position corresponds to the center of the virtual space. When the user's right hand moves to the left, the palm-center normal vector inevitably shifts leftward away from pointing at the screen, and common virtual-hand interaction does not correct for this. To keep the virtual manipulation vector consistent with the real one, the normal vector of the virtual hand must be adjusted in real time. The virtual hand has six attributes, including three rotation attributes, corresponding to rotations about the x, y and z axes of the virtual hand's local coordinate system. The XOY plane of the initialized virtual hand is parallel to the XOY plane of the virtual scene, so during left-right movement only the virtual hand's angle about the y axis needs to be corrected. Let this angle be σ, where:
[Equation image not reproduced: σ computed from the virtual hand position and the constant μ.]
μ is a constant, empirically μ = 30; however, the virtual hand adjusted according to this value still does not match the real hand vector, leaving a deviation.
Since the observation perspective depends on the distance from the user's eyes to the virtual hand plane, the distance D from the real hand to the screen should also be taken into account. In a Unity3D scene the position unit of an object is the meter, and in actual interaction the distance from the human hand to the screen typically takes the value D = 1, giving the corrected rotation angle value:
[Equation image not reproduced: the corrected σ incorporating the distance D.]
Updating the normal vector of the virtual hand in real time and collecting statistics on the actual data shows that the rotation angle σ and the virtual hand's p_x are linearly related in the virtual space:
[Equation image not reproduced: σ as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
where b = 1; since the real right hand lies to the right of the user's center, b acts as a gain on the right-hand operation range. ρ is a constant, obtained by substituting the initial values of σ and p_x into the above formula.
The σ obtained from the above formula corresponds to r_y in the virtual hand attributes; modifying the virtual hand's r_y accordingly keeps the real hand and virtual hand actions consistent.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (2)

1. A virtual hand position correction method based on virtual hand interaction, the method comprising the operations of:
acquiring position information of a real hand through Kinect, mapping the real hand to a virtual space, and acquiring the attribute of a virtual hand;
in the interaction process, when the real hand of the user moves left and right, normal vector correction is carried out on the virtual hand in real time, and a distance parameter between the real hand and a screen is added in the calculation process of the angle of the virtual hand around the y axis; wherein the y-axis is parallel to the real hand;
counting virtual manipulation vector data in the moving process to obtain the relation between the angle around the y axis and the virtual hand attribute, and correcting the position of the virtual hand according to the relation;
the attributes of the virtual hand are set as follows:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
p_x, p_y, p_z are the positions along the x, y and z axes, and r_x, r_y, r_z are the rotational degrees of freedom about the x, y and z axes;
the relationship between the angle around the y-axis and the virtual hand attribute is:
[Equation image not reproduced: σ expressed as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
b is a gain on the real hand's operation range; ρ is a constant, obtained by substituting the initial values of σ and p_x into the formula; VE_Width is the width of the virtual space, and σ_max is the maximum angle;
the maximum angle is calculated as follows:
[Equation image not reproduced: σ_max computed from VE_Depth, μ and D.]
VE_Depth is the depth of the virtual space; μ is a constant; D is the distance from the real hand to the screen;
the sigma obtained by the above formula corresponds to r in the virtual hand attribute y Modifying r of virtual hand y The value according to which the real hand and the virtual hand action are kept consistent.
2. A virtual hand position correction system based on virtual hand interaction, the system comprising:
the virtual hand mapping module is used for acquiring the position information of the real hand through the Kinect, mapping the real hand to a virtual space and acquiring the attribute of the virtual hand;
the normal vector correction module is used for performing normal vector correction on the virtual hand in real time when the real hand of the user moves left and right in the interaction process, and adding a distance parameter between the real hand and the screen in the angle calculation process of the virtual hand around the y axis; wherein the y-axis is parallel to the real hand;
and the normal vector statistics module is used for collecting statistics on the virtual manipulation vector data during movement to obtain the relation between the angle about the y axis and the virtual hand attributes, and for correcting the position of the virtual hand according to this relation.
The attributes of the virtual hand are set as follows:
VRHand = {p_x, p_y, p_z, r_x, r_y, r_z}
p_x, p_y, p_z are the positions along the x, y and z axes, and r_x, r_y, r_z are the rotational degrees of freedom about the x, y and z axes;
the relationship between the angle around the y-axis and the virtual hand attribute is:
[Equation image not reproduced: σ expressed as a linear function of p_x in terms of b, ρ, VE_Width and σ_max.]
b is a gain on the real hand's operation range; ρ is a constant, obtained by substituting the initial values of σ and p_x into the formula; VE_Width is the width of the virtual space, and σ_max is the maximum angle;
the maximum angle is calculated as follows:
[Equation image not reproduced: σ_max computed from VE_Depth, μ and D.]
VE_Depth is the depth of the virtual space; μ is a constant; D is the distance from the real hand to the screen;
the sigma obtained by the above formula corresponds to r in the virtual hand attribute y Modifying r of virtual hand y The value according to which the real hand and the virtual hand action are kept consistent.
CN202010530219.8A 2020-06-11 2020-06-11 Virtual hand position correction method and system based on virtual hand interaction Active CN111694432B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010530219.8A CN111694432B (en) 2020-06-11 2020-06-11 Virtual hand position correction method and system based on virtual hand interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010530219.8A CN111694432B (en) 2020-06-11 2020-06-11 Virtual hand position correction method and system based on virtual hand interaction

Publications (2)

Publication Number Publication Date
CN111694432A CN111694432A (en) 2020-09-22
CN111694432B (en) 2023-04-07

Family

ID=72480333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010530219.8A Active CN111694432B (en) 2020-06-11 2020-06-11 Virtual hand position correction method and system based on virtual hand interaction

Country Status (1)

Country Link
CN (1) CN111694432B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005107654A (en) * 2003-09-29 2005-04-21 Noritsu Koki Co Ltd Method for correcting camera movement and photo printing device with function to correct camera movement
CN101164084A (en) * 2005-04-21 2008-04-16 佳能株式会社 Image processing method and image processing apparatus
CN102014259A (en) * 2010-11-17 2011-04-13 杭州华泰医疗科技有限公司 Projective texture mapping-based oblique projection distortion correction method
CN102378944A (en) * 2009-03-31 2012-03-14 三菱电机株式会社 Numerical control device
CN105759967A (en) * 2016-02-19 2016-07-13 电子科技大学 Global hand gesture detecting method based on depth data
CN105809727A (en) * 2016-03-16 2016-07-27 成都电锯互动科技有限公司 Three-dimensional animation production system
CN106297471A (en) * 2016-10-25 2017-01-04 深圳市科创数字显示技术有限公司 The removable cornea intelligent operation training system that AR and VR combines
CN106582012A (en) * 2016-12-07 2017-04-26 腾讯科技(深圳)有限公司 Method and device for processing climbing operation in VR scene
CN107644395A (en) * 2016-07-21 2018-01-30 华为终端(东莞)有限公司 Image processing method and mobile device
CN108536298A (en) * 2018-03-30 2018-09-14 广东工业大学 A kind of human body mapping appearance body interacts constrained procedure with the binding of virtual rotary body
CN110799926A (en) * 2017-06-30 2020-02-14 托比股份公司 System and method for displaying images in a virtual world environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8913009B2 (en) * 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface


Also Published As

Publication number Publication date
CN111694432A (en) 2020-09-22

Similar Documents

Publication Publication Date Title
JP7213899B2 (en) Gaze-Based Interface for Augmented Reality Environments
JP5631535B2 (en) System and method for a gesture-based control system
Lu et al. Immersive manipulation of virtual objects through glove-based hand gesture interaction
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
US20120235904A1 (en) Method and System for Ergonomic Touch-free Interface
Kim et al. Interaction with hand gesture for a back-projection wall
CN113505694A (en) Human-computer interaction method and device based on sight tracking and computer equipment
CN113829357B (en) Remote operation method, device, system and medium for robot arm
CN111694432B (en) Virtual hand position correction method and system based on virtual hand interaction
Park et al. Hand tracking with a near-range depth camera for virtual object manipulation in an wearable augmented reality
Teleb et al. Data glove integration with 3d virtual environments
US11775064B1 (en) Multiple-magnet hand-mounted position-tracking device
JP5788853B2 (en) System and method for a gesture-based control system
JP2005527872A (en) Method and apparatus for interacting with a three-dimensional computer model
Shi et al. Error elimination method in moving target tracking in real-time augmented reality
Tao et al. Human-Computer Interaction Using Fingertip Based on Kinect
Turner et al. Head-Tracked Stereo Viewing with Two-Handed 3D Interaction for Animated Character Construction
Zeng et al. Virtual Hand Position Correction Algorithm Based on Virtual Hand Interaction
Henschke et al. Extending the index finger is worse than sitting: Posture and minimal objects in 3D pointing interfaces
Ma et al. Remote Object Taking and Movement in Augment Reality
Liu et al. Techniques for Selecting and Manipulating Object in Virtual Environment Based on 3-DOF Trackers and Data Glove
Maggioni Non Immersive Control of Virtual Environments
Boritz The effectiveness of three dimensional interaction
Chen et al. The Virtual Surgery of Extraocular Muscles Based on Gesture Interaction
Samad et al. Real-Time Kinect Fingers Tracking and Recognition for Servo Gripper Controller

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant