CN106845335A - Gesture recognition method and device for a virtual reality device, and virtual reality device - Google Patents

Gesture recognition method and device for a virtual reality device, and virtual reality device

Info

Publication number
CN106845335A
CN106845335A (application CN201611073934.3A)
Authority
CN
China
Prior art keywords
active user
virtual reality
current
reality device
identification method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611073934.3A
Other languages
Chinese (zh)
Other versions
CN106845335B (en)
Inventor
Zhang Qian (张茜)
Zhang Shaoqian (张绍谦)
Zhang Chao (张超)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co., Ltd.
Original Assignee
Goertek Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Technology Co., Ltd.
Priority to CN201611073934.3A (granted as CN106845335B)
Priority to PCT/CN2016/111062 (published as WO2018098861A1)
Publication of CN106845335A
Application granted
Publication of CN106845335B
Legal status: Active (anticipated expiration not listed)

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
        • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 3/00: Computing arrangements based on biological models
            • G06N 3/02: Neural networks
              • G06N 3/08: Learning methods
                • G06N 3/084: Backpropagation, e.g. using gradient descent
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 19/00: Manipulating 3D models or images for computer graphics
            • G06T 19/006: Mixed reality
          • G06T 2207/00: Indexing scheme for image analysis or image enhancement
            • G06T 2207/10: Image acquisition modality
              • G06T 2207/10028: Range image; depth image; 3D point clouds
        • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
              • G06V 40/107: Static hand or arm
              • G06V 40/117: Biometrics derived from hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture recognition method for a virtual reality device, a corresponding device, and a virtual reality device. The gesture recognition method includes: controlling a depth camera to capture a current hand image of the active user; judging from the current hand image whether the active user performs a tapping action, and if so: extracting a current feature from the current hand image; matching the current feature against reference features in a model, and determining, according to the matching result, the key tapped by the active user. By bringing a virtual keyboard into virtual reality, the method greatly improves flexibility for the user, frees up the space a physical keyboard would occupy, and reduces the frustration caused by complex text input, thereby improving the user experience.

Description

Gesture recognition method and device for a virtual reality device, and virtual reality device
Technical field
The present invention relates to the technical field of virtual reality devices, and more particularly to a gesture recognition method and device for a virtual reality device, and to a virtual reality device.
Background technology
Virtual reality (VR) is a high technology that has emerged in recent years. Virtual reality technology is a key technology of a comprehensive, integrated multi-information space, one that combines qualitative and quantitative support and unites perceptual and rational knowledge. With rising network speeds, an Internet era based on virtual reality technology is quietly arriving, and it will greatly change how people work and live. One can imagine using a VR headset to travel in space, skydive, and try other things one would not dare or only wishes to attempt, experiencing and interacting with a virtual world.
At present, emerging virtual reality technology has penetrated fields such as office work and entertainment, transforming many industries. However, interaction with current VR headsets relies mainly on speech, gestures, and the like, which cannot handle complex text processing. To enrich the interaction of VR headsets and improve the user experience, a VR virtual keyboard based on hand recognition is therefore highly valuable.
Summary of the invention
An object of the present invention is to provide a new solution for gesture recognition for a virtual reality device.
According to a first aspect of the present invention, there is provided a gesture recognition method for a virtual reality device, the virtual reality device including a depth camera, the gesture recognition method including:
controlling the depth camera to capture a current hand image of the active user;
judging from the current hand image whether the active user performs a tapping action, and if so:
extracting a current feature from the current hand image;
matching the current feature against reference features in a model, and determining, according to the matching result, the key tapped by the active user.
Optionally, the gesture recognition method further includes:
controlling the depth camera to capture a reference hand image of a reference user;
extracting the reference feature from the reference hand image, and storing the reference feature in the model.
Optionally, the virtual reality device further includes a display screen, and before judging from the current image whether the active user is tapping, the method further includes:
displaying, on the display screen, a keyboard image and the initial positions of the active user's fingers on the keyboard image.
Optionally, the gesture recognition method further includes:
displaying on the display screen, according to the key tapped by the active user, the current positions of the active user's fingers on the keyboard image.
According to a second aspect of the present invention, there is provided a gesture recognition device for a virtual reality device, including:
a first control module, for controlling the depth camera to capture a current hand image of the active user;
a judging module, for judging from the current hand image whether the active user performs a tapping action;
a current-feature extraction module, for extracting a current feature from the current hand image when the judging module's result is positive;
a matching module, for matching the current feature against reference features in a model and determining, according to the matching result, the key tapped by the active user.
Optionally, the gesture recognition device further includes:
a second control module, for controlling the depth camera to capture a reference hand image of a reference user;
a reference-feature extraction module, for extracting the reference feature from the reference hand image and storing the reference feature in the model.
Optionally, the virtual reality device further includes a display screen, and the gesture recognition device further includes:
a first display module, for displaying, on the display screen, a keyboard image and the initial positions of the active user's fingers on the keyboard image.
Optionally, the gesture recognition device further includes:
a second display module, for displaying on the display screen, according to the key determined by the matching module to have been tapped by the active user, the current positions of the active user's fingers on the keyboard image.
According to a third aspect of the present invention, there is provided a virtual reality device including the gesture recognition device according to the second aspect of the present invention.
According to a fourth aspect of the present invention, there is provided a virtual reality device including a depth camera, a processor, and a memory, the depth camera being used to capture images, the memory being used to store instructions, and the instructions being used to control the processor to perform the gesture recognition method according to the first aspect of the present invention.
The inventors of the present invention found that, in the prior art, interaction with head-mounted virtual reality devices relies mainly on speech, gestures, and the like, which cannot handle complex text processing. Therefore, the technical task to be accomplished or the technical problem to be solved by the present invention is one that those skilled in the art have never contemplated or anticipated, and so the present invention is a new technical solution.
A beneficial effect of the present invention is that, by using depth-sensor gesture recognition to distinguish gestures, identify the left and right hands, and obtain fingertip coordinates, and by bringing a virtual keyboard into virtual reality, the invention greatly improves flexibility for the user, frees up the space a physical keyboard would occupy, and reduces the frustration caused by complex text input, thereby improving the user experience.
Further features of the present invention and their advantages will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and together with their description serve to explain the principles of the invention.
Fig. 1 is a flowchart of an embodiment of a gesture recognition method for a virtual reality device according to the present invention;
Fig. 2 is a block diagram of an implementation structure of a gesture recognition device for a virtual reality device according to the present invention;
Fig. 3 is a block diagram of an implementation structure of a virtual reality device according to the present invention.
Specific embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application, or its uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but, where appropriate, should be regarded as part of the specification.
In all of the examples shown and discussed here, any specific value should be interpreted as merely exemplary and not as a limitation. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; once an item has been defined in one figure, it need not be discussed further in subsequent figures.
To solve the problem in the prior art that interaction with head-mounted virtual reality devices relies mainly on speech, gestures, and the like and cannot handle complex text processing, a gesture recognition method for a virtual reality device is provided, the virtual reality device including a depth camera.
A depth camera, also called a depth sensor or 3D sensor, may for example be a TOF (time-of-flight) camera: it emits modulated near-infrared light, which is reflected back by the objects it strikes; by computing the time difference or phase difference between emission and reflection, the camera converts the measurement into the distance of the photographed object, producing depth information. Combined with a traditional camera image, the three-dimensional contour of the object can then be rendered as an image in which different colors represent different distances.
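As a rough illustration of the time-of-flight principle just described, the distance to the reflecting object can be recovered from the phase shift between the emitted and reflected modulated light. This is a didactic sketch; the 20 MHz modulation frequency is an assumed example value, not one specified by the patent:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance implied by the phase difference between emitted and
    reflected modulated near-infrared light: d = c * dphi / (4 * pi * f).
    The light travels out and back, hence the factor of 2 folded into
    the 4 * pi denominator."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# A half-cycle phase shift at an assumed 20 MHz modulation frequency
# corresponds to roughly 3.75 m, the maximum unambiguous range.
print(round(tof_distance(math.pi, 20e6), 3))
```

A real TOF sensor measures this phase per pixel, yielding the depth map the method relies on; distances beyond c/(2f) alias back into range, which is why the modulation frequency bounds the usable working volume.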
Fig. 1 is a flowchart of an embodiment of a gesture recognition method for a virtual reality device according to the present invention.
As shown in Fig. 1, the gesture recognition method includes the following steps:
Step S110: control the depth camera to capture a current hand image of the active user.
Specifically, for example, a TOF camera may be controlled to emit modulated near-infrared light, which is reflected back by the active user's hand; by converting the time difference or phase difference between emission and reflection, the camera computes the distance to any position on the active user's hand, producing depth information. Combined with a traditional camera image, the three-dimensional contour of the hand can then be rendered as a current hand image in which different colors represent different distances.
Step S120: judge from the current hand image whether the active user performs a tapping action; if so, proceed to step S130; if not, continue with step S110.
Fingertip positions can be computed from the curvature at each point of the hand contour captured by the depth camera in the current hand image. The curvature is determined at each point of the hand contour at a certain step length; fingertip curvature falls within a certain range, so checking whether each point's curvature lies within that range determines the fingertip positions. From the result of the gesture judgment and the fingertip positions, the positions of the hand's other key points, for example the joints, can then be computed morphologically.
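A minimal sketch of this contour-curvature idea, under the assumption that the hand contour is available as an ordered list of (x, y) points: sharpness at each point is measured as the turning angle between neighbors a fixed step apart, and points sharper than a threshold become fingertip candidates. The step size and angle threshold here are illustrative, not values from the patent:

```python
import math

def turning_angle(contour, i, k=2):
    """Interior angle (degrees) at contour[i] between the vectors to the
    points k steps behind and ahead; small angles mean sharp curvature."""
    n = len(contour)
    ax, ay = contour[(i - k) % n]
    bx, by = contour[i]
    cx, cy = contour[(i + k) % n]
    v1, v2 = (ax - bx, ay - by), (cx - bx, cy - by)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def fingertip_candidates(contour, k=2, max_angle=60.0):
    """Indices of contour points sharper than max_angle (fingertip-like)."""
    return [i for i in range(len(contour))
            if turning_angle(contour, i, k) < max_angle]
```

On a real contour the candidates would additionally be filtered by the curvature range the patent mentions and by requiring local extremality, so that knuckle bumps and noise do not register as fingertips.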
The depth sensor may be arranged at any position above the virtual reality device or in front of the user's hands. In one specific embodiment of the present invention, the depth sensor may be provided on the virtual reality device. After the depth sensor captures the active user's current hand image, the current fingertip coordinates of both hands are computed from the curvature of the hand contour points in that image, the hand's other key points are inferred from the fingertip coordinates, and the fingertips and other key points are rendered, for example by 3D rendering, as a hand image superimposed on the VR keyboard image. The depth changes of the user's fingers can then be checked against the current hand image to confirm whether a key has been tapped.
Using 3D rendering makes the image clear, raises its resolution, and allows the viewing angle and visual range to be adjusted appropriately.
Since the depth sensor produces a depth map, the depth differs according to the hand's distance from the sensor: when a finger is lifted the depth value is smaller, and when the finger falls the depth value is larger, so whether the active user performs a tapping action can be determined accordingly.
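This depth-change rule can be sketched as a small state machine over the per-frame depth of one fingertip. The millimeter threshold and the hysteresis (the finger must lift again before the next tap can fire) are assumptions for illustration, not values given by the patent:

```python
def detect_taps(depths, threshold=25):
    """Return frame indices where a tap begins. A tap fires when the
    fingertip depth (mm; larger = farther from the head-mounted sensor)
    rises at least `threshold` above the lifted baseline; the finger
    must return within threshold/2 of the baseline before re-arming."""
    taps, down = [], False
    base = depths[0]
    for i in range(1, len(depths)):
        if not down and depths[i] - base >= threshold:
            taps.append(i)   # finger fell onto the (virtual) key
            down = True
        elif down and depths[i] - base <= threshold / 2:
            down = False     # finger lifted again: re-arm
            base = min(base, depths[i])
    return taps

# Two dips below the baseline produce two taps.
print(detect_taps([500, 505, 530, 540, 510, 500, 535, 505]))  # [2, 6]
```

The hysteresis is the important design choice: without it, a finger hovering near the threshold would register a burst of spurious taps from sensor noise alone.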
In one specific embodiment of the present invention, the virtual reality device further includes a display screen, and before step S120 the method further includes:
displaying, on the display screen, a keyboard image and the initial positions of the active user's fingers on the keyboard image.
When the active user uses the virtual device, for example a head-mounted virtual reality device, for text entry, the active user may rest on a nearby flat object such as a desk to relieve arm strain. The active user's hands can be placed as they would be on a physical keyboard; the depth camera obtains the current hand image and fingertip coordinates, and by normalizing the index fingertips, the left index finger is placed at the F key of the keyboard and the right index finger at the J key. The active user can then, guided by the virtual reality device's display, adjust the other fingers appropriately so that each finger falls on its correct initial position.
In this way, the active user can clearly see the position of the key to be pressed on the keyboard and aim the tapping action accordingly, improving the user experience.
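To make the home-row alignment concrete, here is one possible mapping from a fingertip coordinate, normalized over the rendered keyboard image, to the key beneath it. The three-row letter layout and the uniform (unstaggered) key grid are simplifying assumptions for illustration only:

```python
# Hypothetical 3-row letter layout; real keyboards stagger their rows,
# which this sketch ignores.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x: float, y: float) -> str:
    """Map a fingertip position, normalized to [0, 1) over the rendered
    keyboard image (x across, y down), to the key under it."""
    row = ROWS[min(int(y * len(ROWS)), len(ROWS) - 1)]
    col = min(int(x * len(row)), len(row) - 1)
    return row[col]

# Home-row anchors: left index near F, right index near J.
print(key_at(0.35, 0.5), key_at(0.72, 0.5))  # f j
```

With such a mapping, the normalization step described above amounts to choosing the offset and scale that put the detected left and right index fingertips onto the F and J cells of this grid.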
Step S130: extract a current feature from the current hand image.
In one specific embodiment of the present invention, the current feature is extracted with an improved convolutional neural network (CNN) algorithm. The improved CNN algorithm obtains feature points of the user's hand (for example, a dozen or so) through convolution, then passes through max-pooling layers and fully connected layers to obtain all the neurons for the hand; the improved CNN algorithm may use a CNN framework such as Caffe or TensorFlow. The current feature is then extracted from the result.
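For readers unfamiliar with the building blocks named above, the following pure-Python toy shows what a convolution layer and a max-pooling layer compute on a tiny image. It is a didactic sketch only; the patent's improved CNN would run in a framework such as Caffe or TensorFlow with learned kernels:

```python
def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as CNN layers compute)
    of a 2-D list `img` with a 2-D list `kernel`."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    return [[sum(img[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(w - kw + 1)]
            for i in range(h - kh + 1)]

def maxpool2d(img, size=2):
    """Non-overlapping max pooling: keep the strongest response
    in each size x size window."""
    return [[max(img[i + u][j + v]
                 for u in range(size) for v in range(size))
             for j in range(0, len(img[0]) - size + 1, size)]
            for i in range(0, len(img) - size + 1, size)]
```

Stacking such layers and flattening the result into fully connected layers is what yields the hand feature vector from which the current feature is taken.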
Step S140: match the current feature against the reference features in the model, and determine, according to the matching result, the key tapped by the active user.
The reference features may, for example, have been stored in the model before the virtual reality device left the factory, or stored before the active user first used the virtual reality device.
In one specific embodiment of the present invention, before step S140 is performed, the gesture recognition method further includes:
controlling the depth camera to capture a reference hand image of a reference user;
extracting the reference feature from the reference hand image, and storing the reference feature in the model.
In particular, reference hand images of the various hand actions a user makes when tapping each key of the keyboard are collected, a reference feature is extracted from each reference hand image, and a model containing the reference features corresponding to the various hand actions is built. The resulting model can then be matched against the current feature; if the match succeeds, the key tapped by the active user, namely the one corresponding to the successfully matched reference feature, can be determined.
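One simple way to realize this matching step, assuming each reference feature is stored as a numeric vector labeled with its key, is nearest-neighbor search in Euclidean distance, with a rejection threshold so that hand poses unlike any stored tap return no key. The threshold value and the dictionary layout are assumptions for illustration:

```python
import math

def match_key(current, references):
    """references: {key_label: feature_vector}. Return the label of the
    reference feature nearest to `current` in Euclidean distance, or
    None when even the best match is too far to trust."""
    best, best_d = None, float("inf")
    for label, ref in references.items():
        d = math.dist(current, ref)  # Euclidean distance (Python 3.8+)
        if d < best_d:
            best, best_d = label, d
    return best if best_d < 1.0 else None  # rejection threshold: illustrative
```

In practice the rejection threshold would be tuned on held-out reference images, trading missed taps against phantom keystrokes.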
After step S140 is performed, the gesture recognition method further includes:
displaying on the display screen, according to the key tapped by the active user, the current positions of the active user's fingers on the keyboard image.
Specifically, the method of displaying the current positions of the active user's fingers on the keyboard image on the display screen may be the same as the method of displaying the initial positions described above.
Thus, by using depth-sensor gesture recognition to distinguish gestures, identify the left and right hands, and obtain fingertip coordinates, the present invention shows the user's fingers tapping the keyboard image in front of the user, completing the input of characters or numbers. Bringing the virtual keyboard into virtual reality greatly improves flexibility for the user, frees up the space a physical keyboard would occupy, and reduces the frustration caused by complex text input, thereby improving the user experience.
The present invention also provides a gesture recognition device for a virtual reality device. Fig. 2 is a block diagram of an implementation structure of a gesture recognition device for a virtual reality device according to the present invention.
As shown in Fig. 2, the gesture recognition device 200 includes a first control module 210, a judging module 220, a current-feature extraction module 230, and a matching module 240.
The first control module 210 is used to control the depth camera to capture a current hand image of the active user;
the judging module 220 is used to judge from the current hand image whether the active user performs a tapping action;
the current-feature extraction module 230 is used to extract a current feature from the current hand image when the judging module's result is positive;
the matching module 240 is used to match the current feature against the reference features in the model and determine, according to the matching result, the key tapped by the active user.
Specifically, the gesture recognition device further includes a second control module and a reference-feature extraction module. The second control module is used to control the depth camera to capture a reference hand image of a reference user; the reference-feature extraction module is used to extract the reference feature from the reference hand image and store the reference feature in the model.
Further, the virtual reality device further includes a display screen, and the gesture recognition device further includes a first display module for displaying, on the display screen, a keyboard image and the initial positions of the active user's fingers on the keyboard image.
On this basis, the gesture recognition device further includes a second display module for displaying on the display screen, according to the key determined by the matching module to have been tapped by the active user, the current positions of the active user's fingers on the keyboard image.
The present invention also provides a virtual reality device. According to one aspect of the invention, the virtual reality device includes the gesture recognition device 200 for a virtual reality device. The virtual reality device may, for example, be a product such as virtual reality glasses or a virtual reality helmet.
Fig. 3 is a block diagram of an implementation structure of a virtual reality device according to another aspect of the present invention.
As shown in Fig. 3, the virtual reality device 300 includes a memory 301 and a processor 302. The memory 301 is used to store instructions, and the instructions are used to control the processor 302 to perform the gesture recognition method for a virtual reality device described above.
In addition, as shown in Fig. 3, the virtual reality device 300 further includes an interface apparatus 303, an input apparatus 304, a display apparatus 305, a communication apparatus 306, and so on. Although multiple apparatuses are shown in Fig. 3, the present invention may involve only some of them, for example the memory 301, the processor 302, and the interface apparatus 303.
The communication apparatus 306 can, for example, perform wired or wireless communication.
The interface apparatus 303 includes, for example, a headphone jack, a USB interface, and the like.
The input apparatus 304 may include, for example, a touchscreen, keys, and the like.
The display apparatus 305 is, for example, a liquid-crystal display, a touch display screen, or the like.
The embodiments above each focus on describing their differences from the other embodiments, but it should be clear to those skilled in the art that the embodiments above may be used alone or in combination with one another as needed.
Each embodiment in this specification is described in a progressive manner; identical or similar parts of the embodiments may be cross-referenced. In particular, since the device embodiment corresponds to the method embodiment, its description is relatively brief, and for related details reference may be made to the explanation of the corresponding part of the method embodiment. The system embodiment described above is merely illustrative, and modules described as separate components may or may not be physically separate.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement aspects of the present invention.
The computer-readable storage medium may be a tangible device that can retain and store instructions for use by an instruction-executing device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific (non-exhaustive) examples of the computer-readable storage medium include: a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random-access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used here, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described here can be downloaded from a computer-readable storage medium to the respective computing/processing devices, or downloaded over a network, such as the Internet, a local-area network, a wide-area network, and/or a wireless network, to an external computer or external storage device. The network may include copper transmission cables, optical-fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within that computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++ and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In scenarios involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local-area network (LAN) or a wide-area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, for example a programmable logic device, a field-programmable gate array (FPGA), or a programmable logic array (PLA), is personalized by utilizing state information of the computer-readable program instructions; the electronic circuit can execute the computer-readable program instructions so as to implement aspects of the present invention.
Referring herein to method according to embodiments of the present invention, device (system) and computer program product flow chart and/ Or block diagram describes various aspects of the invention.It should be appreciated that each square frame and flow chart of flow chart and/or block diagram and/ Or in block diagram each square frame combination, can be realized by computer-readable program instructions.
These computer-readable program instructions can be supplied to all-purpose computer, special-purpose computer or other programmable datas The processor of processing unit, so as to produce a kind of machine so that these instructions are by computer or other programmable datas During the computing device of processing unit, work(specified in one or more square frames realized in flow chart and/or block diagram is generated The device of energy/action.Can also be the storage of these computer-readable program instructions in a computer-readable storage medium, these refer to Order causes that computer, programmable data processing unit and/or other equipment work in a specific way, so that, be stored with instruction Computer-readable medium then includes a manufacture, and it includes realizing in one or more square frames in flow chart and/or block diagram The instruction of the various aspects of the function/action of regulation.
The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices, causing a series of operational steps to be performed on the computer, other programmable apparatus, or other devices so as to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other devices implement the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to multiple embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical function. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or actions, or by combinations of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
Various embodiments of the present invention have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technological improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.

Claims (10)

1. A gesture recognition method for a virtual reality device, the virtual reality device comprising a depth camera, wherein the gesture recognition method comprises:
controlling the depth camera to capture a current hand image of a current user;
judging, according to the current hand image, whether the current user performs a tapping action, and if so:
extracting a current feature from the current hand image; and
matching the current feature against the reference features in a model, and determining, according to the matching result, the key tapped by the current user.
2. The gesture recognition method according to claim 1, wherein the gesture recognition method further comprises:
controlling the depth camera to capture a reference hand image of a reference user; and
extracting the reference feature from the reference hand image, and storing the reference feature in the model.
3. The gesture recognition method according to claim 1, wherein the virtual reality device further comprises a display screen, and before judging according to the current hand image whether the current user performs a tapping action, the method further comprises:
displaying, on the display screen, a keyboard image and the initial positions of the current user's fingers on the keyboard image.
4. The gesture recognition method according to claim 3, wherein the gesture recognition method further comprises:
displaying, on the display screen and according to the key tapped by the current user, the current positions of the current user's fingers on the keyboard image.
5. A gesture recognition apparatus for a virtual reality device, comprising:
a first control module, configured to control the depth camera to capture a current hand image of a current user;
a judging module, configured to judge, according to the current hand image, whether the current user performs a tapping action;
a current-feature extraction module, configured to extract a current feature from the current hand image when the judgment result of the judging module is affirmative; and
a matching module, configured to match the current feature against the reference features in a model and to determine, according to the matching result, the key tapped by the current user.
6. The gesture recognition apparatus according to claim 5, wherein the gesture recognition apparatus further comprises:
a second control module, configured to control the depth camera to capture a reference hand image of a reference user; and
a reference-feature extraction module, configured to extract the reference feature from the reference hand image and to store the reference feature in the model.
7. The gesture recognition apparatus according to claim 5, wherein the virtual reality device further comprises a display screen, and the gesture recognition apparatus further comprises:
a first display module, configured to display, on the display screen, a keyboard image and the initial positions of the current user's fingers on the keyboard image.
8. The gesture recognition apparatus according to claim 7, wherein the gesture recognition apparatus further comprises:
a second display module, configured to display, on the display screen and according to the key tapped by the current user as determined by the matching module, the current positions of the current user's fingers on the keyboard image.
9. A virtual reality device, comprising the gesture recognition apparatus according to any one of claims 5-8.
10. A virtual reality device, comprising a depth camera, a processor, and a memory, wherein the depth camera is configured to capture images, the memory is configured to store instructions, and the instructions are used to control the processor to execute the gesture recognition method according to any one of claims 1-4.
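For illustration only, and not as part of the claims, the recognition pipeline of claims 1 and 2 (capture a depth image, detect a tapping action, extract a feature, match it against stored reference features, and report the tapped key) can be sketched as follows. The patent does not specify concrete algorithms, so the depth-histogram feature, the mean-depth tap test, and the synthetic images below are all hypothetical stand-ins:

```python
import numpy as np

def extract_feature(hand_image):
    # Hypothetical "current feature": a normalized depth histogram of the
    # hand region (the patent leaves the actual feature unspecified).
    hist, _ = np.histogram(hand_image, bins=16, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def performs_tap(previous_image, current_image, threshold=0.05):
    # Hypothetical tap test: the hand's mean depth drops sharply between
    # consecutive frames, as a fingertip strikes toward the virtual keyboard.
    return bool((previous_image.mean() - current_image.mean()) > threshold)

def match_key(current_feature, model):
    # Matching step of claim 1: return the key whose stored reference
    # feature (built in the reference phase of claim 2) is nearest.
    return min(model, key=lambda k: np.linalg.norm(model[k] - current_feature))

rng = np.random.default_rng(0)
shape = (32, 32)

# Reference phase (claim 2): capture one reference hand image per key and
# store its feature in the model. Each key sits at a distinct synthetic depth.
depths = {"A": 0.30, "S": 0.45, "D": 0.60, "F": 0.75}
ref_images = {k: np.full(shape, d) + rng.normal(0, 0.005, shape)
              for k, d in depths.items()}
model = {k: extract_feature(img) for k, img in ref_images.items()}

# Recognition phase (claim 1): the current frame resembles the 'D' reference,
# and in the previous frame the hand was 0.1 farther away, i.e. a tap landed.
current_image = ref_images["D"] + rng.normal(0, 0.005, shape)
previous_image = current_image + 0.1

tapped_key = None
if performs_tap(previous_image, current_image):
    tapped_key = match_key(extract_feature(current_image), model)
print(tapped_key)  # prints: D
```

A real implementation would segment the hand from the depth image and track individual fingertips rather than compare whole-frame statistics, but the control flow — capture, judge tap, extract, match, report key — mirrors the claimed method.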
CN201611073934.3A 2016-11-29 2016-11-29 Gesture recognition method and device for virtual reality equipment and virtual reality equipment Active CN106845335B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201611073934.3A CN106845335B (en) 2016-11-29 2016-11-29 Gesture recognition method and device for virtual reality equipment and virtual reality equipment
PCT/CN2016/111062 WO2018098861A1 (en) 2016-11-29 2016-12-20 Gesture recognition method and device for virtual reality apparatus, and virtual reality apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611073934.3A CN106845335B (en) 2016-11-29 2016-11-29 Gesture recognition method and device for virtual reality equipment and virtual reality equipment

Publications (2)

Publication Number Publication Date
CN106845335A true CN106845335A (en) 2017-06-13
CN106845335B CN106845335B (en) 2020-03-17

Family

ID=59145422

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611073934.3A Active CN106845335B (en) 2016-11-29 2016-11-29 Gesture recognition method and device for virtual reality equipment and virtual reality equipment

Country Status (2)

Country Link
CN (1) CN106845335B (en)
WO (1) WO2018098861A1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107490983A (en) * 2017-09-29 2017-12-19 中国船舶重工集团公司第七〇四研究所 A kind of emulation mode for simulating parachute jumping full experience
CN107644631A (en) * 2017-10-13 2018-01-30 深圳市明德智慧教育科技有限公司 Method, system and the virtual reality device of music input based on virtual reality
CN107693117A (en) * 2017-09-29 2018-02-16 黑龙江蓝智科技有限公司 Augmented reality aided surgery system and the method that 3D models and patient with operation are subjected to automatic reclosing matching
CN108052277A (en) * 2017-12-14 2018-05-18 深圳市艾德互联网络有限公司 A kind of AR positioning learning methods and device
CN108519855A (en) * 2018-04-17 2018-09-11 北京小米移动软件有限公司 Characters input method and device
CN108815845A (en) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN109508635A (en) * 2018-10-08 2019-03-22 哈尔滨理工大学 A kind of traffic light recognition method based on TensorFlow combination multi-layer C NN network
CN109598998A (en) * 2018-11-30 2019-04-09 深圳供电局有限公司 Power grid training wearable device and its exchange method based on gesture identification
CN109857244A (en) * 2017-11-30 2019-06-07 百度在线网络技术(北京)有限公司 A kind of gesture identification method, device, terminal device, storage medium and VR glasses
CN109933190A (en) * 2019-02-02 2019-06-25 青岛小鸟看看科技有限公司 One kind wearing display equipment and its exchange method
CN110096166A (en) * 2019-04-23 2019-08-06 广东工业大学华立学院 A kind of virtual input method
CN110321174A (en) * 2019-06-25 2019-10-11 Oppo广东移动通信有限公司 A kind of starting-up method and device, equipment, storage medium
CN111443831A (en) * 2020-03-30 2020-07-24 北京嘉楠捷思信息技术有限公司 Gesture recognition method and device
CN111766947A (en) * 2020-06-30 2020-10-13 歌尔科技有限公司 Display method, display device, wearable device and medium
CN112462937A (en) * 2020-11-23 2021-03-09 青岛小鸟看看科技有限公司 Local perspective method and device of virtual reality equipment and virtual reality equipment
CN113269089A (en) * 2021-05-25 2021-08-17 上海人工智能研究院有限公司 Real-time gesture recognition method and system based on deep learning

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111158476B (en) * 2019-12-25 2023-05-23 中国人民解放军军事科学院国防科技创新研究院 Key recognition method, system, equipment and storage medium of virtual keyboard
CN113299132A (en) * 2021-06-08 2021-08-24 上海松鼠课堂人工智能科技有限公司 Student speech skill training method and system based on virtual reality scene

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
CN104636725A (en) * 2015-02-04 2015-05-20 华中科技大学 Gesture recognition method based on depth image and gesture recognition system based on depth images
US20150301592A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
WO2016010797A1 (en) * 2014-07-15 2016-01-21 Microsoft Technology Licensing, Llc Holographic keyboard display

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012252584A (en) * 2011-06-03 2012-12-20 Nakayo Telecommun Inc Virtual keyboard input method
WO2013144807A1 (en) * 2012-03-26 2013-10-03 Primesense Ltd. Enhanced virtual touchpad and touchscreen
CN102778951B (en) * 2012-06-15 2016-02-10 惠州华阳通用电子有限公司 Use input equipment and the input method of virtual key
US9305229B2 (en) * 2012-07-30 2016-04-05 Bruno Delean Method and system for vision based interfacing with a computer
CN103105930A (en) * 2013-01-16 2013-05-15 中国科学院自动化研究所 Non-contact type intelligent inputting method based on video images and device using the same
CN104423578B (en) * 2013-08-25 2019-08-06 杭州凌感科技有限公司 Interactive input system and method
CN105224069B (en) * 2014-07-03 2019-03-19 王登高 A kind of augmented reality dummy keyboard input method and the device using this method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
US20150301592A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
WO2016010797A1 (en) * 2014-07-15 2016-01-21 Microsoft Technology Licensing, Llc Holographic keyboard display
CN104636725A (en) * 2015-02-04 2015-05-20 华中科技大学 Gesture recognition method based on depth image and gesture recognition system based on depth images

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107490983A (en) * 2017-09-29 2017-12-19 中国船舶重工集团公司第七〇四研究所 A kind of emulation mode for simulating parachute jumping full experience
CN107693117A (en) * 2017-09-29 2018-02-16 黑龙江蓝智科技有限公司 Augmented reality aided surgery system and the method that 3D models and patient with operation are subjected to automatic reclosing matching
CN107693117B (en) * 2017-09-29 2020-06-12 苏州蓝软智能医疗科技有限公司 Auxiliary operation system and method for automatically matching 3D model and operation patient in superposition mode
CN107644631A (en) * 2017-10-13 2018-01-30 深圳市明德智慧教育科技有限公司 Method, system and the virtual reality device of music input based on virtual reality
CN109857244A (en) * 2017-11-30 2019-06-07 百度在线网络技术(北京)有限公司 A kind of gesture identification method, device, terminal device, storage medium and VR glasses
CN109857244B (en) * 2017-11-30 2023-09-01 百度在线网络技术(北京)有限公司 Gesture recognition method and device, terminal equipment, storage medium and VR glasses
CN108052277A (en) * 2017-12-14 2018-05-18 深圳市艾德互联网络有限公司 A kind of AR positioning learning methods and device
CN108519855A (en) * 2018-04-17 2018-09-11 北京小米移动软件有限公司 Characters input method and device
CN108815845A (en) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN108815845B (en) * 2018-05-15 2019-11-26 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN109508635A (en) * 2018-10-08 2019-03-22 哈尔滨理工大学 A kind of traffic light recognition method based on TensorFlow combination multi-layer C NN network
CN109508635B (en) * 2018-10-08 2022-01-07 海南师范大学 Traffic light identification method based on TensorFlow combined with multilayer CNN network
CN109598998A (en) * 2018-11-30 2019-04-09 深圳供电局有限公司 Power grid training wearable device and its exchange method based on gesture identification
CN109933190A (en) * 2019-02-02 2019-06-25 青岛小鸟看看科技有限公司 One kind wearing display equipment and its exchange method
CN109933190B (en) * 2019-02-02 2022-07-19 青岛小鸟看看科技有限公司 Head-mounted display equipment and interaction method thereof
CN110096166A (en) * 2019-04-23 2019-08-06 广东工业大学华立学院 A kind of virtual input method
CN110321174A (en) * 2019-06-25 2019-10-11 Oppo广东移动通信有限公司 A kind of starting-up method and device, equipment, storage medium
CN111443831A (en) * 2020-03-30 2020-07-24 北京嘉楠捷思信息技术有限公司 Gesture recognition method and device
CN111766947A (en) * 2020-06-30 2020-10-13 歌尔科技有限公司 Display method, display device, wearable device and medium
CN112462937A (en) * 2020-11-23 2021-03-09 青岛小鸟看看科技有限公司 Local perspective method and device of virtual reality equipment and virtual reality equipment
US11861071B2 (en) 2020-11-23 2024-01-02 Qingdao Pico Technology Co., Ltd. Local perspective method and device of virtual reality equipment and virtual reality equipment
CN113269089A (en) * 2021-05-25 2021-08-17 上海人工智能研究院有限公司 Real-time gesture recognition method and system based on deep learning

Also Published As

Publication number Publication date
CN106845335B (en) 2020-03-17
WO2018098861A1 (en) 2018-06-07

Similar Documents

Publication Publication Date Title
CN106845335A (en) Gesture identification method, device and virtual reality device for virtual reality device
AU2010366331B2 (en) User interface, apparatus and method for gesture recognition
US8376854B2 (en) Around device interaction for controlling an electronic device, for controlling a computer game and for user verification
CN106062673A (en) Controlling a computing-based device using gestures
CN108305325A (en) The display methods and device of virtual objects
TW201814438A (en) Virtual reality scene-based input method and device
CN105518708A (en) Method and equipment for verifying living human face, and computer program product
CN104023802B (en) Use the control of the electronic installation of neural analysis
US20150241984A1 (en) Methods and Devices for Natural Human Interfaces and for Man Machine and Machine to Machine Activities
CN111124117B (en) Augmented reality interaction method and device based on sketch of hand drawing
CN107463331A (en) Gesture path analogy method, device and electronic equipment
CN107832001A (en) Information processing method, device, electronic equipment and storage medium
CN107357434A (en) Information input equipment, system and method under a kind of reality environment
WO2017114002A1 (en) Device and method for inputting one-dimensional handwritten text
CN106598235A (en) Gesture recognition method and apparatus for virtual reality device, and virtual reality device
JP2017534135A (en) Method for simulating and controlling a virtual ball on a mobile device
CN107329564B (en) Man-machine finger guessing method based on gesture intelligent perception and man-machine cooperation mechanism
CN107450717A (en) A kind of information processing method and Wearable
CN108815845B (en) The information processing method and device of human-computer interaction, computer equipment and readable medium
WO2023036810A1 (en) Method for detecting user input to a breath input configured user interface
CN107982916A (en) Information processing method, device, electronic equipment and storage medium
CN103294210B (en) A kind of method generating dummy keyboard
US11054941B2 (en) Information processing system, information processing method, and program for correcting operation direction and operation amount
KR101759631B1 (en) Method for providing user interface for card game, and server and computer-readable recording media using the same
CN112634441B (en) 3D human body model generation method, system and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201012

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Patentee before: GOERTEK TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221212

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.

TR01 Transfer of patent right