CN107281750A - VR head-mounted display action recognition method and VR head-mounted display - Google Patents

VR head-mounted display action recognition method and VR head-mounted display

Info

Publication number
CN107281750A
CN107281750A
Authority
CN
China
Prior art keywords
picture
action
result
multiframe
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710305138.6A
Other languages
Chinese (zh)
Inventor
彭超
袁澄波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen City Henkok Electronic Technology Co Ltd
Original Assignee
Shenzhen City Henkok Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen City Henkok Electronic Technology Co Ltd filed Critical Shenzhen City Henkok Electronic Technology Co Ltd
Priority to CN201710305138.6A
Publication of CN107281750A
Pending legal-status Current

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/56 - Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a VR head-mounted display action recognition method, the method comprising the following steps: step S101, the VR head-mounted display recognizes a gesture of the user; step S102, the VR head-mounted display transmits the action result of the recognized gesture to a computer device; step S103, the computer device controls a character in a game to move according to the action result. The technical solution provided by the invention has the advantage of a high user experience.

Description

VR head-mounted display action recognition method and VR head-mounted display
Technical field
The present invention relates to the field of virtual reality, and more particularly to a VR head-mounted display action recognition method and a VR head-mounted display.
Background art
Current VR all-in-one devices and VR boxes (in which the main body is a mobile phone) all rely on a host with an ARM-core CPU paired with a smart Android system or Apple iOS system. Such a machine itself possesses a certain graphics and image processing capability and runs the game and video content resources supplied with the system. Existing VR head-mounted displays cannot recognize actions to control the operation of VR games, which is inconvenient for the user.
Summary of the invention
The technical problem to be solved by the embodiments of the present invention is to provide a VR head-mounted display action recognition method and a VR head-mounted display, so as to give the user the advantage of convenient use.
In order to solve the above technical problem, an embodiment of the present invention provides a VR head-mounted display action recognition method, the method comprising the following steps:
Step S101: the VR head-mounted display recognizes a gesture of the user;
Step S102: the VR head-mounted display transmits the action result of the recognized gesture to a computer device;
Step S103: the computer device controls a character in a game to move according to the action result.
Optionally, step S101 may be implemented as follows:
The VR head-mounted display periodically captures pictures to obtain a multi-frame picture sequence, performs image processing on the frames to recognize the direction of motion of the hand, and determines the action result of the gesture from that direction of motion. For example, if the direction of motion is to the right, the action result is determined to be a move to the right; likewise, motions to the left, upward, downward and so on can be recognized as the corresponding actions.
Optionally, performing image processing on the multi-frame sequence to recognize the direction of motion of the hand is specifically:
Two consecutive frames of the multi-frame sequence are taken as one picture group. The first position coordinate (X1, Y1) of a feature of the hand is identified in the first picture of the group, and the second position coordinate (X2, Y2) of the same feature is identified in the second picture. If X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, the trend of the first picture group is determined to be rightward. The trend of each picture group in the sequence is recognized in this way; if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward.
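For illustration only, the following is a minimal sketch of the per-group comparison described above. The helper find_hand_feature and the thresholds MOVE_MIN and DRIFT_MAX are placeholders introduced for the example and are not values specified by the patent.

    # Minimal sketch of the per-group (two consecutive frames) trend test.
    # find_hand_feature, MOVE_MIN and DRIFT_MAX are assumed placeholders.

    MOVE_MIN = 20    # assumed set value: minimum displacement along the motion axis (pixels)
    DRIFT_MAX = 10   # assumed set value: maximum allowed drift on the other axis (pixels)

    def find_hand_feature(frame):
        """Placeholder: return the (x, y) pixel position of the tracked hand
        feature (e.g. a mole, knuckle or palm line) in a frame, or None."""
        raise NotImplementedError

    def group_trend(first_frame, second_frame):
        """Classify one picture group; returns 'right', 'left', 'up', 'down' or None."""
        p1 = find_hand_feature(first_frame)
        p2 = find_hand_feature(second_frame)
        if p1 is None or p2 is None:
            return None
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        if dx > MOVE_MIN and abs(dy) < DRIFT_MAX:
            return 'right'
        if -dx > MOVE_MIN and abs(dy) < DRIFT_MAX:
            return 'left'
        if -dy > MOVE_MIN and abs(dx) < DRIFT_MAX:
            return 'up'      # image y axis points downward
        if dy > MOVE_MIN and abs(dx) < DRIFT_MAX:
            return 'down'
        return None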
In a second aspect, a VR head-mounted display is provided, the VR head-mounted display comprising:
a recognition unit for recognizing a gesture of the user;
a transceiver unit for transmitting the action result of the recognized gesture to a computer device;
a control unit for controlling a character in a game to move according to the action result.
Optionally, the recognition unit is specifically configured to periodically capture pictures to obtain a multi-frame picture sequence, perform image processing on the frames to recognize the direction of motion of the hand, and determine the action result of the gesture from that direction of motion. For example, if the direction of motion is to the right, the action result is determined to be a move to the right; likewise, motions to the left, upward, downward and so on can be recognized as the corresponding actions.
Optionally, the recognition unit is specifically configured to take two consecutive frames of the multi-frame sequence as one picture group, identify the first position coordinate (X1, Y1) of a feature of the hand in the first picture of the group and the second position coordinate (X2, Y2) of the same feature in the second picture, and, if X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, determine the trend of the first picture group to be rightward; the trend of each picture group in the sequence is recognized in this way, and if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward.
Those of ordinary skill in the art will appreciate that, although the following detailed description refers to illustrative embodiments and accompanying drawings, the present invention is not limited to these embodiments. Rather, the scope of the present invention is broad and is intended to be defined only by the appended claims.
Brief description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Apparently, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a VR head-mounted display action recognition method provided by the present invention.
Fig. 2 is a schematic structural diagram of a VR head-mounted display provided by the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The terms "first", "second" and the like in the specification, claims and accompanying drawings are used to distinguish different objects rather than to describe a particular order. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, product or device containing a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or optionally also includes other steps or units inherent to the process, method, product or device.
Reference herein to "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment can be included in at least one embodiment of the present invention. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they separate or alternative embodiments mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.
Referring to Fig. 1, an embodiment of the present invention provides a VR head-mounted display action recognition method. As shown in Fig. 1, the method is performed by the VR head-mounted display and comprises the following steps:
Step S101: the VR head-mounted display recognizes a gesture of the user.
Step S102: the VR head-mounted display transmits the action result of the recognized gesture to a computer device.
Step S103: the computer device controls a character in a game to move according to the action result.
The technical solution provided by the present invention recognizes the action through the VR head-mounted display and thereby moves the game character, which increases user interaction and improves the user experience.
Optionally, step S101 above may be implemented as follows:
The VR head-mounted display periodically captures pictures to obtain a multi-frame picture sequence, performs image processing on the frames to recognize the direction of motion of the hand, and determines the action result of the gesture from that direction of motion. For example, if the direction of motion is to the right, the action result is determined to be a move to the right; likewise, motions to the left, upward, downward and so on can be recognized as the corresponding actions.
Optionally, performing image processing on the multi-frame sequence to recognize the direction of motion of the hand may specifically be:
Two consecutive frames of the multi-frame sequence are taken as one picture group. The first position coordinate (X1, Y1) of a feature of the hand is identified in the first picture of the group, and the second position coordinate (X2, Y2) of the same feature is identified in the second picture. If X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, the trend of the first picture group is determined to be rightward. The trend of each picture group in the sequence is recognized in this way; if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward. Actions to the left, upward, downward and so on can be identified in the same way.
The feature of the hand may be a distinctive feature, for example a birthmark or a mole on the hand; it may of course also be a feature that all hands have, such as a knuckle or a palm line. The present invention does not limit the specific kind of hand feature.
The reasoning behind this technical solution is as follows. Each hand has a feature, and the position of that feature relative to the hand cannot change as the hand moves, so recognizing the position of the feature is enough to recognize the movement of the hand, and this can be achieved through image recognition. For the image processing, the multi-frame sequence is divided into groups; because the time interval between frames is very short, the influence of the VR head-mounted display's own movement on the pictures can be ignored, so it is only necessary to recognize the movement trend of each group and then check whether several consecutive groups share the same trend in order to determine whether the hand has moved and in which direction.
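As an illustration of how the grouping and the consecutive-trend check described above could fit together, here is a sketch that reuses the group_trend helper sketched earlier; the quantity threshold N_THRESHOLD is an assumed value, not one given in the patent.

    N_THRESHOLD = 5   # assumed quantity threshold for consecutive same-trend groups

    def recognize_gesture(frames):
        """Return the gesture direction ('right', 'left', 'up', 'down') for a
        periodically captured frame sequence, or None if no run of identical
        group trends is long enough."""
        run_dir, run_len = None, 0
        for first, second in zip(frames, frames[1:]):   # consecutive picture groups
            trend = group_trend(first, second)
            if trend is not None and trend == run_dir:
                run_len += 1
            else:
                run_dir, run_len = trend, (1 if trend is not None else 0)
            if run_dir is not None and run_len > N_THRESHOLD:
                return run_dir
        return None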
Referring to Fig. 2, Fig. 2 provides a VR head-mounted display, the VR head-mounted display comprising:
a recognition unit 201 for recognizing a gesture of the user;
a transceiver unit 202 for transmitting the action result of the recognized gesture to a computer device;
a control unit 203 for controlling a character in a game to move according to the action result.
Optionally, the recognition unit 201 is specifically configured to periodically capture pictures to obtain a multi-frame picture sequence, perform image processing on the frames to recognize the direction of motion of the hand, and determine the action result of the gesture from that direction of motion. For example, if the direction of motion is to the right, the action result is determined to be a move to the right; likewise, motions to the left, upward, downward and so on can be recognized as the corresponding actions.
Optionally, the recognition unit 201 is specifically configured to take two consecutive frames of the multi-frame sequence as one picture group, identify the first position coordinate (X1, Y1) of a feature of the hand in the first picture of the group and the second position coordinate (X2, Y2) of the same feature in the second picture, and, if X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, determine the trend of the first picture group to be rightward; the trend of each picture group in the sequence is recognized in this way, and if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward.
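Purely as a sketch of the unit structure of Fig. 2, the three units could be modelled as below; the class and method names, the frame count and the way the action result is sent to the computer device are assumptions made for illustration and are not specified by the patent (recognize_gesture is the aggregation routine sketched above).

    FRAME_COUNT = 30   # assumed number of periodically captured frames per recognition pass

    class VRHeadMountedDisplay:
        """Rough model of the recognition / transceiver / control units of Fig. 2."""

        def __init__(self, camera, link_to_computer):
            self.camera = camera          # periodic picture source (assumed interface)
            self.link = link_to_computer  # link to the computer device (assumed interface)

        # Recognition unit 201: recognize the user's gesture from captured frames.
        def recognize(self):
            frames = [self.camera.capture() for _ in range(FRAME_COUNT)]
            return recognize_gesture(frames)   # e.g. 'right', 'left', ...

        # Transceiver unit 202: pass the action result of the gesture to the computer device.
        def transmit(self, action_result):
            self.link.send({"action": action_result})

        # Control unit 203: move the game character according to the action result.
        @staticmethod
        def control(game_character, action_result):
            game_character.move(action_result)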
What is disclosed above is merely a preferred embodiment of the present invention, which of course cannot be used to limit the scope of rights of the present invention. Those of ordinary skill in the art will understand that implementations realizing all or part of the flow of the above embodiments, and equivalent variations made according to the claims of the present invention, still fall within the scope covered by the invention.

Claims (6)

1. A VR head-mounted display action recognition method, characterized in that the method comprises the following steps:
Step S101: the VR head-mounted display recognizes a gesture of the user;
Step S102: the VR head-mounted display transmits the action result of the recognized gesture to a computer device;
Step S103: the computer device controls a character in a game to move according to the action result.
2. The method according to claim 1, characterized in that step S101 is implemented as follows:
the VR head-mounted display periodically captures pictures to obtain a multi-frame picture sequence, performs image processing on the frames to recognize the direction of motion of the hand, and determines the action result of the gesture from that direction of motion; for example, if the direction of motion is to the right, the action result is determined to be a move to the right, and likewise motions to the left, upward, downward and so on can be recognized as the corresponding actions.
3. The method according to claim 2, characterized in that performing image processing on the multi-frame sequence to recognize the direction of motion of the hand is specifically:
taking two consecutive frames of the multi-frame sequence as one picture group, identifying the first position coordinate (X1, Y1) of a feature of the hand in the first picture of the group and the second position coordinate (X2, Y2) of the same feature in the second picture, and, if X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, determining the trend of the first picture group to be rightward; the trend of each picture group in the sequence is recognized in this way, and if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward.
4. A VR head-mounted display, characterized in that the VR head-mounted display comprises:
a recognition unit for recognizing a gesture of the user;
a transceiver unit for transmitting the action result of the recognized gesture to a computer device;
a control unit for controlling a character in a game to move according to the action result.
5. The VR head-mounted display according to claim 4, characterized in that the recognition unit is specifically configured to periodically capture pictures to obtain a multi-frame picture sequence, perform image processing on the frames to recognize the direction of motion of the hand, and determine the action result of the gesture from that direction of motion; for example, if the direction of motion is to the right, the action result is determined to be a move to the right, and likewise motions to the left, upward, downward and so on can be recognized as the corresponding actions.
6. The VR head-mounted display according to claim 5, characterized in that the recognition unit is specifically configured to take two consecutive frames of the multi-frame sequence as one picture group, identify the first position coordinate (X1, Y1) of a feature of the hand in the first picture of the group and the second position coordinate (X2, Y2) of the same feature in the second picture, and, if X2 - X1 is greater than a set value and the absolute value of Y2 - Y1 is less than a set value, determine the trend of the first picture group to be rightward; the trend of each picture group in the sequence is recognized in this way, and if N consecutive picture groups all trend rightward and N exceeds a quantity threshold, the gesture result corresponding to the multi-frame sequence is determined to be rightward.
CN201710305138.6A 2017-05-03 2017-05-03 VR head-mounted display action recognition method and VR head-mounted display Pending CN107281750A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710305138.6A CN107281750A (en) 2017-05-03 2017-05-03 VR head-mounted display action recognition method and VR head-mounted display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710305138.6A CN107281750A (en) 2017-05-03 2017-05-03 VR head-mounted display action recognition method and VR head-mounted display

Publications (1)

Publication Number Publication Date
CN107281750A true CN107281750A (en) 2017-10-24

Family

ID=60094359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710305138.6A Pending CN107281750A (en) 2017-05-03 2017-05-03 VR aobvious action identification methods and VR show

Country Status (1)

Country Link
CN (1) CN107281750A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102226880A (en) * 2011-06-03 2011-10-26 北京新岸线网络技术有限公司 Somatosensory operation method and system based on virtual reality
KR20170014050A (en) * 2015-07-28 2017-02-08 재단법인대구경북과학기술원 Apparatus of capsule for personalized healing based on virtual reality and Method thereof
CN105045398A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Virtual reality interaction device based on gesture recognition
CN105334959A (en) * 2015-10-22 2016-02-17 北京小鸟看看科技有限公司 System and method for controlling gesture motion in virtual reality environment
CN105446481A (en) * 2015-11-11 2016-03-30 周谆 Gesture based virtual reality human-machine interaction method and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110082909A (en) * 2018-01-25 2019-08-02 宏碁股份有限公司 Head-mounted display and its operating method
CN110082909B (en) * 2018-01-25 2021-08-17 宏碁股份有限公司 Head mounted display and method of operating the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171024