CN102707802A - Method for controlling speed of mapping of gesture movement to interface - Google Patents


Info

Publication number
CN102707802A
CN102707802A
Authority
CN
China
Prior art keywords
gesture
interface
speed
mapped
mid-air
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012101425870A
Other languages
Chinese (zh)
Inventor
徐向民
苗捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN2012101425870A priority Critical patent/CN102707802A/en
Publication of CN102707802A publication Critical patent/CN102707802A/en
Pending legal-status Critical Current


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method for controlling the speed at which gesture movement is mapped to an interface. The method includes the following steps: the speed at which the gesture moves in the air is varied so as to change the ratio between the distance the gesture moves in the interface and the distance it moves in the air. The movement speed of the gesture in the air is divided into a first speed grade, a second speed grade and a third speed grade. When the movement speed of the gesture in the air is in the first speed grade, the ratio between the distance the gesture moves in the interface and the distance it moves in the air is set to P1; when it is in the third speed grade, the ratio is set to P2; and when it is in the second speed grade, the ratio is set to P, where P increases as the movement speed of the gesture in the air increases, and P is larger than P1 and smaller than P2.

Description

Method for controlling the speed at which gesture motion is mapped to an interface
Technical field
The present invention relates to the field of human-computer interaction control, and in particular to a method for controlling the speed at which gesture motion is mapped to an interface.
Background technology
In contactless human-machine interaction systems, gesture-based control is an important link. Gesture-based interaction is simple and convenient, and its most common mode of operation is gesture roaming. Gesture roaming maps the motion of a gesture in physical space onto the interface of a controlled object, thereby operating that object. The existing mapping mode is direct mapping of gesture coordinates: the position of the gesture in each frame of the image sequence captured by the sensor is mapped proportionally to the position of the controlled object in the interface. For example, in a typical graphical-user-interface system, suppose each frame captured by the sensor is 640 x 480 pixels (length x width), the gesture is located at pixel (200, 100), and the interface is 1280 x 720 pixels. Under direct coordinate mapping, the cursor corresponding to the gesture in the interface is then at pixel (1280/640 x 200, 720/480 x 100) = (400, 150).
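The prior-art direct mapping described above is a simple proportional coordinate scaling. A minimal sketch follows (the function and variable names are ours, not the patent's):

```python
def direct_map(gesture_xy, image_size, interface_size):
    """Prior-art direct mapping: scale a gesture position in the captured
    image proportionally into interface coordinates."""
    gx, gy = gesture_xy
    img_w, img_h = image_size        # captured frame, e.g. 640 x 480
    ui_w, ui_h = interface_size      # interface, e.g. 1280 x 720
    return (ui_w / img_w * gx, ui_h / img_h * gy)

# The example from the text: gesture at pixel (200, 100) in a 640x480 frame,
# 1280x720 interface -> cursor at (400, 150).
print(direct_map((200, 100), (640, 480), (1280, 720)))  # (400.0, 150.0)
```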
This direct mapping of gesture position coordinates uses only the positional information of the gesture. During movement, when the user needs to select an item far from the current gesture position, the gesture must travel a large distance to complete the operation. This increases user fatigue and degrades the user experience.
Summary of the invention
In view of the above problem, it is necessary to propose a method for controlling the speed at which gesture motion is mapped to an interface that is more convenient to operate.
The technical solution of the present invention is a method for controlling the speed at which gesture motion is mapped to an interface, comprising the following step:
varying the movement speed of the gesture in the air so as to change the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air.
In a gesture-based contactless human-machine interaction system, when a gesture is used for roaming, the controlled object follows the gesture's motion: it moves in the same direction as the gesture, and the distance it moves stands in a dynamic proportion to the distance the gesture moves, this proportion varying with the movement speed of the gesture in the air.
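In code, this follow behavior amounts to applying each frame-to-frame gesture displacement to the controlled object after scaling it by a speed-dependent gain (a sketch under assumed names; the gain itself is the dynamic ratio the patent defines by speed grade):

```python
def follow(cursor_xy, gesture_delta, gain):
    """Move the controlled object by the gesture's frame-to-frame
    displacement scaled by a speed-dependent gain; the direction is
    preserved, only the magnitude of the motion changes."""
    dx, dy = gesture_delta
    return (cursor_xy[0] + gain * dx, cursor_xy[1] + gain * dy)

# With gain 4.0 (fast movement), a 1-pixel gesture step moves the cursor 4 pixels.
print(follow((100.0, 100.0), (1, 0), 4.0))  # (104.0, 100.0)
```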
In one embodiment, the method further comprises the following steps:
dividing the movement speed of the gesture in the air into three grades, namely a first speed grade, a second speed grade and a third speed grade;
when the movement speed of the gesture in the air is in the first speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P1, where 0 < P1 < 1;
when the movement speed of the gesture in the air is in the third speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P2, where P2 > 1;
when the movement speed of the gesture in the air is in the second speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P, and P increases as the movement speed of the gesture in the air increases, where P1 < P < P2.
In one embodiment, the method further comprises the following steps:
setting the gesture to be in the first speed grade when its movement speed in the air is less than or equal to V1;
setting the gesture to be in the second speed grade when its movement speed in the air is greater than V1 and less than V2, where V1 < V2;
setting the gesture to be in the third speed grade when its movement speed in the air is greater than V2.
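The three-grade classification can be sketched as follows (V1 and V2 are the thresholds above; the function name, and the assignment of a speed exactly equal to V2, which the text leaves open, are our choices):

```python
def speed_grade(v, v1, v2):
    """Classify the gesture's mid-air movement speed into three grades:
    v <= v1      -> first grade (slow)
    v1 < v < v2  -> second grade (intermediate)
    otherwise    -> third grade (fast; the text only specifies v > v2,
                    so assigning v == v2 to the third grade is our choice)."""
    if v <= v1:
        return 1
    elif v < v2:
        return 2
    else:
        return 3

print(speed_grade(0.5, 1.0, 3.0))  # 1
print(speed_grade(2.0, 1.0, 3.0))  # 2
print(speed_grade(4.0, 1.0, 3.0))  # 3
```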
In one embodiment, the method further comprises the following step: providing a front-end sensor for capturing gesture motion, and analyzing the image sequence captured by the front-end sensor to obtain the position of the gesture in each frame of the image sequence and the velocity of the current gesture motion.
In one embodiment, the method further comprises the following step: using a camera as the front-end sensor for capturing gesture motion.
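The patent does not spell out how the velocity is computed from the image sequence; one straightforward possibility is to difference the gesture's position across successive frames:

```python
import math

def gesture_speed(prev_xy, curr_xy, dt):
    """Estimate gesture speed (pixels per unit time) from the gesture's
    position in two successive frames captured dt apart - one plausible
    way to obtain the velocity information mentioned above."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return math.hypot(dx, dy) / dt

# A 6-pixel-right, 8-pixel-down move between frames one time unit apart.
print(gesture_speed((200, 100), (206, 108), 1.0))  # 10.0
```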
The above technical solution solves the problem in direct coordinate mapping that the user must move the gesture a large distance to select an item far from the current gesture position. Gesture motion carries not only spatial position information but also temporal information; combining space and time yields the velocity of the gesture motion. When the user's gesture moves quickly, the distance mapped into the interface is long; when it moves slowly, the mapped distance is short. For example, suppose each frame in the sensor's image sequence has the same length and width as the interface. When the gesture moves quickly and travels 1 pixel in a given direction in the captured image sequence, the gesture mapped into the interface can move 4 pixels; under direct mapping, the captured gesture would have to travel 4 pixels in that direction for the mapped gesture to move 4 pixels. This solves the problem of having to move the gesture a large distance to select a distant item. Conversely, when the gesture moves slowly and travels 4 pixels in the captured image sequence, the mapped gesture moves only 1 pixel; under direct mapping, a captured movement of 1 pixel would already move the mapped gesture 1 pixel. This also solves the problem of insufficient precision when selecting items whose coordinates are close together.
The beneficial effects of the invention are:
(1) it solves the problem in direct coordinate mapping that the user must move the gesture correspondingly far to select an item far from the current gesture position, allowing the user to move the gesture a long distance through a short, fast movement;
(2) it solves the problem of insufficient precision in direct coordinate mapping when selecting items with close coordinates, allowing the user to perform more accurate gesture motion by moving the gesture slowly;
(3) dividing the movement speed of the gesture in the air into three grades achieves graded mapping for different speeds.
Brief Description of the Drawings
Fig. 1 is a graph of the relationship between gesture movement speed and the ratio between the distance the gesture moves in the interface and the distance it moves in the air, according to an embodiment of the invention.
Detailed Description
Embodiments of the invention are described in detail below with reference to the accompanying drawing.
Embodiment:
As shown in Fig. 1, the abscissa represents the movement speed of the gesture in the air, and the ordinate represents the ratio between the distance the gesture moves in the interface and the distance it moves in the air.
A method for controlling the speed at which gesture motion is mapped to an interface comprises the following steps:
Step 1: a camera is used as the front-end sensor for capturing gesture motion; it may be an ordinary camera or a depth camera. The image sequence captured by the front-end sensor is analyzed to obtain the position of the gesture in each frame of the image sequence and the velocity of the current gesture motion.
Step 2: the movement speed of the gesture in the air is divided into three grades, namely a first speed grade, a second speed grade and a third speed grade.
Step 3: when the movement speed of the gesture in the air is less than or equal to V1, it is in the first speed grade; when it is greater than V1 and less than V2, it is in the second speed grade, where V1 < V2; when it is greater than V2, it is in the third speed grade.
Step 4: when the movement speed of the gesture in the air is in the first speed grade, the ratio between the distance the gesture moves in the interface and the distance it moves in the air is P1, where 0 < P1 < 1. For example, if the gesture moves 1 pixel in the air, the distance mapped into the interface is P1 pixels.
Step 5: when the movement speed of the gesture in the air is in the third speed grade, the ratio between the distance the gesture moves in the interface and the distance it moves in the air is P2, where P2 > 1. For example, if the gesture moves 1 pixel in the air, the distance mapped into the interface is P2 pixels.
Step 6: when the movement speed of the gesture in the air is in the second speed grade, the ratio between the distance the gesture moves in the interface and the distance it moves in the air is P, and P increases as the movement speed of the gesture in the air increases, where P1 < P < P2. In this grade, the ratio grows linearly with the movement speed of the gesture in the air. For example, referring to Fig. 1, suppose the gesture speed at this moment is Vt: if the gesture moves 1 pixel in the air, the distance mapped into the interface is P pixels, where P1 < P < P2.
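Steps 2 to 6 can be combined into one piecewise gain function (a minimal sketch; the default threshold and ratio values are illustrative assumptions, not from the patent, and the linear ramp follows the linear growth stated in step 6):

```python
def mapping_ratio(v, v1, v2, p1, p2):
    """Ratio P between the mapped interface displacement and the mid-air
    gesture displacement: P1 at or below V1, P2 above V2, and a linear
    ramp from P1 to P2 across the second speed grade."""
    if v <= v1:
        return p1          # first grade: 0 < P1 < 1, motion is damped
    if v > v2:
        return p2          # third grade: P2 > 1, motion is amplified
    # second grade: P grows linearly with speed, so P1 < P < P2
    return p1 + (p2 - p1) * (v - v1) / (v2 - v1)

def map_displacement(d_air, v, v1=1.0, v2=3.0, p1=0.5, p2=2.0):
    """Interface displacement for a mid-air displacement d_air at speed v
    (the default thresholds and ratios are illustrative assumptions)."""
    return mapping_ratio(v, v1, v2, p1, p2) * d_air

print(map_displacement(1.0, 0.5))  # 0.5  (slow: precise, damped motion)
print(map_displacement(1.0, 4.0))  # 2.0  (fast: long mapped distance)
print(map_displacement(1.0, 2.0))  # 1.25 (intermediate: linear ramp)
```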
The above embodiments express only some implementations of the present invention, and although their description is relatively specific and detailed, they should not be construed as limiting the scope of the claims. It should be noted that a person of ordinary skill in the art may make various modifications and improvements without departing from the concept of the invention, all of which fall within the scope of protection of the invention.

Claims (5)

1. A method for controlling the speed at which gesture motion is mapped to an interface, characterized by comprising the following step:
varying the movement speed of the gesture in the air so as to change the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air.
2. The method for controlling the speed at which gesture motion is mapped to an interface according to claim 1, characterized by further comprising the following steps:
dividing the movement speed of the gesture in the air into three grades, namely a first speed grade, a second speed grade and a third speed grade;
when the movement speed of the gesture in the air is in the first speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P1, where 0 < P1 < 1;
when the movement speed of the gesture in the air is in the third speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P2, where P2 > 1;
when the movement speed of the gesture in the air is in the second speed grade, the ratio between the distance the gesture moves in the interface and the distance the gesture moves in the air is P, P increasing as the movement speed of the gesture in the air increases, where P1 < P < P2.
3. The method for controlling the speed at which gesture motion is mapped to an interface according to claim 2, characterized by further comprising the following steps:
setting the gesture to be in the first speed grade when its movement speed in the air is less than or equal to V1;
setting the gesture to be in the second speed grade when its movement speed in the air is greater than V1 and less than V2, where V1 < V2;
setting the gesture to be in the third speed grade when its movement speed in the air is greater than V2.
4. The method for controlling the speed at which gesture motion is mapped to an interface according to any one of claims 1 to 3, characterized by further comprising the following step:
providing a front-end sensor for capturing gesture motion, and analyzing the image sequence captured by the front-end sensor to obtain the position of the gesture in each frame of the image sequence and the velocity of the current gesture motion.
5. The method for controlling the speed at which gesture motion is mapped to an interface according to claim 4, characterized by further comprising the following step:
using a camera as the front-end sensor for capturing gesture motion.
CN2012101425870A 2012-05-09 2012-05-09 Method for controlling speed of mapping of gesture movement to interface Pending CN102707802A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012101425870A CN102707802A (en) 2012-05-09 2012-05-09 Method for controlling speed of mapping of gesture movement to interface


Publications (1)

Publication Number Publication Date
CN102707802A true CN102707802A (en) 2012-10-03

Family

ID=46900678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012101425870A Pending CN102707802A (en) 2012-05-09 2012-05-09 Method for controlling speed of mapping of gesture movement to interface

Country Status (1)

Country Link
CN (1) CN102707802A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793056A (en) * 2014-01-26 2014-05-14 华南理工大学 Mid-air gesture roaming control method based on distance vector
WO2014071657A1 (en) * 2012-11-09 2014-05-15 江苏惠通集团有限责任公司 Output control method, device, display control method and system of gesture sensing device
CN103914126A (en) * 2012-12-31 2014-07-09 腾讯科技(深圳)有限公司 Multimedia player control method and device
CN104142730A (en) * 2014-07-04 2014-11-12 华南理工大学 Method for mapping gesture tracking results to mouse events
CN104883596A (en) * 2014-02-28 2015-09-02 联想(北京)有限公司 Instruction generation method and apparatus and electronic device
CN107145838A (en) * 2017-04-17 2017-09-08 济南大学 A kind of semantic automatic classification method of gesture

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101558367A (en) * 2006-12-05 2009-10-14 索尼爱立信移动通讯有限公司 Method and system for detecting movement of an object
CN101819498A (en) * 2009-02-27 2010-09-01 瞬联讯通科技(北京)有限公司 Screen display-controlling method facing to slide body of touch screen



Similar Documents

Publication Publication Date Title
US11307666B2 (en) Systems and methods of direct pointing detection for interaction with a digital device
CN103294401B (en) A kind of icon disposal route and device with the electronic equipment of touch-screen
CN102707802A (en) Method for controlling speed of mapping of gesture movement to interface
CN102446032B (en) Information input method and terminal based on camera
CN111273778A (en) Method and device for controlling electronic equipment based on gestures
CN106959808A (en) A kind of system and method based on gesture control 3D models
US20150234467A1 (en) Method and apparatus for gesture detection and display control
JPWO2014208168A1 (en) Information processing apparatus, control method, program, and storage medium
CN102968245B (en) Mouse touches cooperative control method, device and Intelligent television interaction method, system
US9285885B2 (en) Gesture recognition module and gesture recognition method
JP6470112B2 (en) Mobile device operation terminal, mobile device operation method, and mobile device operation program
CN103440033A (en) Method and device for achieving man-machine interaction based on bare hand and monocular camera
CN104331191A (en) System and method for realizing touch on basis of image recognition
CN102999308B (en) For determining the method controlling to control output in territory and controlling device
CN103197774A (en) Method and system for mapping application track of emission light source motion track
CN105404384A (en) Gesture operation method, method for positioning screen cursor by gesture, and gesture system
CN110007838B (en) Processing method, device and equipment for erasing control
CN102999158B (en) The gesture identification of interaction systems and interaction systems
CN104142736B (en) Video monitoring equipment control method and device
KR101233793B1 (en) Virtual mouse driving method using hand motion recognition
TWI486815B (en) Display device, system and method for controlling the display device
CN105138131B (en) A kind of general gesture command transmitting and operational approach
CN101976154A (en) Projection touch system
CN103389793B (en) Man-machine interaction method and system
CN104866201A (en) Intelligent device and method for triggering editing function of application

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20121003