CN104866112A - Non-contact interaction method based on mobile terminal - Google Patents

Non-contact interaction method based on mobile terminal

Info

Publication number
CN104866112A
Authority
CN
China
Prior art keywords
human hand
mobile terminal
image
interaction method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510319515.2A
Other languages
Chinese (zh)
Inventor
李刚
韩明鸣
洪勇勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANHUI LONGCOM INTERNET OF THINGS Co Ltd
Original Assignee
ANHUI LONGCOM INTERNET OF THINGS Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ANHUI LONGCOM INTERNET OF THINGS Co Ltd
Priority to CN201510319515.2A priority Critical patent/CN104866112A/en
Publication of CN104866112A publication Critical patent/CN104866112A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a non-contact interaction method based on a mobile terminal, relating to the technical field of human-computer interaction. The interaction method comprises the following steps: (1) using a first image capture device on the mobile terminal to capture an image containing a human hand; (2) analyzing the captured image and segmenting the region of interest in which the hand is located; (3) further classifying the hand region to divide it into the key positions of the hand; (4) using the physiological constraints of the hand to supplement and correct the divided positions; and (5) establishing a motion trajectory for each part of the hand, comparing the trajectories with the gestures defined by the application to determine the interaction mode, and sending the corresponding control command. The interaction function is realized by capturing images of the human hand, and gesture motion is tracked. The method is not restricted to the screen of the mobile terminal, is highly flexible, and supports a variety of gestures.

Description

A non-contact interaction method based on a mobile terminal
Technical field:
The present invention relates to the technical field of human-computer interaction, and specifically to a non-contact interaction method based on a mobile terminal.
Background technology:
Existing mobile terminals interact with users mainly through keyboards and touch screens. Because the area and volume of a mobile terminal are limited, both keyboard and touch-screen interaction are inconvenient to some extent and prone to erroneous operation.
Novel human-computer interaction modes include sensor-based motion sensing, interaction based on motion-image recognition, and interaction based on muscle-activity computing. These modes can make up for the deficiencies described above, but each introduces new problems and shortcomings when applied to a mobile terminal.
Sensor-based motion-sensing interaction uses devices such as accelerometers, gravity sensors and gyroscopes to perceive motion and interact accordingly. Gravity sensors are already built into terminals such as mobile phones, but they can only perceive the motion of the phone itself rather than the motion of the user, so their range of application is very limited. To perceive the user's motion, an additional hand-held remote control containing sensors must be provided, which is unnatural for mobile-terminal interaction and undermines the portability of the mobile terminal to some extent.
Interaction based on motion-image recognition uses a special camera that can capture depth information to film the user's actions; the images are processed to reconstruct the user's motion information for interaction. The drawbacks of this mode are that depth cameras are expensive and cannot capture objects at very close range, which is inconvenient for mobile-terminal interaction.
Interaction based on muscle-activity computing judges human actions by collecting electromyographic signals from the muscles. This mode is not yet mature, and the equipment is not convenient to carry.
Therefore, a novel interaction method designed for the mobile terminal is needed.
Patent application CN201110043418.7 discloses a human-computer interaction method for a mobile terminal that uses the sensors and motor inside the terminal: signals are sent through the active rotation of the terminal and received through its passive rotation. The primary limitation of that invention is that the categories of information it can express are very limited.
Patent application CN201210056083.7 discloses a non-contact control device and method for a mobile terminal. The method captures video images in front of the terminal and extracts the contour of the human hand, but it can only issue commands according to the direction of hand motion, so its precision is not high.
Patent application CN201210520849.2 discloses a gesture-controlled interaction method for a mobile electronic device; its innovation is dynamically adjusting the sensing activation frequency to save power on the mobile terminal. That invention uses an acceleration sensor to sense the motion of the terminal.
Summary of the invention:
The object of this invention is to provide a non-contact interaction method based on a mobile terminal. It realizes the interaction function by capturing images of the human hand and tracks gesture motion; it is not limited to the screen of the mobile terminal, is highly flexible, and supports a variety of gestures.
In order to solve the problems existing in the background art, the present invention adopts the following technical scheme. The interaction method comprises the following steps:
(1) using a first image capture device provided on the mobile terminal, capture an image containing a human hand;
(2) analyze the captured image and segment the region of interest in which the hand is located;
(3) further classify the hand region and divide it into the key positions of the hand;
(4) using the physiological constraints of the hand, supplement and correct the divided positions;
(5) establish a motion trajectory for each part of the hand, compare the trajectories with the gestures defined by the application to determine the interaction mode, and send the corresponding control command.
The non-contact interaction method based on a mobile terminal may further comprise: a second image capture device with the same parameters as the first image capture device, placed side by side with it, synchronously capturing images containing the human hand; the captured images are matched and their depth is recovered to obtain a reference depth image.
Segmenting the region of interest in which the hand is located comprises:
(I) converting the captured image into HSV color space;
(II) using the characteristics of human skin color, preliminarily segmenting the hand region;
(III) denoising the segmentation result to remove the influence of small noise regions.
Dividing the hand into its key positions comprises the following steps:
(I) collecting manually labeled color images of the human hand in advance and performing position correction on them;
(II) for each pixel of each hand position, extracting the gradient values in 8 directions and the difference values at different step lengths as candidate features;
(III) using a random decision tree classifier model, a cross-validation method and the ID4.5 training algorithm, training a classifier for the key positions of the hand offline;
(IV) applying this classifier on the mobile terminal to classify the key positions of the hand.
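As an illustration only, the following Python sketch shows one way such per-pixel difference features and an offline-trained tree-ensemble classifier could be arranged. It is not the patented implementation: scikit-learn's RandomForestClassifier stands in for the random decision tree model (the patent additionally names cross-validation and an "ID4.5" training algorithm), and all function names, step lengths and parameter values are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# 8 neighbourhood directions used for the gradient/difference features.
DIRECTIONS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def pixel_features(gray, y, x, steps=(2, 4, 8)):
    """Difference values in 8 directions at several step lengths, as candidate features."""
    h, w = gray.shape
    centre = float(gray[y, x])
    feats = []
    for dy, dx in DIRECTIONS:
        for s in steps:
            ny = min(max(y + dy * s, 0), h - 1)
            nx = min(max(x + dx * s, 0), w - 1)
            feats.append(float(gray[ny, nx]) - centre)
    return feats

def train_part_classifier(gray_images, label_images):
    """Offline training on manually labeled hand images (one integer label per hand pixel)."""
    X, y = [], []
    for gray, labels in zip(gray_images, label_images):
        ys, xs = np.nonzero(labels)          # only labeled hand pixels
        for py, px in zip(ys, xs):
            X.append(pixel_features(gray, py, px))
            y.append(labels[py, px])
    clf = RandomForestClassifier(n_estimators=32, max_depth=12)
    return clf.fit(np.asarray(X), np.asarray(y))
```

The trained classifier would then be applied on the mobile terminal to label each pixel of the segmented hand region with a key-position class.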
Supplementing and correcting the divided positions comprises the following steps:
1. in the result of the classification step, mark the positions whose pixels are too few or missing;
2. search the images from a preceding period of time for the corresponding positions;
3. if enough position information is found, use trajectory fitting to obtain a tentative prediction of the missing regions in the current frame;
4. correct the positions predicted in the previous step according to the rigidity constraint of the finger phalanges, the coplanarity constraint of the four fingers other than the thumb, and the bend-angle constraint of the finger joints, applied in order of priority from high to low; if there is not enough prediction information, infer the positions of the missing regions from these constraints (a sketch of the constraint step follows this list).
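Below is a minimal sketch of how the two highest-priority constraints could be applied, given purely for illustration. It assumes predicted joint positions are available as 3-D NumPy arrays; the rest lengths, the choice of points to make coplanar, and the function names are assumptions, and the bend-angle constraint would follow the same pattern.

```python
import numpy as np

def enforce_phalanx_length(parent, child, rest_length):
    """Rigidity constraint: keep a phalanx at its fixed rest length (highest priority)."""
    v = child - parent
    d = np.linalg.norm(v)
    return child.copy() if d < 1e-6 else parent + v * (rest_length / d)

def project_to_common_plane(points):
    """Coplanarity constraint: project the bases of the four non-thumb fingers
    onto their best-fit plane (second priority)."""
    centroid = points.mean(axis=0)
    # The plane normal is the singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    offsets = (points - centroid) @ normal
    return points - np.outer(offsets, normal)
```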
When color and depth images are both available, dividing the hand into its key positions comprises the following steps:
(1) collecting manually labeled color images and depth images of the human hand in advance, combining them, and performing position correction;
(2) for each pixel of each hand position, extracting from each image in the group the gradient values in 8 directions and the difference values at different step lengths as candidate features;
(3) using a random decision tree classifier model, a cross-validation method and the ID4.5 training algorithm, training a classifier for the key positions of the hand offline;
(4) applying this classifier on the mobile terminal to classify the key positions of the hand.
The present invention has the following beneficial effects: it realizes the interaction function by capturing images of the human hand and tracks gesture motion; it is not limited to the screen of the mobile terminal, is highly flexible, and supports a variety of gestures.
Brief description of the drawings:
Fig. 1 is a flowchart of the interaction method in this embodiment;
Fig. 2 is a flowchart of another interaction method in this embodiment.
Detailed description of the embodiments:
Referring to Fig. 1 and Fig. 2, this embodiment adopts the following technical scheme. The interaction method comprises the following steps:
(1) using a first image capture device provided on the mobile terminal, capture an image containing a human hand;
(2) analyze the captured image and segment the region of interest in which the hand is located;
(3) further classify the hand region and divide it into the key positions of the hand;
(4) using the physiological constraints of the hand, supplement and correct the divided positions;
(5) establish a motion trajectory for each part of the hand, compare the trajectories with the gestures defined by the application to determine the interaction mode, and send the corresponding control command.
The non-contact interaction method based on a mobile terminal may further comprise: a second image capture device with the same parameters as the first image capture device, placed side by side with it, synchronously capturing images containing the human hand; the captured images are matched and their depth is recovered to obtain a reference depth image.
As shown in Fig. 1, the specific operation flow of this embodiment is as follows:
Image capture device 101 captures an image containing the human hand.
The image capture devices of mainstream mobile terminals generally support close-range focusing and can obtain reasonably clear images.
In step 102, target segmentation is performed on the captured image to obtain the region of interest containing the human hand. This step first converts the captured image into HSV space; because HSV stores brightness separately from hue, it is less affected by ambient illumination. Moreover, after conversion to HSV space the range of human skin color is more concentrated, so target segmentation can be performed based on skin color and connectivity.
In step 103, the segmentation result is denoised to remove regions that are too small.
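The following Python/OpenCV sketch illustrates one possible realization of steps 102 and 103: HSV conversion, skin-color thresholding, and removal of small noise regions. The threshold values and minimum region area are illustrative assumptions, not values given by the patent.

```python
import cv2
import numpy as np

def segment_hand_roi(bgr_image, min_area=500):
    """Rough hand segmentation by skin color in HSV space, followed by small-region removal."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Illustrative skin-color range; real thresholds would need calibration.
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening suppresses isolated noise pixels.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    # Keep only connected components above the minimum area (the denoising of step 103).
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    clean = np.zeros_like(mask)
    for i in range(1, num):
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            clean[labels == i] = 255
    return clean
```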
In step 104, the pixels of the hand region are classified to divide the hand into its key positions. The classifier uses random decision trees and is trained offline in step 151 on hand images 150 labeled in advance.
If missing regions appear in the classification result, step 111 reads the images stored over a preceding period of time and determines whether the corresponding positions are present. In step 112, trajectory fitting is used to tentatively predict the positions of the missing regions.
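As a hypothetical illustration of steps 111 and 112, the sketch below fits the recently stored 2-D positions of one hand part over time with a low-order polynomial and extrapolates to the current frame; the polynomial degree and the interface are assumptions.

```python
import numpy as np

def predict_missing_position(times, positions, t_now, degree=2):
    """Fit the recent trajectory of a hand part and extrapolate its position to the current frame."""
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)   # shape (n, 2): x, y per stored frame
    fx = np.polyfit(times, positions[:, 0], degree)  # x(t)
    fy = np.polyfit(times, positions[:, 1], degree)  # y(t)
    return np.array([np.polyval(fx, t_now), np.polyval(fy, t_now)])
```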
In step 105, the finger positions are corrected based on the physiological constraints of the human hand.
In step 106, the finger position information is compared with the gestures 161 predefined by the application program to recognize the gesture.
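The patent does not specify how motion trajectories are compared with the predefined gestures; the sketch below shows one simple possibility, nearest-template matching after arc-length resampling and normalization. The function names, the distance measure and the template format are assumptions.

```python
import numpy as np

def resample(traj, n=32):
    """Resample a 2-D trajectory to n points, evenly spaced by arc length."""
    traj = np.asarray(traj, dtype=float)
    seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    target = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(target, s, traj[:, i]) for i in range(2)])

def match_gesture(trajectory, templates):
    """Return the name of the predefined gesture template closest to the observed trajectory."""
    obs = resample(trajectory)
    obs = (obs - obs.mean(axis=0)) / (obs.std() + 1e-9)   # translation/scale normalization
    best_name, best_cost = None, float("inf")
    for name, template in templates.items():
        tmpl = resample(template)
        tmpl = (tmpl - tmpl.mean(axis=0)) / (tmpl.std() + 1e-9)
        cost = float(np.mean(np.linalg.norm(obs - tmpl, axis=1)))
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name
```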
In step 107, the corresponding control command is executed.
Fig. 2 illustrates another embodiment, in which the non-contact interaction method for the mobile terminal further comprises two image capture devices and a depth recovery module.
The first image capture device 201 has the same optical parameters as the second image capture device 202; driven by drive unit 200, they synchronously capture images containing the human hand.
In step 203, target segmentation is performed on the captured images to obtain the regions of interest containing the human hand, following the same principle as step 102 of Fig. 1.
In step 204, the segmentation result is denoised to remove regions that are too small.
In step 205, matching feature points are searched within the regions of interest of the two images, and depth information is recovered to obtain the image matching result and a reference depth image.
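As a rough illustration of step 205, the sketch below computes a disparity map between the two synchronized views with OpenCV's semi-global block matcher and converts it to depth. It assumes the two images are already rectified; the focal length, baseline and matcher parameters are placeholders, not values from the patent.

```python
import cv2
import numpy as np

def estimate_reference_depth(left_gray, right_gray, focal_px, baseline_m):
    """Disparity between the two synchronized cameras, converted to a reference depth image."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]   # Z = f * B / d
    return depth
```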
In step 206, the color image and the depth image are used together to classify the pixels of the hand region and divide the hand into its key positions. The classifier uses random decision trees and is trained offline in step 251 on hand images 250 labeled in advance.
The remaining steps in Fig. 2 are identical to the corresponding steps in Fig. 1.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (6)

1. A non-contact interaction method based on a mobile terminal, characterized in that the interaction method comprises the following steps:
(1) using a first image capture device provided on the mobile terminal, capture an image containing a human hand;
(2) analyze the captured image and segment the region of interest in which the hand is located;
(3) further classify the hand region and divide it into the key positions of the hand;
(4) using the physiological constraints of the hand, supplement and correct the divided positions;
(5) establish a motion trajectory for each part of the hand, compare the trajectories with the gestures defined by the application to determine the interaction mode, and send the corresponding control command.
2. The non-contact interaction method based on a mobile terminal according to claim 1, characterized in that the method may further comprise: a second image capture device with the same parameters as the first image capture device, placed side by side with it, synchronously capturing images containing the human hand; the captured images are matched and their depth is recovered to obtain a reference depth image.
3. The non-contact interaction method based on a mobile terminal according to claim 1, characterized in that segmenting the region of interest in which the hand is located comprises:
(I) converting the captured image into HSV color space;
(II) using the characteristics of human skin color, preliminarily segmenting the hand region;
(III) denoising the segmentation result to remove the influence of small noise regions.
4. The non-contact interaction method based on a mobile terminal according to claim 1, characterized in that dividing the hand into its key positions comprises the following steps:
(I) collecting manually labeled color images of the human hand in advance and performing position correction on them;
(II) for each pixel of each hand position, extracting the gradient values in 8 directions and the difference values at different step lengths as candidate features;
(III) using a random decision tree classifier model, a cross-validation method and the ID4.5 training algorithm, training a classifier for the key positions of the hand offline;
(IV) applying this classifier on the mobile terminal to classify the key positions of the hand.
5. The non-contact interaction method based on a mobile terminal according to claim 1, characterized in that supplementing and correcting the divided positions comprises the following steps:
1. in the result of the classification step, mark the positions whose pixels are too few or missing;
2. search the images from a preceding period of time for the corresponding positions;
3. if enough position information is found, use trajectory fitting to obtain a tentative prediction of the missing regions in the current frame;
4. correct the positions predicted in the previous step according to the rigidity constraint of the finger phalanges, the coplanarity constraint of the four fingers other than the thumb, and the bend-angle constraint of the finger joints, applied in order of priority from high to low; if there is not enough prediction information, infer the positions of the missing regions from these constraints.
6. The non-contact interaction method based on a mobile terminal according to claim 2, characterized in that dividing the hand into its key positions comprises the following steps:
(1) collecting manually labeled color images and depth images of the human hand in advance, combining them, and performing position correction;
(2) for each pixel of each hand position, extracting from each image in the group the gradient values in 8 directions and the difference values at different step lengths as candidate features;
(3) using a random decision tree classifier model, a cross-validation method and the ID4.5 training algorithm, training a classifier for the key positions of the hand offline;
(4) applying this classifier on the mobile terminal to classify the key positions of the hand.
CN201510319515.2A 2015-06-12 2015-06-12 Non-contact interaction method based on mobile terminal Pending CN104866112A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510319515.2A CN104866112A (en) 2015-06-12 2015-06-12 Non-contact interaction method based on mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510319515.2A CN104866112A (en) 2015-06-12 2015-06-12 Non-contact interaction method based on mobile terminal

Publications (1)

Publication Number Publication Date
CN104866112A true CN104866112A (en) 2015-08-26

Family

ID=53911993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510319515.2A Pending CN104866112A (en) 2015-06-12 2015-06-12 Non-contact interaction method based on mobile terminal

Country Status (1)

Country Link
CN (1) CN104866112A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140375584A1 (en) * 2011-12-24 2014-12-25 VALEO Schalter und Sensoren GmbH a corporation Touch-sensitive operating device for a motor vehicle and motor vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109961454A (en) * 2017-12-22 2019-07-02 北京中科华正电气有限公司 Human-computer interaction device and processing method in a kind of embedded intelligence machine
CN110728168A (en) * 2018-07-17 2020-01-24 广州虎牙信息科技有限公司 Part recognition method, device, equipment and storage medium
CN110728168B (en) * 2018-07-17 2022-07-22 广州虎牙信息科技有限公司 Part recognition method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US10394334B2 (en) Gesture-based control system
CN102231093B (en) Screen locating control method and device
Jafri et al. Computer vision-based object recognition for the visually impaired in an indoors environment: a survey
CN105045398B (en) A kind of virtual reality interactive device based on gesture identification
Berman et al. Sensors for gesture recognition systems
EP2904472B1 (en) Wearable sensor for tracking articulated body-parts
WO2018076523A1 (en) Gesture recognition method and apparatus, and in-vehicle system
US20170086712A1 (en) System and Method for Motion Capture
CN110070056A (en) Image processing method, device, storage medium and equipment
CN202584010U (en) Wrist-mounting gesture control system
CN106933340B (en) Gesture motion recognition method, control method and device and wrist type equipment
CN103713738B (en) A kind of view-based access control model follows the tracks of the man-machine interaction method with gesture identification
CN103488294B (en) A kind of Non-contact gesture based on user's interaction habits controls to map method of adjustment
CN106774850B (en) Mobile terminal and interaction control method thereof
CN103353935A (en) 3D dynamic gesture identification method for intelligent home system
CN103135753A (en) Gesture input method and system
CN105739702A (en) Multi-posture fingertip tracking method for natural man-machine interaction
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
CN104364733A (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
CN104813258A (en) Data input device
CN103995595A (en) Game somatosensory control method based on hand gestures
CN104281839A (en) Body posture identification method and device
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
CN104898971B (en) A kind of mouse pointer control method and system based on Visual Trace Technology
KR101465894B1 (en) Mobile terminal for generating control command using marker put on finger and method for generating control command using marker put on finger in terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150826

RJ01 Rejection of invention patent application after publication