CN104680551A - Tracking method and device based on skin color detection - Google Patents


Info

Publication number
CN104680551A
Authority
CN
China
Prior art keywords
area
parameter
fitted ellipse
pixel
distance
Prior art date
Legal status
Granted
Application number
CN201310633324.4A
Other languages
Chinese (zh)
Other versions
CN104680551B (en)
Inventor
穆星
张乐
陈敏杰
林福辉
Current Assignee
Spreadtrum Communications Tianjin Co Ltd
Original Assignee
Spreadtrum Communications Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Spreadtrum Communications Tianjin Co Ltd
Priority to CN201310633324.4A
Publication of CN104680551A
Application granted
Publication of CN104680551B
Status: Active


Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a tracking method and a tracking device based on skin color detection. The method comprises: performing ellipse fitting on at least one first region to obtain a fitted-ellipse parameter of each first region, the first region being a skin color region in a first input image; obtaining a first parameter and a second parameter of a second region based on the relation between a distance threshold μ and the distances from the pixels of the second region to each first region, the second region being a skin color region in a second input image, the first parameter being the fitted-ellipse parameter of the corresponding first region in an ellipse parameter set, and the second parameter being the fitted-ellipse parameter obtained by performing ellipse fitting on the second region; and tracking the second region based on its first parameter and second parameter. The method tracks the tracked skin color region accurately, is simple to process, requires little computation, and is easy to implement on a mobile terminal.

Description

Tracking method and device based on skin color detection
Technical field
The present invention relates to the technical field of image processing, and in particular to a tracking method and device based on skin color detection.
Background art
In color images, skin color information is relatively stable because it is not affected by human posture, facial expression, and the like, and the skin color differs markedly from the color of most background objects. Skin color detection is therefore widely used in detection, gesture analysis, target tracking, and image retrieval. The purpose of human skin color detection is to automatically locate the exposed skin areas of the human body in an image, for example to detect regions such as a person's face or hands.
Meanwhile, with the rapid development of moving-target tracking technology, various tracking methods have been established based on the color features, motion information, and image information of the moving target. Tracking methods based on color features include mean shift and continuously adaptive mean shift, which can track a person's gestures well in some simple scenes. Tracking methods based on motion information include optical flow, Kalman filtering (Kalman Filter), and particle filtering (Particle Filter).
With the above moving-target detection and tracking methods, the features of an image sequence captured of a person's hands or face in motion can be tracked; for example, the regions such as the face and hands detected from the image by the human skin color detection methods described above can be tracked. In the process of moving-target detection and tracking, feature detection and tracking of the moving target are an important foundation and a key technology of the research.
However, in the prior art, problems may arise when the above methods are used to detect and track a moving target. For example, methods based on color features have low robustness to complex scenes and illumination changes, methods based on motion information may have difficulty adapting to arbitrary changes of a gesture or involve a large amount of computation in the tracking process, and when large-area occlusion occurs between multiple moving targets, it is difficult for the above tracking methods to track the moving targets accurately.
For related art, reference may be made to the U.S. patent application with publication number US2013259317A1.
Summary of the invention
The technical solution of the present invention addresses the difficulty of accurately tracking a tracked object and the large amount of computation in the tracking process.
To solve the above problems, the technical solution of the present invention provides a tracking method based on skin color detection, the method comprising:
performing ellipse fitting on at least one first region to obtain a fitted-ellipse parameter of each first region, the first region being a skin color region in a first input image;
obtaining a first parameter and a second parameter of a second region based on the relation between a distance threshold μ and the distances from the pixels of the second region to each first region, the second region being a skin color region in a second input image, the first parameter being the fitted-ellipse parameter of the corresponding first region in an ellipse parameter set, and the second parameter being the fitted-ellipse parameter obtained by performing ellipse fitting on the second region;
tracking the second region based on the first parameter and the second parameter of the second region;
wherein:
the fitted-ellipse parameter comprises the coordinate value of the center point of the fitted ellipse;
the ellipse parameter set comprises the fitted-ellipse parameters of all first regions;
the distance between a pixel and a first region is the distance between the pixel and the center point of the fitted ellipse of that first region in the ellipse parameter set.
Optionally, the skin color region is obtained by a skin color detection method based on a skin color ellipse model.
Optionally, the method further comprises:
updating the skin color ellipse model by the formula P(s/c) = γ×P(s/c) + (1−γ)×P_w(s/c), where s is the pixel value of a pixel of the input image, c is the pixel value of a skin pixel, P(s/c) is the probability that the pixel is a skin color point, P_w(s/c) is the probability, obtained through the skin color ellipse model over w consecutive frames, that the pixel is a skin color point, and γ is a sensitivity parameter.
Optionally, the ellipse fitting of a region is determined by computing the covariance matrix of the pixels in the region.
Optionally, obtaining the first parameter and the second parameter of the second region based on the relation between the distance threshold μ and the distances from the pixels of the second region to each first region comprises:
if the distances from all pixels of the second region to at least one first region are all less than μ, the first parameter of the second region is the fitted-ellipse parameter of the first region, among the at least one first region, that is nearest to the second region, and the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region;
the first region nearest to the second region has the largest number of corresponding pixels, a pixel corresponding to a first region being a pixel of the second region whose distance to that first region is less than its distance to any other first region.
Optionally, obtaining the first parameter and the second parameter of the second region based on the relation between the distance threshold μ and the distances from the pixels of the second region to each first region comprises:
if the distances from some pixels of the second region to N first regions h1, h2, …, hN are all less than μ, determining that the second region has N first parameters A1, A2, …, AN and N second parameters B1, B2, …, BN, where Aj is the fitted-ellipse parameter of the first region hj, and Bj is the fitted-ellipse parameter obtained by performing ellipse fitting on the pixels of a first set and a second set; the first set is the set of said pixels, and the second set is the set of pixels of the second region, other than said pixels, that correspond to the first region hj, a pixel corresponding to the first region hj being one whose distance to hj is less than its distance to any other first region, 1 ≤ j ≤ N, N ≥ 2.
Optionally, the value of the distance threshold μ is between 1 and 2.
Optionally, the fitted-ellipse parameter further comprises the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse;
the distance between a pixel of the second region and a first region is calculated by the formula D(p, h) = v·v, where v = R(θ)·((x−x_c)/α, (y−y_c)/β)^T with R(θ) = [cos θ, −sin θ; sin θ, cos θ], p is the pixel of the second region, (x, y) is the coordinate value of p, h is the fitted ellipse corresponding to the first region, (x_c, y_c) is the coordinate value of the center point of the fitted ellipse, α is the major-axis length of the fitted ellipse, β is the minor-axis length of the fitted ellipse, and θ is the rotation angle of the fitted ellipse.
Optionally, obtaining the first parameter and the second parameter of the second region based on the relation between the distance threshold μ and the distances from the pixels of the second region to each first region comprises:
if the distances from all pixels of the second region to every first region are all greater than μ, the first parameter of the second region is empty, and the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region.
Optionally, the method further comprises: after all second regions of K consecutive frames have been tracked, if the distances from all pixels of all second regions of the K consecutive frames to a same first region are all greater than μ, deleting the fitted-ellipse parameter of that first region from the ellipse parameter set, where K ranges from 5 to 20.
Optionally, the method further comprises: updating the fitted-ellipse parameter of the first region corresponding to the second region in the ellipse parameter set to the second parameter of the second region.
Optionally, the method further comprises:
determining the coordinate value (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third region based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third region being the skin color region in the next frame of the input image that corresponds to the second region;
where Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second region, and (x_{c−1}, y_{c−1}) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second region.
The technical solution of the present invention also provides a tracking device based on skin color detection, the device comprising:
a first acquiring unit, adapted to perform ellipse fitting on at least one first region to obtain the fitted-ellipse parameter of each first region, the first region being a skin color region in a first input image;
a second acquiring unit, adapted to obtain a first parameter and a second parameter of a second region based on the relation between a distance threshold μ and the distances from the pixels of the second region to each first region, the second region being a skin color region in a second input image, the first parameter being the fitted-ellipse parameter of the corresponding first region in an ellipse parameter set, and the second parameter being the fitted-ellipse parameter obtained by performing ellipse fitting on the second region;
a tracking unit, adapted to track the second region based on the first parameter and the second parameter of the second region;
Wherein,
the fitted-ellipse parameter comprises the coordinate value of the center point of the fitted ellipse;
the ellipse parameter set comprises the fitted-ellipse parameters of all first regions;
the distance between a pixel and a first region is the distance between the pixel and the center point of the fitted ellipse of that first region in the ellipse parameter set.
Optionally, the device further comprises: an updating unit, adapted to update the fitted-ellipse parameter of the first region corresponding to the second region in the ellipse parameter set to the second parameter of the second region.
Optionally, the device further comprises: a predicting unit, adapted to determine the coordinate value (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third region based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third region being the skin color region in the next frame of the input image that corresponds to the second region;
where Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second region, and (x_{c−1}, y_{c−1}) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second region.
Compared with the prior art, the technical solution of the present invention has the following advantages:
Ellipse fitting is performed on the skin color regions obtained by the skin color detection method (the first regions and the second region) to obtain the fitted-ellipse parameters of the regions. During tracking, based on the relation between the distance threshold μ and the distances from the pixels of the skin color region of the current input image (the second region) to each skin color region of the previous input image (each first region), the fitted-ellipse parameters of the tracked skin color region (the first parameter and the second parameter of the second region) can be determined accurately, and the tracked skin color region can be tracked accurately from the change of its fitted-ellipse parameters during tracking. The method is simple to process, requires little computation, and is easy to implement on a mobile terminal.
In the process of detecting skin color regions by the skin color detection method, the skin color ellipse model used for skin color detection is optimized. The optimized skin color ellipse model detects adaptively according to the current input image information, is more robust to illumination, and effectively improves the accuracy of skin color region detection.
During tracking, depending on the different relations between the distance threshold μ and the distances from the pixels of the skin color region of the current input image (the second region) to each skin color region of the previous input image (each first region), different methods are used to determine the first parameter and the second parameter, so that the fitted-ellipse parameters of the tracked skin color region can be determined accurately. In particular, when multiple skin color regions occlude one another in the current input image, the method can still track each skin color region well.
After tracking, based on the fitted-ellipse parameter of the tracked skin color region in the current input image and its fitted-ellipse parameter in the previous input image, the fitted-ellipse parameter of the tracked skin color region in the next frame of the input image can be predicted.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the tracking method based on skin color detection provided by the technical solution of the present invention;
Fig. 2 is a schematic flowchart of the tracking method based on skin color detection provided by an embodiment of the present invention;
Fig. 3 is a schematic flowchart of optimizing the skin color ellipse model provided by an embodiment of the present invention;
Fig. 4 is a schematic flowchart of the tracking method based on skin color detection provided by another embodiment of the present invention.
Detailed description of the embodiments
In the prior art, in the process of detecting and tracking skin color regions, robustness to complex scenes and illumination changes is low, and when multiple skin color regions are tracked, they cannot be tracked effectively.
To solve the above problems, the technical solution of the present invention provides a tracking method based on skin color detection. In this method, during tracking, the fitted-ellipse parameters of the skin color region of the current input image are determined based on the relation between the distance threshold μ and the distances from the pixels of the skin color region of the current input image to each skin color region of the previous input image, thereby realizing tracking of the skin color region of the current input image.
Fig. 1 is a schematic flowchart of the tracking method based on skin color detection provided by the technical solution of the present invention. As shown in Fig. 1, step S101 is performed first: ellipse fitting is performed on at least one first region to obtain the fitted-ellipse parameter of each first region.
The first region is a skin color region in the first input image, a skin color region here being one that is used for tracking. The first input image may be the initial input image before the current skin color region is tracked. The skin color regions contained in the first input image can be obtained by various skin color detection methods in the prior art. Since one frame of the input image may contain one or more skin color regions, the first input image may contain one or more skin color regions used for tracking; that is, in this step ellipse fitting needs to be performed on at least one skin color region (first region). The skin color detection method may realize skin color detection of the image based on a single Gaussian model, a Gaussian mixture model, an elliptical skin color model, or the like.
Ellipse fitting is performed on a skin color region (first region) to obtain the fitted-ellipse parameter corresponding to that region; the fitted-ellipse parameter comprises the coordinate value of the center point, the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse corresponding to the region.
Based on the ellipse fitting, the fitted-ellipse parameter of each first region in the first input image can be obtained.
Step S102 is performed: the first parameter and the second parameter of the second region are obtained based on the relation between the distance threshold μ and the distances from the pixels of the second region to each first region.
The second region is a skin color region in the second input image; the second input image may be the current input image containing the tracked skin color region (second region).
The distance from a pixel of the second region to a first region is the distance between the pixel and the center point of the fitted ellipse of that first region in the ellipse parameter set; the ellipse parameter set is the set of fitted-ellipse parameters of all first regions in the first input image obtained in step S101.
The first parameter of the second region is the fitted-ellipse parameter of the first region in the ellipse parameter set that corresponds to the second region, and the second parameter is the fitted-ellipse parameter obtained by performing ellipse fitting on the second region.
Step S103 is performed: the second region is tracked based on its first parameter and second parameter.
After the first parameter and the second parameter of the second region have been obtained in step S102, the fitted-ellipse information of the second region at the previous moment can be obtained from the first parameter, since the first parameter is the fitted-ellipse parameter of the first region corresponding to the second region, and the current fitted-ellipse information of the second region can be obtained from the second parameter, since the second parameter is the fitted-ellipse parameter obtained by performing ellipse fitting on the second region. Accurate tracking of the second region can then be realized based on this fitted-ellipse information at the different moments.
To make the above objects, features, and advantages of the present invention clearer, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
In the present embodiment, ellipse fitting is used to obtain the fitted-ellipse parameters of the skin color regions in the input image, and the tracking process is described for the case where the tracked skin color region corresponds to multiple fitted-ellipse parameters in the ellipse parameter set.
Fig. 2 is a schematic flowchart of the tracking method based on skin color detection provided by the present embodiment. As shown in Fig. 2, step S201 is performed first: the first regions in the first input image are detected based on the skin color ellipse model.
The first input image is read first. If the first input image is in an RGB format, a color space conversion may first be performed to convert it from the RGB space to the YCbCr space.
In the YCbCr space, Y represents luminance, and Cb and Cr are color difference signals representing chrominance. Under different illumination conditions the luminance of an object's color can vary greatly, while the chrominance remains largely stable over a wide range. Moreover, studies in the prior art show that the distribution of human skin color in the YCbCr space is relatively concentrated, i.e., skin color has a clustering property: the color differences between races are mainly caused by luminance and are independent of the color attributes. Using this property, image pixels can be divided into skin and non-skin pixels. Therefore, in the present embodiment, to improve the accuracy of skin color region detection, the image is converted from the commonly used RGB space to the YCbCr space.
A skin color ellipse model trained in the prior art can then be used to perform initial detection to obtain one or more skin color regions contained in the initial input image.
Because skin color detection with a skin color ellipse model trained in the prior art may produce some erroneous detection regions, for example holes inside a skin color region, in the present embodiment the skin color region information in the detection result may first be pre-optimized: in view of the connectivity and size of skin color objects, holes in the skin color regions of the image can be removed by four-connected or eight-connected region filling.
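As a concrete illustration of this preprocessing step, the following Python sketch converts an RGB frame to YCbCr, builds a skin mask, and fills holes by connected-region filling. The Cb/Cr box thresholds are only a hypothetical stand-in for the trained skin color ellipse model, and the function name and parameters are illustrative rather than taken from the patent.

```python
import cv2
import numpy as np
from scipy import ndimage

def detect_skin_regions(rgb_image, cb_range=(77, 127), cr_range=(133, 173)):
    """Convert to YCbCr, build a rough skin mask, and fill holes.

    The Cb/Cr box thresholds stand in for the trained skin color ellipse
    model described in the text; they are illustrative only.
    """
    # OpenCV uses the YCrCb channel order (Y, Cr, Cb).
    ycrcb = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2YCrCb)
    _, cr, cb = cv2.split(ycrcb)

    mask = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

    # Remove holes inside detected skin blobs (cf. the four-/eight-connected
    # region filling mentioned above).
    mask = ndimage.binary_fill_holes(mask)

    # Label connected components; each component is a candidate skin region.
    labels, num_regions = ndimage.label(mask)
    return labels, num_regions
```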
In the present embodiment, the skin color ellipse model can be optimized by the following formula (1) based on the pre-optimized skin color region information.
P(s/c) = γ×P(s/c) + (1−γ)×P_w(s/c)   (1)
where s is the pixel value of a pixel of the input image, c is the pixel value of a skin pixel, P(s/c) on the left of the equation is the probability that the pixel is a skin color point after the optimization, P(s/c) on the right of the equation is the probability that the pixel is a skin color point obtained through the skin color ellipse model before the optimization, P_w(s/c) is the probability, obtained through the skin color ellipse model over w consecutive frames, that the pixel is a skin color point, and γ is a sensitivity parameter.
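A minimal sketch of the update in formula (1), assuming the probabilities are stored as per-pixel arrays; the default value of gamma and the function name are illustrative, not taken from the patent.

```python
import numpy as np

def update_skin_probability(p_model, p_window, gamma=0.8):
    """Blend the model's per-pixel skin probability with the probability
    observed over the last w frames, following formula (1).

    p_model  : P(s/c) from the current skin color ellipse model
    p_window : P_w(s/c) accumulated over w consecutive frames
    gamma    : sensitivity parameter in [0, 1] (value here is illustrative)
    """
    return gamma * np.asarray(p_model) + (1.0 - gamma) * np.asarray(p_window)
```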
After the skin color ellipse model has been optimized, the process can return to reading the first input image again, perform the color space conversion, and then perform skin color detection again based on the updated skin color ellipse model. After detection, the skin color region information in the detection result can again be pre-optimized. If the optimized skin color region information is satisfactory, one or more concrete skin color regions can be extracted from it; if not, the skin color ellipse model can again be optimized by formula (1) based on the pre-optimized skin color region information, until the skin color region information after pre-optimization and model optimization meets the user's requirements.
For the above process, reference may be made to Fig. 3, which is a schematic flowchart of optimizing the skin color ellipse model.
It should be noted that in the process of initial detection based on the skin color ellipse model, at least one of the pre-optimization and the model optimization described above may be applied.
After step S201, step S202 is performed: ellipse fitting is performed on each first region in the first input image.
Based on step S201, one or more skin color regions in the first input image can be obtained, and each first region in the first input image can be determined from these skin color regions.
Considering that skin color objects may overlap during actual motion, the number of detected skin color regions may not equal the number of skin color regions used for tracking; in this specification, a first region refers to a skin color region that is used for tracking.
In the present embodiment, the description assumes that the first input image contains multiple first regions.
Since the shapes of skin color objects such as faces and hands are approximately elliptical, the multiple first regions contained in the first input image can each be fitted to an elliptical shape by ellipse fitting. The elliptical shape can usually be represented by the ellipse model shown in formula (2):
h = h(x_c, y_c, α, β, θ)   (2)
where h represents the fitted ellipse corresponding to the first region, (x_c, y_c) is the coordinate value of the center point of the fitted ellipse corresponding to the first region, α is the major-axis length, β is the minor-axis length, and θ is the rotation angle of the fitted ellipse corresponding to the first region.
In the present embodiment, the ellipse fitting of a first region can be performed by computing the covariance matrix of the pixels of the first region.
Taking one first region in the first input image as an example: since the first region corresponds to a cluster of contiguous skin pixels, the covariance matrix Σ of the skin pixels in the first region can be computed.
Specifically, let X = [x_1 … x_n] denote the X-direction vector of the pixel set and Y = [y_1 … y_n] denote the Y-direction vector of the pixel set, where x_1, …, x_n are the X coordinates of the skin pixels in the first region, y_1, …, y_n are the Y coordinates of the skin pixels in the first region, and n is the number of skin pixels in the first region.
Let Z = [X; Y]. The covariance matrix Σ can then be obtained by formula (3):
Σ = E((Z − E(Z))(Z − E(Z))^T)   (3)
where E denotes mathematical expectation.
In the vector calculation of formula (3), the covariance matrix Σ is in fact a 2×2 matrix, which can be written in the form of formula (4):
Σ = [δ_xx, δ_xy; δ_xy, δ_yy]   (4)
where each element of Σ is a covariance between the X-direction and Y-direction vectors of the pixel set.
The major-axis length α of the fitted ellipse corresponding to the first region can be obtained based on formula (5), the minor-axis length β based on formula (6), and the rotation angle θ based on formula (7), each computed from the elements δ_xx, δ_xy, δ_yy of the covariance matrix Σ.
The coordinate value (x_c, y_c) of the center point of the fitted ellipse corresponding to the first region can be obtained in the fitting process from the coordinate values of the pixels on the boundary of the first region.
The initial fitted-ellipse parameter of the first region is thus obtained; it comprises the coordinate value of the center point, the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse corresponding to the first region.
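The sketch below illustrates covariance-based ellipse fitting in Python. The patent derives α, β, and θ from Σ via its formulas (5)–(7), which are not reproduced in this text, so the standard eigenvalue-based construction (and the mean as the center, instead of the boundary-pixel computation mentioned above) is used here as a stated assumption; the scale factor and names are illustrative.

```python
import numpy as np

def fit_ellipse(points):
    """Fit an ellipse (xc, yc, alpha, beta, theta) to a skin region.

    `points` is an (n, 2) array of (x, y) pixel coordinates.
    """
    pts = np.asarray(points, dtype=float)
    xc, yc = pts.mean(axis=0)                 # center (mean used as a stand-in)

    sigma = np.cov(pts, rowvar=False)         # 2x2 covariance matrix, cf. (3)/(4)
    eigvals, eigvecs = np.linalg.eigh(sigma)  # eigenvalues in ascending order

    # Axis lengths proportional to the spread along the principal directions;
    # the 2.0 scale factor is an illustrative choice, not the patent's.
    beta, alpha = 2.0 * np.sqrt(eigvals)      # minor, major
    major_dir = eigvecs[:, 1]
    theta = np.arctan2(major_dir[1], major_dir[0])  # rotation angle

    return xc, yc, alpha, beta, theta
```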
Step S203 is performed: the ellipse parameter set is established.
Based on step S202, the fitted-ellipse parameter of each first region in the first input image can be obtained.
The fitted-ellipse parameters of all first regions in the first input image are placed in one set, which forms the ellipse parameter set.
Step S204 is performed: each second region in the second input image is detected based on the skin color ellipse model.
For the current input image, i.e., the second input image, each skin color region contained in it can be obtained based on the skin color ellipse model, and each second region in the second input image can be determined from these skin color regions; for details, refer to step S201.
Each second region can then be tracked.
Step S205 is performed: the distances from the pixels of the second region to each first region are calculated.
Taking the tracking of one second region as an example, this step first calculates the distances from all pixels of the tracked second region to each first region.
The distance from each pixel of the tracked second region to each first region can be obtained by formula (8):
D(p, h) = v·v   (8)
where v = R(θ)·((x−x_c)/α, (y−y_c)/β)^T with R(θ) = [cos θ, −sin θ; sin θ, cos θ], p is a pixel of the second region, (x, y) is the coordinate value of p, h is the fitted ellipse corresponding to the first region, (x_c, y_c) is the coordinate value of the center point of the fitted ellipse corresponding to the first region, α is the major-axis length, β is the minor-axis length, and θ is the rotation angle of the fitted ellipse corresponding to the first region.
For any first region, the distance from each pixel of the tracked second region to that first region can be obtained by formula (8).
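A direct transcription of formula (8) under the ellipse representation used in the fit_ellipse sketch above; the function name is illustrative.

```python
import numpy as np

def ellipse_distance(px, py, ellipse):
    """Normalized distance D(p, h) from pixel p to the fitted ellipse h,
    following formula (8); values <= 1 fall inside the ellipse.

    `ellipse` is (xc, yc, alpha, beta, theta), as produced by fit_ellipse.
    """
    xc, yc, alpha, beta, theta = ellipse
    u = np.array([(px - xc) / alpha, (py - yc) / beta])
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    v = rot @ u
    return float(v @ v)
```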
Step S206 is performed: the first parameter and the second parameter of the second region are determined.
After the distances from each pixel of the tracked second region to the first regions have been obtained in step S205, the fitted-ellipse parameter of the first region in the ellipse parameter set that corresponds to the tracked second region can be determined from these distances.
In the present embodiment, the description assumes that the tracked second region corresponds to the fitted-ellipse parameters of multiple first regions in the ellipse parameter set.
From the distance of each pixel of the tracked second region to a first region, it can be judged whether the pixel lies within the range of the ellipse determined by the fitted-ellipse parameter of that first region.
It is generally considered that when the distance D(p, h) from a pixel of the tracked second region to a first region, calculated by formula (8), satisfies D(p, h) ≤ 1, the pixel lies within the fitted ellipse corresponding to that first region, i.e., within the range of the ellipse determined by the fitted-ellipse parameter of that first region. The fitted-ellipse parameter of that first region can then be called the fitted-ellipse parameter of the first region corresponding to the second region, and the ellipse determined by the fitted-ellipse parameter of the first region can be regarded as the target fitted ellipse of the second region at the previous moment. The fitted-ellipse parameter of the first region corresponding to the second region is called the first parameter of the second region.
However, because of the irregular shape of skin color objects such as hands, a fitted ellipse that is too small for the skin color object affects the accuracy of the tracking result. To obtain a reasonable tracking effect, after the distance from a pixel of the tracked second region to a first region has been calculated, whether the pixel lies within the fitted ellipse corresponding to that first region can be judged from the relation between the distance and a distance threshold μ. The distance threshold μ can be set according to the actual tracking situation, the size of the tracked object, the complexity of the tracking scene, the way the covariance matrix is computed, and other factors; in this specification the value of the distance threshold μ may be between 1 and 2.
If, according to formula (8), some pixels of the second region have distances to multiple first regions that are all less than the distance threshold μ — suppose some pixels of the second region have distances to N first regions h1, h2, …, hN that are all less than μ — then these pixels can be considered to lie simultaneously within the ellipses determined by the fitted-ellipse parameters of the N first regions. The N first regions h1, h2, …, hN may then overlap one another, i.e., different first regions may occlude one another in the tracking scene. The ellipses determined by the fitted-ellipse parameters of these N first regions can all be regarded as target fitted ellipses of the second region, and the second region accordingly has N first parameters A1, A2, …, AN, which are the fitted-ellipse parameters of the N first regions in the ellipse parameter set that correspond to the second region.
Because the second region has N target fitted ellipses, i.e., the fitted-ellipse parameters of N first regions correspond to it, ellipse fitting now needs to be performed on the second region separately for each of the N different first regions hj, so that the second region can be tracked based on the fitting results.
For example, for a given first region hj, among the pixels of the second region other than those whose distances to all of the first regions are less than the distance threshold μ, some may be nearer to hj and others nearer to another first region hk. If the pixels nearer to hk were also used in the ellipse fitting, the resulting fitted-ellipse parameter of the second region corresponding to hj would obviously be problematic.
Therefore, in this specification, the ellipse fitting of the second region corresponding to the first region hj uses the pixels of the first set (the pixels whose distances to all of the first regions are less than μ) together with the second set, namely the pixels of the second region, other than the first set, that are nearest to hj. A pixel nearest to hj is also called a pixel of the second region corresponding to the first region hj: its distance to hj is less than its distance to any other first region.
Similarly, for the first region hk, the ellipse fitting of the second region corresponding to hk uses the pixels of the first set together with the pixels of the second region, other than the first set, that are nearest to hk. Proceeding in this way for the N different first regions corresponding to the second region, N ellipse fittings are performed on the second region, yielding N fitted-ellipse parameters B1, B2, …, BN corresponding one-to-one to the first regions h1, h2, …, hN. These fitted-ellipse parameters B1, B2, …, BN are called the second parameters of the second region, where 1 ≤ j ≤ N and N ≥ 2; for details, refer to step S202.
The second region thus has multiple different target fitted ellipses, and correspondingly multiple first parameters and second parameters.
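The following sketch shows the pixel partitioning just described for the occlusion case, assuming the fit_ellipse and ellipse_distance helpers sketched earlier are in scope; the function name and the default value of mu are illustrative.

```python
import numpy as np

def second_parameters_for_overlap(pixels, overlapping_ellipses, mu=1.5):
    """Occlusion case: fit one ellipse per overlapping first region.

    `pixels` is an (n, 2) array of the second region's pixel coordinates;
    `overlapping_ellipses` holds the fitted ellipses of the N first regions
    h1..hN whose distance to part of the region is below mu.
    """
    pixels = np.asarray(pixels, dtype=float)
    # Distance of every pixel to every overlapping first region.
    dists = np.array([[ellipse_distance(x, y, h) for h in overlapping_ellipses]
                      for x, y in pixels])

    shared = np.all(dists < mu, axis=1)   # "first set": close to all hj
    nearest = np.argmin(dists, axis=1)    # index of the nearest hj per pixel

    second_params = []
    for j in range(len(overlapping_ellipses)):
        # "second set": remaining pixels whose nearest first region is hj.
        own = (~shared) & (nearest == j)
        second_params.append(fit_ellipse(pixels[shared | own]))  # Bj
    return second_params
```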
Step S207 is performed: the second region is tracked based on its first parameter and second parameter.
Because the first parameter of the second region is the fitted-ellipse parameter of the corresponding first region, it can be regarded as the fitted-ellipse parameter of the second region at the previous moment, while the second parameter of the second region is the fitted-ellipse parameter obtained by ellipse fitting at the current moment that corresponds to the first parameter. From the first parameter and the second parameter of the second region, the fitted-ellipse parameters of the second region at the different moments can be determined accurately, the motion of the second region can be determined, and tracking of the second region is realized.
When determining the target fitted ellipse of the tracked second region, the calculated distances from the second region to the first regions in the ellipse parameter set are compared with the distance threshold; a suitable distance threshold can likewise be determined according to the actual tracking situation, making the tracking result for the second region more accurate.
Even when multiple skin color regions (first regions) occlude one another in the input image, the method can still track the skin color regions well.
The above embodiment describes the tracking process when the tracked skin color region corresponds to multiple fitted-ellipse parameters in the ellipse parameter set; in the actual tracking process, the relation between the tracked skin color region and the fitted-ellipse parameters in the ellipse parameter set may take several other forms.
Fig. 4 is a schematic flowchart of the tracking method based on skin color detection provided by another embodiment of the present invention. In the present embodiment, different tracking processing is carried out for the different situations of the tracked skin color object.
During tracking, the following situations usually need to be considered: A, a new skin color object appears in the tracking scene; B, a previously tracked skin color object disappears from the tracking scene; C, a tracked skin color object moves continuously in the scene; D, different skin color objects occlude one another in the tracking scene. For the four different skin color object situations A, B, C, and D, the ellipse parameter set correspondingly undergoes the generation of a target fitted ellipse, the release of a target fitted ellipse, the continuous tracking of a target fitted ellipse, and the overlap of target fitted ellipses, and the tracking processing applied to the skin color regions corresponding to the skin color objects also differs; the target fitted ellipse is the fitted ellipse corresponding to the first region mentioned above.
In the present embodiment, the tracking of the skin color object is described for the four different situations A, B, C, and D.
As shown in Fig. 4, step S401 is performed first: the first regions in the first input image are detected based on the skin color ellipse model.
Step S402 is performed: ellipse fitting is performed on each first region in the first input image.
Step S403 is performed: the ellipse parameter set is established.
Step S404 is performed: each second region in the second input image is detected based on the skin color ellipse model.
Step S405 is performed: the distances from the pixels of the second region to each first region are calculated.
For steps S401 to S405, refer to steps S201 to S205.
Based on the distances from the pixels of the second region to each first region determined in step S405, the second region can be classified into the following four situations a, b, c, and d.
As shown in Fig. 4, if the distances from all pixels of the second region to every first region are all greater than the distance threshold μ, situation a is determined.
If, after all second regions of K consecutive frames have been tracked, the distances from all pixels of all second regions of the K consecutive frames to a same first region are all greater than the distance threshold μ, situation b is determined.
If the distances from all pixels of the second region to at least one first region are all less than the distance threshold μ, situation c is determined.
If the distances from some pixels of the second region to N first regions h1, h2, …, hN are all less than the distance threshold μ, situation d is determined.
Tracking processing is carried out for the four different situations a, b, c, and d.
In situation a, as shown in Fig. 4, step S406 is performed: the first parameter of the second region is set to empty, and the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region is taken as the second parameter of the second region.
When the distances from all pixels of the second region to every first region are all greater than the distance threshold μ, it can be determined that the second region has no corresponding first region in the ellipse parameter set, i.e., it does not belong to the fitted ellipse corresponding to any first region; the second region should be the skin color region corresponding to a skin color object that has newly appeared in the tracking scene. The first parameter of the second region is therefore set to empty, the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region, and this fitted-ellipse parameter can be newly added to the ellipse parameter set. When the second region is tracked in subsequent input images, the fitted-ellipse parameter corresponding to the second region in the ellipse parameter set is updated with the fitted-ellipse parameters obtained by later ellipse fittings.
In situation b, step S407 is performed: the fitted-ellipse parameter of the first region is deleted from the ellipse parameter set.
If, after all second regions of K consecutive frames have been tracked, the distances from all pixels of all second regions of the K consecutive frames to a same first region are all greater than the distance threshold μ, then every second region is farther than the distance threshold μ from that first region, which means that the tracked object corresponding to that first region in the previous frames has disappeared; the fitted-ellipse parameter of that first region can then be deleted from the ellipse parameter set.
The image information of K consecutive frames needs to be considered here because skin color information may occasionally be lost in a frame during skin color detection. Therefore, when in one frame of the input image the distances from all pixels of all second regions to a same first region are all greater than the distance threshold μ, the fitted-ellipse parameter of that first region may simply be left un-updated; if in the next frame of the input image the distances from the pixels of a second region to that first region are again all less than the distance threshold μ, the fitted-ellipse parameter of that first region can continue to be updated by the method described above. Only if this situation occurs in K consecutive frames can it be determined that the tracked object has disappeared, and the fitted-ellipse parameter of that first region be deleted from the ellipse parameter set. The value of K may range from 5 to 20.
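A toy bookkeeping sketch for situation b: each first region's fitted ellipse keeps a counter of consecutive unmatched frames and is dropped once the counter reaches K. The class structure and names are illustrative, not taken from the patent.

```python
class EllipseParameterSet:
    """Drop a first region's fitted ellipse after K consecutive misses."""

    def __init__(self, k_frames=10):
        self.k_frames = k_frames   # K, typically 5-20 per the text above
        self.params = {}           # region id -> fitted ellipse parameter
        self.missed = {}           # region id -> consecutive missed frames

    def update(self, region_id, fitted_ellipse):
        # The region was matched this frame: refresh and reset its counter.
        self.params[region_id] = fitted_ellipse
        self.missed[region_id] = 0

    def report_miss(self, region_id):
        # No second region came within mu of this first region in this frame.
        self.missed[region_id] = self.missed.get(region_id, 0) + 1
        if self.missed[region_id] >= self.k_frames:
            self.params.pop(region_id, None)
            self.missed.pop(region_id, None)
```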
In situation c, step S408 is performed: the first parameter of the second region is the fitted-ellipse parameter of the first region, among the at least one first region, that is nearest to the second region; the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region; and the second region is tracked based on its first parameter and second parameter.
If the distances from all pixels of the second region to at least one first region are all less than the distance threshold μ, there may be one or more corresponding first regions in the ellipse parameter set.
If the distances from all pixels of the second region to one first region in the ellipse parameter set are less than μ, and the distances to every other first region in the ellipse parameter set are all greater than μ, that first region is determined to be the first region corresponding to the second region; the first parameter of the second region is then the fitted-ellipse parameter of that first region, and the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region.
If the distances from all pixels of the second region to multiple first regions in the ellipse parameter set are all less than μ, it is usually assumed that two different first regions cannot both correspond to the same tracked skin color region, so the first region corresponding to the second region can be determined from the distances between the second region and the different first regions.
Specifically, the first parameter of the second region is the fitted-ellipse parameter of the first region, among the multiple first regions, that is nearest to the second region, and the second parameter of the second region is the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region. The first region nearest to the second region has the largest number of corresponding pixels, a pixel corresponding to a first region being a pixel of the second region whose distance to that first region is less than its distance to any other first region.
Taking two first regions U and V as an example: when a pixel of the second region is closer to first region U than to first region V, the pixel is determined to correspond to first region U; the first region with more corresponding pixels in the second region is determined to be the first region nearest to the second region.
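A sketch of this per-pixel vote for the nearest first region, again assuming the ellipse_distance helper sketched earlier is in scope; the function name is illustrative.

```python
import numpy as np

def nearest_first_region(pixels, candidate_ellipses):
    """Situation c with several candidates: pick the first region that 'wins'
    the most pixels of the second region; returns the winner's index."""
    pixels = np.asarray(pixels, dtype=float)
    dists = np.array([[ellipse_distance(x, y, h) for h in candidate_ellipses]
                      for x, y in pixels])
    votes = np.bincount(np.argmin(dists, axis=1),
                        minlength=len(candidate_ellipses))
    return int(np.argmax(votes))
```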
In situation d, as shown in Fig. 4, step S409 is performed: it is determined that the second region has N first parameters A1, A2, …, AN and N second parameters B1, B2, …, BN, and the second region is tracked based on its first parameters and second parameters.
If the distances from some pixels of the second region to N first regions h1, h2, …, hN are all less than the distance threshold μ, some pixels of the second region may lie within the fitted ellipses of the N first regions simultaneously, i.e., different first regions may occlude one another in the tracking scene. This is the tracking situation described in the preceding embodiment of this specification; for the concrete tracking processing, refer to steps S206 and S207, which are not repeated here.
In the present embodiment, different tracking processing is carried out for the different situations, so that the tracked skin color region can be tracked accurately and effectively in each situation.
After the current input image, i.e., the second region in the second input image, has been tracked, to facilitate continued tracking of the second region in the next frame of the input image, the tracking method based on skin color detection of the above embodiments of the present invention may further comprise: updating the fitted-ellipse parameter of the first region corresponding to the second region in the ellipse parameter set to the second parameter of the second region of the current frame of the input image (the second input image). When the second region is tracked in the next frame of the input image, the updated fitted-ellipse parameter of the first region corresponding to the second region in the ellipse parameter set is used as the first parameter of the second region, the second parameter of the second region in the next frame of the input image is determined by the method provided by the embodiments of the present invention, and the second region in the next frame of the input image is tracked based on its first parameter and second parameter; and so on, so that the ellipse parameter set is updated synchronously in real time during tracking.
The first region corresponding to the second region can be determined from the distances between the pixels of the second region and the first regions.
For situation a in the embodiment shown in Fig. 4, if the distances from all pixels of the second region to every first region are all greater than the distance threshold μ, the second region has no corresponding first region in the ellipse parameter set; the fitted-ellipse parameter obtained by performing ellipse fitting on all pixels of the second region can be newly added to the ellipse parameter set, and when the second region is tracked in subsequent input images, the elliptical region determined by the newly added fitted-ellipse parameter in the ellipse parameter set is taken as the first region corresponding to the second region.
For situation c in the embodiment shown in Fig. 4, if the distances from all pixels of the second region to one first region in the ellipse parameter set are less than μ and the distances to every other first region in the ellipse parameter set are all greater than μ, that first region is determined to be the first region corresponding to the second region; if the distances from all pixels of the second region to multiple first regions in the ellipse parameter set are all less than μ, the first region nearest to the second region is determined to be the first region corresponding to the second region.
For situation d in the embodiment shown in Fig. 4, if the distances from some pixels of the second region to multiple first regions are all less than the distance threshold μ, all of these first regions can be called first regions corresponding to the second region; the concrete determination of the first regions corresponding to the second region can be made with reference to the method described above.
In addition, although skin color objects such as hands and faces may follow irregular trajectories when moving in the scene, the motion of a skin color object between consecutive frames can be approximated as linear motion. The center-point coordinate value of the fitted ellipse corresponding to the skin color object in the next frame of the input image can therefore be predicted from the center-point coordinate values of the fitted ellipses in the current frame and the previous frame of the input image; in the prediction, the other fitted-ellipse parameters may remain unchanged.
Specifically, based on the first parameter and the second parameter of the second region, the coordinate value (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to the third region can be predicted in real time by formula (9):
(x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc   (9)
where Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second region, and (x_{c−1}, y_{c−1}) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second region. The third region is the skin color region in the next frame of the input image that corresponds to the second region.
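A one-line transcription of the constant-velocity prediction in formula (9); the function and argument names are illustrative.

```python
def predict_next_center(first_param_center, second_param_center):
    """Predict the third region's center (x_{c+1}, y_{c+1}) per formula (9).

    first_param_center  -> (x_{c-1}, y_{c-1}) from the first parameter
    second_param_center -> (x_c, y_c) from the second parameter
    """
    (x_prev, y_prev), (x_cur, y_cur) = first_param_center, second_param_center
    dx, dy = x_cur - x_prev, y_cur - y_prev   # delta c
    return (x_cur + dx, y_cur + dy)           # (x_{c+1}, y_{c+1})
```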
It should be noted that, in the embodiment shown in Fig. 4, when situation a occurs, i.e., a new skin color object appears, the center-point coordinate value of the skin color region corresponding to that skin color object in the next frame cannot be predicted; only after ellipse fitting has been performed on the skin color regions corresponding to that skin color object in the current frame (where it first appears) and in the next frame of the input image can the fitted-ellipse parameter of that skin color region in subsequent frames be predicted, based on the ellipse fitting results of those initial two frames.
Although the present invention has been disclosed as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and therefore the protection scope of the present invention shall be defined by the claims.

Claims (15)

1. A tracking method based on skin color detection, characterized by comprising:
performing fitted ellipse calculation on at least one first area respectively, to obtain the fitted ellipse parameters of each first area, the first area being a skin color area in a first input image;
obtaining a first parameter and a second parameter of a second area based on the relation between the distances from the pixels of the second area to each first area and a distance threshold μ, the second area being a skin color area in a second input image, the first parameter being the fitted ellipse parameters of the corresponding first area in an elliptic parameter set, and the second parameter being the fitted ellipse parameters obtained by performing fitted ellipse calculation on the second area;
tracking the second area based on the first parameter and the second parameter of the second area;
wherein,
the fitted ellipse parameters comprise the coordinate value of the center point of the fitted ellipse;
the elliptic parameter set comprises the fitted ellipse parameters of each first area;
the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of this first area in the elliptic parameter set.
2. The tracking method based on skin color detection according to claim 1, characterized in that the skin color area is obtained by a skin color detection method based on a skin color ellipse model.
3. The tracking method based on skin color detection according to claim 2, characterized by further comprising:
updating the skin color ellipse model by the formula P(s/c) = γ × P(s/c) + (1 − γ) × P_w(s/c); wherein s is the pixel value of a pixel of the input image, c is the pixel value of a skin color pixel, P(s/c) is the probability that this pixel is a skin color point, P_w(s/c) is the probability that this pixel is a skin color point obtained through the skin color ellipse model over w consecutive frames of images, and γ is a sensitivity parameter (an illustrative sketch of this update is given after the claims).
4. The tracking method based on skin color detection according to claim 1, characterized in that the fitted ellipse calculation performed on a region is determined based on computing the covariance matrix of the pixels in the region (an illustrative sketch of one such computation is given after the claims).
5. The tracking method based on skin color detection according to claim 1, characterized in that obtaining the first parameter and the second parameter of the second area based on the relation between the distances from the pixels of the second area to each first area and the distance threshold μ comprises:
if the distances from all pixels of the second area to at least one first area are all less than μ, the first parameter of the second area is the fitted ellipse parameters of the first area, among the at least one first area, that is nearest to the second area, and the second parameter of the second area is the fitted ellipse parameters obtained by performing fitted ellipse calculation on all pixels of the second area;
the first area nearest to the second area is the first area with the largest number of corresponding pixels, a pixel corresponding to a first area being a pixel of the second area whose distance to this first area is less than its distances to the other first areas.
6. The tracking method based on skin color detection according to claim 1, characterized in that obtaining the first parameter and the second parameter of the second area based on the relation between the distances from the pixels of the second area to each first area and the distance threshold μ comprises:
if the distances from some pixels of the second area to N first areas h1, h2, ..., hN are all less than μ, determining that the second area has N first parameters A1, A2, ..., AN and N second parameters B1, B2, ..., BN, wherein Aj is the fitted ellipse parameters of the first area hj, Bj is the fitted ellipse parameters obtained by performing fitted ellipse calculation on the pixels of a first set and a second set, the first set is the set of said some pixels, the second set is the set of pixels of the second area, other than said some pixels, that correspond to the first area hj, a pixel corresponding to the first area hj being a pixel whose distance to the first area hj is less than its distances to the other first areas, 1 ≤ j ≤ N, N ≥ 2.
7. The tracking method based on skin color detection according to claim 1, characterized in that the value of the distance threshold μ is a number between 1 and 2.
8. The tracking method based on skin color detection according to claim 1, characterized in that the fitted ellipse parameters further comprise the long axis length, the short axis length and the rotation angle of the fitted ellipse;
the distance between a pixel of the second area and a first area is calculated based on the formula D(p, h) = v · v, wherein v = R(θ) · ((x − x_c)/α, (y − y_c)/β)^T and R(θ) is the rotation matrix [[cos θ, −sin θ], [sin θ, cos θ]], p is a pixel of the second area, (x, y) is the coordinate value of the point p, h is the fitted ellipse corresponding to the first area, (x_c, y_c) is the coordinate value of the center point of the fitted ellipse, α is the long axis length of the fitted ellipse, β is the short axis length of the fitted ellipse, and θ is the rotation angle of the fitted ellipse (an illustrative sketch of this distance is given after the claims).
9. The tracking method based on skin color detection according to claim 1, characterized in that obtaining the first parameter and the second parameter of the second area based on the relation between the distances from the pixels of the second area to each first area and the distance threshold μ comprises:
if the distances from all pixels of the second area to every first area are all greater than μ, the first parameter of the second area is empty, and the second parameter of the second area is the fitted ellipse parameters obtained by performing fitted ellipse calculation on all pixels of the second area.
10. The tracking method based on skin color detection according to claim 1, characterized by further comprising: after all second areas of K consecutive frames have been tracked, if the distances from all pixels of all second areas of the K consecutive frames to a same first area are all greater than μ, deleting the fitted ellipse parameters of this first area from the elliptic parameter set, wherein the value of K ranges from 5 to 20 (an illustrative sketch of this pruning is given after the claims).
11. The tracking method based on skin color detection according to claim 1, characterized by further comprising: updating the fitted ellipse parameters of the first area corresponding to the second area in the elliptic parameter set to the second parameter of the second area.
12. The tracking method based on skin color detection according to claim 1, characterized by further comprising:
determining the coordinate value (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third region based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third region being the skin color area corresponding to the second area in the next frame of input image;
wherein Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second area, and (x_{c−1}, y_{c−1}) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second area.
13. A tracking device based on skin color detection, characterized by comprising:
a first acquiring unit, adapted to perform fitted ellipse calculation on at least one first area respectively, to obtain the fitted ellipse parameters of each first area, the first area being a skin color area in a first input image;
a second acquiring unit, adapted to obtain a first parameter and a second parameter of a second area based on the relation between the distances from the pixels of the second area to each first area and a distance threshold μ, the second area being a skin color area in a second input image, the first parameter being the fitted ellipse parameters of the corresponding first area in an elliptic parameter set, and the second parameter being the fitted ellipse parameters obtained by performing fitted ellipse calculation on the second area;
a tracking unit, adapted to track the second area based on the first parameter and the second parameter of the second area;
wherein,
the fitted ellipse parameters comprise the coordinate value of the center point of the fitted ellipse;
the elliptic parameter set comprises the fitted ellipse parameters of each first area;
the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of this first area in the elliptic parameter set.
14. The tracking device based on skin color detection according to claim 13, characterized by further comprising: an updating unit, adapted to update the fitted ellipse parameters of the first area corresponding to the second area in the elliptic parameter set to the second parameter of the second area.
15. The tracking device based on skin color detection according to claim 13, characterized by further comprising: a predicting unit, adapted to determine the coordinate value (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third region based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third region being the skin color area corresponding to the second area in the next frame of input image;
wherein Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second area, and (x_{c−1}, y_{c−1}) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second area.
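The sketches below are illustrative only and do not form part of the claims. First, as referenced from claim 3, a minimal Python sketch of the skin color ellipse model update P(s/c) = γ × P(s/c) + (1 − γ) × P_w(s/c); representing the probabilities as numpy arrays and the function name update_skin_model are assumptions.

```python
import numpy as np

def update_skin_model(P, P_w, gamma=0.9):
    """Blend the current skin color probability map P(s/c) with the
    probability P_w(s/c) observed through the model over the last w frames:
        P(s/c) <- gamma * P(s/c) + (1 - gamma) * P_w(s/c)
    gamma is the sensitivity parameter; the value 0.9 is an arbitrary example
    (values near 1 make the model adapt slowly)."""
    return gamma * np.asarray(P, dtype=float) + (1.0 - gamma) * np.asarray(P_w, dtype=float)
```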
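As referenced from claim 4, the fitted ellipse calculation is said to be determined from the covariance matrix of the region's pixels. One common way to realise this, sketched below under that assumption (the eigen-decomposition route and the scale factor relating eigenvalues to axis lengths are not taken from the patent), is to use the pixel mean as the center and the eigenvectors/eigenvalues of the covariance matrix for the axes and rotation angle.

```python
import numpy as np

def fit_ellipse(pixels, scale=2.0):
    """Fit an ellipse to a skin color region given as an (N, 2) array of
    (x, y) pixel coordinates. Returns (xc, yc, alpha, beta, theta): center,
    long axis length, short axis length and rotation angle."""
    pts = np.asarray(pixels, dtype=float)
    xc, yc = pts.mean(axis=0)                  # center of the fitted ellipse
    cov = np.cov(pts, rowvar=False)            # 2x2 covariance matrix of the pixels
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    alpha = scale * np.sqrt(eigvals[1])        # long axis from the largest eigenvalue
    beta = scale * np.sqrt(eigvals[0])         # short axis from the smallest eigenvalue
    major = eigvecs[:, 1]
    theta = np.arctan2(major[1], major[0])     # rotation angle of the long axis
    return xc, yc, alpha, beta, theta
```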
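As referenced from claim 8, the following sketches the normalized elliptical distance D(p, h) between a pixel p = (x, y) and the fitted ellipse h of a first area; reading the formula as a rotation applied to the axis-normalized offset follows the claim text, though the applicant's exact normalization may differ.

```python
import numpy as np

def ellipse_distance(p, xc, yc, alpha, beta, theta):
    """Distance D(p, h) between pixel p and fitted ellipse h: the offset from
    the center is normalized by the axis lengths, rotated by theta, and the
    squared norm of the result is returned. Values around 1 correspond to
    points near the ellipse boundary, which is consistent with the claimed
    threshold range of 1 to 2."""
    x, y = p
    u = np.array([(x - xc) / alpha, (y - yc) / beta])   # axis-normalized offset
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])     # rotation matrix
    v = R @ u
    return float(v @ v)                                 # D(p, h) = v · v
```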
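Finally, as referenced from claim 10, a sketch of pruning first areas that have not matched any second area for K consecutive frames; the bookkeeping via a per-area miss counter and the value K = 10 are assumptions within the claimed 5 to 20 range.

```python
def prune_stale_areas(ellipse_params, miss_counts, matched_indices, K=10):
    """Drop the fitted ellipse parameters of first areas that have gone
    unmatched for K consecutive frames. miss_counts[i] is the number of
    consecutive frames in which first area i matched no second area;
    matched_indices contains the indices matched in the current frame."""
    kept_params, kept_counts = [], []
    for i, (params, misses) in enumerate(zip(ellipse_params, miss_counts)):
        misses = 0 if i in matched_indices else misses + 1
        if misses < K:                         # keep areas seen within the last K frames
            kept_params.append(params)
            kept_counts.append(misses)
    return kept_params, kept_counts
```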
CN201310633324.4A 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection Active CN104680551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310633324.4A CN104680551B (en) 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection

Publications (2)

Publication Number Publication Date
CN104680551A true CN104680551A (en) 2015-06-03
CN104680551B CN104680551B (en) 2017-11-21

Family

ID=53315544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310633324.4A Active CN104680551B (en) 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection

Country Status (1)

Country Link
CN (1) CN104680551B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1717695A (en) * 2002-11-29 2006-01-04 索尼英国有限公司 Face detection and tracking
CN101288103A (en) * 2005-08-18 2008-10-15 高通股份有限公司 Systems, methods, and apparatus for image processing, for color classification, and for skin color detection
US20130259317A1 (en) * 2008-10-15 2013-10-03 Spinella Ip Holdings, Inc. Digital processing method and system for determination of optical flow
CN101620673A (en) * 2009-06-18 2010-01-06 北京航空航天大学 Robust face detecting and tracking method
CN101699510A (en) * 2009-09-02 2010-04-28 北京科技大学 Particle filtering-based pupil tracking method in sight tracking system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
高建坡 (GAO Jianpo) et al.: "A New Skin Color Detection Method Based on Direct Least Squares Ellipse Fitting", Signal Processing *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108021881A (en) * 2017-12-01 2018-05-11 腾讯数码(天津)有限公司 A kind of skin color segmentation method, apparatus and storage medium
CN108021881B (en) * 2017-12-01 2023-09-01 腾讯数码(天津)有限公司 Skin color segmentation method, device and storage medium
CN117095067A (en) * 2023-10-17 2023-11-21 山东虹纬纺织有限公司 Textile color difference detection method based on artificial intelligence
CN117095067B (en) * 2023-10-17 2024-02-02 山东虹纬纺织有限公司 Textile color difference detection method based on artificial intelligence

Also Published As

Publication number Publication date
CN104680551B (en) 2017-11-21

Similar Documents

Publication Publication Date Title
CN110909611B (en) Method and device for detecting attention area, readable storage medium and terminal equipment
CN103310194B (en) Pedestrian based on crown pixel gradient direction in a video shoulder detection method
CN105760849B (en) Target object behavioral data acquisition methods and device based on video
CN104835175B (en) Object detection method in a kind of nuclear environment of view-based access control model attention mechanism
CN103258332B (en) A kind of detection method of the moving target of resisting illumination variation
CN105184779A (en) Rapid-feature-pyramid-based multi-dimensioned tracking method of vehicle
CN106355602A (en) Multi-target locating and tracking video monitoring method
CN103049751A (en) Improved weighting region matching high-altitude video pedestrian recognizing method
CN103735269B (en) A kind of height measurement method followed the tracks of based on video multi-target
CN108198201A (en) A kind of multi-object tracking method, terminal device and storage medium
CN105046197A (en) Multi-template pedestrian detection method based on cluster
CN102915545A (en) OpenCV(open source computer vision library)-based video target tracking algorithm
CN106709938B (en) Based on the multi-target tracking method for improving TLD
CN110349186B (en) Large-displacement motion optical flow calculation method based on depth matching
CN109685045A (en) A kind of Moving Targets Based on Video Streams tracking and system
CN109767454A (en) Based on Space Time-frequency conspicuousness unmanned plane video moving object detection method
CN104123714B (en) A kind of generation method of optimal objective detection yardstick in people flow rate statistical
CN105320917A (en) Pedestrian detection and tracking method based on head-shoulder contour and BP neural network
CN106529441B (en) Depth motion figure Human bodys' response method based on smeared out boundary fragment
CN109146925A (en) Conspicuousness object detection method under a kind of dynamic scene
CN107808524A (en) A kind of intersection vehicle checking method based on unmanned plane
CN108805902A (en) A kind of space-time contextual target tracking of adaptive scale
CN104598914A (en) Skin color detecting method and device
CN106056078A (en) Crowd density estimation method based on multi-feature regression ensemble learning
CN104680122A (en) Tracking method and device based on skin color detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant