CN102867349A - People counting method based on elliptical ring template matching - Google Patents
- Publication number: CN102867349A
- Authority: CN (China)
- Prior art keywords: image, elliptical, moving target, centroid
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Image Processing (AREA)
- Image Analysis (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The invention provides a people counting method based on elliptical ring template matching. The method comprises the following steps: (S1) acquiring a video stream image of a monitored area as the input image; (S2) performing edge detection on the input image with the Sobel operator to obtain an edge image; (S3) obtaining a difference image from the input image by background subtraction, and binarizing the difference image to obtain the moving-target region; (S4) combining the edge image with the moving-target region by an AND operation and extracting their common part to obtain the moving-target contours; (S5) scanning the moving-target contours with several elliptical ring templates and computing an overlap ratio Co to extract the pedestrian contours and the centroid of each pedestrian contour; and (S6) tracking and counting the centroids with a nearest-neighbor matching and tracking method based on the Kalman filter to obtain the number of people. Matching the moving-target contours against several elliptical ring templates improves both the efficiency and the accuracy of people counting.
Description
Technical field
The invention belongs to the technical field of video image processing and recognition, and in particular relates to a people counting method based on elliptical ring template matching.
Background technology
In the management and operation of public places such as shopping malls, airports and railway stations, pedestrian flow is indispensable data. By counting the people entering and leaving, the operation of a public place can be monitored and organized effectively in real time, providing a safer environment and better service. Taking a shopping mall as an example, pedestrian flow is a basic and important index that is closely related to sales volume; accurate and timely pedestrian-flow figures provide reliable reference information for sales, service and logistics.
Traditional people counting relies on manual observation or on contact devices; with the arrival of the information age, an automatic people counting method has become necessary. Intelligent people counting technology is an intelligent management system built by combining computer vision and image processing: without manual intervention, pedestrians are located, tracked and counted simply by analyzing, in real time, the video sequence captured by a camera.
Chinese patent 03109626.3, entitled "Small insect automatic counting technique system", discloses an automatic counting system for small insects. However, that invention only realizes automatic counting of a specific target in a specific background environment and region. Such a counting technique for a specific target cannot satisfy the demand for counting a continuously changing pedestrian flow in the varied environments of public areas.
In view of this, it is necessary to improve the prior-art people counting methods for public areas to address the above problems.
Summary of the invention
The object of the present invention is to provide a people counting method based on elliptical ring template matching that can effectively improve the efficiency and accuracy of people counting in public areas.
To achieve the above object, the invention provides a people counting method based on elliptical ring template matching, comprising the following steps:
S1: acquiring a video stream image of the monitored area as the input image;
S2: performing edge detection on the input image with the Sobel operator to obtain an edge image;
S3: obtaining a difference image from the input image by background subtraction, and binarizing the difference image to obtain the moving-target region;
S4: combining the edge image with the moving-target region by an AND operation and extracting their common part to obtain the moving-target contour;
S5: scanning the moving-target contour with several elliptical ring templates and computing the overlap ratio Co to extract the pedestrian contours and the centroid of each pedestrian contour;
S6: tracking and counting the centroids by a nearest-neighbor matching and tracking method based on the Kalman filter to obtain the number of people.
As a further improvement of the invention, step S1 is specifically: acquiring the video stream image of the monitored area through a camera as the input image, the monitored area being located directly below the camera.
As a further improvement of the invention, step S2 is specifically: performing edge detection on the input image obtained in step S1 with a 3 × 3 Sobel operator to obtain the edge image.
As a further improvement of the invention, the background subtraction in step S3 is specifically: from the input images obtained in step S1, the first frame containing no moving object is extracted as the background image; the background difference between the current frame and the background image then yields the difference image. The background difference is computed as:

D_k(x, y) = | f_k(x, y) − B(x, y) |

where B(x, y) is the gray value of a pixel in the background image, f_k(x, y) is the gray value of the same pixel in the current frame, and D_k(x, y) is the difference image of the two.
As a further improvement of the invention, "scanning the moving-target contour with several elliptical ring templates and computing the overlap ratio Co to extract the pedestrian contours" in step S5 is specifically:
the moving-target contour obtained in step S4 is scanned with several elliptical ring templates to extract the moving-target contour pixels falling within an elliptical ring template; the overlap ratio Co between the extracted contour pixels and the template is then computed and compared with a set threshold T:
if Co ≥ T, the moving-target contour pixels are extracted as a pedestrian contour;
if Co < T, the moving-target contour pixels are not extracted.
The overlap ratio Co is computed as:

Co = InterNum / ModelNum

where ModelNum is the total number of pixels between the outer and inner perimeters of the elliptical ring template, and InterNum is the number of moving-target contour pixels falling between the outer and inner perimeters of the template.
As a further improvement of the invention, the set threshold T is 50%.
As a further improvement of the invention, extracting the centroid of a pedestrian contour in step S5 is specifically: taking the geometric center of the minimum bounding rectangle of the extracted pedestrian contour as the centroid of that contour.
As a further improvement of the invention, step S6 is specifically: from the centroid of the pedestrian contour obtained in step S5, the Kalman filter estimates the position of the centroid in the next frame; in the next frame, the centroid is tracked and counted by nearest-neighbor matching.
Compared with the prior art, the beneficial effect of the invention is that, by matching the moving-target contours against several elliptical ring templates, people can be counted in real time even in places with heavy pedestrian flow, improving both the efficiency and the accuracy of people counting.
Description of drawings
Fig. 1 is a flow diagram of an embodiment of the people counting method based on elliptical ring template matching of the present invention;
Fig. 2 is a schematic diagram of acquiring the video stream image of the monitored area shown in Fig. 1;
Fig. 3a is a schematic diagram of the Sobel operator computing the gradient in the x direction;
Fig. 3b is a schematic diagram of the Sobel operator computing the gradient in the y direction;
Fig. 4 is a schematic diagram of the convolution operation applied to the input image of the present invention;
Fig. 5 is a schematic diagram of the differently shaped moving-target contours produced by a pedestrian at different positions in the monitored area shown in Fig. 2;
Fig. 6 is a schematic diagram of an elliptical ring template of the present invention;
Fig. 7A is a schematic diagram of an elliptical ring template scanning the moving-target contour and extracting the contour pixels as a pedestrian contour;
Fig. 7B is a schematic diagram of an elliptical ring template scanning the moving-target contour without extracting the contour pixels;
Fig. 8 is a schematic diagram of extracting the centroid of a pedestrian contour according to the present invention;
Fig. 9 is a schematic diagram of the nearest-neighbor matching and tracking method of the present invention;
Fig. 10 is a schematic diagram of counting the centroids of pedestrian contours according to the present invention.
Embodiment
The present invention is described in detail below with reference to the embodiments shown in the drawings. It should be noted that these embodiments do not limit the invention; functional, methodological or structural equivalents and substitutions that a person of ordinary skill in the art derives from these embodiments all fall within the protection scope of the invention.
Referring to Fig. 1, Fig. 1 is a flow diagram of the people counting method based on elliptical ring template matching of the present invention. In the present embodiment, the method comprises the following steps:
S1: acquiring a video stream image of the monitored area as the input image.
Common approaches to people counting in public areas include methods based on motion features, on shape information, on pedestrian models, on structural elements, on stereo vision, on neural networks, and on wavelets and support vector machines.
Referring to Fig. 2, the people counting method of the present invention is based on a camera shooting vertically downward and is applicable both outdoors and indoors. In the present embodiment, step S1 is specifically: the video stream image of the monitored area 30 is acquired through the camera 10 as the input image, the monitored area 30 being located directly below the camera 10.
Specifically, the camera 10 is arranged directly above the entrance 20, where pedestrians may walk in either direction of the arrow 201. The monitored area 30 captured by the camera 10 fully covers the entire entrance 20.
In the present embodiment, the monitored area 30 is rectangular, although it may also be square, circular or of another shape. The camera 10 is located directly above the central point 301 of the monitored area 30; it follows that the monitored area 30 is directly below the camera 10.
S2: performing edge detection on the input image with the Sobel operator to obtain an edge image.
An edge is where the local brightness of an image changes most significantly, mainly between one target and another, between target and background, or between regions. Edge detection is the basic operation of detecting significant local changes in an image; significant changes in gray value can be detected with a discrete approximation of the gradient.
In conjunction with Fig. 3 a, Fig. 3 b and shown in Figure 4, the input picture of 256 grades of gray scales of a frame is made as f (x, y) at the gray-scale value of certain pixel, as follows for the Grad computing formula of this pixel:
;
Wherein,
The Grad of asking for this pixel (x, y) place,
,
For utilizing the sobel operator to calculate respectively Grad on x, y direction.Wherein Fig. 3 a is the synoptic diagram that the Sobel operator calculates this pixel (x, y) Grad in the x-direction; Fig. 3 b is the synoptic diagram that the Sobel operator calculates this pixel (x, y) Grad in the y-direction.
,
The gray level that represents respectively sobel operator and Image neighborhood is as shown in Figure 4 done convolution algorithm, among Fig. 4
(i=1,2 ...., 9) represent the gray-scale value of the pixel around these pixel (x, y) eight neighborhoods,
And
Shown in being calculated as follows with formula:
Be that image is located the Grad on x, the y direction is respectively at pixel (x, y):
The Sobel operator is one of operator during image is processed, mainly as rim detection.Technically, it is a discreteness difference operator, is used for the gradient approximate value of arithmograph image brightness function.The matrix that this operator inclusion is two group 3 * 3, be respectively be used to the Grad of asking pixel on x, y direction, it and each the pixel neighborhood of a point gray level in as shown in Figure 4 the input picture are done convolution and computing, then choose suitable threshold values K to extract edge image.
Concrete, the computing formula of this convolution and computing is as follows,
Wherein, threshold k is 200.
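The Sobel edge-extraction step above can be sketched in Python as follows. This is a minimal illustration, not the patent's implementation: the function name, the use of NumPy, and the |G_x| + |G_y| magnitude approximation are assumptions; the threshold K = 200 follows the description.

```python
import numpy as np

def sobel_edges(img, k=200):
    """Apply the two 3x3 Sobel kernels to each interior pixel and mark
    an edge where |Gx| + |Gy| reaches the threshold k (200 here).

    img: 2D array of gray levels; returns a binary edge map (1 = edge).
    """
    gx_kernel = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    gy_kernel = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])
    h, w = img.shape
    edges = np.zeros((h, w), dtype=np.uint8)
    img = img.astype(np.int32)  # avoid uint8 overflow in the products
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(gx_kernel * patch)  # gradient in the x direction
            gy = np.sum(gy_kernel * patch)  # gradient in the y direction
            if abs(gx) + abs(gy) >= k:
                edges[y, x] = 1
    return edges
```

In practice a vectorized library routine (e.g. OpenCV's `cv2.Sobel`) would replace the explicit loops; the loop form is kept here to mirror the per-pixel convolution of Figs. 3a, 3b and 4.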
S3: obtaining a difference image from the input image by background subtraction, and binarizing the difference image to obtain the moving-target region.
In the present embodiment, the background subtraction is specifically: the first frame containing no moving object is extracted as the background image, and the background difference between the current frame and the background image yields the difference image. The background difference is computed as:

D_k(x, y) = | f_k(x, y) − B(x, y) |

where B(x, y) is the gray value of a pixel in the background image, f_k(x, y) is the gray value of the same pixel in the current frame, and D_k(x, y) is the difference image of the two.
The difference image is then binarized according to:

R_k(x, y) = 1 if D_k(x, y) ≥ M, otherwise 0

where D_k(x, y) is the difference image, R_k(x, y) is the binary image obtained after the difference operation, and M is the partition threshold, here M = 40.
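The background difference and binarization above can be sketched as follows, assuming NumPy grayscale frames; the function name and the use of ≥ (rather than >) at the threshold are illustrative assumptions, while M = 40 follows the description.

```python
import numpy as np

def moving_region(frame, background, m=40):
    """Background difference D(x, y) = |f(x, y) - B(x, y)| followed by
    binarization with partition threshold M (M = 40 in the description).
    Returns 1 where a pixel belongs to the moving-target region, else 0."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return (diff >= m).astype(np.uint8)
```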
S4: combining the edge image with the moving-target region by an AND operation and extracting their common part to obtain the moving-target contour.
The AND operation is a logical multiplication: its result is 1 only when all participating logical variables are 1 at the same time.
In the present embodiment, the edge image obtained in step S2 contains both background edges and the edges of moving objects, while the moving-target region obtained in step S3 contains only the moving-target region, without the background. ANDing the edge image with the moving-target region extracts the common part of the two images, yielding the moving-target contour.
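The AND combination of step S4 amounts to the following sketch, assuming both inputs are binary NumPy arrays (the function name is illustrative):

```python
import numpy as np

def moving_contour(edge_img, motion_mask):
    """Logical AND of the binary edge image and the binary moving-target
    region: a pixel survives only where both inputs are 1, which keeps the
    contour pixels of the moving target and discards background edges."""
    return np.logical_and(edge_img, motion_mask).astype(np.uint8)
```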
S5: scanning the moving-target contour with several elliptical ring templates and computing the overlap ratio Co to extract the pedestrian contours and the centroid of each pedestrian contour.
Referring to Figs. 5 and 6, viewed from directly above, a pedestrian contour 50 in the monitored area 30 differs in shape from position to position but remains approximately elliptical. Exploiting this characteristic, in the present embodiment several elliptical ring templates 60 of different sizes and orientations can be placed in different configurations in the monitored area 30 to search for moving pedestrians.
In the present embodiment, the moving-target contour obtained in step S4 comprises both human contours and non-human contours (not shown).
Referring to Figs. 7A and 7B, after the moving-target contour is obtained, it is scanned with several elliptical ring templates 60 to extract the moving-target contour pixels 101 falling within a template 60; the overlap ratio Co between the extracted contour pixels 101 and the template 60 is then computed and compared with a set threshold T. Preferably, T is 50%.
As shown in Fig. 7A, if Co ≥ T, the moving-target contour pixels are extracted as a pedestrian contour 50.
As shown in Fig. 7B, if Co < T, the moving-target contour pixels are not extracted; that is, the moving-target contour contained in the moving-target region is regarded as a non-human contour 501 in the monitored area 30.
In the present embodiment, the overlap ratio Co is computed as:

Co = InterNum / ModelNum

where Co is the overlap ratio, ModelNum is the total number of pixels between the outer perimeter 701 and the inner perimeter 702 of the elliptical ring template 60, and InterNum is the number of moving-target contour pixels 101 falling between the outer perimeter 701 and the inner perimeter 702 of the template 60.
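A sketch of the overlap-ratio computation, assuming the elliptical ring is rasterized as the pixels between an outer and an inner ellipse; the function names and semi-axis parameters are illustrative assumptions, while Co = InterNum / ModelNum follows the description.

```python
import numpy as np

def ellipse_ring_mask(h, w, cy, cx, a_out, b_out, a_in, b_in):
    """Boolean mask of the pixels inside the outer ellipse (semi-axes
    a_out, b_out) but outside the inner ellipse (a_in, b_in), both centred
    at (cy, cx): the elliptical ring template."""
    yy, xx = np.mgrid[0:h, 0:w]
    outer = ((xx - cx) / a_out) ** 2 + ((yy - cy) / b_out) ** 2 <= 1.0
    inner = ((xx - cx) / a_in) ** 2 + ((yy - cy) / b_in) ** 2 <= 1.0
    return outer & ~inner

def overlap_ratio(contour_img, ring):
    """Co = InterNum / ModelNum: ModelNum is the pixel count of the ring,
    InterNum the count of contour pixels that fall inside the ring."""
    model_num = int(ring.sum())
    inter_num = int((contour_img.astype(bool) & ring).sum())
    return inter_num / model_num if model_num else 0.0
```

A contour is accepted as a pedestrian contour when `overlap_ratio(...) >= 0.5`, matching the preferred threshold T = 50%.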
Referring to Fig. 8, once the pedestrian contour 50 has been fully extracted, its centroid must be chosen to enable Kalman-filter tracking. Specifically, the geometric center of the minimum bounding rectangle 80 of the extracted pedestrian contour 50 is taken as the centroid of that contour 50.
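The centroid choice above amounts to the following sketch (illustrative names; contour pixels are assumed to carry value 1 in a NumPy array):

```python
import numpy as np

def contour_centroid(contour_img):
    """Centroid as the geometric centre of the minimum bounding rectangle
    of the contour pixels (value 1), per step S5. Returns (row, col),
    or None if the image contains no contour pixel."""
    ys, xs = np.nonzero(contour_img)
    if ys.size == 0:
        return None
    return ((ys.min() + ys.max()) / 2.0, (xs.min() + xs.max()) / 2.0)
```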
S6: tracking and counting the centroids by a nearest-neighbor matching and tracking method based on the Kalman filter to obtain the number of people.
Kalman filtering is a recursive estimation: knowing only the state estimate of the previous moment and the observation of the current state, the estimate of the current state can be computed, so no history of observations or estimates is needed. The Kalman filter is the concrete implementation of Kalman filtering.
The operation of the Kalman filter comprises two phases: prediction and update. In the prediction phase, the filter uses the estimate of the previous state to produce an estimate of the current state. In the update phase, the filter uses the current observation to optimize the prediction obtained in the prediction phase, yielding a more accurate new estimate.
In the present embodiment, the nearest-neighbor matching and tracking method based on the Kalman filter is implemented as follows.
Suppose the current frame is frame k. The position of the feature point in frame k is predicted from frame k−1 by:

X(k | k−1) = A · X(k−1 | k−1) + B · U(k)   (1)

In formula (1), X(k | k−1) is the prediction made from frame k−1, X(k−1 | k−1) is the optimal position of the centroid in frame k−1, U(k) is the control input, and A and B are system parameters.
The feature point detected in frame k is the centroid Z(k) of the pedestrian contour 50 described above. The optimal estimation method then combines the centroid position predicted from frame k−1 with the centroid detected in frame k to compute the optimal position of the centroid in the current frame:

X(k | k) = X(k | k−1) + Kg(k) · ( Z(k) − H · X(k | k−1) )   (2)

In formula (2), X(k | k) is the optimal position of the centroid in frame k and Z(k) is the position of the centroid detected in frame k. Kg(k) is the Kalman gain, computed as in formula (3):

Kg(k) = P(k | k−1) · Hᵀ / ( H · P(k | k−1) · Hᵀ + R )   (3)

where H is the observation matrix, P(k | k−1) is the covariance of the prediction, and R is the covariance of the measurement noise.
In the present embodiment, from the centroid of the pedestrian contour 50 obtained in step S5, the Kalman filter estimates the position of the centroid in the next frame; in the next frame, the centroid is then tracked and counted by nearest-neighbor matching.
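Formulas (1) to (3) can be sketched as a minimal Kalman filter for one centroid. The constant-velocity state model, the observation matrix H, and the noise covariances Q and R below are illustrative assumptions not specified in the patent; only the predict/update structure follows formulas (1) to (3).

```python
import numpy as np

class CentroidKalman:
    """Minimal constant-velocity Kalman filter for one centroid.

    State X = (x, y, vx, vy); observation Z = (x, y). A is the system
    matrix of formula (1) (no control input U is used here); the gain Kg
    and the update follow formulas (2) and (3)."""

    def __init__(self, x, y, q=1e-2, r=1.0):
        self.X = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4)  # prediction covariance P
        self.A = np.array([[1, 0, 1, 0],
                           [0, 1, 0, 1],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)  # assumed process noise covariance
        self.R = r * np.eye(2)  # assumed measurement noise covariance

    def predict(self):
        # formula (1): X(k|k-1) = A X(k-1|k-1)
        self.X = self.A @ self.X
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.X[:2]

    def update(self, z):
        # formula (3): Kg(k) = P(k|k-1) H^T (H P(k|k-1) H^T + R)^-1
        S = self.H @ self.P @ self.H.T + self.R
        Kg = self.P @ self.H.T @ np.linalg.inv(S)
        # formula (2): X(k|k) = X(k|k-1) + Kg (Z(k) - H X(k|k-1))
        self.X = self.X + Kg @ (z - self.H @ self.X)
        self.P = (np.eye(4) - Kg @ self.H) @ self.P
        return self.X[:2]
```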
The matching process for point 9B in Fig. 9 is as follows. First the Kalman filter estimates the position of 9A in the next frame, denoted 9A′ and shown as a circle in Fig. 9. The Euclidean distance d_BA′ between point 9B and 9A′ is then computed. If d_BA′ ≤ a set threshold th, points 9B and 9A′ match successfully, i.e. the solid line 91 and the dashed line 92 are the positions of the same pedestrian in different frames, and the Kalman filter state is updated. If d_BA′ > th, the match between 9B and 9A′ fails, i.e. lines 91 and 92 belong to different pedestrians, and the Kalman filter state is not updated. Preferably, the threshold th is 15.
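The nearest-neighbor matching step can be sketched as follows (illustrative names; the distance threshold th = 15 follows the description):

```python
import math

def match_centroid(predicted, detections, th=15.0):
    """Pair a Kalman-predicted position with the closest detected centroid
    if their Euclidean distance is at most th; otherwise return None,
    meaning the match failed and the filter state is not updated."""
    best, best_d = None, float("inf")
    for det in detections:
        d = math.dist(predicted, det)
        if d < best_d:
            best, best_d = det, d
    if best is not None and best_d <= th:
        return best
    return None
```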
Referring to Fig. 10, in the present embodiment, after the nearest-neighbor matching and tracking based on the Kalman filter, the centroids are further tracked and counted.
As shown in Fig. 10, the rectangle is the monitored area 30 of step S1; the two lines 10A and 10B are the in-counting line and the out-counting line respectively, and the middle section 10C between them is the tracking area. A flag Lable marks the position at which the centroid of a pedestrian contour 50 is first detected: if, before the pedestrian enters the tracking area 10C, the row coordinate of the first-detected centroid is greater than or equal to the row coordinate of the out-counting line 10B, Lable is set to 1; if it is less than or equal to the row coordinate of the in-counting line 10A, Lable is set to 0.
If the centroid of the pedestrian contour 50 leaves the tracking area 10C across the out-counting line 10B and Lable is 0 (i.e. the pedestrian entered the tracking area 10C across the in-counting line 10A and left it across the out-counting line 10B), the outgoing count increases by 1. If the centroid leaves the tracking area 10C across the in-counting line 10A and Lable is 1 (i.e. the pedestrian entered across the out-counting line 10B and left across the in-counting line 10A), the incoming count increases by 1.
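The Lable marking and in/out counting rules can be sketched as follows, assuming row coordinates increase from the in-counting line 10A (row y_in) toward the out-counting line 10B (row y_out); the function names and the handling of a centroid first seen inside the tracking area are illustrative assumptions.

```python
def label_entry(first_row, y_in, y_out):
    """Lable marks where a pedestrian centroid is first detected: 1 if at
    or beyond the out-counting line 10B, 0 if at or before the in-counting
    line 10A, None if first seen inside the tracking area (ambiguous)."""
    if first_row >= y_out:
        return 1
    if first_row <= y_in:
        return 0
    return None

def count_exit(label, exit_row, y_in, y_out, counts):
    """counts = {'in': .., 'out': ..}. A track labelled 0 (entered across
    10A) that leaves across 10B increments the outgoing count; a track
    labelled 1 (entered across 10B) that leaves across 10A increments the
    incoming count."""
    if label == 0 and exit_row >= y_out:
        counts['out'] += 1
    elif label == 1 and exit_row <= y_in:
        counts['in'] += 1
    return counts
```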
The detailed description above merely illustrates feasible embodiments of the invention and does not limit its protection scope; any equivalent embodiment or modification that does not depart from the spirit of the invention falls within its protection scope.
It is obvious to those skilled in the art that the invention is not limited to the details of the above exemplary embodiments and can be realized in other concrete forms without departing from its spirit or essential characteristics. The embodiments should therefore be regarded in every respect as exemplary and non-restrictive; the scope of the invention is defined by the appended claims rather than by the above description, and all changes falling within the meaning and range of equivalency of the claims are intended to be embraced therein. No reference numeral in the claims shall be construed as limiting the claim concerned.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only one independent technical solution; this manner of narration is adopted merely for clarity. Those skilled in the art should take the specification as a whole; the technical solutions of the embodiments may also be suitably combined to form other embodiments understandable to those skilled in the art.
Claims (8)
1. A people counting method based on elliptical ring template matching, characterized in that the method comprises the following steps:
S1: acquiring a video stream image of the monitored area as the input image;
S2: performing edge detection on the input image with the Sobel operator to obtain an edge image;
S3: obtaining a difference image from the input image by background subtraction, and binarizing the difference image to obtain the moving-target region;
S4: combining the edge image with the moving-target region by an AND operation and extracting their common part to obtain the moving-target contour;
S5: scanning the moving-target contour with several elliptical ring templates and computing the overlap ratio Co to extract the pedestrian contours and the centroid of each pedestrian contour;
S6: tracking and counting the centroids by a nearest-neighbor matching and tracking method based on the Kalman filter to obtain the number of people.
2. The people counting method based on elliptical ring template matching according to claim 1, characterized in that step S1 is specifically: acquiring the video stream image of the monitored area through a camera as the input image, the monitored area being located directly below the camera.
3. The people counting method based on elliptical ring template matching according to claim 1, characterized in that step S2 is specifically: performing edge detection on the input image obtained in step S1 with a 3 × 3 Sobel operator to obtain the edge image.
4. The people counting method based on elliptical ring template matching according to claim 1, characterized in that the background subtraction in step S3 is specifically: from the input images obtained in step S1, extracting the first frame containing no moving object as the background image, then computing the background difference between the current frame and the background image to obtain the difference image, the background difference being computed as:
D_k(x, y) = | f_k(x, y) − B(x, y) |
5. The people counting method based on elliptical ring template matching according to claim 1, characterized in that "scanning the moving-target contour with several elliptical ring templates and computing the overlap ratio Co to extract the pedestrian contours" in step S5 is specifically:
scanning the moving-target contour obtained in step S4 with several elliptical ring templates to extract the moving-target contour pixels falling within an elliptical ring template, then computing the overlap ratio Co between the extracted contour pixels and the template and comparing Co with a set threshold T:
if Co ≥ T, extracting the moving-target contour pixels as a pedestrian contour;
if Co < T, not extracting the moving-target contour pixels;
the overlap ratio Co being computed as:
Co = InterNum / ModelNum
where ModelNum is the total number of pixels between the outer and inner perimeters of the elliptical ring template, and InterNum is the number of moving-target contour pixels falling between the outer and inner perimeters of the template.
6. The people counting method based on elliptical ring template matching according to claim 5, characterized in that the set threshold T is 50%.
7. The people counting method based on elliptical ring template matching according to claim 1, characterized in that extracting the centroid of a pedestrian contour in step S5 is specifically: taking the geometric center of the minimum bounding rectangle of the extracted pedestrian contour as the centroid of that contour.
8. The people counting method based on elliptical ring template matching according to claim 1 or 7, characterized in that step S6 is specifically: from the centroid of the pedestrian contour obtained in step S5, estimating with the Kalman filter the position of the centroid in the next frame, and in the next frame tracking and counting the centroid by nearest-neighbor matching.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210295833.6A CN102867349B (en) | 2012-08-20 | 2012-08-20 | People counting method based on elliptical ring template matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102867349A (en) | 2013-01-09 |
CN102867349B (en) | 2015-03-25 |
Family
ID=47446203
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210295833.6A Active CN102867349B (en) | 2012-08-20 | 2012-08-20 | People counting method based on elliptical ring template matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102867349B (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103559481A (en) * | 2013-11-05 | 2014-02-05 | 无锡慧眼电子科技有限公司 | People counting method under complex environment |
CN104091198A (en) * | 2014-06-27 | 2014-10-08 | 无锡慧眼电子科技有限公司 | Pedestrian flow statistic method based on ViBe |
CN104899559A (en) * | 2015-05-25 | 2015-09-09 | 江苏大学 | Rapid pedestrian detection method based on video monitoring |
CN105354610A (en) * | 2014-08-18 | 2016-02-24 | 无锡慧眼电子科技有限公司 | Random Hough transform-based people counting method |
CN105740862A (en) * | 2014-10-27 | 2016-07-06 | 江苏慧眼数据科技股份有限公司 | Pedestrian contour detection method based on macro feature point description |
CN105844649A (en) * | 2016-04-12 | 2016-08-10 | 中国科学院长春光学精密机械与物理研究所 | Statistical method, apparatus and system for the quantity of people |
CN106951820A (en) * | 2016-08-31 | 2017-07-14 | 江苏慧眼数据科技股份有限公司 | Passenger flow statistical method based on annular template and ellipse fitting |
CN107123126A (en) * | 2017-03-29 | 2017-09-01 | 天棣网络科技(上海)有限公司 | Crowd flow scene heat estimation method |
CN107180420A (en) * | 2016-03-09 | 2017-09-19 | 顺丰科技有限公司 | Article delivery assistance method |
CN107730526A (en) * | 2017-09-25 | 2018-02-23 | 中国科学院声学研究所 | Method for counting the number of fish in a school |
CN108091025A (en) * | 2018-01-18 | 2018-05-29 | 吴静 | Intelligent door system for identity verification |
CN109241952A (en) * | 2018-10-26 | 2019-01-18 | 北京陌上花科技有限公司 | Person counting method and device in crowded scenes |
CN117132948A (en) * | 2023-10-27 | 2023-11-28 | 南昌理工学院 | Scenic spot tourist flow monitoring method, system, readable storage medium and computer |
2012-08-20: CN application CN201210295833.6A granted as patent CN102867349B (status: Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101540892A (en) * | 2009-04-23 | 2009-09-23 | 上海中安电子信息科技有限公司 | Method for people counting at a doorway on a DSP video acquisition device |
CN101695983A (en) * | 2009-10-23 | 2010-04-21 | 浙江工业大学 | Omnidirectional-computer-vision-based energy-saving and safety monitoring system for escalators |
CN102063613A (en) * | 2010-12-28 | 2011-05-18 | 北京智安邦科技有限公司 | People counting method and device based on head recognition |
Non-Patent Citations (3)
Title |
---|
STAN BIRCHFIELD: "Elliptical Head Tracking Using Intensity Gradients and Color Histograms", Proc. of IEEE Conference on Computer Vision and Pattern Recognition * |
甘世民 (Gan Shimin): "Research and Application of Image-Based Moving Human Body Extraction", China Master's and Doctoral Theses Full-text Database (Master's), Information Science and Technology Series * |
翁传博 (Weng Chuanbo): "Research on Video-Based Intelligent Scene Analysis and Human-Computer Interaction Technology", China Master's Theses Full-text Database, Information Science and Technology Series * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103559481B (en) * | 2013-11-05 | 2017-02-08 | 江苏慧眼数据科技股份有限公司 | People counting method under complex environment |
CN103559481A (en) * | 2013-11-05 | 2014-02-05 | 无锡慧眼电子科技有限公司 | People counting method under complex environment |
CN104091198A (en) * | 2014-06-27 | 2014-10-08 | 无锡慧眼电子科技有限公司 | Pedestrian flow statistical method based on ViBe |
CN105354610A (en) * | 2014-08-18 | 2016-02-24 | 无锡慧眼电子科技有限公司 | Random Hough transform-based people counting method |
CN105740862A (en) * | 2014-10-27 | 2016-07-06 | 江苏慧眼数据科技股份有限公司 | Pedestrian contour detection method based on macro feature point description |
CN104899559B (en) * | 2015-05-25 | 2019-08-16 | 江苏大学 | Rapid pedestrian detection method based on video monitoring |
CN104899559A (en) * | 2015-05-25 | 2015-09-09 | 江苏大学 | Rapid pedestrian detection method based on video monitoring |
CN107180420A (en) * | 2016-03-09 | 2017-09-19 | 顺丰科技有限公司 | Article delivery assistance method |
CN105844649A (en) * | 2016-04-12 | 2016-08-10 | 中国科学院长春光学精密机械与物理研究所 | Statistical method, apparatus and system for the quantity of people |
CN106951820A (en) * | 2016-08-31 | 2017-07-14 | 江苏慧眼数据科技股份有限公司 | Passenger flow statistical method based on annular template and ellipse fitting |
CN106951820B (en) * | 2016-08-31 | 2019-12-13 | 江苏慧眼数据科技股份有限公司 | Passenger flow statistical method based on annular template and ellipse fitting |
CN107123126A (en) * | 2017-03-29 | 2017-09-01 | 天棣网络科技(上海)有限公司 | Crowd flow scene heat estimation method |
CN107730526A (en) * | 2017-09-25 | 2018-02-23 | 中国科学院声学研究所 | Method for counting the number of fish in a school |
CN108091025A (en) * | 2018-01-18 | 2018-05-29 | 吴静 | Intelligent door system for identity verification |
CN109241952A (en) * | 2018-10-26 | 2019-01-18 | 北京陌上花科技有限公司 | Person counting method and device in crowded scenes |
CN109241952B (en) * | 2018-10-26 | 2021-09-07 | 北京陌上花科技有限公司 | Person counting method and device in crowded scenes |
CN117132948A (en) * | 2023-10-27 | 2023-11-28 | 南昌理工学院 | Scenic spot tourist flow monitoring method, system, readable storage medium and computer |
CN117132948B (en) * | 2023-10-27 | 2024-01-30 | 南昌理工学院 | Scenic spot tourist flow monitoring method, system, readable storage medium and computer |
Also Published As
Publication number | Publication date |
---|---|
CN102867349B (en) | 2015-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102867349B (en) | People counting method based on elliptical ring template matching | |
CN102867177B (en) | People counting method based on image grayscale matching | |
CN110688987B (en) | Pedestrian position detection and tracking method and system | |
CN102542289B (en) | Pedestrian volume statistical method based on multiple Gaussian counting models | |
WO2019228063A1 (en) | Product inspection terminal, method and system, computer apparatus and readable medium | |
Islam et al. | Solid waste bin detection and classification using Dynamic Time Warping and MLP classifier | |
CN102103753B (en) | Method and terminal for detecting and tracking moving objects using real-time camera estimation | |
CN104061907B (en) | Variable-view gait recognition method based on coupled synthesis of 3D gait contours | |
CN107633226B (en) | Human body motion tracking feature processing method | |
CN101777185B (en) | Target tracking method for modeling by integrating description method and discriminant method | |
CN104601964A (en) | Indoor pedestrian target tracking method and system across cameras with non-overlapping fields of view | |
CN102496001A (en) | Automatic object detection method and system for video surveillance | |
CN102855466B (en) | People counting method based on computer vision | |
JP6789876B2 (en) | Devices, programs and methods for tracking objects using pixel change processed images | |
CN104517095A (en) | Head segmentation method based on depth images | |
CN102542571A (en) | Moving target detecting method and device | |
CN111886600A (en) | Device and method for instance-level segmentation of images | |
CN105321187A (en) | Pedestrian counting method based on head detection | |
CN105718841A (en) | Pedestrian counting method with a dynamically updated pedestrian classifier | |
CN106934332A (en) | Multiple target tracking method | |
CN103886324B (en) | Scale adaptive target tracking method based on log likelihood image | |
CN106529434B (en) | Fish school individual target tracking based on a visual attention model | |
CN103559481A (en) | People counting method under complex environment | |
CN105426928B (en) | Pedestrian detection method based on Haar and EOH features | |
CN105654090A (en) | Pedestrian contour detection method based on curve fluctuation description |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C56 | Change in the name or address of the patentee | ||
CP03 | Change of name, title or address |
Address after: A1501-A1509, Tsinghua Innovation Building, No. 1 Wisdom Road, Huishan Economic Development Zone, Wuxi, Jiangsu 214174
Patentee after: ABD SMART EYE ELECTRONICS CO., LTD.
Address before: 15F, Block A, Tsinghua Innovation Building, Jiangsu Digital Information Industry Park, No. 1 Wisdom Road, Huishan Economic Development Zone, Wuxi, Jiangsu 214174
Patentee before: Wuxi Eye Technology Co., Ltd.