CN102663776B - Violent movement detection method based on characteristic point analysis and device thereof - Google Patents

Violent movement detection method based on characteristic point analysis and device thereof

Info

Publication number
CN102663776B
CN102663776B (application CN201210094638.7A)
Authority
CN
China
Prior art keywords
violent movement
image
characteristic point
coefficient
key area
Prior art date
Legal status
Active
Application number
CN201210094638.7A
Other languages
Chinese (zh)
Other versions
CN102663776A (en)
Inventor
游磊
Current Assignee
Netposa Technologies Ltd
Original Assignee
Beijing Zanb Science & Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zanb Science & Technology Co Ltd
Priority to CN201210094638.7A
Publication of CN102663776A
Application granted
Publication of CN102663776B
Legal status: Active


Landscapes

  • Image Analysis (AREA)

Abstract

The present invention provides a violent movement detection method based on characteristic point analysis and a device thereof. The method comprises the following steps: establishing and updating a background image, and extracting a foreground image of the present frame image according to the background image; acquiring a gradient image of the foreground image, and extracting characteristic points according to the gradient image; tracking the characteristic points and acquiring the offset of each characteristic point; filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating a violent movement coefficient of the key area; and judging whether violent movement exists.

Description

Method and device for detecting violent movement based on characteristic point analysis
Technical field
The present invention relates to image processing and video surveillance, and in particular to a violent movement detection method and device.
Background art
To maintain social order and protect people's lives, video surveillance equipment has been deployed successively throughout large and medium-sized cities in China. Depending on where it is installed, this equipment must provide different detection capabilities. In important public places such as railway stations, airports, supermarkets, commercial districts and sports grounds, violent movement detection is particularly important, as it can effectively reduce the probability of dangerous incidents.
International Patent Application WO 2007/064559A1 discloses a method for detecting abnormal crowd behaviour: it first detects crowd blobs in the scene and then analyzes crowd behaviour (such as fighting) by computing the entropy of each blob. In complex real-world scenes, however, this method cannot detect abnormal crowd behaviour accurately and therefore cannot be applied widely.
In summary, there is an urgent need for a method and device for detecting violent movement.
Summary of the invention
In view of this, the primary purpose of the present invention is to realize violent movement detection.
To achieve the above object, according to a first aspect of the present invention, a violent movement detection method based on characteristic point analysis is provided, the method comprising:
a first step of establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a second step of obtaining the gradient image of the foreground image, and extracting characteristic points according to the gradient image;
a third step of tracking the characteristic points and obtaining the offset of each characteristic point;
a fourth step of filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a fifth step of judging whether violent movement exists.
Wherein, preferably, the first step comprises: letting I_k denote the k-th frame image (k is an integer) and B_k denote the k-th frame background image, where the initial value of the background image is B_0 = I_0; the background image is updated according to the following formula:
B_k(x, y) = B_{k-1}(x, y) + sgn(I_k(x, y) - B_{k-1}(x, y))
where x and y respectively denote the horizontal and vertical coordinates of a pixel;
the current frame image is then differenced against the background image to obtain the foreground image of the current frame image.
Preferably, the second step comprises: calculating the gradient image of the foreground image; calculating the autocorrelation matrix from the gradient image; and calculating the eigenvalues of the autocorrelation matrix, the points with large eigenvalues being selected as characteristic points.
Preferably, the third step further comprises:
a) assuming that the offset of a characteristic point in the next frame image is dis, with an initial value of 0;
b) updating dis according to the following formula: dis = H^(-1) Σ_x [∇I]^T [T(x + dis) - I(x)], where ∇I = [I_x, I_y], H = Σ_x ∇I^T ∇I, I_x denotes the gradient of the image in the x direction, I_y denotes the gradient of the image in the y direction, T(x) denotes the gray value of the next frame image, and I(x) denotes the gray value of the current frame image;
c) when |dis| ≤ ε is satisfied, where ε = 0.01, proceeding to the fourth step; otherwise continuing to execute step b).
Preferably, the fourth step further comprises: filtering out characteristic points whose offset is less than or equal to 1; classifying the characteristic points with a clustering method, and taking the class containing the most characteristic points as the key area; and calculating the violent movement coefficient of the key area.
Preferably, the clustering method adopts the MeanShift clustering method, whose steps are as follows:
1) initialization: set the initial class, let i = 0, set the class radius r, and determine the basic MeanShift offset formula; go to step 2);
2) empty the sample set S and choose an arbitrary point x that has not yet been assigned to a class; if such a point exists, generate S from x and r and go to step 3); otherwise go to step 8);
3) for all the points in S, calculate the offset Mh with the offset formula; go to step 4);
4) if the offset Mh is small enough, go to step 5); otherwise drift to the new point x = x + Mh, regenerate S from x and r, and go to step 3);
5) put all the points of S into the current class, thereby generating a new class; calculate the Euclidean distance d(·,·) between class centers; if this distance is small enough, go to step 6); otherwise go to step 7);
6) merge the two classes; go to step 2);
7) let i = i + 1; go to step 2);
8) output the classes as the classification result.
Here r ∈ [5, 10].
Preferably, the step of calculating the violent movement coefficient of the key area is as follows: extract the motion direction of each characteristic point in the key area, project the motion directions onto 8 directions to form an 8-bin histogram, and calculate the mean of the variance of this histogram as the violent movement coefficient of this key area.
Preferably, the fifth step further comprises: when the violent movement coefficient ∈ [th1, th2), starting to accumulate the time t1 of this violent movement; when the violent movement coefficient is no longer in [th1, th2), ending this accumulation; when the violent movement coefficient again falls within [th1, th2), starting to accumulate the time t2 of this violent movement; and calculating the difference between the two consecutive accumulated times: if |t1 - t2| ≤ Δt, violent movement is deemed to exist, otherwise violent movement is deemed not to exist. Here th1 = 0, th2 ∈ [0.02, 0.08], and Δt ∈ [30, 80] is an integer.
According to another aspect of the present invention, a violent movement detection device based on characteristic point analysis is provided, the device comprising:
a background image establishing/updating and foreground image acquisition module, for establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a characteristic point extraction module, for obtaining the gradient image of the foreground image and extracting characteristic points according to the gradient image;
a characteristic point offset acquisition module, for tracking the characteristic points and obtaining the offset of each characteristic point;
a violent movement coefficient acquisition module, for filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a violent movement judging module, for judging whether violent movement exists.
Wherein, preferably, the background image establishing/updating and foreground image acquisition module is configured to: let I_k denote the k-th frame image (k is an integer) and B_k denote the k-th frame background image, where the initial value of the background image is B_0 = I_0; update the background image according to the following formula:
B_k(x, y) = B_{k-1}(x, y) + sgn(I_k(x, y) - B_{k-1}(x, y))
where x and y respectively denote the horizontal and vertical coordinates of a pixel;
and difference the current frame image against the background image to obtain the foreground image of the current frame image.
Preferably, the characteristic point extraction module is configured to: calculate the gradient image of the foreground image; calculate the autocorrelation matrix from the gradient image; and calculate the eigenvalues of the autocorrelation matrix, the points with large eigenvalues being selected as characteristic points.
Preferably, the characteristic point offset acquisition module is configured to:
a) assume that the offset of a characteristic point in the next frame image is dis, with an initial value of 0;
b) update dis according to the following formula: dis = H^(-1) Σ_x [∇I]^T [T(x + dis) - I(x)], where ∇I = [I_x, I_y], H = Σ_x ∇I^T ∇I, I_x denotes the gradient of the image in the x direction, I_y denotes the gradient of the image in the y direction, T(x) denotes the gray value of the next frame image, and I(x) denotes the gray value of the current frame image;
c) when |dis| ≤ ε is satisfied, where ε = 0.01, proceed to the fourth step; otherwise continue to execute step b).
Preferably, the violent movement coefficient acquisition module is configured to: filter out characteristic points whose offset is less than or equal to 1; classify the characteristic points with a clustering method, taking the class containing the most characteristic points as the key area; and calculate the violent movement coefficient of the key area.
Preferably, the clustering method adopts the MeanShift clustering method, whose steps are as follows:
1) initialization: set the initial class, let i = 0, set the class radius r, and determine the basic MeanShift offset formula; go to step 2);
2) empty the sample set S and choose an arbitrary point x that has not yet been assigned to a class; if such a point exists, generate S from x and r and go to step 3); otherwise go to step 8);
3) for all the points in S, calculate the offset Mh with the offset formula; go to step 4);
4) if the offset Mh is small enough, go to step 5); otherwise drift to the new point x = x + Mh, regenerate S from x and r, and go to step 3);
5) put all the points of S into the current class, thereby generating a new class; calculate the Euclidean distance d(·,·) between class centers; if this distance is small enough, go to step 6); otherwise go to step 7);
6) merge the two classes; go to step 2);
7) let i = i + 1; go to step 2);
8) output the classes as the classification result.
Here r ∈ [5, 10].
Preferably, the step of calculating the violent movement coefficient of the key area is as follows: extract the motion direction of each characteristic point in the key area, project the motion directions onto 8 directions to form an 8-bin histogram, and calculate the mean of the variance of this histogram as the violent movement coefficient of this key area.
Preferably, the violent movement judging module is configured to: when the violent movement coefficient ∈ [th1, th2), start accumulating the time t1 of this violent movement; when the violent movement coefficient is no longer in [th1, th2), end this accumulation; when the violent movement coefficient again falls within [th1, th2), start accumulating the time t2 of this violent movement; and calculate the difference between the two consecutive accumulated times: if |t1 - t2| ≤ Δt, violent movement is deemed to exist, otherwise violent movement is deemed not to exist. Here th1 = 0, th2 ∈ [0.02, 0.08], and Δt ∈ [30, 80] is an integer.
Compared with the prior art, the violent movement detection method and device based on characteristic point analysis of the present invention can detect violent movement accurately.
Brief description of the drawings
Fig. 1 shows a flowchart of the violent movement detection method based on characteristic point analysis according to the present invention;
Fig. 2 shows a structural diagram of the violent movement detection device based on characteristic point analysis according to the present invention.
Embodiment
To help the examiner further understand the structure, features, and other objects of the present invention, it is now described in detail in conjunction with the appended preferred embodiments. The illustrated preferred embodiments are only intended to explain the technical solution of the present invention and do not limit the present invention.
Fig. 1 shows a flowchart of the violent movement detection method based on characteristic point analysis according to the present invention. As shown in Fig. 1, the violent movement detection method based on characteristic point analysis according to the present invention comprises:
a first step 101 of establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a second step 102 of obtaining the gradient image of the foreground image, and extracting characteristic points according to the gradient image;
a third step 103 of tracking the characteristic points and obtaining the offset of each characteristic point;
a fourth step 104 of filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a fifth step 105 of judging whether violent movement exists.
First step:
Preferably, the first step 101 comprises: letting I_k denote the k-th frame image (k is an integer) and B_k denote the k-th frame background image, where the initial value of the background image is B_0 = I_0; the background image is updated according to the following formula:
B_k(x, y) = B_{k-1}(x, y) + sgn(I_k(x, y) - B_{k-1}(x, y))
where x and y respectively denote the horizontal and vertical coordinates of a pixel;
the current frame image is then differenced against the background image to obtain the foreground image of the current frame image.
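To make the sign-based background update and the frame differencing concrete, a minimal Python/NumPy sketch is given below. It is not part of the patent text; the function names and the fixed differencing threshold are illustrative assumptions.

```python
import numpy as np

def update_background(background, frame):
    """Move each background pixel one gray level toward the current frame:
    B_k = B_{k-1} + sgn(I_k - B_{k-1})."""
    diff = frame.astype(np.int16) - background.astype(np.int16)
    updated = background.astype(np.int16) + np.sign(diff)
    return np.clip(updated, 0, 255).astype(np.uint8)

def extract_foreground(background, frame, threshold=15):
    """Difference the current frame against the background; the threshold
    value is an illustrative assumption, not specified in the patent."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = (diff > threshold).astype(np.uint8)
    return frame * mask  # keep foreground pixels, zero out the background
```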
Second step:
Preferably, the second step 102 comprises: calculating the gradient image of the foreground image; calculating the autocorrelation matrix from the gradient image; and calculating the eigenvalues of the autocorrelation matrix, the points with large eigenvalues being selected as characteristic points.
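This eigenvalue test on the gradient autocorrelation matrix corresponds to the well-known Harris/Shi-Tomasi corner criterion. Below is a minimal sketch of one possible reading; the Sobel gradients, the square summation window and the relative eigenvalue threshold are assumptions not fixed by the patent text.

```python
import numpy as np
import cv2

def detect_feature_points(gray, window=5, quality=0.01):
    """Select points whose gradient autocorrelation matrix has a large
    smaller eigenvalue (Shi-Tomasi style). Window and threshold are assumed."""
    gray = np.float32(gray)
    ix = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)   # gradient in x
    iy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)   # gradient in y
    # Elements of the autocorrelation matrix, summed over the window
    ixx = cv2.boxFilter(ix * ix, -1, (window, window))
    iyy = cv2.boxFilter(iy * iy, -1, (window, window))
    ixy = cv2.boxFilter(ix * iy, -1, (window, window))
    # Smaller eigenvalue of [[ixx, ixy], [ixy, iyy]] at every pixel
    trace = ixx + iyy
    det = ixx * iyy - ixy * ixy
    lam_min = trace / 2 - np.sqrt(np.maximum(trace ** 2 / 4 - det, 0))
    ys, xs = np.where(lam_min > quality * lam_min.max())
    return np.stack([xs, ys], axis=1)  # (x, y) characteristic point coordinates
```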
Third step:
Preferably, the third step 103 further comprises:
a) assuming that the offset of a characteristic point in the next frame image is dis, with an initial value of 0;
b) updating dis according to the following formula: dis = H^(-1) Σ_x [∇I]^T [T(x + dis) - I(x)], where ∇I = [I_x, I_y], H = Σ_x ∇I^T ∇I, I_x denotes the gradient of the image in the x direction, I_y denotes the gradient of the image in the y direction, T(x) denotes the gray value of the next frame image, and I(x) denotes the gray value of the current frame image;
c) when |dis| ≤ ε is satisfied, where ε = 0.01, proceeding to the fourth step; otherwise continuing to execute step b).
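The iteration in steps a) to c) is the classical Lucas-Kanade/KLT displacement update. The sketch below tracks a single characteristic point under stated assumptions: a fixed window, nearest-pixel sampling instead of interpolation, an invertible H (a well-textured point), and the |dis| ≤ ε test applied to the per-iteration update, as in standard KLT.

```python
import numpy as np

def track_point(cur, nxt, point, win=7, eps=0.01, max_iter=20):
    """Iteratively solve dis = H^(-1) * sum_x(grad^T * (T(x+dis) - I(x)))
    for one characteristic point. cur and nxt are float32 grayscale images."""
    x0, y0 = int(point[0]), int(point[1])
    r = win // 2
    patch_cur = cur[y0 - r:y0 + r + 1, x0 - r:x0 + r + 1]   # I(x) over the window
    gy, gx = np.gradient(patch_cur)                          # I_y and I_x
    grads = np.stack([gx.ravel(), gy.ravel()], axis=1)       # rows of [I_x, I_y]
    H = grads.T @ grads                                      # 2x2 matrix H
    dis = np.zeros(2, dtype=np.float64)
    for _ in range(max_iter):
        xs, ys = int(round(x0 + dis[0])), int(round(y0 + dis[1]))
        patch_nxt = nxt[ys - r:ys + r + 1, xs - r:xs + r + 1]  # T(x + dis)
        err = (patch_nxt - patch_cur).ravel()                  # T(x + dis) - I(x)
        step = np.linalg.solve(H, grads.T @ err)               # H^(-1) * sum(...)
        dis += step
        if np.linalg.norm(step) <= eps:   # interpreted reading of |dis| <= eps
            break
    return dis
```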
Fourth step:
Preferably, the fourth step 104 comprises: filtering out characteristic points whose offset is less than or equal to 1; classifying the characteristic points with a clustering method, and taking the class containing the most characteristic points as the key area; and calculating the violent movement coefficient of the key area.
Preferably, the clustering method adopts the MeanShift clustering method, whose steps are as follows:
1) initialization: set the initial class, let i = 0, set the class radius r, and determine the basic MeanShift offset formula; go to step 2);
2) empty the sample set S and choose an arbitrary point x that has not yet been assigned to a class; if such a point exists, generate S from x and r and go to step 3); otherwise go to step 8);
3) for all the points in S, calculate the offset Mh with the offset formula; go to step 4);
4) if the offset Mh is small enough, go to step 5); otherwise drift to the new point x = x + Mh, regenerate S from x and r, and go to step 3);
5) put all the points of S into the current class, thereby generating a new class; calculate the Euclidean distance d(·,·) between class centers; if this distance is small enough, go to step 6); otherwise go to step 7);
6) merge the two classes; go to step 2);
7) let i = i + 1; go to step 2);
8) output the classes as the classification result.
Here r ∈ [5, 10].
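Since the MeanShift offset formula, convergence tolerance and merging rule are not reproduced in the text above, the following sketch fills them with the standard flat-kernel choices (Mh equals the mean of the points within radius r minus the current point) purely for illustration; all three values are assumptions.

```python
import numpy as np

def meanshift_cluster(points, r=8, tol=0.1, merge_dist=None):
    """Group 2-D characteristic points with a flat-kernel mean shift of radius r.
    The offset formula, tol and merge_dist are assumptions; merge_dist defaults to r."""
    merge_dist = r if merge_dist is None else merge_dist
    points = np.asarray(points, dtype=np.float32)
    unassigned = np.ones(len(points), dtype=bool)
    classes = []                                    # list of (center, member indices)
    while unassigned.any():
        seed = np.flatnonzero(unassigned)[0]        # arbitrary unclassified point
        x = points[seed].copy()
        while True:
            S = np.linalg.norm(points - x, axis=1) <= r   # points within radius r of x
            if not S.any():
                break
            Mh = points[S].mean(axis=0) - x               # mean-shift offset (assumed form)
            x = x + Mh                                    # drift to the new point
            if np.linalg.norm(Mh) < tol:                  # offset small enough: converged
                break
        members = np.flatnonzero(np.linalg.norm(points - x, axis=1) <= r)
        unassigned[members] = False
        unassigned[seed] = False                    # guarantee progress
        for k, (c, m) in enumerate(classes):        # merge with a nearby existing class
            if np.linalg.norm(c - x) < merge_dist:
                classes[k] = ((c + x) / 2, np.union1d(m, members))
                break
        else:
            classes.append((x, members))
    return classes
```

With the class radius r ∈ [5, 10] pixels, the class that receives the most characteristic points would then be taken as the key area.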
The step of calculating the violent movement coefficient of the key area is as follows: extract the motion direction of each characteristic point in the key area, project the motion directions onto 8 directions to form an 8-bin histogram, and calculate the mean of the variance of this histogram as the violent movement coefficient of this key area.
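One possible reading of the coefficient computation is sketched below: each characteristic point's displacement direction is binned into 8 sectors and the variance of the normalized 8-bin histogram is returned. The normalization is an assumption; under it, a low variance means motion is spread over many directions (chaotic movement), which is consistent with the small thresholds th1 and th2 used in the fifth step.

```python
import numpy as np

def movement_coefficient(offsets):
    """offsets: (N, 2) array of per-point displacements (dx, dy) in the key area.
    Returns the variance of the normalized 8-bin direction histogram."""
    if len(offsets) == 0:
        return 0.0
    dx, dy = offsets[:, 0], offsets[:, 1]
    angles = np.arctan2(dy, dx)                                   # motion direction, (-pi, pi]
    bins = ((angles + np.pi) / (2 * np.pi) * 8).astype(int) % 8   # project onto 8 directions
    hist = np.bincount(bins, minlength=8).astype(np.float32)
    hist /= hist.sum()                                            # normalized frequencies (assumption)
    return float(np.var(hist))                                    # variance of the 8-bin histogram
```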
Fifth step:
Preferably, the fifth step further comprises: when the violent movement coefficient ∈ [th1, th2), starting to accumulate the time t1 of this violent movement; when the violent movement coefficient is no longer in [th1, th2), ending this accumulation; when the violent movement coefficient again falls within [th1, th2), starting to accumulate the time t2 of this violent movement; and calculating the difference between the two consecutive accumulated times: if |t1 - t2| ≤ Δt, violent movement is deemed to exist, otherwise violent movement is deemed not to exist.
Here th1 = 0, th2 ∈ [0.02, 0.08], and Δt ∈ [30, 80] is an integer. For example, th2 = 0.05 and Δt = 50 may be chosen.
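The decision rule compares the durations of two consecutive runs in which the coefficient stays inside [th1, th2). A minimal stateful sketch, counting durations in frames (an assumption), could look as follows.

```python
class ViolenceJudge:
    """Decide whether violent movement exists by comparing the lengths of two
    consecutive runs in which the coefficient falls in [th1, th2)."""

    def __init__(self, th1=0.0, th2=0.05, dt=50):
        self.th1, self.th2, self.dt = th1, th2, dt
        self.runs = []        # completed run lengths t1, t2, ...
        self.current = 0      # length of the run currently being accumulated

    def update(self, coeff):
        in_range = self.th1 <= coeff < self.th2
        if in_range:
            self.current += 1                     # keep accumulating this run
        elif self.current > 0:
            self.runs.append(self.current)        # run finished
            self.current = 0
        if len(self.runs) >= 2:
            t1, t2 = self.runs[-2], self.runs[-1]
            return abs(t1 - t2) <= self.dt        # similar durations -> violent movement
        return False
```

If the coefficient is evaluated once per frame at 25 fps, Δt ∈ [30, 80] frames corresponds to roughly 1 to 3 seconds.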
Fig. 2 shows a structural diagram of the violent movement detection device based on characteristic point analysis according to the present invention. As shown in Fig. 2, the violent movement detection device based on characteristic point analysis comprises:
a background image establishing/updating and foreground image acquisition module 1, for establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a characteristic point extraction module 2, for obtaining the gradient image of the foreground image and extracting characteristic points according to the gradient image;
a characteristic point offset acquisition module 3, for tracking the characteristic points and obtaining the offset of each characteristic point;
a violent movement coefficient acquisition module 4, for filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a violent movement judging module 5, for judging whether violent movement exists.
Preferably, the background image establishing/updating and foreground image acquisition module 1 is configured to: let I_k denote the k-th frame image (k is an integer) and B_k denote the k-th frame background image, where the initial value of the background image is B_0 = I_0; update the background image according to the following formula:
B_k(x, y) = B_{k-1}(x, y) + sgn(I_k(x, y) - B_{k-1}(x, y))
where x and y respectively denote the horizontal and vertical coordinates of a pixel;
and difference the current frame image against the background image to obtain the foreground image of the current frame image.
Preferably, the characteristic point extraction module 2 is configured to: calculate the gradient image of the foreground image; calculate the autocorrelation matrix from the gradient image; and calculate the eigenvalues of the autocorrelation matrix, the points with large eigenvalues being selected as characteristic points.
Preferably, the characteristic point offset acquisition module 3 is configured to:
a) assume that the offset of a characteristic point in the next frame image is dis, with an initial value of 0;
b) update dis according to the following formula: dis = H^(-1) Σ_x [∇I]^T [T(x + dis) - I(x)], where ∇I = [I_x, I_y], H = Σ_x ∇I^T ∇I, I_x denotes the gradient of the image in the x direction, I_y denotes the gradient of the image in the y direction, T(x) denotes the gray value of the next frame image, and I(x) denotes the gray value of the current frame image;
c) when |dis| ≤ ε is satisfied, where ε = 0.01, proceed to the fourth step; otherwise continue to execute step b).
Preferably, the violent movement coefficient acquisition module 4 is configured to: filter out characteristic points whose offset is less than or equal to 1; classify the characteristic points with a clustering method, taking the class containing the most characteristic points as the key area; and calculate the violent movement coefficient of the key area.
Preferably, the clustering method adopts the MeanShift clustering method, whose steps are as follows:
1) initialization: set the initial class, let i = 0, set the class radius r, and determine the basic MeanShift offset formula; go to step 2);
2) empty the sample set S and choose an arbitrary point x that has not yet been assigned to a class; if such a point exists, generate S from x and r and go to step 3); otherwise go to step 8);
3) for all the points in S, calculate the offset Mh with the offset formula; go to step 4);
4) if the offset Mh is small enough, go to step 5); otherwise drift to the new point x = x + Mh, regenerate S from x and r, and go to step 3);
5) put all the points of S into the current class, thereby generating a new class; calculate the Euclidean distance d(·,·) between class centers; if this distance is small enough, go to step 6); otherwise go to step 7);
6) merge the two classes; go to step 2);
7) let i = i + 1; go to step 2);
8) output the classes as the classification result.
Here r ∈ [5, 10].
The step of calculating the violent movement coefficient of the key area is as follows: extract the motion direction of each characteristic point in the key area, project the motion directions onto 8 directions to form an 8-bin histogram, and calculate the mean of the variance of this histogram as the violent movement coefficient of this key area.
Preferably, the violent movement judging module 5 is configured to: when the violent movement coefficient ∈ [th1, th2), start accumulating the time t1 of this violent movement; when the violent movement coefficient is no longer in [th1, th2), end this accumulation; when the violent movement coefficient again falls within [th1, th2), start accumulating the time t2 of this violent movement; and calculate the difference between the two consecutive accumulated times: if |t1 - t2| ≤ Δt, violent movement is deemed to exist, otherwise violent movement is deemed not to exist. Here th1 = 0, th2 ∈ [0.02, 0.08], and Δt ∈ [30, 80] is an integer. For example, th2 = 0.05 and Δt = 50 may be chosen.
Compared with the prior art, the violent movement detection method and device based on characteristic point analysis of the present invention can detect violent movement accurately.
It should be noted that the foregoing summary and embodiments are intended to demonstrate the practical application of the technical solution provided by the present invention and should not be construed as limiting the scope of the present invention. Those skilled in the art may make various modifications, equivalent replacements, or improvements within the spirit and principle of the present invention. The protection scope of the present invention is defined by the appended claims.

Claims (9)

1. A method for detecting violent movement based on characteristic point analysis, the method comprising:
a first step of establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a second step of obtaining the gradient image of the foreground image, and extracting characteristic points according to the gradient image;
a third step of tracking the characteristic points and obtaining the offset of each characteristic point;
a fourth step of filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a fifth step of judging whether violent movement exists;
wherein the fourth step further comprises: filtering out characteristic points whose offset is less than or equal to 1; classifying the characteristic points with a clustering method, and taking the class containing the most characteristic points as the key area; and calculating the violent movement coefficient of the key area;
wherein the step of calculating the violent movement coefficient of the key area is as follows: extracting the motion direction of each characteristic point in the key area, projecting the motion directions onto 8 directions to form an 8-bin histogram, and calculating the mean of the variance of this histogram as the violent movement coefficient of this key area.
2. the method for claim 1, is characterized in that, described first step comprises: suppose I krepresent k two field picture, k is integer, B krepresent k frame background image, wherein the initial value of background image is B 0=I 0, the more new formula of background image is as follows:
B k(x,y)=B k-1(x,y)+sgn(I k(x,y)-B k-1(x,y))
Wherein, x, y represent respectively horizontal ordinate and the ordinate of pixel;
Current frame image and background image are done to difference to obtain the foreground image of current frame image.
3. the method for claim 1, is characterized in that, described second step comprises: the gradient image that calculates foreground image; Calculate autocorrelation matrix according to gradient image; The eigenwert of calculating autocorrelation matrix, the point that selected characteristic value is large is unique point.
4. the method for claim 1, is characterized in that, described third step further comprises:
A) side-play amount of hypothesis unique point in next frame image is dis, and the initial value of dis is 0;
B) upgrade dis according to following formula: dis = H - 1 Σ x [ ▿ I ] T [ T ( x + dis ) - I ( x ) ] , ▿ I = I x I y , H = Σ x ▿ I T ▿ I , I xthe gradient of presentation video x direction, I ythe gradient of presentation video y direction, T (x) represents the gray scale of next frame image, I (x) represents the gray scale of current frame image;
C) when meeting | when dis|≤ε, ε=0.01, proceeds to the 4th step, otherwise continues execution step b).
5. the method for claim 1, is characterized in that, described the 5th step further comprises: when the coefficient ∈ of strenuous exercise [th1, th2) time, start to accumulate the time t1 of this strenuous exercise, when strenuous exercise's coefficient [th1, th2) time, finish this accumulated time, when occur again the coefficient ∈ of strenuous exercise [th1, th2) time, then start to accumulate the time t2 of this strenuous exercise, before and after calculating, twice strenuous exercise's accumulated time is poor, if | t1-t2|≤△ t, thinks and has strenuous exercise, otherwise think and do not have strenuous exercise; Wherein, th1=0, th2 ∈ [0.02,0.08], △ t ∈ [30,80] and be integer.
6. A violent movement detection device based on characteristic point analysis, the device comprising:
a background image establishing/updating and foreground image acquisition module, for establishing and updating a background image, and extracting the foreground image of the current frame image according to the background image;
a characteristic point extraction module, for obtaining the gradient image of the foreground image and extracting characteristic points according to the gradient image;
a characteristic point offset acquisition module, for tracking the characteristic points and obtaining the offset of each characteristic point;
a violent movement coefficient acquisition module, for filtering out characteristic points with small offsets, carrying out cluster analysis on the remaining characteristic points, obtaining a key area, and calculating the violent movement coefficient of the key area;
a violent movement judging module, for judging whether violent movement exists;
wherein the violent movement coefficient acquisition module is further configured to: filter out characteristic points whose offset is less than or equal to 1; classify the characteristic points with a clustering method, taking the class containing the most characteristic points as the key area; and calculate the violent movement coefficient of the key area;
wherein the step of calculating the violent movement coefficient of the key area is as follows: extracting the motion direction of each characteristic point in the key area, projecting the motion directions onto 8 directions to form an 8-bin histogram, and calculating the mean of the variance of this histogram as the violent movement coefficient of this key area.
7. The device of claim 6, characterized in that the background image establishing/updating and foreground image acquisition module is configured to: let I_k denote the k-th frame image, k being an integer, and B_k denote the k-th frame background image, where the initial value of the background image is B_0 = I_0, the background image being updated according to the following formula:
B_k(x, y) = B_{k-1}(x, y) + sgn(I_k(x, y) - B_{k-1}(x, y))
where x and y respectively denote the horizontal and vertical coordinates of a pixel;
and difference the current frame image against the background image to obtain the foreground image of the current frame image.
8. The device of claim 6, characterized in that the characteristic point extraction module is configured to: calculate the gradient image of the foreground image; calculate the autocorrelation matrix from the gradient image; and calculate the eigenvalues of the autocorrelation matrix, the points with large eigenvalues being selected as characteristic points;
and the characteristic point offset acquisition module is configured to:
a) assume that the offset of a characteristic point in the next frame image is dis, with an initial value of 0;
b) update dis according to the following formula: dis = H^(-1) Σ_x [∇I]^T [T(x + dis) - I(x)], where ∇I = [I_x, I_y], H = Σ_x ∇I^T ∇I, I_x denotes the gradient of the image in the x direction, I_y denotes the gradient of the image in the y direction, T(x) denotes the gray value of the next frame image, and I(x) denotes the gray value of the current frame image;
c) when |dis| ≤ ε is satisfied, where ε = 0.01, proceed to the fourth step; otherwise continue to execute step b).
9. The device of claim 6, characterized in that the violent movement judging module is configured to: when the violent movement coefficient ∈ [th1, th2), start accumulating the time t1 of this violent movement; when the violent movement coefficient is no longer in [th1, th2), end this accumulation; when the violent movement coefficient again falls within [th1, th2), start accumulating the time t2 of this violent movement; and calculate the difference between the two consecutive accumulated times: if |t1 - t2| ≤ Δt, deem that violent movement exists, otherwise deem that violent movement does not exist; wherein th1 = 0, th2 ∈ [0.02, 0.08], and Δt ∈ [30, 80] is an integer.
CN201210094638.7A 2012-03-31 2012-03-31 Violent movement detection method based on characteristic point analysis and device thereof Active CN102663776B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210094638.7A CN102663776B (en) 2012-03-31 2012-03-31 Violent movement detection method based on characteristic point analysis and device thereof


Publications (2)

Publication Number Publication Date
CN102663776A CN102663776A (en) 2012-09-12
CN102663776B true CN102663776B (en) 2014-10-29

Family

ID=46773254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210094638.7A Active CN102663776B (en) 2012-03-31 2012-03-31 Violent movement detection method based on characteristic point analysis and device thereof

Country Status (1)

Country Link
CN (1) CN102663776B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103064086B (en) * 2012-11-04 2014-09-17 北京工业大学 Vehicle tracking method based on depth information
CN103974028A (en) * 2013-01-30 2014-08-06 由田新技股份有限公司 Method for detecting fierce behavior of personnel
CN103279737B (en) * 2013-05-06 2016-09-07 上海交通大学 A kind of behavioral value method of fighting based on space-time interest points
CN105208402B (en) * 2015-08-31 2017-12-15 电子科技大学 A kind of frame of video complexity measure method based on Moving Objects and graphical analysis
CN107305691A (en) * 2016-04-19 2017-10-31 中兴通讯股份有限公司 Foreground segmentation method and device based on images match
CN106713702A (en) * 2017-01-19 2017-05-24 博康智能信息技术有限公司 Method and apparatus of determining video image jitter and camera device jitter

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4492036B2 (en) * 2003-04-28 2010-06-30 ソニー株式会社 Image recognition apparatus and method, and robot apparatus
US7558404B2 (en) * 2005-11-28 2009-07-07 Honeywell International Inc. Detection of abnormal crowd behavior
US8218198B2 (en) * 2006-03-07 2012-07-10 Hewlett-Packard Development Company, L.P. Color selection
CN101290682A (en) * 2008-06-25 2008-10-22 北京中星微电子有限公司 Movement target checking method and apparatus
CN101751679A (en) * 2009-12-24 2010-06-23 北京中星微电子有限公司 Sorting method, detecting method and device of moving object
CN101976353B (en) * 2010-10-28 2012-08-22 北京智安邦科技有限公司 Statistical method and device of low density crowd
CN102063613B (en) * 2010-12-28 2012-12-05 北京智安邦科技有限公司 People counting method and device based on head recognition

Also Published As

Publication number Publication date
CN102663776A (en) 2012-09-12

Similar Documents

Publication Publication Date Title
CN102663776B (en) Violent movement detection method based on characteristic point analysis and device thereof
Ullah et al. Anomalous entities detection and localization in pedestrian flows
CN102063613B (en) People counting method and device based on head recognition
CN103729858B (en) A kind of video monitoring system is left over the detection method of article
CN102509291B (en) Pavement disease detecting and recognizing method based on wireless online video sensor
CN106204640A (en) A kind of moving object detection system and method
CN106203274A (en) Pedestrian's real-time detecting system and method in a kind of video monitoring
CN102609724B (en) Method for prompting ambient environment information by using two cameras
CN103810711A (en) Keyframe extracting method and system for monitoring system videos
CN104463232A (en) Density crowd counting method based on HOG characteristic and color histogram characteristic
US20180189557A1 (en) Human detection in high density crowds
CN103020606A (en) Pedestrian detection method based on spatio-temporal context information
CN104537688A (en) Moving object detecting method based on background subtraction and HOG features
CN103996040A (en) Bottom-up visual saliency generating method fusing local-global contrast ratio
Pathak et al. Anomaly localization in topic-based analysis of surveillance videos
US20110280442A1 (en) Object monitoring system and method
US20170103536A1 (en) Counting apparatus and method for moving objects
Wang et al. Crowd density estimation based on texture feature extraction
KR102332229B1 (en) Method for Augmenting Pedestrian Image Data Based-on Deep Learning
CN103577804A (en) Abnormal human behavior identification method based on SIFT flow and hidden conditional random fields
CN104866844B (en) A kind of crowd massing detection method towards monitor video
CN101877135B (en) Moving target detecting method based on background reconstruction
CN104168462A (en) Camera scene change detecting method based on image angular point set characteristic
CN103530601A (en) Monitoring blind area crowd state deduction method based on Bayesian network
CN103310180A (en) System and method for detecting random object in target image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: NETPOSA TECHNOLOGIES, LTD.

Free format text: FORMER OWNER: BEIJING ZANB SCIENCE + TECHNOLOGY CO., LTD.

Effective date: 20150821

C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20150821

Address after: 100102, Beijing, Chaoyang District, Tong Tung Street, No. 1, Wangjing SOHO tower, two, C, 26 floor

Patentee after: NETPOSA TECHNOLOGIES, Ltd.

Address before: 100048 Beijing city Haidian District Road No. 9, building 4, 5 layers of international subject

Patentee before: Beijing ZANB Technology Co.,Ltd.

PP01 Preservation of patent right
PP01 Preservation of patent right

Effective date of registration: 20220726

Granted publication date: 20141029