CN101231696A - Method and system for detection of hangover - Google Patents
Method and system for detection of hangover
- Publication number
- CN101231696A (application number CN200810065255A, filed as CNA2008100652550A)
- Authority
- CN
- China
- Prior art keywords
- target
- pixel
- frame
- connected region
- abandoned object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses an abandoned object detection method comprising the following steps: detecting foreground images that differ from the background, determining the foreground pixels, and marking the foreground pixels; timing how long each foreground pixel remains in the marked state, and extracting the pixels whose timed duration reaches a preset value; detecting the connected regions formed by the extracted pixels and taking them as targets to be analyzed; and analyzing the motion characteristics of each target to determine whether it is an abandoned object. The invention also discloses an abandoned object detection system comprising a target detection device, a timing processing device, a connected region detection device and a target analysis device. The timing processing device times the pixels and extracts those whose marked state has persisted for the preset duration; the connected region detection device detects the connected regions formed by the extracted pixels so as to determine the targets to be analyzed; and the target analysis device determines whether each target to be analyzed is an abandoned object. The invention can accurately extract the effective pixels of a target, and can therefore accurately judge whether a detected target is an abandoned object.
Description
[technical field]
The present invention relates to image processing, and in particular to an abandoned object detection method and system.
[background technology]
Abandoned object detection is based on moving-target detection. A common moving-target detection method is background subtraction: a background model (background image frame) is built, the background frame is subtracted from the current image frame, and the resulting difference within the scene is the target to be obtained. The shortcoming of the prior art is that, after a target is detected, it directly judges whether the target is moving, and considers it an abandoned object if it has remained still long enough. However, if a disturbance occurs during detection, for example a person passing in front of the target, it cannot accurately judge whether the target is static. This is because the prior art does not further process the target pixels after detection: when a target is interfered with while being detected, the detected target is incomplete and spurious, so stillness cannot be judged accurately. As the disturbance changes, the measured target changes even though the real object is still; a target that is in fact static is mistaken for a moving one, and a static abandoned object is missed.
[summary of the invention]
The main purpose of the present invention is to solve the above problems in the prior art by providing an abandoned object detection method and system that can accurately determine whether a target to be analyzed is an abandoned object.
To achieve this object, the invention provides an abandoned object detection method comprising the following steps:
A1. Detect foreground images that differ from the background, determine the foreground pixels, and mark the foreground pixels as foreground;
B1. Time the duration for which each foreground pixel remains in the marked state, and extract the pixels whose timed duration reaches a preset value;
C1. Detect the connected regions formed by the extracted pixels and take them as targets to be analyzed;
D1. Analyze at least the motion characteristics of each target to confirm whether it is an abandoned object.
The marking in step A1 consists of representing foreground pixels and all remaining pixels with different values, forming a binary image.
Step B1 comprises the following substeps:
B11. Set a timer for each pixel of the video image;
B12. For each frame, judge the state of each pixel of the current frame; if the pixel is in the foreground-marked state, accumulate its timer; if it is in the background state, clear its timer;
B13. Judge whether the timer value of each pixel reaches the preset value, and if so, extract the pixel.
Step C1 comprises the following substeps:
C11. Perform connected region detection on the pixels extracted in each frame;
C12. Label the connected regions obtained;
C13. Associate the connected regions between frames; a successfully associated connected region is taken as a target to be analyzed.
After step C1, the method further comprises the step of extracting features of the connected regions and performing pattern classification on the targets, dividing the targets into at least two classes, person and thing. Step D1 comprises the following substeps:
D11. Analyze whether the target is in a stationary state;
D12. If the target is in a stationary state and its class is thing, determine that the target is an abandoned object.
Step D11 comprises the following substeps:
D111. After the target to be analyzed is determined, record the center and area of the bounding rectangle of the target in frame n;
D112. Compare the target in each of T consecutive frames, starting from frame n+1, with the target in frame n, and judge whether the area change rate and center displacement of its bounding rectangle exceed set thresholds;
D113. If the area change rate and center displacement of the bounding rectangle are both below the set thresholds, judge the target to be stationary;
where n and T are preset positive integers.
Step A1 comprises the following substeps:
A11. Set a background frame;
A12. Obtain a detected image frame by image capture;
A13. Compare the detected image frame with the background frame to obtain the foreground image and determine the foreground pixels;
A14. Mark the foreground pixels as foreground relative to the remaining pixels.
To achieve the above object, the present invention also provides an abandoned object detection system comprising:
a target detection device for detecting foreground images that differ from the background, determining the foreground pixels, and marking them as foreground;
a timing processing device for timing the duration for which each foreground pixel remains in the marked state and extracting the pixels whose timed duration reaches a preset value;
a connected region detection device for detecting the connected regions formed by the extracted pixels to determine the targets to be analyzed;
a target analysis device for analyzing whether each determined target to be analyzed is an abandoned object.
The timing processing device comprises:
a timer for timing each pixel;
a control module for judging the state of each pixel of the current frame, controlling the timer to accumulate when the pixel is in the foreground-marked state and clearing the timer when it is in the background state;
a pixel extraction unit for extracting the pixels whose timed duration reaches the preset value.
The connected region detection device comprises:
a detection unit for detecting and labeling the connected regions formed by the extracted pixels;
a matching unit for matching and associating the connected regions between frames.
The system further comprises a target classification device for extracting features of the connected regions and performing pattern classification on the targets; the target classification device divides the targets into at least two classes, person and thing.
The target analysis device comprises:
a target recording unit for recording the center and area of the bounding rectangle of a target in frame n after the target is determined, where n is a preset natural number;
a comparing unit for comparing the target in each frame, starting from frame n+1, with the target in frame n to obtain the area change rate and center displacement of the bounding rectangle;
a judging unit for judging whether the area change rate and center displacement of the bounding rectangle exceed set thresholds, and judging the target to be stationary if both are below the set thresholds;
an integrated analysis unit for performing comprehensive analysis on the target and determining that the target is an abandoned object when it is stationary and its class is thing.
The beneficial effects of the invention are as follows:
With the present invention, after a foreground image differing from the background is detected, the foreground pixels are marked to distinguish them from the background. Next, the marked state of each pixel is timed, and the pixels whose timing reaches a preset value are extracted; connected region detection and target motion analysis are then performed to judge whether the determined target is a static abandoned object. Because the persistence of each pixel's marked state is processed after the foreground pixels are obtained, accumulating the time each pixel stays marked, a pixel is considered an effective pixel of a target to be analyzed only when it has appeared continuously for long enough. Compared with the previous approach of judging whether a target moves as soon as a foreground target is detected, the pixels of the target determined by the present invention are more accurate, so whether the target is an abandoned object can be judged accurately and the probability of missed detections is greatly reduced.
[description of drawings]
Fig. 1 is a flow chart of the abandoned object detection method of an embodiment of the invention;
Fig. 2(a) is a schematic diagram of one kind of connected region of the embodiment of the invention;
Fig. 2(b) is a schematic diagram of another kind of connected region of the embodiment of the invention.
[embodiment]
The features and advantages of the present invention are described in detail below through embodiments, with reference to the accompanying drawings.
Referring to Fig. 1, the abandoned object detection method of the invention can be realized by the following steps:
1. Foreground image detection
First, a background model is built, i.e. a background frame is set, and the background frame is continuously updated as the environment changes;
Then, using the background subtraction method, a detected image frame is obtained by image capture, the background frame is subtracted from it, and the foreground pixels in the scene that do not belong to the background are detected;
Then the foreground pixels are marked relative to the other pixels to generate the foreground image: the detected foreground pixels that differ from the background are represented by 1 (white) and appear as foreground, while the remaining pixels are represented by 0 (black) and appear as background, so the detection result is a binary image composed of 0s and 1s.
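The foreground detection step above can be sketched as a frame-minus-background threshold test. This is a minimal illustration, assuming grayscale frames and a hypothetical difference threshold of 25 (the patent does not fix a threshold):

```python
import numpy as np

def foreground_mask(frame, background, threshold=25):
    """Mark pixels that differ from the background model as foreground (1)
    and all remaining pixels as background (0), forming a binary image."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return (diff > threshold).astype(np.uint8)

# Toy 4x4 grayscale example: a 2x2 bright object appears on a flat background.
background = np.full((4, 4), 50, dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200          # the left-behind object
mask = foreground_mask(frame, background)
```

In a real system the background frame would also be updated over time, as the embodiment notes; that update policy is not sketched here.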
2. Pixel timing
A timer with initial value 0 is set for each pixel of the video image. Once the foreground pixels are being marked, the state of each pixel is timed frame by frame: for each frame, the state of each pixel of the current frame is judged; if its value is 1 (the foreground-marked state), a fixed increment is added to its timer; if its value is 0 (the background state), its timer is reset to zero.
3. Overtime pixel detection
After a specified time, e.g. a predetermined number of frames, the pixels whose timing exceeds a preset value (T_overtime) are found; that is, the overtime pixels considered effective are extracted, generating an overtime pixel image (overtime image).
Through steps 2 and 3, it can be judged whether a pixel has remained in a state different from the background for long enough, i.e. whether it is an effective pixel; the pixels that have appeared for a sufficiently long time are extracted as the basis for the next processing step. In this way, the inaccuracy in pixel determination caused by external interference (such as a moving person) during foreground image detection can be largely eliminated.
4. Connected region detection
Connected region detection is performed on the detected overtime pixel image to obtain the independent connected regions formed by the overtime pixels, and each connected region is labeled. A connected region is an independent pixel region (set) composed of all pixels that are connected to each other. As shown in Fig. 2(a), the overtime pixels in that frame form two connected regions; as shown in Fig. 2(b), the overtime pixels in that frame form only one connected region.
5. Connected region matching and association
The connected regions of successive frames are associated by matching, keeping the identity of each connected region consistent between frames. Each connected region is assigned an ID, and a successfully associated connected region is confirmed as a target to be analyzed. The matching method may use distance matching: for example, when the center distance between two targets in two consecutive frames is below a certain value, they are considered to be the same target.
6. Target classification
Pattern classification is performed on the targets: the target features of each independent connected region are extracted, and the targets are classified according to those features. For example, according to the features of the connected regions, the targets may be divided into three classes — thing, person, and mixed person-and-thing — and prior art techniques can be used for this classification.
7. Motion characteristic analysis
After the target to be analyzed is determined, the center and area of the bounding rectangle of the target in a subsequent frame n are recorded, where n is a settable parameter, e.g. a natural number;
Two thresholds are preset: an area change rate threshold (T_eara) and a center movement threshold (T_distance). Next, the target in each frame starting from frame n+1 is compared with the target in frame n, judging whether the area change rate of its bounding rectangle exceeds the area change rate threshold (T_eara) and whether its center displacement exceeds the center movement threshold (T_distance);
If for T consecutive frames the area change rate and center displacement of the target's bounding rectangle are both below the set thresholds, the target is judged to be static.
8. Comprehensive analysis
A comprehensive judgment is made from the results of the target classification and the motion characteristic analysis. For example, if a target is classified as a thing by pattern classification and judged static by the motion characteristic analysis, the target is considered an abandoned object; it is given an abandoned-object label and highlighted on the display screen, for instance by displaying only the targets judged to be abandoned objects, or by marking the targets judged to be abandoned objects with rectangles on the display screen.
As another aspect of the present invention, an abandoned object detection system for implementing the above abandoned object detection method is also provided, comprising the following components:
a target detection device for detecting foreground images that differ from the background, determining the foreground pixels, and marking them as foreground;
a timing processing device for timing the duration for which each foreground pixel remains in the marked state and extracting the pixels whose timed duration reaches a preset value;
a connected region detection device for detecting the connected regions formed by the extracted pixels to determine the targets to be analyzed;
a target analysis device for analyzing whether each determined target to be analyzed is an abandoned object.
In a preferred scheme, the abandoned object detection system further comprises a target classification device for extracting features of the connected regions and performing pattern classification on the targets, dividing the targets into at least two classes, person and thing.
The timing processing device comprises:
a timer for timing each pixel;
a control module for judging the state of each pixel of the current frame, controlling the timer to accumulate when the pixel is in the foreground-marked state and clearing the timer when it is in the background state;
a pixel extraction unit for extracting the pixels whose timed duration reaches the preset value.
The connected region detection device comprises:
a detection unit for detecting and labeling the connected regions formed by the extracted pixels;
a matching unit for matching and associating the connected regions between frames.
The target analysis device comprises:
a target recording unit for recording the center and area of the bounding rectangle of a target in frame n after the target is determined, where n is a preset natural number;
a comparing unit for comparing the target in each frame, starting from frame n+1, with the target in frame n to obtain the area change rate and center displacement of the bounding rectangle;
a judging unit for judging whether the area change rate and center displacement of the bounding rectangle exceed set thresholds, and judging the target to be stationary if both are below the set thresholds;
an integrated analysis unit for performing comprehensive analysis on the target and determining that the target is an abandoned object when it is stationary and its class is thing.
For an explanation of the working principle of the above detection system, refer to the implementation of the abandoned object detection method of this embodiment.
The above is a further detailed description of the invention in combination with specific preferred embodiments, and the specific implementation of the invention shall not be considered limited to these descriptions. For ordinary technical personnel in the technical field of the invention, a number of simple deductions or substitutions may also be made without departing from the concept of the invention, and all such variations shall be considered to fall within the scope of protection of the invention.
Claims (13)
1. An abandoned object detection method, characterized by comprising the following steps:
A1. detecting foreground images that differ from the background, determining the foreground pixels, and marking the foreground pixels as foreground;
B1. timing the duration for which each foreground pixel remains in the marked state, and extracting the pixels whose timed duration reaches a preset value;
C1. detecting the connected regions formed by the extracted pixels and taking them as targets to be analyzed;
D1. analyzing at least the motion characteristics of each target to confirm whether it is an abandoned object.
2. The abandoned object detection method of claim 1, characterized in that the marking in step A1 consists of representing foreground pixels and all remaining pixels with different values, forming a binary image.
3. The abandoned object detection method of claim 1, characterized in that, after step D1, the foreground pixels confirmed as belonging to an abandoned object are highlighted.
4. The abandoned object detection method of any one of claims 1 to 3, characterized in that step B1 comprises the following substeps:
B11. setting a timer for each pixel of the video image;
B12. for each frame, judging the state of each pixel of the current frame, accumulating the pixel's timer if it is in the foreground-marked state, and clearing the timer if it is in the background state;
B13. judging whether the timer value of each pixel reaches the preset value, and if so, extracting the pixel.
5. The abandoned object detection method of claim 4, characterized in that step C1 comprises the following substeps:
C11. performing connected region detection on the pixels extracted in each frame;
C12. labeling the connected regions obtained;
C13. associating the connected regions between frames, a successfully associated connected region being taken as a target to be analyzed.
6. The abandoned object detection method of claim 5, characterized in that, after step C1, the method further comprises the step of extracting features of the connected regions and performing pattern classification on the targets, dividing the targets into at least two classes, person and thing, and in that step D1 comprises the following substeps:
D11. analyzing whether the target is in a stationary state;
D12. if the target is in a stationary state and its class is thing, determining that the target is an abandoned object.
7. The abandoned object detection method of claim 6, characterized in that step D11 comprises the following substeps:
D111. after the target to be analyzed is determined, recording the center and area of the bounding rectangle of the target in frame n;
D112. comparing the target in each of T consecutive frames, starting from frame n+1, with the target in frame n, and judging whether the area change rate and center displacement of its bounding rectangle exceed set thresholds;
D113. if the area change rate and center displacement of the bounding rectangle are both below the set thresholds, judging the target to be stationary;
where n and T are preset positive integers.
8. The abandoned object detection method of claim 1, characterized in that step A1 comprises the following substeps:
A11. setting a background frame;
A12. obtaining a detected image frame by image capture;
A13. comparing the detected image frame with the background frame to obtain the foreground image and determine the foreground pixels;
A14. marking the foreground pixels as foreground relative to the remaining pixels.
9. An abandoned object detection system, characterized by comprising:
a target detection device for detecting foreground images that differ from the background, determining the foreground pixels, and marking them as foreground;
a timing processing device for timing the duration for which each foreground pixel remains in the marked state and extracting the pixels whose timed duration reaches a preset value;
a connected region detection device for detecting the connected regions formed by the extracted pixels to determine the targets to be analyzed;
a target analysis device for analyzing whether each determined target to be analyzed is an abandoned object.
10. The abandoned object detection system of claim 9, characterized in that the timing processing device comprises:
a timer for timing each pixel;
a control module for judging the state of each pixel of the current frame, controlling the timer to accumulate when the pixel is in the foreground-marked state and clearing the timer when it is in the background state;
a pixel extraction unit for extracting the pixels whose timed duration reaches the preset value.
11. The abandoned object detection system of claim 9 or 10, characterized in that the connected region detection device comprises:
a detection unit for detecting and labeling the connected regions formed by the extracted pixels;
a matching unit for matching and associating the connected regions between frames.
12. The abandoned object detection system of claim 11, characterized by further comprising a target classification device for extracting features of the connected regions and performing pattern classification on the targets, the target classification device dividing the targets into at least two classes, person and thing.
13. The abandoned object detection system of claim 12, characterized in that the target analysis device comprises:
a target recording unit for recording the center and area of the bounding rectangle of a target in frame n after the target is determined, where n is a preset natural number;
a comparing unit for comparing the target in each frame, starting from frame n+1, with the target in frame n to obtain the area change rate and center displacement of the bounding rectangle;
a judging unit for judging whether the area change rate and center displacement of the bounding rectangle exceed set thresholds, and judging the target to be stationary if both are below the set thresholds;
an integrated analysis unit for performing comprehensive analysis on the target and determining that the target is an abandoned object when it is stationary and its class is thing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNA2008100652550A CN101231696A (en) | 2008-01-30 | 2008-01-30 | Method and system for detection of hangover |
Publications (1)
Publication Number | Publication Date |
---|---|
CN101231696A true CN101231696A (en) | 2008-07-30 |
Family
ID=39898159
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNA2008100652550A Pending CN101231696A (en) | 2008-01-30 | 2008-01-30 | Method and system for detection of hangover |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101231696A (en) |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101552910B (en) * | 2009-03-30 | 2011-04-06 | 浙江工业大学 | Remnant detection device based on comprehensive computer vision |
CN102063614A (en) * | 2010-12-28 | 2011-05-18 | 天津市亚安科技电子有限公司 | Method and device for detecting lost articles in security monitoring |
CN102314695A (en) * | 2011-08-23 | 2012-01-11 | 北京黄金视讯科技有限公司 | Abandoned object detection method based on computer vision |
CN102622758A (en) * | 2012-03-16 | 2012-08-01 | 安科智慧城市技术(中国)有限公司 | Method and system for establishing residue detection background |
CN102663346A (en) * | 2012-03-16 | 2012-09-12 | 安科智慧城市技术(中国)有限公司 | Detection method and system of remnants |
CN102693537A (en) * | 2011-01-17 | 2012-09-26 | 三星泰科威株式会社 | Image surveillance system and method of detecting whether object is left behind or taken away |
CN103034867A (en) * | 2011-09-29 | 2013-04-10 | 蔡宏营 | Insect image detection method and insect classification method |
CN103093435A (en) * | 2013-01-27 | 2013-05-08 | 孙建德 | Detection method for remnants in video monitoring and based on foreground modeling |
CN103324906A (en) * | 2012-03-21 | 2013-09-25 | 日电(中国)有限公司 | Method and equipment for detecting abandoned object |
CN103425958A (en) * | 2012-05-24 | 2013-12-04 | 信帧电子技术(北京)有限公司 | Method for detecting non-movable objects in video |
CN103455997A (en) * | 2012-06-04 | 2013-12-18 | 深圳大学 | Derelict detection method and system |
CN103605983A (en) * | 2013-10-30 | 2014-02-26 | 天津大学 | Remnant detection and tracking method |
CN103679123A (en) * | 2012-09-17 | 2014-03-26 | 浙江大华技术股份有限公司 | Method and system for detecting remnant on ATM (Automatic Teller Machine) panel |
CN103729613A (en) * | 2012-10-12 | 2014-04-16 | 浙江大华技术股份有限公司 | Method and device for detecting video image |
CN104050665A (en) * | 2014-06-10 | 2014-09-17 | 华为技术有限公司 | Method and device for estimating foreground dwell time in video image |
CN104123567A (en) * | 2014-07-21 | 2014-10-29 | 国家电网公司 | Method for site remaining intelligent video recognition of electric power operation tools and instruments |
CN103034867B (en) * | 2011-09-29 | 2016-12-14 | 蔡宏营 | insect image detection method and insect classification method |
CN106228572A (en) * | 2016-07-18 | 2016-12-14 | 西安交通大学 | The long inactivity object detection of a kind of carrier state mark and tracking |
CN106408554A (en) * | 2015-07-31 | 2017-02-15 | 富士通株式会社 | Remnant detection apparatus, method and system |
CN106682566A (en) * | 2015-11-09 | 2017-05-17 | 富士通株式会社 | Traffic accident detection method, traffic accident detection device and electronic device |
CN107909598A (en) * | 2017-10-28 | 2018-04-13 | 天津大学 | A kind of moving object detection and tracking method based on interprocess communication |
CN107918762A (en) * | 2017-10-24 | 2018-04-17 | 江西省高速公路投资集团有限责任公司 | A kind of highway drops thing rapid detection system and method |
CN110135377A (en) * | 2019-05-21 | 2019-08-16 | 北京百度网讯科技有限公司 | Object moving state detection method, device, server and computer-readable medium |
CN110321808A (en) * | 2019-06-13 | 2019-10-11 | 浙江大华技术股份有限公司 | Residue and robber move object detecting method, equipment and storage medium |
CN111369529A (en) * | 2020-03-04 | 2020-07-03 | 厦门脉视数字技术有限公司 | Article loss and leave-behind detection method and system |
CN113393482A (en) * | 2021-06-17 | 2021-09-14 | 中国工商银行股份有限公司 | Method and device for detecting left-over articles based on fusion algorithm |
CN113689430A (en) * | 2021-10-26 | 2021-11-23 | 紫东信息科技(苏州)有限公司 | Image processing method and device for enteroscopy state monitoring |
-
2008
- 2008-01-30 CN CNA2008100652550A patent/CN101231696A/en active Pending
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101552910B (en) * | 2009-03-30 | 2011-04-06 | 浙江工业大学 | Remnant detection device based on comprehensive computer vision |
CN102063614A (en) * | 2010-12-28 | 2011-05-18 | 天津市亚安科技电子有限公司 | Method and device for detecting lost articles in security monitoring |
CN102063614B (en) * | 2010-12-28 | 2015-06-03 | 天津市亚安科技股份有限公司 | Method and device for detecting lost articles in security monitoring |
CN102693537A (en) * | 2011-01-17 | 2012-09-26 | 三星泰科威株式会社 | Image surveillance system and method of detecting whether object is left behind or taken away |
CN102314695A (en) * | 2011-08-23 | 2012-01-11 | 北京黄金视讯科技有限公司 | Abandoned object detection method based on computer vision |
CN102314695B (en) * | 2011-08-23 | 2012-12-26 | 北京黄金视讯科技有限公司 | Abandoned object detection method based on computer vision |
CN103034867B (en) * | 2011-09-29 | 2016-12-14 | 蔡宏营 | insect image detection method and insect classification method |
CN103034867A (en) * | 2011-09-29 | 2013-04-10 | 蔡宏营 | Insect image detection method and insect classification method |
CN102663346B (en) * | 2012-03-16 | 2014-04-23 | 安科智慧城市技术(中国)有限公司 | Detection method and system of remnants |
CN102663346A (en) * | 2012-03-16 | 2012-09-12 | 安科智慧城市技术(中国)有限公司 | Detection method and system of remnants |
CN102622758A (en) * | 2012-03-16 | 2012-08-01 | 安科智慧城市技术(中国)有限公司 | Method and system for establishing residue detection background |
CN102622758B (en) * | 2012-03-16 | 2015-11-04 | 安科智慧城市技术(中国)有限公司 | Method and system for creating a residue detection background |
CN103324906A (en) * | 2012-03-21 | 2013-09-25 | 日电(中国)有限公司 | Method and equipment for detecting abandoned object |
CN103324906B (en) * | 2012-03-21 | 2016-09-14 | 日电(中国)有限公司 | Method and apparatus for legacy detection |
CN103425958A (en) * | 2012-05-24 | 2013-12-04 | 信帧电子技术(北京)有限公司 | Method for detecting non-movable objects in video |
CN103425958B (en) * | 2012-05-24 | 2018-10-23 | 信帧机器人技术(北京)有限公司 | Method for detecting motionless objects in video |
CN103455997A (en) * | 2012-06-04 | 2013-12-18 | 深圳大学 | Derelict detection method and system |
CN103455997B (en) * | 2012-06-04 | 2016-05-04 | 深圳大学 | Abandoned object detection method and system |
CN103679123B (en) * | 2012-09-17 | 2018-01-12 | 浙江大华技术股份有限公司 | Method and system for detecting objects left on ATM panels |
CN103679123A (en) * | 2012-09-17 | 2014-03-26 | 浙江大华技术股份有限公司 | Method and system for detecting remnant on ATM (Automatic Teller Machine) panel |
CN103729613A (en) * | 2012-10-12 | 2014-04-16 | 浙江大华技术股份有限公司 | Method and device for detecting video image |
CN103093435B (en) * | 2013-01-27 | 2016-09-21 | 孙建德 | Remnant object detection method in video monitoring based on foreground modeling |
CN103093435A (en) * | 2013-01-27 | 2013-05-08 | 孙建德 | Detection method for remnants in video monitoring based on foreground modeling |
CN103605983B (en) * | 2013-10-30 | 2017-01-25 | 天津大学 | Remnant detection and tracking method |
CN103605983A (en) * | 2013-10-30 | 2014-02-26 | 天津大学 | Remnant detection and tracking method |
CN104050665A (en) * | 2014-06-10 | 2014-09-17 | 华为技术有限公司 | Method and device for estimating foreground dwell time in video image |
CN104050665B (en) * | 2014-06-10 | 2017-07-21 | 华为技术有限公司 | Method and device for estimating foreground dwell time in a video image |
CN104123567A (en) * | 2014-07-21 | 2014-10-29 | 国家电网公司 | Intelligent video recognition method for electric power operation tools and instruments left on site |
CN106408554B (en) * | 2015-07-31 | 2019-07-09 | 富士通株式会社 | Residue detection device, method and system |
CN106408554A (en) * | 2015-07-31 | 2017-02-15 | 富士通株式会社 | Remnant detection apparatus, method and system |
US10212397B2 (en) | 2015-07-31 | 2019-02-19 | Fujitsu Limited | Abandoned object detection apparatus and method and system |
CN106682566A (en) * | 2015-11-09 | 2017-05-17 | 富士通株式会社 | Traffic accident detection method, traffic accident detection device and electronic device |
CN106228572B (en) * | 2016-07-18 | 2019-01-29 | 西安交通大学 | Long-inactivity object detection and tracking method with state marks |
CN106228572A (en) * | 2016-07-18 | 2016-12-14 | 西安交通大学 | Long-inactivity object detection and tracking method with state marks |
CN107918762A (en) * | 2017-10-24 | 2018-04-17 | 江西省高速公路投资集团有限责任公司 | Rapid detection system and method for objects dropped on highways |
CN107918762B (en) * | 2017-10-24 | 2022-01-14 | 江西省高速公路投资集团有限责任公司 | Rapid detection system and method for road scattered objects |
CN107909598A (en) * | 2017-10-28 | 2018-04-13 | 天津大学 | Moving object detection and tracking method based on interprocess communication |
CN110135377A (en) * | 2019-05-21 | 2019-08-16 | 北京百度网讯科技有限公司 | Object moving state detection method, device, server and computer-readable medium |
CN110321808B (en) * | 2019-06-13 | 2021-09-14 | 浙江大华技术股份有限公司 | Method, apparatus and storage medium for detecting carry-over and stolen object |
CN110321808A (en) * | 2019-06-13 | 2019-10-11 | 浙江大华技术股份有限公司 | Method, device and storage medium for detecting left-behind and stolen objects |
CN111369529B (en) * | 2020-03-04 | 2021-05-14 | 厦门星纵智能科技有限公司 | Article loss and leave-behind detection method and system |
CN111369529A (en) * | 2020-03-04 | 2020-07-03 | 厦门脉视数字技术有限公司 | Article loss and leave-behind detection method and system |
CN113393482A (en) * | 2021-06-17 | 2021-09-14 | 中国工商银行股份有限公司 | Method and device for detecting left-over articles based on fusion algorithm |
CN113689430A (en) * | 2021-10-26 | 2021-11-23 | 紫东信息科技(苏州)有限公司 | Image processing method and device for enteroscopy state monitoring |
CN113689430B (en) * | 2021-10-26 | 2022-02-15 | 紫东信息科技(苏州)有限公司 | Image processing method and device for enteroscopy state monitoring |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101231696A (en) | Method and system for detection of hangover | |
JP5518359B2 (en) | Smoke detector | |
JP3123587B2 (en) | Moving object region extraction method using background subtraction | |
CN101794385B (en) | Multi-angle multi-target fast human face tracking method used in video sequence | |
US8189913B2 (en) | Method for detecting shadow of object | |
KR101546933B1 (en) | Apparatus for sensing fire | |
US20090310822A1 (en) | Feedback object detection method and system | |
CN111507232B (en) | Stranger identification method and system based on multi-mode multi-strategy fusion | |
CN103150549A (en) | Highway tunnel fire detecting method based on smog early-stage motion features | |
CN103475800B (en) | Method and device for detecting foreground in image sequence | |
CN104463827B (en) | Automatic testing method for an image capture module and corresponding electronic device |
CN102222349B (en) | Foreground frame detection method based on edge model |
Saglam et al. | Real-time adaptive camera tamper detection for video surveillance | |
US6956485B1 (en) | Fire detection algorithm | |
CN115049955A (en) | Fire detection analysis method and device based on video analysis technology | |
CN104021576A (en) | Method and system for tracking moving objects in scene | |
CN108184098B (en) | Method and system for monitoring safety area | |
KR20160083465A (en) | Multi-lane camera recognition system and learning-based image analysis method therefor |
CN101916380B (en) | Video-based device and method for detecting smog | |
CN104243967A (en) | Image detection method and device | |
JP5286113B2 (en) | Smoke detector | |
CN112364884A (en) | Method for detecting moving object | |
JPH06308256A (en) | Cloudy fog detecting method | |
CN108168375B (en) | Target scoring method and device |
CN111325073A (en) | Monitoring video abnormal behavior detection method based on motion information clustering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C12 | Rejection of a patent application after its publication | ||
RJ01 | Rejection of invention patent application after publication |
Open date: 2008-07-30 |