CN114724123B - Bus passenger flow statistics method based on vehicle-mounted monitoring video - Google Patents

Bus passenger flow statistics method based on vehicle-mounted monitoring video

Info

Publication number
CN114724123B
CN114724123B CN202210330783.4A
Authority
CN
China
Prior art keywords
image
detected
bus
passenger flow
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210330783.4A
Other languages
Chinese (zh)
Other versions
CN114724123A (en)
Inventor
苗阳
张文波
刘志远
杨俊宴
薛新
陈翚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN202210330783.4A
Publication of CN114724123A
Application granted
Publication of CN114724123B
Legal status: Active
Anticipated expiration

Links

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Traffic Control Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a bus passenger flow statistics method based on vehicle-mounted monitoring video, comprising the following steps: acquiring video data of the front-door and rear-door target areas from the front-door and rear-door vehicle-mounted cameras of a bus; extracting images from the video at fixed frame intervals; preprocessing each extracted frame; judging the motion state of boarding and alighting passengers from background-difference information, plotting a per-frame pixel-count waveform, and applying super-threshold judgment and extremum processing to the waveform; matching the motion regions with the portions of the waveform that satisfy the counting condition and updating the initial count; and correcting the initial count for overlapping boarding and alighting passengers by combining a wavelength test and a door-closing-time test to obtain two corrected counts, which are then weighted together with the initial result to give the final count. The invention makes reasonable use of the existing monitoring equipment on buses, reduces the labor cost of bus surveys, and improves the efficiency of counting boarding and alighting passengers.

Description

Bus passenger flow statistics method based on vehicle-mounted monitoring video
Technical Field
The invention belongs to the field of traffic big data, and relates to a bus passenger flow statistical method based on a vehicle-mounted monitoring video.
Background
Urban public transportation is a major component of urban traffic and plays an important role in connecting key urban nodes such as transportation hubs, hospitals and schools, and in promoting urban development. Public transit aims to provide urban residents with a faster, more convenient and more environmentally friendly way to travel, which makes surveys of boarding and alighting passenger flow particularly important for route planning and adjustment. Traditional surveys of boarding and alighting bus passenger flow are carried out either by manual counting on the target route or by mounting cameras on buses of the target route to record and count passengers. The manual approach requires considerable human resources, must be completed jointly by several investigators, and offers no way to re-check the data afterwards. Reviewing the video recorded by dedicated cameras allows repeated inspection and verification, but the cameras and other equipment used in the survey are costly and require additional expenditure.
With the continuous progress of video recognition and image detection technology, various image processing methods have been widely applied in the traffic field. However, directly applying generic image processing techniques to detection in traffic scenes is not sufficient: the machine-learning-based image processing methods that are well established in computer vision are difficult to train, require considerable preparation of the video data and high-quality training sets, and the more refined algorithms place extremely high demands on processing hardware. In addition, for target detection and counting, the interior space of a bus is narrow and the surveillance viewing angle cannot be fully unfolded into a two-dimensional view, so passenger images are very likely to overlap or be missed. The original processing methods therefore need to be improved on the basis of image processing so that the workflow is simpler, the processing is lighter, and the results better match practical needs.
Disclosure of Invention
In order to improve the efficiency of counting boarding and alighting bus passengers, make reasonable use of the existing monitoring equipment on buses, and reduce the labor cost of bus surveys, the invention provides a bus passenger flow statistics method based on vehicle-mounted monitoring video.
The invention adopts the following technical scheme for solving the technical problems:
a bus passenger flow statistical method based on a vehicle-mounted monitoring video comprises the following specific steps:
Step 1, acquiring video data of the front-door and rear-door target areas from vehicle-mounted cameras at the front and rear doors of the bus;
Step 2, extracting images from the video data at fixed frame intervals, binarizing them, and arranging them in time order to obtain a set of images to be detected;
Step 3, intercepting from the video data a section of background video in which no person passes, extracting images from it at the same frame intervals, binarizing them, and arranging them in time order to obtain a background image set, from which a background picture is obtained;
Step 4, performing a difference operation between each image to be detected in the image set and the background picture, and detecting the motion region in each image to be detected;
Step 5, obtaining the number of non-zero pixels of each image to be detected, and plotting a waveform with the frame number of the image to be detected as the abscissa and its number of non-zero pixels as the ordinate;
Step 6, polarizing the waveform and performing a primary count in combination with the motion regions in the images to be detected;
Step 7, performing an overlap test on the primary count of step 6 based on the standard frame length for a single passenger to board or alight;
Step 8, performing an evaluation test of the boarding and alighting passenger flow based on the door opening and closing times from the vehicle-mounted sensor;
Step 9, weighting and summing the primary count of step 6, the overlap test result of step 7, and the evaluation test result of step 8 to obtain the bus boarding and alighting passenger flow statistics.
Further, the target area in step 1 is the no-standing area at the front and rear doors.
Further, the pixel value at position (x, y) of the background picture B in step 3 is B(x, y) = (1/n)·∑_{i=1}^{n} B_i(x, y), where B_i(x, y) is the pixel value at position (x, y) in the i-th background image of the background image set and n is the number of background images.
Further, the specific steps of the difference operation in step 4 are as follows:
41) Acquire the difference image D:
D(x,y) = |f(x,y) - B(x,y)|
where D(x,y) is the pixel value at position (x,y) in the difference image D, f(x,y) is the pixel value at position (x,y) in the image to be detected f, and B(x,y) is the pixel value at position (x,y) in the background picture B;
42) Threshold the difference image D pixel by pixel to obtain an image R, in which pixels with value 0 are judged to be background points:
R(x,y) = 255 if D(x,y) > T, and R(x,y) = 0 otherwise,
where R(x,y) is the pixel value at position (x,y) in the image R and T is the set threshold;
43) If the non-background pixels in R form a connected region, the corresponding region of that connected region in the image to be detected f is a motion region through which a boarding or alighting passenger passes; otherwise, the image to be detected f contains no motion region.
Further, the specific steps of the primary count in step 6 are as follows:
61) Polarize the waveform: if the number of non-zero pixels p(j) of the j-th image to be detected is greater than the set pixel threshold, set the ordinate corresponding to the j-th image to 1; otherwise set it to 0;
62) In the polarized waveform, if a peak lasts longer than the set frame length and every corresponding image to be detected contains a motion region, the count for that peak is 1;
63) In the polarized waveform, the counts of all peak segments are added to obtain the initial count N.
Further, the overlap test in step 7 is specifically:
In the polarized waveform, if the duration of a peak exceeds 1.3 times the standard single-passenger passing frame length, the count for that peak is corrected to δ/t_s rounded to the nearest integer, and the overlap test result N_1 is obtained accordingly, where t_s is the standard single-passenger passing frame length and δ is the duration, in frames, of that peak.
Further, the evaluation test result in step 8 is:
N_2 = (T_a - t_a)/t, where N_2 is the evaluation test result, T_a is the total time from door opening to door closing, t_a is the average time loss for opening and closing the door, and t is the standard time for a single passenger to alight.
Further, the boarding and alighting passenger flow statistic in step 9 is:
N_s = λN + λ_1·N_1 + λ_2·N_2
where N_s is the boarding and alighting passenger flow statistic, and λ, λ_1, λ_2 are the weights of N, N_1, N_2, respectively.
Further, λ = 0.6, λ_1 = 0.2, and λ_2 = 0.3.
Compared with the prior art, the technical scheme provided by the invention has the following technical effects:
1) The bus passenger flow statistics method based on vehicle-mounted monitoring video uses the surveillance video captured by the bus's own monitoring cameras as the data source; the data volume is large and acquisition is easy and cheap, overcoming the low efficiency and the waste of manpower and material resources of current bus passenger flow surveys; it also solves the problem that manual surveys are difficult to re-check afterwards and avoids the high cost of shooting with dedicated cameras;
2) The data processing of the method is simple and fast: it reasonably reduces the number of pictures compared with conventional image processing, changes the conventional equal-length, equal-interval processing approach, reduces the detection of near-identical frames, avoids large-scale video processing, and yields reliable results;
3) The invention does not rely solely on image processing; the count from the first round of image detection is checked and corrected twice according to the spatial characteristics of the bus doors and the one-way movement of boarding and alighting passengers, which increases counting accuracy and avoids counting errors caused by image adhesion and uneven boarding and alighting times;
4) The bus passenger flow statistics method based on vehicle-mounted monitoring video fuses and improves image processing and traffic survey counting, and is an innovation and application in traffic surveying.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Detailed Description
The technical scheme of the invention is further described in detail below with reference to the accompanying drawings and specific embodiments:
In one embodiment, as shown in FIG. 1, the method comprises the following specific steps:
Step 1, acquiring video data of the front-door and rear-door target areas from vehicle-mounted cameras at the front and rear doors of the bus;
Step 2, extracting images from the video data at fixed frame intervals, binarizing them, and arranging them in time order to obtain a set of images to be detected;
Step 3, intercepting from the video data a section of background video in which no person passes, extracting images from it at the same frame intervals, binarizing them, and arranging them in time order to obtain a background image set, from which a background picture is obtained;
Step 4, performing a difference operation between each image to be detected in the image set and the background picture, and detecting the motion region in each image to be detected;
Step 5, obtaining the number of non-zero pixels of each image to be detected, and plotting a waveform with the frame number of the image to be detected as the abscissa and its number of non-zero pixels as the ordinate;
Step 6, polarizing the waveform and performing a primary count in combination with the motion regions in the images to be detected;
Step 7, performing an overlap test on the primary count of step 6 based on the standard frame length for a single passenger to board or alight;
Step 8, performing an evaluation test of the boarding and alighting passenger flow based on the door opening and closing times from the vehicle-mounted sensor;
Step 9, weighting and summing the primary count of step 6, the overlap test result of step 7, and the evaluation test result of step 8 to obtain the bus boarding and alighting passenger flow statistics.
In one embodiment, step 2 is specifically:
21) Acquire surveillance videos of the same time period on multiple days, and perform test counts on them using inter-frame intervals of 2, 4, 6 and 8 frames;
The video data come from the front-door and rear-door surveillance cameras facing the doors of the bus; the region of interest in the video is cropped and rectified to the no-standing area in front of the door, and the recording period is the period required by a conventional bus survey (each survey session is assumed to last one hour). Video whose monitored area does not match these requirements, or video captured by equipment other than the bus's original monitoring cameras, is considered not to meet the video acquisition requirements of the invention.
22) Tests show that, without interference, a single passenger takes about 50 frames to board or alight. To preserve the continuity of the boarding and alighting process, 10% of this duration, about 5 frames, is taken as the sampling interval, so the nearest tested value of 4 frames is adopted as the actual interval for extracting video images.
23) After frame-interval extraction and binarization, the images are labeled with their frame numbers (4, 8, 12, 16, ...) and arranged in time order into the set of images to be detected.
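A minimal sketch of the frame extraction and binarization of step 2, assuming OpenCV and NumPy are available; the interval of 4 frames follows this embodiment, while the file path and the binarization threshold of 127 are illustrative choices rather than values fixed by the method:

    import cv2

    def extract_binary_frames(video_path, interval=4, thresh=127):
        """Extract every `interval`-th frame, convert to gray and binarize."""
        cap = cv2.VideoCapture(video_path)
        frames = []   # list of (frame_number, binary_image) in time order
        idx = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            idx += 1
            if idx % interval == 0:                    # keeps frames 4, 8, 12, 16, ...
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
                frames.append((idx, binary))
        cap.release()
        return frames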
In one embodiment, step 3 obtains a background image set from the background video in which no person passes, in the same manner as step 2, and derives the background picture from it. The pixel value at position (x, y) of the background picture B is B(x, y) = (1/n)·∑_{i=1}^{n} B_i(x, y), where B_i(x, y) is the pixel value at position (x, y) in the i-th background image of the background image set and n is the number of background images.
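A sketch of the background-picture computation of step 3 under the same assumptions; the pixel-wise averaging follows the formula reconstructed above, and background_frames is assumed to be the output of the extraction sketch:

    import numpy as np

    def build_background(background_frames):
        """Average the binarized background frames pixel by pixel to obtain B."""
        stack = np.stack([img.astype(np.float32) for _, img in background_frames])
        return stack.mean(axis=0)   # B(x, y) = (1/n) * sum_i B_i(x, y)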
In one embodiment, the specific steps of the difference operation in step 4 are as follows:
41) Acquire the difference image D:
D(x,y) = |f(x,y) - B(x,y)|
where D(x,y) is the pixel value at position (x,y) in the difference image D, f(x,y) is the pixel value at position (x,y) in the image to be detected f, and B(x,y) is the pixel value at position (x,y) in the background picture B;
42) Set a threshold T and process the pixels of the difference image one by one to obtain an image R: positions where the difference exceeds T are set to gray level 255 and judged to differ from the background, while positions set to gray level 0 are judged to be background points, i.e. the pixel of the current frame is the same as the corresponding pixel of the background image. Each position of R therefore takes only the value 255 or 0:
R(x,y) = 255 if D(x,y) > T, and R(x,y) = 0 otherwise,
where R(x,y) is the pixel value at position (x,y) in the image R and T is the set threshold;
43) If the non-background pixels in R form a connected region, the corresponding region of that connected region in the image to be detected f is a motion region through which a boarding or alighting passenger passes; otherwise, the image to be detected f contains no motion region (i.e. no passenger is boarding or alighting).
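A sketch of steps 41) to 43) under the same assumptions; the threshold T = 30 and the minimum connected-region area used to reject noise are illustrative parameters, not values fixed by the method:

    import cv2
    import numpy as np

    def detect_motion_region(f, B, T=30, min_area=500):
        """Return a motion mask of frame f against background B, or None."""
        D = np.abs(f.astype(np.float32) - B).astype(np.uint8)   # D = |f - B|
        _, R = cv2.threshold(D, T, 255, cv2.THRESH_BINARY)      # 255 = non-background
        # Label connected regions formed by the non-background pixels.
        num, labels, stats, _ = cv2.connectedComponentsWithStats(R)
        for k in range(1, num):                                 # label 0 is background
            if stats[k, cv2.CC_STAT_AREA] >= min_area:
                return labels == k                              # motion region found
        return None                                             # no passenger movement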
In one embodiment, the specific steps of the primary count in step 6 are as follows:
61) Polarize the waveform: if the number of non-zero pixels p(j) of the j-th image to be detected is greater than the set pixel threshold (empirically, the total number of pixels / 6), set the ordinate corresponding to the j-th image to 1; otherwise set it to 0;
62) In the polarized waveform, if a peak lasts longer than the set frame length and every corresponding image to be detected contains a motion region, the count for that peak is 1;
63) In the polarized waveform, the counts of all peak segments are added to obtain the initial count N.
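A sketch of steps 5 and 6 under the same assumptions; the pixel threshold of one sixth of the total pixel count follows the empirical value above, while the minimum peak length of 10 sampled frames is an illustrative stand-in for the "set frame length", and motion_flags is assumed to mark which images contain a motion region:

    import numpy as np

    def initial_count(binary_frames, motion_flags, min_peak_len=10):
        """Steps 5-6: polarize the non-zero-pixel waveform and count its peaks."""
        counts = np.array([int(np.count_nonzero(img)) for _, img in binary_frames])
        pixel_thresh = binary_frames[0][1].size / 6          # empirical threshold
        polarized = (counts > pixel_thresh).astype(int)      # 1 above threshold, else 0

        N, run_start = 0, None
        for j, v in enumerate(np.append(polarized, 0)):      # trailing 0 closes the last run
            if v == 1 and run_start is None:
                run_start = j
            elif v == 0 and run_start is not None:
                run = range(run_start, j)
                # count the peak only if it is long enough and every frame has a motion region
                if len(run) >= min_peak_len and all(motion_flags[k] for k in run):
                    N += 1
                run_start = None
        return N, polarized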
In one embodiment, step 7 checks the waveform peaks against the standard frame length for a single passenger to board or alight and obtains a de-overlapped count. Taking the calculation for alighting passengers as an example, the standard single-passenger passing frame length t_s is obtained as follows:
Let t_1 be the frame length of the complete, uninterrupted alighting process of a single passenger, measured in several videos of the same time period from the passenger's first appearance in the frame to the completion of alighting. To account for the influence of the time period or road section, when the frame length of the complete alighting process is less than 35 frames, a correction coefficient of 1.1 is applied and the frame length is counted as 1.1·t_1; when it exceeds 65 frames, a correction coefficient of 0.9 is applied and it is counted as 0.9·t_1. The average of all corrected frame lengths gives the standard single-passenger passing frame length t_s.
In the polarized waveform, if the duration of a peak exceeds 1.3 times the standard single-passenger passing frame length, the count for that peak is corrected to δ/t_s rounded to the nearest integer, and the overlap test result N_1 is obtained accordingly, where t_s is the standard single-passenger passing frame length and δ is the duration, in frames, of that peak.
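A sketch of the overlap correction of step 7; treating the corrected count as δ/t_s rounded to the nearest integer is an assumption consistent with the reconstruction above, and peak_durations is assumed to hold the duration of each detected peak in frames:

    def overlap_corrected_count(peak_durations, t_s):
        """Step 7: correct peaks longer than 1.3 * t_s for overlapping passengers."""
        N1 = 0
        for delta in peak_durations:        # delta: duration of one peak, in frames
            if delta > 1.3 * t_s:
                N1 += round(delta / t_s)    # assumed correction: nearest integer of delta / t_s
            else:
                N1 += 1                     # a normal peak counts as one passenger
        return N1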
In one embodiment, step 8 uses the door opening and closing times from the vehicle sensor to perform an evaluation test of the number of boarding and alighting passengers and obtain an estimated counting result. The door opening and closing time loss t_a and the standard single-passenger alighting time t are obtained as follows:
Taking the calculation for alighting passengers as an example, the interval between door opening and door closing is obtained from the vehicle's sensing system, and the additional time the door remains open for each additional passenger is recorded;
In the absence of special conditions, the average alighting duration of passengers in the time period is calculated and taken as the standard single-passenger alighting time, and the time loss caused by opening and closing the door is calculated;
The passenger flow result passing the door-closing-time test is calculated as:
N_2 = (T_a - t_a)/t, where N_2 is the evaluation test result, T_a is the total time from door opening to door closing, t_a is the average time loss for opening and closing the door, and t is the standard time for a single passenger to alight.
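A sketch of the door-closing-time evaluation of step 8, using the formula N_2 = (T_a - t_a)/t as reconstructed above; the parameter names are illustrative:

    def door_time_count(T_a, t_a, t):
        """Step 8: estimate the passenger count from the door-open duration.

        T_a: total time from door opening to door closing
        t_a: average time loss of opening and closing the door
        t:   standard alighting time of a single passenger
        """
        return (T_a - t_a) / t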
In one embodiment, the boarding and alighting passenger flow statistic of step 9 is N_s = 0.6·N + 0.2·N_1 + 0.2·N_2.
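A sketch of the weighted fusion of step 9; the default weights follow this embodiment (0.6, 0.2, 0.2) and are kept configurable, since the claims also give λ_2 = 0.3:

    def fused_count(N, N1, N2, weights=(0.6, 0.2, 0.2)):
        """Step 9: weighted fusion of the initial, overlap-tested and door-time counts."""
        lam, lam1, lam2 = weights
        return lam * N + lam1 * N1 + lam2 * N2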
The foregoing is merely illustrative of the present invention, and the scope of the invention is not limited thereto; any modification or replacement readily conceivable within the technical scope disclosed by the invention shall fall within the scope of protection of the invention, which shall be defined by the appended claims.

Claims (8)

1. A bus passenger flow statistics method based on vehicle-mounted monitoring video, characterized by comprising the following specific steps:
Step 1, acquiring video data of the front-door and rear-door target areas from vehicle-mounted cameras at the front and rear doors of the bus;
Step 2, extracting images from the video data at fixed frame intervals, binarizing them, and arranging them in time order to obtain a set of images to be detected;
Step 3, intercepting from the video data a section of background video in which no person passes, extracting images from it at the same frame intervals, binarizing them, and arranging them in time order to obtain a background image set, from which a background picture is obtained;
Step 4, performing a difference operation between each image to be detected in the image set and the background picture, and detecting the motion region in each image to be detected; the specific steps are as follows:
41) acquiring the difference image D:
D(x,y) = |f(x,y) - B(x,y)|
where D(x,y) is the pixel value at position (x,y) in the difference image D, f(x,y) is the pixel value at position (x,y) in the image to be detected f, and B(x,y) is the pixel value at position (x,y) in the background picture B;
42) thresholding the difference image D pixel by pixel to obtain an image R, in which pixels with value 0 are judged to be background points:
R(x,y) = 255 if D(x,y) > T, and R(x,y) = 0 otherwise,
where R(x,y) is the pixel value at position (x,y) in the image R and T is the set threshold;
43) if the non-background pixels in R form a connected region, the corresponding region of that connected region in the image to be detected f is a motion region through which a boarding or alighting passenger passes; otherwise, the image to be detected f contains no motion region;
Step 5, obtaining the number of non-zero pixels of each image to be detected, and plotting a waveform with the frame number of the image to be detected as the abscissa and its number of non-zero pixels as the ordinate;
Step 6, polarizing the waveform and performing a primary count in combination with the motion regions in the images to be detected;
Step 7, performing an overlap test on the primary count of step 6 based on the standard frame length for a single passenger to board or alight;
Step 8, performing an evaluation test of the boarding and alighting passenger flow based on the door opening and closing times from the vehicle-mounted sensor;
Step 9, weighting and summing the primary count of step 6, the overlap test result of step 7, and the evaluation test result of step 8 to obtain the bus boarding and alighting passenger flow statistics.
2. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the target area in step 1 is the no-standing area at the front and rear doors.
3. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the pixel value at position (x, y) of the background picture B in step 3 is B(x, y) = (1/n)·∑_{i=1}^{n} B_i(x, y), where B_i(x, y) is the pixel value at position (x, y) in the i-th background image of the background image set and n is the number of background images.
4. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the specific steps of the primary count in step 6 are as follows:
61) polarizing the waveform: if the number of non-zero pixels p(j) of the j-th image to be detected is greater than the set pixel threshold, setting the ordinate corresponding to the j-th image to 1, and otherwise setting it to 0;
62) in the polarized waveform, if a peak lasts longer than the set frame length and every corresponding image to be detected contains a motion region, the count for that peak is 1;
63) in the polarized waveform, adding the counts of all peak segments to obtain the initial count N.
5. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the overlap test in step 7 is specifically:
in the polarized waveform, if the duration of a peak exceeds 1.3 times the standard single-passenger passing frame length, the count for that peak is corrected to δ/t_s rounded to the nearest integer, and the overlap test result N_1 is obtained accordingly, where t_s is the standard single-passenger passing frame length and δ is the duration, in frames, of that peak.
6. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the evaluation test result in step 8 is:
N_2 = (T_a - t_a)/t, where N_2 is the evaluation test result, T_a is the total time from door opening to door closing, t_a is the average time loss for opening and closing the door, and t is the standard time for a single passenger to alight.
7. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 1, wherein the boarding and alighting passenger flow statistic in step 9 is:
N_s = λN + λ_1·N_1 + λ_2·N_2
where N_s is the boarding and alighting passenger flow statistic, and λ, λ_1, λ_2 are the weights of N, N_1, N_2, respectively.
8. The bus passenger flow statistics method based on vehicle-mounted monitoring video according to claim 7, wherein λ = 0.6, λ_1 = 0.2, and λ_2 = 0.3.
CN202210330783.4A 2022-03-30 2022-03-30 Bus passenger flow statistics method based on vehicle-mounted monitoring video Active CN114724123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210330783.4A CN114724123B (en) 2022-03-30 2022-03-30 Bus passenger flow statistics method based on vehicle-mounted monitoring video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210330783.4A CN114724123B (en) 2022-03-30 2022-03-30 Bus passenger flow statistics method based on vehicle-mounted monitoring video

Publications (2)

Publication Number Publication Date
CN114724123A (en) 2022-07-08
CN114724123B (en) 2024-04-23

Family

ID=82239432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210330783.4A Active CN114724123B (en) 2022-03-30 2022-03-30 Bus passenger flow statistics method based on vehicle-mounted monitoring video

Country Status (1)

Country Link
CN (1) CN114724123B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115223092B (en) * 2022-07-15 2023-11-14 广东万龙科技有限公司 Video monitoring system and method under big data scene

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103871082A (en) * 2014-03-31 2014-06-18 百年金海科技有限公司 Method for counting people stream based on security and protection video image
CN104156983A (en) * 2014-08-05 2014-11-19 天津大学 Public transport passenger flow statistical method based on video image processing
CN108038432A (en) * 2017-11-30 2018-05-15 中国人民解放军国防科技大学 Bus pedestrian flow statistical method and system based on optical flow counting
CN112102290A (en) * 2020-09-15 2020-12-18 广州市几米物联科技有限公司 Passenger flow statistical method, system and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6625310B2 (en) * 2001-03-23 2003-09-23 Diamondback Vision, Inc. Video segmentation using statistical pixel modeling

Also Published As

Publication number Publication date
CN114724123A (en) 2022-07-08

Similar Documents

Publication Publication Date Title
CN110261436B (en) Rail fault detection method and system based on infrared thermal imaging and computer vision
CN107507221A (en) With reference to frame difference method and the moving object detection and tracking method of mixed Gauss model
CN106897698A (en) Classroom number detection method and system based on machine vision Yu binocular coordination technique
CN104486618A (en) Video image noise detection method and device
CN103093458B (en) The detection method of key frame and device
CN114724123B (en) Bus passenger flow statistics method based on vehicle-mounted monitoring video
CN106815583A (en) A kind of vehicle at night license plate locating method being combined based on MSER and SWT
CN105374051B (en) The anti-camera lens shake video moving object detection method of intelligent mobile terminal
CN105657435A (en) Single video frame copy and paste tamper detection method based on quantized DCT coefficient
CN106339657A (en) Straw incineration monitoring method and device based on monitoring video
CN104463104B (en) A kind of stationary vehicle target rapid detection method and device
CN104408728A (en) Method for detecting forged images based on noise estimation
CN107944354A (en) A kind of vehicle checking method based on deep learning
CN105913002A (en) On-line adaptive abnormal event detection method under video scene
CN104182983A (en) Highway monitoring video definition detection method based on corner features
CN109271904A (en) A kind of black smoke vehicle detection method based on pixel adaptivenon-uniform sampling and Bayesian model
Zhiwei et al. New method of background update for video-based vehicle detection
CN104168462B (en) Camera scene change detection method based on image angle point set feature
CN114708532A (en) Monitoring video quality evaluation method, system and storage medium
Babu et al. An efficient image dahazing using Googlenet based convolution neural networks
CN106254723A (en) A kind of method of real-time monitoring video noise interference
CN106997670A (en) Real-time sampling of traffic information system based on video
CN112070048B (en) Vehicle attribute identification method based on RDSNet
CN106375773B (en) Altering detecting method is pasted in frame duplication based on dynamic threshold
CN114627434A (en) Automobile sales exhibition room passenger flow identification system based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant