CN107705577B - Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line - Google Patents

Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line

Info

Publication number
CN107705577B
CN107705577B (application CN201711026761.4A)
Authority
CN
China
Prior art keywords
vehicle
size
tracking
lane line
lane
Prior art date
Legal status
Active
Application number
CN201711026761.4A
Other languages
Chinese (zh)
Other versions
CN107705577A (en
Inventor
李松斌 (Li Songbin)
杨洁 (Yang Jie)
赵思奇 (Zhao Siqi)
刘鹏 (Liu Peng)
Current Assignee
Nanhai Research Station Institute Of Acoustics Chinese Academy Of Sciences
Institute of Acoustics CAS
Original Assignee
Institute of Acoustics CAS
Priority date
Filing date
Publication date
Application filed by Institute of Acoustics CAS filed Critical Institute of Acoustics CAS
Priority to CN201711026761.4A priority Critical patent/CN107705577B/en
Publication of CN107705577A publication Critical patent/CN107705577A/en
Application granted granted Critical
Publication of CN107705577B publication Critical patent/CN107705577B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a real-time method for detecting illegal vehicle lane changes based on calibrated lane lines, which comprises the following steps: step 1) calibrating the lane lines L, and computing the detection rectangle D and the image I; step 2) detecting, based on a deep convolutional neural network, the vehicle set A = {A1, A2, ..., Ai} of the image I within the detection rectangle D of step 1); step 3) screening out from A the vehicle set B = {B1, B2, ..., Bi} of vehicles that intersect a lane line L, where B ⊆ A, matching B against the tracked-vehicle list TL, and updating TL = {T1, T2, ..., Tj}; step 4) judging whether each tracked vehicle Tj in the updated tracked-vehicle list TL is an illegal lane-changing vehicle; if the tracked vehicle Tj is an illegal lane-changing vehicle, marking Tj and deleting the information of Tj from the updated tracked-vehicle list TL.

Description

Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line
Technical Field
The invention belongs to the technical fields of intelligent traffic systems and image recognition, and particularly relates to a real-time method and system for detecting illegal vehicle lane changes based on calibrated lane lines.
Background
With the progress and development of society, the number of urban motor vehicles keeps rising, and traffic accidents have grown more serious: the economic losses and deaths caused by road traffic accidents have risen continuously in recent years. Controlling traffic accidents has become an increasingly important problem for traffic management departments, and the leading cause of such accidents is illegal driving behavior. As a common traffic violation, an illegal lane change not only causes congestion but may even lead to a serious accident. To reduce the accident rate, traffic management departments continue to promote intelligent traffic management systems, in which lane-change-violation detection based on surveillance video is a key component; research on detecting illegal vehicle lane changes from surveillance video is therefore necessary.
Existing methods for detecting illegal lane changes fall mainly into two categories. The first is based on spatial distance, such as laser, infrared and ultrasonic detection; these methods suffer from expensive equipment, small spatial coverage, mutual interference between devices, and inability to handle occlusion. The second is based on computer vision, which offers simple installation and maintenance, high visibility and high detection accuracy.
Current computer-vision methods for detecting illegal lane changes extract the set of moving targets, identify the moving vehicles and their positions, and finally judge whether the dispersion of each vehicle's trajectory exceeds a set threshold to decide whether it changed lanes illegally. These methods work well when there are few vehicles on the road and they move quickly, but they cannot handle violations when vehicles move slowly or the road is crowded. They are also easily affected by illumination, occlusion, shadows and video jitter, and when many vehicles are present, tracking each one is too computationally expensive to meet real-time requirements. A real-time method for detecting illegal lane changes that is widely applicable, accurate and fast is therefore urgently needed.
Disclosure of Invention
The invention aims to overcome the defects of existing real-time methods for detecting illegal vehicle lane changes, and provides a real-time detection method based on calibrated lane lines. The method can quickly and accurately identify illegal lane-changing vehicles in video.
In order to achieve the aim, the invention provides a lane change violation real-time detection method based on lane line calibration, which specifically comprises the following steps:
step 1) calibrating a lane line L, and calculating and acquiring a detection rectangular area D and an image I;
step 2) detecting, based on a deep convolutional neural network, the vehicle set A = {A1, A2, ..., Ai} of the image I within the detection rectangle D of step 1);
Step 3) screening out, from the vehicle set A obtained in step 2), the vehicle set B = {B1, B2, ..., Bi} of vehicles that intersect the lane line L, where B ⊆ A; matching B against the tracked-vehicle list TL and updating TL = {T1, T2, ..., Tj};
Step 4) judging whether each tracked vehicle Tj in the updated tracked-vehicle list TL is an illegal lane-changing vehicle; if the tracked vehicle Tj is an illegal lane-changing vehicle, marking Tj and deleting the information of Tj from the updated tracked-vehicle list TL.
The step 1) specifically comprises the following steps:
step 1-1) calibrating the lane lines L = {L1, L2, ..., Lj}; the position of each lane line Lj comprises: a starting point pbj = {xbj, ybj} and an end point pej = {xej, yej}, where xbj, ybj are the horizontal and vertical coordinates of the starting point, and xej, yej are the horizontal and vertical coordinates of the end point;
step 1-2) calculating the detection area rectangle D = {x, y, w, h} according to the lane line positions of step 1-1), wherein,
x=min(xb1,xe1,xb2,xe2,...,xbj,xej)
y=min(yb1,ye1,yb2,ye2,...,ybj,yej)
w=max(xb1,xe1,xb2,xe2,...,xbj,xej)-x
h=max(yb1,ye1,yb2,ye2,...,ybj,yej)-y
the step 2) specifically comprises the following steps:
according to the detection area D and the image I of step 1), the detection area D of image I is normalized to 576×576 as the input image, and the vehicle set A = {A1, A2, ..., Ai} is detected in image I based on the deep convolutional neural network; the position of Ai is denoted PAi = {xi, yi, wi, hi}. The detection effect is shown in FIG. 2(b);
wherein the convolutional neural network comprises: 7 convolutional layers, 4 downsampling layers, 2 fully-connected layers and one output layer; scanning boundaries in the convolutional layers are automatically zero-padded, and neurons are activated with the Leaky-ReLU function; max pooling is used in the downsampling layers.
The convolutional layer C1 has 32 convolution kernels of size 9×9 with stride 2, producing 288×288 feature maps; the downsampling layer S1 has a 4×4 window with stride 4, producing 72×72 feature maps; C2 has 4 kernels of size 1×1 with stride 1, producing 72×72 feature maps; C3 has 8 kernels of size 3×3 with stride 1, producing 72×72 feature maps; S2 has a 2×2 window with stride 2, producing 36×36 feature maps; C4 has 8 kernels of size 1×1 with stride 1, producing 36×36 feature maps; C5 has 16 kernels of size 3×3 with stride 1, producing 36×36 feature maps; S3 has a 2×2 window with stride 2, producing 18×18 feature maps; C6 has 32 kernels of size 3×3 with stride 1, producing 18×18 feature maps; S4 has a 2×2 window with stride 2, producing 9×9 feature maps; C7 has 64 kernels of size 1×1 with stride 1, producing 9×9 feature maps. The fully-connected layer F1 consists of 256 neurons activated with the ReLU function; the fully-connected layer F2 consists of 4096 neurons activated with the Leaky-ReLU function; the output layer consists of 891 neurons activated with the ReLU function.
The step 3) specifically comprises the following steps:
step 3-1) screening out, from the vehicle set A = {A1, A2, ..., Ai} obtained in step 2), the vehicle set B = {B1, B2, ..., Bi} of vehicles that intersect the lane line L, where B ⊆ A;
step 3-2) traversing the tracked-vehicle list TL = {T1, T2, ..., Tj} against the vehicle set B obtained in step 3-1), and selecting the vehicle set C = {C1, C2, ..., Cm} with overlap ratio Uij > 0.2, where C ⊆ B; computing the HOG features hogj and hogk and then the cosine distance Cosjk; selecting the vehicle Cm in C with the largest cosine distance Cosjm to the tracked vehicle Tj; if Cosjm > 0.75, Tj and Cm match each other and the method proceeds to step 3-3); if Cosjm ≤ 0.75, no vehicle in C matches the tracked vehicle Tj and the method proceeds to step 3-4);
step 3-3) according to the vehicle Cm obtained in step 3-2), updating the position of the tracked vehicle Tj to the current vehicle position, i.e. TPj = CPm; updating the tracked-vehicle picture to the current vehicle picture, i.e. TMj = CMm; resetting the consecutive-loss count TTj = 0; updating the horizontal directed distance TDj between the tracking point and the lane lines L to that of the current tracking point, i.e. TDj = CDm; Cm is then matched, and the current vehicle's matched flag is set to CFi = true;
Step 3-4) updating the tracked vehicle TjNumber of continuous lost times TTj', wherein TTj’=TTj+1, if TTj' if the number of consecutive lost tracking is more than 3, the vehicle T is deleted from the tracked vehicle listj
Step 3-5) traversing the vehicle set B, and enabling the unmatched vehicles Bu={BPu,BMu,BFu,BQu,BDuAs a new tracked vehicle Tu={TPu,TMu,TTu,TODu,TDuAdds to the tracked vehicle list TL, updates the vehicle tracking list TL, wherein,
TPu=BPu
TMu=BMu
TTu=0
TODu=BDu
TDu=BDu
the step 3-1) specifically comprises the following steps:
step 3-1-1) according to the lane lines L obtained in step 1) and the vehicle Ai position rectangle PAi obtained in step 2), calculating the intersections of the lane line Lj with the four sides of the position rectangle PAi; the intersections of Lj with the left, right, upper and lower edges of the rectangle are plij = {xlij, ylij}, prij = {xrij, yrij}, puij = {xuij, yuij} and pdij = {xdij, ydij};
wherein {xlij, ylij} are the horizontal and vertical coordinates of the left-edge intersection, {xrij, yrij} those of the right-edge intersection, {xuij, yuij} those of the upper-edge intersection, and {xdij, ydij} those of the lower-edge intersection.
Step 3-1-2) according to the intersection point of the lane line and the rectangle obtained in the step 3-1-1), if the intersection point is on the upper side and the left side of the rectangle and meets the following preset conditions, the vehicle rectangle A is used foriAdding the vehicle into the vehicle set B;
xi+wi/4<xuij<xi+wi
yi+hi/2<ylij<yi+hi
step 3-1-3) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections lie on the upper and right sides of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi<xuij<xi+3wi/4
yi+hi/2<yrij<yi+hi
step 3-1-4) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections lie on the upper and lower sides of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi+wi/4<xuij<xi+3wi/4
xi+wi/4<xdij<xi+3wi/4
the step 3-2) specifically comprises the following steps:
step 3-2-1) initializing vehicle information according to the vehicle set B obtained in step 3-1), the vehicle information comprising: the vehicle position BPi = {xi, yi, wi, hi}, the vehicle picture BMi, the current vehicle's matched flag BFi = false, the tracking point BQi = (xi + wi/2, yi + 4hi/5), and the horizontal directed distance Bdij between the tracking point BQi and each lane line Lj; i.e. vehicle Bi = {BPi, BMi, BFi, BQi, BDi}, where BDi = {Bdi1, Bdi2, ..., Bdil};
step 3-2-2) traversing the tracked-vehicle list TL = {T1, T2, ..., Tj}, Tj = {TPj, TMj, TTj, TODj, TDj}, Tj ∈ TL, where TPj denotes the vehicle position, TMj the vehicle picture, TTj the consecutive-loss count, TODj the horizontal directed distances between the starting tracking point and the lane lines L, and TDj the horizontal directed distances between the current tracking point and the lane lines L; TODj = {TOdj1, TOdj2, ..., TOdjl}, TDj = {Tdj1, Tdj2, ..., Tdjl}; computing the overlap ratio Uij between the tracked vehicle Tj's position TPj and each unmatched vehicle position BPi in the vehicle set B:
Uij = Scij / Szij
where Szij is the area of the union of the rectangles BPi and TPj, and Scij is the area of their intersection;
selecting the vehicle set C = {C1, C2, ..., Cm} with overlap ratio Uij > 0.2, where C ⊆ B; if C is empty, no vehicle in B matches the tracked vehicle Tj and the method proceeds to step 3-4); if C is not empty, proceeding to step 3-2-3);
step 3-2-3) according to the vehicle set C = {C1, C2, ..., Cm} obtained in step 3-2-2), normalizing the tracked vehicle Tj's picture TMj and each vehicle picture CMk in C to 64×64, converting them to grayscale, and computing the HOG features hogj and hogk; in the calculation, the window size is 64×64, the block size 16×16, the block sliding increment 8×8, the cell size 8×8, and the number of gradient-histogram bins per cell 9;
step 3-2-4) computing the cosine distance Cosjk from the HOG features hogj and hogk obtained in step 3-2-3);
step 3-2-5) according to the cosine distances Cosjk obtained in step 3-2-4), selecting the vehicle Cm in C with the largest cosine distance Cosjm to the tracked vehicle Tj; if Cosjm > 0.75, Tj and Cm match each other and the method proceeds to step 3-3); if Cosjm ≤ 0.75, no vehicle in C matches the tracked vehicle Tj and the method proceeds to step 3-4).
The step 4) specifically comprises the following steps:
step 4-1) traversing the updated tracked-vehicle list TL obtained in step 3-5) and computing, for each Tj, the horizontal directed distance TOd'jj between its starting tracking point and the lane line Lj and the horizontal directed distance Td'jj between its current tracking point and Lj; if TOd'jj · Td'jj < 0, the vehicle has successively been located on the two sides of the lane line Lj, and Tj is an illegal lane-changing vehicle;
step 4-2) after the illegal lane-changing vehicle Tj is obtained in step 4-1), marking the tracked vehicle Tj and deleting the information of Tj from the updated tracked-vehicle list TL.
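The lane-crossing test of step 4-1) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names and the linear-interpolation form of the "horizontal directed distance" are our assumptions, consistent with the signed-distance sign-flip criterion TOd'jj · Td'jj < 0 stated above.

```python
def horizontal_directed_distance(qx, qy, pb, pe):
    """Horizontal signed distance from a tracking point (qx, qy) to the lane
    line through start pb=(xb, yb) and end pe=(xe, ye): the x offset of the
    point from the line at height qy, positive on one side, negative on the
    other. (Assumed form; the patent only names the quantity.)"""
    (xb, yb), (xe, ye) = pb, pe
    if ye == yb:  # degenerate horizontal lane line: no horizontal offset defined
        return 0.0
    # x coordinate of the lane line at height qy, by linear interpolation
    x_on_line = xb + (qy - yb) * (xe - xb) / (ye - yb)
    return qx - x_on_line

def is_illegal_lane_change(start_dist, current_dist):
    """Step 4-1): the vehicle has been on both sides of lane line Lj exactly
    when the starting and current directed distances have opposite signs."""
    return start_dist * current_dist < 0
```

For a vertical lane line from (100, 0) to (100, 200), a tracking point that moves from x = 80 to x = 120 yields distances -20 and +20, whose product is negative, so the vehicle is flagged.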
A real-time detection system for vehicle lane-change violations based on calibrated lane lines is an intelligent traffic management system comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, characterized in that the steps of the above detection method are implemented when the processor executes the program.
The invention has the advantages that:
the method is based on the deep convolutional neural network to detect the vehicle set in the image, and compared with a method for extracting the vehicle set by utilizing the motion characteristics, the method is higher in accuracy and wider in application range. In addition, the method screens out the vehicles intersected with the lane line and then carries out matching judgment, does not need to record and judge the motion track of each vehicle, has high detection speed and can meet the requirement of real-time identification.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of a method for detecting lane change violation of a vehicle in real time based on lane line calibration according to the present invention;
fig. 2(a) is a schematic diagram of the effect of setting a lane line in step 1) of the method for detecting the violation lane change of a vehicle based on lane line calibration in the embodiment of the invention;
fig. 2(b) is a schematic diagram of the effect of detecting a vehicle set in step 2) of the method for detecting the violation lane change of a vehicle based on lane line calibration in the embodiment of the invention;
fig. 2(c) is a schematic diagram of the effect of the vehicle set which is screened out and intersects with the lane line in step 3) of the lane violation lane change real-time detection method based on lane line calibration in the embodiment of the present invention;
fig. 2(d) is a schematic diagram of the effect of marking and tracking a violation lane-changing vehicle set in a vehicle list by step 5) of the method for detecting the violation lane-changing of the vehicle based on lane line calibration in the embodiment of the invention;
fig. 2(e) is a schematic diagram of the effect of the violation vehicle driving away state in step 5) of the method for detecting the violation lane change in real time based on lane line calibration in the embodiment of the invention;
fig. 3 is a network configuration diagram of a vehicle detection model in the embodiment of the present invention.
Detailed Description
As shown in fig. 1, the invention provides a lane change violation real-time detection method based on lane line calibration, which specifically comprises the following steps:
step 1) calibrating a lane line L, and calculating and acquiring a detection rectangular area D and an image I;
step 2) detecting, based on a deep convolutional neural network, the vehicle set A = {A1, A2, ..., Ai} of the image I within the detection rectangle D of step 1);
Step 3) screening out, from the vehicle set A obtained in step 2), the vehicle set B = {B1, B2, ..., Bi} of vehicles that intersect the lane line L, where B ⊆ A; matching B against the tracked-vehicle list TL and updating TL = {T1, T2, ..., Tj};
Step 4) judging whether each tracked vehicle Tj in the updated tracked-vehicle list TL is an illegal lane-changing vehicle; if the tracked vehicle Tj is an illegal lane-changing vehicle, marking Tj and deleting the information of Tj from the updated tracked-vehicle list TL.
The step 1) specifically comprises the following steps:
step 1-1) calibrating the lane lines L = {L1, L2, ..., Lj}; the position of each lane line Lj comprises: a starting point pbj = {xbj, ybj} and an end point pej = {xej, yej}, as shown in FIG. 2(a); where xbj, ybj are the horizontal and vertical coordinates of the starting point, and xej, yej are the horizontal and vertical coordinates of the end point;
step 1-2) calculating the detection area rectangle D = {x, y, w, h} according to the lane line positions of step 1-1), wherein,
x=min(xb1,xe1,xb2,xe2,...,xbj,xej)
y=min(yb1,ye1,yb2,ye2,...,ybj,yej)
w=max(xb1,xe1,xb2,xe2,...,xbj,xej)-x
h=max(yb1,ye1,yb2,ye2,...,ybj,yej)-y
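The bounding-rectangle computation of step 1-2) can be sketched directly from the four formulas above. A minimal sketch: the function name and the ((xb, yb), (xe, ye)) tuple representation of a lane line are illustrative assumptions.

```python
def detection_rectangle(lane_lines):
    """Step 1-2): bounding rectangle D = (x, y, w, h) of all calibrated
    lane-line endpoints. Each lane line is ((xb, yb), (xe, ye))."""
    xs = [x for (xb, yb), (xe, ye) in lane_lines for x in (xb, xe)]
    ys = [y for (xb, yb), (xe, ye) in lane_lines for y in (yb, ye)]
    x, y = min(xs), min(ys)  # top-left corner: minima over all endpoints
    # width/height: maxima over all endpoints minus the corner
    return (x, y, max(xs) - x, max(ys) - y)
```

For example, two lane lines from (10, 20) to (30, 200) and from (50, 25) to (80, 210) give D = (10, 20, 70, 190).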
the step 2) specifically comprises the following steps:
normalizing the detection area D in the image I to 576 multiplied by 576 as an input image according to the detection area D and the image I in the step 1), and detecting a vehicle set A ═ { A } in the image I based on a deep convolutional neural network1,A2,...,Ai},AiPosition of (D) is denoted as PAi={xi,yi,wi,hiThe detection effect is shown in FIG. 2 (b);
As shown in FIG. 3, the convolutional neural network comprises: 7 convolutional layers, 4 downsampling layers, 2 fully-connected layers and one output layer; scanning boundaries in the convolutional layers are automatically zero-padded, and neurons are activated with the Leaky-ReLU function; max pooling is used in the downsampling layers.
The convolutional layer C1 has 32 convolution kernels of size 9×9 with stride 2, producing 288×288 feature maps; the downsampling layer S1 has a 4×4 window with stride 4, producing 72×72 feature maps; C2 has 4 kernels of size 1×1 with stride 1, producing 72×72 feature maps; C3 has 8 kernels of size 3×3 with stride 1, producing 72×72 feature maps; S2 has a 2×2 window with stride 2, producing 36×36 feature maps; C4 has 8 kernels of size 1×1 with stride 1, producing 36×36 feature maps; C5 has 16 kernels of size 3×3 with stride 1, producing 36×36 feature maps; S3 has a 2×2 window with stride 2, producing 18×18 feature maps; C6 has 32 kernels of size 3×3 with stride 1, producing 18×18 feature maps; S4 has a 2×2 window with stride 2, producing 9×9 feature maps; C7 has 64 kernels of size 1×1 with stride 1, producing 9×9 feature maps. The fully-connected layer F1 consists of 256 neurons activated with the ReLU function; the fully-connected layer F2 consists of 4096 neurons activated with the Leaky-ReLU function; the output layer consists of 891 neurons activated with the ReLU function.
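The feature-map sizes stated above can be checked with the standard output-size formula floor((n + 2p - k)/s) + 1. The padding values below are our assumption, chosen so that the sizes reproduce the text (the patent only says boundaries are "automatically filled with 0").

```python
def conv_out(size, kernel, stride, pad):
    """Output spatial size of a conv/pool layer: floor((n + 2p - k)/s) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

# (name, kernel, stride, pad) for each layer of the network, in order;
# pads are assumed values that reproduce the sizes stated in the text.
layers = [
    ("C1", 9, 2, 4), ("S1", 4, 4, 0), ("C2", 1, 1, 0), ("C3", 3, 1, 1),
    ("S2", 2, 2, 0), ("C4", 1, 1, 0), ("C5", 3, 1, 1), ("S3", 2, 2, 0),
    ("C6", 3, 1, 1), ("S4", 2, 2, 0), ("C7", 1, 1, 0),
]

size = 576  # normalized input size from step 2)
sizes = {}
for name, k, s, p in layers:
    size = conv_out(size, k, s, p)
    sizes[name] = size
```

Running this reproduces the progression 576 → 288 → 72 → 72 → 72 → 36 → 36 → 36 → 18 → 18 → 9 → 9 described in the text.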
The step 3) specifically comprises the following steps:
step 3-1) screening out, from the vehicle set A = {A1, A2, ..., Ai} obtained in step 2), the vehicle set B = {B1, B2, ..., Bi} of vehicles that intersect the lane line L, where B ⊆ A;
step 3-2) traversing the tracked-vehicle list TL = {T1, T2, ..., Tj} against the vehicle set B obtained in step 3-1), and selecting the vehicle set C = {C1, C2, ..., Cm} with overlap ratio Uij > 0.2, where C ⊆ B; computing the HOG features hogj and hogk and then the cosine distance Cosjk; selecting the vehicle Cm in C with the largest cosine distance Cosjm to the tracked vehicle Tj; if Cosjm > 0.75, Tj and Cm match each other and the method proceeds to step 3-3); if Cosjm ≤ 0.75, no vehicle in C matches the tracked vehicle Tj and the method proceeds to step 3-4);
step 3-3) according to the vehicle Cm obtained in step 3-2), updating the position of the tracked vehicle Tj to the current vehicle position, i.e. TPj = CPm; updating the tracked-vehicle picture to the current vehicle picture, i.e. TMj = CMm; resetting the consecutive-loss count TTj = 0; updating the horizontal directed distance TDj between the tracking point and the lane lines L to that of the current tracking point, i.e. TDj = CDm; Cm is then matched, and the current vehicle's matched flag is set to CFi = true;
step 3-4) updating the tracked vehicle Tj's consecutive-loss count TTj' = TTj + 1; if TTj' > 3, deleting the vehicle Tj from the tracked-vehicle list;
step 3-5) traversing the vehicle set B, adding each unmatched vehicle Bu = {BPu, BMu, BFu, BQu, BDu} to the tracked-vehicle list TL as a new tracked vehicle Tu = {TPu, TMu, TTu, TODu, TDu}, and updating the tracked-vehicle list TL, wherein,
TPu=BPu
TMu=BMu
TTu=0
TODu=BDu
TDu=BDu
the step 3-1) specifically comprises the following steps:
step 3-1-1) according to the lane lines L obtained in step 1) and the vehicle Ai position rectangle PAi obtained in step 2), calculating the intersections of the lane line Lj with the four sides of the position rectangle PAi; the intersections of Lj with the left, right, upper and lower edges of the rectangle are plij = {xlij, ylij}, prij = {xrij, yrij}, puij = {xuij, yuij} and pdij = {xdij, ydij};
wherein {xlij, ylij} are the horizontal and vertical coordinates of the left-edge intersection, {xrij, yrij} those of the right-edge intersection, {xuij, yuij} those of the upper-edge intersection, and {xdij, ydij} those of the lower-edge intersection.
Step 3-1-2) according to the intersection point of the lane line and the rectangle obtained in the step 3-1-1), if the intersection point is on the upper side and the left side of the rectangle and meets the following preset conditions, the vehicle rectangle A is used foriAdding the vehicle into the vehicle set B;
xi+wi/4<xuij<xi+wi
yi+hi/2<ylij<yi+hi
step 3-1-3) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections lie on the upper and right sides of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi<xuij<xi+3wi/4
yi+hi/2<yrij<yi+hi
step 3-1-4) according to the intersection point of the lane line and the rectangle obtained in the step 3-1-1), if the intersection point is positioned at the upper side and the lower side of the rectangle and meets the following preset conditions, the vehicle rectangle A is usediAdding the vehicle into the vehicle set B;
xi+wi/4<xuij<xi+3wi/4
xi+wi/4<xdij<xi+3wi/4。
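The three screening conditions of steps 3-1-2) to 3-1-4) can be sketched as follows. The function name, the (x, y, w, h) tuple layout and the optional-argument handling are illustrative assumptions; the inequalities follow the text literally.

```python
def intersects_lane_line(rect, xu=None, xd=None, yl=None, yr=None):
    """Return True if the lane line's edge intersections with the vehicle
    rectangle satisfy any of the screening conditions 3-1-2) to 3-1-4).

    rect   -- (x, y, w, h) of the vehicle rectangle Ai
    xu, xd -- abscissas of the upper/lower edge intersections (or None)
    yl, yr -- ordinates of the left/right edge intersections (or None)
    """
    x, y, w, h = rect
    # Step 3-1-2): intersections on the upper and left edges
    if xu is not None and yl is not None:
        if x + w / 4 < xu < x + w and y + h / 2 < yl < y + h:
            return True
    # Step 3-1-3): intersections on the upper and right edges
    if xu is not None and yr is not None:
        if x < xu < x + 3 * w / 4 and y + h / 2 < yr < y + h:
            return True
    # Step 3-1-4): intersections on the upper and lower edges
    if xu is not None and xd is not None:
        if x + w / 4 < xu < x + 3 * w / 4 and x + w / 4 < xd < x + 3 * w / 4:
            return True
    return False
```

A vehicle rectangle is kept in the set B as soon as one of the three conditions holds, e.g. `intersects_lane_line((0, 0, 100, 100), xu=50, xd=50)` is true because the lane line passes through both horizontal edges near the rectangle's centre.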
The step 3-2) specifically comprises the following steps:
Step 3-2-1) initializing vehicle information according to the vehicle set B obtained in step 3-1), wherein the vehicle information comprises: the vehicle position BPi = {xi, yi, wi, hi}, the vehicle picture BMi, the matched flag of the current vehicle BFi = false, the tracking point BQi = (xi + wi/2, yi + 4hi/5), and the horizontal directed distance Bdij from the tracking point BQi to each lane line Lj; that is, the vehicle Bi = {BPi, BMi, BFi, BQi, BDi}, wherein BDi = {Bdi1, Bdi2, ..., Bdil};
Step 3-2-2) traversing the tracked vehicle list TL = {T1, T2, ..., Tj}, Tj = {TPj, TMj, TTj, TODj, TDj}, Tj ∈ TL, wherein TPj denotes the vehicle position, TMj denotes the vehicle picture, TTj denotes the number of consecutive tracking losses, TODj denotes the horizontal directed distance from the starting tracking point to the lane lines L, and TDj denotes the horizontal directed distance from the current tracking point to the lane lines L; TODj = {TOdj1, TOdj2, ..., TOdjl}, TDj = {Tdj1, Tdj2, ..., Tdjl}; computing the coincidence degree Uij between the tracked vehicle Tj's position TPj and each unmatched vehicle position BPi in the vehicle set B:
Uij=Scij/Szij
wherein Szij is the area of the union of the BPi and TPj rectangles, and Scij is the area of the intersection of the BPi and TPj rectangles;
Selecting the set C = {C1, C2, ..., Cm} of vehicles whose coincidence degree Uij is greater than 0.2, wherein C ⊆ B; if C is empty, no vehicle in the vehicle set B matches the tracked vehicle Tj, and the method proceeds to step 3-4); if C is not empty, the method proceeds to step 3-2-3);
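The coincidence degree Uij = Scij/Szij of step 3-2-2) is an intersection-over-union of two position rectangles. A minimal sketch, assuming rectangles are given as (x, y, w, h) tuples; the function name is illustrative.

```python
def coincidence_degree(a, b):
    """Intersection-over-union of two rectangles a, b = (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    # Overlap along each axis (zero if the rectangles are disjoint)
    iw = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih                      # Scij: intersection area
    union = aw * ah + bw * bh - inter    # Szij: union area
    return inter / union if union else 0.0
```

The candidate set C of step 3-2-2) would then be the detections whose coincidence degree with the tracked position exceeds 0.2, e.g. `[b for b in B if coincidence_degree(TPj, b) > 0.2]`.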
Step 3-2-3) according to the vehicle set C = {C1, C2, ..., Cm} obtained in step 3-2-2), normalizing the tracked vehicle Tj's picture TMj and each vehicle picture CMk in the vehicle set C to a 64 × 64 size, graying them, and calculating the HOG features hogj and hogk; wherein, in the calculation, the window size is 64 × 64, the block size is 16 × 16, the block sliding increment is 8 × 8, the cell size is 8 × 8, and the number of gradient histogram bins in each cell is 9;
Step 3-2-4) calculating the cosine distance Cosjk according to the HOG features hogj and hogk obtained in step 3-2-3);
Step 3-2-5) according to the cosine distances Cosjk obtained in step 3-2-4), selecting the largest cosine distance Cosjm between the tracked vehicle Tj and the vehicle set C, together with the corresponding vehicle Cm; if Cosjm is greater than 0.75, Tj and Cm match each other and the method proceeds to step 3-3), the vehicle detection effect after matching being shown in fig. 2(c); if Cosjm is less than or equal to 0.75, no vehicle in the vehicle set C matches the tracked vehicle Tj and the method proceeds to step 3-4), the detection effect of a matching failure being shown in fig. 2(d).
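The matching of steps 3-2-4) and 3-2-5) compares HOG feature vectors by cosine similarity and keeps the best candidate only above the 0.75 threshold. A minimal sketch; with the stated parameters (64 × 64 window, 16 × 16 blocks, 8 × 8 stride and cells, 9 bins) a real HOG vector would have 7 × 7 × 4 × 9 = 1764 components, but short toy vectors stand in here, and the function names are illustrative.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between feature vectors u and v."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def best_match(hog_t, candidates, threshold=0.75):
    """Return (index, similarity) of the best candidate above the
    threshold, or None if no candidate matches (step 3-2-5))."""
    scored = [(k, cosine_similarity(hog_t, h)) for k, h in enumerate(candidates)]
    k, s = max(scored, key=lambda p: p[1])
    return (k, s) if s > threshold else None
```

Returning `None` corresponds to the "proceeds to step 3-4)" branch, where the tracked vehicle's consecutive-loss counter is incremented.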
The step 4) specifically comprises the following steps:
Step 4-1) traversing the updated tracked vehicle list TL, and calculating the horizontal directed distance TOd'jj from Tj's starting tracking point to the lane line Lj and the horizontal directed distance Td'jj from the current tracking point to the lane line Lj; if TOd'jj * Td'jj is less than 0, the vehicle has successively been located on the two sides of the lane line Lj, and Tj is a lane-change violation vehicle;
Step 4-2) after obtaining the lane-change violation vehicle Tj according to step 4-1), marking out the tracked vehicle Tj as shown in fig. 2(e), and deleting the lane-change violation vehicle Tj's information from the updated tracked vehicle list TL.
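The violation test of step 4-1) reduces to a sign check: the starting and current tracking points lie on opposite sides of the lane line exactly when the product of their horizontal directed distances is negative. A minimal sketch; the function names and the triple layout are illustrative assumptions.

```python
def is_violation(tod_jj, td_jj):
    """True if the tracking point crossed the lane line Lj, i.e. the
    horizontal directed distances have opposite signs (step 4-1))."""
    return tod_jj * td_jj < 0

def find_violations(tracked):
    """tracked: list of (vehicle_id, TOd'jj, Td'jj) triples.
    Returns the ids of lane-change violation vehicles."""
    return [vid for vid, tod, td in tracked if is_violation(tod, td)]
```

Note that a vehicle exactly on the line (distance 0) is not flagged, since the product is then 0, not negative.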
Fig. 2(a) -2(e) are schematic diagrams illustrating the effect of detecting a lane-violation vehicle by using the method in the embodiment of the present invention.
A real-time detection system for calibrating vehicle lane change violations based on lane lines is an intelligent traffic management system comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, characterized in that the steps of the above detection method are realized when the processor executes the program.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present invention and are not limiting. Although the present invention has been described in detail with reference to the embodiments, those skilled in the art will understand that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A real-time detection method for calibrating vehicle lane change violations based on lane lines specifically comprises the following steps:
step 1) calibrating a lane line L, and calculating and acquiring a detection rectangular area D and an image I;
step 2) detecting a vehicle set A = {A1, A2, ..., Ai} in the image I within the detection rectangular region D of step 1) based on a deep convolutional neural network;
Step 3) screening out, from the vehicle set A obtained in step 2), a vehicle set B = {B1, B2, ..., Bi} which intersects the lane lines L, wherein B ⊆ A; matching the vehicle set B with the tracked vehicle list TL, and updating the tracked vehicle list TL = {T1, T2, ..., Tj};
Step 4) judging whether each tracked vehicle Tj in the updated tracked vehicle list TL is a lane-change violation vehicle; if the tracked vehicle Tj is a lane-change violation vehicle, marking out the tracked vehicle Tj and deleting the lane-change violation vehicle Tj's information from the updated tracked vehicle list TL.
2. The detection method according to claim 1, wherein the step 1) specifically comprises:
step 1-1) calibrating the lane lines L = {L1, L2, ..., Lj}, wherein the position of each lane line Lj comprises: a starting point pbj = {xbj, ybj} and an end point pej = {xej, yej}; wherein xbj, ybj are respectively the horizontal and vertical coordinates of the starting point, and xej, yej are respectively the horizontal and vertical coordinates of the end point;
step 1-2) calculating a detection area rectangle D = {x, y, w, h} according to the lane line positions of step 1-1), wherein,
x=min(xb1,xe1,xb2,xe2,...,xbj,xej)
y=min(yb1,ye1,yb2,ye2,...,ybj,yej)
w=max(xb1,xe1,xb2,xe2,...,xbj,xej)-x
h=max(yb1,ye1,yb2,ye2,...,ybj,yej)-y。
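The formulas of step 1-2) compute the axis-aligned bounding box of all calibrated lane-line endpoints. A minimal sketch, assuming each lane line is given as a ((xbj, ybj), (xej, yej)) pair of endpoints; the function name is illustrative.

```python
def detection_rectangle(lane_lines):
    """Bounding box D = (x, y, w, h) of all lane-line endpoints,
    where lane_lines is a list of ((xb, yb), (xe, ye)) pairs."""
    xs = [p for line in lane_lines for p in (line[0][0], line[1][0])]
    ys = [p for line in lane_lines for p in (line[0][1], line[1][1])]
    x, y = min(xs), min(ys)
    return (x, y, max(xs) - x, max(ys) - y)
```

All subsequent detection then runs only inside this rectangle, which is what keeps the method real-time on full-resolution frames.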
3. The detection method according to claim 2, wherein the step 2) specifically comprises:
normalizing the detection rectangular area D in the image I to a 576 × 576 size as the input image according to the detection rectangular area D and the image I of step 1), and detecting the vehicle set A = {A1, A2, ..., Ai} in the image I based on the deep convolutional neural network; the position of Ai is denoted as PAi = {xi, yi, wi, hi}, wherein xi, yi, wi, hi are respectively the horizontal coordinate of the lower left corner, the vertical coordinate of the lower left corner, the width and the height of the vehicle rectangle Ai;
wherein the convolutional neural network comprises: 7 convolutional layers, 4 downsampling layers, 2 fully-connected layers and one output layer; the scanning boundaries in the convolutional layers are automatically padded with 0, and the neurons are activated using the Leaky-ReLU function; maximum pooling is adopted in the downsampling layers;
the convolution kernel size of convolution layer C1 is 9 × 9, 32 convolution kernels, the step size is 2, and the generated feature map size is 288 × 288; the window size of the downsampling layer S1 is 4 multiplied by 4, the step size is 4, and the size of the generated feature map is 72 multiplied by 72;
the convolution layer C2 has convolution kernel size of 1 × 1, 4 convolution kernels and step size of 1, and the generated feature map size is 72 × 72;
the convolution layer C3 has a convolution kernel size of 3 × 3, 8 convolution kernels and a step size of 1, and the generated feature map size is 72 × 72; the window size of the downsampling layer S2 is 2 × 2, the step size is 2, and the size of the generated feature map is 36 × 36;
the convolution layer C4 has convolution kernel size of 1 × 1, 8 convolution kernels and step size of 1, and the generated feature map size is 36 × 36;
the convolution layer C5 has convolution kernel size of 3 × 3, 16 convolution kernels and step size of 1, and the generated feature map size is 36 × 36; the window size of the downsampling layer S3 is 2 multiplied by 2, the step size is 2, and the size of the generated feature map is 18 multiplied by 18;
the convolution layer C6 has convolution kernel size of 3 × 3, 32 convolution kernels and step size of 1, and the generated feature map size is 18 × 18; the window size of the downsampling layer S4 is 2 multiplied by 2, the step size is 2, and the size of the generated feature map is 9 multiplied by 9;
the convolution layer C7 has convolution kernel size of 1 × 1, 64 convolution kernels and step size of 1, and the generated feature map size is 9 × 9;
the fully-connected layer F1 is composed of 256 neurons, which are activated using the ReLU function; the fully-connected layer F2 is composed of 4096 neurons, which are activated using the Leaky-ReLU function; the output layer is composed of 891 neurons, which are activated using the ReLU function.
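The feature-map sizes listed in this claim are consistent with "same"-style zero padding, under which each layer's output side is ceil(input side / stride). The following is only an arithmetic sanity check of those sizes under that padding assumption, not an implementation of the network.

```python
import math

def out_side(in_side, stride):
    """Output side length under same-style zero padding."""
    return math.ceil(in_side / stride)

def feature_map_sizes(input_side=576):
    """Chain the (layer name, stride) pairs of claim 3 and return the
    side length of the feature map produced by each layer. Under the
    same-padding assumption, kernel size does not affect the result."""
    layers = [("C1", 2), ("S1", 4), ("C2", 1), ("C3", 1), ("S2", 2),
              ("C4", 1), ("C5", 1), ("S3", 2), ("C6", 1), ("S4", 2),
              ("C7", 1)]
    side, sizes = input_side, {}
    for name, stride in layers:
        side = out_side(side, stride)
        sizes[name] = side
    return sizes
```

Starting from the 576 × 576 input of claim 3, this reproduces the stated sizes: 288 after C1, 72 after S1, 36 after S2, 18 after S3 and 9 after S4 and C7.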
4. The detection method according to claim 1, wherein the step 3) specifically comprises:
step 3-1) screening out, from the vehicle set A = {A1, A2, ..., Ai} obtained in step 2), a vehicle set B = {B1, B2, ..., Bi} which intersects the lane lines L, wherein B ⊆ A;
step 3-2) according to the vehicle set B obtained in step 3-1), traversing the tracked vehicle list TL = {T1, T2, ..., Tj}, and selecting the set C = {C1, C2, ..., Cm} of vehicles whose coincidence degree Uij is greater than 0.2, wherein C ⊆ B; wherein Uij = Scij/Szij, Szij is the area of the union of the BPi and TPj rectangles, Scij is the area of the intersection of the BPi and TPj rectangles, and BPi is the vehicle position of the vehicle Bi; calculating the HOG features hogj and hogk, and then calculating the cosine distance Cosjk; then selecting the largest cosine distance Cosjm between the tracked vehicle Tj and the vehicle set C, together with the corresponding vehicle Cm; if Cosjm is greater than 0.75, Tj and Cm match each other and the method proceeds to step 3-3); if Cosjm is less than or equal to 0.75, no vehicle in the vehicle set C matches the tracked vehicle Tj and the method proceeds to step 3-4);
step 3-3) according to the vehicle Cm obtained in step 3-2), updating the tracked vehicle Tj's position to the current vehicle position, i.e. TPj = CPm, and updating the tracked vehicle picture to the current vehicle picture, i.e. TMj = CMm; wherein TPj is the tracked vehicle Tj's position, CPm is the current vehicle position, TMj is the tracked vehicle Tj's picture, and CMm is the current vehicle picture; the number of consecutive tracking losses TTj = 0; the horizontal directed distance TDj from the tracking point to the lane lines L is updated to the horizontal directed distance from the current tracking point to the lane lines L, TDj = CDm; at this time Cm is matched, and the current vehicle's matched flag CFi = true;
Step 3-4) updating the tracked vehicle Tj's number of consecutive tracking losses TTj', wherein TTj' = TTj + 1; if TTj' is greater than 3, deleting the vehicle Tj from the tracked vehicle list TL;
Step 3-5) traversing the vehicle set B, and adding each unmatched vehicle Bu = {BPu, BMu, BFu, BQu, BDu} to the tracked vehicle list TL as a new tracked vehicle Tu = {TPu, TMu, TTu, TODu, TDu}, thereby updating the tracked vehicle list TL, wherein TPu = BPu, TMu = BMu, TTu = 0, TODu = BDu and TDu = BDu;
wherein TPu is the new tracked vehicle's position; TMu is the new tracked vehicle's picture; TODu is the horizontal directed distance from the starting tracking point to the lane lines L; TDu is the horizontal directed distance from the current tracking point to the lane lines L; the new tracked vehicle's number of consecutive tracking losses TTu = 0; BPu is the unmatched vehicle's position; BMu is the unmatched vehicle's picture; BFu is the unmatched vehicle's matched flag; BQu is the unmatched vehicle's tracking point; BDu is the horizontal directed distance from the unmatched vehicle's tracking point to the lane lines L.
5. The detection method according to claim 4, wherein the step 3-1) specifically comprises:
step 3-1-1) according to the lane lines L obtained in step 1) and the vehicle Ai position rectangle PAi obtained in step 2), calculating the intersections of the lane line Lj with the four edges of the position rectangle PAi; the intersections of the lane line Lj with the left, right, upper and lower edges of the rectangle are plij = {xlij, ylij}, prij = {xrij, yrij}, puij = {xuij, yuij} and pdij = {xdij, ydij};
wherein {xlij, ylij} are the horizontal and vertical coordinates of the left-edge intersection, {xrij, yrij} are the horizontal and vertical coordinates of the right-edge intersection, {xuij, yuij} are the horizontal and vertical coordinates of the upper-edge intersection, and {xdij, ydij} are the horizontal and vertical coordinates of the lower-edge intersection;
step 3-1-2) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections are on the upper and left edges of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi + wi/4 < xuij < xi + wi
yi + hi/2 < ylij < yi + hi
wherein xi, yi, wi, hi are respectively the horizontal coordinate of the lower left corner, the vertical coordinate of the lower left corner, the width and the height of the vehicle rectangle Ai;
step 3-1-3) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections are on the upper and right edges of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi < xuij < xi + 3wi/4
yi + hi/2 < yrij < yi + hi
step 3-1-4) according to the intersections of the lane line and the rectangle obtained in step 3-1-1), if the intersections are on the upper and lower edges of the rectangle and satisfy the following preset conditions, adding the vehicle rectangle Ai to the vehicle set B:
xi + wi/4 < xuij < xi + 3wi/4
xi + wi/4 < xdij < xi + 3wi/4.
6. the detection method according to claim 4, wherein the step 3-2) specifically comprises:
step 3-2-1) initializing vehicle information according to the vehicle set B obtained in step 3-1), wherein the vehicle information comprises: the vehicle position BPi = {xi, yi, wi, hi}, the vehicle picture BMi, the matched flag of the current vehicle BFi = false, the tracking point BQi = (xi + wi/2, yi + 4hi/5), and the horizontal directed distance Bdij from the tracking point BQi to each lane line Lj; that is, the vehicle Bi = {BPi, BMi, BFi, BQi, BDi}, wherein BDi = {Bdi1, Bdi2, ..., Bdil}; and xi, yi, wi, hi are respectively the horizontal coordinate of the lower left corner, the vertical coordinate of the lower left corner, the width and the height of the vehicle rectangle Bi;
step 3-2-2) traversing the tracked vehicle list TL = {T1, T2, ..., Tj}, Tj = {TPj, TMj, TTj, TODj, TDj}, Tj ∈ TL, wherein TPj denotes the vehicle position, TMj denotes the vehicle picture, TTj denotes the number of consecutive tracking losses, TODj denotes the horizontal directed distance from the starting tracking point to the lane lines L, and TDj denotes the horizontal directed distance from the current tracking point to the lane lines L; TODj = {TOdj1, TOdj2, ..., TOdjl}, TDj = {Tdj1, Tdj2, ..., Tdjl}; computing the coincidence degree Uij between the tracked vehicle Tj's position TPj and each unmatched vehicle position BPi in the vehicle set B:
Uij=Scij/Szij
wherein Szij is the area of the union of the BPi and TPj rectangles, and Scij is the area of the intersection of the BPi and TPj rectangles;
selecting the set C = {C1, C2, ..., Cm} of vehicles whose coincidence degree Uij is greater than 0.2, wherein C ⊆ B; if C is empty, no vehicle in the vehicle set B matches the tracked vehicle Tj, and the method proceeds to step 3-4); if C is not empty, the method proceeds to step 3-2-3);
step 3-2-3) according to the vehicle set C = {C1, C2, ..., Cm} obtained in step 3-2-2), normalizing the tracked vehicle Tj's picture TMj and each vehicle picture CMk in the vehicle set C to a 64 × 64 size, graying them, and calculating the HOG features hogj and hogk; wherein, in the calculation, the window size is 64 × 64, the block size is 16 × 16, the block sliding increment is 8 × 8, the cell size is 8 × 8, and the number of gradient histogram bins in each cell is 9;
step 3-2-4) calculating the cosine distance Cosjk according to the HOG features hogj and hogk obtained in step 3-2-3);
Step 3-2-5) according to the cosine distances Cosjk obtained in step 3-2-4), selecting the largest cosine distance Cosjm between the tracked vehicle Tj and the vehicle set C, together with the corresponding vehicle Cm; if Cosjm is greater than 0.75, Tj and Cm match each other and the method proceeds to step 3-3); if Cosjm is less than or equal to 0.75, no vehicle in the vehicle set C matches the tracked vehicle Tj and the method proceeds to step 3-4).
7. The detection method according to claim 1, wherein the step 4) specifically comprises:
step 4-1) traversing the updated tracked vehicle list TL, and calculating the horizontal directed distance TOd'jj from Tj's starting tracking point to the lane line Lj and the horizontal directed distance Td'jj from the current tracking point to the lane line Lj; if TOd'jj * Td'jj is less than 0, the vehicle has successively been located on the two sides of the lane line Lj, and Tj is a lane-change violation vehicle;
step 4-2) after obtaining the lane-change violation vehicle Tj according to step 4-1), marking out the tracked vehicle Tj and deleting the lane-change violation vehicle Tj's information from the updated tracked vehicle list TL.
8. A real-time detection system for calibrating vehicle lane change violations based on lane lines, the detection system being an intelligent traffic management system comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, characterized in that the processor, when executing the program, realizes the steps of the real-time detection method of any one of claims 1 to 7.
CN201711026761.4A 2017-10-27 2017-10-27 Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line Active CN107705577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711026761.4A CN107705577B (en) 2017-10-27 2017-10-27 Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711026761.4A CN107705577B (en) 2017-10-27 2017-10-27 Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line

Publications (2)

Publication Number Publication Date
CN107705577A CN107705577A (en) 2018-02-16
CN107705577B true CN107705577B (en) 2020-05-26

Family

ID=61176340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711026761.4A Active CN107705577B (en) 2017-10-27 2017-10-27 Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line

Country Status (1)

Country Link
CN (1) CN107705577B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110348273B (en) * 2018-04-04 2022-05-24 北京四维图新科技股份有限公司 Neural network model training method and system and lane line identification method and system
CN109284752A (en) * 2018-08-06 2019-01-29 中国科学院声学研究所 A kind of rapid detection method of vehicle
CN109271942A (en) * 2018-09-26 2019-01-25 上海七牛信息技术有限公司 A kind of stream of people's statistical method and system
CN114120625B (en) * 2020-08-31 2023-02-21 上汽通用汽车有限公司 Vehicle information integration system, method, and storage medium
CN112712703A (en) * 2020-12-09 2021-04-27 上海眼控科技股份有限公司 Vehicle video processing method and device, computer equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254429A (en) * 2011-05-13 2011-11-23 东南大学 Video identification-based detection apparatus and method of vehicles against regulations
CN105632186A (en) * 2016-03-11 2016-06-01 博康智能信息技术有限公司 Method and device for detecting vehicle queue jumping behavior
CN105702048A (en) * 2016-03-23 2016-06-22 武汉理工大学 Automobile-data-recorder-based illegal lane occupation identification system and method for automobile on highway
WO2016141553A1 (en) * 2015-03-10 2016-09-15 冯旋宇 System for preventing lane change on solid-line road and method for preventing lane change by using system
CN106297314A (en) * 2016-11-03 2017-01-04 北京文安智能技术股份有限公司 A kind of drive in the wrong direction or the detection method of line ball vehicle behavior, device and a kind of ball machine

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254429A (en) * 2011-05-13 2011-11-23 东南大学 Video identification-based detection apparatus and method of vehicles against regulations
WO2016141553A1 (en) * 2015-03-10 2016-09-15 冯旋宇 System for preventing lane change on solid-line road and method for preventing lane change by using system
CN105632186A (en) * 2016-03-11 2016-06-01 博康智能信息技术有限公司 Method and device for detecting vehicle queue jumping behavior
CN105702048A (en) * 2016-03-23 2016-06-22 武汉理工大学 Automobile-data-recorder-based illegal lane occupation identification system and method for automobile on highway
CN106297314A (en) * 2016-11-03 2017-01-04 北京文安智能技术股份有限公司 A kind of drive in the wrong direction or the detection method of line ball vehicle behavior, device and a kind of ball machine

Also Published As

Publication number Publication date
CN107705577A (en) 2018-02-16

Similar Documents

Publication Publication Date Title
CN107705577B (en) Real-time detection method and system for calibrating illegal lane change of vehicle based on lane line
Narote et al. A review of recent advances in lane detection and departure warning system
CN110178167B (en) Intersection violation video identification method based on cooperative relay of cameras
Chen et al. Lane departure warning systems and lane line detection methods based on image processing and semantic segmentation: A review
EP4152204A1 (en) Lane line detection method, and related apparatus
Berriel et al. Ego-lane analysis system (elas): Dataset and algorithms
CN109977782B (en) Cross-store operation behavior detection method based on target position information reasoning
Wu et al. Applying a functional neurofuzzy network to real-time lane detection and front-vehicle distance measurement
Li et al. Road detection algorithm for autonomous navigation systems based on dark channel prior and vanishing point in complex road scenes
CN109902676B (en) Dynamic background-based violation detection algorithm
CN110379168B (en) Traffic vehicle information acquisition method based on Mask R-CNN
Deng et al. A real-time system of lane detection and tracking based on optimized RANSAC B-spline fitting
CN105426864A (en) Multiple lane line detecting method based on isometric peripheral point matching
CN114898296A (en) Bus lane occupation detection method based on millimeter wave radar and vision fusion
Li et al. A robust lane detection method based on hyperbolic model
Liu et al. Vehicle detection and ranging using two different focal length cameras
CN113011331B (en) Method and device for detecting whether motor vehicle gives way to pedestrians, electronic equipment and medium
Chao et al. Multi-lane detection based on deep convolutional neural network
Dow et al. A crosswalk pedestrian recognition system by using deep learning and zebra‐crossing recognition techniques
Janda et al. Road boundary detection for run-off road prevention based on the fusion of video and radar
Xu et al. Road lane modeling based on RANSAC algorithm and hyperbolic model
Park et al. Vision-based surveillance system for monitoring traffic conditions
Ashraf et al. HVD-net: a hybrid vehicle detection network for vision-based vehicle tracking and speed estimation
Dinh et al. Development of a tracking-based system for automated traffic data collection for roundabouts
Imad et al. Navigation system for autonomous vehicle: A survey

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220729

Address after: 100190, No. 21 West Fourth Ring Road, Beijing, Haidian District

Patentee after: INSTITUTE OF ACOUSTICS, CHINESE ACADEMY OF SCIENCES

Patentee after: NANHAI RESEARCH STATION, INSTITUTE OF ACOUSTICS, CHINESE ACADEMY OF SCIENCES

Address before: 100190, No. 21 West Fourth Ring Road, Beijing, Haidian District

Patentee before: INSTITUTE OF ACOUSTICS, CHINESE ACADEMY OF SCIENCES