WO2014054328A1 - Vehicle detection device - Google Patents

Vehicle detection device

Info

Publication number
WO2014054328A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
distance
vehicle detection
detection device
data
Prior art date
Application number
PCT/JP2013/070191
Other languages
English (en)
Japanese (ja)
Inventor
青木 泰浩
佐藤 俊雄
雄介 高橋
茂 唐澤
Original Assignee
株式会社 東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 東芝
Priority to IN2015DN03315A
Publication of WO2014054328A1

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/254: Analysis of motion involving subtraction of images
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/10021: Stereoscopic video; Stereoscopic image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30236: Traffic on road, railway or crossing

Definitions

  • Embodiments of the present invention relate to a vehicle detection device that detects a vehicle entering and exiting a specific area on a road.
  • One conventional approach is a transmission-type pole sensor using infrared rays: a pair of pole sensors is installed on both sides of the driving lane, each irradiating the other with an infrared beam, so that the moment a vehicle interrupts the beam can be detected.
  • However, the pole sensor requires excavation work at the time of installation, and a measuring instrument and separate work are needed to adjust the installation position, so construction and adjustment costs are incurred.
  • By contrast, methods that detect the presence of a vehicle with a camera have been put into practical use in traffic-monitoring devices and the like.
  • In such devices, vehicle detection is performed by processing moving images.
  • Accordingly, an object of the present invention is to provide a vehicle detection device that, when a stereo camera is employed, can detect the entry and exit of a vehicle from the captured images with high accuracy without being affected by changes in optical conditions.
  • According to an embodiment, the vehicle detection device detects a vehicle entering and exiting the field of view of a stereo camera based on captured images obtained by imaging, with the stereo camera, a road through which the vehicle passes.
  • It comprises edge enhancement means, parallax data measurement means, parallax data rejection means, distance data measurement means, distance image creation means, state determination means, and entry/exit detection means.
  • the edge enhancement means performs edge enhancement in the vertical direction of the vehicle for each captured image.
  • The parallax data measurement means measures, for each pixel, the parallax data between the edge-enhanced captured images.
  • The parallax data rejection means obtains the variance of the pixels in each captured image and selectively rejects the parallax data for each pixel based on the result.
  • The distance data measurement means measures, for each pixel, distance data indicating the distance between the stereo camera and the imaged object from the parallax data.
  • the distance image creating means creates a distance image having the distance data for each pixel.
  • The state determination means divides the distance image into a plurality of regions along the traveling direction of the vehicle to form a plurality of reference sections. For each of the plurality of reference sections, it obtains the difference between background distance data measured in advance with no vehicle present and the distance data measured at each specified time while a vehicle enters or exits, obtains the amount of change of this difference for each reference section at each time, and compares that change amount with a threshold to determine, for each time, a state corresponding to the presence or absence of an object.
  • The entry/exit detection means holds the state determination for each time, determines the transition of the vehicle, and detects the entry and exit of the vehicle.
  • FIG. 1 is a schematic diagram illustrating a configuration of a vehicle information processing system to which a vehicle detection device according to the present embodiment is applied.
  • FIG. 2 is a block diagram illustrating a configuration of the vehicle detection device according to the present embodiment.
  • FIG. 3 is a flowchart showing a process flow of the vehicle detection device according to the present embodiment.
  • FIG. 4 is a relationship diagram illustrating the relationship between the distance from the stereo camera and the parallax.
  • FIG. 5 is a diagram illustrating the division of the captured image in the vehicle traveling direction and the divided reference sections a, b, and c.
  • FIG. 6 is a diagram illustrating a state number assigned based on the determination result.
  • FIG. 7 is a diagram illustrating state number transition when the vehicle passes forward.
  • FIG. 8 is a relationship diagram showing the relationship between the vehicle position and the state number when the vehicle passes forward.
  • FIG. 9 is a flowchart showing the flow of the corresponding point search matching and matching rejection processing.
  • FIG. 10 is a diagram illustrating corresponding point search matching using a standard image and a reference image.
  • FIG. 11 is a diagram illustrating a parallax calculation process based on a target pixel and peripheral matching of the target pixel.
  • FIG. 12 is a flowchart showing a corresponding point search matching process having a function of verifying the luminance ranges of the standard image and the reference image.
  • FIG. 13 is a diagram for explaining a process of setting a dead area in the reference section.
  • FIG. 14 is a flowchart including the area/shape determination of connected components in the reference section change amount determination.
  • FIG. 15 is a flowchart having a two-step switching function of the reference height.
  • FIG. 16 is a flowchart having a two-stage switching function for the background distance in the depth direction.
  • FIG. 1 is a schematic diagram showing a configuration of a vehicle information processing system to which a vehicle detection device according to this embodiment is applied.
  • This vehicle information processing system includes a laser sensor 10, a stereo camera 20, an ETC (Electronic Toll Collection) system 30, and a vehicle detection device 100.
  • the laser sensor 10 is a sensor that is disposed in the vicinity of the stereo camera 20 and obtains a distance to the traveling lane that is the background of the captured image.
  • the obtained background distance data is sent to the vehicle detection device 100.
  • More specifically, the laser sensor 10 scans a laser beam downward from its position close to the stereo camera 20 toward the traveling lane, along a direction crossing the lane, and measures the distance to each point where the beam is reflected, thereby obtaining the background distance data.
  • the stereo camera 20 is an imaging device in which a plurality of cameras are arranged so as to obtain a predetermined parallax.
  • a plurality of cameras are arranged side by side, the lane in which the vehicle travels is captured by each camera, and each captured image is sent to the vehicle detection device 100.
  • an image captured by one camera is used as a standard image, and an image captured by the other camera is used as a reference image.
  • Each camera in the stereo camera 20 is a digital camera that captures a moving image at a preset frame rate.
  • This stereo camera 20 is installed on a pole standing on an island portion of the toll booth. The cameras, arranged vertically with respect to the traveling direction of the vehicle, capture images from an oblique direction, and the field of view (imaging area) is set so that vehicles pass laterally through it. In other words, the stereo camera 20 is installed at a position from which the road surface can be imaged from obliquely above.
  • The stereo camera 20 is adjusted so that the ground-contact portion of the tires (axle) of a vehicle traveling through the camera field of view appears in the captured image.
  • the stereo camera 20 is installed at a position where imaging can be performed from a direction perpendicular to the traveling direction of the vehicle so that a blind spot due to the vehicle does not occur when imaging a towing structure such as a tow bar.
  • the stereo cameras 20 and the vehicle detection devices 100 may be installed at 3 to 4 locations per lane in order to appropriately grasp the position of the vehicle in the lane.
  • the stereo camera 20 may be an imaging device that optically obtains upper and lower captured images of the vehicle with a single lens, instead of the upper and lower cameras.
  • the stereo camera 20 may be configured with a single camera.
  • a lens using a wedge prism polarization system in which two lenses are provided at the light entrance and a prism that bends the optical path is disposed inside can be used as appropriate.
  • the captured image obtained by the stereo camera 20 includes a time code indicating the imaging time.
  • The stereo camera 20, the ETC system 30, and the vehicle detection device 100 each have a clock device (not shown) that generates mutually synchronized time information. However, if the stereo camera 20, the ETC system 30, and the vehicle detection device 100 are synchronized by other means, that is, if the capture time of the image data from the stereo camera 20 can be recognized by the ETC system 30 and the vehicle detection device 100, the time code need not necessarily be included.
  • The field of view of the stereo camera 20 covers a vehicle from its lower end (tire ground-contact portion) to its upper end (roof), even for a tall vehicle such as a large truck (up to 4.0 m in height). Since the video is captured as a moving image, and assuming vehicles travel at up to 80 km/h, it is assumed that at least several frames (entry and exit) of each vehicle's passage through the field of view of the stereo camera 20 are captured.
  • the ETC system 30 is a toll collection device that automatically collects tolls imposed on vehicles traveling on toll roads such as expressways.
  • the ETC system 30 wirelessly communicates with an ETC on-board unit mounted on a vehicle, and acquires information for specifying a passing vehicle.
  • This ETC on-board unit is generally installed at a position where at least its antenna for wireless communication is visible through the windshield.
  • FIG. 2 is a block diagram illustrating a configuration of the vehicle detection device 100 according to the present embodiment.
  • the vehicle detection device 100 includes a display unit 110, a user interface 120, a storage unit 130, a network interface 140, and a control unit 150.
  • the display unit 110 is a display device using an LCD (Liquid Crystal Display) or the like.
  • the display unit 110 displays various information including the operation status of the vehicle detection device 100.
  • the user interface 120 is an interface that receives instructions from the user such as a keyboard, a mouse, and a touch panel.
  • the storage unit 130 is a storage device that stores the control program and control data of the control unit 150.
  • the storage unit 130 uses one or a plurality of storage means such as HDD, RAM, ROM, and flash memory.
  • The control data includes, for example, a plurality of component data representing three-dimensional shapes (e.g., a tow bar shape pattern, shape data of the circular portion of an axle, shape data of parts of a person). That is, the storage unit 130 stores a plurality of component data in advance.
  • the shape pattern of the tow bar is information indicating a typical shape pattern of the tow bar, such as a shape pattern indicating a rectangle having a long side along the traveling direction of the vehicle.
  • the shape pattern of the tow bar is stored when distinguishing between two adjacent vehicles and one vehicle that pulls the towed vehicle with the tow bar. Therefore, the shape pattern of the tow bar can be omitted if they are not distinguished from each other.
  • the network interface 140 is an interface that communicates with the stereo camera 20 and the ETC system 30 through a network such as a LAN.
  • the control unit 150 includes a microprocessor having a memory, operates according to a control program and control data stored in the storage unit 130, and controls each unit of the vehicle detection device 100 in an integrated manner.
  • the control unit 150 obtains parallax from each captured image obtained by the stereo camera 20, and calculates distance data with an object in the field of view (in the imaging region) of the stereo camera 20. In addition, the control unit 150 confirms the presence of an object in the field of view based on the difference between the background distance data acquired by the laser sensor 10 and the calculated distance data, records the positional relationship for each time, The moving direction is determined to detect entry / exit of the vehicle. Note that the control unit 150 may have a function of predicting a passing time in the real space (passing time in the communication area of the ETC system 30) in addition to the detection of the entering / leaving vehicle.
  • The control unit 150 includes, for example, the following functions (f1) to (f6).
  • (f1) An edge enhancement function that performs edge enhancement in the vertical direction of the vehicle for each captured image.
  • (f2) A parallax data measurement function that measures the parallax data between the edge-enhanced captured images for each pixel.
  • (f3) A parallax data rejection function that obtains the estimated parallax and the variance of pixel values in a small region including the target pixel in the standard image and the reference image, and rejects unnecessary parallax data based on the result.
  • (f4) A distance measurement function that measures, for each pixel, distance data from the parallax data and creates a distance image having the distance data for each pixel.
  • (f5) A state determination function that divides the distance image into reference sections along the traveling direction of the vehicle, measures the difference between the distance data and the background distance data for each reference section, and determines a state corresponding to the presence or absence of an object for each time.
  • (f6) An entry/exit detection function that retains the state determination for each time, determines the transition direction of the vehicle from the history by monitoring the flow of front, side, and rear end (or rear end, side, and front), and thereby detects the entry and exit of the vehicle.
  • FIG. 3 is a flowchart showing a processing flow of the vehicle detection device 100 according to the present embodiment.
  • The vehicle detection apparatus 100 first inputs the captured images taken by the stereo camera 20 (one being the standard image and the other the reference image) (step ST1).
  • The vehicle detection apparatus 100 then performs rectification (parallel equalization) and distortion correction on the input captured images (step ST2), and performs luminance correction on the rectified images (step ST3).
  • Next, in the image feature extraction processing, the vehicle detection device 100 performs edge enhancement in the vertical direction of the vehicle for each captured image using the function (f1) of the control unit 150 (step ST4).
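  • A common way to realize such directional edge enhancement is a Sobel-type filter. The patent does not specify the kernel, and which kernel orientation corresponds to "the vertical direction of the vehicle" depends on how the cameras are mounted, so the sketch below is only an illustrative assumption.

```python
import numpy as np
from scipy.signal import convolve2d

def enhance_vertical_edges(img: np.ndarray) -> np.ndarray:
    """Emphasize edges by responding to horizontal intensity changes
    (a Sobel-type kernel); the kernel orientation is an assumption."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=np.float32)
    edges = convolve2d(img.astype(np.float32), kx, mode="same", boundary="symm")
    return np.abs(edges)
```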
  • Next, using the function (f2) of the control unit 150, the vehicle detection device 100 finds, by a block matching method or the like, a corresponding point for each pixel between the edge-enhanced captured images, and obtains the parallax data between the captured images for each pixel (steps ST5 and ST6). This corresponding point search matching and matching rejection processing is described in the embodiments below.
  • the vehicle detection device 100 obtains distance data for each pixel from the parallax data by the function (f4) of the control unit 150, and creates a distance image having this distance data for each pixel (step ST7).
  • FIG. 4 is a relationship diagram showing the relationship between the distance from the stereo camera 20 and the parallax.
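  • Although the patent does not spell out the conversion used in step ST7, for a rectified stereo pair the standard geometric relationship between the measured parallax d (in pixels), the focal length f (in pixels), the camera baseline B, and the distance Z to the imaged point is the following, which also reflects the inverse relationship illustrated in FIG. 4:

```latex
Z = \frac{f \cdot B}{d}
```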
  • the vehicle detection device 100 measures the number of vehicles from the created distance image.
  • In the reference section change amount determination, the vehicle detection device 100 uses a reference section a and a reference section b set at the two ends of the captured image in the traveling direction of the vehicle (the entry side and the exit side of the field of view), and a reference section c located between the reference section a and the reference section b.
  • FIG. 5 is a diagram illustrating the division of the captured image in the vehicle traveling direction and the divided reference sections a, b, and c.
  • Using the function (f5) of the control unit 150, the vehicle detection device 100 measures, for each pixel in the reference section a, the difference between the distance data in the distance image and the corresponding distance data in the background distance image, that is, the background distance data acquired by the laser sensor 10.
  • The vehicle detection device 100 determines that the reference section a of the distance image is in the ON state when the sum of the differences over the pixels within the reference section a exceeds a threshold value, and in the OFF state when the sum does not exceed the threshold.
  • The ON/OFF states of the reference sections b and c are determined by the same steps (step ST8).
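  • A minimal sketch of this per-section ON/OFF decision is given below. It assumes the reference sections are column bands of the distance image, that the per-pixel difference is taken as an absolute difference, and that the threshold value is illustrative; none of these details are fixed by the patent text.

```python
import numpy as np

def section_is_on(distance_img: np.ndarray, background: np.ndarray,
                  cols: slice, threshold: float) -> bool:
    """Sum the per-pixel difference between the background distance and the
    measured distance over one reference section; the section is ON when the
    sum exceeds the threshold, i.e. something nearer than the road is present."""
    diff = np.abs(background[:, cols] - distance_img[:, cols])
    return float(diff.sum()) > threshold
```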
  • In the state determination, the vehicle detection device 100 determines a state number based on the change (ON/OFF) determination results of the reference sections a, b, and c obtained by the function (f5) of the control unit 150 (step ST9).
  • FIG. 6 is a diagram illustrating the state numbers assigned based on the determination results.
  • Using the function (f6) of the control unit 150, the vehicle detection device 100 then determines the transition state by comparing the determined state number with the history of previously determined state numbers.
  • FIG. 7 is a diagram illustrating state number transition when the vehicle passes forward.
  • FIG. 8 is a relationship diagram showing the relationship between the vehicle position and the state number when the vehicle passes forward.
  • For example, the vehicle detection apparatus 100 determines that the state number transition "S0 → S1 → S3 → S2 → S0" corresponds to a forward passage of the vehicle. In the determination history recorded for each image frame, consecutive frames may register the same state number, for example S0, S1, S1, S3, S3, S3, S3, S2, S2, S0, .... The vehicle detection device 100 compares these state number transitions with transition models registered in advance and determines the passage of the vehicle. It can likewise determine that a vehicle has passed in the reverse direction from the transition "S0 → S2 → S3 → S1 → S0".
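  • A small sketch of this comparison is shown below, assuming the state numbers S0 to S3 are encoded as the integers 0 to 3; the two transition models are those given in the text, and collapsing the repeated per-frame states is an implementation assumption.

```python
def classify_passage(frame_states: list[int]) -> str | None:
    """Collapse consecutive duplicate state numbers and compare the result
    with the pre-registered forward and reverse transition models."""
    collapsed: list[int] = []
    for s in frame_states:
        if not collapsed or collapsed[-1] != s:
            collapsed.append(s)
    if collapsed == [0, 1, 3, 2, 0]:
        return "forward"
    if collapsed == [0, 2, 3, 1, 0]:
        return "reverse"
    return None

# classify_passage([0, 1, 1, 3, 3, 3, 3, 2, 2, 0]) -> "forward"
```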
  • Through the above processing, the vehicle detection apparatus 100 detects passing vehicles in the number counting process and measures the number of passing vehicles (step ST10).
  • the vehicle detection device 100 outputs a detection result of the passing vehicle (step ST11).
  • As described above, the vehicle detection device 100 obtains parallax data from the captured images obtained by imaging the lane through which the vehicle passes with the vertically arranged cameras of the stereo camera 20, and calculates distance data to objects within the field of view. Furthermore, the vehicle detection device 100 confirms the presence of an object within the field of view from the difference between the background distance data and the measured distance data, records the positional relationship at each time, determines the transition of the vehicle, and can thereby detect the entry and exit of the vehicle.
  • the basic configuration and operation of the vehicle detection device 100 are as described above.
  • The embodiments described below mainly relate to distance image creation and reference section change amount determination; by acquiring the distance image accurately, they avoid impairing the accuracy of the subsequent vehicle counting.
  • Example 1: The first embodiment considers the case where a pattern with extremely little texture variation is obtained as the input image.
  • FIG. 9 is a flowchart showing the flow of the corresponding point search matching and matching rejection processing.
  • FIG. 10 is a diagram illustrating corresponding point search matching using a standard image and a reference image.
  • FIG. 11 is a diagram illustrating a parallax calculation process based on a target pixel and peripheral matching of the target pixel.
  • In the corresponding point search matching (step ST5), as shown in FIG. 10, the vehicle detection device 100 selects, from each edge-enhanced captured image, a pixel at which a feature point appears (step ST51), and sets a rectangular region of interest ROI_L at the target pixel position on the standard image (step ST52). The vehicle detection apparatus 100 also sets a region ROI_U as the search range on the reference image (step ST53).
  • In the corresponding point matching score measurement, the vehicle detection device 100 calculates the difference between ROI_U and ROI_L by, for example, SAD (Sum of Absolute Differences) and finds the position where the calculated difference is minimized (step ST54). The vehicle detection apparatus 100 then measures the parallax from the position of ROI_L and the position of ROI_U in the parallax measurement (step ST55).
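  • The sketch below illustrates such an SAD block match for one target pixel. The window size, the search range, and the search axis are assumptions: the patent only states that the difference between ROI_L and ROI_U is minimized, and with vertically stacked cameras the one-dimensional search would run along the vertical epipolar line as written here.

```python
import numpy as np

def sad_match(standard: np.ndarray, reference: np.ndarray,
              y: int, x: int, win: int = 4, max_disp: int = 48) -> int:
    """SAD block matching for the pixel (y, x) of the standard image along
    a vertical search line in the reference image (steps ST52-ST55).
    The caller must keep the window inside both images."""
    roi_l = standard[y - win:y + win + 1, x - win:x + win + 1].astype(np.float32)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        yr = y - d                                   # candidate row of ROI_U
        if yr - win < 0:
            break
        roi_u = reference[yr - win:yr + win + 1,
                          x - win:x + win + 1].astype(np.float32)
        cost = float(np.abs(roi_l - roi_u).sum())    # Sum of Absolute Differences
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```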
  • In this way, the parallax at an appropriate position is obtained by searching for similar texture patterns in the captured images of the stereo camera 20 and finding the corresponding point.
  • However, when obtaining the parallax, it is difficult to obtain a texture pattern from the captured images of the stereo camera 20, particularly on the road surface, and matching errors occur. This is more likely in regions farther from the camera, and such regions also have a large distance measurement error per pixel. For this reason, a matching error is regarded as a risk that leads to erroneous detection.
  • Therefore, in the matching rejection processing, the function (f3) of the control unit 150 measures the variance of the luminance within ROI_L (step ST61).
  • If the measured variance falls below a threshold, the vehicle detection device 100 rejects the pixel from the parallax measurement targets on the ground that there is a risk of erroneous matching (step ST63).
  • The vehicle detection device 100 similarly measures not only the variance of the luminance in ROI_L but also the variance of the luminance in ROI_U (step ST62), and likewise rejects the pixel from the parallax measurement targets (step ST63).
  • Thus, even when a pattern with extremely little texture variation is obtained as the input image, the vehicle detection apparatus 100 can prevent erroneous matching during the corresponding point search and measure an accurate distance.
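  • A minimal sketch of this variance-based rejection, with an illustrative threshold value (not given in the patent), follows:

```python
import numpy as np

def low_texture_reject(roi_l: np.ndarray, roi_u: np.ndarray,
                       var_threshold: float = 25.0) -> bool:
    """Steps ST61-ST63: reject the measured parallax when the luminance
    variance of either window falls below the threshold, since a nearly
    uniform window (e.g. bare road surface) cannot be matched reliably."""
    return (float(np.var(roi_l)) < var_threshold or
            float(np.var(roi_u)) < var_threshold)
```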
  • Example 2: The second embodiment considers the case where a low-quality image containing many pixels outside the upper and lower limits of the luminance range of the stereo camera is used.
  • FIG. 12 is a flowchart showing a corresponding point search matching process having a function of verifying the luminance ranges of the standard image and the reference image.
  • As in the first embodiment, the vehicle detection device 100 sets a rectangular region of interest ROI_L at the target pixel position on the standard image (step ST52), sets a region ROI_U as the search range on the reference image (step ST53), and finds the position where the SAD-based difference between ROI_U and ROI_L is minimized (step ST54).
  • the vehicle detection apparatus 100 measures parallax from the position of ROI_L and the position of ROI_U by parallax measurement (step ST55).
  • In the corresponding point search matching, the vehicle detection device 100 then determines whether the luminance of the pixels in the region ROI_U corresponding to the measured parallax is within the upper and lower limits of a preset luminance range (step ST56). If the luminance of a pixel exceeds the upper limit, an overexposure (white-out) risk may have occurred; if it is below the lower limit, a blackout risk may have occurred. The vehicle detection apparatus 100 rejects from the measurement targets the parallax of pixels whose luminance is outside the preset range.
  • A situation is also conceivable in which, at the same point in the scene, only one of the two images is overexposed (contains many luminance values exceeding the upper limit). For this reason, the vehicle detection device 100 according to the second embodiment simply performs the determination for each pixel of interest.
  • Thus, the vehicle detection device 100 can prevent erroneous matching in the corresponding point search caused by a low-quality image containing many pixels outside the upper and lower limits of the luminance range of the stereo camera, and can measure an accurate distance.
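  • The check itself is a simple range test. The specific limits below (the typical limited video range) are assumptions; the patent only says that a preset range is used.

```python
def clipped_luminance_reject(value: float,
                             lower: float = 16.0, upper: float = 235.0) -> bool:
    """Step ST56: reject the parallax at a pixel whose luminance lies outside
    the preset range; above the upper limit suggests overexposure
    (white-out), below the lower limit suggests blackout."""
    return value < lower or value > upper
```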
  • Example 3: The third embodiment considers the case where strongly sunlit reflective portions and shadowed portions are mixed on the road surface.
  • As in the first embodiment, the vehicle detection device 100 measures the variance of the luminance in the region of interest ROI_L on the standard image in the matching rejection processing (step ST61).
  • In the determination based on parallax and variance in the matching rejection processing (step ST63), when the edge change is significant the luminance variance within ROI_L becomes large, so the vehicle detection device 100 determines whether the variance exceeds a threshold value. In addition, since mismatching is likely to occur where the resolution of the captured image is effectively low, the parallax between the region of interest ROI_L and the region ROI_U is obtained and it is determined whether it is equal to or less than a threshold value.
  • The parallax of pixels that satisfy both of the above determination conditions is rejected from the measurement targets.
  • Thus, the vehicle detection device 100 prevents in advance false matching caused by extreme contrast, such as a shadow boundary, that hinders matching against the detailed texture information in the captured image, and can measure an accurate distance.
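  • A sketch of this combined test follows; the two threshold values are illustrative placeholders, since the patent does not give concrete numbers.

```python
import numpy as np

def contrast_reject(roi_l: np.ndarray, disparity: int,
                    var_upper: float = 2000.0, disp_lower: int = 4) -> bool:
    """Example 3 rejection: a very high luminance variance in ROI_L (e.g. a
    shadow boundary in strong sunlight) combined with a very small disparity
    (a far region) is treated as a mismatch risk and rejected."""
    return float(np.var(roi_l)) > var_upper and disparity <= disp_lower
```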
  • Example 4: Example 4 considers the case where texture information is scarce, as on the hood of an automobile.
  • In the corresponding point search matching, the vehicle detection device 100 sets a region of interest ROI_L on the standard image.
  • The vehicle detection device 100 arranges the pixel values in ROI_L in ascending order around the target pixel and, when there are neighbouring values whose difference is equal to or less than a threshold, rejects the parallax of the pixel from the measurement targets in consideration of the risk of erroneous matching.
  • Thus, even when the texture of a vehicle hood or the like is poor, the vehicle detection device 100 can prevent erroneous matching in advance and measure an accurate distance based on the parallax.
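  • The original description of this check is terse, so the sketch below is only one plausible reading: the window is rejected when its sorted pixel values are all packed within the threshold of one another, i.e. the window is nearly flat.

```python
import numpy as np

def flat_window_reject(roi_l: np.ndarray, diff_threshold: float = 2.0) -> bool:
    """One possible reading of Example 4: sort the window's pixel values and
    reject the disparity when every pair of neighbouring sorted values lies
    within the threshold, i.e. the window (e.g. a car hood) has almost no
    texture variation."""
    vals = np.sort(roi_l.ravel().astype(np.float32))
    return bool(np.all(np.diff(vals) <= diff_threshold))
```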
  • Example 5: Example 5 considers the case where a failure occurs such that an obstacle is continuously observed in part of the field of view to be monitored. FIG. 13 is a diagram explaining the process of setting a dead (insensitive) area in a reference section.
  • In the vehicle detection device 100, some trouble may occur in the sensor, such as mud or raindrops adhering to the camera lens, so that an obstacle is continuously observed in part of the field of view.
  • the vehicle detection device 100 measures each change amount of the reference section from the measured difference in the reference section change amount determination.
  • the vehicle detection device 100 determines the amount of change (on / off) for each reference section.
  • the vehicle detection device 100 stores the number of continuous ON determinations at that position in the history counter.
  • the vehicle detection device 100 determines whether or not the number of continuous ON determinations stored in the history counter has reached a set value, and when it reaches, sets a dead area flag.
  • the vehicle detection apparatus 100 refers to the insensitive area flag and replaces the change amount determination result from on to off for a section that is continuously on. Replacement takes place in three areas: (1) entry side, (2) central area, and (3) exit side.
  • When the change amount of a reference section turns from ON to OFF, the vehicle detection device 100 initializes the history counter and the dead area and restores the original state. Instead of clearing them all at once, the vehicle detection device 100 may apply a condition such as doing so only after the section has been OFF continuously for n frames or more.
  • Thus, when a failure occurs such that an obstacle is continuously observed in part of the monitored field of view, the vehicle detection device 100 sets the insensitive area and forces the change amount determination result to OFF, so that a highly accurate vehicle counting function can be provided.
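  • The history counter and dead-area flag can be kept per reference section as in the sketch below. The frame limits are illustrative assumptions; the patent only requires a set value for masking and an optional n-frame condition for clearing.

```python
from dataclasses import dataclass

@dataclass
class DeadAreaFilter:
    """Suppress a reference section that stays ON for too long, which
    usually indicates mud or raindrops on the lens rather than a vehicle."""
    limit: int = 300          # frames of continuous ON before masking (assumed)
    clear_after: int = 30     # frames of continuous OFF before unmasking (assumed)
    on_count: int = 0
    off_count: int = 0
    dead: bool = False

    def update(self, on: bool) -> bool:
        if on:
            self.on_count += 1
            self.off_count = 0
            if self.on_count >= self.limit:
                self.dead = True
        else:
            self.off_count += 1
            if self.off_count >= self.clear_after:
                self.on_count = 0
                self.dead = False
        # While the section is masked, its change-amount result is forced OFF.
        return False if self.dead else on
```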
  • Example 6: The sixth embodiment considers the case where relatively small airborne objects such as rain or snow are present in the field of view.
  • FIG. 14 is a flowchart including the area/shape determination of connected components in the reference section change amount determination.
  • When the outer contours of obstacles are observed above the reference height due to snow, heavy rain, or the like, they are captured as many point-like objects in the field of view, and depending on their quantity there is a risk of erroneous detection.
  • To prevent this erroneous detection, as shown in FIG. 14, the vehicle detection device 100 measures the difference between the distance data in the distance image acquired in the distance measurement (step ST7) and the background distance data acquired from the laser sensor 10 (step ST81). It then compares each measured difference with a reference height relative to the road surface (step ST82), discards differences at or below the reference height, and counts the number of pixels whose difference is equal to or greater than the reference height (step ST83).
  • the vehicle detection apparatus 100 performs labeling (connected component extraction) in consideration of the connectivity between the measured pixels (step ST84), and determines the area of the labeled pixels (corresponding to the number of pixels) (step ST85).
  • A connected component with at least a certain area is determined to be a vehicle candidate, and one with less than that area is determined to be noise.
  • the vehicle detection device 100 replaces a distance that is greater than or equal to the reference height with a distance that is less than or equal to the reference height for pixels determined to be noise (step ST86).
  • the vehicle detection apparatus 100 again measures the number of pixels exceeding the reference height (step ST87), and performs state determination based on an on / off combination for each reference section (step ST9).
  • In addition, a condition for the noise determination may be provided by classifying the shape of each labelled component in the vertical, horizontal, and diagonal directions.
  • Thus, the vehicle detection device 100 acquires information on the outer contours of airborne objects, determines their occupied area, and rejects them as noise, so that obstacles can be measured accurately regardless of disturbances in weather or illuminance.
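  • The labelling and area test of steps ST84-ST86 can be sketched as below; the minimum area is an illustrative assumption, and the shape classification mentioned above is omitted.

```python
import numpy as np
from scipy import ndimage

def suppress_small_blobs(above_ref: np.ndarray, min_area: int = 50) -> np.ndarray:
    """Keep only connected regions of pixels above the reference height whose
    area is large enough to be a vehicle candidate; smaller blobs (rain,
    snow) are treated as noise and cleared."""
    labels, n = ndimage.label(above_ref)
    keep = np.zeros_like(above_ref, dtype=bool)
    for i in range(1, n + 1):
        component = labels == i
        if component.sum() >= min_area:   # area test per labelled component
            keep |= component
    return keep
```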
  • Example 7: The seventh embodiment considers the case where the reference height based on the road surface is raised because of snow accumulation or the like.
  • FIG. 15 is a flowchart having a reference height two-stage switching function.
  • For measuring pixels whose per-pixel difference exceeds the reference height in the reference section change amount determination, the vehicle detection device 100 stores two reference heights in advance: a standard specification and a raised specification to which a set value has been added.
  • the vehicle detection apparatus 100 refers to the state number from the state determination record (step ST89) after measuring the difference in the reference section change amount determination and before measuring the change amount of the reference section (step ST88).
  • While the state number indicates that no obstacle is detected, the vehicle detection device 100 applies the specification with the higher reference height.
  • When an obstacle is detected, the state number changes.
  • The vehicle detection apparatus 100 then refers to the state number, uses the reference height of the standard specification in the reference section change amount determination, counts the pixels exceeding that standard reference height in the same way, and measures the change amount. If the number of pixels exceeding the standard reference height decreases and the reference section is determined to be OFF, there is no obstacle and the state number changes again. When the state number returns to the no-obstacle state, the higher reference height is applied once more.
  • Alternatively, the vehicle detection apparatus 100 may prepare a plurality of reference heights and, without depending on the state number, increase the added set value of the reference height (standard + set value) as the distance from the camera increases, and measure the change amount accordingly.
  • Tolerance is improved by applying a threshold based on the raised reference height while taking the vehicle size into account. The vehicle detection device 100 is therefore not affected by falling objects such as snow and can accurately measure an obstacle when a vehicle enters.
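  • The state-dependent switching can be reduced to a small selector like the one below. Treating S0 as the "no obstacle" state and the numeric heights themselves are assumptions made for illustration only.

```python
def active_reference_height(state_no: int,
                            standard: float = 0.3,
                            added: float = 0.2) -> float:
    """Example 7 sketch: while the state number indicates no obstacle
    (assumed to be S0 == 0), use the raised height (standard + added set
    value) so snow build-up does not turn sections ON; once an obstacle is
    detected, fall back to the standard height so the vehicle is measured
    in full."""
    return standard + added if state_no == 0 else standard
```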
  • FIG. 16 is a flowchart having a two-stage switching function for the background distance in the depth direction.
  • the vehicle detection device 100 measures the difference between the distance data included in the acquired distance image and the background distance data acquired from the laser sensor 10 in the reference section change amount determination.
  • In this embodiment, the vehicle detection apparatus 100 organizes the imaging area along two axes: the traveling direction of the vehicle and the distance (depth) direction from the camera. A boundary line corresponding to a threshold is set in the depth direction, and the imaging region is subdivided into a region on the near side of the boundary and a region on the far side.
  • On the near side, the road usually appears as the background, and the height above the road surface is measured from the difference between the distance to a vehicle traveling on the near side and the background distance of the road portion.
  • On the far side, however, the background is not necessarily the road: it may be a wall partitioning the lanes, as seen at a toll booth, or an adjacent passage, and this cannot be known in advance. Therefore, the vehicle detection apparatus 100 treats the far side as having a virtual road (wall) as its background, and measures the height of an obstacle from the difference between the distance to a vehicle traveling on the far side and the distance to the virtual road (wall).
  • For each calculated height, the vehicle detection device 100 counts the pixels that exceed the reference height specification prepared in advance and performs ON/OFF determination of each reference section (entry, stop, and exit parts) (step ST810).
  • Thus, the vehicle detection device 100 can stably determine whether each reference section is ON or OFF regardless of where in the lane the vehicle passes, and can accurately measure the passage of the vehicle.
  • In the matching rejection processing described above, the variance is measured from the luminance in ROI_L.
  • If the measured variance falls below the threshold, the vehicle detection device 100 treats this as a risk of mismatching and rejects the pixel from the parallax measurement targets.
  • The vehicle detection device 100 likewise measures not only the variance of the luminance in ROI_L but also the variance of the luminance in ROI_U, and similarly rejects the pixel from the parallax measurement targets when that variance is equal to or less than the threshold.
  • Accordingly, even when a pattern with extremely little texture variation is obtained as the input image, the vehicle detection apparatus 100 can prevent erroneous matching during the corresponding point search and measure an accurate distance.
  • As described above, the vehicle detection apparatus of the present embodiment, employing a stereo camera, can detect the entry and exit of a vehicle from the captured images with high accuracy without being affected by changes in optical conditions.
  • the laser sensor 10 may be a sensor that scans a laser beam and sends received data obtained by receiving the reflected laser beam to the vehicle detection device 100.
  • In that case, the vehicle detection device 100 needs to have a function of calculating the background distance data from the received data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention concerns a vehicle detection apparatus (100) that performs edge enhancement, in the vertical direction of a vehicle, on each captured image; measures disparity data between the edge-enhanced captured images; finds the variance of pixels in each captured image and selectively rejects the disparity data based on the result; measures distance data based on the disparity data; creates a distance image having a distance for each pixel; divides the distance image in the traveling direction of the vehicle into a plurality of regions to form a plurality of reference sections, finds, for each of the plurality of reference sections, the difference between background distance data and the distance data, finds the amount of change of the difference for each reference section at each time, and compares the amount of change of the difference for each reference section with a threshold value, thereby determining a state corresponding to the presence or absence of an object at each time; and holds the state determination for each time, determines the transition of the vehicle, and detects the entry and exit of the vehicle.
PCT/JP2013/070191 2012-10-02 2013-07-25 Appareil de détection de véhicule WO2014054328A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IN3315DEN2015 IN2015DN03315A (fr) 2012-10-02 2013-07-25

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-220429 2012-10-02
JP2012220429A JP6139088B2 (ja) 2012-10-02 2012-10-02 車両検知装置

Publications (1)

Publication Number Publication Date
WO2014054328A1 true WO2014054328A1 (fr) 2014-04-10

Family

ID=50434664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/070191 WO2014054328A1 (fr) 2012-10-02 2013-07-25 Appareil de détection de véhicule

Country Status (3)

Country Link
JP (1) JP6139088B2 (fr)
IN (1) IN2015DN03315A (fr)
WO (1) WO2014054328A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11501541B2 (en) 2019-07-10 2022-11-15 Gatekeeper Inc. Imaging systems for facial detection, license plate reading, vehicle overview and vehicle make, model and color detection
US11538257B2 (en) 2017-12-08 2022-12-27 Gatekeeper Inc. Detection, counting and identification of occupants in vehicles
US11736663B2 (en) 2019-10-25 2023-08-22 Gatekeeper Inc. Image artifact mitigation in scanners for entry control systems
WO2024017413A3 (fr) * 2022-07-22 2024-03-14 顺丰科技有限公司 Procédé et appareil de détection d'entrée/sortie de port d'un véhicule, dispositif et support de stockage

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014106854A1 (de) 2014-05-15 2016-01-28 Odos Imaging Ltd. Bildgebendes System und Verfahren zum Überwachen eines Sichtfeldes
KR101655620B1 (ko) * 2014-12-18 2016-09-07 현대자동차주식회사 차량용 거리측정장치 및 방법
WO2019065218A1 (fr) * 2017-09-28 2019-04-04 株式会社小糸製作所 Système de capteur
CN112158134A (zh) * 2020-09-23 2021-01-01 中国第一汽车股份有限公司 一种车辆对后车路况信息提示***及其提示方法


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1049662A (ja) * 1996-07-31 1998-02-20 Omron Corp 車輌判別装置
JP2001043377A (ja) * 1999-07-30 2001-02-16 Fuji Heavy Ind Ltd フェールセーフ機能を有する車外監視装置
JP2006105661A (ja) * 2004-10-01 2006-04-20 Omron Corp ステレオ画像による平面推定方法
JP2008129920A (ja) * 2006-11-22 2008-06-05 Toshiba Corp 料金収受システム


Also Published As

Publication number Publication date
JP6139088B2 (ja) 2017-05-31
JP2014074939A (ja) 2014-04-24
IN2015DN03315A (fr) 2015-10-09

Similar Documents

Publication Publication Date Title
JP6139088B2 (ja) 車両検知装置
US10753758B2 (en) Top-down refinement in lane marking navigation
RU2698610C2 (ru) Способ и блок обработки для управления системой наблюдения за дорожным движением
US20210110171A9 (en) Barrier and guardrail detection using a single camera
US9047518B2 (en) Method for the detection and tracking of lane markings
RU2571368C1 (ru) Устройство обнаружения трехмерных объектов, способ обнаружения трехмерных объектов
US7046822B1 (en) Method of detecting objects within a wide range of a road vehicle
US9965690B2 (en) On-vehicle control device
RU2636120C2 (ru) Устройство обнаружения трехмерных объектов
US20130286205A1 (en) Approaching object detection device and method for detecting approaching objects
KR100969995B1 (ko) 영상처리기법을 이용한 신호교차로의 교통상충 판단 시스템및 방법
CN101510356B (zh) 视频检测***及其数据处理装置、视频检测方法
WO2014017403A1 (fr) Dispositif de reconnaissance d'image monté dans un véhicule
KR20030080285A (ko) 차량의 대기 길이 측정 장치 및 방법
KR101134857B1 (ko) 주간 및 야간 주행 차량을 조도상황에 따라 검출하는 방법및 장치
KR101210615B1 (ko) 불법유턴 차량 단속 시스템
JP5175765B2 (ja) 画像処理装置及び交通監視装置
JP2013134667A (ja) 車両検知装置
JP4071527B2 (ja) 映像診断装置
JP2003255430A (ja) 映像診断装置、車載型映像監視装置の映像診断システム
JP2002008019A (ja) 軌道認識装置及び軌道認識装置を用いた鉄道車両
WO2014050285A1 (fr) Dispositif de caméra stéréoscopique
JP2014010771A (ja) 車両検知装置
CN103909881A (zh) 环境资讯侦测***及环境资讯侦测方法
JP7523770B2 (ja) 交通量計測装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13843412

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13843412

Country of ref document: EP

Kind code of ref document: A1