WO2012147187A1 - Surrounding vehicle detection device - Google Patents
Surrounding vehicle detection device
- Publication number
- WO2012147187A1 (PCT/JP2011/060323)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- detection
- behind
- lane
- surrounding
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Definitions
- the present invention relates to a surrounding vehicle detection device and method for detecting a surrounding vehicle behind the vehicle.
- the vehicle travel control device includes a rear side vehicle detection unit that detects a vehicle traveling behind in the lane adjacent to the lane in which the host vehicle is traveling; the rear side vehicle detection unit images the rear side of the host vehicle, and the rear side vehicle traveling in the lane adjacent to the lane in which the host vehicle is traveling is detected by image processing in the image processing device.
- an on-vehicle scanning radar device is also known that calculates a delay direction, which follows changes in the direction of presence of the object detected by the radar with a predetermined delay, and determines whether or not an object exists in its own lane based on the calculated delay direction (see, for example, Patent Document 2).
- it is useful to distinguish a surrounding vehicle traveling behind the vehicle in a specific lane having a predetermined relationship with the vehicle's lane (for example, the same lane as the vehicle's lane or a lane adjacent to it) from surrounding vehicles traveling behind the vehicle in other lanes.
- to this end, it is conceivable to set an adjacent lane area as a detection target area and to detect a surrounding vehicle with a rear radar or a rear camera within that detection target area.
- however, if the detection target area is set on the assumption that the vehicle is traveling on a straight road, the detection accuracy may be degraded because the positional relationship between the adjacent lane and the vehicle changes when the vehicle travels on a curved road.
- accordingly, an object of the present invention is to provide a surrounding vehicle detection device and method capable of accurately detecting a surrounding vehicle behind the vehicle traveling in one specific lane having a predetermined relationship with the traveling lane of the vehicle.
- in one aspect, a surrounding vehicle detection device is provided that includes: a surrounding vehicle detection unit that detects a surrounding vehicle behind the vehicle; a curved road information detection unit that detects information related to the curvature radius of a curved road; a storage unit that stores the detection results of the curved road information detection unit; and a processing device that sets a detection target area behind the vehicle based on the detection results, stored in the storage unit, related to the curvature radius of the curved road behind the current vehicle position, and that detects a surrounding vehicle traveling behind the vehicle in one specific lane having a predetermined relationship with the traveling lane of the vehicle based on the detection result of the surrounding vehicle detection unit in the set detection target area.
- according to the present invention, it is possible to provide a surrounding vehicle detection device and method capable of accurately detecting surrounding vehicles behind the vehicle traveling in one specific lane having a predetermined relationship with the traveling lane of the vehicle.
- FIG. 1 shows the configuration of a main part of the surrounding vehicle detection device 1 according to one embodiment (Embodiment 1) of the present invention. FIG. 2 is a flowchart showing an example of the surrounding vehicle detection process executed by the processing device 10 of Embodiment 1. FIG. 3 is a diagram showing an example of the detection target area.
- FIG. 1 is a diagram showing a configuration of a main part of a surrounding vehicle detection device 1 according to an embodiment (embodiment 1) of the present invention.
- the surrounding vehicle detection device 1 includes a processing device 10, a storage device 12, a rear radar sensor 20, a rudder angle sensor 30, and a vehicle speed sensor 32.
- the processing device 10 may be configured by an arithmetic processing device including a CPU.
- the functions of the processing device 10 may be realized by arbitrary hardware, software, firmware, or a combination thereof.
- an arbitrary part or all of the functions of the processing device 10 may be realized by an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a DSP (Digital Signal Processor).
- the processing device 10 may be realized by a plurality of processing devices.
- part or all of the functions of the processing device 10 may be realized by a processing device that can be included in the rear radar sensor 20 or a processing device that can be included in the various sensors 30 and 32.
- the storage device 12 includes a writable storage device, and may include an EEPROM or the like.
- the storage device 12 may be composed of a plurality of storage devices.
- the rear radar sensor 20 detects the presence and state of a surrounding vehicle behind the vehicle using radio waves (for example, millimeter waves), light waves (for example, laser) or ultrasonic waves as detection waves.
- the rear radar sensor 20 detects, at a predetermined cycle, information indicating the relationship between the vehicle and a surrounding vehicle, for example, the relative speed, relative distance, and azimuth (lateral position) of the surrounding vehicle with respect to the vehicle.
- the millimeter wave radar sensor may be, for example, an electronic scanning millimeter wave radar. In this case, the relative speed of a surrounding vehicle is detected using the Doppler frequency (frequency shift) of the radio wave, the relative distance of the surrounding vehicle is detected using the delay time of the reflected wave, and the azimuth of the surrounding vehicle is detected based on the phase difference of the received wave between the plurality of receiving antennas.
- the rear radar sensor 20 may be shared with a sensor used for predicting a collision with a surrounding vehicle behind the vehicle (for example, a radar sensor for a pre-crash system).
- the steering angle sensor 30 outputs an electrical signal corresponding to the steering angle of the steering wheel of the vehicle.
- the processing device 10 calculates the steering angle of the steering wheel based on the output signal from the steering angle sensor 30.
- the vehicle speed sensor 32 outputs an electrical signal (vehicle speed pulse) corresponding to the rotational speed of the wheel.
- the processing device 10 calculates the vehicle speed based on the output signal from the vehicle speed sensor 32.
- the vehicle speed may instead be detected or estimated based on other sensors. For example, the vehicle speed may be detected based on the rotational speed of the output shaft of the transmission, or estimated based on the time change of the vehicle position calculated by a GPS receiver.
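As a purely illustrative sketch of the wheel-pulse-based vehicle speed calculation described above (the pulse count per revolution and tire circumference are assumed example values, not from the patent):

```python
def vehicle_speed_mps(pulse_count, interval_s,
                      pulses_per_revolution=48, tire_circumference_m=1.9):
    """Estimate vehicle speed from the number of wheel-speed pulses counted
    over `interval_s` seconds (the signal described for vehicle speed sensor 32)."""
    revolutions = pulse_count / pulses_per_revolution
    return revolutions * tire_circumference_m / interval_s

# Example: 50 pulses counted over a 100 ms interval -> roughly 19.8 m/s.
v = vehicle_speed_mps(pulse_count=50, interval_s=0.1)
```

The same distance-per-pulse idea underlies the travel-distance integration used later to locate past sensor samples.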
- FIG. 2 is a flowchart illustrating an example of the surrounding vehicle detection process executed by the processing apparatus 10 according to the first embodiment.
- the processing routine shown in FIG. 2 may be repeatedly executed at predetermined intervals while the vehicle is traveling.
- the processing routine shown in FIG. 2 may be repeatedly executed at predetermined intervals during vehicle travel when predetermined driving support control (for example, lane change assist (LCA) control) is executed.
- in step 200, the processing device 10 calculates the steering angle of the steering wheel based on the output signal from the steering angle sensor 30 and stores it in the storage device 12.
- the latest calculated value for a predetermined period may be stored in the storage device 12 by a FIFO (first-in, first-out) method.
- the predetermined period may be determined so that a calculated value of a steering angle at a time point necessary for deriving a curve index value described later can be read.
- in step 202, the processing device 10 calculates the vehicle speed based on the output signal from the vehicle speed sensor 32 and stores it in the storage device 12.
- the latest calculated value for a predetermined period may be stored in the storage device 12 by the FIFO method.
- the predetermined period may be determined so that a calculated value of the vehicle speed at a time point necessary for deriving a curve index value described later can be read.
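The FIFO storage of the latest calculated values over a predetermined period, described in steps 200 and 202 above, can be sketched with a fixed-length ring buffer. All names, the sampling cycle, and the period are illustrative assumptions; the patent does not specify an implementation:

```python
from collections import deque

class SensorHistory:
    """Keeps the latest calculated values for a predetermined period (FIFO)."""

    def __init__(self, period_s, cycle_s):
        # Number of samples covering the predetermined period at the given cycle.
        n = int(round(period_s / cycle_s))
        self.samples = deque(maxlen=n)  # oldest entries are dropped automatically

    def store(self, steering_angle, vehicle_speed):
        self.samples.append((steering_angle, vehicle_speed))

    def value_at(self, seconds_ago, cycle_s):
        """Read the calculated value stored roughly `seconds_ago` before now."""
        idx = int(round(seconds_ago / cycle_s))
        idx = min(idx, len(self.samples) - 1)
        return self.samples[-1 - idx]

# Example: 100 ms cycle, keep 5 s of history, then look back 2 s.
hist = SensorHistory(period_s=5.0, cycle_s=0.1)
for k in range(100):
    hist.store(steering_angle=0.01 * k, vehicle_speed=20.0)
angle_2s_ago, speed_2s_ago = hist.value_at(2.0, cycle_s=0.1)
```

The `maxlen` argument makes `deque` discard the oldest sample automatically, matching the first-in, first-out behavior described in the text.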
- in step 204, the processing device 10 reads, from the storage device 12, the calculated values of the steering angle and the vehicle speed related to the point at the predetermined distance L1 behind the vehicle.
- the predetermined distance L1 is a parameter that should be determined according to the application (type of driving support) using the detection result of surrounding vehicles. For example, when the detection result of the surrounding vehicle is used in the lane change assist control, L1 may be the maximum separation distance (for example, 60 m) of the surrounding vehicle behind the vehicle to be detected, or half of it (the distance to the intermediate point).
- the calculated values of the steering angle and the vehicle speed to be read may be all or part of the calculated values obtained in the section beyond the predetermined distance L1 behind the vehicle and in the section within the predetermined distance L1 behind the vehicle, depending on the calculation mode of the curve index value described later. This will be described in connection with the next step 206.
- in step 206, the processing device 10 calculates a curve index value based on the calculated values of the steering angle and the vehicle speed read in step 204.
- the curve index value may be an index value that indirectly or directly represents the radius of curvature of the curved road behind the vehicle.
- the curve index value may be a radius of curvature of a curved road behind the vehicle, or may be a difference or ratio between the radius of curvature of the curved road behind the vehicle and the radius of curvature of the current vehicle position.
- the calculated value of the steering angle and the vehicle speed read in step 204 may include the calculated value of the current cycle in order to consider the current radius of curvature of the vehicle position.
- the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle based on the following relational expression, for example, assuming that the curvature radius R corresponds to the turning radius of the vehicle:
- R = R0 · (1 + A·V²)   …(1)
- R0 is the turning radius when the vehicle speed V is zero, and is given by R0 = l/δf using the wheelbase l and the actual steering angle δf of the front wheels.
- A is a constant (stability factor) determined according to the vehicle mass, the cornering power characteristics of the front and rear tires, the horizontal distances from the front and rear axles to the center of gravity, and the like.
- the steering angle of the steering wheel and the actual steering angle δf of the front wheels have a fixed relationship determined by the steering gear ratio (known).
- the processing device 10 may substitute the calculated values of the steering angle and the vehicle speed obtained at the point the predetermined distance L1 behind the vehicle into equation (1) to calculate the curvature radius R of the curved road at that point.
- alternatively, the processing device 10 may calculate the curvature radius R of the curved road at the point the predetermined distance L1 behind the vehicle as an average value based on calculated values of the steering angle and the vehicle speed obtained before and after that point.
- alternatively, the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle as an average value based on the calculated values of the steering angle and the vehicle speed from the point the predetermined distance L1 behind the vehicle to the current vehicle position.
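As a hedged illustration, equation (1) together with R0 = l/δf can be evaluated as follows, using the standard stability-factor relation R = R0·(1 + A·V²). The wheelbase, steering gear ratio, and stability factor values are assumptions for the example, not values from the patent:

```python
def turning_radius(steering_wheel_angle_rad, vehicle_speed_mps,
                   wheelbase_m=2.7, steering_gear_ratio=16.0,
                   stability_factor=0.002):
    """Curvature radius R per equation (1): R = R0 * (1 + A * V**2),
    with R0 = l / delta_f and delta_f derived from the steering gear ratio."""
    delta_f = steering_wheel_angle_rad / steering_gear_ratio  # front wheel angle
    r0 = wheelbase_m / delta_f                 # turning radius at zero speed
    return r0 * (1.0 + stability_factor * vehicle_speed_mps ** 2)

# Example: 0.8 rad at the steering wheel, 25 m/s (90 km/h).
R = turning_radius(0.8, 25.0)
```

With these assumed parameters the example yields R0 = 54 m and R = 121.5 m, showing how the radius estimate grows with speed for an understeering (A > 0) vehicle.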
- in practice, the calculated values of the steering angle and the vehicle speed are simply stored in chronological order, so it is difficult to identify exactly which of the calculated values stored in the storage device 12 were obtained at the point the predetermined distance L1 behind the vehicle. Therefore, when the calculated values of the steering angle and the vehicle speed obtained at that point are to be extracted, the calculated values obtained a predetermined time ΔT before the present (a predetermined period ago) may be extracted instead.
- the predetermined time ΔT may be the time obtained by dividing the predetermined distance L1 by the current vehicle speed V.
- alternatively, the travel distance from the time of acquisition of each steering angle and vehicle speed sample may be calculated, and the calculated values of the steering angle and the vehicle speed obtained at the point the predetermined distance L1 behind the vehicle may be identified based on the calculated travel distance.
- alternatively, the processing device 10 may calculate the curvature radius R of the curved road at the point the predetermined distance L1 behind the vehicle as an average value based on a plurality of calculated values of the steering angle and the vehicle speed obtained before and after the time point the predetermined time ΔT (for example, 2 seconds) ago.
- the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle as an average value based on all the calculated steering angles and vehicle speed values obtained in the past two seconds.
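The lookup of samples "obtained at the point the predetermined distance L1 behind the vehicle" via ΔT = L1 / V, together with the averaging window described above, can be sketched as follows. It assumes a constant sampling cycle and a simple list of stored tuples; all names are hypothetical:

```python
def samples_near_distance_behind(history, distance_l1_m, current_speed_mps,
                                 cycle_s, half_window=2):
    """Return the stored samples around the time point delta_t = L1 / V ago.

    `history` is a list of (steering_angle, vehicle_speed) tuples, oldest
    first, sampled every `cycle_s` seconds.  The returned window can be
    averaged to smooth the curvature radius estimate.
    """
    delta_t = distance_l1_m / current_speed_mps      # seconds to look back
    idx_back = int(round(delta_t / cycle_s))         # samples to look back
    center = max(0, len(history) - 1 - idx_back)
    lo = max(0, center - half_window)
    hi = min(len(history), center + half_window + 1)
    return history[lo:hi]

# Example: L1 = 60 m behind at 20 m/s -> look back 3 s (30 samples at 100 ms).
hist = [(0.01 * k, 20.0) for k in range(100)]
window = samples_near_distance_behind(hist, 60.0, 20.0, cycle_s=0.1)
```

Averaging over `window` corresponds to the "average value based on calculated values obtained before and after that point" variant in the text.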
- in step 208, the processing device 10 sets a detection target area corresponding to the curve index value calculated in step 206.
- the detection target area may be a partial area of the detection area of the rear radar sensor 20.
- the detection target area may be a scanning area (range) of the rear radar sensor 20.
- the detection target area is set so that surrounding vehicles in the lane having a predetermined relationship with the traveling lane of the vehicle can be recognized.
- the detection target area is set so that surrounding vehicles in one lane (hereinafter also simply referred to as “adjacent lane”) adjacent to the right side of the travel lane of the vehicle can be recognized.
- the processing apparatus 10 sets the detection target region in a manner that takes into consideration the curvature radius R of the curved road behind the vehicle, based on the curve index value calculated in step 206 described above. Specifically, the detection target area is set so as to cover only the adjacent lane of the curved road behind the vehicle having the radius of curvature R and not include other lanes (for example, the traveling lane of the vehicle).
- such a detection target area may be derived in advance by testing or calculation for curved roads with a plurality of curvature radii R. That is, the portion of the detection area of the rear radar sensor 20 that covers the area of the adjacent lane on the curved road behind the vehicle is examined by testing or calculation for curved roads with a plurality of curvature radii R, and a detection target area corresponding to each of the plurality of curvature radii R may be prepared. In this case, a map representing the relationship between each curve index value corresponding to each curvature radius R and the corresponding detection target area 22a may be created and retained. The processing device 10 may then select and set, from the map, the detection target area corresponding to the curve index value calculated in step 206.
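The map from curve index value to detection target area described above could, purely as a sketch, be held as a table of precomputed corner coordinates keyed by curvature radius, with the nearest entry (compared in curvature space, 1/R) selected at run time. The radii and coordinates below are made-up placeholders, not values from the patent:

```python
# Hypothetical map: curvature radius R [m] -> rectangle corners P1..P4
# (x is longitudinal distance behind the rear bumper, y is lateral offset).
DETECTION_AREA_MAP = {
    float("inf"): [(0, 2.0), (-60, 2.0), (-60, 5.5), (0, 5.5)],   # straight road
    1000.0:       [(0, 2.0), (-60, 3.8), (-60, 7.3), (0, 5.5)],
    500.0:        [(0, 2.0), (-60, 5.6), (-60, 9.1), (0, 5.5)],
}

def select_detection_area(curve_radius_m):
    """Pick the precomputed area whose radius key is closest to the estimate,
    comparing curvatures (1/R) so that 'straight' maps naturally to R = inf."""
    est_curvature = 0.0 if curve_radius_m == float("inf") else 1.0 / curve_radius_m
    def distance(key):
        key_curvature = 0.0 if key == float("inf") else 1.0 / key
        return abs(key_curvature - est_curvature)
    return DETECTION_AREA_MAP[min(DETECTION_AREA_MAP, key=distance)]

# An estimated radius of 600 m falls closest to the 500 m entry.
area = select_detection_area(600.0)
```

Comparing curvatures rather than radii avoids the unbounded gap between large radii and the straight-road (infinite radius) entry.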
- in step 210, the processing device 10 determines whether or not a surrounding vehicle exists in the detection target area set in step 208, based on the detection result of the rear radar sensor 20 in that detection target area. If a surrounding vehicle exists in the detection target area, the process proceeds to step 212. If no surrounding vehicle exists in the detection target area, the process proceeds to step 214.
- in step 212, the processing device 10 sets the detection flag to “1” to indicate that a surrounding vehicle exists in the set detection target area.
- in step 214, the processing device 10 sets the detection flag to “0” to indicate that no surrounding vehicle exists in the set detection target area.
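Steps 210 to 214 can be sketched as a point-in-area test on each radar target followed by the flag update. For simplicity the sketch treats the four corners P1–P4 as an axis-aligned rectangle; the actual area in the patent may be oriented along the (possibly curved) lane, so this is an illustrative simplification:

```python
def detection_flag(radar_targets, area_corners):
    """Return 1 if any radar target lies inside the rectangular detection
    target area defined by its four corners, else 0 (steps 210-214)."""
    xs = [p[0] for p in area_corners]
    ys = [p[1] for p in area_corners]
    x_min, x_max, y_min, y_max = min(xs), max(xs), min(ys), max(ys)
    for x, y in radar_targets:  # target positions relative to the rear sensor
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return 1  # step 212: surrounding vehicle present
    return 0          # step 214: no surrounding vehicle

# Example corners (placeholder values) and two radar targets:
corners = [(0, 2.0), (-60, 2.0), (-60, 5.5), (0, 5.5)]
flag = detection_flag([(-25.0, 3.4), (-40.0, 8.0)], corners)
```

Here the first target lies inside the area, so the flag is set to 1; the second, laterally outside the adjacent-lane band, would be ignored.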
- FIGS. 3 and 4 are top views showing examples of the detection target area set by the processing device 10. FIG. 3 is a diagram showing an example of the detection target area set when the road behind the vehicle is straight.
- FIG. 4 is a diagram illustrating an example of a detection target region that is set when the rear of the vehicle is a curved road.
- the detection area of the rear radar sensor 20 is indicated by reference numeral 22.
- the detection area 22 of the rear radar sensor 20 is schematically shown as a fan shape, but may have an arbitrary shape.
- the detection target region is indicated by reference numeral 22a.
- the detection target area 22a is set as a rectangular internal area defined by four points P1-P4.
- the detection target region 22 a may be set in an arbitrary shape in the detection region 22.
- the detection target region 22a may be set as a fan-shaped internal region defined by a radius r and an angle ⁇ .
- the rear position of the detection target area 22a (the front-rear coordinates of P2 and P3) may correspond to the maximum separation position (the point at the predetermined distance L1) of the surrounding vehicles behind the vehicle to be detected by the rear radar sensor 20 in the lane change assist control.
- the detection target region 22a is preferably set so as to cover only the adjacent lane behind the vehicle and not include other lanes (for example, the traveling lane of the vehicle).
- the area that covers only the adjacent lane behind the vehicle and does not include other lanes (for example, the traveling lane of the vehicle) changes depending on the radius of curvature of the road behind the vehicle, as shown in FIGS. 3 and 4.
- if the detection target area 22a set for a straight road behind the vehicle were set in the same manner when the road behind the vehicle is curved, it would include the area of the vehicle's traveling lane, as shown by the area X in FIG. 4.
- the detection target region 22a is set according to a curve index value (an index value that indirectly or directly represents the radius of curvature of a curved road behind the vehicle).
- as a result, the detection target area 22a is set, according to the radius of curvature of the curved road, so as to substantially exclude lanes other than the adjacent lane (for example, the traveling lane of the vehicle).
- in other words, the detection target area 22a is set so as not to include the area X.
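The dependence of the adjacent-lane position on the curvature radius can be seen from a small geometric sketch (not in the patent): on an arc of radius R, a lane-center point at longitudinal distance x behind the vehicle is shifted laterally by roughly x²/(2R), which is why a straight-road area would spill into the region X:

```python
import math

def lateral_shift(longitudinal_distance_m, curve_radius_m):
    """Lateral offset of a point on an arc of radius R at chord distance x:
    exact value R - sqrt(R^2 - x^2) (valid for |x| < R) and the small-angle
    approximation x^2 / (2R)."""
    x = longitudinal_distance_m
    r = curve_radius_m
    exact = r - math.sqrt(r * r - x * x)
    approx = x * x / (2.0 * r)
    return exact, approx

# Example: 60 m behind on a 500 m radius curve -> about 3.6 m of lateral shift,
# roughly a full lane width.
exact, approx = lateral_shift(60.0, 500.0)
```

A shift on the order of a lane width at the far end of the detection range illustrates why the area must be re-shaped per curvature radius.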
- therefore, according to the present embodiment, a surrounding vehicle in the adjacent lane behind the vehicle can be accurately distinguished from surrounding vehicles in other lanes behind the vehicle. That is, according to the present embodiment, it is possible to effectively prevent a surrounding vehicle in another lane behind the vehicle from being erroneously detected as a surrounding vehicle in the adjacent lane behind the vehicle.
- as described above, a map representing the relationship between the curve index value and the corresponding detection target area 22a (the coordinates of the four points P1-P4) may be created and retained.
- in the present embodiment, the curve index value is calculated based on the detection results of the steering angle sensor 30 and the vehicle speed sensor 32 obtained when the vehicle actually travels (the calculated values of the steering angle and the vehicle speed stored in the storage device 12). Therefore, according to the present embodiment, the curve index value behind the vehicle can be calculated easily and accurately. Further, by using the calculated values of existing sensors such as the steering angle sensor 30 and the vehicle speed sensor 32, an inexpensive configuration can be realized without requiring additional processing.
- FIG. 6 is a diagram showing a main configuration of a surrounding vehicle detection device 2 according to another embodiment (embodiment 2) of the present invention.
- the surrounding vehicle detection device 2 differs from the surrounding vehicle detection device 1 according to Embodiment 1 described above mainly in that a yaw rate sensor 31 is provided instead of the steering angle sensor 30.
- Other components that may have the same configuration are denoted by the same reference numerals and description thereof is omitted.
- the processing apparatus 10 has the same configuration as that of the first embodiment, but the function is different as described later.
- the yaw rate sensor 31 outputs an electrical signal corresponding to the yaw rate generated in the vehicle on which it is mounted.
- the processing device 10 calculates the yaw rate generated in the vehicle based on the output signal from the yaw rate sensor 31.
- the yaw rate sensor 31 may be attached, for example, under the center console of the vehicle.
- the yaw rate sensor 31 includes an acceleration sensor unit that outputs a signal corresponding to the acceleration in the vehicle longitudinal direction or the vehicle width direction generated in the vehicle, and a yaw rate sensor unit that outputs a signal corresponding to the angular velocity generated around the center of gravity axis of the vehicle. It may be realized by a semiconductor sensor configured as described above. Part or all of the functions of the processing device 10 may be realized by a processing device that can be included in the rear radar sensor 20 or a processing device that can be included in the various sensors 31 and 32.
- FIG. 7 is a flowchart illustrating an example of the surrounding vehicle detection process executed by the processing apparatus 10 according to the second embodiment.
- the processing routine shown in FIG. 7 may be repeatedly executed at predetermined intervals while the vehicle is traveling.
- the processing routine shown in FIG. 7 may be repeatedly executed at predetermined intervals during vehicle travel when predetermined driving support control (for example, lane change assist control) is executed.
- in step 700, the processing device 10 calculates the yaw rate based on the output signal from the yaw rate sensor 31 and stores the yaw rate in the storage device 12.
- the latest calculated value for a predetermined period may be stored in the storage device 12 by the FIFO method.
- the predetermined period may be determined so that a calculated value of the yaw rate at a time point necessary for deriving a curve index value described later can be read.
- in step 702, the processing device 10 calculates the vehicle speed based on the output signal from the vehicle speed sensor 32 and stores it in the storage device 12.
- the latest calculated value for a predetermined period may be stored in the storage device 12 by the FIFO method.
- the predetermined period may be determined so that a calculated value of the vehicle speed at a time point necessary for deriving a curve index value described later can be read.
- in step 704, the processing device 10 reads, from the storage device 12, the calculated values of the yaw rate and the vehicle speed related to the point at the predetermined distance L1 behind the vehicle.
- the predetermined distance L1 may be as described in the above embodiment.
- the calculated values of the yaw rate and the vehicle speed to be read may be all or part of the calculated values obtained in the section beyond the predetermined distance L1 behind the vehicle and in the section within the predetermined distance L1 behind the vehicle, depending on the calculation mode of the curve index value described later. This will be described in connection with the next step 706.
- in step 706, the processing device 10 calculates a curve index value based on the calculated values of the yaw rate and the vehicle speed read in step 704.
- the curve index value may be an index value that indirectly or directly represents the radius of curvature of the curved road behind the vehicle.
- the curve index value may be a radius of curvature of a curved road behind the vehicle, or may be a difference or ratio between the radius of curvature of the curved road behind the vehicle and the radius of curvature of the current vehicle position.
- the calculated values of the yaw rate and vehicle speed read in step 704 may include the calculated value of the current cycle.
- the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle based on the following relational expression, for example, assuming that the curvature radius R corresponds to the turning radius of the vehicle:
- R = V / γ   …(2)
- where γ is the yaw rate.
- the processing device 10 may substitute the calculated values of the yaw rate and the vehicle speed obtained at the point the predetermined distance L1 behind the vehicle into equation (2) to calculate the curvature radius R of the curved road at that point. Alternatively, the processing device 10 may calculate the curvature radius R of the curved road at that point as an average value based on the calculated values of the yaw rate and the vehicle speed obtained before and after that point. Alternatively, the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle as an average value based on the calculated values of the yaw rate and the vehicle speed from the point the predetermined distance L1 behind the vehicle to the current vehicle position.
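Equation (2) can be sketched directly. A guard against a near-zero yaw rate (straight driving) is added here as an assumption, since the patent text does not discuss that case:

```python
def curvature_radius_from_yaw(vehicle_speed_mps, yaw_rate_radps,
                              min_yaw_rate=1e-4):
    """Curvature radius R = V / gamma per equation (2).  Returns infinity
    when the yaw rate is too small to indicate a curve (assumed guard)."""
    if abs(yaw_rate_radps) < min_yaw_rate:
        return float("inf")
    return vehicle_speed_mps / abs(yaw_rate_radps)

# Example: 25 m/s at a yaw rate of 0.05 rad/s -> a radius of about 500 m.
R = curvature_radius_from_yaw(25.0, 0.05)
```

Using the absolute yaw rate yields an unsigned radius; the sign (left/right curve) could instead be kept if the detection target area is mirrored per curve direction.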
- in practice, the calculated values of the yaw rate and the vehicle speed are simply stored in chronological order, so it is difficult to identify exactly which of the calculated values stored in the storage device 12 were obtained at the point the predetermined distance L1 behind the vehicle. Therefore, when the calculated values of the yaw rate and the vehicle speed obtained at that point are to be extracted, the calculated values obtained a predetermined time ΔT before the present (a predetermined period ago) may be extracted instead.
- the predetermined time ΔT may be the time obtained by dividing the predetermined distance L1 by the current vehicle speed V.
- alternatively, the travel distance from the time of acquisition of each yaw rate and vehicle speed sample may be calculated, and the calculated values of the yaw rate and the vehicle speed obtained at the point the predetermined distance L1 behind the vehicle may be identified based on the calculated travel distance.
- alternatively, the processing device 10 may calculate the curvature radius R of the curved road at the point the predetermined distance L1 behind the vehicle as an average value based on the calculated values of the yaw rate and the vehicle speed obtained before and after the time point the predetermined time ΔT (for example, 2 seconds) ago.
- the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle as an average value based on the calculated values of all yaw rates and vehicle speeds obtained in the past 2 seconds.
- the processing from step 708 to step 714 may be the same as the processing from step 208 to step 214 in FIG. 2.
- in the present embodiment, the curve index value is calculated based on the detection results of the yaw rate sensor 31 and the vehicle speed sensor 32 obtained when the vehicle actually travels (the calculated values of the yaw rate and the vehicle speed stored in the storage device 12). Therefore, according to the present embodiment, the curve index value behind the vehicle can be calculated easily and accurately.
- FIG. 8 is a diagram illustrating a configuration of a main part of the surrounding vehicle detection device 3 according to another embodiment (embodiment 3) of the present invention.
- the surrounding vehicle detection device 3 is mainly different from the surrounding vehicle detection device 1 according to the first embodiment described above in that a front camera 33 is provided instead of the steering angle sensor 30.
- Other components that may have the same configuration are denoted by the same reference numerals and description thereof is omitted.
- the processing apparatus 10 has the same configuration as that of the first embodiment, but the function is different as described later.
- the front camera 33 captures an image of the landscape in front of the vehicle (front environment image) with an imaging device such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide semiconductor) sensor.
- the front camera 33 is mounted on the vehicle in such a manner that it can capture a landscape in front of the vehicle.
- the front camera 33 is attached to, for example, the back side (front surface of the vehicle) of the rearview mirror.
- the front camera 33 may acquire a front environment image in real time while the vehicle is running, and supply the front environment image to the processing device 10 in a stream format with a predetermined frame period, for example.
- the front camera 33 may also be used for other purposes (for example, a front monitoring camera, a lane keep assist control camera, a light distribution control camera, etc.). Note that some or all of the functions of the processing device 10 may be realized by a processing device that can be included in the rear radar sensor 20, a processing device that can be included in the front camera 33, or the like.
- FIG. 9 is a flowchart illustrating an example of the surrounding vehicle detection process executed by the processing apparatus 10 according to the third embodiment.
- the processing routine shown in FIG. 9 may be repeatedly executed at predetermined intervals while the vehicle is traveling.
- the processing routine shown in FIG. 9 may be repeatedly executed at predetermined intervals during vehicle travel when predetermined driving support control (for example, lane change assist control) is executed.
- In step 900, the processing device 10 receives a front environment image input from the front camera 33.
- the processing apparatus 10 performs image processing on the front environment image input from the front camera 33, and recognizes a white line that may be included in the front environment image.
- There are various white line recognition methods, and any method may be used. For example, a white line may be recognized using white line patterns (such as the feature that white lines exist in pairs) or the luminance difference from the surrounding area.
- the processing device 10 calculates the radius of curvature of the recognized white line and stores it in the storage device 12. As for the curvature radius of the white line, the latest calculated value for a predetermined period may be stored in the storage device 12 by the FIFO method. The predetermined period may be determined so that a calculated value of the curvature radius at a time point necessary for deriving a curve index value described later can be read.
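The FIFO storage described above can be sketched as follows (an illustrative sketch only; the class and parameter names are assumptions, and a fixed frame period is assumed so that the predetermined period maps to a fixed number of samples):

```python
from collections import deque

class CurvatureLog:
    """FIFO log of white-line curvature radii as a time-stamped ring buffer."""

    def __init__(self, period_s, sample_interval_s):
        # Keep only enough samples to cover the predetermined period;
        # deque with maxlen drops the oldest entry automatically.
        maxlen = round(period_s / sample_interval_s)
        self.buffer = deque(maxlen=maxlen)

    def store(self, timestamp_s, radius_m):
        self.buffer.append((timestamp_s, radius_m))

    def latest(self):
        return self.buffer[-1] if self.buffer else None

# Example: 0.1 s frame period, keep the last 10 s of calculated values.
log = CurvatureLog(period_s=10.0, sample_interval_s=0.1)
for i in range(200):            # 20 s of samples; only the last 10 s survive
    log.store(i * 0.1, 500.0 + i)
```

The `maxlen` of the deque plays the role of the "predetermined period" of the embodiment: older calculated values fall out as new ones arrive.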
- In step 904, the processing device 10 reads from the storage device 12 the calculated value of the white line curvature radius at the point of the predetermined distance L1 behind the vehicle.
- the predetermined distance L1 may be as described in the above embodiment.
- The calculated values of the white line curvature radius to be read may be all or part of those obtained for the section farther back than the predetermined distance L1 behind the vehicle and for the section within the predetermined distance L1 behind the vehicle, depending on the curve index value calculation mode described later. This is explained in connection with the next step 906.
- In step 906, the processing device 10 calculates a curve index value based on the calculated values of the white line curvature radius read in step 904.
- the curve index value may be an index value that indirectly or directly represents the radius of curvature of the curved road behind the vehicle.
- the curve index value may be a radius of curvature of a curved road behind the vehicle, or may be a difference or ratio between the radius of curvature of the curved road behind the vehicle and the radius of curvature of the current vehicle position.
- the calculated value of the radius of curvature of the white line read in step 904 may include the calculated value of the current cycle.
- the processing device 10 may regard the radius of curvature of the white line obtained at the point of the vehicle rear predetermined distance L1 as the radius of curvature R of the curved road at the point of the vehicle rear predetermined distance L1.
- Alternatively, the processing device 10 may calculate the curvature radius R of the curved road at the point of the predetermined distance L1 behind the vehicle as an average value, based on the calculated values of the white line curvature radius at a plurality of points before and after that point.
- Alternatively, the processing device 10 may calculate the curvature radius R of the curved road behind the vehicle as an average value, based on the calculated values of the white line curvature radius from the point of the predetermined distance L1 behind the vehicle to the current vehicle position.
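The averaging just described might look like the following (a minimal sketch; the function name and passing the stored values as a plain list are assumptions):

```python
def curve_index_mean_radius(radii_m):
    """Curve index value as the mean of the stored white-line curvature
    radii over the section from the point L1 behind the vehicle to the
    current vehicle position."""
    if not radii_m:
        return None  # no stored values for the section
    return sum(radii_m) / len(radii_m)

# e.g. three calculated values stored for the section behind the vehicle
mean_r = curve_index_mean_radius([480.0, 500.0, 520.0])
```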
- In practice, the calculated values of the white line curvature radius are simply stored in chronological order, so it is difficult to identify exactly which value stored in the storage device 12 corresponds to, for example, the point of the predetermined distance L1 behind the vehicle. Therefore, when the calculated value of the white line curvature radius obtained at the point of the predetermined distance L1 behind the vehicle is to be extracted, the calculated value obtained a predetermined time ΔT before the present may be extracted instead.
- The predetermined time ΔT may be obtained by dividing the distance L1 + L2 by the current vehicle speed V.
- The distance L2 may correspond to the forward distance at which the front camera 33 recognizes the white line.
- Alternatively, the travel distance since the acquisition of each white line curvature radius may be calculated, and the calculated value of the white line curvature radius obtained at the point of the predetermined distance L1 behind the vehicle may be identified based on the calculated travel distance.
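The ΔT-based extraction described above might be sketched like this (illustrative only; the sample layout and function name are assumptions, and a closest-timestamp match stands in for the exact bookkeeping of the embodiment):

```python
def delta_t_lookup(samples, now_s, L1_m, L2_m, v_mps):
    """Pick the stored curvature value acquired closest to (now - dT),
    where dT = (L1 + L2) / V approximates when the point now L1 behind
    the vehicle was observed L2 ahead of it by the front camera."""
    dt = (L1_m + L2_m) / v_mps
    target = now_s - dt
    # samples: (timestamp_s, curvature_radius_m) in chronological order
    return min(samples, key=lambda ts_val: abs(ts_val[0] - target))[1]

samples = [(t * 0.5, 400.0 + t) for t in range(40)]   # 0 .. 19.5 s
# vehicle at 20 m/s, L1 = 30 m behind, camera looks L2 = 10 m ahead -> dT = 2 s
r = delta_t_lookup(samples, now_s=19.5, L1_m=30.0, L2_m=10.0, v_mps=20.0)
```

The travel-distance variant would replace the timestamp key with an accumulated-distance key, but follows the same lookup pattern.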
- step 908 to step 914 may be the same as the processing from step 208 to step 214 in FIG.
- Also in this embodiment, the curve index value is obtained from the detection (recognition) results of the white line curvature radius acquired while the vehicle actually travels (the calculated values of the white line curvature radius stored in the storage device 12). Therefore, according to the present embodiment, the curve index value behind the vehicle can be calculated easily and accurately.
- FIG. 10 is a diagram illustrating a configuration of a main part of a surrounding vehicle detection device 4 according to another embodiment (embodiment 4) of the present invention.
- the surrounding vehicle detection device 4 is mainly different from the surrounding vehicle detection device 1 according to the first embodiment described above in that a rear monitoring camera 26 is provided instead of the rear radar sensor 20.
- Components that may have the same configuration as in the first embodiment are denoted by the same reference numerals, and their description is omitted.
- the processing apparatus 10 has the same configuration as that of the first embodiment, but the function is different as described later.
- the rear monitoring camera 26 captures a landscape image (rear environment image) behind the vehicle by an image sensor such as a CCD or CMOS.
- the rear monitoring camera 26 is mounted on the vehicle in such a manner that it can capture the scenery behind the vehicle.
- the rear monitoring camera 26 is attached to, for example, a rear door.
- The rear monitoring camera 26 may include a wide-angle lens and capture a wide range behind the vehicle, or may capture only a region directly behind the vehicle (see the detection region 22 in FIG. 3).
- the rear monitoring camera 26 may acquire a rear environment image in real time while the vehicle is running, and supply the rear environment image to the processing device 10 in a stream format with a predetermined frame period, for example. Note that some or all of the functions of the processing apparatus 10 may be realized by a processing apparatus that can be included in the rear monitoring camera 26 or the like.
- FIG. 11 is a flowchart illustrating an example of the surrounding vehicle detection process executed by the processing apparatus 10 according to the fourth embodiment.
- the processing routine shown in FIG. 11 may be repeatedly executed at predetermined intervals while the vehicle is traveling.
- the processing routine shown in FIG. 11 may be repeatedly executed at predetermined intervals during vehicle travel when predetermined driving support control (for example, lane change assist control) is executed.
- predetermined driving support control for example, lane change assist control
- step 1100 to step 1106 may be the same as the processing from step 200 to step 206 in FIG.
- In step 1108, the processing device 10 sets a detection target area according to the curve index value calculated in step 1106.
- the detection target area may be a partial area of the detection area (imaging area) of the rear monitoring camera 26.
- the concept of the detection target region setting method may be the same as in the first embodiment.
- The processing device 10 identifies, in the rear environment image from the rear monitoring camera 26, the image area corresponding to the detection target area set in step 1108, and determines whether or not a surrounding vehicle can be recognized in that image area.
- A surrounding vehicle may be recognized in the image based on, for example, the shape (shape pattern) of the vehicle, the color and characteristics of its lights, characteristics of its movement, and the like. If a surrounding vehicle can be recognized in the image area corresponding to the set detection target area, the process proceeds to step 1112; if not, the process proceeds to step 1114.
- step 1112 and step 1114 may be the same as the processing of step 212 and step 214 of FIG.
- the same effect as that of the first embodiment described above can be obtained.
- the fourth embodiment can be combined with the second and third embodiments described above. That is, in the second and third embodiments described above, the rear monitoring camera 26 can be used instead of the rear radar sensor 20.
- FIG. 12 is a flowchart illustrating another example of the surrounding vehicle detection process executed by the processing apparatus 10 according to the fourth embodiment.
- the processing routine shown in FIG. 12 may be repeatedly executed at predetermined intervals while the vehicle is traveling. Alternatively, the processing routine shown in FIG. 12 may be repeatedly executed at predetermined intervals during vehicle travel when predetermined driving support control (for example, lane change assist control) is executed.
- In step 1200, the processing device 10 acquires a rear environment image from the rear monitoring camera 26 and stores it in the storage device 12.
- the storage device 12 may be a RAM or the like.
- In step 1204, the processing device 10 reads from the storage device 12 a rear environment image related to the point of the predetermined distance L1 behind the vehicle.
- the predetermined distance L1 may be as described in the above embodiment.
- In step 1206, the processing device 10 performs image processing on the rear environment image, recognizes any white line included in the rear environment image, and calculates a curve index value based on the curvature radius of the recognized white line.
- the curve index value may be an index value that indirectly or directly represents the radius of curvature of the curved road behind the vehicle. There are various white line recognition methods, and any method may be used.
- In step 1208, the processing device 10 sets a detection target area according to the curve index value calculated in step 1206.
- the detection target area may be a partial area of the detection area (imaging area) of the rear monitoring camera 26.
- the concept of the detection target region setting method may be the same as in the first embodiment.
- In step 1210, the processing device 10 identifies, in the current latest rear environment image from the rear monitoring camera 26, the image area corresponding to the detection target area set in step 1208, and determines whether or not a surrounding vehicle can be recognized in that image area. If the rear environment image used in step 1206 is the current latest rear environment image, the rear environment image used in step 1210 is the same as that used in step 1206. There are a wide variety of vehicle image recognition methods, and any method may be used. For example, a vehicle may be recognized based on its shape (shape pattern), the color and characteristics of its lights, characteristics of its movement, and the like. If a surrounding vehicle can be recognized in the image area corresponding to the set detection target area, the process proceeds to step 1212; if not, the process proceeds to step 1214.
- FIG. 13 is a diagram showing a main configuration of a surrounding vehicle detection device 5 according to another embodiment (embodiment 5) of the present invention.
- The surrounding vehicle detection device 5 mainly differs from the surrounding vehicle detection device 1 according to the first embodiment described above in that an output device 40 is provided.
- Components that may have the same configuration as in the first embodiment are denoted by the same reference numerals, and their description is omitted.
- The processing device 10 has the same configuration as in the first embodiment, but differs in that, in addition to a surrounding vehicle detection function similar to that of the first embodiment described above, it provides the following functions.
- the output device 40 may be any type of device that can directly or indirectly output information that informs the driver of the vehicle of the presence of a surrounding vehicle.
- The output device 40 may be a video output device such as a display, an audio output device such as a speaker, or a device that generates vibration or force that can be perceived by the driver.
- FIG. 14 is a flowchart showing an example of lane change assist control executed by the processing apparatus 10.
- the process flow shown in FIG. 14 may be executed in parallel with the surrounding vehicle detection process shown in FIG.
- the process flow shown in FIG. 14 may be executed while the lane change assist control is on, for example.
- In step 1400, the processing device 10 determines whether or not the detection flag is “1”. If the detection flag is “1”, that is, if a surrounding vehicle has been detected in the detection target area behind the vehicle, the process proceeds to step 1402. On the other hand, if the detection flag is “0”, that is, if no surrounding vehicle has been detected in the detection target area behind the vehicle, it can be determined that no alarm or the like is necessary, and the process ends.
- In step 1402, the processing device 10 determines whether or not a predetermined alarm condition is satisfied.
- the predetermined alarm condition may be set from the viewpoint of whether or not the surrounding vehicle behind the vehicle detected in the detection target area behind the vehicle is a surrounding vehicle that should alert the driver of the vehicle.
- The predetermined alarm condition may be satisfied when, for example, the driver’s intention to change lanes to the adjacent lane is detected. This intention may be detected based on, for example, an operation signal of the turn signal lever.
- the predetermined alarm condition may be satisfied when the vehicle speed of the surrounding vehicle behind the vehicle detected in the detection target area behind the vehicle is equal to or higher than the predetermined vehicle speed.
- the predetermined vehicle speed may be set based on the current vehicle speed.
- the determination of the predetermined alarm condition may be omitted. That is, the determination process in step 1402 may be omitted. In this case, if an affirmative determination is made in step 1400, the process proceeds directly to step 1404.
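Combining steps 1400 and 1402, the decision logic might be sketched as follows (a sketch under assumptions: the function name, the turn-signal flag, and the speed margin are illustrative, and the threshold “based on the current vehicle speed” is modeled here as own speed plus a margin):

```python
def should_warn(detection_flag, turn_signal_on, rear_vehicle_speed_mps,
                own_speed_mps, margin_mps=0.0):
    """Decide whether to alert the driver: a surrounding vehicle was
    detected in the detection target area (step 1400) and the alarm
    condition holds (step 1402): lane-change intent is indicated and
    the rear vehicle meets a speed threshold based on own speed."""
    if detection_flag != 1:
        return False          # nothing detected: no alarm needed
    if not turn_signal_on:
        return False          # no lane-change intent detected
    return rear_vehicle_speed_mps >= own_speed_mps + margin_mps
```

If the alarm-condition check of step 1402 is omitted, the sketch reduces to returning `detection_flag == 1`.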
- In step 1404, the processing device 10 outputs to the output device 40 information notifying the driver of the vehicle of the presence of the surrounding vehicle.
- For example, the processing device 10 may sound a buzzer, output a message such as “Please pay attention to the vehicle behind” from a speaker, or light a warning lamp provided on the meter to notify the driver of the presence of the surrounding vehicle.
- the lane change assist control can be executed with high reliability by utilizing the highly accurate surrounding vehicle detection processing result according to the first embodiment.
- the fifth embodiment can be combined with the second to fourth embodiments described above. That is, in the fifth embodiment, it is possible to use the surrounding vehicle detection processing result according to any of the above-described embodiments 2 to 4 instead of the surrounding vehicle detection processing result according to the above-described first embodiment.
- the rear radar 20 has the detection area 22 on the right rear side of the vehicle in order to detect surrounding vehicles behind the vehicle traveling in the adjacent lane on the right side of the own lane.
- The mounting position (detection region 22) of the rear radar 20 may be determined according to which lane, relative to the own lane, the surrounding vehicles behind the vehicle are to be detected in.
- the rear radar 20 may be provided at the center of the vehicle.
- the rear radar 20 may be provided on the left side of the rear portion of the vehicle in order to detect a surrounding vehicle behind the vehicle traveling in the adjacent lane on the left side of the own lane (one adjacent lane). Further, these can be combined, and the left and right rear radars 20 can be mounted on the vehicle.
- a plurality of radars can be switched and used. The same applies to the rear monitoring camera 26.
- the detection target area 22a is set so as to cover only the adjacent lane behind the vehicle and not include other lanes (for example, the own lane).
- Similarly, the detection target area may be set so as to cover only the same lane as the own lane behind the vehicle and not to include other lanes (for example, adjacent lanes).
- the target lane may be any lane having a certain relationship with the own lane.
- the detection target region 22a only needs to substantially cover only the adjacent lane behind the vehicle, and may include other lanes (for example, own lane) to some extent.
- This is because it is impossible in practice to set a detection target region 22a that covers only the adjacent lane behind the vehicle and does not include any other lane at all.
- both the steering angle and the vehicle speed are used as a preferred embodiment, but only the steering angle may be used.
- The steering angle itself may be used as the curve index value.
- both the yaw rate and the vehicle speed are used as preferred examples, but only the yaw rate may be used.
- the yaw rate itself may be used as the curve index value.
- the lane change assist control is exemplified as the driving support control using the surrounding vehicle detection result.
- The surrounding vehicle detection result can be used effectively for various driving support controls.
- it can be used for control that supports overtaking of a preceding vehicle as disclosed in Patent Document 1 described above.
- The above-described embodiment detects a surrounding vehicle behind the vehicle on a specific lane, but this is substantially equivalent to determining (identifying) the traveling lane of the surrounding vehicle behind the vehicle.
- the above-described embodiment may be equivalently embodied as a traveling lane discrimination device for surrounding vehicles behind the vehicle.
Description
a curved road information detection unit that detects information related to the curvature radius of a curved road;
a storage unit that stores the detection results of the curved road information detection unit; and
a processing unit that sets a detection target area behind the vehicle based on detection results of the curved road information detection unit stored in the storage unit, the detection results relating to the curvature radius of the curved road behind the current vehicle position, and that detects, based on the detection result of the surrounding vehicle detection unit in the set detection target area, a surrounding vehicle behind the vehicle traveling on one specific lane having a predetermined relationship with the traveling lane of the vehicle; a surrounding vehicle detection device comprising these is provided.
[Embodiment 1]
FIG. 1 is a diagram showing the main configuration of a surrounding vehicle detection device 1 according to one embodiment (Embodiment 1) of the present invention.
The processing device 10 may be realized by, for example, an FPGA (Field Programmable Gate Array) or a DSP (digital signal processor). The processing device 10 may also be realized by a plurality of processing devices. Some or all of the functions of the processing device 10 may be realized by a processing device that can be included in the rear radar sensor 20 or in the various sensors 30 and 32.
R = R0(1 + A·V²)   Equation (1)
Here, R0 is the turning radius when the vehicle speed V is zero, and is given by R0 = l/θf using the wheelbase l and the actual steering angle θf of the front wheels. In Equation (1), A is a constant (stability factor) determined by the vehicle mass, the cornering power characteristics of the front and rear tires, the horizontal distances from the front and rear axles to the center of gravity, and the like. The steering angle and the actual front-wheel steering angle θf have a fixed relationship determined by the (known) steering gear ratio.
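As a numerical illustration of the turning radius relation (a sketch only, using the standard stability factor form R = R0(1 + A·V²) with R0 = l/θf; the function name and all parameter values are assumptions):

```python
def turning_radius_from_steering(steering_angle_rad, gear_ratio,
                                 wheelbase_m, stability_factor, v_mps):
    """Turning radius R = R0 * (1 + A * V^2), where R0 = l / theta_f is
    the turning radius at zero speed and theta_f is the actual front-wheel
    steering angle (steering angle divided by the steering gear ratio)."""
    theta_f = steering_angle_rad / gear_ratio
    r0 = wheelbase_m / theta_f
    return r0 * (1.0 + stability_factor * v_mps ** 2)

# e.g. 0.32 rad at the steering wheel, gear ratio 16 -> theta_f = 0.02 rad,
# wheelbase 2.6 m, stability factor 0.0015 s^2/m^2, speed 20 m/s
r = turning_radius_from_steering(0.32, 16.0, 2.6, 0.0015, 20.0)
```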
[Embodiment 2]
FIG. 6 is a diagram showing the main configuration of a surrounding vehicle detection device 2 according to another embodiment (Embodiment 2) of the present invention.
R = V/γ   Equation (2)
Here, γ is the yaw rate.
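Equation (2) can be illustrated directly (a sketch; the function name and the sample values are assumptions):

```python
def turning_radius_from_yaw_rate(v_mps, yaw_rate_rad_s):
    """Equation (2): R = V / gamma, the curvature radius of the path
    obtained from the vehicle speed V and the yaw rate gamma."""
    return v_mps / yaw_rate_rad_s

# 20 m/s with a yaw rate of 0.05 rad/s corresponds to a radius of about 400 m
r = turning_radius_from_yaw_rate(20.0, 0.05)
```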
[Embodiment 3]
FIG. 8 is a diagram showing the main configuration of a surrounding vehicle detection device 3 according to another embodiment (Embodiment 3) of the present invention.
The front camera 33 captures an image of the scene ahead of the vehicle (front environment image) using an imaging device such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor. The front camera 33 is mounted on the vehicle in such a manner that it can image the scene ahead of the vehicle. For example, the front camera 33 is attached to the back side of the rearview mirror (the surface facing the front of the vehicle). The front camera 33 may acquire the front environment image in real time while the vehicle is traveling and supply it to the processing device 10, for example, in a stream format with a predetermined frame period. The front camera 33 may also serve other purposes (for example, as a front monitoring camera, a lane keep assist control camera, or a light distribution control camera). Some or all of the functions of the processing device 10 may be realized by a processing device that can be included in the rear radar sensor 20, the front camera 33, or the like.
[Embodiment 4]
FIG. 10 is a diagram showing the main configuration of a surrounding vehicle detection device 4 according to another embodiment (Embodiment 4) of the present invention.
[Embodiment 5]
FIG. 13 is a diagram showing the main configuration of a surrounding vehicle detection device 5 according to another embodiment (Embodiment 5) of the present invention.
10 processing device
12 storage device
20 rear radar sensor
22 detection area
22a detection target area
30 steering angle sensor
31 yaw rate sensor
32 vehicle speed sensor
33 front camera
40 output device
Claims (6)
- A surrounding vehicle detection unit that detects surrounding vehicles behind a vehicle;
a curved road information detection unit that detects information related to the curvature radius of a curved road;
a storage unit that stores the detection results of the curved road information detection unit; and
a processing unit that sets a detection target area behind the vehicle based on detection results of the curved road information detection unit stored in the storage unit, the detection results relating to the curvature radius of the curved road behind the current vehicle position, and that detects, based on the detection result of the surrounding vehicle detection unit in the set detection target area, a surrounding vehicle behind the vehicle traveling on one specific lane having a predetermined relationship with the traveling lane of the vehicle; a surrounding vehicle detection device comprising these units. - The surrounding vehicle detection device according to claim 1, wherein the specific lane is the same lane as the traveling lane of the vehicle, or one lane adjacent to the traveling lane of the vehicle on either the left or the right.
- The curved road information detection unit detects information related to the curvature radius of a curved road at a predetermined time period, and
the processing unit extracts from the storage unit, based on vehicle speed information, a detection result related to the curvature radius of the curved road a predetermined distance behind the current vehicle position; the surrounding vehicle detection device according to claim 1 or 2. - The surrounding vehicle detection device according to any one of claims 1 to 3, wherein the curved road information detection unit includes at least one of a steering angle sensor, a yaw rate sensor, a front camera, and a rear camera.
- An output device is further provided, and
the processing unit, upon detecting a surrounding vehicle behind the vehicle traveling on the specific lane, outputs to the output device information notifying the driver of the vehicle of the presence of the surrounding vehicle; the surrounding vehicle detection device according to any one of claims 1 to 4. - Detecting, by a curved road information detection unit, information related to the curvature radius of a curved road;
storing the detection results of the curved road information detection unit in a storage unit;
setting, by a processing unit, a detection target area behind the vehicle based on detection results of the curved road information detection unit stored in the storage unit, the detection results relating to the curvature radius of the curved road behind the current vehicle position; and
detecting, by the processing unit, based on the detection result in the set detection target area of a surrounding vehicle detection unit that detects surrounding vehicles behind the vehicle, a surrounding vehicle behind the vehicle traveling on one specific lane having a predetermined relationship with the traveling lane of the vehicle; a surrounding vehicle detection method comprising these steps.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/112,433 US20140044311A1 (en) | 2011-04-27 | 2011-04-27 | Neighboring vehicle detecting apparatus |
EP20110864308 EP2704122B1 (en) | 2011-04-27 | 2011-04-27 | Periphery vehicle detection device |
CN201180070367.5A CN103503045A (zh) | 2011-04-27 | 2011-04-27 | 周边车辆检测装置 |
PCT/JP2011/060323 WO2012147187A1 (ja) | 2011-04-27 | 2011-04-27 | 周辺車両検出装置 |
JP2013511841A JPWO2012147187A1 (ja) | 2011-04-27 | 2011-04-27 | 周辺車両検出装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/060323 WO2012147187A1 (ja) | 2011-04-27 | 2011-04-27 | 周辺車両検出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012147187A1 true WO2012147187A1 (ja) | 2012-11-01 |
Family
ID=47071728
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/060323 WO2012147187A1 (ja) | 2011-04-27 | 2011-04-27 | 周辺車両検出装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20140044311A1 (ja) |
EP (1) | EP2704122B1 (ja) |
JP (1) | JPWO2012147187A1 (ja) |
CN (1) | CN103503045A (ja) |
WO (1) | WO2012147187A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013242679A (ja) * | 2012-05-21 | 2013-12-05 | Panasonic Corp | 障害物検知装置 |
WO2014080504A1 (ja) * | 2012-11-22 | 2014-05-30 | トヨタ自動車株式会社 | 音検知装置及び方法、運転支援装置及び方法、並びに報知装置及び方法 |
KR101583975B1 (ko) * | 2014-08-29 | 2016-01-08 | 현대자동차주식회사 | 후방 차량 인식 장치 및 방법 |
CN105936299A (zh) * | 2015-03-05 | 2016-09-14 | 福特全球技术公司 | 用于平行停放车辆的***和方法 |
KR20160134831A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR20160134829A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR20160134830A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
JP2017084034A (ja) * | 2015-10-27 | 2017-05-18 | 株式会社日立製作所 | 交通情報提供装置及びシステム及び方法 |
JP2019213108A (ja) * | 2018-06-07 | 2019-12-12 | クラリオン株式会社 | キャリブレーション装置及び電子ミラーシステム |
JP2022007283A (ja) * | 2020-06-26 | 2022-01-13 | 酒井重工業株式会社 | 建設車両の障害物検知装置 |
US11407411B2 (en) | 2016-09-29 | 2022-08-09 | Denso Corporation | Other lane monitoring device |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5626147B2 (ja) * | 2011-07-04 | 2014-11-19 | 株式会社デンソー | 車両接近物検知装置 |
EP3159866B1 (en) * | 2014-06-19 | 2022-04-13 | Hitachi Astemo, Ltd. | Object recognition apparatus and vehicle travel controller using same |
EP3007150A1 (en) * | 2014-10-07 | 2016-04-13 | Autoliv Development AB | Lane change detection |
ITRM20140615A1 (it) * | 2014-10-30 | 2016-04-30 | Moronesi Mario | Sistema automatico per impedire gli incidenti stradali o per ridurne la gravita' |
JP6207553B2 (ja) * | 2015-07-16 | 2017-10-04 | 本田技研工業株式会社 | 運転支援装置、運転支援方法 |
KR102503253B1 (ko) * | 2015-12-14 | 2023-02-22 | 현대모비스 주식회사 | 주변 차량 인지 시스템 및 방법 |
CN107886729B (zh) * | 2016-09-30 | 2021-02-23 | 比亚迪股份有限公司 | 车辆识别方法、装置及车辆 |
CN109804421A (zh) * | 2016-10-07 | 2019-05-24 | 日产自动车株式会社 | 车辆判断方法、行驶路径修正方法、车辆判断装置及行驶路径修正装置 |
KR102039487B1 (ko) * | 2016-11-11 | 2019-11-26 | 엘지전자 주식회사 | 차량 주행 제어 장치 및 방법 |
US11644556B2 (en) * | 2017-06-20 | 2023-05-09 | Nec Corporation | Position measurement device, position measurement method, and program recording medium |
KR102368604B1 (ko) * | 2017-07-03 | 2022-03-02 | 현대자동차주식회사 | Ecu, 상기 ecu를 포함하는 무인 자율 주행 차량, 및 이의 차선 변경 제어 방법 |
US10239451B1 (en) * | 2017-09-05 | 2019-03-26 | GM Global Technology Operations LLC | Systems and methods for providing relative lane assignment of objects at distances from the vehicle |
JP7202835B2 (ja) * | 2018-10-05 | 2023-01-12 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツング | 制動制御装置、横滑り抑制装置、制動制御方法、及びプログラム |
US20200301420A1 (en) * | 2019-03-22 | 2020-09-24 | Veoneer Us, Inc. | System and method to control the velocity and heading of a vehicle based on preview information |
DE102020216470A1 (de) * | 2019-12-26 | 2021-07-01 | Mando Corporation | Fahrerassistenzsystem, damit ausgestattetes fahrzeug und verfahren zum steuern des fahrzeugs |
KR20210126442A (ko) | 2020-04-10 | 2021-10-20 | 현대모비스 주식회사 | 차량의 후측방 경고 시스템 및 방법 |
JP7472816B2 (ja) * | 2021-02-12 | 2024-04-23 | トヨタ自動車株式会社 | 注意喚起装置 |
DE102022200218B3 (de) | 2022-01-12 | 2023-05-04 | Volkswagen Aktiengesellschaft | Verfahren und Assistenzsystem zum Unterstützen eines teilautomatisierten Fahrbetriebs basierend auf Kamera- und Schwarmdaten und Kraftfahrzeug |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09203780A (ja) | 1995-11-24 | 1997-08-05 | Toyota Motor Corp | 車載走査型レーダ装置 |
JP2000214256A (ja) * | 1999-01-28 | 2000-08-04 | Mazda Motor Corp | 車両の表示装置 |
JP2001180404A (ja) * | 1999-12-24 | 2001-07-03 | Mitsubishi Motors Corp | 車両の後方監視装置 |
JP2003063273A (ja) | 2001-08-30 | 2003-03-05 | Hitachi Ltd | 車両走行制御装置 |
JP2006514382A (ja) * | 2003-08-18 | 2006-04-27 | フィコ ミロールス,エセ ア | 自動車の外部環境を監視するためのシステム及び方法 |
JP2008056173A (ja) * | 2006-09-01 | 2008-03-13 | Toyota Motor Corp | 車両用運転支援装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3164439B2 (ja) * | 1992-10-21 | 2001-05-08 | マツダ株式会社 | 車両用障害物検出装置 |
JPH11345394A (ja) * | 1998-06-03 | 1999-12-14 | Mitsubishi Electric Corp | 車両の周辺監視装置 |
JP2001124853A (ja) * | 1999-10-29 | 2001-05-11 | Mitsubishi Electric Corp | 安全運転支援センサ |
US6882287B2 (en) * | 2001-07-31 | 2005-04-19 | Donnelly Corporation | Automotive lane change aid |
JP4016735B2 (ja) * | 2001-11-30 | 2007-12-05 | 株式会社日立製作所 | レーンマーク認識方法 |
JP2005088806A (ja) * | 2003-09-18 | 2005-04-07 | Hitachi Unisia Automotive Ltd | 操舵制御装置 |
JP2007164432A (ja) * | 2005-12-13 | 2007-06-28 | Matsushita Electric Ind Co Ltd | 危険情報検出装置及び危険情報検出方法 |
DE102007033887A1 (de) * | 2007-07-20 | 2008-09-04 | Vdo Automotive Ag | Fahrerassistenzsystem mit Empfehlung für einen Fahrspurwechsel |
KR101102144B1 (ko) * | 2009-11-17 | 2012-01-02 | 주식회사 만도 | 차선 유지 제어 방법 및 시스템 |
US8504233B1 (en) * | 2012-04-27 | 2013-08-06 | Google Inc. | Safely navigating on roads through maintaining safe distance from other vehicles |
-
2011
- 2011-04-27 JP JP2013511841A patent/JPWO2012147187A1/ja active Pending
- 2011-04-27 US US14/112,433 patent/US20140044311A1/en not_active Abandoned
- 2011-04-27 WO PCT/JP2011/060323 patent/WO2012147187A1/ja active Application Filing
- 2011-04-27 CN CN201180070367.5A patent/CN103503045A/zh active Pending
- 2011-04-27 EP EP20110864308 patent/EP2704122B1/en not_active Not-in-force
Non-Patent Citations (1)
Title |
---|
See also references of EP2704122A4 |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013242679A (ja) * | 2012-05-21 | 2013-12-05 | Panasonic Corp | 障害物検知装置 |
WO2014080504A1 (ja) * | 2012-11-22 | 2014-05-30 | トヨタ自動車株式会社 | 音検知装置及び方法、運転支援装置及び方法、並びに報知装置及び方法 |
KR20180125620A (ko) * | 2014-04-01 | 2018-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR101997817B1 (ko) * | 2014-04-01 | 2019-07-08 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR20160134831A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR20160134829A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR20160134830A (ko) * | 2014-04-01 | 2016-11-23 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR102050526B1 (ko) * | 2014-04-01 | 2019-11-29 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR101997818B1 (ko) * | 2014-04-01 | 2019-07-08 | 스카니아 씨브이 악티에볼라그 | 이웃하는 적어도 두 개의 차로가 있는 도로에서 선행 차량을 주행하는 중의 차로 변경의 위험성을 평가하는 방법 및 시스템 |
KR101583975B1 (ko) * | 2014-08-29 | 2016-01-08 | 현대자동차주식회사 | 후방 차량 인식 장치 및 방법 |
CN105936299A (zh) * | 2015-03-05 | 2016-09-14 | 福特全球技术公司 | 用于平行停放车辆的***和方法 |
JP2017084034A (ja) * | 2015-10-27 | 2017-05-18 | 株式会社日立製作所 | 交通情報提供装置及びシステム及び方法 |
US11407411B2 (en) | 2016-09-29 | 2022-08-09 | Denso Corporation | Other lane monitoring device |
JP2019213108A (ja) * | 2018-06-07 | 2019-12-12 | クラリオン株式会社 | キャリブレーション装置及び電子ミラーシステム |
WO2019235004A1 (ja) * | 2018-06-07 | 2019-12-12 | クラリオン株式会社 | キャリブレーション装置及び電子ミラーシステム |
JP7226930B2 (ja) | 2018-06-07 | 2023-02-21 | フォルシアクラリオン・エレクトロニクス株式会社 | キャリブレーション装置及び電子ミラーシステム |
JP2022007283A (ja) * | 2020-06-26 | 2022-01-13 | 酒井重工業株式会社 | 建設車両の障害物検知装置 |
JP7296345B2 (ja) | 2020-06-26 | 2023-06-22 | 酒井重工業株式会社 | 転圧ローラの障害物検知装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2012147187A1 (ja) | 2014-07-28 |
EP2704122B1 (en) | 2015-02-25 |
CN103503045A (zh) | 2014-01-08 |
EP2704122A4 (en) | 2014-06-04 |
US20140044311A1 (en) | 2014-02-13 |
EP2704122A1 (en) | 2014-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012147187A1 (ja) | 周辺車両検出装置 | |
JP4019736B2 (ja) | 車両用障害物検出装置 | |
JP3779280B2 (ja) | 衝突予測装置 | |
JP5070809B2 (ja) | 運転支援装置、運転支援方法、及び、プログラム | |
US8461976B2 (en) | On-vehicle device and recognition support system | |
JP6257989B2 (ja) | 運転支援装置 | |
JP4823781B2 (ja) | 車両の走行安全装置 | |
JP4434224B2 (ja) | 走行支援用車載装置 | |
US8559674B2 (en) | Moving state estimating device | |
US20110128136A1 (en) | On-vehicle device and recognition support system | |
US20150070158A1 (en) | Alert display device and alert display method | |
JP6323063B2 (ja) | 走行車線識別装置、車線変更支援装置、走行車線識別方法 | |
US20190073540A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
US9908469B2 (en) | Driving support device | |
JP2017091093A (ja) | 情報表示装置 | |
JP2008204281A (ja) | 物体検出装置、および車車間通信システム | |
CN107004250B (zh) | 图像生成装置及图像生成方法 | |
JP2019066240A (ja) | レーダ装置及び情報処理方法 | |
JP4948338B2 (ja) | 車間距離計測装置 | |
JP2011012965A (ja) | 車線判定装置及びナビゲーションシステム | |
JP6582392B2 (ja) | 車載周辺物体報知システム、物体報知システム、報知制御装置 | |
JP2010003086A (ja) | ドライブレコーダー | |
JP2012234373A (ja) | 運転支援装置 | |
JP2008275380A (ja) | 車両周辺監視装置 | |
JP2007271298A (ja) | 車載用レーダ装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11864308 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011864308 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2013511841 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14112433 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |