WO2011070650A1 - Object detection apparatus and object detection method - Google Patents

Object detection apparatus and object detection method

Info

Publication number
WO2011070650A1
WO2011070650A1 (PCT/JP2009/070562)
Authority
WO
WIPO (PCT)
Prior art keywords
edge
target
reliability
object detection
deriving
Prior art date
Application number
PCT/JP2009/070562
Other languages
English (en)
Japanese (ja)
Inventor
加藤 雅之 (Masayuki Kato)
Original Assignee
トヨタ自動車株式会社 (Toyota Motor Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by トヨタ自動車株式会社 (Toyota Motor Corporation)
Priority to US13/057,217 (US8610620B2)
Priority to DE112009005424.2T (DE112009005424B4)
Priority to PCT/JP2009/070562 (WO2011070650A1)
Priority to JP2011507732A (JP4883246B2)
Priority to CN200980132832.6A (CN102696060B)
Publication of WO2011070650A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/932 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327 Sensor installation details
    • G01S2013/93271 Sensor installation details in the front of the vehicles

Definitions

  • The present invention relates to an object detection apparatus and an object detection method for detecting an object based on information acquired by a radar and an image sensor.
  • In the following, an object to be detected is referred to as a target.
  • A target can be recognized by the radar as an electromagnetic wave reflection point, and the position of the target can thereby be acquired.
  • However, it is difficult for the radar to accurately acquire the edges of the target.
  • In contrast, the edges of a target can be acquired with high accuracy from an image captured by a stereo image sensor. Therefore, in such an object detection device, the target information acquired by the radar and the target information acquired from the image captured by the stereo image sensor are fused, which improves the object detection capability of the device.
  • Patent Document 1 discloses an obstacle recognition device for a vehicle that uses a millimeter wave radar and a monocular camera.
  • The vehicle obstacle recognition device includes object information calculation means, image processing means, and vehicle information acquisition means.
  • The object information calculation means calculates object information, such as the relative distance and relative lateral position of a detected object, from the output of the millimeter wave radar.
  • The image processing means processes the image captured by the monocular camera based on the calculation result of the object information calculation means.
  • The vehicle obstacle recognition device determines the possibility that the detected object becomes an obstacle based on at least the outputs of the object information calculation means and the vehicle information acquisition means. It also determines, based on the calculation result of the object information calculation means, whether the output of the image processing means is useful for the obstacle determination, and uses that output for the obstacle determination only when it is.
  • Patent Document 2 discloses an object detection device that acquires image information and distance information from a camera and a radar.
  • The object detection device calculates an edge direction vector, an edge direction vector variance, an edge strength, and an edge strength variance from the image information. The type of the object is then determined based on at least one of these quantities and the distance to the detection target object.
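  • For background, the edge strength and edge direction features named above are commonly computed from image gradients. The following is a minimal sketch of that standard computation, not code from the patent documents; the function name and the use of NumPy are assumptions made here for illustration.

        import numpy as np

        def edge_strength_and_direction(gray):
            # Per-pixel edge strength and direction from image gradients.
            # gray: 2D NumPy array holding a grayscale image.
            # Returns (strength, direction); direction is in radians.
            gy, gx = np.gradient(gray.astype(float))  # gradients along rows, then columns
            strength = np.hypot(gx, gy)               # gradient magnitude
            direction = np.arctan2(gy, gx)            # gradient orientation
            return strength, direction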
  • When a monocular image sensor is used as the image sensor in such an object detection device, space saving and cost reduction can be achieved. However, it is difficult to acquire accurate depth information from an image captured by a monocular image sensor. Therefore, when the left and right edges of a target are detected from an image captured by a monocular image sensor, an edge of an object or pattern that exists behind the target, as viewed from the host vehicle, may be erroneously detected as an edge of the target. If the lateral position of the target is derived based on such an erroneously detected edge, the lateral position of the target may be detected incorrectly.
  • The present invention has been made in view of the above problems, and relates to detecting an object based on target information acquired by a radar and target information acquired from an image captured by a monocular image sensor.
  • An object of the present invention is to provide a technique capable of further improving the detection accuracy of the lateral position of a target in such detection.
  • In the present invention, the right edge and the left edge of the target are acquired from the image captured by the monocular image sensor. A locus approximation line, which is a straight line or a predetermined curve approximating the locus of each edge, is then derived for both edges. Whichever of the right edge and the left edge has the larger number of edges existing on its locus approximation line is selected as the true edge of the target, and the lateral position of the target is derived based on the position of the selected edge.
  • More specifically, the object detection device according to the present invention is an object detection device that detects an object based on target information acquired by a radar and target information acquired from an image captured by a monocular image sensor, comprising: edge acquisition means for extracting, from the image captured by the monocular image sensor, a target corresponding to the target recognized by the radar, and acquiring the right edge and the left edge of the extracted target; locus approximation line deriving means for deriving, for both edges, a locus approximation line that is a straight line or a predetermined curve approximating the locus of the right edge and the left edge acquired by the edge acquisition means; selection means for selecting, as the true edge of the target, whichever of the right edge and the left edge acquired by the edge acquisition means has the larger number of edges existing on its locus approximation line; and lateral position deriving means for deriving the lateral position of the target based on the position of the edge selected as the true edge by the selection means.
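  • As an illustration only, and not the patent's reference implementation, the following Python sketch shows one way to realize the locus approximation line and the true-edge selection described above. It assumes each edge is observed once per frame as a point (lateral position x, longitudinal distance y) and that the locus is approximated by a least-squares straight line; the function names and the 0.2 m allowable range are assumptions.

        import numpy as np

        def fit_locus_line(points):
            # Fit x = a*y + b by least squares to one edge's observed points.
            # points: (N, 2) array, columns = (lateral x, longitudinal y).
            # Fitting x as a function of y keeps the fit well-posed for a
            # target whose lateral position stays nearly constant as it nears.
            a, b = np.polyfit(points[:, 1], points[:, 0], deg=1)
            return a, b

        def count_on_line(points, line, tol=0.2):
            # Count edge points within the allowable lateral range of the line.
            a, b = line
            resid = np.abs(points[:, 0] - (a * points[:, 1] + b))
            return int(np.sum(resid <= tol))

        def select_true_edge(left_pts, right_pts, tol=0.2):
            # Select the edge with more points on its locus approximation line.
            left_line = fit_locus_line(left_pts)
            right_line = fit_locus_line(right_pts)
            n_left = count_on_line(left_pts, left_line, tol)
            n_right = count_on_line(right_pts, right_line, tol)
            if n_left >= n_right:
                return "left", left_line
            return "right", right_line

  • When the counts tie, this sketch arbitrarily keeps the left edge; the weighted-reliability variant described below is how the present invention breaks such ties.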
  • According to the present invention, the edge having the higher reliability is selected as the true edge of the target, and the lateral position of the target is derived based on the position of that edge. Therefore, the detection accuracy of the lateral position of the target can be further improved.
  • The object detection apparatus according to the present invention may further include weighting means for weighting the reliability of the right edge and the left edge acquired by the edge acquisition means.
  • In this case, the weighting means weights the right edge and the left edge such that an edge closer to the position of the target recognized by the radar receives a higher reliability.
  • The object detection apparatus may also include reliability total value calculation means for calculating, for each of the right edge and the left edge, a total reliability value by summing the plurality of reliabilities weighted by the weighting means.
  • In some cases, the right edge and the left edge have the same number of edges existing on their locus approximation lines.
  • In such a case, the selection means may select, as the true edge of the target, whichever of the right edge and the left edge has the larger total reliability value calculated by the reliability total value calculation means.
  • The lateral position deriving means may include trajectory prediction means and collision position prediction means.
  • The trajectory prediction means predicts the future trajectory of the edge selected as the true edge by the selection means.
  • The collision position prediction means predicts, based on the future edge trajectory predicted by the trajectory prediction means, the collision position between the target and the vehicle as the position where the distance in the front-rear direction between the edge and the vehicle becomes zero.
  • In this case, the lateral position deriving means may derive the lateral position of the center of the target (hereinafter referred to as the target center) at the collision position predicted by the collision position prediction means, based on the position at that collision position of the edge selected as the true edge by the selection means.
  • According to this configuration, the lateral position of the target center at the collision position between the target and the vehicle can be detected.
  • The lateral position deriving means may also include lateral width estimation means for estimating the lateral width of the target.
  • In this case, the lateral position deriving means may derive, as the lateral position of the target center, the position shifted from the position of the edge selected as the true edge by the selection means toward the other edge by half of the lateral width of the target estimated by the lateral width estimation means.
  • According to this configuration, the lateral position of the target center can be detected with high accuracy.
  • The object detection method according to the present invention is an object detection method for detecting an object based on target information acquired by a radar and target information acquired from an image captured by a monocular image sensor, comprising: an edge acquisition step of extracting, from the image captured by the monocular image sensor, a target corresponding to the target recognized by the radar, and acquiring the right edge and the left edge of the extracted target; a locus approximation line deriving step of deriving, for both edges, a locus approximation line that is a straight line or a predetermined curve approximating the locus of the right edge and the left edge acquired in the edge acquisition step; a selection step of selecting, as the true edge of the target, whichever of the right edge and the left edge acquired in the edge acquisition step has the larger number of edges existing on its locus approximation line; and a lateral position deriving step of deriving the lateral position of the target based on the position of the edge selected as the true edge in the selection step.
  • According to this method as well, the lateral position of the target is derived based on the position of the edge with the higher reliability, so the detection accuracy of the lateral position of the target can be further improved.
  • The object detection method according to the present invention may further include a weighting step of weighting the reliability of the right edge and the left edge acquired in the edge acquisition step.
  • In the weighting step, the right edge and the left edge are weighted such that an edge closer to the position of the target recognized by the radar receives a higher reliability.
  • The object detection method according to the present invention may also include a reliability total value calculation step of calculating, for each of the right edge and the left edge, a total reliability value by summing the plurality of reliabilities weighted in the weighting step.
  • When the right edge and the left edge have the same number of edges on their locus approximation lines, the selection step may select, as the true edge of the target, whichever edge has the larger total reliability value calculated in the reliability total value calculation step.
  • In the present invention, an “edge existing on the locus approximation line” includes not only an edge located at a position that exactly coincides with the locus approximation line but also an edge located within a predetermined allowable range of the locus approximation line.
  • According to the present invention, the accuracy of detecting the lateral position of the target can be further improved.
  • FIG. 1 is a block diagram illustrating a schematic configuration of a collision prediction apparatus according to a first embodiment.
  • FIG. 2 is a block diagram illustrating a schematic configuration of an object detection unit according to the first embodiment. FIG. 3 shows the millimeter wave detection position and the monocular image detection edges in an image captured by the monocular image sensor. FIG. 4 shows the detection result when an obstacle is detected using a millimeter wave radar and a stereo image sensor. FIG. 5 shows the detection result when an obstacle is detected using a millimeter wave radar and a monocular image sensor. FIGS. 6 and 7 are diagrams illustrating the locus approximation lines and the lateral position derivation process according to the first embodiment. FIG. 8 is a flowchart illustrating the collision determination flow according to the first embodiment.
  • FIG. 10 is a block diagram illustrating a schematic configuration of an object detection unit according to a second embodiment. FIG. 11 shows the detection result when an obstacle is detected using the millimeter wave radar and the monocular image sensor. FIGS. 12 and 13 are diagrams illustrating the lateral position derivation process according to the second embodiment.
  • FIG. 14 is a flowchart illustrating a part of the collision determination flow according to the second embodiment.
  • The collision prediction apparatus 200 is mounted on a vehicle 100 and predicts a collision between an obstacle, such as another vehicle or a pedestrian, and the host vehicle 100.
  • The vehicle 100 is equipped with a warning device 8 and a collision avoidance/collision damage reduction system 9 that are activated when a collision with an obstacle is predicted.
  • The collision prediction apparatus 200 includes a millimeter wave radar 1, a monocular image sensor 2, a steering angle sensor 3, a yaw rate sensor 4, a wheel pulse sensor 5, and an ECU 10.
  • The millimeter wave radar 1 is attached to the front center portion of the vehicle 100.
  • The millimeter wave radar 1 scans the area ahead of and obliquely ahead of the vehicle 100 in the horizontal direction with electromagnetic waves in the millimeter wave band and receives the waves reflected from the surfaces of objects outside the vehicle. The millimeter wave radar 1 thereby recognizes a target as an electromagnetic wave reflection point.
  • The target information obtained from the millimeter wave transmission/reception data (such as the relative position of the target with respect to the host vehicle 100) is input to the ECU 10.
  • The monocular image sensor 2 is attached to the front center portion of the vehicle 100.
  • The monocular image sensor 2 captures images of the area ahead of and obliquely ahead of the vehicle 100.
  • The captured image is input to the ECU 10 as an image signal.
  • The steering angle sensor 3 is attached to a steering rod or the like of the vehicle 100 and detects the steering angle of the steering wheel operated by the driver.
  • The yaw rate sensor 4 is provided at the center position of the vehicle body of the vehicle 100 and detects the yaw rate applied to the vehicle body.
  • The wheel pulse sensor 5 is attached to a wheel portion of the vehicle 100 and detects the wheel speed of the vehicle. The output signals of these sensors are input to the ECU 10.
  • The ECU 10 has an object detection unit 6 and a collision determination unit 7.
  • The object detection unit 6 detects an obstacle based on the target information acquired by the millimeter wave radar 1 and the target information acquired from the image captured by the monocular image sensor 2.
  • The collision determination unit 7 determines whether the obstacle detected by the object detection unit 6 and the host vehicle 100 will collide. Details of the object detection method in the object detection unit 6 and the collision determination method in the collision determination unit 7 will be described later.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the object detection unit 6 according to the present embodiment.
  • The object detection unit 6 includes an edge detection unit 61, a locus approximation line deriving unit 62, a selection unit 63, and a lateral position deriving unit 64.
  • The lateral position deriving unit 64 includes a trajectory prediction unit 641, a collision position prediction unit 642, a lateral width estimation unit 643, and a target center lateral position deriving unit 644. Details of the units 61 to 64 and 641 to 644 will be described later.
  • The ECU 10 further has calculation units (not shown), such as an estimated curve radius calculation unit, a host vehicle speed calculation unit, a host vehicle track calculation unit, an obstacle speed calculation unit, and an obstacle movement distance calculation unit, for calculating the various parameters necessary for the collision determination in the collision determination unit 7.
  • The estimated curve radius calculation unit calculates the estimated curve radius of the host vehicle 100 based on the steering angle signal input from the steering angle sensor 3 and the yaw rate signal input from the yaw rate sensor 4.
  • The host vehicle speed calculation unit calculates the vehicle speed of the host vehicle 100 based on the wheel speed signal input from the wheel pulse sensor 5.
  • The host vehicle track calculation unit calculates the track of the host vehicle 100 based on the estimated curve radius signal input from the estimated curve radius calculation unit.
  • The obstacle speed calculation unit calculates the moving speed of the obstacle detected by the object detection unit 6 based on the target information.
  • The obstacle movement distance calculation unit calculates, based on the target information, the distance that the obstacle detected by the object detection unit 6 has moved since detection.
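  • The patent does not give formulas for these calculation units. As one commonly used approximation, shown here as an assumption rather than the patent's method, the curve radius can be estimated from the vehicle speed and yaw rate, with a kinematic bicycle-model fallback from the steering angle at very low yaw rates; the wheelbase and steering-ratio constants are illustrative.

        import math

        WHEELBASE_M = 2.7      # assumed wheelbase of the host vehicle
        STEERING_RATIO = 16.0  # assumed steering-wheel-to-road-wheel ratio

        def estimated_curve_radius(speed_mps, yaw_rate_rps, steering_wheel_rad):
            # R = v / omega when the yaw rate is significant; otherwise use
            # the bicycle-model estimate R = L / tan(delta) from the steering angle.
            if abs(yaw_rate_rps) > 1e-3:
                return speed_mps / yaw_rate_rps
            delta = steering_wheel_rad / STEERING_RATIO  # road-wheel angle
            if abs(math.tan(delta)) < 1e-6:
                return float("inf")  # effectively driving straight
            return WHEELBASE_M / math.tan(delta)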
  • When a collision between an obstacle and the host vehicle 100 is predicted, an ON signal is transmitted from the ECU 10 to the warning device 8 and the collision avoidance/collision damage reduction system 9.
  • When the warning device 8 receives the ON signal, it issues a warning to the driver by a display on a monitor, by voice, or the like.
  • When the collision avoidance/collision damage reduction system 9 receives the ON signal, it executes collision avoidance control and collision damage reduction control. Examples of the collision avoidance/collision damage reduction system include an automatic steering system, a seat belt control system, a seat position control system, a brake control system, and an airbag control system.
  • As described above, the object detection unit 6 of the ECU 10 in the collision prediction apparatus 200 derives the lateral position of the obstacle to be used for the collision determination in the collision determination unit 7.
  • The method for deriving the lateral position of the obstacle according to the present embodiment is described below with reference to the drawings.
  • The object detection unit 6 detects an obstacle based on the target information acquired by the millimeter wave radar 1 and the target information acquired from the image captured by the monocular image sensor 2.
  • With the millimeter wave radar 1, the relative position of the target with respect to the host vehicle 100 can be detected.
  • The edge detection unit 61 of the object detection unit 6 extracts, from the image captured by the monocular image sensor 2, the target corresponding to the target recognized by the millimeter wave radar 1, that is, the target existing at the position detected by the millimeter wave radar 1 (hereinafter referred to as the millimeter wave detection position). The right edge and the left edge of the target are then detected from the extracted target image (hereinafter these edges may be referred to as monocular image detection edges).
  • As described above, however, an edge of an object or pattern that exists behind the target, as viewed from the host vehicle 100, may be erroneously detected as an edge of the target.
  • FIG. 3 is a diagram illustrating the millimeter wave detection position and the monocular image detection edges in an image captured by the monocular image sensor.
  • FIG. 3A shows a case where the edges of the target are detected normally, and
  • FIG. 3B shows a case where an edge of the target is detected erroneously.
  • In FIG. 3, the target is the “front utility pole”.
  • As shown in FIG. 3B, an edge of the “back utility pole” may be erroneously detected as an edge of the “front utility pole”.
  • FIG. 4 is a diagram illustrating the temporal changes in the millimeter wave detection position and the stereo image detection edges with respect to the host vehicle when the “front utility pole” is the target.
  • In FIG. 4, the white arrow represents the trajectory of the target derived based on the millimeter wave detection position and the stereo image detection edges.
  • Accurate information in the depth direction can be acquired from an image captured by a stereo image sensor. Therefore, when the edges are detected from an image captured by a stereo image sensor, even if the “back utility pole” exists behind the “front utility pole”, the edge of the “back utility pole” is not erroneously detected as an edge of the “front utility pole”, as shown in FIG. 4.
  • FIG. 5 is a diagram illustrating the temporal changes in the millimeter wave detection position and the monocular image detection edges with respect to the host vehicle when the “front utility pole” is the target.
  • In FIG. 5, the white arrow represents the trajectory of the target derived based on the millimeter wave detection position and the monocular image detection edges.
  • If the collision determination between the target and the host vehicle 100 is performed in the collision determination unit 7 based on a target trajectory predicted from such erroneously detected edges, an erroneous determination may result. Therefore, in the present embodiment, the object detection unit 6 performs the following lateral position derivation process on the image captured by the monocular image sensor 2 in order to derive the lateral position of the target to be used for the collision determination in the collision determination unit 7. FIGS. 6 and 7 are diagrams illustrating the lateral position derivation process according to the present embodiment.
  • As shown in FIG. 6, the locus approximation line deriving unit 62 of the object detection unit 6 derives, for each of the right edge and the left edge detected by the edge detection unit 61 at predetermined intervals (every 50 ms in this embodiment), a locus approximation line, which is a straight line or a predetermined curve approximating the locus of that edge.
  • FIGS. 6A and 6B show the monocular image detection edges detected in FIG. 5.
  • The one-dot chain line in FIG. 6A indicates the locus approximation line derived for the left edge, and
  • the one-dot chain line in FIG. 6B indicates the locus approximation line derived for the right edge.
  • The method for deriving the locus approximation line is determined in advance.
  • Any known method, such as the least squares method or a spline interpolation method, may be used.
  • Here, the “predetermined curve” means a curve derived by such a predetermined approximation method.
  • In FIG. 6, the right edge includes erroneously detected edges; therefore, of its five points, only three exist on the locus approximation line. Note that an edge that is not at a position exactly coinciding with the locus approximation line but exists within a predetermined allowable range of it is also counted as existing on the locus approximation line.
  • In FIG. 6, the circled edges indicate the “edges existing on the locus approximation line”.
  • Next, the selection unit 63 calculates the reliability of the right edge and the left edge based on the number of edges existing on each locus approximation line. Then, whichever of the right edge and the left edge has the higher reliability (that is, the larger number of edges on its locus approximation line; the left edge in FIG. 6) is selected as the true edge of the target.
  • The lateral position deriving unit 64 derives the lateral position of the target based on the position of the edge selected as the true edge of the target by the selection unit 63 (hereinafter also referred to as the selected edge). More specifically, as shown in FIG. 7, the trajectory prediction unit 641 first predicts the future trajectory of the selected edge (the left edge in FIG. 7) based on its previous locus approximation line. In FIG. 7, the arrow indicated by the chain line represents the previous locus approximation line of the selected edge and its predicted future trajectory.
  • Next, the collision position prediction unit 642 predicts the collision position between the target and the host vehicle 100, based on the future trajectory of the selected edge predicted by the trajectory prediction unit 641 and the track of the host vehicle 100 calculated by the host vehicle track calculation unit of the ECU 10,
  • as the position where the distance in the front-rear direction between the selected edge and the host vehicle 100 becomes zero.
  • In FIG. 7, the broken line represents the collision position.
  • Further, the lateral width estimation unit 643 estimates the lateral width Wt of the target.
  • Any known method may be used for the lateral width estimation. Examples include calculating the average width of the target derived from the monocular image detection edges as the lateral width Wt, and deriving the lateral width Wt based on the type of target estimated from the intensity of the wave received by the millimeter wave radar 1.
  • Then, the target center lateral position deriving unit 644 derives the lateral position of the target center at the collision position predicted by the collision position prediction unit 642. Specifically, the position shifted from the position of the selected edge at the collision position toward the other edge (the right edge in FIG. 7) by half of the lateral width Wt estimated by the lateral width estimation unit 643 is derived as the lateral position of the target center at the collision position. In FIG. 7, the position indicated by the white triangle represents the lateral position of the target center at the collision position.
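  • Continuing the earlier sketch, the lateral position of the target center at the collision position can be computed from the selected edge's locus approximation line under a simplifying assumption of this sketch, not of the patent: the host vehicle is taken to travel straight along the longitudinal axis, so the collision position is where the edge's longitudinal distance y reaches zero. In the patent itself, the collision position is found against the host vehicle track calculated by the ECU 10.

        import numpy as np

        def estimate_width(left_pts, right_pts):
            # One of the example methods in the text: the average width of the
            # target over the observed frames (assumes frame-paired edge points).
            return float(np.mean(right_pts[:, 0] - left_pts[:, 0]))

        def center_lateral_position_at_collision(line, width_wt, selected_side):
            # line: (a, b) of the selected edge's locus line x = a*y + b.
            # At the collision position y = 0, the edge sits at x = b; the
            # target center lies half the estimated width Wt toward the
            # other edge (x is assumed to increase to the right).
            b = line[1]
            if selected_side == "left":
                return b + width_wt / 2.0  # shift toward the right edge
            return b - width_wt / 2.0      # shift toward the left edge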
  • As described above, in the present embodiment, whichever of the right and left monocular image detection edges has the higher reliability is selected as the true edge of the target, and the lateral position of the target center at the collision position is derived based on the position of that edge. Therefore, even when a monocular image sensor is used as the image sensor, the lateral position of the target center at the collision position can be derived with high accuracy.
  • The collision determination unit 7 executes the collision determination based on the lateral position of the target center at the collision position derived by the object detection unit 6. This makes it possible to determine with high accuracy whether the host vehicle 100 and the obstacle will collide.
  • The flow of this process follows the flowchart shown in FIG. 8. In step S101, a target existing at the millimeter wave detection position is extracted from the image captured by the monocular image sensor 2.
  • In step S102, the right edge and the left edge are detected from the target image extracted in step S101. The processes of steps S101 and S102 are executed by the edge detection unit 61.
  • In step S103, locus approximation lines for the right edge and the left edge are derived from the plurality of monocular image detection edges detected in step S102.
  • The process of step S103 is executed by the locus approximation line deriving unit 62.
  • In step S104, the reliability of the right edge and the left edge is calculated based on the number of edges existing on the locus approximation lines derived in step S103.
  • In step S105, whichever of the right edge and the left edge has the higher reliability calculated in step S104 is selected as the true edge of the target. The processes of steps S104 and S105 are executed by the selection unit 63.
  • In step S106, the future trajectory of the selected edge selected in step S105 is predicted. The process of step S106 is executed by the trajectory prediction unit 641.
  • In step S107, the collision position between the target and the host vehicle 100 is predicted based on the future trajectory of the selected edge predicted in step S106 and the track of the host vehicle 100 calculated by the host vehicle track calculation unit.
  • The process of step S107 is executed by the collision position prediction unit 642.
  • In step S108, the lateral width Wt of the target is estimated. The process of step S108 is executed by the lateral width estimation unit 643.
  • In step S109, the position shifted from the position of the selected edge at the collision position predicted in step S107 toward the other edge by half of the lateral width Wt estimated in step S108 is derived as the lateral position of the target center at the collision position. The process of step S109 is executed by the target center lateral position deriving unit 644.
  • In step S110, the collision probability Pc between the target and the host vehicle 100 is calculated based on the lateral position of the target center at the collision position derived in step S109.
  • In step S111, it is determined whether the collision probability Pc calculated in step S110 is greater than or equal to a reference probability Pcbase.
  • The reference probability Pcbase is a preset threshold above which it is determined that the target and the host vehicle 100 will collide.
  • If an affirmative determination is made in step S111, it is determined in step S112 that the target and the host vehicle 100 will collide. If a negative determination is made in step S111, it is determined in step S113 that the target and the host vehicle 100 will not collide.
  • The processes of steps S110 to S113 are executed by the collision determination unit 7. When it is determined in step S112 that the target and the host vehicle 100 will collide, the collision determination unit 7 transmits the ON signal to the warning device 8 and the collision avoidance/collision damage reduction system 9.
  • In the present embodiment, the object detection unit 6 corresponds to the object detection apparatus according to the present invention. The correspondence between the components of the object detection unit 6 according to the present embodiment and the constituent elements of the present invention is as follows.
  • The edge detection unit 61 corresponds to the edge acquisition means according to the present invention.
  • The locus approximation line deriving unit 62 corresponds to the locus approximation line deriving means according to the present invention.
  • The selection unit 63 corresponds to the selection means according to the present invention.
  • The lateral position deriving unit 64 corresponds to the lateral position deriving means according to the present invention.
  • The trajectory prediction unit 641 corresponds to the trajectory prediction means according to the present invention.
  • The collision position prediction unit 642 corresponds to the collision position prediction means according to the present invention.
  • The lateral width estimation unit 643 corresponds to the lateral width estimation means according to the present invention.
  • Steps S101 and S102 in the flowchart shown in FIG. 8 correspond to the edge acquisition step according to the present invention.
  • Step S103 in the flowchart shown in FIG. 8 corresponds to the locus approximation line deriving step according to the present invention.
  • Steps S104 and S105 in the flowchart shown in FIG. 8 correspond to the selection step according to the present invention.
  • Steps S106 to S109 in the flowchart shown in FIG. 8 correspond to the lateral position deriving step according to the present invention.
  • FIG. 10 is a block diagram illustrating a schematic configuration of the object detection unit 6 according to the present embodiment.
  • The object detection unit 6 according to the present embodiment includes, in addition to the edge detection unit 61, the locus approximation line deriving unit 62, the selection unit 63, and the lateral position deriving unit 64, a weighting unit 65 and a reliability total value calculation unit 66. Details of the weighting unit 65 and the reliability total value calculation unit 66 will be described later.
  • FIG. 11 is a diagram illustrating a detection result when an obstacle is detected using the millimeter wave radar and the monocular image sensor according to the present embodiment.
  • Like FIG. 5, FIG. 11 shows the temporal changes in the millimeter wave detection position and the monocular image detection edges with respect to the host vehicle when the “front utility pole” is the target.
  • In FIG. 11, however, a “crossing vehicle” exists behind the “front utility pole”.
  • In this case, an edge of the “crossing vehicle” may be erroneously detected as an edge of the “front utility pole”.
  • In the example of FIG. 11, the right edge is erroneously detected every time.
  • Such false detection can occur even when a stationary object such as the “back utility pole” exists behind the target “front utility pole”, as in the case of FIG. 5, but it is more likely to occur when the object behind is a moving object that moves laterally, such as a “crossing vehicle”.
  • Also in the present embodiment, the following lateral position derivation process is performed on the image captured by the monocular image sensor 2 in order to derive the lateral position of the target to be used for the collision determination in the collision determination unit 7.
  • As shown in FIG. 12, the locus approximation line deriving unit 62 of the object detection unit 6 derives, for each of the right edge and the left edge detected by the edge detection unit 61 at predetermined intervals (every 50 ms in this embodiment), a locus approximation line, which is a straight line or a predetermined curve approximating the locus of that edge.
  • FIGS. 12A and 12B show the millimeter wave detection positions and the monocular image detection edges detected in FIG. 11.
  • The one-dot chain line in FIG. 12A indicates the locus approximation line derived for the left edge, and
  • the one-dot chain line in FIG. 12B indicates the locus approximation line derived for the right edge.
  • The method for deriving the locus approximation line is the same as in the first embodiment.
  • Here, if the reliability of an edge is calculated simply from the number of edges existing on its locus approximation line, the reliability of the edge detected normally every time becomes equal to the reliability of the edge detected erroneously every time. In that case, it is difficult to select the true edge of the target by the lateral position derivation process of the first embodiment.
  • Therefore, in the present embodiment, the weighting unit 65 weights the reliability of each of the right edge and the left edge detected by the edge detection unit 61 based on its distance from the millimeter wave detection position. An edge far from the millimeter wave detection position is more likely to have been erroneously detected than an edge near it. Therefore, the weighting unit 65 weights the right edge and the left edge such that an edge closer to the millimeter wave detection position receives a higher reliability.
  • The reliability total value calculation unit 66 then calculates, for each of the right edge and the left edge, a total reliability value by summing the plurality of weighted reliabilities.
  • In FIG. 12, the left edge is closer to the millimeter wave detection position than the right edge in all five frames.
  • For example, if the weight of an edge closer to the millimeter wave detection position is 1.1 points and the weight of an edge farther from it is 0.9 points, the total reliability value of the left edge is 5.5 points and the total reliability value of the right edge is 4.5 points.
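  • A minimal sketch of this distance-based weighting, using the 1.1-point and 0.9-point weights from the example above; the per-frame data layout is an assumption.

        NEAR_WEIGHT = 1.1  # points for the edge nearer the millimeter wave detection position
        FAR_WEIGHT = 0.9   # points for the farther edge

        def weighted_reliability_totals(left_xs, right_xs, mmw_xs):
            # left_xs, right_xs, mmw_xs: per-frame lateral positions of the two
            # detected edges and of the millimeter wave detection position.
            left_total = right_total = 0.0
            for lx, rx, mx in zip(left_xs, right_xs, mmw_xs):
                if abs(lx - mx) <= abs(rx - mx):  # left edge is nearer this frame
                    left_total += NEAR_WEIGHT
                    right_total += FAR_WEIGHT
                else:
                    left_total += FAR_WEIGHT
                    right_total += NEAR_WEIGHT
            return left_total, right_total

  • With the left edge nearer in all five frames, as in FIG. 12, this returns totals of 5.5 and 4.5 points, matching the example above.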
  • The selection unit 63 then selects, as the true edge of the target, whichever of the right edge and the left edge has the larger total reliability value calculated by the reliability total value calculation unit 66 (the left edge in FIG. 12). In this way, even if one edge is erroneously detected every time as in FIG. 11, the more reliable of the right and left monocular image detection edges can be selected as the true edge of the target.
  • The lateral position deriving unit 64 derives the lateral position of the target by the same method as in the first embodiment, based on the position of the selected edge chosen as described above. That is, as shown in FIG. 13, the trajectory prediction unit 641 predicts the future trajectory of the selected edge, the collision position prediction unit 642 predicts the collision position between the target and the host vehicle 100, the lateral width estimation unit 643 estimates the lateral width Wt of the target, and the target center lateral position deriving unit 644 derives the lateral position of the target center at the collision position.
  • In FIG. 13, as in FIG. 7, the arrow indicated by the one-dot chain line represents the previous locus approximation line of the selected edge and its predicted future trajectory, and the broken line represents the collision position.
  • The position indicated by the white triangle represents the lateral position of the target center at the collision position.
  • As described above, in the present embodiment as well, the lateral position of the target center at the collision position is derived based on the position of whichever of the right and left monocular image detection edges has the higher reliability. Therefore, even when a monocular image sensor is used as the image sensor, the lateral position of the target center at the collision position can be derived with high accuracy.
  • In step S201, it is determined whether the reliability of the right edge calculated in step S104 is equal to the reliability of the left edge.
  • If a negative determination is made in step S201, the process of step S105 is executed next. In this case, the subsequent processing is the same as in the first embodiment.
  • If an affirmative determination is made in step S201, the process of step S202 is executed next.
  • In step S202, reliability weighting is performed on the right edge and the left edge for each of the plurality of monocular image detection edges detected in step S102.
  • At this time, the weighting is performed such that an edge closer to the millimeter wave detection position receives a higher reliability.
  • The process of step S202 is executed by the weighting unit 65.
  • In step S203, the weighted reliabilities of the right edges and of the left edges weighted in step S202 are summed, and the total reliability value for each edge is calculated. The process of step S203 is executed by the reliability total value calculation unit 66.
  • Then, in step S105, the true edge of the target is selected.
  • Here, whichever edge has the larger total reliability value calculated in step S203 is selected as the true edge of the target.
  • The subsequent processing is the same as in the first embodiment.
  • In the present embodiment, the weighting unit 65 corresponds to the weighting means according to the present invention, and
  • the reliability total value calculation unit 66 corresponds to the reliability total value calculation means according to the present invention.
  • Step S202 in the flowchart shown in FIG. 14 corresponds to the weighting step according to the present invention, and
  • step S203 in the flowchart corresponds to the reliability total value calculation step according to the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

An object detection apparatus that detects objects by means of a radar and a monocular image sensor, and in which the accuracy of detecting the lateral position of an object to be detected is further improved. A target corresponding to the target recognized by the radar is extracted from the images captured by the monocular image sensor, and the right and left edges of the extracted target are acquired. In addition, for both edges, the system derives locus approximation lines, which are straight lines or curves approximating the loci of the right and left edges. Then, whichever of the right and left edges has the larger number of edges lying on its locus approximation line is selected as the true edge of the target, and the lateral position of the target is derived based on the position of the selected edge.
PCT/JP2009/070562 2009-12-08 2009-12-08 Object detection apparatus and object detection method WO2011070650A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/057,217 US8610620B2 (en) 2009-12-08 2009-12-08 Object detecting apparatus and object detecting method
DE112009005424.2T DE112009005424B4 (de) 2009-12-08 2009-12-08 Object detection device and object detection method
PCT/JP2009/070562 WO2011070650A1 (fr) 2009-12-08 2009-12-08 Object detection apparatus and object detection method
JP2011507732A JP4883246B2 (ja) 2009-12-08 2009-12-08 Object detection apparatus and object detection method
CN200980132832.6A CN102696060B (zh) 2009-12-08 2009-12-08 Object detection device and object detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/070562 WO2011070650A1 (fr) 2009-12-08 2009-12-08 Object detection apparatus and object detection method

Publications (1)

Publication Number Publication Date
WO2011070650A1 (fr)

Family

ID=44145222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/070562 WO2011070650A1 (fr) 2009-12-08 2009-12-08 Appareil de détection d'objets et procédé de détection d'objets

Country Status (5)

Country Link
US (1) US8610620B2 (fr)
JP (1) JP4883246B2 (fr)
CN (1) CN102696060B (fr)
DE (1) DE112009005424B4 (fr)
WO (1) WO2011070650A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014033955A1 (fr) * 2012-09-03 2014-03-06 トヨタ自動車株式会社 Speed calculation device, speed calculation method, and collision determination device
WO2014033954A1 (fr) * 2012-09-03 2014-03-06 トヨタ自動車株式会社 Collision determination device and collision determination method
JP2015082324A (ja) * 2013-10-22 2015-04-27 Honda Research Institute Europe GmbH Composite confidence estimation for predictive driver assistance systems
JP2018112461A (ja) * 2017-01-11 2018-07-19 いすゞ自動車株式会社 Probability calculation device and probability calculation method
CN109254289A (zh) * 2018-11-01 2019-01-22 百度在线网络技术(北京)有限公司 Road guardrail detection method and detection device
JP2020030190A (ja) * 2018-08-24 2020-02-27 独立行政法人日本スポーツ振興センター Position tracking system and position tracking method

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2639781A1 (fr) * 2012-03-14 2013-09-18 Honda Motor Co., Ltd. Vehicle with improved traffic-object position detection
US10609335B2 (en) * 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
WO2014024284A1 (fr) * 2012-08-08 2014-02-13 トヨタ自動車株式会社 Collision prediction device
CN102945602B (zh) * 2012-10-19 2015-04-15 上海交通大学无锡研究院 Vehicle trajectory classification method for traffic incident detection
JP5812064B2 (ja) * 2012-11-22 2015-11-11 株式会社デンソー Target detection device
CN105324275B (zh) * 2013-05-31 2017-06-20 丰田自动车株式会社 Movement trajectory prediction device and movement trajectory prediction method
JP5812061B2 (ja) * 2013-08-22 2015-11-11 株式会社デンソー Target detection device and program
JP5929870B2 (ja) * 2013-10-17 2016-06-08 株式会社デンソー Target detection device
JP5991332B2 (ja) * 2014-02-05 2016-09-14 トヨタ自動車株式会社 Collision avoidance control device
CN103927437B (zh) * 2014-04-04 2016-10-26 东南大学 Method for measuring headway distance on non-straight road sections
JP6209797B2 (ja) * 2014-05-20 2017-10-11 本田技研工業株式会社 Travel control device
CN107004360B (zh) * 2014-07-03 2020-08-07 通用汽车环球科技运作有限责任公司 Vehicle radar methods and systems
CN104269070B (zh) * 2014-08-20 2017-05-17 东风汽车公司 Vehicle active safety early-warning method and safety early-warning system using the method
JP6380232B2 (ja) * 2015-05-19 2018-08-29 株式会社デンソー Object detection device and object detection method
JP6508337B2 (ja) * 2015-07-27 2019-05-15 日産自動車株式会社 Object detection method and object detection device
JP6027659B1 (ja) * 2015-08-27 2016-11-16 富士重工業株式会社 Vehicle travel control device
CN208093734U (zh) 2015-11-05 2018-11-13 日本电产株式会社 Slot array antenna and radar system
JP2018511951A (ja) 2015-11-05 2018-04-26 日本電産株式会社 Slot antenna
DE102016125419B4 (de) 2015-12-24 2022-10-20 Nidec Elesys Corporation Waveguide device, slot antenna and radar, radar system, and wireless communication system with the slot antenna
US10381741B2 (en) 2015-12-24 2019-08-13 Nidec Corporation Slot array antenna, and radar, radar system, and wireless communication system including the slot array antenna
CN106981710B (zh) 2016-01-15 2019-11-08 日本电产株式会社 Waveguide device, antenna device, and radar
DE112017000573B4 (de) 2016-01-29 2024-01-18 Nidec Corporation Waveguide device and antenna device with the waveguide device
DE102017102284A1 (de) 2016-02-08 2017-08-10 Nidec Elesys Corporation Waveguide device and antenna device with the waveguide device
DE102017102559A1 (de) 2016-02-12 2017-08-17 Nidec Elesys Corporation Waveguide device and antenna device with the waveguide device
JP2019047141A (ja) 2016-03-29 2019-03-22 日本電産エレシス株式会社 Microwave IC waveguide device module, radar device, and radar system
CN208093770U (zh) 2016-04-05 2018-11-13 日本电产株式会社 Wireless communication system
JP2019054315A (ja) 2016-04-28 2019-04-04 日本電産エレシス株式会社 Mounting board, waveguide module, integrated circuit mounting board, microwave module, radar device, and radar system
DE102016211730A1 (de) * 2016-06-29 2018-01-04 Continental Teves Ag & Co. Ohg Method for predicting the course of a roadway
JP6787102B2 (ja) * 2016-12-14 2020-11-18 株式会社デンソー Object detection device and object detection method
CN106740465B (zh) * 2016-12-30 2020-07-24 南通市台盈新材料科技有限公司 Automobile collision avoidance method based on width detection
EP3364211B1 (fr) * 2017-02-21 2022-08-17 Continental Autonomous Mobility Germany GmbH Method and device for detecting a possible collision, and vehicle
JP2018164252A (ja) 2017-03-24 2018-10-18 日本電産株式会社 Slot array antenna and radar including the slot array antenna
CN108695585B (zh) 2017-04-12 2021-03-16 日本电产株式会社 Method for manufacturing a high-frequency component
JP7020677B2 (ja) 2017-04-13 2022-02-16 日本電産エレシス株式会社 Slot antenna device
JP2018182740A (ja) 2017-04-13 2018-11-15 日本電産株式会社 Slot array antenna
CN208093762U (zh) 2017-04-14 2018-11-13 日本电产株式会社 Slot antenna device and radar device
JP2020520180A (ja) 2017-05-11 2020-07-02 日本電産株式会社 Waveguide device and antenna device including the waveguide device
JP7129999B2 (ja) 2017-05-11 2022-09-02 日本電産株式会社 Waveguide device and antenna device including the waveguide device
US10547122B2 (en) 2017-06-26 2020-01-28 Nidec Corporation Method of producing a horn antenna array and antenna array
JP2019009779A (ja) 2017-06-26 2019-01-17 株式会社Wgr Transmission line device
JP7103860B2 (ja) 2017-06-26 2022-07-20 日本電産エレシス株式会社 Horn antenna array
JP2019012999A (ja) 2017-06-30 2019-01-24 日本電産株式会社 Waveguide device module, microwave module, radar device, and radar system
JP7294608B2 (ja) 2017-08-18 2023-06-20 ニデックエレシス株式会社 Antenna array
JP2019050568A (ja) 2017-09-07 2019-03-28 日本電産株式会社 Directional coupler
JP2019071607A (ja) 2017-10-10 2019-05-09 日本電産株式会社 Waveguide device
EP3518001B1 (fr) * 2018-01-25 2020-09-16 Aptiv Technologies Limited Method for increasing the reliability of determining the position of a vehicle on the basis of a plurality of detection points
CN110443819B (zh) * 2018-05-03 2022-04-15 比亚迪股份有限公司 Track detection method and device for a monorail train
JP7298808B2 (ja) 2018-06-14 2023-06-27 ニデックエレシス株式会社 Slot array antenna
CN109343041B (zh) * 2018-09-11 2023-02-14 昆山星际舟智能科技有限公司 Monocular ranging method for advanced intelligent driver assistance
CN111446530A (zh) 2019-01-16 2020-07-24 日本电产株式会社 Waveguide device, electromagnetic wave locking device, antenna device, and radar device
CN110636266B (zh) * 2019-10-10 2020-10-16 珠海格力电器股份有限公司 Appliance-based security monitoring method and device, and storage medium
CN113093176B (zh) * 2019-12-23 2022-05-17 北京三快在线科技有限公司 Linear obstacle detection method and device, electronic device, and storage medium
CN112927509B (zh) * 2021-02-05 2022-12-23 长安大学 Road traffic safety risk assessment system based on traffic conflict technique

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08122432A (ja) * 1994-10-20 1996-05-17 Honda Motor Co Ltd Moving object detection device
JP2007288460A (ja) * 2006-04-17 2007-11-01 Nissan Motor Co Ltd Object detection method and object detection device
JP2008276689A (ja) * 2007-05-07 2008-11-13 Mitsubishi Electric Corp Obstacle recognition device for vehicle
JP2009186260A (ja) * 2008-02-05 2009-08-20 Nissan Motor Co Ltd Object detection device and distance measuring method

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5168529A (en) * 1988-08-29 1992-12-01 Raytheon Company Confirmed boundary pattern matching
US5052045A (en) * 1988-08-29 1991-09-24 Raytheon Company Confirmed boundary pattern matching
US5018218A (en) * 1988-08-29 1991-05-21 Raytheon Company Confirmed boundary pattern matching
US5173949A (en) * 1988-08-29 1992-12-22 Raytheon Company Confirmed boundary pattern matching
US5168530A (en) * 1988-08-29 1992-12-01 Raytheon Company Confirmed boundary pattern matching
US5027422A (en) * 1988-08-29 1991-06-25 Raytheon Company Confirmed boundary pattern matching
JP2001134772A (ja) * 1999-11-04 2001-05-18 Honda Motor Co Ltd Object recognition device
JP4205825B2 (ja) * 1999-11-04 2009-01-07 本田技研工業株式会社 Object recognition device
CN1914060B (zh) * 2004-01-28 2013-05-29 丰田自动车株式会社 Vehicle driving support system
JP5088669B2 (ja) * 2007-03-23 2012-12-05 株式会社デンソー Vehicle periphery monitoring device
JP5098563B2 (ja) * 2007-10-17 2012-12-12 トヨタ自動車株式会社 Object detection device
US8320615B2 (en) * 2008-02-27 2012-11-27 Honeywell International Inc. Systems and methods for recognizing a target from a moving platform
DE112009004346B4 (de) * 2009-01-29 2014-05-28 Toyota Jidosha Kabushiki Kaisha Object recognition device and object recognition method


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014033955A1 (fr) * 2012-09-03 2014-03-06 トヨタ自動車株式会社 Speed calculation device, speed calculation method, and collision determination device
WO2014033954A1 (fr) * 2012-09-03 2014-03-06 トヨタ自動車株式会社 Collision determination device and collision determination method
CN104584098A (zh) * 2012-09-03 2015-04-29 丰田自动车株式会社 Collision determination device and collision determination method
CN104620297A (zh) * 2012-09-03 2015-05-13 丰田自动车株式会社 Speed calculation device, speed calculation method, and collision determination device
JPWO2014033955A1 (ja) * 2012-09-03 2016-08-08 トヨタ自動車株式会社 Speed calculation device, speed calculation method, and collision determination device
CN104620297B (zh) 2012-09-03 2017-03-22 丰田自动车株式会社 Speed calculation device, speed calculation method, and collision determination device
CN104584098B (zh) 2012-09-03 2017-09-15 丰田自动车株式会社 Collision determination device and collision determination method
JP2015082324A (ja) * 2013-10-22 2015-04-27 Honda Research Institute Europe GmbH Composite confidence estimation for predictive driver assistance systems
JP2018112461A (ja) * 2017-01-11 2018-07-19 いすゞ自動車株式会社 Probability calculation device and probability calculation method
JP2020030190A (ja) * 2018-08-24 2020-02-27 独立行政法人日本スポーツ振興センター Position tracking system and position tracking method
CN109254289A (zh) * 2018-11-01 2019-01-22 百度在线网络技术(北京)有限公司 Road guardrail detection method and detection device

Also Published As

Publication number Publication date
DE112009005424T8 (de) 2013-02-07
DE112009005424T5 (de) 2012-12-06
DE112009005424B4 (de) 2015-12-24
JPWO2011070650A1 (ja) 2013-04-22
JP4883246B2 (ja) 2012-02-22
US20120313806A1 (en) 2012-12-13
CN102696060A (zh) 2012-09-26
CN102696060B (zh) 2015-01-07
US8610620B2 (en) 2013-12-17

Similar Documents

Publication Publication Date Title
JP4883246B2 (ja) Object detection apparatus and object detection method
JP5316549B2 (ja) Object recognition device and object recognition method
EP2302412B1 (fr) System and method for evaluating a forward collision threat of a motor vehicle
JP6536521B2 (ja) Object detection device and object detection method
US7889116B2 Object detecting apparatus
US7498972B2 Obstacle detection system for vehicle
US11312371B2 Apparatus and method for controlling vehicle
US20180211536A1 Driving support system
US9102329B2 Tracking control apparatus
US10967857B2 Driving support device and driving support method
JP6380232B2 (ja) Object detection device and object detection method
US10436899B2 Object detection apparatus
US10787170B2 Vehicle control method and apparatus
JP5078944B2 (ja) Vehicle travel control device
JP5471195B2 (ja) Object detection device
EP2894618B1 (fr) Speed calculation device and speed calculation method, and collision determination device
JP6432538B2 (ja) Collision prediction device
US8730089B2 Vehicle radar system
JP6432423B2 (ja) Object detection device and object detection method
JP6429360B2 (ja) Object detection device
WO2011036807A1 (fr) Object detection device and object detection method
KR20180069019A (ko) Method for capturing the surrounding area of a motor vehicle with object classification, control device, driver assistance system, and motor vehicle
WO2023228668A1 (fr) Environment monitoring device and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 13057217

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2011507732

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09852047

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 1120090054242

Country of ref document: DE

Ref document number: 112009005424

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09852047

Country of ref document: EP

Kind code of ref document: A1