WO2014033954A1 - Collision determination device and collision determination method - Google Patents
Collision determination device and collision determination method
- Publication number
- WO2014033954A1 (PCT/JP2012/072361)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- collision determination
- detection unit
- radar
- image
- target
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/04—Systems determining presence of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93271—Sensor installation details in the front of the vehicles
Definitions
- the present invention relates to a collision determination device and a collision determination method for determining a collision between a vehicle and an object.
- As such a collision determination device and collision determination method, an apparatus and a method are known, as described for example in JP-A-2005-84034, that generate a composite target of an object using a detection result from a radar sensor and a detection result from an image sensor, and determine a collision between the vehicle and the object based on the generated composite target.
- In such a process, when the object is detected by the radar sensor after first being detected only by the image sensor, it is conceivable to perform the collision determination based on the composite target after performing it based on the detection result of the image sensor alone. In that case, the collision determination gives priority to the detection result of the radar sensor over that of the image sensor. As a result, the continuity of the collision determination is lost when the state of the target subject to collision determination changes, which can give the driver a sense of incongruity during vehicle travel control, such as collision avoidance assistance, performed based on the collision determination.
- the present invention is intended to provide a collision determination device and a collision determination method that can maintain the continuity of collision determination even when the state of a target to be subjected to collision determination changes.
- A collision determination apparatus according to the present invention includes a radar detection unit that detects an object around a vehicle using a radar wave, an image detection unit that captures an image around the vehicle and detects the object from the image, and a collision determination unit that determines a collision between the vehicle and the object based on a composite target generated using the detection results of the radar detection unit and the image detection unit. When the object is detected by the radar detection unit and the image detection unit after having been detected only by the image detection unit, the collision determination unit performs the collision determination by giving priority to the detection result of the image detection unit over that of the radar detection unit.
- Because the detection result of the image detection unit is prioritized over that of the radar detection unit, the continuity of the collision determination can be maintained even if the state of the target subject to collision determination changes.
- When the object is detected by the radar detection unit and the image detection unit after having been detected only by the image detection unit, the collision determination unit may generate the composite target by taking over the earlier detection result. Because the detection result of the image detection unit is carried over, the continuity of the collision determination can be maintained.
- Alternatively, the collision determination unit may continue the collision determination based on the detection result of the image detection unit instead of the collision determination based on the composite target. Because the determination based on the image detection result is continued, the continuity of the collision determination can be maintained.
- The collision determination unit may generate a low-threshold composite target for an object detected only by the image detection unit, and a high-threshold composite target, whose threshold is set higher than the low threshold, for an object detected by both the radar detection unit and the image detection unit.
- The collision determination unit may perform the collision determination on the low-threshold composite target using a determination threshold lower than that used for the high-threshold composite target. An appropriate collision determination can thereby be performed for each type of composite target.
- the collision determination unit may perform the collision determination by giving priority to the detection result of the image detection unit over the radar detection unit when the possibility of collision between the vehicle and the object exceeds a predetermined threshold.
- the detection range of the radar detection unit and the detection range of the image detection unit may partially overlap, and there may be a region detected by the image detection unit without being detected by the radar detection unit.
- the radar detection unit may detect an object in front of the vehicle using a radar wave
- the image detection unit may detect the object from an image obtained by imaging the front of the vehicle.
- the radar detection unit may detect an object in the vehicle traveling direction using a radar wave
- the image detection unit may detect the object from an image obtained by imaging the vehicle traveling direction.
- the radar detection unit may detect an object in front of the vehicle using millimeter waves.
- A collision determination method according to the present invention detects an object around a vehicle using a radar wave, detects the object based on a captured image around the vehicle, and determines a collision between the vehicle and the object based on a composite target generated using the radar detection result and the image detection result. When the object is detected by both radar detection and image detection after having been detected only by image detection, the collision between the vehicle and the object is determined by giving priority to the detection result of the image detection over that of the radar detection.
- According to the present invention, it is possible to provide a collision determination device and a collision determination method that can maintain the continuity of collision determination even when the state of a target subject to collision determination changes.
- FIG. 1 is a block diagram showing the configuration of a collision determination device according to an embodiment of the present invention. FIG. 2 is a diagram showing the detection ranges of a radar and a stereo camera. FIG. 3 is a diagram showing the situation of an assumed collision determination process. FIG. 4 is a flowchart showing the operation of the collision determination device. FIG. 5 is a diagram showing the situation of the collision determination process performed by the collision determination device.
- the collision determination apparatus and the collision determination method according to the embodiment of the present invention perform a collision determination with an object around the vehicle, particularly with an object in front of the vehicle.
- The present invention is not limited to the front of the vehicle; embodiments that determine a collision with an object behind the vehicle or an object in the vehicle traveling direction can be described in the same way.
- the collision determination device is a device that is mounted on a vehicle and determines a collision between the vehicle and an object using a radar sensor and an image sensor.
- FIG. 1 is a block diagram showing a configuration of a collision determination device according to an embodiment of the present invention.
- the collision determination apparatus includes a speed sensor 11, a radar 12, a stereo camera 13, and an ECU 20 (Electronic Control Unit).
- The speed sensor 11 detects the speed of the vehicle.
- a wheel speed sensor is used as the speed sensor 11.
- the speed sensor 11 supplies the detected vehicle speed to the ECU 20.
- The radar 12 functions as a radar detection unit (radar sensor) that detects an object around the vehicle, particularly in front of the vehicle, using radar waves: it transmits a radar wave (electromagnetic wave) ahead of the vehicle and receives the radar wave reflected from the object.
- the radar 12 supplies radar detection information indicating the detection result of the object to the ECU 20.
- the stereo camera 13 functions as an image detection unit (image sensor) that captures an image of the periphery of the vehicle, particularly the front of the vehicle, and detects an object based on the captured image.
- A CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) image sensor is used in the stereo camera 13.
- The stereo camera 13 consists of a plurality of cameras installed on the front of the vehicle or in the cabin.
- the stereo camera 13 supplies image detection information indicating the detection result of the object to the ECU 20.
- a single camera may be used instead of the stereo camera 13.
- the ECU 20 includes a radar target generation unit 21, an image target generation unit 22, a composite target generation unit 23, and a collision determination unit 24.
- The ECU 20 mainly consists of a CPU, ROM, RAM, and the like, and realizes the functions of the radar target generation unit 21, the image target generation unit 22, the composite target generation unit 23, and the collision determination unit 24 by executing a program on the CPU.
- the ECU 20 may be configured as a single unit or a plurality of units.
- the radar target generator 21 generates a radar target based on radar detection information from the radar 12.
- the radar target has target information related to the distance to the object and the lateral position of the object, which is obtained from coordinates based on the vehicle.
- the target information of the radar target is calculated based on the radar detection information from the radar 12.
- The distance to the object represents the distance from the vehicle (radar 12) to the object in the traveling direction of the vehicle, and is calculated from the time between transmission of the radar wave from the radar 12 and reception of its reflection from the object.
- The lateral position of the object represents the distance from the vehicle (radar 12) to the object in the direction orthogonal to the traveling direction of the vehicle, and is calculated from the direction (angle) from which the reflected radar wave is received.
- the lateral position in the radar target is information on the position of the object detected by the radar 12, and does not include information on the lateral width of the object.
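For illustration, the conversion from a radar echo to the radar target's distance and lateral position can be sketched as follows. This is a minimal Python sketch, not part of the patent; the function name and coordinate conventions are illustrative assumptions.

```python
import math

C = 299_792_458.0  # propagation speed of the radar wave (speed of light) [m/s]

def radar_target(round_trip_time_s: float, bearing_rad: float) -> tuple[float, float]:
    """Convert one radar echo into (distance, lateral position) in
    vehicle-based coordinates: distance along the traveling direction,
    lateral offset orthogonal to it."""
    slant_range = C * round_trip_time_s / 2.0        # one-way range to the object
    distance = slant_range * math.cos(bearing_rad)   # along the traveling direction
    lateral = slant_range * math.sin(bearing_rad)    # orthogonal to it
    return distance, lateral
```

For an echo received 2 × 100 m / c seconds after transmission at zero bearing, this yields a distance of 100 m directly ahead.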
- the image target generator 22 generates an image target based on the image detection information from the stereo camera 13.
- The image target has target information on the distance to the object and the lateral position of the object, obtained in coordinates based on the vehicle. The image target generation unit 22 also determines whether the object is stationary by tracking it based on the image detection information, and supplies the tracking result and the stationary-state determination to the collision determination unit 24.
- The target information of the image target is calculated by the principle of triangulation from the parallax between the image detection information of the left and right cameras constituting the stereo camera 13, or from the detected size and position of, for example, the license plate of a preceding vehicle.
- the distance to the object represents the distance from the vehicle (stereo camera 13) to the object in the traveling direction of the vehicle.
- the lateral position of the object represents the distance from the vehicle (stereo camera 13) to the object in a direction orthogonal to the traveling direction of the vehicle.
- the lateral position in the image target also includes information on the lateral range of the object detected from the image, that is, the lateral width of the object.
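The triangulation mentioned above can be sketched with the classic stereo relations Z = f·B/d and X = Z·x/f. This is a minimal Python sketch under assumed camera conventions, not the patent's implementation.

```python
def stereo_position(baseline_m: float, focal_px: float,
                    disparity_px: float, x_px: float) -> tuple[float, float]:
    """Triangulate (distance, lateral position) from a stereo pair.
    disparity_px: horizontal pixel shift of the object between the left
    and right images; x_px: horizontal pixel offset of the object from
    the image center in the reference camera."""
    if disparity_px <= 0:
        raise ValueError("object must be seen by both cameras")
    distance = focal_px * baseline_m / disparity_px  # Z = f * B / d
    lateral = distance * x_px / focal_px             # X = Z * x / f
    return distance, lateral
```

With a 0.3 m baseline, a 1000 px focal length, and a 15 px disparity, the object lies about 20 m ahead.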
- the composite target generation unit 23 generates a composite target of the object using the target information of the radar target and the image target, that is, the detection result by the radar 12 and the stereo camera 13.
- The composite target is generated by collating the two targets based on their target information. The radar target and the image target are collated based on the similarity of their target information, that is, the similarity of the distance to the object and of the lateral position of the object.
- the composite target has target information regarding the distance to the object and the horizontal position (including the horizontal width) of the object.
- the target information of the composite target is based on the target information of the radar target and the image target, and has higher accuracy than the target information of the radar target or the image target alone.
- FIG. 2 is a diagram showing detection ranges A1 and A2 of the radar 12 and the stereo camera 13.
- The detection range A1 of the radar 12 is narrower than the detection range A2 of the stereo camera 13. Diagonally forward of the vehicle C, therefore, there is an area outside the detection range A1 of the radar 12 that can be detected only by the stereo camera 13.
- A composite target is generated while the object exists within the detection ranges A1 and A2 of both sensors 12 and 13; when the object leaves the detection range A1 of the radar 12, no composite target is generated.
- the collision determination unit 24 calculates a collision determination parameter for each of the radar target, the image target, and the composite target. As parameters, for example, target distance, collision probability, existence probability, and collision lateral position are calculated.
- The target distance means the distance to the target in the traveling direction of the vehicle.
- The collision probability means the probability that the vehicle will collide with the object corresponding to the target.
- The existence probability means the probability that the object corresponding to the target actually exists.
- The collision lateral position means the lateral position (position in the width direction of the vehicle) at which a collision with the object corresponding to the target is expected.
- the target distance, the collision probability, the existence probability, and the collision lateral position are obtained based on the movement status of each target.
- the parameters of each target are stored in a memory such as a RAM for a predetermined period together with the target information of each target, and are read out as necessary.
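One way the per-target parameters could be retained for a predetermined period is a fixed-length history per target. This is an illustrative Python sketch; the class names, fields, and retention length are assumptions.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class CollisionParams:
    target_distance: float        # distance to the target in the traveling direction [m]
    collision_probability: float  # probability that the vehicle collides with the object
    existence_probability: float  # probability that the object actually exists
    collision_lateral_pos: float  # expected collision point in the vehicle width direction [m]

class ParamHistory:
    """Retains a target's parameters for a fixed number of processing
    cycles; the oldest entries are dropped first."""
    def __init__(self, cycles: int = 50):
        self.buf = deque(maxlen=cycles)

    def push(self, p: CollisionParams) -> None:
        self.buf.append(p)

    def latest(self):
        return self.buf[-1] if self.buf else None
```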
- the collision determination unit 24 performs a collision determination based on the composite target.
- the collision determination unit 24 determines the possibility of collision with an object based on whether or not the collision time is less than the threshold when the parameters of the composite target satisfy a predetermined threshold.
- the collision time is calculated by dividing the distance to the object by the relative speed of the object (the amount of change per unit time of the distance to the object) using the target information of the composite target.
- the determination result of the possibility of collision is used for collision avoidance support by, for example, notification to the driver, control intervention for braking or steering of the vehicle, and the like.
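The collision-time test described above can be sketched as follows; a minimal Python sketch, with function names of our choosing, not the patent's code.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Collision time = distance to the object divided by the relative
    (closing) speed; infinite when the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def collision_likely(distance_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float) -> bool:
    """A collision is judged likely when the collision time falls below
    the determination threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s
```

An object 40 m ahead closing at 10 m/s gives a collision time of 4 s, which trips a 5 s threshold but not a 3 s one.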
- the collision determination unit 24 performs the collision determination based on the image target in a situation where only the image target is generated without generating the radar target.
- the collision determination unit 24 determines the possibility of collision with an object based on whether or not the parameter of the image target satisfies a predetermined threshold and the collision time is less than the threshold.
- the collision time is calculated by dividing the distance to the object by the relative speed of the object using the target information of the image target.
- When the object is detected by the radar 12 and the stereo camera 13 after having been detected only by the stereo camera 13, the collision determination unit 24 performs the collision determination by giving priority to the detection result of the stereo camera 13 over that of the radar 12. Specifically, it takes over the detection result obtained by the stereo camera 13 while the object was detected only by the stereo camera 13, and sets the composite target from it.
- The collision determination unit 24 sets a low-threshold composite target when the object is detected only by the stereo camera 13, and sets a high-threshold composite target, whose threshold is set higher than the low threshold, when the object is detected by both the radar 12 and the stereo camera 13.
- FIG. 3 is a diagram illustrating a situation of an assumed collision determination process.
- FIG. 3 shows time-series changes in the positions of the targets generated by the sensors 12 and 13 together with the detection ranges A1 and A2 of the radar 12 and the stereo camera 13.
- As the object, for example, a pedestrian P crossing in front of the traveling vehicle C is assumed.
- The object moves from outside the detection range A1 of the radar 12 but inside the detection range A2 of the stereo camera 13 into the detection ranges A1 and A2 of both sensors 12 and 13. While the object is outside the detection range A1 of the radar 12 and moving within the detection range A2 of the stereo camera 13, an image target is generated; once the object moves within the detection ranges A1 and A2 of both sensors 12 and 13, a composite target is generated using the radar target and the image target.
- In the assumed collision determination process, when the object is detected by both sensors 12 and 13 after having been detected only by the stereo camera 13, it is conceivable to determine the collision based on the image target and then based on the composite target. In that case, the collision determination gives priority to the radar target information over the image target information (collision determination parameters, etc.). That is, the image target information from when the object was detected only by the stereo camera 13 is not carried over, and a new composite target is set based on the radar target information.
- FIG. 4 is a flowchart showing the operation of the collision determination device.
- FIG. 5 is a diagram showing the situation of the collision determination process shown in FIG. 4.
- the collision determination device repeatedly executes the process shown in FIG. 4 for each processing cycle.
- The collision determination apparatus performs the following processing before the processing shown in FIG. 4. The radar target generation unit 21 generates a radar target when an object is present within the detection range of the radar 12.
- the image target generation unit 22 generates an image target when an object exists within the detection range of the stereo camera 13.
- The composite target generation unit 23 generates a composite target when the radar target and the image target are successfully collated.
- the collision determination unit 24 calculates a collision determination parameter for each of the radar target, the image target, and the composite target according to the target generation status.
- the collision determination unit 24 determines whether or not the image target has a predetermined accuracy (S11). In this determination, for example, the reliability of the collision lateral position is determined as a collision determination parameter for the image target.
- The collision lateral position means the lateral position (position in the width direction of the vehicle) at which a collision with the image target is expected, and its reliability (certainty) is calculated based on, for example, the transition of the collision lateral position over previous processing cycles.
- the collision determination unit 24 determines whether the object is detected only by the stereo camera 13, that is, whether only the image target is generated. Determine (S12). When it is determined that only the image target is generated, the collision determination unit 24 sets a low-threshold composite target as a collision determination target (S13).
- The low-threshold composite target is a composite target generated using only the image target; a determination threshold lower than that of a normal composite target (the high-threshold composite target described later) is used for its collision determination.
- the collision determination unit 24 sets an image target parameter as a collision determination parameter (S14).
- The collision determination unit 24 determines whether the object is detected by both the radar 12 and the stereo camera 13, that is, whether a composite target is generated (S15).
- the collision determination unit 24 determines whether only the image target has been generated in the immediately preceding processing cycle (S16).
- When only the image target was generated in the immediately preceding processing cycle, it means that the object is now detected by both sensors 12 and 13 after having been detected only by the stereo camera 13.
- In this case, the collision determination unit 24 performs the collision determination by giving priority to the image target information (collision determination parameters, etc.) over the radar target. To this end, the collision determination unit 24 takes over the information of the low-threshold composite target from the immediately preceding processing cycle, when only the image target was generated, into the composite target (the high-threshold composite target described later) (S17).
- The collision determination unit 24 then sets a high-threshold composite target as the collision determination target, regardless of whether only an image target was generated in the immediately preceding processing cycle (S18).
- A high-threshold composite target is a normal composite target generated using a radar target and an image target, and has higher accuracy than a low-threshold composite target.
- the collision determination unit 24 sets a composite target parameter as a collision determination parameter (S19).
- When only the image target was generated in the immediately preceding processing cycle, the high-threshold composite target is newly set based on the image target information calculated in that cycle. That is, the new composite target is set by giving priority to the image target information (collision determination parameters, etc.) over the radar target. Thereafter, the parameters of the high-threshold composite target are calculated by updating the inherited information.
- When it is determined that no composite target is generated, the collision determination unit 24 sets 0 as the collision determination parameter (S20).
- When the collision determination parameter has been set in S14, S19, or S20, the collision determination unit 24 performs the collision determination based on the set parameter. More specifically, the collision determination unit 24 determines whether the collision determination parameter satisfies a predetermined threshold and, when it does, determines whether the collision time is less than the determination threshold.
- For a high-threshold composite target, the collision determination unit 24 calculates the collision time using the target information of the composite target and compares it with the normal determination threshold.
- For a low-threshold composite target, the collision determination unit 24 calculates the collision time using the target information of the image target and compares it with a determination threshold lower than normal.
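The branching described for S12 through S20, together with the final threshold comparison, can be sketched in one pass as below. This is an illustrative Python sketch; the threshold values, names, and return convention are assumptions, not values from the patent.

```python
HIGH_TTC_THRESHOLD_S = 2.5  # normal determination threshold (illustrative value)
LOW_TTC_THRESHOLD_S = 1.8   # stricter threshold for image-only targets (illustrative)

def collision_decision(radar_seen: bool, image_seen: bool,
                       prev_image_only: bool, ttc_s: float):
    """One simplified pass of the FIG. 4 branching.
    Returns (kind of composite target set, collision judgement)."""
    if image_seen and not radar_seen:
        # S12 -> S13/S14: image target only -> low-threshold composite target
        return "low_threshold", ttc_s < LOW_TTC_THRESHOLD_S
    if radar_seen and image_seen:
        # S15 -> S16..S19: both sensors -> high-threshold composite target;
        # S17: inherit the low-threshold target's information when only an
        # image target existed in the immediately preceding cycle
        kind = "high_threshold_inherited" if prev_image_only else "high_threshold"
        return kind, ttc_s < HIGH_TTC_THRESHOLD_S
    # S20: no target -> parameters cleared, no collision judged
    return "none", False
```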
- In the above processing, the collision determination based on the image target is performed according to whether the image target has a predetermined accuracy (S11). Instead of, or in addition to, this, it may be determined whether the possibility of collision between the vehicle and the object exceeds a predetermined threshold, and the collision determination based on the image target may be performed when the possibility of collision exceeds the threshold. In this case, the possibility of collision is determined, for example, based on whether the collision determination parameter of the image target satisfies a predetermined threshold.
- FIG. 5 shows time-series changes in the position of the target in the collision determination process shown in FIG. 4 in comparison with FIG.
- In the collision determination process shown in FIG. 4, when the object is detected by both sensors 12 and 13 after having been detected only by the stereo camera 13, the collision is determined based on the low-threshold composite target and then based on the high-threshold composite target.
- The collision determination based on the high-threshold composite target takes over the information of the image target (collision determination parameters, etc.) from when the object was detected only by the stereo camera 13, and the high-threshold composite target is set by giving priority to the image target information over the radar target information.
- Since the collision determination is continued based on the image target information, the continuity of the collision determination can be maintained even if the state of the target subject to collision determination changes. The driver is therefore not given a sense of incongruity during vehicle travel control, such as collision avoidance support, performed based on the collision determination.
- As described above, according to the collision determination device of this embodiment, when an object is detected by the radar 12 and the stereo camera 13 after having been detected only by the stereo camera 13, the collision determination is performed by giving priority to the detection result of the stereo camera 13 over that of the radar 12. The continuity of the collision determination can therefore be maintained even if the state of the target subject to collision determination changes.
- The detection result obtained while the object was detected only by the stereo camera 13 may be taken over to generate the composite target, so that the detection result of the stereo camera 13 is carried forward.
- Alternatively, the collision determination may be performed based on the detection result of the stereo camera 13 instead of the collision determination based on the composite target.
- A low-threshold composite target may be generated for an object detected only by the stereo camera 13, and a high-threshold composite target, whose threshold is set higher than the low threshold, may be generated for an object detected by the radar 12 and the stereo camera 13. By performing the collision determination on the low-threshold composite target with a determination threshold lower than that for the high-threshold composite target, an appropriate collision determination can be performed for each type of target.
- the collision determination may be performed by giving priority to the detection result of the stereo camera 13 over that of the radar 12.
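The low-threshold/high-threshold scheme and the camera-priority rule described in the bullets above can be sketched as follows. This is a minimal illustration only: the `Target` class, the numeric threshold values, and the 0–1 probability scale are assumptions made for the sketch and are not specified in the disclosure.

```python
from dataclasses import dataclass

# Illustrative threshold values; the disclosure gives no numbers.
LOW_THRESHOLD = 0.3   # for objects detected only by the stereo camera
HIGH_THRESHOLD = 0.6  # for objects detected by both radar and camera

@dataclass
class Target:
    collision_probability: float  # assumed scale 0.0 .. 1.0
    seen_by_radar: bool
    seen_by_camera: bool

def determination_threshold(target: Target) -> float:
    """Return the determination threshold for the composite target.

    A low-threshold composite target is set for an object detected only
    by the camera; a high-threshold composite target is set for an
    object detected by both sensors.
    """
    if target.seen_by_camera and not target.seen_by_radar:
        return LOW_THRESHOLD
    return HIGH_THRESHOLD

def judge_collision(target: Target) -> bool:
    """Determine a collision when the probability exceeds the threshold."""
    return target.collision_probability > determination_threshold(target)
```

With these assumed values, a camera-only object with probability 0.4 triggers a determination (0.4 > 0.3), while the same probability does not exceed the high threshold used once both sensors detect the object.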
- the detection range of the radar 12 and the detection range of the stereo camera 13 may partially overlap, and there may be a region that is detected by the stereo camera 13 but not detected by the radar 12.
- the radar 12 may detect an object in front of the vehicle using a radar wave
- the stereo camera 13 may detect the object from an image obtained by imaging the front of the vehicle.
- the radar 12 may detect an object in the vehicle traveling direction using a radar wave
- the stereo camera 13 may detect the object from an image obtained by capturing the vehicle traveling direction.
- the radar 12 may detect an object in front of the vehicle using millimeter waves.
- the above-described embodiment describes the best mode of the collision determination device and the collision determination method according to the present invention; however, the collision determination device and the collision determination method according to the present invention are not limited to those described in the present embodiment.
- the collision determination device and the collision determination method according to the present invention may be modifications of those according to the present embodiment, or applications to other uses, without departing from the gist of the invention described in each claim.
- the function of the radar target generation unit 21 may be realized by a single ECU, for example, a radar sensor ECU
- the function of the image target generation unit 22 may be realized by a single ECU, for example, an image sensor ECU.
- the detection ranges A1 and A2 of the radar 12 and the stereo camera 13 are bilaterally symmetric with respect to the traveling direction of the vehicle and overlapped symmetrically.
- however, the detection ranges A1 and A2 of the two sensors 12 and 13 need only partially overlap such that there is a region that is not detected by the radar 12 but is detected by the stereo camera 13; they need not be symmetric with respect to the traveling direction of the vehicle and need not overlap symmetrically.
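A simple 2-D sketch can illustrate how partially overlapping, non-symmetric detection ranges leave a region covered by the camera but not by the radar. The sector shapes, angles, and ranges below are illustrative assumptions, not the actual geometries of the detection ranges A1 and A2.

```python
import math

def in_sector(x, y, half_angle_deg, max_range):
    """True if point (x, y) lies inside a forward-facing sector with the
    given half-angle and maximum range (x is the forward axis).
    Simplified 2-D stand-in for a sensor detection range."""
    r = math.hypot(x, y)
    if r > max_range:
        return False
    if r == 0:
        return True
    return abs(math.degrees(math.atan2(y, x))) <= half_angle_deg

# Assumed example geometries: a narrow long-range radar sector and a
# wide short-range camera sector, so a camera-only region exists.
def detected_by_radar(x, y):
    return in_sector(x, y, half_angle_deg=10, max_range=150)

def detected_by_camera(x, y):
    return in_sector(x, y, half_angle_deg=25, max_range=80)
```

A point 30 m ahead and 10 m to the side falls inside the assumed camera sector but outside the narrower radar sector, which is exactly the kind of region where an object is detected only by the stereo camera.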
- the collision determination is made based on the image target and then based on the composite target.
- instead of the collision determination based on the composite target, the collision determination may continue to be performed based on the image target (that is, based on the detection result of the stereo camera 13). Even in this case, since the collision determination based on the image target is continued, the continuity of the collision determination can be maintained.
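The continuity rule in the bullets above—keep using the camera's result when an object first seen by the camera alone later becomes visible to both sensors—can be sketched with a hypothetical helper. The function name, the per-cycle detection sets, and the string labels are assumptions for illustration; the disclosure describes the rule, not this API.

```python
def select_determination_basis(history):
    """Choose the detection result used for the collision determination.

    `history` is a chronological list of per-cycle detections, each a
    set drawn from {"radar", "camera"}. If the object was first detected
    by the camera alone and is now detected by both sensors, the camera
    result keeps priority so that the determination stays continuous.
    """
    if not history:
        return None
    camera_first = history[0] == {"camera"}
    now = history[-1]
    if "camera" in now and ("radar" not in now or camera_first):
        return "camera"
    return "radar"
```

An object seen first by the camera alone and then by both sensors keeps the camera as its determination basis, whereas an object seen by the radar from the start is judged on the radar result.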
- the collision determination device and the collision determination method according to the embodiment of the present invention have been described above with respect to an embodiment that performs a collision determination with an object around the vehicle, in particular with an object in front of the vehicle.
- not only the front of the vehicle: an embodiment that performs a collision determination with an object behind the vehicle, or with an object in the vehicle traveling direction, can be described in the same way.
- for example, a collision determination with an object behind the vehicle may be performed.
- with the radar sensor and the image sensor, it is possible to perform a collision determination with an object in the vehicle traveling direction according to the forward or backward movement of the vehicle.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
Description
Claims (11)
- A collision determination device comprising:
a radar detection unit that detects an object around a vehicle using a radar wave;
an image detection unit that captures an image of the surroundings of the vehicle and detects the object from the captured image; and
a collision determination unit that determines a collision between the vehicle and the object based on a composite target generated using a detection result of the radar detection unit and a detection result of the image detection unit,
wherein, when the object is detected by only the image detection unit of the radar detection unit and the image detection unit and is subsequently detected by the radar detection unit and the image detection unit, the collision determination unit performs the collision determination by giving priority to the detection result of the image detection unit over the detection result of the radar detection unit. - The collision determination device according to claim 1, wherein, when the object is detected by only the image detection unit and is subsequently detected by the radar detection unit and the image detection unit, the collision determination unit sets the composite target by taking over the detection result obtained when the object was detected by only the image detection unit.
- The collision determination device according to claim 1, wherein, when the object is detected by only the image detection unit and is subsequently detected by the radar detection unit and the image detection unit, the collision determination unit performs the collision determination based on the detection result of the image detection unit instead of the collision determination based on the composite target.
- The collision determination device according to claim 1 or 2, wherein the collision determination unit sets a low-threshold composite target for the object detected by only the image detection unit, and sets a high-threshold composite target, set higher than the low threshold, for the object detected by the radar detection unit and the image detection unit.
- The collision determination device according to claim 4, wherein, for the low-threshold composite target, the collision determination unit performs the collision determination based on a determination threshold lower than the determination threshold for the high-threshold composite target.
- The collision determination device according to any one of claims 1 to 5, wherein, when the possibility of a collision between the vehicle and the object exceeds a predetermined threshold, the collision determination unit performs the collision determination by giving priority to the detection result of the image detection unit over the detection result of the radar detection unit.
- The collision determination device according to any one of claims 1 to 6, wherein the detection range of the radar detection unit and the detection range of the image detection unit partially differ, and there is a region that is not detected by the radar detection unit but is detected by the image detection unit.
- The collision determination device according to any one of claims 1 to 7, wherein the radar detection unit detects the object in front of the vehicle using a radar wave, and the image detection unit captures an image of the area in front of the vehicle and detects the object from the captured image.
- The collision determination device according to any one of claims 1 to 7, wherein the radar detection unit detects the object in the vehicle traveling direction using a radar wave, and the image detection unit captures an image in the vehicle traveling direction and detects the object from the captured image.
- The collision determination device according to any one of claims 1 to 9, wherein the radar detection unit detects the object around the vehicle using millimeter waves.
- A collision determination method for detecting an object around a vehicle using a radar wave and detecting the object from a captured image of the surroundings of the vehicle, and determining a collision between the vehicle and the object based on a composite target generated using a detection result of the radar detection and a detection result of the image detection, the method comprising:
performing the collision determination between the vehicle and the object by giving priority to the detection result of the image detection over the detection result of the radar detection when the object is detected by only the image detection of the radar detection and the image detection and is subsequently detected by the radar detection and the image detection.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014532714A JP5979232B2 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
US14/425,209 US9666077B2 (en) | 2012-09-03 | 2012-09-03 | Collision determination device and collision determination method |
PCT/JP2012/072361 WO2014033954A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
CN201280075572.5A CN104584098B (zh) | 2012-09-03 | 2012-09-03 | 碰撞判定装置和碰撞判定方法 |
EP12883954.5A EP2894619A4 (en) | 2012-09-03 | 2012-09-03 | DEVICE AND METHOD FOR COLLISION DETERMINATION |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/072361 WO2014033954A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014033954A1 true WO2014033954A1 (ja) | 2014-03-06 |
Family
ID=50182798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/072361 WO2014033954A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9666077B2 (ja) |
EP (1) | EP2894619A4 (ja) |
JP (1) | JP5979232B2 (ja) |
CN (1) | CN104584098B (ja) |
WO (1) | WO2014033954A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016190099A1 (ja) * | 2015-05-27 | 2016-12-01 | 株式会社デンソー | 車両制御装置及び車両制御方法 |
JP2022134678A (ja) * | 2021-03-03 | 2022-09-15 | 本田技研工業株式会社 | 制御装置、移動体、制御方法及びプログラム |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5842863B2 (ja) * | 2013-05-14 | 2016-01-13 | 株式会社デンソー | 衝突緩和装置 |
JP5991332B2 (ja) * | 2014-02-05 | 2016-09-14 | トヨタ自動車株式会社 | 衝突回避制御装置 |
CN107250621B (zh) * | 2015-02-12 | 2019-04-26 | 本田技研工业株式会社 | 自动变速器的变速控制装置 |
WO2018105136A1 (ja) * | 2016-12-06 | 2018-06-14 | 本田技研工業株式会社 | 車両周辺情報取得装置および車両 |
CN111656396B (zh) * | 2018-02-02 | 2024-01-09 | 三菱电机株式会社 | 落下物检测装置、车载***、车辆及计算机可读取的记录介质 |
JP6939723B2 (ja) * | 2018-07-02 | 2021-09-22 | 株式会社デンソー | 衝突判定装置 |
US10930155B2 (en) * | 2018-12-03 | 2021-02-23 | Continental Automotive Systems, Inc. | Infrastructure sensor detection and optimization method |
JP7468409B2 (ja) * | 2021-03-01 | 2024-04-16 | トヨタ自動車株式会社 | 車両衝突回避支援装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005084034A (ja) | 2003-09-11 | 2005-03-31 | Toyota Motor Corp | 物体検出装置 |
JP2010250501A (ja) * | 2009-04-14 | 2010-11-04 | Hitachi Automotive Systems Ltd | 車両用外界認識装置及びそれを用いた車両システム |
JP2011048420A (ja) * | 2009-08-25 | 2011-03-10 | Fujitsu Ltd | 車両検出装置、車両検出プログラム、および車両検出方法 |
JP2011113286A (ja) * | 2009-11-26 | 2011-06-09 | Toyota Motor Corp | 衝突予測装置 |
WO2011070650A1 (ja) * | 2009-12-08 | 2011-06-16 | トヨタ自動車株式会社 | 物体検出装置及び物体検出方法 |
JP2012014520A (ja) * | 2010-07-01 | 2012-01-19 | Toyota Motor Corp | 障害物検出装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004117071A (ja) * | 2002-09-24 | 2004-04-15 | Fuji Heavy Ind Ltd | 車外監視装置、及び、この車外監視装置を備えた走行制御装置 |
JP4193703B2 (ja) * | 2004-01-19 | 2008-12-10 | トヨタ自動車株式会社 | 物体検出装置 |
JP2006281900A (ja) * | 2005-03-31 | 2006-10-19 | Xanavi Informatics Corp | 車載情報装置、およびアプリケーション実行方法 |
JP4684954B2 (ja) * | 2005-08-31 | 2011-05-18 | 本田技研工業株式会社 | 車両の走行安全装置 |
CN107176923A (zh) | 2005-10-19 | 2017-09-19 | 泰华制药工业有限公司 | 拉奎尼莫钠晶体及其制备方法 |
JP4304517B2 (ja) * | 2005-11-09 | 2009-07-29 | トヨタ自動車株式会社 | 物体検出装置 |
JP4595833B2 (ja) * | 2006-02-24 | 2010-12-08 | トヨタ自動車株式会社 | 物体検出装置 |
JP4434296B1 (ja) * | 2008-09-05 | 2010-03-17 | トヨタ自動車株式会社 | 物体検出装置 |
US8229663B2 (en) * | 2009-02-03 | 2012-07-24 | GM Global Technology Operations LLC | Combined vehicle-to-vehicle communication and object detection sensing |
JP4614005B2 (ja) * | 2009-02-27 | 2011-01-19 | トヨタ自動車株式会社 | 移動軌跡生成装置 |
JP5379543B2 (ja) * | 2009-04-09 | 2013-12-25 | 日立オートモティブシステムズ株式会社 | 自動車の外界認識装置 |
JP5287746B2 (ja) * | 2009-05-21 | 2013-09-11 | 日産自動車株式会社 | 運転支援装置、及び運転支援方法 |
JP5401344B2 (ja) | 2010-01-28 | 2014-01-29 | 日立オートモティブシステムズ株式会社 | 車両用外界認識装置 |
-
2012
- 2012-09-03 CN CN201280075572.5A patent/CN104584098B/zh active Active
- 2012-09-03 EP EP12883954.5A patent/EP2894619A4/en not_active Withdrawn
- 2012-09-03 WO PCT/JP2012/072361 patent/WO2014033954A1/ja active Application Filing
- 2012-09-03 JP JP2014532714A patent/JP5979232B2/ja active Active
- 2012-09-03 US US14/425,209 patent/US9666077B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005084034A (ja) | 2003-09-11 | 2005-03-31 | Toyota Motor Corp | 物体検出装置 |
JP2010250501A (ja) * | 2009-04-14 | 2010-11-04 | Hitachi Automotive Systems Ltd | 車両用外界認識装置及びそれを用いた車両システム |
JP2011048420A (ja) * | 2009-08-25 | 2011-03-10 | Fujitsu Ltd | 車両検出装置、車両検出プログラム、および車両検出方法 |
JP2011113286A (ja) * | 2009-11-26 | 2011-06-09 | Toyota Motor Corp | 衝突予測装置 |
WO2011070650A1 (ja) * | 2009-12-08 | 2011-06-16 | トヨタ自動車株式会社 | 物体検出装置及び物体検出方法 |
JP2012014520A (ja) * | 2010-07-01 | 2012-01-19 | Toyota Motor Corp | 障害物検出装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2894619A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016190099A1 (ja) * | 2015-05-27 | 2016-12-01 | 株式会社デンソー | 車両制御装置及び車両制御方法 |
JP2016223812A (ja) * | 2015-05-27 | 2016-12-28 | 株式会社デンソー | 車両制御装置、及び車両制御方法 |
US10672275B2 (en) | 2015-05-27 | 2020-06-02 | Denso Corporation | Vehicle control device and vehicle control method |
JP2022134678A (ja) * | 2021-03-03 | 2022-09-15 | 本田技研工業株式会社 | 制御装置、移動体、制御方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2894619A4 (en) | 2016-03-16 |
US20150206435A1 (en) | 2015-07-23 |
US9666077B2 (en) | 2017-05-30 |
JPWO2014033954A1 (ja) | 2016-08-08 |
JP5979232B2 (ja) | 2016-08-24 |
EP2894619A1 (en) | 2015-07-15 |
CN104584098A (zh) | 2015-04-29 |
CN104584098B (zh) | 2017-09-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5979232B2 (ja) | 衝突判定装置及び衝突判定方法 | |
JP5862785B2 (ja) | 衝突判定装置及び衝突判定方法 | |
JP5884912B2 (ja) | 衝突判定装置及び衝突判定方法 | |
US9487195B2 (en) | Collision avoidance assistance device and collision avoidance assistance method | |
JP6011625B2 (ja) | 速度算出装置及び速度算出方法並びに衝突判定装置 | |
JP6536521B2 (ja) | 物体検知装置及び物体検知方法 | |
US10252716B2 (en) | Driving assist apparatus and driving assist method | |
US10967857B2 (en) | Driving support device and driving support method | |
US10592755B2 (en) | Apparatus and method for controlling vehicle | |
US20140333467A1 (en) | Object detection device | |
JP5397231B2 (ja) | リスク回避支援装置 | |
JP2014213776A (ja) | 衝突判定装置、および衝突緩和装置 | |
JP2017111684A (ja) | 制御装置、制御方法 | |
JP2018097765A (ja) | 物体検出装置、及び物体検出方法 | |
JP6600271B2 (ja) | 物体認識装置及び物体認識方法 | |
JP6504078B2 (ja) | 衝突予測装置 | |
JP2019052920A (ja) | 物体検出装置、物体検出方法及び車両制御システム | |
JP6429360B2 (ja) | 物体検出装置 | |
WO2014033958A1 (ja) | 衝突判定装置及び衝突判定方法 | |
WO2017154471A1 (ja) | 横断判定装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12883954 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014532714 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14425209 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2012883954 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012883954 Country of ref document: EP |