WO2015045504A1 - Object Detection Device - Google Patents
Object Detection Device (物体検知装置)
- Publication number
- WO2015045504A1 (PCT/JP2014/065676)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- distance
- object detection
- target
- extraction unit
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30236—Traffic on road, railway or crossing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to an object detection device, for example, an object detection device for detecting a preceding vehicle ahead of the host vehicle in an environment with low illuminance such as at night.
- Adaptive Cruise Control (vehicle speed / inter-vehicle distance control) and pre-crash safety are technologies in which the area ahead of the vehicle is imaged with a vehicle-mounted stereo camera, the relative distance to the preceding vehicle is measured from the captured images, and the vehicle either follows the preceding vehicle or brakes by predicting a collision with it.
- Miniaturization of stereo cameras has been desired; however, when the baseline length between the cameras constituting a stereo camera is shortened, the detection distance is also shortened. Therefore, in the long-distance region away from the host vehicle, an image obtained by one camera (a monocular camera) of the stereo camera is used to detect the vehicle position in the image, and the relative distance between the vehicles is calculated from the detected vehicle width in the image (number of pixels), an assumed actual vehicle width (for example, 1.7 m), and the characteristics of the camera. In this way, the detection accuracy of objects around the host vehicle is maintained (see, for example, Patent Document 1).
- The present invention has been made in view of the above problems, and its object is to provide an object detection device capable of suppressing a decrease in object detection accuracy in a low-illuminance environment, such as at night or in a tunnel, and thereby suppressing malfunctions in vehicle speed / inter-vehicle distance control, pre-crash safety, and the like.
- To achieve this object, the object detection device detects objects around the host vehicle using a plurality of imaging units, and comprises: a distance information calculation unit that calculates distance information to targets around the host vehicle based on images captured by the plurality of imaging units; an extraction unit that, based on the distance information, extracts from the targets in the image a target that exists at least in a long-distance region away from the host vehicle and has a specific color; and an output unit that outputs distance information to the target extracted by the extraction unit.
- According to the present invention, a target that exists in the long-distance region away from the host vehicle in the image and has a specific color — for example, the brake lights (stop lamps), tail lights, or auxiliary brake lamps of a preceding vehicle — is extracted. Even when measuring the relative distance to a preceding vehicle from images obtained by a stereo camera in a low-illuminance environment, noise in the long-distance region is suppressed and a decrease in object detection accuracy is suppressed, so that malfunctions in vehicle speed / inter-vehicle distance control, pre-crash safety, and the like can be suppressed.
- FIG. 1 is an overall perspective view schematically showing a vehicle to which Embodiment 1 of the object detection device according to the present invention is applied.
- FIG. 2 is an internal block diagram showing the internal configuration of the control unit shown in FIG. 1.
- FIG. 3 is a diagram explaining the principle of the distance information calculation method using a stereo camera.
- FIG. 4 is a diagram explaining the short-distance region and the long-distance region.
- FIG. 5 is a flowchart explaining the object detection method performed by the object detection device shown in FIG. 2.
- FIG. 6 is an internal block diagram showing the internal configuration of a control unit incorporating Embodiment 2 of the object detection device according to the present invention.
- FIG. 8 is an internal block diagram showing the internal configuration of a control unit incorporating Embodiment 3 of the object detection device according to the present invention.
- Hereinafter, embodiments of the object detection device will be described with reference to the drawings. In the embodiments below, a detection method performed by the object detection device when detecting a preceding vehicle ahead of the host vehicle in a low-illuminance environment, such as at night or in a tunnel, is described.
- FIG. 1 schematically shows a vehicle to which an object detection apparatus according to Embodiment 1 of the present invention is applied.
- Two cameras (imaging units) 102 and 103 are arranged side by side in the horizontal direction at a predetermined position on the vehicle 1 (for example, at the rearview mirror 101 of the vehicle 1), facing the front of the vehicle 1.
- An illuminance sensor 111 that detects the illuminance around the vehicle 1 is disposed at a predetermined position (for example, at an upper portion of the front window 112 of the vehicle 1).
- The two cameras 102 and 103 and the illuminance sensor 111 are communicably connected to a control unit 110 incorporating the object detection device 100 (see FIG. 2), and the images acquired by the cameras 102 and 103 and the illuminance information detected by the illuminance sensor 111 are transmitted to the control unit 110 via, for example, connection lines (not shown).
- FIG. 2 shows the internal configuration of the control unit shown in FIG. 1.
- The control unit 110 mainly includes a RAM 104 serving as an image storage unit, the object detection device 100, and a controller 109 serving as a control unit.
- The object detection device 100 mainly includes a distance information calculation unit 105, an extraction unit 106, a merge processing unit 107, and an output unit 108.
- The illuminance information around the vehicle 1 detected by the illuminance sensor 111 is transmitted to the controller 109 of the control unit 110 as described above, and the controller 109 generates, based on the transmitted illuminance information, a control signal for performing exposure control that adjusts the exposure conditions (for example, the exposure value) of the cameras 102 and 103. That is, the controller 109 controls the cameras 102 and 103 so that the exposure values differ between a high-illuminance environment such as daytime and a low-illuminance environment such as at night or in a tunnel.
- When the controller 109 determines, based on the illuminance information detected by the illuminance sensor 111, that the vehicle 1 is in a low-illuminance environment such as at night or in a tunnel, the controller 109 generates a control signal that adjusts the exposure value of each camera 102, 103 so that the brake lights and tail lights of a preceding vehicle present in the long-distance region a predetermined distance away from the vehicle 1 appear red, and transmits the control signal to each camera 102, 103. Here, the exposure value of a camera means the shutter value and gain value of the camera.
- Each of the cameras 102 and 103 adjusts exposure conditions such as the exposure value based on the control signal transmitted from the controller 109, and the images captured by the cameras 102 and 103 are transmitted to and stored in the camera image storage units 104a and 104b of the RAM 104.
- The distance information calculation unit 105 of the object detection device 100 periodically acquires the images stored in the camera image storage units 104a and 104b of the RAM 104, calculates the relative distance between the host vehicle and an object such as a preceding vehicle ahead of it using the parallax information of the same object appearing in the acquired images, and transmits the calculation result to the extraction unit 106.
- When the baseline length (the distance between the left and right optical axes) is B, the focal length of the cameras is f, and the parallax on the imaging surface is d, the relative distance Z to the preceding vehicle is calculated from the similarity of triangles by the following formula (1): Z = B × f / d … (1)
- As is apparent from formula (1), the parallax d decreases as the relative distance to the target increases; therefore, in the long-distance region, the distance measurement accuracy for the target decreases, and the possibility of malfunction in, for example, vehicle speed / inter-vehicle distance control or pre-crash safety increases.
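The relationship in formula (1) can be sketched as a small helper. The baseline, focal length, and disparity values below are illustrative, not taken from the patent:

```python
def stereo_distance(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Formula (1): relative distance Z = B * f / d, from similar triangles.

    baseline_m   -- baseline length B between the two optical axes (m)
    focal_px     -- focal length f expressed in pixels
    disparity_px -- parallax d measured on the imaging surface (px)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Illustrative values: a 0.35 m baseline and f = 1000 px.
# A disparity of 5 px gives Z = 70 m; halving d doubles Z, which is
# why ranging accuracy degrades quickly in the long-distance region.
```

Because Z is inversely proportional to d, a one-pixel disparity error at long range translates into a much larger distance error than the same pixel error at short range, which is the motivation for the extraction scheme that follows.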
- Therefore, the extraction unit 106 extracts, from the targets in the image, a target that exists in the long-distance region at least a predetermined distance away from the host vehicle and has a red (specific) color — for example, a preceding vehicle with its brake lights or tail lights turned on. That is, by extracting only red targets in the long-distance region, where distance measurement accuracy can decrease, the extraction unit 106 suppresses an increase in noise in the long-distance region.
- Specifically, the extraction unit 106 includes a short-distance extraction unit 106a and a red/long-distance extraction unit 106b. Based on the distance information transmitted from the distance information calculation unit 105, the short-distance extraction unit 106a extracts, from the targets in the images captured by the cameras, targets existing in the short-distance region near the host vehicle. Also based on that distance information, the red/long-distance extraction unit 106b extracts targets that exist in the long-distance region a predetermined distance away from the host vehicle and have a red color (see FIG. 4). The predetermined distance defining the short-distance region and the long-distance region is set appropriately according to, for example, the performance and arrangement of the cameras.
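The two-stage extraction can be sketched as follows. The spot representation and the 40 m region boundary are assumptions for illustration, not values from the patent:

```python
NEAR_FAR_BOUNDARY_M = 40.0  # assumed boundary between short- and long-distance regions

def extract_unit_106(spots):
    """Mimics extraction unit 106: the short-distance extraction unit keeps
    every spot in the near region; the red/long-distance extraction unit
    keeps far-region spots only when they are red (brake/tail lights).

    spots: list of dicts with keys 'distance_m' and 'is_red'.
    Returns the two extraction results separately, as the block diagram does.
    """
    near = [s for s in spots if s["distance_m"] < NEAR_FAR_BOUNDARY_M]
    far_red = [s for s in spots
               if s["distance_m"] >= NEAR_FAR_BOUNDARY_M and s["is_red"]]
    return near, far_red
```

A non-red spot at long range (stereo noise, street lighting reflections) is discarded, while everything at short range, where disparity is reliable, is kept regardless of color.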
- The merge processing unit 107 integrates the targets extracted by the short-distance extraction unit 106a and the targets extracted by the red/long-distance extraction unit 106b by a logical OR, and transmits the processing result to the output unit 108. Based on the processing result transmitted from the merge processing unit 107, the output unit 108 outputs to the controller 109 the distance information to the targets extracted by the short-distance extraction unit 106a and by the red/long-distance extraction unit 106b.
- The controller 109 uses the distance information and position information to objects in the image output from the object detection device 100 (in particular, a preceding vehicle with its brake lights or tail lights on) for various control applications such as vehicle speed / inter-vehicle distance control and pre-crash safety.
- FIG. 5 illustrates the object detection method performed by the object detection device 100 described above.
- First, the controller 109 determines, based on the illuminance information detected by the illuminance sensor 111, whether the illuminance around the host vehicle is within a predetermined threshold (S201). When the controller 109 determines that the illuminance is within the predetermined threshold (for example, in a high-illuminance environment such as daytime), images of the surroundings of the host vehicle are captured by the cameras 102 and 103 (S203), and the object detection device 100 performs normal distance calculation processing using those images (for example, processing that switches between distance calculation using stereo parallax information in the short-distance region and distance calculation by a monocular camera in the long-distance region) to detect objects around the host vehicle (S204).
- On the other hand, when the controller 109 determines that the illuminance around the host vehicle is not within the predetermined threshold (for example, in a low-illuminance environment such as at night or in a tunnel), the exposure values of the cameras 102 and 103 are adjusted according to the illuminance (S205), and each camera 102, 103 captures an image of the surroundings of the host vehicle (S206).
- Specifically, the controller 109 adjusts the exposure values of the cameras 102 and 103 so that the brake lights and tail lights of a preceding vehicle present in the long-distance region away from the vehicle 1 appear red; that is, so that both the brake lights (about 15 W to 60 W) of a preceding vehicle at the proximal end of the long-distance region and the tail lights (about 5 W to 30 W) of a preceding vehicle at the distal end of the long-distance region appear red. Here, the proximal end of the long-distance region is the position closest to the host vehicle within the long-distance region (in other words, the position farthest from the host vehicle within the short-distance region), and the distal end of the long-distance region is the position farthest from the host vehicle within the long-distance region (in other words, the position farthest from the host vehicle within the imaging region).
- Next, the object detection device 100 calculates, using the parallax information of the same object appearing in the images captured by the cameras 102 and 103, the relative distance between the host vehicle and targets such as a preceding vehicle ahead (that is, all light spots in the image) (S207). Then, based on the distance information calculated in S207, the object detection device 100 extracts, from all the light spots in the image, the light spots existing in the short-distance region close to the host vehicle (S208), and also extracts, from all the light spots in the image, the red light spots existing in the long-distance region a predetermined distance away from the host vehicle (S209). The object detection device 100 integrates the light spots extracted in S208 and S209 by a logical OR (S210), and then outputs the distance information of the extracted light spots to the controller 109 or the like (S211).
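The low-light branch of FIG. 5 (S201 and S208–S211) can be summarized in a short sketch. The illuminance threshold, the spot format, and the OR-merge via set union over spot indices are illustrative assumptions:

```python
ILLUMINANCE_THRESHOLD_LUX = 1000.0  # assumed daytime/night boundary (S201)

def detect(illuminance_lux, spots, boundary_m=40.0):
    """FIG. 5 flow: in bright scenes fall back to normal processing
    (S203-S204); otherwise extract near spots (S208) and far red spots
    (S209), OR-merge them (S210), and return their distances (S211)."""
    if illuminance_lux > ILLUMINANCE_THRESHOLD_LUX:
        return "normal_processing"          # stereo/monocular switching path
    near = {i for i, s in enumerate(spots) if s["distance_m"] < boundary_m}
    far_red = {i for i, s in enumerate(spots)
               if s["distance_m"] >= boundary_m and s["is_red"]}
    merged = near | far_red                 # S210: logical OR of the two sets
    return sorted(spots[i]["distance_m"] for i in merged)
```

Using set union over indices makes the OR-merge explicit: a spot passes if either extraction selects it, and no spot is reported twice.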
- As described above, by extracting the targets existing in the short-distance region close to the host vehicle from the targets in the images captured by the cameras 102 and 103, extracting the red targets existing in the long-distance region away from the host vehicle, and outputting the distance information to the extracted targets, only red targets (for example, a preceding vehicle with its brake lights or tail lights on) are extracted in the long-distance region, where distance measurement accuracy can decrease. This suppresses an increase in noise in the long-distance region and thereby a decrease in object detection accuracy, which in turn suppresses malfunctions in vehicle speed / inter-vehicle distance control and pre-crash safety.
- FIG. 6 shows the internal configuration of a control unit incorporating Embodiment 2 of the object detection device according to the present invention. The object detection device 100A of Embodiment 2 differs from the object detection device 100 of Embodiment 1 in the configuration of the extraction unit; the other components are the same as those of the object detection device 100 of Embodiment 1. Therefore, the same reference numerals are given to the same components as those of the object detection device 100 of Embodiment 1, and detailed description thereof is omitted.
- The object detection device 100A mainly includes a distance information calculation unit 105A, an extraction unit 106A, a merge processing unit 107A, and an output unit 108A, and the extraction unit 106A includes a short-distance extraction unit 106aA and a red extraction unit 106bA. Based on the distance information transmitted from the distance information calculation unit 105A, the short-distance extraction unit 106aA extracts the targets existing in the short-distance region close to the host vehicle in the images captured by the cameras. The red extraction unit 106bA extracts, from the targets captured by the cameras, targets having a red color (the specific color) — for example, brake lights or tail lights. That is, in Embodiment 2, the red extraction unit 106bA extracts red targets over the entire imaging region, including both the short-distance region and the long-distance region.
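Embodiment 2's simplification — testing color alone instead of color AND region — can be sketched as follows, with the same assumed spot format and boundary as before:

```python
def extract_unit_106A(spots, boundary_m=40.0):
    """Embodiment 2: keep every near-region spot plus every red spot,
    wherever it lies. A single pass with one OR condition replaces the
    separate distance-AND-color test of Embodiment 1's far-region stage."""
    return [s for s in spots if s["distance_m"] < boundary_m or s["is_red"]]
```

The output over the long-distance region is the same as in Embodiment 1 (far non-red spots are still dropped), but the extractor no longer needs the region boundary when evaluating red spots, which is the processing simplification the text describes.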
- The merge processing unit 107A integrates the targets extracted by the short-distance extraction unit 106aA and the targets extracted by the red extraction unit 106bA by a logical OR, and transmits the processing result to the output unit 108A. Based on that processing result, the output unit 108A outputs to the controller 109 the distance information to the targets extracted by the short-distance extraction unit 106aA and by the red extraction unit 106bA.
- FIG. 7 illustrates the object detection method performed by the object detection device 100A described above.
- First, the controller 109 determines, based on the illuminance information detected by the illuminance sensor 111, whether the illuminance around the host vehicle is within a predetermined threshold, as in Embodiment 1 (S201A). When the controller 109 determines that the illuminance is within the predetermined threshold (for example, in a high-illuminance environment such as daytime), the object detection device 100A performs normal distance calculation processing using the images captured by the cameras 102 and 103 (for example, processing that switches between distance calculation using stereo parallax information in the short-distance region and distance calculation by a monocular camera in the long-distance region) to detect objects around the host vehicle (S202A to S204A).
- On the other hand, when the controller 109 determines that the illuminance around the host vehicle is not within the predetermined threshold (for example, in a low-illuminance environment such as at night or in a tunnel), the exposure values of the cameras 102 and 103 are adjusted according to the illuminance (S205A), and each camera 102, 103 captures an image of the surroundings of the host vehicle (S206A). Specifically, the controller 109 adjusts the exposure values of the cameras 102 and 103 so that the brake lights and tail lights of a preceding vehicle present in the long-distance region away from the vehicle 1 appear red; that is, so that the brake lights of a preceding vehicle at the proximal end and the tail lights of a preceding vehicle at the distal end of the long-distance region appear red.
- Next, the object detection device 100A calculates, using the parallax information of the same object appearing in the images captured by the cameras 102 and 103, the relative distance between the host vehicle and targets such as a preceding vehicle ahead (that is, all light spots in the image) (S207A). Then, based on the distance information calculated in S207A, the object detection device 100A extracts, from all the light spots in the image, the light spots existing in the short-distance region close to the host vehicle (S208A), and also extracts, from all the light spots in the image, the red light spots (light spots corresponding to the brake lights, tail lights, or the like of a preceding vehicle) (S209A). The object detection device 100A integrates the light spots extracted in S208A and S209A by a logical OR (S210A), and then outputs the distance information of the extracted light spots to the controller 109 or the like (S211A).
- As described above, by extracting the targets existing in the short-distance region close to the host vehicle from the targets in the images captured by the cameras 102 and 103, extracting the red targets from the targets in the image, integrating the extracted targets, and outputting distance information to them, the object detection device 100A, like the object detection device 100 of Embodiment 1, extracts only red targets (for example, a preceding vehicle with its brake lights or tail lights on) in the long-distance region, where distance measurement accuracy can decrease. Moreover, because the red extraction unit 106bA of the extraction unit 106A extracts red targets over the entire imaging region, the processing in the extraction unit 106A can be simplified compared with extracting only targets that both exist in the long-distance region away from the host vehicle and have a red color.
- FIG. 8 shows the internal configuration of a control unit incorporating Embodiment 3 of the object detection device according to the present invention. The object detection device 100B of Embodiment 3 differs from the object detection device 100A of Embodiment 2 in that the object detection device itself has an exposure value adjustment function; the other components are the same as those of the object detection device 100A of Embodiment 2. Therefore, the same components as those of the object detection device 100A of Embodiment 2 are denoted by the same reference numerals, and detailed description thereof is omitted.
- The object detection device 100B mainly includes a distance information calculation unit 105B, an extraction unit 106B, a merge processing unit 107B, an output unit 108B, and an exposure value adjustment unit 112B that generates a control signal for performing exposure control to adjust the exposure conditions (for example, the exposure values) of the cameras 102 and 103.
- The exposure value adjustment unit 112B acquires the illuminance information around the host vehicle output from the illuminance sensor 111, either via the controller 109 or directly from the illuminance sensor 111, generates a control signal for performing exposure control of the cameras 102 and 103 based on the acquired illuminance information, and transmits the control signal to the cameras 102 and 103. Each of the cameras 102 and 103 adjusts exposure conditions such as the exposure value based on the control signal transmitted from the exposure value adjustment unit 112B, and the images captured by the cameras 102 and 103 are stored in the camera image storage units 104a and 104b of the RAM 104. In this way, the object detection device 100B can output the distance information of objects in the image to the controller 109 using the images stored in the camera image storage units 104a and 104b.
- In the embodiments described above, the controller or the exposure value adjustment unit adjusts the exposure value of each camera based on the illuminance around the host vehicle detected by the illuminance sensor; however, the exposure value may instead be set in advance by the user or the like so that the brake lights and tail lights of a preceding vehicle present in the long-distance region away from the host vehicle appear red in a low-illuminance environment such as at night or in a tunnel.
- Also, in the embodiments described above, a red target corresponding to brake lights, tail lights, or the like is imaged in order to detect a preceding vehicle ahead of the host vehicle in a low-illuminance environment such as at night or in a tunnel.
- the present invention is not limited to the first to third embodiments described above, and includes various modifications.
- the first to third embodiments described above are described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.
- a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
- Each of the above-described configurations, functions, processing units, processing means, and the like may be realized in hardware by designing some or all of them as, for example, an integrated circuit. Each of the above-described configurations, functions, and the like may also be realized in software by having a processor interpret and execute a program that realizes each function.
- Information such as programs, tables, and files for realizing each function can be stored in a memory, a hard disk, a storage device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
- The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all components may be considered to be connected to one another.
- 1: Vehicle, 100: Object detection device, 102, 103: Camera (imaging unit), 104: RAM, 104a, 104b: Camera image storage unit, 105: Distance information calculation unit, 106: Extraction unit, 106a, 106aA: Short-distance extraction unit (first partial extraction unit), 106b: Red/long-distance extraction unit (second partial extraction unit), 106bA: Red extraction unit (second partial extraction unit), 107: Merge processing unit, 108: Output unit, 109: Controller, 110: Control unit, 111: Illuminance sensor, 112B: Exposure value adjustment unit
Claims (9)
- 1. An object detection device that detects objects around a host vehicle using a plurality of imaging units, comprising: a distance information calculation unit that calculates distance information to targets around the host vehicle from images captured by the plurality of imaging units; an extraction unit that, based on the distance information, extracts from the targets in the image a target that exists at least in a long-distance region away from the host vehicle and has a specific color; and an output unit that outputs distance information to the target extracted by the extraction unit.
- 2. The object detection device according to claim 1, wherein the extraction unit includes a first partial extraction unit that extracts, from the targets in the image, a target existing in a short-distance region close to the host vehicle, and a second partial extraction unit that extracts, from the targets in the image, a target that exists in the long-distance region and has the specific color, and the output unit outputs distance information to the targets extracted by the first partial extraction unit and the second partial extraction unit.
- 3. The object detection device according to claim 1, wherein the extraction unit includes a first partial extraction unit that extracts, from the targets in the image, a target existing in a short-distance region close to the host vehicle, and a second partial extraction unit that extracts, from the targets in the image, a target having the specific color, and the output unit outputs distance information to the targets extracted by the first partial extraction unit and the second partial extraction unit.
- 4. The object detection device according to claim 1, wherein the exposure values of the plurality of imaging units are adjusted so that the brake lights and tail lights of a preceding vehicle in the long-distance region appear in the specific color.
- 5. The object detection device according to claim 1, wherein the exposure values of the plurality of imaging units are adjusted so that the brake lights of a preceding vehicle at the proximal end of the long-distance region and the tail lights of a preceding vehicle at the distal end of the long-distance region appear in the specific color.
- 6. The object detection device according to claim 1, wherein the exposure values of the plurality of imaging units are adjusted based on the illuminance around the host vehicle.
- 7. The object detection device according to claim 1, comprising an exposure value adjustment unit that adjusts the exposure values of the plurality of imaging units so that the brake lights and tail lights of a preceding vehicle in the long-distance region appear in the specific color.
- 8. The object detection device according to claim 1, comprising an exposure value adjustment unit that adjusts the exposure values of the plurality of imaging units so that the brake lights of a preceding vehicle at the proximal end of the long-distance region and the tail lights of a preceding vehicle at the distal end of the long-distance region appear in the specific color.
- 9. The object detection device according to claim 1, comprising an exposure value adjustment unit that adjusts the exposure values of the plurality of imaging units based on the illuminance around the host vehicle.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015538951A JP6208244B2 (ja) | 2013-09-27 | 2014-06-13 | 物体検知装置 |
US15/024,937 US10297155B2 (en) | 2013-09-27 | 2014-06-13 | Object detector |
EP14847611.2A EP3051518B1 (en) | 2013-09-27 | 2014-06-13 | Object detector |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-201157 | 2013-09-27 | ||
JP2013201157 | 2013-09-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015045504A1 true WO2015045504A1 (ja) | 2015-04-02 |
Family
ID=52742662
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/065676 WO2015045504A1 (ja) | Object detection device | 2013-09-27 | 2014-06-13 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10297155B2 (ja) |
EP (1) | EP3051518B1 (ja) |
JP (1) | JP6208244B2 (ja) |
WO (1) | WO2015045504A1 (ja) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8744666B2 (en) * | 2011-07-06 | 2014-06-03 | Peloton Technology, Inc. | Systems and methods for semi-autonomous vehicular convoys |
KR101511853B1 (ko) * | 2013-10-14 | 2015-04-13 | Yeungnam University Industry-Academic Cooperation Foundation | System and method for detecting and locating forward vehicles at night using a single multi-exposure camera |
WO2015114654A1 (en) * | 2014-01-17 | 2015-08-06 | Kpit Technologies Ltd. | Vehicle detection system and method thereof |
JP2017024538A (ja) * | 2015-07-22 | 2017-02-02 | Shuichi Tayama | Vehicle approach warning system |
JP6612135B2 (ja) * | 2016-01-14 | 2019-11-27 | Hitachi Automotive Systems, Ltd. | Vehicle detection device and light distribution control device |
KR101859040B1 (ko) * | 2016-09-22 | 2018-05-17 | LG Electronics Inc. | Camera apparatus and method for vehicle |
WO2018086133A1 (en) * | 2016-11-14 | 2018-05-17 | SZ DJI Technology Co., Ltd. | Methods and systems for selective sensor fusion |
US11787330B2 (en) * | 2018-11-12 | 2023-10-17 | Koito Manufacturing Co., Ltd. | Vehicle lamp system |
US10832438B2 (en) * | 2018-12-19 | 2020-11-10 | Murat Gozu | Object distancing system for a vehicle |
EP3671692A1 (en) * | 2018-12-19 | 2020-06-24 | Ningbo Geely Automobile Research & Development Co. Ltd. | Time for passage of a platoon of vehicles |
AU2019100368B4 (en) * | 2019-01-25 | 2019-11-28 | Norman BOYLE | A driverless impact attenuating traffic management vehicle |
WO2020162343A1 (ja) * | 2019-02-04 | 2020-08-13 | NEC Corporation | Vehicle management device, vehicle management method, and storage medium storing a program |
JP7236556B2 (ja) * | 2019-10-14 | 2023-03-09 | Denso Corporation | Object detection device and object detection program |
US11869361B2 (en) * | 2021-04-01 | 2024-01-09 | Gm Cruise Holdings Llc | Coordinated multi-vehicle routing |
CN115610415B (zh) * | 2022-11-17 | 2023-03-14 | GAC Aion New Energy Automobile Co., Ltd. | Vehicle distance control method and apparatus, electronic device, and computer-readable medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4253271B2 (ja) | 2003-08-11 | 2009-04-08 | Hitachi, Ltd. | Image processing system and vehicle control system |
JP5320331B2 (ja) * | 2010-03-17 | 2013-10-23 | Hitachi Automotive Systems, Ltd. | In-vehicle environment recognition device and in-vehicle environment recognition system |
US9255291B2 (en) * | 2010-05-06 | 2016-02-09 | Bioo Scientific Corporation | Oligonucleotide ligation methods for improving data quality and throughput using massively parallel sequencing |
JP5537491B2 (ja) * | 2011-05-12 | 2014-07-02 | Fuji Heavy Industries Ltd. | Environment recognition device |
JP5386539B2 (ja) * | 2011-05-12 | 2014-01-15 | Fuji Heavy Industries Ltd. | Environment recognition device |
JP5592308B2 (ja) | 2011-05-19 | 2014-09-17 | Fuji Heavy Industries Ltd. | Environment recognition device |
JP5499011B2 (ja) * | 2011-11-17 | 2014-05-21 | Fuji Heavy Industries Ltd. | Exterior environment recognition device and exterior environment recognition method |
2014
- 2014-06-13 US US15/024,937 patent/US10297155B2/en active Active
- 2014-06-13 JP JP2015538951A patent/JP6208244B2/ja active Active
- 2014-06-13 WO PCT/JP2014/065676 patent/WO2015045504A1/ja active Application Filing
- 2014-06-13 EP EP14847611.2A patent/EP3051518B1/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005092861A (ja) * | 2003-08-11 | 2005-04-07 | Hitachi Ltd | Vehicle control system |
JP2008298533A (ja) * | 2007-05-30 | 2008-12-11 | Konica Minolta Holdings Inc | Obstacle measurement method, obstacle measurement device, and obstacle measurement system |
JP2010224930A (ja) * | 2009-03-24 | 2010-10-07 | Fuji Heavy Ind Ltd | Road recognition device |
JP2013058829A (ja) | 2011-09-07 | 2013-03-28 | Honda Motor Co Ltd | Vehicle periphery monitoring device |
JP2013107476A (ja) * | 2011-11-21 | 2013-06-06 | Hitachi Automotive Systems Ltd | Image processing device |
Non-Patent Citations (1)
Title |
---|
See also references of EP3051518A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017110277A1 (ja) * | 2015-12-25 | 2017-06-29 | Hitachi Automotive Systems, Ltd. | Headlight control device |
JPWO2017110277A1 (ja) * | 2015-12-25 | 2018-08-30 | Hitachi Automotive Systems, Ltd. | Headlight control device |
US10351048B2 (en) | 2015-12-25 | 2019-07-16 | Hitachi Automotive Systems, Ltd. | Headlight control device |
JP2021163203A (ja) * | 2020-03-31 | 2021-10-11 | Soken, Inc. | Object detection device |
JP7307699B2 (ja) | Soken, Inc. | Object detection device |
JP7473616B2 (ja) | 2021-11-15 | 2024-04-23 | Waymo LLC | Auto-exposure occlusion camera |
Also Published As
Publication number | Publication date |
---|---|
EP3051518A1 (en) | 2016-08-03 |
US10297155B2 (en) | 2019-05-21 |
US20160240085A1 (en) | 2016-08-18 |
JPWO2015045504A1 (ja) | 2017-03-09 |
EP3051518A4 (en) | 2017-05-24 |
JP6208244B2 (ja) | 2017-10-04 |
EP3051518B1 (en) | 2020-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6208244B2 (ja) | Object detection device | |
JP6353525B2 (ja) | Method for controlling the speed of a host vehicle and system for controlling the speed of a host vehicle | |
US8055017B2 (en) | Headlamp monitoring apparatus for image exposure adjustment | |
JP6407626B2 (ja) | Object recognition device and vehicle control system | |
JP5727356B2 (ja) | Object detection device | |
US9886773B2 (en) | Object detection apparatus and object detection method | |
JP5906224B2 (ja) | Exterior environment recognition device | |
WO2012121107A1 (ja) | In-vehicle camera and in-vehicle camera system | |
CN109703555B (zh) | Method and device for detecting occluded objects in road traffic | |
JP6325927B2 (ja) | Object detection device and vehicle control system using the same | |
KR102177879B1 (ko) | Apparatus and method for detecting objects of a vehicle | |
JP4807733B2 (ja) | Exterior environment recognition device | |
US9524645B2 (en) | Filtering device and environment recognition system | |
KR20140069777A (ko) | Smart driving assistance system for vehicle | |
JP6259335B2 (ja) | Exterior environment recognition device | |
JP2009146153A (ja) | Moving object detection device, moving object detection method, and moving object detection program | |
JP6891082B2 (ja) | Object distance detection device | |
JP6329438B2 (ja) | Exterior environment recognition device | |
JP6387710B2 (ja) | Camera system, distance measurement method, and program | |
JP5506886B2 (ja) | Vehicle periphery monitoring device | |
JP2015069380A (ja) | Exterior environment recognition device | |
KR20220097656A (ko) | Driver assistance apparatus, vehicle, and control method thereof | |
JP2022136534A (ja) | Image recording device | |
WO2021084915A1 (ja) | Image recognition device | |
JP2017084019A (ja) | Object detection device, object detection method, and object detection program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14847611 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015538951 Country of ref document: JP Kind code of ref document: A |
REEP | Request for entry into the european phase |
Ref document number: 2014847611 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2014847611 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 15024937 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |