WO2001021446A1 - Device for assisting automobile driver - Google Patents
- Publication number
- WO2001021446A1 (PCT/JP2000/006393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- distance
- moving object
- moving
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/223—Analysis of motion using block-matching
- G06T7/238—Analysis of motion using block-matching using non-full search, e.g. three-step search
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/107—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/207—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using multi-purpose displays, e.g. camera image and navigation or video on same display
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8066—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring rearward traffic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/221—Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0085—Motion estimation from stereoscopic image signals
Definitions
- the present invention relates to a driving support device that assists the driving of a moving body such as a vehicle by capturing the situation around the moving body with a camera and processing the captured images.
- Japanese Patent Publication No. 9-2407397 (hereinafter referred to as the "first conventional example") discloses a rear-side warning device for vehicles. It detects a moving object present in an adjacent lane from a captured image of the area behind and beside the vehicle, detects the presence or absence of a white line as part of the moving-object detection, and integrates these detection results to detect other vehicles. The approach state between the other vehicle and the own vehicle is then determined, and the driver is notified when the degree of approach may become excessive.
- Japanese Patent Application Laid-Open No. 7-93639 (hereinafter, the "second conventional example") discloses a vehicular object detection device.
- This device can accurately distinguish characters or patterns painted on the road from objects such as vehicles, and detects objects with high accuracy.
- The movement of the edge points of an object is measured as apparent three-dimensional movement on the road surface, and object discrimination means compares the measured amount of movement with the vehicle speed to discriminate the object.
- a method of detecting an obstacle by analyzing the motion of a captured image is generally called motion stereo; it obtains three-dimensional information about the scene by analyzing the image changes that correspond to changes in viewpoint caused by movement.
- this method has the disadvantage that, for image regions in the direction of movement, the change in the image corresponding to the change in viewpoint is small, so the sensitivity of the device decreases there.
- furthermore, since the motion analysis method determines the motion of objects on the screen, if the imaging device is installed in a vehicle, the screen shakes due to the vibration of the vehicle itself, and accurate motion may not be obtained.
- the first conventional example only notifies the driver when another vehicle approaches too closely. Further, the second conventional example does not describe a method of notifying the driver of a detected obstacle.
- the present invention considers these problems of the conventional driving support devices and moving object image display systems, and enables the driver to directly check approaching objects and obstacles in the surroundings. It is an object of the present invention to provide a driving support device that can thereby reduce the burden on the driver. Specifically, the present invention provides, as a device for assisting the driving of a moving body, a plurality of imaging means installed on the moving body for capturing images of the rear side of the moving body, and
- detection means for detecting, from the images captured by the plurality of imaging means, the movement of an object behind the moving body, wherein the imaging areas of one imaging means and another imaging means overlap each other in the vicinity of the vanishing point on the captured image.
- the detection means obtains the stereo disparity between the one imaging means and the other imaging means in the overlap area, and calculates the distance to the object based on the obtained stereo disparity.
- preferably, the detection means provided in the driving support apparatus according to the present invention detects a flow indicating the temporal movement of the image in an imaging region of the one imaging means other than the overlap region, and detects the movement of the object behind the moving body on the basis of the detected flow.
- preferably, the driving support device further includes image synthesizing means that combines the images captured by the plurality of imaging means to generate an image representing the rear of the moving body.
- the driving assistance device preferably further comprises danger determining means that receives the information output from the detection means, determines the possibility that an object approaching from behind the moving body will collide with it, and outputs an instruction signal when that possibility is determined to be high, and outside warning means that issues an alarm toward the rear of the moving body when the instruction signal is output from the danger determining means.
- alternatively, the device further includes the danger determining means and occupant protection means that takes measures to protect an occupant of the moving body when the instruction signal is output from the danger determining means.
- the present invention also provides, as an apparatus for assisting the driving of a moving body, imaging means installed on the moving body for capturing an image of its surroundings; image generating means for converting the captured image into an image from a viewpoint at a position different from that of the imaging means; and detection means for detecting the distance from the moving body of an object appearing in the captured image, wherein the image generating means, in generating the converted image, corrects the distortion of the object in the image using the distance detected by the detection means.
- preferably, a plurality of imaging means are provided, the imaging areas of one imaging means and another overlapping on the captured image, and the detection means obtains the stereo disparity in the overlap area and determines the distance to the object based on it. Also preferably, the detection means in the driving support device according to the present invention obtains the distance to the object from a flow indicating the temporal movement of the captured image.
- the present invention further provides imaging means installed on the moving body for capturing an image of its periphery, and
- detection means that obtains flows indicating temporal movement from the image captured by the imaging means and detects the motion of an object, wherein the detection means, as pre-processing for detecting the motion of the object, obtains an estimated offset value from the obtained flows and cancels this estimated offset from each flow as a shake component caused by the vibration of the moving body.
- FIG. 1 is a block diagram showing the configuration of the driving support device according to the first embodiment of the present invention.
- FIG. 2A is an example of a captured image
- FIG. 2B is a diagram showing a flow on the image of FIG. 2A.
- FIG. 3 is a conceptual diagram when the image of FIG. 2 is captured as viewed from above.
- Fig. 4 (a) is a diagram showing the relationship between the vanishing point and the flow
- Fig. 4 (b) is a diagram in which a moving object region has been extracted.
- FIG. 5 is a flowchart showing the flow of the process of extracting the fluctuation component.
- FIGS. 6 (a) and 6 (b) are diagrams showing the influence of the vertical vibration of the vehicle on the imaging means.
- Figs. 7 (a) to 7 (g) are diagrams for explaining the procedure for obtaining the estimated offset value of the motion vector.
- FIG. 8 is a flowchart showing a flow of processing for detecting a moving object and a detected object.
- FIGS. 9 (a) and 9 (b) are diagrams for explaining a method of determining whether or not a moving object is present.
- FIGS. 10 (a) and (b) are diagrams showing the extraction of a moving object region using motion vectors.
- Fig. 11 (a) is a diagram showing the default distance value of the imaging range
- Fig. 11 (b) is a diagram showing the distance estimation when the moving object region straddles the regions AR1, AR2.
- Fig. 12 (a) is an image showing a motion vector of a still background
- Fig. 12 (b) is a diagram for explaining an obstacle determination method.
- FIG. 13 is a diagram showing an example of a display image.
- FIG. 14 is a block diagram showing the configuration of the driving support device according to the second embodiment of the present invention.
- FIG. 15 is a schematic diagram showing an example of the arrangement of the imaging means according to the second embodiment of the present invention.
- Figures 16 (a) and (b) show the flow of each captured image, and (c) shows the stereo image obtained by superimposing the two images shown in Figures 16 (a) and (b).
- Fig. 17 is an example of an image from which obstacles, moving objects and approaching objects are extracted.
- Figures 18 (a) and 18 (b) are views showing virtual viewpoints for obtaining a composite image in which a moving object is displayed stereoscopically.
- FIG. 19 (a) is an example of an actual captured image
- FIG. 19 (b) is a composite image generated from FIG. 19 (a).
- FIGS. 20 (a) and (b) are diagrams for explaining image composition taking into account the detected distance.
- FIGS. 21 (a) and 21 (b) show image synthesis taking into account the detected distance, and are diagrams for explaining the case where two imaging means are used.
- FIGS. 22 (a) and 22 (b) are diagrams showing an example of the arrangement of the imaging means according to a modification of the second embodiment of the present invention.
- FIGS. 23 (a), (b), and (c) are diagrams for explaining how to obtain stereoscopic parallax in a modified example of the second embodiment of the present invention.
- FIG. 24 is a block diagram showing the configuration of the driving support device according to the third embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of a driving support device (mobile object image display system) according to a first embodiment of the present invention.
- reference numeral 101 denotes imaging means installed on a moving object
- 102 denotes hierarchical image forming means
- 103 denotes an LPF (low-pass filter)
- 104 denotes block sampling means
- 105 denotes hierarchical block matching means
- 106 denotes sub-pixel estimation / reliability judgment means
- 107 denotes shake component extraction / cancellation means
- 108 denotes vanishing point calculation means
- 109 denotes moving object / approaching object detection means
- 110 is an image synthesizing means
- 111 is a display device.
- Components 102 to 109 constitute the detection means.
- the imaging means 101 is, for example, a camera, and is installed on a rear part of the vehicle, for example, on a rear panel so as to be able to photograph the rear of the vehicle.
- as the display device 111, for example, a display dedicated to the present system or a liquid crystal monitor for car navigation is used.
- the purpose of the moving object image display system according to the present embodiment is mainly to reduce the burden on the driver by directly displaying, as a warning, an object approaching from the rear, which is dangerous when changing course.
- the operation of the moving object image display system according to the present embodiment will be described with reference to FIGS.
- an image (here, 320 pixels ⁇ 240 pixels) behind the host vehicle is imaged by the imaging means 101.
- the captured images include obstacles such as buildings 11 and street trees 12 and moving objects such as vehicles 13 and 14 traveling behind.
- Fig. 2 schematically shows these obstacles and moving objects.
- the captured image is input to the hierarchical imaging means 102 and the image synthesizing means 110.
- the hierarchical imaging means 102 combines the input captured image over each (2 x 2) pixel block to generate a first-order upper image (160 x 120 pixels), and repeats this to generate
- a second-order upper image (80 x 60 pixels).
- these three types of images are output to the LPF 103 as hierarchical images.
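The hierarchical image generation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes a grayscale frame and averages each (2 x 2) block (summing instead of averaging would differ only by a constant factor), and the function name is made up for this sketch.

```python
import numpy as np

def build_pyramid(img, levels=2):
    """Build hierarchical images by combining each 2x2 pixel block.

    From a 320x240 capture this yields a 160x120 first-order upper
    image and an 80x60 second-order upper image, as in the text.
    """
    pyramid = [img.astype(np.float32)]
    for _ in range(levels):
        prev = pyramid[-1]
        h, w = prev.shape
        # Average each non-overlapping 2x2 block to halve the resolution.
        reduced = prev.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(reduced)
    return pyramid

frame = np.zeros((240, 320), dtype=np.uint8)
pyr = build_pyramid(frame)
print([p.shape for p in pyr])  # [(240, 320), (120, 160), (60, 80)]
```

The three arrays returned correspond to the captured image and the two upper-layer images that are passed on to the LPF stage.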
- the LPF 103 performs LPF processing of (3 ⁇ 3) pixels for each of the input hierarchical images.
- the hierarchical block matching means 105 finds the SAD (sum of absolute differences) by block matching within a range of (5 x 5) pixels in the uppermost image, and finds the motion vector based on the position where the SAD is minimal. For the image of the next lower layer, the motion vector is again searched within a range of (5 x 5) pixels, centered on the motion vector obtained at the same position in the upper layer.
- the sub-pixel estimation / reliability judgment means 106 uses the SAD and the motion vector obtained in the lowest layer (the captured image) to estimate the motion vector with sub-pixel accuracy of one pixel or less, from the SAD minimum position and the SAD values of the eight surrounding points. At the same time, it determines the reliability of the motion vector in that block.
- in this way, the hierarchical block matching means 105 and the sub-pixel estimation / reliability judgment means 106 obtain, for each position of the captured image, the motion vector from the previous frame as the flow FL.
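The SAD block matching at one pyramid level can be sketched as below. This is a simplified, hypothetical version: block size, search radius, and tie-breaking are assumptions, and the coarse-to-fine refinement is expressed only through the `center` argument (the vector propagated from the upper layer, scaled by 2).

```python
import numpy as np

def sad_match(prev, curr, bx, by, block=8, center=(0, 0), radius=2):
    """Find the motion vector of one block by minimizing the SAD over a
    (2*radius+1)^2 search window (a 5x5 range for radius=2) around
    `center`, the vector propagated from the coarser pyramid level."""
    ref = prev[by:by + block, bx:bx + block].astype(np.int32)
    best, best_v = None, (0, 0)
    for dy in range(center[1] - radius, center[1] + radius + 1):
        for dx in range(center[0] - radius, center[0] + radius + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or y + block > curr.shape[0] or x + block > curr.shape[1]:
                continue  # candidate block falls outside the frame
            cand = curr[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(ref - cand).sum())
            if best is None or sad < best:
                best, best_v = sad, (dx, dy)
    return best_v, best
```

At the top layer one would call this with `center=(0, 0)`; at each finer layer, with `center` set to twice the vector found at the same position one layer up, so the search stays within a 5 x 5 window per level.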
- the flow FL of the captured image will be described with reference to FIG. 3. As shown in FIG. 2 (b), the flow FL described above is obtained at the edge portions of the image. After the camera shake is canceled, the vanishing point VP on the screen, shown in FIG. 4 (a), lies in the direction opposite to the vehicle traveling direction shown in FIG. 3. An object that is stationary on the ground has a flow FL directed toward the vanishing point VP on the imaging screen. Therefore, a screen area having a flow FL in a direction other than toward the vanishing point VP (for example, the rectangular area 202) can be extracted as a moving or approaching object.
- the shake component extraction / cancellation means 107 extracts and cancels the shake component of the image due to the vibration of the car by statistically processing the obtained motion vector.
- the vanishing point calculation means 108 finds the vanishing point VP of the image flow FL as the vehicle advances. That is, as shown in FIG. 4 (a), the vanishing point VP is determined as the point toward which the flow of most of the entire image is directed.
- the moving object / approaching object detection means 109 extracts, as moving/approaching object candidate blocks, blocks having a motion vector different from the flow FL toward the vanishing point VP obtained by the vanishing point calculation means 108. Then, by connecting nearby candidate blocks, the area where a moving or approaching object exists is extracted as a rectangular area 202, as shown in FIG. 4 (b).
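The candidate test above — does this flow point along the line through the vanishing point? — can be sketched as a simple angle check. The tolerance value and the function name are assumptions for illustration; the patent itself does not specify how "different from the flow toward the vanishing point" is quantified.

```python
import math

def is_candidate(x, y, vx, vy, vp=(160, 120), angle_tol=0.3):
    """Return True if the flow (vx, vy) at pixel (x, y) deviates from the
    radial line through the vanishing point `vp`, marking the block as a
    moving/approaching object candidate. `angle_tol` (radians) is an
    assumed noise threshold."""
    rx, ry = x - vp[0], y - vp[1]
    if (vx, vy) == (0, 0) or (rx, ry) == (0, 0):
        return False
    flow_ang = math.atan2(vy, vx)
    radial_ang = math.atan2(ry, rx)
    diff = abs(flow_ang - radial_ang) % (2 * math.pi)
    diff = min(diff, 2 * math.pi - diff)
    # Flows pointing toward or away from the VP both lie on the radial
    # line, so only the deviation from that line counts.
    return min(diff, abs(math.pi - diff)) > angle_tol
```

A flow aligned with the radial direction (stationary background) returns False; a flow crossing that line (e.g. a vehicle changing lanes) returns True.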
- referring to FIGS. 5 to 7, the specific operation of the shake component extraction / cancellation means 107 will be described.
- Fig. 5 is a flowchart for explaining the operation.
- the vehicle vibrates mainly in the vertical direction due to road irregularities, in addition to its movement in the traveling direction. As shown in FIG. 6, the influence of this vertical vibration on the imaging means appears as a change in position, as in (a), and a change in imaging direction, as in (b).
- the change in the vertical position shown in Fig. 6 (a) is extremely small compared to the change in the position of the vehicle in the traveling direction because the frame interval of the image signal is very short.
- the effect of the change in the position of the imaging means greatly varies depending on the distance to the object, and has a great effect on a near object, but has little effect on a distant object.
- the distance to the rear targets monitored in the present embodiment is sufficiently long, about several meters to several tens of meters. Therefore, the influence of the change in vertical position is ignored here, and only the change in imaging direction shown in FIG. 6 (b) is considered.
- the effect of the change in imaging direction does not depend on the distance to the object, and when the angle of the change is very small, it can be assumed to be a uniform vertical offset Vdy applied to every motion vector over the entire screen.
- the motion vector (Vx, Vy) of a stationary background region other than a moving object can then be approximated as the sum of the motion vector (V0x, V0y) toward the vanishing point VP due to the vehicle's progress and the offset (0, Vdy), as in the following equations:
- Vx = V0x
- Vy = V0y + Vdy
- the offset (0, Vdy) is extracted as the shake component, and by canceling it from the detected motion vector (Vx, Vy), the motion vector (V0x, V0y) of the stationary background portion toward the vanishing point VP is obtained.
- a motion vector (Vx, Vy) is input for each position (x, y) on the screen (S11).
- a provisional vanishing point (x0, y0) can be determined in advance, since it lies at a predetermined position on the screen given by the mounting angle of the imaging means 101 (S12).
- for a stationary background point, the offset then satisfies Vdy = Vy - (y - y0) * Vx / (x - x0),
- so Vdy can be obtained from a single motion vector.
- the input motion vector includes a large number of motion vectors related to the image area other than the portion of the stationary background such as a moving object.
- the motion vector related to the stationary background portion also includes an error.
- the offset Vdy of the motion vector is estimated by statistical processing. That is, as shown in FIG. 7, V dy is calculated for each moving vector according to the above equation, the frequency is calculated, and Vdy having the highest frequency is used as the final offset estimation value ( S14, S15, S16).
- when the distribution shown in Fig. 7(f) is accumulated over the motion vectors of the entire screen, the distribution shown in Fig. 7(g) is obtained.
- the value of Vdy that occurs most frequently in this distribution is used as the estimated value of the offset due to the shake.
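The voting scheme above (compute a Vdy candidate per motion vector, accumulate a histogram, take the mode) can be sketched as follows; the function name, array layout, and bin width are illustrative assumptions, not values from the patent.

```python
import numpy as np

def estimate_shake_offset(flow, x0, y0, bin_width=0.5):
    """Estimate the vertical shake offset Vdy by histogram voting (cf. S14-S16).

    flow : array of rows (x, y, Vx, Vy), one motion vector per block.
    (x0, y0) : temporary vanishing point fixed by the camera mounting angle.
    Returns the center of the most frequent Vdy bin.
    """
    x, y, vx, vy = flow[:, 0], flow[:, 1], flow[:, 2], flow[:, 3]
    dx = x - x0
    ok = np.abs(dx) > 1e-6              # skip vectors on the vanishing-point column
    # one offset candidate per motion vector: Vdy = Vy - (y - y0) * Vx / (x - x0)
    vdy = vy[ok] - (y[ok] - y0) * vx[ok] / dx[ok]
    # accumulate votes over the whole screen and take the peak as the estimate
    edges = np.arange(vdy.min(), vdy.max() + bin_width, bin_width)
    hist, edges = np.histogram(vdy, bins=edges)
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])
```

Because moving objects and noisy vectors cast votes away from the true offset, the mode of the histogram is robust to them as long as the stationary background dominates the screen.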
- the vanishing point calculation means 108 in FIG. 1 cancels the estimated offset and then finds the actual vanishing point.
- when the calculated vanishing point differs greatly from the temporary vanishing point, the subsequent moving object / approaching object detection processing is interrupted, and the image is synthesized and displayed using the result of the previous frame.
- the moving / approaching object detecting means 109 performs the following operation.
- FIG. 8 is a flowchart showing a specific operation of the moving object / approaching object detecting means 109.
- the motion vector (Vx, Vy) for each position (x, y) on the screen and the temporary vanishing point (x0, y0) are input, as in the extraction of the shake component (S21, S22).
- whether each input motion vector belongs to a moving object is determined based on whether or not it represents motion toward the vanishing point after the offset has been canceled (S23).
- specifically, the determination takes into account the motion vector error (±Vnx, ±Vny) and the small in-screen error (±nx, ±ny), and is made based on whether or not the distribution of Vdy (Fig. 7(f)) obtained at the time of shake component extraction matches the estimated offset value.
- that is, in FIG. 9(a) the distribution of Vdy and the estimated offset do not match, so the position of this motion vector is determined to be a moving object, while in FIG. 9(b) the distribution of Vdy and the estimated offset match, so the position of this motion vector is determined to be a stationary background.
- motion vectors FL1 determined to belong to a moving object are detected in each part of the moving object on the screen.
- the areas containing these motion vectors FL1 are grouped to generate a rectangular moving object area 202 (S24).
- the distance from the vehicle to the moving object is estimated from the position of the lower end UE of the moving object area 202 (S25).
- FIG. 11(a) is a diagram showing the default distance values for the imaging range of the imaging means 101.
- the first area AR1, below a predetermined position in the captured image, is assumed to lie on the road surface RS, and
- the second area AR2, above this predetermined position, is assumed to be at a predetermined distance DD (for example, 50 m) from the imaging means 101.
- even when the moving object area 202 extends over both areas AR1 and AR2 on the screen, the distance is estimated with the lower end UE of the area 202, which lies in the first area AR1, as the reference, on the assumption that the object is in contact with the road surface RS at the lower end UE.
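The default-distance idea of Fig. 11(a), where pixels below a reference row are mapped to a road-surface distance and pixels above it to the fixed default DD, can be sketched with a simple pinhole-camera model. All parameters (focal length in pixels, camera height, tilt) are illustrative assumptions, not values from the patent.

```python
import math

def default_distance(v_px, img_h, f_px, cam_height_m, tilt_deg=0.0, dd_m=50.0):
    """Distance assigned to image row v_px under the default-distance model.

    Rows whose viewing ray points below the horizon (area AR1) are intersected
    with the road surface; rows at or above it (area AR2) get the fixed dd_m.
    """
    cy = img_h / 2.0                       # assume principal point at image center
    # downward angle of the viewing ray for row v_px (camera tilted by tilt_deg)
    ang = math.atan((v_px - cy) / f_px) + math.radians(tilt_deg)
    if ang <= 0.0:                         # ray at or above the horizon -> AR2
        return dd_m
    return cam_height_m / math.tan(ang)    # ground intersection -> AR1
```

A moving object area would then be assigned the distance of its lower end UE, on the assumption that it touches the road surface there.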
- the distance to the moving object area estimated here is stored in memory. If, when the next frame image is processed, a moving object area is detected at the same position and its estimated distance is shorter than the estimated distance of the previous frame stored in memory, the object in that moving object area is determined to be an approaching object (S26).
- the distance Z is calculated from the magnitude |V| of the motion vector (with the offset canceled) by the following equation (S27):
- Z = dZ * r / |V|
- where dZ is the amount of movement of the vehicle between frames and r is the distance from the vanishing point VP on the screen.
- the distance Z obtained here is compared with the distance to the road surface stored as the default distance value (S28). An object located higher than the road surface, such as the street tree OB shown in FIG. 12(b), is determined to be an obstacle. For an object approaching from almost directly behind, such as the vehicle MM, a motion vector is generated near the vanishing point, but its magnitude is extremely small, so the distance Z obtained by the above method may indicate that the object lies below the road surface. Since no object can normally exist below the road surface, the motion vector in such a case is determined to belong to a moving object and is passed to the moving object area extraction processing S24.
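Steps S27 and S28 can be sketched as follows, using Z = dZ * r / |V| for an assumed-stationary point; the comparison against the road-surface default distance mirrors the obstacle / moving-object decision described above. Function names and the tolerance are illustrative assumptions.

```python
import math

def depth_from_flow(x, y, vx, vy, x0, y0, dz):
    """Depth Z of an assumed-stationary point from its motion vector (S27).

    r is the distance from the vanishing point (x0, y0) on the screen and
    dz the vehicle's movement between frames: Z = dz * r / |V|.
    """
    r = math.hypot(x - x0, y - y0)
    v = math.hypot(vx, vy)
    return dz * r / v

def classify_point(z, road_z, tol=0.1):
    """Compare the estimated depth with the road-surface default (S28).

    Closer than the road surface along the viewing ray means the point is
    above the road -> obstacle; an impossible "below the road" depth is
    treated as belonging to a moving object. tol is an assumed tolerance.
    """
    if z < road_z * (1.0 - tol):
        return "obstacle"
    if z > road_z * (1.0 + tol):
        return "moving object"
    return "road surface"
```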
- in this way, the obstacles, the moving and approaching objects, and their distances in the image are obtained from the motion vector at each position on the screen (S29), and this information is output to the image synthesizing means 110.
- as shown in FIG. 13, the image synthesizing means 110 synthesizes a red frame 203 around the rectangular area 202 on the captured image input from the imaging means 101, and outputs the result to the display device 111.
- the display device 111 displays the composite image with left and right reversed so as to be in phase with the rearview mirror.
- by seeing the display image shown in FIG. 13 and the lighting of the red frame 203, the driver can notice the approaching object.
- in other words, the driver can check the surrounding situation by looking at the captured image and can naturally pay attention to an approaching object that requires special attention, without being startled by an alarm sound or the like.
- in the present embodiment, only approaching objects among the moving objects are displayed with a blinking red frame, but the method of calling the driver's attention is not limited to this; the frame may, for example, be displayed without blinking.
- if the image of the approaching object displayed on the display device 111 moves downward, it can be understood that the object is approaching the own vehicle; conversely, if it moves upward, it can be understood that the object is moving away.
- since the distance to the approaching object is obtained, displaying the distance itself, or changing the display according to the distance, can further assist the driver in understanding the situation.
- for example, the color of the frame may be changed according to the distance, such as green when the distance to the approaching object is 50 m or more, yellow when it is 20 m or more and less than 50 m, and red when it is less than 20 m; alternatively, the distance value itself may be displayed at the upper right of the moving object area.
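The distance-dependent display change in the example above maps directly to a small helper, with the thresholds as given in the text:

```python
def frame_color(distance_m):
    """Warning-frame color for an approaching object, per the example
    thresholds: 50 m or more -> green, 20 m to under 50 m -> yellow,
    under 20 m -> red."""
    if distance_m >= 50.0:
        return "green"
    if distance_m >= 20.0:
        return "yellow"
    return "red"
```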
- FIG. 14 is a block diagram showing the configuration of a moving object image display system as a driving support device according to the second embodiment of the present invention.
- the same components as those in FIG. 1 are denoted by the same reference numerals as in FIG. 1, and the detailed description thereof is omitted here.
- the components different from those in FIG. 1 are a second imaging means 401 provided separately from the first imaging means 101, hierarchical block stereo matching means 405, 3D information estimation / obstacle detection means 409, and 3D image synthesizing means 410 serving as image synthesizing means or image generating means.
- the constituent elements 102 to 109, 405, and 409 constitute detection means.
- An object of the moving object image display system according to the present embodiment is to accurately detect an approaching object or an obstacle near a vanishing point where a motion vector (flow) is not accurately obtained.
- the operation of the moving object image display system according to the present embodiment will be described with reference to FIGS.
- Figure 15 is a schematic view of the vehicle and its surroundings as viewed from above.
- two imaging means 101 and 401 are installed at the rear of the vehicle, horizontally shifted from each other.
- they are arranged so that the first imaging range VA1 of the imaging means 101 and the second imaging range VA2 of the second imaging means 401, as another imaging means, have an overlapping area OL.
- the arrangement of the imaging means shown in Fig. 15 is for capturing and monitoring the rear of the vehicle with a wide field of view, using cameras having normal lenses with a limited viewing angle.
- hierarchical imaging means 102, LPF 103, block sampling means 104, and hierarchical block matching means 105 are provided for each of the captured images input from the imaging means 101 and 401, and the same processing as in the first embodiment is performed to obtain a flow (motion vector).
- FIGS. 16 (a) and 16 (b) are diagrams in which the flow obtained by the above processing is superimposed on the captured images obtained by the imaging means 101 and 401, respectively.
- OL indicates the overlap area on the imaging screen.
- the flow for a stationary object such as a building or a tree is indicated by a solid arrow, and
- the flow for a moving object such as a car is indicated by a dashed arrow.
- the flow for a stationary object is a flow toward the vanishing point VP caused by the movement of the own vehicle.
- the magnitude of this flow is proportional to the speed of the own vehicle and to the distance from the vanishing point VP on the screen. For this reason, the flow near the vanishing point VP is small in magnitude and difficult to detect.
- therefore, the hierarchical block stereo matching means 405 performs stereo analysis of the two images shown in Figs. 16(a) and (b) in the overlap area OL to obtain the stereo parallax. Since the vanishing point VP occurs directly behind in the traveling direction of the vehicle, it is easy to arrange the imaging means 101 and 401 so that the overlap area OL includes the vanishing point VP on the captured image.
- FIG. 16 (c) is a view in which the two images shown in FIGS. 16 (a) and (b) are superimposed.
- the displacement VD of the vehicle image between the two is the stereo parallax to be obtained.
- the stereo parallax VD is generated almost in the horizontal direction.
- the SAD (sum of absolute differences) is computed between blocks of the two images, and
- the stereo parallax is determined from the minimum point of the SAD.
- in the lower-layer images, the stereo parallax is searched within a range of 5 horizontal pixels by 3 vertical pixels, centered on the stereo parallax obtained in the block at the same position in the upper-layer image.
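One level of the hierarchical SAD search described above might look like the sketch below: for each block, candidate disparities are examined in a window centered on the disparity handed down from the coarser level (a 5 by 3 window corresponds to rx=2, ry=1). This is an illustrative sketch, not the patent's implementation.

```python
import numpy as np

def sad_match(left, right, bx, by, bsize, cx, cy, rx=2, ry=1):
    """Block matching by SAD around a disparity seed (cx, cy).

    Examines a (2*rx+1) x (2*ry+1) window of candidate disparities and
    returns the (dx, dy) with the minimum sum of absolute differences.
    """
    block = left[by:by + bsize, bx:bx + bsize].astype(np.int32)
    best, best_d = None, (cx, cy)
    for dy in range(cy - ry, cy + ry + 1):
        for dx in range(cx - rx, cx + rx + 1):
            yy, xx = by + dy, bx + dx
            if yy < 0 or xx < 0 or yy + bsize > right.shape[0] or xx + bsize > right.shape[1]:
                continue                       # candidate window outside the image
            cand = right[yy:yy + bsize, xx:xx + bsize].astype(np.int32)
            sad = int(np.abs(block - cand).sum())
            if best is None or sad < best:
                best, best_d = sad, (dx, dy)
    return best_d
```

Seeding each level with the coarser level's result keeps the search window small while still covering large true disparities.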
- the sub-pixel estimation / reliability determination means 106 uses the motion vector and SAD obtained in the lowest-layer image (the captured image), and from the SAD value at the minimum position and the eight surrounding SAD values, estimates the motion vector with sub-pixel accuracy of less than one pixel. At the same time, it determines the reliability of the motion vector in that block.
- by applying exactly the same processing to the stereo parallax VD obtained by the hierarchical block stereo matching means 405, stereo parallax estimation with sub-pixel accuracy and reliability determination of the stereo parallax are achieved.
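Using the SAD value at the minimum and its neighbors, the sub-pixel refinement can be done per axis by parabola fitting; this is one common way to realize the "minimum plus surrounding SADs" estimation described above, since the exact interpolation used is not specified in the text.

```python
def subpixel_offset(s_m1, s_0, s_p1):
    """Sub-pixel refinement of a SAD minimum by parabola fitting.

    Fits a parabola through the SAD at the integer minimum (s_0) and its two
    neighbors (s_m1, s_p1) and returns the vertex offset in (-0.5, 0.5);
    applied once per axis.
    """
    denom = s_m1 - 2.0 * s_0 + s_p1
    if denom <= 0.0:            # flat or degenerate -> no reliable refinement
        return 0.0
    return 0.5 * (s_m1 - s_p1) / denom
```

The curvature `denom` also gives a natural reliability cue: a sharp, deep minimum (large `denom`) indicates a trustworthy match, while a flat one does not.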
- from the stereo parallax VD, the distance from the imaging means to the object can be obtained by the principle of triangulation.
- from the flow representing the temporal movement of the image, the distance from the imaging means to the object can be obtained in relation to the vehicle speed, for example by assuming that the object is stationary on the ground.
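The triangulation step is the textbook relation Z = f * B / d for parallel cameras; a minimal sketch (parameter names are illustrative):

```python
def stereo_depth(parallax_px, focal_px, baseline_m):
    """Distance from stereo parallax by triangulation: Z = f * B / d.

    focal_px   : focal length expressed in pixels
    baseline_m : spacing between the two imaging means
    """
    if parallax_px <= 0.0:
        raise ValueError("parallax must be positive for a finite distance")
    return focal_px * baseline_m / parallax_px
```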
- the 3D information estimation / obstacle detection means 409 estimates three-dimensional information for the two captured images, as shown in FIG. 17, and detects, as an obstacle OB, an area on the screen estimated to lie at or above a predetermined height from the ground.
- the moving object / approaching object detection means 109 extracts blocks having a motion vector different from the flow toward the vanishing point VP as moving object / approaching object candidate blocks, and
- from these candidates, a moving object MM and an approaching object AP are extracted.
- the 3D image synthesizing means 410 synthesizes the two captured images input from the imaging means 101 and 401, using the three-dimensional information obtained from the 3D information estimation / obstacle detection means 409.
- in the synthesized image, the areas of the moving object MM and the approaching object AP are displayed lit, for example with a red frame, the obstacle OB is displayed, for example, with a green frame, and the result is output to the display device 111. At this time, the display device 111 displays the composite image with left and right reversed so that it has the same phase as the rearview mirror.
- by the lighting of the red frame, the driver can know the approach of the moving object MM and the approaching object AP, and can directly and easily grasp from which direction and how close they are approaching. Furthermore, by the display of the green frame, the driver can easily and directly grasp the presence and position of the obstacle OB.
- in general, a method that detects an obstacle or an approaching object by analyzing the motion of a captured image is called "motion stereo"; it obtains three-dimensional information about the scene by analyzing the changes in the image corresponding to the change of viewpoint caused by movement.
- in the present embodiment, detection in the region near the vanishing point on the screen, where the flow is small, is supplemented by stereo analysis of the overlap area OL captured by the two cameras, so that detection can be performed with high sensitivity.
- next, a method by which the 3D image synthesizing means 410 uses the three-dimensional information obtained from the 3D information estimation / obstacle detection means 409 to accurately convey information on approaching objects and obstacles to the driver will be described.
- when viewing a captured image in which information on an approaching object is synthesized, the driver must judge the distance in the depth direction of the screen from the apparent size of the object on the synthesized image.
- however, when the imaging device is installed in a vehicle, it cannot be installed higher than the vehicle height, and its orientation must be nearly horizontal in order to keep a certain distance in view. As a result, the distance to the approaching object lies in the depth direction of the screen.
- as a way to address this, there is a technique for changing the viewpoint position of a composite image, as in Japanese Patent Application No. Hei 10-217261 by the present inventors.
- if a composite image is created from a new viewpoint, for example one looking down from above,
- the distance to other objects becomes proportional to the distance on the screen, so it is easy to grasp the distance intuitively.
- the 3D image synthesizing means 410 therefore converts, using the above-described technique, the captured images obtained by the imaging means 101 and 401 into a composite image viewed from a viewpoint higher than the actual installation position of the imaging means. To see far into the distance, a viewpoint looking down obliquely is better than one looking straight down.
- Figure 18 (a) shows the position of the virtual viewpoint VVP for obtaining such a composite image.
- the virtual viewpoint VVP is located above the imaging means 101 and oriented so as to look down obliquely toward the rear of the vehicle. Then, as shown in FIG. 18(b), by assuming that the image captured by the actual imaging means 101 lies at the default distances described with reference to FIG. 11(a), a composite image viewed from the virtual viewpoint VVP can be generated from the actually captured image.
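The viewpoint conversion of FIG. 18(b), back-projecting a pixel of the real camera onto the road plane at its default distance and re-projecting it into the virtual camera, can be sketched with pinhole-camera matrices. All K, R, t values below are illustrative assumptions.

```python
import numpy as np

def warp_via_ground(u, v, K_real, R_real, t_real, K_virt, R_virt, t_virt):
    """Map pixel (u, v) of the real camera into the virtual-viewpoint image,
    assuming the imaged point lies on the road surface (world plane Z = 0).

    Projection model: p_cam = K (R X + t) for a world point X.
    """
    ray = R_real.T @ np.linalg.inv(K_real) @ np.array([u, v, 1.0])
    center = -R_real.T @ t_real              # real camera center in world coords
    s = -center[2] / ray[2]                  # intersect the ray with plane Z = 0
    ground = center + s * ray                # the assumed road-surface point
    p = K_virt @ (R_virt @ ground + t_virt)  # re-project into the virtual camera
    return p[:2] / p[2]
```

Pixels whose true object is not on the road plane land at wrong positions under this assumption, which is exactly the distortion that the three-dimensional information is later used to correct.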
- the composite image viewed from the virtual viewpoint VVP is as shown in FIG. 19 (b).
- white lines 411 and other features actually on the road surface conform to the default distance values and are therefore synthesized at the correct positions on the composite image; since the image is viewed from above, the sense of distance is emphasized and easy to grasp.
- however, the tree 412 and the vehicle 413, which are not actually on the road surface, are elongated and unnaturally distorted on the composite image.
- the composite image in FIG. 19 (b) includes an area 414 outside the actual imaging area.
- the distortion of vehicles and the like on the composite image described above can be greatly improved by using the three-dimensional information obtained from the 3D information estimation / obstacle detection means 409. This will be described with reference to FIGS. 20 and 21.
- obstacles and approaching objects above the ground are detected using the three-dimensional information obtained from the 3D information estimation / obstacle detection means 409. Therefore, as shown in Fig. 20(a), the area of an obstacle or approaching object is synthesized according to its three-dimensional information, so that even when an image viewed from the virtual viewpoint VVP is synthesized, a natural image with little distortion is obtained.
- for the tree 412 and the vehicle 413, which are not actually on the road surface, the actual distance RD from the imaging means 101 is detected, and their areas are synthesized at positions that take the actual distance RD into account. Therefore, as shown in Fig. 20(b), the tree 412 and the vehicle 413 are synthesized based on the actual distance RD instead of the default distance value, are not stretched on the composite image, and appear natural.
- FIG. 21 illustrates the case where a composite image is generated from the captured images of the two imaging means 101 and 401. As shown in Fig. 21(a), in this case too, the actual distance RD from the imaging means 101 and 401 is detected for the tree 412 and the vehicle 413, which are not actually on the road surface.
- VA1 and VA2 are areas corresponding to the visual field ranges of the imaging means 101 and 401, respectively, and OL is the overlap area where the visual field ranges overlap.
- FIG. 22 is a diagram showing an example of the arrangement of the imaging means according to the present modification.
- the difference from the above-described embodiment is that the two imaging means 101 and 401 are arranged at an interval k in the vertical direction instead of the horizontal direction.
- the first imaging area VA1 of the imaging means 101 and the second imaging area VA2 of the imaging means 401 overlap each other in their visual field ranges, so that an overlap area OL occurs.
- a stereo analysis area ST is provided in the area where the two imaging areas overlap, and motion analysis areas MA1 and MA2 are provided in the other areas.
- in this case, the stereo parallax VD occurs in the vertical direction.
- therefore, as shown in Fig. 23(c), by extracting horizontal edges HE from the two images in advance and establishing correspondence between the two images with respect to the horizontal edges HE, the stereo parallax VD can be obtained easily.
- the image of a vehicle on the road surface, which is the main detection target of the present invention, contains more horizontal edges, such as bumper lines and bonnet lines, than edges in other directions, and these can be detected easily. Since the distance from the imaging means is obtained from the stereo parallax VD, an approaching object or an obstacle can be detected as in the above-described embodiment, and an image of the approaching object or obstacle can be synthesized and displayed three-dimensionally.
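Extracting the horizontal edges HE used for the vertical-parallax correspondence can be as simple as thresholding the vertical intensity gradient; the threshold value below is an illustrative assumption.

```python
import numpy as np

def horizontal_edges(img, thresh=20):
    """Mark horizontal edges: pixels with a large intensity change between
    adjacent rows (bumper lines, bonnet lines, etc. respond strongly)."""
    g = img.astype(np.int32)
    dy = np.abs(g[1:, :] - g[:-1, :])     # vertical gradient magnitude
    edges = np.zeros(img.shape, dtype=bool)
    edges[1:, :] = dy > thresh
    return edges
```

Matching the rows of such edges between the two vertically spaced images then yields the vertical stereo parallax VD directly.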
- in this modification, the two imaging means and the virtual viewpoint are arranged in the vertical direction, so the composite image is an image in which the vertical parallax is emphasized.
- in portions with few horizontal edges, the stereo parallax VD cannot be determined with high accuracy, and the accuracy of the synthesized image decreases there.
- however, precisely because such portions have few horizontal edges, the unnaturalness due to positional deviation is hardly noticeable. Therefore, as a whole, a very natural image can be generated, in which the depth position information is emphasized in the portions that do have horizontal edges (such as portions where other vehicles appear).
- suppose the imaging means face directly behind and their horizontal angle of view is about 90 degrees. When the distance D is 10 m, the horizontal field of view corresponds to about 20 m, and if the aspect ratio of the pixels of the imaging means is 3:4, the vertical field of view Vh corresponds to about 15 m. That is, with Vp vertical pixels, the stereo parallax for a camera interval k corresponds to about
- VD = Vp * k / Vh (pixels)
- so even with the imaging means installed relatively close together, at an interval of about 7 to 20 cm, a detectable parallax is obtained, and mounting on a vehicle becomes easier.
- in each of the above embodiments, the rear of the vehicle has been described as the main monitoring area, but the present invention is not limited to this; for example, the front or side of the vehicle may be monitored, and images of the front and side of the vehicle may be generated.
- FIG. 24 is a block diagram showing the configuration of the driving support device according to the third embodiment of the present invention.
- the components different from those in FIG. 14 are danger determination means 501, outside-vehicle warning means 502, and occupant protection means 503.
- in the present embodiment, by performing stereo analysis with the hierarchical block stereo matching means 405 on the area where the imaging ranges of the plural imaging means 101 and 401 overlap, an approaching object directly behind the vehicle, which is difficult to detect conventionally, can be measured accurately.
- the danger determination means 501 determines the possibility of collision with the approaching object from the time until the collision or from the change in the approaching object's speed, and
- outputs an instruction signal when the possibility of collision is judged to be high.
- upon receiving the instruction signal, the outside-vehicle warning means 502 issues a warning toward the rear, such as automatically blinking a brake lamp.
- the warning here may be the irradiation or blinking of a lamp directed rearward, a sound alarm, or an alarm using radio waves.
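The danger determination from the "time until the collision" can be sketched as a time-to-collision check; the 2-second threshold below is a hypothetical value, not from the patent.

```python
def time_to_collision(distance_m, closing_speed_mps):
    """Time until collision with an approaching object: TTC = distance / closing speed."""
    if closing_speed_mps <= 0.0:
        return float("inf")                # not closing -> no collision course
    return distance_m / closing_speed_mps

def collision_likely(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Emit the instruction signal when the TTC falls below the (assumed) threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s
```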
- upon receiving the instruction signal, the occupant protection means 503 takes measures to protect the occupants, such as winding up the seat belts and preparing the airbags for operation.
- for the airbags in particular, if it is known in advance that the possibility of collision is high, various pre-processing such as preparation for operation and detection of the position of the occupants' heads can be carried out, so that occupant protection can be ensured.
- the means for detecting an approaching object in the present embodiment is not limited to one based on image stereo or motion analysis, but may be implemented by other means using, for example, radar or laser. Further, it goes without saying that the present invention can easily be applied to moving bodies other than vehicles, such as ships, airplanes, and trains.
- the installation positions and the number of the plurality of imaging means are not limited to those shown here.
- the functions of the detection means and the image generation means of the driving support device according to the present invention may be realized, in whole or in part, by dedicated hardware or by software. It is also possible to use a recording medium or a transmission medium storing a program that causes a computer to execute all or part of the functions of the detection means and the image generation means of the driving support device according to the present invention.
- as described above, the present invention makes it possible to detect an approaching object without being affected by the shake caused by the vibration of the vehicle.
- by additionally using stereo analysis with a plurality of imaging means, an approaching object directly behind the vehicle, whose motion on the screen is small, can also be detected.
- by displaying the detection results on the captured image, the driver can directly confirm the positional relationship and the surrounding situation. By converting the image into one viewed obliquely from above, the distance to an approaching object can be presented more intelligibly. Moreover, by not only informing the driver but also alerting the approaching vehicle, the possibility of collision can be reduced, and measures to protect occupants from the impact of a collision can be started earlier.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Remote Sensing (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
- Automotive Seat Belt Assembly (AREA)
- Air Bags (AREA)
- Studio Devices (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP00961092A EP1223083B1 (en) | 1999-09-20 | 2000-09-20 | Device for assisting automobile driver |
JP2001524839A JP3300340B2 (ja) | 1999-09-20 | 2000-09-20 | 運転支援装置 |
DE60009114T DE60009114T2 (de) | 1999-09-20 | 2000-09-20 | Vorrichtung zur unterstützung von kraftfahrzeugführern |
US10/088,593 US6993159B1 (en) | 1999-09-20 | 2000-09-20 | Driving support system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP26562999 | 1999-09-20 | ||
JP11/265629 | 1999-09-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2001021446A1 true WO2001021446A1 (en) | 2001-03-29 |
Family
ID=17419801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2000/006393 WO2001021446A1 (en) | 1999-09-20 | 2000-09-20 | Device for assisting automobile driver |
Country Status (7)
Country | Link |
---|---|
US (1) | US6993159B1 (ja) |
EP (1) | EP1223083B1 (ja) |
JP (2) | JP3300340B2 (ja) |
KR (1) | KR100466458B1 (ja) |
CN (1) | CN1160210C (ja) |
DE (1) | DE60009114T2 (ja) |
WO (1) | WO2001021446A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002039717A (ja) * | 2000-07-28 | 2002-02-06 | Victor Co Of Japan Ltd | 撮像装置 |
DE10218228A1 (de) * | 2002-04-24 | 2003-11-06 | Volkswagen Ag | Verfahren und Einrichtung zur Funktionskontrolle einer Videokamera in einem Fahrzeug |
JP2004147210A (ja) * | 2002-10-25 | 2004-05-20 | Matsushita Electric Ind Co Ltd | 運転支援装置 |
WO2005079060A1 (ja) * | 2004-02-16 | 2005-08-25 | Matsushita Electric Industrial Co., Ltd. | 運転支援装置 |
US7158664B2 (en) * | 2001-11-09 | 2007-01-02 | Honda Giken Kogyo Kabushiki Kaisha | Image recognition apparatus |
JP2007510575A (ja) * | 2003-11-11 | 2007-04-26 | テクニクス エージー | 車両の走行および/または交通状況記録装置と記録評価方法 |
US7652686B2 (en) | 2001-06-28 | 2010-01-26 | Robert Bosch Gmbh | Device for image detecting objects, people or similar in the area surrounding a vehicle |
WO2016084506A1 (ja) * | 2014-11-28 | 2016-06-02 | 株式会社デンソー | 車両の走行制御装置及び走行制御方法 |
JP2020086491A (ja) * | 2018-11-15 | 2020-06-04 | 株式会社リコー | 情報処理装置、情報処理システム及び情報処理方法 |
Families Citing this family (96)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4573977B2 (ja) * | 1999-09-22 | 2010-11-04 | 富士重工業株式会社 | 監視システムの距離補正装置、および監視システムの消失点補正装置 |
JP3904988B2 (ja) * | 2002-06-27 | 2007-04-11 | 株式会社東芝 | 画像処理装置およびその方法 |
DE10303792A1 (de) * | 2003-01-31 | 2004-08-12 | Robert Bosch Gmbh | Verfahren zur Anzeige fahrzeugspezifischer Informationssignale |
JP2004240480A (ja) * | 2003-02-03 | 2004-08-26 | Matsushita Electric Ind Co Ltd | 運転支援装置 |
KR20050036179A (ko) * | 2003-10-15 | 2005-04-20 | 현대자동차주식회사 | 차량의 전방 감시장치 및 방법 |
JP4638143B2 (ja) * | 2003-12-26 | 2011-02-23 | 富士重工業株式会社 | 車両用運転支援装置 |
US7406182B2 (en) * | 2004-03-31 | 2008-07-29 | Fujifilm Corporation | Image capturing apparatus, image capturing method, and machine readable medium storing thereon image capturing program |
US8082101B2 (en) | 2004-04-08 | 2011-12-20 | Mobileye Technologies Ltd. | Collision warning system |
JP2005311868A (ja) * | 2004-04-23 | 2005-11-04 | Auto Network Gijutsu Kenkyusho:Kk | 車両周辺視認装置 |
JP4586416B2 (ja) * | 2004-05-20 | 2010-11-24 | 日産自動車株式会社 | 運転支援装置 |
JP3833241B2 (ja) * | 2004-06-15 | 2006-10-11 | 松下電器産業株式会社 | 監視装置 |
CN101019151A (zh) * | 2004-07-23 | 2007-08-15 | 松下电器产业株式会社 | 图像处理装置以及图像处理方法 |
EP1779061B1 (en) * | 2004-08-04 | 2013-11-06 | Intergraph Software Technologies Company | Method and computer program product for preparing and comparing composite images with non-uniform resolution |
JP2006054662A (ja) * | 2004-08-11 | 2006-02-23 | Mitsubishi Electric Corp | 運転支援装置 |
US7639841B2 (en) * | 2004-12-20 | 2009-12-29 | Siemens Corporation | System and method for on-road detection of a vehicle using knowledge fusion |
US7609290B2 (en) * | 2005-01-28 | 2009-10-27 | Technology Advancement Group, Inc. | Surveillance system and method |
KR100782811B1 (ko) * | 2005-02-04 | 2007-12-06 | 삼성전자주식회사 | 영상의 주파수 특성에 따라 포맷을 달리하는 스테레오 영상 합성 방법 및 장치와, 그 영상의 송신 및 수신 방법과, 그 영상의 재생 방법 및 장치 |
WO2006092431A1 (de) * | 2005-03-03 | 2006-09-08 | Continental Teves Ag & Co. Ohg | Verfahren und vorrichtung zum vermeiden einer kollision bei einem spurwechsel eines fahrzeugs |
JP4544028B2 (ja) * | 2005-05-13 | 2010-09-15 | 日産自動車株式会社 | 車載画像処理装置、および画像処理方法 |
JP4626400B2 (ja) * | 2005-05-25 | 2011-02-09 | 日産自動車株式会社 | 俯瞰画像表示装置及び俯瞰画像表示方法 |
KR100666276B1 (ko) * | 2005-07-13 | 2007-01-10 | 현대자동차주식회사 | 소실점을 이용한 자동차의 차선이탈 경보방법 |
SE529304C2 (sv) * | 2005-09-06 | 2007-06-26 | Gm Global Tech Operations Inc | Metod och system för förbättrande av trafiksäkerhet |
JP4353162B2 (ja) * | 2005-09-26 | 2009-10-28 | トヨタ自動車株式会社 | 車輌周囲情報表示装置 |
FR2891934B1 (fr) * | 2005-10-12 | 2008-01-18 | Valeo Electronique Sys Liaison | Dispositif de traitement de donnees video pour un vehicule automobile |
JP2007148835A (ja) * | 2005-11-28 | 2007-06-14 | Fujitsu Ten Ltd | 物体判別装置、報知制御装置、物体判別方法および物体判別プログラム |
JP4757085B2 (ja) * | 2006-04-14 | 2011-08-24 | キヤノン株式会社 | 撮像装置及びその制御方法、画像処理装置、画像処理方法、及びプログラム |
JP4846426B2 (ja) * | 2006-04-20 | 2011-12-28 | パナソニック株式会社 | 車両周囲監視装置 |
JP4847884B2 (ja) * | 2007-01-31 | 2011-12-28 | オプトレックス株式会社 | 障害物検出装置、車両用表示装置及び障害物検出方法 |
JP5271511B2 (ja) * | 2007-06-14 | 2013-08-21 | 富士通テン株式会社 | 運転支援装置および画像表示装置 |
EP2240796A4 (en) * | 2008-01-22 | 2012-07-11 | Magna Int Inc | USE OF A SINGLE CAMERA FOR SEVERAL TRAVEL SUPPORT SERVICES, PARKING AIDS, CLUTCH ASSIST AND TAILGATE PROTECTION |
JP5337170B2 (ja) | 2008-02-08 | 2013-11-06 | グーグル インコーポレイテッド | タイミング調節されるシャッターを用いる複数のイメージセンサーを有するパノラマカメラ |
JP4513871B2 (ja) * | 2008-02-15 | 2010-07-28 | ソニー株式会社 | 画像処理方法、画像処理プログラムおよび画像処理装置 |
JP4986069B2 (ja) * | 2008-03-19 | 2012-07-25 | マツダ株式会社 | 車両用周囲監視装置 |
DE102008060684B4 (de) * | 2008-03-28 | 2019-05-23 | Volkswagen Ag | Verfahren und Vorrichtung zum automatischen Einparken eines Kraftfahrzeugs |
KR100929689B1 (ko) * | 2008-09-08 | 2009-12-03 | 재단법인대구경북과학기술원 | 스테레오 영상 정보를 이용한 차량용 사고기록 장치 및 그 제어 방법 |
KR100971731B1 (ko) | 2008-11-07 | 2010-07-22 | 주식회사 케이제이몰 | 차량용 영상 저장 시스템에서의 영상 처리 장치 및 방법 |
KR100997617B1 (ko) | 2008-11-20 | 2010-12-01 | 재단법인대구경북과학기술원 | 교통 수단의 사고 기록 장치, 방법 및 시스템 |
EP2416292A4 (en) * | 2009-03-31 | 2014-09-03 | Konica Minolta Holdings Inc | IMAGE INTEGRATION UNIT AND IMAGE INTEGRATION PROCESS |
JP4733756B2 (ja) * | 2009-04-28 | 2011-07-27 | 本田技研工業株式会社 | 車両周辺監視装置 |
DE102009038406B4 (de) * | 2009-08-24 | 2017-10-05 | Volkswagen Ag | Verfahren und Vorrichtung zur Vermessung des Umfeldes eines Kraftfahrzeugs |
JP5039765B2 (ja) * | 2009-09-17 | 2012-10-03 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
WO2011036892A1 (ja) * | 2009-09-24 | 2011-03-31 | パナソニック株式会社 | 運転支援表示装置 |
US8633810B2 (en) | 2009-11-19 | 2014-01-21 | Robert Bosch Gmbh | Rear-view multi-functional camera system |
JP2011118482A (ja) * | 2009-11-30 | 2011-06-16 | Fujitsu Ten Ltd | In-vehicle device and cognition support system |
CN103942544B (zh) | 2009-12-22 | 2017-11-28 | Panasonic Corporation | Motion analysis device |
CN102275558B (zh) * | 2010-06-12 | 2013-01-23 | Automotive Research & Testing Center | Dual-vision front vehicle safety warning device and method thereof |
JP5609597B2 (ja) * | 2010-12-02 | 2014-10-22 | Fujitsu Limited | Contact possibility detection device, contact possibility detection method, and program |
JP5695405B2 (ja) * | 2010-12-10 | 2015-04-08 | Toshiba Alpine Automotive Technology Corporation | Vehicle image processing apparatus and vehicle image processing method |
KR101243108B1 (ко) | 2010-12-30 | 2013-03-12 | Wise Automotive Corporation | Apparatus and method for displaying the rear image of a vehicle |
JP2012155655A (ja) * | 2011-01-28 | 2012-08-16 | Sony Corp | Information processing apparatus, notification method, and program |
JP5329582B2 (ja) | 2011-02-09 | 2013-10-30 | Honda Motor Co., Ltd. | Vehicle periphery monitoring device |
CN103797530B (zh) * | 2011-09-21 | 2015-09-09 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring device |
DE102011084554A1 (de) * | 2011-10-14 | 2013-04-18 | Robert Bosch Gmbh | Method for displaying the surroundings of a vehicle |
CN103085716A (zh) * | 2011-10-31 | 2013-05-08 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Traffic accident prevention system and method |
JP5760999B2 (ja) * | 2011-12-02 | 2015-08-12 | Toyota Motor Corporation | Image processing apparatus and image processing method |
KR101340014B1 (ко) * | 2011-12-09 | 2013-12-10 | SL Corporation | Apparatus and method for providing position information |
WO2013099169A1 (ja) | 2011-12-27 | 2013-07-04 | Panasonic Corporation | Stereo imaging device |
CN103448652B (zh) * | 2012-06-04 | 2016-06-15 | HTC Corporation | Driving warning method and electronic device therefor |
CN103522952A (zh) * | 2012-07-06 | 2014-01-22 | Kunda Computer Technology (Kunshan) Co., Ltd. | Alarm device and method for warning of danger while driving |
JP5695000B2 (ja) * | 2012-08-13 | 2015-04-01 | Honda Motor Co., Ltd. | Vehicle periphery monitoring device |
TWI494234B (zh) * | 2012-08-24 | 2015-08-01 | Altek Autotronics Corp | Driving assistance system and activation method thereof |
JP6084434B2 (ja) * | 2012-10-31 | 2017-02-22 | Clarion Co., Ltd. | Image processing system and image processing method |
CN104118380B (zh) * | 2013-04-26 | 2017-11-24 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Driving detection system and method |
WO2014185169A1 (ja) * | 2013-05-16 | 2014-11-20 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10171775B1 (en) * | 2013-05-31 | 2019-01-01 | Vecna Technologies, Inc. | Autonomous vehicle vision system |
KR20150051389A (ко) * | 2013-11-04 | 2015-05-13 | Hyundai Mobis Co., Ltd. | Image processing method and apparatus therefor |
EP3085074B1 (en) * | 2013-12-19 | 2020-02-26 | Intel Corporation | Bowl-shaped imaging system |
US9449234B2 (en) * | 2014-03-31 | 2016-09-20 | International Business Machines Corporation | Displaying relative motion of objects in an image |
JP6264173B2 (ja) | 2014-04-18 | 2018-01-24 | Fujitsu Limited | Method for determining the normality of an imaging direction, and program and apparatus for evaluating the mounting state of an imaging device |
JP6299371B2 (ja) | 2014-04-18 | 2018-03-28 | Fujitsu Limited | Method, program, and apparatus for detecting tilt of an imaging direction |
JP6299373B2 (ja) | 2014-04-18 | 2018-03-28 | Fujitsu Limited | Method, program, and apparatus for determining the normality of an imaging direction |
US9355547B2 (en) | 2014-05-22 | 2016-05-31 | International Business Machines Corporation | Identifying a change in a home environment |
US9613274B2 (en) * | 2014-05-22 | 2017-04-04 | International Business Machines Corporation | Identifying an obstacle in a route |
CN106461387B (zh) * | 2014-05-28 | 2020-11-20 | Kyocera Corporation | Stereo camera apparatus and vehicle provided with a stereo camera |
CN104200213B (zh) * | 2014-08-12 | 2018-07-17 | Hefei University of Technology | Vehicle detection method based on multiple parts |
JP5949861B2 (ja) * | 2014-09-05 | 2016-07-13 | Toyota Motor Corporation | Approaching-object detection device and approaching-object detection method for a vehicle |
CN104477126A (зh) * | 2014-12-12 | 2015-04-01 | Changzhou Boxian Automotive Electronics Co., Ltd. | Active seat belt device for automobiles |
JP5880741B1 (ja) * | 2014-12-26 | 2016-03-09 | The Yokohama Rubber Co., Ltd. | Collision avoidance system and collision avoidance method |
WO2016103468A1 (ja) * | 2014-12-26 | 2016-06-30 | The Yokohama Rubber Co., Ltd. | Collision avoidance system and collision avoidance method |
JP6564577B2 (ja) * | 2015-02-16 | 2019-08-21 | Shuichi Tayama | Approaching-object alarm and notification device for an automobile |
KR101639722B1 (ко) * | 2015-05-26 | 2016-07-15 | PLK Technologies Co., Ltd. | Vanishing point correction apparatus and method |
DE112016003912T5 (de) * | 2015-08-31 | 2018-05-09 | Mitsubishi Electric Corporation | Image processing device, image processing method, and program |
US11648876B2 (en) | 2015-09-02 | 2023-05-16 | SMR Patents S.à.r.l. | System and method for visibility enhancement |
EP3139340B1 (en) * | 2015-09-02 | 2019-08-28 | SMR Patents S.à.r.l. | System and method for visibility enhancement |
US10706580B2 (en) * | 2015-12-09 | 2020-07-07 | Hajime Kasahara | Position-information specifying method, position-information specifying device, and position-information specifying program |
KR102565485B1 (ко) * | 2016-01-11 | 2023-08-14 | Electronics and Telecommunications Research Institute | Server and method for providing a city street search service |
DE102016104730A1 (de) * | 2016-03-15 | 2017-09-21 | Connaught Electronics Ltd. | Method for detecting an object along a road of a motor vehicle, computing device, driver assistance system, and motor vehicle |
US20170297488A1 (en) * | 2016-04-19 | 2017-10-19 | GM Global Technology Operations LLC | Surround view camera system for object detection and tracking |
KR102529119B1 (ко) | 2016-06-27 | 2023-05-04 | Samsung Electronics Co., Ltd. | Method, device, and recording medium for obtaining depth information of an object |
IT201600094858A1 (it) * | 2016-09-21 | 2018-03-21 | St Microelectronics Srl | Method for a low-cost advanced cross-traffic alert, corresponding processing system, cross-traffic alert system, and vehicle |
US20180374237A1 (en) * | 2017-06-23 | 2018-12-27 | Canon Kabushiki Kaisha | Method, system and apparatus for determining a pose for an object |
KR101849326B1 (ко) * | 2017-08-31 | 2018-04-17 | Pintel Inc. | Camera system for a vehicle |
JP6958147B2 (ja) * | 2017-09-07 | 2021-11-02 | Toyota Motor Corporation | Image display device |
CN109263557B (зh) * | 2018-11-19 | 2020-10-09 | VIA Technologies, Inc. | Vehicle blind-spot detection method |
CN110435650A (зh) * | 2019-08-22 | 2019-11-12 | Aiways Automobile Co., Ltd. | Vehicle rear-collision emergency avoidance method, system, device, and storage medium |
CN111813116A (зh) * | 2020-07-09 | 2020-10-23 | Hainan Fakong Smart Environment Construction Group Co., Ltd. | Obstacle avoidance assistance system based on a three-dimensional model |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4851444A (ja) * | 1971-11-02 | 1973-07-19 | ||
JPH10341430A (ja) * | 1997-06-06 | 1998-12-22 | Yazaki Corp | Forward vehicle detection method and vehicle forward monitoring system |
JPH11213295A (ja) * | 1998-01-28 | 1999-08-06 | Kansei Corp | Vehicle cut-in detection circuit and rear-end collision warning device using the same |
JP2000113164A (ja) * | 1998-09-30 | 2000-04-21 | Honda Motor Co Ltd | Object detection device using difference images |
JP2000207693A (ja) * | 1999-01-08 | 2000-07-28 | Nissan Motor Co Ltd | In-vehicle obstacle detection device |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2215576C3 (de) | 1972-03-30 | 1988-09-29 | Ernst Leitz Wetzlar Gmbh, 6330 Wetzlar | Device for measuring, controlling and/or displaying the movement of land vehicles |
CH564778A5 (ja) | 1972-03-30 | 1975-07-31 | Leitz Ernst Gmbh | |
DE3637165A1 (de) * | 1986-10-31 | 1988-05-05 | Rainer Ashauer | Method and device for preventing collisions, in particular for motor vehicles in road traffic |
US5670935A (en) | 1993-02-26 | 1997-09-23 | Donnelly Corporation | Rearview vision system for vehicle including panoramic view |
JP3239521B2 (ja) * | 1993-03-30 | 2001-12-17 | Toyota Motor Corporation | Moving object recognition device |
JPH06333200A (ja) | 1993-05-21 | 1994-12-02 | Toshiba Corporation | In-vehicle monitoring system |
JPH0717328A (ja) | 1993-06-30 | 1995-01-20 | Mitsubishi Motors Corporation | Vehicle surroundings recognition assistance device |
JP3063481B2 (ja) | 1993-09-21 | 2000-07-12 | Nissan Motor Co., Ltd. | Vehicle object detection device |
JP3381351B2 (ja) | 1993-12-24 | 2003-02-24 | Nissan Motor Co., Ltd. | Vehicle surrounding-situation display device |
US6118475A (en) * | 1994-06-02 | 2000-09-12 | Canon Kabushiki Kaisha | Multi-eye image pickup apparatus, and method and apparatus for measuring or recognizing three-dimensional shape |
JP3117902B2 (ja) | 1995-07-28 | 2000-12-18 | Aichi Corporation | Working-range restriction device for an aerial work platform vehicle |
US5886744A (en) * | 1995-09-08 | 1999-03-23 | Intel Corporation | Method and apparatus for filtering jitter from motion estimation video data |
JP3485135B2 (ja) | 1995-09-27 | 2004-01-13 | Yazaki Corporation | Vehicle rear-side monitoring device |
JP3765862B2 (ja) * | 1996-02-15 | 2006-04-12 | Honda Motor Co., Ltd. | Vehicle environment recognition device |
JPH09249083A (ja) * | 1996-03-15 | 1997-09-22 | Toshiba Corporation | Moving object identification apparatus and method |
JP3456843B2 (ja) | 1996-07-22 | 2003-10-14 | Fujitsu Limited | Forward inter-vehicle distance measuring device |
JPH10222665A (ja) * | 1997-01-31 | 1998-08-21 | Fujitsu Ten Ltd | Image recognition device |
JP3841323B2 (ja) | 1997-07-07 | 2006-11-01 | Yazaki Corporation | Vehicle rear-side monitoring method and vehicle rear-side monitoring device |
JPH1142988A (ja) | 1997-07-25 | 1999-02-16 | Yazaki Corporation | Vehicle rear-side monitoring method and vehicle rear-side monitoring device |
JP3464368B2 (ja) | 1997-07-25 | 2003-11-10 | Yazaki Corporation | Vehicle rear-side monitoring device |
US6674430B1 (en) * | 1998-07-16 | 2004-01-06 | The Research Foundation Of State University Of New York | Apparatus and method for real-time volume processing and universal 3D rendering |
JP3596314B2 (ja) * | 1998-11-02 | 2004-12-02 | Nissan Motor Co., Ltd. | Object-edge position measuring device and moving-object passage determination device |
JP2000242797A (ja) | 1999-02-18 | 2000-09-08 | Toyota Motor Corporation | Image motion detection method and object detection device |
JP2000241120A (ja) * | 1999-02-23 | 2000-09-08 | Fanuc Ltd | Measuring device |
US6535114B1 (en) * | 2000-03-22 | 2003-03-18 | Toyota Jidosha Kabushiki Kaisha | Method and apparatus for environment recognition |
- 2000
- 2000-09-20 EP EP00961092A patent/EP1223083B1/en not_active Expired - Lifetime
- 2000-09-20 US US10/088,593 patent/US6993159B1/en not_active Expired - Lifetime
- 2000-09-20 CN CNB008128669A patent/CN1160210C/zh not_active Expired - Lifetime
- 2000-09-20 DE DE60009114T patent/DE60009114T2/de not_active Expired - Lifetime
- 2000-09-20 JP JP2001524839A patent/JP3300340B2/ja not_active Expired - Lifetime
- 2000-09-20 KR KR10-2002-7003649A patent/KR100466458B1/ko active IP Right Grant
- 2000-09-20 WO PCT/JP2000/006393 patent/WO2001021446A1/ja active IP Right Grant
- 2002
- 2002-02-22 JP JP2002046169A patent/JP4091775B2/ja not_active Expired - Lifetime
Non-Patent Citations (1)
Title |
---|
See also references of EP1223083A4 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002039717A (ja) * | 2000-07-28 | 2002-02-06 | Victor Co Of Japan Ltd | Imaging device |
US7652686B2 (en) | 2001-06-28 | 2010-01-26 | Robert Bosch Gmbh | Device for image detecting objects, people or similar in the area surrounding a vehicle |
US7158664B2 (en) * | 2001-11-09 | 2007-01-02 | Honda Giken Kogyo Kabushiki Kaisha | Image recognition apparatus |
US7474765B2 (en) * | 2001-11-09 | 2009-01-06 | Honda Giken Kogyo Kabushiki Kaisha | Image recognition apparatus |
DE10218228A1 (de) * | 2002-04-24 | 2003-11-06 | Volkswagen Ag | Method and device for checking the function of a video camera in a vehicle |
JP2004147210A (ja) * | 2002-10-25 | 2004-05-20 | Matsushita Electric Ind Co Ltd | Driving support device |
JP2007510575A (ja) * | 2003-11-11 | 2007-04-26 | Technics AG | Device for recording the driving and/or traffic situation of a vehicle and method for evaluating the recorded data |
WO2005079060A1 (ja) * | 2004-02-16 | 2005-08-25 | Matsushita Electric Industrial Co., Ltd. | Driving support device |
WO2016084506A1 (ja) * | 2014-11-28 | 2016-06-02 | Denso Corporation | Vehicle travel control device and travel control method |
CN107000749A (зh) * | 2014-11-28 | 2017-08-01 | Denso Corporation | Vehicle travel control device and travel control method |
JP2020086491A (ja) * | 2018-11-15 | 2020-06-04 | Ricoh Co., Ltd. | Information processing apparatus, information processing system, and information processing method |
Also Published As
Publication number | Publication date |
---|---|
JP4091775B2 (ja) | 2008-05-28 |
KR100466458B1 (ko) | 2005-01-14 |
JP2002335524A (ja) | 2002-11-22 |
EP1223083A1 (en) | 2002-07-17 |
CN1160210C (zh) | 2004-08-04 |
US6993159B1 (en) | 2006-01-31 |
EP1223083A4 (en) | 2003-03-19 |
JP3300340B2 (ja) | 2002-07-08 |
DE60009114T2 (de) | 2004-08-05 |
DE60009114D1 (de) | 2004-04-22 |
KR20020033817A (ko) | 2002-05-07 |
CN1373720A (zh) | 2002-10-09 |
EP1223083B1 (en) | 2004-03-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3300340B2 (ja) | Driving support device | |
JP4687160B2 (ja) | Surroundings monitoring device | |
KR102344171B1 (ко) | Image generation apparatus, image generation method, and program | |
JP4883977B2 (ja) | Vehicle image display device | |
WO2012043184A1 (ja) | Parking assistance device | |
KR101611194B1 (ко) | Apparatus and method for generating a vehicle surroundings image | |
JP2012071635A5 (ja) ||
JP2009017462A (ja) | Driving support system and vehicle | |
JP2004114977A (ja) | Moving-object surroundings monitoring device | |
JP2003044996A (ja) | Obstacle detection device | |
JP5516998B2 (ja) | Image generation device | |
KR20100113959A (ко) | Vehicle surroundings image display system | |
JP5495071B2 (ja) | Vehicle periphery monitoring device | |
JP4601505B2 (ja) | Top-view image generation device and top-view image display method | |
JP2004240480A (ja) | Driving support device | |
JP2006268076A (ja) | Driving support system | |
JP4192680B2 (ja) | Moving-object surroundings monitoring device | |
JP4154980B2 (ja) | Moving-object surroundings monitoring device | |
JP2008048094A (ja) | Vehicle video display device and method for displaying vehicle-surroundings video | |
JP5271186B2 (ja) | Vehicle image display device | |
CN108973858A (зh) | Device for ensuring the safety of a driving route | |
JP4207519B2 (ja) | Moving-object surroundings monitoring device | |
JP6999239B2 (ja) | Image processing apparatus and image processing method | |
JP2017117357A (ja) | Three-dimensional object detection device | |
JP4310987B2 (ja) | Moving-object surroundings monitoring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
ENP | Entry into the national phase |
Ref country code: JP Ref document number: 2001 524839 Kind code of ref document: A Format of ref document f/p: F |
|
WWE | Wipo information: entry into national phase |
Ref document number: 008128669 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10088593 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020027003649 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2000961092 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020027003649 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2000961092 Country of ref document: EP |
|
WWG | Wipo information: grant in national office |
Ref document number: 2000961092 Country of ref document: EP |
|
WWG | Wipo information: grant in national office |
Ref document number: 1020027003649 Country of ref document: KR |