CN107003389A - Driver assistance system for a vehicle and method for assisting a driver of a vehicle - Google Patents

Driver assistance system for a vehicle and method for assisting a driver of a vehicle

Info

Publication number
CN107003389A
CN107003389A (application CN201480083866.1A)
Authority
CN
China
Prior art keywords
vehicle
sensor
video camera
driver assistance
assistance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201480083866.1A
Other languages
Chinese (zh)
Inventor
卡斯滕·考什
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of CN107003389A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/04Display arrangements
    • G01S7/06Cathode-ray tube displays or other two dimensional or three-dimensional displays
    • G01S7/062Cathode-ray tube displays or other two dimensional or three-dimensional displays in which different colours are used
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323Alternative operation using light waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9324Alternative operation using ultrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93273Sensor installation details on the top of the vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to a driver assistance system for a vehicle (10). The driver assistance system comprises a sensor system (12) for observing at least a part of the surroundings (14) of the vehicle. The sensor system (12) comprises: at least one camera (16, 18) configured to capture an image of at least a part of the surroundings (14) and to provide image data characterizing the image; at least one sensor (20) which differs from the camera (16, 18) and is configured to capture at least one parameter characterizing at least a part of the surroundings (14), the sensor (20) being configured to provide sensor data characterizing the parameter; and a data fusion unit configured to merge the image data and the sensor data so as to produce a virtual three-dimensional image of at least a part of the surroundings (14).

Description

Driver assistance system for a vehicle and method for assisting a driver of a vehicle
Technical field
The present invention relates to a driver assistance system for a vehicle and to a method for assisting a driver in driving a vehicle.
Background art
US 2007/0106440 A1 shows a vehicle parking assistance system comprising a camera device for taking pictures outside the vehicle. The vehicle parking assistance system further comprises an image generating device for generating a bird's-eye view image by subjecting the pictures to a bird's-eye view conversion. The vehicle parking assistance system further comprises: a motion detecting device for detecting a motion of the vehicle; and a combined-image generating device for generating a combined bird's-eye view image, based on the motion of the vehicle, by combining a past bird's-eye view image previously generated by the image generating device with a currently generated bird's-eye view image, or by combining previously generated past bird's-eye view images with one another. The vehicle parking assistance system further comprises an obstacle detecting device for detecting an obstacle within a fan-shaped detection area around the vehicle, and a display device for displaying, superimposed on the combined bird's-eye view image, a current detection point and past detection points for the obstacle. The display device shows the current detection point as a series of marks extending in an arc from a predetermined position in the fan-shaped detection area along the detection area of the obstacle detecting device, and shows the past detection points as marks at predetermined positions within the fan-shaped detection area.
In addition, US 2012/0154592 A1 shows an image processing system operable to process image data obtained by capturing images of the exterior surroundings of a vehicle. The image processing system comprises a plurality of image capturing units which are fixed to the vehicle and generate image data by capturing images of the vehicle's exterior surroundings. The image processing system further comprises a bird's-eye view image drawing unit configured to generate bird's-eye view images by determining, based on the image data, a viewpoint above the vehicle for each item of image data generated by the image capturing units, such that the ends of the physical spaces corresponding to two adjacent bird's-eye view images overlap each other.
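For readers unfamiliar with the bird's-eye view conversion referred to above, the following is a minimal, generic sketch (not taken from either cited document) of how a camera picture is commonly warped into a top-down view with a perspective transform; the corner coordinates and output size are placeholder values chosen for the example.

```python
import cv2
import numpy as np

def birds_eye_view(image, src_corners, dst_size=(400, 600)):
    """Warp a camera picture into a top-down (bird's-eye) view.

    src_corners: four pixel coordinates in the camera image outlining a
    rectangular patch of ground (e.g. a parking space), ordered top-left,
    top-right, bottom-right, bottom-left. Values here are illustrative only.
    """
    w, h = dst_size
    dst_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    transform = cv2.getPerspectiveTransform(np.float32(src_corners), dst_corners)
    return cv2.warpPerspective(image, transform, dst_size)

# Example call with placeholder corner coordinates:
# top_down = birds_eye_view(frame, [(420, 300), (860, 300), (1180, 700), (100, 700)])
```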
Summary of the invention
Technical problem
It is an object of the invention to provide a driver assistance system and a method by means of which the driver of a vehicle can be assisted particularly advantageously in driving the vehicle.
This object is solved by a driver assistance system having the features of patent claim 1 and by a method having the features of patent claim 10. Advantageous embodiments with advantageous developments of the invention are indicated in the other patent claims.
Technical solution
A first aspect of the invention relates to a driver assistance system for a vehicle, the driver assistance system comprising a sensor system for observing at least a part of the vehicle's surroundings.
The sensor system comprises at least one camera configured to capture an image of at least a part of the surroundings. In addition, the camera is configured to provide image data characterizing the image. The sensor system further comprises at least one sensor which differs from the camera. The sensor is configured to capture at least one parameter characterizing at least a part of the surroundings. For example, an object arranged in a part of the surroundings can be detected by the sensor and characterized by the parameter, the sensor being configured to provide sensor data characterizing the parameter. In other words, the sensor data represent the parameter and thus, for example, the object detected by the sensor.
In addition, the sensor system comprises a data fusion unit configured to merge the image data and the sensor data so as to produce a virtual three-dimensional image of at least a part of the surroundings. For example, the image data and the sensor data are transmitted to the data fusion unit, which receives them and combines them in such a way that a three-dimensional image of at least a part of the surroundings is produced on the basis of the received data. By producing the three-dimensional image of the vehicle's surroundings, the driver of the vehicle can be assisted particularly effectively in driving. For example, the driver can be assisted in driving through dense traffic, so that the risk of a collision between the vehicle and other objects, in particular other traffic participants, can be kept particularly low. Furthermore, the driver can be assisted in parking, so that the vehicle can be parked in a particularly narrow parking space without colliding with objects delimiting the parking space. Moreover, the driver assistance system according to the invention can help the driver stay calm and in control in chaotic traffic.
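The patent does not disclose a concrete fusion algorithm. As a purely illustrative sketch, the following Python code (all class and function names are assumptions made for the example) pairs camera detections with range measurements from a second sensor by bearing and places the resulting objects in a vehicle-centred coordinate frame, i.e. a very simple form of the merged three-dimensional scene described above.

```python
import math
from dataclasses import dataclass

@dataclass
class CameraDetection:        # derived from the image data
    bearing_deg: float        # direction of the object as seen by the camera
    label: str                # e.g. "car", "cyclist", "pedestrian"

@dataclass
class RangeMeasurement:       # derived from the sensor data (e.g. ultrasonic)
    bearing_deg: float
    distance_m: float

@dataclass
class FusedObject:            # one entry of the virtual 3D scene
    x_m: float                # longitudinal position relative to the vehicle
    y_m: float                # lateral position relative to the vehicle
    distance_m: float
    label: str

def fuse(camera_detections, range_measurements, max_bearing_diff_deg=5.0):
    """Merge image-based detections with range measurements into scene objects."""
    scene = []
    for det in camera_detections:
        nearest = min(range_measurements,
                      key=lambda m: abs(m.bearing_deg - det.bearing_deg),
                      default=None)
        if nearest and abs(nearest.bearing_deg - det.bearing_deg) <= max_bearing_diff_deg:
            phi = math.radians(det.bearing_deg)
            scene.append(FusedObject(x_m=nearest.distance_m * math.cos(phi),
                                     y_m=nearest.distance_m * math.sin(phi),
                                     distance_m=nearest.distance_m,
                                     label=det.label))
    return scene
```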
In an advantageous embodiment of the invention, the driver assistance system comprises at least one display unit configured to present the three-dimensional image to the driver of the vehicle. The driver can thus visually perceive at least a part of the surroundings in three dimensions via the display unit showing the three-dimensional image, so that the current traffic situation is visualized and the driver's field of view can be extended by the driver assistance system. The risk of a collision between the vehicle and other traffic participants can therefore be kept particularly low. For example, on the basis of the three-dimensional image shown on the display, the driver can drive the vehicle through a very narrow gap left by other traffic participants without colliding with them.
In a further advantageous embodiment of the invention, the display unit is configured to present objects in different colors on the basis of at least one predefined criterion. Such an object is part of the three-dimensional image shown by the display unit; it differs from the vehicle, is arranged in the surroundings of the vehicle and is detected by the sensor system. The criterion can be, for example, the distance between the detected object and the vehicle and/or the relative velocity between the vehicle and the object and/or the respective direction of movement of the vehicle and of the object. For example, an object approaching the vehicle is shown in red because the vehicle could potentially collide with it. Objects moving away from the vehicle and/or objects at a constant distance from the vehicle, for which the risk of collision is particularly low, are shown in grey. In other words, the display unit is configured to show objects in the surroundings of the vehicle in different colors depending on a determined hazard situation.
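The patent leaves the exact criterion open. As one possible reading, the sketch below (thresholds and names are assumptions for illustration only) derives the display color from the distance to the object and its range rate, where a negative range rate means the object is approaching the vehicle.

```python
def display_color(distance_m: float, range_rate_mps: float,
                  danger_distance_m: float = 5.0) -> str:
    """Choose a display color for a detected object.

    range_rate_mps < 0: the distance between vehicle and object is shrinking.
    The 5 m threshold is an illustrative assumption, not a value from the patent.
    """
    approaching = range_rate_mps < 0.0
    if approaching and distance_m < danger_distance_m:
        return "red"    # potential collision object
    return "grey"       # moving away or keeping its distance
```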
In a further advantageous embodiment of the invention, the camera is configured as an ultra-high-definition camera (UHD camera) or as a light-fidelity camera (LiFi camera). Because the camera has a high resolution, at least a part of the surroundings can be detected and observed particularly precisely. A light-fidelity camera, also referred to as a LiFi camera, uses an optical data transmission system also known as visible light communication.
In a further advantageous embodiment of the invention, the sensor is arranged above the roof of the vehicle, so that a particularly large part of the surroundings can be detected by the sensor.
Alternatively or in addition, the sensor and/or the camera can be mounted on the roof of the vehicle, which allows the surroundings of the vehicle to be detected particularly effectively.
In a further advantageous embodiment of the invention, the sensor is configured as an electromagnetic sensor or as an ultrasonic sensor. Objects arranged very close to the vehicle can therefore be detected accurately. Furthermore, the distance between such an object and the vehicle can be detected particularly precisely.
Preferably, the sensor system is configured as a near-range sensor system with a detection range of up to 15 meters. The close surroundings of the vehicle can therefore be observed particularly precisely, so that the risk of a collision between the vehicle and other objects can be kept particularly low. In addition, a near-range surroundings overview assistant can be implemented which presents a three-dimensional overview of the vehicle's near-range surroundings to the driver. The driver assistance system can thus assist with overtaking maneuvers and/or parking maneuvers and/or maneuvering at low speed.
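A near-range overview could, for instance, simply restrict the fused scene to objects within the stated 15 m detection range; the helper below is a minimal sketch that reuses the hypothetical FusedObject type from the earlier example.

```python
def near_range_overview(scene, detection_range_m: float = 15.0):
    """Keep only scene objects inside the near-range detection range (up to 15 m)."""
    return [obj for obj in scene if obj.distance_m <= detection_range_m]
```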
In a further embodiment of the invention, the sensor system comprises at least one sensor strip on the outside of the body of the vehicle, the sensor strip extending over at least the major part of the length of the vehicle, the length extending in the longitudinal direction of the vehicle. Since the sensor strip preferably extends around the entire body of the vehicle, the surroundings of the vehicle can be detected over 360° around the vehicle.
In a further advantageous embodiment of the invention, the sensor system comprises at least one sensor box arranged above the roof of the vehicle. In addition, the sensor system comprises a plurality of sensors arranged in the sensor box, so that the surroundings of the vehicle can be observed particularly precisely.
The invention further relates to a method for assisting a driver in driving a vehicle. The method comprises observing at least a part of the surroundings of the vehicle by means of a sensor system, the sensor system comprising at least one camera, at least one sensor differing from the camera, and a data fusion unit. The method further comprises capturing an image of at least a part of the surroundings by means of the camera. Image data characterizing the image are provided by the camera. At least one parameter characterizing at least a part of the surroundings is captured by the sensor. In addition, sensor data characterizing the parameter are provided by the sensor. Furthermore, the image data and the sensor data are merged by the data fusion unit so as to produce a virtual three-dimensional image of at least a part of the surroundings. The advantageous embodiments and advantages of the driver assistance system according to the invention are to be regarded as advantageous embodiments and advantages of the method according to the invention, and vice versa.
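Read as a processing loop, these method steps could be arranged as in the following sketch; the camera, sensor, fusion-unit and display interfaces are hypothetical placeholders introduced for the example, not part of the patent.

```python
def assist_driver(camera, sensor, fusion_unit, display):
    """One cycle of the assistance method: observe, capture, merge, present."""
    image = camera.capture_image()                  # image of a part of the surroundings
    image_data = camera.provide_image_data(image)   # data characterizing the image
    parameter = sensor.capture_parameter()          # e.g. distance to a detected object
    sensor_data = sensor.provide_sensor_data(parameter)
    scene_3d = fusion_unit.merge(image_data, sensor_data)  # virtual 3D image
    display.present(scene_3d)                       # shown to the driver
```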
Further advantages, features and details of the invention result from the following description of preferred embodiments and from the drawing. The features and combinations of features mentioned above in the description, as well as the features and combinations of features shown in the following description of the figures and/or in the figures alone, can be used not only in the respectively indicated combination but also in any other combination or on their own, without departing from the scope of the invention.
Brief description of the drawings
Fig. 1 is a schematic side view of a vehicle in the form of a passenger car, comprising a driver assistance system for assisting the driver in driving the vehicle; and
Fig. 2 is a schematic front view of a display unit of the driver assistance system, the display unit showing a three-dimensional image of at least a part of the surroundings of the vehicle.
Detailed description of embodiments
Fig. 1 shows a vehicle 10 in the form of a passenger car. The vehicle 10 comprises a driver assistance system having a sensor system 12 for observing at least a part of the surroundings 14 of the vehicle 10. The sensor system comprises a plurality of cameras 16 and 18. For example, the camera 16 is configured as a LiFi camera or as an ultra-high-definition camera (UHD camera), whereas the camera 18 is configured as a conventional camera. In addition, the sensor system 12 comprises a plurality of sensors 20 configured as ultrasonic sensors. Furthermore, the sensor system 12 comprises a plurality of sensors in a sensor box 22 arranged above the roof 24 of the vehicle 10, the sensor box 22 being mounted on the roof 24. The sensor system 12 additionally comprises a sensor strip 26 extending around at least the major part of the periphery of the vehicle 10. In the present case, the sensor strip 26 extends around the entire body 28 of the vehicle 10 and thus around the vehicle 10 itself, the sensor strip 26 being mounted on the outside 30 of the body 28. The sensor strip 26 comprises a plurality of sensors configured, for example, as electromagnetic sensors and/or cameras and/or ultrasonic sensors. The sensors of the sensor system 12 differ from the cameras 16 and 18. The cameras 16 and 18 are configured to capture images of at least a part of the surroundings 14 and provide image data characterizing the captured images. The sensors differing from the cameras 16 and 18 are configured to capture at least one parameter characterizing at least a part of the surroundings 14. For example, these sensors are configured to detect objects arranged in the surroundings 14 and the respective distances between the detected objects and the vehicle 10. In addition, these sensors are configured to provide sensor data characterizing the parameter, the parameter in turn characterizing the detected objects and distances.
In addition, the sensor system 12 comprises a data fusion unit which receives the image data and the sensor data. The data fusion unit is configured to merge the received image data and the received sensor data so as to produce a virtual three-dimensional image of at least a part of the surroundings 14. As can further be seen, the sensor system 12 comprises at least one display unit 32 arranged in the interior of the vehicle 10. The display unit is configured to present the produced three-dimensional images to the driver of the vehicle 10, one of these three-dimensional images being shown in Fig. 2. The three-dimensional images presented by the display unit 32 each show, in three dimensions, the vehicle 10 itself as well as other traffic participants such as other vehicles 34 and 36, a cyclist 38 and a pedestrian 40. The driver of the vehicle 10 can therefore visually perceive the orientation of the vehicle 10 relative to the other traffic participants. The driver can thus drive the vehicle 10 in a particularly safe manner without colliding with the other traffic participants.
The sensor system 12 has a detection range of at most 15 meters, so that a near-range overview of the surroundings 14 can be presented to the driver via the display unit 32. In addition, the display unit 32, and thus the sensor system 12, is configured to present the objects detected by the sensor system 12, and hence the traffic participants, in different colors on the basis of at least one predefined criterion. In the present case, the criterion includes the distance between the vehicle 10 and the respective other traffic participant and the direction of movement of the respective other traffic participant relative to the vehicle 10. For example, because the vehicle 36 and the pedestrian 40 are not close to the vehicle 10, the vehicle 36 and the pedestrian 40 are shown in grey in the three-dimensional image. The vehicle 36, for example, moves away from the vehicle 10, so that the distance between the vehicle 10 and the vehicle 36 increases. The pedestrian 40, for example, is standing still, so that the respective distance between the pedestrian 40 and the vehicle 10 remains constant or increases. The risk of a collision between the vehicle 10 and the vehicle 36 or the pedestrian 40 is therefore below a predetermined threshold, so that the vehicle 36 and the pedestrian 40 are shown in grey in the three-dimensional image.
However, since the respective distances between the vehicle 10 and the vehicle 34 and between the vehicle 10 and the cyclist 38 are below the threshold, and since the vehicle 34 and the cyclist 38 are approaching the vehicle 10, the vehicle 34 and the cyclist 38 are shown in red, for example. The vehicle 34 and the cyclist 38 are thus potential collision objects with which the vehicle 10 could potentially collide. This means that the risk of a collision between the vehicle 10 and the vehicle 34 and between the vehicle 10 and the cyclist 38 exceeds the threshold, so that the vehicle 34 and the cyclist 38 are shown in red in the three-dimensional image.
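Using the illustrative display_color() helper from above, the situation described here could be reproduced as follows; the distances and range rates are invented for the example, only the resulting colors follow the description.

```python
print(display_color(distance_m=2.5, range_rate_mps=-1.2))   # vehicle 34, approaching   -> "red"
print(display_color(distance_m=3.0, range_rate_mps=-0.8))   # cyclist 38, approaching   -> "red"
print(display_color(distance_m=12.0, range_rate_mps=+1.5))  # vehicle 36, moving away   -> "grey"
print(display_color(distance_m=10.0, range_rate_mps=0.0))   # pedestrian 40, standing   -> "grey"
```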
By presenting the three-dimensional image to the driver, the driver can be assisted in driving in dense traffic, so that the risk of a collision between the vehicle 10 and the other traffic participants can be kept particularly low. In other words, on the basis of the presented three-dimensional image, the driver can avoid collisions between the vehicle 10 and the other traffic participants. Additionally or alternatively, the driver can be assisted in parking the vehicle in a narrow parking space without colliding with objects delimiting the parking space.

Claims (10)

1. A driver assistance system for a vehicle (10), the driver assistance system comprising a sensor system (12) for observing at least a part of the surroundings (14) of the vehicle, the sensor system (12) comprising:
- at least one camera (16, 18) configured to capture an image of at least a part of the surroundings (14) and to provide image data characterizing the image;
- at least one sensor (20) which differs from the camera (16, 18) and is configured to capture at least one parameter characterizing at least a part of the surroundings (14), the sensor (20) being configured to provide sensor data characterizing the parameter; and
- a data fusion unit configured to merge the image data and the sensor data so as to produce a virtual three-dimensional image of at least a part of the surroundings (14).
2. The driver assistance system according to claim 1,
wherein the driver assistance system (12) comprises at least one display unit (32) configured to present the three-dimensional image to the driver of the vehicle (10).
3. The driver assistance system according to claim 2,
wherein the display unit (32) is configured to present an object (34, 36, 38, 40) in different colors on the basis of at least one predefined criterion, the object (34, 36, 38, 40) being part of the three-dimensional image, differing from the vehicle (10), being arranged in the surroundings (14) of the vehicle and being detected by the sensor system (12).
4. The driver assistance system according to any one of the preceding claims,
wherein the camera (16) is configured as an ultra-high-definition camera or as a light-fidelity camera.
5. The driver assistance system according to any one of the preceding claims,
wherein the sensor is arranged above the roof (24) of the vehicle.
6. The driver assistance system according to any one of the preceding claims,
wherein the sensor and/or the camera (16, 18) is mounted on the roof (24) of the vehicle (10).
7. The driver assistance system according to any one of the preceding claims,
wherein the sensor (20) is configured as an electromagnetic sensor or as an ultrasonic sensor.
8. The driver assistance system according to any one of the preceding claims,
wherein the sensor system (12) comprises at least one sensor strip (26) on the outside of the body (28) of the vehicle (10), the sensor strip (26) extending over at least the major part of the length of the vehicle (10).
9. The driver assistance system according to any one of the preceding claims,
wherein the sensor system (12) comprises:
- at least one sensor box (22) arranged above the roof (24) of the vehicle (10); and
- a plurality of sensors arranged in the sensor box (22).
10. A method for assisting a driver in driving a vehicle (10), the method comprising:
- observing at least a part of the surroundings (14) of the vehicle by means of a sensor system (12), the sensor system (12) comprising at least one camera (16, 18), at least one sensor (20) differing from the camera (16, 18), and a data fusion unit;
- capturing an image of at least a part of the surroundings (14) by means of the camera (16, 18);
- providing, by means of the camera (16, 18), image data characterizing the image;
- capturing, by means of the sensor (20), at least one parameter characterizing at least a part of the surroundings (14);
- providing, by means of the sensor (20), sensor data characterizing the parameter; and
- merging the image data and the sensor data by means of the data fusion unit so as to produce a virtual three-dimensional image of at least a part of the surroundings (14).
CN201480083866.1A 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting a driver of a vehicle Pending CN107003389A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/066619 WO2016087900A1 (en) 2014-12-05 2014-12-05 Driver assistance system for a vehicle as well as method for assisting a driver of a vehicle

Publications (1)

Publication Number Publication Date
CN107003389A true CN107003389A (en) 2017-08-01

Family

ID=52434876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201480083866.1A Pending CN107003389A (en) Driver assistance system for a vehicle and method for assisting a driver of a vehicle

Country Status (2)

Country Link
CN (1) CN107003389A (en)
WO (1) WO2016087900A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102598083A (en) * 2009-10-30 2012-07-18 丰田自动车株式会社 Driving support device
CN103108796A (en) * 2010-08-12 2013-05-15 法雷奥开关和传感器有限责任公司 Method for supporting a parking procedure of a motor vehicle, driver assistance system, and motor vehicle
CN103210434A (en) * 2010-09-15 2013-07-17 大陆-特韦斯贸易合伙股份公司及两合公司 Visual driver information and warning system for driver of motor vehicle
CN103583041A (en) * 2011-06-07 2014-02-12 罗伯特·博世有限公司 Vehicle camera system and method for providing a continuous image of the vehicle surroundings

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4682809B2 (en) 2005-11-04 2011-05-11 株式会社デンソー Parking assistance system
JP5120880B2 (en) 2007-10-15 2013-01-16 アルパイン株式会社 Image processing apparatus and image processing method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102598083A (en) * 2009-10-30 2012-07-18 丰田自动车株式会社 Driving support device
CN103108796A (en) * 2010-08-12 2013-05-15 法雷奥开关和传感器有限责任公司 Method for supporting a parking procedure of a motor vehicle, driver assistance system, and motor vehicle
CN103210434A (en) * 2010-09-15 2013-07-17 大陆-特韦斯贸易合伙股份公司及两合公司 Visual driver information and warning system for driver of motor vehicle
CN103583041A (en) * 2011-06-07 2014-02-12 罗伯特·博世有限公司 Vehicle camera system and method for providing a continuous image of the vehicle surroundings

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AMIR LLIAIFAR: "Cars/LIDAR, lasers, and logic: Anatomy of an autonomous vehicle", 《URL:HTTPS://WWW.DIGITALTRENDS.COM/CARS/LIDAR-LASERS-AND-BEEFED-UP-COMPUTERS-THE-INTRICATE-ANATOMY-OF-AN-AUTONOMOUS-VEHICLE/》 *
ANONYMOUS: "Fujitsu Develops World"s First 3D Image Synthesis Technology to Display Vehicle Exterior without Distortion - Fujitsu Global", 《URL:HTTPS://WWW.FUJITSU.COM/GLOBAL/ABOUT/RESOURCES/NEWS/PRESS-RELEASES/2013/1009-03.HTML》 *
SEBASTIAN THRUN ET AL.: "Stanley: The Robot that Won the DARPA Grand Challenge", 《JOURNAL OF FIELD ROBOTICS, JOHN WILEY&SONS, INC, US》 *

Also Published As

Publication number Publication date
WO2016087900A1 (en) 2016-06-09

Similar Documents

Publication Publication Date Title
US11794647B2 (en) Vehicular vision system having a plurality of cameras
CN106143309B (en) A kind of vehicle blind zone based reminding method and system
US10970568B2 (en) Vehicular vision system with object detection
KR101670847B1 (en) Apparatus and method for peripheral image generation of vehicle
KR101611194B1 (en) Apparatus and method for peripheral image generation of vehicle
JP5454934B2 (en) Driving assistance device
CN103237685B (en) Blind area display device and method
US8559675B2 (en) Driving support device, driving support method, and program
EP3140725B1 (en) Dynamic camera view to aid with trailer attachment
EP1005234A2 (en) Three-dimensional scope system for vehicles with a single camera
JP2018531530A (en) Method and apparatus for displaying surrounding scene of vehicle / towed vehicle combination
JP2013533168A (en) Method for supporting parking operation of automobile, driver assistance system, and automobile
JP2004240480A (en) Operation support device
JP2010016805A (en) Image processing apparatus, driving support system, and image processing method
CN105793909B (en) The method and apparatus for generating warning for two images acquired by video camera by vehicle-periphery
JP2006238131A (en) Vehicle periphery monitoring apparatus
CN103377372B (en) One kind looks around composite diagram overlapping region division methods and looks around composite diagram method for expressing
JP4932293B2 (en) Obstacle recognition device
JP2012023505A (en) Driving support device
JP2004040523A (en) Surveillance apparatus for vehicle surroundings
CN107003389A (en) For vehicle driver assistance system and aid in vehicle driver method
JP5691339B2 (en) Driving assistance device
JP2008230358A (en) Display device
JP2004312537A (en) Visual field instrument for mobile body
WO2013099701A1 (en) Obstacle warning device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170801