WO2016071332A1 - Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle - Google Patents

Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle Download PDF

Info

Publication number
WO2016071332A1
Authority
WO
WIPO (PCT)
Prior art keywords
motor vehicle
road marking
assistance system
driver assistance
camera
Prior art date
Application number
PCT/EP2015/075582
Other languages
French (fr)
Inventor
Ciáran HUGHES
Enda Peter WARD
Brian Michael Thomas DERGAN
Original Assignee
Connaught Electronics Ltd.
Priority date
Filing date
Publication date
Application filed by Connaught Electronics Ltd. filed Critical Connaught Electronics Ltd.
Priority to CN201580059925.6A priority Critical patent/CN107206941A/en
Priority to US15/523,572 priority patent/US20170313253A1/en
Priority to JP2017523987A priority patent/JP6510642B2/en
Priority to KR1020177012116A priority patent/KR102004062B1/en
Priority to EP15790917.7A priority patent/EP3215400B1/en
Publication of WO2016071332A1 publication Critical patent/WO2016071332A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12Mirror assemblies combined with other articles, e.g. clocks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20Optical features of instruments
    • B60K2360/21Optical features of instruments using cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12Mirror assemblies combined with other articles, e.g. clocks
    • B60R2001/1253Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/307Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene


Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

The invention relates to a method for operating a driver assistance system (2) of a motor vehicle (1), in which a rear image of an environmental region (11, 12, 14) of the motor vehicle (1) located substantially next to and/or behind the motor vehicle (1) is captured by at least one camera (3, 4) of the driver assistance system (2), the camera being provided on the vehicle, wherein at least one road marking (19) of a roadway (17) is recognized in the environmental region (11, 12, 14) based on the captured rear image.

Description

Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle
The invention relates to a method for operating a driver assistance system of a motor vehicle, in which a rear image of an environmental region of the motor vehicle located substantially next to and/or behind the motor vehicle is captured by at least one camera of an electronic rearview mirror of the driver assistance system, and the rear image is displayed on a display device in the motor vehicle. In addition, the invention relates to a driver assistance system for a motor vehicle as well as to a motor vehicle with a driver assistance system.
Methods for operating a driver assistance system of a motor vehicle, in which a rear image of an environmental region of the motor vehicle located substantially next to and/or behind the motor vehicle is captured by means of at least one camera of an electronic rearview mirror of the driver assistance system are known from the prior art. Thus, in the electronic rearview mirror, the rear image is provided by means of the camera, which is for example disposed in the position of a conventional mirror of the motor vehicle, and output on the display device, for example one or more displays. Now, in contrast to the conventional wing mirror with a light reflective mirror, an image of the environmental region next to or on the side of and/or behind the motor vehicle is displayed to the driver on the display device in the motor vehicle. Usually, then, the conventional mirrors are omitted. The electronic rearview mirror is also referred to as eMirror. As a further embodiment of the electronic rearview mirror, an electronic wing mirror or an electronic exterior mirror is known, which displays an environment, which would be captured by a conventional wing mirror or a conventional exterior mirror, on the display device. In addition, as another embodiment, an electronic rearview mirror is known, which displays an environment that would be detected by a conventional rearview mirror on the display device.
It is the object of the invention to provide a method, a driver assistance system as well as a motor vehicle, by which or in which the electronic wing mirror can be particularly effectively used.
According to the invention, this object is solved by a method, by a driver assistance system as well as by a motor vehicle having the features according to the respective independent claims. In a method according to the invention for operating a driver assistance system of a motor vehicle, a rear image of an environmental region of the motor vehicle located substantially next to and/or behind the motor vehicle is captured by at least one camera of an electronic rearview mirror of the driver assistance system, and the rear image is displayed on a display device in the motor vehicle. According to the invention, it is provided that at least one road marking of a roadway is recognized in the environmental region based on the captured rear image.
By the method according to the invention, it becomes possible to recognize road markings in the rear image. The road marking is preferably a longitudinal marking. In particular, the road marking bounds lanes of the roadway. The road marking, also called roadway marking or ground marking, is a colored identification on the surface of traffic areas; it is associated with the road equipment and thus also with the roadway and serves for traffic guidance, for the identification of various traffic areas and as a traffic sign. Thus, in particular, lane delimitation on the side of the motor vehicle is effected by the road marking. The road marking can describe a roadway boundary or edge line and/or a lane boundary or solid line or safety line and/or a lane separator. Thus, the motor vehicle can be moved more reliably and/or more precisely owing to the additional recognition of at least one road marking.
Preferably, the rear image is captured reflection-mirrorless and therefore without a reflection mirror. An electronic rearview mirror replaces the light-reflecting mirror surface of a conventional rearview mirror, be it a wing mirror on the side of the motor vehicle or a rearview mirror on the headliner of the motor vehicle. An electronic rearview mirror captures or films an area behind the driver of the motor vehicle by means of a camera in the described manner and presents the camera images by means of a display device, for example a screen, in the field of view of the driver. Thus, the motor vehicle has no separate side mirrors or rearview mirrors that reflect light toward the driver. In particular, the rear image is only captured electronically. Preferably, it is provided that the rear image is captured by a camera which is constructed as at least one lateral camera of an electronic wing mirror, and the rear image is displayed on the display device in the motor vehicle. The use of the lateral camera is advantageous because it is already present due to the electronic wing mirror.
In particular, it is provided that a lateral distance from a longitudinal axis of the motor vehicle to the at least one road marking is determined based on the at least one recognized road marking. From the lateral distance, it can be determined whether the motor vehicle is within a lane of the roadway. Furthermore, it can be determined in which position the motor vehicle is located within the lane. Based on the lateral distance, it can thus be determined whether the motor vehicle has exited the lane and/or whether the motor vehicle is about to exit the lane. Thus, it is advantageous that an increased safety of the motor vehicle can be provided.
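A minimal sketch of how such a lateral distance could be computed is given below, assuming the recognized marking is already available as points in a vehicle coordinate system (x pointing forward along the longitudinal axis, y pointing laterally, both in metres); the point representation, the straight-line fit and all names are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def lateral_distance(marking_points_xy: np.ndarray) -> float:
    """Perpendicular distance (m) from the longitudinal axis (origin) to a
    road marking approximated by a straight line fitted to its points."""
    x, y = marking_points_xy[:, 0], marking_points_xy[:, 1]
    slope, intercept = np.polyfit(x, y, deg=1)   # y = slope * x + intercept
    # Distance from the vehicle reference point (0, 0) to the line
    # slope * x - y + intercept = 0.
    return abs(intercept) / np.hypot(slope, 1.0)

# Example: a marking running roughly 1.8 m to the left of the vehicle.
points = np.array([[-10.0, 1.82], [-5.0, 1.79], [-2.0, 1.80]])
print(round(lateral_distance(points), 2))        # -> approx. 1.8
```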
Preferably, it is provided that a lateral speed, with which the motor vehicle approaches the at least one recognized road marking, is determined based on the at least one recognized road marking. Thus, the lateral speed describes the transverse speed with which the motor vehicle approaches the at least one recognized road marking. From the lateral speed, it can be determined which period of time is left until the motor vehicle reaches or traverses the at least one recognized road marking. Thus, the motor vehicle can thereby be moved particularly safely within the lane.
Furthermore, it is provided that a period of time left until the motor vehicle traverses the road marking is determined depending on the determined lateral speed. Based on this period of time, it can be determined how much time is left to a driver of the motor vehicle, for example to alter the direction of travel of the motor vehicle by a corresponding steering movement, so that traversing the road marking or exiting the lane can be prevented. A particularly high safety in moving the motor vehicle is again advantageous because staying in the lane can be particularly reliably monitored.
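A minimal sketch under simple assumptions: the lateral speed is approximated as the rate of change of the measured lateral distance between two frames, and the remaining time until the marking is traversed follows as distance divided by approach speed; the names and the fixed frame interval are illustrative only.

```python
def lateral_speed(d_prev_m: float, d_curr_m: float, dt_s: float) -> float:
    """Approach speed toward the marking in m/s (positive = getting closer)."""
    return (d_prev_m - d_curr_m) / dt_s

def time_to_line_crossing(d_curr_m: float, v_lat_mps: float) -> float:
    """Seconds left until the marking is reached; infinite if not approaching."""
    return d_curr_m / v_lat_mps if v_lat_mps > 0.0 else float("inf")

v = lateral_speed(d_prev_m=1.80, d_curr_m=1.65, dt_s=0.5)   # 0.3 m/s drift
print(time_to_line_crossing(1.65, v))                       # -> 5.5 s
```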
Preferably, it is provided that the number of lanes of the roadway is determined based on the at least one recognized road marking. In this manner, the position of the motor vehicle on the roadway can be determined more accurately. A lane or traffic lane identifies the area that is available to the motor vehicle for driving in one direction. The width of a lane varies, for example, between 2.75 meters and 3.75 meters. The lane is mostly identified by roadway markings such as the road marking or the roadway boundary or the lane boundary or the lane separator. Thus, it is advantageous that additional information about the roadway is provided, which contributes to a safe movement of the motor vehicle.
In a further development, it is provided that a current position of the motor vehicle with respect to at least two lanes of the roadway is determined based on the at least one recognized road marking. The current position of the motor vehicle can then be provided to other units of the motor vehicle. This information about the current position can, for example, be compared with and checked for plausibility against data from other sources, for example other sensors or cameras. Thus, it is advantageous that the current position can be determined particularly precisely and/or reliably with respect to at least two lanes of the roadway.
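The following minimal sketch combines the two preceding steps, assuming each recognized longitudinal marking is summarized by its signed lateral offset from the vehicle's longitudinal axis (negative values to the left); both the offset representation and the lane indexing are illustrative assumptions.

```python
def lanes_and_position(marking_offsets_m: list[float]) -> tuple[int, int]:
    """Return (number_of_lanes, current_lane_index), lane 0 being the leftmost."""
    offsets = sorted(marking_offsets_m)
    number_of_lanes = max(len(offsets) - 1, 0)
    # The vehicle sits between the last marking left of it (negative offset)
    # and the first marking right of it (positive offset).
    current_lane = sum(1 for o in offsets if o < 0.0) - 1
    return number_of_lanes, current_lane

print(lanes_and_position([-5.4, -1.8, 1.8, 5.4]))   # -> (3, 1): second lane from the left
```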
Furthermore, it is provided that the determined, current position of the motor vehicle is provided to a navigation apparatus of the motor vehicle. This is advantageous because the navigation apparatus, in particular a navigation apparatus with a global navigation satellite system (GNSS), is usually equipped with an absolute GPS system and thus has a positional accuracy of about ±10 meters. This accuracy can be improved based on the determined, current position of the motor vehicle. Thus, the determined lane can be exactly assigned to the motor vehicle, which allows improved navigation with the navigation apparatus. The navigation apparatus can inform a driver of the motor vehicle earlier about a required driving maneuver based on the information about the determined, current position.
Furthermore, the motor vehicle is preferably at least semi-autonomously maneuvered depending on the at least one recognized road marking. The at least semi-autonomous maneuvering of the motor vehicle has the advantage that the driver of the motor vehicle can, for example, be relieved of a steering intervention and/or a braking intervention and/or an intervention in a drive device. By the at least semi-autonomous maneuvering, the safety of the motor vehicle can be increased. Furthermore, fully autonomous maneuvering of the motor vehicle can be provided, in which the driver carries out neither the steering intervention nor the acceleration or braking intervention. Fully autonomous driving or maneuvering also has the advantage that the movement of the motor vehicle can be carried out more safely because, for example, human failure or human inattention can be excluded.
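As a hedged illustration of what such an intervention could look like, the sketch below uses a simple proportional lane-centering correction based on the left and right lateral distances to the recognized markings; this control law is an assumption for illustration, not the claimed maneuvering strategy.

```python
def steering_correction(d_left_m: float, d_right_m: float, gain: float = 0.1) -> float:
    """Steering-angle correction in radians (positive = steer to the left),
    driving the vehicle toward the centre between the two recognized markings."""
    error = (d_left_m - d_right_m) / 2.0    # > 0: vehicle sits right of the lane centre
    return gain * error                     # steer left to re-centre

print(steering_correction(d_left_m=2.1, d_right_m=1.5))   # -> approx. 0.03 rad (steer slightly left)
```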
In particular, it is provided that a driver of the motor vehicle is warned of exiting the lane by means of the evaluation device depending on the at least one recognized road marking. The evaluation device can be a component of a lane departure warning system, which warns the driver of the motor vehicle of exiting the lane. Herein, different optical systems and computing devices can be employed, with the aid of which the position of the motor vehicle in the lane is determined. The lane departure warning system warns upon falling below a distance to the road marking or lane marking (Distance to Line Crossing criterion, DLC) and can predict this shortfall with the aid of the Time to Line Crossing criterion (TLC). The lane departure warning system can be realized in different manners. For example, if the motor vehicle is about to traverse the road marking, a warning beep and/or a rattling sound is emitted or the steering wheel is vibrated. The warning can thus be effected acoustically and/or visually and/or haptically. It can also be provided that a steering intervention is performed by the driver assistance system to prevent unintended exiting of the lane.
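A minimal sketch of such a warning decision combining the two criteria just named is shown below; the threshold values are illustrative assumptions and are not taken from the patent.

```python
DLC_WARN_M = 0.3    # warn when the vehicle is closer than 0.3 m to the marking
TLC_WARN_S = 1.0    # or when the marking would be crossed within 1 second

def should_warn(dlc_m: float, tlc_s: float) -> bool:
    """True when the lane departure warning should be issued (acoustically,
    visually and/or haptically, or by triggering a steering intervention)."""
    return dlc_m < DLC_WARN_M or tlc_s < TLC_WARN_S

print(should_warn(dlc_m=0.5, tlc_s=0.8))   # -> True (crossing predicted within 1 s)
```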
Preferably, it is provided that a front image of an environmental region of the motor vehicle located substantially in front of the motor vehicle is provided by means of a front camera of the motor vehicle. For example, the front camera can be located behind a rearview mirror or behind the windshield of the motor vehicle and can be oriented forward with respect to the motor vehicle. Thus, the front camera captures the environmental region, which is in front of the motor vehicle or in forward direction of travel of the motor vehicle. It is advantageous that additional information about the roadway in the form of the front image is provided by the front camera. In this manner, the environment of the motor vehicle can be reliably captured.
Furthermore, it is provided that the at least one road marking is additionally determined based on the front image. The determination of the road marking based on the front image results in a particularly reliable recognition. The at least one road marking can thus be determined based on the rear image and based on the front image. Hereby, erroneous determinations of the road marking can be avoided because the results from the front image and from the rear image can be compared to each other and thereby verified. The front camera is directed in the direction of travel of the motor vehicle, so a current direction of travel or a travel trajectory of the motor vehicle can be predicted. Based on the travel trajectory, it can then be predicted, for example, when the motor vehicle will traverse the road marking.
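A minimal sketch of such a plausibility check, assuming each image yields an independent estimate of the marking's lateral offset; the 0.2 m tolerance and all names are illustrative assumptions.

```python
from typing import Optional

def verified_offset(offset_front_m: float, offset_rear_m: float,
                    tolerance_m: float = 0.2) -> Optional[float]:
    """Fused lateral offset, or None when the two estimates contradict each
    other and the detection should be treated as erroneous."""
    if abs(offset_front_m - offset_rear_m) > tolerance_m:
        return None
    return (offset_front_m + offset_rear_m) / 2.0

print(verified_offset(1.78, 1.84))   # -> approx. 1.81 (estimates agree, marking accepted)
```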
Preferably, it is provided that a lighting situation of the roadway is acquired by means of the driver assistance system and the at least one road marking is recognized in the rear image and/or the front image depending on the acquired lighting situation. Thus, the rear image can, for example, be fused with the front image. With reflections or mirroring on the surface of the roadway due to low sun and/or wetness, it can, for example, be difficult to recognize the road marking based on the front image. In this case, the rear image can be used to recognize the road marking. This is helpful because the rear image is captured with the at least one lateral camera, which can be oriented substantially opposite to the front camera. Thus, the low sun then, for example, does not shine from the front onto the front camera, but shines from behind onto the at least one lateral camera. The inverse case is also possible: if the low sun shines from behind onto the motor vehicle and the rear image therefore seems unsuitable for recognizing the road marking, the front image can be used to better recognize the road marking. The lane recognition or the recognition of the road marking can thus be carried out particularly reliably and/or particularly accurately. The lighting situation describes the incidence of sunlight and/or of light of another traffic participant and/or of a road infrastructure facility.
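Under illustrative assumptions (a glare flag per viewing direction derived from image statistics or from the sun position relative to the vehicle heading), the selection described above could look roughly as follows; this is a sketch, not the claimed evaluation logic.

```python
def preferred_image(sun_behind_vehicle: bool, glare_detected: bool) -> str:
    """Which image the evaluation device should rely on for marking recognition."""
    if not glare_detected:
        return "fused"                    # normal case: fuse front and rear image
    # Glare case: the camera facing away from the low sun is less affected.
    return "front" if sun_behind_vehicle else "rear"

print(preferred_image(sun_behind_vehicle=False, glare_detected=True))   # -> "rear"
```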
A driver assistance system according to the invention for a motor vehicle includes at least one camera and an evaluation device, which is adapted to perform a method according to the invention.
The evaluation device can be present as a separate component of the driver assistance system or the evaluation device can be integrated in the camera.
A motor vehicle according to the invention, in particular a passenger car, includes a driver assistance system according to the invention.
The preferred embodiments presented with respect to the method according to the invention and the advantages thereof correspondingly apply to the driver assistance system according to the invention as well as to the motor vehicle according to the invention.
Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations or alone, without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention, which are not explicitly shown in the figures and explained, but arise from and can be generated by separated feature combinations from the explained implementations.
Below, embodiments of the invention are explained in more detail based on schematic drawings.
The drawings show in:
Fig. 1 a schematic plan view of an embodiment of a motor vehicle according to the invention with a driver assistance system including a left lateral camera, a right lateral camera and a front camera;
Fig. 2 a schematic plan view of the motor vehicle according to the invention on a schematically illustrated roadway; and
Fig. 3 a schematic plan view of the motor vehicle according to the invention on the schematically illustrated roadway with four lanes.
In Fig. 1, a plan view of a motor vehicle 1 with a driver assistance system 2 according to an embodiment of the invention is schematically illustrated. The driver assistance system 2 includes a left lateral camera 3 and a right lateral camera 4 in the embodiment.
Furthermore, the driver assistance system 2 includes an evaluation device 5, a display device 6, a navigation apparatus 7 as well as a front camera 8.
The left lateral camera 3 is attached to a left side 9 of the motor vehicle 1 such that it is oriented opposite to a forward direction of travel 10 of the motor vehicle 1 and captures a left environmental region 11 of the motor vehicle 1 and a rear environmental region 12 of the motor vehicle 1. The right lateral camera 4 is disposed on a right side 13 of the motor vehicle 1 and is also oriented opposite to the forward direction of travel 10. Thus, the right lateral camera 4 captures a right environmental region 14 of the motor vehicle 1 and the rear environmental region 12.
The display device 6 is disposed in a front area of the driver's cab of the motor vehicle 1, but can also be arbitrarily disposed in the motor vehicle 1. The display device 6 can include one or more screens. Thus, a rear image of the left lateral camera 3 can for example be displayed on a left screen of the display device 6, while a rear image of the right lateral camera 4 is displayed on a right screen of the display device 6.
The left lateral camera 3, the right lateral camera 4 and the display device 6 together constitute an electronic rearview mirror, which can also be referred to as eMirror. This electronic rearview mirror can be used alternatively or additionally to the wing mirrors of the motor vehicle 1. Thus, the electronic rearview mirror captures the left environmental region 11 and/or the rear environmental region 12 and/or the right environmental region 14 by means of the left lateral camera 3 and/or the right lateral camera 4 and provides this information on the display device 6.
According to the embodiment of Fig. 1, the evaluation device 5 is disposed centrally in the motor vehicle 1, but can be arbitrarily disposed in the motor vehicle 1. The evaluation device 5 can for example be a controller of the motor vehicle 1. The evaluation device 5 for example includes a digital signal processor. The navigation apparatus 7 can also be arbitrarily disposed in the motor vehicle 1. For example, the navigation apparatus 7 is based on a global navigation satellite system (GNSS), to which a GPS system and/or a Glonass system and/or a Galileo system and/or a Beidou system belong.
According to the embodiment of Fig. 1, the front camera 8 is disposed behind a rearview mirror of the motor vehicle 1. However, the front camera 8 can also be arbitrarily disposed in the motor vehicle 1, as long as a front environmental region 15 of the motor vehicle 1 can still be captured.
The left lateral camera 3, the right lateral camera 4, the evaluation device 5, the display device 6, the navigation apparatus 7 and the front camera 8 are connected to each other by a bus system 16 of the motor vehicle 1 for data transfer.
The left lateral camera 3 and/or the right lateral camera 4 and/or the front camera 8 can be a CMOS camera or else a CCD camera or any image capturing device, by which the rear image and/or a front image of the front camera 8 can be provided. The left lateral camera 3 and/or the right lateral camera 4 and/or the front camera 8 can also be a video camera, which continuously provides a sequence of frames.
According to the embodiment of Fig. 1, the motor vehicle 1 includes the electronic rearview mirror and no conventional wing mirror, which would provide the left environmental region 11 and/or the rear environmental region 12 and/or the right environmental region 14 to the driver of the motor vehicle by means of a mirror. However, the motor vehicle 1 can also be equipped with the conventional wing mirror in addition to the electronic wing mirror.
Fig. 2 shows the motor vehicle 1 on a roadway 17. The roadway 17 has a lane 18. The lane 18 is separated from adjacent lanes by means of a road marking 19. The left lateral camera 3 provides a left field of view 20, which extends over the left environmental region 11 and the rear environmental region 12. Analogously thereto, the right lateral camera 4 provides a right field of view 21, which extends at least partially over the right environmental region 14 and the rear environmental region 12.
In the left field of view 20 and/or the right field of view 21, the road marking 19 is now recognized based on the respective rear image by means of the evaluation device 5. Based on the road marking 19, a left lateral distance 22 and/or a right lateral distance 23 can be determined. The left lateral distance 22 extends perpendicularly from a longitudinal axis 24 of the motor vehicle 1 to the road marking 19 disposed to the left of the motor vehicle 1. The right lateral distance 23 extends perpendicularly from the longitudinal axis 24 to the road marking 19 disposed to the right of the motor vehicle 1.
Fig. 3 shows the motor vehicle 1 on the roadway 17 with four lanes 18. Furthermore, the left field of view 20 and the right field of view 21 are shown. The evaluation device 5 is adapted to determine the left lateral distance 22 and/or the right lateral distance 23 based on the road marking 19. Based on the left lateral distance 22 and/or the right lateral distance 23, a remaining period of time or TTC (time to crossing) until traversing the road marking 19 can be determined. The at least one recognized road marking serves, for example, for a lane departure warning system (LDW), which warns a driver of the motor vehicle 1 of exiting the lane 18. For example, the driver can be warned acoustically and/or visually and/or haptically.
Furthermore, a current position of the motor vehicle 1 with respect to at least two of the lanes 18 - as shown in Fig. 3 - can be determined by means of the evaluation device 5. This current position can be passed to the navigation apparatus 7 to assist the navigation of the driver and/or an at least semi-automatic navigation of the motor vehicle 1.
The warning of the driver can be output if the lateral distance 22, 23 falls below a predetermined limit value.
Furthermore, the right lateral camera 4 and/or the left lateral camera 3, which each provide the rear image, and the front camera 8, which provides the front image, can be used together. Thus, the rear image and the front image can be fused with each other to recognize the road marking 19 by means of the evaluation device 5. Additionally or alternatively, a lighting situation of the roadway 17 can be determined by the evaluation device 5, which can then use the rear image and/or the front image for recognizing the road marking 19 depending thereon.

Claims

Claims
1. Method for operating a driver assistance system (2) of a motor vehicle (1), in which a rear image of an environmental region (11, 12, 14) of the motor vehicle (1) located substantially next to and/or behind the motor vehicle (1) is captured by at least one camera (3, 4) of an electronic rearview mirror of the driver assistance system (2), and the rear image is displayed on a display device (6) in the motor vehicle (1), characterized in that
at least one road marking (19) of a roadway (17) is recognized in the environmental region (11, 12, 14) based on the captured rear image.
2. Method according to claim 1,
characterized in that
the rear image is captured reflection-mirrorless.
3. Method according to claim 1 or 2,
characterized in that
a lateral distance (22, 23) from a longitudinal axis (24) of the motor vehicle (1) to the at least one road marking (19) is determined based on the at least one recognized road marking (19).
4. Method according to any one of the preceding claims,
characterized in that
a lateral speed, with which the motor vehicle (1) approaches the at least one recognized road marking (19), is determined based on the at least one recognized road marking (19).
5. Method according to claim 4,
characterized in that
a period of time left until traversing the road marking (19) by the motor vehicle (1) is determined depending on the determined lateral speed.
6. Method according to any one of the preceding claims,
characterized in that a number of lanes (18) of the roadway (17) is determined based on the at least one recognized road marking (19).
7. Method according to any one of the preceding claims,
characterized in that
a current position of the motor vehicle (1) with respect to at least two lanes (18) of the roadway (17) is determined based on the at least one recognized road marking (19).
8. Method according to claim 7,
characterized in that
the determined, current position of the motor vehicle (1) is provided to a navigation apparatus (7) of the motor vehicle (1).
9. Method according to any one of the preceding claims,
characterized in that
the motor vehicle (1) is at least semi-autonomously maneuvered depending on the at least one recognized road marking (19).
10. Method according to any one of the preceding claims,
characterized in that
a driver of the motor vehicle (1) is warned of exiting a lane (18) of the roadway (17) depending on the at least one recognized road marking (19).
11. Method according to any one of the preceding claims,
characterized in that
a front image of an environmental region (15) of the motor vehicle (1) located substantially in front of the motor vehicle (1) is provided by means of a front camera (8) of the motor vehicle (1).
12. Method according to claim 11,
characterized in that
the at least one road marking (19) is additionally determined based on the front image.
13. Method according to claim 11 or 12,
characterized in that
a lighting situation of the roadway (17) is acquired by means of the driver assistance system (2) and the at least one road marking (19) is recognized depending on the acquired lighting situation in the rear image and/or the front image.
14. Driver assistance system (2) with a camera (3, 4) and with an evaluation device (5) adapted for performing a method according to any one of the preceding claims.
15. Motor vehicle (1) with a driver assistance system (2) according to claim 14.
PCT/EP2015/075582 2014-11-04 2015-11-03 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle WO2016071332A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201580059925.6A CN107206941A (en) 2014-11-04 2015-11-03 For the method for the driver assistance system for operating motor vehicles, driver assistance system and motor vehicles
US15/523,572 US20170313253A1 (en) 2014-11-04 2015-11-03 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle
JP2017523987A JP6510642B2 (en) 2014-11-04 2015-11-03 Method for activating a driver assistance system of a motor vehicle, driver assistance system, and motor vehicle
KR1020177012116A KR102004062B1 (en) 2014-11-04 2015-11-03 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle
EP15790917.7A EP3215400B1 (en) 2014-11-04 2015-11-03 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014116037.1 2014-11-04
DE102014116037.1A DE102014116037A1 (en) 2014-11-04 2014-11-04 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle

Publications (1)

Publication Number Publication Date
WO2016071332A1 true WO2016071332A1 (en) 2016-05-12

Family

ID=54476954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/075582 WO2016071332A1 (en) 2014-11-04 2015-11-03 Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle

Country Status (7)

Country Link
US (1) US20170313253A1 (en)
EP (1) EP3215400B1 (en)
JP (1) JP6510642B2 (en)
KR (1) KR102004062B1 (en)
CN (1) CN107206941A (en)
DE (1) DE102014116037A1 (en)
WO (1) WO2016071332A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3306524A3 (en) * 2016-09-14 2018-10-10 Robert Bosch GmbH Method and related device for guiding a means of locomotion

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6252252B2 (en) 2014-02-28 2017-12-27 株式会社デンソー Automatic driving device
DE102016114693A1 (en) * 2016-08-09 2018-02-15 Connaught Electronics Ltd. A method for assisting a driver of a motor vehicle when driving the motor vehicle, driver assistance system and motor vehicle
US10773717B2 (en) * 2018-04-12 2020-09-15 Trw Automotive U.S. Llc Vehicle assist system
FR3084631B1 (en) * 2018-07-31 2021-01-08 Valeo Schalter & Sensoren Gmbh DRIVING ASSISTANCE FOR THE LONGITUDINAL AND / OR SIDE CHECKS OF A MOTOR VEHICLE
CN111750880A (en) * 2019-03-29 2020-10-09 上海擎感智能科技有限公司 Auxiliary parking method and device
DE102019110364A1 (en) * 2019-04-18 2020-10-22 CloudMade Method for assisting a driver in driving the vehicle by activating a lane departure warning system as a function of the direction in which the driver is looking and an assistance system
US11845428B2 (en) * 2021-07-13 2023-12-19 Canoo Technologies Inc. System and method for lane departure warning with ego motion and vision
US11840147B2 (en) 2021-07-13 2023-12-12 Canoo Technologies Inc. System and method in data-driven vehicle dynamic modeling for path-planning and control
US11891060B2 (en) 2021-07-13 2024-02-06 Canoo Technologies Inc. System and method in lane departure warning with full nonlinear kinematics and curvature
US12017661B2 (en) 2021-07-13 2024-06-25 Canoo Technologies Inc. System and method in vehicle path prediction based on full nonlinear kinematics
US11908200B2 (en) 2021-07-13 2024-02-20 Canoo Technologies Inc. System and method in the prediction of target vehicle behavior based on image frame and normalization
US11891059B2 (en) 2021-07-13 2024-02-06 Canoo Technologies Inc. System and methods of integrating vehicle kinematics and dynamics for lateral control feature at autonomous driving

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5670935A (en) * 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
JP2001116567A (en) * 1999-10-20 2001-04-27 Matsushita Electric Ind Co Ltd On-vehicle driving supporting information displaying device
JP3848898B2 (en) * 2002-05-21 2006-11-22 アイシン精機株式会社 Lane departure judgment device
JP4374211B2 (en) * 2002-08-27 2009-12-02 クラリオン株式会社 Lane marker position detection method, lane marker position detection device, and lane departure warning device
JP2005184395A (en) * 2003-12-18 2005-07-07 Sumitomo Electric Ind Ltd Method, system and apparatus for image processing, and photographing equipment
JP2005346648A (en) * 2004-06-07 2005-12-15 Denso Corp View assistance system and program
JP2006127384A (en) * 2004-11-01 2006-05-18 Auto Network Gijutsu Kenkyusho:Kk White line recognition method, device, and system
JP2008225822A (en) * 2007-03-13 2008-09-25 Toyota Motor Corp Road partition line detection device
JP2008269399A (en) * 2007-04-23 2008-11-06 Mazda Motor Corp Traffic lane departure alarm device for vehicle
US8694195B2 (en) * 2007-12-04 2014-04-08 Volkswagen Ag Motor vehicle having a wheel-view camera and method for controlling a wheel-view camera system
JP5227065B2 (en) * 2008-01-25 2013-07-03 株式会社岩根研究所 3D machine map, 3D machine map generation device, navigation device and automatic driving device
JP5397887B2 (en) * 2008-12-17 2014-01-22 アルパイン株式会社 Vehicle monitoring system
JP5414588B2 (en) * 2010-03-24 2014-02-12 株式会社東芝 Vehicle driving support processing device and vehicle driving support device
US9090263B2 (en) * 2010-07-20 2015-07-28 GM Global Technology Operations LLC Lane fusion system using forward-view and rear-view cameras
CN201923014U (en) * 2010-11-24 2011-08-10 宁波罗谊特电子科技有限公司 Automobile safety monitoring device
JP5646980B2 (en) * 2010-12-16 2014-12-24 クラリオン株式会社 Ambient condition monitoring device for vehicles
WO2012138455A1 (en) * 2011-03-10 2012-10-11 Bowles Fluidics Corporation Integrated automotive system, nozzle assembly and remote control method for cleaning an image sensor's lens
US9516277B2 (en) * 2012-05-02 2016-12-06 GM Global Technology Operations LLC Full speed lane sensing with a surrounding view system
DE102012207716A1 (en) * 2012-05-09 2013-11-14 Robert Bosch Gmbh Optical scanning system for scanning environment of motor car, has processing unit making combined evaluation of information of images, and illumination device provided in infrared region for illuminating environment
US9834143B2 (en) * 2013-05-23 2017-12-05 GM Global Technology Operations LLC Enhanced perspective view generation in a front curb viewing system
US10946798B2 (en) * 2013-06-21 2021-03-16 Magna Electronics Inc. Vehicle vision system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004045103A1 (en) * 2004-09-17 2006-03-30 Daimlerchrysler Ag Warning signal generation for an automobile when incorrectly positioned relative to lane markings on a road system
DE102005025387A1 (en) * 2004-09-30 2006-05-04 Daimlerchrysler Ag Method and device for driver's warning or to actively intervene in the driving dynamics, if threatening to leave the lane
DE102010013357A1 (en) * 2009-04-02 2011-12-15 Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) REARVIEW MIRROR HEAD-UP DISPLAY FOR ENTIRELY WINDSHIELD
US20140032100A1 (en) * 2012-07-24 2014-01-30 Plk Technologies Co., Ltd. Gps correction system and method using image recognition information

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3306524A3 (en) * 2016-09-14 2018-10-10 Robert Bosch GmbH Method and related device for guiding a means of locomotion

Also Published As

Publication number Publication date
DE102014116037A1 (en) 2016-05-04
KR102004062B1 (en) 2019-07-25
JP6510642B2 (en) 2019-05-08
EP3215400A1 (en) 2017-09-13
US20170313253A1 (en) 2017-11-02
JP2017536621A (en) 2017-12-07
EP3215400B1 (en) 2020-06-24
KR20170076695A (en) 2017-07-04
CN107206941A (en) 2017-09-26

Similar Documents

Publication Publication Date Title
EP3215400B1 (en) Method for operating a driver assistance system of a motor vehicle, driver assistance system and motor vehicle
US11436840B2 (en) Vehicular control system
US11247608B2 (en) Vehicular system and method for controlling vehicle
US10445596B2 (en) Camera device for vehicle
EP3366524B1 (en) Parking space detection method and device
CN112537295B (en) Driving assistance device
JP7401978B2 (en) Intersection start determination device
US20190135169A1 (en) Vehicle communication system using projected light
WO2012045323A1 (en) Method and driver assistance system for warning a driver of a motor vehicle of the presence of an obstacle in an environment of the motor vehicle
US20220105941A1 (en) Vehicular contol system with enhanced vehicle passing maneuvering
CN115777121A (en) Driving support device
US8681219B2 (en) System and method for driving assistance at road intersections
JP2022154933A (en) Vehicle control device, computer program for vehicle control and vehicle control method
JP2022140026A (en) Image processing device, image processing method and program
CN111746399A (en) Driving support device
CN110072750B (en) Vehicle control apparatus and method
WO2014090957A1 (en) Method for switching a camera system to a supporting mode, camera system and motor vehicle
US20230314158A1 (en) Vehicle drive assist apparatus
CN115195752A (en) Driving support device
JP2009104224A (en) Onboard device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15790917; Country of ref document: EP; Kind code of ref document: A1)
REEP Request for entry into the european phase (Ref document number: 2015790917; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2015790917; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 15523572; Country of ref document: US)
ENP Entry into the national phase (Ref document number: 20177012116; Country of ref document: KR; Kind code of ref document: A) (Ref document number: 2017523987; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)