WO2010127650A1 - Method for evaluating sensor data for a motor vehicle - Google Patents
Method for evaluating sensor data for a motor vehicle
- Publication number
- WO2010127650A1 (PCT/DE2010/000035)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- grid
- sensor data
- detection
- vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9329—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles cooperating with reflectors or transponders
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
Definitions
- a method for evaluating sensor data of an environment detection system for a motor vehicle, in which detection points are entered into an occupancy grid independently of the sensor type.
- the result is a general occupancy grid which can be used by other suitable algorithms for object evaluation without sensor-specific knowledge.
- Typical sensors for environment detection are radar, lidar, camera and ultrasonic sensors.
- the occupancy grid substantially represents the environment of the motor vehicle; in particular, the environment around the vehicle is divided into stationary cells.
- the occupancy probability of a cell m given the measurements z_1, ..., z_n is denoted P(m | z_1, ..., z_n).
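- A minimal sketch (not from the patent) of how such a per-cell posterior can be maintained: the common log-odds formulation folds each new measurement z_i into the cell via an inverse sensor model. The value P(occupied | detection) = 0.7 below is an illustrative assumption.

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

class OccupancyGrid:
    """Each cell stores the log-odds of P(m | z_1, ..., z_n)."""
    def __init__(self, rows, cols, p_prior=0.5):
        self.logodds = np.full((rows, cols), logit(p_prior))

    def update(self, cell, p_hit):
        # standard recursive log-odds update; the prior term vanishes for p_prior = 0.5
        self.logodds[cell] += logit(p_hit)

    def probability(self):
        return 1.0 / (1.0 + np.exp(-self.logodds))

g = OccupancyGrid(10, 10)
for _ in range(3):         # three consistent detections in the same cell
    g.update((5, 5), 0.7)  # assumed inverse sensor model: P(occupied | detection) = 0.7
p = g.probability()
```

Repeated consistent detections drive the cell's probability well above the 0.5 prior, while untouched cells stay at the prior.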
- from a sensor system without resolution in the elevation direction, it cannot be directly concluded whether a cell is traversable.
- Such a sensor system does not measure the elevation of objects, so that a detection may also originate from a traversable object (e.g. a metal plate).
- the surroundings detection system comprises at least two similar sensors, for example a stereo camera.
- the surroundings detection system comprises at least two different types of sensor systems whose detection areas preferably at least partially overlap.
- At least one sensor system is designed as a radar, LIDAR or ultrasound system for environmental detection.
- detection points of at least two different types of sensor systems are combined in a common occupancy grid.
- Objects are extracted from the occupancy grid.
- the objects are recognized and / or classified with suitable algorithms.
- an object class, for example "drive-over object", "non-drive-over object" or "road boundary", can be provided.
- the occupancy grid representation is evaluated for structural elements that can correspond to physical objects such as guardrails. This can be done as part of an object detection. In this way, selected objects, e.g. non-drivable objects, are extracted from the set of detections. Which objects are searched for can be specified.
- the occupancy grid is interpreted by searching, e.g., for longitudinal structures whose shape and position can correspond to crash barriers.
- sensor-specific information is retained when sensor data are combined in a common occupancy grid.
- Specific information is, for example, the fact that detection points (grid cells) belong together. With a radar sensor, this affiliation is determined, e.g., on the basis of the relative velocity or the RCS (radar cross section) value.
- the color of a detection point can be taken from the sensor data of a color camera as specific information about its affiliation. This information is retained when the data obtained from multiple sensors are merged in a common occupancy grid.
- the representation of the sensor-specific information is sensor-independent, i.e. equivalent information obtained with other sensor types is displayed in the same manner in the occupancy grid, so that the algorithms for further data processing (e.g. combining grid cells into an object, object recognition or categorization, etc.) can be designed sensor-independently.
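- Such a sensor-independent fusion can be sketched as follows (an illustration, not the patent's implementation): if each sensor maintains its grid in log-odds form, cell-wise addition merges them into a common grid under the usual assumption of conditionally independent sensors.

```python
import numpy as np

def fuse_grids(logodds_a, logodds_b):
    # cell-wise addition of log-odds corresponds to multiplying the per-sensor
    # occupancy evidence, assuming the sensors are conditionally independent
    return logodds_a + logodds_b

def to_probability(logodds):
    return 1.0 / (1.0 + np.exp(-logodds))

radar = np.zeros((4, 4));  radar[1, 2] = 1.5   # radar evidence for one cell (illustrative)
camera = np.zeros((4, 4)); camera[1, 2] = 1.0  # camera confirms the same cell
common = fuse_grids(radar, camera)
prob = to_probability(common)
```

Where both sensors agree, the fused occupancy probability rises above either single-sensor value; cells without evidence stay at 0.5.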
- the ego-motion of the motor vehicle as well as shape and position in the occupancy grid are taken into account for the recognition of an object.
- the direction of travel is important because bridges are, as a rule, arranged transversely to the direction of travel, whereas lane boundaries run substantially longitudinally to the direction of travel and have a straight or slightly curved shape.
- the presented method is used for the detection of road boundaries.
- Road boundaries such as lane markings, crash barriers or beacons are identified in the common occupancy grid on the basis of their location and shape in the grid and the ego-motion of the vehicle.
- extended, contiguous or periodic structures longitudinal to the current direction of travel are interpreted as the roadway boundary.
- Figure 1: construction site scenario for the validation of the roadside estimation; search windows (dashed) and measuring points are shown.
- Figure 2: merging of the detection points of different sensors in a common occupancy grid. The shape and position of objects are determined in the common grid.
- Embodiments of the method according to the invention, which is used to estimate the roadside on the basis of the occupancy grid, are given below.
- a sensor system, e.g. a radar sensor system, is provided which cannot measure the height of objects, since no elevation resolution is available.
- As a roadside, extended, contiguous or periodic structures longitudinal to the current direction of travel are interpreted. The aim of the evaluation of the sensor data is therefore to find these structures in the calculated occupancy grid representation.
- a Kalman filter is used for estimating the roadside.
- the roadside is preferably modeled as a clothoid in a vehicle-fixed coordinate system.
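- In a vehicle-fixed frame, such a clothoid is commonly approximated by a third-order polynomial (a standard lane model, not quoted from the patent), with lateral offset $y_0$, heading angle $\Delta\varphi$, curvature $c_0$ and curvature rate $c_1$:

$$ y(x) \approx y_0 + \tan(\Delta\varphi)\,x + \frac{c_0}{2}\,x^2 + \frac{c_1}{6}\,x^3 $$

These four quantities are natural state variables for the Kalman filter mentioned above.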
- the structure to be found is characterized by one or more contiguous occupied grid cells with a high occupancy probability and a sharp demarcation from the environment by a significant change in the occupancy probability. The transitions can be disturbed by measurement noise or by really existing objects, such as metallic objects, that do not belong to the road boundary. These disturbances can be filtered out or interpolated, e.g. by the clothoid estimation with the Kalman filter.
- Another method for suppressing disturbances is to use heuristics in the context of data association, which prevent corresponding misassignments.
- As measurements for the Kalman filter, in an advantageous embodiment of the invention, places with a strong jump in the occupancy probability are used, analogous to line detection in image processing. Measuring points are extracted from the occupancy grid, in particular within two relevant measuring ranges (search windows). The areas each extend to the right and left of the vehicle in the longitudinal direction with a predefined length and width. As soon as a roadside is tracked, the search areas are adapted in position and extent with the help of the estimated state variables of the clothoid.
- For the extraction, edge filters from image processing are used. These are applied one-dimensionally, transversely to the vehicle, to the left and right relevant measurement areas of the occupancy grid. Of interest for the estimation of the roadside is the edge closest to the vehicle. An edge must exceed a predefined threshold value in order to be accepted as a prominent measuring point and taken over into the measurement vector for the Kalman filter.
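- A sketch of such a measurement extraction (illustrative, with an assumed threshold of 0.4): a one-dimensional difference filter is run transversely across each grid row of the search window, and the first edge exceeding the threshold is taken as a measuring point.

```python
import numpy as np

def extract_edge_points(occ, rows, threshold=0.4):
    """Scan each grid row transversely and keep the first occupancy jump
    (edge) that exceeds the threshold, i.e. the edge closest to the vehicle."""
    points = []
    for r in rows:
        edges = np.diff(occ[r])          # 1-D edge filter across the row
        for c, e in enumerate(edges):
            if e > threshold:            # rising edge: free -> occupied
                points.append((r, c + 1))
                break
    return points

occ = np.full((5, 8), 0.1)   # mostly free search window
occ[:, 5:] = 0.9             # an extended structure from column 5 outward
pts = extract_edge_points(occ, rows=range(5))
```

The resulting column indices, one per row, form the measurement vector that would feed the Kalman filter.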
- the measurement points are checked for ambiguity.
- the area toward the center of the vehicle is checked for the occupancy of the cells. Only if these are not significantly occupied is it ensured that a measuring point belongs to the sought inner edge of the structure.
- From the measuring points associated with the roadside, it is possible to determine a start and an end point of the clothoid.
- a statement about the quality of the estimate can be made based on the distribution of the measuring points.
- Figure 1 shows a construction site scenario in which the roadside estimation is performed.
- the left roadside is given by plastic elements, which are shown in image section a).
- Figure 1b shows a top view of the scene with occupancy probabilities.
- the estimated roadside serves as an input for a situation interpretation which, in case of an impending crossing of the roadside - in this case a collision with the boundary elements - intervenes with an additional steering torque.
- the system can now also react to structuring, raised elements of the road course, such as guardrails or beacons.
- a system that, e.g., also includes the information of the road marking can finally recognize and handle ambiguities.
- the merging of the detection points of different sensor types is displayed in a common occupancy grid. Figure 2 shows that an occupancy grid is first created for each sensor type. Both grids are then joined together to form a two-dimensional common grid. Alternatively, the detection data can be entered immediately into a common grid. The shape and position of objects are determined in the common grid. After the merging, coherent detection points are combined and recognized as objects. In Figure 2, this process is shown in the occupancy grid at the bottom right. The contiguous structures are marked with two thick lines. In this embodiment, the contiguous structures correspond in shape and location to roadway boundaries such as crash barriers or beacons.
- the presented method is preferably used in driver assistance systems for lane estimation, e.g. in an ACC or LDW system.
- An advantageous system for acquiring sensor data is a monocular camera mounted in or on the vehicle. Based on a sequence of images, a three-dimensional recognition of objects, in particular static objects, is performed. For this purpose, flow vectors of pixels are calculated according to the optical flow method in order to reconstruct them three-dimensionally.
- the camera is rigidly connected to the vehicle and is moved with this.
- the movement of pixels is caused by the relative movement between the vehicle and the static vehicle environment.
- the positions of the pixels can then be reconstructed three-dimensionally.
- the calculation of the optical flow is made from two consecutive images of the same scene. It is of interest where the pixels of the first recorded image have moved to in the following image.
- Various methods can be used to calculate the optical flow or the displacement vectors.
- a differential or a matching method is used.
- the flow vectors are calculated for all pixels of the image.
- In the matching method, prominent pixels in the camera image are identified in a first step. A prominent pixel is characterized in particular by a high gradient in the x and y directions of the image. The flow vectors are calculated only for these prominent points.
- Prominent pixels are found in particular on structuring, raised elements of the road course, such as crash barriers or beacons, especially in a construction site area.
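- A minimal sketch of the prominent-pixel selection (the threshold and test image are illustrative assumptions): pixels whose image gradient is high in both the x and y direction are kept, and flow vectors would then be computed only at these points.

```python
import numpy as np

def prominent_pixels(img, threshold=0.4):
    """Return (row, col) of pixels with a high gradient in both x and y."""
    gy, gx = np.gradient(img.astype(float))  # gradients along rows and columns
    mask = (np.abs(gx) > threshold) & (np.abs(gy) > threshold)
    return np.argwhere(mask)

img = np.zeros((6, 6))
img[3:, 3:] = 1.0            # a high-contrast corner, e.g. a raised element
pts = prominent_pixels(img)
```

Only the corner pixel survives, since straight edges have a high gradient in one direction only.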
- For the reconstruction, the ego-motion of the camera is needed. This means that the translational and rotational movements of the camera between the respective times of the two shots of the successive images are determined.
- the rotation angle and translation direction can preferably be estimated directly from the displacement vectors by estimating the so-called epipolar geometry.
- Epipolar geometry is a mathematical model that describes the geometric relationships between different camera images of the same scene.
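- The underlying relation, in standard computer-vision notation (not quoted from the patent): two corresponding normalized image points $\hat{x}_1, \hat{x}_2$ satisfy the epipolar constraint, and the essential matrix $E$ encodes the rotation $R$ and translation direction $t$ between the two shots:

$$ \hat{x}_2^{\top} E\, \hat{x}_1 = 0, \qquad E = [t]_{\times} R $$

Estimating $E$ from the displacement vectors and decomposing it yields the rotation angle and translation direction.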
- the translational motion of the vehicle is determined from the data of the vehicle's inertial sensors. Alternatively, it is intended to estimate the total camera movement from the inertial sensor data.
- the estimate from the epipolar geometry can be supplemented or completely replaced.
- the three-dimensional position of the flow vectors can then preferably be determined by linear or non-linear triangulation.
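- A sketch of linear (DLT) triangulation under assumed camera geometry (the projection matrices P1, P2 and the test point are illustrative, not from the patent): each of the two observations contributes two linear constraints on the homogeneous 3-D point, which is recovered as the null vector of the stacked system.

```python
import numpy as np

def triangulate_linear(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: 2-D image points."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],   # two rows per view from x ~ P X
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                   # null vector = homogeneous 3-D point
    return X[:3] / X[3]          # dehomogenize

# two cameras displaced by 1 m along x (illustrative calibration)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
X_est = triangulate_linear(P1, P2, x1, x2)
```

With noise-free correspondences the point is recovered exactly; with real flow vectors a non-linear refinement can follow.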
- the method presented here requires a static vehicle environment. That is, flow vectors may only have been caused by the relative motion between the moving camera and the stationary vehicle environment. Flow vectors caused by moving objects (e.g. vehicles) must be excluded. Typical stationary objects are reflectors for limiting the roadway, lane markings or the peripheral structures of the road. These can be recognized by a suitable image processing algorithm based on characteristic features in the image. Alternatively, in particular a beam sensor (e.g. a radar sensor) may be used to detect moving objects.
- The positions of moving objects determined in this way can preferably be projected into the camera image; at these points in the image, no flow vectors are calculated. Thus it is possible, even with a monocular camera, to obtain depth information and thus 3D information about arbitrary static objects in the vehicle environment. In this way, not only the lane markings but also the peripheral structures of a road, in particular structural, raised elements of the road course such as crash barriers or beacons, can support a lane keeping system, in particular in a construction site area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112010000146T DE112010000146A5 (de) | 2009-05-06 | 2010-01-16 | Verfahren zur Auswertung von Sensordaten für ein Kraftfahrzeug |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009020073.8 | 2009-05-06 | ||
DE102009020073 | 2009-05-06 | ||
DE102009020071 | 2009-05-06 | ||
DE102009020071.1 | 2009-05-06 | ||
DE102009042780 | 2009-09-25 | ||
DE102009042780.5 | 2009-09-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010127650A1 true WO2010127650A1 (de) | 2010-11-11 |
Family
ID=42111354
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2010/000035 WO2010127650A1 (de) | 2009-05-06 | 2010-01-16 | Verfahren zur auswertung von sensordaten für ein kraftfahrzeug |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE112010000146A5 (de) |
WO (1) | WO2010127650A1 (de) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004007553A1 (de) * | 2004-02-17 | 2005-09-01 | Daimlerchrysler Ag | Erfassungsvorrichtung und Sicherheitssystem für ein Kraftfahrzeug |
EP1612580A1 (de) * | 2004-07-01 | 2006-01-04 | DaimlerChrysler AG | Objekterkennungsverfahren für Fahrzeuge |
EP1672390A1 (de) * | 2004-12-15 | 2006-06-21 | Deere & Company | Verfahren und System zur Objekterkennung mittels zusammengesetztem Beweisraster (Evidence-Grid) |
DE102006027123A1 (de) * | 2006-06-12 | 2007-12-13 | Robert Bosch Gmbh | Verfahren für die Erfassung eines Verkehrsraums |
EP1927866A1 (de) * | 2006-12-01 | 2008-06-04 | Robert Bosch Gmbh | Verfahren zum gitterbasierten Verarbeiten von Sensorsignalen |
US20080252433A1 (en) * | 2005-09-09 | 2008-10-16 | Institut National De La Recherche En Informatique Et En Automatique | Vehicle Driving Aid and Method and Improved Related Device |
-
2010
- 2010-01-16 DE DE112010000146T patent/DE112010000146A5/de active Pending
- 2010-01-16 WO PCT/DE2010/000035 patent/WO2010127650A1/de active Application Filing
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011111856B4 (de) | 2011-08-27 | 2019-01-10 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Detektion mindestens einer Fahrspur in einem Fahrzeugumfeld |
DE102011111856A1 (de) | 2011-08-27 | 2013-02-28 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Detektion mindestens einer Fahrspur in einem Fahrzeugumfeld |
WO2013087067A1 (de) * | 2011-12-14 | 2013-06-20 | Continental Teves Ag & Co. Ohg | Freirauminformationen in einem belegungsgitter als grundlage für die bestimmung eines manöverraums für ein fahrzeug |
CN103176185A (zh) * | 2011-12-26 | 2013-06-26 | 上海汽车集团股份有限公司 | 用于检测道路障碍物的方法及*** |
US9607229B2 (en) | 2012-06-19 | 2017-03-28 | Conti Temic Microelectronic Gmbh | Method for representing the surroundings of a vehicle |
DE102012105332A1 (de) * | 2012-06-19 | 2013-12-19 | Continental Teves Ag & Co. Ohg | Verfahren zur Darstellung einer Fahrzeugumgebung |
EP2888604B1 (de) * | 2012-08-27 | 2019-07-24 | Continental Teves AG & Co. OHG | Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug |
WO2014044272A1 (de) | 2012-09-20 | 2014-03-27 | Conti Temic Microelectronic Gmbh | Verfahren zur kalibrierung mehrerer umfeldsensoren in einem fahrzeug |
DE102012108862A1 (de) | 2012-09-20 | 2014-05-28 | Continental Teves Ag & Co. Ohg | Verfahren zur Kalibrierung mehrerer Umfeldsensoren in einem Fahrzeug |
US9274213B2 (en) | 2012-09-20 | 2016-03-01 | Conti Temic Microelectronic Gmbh | Method for calibrating a plurality of environment sensors in a vehicle |
EP2950114A1 (de) * | 2014-05-30 | 2015-12-02 | Honda Research Institute Europe GmbH | Fahrerassistenzverfahren zum Antreiben eines Fahrzeugs, ein Fahrerassistenzsystem, ein Computersoftwareprogrammprodukt und Fahrzeug |
US9669830B2 (en) | 2014-05-30 | 2017-06-06 | Honda Research Institute Europe Gmbh | Method for assisting a driver in driving a vehicle, a driver assistance system, a computer software program product and vehicle |
DE102014210770A1 (de) * | 2014-06-05 | 2015-12-17 | Conti Temic Microelectronic Gmbh | Verfahren und system zur bestimmung einer fahrzeugposition eines fahrzeuges |
US10955854B2 (en) | 2014-06-05 | 2021-03-23 | Conti Temic Microelectronic Gmbh | Method and system for determining the position of a vehicle |
WO2015185048A1 (de) * | 2014-06-05 | 2015-12-10 | Conti Temic Microelectronic Gmbh | Verfahren und system zur bestimmung einer fahrzeugposition eines fahrzeuges |
DE102014111125A1 (de) | 2014-08-05 | 2016-02-11 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Erkennen eines Objekts in einem Umgebungsbereich eines Kraftfahrzeugs mittels eines Ultraschallsensors, Fahrerassistenzsystem sowie Kraftfahrzeug |
WO2016124189A1 (de) * | 2015-02-02 | 2016-08-11 | Conti Temic Microelectronic Gmbh | Sensorsystem für ein fahrzeug und verfahren |
DE102015201747A1 (de) * | 2015-02-02 | 2016-08-04 | Continental Teves Ag & Co. Ohg | Sensorsystem für ein fahrzeug und verfahren |
EP3179270A1 (de) * | 2015-12-08 | 2017-06-14 | Delphi Technologies, Inc. | Fahrspurerweiterung eines spurhaltesystems durch einen entfernungsmessungssensor für ein automatisches fahrzeug |
DE102016200642A1 (de) * | 2016-01-19 | 2017-07-20 | Conti Temic Microelectronic Gmbh | Verfahren und vorrichtung zum klassifizieren von fahrbahnbegrenzungen und fahrzeug |
DE102017209977A1 (de) * | 2017-06-13 | 2018-12-13 | Continental Automotive Gmbh | Verfahren und Vorrichtung zum Bestimmen eines freien Objektraums und Erzeugen einer definierten Grenze |
FR3077549A1 (fr) * | 2018-02-08 | 2019-08-09 | Psa Automobiles Sa | Procede de determination de la trajectoire d’un vehicule automobile en absence de marquage au sol. |
WO2019155134A1 (fr) * | 2018-02-08 | 2019-08-15 | Psa Automobiles Sa | Procede de determination de la trajectoire d'un vehicule automobile en absence de marquage au sol. |
DE102018206743A1 (de) * | 2018-05-02 | 2019-11-07 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zum Betreiben eines Fahrerassistenzsystems eines Egofahrzeugs mit wenigstens einem Umfeldsensor zum Erfassen eines Umfelds des Egofahrzeugs, Computer-lesbares Medium, System, und Fahrzeug |
WO2019211293A1 (de) | 2018-05-02 | 2019-11-07 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zum betreiben eines fahrerassistenzsystems eines egofahrzeugs mit wenigstens einem umfeldsensor zum erfassen eines umfelds des egofahrzeugs, computer-lesbares medium, system, und fahrzeug |
US11554795B2 (en) | 2018-05-02 | 2023-01-17 | Bayerische Motoren Werke Aktiengesellschaft | Method for operating a driver assistance system of an ego vehicle having at least one surroundings sensor for detecting the surroundings of the ego vehicle, computer readable medium, system and vehicle |
CN112368593A (zh) * | 2018-06-30 | 2021-02-12 | 罗伯特·博世有限公司 | 借助用于机动车的雷达传感器来识别静态雷达目标的方法 |
US20210173043A1 (en) * | 2018-06-30 | 2021-06-10 | Robert Bosch Gmbh | Method for identifying static radar targets using a radar sensor for motor vehicles |
JP2021530679A (ja) * | 2018-06-30 | 2021-11-11 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 自動車用レーダセンサによる静的レーダ目標の認識方法 |
JP7037672B2 (ja) | 2018-06-30 | 2022-03-16 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツング | 自動車用レーダセンサによる静的レーダ目標の認識方法 |
WO2020001828A1 (de) * | 2018-06-30 | 2020-01-02 | Robert Bosch Gmbh | Verfahren zur erkennung statischer radarziele mit einem radarsensor für kraftfahrzeuge |
US11879992B2 (en) | 2018-06-30 | 2024-01-23 | Robert Bosch Gmbh | Method for identifying static radar targets using a radar sensor for motor vehicles |
CN112368593B (zh) * | 2018-06-30 | 2024-04-02 | 罗伯特·博世有限公司 | 借助用于机动车的雷达传感器来识别静态雷达目标的方法 |
WO2020069922A1 (de) * | 2018-10-05 | 2020-04-09 | HELLA GmbH & Co. KGaA | Verfahren zur bereitstellung von objektinformationen von statischen objekten in einer umgebung eines fahrzeuges |
Also Published As
Publication number | Publication date |
---|---|
DE112010000146A5 (de) | 2012-06-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10705090 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112010000146 Country of ref document: DE Ref document number: 1120100001464 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10705090 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: R225 Ref document number: 112010000146 Country of ref document: DE Effective date: 20120606 |