WO2020094323A1 - Method for classifying objects by means of an automated motor vehicle, and automated motor vehicle - Google Patents
- Publication number: WO2020094323A1
- Authority: WO (WIPO (PCT))
- Prior art keywords: passenger, motor vehicle, objects, automated, stored
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/40—Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
- G06F18/41—Interactive pattern learning with a human teacher
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/215—Selection or confirmation of options
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
- G06V10/451—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
Definitions
- The invention relates to a method for classifying objects by means of an automated motor vehicle, and to an automated motor vehicle.
- Automated motor vehicles drive along trajectories automatically, with both the longitudinal and the lateral dynamics being controlled automatically.
- Motor vehicles of this type also have at least one device for detecting and classifying objects, the data of which are required, inter alia, for trajectory planning.
- The device comprises an environment sensor system and an evaluation unit.
- The environment sensor system consists, for example, of several cameras; lidar, radar and/or ultrasonic sensors can also be used.
- The data from the environment sensor system are then evaluated in the evaluation unit, and the objects found are classified, for example as to whether a detected object is a motor vehicle, a pedestrian, a tree or the like.
- The evaluation unit can be, for example, a trained neural network.
- Object detection and classification is very reliable; however, there are always situations in which objects, or putative objects, appear in the data from the environment sensor system for which the neural network, for example, was not trained.
- The invention is therefore based on the technical problem of improving a method for classifying objects by means of an automated motor vehicle, and of creating such an improved automated motor vehicle.
- The automated motor vehicle has a device for detecting and classifying objects, as well as output and input means. The method provides that objects which are detected by the device but cannot be classified, or cannot be classified unambiguously, are brought to the attention of at least one passenger of the automated motor vehicle via the output means.
- The passenger is asked to classify the detected object. This step can also be carried out once in advance, for example by informing the passenger that objects which subsequently prove unclassifiable will be brought to the passenger's attention and are then to be classified.
- The classifications made by the at least one passenger are stored.
- The passenger can additionally be asked to enter or confirm attributes of the object. For example, when classifying a restaurant, the passenger can be asked whether he knows the restaurant and can recommend it. With other classifications, this can be dispensed with.
- The objects are preferably displayed visually to the at least one passenger, which can take place in various ways.
- An image of the object can be shown on a display unit, the display unit being either permanently installed in the motor vehicle or the display of a mobile device.
- The object can also be displayed, for example, on a head-up display.
- The passenger may also be wearing AR (augmented reality) glasses, in which case the objects are superimposed into the passenger's field of view.
- The data of the objects classified in this way can be used in a variety of ways.
- The stored classifications of the objects can be read in as training data by the device for detecting and classifying, in order to further train the evaluation unit.
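The patent leaves the storage format open; a minimal Python sketch, under the assumption of a simple record type (all names hypothetical), of how stored passenger classifications could be collected into labeled samples for re-training the evaluation unit:

```python
from dataclasses import dataclass, field

@dataclass
class StoredClassification:
    # Hypothetical record type -- the patent only says that classifications are stored.
    image_id: str    # reference to the sensor image containing the object
    label: str       # class name entered by the passenger, e.g. "crane"
    attributes: dict = field(default_factory=dict)  # optional attributes, e.g. {"recommended": True}

def to_training_samples(stored):
    """Convert stored passenger classifications into (image_id, label) pairs
    that can be fed to the evaluation unit as additional training data."""
    return [(c.image_id, c.label) for c in stored]

samples = to_training_samples([
    StoredClassification("img_001", "crane"),
    StoredClassification("img_002", "restaurant", {"recommended": True}),
])
```

The same record could equally be exported to a vehicle-external network or to a digital map; only the (image_id, label) pairs are needed for training.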
- The data can also be stored in a digital street map. For example, if a crane was detected in the images and classified as such by the passenger, a construction site can be entered on the basis of this information. The map then contains the information that traffic disruptions due to construction vehicles may occur at this point. Restaurants and similar objects can also be saved and output as points of interest.
- The stored classifications, with the associated image data, can be transmitted as training data to a vehicle-external neural network.
- For example, certain objects may have been classified as birds. These are of no further interest for the navigation of the automated motor vehicle, but they are of interest for a neural network by means of which birds are to be detected in image data.
- The inputs of the at least one passenger can be transmitted to a monetization device and stored there. This can create incentives for the passenger to classify objects. It is also possible to create a game situation for the passengers, in which they classify objects in competition with one another.
- The classifications can also be validated, for example when several passengers classify the same object. In that case, only those classifications may be adopted on which all, or at least a majority, of the passengers agree.
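The patent does not specify how this agreement check is carried out; a minimal Python sketch of one plausible realization, a majority (or optionally unanimous) vote over the class names entered by several passengers, with all function and parameter names being hypothetical:

```python
from collections import Counter

def validate_classification(votes, require_unanimous=False):
    """Adopt a classification only if the passengers agree sufficiently.

    votes: list of class names entered by different passengers for the same object.
    Returns the adopted class name, or None if there is no sufficient agreement.
    """
    if not votes:
        return None
    # Most frequent label and its vote count.
    label, count = Counter(votes).most_common(1)[0]
    if require_unanimous:
        return label if count == len(votes) else None
    # Strict majority: more than half of all votes.
    return label if count > len(votes) / 2 else None
```

With this sketch, three passengers voting "crane", "crane", "bird" yield the adopted class "crane", while a two-way split ("crane" vs. "bird") yields no adopted classification at all.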
- Fig. 1 is a schematic block diagram of an automated motor vehicle;
- Fig. 2 shows a flowchart of a method for classifying objects.
- The motor vehicle 50 has a device 1 for detecting and classifying objects.
- The device 1 comprises an environment sensor system 2, an evaluation unit 3 and a memory 4. Furthermore, the motor vehicle 50 has output means 5, input means 6 and a monetization device 7.
- The environment sensor system 2 comprises, for example, a large number of cameras, the data of which are transmitted to the evaluation unit 3.
- The evaluation unit 3 determines objects in the data and classifies them.
- The objects relevant to the automated journey, such as motor vehicles driving ahead or obstacles, are transferred to a trajectory planning device 8, the trajectory to be driven then being adapted as a function of the detected and classified objects.
- If the device 1 detects objects that the evaluation unit 3 cannot classify, or cannot classify with sufficient certainty, the objects are brought to the attention of at least one passenger of the automated motor vehicle 50 via the output means 5, and the passenger is asked to classify the object.
- The passenger can then use the input means 6 to classify the object and, if appropriate, enter attributes of the object.
- The classification carried out is then stored in the memory 4, and the passenger's input is transmitted to the monetization device 7. It should be noted here that both the output means 5 and the input means 6 can be mobile, i.e. they do not have to be an integral part of the motor vehicle.
- The classifications stored in the memory 4 can then be used, for example, as training data for neural networks and/or to supplement digital road maps.
- In a first step S1, the detected but unclassifiable object is brought to the attention of at least one passenger of the automated motor vehicle 50, and in a second step S2 the passenger is asked to classify it.
- In a third step S3 the passenger classifies the object, the classified objects finally being stored in a step S4.
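Steps S1 to S4 can be summarized as a simple interaction loop. The following Python sketch uses hypothetical stand-in classes for the output means 5 and input means 6 (the patent prescribes no concrete interfaces):

```python
class Display:
    """Stand-in for the output means 5 (e.g. display unit, head-up display or AR glasses)."""
    def show(self, item):
        print(item)

class Keypad:
    """Stand-in for the input means 6; returns a pre-recorded passenger entry."""
    def __init__(self, entry):
        self.entry = entry
    def read(self):
        return self.entry

def classify_unknown_object(object_image, output_means, input_means, memory):
    output_means.show(object_image)                    # S1: bring the object to the passenger's attention
    output_means.show("Please classify this object.")  # S2: ask the passenger for a classification
    label = input_means.read()                         # S3: passenger enters the classification
    memory.append((object_image, label))               # S4: store the classification (memory 4)
    return label

memory = []
label = classify_unknown_object("camera_frame_0042", Display(), Keypad("crane"), memory)
```

The stored `memory` entries then correspond to the classifications that are later used as training data or map annotations.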
Abstract
The invention relates to a method for classifying objects by means of an automated motor vehicle (50), the automated motor vehicle (50) comprising a device (1) for detecting and classifying objects as well as output and input means (5, 6). An object that is detected but cannot be classified, or cannot be classified unambiguously, is brought to the attention of at least one passenger of the automated motor vehicle (50) via the output means (5); the passenger is asked to classify the detected object, and the classifications made by the passenger or passengers are stored. The invention also relates to an automated motor vehicle (50).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018219125.5 | 2018-11-09 | ||
DE102018219125.5A DE102018219125A1 (de) | 2018-11-09 | 2018-11-09 | Verfahren zum Klassifizieren von Objekten mittels eines automatisiert fahrenden Kraftfahrzeuges und automatisiert fahrendes Kraftfahrzeug |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020094323A1 true WO2020094323A1 (fr) | 2020-05-14 |
Family
ID=68210803
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2019/077313 WO2020094323A1 (fr) | 2018-11-09 | 2019-10-09 | Procédé de classification d'objets au moyen d'un véhicule à moteur à déplacement automatisé et véhicule à moteur à déplacement automatisé |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102018219125A1 (fr) |
WO (1) | WO2020094323A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11875551B2 (en) | 2020-06-09 | 2024-01-16 | Volkswagen Aktiengesellschaft | Collecting and processing data from vehicles |
DE102021002918B4 (de) | 2021-06-07 | 2023-04-06 | Mercedes-Benz Group AG | Verfahren zur Erkennung von für ein Fahrzeug sicherheitsrelevanten Objekten |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170308092A1 (en) * | 2014-10-11 | 2017-10-26 | Audi Ag | Method for operating an automatically driven, driverless motor vehicle and monitoring system |
US20180307925A1 (en) * | 2017-04-20 | 2018-10-25 | GM Global Technology Operations LLC | Systems and methods for traffic signal light detection |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012220146A1 (de) * | 2012-11-06 | 2014-05-22 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Charakterisieren eines Fahrverhaltens eines Fahrers eines Fahrzeugs |
DE102013102087A1 (de) * | 2013-03-04 | 2014-09-04 | Conti Temic Microelectronic Gmbh | Verfahren zum Betrieb eines Fahrerassistenzsystems eines Fahrzeugs |
DE102014004675A1 (de) * | 2014-03-31 | 2015-10-01 | Audi Ag | Gestenbewertungssystem, Verfahren zur Gestenbewertung und Fahrzeug |
DE102014214507A1 (de) * | 2014-07-24 | 2016-01-28 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zur Erstellung eines Umfeldmodells eines Fahrzeugs |
DE102015007493B4 (de) * | 2015-06-11 | 2021-02-25 | Audi Ag | Verfahren zum Trainieren eines in einem Kraftfahrzeug eingesetzten Entscheidungsalgorithmus und Kraftfahrzeug |
2018
- 2018-11-09: DE application DE102018219125.5A patent/DE102018219125A1/de, not_active Withdrawn

2019
- 2019-10-09: WO application PCT/EP2019/077313 patent/WO2020094323A1/fr, active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102018219125A1 (de) | 2020-05-14 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19786556; Country of ref document: EP; Kind code of ref document: A1
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19786556; Country of ref document: EP; Kind code of ref document: A1