EP3103683A1 - Method for operating a camera-based vehicle system - Google Patents

Method for operating a camera-based vehicle system Download PDF

Info

Publication number
EP3103683A1
EP3103683A1 (application EP16001285.2A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
pattern
camera
image
image area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP16001285.2A
Other languages
German (de)
French (fr)
Other versions
EP3103683B1 (en)
Inventor
Jens Pollmer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of EP3103683A1 publication Critical patent/EP3103683A1/en
Application granted granted Critical
Publication of EP3103683B1 publication Critical patent/EP3103683B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/40Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
    • B60R2300/402Image calibration

Definitions

  • The invention relates to a method for operating a camera-based vehicle system, comprising a camera which is arranged in the vehicle interior to capture the vehicle apron (the area in front of the vehicle) or the vehicle rear field, and a control device that processes the camera images.
  • Modern motor vehicles have a number of vehicle systems that are intended to assist the driver or that are relevant to the operation of the motor vehicle. Some of these vehicle systems are camera-based, that is to say their function is based on camera images of the vehicle surroundings being recorded and analyzed by a control device in order to output corresponding information or messages to the driver, to intervene in the longitudinal or lateral guidance of the vehicle, etc., on the basis of the analyzed information.
  • Such a camera is often installed, for example, in the region of the interior mirror on the windscreen, directed forward to capture the vehicle apron.
  • Occasionally, however, cameras are also installed in the area of the rear of the vehicle, for example in SUVs or vans, to capture the vehicle rear field.
  • Sometimes laterally arranged cameras are provided which capture the lateral surroundings.
  • The cameras can be mounted both inside and outside the vehicle. A part of the motor vehicle is often located in the camera's field of view: for a front camera, usually the hood of the vehicle; for a rear camera, for example, the lower bumper or the like.
  • To avoid interactions and influences of such a vehicle part visible in the camera image on the image evaluation, the position of the hood is defined in advance in today's vehicle systems.
  • The invention is based on the problem of providing a method which is improved in comparison and which makes it possible to increase the system performance.
  • The method according to the invention provides that, in order to determine an information-relevant image area given in at least one camera image, the control device determines in the camera image itself the image area that shows a part of the vehicle; this image area is then disregarded in the processing of subsequently recorded camera images.
  • The invention thus provides for determining the image area to be "cut out" of the overall image, in which the vehicle part is shown, in one or more currently recorded camera images themselves. This makes it possible to dimension the disregarded image area exactly as it actually appears in the respective camera image: the actual representation reflects the real constructive situation, so that exactly the image area in which the vehicle part is shown can be suppressed, without an additional margin added for tolerance reasons.
  • As a result, the information-relevant image area corresponds exactly to the image area that actually contains information which can be detected and evaluated by the control device. Since this information-relevant image area is not prophylactically cropped for safety or tolerance reasons, a larger image area is available for evaluation, so that the system performance can be improved.
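As an illustration of how such a determined vehicle-part area could be excluded from subsequent processing, the following minimal Python sketch builds a boolean relevance mask from a per-column boundary row. The function name and the per-column boundary representation are assumptions for illustration, not part of the patent:

```python
import numpy as np

def build_relevance_mask(height, width, boundary_rows):
    """Boolean mask: True for the information-relevant pixels above the
    per-column boundary row, False for pixels showing the vehicle part."""
    rows = np.arange(height)[:, None]                  # shape (H, 1)
    return rows < np.asarray(boundary_rows)[None, :]   # shape (H, W)

# Example: a 6x4 image whose hood edge rises from row 4 to row 2
mask = build_relevance_mask(6, 4, [4, 3, 3, 2])
```

Subsequent camera images would then only be evaluated where the mask is True, so the vehicle-part pixels never influence the analysis.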
  • In a further development of the invention, the determination of the image area showing a part of the vehicle takes place on the basis of a pattern or object, shown in the camera image or images to be processed, that is visible in the information-relevant image area and concealed in the image area showing the vehicle part; this pattern or object is identified by the control device.
  • The control device processes the camera image(s) using a corresponding algorithm that detects a pattern or object visible in the information-relevant area.
  • This pattern or object extends as far as the dividing line or edge at which the information-relevant image area merges into the other image area showing the vehicle part; that is, beyond this transition the pattern or object is concealed by the vehicle part.
  • The control device is thus able to detect such a pattern or object and thereby locate a point or area where one image area merges into the other.
  • Based on the geometry of the vehicle part, for example the hood, whose design data are known to the control device, the exact course of the edge can then be determined, so that the transition region can be defined precisely.
  • Alternatively, the control device can determine the course of the edge, i.e. the image area transition, via a suitable edge detection algorithm or the like.
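A very simple form of such an edge detection could, for example, look for the strongest vertical intensity step in each image column. This is a hedged sketch under that assumption; a real system would use more robust operators:

```python
import numpy as np

def detect_boundary_rows(gray):
    """For each column, return the row index with the strongest vertical
    intensity step, taken as the candidate hood-edge position."""
    grad = np.abs(np.diff(gray.astype(float), axis=0))  # shape (H-1, W)
    return grad.argmax(axis=0) + 1                      # row just below the step

# Synthetic frame: bright scene above row 4, dark hood from row 4 down
frame = np.full((8, 5), 200.0)
frame[4:, :] = 30.0
rows = detect_boundary_rows(frame)
```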
  • The pattern or object can either be arranged on the vehicle itself, for example by being applied to or placed on the hood, or attached in the edge region.
  • The control device then identifies this pattern or object with a suitable analysis algorithm and can determine from the information obtained the dividing line of the image areas, i.e. the course of the edge of the vehicle part.
  • Alternatively, the pattern or object may be located in front of the vehicle.
  • The vehicle is placed in a correspondingly defined position in front of this pattern or object, after which the control device analyzes the camera image(s), first locating the pattern or object in the image and then the point or area at which it "disappears" at the image area transition. This means that the vehicle part begins in the image where the pattern or object can no longer be recognized.
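The "disappearance" test described above can be sketched as walking down an image column that should show a known reference pattern and returning the last row where the expected values still match. The function name, tolerance, and pattern encoding are illustrative assumptions:

```python
def last_visible_row(column, expected, tol=20):
    """Return the last row index at which the column still matches the
    expected reference-pattern values within a tolerance; below that row
    the pattern is assumed to be covered by the vehicle part."""
    last = -1
    for row, (actual, want) in enumerate(zip(column, expected)):
        if abs(float(actual) - float(want)) <= tol:
            last = row
        else:
            break
    return last

# Alternating light/dark pattern visible in rows 0-4, hood (value 60) below
column = [255, 0, 255, 0, 255, 60, 60, 60]
expected = [255, 0, 255, 0, 255, 0, 255, 0]
```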
  • During the capture of the camera image(s), the pattern or object and the vehicle may both be fixed in position, i.e. static, so that they do not move relative to each other.
  • Alternatively, the pattern or object moves relative to the vehicle, or the vehicle moves relative to the pattern or object.
  • One element thus moves, while the other element is fixed in position.
  • In this case, a series of camera images is taken which shows the pattern or object moving relative to the vehicle, regardless of which element is actually moving; this is a dynamic procedure.
  • When the pattern or object moves relative to the vehicle, it may be moved substantially horizontally: it is then guided relative to the vehicle, for example via a linear guide on the ground, preferably along the vehicle's longitudinal axis. It can even be guided under the vehicle.
  • Alternatively, the pattern or object can also be moved up and/or down, or moved horizontally across the vehicle. In this case, the pattern or object is moved, for example, vertically or laterally along a guide, so that in the course of the movement it is inevitably covered by the vehicle part, for example when coming from above.
  • The respective movement is captured in the camera images, so that the place where the pattern or object "disappears" behind the vehicle part can be detected exactly.
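For this dynamic variant, the evaluation over a series of frames might be sketched as follows: each frame is searched for the pattern, and the lowest image row at which the pattern was ever detected approximates the transition region. This is a simplified 1-D illustration; the names and pixel values are assumptions:

```python
import numpy as np

def transition_row_from_sequence(frames, pattern_value=255, tol=20):
    """Across a series of frames showing a moving reference pattern,
    record the lowest row at which the pattern is still detected; the
    overall maximum approximates the row where the vehicle part begins."""
    lowest = -1
    for frame in frames:
        hits = np.where(np.abs(frame.astype(float) - pattern_value) <= tol)[0]
        if hits.size:
            lowest = max(lowest, int(hits.max()))
    return lowest

# Three frames of one image column: the pattern (255) sinks toward the
# hood (value 60, rows 5-7) and is finally covered completely
frames = [np.array([0, 0, 255, 0, 0, 60, 60, 60]),
          np.array([0, 0, 0, 0, 255, 60, 60, 60]),
          np.array([0, 0, 0, 0, 0, 60, 60, 60])]
```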
  • A horizontal movement transversely along the driver's side is also conceivable; here too, the part of the pattern or object visible in the successive images changes.
  • When the vehicle moves relative to the pattern or object, the pattern or object may in turn be positioned on the ground so that the vehicle drives over it, ensuring that it necessarily disappears from view.
  • The pattern or object may be, for example, a reference pattern, such as a checkerboard pattern printed on a surface, or a simple black-and-white surface with a defined edge.
  • The reference pattern may also be designed as a light column or the like. In any case, it is a pattern or object provided specifically for this purpose and appropriately positioned.
  • Alternatively, the pattern or object may also be a natural pattern or object located in the vehicle apron or vehicle rear field, which is identified by the control device.
  • In this case, a pattern that has an edge structure visible in the image is preferably selected. Conceivable examples are a vertical column, a beam, a piece of equipment or the like, such as can be found in a workshop where the image area determination or calibration is carried out. This alternative is advantageous in that no special patterns or objects need to be kept on hand.
  • The detection of the hood, or of any other component obscuring the view, can also take place via a temporal analysis of the images, i.e. a comparison of a plurality of images from the same built-in camera taken one after the other or at different times: the image area showing, for example, the hood differs less between images than other areas.
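This temporal approach could be sketched as a per-pixel variance test over several frames: pixels showing the static hood barely change over time, while scene pixels do. The threshold and function names are illustrative assumptions:

```python
import numpy as np

def low_variance_mask(frames, threshold=5.0):
    """Flag pixels whose temporal variance across the frames stays below
    a threshold as belonging to the static vehicle part."""
    stack = np.stack([f.astype(float) for f in frames])
    return stack.var(axis=0) < threshold

# Three 4x4 frames: rows 2-3 are a static "hood" (value 50),
# rows 0-1 show changing scene content
frames = []
for i in range(3):
    f = np.full((4, 4), 50.0)
    f[:2, :] = 10.0 * i
    frames.append(f)
mask = low_variance_mask(frames)
```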
  • The invention further relates to a motor vehicle, comprising a camera with an associated control device, which motor vehicle is designed to carry out the method according to the invention.
  • Fig. 1 shows a motor vehicle 1 according to the invention, comprising a vehicle system 2, for example in the form of a driver assistance system or the like, with a control device 3, a camera 4 capturing the vehicle apron and, by way of example, several devices 5 operated via the control device 3, which may be display elements, control elements or other components or assemblies in the vehicle that are controlled or operated via the vehicle system.
  • As described, the camera 4 captures the vehicle apron. Part of the hood 6 is also located in the detection area of the camera.
  • The image area showing the hood 6 naturally cannot contain any information-relevant content. It is therefore necessary to determine exactly where the boundary runs between the image area containing information-relevant content and the image area showing the vehicle part, here the hood 6.
  • Fig. 2 shows by way of example a camera image 7, which is recorded via the camera 4.
  • In this example, a reference pattern 8 is arranged in the region of the vehicle front and extends to below the hood 6.
  • The dashed area of the reference pattern 8 is concealed by the hood 6.
  • The control device 3 is now able to determine the boundary line 11 where one image area 9 merges into the other image area 10. This is done by analyzing the camera image 7 to detect the reference pattern 8, which here is divided like a checkerboard into black and white fields. The control device 3 determines exactly the region 12 in which the regular structure, namely the reference pattern 8, coming from the upper image area 9, virtually disappears. From exactly this image region 12 onward, the reference pattern 8 is covered by the hood 6.
  • Starting from there, the control device can determine the remaining course of the hood edge, i.e. the line 11. This can be done either by using known geometry data of the hood available to the control device 3: since the hood 6 has a defined, characteristic contour, the entire course of the line 11 can be derived from this contour, starting from the region 12. Alternatively, it is conceivable to determine the course of the line 11, which defines a sharp optical separation of the image areas 9 and 10, via a corresponding edge detection algorithm, again by image analysis. Finally, if the boundary line is virtually rectilinear due to the geometry of the hood 6, the determined image line of the region 12 could simply be used as the dividing line between the image areas 9 and 10.
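Extending the boundary line 11 from the detected region 12 using known hood geometry could be sketched as adding a stored relative edge profile to the detected anchor row. The profile values below are hypothetical stand-ins for design data:

```python
import numpy as np

def boundary_from_anchor(anchor_row, edge_profile):
    """Reconstruct the full boundary line across the image width from the
    transition row detected at a reference column plus the known relative
    hood-edge profile (assumed to come from design data)."""
    return anchor_row + np.asarray(edge_profile)

# Hypothetical profile: the hood edge curves 0-3 pixels lower toward the sides
line = boundary_from_anchor(10, [3, 1, 0, 1, 3])
```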
  • The control device 3 can now define the information-relevant image area 9, so that within the framework of the subsequent processing of the image content of the image area 9, the information that will be used to control the devices 5 can be obtained from it.
  • An influence on the image analysis by the image area 10 is excluded, since this image area is disregarded during processing. Consequently, almost the maximum relevant image area 9 is used for the analysis, so that the greatest possible information depth can be captured.
  • At the same time, the analysis of the image area showing the hood 6 is avoided, which would be time-consuming and unnecessary because that area contains no relevant information.
  • Fig. 3 shows, in the form of a schematic representation, an alternative to the fixed arrangement of the reference pattern 8 on the vehicle front shown in Fig. 1.
  • Here, the reference pattern 8 is moved vertically up and down along a guide, as shown by the arrow P1.
  • The camera 4 can detect the reference pattern until it is virtually covered by the vehicle front, here again the hood 6, which is the case when the reference pattern is lowered below the line 13.
  • For the analysis, the control device 3 now uses a multiplicity of images from the camera 4 taken during the lowering movement and in turn determines the corresponding transition region 12 in order to define the image areas 9 and 10 exactly.
  • A reference pattern 8 is again provided, which here, however, is horizontally movable, as shown by the arrow P2. It is located in front of the motor vehicle 1.
  • The reference pattern 8 is pushed along the arrow P2 toward the motor vehicle 1 and, passing the line 13 shown, increasingly disappears as it approaches.
  • The control device 3 analyzes a plurality of camera images and in this way detects the exact movement path of the reference pattern 8 and thus also the transition region 12, on the basis of which the image areas 9 and 10 are then defined.
  • Finally, a reference pattern 8 is shown which is static, i.e. does not move relative to the vehicle, which is likewise fixed in position.
  • The reference pattern 8 is designed here as a rod or bar. It has a first section 8a which, being static, is permanently visible in the image of the camera 4. The second, lower section 8b is covered by the hood 6 and is therefore not visible in the image. The sections 8a and 8b are separated by the line 13.
  • The control device 3 now detects the reference pattern 8, and thus the section 8a, in the camera image and can thereby define the image areas 9 and 10 exactly and distinguish them from one another.
  • The embodiment according to Fig. 5 corresponds substantially to the embodiment according to Fig. 1, with the sole difference that the reference pattern 8 is not attached to the vehicle itself, as in Fig. 1, but is arranged in a fixed position in front of the vehicle.
  • The determination of the position of the line 11, i.e. the definition of the image areas 9 and 10, can be carried out once at the factory when the system is first set up or calibrated. However, it is also possible to repeat this calibration regularly during workshop visits.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Method for operating a camera-based vehicle system, comprising a camera (4) which is arranged in the vehicle interior to capture the vehicle apron or the vehicle rear field, and a control device (3) processing the camera images, wherein, in order to determine an information-relevant image area (9) given in at least one camera image (7), the control device (3) determines in the camera image (7) itself an image area (10), shown in the camera image, that shows a part of the vehicle (1), which image area (10) is disregarded in the processing of subsequently captured camera images (7).

Description

The invention relates to a method for operating a camera-based vehicle system, comprising a camera which is arranged in the vehicle interior to capture the vehicle apron or the vehicle rear field, and a control device processing the camera images.

Modern motor vehicles have a number of vehicle systems that are intended to assist the driver or that are relevant to the operation of the motor vehicle. Some of these vehicle systems are camera-based, that is to say their function is based on camera images of the vehicle surroundings being recorded and analyzed by a control device in order to output corresponding information or messages to the driver, to intervene in the longitudinal or lateral guidance of the vehicle, etc., on the basis of the analyzed information.

Such a camera is often installed, for example, in the region of the interior mirror on the windscreen, directed forward to capture the vehicle apron. Occasionally, however, cameras are also installed in the area of the rear of the vehicle, for example in SUVs or vans, to capture the vehicle rear field. Sometimes laterally arranged cameras are provided which capture the lateral surroundings. The cameras can be mounted both inside and outside the vehicle. A part of the motor vehicle is often located in the camera's field of view: for a front camera, usually the hood of the vehicle; for a rear camera, for example, the lower bumper or the like. To avoid interactions and influences of a vehicle part visible in the camera image, for example the hood, on the image evaluation and thus on the information retrieval, the position of the hood is defined in advance in today's vehicle systems.

It is determined from the design data of the motor vehicle and the known mounting position of the camera, and defined once. However, these dimensions are subject to tolerances, so a safety buffer must be kept in the system to ensure that these tolerances do not cause a small portion of the hood to reappear in the processed image area. More is therefore effectively "cut out" of the recorded camera image than necessary in order to exclude the hood or another vehicle part from the image processing, which restricts the system performance.

The invention is based on the problem of providing a method which is improved in comparison and which makes it possible to increase the system performance.

To solve this problem, the method according to the invention provides that, in order to determine an information-relevant image area given in at least one camera image, the control device determines in the camera image itself the image area, shown in the camera image, that shows a part of the vehicle; this image area is then disregarded in the processing of subsequently recorded camera images.

The invention provides for determining the image area to be "cut out" of the overall image, in which the vehicle part is shown, in one or more currently recorded camera images themselves. This makes it possible to dimension the disregarded image area exactly as it actually appears in the respective camera image: the actual representation reflects the real constructive situation, so that exactly the image area in which the vehicle part is shown can be suppressed, without an additional margin added for tolerance reasons.

The result of this is that the information-relevant image area corresponds exactly to the image area that actually contains information which can be detected and evaluated by the control device. Since this information-relevant image area is not prophylactically cropped for safety or tolerance reasons, a larger image area is available for evaluation, so that the system performance can be improved.

In a further development of the invention, it can be provided that the determination of the image area showing a part of the vehicle takes place on the basis of a pattern or object, shown in the camera image or images to be processed, that is visible in the information-relevant image area and concealed in the image area showing the vehicle part; this pattern or object is identified by the control device. The control device processes the camera image(s) using a corresponding algorithm that detects a pattern or object visible in the information-relevant area. This pattern or object extends as far as the dividing line or edge at which the information-relevant image area merges into the other image area showing the vehicle part; that is, beyond this transition the pattern or object is concealed by the vehicle part. The control device is thus able to detect such a pattern or object and thereby locate a point or area where one image area merges into the other. Based on the geometry of the vehicle part, for example the hood, whose design data are known to the control device, the exact course of the edge can then be determined, so that the transition region can be defined precisely. Alternatively to the determination on the basis of the geometry data of the vehicle part, it is also conceivable that the control device determines the course of the edge, i.e. the image area transition, via a suitable edge detection algorithm or the like.

The pattern or object can either be arranged on the vehicle itself, for example by being applied to or placed on the hood, or attached in the edge region. The control device then identifies this pattern or object with a suitable analysis algorithm and can determine from the information obtained the dividing line of the image areas, i.e. the course of the edge of the vehicle part.

Alternatively to arranging the pattern or object on the vehicle itself, it is also conceivable for the pattern or object to be located in front of the vehicle. The vehicle is placed in a correspondingly defined position in front of this pattern or object, after which the control device analyzes the camera image(s), first locating the pattern or object in the image and then the point or area at which it "disappears" at the image area transition. This means that the vehicle part begins in the image where the pattern or object can no longer be recognized.

During the capture of the camera image(s), the pattern or object and the vehicle may both be fixed in position, i.e. static, so that they do not move relative to each other. Alternatively, it is conceivable that the pattern or object moves relative to the vehicle, or the vehicle relative to the pattern or object. One element thus moves while the other is fixed in position. In this case, a series of camera images is taken which shows the pattern or object moving relative to the vehicle, regardless of which element is actually moving; this is a dynamic procedure.

If the pattern or object moves relative to the vehicle, it may be moved essentially horizontally. This means that it is guided relative to the vehicle, for example on a linear guide on the ground, preferably of course along the vehicle's longitudinal axis; it may even be guided underneath the vehicle. As an alternative to a quasi-horizontal movement, the pattern or object may also be moved up and/or down, or horizontally across the vehicle. In this case, the pattern or object is moved, for example, vertically or laterally along a guide, so that in the course of the movement, for example coming from above, it is inevitably covered by the vehicle part. The respective movement is captured in the camera images, so that the location where the pattern or object virtually "disappears" behind the vehicle part can be detected exactly.
A horizontal movement transversely along the driver's side is likewise conceivable; here too, the part of the pattern or object visible in the successively recorded images changes.
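The dynamic variant can be illustrated with a minimal, purely synthetic sketch: a bright marker standing in for the pattern is lowered one image row per frame, and the last row at which it is still detectable marks the start of the occluding vehicle part. All names, thresholds, and image values here are hypothetical illustration choices, not part of the patented method.

```python
import numpy as np

HOOD_TOP = 70  # synthetic ground truth: hood covers rows >= 70

def make_frame(marker_row, h=100, w=50):
    """One synthetic frame: dark scene, uniform hood surface, and a
    bright marker at `marker_row` that vanishes behind the hood."""
    f = np.full((h, w), 20.0)
    f[HOOD_TOP:, :] = 50.0
    if marker_row < HOOD_TOP:   # marker is visible only above the hood
        f[marker_row, :] = 255.0
    return f

def detect_marker(frame, thresh=200.0):
    """Row index of the bright marker, or None if it is not visible."""
    rows = np.where(frame.max(axis=1) > thresh)[0]
    return int(rows[0]) if rows.size else None

# Lower the marker one row per frame; the last row where it is still
# detected sits directly above the occlusion boundary (region 12).
visible = [detect_marker(make_frame(r)) for r in range(100)]
last_seen = max(r for r in visible if r is not None)
print(last_seen + 1)  # -> 70, estimated first hood row
```

The detection itself never uses `HOOD_TOP`; that constant only generates the synthetic frames, so the estimate genuinely comes from the image sequence.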

If the vehicle moves relative to the pattern or object, the pattern or object may in turn be positioned lying on the ground while the vehicle drives over it, ensuring that it necessarily disappears from view.

The pattern or object may, for example, be a reference pattern, for instance in the form of a checkerboard pattern printed on a surface, or a simple black-and-white surface with a defined edge. Alternatively, the reference pattern may be implemented as a light column or the like. In any case, it is a pattern or object provided and positioned specifically for this purpose.

Alternatively, the pattern or object may be a natural pattern or object located in the area in front of or behind the vehicle, which is determined by the control device. Such a pattern is preferably chosen so that it has an edge structure visible in the image. Conceivable examples are a vertical column, a beam, a piece of furniture or the like, as found, for example, in a workshop or similar location where the image area determination or calibration is carried out for setup purposes. This alternative is advantageous in that no special patterns or objects need to be kept available.

The detection of the hood, or of another unit obscuring the view, can also take place via a temporal analysis of the images, that is, by comparing several images taken in succession or at different times by the same installed camera against one another: the area showing, for example, the engine hood varies less over time than other areas.
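This temporal comparison can be sketched as follows: over a stack of frames from the same fixed camera, the per-pixel variation over time is computed, and rows whose variation stays low are classified as the near-static hood. The frame generator, threshold, and variability measure are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stack of frames from the same installed camera: the scene in the
# upper rows changes from frame to frame, the hood rows barely do.
n_frames, h, w = 25, 40, 30
frames = np.empty((n_frames, h, w))
for i in range(n_frames):
    frames[i, :25, :] = rng.uniform(0.0, 255.0, (25, w))      # changing scene
    frames[i, 25:, :] = 80.0 + rng.normal(0.0, 0.5, (15, w))  # static hood

# Per-row temporal variability; rows below the threshold are treated
# as the near-static, view-obscuring vehicle part.
variability = frames.std(axis=0).mean(axis=1)
hood_rows = np.where(variability < 5.0)[0]
print(int(hood_rows.min()))  # -> 25, first row classified as hood
```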

In addition to the method itself, the invention further relates to a motor vehicle comprising a camera with an associated control device, the motor vehicle being designed to carry out the method according to the invention.

Further advantages and details of the invention will become apparent from the exemplary embodiments described below and with reference to the drawings, in which:

Fig. 1
shows a schematic diagram of a motor vehicle according to the invention,
Fig. 2
a schematic diagram of a camera image with the reference pattern shown therein,
Fig. 3
a schematic diagram of a method variant with a vertically movable reference pattern,
Fig. 4
a schematic diagram of a method variant with a horizontally movable reference pattern, and
Fig. 5
a schematic diagram of a method variant with a static reference pattern.

Fig. 1 shows a motor vehicle 1 according to the invention, comprising a vehicle system 2, for example in the form of a driver assistance system or the like, with a control device 3, a camera 4 capturing the area in front of the vehicle, and, by way of example, several devices 5 operated via the control device 3; these may be a display element, an actuating element, or other components or assemblies in the vehicle that are controlled or operated via the vehicle system.

As described, the camera 4 captures the area in front of the vehicle. Part of the engine hood 6 also lies within the camera's detection range. The image area showing the hood 6 inevitably cannot contain any information-relevant content. It must therefore be determined exactly where the boundary runs between the image area containing information-relevant content and the image area showing the vehicle part, here the hood 6.

Fig. 2 shows, by way of example, a camera image 7 recorded via the camera 4. As Fig. 1 shows, a reference pattern 8 is arranged on the vehicle 1 in the region of the vehicle front and extends to below the hood 6. In Fig. 2 the reference pattern 8 is therefore only partially visible; the dashed portion of the reference pattern 8 is covered by the hood 6. There are consequently two image areas: the upper image area 9, containing analysis-relevant information, and the lower image area 10, which ultimately shows the hood 6.

The control device 3 is now able to determine the boundary line 11 where one image area 9 transitions into the other image area 10. This is done by analyzing the camera image 7 to detect the reference pattern 8, which here is divided checkerboard-style into black and white fields. The control device 3 determines exactly the region 12 in which the regular structure, namely the reference pattern 8, virtually disappears coming from the upper image area 9. From precisely this image region 12 onwards, the reference pattern 8 is thus covered by the hood 6.
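The search for region 12 can be illustrated by a minimal sketch that scans the image from top to bottom for the row at which the high-contrast checkerboard-like structure gives way to the uniform hood surface; the variance criterion, threshold, and synthetic image are hypothetical simplifications of the analysis performed by the control device.

```python
import numpy as np

def find_transition_row(img, threshold=100.0):
    """First row (from the top) whose horizontal intensity variance
    drops below `threshold`, i.e. where the high-contrast reference
    pattern is no longer visible and the uniform hood begins."""
    for r in range(img.shape[0]):
        if img[r].var() < threshold:
            return r
    return img.shape[0]  # pattern visible in every row

# Synthetic camera image 7: alternating bright/dark columns in the
# upper 60 rows (image area 9), a uniform hood below (image area 10).
img = np.zeros((100, 80))
img[:60, ::2] = 255.0
img[60:, :] = 40.0

print(find_transition_row(img))  # -> 60
```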

Once the image row or pixel at which this transition region 12 lies is known, the control device can determine the remaining course of the hood edge, i.e. line 11. This can be done by falling back on known geometry data of the hood available to the control device 3: the hood 6 has a defined, characteristic contour, from which the entire course of line 11, starting from region 12, can be determined. Alternatively, it is conceivable to determine the course of line 11, which after all defines a sharp optical separation of image areas 9 and 10, via a corresponding edge detection algorithm, again by image analysis. Finally, if the boundary line is virtually straight due to the geometry of the hood 6, the determined image row of region 12 could simply be used as the dividing line between image areas 9 and 10.
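The edge-detection alternative for tracing line 11 can be sketched as a per-column search for the strongest vertical intensity step. A real system would use a more robust edge detector; this function, the image values, and the curved edge are only illustrative stand-ins.

```python
import numpy as np

def hood_edge_per_column(img):
    """Per-column row index of the strongest vertical intensity step,
    a minimal stand-in for an edge-detection pass along line 11."""
    grad = np.abs(np.diff(img, axis=0))   # vertical gradient
    return grad.argmax(axis=0) + 1        # first row below the step

# Synthetic image: bright scene above a darker, curved hood whose
# edge rises towards the image borders.
edge = np.array([30, 28, 27, 26, 26, 26, 27, 28, 30])
img = np.full((40, edge.size), 200.0)
for c in range(edge.size):
    img[edge[c]:, c] = 50.0

print(hood_edge_per_column(img).tolist())
```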

Independently of this, the control device 3 can now define the information-relevant image area 9, so that during subsequent processing of the image content, information is extracted only from image area 9 and subsequently used to control the devices 5. Any influence of image area 10 on the image analysis is excluded, since this image area remains disregarded during processing. Consequently, nearly the maximum relevant image area 9 is used as the basis for the analysis, so that the greatest possible information depth can be captured. At the same time, analysis of image area 10 is avoided, which would be time-consuming and unnecessary, since it contains no relevant information.
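Excluding image area 10 from subsequent processing can be sketched as a simple boolean mask built from the determined course of line 11; the function name and the small example geometry are hypothetical.

```python
import numpy as np

def relevant_mask(shape, line_11):
    """Boolean mask that is True only in image area 9, i.e. strictly
    above the per-column hood edge given by `line_11`."""
    rows = np.arange(shape[0])[:, None]
    return rows < np.asarray(line_11)[None, :]

# Hypothetical 40-row image with a slightly curved hood edge.
mask = relevant_mask((40, 3), [30, 28, 30])
print(int(mask.sum()))  # -> 88 analysis-relevant pixels
```

Subsequent image processing would then operate only on pixels where the mask is True, so the hood region cannot influence the analysis.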

Fig. 3 shows, in the form of a schematic diagram, an alternative to the fixed arrangement of the reference pattern 8 on the vehicle front shown in Fig. 1. In this embodiment, the reference pattern 8 can be moved vertically up and down along a guide, as indicated by arrow P1. The camera 4 can detect the reference pattern until it is virtually covered by the vehicle front, here again the hood 6, which is the case when the reference pattern has been lowered below line 13. The control device 3 then analyzes a multiplicity of images from the camera 4 recorded during the lowering movement and thereby again determines the corresponding transition region 12 in order to define image areas 9 and 10 exactly.

In the embodiment according to Fig. 4, a reference pattern 8 is again provided, which here, however, is horizontally movable, as indicated by arrow P2, and is located in front of the motor vehicle 1. The reference pattern 8 is pushed along arrow P2 towards the motor vehicle 1 and passes the line 13 shown, thus disappearing as it approaches. The control device 3 in turn analyzes a multiplicity of camera images and in this way captures the exact movement path of the reference pattern 8 and thus also the transition region 12, based on which image areas 9 and 10 are then defined.

In the embodiment according to Fig. 5, finally, a reference pattern 8 is shown that is static, i.e. does not move relative to the vehicle, which is likewise fixed in position. The reference pattern 8 is here implemented as a rod or bar. It has a first section 8a which, being static, is permanently visible in the image of the camera 4. The second, lower section 8b is covered by the hood 6 and is therefore not visible in the image. Sections 8a and 8b are separated by line 13.

The control device 3 now detects the reference pattern 8, and thus section 8a, in the camera image and can thereby define image areas 9 and 10 exactly and distinguish them from one another.

The embodiment according to Fig. 5 corresponds substantially to the embodiment according to Fig. 1, with the sole difference that the reference pattern 8 is not attached to the vehicle itself, as in Fig. 1, but is arranged in a fixed position in front of the vehicle.

Instead of moving the reference pattern 8, it is of course also possible to move the motor vehicle 1 relative to the positionally fixed reference pattern.

The determination of the position of line 11, i.e. the definition of image areas 9 and 10, can be carried out once at the factory when the system is initially set up or calibrated. However, it is also possible to carry out this calibration regularly during workshop visits.

Claims (11)

Method for operating a camera-based vehicle system, comprising a camera (4) arranged in or on the vehicle to capture the vehicle surroundings, and a control device (3) processing the camera images,
characterized in that,
in order to determine an information-relevant image area (9) given in at least one camera image (7), the control device (3) determines in the camera image (7) itself an image area (10) shown therein and showing a part of the vehicle (1), which image area (10) is disregarded during the processing of subsequently recorded camera images (7).
Method according to claim 1,
characterized in that
the image area (10) showing a part of the vehicle (1) is determined by means of a pattern or object (8) shown in the camera image(s) (7) to be processed, which is visible in the information-relevant image area (9) and covered in the image area (10) showing the part of the vehicle (1), which pattern or object (8) is detected by the control device (3).
Method according to claim 2,
characterized in that
the pattern or object (8) is arranged on the vehicle (1) or is located in front of the vehicle (1).
Method according to claim 3,
characterized in that,
during the recording of the camera images (7), the pattern or object (8) and the vehicle (1) are fixed in position, or the pattern or object (8) moves relative to the vehicle (1) or the vehicle (1) relative to the pattern or object (8).
Method according to claim 4,
characterized in that,
when the pattern or object (8) moves relative to the vehicle (1), the pattern or object (8) is moved essentially horizontally and, where applicable, guided underneath the vehicle (1), or the pattern or object (8) is moved up and/or down.
Method according to claim 4,
characterized in that,
when the vehicle (1) is moved relative to the pattern or object (8), the vehicle (1) drives over the pattern or object (8).
Method according to one of the preceding claims,
characterized in that
the pattern or object is a reference pattern (8).
Method according to one of claims 1 to 6,
characterized in that
the pattern or object is a natural pattern or object located in the area in front of or behind the vehicle, which is determined by the control device (3).
Method according to claim 8,
characterized in that
a pattern or object is selected which has an edge structure visible in the image.
Method according to claim 1,
characterized in that
the part of the motor vehicle shown in the camera image is determined by comparing images recorded one after the other in time.
Motor vehicle, comprising a camera (4) with an associated control device (3), designed to carry out the method according to one of the preceding claims.
EP16001285.2A 2015-06-10 2016-06-08 Method for operating a camera-based vehicle system Active EP3103683B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102015007391.5A DE102015007391A1 (en) 2015-06-10 2015-06-10 Method for operating a camera-based vehicle system

Publications (2)

Publication Number Publication Date
EP3103683A1 true EP3103683A1 (en) 2016-12-14
EP3103683B1 EP3103683B1 (en) 2018-08-15

Family

ID=56116179

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16001285.2A Active EP3103683B1 (en) 2015-06-10 2016-06-08 Method for operating a camera-based vehicle system

Country Status (2)

Country Link
EP (1) EP3103683B1 (en)
DE (1) DE102015007391A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017220492B4 (en) 2017-11-16 2019-08-14 Audi Ag Method and device for verifying sensor data in an environment of a vehicle

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0691599A2 (en) * 1994-07-05 1996-01-10 Hitachi, Ltd. Environment recognition device provided with image pickup device
DE102005003254A1 (en) * 2005-01-24 2006-08-10 Sick Ag Optoelectronic sensor for protecting e.g. press brake, has locally resolving optoelectronic receiver, and evaluation and control unit coupled with another receiver, where operating parameter is identifiable by unit based on received signals
DE102008028303A1 (en) * 2007-06-15 2009-01-08 Denso Corp., Kariya-shi Display system and program
DE102009059797A1 (en) * 2009-12-21 2011-06-22 Krauss-Maffei Wegmann GmbH & Co. KG, 80997 Optical marker and object detection device
DE102012103495A1 (en) * 2012-03-29 2013-10-02 Sick Ag Optoelectronic device for measuring structure or object sizes and method for calibration
DE102012214283A1 (en) * 2012-08-10 2014-02-13 Siemens Aktiengesellschaft Method for supporting target recording area setting of imaging device e.g. X-ray device, involves adjusting recording area of imaging device based on detecting position of markers representing target recoding area of object
WO2015041005A1 (en) * 2013-09-19 2015-03-26 富士通テン株式会社 Image generation device, image display system, image generation method, and image display method




Legal Events

PUAI  Public reference made under Article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
STAA  Status: the application has been published
AK    Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX    Request for extension of the European patent; extension states: BA ME
STAA  Status: request for examination was made
17P   Request for examination filed; effective date: 2017-06-14
RBV   Designated contracting states (corrected): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
GRAP  Despatch of communication of intention to grant a patent (original code: EPIDOSNIGR1)
STAA  Status: grant of patent is intended
RIC1  Information provided on IPC code assigned before grant: B60R 1/00 (2006.01) AFI20180411BHEP; G06T 7/00 (2006.01) ALI20180411BHEP; G06K 9/78 (2006.01) ALI20180411BHEP; G06K 19/06 (2006.01) ALI20180411BHEP; G06K 9/80 (2006.01) ALI20180411BHEP
INTG  Intention to grant announced; effective date: 2018-05-07
GRAS  Grant fee paid (original code: EPIDOSNIGR3)
GRAA  (Expected) grant (original code: 0009210)
STAA  Status: the patent has been granted
AK    Designated contracting states (kind code of ref document: B1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG   Reference to a national code: CH, legal event code EP; GB, legal event code FG4D (not English); AT, legal event code REF, ref document no. 1029361, kind code T, effective date 2018-08-15
REG   Reference to a national code: IE, legal event code FG4D (language of EP document: German)
REG   Reference to a national code: DE, legal event code R096, ref document no. 502016001625
REG   Reference to a national code: NL, legal event code MP; effective date: 2018-08-15
REG   Reference to a national code: LT, legal event code MG4D
PG25  Lapsed in a contracting state (failure to submit a translation of the description or to pay the fee within the prescribed time limit): NL, SE, LT, FI, RS (2018-08-15); BG, NO (2018-11-15); GR (2018-11-16); IS (2018-12-15)
PG25  Lapsed (translation not submitted / fee not paid in time): LV, HR, AL (2018-08-15)
PG25  Lapsed (translation not submitted / fee not paid in time): RO, CZ, IT, EE, ES, PL (2018-08-15)
REG   Reference to a national code: DE, legal event code R097, ref document no. 502016001625
PG25  Lapsed (translation not submitted / fee not paid in time): SK, DK, SM (2018-08-15)
PLBE  No opposition filed within time limit (original code: 0009261)
STAA  Status: no opposition filed within time limit
26N   No opposition filed; effective date: 2019-05-16
PG25  Lapsed (translation not submitted / fee not paid in time): SI (2018-08-15)
PG25  Lapsed (translation not submitted / fee not paid in time): MC (2018-08-15)
REG   Reference to a national code: CH, legal event code PL
REG   Reference to a national code: BE, legal event code MM; effective date: 2019-06-30
PG25  Lapsed (translation not submitted / fee not paid in time): TR (2018-08-15)
PG25  Lapsed (non-payment of due fees): IE (2019-06-08)
PG25  Lapsed (non-payment of due fees): LU (2019-06-08); LI, CH, BE (2019-06-30)
PG25  Lapsed (translation not submitted / fee not paid in time): PT (2018-12-15)
PG25  Lapsed (translation not submitted / fee not paid in time): CY (2018-08-15)
PG25  Lapsed (translation not submitted / fee not paid in time): HU (2016-06-08, invalid ab initio); MT (2018-08-15)
PG25  Lapsed (translation not submitted / fee not paid in time): MK (2018-08-15)
REG   Reference to a national code: AT, legal event code MM01, ref document no. 1029361, kind code T, effective date 2021-06-08
PG25  Lapsed (non-payment of due fees): AT (2021-06-08)
P01   Opt-out of the competence of the Unified Patent Court (UPC) registered; effective date: 2023-05-30
PGFP  Annual fee paid to national office: FR, payment date 2023-06-23 (year of fee payment: 8); DE, payment date 2023-06-30 (year of fee payment: 8)
PGFP  Annual fee paid to national office: GB, payment date 2024-06-25 (year of fee payment: 9)