EP3486871B1 - Vision system and method for autonomous driving and/or driver assistance in a motor vehicle - Google Patents

Vision system and method for autonomous driving and/or driver assistance in a motor vehicle

Info

Publication number
EP3486871B1
EP3486871B1 (application EP17202090.1A)
Authority
EP
European Patent Office
Prior art keywords
value
data
imaging apparatus
vision system
vehicle
Prior art date
Legal status
Active
Application number
EP17202090.1A
Other languages
English (en)
French (fr)
Other versions
EP3486871A1 (de)
Inventor
Fredrik Medley
Daniel Ankelhed
Hagen Spies
Current Assignee
Veoneer Sweden AB
Original Assignee
Veoneer Sweden AB
Priority date
Filing date
Publication date
Application filed by Veoneer Sweden AB filed Critical Veoneer Sweden AB
Priority to EP17202090.1A priority Critical patent/EP3486871B1/de
Publication of EP3486871A1 publication Critical patent/EP3486871A1/de
Application granted granted Critical
Publication of EP3486871B1 publication Critical patent/EP3486871B1/de
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Definitions

  • the invention relates to a vision system for autonomous driving and/or driver assistance for a motor vehicle, comprising an imaging apparatus adapted to capture images from a surrounding of the motor vehicle, wherein the imaging apparatus has at least one optical axis, and a data processing device performing visual odometry on a plurality of images captured by the imaging apparatus yielding odometric data comprising the ego vehicle translation vector T for each image.
  • odometry is a technique to determine the ego-motion of a moving platform, e.g., a motor vehicle, using cameras.
  • the known calibration of the camera or cameras, i.e., the alignment of the optical axis with the motor vehicle, is crucial for reliable odometry.
  • the motion of a vehicle is typically described by six degrees of freedom: three rotations and a 3D translation vector T.
  • the alignment, i.e., the rotation of the camera/optical axis relative to the world coordinate system, is expressed in terms of three angles: yaw as the rotation around a vertical axis, pitch as the rotation around a lateral axis, and roll as the rotation around a forward axis.
  • the world coordinate system may be defined by the adjustment of the imaging apparatus in the motor vehicle.
  • US Patent US 9,148,650 B2 discloses a method for multi-threaded visual odometry, using 2D-3D correspondences for continuous pose estimation, and combining this pose estimation with 2D-2D epipolar search to replenish 3D points accurately.
  • the method uses a sequence of images captured by a single camera and tracks a defined number of points in the sequence of images.
  • the scale drift is corrected using a mechanism that detects local planarity of the road by combining information from triangulated 3D points and image planar homography.
  • EP 2 181 417 A2 discloses a system for online calibration of a video system using odometric data in the form of a vanishing point.
  • US 9,213,938 B2 discloses a method of estimating the relative pose (rotation and translation) of an object between two images, using two views captured by a calibrated monocular camera, and a collection of pose hypotheses or estimates is generated by utilizing the redundant feature point information from both images.
  • US 2015/0312564 A1 discloses a method of misalignment correction and diagnostic function for a lane sensing sensor.
  • the problem underlying the invention is to provide an effective, simple and reliable calibration of the imaging apparatus with respect to pitch angle and/or yaw angle deviations.
  • the invention has realized that the motor vehicle will on average move straight forward and parallel to the ground. Based on this assumption, the vision system according to the invention uses a technique based on statistical analysis to determine the misalignment and uses the results for a reliable calibration of the imaging apparatus.
  • the invention suggests that the data processing device performs a statistical analysis on the odometric data yielding at least one statistical value, and the data processing device calculates a pitch and/or yaw misalignment of the at least one optical axis by calculating a deviation of said statistical value from a predefined value corresponding to exact and constant forward movement of the ego vehicle.
  • the misalignment and/or an erroneous calibration of the imaging apparatus could be detected and/or compensated by the statistical analysis. It is assumed that on average the motor vehicle moves straight forward. As the vehicle moves, an optical flow is introduced in the images captured by the imaging apparatus. The average straight motion implies that the corresponding average direction of the optical flow is also straight. An odometric result other than an average forward motion indicates a misalignment and/or an erroneous calibration that can be identified by the statistical analysis.
  • the odometric data preferably comprises a plurality of data points, i.e., data sets corresponding to a plurality of images.
  • the odometric data can be stored and/or may be considered as a sequence.
  • the pitch angle and the yaw angle should each equal 0 if the vehicle moves straight forward, rendering the predefined values for yaw angle and pitch angle equal to zero. Any deviation of the statistical value from the predefined values, in this example 0, indicates the misalignment.
  • the data processing device performs a parametrization of the vehicle translation vector T, representing it with two spherical coordinates θ and φ, where θ denotes the yaw angle and φ denotes the pitch angle of the vehicle.
  • the yaw angle θ and/or the pitch angle φ of the vehicle can change in short intervals. However, on average, the yaw angle θ and/or the pitch angle φ is constant, in this particular example zero, and any systematic deviation in the odometric data that is identified by the statistical analysis indicates a misalignment and/or an erroneous calibration.
  • the translation vector T is derived from and/or related to a frame to frame movement.
  • Representing the translation vector T with the two spherical angles θ and φ is advantageous as it requires the storage and/or processing of the data corresponding to the two spherical angles θ and φ only.
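  • For illustration only (this sketch is not part of the patent text): the following Python/NumPy snippet converts a frame-to-frame translation vector T into the yaw angle θ and pitch angle φ using the convention of claim 2, T = (cos θ cos φ, sin θ cos φ, -sin φ), and back; the function names translation_to_angles and angles_to_translation are hypothetical.

```python
import numpy as np

def translation_to_angles(t):
    """Yaw theta and pitch phi from a translation vector t = (tx, ty, tz),
    assuming the convention T = (cos(theta)cos(phi), sin(theta)cos(phi), -sin(phi))."""
    t = np.asarray(t, dtype=float)
    t = t / np.linalg.norm(t)          # only the direction of the translation matters
    theta = np.arctan2(t[1], t[0])     # yaw: rotation around the vertical axis
    phi = -np.arcsin(t[2])             # pitch: rotation around the lateral axis
    return theta, phi

def angles_to_translation(theta, phi):
    """Unit translation direction for a given yaw and pitch."""
    return np.array([np.cos(theta) * np.cos(phi),
                     np.sin(theta) * np.cos(phi),
                     -np.sin(phi)])

# Example: a per-frame translation that is almost, but not exactly, straight forward.
theta, phi = translation_to_angles([0.998, 0.010, -0.005])
print(np.degrees(theta), np.degrees(phi))   # small yaw/pitch deviations in degrees
print(angles_to_translation(theta, phi))    # reproduces the normalized input direction
```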
  • the at least one statistical value comprises the mode value, the mean value, and/or the median value of a statistically distributed quantity, namely of pitch angle and/or yaw angle.
  • the mode, mean, and/or median are statistical identifiers of a peak position of a unimodal distribution and/or histogram of the statistically distributed quantity. A systematic deviation in the odometric data is potentially easily identified by the statistical analysis of the mode, mean, and/or median.
  • the statistical value further comprises a standard deviation, or a value relating to the standard deviation, of a statistically distributed quantity to describe the width of the distribution of the statistically distributed quantity.
  • only image data taken at a time when a turn-rate of the motor vehicle is below a threshold is considered for said visual odometry and statistical analysis, preferably when the turn-rate of the motor vehicle is zero, more preferably, when the vehicle moves straight.
  • only image data taken at a time when the speed of the motor vehicle exceeds a threshold is considered for said visual odometry and statistical analysis.
  • yaw, roll, and/or pitch typically change at a small rate or are constant. This allows a particularly reliable data acquisition without systematic changes in any of the angles induced by the ego-motion of the vehicle. In particular, when the vehicle moves straight, the systematic changes of roll and/or yaw induced by the ego-motion of the vehicle are zero. When the vehicle does not accelerate, the pitch angle rate is zero.
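  • As a minimal, purely illustrative sketch of this gating, assuming hypothetical per-frame signals turn_rate (in rad/s) and speed (in m/s) are available, e.g., from the vehicle data bus (the threshold values are example assumptions, not taken from the patent):

```python
TURN_RATE_MAX = 0.01   # rad/s, illustrative threshold for "driving straight"
SPEED_MIN = 10.0       # m/s, illustrative minimum speed

def accept_sample(turn_rate, speed):
    """Keep an odometric measurement only when the vehicle moves fast enough
    and is not turning noticeably."""
    return abs(turn_rate) < TURN_RATE_MAX and speed > SPEED_MIN

# Filter a sequence of (theta, phi, turn_rate, speed) measurements.
samples = [(0.010, 0.004, 0.002, 22.0),   # accepted
           (0.050, 0.006, 0.150, 22.0),   # rejected: vehicle is turning
           (0.012, 0.003, 0.001, 3.0)]    # rejected: vehicle is too slow
accepted = [(th, ph) for th, ph, tr, v in samples if accept_sample(tr, v)]
print(accepted)
```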
  • the statistical analysis involves the construction of a histogram and/or an estimation of a probability distribution, wherein the probability distribution preferably has the form of a function, like a polynomial, an exponential, and/or a composition thereof, or of a list or table.
  • the odometric data can be used to construct the histogram and/or estimate the probability distribution of a measured and/or estimated quantity.
  • the histogram and/or probability distribution gives a detailed statistical description of the odometric data and allows an identification of reliable and of potentially erroneous data.
  • the bin width of the histogram is in the range between 0.001° and 0.1° in order to construct a histogram with a suitable resolution to perform statistical analysis.
  • the statistical value further comprises a width of a peak of the histogram and/or the probability distribution.
  • the width of the peak could be computed by the standard deviation, the full width at half maximum (FWHM), and/or by other statistical measures, and allows a refined identification of reliable and of potentially erroneous data.
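  • For illustration, a sketch of such a histogram-based peak analysis with a bin width of 0.01° (within the 0.001° to 0.1° range mentioned above); the simulated misalignment of 0.4°, the angle range, and the helper names are assumptions, not values from the patent:

```python
import numpy as np

def angle_histogram(angles_deg, bin_width=0.01, lo=-1.5, hi=1.5):
    """Histogram of angle measurements (in degrees) with a fixed bin width."""
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(angles_deg, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return counts, centers

def peak_mode_and_fwhm(counts, centers):
    """Mode = centre of the most populated bin; width = full width at half maximum."""
    i_peak = int(np.argmax(counts))
    half = counts[i_peak] / 2.0
    above = np.where(counts >= half)[0]
    fwhm = centers[above[-1]] - centers[above[0]]
    return centers[i_peak], fwhm

# Simulated yaw measurements clustering around a 0.4 deg misalignment.
rng = np.random.default_rng(0)
yaw_deg = rng.normal(loc=0.4, scale=0.05, size=2000)
counts, centers = angle_histogram(yaw_deg)
mode, width = peak_mode_and_fwhm(counts, centers)
print(f"mode ~ {mode:.3f} deg, FWHM ~ {width:.3f} deg")
```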
  • the statistical analysis uses a subset of odometric data.
  • the odometric data typically comprises erroneous data that needs to be neglected.
  • a subset is preferably used in order not to take potentially erroneous data into account.
  • One embodiment of the subset is given by the interpercentile range of the odometric data between the q-th and the p-th percentile, with p > q.
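  • An illustrative sketch of such a subset, here the interquartile range between the 25th and 75th percentile (the percentile choice and the simulated data are assumptions):

```python
import numpy as np

def interpercentile_subset(values, q=25.0, p=75.0):
    """Return the values between the q-th and the p-th percentile (p > q),
    discarding the tails that are likely to contain erroneous measurements."""
    lo, hi = np.percentile(values, [q, p])
    values = np.asarray(values)
    subset = values[(values >= lo) & (values <= hi)]
    return subset, hi - lo   # subset and its interquantile range (p-quantile minus q-quantile)

rng = np.random.default_rng(1)
pitch_deg = np.concatenate([rng.normal(0.25, 0.05, 950),    # plausible measurements
                            rng.normal(0.25, 1.50, 50)])    # heavy-tailed outliers
subset, spread = interpercentile_subset(pitch_deg)
print(len(subset), round(float(np.median(subset)), 3), round(float(spread), 3))
```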
  • the at least one statistical value is updated continuously.
  • a continuous update of the odometric data that is used to perform the statistical analysis is preferred.
  • the continuously updated at least one statistical value takes changes with time into account.
  • the vision system 10 is mounted in a motor vehicle and comprises an imaging apparatus 11 for capturing images of a region surrounding the motor vehicle, for example a region in front of the motor vehicle, typically corresponding to the principal locomotion direction 200 of the motor vehicle 2.
  • the imaging apparatus 11 comprises one or more optical imaging devices 12, in particular cameras, preferably operating in the visible and/or infrared wavelength range, where infrared covers near IR with wavelengths below 5 microns and/or far IR with wavelengths beyond 5 microns.
  • the imaging apparatus 11 comprises a plurality of imaging devices 12, in particular forming a stereo imaging apparatus 11. In other embodiments, only one imaging device 12 forming a mono imaging apparatus 11 can be used.
  • the imaging apparatus 11 is coupled to a data processing device 14 adapted to process the image data received from the imaging apparatus 11.
  • the data processing device 14 is preferably a digital device which is programmed or programmable and preferably comprises a microprocessor, a microcontroller, a digital signal processor (DSP), and/or a microprocessor part in a System-On-Chip (SoC) device, and preferably has access to, or comprises, a data memory 25.
  • DSP digital signal processor
  • SoC System-On-Chip
  • the data processing device 14 may comprise a dedicated hardware device, like a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), or an FPGA and/or ASIC part in a System-On-Chip (SoC) device, for performing certain functions, for example controlling the capture of images by the imaging apparatus 11, receiving the electrical signal containing the image information from the imaging apparatus 11, rectifying or warping pairs of left/right images into alignment and/or creating disparity or depth images.
  • the data processing device 14, or part of its functions can be realized by a System-On-Chip (SoC) device comprising, for example, FPGA, DSP, ARM and/or microprocessor functionality.
  • SoC System-On-Chip
  • the data processing device 14 and the memory device 25 are preferably realised in an on-board electronic control unit (ECU) and may be connected to the imaging apparatus 11 via a separate cable or a vehicle data bus.
  • ECU electronic control unit
  • the ECU and one or more of the imaging devices 12 can be integrated into a single unit, where a one-box solution including the ECU and all imaging devices 12 can be preferred. All steps from imaging and image processing to the possible activation or control of the safety device 18 are performed automatically and continuously during driving in real time.
  • Image and data processing carried out in the processing device 14 advantageously comprises identifying and preferably also classifying possible objects (object candidates) in front of the motor vehicle, such as pedestrians, other vehicles, bicyclists and/or large animals, tracking over time the position of objects or object candidates identified in the captured images, and activating or controlling at least one safety device 18 depending on an estimation performed with respect to a tracked object, for example on an estimated collision probability.
  • the safety device 18 may comprise at least one active safety device and/or at least one passive safety device.
  • the safety device 18 may comprise one or more of: at least one safety belt tensioner, at least one passenger airbag, one or more restraint systems such as occupant airbags, a hood lifter, an electronic stability system, at least one dynamic vehicle control system, such as a brake control system and/or a steering control system, a speed control system; a display device to display information relating to a detected object; a warning device adapted to provide a warning to a driver by suitable optical, acoustical and/or haptic warning signals.
  • the invention is applicable to autonomous driving, where the ego vehicle 2 is an autonomous vehicle adapted to drive partly or fully autonomously or automatically, and driving actions of the driver are partially and/or completely replaced or executed by the ego vehicle 2.
  • the data processing device 14 is adapted to perform visual odometry 100 leading to odometric data 101. There is a large variety of algorithms for odometry 100 and all of them could be applied. The accuracy of the estimated motion influences the accuracy of the dynamic calibration and thus of the estimated misalignment 202.
  • the data memory 25 is adapted to store a sequence of images 5 and/or data about the motor vehicle 2.
  • the memory 25 is adapted to store a sequence 102 of odometric data 101, and the sequence 102 comprises odometric data 101 of at least one image 5.
  • In Figure 2, the motor vehicle 2 is shown.
  • the motor vehicle 2 moves along its principal locomotion direction, i.e., its exact forward direction 200.
  • the imaging apparatus 11 is mounted in/on the motor vehicle 2 and is directed along an optical axis 201.
  • the optical axis defines the actual calibration of the imaging apparatus 11.
  • the forward direction 200 and optical axis 201 are not aligned in parallel. That is, the forward direction 200 and the optical axis 201 are related by a rotation around at least one of the three axes spanning a Cartesian coordinate system.
  • the forward direction 200 and the optical axis 201 are in misalignment 202 which is defined by at least one angle.
  • the misalignment 202 characterizes the difference between the actual calibration and the static calibration 310.
  • straight movement results in a direct correspondence between the misalignment 202 and the measured angles, namely the yaw angle θ and the pitch angle φ.
  • the common visual odometry 100 will compute a translational movement caused by the misalignment 202 between the optical axis 201 of the camera and the world coordinate system, which is preferably defined by the forward direction 200 of the motor vehicle 2.
  • the measured yaw and/or pitch angles will cluster at angles corresponding to the current misalignment 202, i.e., the dynamic calibration angles.
  • This misalignment 202 can be identified and taken into account by further data processing, navigation and/or driver assistance.
  • the data processing device 14 is adapted to perform an estimation of the misalignment 202 between the forward direction 200 and the optical axis 201.
  • Figure 3 shows a schematic flow diagram of a preferred embodiment of the present invention.
  • the image 5 is processed by the data processing device 14 and visual odometry 100 is performed. This yields odometric data 101 from which a sequence 102 is formed.
  • the visual odometry 100 is used to generate a histogram 300.
  • the histogram 300 could be updated each time when a new image 5 is analyzed, and/or after a sequence 102 of odometric data 101 has been analyzed.
  • the statistical analysis 104 is preferably performed to estimate the misalignment 202.
  • the misalignment 202 could also be inferred from the histogram 300 directly.
  • only measurements/odometric data 101 where the estimated motion of the motor vehicle 2 comes close to the straight movement assumption are preferably used. This can be done by adding odometric data 101, e.g., angles, to the sequence 102 only when the turn-rate of the vehicle 2 is below a threshold and/or its speed exceeds another threshold.
  • a preferred method to capture the misalignment 202 is to integrate the measurements by means of the histogram 300 and/or a probability distribution 308.
  • Figure 4 shows a histogram 300 in a schematic representation.
  • the histogram 300 records the frequency f, i.e., the number of measurements.
  • Each measurement, e.g., the estimated pitch angle and/or yaw angle, represents a value (called x) and falls into a corresponding bin 309.
  • the number of bins 309 is chosen according to the amount of data, the amount of memory 25, and/or the application. The number of bins 309 is preferably greater than 100, more preferably greater than 250.
  • the histogram 300 has a peak 305 that is defined as the bin 309 with the largest frequency, i.e., the measurement with the largest number of occurrences. Around the peak 305 are reasonable values for measurements. Due to measurement errors, the histogram 300 can get wide/heavy-tailed, i.e., there are measurements that are unlikely to be representative of a realistic and actual observable. Thus, for the statistical analysis 104, only measurements of a subset 306, e.g., a subset 306 of bins 309, are taken into account.
  • the histogram 300 needs to be quantized and this might differ depending on the system, e.g., on the data processing device 14, the memory 25, and/or the application.
  • a quantization level of 1/100° should be sufficient, i.e., the width of a bin 309 equals 1/100°.
  • the histogram 300 may be normalized by the number of samples, i.e., the number of data points in the sequence 102 of odometric data 101, to result in a sample probability density function (pdf) as an estimate of the probability distribution 308, as shown schematically in Figure 5.
  • pdf sample probability density function
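  • A short illustrative sketch of this normalization: dividing the bin counts by the total number of samples (and, to obtain a proper density, additionally by the bin width) yields the sample pdf.

```python
import numpy as np

def sample_pdf(counts, bin_width):
    """Normalize histogram counts to a sample probability density function."""
    counts = np.asarray(counts, dtype=float)
    return counts / (counts.sum() * bin_width)

counts = [2, 10, 40, 80, 40, 10, 2]              # example bin counts
pdf = sample_pdf(counts, bin_width=0.01)
print(bool(np.isclose(pdf.sum() * 0.01, 1.0)))   # the density integrates to one
```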
  • Figure 5 shows a schematic probability distribution 308, p, as a function of the measurement, x.
  • the histogram 300 serves as an estimate of the true probability distribution 308. It has a peak 305 and a well-defined mode 301, mean 302, and median 303.
  • the mode 301 is defined as the position of the peak 305, i.e., the position of the maximum of a unimodal probability distribution 308.
  • the mean 302 is the average value of the probability distribution 308.
  • the median 303 is the value x separating the higher half of the probability distribution 308 from the lower half of the probability distribution 308.
  • the peak 305 has a width 304.
  • the width 304 could be defined by the standard deviation of the probability distribution 308.
  • the odometric data 101 in the subset 306 is adapted to meet a defined confidence level.
  • a preferred confidence estimation is to examine the width 304 of the peak 305 and, if the width 304 is below a threshold, the estimate is considered confident.
  • a robust method to determine the width 304 is given by an interquartile range 307.
  • the probability distribution 308 can be characterized by quantiles, preferably by quartiles.
  • quantiles could be of interest for the statistical analysis 104.
  • a preferred subset 306 is given by the interquartile range.
  • a confidence in the estimated peak 305 position is required. This can preferably be achieved when the calibration parameters, i.e., the estimated misalignment 202, are only updated if the confidence of the odometric data 101 is high.
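  • A sketch of such a confidence gate (the width threshold, the use of the interquartile range as width measure, and the minimum sample count are illustrative assumptions): the calibration is only updated when enough measurements are available and the peak is narrow enough.

```python
import numpy as np

MIN_SAMPLES = 1000       # roughly the number suggested below for an initial estimate
MAX_WIDTH_DEG = 0.2      # illustrative confidence threshold on the peak width

def maybe_update_calibration(current_deg, samples_deg):
    """Return an updated calibration angle only if the estimate is confident."""
    if len(samples_deg) < MIN_SAMPLES:
        return current_deg                      # not enough data yet
    q1, med, q3 = np.percentile(samples_deg, [25, 50, 75])
    if (q3 - q1) > MAX_WIDTH_DEG:               # peak too wide -> not confident
        return current_deg
    return med                                  # confident estimate of the peak position

rng = np.random.default_rng(2)
yaw_deg = rng.normal(0.35, 0.04, 1500)
print(round(float(maybe_update_calibration(0.0, yaw_deg)), 3))
```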
  • Mode 301, mean 302, median 303, width 304, and/or interquantile range 307 can be defined, estimated, and/or computed from the histogram 300 analogously to the respective quantities of the probability distribution 308.
  • Figure 6 shows a flow diagram of an embodiment of the invention applied in/on a motor vehicle 2.
  • the images 5 are taken by a camera.
  • visual odometry 100 is performed by the data processing device 14.
  • the set of odometric data 101 is constructed by the integration of current yaw angle and pitch angle data. Integration means that, as each image 5 is taken, the set of odometric data 101 is enlarged by the odometric data on yaw angle and pitch angle provided by the analysis of the captured image 5.
  • the statistical analysis 104 is performed.
  • the position of the peak 305 of the histogram 300 is extracted, which is a direct measure of the misalignment 202. Based on this, the calibration of the imaging apparatus 11 is updated.
  • the present vision method relies on the statistical analysis 104 of the odometric data 101 and as such requires sufficient samples to be useful and reliable. Therefore, no calibration parameters are extracted before a specified number of measurements have been collected. That is, the sequence 102 preferably exceeds a defined length and/or the odometric data 101 reach a defined data size before the misalignment 202 is estimated. A good number of measurements depends on how the requirements of fast updates and accuracy are weighted; around 1000 measurements will preferably be sufficient for an initial estimate of the misalignment 202.
  • the histogram 300 and/or the probability distribution 308 is preferably updated continuously.
  • a preferred method keeps track of the last N samples in the odometric data 101, and whenever a new odometric measurement is added to the odometric data 101, the oldest measurement is removed.
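  • One possible way to keep track of the last N samples is a fixed-length buffer; the sketch below uses Python's collections.deque, which is an implementation choice and not prescribed by the patent.

```python
from collections import deque

import numpy as np

N = 1000                                  # keep only the most recent N measurements
pitch_window = deque(maxlen=N)            # the oldest entry is dropped automatically

def add_measurement(window, pitch_deg):
    """Append a new odometric measurement; the oldest one falls out of the window."""
    window.append(pitch_deg)
    return float(np.median(window))       # continuously updated statistical value

rng = np.random.default_rng(3)
for x in rng.normal(0.3, 0.05, 2500):     # stream of pitch measurements
    med = add_measurement(pitch_window, x)
print(len(pitch_window), round(med, 3))   # the window holds exactly N samples
```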
  • Figures 7 and 8 show examples of histograms 300 of pitch angle and yaw angle, respectively, each of which are statistically distributed quantities. Both histograms 300 are unimodal and show a well-defined peak 305.
  • the desired pitch angle φ and yaw angle θ can preferably be obtained from the mode 301 (peak position), the mean 302, and/or the median 303 (50th percentile) in the histogram 300 and/or the probability distribution 308.
  • the bins 309 are chosen so that a reliable estimation of the median 303 can be performed.
  • the pitch and yaw angles cover an interval of about 0.05 radian, i.e., about 2.9°.
  • the bin 309 width is about 1/100°.
  • the number of bins 309 in these examples is 290.
  • neither the median 303 nor the peak 305 is centered close to 0° which here describes the predefined value corresponding to a perfect alignment and/or zero misalignment 202 of the optical axis 201 with the forward direction 200.
  • perfect alignment can correspond to a pitch angle different from 0°.
  • a pitch angle different from 0° could also indicate straight movement, as the vehicle changes its pitch under different conditions, e.g., depending on the number of passengers or the weight of the luggage in the vehicle.
  • the misalignment 202 is expressed as the difference between the median 303 and a static calibration 310, which is defined as the position in which the imaging apparatus 11 is fixed. When the misalignment 202 is known, the imaging apparatus 11, the image processing, and/or the visual odometry 100 can be calibrated accordingly.
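  • To close the loop, an illustrative end-to-end sketch: the misalignment is taken as the difference between the median of the measured angles and the static calibration angle, and is then subtracted from newly measured angles; all numbers and function names are assumptions for illustration.

```python
import numpy as np

def estimate_misalignment(measured_deg, static_calibration_deg):
    """Misalignment as the difference between the peak of the measured angle
    distribution (here estimated by the median) and the static calibration angle."""
    return float(np.median(measured_deg)) - static_calibration_deg

def correct_angle(raw_deg, misalignment_deg):
    """Apply the dynamic calibration to a newly measured angle."""
    return raw_deg - misalignment_deg

rng = np.random.default_rng(4)
static_pitch = 0.0                                   # nominal mounting pitch in degrees
measured = rng.normal(0.45, 0.05, 2000)              # pitch angles from visual odometry
mis = estimate_misalignment(measured, static_pitch)
print(round(mis, 3), round(correct_angle(0.47, mis), 3))
```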

Claims (11)

  1. A vision system (10) for autonomous driving and/or driver assistance for a motor vehicle (2), comprising
    - an imaging apparatus (11) adapted to capture images (5) from a surrounding (6) of the motor vehicle (2), wherein
    - the imaging apparatus (11) has at least one optical axis (201), and
    - a data processing device (14) performing visual odometry (100) on a plurality of images (5) captured by the imaging apparatus (11), yielding odometric data (101) comprising the ego vehicle translation vector T (101) for each image (5),
    wherein
    - the data processing device (14) performs a statistical analysis (104) on the odometric data (101) yielding at least one statistical value, and
    - the data processing device (14) calculates a pitch and/or yaw misalignment (202) of the at least one optical axis (201) by calculating a deviation of said statistical value from a predefined value corresponding to exact and constant forward movement of the ego vehicle (2),
    wherein the data processing device (14) performs a parametrization of the vehicle translation vector T, representing it with two spherical coordinates θ and φ, where θ denotes the yaw angle and φ denotes the pitch angle of the vehicle (2), and the statistical analysis (104) involves the construction of a histogram (300) and/or the estimation of a probability distribution (308) of the statistically distributed quantity, namely of the pitch angle φ and/or the yaw angle θ, wherein the at least one statistical value comprises the mode value (301), the mean value (302), and/or the median value (303) of the statistically distributed quantity in the histogram (300) and/or the probability distribution (308), wherein the misalignment (202) is the difference between the mode value (301), the mean value (302), and/or the median value (303) and a static calibration angle (310) of the imaging apparatus (11).
  2. The vision system as claimed in claim 1, characterized in that the data processing device (14) performs the parametrization by representing the vehicle translation vector T = (tx, ty, tz) as T = (cos θ cos φ, sin θ cos φ, -sin φ).
  3. The vision system as claimed in any one of the preceding claims, characterized in that the statistical value further comprises a standard deviation, or a value relating to the standard deviation, of a statistically distributed quantity.
  4. The vision system as claimed in any one of the preceding claims, characterized in that only image data taken at a time when a turn-rate of the motor vehicle (2) is below a threshold is considered for the visual odometry and the statistical analysis.
  5. The vision system as claimed in any one of the preceding claims, characterized in that only image data taken at a time when the speed of the motor vehicle (2) exceeds a threshold is considered for the visual odometry and statistical analysis (104).
  6. The vision system as claimed in any one of the preceding claims, characterized in that the probability distribution (308) has the form of a function or of a list or table.
  7. The vision system as claimed in any one of the preceding claims, characterized in that the bin width of the histogram (300) is in the range between 0.001° and 0.1°.
  8. The vision system as claimed in any one of the preceding claims, characterized in that the statistical analysis (104) uses a subset (306) of the odometric data (101).
  9. The vision system as claimed in claim 8, characterized in that the subset (306) is given by an interquantile range (307) of the odometric data (101), iqr = p - q, where p and q are the p-quantile and the q-quantile, respectively, with p > q.
  10. The vision system as claimed in any one of the preceding claims, characterized in that the at least one statistical value is updated continuously.
  11. A vision method for autonomous driving and/or driver assistance for a motor vehicle (2), comprising
    - capturing images (5) from a surrounding (6) of the motor vehicle (2) with an imaging apparatus (11), wherein
    - the imaging apparatus (11) has at least one optical axis (201), and
    - performing visual odometry (100) with a data processing device (14) on a plurality of images (5) captured by the imaging apparatus (11), yielding odometric data (101) comprising the ego vehicle translation vector T (101) for each image (5),
    - performing a statistical analysis (104) with the data processing device (14) on the odometric data (101), yielding at least one statistical value, and
    - calculating a pitch and/or yaw misalignment (202) of the at least one optical axis (201) with the data processing device (14) by calculating a deviation of the statistical value from a predefined value corresponding to exact and constant forward movement of the ego vehicle (2), wherein a parametrization of the vehicle translation vector T is performed, representing it with two spherical coordinates θ and φ, where θ denotes the yaw angle and φ denotes the pitch angle of the vehicle (2), and the statistical analysis (104) comprises constructing a histogram (300) and/or estimating a probability distribution (308) of a statistically distributed quantity, namely of the pitch angle φ and/or the yaw angle θ, wherein the at least one statistical value comprises the mode value (301), the mean value (302), and/or the median value (303) of the statistically distributed quantity in the histogram (300) and/or the probability distribution (308), wherein the misalignment (202) is the difference between the mode value (301), the mean value (302), and/or the median value (303) and a static calibration angle (310) of the imaging apparatus (11).
EP17202090.1A 2017-11-16 2017-11-16 Bildsystem und verfahren zum autonomen fahren und/oder zur fahrerassistenz in einem kraftfahrzeug Active EP3486871B1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP17202090.1A EP3486871B1 (de) 2017-11-16 2017-11-16 Bildsystem und verfahren zum autonomen fahren und/oder zur fahrerassistenz in einem kraftfahrzeug

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP17202090.1A EP3486871B1 (de) 2017-11-16 2017-11-16 Bildsystem und verfahren zum autonomen fahren und/oder zur fahrerassistenz in einem kraftfahrzeug

Publications (2)

Publication Number Publication Date
EP3486871A1 EP3486871A1 (de) 2019-05-22
EP3486871B1 true EP3486871B1 (de) 2021-05-05

Family

ID=60387871

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17202090.1A Active EP3486871B1 (de) 2017-11-16 2017-11-16 Bildsystem und verfahren zum autonomen fahren und/oder zur fahrerassistenz in einem kraftfahrzeug

Country Status (1)

Country Link
EP (1) EP3486871B1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3789908A1 (de) * 2019-09-05 2021-03-10 Veoneer Sweden AB Monosichtsystem und -verfahren für ein kraftfahrzeug
CN113029128B (zh) * 2021-03-25 2023-08-25 浙江商汤科技开发有限公司 视觉导航方法及相关装置、移动终端、存储介质
CN114088113B (zh) * 2021-11-16 2023-05-16 北京航空航天大学 一种里程计轨迹对齐及精度测评方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110115912A1 (en) * 2007-08-31 2011-05-19 Valeo Schalter Und Sensoren Gmbh Method and system for online calibration of a video system
WO2011163341A2 (en) * 2010-06-22 2011-12-29 University Of Florida Research Foundation, Inc. Systems and methods for estimating pose
US20140139635A1 (en) 2012-09-17 2014-05-22 Nec Laboratories America, Inc. Real-time monocular structure from motion
US9930323B2 (en) * 2014-04-23 2018-03-27 GM Global Technology Operations LLC Method of misalignment correction and diagnostic function for lane sensing sensor

Also Published As

Publication number Publication date
EP3486871A1 (de) 2019-05-22

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191120

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20201204

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1390768

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210515

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017037981

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1390768

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210805

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210906

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210805

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210806

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210905

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017037981

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220208

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210905

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20211116

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211116

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211130

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20211130

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602017037981

Country of ref document: DE

Owner name: ARRIVER SOFTWARE AB, SE

Free format text: FORMER OWNER: VEONEER SWEDEN AB, VARGARDA, SE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211116

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211116

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220630

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20171116

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220630

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231010

Year of fee payment: 7

Ref country code: DE

Payment date: 20231010

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210505