CN110161523A - Method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle - Google Patents

Method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle

Info

Publication number
CN110161523A
CN110161523A
Authority
CN
China
Prior art keywords
light distribution
light
isolation
active triangulation
triangulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910083363.9A
Other languages
Chinese (zh)
Other versions
CN110161523B (en)
Inventor
C·施奈德
S·泽纳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dr Ing HCF Porsche AG
Original Assignee
Dr Ing HCF Porsche AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dr Ing HCF Porsche AG filed Critical Dr Ing HCF Porsche AG
Publication of CN110161523A
Application granted
Publication of CN110161523B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2504 Calibration devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The invention relates to a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle, comprising the following steps: a) checking whether basic activation conditions for switching on active triangulation are met and, if so, b) projecting a light distribution with characteristic features into the scene in front of the vehicle, by means of which a light-based artificial field of view can be provided, and recording the light distribution with a camera, c) extracting the characteristic features from the light distribution generated in the previous step in the camera image by means of image processing software, d) recording and mathematically describing the isolation regions in the light distribution, e) determining the position and continuity of the isolation regions that provide the light-based artificial field of view, f) checking whether the activation conditions for switching on the active triangulation function are met and, if so, switching on the active triangulation function.

Description

Method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle
Technical field
The invention relates to a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle.
Background art
In the development of motor vehicles, so-called matrix headlight systems are playing an increasingly important role. Such a system usually has two matrix headlights arranged in the front region of the vehicle. Each matrix headlight comprises a matrix of pixels that can be selectively activated or deactivated, preferably also dimmed. Depending on the technology used, future matrix headlight systems are expected to achieve pixel resolutions of tens of thousands or even hundreds of thousands of pixel units. The pixel matrix makes entirely new lighting functions possible. One example is a glare-free high beam: while the high beam is active, oncoming road users are not dazzled. For this purpose a vehicle camera (driver assistance camera) continuously records oncoming and overtaking road users. The camera images are processed by image processing software, and suitable control electronics selectively switch off individual pixel elements of the matrix headlights so that the glare-free effect is achieved.
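The selective pixel switch-off described above can be pictured with a short sketch. The matrix resolution, the image width and the simple column mapping are assumptions chosen for illustration, not values from the patent:

```python
# Illustrative sketch of the glare-free high-beam idea: pixel columns of the
# matrix headlight that cover a detected road user are switched off.
def glare_free_mask(matrix_cols, image_width, bboxes):
    """Return a per-column on/off mask (True = pixel column stays on)."""
    mask = [True] * matrix_cols
    for (x_min, x_max) in bboxes:
        # Map the image columns of the detected road user to headlight columns.
        c_min = int(x_min / image_width * matrix_cols)
        c_max = int(x_max / image_width * matrix_cols)
        for c in range(max(c_min, 0), min(c_max + 1, matrix_cols)):
            mask[c] = False
    return mask

# Example: an oncoming vehicle detected between image columns 420 and 540.
print(glare_free_mask(matrix_cols=64, image_width=1280, bboxes=[(420, 540)]))
```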
Using the camera and the matrix headlights of the matrix headlight system, distance measurements can be carried out by active triangulation in order to determine the distance between the vehicle equipped with the matrix headlight system and objects in the scene in front of the vehicle. After the camera and the matrix headlight system have been calibrated, a defined pattern with characteristic features is projected onto the area in front of the motor vehicle by a matrix headlight. The projected pattern is deformed according to the geometry of the scene. The camera then records a camera image, from which the characteristic features are extracted by suitable image processing software. The detected features are then associated with the matrix headlight of the matrix headlight system, and a so-called depth map is calculated by the image processing software on the basis of the vehicle-specific light distribution.
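The distance measurement by active triangulation can be sketched in simplified form. The following snippet assumes a two-dimensional pinhole model in which camera and headlight share a horizontal baseline and the emission angle of the projected feature is known; none of the numerical values are taken from the patent:

```python
import numpy as np

def triangulate_feature(u_cam, f_cam, baseline, theta_proj):
    """Estimate the distance to a projected feature by active triangulation.

    Simplified 2-D geometry: the camera sits at the origin, the headlight at
    x = +baseline, both look along +z, and angles are measured towards +x.
    u_cam      -- horizontal image coordinate relative to the principal point (px)
    f_cam      -- camera focal length (px)
    baseline   -- distance between camera and headlight (m)
    theta_proj -- emission angle of the headlight ray (rad)
    """
    theta_cam = np.arctan2(u_cam, f_cam)        # camera ray angle
    denom = np.tan(theta_cam) - np.tan(theta_proj)
    if abs(denom) < 1e-9:                       # rays (almost) parallel
        return float("inf")
    return baseline / denom                     # depth of the intersection point

# Example: feature 40 px right of the image centre, 1200 px focal length,
# 0.8 m baseline, headlight ray emitted at -2 degrees -> roughly 11.7 m.
print(triangulate_feature(40.0, 1200.0, 0.8, np.radians(-2.0)))
```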
In order to use active triangulation in a motor vehicle, an estimate of the position of the projection wall is required to define the activation conditions or the operating range. Since the active triangulation function is not switched on by the driver and, because of certain restrictions (for example daylight or the absence of a projection wall), cannot be permanently active, an estimate of the position of the projection wall is necessary so that the active triangulation function can be switched on automatically.
Known methods for wall detection or position estimation are based on detecting and tracking characteristic features in the scene in front of the motor vehicle. By analysing the positions of these features, the presence of a wall and its position can be inferred. However, this approach is problematic in scenes with little structure and little lateral motion. These limitations occur in particular in parking scenarios.
In a refinement of the method described above, salient points of the projected light distribution (preferably so-called HOVO points) are continuously detected and tracked. Wall detection can then be performed by determining the positions of these points. However, because the method is restricted to individual salient points, position estimation cannot be achieved with it.
In another approach, a light distribution in the form of a checkerboard pattern is generated in the scene ahead. Image processing is then carried out globally (i.e. over the entire camera image), whereby a large number of salient points can be detected. Based on these detections, distance values are calculated and the environment is modelled. Wall detection is possible in principle with this method. Its disadvantage, however, is the comparatively high time and computing effort caused by executing the corresponding mapping and triangulation algorithms. Since the wall detection serves in particular to check the activation conditions, continuously performing the distance triangulation defeats the purpose.
Summary of the invention
The object of the present invention is therefore to provide a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle that is robust and can be executed with comparatively low computing effort.
A method having the features of the preferred embodiments of the invention achieves this object. Further embodiments relate to advantageous developments of the invention.
A method according to the invention for detecting a wall and for activating active triangulation of a matrix headlight system of a motor vehicle comprises the following steps:
a) checking whether basic activation conditions for switching on active triangulation are met and, if so,
b) projecting a light distribution with characteristic features into the scene in front of the vehicle, by means of which a light-based artificial field of view can be provided, and recording the light distribution with a camera,
c) extracting the characteristic features of the light distribution generated in the previous step from the camera image by means of image processing software,
d) recording and mathematically describing the isolation regions in the light distribution,
e) determining the position and continuity of the isolation regions that provide the light-based artificial field of view,
f) checking whether the activation conditions for switching on the active triangulation function are met and, if so, switching on the active triangulation function.
In order to estimate the presence and position of a wall as effectively as possible while avoiding computationally complex algorithms (in particular mapping and triangulation algorithms), the method according to the invention proposes a wall position estimate based on a light-based artificial field of view. Characteristic features of a suitable light distribution (for example a checkerboard pattern) are extracted from the camera image by image processing software. Together these features form several isolation regions (in the case of a checkerboard pattern, for example, four isolation regions). The isolation regions are assigned to mathematical functions by a corresponding association method. The positions of these isolation regions are then analysed in the image of the camera, which can in particular be a driver assistance camera. If the wall is tilted relative to the reference coordinate system of the camera and matrix headlight system, this tilt is reflected in the shifted positions of the isolation regions. Furthermore, any discontinuity of the isolation regions makes it possible to recognise whether the illuminated scene contains discontinuities of the wall.
In an advantageous embodiment of the invention it is proposed that, after the active triangulation function has been switched on, a mapping and triangulation algorithm for a 3D reconstruction of the scene in front of the motor vehicle is executed in a further step g). In particular, distance measurements can be carried out by active triangulation.
In a particularly advantageous embodiment of the invention, the basic activation conditions in method step a) can be checked by analysing the sensor signal of an ambient light sensor of the motor vehicle and/or by analysing kinematic data of the motor vehicle and/or by analysing objects recorded by a perception-based light assistance system. The ambient light sensor allows low-light scenes, such as an underground car park, to be recorded; this avoids activating active triangulation in scenes with strong sunlight. The kinematic data allow scenes to be recorded in which the motor vehicle is moving slowly (for example during parking); this avoids activating the active triangulation function at high speed (dynamic driving scenes, motorway driving, etc.). Analysing the objects recorded by the perception-based light assistance system aims to identify scenes with low object density, such as a uniform wall; this prevents the active triangulation function from being switched on in scenes with many objects in front of the vehicle (for example in evening urban traffic).
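A minimal sketch of such a check of the basic activation conditions is given below; the threshold values and signal names are illustrative assumptions, not values from the patent:

```python
def basic_activation_conditions_met(ambient_lux: float,
                                    vehicle_speed_kmh: float,
                                    detected_object_count: int) -> bool:
    """Check the basic activation conditions of method step a).

    Combines the three checks described above: low ambient light,
    low vehicle speed and low object density in the scene ahead.
    """
    dark_enough    = ambient_lux < 10.0           # e.g. underground car park
    slow_enough    = vehicle_speed_kmh < 15.0     # e.g. parking manoeuvre
    scene_is_empty = detected_object_count <= 1   # e.g. a uniform wall

    return dark_enough and slow_enough and scene_is_empty

# Example: dark car park, creeping speed, only the wall ahead.
print(basic_activation_conditions_met(3.0, 6.0, 1))  # True
```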
In a preferred embodiment, a separate light distribution, in particular a checkerboard pattern, can be projected in method step b), or a low-beam or high-beam distribution in which the characteristic features are embedded can be projected. Using the low-beam or high-beam distribution has the advantage that an ECE-compliant light distribution can be used to carry out the method. Preferably an image processing cascade is used to extract the characteristic features from the camera image.
In a particularly advantageous embodiment of the invention it is proposed that an outlier identification is carried out when the characteristic features are extracted from the camera image. The outlier detection is preferably carried out by a mathematical estimation method, for example the RANSAC algorithm, in order to minimise the influence of false detections (outliers) on the overall result.
In a preferred embodiment, the mathematical description of the isolation regions is carried out by an approximation algorithm, for example an LMS algorithm (least mean squares algorithm) or a function-fitting algorithm. The position and orientation of the scene or wall in front of the vehicle can be inferred from the position and characteristics of the isolation regions. The position and continuity of the isolation regions that provide the light-based artificial field of view are preferably determined by analysing the mathematical functions used to describe the isolation regions.
Brief description of the drawings
Further features and advantages of the invention become apparent from the following description of preferred exemplary embodiments with reference to the accompanying drawings, in which:
Fig. 1 is a schematic diagram illustrating the flow of a method according to a preferred embodiment of the invention for detecting a wall and for activating active triangulation of a matrix headlight system of a motor vehicle,
Fig. 2 is a schematic diagram of a light distribution with characteristic features and several cut-off lines.
Detailed description
The basic sequence of the method for detecting a wall and for activating active triangulation of a matrix headlight system of a motor vehicle is described below with reference to Fig. 1.
A basic initialisation is carried out first in step a) by checking whether the basic activation conditions for switching on active triangulation are met. In this method step, the check is performed in particular by analysing the sensor signal of the ambient light sensor of the motor vehicle. The aim here is to record low-light scenes in which active triangulation can be carried out, such as an underground car park; this avoids activating active triangulation in scenes with strong sunlight. The kinematic data of the motor vehicle can also be analysed in this method step. The aim here is to record scenes in which the motor vehicle is moving at low speed (for example during parking); this avoids activating the active triangulation function at high speed (dynamic driving scenes, motorway driving, etc.). Objects recorded by a perception-based light assistance system of the motor vehicle can also be analysed in this method step. The aim here is to identify scenes with low object density, such as a uniform wall; this prevents the active triangulation function from being switched on in scenes with many objects in front of the vehicle (for example in evening urban traffic).
If the check of the basic activation conditions for switching on active triangulation is successful, a light distribution with embedded characteristic features 11, 21, 31, 41 is projected into the scene in front of the vehicle in the next step b); by means of these characteristic features a light-based artificial field of view can be provided. For example, a separate light distribution such as a checkerboard pattern can be projected into the scene in front of the vehicle. In an alternative embodiment, a low-beam or high-beam distribution with an embedded structure can be projected, the structure being imperceptible to the vehicle occupants, in particular to the driver. The light distribution generated in step b) is recorded by the camera (driver assistance camera).
In a subsequent step c), the characteristic features 11, 21, 31, 41, by means of which the light-based artificial field of view can be provided, are extracted by image processing software from the camera image of the light distribution generated in step b). Suitable image processing algorithms, for example an image processing cascade, are used for the extraction. An outlier identification is preferably carried out here in order to avoid, or at least minimise, the influence of false detections (outliers) by the image processing algorithms. For this purpose a mathematical estimation method, for example the RANSAC algorithm, can be used to minimise the influence of false detections on the overall result.
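A compact sketch of such an outlier identification with a RANSAC-style estimator is shown below; the feature coordinates, iteration count and inlier tolerance are illustrative assumptions, not values from the patent:

```python
import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=2.0, seed=0):
    """Fit a line to 2-D feature points and return the inlier mask."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.hypot(d[0], d[1])
        if norm < 1e-9:
            continue
        ux, uy = d / norm
        # Perpendicular distance of every point to the candidate line p-q.
        dist = np.abs(ux * (points[:, 1] - p[1]) - uy * (points[:, 0] - p[0]))
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Example: checkerboard corners along one cut-off line plus two false detections.
pts = [[x, 0.5 * x + 100.0] for x in range(0, 200, 20)] + [[50.0, 300.0], [120.0, 10.0]]
print(ransac_line(pts))  # the two false detections are flagged as outliers (False)
```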
In the next step d), the isolation regions 10, 20, 30, 40 are recorded and described mathematically in a suitable manner, so that the position and continuity of these isolation regions can subsequently be analysed. The isolation regions 10, 20, 30, 40 are preferably described mathematically by an approximation algorithm, for example an LMS algorithm (least mean squares algorithm) or a function-fitting algorithm.
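The mathematical description of one isolation region by a least-squares fit can be sketched as follows; the sample points are assumptions used only to illustrate the fitting and are not taken from the patent:

```python
import numpy as np

def describe_isolation_region(u, v):
    """Fit v = a*u + b to the pixel coordinates bounding an isolation region.

    Returns slope a, intercept b and the RMS residual; the residual can later
    serve as a simple continuity indicator in step e).
    """
    a, b = np.polyfit(u, v, deg=1)
    residuals = v - (a * u + b)
    rms = float(np.sqrt(np.mean(residuals ** 2)))
    return a, b, rms

# Example: lower edge of the upper isolation region on a slightly tilted wall.
u = np.arange(100, 600, 50, dtype=float)
v = 0.04 * u + 220.0 + np.random.default_rng(1).normal(0.0, 0.5, size=u.size)
print(describe_isolation_region(u, v))
```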
For example, if a checkerboard pattern is projected into the scene in front of the vehicle, four isolation regions 10, 20, 30, 40 can be extracted. This situation is shown in Fig. 2. In the simplified light distribution shown in Fig. 2, different brightness levels are indicated by hatching of different density: heavily hatched regions represent the darker structures of the light distribution, while regions with little or no hatching represent its brighter structures. It should be understood that in the real light distribution there are no sharp brightness transitions between the respective brightness ranges; instead, gradually changing brightness gradients are observed.
The upper isolation region 10 can be defined by characteristic features 11 that are generated by the uppermost pixel sections of the matrix headlights of the matrix headlight system. There are two isolation regions in the middle area: an upper-middle isolation region 20 defined by characteristic features 21 of the pixel sections provided by the upper region of the matrix headlights, and a lower-middle isolation region 30 defined by characteristic features 31 of the pixel sections provided by the lower-middle region of the matrix headlights. In addition, a lower isolation region 40 can be defined by characteristic features 41 that are generated by the lowermost pixel sections of the matrix headlights of the matrix headlight system.
The position and continuity of the isolation regions 10, 20, 30, 40 are determined in the next step e). The position and continuity of the isolation regions that provide the artificial field of view are preferably analysed by evaluating the mathematical functions used to describe the isolation regions 10, 20, 30, 40. The position and orientation of the scene or wall in front of the vehicle can be inferred from the positions and characteristics of the isolation regions.
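One possible evaluation of the fitted functions in step e) is sketched below; the thresholds for the residual and for gaps between neighbouring features are assumptions chosen for illustration:

```python
import numpy as np

def analyse_isolation_region(slope, rms_residual, u_samples,
                             max_rms=2.0, max_gap_px=80.0):
    """Return the wall tilt implied by one isolation region and whether the
    region is continuous enough to provide the artificial field of view."""
    tilt_deg = float(np.degrees(np.arctan(slope)))   # rotation of the boundary line
    gaps = np.diff(np.sort(np.asarray(u_samples, dtype=float)))
    continuous = rms_residual < max_rms and (gaps.size == 0 or gaps.max() < max_gap_px)
    return tilt_deg, continuous

# Example with the fit from the previous sketch: slope 0.04, small residual,
# features sampled every 50 px -> the region is treated as continuous.
print(analyse_isolation_region(0.04, 0.5, np.arange(100, 600, 50)))
```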
In the next step f), it is checked on the basis of the calculations performed in the previous steps whether the activation conditions for switching on the active triangulation function are met. If so, a corresponding switch-on signal is provided for the matrix headlight system so that active triangulation can be activated.
In a subsequent step g), a mapping and triangulation algorithm for a 3D reconstruction of the scene in front of the motor vehicle is executed. In particular, distance measurements can be carried out by active triangulation.
Beyond wall detection and checking whether the conditions for switching on the active triangulation function are met, the principle of the light-based artificial field of view described above can also be used for further embodiments. These further embodiments, briefly described below, include feature association, track-based switch-off of pixel sections of the matrix headlights of the matrix headlight system, and a light-based camera calibration method.
In feature association (solving the so-called correspondence problem), the characteristic features 11, 21, 31, 41 are detected by the image processing software within the scope of active triangulation. In a further step these features must be associated with the corresponding headlight sections of the matrix headlight system. This is achieved by forming elliptical tracks (pixel paths) along which the features move. If a characteristic feature 11, 21, 31, 41 is detected along a track within such an ellipse, the point is assigned to the headlight section belonging to that track. If the projection wall is rotated, the ellipses must also be rotated in order to detect the characteristic features 11, 21, 31, 41. The rotation is determined by means of the aforementioned artificial field of view.
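The assignment of a detected feature to a headlight section via its (rotated) elliptical track can be sketched as follows; the ellipse parameters and the section number are illustrative assumptions, not values from the patent:

```python
import numpy as np

def point_in_ellipse(point, centre, semi_axes, rotation_deg):
    """Return True if `point` lies inside the ellipse given by `centre`,
    `semi_axes` (a, b) and `rotation_deg` (e.g. the wall tilt estimated from
    the artificial field of view)."""
    phi = np.radians(rotation_deg)
    c, s = np.cos(phi), np.sin(phi)
    dx, dy = np.asarray(point, float) - np.asarray(centre, float)
    # Rotate the point into the ellipse's own coordinate frame.
    x_r = c * dx + s * dy
    y_r = -s * dx + c * dy
    a, b = semi_axes
    return (x_r / a) ** 2 + (y_r / b) ** 2 <= 1.0

# Example: feature at (512, 300); the track ellipse of a (hypothetical) section 17
# is centred at (500, 295), semi-axes 40 x 12 px, rotated by the estimated 2.3 deg.
if point_in_ellipse((512, 300), (500, 295), (40, 12), 2.3):
    print("feature assigned to headlight section 17")
```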
In active triangulation the headlights project patterns which are then recorded by the camera so that the characteristic features 11, 21, 31, 41 can be extracted from them. These projections are produced by the two matrix headlights of the matrix headlight system. With two matrix headlights the projections result in an overlap region in which the features appear blurred and therefore cannot be extracted by the image processing software, or can only be extracted incorrectly. When an overlap is detected, individual headlight sections of the matrix headlight system must therefore be switched off. The overlap can be detected when, in addition to the track intersection points, the intersection points of the light-based artificial fields of view (left and right side) are determined. In this way pixel sections of the matrix headlights can be switched off continuously, and error-free feature extraction can accordingly be ensured.
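A simple form of the overlap check can be sketched as follows; the boundary columns of the two fields of view are illustrative assumptions:

```python
def in_overlap(u_feature, left_fov_right_edge, right_fov_left_edge):
    """Return True if an image column u_feature lies in the overlap of the
    two headlight projections, i.e. between the right boundary of the left
    headlight's field of view and the left boundary of the right one."""
    lo = min(left_fov_right_edge, right_fov_left_edge)
    hi = max(left_fov_right_edge, right_fov_left_edge)
    return lo <= u_feature <= hi

# Example: the two fields of view overlap between columns 600 and 680, so a
# feature detected at column 655 is flagged and its headlight section switched off.
print(in_overlap(655, left_fov_right_edge=680, right_fov_left_edge=600))
```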
During an ongoing driving task, the camera must be calibrated relative to the vehicle. This can be done, for example, by so-called vanishing point calibration. Straight lines that run parallel to each other in the real world are tracked in the camera image. Owing to the projective camera geometry, these actually parallel straight lines intersect in the camera image; the intersection points are called vanishing points. Each vanishing point lies on a so-called vanishing line, from which the geometric relationship between the camera and the motor vehicle can be determined in terms of roll, pitch and yaw angles. For this purpose, the headlight projection (for example the features of the checkerboard pattern) must provide straight lines that are actually parallel to each other. The features projected by the headlights can accordingly be used for an extrinsic calibration of the driver assistance camera.
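The vanishing point calibration can be sketched as follows; the pinhole parameters and line endpoints are assumptions chosen for illustration, not values from the patent:

```python
import numpy as np

def vanishing_point(line_a, line_b):
    """Intersect two image lines given as ((x1, y1), (x2, y2)) using
    homogeneous coordinates."""
    def hom_line(p, q):
        return np.cross([*p, 1.0], [*q, 1.0])
    vp = np.cross(hom_line(*line_a), hom_line(*line_b))
    return vp[:2] / vp[2]

def pitch_yaw_from_vp(vp, fx, fy, cx, cy):
    """Angles (degrees) of the forward direction implied by the vanishing
    point of lines that are parallel to the driving direction."""
    yaw = np.degrees(np.arctan2(vp[0] - cx, fx))
    pitch = np.degrees(np.arctan2(vp[1] - cy, fy))
    return pitch, yaw

# Example: two projected checkerboard rows that are parallel on the road
# intersect near the image centre; the offset from the principal point
# (640, 360) yields the pitch and yaw of the camera.
vp = vanishing_point(((100, 700), (500, 420)), ((1180, 700), (780, 420)))
print(vp)                                   # approx. (640, 322)
print(pitch_yaw_from_vp(vp, 1200, 1200, 640, 360))
```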

Claims (9)

1. A method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle, the method comprising the following steps:
a) checking whether basic activation conditions for switching on active triangulation are met and, if so,
b) projecting a light distribution with characteristic features (11, 21, 31, 41) into the scene in front of the vehicle, by means of which a light-based artificial field of view can be provided, and recording the light distribution with a camera,
c) extracting the characteristic features (11, 21, 31, 41) of the light distribution generated in the previous step from the camera image by means of image processing software,
d) recording and mathematically describing the isolation regions (10, 20, 30, 40) in the light distribution,
e) determining the position and continuity of the isolation regions (10, 20, 30, 40) that provide the light-based artificial field of view,
f) checking whether the activation conditions for switching on the active triangulation function are met and, if so, switching on the active triangulation function.
2. The method according to claim 1, characterised in that, after the active triangulation function has been switched on, a mapping and triangulation algorithm for a 3D reconstruction of the scene in front of the motor vehicle is executed in a further step g).
3. The method according to claim 1 or 2, characterised in that the basic activation conditions in method step a) are checked by analysing the sensor signal of an ambient light sensor of the motor vehicle and/or by analysing kinematic data of the motor vehicle and/or by analysing objects recorded by a perception-based light assistance system.
4. The method according to one of claims 1 to 3, characterised in that a separate light distribution, in particular a checkerboard pattern, is projected in method step b), or a low-beam or high-beam distribution in which the characteristic features (11, 21, 31, 41) are embedded is projected.
5. The method according to one of claims 1 to 4, characterised in that an image processing cascade is used to extract the characteristic features (11, 21, 31, 41) from the camera image.
6. The method according to one of claims 1 to 5, characterised in that an outlier identification is carried out when the characteristic features (11, 21, 31, 41) are extracted from the camera image.
7. The method according to claim 6, characterised in that the outlier identification is carried out by a mathematical estimation method.
8. The method according to one of claims 1 to 7, characterised in that the mathematical description of the isolation regions (10, 20, 30, 40) is carried out by an approximation algorithm.
9. The method according to one of claims 1 to 8, characterised in that the position and continuity of the isolation regions (10, 20, 30, 40) that provide the light-based artificial field of view are determined by analysing the mathematical functions used to describe the isolation regions (10, 20, 30, 40).
CN201910083363.9A 2018-02-12 2019-01-28 Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle Active CN110161523B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018103060.6 2018-02-12
DE102018103060.6A DE102018103060B3 (en) 2018-02-12 2018-02-12 Method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle

Publications (2)

Publication Number Publication Date
CN110161523A true CN110161523A (en) 2019-08-23
CN110161523B CN110161523B (en) 2023-06-30

Family

ID=64951566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910083363.9A Active CN110161523B (en) 2018-02-12 2019-01-28 Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle

Country Status (2)

Country Link
CN (1) CN110161523B (en)
DE (1) DE102018103060B3 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116698377B (en) * 2023-07-31 2023-10-03 常州星宇车灯股份有限公司 ADB function test method and system for automobile LED matrix headlight

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6285778B1 (en) * 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
US20050099821A1 (en) * 2004-11-24 2005-05-12 Valeo Sylvania Llc. System for visually aiding a vehicle driver's depth perception
US20070019181A1 (en) * 2003-04-17 2007-01-25 Sinclair Kenneth H Object detection system
US20120105868A1 (en) * 2010-10-29 2012-05-03 Canon Kabushiki Kaisha Measurement device and measurement method
CN104442541A (en) * 2013-09-23 2015-03-25 海拉胡克双合有限公司 Method For Controlling A Light Distribution Of A Headlamp And Headlamp Therefor
US20150294161A1 (en) * 2012-10-31 2015-10-15 Tk Holdings, Inc. Vehicular path sensing system and method
US20170336283A1 (en) * 2016-05-17 2017-11-23 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Method for checking the position of characteristic points in light distributions
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7872764B2 (en) 2007-10-16 2011-01-18 Magna Electronics Inc. Machine vision for predictive suspension


Also Published As

Publication number Publication date
DE102018103060B3 (en) 2019-01-24
CN110161523B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
CN111448591B (en) System and method for locating a vehicle in poor lighting conditions
US11854272B2 (en) Hazard detection from a camera in a scene with moving shadows
US11204417B2 (en) Selective attention mechanism for improved perception sensor performance in vehicular applications
CN106663193B (en) System and method for curb detection and pedestrian hazard assessment
US8102427B2 (en) Camera egomotion estimation from an infra-red image sequence for night vision
CN115668182A (en) Autonomous vehicle context aware software architecture
CN109697860A (en) Parking stall measure and tracking system and method and vehicle
JP7225074B2 (en) Object recognition device
CN104424477A (en) Apparatus and method for detecting obstacle
JP6569280B2 (en) Road marking detection device and road marking detection method
JP2009076076A (en) Method for recognizing obstacle
CN110073410B (en) Method for tracking objects in a scene
US11580695B2 (en) Method for a sensor-based and memory-based representation of a surroundings, display device and vehicle having the display device
CN110161523A (en) The method estimated wall locations and activate the active triangulation of the matrix form headlight system of motor vehicle
JP6398218B2 (en) Self-position calculation device and self-position calculation method
CN104931024B (en) Obstacle detector
JP6299319B2 (en) Self-position calculation device and self-position calculation method
Bertozzi et al. Vision-based Automated Vehicle Guidance: the experience of the ARGO vehicle
US11100653B2 (en) Image recognition apparatus
Gu et al. Road traffic tracking and parameter estimation based on visual information analysis using self-calibrated camera views
EP4140814A1 (en) Method for harmonizing images acquired from non overlapping camera views
Broggi et al. Applications of Computer Vision to Vehicles: An Extreme Test.
Simond Free space in front of an autonomous guided vehicle in inner-city conditions
FR2938227A1 (en) Obstacle detecting method for motor vehicle, involves temporally identifying closed contour blobs among averaged interest zones, and comparing identified blobs to obstacles for considered motor vehicle
Paulus et al. Monitoring range of visibility using machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant