EP2347400B1 - Method and system for combining sensor data (Verfahren und System zum Kombinieren von Sensordaten) - Google Patents

Method and system for combining sensor data (Verfahren und System zum Kombinieren von Sensordaten)

Info

Publication number
EP2347400B1
EP2347400B1 (application EP08878020.0A)
Authority
EP
European Patent Office
Prior art keywords
driver
sensor
sensor data
data
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP08878020.0A
Other languages
English (en)
French (fr)
Other versions
EP2347400A1 (de)
EP2347400A4 (de)
Inventor
Fredrik Bengtsson
Fredrik Sandblom
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volvo Truck Corp
Original Assignee
Volvo Lastvagnar AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volvo Lastvagnar AB filed Critical Volvo Lastvagnar AB
Publication of EP2347400A1 publication Critical patent/EP2347400A1/de
Publication of EP2347400A4 publication Critical patent/EP2347400A4/de
Application granted granted Critical
Publication of EP2347400B1 publication Critical patent/EP2347400B1/de
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001 Details of the control system
    • B60W2050/0019 Control system elements or transfer functions
    • B60W2050/0028 Mathematical models, e.g. for simulation
    • B60W2050/0029 Mathematical model of the driver
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9321 Velocity regulation, e.g. cruise control

Definitions

  • the invention relates to a method and a system for combining sensor data according to the preambles of the independent claims.
  • Modern vehicles can be equipped with sensors for detecting driver gaze orientation and the like. Such information is used, for instance, to assess driver distraction and fatigue or drowsiness. It is also known in the art to provide sensors for monitoring the vehicle's surrounding traffic situation. The information collected by these sensors is for instance supplied to applications like collision warning systems which are used to prevent or mitigate collisions between the vehicle and an obstacle.
  • a safety system which has to react quickly and appropriately to a threat must be aware of the threat. This information is supplied by a tracking system using data from sensors monitoring the surrounding environment. For this reason it is desirable that tracks of a probable obstacle are initiated and reported to a safety system as early as possible. It is also desirable that a track that is reported is assigned a high track score.
  • a track score is a quantity that assesses how much one can trust the reported track. In general, a track is reported as valid to a safety application when a certain track score has been reached.
  • the track score is maintained after validation and is dynamically updated to reflect the quality of the track at any time.
  • a property of a tracking system that performs well is high track scores.
  • High track scores can be achieved by building detailed sensor models, improving present sensors, or adding multiple sensors, also known as sensor fusion.
  • EP 1 878 604 A1 discloses a safety system for a vehicle where a driver alert is issued based on the proportion of off-road gaze time and the duration of a current off-road gaze.
  • the system incorporates driver focus/driver attention as a sensor capable of generating measurements.
  • the ocular state of the driver is detected and compared to data from sensors that monitor the surrounding environment. The system can conclude whether the driver is focused on an impending danger or is non-attentive.
  • EP 1484014 A1 discloses a method for determining the awareness of a driver. Radar is used to detect an object and a camera is used to detect a driver gaze direction. The gaze direction is compared to the direction of the object, and driver awareness can be established.
  • the sensor signals are treated by an extended Kalman filter. As the influence of previous measurements on an estimate in a Kalman filter decreases exponentially, the measurement errors of the sensor signals are correlated in time in this system set-up.
  • the sensor data of the at least two sensors are combined as the respective measurement errors of the data are uncorrelated in time with respect to the at least one application.
  • Strictly speaking, data from two sensors are not uncorrelated.
  • However, the measurement errors are usually modeled as uncorrelated, which is often a very good assumption, e.g. for sensors using different measurement principles.
  • the sensor data of the at least two sensors are combined and/or exchanged when the respective measurement errors of the data are uncorrelated in time with respect to their noise.
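  • As a minimal illustration of this point (not taken from the patent; the function and all values are our own): when two measurement errors are independent, inverse-variance weighting fuses them into an estimate whose variance is lower than either sensor's alone, which is what makes early combination of driver and radar data attractive.

        def fuse_uncorrelated(z1, var1, z2, var2):
            """Fuse two scalar measurements whose errors are independent
            (uncorrelated) via inverse-variance weighting; the fused
            variance is smaller than either input variance."""
            w1, w2 = 1.0 / var1, 1.0 / var2
            fused = (w1 * z1 + w2 * z2) / (w1 + w2)
            return fused, 1.0 / (w1 + w2)

        # e.g. radar bearing (10 deg, var 1) and driver-gaze bearing (11 deg, var 4)
        print(fuse_uncorrelated(10.0, 1.0, 11.0, 4.0))  # -> (10.2, 0.8)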
  • the early fusion of the sensor data allows the driver-related data to be treated as raw data like any other sensor data.
  • the sensor data of the at least two sensors are fed to a sensor fusion unit for combination and/or exchange of the sensor data.
  • the information of the driver-related sensor data can be shared with other sensor data before the data are fed to a specific application.
  • the analysis of sensor data can become more accurate and faster.
  • the sensor data are transmitted to one or more applications subsequent to the sensor fusion unit, wherein noise of the sensor data is correlated in time after processing in the sensor fusion unit.
  • the application can preferably be an assistance function applied in a vehicle, such as a tracking system or a safety system, e.g. collision warning and the like.
  • the driver is modelled as an additional sensor (“driver sensor”) providing driver-related sensor data, achieved by using an additional sensor that monitors the driver.
  • the driver-related data indicate parameters such as a gaze direction, a head orientation and/or any other orientation of the driver indicating a gaze direction and/or a driver attention (awareness) with respect to an object in the surrounding ambient.
  • the driver information can be used to identify objects in cluttered environments and/or be used to prepare a tracking system to see an object that may enter the field of view of a traditional (external) sensor.
  • By incorporating driver focus / driver attention as a sensor capable of generating measurements, by way of example, a safety system can perform better and act quicker.
  • the safety system can also provide information on whether a driver is aware of a particular target or not.
  • In the prior art, sensor data are combined at a far later stage; particularly, the output of sensor fusion is combined with the output of the driver-related sensor.
  • the sensor data of the at least two sensors can be preferably combined in a state where noise of the sensor data is uncorrelated in time.
  • the early fusion of the sensor data allows the driver-related data to be treated as raw data like any other sensor data.
  • the sensor data can be preprocessed before being fed into the sensor fusion unit, enabling a more accurate and more sensitive analysis of the sensor data.
  • driver-related sensor data can comprise at least driver attention and/or driver gaze direction data. This is particularly favourable if the fused sensor data are presented to a tracking system or a safety system such as a collision warning system.
  • the driver-related sensor data and data derived from the application can be used to estimate the driver's field of view.
  • the individual driver's field of view can be entered into a threat assessment algorithm which subsequently can assess whether a possible danger detected by one or more external sensors can be perceived by the driver, as sketched below. Further, validation of a track of an object can be improved and can happen faster than in the prior art.
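  • A hedged sketch of how such a threat-assessment step might consume the estimated field of view (the function name and the interval representation are our assumptions, not the patent's):

        def danger_in_driver_fov(object_bearing_deg, fov_deg):
            """Return True if an object's bearing falls inside the driver's
            estimated field of view, given as a (low, high) interval in
            degrees relative to straight ahead."""
            low, high = fov_deg
            return low <= object_bearing_deg <= high

        # a radar threat outside the estimated field of view warrants an
        # earlier or stronger warning
        if not danger_in_driver_fov(75.0, (-60.0, 60.0)):
            print("object likely unseen by driver: warn earlier")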
  • a reaction time of the driver and/or a distraction level of the driver can be derived from a comparison between driver-related sensor data indicating cognitive detection of an object and detection of the object by the at least one external sensor.
  • the comparison can most likely be dependent on other things as well, e.g. the driver field of view.
  • When the reaction time is long, an alarm, e.g. when there is a risk of a collision, can be issued earlier; when the reaction time is short, an alarm can be issued later. This reduces the risk of false alarms which might irritate and distract the driver.
  • individual driver profiles can be stored and the applications coupled to assist functions can be adapted to the individual driver. Particularly, the profiles can include the driver field of view.
  • a reaction time can be extracted from a time difference between a detection of an object by the at least one external sensor and driver-related sensor data indicating cognitive detection of the same object by the driver.
  • the reaction time can be monitored over a predetermined time span yielding an estimate on the driver attention. For instance, an increase in the reaction time of the driver can be interpreted as a reduction of driver attention, and an appropriate warning and/or reaction of the application can be issued.
  • the reaction time can also be dependent on the driver field of view, which can also be taken into account.
  • the at least one application can be a vehicular safety system.
  • reaction time of one or more drivers can be extracted and stored for use in the vehicular safety system.
  • Safety functions can be adapted to individual drivers.
  • the reaction time can particularly be used to adapt a sensitivity level of the safety system.
  • the reaction time can be estimated and compared to a general driver profile, particularly a fixed profile or an adaptive profile, estimated during suitable driving scenarios. For instance, a safety system can be designed to warn earlier or later than in a case where no information regarding the driver reaction time is available, thus increasing the performance of the safety system.
  • estimation of a reaction time by using tracks may not be sufficient. Typically, when tracks are reported, these data are old and their age is usually unknown.
  • the reaction time can be used to evaluate a level of driver attention. If the reaction time changes within a predetermined time, particularly if it increases, an alarm can be issued. It is not necessary to verify how the driver handles the vehicle, but rather whether the driver is much better or worse at assessing the environment than the external sensors, such as radar, lidar, cameras and the like, employed in or coupled to the vehicle; a sketch of such monitoring follows below.
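  • A small sketch of such monitoring (the window size, update rate and threshold are illustrative assumptions): keep a sliding window of measured reaction times and raise a flag when the recent average rises clearly above a slowly adapted baseline.

        from collections import deque

        class ReactionTimeMonitor:
            """Flag a possible drop in driver attention when the average
            reaction time over a sliding window rises well above a slowly
            updated long-term baseline."""

            def __init__(self, window=20, rise_factor=1.3):
                self.samples = deque(maxlen=window)
                self.baseline = None
                self.rise_factor = rise_factor

            def add(self, reaction_time_s):
                self.samples.append(reaction_time_s)
                recent = sum(self.samples) / len(self.samples)
                if self.baseline is None:
                    self.baseline = recent
                else:
                    # slow exponential update keeps the baseline long-term
                    self.baseline = 0.99 * self.baseline + 0.01 * reaction_time_s
                return recent > self.rise_factor * self.baseline

        monitor = ReactionTimeMonitor()
        for rt in [0.25, 0.27, 0.24, 0.26, 0.45, 0.50, 0.55]:
            if monitor.add(rt):
                print("reaction time increasing: possible attention drop")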
  • the at least one application can be a tracking system.
  • By modelling the driver as a sensor it can be derived early whether a driver recognizes an oncoming object and a probable danger or not. Also, information regarding object properties such as position, size and perceived interest can be extracted.
  • the driver-related sensor data can be combined with tracking data of one or more objects, making the analysis more accurate and faster.
  • a track can be validated with higher reliability and probably earlier than with purely external sensors.
  • In a tracking system, a sensor model can be built which comprises at least one or more of
    - a probability that the driver notices one or more objects;
    - an accuracy of a driver monitoring camera showing a head of a driver and/or a gaze direction;
    - a relation between head/gaze direction and the position of one or more objects seen by the driver;
    - a probability that the driver looks at non-objects;
    - an influence on the trustworthiness of the track as a function of the driver attention.
  • a track can be recognized as valid if tracking sensor data and driver-related sensor data coincide.
  • a safety system which employs a method for combining sensor data collected by at least two sensors coupled to at least one application, wherein at least one of the sensors provides driver-related sensor data of a driver-related behaviour and at least one external sensor provides sensor data not related to driver-related behaviour, wherein the sensor data of the at least two sensors are combined as the respective measurement errors of the data are uncorrelated in time with respect to the at least one application.
  • the invention can be a component in a tracking system.
  • the safety system can be e.g. a collision warning system, a cruise control system and the like, for instance a collision avoidance system, an intersection safety system, a lane change assist system, and preferably any system that needs to know where other objects are.
  • a tracking system which employs a method for combining sensor data collected by at least two sensors coupled to at least one application, wherein at least one of the sensors provides driver-related sensor data of a driver-related behaviour and at least one external sensor provides sensor data not related to driver-related behaviour, wherein the sensor data of the at least two sensors are combined as the respective measurement errors of the data are uncorrelated in time with respect to the at least one application, wherein the system comprises at least one of a tracking system and a safety system.
  • a computer program comprising a computer program code adapted to perform a method or for use in a method for combining sensor data collected by at least two sensors coupled to at least one application, wherein at least one of the sensors provides driver-related sensor data of a driver-related behaviour and at least one external sensor provides sensor data not related to driver-related behaviour, wherein the sensor data of the at least two sensors are combined as the respective measurement errors of the data are uncorrelated in time with respect to the at least one application, when said program is run on a programmable microcomputer.
  • the computer program can be adapted to be downloaded to a control unit or one of its components when run on a computer which is connected to the internet.
  • a computer program product stored on a computer readable medium, comprising a program code for use in a method on a computer, wherein the method is a method for combining sensor data collected by at least two sensors coupled to at least one application, wherein at least one of the sensors provides driver-related sensor data of a driver-related behaviour and at least one external sensor provides sensor data not related to driver-related behaviour, wherein the sensor data of the at least two sensors are combined as the respective measurement errors of the data are uncorrelated in time with respect to the at least one application.
  • When used in combination with a tracking filter, the invention may be treated as an extension of the tracking filter.
  • a new sensor such as the driver sensor can be included and correctly treated, preferably as a component in a tracking framework.
  • the invention can then provide as benefits a decreased validation time, driver reaction time statistics and more.
  • Fig. 1 depicts schematically a preferred way in which data are produced and presented to an application 50 via a tracking system 52, passing through radar data levels such as raw data, detection instances and tracks.
  • a radar sensor 10 monitors an ambient 12.
  • the radar sensor 10 sends raw data A as input to a unit 14.
  • the unit 14 provides a signal detection algorithm, that for example can be based on spectral analysis using Fourier transformation to detect objects.
  • the output of unit 14 comprises detections B which are fed into the tracking system 52 and subsequently serve an application 50, e.g. a collision mitigation system or the like.
  • the detections B originate from signal peaks which are determined by detection algorithms to originate from an object in the radar field of view, e.g. which are thresholded, i.e. are above a predefined or adaptively set signal level. Properties of such detections are e.g. a range, an azimuth and a velocity of a detected object in the radar field of view.
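  • A toy sketch of such a detection step (the FFT-plus-threshold scheme is what the text names; the threshold level and the test signal are invented for illustration):

        import numpy as np

        def detect_peaks(signal, fs, threshold_db=12.0):
            """Toy stand-in for unit 14: Fourier-transform a raw sensor
            signal and report spectral peaks above an adaptively set level
            (here: the median noise floor raised by threshold_db)."""
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            level = np.median(spectrum) * 10 ** (threshold_db / 20.0)
            return [(f, a) for f, a in zip(freqs, spectrum) if a > level]

        # in an FMCW radar the beat frequency maps to target range; here we
        # simply recover a 50 Hz tone buried in noise
        t = np.arange(0, 1.0, 1e-3)
        x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
        print(detect_peaks(x, fs=1000.0))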
  • the tracking system 52 outputs tracks C in the form of an estimate of a position x̂_k and/or expected values of a position E[x_k] and/or a probability density function p(x_k) of its state vector x_k. It should be noted that this position can be multidimensional to include velocities and the like.
  • the data C are presented to an application 50.
  • detections B are positions in measurement space of objects visible to the radar sensor 10, but no time correlation is assumed. That is, the detection algorithms do not make use of earlier detections when detecting a new one. In contradistinction to this, the tracking system 52 makes use of an earlier detection when a new detection B is detected.
  • the detections output by the tracking system 52 are detections verified through time and model consistency.
  • y_i,k describes the measurements presented by sensor i at time t_k. As they are affected by stochastic noise, measurements can be described by a probability density function p(y_k).
  • Common tracking filters are the so-called Kalman filter, e.g. with modifications such as the extended Kalman filter or the unscented Kalman filter to handle nonlinearities, and Monte Carlo methods such as the widely used particle filter.
  • a tracking system 52 uses process models and sensor models to calculate the result. The resulting track is a combination of all received data, and estimation errors are therefore correlated in time, although smaller than the measurement error.
  • the Kalman filter is an efficient recursive filter that estimates the state of a dynamic system from a series of incomplete and noisy measurements.
  • Kalman filtering is an important topic in control theory and control systems engineering.
  • Kalman filter exploits the dynamics of the target, which govern its time evolution, to remove the effects of the noise and get a good estimate of the location of the target at the present time (filtering), at a future time (prediction), or at a time in the past (interpolation or smoothing).
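  • A minimal linear Kalman filter cycle, as a sketch of the filters discussed here (textbook form, not the patent's implementation); note that because each estimate depends on all past measurements, its errors are correlated in time:

        import numpy as np

        def kalman_step(x, P, z, F, Q, H, R):
            """One predict/update cycle of a linear Kalman filter."""
            x = F @ x                       # predict state
            P = F @ P @ F.T + Q             # predict covariance
            S = H @ P @ H.T + R             # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
            x = x + K @ (z - H @ x)         # update with measurement z
            P = (np.eye(len(x)) - K @ H) @ P
            return x, P

        # 1-D constant-velocity target, position-only measurements
        dt = 0.1
        F = np.array([[1.0, dt], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q, R = 1e-3 * np.eye(2), np.array([[0.5]])
        x, P = np.zeros(2), np.eye(2)
        for z in [1.0, 1.2, 1.4]:
            x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
        print(x)  # estimated position and velocity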
  • Fig. 2 illustrates a preferred tracking system 52, which is also a sensor fusion system 20.
  • the tracking system 52 is also a fusion system 20, because it uses several data sources, i.e. sensor 1 to sensor N, to calculate the estimate of x k .
  • the sensor signals are fed into the fusion system 20 which, when embodied as a tracking system 52, may comprise a unit 22 providing gating and data association, connected to a unit 24 which initiates new tracks, updates track scores and deletes tracks (i.e. track management), which is connected to a unit 26 providing updated tracks to the receiving applications, which is connected to a unit 28 providing predictions such as the position x̂_k+1, which in turn is connected to unit 22, the prediction being used as the starting point for associating data in the next iteration of the tracking system 52.
  • the state space position is predicted using a motion model. This is needed to associate the updated track from the previous iteration with the measurements of the following iteration. As will be shown in Fig. 3, track-to-measurement association is assumed to be hard particularly when two tracked vehicles are positioned close to each other; a sketch of a simple gating and association step follows.
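  • A sketch of the gating and association step in unit 22 (greedy nearest neighbour inside a fixed gate; production systems use e.g. GNN or JPDA, and the gate radius here is an arbitrary assumption):

        import numpy as np

        def gate_and_associate(predicted, detections, gate_radius=3.0):
            """Associate each predicted track position with its nearest
            detection inside the gate; each detection is used at most once.
            Inputs are lists of 2-D numpy arrays; returns {track: detection}."""
            assignment, used = {}, set()
            for ti, pred in enumerate(predicted):
                best, best_dist = None, gate_radius
                for di, det in enumerate(detections):
                    dist = np.linalg.norm(det - pred)
                    if di not in used and dist < best_dist:
                        best, best_dist = di, dist
                if best is not None:
                    assignment[ti] = best
                    used.add(best)
            return assignment

        preds = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]
        dets = [np.array([9.5, 0.4]), np.array([0.3, -0.2]), np.array([40.0, 5.0])]
        print(gate_and_associate(preds, dets))  # -> {0: 1, 1: 0}; detection 2 is left for track initiation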
  • the unit 26 outputs data to various applications 50 such as Active Cruise Control (ACC), Collision Mitigation by Braking (CMBB) and the like.
  • a fusion system 20 can be much more than just a tracking system 52 or an application 50.
  • a fusion system 20 need not contain a tracking system 52. It can contain information fusion on "higher" algorithmic levels than a tracking system 52, or make use of situation assessment algorithms to "understand" e.g. the traffic situation, and the like. In some sense, any system that makes use of multiple sources of information can be said to perform sensor fusion in the widest sense of the concept.
  • the state vector x_k is likely to contain the positions, velocities, accelerations and/or orientations of surrounding objects such as e.g. vehicles.
  • a sequence of positions believed to originate from a single object is a so called "track".
  • a "track” in a tracking system is sometimes simply the estimated position x ⁇ k of an object at time t k .
  • "Position” does not necessarily mean a two-dimensional (2D) position in Cartesian space, it depends on the parameterization of the object and it is common to be of higher dimensions.
  • the 6-dimensional "constant acceleration" representation is widely used in automotive tracking systems with coordinates (x, y, ẋ, ẏ, ẍ, ÿ), where x, y are orthogonal vectors spanning a two-dimensional space and the dots denote time derivatives (ẋ being a first derivative, ẍ being a second derivative etc.).
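  • For reference, a sketch of the corresponding state transition over a sample time T (standard textbook constant-acceleration kinematics; the function name is our own):

        import numpy as np

        def ca_transition(T):
            """State transition matrix for the 6-D 'constant acceleration'
            state (x, y, xdot, ydot, xddot, yddot): position integrates
            velocity and acceleration over the sample time T."""
            f_axis = np.array([[1.0, T, 0.5 * T * T],
                               [0.0, 1.0, T],
                               [0.0, 0.0, 1.0]])
            # kron with a 2x2 identity applies the same kinematics to x and y
            return np.kron(f_axis, np.eye(2))

        x = np.array([0, 0, 10, 0, 1, 0])  # 10 m/s and 1 m/s^2, both along x
        print(ca_transition(0.1) @ x)      # position x advances to 1.005 m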
  • two vehicles 60a, 60b are moving ahead of a truck 90.
  • the stars 64a, 64b assigned to the vehicles 60a, 60b can be associated to form two tracks 62a, 62b indicated by solid lines connecting the stars 64a, 64b, wherein track 62a is associated with the vehicle 60a and track 62b is associated with the vehicle 60b.
  • a tracking system (mounted e.g. on the truck 90) does not report a track 62a, 62b based on a single detection, but waits for several measurements to be associated before it is "sure enough". The result is the so-called "validated track".
  • the tracking system 52 will contain the invention, leading to a reduced track validation time.
  • the driver sensor 100 can be located in the truck 90. Through this driver sensor 100 it is possible to monitor the driver behaviour, which means that the driver is actually regarded as a sensor, the driver being monitored by the driver sensor 100.
  • the other sensors 110, referred to as external sensors 110, typically detect positions and velocities of objects, such as the vehicles 60a, 60b, in the ambient of the truck 90.
  • sensor data are collected by the driver sensor (100 in Fig. 3 ) and by at least one external sensor (110 in Fig. 3 ), e. g. a radar sensor 10 or the like monitoring the surrounding ambient 12.
  • the driver sensor (100 in Fig. 3) and the at least one external sensor 110 are coupled to a tracking system 52, and their data are advantageously combined (fused) while the respective measurement errors of the data are uncorrelated in time with respect to the at least one application 50.
  • the sensor data are combined as detections B, when the measurement errors are still uncorrelated in time at least with respect to their noise, and then treated in the sensor fusion system 20 ( Fig. 2 ).
  • the estimation error of the processed sensor data is correlated in time after processing in the sensor fusion system 20.
  • the term "track" can be appropriately used for the sensor data, as most fusion systems consist of a tracking system.
  • the sensor data (track) can subsequently be presented to an application 50.
  • If the driver sensor 100 indicates a fast response, or is even faster than the external sensor 110, it can be assumed that the driver is not distracted or drowsy. A change in such a relation can favourably be correlated with drowsiness, as the driver may become tired as time goes by, whereas the external sensors 110 are expected to perform the same all the time.
  • Fig. 4a to Fig. 4c illustrate a sequence of detection of an object.
  • the object is by way of example a vehicle 60 moving on a driveway entering a road in which a truck 90 is moving.
  • the stars 64 illustrate e.g. radar measurements.
  • in Fig. 4a, the object (vehicle 60) is first detected by an external sensor at time t0, while the approaching vehicle 60 may still be out of the field of view of the driver.
  • at time t1, the tracking system reports the validated object (i.e. data C in Fig. 1; before validation, data B may exist but data C is empty, and when data B has proved the presence of an object (validation) it is reported as data C), which can then be used in vehicular functions such as collision warning, active cruise control, collision mitigation by braking or the like.
  • at time tD, a visual acknowledgement of the object 60 by the driver of the truck 90, i.e. by the driver sensor 100, is registered. The vehicle 60 may just have entered the field of view of the driver and the driver may have looked at the spot where the vehicle 60 appeared. This may have been recognized by a driver-related sensor 100 by monitoring a change in the driver's glance direction and/or head position or the like.
  • the driver helps validating the target as tD < t1.
  • Fig. 5 illustrates the sequence of Fig. 4 in a flow chart.
  • a reaction time assessment is performed based on the time difference tD-t0.
  • Fig. 4c illustrates both the calculation of a reaction time and a reduced track validation time, as in step 202 the track score part can be calculated.
  • the time it takes for the driver to react on the presence of detected objects can be used to estimate future reaction times, i.e. the time it takes from when a driving situation changes until the driver takes an appropriate action.
  • untracked (un-validated) sensor detections can be used as a time stamp for when the object (e.g. vehicle 60) first appears in the driver field of view.
  • the time of the first track detection t0 can be subtracted from the time tD when the driver looks at that spot, wherein for a typical radar sensor "later" means a time of less than a second, by way of example between 100 ms and 400 ms. This results in an estimate of the cognitive reaction time of the driver.
  • conventionally, the time t1 - t0 is generally unknown because t1 is the only time stamp available; it is then impossible to derive the desired reaction time tD - t0. Because it is possible for tD to be smaller than t1, this cannot even be calculated on a higher level. However, according to the invention, by altering the tracking, the time stamp of the first detection can be included, as sketched below.
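  • A sketch of what "including the time stamp of the first detection" could look like on a track record (the field names are illustrative, not the patent's):

        from dataclasses import dataclass

        @dataclass
        class Track:
            """Track record that keeps the time stamp t0 of the first raw
            detection, so the driver reaction time tD - t0 stays computable
            even after validation at the later time t1."""
            track_id: int
            t0: float                  # time of first raw detection [s]
            t1: float = None           # validation time [s]
            tD: float = None           # driver visual acknowledgement [s]

            def reaction_time(self):
                """Cognitive reaction time estimate tD - t0, if available."""
                return None if self.tD is None else self.tD - self.t0

        trk = Track(track_id=7, t0=12.30)
        trk.tD = 12.55                 # driver looks at the spot; tD < t1
        trk.t1 = 12.80                 # track validated later
        print(trk.reaction_time())     # ~0.25 s, within the quoted 100-400 ms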
  • a measurement can be derived that is likely to be correlated with the driver reaction time and distraction level. This can be done by looking not at how the driver handles the own vehicle, but rather at how much better or worse the driver is at assessing the environment compared to a sensor system consisting of radars/lidars/cameras and the like.
  • Fig. 6 displays a further preferred embodiment of the invention by depicting a tracking system providing a functionality for "driver acknowledgement”.
  • the driver is an information provider and can favourably be modelled as a sensor additional to one or more external sensors which monitor the environment. If the driver looks quickly or often to some point there is a high probability that something is there, e.g. an approaching vehicle (object) or the like.
  • the driver can be expected to have a wider field of view than a single sensor system.
  • Fig. 6 provides a schematic illustration of a tracking system with an additional functionality for "driver acknowledgment".
  • Data of a multitude of sensors, i.e. sensor 1 (driver sensor), sensor 2 (detections), sensor 3 (detections), collected in step 302, are fed into a tracking system in step 304.
  • the driver sensor may deliver a signal of a visual acknowledgment of an object. Whether the activity is a visual confirmation or not is preferably judged by the tracking system, comparing the visual activity with other sensor data.
  • in step 306 it is checked if the tracking score is high. If yes, the object (e.g. vehicle 60 in Fig. 4a-4c) has been identified as a valid confirmed target to be reported to a safety application, e.g. a collision warning system.
  • the time t0 represents when a sensor first detects the presence of a probable object and t1 represents when the object has been identified as a valid confirmed target to be reported to the safety application. It is desirable that the time t1-t0 should be as small as possible.
  • a preferred safety system can perform better and act quicker. It is also possible to provide information to a system whether a driver is aware of a particular target or not.
  • box 202 corresponds to box 304.
  • Fig. 7 illustrates an example where an object, e.g. a vehicle 60, is first detected by an external sensor 110 mounted on a truck 90 and validation of the detection is not influenced by a driver sensor 100 at this first stage.
  • the driver sensor 100 may influence the validation.
  • a sensor field of view 130 is assigned to the external sensor 110 wherein the object (vehicle 60) enters the external sensor field 130.
  • a field of view 120 is assigned to the driver of the truck 90. It can be estimated how long it takes for the driver to visually acknowledge the vehicle 60, i.e. the track 62 assigned to the vehicle 60, after entering the driver's field of view 120, as long as it is known for the low level fusion or tracking when the track 62 entered the field of view 120.
  • this setup can be used to estimate the region in which the driver can see objects, in other words estimate the driver visual field of view 120. This is done by remembering where objects usually are when visually acknowledged by the driver sensor 100. After some time the region will be known and assigned as the driver's field of view 120. This can be used by threat assessment algorithms that now can assess if a potential danger can be detected by the driver or not.
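  • A sketch of such a field-of-view estimate (representing the field of view as a bearing interval and using 95% coverage are our assumptions): collect the bearings at which the driver has visually acknowledged objects and take the interval that covers most of them.

        import numpy as np

        def estimate_fov(acknowledged_bearings_deg, coverage=0.95):
            """Estimate the driver's field of view as the bearing interval
            (degrees, 0 = straight ahead) containing `coverage` of all
            bearings at which objects were visually acknowledged."""
            b = np.asarray(acknowledged_bearings_deg)
            tail = (1.0 - coverage) / 2.0
            return np.quantile(b, tail), np.quantile(b, 1.0 - tail)

        history = np.random.normal(0.0, 30.0, size=500)  # synthetic log
        print(estimate_fov(history))  # roughly (-59, 59) for this synthetic data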
  • Fig. 8 illustrates an example where an object, e.g. a vehicle 60, is first detected by the driver sensor 100 and validation is not influenced by the external sensor 110 at this first stage. It is unlikely that the driver sensor 100 alone can validate a track 62 of the object (vehicle 60), its signal most probably being very noisy, but there is a reasonable probability that the track 62 can be quickly initiated when external sensor measurements become available. If the driver sensor 100 is present, this can advantageously be quicker than if the driver sensor 100 were not present.
  • If the driver is regarded as a driver sensor 100, it is necessary to describe what the driver sensor 100 is likely to measure in any given situation, that is, to form a probability density p(y_k | x_k).
  • a simple model can then be built with the probability P_d (probability of detection), wherein the driver looks at a point close to the object with the probability P_d; the deviations ε_x, ε_y of the gaze point from the object position can be modelled as Gaussian random noise.
  • Extensions of this model can include modelling the noise and the probability P_d to be different if an object, e.g. another vehicle, is far away or seen through e.g. a rear-view mirror.
  • the class of the object, e.g. car, truck, pedestrian, road sign, intersection, can be used to affect the noise and the probability P_d.
  • Noise can have other distributions, and P_d can be a function of x_k. Further, the driver sensor measurement space may have more than two dimensions, measurement noise may not be additive, as it may enter through a more complex model, and the threat assessment by the driver may affect what object the driver chooses to look at. A sketch of such a measurement density follows.
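  • A sketch of such a measurement density p(y_k | x_k) with the two ingredients named above, a detection probability and Gaussian gaze noise; the mixture form and all numerical values are illustrative assumptions:

        import numpy as np

        def driver_gaze_likelihood(y, obj_pos, p_d=0.8, sigma=2.0, clutter=1e-4):
            """p(y | x): with probability p_d the gaze point y lies near the
            object position (isotropic Gaussian noise), otherwise the driver
            looks elsewhere, modelled as a uniform 'clutter' density."""
            d2 = np.sum((np.asarray(y, float) - np.asarray(obj_pos, float)) ** 2)
            gauss = np.exp(-0.5 * d2 / sigma**2) / (2.0 * np.pi * sigma**2)
            return p_d * gauss + (1.0 - p_d) * clutter

        print(driver_gaze_likelihood([10.5, 4.8], [10.0, 5.0]))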
  • the invention can be embodied as hardware or software or comprise both software and hardware. Further, the invention can be embodied as a computer program product which can be accessed from a medium which can be used or read by a computer.
  • the medium can provide a program code which can be used in a computer.
  • the medium can be a memory, such as a solid state memory, a RAM or a ROM and the like, a magnetic tape, a computer diskette, a magnetic or optical disc, a CD, a DVD, a USB stick etc.


Claims (15)

  1. Method for combining sensor data collected by at least two sensors (100, 110) which are coupled to at least one application (50), characterized in that at least one of the sensors (100) monitors the behaviour of a driver and provides driver-related sensor data of the driver-related behaviour, and at least one external sensor (110), which does not monitor the driver behaviour, provides sensor data which are not related to the driver-related behaviour, wherein the sensor data of the at least two sensors (100, 110) are combined and/or exchanged when the respective measurement errors of the data are uncorrelated in time with respect to their noise, and in that the sensor data of the at least two sensors (100, 110) are fed to a sensor fusion unit (20) for a combination and/or an exchange of the sensor data, and in that the sensor data are transmitted to one or more applications (50) subsequent to the sensor fusion unit (20), wherein estimation errors are correlated in time after the processing in the sensor fusion unit (20).
  2. Method according to any one of the preceding claims, characterized in that the sensor data are preprocessed before being fed into the sensor fusion unit (20).
  3. Method according to any one of the preceding claims, characterized in that the driver-related sensor data comprise at least data on the attention of the driver and/or the gaze direction of the driver.
  4. Method according to any one of the preceding claims, characterized in that the driver-related sensor data and data derived from the application (50) are used to estimate the field of view (120) of the driver.
  5. Method according to any one of the preceding claims, characterized in that a reaction time of the driver and/or a distraction level of the driver is derived from a comparison between driver-related sensor data indicating a cognitive detection of an object (60) and the detection of the object (60) by at least one external sensor (110).
  6. Method according to any one of the preceding claims, characterized in that a reaction time is extracted from a time difference between a detection of an object by the at least one external sensor (110) and driver-related sensor data indicating a cognitive detection of the same object (60) by the driver.
  7. Method according to any one of the preceding claims, characterized in that the at least one application (50) is a vehicle safety system.
  8. Method according to claims 6 and 7, characterized in that the reaction time of one or more drivers is extracted and stored for use in the vehicle safety system.
  9. Method according to any one of claims 6 to 8, characterized in that the reaction time is used to adapt a sensitivity level of the safety system and/or that the reaction time is used to evaluate a level of driver attention.
  10. Method according to any one of the preceding claims, characterized in that the driver-related sensor data are combined with tracking data of one or more objects (60), wherein preferably a sensor model is built into a tracking system (52), the sensor model comprising at least one or more of
    - a probability that the driver notices one or more objects (60);
    - an accuracy of a driver monitoring camera showing a head of a driver and/or a gaze direction;
    - a relation between head/gaze direction and the position of one or more objects (60) seen by the driver;
    - a probability that the driver looks at non-objects;
    - an influence on the trustworthiness of the track as a function of the driver attention.
  11. Method according to claim 10, characterized in that a track is recognized as valid if external sensor data and driver-related sensor data coincide.
  12. Safety system of a vehicle (90) which applies the method according to any one of the preceding claims.
  13. Tracking system of a vehicle (90) which applies the method according to any one of claims 1 to 11.
  14. Computer program comprising a computer program code adapted to perform a method, or for use in a method, according to at least one of claims 1 to 11 when the program is run on a programmable microcomputer, wherein preferably the computer program is adapted to be downloaded to a control unit or one of its components when run on a computer which is connected to the internet.
  15. Computer program product stored on a computer readable medium, comprising a program code for use in a method according to any one of claims 1 to 11 on a computer.
EP08878020.0A 2008-11-07 2008-11-07 Method and system for combining sensor data Active EP2347400B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SE2008/000634 WO2010053410A1 (en) 2008-11-07 2008-11-07 Method and system for combining sensor data

Publications (3)

Publication Number Publication Date
EP2347400A1 (de) 2011-07-27
EP2347400A4 (de) 2013-03-27
EP2347400B1 (de) 2014-03-12 (this publication)

Family

ID=42153074

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08878020.0A Active EP2347400B1 (de) 2008-11-07 2008-11-07 Verfahren und system zum kombinieren von sensordaten

Country Status (5)

Country Link
US (1) US8781688B2 (de)
EP (1) EP2347400B1 (de)
CN (1) CN102292754B (de)
BR (1) BRPI0823237A2 (de)
WO (1) WO2010053410A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10266132B2 (en) 2015-08-08 2019-04-23 Audi Ag Method for operating driver assistance systems in a motor vehicle, and motor vehicle

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2977551B1 (fr) * 2011-07-07 2014-01-24 Peugeot Citroen Automobiles Sa Device for assisting the manoeuvres of a vehicle based on an estimate of an overall reaction time
CN102368351B (zh) * 2011-10-19 2014-10-29 Beihang University Method for resolving a traffic conflict between two vehicles at an unsignalized intersection
US20130229298A1 (en) * 2012-03-02 2013-09-05 The Mitre Corporation Threaded Track Method, System, and Computer Program Product
US9760092B2 (en) * 2012-03-16 2017-09-12 Waymo Llc Actively modifying a field of view of an autonomous vehicle in view of constraints
EP2682318B1 (de) 2012-07-03 2015-01-28 Volvo Car Corporation Motorfahrzeugkollisionswarnsystem
US8937552B1 (en) * 2013-01-02 2015-01-20 The Boeing Company Heads down warning system
US9734685B2 (en) 2014-03-07 2017-08-15 State Farm Mutual Automobile Insurance Company Vehicle operator emotion management system and method
US9135803B1 (en) * 2014-04-17 2015-09-15 State Farm Mutual Automobile Insurance Company Advanced vehicle operator intelligence system
US10185999B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Autonomous feature use monitoring and telematics
US9972054B1 (en) 2014-05-20 2018-05-15 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US10319039B1 (en) 2014-05-20 2019-06-11 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US11669090B2 (en) 2014-05-20 2023-06-06 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US10373259B1 (en) 2014-05-20 2019-08-06 State Farm Mutual Automobile Insurance Company Fully autonomous vehicle insurance pricing
US10599155B1 (en) 2014-05-20 2020-03-24 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US10185998B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US11030696B1 (en) 2014-07-21 2021-06-08 State Farm Mutual Automobile Insurance Company Methods of providing insurance savings based upon telematics and anonymous driver data
US10088561B2 (en) * 2014-09-19 2018-10-02 GM Global Technology Operations LLC Detection of a distributed radar target based on an auxiliary sensor
US10915965B1 (en) 2014-11-13 2021-02-09 State Farm Mutual Automobile Insurance Company Autonomous vehicle insurance based upon usage
JP6319349B2 (ja) * 2015-04-03 2018-05-09 Denso Corp Information presentation device
JP6330712B2 (ja) * 2015-04-08 2018-05-30 Toyota Motor Corp Obstacle detection device
US11107365B1 (en) 2015-08-28 2021-08-31 State Farm Mutual Automobile Insurance Company Vehicular driver evaluation
US20170072850A1 (en) * 2015-09-14 2017-03-16 Pearl Automation Inc. Dynamic vehicle notification system and method
WO2017057058A1 (ja) * 2015-09-30 2017-04-06 Sony Corp Information processing device, information processing method, and program
US10747234B1 (en) 2016-01-22 2020-08-18 State Farm Mutual Automobile Insurance Company Method and system for enhancing the functionality of a vehicle
US11242051B1 (en) 2016-01-22 2022-02-08 State Farm Mutual Automobile Insurance Company Autonomous vehicle action communications
US11441916B1 (en) 2016-01-22 2022-09-13 State Farm Mutual Automobile Insurance Company Autonomous vehicle trip routing
US11719545B2 (en) 2016-01-22 2023-08-08 Hyundai Motor Company Autonomous vehicle component damage and salvage assessment
US10134278B1 (en) 2016-01-22 2018-11-20 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US10324463B1 (en) 2016-01-22 2019-06-18 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation adjustment based upon route
US10395332B1 (en) 2016-01-22 2019-08-27 State Farm Mutual Automobile Insurance Company Coordinated autonomous vehicle automatic area scanning
US9940834B1 (en) 2016-01-22 2018-04-10 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US10528725B2 (en) 2016-11-04 2020-01-07 Microsoft Technology Licensing, Llc IoT security service
US10972456B2 (en) 2016-11-04 2021-04-06 Microsoft Technology Licensing, Llc IoT device authentication
CN106767760A (zh) * 2016-12-30 2017-05-31 707th Research Institute of China Shipbuilding Industry Corporation Multi-dimensional multi-source ship target fusion method
US20180240025A1 (en) * 2017-02-17 2018-08-23 Microsoft Technology Licensing, Llc Behavior-based data corroboration
KR20180104235A (ko) * 2017-03-10 2018-09-20 Mando Hella Electronics Corp Method and apparatus for monitoring a driver state
US20180268311A1 (en) * 2017-03-14 2018-09-20 Microsoft Technology Licensing, Llc Plausibility-based authorization
CN107174262B (zh) * 2017-05-27 2021-02-02 Southwest Jiaotong University Attention evaluation method and ***
DE102017211607A1 (de) * 2017-07-07 2019-01-10 Robert Bosch Gmbh Method for verifying a digital map of a more highly automated vehicle (HAF), in particular of a highly automated vehicle
US11906625B2 (en) 2018-01-08 2024-02-20 The Regents Of The University Of California Surround vehicle tracking and motion prediction
EP3819668A4 (de) * 2018-07-02 2021-09-08 Sony Semiconductor Solutions Corporation Information processing device, information processing method, computer program, and mobile body device
KR102569900B1 (ko) * 2018-12-04 2023-08-23 Hyundai Motor Company Omnidirectional sensor fusion device, sensor fusion method thereof, and vehicle including the same
KR102555916B1 (ko) * 2018-12-12 2023-07-17 Hyundai Motor Company Apparatus for determining the reliability of ODM information, determination method thereof, and vehicle using the same
US11625909B1 (en) * 2022-05-04 2023-04-11 Motional Ad Llc Track segment cleaning of tracked objects

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7477758B2 (en) * 1992-05-05 2009-01-13 Automotive Technologies International, Inc. System and method for detecting objects in vehicular compartments
JPH06150199 (ja) * 1992-11-13 1994-05-31 Mitsubishi Electric Corp Vehicle preventive safety device
US5482314A (en) * 1994-04-12 1996-01-09 Aerojet General Corporation Automotive occupant sensor system and method of operation by sensor fusion
US20060284839A1 (en) * 1999-12-15 2006-12-21 Automotive Technologies International, Inc. Vehicular Steering Wheel with Input Device
US7527288B2 (en) * 1995-06-07 2009-05-05 Automotive Technologies International, Inc. Vehicle with crash sensor coupled to data bus
US6026340A (en) * 1998-09-30 2000-02-15 The Robert Bosch Corporation Automotive occupant sensor system and method of operation by sensor fusion
US7920102B2 (en) * 1999-12-15 2011-04-05 Automotive Technologies International, Inc. Vehicular heads-up display system
US6580973B2 (en) * 2000-10-14 2003-06-17 Robert H. Leivian Method of response synthesis in a driver assistance system
US7565230B2 (en) * 2000-10-14 2009-07-21 Temic Automotive Of North America, Inc. Method and apparatus for improving vehicle operator performance
US6909947B2 (en) * 2000-10-14 2005-06-21 Motorola, Inc. System and method for driver performance improvement
US6925425B2 (en) * 2000-10-14 2005-08-02 Motorola, Inc. Method and apparatus for vehicle operator performance assessment and improvement
US7124027B1 (en) 2002-07-11 2006-10-17 Yazaki North America, Inc. Vehicular collision avoidance system
US6922632B2 (en) * 2002-08-09 2005-07-26 Intersense, Inc. Tracking, auto-calibration, and map-building system
US6989754B2 (en) * 2003-06-02 2006-01-24 Delphi Technologies, Inc. Target awareness determination system and method
SE0303122D0 (sv) * 2003-11-20 2003-11-20 Volvo Technology Corp Method and system for communication and/or interaction between a vehicle driver and a plurality of applications
DE102005014803A1 (de) 2005-03-31 2006-10-05 Bayerische Motoren Werke Ag Method and device for controlling a collision avoidance system
US7835834B2 (en) * 2005-05-16 2010-11-16 Delphi Technologies, Inc. Method of mitigating driver distraction
US7460951B2 (en) * 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
JP2007094618A (ja) * 2005-09-28 2007-04-12 Omron Corp Notification control device and method, recording medium, and program
DE602006010380D1 (de) * 2006-09-08 2009-12-24 Ford Global Tech Llc System and method for determining attention on an object
JP4922715B2 (ja) * 2006-09-28 2012-04-25 Takata Corp Occupant detection system, alarm system, braking system, vehicle
US7579942B2 (en) * 2006-10-09 2009-08-25 Toyota Motor Engineering & Manufacturing North America, Inc. Extra-vehicular threat predictor
US7741962B2 (en) * 2006-10-09 2010-06-22 Toyota Motor Engineering & Manufacturing North America, Inc. Auditory display of vehicular environment
US7880621B2 (en) * 2006-12-22 2011-02-01 Toyota Motor Engineering & Manufacturing North America, Inc. Distraction estimator
US20100019880A1 (en) * 2008-07-24 2010-01-28 Gm Global Technology Operations, Inc. Adaptive vehicle control system with driving style recognition based on traffic sensing
US7831407B2 (en) * 2008-07-24 2010-11-09 Gm Global Technology Operations, Inc. Adaptive vehicle control system with driving style recognition based on vehicle U-turn maneuvers
US8170725B2 (en) * 2009-02-18 2012-05-01 GM Global Technology Operations LLC Vehicle stability enhancement control adaptation to driving skill based on highway on/off ramp maneuver
US8344894B2 (en) * 2009-04-02 2013-01-01 GM Global Technology Operations LLC Driver drowsy alert on full-windshield head-up display
US8269652B2 (en) * 2009-04-02 2012-09-18 GM Global Technology Operations LLC Vehicle-to-vehicle communicator on full-windshield head-up display
KR101302134B1 (ko) * 2009-12-18 2013-08-30 Electronics and Telecommunications Research Institute Apparatus and method for providing composite sensor information
US8384534B2 (en) * 2010-01-14 2013-02-26 Toyota Motor Engineering & Manufacturing North America, Inc. Combining driver and environment sensing for vehicular safety systems

Also Published As

Publication number Publication date
CN102292754A (zh) 2011-12-21
US8781688B2 (en) 2014-07-15
EP2347400A1 (de) 2011-07-27
BRPI0823237A2 (pt) 2015-06-16
US20120083974A1 (en) 2012-04-05
WO2010053410A8 (en) 2011-09-01
WO2010053410A1 (en) 2010-05-14
EP2347400A4 (de) 2013-03-27
CN102292754B (zh) 2014-07-30


Legal Events

PUAI — Public reference made under article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
17P — Request for examination filed — Effective date: 20110607
AK — Designated contracting states (kind code of ref document: A1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR
AX — Request for extension of the European patent — Extension state: AL BA MK RS
D17D — Deferred search report published (deleted)
DAX — Request for extension of the European patent (deleted)
A4 — Supplementary search report drawn up and despatched — Effective date: 20130221
RIC1 — Information provided on IPC code assigned before grant — Ipc: A61B 5/18 (2006.01) ALI 20130215 BHEP; Ipc: G08G 1/16 (2006.01) AFI 20130215 BHEP; Ipc: G01S 13/93 (2006.01) ALI 20130215 BHEP
GRAP — Despatch of communication of intention to grant a patent (original code: EPIDOSNIGR1)
INTG — Intention to grant announced — Effective date: 20131022
GRAS — Grant fee paid (original code: EPIDOSNIGR3)
GRAA — (Expected) grant (original code: 0009210)
AK — Designated contracting states (kind code of ref document: B1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR
REG (GB) — Ref legal event code: FG4D
REG (CH) — Ref legal event code: EP
REG (AT) — Ref legal event code: REF — Ref document number: 656745 — Kind code: T — Effective date: 20140315
REG (IE) — Ref legal event code: FG4D
REG (DE) — Ref legal event code: R096 — Ref document number: 602008030906 — Effective date: 20140424
REG (SE) — Ref legal event code: TRGR
REG (NL) — Ref legal event code: VDEP — Effective date: 20140312
PG25 — Lapsed in a contracting state: NO (effective date: 20140612), LT (effective date: 20140312) — lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit
REG (AT) — Ref legal event code: MK05 — Ref document number: 656745 — Kind code: T — Effective date: 20140312
REG (LT) — Ref legal event code: MG4D
PG25 — Lapsed in a contracting state: FI, CY (effective date: 20140312) — lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit
PG25 — Lapsed in a contracting state: LV, HR (effective date: 20140312) — lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit
PG25 — Lapsed in a contracting state [announced via postgrant information from national office to EPO]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140712

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140612

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602008030906

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140714

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

26N No opposition filed

Effective date: 20141215

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602008030906

Country of ref document: DE

Effective date: 20141215

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: LU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20141107

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20141130

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20141130

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20141107

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140613

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140312

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20081107

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20191122

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20191128

Year of fee payment: 12

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20201107

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201107

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201107

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20211122

Year of fee payment: 14

Ref country code: FR

Payment date: 20211126

Year of fee payment: 14

REG Reference to a national code

Ref country code: SE

Ref legal event code: EUG

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221108

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20221130

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20231127

Year of fee payment: 16