EP3089136A1 - Apparatus and method for detecting an object in a surveillance area of a vehicle - Google Patents

Apparatus and method for detecting an object in a surveillance area of a vehicle Download PDF

Info

Publication number
EP3089136A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
camera
radar sensor
control unit
surveillance area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP15165858.0A
Other languages
German (de)
French (fr)
Inventor
Huba NÉMETH
Marton GYÖRI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Original Assignee
Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Knorr Bremse Systeme fuer Nutzfahrzeuge GmbH
Priority to EP15165858.0A
Publication of EP3089136A1
Legal status: Pending

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/165 - Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167 - Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G08G1/168 - Driving aids for parking, e.g. acoustic or visual feedback on parking space

Definitions

  • Such a converted picture is shown in Fig. 5, which shows the exemplary bicycle rider as object 10, who uses a bike lane parallel to the traffic direction of the vehicle 70. This bird's eye view can be shown to the driver in combination with, or as an alternative to, the raw camera view, or made selectable, so that the driver can choose the preferred way of viewing.
  • Fig. 5 further shows the area 50 covered by the radar sensor 120 as a dotted line, so that the driver can see whether the detected object is reliably detected by both sensors or whether the object is outside the coverage area 50 of the radar sensor 120.
  • Fig. 6 depicts a method for detecting an object 10 in a surveillance area 30, 50 of a vehicle 70.
  • The method comprises the steps of: capturing (S110) image data of said surveillance area 30 by at least one camera 110; providing (S120) information indicative of a distance to said object 10 by at least one radar sensor 120; receiving (S130) said image data and said information by a control unit 130; and generating (S140), by said control unit 130, a visual representation of said object 10 based on said received image data and/or said information indicative of said distance.
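  • As an illustration only (not part of the disclosure), the four steps S110 to S140 could be orchestrated as in the following Python sketch; capture_image, read_radar and show are hypothetical placeholders for whatever camera, radar and display interfaces the concrete system provides, and the (delay, angle) echo format is an assumption.

    # Hypothetical control-loop sketch for steps S110-S140 (all names are placeholders).
    def detection_cycle(capture_image, read_radar, show):
        image = capture_image()            # S110: image data of the surveillance area
        echoes = read_radar()              # S120: information indicative of a distance, e.g. (delay, angle) pairs
        detections = [                     # S130: the control unit receives and interprets both inputs
            (0.5 * 299_792_458.0 * delay, angle)   # distance from the round-trip delay, d = c * dt / 2
            for delay, angle in echoes
        ]
        show(image, detections)            # S140: visual representation, e.g. image with marked objects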
  • This method may also be a computer-implemented method.
  • a person of skill in the art would readily recognize that steps of various above-described methods might be performed by programmed computers.
  • Embodiments are also intended to cover program storage devices, e.g., digital data storage media, which are machine or computer readable and encode machine-executable or computer-executable programs of instructions, wherein the instructions perform some or all of the acts of the above-described methods, when executed on a computer or processor.
  • The visual feedback (for example the displayed camera view) may give the driver a better understanding of the situation and assist the driver in finding the correct action to avoid an accident.
  • With a single camera view the driver is already able to observe the critical area around the vehicle, even without the object detection feature that the disclosed system can optionally implement.
  • Even when the system is used only in a visual mode, the number of accidents caused by collisions with objects in blind spots around commercial vehicles can already be reduced significantly.
  • The disclosed system comprises, for example, at least one camera 110, at least one radar sensor 120 and a display 140 providing visual feedback to the driver.
  • The commercial-vehicle-specific turning assistant gives the driver the possibility of visually observing critical areas and helps the driver to recognize relevant objects. This recognition can, for example, be supported by marking the objects on the camera view and/or by warning about the occurrence of such objects.
  • the system may also initiate a braking and/or steering intervention, if the driver does not act accordingly.
  • the robustness of the system is increased by the combination or fusion of two different sensor signals (the radar signal and the camera images), which cover the same surveillance area. This may also extend the operational range of the system, because it can be applied under various environmental circumstances, for example at night, during the day, or under severe weather conditions.
  • the redundancy provided by two different sensor units is a particular advantage of embodiments of the present invention.
  • the second, parallel sensor can compensate for insufficiencies of one sensor type.
  • the camera may not capture usable pictures when the weather conditions are bad or at nighttime.
  • radar sensors might be affected by strong rain or snowfall, and falling leaves may generate false detections of phantom objects.
  • by combining both sensor types, the resulting apparatus can operate reliably during the daytime and at night and under all weather conditions, such as, for example, clear weather, fog, rain or snow.
  • the driver gains confidence about the actual driving traffic situation even at the side opposite to the driver's side.
  • the camera as well as the radar sensor may be mounted on the towing vehicle (tractor), whereas the towed vehicle (for example a trailer) is not involved in the system installation.
  • the trailer can be freely exchanged and can be used for any type of towing vehicle without compromising the safety.
  • the at least one camera 110 can be mounted high enough, for example around the top of the cabin, to provide a good freedom of view of the relevant area.
  • the radar sensor(s) 120 may, for example, be installed between the front axle and the rear axle of the towing vehicle 70.
  • the radar signal may be transmitted into the surveillance area 50 parallel to the ground, thereby avoiding false detections originating from reflections from the ground.
  • Any type of wave signal can be used for the distance measurement, for example electromagnetic waves such as RF waves, or ultrasonic waves.
  • Embodiments thus provide a turning-assist apparatus comprising: at least one camera 110 and at least one radar sensor 120, both installed on the relevant side of the vehicle (the rider side, opposite to the driver side); and a display 140 showing the view of the camera to the driver.
  • the turning-assistant apparatus is characterized in that the relevant objects 10 are indicated to the driver.
  • the turning-assistant apparatus is characterized in that the camera view is processed by image processing algorithms to detect objects 10.
  • the turning-assistant apparatus is characterized in that the objects 10 detected by the radar sensor 120 are indicated to the driver.
  • the turning-assistant apparatus is characterized in that a sensor fusion between the two sensor types (camera 110 and radar sensor 120) is realized to perform the object detection.
  • the turning-assistant apparatus is characterized in that the object indication to the driver is realized by marking the objects 10 on the camera view displayed to the driver.
  • the turning-assistant apparatus is characterized in that the system can enable further warning methods (e.g. audible, haptic) to warn the driver of critical situations.
  • the turning-assistant apparatus is characterized in that the camera is a fish-eye lens camera.
  • the turning-assistant apparatus is characterized in that the system is capable of converting the camera view into a bird's eye view using a point-of-view conversion method.
  • the turning-assistant apparatus is characterized in that the system can predict the path of the vehicle.
  • the turning-assistant apparatus is characterized in that the system can distinguish between stationary and moving objects and predict the path of the object.
  • the turning-assistant apparatus is characterized in that the system can predict intersection of vehicle and object path.
  • the turning-assistant apparatus is characterized in that the system can brake and/or steer to avoid the collision.
  • the turning-assistant apparatus is characterized in that the system can be optionally extended to other sides of the vehicle by adding further cameras and radar sensors.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

An apparatus for detecting an object (10) in a surveillance area (30, 50) of a vehicle (70) comprises: at least one camera (110) for capturing image data of said surveillance area (30); at least one radar sensor (120) for providing information indicative of a distance (d) to said object (10); and a control unit (130) configured to receive said image data and said information and to generate a visual representation of said object (10) based on said received image data and/or said information indicative of said distance (d).

Description

  • The present invention relates to an apparatus and a method for detecting an object in a surveillance area of a vehicle and, in particular, to an advanced turning-assistant system for commercial vehicles.
  • Background
  • Lane changes or turning maneuvers are potentially critical traffic situations for commercial vehicles, in which other traffic participants are easily overlooked. In passenger cars blind spot assist systems are used to prevent accidents in such situations by supporting the driver, for example, during lane changes or turns. The driver of a passenger car who is warned by such turn assist systems is in most cases able to confirm such warnings by a visual inspection of the respective area, for example by moving or turning her/his head. On the other hand, for commercial vehicles this is in most cases not possible because the driver has no view of, for example, the passenger side of the vehicle and thus cannot confirm a warning given by a turning assist system by visual inspection.
  • Therefore, if conventional systems known from passenger cars are implemented in commercial vehicles, the level of awareness of the driver will be significantly lower, in particular for turns to the side opposite to the driver of the vehicle. Thus, the known systems should be extended to give the driver of a commercial vehicle the same level of confidence in traffic situations where the driver cannot receive a visual confirmation of a possibly dangerous situation.
  • Conventional systems available for commercial vehicles are based on various sensors (e.g. a radar sensor, a laser scanner or a camera) and provide, for example, a blinking light (e.g. in a rearview mirror) or an audible warning for the driver. These systems do not provide the same level of confidence as is known from passenger cars.
  • For example, DE 10 2010 048 144 A1 discloses a vehicle which has a sensor to detect objects in a surrounding area of the vehicle, wherein the surrounding area extends along one long side of the vehicle. Furthermore, DE 10 2009 041 556 A1 discloses a blind spot assist system based on ultrasonic sensors, which are arranged along a side and a corner region of the vehicle. DE 10 2012 010 876 A1 discloses another drive assist system based on image sensors which are arranged between the front axle and the rear axle in a lateral region of the vehicle main portion. Further driver assist systems are disclosed in WO 2014/037064 A1 with sensors arranged at a vehicle trailer and in WO 2014/041023 , wherein a collision warning system for lane changes is described.
  • All these conventional systems provide some support for the driver to make sure that no object is present in a blind spot, when making turns with a commercial vehicle. However, these conventional systems are disadvantageous in that they rely on multiple sensors to be installed at the vehicle or in that all sensors are subject to the same limitations as the systems known from passenger cars and cannot ensure a desired level of safety.
  • Therefore, there is a need for an apparatus for detecting an object in a surveillance area of a vehicle which overcomes the disadvantages of the conventional systems described above and provides, in particular, an increased level of safety and robustness for commercial vehicles.
  • Summary of the Invention
  • The present invention solves the afore-mentioned problem by providing an apparatus according to claim 1 and a method according to claim 15. The dependent claims refer to specifically advantageous realizations of the subject matter of claim 1.
  • The present invention relates to an apparatus for detecting an object in a surveillance area of a vehicle. The apparatus comprises at least one camera for capturing image data of the surveillance area, at least one radar sensor (or radar unit) for providing information indicative of a distance to the object, and a control unit configured to receive the image data and the information from the radar sensor(s) to generate a visual representation of the object based on the received image data and/or on the information indicative of the distance.
  • The surveillance area can be any area surrounding the vehicle and may cover, in particular, an adjacent driving lane or a sidewalk or a bicycle lane. Each sensor has its own coverage area from which it can take sensor data; this coverage area is possibly larger than the surveillance area, so that both sensors cover a common area that can be identified as the surveillance area or of which the surveillance area forms a part. Therefore, the surveillance area is part of both coverage areas.
  • The distance may be defined as the distance to the object measured from the respective sensor or from the vehicle (or any particular part of the vehicle). The present invention does not rely on any particular coordinate system. Rather, the position, distance or angle may be defined with respect to any desired coordinate system (e.g. the distance(s) or angle(s) can be measured with respect to the sensor units).
  • Information indicative of a distance shall be interpreted very broadly and shall contain any information which can be used to derive therefrom a distance between the vehicle (or the radar sensor) and the object. This information may include in particular time delay information between a transmitted and a reflected radar signal measured by the radar sensor or the corresponding moments in time of the emission and reception of the used (RF) signals. Based on the time delay or the two measured different moments in time or any other information, the control unit can then determine or calculate the distance to and/or the position of the object.
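  • As a minimal illustration (not part of the claimed subject matter) of how such time-delay information translates into a distance, the following Python sketch applies the round-trip relation d = c * (t_receive - t_emit) / 2; the example timing values are made up.

    # Illustrative only: distance from the measured round-trip delay of the radar signal.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed of the RF signal

    def distance_from_delay(t_emit: float, t_receive: float) -> float:
        """Return the one-way distance in metres, d = c * (t_receive - t_emit) / 2."""
        return SPEED_OF_LIGHT * (t_receive - t_emit) / 2.0

    print(distance_from_delay(0.0, 0.33e-6))  # a 0.33 microsecond round trip is roughly 50 m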
  • The above-mentioned problem is thus solved by the present invention by providing a redundancy of two independent sensor units of different types that enable a visual feedback to the driver while avoiding false detections (e.g. due to abnormal weather conditions affecting one sensor type).
  • For this feedback, the apparatus may further comprise a display unit that is configured to receive the visual representation of the object from the control unit and to show at least part of the surveillance area together with the object to the driver of the vehicle. The display may be configured to show to the driver the image captured by the at least one camera as one image and, as a second image, to display a radar picture obtained from the information provided by the at least one radar sensor.
  • To obtain a radar picture, in yet another embodiment, the at least one radar sensor may be configured to transmit a radio frequency (RF) signal and to receive a reflected RF signal from the object. The at least one radar sensor may further be configured to determine an angle or an angular range from where the RF signal was received. The control unit may be configured to determine a position of the object based on the determined angle or angular range and the information indicative of the distance.
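  • A hedged sketch of this position determination (purely illustrative, with made-up values): given the measured distance d and the angle of the reflected signal, the object position in a sensor-fixed coordinate system follows from elementary trigonometry.

    import math

    # Illustrative only: object position relative to the radar sensor from distance d and angle alpha.
    def object_position(d: float, alpha_deg: float) -> tuple:
        alpha = math.radians(alpha_deg)
        return (d * math.cos(alpha), d * math.sin(alpha))  # (longitudinal, lateral) offsets in metres

    print(object_position(25.0, 30.0))  # an object 25 m away at 30 degrees: roughly (21.7, 12.5)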
  • Hence, according to embodiments of the present invention, two visual representations are available, one referring to the image captured by the camera(s) and the second derived from the information received from the radar sensor(s). Both images may either be overlaid on the display or can be displayed as separate images, e.g. adjacent to each other.
  • However, the radar sensor may not distinguish between the object of interest and background objects (e.g. trees or buildings which the driver is aware of). Therefore, in order to detect an object of interest (i.e. to distinguish it from the background), subsequent positioning steps may be implemented. Similarly, multiple objects of interest may be present in the surveillance area. To correctly identify those objects, the at least one radar sensor may be configured to repeatedly measure a distance between one or more candidate objects and the vehicle. The control unit may receive the repeatedly measured distances and, based thereon, can determine a motion of the detected objects relative to the vehicle. If the objects move relative to each other, the reflected radar signals can be identified as signals originating from different objects. Similarly, background objects can be detected as static objects at a larger distance (e.g. beyond a certain threshold). As a result, the apparatus is able to eliminate background objects and/or to detect one or more further objects of interest.
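  • A minimal sketch of this idea (not the claimed algorithm): assuming the repeated radar detections have already been associated between scans, detections that barely change and lie beyond a range threshold are treated as background; both thresholds below are invented example values.

    # Illustrative sketch: classify repeated radar detections into background and objects of interest.
    BACKGROUND_RANGE_M = 40.0    # assumed range threshold beyond which static targets count as background
    STATIC_TOLERANCE_M = 0.2     # assumed per-scan change below which a target counts as static

    def filter_background(previous_scan, current_scan):
        """previous_scan/current_scan: dicts mapping a track id to the measured distance in metres."""
        objects_of_interest = []
        for track_id, distance in current_scan.items():
            moved = abs(distance - previous_scan.get(track_id, distance))
            if distance > BACKGROUND_RANGE_M and moved < STATIC_TOLERANCE_M:
                continue  # static and far away: likely a tree, building or other background structure
            objects_of_interest.append(track_id)
        return objects_of_interest

    print(filter_background({"a": 55.0, "b": 12.0}, {"a": 55.1, "b": 11.4}))  # ['b']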
  • The same applies to the at least one camera. When only one camera is available, there may be a problem in identifying an object as being different from the background. For example, the ground or the driving lane may have certain patterns, which are typically not objects of any interest to the driver and thus should be eliminated in the captured images. As for the radar sensor, this can be achieved by taking subsequent images so that any relative motion of the vehicle with respect to the object and with respect to the background can be identified. Hence, any background object can be identified and thus eliminated from the captured images.
  • Therefore, in yet another embodiment, the at least one camera may be configured to capture subsequent images. The control unit may further be configured to process the subsequent images received from the at least one camera using a processing algorithm to detect the object in the surveillance area and/or to eliminate background objects and/or to detect one or more further objects.
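  • One common way to process subsequent images is frame differencing, sketched below using OpenCV (assumed to be available); this is only an illustration of comparing subsequent images, not the specific processing algorithm of the disclosure, and in practice the ego-motion of the vehicle would additionally have to be compensated.

    import cv2  # assumes the OpenCV library is available; illustration only

    def moving_object_boxes(frame_prev, frame_curr, min_area=500):
        """Return bounding boxes of regions that changed between two subsequent camera images."""
        gray_prev = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
        gray_curr = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray_prev, gray_curr)             # pixels that changed between the frames
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
        # Small changed regions (road texture, sensor noise) are ignored as background.
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]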
  • The same aim could be achieved if multiple cameras are available so that at least two pictures can be taken from different angles, which allows identifying objects belonging to the background or objects that are further away.
  • As already mentioned above, the control unit might be configured to fuse (combine) both sensor signals (or sensor images) into one single sensor image. This can, for example, be done by highlighting the object in the visual representation of the image captured by the camera, wherein the highlighted object is related to the object detected by the at least one radar sensor. Hence, it is not necessary to detect the object by the camera(s). Instead, the position of an object detected by the radar sensor or the control unit can simply be highlighted in the image taken by the camera(s) so that the driver can decide about the relevance of this object. Therefore, in yet another embodiment the control unit is configured to combine information received from the at least one radar sensor and the at least one camera and to highlight the object or multiple objects in the image shown to the driver of the vehicle.
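  • A hedged sketch of such highlighting (not the claimed fusion method): project_to_pixel stands for a hypothetical calibration function, obtained when mounting the sensors, that maps a radar position (distance, angle) to a pixel coordinate in the camera image; OpenCV is assumed for drawing.

    import cv2  # assumed available; sketch of marking radar detections in the camera image

    def highlight_radar_objects(image, radar_detections, project_to_pixel, half_size=40):
        """Draw a box in the camera image around each radar-detected object.

        project_to_pixel: hypothetical calibration mapping (distance, angle) -> pixel (u, v).
        """
        for distance, angle in radar_detections:
            u, v = project_to_pixel(distance, angle)
            cv2.rectangle(image, (u - half_size, v - half_size), (u + half_size, v + half_size),
                          color=(0, 0, 255), thickness=3)  # red box around the detected object
        return image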
  • In yet another embodiment the control unit may be configured to predict a path of the vehicle and/or to predict a path of the object and is further configured to issue a warning if the path of the vehicle intersects the path of the object. Also, the relative path between the object and the vehicle can be determined. The warning can be issued by a separate warning module (for example a loudspeaker or a warning light), which may or may not be part of the control unit and may or may not already be included in the vehicle, but may be controlled by the control unit.
  • The warning may comprise an acoustic signal and/or a haptic signal and/or an optical signal. The haptic signal may be related to any kind of vibration, which the driver is able to recognize.
  • In yet another embodiment the control unit may further be configured to issue a brake signal to enforce a braking and/or a steer signal to enforce a steering of the vehicle in case the path of the vehicle and the path of the object indicate a collision for a predetermined period of time (e.g. if the driver ignores the warning and a collision is going to happen when the vehicle stays on its path). The predetermined period can be selected such that the remaining time to the collision is as long as the vehicle needs to prevent the collision (i.e. it may define the latest moment). During the enforced braking or steering, the optional warning module can issue a constant warning signal, upon which the driver can interact to override the enforced braking and/or steering of the vehicle by taking appropriate actions to avoid the collision with the object.
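  • Purely as an illustration of path prediction and intersection checking (the disclosure does not prescribe a particular motion model), the following sketch extrapolates both paths with constant velocity over a short horizon and reports the first predicted near-encounter; all positions, velocities and thresholds are invented example values in a common ground-fixed coordinate system.

    # Illustration only: constant-velocity path prediction and a simple collision check.
    def paths_intersect(vehicle_pos, vehicle_vel, object_pos, object_vel,
                        horizon_s=4.0, step_s=0.1, safety_margin_m=1.5):
        t = 0.0
        while t <= horizon_s:
            vx = vehicle_pos[0] + vehicle_vel[0] * t
            vy = vehicle_pos[1] + vehicle_vel[1] * t
            ox = object_pos[0] + object_vel[0] * t
            oy = object_pos[1] + object_vel[1] * t
            if ((vx - ox) ** 2 + (vy - oy) ** 2) ** 0.5 < safety_margin_m:
                return True, t   # predicted collision and the remaining time until it
            t += step_s
        return False, None

    # Example: a turning vehicle and a cyclist going straight along its side.
    print(paths_intersect((0.0, 0.0), (3.0, 1.0), (5.0, 4.0), (2.0, 0.0)))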
  • In yet another embodiment the at least one camera may comprise a first and a second camera, and the at least one radar sensor comprises a first and a second radar sensor, wherein the first camera and the first radar sensor are installed on one side of the vehicle and the second camera and the second radar sensor are installed on another side of the vehicle to detect objects on different sides of the vehicle.
  • In yet another embodiment the at least one camera comprises a fish-eye lens camera to capture image data of a large area (e.g. with a viewing angle of more than 90° or up to 180°).
  • In yet another embodiment the control unit may further be configured to transform the image data received from the at least one camera into a bird's eye view (e.g. by using a point-of-view conversion method).
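  • One common way to perform such a conversion is a perspective warp, sketched below with OpenCV and NumPy (both assumed to be available); the four point correspondences are not from the disclosure and would in practice come from calibrating the mounted camera.

    import cv2
    import numpy as np  # both assumed available; illustration only

    def to_birds_eye(image, src_points, dst_points, output_size=(400, 600)):
        """Warp a camera image into an approximate bird's eye (top-down) view.

        src_points: four pixel coordinates of a ground rectangle as seen by the camera.
        dst_points: where those four corners should land in the top-down output image.
        """
        matrix = cv2.getPerspectiveTransform(np.float32(src_points), np.float32(dst_points))
        return cv2.warpPerspective(image, matrix, output_size)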
  • The present invention relates also to a vehicle with one of the described apparatus and the surveillance area may include an area of a side of the vehicle that extends over an angular range of more than 90° or up to 170° measured about the side of the vehicle.
  • The present invention relates also to a method for detecting an object in a surveillance area of a vehicle. The method comprises: capturing image data of the surveillance area by at least one camera; providing information indicative of a distance to the object by at least one radar sensor; receiving the image data and the information by a control unit; and generating, by the control unit, a visual representation of the object based on the received image data and/or the information indicative of the distance.
  • This method may also be implemented in software or as a computer program product, and the order of steps may not be important to achieve the desired effect. In addition, all functions described in conjunction with the apparatus may be implemented in further embodiments of the method.
  • Brief Description of the Drawings
  • Various embodiments of the present invention will be described in the following by way of examples only, and with respect to the accompanying drawings, in which:
  • Fig. 1
    depicts an apparatus for detecting an object in a surveillance area according to an embodiment of the present invention, when installed on a commercial vehicle;
    Fig. 2
    depicts further optional components of the embodiment depicted in Fig. 1;
    Fig. 3
    depicts another embodiment with multiple radar sensors and multiple cameras;
    Fig. 4
    shows a visual representation of the surveillance area as seen by the driver of the commercial vehicle;
    Fig. 5
    shows a bird's eye view of the visual representation of the surveillance area as seen by the driver of the commercial vehicle; and
    Fig. 6
    illustrates a flow chart of a method for detecting an object in a surveillance area according to an embodiment of the present invention.
  • Detailed Description
  • Fig. 1 depicts an apparatus for detecting an object 10 in a surveillance area 30, 50 of a vehicle 70. The apparatus comprises one camera 110 for capturing image data of the surveillance area 30, one radar sensor 120 for providing a distance d to the object 10, and a control unit 130. The control unit 130 is configured to receive the image data and the information from the radar sensor 120 to generate a visual representation of the object 10 based on the received image data and/or the distance d.
  • In the embodiment of Fig. 1, the apparatus is installed on an exemplary commercial vehicle 70 with a towing vehicle 71 (e.g. a tractor) and a trailer 72, wherein only one camera 110 and only one radar sensor 120 are installed at a right-hand side of the towing vehicle 71 in the driving direction (to the left in Fig. 1). The control unit 130 and an optional display 140 may further be installed on the towing vehicle 71. The control unit 130 is connected to the camera 110, to the radar sensor 120 and to the display 140 to receive the signals from the camera 110 and the radar sensor 120 and to supply corresponding visual data to the display 140. The radar sensor 120 may provide an angular resolution of the detected object 10 so that the control unit 130 is able to determine a position for the object 10 (e.g. given by the distance d and the angle α).
  • The camera 110 may be sensitive to capture images in the coverage area 30 and the radar sensor 120 has a range indicated by the dotted line 50. In particular, both coverage areas 30, 50 overlap significantly, and the overlap may define the surveillance area. The range 50 of the radar sensor 120 is limited by the used radar signal. For example, the one or more radar sensors 120 may operate with radio frequency signals with exemplary frequencies between 10 and 100 GHz, e.g. at about 24 GHz and/or at about 77/79 GHz. For example, one radar sensor may operate at 24 GHz whereas another may operate at 77/79 GHz. The detection range may go up to 100 meters, although in the target operating mode the range may be around 25 meters. Such a range is advantageous, because it allows covering a whole side of the vehicle 70 with one radar sensor 120, which can thus be installed on the towing vehicle 71 - there is no need to install any sensor on the trailer 72 (hence the system operates trailer-independently). In addition, in this case the radar sensor provides more accurate information about the detected objects.
  • In further embodiments, other or additional distance-measuring units may be used, for example ultrasonic devices, or any other electromagnetic radiation suitable for measuring a distance between the sensor 120 and the object 10 (e.g. laser light) can be employed. Therefore, in further embodiments, the radar coverage area 50 can also be smaller than the camera coverage area 30. However, it is of advantage to use electromagnetic wave signals which can travel a sufficiently long distance and thereby extend the coverage area 50, so that only one radar sensor 120 suffices to cover the whole side of the vehicle 70 and there is no need to distribute many radar sensors over the whole long side of the vehicle 70 (as in conventional systems).
  • The camera 110 and the radar sensor 120 may be installed at different locations. For example, the camera 110 may be installed at an upper portion of the driver cabin and the radar unit 120 may be installed at a lower portion (e.g. between two axles). Consequently, the viewing direction of the camera 110 is downward, which limits the coverage area 30 of the camera 110. On the other hand, the radar unit 120 may emit signals parallel to the ground so that the coverage area 50 of the radar unit 120 is merely limited by the range of the used signals. Hence, as shown in Fig. 1, the range 50 of the radar sensor 120 may be larger than the coverage area 30 covered by the camera 110 (although this area can be changed by changing the orientation of the camera 110).
  • Fig. 2 depicts a further embodiment of the apparatus according to the present invention, which differs from the embodiment as shown in Fig. 1 in that the camera 110 is attached near or at a corner of the towing vehicle 71. With the camera 110 installed in the corner region of the towing vehicle 71 it becomes possible to extend the coverage area 30 of the camera 110 to cover also the front side of the vehicle 70. All other features as depicted in Fig. 2 are the same as in Fig. 1 so that a repetition of the description is not needed here.
  • As will be described in the following, already a single radar sensor 120 can be used to obtain a radar image, which is suitable to be shown to the driver as an additional visual representation of the surveillance area.
  • As said before, the radar sensor 120 is configured to measure the distance d between the radar sensor 120 and the object 10. In order to measure this distance d the radar sensor may emit an RF-signal, which is reflected by the object 10 and returns to the radar sensor 120. The radar sensor 120 measures the time difference between the emission of the RF-signal to the object 10 and the reception of the return signal. From this time difference, while taking into account the propagation speed of the wave signal, the control unit 130 (or the radar sensor 120 itself) can determine the distance d from the radar sensor to the object 10.
  • The radar sensor 120 may further be configured to provide an angular resolution of the detected return signals. For example, the radar sensor 120 can scan the coverage area 50, starting on the left-hand side in Fig. 2 and subsequently scanning the area up to the right-hand side of Fig. 2 (or vice versa). When the radar sensor 120 receives a return signal, the radar sensor 120 can determine the corresponding angle α associated with a reflecting object. For example, the radar sensor 120 may be configured to emit pulses of RF-signals, e.g. one pulse for each angular value (or angular range), and if a reflected return signal is received, the radar sensor can assign the corresponding distance d and angle α to the detected object 10.
  • If multiple objects are present in the coverage area 50 (e.g. a first object 10a and a second object 10b), the radar sensor may receive multiple reflected signals from the multiple objects 10a, 10b. However, if the multiple objects do not move relative to each other, the radar sensor 120 (or the control unit 130) cannot distinguish whether the multiple reflected signals originate from different parts of only one object or whether two objects are present in the coverage area 50. However, the speed of each object can be determined by performing subsequent scans over the coverage area 50. If the independent objects move relative to each other they can be identified as different objects, and the radar sensor 120 can assign the corresponding distances d1, d2 and angles α1, α2 to the detected first and second objects 10a, 10b. For example, multiple objects can be detected or identified (by scanning the angular range) if the relative velocity and/or relative position between the multiple objects exceeds a certain threshold. These objects may be (visually) separated and identified as separate objects. The field of view of these sensors may go up to 170 degrees.
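  • A minimal sketch of this separation criterion (illustrative only, with invented values): assuming the positions of two repeated reflections are already expressed in a common coordinate frame, the two reflections belong to different objects if their mutual distance changes noticeably from scan to scan.

    import math

    # Illustration only: two reflections are separate objects if their mutual distance changes over scans.
    def are_separate_objects(track_a, track_b, motion_threshold_m=0.5):
        """track_a, track_b: lists of (x, y) positions from subsequent radar scans."""
        gaps = [math.dist(a, b) for a, b in zip(track_a, track_b)]
        return max(gaps) - min(gaps) > motion_threshold_m  # relative motion indicates different objects

    # A cyclist passing a stationary reflector: the mutual distance shrinks scan by scan.
    cyclist = [(10.0, 2.0), (9.0, 2.0), (8.0, 2.0)]
    stationary = [(6.0, 2.5), (6.0, 2.5), (6.0, 2.5)]
    print(are_separate_objects(cyclist, stationary))  # True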
  • As a result, the at least one radar sensor 120 can provide sufficient information to detect the object(s) in the coverage area 50 and to determine the position of the object(s) 10. The determined position may then be used to provide a further visual feedback to the driver. Optionally, this representation may be combined with the picture captured with the camera 110. For example, the control unit 130 can overlay both pictures and highlight a depicted object 10. In addition, since the distance to the object 10 is known, any background object can be eliminated (e.g. simply by ignoring objects beyond a particular threshold).
  • Fig. 3 depicts a further embodiment of the apparatus installed on a vehicle. This embodiment comprises a first radar sensor 121, a second radar sensor 122 and a third radar sensor 123. The first radar sensor 121 is installed on the right-hand side of the towing vehicle 71. The second radar sensor 122 is installed on the left-hand side of the towing vehicle 71. The third radar sensor 123 is installed at a front side of the towing vehicle 71. In addition, the embodiment comprises a first camera 111 and a second camera 112. The first camera 111 is installed at the front right corner of the towing vehicle. The second camera 112 is attached at the front left corner of the towing vehicle 71.
  • With such an installed apparatus it is possible to cover not only the front side of the vehicle 70 but also both sides, the right-hand side as well as the left-hand side of the vehicle. For example, the first camera 111 and the first radar sensor 121 may cover the right-hand side of the vehicle 70 (as described in conjunction with Fig. 1). The second camera 112 and the second radar sensor 122 cover the left-hand side of the vehicle. This coverage is analogous to the embodiment described with Fig. 1. The first camera 111 has a first coverage area 31 (around the right corner) and the second camera 112 has a second coverage area 32 (around the left corner). In addition, the embodiment of Fig. 3 also covers the front side of the vehicle by the first camera 111 and/or the second camera 112 in combination with the third radar sensor 123. The third radar sensor 123 covers the range 53 in front of the vehicle 70, which overlaps with the radar coverage area on the right-hand side 51 and the radar coverage area on the left-hand side 52.
  • Hence, Fig. 3 shows an embodiment which provides maximum coverage extending over all relevant sides of the moving vehicle. The display unit 140 can thus depict three pictures (or visual representations) of the three covered sides of the vehicle 70, wherein each visual representation is obtained in the same way as described in conjunction with Figs. 1 and 2.
  • Optionally, further cameras may be installed at the same side of the vehicle and/or further radar sensors may be installed at the same side of the vehicle, in which case the resolution can be further increased. In addition, any further (hidden) object (e.g. behind the object shown in Fig. 1 or Fig. 2) can be detected if multiple radar sensors are present. For example, with further radar sensors installed at the same side of the vehicle, it is possible to make several distance measurements between the radar sensors and the object. As a result, the position of the object 10 can be determined with better accuracy.
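  • One possible way to exploit several distance measurements is a standard two-circle intersection (multilateration): with the mounting positions of two radar sensors known in the vehicle frame, the two measured ranges constrain the object to the intersection of two circles. The following sketch uses this textbook geometry for illustration; it is not taken from the embodiment, and all coordinates are assumed values.

```python
import math

def position_from_two_ranges(p1, d1, p2, d2):
    """Estimate an object position from two distance measurements d1, d2 taken by
    radar sensors mounted at known positions p1, p2 (standard two-circle
    intersection; one of the two geometrically possible points is returned)."""
    (x1, y1), (x2, y2) = p1, p2
    base = math.hypot(x2 - x1, y2 - y1)
    if base == 0 or base > d1 + d2 or base < abs(d1 - d2):
        return None  # the two range circles do not intersect consistently
    a = (d1 ** 2 - d2 ** 2 + base ** 2) / (2 * base)
    h = math.sqrt(max(d1 ** 2 - a ** 2, 0.0))
    mx = x1 + a * (x2 - x1) / base
    my = y1 + a * (y2 - y1) / base
    return mx + h * (y2 - y1) / base, my - h * (x2 - x1) / base

# Two sensors mounted 4 m apart, both measuring 5 m to the same object
print(position_from_two_ranges((0.0, 0.0), 5.0, (4.0, 0.0), 5.0))  # -> (2.0, -4.58...)
```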
  • In further embodiments, not only the towing vehicle 71 is used for attaching the at least one camera 110 and the at least one radar sensor 120. It is also possible that further, optional, cameras and/or radar sensors are attached to the trailer 172. However, installing the camera(s) and the radar sensor(s) merely on the towing vehicle 71 provides the advantage that the trailer can be changed freely while ensuring the correct operation of the turning-assistant system.
  • Fig. 4 shows an example for the visual feedback to the driver, for example shown on the display 140 of the system architecture as shown in Figs. 1 and 2. As it is shown in Fig. 4, the driver can see the object 10 (for example a bicycle rider) as a marked object beside the vehicle. The object marking can either be generated based on the captured image data of the at least one camera 110 or in combination with the at least one radar sensor 120 which also detects an object 10 travelling at the side of the vehicle 70. Thus, both sensor units 110, 120 may identify the same object 10, which is marked in the visual feedback to the driver. This marking may be interpreted by the driver as a confirmation that both systems 110, 120 have detected the same object at the same position. Hence, the driver obtains a high level of confidence in the situation depicted on the display 140.
  • Fig. 4 shows a particular raw camera view, i.e. a view as seen from the position of the camera 110. However, it may be of advantage to manipulate the camera view such that it represents a bird's eye view.
  • Such a converted picture is shown in Fig. 5, which again shows the exemplary bicycle rider as object 10, who uses a bike lane parallel to the traffic direction of the vehicle 70. This bird's eye view can be shown to the driver of the vehicle 70 in combination with the raw camera view, as an alternative to it, or as a selectable option, so that the driver can choose the way of viewing which s/he may prefer.
  • Fig. 5 further shows the area 50 covered by the radar sensor 120 as a dotted line, so that the driver can see whether the detected object is reliably detected by both sensors or whether the object is outside the coverage area 50 of the radar sensor 120.
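  • The point-of-view conversion from the raw camera view of Fig. 4 to the bird's eye view of Fig. 5 can be realized with an ordinary perspective (homography) warp, provided the fish-eye distortion has been corrected and four ground-plane points have been calibrated beforehand. The sketch below uses OpenCV for illustration; the calibration points are hypothetical and not part of the disclosure.

```python
import cv2
import numpy as np

def to_birds_eye(frame, src_pts, dst_pts, out_size=(400, 600)):
    """Warp a (distortion-corrected) camera frame into an approximate bird's eye
    view via a perspective transform derived from four calibrated ground-plane
    point correspondences."""
    matrix = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(frame, matrix, out_size)

# Hypothetical calibration: corners of a ground rectangle in the camera image (src)
# and where they should end up in the top-down view (dst)
src = [(100, 480), (540, 480), (400, 250), (240, 250)]
dst = [(50, 600), (350, 600), (350, 0), (50, 0)]
birds_eye = to_birds_eye(np.zeros((480, 640, 3), np.uint8), src, dst)
print(birds_eye.shape)  # (600, 400, 3)
```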
  • Fig. 6 depicts a method for detecting an object 10 in a surveillance area 30, 50 of a vehicle 70. The method comprises the steps of: capturing S110 image data of said surveillance area 30 by at least one camera 110; providing S120 information indicative of a distance to said object 10 by at least one radar sensor 120; receiving S130 said image data and said information by a control unit 130; and generating S140, by said control unit 130, a visual representation of said object 10 based on said received image data and/or said information indicative of said distance.
  • This method may also be a computer-implemented method. A person of skill in the art would readily recognize that steps of various above-described methods might be performed by programmed computers. Embodiments are also intended to cover program storage devices, e.g. digital data storage media, which are machine-readable or computer-readable and encode machine-executable or computer-executable programs of instructions, wherein the instructions perform some or all of the acts of the above-described methods when executed on a computer or processor.
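  • As an illustration of how such a programmed implementation of steps S110 to S140 might be organized, the following Python skeleton is given; the control unit class and the interface callables are stand-ins invented for this sketch and do not represent the actual implementation.

```python
class ControlUnit:
    """Minimal stand-in for control unit 130: stores the latest inputs and turns
    them into a simple 'visual representation' (here, a dictionary)."""
    def receive(self, image, distances):          # S130
        self.image, self.distances = image, distances
    def generate_view(self):                      # S140
        return {"image": self.image,
                "marked_objects": [d for d in self.distances if d < 20.0]}

def detection_cycle(capture, measure, control_unit, show):
    image = capture()                       # S110: capture image data of the surveillance area
    distances = measure()                   # S120: provide information indicative of object distances
    control_unit.receive(image, distances)  # S130: control unit receives image data and distance info
    show(control_unit.generate_view())      # S140: generate and display the visual representation

detection_cycle(lambda: "frame_0", lambda: [8.0, 42.0], ControlUnit(), print)
```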
  • Advantageous aspects of the various embodiments can be summarized as follows:
    • Embodiments of the present invention relate to a turning assist system that gives visual information of a relevant area and marks objects on a camera view, for example, displayed to the driver. The indication of objects in the camera view can, for example, be based on a processing algorithm for the camera image and the radar sensor signals. The system may further warn the driver and react if necessary. The warning may be visual, audible or haptic. If the driver does not react to the warning, an intervention can be initiated to avoid a predicted collision, which may include, for example, braking and/or steering of the vehicle.
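  • A possible escalation logic for the warning-then-intervention behaviour described above is sketched below; the time-to-collision thresholds and the callable interfaces are illustrative assumptions, not values disclosed by the embodiment.

```python
def assist_step(time_to_collision, driver_reacted, warn, intervene):
    """Warn the driver when a collision is predicted and initiate a braking
    and/or steering intervention only if the driver does not react in time."""
    if time_to_collision is None:                    # no collision predicted
        return "idle"
    if time_to_collision < 1.5 and not driver_reacted:
        intervene()                                  # e.g. request braking and/or steering
        return "intervention"
    if time_to_collision < 4.0:
        warn()                                       # visual, audible or haptic warning
        return "warning"
    return "monitoring"

print(assist_step(3.0, False, lambda: None, lambda: None))  # -> warning
print(assist_step(1.0, False, lambda: None, lambda: None))  # -> intervention
```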
  • The visual feedback (for example the displayed camera view) to the driver may allow the driver to have a better understanding of the situation and assist the driver in finding the correct action to avoid an accident. With a single camera view the driver is already able to observe the critical area around the vehicle, even without the object detection feature that the disclosed system optionally implements. When using the system only in a visual mode, the number of accidents caused by collisions with objects in blind spots around commercial vehicles can already be reduced significantly.
  • The disclosed system comprises, for example, at least one camera 110, at least one radar sensor 120 and a visual feedback 140 to the driver. Optionally, the commercial-vehicle-specific turning assistant allows the driver to visually observe critical areas and helps the driver to recognize relevant objects. This recognition can, for example, be supported by marking the objects on the camera view and/or by warning of the occurrence of such objects. Finally, the system may also initiate a braking and/or steering intervention if the driver does not act accordingly.
  • The robustness of the system is increased by the combination or fusion of two different sensor signals (the radar signal and the camera images), which cover the same surveillance area. This may also extend the operational range of the system, because it can be applied under various environmental circumstances, for example at night, during the day, or under severe weather conditions.
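  • A very simple form of such a fusion is to associate each camera detection with the radar detection that falls on roughly the same lateral position and to mark objects confirmed by both sensors with higher confidence. The sketch below illustrates this association step; the coordinate convention and the 1.5 m association gate are assumptions made for the example.

```python
def fuse(camera_objects, radar_objects, max_offset=1.5):
    """Associate camera detections with radar detections lying at roughly the
    same lateral position; an object confirmed by both sensors is marked with
    high confidence, otherwise it remains a camera-only candidate."""
    fused = []
    for cam in camera_objects:
        match = next((r for r in radar_objects
                      if abs(r["y"] - cam["y"]) <= max_offset), None)
        fused.append({"y": cam["y"],
                      "distance": match["d"] if match else None,
                      "confidence": "high" if match else "camera-only"})
    return fused

camera = [{"y": 2.1}, {"y": 6.0}]
radar = [{"y": 2.4, "d": 7.5}]
print(fuse(camera, radar))
```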
  • The redundancy provided by two different sensor units is a particular advantage of embodiments of the present invention. The second, parallel sensor can compensate for insufficiencies of the other sensor type. For example, the camera may not capture sufficient pictures if the weather conditions are bad or at nighttime. Similarly, radar sensors might be affected by strong rain or snowfall, or falling leaves may generate false detections of phantom objects. However, by combining a picture-taking sensor such as the camera 110 with a radar sensor 120, the resulting apparatus can operate reliably during the daytime and at night and under all weather conditions, such as, for example, clear weather, fog, rain or snow.
  • In particular, by providing the driver with clear visual feedback, for example a camera view, the driver gains confidence about the actual traffic situation even at the side opposite to the driver's side.
  • As shown in Figs. 1, 2 and 3, the camera as well as the radar sensor may be mounted on the towing vehicle (tractor), whereas the towed vehicle (for example a trailer) is not involved in the system installation. This provides the advantage that the trailer can be freely exchanged and used with any type of towing vehicle without compromising safety.
  • For example, the at least one camera 110 can be mounted high enough, for example around the top of the cabin, to provide a good field of view over the relevant area. The radar sensor(s) 120 may, for example, be installed between the front axle and the rear axle of the towing vehicle 71. The radar signal may be transmitted into the surveillance area 50 parallel to the ground, thereby avoiding any false detection originating from reflections off the ground. Any type of wave signal can be used for the radar sensor, for example electromagnetic waves such as RF waves, or alternatively ultrasonic waves.
  • Further advantageous realizations of the present invention relate to a turning assist apparatus comprising: at least one camera 110; and at least one radar sensor 120 (both installed on the relevant side of the vehicle, rider side - opposite to the driver side); and a display 140 showing the view of the camera to the driver.
  • In yet another realization, the turning-assistant apparatus is characterized in that the relevant objects 10 are indicated to the driver.
  • In yet another realization, the turning-assistant apparatus is characterized in that the camera view is processed by image processing algorithms to detect objects 10.
  • In yet another realization, the turning-assistant apparatus is characterized in that the objects 10 detected by the radar sensor 120 are indicated to the driver.
  • In yet another realization, the turning-assistant apparatus is characterized in that the sensor fusion is realized between the two sensor types (camera 110 & radar sensor 120) realizing the object detection.
  • In yet another realization, the turning-assistant apparatus is characterized in that the object indication to the driver is realized by marking the objects 10 on the camera view displayed to the driver.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can enable further warning methods (e.g. audible, haptic) to warn the driver of critical situations.
  • In yet another realization, the turning-assistant apparatus is characterized in that the camera is a fish-eye lens camera.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system is capable of converting the camera view into a bird's eye view using a point-of-view conversion method.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can predict the path of the vehicle.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can distinguish between stationary and moving objects and predict the path of the object.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can predict an intersection of the vehicle path and the object path (a sketch of such a prediction follows after this list).
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can brake and/or steer to avoid the collision.
  • In yet another realization, the turning-assistant apparatus is characterized in that the system can be optionally extended to other sides of the vehicle by adding further cameras and radar sensors.
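  • For the path prediction and path-intersection realizations listed above, a constant-velocity extrapolation is one conceivable approach: both the vehicle path and the object path are projected a few seconds ahead and checked for a close approach. The following sketch is such an illustrative approach, not the method prescribed by the disclosure; all parameter values are assumptions.

```python
def paths_intersect(vehicle_pos, vehicle_vel, object_pos, object_vel,
                    horizon=5.0, step=0.1, safety_radius=1.0):
    """Constant-velocity path prediction: extrapolate vehicle and object
    positions over a short horizon and report the first time their predicted
    paths come closer than a safety radius (None if they never do)."""
    t = 0.0
    while t <= horizon:
        vx = vehicle_pos[0] + vehicle_vel[0] * t
        vy = vehicle_pos[1] + vehicle_vel[1] * t
        ox = object_pos[0] + object_vel[0] * t
        oy = object_pos[1] + object_vel[1] * t
        if ((vx - ox) ** 2 + (vy - oy) ** 2) ** 0.5 <= safety_radius:
            return t  # predicted intersection of the vehicle path and the object path
        t += step
    return None

# Vehicle driving straight ahead, object crossing from the right
print(paths_intersect((0, 0), (0, 5), (3, 10), (-1.5, 0)))  # -> approx. 1.9 s
```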
  • The description and drawings merely illustrate the principles of the disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its scope.
  • Furthermore, while each embodiment may stand on its own as a separate example, it is to be noted that in other embodiments the defined features can be combined differently, i.e. a particular feature described in one embodiment may also be realized in other embodiments. Such combinations are covered by the disclosure herein unless it is stated that a specific combination is not intended.
  • List of reference signs
    10, 10a, 10b    object(s)
    30, 50          surveillance area
    70              vehicle
    110, 111, 112   cameras
    120, 121, 122   radar sensors
    130             control unit
    140             display unit
    d, d1, d2       distance(s) to the object(s)
    α, α1, α2       angle(s) to the object(s)

Claims (15)

  1. An apparatus for detecting an object (10) in a surveillance area (30, 50) of a vehicle (70), the apparatus comprising:
    at least one camera (110) for capturing image data of said surveillance area (30);
    at least one radar sensor (120) for providing information indicative of a distance to said object (10); and
    a control unit (130) configured to receive said image data and said information and to generate a visual representation of said object (10) based on said received image data and/or said information indicative of said distance (d).
  2. The apparatus according to claim 1, further comprising a display unit (140) configured to receive said visual representation of said object (10) from said control unit (130) and to show at least part of said surveillance area (30, 50) together with said object (10).
  3. The apparatus according to claim 1 or claim 2,
    wherein said at least one radar sensor (120) is configured to transmit a radio frequency (RF) signal and to receive a reflected RF signal from said object (10) and is further configured to determine an angle (α) or an angular range from where said RF signal was received,
    and wherein said control unit (130) is configured to determine a position of said object (10) based on said determined angle (α) or angular range and said information indicative of said distance (d).
  4. The apparatus according to claim 2 or claim 3, wherein
    said at least one radar sensor (120) is configured to repeatedly measure a distance between one or more candidate objects and said vehicle (70), and
    wherein said control unit (130) is further configured to receive said repeatedly measured distances and, based thereon, to detect a relative motion of said detected object (10) to said vehicle (70) to enable an elimination of background objects and/or a detection of one or more further objects of interest (10a, 10b).
  5. The apparatus according to one of the preceding claims,
    wherein said at least one camera (110) is configured to capture subsequent images, and
    said control unit (130) is further configured to process said subsequent images received from said at least one camera (110) using a processing algorithm to detect said object (10) in said surveillance area (30) to enable an elimination of background objects and/or a detection of one or more further objects of interest (10a, 10b).
  6. The apparatus according to one of claims 2 to 5, wherein said control unit (130) is configured to combine information received from said at least one radar sensor (120) and said at least one camera (110) and to highlight said object (10) or multiple objects (10a, 10b) in the shown part of the surveillance area (30, 50).
  7. The apparatus according to one of the preceding claims, wherein said control unit (130) is further configured to predict a collision of said vehicle (70) and said object (10) and to issue a collision warning, wherein said collision prediction is performed based on a path of said object (10) and/or on a path of said vehicle (70) and/or on a relative path between said vehicle (70) and said object (10).
  8. The apparatus according to claim 7, wherein said warning comprises an acoustic signal and/or a haptic signal and/or an optical signal.
  9. The apparatus according to claim 7 or claim 8, wherein said control unit (130) is further configured to issue a brake signal to enforce a braking and/or a steer signal to enforce a steering of said vehicle (70) in case said collision is unavoidable if no braking and/or steering is performed.
  10. The apparatus according to one of the preceding claims, wherein said at least one camera (110) comprises a first and a second camera (111, 112), and wherein said at least one radar sensor (120) comprises a first and a second radar sensor (121, 122), wherein said first camera (111) and said first radar sensor (121) are installed on one side of said vehicle (70), and said second camera (112) and said second radar sensor (122) are installed on another side of said vehicle (70) to detect objects on different sides of said vehicle (70).
  11. The apparatus according to one of the preceding claims, wherein said at least one camera (110) comprises a fish-eye lens.
  12. The apparatus according to one of the preceding claims, wherein said control unit (130) is further configured to transform said image data received from said at least one camera (110) into a bird's eye view.
  13. A vehicle with an apparatus according to one of the preceding claims.
  14. The vehicle according to claim 13, wherein said surveillance area (30, 50) includes an area of a side of said vehicle (70) that extends over an angular range of more than 90° measured about said side of said vehicle (70).
  15. Method for detecting an object (10) in a surveillance area (30, 50) of a vehicle (70), comprising:
    capturing (S110) image data of said surveillance area (30) by at least one camera (110);
    providing (S120) information indicative of a distance to said object (10) by at least one radar sensor (120);
    receiving (S130) said image data and said information by a control unit (130); and
    generating (S140), by said control unit (130), a visual representation of said object (10) based on said received image data and/or said information indicative of said distance.
EP15165858.0A 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle Pending EP3089136A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP15165858.0A EP3089136A1 (en) 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP15165858.0A EP3089136A1 (en) 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Publications (1)

Publication Number Publication Date
EP3089136A1 true EP3089136A1 (en) 2016-11-02

Family

ID=53016541

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15165858.0A Pending EP3089136A1 (en) 2015-04-30 2015-04-30 Apparatus and method for detecting an object in a surveillance area of a vehicle

Country Status (1)

Country Link
EP (1) EP3089136A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030025597A1 (en) * 2001-07-31 2003-02-06 Kenneth Schofield Automotive lane change aid
JP5070809B2 (en) * 2006-11-10 2012-11-14 アイシン精機株式会社 Driving support device, driving support method, and program
US20100253918A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Infotainment display on full-windshield head-up display
DE102009041556A1 (en) 2009-09-15 2010-06-17 Daimler Ag Vehicle i.e. passenger car, has dead angle assisting device comprising sensors at front side, where sensors are arranged along longitudinal side in distributed manner and determine distance of object by vehicle in assisting device
DE102010048144A1 (en) 2010-10-11 2011-07-28 Daimler AG, 70327 Vehicle i.e. lorry, has sensor arranged such that surrounding area of vehicle is detectable, where surrounding area extends in sections along long side, runs over entire length of long side and runs in sections in region before front side
US20130278769A1 (en) * 2012-03-23 2013-10-24 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US20130271606A1 (en) * 2012-04-13 2013-10-17 Paul Chiang Method of displaying an assistant screen for improving driving safety of a vehicle
DE102012010876A1 (en) 2012-06-01 2012-11-22 Daimler Ag Vehicle e.g. truck has image sensor which is arranged between front axle and rear axle in lateral region of vehicle main portion
WO2014037064A1 (en) 2012-09-05 2014-03-13 Wabco Gmbh & Method for operating a trailer vehicle or a truck having the system
WO2014041023A1 (en) 2012-09-13 2014-03-20 Volkswagen Ag Methods and devices for collision warning during lane changes

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111278703A (en) * 2017-10-26 2020-06-12 奔德士商用车***有限责任公司 System and method for determining when an object detected by a collision avoidance sensor on one component of an articulated vehicle comprises another component of the vehicle
JP2019101573A (en) * 2017-11-29 2019-06-24 トヨタ自動車株式会社 Object information acquisition device
FR3080590A1 (en) * 2018-04-25 2019-11-01 Psa Automobiles Sa METHOD AND SYSTEM FOR OBSTACLE DETECTION IN THE ENVIRONMENT OF A MOTOR VEHICLE
CN110456788A (en) * 2019-07-22 2019-11-15 中兴智能汽车有限公司 A kind of automatic Pilot bus controlling system and automatic Pilot car
CN113850999A (en) * 2021-09-03 2021-12-28 杭州海康威视数字技术股份有限公司 Parking space detection device and camera device with radar for monitoring parking space
CN113850999B (en) * 2021-09-03 2022-11-25 杭州海康威视数字技术股份有限公司 Parking space detection device and camera device with radar for monitoring parking space
CN117636671A (en) * 2024-01-24 2024-03-01 四川君迪能源科技有限公司 Cooperation scheduling method and system for intelligent vehicle meeting of rural roads
CN117636671B (en) * 2024-01-24 2024-04-30 四川君迪能源科技有限公司 Cooperation scheduling method and system for intelligent vehicle meeting of rural roads

Similar Documents

Publication Publication Date Title
JP4019736B2 (en) Obstacle detection device for vehicle
EP3089136A1 (en) Apparatus and method for detecting an object in a surveillance area of a vehicle
EP1892149B1 (en) Method for imaging the surrounding of a vehicle and system therefor
US8461976B2 (en) On-vehicle device and recognition support system
US9586525B2 (en) Camera-assisted blind spot detection
US20040051659A1 (en) Vehicular situational awareness system
KR20200102004A (en) Apparatus, system and method for preventing collision
US8106755B1 (en) Triple-function vehicle safety sensor system
RU2723193C1 (en) Sensor device and method of detecting object around vehicle trailer
US10846833B2 (en) System and method for visibility enhancement
EP3785996B1 (en) Integrated alarm system for vehicles
JP2006318093A (en) Vehicular moving object detection device
US11745654B2 (en) Method and apparatus for object alert for rear vehicle sensing
JP2020197506A (en) Object detector for vehicles
JP2007124097A (en) Apparatus for visually recognizing surrounding of vehicle
US20210237743A1 (en) Driver attentiveness detection method and device
US20230415734A1 (en) Vehicular driving assist system using radar sensors and cameras
JP3822417B2 (en) Vehicle periphery monitoring device
CN210617998U (en) Blind area detection equipment for freight transport and passenger transport vehicles
US11964691B2 (en) Vehicular control system with autonomous braking
US20220108117A1 (en) Vehicular lane marker determination system with lane marker estimation based in part on a lidar sensing system
JP2005352636A (en) On-vehicle alarm generation device
JP4900377B2 (en) Image processing device
KR102662224B1 (en) RaDAR apparatus, recognizing target Method of RaDAR apparatus, and system for controlling vehicle including it
US11794536B2 (en) Vehicle control system and vehicle control method for determining chance of collision

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17P Request for examination filed

Effective date: 20170502

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201216

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

APBK Appeal reference recorded

Free format text: ORIGINAL CODE: EPIDOSNREFNE

APBN Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2E

APBR Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3E

APAF Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNE