JP2004182121A - Drive assist system - Google Patents

Drive assist system

Info

Publication number
JP2004182121A
JP2004182121A (application number JP2002352558A)
Authority
JP
Japan
Prior art keywords
vehicle
predicted trajectory
image
steering angle
road surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2002352558A
Other languages
Japanese (ja)
Inventor
Takeshi Okada
毅 岡田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP2002352558A priority Critical patent/JP2004182121A/en
Publication of JP2004182121A publication Critical patent/JP2004182121A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body

Landscapes

  • Closed-Circuit Television Systems (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a drive assist system capable of assisting the driving operation even when the driver checks the moving direction of the vehicle directly with his or her eyes.

SOLUTION: The drive assist system has a steering angle sensor for detecting the steering angle set by the steering gear of the vehicle, a calculation unit for calculating a predicted trajectory of the moving direction of the vehicle corresponding to the steering angle detected by the steering angle sensor, and projection units 25L and 25R installed on the vehicle for projecting an image of the predicted trajectory onto the road surface. The predicted trajectory A of the moving direction of the vehicle is thereby projected onto the road surface, so that the driver is assisted by the predicted trajectory A when checking the road visually.

COPYRIGHT: (C)2004,JPO&NCIPI

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a driving assistance device that assists the driving of an automobile or the like, and more particularly to a driving assistance device that can provide assistance even when the driver parks while visually checking the surroundings of the vehicle.
[0002]
[Prior art]
As conventional driving support devices, those described, for example, in Patent Document 1 and Patent Document 2 below are known. This conventional driving support device will be described with reference to FIGS. 4 to 6.
[0003]
FIG. 4 is a block diagram of a conventional driving assistance apparatus. It comprises a camera 1 mounted on the vehicle, a steering angle sensor 2 that detects the steering angle of the vehicle's steering wheel, image processing means 3 that takes in the image data captured by the camera 1 and the steering angle value detected by the steering angle sensor 2 and performs image processing, and display means 4 that displays the processing result of the image processing means 3 on a monitor screen.
[0004]
As shown in FIG. 5, the camera 1 is installed at the rear end of the vehicle 10 so as to view the area just behind the vehicle. The image processing means 3 installed in the vehicle 10 (not shown in FIG. 5) takes in the image of the area behind the vehicle captured by the camera 1, applies image correction so that the image is easy to see, and displays the corrected image on the monitor screen 4a of the display means 4 installed at the driver's seat, as shown in FIG. 6. At this time, the image processing means 3 also takes in the detection signal of the steering angle sensor 2, predicts the vehicle traveling direction corresponding to the steering wheel angle to obtain a predicted vehicle trajectory indicating that direction, and superimposes the predicted trajectory image 6 of the vehicle traveling direction on the camera image displayed on the monitor screen 4a.
[0005]
As described above, the conventional driving support device allows the driver to check behind the vehicle simply by looking at the monitor screen 4a, and can appropriately assist the driver in confirming the relative position of obstacles located in blind spots and in judging when to turn the steering wheel.
[0006]
[Patent Document 1]
JP-A-59-114139 (pages 1 to 3, FIG. 2)
[Patent Document 2]
Japanese Patent No. 2610146 (page 2, FIG. 1 and FIG. 3)
[0007]
[Problems to be solved by the invention]
However, in the conventional driving support device, the predicted trajectory image 6 of the vehicle traveling direction is displayed only on the monitor screen 4a. When the driver watches the side mirror 7 (see FIG. 6) to check behind the vehicle during a parking maneuver, or leans out of the window 8 of the vehicle 10 to check directly behind it, the driver's eyes leave the monitor screen 4a, so the predicted trajectory of the vehicle traveling direction that the image processing means 3 has gone to the trouble of computing cannot be used for the parking maneuver.
[0008]
Further, since the correspondence between the display on the monitor screen 4a and the actual scenery behind the vehicle is not obvious, it takes the driver time to grasp the surrounding situation, the driving operation during parking is not performed smoothly, and the parking operation may consequently take a long time to complete.
[0009]
The present invention has been made in view of the above problems of the related art, and its object is to provide a driving assistance device that assists driving by presenting the predicted trajectory of the vehicle traveling direction to the driver even when the driver operates the vehicle without looking at the monitor screen.
[0010]
[Means for Solving the Problems]
The driving assistance device according to the present invention comprises a steering angle sensor that detects the steering angle operated by the steering device of a vehicle, calculation means for obtaining a predicted trajectory of the vehicle traveling direction corresponding to the steering angle detected by the steering angle sensor, and projection means installed on the vehicle for projecting an image of the predicted trajectory onto the road surface.
[0011]
With this configuration, even when the driver checks the road surface through the side mirror or the rearview mirror or directly, the predicted trajectory of the traveling direction of the vehicle can be visually recognized, and the driving operation can be supported.
[0012]
The projection means of the driving support device of the present invention is characterized in that at least one image of a predicted trajectory of the vehicle when traveling straight ahead and a predicted trajectory of the vehicle at maximum steering angle is projected on the road surface.
[0013]
With this configuration, the driver can easily grasp the sense of the vehicle width and the movable range of the vehicle by comparing the predicted trajectories on the road surface, and can support smoother driving operation.
[0014]
The projection means of the driving support device of the present invention is provided on each of the left and right sides of the vehicle, and projects the left and right sides of the predicted trajectory in a shared manner.
[0015]
With this configuration, the area that each projection unit must illuminate on the road surface can be reduced, which simplifies the system configuration. Furthermore, since each unit only has to illuminate a local area, the illuminance of the predicted trajectory image on the road surface can be increased for the same power consumption.
[0016]
The driving support device of the present invention further includes imaging means mounted on the vehicle for capturing an image of the area near the vehicle in the vehicle traveling direction, and display means for superimposing, on a monitor screen installed near the driver's seat of the vehicle, a predicted trajectory image corresponding to the predicted trajectory on the image captured by the imaging means.
[0017]
With this configuration, even when, for example, high-brightness sunlight makes the image of the predicted trajectory projected on the road surface hard to see, the predicted trajectory can still be confirmed on the monitor screen.
[0018]
BEST MODE FOR CARRYING OUT THE INVENTION
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
[0019]
(First Embodiment)
FIG. 1 is a block diagram of the driving support device according to the first embodiment of the present invention. This driving support device comprises imaging means 21 such as a CCD camera or CMOS camera mounted on the vehicle (hereinafter referred to as the camera 21), a steering angle sensor 22 that detects the steering wheel angle of the steering device mounted on the vehicle, image processing means 23 that takes in the image data captured by the camera 21 and the steering angle value detected by the steering angle sensor 22 and processes them, and display means 24 that displays the processing result of the image processing means 23 on a monitor screen. It further comprises projection means 25 that projects the predicted trajectory image of the vehicle traveling direction, computed by the image processing means 23, onto the road surface in the vehicle traveling direction.
[0020]
FIG. 2 is an external view of a vehicle equipped with the driving support device according to the present embodiment. In this embodiment, the camera 21 mounted on the vehicle 20 is installed at the center of the rear end of the vehicle so as to view the area just behind the vehicle 20, and the steering angle sensor 22, the image processing means 23, and the display means 24 (not shown in FIG. 2) are installed at appropriate locations inside the vehicle 20.
[0021]
In the present embodiment, the projection means 25, each unit consisting of a small projector or the like, is made up of two units, a left projection unit 25L and a right projection unit 25R. The left projection unit 25L is installed at the left rear end of the vehicle 20, and the right projection unit 25R at the right rear end.
[0022]
Next, the operation of the driving support device according to the present embodiment will be described.
[0023]
When the vehicle 20 moves backward, the camera 21 sequentially captures images of the area behind the vehicle and outputs the captured image data to the image processing means 23. Using, for example, a mapping table, the image processing means 23 converts each captured image into an image in which the lens distortion of the camera 21 is corrected, or corrects the extreme perspective produced by a wide-angle lens to make the image easier to view, stretching or shrinking the image vertically and horizontally or applying an image conversion that virtually changes the viewpoint. When such corrections are unnecessary, or when it is desired to reduce the processing load, the image processing means 23 omits them.
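As an illustration of what such a mapping-table correction amounts to, here is a minimal sketch (not taken from the patent) that fills a corrected output image by looking up, for each output pixel, the source pixel stored in a precomputed table. The array sizes and the trivial mirror-image table are invented for the example; a real table would encode the camera's distortion calibration or the desired virtual viewpoint.

```python
import numpy as np

def apply_mapping_table(raw, map_x, map_y):
    """Produce a corrected image by looking up, for each output pixel,
    the source pixel given by the precomputed mapping table.
    raw: (H, W, 3) camera frame; map_x / map_y: (H', W') source coordinates."""
    # Clamp the table to valid source coordinates (nearest-neighbour sampling).
    xs = np.clip(np.rint(map_x).astype(int), 0, raw.shape[1] - 1)
    ys = np.clip(np.rint(map_y).astype(int), 0, raw.shape[0] - 1)
    return raw[ys, xs]

# Illustrative use: a table that simply mirrors the image left-to-right.
h, w = 480, 640
raw = np.zeros((h, w, 3), dtype=np.uint8)
map_y, map_x = np.mgrid[0:h, 0:w]
corrected = apply_mapping_table(raw, (w - 1) - map_x, map_y)
```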
[0024]
The steering angle sensor 22 sequentially detects the steering wheel angle as the driver operates the steering wheel. Specifically, the steering angle sensor 22 reads, with a magnetic sensor such as an MR sensor, the rotation of a gear attached to the steering shaft of the steering wheel; if the gear pitch is known in advance, the rotation angle can be estimated by counting the teeth that pass the sensor.
[0025]
By providing two such magnetic sensors, installing them offset from each other by half the gear pitch, and detecting the gear rotation with both, not only the amount of gear movement, i.e., the steering angle, but also the direction of rotation can be detected. The detected values of the steering angle sensor 22 are sequentially input to the image processing means 23.
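Two sensors offset by half a tooth pitch behave like the two channels of a quadrature encoder: the order in which the two-bit state changes reveals the direction of rotation, and counting the changes gives the angle. The following sketch assumes clean binary samples and an invented counts-per-degree calibration constant; the patent itself gives no such figures.

```python
# Transition table: (prev A, prev B, cur A, cur B) -> step (+1 / -1).
# With the two sensors half a gear pitch apart, their signals are about 90
# degrees out of phase, so the order of edges encodes the rotation direction.
_STEP = {
    (0, 0, 0, 1): +1, (0, 1, 1, 1): +1, (1, 1, 1, 0): +1, (1, 0, 0, 0): +1,
    (0, 0, 1, 0): -1, (1, 0, 1, 1): -1, (1, 1, 0, 1): -1, (0, 1, 0, 0): -1,
}

def decode_steering_angle(samples, counts_per_degree=4.0):
    """samples: iterable of (sensor_a, sensor_b) booleans read over time.
    Returns the accumulated steering angle in degrees (sign = direction).
    counts_per_degree is an assumed calibration constant."""
    count = 0
    prev = None
    for a, b in samples:
        if prev is not None:
            count += _STEP.get((prev[0], prev[1], int(a), int(b)), 0)
        prev = (int(a), int(b))
    return count / counts_per_degree

# Example: four edges in the forward order correspond to one gear tooth.
angle = decode_steering_angle([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)])
```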
[0026]
Based on the steering angle value obtained from the steering angle sensor 22, the image processing means 23 sequentially computes the predicted trajectory of the traveling direction when the vehicle 20 backs up: for example, the paths that the left and right rear wheels, or the two rear corners of the chassis, would trace on the road surface if the vehicle reversed 1 m to 2 m at the current steering angle. Since the predicted trajectory on the road surface obtained by this calculation (hereinafter referred to as predicted trajectory A for convenience of explanation) is expressed relative to the own vehicle 20, it is uniquely determined.
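One plausible way to compute such a trajectory is a kinematic bicycle model: the steering angle and wheelbase define a turning centre, and the two rear corners sweep arcs about that centre as the vehicle reverses. The wheelbase, rear track width, reversing distance, and sampling step in the sketch below are assumed values, not figures from the patent.

```python
import math

def predicted_trajectory(steer_deg, wheelbase=2.7, rear_track=1.5,
                         distance=2.0, step=0.1):
    """Predicted trajectory A: points (x, y) in metres, in vehicle coordinates
    (origin at the rear-axle centre, +x forward, +y to the left), traced by the
    left and right rear corners while reversing `distance` metres at a fixed
    steering angle. All dimensions are illustrative assumptions."""
    half = rear_track / 2.0
    left, right = [], []
    steer = math.radians(steer_deg)
    if abs(steer) < 1e-6:                     # straight reverse
        s = 0.0
        while s <= distance:
            left.append((-s, +half))
            right.append((-s, -half))
            s += step
        return left, right
    radius = wheelbase / math.tan(steer)      # turn centre sits at (0, radius)
    s = 0.0
    while s <= distance:
        phi = -s / radius                     # reversing sweeps the arc backwards
        for offset, pts in ((+half, left), (-half, right)):
            r = radius - offset               # radius of this rear corner
            pts.append((r * math.sin(phi), radius - r * math.cos(phi)))
        s += step
    return left, right

left_arc, right_arc = predicted_trajectory(steer_deg=15.0)
```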
[0027]
Next, the image processing means 23 computes, for the entire predicted trajectory A, at which positions it would appear on the monitor screen of the display means 24 if it were superimposed on the captured image of the camera 21, or on that image after correction and image conversion. The predicted trajectory on the monitor screen obtained by this calculation is referred to as predicted trajectory B.
[0028]
Whatever correction or image conversion is applied to the image captured by the camera 21, the predicted trajectory A on the road surface, referenced to the own vehicle 20, is uniquely determined, and therefore the predicted trajectory B is also uniquely determined for any image conversion. The image processing means 23 outputs to the display means 24 the image of the predicted trajectory B on the monitor screen obtained by this calculation, together with the captured image of the camera 21 or the captured image after correction and image conversion.
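Because the predicted trajectory A lies on the (assumed flat) road plane, one way to obtain the predicted trajectory B is to push each road point through the camera's pose and a pinhole projection; any later image conversion would then apply the same mapping table used for the camera image. The camera height, downward tilt, and intrinsic parameters below are invented placeholders, and the rear-facing pinhole model is only an assumption about how such a mapping could be implemented.

```python
import math

def road_to_pixel(points, cam_height=1.0, tilt_deg=30.0,
                  fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Map points (x, y) on the road plane (vehicle frame: +x forward, +y left,
    origin under the camera) to monitor pixels (u, v) for a rear-facing pinhole
    camera mounted cam_height metres up and pitched down by tilt_deg.
    All pose and intrinsic values are placeholder assumptions."""
    t = math.radians(tilt_deg)
    pixels = []
    for x, y in points:
        # Camera frame: x_c = image right, y_c = image down, z_c = optical axis.
        x_c = y
        y_c = x * math.sin(t) + cam_height * math.cos(t)
        z_c = -x * math.cos(t) + cam_height * math.sin(t)
        if z_c <= 0.05:            # point is behind the camera or too close
            continue
        pixels.append((fx * x_c / z_c + cx, fy * y_c / z_c + cy))
    return pixels

# Trajectory B: the monitor-screen version of a few trajectory-A points.
trajectory_b = road_to_pixel([(-0.5, 0.75), (-1.0, 0.75), (-2.0, 0.75)])
```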
[0029]
Further, after computing the predicted trajectory A on the road surface, the image processing means 23 converts the image of the predicted trajectory A, for example using a mapping table, on the basis of the projection angles of the left projection unit 25L and the right projection unit 25R with respect to the road surface and of the distortion of their projection lenses, and thereby obtains the following predicted trajectory C.
[0030]
The predicted trajectory C is the predicted trajectory image that the image processing means 23 supplies to the left projection unit 25L and the right projection unit 25R. It is generated so that, when the image of the predicted trajectory C is projected from the left and right projection units onto the road surface, the trajectory image drawn on the road surface coincides with the predicted trajectory A.
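If the road is treated as a plane, the pre-distortion that turns trajectory A into the projector input C can be modelled as a planar homography: a 3x3 matrix fitted from a few calibration correspondences between road-plane points and projector pixels, then applied to every point of trajectory A. The correspondences below are invented calibration numbers, and the direct linear transform stands in for whatever mapping table the patent's image processing means would actually use.

```python
import numpy as np

def fit_homography(road_pts, proj_pts):
    """Solve H (3x3) so that proj ~ H @ [road_x, road_y, 1], via the DLT method."""
    rows = []
    for (x, y), (u, v) in zip(road_pts, proj_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def to_projector_pixels(trajectory_a, H):
    """Predicted trajectory C: trajectory-A road points mapped into the
    projector's input image so the drawn line lands on trajectory A."""
    pts = np.c_[trajectory_a, np.ones(len(trajectory_a))] @ H.T
    return pts[:, :2] / pts[:, 2:3]

# Invented calibration: four road points (metres) seen at four projector pixels.
road = [(-0.5, 0.2), (-0.5, 1.2), (-2.5, 0.2), (-2.5, 1.2)]
proj = [(700.0, 100.0), (100.0, 120.0), (650.0, 500.0), (150.0, 520.0)]
H = fit_homography(road, proj)
trajectory_c = to_projector_pixels([(-1.0, 0.7), (-2.0, 0.7)], H)
```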
[0031]
When the image of the predicted trajectory C is supplied from the image processing means 23 to the left and right projection units 25L and 25R and projected onto the road, the image of the predicted trajectory A appears within their respective projection ranges 26L and 26R, as shown in FIG. 2. The line image of the predicted trajectory A on the road surface is drawn, for example, with high-brightness red light, and is therefore easy to see. The line segments a1 and a2 added to the predicted trajectory A in FIG. 2 are guides indicating straight-line distances of 1 m and 2 m from the rear end of the vehicle, respectively.
[0032]
When the driver parks while watching the monitor screen of the display means 24, he or she does so while comparing the image of the predicted trajectory B on the monitor screen with the image captured by the camera 21. When the driver instead looks directly in the traveling direction to confirm safety around the vehicle, or checks it through a side mirror, the driver's eyes leave the monitor screen. Even in this case, the driving support device according to the present embodiment can assist the parking maneuver, because the image of the predicted trajectory A is projected on the road surface where the driver is looking.
[0033]
In the present embodiment, the predicted trajectory A projected on the road surface is also captured by the camera 21, so the captured image contains the image of the predicted trajectory A. Naturally, the captured image of the predicted trajectory A displayed on the monitor screen coincides with the image of the predicted trajectory B. When the light falling on the road, for example sunlight, is very bright, the projected image of the predicted trajectory A on the road surface may no longer be visible to the eye; even in this case, driving assistance remains possible by checking the image of the predicted trajectory B on the monitor screen.
[0034]
In the above embodiment, the projection means 25 has been described as a small projector. However, it may also be realized by a configuration that draws the line image on the ground by scanning a coherent laser beam at high speed, which allows projection with higher-brightness light.
[0035]
In the present embodiment, two projection units 25 are provided, one attached to each side of the vehicle, but a single projection unit 25 that projects the entire predicted trajectory onto the road surface may be used instead. However, with the arrangement of this embodiment, in which the left projection unit 25L projects the predicted trajectory A on the left side of the vehicle and the right projection unit 25R projects the predicted trajectory A on the right side, the area each unit must illuminate is smaller, so the projection illuminance can be raised for the same power consumption and a sharper image of the predicted trajectory A can be drawn on the road surface.
[0036]
(Second Embodiment)
FIG. 3 is a diagram illustrating an example of a projection image by the driving assistance device according to the second embodiment of the present invention. The hardware configuration of the driving support device according to the present embodiment is the same as that of the first embodiment.
[0037]
In the first embodiment described above, only the dynamic predicted trajectory A linked to the steering wheel angle is projected onto the road surface from the projection units 25L and 25R. In the present embodiment, in addition to the predicted trajectory A, the predicted trajectory D for reversing straight back at zero steering angle and the predicted trajectory E for reversing at the maximum steering angle, with the steering wheel turned fully in the same direction as it is currently turned, are also projected onto the road surface. Predicted trajectories corresponding to D and E are likewise computed and drawn on the monitor screen.
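A self-contained sketch of the second embodiment's overlay logic: trajectory D is simply the zero-angle case, and trajectory E reuses the current steering direction at an assumed full-lock angle. The wheelbase, maximum steering angle, and the minimal centre-line arc generator are all illustrative assumptions, not values from the patent.

```python
import math

def _arc(steer_deg, wheelbase=2.7, distance=2.0, n=20):
    """Centre-line points (x, y) for reversing `distance` metres at a fixed
    steering angle, from a minimal bicycle model (wheelbase is an assumed value)."""
    steer = math.radians(steer_deg)
    if abs(steer) < 1e-6:
        return [(-distance * i / n, 0.0) for i in range(n + 1)]
    radius = wheelbase / math.tan(steer)
    return [(radius * math.sin(-distance * i / (n * radius)),
             radius * (1 - math.cos(-distance * i / (n * radius))))
            for i in range(n + 1)]

def trajectories_for_projection(current_steer_deg, max_steer_deg=35.0):
    """The three overlays of the second embodiment: A at the current steering
    angle, D for reversing straight back, and E at an assumed full-lock angle
    turned the same way as the current steering."""
    direction = math.copysign(1.0, current_steer_deg) if current_steer_deg else 1.0
    return {
        "A": _arc(current_steer_deg),
        "D": _arc(0.0),
        "E": _arc(direction * max_steer_deg),
    }

overlays = trajectories_for_projection(current_steer_deg=12.0)
```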
[0038]
In this way, by contrasting the predicted trajectories D and E with the predicted trajectory A linked to the current steering angle, the present embodiment allows the driver to easily grasp the current attitude of the vehicle 20 and the range over which the vehicle 20 can move, even when looking directly at the road surface.
[0039]
[Effect of the Invention]
According to the present invention, the predicted trajectory of the vehicle traveling direction is projected onto the road surface, so a driving support device can be provided that assists driving even when the driver looks away from the monitor screen and checks the road surface directly.
[Brief description of the drawings]
FIG. 1 is a block diagram of the driving support device according to the first embodiment of the present invention.
FIG. 2 is a view showing the appearance of a vehicle equipped with the driving support device according to the first embodiment of the present invention and an example of the projected image.
FIG. 3 is a view showing the appearance of a vehicle equipped with the driving support device according to the second embodiment of the present invention and an example of the projected image.
FIG. 4 is a block diagram of a conventional driving support device.
FIG. 5 is an external view of a vehicle equipped with the conventional driving support device.
FIG. 6 is a view showing an example of the monitor display of a predicted trajectory calculated by the conventional driving support device.
[Description of Reference Numerals]
21 Camera
22 Steering angle sensor
23 Image processing means
24 Display means
25 Projection means
25L Left projection unit
25R Right projection unit
26L, 26R Projection ranges
A Predicted trajectory corresponding to the steering angle, projected on the road surface
D Predicted trajectory for reversing straight back, projected on the road surface
E Predicted trajectory at the maximum steering angle, projected on the road surface

Claims (4)

1. A driving support device comprising: a steering angle sensor that detects a steering angle operated by a steering device of a vehicle; calculation means for obtaining a predicted trajectory of the vehicle traveling direction corresponding to the steering angle detected by the steering angle sensor; and projection means installed on the vehicle for projecting an image of the predicted trajectory onto a road surface.

2. The driving support device according to claim 1, wherein the projection means projects onto the road surface an image of at least one of a predicted trajectory of the vehicle when traveling straight and a predicted trajectory of the vehicle at its maximum steering angle.

3. The driving support device according to claim 1 or 2, wherein the projection means is provided on each of the left and right sides of the vehicle, and the left side and the right side of the predicted trajectory are projected by the respective units.

4. The driving support device according to any one of claims 1 to 3, further comprising: imaging means mounted on the vehicle for capturing an image of the area near the vehicle in the vehicle traveling direction; and display means for superimposing, on a monitor screen installed near the driver's seat of the vehicle, a predicted trajectory image corresponding to the predicted trajectory on the image captured by the imaging means.
JP2002352558A 2002-12-04 2002-12-04 Drive assist system Pending JP2004182121A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002352558A JP2004182121A (en) 2002-12-04 2002-12-04 Drive assist system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002352558A JP2004182121A (en) 2002-12-04 2002-12-04 Drive assist system

Publications (1)

Publication Number Publication Date
JP2004182121A true JP2004182121A (en) 2004-07-02

Family

ID=32754146

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002352558A Pending JP2004182121A (en) 2002-12-04 2002-12-04 Drive assist system

Country Status (1)

Country Link
JP (1) JP2004182121A (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006221536A (en) * 2005-02-14 2006-08-24 Denso Corp Display device for vehicle
JP2008143505A (en) * 2006-11-16 2008-06-26 Denso Corp Headlight control device
JP4720764B2 (en) * 2006-11-16 2011-07-13 株式会社デンソー Headlight control device
JP2008213646A (en) * 2007-03-02 2008-09-18 Aisin Aw Co Ltd Parking assistant method and parking assistant device
JP2009154775A (en) * 2007-12-27 2009-07-16 Toyota Central R&D Labs Inc Attention awakening device
JP2009166663A (en) * 2008-01-16 2009-07-30 Honda Motor Co Ltd External circumstance display device for vehicle
JP2009202866A (en) * 2008-01-31 2009-09-10 Yaskawa Electric Corp Moving body
JP2014531358A (en) * 2011-09-08 2014-11-27 コンティネンタル・テーベス・アクチエンゲゼルシヤフト・ウント・コンパニー・オッフェネ・ハンデルスゲゼルシヤフト Method and apparatus for in-vehicle assistant system for implementing autonomous or partially autonomous driving maneuvers
WO2014122750A1 (en) * 2013-02-07 2014-08-14 富士機械製造株式会社 Mobile object
JPWO2014122750A1 (en) * 2013-02-07 2017-01-26 富士機械製造株式会社 Moving body
CN112026641A (en) * 2015-04-10 2020-12-04 麦克赛尔株式会社 Vehicle with a steering wheel
US9845046B1 (en) 2016-06-15 2017-12-19 Denso International America, Inc. Laser projected lines to indicate while driving in semi-truck blind spots
US9845043B1 (en) 2016-06-15 2017-12-19 Denso International America, Inc. Projected laser lines/graphics to visually indicate truck turning path
US9878657B2 (en) 2016-06-15 2018-01-30 Denso International America, Inc. Projected laser lines/graphics onto the road for indicating truck platooning/warning to other drivers of presence of truck platoon
CN107640147A (en) * 2016-07-18 2018-01-30 奥迪股份公司 parking assistance method and system
JP2018014616A (en) * 2016-07-21 2018-01-25 株式会社Jvcケンウッド Display controller, control method, program and display control system
WO2018016119A1 (en) * 2016-07-21 2018-01-25 株式会社Jvcケンウッド Display control device, method, program and system
FR3105358A1 (en) * 2019-12-19 2021-06-25 Valeo Vision MOTOR VEHICLE SIGNALING DEVICE

Similar Documents

Publication Title
CN107444263B (en) Display device for vehicle
JP5500369B2 (en) Vehicle peripheral image generation device
JP4530060B2 (en) Parking support apparatus and method
US7379089B2 (en) Apparatus and method for monitoring the immediate surroundings of a vehicle
JP2004182121A (en) Drive assist system
JP2008296697A (en) Parking support device
JP2009083764A (en) Driving assisting device, driving assisting method, and computer program
JP2012116282A (en) Parking position adjustment device
JP5251804B2 (en) Driving assistance device
JP2006131213A (en) Rear view recognition device for motorcycle
JP2016168877A (en) Visual recognition device for vehicle
JP2000127851A (en) Driving supporting device for vehicle
JP2004194071A (en) Drive support image generator
JP2010006129A (en) Vehicle rear information display and vehicle rear information display method
JP2019110448A (en) Display control device and display system
JP2003205783A (en) Display for periphery of vehicle
JP2008114691A (en) Vehicular periphery monitoring device, and vehicular periphery monitoring image display method
JP3997945B2 (en) Peripheral image display device
US8384779B2 (en) Display device for vehicle
JP3674473B2 (en) Vehicle rear view support device
CN110476420B (en) Image display device for vehicle and image processing method
JP2004051063A (en) Device for visual recognition around vehicle
JP2004082918A (en) Vehicular noctovision and rearview mirror
JP2008146404A (en) In-vehicle image recognition device
JP2015227115A (en) On-vehicle camera control device