JPS63223810A - Method for deciding traveling route of visual unmanned vehicle - Google Patents

Method for deciding traveling route of visual unmanned vehicle

Info

Publication number
JPS63223810A
Authority
JP
Japan
Prior art keywords
line
traveling
unmanned vehicle
image
red
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP62057505A
Other languages
Japanese (ja)
Inventor
Junichi Hida
淳一 飛田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Industries Corp
Original Assignee
Toyoda Automatic Loom Works Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyoda Automatic Loom Works Ltd filed Critical Toyoda Automatic Loom Works Ltd
Priority to JP62057505A priority Critical patent/JPS63223810A/en
Publication of JPS63223810A publication Critical patent/JPS63223810A/en
Pending legal-status Critical Current


Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

PURPOSE: To run an unmanned vehicle along a traveling line of a designated color by discriminating the different colors of the traveling lines. CONSTITUTION: A white traveling line WL, a red traveling line RL and a blue traveling line BL, each of a fixed width and together indicating the travel routes 3 of unmanned vehicles 1 and 2, are drawn on a road surface 4 around a branch point C. The line WL serves as a traveling line common to both vehicles 1 and 2, while the line RL serves vehicle 1 only and the line BL serves vehicle 2 only. Each vehicle then follows its prescribed traveling line using the red, blue and green signals received from the image sensors of a color image pickup device. In this way the travel routes of unmanned vehicles can be set freely, and because color images are processed, the method is more robust than black-and-white processing against variations in light intensity, scattered light, backlight and the like, enabling accurate driving control of the unmanned vehicles.

Description

[Detailed Description of the Invention]

Purpose of the Invention

(Field of Industrial Application) This invention relates to a travel-route determination method for an image-based unmanned vehicle, and more particularly to a method in which a traveling line captured by an imaging device is image-processed to determine whether that traveling line is the one intended for the vehicle in question.

(Prior Art) In recent years, various image-based unmanned vehicles have been proposed in which a single white traveling line on the road surface is captured by an imaging device, the image of the traveling line is processed to compute the relative position of the vehicle with respect to the line, and the steering mechanism is controlled on the basis of that computation so that the vehicle follows the traveling line. The present applicant has also filed numerous applications relating to such image-based unmanned vehicles.

For example, in Japanese Patent Applications No. 60-281398 and No. 60-281399 filed earlier by the present applicant, a plurality of predetermined points are extracted from the traveling line imaged by the imaging device, a function approximating the traveling line is derived from those points, and the steering mechanism is controlled on the basis of the derived function.

(Problems to Be Solved by the Invention) With this route-determination method, however, if the traveling line branches along the way, the predetermined points on the line cannot be identified, so no approximating function can be derived; the travel route therefore had to consist of a single traveling line. Consequently, it was impossible to branch the traveling line in many directions and send an image-based unmanned vehicle to various destinations.

Moreover, this type of image-based unmanned vehicle performs its image processing by binarizing the video signal of the imaged white traveling line. Because the imaging device is prone to dark current, shading and similar disturbances in the video signal, caused by the intensity of the received light, unwanted scattered light and backlight, it was difficult to obtain an accurate binarized signal from the video signal.

The object of this invention is to solve the above problems and to provide a travel-route determination method for an image-based unmanned vehicle that allows the travel route to be set freely so that the vehicle can be sent to various destinations, and that performs accurate image processing robust against the intensity of the received light, unwanted scattered light, backlight and the like.

Structure of the Invention

(Means for Solving the Problems) To achieve the above object, the gist of this invention is a travel-route determination method for an image-based unmanned vehicle in which traveling lines of different colors are laid out to form a plurality of travel routes, the traveling lines are captured by an imaging device mounted on the unmanned vehicle, the color of the imaged traveling line is discriminated, and the vehicle is made to travel along the traveling line of the color designated for it.

(Operation) Since the vehicle travels while determining whether the color of the traveling line imaged by the imaging device marks the line intended for it, the unmanned vehicle follows the traveling line of its designated color even where traveling lines branch, merge, or run in parallel.

(Embodiment) An embodiment of this invention will now be described with reference to the drawings.

In Fig. 1, a white traveling line (hereinafter simply the white line) WL, a red traveling line (the red line) RL and a blue traveling line (the blue line) BL, each of a fixed width and indicating the travel routes 3 of the unmanned vehicles 1 and 2, are drawn on the road surface 4, diverging at a branch point C.

The white line WL is drawn on the road surface 4 as the traveling line for the travel route 3 common to both unmanned vehicles (hereinafter the first and second unmanned vehicles) 1 and 2, the red line RL as the traveling line for the travel route 3 of the first unmanned vehicle 1 only, and the blue line BL as the traveling line for the travel route 3 of the second unmanned vehicle 2 only.

As shown in Figs. 2 and 3, the first unmanned vehicle 1 has a support frame 5 erected at the center of its upper front, and at the top center of the frame 5 a color image pickup device 6, serving as the imaging device, is set so as to image an area 4a on the road surface 4 ahead of the vehicle 1. In this embodiment the color image pickup device 6 comprises, as shown in Fig. 4, lenses 7-10, half mirrors 11 and 12, reflecting mirrors 13 and 14, green, red and blue filters 15-17, and green, red and blue image sensors 18-20. The green image sensor 18 receives the light entering through the lens 7 after it has been separated by the green filter 15, and outputs the green component as a green video signal (hereinafter the G signal). The red image sensor 19 receives the light separated by the red filter 16 and outputs the red component as a red video signal (the R signal). The blue image sensor 20 receives the light separated by the blue filter 17 and outputs the blue component as a blue video signal (the B signal).

Accordingly, for the white line WL, the red line RL and the blue line BL in the area 4a at the branch point C where the travel route 3 diverges as shown in Fig. 5, an image 21 containing only the white line WL is output as the G signal, an image 22 containing the white line WL and the blue line BL as the B signal, and an image 23 containing the white line WL and the red line RL as the R signal, as shown in Figs. 6(a) to (c).
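The channel separation just described can be modeled very simply: the white line reflects red, green and blue light and therefore appears in every channel, while the red and blue lines appear only in their own channels. The following sketch is illustrative only — the function and data-structure names are assumptions, not taken from the patent:

```python
# Idealized model of which painted lines appear in each color channel.
# A white line reflects red, green and blue; a red line only red; a
# blue line only blue.  Each of the sensors 18-20 therefore sees a
# different subset of the painted lines.

LINE_REFLECTANCE = {
    "white": {"R", "G", "B"},  # white line WL reflects all components
    "red":   {"R"},            # red line RL reflects only red
    "blue":  {"B"},            # blue line BL reflects only blue
}

def visible_lines(channel, lines=("white", "red", "blue")):
    """Return the painted lines that show up in the given channel image."""
    return [c for c in lines if channel in LINE_REFLECTANCE[c]]

print(visible_lines("G"))  # ['white']
print(visible_lines("B"))  # ['white', 'blue']
print(visible_lines("R"))  # ['white', 'red']
```

This reproduces the assignment of Figs. 6(a)-(c): the G image shows only WL, the B image shows WL and BL, and the R image shows WL and RL.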

Next, the electrical configuration of the travel-route determination device mounted on the first unmanned vehicle 1 configured as above will be described with reference to Fig. 7.

A microcomputer 30 comprises a central processing unit (hereinafter the CPU) 31, a program memory 32 consisting of a read-only memory storing a control program, and a working memory 33 consisting of a random-access memory in which the computation results of the CPU 31, image data and the like are temporarily stored. Following the control program stored in the program memory 32, the CPU 31 identifies the travel route 3 and executes the various computations required for steering control.

Each time the first unmanned vehicle 1 travels a fixed distance, the CPU 31 scan-controls the image sensors 18-20 of the color image pickup device 6 via an input/output interface 34, an A/D converter 35 and a color decoder 36.

The color decoder 36 receives the G, R and B signals from the image sensors 18-20 of the color image pickup device 6 and passes only the R signal to the A/D converter 35 in the next stage.

The A/D converter 35 turns the R signal into pixel data and stores it in the working memory 33 via a bus controller 37. When converting the R signal from the red image sensor 19 from an analog value to a digital value, the A/D converter 35 judges whether the R signal is at or above a predetermined setpoint: if it is, the pixel is set to "1" as belonging to the white line WL or the red line RL; if it is below the setpoint, the pixel is set to "0" as belonging to the dark road surface 4. The successively input R signals are thus binarized and stored in the working memory 33 as pixel data. The image captured by the red image sensor 19 of the color image pickup device 6 is therefore held in the working memory 33 as a group of 256 x 256 pixel data. Consequently, the travel-route determination device of the first unmanned vehicle 1 can capture the images of the white line WL and the red line RL as pixel data but cannot capture pixel data for the image of the blue line BL. Each time the pixel-data group of a new image is input, the pixel-data group of the previous image is replaced by that of the newest image.
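The binarization step above can be sketched as follows, assuming a hypothetical 8-bit brightness range and illustrative class and function names; the patent specifies only that pixels at or above a setpoint become "1" (line) and pixels below it "0" (road surface), and that each new 256 x 256 frame replaces the previous one:

```python
# Binarize an R-channel frame: pixels at or above the setpoint become 1
# (white line WL or red line RL), pixels below it become 0 (dark road
# surface), mirroring the A/D converter 35 / working memory 33 behavior.

WIDTH = HEIGHT = 256  # the working memory holds a 256x256 pixel-data group

def binarize_frame(frame, setpoint):
    """frame: rows of brightness values (0-255); returns rows of 0/1."""
    return [[1 if v >= setpoint else 0 for v in row] for row in frame]

class WorkingMemory:
    """Keeps only the newest frame, like working memory 33."""
    def __init__(self):
        self.pixels = None

    def store(self, frame, setpoint):
        # the previous pixel-data group is discarded on every new frame
        self.pixels = binarize_frame(frame, setpoint)

mem = WorkingMemory()
mem.store([[10, 200, 220, 15]], setpoint=128)  # dark, line, line, dark
print(mem.pixels[0])  # [0, 1, 1, 0]
```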

The second unmanned vehicle 2 is configured in the same way as the first unmanned vehicle 1, differing only in that its travel-route determination device creates pixel data from the B signal of the blue image sensor 20; its details are therefore omitted. Accordingly, the travel-route determination device of the second unmanned vehicle 2 captures the images of the white line WL and the blue line BL as pixel data but cannot capture pixel data for the image of the red line RL.

The scan control of the image sensors 18-20 adopts a scheme that scans horizontally (in the X-axis direction) and advances from the top of the images 21-23 downward (in the Y-axis direction), but the invention is not limited to this and may be practiced with other scanning methods.

A binarization level controller 38 supplies the A/D converter 35, on the basis of a control signal from the CPU 31, with the setpoint data used to binarize the R signal. A drive controller 39 controls a travel motor (not shown) and a steering mechanism 40, likewise on the basis of control signals from the CPU 31, and the steering mechanism 40 adjusts the steering angle according to those signals. Thus, in this embodiment, while the first unmanned vehicle 1 is traveling on the white line WL or the red line RL, the color image pickup device 6 mounted on the vehicle images the white line WL or the red line RL through the red image sensor 19 and outputs an R signal. The travel-route determination device creates pixel data of the white line WL or the red line RL from this R signal, recognizes its own travel route 3 from that pixel data, and controls the steering mechanism 40 so as to follow the white line WL or the red line RL.
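The patent does not spell out how the steering mechanism 40 is driven from the pixel data; one common scheme consistent with the description is to measure the lateral offset of the binarized line from the image centerline and apply a proportional correction. The centroid method and the gain KP below are assumptions for illustration only:

```python
# Sketch of line-following from a binarized image: find the centroid of
# the '1' pixels in each row, average the lateral offset from the image
# centerline, and turn that into a steering command.

KP = 0.1  # proportional steering gain (illustrative, not from the patent)

def lateral_offset(binary_image, width=256):
    """Mean offset of line pixels from the image centerline, in pixels."""
    offsets = []
    for row in binary_image:
        cols = [x for x, v in enumerate(row) if v == 1]
        if cols:
            offsets.append(sum(cols) / len(cols) - (width - 1) / 2)
    return sum(offsets) / len(offsets) if offsets else 0.0

def steering_angle(binary_image):
    """Positive means steer right, negative steer left (a convention)."""
    return KP * lateral_offset(binary_image)

# A line centered in a 256-pixel-wide image needs no correction:
centred = [[1 if 126 <= x <= 129 else 0 for x in range(256)]] * 3
print(steering_angle(centred))  # 0.0
```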

The first unmanned vehicle 1 can therefore travel along the white line WL and the red line RL as its travel route 3.

When the first unmanned vehicle 1 passes the branch point C, the color image pickup device 6 images, through the red image sensor 19, the image 23 containing only the white line WL and the red line RL, as shown in Fig. 6(c), and outputs an R signal.

The first unmanned vehicle 1 can therefore move freely from the white line WL to the red line RL or from the red line RL to the white line WL, and will never mistakenly travel along the blue line BL.

Likewise, the second unmanned vehicle 2 can travel along the white line WL and the blue line BL as its travel route 3, can move freely from the white line WL to the blue line BL or from the blue line BL to the white line WL, and will never mistakenly travel along the red line RL.

Moreover, since the unmanned vehicles 1 and 2 can be run on the same travel route or each be given a freely chosen route of its own, designing travel routes for multiple unmanned vehicles becomes very simple. Furthermore, because this embodiment handles color images, it achieves image processing that is more accurate and more robust than black-and-white processing against the intensity of the received light, unwanted scattered light, backlight and the like.

This invention is not limited to the above embodiment; it may be practiced with a larger number of traveling-line colors, or with a color image pickup device other than the device 6 of the embodiment.

Effects of the Invention

As described in detail above, this invention allows the travel route of an image-based unmanned vehicle to be set freely so that the vehicle can be sent to various destinations, and achieves accurate image processing robust against the intensity of the received light, unwanted scattered light, backlight and the like, enabling accurate travel control of the unmanned vehicle.

[Brief Description of the Drawings]

Fig. 1 shows a travel route embodying this invention; Fig. 2 is a side view of the image-based unmanned vehicle; Fig. 3 is a plan view of the same; Fig. 4 shows the configuration of the color image pickup device; Fig. 5 is an explanatory view of a branch point on the travel route; Figs. 6(a), (b) and (c) illustrate the images captured by the respective image sensors of the color image pickup device; and Fig. 7 is an electrical block diagram showing the electrical configuration of the travel-route determination device.

In the figures, 1 and 2 are unmanned vehicles, 3 is a travel route, 6 is a color image pickup device, 15 is a green filter, 16 is a red filter, 17 is a blue filter, 18 is a green image sensor, 19 is a red image sensor, 20 is a blue image sensor, 31 is a central processing unit (CPU), 35 is an A/D converter, WL is a white line, RL is a red line, and BL is a blue line.

Patent applicant: Toyoda Automatic Loom Works, Ltd.

Claims (1)

[Claims] 1. A travel-route determination method for an image-based unmanned vehicle, wherein traveling lines of different colors are provided to form a plurality of travel routes, the traveling lines are imaged by an imaging device mounted on the unmanned vehicle, the color of the imaged traveling line is discriminated, and the unmanned vehicle is made to travel along the traveling line of the color designated for that vehicle.
JP62057505A 1987-03-11 1987-03-11 Method for deciding traveling route of visual unmanned vehicle Pending JPS63223810A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP62057505A JPS63223810A (en) 1987-03-11 1987-03-11 Method for deciding traveling route of visual unmanned vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP62057505A JPS63223810A (en) 1987-03-11 1987-03-11 Method for deciding traveling route of visual unmanned vehicle

Publications (1)

Publication Number Publication Date
JPS63223810A true JPS63223810A (en) 1988-09-19

Family

ID=13057587

Family Applications (1)

Application Number Title Priority Date Filing Date
JP62057505A Pending JPS63223810A (en) 1987-03-11 1987-03-11 Method for deciding traveling route of visual unmanned vehicle

Country Status (1)

Country Link
JP (1) JPS63223810A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU681973B2 (en) * 1994-08-04 1997-09-11 Canon Kabushiki Kaisha Liquid crystal device
JP2012234461A (en) * 2011-05-09 2012-11-29 Nippon Yusoki Co Ltd Unmanned carrier system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS505797A (en) * 1973-05-19 1975-01-21
JPS61179706A (en) * 1985-02-05 1986-08-12 Nippon Erasutoran Kk Pellet manufacturing device
JPS6336311A (en) * 1986-07-30 1988-02-17 Daifuku Co Ltd Guidance equipment for optical guidance type moving car



Similar Documents

Publication Publication Date Title
CN103080978B (en) Object recognition equipment
JP3909691B2 (en) In-vehicle image processing device
US6812463B2 (en) Vehicle vicinity-monitoring apparatus
CN111563405B (en) Image processing apparatus and image processing method
US20130307985A1 (en) Vehicle driving assist device
JP3926673B2 (en) Vehicle type identification device
JP2007124676A (en) On-vehicle image processor
JPS63223810A (en) Method for deciding traveling route of visual unmanned vehicle
JPH0520593A (en) Travelling lane recognizing device and precedence automobile recognizing device
JP2006024120A (en) Image processing system for vehicles and image processor
JPH08108796A (en) Peripheral visual confirming device for vehicle
JP2006078635A (en) Front road-display control unit and front road-display control program
JPH10327337A (en) Smear prevention device for image input device
JPH0593981U (en) Glare sensor
JP3855102B2 (en) Lane boundary detection device
JP2638249B2 (en) Vehicle distance detection device
JP2611325B2 (en) Road recognition device for vehicles
JPH01188911A (en) Detecting method for abnormal traveling route of image type unmanned vehicle
JPS6340913A (en) Running speed determining method for image type unmanned vehicle
JPH0535885B2 (en)
JPH01175610A (en) Processing of picture in picture type unmanned vehicle
JPS62140107A (en) Method and device for controlling steering of unmanned carrier and its image pickup device
JP2814658B2 (en) Vehicle distance detection device
JP3783295B2 (en) Method for detecting lane marking of vehicle
JPH1134740A (en) Monitor for vehicle