JPH04301512A - Environment recognizing apparatus for moving vehicle - Google Patents

Environment recognizing apparatus for moving vehicle

Info

Publication number
JPH04301512A
JPH04301512A (application JP3065855A)
Authority
JP
Japan
Prior art keywords
image
difference
differential
luminance
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP3065855A
Other languages
Japanese (ja)
Inventor
Hiroyuki Takahashi
弘行 高橋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazda Motor Corp
Original Assignee
Mazda Motor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazda Motor Corp filed Critical Mazda Motor Corp
Priority to JP3065855A priority Critical patent/JPH04301512A/en
Publication of JPH04301512A publication Critical patent/JPH04301512A/en
Withdrawn legal-status Critical Current

Landscapes

  • Measurement Of Optical Distance (AREA)
  • Image Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

PURPOSE: To perform matching at high speed by deleting meaningless image data in a preprocessing step before matching. CONSTITUTION: The environment in front of a vehicle is viewed stereoscopically by a right image input part 1 and a left image input part 2. The image photographed by the right image input part 1 is output to a luminance evaluating part 3 and to differential devices 5 and 6, while the image from the left image input part 2 is output to a luminance evaluating part 4 and to differential devices 5 and 7. The luminance evaluating parts extract, from the right and left images, the portions whose luminance is not smaller than a predetermined value and output them to the differential devices as feature quantities related to luminance. The differential device 5 obtains the difference between the right and left images and deletes the areas where no parallax is generated. The differential device 6 obtains the difference between the right image and the difference image, guided by the luminance feature data, and sends the result to an image storing part 8. The differential device 7 likewise feeds the difference between the left image and the difference image to the image storing part. The image storing part stores the images from the differential devices and outputs them sequentially to an outside world recognizing part 9. The image data are matched in the recognizing part 9, so that the outside world can be recognized at high speed.

Description

【発明の詳細な説明】[Detailed description of the invention]

【0001】

【産業上の利用分野】本発明は画像処理にて障害物等を
認識する移動車の環境認識装置に関するものである。
BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an environment recognition device for a moving vehicle that recognizes obstacles and the like by image processing.

【0002】

【従来の技術】従来の障害物の抽出、特に人や車両等の
道路上の移動障害物を抽出する方法としては、2台以上
のカメラによるステレオ視を行ない、その視差を用いて
対象物までの距離を算出するものがある。
[Prior Art] A conventional method for extracting obstacles, especially moving obstacles on the road such as people and vehicles, is to perform stereo viewing with two or more cameras and calculate the distance to the object from the parallax.

【0003】

【発明が解決しようとしている課題】しかしながら、上
記従来のステレオ視を用いた障害物抽出方法では、画像
処理にて得た特徴量の対応関係を得るために多くの処理
が必要となるという問題がある。これは、意味のある特
徴と意味のない特徴との対応を行なうために効率が悪化
したり、相互に類似した特徴量の対応を把握するときに
誤りが生じやすくなるからである。
[Problems to be Solved by the Invention] However, the above conventional obstacle extraction method using stereo vision has the problem that a great deal of processing is required to establish correspondences between the features obtained by image processing. This is because efficiency deteriorates when meaningful features must be matched against meaningless ones, and errors readily occur when matching mutually similar features.

【0004】

【課題を解決するための手段】本発明は上述の課題を解
決することを目的としてなされたもので、上述の課題を
解決するための手段として、以下の構成を備える。即ち
、左右のカメラによるステレオ画像を入力する手段と、
該ステレオ画像の各画像の遠方画像情報を削除する手段
と、該遠方画像情報を削除した両画像に基づいて外界を
認識する手段とを備える。
[Means for Solving the Problems] The present invention has been made to solve the above problems and, as a means of solving them, comprises the following configuration: means for inputting stereo images from left and right cameras; means for deleting distant image information from each image of the stereo pair; and means for recognizing the outside world based on the two images from which the distant image information has been deleted.

【0005】

【作用】以上の構成において、意味のない画像情報をマ
ツチング処理の前段階で削除して、高速なマツチング処
理をするよう機能する。
[Operation] With the above configuration, meaningless image information is deleted in a stage preceding the matching process, so that matching can be performed at high speed.

【0006】

【実施例】以下、添付図面を参照して本発明に係る好適
な実施例を詳細に説明する。図1は、本発明の実施例に
係る移動車の環境認識装置(以下、装置という)全体の
構成を示すブロツク図である。同図において、右画像入
力部1、及び左画像入力部2は、車両前方の環境をステ
レオ視するために、例えば、2台のCCDカメラ等にて
構成される左右画像の入力部であり、それらの中心軸は
相互に平行になるよう設置されている。右画像入力部1
にて撮られた画像は、輝度判定部3、及び差分装置5,
6にそれぞれ出力され、左画像入力部2からの画像は、
輝度判定部4、及び差分装置5,7にそれぞれ出力され
る。
DESCRIPTION OF THE PREFERRED EMBODIMENTS Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. FIG. 1 is a block diagram showing the overall configuration of an environment recognition apparatus for a moving vehicle (hereinafter, the apparatus) according to an embodiment of the present invention. In the figure, a right image input section 1 and a left image input section 2 are left and right image input sections, composed of, for example, two CCD cameras, for stereoscopically viewing the environment in front of the vehicle; their optical axes are set parallel to each other. The image taken by the right image input section 1 is output to a brightness determination section 3 and to difference devices 5 and 6, and the image from the left image input section 2 is output to a brightness determination section 4 and to difference devices 5 and 7.

【0007】輝度判定部3,4は、入力された左右画像
の中から、その輝度が所定値以上のものを抽出し、それ
を輝度に関する特徴量として、それぞれ左右画像に対応
する差分装置6,7に出力する。差分装置5は、右画像
入力部1からの画像と左画像入力部2からの画像の差分
をとり、右の画像と左の画像とで視差が生じない領域を
削除する。また、差分装置6は、輝度判定部3からの輝
度に関する特徴量情報をもとに右画像入力部1からの画
像と差分装置5にて得られた差分画像の差分をとり、そ
の結果を画像蓄積部8に送る。同様に、差分装置7も、
輝度判定部4からの輝度に関する特徴量情報をもとに左
画像入力部2からの画像と差分装置5にて得られた差分
画像の差分をとり、その結果を画像蓄積部8に送る。
The brightness determination sections 3 and 4 extract, from the input left and right images, the portions whose brightness is equal to or above a predetermined value, and output them as brightness features to the difference devices 6 and 7 corresponding to the right and left images, respectively. The difference device 5 takes the difference between the image from the right image input section 1 and the image from the left image input section 2, and deletes the areas where no parallax occurs between the right and left images. The difference device 6 takes the difference between the image from the right image input section 1 and the difference image obtained by the difference device 5, guided by the brightness feature information from the brightness determination section 3, and sends the result to an image storage section 8. Similarly, the difference device 7 takes the difference between the image from the left image input section 2 and the difference image obtained by the difference device 5, guided by the brightness feature information from the brightness determination section 4, and sends the result to the image storage section 8.
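The zero-parallax deletion performed by difference device 5 can be sketched in a few lines of NumPy. This is a hypothetical reconstruction; the tolerance `eps` and the toy pixel values are assumptions, not values from the patent.

```python
import numpy as np

def parallax_mask(right_img, left_img, eps=8):
    """Where the left and right images are (nearly) identical, no parallax
    exists, so the region is distant and can be deleted (device 5)."""
    diff = np.abs(right_img.astype(np.int16) - left_img.astype(np.int16))
    return diff >= eps  # True where parallax exists and the pixel is kept

# Toy 1x6 scanlines: a nearby object (value 200) appears shifted between
# the two views, while the distant background (value 50) does not.
right = np.array([[50, 50, 200, 200, 50, 50]], dtype=np.uint8)
left  = np.array([[50, 50,  50, 200, 200, 50]], dtype=np.uint8)
keep = parallax_mask(right, left)  # [[False, False, True, False, True, False]]
```

Note that a pixelwise difference is only a crude stand-in for a true disparity test, but it captures the idea that distant content looks identical in both views.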

【0008】画像蓄積部8は、差分装置6,7から送ら
れる冗長な情報が削除された画像を蓄積し、それを順次
外界認識部9に出力する。外界認識部9は、得られた画
像情報のマツチング処理をして高速に外界認識を行なう
。次に、本実施例の装置における特徴量の抽出について
説明する。図2は、本実施例の装置における特徴量抽出
の手順を示す概略フローチヤートである。同図において
、ステツプS1で右画像入力部1、及び左画像入力部2
にて対象画像として車両前方の画像を入力し、ステツプ
S2では差分装置5にて、ステツプS1で入力した左右
画像の比較を行なう。そして、続くステツプS3では、
差分装置5により左右の画像に視差があるか否かを判定
し、視差があればその部分の画像はそのままにし、視差
がなければステツプS4にて、その部分の画像を意味の
ない特徴量として削除する。
The image storage section 8 stores the images, sent from the difference devices 6 and 7, from which redundant information has been removed, and sequentially outputs them to an external world recognition section 9. The external world recognition section 9 performs matching on the obtained image information to recognize the external world at high speed. Next, feature extraction in the apparatus of this embodiment will be described. FIG. 2 is a schematic flowchart showing the procedure of feature extraction in the apparatus of this embodiment. In step S1, an image of the scene in front of the vehicle is input as the target image by the right image input section 1 and the left image input section 2; in step S2, the difference device 5 compares the left and right images input in step S1. In the following step S3, the difference device 5 determines whether or not there is parallax between the left and right images: if there is parallax, that part of the image is left as is; if there is no parallax, that part of the image is deleted in step S4 as a meaningless feature.

【0009】上述の処理は、視差が一定値以上の対象に
着目し、遠方にある物体については、2台のカメラにて
画像として捉えた場合、視差がないか、あつても少ない
ので特徴量として無視することを意味する。ステツプS
5では、輝度判定部3,4にて判定を行なう。つまり、
判定部に入力された左右画像の中に、その輝度が所定値
を越えるものが存在するか否かを判定し、所定値を越え
るものがあれば、ステツプS6で、それを輝度に関する
特徴量として残す。しかし、輝度が所定値以下であれば
、それらを輝度に関する特徴量として除外する。
The above processing focuses on objects whose parallax is at or above a certain value: a distant object, when captured as images by the two cameras, shows no parallax, or only a little, and is therefore ignored as a feature. In step S5, the brightness determination sections 3 and 4 perform their determination; that is, they determine whether the left and right input images contain portions whose brightness exceeds a predetermined value. If such portions exist, they are retained in step S6 as brightness features; portions whose brightness is at or below the predetermined value are excluded from the brightness features.

【0010】ステツプS7では、ステツプS5での輝度
判定の結果残された特徴量をもとに、差分装置6,7に
て、右画像入力部1からの画像と差分装置5にて得られ
た差分画像との差分、及び左画像入力部2からの画像と
差分装置5にて得られた差分画像の差分をとり、続くス
テツプS8でその結果を画像蓄積部8に送る。そして、
ステツプS9では、画像蓄積部8に蓄積された画像、つ
まり冗長な画像情報が削除され、必要な特徴量のみを有
する画像をもとに外界を認識する。
In step S7, based on the features remaining after the brightness determination of step S5, the difference devices 6 and 7 take the difference between the image from the right image input section 1 and the difference image obtained by the difference device 5, and the difference between the image from the left image input section 2 and that difference image; in the following step S8 the results are sent to the image storage section 8. Then, in step S9, the external world is recognized based on the images stored in the image storage section 8, that is, images from which redundant image information has been deleted and which retain only the necessary features.
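Steps S1 through S8 for one camera view can be combined into a single sketch. This is a hypothetical NumPy reconstruction; the parallax tolerance `eps`, the luminance threshold, and zeroing deleted pixels are all illustrative assumptions.

```python
import numpy as np

def preprocess_one_view(img, other, eps=8, lum_thresh=180):
    """Delete pixels that show no parallax against the other view, unless
    their luminance marks them as meaningful (e.g. a white lane line)."""
    parallax = np.abs(img.astype(np.int16) - other.astype(np.int16)) >= eps
    bright = img >= lum_thresh          # feature from the brightness sections
    keep = parallax | bright            # devices 6/7 spare the bright areas
    return np.where(keep, img, 0)       # deleted pixels zeroed (an assumption)

right = np.array([[50, 220, 200, 50]], dtype=np.uint8)
left  = np.array([[50, 220,  50, 200]], dtype=np.uint8)
out = preprocess_one_view(right, left)  # [[0, 220, 200, 50]]
```

Only the pixels that are neither shifted between the views nor bright (the distant, dark background) are removed before matching.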

【0011】図3は、差分画像をもとに意味のない画像
情報が削除される様子を模式的に示す図である。同図に
おいて、上述のように右画像11は差分装置5,6に、
左画像12は差分装置5,7にそれぞれ入力される。そ
の中で、遠方の画像は右と左とで視差がほとんど現われ
ないので、差分装置5では、その部分は削除される。図
3の斜線領域15,16が、意味のない特徴量として削
除された領域である。
FIG. 3 schematically shows how meaningless image information is deleted based on the difference image. In the figure, as described above, the right image 11 is input to the difference devices 5 and 6, and the left image 12 to the difference devices 5 and 7. Distant parts of the scene show almost no parallax between the right and left images, so the difference device 5 deletes those portions. The shaded areas 15 and 16 in FIG. 3 are the areas deleted as meaningless features.

【0012】しかし、視差が生じない画像の中にも、例
えば、道路端を示す白線のように障害物抽出のときに重
要な要素として削除できないものが含まれる場合がある
。その際、単純に左右画像の差分をとると、意味のある
画像情報であるにも拘らず白線が削除されてしまうので
、輝度判定部3,4にて、あらかじめ入力画像の中で輝
度が所定値を越えるものが存在するか否かを判定する。 そして、所定値を越えるものを白線として判断し、その
領域を差分装置6,7に特徴量情報として与える。
However, even images without parallax may contain elements that are important for obstacle extraction and must not be deleted, such as the white line marking the edge of the road. If the difference between the left and right images were simply taken, the white line would be deleted despite being meaningful image information; therefore the brightness determination sections 3 and 4 determine in advance whether the input images contain portions whose brightness exceeds the predetermined value. Portions exceeding the value are judged to be white lines, and those areas are given to the difference devices 6 and 7 as feature information.

【0013】図3においては、右画像11の実線17、
及び左画像12の実線18が、それぞれ輝度判定部3,
4にて白線として判断された道路端である。これらの白
線17,18は、その一部分17a,18aにおいて相
互に視差がなく、差分装置5では削除の対象となる部分
となる。しかし、差分装置6、あるいは差分装置7にて
、元画像である画像11、あるいは画像12と差分装置
5にて得られた差分画像との差分をとる際、差分装置6
,7は、あらかじめ白線の特徴量情報として白線17,
18を与えられているので、白線の一部分17a,18
aを差分をとるときの領域としては除外する。
In FIG. 3, the solid line 17 in the right image 11 and the solid line 18 in the left image 12 are road edges judged to be white lines by the brightness determination sections 3 and 4, respectively. These white lines 17 and 18 have no mutual parallax in their portions 17a and 18a, which would therefore be subject to deletion by the difference device 5. However, when the difference device 6 or 7 takes the difference between the original image 11 or 12 and the difference image obtained by the difference device 5, the devices 6 and 7, having been given the white lines 17 and 18 in advance as white-line feature information, exclude the white-line portions 17a and 18a from the area over which the difference is taken.
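The white-line exclusion can be shown on a toy scanline. All pixel values and thresholds here are invented for illustration: a distant white mark shows no parallax and would be deleted by device 5 alone, but the luminance feature spares it.

```python
import numpy as np

# A distant white lane mark (value 240) shows no parallax between the views:
# device 5 alone would delete it, but the luminance feature excludes it from
# the deletion area, so the road edge survives.
right = np.array([[50, 240, 50, 200,  50]], dtype=np.uint8)
left  = np.array([[50, 240, 50,  50, 200]], dtype=np.uint8)

no_parallax = np.abs(right.astype(np.int16) - left.astype(np.int16)) < 8
white_line = right >= 180                 # feature from brightness section 3
delete = no_parallax & ~white_line        # devices 6/7 skip white-line pixels
kept_right = np.where(delete, 0, right)   # [[0, 240, 0, 200, 50]]
```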

【0014】その結果、図3の画像13,14に示すよ
うに、道路端である白線はそのまま残される。つまり、
ある値以上の輝度を有する画像に対しては、意味ある領
域であるとして差分をとらない。以上説明したように、
本実施例によれば、2台のカメラでステレオ視して得ら
れた画像の内、視差がないものを無意味な特徴量である
として差分処理の際に削除し、さらにその中でも所定の
閾値を越える輝度を有する画像については、意味ある画
像として残し、冗長な情報を障害物を認識する前処理段
階で削除することで、特徴量算出、及びマツチング時の
処理量を削減し、高速な処理ができるという効果がある
As a result, as shown in images 13 and 14 of FIG. 3, the white lines at the road edges remain as they are. In other words, for image portions having brightness above a certain value, no difference is taken, since they are regarded as meaningful areas. As explained above, according to this embodiment, among the images obtained by stereo viewing with two cameras, portions without parallax are deleted during the difference processing as meaningless features, while portions whose brightness exceeds a predetermined threshold are retained as meaningful images. By deleting redundant information in the preprocessing stage before obstacle recognition, the amount of processing required for feature calculation and matching is reduced, and high-speed processing becomes possible.

【0015】

【発明の効果】以上説明したように、本発明によれば、
前処理段階で意味のない情報を削除することで、左右の
画像のマツチング等の処理量を削減し、高速な処理が可
能となるという効果がある。
[Effects of the Invention] As explained above, according to the present invention, deleting meaningless information in the preprocessing stage reduces the amount of processing, such as the matching of left and right images, and makes high-speed processing possible.

【図面の簡単な説明】[Brief explanation of drawings]

【図1】本発明の実施例に係る移動車の環境認識装置全
体の構成を示すブロツク図、
FIG. 1 is a block diagram showing the overall configuration of an environment recognition device for a moving vehicle according to an embodiment of the present invention;

【図2】実施例の装置における特徴量抽出の手順を示す
概略フローチヤート、
FIG. 2 is a schematic flowchart showing the procedure for feature extraction in the device of the embodiment;

【図3】差分画像をもとに意味のない画像情報が削除さ
れる様子を模式的に示す図である。
FIG. 3 is a diagram schematically showing how meaningless image information is deleted based on a difference image.

【符号の説明】[Explanation of symbols]

1  右画像入力部 (Right image input section)
2  左画像入力部 (Left image input section)
3, 4  輝度判定部 (Brightness determination section)
5, 6, 7  差分装置 (Difference device)
8  画像蓄積部 (Image storage section)
9  外界認識部 (External world recognition section)
17, 18  道路端白線 (Road edge white line)

Claims (1)

【特許請求の範囲】[Claims] 【請求項1】  左右のカメラによるステレオ画像を入
力する手段と、該ステレオ画像の各画像の遠方画像情報
を削除する手段と、該遠方画像情報を削除した両画像に
基づいて外界を認識する手段とを備えることを特徴とす
る移動車の環境認識装置。
1. An environment recognition apparatus for a moving vehicle, comprising: means for inputting stereo images from left and right cameras; means for deleting distant image information from each image of the stereo pair; and means for recognizing the external world based on the two images from which the distant image information has been deleted.
JP3065855A 1991-03-29 1991-03-29 Environment recognizing apparatus for moving vehicle Withdrawn JPH04301512A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3065855A JPH04301512A (en) 1991-03-29 1991-03-29 Environment recognizing apparatus for moving vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3065855A JPH04301512A (en) 1991-03-29 1991-03-29 Environment recognizing apparatus for moving vehicle

Publications (1)

Publication Number Publication Date
JPH04301512A true JPH04301512A (en) 1992-10-26

Family

ID=13299048

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3065855A Withdrawn JPH04301512A (en) 1991-03-29 1991-03-29 Environment recognizing apparatus for moving vehicle

Country Status (1)

Country Link
JP (1) JPH04301512A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08138053A (en) * 1994-11-08 1996-05-31 Canon Inc Subject imformation processor and remote control device
US5729216A (en) * 1994-03-14 1998-03-17 Yazaki Corporation Apparatus for monitoring vehicle periphery


Similar Documents

Publication Publication Date Title
JP3242529B2 (en) Stereo image matching method and stereo image parallax measurement method
US6661838B2 (en) Image processing apparatus for detecting changes of an image signal and image processing method therefor
US11379963B2 (en) Information processing method and device, cloud-based processing device, and computer program product
US20210174092A1 (en) Image processing apparatus and method for feature extraction
KR101988551B1 (en) Efficient object detection and matching system and method using stereo vision depth estimation
CN108257165B (en) Image stereo matching method and binocular vision equipment
CN110781770B (en) Living body detection method, device and equipment based on face recognition
CN111814773A (en) Lineation parking space identification method and system
CN113763449B (en) Depth recovery method and device, electronic equipment and storage medium
JPH04301512A (en) Environment recognizing apparatus for moving vehicle
JPH0721388A (en) Picture recognizing device
CN113657277A (en) System and method for judging shielded state of vehicle
RU2383925C2 (en) Method of detecting contours of image objects and device for realising said method
JP2972924B2 (en) Apparatus and method for recognizing an object
JPH061171B2 (en) Compound vision device
JPH0520593A (en) Travelling lane recognizing device and precedence automobile recognizing device
JP2000331160A (en) Device and method for matching and recording medium stored with matching program
JPS60217472A (en) Edge extracting method in picture processing
JP4265927B2 (en) Stereo image processing device
JPH06101024B2 (en) Obstacle detection device
JPH0869535A (en) Background feature mask generating device and mobile object feature extracting device
JPH04301571A (en) Apparatus for recognizing environment of moving vehicle
EP2426930A1 (en) System and method of analyzing a stereo video signal
JP3081727B2 (en) Distance detection method
CN110852153A (en) Berth state detection method, berth state acquisition device, industrial personal computer and medium

Legal Events

Date Code Title Description
A300 Application deemed to be withdrawn because no request for examination was validly filed

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 19980514