JP2007263657A - Three-dimensional coordinates acquisition system - Google Patents

Three-dimensional coordinates acquisition system

Info

Publication number
JP2007263657A
JP2007263657A (application number JP2006087478A)
Authority
JP
Japan
Prior art keywords
vehicle
processing unit
dimensional coordinates
stereo processing
estimation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2006087478A
Other languages
Japanese (ja)
Inventor
Hideki Shirai
英樹 白井
Hiroaki Kumon
宏明 公文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Denso IT Laboratory Inc
Original Assignee
Denso Corp
Denso IT Laboratory Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp, Denso IT Laboratory Inc filed Critical Denso Corp
Priority to JP2006087478A priority Critical patent/JP2007263657A/en
Publication of JP2007263657A publication Critical patent/JP2007263657A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a three-dimensional coordinate acquisition apparatus that fuses the techniques of compound-eye stereo and monocular stereo. SOLUTION: The apparatus has at least two cameras mounted on a vehicle and arranged at a predetermined installation distance, and estimates and acquires the three-dimensional coordinates of targets around the vehicle using a plurality of images acquired by the at least two cameras. It comprises a monocular stereo processing unit that estimates and calculates the three-dimensional coordinates of targets around the vehicle from two images acquired over time by one of the at least two cameras; a compound-eye stereo processing unit that estimates and calculates the three-dimensional coordinates of targets around the vehicle from two images acquired simultaneously by two of the at least two cameras; and a result integration/switching means, connected to the monocular stereo processing unit and the compound-eye stereo processing unit, that selects and integrates, according to a predetermined criterion, the three-dimensional coordinates of the targets estimated and calculated by those processing units, estimates them as the three-dimensional coordinates of the target to be obtained, and outputs the result. COPYRIGHT: (C)2008,JPO&INPIT

Description

The present invention relates to a three-dimensional coordinate acquisition apparatus that acquires the three-dimensional coordinates of objects around a vehicle using images captured by cameras mounted on the vehicle, and more particularly to a three-dimensional coordinate acquisition apparatus that acquires the three-dimensional coordinates of objects around the vehicle using two or more cameras.

Conventionally, systems have been developed in which a camera is mounted on a vehicle, images ahead of and behind the vehicle are captured, and the environment around the vehicle is recognized three-dimensionally. As a method of measuring three-dimensional coordinates (distance) with cameras, stereo vision using two or more cameras (hereinafter, compound-eye stereo) is common. For example, a technique is known in which images captured by a stereo camera mounted on a vehicle are processed to measure the three-dimensional coordinate position of an object outside the vehicle (see, for example, Patent Document 1).

In compound-eye stereo, images are captured by a plurality of cameras placed at spatially separated positions, and the three-dimensional coordinates (distance) can be calculated by the principle of so-called triangulation, using the parallax of characteristic points of an object (corners and the like; hereinafter simply "feature points") or of its texture.
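As a minimal illustration of this triangulation principle (not taken from the patent itself): for a rectified stereo pair with parallel optical axes, the depth of a feature point follows directly from its disparity. The focal length, baseline, and pixel values below are hypothetical.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth Z of a point from its stereo disparity, Z = f * B / d (rectified pair)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 800 px focal length, 0.3 m baseline, 8 px disparity -> 30 m.
print(depth_from_disparity(8.0, 800.0, 0.3))
```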

As a method of measuring three-dimensional coordinates (distance) with a single camera, there is a technique called monocular stereo (also known as motion stereo or SfM, Structure from Motion). It realizes stereo vision by using a plurality of images captured by one camera at different times from different viewpoints. For example, parking assist systems using this technique are disclosed in Patent Document 2 and Patent Document 3.
Japanese Patent Application Laid-Open No. 5-114099; Japanese Patent Application Laid-Open No. 2001-187553; Japanese Patent Application Laid-Open No. 2004-198211

In compound-eye stereo, the distance estimation accuracy depends on the installation distance between the cameras (the baseline length). Lengthening the baseline increases the parallax and improves the distance estimation accuracy. However, given that the cameras must be mounted on a car, this baseline cannot be made very large.
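The dependence on baseline can be made concrete with the standard first-order stereo error model (an illustration under assumed numbers, not from the patent): differentiating Z = fB/d gives a depth error of roughly ΔZ ≈ Z²·Δd/(f·B), so halving the baseline doubles the error at a given depth.

```python
def depth_error(depth_m, focal_px, baseline_m, disparity_err_px=0.5):
    """First-order depth uncertainty dZ ~ Z^2 * dd / (f * B)."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Hypothetical: at 30 m with an 800 px focal length and half-pixel matching error,
# a 0.3 m baseline gives ~1.9 m of depth error; a 1.2 m baseline gives ~0.47 m.
for b in (0.3, 1.2):
    print(b, round(depth_error(30.0, 800.0, b), 2))
```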

In monocular stereo, only one camera is required, so there are almost no restrictions on camera installation. Moreover, parallax can be gained by taking a longer observation time. However, it has the drawback that it cannot be used while the vehicle is stationary.

An object of the present invention is to provide a more accurate three-dimensional coordinate acquisition apparatus that compensates for the drawbacks of compound-eye stereo and monocular stereo by fusing the two techniques.

FIG. 5 is a plan view showing an example in which one three-dimensional coordinate acquisition apparatus is arranged on the front portion (face) of a vehicle, FIG. 6 is a plan view showing another example in which one three-dimensional coordinate acquisition apparatus is arranged on the front portion (face) of a vehicle, and FIG. 7 is a plan view showing an example in which four three-dimensional coordinate acquisition apparatuses are arranged on the front, rear, left, and right portions (faces) of a vehicle.

Three-dimensional coordinates are calculated by monocular stereo and by compound-eye stereo, and the results are simply switched or integrated. For example, when the vehicle is stationary, the compound-eye stereo result is used, and when the vehicle is moving, the monocular and compound-eye stereo results are integrated. As another example, the compound-eye stereo result is used in the region where compound-eye stereo is possible, and the monocular stereo result is used elsewhere (see FIG. 5). As yet another example, as shown in FIG. 6, the camera pair 2A, 2B may be arranged with their optical axes 2c, 2d pointing outward from each other, and such that the camera field A of camera 2A and the camera field B of camera 2B partially overlap ahead of the vehicle (or laterally outward of the vehicle, if the camera pair 2A, 2B is mounted on the side of the vehicle), that is, ahead of the optical axes 2c, 2d. Compared with arranging the camera pair 2A, 2B with their optical axes 2c, 2d parallel to each other as in FIG. 5, this widens the combined field of view of the two cameras 2A, 2B while still allowing highly accurate three-dimensional coordinate estimation by compound-eye stereo ahead of the vehicle (laterally outward, in the case of side mounting). Further, as shown in FIG. 7, by arranging cameras on the front, rear, left, and right of the vehicle and combining the monocular stereo results with the compound-eye stereo results, stereo vision with almost no blind spots can be realized with a relatively small number of cameras.

When the results are integrated, the weights of the two are varied according to the situation (the distance of the object from the camera, the vehicle speed, the number and accuracy of the flows), for example.

Further, intermediate processing results of the monocular stereo and compound-eye stereo are used to reduce the computation cost and improve the estimation accuracy.

If a stereo camera is already installed in the vehicle, a three-dimensional coordinate acquisition apparatus with higher added value can be realized without additional hardware, according to the moving state of the vehicle, the surrounding conditions, the distance of the coordinate position to be obtained, and so on. In addition, by devising the camera arrangement, a three-dimensional coordinate acquisition apparatus with a wide viewing angle can be realized with a small number of cameras. Moreover, by sharing and fusing information not only at the final-result stage but also at the intermediate-result stage, more accurate estimation can be expected.

Embodiments of the present invention will be described below with reference to the drawings.

FIG. 1 is a diagram showing an example of the three-dimensional coordinate acquisition apparatus, FIG. 2 is a diagram showing another example of the three-dimensional coordinate acquisition apparatus, FIG. 3 is a flowchart showing an example of the processing of the three-dimensional coordinate acquisition apparatus, and FIG. 4 is a diagram showing images acquired by a monocular stereo camera and the flows obtained from those images. FIG. 8 is a diagram showing an example of the flows obtained from the left and right camera images corresponding to FIG. 6.

As shown in FIG. 1, the three-dimensional coordinate acquisition apparatus 1 mounted on a vehicle has two cameras 2A and 2B arranged at the front of the vehicle (not shown) and separated by a predetermined installation distance L (the number of cameras may be two or more, but at least two are necessary). A monocular stereo processing unit 3 and a compound-eye stereo processing unit 5 are connected to the camera 2A. The compound-eye stereo processing unit 5 is also connected to the camera 2B, and a result integration/switching means 6 is connected to the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5.

The monocular stereo processing unit 3 has a flow estimation means 7 connected to the camera 2A, an own-vehicle motion estimation means 9, and a three-dimensional coordinate estimation means 10 connected to the result integration/switching means 6. The compound-eye stereo processing unit 5 has a corresponding point estimation means 11 connected to the two cameras 2A and 2B, and a three-dimensional coordinate estimation means 12 connected to the result integration/switching means 6.

Apart from the cameras 2A and 2B in FIG. 1, the flow estimation means 7, own-vehicle motion estimation means 9, three-dimensional coordinate estimation means 10, corresponding point estimation means 11, three-dimensional coordinate estimation means 12, and result integration/switching means 6 are schematic representations of functions virtually realized by a computer having a CPU executing a program created, for example, on the basis of the processing flowchart shown in FIG. 3; however, each part may also be configured as hardware using integrated circuits such as semiconductors.

Since the three-dimensional coordinate acquisition apparatus 1 has the above configuration, to estimate the three-dimensional coordinates of an object such as a target captured in an image from the single camera 2A, the camera 2A first photographs the situation around the vehicle, which changes from moment to moment as the vehicle moves. The images captured by the camera 2A are recorded at a predetermined frame rate (the number of images taken per unit time) and stored in a memory (not shown).

For example, a camera mounted at the front of the vehicle captures images around the vehicle at fixed time intervals (FIG. 4). Next, as shown in step S1 of FIG. 3, the flow estimation means 7 uses a known method to compare two images obtained by the camera at different capture (acquisition) times (see FIG. 4), extracts a large number of points judged to appear in both images (parts of the image that are characteristic, for example corners, sharp tips, or points where brightness or saturation changes; any such point will do) as feature points, and performs corresponding point estimation processing in which those feature points are associated with each other between the different images and estimated to be corresponding points between the two images. This is generally called optical flow estimation (or simply flow estimation), and the KLT (Kanade-Lucas-Tomasi) method is often used. An example of flows is shown in FIG. 4. As is clear from FIG. 4, a flow is the trajectory along which a feature point has moved between the image one time step earlier and the image at the current time.
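A minimal sketch of this kind of KLT flow estimation, using OpenCV rather than any implementation from the patent; the file names are placeholders.

```python
import cv2

# Two frames from the same camera at consecutive capture times (paths are placeholders).
prev = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Step S1: extract feature points (corners) in the earlier image ...
pts0 = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)

# ... and track them into the later image with pyramidal Lucas-Kanade (KLT).
pts1, status, _err = cv2.calcOpticalFlowPyrLK(prev, curr, pts0, None)

# Keep only points tracked successfully; each (p0, p1) pair is one flow vector.
flows = [(p0.ravel(), p1.ravel())
         for p0, p1, ok in zip(pts0, pts1, status.ravel()) if ok == 1]
print(f"{len(flows)} flows estimated")
```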

Next, as shown in step S2 of FIG. 3, the own-vehicle motion estimation means 9 estimates the motion of the own vehicle, that is, the movement of the camera, using the flows obtained by the flow estimation means 7. The camera movement consists of the translational displacement in the X, Y, and Z axis directions between the camera positions before and after the movement, and the rotational displacement about the X, Y, and Z axes. Since the camera is fixed to the car, the camera movement and the own-vehicle movement are used interchangeably. Methods of estimating the own-vehicle (camera) motion include a method using sensor information indicating the moving distance of the vehicle obtained from a wheel speed sensor or the like (for example, Japanese Patent Application Laid-Open No. 2001-187553) and methods using the flows of feature points in the images. As the latter, for example, a method is known that calculates the movement of the own vehicle carrying the camera using flows indicating the movement of corresponding points belonging to the road surface (for example, T. Suzuki (Toyota), T. Kanade, 'Measurement of Vehicle Motion and Orientation using Optical Flow', IEEE Conference on ITS, 1999). There is also a method using the fundamental matrix (Yamaguchi (Toyota Central R&D Labs) et al., 'Obstacle detection of forward vehicles with an in-vehicle monocular camera', IPSJ SIG Technical Report, 2005.11).
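As one hedged sketch of the flow-based variant (the fundamental/essential-matrix route, not the road-surface method the patent cites), OpenCV can recover the rotation and the translation direction from the tracked point pairs. The intrinsic matrix K is an assumed calibration, and the essential matrix fixes translation only up to scale, so an odometry reading is needed to obtain metric units.

```python
import cv2
import numpy as np

def estimate_ego_motion(pts0, pts1, K, odometry_distance_m):
    """Rotation R and metric translation t of the camera between two frames,
    from matched image points (Nx2 float32 arrays) and an assumed intrinsic K.
    The traveled distance from wheel-speed odometry (an assumption here)
    sets the scale of the unit-norm translation."""
    E, inliers = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t_unit, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inliers)
    return R, t_unit * odometry_distance_m

# Hypothetical intrinsics for a 640x480 camera with an 800 px focal length.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
```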

Next, in step S3 of FIG. 3, the three-dimensional coordinate estimation means 10 of the monocular stereo processing unit 3 obtains the three-dimensional coordinates of each feature point by a known method (for example, the method disclosed in Kenichi Kanatani, "Image Understanding: The Mathematics of 3-D Recognition", Morikita Publishing, ISBN 4-627-82140-9), using the flows of the feature points obtained by the flow estimation means 7, the own-vehicle motion obtained by the own-vehicle motion estimation means 9, and the camera intrinsic parameters (focal length, angle of view, and so on, which are assumed to be known).
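A minimal sketch of this triangulation step with OpenCV, under the same assumed calibration; the projection matrices are built from the ego-motion estimated above, and cv2.triangulatePoints stands in for the textbook method cited.

```python
import cv2
import numpy as np

def triangulate_monocular(pts0, pts1, K, R, t):
    """3-D coordinates of feature points from two views of one moving camera.
    The first camera pose is taken as the origin; (R, t) is the estimated
    ego-motion between the two capture times."""
    P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # camera at time t0
    P1 = K @ np.hstack([R, np.reshape(t, (3, 1))])      # camera at time t1
    Xh = cv2.triangulatePoints(P0, P1, pts0.T, pts1.T)  # homogeneous, 4xN
    return (Xh[:3] / Xh[3]).T                           # Euclidean, Nx3
```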

Meanwhile, in parallel with the above-described three-dimensional coordinate acquisition processing by the monocular stereo processing unit 3 using the single camera 2A, the three-dimensional coordinate acquisition apparatus 1 executes, in the compound-eye stereo processing unit 5, three-dimensional coordinate acquisition processing for objects around the vehicle from the images captured by the two cameras 2A and 2B.

In this processing, first, the situation around the vehicle is photographed using the cameras 2A and 2B. This can be performed in parallel with the image acquisition via the camera 2A for the monocular stereo processing unit 3 described above. That is, the same image data acquired by the camera 2A is output in parallel to the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5; the monocular stereo processing unit 3 performs the three-dimensional coordinate acquisition processing for objects around the vehicle using only the images from the camera 2A, while the compound-eye stereo processing unit 5 performs that processing based on the images from the camera 2B, which operates simultaneously, in addition to those from the camera 2A.

Next, in step S4 of FIG. 3, the corresponding point estimation means 11 of the compound-eye stereo processing unit 5 associates feature points in the images captured (acquired) at the same time by the two cameras. For example, there is a method of extracting characteristic points (feature points) in an image using a corner detection filter such as the Harris method, setting a block centered on each point, and performing a correlation operation such as block matching between the images. The method of selecting feature points is the same as in the flow estimation means 7 of the monocular stereo processing unit 3 described above.
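A hedged sketch of that corner-plus-block-matching idea (one of several possible correlation schemes, not the patent's exact procedure): it matches each left-image corner against a horizontal search band in the right image by sum of absolute differences, under the simplifying assumption of a rectified pair.

```python
import cv2
import numpy as np

def match_corners(left, right, block=11, max_disp=64):
    """Associate Harris corners in the left image with the best SAD match in
    the right image of a rectified pair (so correspondences share a row)."""
    h = block // 2
    corners = cv2.goodFeaturesToTrack(left, 300, 0.01, 7,
                                      useHarrisDetector=True).reshape(-1, 2)
    matches = []
    for x, y in corners.astype(int):
        if y < h or y >= left.shape[0] - h or x < h + max_disp or x >= left.shape[1] - h:
            continue  # skip corners whose search band leaves the image
        patch = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
        costs = [np.abs(patch - right[y - h:y + h + 1,
                                      x - d - h:x - d + h + 1].astype(np.int32)).sum()
                 for d in range(max_disp)]
        matches.append(((x, y), (x - int(np.argmin(costs)), y)))  # (left pt, right pt)
    return matches
```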

Next, the three-dimensional coordinate estimation means 12 of the compound-eye stereo processing unit 5 calculates the three-dimensional coordinates of each feature point by a known method, using the correspondence (parallax information) of the feature points obtained above, the camera installation information (the positional and rotational displacement between the cameras, which are assumed to have been obtained in advance by calibration), and the camera internal information (focal length, angle of view, and so on).
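The same triangulation call sketched above applies here, except that the relative pose now comes from the one-time calibration of the camera pair rather than from ego-motion. R_cal and t_cal below stand in for those pre-calibrated extrinsics (assumed names and values); triangulate_monocular and K are from the earlier sketches, and pts_left, pts_right are Nx2 float32 arrays of the matched corners from the block-matching sketch.

```python
import numpy as np

R_cal = np.eye(3)                          # assumed calibration: parallel optical axes
t_cal = np.array([[-0.3], [0.0], [0.0]])   # assumed calibration: 0.3 m baseline along X
points_3d = triangulate_monocular(pts_left, pts_right, K, R_cal, t_cal)
```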

Next, the three-dimensional coordinate acquisition apparatus 1 causes the result integration/switching means 6 to perform processing for selecting or integrating either the three-dimensional coordinates of objects around the vehicle obtained by the monocular stereo processing unit 3, that is, the monocular position information, or the three-dimensional coordinates obtained by the compound-eye stereo processing unit 5, that is, the compound-eye position information (step S6 in FIG. 3).

For example, when a vehicle speed sensor or the like (not shown) detects that the vehicle is stopped, the result integration/switching means 6 judges that the computation result of the monocular stereo processing unit 3 cannot be used, and adopts only the computation result from the compound-eye stereo processing unit 5 to estimate the three-dimensional coordinates of objects around the vehicle.

When the vehicle speed sensor or the like detects that the vehicle is moving, the result integration/switching means 6 judges that the computation results of both the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5 can be used, and estimates the three-dimensional coordinates of objects (targets) around the vehicle from the computation results of the monocular stereo processing unit 3 and/or the compound-eye stereo processing unit 5.

Various forms of estimation are conceivable at this point. Simple methods include selecting one of the two estimation results as the three-dimensional coordinates of the target, or collecting, from the computation results of the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5, the results of both processing units 3 and 5 for feature points regarded as the same feature point on an object around the vehicle, and taking their average as the three-dimensional coordinates of that feature point. In general, for images of objects around the vehicle acquired by the cameras 2A and 2B, the selected feature points are considered to coincide in many cases whether the subsequent image processing is monocular stereo processing or compound-eye stereo processing, so the above processing enables highly accurate three-dimensional coordinate acquisition by the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5.
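A hedged sketch of such a switching-and-averaging rule (one plausible reading of the criterion, with made-up weights): compound-eye results only while stopped, otherwise a distance-dependent weighted average for feature points the two units agree on.

```python
import numpy as np

def fuse(mono_pts, compound_pts, vehicle_speed_mps, near_range_m=10.0):
    """Fuse per-feature 3-D estimates (Nx3 arrays, same feature order assumed).
    Weights are illustrative: compound-eye stereo dominates nearby, monocular
    stereo dominates at range once the vehicle is moving."""
    if vehicle_speed_mps < 0.1:          # stopped: monocular stereo is unusable
        return compound_pts
    depth = compound_pts[:, 2]
    w = np.clip(near_range_m / np.maximum(depth, 1e-6), 0.0, 1.0)[:, None]
    return w * compound_pts + (1.0 - w) * mono_pts
```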

The three-dimensional coordinate positions of targets around the vehicle finally estimated by the result integration/switching means 6 are output to an appropriate control means via an interface (not shown) for subsequent processing.

Alternatively, for the vicinity of the vehicle, where the coordinate estimation accuracy of compound-eye stereo is high, the computation result of the three-dimensional coordinate estimation means 12 of the compound-eye stereo processing unit 5 may be adopted, and for middle and far ranges, where the coordinate estimation accuracy of monocular stereo becomes relatively higher, the computation result of the three-dimensional coordinate estimation means 10 of the monocular stereo processing unit 3 may be adopted, or the computation results of the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5 may be integrated with appropriate weights (FIG. 5).

While the vehicle is stopped, only compound-eye stereo is available; however, since there is no need to know distant conditions while the vehicle is stopped, the inter-camera distance (baseline length) can be set short. If a longer range is desired as the vehicle speed increases, the monocular stereo result can be used. With the camera configuration of FIG. 6 (camera pair 2A, 2B installed with their optical axes 2c, 2d pointing outward from each other), a wide-angle stereo system can be configured that achieves high accuracy by compound-eye stereo vision directly ahead of the vehicle while using monocular stereo vision elsewhere. In general, flows in the forward (traveling) direction of the vehicle are small in magnitude (see FIG. 8), making it difficult to obtain good accuracy with monocular stereo vision, but this configuration avoids that problem.

If it is desired to observe far distances with compound-eye stereo, the inter-camera distance must be lengthened. Further, as shown in FIG. 7, by arranging cameras on the front, rear, left, and right of the vehicle and combining the monocular stereo results with the compound-eye stereo results, stereo vision with almost no blind spots can be realized with a relatively small number of cameras.

FIG. 2 shows another example of the three-dimensional coordinate acquisition apparatus 1. The same parts as those shown in FIG. 1 are denoted by the same reference numerals, and their description is omitted.

Compared with the embodiment of FIG. 1, intermediate results are shared between the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5. That is, if a feature point output means (built into either of the estimation means 7 and 11) is provided between the flow estimation means 7 and the corresponding point estimation means 11, which outputs the feature points selected by one of them to the other estimation means as the feature points to be used by that other estimation means, so that the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5 use common feature points, then the three-dimensional coordinates of the common feature points are computed by both the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5, and an improvement in estimation accuracy can be expected. In addition, since feature points can be extracted by one of the estimation means 7 or 11 and used as-is by the other, the feature point extraction computation is halved and the computational load on the computer can be reduced.
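A minimal sketch of this sharing idea (file names and details are illustrative, not from the patent): corners are detected once in the current camera-2A image, and the same array seeds both the monocular flow estimation (tracked backwards into the previous 2A frame) and the compound-eye correspondence search; for brevity the pyramidal tracker doubles here as the stereo matcher.

```python
import cv2

# Placeholder frames: previous and current images from camera 2A, and the
# simultaneously captured image from camera 2B.
prev_2a = cv2.imread("2a_t0.png", cv2.IMREAD_GRAYSCALE)
curr_2a = cv2.imread("2a_t1.png", cv2.IMREAD_GRAYSCALE)
curr_2b = cv2.imread("2b_t1.png", cv2.IMREAD_GRAYSCALE)

# Detect feature points once, in the current 2A image ...
pts = cv2.goodFeaturesToTrack(curr_2a, 500, 0.01, 7)

# ... then reuse the identical set for both pipelines.
mono_pts, mono_ok, _ = cv2.calcOpticalFlowPyrLK(curr_2a, prev_2a, pts, None)    # monocular
stereo_pts, stereo_ok, _ = cv2.calcOpticalFlowPyrLK(curr_2a, curr_2b, pts, None)  # compound-eye
```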

Similarly, if a computation result exchange means (built into each of the estimation means 10 and 12) is provided between the three-dimensional coordinate estimation means 10 of the monocular stereo processing unit 3 and the three-dimensional coordinate estimation means 12 of the compound-eye stereo processing unit 5, allowing the computed and estimated three-dimensional coordinates to be output to each other, then by exchanging computation results between the monocular stereo processing unit 3 and the compound-eye stereo processing unit 5, each processing unit can reflect the three-dimensional coordinate estimation result of the other processing unit in its own estimation result, making it possible to remove mistakes and errors, and an improvement in estimation accuracy can be expected.

INDUSTRIAL APPLICABILITY: The present invention can be used in a three-dimensional coordinate acquisition apparatus that is mounted on a vehicle and acquires the three-dimensional coordinates of objects around the vehicle using two or more cameras.

FIG. 1 is a diagram showing an example of the three-dimensional coordinate acquisition apparatus.
FIG. 2 is a diagram showing another example of the three-dimensional coordinate acquisition apparatus.
FIG. 3 is a flowchart showing an example of the processing of the three-dimensional coordinate acquisition apparatus.
FIG. 4 is a diagram showing images acquired by a monocular stereo camera and the flows obtained from those images.
FIG. 5 is a plan view showing an example in which one three-dimensional coordinate acquisition apparatus is arranged on the front portion (face) of a vehicle.
FIG. 6 is a plan view showing another example in which one three-dimensional coordinate acquisition apparatus is arranged on the front portion (face) of a vehicle.
FIG. 7 is a plan view showing an example in which four three-dimensional coordinate acquisition apparatuses are arranged on the front, rear, left, and right portions (faces) of a vehicle.
FIG. 8 is a diagram showing an example of the flows obtained from the left and right camera images corresponding to FIG. 6.

Explanation of symbols

1 …… Stereo image processing apparatus
2A, 2B …… Camera
2c, 2d …… Optical axis
3 …… Monocular stereo processing unit
5 …… Compound-eye stereo processing unit
6 …… Result integration/switching means
7 …… Flow estimation means
9 …… Monocular coordinate estimation calculation means (own-vehicle motion estimation means)
10 …… Monocular coordinate estimation calculation means, calculation result exchange means (three-dimensional coordinate estimation means)
11 …… Corresponding point estimation means
12 …… Compound-eye coordinate estimation calculation means, calculation result exchange means (three-dimensional coordinate estimation means)

Claims (5)

A three-dimensional coordinate acquisition apparatus mounted on a vehicle, having at least two cameras arranged at a predetermined installation distance, for estimating and acquiring the three-dimensional coordinates of targets around the vehicle using a plurality of images acquired by the at least two cameras, the apparatus comprising:
a monocular stereo processing unit that estimates and calculates the three-dimensional coordinates of targets around the vehicle from two images acquired over time by one of the at least two cameras;
a compound-eye stereo processing unit that estimates and calculates the three-dimensional coordinates of targets around the vehicle from two images acquired simultaneously by two of the at least two cameras; and
a result integration/switching means, connected to the monocular stereo processing unit and the compound-eye stereo processing unit, that selects and integrates, according to a predetermined criterion, the three-dimensional coordinates of the targets estimated and calculated by those processing units, estimates them as the three-dimensional coordinates of the target to be obtained, and outputs the result.
The three-dimensional coordinate acquisition apparatus according to claim 1, wherein the result integration/switching means has a sensor that detects whether the vehicle is moving, and
a coordinate estimation calculation means that, when the sensor judges that the vehicle is stopped, adopts the three-dimensional coordinates of targets around the vehicle calculated by the compound-eye stereo processing unit, and, when the sensor judges that the vehicle is moving, estimates the three-dimensional coordinates of the target to be obtained based on the estimated three-dimensional coordinates of targets around the vehicle calculated by the monocular stereo processing unit and/or the compound-eye stereo processing unit.
The three-dimensional coordinate acquisition apparatus according to claim 1, wherein the monocular stereo processing unit has:
a flow estimation means that extracts feature points from two images acquired over time by the one camera and estimates the flows of those feature points between the images; and
a monocular coordinate estimation calculation means that estimates and calculates the three-dimensional coordinates of targets around the vehicle based on the estimated flows;
wherein the compound-eye stereo processing unit has:
a corresponding point estimation means that extracts feature points from two images acquired simultaneously by two cameras and associates those feature points between the two images; and
a compound-eye coordinate estimation calculation means that estimates and calculates the three-dimensional coordinates of targets around the vehicle based on the estimated corresponding points; and
wherein a feature point output means is provided between the flow estimation means and the corresponding point estimation means, which outputs the feature points extracted by one of the estimation means to the other estimation means as the feature points to be used by that other estimation means.
The three-dimensional coordinate acquisition apparatus according to claim 1, wherein a calculation result exchange means is provided between the monocular stereo processing unit and the compound-eye stereo processing unit, which allows the calculated and estimated three-dimensional coordinates to be output to each other, and
the monocular stereo processing unit and the compound-eye stereo processing unit are each able to reflect the three-dimensional coordinate estimation result of the other processing unit, output by the calculation result exchange means, in the estimation result of their own processing unit.
The three-dimensional coordinate acquisition apparatus according to any one of claims 1 to 4, wherein the at least two cameras are arranged such that their optical axes point outward from each other and the fields of view of the cameras partially overlap ahead of the optical axes.
JP2006087478A 2006-03-28 2006-03-28 Three-dimensional coordinates acquisition system Pending JP2007263657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006087478A JP2007263657A (en) 2006-03-28 2006-03-28 Three-dimensional coordinates acquisition system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006087478A JP2007263657A (en) 2006-03-28 2006-03-28 Three-dimensional coordinates acquisition system

Publications (1)

Publication Number Publication Date
JP2007263657A true JP2007263657A (en) 2007-10-11

Family

ID=38636800

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006087478A Pending JP2007263657A (en) 2006-03-28 2006-03-28 Three-dimensional coordinates acquisition system

Country Status (1)

Country Link
JP (1) JP2007263657A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03273315A (en) * 1990-03-23 1991-12-04 Mazda Motor Corp Picture processor for moving vehicle
JPH0843055A (en) * 1994-07-29 1996-02-16 Canon Inc Method and apparatus for recognizing shape of three dimensional object
JP2001266160A (en) * 2000-03-22 2001-09-28 Toyota Motor Corp Method and device for recognizing periphery
JP2005024463A (en) * 2003-07-04 2005-01-27 Fuji Heavy Ind Ltd Stereo wide visual field image processing apparatus

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011112507A (en) * 2009-11-26 2011-06-09 Fujitsu Ltd Apparatus and method for three-dimensional position measurement
WO2011090053A1 (en) * 2010-01-21 2011-07-28 クラリオン株式会社 Obstacle detection warning device
JP2011149810A (en) * 2010-01-21 2011-08-04 Clarion Co Ltd Obstacle detection alarm device
JP2011163589A (en) * 2010-02-05 2011-08-25 Mitsubishi Electric Corp Guided flying object device
JP5472538B2 (en) * 2011-06-14 2014-04-16 日産自動車株式会社 Distance measuring device and environmental map generating device
WO2012172870A1 (en) 2011-06-14 2012-12-20 日産自動車株式会社 Distance measurement device and environment map generation apparatus
CN103154666B (en) * 2011-06-14 2015-03-18 日产自动车株式会社 Distance measurement device and environment map generation apparatus
US9046364B2 (en) 2011-06-14 2015-06-02 Nissan Motor Co., Ltd. Distance measurement device and environment map generation apparatus
CN103154666A (en) * 2011-06-14 2013-06-12 日产自动车株式会社 Distance measurement device and environment map generation apparatus
US9530210B2 (en) 2012-03-09 2016-12-27 Hitachi Automotive Systems, Ltd. Distance calculator and distance calculation method
WO2013132951A1 (en) 2012-03-09 2013-09-12 日立オートモティブシステムズ株式会社 Distance calculation device and distance calculation method
WO2013132947A1 (en) 2012-03-09 2013-09-12 日立オートモティブシステムズ株式会社 Distance calculation device and distance calculation method
US20150015673A1 (en) * 2012-03-09 2015-01-15 Hitachi Automotive Systems, Ltd. Distance calculator and distance calculation method
EP2824416A4 (en) * 2012-03-09 2015-10-21 Hitachi Automotive Systems Ltd Distance calculation device and distance calculation method
JP2014238409A (en) * 2014-07-23 2014-12-18 日立オートモティブシステムズ株式会社 Distance calculation device and distance calculation method
JP2016142577A (en) * 2015-01-30 2016-08-08 株式会社リコー Image processing device, image processing method, image processing system, and program
JP2018528631A (en) * 2015-12-10 2018-09-27 グーグル エルエルシー Stereo autofocus
JP2017227966A (en) * 2016-06-20 2017-12-28 株式会社東海理化電機製作所 Gesture recognition device
JP2018022255A (en) * 2016-08-02 2018-02-08 株式会社日立製作所 Image processing device and image processing method
WO2018142493A1 (en) * 2017-01-31 2018-08-09 富士通株式会社 Image processing device, image processing method, image processing program, image capturing method, and mobile object
US10679072B2 (en) 2017-02-07 2020-06-09 Fujitsu Limited Moving-object position estimating system, information processing apparatus and moving-object position estimating method
DE112018002247T5 (en) 2017-04-28 2020-01-16 Denso Corporation HINDERNISABTASTVORRICHTUNG
DE112018002247B4 (en) 2017-04-28 2023-07-06 Denso Corporation OBSTACLE DETECTION DEVICE
US11378974B2 (en) 2017-07-24 2022-07-05 Fujitsu Limited Information processing device and recording medium recording vehicle parking support program
EP3659872A4 (en) * 2017-07-24 2020-07-15 Fujitsu Limited Vehicle parking assistance device, vehicle parking assistance program
CN110053625A (en) * 2018-01-19 2019-07-26 本田技研工业株式会社 Apart from computing device and controller of vehicle
JP2019128153A (en) * 2018-01-19 2019-08-01 本田技研工業株式会社 Distance calculation device and vehicle control device
CN110053625B (en) * 2018-01-19 2022-03-11 本田技研工业株式会社 Distance calculation device and vehicle control device
JP2019158741A (en) * 2018-03-15 2019-09-19 株式会社日立製作所 Three-dimensional image processing device, and three-dimensional image processing method
JP2019200509A (en) * 2018-05-15 2019-11-21 株式会社日立製作所 Autonomous mobile device and autonomous mobile system
WO2019220740A1 (en) * 2018-05-15 2019-11-21 株式会社日立製作所 Autonomous movement device and autonomous movement system
JP7064948B2 (en) 2018-05-15 2022-05-11 株式会社日立製作所 Autonomous mobile devices and autonomous mobile systems
JP7118776B2 (en) 2018-06-29 2022-08-16 キヤノン株式会社 IMAGING DEVICE, IMAGE PROCESSING METHOD, IMAGE PROCESSING PROGRAM AND RECORDING MEDIUM
JP2020003432A (en) * 2018-06-29 2020-01-09 キヤノン株式会社 Imaging device, image processing method, image processing program and recording medium
JP2020047059A (en) * 2018-09-20 2020-03-26 株式会社Subaru Traveling environment detector of vehicle and traveling control system
JP7232005B2 (en) 2018-09-20 2023-03-02 株式会社Subaru VEHICLE DRIVING ENVIRONMENT DETECTION DEVICE AND DRIVING CONTROL SYSTEM
US20220301208A1 (en) * 2021-03-18 2022-09-22 Kabushiki Kaisha Toshiba Distance estimation device and distance estimation method
JP2022143955A (en) * 2021-03-18 2022-10-03 株式会社東芝 Distance estimation device and distance estimation method
JP7439006B2 (en) 2021-03-18 2024-02-27 株式会社東芝 Distance estimation device and distance estimation method

Similar Documents

Publication Publication Date Title
JP2007263657A (en) Three-dimensional coordinates acquisition system
JP4814669B2 (en) 3D coordinate acquisition device
US11354891B2 (en) Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
JP6565188B2 (en) Parallax value deriving apparatus, device control system, moving body, robot, parallax value deriving method, and program
JP2010198552A (en) Driving state monitoring device
JP6407010B2 (en) Method for analyzing related images, image processing system, vehicle comprising the system, and computer program product
US20180309978A1 (en) Camera parameter set calculation method, recording medium, and camera parameter set calculation apparatus
WO2013132951A1 (en) Distance calculation device and distance calculation method
WO2015145543A1 (en) Object detection apparatus, object detection method, and mobile robot
JP2004198211A (en) Apparatus for monitoring vicinity of mobile object
JPWO2017057054A1 (en) Information processing apparatus, information processing method, and program
JP6306735B2 (en) Stereo camera device and vehicle equipped with stereo camera device
JP2014074632A (en) Calibration apparatus of in-vehicle stereo camera and calibration method
JP5007863B2 (en) 3D object position measuring device
JP6589313B2 (en) Parallax value deriving apparatus, device control system, moving body, robot, parallax value deriving method, and program
JP5299101B2 (en) Peripheral display device
JP6337504B2 (en) Image processing apparatus, moving body, robot, device control method and program
JP6455164B2 (en) Parallax value deriving apparatus, device control system, moving body, robot, parallax value deriving method, and program
JP6543935B2 (en) PARALLEL VALUE DERIVING DEVICE, DEVICE CONTROL SYSTEM, MOBILE OBJECT, ROBOT, PARALLEL VALUE DERIVING METHOD, AND PROGRAM
JP2011024079A (en) Peripheral display device
CN104697491A (en) Distance determination using a monoscopic imager in a vehicle
JP2007051976A (en) On-vehicle camera system, object position detecting system and object position detection method
JP2009104363A (en) Object identification device
JP6566367B2 (en) Three-dimensional position measurement system, moving object, three-dimensional position measurement method, and program
JP2023151059A (en) Speed estimation method and speed estimation device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080807

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110216

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110222

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110407

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20120214