WO2010061860A1 - Stereo matching process system, stereo matching process method, and recording medium - Google Patents

Stereo matching process system, stereo matching process method, and recording medium

Info

Publication number
WO2010061860A1
Authority
WO
WIPO (PCT)
Prior art keywords
line
line segment
stereo matching
same position
scanning line
Prior art date
Application number
PCT/JP2009/069887
Other languages
French (fr)
Japanese (ja)
Inventor
小泉 博一
神谷 俊之
弘之 柳生
Original Assignee
NEC System Technologies, Ltd. (Necシステムテクノロジー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC System Technologies, Ltd.
Priority to KR1020117011938A (KR101453143B1)
Priority to CN200980146982.2A (CN102224523B)
Publication of WO2010061860A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04: Interpretation of pictures
    • G01C11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/593: Depth or shape recovery from multiple images from stereo images
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G06T2207/10012: Stereo images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

When an operator inputs an auxiliary line (A2) on the left image and an auxiliary line (A1) on the right image in association with each other, and likewise inputs an auxiliary line (B2) on the left image in association with an auxiliary line (B1) on the right image, the stereo matching process system (1) associates, on a search plane, the coordinate (a2) where a scanning line on the left image intersects the auxiliary line (A2) with the coordinate (a1) where the scanning line on the right image intersects the auxiliary line (A1). The system (1) likewise associates, on the search plane, the coordinate (b2) where the scanning line on the left image intersects the auxiliary line (B2) with the coordinate (b1) where the scanning line on the right image intersects the auxiliary line (B1). In this way, the stereo matching process system (1) can correct an erroneous correspondence on the search plane and accurately associate the same position in the left image and the right image.

Description

Stereo matching processing system, stereo matching processing method, and recording medium
 The present invention relates to a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium storing a program, and in particular to a stereo matching processing system, a stereo matching processing method, and a recording medium capable of accurately associating the same position across a plurality of images.
 In automatic three-dimensional data generation, a widely used method generates three-dimensional data representing terrain [DSM (Digital Surface Model) data] by stereo matching, based on images obtained from satellites, aircraft, and the like. Methods have also been proposed in which an operator intervenes to correct points for which no correspondence could be established.
 Here, stereo matching processing means finding, for two images taken from different viewpoints (a so-called stereo pair), the corresponding points in each image that depict the same physical point, and then determining the depth and shape of the object from the resulting parallax by the principle of triangulation.
 Various techniques have already been proposed for this stereo matching processing. For example, in the widely used area correlation method, a correlation window is set in the left image and used as a template; a search window is moved over the right image, the cross-correlation coefficient with the template is computed and treated as a degree of matching, and corresponding points are obtained by searching for positions with high correlation (see, for example, Patent Document 1).
 In this method, to reduce the amount of processing, the movement range of the search window is restricted to the epipolar line direction in the image, so that for each point in the left image, the x-direction displacement of the corresponding point in the right image, that is, the parallax, can be obtained. Here, an epipolar line is the straight line that can be drawn in one image of a stereo pair as the locus on which the point corresponding to a given point in the other image must lie (see, for example, Non-Patent Document 1).
 Usually the epipolar line direction differs from the scanning line direction of the image, but a coordinate transformation can make the epipolar line direction coincide with the scanning line direction and rearrange the pixels accordingly. This coordinate transformation is described in Non-Patent Document 1.
 In a stereo pair rearranged in this way, the search window for a corresponding point can be confined to a single scanning line, so the parallax is obtained as the difference between the x coordinates of corresponding points in the left and right images.
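 The patent text stops at this observation, but the mechanics are easy to show. The following is a minimal sketch, not taken from the patent itself, of area correlation restricted to one scanning line of a rectified pair: a window around a left-image pixel serves as the template, candidate windows are scanned along the same row of the right image, and the x-coordinate difference at the best normalized cross-correlation gives the parallax. All names and parameter values are illustrative.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def scanline_parallax(left, right, x, y, half=5, max_shift=64):
    """Parallax of left-image pixel (x, y) on a rectified pair: the search
    window moves only along the same scanning line of the right image."""
    template = left[y - half:y + half + 1, x - half:x + half + 1]
    best_score, best_x = -2.0, x
    for xr in range(max(half, x - max_shift), x + 1):  # candidates on row y
        window = right[y - half:y + half + 1, xr - half:xr + half + 1]
        score = ncc(template, window)
        if score > best_score:
            best_score, best_x = score, xr
    return x - best_x  # parallax = difference of the x coordinates
```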
Patent Document 1: Japanese Examined Patent Publication No. H8-16930
 However, three-dimensional data generated by such stereo matching methods includes regions without texture and regions where no correspondence can be obtained from the correlation coefficient, so the resulting three-dimensional data contains many points whose heights are erroneous and differ greatly from their surroundings. In particular, occlusion occurs around buildings and similar structures, so many points cannot be matched, and the data may show extremely high values or large gaps where structures are missing.
 Consequently, such stereo matching methods suffer from errors caused by mismatched corresponding points and cannot produce highly accurate three-dimensional information, which makes them difficult to apply to complex images with many buildings, such as urban areas.
 The present invention has been made to solve the above problems, and its object is to provide a stereo matching processing system, a stereo matching processing method, and a recording medium that can accurately associate regions indicating the same position across a plurality of images.
 To achieve the above object, a stereo matching processing system according to a first aspect of the present invention comprises: an association unit that, in a plurality of images obtained by photographing the same object from different directions, associates regions where the correlation coefficient is maximized on the same scanning line as indicating the same position; and a line segment determination unit that determines whether line segments associated as indicating the same position have been drawn on each of the plurality of images. When the line segment determination unit determines that such line segments have been drawn, the association unit associates the intersections of the scanning line with the line segments, rather than the regions with the maximum correlation coefficient on the same scanning line, as indicating the same position.
 A stereo matching processing method according to a second aspect of the present invention comprises: an association step of, in a plurality of images obtained by photographing the same object from different directions, associating regions where the correlation coefficient is maximized on the same scanning line as indicating the same position; and a line segment determination step of determining whether line segments associated as indicating the same position have been drawn on each of the plurality of images. When the line segment determination step determines that such line segments have been drawn, the association step associates the intersections of the scanning line with the line segments, rather than the regions with the maximum correlation coefficient on the same scanning line, as indicating the same position.
 Furthermore, a computer-readable recording medium according to a third aspect of the present invention records a program that causes a computer to execute: an association procedure of, in a plurality of images obtained by photographing the same object from different directions, associating regions where the correlation coefficient is maximized on the same scanning line as indicating the same position; and a line segment determination procedure of determining whether line segments associated as indicating the same position have been drawn on each of the plurality of images. When the line segment determination procedure determines that such line segments have been drawn, the association procedure associates the intersections of the scanning line with the line segments, rather than the regions with the maximum correlation coefficient on the same scanning line, as indicating the same position.
 According to the present invention, it is possible to provide a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium storing a program, with which the same position can be accurately associated across a plurality of images.
FIG. 1 is a block diagram showing a configuration example of a stereo matching processing system.
FIG. 2 is a diagram showing a display example of the auxiliary line input screen.
FIG. 3A is a diagram showing a display example of a stereo matching result.
FIG. 3B is a diagram showing a display example of a stereo matching result.
FIG. 4 is a diagram for explaining the DP matching process.
FIG. 5 is a diagram illustrating a search plane.
FIG. 6 is a diagram for explaining the stereo matching process.
FIG. 7 is a flowchart showing an example of the auxiliary line input process.
FIG. 8 is a flowchart showing an example of the stereo matching process.
FIG. 9A is a diagram illustrating a search plane.
FIG. 9B is a diagram illustrating a search plane.
FIG. 10 is a diagram for explaining the improvement of the association using auxiliary lines.
FIG. 11 is a diagram showing a search plane in a modification.
FIG. 12 is a diagram for explaining the improvement of the association using auxiliary lines in a modification.
FIG. 13 is a diagram showing a search plane in a modification.
 Next, the best mode for carrying out the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing a configuration example of a stereo matching processing system according to an embodiment of the present invention. The stereo matching processing system 1 is implemented on, for example, a general-purpose computer and, as shown in FIG. 1, includes a display unit 10, an auxiliary line input unit 11, a relative orientation unit 12, a stereo matching unit 13, and an ortho processing / ground orientation unit 14.
 The display unit 10 is configured of, for example, an LCD (Liquid Crystal Display). FIG. 2 shows a display example of the auxiliary line input screen, and FIGS. 3A and 3B show display examples of stereo matching results. The display unit 10 displays the auxiliary line input screen of FIG. 2, which contains two aerial photograph images obtained by photographing the same object from different directions (hereinafter, the left image and the right image), as well as the stereo matching results for the left and right images shown in FIGS. 3A and 3B.
 The auxiliary line input unit 11 includes, for example, a keyboard and a mouse, and is used when the operator draws auxiliary lines on the left and right images in the auxiliary line input screen displayed on the display unit 10. Here, an auxiliary line is a line segment drawn by the operator to associate locations that occupy the same position in the left image and the right image.
 The relative orientation unit 12 is realized by, for example, a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), and a hard disk drive, and executes an orientation process that determines the camera parameters at the time of shooting, and a rectification process that reprojects the left and right images onto a common parallel plane as needed. Here, orientation means determining predetermined values for the object of evaluation.
 Specifically, in the orientation process, the relative orientation unit 12 reads the coordinate values of an object that appears in both the left image and the right image and uses the two read coordinate values to determine the camera parameters at the time of shooting, such as the camera rotation angle between the left and right images. In this way, the relative orientation unit 12 can determine camera parameters that are normally difficult to obtain, even when aerial photographs are taken from a direction close to vertical, because of attitude changes of the aircraft and similar effects.
 The relative orientation unit 12 then executes the rectification process, reprojecting the left and right images onto a common parallel plane so that the epipolar line connecting the epipoles of the left and right images coincides with one of the scanning lines.
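 The patent does not spell out this reprojection; as one concrete realization, OpenCV's stereo rectification performs the same step for calibrated cameras, remapping both images onto a common parallel plane so that epipolar lines become horizontal scanning lines. A hedged sketch, assuming the camera matrices, distortion vectors, and the relative rotation and translation are already known from an orientation step:

```python
import cv2

# K1, K2: 3x3 camera matrices; d1, d2: distortion vectors;
# R, T: rotation and translation of camera 2 relative to camera 1.
# All inputs are assumed to come from a prior relative-orientation step.
def rectify_pair(img_l, img_r, K1, d1, K2, d2, R, T):
    size = (img_l.shape[1], img_l.shape[0])  # (width, height)
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
    map_lx, map_ly = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
    map_rx, map_ry = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)
    left = cv2.remap(img_l, map_lx, map_ly, cv2.INTER_LINEAR)
    right = cv2.remap(img_r, map_rx, map_ry, cv2.INTER_LINEAR)
    return left, right, Q  # Q can reproject disparities to 3-D later
```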
 The stereo matching unit 13 is realized by, for example, a CPU, ROM, RAM, and a hard disk drive. The RAM holds an auxiliary line input flag indicating that auxiliary lines have been input on the left and right images, an intersection coordinate buffer that stores the coordinates of the intersections between the auxiliary lines and the scanning lines, and so on.
 The stereo matching unit 13 applies stereo matching processing, specifically DP (Dynamic Programming) matching, to the left and right images rectified by the relative orientation unit 12 (the rectified image pair), measuring the positional displacement (parallax) between the left image and the right image and generating parallax data.
 FIG. 4 is a diagram for explaining the DP matching process. Specifically, when no auxiliary lines have been input on the left and right images through the operator's use of the auxiliary line input unit 11, the stereo matching unit 13 computes, as shown in FIG. 4, the correlation between the left image and the right image along the same scanning line and searches for the lattice regions where the correlation coefficient is maximized; one common DP formulation is sketched below.
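 The patent treats the DP formulation as standard, so the following sketch shows one common scanline DP rather than the patent's exact cost: matched pixels pay a dissimilarity cost (squared intensity difference here, in place of the window correlation, for brevity), skipped pixels pay a fixed occlusion penalty, and backtracking the minimum-cost path yields the (x1, x2) correspondences that would be plotted on the search plane. Names and the penalty value are illustrative.

```python
import numpy as np

def dp_match_scanline(row_l, row_r, occ=10.0):
    """Align one left scanline with one right scanline by dynamic programming.
    Returns (x1, x2) pairs: x1 in the right image, x2 in the left image."""
    n, m = len(row_l), len(row_r)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, :] = occ * np.arange(m + 1)   # leading occlusions
    cost[:, 0] = occ * np.arange(n + 1)
    move = np.zeros((n + 1, m + 1), dtype=np.uint8)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (float(row_l[i - 1]) - float(row_r[j - 1])) ** 2  # dissimilarity
            choices = (cost[i - 1, j - 1] + d,    # match
                       cost[i - 1, j] + occ,      # pixel occluded in right
                       cost[i, j - 1] + occ)      # pixel occluded in left
            k = int(np.argmin(choices))
            move[i, j] = k
            cost[i, j] = choices[k]
    # Trace the optimal path back to recover the matched pairs.
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        if move[i, j] == 0:
            pairs.append((j - 1, i - 1))          # (x1 right, x2 left)
            i, j = i - 1, j - 1
        elif move[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]
```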
 FIG. 5 illustrates the search plane. The stereo matching unit 13 associates the center coordinates of the maximum-correlation lattice regions on the search plane shown in FIG. 5. On this search plane, the horizontal axis x1 is the x coordinate in the right image and the vertical axis x2 is the x coordinate in the left image.
 The stereo matching unit 13 performs this association for each scanning line, generating parallax data, and displays an image like that shown in FIG. 3A on the display unit 10 as the stereo matching result.
 By contrast, when auxiliary lines have been input on the left and right images through the operator's use of the auxiliary line input unit 11, the stereo matching unit 13 associates on the search plane, at locations where auxiliary lines have been input, the coordinates of the intersections between the scanning line and the auxiliary lines, and, at locations where no auxiliary line has been input, the coordinates of the center positions of the maximum-correlation lattice regions.
 FIG. 6 is a diagram for explaining the stereo matching process. As shown in FIG. 6, the stereo matching unit 13 performs this association for each scanning line, generating parallax data corrected by the auxiliary lines, and displays an image like that shown in FIG. 3B on the display unit 10 as the stereo matching result.
 Thereafter, the stereo matching unit 13 combines the generated parallax data with the camera parameters determined by the relative orientation unit 12, computes the position in a three-dimensional coordinate system corresponding to each pixel by the principle of triangulation, and extracts DSM (Digital Surface Model) data including elevation data indicating the height of the surface layer of the object, thereby determining the depth and shape of the object.
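 On a rectified pair this triangulation reduces to the standard depth-from-parallax relation Z = f * B / d, where f is the focal length in pixels, B the baseline between the two exposure stations, and d the parallax. A minimal sketch with illustrative numbers (not values from the patent):

```python
def depth_from_parallax(f_px, baseline_m, parallax_px):
    """Triangulated depth on a rectified pair: Z = f * B / d."""
    if parallax_px <= 0:
        raise ValueError("no valid correspondence at this pixel")
    return f_px * baseline_m / parallax_px

# Illustrative values only: f = 1000 px, B = 500 m, d = 250 px -> Z = 2000 m.
print(depth_from_parallax(1000, 500.0, 250))  # 2000.0
```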
 The ortho processing / ground orientation unit 14 is realized by, for example, a CPU, ROM, RAM, and a hard disk drive. Using the DSM data, it performs ortho processing, which orthographically transforms the aerial photograph image and the DSM data, and then, using the orthorectified aerial photograph image and DSM data, performs ground orientation processing to determine the exact coordinates of the object relative to the ground surface, specifically its longitude and latitude, thereby generating an ortho image and ortho DSM data.
 Here, the ortho image includes color data together with latitude data and longitude data indicating the latitude and longitude obtained by the ground orientation. The ortho DSM data includes elevation data indicating the height of the surface layer of the object, together with latitude data and longitude data. Obtaining such latitude and longitude data makes it possible to associate the position of the same object across aerial photograph images taken at different times.
 Next, the processing executed by the stereo matching processing system configured as described above will be explained with reference to the drawings.
 In the stereo matching processing system, the auxiliary line input process and the stereo matching process are executed periodically. Both processes are executed at arbitrary timing, for example when there is an instruction from the operator, when a given image is available, or when a predetermined time arrives.
 FIG. 7 is a flowchart showing the details of the auxiliary line input process. In this process, the stereo matching unit 13 determines whether auxiliary lines have been input on both the left image and the right image through the operator's use of the auxiliary line input unit 11 on the auxiliary line input screen displayed on the display unit 10 (step S11). If no auxiliary line has been input (step S11; No), the stereo matching unit 13 ends the auxiliary line input process as it is.
 If it determines in step S11 that auxiliary lines have been input (step S11; Yes), the stereo matching unit 13 computes, for each of the left and right images, the coordinates of the intersections between the scanning lines and the auxiliary lines and stores them in the intersection coordinate buffer provided in the RAM (step S12).
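 Step S12 amounts to intersecting each auxiliary line segment with a horizontal scanning line; a small sketch of that computation (names illustrative):

```python
def scanline_intersection_x(p0, p1, y):
    """x coordinate where segment p0-p1 crosses the scanning line at row y,
    or None. p0 and p1 are the (x, y) endpoints of an auxiliary line."""
    (x0, y0), (x1, y1) = p0, p1
    if y0 == y1:                      # segment parallel to the scanning line
        return None
    t = (y - y0) / (y1 - y0)
    if not 0.0 <= t <= 1.0:           # scanning line misses the segment
        return None
    return x0 + t * (x1 - x0)
```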
 The stereo matching unit 13 then sets the auxiliary line input flag provided in the RAM to on (step S13) and ends the auxiliary line input process.
 FIG. 8 is a flowchart showing the details of the stereo matching process. In this process, the relative orientation unit 12 executes the orientation process to determine the camera parameters at the time of shooting (step S21) and executes the rectification process, reprojecting the left and right images onto a common parallel plane so that the epipolar line coincides with one of the scanning lines (step S22).
 Next, the stereo matching unit 13 determines whether auxiliary lines have been input on the left and right images by checking whether the auxiliary line input flag in the RAM is on (step S23).
 If it determines in step S23 that no auxiliary line has been input (step S23; No), the stereo matching unit 13 generates parallax data without correction by auxiliary lines (step S24).
 In step S24, the stereo matching unit 13 computes the correlation between the left and right images along the same scanning line and searches for the lattice regions where the correlation coefficient is maximized. It then associates the center coordinates of these maximum-correlation lattice regions on the search plane. By performing this association for each scanning line, the stereo matching unit 13 generates parallax data without correction by auxiliary lines.
 If, on the other hand, it determines in step S23 that auxiliary lines have been input (step S23; Yes), the stereo matching unit 13 generates parallax data corrected by the auxiliary lines (step S25).
 In step S25, at locations where auxiliary lines have been input, the stereo matching unit 13 associates on the search plane the intersection coordinates of the scanning line and the auxiliary lines stored in the intersection coordinate buffer in the RAM; at locations where no auxiliary line has been input, it associates on the search plane the coordinates of the center positions of the maximum-correlation lattice regions. By performing this association for each scanning line, the stereo matching unit 13 generates parallax data corrected by the auxiliary lines.
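 Read as code, step S25 can be organized as follows: on each scanning line, the buffered intersection pairs act as fixed anchors on the search plane, and correlation-based matching fills only the intervals between consecutive anchors. This decomposition is an assumption made for illustration, and `match_interval` is a hypothetical callback standing in for any maximum-correlation matcher, such as the DP sketch above.

```python
def associate_scanline(row_l, row_r, constraints, match_interval):
    """constraints: (x1_right, x2_left) intersection pairs on this scanning
    line, as stored in the intersection coordinate buffer."""
    anchors = sorted(constraints)
    pairs, prev = [], (-1, -1)
    for a1, a2 in anchors + [(len(row_r), len(row_l))]:
        # ordinary maximum-correlation matching between consecutive anchors
        pairs += match_interval(row_l, row_r,
                                x1_range=(prev[0] + 1, a1),
                                x2_range=(prev[1] + 1, a2))
        if a1 < len(row_r):
            pairs.append((a1, a2))   # forced: intersection coordinates
        prev = (a1, a2)
    return pairs
```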
 Then, the stereo matching unit 13 displays an image based on the parallax data generated in step S24 or S25 on the display unit 10 as the stereo matching result (step S26) and, using the parallax data and the camera parameters determined in step S21, extracts DSM data including elevation data indicating the height of the surface layer of the object (step S27).
 Subsequently, the ortho processing / ground orientation unit 14 executes ortho processing using the DSM data extracted in step S27, orthographically transforming the aerial photograph image and the DSM data (step S28).
 The ortho processing / ground orientation unit 14 then executes ground orientation processing using the orthorectified DSM data from step S28, generating an ortho image and ortho DSM data including elevation data indicating the height of the surface layer of the object (step S29).
 Next, the operation of the stereo matching processing system executing the above processing will be described with a concrete example.
 First, when no auxiliary line has been input, the auxiliary line input flag is found to be off in step S23 of FIG. 8, so in step S24 the center coordinates of the lattice regions with the maximum correlation coefficient are associated on the search plane.
 FIGS. 9A and 9B illustrate search planes, and FIG. 10 is a diagram for explaining the improvement of the association using auxiliary lines. If an incorrect association has been made on the search plane in step S24, as shown in FIG. 9A, the operator uses the auxiliary line input unit 11 to input, as shown in FIG. 10, auxiliary line A2 on the left image in association with auxiliary line A1 on the right image, and auxiliary line B2 on the left image in association with auxiliary line B1 on the right image.
 Then, in step S12 of FIG. 7, the x coordinates a1, a2, b1, and b2 of the intersections between the scanning line and the auxiliary lines A1, A2, B1, and B2 are computed, and in step S13 the auxiliary line input flag is set to on.
 This time the auxiliary line input flag is found to be on in step S23 of FIG. 8, so in step S25, as shown in FIG. 9B, the coordinate a2 of the intersection of the scanning line with auxiliary line A2 on the left image is associated on the search plane with the coordinate a1 of the intersection of the scanning line with auxiliary line A1 on the right image, and the coordinate b2 of the intersection of the scanning line with auxiliary line B2 on the left image is associated with the coordinate b1 of the intersection of the scanning line with auxiliary line B1 on the right image.
 As a result, the stereo matching processing system 1 can correct incorrect associations on the search plane and accurately associate the same position between the left image and the right image.
 As described above, according to the stereo matching processing system 1 of this embodiment, correction by auxiliary lines improves incorrect associations on the search plane, so the parallax data obtained as the stereo matching result can be corrected. By extracting DSM data from the corrected parallax data, the stereo matching processing system 1 can obtain elevation data that accurately indicates the height of the surface layer of the object.
 The present invention is not limited to the above embodiment, and various modifications and applications are possible. Modifications of the embodiment applicable to the present invention are described below.
 In the above embodiment, in step S25 of FIG. 8, the intersection coordinates a2 and a1, and likewise b2 and b1, are associated on the search plane, and between them the center coordinates of the maximum-correlation lattice regions are associated on the search plane. However, the present invention is not limited to this. FIG. 11 shows a search plane in a modification. As shown in FIG. 11, all points on the line segment connecting coordinates (a1, a2) and (b1, b2) on the search plane may be associated with each other, as sketched below.
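 A sketch of this variant: every pixel step between the two constrained points is associated by linear interpolation, which forces the matching path onto the straight search-plane segment from (a1, a2) to (b1, b2). Names are illustrative.

```python
import numpy as np

def forced_segment_pairs(a, b):
    """All points on the search-plane segment from a=(a1, a2) to b=(b1, b2)."""
    (a1, a2), (b1, b2) = a, b
    n = max(abs(b1 - a1), abs(b2 - a2))  # one sample per pixel step
    x1 = np.rint(np.linspace(a1, b1, n + 1)).astype(int)  # right-image x
    x2 = np.rint(np.linspace(a2, b2, n + 1)).astype(int)  # left-image x
    return list(zip(x1.tolist(), x2.tolist()))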
 Whether to input an auxiliary line that enforces such an association (a forced auxiliary line) may be decided by the operator's operation of the auxiliary line input unit 11.
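 A minimal sketch of this modified association, assuming the search plane is indexed by right-image x on one axis and left-image x on the other, and that integer right-image positions between the two forced points are interpolated linearly; the coordinate values are hypothetical.

```python
def forced_segment_pairs(a1, a2, b1, b2):
    """Associate every integer right-image x between a1 and b1 with a
    left-image x found by linear interpolation along the search-plane
    segment from (a1, a2) to (b1, b2)."""
    lo, hi = sorted((a1, b1))
    pairs = []
    for xr in range(int(round(lo)), int(round(hi)) + 1):
        t = (xr - a1) / (b1 - a1) if b1 != a1 else 0.0
        pairs.append((xr, a2 + t * (b2 - a2)))
    return pairs

# First few forced (right x, left x) pairs for hypothetical intersections
print(forced_segment_pairs(104.5, 119.0, 240.6, 261.1)[:3])
```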
 In the above embodiment, the auxiliary lines were described as being drawn perpendicular to the scanning line, but the present invention is not limited to this. FIG. 12 illustrates the improvement of the association using an auxiliary line in a modification, and FIG. 13 shows the search plane in that modification. As shown in FIG. 12, the auxiliary line may instead be drawn parallel to the scanning line. When the auxiliary line is drawn parallel to the scanning line, the points that divide the auxiliary line (line segment) in each of the left image and the right image into n equal parts (n is a natural number) may be associated with each other on the search plane, in order from the start point of the line segment to its end point, as shown in FIG. 13.
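 The equal-division pairing just described might look like the following sketch; the segment endpoints are hypothetical and n is chosen arbitrarily.

```python
def equal_division_pairs(left_seg, right_seg, n):
    """Divide each auxiliary line segment into n equal parts (n a natural
    number) and pair the i-th division point of the left segment with the
    i-th division point of the right segment, start point to end point."""
    (lx0, ly0), (lx1, ly1) = left_seg
    (rx0, ry0), (rx1, ry1) = right_seg
    pairs = []
    for i in range(n + 1):
        t = i / n
        pairs.append(((lx0 + t * (lx1 - lx0), ly0 + t * (ly1 - ly0)),
                      (rx0 + t * (rx1 - rx0), ry0 + t * (ry1 - ry0))))
    return pairs

# Hypothetical horizontal auxiliary lines on the scanning line y = 200
print(equal_division_pairs(((118.0, 200.0), (262.0, 200.0)),
                           ((104.0, 200.0), (241.0, 200.0)), 4))
```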
 In the above embodiment, it was explained that, in the orientation processing, the relative orientation unit 12 reads the coordinate values of an object appearing in both the left image and the right image, and uses the two read coordinate values to determine the camera parameters at the time of photographing, such as the camera rotation angle between the left image and the right image. However, the present invention is not limited to this; any method may be used to determine the camera parameters at the time of photographing. For example, the camera parameters may be determined using values calculated by a plotting program.
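 Since the patent leaves the orientation method open, the sketch below shows one widely used alternative, five-point relative pose estimation with RANSAC as implemented by OpenCV's findEssentialMat and recoverPose, purely for illustration; the tie points and the intrinsic matrix K are synthetic stand-ins, not values from the patent.

```python
import numpy as np
import cv2  # OpenCV's five-point solver; one common choice, not the patent's

rng = np.random.default_rng(0)

# Synthetic tie points standing in for coordinates read from both images:
# for pure x translation, x_right = x_left - f*B/Z with per-point depth Z,
# so this fake data is consistent with a rigid camera motion.
pts_left = rng.uniform(100.0, 900.0, size=(30, 2))
parallax = 2000.0 / rng.uniform(20.0, 40.0, size=(30, 1))
pts_right = pts_left - np.hstack([parallax, np.zeros((30, 1))])

K = np.array([[1500.0, 0.0, 500.0],     # assumed camera intrinsics
              [0.0, 1500.0, 500.0],
              [0.0, 0.0, 1.0]])

E, _ = cv2.findEssentialMat(pts_left, pts_right, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts_left, pts_right, K)
print(R)   # close to identity: no rotation between the synthetic views
print(t)   # unit baseline direction, close to the x axis
```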
 In the above embodiment, it was explained that the ortho processing and ground orientation unit 14 performs ground orientation, which obtains the longitude and latitude of an object, using the orthorectified aerial photograph image and the DSM data. However, the present invention is not limited to this; any method may be used for ground orientation. For example, a conversion formula from the image coordinates of a plurality of points on an aerial photograph image, whose longitude, latitude, and elevation values have been determined in advance, to the ground coordinates (longitude, latitude, and elevation values) of the ground surface may be obtained.
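 One simple way to realize such a conversion formula is an affine least-squares fit from image coordinates to ground coordinates, as sketched below with made-up control points; rigorous photogrammetric practice would use a collinearity model, so this is an illustration only.

```python
import numpy as np

# Hypothetical control points: image (col, row) -> ground (lon, lat, elev),
# standing in for points whose ground coordinates were surveyed in advance.
img = np.array([[100.0, 200.0], [800.0, 220.0],
                [450.0, 900.0], [120.0, 850.0]])
gnd = np.array([[135.001, 35.002, 12.3], [135.009, 35.002, 14.1],
                [135.005, 34.995, 11.8], [135.001, 34.996, 13.0]])

# Fit the conversion [col row 1] @ M = [lon lat elev] by least squares.
A = np.hstack([img, np.ones((len(img), 1))])
M, *_ = np.linalg.lstsq(A, gnd, rcond=None)

def image_to_ground(col, row):
    """Apply the fitted affine conversion to an arbitrary image point."""
    return np.array([col, row, 1.0]) @ M

print(image_to_ground(300.0, 400.0))
```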
 Alternatively, aerial triangulation data in which longitude, latitude, and elevation have been surveyed by taking aerial photographs with anti-air signs (signalized ground targets) imaged in them may be used; this makes it possible to obtain the ground coordinates at arbitrary coordinates on the image. Here, an anti-air sign is a target whose shape can be clearly recognized on the aerial photograph image when the photograph is taken from an aircraft by various sensors, and whose image coordinates can be measured. Accurate three-dimensional coordinates are therefore available at the points where the anti-air signs are placed.
 In the above embodiment, the ortho image was described as containing color data, latitude data, and longitude data, and the ortho DSM data as containing elevation data, latitude data, and longitude data. However, the present invention is not limited to this; the ortho image and the ortho DSM data may contain coordinate value data expressed in another coordinate system instead of the latitude data and the longitude data, and may contain height data indicating a relative height from another reference instead of the elevation value data.
 The present application claims priority based on Japanese Patent Application No. 2008-300103, and the specification, claims, and all drawings of Japanese Patent Application No. 2008-300103 are incorporated herein by reference.
DESCRIPTION OF REFERENCE SYMBOLS
 1 stereo matching processing system
10 display unit
11 auxiliary line input unit
12 relative orientation unit
13 stereo matching unit
14 ortho processing and ground orientation unit

Claims (6)

  1.  A stereo matching processing system comprising:
     an associating unit that, in a plurality of images obtained by photographing the same object from different directions, associates regions in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and
     a line segment determination unit that determines whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images,
     wherein, when the line segment determination unit determines that the line segments have been drawn, the associating unit associates the intersections of the scanning line and the line segments with each other as indicating the same position, instead of the regions in which the correlation coefficient is maximum on the same scanning line.
  2.  The stereo matching processing system according to claim 1, wherein, when a plurality of the line segments are drawn on each of the plurality of images, the associating unit associates line segments connecting the intersections of the scanning line and the line segments with each other as indicating the same position.
  3.  The stereo matching processing system according to claim 2, wherein, when the line segment parallel to the scanning line is drawn on each of the plurality of images, the associating unit associates the start points of the line segments with each other and the end points of the line segments with each other, each pair as indicating the same position.
  4.  The stereo matching processing system according to claim 3, wherein, when the line segment parallel to the scanning line is drawn on each of the plurality of images, the associating unit associates the equal-division points that divide the line segments into a predetermined number of equal parts with each other, in order from the start points, each pair as indicating the same position.
  5.  A stereo matching processing method comprising:
     an associating step of, in a plurality of images obtained by photographing the same object from different directions, associating regions in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and
     a line segment determination step of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images,
     wherein, when it is determined in the line segment determination step that the line segments have been drawn, the associating step associates the intersections of the scanning line and the line segments with each other as indicating the same position, instead of the regions in which the correlation coefficient is maximum on the same scanning line.
  6.  A computer-readable recording medium recording a program for causing a computer to execute:
     an associating procedure of, in a plurality of images obtained by photographing the same object from different directions, associating regions in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and
     a line segment determination procedure of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images,
     wherein, when it is determined by the line segment determination procedure that the line segments have been drawn, the associating procedure associates the intersections of the scanning line and the line segments with each other as indicating the same position, instead of the regions in which the correlation coefficient is maximum on the same scanning line.
Kind code of ref document: A1