WO2010061860A1 - Stereo matching process system, stereo matching process method, and recording medium - Google Patents
- Publication number
- WO2010061860A1 (PCT/JP2009/069887)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- line segment
- stereo matching
- same position
- scanning line
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
Definitions
- The present invention relates to a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium recording a program, which are capable of accurately associating the same position among a plurality of images.
- A method of generating three-dimensional data [DSM (Digital Surface Model) data] representing topography by stereo matching, based on images obtained from a satellite or aircraft, is widely used.
- In stereo matching processing, corresponding points that capture the same point are determined between two images photographed from different viewpoints, a so-called stereo image pair, and the depth and shape of the object are determined from the parallax using the principle of triangulation.
- Since the movement range of the search window is limited to the epipolar line direction in the image, for each point in the left image the displacement in the x direction of the corresponding point in the right image, that is, the parallax, can be obtained.
- For a given point in one image of a stereo pair, the epipolar line is the straight line that can be drawn in the other image as the range in which the corresponding point exists (see, for example, Non-Patent Document 1).
- When the epipolar line direction differs from the scanning line direction of the image, the images can be rearranged by a coordinate conversion that makes the epipolar line direction coincide with the scanning line direction.
- the method of this coordinate conversion is described in the above non-patent document 1.
- After this conversion, the movement range of the search window for corresponding points can be limited to the scanning line, so the parallax is obtained as the difference between the x coordinates of the corresponding points in the left and right images.
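As an illustrative sketch (not part of the patent), the scanline search described above, in which the parallax is the difference of x coordinates along a rectified scanning line, could look as follows; the window size, the maximum disparity, and the use of a normalized-correlation score are all assumptions made for the example:

```python
import numpy as np

def scanline_disparity(left_row, right_row, window=5, max_disp=16):
    """For each pixel on a rectified scanline, find the horizontal shift that
    maximizes the normalized correlation of a 1-D window, and return it as the
    parallax (difference of x coordinates between left and right images)."""
    half = window // 2
    n = len(left_row)
    disparity = np.zeros(n)
    for x in range(half, n - half):
        patch = left_row[x - half:x + half + 1]
        best_d, best_score = 0, -np.inf
        for d in range(0, max_disp + 1):
            xr = x - d
            if xr - half < 0:
                break
            cand = right_row[xr - half:xr + half + 1]
            # zero-mean normalized correlation as a simple matching score
            a0 = patch - patch.mean()
            b0 = cand - cand.mean()
            denom = np.linalg.norm(a0) * np.linalg.norm(b0)
            score = np.dot(a0, b0) / denom if denom > 0 else -1.0
            if score > best_score:
                best_score, best_d = score, d
        disparity[x] = best_d
    return disparity
```

A 2-D implementation would correlate small rectangular windows instead of 1-D segments, but the restriction of the search to one scanline is the same.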
- However, since the related stereo-matching 3D data generation method encounters areas without texture and areas where a correlation coefficient cannot be obtained, the image of the above 3D data contains many points that erroneously indicate a height different from their surroundings. In particular, since concealment (occlusion) occurs around structures and the like, there are many points that cannot be associated, which may show extremely high values or cause a large loss of the structure.
- The present invention has been made to solve the above problems, and is intended to provide a stereo matching processing system, a stereo matching processing method, and a recording medium that can accurately associate areas indicating the same position among a plurality of images.
- A stereo matching processing system according to a first aspect of the present invention includes an associating unit that, in a plurality of images obtained by photographing the same object from different directions, associates areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position, and a line segment determination unit that determines whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images. When the line segment determination unit determines that the line segments have been drawn, the associating unit associates the intersections of the scanning line and the line segments with each other as indicating the same position, instead of the areas in which the correlation coefficient is maximum on the same scanning line.
- A stereo matching processing method according to a second aspect of the present invention includes an associating step of associating, in a plurality of images obtained by photographing the same object from different directions, areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position, and a line segment determination step of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images. In the associating step, when it is determined in the line segment determination step that the line segments have been drawn, the intersections of the scanning line and the line segments, instead of the areas in which the correlation coefficient is maximum on the same scanning line, are associated with each other as indicating the same position.
- A computer-readable recording medium according to a third aspect of the present invention records a program for causing a computer to execute an associating procedure of associating, in a plurality of images obtained by photographing the same object from different directions, areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position, and a line segment determination procedure of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images. In the associating procedure, when it is determined in the line segment determination procedure that the line segments have been drawn, the intersections of the scanning line and the line segments are associated with each other as indicating the same position, instead of the areas in which the correlation coefficient is maximum on the same scanning line.
- According to the present invention, it is possible to provide a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium having a program recorded thereon, in which the same position can be accurately associated among a plurality of images.
- FIG. 1 is a block diagram showing an example of the configuration of a stereo matching processing system according to an embodiment of the present invention.
- The stereo matching processing system 1 includes, for example, a general-purpose computer and, as shown in FIG. 1, comprises a display unit 10, an auxiliary line input unit 11, a relative orientation unit 12, a stereo matching unit 13, and an ortho processing / ground orientation unit 14.
- The display unit 10 is configured by, for example, a liquid crystal display (LCD).
- FIG. 2 is a view showing a display example of the auxiliary line input screen.
- 3A and 3B are diagrams showing display examples of stereo matching results.
- The display unit 10 displays the auxiliary line input screen shown in FIG. 2, which includes two aerial photograph images obtained by photographing the same object from different directions (hereinafter referred to as the left image and the right image), and the stereo matching results of the left image and the right image shown in FIGS. 3A and 3B.
- the auxiliary line input unit 11 includes, for example, a keyboard and a mouse, and is used when an operator draws an auxiliary line on the left image and the right image on the auxiliary line input screen displayed on the display unit 10.
- the auxiliary line is a line segment for correlating a portion at the same position between the left image and the right image by the operation of the operator.
- The relative orientation unit 12 is realized by, for example, a central processing unit (CPU), a read-only memory (ROM), a random-access memory (RAM), a hard disk drive, and the like, and executes an orientation process that determines the camera parameters at the time of shooting and a parallelization process that reprojects the left image and the right image onto a common parallel plane as needed.
- Here, orientation means obtaining predetermined values for the object of evaluation.
- Specifically, the relative orientation unit 12 reads the coordinate values of an object shown in common in the left image and the right image, and uses the two read coordinate values to determine the camera parameters at the time of shooting, such as the camera rotation angle between the left image and the right image.
- In this way, the relative orientation unit 12 can determine the camera parameters at the time of shooting, which are difficult to grasp because of the influence of changes in the attitude of the aircraft and the like, even when an aerial photograph is taken from a direction close to the vertical.
- The relative orientation unit 12 executes the parallelization process to reproject the left image and the right image onto a common parallel plane so that the epipolar line connecting the epipoles of the left image and the right image coincides with one of the plurality of scanning lines.
- The stereo matching unit 13 is realized by, for example, a CPU, a ROM, a RAM, a hard disk drive, and the like. The RAM is provided with an auxiliary line input flag indicating that auxiliary lines have been input to each of the left image and the right image, an intersection point coordinate buffer storing the coordinates of the intersections of the auxiliary lines and the scanning lines, and the like.
- The stereo matching unit 13 performs stereo matching processing, specifically DP (Dynamic Programming) matching processing, on the left image and the right image (the parallelized image pair) subjected to the parallelization process in the relative orientation unit 12.
- FIG. 4 is a diagram for explaining the DP matching process. Specifically, in the DP matching process, when no auxiliary lines have been input on the left image and the right image by the operator through the auxiliary line input unit 11, the stereo matching unit 13 correlates the left image and the right image on the same scanning line, as shown in FIG. 4, to search for the lattice area with the largest correlation coefficient.
- FIG. 5 is a diagram illustrating a search plane.
- The stereo matching unit 13 then associates the center coordinates of the lattice area where the correlation coefficient is maximum on the search plane shown in FIG. 5.
- On the search plane, the horizontal axis x1 indicates the x coordinate in the right image, and the vertical axis x2 indicates the x coordinate in the left image.
- The stereo matching unit 13 executes such association for each scanning line to generate parallax data, and displays an image as shown in FIG. 3A on the display unit 10 as the stereo matching result.
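The DP matching along one scanning line can be illustrated with a toy sketch (not from the patent): a cost table over the search plane (x1: right-image x coordinate, x2: left-image x coordinate) is filled by dynamic programming and the minimum-cost monotone path is traced back; the per-pixel dissimilarity and occlusion cost are illustrative assumptions, not the patent's actual correlation measure:

```python
import numpy as np

def dp_scanline_match(left_row, right_row, occlusion_cost=1.0):
    """Toy dynamic-programming matcher for one rectified scanline.
    Returns the matched (x_left, x_right) pairs on the minimum-cost path."""
    n, m = len(left_row), len(right_row)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, :] = np.arange(m + 1) * occlusion_cost
    cost[:, 0] = np.arange(n + 1) * occlusion_cost
    back = np.zeros((n + 1, m + 1), dtype=int)  # 0=match, 1=skip left, 2=skip right
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(left_row[i - 1] - right_row[j - 1])  # dissimilarity
            choices = (cost[i - 1, j - 1] + d,
                       cost[i - 1, j] + occlusion_cost,
                       cost[i, j - 1] + occlusion_cost)
            back[i, j] = int(np.argmin(choices))
            cost[i, j] = choices[back[i, j]]
    # trace the optimal path back and collect the matched coordinate pairs
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        if back[i, j] == 0:
            pairs.append((i - 1, j - 1))
            i, j = i - 1, j - 1
        elif back[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]
```

The monotonicity of the path corresponds to the ordering constraint along the epipolar line; occluded pixels are simply skipped at a fixed cost.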
- FIG. 6 is a diagram for explaining the stereo matching processing. As shown in FIG. 6, when auxiliary lines have been input, the stereo matching unit 13 performs such association for each scanning line to generate parallax data corrected by the auxiliary lines, and displays an image as shown in FIG. 3B on the display unit 10 as the stereo matching result.
- the stereo matching unit 13 combines the generated parallax data and the camera parameters determined by the relative orientation unit 12 and calculates the position in the three-dimensional coordinate system corresponding to each pixel according to the principle of triangulation.
- Then, the stereo matching unit 13 extracts digital surface model (DSM) data including elevation data indicating the height of the surface layer of the object, whereby the depth and shape of the object are determined.
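The triangulation step can be illustrated by a minimal sketch under an assumed pinhole geometry with a horizontal stereo base; the symbols (base, focal length, flying height) are illustrative and are not taken from the patent:

```python
def height_from_parallax(parallax_px, base_m, focal_px, flying_height_m):
    """Minimal triangulation sketch (assumed pinhole geometry): with stereo
    base B and focal length f, the distance to a ground point is
    Z = f * B / parallax, and the terrain height is the flying height
    minus that distance."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    depth_m = focal_px * base_m / parallax_px
    return flying_height_m - depth_m
```

For example, a base of 100 m, a focal length of 1000 px, and a flying height of 12000 m give a terrain height of 2000 m for a 10 px parallax. A real implementation would use the full camera parameters determined by the relative orientation unit 12 rather than this simplified geometry.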
- The ortho processing / ground orientation unit 14 is realized by, for example, a CPU, a ROM, a RAM, a hard disk drive, and the like. Using the DSM data, it performs ortho processing that orthographically converts the aerial photograph image and the DSM data, and performs ground orientation processing that determines the exact coordinates of the object relative to the ground surface, specifically the longitude and latitude of the object, thereby generating an ortho image and ortho DSM data.
- The ortho image includes color data, and latitude data and longitude data indicating the latitude and longitude obtained by the ground orientation.
- the ortho DSM data includes elevation data indicating the height of the surface layer of the object, latitude data and longitude data.
- The auxiliary line input process and the stereo matching process are executed periodically, or at arbitrary timings, for example, when there is an instruction from the operator, when a predetermined image is present, or when a predetermined time comes.
- FIG. 7 is a flowchart showing the details of the auxiliary line input process.
- The stereo matching unit 13 determines whether auxiliary lines have been input to each of the left image and the right image by the operation of the auxiliary line input unit 11 by the operator (step S11). At this time, if no auxiliary line has been input (step S11; No), the stereo matching unit 13 ends the auxiliary line input process as it is.
- When it is determined in the process of step S11 that an auxiliary line has been input (step S11; Yes), the stereo matching unit 13 stores, for each of the left image and the right image, the coordinates of the intersections of the scanning lines and the auxiliary lines in the intersection point coordinate buffer provided in the RAM (step S12).
- the stereo matching unit 13 sets the auxiliary line input flag provided in the RAM to ON (step S13), and ends the auxiliary line input process.
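The intersection coordinates stored in step S12 can be sketched as a simple segment/scanline intersection test (an illustration, not the patent's implementation; the point representation as (x, y) tuples is an assumption):

```python
def scanline_intersection_x(p1, p2, y_scan):
    """Return the x coordinate where the auxiliary line segment p1-p2 crosses
    the horizontal scanning line y = y_scan, or None if it does not cross it
    (a sketch of what would be stored per scanning line in step S12)."""
    (x1, y1), (x2, y2) = p1, p2
    if y1 == y2:                       # segment parallel to the scanning line
        return None
    t = (y_scan - y1) / (y2 - y1)
    if not 0.0 <= t <= 1.0:            # scanning line outside the segment
        return None
    return x1 + t * (x2 - x1)
```

Running this for every scanning line of an image yields one x coordinate per auxiliary line per crossed scanline, which is the form the intersection point coordinate buffer needs.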
- FIG. 8 is a flowchart showing the details of the stereo matching process.
- The relative orientation unit 12 executes the orientation process to determine the camera parameters at the time of shooting (step S21), and executes the parallelization process to reproject the left image and the right image onto a common parallel plane so that the epipolar line coincides with one of the plurality of scanning lines (step S22).
- The stereo matching unit 13 determines whether auxiliary lines have been input to each of the left image and the right image by checking whether the auxiliary line input flag provided in the RAM is on (step S23).
- When it is determined in the process of step S23 that no auxiliary line has been input (step S23; No), the stereo matching unit 13 generates parallax data without correction by auxiliary lines (step S24).
- the stereo matching unit 13 correlates the left image and the right image on the same scanning line to search for a lattice area having the largest correlation coefficient. Then, the stereo matching unit 13 associates the center coordinates of the lattice area where the correlation coefficient is maximum on the search plane. The stereo matching unit 13 executes such association for each scanning line to generate parallax data without correction by the auxiliary line.
- When it is determined in the process of step S23 that auxiliary lines have been input (step S23; Yes), the stereo matching unit 13 generates parallax data with correction by the auxiliary lines (step S25).
- Specifically, at places where auxiliary lines have been input, the stereo matching unit 13 associates on the search plane the coordinates of the intersections of the scanning line and the auxiliary lines stored in the intersection point coordinate buffer provided in the RAM; at places where no auxiliary line has been input, it associates on the search plane the coordinates of the center position of the lattice area where the correlation coefficient is maximum. The stereo matching unit 13 executes such association for each scanning line to generate parallax data with correction by the auxiliary lines.
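The correction of step S25 — forced intersection correspondences where auxiliary lines exist, correlation maxima elsewhere — can be sketched as follows; the data layout (lists of (x_left, x_right) pairs per scanline) is an assumption made for the example:

```python
def corrected_matches(scanline_matches, forced_pairs):
    """Sketch of the step-S25 correction on one scanning line: where auxiliary
    lines were input, the stored intersection coordinates (x_left, x_right)
    replace the correlation-based matches at those left-image x positions;
    elsewhere the maximal-correlation matches are kept."""
    forced = dict(forced_pairs)        # x_left -> x_right from auxiliary lines
    out = []
    for x_left, x_right in scanline_matches:
        out.append((x_left, forced.get(x_left, x_right)))
    # forced correspondences with no correlation-based match are added as well
    matched_lefts = {xl for xl, _ in scanline_matches}
    out.extend((xl, xr) for xl, xr in forced.items() if xl not in matched_lefts)
    return sorted(out)
```

In the example a false match (2, 5) is overridden by the operator-supplied correspondence (2, 2), mirroring how the auxiliary line overrides the maximal-correlation area.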
- The stereo matching unit 13 displays an image based on the parallax data generated in the process of step S24 or S25 on the display unit 10 as the stereo matching result (step S26), and extracts DSM data including elevation data indicating the height of the surface layer of the object, using the parallax data and the camera parameters determined in the process of step S21 (step S27).
- The ortho processing / ground orientation unit 14 performs ortho processing on the aerial photograph image and the DSM data using the DSM data extracted in the process of step S27 (step S28).
- The ortho processing / ground orientation unit 14 then executes the ground orientation processing using the DSM data subjected to the ortho processing in step S28, and generates an ortho image and ortho DSM data including elevation data indicating the height of the surface layer of the object (step S29).
- When no auxiliary line has been input, it is determined in the process of step S23 shown in FIG. 8 that the auxiliary line input flag is off, so the center coordinates of the lattice areas having the largest correlation coefficient are associated on the search plane in the process of step S24.
- FIGS. 9A and 9B are diagrams illustrating search planes. FIG. 10 is a diagram for explaining the improvement of the association using auxiliary lines.
- As shown in FIG. 10, the operator, through the auxiliary line input unit 11, inputs the auxiliary line A2 on the left image and the auxiliary line A1 on the right image in association with each other, and inputs the auxiliary line B2 on the left image and the auxiliary line B1 on the right image in association with each other.
- In the process of step S12, the x coordinates a1, a2, b1, and b2 of the intersections of the scanning line and the auxiliary lines A1, A2, B1, and B2 are stored, and in the process of step S13 the auxiliary line input flag is set on.
- Then, the coordinate a2 of the intersection of the scanning line and the auxiliary line A2 on the left image and the coordinate a1 of the intersection of the scanning line and the auxiliary line A1 on the right image are associated with each other on the search plane, and likewise the coordinate b2 of the intersection of the scanning line and the auxiliary line B2 on the left image and the coordinate b1 of the intersection of the scanning line and the auxiliary line B1 on the right image.
- the stereo matching processing system 1 can correct an incorrect correspondence on the search plane and accurately associate the same position between the left image and the right image.
- In the stereo matching processing system 1, performing the correction using the auxiliary lines makes it possible to fix false correspondences on the search plane, and therefore the disparity data obtained as the stereo matching result can be corrected.
- the stereo matching processing system 1 can acquire elevation data indicating the height of the surface layer of the object accurately by extracting the DSM data using the corrected disparity data.
- FIG. 11 is a diagram showing the search plane in a modification. As shown in FIG. 11, all points on the line segment connecting the coordinates (a1, a2) and (b1, b2) on the search plane may be made to correspond to each other.
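This modification can be sketched by sampling the forced segment on the search plane, i.e. linearly interpolating between the two forced intersections (an illustration, not the patent's implementation; the discretization into `steps` samples is an assumption):

```python
def forced_segment_pairs(a, b, steps):
    """Modification sketch: make every sampled point on the search-plane
    segment from (a1, a2) to (b1, b2) a correspondence, by linearly
    interpolating between the two forced intersections.  Each returned
    pair is an (x1, x2) point on the search plane."""
    (a1, a2), (b1, b2) = a, b
    pairs = []
    for k in range(steps + 1):
        t = k / steps
        pairs.append((a1 + t * (b1 - a1), a2 + t * (b2 - a2)))
    return pairs
```

In effect this forces the disparity to vary linearly between the two auxiliary-line intersections instead of letting the correlation search choose freely there.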
- Whether an auxiliary line that forces such association (a forced auxiliary line) is input may be determined by the operation of the auxiliary line input unit 11 by the operator.
- FIG. 12 is a diagram for explaining the improvement of the association using the auxiliary line in the modification.
- FIG. 13 is a diagram showing the search plane in a modification.
- The auxiliary lines may be drawn parallel to the scanning lines, as shown in FIG. 12.
- In this case, the points that divide the auxiliary line (line segment) in the left image and the right image into n equal parts (n is a natural number), that is, the equal division points, may be made to correspond on the search plane in order from the start point of the line segment to the end point, as shown in FIG. 13.
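The equal-division correspondence for auxiliary lines parallel to the scanning line can be sketched as follows (an illustration, not the patent's implementation; representing each segment by its start and end x coordinates is an assumption):

```python
def equal_division_pairs(left_seg, right_seg, n):
    """Sketch of the modification for auxiliary lines parallel to the scanning
    line: divide each segment into n equal parts and associate the division
    points in order from the start point to the end point."""
    (ls, le), (rs, re) = left_seg, right_seg   # start/end x coordinates
    pairs = []
    for k in range(n + 1):
        t = k / n
        pairs.append((ls + t * (le - ls), rs + t * (re - rs)))
    return pairs
```

The k-th division point of the left segment is paired with the k-th division point of the right segment, which matches the in-order association from start point to end point described above.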
- It has been described that the relative orientation unit 12 reads the coordinate values of an object shown in common in the left image and the right image, and uses the two read coordinate values to determine the camera parameters at the time of shooting, such as the camera rotation angle between the left image and the right image. However, the present invention is not limited to this; the method for determining the camera parameters at the time of shooting is arbitrary, and, for example, the camera parameters at the time of shooting may be determined using values calculated by a plotting program.
- It has also been described that the ortho processing / ground orientation unit 14 performs the ground orientation for obtaining the longitude and latitude of the object using the ortho-processed aerial photograph image and the DSM data.
- However, the present invention is not limited to this; the method for performing the ground orientation is arbitrary. For example, a conversion equation from the image coordinates of a plurality of points on the aerial photograph image, whose longitude, latitude, and elevation values have been determined in advance, to the ground coordinates (longitude, latitude, and elevation values) of the ground surface may be obtained.
- Alternatively, aerial triangulation data in which longitude, latitude, and elevation have been surveyed may be used by taking the aerial photograph with anti-air markers imprinted; in this way, the ground coordinates of arbitrary coordinates on the image can be determined.
- Here, an anti-air marker is one whose shape can be clearly recognized on the aerial photograph image and whose image coordinates can be measured; the exact three-dimensional coordinates are therefore known at the points where the anti-air markers are placed.
- the ortho image includes color data, latitude data, and longitude data
- the ortho DSM data includes elevation data, latitude data, and longitude data.
- the ortho image and ortho DSM data may include coordinate value data represented by another coordinate system instead of the latitude data and the longitude data.
- height data indicating relative height from another reference may be included.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
DESCRIPTION OF SYMBOLS: 1 stereo matching processing system; 10 display unit; 11 auxiliary line input unit; 12 relative orientation unit; 13 stereo matching unit; 14 ortho processing / ground orientation unit
Claims (6)
- 1. A stereo matching processing system comprising: an associating unit that, in a plurality of images obtained by photographing the same object from different directions, associates areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and a line segment determination unit that determines whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images, wherein, when the line segment determination unit determines that the line segments have been drawn, the associating unit associates the intersections of the scanning line and the line segments with each other as indicating the same position, instead of the areas in which the correlation coefficient is maximum on the same scanning line.
- 2. The stereo matching processing system according to claim 1, wherein, when a plurality of the line segments are drawn on each of the plurality of images, the associating unit associates the line segments connecting the intersections of the scanning line and the line segments with each other as indicating the same position.
- 3. The stereo matching processing system according to claim 2, wherein, when the line segments parallel to the scanning line are drawn on each of the plurality of images, the associating unit associates the start points of the line segments with each other and the end points of the line segments with each other as indicating the same position.
- 4. The stereo matching processing system according to claim 3, wherein, when the line segments parallel to the scanning line are drawn on each of the plurality of images, the associating unit associates the equal division points that divide each line segment into a predetermined number of equal parts with each other, in order from the start point, as indicating the same position.
- 5. A stereo matching processing method comprising: an associating step of associating, in a plurality of images obtained by photographing the same object from different directions, areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and a line segment determination step of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images, wherein, in the associating step, when it is determined in the line segment determination step that the line segments have been drawn, the intersections of the scanning line and the line segments are associated with each other as indicating the same position, instead of the areas in which the correlation coefficient is maximum on the same scanning line.
- 6. A computer-readable recording medium recording a program for causing a computer to execute: an associating procedure of associating, in a plurality of images obtained by photographing the same object from different directions, areas in which the correlation coefficient is maximum on the same scanning line with each other as indicating the same position; and a line segment determination procedure of determining whether line segments associated with each other as indicating the same position have been drawn on each of the plurality of images, wherein, in the associating procedure, when it is determined in the line segment determination procedure that the line segments have been drawn, the intersections of the scanning line and the line segments are associated with each other as indicating the same position, instead of the areas in which the correlation coefficient is maximum on the same scanning line.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020117011938A KR101453143B1 (en) | 2008-11-25 | 2009-11-25 | Stereo matching process system, stereo matching process method, and recording medium |
CN200980146982.2A CN102224523B (en) | 2008-11-25 | 2009-11-25 | Stereo matching process system, stereo matching process method, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-300103 | 2008-11-25 | ||
JP2008300103A JP5311465B2 (en) | 2008-11-25 | 2008-11-25 | Stereo matching processing system, stereo matching processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010061860A1 true WO2010061860A1 (en) | 2010-06-03 |
Family
ID=42225733
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/069887 WO2010061860A1 (en) | 2008-11-25 | 2009-11-25 | Stereo matching process system, stereo matching process method, and recording medium |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP5311465B2 (en) |
KR (1) | KR101453143B1 (en) |
CN (1) | CN102224523B (en) |
WO (1) | WO2010061860A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101922930A (en) * | 2010-07-08 | 2010-12-22 | 西北工业大学 | Aviation polarization multi-spectrum image registration method |
CN113436057A (en) * | 2021-08-27 | 2021-09-24 | 绍兴埃瓦科技有限公司 | Data processing method and binocular stereo matching method |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8587518B2 (en) * | 2010-12-23 | 2013-11-19 | Tektronix, Inc. | Disparity cursors for measurement of 3D images |
JP5839671B2 (en) * | 2011-09-20 | 2016-01-06 | 株式会社Screenホールディングス | 3D position / attitude recognition device, industrial robot, 3D position / attitude recognition method, program, recording medium |
CN103339651B (en) | 2011-10-11 | 2016-12-07 | 松下知识产权经营株式会社 | Image processing apparatus, camera head and image processing method |
CN108629731A (en) * | 2017-03-15 | 2018-10-09 | 长沙博为软件技术股份有限公司 | A kind of image split-joint method being suitable for rolling screenshotss |
KR102610989B1 (en) * | 2019-12-26 | 2023-12-08 | 한국전자통신연구원 | Method and apparatus of generating digital surface model using satellite imagery |
CN112417208A (en) * | 2020-11-20 | 2021-02-26 | 百度在线网络技术(北京)有限公司 | Target searching method and device, electronic equipment and computer-readable storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002230527A (en) * | 2001-01-31 | 2002-08-16 | Olympus Optical Co Ltd | Three-dimensional information acquisition device and method and computer readable storage medium storing three-dimensional information acquisition program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3842988B2 (en) * | 2000-07-19 | 2006-11-08 | ペンタックス株式会社 | Image processing apparatus for measuring three-dimensional information of an object by binocular stereoscopic vision, and a method for recording the same, or a recording medium recording the measurement program |
US7164784B2 (en) * | 2002-07-30 | 2007-01-16 | Mitsubishi Electric Research Laboratories, Inc. | Edge chaining using smoothly-varying stereo disparity |
CN100550054C (en) * | 2007-12-17 | 2009-10-14 | University of Electronic Science and Technology of China | Image stereo matching method and device |
CN101226636B (en) * | 2008-02-02 | 2010-06-02 | Institute of Remote Sensing Applications, Chinese Academy of Sciences | Image matching method based on a rigid-body transformation relation |
CN100586199C (en) * | 2008-03-30 | 2010-01-27 | Shenzhen Huawei Communication Technologies Co., Ltd. | Method and device for obtaining disparity |
2008
- 2008-11-25 JP JP2008300103A patent/JP5311465B2/en active Active

2009
- 2009-11-25 KR KR1020117011938A patent/KR101453143B1/en active IP Right Grant
- 2009-11-25 WO PCT/JP2009/069887 patent/WO2010061860A1/en active Application Filing
- 2009-11-25 CN CN200980146982.2A patent/CN102224523B/en not_active Expired - Fee Related
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002230527A (en) * | 2001-01-31 | 2002-08-16 | Olympus Optical Co Ltd | Three-dimensional information acquisition apparatus and method, and computer-readable storage medium storing a three-dimensional information acquisition program |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101922930A (en) * | 2010-07-08 | 2010-12-22 | Northwestern Polytechnical University | Aviation polarization multi-spectral image registration method |
CN101922930B (en) * | 2010-07-08 | 2013-11-06 | Northwestern Polytechnical University | Aviation polarization multi-spectral image registration method |
CN113436057A (en) * | 2021-08-27 | 2021-09-24 | 绍兴埃瓦科技有限公司 | Data processing method and binocular stereo matching method |
Also Published As
Publication number | Publication date |
---|---|
KR20110089299A (en) | 2011-08-05 |
JP5311465B2 (en) | 2013-10-09 |
KR101453143B1 (en) | 2014-10-27 |
CN102224523B (en) | 2014-04-23 |
CN102224523A (en) | 2011-10-19 |
JP2010128608A (en) | 2010-06-10 |
Similar Documents
Publication | Title |
---|---|
WO2010061860A1 (en) | Stereo matching process system, stereo matching process method, and recording medium |
JP5832341B2 (en) | Video processing apparatus, video processing method, and video processing program |
JP5713159B2 (en) | Three-dimensional position / orientation measurement apparatus, method and program using stereo images | |
US8107722B2 (en) | System and method for automatic stereo measurement of a point of interest in a scene | |
US10529076B2 (en) | Image processing apparatus and image processing method | |
JP5442111B2 (en) | A method for high-speed 3D construction from images | |
US8571303B2 (en) | Stereo matching processing system, stereo matching processing method and recording medium | |
JP5715735B2 (en) | Three-dimensional measurement method, apparatus and system, and image processing apparatus | |
CN112686877B (en) | Binocular camera-based three-dimensional house damage model construction and measurement method and system | |
JP2012058076A (en) | Three-dimensional measurement device and three-dimensional measurement method | |
JP2011253376A (en) | Image processing device, image processing method and program | |
JP6580761B1 (en) | Depth acquisition apparatus and method using polarization stereo camera | |
TW201523510A (en) | System and method for combining point clouds | |
CN116958218A (en) | Point cloud and image registration method and equipment based on calibration plate corner alignment | |
Kochi et al. | Development of 3D image measurement system and stereo‐matching method, and its archaeological measurement | |
JP2008224323A (en) | Stereoscopic photograph measuring instrument, stereoscopic photograph measuring method, and stereoscopic photograph measuring program | |
CN112991372B (en) | 2D-3D camera external parameter calibration method based on polygon matching | |
AU2009249003B2 (en) | Stereoscopic measurement system and method | |
JP6448413B2 (en) | Roof slope estimation system and roof slope estimation method | |
JP4810403B2 (en) | Information processing apparatus and information processing method | |
KR101770866B1 (en) | Apparatus and method for calibrating depth image based on depth sensor-color camera relations | |
CN116866522B (en) | Remote monitoring method | |
JP3851483B2 (en) | Image point association apparatus, image point association method, and recording medium storing image point association program | |
JP4999010B2 (en) | Free viewpoint video generation method and recording medium | |
CN108731644B (en) | Oblique photography mapping method and system based on vertical auxiliary line |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 200980146982.2; Country of ref document: CN
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09829108; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 20117011938; Country of ref document: KR; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 09829108; Country of ref document: EP; Kind code of ref document: A1