WO2010061860A1 - Stereo matching processing system, stereo matching processing method, and recording medium - Google Patents

Stereo matching processing system, stereo matching processing method, and recording medium

Info

Publication number
WO2010061860A1
WO2010061860A1 (PCT/JP2009/069887)
Authority
WO
WIPO (PCT)
Prior art keywords
line
line segment
stereo matching
same position
scanning line
Prior art date
Application number
PCT/JP2009/069887
Other languages
English (en)
Japanese (ja)
Inventor
小泉 博一
神谷 俊之
弘之 柳生
Original Assignee
Necシステムテクノロジー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necシステムテクノロジー株式会社 filed Critical Necシステムテクノロジー株式会社
Priority to KR1020117011938A priority Critical patent/KR101453143B1/ko
Priority to CN200980146982.2A priority patent/CN102224523B/zh
Publication of WO2010061860A1 publication Critical patent/WO2010061860A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G01C 11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing

Definitions

  • The present invention relates to a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium storing a program, capable of accurately associating regions that indicate the same position among a plurality of images.
  • A method of generating three-dimensional data [DSM (Digital Surface Model) data] representing topography by stereo matching, based on images obtained from satellites or aircraft, is widely used.
  • In stereo matching processing, corresponding points that capture the same physical point are determined for two images taken from different viewpoints, a so-called stereo pair, and the depth and shape of the object are determined from the resulting parallax by the principle of triangulation.
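The triangulation relation mentioned above can be made concrete with a small sketch. For a rectified stereo pair, depth is inversely proportional to parallax; the focal length and baseline values in the example are hypothetical, chosen only for illustration:

```python
def depth_from_parallax(parallax_px, focal_px, baseline_m):
    """Depth by triangulation for a rectified stereo pair.

    parallax_px: x-displacement between corresponding points (pixels)
    focal_px:    focal length expressed in pixels
    baseline_m:  distance between the two camera centres (metres)
    """
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a point in front of the cameras")
    return focal_px * baseline_m / parallax_px

# Assumed values: f = 1000 px, B = 0.5 m, parallax = 20 px
# -> depth = 1000 * 0.5 / 20 = 25.0 m
print(depth_from_parallax(20, 1000, 0.5))
```

Note how halving the parallax doubles the recovered depth, which is why small matching errors in low-parallax (distant) regions produce large height errors in the DSM.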
  • Because the movement range of the search window is limited to the epipolar-line direction in the image, the positional displacement in the x direction between each point in the left image and the corresponding point in the right image, that is, the parallax, can be obtained.
  • An epipolar line is the straight line in one image of a stereo pair on which the point corresponding to a given point in the other image must lie (see, for example, Non-Patent Document 1).
  • When the epipolar-line direction differs from the scanning-line direction of the image, a coordinate transformation can make the two directions coincide, and the image can be resampled accordingly. This coordinate transformation is described in Non-Patent Document 1.
  • After this transformation, the movement range of the search window for corresponding points can be restricted to a single scanning line, so the parallax is obtained as the difference between the x coordinates of corresponding points in the left and right images.
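The search restricted to one scanning line can be sketched as follows. The window half-width, the search range, and the normalized cross-correlation score are assumptions made for illustration; the document only requires some correlation coefficient maximized along the scanning line:

```python
import math

def best_match_x(left_row, right_row, x_left, half_win=2, max_disp=30):
    """Find, on the same scanning line, the right-image x whose window
    correlates best with the window centred at x_left in the left image.
    Returns (x_right, parallax). Pure-Python sketch, not optimised."""

    def window(row, x):
        return row[x - half_win : x + half_win + 1]

    def ncc(a, b):
        # normalized cross-correlation of two equal-length windows
        ma = sum(a) / len(a); mb = sum(b) / len(b)
        num = sum((p - ma) * (q - mb) for p, q in zip(a, b))
        da = math.sqrt(sum((p - ma) ** 2 for p in a))
        db = math.sqrt(sum((q - mb) ** 2 for q in b))
        return num / (da * db) if da and db else -1.0

    ref = window(left_row, x_left)
    # corresponding right-image point lies at x_left - d for d in [0, max_disp]
    candidates = range(max(half_win, x_left - max_disp), x_left + 1)
    best = max(candidates, key=lambda x: ncc(ref, window(right_row, x)))
    return best, x_left - best
```

Because the candidate set is a short 1-D range rather than a 2-D neighbourhood, the cost per pixel drops from O(W·H) to O(max_disp), which is the practical benefit of the rectification described above.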
  • However, because the images handled by the related stereo-matching 3D-data generation method contain regions without texture and regions where no correlation coefficient can be computed, the resulting 3D data contains many points whose height is erroneously different from that of their surroundings. In particular, occlusion occurs around structures and the like, so many points cannot be matched; these may show extremely large values or cause large parts of a structure to be lost.
  • The present invention has been made to solve the above problems, and aims to provide a stereo matching processing system, a stereo matching processing method, and a recording medium that can accurately associate regions indicating the same position among a plurality of images.
  • A stereo matching processing system according to a first aspect of the present invention includes an association unit that associates, as indicating the same position, areas whose correlation coefficient is maximal on the same scanning line in a plurality of images obtained by photographing the same object from different directions, and a line segment determination unit that determines whether line segments associated as indicating the same position have been drawn on each of the plurality of images. When the line segment determination unit determines that the line segments have been drawn, the association unit associates, as indicating the same position, the intersections of the scanning line with the line segments rather than the areas whose correlation coefficient is maximal on the same scanning line.
  • A stereo matching processing method according to a second aspect of the present invention includes an associating step of associating, as indicating the same position, areas whose correlation coefficient is maximal on the same scanning line in a plurality of images obtained by photographing the same object from different directions, and a line segment determination step of determining whether line segments associated as indicating the same position have been drawn on each of the plurality of images. In the associating step, when it is determined in the line segment determination step that the line segments have been drawn, the intersections of the scanning line with the line segments, rather than the areas whose correlation coefficient is maximal on the same scanning line, are associated as indicating the same position.
  • A computer-readable recording medium according to a third aspect of the present invention stores a program that causes a computer to execute an associating procedure for associating, as indicating the same position, areas whose correlation coefficient is maximal on the same scanning line in a plurality of images obtained by photographing the same object from different directions, and a line segment determination procedure for determining whether line segments associated as indicating the same position have been drawn on each of the plurality of images. In the associating procedure, when it is determined in the line segment determination procedure that the line segments have been drawn, the intersections of the scanning line with the line segments, rather than the areas whose correlation coefficient is maximal, are associated as indicating the same position.
  • According to the present invention, it is possible to provide a stereo matching processing system, a stereo matching processing method, and a computer-readable recording medium storing a program, in which the same position can be accurately associated among a plurality of images.
  • FIG. 1 is a block diagram showing an example of the configuration of a stereo matching processing system according to an embodiment of the present invention.
  • The stereo matching processing system 1 is implemented on, for example, a general-purpose computer and, as shown in FIG. 1, includes a display unit 10, an auxiliary line input unit 11, a relative orientation unit 12, a stereo matching unit 13, and an ortho processing/ground orientation unit 14.
  • the display unit 10 is configured of, for example, a liquid crystal display (LCD).
  • FIG. 2 is a view showing a display example of the auxiliary line input screen.
  • FIGS. 3A and 3B are diagrams showing display examples of stereo matching results.
  • The display unit 10 displays the auxiliary line input screen shown in FIG. 2, which contains two aerial photographs of the same object taken from different directions (hereinafter the left image and the right image), and the stereo matching results of the left image and the right image shown in FIGS. 3A and 3B.
  • the auxiliary line input unit 11 includes, for example, a keyboard and a mouse, and is used when an operator draws an auxiliary line on the left image and the right image on the auxiliary line input screen displayed on the display unit 10.
  • An auxiliary line is a line segment drawn by the operator to associate portions at the same position between the left image and the right image.
  • The relative orientation unit 12 is realized by, for example, a central processing unit (CPU), read-only memory (ROM), random-access memory (RAM), and a hard disk drive. It executes an orientation process that determines the camera parameters at the time of shooting and, as needed, a parallelization process that reprojects the left image and the right image onto a common parallel plane.
  • Here, orientation means determining the required parameter values for the object under evaluation.
  • Specifically, the relative orientation unit 12 reads the coordinate values of an object that appears in both the left image and the right image, and uses the two sets of coordinate values to determine the camera parameters at the time of shooting, such as the camera rotation angle between the two images.
  • In this way, the relative orientation unit 12 can determine camera parameters that are otherwise difficult to grasp because of changes in aircraft attitude and the like, even when the aerial photographs are taken from a direction close to the vertical.
  • The relative orientation unit 12 then executes the parallelization process, reprojecting the left image and the right image onto a common parallel plane so that the epipolar line connecting the epipoles of the two images coincides with one of the scanning lines.
  • The stereo matching unit 13 is realized by, for example, a CPU, ROM, RAM, and a hard disk drive. The RAM holds an auxiliary-line input flag indicating whether auxiliary lines have been input for the left image and the right image, an intersection coordinate buffer storing the coordinates of the intersections of the auxiliary lines with the scanning lines, and the like.
  • The stereo matching unit 13 performs stereo matching, specifically DP (Dynamic Programming) matching, on the left image and the right image (the parallelized image pair) produced by the relative orientation unit 12.
  • FIG. 4 is a diagram for explaining the DP matching process. When no auxiliary lines have been input on the left and right images through the auxiliary line input unit 11, the stereo matching unit 13 correlates the left image and the right image along the same scanning line, as shown in FIG. 4, searching for the lattice area with the largest correlation coefficient.
  • FIG. 5 is a diagram illustrating a search plane.
  • The stereo matching unit 13 associates the center coordinates of the lattice areas whose correlation coefficient is maximal on the search plane shown in FIG. 5, where the horizontal axis x1 is the x coordinate in the right image and the vertical axis x2 is the x coordinate in the left image.
  • The stereo matching unit 13 performs this association for each scanning line to generate parallax data, and displays an image such as that in FIG. 3A on the display unit 10 as the stereo matching result.
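The per-scanline DP matching can be sketched as a classic sequence alignment between the left and right rows. The absolute-difference matching cost and the occlusion penalty `occ` are assumptions for illustration; the document names DP matching but does not specify the cost function:

```python
def dp_scanline_match(left, right, occ=10):
    """Dynamic-programming alignment of one left/right scanning line.
    Cost: absolute grey-level difference; 'occ' is an assumed penalty for
    leaving a pixel unmatched (occluded). Returns the list of matched
    (x_left, x_right) pairs, i.e. points on the search plane."""
    n, m = len(left), len(right)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            c = cost[i][j]
            if c == INF:
                continue
            if i < n and j < m:   # match left[i] with right[j]
                cost[i + 1][j + 1] = min(cost[i + 1][j + 1], c + abs(left[i] - right[j]))
            if i < n:             # left[i] occluded
                cost[i + 1][j] = min(cost[i + 1][j], c + occ)
            if j < m:             # right[j] occluded
                cost[i][j + 1] = min(cost[i][j + 1], c + occ)
    # backtrack the optimal path from (n, m) to (0, 0)
    pairs, i, j = [], n, m
    while i > 0 or j > 0:
        c = cost[i][j]
        if i > 0 and j > 0 and c == cost[i - 1][j - 1] + abs(left[i - 1] - right[j - 1]):
            pairs.append((i - 1, j - 1)); i -= 1; j -= 1
        elif i > 0 and c == cost[i - 1][j] + occ:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]
```

Unlike independent per-pixel window matching, the DP path enforces the ordering constraint along the scanning line, which is what makes a forced point (an auxiliary-line intersection) able to pull its neighbours toward the correct correspondence.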
  • FIG. 6 is a diagram for explaining the stereo matching process when auxiliary lines are present. As shown in FIG. 6, the stereo matching unit 13 performs the association for each scanning line to generate parallax data corrected by the auxiliary lines, and displays an image such as that in FIG. 3B on the display unit 10 as the stereo matching result.
  • The stereo matching unit 13 then combines the generated parallax data with the camera parameters determined by the relative orientation unit 12 and, by the principle of triangulation, computes the position in the three-dimensional coordinate system corresponding to each pixel, extracting DSM (Digital Surface Model) data that includes elevation data indicating the height of the surface of the object; in this way the depth and shape of the object are determined.
  • The ortho processing/ground orientation unit 14 is realized by, for example, a CPU, ROM, RAM, and a hard disk drive. Using the DSM data, it executes ortho processing, which orthographically projects the aerial photographs and the DSM data, and ground orientation processing, which determines the exact coordinates of the object relative to the ground surface, specifically its longitude and latitude, thereby generating an ortho image and ortho DSM data.
  • The ortho image includes color data together with latitude data and longitude data obtained by the ground orientation.
  • The ortho DSM data includes elevation data indicating the height of the surface of the object, together with the latitude data and longitude data.
  • The auxiliary-line input process and the stereo matching process are executed periodically, or at arbitrary timings, for example when the operator issues an instruction, when a given image becomes available, or when a predetermined time arrives.
  • FIG. 7 is a flowchart showing the details of the auxiliary line input process.
  • In the auxiliary-line input process, the stereo matching unit 13 first determines whether auxiliary lines have been input on the left image and the right image through the operator's use of the auxiliary line input unit 11 (step S11). If no auxiliary line has been input (step S11; No), the stereo matching unit 13 ends the auxiliary-line input process.
  • If it is determined in step S11 that an auxiliary line has been input (step S11; Yes), the stereo matching unit 13 stores, for each of the left image and the right image, the coordinates of the intersections of the scanning lines with the auxiliary lines in the intersection coordinate buffer provided in the RAM (step S12).
  • The stereo matching unit 13 then sets the auxiliary-line input flag provided in the RAM to on (step S13) and ends the auxiliary-line input process.
  • FIG. 8 is a flowchart showing the details of the stereo matching process.
  • In the stereo matching process, the relative orientation unit 12 first executes the orientation process to determine the camera parameters at the time of shooting (step S21), and then executes the parallelization process, reprojecting the left image and the right image onto a common parallel plane so that the epipolar line coincides with one of the scanning lines (step S22).
  • Next, the stereo matching unit 13 determines whether auxiliary lines have been input on the left image and the right image by checking whether the auxiliary-line input flag provided in the RAM is on (step S23).
  • If it is determined in step S23 that no auxiliary line has been input (step S23; No), the stereo matching unit 13 generates parallax data without correction by auxiliary lines (step S24).
  • Specifically, the stereo matching unit 13 correlates the left image and the right image along each scanning line to search for the lattice area with the largest correlation coefficient, associates the center coordinates of that area on the search plane, and repeats this for every scanning line to generate the uncorrected parallax data.
  • If it is determined in step S23 that an auxiliary line has been input (step S23; Yes), the stereo matching unit 13 generates parallax data with correction by the auxiliary lines (step S25).
  • Specifically, where an auxiliary line has been input, the stereo matching unit 13 associates on the search plane the coordinates of the intersections of the scanning line with the auxiliary lines stored in the intersection coordinate buffer in the RAM; where no auxiliary line has been input, it associates the center coordinates of the lattice area with the maximal correlation coefficient. It performs this association for each scanning line to generate the corrected parallax data.
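The merging rule of step S25 might be sketched as follows; modeling the buffer contents and the correlation results as plain lists of coordinate pairs is an assumption of this sketch:

```python
def associate_scanline(forced_pairs, correlation_pairs):
    """Merge correspondences for one scanning line (a sketch of step S25).

    forced_pairs:      [(x_right, x_left), ...] intersections of this scanning
                       line with associated auxiliary lines (from the buffer)
    correlation_pairs: [(x_right, x_left), ...] maxima of the correlation
                       coefficient found by the ordinary search
    Where an auxiliary line crosses the scanning line, the forced pair wins;
    elsewhere the correlation result is kept."""
    forced_lefts = {xl for _, xl in forced_pairs}
    kept = [p for p in correlation_pairs if p[1] not in forced_lefts]
    return sorted(kept + list(forced_pairs))
```

For example, if the correlation search put the left-image point x2 = 12 at x1 = 9 but an auxiliary line forces it to x1 = 10, the forced pair replaces the erroneous one while untouched pairs survive.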
  • The stereo matching unit 13 displays an image based on the parallax data generated in step S24 or S25 on the display unit 10 as the stereo matching result (step S26), and uses that parallax data together with the camera parameters determined in step S21 to extract DSM data including elevation data indicating the height of the surface of the object (step S27).
  • The ortho processing/ground orientation unit 14 executes ortho processing on the aerial photographs and the DSM data using the DSM data extracted in step S27 (step S28).
  • It then executes the ground orientation process using the DSM data ortho-processed in step S28 to generate an ortho image and ortho DSM data including elevation data indicating the height of the surface of the object (step S29).
  • When no auxiliary line has been input, the auxiliary-line input flag is found to be off in step S23 of FIG. 8, so in step S24 the center coordinates of the lattice areas with the largest correlation coefficients are associated on the search plane.
  • FIGS. 9A and 9B are diagrams illustrating search planes, and FIG. 10 is a diagram for explaining the improvement of the association using auxiliary lines.
  • Suppose the operator, using the auxiliary line input unit 11, inputs the auxiliary line A2 on the left image in association with the auxiliary line A1 on the right image, and the auxiliary line B2 on the left image in association with the auxiliary line B1 on the right image, as shown in FIG. 10.
  • Then, in step S12, the x coordinates a1, a2, b1, and b2 of the intersections of the scanning line with the auxiliary lines A1, A2, B1, and B2 are stored, and in step S13 the auxiliary-line input flag is set on.
  • In step S25, the coordinate a2 of the intersection of the scanning line with the auxiliary line A2 on the left image is associated with the coordinate a1 of the intersection of the scanning line with the auxiliary line A1 on the right image, and likewise the coordinate b2 is associated with the coordinate b1, on the search plane.
  • In this way, the stereo matching processing system 1 can correct erroneous correspondences on the search plane and accurately associate the same positions between the left image and the right image.
  • Because the correction by the auxiliary lines removes false correspondences on the search plane, the parallax data obtained as the stereo matching result is corrected accordingly.
  • By extracting the DSM data from the corrected parallax data, the stereo matching processing system 1 can acquire accurate elevation data indicating the height of the surface of the object.
  • FIG. 11 shows a search plane in a modification. As shown in FIG. 11, all points on the line segment connecting the coordinates (a1, a2) and (b1, b2) on the search plane may be made to correspond to each other.
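This modification can be sketched as linear interpolation between the two forced points on the search plane; rounding the interpolated values to integer pixel coordinates is an assumption of the sketch:

```python
def forced_segment_pairs(a, b):
    """All integer points on the search-plane segment from a = (a1, a2) to
    b = (b1, b2), each treated as a forced correspondence (FIG. 11 variant).
    Points are sampled by linear interpolation between the endpoints."""
    (a1, a2), (b1, b2) = a, b
    steps = max(abs(b1 - a1), abs(b2 - a2))
    if steps == 0:
        return [a]
    return [
        (round(a1 + (b1 - a1) * t / steps), round(a2 + (b2 - a2) * t / steps))
        for t in range(steps + 1)
    ]
```

Every returned pair fixes one right-image x coordinate to one left-image x coordinate, so the whole stretch between the two auxiliary lines is pinned rather than only their intersections.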
  • Whether an auxiliary line that forces such an association (a forced auxiliary line) has been input may be determined from the operation of the auxiliary line input unit 11 by the operator.
  • FIG. 12 is a diagram for explaining the improvement of the association using an auxiliary line in a modification.
  • FIG. 13 shows the search plane in that modification.
  • As shown in FIG. 12, the auxiliary lines may also be drawn parallel to the scanning lines.
  • In that case, the points dividing the auxiliary line (line segment) in the left image and in the right image into n equal parts (n being a natural number) may be made to correspond in order on the search plane, from the start point of the segment to its end point, as shown in FIG. 13.
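The equal-division correspondence can be sketched as follows; representing each auxiliary segment by its two endpoint tuples is an assumption of this sketch:

```python
def equal_division_pairs(seg_left, seg_right, n):
    """Correspond the points dividing the left- and right-image auxiliary
    segments into n equal parts, in order from start point to end point
    (the FIGS. 12-13 variant). Segments are ((x0, y0), (x1, y1))."""
    def divide(seg):
        (x0, y0), (x1, y1) = seg
        return [
            (x0 + (x1 - x0) * k / n, y0 + (y1 - y0) * k / n)
            for k in range(n + 1)
        ]
    # k-th division point of the left segment pairs with the k-th of the right
    return list(zip(divide(seg_left), divide(seg_right)))
```

Pairing the k-th division point of one segment with the k-th of the other preserves the along-segment ordering even when the segments lie on (and run parallel to) a single scanning line, where intersection points alone would not determine the correspondence.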
  • The description above stated that the relative orientation unit 12 reads the coordinate values of an object shown in both the left image and the right image and uses the two sets of coordinates to determine the camera parameters at the time of shooting, such as the camera rotation angle between the two images. However, the present invention is not limited to this; the method for determining the camera parameters is arbitrary, and, for example, values calculated by a plotting program may be used.
  • The description also stated that the ortho processing/ground orientation unit 14 performs ground orientation to obtain the longitude and latitude of the object using the ortho-processed aerial photographs and DSM data.
  • The present invention is not limited to this either; the method of ground orientation is arbitrary. For example, a transformation from image coordinates to ground coordinates (longitude, latitude, and elevation) may be obtained from the image coordinates of a plurality of points on an aerial photograph whose longitude, latitude, and elevation values have been determined in advance.
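The image-to-ground conversion mentioned here might be sketched as fitting a planar affine transform from three control points. Real photogrammetric ground orientation would use more points, least squares, and the elevation component, so this is only an illustration under those simplifying assumptions:

```python
def affine_from_control_points(img_pts, gnd_pts):
    """Fit the 2-D affine transform taking image coordinates to ground
    coordinates (longitude, latitude) from three non-collinear control
    points whose ground positions are known in advance."""
    def solve3(m, v):
        # Gauss-Jordan elimination on a 3x3 system with partial pivoting
        a = [row[:] + [val] for row, val in zip(m, v)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
            a[col], a[piv] = a[piv], a[col]
            for r in range(3):
                if r != col:
                    f = a[r][col] / a[col][col]
                    a[r] = [x - f * y for x, y in zip(a[r], a[col])]
        return [a[i][3] / a[i][i] for i in range(3)]

    m = [[x, y, 1.0] for x, y in img_pts]
    cx = solve3(m, [g[0] for g in gnd_pts])  # coefficients for longitude
    cy = solve3(m, [g[1] for g in gnd_pts])  # coefficients for latitude
    return lambda x, y: (cx[0] * x + cx[1] * y + cx[2],
                         cy[0] * x + cy[1] * y + cy[2])
```

Once fitted, the returned function maps any pixel of the ortho image to ground coordinates, which is exactly what lets the ortho DSM data carry latitude and longitude for every elevation sample.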
  • Alternatively, aerial triangulation data in which longitude, latitude, and elevation have been surveyed may be used by taking the aerial photographs with anti-air signs imprinted; in this way, the ground coordinates of arbitrary coordinates on the image can be determined.
  • An anti-air sign is a marker whose shape can be clearly recognized on the aerial photograph and whose image coordinates can be measured; the points where anti-air signs are placed therefore give exact three-dimensional coordinates.
  • The above description stated that the ortho image includes color data, latitude data, and longitude data, and that the ortho DSM data includes elevation data, latitude data, and longitude data.
  • However, the ortho image and the ortho DSM data may instead include coordinate value data expressed in another coordinate system in place of the latitude data and longitude data, and may include height data indicating relative height from another reference.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

When an operator inputs and associates an auxiliary line (A2) of the left image with an auxiliary line (A1) of the right image, and likewise inputs and associates an auxiliary line (B2) of the left image with an auxiliary line (B1) of the right image, a stereo matching processing system (1) associates on a search plane the coordinates (a2) where a scanning line of the left image intersects the auxiliary line (A2) with the coordinates (a1) where a scanning line of the right image intersects the auxiliary line (A1). Likewise, the stereo matching processing system (1) associates, on the search plane, the coordinates (b2) where a scanning line of the left image intersects the auxiliary line (B2) with the coordinates (b1) where a scanning line of the right image intersects the auxiliary line (B1). This allows the stereo matching processing system (1) to correct erroneous correspondences on the search plane and accurately associate the same position in the left image and the right image.
PCT/JP2009/069887 2008-11-25 2009-11-25 Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement WO2010061860A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020117011938A KR101453143B1 (ko) 2008-11-25 2009-11-25 스테레오 매칭 처리 시스템, 스테레오 매칭 처리 방법, 및 기록 매체
CN200980146982.2A CN102224523B (zh) 2008-11-25 2009-11-25 立体匹配处理***、立体匹配处理方法和记录媒介

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008300103A JP5311465B2 (ja) 2008-11-25 2008-11-25 ステレオマッチング処理システム、ステレオマッチング処理方法、及びプログラム
JP2008-300103 2008-11-25

Publications (1)

Publication Number Publication Date
WO2010061860A1 true WO2010061860A1 (fr) 2010-06-03

Family

ID=42225733

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/069887 WO2010061860A1 (fr) 2008-11-25 2009-11-25 Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement

Country Status (4)

Country Link
JP (1) JP5311465B2 (fr)
KR (1) KR101453143B1 (fr)
CN (1) CN102224523B (fr)
WO (1) WO2010061860A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101922930A (zh) * 2010-07-08 2010-12-22 西北工业大学 一种航空偏振多光谱图像配准方法
CN113436057A (zh) * 2021-08-27 2021-09-24 绍兴埃瓦科技有限公司 数据处理方法及双目立体匹配方法

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8587518B2 (en) * 2010-12-23 2013-11-19 Tektronix, Inc. Disparity cursors for measurement of 3D images
JP5839671B2 (ja) * 2011-09-20 2016-01-06 株式会社Screenホールディングス 三次元位置・姿勢認識装置、産業用ロボット、三次元位置・姿勢認識方法、プログラム、記録媒体
WO2013054499A1 (fr) 2011-10-11 2013-04-18 パナソニック株式会社 Dispositif de traitement d'image, dispositif d'imagerie et procédé de traitement d'image
CN108629731A (zh) * 2017-03-15 2018-10-09 长沙博为软件技术股份有限公司 一种适用于滚动截屏的图像拼接方法
KR102610989B1 (ko) * 2019-12-26 2023-12-08 한국전자통신연구원 위성영상을 이용한 수치표면모델 생성 방법 및 장치
CN112417208A (zh) * 2020-11-20 2021-02-26 百度在线网络技术(北京)有限公司 目标搜索方法、装置、电子设备和计算机可读存储介质

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002230527A (ja) * 2001-01-31 2002-08-16 Olympus Optical Co Ltd 立体情報取得装置および立体情報取得方法ならびに立体情報取得プログラムを記録したコンピュータリーダブル記録媒体

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3842988B2 (ja) * 2000-07-19 2006-11-08 ペンタックス株式会社 両眼立体視によって物体の3次元情報を計測する画像処理装置およびその方法又は計測のプログラムを記録した記録媒体
US7164784B2 (en) * 2002-07-30 2007-01-16 Mitsubishi Electric Research Laboratories, Inc. Edge chaining using smoothly-varying stereo disparity
CN100550054C (zh) * 2007-12-17 2009-10-14 电子科技大学 一种图像立体匹配方法及其装置
CN101226636B (zh) * 2008-02-02 2010-06-02 中国科学院遥感应用研究所 一种刚体变换关系的图像的匹配方法
CN100586199C (zh) * 2008-03-30 2010-01-27 深圳华为通信技术有限公司 视差获取方法和装置

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002230527A (ja) * 2001-01-31 2002-08-16 Olympus Optical Co Ltd 立体情報取得装置および立体情報取得方法ならびに立体情報取得プログラムを記録したコンピュータリーダブル記録媒体

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101922930A (zh) * 2010-07-08 2010-12-22 西北工业大学 一种航空偏振多光谱图像配准方法
CN101922930B (zh) * 2010-07-08 2013-11-06 西北工业大学 一种航空偏振多光谱图像配准方法
CN113436057A (zh) * 2021-08-27 2021-09-24 绍兴埃瓦科技有限公司 数据处理方法及双目立体匹配方法

Also Published As

Publication number Publication date
JP5311465B2 (ja) 2013-10-09
CN102224523B (zh) 2014-04-23
KR20110089299A (ko) 2011-08-05
KR101453143B1 (ko) 2014-10-27
CN102224523A (zh) 2011-10-19
JP2010128608A (ja) 2010-06-10

Similar Documents

Publication Publication Date Title
WO2010061860A1 (fr) Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement
JP5832341B2 (ja) 動画処理装置、動画処理方法および動画処理用のプログラム
JP5713159B2 (ja) ステレオ画像による3次元位置姿勢計測装置、方法およびプログラム
US10529076B2 (en) Image processing apparatus and image processing method
JP5442111B2 (ja) 画像から高速に立体構築を行なう方法
US8571303B2 (en) Stereo matching processing system, stereo matching processing method and recording medium
JP5715735B2 (ja) 3次元測定方法、装置、及びシステム、並びに画像処理装置
CN112686877B (zh) 基于双目相机的三维房屋损伤模型构建测量方法及***
JP2012058076A (ja) 3次元計測装置及び3次元計測方法
JP2011253376A (ja) 画像処理装置、および画像処理方法、並びにプログラム
JP6580761B1 (ja) 偏光ステレオカメラによる深度取得装置及びその方法
TW201523510A (zh) 點雲拼接系統及方法
CN116958218A (zh) 一种基于标定板角点对齐的点云与图像配准方法及设备
Kochi et al. Development of 3D image measurement system and stereo‐matching method, and its archaeological measurement
JP2008224323A (ja) ステレオ写真計測装置、ステレオ写真計測方法及びステレオ写真計測用プログラム
CN112991372B (zh) 一种基于多边形匹配的2d-3d相机外参标定方法
AU2009249003B2 (en) Stereoscopic measurement system and method
JP6448413B2 (ja) 屋根勾配推定システム及び屋根勾配推定方法
JP4810403B2 (ja) 情報処理装置、情報処理方法
KR101770866B1 (ko) 뎁스 센서와 촬영 카메라 간의 관계에 기초한 뎁스 영상 보정 장치 및 방법
CN116866522B (zh) 一种远程监控方法
JP3851483B2 (ja) 像点対応付け装置、像点対応付け方法、および像点対応付けプログラムを格納した記録媒体
JP4999010B2 (ja) 自由視点映像生成方式及び記録媒体
CN108731644B (zh) 基于铅直辅助线的倾斜摄影测图方法及其***
GB2560243B (en) Apparatus and method for registering recorded images.

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980146982.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09829108

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20117011938

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09829108

Country of ref document: EP

Kind code of ref document: A1