WO2016157349A1 - Shape measurement method and device for same - Google Patents

Shape measurement method and device for same

Info

Publication number
WO2016157349A1
WO2016157349A1 (PCT/JP2015/059832)
Authority
WO
WIPO (PCT)
Prior art keywords
shape
measurement
image
unit
illumination
Prior art date
Application number
PCT/JP2015/059832
Other languages
French (fr)
Japanese (ja)
Inventor
敦史 谷口
渡辺 正浩
達雄 針山
啓晃 笠井
Original Assignee
株式会社日立製作所 (Hitachi, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 (Hitachi, Ltd.)
Priority to PCT/JP2015/059832 priority Critical patent/WO2016157349A1/en
Priority to JP2017508875A priority patent/JPWO2016157349A1/en
Publication of WO2016157349A1 publication Critical patent/WO2016157349A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 7/00: Tracing profiles

Definitions

  • the present invention relates to a shape measuring method and apparatus for measuring a three-dimensional shape of a measurement object optically in a non-contact manner.
  • Patent Document 2 discloses a tube inner surface shape measuring apparatus that three-dimensionally measures the surface shape of a tube inner surface. It comprises light irradiation means that irradiates the tube inner surface by moving a light spot in the circumferential direction along the inner surface; imaging means that images the surface shape of the tube inner surface irradiated by the light irradiation means; moving means that moves the light irradiation means and the imaging means in the axial direction of the tube; position coordinate detection means that detects the position coordinates of each means when moved by the moving means; and computation means that computes the three-dimensional surface shape of the tube inner surface from the two-dimensional image captured by the imaging means and the position coordinates detected by the position coordinate detection means.
  • Patent Document 1 describes an apparatus that calculates the amount of movement of a moving body from distance measurement data and image data and measures shape data around the moving body. However, because that patent is aimed at map making and at moving the moving body, its measurement accuracy is low.
  • Patent Document 2 describes a high-precision vertical cross-section measuring device, but the movement amount of the moving body is calculated by odometry, so the error is large; moreover, the device is large-scale and has low versatility.
  • The present invention separates a color image, acquired while moving the imaging means along the measurement target, into a plurality of images, processes each separated image, and then integrates the results by SLAM technology. It thus relates to a shape measuring method and apparatus that enable versatile shape measurement over a wide range. Embodiments of the present invention will be described below with reference to the drawings.
  • High-precision, high-density shape information can be acquired by applying a different shape measurement method to each of the R, G, and B channels acquired by a single color camera.
  • A first embodiment of the present invention will be described with reference to FIGS. 1 to 10.
  • FIG. 1 shows a configuration of a shape measuring apparatus 1000 according to the present embodiment.
  • The shape measuring apparatus 1000 includes a measuring head 10 that measures the shape of the measurement object 1, a processing unit 500 that processes the output signal from the measuring head 10, and a control unit 600 that controls the entire apparatus.
  • the measuring head 10 that measures the shape of the measuring object 1 includes a measuring unit 100 and an illumination unit 130.
  • The measuring unit 100 comprises a ring beam generating unit 110, which includes a laser light source 111 and a cone mirror 112 to form a ring-shaped beam, and an image detection unit 120, which includes an imaging lens 121, a special transmission filter 122, and a color camera 123.
  • the illumination unit 130 has an illumination function that uniformly illuminates the measurement object 1.
  • the ring beam is adjusted so as to be irradiated in a plane perpendicular to the traveling direction of the measuring head 10.
  • a portion where the ring beam 115 generated from the ring beam generator 110 is irradiated onto the measurement object 1 is imaged by the color camera 123 via the imaging lens 121 and the special transmission filter 122.
  • The measuring head 10 performs imaging with the color camera 123 while being moved by a transport mechanism or by the operator's hand.
  • the distortion of the imaging lens 121 is calibrated in advance.
  • A wide-angle lens is used to capture a wide area at once; a fisheye lens can be used when an even wider field of view is required.
  • The transport mechanism is configured to travel on a wire stretched inside the measurement object 1.
  • The processing unit 500 includes: an image generation unit 501 that processes the output from the color camera 123 of the measuring unit 100 to generate a two-dimensional image; a cross-sectional shape measuring unit 502 that calculates a cross-sectional shape from the image generated by the image generation unit 501; a movement amount calculation unit 503 that calculates the movement amount of the measuring head 10 from the image generated by the image generation unit 501; a position/orientation and shape calculation unit 504 that uses SLAM (Simultaneous Localization and Mapping) technology to simultaneously calculate the shape of the measurement target and the position and orientation of the measuring head from the cross-sectional shape calculated by the cross-sectional shape measuring unit 502 and the movement amount calculated by the movement amount calculation unit 503; a comparison unit 505 that compares the overall shape data calculated by the position/orientation and shape calculation unit 504 with design data and calculates the difference; an abnormality extraction unit 506 that extracts portions where that difference exceeds a preset reference as abnormal values; and a display unit 507 that displays the overall shape data.
  • the control unit 600 controls the measurement head 10 and the processing unit 500 in order to measure the three-dimensional shape of the measurement target 1.
  • a measurement flow using the shape measuring apparatus 1000 is shown in FIG.
  • Measurement conditions such as the moving speed, acceleration/deceleration, image acquisition rate, and laser output of the measuring head 10 are set in the control unit 600 from an input device (not shown) (S100); under the control of the control unit 600, measurement by the measuring unit 100 is started and the movement of the measuring head 10 begins (S101).
  • The measuring head is moved by a transport mechanism or by the operator's hand. As the moving means, a stage, suspension from a wire, a UAV (Unmanned Aerial Vehicle), or the like can be used.
  • the image of the measuring object 1 is acquired by the color camera 123 (S102), and the image data is transferred to the processing unit 500 (S103).
  • Image data transfer between the measurement head 10 and the processing unit 500 is performed by wired or wireless transfer.
  • Shape information and the measurement head movement amount are calculated from each acquired image stored in the processing unit 500 (S104), and shape and position/orientation information are simultaneously identified by SLAM processing (S105). Steps S102 to S105 are repeated until the measurement is complete; then the movement of the measuring head 10 is stopped and image acquisition ends (S106).
  • a feature of the measurement unit 100 in the present embodiment is that a special transmission filter 122 is used between the lens 121 and the color camera 123.
  • FIG. 3 shows the transmittance characteristic 122a of the special transmission filter 122, overlaid with the RGB (Red, Green, Blue) transmittance characteristics 123a to 123c of the color camera 123. The horizontal axis indicates wavelength, and the vertical axis indicates transmittance. As shown in FIG. 3, the transmittance 122a of the special transmission filter 122 has the following characteristics relative to the RGB transmittance characteristics 123a to 123c that the color camera 123 exhibits when the special transmission filter 122 is not used.
  • The laser emitted from the laser light source 111, used in the ring beam generation unit 110 to form the laser line 115a, lies in the G wavelength band 123b. The transmittance 122a of the special transmission filter 122 is high at the laser wavelength within the G band 123b, blocks the other wavelengths of the G band 123b, and transmits the R wavelength band 123a and the B wavelength band 123c. Further, as shown in FIG. 4, the illumination light spectrum 130a emitted from the illumination unit 130 does not include the G-band wavelength used to acquire the laser line 115a, but does contain wavelengths in the B and R bands.
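The passband logic described above can be sketched as a simple wavelength test. The sketch below is an idealized model, not the actual filter curve; the 532 nm laser wavelength, 5 nm window, and G-band limits are hypothetical illustrative values.

```python
def special_filter_transmits(wavelength_nm,
                             laser_nm=532.0, laser_bw=5.0,
                             g_band=(500.0, 565.0)):
    """Idealized special-transmission-filter model: pass everything
    outside the G band, but inside the G band pass only a narrow
    window around the laser wavelength. All numbers are illustrative."""
    in_g = g_band[0] <= wavelength_nm <= g_band[1]
    near_laser = abs(wavelength_nm - laser_nm) <= laser_bw / 2
    return (not in_g) or near_laser

print(special_filter_transmits(532.0))  # laser line passes: True
print(special_filter_transmits(550.0))  # rest of G band blocked: False
print(special_filter_transmits(630.0))  # R band passes: True
```

This is why, after the filter, the G channel sees only the laser line while the R and B channels see the broadband illumination.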
  • Images acquired using the optical elements having the above-described transmittance characteristics are shown schematically in FIGS. 5A to 5D.
  • As shown in FIG. 5A, the acquired color image 551 contains the laser line 115a generated where the ring beam 115 strikes the measurement target 1; since the special transmission filter 122 transmits part of the B and R wavelength regions, not only the laser line 115a but also the area illuminated by the surrounding illumination unit 130 is imaged.
  • Images obtained by decomposing the color image shown in FIG. 5A into its RGB components are shown in FIGS. 5B to 5D.
  • Because the illumination emitted from the illumination unit 130 contains no G-band component, in the image formed using only the detection signals of the G wavelength region shown in FIG. 5C (hereinafter referred to as the G image), only the laser line 115a is captured.
  • In the image formed using only the detection signals of the B wavelength region 123c shown in FIG. 5B (hereinafter referred to as the B image) and in the image formed using only the R-band detection signals (hereinafter referred to as the R image), the laser line 115a does not appear; instead, an image of the measurement object 1 within the field of view of the color camera 123 is captured. Details will be described later in the description of the processing unit. From the G image shown in FIG. 5C, a point cloud representing the cross-sectional shape 115b of the measurement target 1 along the laser line 115a, shown in FIG. 6, can be calculated. (Processing unit) The processing unit 500 calculates the shape of the measurement object 1 from the images captured by the measurement unit 100 using SLAM technology. The shape calculation method will be described with reference to the flow of FIG. 7 and to FIGS. 8A to 8C, 9A to 9B, 10A, 10B, and 11.
  • The image of the measurement object 1 acquired by the color camera 123 of the measuring unit 100 is decomposed into its RGB components (S200).
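The decomposition in S200 is, in principle, a per-pixel channel separation. A minimal Python sketch, using nested lists as a stand-in for an image (a real implementation would use a camera SDK or an image library):

```python
def split_rgb(image):
    """Split an H x W image of (r, g, b) tuples into three
    single-channel images (G carries the laser line; R and B
    carry the broadband-illuminated scene)."""
    r = [[px[0] for px in row] for row in image]
    g = [[px[1] for px in row] for row in image]
    b = [[px[2] for px in row] for row in image]
    return r, g, b

# Toy 1x3 "image": one bright-green pixel where the laser line falls.
image = [[(10, 0, 12), (8, 255, 9), (11, 0, 14)]]
r_img, g_img, b_img = split_rgb(image)
print(g_img)  # the G image isolates the laser line: [[0, 255, 0]]
```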
  • From the G image, in which the laser line 115a formed on the measurement object 1 by the G-band ring beam generated by the ring beam generator 110 is captured, the cross-sectional shape is calculated using the light-section (light cutting) method (S201).
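The light-section principle behind S201 can be illustrated with the simplest triangulation geometry (a hypothetical setup, not the patent's exact optics): if a beam runs parallel to the camera's optical axis at lateral offset b, a surface point on it images at pixel offset u = f·b/z, so depth is recovered as z = f·b/u.

```python
def light_section_depth(u_px, focal_px, baseline_mm):
    """Depth of a laser spot by triangulation: the beam runs parallel
    to the optical axis at lateral offset `baseline_mm`, and images at
    pixel offset `u_px` from the principal point (pinhole model)."""
    if u_px <= 0:
        raise ValueError("spot must be offset from the principal point")
    return focal_px * baseline_mm / u_px

# A spot imaged 50 px off-axis with f = 1000 px and a 100 mm offset:
z = light_section_depth(50, 1000, 100.0)
print(z)  # 2000.0 mm
```

Sweeping this per-pixel computation along the detected laser line yields the cross-section point cloud.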
  • The amount of movement of the measuring head 10 is calculated using one or both of the R and B components.
  • A feature quantity image 802, in which the feature points shown in FIG. 8B have been extracted, is calculated from each of two temporally successive R images 801 such as that shown in FIG. 8A (S202).
  • As the feature quantity, edge extraction, SIFT (Scale-Invariant Feature Transform) features, or the like can be used. The same feature point pairs 803 are recognized between the two images (S203), and from the distances between the feature point pairs extracted from the images, the amount of rotation and translation that satisfies each feature point pair 803 is calculated.
  • The stereo shape data 804 shown in FIG. 8C are calculated from the movement amount of the measuring head 10 and the feature quantities using the motion stereo method (S204).
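The movement-amount step (S203 and S204) amounts to finding the rotation and translation that best map one frame's feature points onto the next. A minimal 2-D sketch using a centroid-aligned least-squares fit (a simplified stand-in for the full 3-D estimation; point coordinates are illustrative):

```python
import math

def estimate_rigid_2d(src, dst):
    """Least-squares 2-D rotation + translation mapping src points to
    dst (Kabsch/Procrustes fit on matched feature point pairs)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Accumulate the 2x2 cross-covariance of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x -= csx; y -= csy; u -= cdx; v -= cdy
        sxx += x * u; sxy += x * v; syx += y * u; syy += y * v
    theta = math.atan2(sxy - syx, sxx + syy)   # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)             # translation after rotation
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

# Points rotated by 30 degrees and shifted by (5, -2):
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
a = math.radians(30)
dst = [(math.cos(a)*x - math.sin(a)*y + 5, math.sin(a)*x + math.cos(a)*y - 2)
       for x, y in src]
theta, tx, ty = estimate_rigid_2d(src, dst)
print(round(math.degrees(theta), 6), round(tx, 6), round(ty, 6))
# approximately: 30.0 5.0 -2.0
```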
  • For the B image, an image slightly different from the R image is obtained owing to the difference in color; the movement amount can be calculated by applying the same processing as for the R image.
  • The position/orientation and shape calculation unit calculates the position and orientation of the measuring head 10 and the shape of the measurement target with high accuracy (S205).
  • SLAM technology can be used in the position/orientation and shape calculation unit; as the SLAM technique, a method using a Kalman filter, a method using a particle filter, or the like can be applied.
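As an illustration of the Kalman-filter idea mentioned above, here is a minimal scalar predict/update cycle fusing a commanded movement with an image-based position estimate. All numbers are illustrative; a real SLAM back-end would track a full pose-and-map state rather than one coordinate.

```python
def kalman_step(x, p, u, q, z, r):
    """One scalar Kalman cycle: predict with movement u (variance q),
    then correct with measured position z (variance r)."""
    x_pred, p_pred = x + u, p + q          # predict
    k = p_pred / (p_pred + r)              # Kalman gain
    x_new = x_pred + k * (z - x_pred)      # update with measurement
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
# Head commanded to move 10 mm; image-based estimate says 10.4 mm total.
x, p = kalman_step(x, p, u=10.0, q=0.5, z=10.4, r=0.5)
print(round(x, 3), round(p, 3))  # 10.3 0.375
```

Note how the fused estimate lands between prediction and measurement, with reduced uncertainty.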
  • the entire shape data is acquired by repeating the processing from S201 to S205 while acquiring an image.
  • S201 is performed on the images acquired while the measuring head 10 is moving, and from the plurality of G images 901 shown in FIG. 9A, cross-sectional shape data 902 are acquired as a point cloud.
  • A plurality of movement amounts and stereo shape data are likewise measured from the R and B images, and these data are successively integrated by the position/orientation and shape calculation unit in S205, yielding the overall shape data 1001 of the measurement target shown in FIG. 10 (S206).
  • The comparison unit 505 compares the overall shape data calculated in S206 with the design data of the measurement target (S207); the abnormality extraction unit 506 extracts portions where the difference between the overall shape data 1001 and the design data exceeds a preset reference as abnormal values (S208), and the positional information of the extracted abnormal values is displayed on the display unit 507, overlaid on the overall shape data 1001 (S209).
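The comparison and abnormality-extraction steps (S207 and S208) reduce to thresholding the point-wise deviation between measured and design data. A minimal sketch, assuming the two point sets are already registered point-for-point (the values below are hypothetical):

```python
def extract_anomalies(measured, design, threshold):
    """Return indices where |measured - design| exceeds the preset
    reference (corresponding points assumed already registered)."""
    return [i for i, (m, d) in enumerate(zip(measured, design))
            if abs(m - d) > threshold]

measured = [100.0, 100.2, 103.5, 99.9]   # e.g. radii along one section, mm
design   = [100.0, 100.0, 100.0, 100.0]
print(extract_anomalies(measured, design, threshold=1.0))  # [2]
```

The returned indices would then be mapped back to positions on the overall shape data for display.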
  • Example 2 of the present invention will be described with reference to FIGS. 11 to 15.
  • The difference from the first embodiment is that there are two ring beams for the light-section method, with different wavelengths and irradiation positions; accordingly, the transmission characteristics of the special transmission filter 1222 and the spectrum of the illumination 1230 also differ.
  • Other configurations are the same as the configuration of the shape measuring apparatus 1000 according to the first embodiment described with reference to FIG.
  • FIG. 11 shows an outline of the measuring unit 1200 according to the present embodiment.
  • The ring beam generators 1210 and 1213 irradiate the measurement target 1 with the ring beam 115 described in the first embodiment and with a ring beam 116 irradiated from a position different from that of the ring beam 115.
  • the wavelength of the ring beam 115 is near the center of the G band, and the wavelength of the ring beam 116 is near the center of the R band.
  • The ring beam generator 1210 combines a laser light source 1211, which emits a collimated laser with a wavelength in the G band, with a cone mirror 1212 that forms a ring-shaped beam; the ring beam generator 1213 combines a laser light source 1214, which emits a collimated laser with a wavelength in the R band, with a cone mirror 1215 that forms a ring-shaped beam.
  • the ring beams 115 and 116 are both adjusted to be irradiated in a cross section perpendicular to the traveling direction of the measuring head 10.
  • The transmission characteristics of the special transmission filter 1222 are shown in FIG. 12.
  • The special transmission filter 1222 in this embodiment is designed to have a high transmittance in the B band and, within the G and R bands, a high transmittance only at the wavelengths emitted by the laser light sources 1211 and 1214.
  • FIG. 13 shows a spectrum 1230a of illumination light irradiated from the illumination unit 1230 to the measurement object 1.
  • The spectrum 1230a is matched to the B band 1223c of the color camera 123, so an image can be acquired using light with wavelengths in the B band.
  • FIGS. 14A to 14C schematically show images obtained using the optical elements having the transmittance characteristics as described above.
  • As shown in FIGS. 14A and 14B, the acquired images contain the laser lines 116a and 115a generated by the ring beams on the measurement target 1.
  • As shown in FIG. 14C, since the special transmission filter 1222 transmits part of the B wavelength region, the region illuminated by the surrounding illumination unit 1230 is also imaged.
  • Fig. 15 shows the measurement flow.
  • The difference from FIG. 7 is that in S301 the cross-sectional shape is calculated from each of the R and G images, and in S302 the feature quantity image is calculated from the B image.
  • FIG. 16 shows a modification.
  • It shows the case where the ring beam from the ring beam generator 1213 is irradiated at a tilt from the plane perpendicular to the traveling direction of the measuring head 10.
  • In this case, the cross-sectional shape measured from the acquired image includes information along the traveling direction of the measuring head 10, which improves accuracy when calculating the overall shape data.
  • With this arrangement, one ring beam can detect areas that, for the other ring beam, are blind spots arising from the positional relationship between the beam irradiation position and the camera, so the detection range is expanded.
  • By performing stereo measurement using an image acquired with B-band light, measurement with fewer blind spots is possible even with a single camera.
  • FIG. 17 shows a measurement unit 1700.
  • In the measurement unit 1700, random dot illumination is performed in the R band, the light-section method is performed in the G band, and the stereo method is performed in the B band.
  • the random dot illumination unit 1740 irradiates the measurement target 1 with illumination light having an R band wavelength.
  • The transmission characteristics of the special transmission filter 1722 are the same as those of the special transmission filter 1222 described with reference to FIG. 12, and the spectrum of the illumination unit 1530 is the same as the characteristic shown in FIG. 13.
  • three different shape measuring methods can be independently implemented for each of the RGB bands of the color camera 123.
  • Random dot illumination by the random dot illumination unit 1740 can be used even when the surface of the measurement object 1 has few features.
  • In the light-section method, the ring beam generated by the ring beam generation unit 1710 measures one cross section with high accuracy per measurement, whereas the shape measurement method using random dot illumination illuminates a wide area and therefore has a wide measurement range.
  • Pattern illumination can be used instead of random dot illumination. In this case as well, high-precision, wide-range measurement is possible because the shortcomings of each shape measurement method are mutually compensated.
  • FIG. 18 shows the configuration of the measurement unit 1700.
  • The difference from the configuration of the first embodiment shown in FIG. 1 is that a cone mirror 125 is disposed on the optical axis of the color camera 123.
  • a plane perpendicular to the traveling direction of the measuring head 10 is imaged by the cone mirror 125.
  • FIG. 19 shows an example of acquiring an R image.
  • The central portion 8402 is the part imaged by the cone mirror; because a plane orthogonal to the traveling direction is imaged on the optical axis of the camera, the travel information of the measuring head 10 is imaged without distortion, and the overall shape data can be calculated with high accuracy.
  • The size of the cone mirror is designed so that the light-section line in the G image does not overlap the imaging range of the cone mirror.
  • DESCRIPTION OF SYMBOLS: 1 … measurement object; 10 … measuring head; 100, 1200, 1700 … measuring unit; 110, 1210, 1213, 1710 … ring beam generating unit; 111, 1211, 1214 … laser light source; 112, 1212, 1215, 125 … cone mirror; 115, 116 … ring beam; 115a, 116a … laser line; 120 … image detection unit; 121 … imaging lens; 122, 1222, 1722 … special transmission filter; 122a … transmittance; 123 … color camera; 123a to 123c, 1223a to 1223c … transmittance characteristics; 130, 1230, 1530 … illumination unit; 130a, 1230a … illumination light spectrum; 500 … processing unit; 501 … image generation unit; 502 … cross-sectional shape measuring unit; 503 … movement amount calculation unit; 504 … position/orientation and shape calculation unit; 505 … comparison unit; 506 … abnormality extraction unit; 507 … display unit; 551 … color image; 600 … control unit; 801 … R image; 802 … feature quantity image; 803 … feature point pair; 804 … stereo shape data; 901 … G image; 902 … cross-sectional shape data; 1000 … shape measuring apparatus; 1001 … overall shape data; 1740 … random dot illumination unit; 8402 … central portion

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The method and device of the present invention for measuring the shape of a measurement object are configured so that a plurality of illuminating lights emitted from a plurality of light sources are radiated to a measurement object, an image of a region of the measurement object irradiated by the plurality of illuminating lights is taken by a color camera, the color image of an inspection object obtained by imaging by the color camera is separated into a plurality of images in accordance with wavelength, different image processing is performed for each of the separated plurality of images and a cross-sectional shape and a movement amount are calculated, and overall shape data are calculated from the movement amount and the cross-sectional shape calculated from the different images.

Description

Shape measuring method and apparatus
The present invention relates to a shape measuring method and apparatus for optically measuring the three-dimensional shape of a measurement object in a non-contact manner.
Various methods have been proposed for optically measuring the three-dimensional shape of a measurement object in a non-contact manner.
Patent Document 1 describes an apparatus for controlling a moving body that moves within an environment while estimating its state, using a map representing the environment, the apparatus comprising: (i) storage means for storing the map; (ii) state estimation data storage/processing means that sets initial values of state estimation data for estimating the state of the moving body, stores them in the storage means, and updates the stored state estimation data; (iii) distance measuring means that sets a plurality of measurement points in the environment and, every unit time, stores in the storage means ranging data obtained by measuring the positional relationship between the position of the moving body obtained from the state estimation data and each of the plurality of measurement points; (iv) image capture means that captures part of the environment as image data into the storage means every unit time; (v) residual calculation means that creates a plurality of composite data associating the ranging data with the image data and calculates residuals between the composite data; and (vi) moving means that moves the moving body and outputs the amount of movement; wherein the state estimation data storage/processing means updates the state estimation data using the output of the moving means and the residuals calculated by the residual calculation means, the distance measuring means updates the map using the ranging data, and the moving means adjusts the movement of the moving body using the updated state estimation data and the map.
Patent Document 2 discloses a tube inner surface shape measuring apparatus that three-dimensionally measures the surface shape of a tube inner surface, comprising: light irradiation means that irradiates the tube inner surface by moving a light spot in the circumferential direction along the inner surface; imaging means that images the surface shape of the tube inner surface irradiated by the light irradiation means; moving means that moves the light irradiation means and the imaging means in the axial direction of the tube; position coordinate detection means that detects the position coordinates of each means when moved by the moving means; and computation means that computes the three-dimensional surface shape of the tube inner surface from the two-dimensional image captured by the imaging means and the position coordinates detected by the position coordinate detection means.
JP 2011-48706 A
JP 2006-64690 A
Patent Document 1 describes an apparatus that calculates the amount of movement of a moving body from distance measurement data and image data and measures shape data around the moving body; however, because that patent is aimed at map making and at moving the moving body, its measurement accuracy is low.
Patent Document 2 describes a high-precision vertical cross-section measuring device, but the movement amount of the moving body is calculated by odometry, so the error is large; moreover, the device is large-scale and has low versatility.
The present invention solves the above problems of the prior art and provides a versatile shape measuring method and apparatus that enable measurement of large, long, and wide-ranging objects.
To solve the above problems, the present invention provides an apparatus for measuring the shape of a measurement target, comprising: a light irradiation unit including a plurality of light sources that irradiate the measurement target with a plurality of illumination lights; a measurement unit that images the region of the measurement target irradiated with the plurality of illumination lights and separates the acquired color image of the inspection target into a plurality of images according to wavelength before outputting them; a processing unit that applies different image processing to each of the separated images and calculates overall shape data; and an output unit that outputs the result of measuring the shape of the measurement target with the measurement unit.
According to the present invention, shape measurement with the versatility to measure large, long, and wide-ranging objects can be performed regardless of the shape or surface state of the target.
FIG. 1 is a block diagram showing the schematic configuration of the shape measuring apparatus according to Example 1 of the present invention.
FIG. 2 is a flowchart showing the measurement process flow of the shape measuring apparatus according to Example 1.
FIG. 3 is a graph showing the transmittance characteristics of the special transmission filter according to Example 1.
FIG. 4 is a graph showing the spectral characteristics of the illumination unit according to Example 1.
FIG. 5 is a schematic diagram of images acquired by the color camera according to Example 1.
FIG. 6 is a schematic diagram of the image obtained as a result of cross-sectional shape calculation from the G image according to Example 1.
FIG. 7 is a flowchart showing the process flow of the processing unit according to Example 1.
FIG. 8 is a schematic diagram of a plurality of R images according to Example 1.
FIG. 9 is a schematic diagram of a plurality of G images according to Example 1.
FIG. 10 is a schematic diagram showing the result of integrating the shape restoration results of the G and R images according to Example 1.
FIG. 11 is a block diagram showing the schematic configuration of the measurement unit according to Example 2.
FIG. 12 is a graph showing the transmittance characteristics of the special transmission filter according to Example 2.
FIG. 13 is a graph showing the spectral characteristics of the illumination unit according to Example 2.
FIG. 14 is a schematic diagram of images acquired by the color camera according to Example 2.
FIG. 15 is a flowchart showing the process flow of the processing unit according to Example 2.
FIG. 16 is a block diagram showing the schematic configuration of the measurement unit according to Example 3.
FIG. 17 is a perspective view showing the schematic configuration of the measurement unit according to Example 4.
FIG. 18 is a schematic diagram of a plurality of R images according to Example 4.
FIG. 19 is a schematic diagram of an R image according to Example 4.
The present invention relates to a shape measuring method, and an apparatus therefor, in which a color image acquired while moving an imaging means along a measurement target is separated into a plurality of images, each separated image is processed, and the results are then integrated by SLAM technology, thereby enabling versatile shape measurement of large, long, and wide-ranging objects.

Embodiments of the present invention are described below with reference to the drawings.
In the present invention, high-precision, high-density shape information is obtained by applying a different shape measurement technique to each of the R, G, and B channels acquired by a single color camera. Embodiment 1 of the present invention is described with reference to FIGS. 1 to 10.
FIG. 1 shows the configuration of a shape measuring apparatus 1000 according to the present embodiment. The interior of a railcar body structure is assumed as the measurement target 1. The shape measuring apparatus 1000 according to the present embodiment includes a measuring head 10 that measures the shape of the measurement target 1, a processing unit 500 that processes output signals from the measuring head 10, and a control unit 600 that controls the whole apparatus.
The measuring head 10, which measures the shape of the measurement target 1, includes a measurement unit 100 and an illumination unit 130. The measurement unit 100 is composed of a ring beam generation unit 110, which includes a laser light source 111 and a cone mirror 112 and forms a ring-shaped beam, and an image detection unit 120, which includes an imaging lens 121, a special transmission filter 122, and a color camera 123. The illumination unit 130 uniformly illuminates the measurement target 1. The ring beam is adjusted so as to be projected in a plane perpendicular to the traveling direction of the measuring head 10.
The region where the ring beam 115 emitted from the ring beam generation unit 110 strikes the measurement target 1 is imaged by the color camera 123 through the imaging lens 121 and the special transmission filter 122. The measuring head 10 captures images with the color camera 123 while being moved by a transport mechanism or carried by an operator. The distortion of the imaging lens 121 is calibrated in advance. A wide-angle lens is used when a wide area is to be captured at once; a fisheye lens can be used when an even wider field of view is required. The transport mechanism may, for example, travel along a wire stretched through the measurement target 1.
The processing unit 500 includes: an image generation unit 501 that processes the output of the color camera 123 of the measurement unit 100 to generate two-dimensional images; a cross-sectional shape measurement unit 502 that calculates a cross-sectional shape from the images generated by the image generation unit 501; a movement amount calculation unit 503 that calculates the amount of movement of the measuring head 10 from those images; a position/orientation and shape calculation unit 504 that uses SLAM (Simultaneous Localization and Mapping) technology to simultaneously calculate the shape of the measurement target and the position and orientation of the measuring head, based on the cross-sectional shapes calculated by the cross-sectional shape measurement unit 502 and the movement amounts calculated by the movement amount calculation unit 503; a comparison unit 505 that compares the overall shape data calculated by the position/orientation and shape calculation unit 504 with design data and calculates the differences; an abnormality extraction unit 506 that extracts, as abnormal values, locations where the difference between the overall shape data and the design data is equal to or greater than a predetermined reference value; and a display unit 507 that displays on screen the overall shape data calculated by the position/orientation and shape calculation unit 504 together with the abnormal values extracted by the abnormality extraction unit 506.
The control unit 600 controls the measuring head 10 and the processing unit 500 in order to measure the three-dimensional shape of the measurement target 1.
FIG. 2 shows the measurement flow using the shape measuring apparatus 1000. First, measurement conditions such as the moving speed, acceleration/deceleration, image acquisition rate, and laser output of the measuring head 10 are set in the control unit 600 from an input device (not shown) (S100). Under the control of the control unit 600, measurement by the measurement unit 100 is started and the measuring head 10 starts moving (S101). The measuring head is moved either by a transport mechanism or by hand by an operator; usable transport mechanisms include a stage, a carriage suspended from a wire, and a UAV (Unmanned Aerial Vehicle). While the measuring head 10 is moving, images of the measurement target 1 are acquired by the color camera 123 (S102), and the image data are transferred to the processing unit 500 (S103), either by wire or wirelessly. Shape information and the movement amount of the measuring head are calculated from each acquired image stored in the processing unit 500 (S104), and the shape and the position/orientation information are identified simultaneously by SLAM processing (S105). Steps S102 to S105 are repeated until the measuring head is commanded to stop, at which point the movement of the measuring head 10 is stopped and image acquisition is terminated (S106). Although the flow of FIG. 2 calculates the shape every time image data are acquired, the shape can also be calculated in post-processing from the images stored in the processing unit 500.

Details of each part are described below.

(Measurement unit)

A feature of the measurement unit 100 in the present embodiment is the special transmission filter 122 placed between the lens 121 and the color camera 123. FIG. 3 shows the transmittance characteristic 122a of the special transmission filter 122, overlaid with the transmittance characteristics 123a to 123c of the R, G, and B (Red, Green, Blue) channels of the color camera 123. The horizontal axis indicates wavelength and the vertical axis indicates transmittance. As shown in FIG. 3, relative to the RGB transmittance characteristics 123a to 123c of the color camera 123 used without the special transmission filter 122, the transmittance 122a of the special transmission filter 122 is designed as follows. The laser emitted from the laser light source 111 of the ring beam generation unit 110, which forms the laser line 115a, lies in the G wavelength band 123b; the filter has high transmittance at the laser wavelength within the G band 123b, blocks the other wavelengths of the G band 123b, and transmits the R band 123a and the B band 123c. In addition, as shown in FIG. 4, the spectrum 130a of the illumination light emitted from the illumination unit 130 does not include wavelengths in the G band, in which the laser line 115a is acquired, but contains wavelengths in the B and R bands.
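The acquisition loop of S101 to S106 can be sketched in code. The following is a minimal illustration only: the function arguments (`compute_cross_section`, `compute_motion`, `slam_update`) are hypothetical stand-ins for the cross-sectional shape measurement unit 502, the movement amount calculation unit 503, and the position/orientation and shape calculation unit 504, not the actual implementation.

```python
def measure(frames, compute_cross_section, compute_motion, slam_update):
    """Run the S102-S105 loop over a stream of color frames.

    frames                -- iterable of (R, G, B) channel images (S102)
    compute_cross_section -- G image -> cross-section points (light section, S201)
    compute_motion        -- (previous, current) R/B images -> head motion (S202-S204)
    slam_update           -- fuse cross-section and motion into the map (S205)
    """
    shape_map, prev = [], None
    for r, g, b in frames:                    # S102: image acquisition
        section = compute_cross_section(g)    # S201: light section on the G image
        motion = compute_motion(prev, r) if prev is not None else None
        shape_map = slam_update(shape_map, section, motion)  # S205: SLAM fusion
        prev = r
    return shape_map                          # overall shape data (S206)
```

Because the three processing steps are passed in as functions, the same loop structure covers both on-line calculation and the post-processing variant mentioned above.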
Images acquired using optical elements with the transmittance characteristics described above are shown schematically in FIGS. 5A to 5D. As shown in FIG. 5A, the acquired color image 551 contains the laser line 115a produced where the ring beam 115 strikes the measurement target 1; and because the special transmission filter 122 transmits part of the B and R wavelength ranges, not only the laser line 115a but also the surrounding area illuminated by the illumination unit 130 is imaged.
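To make the channel handling concrete, the sketch below separates a color frame into its R, G, and B planes, extracts the laser line from the G plane as the brightest pixel per column, and converts it to a cross-section by intersecting each camera ray with the laser plane. The pinhole model, the plane parameters, and the per-column peak detection are simplifying assumptions for illustration only; the actual apparatus would use its own calibration.

```python
import numpy as np

def split_channels(color):
    """Separate an H x W x 3 image into its R, G, and B planes (S200)."""
    return color[..., 0], color[..., 1], color[..., 2]

def laser_line_rows(g, threshold=50):
    """For each column of the G image, return the row of peak laser
    intensity, or -1 where the laser line is absent (below threshold)."""
    rows = g.argmax(axis=0)
    rows[g.max(axis=0) < threshold] = -1
    return rows

def light_section_points(rows, K_inv, plane_n, plane_d):
    """Intersect the camera ray of each laser pixel with the known laser
    plane n.x = d, giving one 3-D point per image column (light section)."""
    pts = []
    for u, v in enumerate(rows):
        if v < 0:
            continue
        ray = K_inv @ np.array([u, v, 1.0])  # ray direction in camera frame
        t = plane_d / (plane_n @ ray)        # scale at which the ray hits the plane
        pts.append(t * ray)
    return np.array(pts)
```

With a calibrated inverse intrinsic matrix `K_inv` and the laser plane from calibration, the returned points form the cross-sectional point cloud 115b.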
FIGS. 5B to 5D show the color image of FIG. 5A decomposed into its R, G, and B components. Because the illumination emitted from the illumination unit 130 contains no wavelength components in the G band, as shown in FIG. 4, the image formed from only the detection signal of the G wavelength band shown in FIG. 5C (hereinafter, G image) captures almost nothing but the laser line 115a. In contrast, the image formed from only the detection signal of the B wavelength band 123c shown in FIG. 5B (hereinafter, B image) and the image formed from only the detection signal of the R wavelength band 123a shown in FIG. 5D (hereinafter, R image) do not show the laser line 115a but capture the measurement target 1 within the field of view of the color camera 123. As described in detail later in (Processing unit), the point cloud representing the cross-sectional shape 115b of the measurement target 1 along the laser line 115a, shown in FIG. 6, can be calculated from the G image of FIG. 5C by the light-section method.

(Processing unit)

The processing unit 500 calculates the shape of the measurement target 1 from the images captured by the measurement unit 100 using SLAM technology. The shape calculation method is described with reference to the flow of FIG. 7 and FIGS. 8A to 8C, 9A and 9B, and 10. First, the image of the measurement target 1 acquired by the color camera 123 of the measurement unit 100 is decomposed into its RGB components (S200). Next, among the three images obtained, a cross-sectional shape is calculated by the light-section method (S201) from the G image, which captures the laser line 115a formed on the measurement target 1 by the G-band ring beam generated by the ring beam generation unit 110. Next, the movement amount of the measuring head 10 is calculated using one or both of the R and B components. From two temporally consecutive R images 801 (FIG. 8A), feature-point images 802 (FIG. 8B) are computed (S202); edge extraction, SIFT (Scale-Invariant Feature Transform) features, and the like can be used as the features. The same feature-point pairs 803 are identified between the two images (S203), the rotation and translation that satisfy the distances between the extracted feature-point pairs 803 are calculated, and from the movement amount of the measuring head 10 and the feature points, stereo shape data 804 (FIG. 8C) are calculated by the motion stereo method (S204). Although the B image differs slightly from the R image because of the color difference, the movement amount can be calculated from it by the same processing.
Using both the R and B images increases the number of feature-point pairs, and a more accurate movement amount calculation can be expected. Under some measurement conditions, however, a large difference in image quality may arise between the R and B images; in such cases, the movement amount is calculated from only one of the two. Using the cross-sectional shape data calculated in S201 and the movement amount and stereo shape data calculated in S204, the position/orientation and shape calculation unit calculates the position, orientation, and shape of the measuring head 10 with high accuracy (S205). SLAM technology can be used for this position/orientation and shape calculation; SLAM techniques based on a Kalman filter, a particle filter, and so on are applicable. The processing of S201 to S205 is then repeated while acquiring images to obtain the overall shape data. Performing S201 on the images acquired while the measuring head 10 moves yields the multiple G images 901 shown in FIG. 9A, and processing each of them yields the cross-sectional shape data 902 of the measurement target 1 shown in FIG. 9B as a point cloud. Similarly, multiple movement amounts and stereo shape data are measured from the R and B images, and the data are successively integrated by the position/orientation and shape calculation in S205, yielding the overall shape data 1001 of the measurement target shown in FIG. 10 (S206). Next, the comparison unit 505 compares the overall shape data calculated in S206 with the design data of the measurement target and detects the differences (S207); the abnormality extraction unit 506 extracts, as abnormal values, locations where the difference between the overall shape data 1001 and the design data exceeds a preset reference (S208); and the positional information of the extracted abnormal values is displayed on the display unit 507, superimposed on the overall shape data 1001 (S209).
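The rotation and translation sought in S203 and S204 — the rigid motion that best maps one set of matched feature points onto the other — can be obtained in closed form by the Kabsch (orthogonal Procrustes) algorithm. The sketch below is a generic least-squares solution under the assumption that matched 3-D feature positions are available; it is not necessarily the estimator used in the apparatus.

```python
import numpy as np

def rigid_motion(p, q):
    """Least-squares rotation R and translation t with q ~ R p + t,
    from matched point pairs p, q (each N x 3) -- Kabsch algorithm."""
    cp, cq = p.mean(axis=0), q.mean(axis=0)   # centroids
    H = (p - cp).T @ (q - cq)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Applied to consecutive frames, the recovered (R, t) is the per-frame movement of the measuring head 10 that the SLAM stage then refines.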
Embodiment 2 of the present invention is described with reference to FIGS. 11 to 16.
The differences from Embodiment 1 are that there are two ring beams for light sectioning, differing in wavelength and irradiation position, and that the transmission characteristics of the special transmission filter 1222 and the spectrum of the illumination unit 1230 differ accordingly. The rest of the configuration is the same as that of the shape measuring apparatus 1000 of Embodiment 1 described with reference to FIG. 1, and its description is omitted. By obtaining cross-sectional shapes by the light-section method at two different positions, nearly the same location can be measured twice as the measuring head moves, which improves the accuracy of the SLAM processing when calculating the overall shape data.
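The benefit of measuring nearly the same cross-section twice can be illustrated with a simple fusion step: once the two profiles are expressed in a common coordinate frame, corresponding points can be averaged to suppress noise. The nearest-neighbour pairing below is a deliberate simplification, shown only to convey the idea; the actual integration is performed by the SLAM processing.

```python
import numpy as np

def fuse_sections(a, b, max_dist=1.0):
    """Average each point of profile `a` with its nearest neighbour in
    profile `b` (both N x 3, same coordinate frame); points with no
    neighbour within max_dist are kept unchanged."""
    fused = []
    for pt in a:
        d = np.linalg.norm(b - pt, axis=1)   # distances to all points of b
        j = d.argmin()
        fused.append((pt + b[j]) / 2 if d[j] <= max_dist else pt)
    return np.array(fused)
```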
FIG. 11 shows an overview of the measurement unit 1200 according to the present embodiment. In this embodiment, the ring beam generation units 1210 and 1213 irradiate the measurement target 1 with the ring beam 115 described in Embodiment 1 and also with a ring beam 116 emitted from a position different from that of the ring beam 115. The wavelength of the ring beam 115 is near the center of the G band, and that of the ring beam 116 is near the center of the R band. The ring beam generation unit 1210 combines a laser light source 1211, which emits a collimated laser in the G band, with a cone mirror 1212 that forms the ring-shaped beam; the ring beam generation unit 1213 combines a laser light source 1214, which emits a collimated laser in the R band, with a cone mirror 1215 that forms the ring-shaped beam. Both ring beams 115 and 116 are adjusted so as to be projected in planes perpendicular to the traveling direction of the measuring head 10.
FIG. 12 shows the transmission characteristics of the special transmission filter 1222. The special transmission filter 1222 in this embodiment is designed to have high transmittance across the B band and, within the G and R bands, high transmittance only in the wavelength bands of the laser light sources 1211 and 1214.
FIG. 13 shows the spectrum 1230a of the illumination light projected from the illumination unit 1230 onto the measurement target 1. The passband 1230a matches the B band 1223c of the color camera 123, enabling image acquisition with light of B-band wavelengths.
Images acquired using optical elements with the transmittance characteristics described above are shown schematically in FIGS. 14A to 14C. As shown in FIGS. 14A and 14B, the acquired images contain the laser lines 116a and 115a produced where the ring beams strike the measurement target 1. In FIG. 14C, because the special transmission filter 1222 transmits part of the B wavelength range, the surrounding area illuminated by the illumination unit 1230 is also imaged.
FIG. 15 shows the measurement flow. The differences from FIG. 7 are that cross-sectional shapes are calculated from both the R image and the G image in S301, and that the feature-point image is calculated from the B image in S302.
FIG. 16 shows a modification in which the ring beam generation unit 1213 projects its beam at a tilt from the plane perpendicular to the traveling direction of the measuring head 10. In this case, the cross-sectional shape measured from the acquired image also contains information about the traveling direction of the measuring head 10, improving the accuracy of the overall shape data calculation. In addition, by projecting the two ring beams of different wavelengths in different directions, regions that are blind spots for one ring beam, because of its irradiation position or the geometry between the irradiation position and the camera, become detectable by the other, widening the detection range. Furthermore, by also performing stereo measurement using images acquired with B-band light, measurement with even fewer blind spots can be carried out with a single camera.
Embodiment 3 of the present invention is described with reference to FIG. 17.

The difference from Embodiment 1 is that, in addition to the light-section method using ring beam illumination and the stereo method using images, a shape measurement method using random-dot illumination is performed at the same time. FIG. 17 shows the measurement unit 1700. For the R, G, and B bands of the color camera 123, random-dot illumination is used in the R band, the light-section method in the G band, and the stereo method in the B band. The random-dot illumination unit 1740 irradiates the measurement target 1 with illumination light of R-band wavelengths. The transmission characteristics of the special transmission filter 1722 are the same as those of the special transmission filter 1222 of Embodiment 2 described with reference to FIG. 12, and the spectrum of the illumination unit 1530 is the same as the characteristic shown in FIG. 13 for Embodiment 2.
In this embodiment, three different shape measurement methods can be carried out independently in the R, G, and B bands of the color camera 123. Random-dot illumination by the random-dot illumination unit 1740 can be used even when the surface of the measurement target 1 has few features. Whereas the ring beam generated by the ring beam generation unit 1710 for the light-section method measures one cross-section with high accuracy per measurement, the shape measurement method using random-dot illumination illuminates a wide area and therefore has a wide measurement range. By combining these measurement methods and performing them simultaneously, measurement with few blind spots can be carried out with a single camera.
Pattern illumination can also be used instead of random-dot illumination. In that case as well, the shortcomings of the individual shape measurement methods compensate for one another, enabling high-accuracy, wide-range measurement.
Embodiment 4 of the present invention is described with reference to FIGS. 18 and 19.

FIG. 18 shows the configuration of the measurement unit 1700. The difference from FIG. 1 of Embodiment 1 is that a cone mirror 125 is placed on the optical axis of the color camera 123. The cone mirror 125 images the plane orthogonal to the traveling direction of the measuring head 10. FIG. 19 shows an example of an acquired R image. The central portion 8402 is the part imaged via the cone mirror; because the plane orthogonal to the traveling direction is imaged on the optical axis of the camera, the traveling information of the measuring head 10 is imaged without distortion, and the overall shape data can be calculated with high accuracy. The size of the cone mirror is designed so that the light-section line in the G image does not overlap the imaging range of the cone mirror.
The embodiments described above are merely examples of how the present invention may be implemented, and the technical scope of the present invention is not to be construed as limited by them. That is, the present invention can be implemented in various forms without departing from its technical idea or main features.
1 ... measurement target
1000 ... shape measuring apparatus
10 ... measuring head
100, 1700 ... measurement unit
110, 1210, 1213, 1710 ... ring beam generation unit
111, 1211, 1214 ... laser light source
112, 1212, 1215, 125 ... cone mirror
115, 116 ... ring beam
115a, 116a ... laser line
120 ... image detection unit
121 ... imaging lens
122, 1222, 1722 ... special transmission filter
122a ... transmittance
123 ... color camera
123a-123c, 1223a-1223c ... transmittance characteristics
130, 1230, 1530 ... illumination unit
130a, 1230a ... illumination light spectrum
500 ... processing unit
501 ... image generation unit
502 ... cross-sectional shape measurement unit
503 ... movement amount calculation unit
504 ... position/orientation and shape calculation unit
505 ... comparison unit
506 ... abnormality extraction unit
551 ... color image
600 ... control unit
801 ... R image
802 ... feature-point image
803 ... feature-point pair
804 ... stereo shape data
901 ... G image
902 ... cross-sectional shape data
1001 ... overall shape data
1740 ... random-dot illumination unit
8402 ... central portion

Claims (12)

1.  A shape measurement method for measuring the shape of a measurement target while moving, the method comprising:
     irradiating the measurement target with a plurality of illumination lights emitted from a plurality of light sources;
     imaging, with a color camera, a region of the measurement target irradiated with the plurality of illumination lights;
     separating a color image of the measurement target obtained by the color camera into a plurality of images according to wavelength;
     applying different image processing to each of the separated images to calculate cross-sectional shapes and movement amounts; and
     calculating overall shape data from the cross-sectional shapes and the movement amounts calculated from the different images.
2.  The shape measurement method according to claim 1, wherein imaging the region of the measurement target irradiated with the plurality of illumination lights with the color camera comprises imaging with the color camera through an optical filter that blocks light in a specific wavelength range.
3.  The shape measurement method according to claim 1, wherein the separated images are subjected to processing that calculates a cross-sectional shape by a light-section method and processing that calculates a movement amount from the displacement of feature points in the images measured while moving.
4.  The shape measurement method according to claim 1, wherein applying different image processing to each of the separated images comprises applying, to each of the separated images, a different one of a light-section method, a stereo method, a pattern projection method, and a random-dot method.
5.  The shape measurement method according to claim 1, wherein imaging the region of the measurement target irradiated with the plurality of illumination lights with the color camera comprises imaging that region with the color camera at a plurality of positions while moving the color camera along the measurement target.
  6.  The shape measurement method according to claim 1, wherein irradiating the measurement object with the plurality of illumination lights comprises irradiating a relatively wide region of the measurement object with first illumination light emitted from a first light source, and irradiating a relatively narrow region of the measurement object, within the region irradiated with the first illumination light, with second illumination light emitted from a second light source and having a wavelength different from that of the first illumination light.
  7.  An apparatus for measuring the shape of a measurement object, comprising:
      a light irradiation unit comprising a plurality of light sources that irradiate the measurement object with a plurality of illumination lights;
      a measurement unit that separates a color image of the measurement object, acquired by imaging the region irradiated with the plurality of illumination lights by the light irradiation unit, into a plurality of images according to wavelength and outputs them;
      a processing unit that applies different image processing to each of the plurality of images output separately from the image acquisition unit and calculates overall shape data; and
      an output unit that outputs a result of measuring the shape of the measurement object with the measurement unit.
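The apparatus of claim 7 chains a light irradiation unit, a measurement unit that splits the color frame by wavelength, a processing unit that applies a different method per split image, and an output unit. The skeleton below mirrors that division of labor; the class and function names are illustrative, not from the specification:

```python
import numpy as np

class ShapeMeasurementPipeline:
    """Toy pipeline mirroring the unit structure of claim 7: split a
    color frame by wavelength, apply a different reconstruction step
    per channel, and return the per-channel results for fusion.
    """

    def __init__(self, processors):
        # processors: channel index -> function(grayscale image) -> shape data
        self.processors = processors

    def measure(self, color_frame):
        channels = [color_frame[..., i].astype(float) for i in range(3)]
        # "Overall shape data" would fuse these partial results; here we
        # simply return them keyed by channel.
        return {i: proc(channels[i]) for i, proc in self.processors.items()}

def line_profile(img):
    """Light-section style step: laser-row index per column."""
    return np.argmax(img, axis=0)

def mean_level(img):
    """Stand-in for a stereo/feature-based step on the flood-lit channel."""
    return float(img.mean())

frame = np.zeros((8, 4, 3))
frame[3, :, 0] = 1.0                    # red laser line on row 3
pipeline = ShapeMeasurementPipeline({0: line_profile, 1: mean_level})
result = pipeline.measure(frame)
print(result[0].tolist(), result[1])    # [3, 3, 3, 3] 0.0
```

The design choice the claim encodes is that separation happens once, up front, so each downstream method sees only the illumination it was designed for.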
  8.  The shape measurement apparatus according to claim 7, wherein the image acquisition unit comprises an optical filter that blocks light in a specific wavelength range and a color camera that images, through the optical filter, the region of the measurement object irradiated with the plurality of illumination lights.
  9.  The shape measurement apparatus according to claim 7, wherein the plurality of separated images are subjected to processing that calculates a cross-sectional shape by a light-section method and to processing that calculates a travel distance from the displacement of features in the images captured while moving.
  10.  The shape measurement apparatus according to claim 7, wherein the image processing unit applies different image processing to each of the plurality of separated images by applying, to each of the plurality of separated images, image processing using a different one of a light-section method, a stereo method, a pattern projection method, and a random dot method.
  11.  The shape measurement apparatus according to claim 7, further comprising a transport unit that transports the image acquisition unit, wherein imaging, with the image acquisition unit, the region of the measurement object irradiated with the plurality of illumination lights comprises imaging the region irradiated with the plurality of illumination lights by the light irradiation unit at a plurality of positions while the transport unit moves the image acquisition unit along the measurement object.
  12.  The shape measurement apparatus according to claim 7, wherein the illumination unit comprises a first light irradiation unit that irradiates a relatively wide region of the measurement object with first illumination light emitted from a first light source, and a second light irradiation unit that irradiates a relatively narrow region of the measurement object, within the region irradiated with the first illumination light, with second illumination light emitted from a second light source and having a wavelength different from that of the first illumination light.
PCT/JP2015/059832 2015-03-30 2015-03-30 Shape measurement method and device for same WO2016157349A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/059832 WO2016157349A1 (en) 2015-03-30 2015-03-30 Shape measurement method and device for same
JP2017508875A JPWO2016157349A1 (en) 2015-03-30 2015-03-30 Shape measuring method and apparatus

Publications (1)

Publication Number Publication Date
WO2016157349A1 (en)

Family

ID=57006595

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006064690A (en) * 2004-07-26 2006-03-09 Univ Of Miyazaki In-tubing profile measuring device
WO2012172870A1 (en) * 2011-06-14 2012-12-20 日産自動車株式会社 Distance measurement device and environment map generation apparatus
JP2014115277A (en) * 2012-11-13 2014-06-26 Hitachi Ltd Shape measurement system
JP2014173901A (en) * 2013-03-06 2014-09-22 Canon Inc Measurement device and article manufacturing method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
DE102011077564B4 (en) * 2011-06-15 2016-08-25 Sirona Dental Systems Gmbh Method for the optical three-dimensional measurement of a dental object
EP2823252A1 (en) * 2012-03-09 2015-01-14 Galil Soft Ltd System and method for non-contact measurement of 3d geometry


Cited By (5)

Publication number Priority date Publication date Assignee Title
WO2019198562A1 (en) * 2018-04-11 2019-10-17 富士フイルム株式会社 Structure management device, structure management method, and structure management program
JPWO2019198562A1 (en) * 2018-04-11 2021-04-30 富士フイルム株式会社 Structure management device, structure management method, and structure management program
US11915411B2 (en) 2018-04-11 2024-02-27 Fujifilm Corporation Structure management device, structure management method, and structure management program
JP2021022369A (en) * 2019-07-25 2021-02-18 パロ アルト リサーチ センター インコーポレイテッド System and method for automated surface evaluation
JP7489240B2 (en) 2019-07-25 2024-05-23 パロ アルト リサーチ センター,エルエルシー Systems and methods for automated surface evaluation - Patents.com

Also Published As

Publication number Publication date
JPWO2016157349A1 (en) 2017-07-13

Similar Documents

Publication Publication Date Title
EP3531066B1 (en) Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner
US10782126B2 (en) Three-dimensional scanning method containing multiple lasers with different wavelengths and scanner
CN108802043B (en) Tunnel detection device, tunnel detection system and tunnel defect information extraction method
JP2017531558A5 (en)
JP5116537B2 (en) Tire appearance inspection device
TWI542893B (en) Distance measuring device
US10430940B2 (en) Inspection system and inspection method
CN104567728A (en) Laser vision profile measurement system, measurement method and three-dimensional target
CN111735413A (en) Three-dimensional shape measuring device
US10794687B2 (en) Shape measurement system and shape measurement method
CN103983207A (en) Three-dimensional scanning endoscope and three-dimensional scanning method
WO2016157349A1 (en) Shape measurement method and device for same
KR101913705B1 (en) Method and apparatus for measuring depth of materials attached to cylinder using line laser
JP2008180630A (en) Fluid measuring system, fluid measuring method and computer program
JP5278878B2 (en) Pipe inner surface shape measuring device
WO2018066270A1 (en) Method and device for measuring external shape of railroad vehicle
WO2016103492A1 (en) Shape measurement method and device
KR101706627B1 (en) Distance measurement device by using stereo image and method thereof
US20230070281A1 (en) Methods and systems of generating camera models for camera calibration
JP6650753B2 (en) Image acquisition system and image acquisition method
KR101465996B1 (en) Method for measurement of high speed 3d shape using selective long period
JP2018059835A5 (en)
JP2012177596A (en) Holder for calibration and method for calibrating photo cutting type shape measuring instrument
WO2017145270A1 (en) Image processing apparatus, image processing method, and endoscope
JP6371742B2 (en) Measuring device and acquisition method

Legal Events

Code	Description
121	EP: the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 15887502; country of ref document: EP; kind code of ref document: A1.
ENP	Entry into the national phase. Ref document number: 2017508875; country of ref document: JP; kind code of ref document: A.
NENP	Non-entry into the national phase. Ref country code: DE.
122	EP: PCT application non-entry in European phase. Ref document number: 15887502; country of ref document: EP; kind code of ref document: A1.