JP2009180689A - Three-dimensional shape measuring apparatus - Google Patents

Three-dimensional shape measuring apparatus

Info

Publication number
JP2009180689A
JP2009180689A (application JP2008022204A)
Authority
JP
Japan
Prior art keywords
light
unit
imaging
image data
inspection object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008022204A
Other languages
Japanese (ja)
Inventor
Hiroshi Aoki
洋 青木
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2008022204A priority Critical patent/JP2009180689A/en
Publication of JP2009180689A publication Critical patent/JP2009180689A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a three-dimensional measuring apparatus capable of improving measurement accuracy through accurate phase computation.

SOLUTION: The three-dimensional measuring apparatus (100) includes: a projection optical unit (23) for shaping light output from a light source into a prescribed pattern light and projecting it onto an object under inspection; a scanning unit (22) for phase-shifting the prescribed pattern light projected onto the object; an imaging optical unit (31) for forming the pattern image generated by the scanning projection of the prescribed pattern light onto the object; an imaging unit (32) for capturing the pattern image formed by the imaging optical unit; and a control unit (40) for correcting the image data and determining shape information of the object on the basis of the corrected image data.

COPYRIGHT: (C)2009, JPO&INPIT

Description

The present invention relates to a three-dimensional measuring apparatus based on a pattern-projection phase-shift method for measuring the three-dimensional shape of an object.

In general, the grating pattern projection method using phase shifting is well known as a technique for measuring the three-dimensional shape of an object without contact. Specifically, the method projects a grating pattern whose illuminance distribution is sinusoidal onto the object under inspection to obtain a pattern image, then shifts the grating pattern laterally in steps of, for example, π/2 to obtain four images with different phases. From the four pattern images, the phase of the grating at each pixel is determined, the shape of the object is computed from that phase information, and the height of the object is obtained.
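The four-image procedure described above can be sketched numerically; the result also illustrates why the surface's local brightness and contrast drop out of the recovered phase (a simulation under the standard sinusoidal intensity model, not code from the patent):

```python
import numpy as np

def recover_phase(images):
    i1, i2, i3, i4 = images
    # Four-step formula: phi = arctan((I4 - I2) / (I1 - I3));
    # arctan2 resolves the quadrant from the two signs.
    return np.arctan2(i4 - i2, i1 - i3)

def shoot(offset, amplitude, phi):
    # Four exposures of I_k = offset + amplitude * cos(phi + k*pi/2).
    return [offset + amplitude * np.cos(phi + k * np.pi / 2) for k in range(4)]

phi = 0.7  # true grating phase at this pixel (radians)
bright = recover_phase(shoot(200.0, 80.0, phi))  # bright, high-contrast spot
dark = recover_phase(shoot(30.0, 5.0, phi))      # dark, low-contrast spot
# Both recover the same phase despite very different surface properties.
```

Because offset and amplitude cancel in the differences I4 − I2 and I1 − I3, both calls return 0.7 rad, which is the invariance the method relies on.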

In the grating pattern projection method, even if the surface properties (contrast, inclination, etc.) at a given position on the object change, the relative illuminance difference produced by the grating pattern shifted in π/2 steps always changes by exactly the phase difference of the grating pattern. Therefore, to determine the grating phase at each pixel and measure the three-dimensional shape with high accuracy, the capture of the pattern image by the imaging unit is critical. The performance of imaging units varies widely: low-cost CCD sensors are sometimes used to reduce the price of the apparatus, and CMOS sensors are also used where low power consumption is a selling point.
JP-A-2005-214653

The luminance values acquired by the imaging unit are used to calculate the phase of the grating pattern, and since the accuracy of those luminance values directly affects the final result, it is important to obtain the luminance values output by the imaging unit accurately. Objects under inspection range from low-reflectance surfaces to high-reflectance ones such as metal, so a wide dynamic range is required. In a low-cost CCD sensor, the luminance value may not be accurately linear with respect to the amount of light actually received. In a CMOS sensor, a device with a LOG (logarithmic conversion) characteristic may be used to secure a wide dynamic range.

When an imaging unit lacking such linearity is used, the phase calculation accuracy drops, and using the grating-pattern images directly degrades the height accuracy of the measured object.
Accordingly, an object of the present invention is to provide a three-dimensional measuring apparatus that improves measurement accuracy through accurate phase calculation.

A three-dimensional measuring apparatus according to a first aspect includes: a projection optical unit that shapes light output from a light source into a predetermined pattern light and projects it onto an object under inspection; a scanning unit that phase-shifts the predetermined pattern light projected onto the object; an imaging optical unit that forms the pattern image generated by the scanning projection of the predetermined pattern light onto the object; an imaging unit that captures the pattern image formed by the imaging optical unit; and a control unit that corrects the image data and determines shape information of the object on the basis of the corrected image data.
With this configuration, the correction unit of the three-dimensional measuring apparatus corrects the image data output from the imaging unit, and shape information of the object is determined on the basis of the corrected image data. This improves the measurement accuracy of the object's shape.

According to the present invention, a three-dimensional measurement method and apparatus that improve measurement accuracy when performing three-dimensional measurement based on the phase-shift method can be provided.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
<Configuration of the three-dimensional shape measuring apparatus 100>

FIG. 1 is an explanatory diagram showing the configuration of a three-dimensional shape measuring apparatus 100 according to the embodiment.
The three-dimensional shape measuring apparatus 100 consists broadly of a stage 11 on which the object SA is placed, a grating pattern projection mechanism 20 that projects a grating pattern toward the stage 11, an imaging mechanism 30, and a control unit 40, for example a personal computer, that controls the operation of the grating pattern projection mechanism 20 and the imaging mechanism 30.

The grating pattern projection mechanism 20 consists of a light source 21 that emits illumination light, a pattern-forming scanning unit 22 that imposes a predetermined grating pattern on the illumination light collimated parallel to the optical axis and scans that pattern, and a projection optical system 23 that projects the grating pattern light. The projection optical system 23 is a combination of a plurality of lenses. The pattern-forming scanning unit 22 is formed of, for example, a liquid-crystal panel and shifts the grating pattern in steps of π/2. The grating pattern light formed by illuminating the pattern-forming scanning unit 22 with light from the light source 21 is projected onto the object SA through the projection optical system 23. By setting the liquid-crystal panel of the pattern-forming scanning unit 22 to a fully transmissive state, the light emitted from the light source 21 can also be applied to the object SA without being blocked by the panel.

The imaging mechanism 30 consists of an imaging optical system 31 that forms an image of the grating pattern light reflected from the object SA under the grating pattern light projected by the grating pattern projection mechanism 20, and an imaging device 33 equipped with an image sensor 32, such as a CCD or CMOS sensor, that captures the grating pattern light and outputs image data.

The control unit 40 further includes a phase control unit 41 that controls the pattern-forming scanning unit 22 of the grating pattern projection mechanism 20. The control unit 40 also includes an arithmetic processing unit 42 that processes the grating-pattern images captured by the imaging device 33, a storage unit 43 that stores various parameters and the captured image data, and a display unit 49 that displays data and results. In addition, the control unit 40 includes a correction unit 44 that corrects the image data output from the image sensor 32 and an illuminance control unit 45 that varies the illuminance (light quantity) of the light source 21.

The phase control unit 41 controls the pattern-forming scanning unit 22 to shift the grating pattern in the direction perpendicular to the optical axis of the projection optical system 23. The pattern light transmitted through the pattern-forming scanning unit 22 is projected by the projection optical system 23 onto the object SA on the stage 11. The projected pattern has a brightness that varies sinusoidally (in a slit-like fashion). The angle at which the optical axis of the grating pattern projection mechanism 20 is inclined relative to the optical axis of the imaging mechanism 30 is adjustable.

The pattern light reflected by the object SA is focused by the imaging optical system 31 of the imaging mechanism 30 onto the light-receiving surface of the image sensor 32 in the imaging device 33. The reflected pattern light is photoelectrically converted by the image sensor 32, yielding image data of the grating pattern as deformed by the object SA. The grating pattern of the pattern-forming scanning unit 22 is then shifted in phase by, for example, π/2 steps, and the reflected pattern light containing the deformed grating pattern is imaged at each step. The image data captured at each shift is stored in the storage unit 43 of the control unit 40.

These image data are images of the object SA captured at each phase; connecting adjacent images yields a complete image of the object SA. At the same time, imaging parameters such as the incident angle, the grating pitch, and the number of phase-shift steps are stored in the storage unit 43.

<Relationship between the amount of light received by the image sensor 32 and the luminance value>
FIG. 2 is a graph showing the relationship between the amount of light received by the image sensor 32 and the luminance value. In (a) to (c), the horizontal axis is the actual amount of light received by the image sensor 32, and the vertical axis is the luminance value the image sensor 32 outputs. In (a) to (c), the dotted line LL indicates ideal linearity, the solid line indicates the luminance value Idnx (gradation) from the image sensor 32, and the two-dot chain line CC indicates the correction coefficient.

FIG. 2(a) shows an example in which the image sensor 32 is a CMOS sensor with an expanded dynamic range. This CMOS sensor has a knee characteristic that changes its sensitivity once the amount of light reaches a predetermined level.

In region A of FIG. 2(a), the relationship between the light quantity Ix actually received by the CMOS sensor and the luminance value Idnx (gradation) can be expressed as Idnx = α1 × Ix. In region B, the relationship can be expressed as Idnx = α2 × Ix + β1 (α1 > α2). Thus there is no consistent linearity between the actual light quantity Ix and the luminance value Idnx. Consequently, if a highly reflective material such as metal is present on part of the object SA, errors arise in the computation of the phase distribution φ(x, y) or in the phase-connection computation of the phases φ(x, y), and the height information h(x, y) of the object SA cannot be determined accurately.

Therefore, a correction coefficient CC is applied to the pixel outputs of the CMOS sensor that fall in region B. For example, the corrected luminance value CIdnx (gradation) can be computed as CIdnx = (α1/α2) × (Idnx − β1), so that CIdnx = α1 × Ix holds over the entire range.
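A sketch of this knee correction: inverting the region-B relation Idnx = α2·Ix + β1 and rescaling by α1 restores the region-A line. The numeric values of α1, α2, β1 and the knee point below are illustrative, not from the patent:

```python
import numpy as np

def correct_knee(idnx, alpha1, alpha2, beta1, knee_dn):
    """Restore linearity for a knee-characteristic sensor.

    Below the knee (region A) the sensor is already linear:
    Idnx = alpha1 * Ix. Above it (region B) Idnx = alpha2 * Ix + beta1,
    so CIdnx = alpha1/alpha2 * (Idnx - beta1) recovers alpha1 * Ix.
    """
    idnx = np.asarray(idnx, dtype=float)
    return np.where(idnx <= knee_dn, idnx, alpha1 / alpha2 * (idnx - beta1))

# Illustrative response: region A Idnx = 2*Ix up to Ix = 100 (knee at
# Idnx = 200), then region B Idnx = 0.5*Ix + 150 (continuous at the knee).
a1, a2, b1, knee = 2.0, 0.5, 150.0, 200.0
raw = 0.5 * 300.0 + 150.0                        # sensor output for Ix = 300
corrected = correct_knee(raw, a1, a2, b1, knee)  # -> 600.0 (= 2 * 300)
```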

FIG. 2(b) shows another example in which the image sensor 32 is a CMOS sensor with an expanded dynamic range. This CMOS sensor adds a logarithmic conversion circuit to each pixel, converting the incident light into a quantity of electrons proportional to the logarithm of the integrated light quantity, thereby obtaining a high dynamic range (a LOG characteristic). The relationship between the light quantity Ix received by a CMOS sensor with this LOG characteristic and the luminance value Idnx (gradation) can be expressed as Idnx = A × log(Ix), where A is a constant. The actual light quantity Ix and the luminance value Idnx of this CMOS sensor are therefore not linearly related.

Therefore, a correction coefficient CC is applied to the pixel output of the CMOS sensor. For example, the corrected luminance value CIdnx (gradation) can be computed as CIdnx = B × e^(Idnx/A), where B is a constant.
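A sketch of this logarithmic correction, assuming a natural-log response Idnx = A·ln(Ix) so that the inverse is CIdnx = B·e^(Idnx/A) (A and B are illustrative constants):

```python
import math

def correct_log(idnx, A, B=1.0):
    # Idnx = A * ln(Ix)  =>  Ix = exp(Idnx / A); the scale factor B places
    # the corrected value CIdnx = B * e^(Idnx/A) on the desired linear axis.
    return B * math.exp(idnx / A)

# With A = 10, a light quantity Ix = e^3 yields Idnx = 30; the correction
# recovers a value proportional to Ix.
ix = math.exp(3.0)
idnx = 10.0 * math.log(ix)   # sensor output: 30.0
assert abs(correct_log(idnx, A=10.0) - ix) < 1e-9
```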

FIG. 2(c) shows an example in which individual pixels of a CMOS or CCD image sensor 32 lack linearity. Strictly speaking, the sensitivity characteristics vary from pixel to pixel. This variation is corrected so that each pixel exhibits accurate linearity. A polynomial for generating the correction coefficient CC may be stored in the storage unit 43, or the individual correction coefficients CC may be stored in the storage unit 43 as a lookup table or the like.

Next, the relationship between the amount of light received by the image sensor 32 and the luminance value will be described.
The procedure for calculating the correction coefficient CC of the image sensor 32 will be described with reference to the reference sample SP and the flowchart shown in FIG. 3.

In step S31, a reference sample SP that is flat and has uniform reflectance is prepared and placed on the stage 11.
In step S32, the phase control unit 41 controls the pattern-forming scanning unit 22 to form a fully transmissive pattern that does not block any light from the light source 21.

In step S33, the illuminance control unit 45 illuminates the transmissive pattern while varying the illuminance (light quantity) of the light source 21 stepwise or continuously. The actual light quantity Ix of the light source 21 set by the illuminance control unit 45 is measured with an illuminometer SS placed on the stage 11.
In step S34, the luminance value Idnx from the image sensor 32 is stored in the storage unit 43 for each light quantity Ix.

In step S35, the arithmetic processing unit 42 computes the correction coefficient CC from the relationship between the light quantity Ix and the luminance value Idnx from the image sensor 32. The correction coefficient CC may be stored per pixel of the image sensor 32, stored as a polynomial depending on the capacity of the storage unit 43, or stored per block of several pixels.
In step S36, the correction coefficient CC is stored in the storage unit 43.
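The calibration of steps S31–S36 amounts to building an inverse response map from measured (Ix, Idnx) pairs. A minimal sketch, with the knee-characteristic sensor response simulated rather than measured, and a lookup table standing in for the stored correction coefficient CC:

```python
import numpy as np

# Steps S33/S34: sweep the light quantity Ix with the illuminance control
# unit and record the sensor's luminance value Idnx at each setting.
# The knee-characteristic response is simulated here, not measured.
ix = np.linspace(1.0, 400.0, 50)
idnx = np.where(ix <= 100.0, 2.0 * ix, 0.5 * ix + 150.0)

# Step S35: build the correction as a lookup table that maps sensor
# output Idnx back onto the ideal linear response (here 2 * Ix). The
# patent notes a fitted polynomial could be stored instead.
lut_in, lut_out = idnx, 2.0 * ix   # lut_in is monotonically increasing

def apply_cc(raw_dn):
    # Step S36 stores the table; applying it later interpolates between
    # the calibration points.
    return np.interp(raw_dn, lut_in, lut_out)

# A region-B reading of 300 DN (produced by Ix = 300) maps back to 600.
```

Because `np.interp` requires a monotonically increasing sample axis, this sketch relies on the simulated response being continuous and increasing through the knee.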

The measurement of the reference sample SP can be performed at any time by a user or service engineer at the installation site. Alternatively, the measurement can be performed at the factory before shipment and the correction coefficient CC stored in the storage unit 43 in advance.

In the image sensor 32, the relationship between light quantity and luminance value may vary with color, such as red, blue, or green. For this reason, the sensitivity characteristics of the image sensor 32 may be measured for each color. That is, while FIG. 3 uses a flat reference sample SP with uniform reflectance, reference samples SP of the principal colors may additionally be prepared, and a correction coefficient CC acquired for each color and stored in the storage unit 43 in advance.

Next, measurement by the three-dimensional shape measuring apparatus 100 configured as described above will be described.
FIG. 4 is a flowchart showing the measurement method of the three-dimensional shape measuring apparatus 100.

In step S51, the object SA is placed on the stage 11.
In step S52, the phase control unit 41 controls the pattern-forming scanning unit 22 to form a grating pattern, the light from the light source 21 illuminates it, and the grating pattern is projected onto the object SA.
In step S53, the image sensor 32 receives the light of the grating pattern in the first phase state reflected from the object SA, and the storage unit 43 stores the image data of the grating pattern on the object SA.

In step S54, it is determined whether the storage unit 43 holds image data for all required phases. If so, the process proceeds to step S56; if not, it proceeds to step S55. For example, with π/2 phase steps, four images (I1, I2, I3, I4) are acquired covering 0 to 2π. Examples of the four images are shown on the right side of the flowchart.
In step S55, the phase control unit 41 controls the pattern-forming scanning unit 22 to shift the phase by, for example, π/2, and the process returns to step S52 to project the shifted grating pattern.

In step S56, the linearity of the image data is corrected using the linearity correction coefficient CC stored in the storage unit 43. The corrected image data (CI1, CI2, CI3, CI4) are stored in the storage unit 43 again.

In step S57, the linearity-corrected image data of the object SA are read from the storage unit 43 into the arithmetic processing unit 42. The arithmetic processing unit 42 computes the phase φ(x, y) for each pixel (x, y). An example of the computed phase is shown on the right side of the flowchart.
φ(x, y) = arctan{(CI4 − CI2) / (CI1 − CI3)}
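Step S57's per-pixel computation can be sketched as follows; `np.arctan2` is used in place of a plain arctangent so the quadrant is resolved from the signs of numerator and denominator (a sketch on synthetic corrected images, not the patent's implementation):

```python
import numpy as np

def phase_map(ci1, ci2, ci3, ci4):
    # phi(x, y) = arctan{(CI4 - CI2) / (CI1 - CI3)}, evaluated per pixel
    # on the linearity-corrected images CI1..CI4 from step S56.
    return np.arctan2(ci4 - ci2, ci1 - ci3)

# Synthetic corrected images for a known phase field phi.
phi = np.linspace(-np.pi / 2, np.pi / 2, 16).reshape(4, 4)
a, b = 120.0, 40.0
ci = [a + b * np.cos(phi + k * np.pi / 2) for k in range(4)]
recovered = phase_map(*ci)
assert np.allclose(recovered, phi)
```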

Next, in step S58, the phase distributions φ(x, y) of the pixels are assembled and phase connection (phase unwrapping) of the phases φ(x, y) is performed. An example after phase connection is shown on the right side of the flowchart.
In step S59, after phase connection, the result is converted into the height information h(x, y) of the object SA, which is displayed on the display unit 49. An example of the height information h(x, y) is shown on the right side of the flowchart.
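Steps S58 and S59 can be sketched as follows; the phase-to-height scale factor, which in practice depends on imaging parameters such as the grating pitch and incident angle, is replaced here by a single assumed constant:

```python
import numpy as np

# Step S58: the per-pixel phases are only known modulo 2*pi; phase
# connection (unwrapping) removes the 2*pi jumps along a row.
wrapped = np.angle(np.exp(1j * np.linspace(0.0, 6.0 * np.pi, 50)))
unwrapped = np.unwrap(wrapped)          # restores the continuous ramp

# Step S59: convert unwrapped phase to height h(x, y). For fringe
# projection the scale is set by the grating pitch and the projection
# incident angle; the single constant used here is an illustrative
# stand-in, not a value from the patent.
PHASE_TO_HEIGHT = 0.05                  # mm of height per radian (assumed)
h = PHASE_TO_HEIGHT * unwrapped
```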

Alternatively, after the image data (I1, I2, I3, I4) are acquired in step S54, the correction may be applied to each pixel's luminance value at the point where φ(x, y) = arctan{(I4 − I2)/(I1 − I3)} is computed. That is, the correction may be performed during the phase calculation without first storing the corrected image data of step S56.

As described above, before three-dimensional measurement by the phase-shift method, the sensitivity characteristic of the image sensor 32 — the relationship between the light quantity it receives and the luminance value it outputs — is acquired in advance and stored in the storage unit 43. The image data for each phase are then corrected so as to ensure linearity, the phase φ(x, y) is computed for each pixel, and phase connection is performed.

Accordingly, since linearity correction is performed using the correction coefficient CC derived from the sensitivity characteristic relating the actual light quantity received by the image sensor 32 to the luminance value it outputs, the phase is obtained accurately and the error in the computed height is reduced. Linearity correction can also be applied per color. Without complicating the hardware, the apparatus achieves improved measurement accuracy through accurate phase calculation with a simple configuration.

Although the pattern-forming scanning unit 22 has been described as, for example, a liquid-crystal panel, it may instead be a transparent glass plate carrying a light-shielding film such as chromium, or a metal plate provided with slit openings.

FIG. 1 is an explanatory diagram showing the configuration of the three-dimensional shape measuring apparatus 100 according to the embodiment.
FIG. 2 is a graph showing the relationship between the amount of light received by the image sensor 32 and the luminance value.
FIG. 3 is a diagram and flowchart for acquiring the correction coefficient CC.
FIG. 4 is a flowchart showing the measurement method of the three-dimensional shape measuring apparatus 100.

Explanation of symbols

11 … Stage
20 … Grating pattern projection mechanism
21 … Light source
22 … Pattern-forming scanning unit
23 … Projection optical system
24 … Projection optical system
30 … Imaging mechanism
31 … Imaging optical system
32 … Image sensor
33 … Imaging device
40 … Control unit
41 … Phase control unit
42 … Arithmetic processing unit
43 … Storage unit
49 … Display unit
44 … Correction unit
45 … Illuminance control unit
100 … Three-dimensional shape measuring apparatus
CC … Correction coefficient
SA … Object under inspection
SP … Reference sample
SS … Illuminometer

Claims (5)

1. A three-dimensional shape measuring apparatus comprising:
a projection optical unit that shapes light output from a light source into a predetermined pattern light and projects it onto an object under inspection;
a scanning unit that phase-shifts the predetermined pattern light projected onto the object;
an imaging optical unit that forms a pattern image generated by scanning projection of the predetermined pattern light onto the object;
an imaging unit that captures the pattern image formed by the imaging optical unit; and
a control unit that corrects the image data and determines shape information of the object on the basis of the corrected image data.

2. The three-dimensional shape measuring apparatus according to claim 1, wherein the control unit comprises:
a correction unit that corrects the image data output from the imaging unit;
a storage unit that takes in the image data corrected by the correction unit and stores the image data of the pattern image on the object for each phase shift by the scanning unit; and
an arithmetic processing unit that determines shape information of the object on the basis of the image data stored in the storage unit.

3. The three-dimensional shape measuring apparatus according to claim 2, wherein the correction unit acquires in advance the relationship between the amount of light received by the imaging unit and the luminance value on the basis of the sensitivity characteristic of the imaging unit, and corrects the image data so that the characteristic line representing the relationship between the amount of light and the luminance value becomes a straight line.

4. The three-dimensional shape measuring apparatus according to claim 3, wherein sensitivity characteristics representing the relationship between the luminance value and the amounts of light of different colors received by the imaging unit are acquired in advance, and the image data are corrected so that the characteristic line representing the relationship between the amount of light and the luminance value becomes a straight line.

5. The three-dimensional shape measuring apparatus according to any one of claims 1 to 4, wherein the correction unit changes the correction value of the characteristic line in accordance with a temperature change of the environment in which the imaging unit is placed.
JP2008022204A 2008-02-01 2008-02-01 Three-dimensional shape measuring apparatus Pending JP2009180689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008022204A JP2009180689A (en) 2008-02-01 2008-02-01 Three-dimensional shape measuring apparatus


Publications (1)

Publication Number Publication Date
JP2009180689A true JP2009180689A (en) 2009-08-13

Family

ID=41034771

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008022204A Pending JP2009180689A (en) 2008-02-01 2008-02-01 Three-dimensional shape measuring apparatus

Country Status (1)

Country Link
JP (1) JP2009180689A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03289505A (en) * 1990-04-06 1991-12-19 Nippondenso Co Ltd Three-dimensional shape measuring apparatus
JPH0719812A (en) * 1993-07-03 1995-01-20 Omron Corp Measuring instrument
JPH10210360A (en) * 1997-01-27 1998-08-07 Minolta Co Ltd Digital camera
JP2000292130A (en) * 1999-04-07 2000-10-20 Minolta Co Ltd Three-dimensional information input camera
JP2001074411A (en) * 1999-09-01 2001-03-23 Nikon Corp Interferometer
JP2005214653A (en) * 2004-01-27 2005-08-11 Olympus Corp Three-dimensional shape measuring method and its apparatus


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8681217B2 (en) 2010-07-21 2014-03-25 Olympus Corporation Inspection apparatus and measurement method
US8704890B2 (en) 2010-08-19 2014-04-22 Olympus Corporation Inspection apparatus and measuring method
JP2012093235A (en) * 2010-10-27 2012-05-17 Nikon Corp Three-dimensional shape measurement device, three-dimensional shape measurement method, structure manufacturing method, and structure manufacturing system
EP2770295A4 (en) * 2011-10-11 2015-07-29 Nikon Corp Shape-measuring device, system for manufacturing structures, shape-measuring method, method for manufacturing structures, shape-measuring program
CN103857981A (en) * 2011-10-11 2014-06-11 株式会社尼康 Shape-measuring device, system for manufacturing structures, shape-measuring method, method for manufacturing structures, shape-measuring program
WO2013054814A1 (en) 2011-10-11 2013-04-18 株式会社ニコン Shape-measuring device, system for manufacturing structures, shape-measuring method, method for manufacturing structures, shape-measuring program
US9891043B2 (en) 2011-10-11 2018-02-13 Nikon Corporation Profile measuring apparatus, structure manufacturing system, method for measuring profile, method for manufacturing structure, and non-transitory computer readable medium
CN103575234A (en) * 2012-07-20 2014-02-12 德律科技股份有限公司 Three-dimensional image measuring device
CN103575234B (en) * 2012-07-20 2016-08-24 德律科技股份有限公司 3-dimensional image measurement apparatus
JP2015232478A (en) * 2014-06-09 2015-12-24 株式会社キーエンス Inspection device, inspection method, and program
CN105651203A (en) * 2016-03-16 2016-06-08 广东工业大学 High-dynamic-range three-dimensional shape measurement method for self-adaptation fringe brightness
CN105651203B (en) * 2016-03-16 2018-09-04 广东工业大学 A kind of high dynamic range 3 D measuring method of adaptive striped brightness
WO2019180899A1 (en) * 2018-03-23 2019-09-26 株式会社日立ハイテクノロジーズ Appearance inspection device

Similar Documents

Publication Publication Date Title
JP5123522B2 (en) 3D measurement method and 3D shape measurement apparatus using the same
JP2009180689A (en) Three-dimensional shape measuring apparatus
JP5140761B2 (en) Method for calibrating a measurement system, computer program, electronic control unit, and measurement system
JP6532325B2 (en) Measuring device for measuring the shape of the object to be measured
JP5375201B2 (en) 3D shape measuring method and 3D shape measuring apparatus
JP2008096439A (en) Method and device for determining three-dimensional coordinates of object
JP2016503509A (en) 3D scanner and operation method
US20150233707A1 (en) Method and apparatus of measuring the shape of an object
WO2004046645A2 (en) Fast 3d height measurement method and system
JP2015045587A (en) Three-dimensional image processor, method of determining change in state of three-dimensional image processor, program for determining change in state of three-dimensional image processor, computer readable recording medium, and apparatus having the program recorded therein
JP2020139869A (en) Measuring device, calculating method, system and program
JP2007322162A (en) Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
WO2018163530A1 (en) Three-dimensional shape measurement device, three-dimensional shape measurement method, and program
KR101445831B1 (en) 3D measurement apparatus and method
JP2012053015A (en) Visual inspection device and visual inspection method
US10803623B2 (en) Image processing apparatus
JP2009036589A (en) Target for calibration and device, method and program for supporting calibration
JP4516949B2 (en) Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
JP2012093235A (en) Three-dimensional shape measurement device, three-dimensional shape measurement method, structure manufacturing method, and structure manufacturing system
JP2008145139A (en) Shape measuring device
JP2014060549A (en) Illuminance output device, luminance output device and image projection device
JP4797109B2 (en) Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
TWI622755B (en) Method for measuring surface profile
JP2006084286A (en) Three-dimensional measuring method and its measuring device
JP4962852B2 (en) Shape measuring method and shape measuring apparatus

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110131

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110323

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120517

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120528

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20121001