WO2009096422A1 - Three-dimensional shape measurement device, method, and program - Google Patents

Three-dimensional shape measurement device, method, and program

Info

Publication number
WO2009096422A1
Authority
WO
WIPO (PCT)
Prior art keywords
focus
pixel
measure
dimensional shape
focus position
Prior art date
Application number
PCT/JP2009/051346
Other languages
English (en)
Japanese (ja)
Inventor
Masaya Yamaguchi
Original Assignee
Nikon Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation filed Critical Nikon Corporation
Priority to JP2009551537A priority Critical patent/JP5218429B2/ja
Publication of WO2009096422A1 publication Critical patent/WO2009096422A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Definitions

  • the present invention relates to a three-dimensional shape measurement apparatus and method, and a program, and more particularly, to a three-dimensional shape measurement apparatus and method, and a program that improve the measurement accuracy of a three-dimensional shape.
  • an SFF (Shape-From-Focus) method is known as a method for measuring the three-dimensional shape of an object using a two-dimensional image (see, for example, Non-Patent Document 1).
  • the SFF method is an effective method when the surface of the measurement object has a texture.
  • In the SFF method, a plurality of images with different focal positions are captured while moving the imaging device relative to the measurement object, and a differential operation is performed on each pixel of the obtained images to calculate a focus measure indicating the degree of focus. The three-dimensional shape of the measurement object is then measured based on the focal position at which the focus measure of each pixel is maximized.
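The SFF pipeline just described — a focus measure per pixel per image, then a per-pixel argmax over focal positions — can be sketched as follows. This is a generic illustration, not the patent's exact procedure: the absolute-Laplacian focus measure used here is a common textbook choice, whereas the patent's own effective-pixel measure is introduced later.

```python
import numpy as np

def focus_measure_stack(stack):
    """Per-pixel focus measure for a stack of grayscale images
    (shape: N x H x W), using the absolute 4-neighbour Laplacian as
    a generic contrast measure."""
    fm = np.empty(stack.shape, dtype=float)
    for i, img in enumerate(stack):
        p = np.pad(img.astype(float), 1, mode="edge")  # edge-padded copy
        lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
               + p[1:-1, 2:] - 4.0 * p[1:-1, 1:-1])
        fm[i] = np.abs(lap)
    return fm

def depth_from_focus(stack, focal_positions):
    """For each pixel, pick the focal position that maximises the
    focus measure -- the core of the SFF method."""
    fm = focus_measure_stack(stack)
    best = np.argmax(fm, axis=0)              # index of sharpest image
    return np.asarray(focal_positions)[best]  # H x W depth map
```

A pixel that is sharpest in the k-th image is assigned the k-th focal position; the later sections refine both the measure and the peak localization.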
  • However, in a defocused image, bright areas tend to bloom (spread outward), and this tendency becomes stronger as the amount of defocus increases. For this reason, at boundary portions where the luminance or color of the measurement object changes sharply, an originally dark portion may be captured as bright.
  • As a result, the position at which each pixel is in focus (hereinafter referred to as the in-focus position) may be erroneously detected, and the measurement accuracy may be reduced.
  • the present invention has been made in view of such a situation, and is intended to improve the measurement accuracy of a three-dimensional shape.
  • According to one aspect of the present invention, a three-dimensional shape measurement apparatus calculates a focus measure indicating a degree of focus for each pixel of a plurality of images having different focal positions with respect to a measurement object, and measures the three-dimensional shape of the measurement object by detecting the in-focus position based on the focus measure. The apparatus includes: effective pixel extraction means for extracting, using at least one of luminance, hue, and saturation as a determination value, effective pixels whose determination values differ from the determination value of a target pixel by no more than an allowable range, from among the pixels within a predetermined range in the vicinity of the target pixel used for calculating the focus measure; focus measure calculation means for calculating the focus measure at the target pixel using the extracted effective pixels; and focus position detection means for detecting the focal position at which the focus measure reaches a peak as the in-focus position.
  • According to another aspect of the present invention, a three-dimensional shape measurement method of a three-dimensional shape measurement apparatus calculates a focus measure indicating a degree of focus for each pixel of a plurality of images having different focal positions with respect to a measurement object, and measures the three-dimensional shape of the measurement object by detecting the in-focus position based on the focus measure. The method includes: an effective pixel extraction step of extracting, using at least one of luminance, hue, and saturation as a determination value, effective pixels whose determination values differ from that of the target pixel by no more than an allowable range, from among the pixels within a predetermined range in the vicinity of the target pixel; a focus measure calculation step of calculating the focus measure at the target pixel using the extracted effective pixels; and a focus position detection step of detecting the focal position at which the focus measure reaches a peak as the in-focus position.
  • According to another aspect of the present invention, a program causes a computer to execute processing for measuring the three-dimensional shape of a measurement object by calculating a focus measure indicating a degree of focus for each pixel of a plurality of images having different focal positions with respect to the measurement object and detecting the in-focus position based on the focus measure. The processing includes: an effective pixel extraction step of extracting, using at least one of luminance, hue, and saturation as a determination value, effective pixels whose determination values differ from that of the target pixel by no more than an allowable range, from among the pixels within a predetermined range in the vicinity of the target pixel used for calculating the focus measure; a focus measure calculation step of calculating the focus measure at the target pixel using the extracted effective pixels; and a focus position detection step of detecting the focal position at which the focus measure reaches a peak as the in-focus position.
  • In each aspect described above, at least one of luminance, hue, and saturation is used as a determination value; effective pixels whose determination values differ from that of the target pixel by no more than an allowable range are extracted from the pixels within a predetermined range in the vicinity of the target pixel used for calculating the focus measure; the focus measure at the target pixel is calculated using the extracted effective pixels; and the focal position at which the focus measure reaches a peak is detected as the in-focus position.
  • the measurement accuracy of the three-dimensional shape is improved.
  • 1 three-dimensional shape measurement system, 2 measurement object, 11 imaging device, 12 computer, 51 three-dimensional shape measurement unit, 61 imaging control unit, 62 measurement unit, 71 preprocessing unit, 72 effective pixel extraction unit, 73 focus measure calculation unit, 74 focus position detection unit, 75 three-dimensional shape data generation unit
  • FIG. 1 is a diagram showing an embodiment of a three-dimensional shape measurement system to which the present invention is applied.
  • the three-dimensional shape measurement system 1 of FIG. 1 is configured to include an imaging device 11, a computer 12, and a display 13.
  • The imaging device 11 and the computer 12 are connected via a cable 14, and the computer 12 and the display 13 are connected via a cable 15.
  • the image pickup apparatus 11 schematically shown in FIG. 1 is configured to include at least an optical lens 21 and a photodetector 22 including an image pickup device such as a CCD (Charge Coupled Device).
  • The imaging device 11 captures a plurality of images with different focal positions of the measurement object 2, which is installed on the stage 3 of the measurement microscope, while changing its height (position in the Z-axis direction) relative to the measurement object 2, and the computer 12 measures the three-dimensional shape of the measurement object 2 using the plurality of images.
  • FIG. 2 is a block diagram showing an example of a functional configuration realized when a processor (for example, a CPU (Central Processing Unit), a DSP (Digital Signal Processor), etc.) of the computer 12 executes a predetermined program.
  • the three-dimensional shape measuring unit 51 is realized by the processor of the computer 12 executing a predetermined program.
  • the three-dimensional shape measurement unit 51 includes an imaging control unit 61 and a measurement unit 62.
  • the imaging control unit 61 controls the imaging device 11 to image the measurement object 2 while changing the focal position.
  • the imaging control unit 61 acquires a captured image (hereinafter referred to as an original image) from the imaging device 11 and supplies the acquired image to the preprocessing unit 71 of the measurement unit 62.
  • the imaging control unit 61 notifies the preprocessing unit 71 of the end of imaging when imaging of the measurement object 2 at all focal positions is completed.
  • the measuring unit 62 measures the three-dimensional shape of the measurement object 2 using a plurality of original images having different focal positions.
  • the measurement unit 62 includes a preprocessing unit 71, an effective pixel extraction unit 72, a focus measure calculation unit 73, a focus position detection unit 74, and a three-dimensional shape data generation unit 75.
  • The preprocessing unit 71 performs predetermined preprocessing on the original image and stores the resulting image (hereinafter also referred to as a preprocessed image) in the memory 52. Further, when the end of imaging is notified from the imaging control unit 61, the preprocessing unit 71 transfers the notification to the effective pixel extraction unit 72.
  • The effective pixel extraction unit 72 extracts, from among the pixels within a predetermined range in the vicinity of the target pixel for which the focus measure is to be calculated (hereinafter referred to as the attention range), the effective pixels to be used for calculating the focus measure at the target pixel, using the allowable values of luminance, hue, or saturation set by the user, and extracts the luminance values of the effective pixels from the preprocessed image.
  • the effective pixel extraction unit 72 supplies information indicating the luminance value of the extracted effective pixels to the focus measure calculation unit 73.
  • the focus measure calculation unit 73 calculates the focus measure of the target pixel using the extracted luminance value of the effective pixel, and calculates the calculated focus measure. Is supplied to the in-focus position detector 74.
  • The focus position detection unit 74 detects the in-focus position of the target pixel based on the calculated focus measures, using the luminance tolerance set by the user, and stores the detected in-focus position in the memory 52. Further, when the focus position detection unit 74 has detected the in-focus positions of all pixels, it notifies the three-dimensional shape data generation unit 75 of the end of in-focus position detection.
  • the 3D shape data generation unit 75 generates 3D shape data based on the in-focus position of each pixel stored in the memory 52, and outputs it to the subsequent stage.
  • In step S1, the effective pixel extraction unit 72 and the focus position detection unit 74 acquire various allowable values. Specifically, for example, via an input unit (not shown) of the computer 12, the user sets which of luminance, hue, and saturation to use as the determination value for effective pixel extraction, and inputs to the computer 12 the allowable values of luminance, hue, and saturation to be used in the determination.
  • the effective pixel extraction unit 72 acquires the input allowable value. Note that any one of luminance, hue, and saturation may be used for the determination value, or two or more types may be used in combination.
  • the user inputs an allowable luminance value used for determination of the maximum value of the focus measure to the computer 12 via an input unit (not shown) of the computer 12, and the focus position detection unit 74 receives the allowable value. Get the value.
  • In step S2, the imaging device 11 captures an image of the measurement object 2 under the control of the imaging control unit 61.
  • The imaging device 11 supplies the original image, represented in the three primary colors of RGB and obtained as a result of imaging, to the imaging control unit 61 via the cable 14.
  • the imaging control unit 61 supplies the acquired original image to the preprocessing unit 71.
  • In step S3, the preprocessing unit 71 preprocesses the image. Specifically, the preprocessing unit 71 performs a Fourier transform on the acquired original image, removes low-frequency components at or below a predetermined frequency and noise components within a predetermined frequency range, and then performs an inverse Fourier transform. As a result, an RGB image from which the low-frequency and noise components have been removed and the medium-to-high-frequency components have been extracted (hereinafter referred to as a preprocessed RGB image) is generated from the original image.
  • the preprocessing unit 71 stores the generated preprocessed RGB image in the memory 52. Further, the preprocessing unit 71 converts the generated preprocessed RGB image into an YPbPr image, and stores the converted image (hereinafter referred to as a preprocessed YPbPr image) in the memory 52.
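The two preprocessing operations — a Fourier-domain band-pass that keeps the medium-to-high-frequency components, and the RGB-to-YPbPr conversion — can be sketched as below. The cutoff frequencies are illustrative assumptions (the patent does not specify values), and the conversion uses the standard BT.601 YPbPr coefficients.

```python
import numpy as np

def bandpass(img, low_cut=0.02, noise_band=(0.45, 0.5)):
    """Remove low-frequency components below `low_cut` and a noise
    band (both as fractions of the sampling frequency) in the
    Fourier domain, then inverse-transform. Cutoffs are illustrative."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    r = np.hypot(fy, fx)  # radial spatial frequency of each bin
    keep = (r >= low_cut) & ~((r >= noise_band[0]) & (r <= noise_band[1]))
    return np.real(np.fft.ifft2(F * keep))

def rgb_to_ypbpr(rgb):
    """BT.601 RGB -> YPbPr; Y carries the luminance used for the
    focus measure, Pb/Pr carry the colour information."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    pb = -0.168736 * r - 0.331264 * g + 0.5 * b
    pr = 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, pb, pr], axis=-1)
```

Removing the DC and low-frequency content means a uniformly lit flat surface produces a near-zero preprocessed image, so only texture contributes to the focus measure.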
  • In step S4, the imaging control unit 61 determines whether imaging has been performed at all focal positions. If it is determined that imaging has not been performed at all focal positions, the process proceeds to step S5.
  • In step S5, the imaging control unit 61 changes the focal position. That is, the imaging control unit 61 moves the imaging device 11 in the Z-axis direction so that the focal position takes the next value.
  • Thereafter, the processes of steps S2 to S5 are repeatedly executed until it is determined in step S4 that images have been captured at all focal positions.
  • In this way, N original images are captured while moving the focal position at a predetermined interval from the lowermost end to the uppermost end of the measurement object 2, and a preprocessed RGB image and a preprocessed YPbPr image are generated from each original image and stored in the memory 52.
  • The vertical axis represents the focal position; instead of the focal-position values, an index representing the order in which the focal positions are set is shown on the vertical axis.
  • FIG. 6 shows an example of an original image taken at the kth focal position in FIG.
  • Each square box in FIG. 6 represents a pixel.
  • The region Rf indicates the region in which the imaging device 11 is in focus, the region Rin indicates the region inside the in-focus region, and the region Rout indicates the region outside the in-focus region. That is, the images in the regions Rin and Rout are so-called defocused images that are out of focus.
  • When it is determined in step S4 that imaging has been performed at all focal positions, the imaging control unit 61 notifies the effective pixel extraction unit 72 of the end of imaging via the preprocessing unit 71, and the process proceeds to step S6.
  • In step S6, the measurement unit 62 selects one of the pixels for which the in-focus position has not yet been obtained and sets it as the target pixel.
  • the pixel of interest is set, for example, in raster scan order.
  • In step S7, the measurement unit 62 selects one of the images for which the focus measure at the current target pixel has not yet been obtained and sets it as the target image.
  • the attention image is set, for example, in the order of shooting (index order).
  • In step S8, the effective pixel extraction unit 72 selects one pixel to be processed from the pixels within the attention range. That is, the effective pixel extraction unit 72 selects, as the pixel to be processed, one of the pixels in the attention range for which it has not yet been determined whether it is an effective pixel.
  • FIG. 7 shows an example of the attention range when the pixel PA1 in FIG. 6 is set as the attention pixel.
  • a 5 ⁇ 5 pixel range RA1 centered on the target pixel PA1 is set as the target range.
  • In step S9, the effective pixel extraction unit 72 determines whether the difference between the determination values of the target pixel and the pixel to be processed is within the allowable range. Specifically, the effective pixel extraction unit 72 obtains the determination values of the target pixel and the pixel to be processed from the preprocessed RGB image and the preprocessed YPbPr image of the target image stored in the memory 52, takes the difference between the two determination values, and compares the absolute value of the difference with the allowable value set in step S1.
  • If the absolute value of the difference is less than or equal to the allowable value, the effective pixel extraction unit 72 determines that the difference between the determination values of the target pixel and the pixel to be processed is within the allowable range, that is, that the pixel to be processed is an effective pixel, and the process proceeds to step S10.
  • When two or more types of determination values are used, the difference is determined to be within the allowable range only if the absolute value of the difference between the target pixel and the pixel to be processed is less than or equal to the allowable value for every type used, for example, for all of luminance, hue, and saturation.
  • In step S10, the effective pixel extraction unit 72 extracts the luminance value of the pixel to be processed. That is, the effective pixel extraction unit 72 extracts the luminance value of the current pixel to be processed from the preprocessed YPbPr image of the target image stored in the memory 52.
  • On the other hand, if in step S9 the absolute value of the difference for at least one type of determination value exceeds the allowable value, it is determined that the difference between the determination values of the target pixel and the pixel to be processed exceeds the allowable range; the process of step S10 is skipped, and the process proceeds to step S11.
  • In step S11, the effective pixel extraction unit 72 determines whether all pixels within the attention range have been processed. If it is determined that not all pixels within the attention range have been processed, the process returns to step S8. The processes of steps S8 to S11 are then repeated until it is determined in step S11 that all pixels within the attention range have been processed, whereby the effective pixels whose determination values differ from that of the target pixel by no more than the allowable range are extracted from the attention range, and their luminance values are extracted.
  • If it is determined in step S11 that all pixels within the attention range have been processed, the process proceeds to step S12.
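Steps S8 to S11 can be sketched in miniature as a vectorized tolerance test. The function names, the flattened window representation, and the sample tolerance values are assumptions for illustration; the gating logic — a pixel is effective only if every chosen determination value stays within its tolerance of the target pixel's value — follows the description above.

```python
import numpy as np

def extract_effective_pixels(window_vals, center_vals, tolerances):
    """`window_vals`: determination values (e.g. luminance, hue,
    saturation) for each pixel in the attention range, shape
    (n_pixels, n_value_types). `center_vals`: the target pixel's
    values, shape (n_value_types,). Returns a boolean mask marking
    the effective pixels: those whose every determination value
    differs from the target pixel's by at most its tolerance."""
    diffs = np.abs(np.asarray(window_vals, float) - np.asarray(center_vals, float))
    return np.all(diffs <= np.asarray(tolerances, float), axis=1)
```

The mask is then used to select which neighbours' luminance values enter the focus-measure calculation, so pixels belonging to a differently coloured or differently lit surface patch are excluded.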
  • In step S12, the focus measure calculation unit 73 calculates the focus measure at the target pixel of the target image. Specifically, the effective pixel extraction unit 72 supplies information indicating the luminance values of the extracted effective pixels to the focus measure calculation unit 73, which calculates the focus measure at the target pixel of the target image using the following equation (1).
  • Focus measure = (1/n) · Σᵢ |Yc − Yᵢ| … (1), where Yc is the luminance value of the target pixel, Yᵢ is the luminance value of the i-th effective pixel within the attention range, and n is the number of effective pixels.
  • the focus measure is an average value of absolute values of difference values between the luminance value of the target pixel and the luminance value of the effective pixel within the target range. Therefore, the value of the focus measure increases as the difference in luminance value between the target pixel and the effective pixel increases, that is, as the contrast between the target pixel and the effective pixel increases.
  • the focus measure calculation unit 73 supplies the calculated focus measure to the focus position detection unit 74.
  • the focus measure calculation unit 73 stores the calculated focus measure, the luminance value of the target pixel of the current target image, and the focal position of the current target image in the memory 52 in association with each other.
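Equation (1) — the mean absolute luminance difference between the target pixel and its effective neighbours — is a one-liner; the zero fallback for an empty neighbour set is an assumption, not specified by the source.

```python
import numpy as np

def focus_measure(target_luma, effective_lumas):
    """Equation (1): mean absolute difference between the target
    pixel's luminance and the luminances of the effective pixels.
    The larger the contrast between target and neighbours, the
    larger the measure."""
    effective_lumas = np.asarray(effective_lumas, dtype=float)
    if effective_lumas.size == 0:
        return 0.0  # assumed fallback when no neighbour is effective
    return float(np.mean(np.abs(effective_lumas - target_luma)))
```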
  • If the focus measure were obtained using all the pixels in the attention range RA1, it would be affected by both the ranges RA1a and RA1b, which have different heights, and a focus measure corresponding to a height between the range RA1a and the range RA1b would be calculated.
  • FIG. 9 schematically shows the state of hue or saturation distribution within the range RA1a.
  • The hue or saturation also differs between the range RA1a and the range RA1b.
  • Therefore, the focus measure at the target pixel PA1 can be calculated more accurately by using only the luminance of the pixels within the range RA1a or a range close to it. Note that the effective pixels extracted using luminance, hue, or saturation do not necessarily coincide, so using all three types as determination values makes it possible to calculate the focus measure even more accurately.
  • In step S13, the focus position detection unit 74 determines whether the focus measure is the maximum. Specifically, the focus position detection unit 74 determines that the focus measure is the maximum when the focus measure calculated in step S12 is larger than the maximum value of the focus measure stored so far in the memory 52, and the process proceeds to step S14. Note that when the first focus measure is calculated for the current target pixel, the maximum value of the focus measure is not yet stored in the memory 52, so the calculated focus measure is unconditionally determined to be the maximum.
  • In step S14, the focus position detection unit 74 determines whether the luminance value of the target pixel is within the allowable range. Specifically, the focus position detection unit 74 takes the difference between the luminance value of the target pixel at the current focal position (the luminance value of the target pixel of the current target image) and the luminance value stored so far in the memory 52 for the focal position where the focus measure was maximal (the luminance value of the target pixel of the image in which the focus measure has so far been maximal), and compares it with the allowable value set in step S1. If the absolute difference is less than or equal to the allowable value, the focus position detection unit 74 determines that the luminance value of the target pixel is within the allowable range, and the process proceeds to step S15.
  • In step S15, the focus position detection unit 74 stores the current focus measure as the maximum value. That is, the focus position detection unit 74 stores in the memory 52 the focus measure at the target pixel of the current target image as the maximum value of the focus measure, together with the luminance value of the target pixel of the current target image and the focal position of the current target image.
  • On the other hand, if the calculated difference exceeds the allowable value in step S14, it is determined that the luminance value of the target pixel exceeds the allowable range; the process of step S15 is skipped, and the process proceeds to step S16. That is, the focus measure at the current focal position is not stored as the maximum value.
  • FIG. 10 is a graph showing an example of the relationship between the focus position and the focus measure in the target pixel PA11 at the boundary portion where the luminance or color of the measurement object changes greatly.
  • As described above, as the focus shifts, bright portions may expand and an originally dark portion may be captured brightly.
  • In the example of FIG. 10, the peak PK11 of the focus measure appears first; thereafter, the luminance of the target pixel PA11 becomes brighter, and a second peak PK12 of the focus measure may appear.
  • If the difference between the luminance of the target pixel PA11 at the focal position where the focus measure reaches the peak PK11 and its luminance at the focal position where the focus measure reaches the peak PK12 is larger than the allowable value, the determination process in step S14 excludes the focal position of the peak PK12 and the focal positions in its vicinity from the search for the maximum of the focus measure. This prevents a peak of the focus measure from being detected at an out-of-focus focal position, and thus prevents erroneous detection of the in-focus position.
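The luminance-gated running maximum of steps S13 to S15 can be sketched as follows; the tuple layout of `samples` and the function name are assumptions for illustration, while the gate itself — reject a candidate maximum whose target-pixel luminance drifted too far from that of the stored maximum — follows the description above.

```python
def track_maximum(samples, luma_tolerance):
    """Walk through (focal_position, focus_measure, target_luminance)
    samples and keep a running maximum of the focus measure, but
    reject a candidate whose target-pixel luminance differs from the
    stored maximum's by more than `luma_tolerance` -- this suppresses
    the spurious second peak (PK12) caused by defocus blooming."""
    best = None  # (focal_position, focus_measure, luminance)
    for z, fm, luma in samples:
        if best is None:
            best = (z, fm, luma)  # first sample is unconditionally the maximum
        elif fm > best[1] and abs(luma - best[2]) <= luma_tolerance:
            best = (z, fm, luma)
    return best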
  • In step S16, the focus measure calculation unit 73 determines whether all images have been processed. If it is determined that not all images have been processed yet, the process returns to step S7. Thereafter, the processes of steps S7 to S16 are repeated until it is determined in step S16 that all images have been processed, whereby the focus measures at all focal positions are obtained for the target pixel and the maximum of the focus measure is detected.
  • If it is determined in step S16 that all images have been processed, the process proceeds to step S17.
  • In step S17, the focus position detection unit 74 detects the in-focus position. Specifically, for the target pixel, the focus position detection unit 74 reads from the memory 52 the focus measure at the focal position where it is maximal and the focus measures at the focal positions immediately before and after it. For example, when the focus measure is maximal at the (k+1)th focal position, the focus measures at the kth to (k+2)th focal positions are read from the memory 52. From these three points, the focus position detection unit 74 models the relationship between the focal position and the focus measure with a Gaussian function and interpolates the data between the points. Based on the interpolated data, the focus position detection unit 74 detects the focal position at which the focus measure peaks as the in-focus position at the target pixel, and stores the detected in-focus position in the memory 52.
  • FIG. 11 shows an example of a graph when the relationship between the focus position and the focus measure at points 1 to N is modeled by a Gaussian function.
  • By modeling the relationship between the focal position and the focus measure with a Gaussian function, the in-focus position at the target pixel can be detected with a resolution finer than the interval between the focal positions at which the images were actually captured.
  • For example, the focal position between the (k+1)th and (k+2)th focal positions corresponding to the peak PK21 in FIG. 11 can be detected as the in-focus position.
  • the relationship between the focal position and the focus measure may be modeled by a Gaussian function using data of four or more points. Further, instead of a Gaussian function, data interpolation may be performed by a quadratic function interpolation calculation. Furthermore, when the interval between the focal positions at which an image is taken is sufficiently small, data interpolation may be performed by taking a moving average of the focus measure.
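A three-point Gaussian fit has a closed form: taking logarithms turns the Gaussian into a parabola, so the peak is the vertex of the parabola through the three (focal position, ln focus-measure) points. The sketch below assumes equally spaced focal positions (as in the step-by-step capture above); the source does not spell out its fitting formula, so this is the standard log-parabola construction, not necessarily the patent's exact computation.

```python
import math

def gaussian_peak(z, f):
    """Fit a Gaussian through three (focal position, focus measure)
    points with equal spacing and return the focal position of its
    peak: the vertex of the parabola through (z, ln f)."""
    assert len(z) == len(f) == 3
    y0, y1, y2 = (math.log(v) for v in f)
    dz = z[1] - z[0]            # assumed equal spacing
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:            # degenerate: no curvature in the logs
        return z[1]
    return z[1] + 0.5 * dz * (y0 - y2) / denom
```

For a true Gaussian sampled at three positions this recovers the peak exactly, which is why the in-focus position can be localized between the discrete focal positions.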
  • In step S18, the focus position detection unit 74 determines whether the in-focus positions of all pixels have been detected. If it is determined that the in-focus positions of all pixels have not yet been detected, the process returns to step S6, and the processes of steps S6 to S18 are repeated until it is determined in step S18 that the in-focus positions of all pixels have been detected.
  • When it is determined in step S18 that the in-focus positions of all pixels have been detected, the focus position detection unit 74 notifies the three-dimensional shape data generation unit 75 of the end of in-focus position detection, and the process proceeds to step S19.
  • In step S19, the three-dimensional shape data generation unit 75 generates three-dimensional shape data based on the in-focus position of each pixel stored in the memory 52, and outputs it to the subsequent stage.
  • the three-dimensional shape data is represented by the focus position of each pixel or the distance from a certain reference point to each pixel calculated based on the focus position.
  • The apparatus or processing unit subsequent to the three-dimensional shape measurement unit 51 displays, for example, a three-dimensional image of the measurement object 2 on the display 13 based on the three-dimensional shape data. The three-dimensional shape measurement process then ends.
  • The allowable value determines which of the darker and brighter peaks is selected as the maximum value of the focus measure in the processes of steps S14 and S15. When there are three or more focal positions at which the focus measure peaks, the peak to be selected as the maximum can be set based on the luminance value of the target pixel at each focal position. Thus, when there are a plurality of focal positions at which the focus measure peaks, the in-focus position is selected from among them based on the luminance value of the target pixel at each focal position.
  • In the above description, the focus measure is obtained for each individual pixel, but it may instead be obtained in units of regions composed of a plurality of pixels.
  • In the above description, the low-frequency and noise components are removed using a Fourier transform followed by an inverse Fourier transform, but a wavelet transform may be used instead: the low-frequency and noise components may be removed and an inverse wavelet transform performed.
  • The focus measure may also be obtained using the effective pixels within the attention range and a Laplacian filter.
  • The program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timings, such as when a call is made.
  • In this specification, the term "system" means an overall apparatus composed of a plurality of devices and means.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a three-dimensional shape measurement device, method, and program capable of improving the accuracy of three-dimensional shape measurement. A three-dimensional shape measurement unit (51) calculates focus measures for the respective pixels of images having different focal positions with respect to an object to be measured, and detects the in-focus positions of the respective pixels to measure the three-dimensional shape of the object. An effective pixel extraction section (72), using at least one of luminance, hue, and color saturation as a determination value, extracts effective pixels whose determination values each differ from the determination value of a target pixel by no more than an allowable range, from among pixels within a predetermined range in the vicinity of the target pixel. A focus measure calculation section (73) calculates the focus measure of the target pixel using the extracted effective pixels. A focus position detection section (74) detects the focal position at which the focus measure peaks as the in-focus position of the target pixel. The present invention can be applied, for example, to a three-dimensional shape measurement device.
PCT/JP2009/051346 2008-01-28 2009-01-28 Three-dimensional shape measurement apparatus, method, and program WO2009096422A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009551537A JP5218429B2 (ja) 2008-01-28 2009-01-28 Three-dimensional shape measurement apparatus and method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-016349 2008-01-28
JP2008016349 2008-01-28

Publications (1)

Publication Number Publication Date
WO2009096422A1 true WO2009096422A1 (fr) 2009-08-06

Family

ID=40912774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/051346 WO2009096422A1 (fr) 2008-01-28 2009-01-28 Three-dimensional shape measurement apparatus, method, and program

Country Status (2)

Country Link
JP (1) JP5218429B2 (fr)
WO (1) WO2009096422A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011046115A1 * 2009-10-13 2011-04-21 Hitachi Chemical Company, Ltd. Optical waveguide substrate and method for manufacturing same
CN102331622A * 2010-07-12 2012-01-25 Sony Corporation Information processing system, microscope control device, and operation method thereof
WO2012057283A1 2010-10-27 2012-05-03 Nikon Corporation Form measuring device, form measuring method, structure manufacturing method, and program
JP2013257187A * 2012-06-11 2013-12-26 Ricoh Co Ltd Movement information detection device and multicolor image forming apparatus
US10952827B2 (en) 2014-08-15 2021-03-23 Align Technology, Inc. Calibration of an intraoral scanner
TWI828386B * 2021-10-28 2024-01-01 Nikon Corporation Shape acquisition method, object management method, work support method, shape acquisition system, and work support system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7071088B2 (ja) * 2017-10-24 2022-05-18 Canon Inc. Distance detection device, imaging device, distance detection method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0926312A * 1996-08-02 1997-01-28 Hitachi Ltd Three-dimensional shape detection method and apparatus
JP2001066112A * 1999-06-25 2001-03-16 Mitsutoyo Corp Image measurement method and apparatus
JP2001074422A * 1999-08-31 2001-03-23 Hitachi Ltd Three-dimensional shape detection apparatus, soldering inspection apparatus, and methods therefor
JP2006258444A * 2005-03-15 2006-09-28 Opcell Co Ltd Object surface shape measuring apparatus

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011046115A1 * 2009-10-13 2011-04-21 Hitachi Chemical Company, Ltd. Optical waveguide substrate and method for manufacturing same
JP2011085647A (ja) * 2009-10-13 2011-04-28 Hitachi Chemical Company, Ltd. Optical waveguide substrate and method for manufacturing same
CN103869411A (zh) * 2009-10-13 2014-06-18 Hitachi Chemical Company, Ltd. Optical waveguide substrate, opto-electric hybrid substrate and manufacturing method thereof, and recess forming device for alignment
US8818147B2 (en) 2009-10-13 2014-08-26 Hitachi Chemical Company, Ltd. Optical waveguide substrate and method for manufacturing same
CN102331622A * 2010-07-12 2012-01-25 Sony Corporation Information processing system, microscope control device, and operation method thereof
JP2012037861A (ja) * 2010-07-12 2012-02-23 Sony Corp Microscope control device, image display device, image management server, focus position information generation method, image display method, image management method, and microscope image management system
WO2012057283A1 2010-10-27 2012-05-03 Nikon Corporation Form measuring device, form measuring method, structure manufacturing method, and program
US9239219B2 (en) 2010-10-27 2016-01-19 Nikon Corporation Form measuring apparatus, method for measuring form, method for manufacturing structure and non-transitory computer readable medium storing a program for setting measurement area
JP2013257187A * 2012-06-11 2013-12-26 Ricoh Co Ltd Movement information detection device and multicolor image forming apparatus
US10952827B2 (en) 2014-08-15 2021-03-23 Align Technology, Inc. Calibration of an intraoral scanner
TWI828386B * 2021-10-28 2024-01-01 Nikon Corporation Shape acquisition method, object management method, work support method, shape acquisition system, and work support system

Also Published As

Publication number Publication date
JP5218429B2 (ja) 2013-06-26
JPWO2009096422A1 (ja) 2011-05-26

Similar Documents

Publication Publication Date Title
JP5218429B2 (ja) Three-dimensional shape measurement apparatus and method, and program
US9542754B2 (en) Device and method for detecting moving objects
US10726539B2 (en) Image processing apparatus, image processing method and storage medium
JP5374119B2 (ja) Distance information acquisition device, imaging device, and program
EP3896643A1 (fr) Image processing apparatus and method of controlling the same
US20150278996A1 (en) Image processing apparatus, method, and medium for generating color image data
US9204034B2 (en) Image processing apparatus and image processing method
JP6833415B2 (ja) Image processing apparatus, image processing method, and program
JP6711396B2 (ja) Image processing apparatus, imaging apparatus, image processing method, and program
CN107018407B (zh) Information processing device, evaluation chart, evaluation system, and performance evaluation method
US9438887B2 (en) Depth measurement apparatus and controlling method thereof
JP6025467B2 (ja) Image processing apparatus and image processing method
US7869706B2 (en) Shooting apparatus for a microscope
EP3062065A1 (fr) Imaging device and phase difference detection method
JP2009259036A (ja) Image processing device, image processing method, image processing program, recording medium, and image processing system
WO2016113805A1 (fr) Image processing method, image processing apparatus, image capturing apparatus, program, and storage medium
JP4701111B2 (ja) Pattern matching system and subject tracking system
US10063829B2 (en) Image processing method, image processing apparatus, image pickup apparatus, and non-transitory computer-readable storage medium
JP2008042227A (ja) Imaging apparatus
JP5050282B2 (ja) Focus detection device, focus detection method, and focus detection program
JP5754931B2 (ja) Image analysis device, image analysis method, and program
JP7009252B2 (ja) Image processing apparatus, image processing method, and program
JP2008058279A (ja) Distance image generation device, distance image generation method, and program
JP6464553B2 (ja) Lens drive control device, electronic camera, and lens drive control program
JP6558978B2 (ja) Image processing apparatus, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09706627

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2009551537

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09706627

Country of ref document: EP

Kind code of ref document: A1