WO2018207300A1 - Measurement device, measurement method, and measurement program - Google Patents


Info

Publication number
WO2018207300A1
WO2018207300A1 (application PCT/JP2017/017799)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
image
illumination pattern
pattern
measurement
Prior art date
Application number
PCT/JP2017/017799
Other languages
French (fr)
Japanese (ja)
Inventor
高橋 文之
哲男 肥塚
Original Assignee
Fujitsu Limited (富士通株式会社)
Priority date
Filing date
Publication date
Application filed by Fujitsu Limited (富士通株式会社)
Priority to JP2019516808A priority Critical patent/JP6927294B2/en
Priority to PCT/JP2017/017799 priority patent/WO2018207300A1/en
Publication of WO2018207300A1 publication Critical patent/WO2018207300A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness

Definitions

  • the present invention relates to a measuring device, a measuring method, and a measuring program.
  • When performing dimension measurement, there are a method of illuminating the measurement target from the front using front-light illumination and measuring the dimensions from the reflected image of the measurement target, and a method of illuminating the measurement target from the back using backlight illumination and measuring the dimensions from the silhouette image of the measurement target.
  • The latter method can be classified into two types: a method using telecentric illumination, which uses a telecentric optical system to illuminate the measurement target only with light rays parallel to the optical axis of the camera, and a method using flat illumination, which irradiates the measurement target with light rays in various directions using, for example, a light-emitting diode (LED) and a diffusion plate.
  • For this purpose, it is preferable to illuminate with a light source in which the light rays contain only a component parallel to the optical axis of the camera, such as telecentric illumination. This is because light rays that are not parallel to the optical axis of the camera are reflected or diffused on the surface of the measurement object and enter the camera, and the adverse effect of this reflected or diffused light prevents an ideal silhouette image from being obtained.
  • an object of one aspect is to provide a measurement device, a measurement method, and a measurement program that can improve measurement accuracy.
  • In one aspect, a measurement device includes: an illumination device having a display surface for displaying an illumination pattern; an imaging device for capturing an image of a measurement object that is disposed on the near side of the display surface of the illumination device and illuminated by the illumination pattern; and a control device that generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, outputs to the illumination device an instruction to display the illumination pattern on the display surface, and performs synchronous control for synchronizing the operations of the illumination device and the imaging device. The control device divides the luminance amplitude of each pixel of the image by the average luminance of the image to calculate the visibility value of the illumination pattern at each pixel, and measures the dimension of the measurement target based on the visibility image thus generated.
  • measurement accuracy can be improved.
  • The drawings include: a diagram showing an example of the functional configuration of the control device; a diagram explaining a case where a measurement object is imaged under the illumination pattern of flat illumination; and a diagram showing an example of an image in which the striped illumination pattern displayed on the LCD is captured by the CCD camera in a state where the measurement object is not arranged.
  • the illumination device displays the illumination pattern on the display surface
  • The imaging device captures an image of the measurement object that is disposed on the front side of the display surface of the illumination device and illuminated by the illumination pattern.
  • The control device generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, and outputs to the illumination device an instruction to display the illumination pattern on the display surface.
  • synchronous control is performed to synchronize the operations of the illumination device and the imaging device.
  • The control device divides the luminance amplitude of each pixel of the image by the average luminance of the image to calculate the visibility value of the illumination pattern at each pixel, and measures the dimension of the measurement target based on the resulting visibility image.
  • It is preferable to irradiate the measurement object 10 with a light source in which the light beam 500 contains only a component parallel to the optical axis 510 of a camera (not shown), like the telecentric illumination shown in FIG. 1A. This is because, with the flat illumination 520 shown in FIG. 1B, light rays 501 that are not parallel to the optical axis 510 of the camera are reflected or diffused on the surface of the measurement object 10 and enter the camera, and the adverse effect of at least one of the reflected and diffused rays 502 prevents an ideal silhouette image from being obtained.
  • FIG. 2 shows a case where the camera 530 captures an image of a cylinder that is an example of the measurement target 10 under the flat illumination 520.
  • 3A shows a silhouette image obtained by imaging the measurement object 10 of FIG. 2 under telecentric illumination
  • FIG. 3B shows a silhouette image obtained by imaging the measurement object 10 of FIG. 2 under flat illumination 520.
  • The silhouette image of FIG. 3A is an ideal image in which the entire measurement object 10 appears as a shadow, provided that no reflected or diffused light from an external light source or from secondary reflections off objects other than the measurement object strikes the measurement object 10.
  • The silhouette image in FIG. 3B is not an ideal silhouette image because part of the measurement object 10 appears bright due to reflected or diffused light.
  • As a result, the dimension of the measurement target 10 measured from the silhouette image is smaller than in the example shown in FIG.
  • FIG. 5 shows a silhouette image 105 including a portion 101 of the measurement target 10 where a fluorescent lamp, which is an example of the light source 540, is reflected, a portion 102 of the measurement target 10 that is illuminated by the external light 503, and a portion 103 of the measurement target 10 where an object 541 other than the measurement target 10 is reflected.
  • Each embodiment described below improves measurement accuracy by suppressing the adverse effect of reflected or diffused light caused by an external light source or by secondary reflections from objects other than the measurement target.
  • Measurement accuracy can thus be improved without placing the measurement device in a darkroom.
  • FIG. 6 is a diagram illustrating an example of a hardware configuration of a measurement device according to an embodiment.
  • the measuring device 1 using backlight illumination includes a lighting device 2, an imaging device 3, and a control device 4.
  • the control device 4 can be formed by a computer such as a general-purpose computer.
  • the control device 4 includes a CPU (Central Processing Unit) 41 that is an example of a processor and a memory 42 that is an example of a storage device.
  • the CPU 41 executes a program including a measurement program stored in the memory 42 and executes a measurement process and the like described later.
  • the memory 42 stores programs, data, and the like.
  • The memory 42 can be formed by a computer-readable recording medium, such as a portable recording medium such as a USB (Universal Serial Bus) memory, a semiconductor storage device such as a flash memory, a magnetic recording medium, an optical recording medium such as a CD-ROM (Compact Disc Read-Only Memory) or a DVD (Digital Versatile Disc), or a magneto-optical recording medium.
  • When a magnetic recording medium such as a disk, an optical recording medium, or a magneto-optical recording medium is used for the memory 42, the recording medium is loaded into a drive such as a disk drive, and the drive reads programs and the like from, and writes data to, the recording medium.
  • the illumination device 2 can be formed by a known liquid crystal display (LCD) having a display surface and a backlight, for example, and displays an illumination pattern on the display surface.
  • The imaging device 3 can be formed by a well-known CCD (Charge-Coupled Device) camera or the like, and captures an image of the measurement target 10 that is arranged on the near side of the display surface of the illumination device 2 and illuminated by the illumination pattern displayed on the display surface. That is, the display surface (or light-emitting surface) of the illumination device 2 is disposed behind the measurement target 10 as viewed from the imaging device 3. Accordingly, the imaging device 3 simultaneously captures the illumination pattern displayed on the display surface of the illumination device 2 and the image of the measurement target 10 that is disposed on the near side of the display surface and illuminated by the illumination pattern.
  • CCD Charge-Coupled Device
  • The control device 4 generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, outputs to the illumination device 2 an instruction to display the illumination pattern on the display surface, and performs synchronous control for synchronizing the operations of the illumination device 2 and the imaging device 3. Further, the control device 4 divides the luminance amplitude of each pixel of the image captured by the imaging device 3 by the average luminance of the image (hereinafter also referred to as the "average luminance value") to calculate the visibility value of the illumination pattern at each pixel, and measures the dimension of the measurement object 10 based on the visibility image thus generated.
  • the control device 4 may perform a binarization process using a threshold value on the generated visibility image and measure the dimension of the measurement target 10 based on information identifying the illumination part and the target part.
  • the threshold value may be set by the operator as a fixed value, or may be automatically determined by a technique such as adaptive binarization.
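As an illustrative sketch of this thresholding step (the function name, array values, and the threshold 0.5 are hypothetical, not taken from the patent), low-visibility pixels can be labeled as the target portion:

```python
import numpy as np

def binarize_visibility(visibility, threshold=0.5):
    """Binarize a visibility image: pixels whose visibility falls below the
    threshold are labeled True (target portion) and the rest False
    (illumination portion), since visibility drops where the measurement
    object disturbs the striped pattern."""
    return np.asarray(visibility) < threshold

# Toy 1-D "visibility image": the low-visibility run in the middle is the object.
vis = np.array([0.9, 0.85, 0.2, 0.15, 0.25, 0.9])
mask = binarize_visibility(vis, threshold=0.5)
print(mask.tolist())  # [False, False, True, True, True, False]
```

An adaptive method such as Otsu's could replace the fixed threshold without changing the rest of the flow.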
  • For example, a threshold value slightly lower than the initial visibility value of the illumination pattern in each pixel of an initial image, captured by the imaging device 3 in an initial state where the measurement target 10 is not arranged on the front side of the display surface of the illumination device 2, may be adopted. Alternatively, the dimension of the measurement target 10 may be measured based on pixels whose difference from the initial visibility value is equal to or greater than a certain value.
  • The control device 4 performs the measurement process, including the synchronous control, to measure the dimension of the measurement object 10.
  • the object that forms the measurement object 10 is not particularly limited.
  • FIG. 7 is a diagram illustrating an example of a functional configuration of the control device illustrated in FIG.
  • the LCD 20 is an example of the illumination device 2 that has a display surface and a backlight and displays an illumination pattern on the display surface.
  • the CCD camera 30 that captures the illumination pattern displayed on the display surface of the LCD 20 and the image of the measurement target 10 disposed on the near side of the display surface of the LCD 20 is an example of the imaging device 3.
  • The computer 40 generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, outputs to the LCD 20 an instruction to display the illumination pattern on the display surface of the LCD 20, and synchronizes the operations of the LCD 20 and the CCD camera 30.
  • the monitor device 6 is an example of a display device that displays a message or the like to the operator of the computer 40.
  • the keyboard 5 is an example of an input device that is operated when an operator of the computer 40 inputs commands, data, and the like to the computer 40.
  • the input device may be a mouse or the like.
  • the computer 40 includes an image input unit 411, a pattern output unit 412, a synchronization control circuit 413, and a data processing unit 414.
  • the image input unit 411 has a functional configuration including an image input circuit 411-1 and an input image storage memory 411-2.
  • the pattern output unit 412 includes a pattern generation circuit 412-1 and a pattern output circuit 412-2.
  • the data processing unit 414 includes an image calculation circuit 414-1, a dimension calculation circuit 414-2, and a result storage memory 414-3.
  • The functions of the image input unit 411, the pattern output unit 412, the synchronization control circuit 413, and the data processing unit 414 can be realized by the CPU 41 shown in FIG. 6 executing a measurement program stored in the memory 42.
  • the input image storage memory 411-2 and the result storage memory 414-3 can be formed by the memory 42, for example.
  • The pattern output unit 412 outputs to the LCD 20 an instruction to display an illumination pattern on the display surface of the LCD 20. More specifically, the pattern generation circuit 412-1 generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, and the pattern output circuit 412-2 outputs the generated illumination pattern to the LCD 20 for display. The synchronization control circuit 413 performs synchronization control so that the phase of the irradiation intensity of the illumination pattern displayed on the LCD 20 is modulated spatially, temporally, or both according to, for example, a default setting, thereby synchronizing the operations of the LCD 20 and the CCD camera 30.
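A minimal sketch of generating such phase-shifted striped patterns (the constants A, B, the fringe period, and the function name are illustrative assumptions, not values from the patent):

```python
import numpy as np

def striped_patterns(width, height, n_patterns=4, period=32, a=128.0, b=100.0):
    """Generate n_patterns striped illumination patterns of the form
    I_n(x, y) = A + B * sin(2*pi*n/N + 2*pi*x/period):
    the first term inside the sine advances the phase in time (per pattern),
    the second varies it in space (per pixel column)."""
    x = np.arange(width)
    return [np.tile(a + b * np.sin(2 * np.pi * n / n_patterns
                                   + 2 * np.pi * x / period), (height, 1))
            for n in range(n_patterns)]

pats = striped_patterns(width=64, height=8)
```

With four patterns the phase advances by π/2 per frame, matching the π/2 step used later in the measurement flow.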
  • The image input unit 411 acquires one or more images captured by the CCD camera 30 and temporarily stores them. More specifically, the image input circuit 411-1 acquires the one or more images captured by the CCD camera 30, and the input image storage memory 411-2 temporarily stores the acquired images.
  • The pattern output unit 412 changes the illumination pattern displayed on the LCD 20, and the plurality of images acquired by the image input circuit 411-1 can be temporarily stored in the input image storage memory 411-2.
  • the image stored in the input image storage memory 411-2 is supplied to the data processing unit 414 and used for measuring the dimension of the measurement target 10.
  • the image calculation circuit 414-1 calculates a visibility image and an average luminance image, which will be described later, based on the image supplied from the image input unit 411.
  • the dimension calculation circuit 414-2 calculates the dimension of the measurement target 10 based on the calculated visibility image and average luminance image.
  • the result storage memory 414-3 stores the calculated dimensions of the measurement target 10. The dimensions of the measurement object 10 stored in the result storage memory 414-3 can be displayed on the monitor device 6, for example.
  • The data processing unit 414 calculates the visibility value and the like based on at least one image captured by the CCD camera 30.
  • The illumination pattern in which the phase of the irradiation intensity is spatially modulated is, for example, a striped pattern or a pattern in which the tone or color changes, and the pattern is preferably regular.
  • When the pattern output unit 412 outputs to the LCD 20 an instruction to display, in accordance with the synchronization control, a plurality of illumination patterns whose irradiation-intensity phases are temporally modulated on the display surface of the LCD 20, the data processing unit 414 calculates the visibility value and the like based on a plurality of images captured by the CCD camera 30.
  • The illumination pattern in which the phase of the irradiation intensity is temporally modulated is, for example, a striped pattern or a pattern whose tone or color changes at each time the CCD camera 30 captures an image; it is desirable that the pattern be regular and change regularly at each time.
  • Since the illumination pattern in this case is a pattern whose irradiation intensity or illumination color differs at different times, the entire display surface of the LCD 20 may have a single irradiation intensity or a single illumination color at any one time.
  • When the pattern output unit 412 outputs to the LCD 20 an instruction to display, in accordance with the synchronization control, a plurality of illumination patterns whose irradiation-intensity phases are spatially and temporally modulated on the display surface of the LCD 20, the data processing unit 414 calculates the visibility value and the like based on a plurality of images captured by the CCD camera 30.
  • The illumination pattern in which the phase of the irradiation intensity is spatially and temporally modulated is, for example, a striped pattern, or a pattern whose irradiation intensity or illumination color changes at each time the CCD camera 30 captures an image. At least one of the pattern itself and its change over time may be regular; for example, it is desirable that a striped pattern, or a pattern whose irradiation intensity or illumination color changes, change regularly at each time.
  • The instruction to display the illumination pattern on the display surface of the LCD 20 that the pattern output unit 412 outputs to the LCD 20 may be a signal designating the illumination pattern to be displayed, or a signal representing the illumination pattern itself.
  • The data processing unit 414 may calculate luminance information indicating the average of, or the difference between, the luminance values of corresponding pixels of a plurality of images, measure the dimension of the measurement object 10 based on pixels whose difference from the initial luminance information, calculated in advance with the measurement target 10 not arranged in front of the display surface of the LCD 20, is equal to or greater than a certain value, and output the larger of the measured dimensions of the measurement object 10.
  • Alternatively, the data processing unit 414 may calculate luminance information indicating the luminance value, or the average of the luminance values, of corresponding pixels of the plurality of images, measure the dimension of the measurement object 10 based on pixels whose difference from the pre-calculated initial luminance information of the illumination pattern in each pixel of the initial image is equal to or greater than a certain value, and output the larger of the measured dimensions of the measurement object 10.
  • The pattern output unit 412 may output to the LCD 20 an instruction to display, on the display surface of the LCD 20, an illumination pattern according to a setting designated by the operator instead of the default setting.
  • FIG. 8 is a diagram for explaining a case where a measurement target is imaged under a flat illumination pattern.
  • the illumination pattern 52 displayed on the display surface of the LCD 20 is a striped pattern in which the phase of the irradiation intensity is spatially modulated.
  • the brighter portion has a higher luminance value
  • the darker portion has a lower luminance value.
  • FIG. 9 shows an example of an image 52A obtained by imaging the striped illumination pattern 52 displayed on the display surface of the LCD 20 with the CCD camera 30 in a state where the measurement target 10 is not disposed on the front side of the display surface of the LCD 20.
  • FIG. 10 is a diagram showing an example of a luminance profile along the virtual line 52B in the image 52A shown in FIG.
  • the vertical axis indicates the luminance value of the image 52A in arbitrary units
  • the horizontal axis indicates the horizontal position along the virtual line 52B in the image 52A in arbitrary units.
  • the average luminance is a value indicating the average brightness of the striped pattern of the image 52A.
  • Visibility is a value obtained by dividing the luminance amplitude of each pixel of the striped pattern of the image 52A by the average luminance value of the image 52A.
  • Visibility is an index indicating the visibility of a striped pattern, that is, the ease of discrimination, and is sometimes called “contrast” or “degree of modulation”.
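For a clean sinusoidal profile, dividing the luminance amplitude by the mean luminance is equivalent to the classic fringe-contrast formula (Imax − Imin)/(Imax + Imin). A small sketch of this (all names and values are illustrative, not from the patent):

```python
import numpy as np

def profile_visibility(profile):
    """Visibility (contrast / degree of modulation) of a fringe profile:
    (Imax - Imin) / (Imax + Imin), i.e. the luminance amplitude divided by
    the mean luminance for a sinusoidal profile."""
    p = np.asarray(profile, dtype=float)
    return (p.max() - p.min()) / (p.max() + p.min())

x = np.linspace(0, 2 * np.pi, 401)
sharp = 128 + 100 * np.sin(x)   # clearly visible stripes
washed = 128 + 10 * np.sin(x)   # stripes washed out, e.g. by diffuse reflection
print(profile_visibility(sharp), profile_visibility(washed))
```

The washed-out profile yields a much smaller visibility, which is exactly the cue used to locate the measurement object in the visibility image.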
  • the striped illumination pattern 52 displayed on the display surface of the LCD 20 and the measurement target 10 are simultaneously imaged by the CCD camera 30 in a state where the measurement target 10 is arranged on the front side of the display surface of the LCD 20.
  • Reference numeral 10A indicates an image (silhouette image) of the measurement target 10.
  • FIG. 11 shows a region including the virtual line 52D in the image 52C in a partially enlarged manner in the upper right part.
  • In the striped illumination pattern in the image 52C shown in FIG. 11, the brighter the part, the higher the luminance, and the darker the part, the lower the luminance.
  • FIG. 12 is a diagram showing an example of a brightness profile along the virtual line 52D in the image 52C shown in FIG.
  • the vertical axis indicates the brightness of the image 52C in arbitrary units
  • the horizontal axis indicates the position in the vertical direction along the virtual line 52D in the image 52C in arbitrary units.
  • From FIG. 11, it was confirmed that in the image 52C the striped pattern in the bright portion of the measurement target 10 is less visible and harder to discriminate than the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20. From the brightness profile along the virtual line 52D in the image 52C shown in FIG. 12, it was confirmed that the visibility in the portion of the measurement target 10 is smaller than the visibility of the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20. That is, as can be seen from the brightness profile, the striped pattern in the portion of the measurement target 10 is harder to see and discriminate than the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20.
  • the striped pattern of the illumination pattern 52 displayed on the display surface of the LCD 20 is reflected on the surface of the measurement object 10 and captured by the CCD camera 30.
  • Due to blurring of the pattern caused by the surface roughness of the measurement target 10 and distortion of the striped pattern caused by the shape of the measurement target 10, fringe information from a relatively wide range of the striped pattern is reflected into a relatively narrow range on the surface of the measurement target 10, so the visibility of the striped pattern deteriorates and the pattern becomes difficult to discriminate.
  • Therefore, except when the measurement object 10 is a perfect mirror surface and flat in shape, the visibility in the portion of the measurement object 10 deteriorates compared with that of the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20, and the visibility value becomes smaller.
  • A method that captures the change in average luminance detects the dimensions based on the reflectance information of the measurement object, whereas a method that determines the visibility, as in the above-described embodiment, detects the dimensions based on the surface roughness and shape information of the measurement object. Therefore, compared with the conventional method using uniform flat illumination, the embodiment described above can measure dimensions based on two kinds of information, average luminance and visibility, and thus has more information available.
  • Even from a single captured image, the visibility value, the average luminance value, and the like can be calculated, as described in conjunction with FIGS. However, the amount of information is then limited to that of one image, so it is difficult to further improve the accuracy of the visibility value, the average luminance value, and the like.
  • Therefore, N types (where N is a natural number of 2 or more) of illumination patterns in which the phase of the irradiation intensity is modulated spatially, temporally, or both (for example, a plurality of striped illumination patterns) are displayed, and the measurement object arranged on the front side of the display surface of the illumination device is imaged. The CCD camera 30 captures N images in synchronization with the switching of the illumination patterns, and the visibility value, the average luminance value, and the like of each pixel are obtained from the N images of the measurement object 10 arranged on the front side of the display surface on which the N types of striped illumination patterns are displayed.
  • the pattern generation circuit 412-1 shown in FIG. 7 generates a striped illumination pattern to be displayed on the display surface of the LCD 20 according to the following equation.
  • I_n(x, y) represents an instruction value indicating the display gradation value at pixel (x, y) of the striped pattern displayed n-th on the LCD 20, and the pattern output circuit 412-2 outputs this instruction value to the LCD 20.
  • A and B represent the brightness and the amplitude, respectively, of the striped illumination pattern displayed on the LCD 20, and the phase term represents the phase of the striped illumination pattern displayed on the LCD 20.
  • The first term inside the sine represents the temporal change, and the next term represents the spatial change.
  • The input image storage memory 411-2 stores the luminance value I′_n(x, y) at each pixel of these four images.
  • The image calculation circuit 414-1 calculates each pixel value of the average luminance image, that is, the average luminance value Av(x, y), from the luminance values I′_n(x, y) at each pixel of these four images stored in the input image storage memory 411-2, according to the following equation.
  • The image calculation circuit 414-1 also calculates each pixel value of the visibility image, that is, the visibility value V(x, y), from the luminance values I′_n(x, y) at each pixel of these four images stored in the input image storage memory 411-2, according to the following equation.
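The equations referred to above appear only as images in the source text. As a hedged reconstruction, the standard four-step phase-shift relations consistent with the surrounding description (four frames with a π/2 phase step, amplitude divided by average) can be sketched as follows; the function and variable names are illustrative:

```python
import numpy as np

def average_and_visibility(frames):
    """Per-pixel average luminance Av(x, y) and visibility V(x, y) from four
    frames I'_n = A + B*sin(phi + n*pi/2), n = 0..3 (standard four-bucket
    phase shifting):
      Av = (I'_0 + I'_1 + I'_2 + I'_3) / 4
      B  = 0.5 * sqrt((I'_0 - I'_2)**2 + (I'_1 - I'_3)**2)
      V  = B / Av
    """
    i0, i1, i2, i3 = [np.asarray(f, dtype=float) for f in frames]
    av = (i0 + i1 + i2 + i3) / 4.0
    amp = 0.5 * np.sqrt((i0 - i2) ** 2 + (i1 - i3) ** 2)
    return av, amp / av

# Synthetic check: every pixel has A = 128, B = 100, arbitrary phase 0.7 rad.
phi = 0.7
frames = [128 + 100 * np.sin(phi + n * np.pi / 2) * np.ones((2, 2)) for n in range(4)]
av, vis = average_and_visibility(frames)
```

Note that the recovered amplitude B, and hence V = B/A, is independent of the unknown phase phi, which is what makes the four-frame scheme robust.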
  • FIG. 13 is a diagram illustrating an example of an average luminance image and a visibility image calculated from an image obtained by imaging a measurement target arranged on the front side of a display surface that displays four types of striped illumination patterns.
  • Since the four captured images 800-1 to 800-4 are stored in the input image storage memory 411-2, the image calculation circuit 414-1 calculates an average luminance image 801 and a visibility image 802 from the four captured images 800-1 to 800-4.
  • The calculated average luminance image 801 is equivalent to the image obtained when conventional flat illumination with brightness A is used.
  • In the visibility image 802, the portion of the measurement object 10 in which the illumination is reflected has a large curvature in this example, so the visibility of that portion is markedly deteriorated and its visibility value is small; the result is similar to the image obtained when telecentric illumination is used.
  • Since the change in pixel value between the image of the illumination pattern displayed on the display surface of the LCD 20 (that is, the illumination portion) and the image 10A of the measurement target 10 (that is, the subject portion) is larger in the visibility image 802 than in the average luminance image 801, each edge position used for dimension measurement is obtained from the visibility image 802 and dimension measurement is performed.
  • the virtual line 52E will be described later with reference to FIG.
  • FIG. 14 is a flowchart for explaining a first example of measurement processing.
  • the measurement process shown in FIG. 14 can be executed by the CPU 41 of the control device 4 shown in FIG. 6 executing a measurement program stored in the memory 42, for example.
  • the measurement of the measurement object 10 is performed based on at least the visibility value.
  • In step S1, the CPU 41 displays, on the display surface of the LCD 20, an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both.
  • For example, the CPU 41 displays, on the display surface of the LCD 20, the first of the four types of striped illumination patterns generated as described above.
  • In step S2, the CPU 41 controls the CCD camera 30 to capture an image of the measurement object 10 arranged in front of the display surface of the LCD 20 displaying the striped illumination pattern, and stores the captured image in the memory 42.
  • In step S3, the CPU 41 determines whether imaging of the measurement target 10 using all four types of striped illumination patterns has been completed. If the determination result is NO, the process proceeds to step S4; if the determination result is YES, the process proceeds to step S5.
  • In step S4, the CPU 41 generates a striped illumination pattern whose phase is changed by π/2 radians (rad), and the process returns to step S1.
  • In step S5, the CPU 41 calculates an average luminance image 801 and a visibility image 802 from the four captured images 800-1 to 800-4, as shown in FIG. 13 in this example.
  • In step S6, the CPU 41 obtains each edge position used for dimension measurement from the average luminance image 801 or the visibility image 802, separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the striped illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing, and measures the dimension of the measurement object 10 from the binarized image.
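A toy sketch of reading a dimension out of the binarized image (the pixel pitch, names, and values are hypothetical; a real system would use a calibrated scale factor):

```python
import numpy as np

def width_from_mask_row(mask_row, mm_per_pixel=0.05):
    """Dimension along one row of a binarized image: the span between the
    first and last target pixels, converted with an assumed pixel pitch."""
    idx = np.flatnonzero(np.asarray(mask_row))
    if idx.size == 0:
        return 0.0
    return float(idx[-1] - idx[0] + 1) * mm_per_pixel

row = np.array([0, 0, 1, 1, 1, 1, 0, 0], dtype=bool)  # a 4-pixel-wide target
print(width_from_mask_row(row))  # 4 px * 0.05 mm/px = 0.2
```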
  • FIG. 15 is a diagram illustrating an example of binarization processing.
  • For convenience of explanation, FIG. 15 shows an example in which the visibility image 802 including the image 10A of the measurement target 10 is binarized.
  • When the CPU 41 performs binarization processing using a threshold value on the visibility image 802 shown in the upper part of FIG. 15 in step S6, the binarized visibility image 802-1 shown in the lower part of FIG. 15 is obtained.
  • In the binarized visibility image 802-1, which includes the measurement target image 10A-1, the background, that is, the image of the striped illumination pattern displayed on the LCD 20 (the illumination portion), is clearly separated from the image of the measurement target 10 (the target portion).
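The threshold-based separation of step S6 can be sketched as below. The stripes of the displayed pattern keep a high visibility, while the target portion, which blocks the pattern, shows a low visibility. The threshold value and the extent-based measurement helper are illustrative assumptions, not values from the source.

```python
import numpy as np

def binarize_visibility(vis, threshold=0.25):
    """Label pixels whose visibility falls below the threshold as the
    target portion (True); the rest is the illumination portion.
    The threshold value is illustrative only."""
    return np.asarray(vis) < threshold

def target_extent(mask, axis=0):
    """Pixel extent of the target portion along one axis -- a crude
    stand-in for the edge-position-based dimension measurement."""
    occupied = np.any(mask, axis=1 - axis)
    idx = np.flatnonzero(occupied)
    return 0 if idx.size == 0 else int(idx[-1] - idx[0] + 1)
```

In practice the extent in pixels would still have to be converted to a physical dimension using the imaging system's calibration, which the excerpt does not detail.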
  • The determination as to which of the average luminance image 801 and the visibility image 802 should be used to obtain the dimensions of the measurement target 10 may be made, for example, as follows. First, in an initial state in which the measurement target 10 is not yet arranged in front of the display surface of the LCD 20, so that only the illumination pattern displayed on the display surface appears in the image captured by the CCD camera 30, an initial average luminance image Av0(x, y) and an initial visibility image V0(x, y) are obtained in advance in the same manner as in step S5 and stored in the memory 42.
  • Then, the difference between the initial average luminance image Av0(x, y) and the average luminance image Av(x, y) obtained from images captured by the CCD camera 30 with the measurement target 10 arranged in front of the display surface of the LCD 20 is compared with the corresponding difference between the initial visibility image V0(x, y) and the visibility image V(x, y), and the pixel values of whichever image shows the larger difference may be used for the dimension calculation.
  • In step S7, the CPU 41 thus selects whichever of the average luminance image 801 and the visibility image 802 has the larger difference from its initial-state counterpart, outputs the dimension measurement result obtained by measuring the dimensions of the measurement target 10 using the pixel values of the selected image, and ends the process.
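The selection between the average luminance image and the visibility image described above can be sketched as follows. How the two per-pixel differences are aggregated and compared is not specified in this excerpt; a mean absolute difference is used here purely for illustration.

```python
import numpy as np

def select_larger_difference(av, vis, av0, vis0):
    """Pick whichever of the average luminance image (av) and the
    visibility image (vis) differs more from its initial, target-absent
    counterpart (av0, vis0). Mean absolute difference is an assumption."""
    d_av = np.mean(np.abs(np.asarray(av, dtype=float) - np.asarray(av0, dtype=float)))
    d_vis = np.mean(np.abs(np.asarray(vis, dtype=float) - np.asarray(vis0, dtype=float)))
    return ("average luminance", av) if d_av >= d_vis else ("visibility", vis)
```

Note that the two images live on different scales (raw luminance versus a dimensionless contrast), so a real implementation would likely normalize each difference before comparing.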
  • In step S7, the CPU 41 may output and display the dimension measurement result of the measurement target 10 on, for example, the monitor device 6 shown in FIG. 7.
  • Since more information is available when the visibility value, the average luminance value, and the like are obtained from a plurality of images than when they are obtained from a single image, the accuracy of the visibility value, the average luminance value, and the like can be further improved.
  • FIG. 16 is a diagram showing an example of the brightness profiles along the virtual line 52E of the average luminance image 801 and the visibility image 802 shown in FIG. 13.
  • In FIG. 16, the vertical axis indicates the brightness of the average luminance image 801 and the visibility image 802 in arbitrary units, and the horizontal axis indicates the position in the vertical direction along the virtual line 52E in the average luminance image 801 and the visibility image 802, also in arbitrary units.
  • As indicated by the broken line I in FIG. 16, the change in brightness at the upper edge portion of the measurement target 10 in the average luminance image 801 is small, so the dimension detection sensitivity for the measurement target 10 is relatively low.
  • In the visibility image 802, by contrast, the dimension detection sensitivity for the measurement target 10 is relatively high; it was confirmed that a dimension detection sensitivity about 30 times that obtained with the average luminance image 801 was achieved.
  • Note that the average luminance image 801 need not be obtained, and the dimension measurement may be performed based only on the visibility image 802. In this case, the determination of which of the average luminance image 801 and the visibility image 802 to select for obtaining the dimensions of the measurement target 10 can also be omitted.
  • In the example described above, a stripe pattern in one direction was used as the spatial variation of the phase of the irradiation intensity; however, this is not a limitation, and various spatially varying patterns other than the stripe pattern may be used.
  • Likewise, in the example described above, the initial phase of the striped pattern was changed as the temporal variation of the phase of the irradiation intensity; however, a change in illumination intensity, a change in illumination color, or a combination thereof may also be used.
  • FIG. 17 is a flowchart for explaining a second example of the measurement process.
  • the measurement process illustrated in FIG. 17 can be executed by, for example, the CPU 41 of the control device 4 illustrated in FIG. 6 executing a measurement program stored in the memory 42.
  • In this second example, an illumination pattern in which the phase of the illumination intensity is temporally modulated is used, and the dimensions of the measurement target 10 are measured based on at least a luminance difference value. The phase of the illumination intensity of the illumination pattern may additionally be modulated spatially.
  • In step S11, the CPU 41 sets the illumination intensity of the LCD 20 to an initial value; for example, the entire surface of the LCD 20 is set to a single irradiation intensity.
  • In step S12, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern of the initial illumination intensity displayed on the display surface of the LCD 20 and the measurement target 10 arranged in front of the display surface, and stores the captured image in the memory 42.
  • In step S13, the CPU 41 changes the illumination intensity of the LCD 20.
  • In step S14, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern of the changed illumination intensity displayed on the display surface of the LCD 20 and the measurement target 10 arranged in front of the display surface, and stores the captured image in the memory 42.
  • In step S15, the CPU 41 calculates a luminance difference image from the two captured images stored in the memory 42. Specifically, the CPU 41 calculates, for each pixel, the luminance difference value between the luminance value of the captured image captured with the illumination pattern of the initial illumination intensity and the luminance value of the captured image captured with the illumination pattern of the changed illumination intensity, and forms the luminance difference image from these luminance difference values.
  • For example, if the pixel values of the captured images captured in steps S12 and S14 are represented by I′1(x, y) and I′2(x, y), respectively, the luminance difference value calculated in step S15 can be calculated as ΔI(x, y) = |I′2(x, y) − I′1(x, y)|.
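Step S15 can be sketched as a per-pixel absolute difference between the two captures. The illumination portion follows the intensity change and produces a large difference, while the target portion, which blocks the pattern, changes little; the exact difference formula (absolute difference here) is an assumption based on the definitions above.

```python
import numpy as np

def luminance_difference(img1, img2):
    """Per-pixel luminance difference |I'2(x, y) - I'1(x, y)| between
    the capture at the initial intensity (img1, step S12) and the
    capture at the changed intensity (img2, step S14)."""
    a = np.asarray(img1, dtype=float)
    b = np.asarray(img2, dtype=float)
    return np.abs(b - a)
```

Thresholding this difference image low/high then yields the target/illumination separation performed in step S16.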
  • In step S16, the CPU 41 obtains each edge position used for the dimension measurement from the luminance difference image, and separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing.
  • In step S17, the CPU 41 outputs the dimension measurement result obtained by measuring the dimensions of the measurement target 10 using the luminance difference values (pixel values) of the luminance difference image, and the process ends.
  • In step S17, the CPU 41 may output and display the dimension measurement result of the measurement target 10 on, for example, the monitor device 6 illustrated in FIG. 7.
  • FIG. 18 is a flowchart for explaining a third example of the measurement process.
  • The measurement process shown in FIG. 18 can be executed by, for example, the CPU 41 of the control device 4 shown in FIG. 6 executing a measurement program stored in the memory 42.
  • In this third example, an illumination pattern in which the phase of the illumination color is temporally modulated is used, and the dimensions of the measurement target 10 are measured based on at least a color tone difference value or a color difference value. The phase of the illumination color of the illumination pattern may additionally be modulated spatially.
  • In step S21, the CPU 41 sets the illumination color of the LCD 20 to an initial value; for example, the entire surface of the LCD 20 is set to a single illumination color.
  • In step S22, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern of the initial illumination color displayed on the display surface of the LCD 20 and the measurement target 10 arranged in front of the display surface, and stores the captured image in the memory 42.
  • In step S23, the CPU 41 changes the illumination color of the LCD 20.
  • In step S24, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern of the changed illumination color displayed on the display surface of the LCD 20 and the measurement target 10 arranged in front of the display surface, and stores the captured image in the memory 42.
  • In step S25, the CPU 41 calculates a color tone difference image or a color difference image from the two captured images stored in the memory 42. Specifically, the CPU 41 calculates, for each pixel and each color component, the luminance difference value between the captured image captured with the illumination pattern of the initial illumination color and the captured image captured with the illumination pattern of the changed illumination color, and forms a color tone difference image from the color tone difference values or a color difference image from the color difference values.
  • For example, if the pixel values of the red (R) component of the captured images captured in steps S22 and S24 are represented by I′R1(x, y) and I′R2(x, y), respectively, the luminance difference value of the red (R) component calculated in step S25 can be calculated as ΔIR(x, y) = |I′R2(x, y) − I′R1(x, y)|.
  • Similarly, if the pixel values of the green (G) component of the captured images captured in steps S22 and S24 are represented by I′G1(x, y) and I′G2(x, y), respectively, the luminance difference value of the green (G) component calculated in step S25 can be calculated as ΔIG(x, y) = |I′G2(x, y) − I′G1(x, y)|.
  • Likewise, if the pixel values of the blue (B) component of the captured images captured in steps S22 and S24 are represented by I′B1(x, y) and I′B2(x, y), respectively, the luminance difference value of the blue (B) component calculated in step S25 can be calculated as ΔIB(x, y) = |I′B2(x, y) − I′B1(x, y)|.
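The per-channel differences of step S25 can be sketched as below. The excerpt omits how the three channel differences are combined into a single color difference value, so a Euclidean magnitude over (ΔR, ΔG, ΔB) is used here purely as an illustrative assumption.

```python
import numpy as np

def color_difference(img1, img2):
    """Per-channel differences dR, dG, dB between two RGB captures
    (initial illumination color, step S22, vs. changed color, step S24),
    combined here as a Euclidean magnitude per pixel -- the combination
    rule is an assumption, not taken from the source."""
    a = np.asarray(img1, dtype=float)
    b = np.asarray(img2, dtype=float)
    d = b - a                                # shape (H, W, 3): dR, dG, dB
    return np.sqrt(np.sum(d ** 2, axis=-1))  # shape (H, W)
```

As in the second example, the illumination portion follows the color change and yields large values, while the target portion stays near zero, enabling the binarization of step S26.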
  • In step S26, the CPU 41 obtains each edge position used for the dimension measurement from the color tone difference image or the color difference image, and separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing.
  • In step S27, the CPU 41 outputs the dimension measurement result obtained by measuring the dimensions of the measurement target 10 using the color tone difference values (pixel values) of the color tone difference image or the color difference values (pixel values) of the color difference image, and the process ends.
  • The CPU 41 may output and display the dimension measurement result of the measurement target 10 on, for example, the monitor device 6 illustrated in FIG. 7.
  • The dimension value obtained by measuring the measurement target 10 based on the average luminance value by the procedures described with reference to FIGS. 14, 17 and 18 may be compared with the dimension value obtained by measuring the measurement target 10 based on the visibility value, and the larger of the two dimension values may be output as the dimension measurement result.
  • Alternatively, the maximum of the dimension values obtained by measuring the measurement target 10 based on the visibility value, the average luminance value, the luminance difference value, and the color tone difference value or color difference value may be output as the dimension measurement result.
  • The CPU 41 may also calculate luminance information representing the average of, or the difference between, the luminance values of corresponding pixels of the plurality of images, and measure the dimensions of the measurement target based on the pixels whose difference from the initial luminance information of the illumination pattern in each pixel of the initial image, calculated in advance, is equal to or greater than a certain value. In this case, the larger of the dimension value measured based on the visibility value and the dimension value measured based on the luminance information may be output as the dimension measurement result.
  • Similarly, the CPU 41 may calculate color information representing the color tone value or color value of corresponding pixels of the plurality of images, and measure the dimensions of the measurement target based on the pixels whose difference from the initial color information of the illumination pattern in each pixel of the initial image, calculated in advance, is equal to or greater than a certain value. In this case, the larger of the dimension value measured based on the visibility value and the dimension value measured based on the color information may be output as the dimension measurement result.
  • In the above embodiment, an LCD that has a backlight and displays the illumination pattern is used as the illumination device. However, the illumination device may instead be a plasma display device that displays the illumination pattern, an organic electroluminescence display device that displays the illumination pattern, a combination of a screen and a projection device, or the like.
  • When a screen is used, the projection device may project the illumination pattern onto a first surface of the screen, and the measurement target arranged on the near side of the first surface, which corresponds to the display surface of the illumination pattern, may be illuminated by the illumination pattern projected onto the first surface.
  • Alternatively, the projection device may project the illumination pattern onto the first surface of the screen, and the measurement target arranged on the near side of the second surface opposite to the first surface, which then corresponds to the display surface of the illumination pattern, may be illuminated by the illumination pattern transmitted through to the second surface.
  • Note that in the first example, the information obtained is not limited to the average luminance value: the visibility value is used, and is compared with the average luminance value as necessary.
  • According to the above embodiment, flat-type illumination of the same form as ordinary flat illumination can be used to measure dimensions with a high sensitivity equivalent to that of telecentric illumination, so the dimensions of relatively large parts, which are difficult to handle with telecentric illumination, can be measured with high sensitivity.
  • Furthermore, by obtaining the visibility value through detection of temporal changes in the pixel values, the adverse effect of external light, which can be regarded as having constant illuminance over a short period, can be reduced, and n-th order reflected light, which does not readily preserve the spatial fringe information, can be removed effectively, eliminating the need to perform the dimension measurement in a darkroom.
  • With telecentric illumination, an illumination pattern in which the phase of the irradiation intensity is spatially modulated cannot be generated, but an illumination pattern in which the phase of the irradiation intensity is temporally modulated can be generated. Therefore, in a modification of the above embodiment, an illumination pattern in which the phase of the irradiation intensity is temporally modulated is generated using telecentric illumination instead of the LCD.
  • In this modification, the adverse effect of reflected or diffused light generated when external light or secondary reflected light from objects other than the measurement target strikes the measurement target can be suppressed, improving measurement accuracy. Measurement accuracy can therefore be improved without placing a measurement device that uses telecentric illumination in a darkroom.
  • According to each of the embodiments described above, the adverse effect of such reflected or diffused light can likewise be suppressed, so measurement accuracy can be improved without placing the measurement device in a darkroom.
  • In addition, the measurement device can be reduced in size and cost, and in this case the dimensions of relatively large parts can be measured with high accuracy.
  • Needless to say, the present invention is not limited to the examples described above, and various modifications and improvements are possible within the scope of the present invention.

Abstract

This measurement device comprises: an illumination device having a display surface for displaying an illumination pattern; an imaging device that captures an image of an object of measurement that is disposed in front of the display surface of the illumination device and illuminated by the illumination pattern; and a control device that generates an illumination pattern in which the illumination intensity phase is modulated spatially and/or temporally, outputs a command to the illumination device to display the illumination pattern on the display surface of the illumination device, and performs synchronization control to synchronize the operations of the illumination device and the imaging device. The control device divides the amplitude of the luminance of each pixel of the image by the average value of the luminance of the image to calculate a visibility value of the illumination pattern for each pixel of the image, and measures the dimensions of the object of measurement on the basis of the visibility image thus generated.

Description

Measurement device, measurement method, and measurement program
The present invention relates to a measurement device, a measurement method, and a measurement program.
In order to maintain the quality of industrial products, the dimensions of parts and the like are measured at the manufacturing stage. In recent years, there has been a demand for automating the dimension measurement of relatively large parts used in automobiles and the like.
When performing dimension measurement, there are two approaches: illuminating the measurement target from the front with front-light illumination and measuring the dimensions using the image reflected from the measurement target, or illuminating the measurement target from behind with backlight illumination and measuring the dimensions using a silhouette image of the measurement target. The latter approach can be further classified into two methods: one using telecentric illumination, in which a telecentric optical system irradiates the measurement target with illumination containing only light rays parallel to the optical axis of the camera, and one using flat illumination, in which, for example, light emitting diodes (LEDs) and a diffusion plate irradiate the measurement target with illumination containing light rays in various directions.
Generally, in dimension measurement, it is preferable to irradiate the measurement target with a light source whose rays contain only components parallel to the optical axis of the camera, as in telecentric illumination. This is because light rays that are not parallel to the optical axis of the camera are reflected or diffused at the surface of the measurement target and enter the camera, so that an ideal silhouette image cannot be obtained due to the adverse effect of the reflected or diffused light.
On the other hand, when telecentric illumination is used, it is difficult to build large-aperture illumination because the optical system is complicated. The aperture of lenses for telecentric illumination currently on the market is at most several hundred millimeters, and it is difficult to illuminate the entire surface of a relatively large part with a single telecentric illumination unit. Although relatively large parts can be handled by combining a plurality of telecentric illumination units, large-aperture lenses are expensive, and combining a plurality of expensive telecentric illumination units is not practical from a cost standpoint. In addition, because telecentric illumination requires a complicated optical system, it is difficult to reduce its depth and hence its overall size.
Furthermore, even when telecentric illumination is used, if reflected or diffused light, generated when external light from a light source or secondary reflected light from objects other than the measurement target strikes the measurement target, enters the camera, an ideal silhouette image cannot be obtained and measurement accuracy is degraded. To suppress the adverse effect of such reflected or diffused light and improve measurement accuracy, the entire measurement device must be placed in a darkroom, which increases the size and cost of the measurement device.
On the other hand, when flat illumination is used, the measurement target is irradiated with illumination containing light rays in various directions other than the components parallel to the optical axis of the camera, so it is difficult to obtain an ideal silhouette image; however, illumination of a relatively large area can be realized at relatively low cost. Flat illumination can therefore accommodate the dimension measurement of relatively large parts. Moreover, because the configuration of the optical system is relatively simple, the depth can easily be reduced and the measurement device made compact. Nevertheless, even with flat illumination, the entire measurement device must be placed in a darkroom, as with telecentric illumination, in order to suppress the adverse effect of reflected or diffused light generated when external light or secondary reflected light from objects other than the measurement target strikes the measurement target.
Japanese Laid-Open Patent Publication No. 11-304452; Japanese Laid-Open Patent Publication No. 9-203613; Japanese Laid-Open Patent Publication No. 7-260703; Japanese Laid-Open Patent Publication No. 2010-25602
With conventional measurement devices, it is difficult to suppress the adverse effect of reflected or diffused light generated when external light or secondary reflected light from objects other than the measurement target strikes the measurement target, and it is therefore difficult to improve measurement accuracy.
Accordingly, in one aspect, an object is to provide a measurement device, a measurement method, and a measurement program capable of improving measurement accuracy.
According to one proposal, a measurement device is provided that includes: an illumination device having a display surface that displays an illumination pattern; an imaging device that captures an image of a measurement target arranged in front of the display surface of the illumination device and illuminated by the illumination pattern; and a control device that generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, outputs an instruction to the illumination device to display the illumination pattern on the display surface of the illumination device, and performs synchronization control to synchronize the operations of the illumination device and the imaging device, wherein the control device divides the amplitude of the luminance of each pixel of the image by the average value of the luminance of the image to calculate a visibility value of the illumination pattern for each pixel of the image, and measures the dimensions of the measurement target based on the visibility image thus generated.
According to one aspect, measurement accuracy can be improved.
FIG. 1A is a diagram illustrating dimension measurement using telecentric illumination. FIG. 1B is a diagram illustrating dimension measurement using flat illumination. FIG. 2 is a diagram illustrating a case where a measurement target is imaged under flat illumination. FIG. 3A is a diagram showing a silhouette image obtained by imaging the measurement target of FIG. 2 under telecentric illumination. FIG. 3B is a diagram showing a silhouette image obtained by imaging the measurement target of FIG. 2 under flat illumination. FIG. 4 is a diagram illustrating reflected or diffused light generated when external light or secondary reflected light from objects other than the measurement target strikes the measurement target. FIG. 5 is a diagram showing the silhouette image obtained in the case of FIG. 4. FIG. 6 is a diagram showing an example of the hardware configuration of a measurement device in one embodiment. FIG. 7 is a diagram showing an example of the functional configuration of the control device shown in FIG. 6. FIG. 8 is a diagram illustrating a case where a measurement target is imaged under an illumination pattern of flat illumination. FIG. 9 is a diagram showing an example of an image, captured by the CCD camera, of a striped illumination pattern displayed on the LCD with no measurement target arranged in front of the display surface of the LCD. FIG. 10 is a diagram showing an example of the luminance profile of the image shown in FIG. 9.
FIG. 11 is a diagram showing an example of an image, captured by the CCD camera, of a striped illumination pattern displayed on the display surface of the LCD with the measurement target arranged in front of the display surface of the LCD. FIG. 12 is a diagram showing an example of the brightness profile of the image shown in FIG. 11. FIG. 13 is a diagram showing an example of an average luminance image and a visibility image calculated from images of the measurement target arranged in front of four striped illumination patterns. FIG. 14 is a flowchart explaining a first example of the measurement process. FIG. 15 is a diagram explaining an example of binarization processing. FIG. 16 is a diagram showing an example of the brightness profiles along the virtual line of the average luminance image and the visibility image shown in FIG. 13. FIG. 17 is a flowchart explaining a second example of the measurement process. FIG. 18 is a flowchart explaining a third example of the measurement process.
In the disclosed measurement device, measurement method, and measurement program, the illumination device displays an illumination pattern on its display surface, and the imaging device captures an image of the measurement target that is arranged in front of the display surface of the illumination device and illuminated by the illumination pattern. The control device generates an illumination pattern in which the phase of the irradiation intensity is modulated spatially, temporally, or both, outputs an instruction to the illumination device to display the illumination pattern on its display surface, and performs synchronization control to synchronize the operations of the illumination device and the imaging device. In addition, the control device divides the amplitude of the luminance of each pixel of the image by the average value of the luminance of the image to calculate a visibility value of the illumination pattern for each pixel of the image, and measures the dimensions of the measurement target based on the visibility image thus generated.
Embodiments of the disclosed measurement device, measurement method, and measurement program are described below with reference to the drawings.
In general, in dimension measurement using backlight illumination, it is preferable to irradiate the measurement target 10 with a light source in which the light rays 500 contain only components parallel to the optical axis 510 of a camera (not shown), as in the telecentric illumination shown in FIG. 1A. This is because, as with the flat illumination 520 shown in FIG. 1B, light rays 501 that are not parallel to the optical axis 510 of the camera are reflected or diffused at the surface of the measurement target 10 and enter the camera, so that an ideal silhouette image cannot be obtained due to the adverse effect of the reflected light and/or diffused light 502.
FIG. 2 shows a case where a camera 530 images a cylinder, which is an example of the measurement target 10, under flat illumination 520. FIG. 3A shows a silhouette image obtained by imaging the measurement target 10 of FIG. 2 under telecentric illumination, and FIG. 3B shows a silhouette image obtained by imaging the measurement target 10 of FIG. 2 under the flat illumination 520. The silhouette image of FIG. 3A is an ideal silhouette image in which the entire measurement target 10 appears as a shadow, provided that no reflected or diffused light, generated when external light or secondary reflected light from objects other than the measurement target strikes the measurement target 10, enters the camera. The silhouette image of FIG. 3B, on the other hand, is not an ideal silhouette image, because part of the measurement target 10 appears bright due to reflected or diffused light. In the example shown in FIG. 3B, the dimension of the measurement target 10 measured from the silhouette image is smaller than in the example shown in FIG. 3A by an amount corresponding to D at the upper and lower portions.
 Furthermore, as shown in FIG. 4, even when telecentric illumination is used, if external light 503 from an external light source 540 such as a fluorescent lamp, secondary reflected light 504 from an object 541 other than the measurement object 10, or the like strikes the measurement object 10 and at least one of the resulting reflected light and diffused light 505 enters the camera, an ideal silhouette image cannot be obtained, as can be seen from FIG. 5, and the measurement accuracy deteriorates. As an example, FIG. 5 shows a silhouette image 105 that includes a portion 101 of the measurement object 10 where a reflection of a fluorescent lamp, an example of the light source 540, appears, a portion 102 of the measurement object 10 that is lit by the external light 503, and a portion 103 of the measurement object 10 where a reflection of the object 541 other than the measurement object 10 appears.
 To suppress the adverse effect of at least one of such reflected light and diffused light 505 and improve the measurement accuracy, the entire measurement device must be placed in a darkroom, which increases the size and cost of the measurement device.
 When flat illumination is used, on the other hand, the measurement object is irradiated with illumination containing rays in various directions besides components parallel to the camera's optical axis, so it is difficult to obtain an ideal silhouette image; however, illumination with a relatively large area can be realized at relatively low cost. Flat illumination can therefore handle dimension measurement of relatively large parts. In addition, because the optical system has a relatively simple configuration when flat illumination is used, the depth of the device can easily be reduced, making the measurement device compact. Even with flat illumination, however, suppressing the adverse effect of reflected or diffused light produced when external light from an external light source or secondary reflections from objects other than the measurement object strike the measurement object requires placing the entire measurement device in a darkroom, just as with telecentric illumination.
 The following therefore describes embodiments that improve measurement accuracy by suppressing the adverse effect of reflected or diffused light produced when light from an external light source or secondary reflections from objects other than the measurement object strike the measurement object. In each of the embodiments described below, measurement accuracy can be improved without placing the measurement device in a darkroom.
 FIG. 6 is a diagram illustrating an example of the hardware configuration of a measurement device in one embodiment. In this example, the measurement device 1, which uses backlight illumination, includes an illumination device 2, an imaging device 3, and a control device 4. The control device 4 can be formed by a computer such as a general-purpose computer. In this example, the control device 4 includes a CPU (Central Processing Unit) 41, an example of a processor, and a memory 42, an example of a storage device. The CPU 41 executes programs, including a measurement program, stored in the memory 42, and performs the measurement processing and the like described later. The memory 42 stores programs, data, and the like. The memory 42 can be formed by a computer-readable recording medium such as a portable recording medium such as a USB (Universal Serial Bus) memory, a semiconductor storage device such as a flash memory, a magnetic recording medium, an optical recording medium such as a CD-ROM (Compact Disk-Read Only Memory) or a DVD (Digital Versatile Disk), or a magneto-optical recording medium. When a magnetic recording medium such as a disk, an optical recording medium, or a magneto-optical recording medium is used as the memory 42, the recording medium is loaded into a drive such as a disk drive; the drive reads programs and the like from the recording medium and writes data and the like to the recording medium as needed. In FIG. 6, for convenience, an input device and a display device connectable to the computer wirelessly or by wire are not shown.
 The illumination device 2 can be formed by, for example, a well-known liquid crystal display (LCD) device having a display surface and a backlight, and displays an illumination pattern on the display surface. The imaging device 3 can be formed by a well-known CCD (Charge-Coupled Device) camera or the like, and captures an image of the measurement object 10, which is placed in front of the display surface of the illumination device 2 and illuminated by the illumination pattern displayed on the display surface. In other words, the display surface (or light-emitting surface) of the illumination device 2 is located behind the measurement object 10 as seen from the imaging device 3. The imaging device 3 therefore simultaneously captures the illumination pattern displayed on the display surface of the illumination device 2 and the image of the measurement object 10 placed in front of that display surface and illuminated by the illumination pattern.
 The control device 4 generates an illumination pattern whose irradiation-intensity phase is modulated spatially, temporally, or both, outputs to the illumination device 2 an instruction to display the illumination pattern on its display surface, and performs synchronization control that synchronizes the operation of the illumination device 2 and the imaging device 3. The control device 4 also divides the luminance amplitude at each pixel of the image captured by the imaging device 3 by the average value of the luminance of the image (hereinafter also referred to as the "average luminance value") to calculate the visibility value of the illumination pattern at each pixel of the image, and measures the dimensions of the measurement object 10 based on the visibility image generated from these values. The control device 4 may apply binarization using a threshold to the generated visibility image and measure the dimensions of the measurement object 10 based on information that distinguishes the illumination portion from the object portion. The threshold may be set by the operator as a fixed value, or may be determined automatically by a technique such as adaptive binarization. Alternatively, a value slightly lower than the initial visibility value of the illumination pattern at each pixel of an initial image, captured by the imaging device 3 in an initial state in which the measurement object 10 is not placed in front of the display surface of the illumination device 2, may be adopted as the threshold. Furthermore, instead of binarization using a threshold, the dimensions of the measurement object 10 may be measured based on the pixels at which the visibility value calculated with the measurement object 10 present differs by a certain amount or more from the initial visibility value at each pixel of such an initial image. In this way, the control device 4 executes measurement processing, including the synchronization control, that measures the dimensions of the measurement object 10. The object forming the measurement object 10 is not particularly limited.
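 By way of illustration only (this sketch is not part of the embodiment; the array values, the threshold of 0.5, and the function name are hypothetical), the binarization of a visibility image described above can be expressed in Python as follows:

```python
import numpy as np

def binarize_visibility(visibility: np.ndarray, threshold: float) -> np.ndarray:
    """Classify each pixel: True = illumination portion (high fringe
    visibility), False = object portion (visibility degraded by the
    measurement object's surface)."""
    return visibility >= threshold

# Hypothetical 1-D slice of a visibility image: the background fringe
# keeps visibility near 0.9, while the object drops it to around 0.2.
vis = np.array([0.92, 0.90, 0.88, 0.25, 0.18, 0.22, 0.89, 0.91])
mask = binarize_visibility(vis, threshold=0.5)

# The object's extent along this slice is the run of False pixels.
object_width_px = int(np.count_nonzero(~mask))
print(object_width_px)  # → 3
```

 Counting the pixels classified as the object portion, and multiplying by the calibrated pixel pitch, would yield a dimension; the fixed threshold here stands in for the operator-set or adaptively determined threshold mentioned above.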
 FIG. 7 is a diagram illustrating an example of the functional configuration of the control device shown in FIG. 6. In this example, the LCD 20 is an example of the illumination device 2, having a display surface and a backlight and displaying an illumination pattern on the display surface. The CCD camera 30, which captures the illumination pattern displayed on the display surface of the LCD 20 together with the image of the measurement object 10 placed in front of that display surface, is an example of the imaging device 3. The computer 40 is an example of the control device 4: it generates an illumination pattern whose irradiation-intensity phase is modulated spatially, temporally, or both, outputs to the LCD 20 an instruction to display the illumination pattern on the display surface of the LCD 20, and executes measurement processing including synchronization control that synchronizes the operation of the LCD 20 and the CCD camera 30. The monitor device 6 is an example of a display device that displays messages and the like to the operator of the computer 40. The keyboard 5 is an example of an input device operated when the operator of the computer 40 inputs commands, data, and the like to the computer 40. The input device may instead be a mouse or the like.
 As shown in FIG. 7, the computer 40 includes an image input unit 411, a pattern output unit 412, a synchronization control circuit 413, and a data processing unit 414. The image input unit 411 has a functional configuration including an image input circuit 411-1 and an input image storage memory 411-2. The pattern output unit 412 includes a pattern generation circuit 412-1 and a pattern output circuit 412-2. The data processing unit 414 includes an image calculation circuit 414-1, a dimension calculation circuit 414-2, and a result storage memory 414-3. The functions of the image input unit 411, the pattern output unit 412, the synchronization control circuit 413, and the data processing unit 414 can each be realized by, for example, the CPU 41 shown in FIG. 6 executing the measurement program stored in the memory 42. The input image storage memory 411-2 and the result storage memory 414-3, on the other hand, can be formed by the memory 42, for example.
 The pattern output unit 412 outputs to the LCD 20 an instruction to display an illumination pattern on the display surface of the LCD 20. More specifically, the pattern generation circuit 412-1 generates an illumination pattern whose irradiation-intensity phase is modulated spatially, temporally, or both, and the pattern output circuit 412-2 outputs to the LCD 20 an instruction to display the generated illumination pattern on the display surface of the LCD 20. This allows the synchronization control circuit 413 to perform synchronization control and synchronize the operation of the LCD 20 and the CCD camera 30 while the irradiation-intensity phase of the illumination pattern displayed on the LCD 20 is modulated spatially, temporally, or both spatially and temporally, according to, for example, a default setting.
 The image input unit 411 acquires one or more images captured by the CCD camera 30 and stores them temporarily. More specifically, the image input circuit 411-1 acquires one or more images captured by the CCD camera 30, and the input image storage memory 411-2 temporarily stores the acquired image or images. When the image input unit 411 acquires multiple images from the CCD camera 30, the images acquired by the image input circuit 411-1 can be stored temporarily in the input image storage memory 411-2 while the pattern output unit 412 changes the illumination pattern displayed on the LCD 20. The images stored in the input image storage memory 411-2 are supplied to the data processing unit 414 and used for measuring the dimensions of the measurement object 10.
 In the data processing unit 414, the image calculation circuit 414-1 calculates a visibility image and an average luminance image, described later, based on the images supplied from the image input unit 411. The dimension calculation circuit 414-2 calculates the dimensions of the measurement object 10 based on the calculated visibility image and average luminance image. The result storage memory 414-3 stores the calculated dimensions of the measurement object 10. The dimensions of the measurement object 10 stored in the result storage memory 414-3 can be displayed on the monitor device 6, for example.
 When the pattern output unit 412, in accordance with the synchronization control, outputs to the LCD 20 an instruction to display on its display surface an illumination pattern whose irradiation-intensity phase is spatially modulated, the data processing unit 414 calculates the visibility values and the like based on at least one image captured by the CCD camera 30. An illumination pattern whose irradiation-intensity phase is spatially modulated is, for example, a striped pattern or a pattern in which the color tone or color varies, and the pattern is preferably regular.
 When the pattern output unit 412, in accordance with the synchronization control, outputs to the LCD 20 an instruction to display on its display surface multiple illumination patterns whose irradiation-intensity phase is temporally modulated, the data processing unit 414 calculates the visibility values and the like based on multiple images captured by the CCD camera 30. An illumination pattern whose irradiation-intensity phase is temporally modulated is, for example, a striped pattern or a pattern in which the color tone or color varies at each time the CCD camera 30 captures an image; the pattern is preferably regular and preferably changes regularly from one time to the next. In this case, as long as the patterns differ in irradiation intensity or illumination color at different times, the entire display surface of the LCD 20 may have a single irradiation intensity or a single illumination color.
 When the pattern output unit 412, in accordance with the synchronization control, outputs to the LCD 20 an instruction to display on its display surface multiple illumination patterns whose irradiation-intensity phase is modulated both spatially and temporally, the data processing unit 414 calculates the visibility values and the like based on multiple images captured by the CCD camera 30. An illumination pattern whose irradiation-intensity phase is modulated both spatially and temporally is, for example, a striped pattern or a pattern in which the irradiation intensity or illumination color varies at each time the CCD camera 30 captures an image; at least one of the variation within the pattern itself and the variation of the pattern from one time to the next may be regular. In this case, it is desirable that the striped pattern, the pattern in which the irradiation intensity or illumination color varies, or the like change regularly from one time to the next.
 The instruction that the pattern output unit 412 outputs to the LCD 20 to display an illumination pattern on the display surface of the LCD 20 may be a signal designating the illumination pattern to be displayed, or may be the illumination pattern to be displayed itself, i.e., a signal representing the illumination pattern to be displayed.
 The data processing unit 414 may calculate luminance information representing the average of the luminance values of corresponding pixels of the multiple images, or the difference between those luminance values, measure the dimensions of the measurement object 10 based on the pixels at which the difference from pre-calculated initial luminance information of the illumination pattern at each pixel of an initial image, captured by the CCD camera 30 in an initial state in which the measurement object 10 is not placed in front of the display surface of the LCD 20, is a certain amount or more, and output the larger of the measured dimensions of the measurement object 10. Alternatively, the data processing unit 414 may calculate luminance information representing the luminance values of corresponding pixels of the multiple images, or the average of those luminance values, measure the dimensions of the measurement object 10 based on the pixels at which the difference from the pre-calculated initial luminance information of the illumination pattern at each pixel of the initial image is a certain amount or more, and output the larger of the measured dimensions of the measurement object 10.
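 The difference-against-reference measurement described above can be sketched as follows (an illustrative Python fragment; the array values, the function name, and the fixed difference of 0.3 are hypothetical and not taken from the embodiment). The same comparison applies whether the per-pixel values are visibility values or average luminance values:

```python
import numpy as np

def object_pixels_by_difference(initial: np.ndarray,
                                measured: np.ndarray,
                                min_diff: float) -> np.ndarray:
    """Mark pixels where the value measured with the object present
    deviates from the initial (object-absent) reference by at least
    min_diff. Works for visibility or average-luminance images."""
    return np.abs(initial - measured) >= min_diff

# Hypothetical values: the reference visibility is about 0.9 everywhere;
# with the object in place, its pixels drop well below the reference.
initial = np.full(6, 0.9)
measured = np.array([0.88, 0.89, 0.30, 0.35, 0.87, 0.91])
mask = object_pixels_by_difference(initial, measured, min_diff=0.3)
print(mask.tolist())  # → [False, False, True, True, False, False]
```

 The marked pixels form the object region from which a dimension can be taken, without any absolute threshold being chosen.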
 The operator may input from the keyboard 5 a command to the computer 40 designating whether the irradiation-intensity phase of the illumination pattern displayed on the LCD 20 is to be modulated spatially, temporally, or both spatially and temporally. In this case, the pattern output unit 412 outputs to the LCD 20 an instruction to display on the display surface of the LCD 20 an illumination pattern according to the setting designated by the operator, rather than the default setting.
 FIG. 8 is a diagram explaining a case where the measurement object is imaged under a flat-illumination pattern. In FIG. 8, parts that are the same as in FIG. 2 are given the same reference numerals, and their description is omitted. In this example, the illumination pattern 52 displayed on the display surface of the LCD 20 is a striped pattern whose irradiation-intensity phase is spatially modulated. In FIG. 8, within the striped illumination pattern 52, the brighter a portion appears, the higher its luminance value, and the darker it appears, the lower its luminance value.
 FIG. 9 is a diagram showing an example of an image 52A obtained by imaging, with the CCD camera 30, the striped illumination pattern 52 displayed on the display surface of the LCD 20 in a state where no measurement object 10 is placed in front of the display surface. In the striped illumination pattern of the image 52A shown in FIG. 9, the brighter a portion appears, the higher its luminance value, and the darker it appears, the lower its luminance value.
 FIG. 10 is a diagram showing an example of the luminance profile along the virtual line 52B in the image 52A shown in FIG. 9. In FIG. 10, the vertical axis indicates the luminance value of the image 52A in arbitrary units, and the horizontal axis indicates the horizontal position along the virtual line 52B in the image 52A in arbitrary units. As shown in FIG. 10, the average luminance is a value indicating the average brightness of the striped pattern of the image 52A. The visibility is the value obtained by dividing the luminance amplitude at each pixel of the striped pattern of the image 52A by the average luminance value of the image 52A. Visibility is an index of how easy the striped pattern is to see, i.e., to distinguish, and is sometimes called "contrast" or "degree of modulation".
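 As a numerical illustration of this definition (the profile values are hypothetical and not taken from FIG. 10), a sinusoidal luminance profile with mean 128 and amplitude 64 has a visibility of 64/128 = 0.5; for such a sinusoid, amplitude divided by mean coincides with the Michelson contrast (Imax − Imin)/(Imax + Imin):

```python
import numpy as np

def fringe_visibility(profile: np.ndarray) -> float:
    """Visibility = luminance amplitude / average luminance."""
    mean = profile.mean()
    amplitude = (profile.max() - profile.min()) / 2.0
    return amplitude / mean

# Synthetic fringe profile: mean 128, amplitude 64, two full periods.
x = np.linspace(0.0, 4.0 * np.pi, 400, endpoint=False)
profile = 128.0 + 64.0 * np.sin(x)

v = fringe_visibility(profile)
michelson = (profile.max() - profile.min()) / (profile.max() + profile.min())
print(round(v, 3), round(michelson, 3))  # → 0.5 0.5
```

 A perfectly dark profile (amplitude 0) would give visibility 0, and a fringe swinging between 0 and its peak would give visibility 1, matching the intuition of "degree of modulation".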
 FIG. 11 is a diagram showing an example of an image 52C obtained by simultaneously imaging, with the CCD camera 30, the striped illumination pattern 52 displayed on the display surface of the LCD 20 and the measurement object 10 placed in front of the display surface. In the image 52C shown in FIG. 11, 10A denotes the image (silhouette image) of the measurement object 10. FIG. 11 shows the region containing the virtual line 52D in the image 52C partially enlarged in the upper-right part. In the striped illumination pattern of the image 52C shown in FIG. 11, the brighter a portion appears, the higher its brightness, and the darker it appears, the lower its brightness.
 FIG. 12 is a diagram showing an example of the brightness profile along the virtual line 52D in the image 52C shown in FIG. 11. In FIG. 12, the vertical axis indicates the brightness of the image 52C in arbitrary units, and the horizontal axis indicates the vertical position along the virtual line 52D in the image 52C in arbitrary units. As shown in FIG. 11, it was confirmed that in the image 52C, the striped pattern in the bright portion of the measurement object 10 is harder to see and to distinguish than the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20. From the brightness profile along the virtual line 52D in the image 52C shown in FIG. 12, it was also confirmed that the visibility on the measurement object 10 is lower than the visibility of the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20. In other words, as the brightness profile also shows, the striped pattern in the portion of the measurement object 10 is harder to see and to distinguish than the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20.
 The striped pattern of the illumination pattern 52 displayed on the display surface of the LCD 20 is reflected on the surface of the measurement object 10 and captured by the CCD camera 30. During this imaging, blurring of the pattern caused by the surface roughness of the measurement object 10 and distortion of the striped pattern caused by the shape of the measurement object 10 cause stripe information from a relatively wide area of the striped pattern to be reflected into a relatively narrow area of the surface of the measurement object 10, so the striped pattern becomes harder to see and to distinguish. Consequently, except when the measurement object 10 is a perfect mirror surface and is flat in shape, the visibility on the measurement object 10 deteriorates compared with the striped pattern corresponding to the illumination pattern 52 displayed on the display surface of the LCD 20, and the visibility value becomes smaller.
 In the conventional method using uniform flat illumination, the edges of the measurement object are detected by capturing changes in average luminance. This method observes the decrease in the amount of light due to the reflectance of the surface of the measurement object. That is, the method of capturing changes in average luminance detects dimensions based on the reflectance information of the measurement object, whereas the method of obtaining visibility, as in the embodiment above, detects dimensions based on the surface roughness and shape information of the measurement object. The embodiment above can therefore perform dimension measurement based on two kinds of information, average luminance and visibility, and thus has more information available for dimension measurement than the conventional method using uniform flat illumination.
 Next, a method for calculating the visibility image and the average luminance image, i.e., the visibility value and the average luminance value for each pixel, from a captured image of the measurement object placed in front of the display surface of the illumination device displaying the striped illumination pattern will be described.
 The visibility value, the average luminance value, and the like can be calculated from a single image of the measurement object placed in front of the display surface of an illumination device displaying an illumination pattern, for example a striped one, whose irradiation-intensity phase is spatially modulated, as described in conjunction with FIGS. 11 and 12. However, when the visibility value, the average luminance value, and the like are obtained from a single image, the amount of information is limited to that of one image, so it is difficult to improve their accuracy further.
 An example will therefore be described in which the visibility values and the like are calculated from multiple images of the measurement object placed in front of the display surface of an illumination device displaying multiple illumination patterns, for example striped ones, whose irradiation-intensity phase is modulated spatially, temporally, or both. In this example, N kinds of striped illumination patterns (N is a natural number of 2 or more) are displayed by shifting the initial phase of the striped illumination pattern displayed on the display surface of the LCD 20 by a fixed angle at a time. While the striped illumination pattern displayed on the display surface of the LCD 20 is switched, the CCD camera 30 captures N images in synchronization with the pattern switching. The visibility value, the average luminance value, and the like of each pixel are then obtained from the N images of the measurement object 10 placed in front of the display surface on which the N kinds of striped illumination patterns are displayed.
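 The procedure just described can be simulated end to end as follows (an illustrative Python sketch, not the embodiment's implementation: the stripe period, the image width, the simple amplitude-reduction model of the object, and the standard four-step phase-shift amplitude formula are all assumptions introduced for this example):

```python
import numpy as np

N = 4                    # number of phase-shifted stripe patterns
A, B = 128.0, 64.0       # pattern brightness (offset) and amplitude
T = 16.0                 # assumed stripe period in pixels
W = 64                   # image width in pixels (1-D sketch)

x = np.arange(W)
spatial = 2.0 * np.pi * x / T

# N patterns whose initial phase is shifted by 2*pi/N (= pi/2) each.
patterns = [A + B * np.sin(2.0 * np.pi * (n - 1) / N + spatial)
            for n in range(1, N + 1)]

def capture(pattern: np.ndarray) -> np.ndarray:
    """Simulated captured image: the object occupies pixels 24..39 and
    blurs the fringes, shrinking the observed amplitude there tenfold
    (a hypothetical degradation model)."""
    img = pattern.copy()
    obj = slice(24, 40)
    img[obj] = A + 0.1 * (pattern[obj] - A)
    return img

I1, I2, I3, I4 = (capture(p) for p in patterns)

# Per-pixel average luminance and fringe amplitude; visibility is the
# amplitude divided by the average luminance. The amplitude formula is
# the standard four-step phase-shift identity.
av = (I1 + I2 + I3 + I4) / 4.0
amp = 0.5 * np.sqrt((I1 - I3) ** 2 + (I2 - I4) ** 2)
visibility = amp / av

# Background pixels keep visibility B/A = 0.5; object pixels drop to 0.05.
print(round(float(visibility[0]), 2), round(float(visibility[30]), 2))  # → 0.5 0.05
```

 Thresholding this per-pixel visibility (or comparing it against an object-absent reference) then separates the illumination portion from the object portion, as described earlier.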
First, the pattern generation circuit 412-1 shown in FIG. 7 generates the striped illumination pattern to be displayed on the display surface of the LCD 20 according to the following equation.
[Math. 1]
I_n(x, y) = A + B·sin((π/2)·(n − 1) + 2πx/p + θ)   (n = 1, 2, 3, 4)

where p denotes the spatial period of the stripes on the display surface.
In the above equation, I_n(x, y) represents the instruction value indicating the display gradation value at pixel (x, y) of the n-th striped pattern displayed on the LCD 20, and the pattern output circuit 412-2 outputs this instruction value to the LCD 20. In addition, A and B represent the brightness and the amplitude of the striped illumination pattern displayed on the LCD 20, respectively, and θ represents the phase of the striped illumination pattern. In this example, n is, for example, n = 1, 2, 3, 4, and is the parameter that generates the change in the initial phase. The first term inside the sine represents the temporal change, and the next term represents the spatial change. By displaying striped illumination patterns according to the above equation on the display surface of the LCD 20, four types of striped illumination patterns are displayed, and four images of the measurement target 10 arranged on the near side of the display surface are captured by the CCD camera 30, one for each pattern. The input image storage memory 411-2 stores the luminance value I'_n(x, y) at each pixel of these four images. The image calculation circuit 414-1 obtains each pixel value of the average luminance image, that is, the average luminance value Av(x, y), from the luminance values I'_n(x, y) of these four images stored in the input image storage memory 411-2, according to the following equation.
[Math. 2]
Av(x, y) = (1/4)·{ I'_1(x, y) + I'_2(x, y) + I'_3(x, y) + I'_4(x, y) }
The image calculation circuit 414-1 also obtains each pixel value of the visibility image, that is, the visibility value V(x, y), from the luminance values I'_n(x, y) of these four images stored in the input image storage memory 411-2, according to the following equation.
[Math. 3]
V(x, y) = √{ (I'_1(x, y) − I'_3(x, y))² + (I'_2(x, y) − I'_4(x, y))² } / (2·Av(x, y))
FIG. 13 is a diagram illustrating an example of an average luminance image and a visibility image calculated from images of a measurement target arranged on the near side of a display surface that displays four types of striped illumination patterns. In this example, the four captured images 800-1 to 800-4 are stored in the input image storage memory 411-2, and the image calculation circuit 414-1 calculates the average luminance image 801 and the visibility image 802 from these four captured images 800-1 to 800-4. In the above equation used to generate the illumination pattern displayed on the display surface of the LCD 20, when the brightness A of the striped illumination pattern is set to a fixed value independent of the pixel, the calculated average luminance image 801 is the same as the image obtained with conventional flat illumination of brightness A. The visibility image 802, on the other hand, resembles the image obtained with telecentric illumination: in this example, the portion of the measurement target 10 in which the illumination is reflected has a large curvature, so the visibility of that portion deteriorates markedly and its visibility value becomes small. In this case, the change in pixel value between the image of the illumination pattern displayed on the display surface of the LCD 20 (that is, the illumination portion) and the image 10A of the measurement target 10 (that is, the target portion) is larger in the visibility image 802 than in the average luminance image 801, so each edge position used for dimension measurement is obtained from the visibility image 802 and the dimension measurement is performed. The virtual line 52E will be described later in conjunction with FIG. 16.
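The four-step computation described above can be sketched as follows. This is a minimal sketch: the function names, the stripe period, and the use of NumPy are illustrative assumptions, and the amplitude estimate is the standard four-step phase-shift formula consistent with dividing the luminance amplitude by the average luminance, not code taken from the disclosure.

```python
import numpy as np

def stripe_pattern(shape, n, period=32.0, a=128.0, b=96.0, theta=0.0):
    """n-th striped pattern (n = 1..4): gradation A + B*sin of a temporal
    term (pi/2 per step) plus a spatial term (2*pi*x / period)."""
    y, x = np.indices(shape)
    return a + b * np.sin(np.pi / 2 * (n - 1) + 2 * np.pi * x / period + theta)

def average_and_visibility(images):
    """From the four captures I'_1..I'_4, compute the average luminance
    image Av(x, y) and the visibility image V(x, y) = amplitude / average."""
    i1, i2, i3, i4 = [img.astype(float) for img in images]
    av = (i1 + i2 + i3 + i4) / 4.0                        # average luminance
    amp = 0.5 * np.sqrt((i1 - i3) ** 2 + (i2 - i4) ** 2)  # luminance amplitude
    vis = amp / np.maximum(av, 1e-9)                      # visibility value
    return av, vis
```

With ideal captures equal to the patterns themselves, the average comes out as A and the visibility as B/A at every pixel, which is a quick sanity check of the formulas.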
FIG. 14 is a flowchart illustrating a first example of the measurement process. The measurement process shown in FIG. 14 can be executed, for example, by the CPU 41 of the control device 4 shown in FIG. 6 executing a measurement program stored in the memory 42. In this example, the dimension of the measurement target 10 is measured based on at least the visibility value.
In FIG. 14, in step S1, the CPU 41 displays, on the display surface of the LCD 20, an illumination pattern in which the phase of the irradiation intensity is modulated at least one of spatially and temporally. First, the CPU 41 displays on the display surface of the LCD 20, for example, the first one generated of the four types of striped illumination patterns described above. In step S2, the CPU 41 controls the CCD camera 30 to capture an image of the measurement target 10 arranged in front of the display surface of the LCD 20 displaying the striped illumination pattern, and stores the captured image in the memory 42. In step S3, the CPU 41 determines whether imaging of the measurement target 10 using all four types of striped illumination patterns has been completed; if the determination result is NO, the process advances to step S4, and if the determination result is YES, the process advances to step S5.
In step S4, the CPU 41 generates a striped illumination pattern whose phase is changed by π/2 radians (rad), and the process returns to step S1. In step S5, on the other hand, the CPU 41 calculates the average luminance image 801 and the visibility image 802 from, in this example, the four captured images 800-1 to 800-4 shown in FIG. 13. In step S6, the CPU 41 obtains each edge position used for dimension measurement from the average luminance image 801 or the visibility image 802, separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the striped illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing, and measures the dimension of the measurement target 10 from the binarized image.
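Steps S1 to S4 amount to a display-and-capture loop. A sketch follows, in which the three callables are hypothetical stand-ins for the LCD 20, the CCD camera 30, and the pattern generation; no device API is specified by the disclosure.

```python
def capture_phase_shifted_images(show_pattern, capture, make_pattern, n_patterns=4):
    """Loop of steps S1-S4: display the n-th phase-shifted pattern (S1),
    capture one image in sync with it (S2), and repeat until all patterns
    have been used (S3/S4); the captured images then feed step S5."""
    images = []
    for n in range(1, n_patterns + 1):
        show_pattern(make_pattern(n))  # S1: display pattern n
        images.append(capture())       # S2: synchronized capture
    return images                      # S3 satisfied -> proceed to S5
```

Driving it with stub callables shows the pattern/capture pairing: each capture happens immediately after its pattern is shown.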
FIG. 15 is a diagram illustrating an example of the binarization processing. For convenience of explanation, FIG. 15 shows an example in which the visibility image 802 including the image 10A of the measurement target 10 is binarized. When the CPU 41 applies binarization processing using a threshold value to the visibility image 802 shown in the upper part of FIG. 15 in step S6, the binarized visibility image 802-1 shown in the lower part of FIG. 15 is obtained. The binarized visibility image 802-1 includes the measurement target image 10A-1. For example, a pixel of the visibility image 802-1 whose value is equal to or greater than the threshold belongs to the background, that is, the image of the striped illumination pattern displayed on the LCD 20 (that is, the illumination portion), and a pixel whose value is less than the threshold belongs to the image of the measurement target 10 (that is, the target portion). Such binarization processing can separate the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the striped illumination pattern displayed on the LCD 20 (that is, the illumination portion).
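The thresholding just described can be sketched as follows; the threshold value and the toy width readout are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def binarize_visibility(vis, threshold):
    """Pixels at or above the threshold are background (illumination
    portion, value 1); pixels below it belong to the target (value 0)."""
    return (vis >= threshold).astype(np.uint8)

def widest_target_run(mask):
    """Toy dimension readout: the largest count of target pixels in any row."""
    return int((mask == 0).sum(axis=1).max())
```

For instance, a bright background with a low-visibility rectangular region separates cleanly into a binary mask whose target width can then be counted in pixels.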
The determination of which of the average luminance image 801 and the visibility image 802 to select for obtaining the dimension of the measurement target 10 may be performed, for example, as follows. In an initial state in which the measurement target 10 is not yet arranged on the near side of the display surface of the LCD 20, so that only the illumination pattern displayed on the display surface of the LCD 20 appears in the image captured by the CCD camera 30, an average luminance image Av_0(x, y) and a visibility image V_0(x, y) are obtained in advance according to the above equations and stored in the memory 42. The differences are then computed between the initial-state average luminance image Av_0(x, y) and visibility image V_0(x, y) and the average luminance image Av(x, y) and visibility image V(x, y) obtained from the image captured by the CCD camera 30 with the measurement target 10 arranged on the near side of the display surface of the LCD 20, and the pixel values of whichever image shows the larger difference are used for the dimension calculation.
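The selection rule above can be sketched as follows. Using the summed absolute difference as the comparison measure is an assumption; the text only requires comparing which image differs more from its initial state.

```python
import numpy as np

def choose_measurement_image(av, vis, av0, vis0):
    """Compare each derived image against its target-absent initial state
    and return the one whose total absolute difference is larger."""
    d_av = np.abs(av - av0).sum()
    d_vis = np.abs(vis - vis0).sum()
    return ("visibility", vis) if d_vis >= d_av else ("average", av)
```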
After step S6, in step S7, the CPU 41 outputs a dimension measurement result obtained by measuring the dimension of the measurement target 10 using the pixel values of whichever image shows the larger difference: the difference between the average luminance image 801 and the initial-state average luminance image, or the difference between the visibility image 802 and the initial-state visibility image. The process then ends. In step S7, the CPU 41 may output the dimension measurement result of the measurement target 10 to, for example, the monitor device 6 shown in FIG. 7 for display.
In this case, compared with obtaining the visibility value, the average luminance value, and the like from a single image, obtaining them from a plurality of images increases the amount of information, so the accuracy of the visibility value, the average luminance value, and the like can be further improved.
FIG. 16 is a diagram illustrating an example of brightness profiles along the virtual line 52E of the average luminance image 801 and the visibility image 802 shown in FIG. 13. In FIG. 16, the vertical axis indicates the brightness of the average luminance image 801 and the visibility image 802 in arbitrary units, and the horizontal axis indicates the vertical position along the virtual line 52E in the average luminance image 801 and the visibility image 802 in arbitrary units. In the example shown in FIG. 13, as indicated by the broken line I in FIG. 16, the change in brightness at the upper edge portion of the measurement target 10 in the average luminance image 801 is small, so the dimension detection sensitivity for the measurement target 10 is relatively low. By contrast, as indicated by the solid line II in FIG. 16, the change in brightness at the upper edge portion of the measurement target 10 in the visibility image 802 is large, so the dimension detection sensitivity for the measurement target 10 is relatively high; it was confirmed that a dimension detection sensitivity about 30 times that obtained with the average luminance image 801 can be achieved.
In the measurement process of FIG. 14, the average luminance image 801 need not be obtained, and the dimension measurement may be performed based only on the visibility image 802. In this case, the determination of which of the average luminance image 801 and the visibility image 802 to select for obtaining the dimension of the measurement target 10 can also be omitted.
In the above example, a striped pattern in one direction is used as the spatial variation of the phase of the irradiation intensity, but the variation is not limited to this, and various spatially varying patterns other than stripes may be used. In addition, although the initial phase of the striped pattern is changed as the temporal variation of the phase of the irradiation intensity, the illumination intensity may be changed, the illumination color may be changed, or a combination of changes in illumination intensity and illumination color may be used.
FIG. 17 is a flowchart illustrating a second example of the measurement process. The measurement process shown in FIG. 17 can be executed, for example, by the CPU 41 of the control device 4 shown in FIG. 6 executing a measurement program stored in the memory 42. In this example, an illumination pattern in which the phase of the irradiation intensity is temporally modulated is used, and the dimension of the measurement target 10 is measured based on at least a luminance difference value; the phase of the irradiation intensity of the illumination pattern may additionally be spatially modulated.
In FIG. 17, in step S11, the CPU 41 sets the illumination intensity of the LCD 20 to an initial value; for example, the entire surface of the LCD 20 is set to a single irradiation intensity. In step S12, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern with the initial illumination intensity displayed on the display surface of the LCD 20 and of the measurement target 10 arranged in front of that display surface, and stores the captured image in the memory 42. In step S13, the CPU 41 changes the illumination intensity of the LCD 20. In step S14, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern with the changed illumination intensity displayed on the display surface of the LCD 20 and of the measurement target 10 arranged in front of that display surface, and stores the captured image in the memory 42. In step S15, the CPU 41 calculates a luminance difference image from the two captured images stored in the memory 42. Specifically, the CPU 41 calculates, for each pixel, the luminance difference value between the luminance value of the image captured under the illumination pattern with the initial illumination intensity and the luminance value of the image captured under the illumination pattern with the changed illumination intensity, and forms the luminance difference image from these luminance difference values.
For example, when the pixel values of the image captured in step S12 are denoted by I'_1(x, y) and the pixel values of the image captured in step S14 are denoted by I'_2(x, y), the luminance difference value calculated in step S15 can be calculated as |I'_1(x, y) − I'_2(x, y)|.
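The per-pixel difference just given can be sketched as follows; casting to float before subtracting is an implementation detail added here so that unsigned pixel values cannot wrap around.

```python
import numpy as np

def luminance_difference_image(img1, img2):
    """Per-pixel |I'_1(x, y) - I'_2(x, y)| between the two captures
    (steps S12 and S14); float cast prevents unsigned-integer wraparound."""
    return np.abs(img1.astype(float) - img2.astype(float))
```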
In step S16, the CPU 41 obtains each edge position used for dimension measurement from the luminance difference image, and separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing. In step S17, the CPU 41 outputs a dimension measurement result obtained by measuring the dimension of the measurement target 10 using the luminance difference values (pixel values) of the luminance difference image, and the process ends. In step S17, the CPU 41 may output the dimension measurement result of the measurement target 10 to, for example, the monitor device 6 shown in FIG. 7 for display.
FIG. 18 is a flowchart illustrating a third example of the measurement process. The measurement process shown in FIG. 18 can be executed, for example, by the CPU 41 of the control device 4 shown in FIG. 6 executing a measurement program stored in the memory 42. In this example, an illumination pattern in which the phase of the illumination color is temporally modulated is used, and the dimension of the measurement target 10 is measured based on at least a color tone difference value or a color difference value; the phase of the illumination color of the illumination pattern may additionally be spatially modulated.
In FIG. 18, in step S21, the CPU 41 sets the illumination color of the LCD 20 to an initial value; for example, the entire surface of the LCD 20 is set to a single illumination color. In step S22, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern with the initial illumination color displayed on the display surface of the LCD 20 and of the measurement target 10 arranged in front of that display surface, and stores the captured image in the memory 42. In step S23, the CPU 41 changes the illumination color of the LCD 20. In step S24, the CPU 41 controls the CCD camera 30 to capture an image of the illumination pattern with the changed illumination color displayed on the display surface of the LCD 20 and of the measurement target 10 arranged in front of that display surface, and stores the captured image in the memory 42. In step S25, the CPU 41 calculates a color tone difference image or a color difference image from the two captured images stored in the memory 42. Specifically, the CPU 41 calculates, for each pixel, the difference between the luminance value of the image captured under the illumination pattern with the initial illumination color and the luminance value of the image captured under the illumination pattern with the changed illumination color, and forms a color tone difference image from the color tone difference values or a color difference image from the color difference values.
For example, when the red (R) component pixel values of the image captured in step S22 are denoted by I'_R1(x, y) and the red (R) component pixel values of the image captured in step S24 are denoted by I'_R2(x, y), the red (R) component luminance difference value calculated in step S25 can be calculated as |I'_R1(x, y) − I'_R2(x, y)|. Similarly, when the green (G) component pixel values of the images captured in steps S22 and S24 are denoted by I'_G1(x, y) and I'_G2(x, y), respectively, the green (G) component luminance difference value calculated in step S25 can be calculated as |I'_G1(x, y) − I'_G2(x, y)|, and when the blue (B) component pixel values of the images captured in steps S22 and S24 are denoted by I'_B1(x, y) and I'_B2(x, y), respectively, the blue (B) component luminance difference value calculated in step S25 can be calculated as |I'_B1(x, y) − I'_B2(x, y)|.
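The per-component formulas above can be sketched as follows. The H x W x 3 array layout with R, G, B channel order is an assumption, and collapsing the three channel differences with a Euclidean norm is one possible way to combine them, not a rule stated by the text.

```python
import numpy as np

def color_difference_image(img1, img2):
    """Per-channel |difference| for H x W x 3 arrays, following the
    per-component formulas |I'_R1 - I'_R2|, |I'_G1 - I'_G2|, |I'_B1 - I'_B2|."""
    return np.abs(img1.astype(float) - img2.astype(float))

def color_difference_magnitude(img1, img2):
    """Collapse the three channel differences to one value per pixel
    using the Euclidean norm over the last axis."""
    d = color_difference_image(img1, img2)
    return np.sqrt((d ** 2).sum(axis=-1))
```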
In step S26, the CPU 41 obtains each edge position used for dimension measurement from the color tone difference image or the color difference image, and separates the image of the measurement target 10 (that is, the target portion) from the background, that is, the image of the illumination pattern displayed on the LCD 20 (that is, the illumination portion), by binarization processing. In step S27, the CPU 41 outputs a dimension measurement result obtained by measuring the dimension of the measurement target 10 using the color tone difference values (pixel values) of the color tone difference image or the color difference values (pixel values) of the color difference image, and the process ends. In step S27, the CPU 41 may output the dimension measurement result of the measurement target 10 to, for example, the monitor device 6 shown in FIG. 7 for display.
Through the procedures described in conjunction with FIG. 14, FIG. 17, and FIG. 18, the dimension value obtained by measuring the measurement target 10 based on the visibility value may be compared with at least one of the dimension value obtained based on the average luminance value, the dimension value obtained based on the luminance difference value, and the dimension value obtained based on the color tone difference value or the color difference value, and the larger dimension value may be output as the dimension measurement result. Alternatively, the maximum of the dimension values obtained by measuring the measurement target 10 based on the visibility value, the average luminance value, the luminance difference value, and the color tone difference value or the color difference value may be output as the dimension measurement result.
That is, the CPU 41 may calculate luminance information representing the average of, or the difference between, the luminance values of corresponding pixels of a plurality of images, and measure the dimension of the measurement target based on pixels whose difference from initial luminance information of the illumination pattern, calculated in advance for each pixel of an initial image, is equal to or greater than a certain value. In this case, the larger of the dimension value of the measurement target measured based on the visibility value and the dimension value measured based on the luminance information may be output as the dimension measurement result.
The CPU 41 may also calculate color information representing the color tone values or color values of corresponding pixels of a plurality of images, and measure the dimension of the measurement target based on pixels whose difference from initial color information of the illumination pattern, calculated in advance for each pixel of an initial image, is equal to or greater than a certain value. In this case, the larger of the dimension value of the measurement target measured based on the visibility value and the dimension value measured based on the color information may be output as the dimension measurement result.
In the above embodiments, an LCD having a backlight and displaying the illumination pattern is used as the illumination device, but the illumination device may instead be a plasma display device that displays the illumination pattern, an organic electroluminescence display device that displays the illumination pattern, a projection device having a screen and a projector that projects the illumination pattern onto the screen, or the like. When a projection device is used, for example, it is more difficult to reduce the depth of the illumination device than when an LCD is used, but relatively large measurement targets that are difficult to handle with an LCD can be accommodated. The projection device may project the illumination pattern onto a first surface of the screen and illuminate, with the illumination pattern projected onto the first surface, a measurement target arranged on the near side of the first surface, which corresponds to the display surface of the illumination pattern. Alternatively, the projection device may project the illumination pattern onto the first surface of the screen and illuminate, with the illumination pattern transmitted to a second surface opposite to the first surface, a measurement target arranged on the near side of the second surface, which corresponds to the display surface of the illumination pattern.
With conventional flat illumination, the obtainable information is limited to the average luminance value of the first example above, whereas the first example uses the visibility value and compares it with the average luminance value as necessary. This makes it possible, with the same flat type of illumination as conventional flat illumination, to perform dimension measurement with sensitivity comparable to telecentric illumination, and to measure with high sensitivity the dimensions of relatively large parts that are difficult for telecentric illumination. Furthermore, because the visibility value is obtained by detecting temporal changes in pixel values, the adverse effects of external light, whose illuminance can be regarded as constant over a short time, and of n-th order reflected light, which does not easily preserve the spatial fringe information, can be removed efficiently, eliminating the need to perform the dimension measurement in a darkroom.
In the case of telecentric illumination, an illumination pattern in which the phase of the irradiation intensity is spatially modulated cannot be generated, but an illumination pattern in which the phase of the irradiation intensity is temporally modulated can be generated. In a modification of the above embodiments, therefore, telecentric illumination is used instead of the LCD to generate an illumination pattern in which the phase of the irradiation intensity is temporally modulated. In this case as well, on the same principle as when an LCD is used, the adverse effects of reflected or diffused light produced when light from an external light source, secondary reflections from objects other than the measurement target, and the like strike the measurement target can be suppressed, improving the measurement accuracy. Accordingly, the measurement accuracy can be improved without placing the measurement device that uses telecentric illumination in a darkroom.
 According to each of the embodiments and modifications described above, measurement accuracy can be improved by suppressing the adverse effects of reflected or diffused light produced when light from an external light source, or secondary reflections from objects other than the measurement target, strikes the measurement target. Measurement accuracy can also be improved without placing the measurement device in a dark room.
 Further, in the embodiments using flat illumination, the measurement device can be made smaller and less expensive, while the dimensions of relatively large parts can still be measured with high accuracy.
 Although the disclosed measurement device, measurement method, and measurement program have been described above by way of embodiments, the present invention is not limited to these embodiments, and it goes without saying that various modifications and improvements are possible within the scope of the present invention.
DESCRIPTION OF REFERENCE SYMBOLS
1 Measurement device
2 Illumination device
3 Imaging device
4 Control device
5 Keyboard
6 Monitor device
10 Measurement target
20 LCD
30 CCD camera
40 Computer
41 CPU
42 Memory

Claims (20)

  1.  A measurement device comprising:
     an illumination device having a display surface that displays an illumination pattern;
     an imaging device that captures an image of a measurement target that is disposed on the near side of the display surface of the illumination device and is illuminated by the illumination pattern; and
     a control device that generates an illumination pattern in which a phase of an irradiation intensity is modulated spatially, temporally, or both, outputs to the illumination device an instruction to display the illumination pattern on the display surface of the illumination device, and performs synchronization control that synchronizes operations of the illumination device and the imaging device,
     wherein the control device measures a dimension of the measurement target based on a visibility image generated by dividing a luminance amplitude of each pixel of the image by an average luminance value of the image to calculate a visibility value of the illumination pattern at each pixel of the image.
  2.  The measurement device according to claim 1, wherein, in accordance with the synchronization control, the control device outputs to the illumination device an instruction to display, on the display surface of the illumination device, an illumination pattern in which the phase of the irradiation intensity is spatially modulated, and calculates the visibility value based on at least one image captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  3.  The measurement device according to claim 1, wherein, in accordance with the synchronization control, the control device outputs to the illumination device an instruction to display, on the display surface of the illumination device, an illumination pattern in which the phase of the irradiation intensity is temporally modulated, and calculates the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  4.  The measurement device according to claim 1, wherein, in accordance with the synchronization control, the control device outputs to the illumination device an instruction to display, on the display surface of the illumination device, a plurality of illumination patterns in which the phase of the irradiation intensity is spatially and temporally modulated, and calculates the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  5.  The measurement device according to claim 3 or 4, wherein the control device:
     calculates luminance information representing an average of, or a difference between, luminance values of corresponding pixels of the plurality of images, and measures the dimension of the measurement target based on pixels whose difference from initial luminance information of the illumination pattern at each pixel of the initial image, calculated in advance, is equal to or greater than a fixed value; and
     outputs, as a dimension measurement result, the larger of the dimension value of the measurement target measured based on the visibility value and the dimension value of the measurement target measured based on the luminance information.
  6.  The measurement device according to claim 3 or 4, wherein the control device:
     calculates color information representing a color tone value or a color value of corresponding pixels of the plurality of images, and measures the dimension of the measurement target based on pixels whose difference from initial color information of the illumination pattern at each pixel of the initial image, calculated in advance, is equal to or greater than a fixed value; and
     outputs, as a dimension measurement result, the larger of the dimension value of the measurement target measured based on the visibility value and the dimension value of the measurement target measured based on the color information.
  7.  The measurement device according to any one of claims 1 to 4, wherein the control device measures the dimension of the measurement target based on pixels at which the calculated visibility value of the illumination pattern at each pixel of the image differs, by a fixed value or more, from an initial visibility value of the illumination pattern at each pixel of an initial image, calculated in advance, in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  8.  The measurement device according to any one of claims 1 to 7, wherein the illumination pattern is a striped pattern.
  9.  The measurement device according to any one of claims 1 to 8, wherein the illumination device is any one of: a liquid crystal display device that has a backlight and displays the illumination pattern; a plasma display device that displays the illumination pattern; an organic electroluminescence display device that displays the illumination pattern; and a projection device that has a screen and a projector that projects the illumination pattern onto the screen.
  10.  A measurement method for measuring a dimension of a measurement target, comprising:
     displaying, by an illumination device, an illumination pattern on a display surface;
     capturing, by an imaging device, an image of the measurement target that is disposed on the near side of the display surface of the illumination device and is illuminated by the illumination pattern;
     generating, by a processor, an illumination pattern in which a phase of an irradiation intensity is modulated spatially, temporally, or both, outputting to the illumination device an instruction to display the illumination pattern on the display surface of the illumination device, and performing synchronization control that synchronizes operations of the illumination device and the imaging device; and
     measuring, by the processor, the dimension of the measurement target based on a visibility image generated by dividing a luminance amplitude of each pixel of the image by an average luminance value of the image to calculate a visibility value of the illumination pattern at each pixel of the image.
  11.  The measurement method according to claim 10, wherein, in accordance with the synchronization control, the processor outputs to the illumination device an instruction to display, on the display surface of the illumination device, an illumination pattern in which the phase of the irradiation intensity is spatially modulated, and calculates the visibility value based on at least one image captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  12.  The measurement method according to claim 10, wherein, in accordance with the synchronization control, the processor outputs to the illumination device an instruction to display, on the display surface of the illumination device, a plurality of illumination patterns in which the phase of the irradiation intensity is temporally modulated, and calculates the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  13.  The measurement method according to claim 10, wherein, in accordance with the synchronization control, the processor outputs to the illumination device an instruction to display, on the display surface of the illumination device, a plurality of illumination patterns in which the phase of the irradiation intensity is spatially and temporally modulated, and calculates the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  14.  The measurement method according to any one of claims 10 to 13, wherein the illumination pattern is a striped pattern.
  15.  The measurement method according to any one of claims 10 to 14, wherein the illumination device used is any one of: a liquid crystal display device that has a backlight and displays the illumination pattern; a plasma display device that displays the illumination pattern; an organic electroluminescence display device that displays the illumination pattern; and a projection device that has a screen and a projector that projects the illumination pattern onto the screen.
  16.  A program for causing a computer to execute a process of measuring a dimension of a measurement target, the process comprising:
     displaying an illumination pattern on a display surface of an illumination device;
     capturing, with an imaging device, an image of the measurement target that is disposed on the near side of the display surface of the illumination device and is illuminated by the illumination pattern;
     generating an illumination pattern in which a phase of an irradiation intensity is modulated spatially, temporally, or both, outputting to the illumination device an instruction to display the illumination pattern on the display surface of the illumination device, and performing synchronization control that synchronizes operations of the illumination device and the imaging device; and
     measuring the dimension of the measurement target based on a visibility image generated by dividing a luminance amplitude of each pixel of the image by an average luminance value of the image to calculate a visibility value of the illumination pattern at each pixel of the image.
  17.  The program according to claim 16, further causing the computer to execute a process of: in accordance with the synchronization control, outputting to the illumination device an instruction to display, on the display surface of the illumination device, an illumination pattern in which the phase of the irradiation intensity is spatially modulated; and calculating the visibility value based on at least one image captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  18.  The program according to claim 16, further causing the computer to execute a process of: in accordance with the synchronization control, outputting to the illumination device an instruction to display, on the display surface of the illumination device, a plurality of illumination patterns in which the phase of the irradiation intensity is temporally modulated; and calculating the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  19.  The program according to claim 16, further causing the computer to execute a process of: in accordance with the synchronization control, outputting to the illumination device an instruction to display, on the display surface of the illumination device, a plurality of illumination patterns in which the phase of the irradiation intensity is spatially and temporally modulated; and calculating the visibility value based on a plurality of images captured by the imaging device and an initial image in which the imaging device captured the illumination pattern in an initial state in which the measurement target is not disposed on the near side of the illumination device.
  20.  The program according to any one of claims 16 to 19, wherein the illumination pattern is a striped pattern.
PCT/JP2017/017799 2017-05-11 2017-05-11 Measurement device, measurement method, and measurement program WO2018207300A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019516808A JP6927294B2 (en) 2017-05-11 2017-05-11 Measuring device, measuring method and measuring program
PCT/JP2017/017799 WO2018207300A1 (en) 2017-05-11 2017-05-11 Measurement device, measurement method, and measurement program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/017799 WO2018207300A1 (en) 2017-05-11 2017-05-11 Measurement device, measurement method, and measurement program

Publications (1)

Publication Number Publication Date
WO2018207300A1 2018-11-15

Family

ID=64105591

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/017799 WO2018207300A1 (en) 2017-05-11 2017-05-11 Measurement device, measurement method, and measurement program

Country Status (2)

Country Link
JP (1) JP6927294B2 (en)
WO (1) WO2018207300A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110793448A (en) * 2019-09-15 2020-02-14 同济大学 Vision-based measuring system for large building components
JP2021089168A (en) * 2019-12-03 2021-06-10 リョーエイ株式会社 Panel illumination system for workpiece inspection and workpiece inspection method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110069320A1 (en) * 2009-09-24 2011-03-24 Kde Corporation Inspecting system and inspecting method
JP2012127934A (en) * 2010-11-26 2012-07-05 Fujitsu Ltd Inspection method and inspection device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6041790B2 (en) * 2013-11-22 2016-12-14 本田技研工業株式会社 Coil gap measurement method


Also Published As

Publication number Publication date
JPWO2018207300A1 (en) 2019-11-21
JP6927294B2 (en) 2021-08-25


Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17909491; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019516808; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 17909491; Country of ref document: EP; Kind code of ref document: A1)