CN110651165A - Optical component inspection system and method for inspecting at least one component - Google Patents


Info

Publication number
CN110651165A
Authority
CN
China
Prior art keywords
image
image sensor
optically active element
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880028492.1A
Other languages
Chinese (zh)
Other versions
CN110651165B (en)
Inventor
乌韦·弗朗茨·奥格斯特
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ltd Of Muehlbauer LP
Muehlbauer GmbH and Co KG
Original Assignee
Ltd Of Muehlbauer LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ltd Of Muehlbauer LP filed Critical Ltd Of Muehlbauer LP
Publication of CN110651165A publication Critical patent/CN110651165A/en
Application granted granted Critical
Publication of CN110651165B publication Critical patent/CN110651165B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00: Microscopes
    • G02B 21/0004: Microscopes specially adapted for specific applications
    • G02B 21/0016: Technical microscopes, e.g. for inspection or measuring in industrial production processes
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02: Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04: Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B 7/08: Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Lens Barrels (AREA)
  • Testing Or Measuring Of Semiconductors Or The Like (AREA)

Abstract

An optical component inspection system for inspecting at least one surface of at least one component is disclosed. A receptacle is configured to position the component in front of a camera device so that a first surface of the component can be inspected by means of the camera device. The camera device comprises an image sensor configured to receive light reflected from the first surface of the component. The optical component inspection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor, and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed to the inside of a hollow cylindrical lens barrel whose longitudinal centerline is coaxial with the optical axis of the first optically active element. The holder is elastically bendable at least in the longitudinal direction of the lens barrel. Furthermore, the adjustment device comprises a first actuator configured to adjust the relative distance between the optically active element and the image sensor, i.e. to displace the optically active element along the optical axis relative to the image sensor.

Description

Optical component inspection system and method for inspecting at least one component
Technical Field
Semiconductor components are used in a variety of technical fields, such as semiconductor electronics, photovoltaics, optical detectors, and radiation sources (e.g., light-emitting diodes). Their widespread use places ever greater demands on semiconductor component manufacturers, in particular with regard to quality. Defects or damage in or on a semiconductor component are undesirable because they can lead to malfunction. Semiconductor components are therefore inspected for defects and damage during manufacture. One inspection method is to image the components with an optical system that resolves structures in the micrometer range.
Conventional designs of such optical systems, however, have a low depth of field. If the surface to be imaged of the semiconductor component under inspection is tilted relative to the image sensor surface of the optical system's image sensor, the component cannot be imaged completely sharply in a single image. Moreover, the optical components built into such systems generally have considerable mass. The adjustment process is therefore slow, or the optical system vibrates when the focus is changed by the adjustment mechanism, which significantly degrades image quality. It then becomes necessary to wait until the vibrations have decayed before an image can be taken. In the continuous manufacture and inspection of semiconductor components, such increased image acquisition times reduce throughput.
Background
DE 10 2008 018 586 A1 relates to an optical inspection device for inspecting at least one surface of a component. The component is directed toward the camera device by means of a fastening element, and a first surface of the component is illuminated by a light source with a first light beam in the short-wave range. The inspection device further comprises a second light source, which directs a second light beam in the long-wave range onto a second surface of the component, the second surface being opposite the first. The light beams reflected from the respective surfaces are received by the camera device.
JP 2016-128781 A discloses a device for inspecting electronic components. The device comprises first and second image pickup devices, which capture images of different regions of the electronic component, the regions being two different surfaces of the cuboid component.
Disclosure of Invention
It is an object of the present application to provide an efficient optical component inspection system for inspecting components.
An optical component inspection system for inspecting at least one surface of at least one component is proposed. A receptacle is configured to position the component in front of the camera device so that a first surface of the component can be inspected by means of the camera device. The camera device comprises an image sensor configured to receive light reflected from the first surface of the component. The optical component inspection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor, and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed to the inside of a hollow cylindrical lens barrel whose longitudinal centerline is coaxial with the optical axis of the first optically active element. The holder is elastically bendable at least in the longitudinal direction of the lens barrel. Furthermore, the adjustment device comprises a first actuator for adjusting the relative distance between the optically active element and the image sensor, i.e. for displacing the optically active element along the optical axis relative to the image sensor.
Thanks to the elastically bendable holder, the first optically active element can be held without a thread or a longitudinally adjustable mount for the optically active element. If the optically active element is an achromatic lens, its lens pair (e.g., a flint element and a crown element) is held directly by the holder. A separate mount for the lens pair is thus unnecessary, which reduces both the moving mass of the optically active element and the overall mass of the optical component inspection system. Because the optically active element and its holder have low mass, the position of the element can be adjusted with relatively little force. The low mass permits low-inertia actuation of the optically active element, so the sharpness plane on the component can be readjusted quickly.
The elastically bendable holder may comprise a rubber-containing material, may comprise a helical spring, or may be constructed as a bellows. Alternatively, the holder may be configured as a coil spring with a central opening that accommodates the first optically active element.
Disturbances acting on the optically active element, such as vibrations from external sources, can thereby be significantly reduced or avoided entirely. Measurements performed with the optical component inspection system are therefore less prone to interference.
Furthermore, owing to its elastic properties, the elastically bendable holder can be stretched or compressed continuously, so that the position of the first optically active element can be adjusted precisely.
In a practical production environment, the components may not be exactly aligned with the optical device. The component then cannot be imaged completely sharply within the available depth of field. The optical component inspection system proposed here makes it possible to record a series of multiple images in rapid succession.
The good, fast adjustability of the optical component inspection system allows component measurements to be made faster and more accurately, enabling high throughput.
The image sensor may be a CCD chip. In other variations, the image sensor may be a CMOS chip or an image sensor sensitive to certain wavelength ranges, such as a microbolometer array or a pyroelectric array.
The lens barrel may have low magnetic permeability. In one variant, the lens barrel consists of a paramagnetic material; for example, it may be made at least partially of aluminum or plastic.
The coil of the first actuator may abut against the lens barrel or be spaced apart from it. The position of the coil is offset along the optical axis relative to the position of the holder.
The holder has a first end region in which it is fixed to the lens barrel, and a second end region on which the first optically active element is held. In the rest state, the position of the first end region of the holder coincides, with respect to the optical axis, with the position of one end of the coil. In another variant, the first end region of the holder is located inside or outside the coil with respect to the optical axis. Alternatively or additionally, the second end region of the holder is located inside or outside the coil with respect to the optical axis.
The holder of the first actuator may be at least partially surrounded by a coil. A control device is configured to control the current supplied to the coil in order to generate a magnetic field. The holder also carries a soft-iron or permanent-magnet yoke, i.e. a soft-iron or permanent-magnet assembly, configured to displace the holder along the longitudinal centerline of the lens barrel as a function of the current supplied to the coil.
This design reduces wear on the means for adjusting the relative distance. Furthermore, the force that the magnetic field exerts on the holder permits precise adjustment of the relative distance.
If the holder with its soft-iron yoke is arranged at one end of the coil, this force acts on the holder in the direction of the coil center. If the magnetic field is reduced and then switched off, no force acts on the holder any longer, and the holder returns to its initial position.
If the holder has a permanent-magnet yoke, it can be deflected in either direction, depending on the direction of current flow through the coil and the orientation of the north and south poles of the permanent-magnet assembly. The relative distance can thus be set in a targeted manner in both directions.
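The balance between the magnetic force and the elastic restoring force of the bent holder can be made concrete with a simple calculation. The sketch below is purely illustrative and not taken from the patent: it assumes the holder behaves as a linear spring of stiffness k and models the coil as a voice-coil actuator with Lorentz force F = B·l·I; all names and numerical values are assumptions.

```python
def coil_current_for_displacement(x_m, stiffness_n_per_m, flux_density_t, wire_length_m):
    """Current (in A) at which the voice-coil force B*l*I balances the
    elastic restoring force k*x of the bent holder at displacement x_m (m)."""
    return stiffness_n_per_m * x_m / (flux_density_t * wire_length_m)

# Illustrative values: 50 um displacement, k = 200 N/m, B = 0.5 T, l = 2 m of wire.
current_a = coil_current_for_displacement(50e-6, 200.0, 0.5, 2.0)  # 0.01 A
```

Because the force is proportional to the current, ramping the current down slowly returns the holder to its rest position without exciting vibrations, matching the behavior described above.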
The receptacle is configured to position the component in front of the camera device such that a sharpness plane of the component is at least partially projected on an image sensor surface of the image sensor.
The optical component inspection system may further comprise a second actuator, controlled by the control device, for adjusting the image sensor, i.e. for displacing the image sensor along the optical axis relative to the first optically active element.
The second actuator may comprise at least one piezoelectric actuator or a micro-positioning linear axis system; the latter may, for example, be built from push rods.
By means of the second actuator, the image sensor can be displaced along the optical axis so that the sharpness plane is projected onto the image sensor surface. This allows the optical device to focus on the component quickly. A specific portion of the component can thus be imaged sharply even in the very first image captured by the image sensor.
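How far the sensor must travel for a given change in object distance can be estimated with the thin-lens equation. The following sketch is a textbook calculation, not a design value from the patent; the focal length and object distances are assumed for illustration.

```python
def image_distance_mm(focal_mm, object_distance_mm):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the image
    distance d_i, i.e. where the sensor must sit for a sharp image."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_distance_mm)

# With f = 40 mm: a component at 50 mm images sharply at 200 mm behind the
# lens; if the component sits at 60 mm instead, the image forms at 120 mm,
# so the sensor would have to be displaced by 80 mm along the optical axis.
shift_mm = image_distance_mm(40.0, 50.0) - image_distance_mm(40.0, 60.0)
```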
Further, the optical component inspection system may comprise a light source configured to emit light of a predetermined wavelength or wavelength range onto a surface of the component. The emitted light may be visible light with a wavelength in the range of approximately 380 nm to 780 nm, infrared light and/or polarized light. The image sensor is then correspondingly sensitive in the infrared or to polarized light.
Optionally, the control device may be configured to adjust the relative distance between the first optically active element and the image sensor by controlling the first and/or second actuator, in order to project the sharpness plane of the reflected light onto the image sensor surface facing the reflected light.
The relative distance between the first optically active element and the image sensor can be adjusted quickly and precisely by controlling the first and/or second actuator, so that a sharpness plane is projected onto the image sensor surface of the image sensor.
The image processing device of the optical component inspection system may be configured to capture a first image of the component by means of the image sensor, which provides the first image to the image processing device. Based on the first image, the image processing device determines whether the sharpness plane of the reflected light is substantially completely projected onto the image sensor surface. If it is not, the image processing device determines a first image region among a first plurality of image regions of the first image, in which a first partial region of the sharpness plane of the reflected light is projected onto the image sensor surface.
Furthermore, the image processing device is configured to determine a second image region among the first plurality of image regions of the first image, in which a second partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface. The image processing device is further configured to provide a control command to the control device. Based on this command, the control device controls the first and/or second actuator to adjust the relative distance between the optically active element and the image sensor such that the second partial region of the sharpness plane of the reflected light is then projected onto the image sensor surface.
The image processing device is also configured to take a second image of the component by means of the image sensor. In the second image, the image processing device determines a third image region of the second plurality of image regions of the second image in which the first partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor. Furthermore, the image processing device determines a fourth image region of the second plurality of image regions of the second image, in which a second partial region of the sharpness plane of the reflected light is projected on the image sensor surface of the image sensor.
If the component is not imaged sharply in a single image, the process described above generates at least two images of the component, in each of which a different image region images a portion of the component sharply. The regions detected as unsharp in one image are imaged sharply in the next, so that the entire component can be inspected.
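One plausible way to determine which image regions are sharp and which are not, sketched below, is to divide the image into tiles and threshold a local contrast measure (here the mean squared intensity gradient per tile). The tile size, threshold, and function name are illustrative assumptions, not details from the patent.

```python
import numpy as np

def sharpness_map(image, tile=8, threshold=10.0):
    """Boolean grid over tile regions: True where the tile's mean squared
    gradient exceeds the threshold, i.e. where that region is in focus."""
    h, w = image.shape
    grid = np.zeros((h // tile, w // tile), dtype=bool)
    for i in range(h // tile):
        for j in range(w // tile):
            patch = image[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            gx = np.diff(patch, axis=1)  # horizontal intensity differences
            gy = np.diff(patch, axis=0)  # vertical intensity differences
            grid[i, j] = (gx ** 2).mean() + (gy ** 2).mean() > threshold
    return grid
```

Regions marked False in the first image are the candidates that must appear sharp (True) in the second image before the whole component counts as inspected.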
In a first alternative, the image processing device may be configured to capture the first image by means of the image sensor, which provides the first image to the image processing device. The image processing device is further configured to provide control commands to the control device to control the first and/or second actuator so as to adjust the relative distance between the first optically active element and the image sensor by a predetermined path length. The image processing device is furthermore configured to capture a second image by means of the image sensor, which provides the second image to the image processing device.
The image processing device may provide the control command to the first and/or second actuator a predetermined period of time after the first image is captured.
The predetermined path length may depend on the optical properties of the first and/or second optically active element and/or on the dimensions of the component surface. The predetermined path length is preferably set in relation to the depth of field, such that it is less than, equal to, or greater than the depth of field.
If the predetermined path length is less than or equal to the depth of field, the entire surface of the component is inspected for defects via the sharply imaged regions of the captured images. If the predetermined path length is greater than the depth of field, only predetermined surface regions of the component are inspected for defects, while the remaining surface regions are not. This allows the surface regions of interest to be inspected with less computational cost and/or in a shorter time.
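The relationship between the path length and full coverage can be made concrete: if each exposure covers one depth of field and the step between exposures equals the depth of field, the number of images needed to cover a given height range follows directly. A minimal illustration with assumed values:

```python
import math

def focus_steps(height_range_um, depth_of_field_um):
    """Minimal number of exposures that cover the full height range when the
    step between exposures equals the depth of field (gap-free coverage)."""
    return math.ceil(height_range_um / depth_of_field_um)

# A surface tilted over 30 um, imaged with a 10 um depth of field, needs
# 3 exposures; 25 um of tilt also needs 3 (the last step is only partial).
```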
In a second variant, the image processing device may be configured to capture the first image by means of the image sensor, which provides the first image to the image processing device. The image processing device is further configured to provide control commands to the control device to control the first and/or second actuator so that the relative distance between the first optically active element and the image sensor is adjusted at a predetermined speed while the first image is being captured. Furthermore, the image processing device is configured to capture a second image by means of the image sensor, which provides the second image to the image processing device. The relative distance is adjusted after the first image, or a predetermined period of time after the first image is captured, and the second image is captured simultaneously during this adjustment. In a further alternative, the second image is captured after the relative distance between the first optically active element and the image sensor has changed by a predetermined length.
After the second image has been captured, the image processing device may be configured to provide a further control command to the first and/or second actuator in order to stop the adjustment of the relative distance between the first optically active element and the image sensor.
The predetermined speed may depend on the optical properties of the first and/or second optically active element, the dimensions of the component surface, the mass of the first and/or second optically active element and/or of the image sensor, and their response to the first and/or second actuator.
The predetermined speed is selected such that the relative distance between the image sensor and the first optically active element changes by no more than the depth of field during the capture of an image. The image is therefore captured sufficiently sharply even though the relative distance is being adjusted at the same time. Because the adjustment continues, the image sensor and/or the first optically active element need not be stopped and re-accelerated between images. This avoids vibrations caused by accelerating and stopping the image sensor and/or the first optically active element. A further advantage is that the time needed to check the component for defects is minimized, since the image sensor and/or the first optically active element only have to be accelerated and stopped once.
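The constraint that the focus travel during one exposure must not exceed the depth of field yields an upper bound on the continuous adjustment speed. A minimal sketch under that assumption; the depth of field and exposure time are illustrative values, not figures from the patent.

```python
def max_scan_speed_um_per_s(depth_of_field_um, exposure_time_s):
    """Fastest continuous focus-scan speed at which the relative distance
    changes by at most one depth of field during a single exposure."""
    return depth_of_field_um / exposure_time_s

# With a 10 um depth of field and a 5 ms exposure, the relative distance may
# change by at most 10 um within those 5 ms, i.e. at most 2000 um/s.
v_max = max_scan_speed_um_per_s(10.0, 0.005)
```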
The image processing apparatus of the first and second alternatives may be configured to determine a first image area of the first plurality of image areas of the first image after the first image is captured or after the second image is captured. In the first image region, a first partial region of the sharpness plane of the reflected light is projected on an image sensor surface of the image sensor. In addition or alternatively, the image processing device determines a second image region of the first plurality of image regions of the first image in which a second partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor.
Furthermore, the image processing device additionally or alternatively determines a third image area of the second plurality of image areas of the second image after capturing the second image. In the third image region, the first partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor. Additionally or alternatively, the image processing device also determines a fourth image region of the second plurality of image regions of the second image, in which a second partial region of the sharpness plane of the reflected light is projected on the image sensor surface of the image sensor.
The image processing device may be further configured to provide control commands to the control device to control the first and/or second actuator so that, after the second image has been captured, the relative distance between the first optically active element and the image sensor is returned to its initial length.
The first and second image regions of the first image may each image a portion of the component that substantially corresponds to the portions imaged in the third and fourth image regions of the second image. Together, the first and second image regions of the first image image the complete component.
The first plurality of image regions of the first image may correspond to a second plurality of image regions of the second image.
The first and second image areas of the first image and/or the third and fourth image areas of the second image may be contiguous image areas. Alternatively, the first and second image regions of the first image and the third and fourth image regions of the second image may at least partially overlap.
A first partial region of the sharpness plane in the first image and a second partial region of the sharpness plane in the second image are projected onto the image sensor surface of the image sensor within a predetermined depth of field. At the same time, the second partial region of the sharpness plane in the first image and the first partial region of the sharpness plane in the second image are projected outside the predetermined depth of field onto the image sensor surface of the image sensor.
The image processing device may be configured to cut out the first image region from the first image and the fourth image region from the second image, and to combine the two cut-out regions into a third image.
With the generated third image, the image processing device needs to examine only one image instead of two when checking the component for damage or defects, which reduces the time and computational cost of image inspection.
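Combining the cut-out regions into a third image amounts to a per-pixel selection between the two exposures, driven by the sharpness classification. A minimal NumPy sketch; the function name and the boolean-mask convention are assumptions, not the patent's own interface.

```python
import numpy as np

def fuse_images(img_first, img_second, sharp_in_first):
    """Compose the third image: take pixels from the first image wherever it
    was sharp (boolean mask True), and from the second image everywhere else."""
    return np.where(sharp_in_first, img_first, img_second)
```

The mask would typically be the per-tile sharpness classification upsampled to pixel resolution, so that each pixel of the third image comes from whichever exposure imaged it within the depth of field.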
The image processing device may be further configured to determine whether the component has at least one defect based on the first image region of the first image and/or the fourth image region of the second image and/or the third image. If the image processing device determines that there is at least one defect, the image processing device is configured to provide defect information about the component.
The optical component inspection system may comprise a position detection sensor configured to determine the position and/or orientation of the first optically active element and/or of the image sensor surface of the image sensor. The position detection sensor is further configured to provide this information to the control device, which controls the first and/or second actuator based on it. The position detection sensor may be an optical or (electro)mechanical sensor.
The position detection sensor may be mounted on the inside of the lens barrel. In a further variant, the position detection sensor can be integrated into the image sensor: the holder carries a pattern on a surface region facing the image sensor, and the image sensor is configured to recognize this pattern and to determine the position and/or orientation of the optically active element from it.
This information enables the control device to position the first optically active element and/or the image sensor precisely, so that the sharpness plane of the light reflected from the component is projected onto the image sensor surface.
The camera arrangement may include a second optically active element in the optical path between the first optically active element and the image sensor. The optical axis of the second optically active element is coaxial with the optical axis of the first optically active element.
The first optically effective element can be an achromatic lens or apochromatic lens and/or the second optically effective element can be a condenser lens.
The refractive index of a lens material increases continuously from red to blue light, and the focal length of a simple lens decreases accordingly. Achromatic lenses are used to compensate for or correct this chromatic imaging error.
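The achromatic condition for a thin-lens doublet can be written down explicitly: the two element powers must sum to the total power while satisfying phi1/V1 + phi2/V2 = 0, where V is the Abbe number. The sketch below is a textbook thin-lens calculation with illustrative Abbe numbers, not a lens design from the patent.

```python
def achromat_powers(total_focal_mm, v_crown, v_flint):
    """Element powers (1/mm) of a thin achromatic doublet: phi1 + phi2 equals
    the total power, and phi1/V1 + phi2/V2 = 0 cancels the chromatic shift."""
    phi = 1.0 / total_focal_mm
    phi_crown = phi * v_crown / (v_crown - v_flint)
    phi_flint = -phi * v_flint / (v_crown - v_flint)
    return phi_crown, phi_flint

# A 100 mm doublet from crown glass (V = 60) and flint glass (V = 36):
# the crown element converges (f1 = 40 mm), the flint element diverges.
p_crown, p_flint = achromat_powers(100.0, 60.0, 36.0)
```

This is why the lens pair held by the elastic holder consists of a converging crown element and a diverging flint element whose chromatic focal shifts cancel.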
Furthermore, an optical component inspection system for inspecting at least one surface of at least one component is proposed. A receptacle is configured to position the component in front of the camera device so that a first surface of the component can be inspected by the camera device. The camera device comprises an image sensor configured to receive light reflected from the first surface of the component. The optical component inspection system further comprises a first optically active element arranged in the optical path of the reflected light to the image sensor, and an adjustment device for the first optically active element. The adjustment device comprises a holder for the first optically active element. The holder is fixed to the inside of a hollow cylindrical lens barrel by means of a linear guide, and the longitudinal centerline of the barrel is coaxial with the optical axis of the first optically active element. The linear guide is configured to guide the holder parallel to the optical axis. The adjustment device further comprises a first actuator for adjusting the relative distance between the optically active element and the image sensor, i.e. for displacing the first optically active element relative to the image sensor.
Furthermore, an adjustment device for an optically active element is proposed, comprising a holder for the first optically active element. The holder is fixed to the inside of a hollow cylindrical lens barrel whose longitudinal centerline is coaxial with the optical axis of the first optically active element. The holder is elastically bendable at least in the longitudinal direction of the lens barrel. Furthermore, the adjustment device comprises a first actuator for adjusting the holder, i.e. for displacing the first optically active element along the optical axis relative to the lens barrel.
Furthermore, an adjustment device for an optically active element is proposed, comprising a holder for the first optically active element. The holder is fixed to the inside of a hollow cylindrical lens barrel by means of a linear guide, and the longitudinal centerline of the barrel is coaxial with the optical axis of the first optically active element. The linear guide is configured to guide the holder parallel to the optical axis. Furthermore, the adjustment device comprises a first actuator for adjusting the holder, i.e. for displacing the first optically active element along the optical axis relative to the lens barrel.
Furthermore, a method for inspecting at least one surface of at least one component is proposed, comprising the following steps: aligning the component with a camera device; inspecting a first surface of the component by means of the camera device; receiving light reflected from the first surface of the component with an image sensor of the camera device; holding a first optically active element in the optical path of the reflected light by means of a holder that is elastically bendable in a longitudinal direction parallel to the optical axis of the optically active element; and adjusting the relative distance between the image sensor and the first optically active element so as to displace the first optically active element along the optical axis relative to the image sensor.
The method may further comprise the steps of: adjusting a relative distance between the image sensor and the first optically active element so as to displace the image sensor along the optical axis relative to the first optically active element; and as an option, adjusting a relative distance between the image sensor and the first optically active element so as to project a sharpness plane of the reflected light onto an image sensor surface of the image sensor facing the reflected light.
The method may further comprise the step of: illuminating the first surface of the component with light. The light may be light of a particular wavelength or of a particular wavelength range.
The method may further comprise the steps of: capturing a first image; determining, based on the first image, whether the sharpness plane of the reflected light is substantially completely projected onto the image sensor surface of the image sensor; and, if the sharpness plane of the reflected light is not substantially completely projected onto the image sensor surface of the image sensor: determining a first image region of a first plurality of image regions of the first image, in which a first partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor; determining a second image region of the first plurality of image regions of the first image, in which a second partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; adjusting the relative distance between the first optically active element and the image sensor so as to project the second partial region of the sharpness plane of the reflected light onto the image sensor surface of the image sensor; capturing a second image; determining a third image region of a second plurality of image regions of the second image, in which the first partial region of the sharpness plane of the reflected light is not projected onto the image sensor surface of the image sensor; and determining a fourth image region of the second plurality of image regions of the second image, in which the second partial region of the sharpness plane of the reflected light is projected onto the image sensor surface of the image sensor.
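How the sharp and blurred image regions are determined is not specified in the disclosure. As a minimal illustrative sketch, one common focus metric — the variance of a discrete Laplacian, computed per grid cell — could be used; the grid layout, metric and threshold here are assumptions, not part of the patented method:

```python
import numpy as np

def laplacian_variance(region):
    """Focus score: variance of a discrete 5-point Laplacian response.
    High values indicate fine detail (sharp), low values indicate blur."""
    lap = (-4.0 * region[1:-1, 1:-1]
           + region[:-2, 1:-1] + region[2:, 1:-1]
           + region[1:-1, :-2] + region[1:-1, 2:])
    return float(lap.var())

def classify_regions(image, n_rows, n_cols, threshold):
    """Divide `image` into an n_rows x n_cols grid of image regions and
    mark each region as sharp (True) or blurred (False)."""
    h, w = image.shape
    rh, rw = h // n_rows, w // n_cols
    sharp = np.zeros((n_rows, n_cols), dtype=bool)
    for i in range(n_rows):
        for j in range(n_cols):
            cell = image[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
            sharp[i, j] = laplacian_variance(cell) >= threshold
    return sharp
```

Grid size and threshold are free parameters; in the two-shot scheme above, regions classified as blurred in the first image are the candidates to be imaged sharply in the second exposure.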
In a first alternative, the method may comprise the steps of: capturing a first image; adjusting the relative distance between the image sensor and the first optically active element by a predetermined path length; and capturing a second image.
In a second alternative, the method may comprise the steps of: capturing a first image while adjusting the relative distance between the image sensor and the first optically active element at a predetermined speed; and, after the first image has been captured or after a predetermined period of time thereafter, continuing to adjust the relative distance between the image sensor and the first optically active element while simultaneously capturing a second image.
The method according to the first and second alternative may further comprise the steps of: determining a first image region of a first plurality of image regions of a first image in which a first partial region of a sharpness plane of the reflected light is projected on an image sensor surface of the image sensor; and/or determining a second image region of the plurality of image regions of the first image in which a second partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor; and/or determining a third image region of the second plurality of image regions of the second image in which the first partial region of the sharpness plane of the reflected light is not projected on the image sensor surface of the image sensor; and/or a fourth image region of the second plurality of image regions of the second image is determined, in which a second partial region of the sharpness plane of the reflected light is projected on the image sensor surface of the image sensor.
The method may further comprise the steps of: cutting out the first image region from the first image and the fourth image region from the second image; and combining the cut-out first and fourth image regions to generate a third image.
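The cutting-out and combining of image regions amounts to a per-pixel (or per-region) selection between the two exposures. The mask-based formulation below is an illustrative assumption, not the patented implementation:

```python
import numpy as np

def combine_images(first_image, second_image, sharp_in_first):
    """Build the third image: take each pixel from the first image where
    it was imaged sharply there (mask True), otherwise from the second."""
    return np.where(sharp_in_first, first_image, second_image)
```

Because corresponding regions of the two images show the same part of the component, no geometric registration is assumed here; only the focus state differs between the exposures.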
The method may further comprise the steps of: determining whether the component has at least one defect based on the first image region of the first image and/or the fourth image region of the second image and/or the third image; and if the component has at least one defect, providing defect information about the component.
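How the defect decision itself is made is left open by the disclosure. One common industrial approach — assumed here purely for illustration — is a pixel-wise comparison of the sharp composite image against a reference image of a known-good component:

```python
import numpy as np

def find_defects(image, reference, tolerance):
    """Report whether `image` deviates from `reference` by more than
    `tolerance` anywhere, and how many pixels are affected."""
    deviation = np.abs(image.astype(float) - reference.astype(float))
    defect_pixels = int(np.count_nonzero(deviation > tolerance))
    return defect_pixels > 0, defect_pixels
```

The returned flag and pixel count would correspond to the "defect information about the component" mentioned above; real systems typically add alignment and more robust defect models.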
Although some of the foregoing aspects relate to methods, these aspects may also be applied to devices. Likewise, the above aspects relating to the apparatus may be applied to the method accordingly.
Drawings
Further objects, features, advantages and possible applications emerge from the following description of non-limiting embodiments, which is to be read in conjunction with the accompanying drawings. All described and/or illustrated features, by themselves or in any combination, form the subject matter disclosed herein, irrespective of their grouping in the claims or their references back to one another. The dimensions and proportions of the parts shown in the figures are not necessarily to scale and may differ in the embodiments to be realized.
Fig. 1 shows a schematic side view of an adjusting device for an optically effective element;
figures 2 and 3 show top views of embodiments of the holder of the adjustment device;
FIG. 4 shows a schematic side view of an optical component detection system for detecting at least one surface of at least one component;
fig. 5 and 6 schematically show different image shots with image areas of different sharpness;
FIG. 7 schematically shows an image composed of image regions of the previously captured images;
figures 8 to 10 show side views of the sharpness plane of the component to be imaged relative to the image sensor of the optical component detection system;
FIG. 11 shows a time-velocity diagram according to which the relative distance between the optically active element and the image sensor of the optical assembly detection system is varied;
FIG. 12 shows a time-focus path diagram according to which an embodiment for taking an image of a component is explained;
fig. 13 shows another time-focus path diagram according to which another embodiment for capturing component images can be explained.
The device variants described herein, and their functions and operating principles, are intended only to promote a better understanding of their structure, manner of operation and properties; they do not limit the disclosure to the examples. The drawings are partly schematic, with essential features and effects sometimes shown greatly enlarged in order to clarify functions, operating principles, technical configurations and features. Every mode of operation, every principle, every technical configuration and every feature disclosed in the drawings or the text can be combined freely and arbitrarily with all claims, with every feature in the text and in the other drawings, and with other modes of operation, principles, technical configurations and features contained in or resulting from this disclosure, so that all conceivable combinations are attributed to the devices described. Combinations between all individual statements in the text, i.e. in every section of the description and in the claims, and combinations between different variants in the text, in the claims and in the drawings, are likewise comprised herein and can be made the subject matter of further claims. The claims do not limit the disclosure, nor do they limit the possible combinations of all indicated features with one another. All disclosed features are explicitly also disclosed individually and in combination with all other features.
Detailed Description
Corresponding or functionally similar components in the figures are provided with the same reference numerals. Apparatus and methods will now be described in accordance with embodiments.
Fig. 1 shows an adjusting device 100 for an achromatic lens 130 as a first optically effective element. The achromatic lens 130 is composed of a flint glass lens 130B and a crown glass lens 130A. The adjustment device 100 comprises a holder 140 for the achromatic lens 130. The holder 140 is fixed on the inside of a hollow cylindrical barrel 110, and the longitudinal center line of the barrel 110 is coaxial with the optical axis OA of the achromatic lens 130. In the present exemplary embodiment, the holder 140 is configured in the form of a ring. The holder 140 receives the achromatic lens 130 through an opening of the holder 140 such that the optical axis OA of the achromatic lens is coaxial with the longitudinal centerline of the barrel 110. The lens barrel 110 is made of a paramagnetic material with low magnetic permeability (magnetic permeability μ only slightly greater than 1); here, the paramagnetic material is aluminum. In another variant, the lens barrel 110 has plastic as its material.
The holder 140 is elastically bendable at least in the longitudinal direction of the lens barrel. For this purpose, the holder 140 comprises an elastically deformable, rubber-containing material. In another variant, the holder 140 is configured as a bellows or a coil spring. In yet another variant, the holder 140 is configured as a coil spring that holds the achromatic lens 130 such that its optical axis OA is coaxial with the longitudinal centerline of the barrel 110.
As shown in fig. 1, the holder 140 is fixed to the lens barrel 110 at a first end region. Along the optical axis OA, the wall thickness of the holder 140 decreases toward a second end region of the holder 140, which holds the achromatic lens 130. The second end region of the holder 140 is not fixed to the lens barrel 110 and is spaced apart from the lens barrel 110 in the direction perpendicular to the optical axis OA. For this purpose, the holder 140 has at least one recess which serves as a seat for the achromatic lens 130 and in which the achromatic lens 130 is accommodated and held.
The adjustment device 100 also includes a first actuator 120 for adjusting the holder 140 in order to displace the achromatic lens 130 along the optical axis OA relative to the barrel 110. For this purpose, the first actuator 120 in fig. 1 has a coil 121 which at least partially surrounds the holder 140.
In fig. 1, coil 121 is attached to barrel 110. In another variation, the coil 121 is disposed spaced apart from the lens barrel 110.
In fig. 1, the first end region of the holder 140 is arranged outside the coil 121 with respect to the optical axis OA, whereas the second end region of the holder 140 is located inside the coil 121.
In another variant, the position of the first end region of the holder 140 coincides with one end of the coil 121 with respect to the optical axis OA. In a further variant, the first end region of the holder 140 is located inside or outside the coil 121 with respect to the optical axis OA. In yet another variant, the second end region of the holder 140 is located inside or outside the coil 121 with respect to the optical axis OA.
In fig. 1, the holder 140 has yokes 141, 142 comprising soft iron at its second end region. In another variant, the holder 140 has permanent-magnetic yokes 141, 142. The yokes 141, 142 are arranged opposite the seat of the holder 140. In another variant, the yokes 141, 142 are located between the achromatic lens 130 and the holder 140.
When a current flows through the coil 121, a magnetic field is generated within the coil. When the coil 121 is energized, this magnetic field acts on the soft-iron yokes 141, 142 with a force that pulls them towards the center of the coil 121. The soft-iron yokes 141, 142 are made of unalloyed iron of high purity. The holder 140 is thereby stretched or compressed (depending on where the coil 121 is arranged along the optical axis OA relative to the soft-iron yokes 141, 142), because the holder 140 is, on the one hand, firmly fixed on the inside of the lens barrel 110, while, on the other hand, its soft-iron yokes 141, 142 are pulled toward the center of the coil 121 by the magnetic field. Due to the applied magnetic field, the holder 140 is stretched or compressed in such a way that the position of the achromatic lens 130 and of the holder 140 is shifted along the optical axis OA. The orientation of the achromatic lens 130 is not changed, so that by shifting its position, only the focal point of the achromatic lens 130 is displaced along the optical axis OA.
In another variant, the holder 140 has permanent-magnetic yokes 141, 142. The permanent-magnetic yokes 141, 142 extend in a longitudinal direction parallel to the optical axis OA. The north and south poles of the permanent-magnetic yokes 141, 142 are oriented in opposite directions along the optical axis OA. In this case, the holder 140 is deformed according to the direction of the current in the coil 121: it deforms either in a first direction or in a second direction opposite to the first.
Because the holder 140 is elastically bendable, it offers a resistance against deformation. When the holder 140 is deformed, the restoring force generated by this resistance increases or decreases with the degree of deformation of the holder 140. Therefore, if a magnetic field is generated and the holder 140 is deformed by the resulting magnetic force, the restoring force and the magnetic force reach an equilibrium. If the magnetic field is reduced and finally switched off, the holder 140 returns to its original position due to its elastically bendable properties. Accordingly, the position of the holder 140 along the optical axis OA is adjusted by targeted control of the current intensity.
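This force balance can be illustrated with a linearized model: assuming, for illustration only, a coil force proportional to the current (a reasonable approximation for the permanent-magnet yoke variant) and a spring-like restoring force of the elastic holder, the equilibrium lens displacement follows directly. The constants c (force per ampere) and k (stiffness) are hypothetical example values, not taken from the disclosure:

```python
def equilibrium_displacement(current_a, force_per_ampere, stiffness):
    """Static force balance of the elastically bendable holder,
    k * x = c * I, so the lens position along the optical axis is
    x = c * I / k (linearized model with assumed constants)."""
    return force_per_ampere * current_a / stiffness

# e.g. c = 0.5 N/A, k = 1000 N/m, I = 0.2 A  ->  x = 100 um
```

This also shows why targeted control of the current intensity amounts to position control: in this model the displacement is proportional to the current.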
The construction shown in fig. 1 thus achieves a lightweight design for the optically effective element. Furthermore, the achromatic lens 130 is less prone to vibration, because vibrations from external sources are attenuated by the holder 140. When the focus of the achromatic lens 130 needs to be adjusted, the holder 140 is deformed by a corresponding magnetic field. Only a small mass is moved in the process, so that vibrations caused by the actuator 120 are minimized.
In another embodiment, the adjustment device 100 comprises a linear guide (not shown) instead of the coil 121. The linear guide is adapted to guide the holder 140 parallel to the optical axis OA, so that the position of the holder 140 and of the achromatic lens 130 can be adjusted along the optical axis OA. The adjustment device 100 further comprises a drive device (not shown) adapted to move the holder 140 along the linear guide.
A possible embodiment of the holder 140 is shown in fig. 2 and 3. Fig. 2 and 3 show top views of holder 140, yokes 141, 142, and achromatic lens 130.
In fig. 2, the holder 140 is composed of at least two holder members 143, 144 that together hold the achromatic lens 130. In one variant, the achromatic lens 130 is arranged between the two holder members 143, 144 such that they lie opposite one another. In another variant, the holder has at least four holder members (not shown), which are offset by 90° from one another about the optical axis OA and together hold the achromatic lens 130.
According to the variant of the holder 140 shown in fig. 2, the holder 140 has a yoke 141, 142 in each of the two holder members 143, 144. In the variant with four holder members, the holder 140 has at least one yoke in two of the holder members or in all four of them.
Fig. 3 shows another variant of the holder 140, in which the holder 140 is configured as a ring holder 145 that holds the achromatic lens 130.
According to the variant shown in fig. 3, the yoke 146 is annular. Induced currents generated by the magnetic field in the annular yoke 146 are suppressed by a plurality of slits in the annular yoke 146. By means of the annular yoke 146, the force generated by the magnetic field acts uniformly on the holder 140 and deforms it uniformly. Alternatively, the yoke may be constructed as described with respect to fig. 2.
In fig. 4, an optical component detection system 200 for inspecting at least one surface O of a semiconductor chip B as a component is shown. The optical component detection system 200 includes a receptacle 150. The receptacle 150 is configured to position the semiconductor chip B in front of a camera device 220 so that a first surface O of the semiconductor chip B can be detected by means of the camera device 220. In fig. 4, the first surface is a side surface of the semiconductor chip B. The receptacle 150 does not merely align the semiconductor chip B with the surface O toward the camera device 220; it is also configured to orient other surfaces of the semiconductor chip B toward the camera device 220, so that these surfaces can be inspected as well. Furthermore, the optical component detection system 200 includes a light source (not shown) configured to direct light onto the first surface O of the semiconductor chip B, i.e. to illuminate the semiconductor chip B. The light source is configured to illuminate the semiconductor chip B with light of a specific wavelength or a specific wavelength range. In one variant, the light emitted by the light source is visible light having a wavelength in the range of about 380 to 790 nm. In other variants, infrared light or polarized light is used.
The camera device 220 includes a CCD chip 230 as an image sensor configured to receive the light L reflected at the first surface O of the semiconductor chip B. In other variants, the image sensor may be a CMOS chip or an image sensor sensitive to particular wavelength ranges, such as a microbolometer array or a pyroelectric array. Depending on the wavelength range of the light used, the CCD chip has a sensitivity adapted to the respective light range or to polarized light. The adjustment device 100 shown in fig. 1, with the achromatic lens 130 as the first optically effective element, is also arranged in the optical path of the reflected light L. Furthermore, a condenser lens 160 is arranged as a second optically effective element in the optical path between the achromatic lens 130 and the CCD chip 230. The optical axis of the condenser lens 160 is coaxial with the optical axis OA of the achromatic lens 130.
In the optical component detection system 200 shown in fig. 4, the adjustment device 100 comprises a coil 121 as the first actuator 120. In this embodiment, the holder 140 is elastically bendable in the longitudinal direction of the lens barrel 110.
Alternatively, the optical component detection system 200 includes a linear guide in place of the coil 121. The holder 140 is then fixed on the inside of the lens barrel 110 via the linear guide, which is adapted to guide the holder 140 parallel to the optical axis OA. The linear guide is configured as a guide rail extending parallel to the optical axis OA. In a further variant, the linear guide is configured as two guide rails, which lie symmetrically opposite one another with respect to the optical axis OA.
In fig. 4, the first surface O of the semiconductor chip B is not arranged parallel to the image sensor surface 231 of the CCD chip 230, but is inclined at an angle α with respect to the optical axis OA. In fig. 4, the angle α is greater than or less than 90°.
The optical component detection system 200 in fig. 4 further comprises a control device ECU configured to control the first actuator 120. The control device ECU is accordingly configured to control the current supplied to the coil 121 and thereby to adjust the relative distance between the achromatic lens 130 and the CCD chip 230 (or the image sensor surface 231 of the CCD chip 230). If the optical component detection system 200 comprises a linear drive instead of the coil 121, the control device ECU is configured to control the motor that moves the holder 140 along the linear guide.
In fig. 4, the control device ECU includes an image processing device BV. In another variant, the control device ECU and the image processing device BV may be two independent units configured to communicate with each other.
Furthermore, the optical component detection system 200 in fig. 4 has a second actuator 240 controlled by the control device ECU. The second actuator 240 serves to adjust the CCD chip 230 so as to displace it along the optical axis OA relative to the achromatic lens 130.
The optical component detection system 200 also includes a position detection sensor 250 configured to determine the position and/or orientation of the achromatic lens 130 and/or of the image sensor surface 231 of the CCD chip 230. In fig. 1, the position detection sensor 250 is arranged and fixed on the inside of the lens barrel 110, opposite the yoke 142. In another variant, the position detection sensor 250 may be arranged and fixed on the inside of the lens barrel 110 opposite the yoke 141.
In another variation, the position detection sensor 250 may be integrated in the CCD chip 230. In this case, the holder 140 has a mark on at least one surface area facing the CCD chip 230. CCD chip 230 is configured to detect the mark and determine the position and/or orientation of achromatic lens 130 based on the detected mark.
The position detection sensor 250 provides information about the position and/or orientation of the achromatic lens 130 and/or the image sensor surface 231 of the CCD chip 230 to the control device ECU, which controls the first and/or second actuator 120, 240 based on the provided information. The position detection sensor 250 is an optical position detection sensor, but may in another variant be an (electro) mechanical position detection sensor.
In fig. 4, the control device ECU is configured to adjust the relative distance between the achromatic lens 130 and the CCD chip 230 by controlling the first and/or second actuators 120, 240 so as to project the sharpness plane SE of the reflected light L onto the image sensor surface 231 of the CCD chip 230 facing the reflected light L.
In one possible case, the surface O of the semiconductor chip B is substantially parallel to the image sensor surface 231 of the CCD chip 230. The control unit ECU controls the first and/or second actuator 120, 240 such that the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230.
If the CCD chip 230 now captures an image of the semiconductor chip B, the semiconductor chip B is imaged sharply in the image and can be inspected for defects or damage. If the image processing device BV detects a defect or damage, it provides defect information about the semiconductor chip B.
The first and/or second actuator 120, 240 are adapted to increase or decrease the relative distance between the achromatic lens 130 and the CCD chip 230 by 100 μm within a few milliseconds. In particular, the first and/or second actuator 120, 240 is configured to change the relative distance by 100 μm within 2 to 10 ms, preferably within about 5 ms. The relative distance between the achromatic lens 130 and the CCD chip 230 is increased and decreased in such a way that excessive vibration of the achromatic lens 130 and/or the CCD chip 230 does not occur, in particular along the direction of motion.
Since the semiconductor chip B in fig. 4 is inclined with respect to the image sensor surface 231 of the CCD chip 230, it is impossible for the control device ECU to substantially completely project the sharpness plane of the reflected light L onto the image sensor surface 231 of the CCD chip 230 by adjusting the relative distance between the achromatic lens 130 and the image sensor surface 231 of the CCD chip 230.
The image processing device BV is configured to cause the CCD chip 230 to capture a first image BA. Based on the first image BA captured by the CCD chip 230, the image processing device BV determines whether the sharpness plane SE of the reflected light L is substantially completely projected onto the image sensor surface 231 of the CCD chip 230. If the image processing device BV determines that the sharpness plane SE is not substantially completely projected onto the image sensor surface 231 of the CCD chip 230, the image processing device BV determines a first image region B1 of a first plurality of image regions B1, B2 of the first image BA. In the first image region B1, a first partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230.
The first captured image BA is shown schematically in fig. 5, in which the image regions drawn with solid black lines are imaged sharply. In other words, the image regions with solid black lines are regions in which partial regions of the sharpness plane SE of the reflected light L are projected onto the image sensor surface 231 of the CCD chip 230. The image regions with dashed black lines are image regions that are blurred, i.e. regions in which no partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230.
The image processing device BV determines, based on the first image BA, a second image region B2 of the first plurality of image regions B1, B2 in which second image region B2 the second partial region of the sharpness plane SE of the reflected light L is not projected onto the image sensor surface 231 of the CCD chip 230.
Next, the image processing device BV supplies a control signal to the control device ECU. Based on the control signal, the control device ECU controls the first and/or second actuator 120, 240 so as to adjust the relative distance between the achromatic lens 130 and the CCD chip 230 such that the second partial region of the sharpness plane SE of the reflected light L is now projected onto the image sensor surface 231 of the CCD chip 230, whereas the first partial region of the sharpness plane SE is no longer projected onto it. The image processing device BV then sends a signal to the CCD chip 230 so that it captures the second image BB.
As schematically shown in fig. 6, the image processing apparatus BV determines, based on the second image BB, a third image region B3 of the second plurality of image regions B3, B4 of the second image BB. In the third image region B3, the first partial region of the sharpness plane SE of the reflected light L is not projected onto the image sensor surface 231 of the CCD chip 230. The image processing apparatus BV then determines a fourth image region B4 of the second plurality of image regions B3, B4 of the second image BB. In the fourth image region B4, a second partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230. Therefore, in the second image BB, the fourth image area B4 clearly images a part of the semiconductor chip B. The first and second plurality of image areas are not limited to a number of two.
In a preferred embodiment, the first and second image regions B1, B2 of the first image BA image a portion of the semiconductor chip B, respectively, which substantially corresponds to the portion of the semiconductor chip B in the third and fourth image regions B3, B4 of the second image BB. It is thus ensured that the same portion of the semiconductor chip B is imaged in the image areas B1, B2, B3, B4 of the first and second images BA, BB.
The first and second image regions B1, B2 of the first image BA and the third and fourth image regions B3, B4 of the second image BB are shown in fig. 5 to 7 as connected image regions. Alternatively, the first and second image regions B1, B2 of the first image BA and the third and fourth image regions B3, B4 of the second image BB may at least partially overlap.
The image processing device BV is configured to cut out the first image region B1 from the first image BA and the fourth image region B4 from the second image BB. From the cut-out image regions B1, B4, the image processing device BV generates a third image BC. The third image BC is a completely sharp image of the semiconductor chip B, because the semiconductor chip B is imaged sharply both in the first image region B1 of the first image BA and in the fourth image region B4 of the second image BB.
The image processing device BV is also configured to determine whether the semiconductor chip B has at least one defect or damage based on the first image region B1 of the first image BA and/or the fourth image region B4 of the second image BB and/or the third image BC. If the semiconductor chip B has a defect or damage, the image processing device BV provides defect information.
The sharpness plane SE with its corresponding depth of field ST is now shown in fig. 8 to 10. In order to photograph the semiconductor chip B sharply, the sharpness plane SE must be projected substantially onto the image sensor surface 231 of the CCD chip 230. Since the sharpness plane SE has a certain depth of field ST, the semiconductor chip B is also imaged sharply when the sharpness plane SE is not perfectly parallel to the image sensor surface 231 of the CCD chip 230, or when the sharpness plane SE is projected a short distance in front of or behind the image sensor surface 231 of the CCD chip 230. In this context, the depth of field ST describes the distance range within which the object (here the semiconductor chip B) is imaged sufficiently sharply.
As schematically shown in fig. 8, the sharpness plane SE is projected parallel to the image sensor surface 231 of the CCD chip 230 and spaced apart from the image sensor surface 231 along the optical axis OA. In fig. 8, the depth of field ST of the definition plane SE is sufficiently large that the image of the thus projected definition plane SE of the reflected light L is sharp.
In fig. 9 and 10, the case of the semiconductor chip B being inclined is schematically shown. Since the surface O of the semiconductor chip B is inclined with respect to the image sensor surface 231 of the CCD chip 230, the sharpness plane SE of the reflected light L is also inclined with respect to the image sensor surface 231 of the CCD chip 230.
In fig. 9, a first partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230, so that in the captured first image BA the first image region B1 reproduces a sharp image of the semiconductor chip B. Although the sharpness plane SE only partially intersects the image sensor surface 231 of the CCD chip 230, the first image region B1 of the first image BA is imaged sharply due to the depth of field ST. In the second image region B2, the second partial region of the sharpness plane SE with the depth of field ST is not projected onto the image sensor surface 231 of the CCD chip 230, so that the second image region B2 of the first image BA reproduces a blurred image of the semiconductor chip B.
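The fact that a tilted sharpness plane still yields a sharp strip can be quantified geometrically: the plane stays within the depth of field ST over a band whose width shrinks as the tilt increases. The following small helper is an illustration of this relationship, not part of the disclosure:

```python
import math

def sharp_band_width(tilt_deg, depth_of_field):
    """Width along the sensor surface over which a sharpness plane
    tilted by `tilt_deg` (relative to the sensor) remains within
    +/- depth_of_field / 2 of the image sensor surface."""
    return depth_of_field / math.tan(math.radians(tilt_deg))
```

A small tilt therefore leaves a wide sharp band (possibly covering the whole sensor, as in fig. 8), while a strong tilt leaves only a narrow sharp strip, motivating the two-exposure scheme of figs. 9 and 10.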
Based on the control command of the image processing device BV, the control device ECU controls the first and/or second actuator 120, 240 to adjust the relative distance between the achromatic lens 130 and the CCD chip 230. The sharpness plane SE is thereby displaced relative to the image sensor surface 231 of the CCD chip 230, so that the second partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230.
As schematically shown in fig. 10, in the fourth image region B4 of the captured second image BB, the second partial region of the sharpness plane SE of the reflected light L with the depth of field ST is projected onto the image sensor surface 231 of the CCD chip 230. Since the sharpness plane SE only partially intersects the image sensor surface 231 of the CCD chip 230 in the second image BB as well, the fourth image region B4 of the second image BB reproduces a part of the semiconductor chip B sharply due to the depth of field ST. In the third image region B3 of the second image BB, by contrast, the first partial region of the sharpness plane SE with the depth of field ST is no longer projected onto the image sensor surface 231 of the CCD chip 230. Accordingly, the third image region B3 of the second image BB does not reproduce that part of the semiconductor chip B sharply.
In light of the foregoing description of the optical component inspection system 200, a method for inspecting a surface of at least one semiconductor chip B is now described.
In one step, the semiconductor chip B is aligned with the camera device 220. In a next step, the first surface O of the semiconductor chip B is inspected by means of the camera device 220.
In the next step, the light L reflected at the surface O of the semiconductor chip B is received by means of the CCD chip 230. In a next step, the achromatic lens 130 is held in the optical path of the reflected light L by means of a holder 140 that is elastically bendable in the longitudinal direction. The longitudinal direction is parallel to the optical axis OA of the achromatic lens 130. In the next step, the relative distance between the CCD chip 230 and the achromatic lens 130 is adjusted in order to displace the achromatic lens 130 relative to the CCD chip 230 along the optical axis OA.
Fig. 11 shows a time-velocity (t, v) diagram according to which the control device ECU adjusts the relative distance between the achromatic lens 130 and the CCD chip 230.
At time t0, the first and/or second actuator 120, 240 begins to adjust the relative distance between the achromatic lens 130 and the CCD chip 230. The achromatic lens 130 and/or the CCD chip 230 is accelerated by the first and/or second actuator 120, 240 to the preset speed v1 during the first half of the time period (t0-t1). After half of the time period (t0-t1), the achromatic lens 130 and/or the CCD chip 230 is decelerated by the first and/or second actuator 120, 240.
After the adjustment of the achromatic lens 130 and/or the CCD chip 230, the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230, as shown in fig. 9. Once the achromatic lens 130 and/or the CCD chip 230 has stopped, the first image BA is captured during the second time period (t1-t2).
Subsequently, the first and second image regions B1, B2 are determined by the image processing device BV on the basis of the first image BA. In the first image region B1 of the first image BA, a first partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230, whereas in the second image region B2 the second partial region of the sharpness plane SE is not projected onto the image sensor surface 231 of the CCD chip 230. Depending on the lighting conditions during capture and on the chip size and read-out rate (pixels per second), the CCD chip 230 requires 6 to 12 milliseconds (typically about 8 to 10 milliseconds) to capture and read out an image.
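The determination of sharp and unsharp image regions described above can be illustrated with a simple local focus measure. The following sketch is not part of the disclosed system — the function names, the band-wise split, and the threshold are chosen purely for illustration. It classifies horizontal bands of an image as sharp or unsharp by the variance of a discrete Laplacian, a common proxy for local image sharpness:

```python
import numpy as np

def laplacian_variance(tile):
    """Focus measure: variance of a discrete 4-neighbour Laplacian."""
    lap = (-4.0 * tile[1:-1, 1:-1]
           + tile[:-2, 1:-1] + tile[2:, 1:-1]
           + tile[1:-1, :-2] + tile[1:-1, 2:])
    return lap.var()

def classify_bands(image, n_bands, threshold):
    """Split the image into horizontal bands and mark each band as
    sharp (True) or unsharp (False) by its focus measure."""
    bands = np.array_split(image, n_bands, axis=0)
    return [bool(laplacian_variance(b) > threshold) for b in bands]

# Synthetic example: upper half with strong local contrast ("sharp"),
# lower half uniform ("unsharp") -- mimicking image regions B1 and B2.
rng = np.random.default_rng(0)
img = np.vstack([rng.random((50, 100)), np.full((50, 100), 0.5)])
print(classify_bands(img, 2, threshold=0.01))  # [True, False]
```

In the scenario above, the sharp band would correspond to the first image region B1 and the unsharp band to the second image region B2.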
Then, at the beginning of a third time period (t2-t3), the control device ECU controls the first and/or second actuator 120, 240 on the basis of the control command of the image processing device BV in order to adjust the relative distance between the achromatic lens 130 and the CCD chip 230. The relative distance is adjusted such that the second partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230.
Accordingly, during the first half of the third time period (t2-t3), the achromatic lens 130 and/or the CCD chip 230 is accelerated to a preset speed; during the second half, it is decelerated. Once the achromatic lens 130 and/or the CCD chip 230 has stopped, the second image BB is captured by the CCD chip 230. The path by which the relative distance between the achromatic lens 130 and the CCD chip 230 is decreased or increased lies in the range between 70 and 150 μm, preferably 100 μm. This path is traversed within a time period of 2 to 10 milliseconds, preferably within 10 milliseconds. The path by which the relative distance between the achromatic lens 130 and the CCD chip 230 is moved depends on the first and second optically active elements 130, 160 used and on their optical properties.
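For orientation, the accelerate-then-decelerate motion over the preferred 100 μm path in 10 ms corresponds to a triangular velocity profile. The peak speed and acceleration implied by those figures can be estimated as follows; this is a back-of-the-envelope sketch under the stated figures, not a disclosed control law:

```python
def triangular_profile(distance_m, duration_s):
    """Peak speed and acceleration for a symmetric triangular velocity
    profile: accelerate for the first half, decelerate for the second."""
    v_peak = 2.0 * distance_m / duration_s   # triangle area equals distance
    accel = v_peak / (duration_s / 2.0)
    return v_peak, accel

# Figures from the description: 100 um focus stroke within 10 ms.
v_peak, accel = triangular_profile(100e-6, 10e-3)
print(f"peak speed = {v_peak*1e3:.0f} mm/s, acceleration = {accel:.1f} m/s^2")
```

With the preferred figures this yields a peak speed of 20 mm/s and an acceleration of 4 m/s², which gives a sense of the demands placed on the actuators 120, 240.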
The image processing device BV is further configured to provide a further control command to the first and/or second actuator 120, 240 in order to adjust the relative distance between the achromatic lens 130 and the CCD chip 230 back to the initial length after the second image BB has been captured.
In a further variant, three images each having three image regions are captured by means of the CCD chip 230. In one of the three image regions, a partial region of the sharpness plane SE is projected onto the image sensor surface 231 of the CCD chip 230, whereas in the other two image regions the sharpness plane SE is not projected onto the image sensor surface 231 of the CCD chip 230. The capture of three images proceeds analogously to the two-image process, except that each of the three images has three image regions. For the capture of the three images and the corresponding adjustment of the relative distance between the achromatic lens 130 and the CCD chip 230, the optical component inspection system requires 80 to 120 milliseconds, preferably 100 milliseconds.
In a first variant, the receptacle 150 is configured to position the semiconductor chip B in front of the camera device 220 such that the sharpness plane SE is at least partially projected onto the image sensor surface 231 of the CCD chip 230. Accordingly, the image processing device BV first captures a first image BA by means of the CCD chip 230. The image processing device BV then determines the first and second image regions B1, B2 and sends a control command to the control device ECU before or at time t0. Based on the control command, the control device ECU controls the first and/or second actuator 120, 240 to accelerate the achromatic lens 130 and/or the CCD chip 230 to the speed v1 during the first half of the first time period (t0-t1). After half of the first time period (t0-t1), the achromatic lens 130 and/or the CCD chip 230 is decelerated. Once the achromatic lens 130 and/or the CCD chip 230 has stopped, the second image BB is captured by the CCD chip 230.
Figs. 12 and 13 each show a time-focus-path diagram in which different steps 310 to 360 and 410 to 460 are carried out at different points in time; corresponding steps are provided with the same reference numerals. In both fig. 12 and fig. 13, three images of the semiconductor chip B are captured. Each image is divided into three image regions, of which only one sharply images the corresponding partial region of the semiconductor chip B; accordingly, only one image region lies within the depth of field. The number of captured images is, however, not limited to three. Furthermore, in another variant, a plurality of image regions may lie wholly or at least partially within and/or overlap the depth of field.
In fig. 12, before time t0, the semiconductor chip B is positioned in front of the camera device 220 by the receptacle 150. In step 310, after time t0, the CCD chip 230 captures a first image of the semiconductor chip B. In a step 311, which follows step 310, the image processing device BV determines in the first image a first image region of a first plurality of image regions, in which a first partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230. Furthermore, in step 311 the image processing device BV determines second and third image regions of the first image, in which neither the second nor the third partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230.
After a predetermined period of time following the capture of the first image in step 310, the image processing device BV provides control commands to the first and/or second actuator 120, 240. In step 320, the first and/or second actuator 120, 240 adjusts the relative distance between the CCD chip 230 and the achromatic lens 130 by a predetermined path length. The relative distance between the CCD chip 230 and the achromatic lens 130 is thus adjusted while the image processing device BV determines the first, second and third image regions.
The predetermined path length depends on the optical properties of the achromatic lens 130 and the lens 160 and/or on the dimensions of the surface O of the component B. Preferably, the predetermined path length is predetermined as a function of the depth of field ST, such that it is smaller than, equal to, or greater than the depth of field ST. As shown in fig. 12, the speed for traversing the predetermined path length first increases slowly and then steeply; after part of the predetermined path length has been covered, the speed first decreases steeply and then slowly. Abrupt starting and stopping of the achromatic lens 130 and/or the CCD chip 230 by the first and/or second actuator 120, 240 is thereby avoided, preventing excessive vibration in the direction of movement.
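A velocity profile that rises slowly, then steeply, and falls steeply, then slowly, as described above, can be modeled, for example, by a jerk-limited sin²-shaped curve. The sketch below is an illustrative model only, not the disclosed controller; it checks numerically that such a profile still covers the full focus stroke:

```python
import math

def scurve_velocity(t, distance_m, duration_s):
    """sin^2-shaped velocity: rises slowly, then steeply, peaks at
    mid-stroke, then falls steeply and finally slowly (jerk-limited)."""
    v_peak = 2.0 * distance_m / duration_s
    return v_peak * math.sin(math.pi * t / duration_s) ** 2

# Numerically integrate the profile to confirm it covers the full stroke.
d, T, n = 100e-6, 10e-3, 10000
dt = T / n
covered = sum(scurve_velocity((i + 0.5) * dt, d, T) * dt for i in range(n))
print(f"stroke covered: {covered * 1e6:.2f} um")  # ~100.00 um
```

Compared with the triangular profile, the smooth start and stop of such a curve is what suppresses the excitation of vibrations in the direction of movement.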
After the adjustment of the relative distance in step 320 is finished, a second image is captured by means of the CCD chip 230 in step 330. In a step 331, which follows step 330, the image processing means BV determine, in the second image, fourth and sixth image areas of the second plurality of image areas in which the first and third partial areas of the sharpness plane SE of the reflected light L are not projected on the image sensor surface 231 of the CCD chip 230. Furthermore, in step 331 the image processing device BV determines a fifth image region of the second image in which a second partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230.
After the capture of the second image in step 330 is completed, the image processing device BV again provides control commands to the first and/or second actuator 120, 240. In step 340, the first and/or second actuator 120, 240 adjusts the relative distance between the CCD chip 230 and the achromatic lens 130 by the predetermined path length. In another variant, the relative distance between the CCD chip 230 and the achromatic lens 130 may be adjusted in step 340 by a second predetermined path length that is greater or smaller than the predetermined path length in step 320. The relative distance between the CCD chip 230 and the achromatic lens 130 is thus adjusted while the image processing device BV determines the fourth, fifth and sixth image regions.
After the adjustment of the relative distance in step 340 is completed, a third image is captured by means of the CCD chip 230 in step 350. In a step 351, which immediately follows the successful capture of the third image, the image processing device BV determines seventh and eighth image regions of the third image, in which the first and second partial regions of the sharpness plane SE of the reflected light L are not projected onto the image sensor surface 231 of the CCD chip 230. In step 351, the image processing device BV also determines a ninth image region of the third image, in which a third partial region of the sharpness plane SE of the reflected light L is projected onto the image sensor surface 231 of the CCD chip 230.
After the capture of the third image in step 350, the image processing device BV provides a further control command to the first and/or second actuator 120, 240. In step 360, the first and/or second actuator 120, 240 adjusts the relative distance between the achromatic lens 130 and the CCD chip 230 back to its initial length. The relative distance between the achromatic lens 130 and the CCD chip 230 is thus adjusted while the image processing device BV determines the seventh, eighth and ninth image regions. The relative distance is adjusted back to its initial length in the same manner as in steps 320 and 340, but in the opposite direction.
After step 360, in successive steps 312, 332, 352, the image processing device BV checks, on the basis of the respectively determined image regions of the first, second and third images, whether the semiconductor chip B has a defect.
In another variant, after one of the steps 360, 312, 332, 352, the image processing device BV generates a fourth image in a step 353 by cutting out and combining the first, fifth and ninth image regions of the first, second and third images.
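The combination of the sharply imaged regions from several images in step 353 corresponds to what is commonly called focus stacking. A minimal band-based sketch follows; the horizontal-band geometry and the variance-based selection are assumptions made for illustration, not the disclosed algorithm:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_contrast(img, k=3):
    """Per-pixel focus measure: variance over a k x k neighbourhood."""
    padded = np.pad(img, k // 2, mode="edge")
    return sliding_window_view(padded, (k, k)).var(axis=(-1, -2))

def stack_by_bands(images, n_bands):
    """For each horizontal band, copy the band from whichever image
    has the highest mean focus measure there (a step-353-style merge)."""
    out = np.empty_like(images[0])
    bounds = np.linspace(0, images[0].shape[0], n_bands + 1).astype(int)
    for a, b in zip(bounds[:-1], bounds[1:]):
        scores = [local_contrast(im[a:b]).mean() for im in images]
        out[a:b] = images[int(np.argmax(scores))][a:b]
    return out

# Three synthetic images, each sharp in exactly one of three bands.
rng = np.random.default_rng(1)
texture = rng.random((90, 60))
imgs = []
for band in range(3):
    im = np.full((90, 60), 0.5)
    im[band * 30:(band + 1) * 30] = texture[band * 30:(band + 1) * 30]
    imgs.append(im)
fused = stack_by_bands(imgs, 3)
print(bool(np.allclose(fused, texture)))  # True
```

The fused result plays the role of the fourth image: every band comes from the capture in which that partial region lay within the depth of field.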
In another variant, the image processing means BV carry out step 311 after one of the steps 330, 331, 340, 350, 351, 360. Furthermore, the image processing means BV carry out step 331 after one of the steps 340, 350, 351, 360 and/or carry out step 351 after one of the steps 312, 332.
The image processing and the inspection of the images of the semiconductor chip B can be decoupled from the image-capture process. By making the capture of the images of the semiconductor chip B independent of their processing and inspection, the image-capture process can be reduced to fewer steps. The duration of image capture per component is thus reduced, and a larger number of components, with the corresponding image captures, can be handled in the same time.
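Such a decoupling of capture from processing is typically realized as a producer/consumer pipeline: the capture loop hands frames off to a queue, and inspection runs concurrently. A minimal sketch in Python, in which the names and the sentinel convention are illustrative and not taken from the disclosure:

```python
import queue
import threading

def capture_worker(frames, q):
    """Producer: push captured frames without waiting for analysis."""
    for f in frames:
        q.put(f)
    q.put(None)  # sentinel: capture finished

def processing_worker(q, results):
    """Consumer: analyze frames while the next ones are being captured."""
    while (f := q.get()) is not None:
        results.append(f"inspected frame {f}")

q, results = queue.Queue(), []
t = threading.Thread(target=processing_worker, args=(q, results))
t.start()
capture_worker(range(3), q)   # stands in for the CCD read-out loop
t.join()
print(results)
```

Because the capture loop never blocks on inspection, its cycle time is governed only by exposure, read-out and focus motion, which is the effect the paragraph above describes.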
In fig. 13, before time t0, the semiconductor chip B is positioned in front of the camera device 220. In step 410, after time t0, the CCD chip 230 captures a first image and provides it to the image processing device BV. During step 410, the image processing device BV provides control commands to the first and/or second actuator 120, 240 in order to adjust the relative distance between the achromatic lens 130 and the CCD chip 230 at a predetermined speed. Already during the capture of this first image, the first and/or second actuator 120, 240 has begun to adjust the relative distance between the achromatic lens 130 and the CCD chip 230.
The timing of the adjustment of the relative distance between the achromatic lens 130 and the CCD chip 230, and the predetermined speed itself, depend on: the optical properties of the achromatic lens 130 and the lens 160, the depth of field ST, the dimensions of the surface O of the component B, the mass of the achromatic lens 130 and/or the lens 160 and/or the CCD chip 230, and the responsiveness of the above-mentioned components to the first and/or second actuator 120, 240. For example, if the depth of field is 70 μm, the sharpness plane SE may move by at most 70 μm during the capture of the first image. A sharp image of the corresponding partial region of the semiconductor chip B in one image region of the captured image is thus ensured.
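The 70 μm example gives a simple upper bound on the focus-sweep speed: the sharpness plane may travel at most one depth of field during a single exposure. A sketch with assumed figures (70 μm depth of field and an 8 ms capture time, both taken from the description's stated ranges):

```python
def max_sweep_speed(depth_of_field_m, exposure_s):
    """Upper bound on the focus-sweep speed: the sharpness plane may
    travel at most one depth of field during a single exposure."""
    return depth_of_field_m / exposure_s

# Assumed figures from the description: 70 um depth of field, ~8 ms capture.
v_max = max_sweep_speed(70e-6, 8e-3)
print(f"max sweep speed = {v_max * 1e3:.3f} mm/s")
```

Under these assumptions the constant sweep speed must stay below roughly 8.75 mm/s for each captured image to contain one sharply imaged region.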
In fig. 13, the predetermined speed is kept constant over the entire travel path during image capture. In another variant, the speed for adjusting the relative distance between the CCD chip 230 and the achromatic lens 130 is varied within a predetermined speed range or between at least two different speeds.
After the sharpness plane SE has been adjusted by a predetermined path length and/or for a predetermined time, a second image is captured by means of the CCD chip 230 in step 430. The relative distance between the CCD chip 230 and the achromatic lens 130 continues to be adjusted at the predetermined speed, independently of the capture of the second image. In another variant, the second image is captured when a predetermined sharpness-plane position is reached.
After further adjustment of the sharpness plane SE by the predetermined path length and/or the predetermined time, a third image is captured in step 450 by means of the CCD chip 230. As with the shooting of the second image, the relative distance between the CCD chip 230 and the achromatic lens 130 is also adjusted at a predetermined speed during the shooting of the third image. In another variation, the predetermined speed is reduced during the taking of the third image until the relative distance between CCD chip 230 and achromatic lens 130 is eventually no longer adjusted.
Steps 311, 312, 331, 332, 351, 352, 353, 360 may be performed as described in fig. 12.
According to fig. 13, the achromatic lens 130 and/or the CCD chip 230 is accelerated and stopped only once during the capture of the first, second and third images. This avoids intermediate stops and re-accelerations, thereby further reducing the duration of image capture for each component. Vibration of the optical component inspection system 200 caused by repeated acceleration and stopping is likewise avoided.

Claims (24)

1. An optical component detection system (200) for detecting at least one surface of at least one component (B), wherein a receptacle (150) is configured to position the component (B) in front of a camera device (220) for detecting a first surface (O) of the component (B) by means of the camera device (220),
wherein the camera arrangement (220) comprises an image sensor (230) configured to receive reflected light (L) on the first surface (O) of the component (B);
wherein the optical component detection system (200) comprises a first optically active element (130) arranged in the optical path of the reflected light (L) to the image sensor (230) and an adjustment device (100) for the first optically active element (130), the adjustment device (100) comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the holder (140) is elastically bendable at least in the barrel longitudinal direction; and
a first actuator (120) for adjusting a relative distance between the optically active element (130) and the image sensor (230) so as to displace the optically active element (130) along the Optical Axis (OA) relative to the image sensor (230).
2. The optical component detection system (200) of claim 1,
wherein the first actuator (120) comprises a coil (121) at least partially surrounding the holder (140);
wherein the control device (ECU) is configured to control the current delivered to the coil (121) for generating the magnetic field;
wherein the holder (140) has a soft-iron or permanent magnet component (141, 142) configured to displace the holder (140) along a longitudinal centerline of the lens barrel (110) in dependence on the current delivered to the coil (121).
3. The optical component detection system (200) according to any one of the preceding claims, further comprising:
-second actuator means (240) controlled by the control device (ECU) for adjusting the image sensor (230) so as to displace the image sensor (230) along the Optical Axis (OA) relative to the first optically active element (130); and/or
At least one light source configured to send light onto the first surface (O) of the component (B);
wherein, as an option, the control device (ECU) is configured to adjust the relative distance between the first optically active element (130) and the image sensor (230) by controlling the first and/or second actuator means (120, 240) so as to project the sharpness plane (SE) of the reflected light (L) onto an image sensor surface (231) of the image sensor facing the reflected light (L).
4. The optical component detection system (200) according to any one of the preceding claims, wherein the image processing arrangement (BV) is configured to:
-taking a first image (BA) by means of the image sensor (230), the first image being provided to the image processing apparatus (BV);
determining, based on the generated first image (BA), whether a sharpness plane (SE) of the reflected light (L) is substantially fully projected onto the image sensor surface (231) of the image sensor (230); and is
If the sharpness plane (SE) of the reflected light (L) is not substantially completely projected onto the image sensor surface (231) of the image sensor (230), then
Determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230);
determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230);
-providing control commands to the control means (ECU) to control the first and/or second actuator means (120, 240) in order to adjust the relative distance between the first optically active element (130) and the image sensor (230) and to project a second partial area of the sharpness plane (SE) of the reflected light (L) on the image sensor surface (231) of the image sensor (230);
-taking a second image (BB) by means of the image sensor (230), the second image being provided to the image processing means (BV);
determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) a first partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and
determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) a second partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230).
5. The optical component detection system (200) according to any one of claims 1 to 3, wherein the image processing arrangement (BV) is configured to:
-taking a first image (BA) by means of the image sensor (230), the first image being provided to the image processing apparatus (BV);
-providing control commands to the control means (ECU) to control the first and/or second actuator means (120, 240) to adjust the relative distance between the first optically active element (130) and the image sensor (230) by a predetermined path length;
-taking a second image (BB) by means of the image sensor (230), the second image being provided to the image processing device (BV).
6. The optical component detection system (200) according to any one of claims 1 to 3, wherein the image processing arrangement (BV) is configured to:
-taking a first image (BA) by means of the image sensor (230), the first image being provided to the image processing apparatus (BV);
-providing control commands to the control means (ECU) to control the first and/or second actuator means (120, 240) to adjust the relative distance between the first optically active element (130) and the image sensor (230) at a predetermined speed during capturing of the first image (BA);
-adjusting the relative distance between the first optically active element (130) and the image sensor (230) after capturing the first image (BA) or after a predetermined period of time after capturing the first image (BA), during which a second image (BB) is simultaneously captured by means of the image sensor (230), the second image being provided to the image processing device (BV).
7. The optical component detection system (200) according to any one of claims 5 or 6, wherein the image processing arrangement (BV) is configured to, after capturing the first image (BA) or after capturing the second image (BB):
determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of a sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230); and/or
Determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA) in which a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and/or
Determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which a first partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and/or
Determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) a second partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230).
8. Optical component detection system (200) according to claim 4 or 7, wherein the first and second image regions (B1, B2) of the first image (BA) respectively image a portion of the component (B) which substantially corresponds to the portion of the component (B) in the third and fourth image regions (B3, B4) of the second image (BB); and/or
Wherein the first partial region of the definition plane (SE) in the first image (BA) and the second partial region of the definition plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) within a predetermined depth of field (ST); and
wherein the second partial region of the definition plane (SE) in the first image (BA) and the first partial region of the definition plane (SE) in the second image (BB) are projected outside the predetermined depth of field (ST) onto the image sensor surface (231) of the image sensor (230).
9. The optical component detection system (200) according to any one of claims 4 or 7 and/or 8, wherein the image processing device is further configured to:
-cutting out the first image area (B1) from the first image (BA) and the fourth image area (B4) from the second image (BB); and
combining the first and fourth cut-out image regions (B1, B4) to produce a third image (BC); and/or
Determining whether the component (B) has at least one defect based on the first image region (B1) of the first image (BA) and/or the fourth image region (B4) of the second image (BB) and/or the third image (BC); and
providing defect information on the component (B) if the image processing apparatus (BV) determines that there is at least one defect.
10. The optical component detection system (200) according to any one of the preceding claims, further comprising:
a position detection sensor (250) configured to determine a position and/or orientation of the first optically active element (130) and/or the image sensor surface (231) of the image sensor (230) and to provide information about the position and/or orientation of the first optically active element (130) and/or the image sensor surface (231) of the image sensor (230) to the control device (ECU), which controls the first and/or second actuator device (120, 240) based on the provided information,
wherein, as an option, the position detection sensor is an optical or (electro-) mechanical position detection sensor.
11. The optical component detection system (200) according to any one of the preceding claims, wherein the camera arrangement (220) comprises a second optically active element (160) in the optical path between the first optically active element (130) and the image sensor (230), wherein an optical axis of the second optically active element (160) is coaxial with an optical axis of the first optically active element (130); and/or
Wherein the first optically active element (130) is an achromatic lens; and/or
Wherein the second optically active element (160) is a condenser lens.
12. An optical component detection system (200) for detecting at least one surface of at least one component (B), wherein a receptacle (150) is configured to position the component (B) in front of a camera device (220) for detecting a first surface (O) of the component (B) by means of the camera device (220),
wherein the camera arrangement (220) comprises an image sensor (230) configured to receive reflected light (L) on the first surface (O) of the component (B);
wherein the optical component detection system (200) comprises a first optically active element (130) arranged in the optical path of the reflected light (L) to the image sensor (230) and an adjustment device (100) for the first optically active element (130), the adjustment device (100) comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) by means of a linear guide and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the linear guide is configured to guide the holder (140) parallel to the Optical Axis (OA);
a first actuator (120) for adjusting the relative distance between the image sensor (230) and the first optically active element (130) in order to displace the first optically active element (130) relative to the image sensor (230).
13. Adjustment device (100) for an optically active element (130), comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the holder (140) is elastically bendable at least in the barrel longitudinal direction; and
a first actuator (120) for adjusting the holder (140) in order to displace the first optically active element (130) along the Optical Axis (OA) relative to the barrel (110).
14. Adjustment device (100) for an optically active element (130), comprising:
a holder (140) for the first optically active element (130), wherein the holder (140) is fixed on the inside of a hollow-cylindrical barrel (110) by means of a linear guide and the longitudinal center line of the barrel (110) is coaxial with the Optical Axis (OA) of the first optically active element (130), and wherein the linear guide is configured to guide the holder (140) parallel to the Optical Axis (OA); and
a first actuator (120) for adjusting the holder (140) in order to displace the first optically active element (130) along the Optical Axis (OA) relative to the barrel (110).
15. Method for inspecting at least one surface of at least one component (B), comprising the steps of:
aligning the component (B) with a camera device (220);
-detecting a first surface (O) of said component (B) by means of said camera means (220);
receiving reflected light (L) on the first surface (O) of the component (B) by means of an image sensor (230) of the camera device (220);
-holding a first optically active element (130) in the optical path of the reflected light (L) by means of a holder (140) being elastically bendable in a longitudinal direction, wherein the longitudinal direction is parallel to an Optical Axis (OA) of the first optically active element (130);
adjusting a relative distance between the image sensor (230) and the first optically active element (130) so as to displace the first optically active element (130) along the optical axis relative to the image sensor (230).
16. The method of claim 15, further comprising the steps of:
adjusting a relative distance between the image sensor (230) and the first optically active element (130) so as to displace the image sensor (230) along the Optical Axis (OA) relative to the first optically active element (130); and as an option to do so,
adjusting a relative distance between the image sensor (230) and the first optically active element (130) so as to project a sharpness plane (SE) of the reflected light (L) onto an image sensor surface (231) of the image sensor (230) facing the reflected light (L).
17. The method according to any one of claims 15 or 16, comprising the steps of:
taking a first image (BA);
determining, based on the first image (BA), whether a sharpness plane (SE) of the reflected light (L) is substantially fully projected onto the image sensor surface (231) of the image sensor (230); and
if the sharpness plane (SE) of the reflected light (L) is not substantially completely projected onto the image sensor surface (231) of the image sensor (230), then
determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230);
determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230);
adjusting the relative distance between the first optically active element (130) and the image sensor (230) so as to project the second partial region of the sharpness plane (SE) of the reflected light (L) on the image sensor surface (231) of the image sensor (230);
taking a second image (BB);
determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) a first partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and
determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) a second partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230).
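The determination of sharp and blurred image regions recited in claim 17 can be illustrated with a common focus measure, the variance of the Laplacian. The following NumPy sketch is not taken from the patent; the tile size, threshold, and function names are our illustrative choices:

```python
import numpy as np


def laplacian(img: np.ndarray) -> np.ndarray:
    """Discrete 4-neighbour Laplacian; large magnitudes indicate sharp detail."""
    lap = np.zeros_like(img, dtype=float)
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1]
                       + img[1:-1, :-2] + img[1:-1, 2:]
                       - 4.0 * img[1:-1, 1:-1])
    return lap


def classify_regions(img: np.ndarray, tile: int, threshold: float) -> np.ndarray:
    """Split the image into tiles and mark each tile as sharp (the sharpness
    plane is projected there, cf. B1/B4) or blurred (cf. B2/B3), based on
    the variance of the Laplacian inside the tile."""
    lap = laplacian(img.astype(float))
    h, w = img.shape
    sharp = np.zeros((h // tile, w // tile), dtype=bool)
    for i in range(h // tile):
        for j in range(w // tile):
            patch = lap[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            sharp[i, j] = patch.var() > threshold
    return sharp
```

On a synthetic frame whose left half carries fine detail (a checkerboard) and whose right half is featureless, the left tiles classify as sharp and the right tiles as blurred.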
18. The method according to claim 15 or 16, comprising the steps of:
taking a first image (BA);
adjusting a relative distance between the first optically active element (130) and the image sensor (230) by a predetermined path length; and
taking a second image (BB).
19. The method according to claim 15 or 16, comprising the steps of:
taking a first image (BA);
adjusting the relative distance between the first optically active element (130) and the image sensor (230) at a predetermined speed while the first image (BA) is being taken;
adjusting the relative distance between the first optically active element (130) and the image sensor (230) after the first image (BA) has been taken, or after a predetermined period of time following the taking of the first image (BA), while a second image (BB) is taken at the same time.
20. The method according to claim 18 or 19, comprising the steps of:
determining a first image region (B1) of a first plurality of image regions (B1, B2) of the first image (BA), in which first image region (B1) a first partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230); and/or
determining a second image region (B2) of the first plurality of image regions (B1, B2) of the first image (BA), in which second image region (B2) a second partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and/or
determining a third image region (B3) of a second plurality of image regions (B3, B4) of the second image (BB), in which third image region (B3) a first partial region of the sharpness plane (SE) of the reflected light (L) is not projected on the image sensor surface (231) of the image sensor (230); and/or
determining a fourth image region (B4) of the second plurality of image regions (B3, B4) of the second image (BB), in which fourth image region (B4) a second partial region of the sharpness plane (SE) of the reflected light (L) is projected on the image sensor surface (231) of the image sensor (230).
21. The method of claim 17 or 20,
wherein the first and second image regions (B1, B2) of the first image (BA) each image a portion of the component (B) which substantially corresponds to the portion of the component (B) imaged in the third and fourth image regions (B3, B4) of the second image (BB); and/or
wherein a first partial region of the sharpness plane (SE) in the first image (BA) and a second partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) within a predetermined depth of field (ST); and
wherein a second partial region of the sharpness plane (SE) in the first image (BA) and a first partial region of the sharpness plane (SE) in the second image (BB) are projected onto the image sensor surface (231) of the image sensor (230) outside the predetermined depth of field (ST).
22. The method according to claim 17 or 20, and/or claim 21, comprising the steps of:
cutting out the first image region (B1) from the first image (BA) and the fourth image region (B4) from the second image (BB); and
combining the cut-out first and fourth image regions (B1, B4) to produce a third image (BC); and/or
determining whether the component (B) has at least one defect on the basis of the first image region (B1) of the first image (BA) and/or the fourth image region (B4) of the second image (BB) and/or the third image (BC); and
providing defect information about the component (B) if the image processing apparatus (BV) determines that there is at least one defect.
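The cutting-out and combining recited in claim 22 amounts to per-region compositing of the two exposures (focus stacking). A minimal sketch, not from the patent, assuming a boolean tile mask of the kind a sharpness classifier might produce:

```python
import numpy as np


def combine_sharp_regions(image_a: np.ndarray, image_b: np.ndarray,
                          sharp_a: np.ndarray) -> np.ndarray:
    """Compose a third image (cf. BC): each tile is copied from image_a where
    that tile was classified sharp in the first exposure (cf. B1), and from
    image_b otherwise (cf. B4)."""
    h, w = image_a.shape
    th, tw = h // sharp_a.shape[0], w // sharp_a.shape[1]
    out = image_b.copy()
    for i in range(sharp_a.shape[0]):
        for j in range(sharp_a.shape[1]):
            if sharp_a[i, j]:
                out[i * th:(i + 1) * th, j * tw:(j + 1) * tw] = \
                    image_a[i * th:(i + 1) * th, j * tw:(j + 1) * tw]
    return out
```

The combined image could then be passed to whatever defect detector the image processing apparatus (BV) implements; that step is deliberately left out of this sketch.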
23. The method of any one of claims 15 to 22,
wherein, if the holder (140) has a part comprising soft iron, the holder (140) is deformed by the generated magnetic field in a first direction along the Optical Axis (OA) relative to the image sensor (230).
24. The method according to any one of claims 15 to 22, wherein, if the holder (140) has a permanent-magnetic part, the holder (140) is deformed by the generated magnetic field in a first or a second direction relative to the image sensor (230), depending on the direction of current flow in the coil (121), wherein the second direction is opposite to the first direction.
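The behaviour recited in claims 23 and 24 reduces to a sign rule: a soft-iron part is attracted into the field regardless of current polarity, while a permanent-magnetic part reverses with the current. A toy sketch; the material labels and return convention are our assumptions, not claim language:

```python
def holder_deflection_direction(material: str, coil_current: float) -> int:
    """Direction of holder (140) deflection along the optical axis:
    +1 = first direction, -1 = second direction, 0 = no field applied."""
    if coil_current == 0.0:
        return 0
    if material == "soft_iron":
        # Attraction only: independent of the current's sign (claim 23).
        return +1
    if material == "permanent_magnet":
        # Deflection direction follows the current direction (claim 24).
        return +1 if coil_current > 0 else -1
    raise ValueError("unknown material: " + material)
```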
CN201880028492.1A 2017-04-03 2018-03-13 Optical component inspection system and method for inspecting at least one component Active CN110651165B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102017003231.9A DE102017003231A1 (en) 2017-04-03 2017-04-03 Optical component detection system and method for detecting at least one component
DE102017003231.9 2017-04-03
PCT/EP2018/056154 WO2018184792A1 (en) 2017-04-03 2018-03-13 Optical component measurement system and method for measuring at least one component

Publications (2)

Publication Number Publication Date
CN110651165A true CN110651165A (en) 2020-01-03
CN110651165B CN110651165B (en) 2021-07-23

Family

ID=61691955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880028492.1A Active CN110651165B (en) 2017-04-03 2018-03-13 Optical component inspection system and method for inspecting at least one component

Country Status (6)

Country Link
JP (1) JP7128881B2 (en)
CN (1) CN110651165B (en)
DE (1) DE102017003231A1 (en)
SG (1) SG11201909202QA (en)
TW (1) TWI714839B (en)
WO (1) WO2018184792A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001116980A (en) * 1999-10-18 2001-04-27 Fuji Photo Film Co Ltd Automatic focusing camera and photographing method
CN102037309A (en) * 2008-05-19 2011-04-27 瑞尼斯豪公司 Optical inspection probe
CN102062926A (en) * 2009-11-17 2011-05-18 台湾东电化股份有限公司 Lens driving device
DE102011089055A1 (en) * 2011-12-19 2013-06-20 Technische Universität Berlin Instrument e.g. medical instrument e.g. endoscope for e.g. medical investigation, has actuator that is provided to adjust adjusting element along longitudinal axis of runner relative to stator and is designed as reluctance motor
CN103636201A (en) * 2011-07-05 2014-03-12 罗伯特·博世有限公司 Arrangement and method for determining imaging deviation of camera
US20140368724A1 (en) * 2013-06-12 2014-12-18 Nvidia Corporation Methods for enhancing camera focusing performance using camera orientation
CN104429055A (en) * 2012-06-29 2015-03-18 Lg伊诺特有限公司 Camera module
CN104783757A (en) * 2009-06-17 2015-07-22 3形状股份有限公司 Focus scanning apparatus

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2539778B2 (en) * 1984-11-22 1996-10-02 株式会社日立製作所 Inspection method and inspection device
JPS61241605A (en) * 1985-04-19 1986-10-27 Yoshiaki Ihara Image processing device
US5289318A (en) * 1990-07-31 1994-02-22 Canon Kabushiki Kaisha Optical apparatus provided with a driving unit for moving a lens
JPH04166906A (en) * 1990-10-31 1992-06-12 Sony Corp Original position detecting device for lens barrel
JP3701353B2 (en) * 1995-09-29 2005-09-28 大日本印刷株式会社 Image acquisition device
KR100403862B1 (en) * 2001-01-26 2003-11-01 어플라이드비전텍(주) Apparatus for inspecting semiconductor wafer and the methods thereof
TW556038B (en) * 2001-06-29 2003-10-01 Arc Design Inc Control system of zoom lens for digital still cameras
JP3831318B2 (en) * 2002-08-22 2006-10-11 オリンパス株式会社 Endoscopic imaging device
JP2004097292A (en) * 2002-09-05 2004-04-02 Olympus Corp Imaging device for endoscope
DE102008018586A1 (en) 2008-04-12 2009-11-05 Mühlbauer Ag Optical detection device and method for detecting surfaces of components
JP5402277B2 (en) * 2009-06-16 2014-01-29 コニカミノルタ株式会社 Actuator, drive device, and imaging device
JP5274733B1 (en) * 2011-10-13 2013-08-28 オリンパスメディカルシステムズ株式会社 Imaging unit and endoscope
JP5932343B2 (en) * 2012-01-13 2016-06-08 エイチエスティ・ビジョン株式会社 Observation equipment
WO2014094828A1 (en) * 2012-12-18 2014-06-26 Carl Zeiss Industrielle Messtechnik Gmbh Zoom lens having setting error correction
DE102012224179A1 (en) * 2012-12-21 2014-06-26 Olympus Winter & Ibe Gmbh Electromagnetic actuator for a surgical instrument
JP2017507680A (en) * 2013-12-23 2017-03-23 キャンプレックス インコーポレイテッド Surgical visualization system
JP5954757B2 (en) 2015-01-09 2016-07-20 上野精機株式会社 Appearance inspection device
JP6429718B2 (en) * 2015-04-22 2018-11-28 オリンパス株式会社 Imaging apparatus and endoscope
DE102015117276B4 (en) * 2015-10-09 2018-09-06 Carl Zeiss Industrielle Messtechnik Gmbh Method and device for measuring a test object with improved measuring accuracy


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PEYRE, P.: "Surface finish issues after direct metal deposition", THERMEC 2011, Part 1: 7th International Conference on Processing & Manufacturing of Advanced Materials (THERMEC 2011), August 1-5, 2011, Quebec City, Canada *
ZHOU, Wenju: "Research and Application of Machine-Vision-Based Online High-Speed Inspection and Precise Control", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
JP7128881B2 (en) 2022-08-31
TW201842300A (en) 2018-12-01
DE102017003231A1 (en) 2018-10-04
JP2020518832A (en) 2020-06-25
TWI714839B (en) 2021-01-01
CN110651165B (en) 2021-07-23
WO2018184792A1 (en) 2018-10-11
SG11201909202QA (en) 2019-11-28

Similar Documents

Publication Publication Date Title
US7460772B2 (en) Optical apparatus
CN1831622B (en) Image vibration reduction apparatus and camera equipped with the same
US7352477B2 (en) Two dimensional position detecting device
JP5854501B2 (en) Automatic visual inspection equipment
CN104219441A (en) Blur correction apparatus
WO2015011853A1 (en) Electronic component mounting apparatus and electronic component mounting method
CN105635564A (en) Multiple camera apparatus and method for synchronized autofocus
CN111654242B (en) Method and system for detecting notch on solar wafer
JP6789733B2 (en) Image blur correction device, lens barrel, and image pickup device
CN110651165B (en) Optical component inspection system and method for inspecting at least one component
WO2015011850A1 (en) Electronic component mounting apparatus and electronic component mounting method
JP5875676B2 (en) Imaging apparatus and image processing apparatus
JP2006030256A (en) Focusing adjustment method and focusing adjustment device for imaging apparatus
US20100091271A1 (en) Method and system for supporting a moving optical component on a sloped portion
WO2018047374A1 (en) Image processing device and image processing method
JP2008185788A (en) Camera
CN106461382B (en) Five-axis optical detection system
WO2015011851A1 (en) Electronic component mounting apparatus and electronic component mounting method
WO2015011852A1 (en) Electronic component mounting apparatus and electronic component mounting method
JP6746972B2 (en) Imaging device and imaging method
KR101861293B1 (en) Apparatus for inspecting optical lense and control mothod thereof
US11796608B2 (en) Magnetic property measurement apparatus
JP2002214693A (en) Method and device for picking up image of several objects and electronic part mounting device using the same
JP2600500B2 (en) Glass pipe end chamfering inspection equipment
JPH09250912A (en) Pattern measurement device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant