CN113532327A - Detection method for chip shape in material tray based on stripe projection 3D imaging - Google Patents


Info

Publication number
CN113532327A
Authority
CN
China
Prior art keywords
material tray
product
algorithm
imaging
camera
Prior art date
Legal status
Granted
Application number
CN202110801601.2A
Other languages
Chinese (zh)
Other versions
CN113532327B (en)
Inventor
洪敬柱
李林林
郑飞
Current Assignee
Hefei Tuxun Electronic Technology Co ltd
Original Assignee
Hefei Tuxun Electronic Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Tuxun Electronic Technology Co ltd
Priority to CN202110801601.2A
Publication of CN113532327A
Application granted
Publication of CN113532327B
Status: Active
Anticipated expiration

Classifications

    • G01B11/25 — Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B11/254 — Projection of a pattern, viewing through a pattern, e.g. moiré
    • G01B11/002 — Measuring arrangements using optical techniques for measuring two or more coordinates
    • G01B11/245 — Measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B11/2545 — Pattern projection with one projection direction and several detection directions, e.g. stereo
    • G01N21/8806 — Investigating the presence of flaws or contamination; specially adapted optical and illumination features
    • G01N21/8851 — Scan or image signal processing specially adapted for detecting defects
    • G01N2021/8854 — Grading and classifying of flaws
    • G01N2021/888 — Marking defects
    • G01N2021/8887 — Defect detection based on image processing techniques
    • Y02P90/30 — Computing systems specially adapted for manufacturing


Abstract

The invention discloses a method for detecting the form of chips in a material tray based on stripe-projection 3D imaging. The imaging stage combines defocused fringe projection with binocular camera calibration to reduce the false-detection and missed-detection rates and to solve production problems such as empty pockets, tilted chips, and stacked chips. A three-step phase-shift algorithm and an unwrapping algorithm produce a refined 3D point cloud, which an error-compensation algorithm and a filtering algorithm refine further. A defect-detection algorithm then screens the 3D image for defects. Four cameras are arranged so that their combined field of view covers the whole tray, so a single acquisition inspects the entire tray, which is faster than 2D inspection schemes. The method meets customer requirements with a newly designed 3D detection scheme that determines whether a product is present in the tray and whether its position in the tray is correct, greatly reducing the false-detection and missed-detection rates.

Description

Detection method for chip shape in material tray based on stripe projection 3D imaging
Technical Field
The invention belongs to the technical field of chip inspection, and in particular relates to a method for detecting the form of chips in a material tray based on stripe-projection 3D imaging.
Background
At present, IC testing, packaging, baking, and similar processes all use material trays as containers. Tray types are numerous, the defects of their contents are varied, and empty pockets, tilted chips, and stacked chips occur routinely. Existing 2D inspection photographs the tray from above, but a top-down optical image can hardly resolve stacked chips, and no single 2D setup covers every defect type.
Because of these shortcomings, existing 2D detection algorithms cannot meet actual inspection requirements: their false-detection and missed-detection rates are high, and customer requirements are not satisfied. Since no domestic 3D inspection equipment currently exists, a 3D detection method was developed to meet customer requirements and to greatly reduce the false-detection and missed-detection rates.
Disclosure of Invention
The invention aims to provide a method for detecting the form of chips in a material tray based on stripe-projection 3D imaging, so as to solve the problems identified in the background above.
To achieve this aim, the invention provides the following technical scheme. The detection method comprises the following steps:
the machine moves the tray under test into the inspection region directly below the optical detection module, and the machine control software sends a detection signal;
the projector in the optical detection module receives the detection signal and projects a set of discrete fringes of different pitches, while the cameras in the optical detection module capture the corresponding fringe images;
the fringe images are analysed with a three-step phase-shift algorithm and an unwrapping algorithm to obtain a raw 3D point cloud;
the raw 3D point cloud is denoised with a point-cloud filtering algorithm and an error-compensation algorithm to obtain a refined 3D point cloud;
the refined 3D point cloud is analysed with a defect-detection algorithm, the form of each product in the tray is determined and checked against the user's requirements, and the detection software outputs a result signal to the machine control software according to the judgment;
the machine control software receives the result signal: on a good-product signal, the machine moves the next product under test into place; on a defective-product signal, the machine raises an alarm, and the abnormal products in the current tray are handled manually.
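The steps above can be sketched as a simple control loop. All function names here are hypothetical placeholders for the patent's modules, not identifiers from the patent:

```python
# Minimal sketch of one inspection cycle as described in the steps above.
# The four stage functions are injected so the loop stays a pure illustration.

def inspect_tray(capture_fringes, reconstruct, denoise, detect_defects):
    """Run one inspection cycle and return the result signal ('OK' or 'NG')."""
    fringes = capture_fringes()        # projector projects, cameras capture
    cloud = reconstruct(fringes)       # three-step phase shift + unwrapping
    cloud = denoise(cloud)             # point-cloud filtering + error compensation
    defects = detect_defects(cloud)    # defect screening against requirements
    return "OK" if not defects else "NG"
```

On "NG" the machine would raise an alarm; on "OK" it would advance to the next product under test.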
Preferably, the machine control software is handler software.
Preferably, the optical detection module comprises a projector and two or more cameras; the projector projects sinusoidal fringes onto the object under test, and the cameras capture the deformed sinusoidal fringes.
Preferably, four cameras are used; all cameras capture images simultaneously under a hardware trigger, and each camera captures 36 images.
Preferably, the projector uses defocused fringe projection: the projector projects a group of fringe patterns, the deformed fringe patterns are observed synchronously from the cameras, and the group of deformed patterns is captured and analysed by the two or more cameras. Because each point on the object surface lies at a different distance from the projection centre, the fringes deform differently; the distance from the object under test to the camera is determined from this deformation, and the 3D phase information is finally obtained.
Preferably, the cameras use binocular camera calibration. The phase always encodes the distance from the projector to the measured object; the projection range is the common field of view in which both cameras can observe the deformed fringes, and only points that the projector can illuminate and both cameras can see yield 3D phase information.
Preferably, the 3D phase information is calculated by using a three-step phase shift algorithm as follows:
I1(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y) − θ),
I2(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y)),
I3(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y) + θ),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, φ(x,y) is the wrapped phase, θ is the phase step between the images, and I1(x,y), I2(x,y), I3(x,y) are the intensities of the corresponding pixel in the three phase-shifted images.
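Solving the three intensity equations above for the wrapped phase gives the standard closed-form result φ = atan2(√3·(I1 − I3), 2·I2 − I1 − I3), valid under the assumption θ = 2π/3 (the patent does not fix θ explicitly). A minimal per-pixel sketch:

```python
import math

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three phase-shifted intensities, assuming the
    phase step theta = 2*pi/3; returns phi(x, y) in (-pi, pi]."""
    return math.atan2(math.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

def shifted_intensities(i0, imod, phi, theta=2.0 * math.pi / 3.0):
    """Forward model of the three equations above for one pixel."""
    return (i0 + imod * math.cos(phi - theta),
            i0 + imod * math.cos(phi),
            i0 + imod * math.cos(phi + theta))
```

Running the forward model and then `wrapped_phase` recovers the original φ, which is how the three fringe captures per pitch are reduced to one phase map.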
Preferably, the 3D point coordinates are calculated with an unwrapping algorithm according to the formula:
I4(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y)),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, φ(x,y) is the wrapped phase, and I4(x,y) represents the actual phase value.
Preferably, the result signal indicates whether a product is present in the tray and whether its position in the tray is correct. When a product is present and its position is detected as normal, the result is an OK signal: the suction nozzle picks up the product under test, moves it to the discharge port, and the machine moves the next product under test into place. When a product is missing, or its position in the tray is abnormal, the result is an NG signal: the machine raises an alarm, and an operator checks manually whether a product is present and whether its position is correct.
The technical effects and advantages of the invention are as follows:
The method for detecting the form of chips in a material tray based on stripe-projection 3D imaging detects whether a product is present in the tray and whether its position in the tray is correct. The imaging stage combines defocused fringe projection with binocular camera calibration to reduce the false-detection and missed-detection rates and to solve production problems such as empty pockets, tilted chips, and stacked chips; a three-step phase-shift algorithm and an unwrapping algorithm produce a refined 3D point cloud, which an error-compensation algorithm and a filtering algorithm refine further. A defect-detection algorithm screens the 3D image for defects. Four cameras are arranged so that their combined field of view covers the whole tray, so a single acquisition inspects the entire tray, which is faster than 2D inspection schemes.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
FIG. 1 is a flow chart of the detection method of the present invention;
FIG. 2 is a schematic diagram of a binocular vision calibration technique in an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a detection device in an embodiment of the invention.
In the figures: 1, projector; 2, mounting bracket; 3, camera; 4, three-dimensional moving module; 5, mounting plate.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides a method for detecting the form of chips in a material tray based on stripe-projection 3D imaging; specifically, it detects whether a product is present in the tray and whether the product's position in the tray is correct.
As shown in fig. 1, it specifically includes the following steps:
the machine moves the tray under test into the inspection region directly below the optical detection module, and the machine control software sends a detection signal;
the projector 1 in the optical detection module receives the detection signal and projects a set of discrete fringes of different pitches, while the cameras 3 in the optical detection module capture the corresponding fringe images;
the fringe images are analysed with a three-step phase-shift algorithm and an unwrapping algorithm to obtain a raw 3D point cloud;
the raw 3D point cloud is denoised with a point-cloud filtering algorithm and an error-compensation algorithm to obtain a refined 3D point cloud;
the refined 3D point cloud is analysed with a defect-detection algorithm, the form of each product in the tray is determined and checked against the user's requirements, and the detection software outputs a result signal to the machine control software according to the judgment;
the machine control software receives the result signal: on a good-product signal, the machine moves the next product under test into place; on a defective-product signal, the machine raises an alarm, and the abnormal products in the current tray are handled manually.
Specifically, the machine moves the product under test into the waiting region directly below the optical detection module, and the machine control software sends a detection signal; the projector 1 in the optical detection module receives the signal and projects 36 discrete fringes of different pitches, while the cameras 3 in the optical detection module capture the corresponding fringe images. The machine control software is handler software.
The optical detection module comprises a projector 1 and two or more cameras 3; the projector 1 projects sinusoidal fringes onto the object under test, and the cameras 3 capture the deformed sinusoidal fringes. In this embodiment four cameras 3 are used, so that the whole tray can be seen in a single acquisition.
In this embodiment, after the projector 1 receives the detection signal from the machine control software, it is controlled to project 36 discrete fringes of different pitches in order to obtain the corresponding 3D phase information. The projector 1 applies defocused fringe projection: it projects a group of discrete fringe patterns of different pitches, and deformed fringe patterns can then be observed synchronously on the object. The deformed patterns are captured and analysed by two or more cameras 3. Because each point on the object surface lies at a different distance from the projection centre, the fringes deform differently; with the binocular cameras 3, the same fringe state images differently in the two cameras 3, and analysing the fringe difference between the two images yields the disparity between corresponding points. The projection range must lie within the common field of view in which both cameras 3 can observe the deformed fringes, and 3D phase information can be obtained only for points that the projector 1 can illuminate and both cameras 3 can see.
The cameras use binocular camera calibration, which determines the positions, angles, spatial coordinates, and related parameters of the two cameras; each camera captures images simultaneously under a hardware trigger, and each camera 3 captures 36 images. In this embodiment, as shown in fig. 2, the cameras 3 use binocular calibration. The phase always encodes the distance from the projector 1 to the measured object, i.e. the same point on the object has the same phase value when viewed by different CCDs. From the same-phase pixels of CCD1 and CCD2, two straight lines can be drawn through the corresponding CCD optical centres; these two lines intersect at the same point on the object, and the coordinates of that point are the desired 3D information. Here CCD denotes a charge-coupled device, i.e. an image sensor: CCD1 is the image sensor of the first camera and CCD2 that of the second. The scheme is not limited to two cameras and may use four cameras 3.
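The two-line intersection described above (rays from same-phase pixels through the two CCD optical centres meeting at a point on the object) can be sketched as a least-squares triangulation. The ray origins and directions would come from the binocular calibration; the values here are illustrative only:

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Intersect two viewing rays (origin c, direction d) in the least-squares
    sense: return the midpoint of the shortest segment between the rays.
    With noise-free matched pixels the rays truly intersect at the 3D point."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimizing |(c1 + t1*d1) - (c2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))
```

For exactly intersecting rays the midpoint is the intersection itself; with real calibration noise it is the best-fit 3D point.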
Further, the acquired 3D phase information is processed with the three-step phase-shift algorithm and the unwrapping algorithm to obtain the raw 3D point cloud.
The 3D phase information contains the 3D point coordinates to be measured; from these, the actual phase value is calculated with the three-step phase-shift algorithm and the unwrapping algorithm, giving a more refined 3D point cloud. Specifically, in this embodiment the phases of the target images differ by θ. The intensities of each pixel in the three images whose phases differ by θ are denoted I1(x,y), I2(x,y), I3(x,y); the three-step phase-shift algorithm is calculated as follows:
I1(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y) − θ),
I2(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y)),
I3(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y) + θ),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, and φ(x,y) is the wrapped phase.
In actual use, a single set of fringes leaves too many erroneous points, so fringes of three different pitches are used. However, the arccos value computed at the end is wrapped, so the actual phase value is calculated with an unwrapping algorithm, described as follows:
I4(x,y) = I0(x,y) + Imod(x,y)·cos(φ(x,y)),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, φ(x,y) is the wrapped phase, and I4(x,y) represents the actual phase value.
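The multi-pitch unwrapping step can be illustrated with a common temporal (coarse-to-fine) phase-unwrapping variant: the phase of a coarser fringe set selects the integer fringe order of the fine wrapped phase. The patent does not spell out its unwrapping formula beyond the expression above, so this is a textbook sketch, not the patent's own algorithm:

```python
import math

def unwrap_with_coarse(phi_fine, phi_coarse, ratio):
    """Temporal phase unwrapping for one pixel: `phi_fine` is the wrapped
    phase of the fine fringes, `phi_coarse` the (already absolute) phase of
    a coarser fringe set, and `ratio` the fine/coarse frequency ratio."""
    order = round((ratio * phi_coarse - phi_fine) / (2.0 * math.pi))
    return phi_fine + 2.0 * math.pi * order
```

With three pitches, the coarsest set (one fringe over the field) anchors the absolute phase and each finer set is unwrapped against the level above it.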
Further, noise is removed from the raw 3D point cloud with the point-cloud filtering algorithm and the error-compensation algorithm, yielding a refined 3D point cloud.
The error-compensation algorithm and the filtering algorithm further refine the 3D image processing. In high-precision measurement, image-processing accuracy is crucial to the overall measurement accuracy, but the camera's discrete sampling of the feature-point image during imaging distorts the image relative to the original signal and thus introduces errors into the image-processing stage. The filtering algorithm adopted is an SOR (statistical outlier removal) filter. Ordinary smoothing filters blur region edges; the SOR filter removes the noise produced by the camera 3 while largely preserving detail, eliminating noise without affecting region edges. Experimental verification shows that the system improved with the SOR filter outperforms systems improved with mean-value filters.
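The SOR step can be sketched as a brute-force statistical outlier removal: keep each point whose mean distance to its k nearest neighbours is within a few standard deviations of the cloud-wide mean. The parameters `k` and `std_ratio` are illustrative assumptions, not values from the patent:

```python
import numpy as np

def sor_filter(points, k=8, std_ratio=2.0):
    """Statistical outlier removal, brute force (O(n^2), fine for small clouds).
    `points` is an (n, 3) array; returns the filtered (m, 3) array."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # ignore self-distances
    knn = np.sort(d, axis=1)[:, :k]        # k smallest distances per point
    mean_d = knn.mean(axis=1)
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]
```

Production point clouds would use a spatial index (k-d tree) instead of the dense distance matrix, but the acceptance rule is the same.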
Further, the refined 3D point cloud is screened for defects with the defect-detection algorithm: unmatched points are marked as bad points, the information of all bad points in the 3D image is collected, the screening results are integrated, it is judged whether the current product is defective, and a result signal is sent once the judgment is complete. In this embodiment the defect-detection algorithm analyses the 3D point cloud, determines the form of each product in the tray, checks it against the user's requirements, and the detection software outputs a result signal to the machine control software according to the judgment.
In this embodiment a defect-detection algorithm screens the defects in the 3D image. Even after a reasonably good point cloud is obtained, a large amount of noise remains and would affect the final detection result, so invalid noise points are first removed with the point-cloud filtering algorithm. Defect detection uses template matching: the three-dimensional information of a good product is stored in advance, the current image is compared step by step with the template image, points whose 3D information does not match are marked as bad, all bad-point information is finally collected, and 2D image-processing algorithms such as contour detection and erosion/dilation judge whether the current image is abnormal.
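The template-matching screening described above can be sketched on a height map: deviations from the stored good-product template beyond a tolerance are marked as bad points, and too many bad points yields an NG judgment. The tolerance and bad-point threshold here are hypothetical values, not from the patent:

```python
import numpy as np

def defect_mask(height, template, tol=0.2):
    """Mark pixels whose measured height deviates from the golden template
    by more than `tol` (an assumed tolerance in height units)."""
    return np.abs(height - template) > tol

def judge_tray(height, template, tol=0.2, max_bad=10):
    """'OK' if the number of unmatched points stays at or below `max_bad`."""
    n_bad = int(defect_mask(height, template, tol).sum())
    return "OK" if n_bad <= max_bad else "NG"
```

A tilted or stacked chip produces a contiguous patch of bad points, which the subsequent contour-detection and erosion/dilation passes would localize.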
Further, the machine control software receives the result signal: on a good-product signal, the machine moves the next product under test into place; on a defective-product signal, the machine raises an alarm, and the abnormal products in the current tray are handled manually. In this embodiment, on an OK signal the machine continues with the next product under test; on an NG signal the machine raises an alarm, and an operator checks manually whether a product is present in the current tray and whether its position is correct.
Specifically, the result signal indicates whether a product is present in the tray and whether its position in the tray is correct. When a product is present and its position is detected as normal, the result is an OK signal: the suction nozzle picks up the product under test, moves it to the discharge port, and the machine moves the next product under test into place. When a product is missing, or its position in the tray is abnormal, the result is an NG signal: the machine raises an alarm, and an operator checks manually whether a product is present and whether its position is correct.
Specifically, after the optical detection software has evaluated the current image, it sends the detection result through an IO-signal control scheme by driving the level of an IO relay board (send signal); the machine control software reads the level of the corresponding relay-board pin (receive signal) and performs the next action.
In this embodiment a customized light source is used, so good 3D data can be acquired. The method is fast, completing defect detection of a whole tray within 3 s. It is compatible with packages such as BGA, QFP, and QFN, with a minimum detectable product size of 3 × 3 mm. It handles common black, red, and blue trays, and tray deformation does not affect detection. Long-term verification at customer sites shows very high stability.
The detection method can also determine the position of each product in the tray and output a mapping chart. Guided by the mapping chart, the suction nozzle picks up the product at the corresponding position; this removes the manual tray-arranging step before feeding, saves labour, and allows production even when the tray is not full.
The detection system corresponding to the detection method of the chip form in the material tray based on stripe projection 3D imaging comprises a projector 1, a mounting bracket 2, a camera 3, a three-dimensional moving module 4 and a machine table; reference may be made specifically to fig. 3, in which:
the projector 1 is positioned right above the object to be measured and used for projecting sinusoidal stripes on the object to be measured;
the optical detection module is arranged on the mounting bracket 2 and is connected with the machine table;
the camera 3 is used for acquiring the deformed sine stripes; four cameras 3 are selected for use by the cameras 3, the whole material tray is paved in the visual field range, and the problem of the whole material tray can be solved through one-time detection. Faster compared to the 2D detection scheme.
The three-dimensional moving module 4 is arranged above the mounting bracket, has a three-dimensional moving function and is used for adjusting the position of the optical detection module;
the machine table is used for moving the object to be detected and is positioned below the optical detection module (not marked in the figure).
The projector 1 is positioned directly above the product to be measured and is mounted on the mounting bracket 2. The mounting bracket 2 is divided into upper and lower layers with a hollowed-out middle: the upper hollow is used for placing the projector 1, and the lower hollow ensures that the projector's light beam is not blocked. The object to be measured is located 400 mm below the projector, and the cameras 3 are located at the four corners of the mounting bracket. A three-dimensional moving module 4 is also arranged on the upper side edge of the mounting bracket 2 and is used for adjusting the position of the optical detection module. The three-dimensional moving module is connected to the mounting bracket 2 through bolts; a mounting plate 5 is arranged between the three-dimensional moving module 4 and the mounting bracket, and the mounting plate 5 prevents loosening and allows height adjustment. The other end of the three-dimensional moving module 4 is mounted on the machine table, and a limiting clamping block is arranged where the three-dimensional moving module 4 contacts the projector, clamping the projector 1 so that the projector 1 and the three-dimensional moving module 4 are fixedly connected.
The beneficial effects of the invention are as follows: the imaging adopts a defocused fringe projection technique and a binocular camera calibration technique to reduce the false detection rate and the missed detection rate, solving problems such as empty material, tilted material and stacked material in production. A three-step phase shift algorithm and an unwrapping algorithm are used to obtain a more complete 3D point cloud image, and an error compensation algorithm and a filtering algorithm further refine the 3D image. A defect detection algorithm screens defects in the 3D picture. Four cameras 3 are provided so that the field of view covers the whole material tray, and one detection pass handles the whole tray; this is faster than a 2D detection scheme.
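The empty-material, tilted-material and stacked-material cases mentioned above can be separated by simple height statistics once the 3D data is available. A minimal illustrative sketch (the chip thickness, thresholds and labels are assumptions, not values from the patent):

```python
import numpy as np

# Hypothetical per-pocket check on a height map (mm above the pocket floor).
# Thresholds are illustrative: a single chip is taken as ~0.8 mm thick here.
CHIP_T, TILT_RANGE = 0.8, 0.3

def classify_pocket(height):
    """Label one pocket's height patch as empty / good / tilted / stacked."""
    mean, spread = float(np.mean(height)), float(np.ptp(height))
    if mean < 0.5 * CHIP_T:
        return "empty"            # no chip thickness detected
    if mean > 1.5 * CHIP_T:
        return "stacked"          # roughly two chip thicknesses
    if spread > TILT_RANGE:
        return "tilted"           # one edge raised out of the pocket
    return "good"

flat = np.full((10, 10), 0.8)
tilt = np.tile(np.linspace(0.6, 1.1, 10), (10, 1))
print(classify_pocket(flat))                    # good
print(classify_pocket(np.zeros((10, 10))))      # empty
print(classify_pocket(np.full((10, 10), 1.6)))  # stacked
print(classify_pocket(tilt))                    # tilted
```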
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments or portions thereof without departing from the spirit and scope of the invention.

Claims (9)

1. A method for detecting chip morphology in a material tray based on stripe projection 3D imaging, characterized in that the detection method comprises the following steps:
the machine table, under control, moves the material tray to be detected to the detection region directly below the optical detection module, and the machine control software sends a detection signal;
a projector in the optical detection module receives a detection signal, the projector projects a plurality of discrete stripes with different thicknesses, and a camera in the optical detection module acquires the corresponding discrete stripes;
analyzing the discrete stripes through a three-step phase shift algorithm and a unwrapping algorithm to obtain an original 3D point cloud image;
carrying out noise reduction processing on the original 3D point cloud image by using a point cloud filtering algorithm and an error compensation algorithm to obtain a perfect 3D point cloud image;
analyzing the 3D point cloud image by using a defect detection algorithm, determining morphological characteristics of each product in the material tray, judging whether the morphological characteristics meet the requirements of a user, and outputting a result signal to the machine control software by the detection software according to a judgment result;
the machine control software receives the result signal; if it is a good-product signal, the machine table continues to move the next product to be tested; if it is a defective-product signal, the machine table sends alarm information, and abnormal products in the current material tray are handled manually.
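The noise-reduction step in the method above is not spelled out; one common point-cloud filter is statistical outlier removal on the height coordinate. A minimal sketch (the filter choice and the 2-sigma threshold are assumptions, not the patent's algorithm):

```python
import numpy as np

# Stand-in for the unspecified point-cloud filtering: drop points whose
# z-value deviates more than k standard deviations from the mean.
def remove_outliers(points, k=2.0):
    """points: (N, 3) array of x, y, z; returns the filtered subset."""
    z = points[:, 2]
    keep = np.abs(z - z.mean()) <= k * z.std()
    return points[keep]

rng = np.random.default_rng(0)
cloud = np.column_stack([rng.random(500), rng.random(500),
                         rng.normal(10.0, 0.01, 500)])
cloud[:5, 2] = 50.0                     # inject five spike artifacts
clean = remove_outliers(cloud)
print(cloud.shape[0] - clean.shape[0])  # 5: all spikes removed
```

A production system would typically use a neighborhood-based filter instead, but the principle of rejecting statistically implausible depth samples is the same.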
2. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the machine control software is handler software.
3. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the optical detection module comprises a projector and two or more cameras, wherein the projector is used for projecting sinusoidal stripes to an object to be detected; and the camera is used for acquiring the deformed sine stripes.
4. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the cameras are four in number, all cameras are hard-triggered simultaneously for image acquisition, and each camera acquires 36 images.
5. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the projector projects a group of stripe patterns using the defocused stripe projection technique; the deformed stripe patterns are observed synchronously from the cameras; the group of deformed stripe patterns is collected and analyzed by two or more cameras; the distance from the object to be detected to the camera is determined from the deformation of the stripes, and finally 3D phase information is obtained.
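In the classic crossed-optical-axes fringe projection geometry, the fringe deformation maps to height through the textbook phase-to-height relation h = L0 * dphi / (dphi + 2*pi*f0*d). This model is not given in the patent; the baseline and fringe frequency below are illustrative, while the 400 mm working distance comes from the description:

```python
import math

def phase_to_height(dphi, L0=400.0, d=120.0, f0=0.1):
    """Textbook crossed-axes phase-to-height model for fringe projection.
    dphi: phase difference vs. the reference plane (radians)
    L0:   camera-to-reference-plane distance (mm); 400 mm per the description
    d:    projector-camera baseline (mm), illustrative value
    f0:   fringe frequency on the reference plane (cycles/mm), illustrative
    """
    return L0 * dphi / (dphi + 2.0 * math.pi * f0 * d)

print(phase_to_height(0.0))  # 0.0: no deformation means the reference plane
```

Larger phase deformation monotonically yields greater height, which is why measuring the deformed stripes suffices to recover the object's distance.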
6. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging as claimed in claim 3 or 5, characterized in that: the cameras adopt a binocular camera calibration technique; the phase always represents the distance from the projector to the measured object, the projection range is the common field of view in which both cameras can observe the deformed stripes, and 3D phase information can be obtained only for points that the projector can illuminate and both cameras can see.
7. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the 3D phase information uses a three-step phase shift algorithm, and the calculation formula is as follows:
I1(x,y)=I0(x,y)+Imod(x,y)cos(φ(x,y)-θ),
I2(x,y)=I0(x,y)+Imod(x,y)cos(φ(x,y)),
I3(x,y)=I0(x,y)+Imod(x,y)cos(φ(x,y)+θ),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, φ(x,y) is the wrapped phase, θ is the phase shift between the target images, and I1(x,y), I2(x,y) and I3(x,y) are the intensities of each pixel in the three target images.
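For the standard choice θ = 2π/3, the three equations above invert in closed form to φ = arctan(√3·(I1 − I3) / (2·I2 − I1 − I3)). A minimal numpy sketch (the specific θ is an assumption; the claim leaves it general):

```python
import numpy as np

# Wrapped-phase recovery from the three phase-shifted images of the claim,
# assuming theta = 2*pi/3:
#   phi(x, y) = atan2(sqrt(3) * (I1 - I3), 2*I2 - I1 - I3)
def wrapped_phase(i1, i2, i3):
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthesize I1, I2, I3 from a known phase map and check the recovery.
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 64)   # ground-truth phase
i0, imod, theta = 128.0, 100.0, 2.0 * np.pi / 3.0
i1 = i0 + imod * np.cos(phi - theta)
i2 = i0 + imod * np.cos(phi)
i3 = i0 + imod * np.cos(phi + theta)
est = wrapped_phase(i1, i2, i3)
print(np.allclose(est, phi))  # True: wrapped phase recovered exactly
```

The arctangent cancels both the background intensity I0 and the modulation Imod, which is why three shifted images are sufficient.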
8. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the formula used by the unwrapping algorithm for the 3D point calculation is as follows:
I4(x,y)=I0(x,y)+Imod(x,y)cos(φ(x,y)),
wherein I0(x,y) is the background intensity of the pixel, Imod(x,y) is the modulation term, Φ(x,y) is the wrapped phase, and I4(x,y) represents the actual phase value.
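The arctangent of the previous claim only yields the phase modulo 2π; unwrapping restores the continuous phase by adding the appropriate multiple of 2π at each jump. A minimal 1-D sketch using numpy's standard unwrap routine (independent of the patent's I4 notation):

```python
import numpy as np

# A continuous phase ramp spanning three full fringe periods (0 to 6*pi).
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)

# Wrapping folds it into (-pi, pi], as the arctan of a phase-shift step does.
wrapped = np.angle(np.exp(1j * true_phase))

# np.unwrap detects jumps larger than pi and adds back multiples of 2*pi.
unwrapped = np.unwrap(wrapped)
print(np.allclose(unwrapped, true_phase))  # True: continuous phase restored
```

Fringe projection systems usually combine several fringe widths (the "discrete stripes with different thicknesses" in claim 1) so that the unwrapping is unambiguous across the whole measurement volume.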
9. The method for detecting the chip morphology in the material tray based on the fringe projection 3D imaging is characterized in that: the result signal comprises whether a product exists in the current material tray and the position of the product in the tray. When a product exists in the tray and its position is detected as normal, the result signal is an OK signal: the suction nozzle picks up the product, moves it to the discharge port, and the machine table continues to move the next product to be tested. When a product is absent from the tray or a product in the tray is abnormal, the signal is NG: the machine table sends alarm information, and whether a product exists in the tray and its position are checked manually.
CN202110801601.2A 2021-07-15 2021-07-15 Method for detecting chip morphology in tray based on stripe projection 3D imaging Active CN113532327B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110801601.2A CN113532327B (en) 2021-07-15 2021-07-15 Method for detecting chip morphology in tray based on stripe projection 3D imaging


Publications (2)

Publication Number Publication Date
CN113532327A true CN113532327A (en) 2021-10-22
CN113532327B CN113532327B (en) 2023-09-12

Family

ID=78128140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110801601.2A Active CN113532327B (en) 2021-07-15 2021-07-15 Method for detecting chip morphology in tray based on stripe projection 3D imaging

Country Status (1)

Country Link
CN (1) CN113532327B (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103983208A (en) * 2014-05-09 2014-08-13 南昌航空大学 Out-of-focus projection three-dimensional measurement method of color binary fringes
JP2015125089A (en) * 2013-12-27 2015-07-06 Jfeスチール株式会社 Surface defect detection method and surface defect detection apparatus
CN105300319A (en) * 2015-11-20 2016-02-03 华南理工大学 Quick three-dimensional reconstruction method based on colorful grating
US20170122878A1 (en) * 2013-12-27 2017-05-04 Jfe Steel Corporation Surface defect detecting method and surface defect detecting apparatus
CN108534714A (en) * 2018-03-09 2018-09-14 南昌航空大学 Based on sinusoidal and binary system fringe projection quick three-dimensional measurement method
CN110672039A (en) * 2019-09-18 2020-01-10 南京理工大学 Object omnibearing three-dimensional measurement method based on plane reflector
CN111062919A (en) * 2019-12-12 2020-04-24 韦士肯(厦门)智能科技有限公司 Bearing ring appearance defect detection method
CN112816493A (en) * 2020-05-15 2021-05-18 奕目(上海)科技有限公司 Chip routing defect detection method and device
CN114485470A (en) * 2022-01-30 2022-05-13 北京理工大学 Speckle-based composite material three-dimensional appearance and defect comprehensive measurement system and method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MAO, CUILI ET AL.: "Review of phase-shifting fringe projection three-dimensional shape measurement techniques", ACTA METROLOGICA SINICA (计量学报) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114199766A (en) * 2021-11-30 2022-03-18 联想(北京)有限公司 Surface defect detection device and detection method
CN114199766B (en) * 2021-11-30 2024-03-22 联想(北京)有限公司 Surface defect detection device and detection method

Also Published As

Publication number Publication date
CN113532327B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
KR101940936B1 (en) Point cloud merging from multiple cameras and sources in three-dimensional profilometry
CN101506614A (en) Method and apparatus for 3-dimensional vision and inspection of ball and like protrusions of electronic components
JP2008292430A (en) Appearance inspecting method and appearance inspecting device
CA2559591A1 (en) Method and system of measuring an object in a digital image
CN107271445B (en) Defect detection method and device
JPH1038533A (en) Instrument and method for measuring shape of tire
CA2321096A1 (en) Automatic inspection system with stereovision
CN112334761A (en) Defect discriminating method, defect discriminating device, defect discriminating program, and recording medium
CN108827597A (en) A kind of the hot spot uniformity detection method and detection system of structured light projection device
CN113532327A (en) Detection method for chip shape in material tray based on stripe projection 3D imaging
Trucco et al. Acquisition of consistent range data using local calibration
CN117346694B (en) Detection method and detection system for composite surface type sample
JP3819597B2 (en) PTP seal inspection device
KR100558325B1 (en) The method and device for 3D inspection by moire and stereo vision
JP2000321039A (en) Apparatus and method for inspecting coating fault
JP3553652B2 (en) Shape measuring device, inspection device, and product manufacturing method
JP7266070B2 (en) Board wiring measurement system and method
JP2008170281A (en) Shape measuring device and shape measuring method
CN211042118U (en) Three-dimensional detection system
CN112710662A (en) Generation method and device, generation system and storage medium
JPH03186706A (en) Three-dimensional shape dimension measuring instrument
JP2001280939A (en) Method of evaluating abnormal condition of object surface
JP4062581B2 (en) Region extraction method for fringe analysis
CN113063352B (en) Detection method and device, detection equipment and storage medium
Munaro et al. Fast 2.5 D model reconstruction of assembled parts with high occlusion for completeness inspection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant