CN111862024A - Workpiece detection system and method based on depth information enhanced projection - Google Patents

Workpiece detection system and method based on depth information enhanced projection

Info

Publication number
CN111862024A
CN111862024A (application CN202010675454.4A)
Authority
CN
China
Prior art keywords
target workpiece
depth information
pseudo
projection
workpiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010675454.4A
Other languages
Chinese (zh)
Inventor
艾佳
苏显渝
李彪
曾吉勇
张召世
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Shenrui Vision Technology Co ltd
Nanchang Virtual Reality Institute Co Ltd
Original Assignee
Sichuan Shenrui Vision Technology Co ltd
Nanchang Virtual Reality Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Shenrui Vision Technology Co ltd, Nanchang Virtual Reality Institute Co Ltd filed Critical Sichuan Shenrui Vision Technology Co ltd
Priority to CN202010675454.4A priority Critical patent/CN111862024A/en
Publication of CN111862024A publication Critical patent/CN111862024A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the application discloses a workpiece detection system and method based on depth information enhanced projection. The system comprises: a projection unit, comprising at least two projection devices, for projecting a structured light pattern and a pseudo-color pattern onto a target workpiece alternately in time sequence; an acquisition unit for acquiring, in real time, the structured light pattern projected onto the target workpiece by the projection unit and reflected by the target workpiece; and a processing unit for calculating depth information of the target workpiece from the structured light pattern, generating the pseudo-color pattern from the depth information, and sending the pseudo-color pattern to the projection unit. Projecting the structured light pattern and the pseudo-color pattern onto the target workpiece with multiple projection devices expands the projection field of view and enhances the workpiece detection display effect.

Description

Workpiece detection system and method based on depth information enhanced projection
Technical Field
The application belongs to the technical field of computer vision, and particularly relates to a workpiece detection system and method based on depth information enhanced projection.
Background
Pseudo-color coding is the process of assigning colors to a monochrome image: each gray value is given a color according to a predetermined rule. A pseudo-color pattern is a pattern obtained in this way.
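As a minimal illustration of such an encoding rule (a hypothetical rainbow-style rule for illustration only, not necessarily the one used in this application), each gray value can be mapped to an RGB triple:

```python
def pseudo_color(gray):
    """Map a gray value in [0, 255] to an (R, G, B) triple.

    Hypothetical rule: low gray values map toward blue, mid values
    toward green, high values toward red, so each gray level carries
    a distinct hue that is easier to perceive than raw intensity.
    """
    if gray < 85:            # dark region -> blue dominant
        return (0, gray * 3, 255)
    elif gray < 170:         # mid region -> green dominant
        return (0, 255, 255 - (gray - 85) * 3)
    else:                    # bright region -> red dominant
        return ((gray - 170) * 3, 255 - (gray - 170) * 3, 0)
```

Applying this rule pixel by pixel to a monochrome image yields a pseudo-color pattern.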
Pseudo-color pattern projection has already been studied and applied in industry, transportation, and medicine, and how to apply it to the field of workpiece detection is one of the hot spots of current research.
Disclosure of Invention
In view of the above problems, the present application provides a workpiece detection system and method based on depth information enhanced projection to address them.
In a first aspect, an embodiment of the present application provides a workpiece detection system based on depth information enhanced projection, the system comprising: a projection unit, comprising at least two projection devices, for projecting a structured light pattern and a pseudo-color pattern onto a target workpiece alternately in time sequence; an acquisition unit for acquiring, in real time, the structured light pattern projected onto the target workpiece by the projection unit and reflected by the target workpiece; and a processing unit for calculating depth information of the target workpiece from the structured light pattern, generating the pseudo-color pattern from the depth information, and sending the pseudo-color pattern to the projection unit.
In a second aspect, the present application provides a workpiece detection method based on depth information enhanced projection, applied to a workpiece detection system based on depth information enhanced projection that comprises a projection unit, an acquisition unit, and a processing unit. The method comprises: the projection unit emits a structured light pattern toward a target workpiece; the acquisition unit acquires, in real time, the structured light pattern reflected by the target workpiece; the processing unit calculates depth information of the target workpiece from the structured light pattern and generates a pseudo-color pattern from the depth information; and the projection unit projects the structured light pattern and the pseudo-color pattern onto the target workpiece alternately in time sequence.
The embodiment of the application provides a workpiece detection system and method based on depth information enhanced projection. The projection unit comprises at least two projection devices for projecting a structured light pattern and a pseudo-color pattern onto a target workpiece alternately in time sequence; the acquisition unit acquires, in real time, the structured light pattern projected onto the target workpiece by the projection unit and reflected by the target workpiece; and the processing unit calculates depth information of the target workpiece from the structured light pattern, generates the pseudo-color pattern from the depth information, and sends it to the projection unit. Projecting the structured light pattern and the pseudo-color pattern onto the target workpiece with multiple projection devices expands the projection field of view and enhances the workpiece detection display effect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The following drawings show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a workpiece detection system based on depth information enhanced projection according to an embodiment of the present application;
FIG. 2 is a block diagram of a workpiece inspection system based on depth information enhanced projection according to an embodiment of the present application;
FIG. 3 is a timing diagram of the elements of a workpiece inspection system based on depth information enhanced projection according to an embodiment of the present application;
FIG. 4 is a block diagram of a workpiece inspection system based on depth information enhanced projection according to another embodiment of the present application;
fig. 5 is a schematic view illustrating a scene of workpiece detection based on depth information enhanced projection according to an embodiment of the present application;
fig. 6 is a flowchart of a workpiece detection method based on depth information enhanced projection according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
Pseudo-color coding assigns colors to a monochrome image: each gray value is given a color according to a specified rule, and the resulting pseudo-color pattern is presented to the user, improving the user's perception of the real world. Pseudo-color pattern projection has been studied and applied in industry, transportation, medicine, education, and other fields.
During research on related pseudo-color pattern projection display devices, the inventors found that such devices consist of a single projector and a single depth camera: a structured light pattern is projected onto a free-form surface, a depth image is obtained by three-dimensional measurement, and pseudo-color projection display is thereby realized. The limitation of such devices is that the projection field of view of a single projector is small, and when occlusion or shadows appear on the target workpiece, the display effect of the pseudo-color pattern is degraded.
Therefore, the inventors propose the workpiece detection system and method based on depth information enhanced projection of the present application: a projection unit emits a structured light pattern toward a target workpiece, an acquisition unit acquires the structured light pattern reflected by the target workpiece in real time, and a processing unit calculates depth information of the target workpiece from the structured light pattern and then generates a pseudo-color pattern from the depth information. Finally, the structured light pattern and the pseudo-color pattern are projected onto the target workpiece by at least two projection devices, expanding the projection field of view and enhancing the workpiece detection display effect.
As shown in fig. 1, a workpiece detection system based on depth information enhanced projection acquires the structured light pattern reflected by a target workpiece and generates a pseudo-color pattern for workpiece detection. The projection unit can project structured light patterns of different densities and/or shapes, and can also project a pseudo-color pattern.
As one approach, the projection unit may be a visible light projection device. Alternatively, the projection unit may be an infrared laser module whose light source is a VCSEL array laser projecting an infrared pattern.
The embodiments of the present application do not limit the specific type of structured light pattern projected by the projection unit. The structured light pattern may include point structured light, line structured light, and area structured light such as grating stripes, speckle, and the like. When the projection unit projects a structured light pattern, the pattern is modulated by the height of the target workpiece; the modulated structured light is collected by the acquisition unit and transmitted to the processing unit for analysis and calculation, yielding three-dimensional surface shape data of the target workpiece surface.
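For instance, a single frame of sinusoidal grating stripes, one common form of area structured light, can be generated as sketched below (an illustration only; the actual pattern density and shape are chosen per application):

```python
import math

def sinusoidal_fringe(width, height, freq):
    """Generate one frame of vertical sinusoidal grating stripes.

    Returns a height x width grid of intensities in [0, 1]; `freq` is
    the number of fringe periods across the frame. When projected, the
    workpiece height deforms these stripes, and that deformation is
    what the acquisition unit records for depth calculation.
    """
    return [[0.5 + 0.5 * math.cos(2 * math.pi * freq * x / width)
             for x in range(width)]
            for _ in range(height)]
```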
The embodiments of the present application do not limit the specific light source of the projection unit; the structured light pattern projected by the projection unit is collected by a corresponding acquisition unit. For example, the pattern projected by an infrared projection unit is collected by an infrared image acquisition device, and the pattern projected by a visible light projection unit is collected by a visible light image acquisition device.
The acquisition unit keeps a certain baseline distance from the projection unit. It may be an image sensor that records the wavelength of the pattern emitted by the projection unit, used to capture images of the projected structured light pattern, and may comprise a photosensitive element, an optical filter, a lens, and the like. The acquisition unit should match the type of the light source: if the light source of the projection unit is infrared, the acquisition unit is an infrared image acquisition device; if the light source is visible light, the acquisition unit is a visible light image acquisition device, and so on. The embodiments of the present application do not limit the positional relationship between the acquisition unit and the projection unit; for example, the projection unit may be arranged horizontally and project horizontally, with the acquisition unit placed at the same horizontal height.
The processing unit is connected to the acquisition unit and processes the structured light pattern reflected by the target workpiece and captured by the acquisition unit: it calculates the depth information of the target workpiece from the captured pattern, generates a pseudo-color pattern from the depth information, and sends it to the projection unit. The platform of the processing unit may be one of an ASIC, FPGA, or DSP; besides processing the captured structured light pattern, it may also control the projection of the projection unit and the pattern capture of the acquisition unit. Optionally, the processing unit may include a controller for control, for example via synchronous and asynchronous timing circuits, and may also include a depth processor for acquiring the depth information.
The units in the system can be independent from each other or integrated together. For example, the system may be an electronic device such as a mobile phone, a tablet computer, a notebook computer, etc. which integrates a projection unit, an image acquisition unit, a storage unit, and a processing unit.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 2, an embodiment of the present application provides a workpiece inspection system 100 for enhancing projection based on depth information, where the system 100 includes:
the projection unit 110 comprises at least two projection devices for projecting the structured light pattern and the pseudo-color pattern onto the target workpiece in a time-series staggered manner.
It should be noted that the projection unit 110 may emit invisible light to the target workpiece, such as emitting a laser beam by using a laser light source; optionally, the projection unit 110 may also emit visible light to the target workpiece.
By one approach, the plurality of projection devices in the projection unit 110 are arranged so that, when projecting simultaneously, the target workpiece is free of shadows and occlusion and the projection areas overlap one another, with each overlapping area covered by more than 2 and fewer than 4 projectors. The projectors may be Digital Light Processing (DLP) projectors.
Optionally, in one period, the sum of the time for which the projection unit 110 projects the structured light pattern is less than the time for which the projection unit 110 projects the pseudo-color pattern.
As shown in fig. 3, fig. 3 is a timing diagram of the units in the workpiece detection system 100 according to the embodiment of the present application. The plurality of projectors in the projection unit 110 project patterns onto the target workpiece in a time-staggered manner, and the timing of the acquisition unit 120 is fully matched to that of the projection unit 110, so that the structured light pattern reflected by the target workpiece is captured. The content projected by the projectors can be arbitrarily encoded, as follows: the projectors emit the structured light pattern and the pseudo-color pattern alternately in time sequence. At one moment, a structured light pattern is encoded and projected onto the target workpiece; after the acquisition unit 120 captures it, depth information can be obtained, and the target workpiece can then be identified using the depth information alone or the structured light pattern combined with the depth information. At the next moment, a pseudo-color pattern is encoded and projected onto the target workpiece; thereafter, the cycle repeats in this manner.
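The alternating projection described above can be sketched as a simple time-slot schedule (an illustrative model; slot durations are hypothetical, chosen only to reflect that the total structured light time per cycle is shorter than the pseudo-color time):

```python
def interleaved_schedule(n_cycles, t_struct=1, t_pseudo=3):
    """Time-slot schedule for the projection unit.

    Each cycle projects one structured light frame (for depth
    acquisition) followed by one pseudo-color frame (for display).
    Returns (start_time, frame_type, duration) tuples; the acquisition
    unit would be synchronized to the structured light slots.
    """
    schedule = []
    t = 0
    for _ in range(n_cycles):
        schedule.append((t, "structured_light", t_struct))
        t += t_struct
        schedule.append((t, "pseudo_color", t_pseudo))
        t += t_pseudo
    return schedule
```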
An acquisition unit 120 for acquiring in real time the structured light pattern projected onto and reflected by the target workpiece by the projection unit 110.
As one way, the collecting unit 120 may be a Charge Coupled Device (CCD) camera, which may collect in real time a structured light pattern reflected back from a Digital Light Processing (DLP) projector projected onto a target workpiece.
The processing unit 130 is configured to calculate depth information of the target workpiece according to the structured light pattern, obtain the pseudo-color pattern according to the depth information, and send the pseudo-color pattern to the projection unit 110.
By way of example, the processing unit 130 is configured to establish a mapping between the pixels of the projection unit 110 and the pixels of the acquisition unit 120 using an inverse fringe projection technique. Optionally, the processing unit 130 may also use a planar calibration technique based on fringe projection to establish a phase-depth mapping table within the measurement range, from which the depth information of the target workpiece can be obtained. The mapping between projector and camera pixels established by inverse fringe projection is, in essence, the modulation by the target workpiece of the light transmitted between the pixel array of the acquisition unit 120 (a charge-coupled device (CCD) camera) and that of the projection unit 110 (a Digital Light Processing (DLP) projector). The process can be described as:
I_CCD(i, j) = f(I_pro(l, m))
where l and m are the pixel coordinates of the projection unit 110 (DLP projector) and I_pro is the light intensity it projects; i and j are the pixel coordinates of the acquisition unit 120 (CCD camera) and I_CCD is its response intensity; f is the transfer function between the two arrays, which establishes the mapping between the camera pixels and the projector pixels.
In one approach, based on a depth acquisition technique such as a high-precision triangulation phase-shift technique, binocular stereo vision, or a TOF depth camera, a mapping between height and phase is established in advance in non-volatile memory according to the structural parameters of the system. The projectors in the projection unit 110 each emit a structured light pattern toward the target workpiece, and the acquisition unit 120 captures the structured light pattern reflected by the target workpiece. Using a three-frequency phase unwrapping algorithm and the established phase-height mapping (which can be understood as the phase-depth mapping), the processing unit 130 calculates the depth information of the target workpiece with the following formula:
h = l·Δφ / (Δφ + 2π·f·d)

where h is the depth of the target workpiece, l is the distance from the axis center of the acquisition unit 120 to the target workpiece, Δφ is the phase difference, d is the distance between the projection unit 110 and the acquisition unit 120, and f is the fundamental frequency of the structured light.
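Assuming the standard triangulation phase-height relation h = l·Δφ / (Δφ + 2π·f·d) is intended here (the exact sign convention depends on system calibration), the per-pixel depth calculation can be sketched as:

```python
import math

def depth_from_phase(delta_phi, l, d, f):
    """Phase-to-height mapping for fringe projection triangulation.

    delta_phi : unwrapped phase difference at the pixel
    l         : distance from the acquisition unit's axis center
                to the target workpiece
    d         : baseline distance between projection and acquisition units
    f         : fundamental frequency of the structured light fringes
    """
    return l * delta_phi / (delta_phi + 2 * math.pi * f * d)
```

Zero phase difference corresponds to a point on the reference plane (h = 0), and the depth grows monotonically with the unwrapped phase difference.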
The processing unit 130 stores the depth information of the target workpiece calculated from the structured light pattern in a database, and may store a plurality of pseudo-color patterns corresponding to the stored depth information in the database.
In related applications, the processing unit 130 may query the database, using the obtained depth information of the target workpiece, for matching depth information, obtain the corresponding pseudo-color pattern or the workpiece type corresponding to that depth information, and send the pseudo-color pattern to the projection unit 110 for projection. For example, when the system 100 is applied to a workpiece warehouse, the depth information of all workpieces in the warehouse may be stored in the database in advance, with several pseudo-color patterns matched to each depth record. After the system 100 obtains the depth information of the current workpiece, it may project the matching pseudo-color pattern found in the database, or it may determine the type and model of the current workpiece from the depth information and project a pseudo-color pattern matched to that type and model. The pseudo-color pattern is encoded from depth information, and the depth information is defined in the fixed reference frame established during calibration: if the position of the target workpiece in this frame is fixed, the pseudo-color pattern does not change, and the system 100 can project the matching pattern found in the database; if the target workpiece moves, the pseudo-color pattern changes. Therefore, provided the positions of the target workpiece, the projection unit, and the acquisition unit remain relatively fixed, the system 100 can, after acquiring the depth information of the current target workpiece, project the matching pseudo-color pattern found in the database.
In the workpiece detection system based on depth information enhanced projection, the acquisition unit acquires, in real time, the structured light pattern projected onto the target workpiece by the projection unit and reflected by the target workpiece; the processing unit calculates the depth information of the target workpiece from the structured light pattern, generates a pseudo-color pattern from the depth information, and sends it to the projection unit; and the projection unit, comprising at least two projection devices, projects the structured light pattern and the pseudo-color pattern onto the target workpiece alternately in time sequence. Projecting the structured light pattern and the pseudo-color pattern onto the target workpiece with multiple projection devices enlarges the projection field of view and optimizes the projection display effect.
Further, as shown in fig. 4, the workpiece detection system 100 for enhancing projection based on depth information further includes an instruction unit 140, where the instruction unit 140 is configured to project an optical indicator or issue an instruction to any position of the target workpiece.
In one approach, to implement human-computer interaction and make the system more engaging, an instruction unit 140 may be added to the depth information enhanced projection-based workpiece detection system 100. The instruction unit 140 may include an internal instruction unit 141 and an external instruction unit 142: the internal instruction unit 141 receives instructions through devices such as a mouse or keyboard, while the external instruction unit 142 issues instructions carrying an optical marker, such as a pattern, a voice command, or a gesture. The two parts may cooperate with each other or issue instructions independently.
The depth information enhanced projection-based workpiece detection system 100 in the embodiment of the present application may project a pseudo-color pattern corresponding to the depth information of a target workpiece, wherein the target workpiece may include a complete target workpiece or a defective target workpiece.
As one mode, the processing unit 130 is specifically configured to obtain depth information and two-dimensional profile information of the target workpiece according to the structured light pattern reflected by the target workpiece, generate the pseudo-color pattern according to the depth information and the two-dimensional profile information of the target workpiece, and send the pseudo-color pattern to the projection unit 110 to be projected on the target workpiece.
Specifically, the depth information of the target workpiece can be understood as three-dimensional information of the target workpiece, two-dimensional profile information of the target workpiece can be obtained based on the obtained three-dimensional information, and a corresponding pseudo-color pattern of the target workpiece can be generated by combining the three-dimensional information and the two-dimensional profile information of the target workpiece.
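Generating a pseudo-color pattern from a depth map together with its two-dimensional contour can be sketched as below (pixels outside the contour are left black; the blue-to-red ramp is a hypothetical rule, since the application does not fix a specific color mapping):

```python
def depth_to_pseudo_color(depth_map, d_min, d_max):
    """Encode a depth map as a pseudo-color image.

    depth_map is a grid of depth values, with None marking pixels
    outside the workpiece's two-dimensional contour. Depth is
    normalized over the measurement range [d_min, d_max] (d_max must
    exceed d_min) and mapped to a blue-to-red ramp: far points blue,
    near points red; contour-exterior pixels stay black.
    """
    span = d_max - d_min
    image = []
    for row in depth_map:
        out_row = []
        for h in row:
            if h is None:                  # outside the 2-D contour
                out_row.append((0, 0, 0))
            else:
                t = (h - d_min) / span     # 0 = far plane, 1 = near plane
                r = int(255 * t)
                out_row.append((r, 0, 255 - r))
        image.append(out_row)
    return image
```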
Further, the processing unit 130 is further specifically configured to calculate depth information of the target workpiece according to the structured light pattern, and obtain a pseudo-color pattern corresponding to the depth information according to a user requirement.
It can be understood that the user requirement may be to display the target workpiece according to the current requirement of the user on the target workpiece, or to detect the target workpiece, and the corresponding pseudo-color patterns with different colors may be generated according to the use condition of the target workpiece by the user.
Optionally, the processing unit 130 is further configured to acquire audio information corresponding to the depth information or the acquired pseudo color information, and play the audio information.
Specifically, the audio information may be commentary or music corresponding to the projected pseudo-color pattern. The processing unit 130 may project pseudo-color patterns of different styles according to the user's selection and play the commentary or music corresponding to the currently projected pattern. For example, in automobile sales, the processing unit 130 may play audio commentary on complex automobile parts, or matching music, while a customer is choosing a car. Likewise, during workpiece detection, the acquired pseudo-color information can be used to announce whether the workpiece is incomplete or complete.
As another mode, the processing unit 130 is further specifically configured to calculate depth information of the target workpiece according to the structured light pattern, determine a defect position of the target workpiece according to the depth information, and determine a corresponding pseudo-color pattern according to the depth information and the defect position.
Further, the processing unit 130 is configured to perform difference operation on the depth information of the target workpiece obtained through the calculation and the depth information of the target workpiece without defect pre-stored in the database, and if the difference is greater than a preset threshold, determine a defect position of the target workpiece according to the difference.
It can be understood that the preset threshold is the difference value beyond which the target workpiece is judged defective. A storage area may be partitioned for the processing unit 130 and a database established in it, storing the depth information and two-dimensional profile information of various defect-free workpieces. After obtaining the depth information of the current target workpiece, the processing unit 130 compares it with the depth information of the corresponding defect-free workpiece pre-stored in the database; further, a difference operation may be performed between the three-dimensional information and two-dimensional profile information of the current target workpiece and those of the corresponding defect-free workpiece pre-stored in the database. If the difference exceeds the preset threshold, the current target workpiece is judged to be a defective workpiece. Further, the processing unit 130 may perform feature extraction on the three-dimensional information of the current target workpiece to obtain the specific defect position.
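The difference operation against the defect-free reference can be sketched as follows (illustrative only; the real system compares three-dimensional and two-dimensional profile information rather than a bare depth grid):

```python
def locate_defects(depth, reference, threshold):
    """Compare measured depth against the defect-free reference depth.

    Returns the (row, col) coordinates where |measured - reference|
    exceeds the preset threshold. A non-empty result marks the
    workpiece as defective, and the coordinates give the defect
    positions for subsequent feature extraction.
    """
    defects = []
    for i, (row, ref_row) in enumerate(zip(depth, reference)):
        for j, (h, h_ref) in enumerate(zip(row, ref_row)):
            if abs(h - h_ref) > threshold:
                defects.append((i, j))
    return defects
```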
Illustratively, fig. 5 shows a target workpiece; it should be understood that the target workpiece itself has no pattern and no color. The acquisition unit 120 captures the structured light pattern reflected by the target workpiece; the processing unit 130 calculates the depth information of the target workpiece from the reflected pattern, obtains its three-dimensional information and two-dimensional profile information, and checks whether the complete depth information of this workpiece is stored in the database. If so, a difference operation is performed between the calculated depth information and the complete depth information stored in the database; if the difference exceeds the preset threshold, the target workpiece is judged defective. The processing unit 130 then performs feature extraction on the calculated depth information to obtain the specific defect position of the target workpiece. Further, the processing unit 130 may generate a corresponding pseudo-color pattern from the depth information of the target workpiece; the optional pseudo-color patterns may be patterns of different colors, designs, or textures stored in advance in the established database.
Further, after determining the specific defect position of the target workpiece, the processing unit 130 may be configured to calculate depth information of the defect position of the target workpiece according to the structured light pattern, and obtain a corresponding pseudo-color pattern according to the depth information of the defect position of the target workpiece, so as to perform enhanced display on the defect position of the target workpiece.
As one mode, the processing unit 130 is configured to determine the defect type of the target workpiece by looking up a database or by machine learning, and obtain a corresponding pseudo-color pattern according to the defect type of the target workpiece.
It will be appreciated that the depth information of different types of workpieces, as well as the defect types of different workpieces, may be pre-stored in the database. The processing unit 130 may determine the workpiece type of the target workpiece by searching the database for a workpiece whose depth information matches that of the target workpiece, and further determine whether the target workpiece is defective. If the target workpiece is a defective target workpiece, its defect type may be determined by querying the defect types stored in the database in advance. Alternatively, if no matching defect type is found in the database, the processing unit 130 may store the defect type of the current target workpiece as a new workpiece defect type.
Further, the processing unit 130 is configured to generate pseudo-color patterns with different colors according to the depth information of the defective target workpiece and the defective position of the defective target workpiece.
It is understood that, after determining that the target workpiece is a defective target workpiece, the processing unit 130 may generate pseudo-color patterns of different color depths according to the defect position of the defective target workpiece, so as to perform enhanced display on that position. Illustratively, a dark color may be used at the defect position of the defective target workpiece, and a light-colored pseudo-color pattern may be used at the intact positions.
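The dark-at-defect, light-elsewhere coloring just described can be sketched as a simple mask-to-color mapping. The specific RGB values and the boolean-mask convention below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the enhanced display: map a boolean defect mask to an
# RGB pseudo-color image, dark red at defective pixels and light gray at
# intact pixels, so the defect stands out when projected back onto the part.

DARK_RED   = (139, 0, 0)       # assumed color for defect positions
LIGHT_GRAY = (220, 220, 220)   # assumed color for intact positions

def pseudo_color(defect_mask):
    """Map a 2-D boolean defect mask to an RGB pseudo-color image."""
    return [[DARK_RED if px else LIGHT_GRAY for px in row] for row in defect_mask]

mask = [[False, False],
        [False, True]]   # one defective pixel at (1, 1)
image = pseudo_color(mask)
print(image[1][1])  # → (139, 0, 0), the enhanced defect pixel
```

In practice the dark/light split could be replaced by any color map keyed to the depth difference at each pixel.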
Referring to fig. 5, a method for detecting a workpiece based on depth information enhanced projection according to an embodiment of the present application includes:
step S210: the projection unit emits a structured light pattern toward a target workpiece.
As one mode, the projection unit includes at least two projection devices, wherein one projection device may be used to project the structured light pattern and another to project the pseudo-color pattern, or both projection devices may be used to project both the structured light pattern and the pseudo-color pattern.
Further, the structured light pattern emitted by the projection unit to the target workpiece can be a visible light pattern or an invisible light pattern. The visible light pattern may include: binary images, grayscale images, color images, and the like; the invisible light pattern may include: infrared laser images, etc.
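As a hedged illustration of one of the visible patterns listed above, a grayscale structured light pattern is commonly a sinusoidal fringe image. The width, fringe period, and phase value below are arbitrary example parameters, not specified by the patent.

```python
import math

# Illustrative sketch: one row of an 8-bit sinusoidal fringe pattern, a common
# grayscale structured light image. Parameters are made-up examples.

def fringe_row(width, period, phase=0.0):
    """Intensities 0..255 of one row of a sinusoidal fringe pattern."""
    return [round(127.5 * (1 + math.cos(2 * math.pi * x / period + phase)))
            for x in range(width)]

row = fringe_row(width=8, period=8)
print(row[0], row[4])  # → 255 0 (peak at phase 0, trough half a period later)
```

Binary patterns are the thresholded special case of the same idea, and color patterns assign different fringes to the R, G, and B channels.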
Step S220: the acquisition unit acquires the structured light pattern reflected by the target workpiece in real time.
As one mode, the acquisition unit acquires the structured light pattern reflected by the target workpiece in real time: when the acquisition unit detects an instruction indicating that the projection unit is projecting the structured light pattern to the target workpiece, it starts to acquire the reflected structured light pattern in real time.
Alternatively, a timer may be set in the acquisition unit to specify when the acquisition unit starts acquiring the structured light pattern reflected by the target workpiece; once this time is set, the acquisition unit performs the acquisition operation according to the time set by the timer.
Step S230: the processing unit calculates the depth information of the target workpiece based on the structured light pattern, and obtains the pseudo-color pattern according to the depth information.
As one mode, the processing unit establishes a mapping relationship between the pixels of the projection unit and those of the acquisition unit through a reverse fringe projection technique. Optionally, the processing unit may further use a planar calibration technique based on fringe projection to establish a phase-depth mapping table within the measurement range, and the depth information of the target workpiece may then be obtained from this mapping table.
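The phase-depth mapping table mentioned above can be sketched as a lookup with linear interpolation: during calibration, known depths are recorded against measured phase values, and at measurement time the depth is interpolated from the table. The calibration pairs below are made-up illustrative numbers.

```python
import bisect

# Hypothetical (phase, depth) pairs from a plane calibration, phase ascending.
PHASE_DEPTH_TABLE = [(0.0, 0.0), (1.0, 5.0), (2.0, 10.0), (3.0, 15.0)]

def phase_to_depth(phase):
    """Linearly interpolate depth from the calibration table, clamping at the ends."""
    phases = [p for p, _ in PHASE_DEPTH_TABLE]
    i = bisect.bisect_right(phases, phase)
    if i == 0:
        return PHASE_DEPTH_TABLE[0][1]
    if i == len(PHASE_DEPTH_TABLE):
        return PHASE_DEPTH_TABLE[-1][1]
    (p0, d0), (p1, d1) = PHASE_DEPTH_TABLE[i - 1], PHASE_DEPTH_TABLE[i]
    return d0 + (d1 - d0) * (phase - p0) / (p1 - p0)

print(phase_to_depth(1.5))  # → 7.5 (midway between the 1.0 and 2.0 calibration points)
```

Applying this per pixel to an unwrapped phase map yields the depth map of the target workpiece within the calibrated measurement range.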
In the embodiment of the application, the projection unit and the acquisition unit are related through the mapping and splicing relationships established by the reverse fringe projection technique, and the target workpiece modulates the light transmitted between the projection unit and the pixel array of the acquisition unit. The mapping relationship between the projection unit and the pixel array of the acquisition unit may be established not only by a phase unwrapping algorithm, but also by intensity, color, gray scale, and the like. The phase unwrapping algorithm may include a temporal phase unwrapping algorithm, a spatio-temporal phase unwrapping algorithm, and the like.
Alternatively, the acquisition unit may be a different type of camera. A second camera may be added so that the depth information of the target workpiece is acquired using binocular stereo vision, or the depth information of the target workpiece may be acquired directly using a TOF depth camera.
Optionally, the measurement field of view for the depth information of the target workpiece is directly related to the size of the field of view of the acquisition unit, and the field of view of the projection unit should approximately match the measurement field of view. For example, if the target workpiece is small, a single depth camera suffices to cover the measurement range; however, drastic changes in the depth of the target workpiece may cause occlusion during measurement, so several projectors are needed to cover the workpiece from multiple directions to eliminate occlusion and shadow. In this case the depth camera and the projectors form several three-dimensional measurement devices, and a pixel mapping relationship must be established between the depth camera and each projector before the projections can be spliced and fused; to eliminate occlusion and shadow, the overlapping area of the multiple projections is inevitably large. If, instead, the measurement field of view of a large scene is insufficient, several depth cameras and projectors are required to cover it, and the multiple projections must likewise be spliced and fused; here the overlapping area should be kept as small as possible to reduce the splicing and fusion workload.
Step S240: the projection unit projects the structured light pattern and the pseudo-color pattern to the target workpiece in an interlaced manner in time series.
As one mode, the structured light pattern projected by the projection unit is matched with the pseudo-color pattern: if the structured light pattern is visible light, the pseudo-color pattern is also visible light; if the structured light pattern is invisible light, additional equipment is provided to display the pseudo-color pattern.
Optionally, if only one target workpiece needs to be scanned, the target workpiece may be scanned sequentially by a plurality of projectors to obtain its depth information, the pseudo-color pattern is then generated in real time from the depth information, and the operation is repeated for the next target workpiece.
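The time-series interleaving of step S240 can be sketched as alternating measurement frames and display frames in one projection sequence. The frame labels below are illustrative placeholders only; the patent does not specify a frame format.

```python
# Hypothetical sketch of step S240: alternate structured-light frames (used
# for measurement) with pseudo-color frames (used for enhanced display) in
# time order, producing one interleaved projection sequence.

def interleave_frames(structured, pseudo_color):
    """Alternate measurement and display frames in time order."""
    sequence = []
    for s, p in zip(structured, pseudo_color):
        sequence.extend([s, p])
    return sequence

frames = interleave_frames(["SL0", "SL1"], ["PC0", "PC1"])
print(frames)  # → ['SL0', 'PC0', 'SL1', 'PC1']
```

Because measurement and display frames alternate rapidly, the depth information can be refreshed while the pseudo-color overlay remains visually continuous on the workpiece.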
According to the workpiece detection method based on depth information enhanced projection provided by the embodiment of the application, the projection unit emits the structured light pattern to the target workpiece, the acquisition unit acquires the structured light pattern reflected by the target workpiece in real time, the processing unit calculates the depth information of the target workpiece based on the structured light pattern and obtains the pseudo-color pattern according to the depth information, and the projection unit projects the structured light pattern and the pseudo-color pattern to the target workpiece in an interlaced manner in time series. By this method, projecting the structured light pattern and the pseudo-color pattern onto the target workpiece with a plurality of projection devices expands the projection field of view and optimizes the projection display effect.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A workpiece inspection system for enhancing projection based on depth information, the system comprising:
the projection unit comprises at least two projection devices and is used for projecting a structured light pattern and a pseudo-color pattern to a target workpiece in an interlaced manner in time series;
the acquisition unit is used for acquiring the structured light pattern which is projected to the target workpiece by the projection unit and reflected by the target workpiece in real time;
and the processing unit is used for calculating the depth information of the target workpiece according to the structured light pattern, obtaining the pseudo-color pattern according to the depth information and sending the pseudo-color pattern to the projection unit.
2. The system of claim 1, wherein the processing unit is specifically configured to obtain depth information and two-dimensional profile information of the target workpiece according to the structured light pattern reflected by the target workpiece, generate the pseudo-color pattern according to the depth information and the two-dimensional profile information of the target workpiece, and send the pseudo-color pattern to a projection unit to be projected on the target workpiece.
3. The system of claim 1, wherein the processing unit is specifically configured to calculate depth information of the target workpiece according to the structured light pattern, and obtain a pseudo-color pattern corresponding to the depth information according to a user requirement.
4. The system of claim 1, wherein the processing unit is further configured to obtain audio information corresponding to the depth information, and play the audio information.
5. The system of claim 1, wherein the processing unit is further configured to calculate depth information for the target workpiece based on the structured light pattern, determine a defect location for the target workpiece based on the depth information, and determine a corresponding pseudo-color pattern based on the depth information and the defect location.
6. The system of claim 5, wherein the processing unit is configured to perform a difference operation on the calculated depth information of the target workpiece and depth information of a non-defective target workpiece pre-stored in the database, and determine a defect position of the target workpiece according to the difference if the difference is greater than a preset threshold.
7. The system of claim 5, wherein the processing unit is specifically configured to calculate depth information of a defect position of the target workpiece according to the structured light pattern, and obtain a corresponding pseudo-color pattern according to the depth information of the defect position of the target workpiece, so as to enhance and display the defect position of the target workpiece.
8. The system of any of claims 5 to 7, wherein the processing unit is configured to determine a defect type of the target workpiece by searching a database or by machine learning, and obtain a corresponding pseudo-color pattern according to the defect type of the target workpiece.
9. The system of claim 8 wherein the processing unit is configured to generate pseudo-color patterns of different colors based on depth information of the defective target workpiece and a defect location of the defective target workpiece.
10. A workpiece detection method based on depth information enhanced projection, applied to a workpiece detection system based on depth information enhanced projection, the system comprising a projection unit, an acquisition unit and a processing unit, the method comprising:
the projection unit emits a structured light pattern toward a target workpiece;
the acquisition unit acquires the structured light pattern reflected by the target workpiece in real time;
the processing unit calculates the depth information of the target workpiece based on the structured light pattern, and obtains a pseudo-color pattern according to the depth information;
the projection unit projects the structured light pattern and the pseudo-color pattern to the target workpiece in an interlaced manner in time series.
CN202010675454.4A 2020-07-14 2020-07-14 Workpiece detection system and method based on depth information enhanced projection Pending CN111862024A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010675454.4A CN111862024A (en) 2020-07-14 2020-07-14 Workpiece detection system and method based on depth information enhanced projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010675454.4A CN111862024A (en) 2020-07-14 2020-07-14 Workpiece detection system and method based on depth information enhanced projection

Publications (1)

Publication Number Publication Date
CN111862024A true CN111862024A (en) 2020-10-30

Family

ID=72983894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010675454.4A Pending CN111862024A (en) 2020-07-14 2020-07-14 Workpiece detection system and method based on depth information enhanced projection

Country Status (1)

Country Link
CN (1) CN111862024A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107067428A (en) * 2017-03-10 2017-08-18 深圳奥比中光科技有限公司 Augmented reality projection arrangement and method
CN107578397A (en) * 2017-07-25 2018-01-12 西南交通大学 A kind of novel non-contact abrasion of contact wire detection method
CN107708624A (en) * 2015-06-12 2018-02-16 智能眼睛有限公司 Blind person or visually impaired people is allowed to understand the portable system of surrounding environment by sound or tactile
WO2019023625A1 (en) * 2017-07-27 2019-01-31 Invuity, Inc. Projection scanning system
US20190269333A1 (en) * 2016-11-22 2019-09-05 Provincial Health Services Authority Dual mode biophotonic imaging systems and their applications for detection of epithelial dysplasia in vivo
CN111083453A (en) * 2018-10-18 2020-04-28 中兴通讯股份有限公司 Projection device, method and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107708624A (en) * 2015-06-12 2018-02-16 智能眼睛有限公司 Blind person or visually impaired people is allowed to understand the portable system of surrounding environment by sound or tactile
US20190269333A1 (en) * 2016-11-22 2019-09-05 Provincial Health Services Authority Dual mode biophotonic imaging systems and their applications for detection of epithelial dysplasia in vivo
CN107067428A (en) * 2017-03-10 2017-08-18 深圳奥比中光科技有限公司 Augmented reality projection arrangement and method
CN107578397A (en) * 2017-07-25 2018-01-12 西南交通大学 A kind of novel non-contact abrasion of contact wire detection method
WO2019023625A1 (en) * 2017-07-27 2019-01-31 Invuity, Inc. Projection scanning system
AU2018306730A1 (en) * 2017-07-27 2020-02-13 Invuity, Inc. Projection scanning system
CN111083453A (en) * 2018-10-18 2020-04-28 中兴通讯股份有限公司 Projection device, method and computer readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHANGSOO JE 等: "Multi-Projector color structured-light vision", 《SIGNAL PROCESSING:IMAGE COMMUNICATION》, vol. 28, no. 9, pages 1046 - 1058, XP028723638, DOI: 10.1016/j.image.2013.05.005 *
侯颖: "基于投影的三维打印模型纹理着色", 《中国优秀硕士学位论文全文数据库 信息科技辑》, pages 138 - 1199 *
王国新;汝洪芳;朱显辉;: "基于彩色伪随机编码结构光的三维重建方法", 《黑龙江科技大学学报》, vol. 28, no. 04, pages 415 - 418 *

Similar Documents

Publication Publication Date Title
US10902668B2 (en) 3D geometric modeling and 3D video content creation
US7103212B2 (en) Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
US9392262B2 (en) System and method for 3D reconstruction using multiple multi-channel cameras
US10547822B2 (en) Image processing apparatus and method to generate high-definition viewpoint interpolation image
US20120176478A1 (en) Forming range maps using periodic illumination patterns
US20120176380A1 (en) Forming 3d models using periodic illumination patterns
US20130335535A1 (en) Digital 3d camera using periodic illumination
CN107734267B (en) Image processing method and device
US20160086341A1 (en) System and method for adaptive depth map reconstruction
CN107610080B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium
JP2001116526A (en) Three-dimensional shape measuring instrument
KR20230065978A (en) Systems, methods and media for directly repairing planar surfaces in a scene using structured light
KR20100112853A (en) Apparatus for detecting three-dimensional distance
CN112184793B (en) Depth data processing method and device and readable storage medium
CN114858086A (en) Three-dimensional scanning system, method and device
JP2011237296A (en) Three dimensional shape measuring method, three dimensional shape measuring device, and program
CN113011206A (en) Handheld scanner and scanning method thereof
US9752870B2 (en) Information processing apparatus, control method thereof and storage medium
JP2001194126A (en) Apparatus and method for measuring three-dimensional shape and program providing medium
CN111862024A (en) Workpiece detection system and method based on depth information enhanced projection
WO2019219687A1 (en) Using time-of-flight techniques for stereoscopic image processing
JP5375479B2 (en) Three-dimensional measurement system and three-dimensional measurement method
CN112750157B (en) Depth image generation method and device
JP2023088061A (en) Three-dimensional model generation apparatus, three-dimensional model generation method, and three-dimensional model generation program
US20200234458A1 (en) Apparatus and method for encoding in structured depth camera system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination