CN112837327B - Image processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number: CN112837327B
Authority: CN (China)
Prior art keywords: image; projection; binarized; calibration pattern; envelope
Legal status: Active (granted)
Application number: CN202110122036.7A
Other languages: Chinese (zh)
Other versions: CN112837327A
Inventor: 程星凯
Current Assignee: Beijing Tricolor Technology Co., Ltd
Original Assignee: Beijing Tricolor Technology Co., Ltd
Application filed by Beijing Tricolor Technology Co., Ltd
Priority to: CN202110122036.7A
Publication of application: CN112837327A
Publication of grant: CN112837327B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/20: Image enhancement or restoration using local operators
    • G06T5/30: Erosion or dilatation, e.g. thinning
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G06T7/136: Segmentation; Edge detection involving thresholding
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image processing method and apparatus, an electronic device, and a storage medium, relating to the field of image processing. The method includes the following steps: obtaining a binarized image based on a first projection image and a second projection image, where the first projection image is an image of a detection surface onto which a first calibration pattern is projected, the second projection image is an image of the detection surface onto which a second calibration pattern is projected, and the colors of the two calibration patterns are complementary; obtaining a dominant phase map based on a phase-shift grating image, which is an image of the detection surface onto which a phase-shift grating pattern is projected; removing part of the invalid region in the binarized image based on the dominant phase map; and extracting the effective area from the binarized image with part of the invalid region removed, obtaining an effective-area image. The image processing method provided by the application restores image information well; together with the phase-shift image, it improves image accuracy to a certain extent and reduces the influence of noise during image acquisition.

Description

Image processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing, and more particularly, to an image processing method, an image processing apparatus, an electronic device, and a storage medium.
Background
Image-based three-dimensional model reconstruction is an important research direction in computer vision. Compared with two-dimensional image information, a three-dimensional model is more realistic and can present more information. Image processing for structured-light grating projection is therefore important to three-dimensional model reconstruction.
In existing image processing for structured-light grating projection, a directly set threshold is mostly used to binarize the acquired phase-shift grating image, and the effective grating region is extracted from the binarization result. In a complex environment or under strong surrounding interference, this approach extracts the effective grating region poorly: the accuracy of the extracted effective region is low, and excessive noise is introduced during decoding, which is unfavorable for three-dimensional reconstruction, image mapping, and other work.
Disclosure of Invention
An object of embodiments of the present invention is to provide an image processing method, an image processing apparatus, an electronic device, and a storage medium, to solve the problem of poor extraction when an effective grating region is extracted from an acquired image in a complex environment or under strong interference.
In a first aspect, the application provides an image processing method, including the following steps: obtaining a binarized image based on a first projection image and a second projection image, where the first projection image is an image of a detection surface onto which a first calibration pattern is projected, the second projection image is an image of the detection surface onto which a second calibration pattern is projected, and the colors of the two calibration patterns are complementary; obtaining a dominant phase map based on a phase-shift grating image, which is an image of the detection surface onto which a phase-shift grating pattern is projected; removing part of the invalid region in the binarized image based on the dominant phase map; and extracting the effective area from the binarized image with part of the invalid region removed, obtaining an effective-area image.
In the embodiment of the application, the first projection image is an image of the detection surface onto which the first calibration pattern is projected, the second projection image is an image of the detection surface onto which the second calibration pattern is projected, and the colors of the two calibration patterns are complementary. Obtaining the binarized image from these two images therefore removes the influence of environmental factors, so that a relatively good extraction effect can be achieved in the subsequent effective-area extraction. In addition, obtaining the dominant phase map from the phase-shift grating image and using it to remove part of the invalid region in the binarized image further improves the subsequent extraction of the effective area.
In an embodiment, the obtaining a binarized image based on the first projection image and the second projection image includes: performing differential calculation on the first projection image and the second projection image to obtain a differential image; and binarizing the differential image based on the gray average value of the differential image to obtain the binarized image.
In the embodiment of the application, a differential image is obtained by performing a difference calculation on the first projection image and the second projection image; the gray mean of the differential image is then computed; and the differential image is binarized using this gray mean as the dividing point, yielding the binarized image. Compared with the prior-art approach of binarizing with a directly set threshold, this has an advantage: because the first and second projection images are images of the detection surface onto which complementary-color calibration patterns are projected, using the gray mean of their differential image as the dividing point reduces, to a certain extent, the influence of environmental factors on the image information, and thus improves the accuracy of the subsequently extracted image information.
In one embodiment, before removing part of the invalid region in the binarized image based on the dominant phase map, the method further includes: filtering the binarized image to obtain a filtered image.
In the embodiment of the application, before part of the invalid region of the binarized image is removed using the dominant phase map of the phase-shift grating, a filtering operation is applied to the binarized image to obtain a filtered image, reducing noise in the binarization result as far as possible.
In an embodiment, removing part of the invalid region in the binarized image based on the dominant phase map includes removing it by the following formula:

P_remove(i,j) = P̃_th(i,j) if P_phase(i,j) holds a valid phase value, and P_remove(i,j) = 0 otherwise,

where P_remove(i,j) represents the binarized image after removal of the invalid region, P_phase(i,j) represents the dominant phase map, P̃_th(i,j) represents the filtered image, and (i, j) represents the current pixel coordinate position.
In an embodiment, extracting the effective area based on the binarized image with part of the invalid region removed includes: performing morphological dilation on the binarized image with part of the invalid region removed to obtain a dilated image; extracting the peripheral contour of each single calibration point in the dilated image to obtain a contour point set image; obtaining the peripheral envelope of each single calibration point in the contour point set image to obtain an envelope point set image; screening out the envelope points whose area is within a preset area range and is the largest in the envelope point set image to obtain a screened envelope point image; and filling the screened envelope point image to extract the effective area.
In the embodiment of the application, morphological dilation is first performed on the binarized image with part of the invalid region removed to obtain a dilated image. The peripheral contour of each single calibration point in the dilated image is then extracted to obtain a contour point set image, and the peripheral envelope of each single calibration point in the contour point set image is computed to obtain an envelope point set image. Next, the envelope points with the largest area within the preset area range are screened out of the envelope point set image. Finally, the screened envelope point image with the largest area is filled, the effective area is extracted, and the effective image of the target area is obtained, providing a basis for subsequent three-dimensional reconstruction, image mapping, and other work.
In a second aspect, the present application provides an image processing apparatus including: a processing module configured to obtain a binarized image based on a first projection image and a second projection image, where the first projection image is an image of a detection surface onto which a first calibration pattern is projected, the second projection image is an image of the detection surface onto which a second calibration pattern is projected, and the colors of the two calibration patterns are complementary, the processing module being further configured to obtain a dominant phase map based on a phase-shift grating image, which is an image of the detection surface onto which a phase-shift grating pattern is projected; a rejection module configured to remove part of the invalid region in the binarized image based on the dominant phase map; and an extraction module configured to extract the effective area based on the binarized image with part of the invalid region removed, obtaining an effective-area image.
In an embodiment, the processing module is further configured to perform differential computation on the first projection image and the second projection image to obtain a differential image; and binarizing the differential image based on the gray average value of the differential image to obtain the binarized image.
In an embodiment, the processing module is further configured to perform filtering processing on the binarized image to obtain a filtered image.
In one embodiment, the rejection module is further configured to remove part of the invalid region in the binarized image by the following formula:

P_remove(i,j) = P̃_th(i,j) if P_phase(i,j) holds a valid phase value, and P_remove(i,j) = 0 otherwise,

where P_remove(i,j) represents the binarized image after removal of the invalid region, P_phase(i,j) represents the dominant phase map, P̃_th(i,j) represents the filtered image, and (i, j) represents the current pixel coordinate position.
In an embodiment, the extraction module is further configured to: perform morphological dilation on the binarized image with part of the invalid region removed to obtain a dilated image; extract the peripheral contour of each single calibration point in the dilated image to obtain a contour point set image; obtain the peripheral envelope of each single calibration point in the contour point set image to obtain an envelope point set image; screen out the envelope points whose area is within a preset area range and is the largest in the envelope point set image to obtain a screened envelope point image; and fill the screened envelope point image to extract the effective area.
In a third aspect, an embodiment of the present application provides an electronic device including a memory and a processor connected to each other; the memory is used to store a program, and the processor is configured to invoke the program stored in the memory to perform the method of the first aspect and/or any possible implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium having stored thereon a computer program which, when executed by a computer, performs the method of the first aspect and/or any possible implementation of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a structured light device according to an embodiment of the present application.
Fig. 2 is a schematic working diagram of a structured light device according to an embodiment of the present application.
Fig. 3 is a flowchart of an image processing method according to an embodiment of the present application.
FIG. 4 is a first calibration pattern according to an embodiment of the present application.
FIG. 5 is a second calibration pattern according to an embodiment of the present application.
Fig. 6 is a block diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 7 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application.
Reference numerals: structured light device 10; projection module 11; camera module 12; processor 13; memory 14; image processing apparatus 20; processing module 21; rejection module 22; extraction module 23.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
Image-based three-dimensional model reconstruction is an important research direction in computer vision. Compared with two-dimensional image information, a three-dimensional model is more realistic and can present more information. Image processing for structured-light grating projection is therefore important to three-dimensional model reconstruction.
In existing image processing for structured-light grating projection, a directly set threshold is mostly used to binarize the acquired phase-shift grating image, and the effective grating region is extracted from the binarization result. In a complex environment or under strong surrounding interference, this approach extracts the effective grating region poorly: the accuracy of the extracted effective region is low, and excessive noise is introduced during decoding, which is unfavorable for three-dimensional reconstruction, image mapping, and other work.
The embodiments of the present application provide an image processing method and apparatus, an electronic device, and a storage medium, for improving the accuracy of the effective domain of an acquired image when the effective grating region is extracted in a complex environment or under strong surrounding interference, so that the image can be better used in subsequent work.
The following detailed description is provided with reference to the accompanying drawings.
Referring to fig. 1 and 2, an embodiment of the present application provides a structured light apparatus 10. The structured light device 10 may include a projection module 11, a camera module 12, a processor 13, a memory 14, and the like. The projection module 11, the camera module 12, the processor 13 and the memory 14 may be connected via a communication bus. The projection module 11 is used for projecting a pattern onto the surface of an object to be projected. The camera module 12 is used for acquiring an image of the surface of the object to be projected, on which a corresponding pattern is projected. Stored within memory 14 are computer readable instructions. The processor 13 is configured to invoke the instructions of the memory 14 to implement the image processing method provided in the present application.
It will be appreciated that in other embodiments, the projection module 11 and the camera module 12 may be separately disposed, and in this case, the projection module 11 may be a projector, and the camera module 12 may be a camera or a video camera.
Referring to fig. 3, an image processing method according to an embodiment of the present application may be applied to the aforementioned structured light apparatus 10.
In this embodiment, the image processing method may include the following steps.
Step S11, obtaining a binarized image based on the first projection image and the second projection image. The first projection image is an image of the detection surface projected with a first calibration pattern; the second projection image is an image of the detection surface projected with a second calibration pattern; the first calibration pattern is complementary in color to the second calibration pattern.
It will be appreciated that the method further comprises the step of acquiring a first projection image and a second projection image prior to step S11. Alternatively, the first projection image and the second projection image may be acquired in advance by the camera module and stored in the memory, and the processor may directly acquire the first projection image and the second projection image. Or the first projection image and the second projection image can be acquired in real time through the camera module and transmitted to the processor for subsequent processing.
Specifically, the acquisition of the first projection image and the second projection image can be achieved in the following manner.
First, the projection module is controlled to project the first calibration pattern onto the surface of the object to be projected (i.e., the detection surface described above). The camera module is then controlled to acquire a first projection image of that surface with the first calibration pattern projected on it, and the acquired first projection image is stored in the memory. The projection module is then controlled to project the second calibration pattern onto the surface, and the camera module is controlled to acquire a second projection image of the surface with the second calibration pattern projected on it.
In this embodiment, the first calibration pattern and the second calibration pattern have the same pattern content and are complementary in color. Specifically, the first calibration pattern and the second calibration pattern respectively comprise a plurality of dots arranged in an array; in the two patterns, the sizes of the dots and the arrangement mode of the dots are the same, the distances between the adjacent dots are the same, but the colors of the dots are complementary, and the colors of gaps between the dots are also complementary. For example, the dots of the first calibration pattern in FIG. 4 are white in color and the gaps are black in color; the dots of the second calibration pattern in fig. 5 are black in color and the gaps are white in color.
It will be appreciated that in other embodiments, the dot and gap colors may be other contrasting colors (e.g., dot bright, gap dark; dot warm, gap cool, etc.), and the application is not limited to black and white.
It is understood that the dots in the first calibration pattern and the second calibration pattern in other embodiments may be patterned in other shapes (e.g. triangle, quadrangle, pentagon, etc.), which is not limited in this application.
In this embodiment, the dots at the peripheral boundary of the first calibration pattern and the second calibration pattern are not complete dots but semicircles, whose straight diameter edges coincide with the corresponding side of the calibration pattern.
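As an illustration of how such a pair of complementary calibration patterns could be generated, the following is a minimal Python/NumPy sketch (the function and parameter names are illustrative and not from the disclosure; for simplicity it draws only complete dots, without the semicircular boundary dots described above):

```python
import numpy as np

def make_dot_pattern(rows, cols, radius, pitch, inverted=False):
    # Build a dot-array calibration pattern as an 8-bit grayscale image:
    # white dots on a black background; inverted=True yields the
    # complementary pattern (black dots on white).
    h, w = rows * pitch, cols * pitch
    yy, xx = np.mgrid[0:h, 0:w]
    img = np.zeros((h, w), dtype=np.uint8)
    for r in range(rows):
        for c in range(cols):
            cy = r * pitch + pitch // 2
            cx = c * pitch + pitch // 2
            img[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 255
    return 255 - img if inverted else img

first_pattern = make_dot_pattern(4, 5, radius=8, pitch=24)
second_pattern = make_dot_pattern(4, 5, radius=8, pitch=24, inverted=True)
```

By construction every pixel of the two patterns sums to 255, which is the complementarity the difference calculation in step S11 relies on.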
In this embodiment, step S11 may include the following substeps.
Step S101: Perform a difference calculation on the first projection image (denoted P_bright) and the second projection image (denoted P_black) to obtain a differential image (denoted P_diff), with the relation:
P_diff = P_bright − P_black
Specifically, the pixels of the first projection image correspond one-to-one with those of the second projection image. The difference calculation subtracts, pixel by pixel, the gray value in the second projection image from the gray value in the first projection image; the value of each pixel in the differential image is this difference.
Step S102: Calculate the mean gray value of the differential image P_diff, obtaining the gray mean V_avg.
Specifically, the gray values of all pixels of P_diff are summed and then averaged to obtain V_avg.
Step S103: Using the gray mean V_avg as the dividing point, binarize the differential image P_diff to obtain the binarized image P_th.
It is understood that the specific process of binarization is well known in the art and will not be described here.
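Substeps S101–S103 can be sketched in Python/NumPy as follows (a minimal illustration assuming 8-bit grayscale images; the function name is illustrative):

```python
import numpy as np

def binarize_by_difference(p_bright, p_black):
    # Step S101: per-pixel difference of the two projection images
    # (signed arithmetic, so dark-minus-bright does not wrap around).
    p_diff = p_bright.astype(np.int16) - p_black.astype(np.int16)
    # Step S102: mean gray value of the differential image.
    v_avg = p_diff.mean()
    # Step S103: threshold the differential image at the mean.
    return np.where(p_diff > v_avg, 255, 0).astype(np.uint8)

# Tiny synthetic example: complementary bright/dark pixel pairs.
p_bright = np.array([[200, 10], [220, 20]], dtype=np.uint8)
p_black = np.array([[10, 200], [20, 220]], dtype=np.uint8)
p_th = binarize_by_difference(p_bright, p_black)
```

Because the threshold is derived from the differential image itself rather than fixed in advance, a uniform change in ambient lighting shifts both projection images together and largely cancels in P_diff.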
Step S12, obtaining a main phase diagram based on the phase shift grating image.
It will be appreciated that prior to step S12, the method further comprises the step of acquiring a phase-shifted grating image.
In particular, depending on the image application requirements, the user may control the projection module 11 to project a phase-shift grating pattern onto the surface of the object to be projected (i.e., the detection surface described above), and obtain, via the camera module 12, the phase-shift grating image corresponding to the projected pattern (i.e., the image of the detection surface onto which the phase-shift grating pattern is projected). It will be appreciated that the phase-shift grating pattern may be a sine phase-shift grating pattern, a cosine phase-shift grating pattern, or the like.
In this embodiment, after the phase-shift grating image is obtained, step S12 may specifically be: obtain the dominant phase map P_phase of the phase-shift grating image by analyzing the phase-shift grating image.
The specific process of image analysis is a well-known technique in the art, and will not be described here.
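As one common way to obtain a dominant (wrapped) phase map, the following sketch assumes a standard four-step phase-shift sequence with shifts of 0, π/2, π, and 3π/2; the disclosure does not fix the number of steps, so this choice is an assumption made for illustration:

```python
import numpy as np

def dominant_phase(i0, i1, i2, i3):
    # Four frames with I_n = A + B*cos(phi + n*pi/2) give
    # I3 - I1 = 2*B*sin(phi) and I0 - I2 = 2*B*cos(phi), so the
    # wrapped phase is recovered as atan2(I3 - I1, I0 - I2).
    return np.arctan2(i3.astype(float) - i1, i0.astype(float) - i2)

# Synthetic check: generate four frames for a known phase field.
phi = np.linspace(-3, 3, 7)
frames = [100 + 50 * np.cos(phi + n * np.pi / 2) for n in range(4)]
recovered = dominant_phase(*frames)
```

The recovered phase is wrapped to (−π, π]; in practice it would be unwrapped or combined with a coding scheme before use.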
And step S13, eliminating part of invalid areas in the binarized image based on the main phase diagram.
It will be appreciated that, prior to step S13, the method further comprises performing filtering on the binarized image P_th to obtain a filtered image (denoted P̃_th). The filtering reduces noise in the image binarization result as far as possible. Optionally, the filtering may be median filtering; of course, other suitable filtering methods may also be used, and the application is not limited in this respect.
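For the median-filter option, the filtering step can be sketched as a pure-NumPy 3×3 median filter (illustrative; in practice a library routine would normally be used):

```python
import numpy as np

def median3x3(img):
    # 3x3 median filter: gather the nine neighbourhood values per
    # pixel (edge-replicated padding) and take their median.
    padded = np.pad(img, 1, mode='edge')
    h, w = img.shape
    stack = np.stack([padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)])
    return np.median(stack, axis=0).astype(img.dtype)

noisy = np.zeros((5, 5), dtype=np.uint8)
noisy[2, 2] = 255          # isolated salt pixel from binarization
cleaned = median3x3(noisy)
```

An isolated noise pixel is outvoted by its eight zero neighbours, which is exactly the binarization noise this step targets.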
In this embodiment, in step S13, a part of invalid regions in the binarized image subjected to the filtering process is removed based on the dominant phase map, and step S13 may be specifically implemented as follows.
Part of the invalid region in the binarized image is removed by the following formula:

P_remove(i,j) = P̃_th(i,j) if P_phase(i,j) holds a valid phase value, and P_remove(i,j) = 0 otherwise,

where P_remove(i,j) represents the binarized image after removal of the invalid region, P_phase(i,j) represents the dominant phase map of the phase-shift grating image, P̃_th(i,j) represents the filtered image, and (i, j) represents the pixel coordinate position of the current image.
In this embodiment, by removing a part of the invalid region in the binarized image by using the dominant phase map, the extraction effect in the subsequent effective region extraction can be improved to a certain extent.
It can be understood that other applicable manners may be used to remove a part of the invalid region in the binary image, which is not limited in this application.
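A minimal sketch of this rejection step, assuming invalid pixels in the dominant phase map are marked with a sentinel value (here 0, an assumption made for illustration):

```python
import numpy as np

def remove_invalid(p_filtered, p_phase, invalid_value=0.0):
    # Keep a filtered-binarized pixel only where the dominant phase
    # map holds a valid phase; zero it everywhere else.
    valid = p_phase != invalid_value
    return np.where(valid, p_filtered, 0).astype(p_filtered.dtype)

p_filtered = np.array([[255, 255], [0, 255]], dtype=np.uint8)
p_phase = np.array([[1.2, 0.0], [0.7, -2.1]])
p_remove = remove_invalid(p_filtered, p_phase)
```

The top-right pixel survives binarization but carries no valid phase, so it is rejected as part of the invalid region.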
And S14, extracting an effective area based on the binarized image with part of the ineffective area removed, and obtaining an effective area image.
In this embodiment, the step S14 specifically includes the following substeps.
Step S401: based on the rejected image P remove Morphological dilation is performed to obtain a dilated image (denoted as P dilate ). Wherein the specific implementation of morphological dilation is well known to those skilled in the art and is not described hereinAnd will be described in detail.
Step S402: extracting the inflated image P dilate Is a single-index-point peripheral contour in the image, and a contour point set image (expressed as P is obtained outline ). Specifically, edge extraction techniques may be employed to achieve extraction of single-index point peripheral contours. The edge extraction technique is well known to those skilled in the art, and is not described here in detail.
Step S403: obtaining an outline point set image P outline The peripheral envelope of each single calibration point in (a) to obtain an envelope point set image (expressed as P hull ). Specifically, a Graham scanning method can be used to find the peripheral envelope of each calibration point. The specific implementation of the Graham scanning method is well known to those skilled in the art, and will not be described in detail herein.
Step S404: screening out the envelope point set image P hull The envelope point with the middle area in the preset area range and the largest area is obtained, and the screened envelope point image (expressed as P f_hull )。
Wherein step S404 may comprise the sub-steps of:
step S501: setting and selecting envelope point set image P hull Is defined by the area range of (a).
Specifically, the selected area range includes an upper limit area and a lower limit area. It can be appreciated that the setting of the area range can be adjusted according to the actual application scenario. In this embodiment, the upper limit area is 0.9 times the image area of the envelope point set, and the lower limit area is 0.3 times the image area of the envelope point set.
Step S502: and screening the image area of the envelope point set based on the set area range.
Specifically, the area of each envelope point set in the envelope point set image is compared with the selected area range, and the envelope point sets smaller than the lower limit area and larger than the upper limit area are removed to obtain the screened envelope point set image.
Step S503: and obtaining an envelope point image based on the screened envelope point set image.
The envelope point set with the largest area is extracted from the screened envelope point set image, giving the envelope point image P_f_hull.
It will be appreciated that the screened envelope point set image may contain several envelope point sets; the one with the largest area is selected.
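Substeps S501–S503 can be sketched as follows, using the shoelace formula for envelope area (the vertex-list representation and function names are illustrative):

```python
def polygon_area(hull):
    # Shoelace formula for the area enclosed by an envelope given
    # as an ordered list of (x, y) vertices.
    s = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def screen_hulls(hulls, image_area, lower=0.3, upper=0.9):
    # Steps S501-S503: discard envelopes whose area lies outside
    # [lower, upper] x image_area, then keep the largest remaining one.
    kept = [h for h in hulls
            if lower * image_area <= polygon_area(h) <= upper * image_area]
    return max(kept, key=polygon_area) if kept else None

small = [(0, 0), (1, 0), (1, 1), (0, 1)]      # area 1: below the lower limit
large = [(0, 0), (10, 0), (10, 5), (0, 5)]    # area 50: inside the range
best = screen_hulls([small, large], image_area=100)
```

With image_area = 100 the accepted band is [30, 90], so the small envelope is rejected and the large one is kept as P_f_hull.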
Step S405: for the envelope point image P after screening f_hull Filling is carried out, and extraction of the effective area is achieved.
Based on the envelope point image P f_hull Filling, namely obtaining a final extracted effective area image P after filling according to the fact that the inner point is 255 and the outer point is 0 mask . The specific implementation of image filling is well known to those skilled in the art, and will not be described in detail here.
Here, the envelope points form a closed contour; the pixel points inside the closed contour are defined as internal points, wherein the contour itself also belongs to the internal points, and the pixel points outside the closed contour are defined as external points.
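The filling of step S405 can be sketched for a convex envelope with plain NumPy, marking internal points (contour included) as 255 and external points as 0. The per-edge cross-product rasterization below is an illustrative choice, not the patent's implementation, and assumes the hull vertices are ordered consistently (counter-clockwise in standard (x, y) axes, as in the test):

```python
import numpy as np

def fill_convex_hull(hull, shape):
    """Rasterize a convex envelope into the mask image P_mask:
    internal points (contour included) become 255, external points 0."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    mask = np.ones(shape, dtype=bool)
    n = len(hull)
    for k in range(n):
        x0, y0 = hull[k]
        x1, y1 = hull[(k + 1) % n]
        # A pixel is internal only if it lies on the non-negative
        # cross-product side of every edge of the convex hull.
        cross = (x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0)
        mask &= cross >= 0
    return np.where(mask, 255, 0).astype(np.uint8)
```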
In the embodiment of the application, the first projection image is an image of the detection surface with the first calibration pattern projected thereon, the second projection image is an image of the detection surface with the second calibration pattern projected thereon, and the colors of the first calibration pattern and the second calibration pattern are complementary. Therefore, obtaining the binarized image based on the first projection image and the second projection image eliminates the influence of environmental factors, so that a better extraction effect can be achieved when the effective region extraction is performed subsequently. In addition, the extraction effect of the subsequent effective region extraction can be further improved by obtaining the main phase map based on the phase shift grating image and using the main phase map to remove part of the invalid region in the binarized image.
Referring to fig. 6, an image processing apparatus 20 is also provided in the embodiment of the present application based on the same inventive concept. In this embodiment, the image processing apparatus may include a processing module 21, a rejection module 22 and an extraction module 23.
The processing module 21 is configured to obtain a binarized image based on a first projection image and a second projection image, where the first projection image is an image with a first calibration pattern projected on a detection surface, and the second projection image is an image with a second calibration pattern projected on the detection surface, and the first calibration pattern and the second calibration pattern are complementary in color. In this embodiment, the processing module 21 is further configured to obtain a dominant phase map based on a phase shift grating image, where the phase shift grating image is an image of the detection surface projected with the phase shift grating pattern. And the rejecting module 22 is configured to reject a part of the invalid region in the binarized image based on the dominant phase map. And the extracting module 23 is configured to extract an effective area based on the binarized image after removing a part of the ineffective area, so as to obtain an effective area image.
In this embodiment of the present application, the processing module 21 is further configured to perform differential computation on the first projection image and the second projection image to obtain a differential image; and binarizing the differential image based on the gray average value of the differential image to obtain the binarized image. The processing module 21 is further configured to perform filtering processing on the binarized image to obtain a filtered image.
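The difference-then-threshold behaviour of the processing module can be sketched as follows. This is an illustrative NumPy reading of the description, assuming single-channel 8-bit inputs and an absolute difference; the function name is not from the patent:

```python
import numpy as np

def binarize_difference(first_proj, second_proj):
    """Differential computation on the two complementary projection
    images, then binarization at the difference image's own gray mean."""
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = np.abs(first_proj.astype(np.int16)
                  - second_proj.astype(np.int16)).astype(np.uint8)
    mean_gray = diff.mean()                      # gray average value
    binarized = np.where(diff > mean_gray, 255, 0).astype(np.uint8)
    return diff, binarized
```

Because the two calibration patterns are complementary, the difference is large exactly where the projection lands, so the mean of the difference image is a natural adaptive threshold.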
In the embodiment of the present application, the rejection module 22 is further configured to reject part of the invalid region in the binarized image by the following formula:

P_remove(i,j) = M_filter(i,j), if P_phase(i,j) ≠ 0; P_remove(i,j) = 0, if P_phase(i,j) = 0,

wherein P_remove(i,j) represents the binarized image after removal of the invalid region, P_phase(i,j) represents the main phase map, M_filter(i,j) represents the filtered image, and (i, j) represents the current pixel coordinate position.
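The rejection formula reduces to a per-pixel mask over the filtered image. A sketch, assuming (as an illustrative convention, not stated in the patent) that an invalid main-phase pixel is encoded as 0:

```python
import numpy as np

def reject_invalid(filtered_img, phase_map):
    """Keep a filtered-image pixel only where the main phase map is
    valid (non-zero here, by assumption); zero it everywhere else."""
    return np.where(phase_map != 0, filtered_img, 0).astype(np.uint8)
```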
In the embodiment of the present application, the extraction module 23 is further configured to perform morphological dilation based on the binarized image after removing part of the invalid region, so as to obtain a dilated image; extract the peripheral contour of each single standard point in the dilated image to obtain a contour point set image; obtain the peripheral envelope of each single standard point in the contour point set image to obtain an envelope point set image; screen out the envelope point with the largest area within the preset area range in the envelope point set image to obtain the screened envelope point image; and fill the screened envelope point image to extract the effective area.
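The morphological dilation at the start of the extraction pipeline can be sketched without any imaging library; a k x k square structuring element is assumed here, which is an illustrative choice rather than the patent's:

```python
import numpy as np

def dilate(binary, k=3):
    """Morphological dilation of a binary (0/255) image with a k x k
    square structuring element, via shifted maxima over the window."""
    pad = k // 2
    padded = np.pad(binary, pad, mode='constant')
    h, w = binary.shape
    out = np.zeros_like(binary)
    for dy in range(k):
        for dx in range(k):
            # Each shift contributes its pixel to the running maximum,
            # so a pixel is set if ANY neighbour in the window is set.
            out = np.maximum(out, padded[dy:dy + h, dx:dx + w])
    return out
```

Dilating the binarized standard points closes small gaps in them before the peripheral contours are extracted.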
It is to be understood that the image processing apparatus 20 provided in the present application corresponds to the image processing method provided in the present application, and for brevity of description, the same or similar parts may refer to the content of the image processing method, and will not be described herein again.
The respective modules in the above-described image processing apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, a processor in a server in the form of hardware, or may be stored in a memory in the server in the form of software, so that the processor may invoke and execute the operations corresponding to the above modules. The processor may be a central processing unit (CPU), a microprocessor, a single-chip microcomputer, etc.
The image processing method and/or the image processing apparatus described above may be implemented in the form of computer readable instructions, which may be run on an electronic device as shown in fig. 7.
The embodiment of the application also provides an electronic device, which comprises a memory, a processor, and computer readable instructions stored on the memory and executable on the processor, wherein the processor implements the image processing method described above when executing the instructions.
Fig. 7 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application, which may be a server. Referring to fig. 7, the electronic device includes a processor, a nonvolatile storage medium, an internal memory, an input device, a display screen, and a network interface connected by a system bus. The non-volatile storage medium of the electronic device may store an operating system and computer readable instructions, where the computer readable instructions, when executed, may cause a processor to perform an image processing method according to embodiments of the present application, and a specific implementation process of the method may refer to fig. 1, which is not described herein. The processor of the electronic device is configured to provide computing and control capabilities to support the operation of the entire electronic device. The internal memory may store computer readable instructions that, when executed by the processor, cause the processor to perform an image processing method. The input device of the electronic equipment is used for inputting various parameters, the display screen of the electronic equipment is used for displaying, and the network interface of the electronic equipment is used for carrying out network communication. It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the electronic device to which the present application is applied, and that a particular electronic device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
Based on the same inventive concept, an embodiment of the present application provides a computer-readable storage medium having stored thereon computer-readable instructions, which when executed by a processor, implement the steps in the image processing method described above.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, devices or units, and may be in electrical, mechanical or other form.
Further, the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Furthermore, functional modules in various embodiments of the present application may be integrated together to form a single portion, or each module may exist alone, or two or more modules may be integrated to form a single portion.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application, and various modifications and variations may be suggested to one skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.

Claims (8)

1. An image processing method, comprising:
based on a first projection image and a second projection image, a binarized image is obtained, wherein the first projection image is an image with a first calibration pattern projected on a detection surface, the second projection image is an image with a second calibration pattern projected on the detection surface, and the colors of the first calibration pattern and the second calibration pattern are complementary;
obtaining a main phase map based on a phase shift grating image, wherein the phase shift grating image is an image of the detection surface with the phase shift grating pattern projected thereon;
removing part of invalid areas in the binarized image based on the main phase map;
extracting an effective area based on the binarized image with part of the ineffective area removed, so as to obtain an effective area image;
the obtaining a binarized image based on the first projection image and the second projection image includes: performing differential calculation on the first projection image and the second projection image to obtain a differential image; and binarizing the differential image based on the gray average value of the differential image to obtain the binarized image.
2. The image processing method according to claim 1, characterized in that before the removing of a part of the invalid region in the binarized image based on the dominant phase map, the method further comprises:
and filtering the binarized image to obtain a filtered image.
3. The image processing method according to claim 2, wherein the removing a part of the invalid region in the binarized image based on the dominant phase map includes:
part of the invalid region in the binarized image is rejected by the following formula:
P_remove(i,j) = M_filter(i,j), if P_phase(i,j) ≠ 0; P_remove(i,j) = 0, if P_phase(i,j) = 0,
wherein P_remove(i,j) represents the binarized image after removal of the invalid region, P_phase(i,j) represents the main phase map, M_filter(i,j) represents the filtered image, and (i, j) represents the current pixel coordinate position.
4. The image processing method according to claim 1, wherein the effective region extraction based on the binarized image after removing a part of the ineffective region, comprises:
performing morphological expansion based on the binarized image after removing part of the invalid region to obtain an expanded image;
extracting the peripheral outline of a single standard point in the expansion image to obtain an outline point set image;
the peripheral envelope of each single standard point in the contour point set image is obtained, and an envelope point set image is obtained;
screening envelope points with the largest area in the envelope point set image within a preset area range, and obtaining a screened envelope point image;
and filling the screened envelope point images to extract the effective area.
5. An image processing apparatus, comprising:
the processing module is used for obtaining a binarized image based on a first projection image and a second projection image, wherein the first projection image is an image with a first calibration pattern projected on a detection surface, the second projection image is an image with a second calibration pattern projected on the detection surface, and the colors of the first calibration pattern and the second calibration pattern are complementary;
the processing module is further used for obtaining a main phase diagram based on the phase shift grating image, wherein the phase shift grating image is an image of the detection surface projected with the phase shift grating pattern;
the rejecting module is used for rejecting part of invalid areas in the binarized image based on the main phase diagram;
the extraction module is used for extracting the effective area based on the binarized image with part of the ineffective area removed, so as to obtain an effective area image;
the processing module is further configured to: performing differential calculation on the first projection image and the second projection image to obtain a differential image; and binarizing the differential image based on the gray average value of the differential image to obtain the binarized image.
6. The image processing apparatus of claim 5, wherein the extraction module is further configured to: performing morphological expansion based on the binarized image after removing part of the invalid region to obtain an expanded image; extracting the peripheral outline of a single standard point in the expansion image to obtain an outline point set image; the peripheral envelope of each single standard point in the contour point set image is obtained, and an envelope point set image is obtained; screening out envelope points with the largest area in a preset area range in the envelope point set image, and obtaining a screened envelope point image; and filling the screened envelope point images to extract the effective area.
7. An electronic device comprising a memory and a processor, the memory having stored therein computer readable instructions that, when executed by the processor, cause the processor to perform an image processing method according to any of claims 1-4 or to implement a function of an image processing apparatus according to any of claims 5-6.
8. A non-transitory readable storage medium storing computer readable instructions which, when executed by a processor, cause the processor to perform the image processing method as claimed in any one of claims 1 to 4 or to implement the functionality of the image processing apparatus as claimed in any one of claims 5 to 6.
CN202110122036.7A 2021-01-28 2021-01-28 Image processing method and device, electronic equipment and storage medium Active CN112837327B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110122036.7A CN112837327B (en) 2021-01-28 2021-01-28 Image processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110122036.7A CN112837327B (en) 2021-01-28 2021-01-28 Image processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112837327A CN112837327A (en) 2021-05-25
CN112837327B true CN112837327B (en) 2023-04-28

Family

ID=75932333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110122036.7A Active CN112837327B (en) 2021-01-28 2021-01-28 Image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112837327B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011133258A (en) * 2009-12-22 2011-07-07 Yamatake Corp Three-dimensional shape measuring method and three-dimensional shape measuring instrument
CN103942796A (en) * 2014-04-23 2014-07-23 清华大学 High-precision projector and camera calibration system and method
JP2015038466A (en) * 2013-07-16 2015-02-26 株式会社キーエンス Three-dimensional image processing apparatus, three-dimensional image processing method, three-dimensional image processing program, computer-readable recording medium, and device for storage

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5875637B2 (en) * 2013-12-19 2016-03-02 キヤノン株式会社 Image processing apparatus and image processing method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011133258A (en) * 2009-12-22 2011-07-07 Yamatake Corp Three-dimensional shape measuring method and three-dimensional shape measuring instrument
JP2015038466A (en) * 2013-07-16 2015-02-26 株式会社キーエンス Three-dimensional image processing apparatus, three-dimensional image processing method, three-dimensional image processing program, computer-readable recording medium, and device for storage
CN103942796A (en) * 2014-04-23 2014-07-23 清华大学 High-precision projector and camera calibration system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yu Zihao et al. Three-dimensional curved surface reconstruction based on edge detection and reliability sorting algorithm. Laser & Optoelectronics Progress. 2020, Vol. 57, No. 24, pp. 217-226. *

Also Published As

Publication number Publication date
CN112837327A (en) 2021-05-25

Similar Documents

Publication Publication Date Title
CN109409335B (en) Image processing method, image processing device, computer readable medium and electronic equipment
CN105828691B (en) Image processing apparatus, image processing method
CN107038704B (en) Retina image exudation area segmentation method and device and computing equipment
CN111220235B (en) Water level monitoring method and device
CN111307039A (en) Object length identification method and device, terminal equipment and storage medium
CN110378351B (en) Seal identification method and device
CN114299086B (en) Image segmentation processing method, electronic equipment and system for low-contrast imaging
JP2007272435A (en) Face feature extraction device and face feature extraction method
CN106919883B (en) QR code positioning method and device
CN112785591B (en) Method and device for detecting and segmenting rib fracture in CT image
CN116168351B (en) Inspection method and device for power equipment
JPWO2012137511A1 (en) Image processing apparatus and image processing method
CN106937059A (en) Image synthesis method and system based on Kinect
US11704807B2 (en) Image processing apparatus and non-transitory computer readable medium storing program
KR101195917B1 (en) Acquisition method of Tongue Diagnosis Region
CN112991374A (en) Canny algorithm-based edge enhancement method, device, equipment and storage medium
Koehoorn et al. Effcient and effective automated digital hair removal from dermoscopy images
CN111524171B (en) Image processing method and device and electronic equipment
CN112837327B (en) Image processing method and device, electronic equipment and storage medium
CN113139929A (en) Gastrointestinal tract endoscope image preprocessing method comprising information screening and fusion repairing
CN111340040A (en) Paper character recognition method and device, electronic equipment and storage medium
CN113744200B (en) Camera dirt detection method, device and equipment
CN112017148A (en) Method and device for extracting single-joint skeleton contour
CN115205113A (en) Image splicing method, device, equipment and storage medium
JP2007219899A (en) Personal identification device, personal identification method, and personal identification program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant