CN111899239A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN111899239A
CN111899239A
Authority
CN
China
Prior art keywords
image
color channel
images
data set
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010733029.6A
Other languages
Chinese (zh)
Inventor
郑旭平 (Zheng Xuping)
陈晓濠 (Chen Xiaohao)
任小华 (Ren Xiaohua)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010733029.6A priority Critical patent/CN111899239A/en
Publication of CN111899239A publication Critical patent/CN111899239A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/92Dynamic range modification of images or parts thereof based on global image properties
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application provide an image processing method and apparatus, relating to the field of computer technologies. The image processing method comprises: clustering the images in an image data set based on the color channel parameters of those images to obtain a plurality of image cluster sets; selecting, from the plurality of image cluster sets, the target image cluster set with the largest number of images, and calculating an average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in that set; and calibrating the color channel parameters of an image to be detected based on the average color channel parameter of the images in the target image cluster set, to obtain a calibrated image to be detected. With this technical solution, an image defect positioning model can accurately position defects even in liquid crystal panel images concentrated in a specific color cast, improving the accuracy of the model's defect positioning.

Description

Image processing method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus.
Background
During the manufacture of a liquid crystal panel, the panel must be photographed after each processing step to obtain a liquid crystal panel image, and that image must be processed by an image defect positioning model to determine the positions of defects in the image.
The image defect positioning model needs to be trained on a large number of liquid crystal panel images. In the related art, the liquid crystal panel images captured during manufacturing are generally used directly as training sample data for the model.
Because photographing the liquid crystal panel is affected by factors such as ambient light, camera imaging parameters, and the reflectivity of the panel, the liquid crystal panel images used to train the model tend to be concentrated in a specific color cast, such as light yellow, yellow-green, brownish yellow, or orange. When these images are used directly as training sample data, the image defect positioning model cannot accurately position defects in liquid crystal panel images concentrated in such a color cast, so the defect positioning accuracy is low.
Disclosure of Invention
The embodiments of the present application provide an image processing method and an image processing apparatus, which can, to a certain extent, solve the technical problem that an image defect positioning model cannot accurately position defects in liquid crystal panel images concentrated in a specific color cast, resulting in low defect positioning accuracy.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of an embodiment of the present application, there is provided an image processing method including: clustering images in the image data set based on color channel parameters of the images in the image data set to obtain a plurality of image cluster sets; selecting a target image cluster set with the largest number of images from the plurality of image cluster sets, and calculating an average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in the target image cluster set; and calibrating the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set to obtain the calibrated image to be detected.
According to an aspect of an embodiment of the present application, there is provided an image processing apparatus including: the clustering unit is used for clustering the images in the image data set based on the color channel parameters of the images in the image data set to obtain a plurality of image clustering sets; the calculation unit is used for selecting a target image cluster set with the largest number of images from the plurality of image cluster sets and calculating the average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in the target image cluster set; and the first calibration unit is used for calibrating the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set to obtain the calibrated image to be detected.
In some embodiments of the present application, based on the foregoing scheme, the clustering unit is configured to: respectively calculating an average color channel parameter corresponding to each image in the image data set based on the color channel parameters corresponding to all pixels contained in each image in the image data set; and clustering the images in the image data set based on the average color channel parameter corresponding to each image in the image data set to obtain a plurality of image cluster sets.
In some embodiments of the present application, based on the foregoing scheme, the clustering unit is configured to: and respectively calculating the average color channel parameter of each image in the image data set under each color channel based on the color channel parameters of all pixels contained in each image in the image data set under each color channel.
In some embodiments of the present application, based on the foregoing, the first calibration unit is configured to: determining average color channel parameters corresponding to the image to be detected based on the color channel parameters of all pixels contained in the image to be detected; generating a calibration parameter based on the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected; and based on the calibration parameters, calibrating the color channel parameters of all pixels contained in the image to be detected to obtain a calibrated image to be detected.
In some embodiments of the present application, based on the foregoing, the first calibration unit is configured to: calculating the ratio of the average color channel parameter of the images in the target image cluster set to the average color channel parameter corresponding to the image to be detected; based on the ratio, calibration parameters are generated.
In some embodiments of the present application, based on the foregoing solution, the image processing apparatus further includes: the input unit is used for inputting the calibrated image to be detected into a pre-trained image defect positioning model; and the positioning unit is used for carrying out defect positioning treatment on the calibrated image to be detected through the image defect positioning model and outputting a defect positioning result.
In some embodiments of the present application, based on the foregoing solution, the image processing apparatus further includes: a second calibration unit, configured to perform calibration processing on the color channel parameters of the images in the image data set based on the average color channel parameters of the images in the target image cluster set, so as to obtain a calibrated image data set; the marking unit is used for carrying out image defect marking processing on the image in the calibrated image data set to obtain a marked image data set; the generating unit is used for generating training sample data based on the labeled image data set; and the training unit is used for training a machine learning model based on the generated training sample data to obtain the image defect positioning model.
In some embodiments of the present application, based on the foregoing scheme, the generating unit is configured to: selecting a target image from the labeled image data set; performing data enhancement processing on the target image to obtain a processed image; and generating training sample data based on the processed image and the labeled image data set.
In some embodiments of the present application, based on the foregoing scheme, the generating unit is configured to: acquiring a data enhancement probability threshold; distributing a random number to the image in the labeled image data set; and determining the image of which the distributed random number is less than or equal to the data enhancement probability threshold value as the target image.
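The selection rule above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the `select_augmentation_targets` name, the threshold value, and the data are hypothetical. Each labeled image is assigned a random number in [0, 1), and the images whose number is less than or equal to the data enhancement probability threshold become the target images:

```python
import random

def select_augmentation_targets(images, threshold, seed=0):
    """Assign each labeled image a random number and keep those whose
    number is less than or equal to the data enhancement probability
    threshold as target images for data enhancement."""
    rng = random.Random(seed)  # fixed seed for reproducibility of the sketch
    return [img for img in images if rng.random() <= threshold]

# With a hypothetical threshold of 0.3, roughly 30% of the labeled
# images are selected as augmentation targets on average.
images = list(range(1000))
targets = select_augmentation_targets(images, threshold=0.3)
print(0.25 < len(targets) / len(images) < 0.35)
```

Selecting targets probabilistically rather than by a fixed count keeps the augmented subset representative of the whole labeled data set.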
According to an aspect of embodiments of the present application, there is provided a computer-readable medium on which a computer program is stored, which, when executed by a processor, implements an image processing method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image processing method as described in the above embodiments.
In the technical solutions provided in some embodiments of the present application, the images in an image data set are clustered based on their color channel parameters to obtain a plurality of image cluster sets; the target image cluster set with the largest number of images is selected from the plurality of image cluster sets; an average color channel parameter of the images in the target image cluster set is calculated based on the color channel parameter of each image in that set; and the color channel parameters of the image to be detected are then calibrated based on that average color channel parameter, yielding a calibrated image to be detected. Because the image to be detected undergoes image-calibration preprocessing before the image defect positioning model performs defect positioning on it, the model can position defects accurately even when the image to be detected is a liquid crystal panel image concentrated in a specific color cast, in contrast to directly inputting an uncalibrated image into the model. The accuracy of the image defect positioning model is thereby improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of the embodiments of the present application can be applied.
FIG. 2 shows a flow diagram of an image processing method according to an embodiment of the present application.
Fig. 3 shows a detailed flowchart of step S210 of the image processing method according to an embodiment of the present application.
Fig. 4 shows a detailed flowchart of step S230 of the image processing method according to an embodiment of the present application.
Fig. 5 shows a detailed flowchart of step S410 of the image processing method according to an embodiment of the present application.
FIG. 6 shows a flow diagram of an image processing method according to an embodiment of the present application.
FIG. 7 shows a flow diagram of an image processing method according to an embodiment of the present application.
Fig. 8 shows a detailed flowchart of step S720 of the image processing method according to an embodiment of the present application.
Fig. 9 shows a detailed flowchart of step S810 of the image processing method according to an embodiment of the present application.
Fig. 10 shows a block diagram of an image processing apparatus according to an embodiment of the present application.
FIG. 11 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of the embodiments of the present application can be applied.
As shown in fig. 1, the system architecture may include a client 101, a network 102, and a server 103. Network 102 serves as a medium for providing communication links between clients 101 and servers 103. Network 102 may include various connection types, such as wired communication links, wireless communication links, and so forth.
It should be understood that the numbers of clients 101, networks 102, and servers 103 in fig. 1 are merely illustrative. There may be any number of clients 101, networks 102, and servers 103 according to implementation needs; for example, the server 103 may be a server cluster composed of multiple servers.
The client 101 interacts with a server 103 through the network 102 to receive or transmit messages and the like, and the server 103 may be a server that provides various services, for example, may be a server that provides an image processing application.
The client 101 clusters the images in the image data set based on their color channel parameters to obtain a plurality of image cluster sets, selects the target image cluster set with the largest number of images from the plurality of image cluster sets, calculates an average color channel parameter of the images in the target image cluster set based on the color channel parameters of those images, and calibrates the color channel parameters of the image to be detected based on that average color channel parameter to obtain a calibrated image to be detected. Because the image to be detected is first calibrated and the image defect positioning model then performs defect positioning on the calibrated image, the model can position defects accurately even for liquid crystal panel images concentrated in a specific color cast, improving the accuracy of defect positioning compared with directly inputting an uncalibrated image into the model.
It should be noted that the image processing method provided in the embodiment of the present application is generally executed by the client 101, and accordingly, the image processing apparatus is generally disposed in the client 101. However, in other embodiments of the present application, the server 103 may also have similar functions as the client 101, so as to execute the scheme of the image processing method provided in the embodiments of the present application.
The details of implementation of the technical solution of the embodiments of the present application are set forth in the following.
Fig. 2 shows a flow diagram of an image processing method according to an embodiment of the present application, which may be performed by a client, which may be the client 101 shown in fig. 1. Referring to fig. 2, the image processing method at least includes steps S210 to S230, which are described in detail as follows.
In step S210, the images in the image data set are clustered based on the color channel parameters of the images in the image data set, so as to obtain a plurality of image cluster sets.
In one embodiment, during the manufacture of the liquid crystal panel, after the panel has undergone a given processing step, a dedicated image acquisition device may photograph the processed panel to obtain a liquid crystal panel image for color-defect analysis. The image data set is the set of images obtained by photographing such liquid crystal panels. It should be noted that the colors of the liquid crystal panel images differ because their shooting environments differ objectively, for example under the combined effects of ambient light, camera imaging parameters, and the reflectivity of the panels. Thus, the set of liquid crystal panel images contained in the image data set generally refers to a set of images whose color characteristics differ.
In an embodiment, the color channel parameter of an image is its parameter value in the color channel(s) of the image's color mode, and may be taken as a parameter characterizing the color characteristics of the image. An image generally comprises a plurality of pixels, each pixel has a corresponding color channel parameter, and the color channel parameters of all pixels contained in the image together constitute the color channel parameters of the image.
The color mode of an image may be a single-channel color mode, such as a grayscale image; for a single-channel image, each pixel has only one parameter value, under that single color channel. The color mode may also comprise multiple color channels, such as the red-green-blue (RGB) color mode, in which each pixel has three parameter values, one each in the R, G, and B color channels.
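As a brief illustration of the multi-channel case (the array values are hypothetical, not from the patent), an RGB image can be represented as an H x W x 3 array whose last axis holds each pixel's three color channel parameters:

```python
import numpy as np

# A hypothetical 2x2 RGB image: each pixel carries three color channel
# parameters, one per channel, with values in the 0-255 range.
image = np.array([
    [[200, 180,  90], [210, 190, 100]],
    [[190, 170,  80], [205, 185,  95]],
], dtype=np.uint8)

print(image.shape)   # (height, width, channels) -> (2, 2, 3)
print(image[0, 0])   # channel parameters of the top-left pixel
```

A single-channel (grayscale) image would instead be an H x W array with one parameter value per pixel.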
In one embodiment, in order to determine common color characteristics of all images included in an image data set, the images in the image data set may be clustered based on color channel parameters of the images in the image data set, so as to obtain a plurality of image cluster sets, and each image cluster set may be used as an image set representing images with the same color characteristics.
Optionally, the clustering algorithm used for clustering the images in the image data set may be a Mean-Shift clustering algorithm, a k-means clustering algorithm, or the like, and of course, other clustering algorithms may also be used, which are not limited herein.
Optionally, when the color channel parameters of each image in the image data set are used as input data for performing clustering processing, the color channel parameters corresponding to all pixels included in the image may be directly used as the input data for performing clustering processing to perform clustering processing on the images in the image data set, so as to obtain a plurality of image cluster sets, so as to implement clustering on the images in the image data set according to the color characteristics included in the images.
Referring to fig. 3, fig. 3 shows a detailed flowchart of step S210 of the image processing method according to an embodiment of the present application, in which embodiment, step S210 may specifically include step S310 to step S320, which is described in detail as follows.
In step S310, an average color channel parameter corresponding to each image in the image data set is calculated based on the color channel parameters corresponding to all pixels included in each image in the image data set.
In an embodiment, when clustering the images in the image data set based on the color channel parameters of the images in the image data set, an average color channel parameter corresponding to each image in the image data set may be calculated according to the color channel parameters corresponding to all pixels included in each image in the image data set, in other words, for each image in the image data set, an average color channel parameter corresponding to the image may be calculated according to the color channel parameters corresponding to all pixels included in the image, and the average color channel parameter is used as a parameter reflecting the overall color characteristics of the image.
Specifically, the color channel parameters of all pixels may be added, and divided by the number of pixels included in the image, so as to obtain an average color channel parameter corresponding to the image.
Optionally, when the color mode corresponding to the image is a color mode of a single color channel, the single color channel parameters corresponding to all pixels included in each image may be added, and the average color channel parameters corresponding to the image are obtained by dividing the single color channel parameters by the number of pixels included in the image, where the average color channel parameters corresponding to the obtained image are only one.
Optionally, when the color mode of the image is a color mode of a multi-color channel, step S310 may specifically include: and respectively calculating the average color channel parameter of each image in the image data set under each color channel based on the color channel parameters of all pixels contained in each image in the image data set under each color channel.
When the color mode of the image is a color mode of multiple color channels, the average color channel parameter corresponding to the image is a color channel parameter in the multiple color channels. Taking an image in an RGB color mode as an example, color channel parameters of all pixels included in each image in an R color channel may be added, and divided by the number of pixels included in the image to obtain an average color channel parameter of the image in the R color channel.
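The per-channel averaging described above can be sketched as follows (a minimal illustration with hypothetical values; `average_color_channels` is an assumed helper name). For each channel, the pixel parameters are summed and divided by the number of pixels:

```python
import numpy as np

def average_color_channels(image):
    """Average color channel parameter(s) of one image.

    For an H x W single-channel image this returns one value; for an
    H x W x C multi-channel image it returns one value per channel,
    i.e. the sum of each channel's pixel parameters divided by H * W.
    """
    image = np.asarray(image, dtype=np.float64)
    if image.ndim == 2:                # single color channel (e.g. grayscale)
        return image.mean()
    return image.mean(axis=(0, 1))     # one average per color channel

# Hypothetical 2x2 RGB image.
rgb = [[[100,  50, 0], [200,  50, 0]],
       [[100, 150, 0], [200, 150, 0]]]
print(average_color_channels(rgb))     # [150. 100.   0.]
```

The R average is (100 + 200 + 100 + 200) / 4 = 150, and likewise for the other channels.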
In step S320, clustering is performed on the images in the image data set based on the average color channel parameter corresponding to each image in the image data set, so as to obtain a plurality of image cluster sets.
In one embodiment, after the average color channel parameter corresponding to each image in the image data set is obtained, the average color channel parameter is used as input data of a clustering algorithm, the input data is processed through the clustering algorithm, and clustering results of a plurality of image clustering sets are output, so that the images in the image data set are clustered according to the color channel parameters of the images in the image data set.
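The clustering step can be sketched as follows, assuming k-means (one of the algorithms mentioned above) over per-image average color vectors; the `kmeans` helper, cluster count, and data are hypothetical, and a production system would more likely use a library implementation:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over per-image average color vectors."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest cluster center.
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical average-color vectors: two yellowish images, two bluish ones.
avg_colors = np.array([
    [220.0, 200.0,  90.0],
    [215.0, 205.0,  95.0],
    [ 40.0,  60.0, 200.0],
    [ 45.0,  55.0, 210.0],
])
labels, _ = kmeans(avg_colors, k=2)
# Images 0 and 1 fall into one image cluster set, images 2 and 3 into the other.
print(labels[0] == labels[1] and labels[2] == labels[3] and labels[0] != labels[2])
```

Each resulting label partitions the image data set into one image cluster set of images sharing similar color characteristics.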
In the technical solution of the embodiment shown in fig. 3, the average color channel parameter corresponding to each image in the image data set is calculated from the color channel parameters of all pixels contained in that image, and the images are then clustered based on those per-image averages. Compared with clustering the images directly on the color channel parameters of all their pixels, this significantly reduces the dimensionality of the data fed into the clustering algorithm for each image, which effectively reduces the complexity of the clustering operation, lowers the computational load on the system, and saves system resources.
Still referring to fig. 2, in step S220, a target image cluster set with the largest number of images is selected from the plurality of image clusters, and an average color channel parameter of the images in the target image cluster set is calculated based on the color channel parameter of each image in the target image cluster set.
In one embodiment, after obtaining the plurality of image cluster sets, the image cluster set with the largest number of images may be selected from the plurality of image cluster sets as a target image cluster set, where the target image cluster set is an image set that can represent the common color characteristics of all images in the image data set most.
It will be appreciated that if two image cluster sets contain the same number of images, and each contains more images than any of the other image cluster sets, then either of the two may be selected as the target image cluster set.
In one embodiment, after determining the target image cluster set, an average color channel parameter for the images in the target image cluster set may then be calculated based on the color channel parameters for the images in the target image cluster set.
Specifically, the average color channel parameter corresponding to each image in the image data set may be respectively calculated according to the color channel parameters corresponding to all pixels included in each image in the target image cluster set, in other words, for each image in the target image cluster set, the average color channel parameter corresponding to the image may be calculated according to the color channel parameters corresponding to all pixels included in the image. And after the average color channel parameter corresponding to each image in the target image clustering set is obtained through calculation, summing the average color channel parameters corresponding to each image in the target image clustering set, dividing the sum result by the number of the images in the target image clustering set, and obtaining the average color channel parameter of the images in the target image clustering set through calculation.
As described above, the color channel parameters of the image in each color channel need to be calculated respectively, so that when the color mode corresponding to the image is the color mode of a single color channel, the average color channel parameter corresponding to the obtained image is only one, and correspondingly, the average color channel parameter of the image in the target image cluster set is also only one; when the color mode of the image is a color mode of multiple color channels, the average color channel parameters corresponding to the obtained image are multiple, and correspondingly, the average color channel parameters of the images in the target image cluster set are multiple.
In step S230, based on the average color channel parameter of the images in the target image cluster set, the color channel parameter of the image to be detected is calibrated, so as to obtain a calibrated image to be detected.
In an embodiment, after determining the average color channel parameter of the images in the target image cluster set, the color channel parameter of the image to be detected may be calibrated based on the average color channel parameter of the images in the target image cluster set, so as to obtain a calibrated image to be detected.
Specifically, the color channel parameters of all pixels included in the image to be detected can be calibrated according to the average color channel parameters of the images in the target image cluster set, so that the image to be detected can be adjusted into an image with the common color characteristics of all the images in the image data set, and the image to be detected is used as the calibrated image to be detected.
Compared with directly inputting an image that has not undergone image calibration into the image defect positioning model, performing image-calibration data preprocessing on the image to be detected before the model performs defect positioning detection means that, even when the image to be detected is a liquid crystal panel image concentrated in a specific color mode, the image defect positioning model can still locate defects accurately, which improves the accuracy of the model in defect positioning.
Referring to fig. 4, fig. 4 shows a detailed flowchart of step S230 of the image processing method according to an embodiment of the present application, in which embodiment, step S230 may specifically include step S410 to step S430, which is described in detail as follows.
In step S410, an average color channel parameter corresponding to the image to be detected is determined based on the color channel parameters of all pixels included in the image to be detected.
In an embodiment, when the color channel parameters of the image to be detected are calibrated based on the average color channel parameters of the images in the target image cluster set, the average color channel parameters corresponding to the image to be detected may be determined based on the color channel parameters of all pixels included in the image to be detected.
Specifically, the color channel parameters of all pixels may be added, and divided by the number of pixels included in the image, so as to obtain an average color channel parameter corresponding to the image.
It can be understood that when the color mode of the image is a single-color-channel mode, the average color channel parameter corresponding to the image to be detected is obtained by summing the single-channel parameters of all pixels contained in the image and dividing by the number of pixels, and only one such average color channel parameter is obtained.

When the color mode of the image to be detected is a multi-color-channel mode, the image has one average color channel parameter per color channel. Taking an image in the RGB color mode as an example, the R-channel parameters of all pixels contained in the image to be detected may be summed and divided by the number of pixels to obtain the average color channel parameter of the image in the R channel; the average color channel parameters in the G channel and the B channel may be calculated in the same way. The average color channel parameters in these three color channels together constitute the average color channel parameter corresponding to the image to be detected.
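The per-channel averaging for an RGB image can be illustrated with a small worked example (hypothetical pixel values; the names r2, g2 and b2 mirror the R2/G2/B2 notation used later in the text):

```python
import numpy as np

# A 2 x 2 RGB image with hypothetical pixel values.
img = np.zeros((2, 2, 3))
img[..., 0] = [[10, 20], [30, 40]]  # R channel
img[..., 1] = 60                    # G channel
img[..., 2] = 120                   # B channel

# Sum each channel over all pixels, then divide by the pixel count.
n_pixels = img.shape[0] * img.shape[1]
r2, g2, b2 = img.sum(axis=(0, 1)) / n_pixels
# r2 = 25.0, g2 = 60.0, b2 = 120.0
```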
In step S420, a calibration parameter is generated based on the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected.
In an embodiment, the calibration parameter used to calibrate the image to be detected is generated from the correspondence between the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected, so that calibration processing of the image to be detected can be realized.
Referring to fig. 5, fig. 5 shows a detailed flowchart of step S420 of the image processing method according to an embodiment of the present application; in this embodiment, step S420 may specifically include steps S510 to S520, which are described in detail as follows.
In step S510, a ratio between the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected is calculated.
In an embodiment, when generating the calibration parameter for performing calibration processing on the image to be detected, a ratio between an average color channel parameter of the image in the target image cluster set and an average color channel parameter corresponding to the image to be detected may be calculated first.
It can be understood that, when the color mode of the image is a color mode of multiple color channels, the average color channel parameters of the images in the target image cluster set and the average color channel parameters corresponding to the images to be detected both include multiple average color channel parameters under different color channels, and thus, the ratio between the average color channel parameters of the images under different color channels needs to be calculated respectively.
Taking an image in the RGB color mode as an example, the average color channel parameters of the images in the target image cluster set include an average color channel parameter R1 in the R channel, G1 in the G channel, and B1 in the B channel; the average color channel parameters corresponding to the image to be detected include R2 in the R channel, G2 in the G channel, and B2 in the B channel. The ratios between the average color channel parameters of the images in the target image cluster set and those corresponding to the image to be detected are then R1/R2 for the R channel, G1/G2 for the G channel, and B1/B2 for the B channel.
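The per-channel ratios R1/R2, G1/G2 and B1/B2 can be computed with a small hypothetical helper:

```python
def channel_ratios(cluster_avg, image_avg):
    """Per-channel ratios between the cluster-set averages (R1, G1, B1)
    and the averages of the image to be detected (R2, G2, B2)."""
    return tuple(c / i for c, i in zip(cluster_avg, image_avg))

ratios = channel_ratios((150.0, 90.0, 60.0), (100.0, 90.0, 120.0))
# ratios -> (1.5, 1.0, 0.5): R1/R2, G1/G2, B1/B2
```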
In step S520, based on the ratio, a calibration parameter is generated.
In one embodiment, after a ratio between an average color channel parameter of an image in a target image cluster set and an average color channel parameter corresponding to an image to be detected is obtained, a calibration parameter for performing calibration processing on the image to be detected is generated based on a corresponding relationship among the ratio, the ratio and the calibration parameter.
Optionally, the correspondence between the ratio and the calibration parameter may be a linear relationship with positive correlation, for example, the ratio may be directly used as the calibration parameter for performing calibration processing on the image.
It can be understood that, when the color mode of the image to be detected is a color mode of multiple color channels, the calibration parameters generated based on the ratio need to be determined respectively according to the ratio between the average color channel parameters of the two images in different color channels.
Taking an image in the RGB color mode as an example, the ratio R1/R2 corresponding to the R channel may be used directly as the calibration parameter of the image in the R channel, with which the color channel parameters in the R channel are calibrated; similarly, the ratio G1/G2 corresponding to the G channel may be used as the calibration parameter of the image in the G channel, and the ratio B1/B2 corresponding to the B channel as the calibration parameter of the image in the B channel.
Referring to fig. 4 again, in step S430, based on the calibration parameters, the color channel parameters of all pixels included in the image to be detected are calibrated, so as to obtain a calibrated image to be detected.
In an embodiment, after the calibration parameters for calibrating the image to be detected are generated, the color channel parameters of all pixels contained in the image to be detected may be calibrated according to those calibration parameters, so as to obtain a calibrated image to be detected.
Specifically, color channel parameters of each pixel of the image to be detected under the corresponding color channel are calibrated according to the calibration parameters of the image to be detected under each color channel. The calibration processing method may specifically be to calculate a product between a calibration parameter of the image in each color channel and a color channel parameter of a pixel of the image in the corresponding color channel, where the product is used as the calibrated color channel parameter, thereby calibrating the color channel parameter of each pixel of the image in the corresponding color channel.
After the color channel parameters of all pixels contained in the image have been calibrated, the calibrated image to be detected is obtained.
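The multiplication-based calibration described above can be sketched as follows (a hypothetical helper; the clip to [0, 255] is an added safeguard for 8-bit images, not part of the text):

```python
import numpy as np

def calibrate(image, calib_params):
    """Multiply each pixel's channel value by that channel's calibration
    parameter (the R1/R2-style ratios), producing the calibrated image."""
    out = image.astype(np.float64) * np.asarray(calib_params)
    return np.clip(out, 0.0, 255.0)
```

For example, calibrating a uniform gray pixel (100, 100, 100) with parameters (1.5, 1.0, 0.5) yields (150, 100, 50).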
It can be seen from the above that the images in the image data set are clustered based on their color channel parameters to obtain a plurality of image cluster sets; the target image cluster set containing the largest number of images is selected from them; the average color channel parameter of the images in the target image cluster set is calculated from the color channel parameters of those images; and the color channel parameters of the image to be detected are calibrated based on that average to obtain a calibrated image to be detected. Because the image to be detected undergoes image-calibration data preprocessing before the image defect positioning model performs defect positioning detection, the model can locate defects accurately even when the image to be detected is a liquid crystal panel image concentrated in a specific color mode, compared with directly inputting an image that has not been preprocessed. This improves the accuracy of the image defect positioning model in defect positioning.
Referring to fig. 6, fig. 6 shows a flowchart of an image processing method according to an embodiment of the present application, and the image processing method in the embodiment of the present application may further include steps S610 to S620, which are described in detail as follows.
In step S610, the calibrated image to be detected is input into the pre-trained image defect localization model.
In one embodiment, when color defect analysis is performed on the liquid crystal panel, an image of the liquid crystal panel may be input to a trained image defect localization model, and defect localization of the image of the liquid crystal panel is achieved through the image defect localization model, where the image defect localization model is obtained by training a machine learning model. The machine learning model may be a CNN (Convolutional Neural Network) model, or may be a deep Neural Network model or the like. The pre-trained image defect positioning model can perform defect positioning processing on the input calibrated image to be detected so as to detect the defect position in the image to be detected.
In step S620, the calibrated image to be detected is processed by defect location through the image defect location model, and a defect location result is output.
In one embodiment, the image defect positioning model is used for performing defect positioning processing on the calibrated image to be detected, and outputting a defect positioning result, wherein the defect positioning result is defect positioning information obtained by performing defect position positioning processing on the calibrated image to be detected.
In the technical solution of the embodiment shown in fig. 6, the color channel parameters of the image to be detected are first calibrated to obtain a calibrated image to be detected, and the image defect positioning model then performs defect positioning detection on the calibrated image to be detected, which improves the accuracy of defect positioning.
Referring to fig. 7, fig. 7 shows a flowchart of an image processing method according to an embodiment of the present application, and the image processing method in the embodiment of the present application may further include steps S710 to S740, which are described in detail as follows.
In step S710, based on the average color channel parameter of the images in the target image cluster set, the color channel parameter of the images in the image data set is calibrated, so as to obtain a calibrated image data set.
In an embodiment, when training sample data for training a machine learning model is generated according to an image data set, calibration processing may be performed on color channel parameters of images in the image data set based on an average color channel parameter of the images in a target image cluster set to obtain a calibrated image data set. The method for calibrating the color channel parameter of each image in the image data set based on the average color channel parameter of the images in the target image cluster set is consistent with the method for calibrating the color channel parameter of the image to be detected based on the average color channel parameter of the images in the target image cluster set, and is not repeated here.
In step S720, an image defect labeling process is performed on each image in the calibrated image dataset to obtain a labeled image dataset.
In an embodiment, when the image defect localization model is trained according to the calibrated image data set, an image defect labeling process may be performed on each image in the calibrated image data set, that is, a labeling process is performed on an actual defect position of each image in advance.
In step S730, training sample data is generated based on the labeled image dataset.
In an embodiment, after the calibrated image data set is labeled to obtain a labeled image data set, training sample data for training the machine learning model may be generated according to the labeled image data set.
Referring to fig. 8, fig. 8 shows a detailed flowchart of step S720 of the image processing method according to an embodiment of the present application, and step S720 in the embodiment of the present application may include steps S810 to S830, which are described in detail as follows.
In step S810, a target image is selected from the labeled image dataset.
In one embodiment, after the calibrated image data set is labeled to obtain a labeled image data set, a subset of images may be selected from the labeled image data set as target images for data enhancement processing, in order to improve the training effect of the machine learning model.
Optionally, a partial image may be directly extracted from the labeled image data set at random as a target image for data enhancement.
Alternatively, referring to fig. 9, fig. 9 shows a specific flowchart of step S810 of the image processing method according to an embodiment of the present application, and step S810 in the embodiment of the present application may include step S910 to step S930, which is described in detail as follows.
In step S910, a data enhancement probability threshold is acquired.
In step S920, a random number is assigned to the image in the labeled image dataset.
In step S930, an image for which the assigned random number is less than or equal to the data enhancement probability threshold is determined as a target image.
In an embodiment, when selecting a target image for data enhancement processing from the labeled image dataset, in order to ensure randomness, a data enhancement probability threshold for image selection may be preset, and image selection may be performed according to the data enhancement probability threshold, where the data enhancement probability threshold may be a preset certain probability parameter, for example, a certain probability parameter between 0.5 and 0.8.
Specifically, a random number, typically a value between 0 and 1, may be assigned in advance to each image in the labeled image data set.

The random number assigned to each image is then compared with the data enhancement probability threshold, and each image whose assigned random number is less than or equal to the threshold is determined to be a target image, thereby randomly selecting from the labeled image data set the target images that require data enhancement processing.
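The random selection procedure can be sketched as follows (hypothetical helper; the seed only makes the sketch reproducible):

```python
import random

def select_targets(images, threshold=0.6, seed=None):
    """Assign each image a random number in [0, 1) and keep those whose
    number is <= threshold; on average, a fraction `threshold` of the
    data set is selected, matching the 0.5-0.8 range suggested above."""
    rng = random.Random(seed)
    return [img for img in images if rng.random() <= threshold]
```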
Still referring to fig. 8, in step S820, a data enhancement process is performed on the target image to obtain a processed image.
In one embodiment, after the target image is acquired, data enhancement processing may be performed on the selected target image to obtain a processed image.
Specifically, for any selected target image, various types of data enhancement processing such as color conversion, rotation, scaling, and noise addition may be performed on the target image. It will be appreciated that for each type of data enhancement processing, it may be determined with a certain probability whether each target image is to be subjected to the data enhancement processing corresponding to that type, and thus the respective target images may be subjected to one or more types of data enhancement processing simultaneously.
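A sketch of per-type probabilistic augmentation, where each augmentation type is applied independently so an image may receive one or several of them (the specific transforms are illustrative placeholders; the text names color conversion, rotation, scaling and noise addition):

```python
import random
import numpy as np

# Illustrative transforms standing in for the augmentation types above.
AUGMENTATIONS = {
    "flip": lambda img: img[:, ::-1],       # horizontal flip
    "rotate": lambda img: np.rot90(img),    # 90-degree rotation
    "noise": lambda img: np.clip(           # additive Gaussian noise
        img + np.random.normal(0.0, 5.0, img.shape), 0, 255),
}

def augment(image, prob=0.5, rng=None):
    """Decide independently, with probability `prob`, whether to apply
    each augmentation type to the image."""
    rng = rng or random.Random()
    out = image
    for transform in AUGMENTATIONS.values():
        if rng.random() < prob:
            out = transform(out)
    return out
```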
In step S830, training sample data is generated based on the processed image and the labeled image dataset.
Training sample data for training the machine learning model is formed from the images that have undergone data enhancement processing together with the original images that have not.
In the technical scheme of the embodiment shown in fig. 8, training sample data for training the machine learning model is generated by using the image without data enhancement and the image after data enhancement, so that the generalization capability of the image defect localization model obtained by training can be effectively improved, and the performance of the image defect localization model for image defect localization is improved.
Referring to fig. 7 again, in step S740, the machine learning model is trained based on the generated training sample data to obtain an image defect localization model.
In one embodiment, the machine learning model is trained based on generated training sample data to obtain an image defect location model, and the process of training the machine learning model is to adjust coefficients in a network structure corresponding to the machine learning model, so that for an input image to be detected, the result is the determined defect location information through the operation of the coefficients in the network structure corresponding to the machine learning model.
In the technical solution of the embodiment shown in fig. 7, data preprocessing in the form of color calibration is performed on each image in the image data set, reducing the color difference between the image to be detected and the images in the calibrated image data set. As a result, when the image to be detected is a liquid crystal panel image concentrated in a specific color mode, the image defect positioning model trained with the calibrated image data set can still locate defects accurately, which improves the accuracy of the model in defect positioning.
Embodiments of the apparatus of the present application are described below, which may be used to perform the image processing methods in the above-described embodiments of the present application. For details that are not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the image processing method described above in the present application.
Fig. 10 shows a block diagram of an image processing apparatus according to an embodiment of the present application.
Referring to fig. 10, an image processing apparatus 1000 according to an embodiment of the present application includes: a clustering unit 1010, a calculating unit 1020, and a first calibrating unit 1030. The clustering unit 1010 is configured to perform clustering processing on the images in the image data set based on color channel parameters of the images in the image data set to obtain a plurality of image cluster sets; a calculating unit 1020, configured to select a target image cluster set with the largest number of images from the multiple image cluster sets, and calculate an average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in the target image cluster set; the first calibration unit 1030 is configured to calibrate the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set, so as to obtain a calibrated image to be detected.
In some embodiments of the present application, based on the foregoing scheme, the clustering unit 1010 is configured to: respectively calculating the average color channel parameter corresponding to each image in the image data set based on the color channel parameters corresponding to all pixels contained in each image in the image data set; and clustering the images in the image data set based on the average color channel parameter corresponding to each image in the image data set to obtain a plurality of image cluster sets.
In some embodiments of the present application, based on the foregoing scheme, the clustering unit 1010 is configured to: and respectively calculating the average color channel parameter of each image in the image data set under each color channel based on the color channel parameters of all pixels contained in each image in the image data set under each color channel.
In some embodiments of the present application, based on the foregoing scheme, the first calibration unit 1030 is configured to: determining average color channel parameters corresponding to the image to be detected based on the color channel parameters of all pixels contained in the image to be detected; generating a calibration parameter based on the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected; and based on the calibration parameters, calibrating the color channel parameters of all pixels contained in the image to be detected to obtain a calibrated image to be detected.
In some embodiments of the present application, based on the foregoing scheme, the first calibration unit 1030 is configured to: calculating the ratio of the average color channel parameter of the images in the target image cluster set to the average color channel parameter corresponding to the image to be detected; based on the ratio, calibration parameters are generated.
In some embodiments of the present application, based on the foregoing solution, the image processing apparatus further includes: the input unit is used for inputting the calibrated image to be detected into the pre-trained image defect positioning model; and the positioning unit is used for carrying out defect positioning treatment on the calibrated image to be detected through the image defect positioning model and outputting a defect positioning result.
In some embodiments of the present application, based on the foregoing solution, the image processing apparatus further includes: the second calibration unit is used for calibrating the color channel parameters of the images in the image data set based on the average color channel parameters of the images in the target image clustering set to obtain a calibrated image data set; the marking unit is used for carrying out image defect marking processing on the image in the calibrated image data set to obtain a marked image data set; the generating unit is used for generating training sample data based on the labeled image data set; and the training unit is used for training the machine learning model based on the generated training sample data to obtain an image defect positioning model.
In some embodiments of the present application, based on the foregoing scheme, the generating unit is configured to: selecting a target image from the labeled image data set; performing data enhancement processing on the target image to obtain a processed image; and generating training sample data based on the processed image and the labeled image data set.
In some embodiments of the present application, based on the foregoing scheme, the generating unit is configured to: acquiring a data enhancement probability threshold; distributing a random number for the image in the marked image data set; and determining the image of which the distributed random number is less than or equal to the data enhancement probability threshold value as the target image.
FIG. 11 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
It should be noted that the computer system 1100 of the electronic device shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 11, the computer system 1100 includes a Central Processing Unit (CPU) 1101, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1102 or a program loaded from a storage section 1108 into a Random Access Memory (RAM) 1103. Various programs and data necessary for system operation are also stored in the RAM 1103. The CPU 1101, the ROM 1102 and the RAM 1103 are connected to each other by a bus 1104. An Input/Output (I/O) interface 1105 is also connected to the bus 1104.
The following components are connected to the I/O interface 1105: an input portion 1106 including a keyboard, mouse, and the like; an output section 1107 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, a speaker, and the like; a storage section 1108 including a hard disk and the like; and a communication section 1109 including a network interface card such as a LAN (Local area network) card, a modem, or the like. The communication section 1109 performs communication processing via a network such as the internet. A driver 1110 is also connected to the I/O interface 1105 as necessary. A removable medium 1111, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 1110 as necessary, so that a computer program read out therefrom is mounted into the storage section 1108 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 1109 and/or installed from the removable medium 1111. When the computer program is executed by a Central Processing Unit (CPU)1101, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with a computer program embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of the units do not, in any case, constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
clustering images in the image data set based on color channel parameters of the images in the image data set to obtain a plurality of image cluster sets;
selecting a target image cluster set with the largest number of images from the plurality of image cluster sets, and calculating an average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in the target image cluster set;
and calibrating the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set to obtain the calibrated image to be detected.
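The steps of claim 1 — clustering images by their color channel parameters, selecting the cluster set with the most images, and averaging its color channel parameters — can be sketched as follows. The claims do not fix a particular clustering algorithm, so the two-cluster k-means, the NumPy image representation, and all function names here are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def mean_color(image):
    # Average color channel parameters of one image: per-channel mean
    # over all of its pixels (as recited in claims 2 and 3).
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def largest_cluster_mean(images, iters=20):
    # Cluster the images by their per-image mean color with a plain
    # two-cluster k-means, pick the cluster holding the most images,
    # and return the average color channel parameters of that cluster.
    feats = np.stack([mean_color(img) for img in images])  # (n, channels)
    # Deterministic initialization: darkest and brightest image means.
    centers = feats[[feats.sum(axis=1).argmin(), feats.sum(axis=1).argmax()]]
    for _ in range(iters):
        dists = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(2):
            if (labels == j).any():
                centers[j] = feats[labels == j].mean(axis=0)
    target = np.bincount(labels, minlength=2).argmax()     # largest image cluster set
    return feats[labels == target].mean(axis=0)            # its average color channel parameter
```

In practice a library clustering routine (e.g. k-means with more clusters, or a density-based method) could replace this hand-rolled loop; the claim only requires that images be grouped by color channel parameters and that the largest group's average be taken.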
2. The method according to claim 1, wherein the clustering the images in the image data set based on the color channel parameter of each image in the image data set to obtain a plurality of image cluster sets comprises:
respectively calculating an average color channel parameter corresponding to each image in the image data set based on the color channel parameters corresponding to all pixels contained in each image in the image data set;
and clustering the images in the image data set based on the average color channel parameter corresponding to each image in the image data set to obtain a plurality of image cluster sets.
3. The method according to claim 2, wherein the calculating an average color channel parameter corresponding to each image in the image data set based on the color channel parameters corresponding to all pixels included in each image in the image data set comprises:
and respectively calculating the average color channel parameter of each image in the image data set under each color channel based on the color channel parameters of all pixels contained in each image in the image data set under each color channel.
4. The image processing method according to claim 1, wherein the calibrating the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set to obtain the calibrated image to be detected comprises:
determining average color channel parameters corresponding to the image to be detected based on the color channel parameters of all pixels contained in the image to be detected;
generating a calibration parameter based on the average color channel parameter of the images in the target image cluster set and the average color channel parameter corresponding to the image to be detected;
and based on the calibration parameters, calibrating the color channel parameters of all pixels contained in the image to be detected to obtain a calibrated image to be detected.
5. The image processing method according to claim 4, wherein the generating calibration parameters based on the average color channel parameters of the images in the target image cluster set and the average color channel parameters corresponding to the images to be detected comprises:
calculating the ratio of the average color channel parameter of the images in the target image cluster set to the average color channel parameter corresponding to the image to be detected;
based on the ratio, calibration parameters are generated.
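The calibration of claims 4 and 5 reduces to a per-channel gain: the ratio of the target cluster's average color channel parameters to those of the image to be detected, applied to every pixel. A minimal sketch, assuming 8-bit images stored as NumPy arrays (the function name and clipping to [0, 255] are illustrative choices, not stated in the claims):

```python
import numpy as np

def calibrate(image, target_avg):
    # Calibrate an image to be detected against the target cluster's
    # average color channel parameters (claims 4 and 5).
    image = image.astype(np.float64)
    img_avg = image.reshape(-1, image.shape[-1]).mean(axis=0)  # claim 4: per-image average
    gain = target_avg / img_avg                                # claim 5: ratio as calibration parameter
    return np.clip(image * gain, 0.0, 255.0)                   # claim 4: apply to every pixel
```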
6. The image processing method according to any one of claims 1 to 5, characterized by further comprising:
inputting the calibrated image to be detected into a pre-trained image defect positioning model;
and carrying out defect positioning processing on the calibrated image to be detected through the image defect positioning model, and outputting a defect positioning result.
7. The image processing method according to claim 6, further comprising:
calibrating the color channel parameters of the images in the image data set based on the average color channel parameters of the images in the target image cluster set to obtain a calibrated image data set;
performing image defect labeling processing on each image in the calibrated image data set to obtain a labeled image data set;
generating training sample data based on the labeled image data set;
training a machine learning model based on the generated training sample data to obtain the image defect positioning model.
8. The method according to claim 7, wherein generating training sample data based on the labeled image dataset comprises:
selecting a target image from the labeled image data set;
performing data enhancement processing on the target image to obtain a processed image;
and generating training sample data based on the processed image and the labeled image data set.
9. The image processing method of claim 8, wherein the selecting a target image from the labeled image data set comprises:
acquiring a data enhancement probability threshold;
assigning a random number to each image in the labeled image data set;
and determining an image whose assigned random number is less than or equal to the data enhancement probability threshold as the target image.
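Claim 9's selection rule — assign each labeled image a random number and keep those at or below the data-enhancement probability threshold — amounts to sampling each image independently with probability p. A sketch under that reading (the function name, seed parameter, and list representation of the data set are illustrative assumptions):

```python
import random

def select_targets(images, p, seed=0):
    # Assign each labeled image a random number in [0, 1) and determine
    # as targets those at or below the data enhancement probability
    # threshold p (claim 9).
    rng = random.Random(seed)
    return [img for img in images if rng.random() <= p]
```

Each selected target image would then undergo data enhancement (claim 8), and the processed images would be combined with the labeled data set to form the training samples.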
10. An image processing apparatus characterized by comprising:
the clustering unit is used for clustering the images in the image data set based on the color channel parameters of the images in the image data set to obtain a plurality of image clustering sets;
the calculation unit is used for selecting a target image cluster set with the largest number of images from the plurality of image cluster sets and calculating the average color channel parameter of the images in the target image cluster set based on the color channel parameter of each image in the target image cluster set;
and the first calibration unit is used for calibrating the color channel parameters of the image to be detected based on the average color channel parameters of the images in the target image cluster set to obtain the calibrated image to be detected.
CN202010733029.6A 2020-07-27 2020-07-27 Image processing method and device Pending CN111899239A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010733029.6A CN111899239A (en) 2020-07-27 2020-07-27 Image processing method and device


Publications (1)

Publication Number Publication Date
CN111899239A (en) 2020-11-06

Family

ID=73190214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010733029.6A Pending CN111899239A (en) 2020-07-27 2020-07-27 Image processing method and device

Country Status (1)

Country Link
CN (1) CN111899239A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114596593A (en) * 2022-05-10 2022-06-07 慧医谷中医药科技(天津)股份有限公司 Health-preserving data recommendation method and system based on image processing
CN114596593B (en) * 2022-05-10 2022-07-29 慧医谷中医药科技(天津)股份有限公司 Health-preserving data recommendation method and system based on image processing

Similar Documents

Publication Publication Date Title
Fu et al. Uncertainty inspired underwater image enhancement
CN108898086B (en) Video image processing method and device, computer readable medium and electronic equipment
CN108229526B (en) Network training method, network training device, image processing method, image processing device, storage medium and electronic equipment
JP6994588B2 (en) Face feature extraction model training method, face feature extraction method, equipment, equipment and storage medium
CN107679466B (en) Information output method and device
CN108229419B (en) Method and apparatus for clustering images
US20230021661A1 (en) Forgery detection of face image
CN107507153B (en) Image denoising method and device
Zhang et al. A new haze removal approach for sky/river alike scenes based on external and internal clues
CN108197618B (en) Method and device for generating human face detection model
US9697592B1 (en) Computational-complexity adaptive method and system for transferring low dynamic range image to high dynamic range image
CN110225366B (en) Video data processing and advertisement space determining method, device, medium and electronic equipment
CN109871845B (en) Certificate image extraction method and terminal equipment
CN109344752B (en) Method and apparatus for processing mouth image
CN111242097A (en) Face recognition method and device, computer readable medium and electronic equipment
CN110929780A (en) Video classification model construction method, video classification device, video classification equipment and media
CN113743607B (en) Training method of anomaly detection model, anomaly detection method and device
CN109389096B (en) Detection method and device
CN109377508B (en) Image processing method and device
CN108241855B (en) Image generation method and device
CN105225222A (en) To the automatic evaluation of the perception visual quality of different images collection
CN114612987A (en) Expression recognition method and device
CN113780365B (en) Sample generation method and device
CN114332993A (en) Face recognition method and device, electronic equipment and computer readable storage medium
CN111899239A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination