CN116310351A - Image processing method, device and storage medium - Google Patents


Info

Publication number
CN116310351A
Authority
CN
China
Prior art keywords
image
boundary
continuous area
area
search
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211640088.4A
Other languages
Chinese (zh)
Inventor
郑亮 (Zheng Liang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Publication of CN116310351A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/40: Extraction of image or video features
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/457: Local feature extraction by analysing connectivity, e.g. edge linking, connected component analysis or slices
    • G06V 10/56: Extraction of image or video features relating to colour
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image processing method, an image processing device, and a storage medium, wherein the method comprises the following steps: reading image data of an image to be processed; screening the image data for feature information that meets a preset condition to obtain a corresponding binarized image; processing the binarized image to obtain an integral graph corresponding to the image to be processed; and moving a search box in the integral graph to obtain boundary information of a continuous area of the integral graph. The method and device can rapidly and efficiently acquire statistical information about a continuous area, including its boundary, area, and equivalent center, are suitable for application scenarios with high real-time requirements, and can be widely applied in the field of image processing.

Description

Image processing method, device and storage medium
Technical Field
The application relates to the field of image processing, in particular to an image processing method, an image processing device and a storage medium.
Background
In image processing, image features often need to be computed, including the area of an image and how it changes, and the position of a continuous area in the image and how that position changes. Obtaining image features matters because technicians can analyze them to derive a range of valuable results. For example, when image processing is applied to face recognition, the image features of a face image include the area of the face, its moving speed and integrity, and changes in its position; by analyzing these features it can be determined whether the face can be correctly recognized. Computing image features is therefore of great importance in image processing.
To compute image features, the related art typically searches for the contours and discontinuous areas of an image with a clustering algorithm and derives the features from the result. This approach is slow, and its processing time grows long for larger images, so it is ill-suited to application scenarios with high real-time requirements.
Therefore, the technical problems existing in the related art are to be solved.
Disclosure of Invention
The present application aims to solve one of the technical problems in the related art. To this end, embodiments of the present application provide an efficient image processing method, device, and storage medium.
According to an aspect of embodiments of the present application, there is provided an image processing method, including:
reading image data of an image to be processed;
screening the image data for characteristic information meeting preset conditions to obtain a corresponding binarized image;
processing the binarized image to obtain an integral image corresponding to the image to be processed;
and moving a search box in the integral graph to obtain boundary information of a continuous area of the integral graph.
In one embodiment, moving a search box in the integral graph to obtain boundary information of a continuous area of the integral graph includes:
moving a search box in the integral graph, and acquiring statistical data of the pixel points in the search box that meet a first condition;
and determining the boundary of the continuous area of the target area according to the statistical data, wherein the boundary of the continuous area consists of a search box meeting a second condition in the statistical data.
In one embodiment, the meeting the first condition is at least one of:
the brightness value of the search box is in a preset brightness range;
or the color value of the search box is in a preset color range;
or the gray value of the search box is in a preset gray range.
In one embodiment, the meeting the second condition includes that the statistical data is within a preset range or the statistical data is greater than a threshold, and the determining the boundary of the continuous area according to the statistical data includes:
acquiring the preset range;
if the statistical data is in the preset range, determining the boundary of a continuous area, wherein the boundary of the continuous area consists of a plurality of search boxes of the statistical data in the preset range;
alternatively, the threshold value is obtained;
and if the statistical data is larger than the threshold value, determining the boundary of the continuous area, wherein the boundary of the continuous area consists of a plurality of search boxes with the statistical data larger than the threshold value.
In one embodiment, the obtaining the threshold value includes:
determining the area of the search frame according to the coordinate information of the search frame;
and determining the product of the area of the search box and a preset coefficient as the threshold value.
In one embodiment, determining the boundary of the continuous region from the statistical data comprises:
acquiring a plurality of search boxes of which the statistical data accords with a second condition;
and arranging a plurality of search boxes of which the statistical data accords with a second condition according to a preset sequence, and generating the boundary of the continuous area according to the arranged result.
In one embodiment, the moving the search box in the integral graph includes:
acquiring the central position of the integral graph;
and moving the search frame from the central position to the edge of the integral graph according to a preset step length.
In one embodiment, the processing the binarized image to obtain an integral image corresponding to the image to be processed includes:
acquiring an image to be processed;
preprocessing the image to be processed, wherein the preprocessing comprises screening characteristic information meeting preset conditions, denoising and binarization;
and integrating the preprocessed image to obtain the integral graph.
In one embodiment, after obtaining the boundary information of the continuous area of the integral graph, the method further includes:
and obtaining the image characteristics of the continuous area according to the boundary of the continuous area.
In one embodiment, the image features of the continuous area include an area of the continuous area, and the obtaining the image features of the continuous area according to the boundary of the continuous area includes:
acquiring the total area and the total number of all search boxes of which the statistical data meet a second condition according to the coordinates of the boundary of the continuous area;
the product of the total area and the total number is taken as the area of the continuous region.
In one embodiment, the image features of the continuous region include an equivalent center or a symmetry center of the continuous region, and the calculating the image features of the continuous region according to coordinates of a boundary of the continuous region includes:
and calculating the equivalent center or the symmetry center from the coordinates of the boundary of the continuous area.
According to an aspect of an embodiment of the present application, there is provided an image processing apparatus including:
at least one processor;
at least one memory for storing at least one program;
The image processing method as described in the preceding embodiments is implemented when the at least one program is executed by the at least one processor.
According to an aspect of the embodiments of the present application, there is provided a storage medium storing a program executable by a processor, which when executed by the processor, implements the image processing method as described in the previous embodiments.
The image processing method and device provided by the embodiment of the application have the beneficial effects that: the method comprises the steps of reading image data of an image to be processed; performing binarization processing on the image data to obtain a corresponding binarized image; processing the binarized image to obtain an integral image corresponding to the image to be processed; and moving the search box in the integral graph to obtain the boundary information of the continuous area of the integral graph. The method for moving the search boundary on the integral graph by the search box can quickly and efficiently acquire the image characteristics, and is suitable for application scenes with high real-time requirements.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of an implementation environment of an image processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an implementation environment of another image processing method according to an embodiment of the present application;
fig. 3 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 4 is a flow chart of determining continuous region boundaries according to a second condition provided in an embodiment of the present application;
FIG. 5 is a flowchart of determining continuous region boundaries according to statistics provided by an embodiment of the present application;
FIG. 6 is a flowchart of acquiring an integral map of an image to be processed according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
To make the solution of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of the present application.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims and drawings are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
First, description and explanation are made on related noun terms involved in the embodiments of the present application:
Image features: mainly the color features, texture features, shape features, and spatial relationship features of an image.
The color feature is a global feature describing the surface properties of the scene to which an image or image region corresponds; the texture feature is also a global feature describing the same surface properties. Shape features have two representation methods: contour features and region features; the contour features of an image mainly concern the outer boundary of an object, while the region features relate to the entire shape region. The spatial relationship feature refers to the mutual spatial positions or relative directional relationships between the objects segmented from an image; these relationships can be classified into connection/adjacency relationships, overlap relationships, inclusion/containment relationships, and the like.
Integral graph (integral image): the value at any coordinate in the integral graph is the sum of the pixel values of all points in the region extending from the upper-left corner of the original image to that point. The integral graph therefore makes it convenient to compute sums over the region between any pair of coordinates.
Binarization: binarization sets the gray value of each pixel in an image to 0 or 255, so that the whole image presents a purely black-and-white visual effect. It is one of the simplest methods of image segmentation. A grayscale image can be binarized by setting every pixel whose gray level exceeds a critical (threshold) value to the maximum gray level and every pixel below it to the minimum. Depending on how the threshold is selected, binarization algorithms are divided into fixed-threshold and adaptive-threshold methods; common methods include the bimodal method, the P-parameter method, the iterative method, and OTSU.
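As a minimal plain-Python sketch of the fixed-threshold variant described above (the function name and the threshold value 128 are illustrative; real pipelines would typically use NumPy or OpenCV):

```python
def binarize(gray, thresh=128):
    # Pixels at or above the threshold become 255 (white); the rest become 0 (black).
    return [[255 if px >= thresh else 0 for px in row] for row in gray]

gray = [[10, 200, 130],
        [90, 128, 255]]
print(binarize(gray))  # [[0, 255, 255], [0, 255, 255]]
```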
Image Segmentation (Segmentation): image segmentation refers to the process of subdividing a digital image into a plurality of image sub-regions (sets of pixels). The purpose of image segmentation is to simplify or alter the representation of the image so that the image is easier to understand and analyze. Image segmentation is typically used to locate objects and boundaries (lines, curves, etc.) in an image. More precisely, image segmentation is a process of labeling each pixel in an image, which causes pixels with the same label to have some common visual property. The result of image segmentation is a set of sub-regions on the image (the totality of these sub-regions covering the whole image) or a set of contour lines extracted from the image (e.g. edge detection). Each pixel in a sub-region is similar under a measure of a characteristic or a calculated characteristic, such as color, brightness, texture. The contiguous areas differ greatly under the measure of a certain characteristic.
The schemes of the embodiments of the present application mainly relate to technologies such as image feature extraction in the process of image processing by using a computer, and specifically are described by the following embodiments:
In image processing, image features often need to be computed, including the area of an image and how it changes, and the position of a continuous area in the image and how that position changes. Obtaining image features matters because technicians can analyze them to derive a range of valuable results. For example, when image processing is applied to face recognition, the image features of a face image include the area of the face, its moving speed and integrity, and changes in its position; by analyzing these features it can be determined whether the face can be correctly recognized. Computing image features is therefore of great importance in image processing.
To compute image features, the related art typically searches for the contours and discontinuous areas of an image with a clustering algorithm and derives the features from the result. This approach is slow, and its processing time grows long for larger images, so it is ill-suited to application scenarios with high real-time requirements.
Therefore, an embodiment of the present application provides an image processing method in which a search box searches for boundaries in an integral graph according to a designed search strategy; it can acquire image features quickly and efficiently and is suitable for application scenarios with high real-time requirements, including face recognition. The method can be applied in a terminal, in a server, or in an implementation environment composed of a terminal and a server. In addition, the method may run as software in the terminal or the server, such as an application program with an image feature extraction function. The terminal may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, or a smart watch. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms.
Fig. 1 and fig. 2 are an implementation environment of an image processing method provided in an embodiment of the present application. The two implementation environments include different devices, and the functions of each device may also be different, so that the steps performed by each device in the image processing method may be different. The embodiment of the present application is not limited to what kind of implementation environment is specifically.
In one possible implementation, referring to fig. 1, the implementation environment may include a computer device 101 that has both an image capture function and an image processing function. The computer device 101 may capture an image of an object (such as a person or an environment), referred to as the image to be processed for short, and process the region to be processed in the image to extract image features. In fig. 1, the smartphone 1011, tablet 1012, and desktop 1013 are all exemplary terminals, and 1014 is a server.
In another possible implementation, referring to fig. 2, the implementation environment may include a computer device 101 and an image acquisition device 102: the image acquisition device 102 has an image acquisition function, and the computer device 101 has an image processing function. The image acquisition device 102 may capture an image of an object (the image to be processed) and send it to the computer device 101, which processes the region to be processed in the image to extract image features. In fig. 2, the smartphone 1011, tablet 1012, and desktop 1013 are all exemplary terminals, and 1014 is a server.
The image processing method can be applied in various scenarios. For example, in a face-recognition door lock, an image of an object can be acquired, a frame (i.e., a face frame) is obtained from the acquired image by the image processing method, image features (i.e., face image features) are obtained from the frame, and the obtained face image features are compared with those in a database to determine whether they meet the requirements: if they meet the door-opening requirements, the door can be opened; if not, the door-opening logic is not executed. Similarly, in a face-recognition payment scenario, a face frame and face image features can be obtained in the same way and compared with the database: if the features meet the payment requirements, the payment logic can be executed; if not, a payment failure can be returned. Of course, the method can also be applied in other scenarios requiring image feature extraction, such as attendance checking, smart communities, and smart retail, which are not listed one by one here.
Those skilled in the art will appreciate that the computer device 101 may be a terminal or a server, which is not limited in this embodiment of the present application.
Fig. 3 is a flowchart of an image processing method according to an embodiment of the present application, where the method can be applied to the computer device 101 shown in fig. 1 or fig. 2. Referring to fig. 3, the method may include the following steps S301, S302, S303, and S304:
s301, reading image data of an image to be processed.
Specifically, the image to be processed refers to an image acquired from a source such as an image capture device or an image database.
S302, screening the image data for characteristic information meeting preset conditions to obtain a corresponding binarized image.
Specifically, screening the image data comprises filtering out the feature information that meets the preset condition, then setting pixels that meet the preset condition to 255 and pixels that do not to 0, thereby obtaining the binarized image. The preset condition may be a certain range of color values or a certain range of gray values, which is not specifically limited in the embodiments of the present application.
For example, in this embodiment HSV color values are used as the preset condition to screen pixels, with the qualifying HSV range running from (h0, s0, v0) to (h1, s1, v1). The range boundaries are:

thresh_low = (h0, s0, v0), thresh_high = (h1, s1, v1)

where thresh_low is the lower boundary of the HSV range and thresh_high is the upper boundary; together, thresh_low and thresh_high define the range of qualifying HSV values, and a pixel meets the preset condition when its HSV value lies between them.
Pixels that meet the preset condition are then set to 255 and pixels that do not are set to 0, according to:

Binary(i, j) = 255, if thresh_low ≤ HSV(i, j) ≤ thresh_high; Binary(i, j) = 0, otherwise

where Binary(i, j) is the gray value of pixel (i, j). After every pixel has been set to 255 or 0, the binarized image of the image to be processed is obtained, which facilitates subsequent calculation and processing.
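The screening and binarization steps can be sketched as follows in plain Python (the function names and the example HSV bounds are illustrative, not values from the application):

```python
def in_hsv_range(px, lo, hi):
    # True when every channel of the (h, s, v) pixel lies within [lo, hi].
    return all(l <= c <= h for c, l, h in zip(px, lo, hi))

def screen_to_binary(hsv_img, lo, hi):
    # Binary(i, j) = 255 when the pixel meets the preset HSV condition, else 0.
    return [[255 if in_hsv_range(px, lo, hi) else 0 for px in row] for row in hsv_img]

thresh_low, thresh_high = (35, 43, 46), (77, 255, 255)   # illustrative bounds
img = [[(50, 100, 100), (0, 0, 0)],
       [(60, 200, 50), (90, 100, 100)]]
print(screen_to_binary(img, thresh_low, thresh_high))  # [[255, 0], [255, 0]]
```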
S303, processing the binarized image to obtain an integral graph corresponding to the image to be processed.
It should be noted that processing the binarized image to obtain the integral graph corresponding to the image to be processed may specifically include computing an integral value for each pixel point in the image: taking the upper-left pixel of the image as the upper-left vertex, the integral value of a pixel is the sum of all elements contained in the rectangle that has that pixel as its lower-right vertex. For example, the integral value at coordinate (x, y) is the sum of all pixels in the rectangular area of length x and width y ending at (and including) that point.
Alternatively, when computing the integral values, it is not necessary to recompute the sum of all element values in the rectangular area for every pixel; the integral values of adjacent points can be reused for fast computation. For example, the integral value at point (x, y) can be obtained by adding the integral values at points (x-1, y) and (x, y-1), subtracting the integral value at point (x-1, y-1), which is the overlapping area counted twice, and finally adding the pixel value at point (x, y). In this way the integral value of every pixel in the image to be processed can be obtained.
Specifically, the formula for obtaining the integral graph of the image to be processed in step S303 is:

Inter(i, j) = sum over x = 0..i and y = 0..j of Binary(x, y), for 0 ≤ i < rows, 0 ≤ j < cols

where Inter(i, j) is the integral value corresponding to pixel (i, j) in the image to be processed, Binary(i, j) is the gray value of pixel (i, j), rows is the number of rows of the image to be processed, and cols is the number of columns.
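A direct plain-Python rendering of this formula, using the incremental recurrence described above so that each integral value is computed in constant time (the function name is illustrative):

```python
def integral_image(binary):
    rows, cols = len(binary), len(binary[0])
    inter = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            # Inter(i,j) = Binary(i,j) + Inter(i-1,j) + Inter(i,j-1) - Inter(i-1,j-1)
            inter[i][j] = (binary[i][j]
                           + (inter[i - 1][j] if i > 0 else 0)
                           + (inter[i][j - 1] if j > 0 else 0)
                           - (inter[i - 1][j - 1] if i > 0 and j > 0 else 0))
    return inter

print(integral_image([[1, 2], [3, 4]]))  # [[1, 3], [4, 10]]
```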
S304, moving the search frame in the integral graph to obtain boundary information of a continuous area of the integral graph.
In this embodiment, the search box is a closed box with a certain shape, such as a rectangle, a triangle, or a circle. Those skilled in the art will appreciate that for images of different sizes and shapes, search boxes of different shapes can be used to meet different requirements on search efficiency and accuracy, so the specific shape of the search box is not limited in the embodiments of the present application.
It should be noted that a continuous area in this application is a closed, connected region of regular or irregular shape that generally has the same numerical distribution.
Specifically, moving the search box in the integral graph in step S304 includes: acquiring the center position of the integral graph and moving the search box from the center position toward the edge of the integral graph by a preset step length. The center position of the integral graph is usually located at its coordinate origin; in that case, the search box is moved from the coordinate origin toward the edge by the preset step length. When the center position is not at the coordinate origin, a relative center position can be selected as the starting point, and the search box is moved from that starting point toward the edge by the preset step length. Moving the search box from the center of the integral graph toward its edge improves the efficiency of the boundary search; compared with a randomly chosen starting point, this approach reduces the chance that the search box searches areas outside the integral graph and reduces wasted running time.
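One way to realize this center-outward traversal is to enumerate candidate top-left corners on a grid of the preset step length and visit them in order of increasing distance from the image center. This is a sketch under assumed names and an assumed distance measure (Chebyshev), not the application's prescribed search order:

```python
def search_positions(width, height, box_w, box_h, step):
    # Candidate top-left corners for the search box, ordered so that boxes whose
    # centers are nearest the image center are visited first.
    cx, cy = width / 2, height / 2
    positions = [(x, y)
                 for x in range(0, width - box_w + 1, step)
                 for y in range(0, height - box_h + 1, step)]
    positions.sort(key=lambda p: max(abs(p[0] + box_w / 2 - cx),
                                     abs(p[1] + box_h / 2 - cy)))
    return positions

# On a 10x10 integral graph with a 2x2 box and step 2, the search starts at the center.
print(search_positions(10, 10, 2, 2, 2)[0])  # (4, 4)
```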
In this embodiment, the boundary of the continuous area consists of the search boxes whose statistical data meet the second condition, and the continuous area is the area of the integral map from which the image features are to be acquired.
After obtaining the statistics (for example, the count) of the pixels in the search box that satisfy the first condition as described in the foregoing embodiment, it is further necessary to judge whether those statistics satisfy the second condition (for example, the count exceeds a preset threshold); if so, the search box is taken as part of the boundary of the continuous area. By continuing to move and search, several or all (the number can be chosen according to actual needs) of the search boxes forming parts of the boundary can be found, and these search boxes together constitute the boundary of the continuous area.
In this embodiment, the boundary of the continuous area may be described by the set of search boxes that form it; the set may record data such as the vertex coordinates, area, and center-position coordinates of each search box whose statistical data meet the second condition. This provides a data basis for the subsequent step of computing the image features of the continuous area.
It should be noted that, in this embodiment, the search box is moved from the center position toward the edges of the integral map by a preset step, where the preset step is a distance, set in advance by a technician, that the search box moves per step. For example, a technician may set different preset steps according to the specific size and type of the image to be processed and of its integral map; once the per-step distance is set to step, the search box moves by that distance each time. As can be expected, different preset steps affect the search accuracy and efficiency of the search box in this embodiment, so those skilled in the art may choose an appropriate preset step according to the specific size and type of the image to be processed and its integral map; the embodiments of the present application do not limit the specific value of the preset step.
Specifically, in this embodiment, statistical data of the pixels in the search box that meet a first condition are acquired, where the first condition is at least one of the following: the brightness value of the search box is within a preset brightness range; the color value of the search box is within a preset color range; or the gray value of the search box is within a preset gray range.
In this embodiment, the brightness value of the search box refers to the brightness values of the region of the integral map inside the search box (i.e., the region overlapped by the search box), and likewise for the color value and the gray value of the search box. Since the search box moves, its brightness, color, or gray values change dynamically with the movement. In addition, the pixels meeting the first condition may be screened by other parameters (such as image texture), which are not enumerated in the embodiments of the present application.
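The screening by the first condition can be illustrated with a minimal sketch, using a gray-value range as the example condition. The function name and image representation are hypothetical, not from this application:

```python
# Illustrative sketch (not the literal code of this application): count the
# pixels inside a search box whose gray value falls within a preset range,
# one possible form of the "first condition". `img` is a row-major list of rows.
def count_in_range(img, box, lo, hi):
    x0, y0, x1, y1 = box  # top-left (inclusive) to bottom-right (exclusive)
    cnt = 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            if lo <= img[y][x] <= hi:
                cnt += 1
    return cnt
```

The returned count is the kind of statistical datum that is later compared against the second condition's threshold.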
It should be noted that the statistical data in this embodiment may specifically include the number of pixels in the search box that meet the first condition.
In this embodiment, obtaining the image features of the continuous area from the boundary of the continuous area may specifically include: acquiring the coordinates of the boundary of the continuous area, and calculating the image features of the continuous area from those coordinates.
Specifically, the coordinates of the boundary of the continuous area in this embodiment include the center position, the non-zero area, and the vertex positions of each search box, where the center position of the search box is calculated as follows:
cenxy_{i,j}(x, y): x = i · step, y = j · step
where cenxy_{i,j}(x, y) is the center-position coordinate of the search box, x is the abscissa of the center position, y is the ordinate of the center position, and step is the preset step of the search-box movement.
In particular, when the search box is rectangular, the four vertex positions of the search box are calculated as follows:
cor_{i,j}^0(x, y) = cenxy_{i,j}(x - step/2, y - step/2)
cor_{i,j}^1(x, y) = cenxy_{i,j}(x - step/2, y + step/2)
cor_{i,j}^2(x, y) = cenxy_{i,j}(x + step/2, y - step/2)
cor_{i,j}^3(x, y) = cenxy_{i,j}(x + step/2, y + step/2)
where cor_{i,j}^0(x, y), cor_{i,j}^1(x, y), cor_{i,j}^2(x, y), and cor_{i,j}^3(x, y) are the coordinates of the first, second, third, and fourth vertices of the search box, respectively.
The non-zero area of the search box is calculated as follows:
s_{i,j} = Inter(cor_{i,j}^3) + Inter(cor_{i,j}^0) - Inter(cor_{i,j}^1) - Inter(cor_{i,j}^2)
where Inter(cor_{i,j}^0) is the integral value at the first vertex of the search box, Inter(cor_{i,j}^1) the integral value at the second vertex, Inter(cor_{i,j}^2) the integral value at the third vertex, Inter(cor_{i,j}^3) the integral value at the fourth vertex, and s_{i,j} is the non-zero area of the search box.
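The non-zero-area formula is the standard four-lookup identity of an integral image: the sum inside any box needs only four corner lookups, regardless of the box size. A minimal sketch, assuming a zero-padded convention (I[y][x] holds the sum of pixels in the rectangle [0, x) × [0, y)); names are illustrative:

```python
# Four-corner identity on a zero-padded integral image I:
# s = Inter(cor3) + Inter(cor0) - Inter(cor1) - Inter(cor2)
def box_sum(I, x0, y0, x1, y1):
    return I[y1][x1] + I[y0][x0] - I[y0][x1] - I[y1][x0]
```

For example, for the 2×2 image [[1, 2], [3, 4]] the padded integral image is [[0, 0, 0], [0, 1, 3], [0, 4, 10]], and any sub-rectangle's sum falls out of the four lookups above.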
It should be noted that the search box may take shapes other than a rectangle, in which case the vertex calculation formulas are not exactly the same as those given above for a rectangular search box; they are not enumerated in this embodiment.
Further, the image features of the continuous area in this embodiment include the area of the continuous area, and calculating the image features of the continuous area from the coordinates of its boundary includes: acquiring, from the coordinates of the boundary of the continuous area, the area of a single search box and the total number of search boxes whose statistical data meet the second condition; and taking the product of that area and that number as the area of the continuous area. Illustratively, when the search box is rectangular, the area of the continuous area is calculated as follows:
S(L) = blk_w · blk_h · cnt
where S(L) is the area of the continuous area, blk_w is the length of the search box, blk_h is the width of the search box, and cnt is the number of search boxes constituting the boundary of the continuous area.
In another possible implementation, the image features of the continuous area in this embodiment may also include an equivalent center or a symmetry center of the continuous area, and calculating the image features of the continuous area from the coordinates of its boundary includes: calculating the equivalent center or the symmetry center from the coordinates of the boundary of the continuous area.
Illustratively, the specific calculation formula of the equivalent center includes:
cenx = Σ x · Binry(x, y) / Σ Binry(x, y)
ceny = Σ y · Binry(x, y) / Σ Binry(x, y)
where cenx is the abscissa of the equivalent center of the continuous area, ceny is the ordinate of the equivalent center of the continuous area, and ΣBinry(x, y) is the sum of the gray values of the pixels in the continuous area.
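Read as a gray-value-weighted centroid — one plausible interpretation of the equivalent-center formula, since the original expression is only partially recoverable — the calculation could be sketched as follows (illustrative names; not the literal code of this application):

```python
# Interpretive sketch: equivalent center as the gray-value-weighted centroid
# of the region. `img` is a row-major list of rows of gray values (zeros
# outside the region), so the sums run over the whole continuous area.
def equivalent_center(img):
    sx = sy = s = 0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            sx += x * v   # accumulate x weighted by gray value
            sy += y * v   # accumulate y weighted by gray value
            s += v        # total gray value of the region
    return sx / s, sy / s
```

For a binary region (all values 0 or 1) this reduces to the geometric centroid of the region's pixels.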
It should be noted that the mathematical formulas provided in this embodiment hold under specific conditions and are merely exemplary; they are not the only way to calculate the data items in this embodiment and do not constitute an undue limitation on the calculation method.
In summary, the present application acquires an integral map of an image to be processed; moves a search box in the integral map and acquires statistical data of the pixels in the search box that meet a first condition; determines the boundary of a continuous area from the statistical data, where the boundary consists of the search boxes whose statistical data meet a second condition and the continuous area is the area of the integral map from which image features are to be acquired; and obtains the image features of the continuous area from its boundary. By searching for the boundary in the integral map with a search box moved according to the designed search method, and exploiting the fact that an integral map makes the area between any coordinates easy to compute, image features can be acquired quickly and efficiently, which suits application scenarios with high real-time requirements such as face recognition.
Fig. 4 is a flowchart of determining the boundary of a continuous area according to a second condition, provided in an embodiment of the present application. Referring to fig. 4, meeting the second condition in this embodiment includes the statistical data being within a preset range or the statistical data being greater than a threshold, and determining the boundary of the continuous area from the statistical data includes:
S401, acquiring a threshold value.
In step S401, acquiring the threshold includes: determining the area of the search box from the coordinate information of the search box; and determining the product of the area of the search box and a preset coefficient as the threshold. Specifically, an increment parameter ck may be set, with ck ∈ (0, 1); the threshold blkTh is then calculated as:
blkTh = blk_w · blk_h · ck
where blk_w is the length of the search box, blk_h is the width of the search box, and ck is the increment parameter.
S402, if the statistical data are greater than the threshold, determining the boundary of the continuous area.
As described above, after the statistics (such as the count) of the pixels meeting the first condition are obtained, it is judged whether the statistics of the search box meet the second condition (here, the count exceeding the threshold); if so, the search box is taken as part of the boundary of the continuous area. By continuing to move and search, the search boxes forming parts of the boundary can be found and combined into the boundary of the continuous area.
In another possible implementation, meeting the second condition in this embodiment may also include the statistical data being within a preset range; correspondingly, determining the boundary of the continuous area from the statistical data includes: acquiring the preset range; and if the statistical data are within the preset range, determining the boundary of the continuous area.
Fig. 5 is a flowchart of determining continuous region boundaries according to statistical data provided in an embodiment of the present application. Referring to fig. 5, determining the boundary of the continuous area according to the statistical data in this embodiment includes:
S501, acquiring a plurality of search boxes whose statistical data meet the second condition.
S502, arranging the search boxes whose statistical data meet the second condition in a preset order, and generating the boundary of the continuous area from the arrangement result.
In this embodiment, the search boxes whose statistical data meet the second condition are the boxes that contain the boundary of the continuous area, so they need to be combined to form that boundary. The search boxes forming the boundary may include all of the boxes whose statistical data meet the second condition, or only some of them. When the accuracy requirement is low and the running and computing time is short, so that a result is needed quickly, the boundary of the continuous area may be formed from only some of the qualifying search boxes; when the accuracy requirement is higher and more running and computing time is available, the boundary may be formed from all qualifying search boxes, yielding a clearer and more accurate result. The subset of qualifying boxes may be chosen according to actual needs; for example, based on prior knowledge, a number of boxes (e.g., 10 or 100) meeting the second condition may be selected as part of the boundary of the continuous area. As another example, when the continuous area is rectangular, the 4 qualifying search boxes at its four vertices may be selected to form the boundary of the continuous area.
As another example, when the continuous area is rectangular, the 4 qualifying search boxes at its four vertices and the 4 qualifying search boxes at the midpoints of the 4 edges defined by those vertices, 8 search boxes in total, may be selected to form the boundary of the continuous area.
Optionally, after obtaining the boundary of the continuous area, the method further includes a step of determining the continuous area, specifically:
the continuous regions are denoted lable(L), meaning that L continuous regions exist in the binary image, L ≥ 0, and a block blk(m, n) may belong to lable(0), lable(1), ..., lable(L).
First, the blocks blk(m, n) are processed row by row, judging whether two horizontally adjacent blocks blk(m, n) and blk(m+1, n) belong to the same continuous region; if the following condition is satisfied:

S_{m+1,n} - S_{m,n} ≥ threshold
then blk(m, n) and blk(m+1, n) belong to the kth continuous region, labeled:

blk(m, n), blk(m+1, n) ∈ lable(k), k ≤ L
Then the blocks blk(m, n) are processed column by column, sorted by the coordinate m, judging whether two vertically adjacent blocks blk(m, n) and blk(m, n+1) belong to the same continuous region; if the following condition is satisfied:

S_{m,n+1} - S_{m,n} ≥ threshold
then blk(m, n) and blk(m, n+1) belong to the same continuous region, labeled:

blk(m, n), blk(m, n+1) ∈ lable(k1), k1 ≤ L
After the traversal, if blk(m, n) belongs to both lable(k1) and lable(k), then k1 = k, that is, lable(k1) and lable(k) are the same continuous region. In this way, the continuous region to which each blk(m, n) belongs is determined. The maximum and minimum positions of the coordinates (m, n) within lable(k) are the boundary of lable(k).
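The row-by-row and column-by-column merging above can be read as a union-find over the grid of qualifying search boxes. The following is an interpretive sketch, not the literal procedure of this filing: it uses a simple boolean adjacency test in place of the S-difference condition, and all names are illustrative:

```python
# Interpretive sketch: merge adjacent qualifying boxes into continuous
# regions with union-find. grid[n][m] is True when box (m, n) meets the
# second condition.
def label_regions(grid):
    h, w = len(grid), len(grid[0])
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for n in range(h):
        for m in range(w):
            if grid[n][m]:
                parent[(m, n)] = (m, n)
    for n in range(h):              # row-by-row pass: merge horizontal neighbours
        for m in range(w - 1):
            if grid[n][m] and grid[n][m + 1]:
                union((m, n), (m + 1, n))
    for m in range(w):              # column-by-column pass: merge vertical neighbours
        for n in range(h - 1):
            if grid[n][m] and grid[n + 1][m]:
                union((m, n), (m, n + 1))
    regions = {}
    for key in parent:              # group boxes by their region root
        regions.setdefault(find(key), []).append(key)
    return list(regions.values())
```

The min/max coordinates within each returned group then give that region's boundary, as in the final step above.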
Fig. 6 is a flowchart of acquiring an integral chart of an image to be processed according to an embodiment of the present application. Referring to fig. 6, the acquiring an integral map of an image to be processed in this embodiment includes:
S601, acquiring an image to be processed.
S602, preprocessing the image to be processed, wherein the preprocessing comprises denoising and binarization.
S603, integrating the preprocessed image to obtain the integral graph.
When calculating the integral value of each pixel in the image to be processed, the integral values of all pixels could be computed one by one, summing all element values in the rectangle anew for each pixel; however, using the integral values of neighboring points enables fast computation. For example, the integral value at point (x, y) can be obtained by adding the integral values at (x-1, y) and (x, y-1), subtracting the overlapping area, i.e., the integral value at (x-1, y-1), and finally adding the pixel value at (x, y). In this way, the integral value of every pixel of the image to be processed can be obtained.
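The neighbor-based recurrence just described can be sketched directly. A zero-padded first row and column avoid border special cases; the function name and conventions are illustrative:

```python
# Sketch of the recurrence I(x, y) = I(x-1, y) + I(x, y-1) - I(x-1, y-1) + p(x, y),
# computed in a single pass with one row and one column of zero padding.
def integral_image(img):
    h, w = len(img), len(img[0])
    I = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(1, h + 1):
        for x in range(1, w + 1):
            I[y][x] = (I[y - 1][x] + I[y][x - 1]
                       - I[y - 1][x - 1]          # subtract the overlapping area
                       + img[y - 1][x - 1])       # add this pixel's value
    return I
```

Each pixel is touched once, so the whole integral map costs O(width × height), which is what makes the subsequent four-corner box sums cheap.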
In the embodiment of the present application, based on the foregoing image processing method, a method for preprocessing the image to be processed is provided, so that areas not meeting the feature requirements are effectively removed before the image features are acquired. Acquiring image features in the manner provided by this embodiment reduces the time and computation a computer needs to process an image and extract its features, allows image features to be acquired quickly and efficiently, and suits application scenarios with high real-time requirements such as face recognition.
When the image processing method provided by the present application is applied to a face recognition scenario, the image to be processed is a face image, the search box is rectangular, the continuous area is a face area, the image features are face image features, the first condition is that the brightness value of the search box is within a preset brightness range, and the second condition is that the statistical data are greater than a threshold. The face recognition method specifically includes the following steps:
Step one, obtaining an integral map of the face image to be processed;
Step two, moving a rectangular search box in the integral map, and acquiring statistical data of the pixels in the rectangular search box whose brightness values are within a preset brightness range;
Step three, determining the boundary of a face area from the statistical data, where the boundary of the face area consists of a plurality of rectangular search boxes whose statistical data are greater than a threshold, and the face area is the area of the integral map from which face image features are to be acquired;
Step four, obtaining the face image features of the face area from the boundary of the face area;
Step five, comparing the obtained face image features with the face image features in a database, and determining that face recognition succeeds when the similarity exceeds a preset proportion.
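Step five leaves the similarity measure unspecified. As a hedged sketch, cosine similarity against a database with a preset acceptance proportion could look like the following; the measure, function names, and data layout are all assumptions, not details of this application:

```python
# Interpretive sketch of step five only: compare a feature vector against a
# database by cosine similarity and accept when the best match exceeds a
# preset proportion. The application does not specify the similarity measure.
import math

def recognize(feat, database, threshold=0.9):
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    # database: list of (identity, feature_vector) pairs
    name, vec = max(database, key=lambda item: cos(feat, item[1]))
    return (name, cos(feat, vec) >= threshold)
```

The boolean in the returned pair corresponds to "face recognition is successful" in step five; any distance or similarity function with a matching acceptance rule would fit the same slot.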
In summary, the present application acquires an integral map of a face image to be processed; moves a rectangular search box in the integral map and acquires statistical data of the pixels in the box whose brightness values are within a preset brightness range; determines the boundary of the face area from the statistical data; obtains the face image features of the face area from that boundary; and compares the obtained face image features with those in a database, determining that face recognition succeeds when the similarity exceeds a preset proportion. By using a moving search box, the method provided by the embodiments of the present application raises the speed at which image features are acquired, adapts well to face recognition scenarios, and supplies face image feature data for recognition at a higher speed.
Referring to fig. 7, an embodiment of the present application further discloses a computer device, including:
at least one processor 701;
at least one memory 702 for storing at least one program;
the at least one program, when executed by the at least one processor 701, causes the at least one processor to implement the image processing method as shown in the previous embodiment.
For example, in practical application, an embodiment of the present application may be a computer device with an image processing function, including a processor, a memory, and an image capturing device, where the memory stores a computer program of the image processing method described in the foregoing embodiment, and the image capturing device captures an image and transmits the image to the processor or the memory, so as to implement the image processing method shown in the foregoing embodiment.
The methods shown in the foregoing embodiments are all applicable to this computer device embodiment; the functions implemented by this computer device embodiment are the same as those of the image processing method shown in the foregoing embodiments, and the advantages achieved are likewise the same.
The present embodiment also discloses a computer-readable storage medium in which a processor-executable program is stored, which when executed by a processor is for implementing the image processing method shown in the previous embodiment.
The methods described in the foregoing embodiments are all applicable to this storage medium; the functions implemented by this storage medium are the same as those of the image processing method described in the foregoing embodiments, and the advantages achieved are likewise the same.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of this application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Furthermore, while the present application is described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the functions and/or features may be integrated in a single physical device and/or software module or one or more of the functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Thus, those of ordinary skill in the art will be able to implement the present application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the application, which is to be defined by the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
Logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium may even be paper or other suitable medium upon which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques, well known in the art, may be used: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
In the description of the present specification, the descriptions of the terms "one embodiment/example," "another embodiment/example," or "certain embodiments/examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (13)

1. An image processing method, the method comprising:
reading image data of an image to be processed;
screening the image data for characteristic information meeting preset conditions to obtain a corresponding binarized image;
processing the binarized image to obtain an integral image corresponding to the image to be processed;
and moving a search box in the integral graph to obtain boundary information of a continuous area of the integral graph.
2. The image processing method according to claim 1, wherein moving a search box in the integral map to obtain boundary information of a continuous area of the integral map, comprises:
moving a search frame in the integral graph, and acquiring statistical data of pixel points meeting a first condition in the search frame;
and determining the boundary of the continuous area of the target area according to the statistical data, wherein the boundary of the continuous area consists of a search box meeting a second condition in the statistical data.
3. An image processing method according to claim 2, wherein said meeting a first condition is at least one of:
the brightness value of the search box is in a preset brightness range;
or the color value of the search box is in a preset color range;
or the gray value of the search frame is in a preset gray range.
4. An image processing method according to claim 2, wherein said meeting a second condition includes said statistical data being within a preset range or said statistical data being greater than a threshold, said determining a boundary of a continuous area based on said statistical data comprising:
acquiring the preset range;
if the statistical data is in the preset range, determining the boundary of a continuous area, wherein the boundary of the continuous area consists of a plurality of search boxes of the statistical data in the preset range;
alternatively, the threshold value is obtained;
and if the statistical data is larger than the threshold value, determining the boundary of the continuous area, wherein the boundary of the continuous area consists of a plurality of search boxes with the statistical data larger than the threshold value.
5. The method of image processing according to claim 4, wherein said obtaining said threshold value comprises:
determining the area of the search frame according to the coordinate information of the search frame;
and determining the product of the area of the search box and a preset coefficient as the threshold value.
6. An image processing method according to claim 2, wherein determining the boundary of the continuous area based on the statistical data comprises:
acquiring a plurality of search boxes of which the statistical data accords with a second condition;
and arranging a plurality of search boxes of which the statistical data accords with a second condition according to a preset sequence, and generating the boundary of the continuous area according to the arranged result.
7. The image processing method according to claim 1, wherein said moving a search box in said integral map comprises:
acquiring the center position of the integral map;
and moving the search box from the center position toward the edge of the integral map in a preset step length.
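The centre-outward scan of claim 7 can be sketched by generating candidate box positions on a grid with the preset step and sorting them by distance from the image centre. The distance metric is an illustrative choice, not specified by the claim.

```python
def scan_positions(width, height, step):
    """Yield search-box positions ordered by squared distance from the
    image centre, so the scan moves from the centre toward the edges
    (claim 7)."""
    cx, cy = width // 2, height // 2
    positions = [(x, y) for y in range(0, height, step)
                        for x in range(0, width, step)]
    positions.sort(key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    return positions
```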
8. The image processing method according to claim 1, wherein processing the image to be processed to obtain the integral map corresponding to the image to be processed comprises:
acquiring the image to be processed;
preprocessing the image to be processed, wherein the preprocessing comprises denoising and binarization;
and integrating the preprocessed image to obtain the integral map.
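The integral map of claim 8 is a standard summed-area table: each cell stores the sum of all pixels above and to its left, so the sum inside any search box costs four lookups instead of a nested loop. A minimal pure-Python sketch, assuming the preprocessed image is a list of rows of numbers:

```python
def to_integral(image):
    """Build a summed-area table with a one-cell zero border:
    integral[y+1][x+1] = sum of image[0..y][0..x]."""
    h, w = len(image), len(image[0])
    integral = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            integral[y + 1][x + 1] = (image[y][x] + integral[y][x + 1]
                                      + integral[y + 1][x] - integral[y][x])
    return integral

def box_sum(integral, x1, y1, x2, y2):
    """Sum of pixels in the box with inclusive corners (x1, y1)..(x2, y2)."""
    return (integral[y2 + 1][x2 + 1] - integral[y1][x2 + 1]
            - integral[y2 + 1][x1] + integral[y1][x1])
```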
9. The image processing method according to claim 1, wherein after the boundary of the continuous area of the integral map is obtained, the method further comprises:
and obtaining the image characteristics of the continuous area according to the boundary of the continuous area.
10. The image processing method according to claim 9, wherein the image features of the continuous area include the area of the continuous area, and obtaining the image features of the continuous area according to the boundary of the continuous area comprises:
acquiring the total area and the total number of all search boxes whose statistical data meet the second condition according to the coordinates of the boundary of the continuous area;
and taking the product of the total area and the total number as the area of the continuous area.
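As translated, claim 10 takes the product of the "total area" and the total number of qualifying boxes; a common reading is per-box area times box count, i.e. the area covered by the qualifying boxes. The sketch below assumes that reading, with equal-sized boxes — an interpretation, not the patent's definitive formula.

```python
def continuous_area(box_w, box_h, n_boxes):
    """Approximate the continuous area as per-box area x number of
    qualifying search boxes (one reading of claim 10)."""
    return box_w * box_h * n_boxes
```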
11. The image processing method according to claim 9, wherein the image features of the continuous area include an equivalent center or a center of symmetry of the continuous area, and obtaining the image features of the continuous area according to the boundary of the continuous area comprises:
calculating the equivalent center or the center of symmetry according to the coordinates of the boundary of the continuous area.
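Claim 11 leaves the formula unspecified; a simple assumption is to take the equivalent centre as the mean of the boundary coordinates, i.e. the centroid of the boundary points.

```python
def equivalent_center(boundary):
    """Mean of the boundary coordinates, used here as the equivalent
    centre (an assumption about claim 11's unspecified calculation)."""
    xs = [p[0] for p in boundary]
    ys = [p[1] for p in boundary]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```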
12. An image processing apparatus, characterized in that the apparatus comprises:
at least one processor;
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the image processing method according to any one of claims 1 to 11.
13. A storage medium storing a processor-executable program which, when executed by a processor, implements the image processing method according to any one of claims 1 to 11.
CN202211640088.4A 2021-12-20 2022-12-20 Image processing method, device and storage medium Pending CN116310351A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111564190 2021-12-20
CN2021115641906 2021-12-20

Publications (1)

Publication Number Publication Date
CN116310351A true CN116310351A (en) 2023-06-23

Family

ID=86811972

Legal Events

Date Code Title Description
PB01 Publication