CN111768410B - Image processing method and device - Google Patents


Info

Publication number
CN111768410B
CN111768410B (application CN201910427420.0A)
Authority
CN
China
Prior art keywords
image
preset
histogram
binary image
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910427420.0A
Other languages
Chinese (zh)
Other versions
CN111768410A (en)
Inventor
车广富
安山
陈宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wodong Tianjun Information Technology Co Ltd filed Critical Beijing Wodong Tianjun Information Technology Co Ltd
Priority to CN201910427420.0A
Publication of CN111768410A
Application granted
Publication of CN111768410B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses an image processing method and device. One embodiment of the method comprises the following steps: acquiring a gray-scale map of an image to be processed; for each pixel of the gray-scale map, determining the difference in gray scale between the pixel and a pixel that differs from it by a preset distance; binarizing the gray-scale map based on the differences to generate a binary image, and generating a histogram of the binary image; and determining, based on the histogram, at least one partial image containing an object from the image to be processed. The embodiment of the application can segment images of various sizes, and therefore has strong universality. In addition, the pixels at which edges are located can be accurately determined through the differences between pixels and preserved through binarization, so that more accurate partial images are determined.

Description

Image processing method and device
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to the technical field of Internet, and particularly relates to an image processing method and device.
Background
Image processing techniques are becoming more and more widely used. In existing image segmentation approaches, a neural network model is often required to recognize the semantics of the whole image.
Because neural-network-based semantic recognition occupies substantial computing resources, the processing efficiency of image segmentation using a neural network model is low. Moreover, neural network models impose strict size requirements on the input image, so images of unsuitable size require an additional preprocessing step.
Disclosure of Invention
The embodiment of the application provides an image processing method and device.
In a first aspect, an embodiment of the present application provides an image processing method, including: acquiring a gray level image of an image to be processed; for each pixel of the gray scale map, determining a difference in gray scale between the pixel and a pixel differing by a preset distance; binarizing the gray level map based on the difference value to generate a binary image and generating a histogram of the binary image; at least one partial image containing the object is determined from the images to be processed based on the histogram.
In some embodiments, generating a histogram of a binary image includes: generating a histogram of the binary image in a preset direction, wherein the histogram of the preset direction indicates the number of pixels of an object displayed by the binary image, which are distributed in each preset pixel area of the preset direction.
In some embodiments, generating a histogram of the binary image in a preset direction includes: generating a histogram of the binary image in a first preset direction, wherein the first preset direction is one of a transverse direction and a longitudinal direction, the histogram in the first preset direction is used for indicating the sum of gray values of the binary image in each first preset pixel area, the first preset pixel area is a pixel area formed by dividing the binary image along a second preset direction, and the second preset direction is the other of the transverse direction and the longitudinal direction, which is different from the first preset direction; and determining at least one partial image containing the object from the images to be processed based on the histogram, comprising: and dividing the binary image based on the position of a first preset pixel area with the sum of gray values not larger than a first preset threshold value or the position of a first preset pixel area with the gray value of each pixel not larger than a second preset threshold value in the histogram of the first preset direction, so as to obtain a candidate local image.
In some embodiments, generating a histogram of the binary image in a first preset direction includes: and generating a histogram of the binary image in the first preset direction in response to the side length of the binary image in the first preset direction being greater than the side length of the binary image in the second preset direction.
In some embodiments, determining at least one partial image containing the object from the images to be processed based on the histogram further comprises: generating a histogram of the candidate local image in a second preset direction; for the candidate partial image, determining the maximum gray value sum as a first numerical value in the gray value sum in each first preset pixel area; for the histogram of the second preset direction, determining the maximum gray value sum as a second value in the gray value sum of each second preset pixel area, wherein the second preset pixel area is a pixel area formed by dividing the candidate local image along the first preset direction; and determining the ratio of the first numerical value to the second numerical value, and if the ratio is in a preset ratio range, determining the corresponding region of the candidate partial image in the image to be processed as the partial image.
In some embodiments, the preset distance indicates a difference of at least two pixels.
In some embodiments, binarizing the gray scale map based on the difference value, generating a binary image, comprises: generating a differential image including the respective differences; and binarizing the differential image to obtain a binary image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: an acquisition unit configured to acquire a gray-scale image of an image to be processed; a determining unit configured to determine, for each pixel of the gray-scale image, a difference in gray-scale between the pixel and a pixel that differs by a preset distance; a generating unit configured to binarize the gray-scale image based on the difference value, generate a binary image, and generate a histogram of the binary image; and a segmentation unit configured to determine at least one partial image containing the object from the images to be processed based on the histogram.
In some embodiments, the generating unit comprises: the generation module is configured to generate a histogram of the binary image in a preset direction, wherein the histogram of the preset direction indicates the number of pixels of an object displayed by the binary image, which are distributed in each preset pixel area of the preset direction.
In some embodiments, the generation module is further configured to: generating a histogram of the binary image in a first preset direction, wherein the first preset direction is one of a transverse direction and a longitudinal direction, the histogram in the first preset direction is used for indicating the sum of gray values of the binary image in each first preset pixel area, the first preset pixel area is a pixel area formed by dividing the binary image along a second preset direction, and the second preset direction is the other of the transverse direction and the longitudinal direction, which is different from the first preset direction; and a dividing unit including: the segmentation module is configured to segment the binary image based on the position of a first preset pixel area with the sum of gray values not larger than a first preset threshold value or the position of a first preset pixel area with the gray value of each pixel not larger than a second preset threshold value in the histogram of the first preset direction, so as to obtain a candidate local image.
In some embodiments, the generation module is further configured to: and generating a histogram of the binary image in the first preset direction in response to the side length of the binary image in the first preset direction being greater than the side length of the binary image in the second preset direction.
In some embodiments, the partitioning unit further comprises: the histogram generation module is configured to generate a histogram of the candidate local image in a second preset direction; a first determining module configured to determine, for the candidate partial image, a sum of maximum gray values among sums of gray values in respective first preset pixel regions as a first numerical value; a second determining module configured to determine, for a histogram of a second preset direction, a sum of maximum gray values among sums of gray values in second preset pixel regions, the second preset pixel region being a pixel region formed by dividing the candidate partial image along the first preset direction; and the third determining module is configured to determine the ratio of the first value to the second value, and if the ratio is within a preset ratio range, determine the corresponding region of the candidate local image in the image to be processed as the local image.
In some embodiments, the preset distance indicates a difference of at least two pixels.
In some embodiments, the generating unit is further configured to: generating a differential image including the respective differences; and binarizing the differential image to obtain a binary image.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method as in any of the embodiments of the image processing method.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as in any of the embodiments of the image processing method.
According to the image processing scheme provided by the embodiment of the application, first, a gray-scale map of an image to be processed is acquired. Then, for each pixel of the gray-scale map, the difference in gray scale between the pixel and a pixel that differs from it by a preset distance is determined. Next, the gray-scale map is binarized based on the differences to generate a binary image, and a histogram of the binary image is generated. Finally, at least one partial image containing an object is determined from the image to be processed based on the histogram. The embodiment of the application can segment images of various sizes, and therefore has strong universality. In addition, the pixels at which edges are located can be accurately determined through the differences between pixels and preserved through binarization, so that more accurate partial images are determined.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings, in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2a is a flow chart of one embodiment of an image processing method according to the present application;
FIG. 2b is a schematic illustration of a difference image and a binary image according to the image processing method of the present application;
FIG. 3a is a flow chart of yet another embodiment of an image processing method according to the present application;
FIG. 3b is a schematic diagram of a histogram of an image processing method according to the present application;
FIG. 3c is a schematic diagram of yet another histogram of an image processing method according to the present application;
FIG. 4 is a schematic structural view of one embodiment of an image processing apparatus according to the present application;
fig. 5 is a schematic diagram of a computer system suitable for use in implementing embodiments of the present application.
Detailed Description
The present application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the image processing methods or image processing apparatuses of the present application may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as an image processing application, a video class application, a live application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smartphones, tablets, electronic book readers, laptop and desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the above-listed electronic devices. Which may be implemented as multiple software or software modules (e.g., multiple software or software modules for providing distributed services) or as a single software or software module. The present invention is not particularly limited herein.
The server 105 may be a server providing various services, such as a background server providing support for the terminal devices 101, 102, 103. The background server may analyze and process the received data such as the image to be processed, and feed back the processing result (for example, a partial image including the object) to the terminal device.
It should be noted that the image processing method provided in the embodiment of the present application may be executed by the server 105 or the terminal devices 101, 102, 103, and accordingly, the image processing apparatus may be provided in the server 105 or the terminal devices 101, 102, 103.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2a, a flow 200 of one embodiment of an image processing method according to the present application is shown. The image processing method comprises the following steps:
step 201, a gray scale image of an image to be processed is acquired.
In the present embodiment, the execution subject of the image processing method (e.g., the server or the terminal device shown in fig. 1) may acquire the gradation map of the image to be processed. The image to be processed is an image to be subjected to the segmentation processing. The gray scale map of the image to be processed may be, for example, obtained by performing gray scale conversion on the image to be processed. Typically, the image to be processed is a color map having three color channels, and the conversion process requires converting the color map to a gray scale map of one color channel.
The execution body may acquire the gray-scale map in various ways. Specifically, the executing body may directly acquire the gray-scale image of the image to be processed from the local or other electronic devices. Alternatively, the executing body may acquire the image to be processed first, and then perform gray-level conversion on the image to be processed to obtain a gray-level image of the image to be processed.
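As one possible sketch of this step, the gray-level conversion can be illustrated in Python. The BT.601 luminance weights and the helper name `to_grayscale` are illustrative assumptions; the embodiment does not prescribe a particular conversion.

```python
import numpy as np

def to_grayscale(image):
    """Convert an H x W x 3 color image to a single-channel gray-scale map.

    Uses the common ITU-R BT.601 luminance weights; these weights are an
    assumption, since the embodiment does not specify a conversion rule.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return np.rint(image[..., :3] @ weights).astype(np.uint8)

color = np.zeros((2, 2, 3), dtype=np.uint8)
color[0, 0] = [255, 255, 255]          # one white pixel, rest black
gray = to_grayscale(color)
```

The result is a one-channel gray-scale map with the same height and width as the input color map.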
Step 202, for each pixel of the gray scale map, determining a difference in gray scale between the pixel and a pixel that differs by a preset distance.
In this embodiment, the execution body may determine, for each pixel of the gray-scale map, the difference in gray scale between the pixel and a pixel that differs from it by a preset distance. The gray scale here is the pixel value. The preset distance refers to a preset number of pixels between pixels at different positions. For example, if one pixel is the 2nd pixel of the 1st row and the preset distance is 2 pixels, the other pixel may be the 4th pixel of the 1st row, or the 2nd pixel of the 3rd row.
In practice, the difference in gray scale can be calculated in the x-direction, i.e., the pixel row direction, and in the y-direction, i.e., the pixel column direction. The transverse difference Δ_x f is the difference in the x-direction: between the pixels whose difference is calculated, the pixel row is unchanged and the pixel column changes. The longitudinal difference Δ_y f is the difference in the y-direction: the pixel column is unchanged and the pixel row changes. Specifically, the above differences may be expressed as first-order differences:

Δ_x f(i, j) = f(i, j + 1) - f(i, j)
Δ_y f(i, j) = f(i + 1, j) - f(i, j)

wherein i represents the i-th row, j represents the j-th column, i and j are positive integers, and f(i, j) represents the gray value of the pixel.
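The first-order differences can be computed directly with array slicing. A minimal NumPy sketch (variable names are illustrative), taking the transverse difference between adjacent columns and the longitudinal difference between adjacent rows:

```python
import numpy as np

# A small gray-scale map; int32 avoids uint8 wraparound when subtracting.
gray = np.array([[10, 10, 200],
                 [10, 10, 200],
                 [90, 90, 200]], dtype=np.int32)

# Transverse (x-direction) difference: |f(i, j+1) - f(i, j)|,
# same row, adjacent columns.
dx = np.abs(gray[:, 1:] - gray[:, :-1])

# Longitudinal (y-direction) difference: |f(i+1, j) - f(i, j)|,
# same column, adjacent rows.
dy = np.abs(gray[1:, :] - gray[:-1, :])
```

Large values in `dx` and `dy` mark the columns and rows where gray levels change sharply, i.e., where edges are.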
In some alternative implementations of this embodiment, the preset distance indicates a difference of at least two pixels.
In these alternative implementations, the preset distance may be two or more pixels. Thus, with preset distance d (d ≥ 2), the above differences can be expressed as:

Δ_x f(i, j) = f(i, j + d) - f(i, j)
Δ_y f(i, j) = f(i + d, j) - f(i, j)

In the case that the pixels used to calculate the difference differ by two or more pixels, the execution body may use a specific operator (such as the Sobel operator of OpenCV) to calculate the difference, so as to determine the difference corresponding to each pixel. The difference may also be used as the pixel value of the pixel to generate a differential image. These implementations may employ specific operators to determine differences, improving the efficiency of determining the differences.
Step 203, binarizing the gray scale image based on the difference value, generating a binary image, and generating a histogram of the binary image.
In this embodiment, the execution body may binarize the gray-scale map based on the differences, thereby generating a binary image, and may then generate a histogram of the binary image. Specifically, the execution body may binarize the gray-scale map based on the differences in various ways. For example, after determining the difference corresponding to each pixel of the gray-scale map, the execution body may compare the difference with a preset difference threshold. If the difference is greater than or equal to the preset difference threshold, the pixel in the binary image may be assigned the larger of the two binary pixel values (e.g., 255). If the difference is less than the preset difference threshold, the pixel may be assigned the smaller of the two values (e.g., 0). In practice, the histogram may be obtained by counting the pixels having the larger pixel value in the binary image, and may represent the distribution of display pixels. Here, a display pixel is a pixel whose gray value in the binary image is not 0. For example, the execution body may determine, for pixel (i, j), whether either of the transverse difference Δ_x f(i, j) and the longitudinal difference Δ_y f(i, j) is greater than the preset difference threshold, or whether a specified one of the two is greater than the preset difference threshold, and if so, determine the pixel to be a display pixel.
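A minimal sketch of the thresholding described above, assuming an illustrative difference threshold of 30 and binary values 0/255:

```python
import numpy as np

def binarize(diff, threshold=30):
    """Binarize a differential image: differences at or above the threshold
    become 255 ("display pixels"), the rest become 0. The threshold value
    is an illustrative assumption, not one fixed by the embodiment."""
    return np.where(diff >= threshold, 255, 0).astype(np.uint8)

diff = np.array([[5, 40],
                 [100, 10]])
binary = binarize(diff)

# Counting display pixels per row gives a simple longitudinal histogram.
hist = (binary > 0).sum(axis=1)
```

The histogram then describes how the display pixels, and hence the object edges, are distributed.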
At step 204, at least one partial image containing the object is determined from the images to be processed based on the histogram.
In this embodiment, the execution subject may determine at least one local image including the object from the images to be processed based on the generated histogram. The image to be processed may contain one or more objects. Specifically, the execution subject may determine the partial image in various ways. For example, the execution subject may determine a plurality of adjacent display pixels whose positions are continuous in the histogram, and take a region corresponding to the determined plurality of adjacent display pixels whose positions are continuous in the image to be processed as the partial image.
In the case where more than one difference is determined for a pixel (e.g., the transverse difference Δ_x f(i, j) and the longitudinal difference Δ_y f(i, j)), the execution body may obtain a final difference from the determined differences in various ways. Specifically, the difference finally determined for the pixel here may be the larger of Δ_x f(i, j) and Δ_y f(i, j), the average of the two, or a specified one of the two.
In some alternative implementations of the present embodiment, step 203 may include:
Generating a differential image including the respective differences; and binarizing the differential image to obtain a binary image.
In these alternative implementations, the execution body may take, for each pixel, the difference value determined for the pixel as the pixel value of the pixel, thereby generating a differential image, and binarize the differential image to obtain a binary image.
In practice, the differential image may include a transverse differential image and a longitudinal differential image, and accordingly, the generated binary image may include binarizing the transverse differential image, generating a transverse binary image, and binarizing the longitudinal differential image, generating a longitudinal binary image. Specifically, the horizontal differential image refers to a differential image obtained from the difference in the x-direction, and the vertical differential image refers to a differential image obtained from the difference in the y-direction.
As shown in fig. 2b, fig. 2b-1 shows a transverse differential image, fig. 2b-2 shows a transverse binary image, fig. 2b-3 shows a longitudinal differential image, and fig. 2b-4 shows a longitudinal binary image.
The method provided by the embodiment of the application can segment images of various sizes, and therefore has strong universality. In addition, the pixels at which edges are located can be accurately determined through the differences between pixels and preserved through binarization, so that a more accurate partial image is determined.
With further reference to fig. 3a, a flow 300 of yet another embodiment of an image processing method is shown. The flow 300 of the image processing method comprises the steps of:
step 301, acquiring a gray scale image corresponding to the image to be processed.
In this embodiment, the execution subject of the image processing method (for example, the server or the terminal device shown in fig. 1) may acquire the gray-scale image corresponding to the image to be processed. Specifically, the execution body may directly acquire the grayscale map. In addition, the execution body may acquire the image to be processed first, and then convert the image to be processed into a gray scale image, thereby acquiring the gray scale image.
Step 302, for each pixel of the gray scale map, determining a difference in gray scale between the pixel and a pixel that differs by a preset distance.
In this embodiment, the execution body may determine, for each pixel of the gray-scale map, the difference in gray scale between the pixel and a pixel that differs from it by a preset distance. The gray scale here is the pixel value. The preset distance refers to a preset number of pixels between pixels at different positions. For example, if one pixel is the 2nd pixel of the 1st row and the preset distance is 2 pixels, the other pixel may be the 4th pixel of the 1st row, or the 2nd pixel of the 3rd row.
Step 303, binarizing the gray level map based on the difference value, generating a binary image, and generating a histogram of the binary image in a preset direction, wherein the histogram in the preset direction indicates the number of pixels of the object displayed by the binary image, which are distributed in each preset pixel area in the preset direction.
In this embodiment, the execution body may binarize the gray-scale image based on the difference value, thereby generating a binary image. The execution subject may generate a histogram of the binary image in a predetermined direction. In practice, the preset direction here may be any direction in the binary image, such as a direction of one diagonal line of the binary image, and a direction perpendicular to the diagonal line, or a lateral direction and a longitudinal direction of the binary image. The binary image is a generally rectangular image, and the lateral direction may be either the wide direction or the high direction of the binary image, while the longitudinal direction is the other. The preset pixel area may be a row of pixels or a column of pixels. A histogram may represent where the object is located in one direction in the binary image, i.e. where the display pixels are located.
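The projection histograms described above amount to summing the binary image along one axis; a minimal NumPy sketch (the sample image is illustrative):

```python
import numpy as np

# A small binary image; uint32 so that sums of 255-valued pixels do not
# overflow.
binary = np.array([[0, 255, 255],
                   [0,   0,   0],
                   [0, 255,   0]], dtype=np.uint32)

row_hist = binary.sum(axis=1)   # one bin per pixel row (longitudinal)
col_hist = binary.sum(axis=0)   # one bin per pixel column (transverse)
```

A zero bin corresponds to a row or column containing no display pixels, i.e., no part of the object.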
As shown in fig. 3b, fig. 3b-1 is a (vertical) binary image, and fig. 3b-2 is a vertical histogram obtained from the binary image.
At step 304, at least one partial image containing the object is determined from the images to be processed based on the histogram.
In this embodiment, the execution subject may determine at least one local image including the object from the images to be processed based on the generated histogram. The image to be processed may contain one or more objects. Specifically, the execution subject may determine the partial image in various ways. For example, the execution subject may determine display pixels with continuous positions in the histogram, and take the determined regions corresponding to the display pixels with continuous positions in the image to be processed as the partial images.
In some optional implementations of the present embodiment, generating a histogram of the binary image in the preset direction in step 303 may include:
generating a histogram of the binary image in a first preset direction, wherein the first preset direction is one of a transverse direction and a longitudinal direction, the histogram in the first preset direction is used for indicating the sum of gray values of the binary image in each first preset pixel area, the first preset pixel area is a pixel area formed by dividing the binary image along a second preset direction, and the second preset direction is the other of the transverse direction and the longitudinal direction, which is different from the first preset direction; and step 304 may include: and dividing the binary image based on the position of a first preset pixel area with the sum of gray values not larger than a first preset threshold value or the position of a first preset pixel area with the gray value of each pixel not larger than a second preset threshold value in the histogram of the first preset direction, so as to obtain a candidate local image.
In these alternative implementations, the execution subject may generate a histogram of the binary image in the first preset direction. Each of the first preset pixel areas may be, for example, an area where a row of pixels or a column of pixels are located. For example, if the first preset direction is the column direction of the binary image, each first preset pixel area may be one pixel row of the binary image. The sum of gray values refers to the sum of gray values of display pixels in the first preset pixel region, and may indicate the number of display pixels in the first preset pixel region.
In practice, the execution body may determine the gray value sum of each first preset pixel region, find the positions of the first preset pixel regions whose gray value sums are small (not larger than the first preset threshold), and divide at these positions to obtain the candidate partial regions. In addition, the execution body may determine the positions of the first preset pixel areas in which the gray value of each pixel is not larger than the second preset threshold, for example, blank positions without patterns, that is, positions of first preset pixel areas in which every gray value is 0. Then, the execution body may divide at the determined positions to obtain the candidate partial regions.
For example, if, in the histogram of the first preset direction, the determined position of the first preset pixel area is the 5th row of pixels, then rows 1-4 of display pixels may be determined as one candidate partial image, and rows 6-10 of display pixels may be determined as another candidate partial image.
The position of the first preset pixel area determined here may be the position of a single row or column of pixels, or the position of several consecutive rows or columns of pixels. In addition, the execution body may be preset to divide only when the determined position of the first preset pixel area spans a plurality of rows or columns. For example, the division may be performed only when the determined position covers three or more rows or three or more columns of pixels, and not otherwise. Thus, if the determined position is only two adjacent rows whose gray values are all 0, no division is performed.
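The row-projection segmentation described above can be sketched as follows. This is a minimal illustrative implementation, not the patent's own code: it assumes a NumPy array holding a 0/255 binary image, treats rows whose gray-value sum is 0 as blank, and (following the example above) only cuts at blank bands at least `min_gap` rows tall; the helper name `split_rows` and the default of 3 are assumptions.

```python
import numpy as np

def split_rows(binary, min_gap=3):
    """Split a 0/255 binary image at horizontal blank bands.

    A row whose gray-value sum is 0 contains no display pixels.
    Only blank bands at least `min_gap` rows tall act as cut lines,
    matching the "three or more rows" example in the text.
    Returns a list of (start_row, end_row) spans of candidate images.
    """
    row_sums = binary.sum(axis=1)       # histogram in the row direction
    nonblank = row_sums > 0
    # collect runs of consecutive non-blank rows
    runs, start = [], None
    for i, nb in enumerate(nonblank):
        if nb and start is None:
            start = i
        elif not nb and start is not None:
            runs.append([start, i])
            start = None
    if start is not None:
        runs.append([start, len(nonblank)])
    # merge runs separated by blank bands narrower than min_gap
    merged = []
    for run in runs:
        if merged and run[0] - merged[-1][1] < min_gap:
            merged[-1][1] = run[1]      # gap too narrow: do not divide here
        else:
            merged.append(run)
    return [tuple(r) for r in merged]
```

With `min_gap=3`, a single blank row between two pattern blocks does not split them, which corresponds to the two-adjacent-blank-rows example above.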
After determining the candidate local image, the execution body may determine the region corresponding to the candidate local image in the image to be processed as the local image, thereby completing the segmentation of the image. The binary image used for this division may be a binary image in the first preset direction, obtained by binarizing a differential image in the first preset direction.
This implementation can accurately determine the candidate local images in the binary image through the sums of gray values counted in the histogram, so that more accurate local images can be determined in the image to be processed.
In some optional application scenarios of these implementations, the generating a histogram of the binary image in the first preset direction may include:
and generating a histogram of the binary image in the first preset direction in response to the side length of the binary image in the first preset direction being greater than the side length of the binary image in the second preset direction.
In these application scenarios, the execution body may determine the histogram in the direction of the longer side, which is especially useful when the image to be processed is a long image (i.e., the lengths of its two sides differ widely). If the histogram in the direction of the shorter side were determined directly, without first dividing the binary image, multiple objects would overlap in that histogram and the image would be difficult to divide. These application scenarios avoid this problem and can therefore divide the image more accurately.
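The longer-side heuristic in this scenario might be expressed as the following sketch (an assumed helper, not part of the patent: for a tall image the row-direction histogram is generated first, for a wide image the column-direction one):

```python
import numpy as np

def directional_histogram(binary):
    """Project the binary image along its longer side first.

    Returns (histogram, direction): for height > width, one sum per
    pixel row ('rows'); otherwise one sum per pixel column ('columns').
    """
    h, w = binary.shape
    if h > w:
        return binary.sum(axis=1), 'rows'     # one value per row
    return binary.sum(axis=0), 'columns'      # one value per column
```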
In some optional application scenarios of these implementations, step 304 may further include: generating a histogram of the candidate local image in a second preset direction; for the candidate partial image, determining the maximum sum of gray values among the sums of gray values of the first preset pixel areas as a first numerical value; for the histogram of the second preset direction, determining the maximum sum of gray values among the sums of gray values of the second preset pixel areas as a second numerical value, wherein a second preset pixel area is a pixel area formed by dividing the candidate local image along the first preset direction; and determining the ratio of the first numerical value to the second numerical value, and if the ratio is within a preset ratio range, determining the region corresponding to the candidate local image in the image to be processed as the local image.
In these optional application scenarios, the execution body may generate a histogram of the candidate partial image in the second preset direction and determine the maximum sum of gray values in that histogram as the second numerical value. The execution body may further determine the maximum sum of gray values in the histogram of the first preset direction as the first numerical value. The execution body may then determine the ratio of the first numerical value to the second numerical value and compare the ratio with a preset ratio range. If the ratio is within the preset ratio range, the execution body may determine the corresponding area of the candidate partial image in the image to be processed as the partial image, thereby completing the segmentation process. The ratio may be obtained by dividing the first numerical value by the second numerical value, or by dividing the second numerical value by the first numerical value, which is not limited herein.
For example, the preset ratio range may be 1:5-5:1, and if the determined ratio is 1, the ratio is within the preset ratio range.
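The ratio check of this application scenario could be sketched as follows; the function name and the default 1:5-5:1 bounds follow the example above but are otherwise assumptions. The peaks of the two directional histograms of a candidate region are compared, so that extremely elongated candidates (e.g., decorative divider lines) are rejected:

```python
import numpy as np

def is_valid_candidate(candidate, lo=1 / 5, hi=5.0):
    """Keep a candidate region only if the peaks of its two
    directional histograms have a ratio within [lo, hi]."""
    first = candidate.sum(axis=1).max()   # peak of first-direction histogram
    second = candidate.sum(axis=0).max()  # peak of second-direction histogram
    if second == 0:
        return False                      # empty candidate: nothing displayed
    return bool(lo <= first / second <= hi)
```

A roughly square pattern yields a ratio near 1 and is kept; a one-pixel-tall strip yields a ratio far outside the range and is discarded.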
The horizontal histograms obtained from the different candidate partial images may be different. As shown in fig. 3c, three horizontal histograms are shown.
In practice, generating the histogram of the second preset direction may include first determining, in the binary image of the second preset direction (generated from the differential image of the second preset direction), the area corresponding to the candidate partial image, and then generating the histogram of the second preset direction for that area.
These application scenarios can accurately select effective partial images based on the proportions of the patterns in the candidate partial images, thereby avoiding the interference that would result from determining various decorative pattern elements as partial images.
According to the embodiment, the position of the object can be determined more accurately through the histogram in the preset direction, so that the image can be segmented more accurately.
With further reference to fig. 4, as an implementation of the method shown in the foregoing figures, the present application provides an embodiment of an image processing apparatus, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus is particularly applicable to various electronic devices.
As shown in fig. 4, the image processing apparatus 400 of the present embodiment includes: an acquisition unit 401, a determination unit 402, a generation unit 403, and a division unit 404. Wherein the acquiring unit 401 is configured to acquire a gray scale image of an image to be processed; a determining unit 402 configured to determine, for each pixel of the gray-scale image, a difference in gray-scale between the pixel and a pixel that differs by a preset distance; a generating unit 403 configured to binarize the gray-scale image based on the difference value, generate a binary image, and generate a histogram of the binary image; the segmentation unit 404 is configured to determine at least one partial image comprising the object from the images to be processed based on the histogram.
In some embodiments, the acquisition unit 401 of the image processing apparatus 400 acquires a grayscale image of an image to be processed. The image to be processed is an image to be subjected to the segmentation processing. The gray level map is obtained by performing image conversion on the image to be processed. Typically, the image to be processed is a color map having three color channels, and the conversion process requires converting the color map to a gray scale map of one color channel.
In some embodiments, the determining unit 402 may determine, for each pixel of the gray-scale map, the difference in gray scale between that pixel and a pixel differing from it by a preset distance. Here the gray scale is the pixel value. The preset distance may refer to a preset number of pixels between pixels at different positions.
In some embodiments, the generating unit 403 may binarize the gray-scale map based on the difference value, thereby generating a binary image. The execution subject may generate a histogram of the binary image. Specifically, the execution body may binarize the gray-scale map based on the difference value in various ways.
In some embodiments, the segmentation unit 404 may determine at least one local image containing the object from the images to be processed based on the generated histogram. The image to be processed may contain one or more objects. Specifically, the execution subject may determine the partial image in various ways.
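The four units above can be read as stages of one pipeline. The following is a minimal end-to-end sketch under assumed parameter values (a horizontal offset of 2 pixels as one possible preset distance, and a binarization threshold of 30); it is illustrative, not the patent's implementation:

```python
import numpy as np

def segment(image, distance=2, thresh=30):
    """Grayscale -> difference -> binarize -> row histogram -> split.

    `image` is an H x W x 3 color array; returns (start, end) row spans
    of candidate partial images. Parameter values are illustrative.
    """
    gray = image.mean(axis=2)                                # acquisition unit
    diff = np.abs(gray[:, distance:] - gray[:, :-distance])  # determining unit
    binary = np.where(diff > thresh, 255, 0)                 # generating unit
    hist = binary.sum(axis=1)                                # row-direction histogram
    spans, start = [], None
    for i, s in enumerate(hist):                             # segmentation unit
        if s > 0 and start is None:
            start = i
        elif s == 0 and start is not None:
            spans.append((start, i))
            start = None
    if start is not None:
        spans.append((start, len(hist)))
    return spans
```

Rows containing vertical intensity changes (object edges) survive the differencing step and show up as nonzero histogram bins; uniform background rows vanish, which is what makes the projection usable for splitting.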
In some optional implementations of the present embodiment, the generating unit includes: the generation module is configured to generate a histogram of the binary image in a preset direction, wherein the histogram of the preset direction indicates the number of pixels of an object displayed by the binary image, which are distributed in each preset pixel area of the preset direction.
In some optional implementations of the present embodiment, the generating module is further configured to: generating a histogram of the binary image in a first preset direction, wherein the first preset direction is one of a transverse direction and a longitudinal direction, the histogram in the first preset direction is used for indicating the sum of gray values of the binary image in each first preset pixel area, the first preset pixel area is a pixel area formed by dividing the binary image along a second preset direction, and the second preset direction is the other of the transverse direction and the longitudinal direction, which is different from the first preset direction; and a dividing unit including: the segmentation module is configured to segment the binary image based on the position of a first preset pixel area with the sum of gray values not larger than a first preset threshold value or the position of a first preset pixel area with the gray value of each pixel not larger than a second preset threshold value in the histogram of the first preset direction, so as to obtain a candidate local image.
In some optional implementations of the present embodiment, the generating module is further configured to: and generating a histogram of the binary image in the first preset direction in response to the side length of the binary image in the first preset direction being greater than the side length of the binary image in the second preset direction.
In some optional implementations of the present embodiment, the segmentation unit further includes: a histogram generation module configured to generate a histogram of the candidate local image in a second preset direction; a first determining module configured to determine, for the candidate partial image, the maximum sum of gray values among the sums of gray values in the first preset pixel regions as a first numerical value; a second determining module configured to determine, for the histogram of the second preset direction, the maximum sum of gray values among the sums of gray values in the second preset pixel regions as a second numerical value, the second preset pixel region being a pixel region formed by dividing the candidate partial image along the first preset direction; and a third determining module configured to determine the ratio of the first numerical value to the second numerical value, and if the ratio is within a preset ratio range, determine the corresponding region of the candidate local image in the image to be processed as the local image.
In some alternative implementations of the present embodiment, the preset distance indicates a difference of at least two pixels.
In some optional implementations of the present embodiment, the generating unit is further configured to: generating a differential image including the respective differences; and binarizing the differential image to obtain a binary image.
As shown in fig. 5, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 5 may represent one device or a plurality of devices as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501. It should be noted that the computer readable medium of the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In an embodiment of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
Whereas in embodiments of the present disclosure, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The described units may also be provided in a processor, for example, described as: a processor includes an acquisition unit, a determination unit, a generation unit, and a segmentation unit. The names of these units do not constitute a limitation on the unit itself in some cases, and the acquisition unit may also be described as "a unit that acquires a gradation image of an image to be processed", for example.
As another aspect, the present application also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquiring a gray level image of an image to be processed; for each pixel of the gray scale map, determining a difference in gray scale between the pixel and a pixel differing by a preset distance; binarizing the gray level map based on the difference value to generate a binary image and generating a histogram of the binary image; at least one partial image containing the object is determined from the images to be processed based on the histogram.
The foregoing description is only of the preferred embodiments of the present application and is presented as a description of the principles of the technology being utilized. It will be appreciated by persons skilled in the art that the scope of the invention referred to in this application is not limited to the specific combinations of features described above, but is intended to cover other embodiments in which any combination of the features described above or their equivalents is possible without departing from the spirit of the invention. For example, the above-described features may be replaced with (but are not limited to) technical features having similar functions disclosed in the present application.

Claims (9)

1. An image processing method, comprising:
acquiring a gray level image of an image to be processed;
for each pixel of the gray scale map, determining a difference value of gray scales between the pixel and a pixel differing by a preset distance;
binarizing the gray scale image based on the difference value to generate a binary image and generating a histogram of the binary image;
determining at least one local image containing an object from the images to be processed based on the histogram; the generating the histogram of the binary image includes:
generating a histogram of the binary image in a preset direction, wherein the histogram of the preset direction indicates the number of pixels of an object displayed by the binary image, which are distributed in each preset pixel area of the preset direction, and the preset direction comprises: a direction of one diagonal line of the binary image, and a direction perpendicular to the diagonal line, or a lateral direction and a longitudinal direction of the binary image.
2. The method of claim 1, wherein,
the generating the histogram of the binary image in the preset direction comprises the following steps:
generating a histogram of the binary image in a first preset direction, wherein the first preset direction is one of a transverse direction and a longitudinal direction, the histogram in the first preset direction is used for indicating the sum of gray values of the binary image in each first preset pixel area, the first preset pixel area is a pixel area formed by dividing the binary image along a second preset direction, and the second preset direction is the other of the transverse direction and the longitudinal direction, which is different from the first preset direction; and
The determining at least one local image containing the object from the images to be processed based on the histogram comprises the following steps:
and dividing the binary image based on the position of a first preset pixel area with the sum of gray values not larger than a first preset threshold value or the position of a first preset pixel area with the gray value of each pixel not larger than a second preset threshold value in the histogram of the first preset direction, so as to obtain a candidate local image.
3. The method of claim 2, wherein the generating a histogram of the binary image in a first preset direction comprises:
and generating a histogram of the binary image in the first preset direction in response to the side length of the binary image in the first preset direction being greater than the side length of the binary image in the second preset direction.
4. The method of claim 2, wherein the determining at least one partial image containing an object from the images to be processed based on the histogram further comprises:
generating a histogram of the candidate local image in the second preset direction;
for the candidate partial images, determining the maximum gray value sum as a first numerical value in the gray value sum in each first preset pixel area;
For the histogram of the second preset direction, determining the maximum sum of gray values as a second value in the sum of gray values in each second preset pixel area, wherein the second preset pixel area is a pixel area formed by dividing the candidate local image along the first preset direction;
and determining the ratio of the first numerical value to the second numerical value, and determining the corresponding region of the candidate local image in the image to be processed as the local image if the ratio is in a preset ratio range.
5. The method of claim 1, wherein the preset distance indicates a difference of at least two pixels.
6. The method of claim 1, wherein the binarizing the gray scale map based on the difference value to generate a binary image comprises:
generating a differential image including each of the differences;
and binarizing the differential image to obtain the binary image.
7. An image processing apparatus comprising:
an acquisition unit configured to acquire a gray-scale image of an image to be processed;
a determining unit configured to determine, for each pixel of the gray-scale image, a difference in gray-scale between the pixel and a pixel that differs by a preset distance;
A generating unit configured to binarize the gradation map based on the difference value, generate a binary image, and generate a histogram of the binary image; the generation unit is further configured to: generating a histogram of the binary image in a preset direction, wherein the histogram of the preset direction indicates the number of pixels of an object displayed by the binary image, which are distributed in each preset pixel area of the preset direction, and the preset direction comprises: a direction of one diagonal line of the binary image, and a direction perpendicular to the diagonal line, or a lateral direction and a longitudinal direction of the binary image;
and the segmentation unit is configured to determine at least one local image containing an object from the images to be processed based on the histogram.
8. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
when executed by the one or more processors, causes the one or more processors to implement the method of any of claims 1-6.
9. A computer readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the method of any of claims 1-6.
CN201910427420.0A 2019-05-22 2019-05-22 Image processing method and device Active CN111768410B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910427420.0A CN111768410B (en) 2019-05-22 2019-05-22 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910427420.0A CN111768410B (en) 2019-05-22 2019-05-22 Image processing method and device

Publications (2)

Publication Number Publication Date
CN111768410A CN111768410A (en) 2020-10-13
CN111768410B true CN111768410B (en) 2024-04-05

Family

ID=72718320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910427420.0A Active CN111768410B (en) 2019-05-22 2019-05-22 Image processing method and device

Country Status (1)

Country Link
CN (1) CN111768410B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5784500A (en) * 1995-06-23 1998-07-21 Kabushiki Kaisha Toshiba Image binarization apparatus and method of it
CN1678016A (en) * 2004-03-30 2005-10-05 东芝解决方案株式会社 Image processing apparatus and image processing method
CN103026384A (en) * 2011-01-20 2013-04-03 松下电器产业株式会社 Feature extraction unit, feature extraction method, feature extraction program, and image processing device
CN103733224A (en) * 2011-08-11 2014-04-16 松下电器产业株式会社 Feature extraction device, feature extraction program, and image processing device
CN104021574A (en) * 2014-07-04 2014-09-03 武汉武大卓越科技有限责任公司 Method for automatically identifying pavement diseases
CN104952081A (en) * 2015-07-20 2015-09-30 电子科技大学 COG (Chip-On-Glass) offset detection method based on extreme value difference statistical characteristic
CN105303676A (en) * 2015-10-27 2016-02-03 深圳怡化电脑股份有限公司 Banknote version identification method and banknote version identification system
CN106204617A (en) * 2016-07-21 2016-12-07 大连海事大学 Adapting to image binarization method based on residual image rectangular histogram cyclic shift
CN106991418A (en) * 2017-03-09 2017-07-28 上海小蚁科技有限公司 Winged insect detection method, device and terminal
CN107067012A (en) * 2017-04-25 2017-08-18 中国科学院深海科学与工程研究所 Submarine geomorphy cell edges intelligent identification Method based on image procossing
CN107103683A (en) * 2017-04-24 2017-08-29 深圳怡化电脑股份有限公司 Paper Currency Identification and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN111768410A (en) 2020-10-13

Similar Documents

Publication Publication Date Title
CN109308681B (en) Image processing method and device
CN109242801B (en) Image processing method and device
CN109711508B (en) Image processing method and device
CN109255337B (en) Face key point detection method and device
CN109344762B (en) Image processing method and device
CN109389072B (en) Data processing method and device
CN109118456B (en) Image processing method and device
CN109344752B (en) Method and apparatus for processing mouth image
CN109255767B (en) Image processing method and device
CN109377508B (en) Image processing method and device
CN110516678B (en) Image processing method and device
CN110059623B (en) Method and apparatus for generating information
CN109285181B (en) Method and apparatus for recognizing image
CN109960959B (en) Method and apparatus for processing image
CN110827301B (en) Method and apparatus for processing image
CN110288625B (en) Method and apparatus for processing image
CN109272526B (en) Image processing method and system and electronic equipment
CN111724396A (en) Image segmentation method and device, computer-readable storage medium and electronic device
CN111783777B (en) Image processing method, apparatus, electronic device, and computer readable medium
CN112967191B (en) Image processing method, device, electronic equipment and storage medium
CN113658196A (en) Method and device for detecting ship in infrared image, electronic equipment and medium
CN112632952A (en) Method and device for comparing files
CN111768410B (en) Image processing method and device
CN113780294B (en) Text character segmentation method and device
CN112907164A (en) Object positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant