WO2020232910A1 - Image-processing-based target object counting method, apparatus, device, and storage medium - Google Patents


Info

Publication number: WO2020232910A1 (application PCT/CN2019/103845)
Authority: WO (WIPO, PCT)
Prior art keywords: image, target, target object, recognized, gradient
Other languages: English (en), French (fr)
Inventors: 王俊, 高鹏, 谢国彤
Original assignee / applicant: 平安科技(深圳)有限公司 (Ping An Technology (Shenzhen) Co., Ltd.)
Application filed by 平安科技(深圳)有限公司
Publication of WO2020232910A1

Classifications

    • G06T7/0012 (Physics; Computing; Image data processing or generation, in general; Image analysis; Inspection of images, e.g. flaw detection): Biomedical image inspection
    • G06T7/155 (Image analysis; Segmentation; Edge detection): Segmentation or edge detection involving morphological operators
    • G06V10/30 (Image or video recognition or understanding; Image preprocessing): Noise filtering
    • G06V10/44 (Extraction of image or video features): Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V20/695 (Scenes; Scene-specific elements; Microscopic objects, e.g. biological cells or cellular parts): Preprocessing, e.g. image segmentation
    • G06V20/698 (Microscopic objects, e.g. biological cells or cellular parts): Matching; Classification
    • G16H70/60 (Healthcare informatics; ICT specially adapted for the handling or processing of medical references): Medical references relating to pathologies

Definitions

  • This application relates to the field of image detection technology, and in particular to a method, apparatus, device, and storage medium for counting target objects based on image processing.
  • This application provides a method, apparatus, device, and storage medium for counting target objects based on image processing, so as to improve the efficiency and accuracy of counting target objects.
  • this application provides a method for counting target objects based on image processing, including:
  • acquiring an image to be recognized, where the image to be recognized includes a target object;
  • this application also provides a device for counting target objects based on image processing, including:
  • an image acquisition unit for acquiring an image to be recognized, where the image to be recognized includes a target object;
  • a preprocessing unit configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image;
  • a construction processing unit configured to perform a convolution operation on the target image according to the Gaussian-Laplacian function to construct a Gaussian pyramid, construct a difference pyramid from the Gaussian pyramid, and extract the local extrema of the target object according to the difference pyramid;
  • a gradient calculation unit configured to calculate the image gradient of the boundary of the target object according to the local extrema of the target object;
  • a result statistics unit configured to perform differentiated statistics on the target object according to the image gradient and display the statistical result.
  • the present application also provides a computer device, which includes a memory and a processor; the memory is used to store a computer program, and the processor is used to execute the computer program and, when executing it, implement the above method for counting target objects based on image processing.
  • the present application also provides a computer-readable storage medium that stores a computer program; when the computer program is executed by a processor, the processor implements the above method for counting target objects based on image processing.
  • the present application discloses a method, device, equipment and storage medium for object statistics based on image processing.
  • the method includes: acquiring an image to be recognized, where the image to be recognized includes a target object; performing noise reduction and color inversion processing on the image to be recognized to obtain a target image; performing a convolution operation on the target image according to the Gaussian-Laplacian function to construct a Gaussian pyramid, constructing a difference pyramid from the Gaussian pyramid, and extracting the local extrema of the target object according to the difference pyramid; calculating the image gradient of the boundary of the target object according to the local extrema; and performing differentiated statistics on the target object according to the image gradient and displaying the statistical result.
  • This method can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the statistics of the objects, and reducing the burden of manual statistics.
  • FIG. 1 is a schematic flowchart of a method for object statistics based on image processing provided by an embodiment of the present application
  • FIG. 2 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
  • FIG. 3a is a schematic diagram of the effect of an image to be recognized provided by an embodiment of the present application.
  • FIG. 3b is a schematic diagram of the effect of the target image provided by the embodiment of the present application.
  • FIG. 4 is a schematic diagram of the construction process of a differential pyramid provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of local extreme points of target cells in scale space and two-dimensional image space provided by an embodiment of this application;
  • FIG. 6 is a schematic diagram of the structure of a cascade filter provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of a local extremum area of target cells provided by an embodiment of the application.
  • FIG. 8 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
  • Figure 9a is a schematic diagram of a target image provided by an embodiment of the application.
  • FIG. 9b is a schematic diagram of a sub-image provided by an embodiment of the application.
  • FIG. 10 is a schematic diagram of several sub-images after cropping of a target image provided by an embodiment of the application.
  • FIG. 11 is a schematic diagram of several target cells with contour buffers after the contour line of the target cell is expanded according to an embodiment of the application;
  • FIG. 12 is a schematic diagram showing the effect of target cell statistical results provided by an embodiment of the application.
  • FIG. 13 is a schematic diagram of several endothelial cells counted in an embodiment of the application.
  • FIG. 14 is a schematic diagram of several mesangial cells counted in an embodiment of the application.
  • FIG. 15 is a schematic diagram of several podocytes counted according to an embodiment of the application.
  • FIG. 16 is a schematic flowchart of another object statistics method based on image processing provided by an embodiment of the application.
  • FIG. 17 is a schematic block diagram of a device for counting objects based on image processing according to an embodiment of the application.
  • FIG. 18 is a schematic block diagram of a preprocessing unit provided by an embodiment of this application.
  • FIG. 19 is a schematic block diagram of a gradient calculation unit provided by an embodiment of the application.
  • FIG. 20 is a schematic block diagram of another device for counting objects based on image processing provided by an embodiment of the application.
  • FIG. 21 is a schematic block diagram of the structure of a computer device according to an embodiment of the application.
  • the embodiments of the present application provide a method, device, equipment, and storage medium for object statistics based on image processing, which can be used to count objects on pathological images, and of course, can also be used for statistics on objects on other images.
  • the target on the pathological image can be lymphocytes, DNA, RNA, chromosomes, and glomerular endothelial cells, mesangial cells, and podocytes.
  • the following examples will take the endothelial cells, mesangial cells and podocytes of the glomerulus as targets for detailed introduction.
  • the method for counting objects based on image processing can be applied to a terminal or a server, or the server and the terminal can be used interactively to quickly and accurately count the objects.
  • the server and the terminal are used interactively, for example, the server sends the statistical results to the terminal for application.
  • the server can be an independent server or a server cluster.
  • the terminal can be an electronic device such as a tablet computer, a notebook computer, a desktop computer, a smart phone, or a wearable device.
  • FIG. 1 is a schematic flowchart of a method for object statistics based on image processing provided by an embodiment of the present application.
  • the object statistics method includes steps S101 to S105.
  • Step S101 Obtain an image to be recognized.
  • the image to be recognized is an image including a target object.
  • the image to be recognized can be obtained by means of an ordinary camera, a video camera, a scanner, an image capture card, a microscope digital camera, and the like.
  • the acquisition process is: collect a kidney biopsy tissue sample; stain the kidney tissue and prepare a kidney section; and capture an image of the stained kidney section with an image acquisition device, which serves as the image to be recognized. Because a glomerulus usually contains hundreds of endothelial cells, mesangial cells, and podocytes, and they are dark, similar in appearance, and hard to distinguish, the sample is stained before the image to be recognized is acquired.
  • the staining agent used is the Periodic Acid-Schiff stain (PAS); after PAS staining, the endothelial cells, mesangial cells, and podocytes of the glomeruli in the image to be recognized are highlighted in purple, so their numbers can be counted more intuitively, which improves the accuracy of counting the target cells.
  • Step S102 Perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
  • the preprocessing includes noise reduction processing and color inversion processing; the noise reduction may use wavelet denoising, Fourier-transform denoising, or the like. The specific noise reduction method is not limited here, as long as the noise is removed.
  • the image to be recognized needs to be inverted.
  • the color inversion can be performed by calling an inversion processing function or a corresponding image processing tool.
  • As shown in FIG. 2, the step of performing noise reduction and color inversion processing on the image to be recognized to obtain a target image includes sub-step S1021 and sub-step S1022.
  • Step S1021 Smooth the image to be recognized by bilateral filtering to remove pseudo-point noise.
  • Noise in an image is mainly caused by factors such as "same object, different spectra" and "same spectrum, different objects" during the imaging process of the sensor. Common noise includes salt-and-pepper noise and striped noise.
  • this embodiment adopts a bilateral filtering algorithm to perform noise reduction processing on the image to be recognized.
  • Bilateral filtering considers not only the spatial distance between pixels but also the similarity between their gray levels; that is, bilateral filtering combines filtering over the spatial range with filtering over the gray-level range.
  • the expression for filtering in the spatial range is: w_s(i,j) = exp(-((i-x)^2 + (j-y)^2) / (2*sigma_s^2)), where w_s(i,j) is the spatial-domain weight, I(i,j) is the image to be recognized, and (i,j) ranges over the neighborhood S(x,y) at the pixel (x,y).
  • filtering in the gray-level range is similar to filtering in the spatial range.
  • the expression for filtering in the gray-level range is: w_r(i,j) = exp(-(I(i,j) - I(x,y))^2 / (2*sigma_r^2)), where w_r(i,j) is the gray-domain weight.
  • the combined weight w(i,j) = w_s(i,j) * w_r(i,j) is the product of the spatial-domain and gray-domain weights, and the image to be recognized after denoising is the normalized weighted average I'(x,y) = sum_{(i,j) in S(x,y)} w(i,j) I(i,j) / sum_{(i,j) in S(x,y)} w(i,j).
  • In areas where the image to be recognized changes relatively smoothly, the gray values of the pixels in the neighborhood differ little, and the bilateral filter degenerates into a Gaussian low-pass filter; in edge areas, the filter replaces the original gray value with the average gray value of neighboring pixels whose gray levels are similar to the edge point's. The bilateral filter therefore preserves the edge information of the image while smoothing the noise of the image to be recognized.
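The bilateral filter described above can be sketched in a few lines of pure Python for a grayscale image stored as a list of rows. This is an illustrative implementation, not the patent's own code; the function name and the parameter defaults (radius, sigma values) are assumptions.

```python
import math

def bilateral_filter(img, radius=1, sigma_s=1.0, sigma_r=25.0):
    """Denoise a grayscale image (list of rows) with a bilateral filter.

    Each output pixel is a weighted average over its neighbourhood; the
    weight is the product of a spatial Gaussian w_s and a gray-level
    Gaussian w_r, so sharp edges are preserved while noise is smoothed.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = norm = 0.0
            for j in range(max(0, y - radius), min(h, y + radius + 1)):
                for i in range(max(0, x - radius), min(w, x + radius + 1)):
                    ws = math.exp(-((i - x) ** 2 + (j - y) ** 2) / (2 * sigma_s ** 2))
                    wr = math.exp(-((img[j][i] - img[y][x]) ** 2) / (2 * sigma_r ** 2))
                    acc += ws * wr * img[j][i]
                    norm += ws * wr
            out[y][x] = acc / norm
    return out
```

On a flat region the range weight is 1 everywhere and the filter reduces to a plain Gaussian blur; across a strong step edge, pixels on the far side get a near-zero range weight and the edge stays sharp, matching the behavior described in the paragraph above.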
  • Step S1022 Perform color inversion processing on the image to be recognized after bilateral filtering to obtain a target image.
  • Since the target cell areas usually appear as dark, nearly circular plaques of different sizes, the image to be recognized needs to be inverted in order to improve the efficiency and accuracy of counting the target cells.
  • Each pixel in the image has four values, namely alpha, red, green, and blue. They are the basic elements that make up the color, and the value range of each element is [0,255].
  • the inversion process subtracts the R, G, and B values of each pixel of the image to be recognized from 255, as shown in Figure 3a and Figure 3b: Figure 3a is the image to be recognized before inversion, and Figure 3b is the image after inversion, i.e., the target image.
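The inversion step is a single subtraction per channel. A minimal sketch for one grayscale channel (for RGB, the same subtraction is applied to each of R, G, and B while the alpha channel is left unchanged; the function name is illustrative):

```python
def invert(img):
    """Color inversion: replace each 8-bit value v with 255 - v."""
    return [[255 - v for v in row] for row in img]
```

Applying the inversion twice returns the original image, which is a quick sanity check for this step.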
  • Step S103 Perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid and construct a differential pyramid according to the Gaussian pyramid, and extract the local extrema of the target according to the differential pyramid.
  • L(x,y,σ) = G(x,y,σ) * I(x,y), where L(x,y,kσ) and L(x,y,σ) are layers of the Gaussian pyramid;
  • D(x,y,σ) = L(x,y,kσ) - L(x,y,σ) = (G(x,y,kσ) - G(x,y,σ)) * I(x,y) is the difference pyramid;
  • G(x,y,kσ) and G(x,y,σ) are the Gaussian kernel functions at scales kσ and σ, respectively, with G(x,y,σ) = (1/(2πσ^2)) exp(-(x^2+y^2)/(2σ^2));
  • I(x,y) is the input two-dimensional image, that is, the target image.
  • the difference pyramid is constructed by taking the difference between the convolution results at two adjacent scales within each layer (octave) of the Gaussian pyramid.
  • the differential pyramid can be thought of as a visual mechanism that simulates the nerves on the retina to extract information from the image and then provide it to the brain.
  • the difference pyramid can effectively detect stable local extreme points in the scale space, and then calculate the local extreme value of the target through the difference pyramid.
  • the construction process of the Gaussian pyramid and the difference pyramid is as follows: local extremum extraction uses a Gaussian-Laplace (Laplacian of Gaussian, LOG) function for convolution processing.
  • the Laplacian smoothed by Gaussian is a second-order differential operator, which can generate a steep zero crossing (that is, a zero crossing from positive to negative) at the edge, and perform extreme point detection based on the zero crossing point .
  • the convolution operation between an image and a two-dimensional function is actually to find the similarity between the image and this function.
  • the Gaussian-Laplacian operator and the Gaussian kernel have the following approximate relationship: G(x,y,kσ) - G(x,y,σ) ≈ (k-1) σ^2 ∇^2 G.
  • the Gaussian convolution kernel is the only linear transformation kernel that realizes scale transformation, so an image can be expressed in scale space as the convolution of the image with a variable-scale Gaussian kernel function, L(x,y,σ) = G(x,y,σ) * I(x,y), which is the Gaussian pyramid operator used above.
  • therefore, the Gaussian-Laplacian operator can be approximately replaced by the Difference of Gaussians (DoG) operator.
  • this embodiment further constructs a difference pyramid in order to count the targets quickly and effectively. Moreover, by fusing the convolution results across the scales of the difference pyramid, the extremum centers of targets of different sizes can be identified robustly and effectively, so that both the center and the size (radius) of each target can be obtained.
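One Difference-of-Gaussians layer is simply the difference between two Gaussian-blurred copies of the target image, L(kσ) - L(σ). The following pure-Python sketch uses a separable blur with edge clamping; the function names and the default ratio k = 1.6 are illustrative assumptions, not taken from the patent.

```python
import math

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel with radius ~3*sigma."""
    r = max(1, int(3 * sigma))
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-r, r + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(img, sigma):
    """Separable Gaussian blur (horizontal then vertical pass),
    clamping indices at the image border."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    h, w = len(img), len(img[0])
    tmp = [[sum(k[r + d] * row[min(w - 1, max(0, x + d))]
                for d in range(-r, r + 1))
            for x in range(w)] for row in img]
    return [[sum(k[r + d] * tmp[min(h - 1, max(0, y + d))][x]
                 for d in range(-r, r + 1))
             for x in range(w)] for y in range(h)]

def difference_of_gaussians(img, sigma, k=1.6):
    """D(x, y, sigma) = L(x, y, k*sigma) - L(x, y, sigma)."""
    a, b = blur(img, k * sigma), blur(img, sigma)
    return [[a[y][x] - b[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]
```

A dark-on-light (inverted) cell produces a strong response at the DoG scale matching its radius, which is what lets the method recover both the center and the approximate size of each target.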
  • FIG. 5 is a schematic diagram of the local extreme points of the target cell in the scale space and the two-dimensional image space, where L1 is the neighboring point of the previous scale, L2 is the neighboring point of the same scale space, and L3 is the neighboring point of the next scale.
  • the image scale space is formed by convolving the image with a Gaussian filter with a variable kernel to obtain the Gaussian pyramid of the image. For example, divide the Gaussian pyramid into O groups, each group of S plus 3 layers, S is the number of layers between ⁇ and 2 ⁇ , and S is generally 2 or 3.
  • the formation of each group of Gaussian pyramids is obtained by convolution of the input image and the cascade filter. In the following, taking S as 2 as an example, the formation process of the Gaussian pyramid will be described in detail.
  • each group of Gaussian pyramids has 5 layers, which are obtained by four-stage cascade filters.
  • the formation process of the Gaussian pyramid for the first group is shown in Figure 6.
  • the image size of each layer of the second group of Gaussian pyramids is 1/4 of the image size of the first group;
  • the input image I0' of the second group is obtained by down-sampling the S-th layer image I1 of the first group at a sampling rate of 2.
  • a total of 4 sets of Gaussian pyramids are generated.
  • the difference pyramid, whether constructed indirectly through the Gaussian-Laplacian function or directly from the Gaussian pyramid, can be used to extract the local extrema of the target.
  • the extraction effect of the local extremum of the target cell in this embodiment can be seen in FIG. 7.
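The scale-space extremum test sketched in Fig. 5 compares each candidate point against its 8 neighbours in its own DoG layer (L2) plus the 9 points in the previous layer (L1) and the 9 points in the next layer (L3), 26 neighbours in total. An illustrative helper for interior points of a stack of DoG layers (the function name and signature are assumptions):

```python
def is_local_extremum(dog, s, y, x):
    """True if dog[s][y][x] is strictly greater than, or strictly less
    than, all 26 neighbours in scale (s) and image (y, x) space."""
    v = dog[s][y][x]
    neighbours = [dog[s + ds][y + dy][x + dx]
                  for ds in (-1, 0, 1)
                  for dy in (-1, 0, 1)
                  for dx in (-1, 0, 1)
                  if (ds, dy, dx) != (0, 0, 0)]
    return all(v > n for n in neighbours) or all(v < n for n in neighbours)
```

Points that pass this test in some DoG layer become the local extrema of the targets; the layer index additionally encodes the target's approximate scale.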
  • Step S104 Calculate the image gradient of the boundary of the target object according to the local extremum of the target object.
  • the target boundary refers to the boundary area of the target, such as the contour buffer of the target.
  • specifically, an image gradient algorithm can be applied, based on the local extremum of the target, to calculate the image gradient of the target boundary.
  • step S104 includes sub-step S1041 to sub-step S1043.
  • Step S1041 according to the local extremum of the target, crop the sub-images including the target in the target image.
  • Specifically, a rectangular area corresponding to the target is generated according to the local extremum of the target and the circumscribed rectangle of the target; the target image is then cropped according to this rectangular area to obtain the sub-image of the target.
  • Specifically, taking the local extremum point of the target as the center, the circumscribed rectangle of the target is expanded into a rectangular area by M*N pixels, and the target image is cropped accordingly, where M and N are both positive integers and may be equal.
  • Fig. 9a is a target image.
  • the circumscribed rectangle of the target is expanded by a rectangular area of 5*5 pixels to obtain a sub-image of the target.
  • the circumscribed rectangle is the inner rectangle shown in Fig. 9b.
  • the rectangular area is the region enclosed by the outer rectangle shown in FIG. 9b, i.e., the circumscribed rectangle expanded by 5*5 pixels.
  • the cropped sub-images are then uniformly resampled to H*D pixels, where H and D are both positive integers, thereby improving the efficiency and accuracy of counting the target objects.
  • Figure 10 is a number of sub-images cropped after the target image is uniformly resampled to 80*80 pixels.
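The cropping step above amounts to expanding each target's circumscribed rectangle by a fixed pixel margin (5 in the example) and clamping the result to the image borders. A hypothetical helper, with the rectangle given as (x0, y0, x1, y1) in row-major image coordinates:

```python
def crop_with_margin(img, rect, margin=5):
    """Crop rect = (x0, y0, x1, y1) from a row-major image, expanded
    by `margin` pixels on every side and clamped to the image bounds."""
    h, w = len(img), len(img[0])
    x0, y0, x1, y1 = rect
    x0, y0 = max(0, x0 - margin), max(0, y0 - margin)
    x1, y1 = min(w, x1 + margin), min(h, y1 + margin)
    return [row[x0:x1] for row in img[y0:y1]]
```

Clamping matters for targets near the image border, where the expanded rectangle would otherwise index outside the image; the sub-images can then be resampled to a common size as described above.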
  • Step S1042 extract the contour of the target in the sub-image according to the watershed segmentation algorithm, and extract the contour buffer of the target with the contour as the center according to the morphological expansion operation.
  • Specifically, taking the local extremum center of the target cell in the sub-image as the seed point, the contour line of the target cell is extracted; then, centered on that contour line, a morphological dilation with an L x L structuring element (for example, a 3x3 structuring element) is performed, thickening the extracted contour both inside and outside the contour line to obtain the contour buffer of each target cell, as shown in FIG. 11.
  • mathematical morphology measures and extracts the corresponding target contours in an image using structuring elements of a given shape.
  • Mathematical morphology operations mainly include erosion, dilation, opening, and closing. Based on these basic operations, various practical mathematical morphology algorithms can be combined and derived to analyze and process the shape and structure of an image.
  • Dilation merges two sets using vector addition: the dilation is the set of all vector sums whose two operands come from the set X and the structuring element B, taken over every possible combination.
  • the expression of the dilation operation is as follows: X ⊕ B = { x + b | x ∈ X, b ∈ B }.
  • Dilation can fill the small holes in the image (holes that are small relative to the size of the structuring element) and the fine depressions at the edges of the image, and has an outward filtering effect on the image.
  • Erosion merges two sets using vector subtraction on the set elements. Erosion is the dual operation of dilation, but erosion and dilation are not mutually inverse operations.
  • the expression of the erosion operation is as follows: X ⊖ B = { x | x + b ∈ X for all b ∈ B }.
  • Erosion can eliminate the smaller parts of the image, has a filtering effect on the image, and shrinks the image.
  • Step S1043 Calculate the image gradient of the contour buffer of the target object according to the image gradient algorithm.
  • the classic image gradient algorithm considers the gray-level changes in a neighborhood of each pixel and, using the first-order or second-order derivative behavior near an edge, defines a gradient operator that is convolved with a neighborhood of each pixel to calculate the gradient.
  • the morphological gradient combines dilation or erosion with image differencing to enhance the intensity variation of the pixels within the neighborhood of the structuring element.
  • here, the basic dilation and erosion operations are combined to calculate the image gradient of the contour buffer.
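For a grayscale image, dilation is a local maximum and erosion a local minimum over the structuring element, and the morphological gradient is their difference, which is large exactly on boundaries. An illustrative sketch with a square structuring element (names and the size default are assumptions):

```python
def _neigh(img, y, x, r):
    """Pixel values in the (2r+1) x (2r+1) window around (y, x),
    clamped at the image border."""
    h, w = len(img), len(img[0])
    return [img[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]

def morphological_gradient(img, size=3):
    """Grayscale dilation (local max) minus erosion (local min) over a
    size x size square structuring element; large values mark edges."""
    r = size // 2
    h, w = len(img), len(img[0])
    return [[max(_neigh(img, y, x, r)) - min(_neigh(img, y, x, r))
             for x in range(w)]
            for y in range(h)]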
  • Step S105 Perform distinguishing statistics on the target object according to the image gradient, and display the statistics result.
  • the target is distinguished according to the image gradient to obtain the target type corresponding to the target, and the number of the target in each target type is counted.
  • displaying the statistical results can directly display the number of targets of each target type, or display in combination with graphics, as shown in FIG. 12.
  • performing differentiated statistics on the target object according to the image gradient includes: obtaining the target type corresponding to the target object and the preset image gradient range corresponding to each target type, and classifying each target by comparing its image gradient with those preset ranges.
  • here, the target types are glomerular endothelial cells, mesangial cells, and podocytes.
  • the prior knowledge about the three types of target cells can be encoded as the following rules:
  • FIG. 13 shows the counted number of endothelial cells.
  • the boundary of endothelial cells is between the bright background and the dark cells, and the corresponding preset image gradient range is: the gradient value is greater than or equal to 192.
  • the preset image gradient range may also include other conditions. For example, if the number of pixels whose gradient value is greater than or equal to 192 in the contour buffer of the target cell exceeds 20%, it can be judged as an endothelial cell;
  • FIG. 14 shows several mesangial cells counted.
  • the preset image gradient range corresponding to the mesangial cells in the purple-red mesangial area is: the gradient value does not exceed 64.
  • the preset image gradient range can also be that the number of pixels whose gradient value does not exceed 64 in the contour buffer of the target cell accounts for more than 20%, which can be judged as mesangial cells;
  • FIG. 15 shows the counted podocytes.
  • the podocytes are in the lavender non-mesangial area, and the corresponding preset image gradient range is: 50 ⁇ gradient value ⁇ 128.
  • the preset image gradient range condition can also be that pixels whose gradient lies between 50 and 128 account for more than 80% of the contour buffer of the target cell, in which case the cell is determined to be a podocyte.
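The three rules above can be encoded as a small classifier over the gradient values sampled in a cell's contour buffer. The thresholds (192, 64, the 50-128 band, and the 20% / 80% support fractions) come from the text; the rule ordering and the function shape are assumptions of this sketch.

```python
def classify_cell(gradients):
    """Classify one target cell from the image gradients sampled in its
    contour buffer, using the preset gradient ranges described above."""
    n = len(gradients)
    if sum(g >= 192 for g in gradients) / n > 0.2:
        return "endothelial"   # boundary between bright background and dark cell
    if sum(g <= 64 for g in gradients) / n > 0.2:
        return "mesangial"     # purple-red mesangial area, low gradient
    if sum(50 <= g <= 128 for g in gradients) / n > 0.8:
        return "podocyte"      # lavender non-mesangial area, mid gradient
    return "unclassified"
```

Counting each returned label over all detected cells then yields the per-type totals that step S105 displays.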
  • the object statistics method based on image processing of the present application can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the object statistics, and reducing the burden of manual statistics.
  • FIG. 16 is a schematic flowchart of another method for object statistics based on image processing provided by an embodiment of the application.
  • the method for counting objects includes steps S201 to S206.
  • Step S201 Obtain an image to be recognized, and the image to be recognized includes a target object.
  • the image to be recognized is an image including a target object, and specifically, an image obtained by collecting a sample including the target object by an image acquisition device is used as the image to be recognized.
  • Step S202 Perform noise reduction and color inversion processing on the image to be identified to obtain a target image.
  • the preprocessing includes noise reduction processing and color inversion processing.
  • Step S203 Perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid and construct a differential pyramid according to the Gaussian pyramid, and extract a local extremum of the target object according to the differential pyramid.
  • the difference pyramid is constructed by taking the difference between the convolution results of two adjacent images of different scales in each layer of the Gaussian pyramid.
  • the difference pyramid can effectively detect stable local extreme points in the scale space, and then calculate the local extreme value of the target through the difference pyramid.
  • Step S204 based on the non-maximum value suppression algorithm, perform deduplication processing on the target object according to the local extremum of the target object.
  • a non-maximum value suppression algorithm is used to achieve deduplication.
  • the non-maximum suppression algorithm can be understood as a local maximum search, where "local" refers to a neighborhood with two variable parameters: the dimensionality of the neighborhood and the size of the neighborhood.
  • for a one-dimensional search, the search range is 2n-1, where n is greater than or equal to 1;
  • a point is a local maximum when its value is greater than the values of its neighboring points; once a point is determined to be a local maximum, the iteration index can be advanced by 2 to search for the next local maximum;
  • the scan starts from the left: if the current point's value is not greater than the value to its right, the iteration index is advanced by 1 until the current point's value is greater than the value on its right, which yields a local optimum; the scan then continues in the same way until all values have been traversed.
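The one-dimensional scan above amounts to keeping only points that are strictly greater than every neighbour inside a small window. A compact illustrative sketch (it rescans each window rather than using the skip-by-2 shortcut described above, and the radius parameterization is an assumption):

```python
def non_max_suppression_1d(values, radius=1):
    """Return the indices of local maxima: points strictly greater than
    every neighbour within `radius` positions on either side."""
    keep = []
    for i, v in enumerate(values):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        if all(v > values[j] for j in range(lo, hi) if j != i):
            keep.append(i)
    return keep
```

Applied to the DoG responses, this keeps a single extremum per target, so a cell detected at several nearby positions or scales is counted only once.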
  • Step S205 Extract the image gradient of the boundary of the target object after de-duplication processing according to the local extremum of the target object.
  • the sub-image is cropped according to the local extremum of the target; the contour line of the target in the sub-image is extracted according to the watershed segmentation algorithm; the contour buffer of the target is extracted according to the morphological expansion operation; and then calculated according to the image gradient algorithm The image gradient of the contour buffer of the target.
  • Step S206 Perform differentiated statistics on the target object according to the image gradient, and display the statistical result.
  • Specifically, the target type corresponding to the target object and the preset image gradient range corresponding to that target type are obtained; differentiated statistics are then performed on the target object according to the image gradient and the preset image gradient range, and the statistical results are displayed.
  • the statistical result can be displayed on the target image or on the display interface through the terminal.
  • the target object statistics based on image processing of the present application can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the target object statistics, and reducing the burden of manual statistics.
  • FIG. 17 is a schematic block diagram of an image processing-based target statistics device provided by an embodiment of the present application.
  • the target object statistics device 300 is configured to perform any of the aforementioned image-processing-based target object statistics methods.
  • the object statistics device 300 includes: an image acquisition unit 301, a preprocessing unit 302, a construction processing unit 303, a gradient calculation unit 304, and a result statistics unit 305.
  • the image acquisition unit 301 is configured to acquire an image to be identified, and the image to be identified includes a target object.
  • the preprocessing unit 302 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
  • the preprocessing unit 302 includes a denoising unit 3021 and an inversion processing unit 3022.
  • The denoising unit 3021 is used to smooth the image to be recognized by bilateral filtering to remove pseudo-point noise; the inversion processing unit 3022 is used to perform color inversion processing on the bilaterally filtered image to obtain the target image.
  • The construction processing unit 303 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
  • the gradient calculation unit 304 is configured to calculate the image gradient of the boundary of the target object according to the local extremum of the target object.
  • the gradient calculation unit 304 includes a sub-image cropping unit 3041, a buffer extraction unit 3042, and a buffer calculation unit 3043.
  • The sub-image cropping unit 3041 is used to crop a sub-image containing the target object from the target image according to the local extrema of the target object;
  • the buffer extraction unit 3042 is used to extract the contour of the target object in the sub-image with the watershed segmentation algorithm, and to extract the target object's contour buffer centered on that contour by the morphological dilation operation;
  • the buffer calculation unit 3043 is used to calculate the image gradient of the target object's contour buffer with the image gradient algorithm.
  • the result statistics unit 305 is configured to perform differentiated statistics on the target object according to the image gradient, and display the statistics result.
  • The image-processing-based target object statistics apparatus of the present application offers a high degree of automation, low data storage requirements, and fast processing; it can count target objects accurately and reduces the burden of manual counting.
  • FIG. 20 is a schematic block diagram of another object statistics device based on image processing provided by an embodiment of the application.
  • the object statistics device 400 is configured to execute any one of the aforementioned methods for object statistics based on image processing.
  • the object statistics device 400 includes: an image acquisition unit 401, a preprocessing unit 402, a construction processing unit 403, a deduplication unit 404, a gradient calculation unit 405, and a result statistics unit 406.
  • the image acquisition unit 401 is configured to acquire an image to be identified, and the image to be identified includes a target object.
  • the preprocessing unit 402 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
  • The construction processing unit 403 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
  • The deduplication unit 404 is configured to perform, based on a non-maximum suppression algorithm, de-duplication processing on the target objects according to their local extrema.
  • the gradient calculation unit 405 is configured to calculate the image gradient of the boundary of the target object after deduplication processing according to the local extremum of the target object.
  • the result statistics unit 406 is configured to perform differentiated statistics on the target object according to the image gradient, and display the statistics result.
  • The image-processing-based target object statistics apparatus of the present application provides a fast and effective tool for image statistical features, requires no model training, and offers high processing efficiency and accuracy.
  • the above-mentioned object statistics device may be implemented in the form of a computer program, and the computer program may run on the computer device as shown in FIG. 21.
  • FIG. 21 is a schematic block diagram of a computer device according to an embodiment of the present application.
  • The computer device may be a server or a terminal.
  • the computer device includes a processor, a memory, and a network interface connected through a system bus, where the memory may include a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium can store an operating system and a computer program.
  • the computer program includes program instructions, and when the program instructions are executed, the processor can execute any object statistical method based on image processing.
  • the processor is used to provide computing and control capabilities and support the operation of the entire computer equipment.
  • the internal memory provides an environment for the operation of the computer program in the non-volatile storage medium.
  • When the computer program is executed by the processor, the processor can be caused to perform any image-processing-based target object statistics method.
  • the network interface is used for network communication, such as sending assigned tasks.
  • FIG. 21 is only a block diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the computer device to which the solution is applied.
  • A specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • The general-purpose processor may be a microprocessor or any conventional processor.
  • the processor is used to run a computer program stored in a memory to implement the following steps:
  • Acquire an image to be recognized, the image to be recognized containing a target object; perform noise reduction and color inversion processing on the image to be recognized to obtain a target image; perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid; calculate the image gradient of the target object's boundary according to the local extrema; distinguish and count the target objects according to the image gradient, and display the statistical result.
  • When implementing the noise reduction and color inversion processing on the image to be recognized to obtain the target image, the processor is configured to: smooth the image to be recognized by bilateral filtering to remove pseudo-point noise; and perform color inversion processing on the bilaterally filtered image to obtain the target image.
  • When implementing the calculation of the image gradient of the target object's boundary according to the local extrema of the target object, the processor is configured to: crop a sub-image containing the target object from the target image according to the local extrema; extract the contour of the target object in the sub-image with the watershed segmentation algorithm and extract the target object's contour buffer centered on that contour by the morphological dilation operation; and calculate the image gradient of the contour buffer with the image gradient algorithm.
  • When implementing the cropping of the sub-image containing the target object according to the local extrema, the processor is configured to: generate a rectangular region corresponding to the target object from its local extrema and its circumscribed rectangle, and crop the target object from the target image according to that rectangular region to obtain the sub-image.
  • Before implementing the calculation of the image gradient of the target object's boundary according to the local extrema, the processor is further configured to: perform de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm. Accordingly, when extracting the image gradient of the target object's boundary according to the local extrema, the processor extracts the image gradient of the de-duplicated target object's boundary.
  • When implementing the distinguishing statistics of the target objects according to the image gradient, the processor is configured to: acquire the target type corresponding to the target object and the preset image gradient range corresponding to that type, and distinguish and count the target objects according to the image gradient and the preset image gradient range.
  • The embodiments of the present application also provide a computer-readable storage medium storing a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement any of the image-processing-based target object statistics methods provided in the embodiments of the present application.
  • The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiments, such as the hard disk or memory of the computer device.
  • The computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the computer device.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A target object statistics method, apparatus, device and storage medium based on image processing. The method includes: acquiring an image to be recognized, the image containing a target object; processing the image to be recognized to obtain a target image; performing a convolution operation on the target image to construct a Gaussian pyramid and a differential pyramid; extracting local extrema of the target object from the differential pyramid; and calculating the image gradient of the target object's boundary from the local extrema so as to distinguish and count the target objects and display the statistical result.

Description

Target object statistics method, apparatus, device and storage medium based on image processing
This application claims priority to Chinese patent application No. 201910420808.8, filed with the China National Intellectual Property Administration on May 20, 2019 and entitled "Target object statistics method, apparatus, device and storage medium based on image processing", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of image detection, and in particular to a target object statistics method, apparatus, device and storage medium based on image processing.
Background
In traditional target object statistics, the target objects in an image can only be counted by manual interpretation. For example, a doctor counts the number of target objects in a pathological image (the target objects may be the endothelial cells, mesangial cells and podocytes of a glomerulus) by visual interpretation. The target objects in such pathological images are, however, numerous and heterogeneous; manual counting is time-consuming, laborious and error-prone. There is therefore an urgent need to solve the problem of low efficiency and poor accuracy in target object statistics.
Summary
This application provides a target object statistics method, apparatus, device and storage medium based on image processing, so as to improve the efficiency and accuracy of counting target objects.
In a first aspect, this application provides a target object statistics method based on image processing, including:
acquiring an image to be recognized, the image to be recognized containing a target object;
performing noise reduction and color inversion processing on the image to be recognized to obtain a target image;
performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting local extrema of the target object from the differential pyramid;
calculating the image gradient of the target object's boundary according to the local extrema of the target object;
distinguishing and counting the target objects according to the image gradient, and displaying the statistical result.
In a second aspect, this application further provides a target object statistics apparatus based on image processing, including:
an image acquisition unit configured to acquire an image to be recognized, the image to be recognized containing a target object;
a preprocessing unit configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image;
a construction processing unit configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract local extrema of the target object from the differential pyramid;
a gradient calculation unit configured to calculate the image gradient of the target object's boundary according to the local extrema of the target object;
a result statistics unit configured to distinguish and count the target objects according to the image gradient and display the statistical result.
In a third aspect, this application further provides a computer device including a memory and a processor; the memory is configured to store a computer program; the processor is configured to execute the computer program and, when executing it, to implement the above target object statistics method based on image processing.
In a fourth aspect, this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the above target object statistics method based on image processing.
This application discloses a target object statistics method, apparatus, device and storage medium based on image processing. The method includes: acquiring an image to be recognized, the image containing a target object; performing noise reduction and color inversion processing on the image to obtain a target image; performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting local extrema of the target object from the differential pyramid; calculating the image gradient of the target object's boundary according to the local extrema; and distinguishing and counting the target objects according to the image gradient and displaying the statistical result. The method can quickly and accurately distinguish and count the target objects in the target image, thereby improving the efficiency and accuracy of target object statistics and reducing the burden of manual counting.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of this application; those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a target object statistics method based on image processing provided by an embodiment of this application;
FIG. 2 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
FIG. 3a is a schematic diagram of the effect of an image to be recognized provided by an embodiment of this application;
FIG. 3b is a schematic diagram of the effect of a target image provided by an embodiment of this application;
FIG. 4 is a schematic diagram of the construction process of the differential pyramid provided by an embodiment of this application;
FIG. 5 is a schematic diagram of the local extremum points of a target cell in scale space and in two-dimensional image space provided by an embodiment of this application;
FIG. 6 is a schematic diagram of the cascade filter structure provided by an embodiment of this application;
FIG. 7 is a schematic diagram of the local extremum regions of target cells provided by an embodiment of this application;
FIG. 8 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
FIG. 9a is a schematic diagram of a target image provided by an embodiment of this application;
FIG. 9b is a schematic diagram of a sub-image provided by an embodiment of this application;
FIG. 10 is a schematic diagram of several sub-images cropped from the target image provided by an embodiment of this application;
FIG. 11 is a schematic diagram of several target cells with contour buffers obtained by applying the dilation operation to the target cells' contour lines provided by an embodiment of this application;
FIG. 12 is a schematic diagram of the display effect of the target cell statistics result provided by an embodiment of this application;
FIG. 13 is a schematic diagram of several endothelial cells counted by an embodiment of this application;
FIG. 14 is a schematic diagram of several mesangial cells counted by an embodiment of this application;
FIG. 15 is a schematic diagram of several podocytes counted by an embodiment of this application;
FIG. 16 is a schematic flowchart of another target object statistics method based on image processing provided by an embodiment of this application;
FIG. 17 is a schematic block diagram of a target object statistics apparatus based on image processing provided by an embodiment of this application;
FIG. 18 is a schematic block diagram of a preprocessing unit provided by an embodiment of this application;
FIG. 19 is a schematic block diagram of a gradient calculation unit provided by an embodiment of this application;
FIG. 20 is a schematic block diagram of another target object statistics apparatus based on image processing provided by an embodiment of this application;
FIG. 21 is a schematic structural block diagram of a computer device provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. Based on the embodiments of this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of this application.
The flowcharts and schematic block diagrams shown in the drawings are merely illustrative; they need not include all contents and operations/steps, nor must they be executed in the order described. For example, some operations/steps may be decomposed, combined or partially merged, so the actual order of execution may change according to the actual situation.
The embodiments of this application provide a target object statistics method, apparatus, device and storage medium based on image processing, which can be used to count target objects in pathological images and, of course, also in other images. The target objects in a pathological image may be lymphocytes, DNA, RNA, chromosomes, or the endothelial cells, mesangial cells and podocytes of a glomerulus. For ease of understanding, the following embodiments are described in detail with the endothelial cells, mesangial cells and podocytes of a glomerulus as the target objects.
The target object statistics method based on image processing can be applied to a terminal or a server, or used with the server and terminal interacting, so as to count target objects quickly and accurately. In server-terminal interaction, for example, the server sends the statistical result to the terminal for use.
The server may be an independent server or a server cluster. The terminal may be an electronic device such as a tablet computer, a notebook computer, a desktop computer, a smartphone or a wearable device.
Some embodiments of this application are described in detail below with reference to the drawings. The embodiments below and the features in them may be combined with one another where no conflict arises.
As shown in FIG. 1, FIG. 1 is a schematic flowchart of a target object statistics method based on image processing provided by an embodiment of this application. The method includes steps S101 to S105.
Step S101: acquire an image to be recognized.
Specifically, the image to be recognized is an image containing a target object. The image to be recognized may be acquired with an ordinary camera, a scanner, a webcam, a video camera, an image capture card, a microscopic digital camera, or the like.
For example, when the target objects are the endothelial cells, mesangial cells and podocytes of a glomerulus, the acquisition process is: collecting a kidney biopsy sample; staining the kidney tissue to prepare kidney sections; and capturing images of the stained kidney sections with an image acquisition device as the images to be recognized. Because the endothelial cells, mesangial cells and podocytes of a glomerulus usually number in the hundreds, are dark in tone and have similar, hard-to-distinguish features, the sample is stained before the image to be recognized is acquired. The stain used is Periodic Acid-Schiff stain (PAS); after PAS staining, the glomerular endothelial cells, mesangial cells and podocytes are highlighted in purple-red in the image to be recognized. The numbers of endothelial cells, mesangial cells and podocytes can thus be counted more intuitively, improving the accuracy of counting the target cells.
Step S102: perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
Specifically, to count the target objects quickly and accurately, the acquired image to be recognized needs to be preprocessed. The preprocessing includes noise reduction and color inversion; the noise reduction may use wavelet denoising, Fourier-transform denoising, or the like. The specific denoising method is not limited here, as long as noise points are removed. To highlight the target objects, the image to be recognized is also color-inverted after noise reduction; the color inversion may call an inversion function or a corresponding image processing tool.
In one embodiment, FIG. 2 shows the sub-steps of performing noise reduction and color inversion processing on the image to be recognized to obtain a target image, including sub-steps S1021 and S1022.
Step S1021: smooth the image to be recognized by bilateral filtering to remove pseudo-point noise.
Because the temperature of the instrument acquiring the image to be recognized rises and surrounding noise signals are strong, variegated spots appear where they should not in the image; these points are noise points, also called noise. Noise is mainly produced by the sensor during imaging under the influence of factors such as same-object-different-spectrum and same-spectrum-different-object effects. Common types include salt-and-pepper noise and banding noise.
To prevent noise from interfering with target object statistics, this embodiment applies a bilateral filtering algorithm to denoise the image to be recognized. Bilateral filtering considers not only the distance between pixels but also the similarity between gray levels; that is, it combines filtering over the spatial domain with filtering over the gray-level (range) domain. Filtering over the spatial domain is expressed as:

Î(x,y) = Σ_{(i,j)∈Ω} w_s(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w_s(i,j)    (1)

In formula (1), Î(x,y) is the denoised image to be recognized, w_s(i,j) is the spatial-domain weight, I(i,j) is the image to be recognized, and Ω is the neighborhood around pixel (x,y).
Similarly, filtering over the gray-level domain follows the same scheme and is expressed as:

Î(x,y) = Σ_{(i,j)∈Ω} w_r(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w_r(i,j)    (2)

In formula (2), w_r(i,j) is the gray-level-domain weight, and the other symbols are as in formula (1).
Combining spatial proximity with gray-level similarity gives the bilateral filter:

Î(x,y) = Σ_{(i,j)∈Ω} w(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w(i,j)    (3)
w(i,j) = w_s(i,j)·w_r(i,j)    (4)

In formulas (3) and (4), w(i,j) is the product of the spatial-domain weight w_s(i,j) and the gray-level-domain weight w_r(i,j).
It follows from formula (3) that in regions where the image to be recognized varies gently, the gray values of neighboring pixels differ little and the bilateral filter degenerates into a Gaussian low-pass filter; in regions where the image varies sharply, the filter replaces the original gray value with the average gray value of pixels with similar gray levels in the neighborhood of the edge point. The bilateral filter therefore both preserves image edge information and smooths away noise points in the image to be recognized.
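The bilateral filter of formulas (1) to (4) can be sketched directly. The following is an illustrative, naive pure-Python implementation written for this document (the function name and parameters are ours, not from the application); a production pipeline would normally call an optimized library routine instead of looping per pixel.

```python
import math

def bilateral_filter(img, radius=1, sigma_s=1.0, sigma_r=25.0):
    """Naive bilateral filter over a 2-D grayscale image (nested lists).

    Each output pixel is the weighted mean of its neighborhood Omega,
    where the weight is the product of a spatial Gaussian w_s and a
    gray-level (range) Gaussian w_r, as in formulas (3) and (4).
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for j in range(max(0, y - radius), min(h, y + radius + 1)):
                for i in range(max(0, x - radius), min(w, x + radius + 1)):
                    w_s = math.exp(-((i - x) ** 2 + (j - y) ** 2) / (2 * sigma_s ** 2))
                    w_r = math.exp(-((img[j][i] - img[y][x]) ** 2) / (2 * sigma_r ** 2))
                    num += w_s * w_r * img[j][i]
                    den += w_s * w_r
            out[y][x] = num / den
    return out
```

On a flat region all range weights are equal, so the filter reduces to a Gaussian mean; across a strong edge the range weight w_r collapses to nearly zero, so the edge is preserved, matching the behavior described above.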
Step S1022: perform color inversion processing on the bilaterally filtered image to obtain the target image.
Since the target cell regions usually appear as dark, nearly circular patches of varying size, the image to be recognized is color-inverted to improve the efficiency and accuracy of counting the target cells.
Each pixel in the image has four values (alpha, red, green and blue), which are the basic elements of color, each taking values in [0, 255]. Color inversion subtracts the R, G and B values of each pixel of the image to be recognized from 255. As shown in FIGS. 3a and 3b, FIG. 3a is the image to be recognized before inversion, and FIG. 3b is the image after inversion, i.e., the target image.
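The inversion described here is a one-line computation; the sketch below is an illustrative helper of our own (not code from the application) for a grayscale image, and for an RGB image the same subtraction would be applied to each of the R, G and B channels while leaving alpha untouched.

```python
def invert(img):
    """Color inversion of an 8-bit image: every pixel p becomes 255 - p,
    turning dark, near-round cell blobs into bright blobs."""
    return [[255 - p for p in row] for row in img]
```

Applying it twice returns the original image, since inversion is an involution.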
Step S103: perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
The Gaussian pyramid constructed with the Gauss-Laplacian function is expressed as:

L(x,y,σ) = G(x,y,σ) * I(x,y)    (5)

The differential pyramid, constructed indirectly through the Gauss-Laplacian function or directly from the Gaussian pyramid, is expressed as:

D(x,y,σ) = (G(x,y,kσ) − G(x,y,σ)) * I(x,y) = L(x,y,kσ) − L(x,y,σ)    (6)

In formulas (5) and (6), L(x,y,kσ) and L(x,y,σ) are Gaussian pyramid layers, D(x,y,σ) is the differential pyramid, G(x,y,kσ) and G(x,y,σ) are Gaussian kernels at adjacent scales, and I(x,y) is the input two-dimensional image, i.e., the target image.
As shown in FIG. 4, which gives the construction process of the first and second octaves, the differential pyramid is built by taking the difference of the convolution results of two adjacent images of different scales within each octave of the Gaussian pyramid. The differential pyramid can be regarded as simulating the visual mechanism by which the retinal nerves extract information from an image and pass it to the brain. It can effectively detect stable local extremum points in scale space, from which the local extrema of the target object are computed.
The construction of the Gaussian and differential pyramids is as follows. Local extremum extraction uses convolution with the Laplacian of Gaussian (LoG) function. The Laplacian after Gaussian smoothing is a second-order differential operator that produces a steep zero crossing at an edge (a positive-to-negative transition through zero), and extremum points are detected from the zero crossings. Convolving an image with a two-dimensional function in fact measures the similarity between the image and that function: when the shape of a target object in the image approaches the shape of the Gauss-Laplacian function, the image's Laplacian response reaches its maximum. The Gauss-Laplacian operator is related to the Gaussian kernel as follows:

G(x,y,σ) = (1 / 2πσ²) · e^(−(x²+y²)/2σ²)    (7)
G(x,y,kσ) − G(x,y,σ) ≈ (k − 1)·σ²·∇²G    (8)

Since the Gaussian convolution kernel is the only linear kernel that realizes scale transformation, an image's representation in scale space is the convolution of the image with a variable Gaussian kernel, written with the Gaussian pyramid operator as in formula (5) above.
Moreover, the Gauss-Laplacian operator can be approximated by the Difference of Gaussian (DoG) operator, giving the differential pyramid expression of formula (6) above.
Given that the DoG function approximates the scale-normalized Gauss-Laplacian function, and that DoG convolution is faster than Gauss-Laplacian convolution, this embodiment further constructs a differential pyramid to count target objects quickly and effectively. Moreover, by fusing convolutions over a multi-scale differential pyramid, the extremum centers of target objects of different sizes can be identified robustly, yielding each target object's center and size (radius).
When convolving over the multi-scale differential pyramid, a point that attains a maximum in both space and scale is the desired local extremum point. For a two-dimensional image I(x,y), the discrete Laplacian responses at different scales are computed and every point in the space is checked. FIG. 5 is a schematic diagram of a target cell's local extremum points in scale space and two-dimensional image space, where L1 is a neighboring point at the next scale up, L2 a neighboring point at the same scale, and L3 a neighboring point at the next scale down. Each point is compared with the 8 points in its 3×3 neighborhood at the same scale, and, within the same octave, the center point is also compared with the 2×9 points of the two adjacent layers above and below. This guarantees that a detected key point is a local extremum in both scale space and two-dimensional image space, in which case it is the center of the detected target object's extremum region.
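The 8 + 2×9 = 26-neighbor comparison described above can be sketched as follows. The code is an illustrative pure-Python helper of our own (a real implementation would vectorize this): it scans the middle layers of a DoG stack, represented as a list of 2-D lists indexed dog[scale][row][col], and keeps only samples that strictly exceed all 26 neighbors.

```python
def is_scale_space_maximum(dog, s, y, x):
    """True if dog[s][y][x] strictly exceeds its 8 same-scale neighbors
    and the 9 samples in each of the two adjacent scale layers."""
    v = dog[s][y][x]
    for ds in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if ds == dy == dx == 0:
                    continue
                if dog[s + ds][y + dy][x + dx] >= v:
                    return False
    return True

def find_extrema(dog):
    """Scan every interior sample of the stack's middle layers and return
    the (scale, row, col) coordinates of the scale-space maxima, i.e. the
    candidate extremum-region centers of the target objects."""
    hits = []
    for s in range(1, len(dog) - 1):
        for y in range(1, len(dog[s]) - 1):
            for x in range(1, len(dog[s][y]) - 1):
                if is_scale_space_maximum(dog, s, y, x):
                    hits.append((s, y, x))
    return hits
```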
The image scale space is formed by convolving the image with Gaussian filters of variable kernels, yielding the image's Gaussian pyramid. For example, the Gaussian pyramid is divided into O octaves of S+3 layers each, where S is the number of layers between σ and 2σ, generally 2 or 3. Each octave is formed by convolving the input image with cascaded filters. The formation of the Gaussian pyramid is described in detail below, taking S = 2 as an example.
When S is 2, each octave of the Gaussian pyramid has 5 layers in total, obtained from four cascaded filter stages. The formation of the first octave is shown in FIG. 6. The input image I_0 is obtained from the initial image I through a filter with Gaussian kernel σ, and the output images I_i (i = 1, ..., 4) are obtained from I_0 through cascaded filters I, II, III and IV respectively. The first octave consists of the five images I_i (i = 0, 1, ..., 4) in FIG. 6, each with Gaussian kernel k^i·σ (i = 0, 1, ..., 4), meaning that image I_i can be regarded as the initial image I filtered by a Gaussian with kernel k^i·σ.
Each layer of the second octave is 1/4 the size of the images in the first octave; its input image is obtained by sampling the S-th layer of the first octave at a sampling rate of 2. When S is 2, the input image I_0' of the second octave is obtained by sampling the above image I_1. I_0' then passes through the four cascaded filter stages to give the output images I_i' (i = 0, 1, ..., 4); the cascade structure is identical to that of the first octave, and the images I_i' constitute the second octave. By analogy, 4 octaves of the Gaussian pyramid are generated in total.
The differential pyramid, constructed indirectly through the Gauss-Laplacian function or directly from the Gaussian pyramid, can extract the local extrema of the target objects. The extraction effect for the target cells in this embodiment is shown in FIG. 7.
Step S104: calculate the image gradient of the target object's boundary according to the local extrema of the target object.
Here, the target object's boundary refers to its boundary region, for example the target object's contour buffer. The image gradient may be computed from the local extrema with an image gradient algorithm, or of course by calling an image processing tool such as Matlab.
In one embodiment, as shown in FIG. 8, step S104 includes sub-steps S1041 to S1043.
Step S1041: crop a sub-image containing the target object from the target image according to the target object's local extrema.
Specifically, a rectangular region corresponding to the target object is generated from the target object's local extrema and its circumscribed rectangle, and the target object is cropped from the target image according to that rectangular region to obtain the sub-image.
In one embodiment, after the target cell's local extrema are obtained, the sub-image is cropped using a rectangular region centered on the local extremum point and formed by expanding the target object's circumscribed rectangle outward by M×N pixels, where M and N are positive integers and may be equal. FIG. 9a is the target image; in this embodiment the circumscribed rectangle is expanded outward by 5×5 pixels to cut out the target object's sub-image, where the circumscribed rectangle is the inner rectangle shown in FIG. 9b and the rectangular region is the area enclosed by the outer rectangle in FIG. 9b.
Since endothelial cells, mesangial cells and podocytes vary in size, target cells of different sizes are resampled to the same H×D pixels (H and D positive integers) to facilitate statistics, and the resampled image is taken as the target object's sub-image, which improves statistical efficiency and accuracy. As shown in FIG. 10, FIG. 10 shows several sub-images cropped after uniform resampling to 80×80 pixels.
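The cropping-and-resampling step can be sketched as below. This is an illustrative pure-Python version of our own (the application gives no code): the rectangle is the circumscribed box expanded by a margin of 5 pixels on each side, clamped to the image, and nearest-neighbor resampling brings every sub-image to a common size such as 80×80.

```python
def crop_subimage(img, cy, cx, half_h, half_w, margin=5):
    """Cut the rectangle around the extremum center (cy, cx): the
    circumscribed rectangle (given by its half height/width) expanded
    outward by `margin` pixels, clamped to the image bounds."""
    h, w = len(img), len(img[0])
    y0, y1 = max(0, cy - half_h - margin), min(h, cy + half_h + margin + 1)
    x0, x1 = max(0, cx - half_w - margin), min(w, cx + half_w + margin + 1)
    return [row[x0:x1] for row in img[y0:y1]]

def resample_nearest(img, out_h, out_w):
    """Nearest-neighbor resampling so that sub-images of differently
    sized cells all end up with the same pixel dimensions."""
    h, w = len(img), len(img[0])
    return [[img[y * h // out_h][x * w // out_w] for x in range(out_w)]
            for y in range(out_h)]
```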
Step S1042: extract the contour of the target object in the sub-image with a watershed segmentation algorithm, and extract the target object's contour buffer centered on that contour by a morphological dilation operation.
The contour line of the target cell is extracted with the watershed segmentation algorithm using the local extremum center of the target cell in the sub-image as the seed point. Centered on the contour line, a morphological dilation with an L×L structuring element (a 3×3 element may be used) thickens the extracted contour by one layer both inward and outward, giving each target cell's contour buffer, as shown in FIG. 11.
Mathematical image morphology is based on shape: a structuring element of a certain shape is used to measure and extract the corresponding target object contours in the image. The principal morphological operations are erosion, dilation, opening and closing; from these basic operations, various practical morphological algorithms can be combined and derived, allowing image shape and structure to be analyzed and processed.
(1) Dilation merges two sets using vector addition: it is the set of all vector sums whose two operands come from the set X and the structuring element B respectively, over every possible combination. The dilation operation is expressed as:

X ⊕ B = { x + b | x ∈ X, b ∈ B }    (9)

Dilation can fill small holes in the image (holes that are small relative to the size of the structuring element) and slight concavities at the image edges, and acts as a filter on the image exterior.
(2) Erosion merges two sets using vector subtraction on the set elements; it is the dual operation of dilation, although erosion and dilation are not mutually inverse. The erosion operation is expressed as:

X ⊖ B = { p | p + b ∈ X for every b ∈ B }    (10)

Erosion removes small parts of the image, filters the image interior, and shrinks the image.
Step S1043: calculate the image gradient of the target object's contour buffer with an image gradient algorithm.
The classical image gradient algorithm considers the gray-level change within a neighborhood of each pixel and, using the first- or second-derivative behavior near edges, convolves the neighborhood with a gradient operator. The morphological gradient, by contrast, enhances pixel intensities in the structuring element's neighborhood by combining dilation or erosion with a difference against the image. When computing the image's morphological gradient, the basic dilation and erosion operations are combined to realize the gradient computation over the buffer.
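The combination of the basic operations can be sketched as follows; this is an illustrative pure-Python version of our own with a square (2r+1)×(2r+1) structuring element, where the morphological gradient is simply the dilation minus the erosion and is therefore bright exactly along object boundaries.

```python
def dilate(img, r=1):
    """Grayscale dilation: each pixel takes the maximum over its
    (2r+1)x(2r+1) neighborhood, clamped at the image border."""
    h, w = len(img), len(img[0])
    return [[max(img[j][i]
                 for j in range(max(0, y - r), min(h, y + r + 1))
                 for i in range(max(0, x - r), min(w, x + r + 1)))
             for x in range(w)] for y in range(h)]

def erode(img, r=1):
    """Grayscale erosion, the dual of dilation: the neighborhood minimum."""
    h, w = len(img), len(img[0])
    return [[min(img[j][i]
                 for j in range(max(0, y - r), min(h, y + r + 1))
                 for i in range(max(0, x - r), min(w, x + r + 1)))
             for x in range(w)] for y in range(h)]

def morphological_gradient(img, r=1):
    """Dilation minus erosion: zero on flat regions, positive on edges."""
    d, e = dilate(img, r), erode(img, r)
    return [[dv - ev for dv, ev in zip(dr, er)] for dr, er in zip(d, e)]
```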
Step S105: distinguish and count the target objects according to the image gradient, and display the statistical result.
Specifically, the target objects are distinguished according to the image gradient to obtain each target object's target type, and the number of target objects of each type is counted. When displaying the statistical result, the count for each target type may be shown directly or together with graphics, as shown in FIG. 12.
In one embodiment, distinguishing and counting the target objects according to the image gradient includes:
acquiring the target type corresponding to the target object and the preset image gradient range corresponding to that type, and distinguishing and counting the target objects according to the image gradient and the preset image gradient range.
Illustratively, the target types are the endothelial cells, mesangial cells and podocytes of the glomerulus. When counting the target cells, the prior knowledge of the three cell types can be expressed programmatically from the target types and their preset image gradient ranges as follows:
a. FIG. 13 shows several counted endothelial cells. Endothelial cell boundaries lie between the bright background and the dark cells; the corresponding preset image gradient range is: gradient value greater than or equal to 192. The preset range may of course include other conditions; for example, if more than 20% of the pixels in the target cell's contour buffer have gradient values of 192 or more, the cell can be judged to be an endothelial cell;
b. FIG. 14 shows several counted mesangial cells. For mesangial cells in the purple-red mesangial region, the corresponding preset image gradient range is: gradient value not exceeding 64. Alternatively, the preset range may require that more than 20% of the pixels in the target cell's contour buffer have gradient values not exceeding 64 for the cell to be judged mesangial;
c. FIG. 15 shows several counted podocytes. Podocytes lie in the pale-purple non-mesangial region; the corresponding preset image gradient range is: 50 ≤ gradient value ≤ 128. Alternatively, the preset range may consider the pixels in the contour buffer with values between 50 and 128, and if such pixels account for more than 80%, the cell can be judged to be a podocyte.
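The three programmed rules can be sketched as a single classifier over the gradient values sampled in one cell's contour buffer. The thresholds (192, 64, 50 to 128, 20%, 80%) are taken from the rules above; the function name and the order in which the rules are tried are our own illustrative choices.

```python
def classify_cell(buffer_gradients):
    """Classify one cell from the gradient values of its contour buffer:
      a. endothelial: more than 20% of pixels with gradient >= 192
      b. mesangial:   more than 20% of pixels with gradient <= 64
      c. podocyte:    at least 80% of pixels with 50 <= gradient <= 128
    Cells matching none of the rules are reported as 'unknown'."""
    n = len(buffer_gradients)
    if n == 0:
        return "unknown"
    hi = sum(1 for g in buffer_gradients if g >= 192) / n
    lo = sum(1 for g in buffer_gradients if g <= 64) / n
    mid = sum(1 for g in buffer_gradients if 50 <= g <= 128) / n
    if hi > 0.2:
        return "endothelial"
    if lo > 0.2:
        return "mesangial"
    if mid >= 0.8:
        return "podocyte"
    return "unknown"
```

Counting the returned labels over all detected cells then yields the per-type totals that step S105 displays.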
The target object statistics method based on image processing of this application can quickly and accurately distinguish and count the target objects in the target image, thereby improving the efficiency and accuracy of target object statistics and reducing the burden of manual counting.
As shown in FIG. 16, FIG. 16 is a schematic flowchart of another target object statistics method based on image processing provided by an embodiment of this application. The method includes steps S201 to S206.
Step S201: acquire an image to be recognized, the image to be recognized containing a target object.
The image to be recognized is an image containing a target object; specifically, an image of a sample containing the target object, captured by an image acquisition device, serves as the image to be recognized.
Step S202: perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
To count the target objects quickly and accurately, the acquired image to be recognized needs to be preprocessed; the preprocessing includes noise reduction and color inversion.
Step S203: perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
The differential pyramid is built by taking the difference of the convolution results of two adjacent images of different scales within each octave of the Gaussian pyramid. It can effectively detect stable local extremum points in scale space, from which the local extrema of the target object are computed.
Step S204: perform de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm.
In one embodiment, when detecting the target objects' local extremum regions, portions that are adhered or whose boundaries are not salient are de-duplicated to obtain target objects with salient boundaries. This embodiment uses a non-maximum suppression algorithm for de-duplication. Non-maximum suppression can be understood as a local maximum search, where "local" denotes a neighborhood with two variable parameters: the neighborhood's dimensionality and its size.
Specifically, the non-maximum suppression algorithm proceeds as follows:
a. search the range of the target object's boundary that is not an extremum point; for a sequence of length n, the search range is 2n − 1, where n ≥ 1;
b. a local maximum requires the point's value to be greater than the values of its neighbors; once a point is determined to be a local maximum, the iteration index can be advanced by 2 to look for the next local maximum;
c. starting from the left, if the point's value is not greater than the value to its right, the iteration index is advanced by 1 until the current point's value is greater than the value to its right, at which moment a local optimum is obtained; step b is then repeated until all values have been traversed.
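For a one-dimensional sequence, steps a to c amount to the following scan. This is an illustrative implementation of our own: it reports strict interior maxima and, once a peak is confirmed, advances the index by 2, since the sample just after a peak can never itself be a peak.

```python
def local_maxima_1d(vals):
    """Left-to-right non-maximum suppression over a 1-D sequence.

    An index i is kept when vals[i] is strictly greater than both of
    its neighbors; after testing a position with vals[i] > vals[i + 1],
    the index advances by 2, because vals[i + 1] < vals[i] rules out
    a peak at i + 1."""
    peaks = []
    i = 1
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:
            if vals[i] > vals[i - 1]:
                peaks.append(i)
            i += 2
        else:
            i += 1
    return peaks
```

Plateaus of equal values are suppressed entirely by the strict comparisons; the two-dimensional case applies the same idea over an image neighborhood.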
Step S205: extract the image gradient of the de-duplicated target object's boundary according to the target object's local extrema.
Specifically, a sub-image is cropped according to the target object's local extrema; the contour line of the target object in the sub-image is extracted with the watershed segmentation algorithm; the target object's contour buffer is extracted by the morphological dilation operation; and the image gradient of the contour buffer is then calculated with the image gradient algorithm.
Step S206: distinguish and count the target objects according to the image gradient, and display the statistical result.
Specifically, the target type corresponding to the target object and the preset image gradient range corresponding to that type are acquired; the target objects are distinguished and counted according to the image gradient and the preset image gradient range, and the statistical result is displayed. The result may be shown either on the target image or on a display interface of the terminal.
The target object statistics based on image processing of this application can quickly and accurately distinguish and count the target objects in the target image, thereby improving the efficiency and accuracy of target object statistics and reducing the burden of manual counting.
Referring to FIG. 17, FIG. 17 is a schematic block diagram of a target object statistics apparatus based on image processing provided by an embodiment of this application; the target object statistics apparatus 300 is configured to perform any of the aforementioned target object statistics methods based on image processing.
The target object statistics apparatus 300 includes: an image acquisition unit 301, a preprocessing unit 302, a construction processing unit 303, a gradient calculation unit 304 and a result statistics unit 305.
The image acquisition unit 301 is configured to acquire an image to be recognized, the image to be recognized containing a target object.
The preprocessing unit 302 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
In one embodiment, as shown in FIG. 18, the preprocessing unit 302 includes a denoising unit 3021 and an inversion processing unit 3022.
The denoising unit 3021 is configured to smooth the image to be recognized by bilateral filtering to remove pseudo-point noise; the inversion processing unit 3022 is configured to perform color inversion processing on the bilaterally filtered image to obtain the target image.
The construction processing unit 303 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
The gradient calculation unit 304 is configured to calculate the image gradient of the target object's boundary according to the local extrema of the target object.
In one embodiment, as shown in FIG. 19, the gradient calculation unit 304 includes a sub-image cropping unit 3041, a buffer extraction unit 3042 and a buffer calculation unit 3043.
The sub-image cropping unit 3041 is configured to crop a sub-image containing the target object from the target image according to the local extrema; the buffer extraction unit 3042 is configured to extract the contour of the target object in the sub-image with the watershed segmentation algorithm and extract the contour buffer centered on that contour by the morphological dilation operation; the buffer calculation unit 3043 is configured to calculate the contour buffer's image gradient with the image gradient algorithm.
The result statistics unit 305 is configured to distinguish and count the target objects according to the image gradient and display the statistical result.
The target object statistics apparatus based on image processing of this application offers a high degree of automation, low data storage requirements and fast processing, can count target objects accurately, and reduces the burden of manual counting.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working processes of the apparatus and units described above may refer to the corresponding processes in the foregoing method embodiments based on image processing and are not repeated here.
FIG. 20 is a schematic block diagram of another target object statistics apparatus based on image processing provided by an embodiment of this application; the target object statistics apparatus 400 is configured to perform any of the aforementioned target object statistics methods based on image processing. The apparatus 400 includes: an image acquisition unit 401, a preprocessing unit 402, a construction processing unit 403, a de-duplication unit 404, a gradient calculation unit 405 and a result statistics unit 406.
The image acquisition unit 401 is configured to acquire an image to be recognized, the image to be recognized containing a target object.
The preprocessing unit 402 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
The construction processing unit 403 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract the local extrema of the target object from the differential pyramid.
The de-duplication unit 404 is configured to perform, based on a non-maximum suppression algorithm, de-duplication processing on the target objects according to their local extrema.
The gradient calculation unit 405 is configured to calculate the image gradient of the de-duplicated target object's boundary according to the local extrema.
The result statistics unit 406 is configured to distinguish and count the target objects according to the image gradient and display the statistical result.
The target object statistics apparatus based on image processing of this application provides a fast and effective tool for image statistical features, requires no model training, and offers high processing efficiency and accuracy.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working processes of the apparatus and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
The above target object statistics apparatus may be implemented in the form of a computer program that can run on a computer device as shown in FIG. 21.
Referring to FIG. 21, FIG. 21 is a schematic block diagram of a computer device provided by an embodiment of this application. The computer device may be a server or a terminal.
Referring to FIG. 21, the computer device includes a processor, a memory and a network interface connected through a system bus, where the memory may include a non-volatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program includes program instructions which, when executed, can cause the processor to perform any target object statistics method based on image processing.
The processor provides computing and control capabilities and supports the operation of the entire computer device.
The internal memory provides an environment for running the computer program in the non-volatile storage medium; when the computer program is executed by the processor, the processor can be caused to perform any target object statistics method based on image processing.
The network interface is used for network communication, such as sending assigned tasks. Those skilled in the art will understand that the structure shown in FIG. 21 is only a block diagram of part of the structure related to the solution of this application and does not constitute a limitation on the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
It should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like; the general-purpose processor may be a microprocessor or any conventional processor.
In one embodiment, the processor is configured to run a computer program stored in the memory to implement the following steps:
acquiring an image to be recognized, the image to be recognized containing a target object; performing noise reduction and color inversion processing on the image to obtain a target image; performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting the local extrema of the target object from the differential pyramid; calculating the image gradient of the target object's boundary according to the local extrema; and distinguishing and counting the target objects according to the image gradient and displaying the statistical result.
In one embodiment, when implementing the noise reduction and color inversion processing on the image to be recognized to obtain the target image, the processor is configured to:
smooth the image to be recognized by bilateral filtering to remove pseudo-point noise; and perform color inversion processing on the bilaterally filtered image to obtain the target image.
In one embodiment, when implementing the calculation of the image gradient of the target object's boundary according to the local extrema, the processor is configured to:
crop a sub-image containing the target object from the target image according to the local extrema; extract the contour of the target object in the sub-image with the watershed segmentation algorithm and extract the contour buffer centered on that contour by the morphological dilation operation; and calculate the contour buffer's image gradient with the image gradient algorithm.
In one embodiment, when implementing the cropping of the sub-image containing the target object according to the local extrema, the processor is configured to:
generate a rectangular region corresponding to the target object from its local extrema and its circumscribed rectangle; and crop the target object from the target image according to that rectangular region to obtain the sub-image.
In one embodiment, before implementing the calculation of the image gradient of the target object's boundary according to the local extrema, the processor is further configured to:
perform de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm;
accordingly, when extracting the image gradient of the target object's boundary according to the local extrema, the processor extracts the image gradient of the de-duplicated target object's boundary.
In one embodiment, when implementing the distinguishing statistics of the target objects according to the image gradient, the processor is configured to:
acquire the target type corresponding to the target object and the preset image gradient range corresponding to that type; and distinguish and count the target objects according to the image gradient and the preset image gradient range.
An embodiment of this application further provides a computer-readable storage medium storing a computer program; the computer program includes program instructions which, when executed by a processor, implement any target object statistics method based on image processing provided by the embodiments of this application.
The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiments, such as the hard disk or memory of the computer device. It may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card equipped on the computer device.
The above are only specific embodiments of this application, but the scope of protection of this application is not limited thereto. Any equivalent modification or substitution readily conceivable by those familiar with the technical field within the technical scope disclosed in this application shall be covered by the scope of protection of this application. Therefore, the scope of protection of this application shall be subject to the scope of protection of the claims.

Claims (22)

  1. A target object statistics method based on image processing, comprising:
    acquiring an image to be recognized, the image to be recognized containing a target object;
    performing noise reduction and color inversion processing on the image to be recognized to obtain a target image;
    performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting local extrema of the target object from the differential pyramid;
    calculating the image gradient of the target object's boundary according to the local extrema of the target object;
    distinguishing and counting the target objects according to the image gradient, and displaying the statistical result.
  2. The target object statistics method according to claim 1, wherein performing noise reduction and color inversion processing on the image to be recognized to obtain a target image comprises:
    smoothing the image to be recognized by bilateral filtering to remove pseudo-point noise;
    performing color inversion processing on the bilaterally filtered image to obtain the target image.
  3. The target object statistics method according to claim 2, wherein the bilateral filter is expressed as:
    Î(x,y) = Σ_{(i,j)∈Ω} w(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w(i,j)
    w(i,j) = w_s(i,j)·w_r(i,j)
    where Î(x,y) is the denoised image to be recognized, w_s(i,j) is the spatial-domain weight, w_r(i,j) is the gray-level-domain weight, w(i,j) is the product of the spatial-domain and gray-level-domain weights, I(i,j) is the image to be recognized, and Ω is the neighborhood around pixel (x,y).
  4. The target object statistics method according to claim 1, wherein calculating the image gradient of the target object's boundary according to the local extrema of the target object comprises:
    cropping a sub-image containing the target object from the target image according to the local extrema of the target object;
    extracting the contour of the target object in the sub-image with a watershed segmentation algorithm, and extracting the target object's contour buffer centered on that contour by a morphological dilation operation;
    calculating the image gradient of the target object's contour buffer with an image gradient algorithm.
  5. The target object statistics method according to claim 4, wherein cropping the sub-image containing the target object from the target image according to the local extrema comprises:
    generating a rectangular region corresponding to the target object from the target object's local extrema and its circumscribed rectangle;
    cropping the target object from the target image according to the rectangular region to obtain the target object's sub-image.
  6. The target object statistics method according to claim 1, further comprising, before calculating the image gradient of the target object's boundary according to the local extrema:
    performing de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm;
    wherein extracting the image gradient of the target object's boundary according to the local extrema comprises: extracting the image gradient of the de-duplicated target object's boundary according to the local extrema.
  7. The target object statistics method according to claim 1, wherein distinguishing and counting the target objects according to the image gradient comprises:
    acquiring the target type corresponding to the target object and the preset image gradient range corresponding to the target type;
    distinguishing and counting the target objects according to the image gradient and the preset image gradient range.
  8. A target object statistics apparatus based on image processing, comprising:
    an image acquisition unit configured to acquire an image to be recognized, the image to be recognized containing a target object;
    a preprocessing unit configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image;
    a construction processing unit configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid from the Gaussian pyramid, and extract local extrema of the target object from the differential pyramid;
    a gradient calculation unit configured to calculate the image gradient of the target object's boundary according to the local extrema of the target object;
    a result statistics unit configured to distinguish and count the target objects according to the image gradient and display the statistical result.
  9. A computer device, comprising a memory and a processor;
    the memory being configured to store a computer program;
    the processor being configured to execute the computer program and, when executing the computer program, to implement the following steps:
    acquiring an image to be recognized, the image to be recognized containing a target object;
    performing noise reduction and color inversion processing on the image to be recognized to obtain a target image;
    performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting local extrema of the target object from the differential pyramid;
    calculating the image gradient of the target object's boundary according to the local extrema of the target object;
    distinguishing and counting the target objects according to the image gradient, and displaying the statistical result.
  10. The computer device according to claim 9, wherein, when implementing the noise reduction and color inversion processing on the image to be recognized to obtain the target image, the processor specifically implements:
    smoothing the image to be recognized by bilateral filtering to remove pseudo-point noise;
    performing color inversion processing on the bilaterally filtered image to obtain the target image.
  11. The computer device according to claim 10, wherein the bilateral filter is expressed as:
    Î(x,y) = Σ_{(i,j)∈Ω} w(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w(i,j)
    w(i,j) = w_s(i,j)·w_r(i,j)
    where Î(x,y) is the denoised image to be recognized, w_s(i,j) is the spatial-domain weight, w_r(i,j) is the gray-level-domain weight, w(i,j) is the product of the spatial-domain and gray-level-domain weights, I(i,j) is the image to be recognized, and Ω is the neighborhood around pixel (x,y).
  12. The computer device according to claim 9, wherein, when implementing the calculation of the image gradient of the target object's boundary according to the local extrema of the target object, the processor specifically implements:
    cropping a sub-image containing the target object from the target image according to the local extrema of the target object;
    extracting the contour of the target object in the sub-image with a watershed segmentation algorithm, and extracting the target object's contour buffer centered on that contour by a morphological dilation operation;
    calculating the image gradient of the target object's contour buffer with an image gradient algorithm.
  13. The computer device according to claim 12, wherein, when implementing the cropping of the sub-image containing the target object from the target image according to the local extrema, the processor specifically implements:
    generating a rectangular region corresponding to the target object from the target object's local extrema and its circumscribed rectangle;
    cropping the target object from the target image according to the rectangular region to obtain the target object's sub-image.
  14. The computer device according to claim 9, wherein, before implementing the calculation of the image gradient of the target object's boundary according to the local extrema, the processor is further configured to implement:
    performing de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm;
    wherein extracting the image gradient of the target object's boundary according to the local extrema comprises: extracting the image gradient of the de-duplicated target object's boundary according to the local extrema.
  15. The computer device according to claim 9, wherein, when implementing the distinguishing statistics of the target objects according to the image gradient, the processor specifically implements:
    acquiring the target type corresponding to the target object and the preset image gradient range corresponding to the target type;
    distinguishing and counting the target objects according to the image gradient and the preset image gradient range.
  16. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the following steps:
    acquiring an image to be recognized, the image to be recognized containing a target object;
    performing noise reduction and color inversion processing on the image to be recognized to obtain a target image;
    performing a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, constructing a differential pyramid from the Gaussian pyramid, and extracting local extrema of the target object from the differential pyramid;
    calculating the image gradient of the target object's boundary according to the local extrema of the target object;
    distinguishing and counting the target objects according to the image gradient, and displaying the statistical result.
  17. The computer-readable storage medium according to claim 16, wherein, when implementing the noise reduction and color inversion processing on the image to be recognized to obtain the target image, the processor specifically implements:
    smoothing the image to be recognized by bilateral filtering to remove pseudo-point noise;
    performing color inversion processing on the bilaterally filtered image to obtain the target image.
  18. The computer-readable storage medium according to claim 17, wherein the bilateral filter is expressed as:
    Î(x,y) = Σ_{(i,j)∈Ω} w(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w(i,j)
    w(i,j) = w_s(i,j)·w_r(i,j)
    where Î(x,y) is the denoised image to be recognized, w_s(i,j) is the spatial-domain weight, w_r(i,j) is the gray-level-domain weight, w(i,j) is the product of the spatial-domain and gray-level-domain weights, I(i,j) is the image to be recognized, and Ω is the neighborhood around pixel (x,y).
  19. The computer-readable storage medium according to claim 16, wherein, when implementing the calculation of the image gradient of the target object's boundary according to the local extrema of the target object, the processor specifically implements:
    cropping a sub-image containing the target object from the target image according to the local extrema of the target object;
    extracting the contour of the target object in the sub-image with a watershed segmentation algorithm, and extracting the target object's contour buffer centered on that contour by a morphological dilation operation;
    calculating the image gradient of the target object's contour buffer with an image gradient algorithm.
  20. The computer-readable storage medium according to claim 19, wherein, when implementing the cropping of the sub-image containing the target object from the target image according to the local extrema, the processor specifically implements:
    generating a rectangular region corresponding to the target object from the target object's local extrema and its circumscribed rectangle;
    cropping the target object from the target image according to the rectangular region to obtain the target object's sub-image.
  21. The computer-readable storage medium according to claim 16, wherein, before implementing the calculation of the image gradient of the target object's boundary according to the local extrema, the processor is further configured to implement:
    performing de-duplication processing on the target objects according to their local extrema, based on a non-maximum suppression algorithm;
    wherein extracting the image gradient of the target object's boundary according to the local extrema comprises: extracting the image gradient of the de-duplicated target object's boundary according to the local extrema.
  22. The computer-readable storage medium according to claim 16, wherein, when implementing the distinguishing statistics of the target objects according to the image gradient, the processor specifically implements:
    acquiring the target type corresponding to the target object and the preset image gradient range corresponding to the target type;
    distinguishing and counting the target objects according to the image gradient and the preset image gradient range.
PCT/CN2019/103845 2019-05-20 2019-08-30 Target object statistics method, apparatus, device and storage medium based on image processing WO2020232910A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910420808.8A CN110298817A (zh) 2019-05-20 2019-05-20 Target object statistics method, apparatus, device and storage medium based on image processing
CN201910420808.8 2019-05-20

Publications (1)

Publication Number Publication Date
WO2020232910A1 true WO2020232910A1 (zh) 2020-11-26

Family

ID=68026982

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103845 WO2020232910A1 (zh) 2019-05-20 2019-08-30 Target object statistics method, apparatus, device and storage medium based on image processing

Country Status (2)

Country Link
CN (1) CN110298817A (zh)
WO (1) WO2020232910A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112712465A (zh) * 2020-12-31 2021-04-27 Method and system for optimizing the communication data volume of a photographing-type meter-reading terminal
CN114137984A (zh) * 2021-11-29 2022-03-04 Modular transmission platform and control method and path planning method thereof

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
CN112541507B (zh) * 2020-12-17 2023-04-18 Ocean University of China Multi-scale convolutional neural network feature extraction method, system, medium and application
CN113222853B (zh) * 2021-05-26 2022-07-12 Wuhan Boyu Optoelectronic Systems Co., Ltd. Progressive infrared image noise reduction method based on noise estimation
CN115311228A (zh) * 2022-08-05 2022-11-08 Ball-pressure indentation measurement method and system based on MATLAB image edge detection

Citations (5)

Publication number Priority date Publication date Assignee Title
US20090041376A1 (en) * 2007-08-03 2009-02-12 Joan Elizabeth Carletta Method for real-time implementable local tone mapping for high dynamic range images
CN103218831A (zh) * 2013-04-21 2013-07-24 Contour-constraint-based method for classifying and recognizing moving targets in video
CN103295224A (zh) * 2013-03-14 2013-09-11 Automatic segmentation method for breast ultrasound images based on mean shift and watershed
CN104658011A (zh) * 2015-01-31 2015-05-27 Intelligent traffic moving-target detection and tracking method
CN107092871A (zh) * 2017-04-06 2017-08-25 Remote-sensing image building detection method based on multi-scale multi-feature fusion

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
KR101837286B1 (ko) * 2016-08-10 2018-03-09 KAIST Laplacian patch-based image synthesis method and apparatus
CN107665479A (zh) * 2017-09-05 2018-02-06 Ping An Technology (Shenzhen) Co., Ltd. Feature extraction method, panorama stitching method and apparatus therefor, device, and computer-readable storage medium
CN109145929A (zh) * 2017-10-09 2019-01-04 SIFT scale-space-based feature information extraction method

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
US20090041376A1 (en) * 2007-08-03 2009-02-12 Joan Elizabeth Carletta Method for real-time implementable local tone mapping for high dynamic range images
CN103295224A (zh) * 2013-03-14 2013-09-11 Automatic segmentation method for breast ultrasound images based on mean shift and watershed
CN103218831A (zh) * 2013-04-21 2013-07-24 Contour-constraint-based method for classifying and recognizing moving targets in video
CN104658011A (zh) * 2015-01-31 2015-05-27 Intelligent traffic moving-target detection and tracking method
CN107092871A (zh) * 2017-04-06 2017-08-25 Remote-sensing image building detection method based on multi-scale multi-feature fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG, ZHIQIANG ET AL.: "A modified bilateral filtering algorithm", JOURNAL OF IMAGE AND GRAPHICS, vol. 14, no. 3, 31 March 2009 (2009-03-31) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112712465A (zh) * 2020-12-31 2021-04-27 四川长虹网络科技有限责任公司 Method and system for optimizing the communication data volume of a photographic meter-reading terminal
CN112712465B (zh) * 2020-12-31 2023-08-04 四川长虹网络科技有限责任公司 Method and system for optimizing the communication data volume of a photographic meter-reading terminal
CN114137984A (zh) * 2021-11-29 2022-03-04 江苏科技大学 Modular transmission platform and control method and path planning method therefor
CN114137984B (zh) * 2021-11-29 2024-02-27 江苏科技大学 Modular transmission platform and control method and path planning method therefor

Also Published As

Publication number Publication date
CN110298817A (zh) 2019-10-01

Similar Documents

Publication Publication Date Title
WO2020232910A1 (zh) Target object counting method, apparatus, device, and storage medium based on image processing
Lopez-Molina et al. Multiscale edge detection based on Gaussian smoothing and edge tracking
Ghosh et al. Fast scale-adaptive bilateral texture smoothing
Yang et al. Constant time median and bilateral filtering
US8406518B2 (en) Smoothed local histogram filters for computer graphics
CN112150371B (zh) Image noise reduction method, apparatus, device, and storage medium
CN111402170A (zh) Image enhancement method, apparatus, terminal, and computer-readable storage medium
Chen et al. Fast defocus map estimation
Liu et al. Automatic blur-kernel-size estimation for motion deblurring
Deshpande et al. A novel modified cepstral based technique for blind estimation of motion blur
WO2022233185A1 (zh) Image filtering method and apparatus, terminal, and computer-readable storage medium
Hacini et al. A 2D-fractional derivative mask for image feature edge detection
CN109949294A (zh) OpenCV-based crack defect extraction method for fracture morphology images
Gu et al. A novel total generalized variation model for image dehazing
Qiao et al. Layered input GradiNet for image denoising
WO2024001538A1 (zh) Scratch detection method and apparatus, electronic device, and readable storage medium
Oprisescu et al. Automatic pap smear nuclei detection using mean-shift and region growing
Zingman et al. Detection of texture and isolated features using alternating morphological filters
CN111311610A (zh) Image segmentation method and terminal device
CN115841632A (zh) Power transmission line extraction method and apparatus, and binocular ranging method
CN112329572B (zh) Fast static liveness detection method and apparatus based on borders and flash points
Gao et al. Multiscale phase congruency analysis for image edge visual saliency detection
CN112967321A (zh) Moving target detection method, apparatus, terminal device, and storage medium
CN113705660A (zh) Target recognition method and related device
CN114596210A (zh) Noise estimation method, apparatus, terminal device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19929337

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry into the European phase

Ref document number: 19929337

Country of ref document: EP

Kind code of ref document: A1