CN109214996B - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN109214996B
CN109214996B (application CN201810994081.XA, published as CN109214996A)
Authority
CN
China
Prior art keywords
image
gray
pixel
value
background
Prior art date
Legal status
Active
Application number
CN201810994081.XA
Other languages
Chinese (zh)
Other versions
CN109214996A (en)
Inventor
刘均
秦文礼
Current Assignee
Shenzhen Launch Technology Co Ltd
Original Assignee
Shenzhen Launch Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Launch Technology Co Ltd filed Critical Shenzhen Launch Technology Co Ltd
Priority to CN201810994081.XA priority Critical patent/CN109214996B/en
Publication of CN109214996A publication Critical patent/CN109214996A/en
Application granted granted Critical
Publication of CN109214996B publication Critical patent/CN109214996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS; G06 — COMPUTING; CALCULATING OR COUNTING; G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration → G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T7/00 Image analysis → G06T7/10 Segmentation; Edge detection → G06T7/136 involving thresholding
    • G06T7/00 Image analysis → G06T7/10 Segmentation; Edge detection → G06T7/194 involving foreground-background segmentation
    • G06T2207/00 Indexing scheme for image analysis or image enhancement → G06T2207/10 Image acquisition modality → G06T2207/10004 Still image; Photographic image
    • G06T2207/00 Indexing scheme for image analysis or image enhancement → G06T2207/20 Special algorithmic details → G06T2207/20172 Image enhancement details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses an image processing method and device, which are used for rapidly segmenting a foreground image and a background image in an image. The method in the embodiment of the application comprises the following steps: acquiring an original image, and judging whether the original image is a single-channel image or not; if not, converting the original image into a single-channel image to obtain a gray image of the original image; carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image; performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image; and calculating the distance norm of the difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image.

Description

Image processing method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
With the development of computer technology, more and more information is spread in the form of digital images. In computer image processing, foreground segmentation and extraction are basic operations: foreground segmentation means that a computer judges, from a picture, which parts are foreground objects and which are background, and then segments out the foreground key objects of interest.
In natural scenes, image backgrounds are complex, resolution is low, and images are diverse and randomly distributed. Traditional image recognition is mainly aimed at high-quality document images and requires preprocessing such as denoising, enhancement, distortion correction and scaling to reach a high recognition level under those conditions. A good image preprocessing stage is therefore a key step affecting the later foreground recognition of the image.
The traditional image preprocessing process is strongly affected by illumination and image shadow, and cannot quickly preprocess an image so as to segment the foreground image and the background image in the image.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, which are used for quickly performing high-frequency filtering on an image and performing Kalman filtering on the image after the high-frequency filtering so as to quickly segment a foreground image and a background image in a current image.
A first aspect of an embodiment of the present application provides an image processing method, including:
acquiring an original image, and judging whether the original image is a single-channel image or not;
if not, converting the original image into a single-channel image to obtain a gray image of the original image;
carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image;
and calculating the distance norm of the difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image.
Preferably, the performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain the template image of the gray image includes:
estimating the background gray scale of the gray image to obtain a background characteristic gray value K of the gray image;
selecting a region B of size m×n with G(i,j) as the center in the gray image, and performing high-frequency filtering on the gray value of each pixel in the region to obtain the first gray image G̃(i,j) of the region;
calculating a gray prediction value of each pixel in the gray image according to the following formula (1) and formula (2):
Ĝ(i,j) = w1·G(i−1,j−1) + w2·G(i−1,j) + w3·G(i,j−1); (1)
w1 + w2 + w3 = 1; (2)
if any pixel of G(i−1,j−1), G(i−1,j) or G(i,j−1) exceeds the boundary of B, taking the background characteristic gray value K as the value of the pixel exceeding the boundary;
correcting the gray prediction value of each pixel according to formula (3), where G̃(i,j) is the gray value after high-frequency filtering with the filter operator:
G(i,j) = λ·G̃(i,j) + (1−λ)·Ĝ(i,j), 0 < λ < 1; (3)
to obtain a gray value for each pixel in the template image.
Preferably, the calculating a distance norm of a difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, defining the current pixel as a foreground image if the distance norm is greater than a preset threshold, and otherwise defining the current pixel as a background image, includes:
calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):
C(i,j) = ‖G(i,j) − G̃(i,j)‖; (4)
determining whether the current pixel belongs to the foreground image or the background image according to formula (5), where ε is a visual perception gray threshold:
pixel (i,j) ∈ foreground, if C(i,j) > ε; pixel (i,j) ∈ background, if C(i,j) ≤ ε; (5)
and if the distance norm is greater than ε, defining the current pixel as a foreground image; if the distance norm is not greater than ε, defining the current pixel as a background image.
Preferably, the algorithm for estimating the background gray level of the gray level image includes:
one or more of a background gray mode method, a background gray mean method and a background gray fitting Gaussian distribution mean method.
Preferably, the method for high-frequency filtering the gray value of each pixel in the gray image comprises:
and performing mean filtering, Gaussian filtering or Gaussian-Laplace filtering on the gray value of each pixel in the gray image.
Preferably, the method further comprises:
and outputting and displaying the foreground image.
An embodiment of the present application further provides an image processing apparatus, including:
an acquisition unit, used for acquiring an original image and judging whether the original image is a single-channel image;
the conversion unit is used for converting the original image into a single-channel image to obtain a gray image of the original image when the original image is not the single-channel image;
the high-frequency filtering unit is used for performing high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
the Kalman filtering unit is used for carrying out Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image so as to obtain a template image of the gray image;
and the determining unit is used for calculating a distance norm of a difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold, the current pixel is defined as a foreground image, and otherwise, the current pixel is defined as a background image.
Preferably, the Kalman filtering unit includes:
a background gray estimation module, used for estimating the background gray of the gray image to obtain a background characteristic gray value K of the gray image;
a high-frequency filtering module, used for selecting a region B of size m×n with G(i,j) as the center in the gray image, and performing high-frequency filtering on the gray value of each pixel in the region to obtain the first gray image G̃(i,j) of the region;
a gray prediction module, used for calculating a gray prediction value of each pixel in the gray image according to the following formula (1) and formula (2), where, if any pixel of G(i−1,j−1), G(i−1,j) or G(i,j−1) exceeds the boundary of B, the background characteristic gray value K is taken as the value of the pixel exceeding the boundary:
Ĝ(i,j) = w1·G(i−1,j−1) + w2·G(i−1,j) + w3·G(i,j−1); (1)
w1 + w2 + w3 = 1; (2)
a correction module, used for correcting the gray prediction value of each pixel according to formula (3), where G̃(i,j) is the gray value after high-frequency filtering with the filter operator:
G(i,j) = λ·G̃(i,j) + (1−λ)·Ĝ(i,j), 0 < λ < 1; (3)
to obtain a gray value for each pixel in the template image.
Preferably, the determining unit includes:
a calculating module, used for calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):
C(i,j) = ‖G(i,j) − G̃(i,j)‖; (4)
a determining module, used for determining whether the current pixel belongs to the foreground image or the background image according to formula (5), where ε is a visual perception gray threshold:
pixel (i,j) ∈ foreground, if C(i,j) > ε; pixel (i,j) ∈ background, if C(i,j) ≤ ε; (5)
and if the distance norm is greater than ε, the current pixel is defined as a foreground image; if the distance norm is not greater than ε, the current pixel is defined as a background image.
Preferably, the algorithm for estimating the background gray of the gray image in the background gray estimation module includes:
one or more of a background gray mode method, a background gray mean method and a background gray fitting Gaussian distribution mean method.
Preferably, the method for performing high-frequency filtering on the gray value of each pixel in the gray image in the high-frequency filtering unit includes:
and performing mean filtering, Gaussian filtering or Gaussian-Laplace filtering on the gray value of each pixel in the gray image.
Preferably, the image processing apparatus further includes an output module, configured to output and display the foreground image.
Embodiments of the present application further provide a readable storage medium, on which a computer program is stored, where the computer program is used to implement the image processing method provided in the first aspect of the present application when the computer program is executed by a processor.
According to the technical scheme, the embodiment of the application has the following advantages:
in the embodiment of the application, an original image is first obtained and converted into a gray image. High-frequency filtering is performed on the gray image to generate a first gray image, and Kalman filtering is performed on the gray image according to the gray image and the first gray image to generate a template image. The distance norm of the gray-value difference of each corresponding pixel pair is then calculated from the template image and the first gray image; a pixel is defined as belonging to the foreground image when its distance norm is greater than a preset threshold, and as belonging to the background image when the distance norm is not greater than the preset threshold. This simplifies the segmentation of the foreground image, and the image processing method is highly resistant to illumination and shadow.
Drawings
FIG. 1 is a schematic diagram of an embodiment of an image processing method in an embodiment of the present application;
FIG. 2 is a refinement of step 104 in the embodiment described in FIG. 1;
FIG. 3 is a refinement of step 105 in the embodiment described in FIG. 1;
FIG. 4 is a schematic diagram of another embodiment of an image processing method in an embodiment of the present application;
FIG. 5 is a schematic diagram of an embodiment of an image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic diagram of another embodiment of an image processing apparatus in the embodiment of the present application.
Detailed Description
The embodiment of the application provides an image processing method and device, which are used for quickly performing high-frequency filtering on an image and performing Kalman filtering on the image after the high-frequency filtering so as to quickly segment a foreground image and a background image in a current image.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For convenience of understanding, the following describes an image processing method in an embodiment of the present application, and with reference to fig. 1, an embodiment of the image processing method in the embodiment of the present application includes:
101. acquiring an original image, judging whether the original image is a single-channel image, if not, executing a step 102, and if so, executing a step 103;
before processing the image, the original image to be processed must first be obtained. The original image in this application may be read from a video camera, a computer, a still camera or another image storage device, and may be in any format such as JPEG, FlashPix, TIFF, GIF or MPEG, which is not limited here.
After the original image is acquired, it is determined whether the original image is a single-channel image, that is, a grayscale image. If the original image is already a single-channel (grayscale) image, step 103 is performed directly on it; if the original image is a color image, that is, a non-single-channel image, step 102 is performed first.
102. Converting the original image into a single-channel image to obtain a gray image of the original image;
if the original image is a non-single-channel image, the original image is processed according to the following formula and converted into a single-channel image:
G(i,j) = 0.299·rA(i,j) + 0.587·gA(i,j) + 0.114·bA(i,j)
where A(i,j) is a pixel in the original image; rA(i,j), gA(i,j) and bA(i,j) are respectively the r-channel, g-channel and b-channel values of the original image A; and G(i,j) is the corresponding pixel in the single-channel image, i.e. the gray image.
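The weighted sum above is the standard luma weighting for RGB-to-gray conversion; a minimal NumPy sketch (the function name is illustrative, not part of the patent):

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 RGB image to a single-channel gray image
    using the 0.299 / 0.587 / 0.114 channel weights from the formula."""
    rgb = np.asarray(rgb, dtype=np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Because the weights sum to 1, a pure white pixel (255, 255, 255) maps to gray value 255.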
103. Carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
after the gray level image of the original image is obtained, high-frequency filtering is carried out on each pixel point G (i, j) in the gray level image to obtain a first filtered gray level image.
In the actual processes of image acquisition, transmission and processing, a certain degree of noise interference often exists. The noise degrades the quality of the image, blurs it and submerges its features, which makes picture analysis difficult.
Specifically, the grayscale image may be high-frequency filtered by various methods to eliminate noise in the image, such as Gaussian filtering, mean filtering, or Gaussian-Laplacian filtering. In practical applications, a filter template H of size m×n may be selected to filter the gray image, where H may be a Gaussian operator, a mean operator, or a Gaussian-Laplacian (LoG) operator.
The filtering process of the grayscale image is described below by taking an example of a mean operator with m × n being 3 × 3:
assume the pixel values covered by the filter template are as shown in Table 1, with the values listed row by row and the target pixel at the center:

TABLE 1
5 3 6
2 1 9
8 4 7

In mean filtering, a template is given for the target pixel; the template consists of the target pixel together with its 8 surrounding neighbor pixels, and the average value of all pixels in the template replaces the original pixel value.
According to the definition of mean filtering, the mean-filtered value of the target pixel is:
(5+3+6+2+1+9+8+4+7)/9 = 45/9 = 5
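The worked example can be reproduced directly; the 3×3 row-by-row arrangement of the values is assumed from the listing above:

```python
import numpy as np

def mean_filter_3x3(region):
    """Mean-filter the center pixel of a 3x3 region: the template covers
    the target pixel and its 8 neighbours, and their average replaces the
    original center value."""
    region = np.asarray(region, dtype=np.float64)
    return region.sum() / region.size

# Neighbourhood from the worked example (row-major arrangement assumed):
block = [[5, 3, 6],
         [2, 1, 9],
         [8, 4, 7]]
print(mean_filter_3x3(block))  # prints 5.0
```

Sliding this computation over every pixel of the gray image yields the first gray image of step 103.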
the process of gaussian filtering and laplacian of gaussian filtering has been described in detail in the prior art, and is not described herein again.
104. Performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image;
after the gray level image is subjected to high-frequency filtering to obtain a first gray level image, Kalman filtering is performed on the gray level value of each pixel in the gray level image according to the gray level image and the first gray level image to obtain a template image of the gray level image.
Specifically, the kalman filtering is to estimate a pixel value of each pixel point in the gray image according to an optimization algorithm, and then correct the estimated value by using an actually measured pixel value to obtain a pixel value closer to a true value, and the specific kalman filtering process in this embodiment is described in detail in the following embodiments, which is not described herein again.
105. And calculating the distance norm of the difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image.
After the template image is obtained in step 104, a distance norm of a difference between a gray value of each pixel in the template image and a gray value of a corresponding pixel in the first gray image (i.e., the high-frequency filtered gray image) is further calculated, and if the obtained distance norm is greater than a preset threshold, the current pixel is defined as a foreground pixel, otherwise, the current pixel is defined as a background pixel.
Specifically, the implementation process of step 105 is described in detail in the following embodiments.
In the embodiment of the application, an original image is first obtained and converted into a gray image. High-frequency filtering is performed on the gray image to generate a first gray image, and Kalman filtering is performed on the gray image according to the gray image and the first gray image to generate a template image. The distance norm of the gray-value difference of each corresponding pixel pair is then calculated from the template image and the first gray image; a pixel is defined as belonging to the foreground image when its distance norm is greater than a preset threshold, and as belonging to the background image when the distance norm is not greater than the preset threshold. This simplifies the segmentation of the foreground image, and the image processing method is highly resistant to illumination and shadow.
Based on the embodiment described in fig. 1, step 104 of fig. 1 is described in detail below, please refer to fig. 2, and fig. 2 is a detailed step of step 104 of fig. 1:
1041. estimating the background gray scale of the gray scale image to obtain a background characteristic gray scale value of the gray scale image;
after the grayscale image is obtained in step 102, the background gray of the grayscale image is estimated. The specific estimation algorithm may be the mode of the gray values, the mean gray value, the mean of a fitted Gaussian distribution, or the like, which is not limited here.
After the background gray scale estimation is performed on the gray scale image by adopting the method, the background characteristic gray scale value K of the gray scale image can be obtained.
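The background gray mode method, for instance, can be sketched as follows (`background_gray_mode` is an illustrative name; the other listed estimators would replace the histogram argmax):

```python
import numpy as np

def background_gray_mode(gray):
    """Estimate the background characteristic gray value K of step 1041 as
    the mode of the gray-level histogram: the most frequent gray value is
    assumed to belong to the (large, roughly uniform) background."""
    hist = np.bincount(np.asarray(gray, dtype=np.uint8).ravel(), minlength=256)
    return int(hist.argmax())
```

For an image that is mostly one shade, K is that shade, which matches the intuition that the background dominates the histogram.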
1042. In the gray image, a region B of size m×n is selected with G(i,j) as the center, and high-frequency filtering is performed on the gray value of each pixel in the region to obtain the first gray image G̃(i,j) of the region;
this step is similar to step 103, that is, a specific implementation manner of step 103, in the grayscale image G, a region with G (i, j) as the center and m × n as the size is selected, and the grayscale value of each pixel in the region is filtered by applying a high-frequency filtering algorithm to obtain a first grayscale image of the region.
1043. Calculating a gray prediction value of each pixel in the gray image according to formula (1) and formula (2); when any pixel of G(i−1,j−1), G(i−1,j) or G(i,j−1) exceeds the boundary of B, the pixel value exceeding the boundary is taken to be the background characteristic gray value K:
Ĝ(i,j) = w1·G(i−1,j−1) + w2·G(i−1,j) + w3·G(i,j−1); (1)
w1 + w2 + w3 = 1; (2)
Specifically, formula (1) and formula (2) describe the process of predicting the gray value of each pixel in the gray image: the pixel value of the target pixel is predicted from the pixel values of the three pixels to its upper left.
For example, for the top-left pixel G(1,1), the three neighbours G(0,0), G(0,1) and G(1,0) all exceed the boundary of B, so each of them is set to the background characteristic gray value K obtained in step 1041, giving
Ĝ(1,1) = (w1 + w2 + w3)·K = K.
The pixel values of the other positions are calculated by a similar recursion, which is not repeated here.
1044. Correcting the gray prediction value according to formula (3) to obtain the gray value of each pixel in the template image:
G(i,j) = λ·G̃(i,j) + (1−λ)·Ĝ(i,j), 0 < λ < 1; (3)
After the prediction value Ĝ(i,j) of each pixel in the gray image is obtained in step 1043, it is further corrected with the pixel value G̃(i,j) measured after high-frequency filtering, so that the corrected pixel value G(i,j) is closer to the true value. The corrected gray value G(i,j) is the gray value of each pixel in the template image.
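The prediction and correction of steps 1043–1044 can be sketched per pixel as follows. The equal weights w1 = w2 = w3 = 1/3 and λ = 0.5 are illustrative choices (the method only requires w1+w2+w3 = 1 and 0 < λ < 1), and `template_image` is a hypothetical helper name:

```python
import numpy as np

def template_image(gray, filtered, K, w=(1/3, 1/3, 1/3), lam=0.5):
    """Build the template image: predict each pixel from its three
    upper-left neighbours (formula (1)), substituting the background
    characteristic gray value K for out-of-boundary pixels, then blend
    the prediction with the filtered measurement (formula (3))."""
    h, wd = gray.shape
    w1, w2, w3 = w
    out = np.empty((h, wd), dtype=np.float64)
    for i in range(h):
        for j in range(wd):
            up_left = gray[i-1, j-1] if i > 0 and j > 0 else K
            up      = gray[i-1, j]   if i > 0 else K
            left    = gray[i, j-1]   if j > 0 else K
            pred = w1 * up_left + w2 * up + w3 * left          # formula (1)
            out[i, j] = lam * filtered[i, j] + (1 - lam) * pred  # formula (3)
    return out
```

On a perfectly uniform background the prediction, the measurement and the corrected value all coincide, so the template image reproduces the input.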
The embodiment of the application describes in detail how to perform kalman filtering on the grayscale image according to the grayscale image and the first grayscale image to obtain the template image, so that the implementability of the application is improved.
Based on the embodiment shown in fig. 1, step 105 in the embodiment shown in fig. 1 is described in detail below, please refer to fig. 3, and fig. 3 is a detailed step of step 105 in fig. 1:
1051. Calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):
C(i,j) = ‖G(i,j) − G̃(i,j)‖; (4)
After the gray value of each pixel in the template image is obtained, the distance norm of its difference from the gray value of the corresponding pixel in the first gray image is calculated according to formula (4), where G(i,j) is the gray value of each pixel in the template image, G̃(i,j) is the gray value of the corresponding pixel in the first gray image, and C(i,j) is the distance norm of their difference.
The norm in this embodiment may be an L1 norm, an L2 norm, or an L- ∞ norm, and may be customized according to actual needs, which is not limited herein.
1052. Determining whether the current pixel belongs to the foreground image or the background image according to formula (5), where ε is the visual perception gray threshold:
pixel (i,j) ∈ foreground, if C(i,j) > ε; pixel (i,j) ∈ background, if C(i,j) ≤ ε; (5)
If the distance norm is greater than ε, the current pixel is defined as a foreground pixel; if the distance norm is not greater than ε, the current pixel is defined as a background pixel.
In the specific image processing process, after the distance norm C(i,j) of the gray-value difference between each pixel in the template image and the corresponding pixel in the first gray image is obtained, it is compared with the visual perception gray threshold ε. If the distance norm is greater than ε, the contrast of the current pixel is significant and the pixel belongs to the foreground image; if it is not greater than ε, the contrast is not significant and the pixel belongs to the background image.
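For scalar gray values with the L1 norm, formulas (4) and (5) reduce to an absolute difference and a threshold test; a sketch, with `segment` as a hypothetical helper name:

```python
import numpy as np

def segment(template, filtered, eps):
    """Formulas (4)-(5): per-pixel distance norm of the difference between
    the template image and the first (filtered) gray image, thresholded by
    the visual-perception gray threshold eps. True marks a foreground pixel."""
    C = np.abs(template - filtered)   # L1 norm of a scalar difference, formula (4)
    return C > eps                    # formula (5)
```

For example, with eps = 15, a pixel whose template and filtered values are 10 and 12 is background (difference 2), while one with values 50 and 10 is foreground (difference 40).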
The embodiment of the application describes the segmentation process of the foreground image and the background image in the gray image in detail, and improves the implementability of the application.
Referring to fig. 4, the method for processing an image according to the present application will be described in detail with reference to the embodiments shown in fig. 1, fig. 2 and fig. 3, and another embodiment of the method for processing an image according to the present application includes:
401. acquiring an original image, judging whether the original image is a single-channel image, if not, executing a step 402, and if so, executing a step 403;
402. converting the original image into a single-channel image to obtain a gray image of the original image;
403. carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
404. estimating the background gray scale of the gray scale image to obtain a background characteristic gray scale value of the gray scale image;
405. in a gray image, a region is selected with G (i, j) as the center and m x n as the size, and the gray value of each pixel in the region is applied
Figure GDA0003226906670000111
Performing high-frequency filtering to obtain a first gray image of the region;
406. calculating a gray scale predicted value of each pixel in the gray scale image according to formula (1) and formula (2); when any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of B, the out-of-boundary pixel value is set to the background characteristic gray value;

G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
407. correcting the predicted value of each pixel according to formula (3) to obtain the gray value of each pixel in the template image, where 0 < λ < 1 and the remaining operator in formula (3) is the high-frequency filter operator:

[formula (3) appears only as an image in the original and is not reproduced here] (3)
408. calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):

C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (the output of the high-frequency filter operator) (4)
409. determining whether the current pixel is a foreground image or a background image according to formula (5), where ε is the visual perception gray threshold:

the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
If the distance norm is greater than ε, the current pixel is defined as a foreground image; if the distance norm is not greater than ε, the current pixel is defined as a background image.
It should be noted that steps 401 to 409 in this embodiment are similar to those in the embodiments described in fig. 1, fig. 2, and fig. 3, and are not repeated here.
410. outputting and displaying the foreground image.
After the foreground image in the gray image is segmented in step 409, the gray values at the foreground positions may be displayed to obtain the foreground image in the gray image, and the gray values at the background positions may be displayed to obtain the background image in the gray image.
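Steps 401 to 410 above can be sketched end to end in Python with NumPy. This is a hedged sketch, not the patented method: the exact high-frequency filter operator, the correction formula (3) and the distance norm (4) appear only as images in the original, so a Laplacian high-pass filter, a simple λ-blend correction and an absolute difference are substituted, and all parameter values (`w`, `lam`, `epsilon`, and the median as background characteristic gray value) are assumptions:

```python
import numpy as np

def to_gray(img):
    """Step 402: convert a multi-channel image to a single channel."""
    return img.mean(axis=2) if img.ndim == 3 else img.astype(float)

def high_freq_filter(gray):
    """Step 403: stand-in high-frequency (Laplacian) filter; the
    patent's operator is given only as an image."""
    g = gray.astype(float)
    out = np.zeros_like(g)
    out[1:-1, 1:-1] = (4 * g[1:-1, 1:-1] - g[:-2, 1:-1] - g[2:, 1:-1]
                       - g[1:-1, :-2] - g[1:-1, 2:])
    return out

def predict_and_correct(gray, filtered, bg, w=(0.3, 0.4, 0.3), lam=0.5):
    """Steps 404-407: causal prediction from three neighbours with
    weights summing to 1 (formulas (1) and (2)); out-of-boundary
    neighbours are replaced by the background characteristic gray
    value bg; an assumed lambda-blend stands in for formula (3)."""
    h, wd = gray.shape
    template = np.empty_like(gray, dtype=float)
    for i in range(h):
        for j in range(wd):
            nw = gray[i-1, j-1] if i > 0 and j > 0 else bg
            n  = gray[i-1, j]   if i > 0 else bg
            wl = gray[i, j-1]   if j > 0 else bg
            pred = w[0]*nw + w[1]*n + w[2]*wl               # formula (1)
            template[i, j] = lam*pred + (1-lam)*filtered[i, j]  # assumed (3)
    return template

def segment(img, epsilon=10.0):
    """Steps 408-409: distance norm of the difference vs. threshold."""
    gray = to_gray(img)
    filt = high_freq_filter(gray)
    bg = float(np.median(gray))        # assumed background estimate
    tmpl = predict_and_correct(gray, filt, bg)
    c = np.abs(tmpl - filt)            # stand-in distance norm (4)
    return c > epsilon                 # formula (5): True = foreground
```

The three-neighbour weights satisfy w1 + w2 + w3 = 1 as required by formula (2); everything marked "assumed" replaces content only available as images in the patent.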
In the embodiment of the application, an original image is first obtained and converted into a gray image; high-frequency filtering is performed on the gray image to generate a first gray image; Kalman filtering is performed on the gray image according to the gray image and the first gray image to generate a template image; the distance norm of the gray-value difference of each corresponding pixel is then calculated from the template image and the first gray image; a pixel is defined as belonging to the foreground image when its distance norm is greater than a preset threshold and to the background image when it is not. This simplifies the segmentation of the foreground image and gives the image processing method strong resistance to illumination changes and shadows.
Secondly, the embodiment of the application describes in detail how to perform kalman filtering on the grayscale image according to the grayscale image and the first grayscale image to obtain the template image, and further describes in detail a segmentation process of the foreground image and the background image in the grayscale image according to the template image, so that the implementability of the application is improved.
The image processing method in the embodiment of the present application is described above; referring to fig. 5, an embodiment of the image processing apparatus in the embodiment of the present application includes:
an obtaining unit 501, configured to obtain an original image, and determine whether the original image is a single-channel image;
a converting unit 502, configured to convert the original image into a single-channel image to obtain a grayscale image of the original image when the original image is not a single-channel image;
a high-frequency filtering unit 503, configured to perform high-frequency filtering on the grayscale value of each pixel in the grayscale image to obtain a first grayscale image;
a kalman filtering unit 504, configured to perform kalman filtering on the grayscale value of each pixel in the grayscale image according to the grayscale image and the first grayscale image to obtain a template image of the grayscale image;
the determining unit 505 is configured to calculate a distance norm of a difference between a gray value of each pixel in the template image and a gray value of a corresponding pixel in the first gray image, and if the distance norm is greater than a preset threshold, define the current pixel as a foreground image, otherwise, define the current pixel as a background image.
It should be noted that the functions of the units in the embodiment of the present application are similar to those described in the embodiment illustrated in fig. 1, and are not described again here.
In the embodiment of the application, the obtaining unit 501 obtains an original image and judges whether it is a single-channel image; the converting unit 502 converts the original image into a gray image; the high-frequency filtering unit 503 performs high-frequency filtering on the gray image to generate a first gray image; Kalman filtering is performed on the gray image according to the gray image and the first gray image to generate a template image; the distance norm of the gray-value difference of each corresponding pixel is calculated from the template image and the first gray image; a pixel is defined as belonging to the foreground image when its distance norm is greater than a preset threshold and to the background image when it is not. This simplifies the segmentation of the foreground image and gives the image processing method strong resistance to illumination changes and shadows.
Referring to fig. 6, the image processing apparatus according to the embodiment of the present application will be described in detail below based on the embodiment shown in fig. 5, where another embodiment of the image processing apparatus according to the embodiment of the present application includes:
an obtaining unit 601, configured to obtain an original image, and determine whether the original image is a single-channel image;
a converting unit 602, configured to convert the original image into a single-channel image to obtain a grayscale image of the original image when the original image is not a single-channel image;
a high-frequency filtering unit 603, configured to perform high-frequency filtering on a grayscale value of each pixel in the grayscale image to obtain a first grayscale image;
a kalman filtering unit 604, configured to perform kalman filtering on the grayscale value of each pixel in the grayscale image according to the grayscale image and the first grayscale image to obtain a template image of the grayscale image;
the determining unit 605 is configured to calculate a distance norm of a difference between a gray value of each pixel in the template image and a gray value of a corresponding pixel in the first gray image, and if the distance norm is greater than a preset threshold, define the current pixel as a foreground image, otherwise, define the current pixel as a background image.
Preferably, the kalman filtering unit 604 includes:
a background gray estimation module 6041, configured to estimate a background gray of the gray image to obtain a background feature gray of the gray image;
a high-frequency filtering module 6042, configured to select a region of size m × n centered on G(i, j) in the grayscale image, and to perform high-frequency filtering on the gray value of each pixel in the region (the filter-operator expression appears only as an image in the original) to obtain a first grayscale image of the region;
a gray value prediction module 6043, configured to calculate a gray value prediction value of each pixel in the gray image according to the following formula (1) and formula (2):

G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
if any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of B, the out-of-boundary pixel value is set to the background characteristic gray value;
a correcting module 6044, configured to correct the predicted gray value of each pixel according to formula (3), where 0 < λ < 1 and the remaining operator in formula (3) is the high-frequency filter operator:

[formula (3) appears only as an image in the original and is not reproduced here] (3)
To obtain a gray value for each pixel in the template image.
Preferably, the determining unit 605 includes:
a calculating module 6051, configured to calculate the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):

C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (the output of the high-frequency filter operator) (4)
a determining module 6052, configured to determine whether the current pixel is a foreground image or a background image according to formula (5), where ε is the visual perception gray threshold:

the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
If the distance norm is greater than ε, the current pixel is defined as a foreground image; if the distance norm is not greater than ε, the current pixel is defined as a background image.
It should be noted that the functions of each unit and each module in the embodiment of the present application are similar to those described in the embodiment of fig. 4, and are not described again here.
In the embodiment of the application, the obtaining unit 601 obtains the original image and judges whether it is a single-channel image; the converting unit 602 converts the original image into a gray image; the high-frequency filtering unit 603 performs high-frequency filtering on the gray image to generate a first gray image; Kalman filtering is performed on the gray image according to the gray image and the first gray image to generate a template image; the distance norm of the gray-value difference of each corresponding pixel is calculated from the template image and the first gray image; a pixel is defined as belonging to the foreground image when its distance norm is greater than a preset threshold and to the background image when it is not. This simplifies the segmentation of the foreground image and gives the image processing method strong resistance to illumination changes and shadows.
Secondly, the embodiment of the present application describes in detail how the Kalman filtering unit 604 performs Kalman filtering on the grayscale image to obtain the template image, and further describes in detail how the determining unit 605 segments the foreground image and the background image in the grayscale image, thereby improving the implementability of the present application.
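The background gray estimation module 6041 estimates the background characteristic gray value; claim 3 names the background gray mode method, the background gray mean method and a Gaussian-fit mean method as candidates. A minimal Python sketch of the first two follows (the function names and the 8-bit gray range are assumptions of this sketch):

```python
import numpy as np

def background_gray_mode(gray):
    """Background characteristic gray value via the gray-level mode:
    the most frequent gray level in the image histogram (8-bit range
    assumed)."""
    g = np.asarray(gray, dtype=np.int64).ravel()
    hist = np.bincount(g, minlength=256)
    return int(hist.argmax())

def background_gray_mean(gray):
    """Background characteristic gray value via the gray-level mean."""
    return float(np.mean(gray))

# Example: a mostly-dark image with a small bright object
img = np.full((8, 8), 30)
img[3:5, 3:5] = 200
print(background_gray_mode(img))  # prints 30 (dominant background level)
```

The Gaussian-fit variant would fit a Gaussian to the gray histogram and take its mean; it is omitted here for brevity.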
The image processing apparatus in the embodiment of the present application is described above from the perspective of the modular functional entity, and the image processing apparatus in the embodiment of the present application is described below from the perspective of hardware processing:
an embodiment of an image processing apparatus in an embodiment of the present application includes:
a processor and a memory;
the memory is used for storing the computer program, and the processor is used for realizing the following steps when executing the computer program stored in the memory:
acquiring an original image, and judging whether the original image is a single-channel image or not;
if not, converting the original image into a single-channel image to obtain a gray image of the original image;
carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image;
and calculating the distance norm of the difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image.
In some embodiments of the present application, the processor may be further configured to:
estimating the background gray scale of the gray scale image to obtain a background characteristic gray scale value of the gray scale image;
selecting, in the gray scale image, a region of size m × n centered on G(i, j), and performing high-frequency filtering on the gray scale value of each pixel in the region (the filter-operator expression appears only as an image in the original) to obtain a first gray image of the region;
calculating a gray scale prediction value of each pixel in the gray scale image according to the following formula (1) and formula (2):

G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
if any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of B, the out-of-boundary pixel value is set to the background characteristic gray value;
correcting the predicted gray scale value of each pixel according to formula (3), where 0 < λ < 1 and the remaining operator in formula (3) is the high-frequency filter operator:

[formula (3) appears only as an image in the original and is not reproduced here] (3)
To obtain a gray value for each pixel in the template image.
In some embodiments of the present application, the processor may be further configured to:
calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):

C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (the output of the high-frequency filter operator) (4)
determining whether the current pixel is a foreground image or a background image according to formula (5), where ε is the visual perception gray threshold:

the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
If the distance norm is greater than ε, the current pixel is defined as a foreground image; if the distance norm is not greater than ε, the current pixel is defined as a background image.
In some embodiments of the present application, the processor may be further configured to:
and outputting and displaying the foreground image.
It is to be understood that, when the processor in the image processing apparatus described above executes the computer program, the functions of the units in the corresponding apparatus embodiments may also be implemented, and are not described herein again. Illustratively, the computer program may be partitioned into one or more modules/units that are stored in the memory and executed by the processor to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program in the image processing apparatus. For example, the computer program may be divided into units in the above-described image processing apparatus, and each unit may realize a specific function as described above in the corresponding image processing apparatus.
The computer device can be a desktop computer, a notebook, a palmtop computer, a cloud server, or other computing equipment, and may include, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that the processor and memory are merely examples of a computer apparatus and are not meant to be limiting; the computer apparatus may include more or fewer components, combine certain components, or include different components. For example, it may also include input/output devices, network access devices, buses, and the like.
The Processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor. The processor is the control center of the computer device and connects the various parts of the overall computer device using various interfaces and lines.
The memory may be used to store the computer programs and/or modules, and the processor implements the various functions of the computer device by running or executing the computer programs and/or modules stored in the memory and invoking the data stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required for at least one function, and the like, while the data storage area may store data created according to the use of the terminal, and the like. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The present application also provides a computer-readable storage medium for implementing the functions of an image processing apparatus, on which a computer program is stored; when the computer program is executed by a processor, the processor is operable to perform the following steps:
acquiring an original image, and judging whether the original image is a single-channel image or not;
if not, converting the original image into a single-channel image to obtain a gray image of the original image;
carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image;
and calculating the distance norm of the difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image.
In some embodiments of the present application, the computer program stored on the computer-readable storage medium, when executed by the processor, may be specifically configured to perform the following steps:
estimating the background gray scale of the gray scale image to obtain a background characteristic gray scale value of the gray scale image;
selecting, in the gray scale image, a region of size m × n centered on G(i, j), and performing high-frequency filtering on the gray scale value of each pixel in the region (the filter-operator expression appears only as an image in the original) to obtain a first gray image of the region;
calculating a gray scale prediction value of each pixel in the gray scale image according to the following formula (1) and formula (2):

G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
if any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of B, the out-of-boundary pixel value is set to the background characteristic gray value;
correcting the predicted gray scale value of each pixel according to formula (3), where 0 < λ < 1 and the remaining operator in formula (3) is the high-frequency filter operator:

[formula (3) appears only as an image in the original and is not reproduced here] (3)
To obtain a gray value for each pixel in the template image.
In some embodiments of the present application, the computer program stored on the computer-readable storage medium, when executed by the processor, may be specifically configured to perform the following steps:
calculating the distance norm of the difference between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to formula (4):

C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (the output of the high-frequency filter operator) (4)
determining whether the current pixel is a foreground image or a background image according to formula (5), where ε is the visual perception gray threshold:

the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
If the distance norm is greater than ε, the current pixel is defined as a foreground image; if the distance norm is not greater than ε, the current pixel is defined as a background image.
In some embodiments of the present application, the computer program stored on the computer-readable storage medium, when executed by the processor, may be specifically configured to perform the following steps:
and outputting and displaying the foreground image.
It will be appreciated that the integrated units, if implemented as software functional units and sold or used as a stand-alone product, may be stored in a corresponding one of the computer readable storage media. Based on such understanding, all or part of the flow of the method according to the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and used by a processor to implement the steps of the above embodiments of the method. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, etc. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (8)

1. An image processing method, comprising:
acquiring an original image, and judging whether the original image is a single-channel image or not;
if not, converting the original image into a single-channel image to obtain a gray image of the original image;
carrying out high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
performing Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image;
calculating a distance norm of a difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, defining the current pixel as a foreground image, otherwise, defining the current pixel as a background image;
the performing kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image to obtain a template image of the gray image includes:
estimating the background gray scale of the gray scale image to obtain a background characteristic gray scale value of the gray scale image;
selecting, in the gray scale image, a region of size m × n centered on G(i, j), and performing high-frequency filtering on the gray scale value of each pixel in the region (the filter-operator expression appears only as an image in the original) to obtain a first gray image of the region;
calculating a gray scale prediction value of each pixel in the gray scale image according to the following formula (1) and formula (2):
G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
if any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of G, the out-of-boundary pixel value is set to the background characteristic gray value;
correcting the predicted gray scale value of each pixel according to formula (3):
[formula (3) appears only as an image in the original and is not reproduced here; it corrects the predicted value using a weighting factor λ, 0 < λ < 1, and the high-frequency filter operator] (3)
to obtain a gray value for each pixel in the template image.
2. The method according to claim 1, wherein the calculating a distance norm of a difference between a gray value of each pixel in the template image and a gray value of a corresponding pixel in the first gray image, and if the distance norm is greater than a preset threshold, defining a current pixel as a foreground image, otherwise defining the current pixel as a background image comprises:
calculating the distance norm of the difference value of the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image according to a formula (4):
C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (4)
determining that the current pixel is a foreground image or a background image according to formula (5), wherein epsilon is a visual perception gray threshold value:
the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
and if the distance norm is greater than ε, defining the current pixel as a foreground image, and if the distance norm is not greater than ε, defining the current pixel as a background image.
3. The method of claim 1, wherein the algorithm for estimating the background gray level of the gray level image comprises:
one or more of a background gray mode method, a background gray mean method and a background gray fitting Gaussian distribution mean method.
4. The method of claim 1, wherein the high frequency filtering the grayscale value of each pixel in the grayscale image comprises:
and performing mean filtering, Gaussian filtering or Gaussian-Laplace filtering on the gray value of each pixel in the gray image.
5. An image processing apparatus characterized by comprising:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring an original image and judging whether the original image is a single-channel image;
the conversion unit is used for converting the original image into a single-channel image to obtain a gray image of the original image when the original image is not the single-channel image;
the high-frequency filtering unit is used for performing high-frequency filtering on the gray value of each pixel in the gray image to obtain a first gray image;
the Kalman filtering unit is used for carrying out Kalman filtering on the gray value of each pixel in the gray image according to the gray image and the first gray image so as to obtain a template image of the gray image;
the determining unit is used for calculating a distance norm of a difference value between the gray value of each pixel in the template image and the gray value of the corresponding pixel in the first gray image, if the distance norm is greater than a preset threshold value, the current pixel is defined as a foreground image, and otherwise, the current pixel is defined as a background image;
the Kalman filtering unit includes:
the background gray estimation module is used for estimating the background gray of the gray image to obtain a background characteristic gray value of the gray image;
a high-frequency filtering module for selecting, in the gray image, a region of size m × n centered on G(i, j) and performing high-frequency filtering on the gray value of each pixel in the region (the filter-operator expression appears only as an image in the original) to obtain a first gray image of the region;
a gray prediction module for calculating a gray prediction value of each pixel in the gray image according to the following formula (1) and formula (2):
G̃(i, j) = w1·G(i-1, j-1) + w2·G(i-1, j) + w3·G(i, j-1); (1)
w1+w2+w3=1; (2)
if any of G(i-1, j-1), G(i-1, j) or G(i, j-1) exceeds the boundary of G, the out-of-boundary pixel value is set to the background characteristic gray value;
a correction module, configured to correct the predicted gray scale value of each pixel according to formula (3):
[formula (3) appears only as an image in the original and is not reproduced here; it corrects the predicted value using a weighting factor λ, 0 < λ < 1, and the high-frequency filter operator] (3)
to obtain a gray value for each pixel in the template image.
6. The apparatus of claim 5, wherein the determining unit comprises:
a calculating module, configured to calculate a distance norm of a difference between a grayscale value of each pixel in the template image and a grayscale value of a corresponding pixel in the first grayscale image according to formula (4):
C(i, j) = ‖T(i, j) − Ĝ(i, j)‖, where T(i, j) is the gray value in the template image and Ĝ(i, j) is the gray value of the corresponding pixel in the first gray image (4)
the determining module is used for determining that the current pixel is a foreground image or a background image according to a formula (5), and epsilon is a visual perception gray threshold value:
the current pixel is a foreground image if C(i, j) > ε, and a background image if C(i, j) ≤ ε (5)
and if the distance norm is greater than ε, defining the current pixel as a foreground image, and if the distance norm is not greater than ε, defining the current pixel as a background image.
7. An image processing apparatus comprising a processor, characterized in that the processor, when executing a computer program stored on a memory, is adapted to carry out the image processing method according to any of claims 1 to 4.
8. A readable storage medium, on which a computer program is stored which, when being executed by a processor, is adapted to carry out the image processing method of any one of claims 1 to 4.
CN201810994081.XA 2018-08-29 2018-08-29 Image processing method and device Active CN109214996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810994081.XA CN109214996B (en) 2018-08-29 2018-08-29 Image processing method and device

Publications (2)

Publication Number Publication Date
CN109214996A CN109214996A (en) 2019-01-15
CN109214996B true CN109214996B (en) 2021-11-12

Family

ID=64985561

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810994081.XA Active CN109214996B (en) 2018-08-29 2018-08-29 Image processing method and device

Country Status (1)

Country Link
CN (1) CN109214996B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135519B (en) * 2019-05-27 2022-10-21 广东工业大学 Image classification method and device
CN111105428B (en) * 2019-11-08 2023-11-14 上海航天控制技术研究所 Star sensor forward filtering hardware image processing method
CN111798389B (en) * 2020-06-30 2023-08-15 中国工商银行股份有限公司 Adaptive image enhancement method and device
CN113298812B * 2021-04-22 2023-11-03 China Electronic Product Reliability and Environmental Testing Research Institute (the Fifth Electronics Research Institute of MIIT) (CEPREI Laboratory) Image segmentation method, device, system, electronic equipment and readable storage medium
CN114255185B (en) * 2021-12-16 2022-11-25 武汉高德智感科技有限公司 Image processing method, device, terminal and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727672A (en) * 2008-10-24 2010-06-09 云南正卓信息技术有限公司 Method for detecting, tracking and identifying object abandoning/stealing event
CN101916448A (en) * 2010-08-09 2010-12-15 云南清眸科技有限公司 Moving object detecting method based on Bayesian frame and LBP (Local Binary Pattern)
CN102034247A (en) * 2010-12-23 2011-04-27 中国科学院自动化研究所 Motion capture method for binocular vision image based on background modeling
CN102750535A (en) * 2012-04-01 2012-10-24 北京京东世纪贸易有限公司 Method and system for automatically extracting image foreground
CN102819841A (en) * 2012-07-30 2012-12-12 中国科学院自动化研究所 Global threshold partitioning method for partitioning target image
CN103873743A (en) * 2014-03-24 2014-06-18 中国人民解放军国防科学技术大学 Video de-noising method based on structure tensor and Kalman filtering
CN104166841A (en) * 2014-07-24 2014-11-26 浙江大学 Rapid detection identification method for specified pedestrian or vehicle in video monitoring network
CN104599271A (en) * 2015-01-20 2015-05-06 中国科学院半导体研究所 CIE Lab color space based gray threshold segmentation method
CN105761261A (en) * 2016-02-17 2016-07-13 南京工程学院 Method for detecting artificial malicious damage to camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7224735B2 (en) * 2003-05-21 2007-05-29 Mitsubishi Electronic Research Laboratories, Inc. Adaptive background image updating
US8670611B2 (en) * 2011-10-24 2014-03-11 International Business Machines Corporation Background understanding in video data

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Adaptive Background Estimation and Foreground Detection using Kalman-Filtering; Christof Ridder et al.; Computer Science; Dec. 31, 1995; pp. 1-7 *
An Improved Kalman-Filter Background Subtraction Method; Li Wenguang et al.; Journal of Signal Processing; Aug. 2009; Vol. 25, No. 8A; pp. 274-277 *
Adaptive Background Update Algorithm Based on Gray-Interval Statistics; Luo Songfei et al.; Science & Technology Information; 2017; No. 26; pp. 179-180 *
Background Subtraction Based on Color and Local Binary Similarity Patterns; Ren Dianyuan et al.; Computer Science; Mar. 2016; Vol. 43, No. 3; pp. 296-300, 304 *

Also Published As

Publication number Publication date
CN109214996A (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN109214996B (en) Image processing method and device
CN107403421B (en) Image defogging method, storage medium and terminal equipment
CN109978890B (en) Target extraction method and device based on image processing and terminal equipment
CN110852997B (en) Dynamic image definition detection method and device, electronic equipment and storage medium
CN110335216B (en) Image processing method, image processing apparatus, terminal device, and readable storage medium
CN111311482B (en) Background blurring method and device, terminal equipment and storage medium
KR20130016213A (en) Text enhancement of a textual image undergoing optical character recognition
CN112150371B (en) Image noise reduction method, device, equipment and storage medium
CN109255752B (en) Image self-adaptive compression method, device, terminal and storage medium
US20210185285A1 (en) Image processing method and apparatus, electronic device, and readable storage medium
US20140140620A1 (en) Image processing apparatus, image processing method and computer readable medium
CN113298761B (en) Image filtering method, device, terminal and computer readable storage medium
CN111882565B (en) Image binarization method, device, equipment and storage medium
CN109035167B (en) Method, device, equipment and medium for processing multiple faces in image
CN111696064B (en) Image processing method, device, electronic equipment and computer readable medium
CN108234826B (en) Image processing method and device
CN110717864B (en) Image enhancement method, device, terminal equipment and computer readable medium
CN109255311B (en) Image-based information identification method and system
CN114998122A (en) Low-illumination image enhancement method
CN111563517A (en) Image processing method, image processing device, electronic equipment and storage medium
CN108805838B (en) Image processing method, mobile terminal and computer readable storage medium
CN113744294A (en) Image processing method and related device
CN111311619A (en) Method and device for realizing slider verification
CN111222446B (en) Face recognition method, face recognition device and mobile terminal
Hong et al. Single image dehazing based on pixel-wise transmission estimation with estimated radiance patches

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant