CN113689455A - Thermal fluid image processing method, system, terminal and medium


Info

Publication number
CN113689455A
Authority
CN
China
Prior art keywords
background
image
detected
pixel
gradient
Prior art date
Legal status
Granted
Application number
CN202110742298.3A
Other languages
Chinese (zh)
Other versions
CN113689455B (en)
Inventor
周逸帆
张玉银
齐文元
Current Assignee
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN202110742298.3A
Publication of CN113689455A
Application granted
Publication of CN113689455B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/181 Segmentation; Edge detection involving edge growing; involving edge linking
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a thermal fluid image processing method and system, together with a corresponding terminal and medium. By extracting the histogram of oriented gradient features of the target region to be detected and of the background region, the invention enlarges the difference between the two regions. The method can be applied to the field of thermal fluid image processing for target recognition, edge detection and segmentation of the region to be detected and the background region. It is advantageous when the difference between the target to be detected and the background is small and when the background changes from frame to frame; it has low sensitivity to parameter settings and good universality, so it can be used for different cases, working conditions and optical systems; the program is easy to implement, the processing speed is high, and the results are accurate and reliable.

Description

Thermal fluid image processing method, system, terminal and medium
Technical Field
The invention relates to the technical field of thermal fluid image processing, and in particular to a thermal fluid image processing method, system, terminal and medium based on histogram of oriented gradient features.
Background
Experimental data on thermal fluids obtained by optical diagnostic techniques are not only useful for studying the mechanisms of thermal fluids but are also important for establishing and calibrating simulation models. A variety of optical measurement methods have been developed and applied to the study of the relevant characteristics of thermal fluids, and have been widely used in constant volume combustion bombs, optical engines, shock tubes, rapid compression machines and the like to measure macroscopic and microscopic characteristics of thermal fluids; these methods include, but are not limited to, Mie scattering, backlighting, shadowgraph, schlieren, laser-induced fluorescence, and ultraviolet/visible absorption and scattering techniques. For the analysis of the digital images obtained by these optical measurement methods, target recognition, edge detection, image segmentation and contour extraction are indispensable steps. For digital images used to measure macroscopic or microscopic properties of a thermal fluid, inaccurate segmentation means the extracted information may contain wrong or unnecessary content, so it is important to accurately segment the region of the target to be detected from the image obtained in the optical test.
The thermal fluid image processing method in common use today generally completes image segmentation by computing the difference between the image to be detected and a background image and then binarizing with a chosen threshold; this is simple and effective when the difference between the brightness values of the region to be detected and of the background region is large. However, under high ambient pressure and high ambient density (such as engine-like conditions), the temporal and spatial variation of the ambient density gradient is converted into variation of the texture in the image, and the texture of the ambient gas becomes complex background noise that changes with time, making the image difficult to recognize and segment and greatly increasing the difficulty of target recognition, edge detection, image segmentation and contour extraction. The processing methods developed so far for segmentation of thermal fluid images have, to a greater or lesser extent, the following problems and disadvantages: (1) there are many steps and the processing flow is complex and not concise; (2) each step contains several manually set thresholds, correction factors and the like, which is inconvenient; (3) the processing result is very sensitive to the choice of thresholds and other parameters, and a slightly wrong setting yields an undesirable result; (4) many methods can only deal with a relatively clean background, or one whose noise is slight and inconspicuous (ambient density below 10 kg/m³), and when the brightness difference between the region to be detected and the background region is not significant, the identified boundary becomes ambiguous; (5) a method developed for one case and working condition may work well there yet perform poorly once the environment, working condition or optical system changes, requiring the threshold and some manually set parameters to be determined again, so universality is insufficient; (6) for images with high background noise, or images whose background changes from frame to frame, the processing effect is mediocre and the accuracy is poor.
In summary, existing thermal fluid image processing methods suffer from complicated steps, insufficient convenience, insufficient universality and poor processing accuracy. No description or report of a technology similar to the present invention has been found, and no similar data at home or abroad have been collected.
Disclosure of Invention
Aiming at the above defects in the prior art, the invention provides a thermal fluid image processing method, system, terminal and medium based on histogram of oriented gradient features.
According to an aspect of the present invention, there is provided a thermal fluid image processing method including:
dividing a plurality of thermal fluid images arranged in time sequence into images to be detected, which contain the thermal fluid to be detected and recognized, and background images, which do not, and extracting the histogram of oriented gradient features of every pixel of the images to be detected and of the background images;
calculating the difference between the histogram of oriented gradient features of the pixels at the same position in the image to be detected and in the background image, to obtain a difference map of the histogram of oriented gradient features;
binarizing the difference map, judging whether each pixel belongs to the target region, dividing the difference map into a target region and a background region according to the judgment, and extracting the contour of the target region, thereby completing the processing of the thermal fluid image.
Preferably, extracting the histogram of oriented gradient features of the pixels of the image to be detected and of the background image respectively includes:
for each pixel (i, j) on the image to be detected I and on the background image Bk, taking the pixel (i, j) as the center, acquiring the n × n region around it as the primitive C(i, j) of the pixel (i, j), so that the image to be detected and the background image are divided into primitives;
performing gamma correction on the gray values of each primitive C(i, j) by taking the square root, C'(i, j) = sqrt(C(i, j));
for each pixel C'(i, j)(m, n) in each corrected primitive C'(i, j), calculating the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n); performing vector synthesis of the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n) of the same pixel to obtain the gradient magnitude and gradient direction of each pixel in the two-dimensional plane;
dividing the value range [0°, 360°) of the gradient direction into N classes, where N is the dimension of the histogram of oriented gradient feature to be obtained and each class corresponds to one bin; classifying the gradient direction of each pixel C'(i, j)(m, n) in the primitive C'(i, j) by angle into the corresponding bin, and superposing the gradient magnitudes of all pixels classified into the same bin to obtain the value of the histogram of oriented gradient feature in that bin, thereby obtaining the histogram of oriented gradient feature H(i, j) corresponding to each primitive.
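As an illustration of the extraction steps above, the following is a minimal Python sketch (NumPy only) that computes an N-bin histogram of oriented gradients feature for a single pixel from its n × n primitive. The function name, the replicate padding used at the image and primitive boundaries, and the bin-assignment details are illustrative assumptions rather than the patent's prescribed implementation.

```python
import numpy as np

def hog_feature_at_pixel(img, i, j, n=3, n_bins=9):
    """Sketch of the per-pixel HOG feature described above (names are illustrative).

    img    : 2-D array of gray values
    (i, j) : pixel whose feature is extracted
    n      : primitive size, n = 2z + 1
    n_bins : dimension N of the HOG feature
    """
    z = n // 2
    # Primitive C(i, j): the n x n neighbourhood of (i, j); image borders are
    # handled here by replicate padding (one plausible "copied value" fill).
    padded = np.pad(img.astype(float), z, mode='edge')
    cell = padded[i:i + n, j:j + n]
    # Gamma correction: take the square root of the gray values.
    cell = np.sqrt(cell)
    # Replicate-pad the primitive itself so every cell has four neighbours.
    c = np.pad(cell, 1, mode='edge')
    gx = c[1:-1, 2:] - c[1:-1, :-2]   # right neighbour minus left neighbour
    gy = c[2:, 1:-1] - c[:-2, 1:-1]   # lower neighbour minus upper neighbour
    # Vector synthesis: gradient magnitude and direction in [0, 360) degrees.
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 360.0
    # Classify each direction into one of n_bins bins of width 360/n_bins degrees
    # and superpose (sum) the magnitudes falling into the same bin.
    bins = (ang // (360.0 / n_bins)).astype(int) % n_bins
    hog = np.zeros(n_bins)
    np.add.at(hog, bins.ravel(), mag.ravel())
    return hog
```

Calling this sketch for every pixel of the image to be detected and of the background image yields the two sets of features whose per-pixel difference is computed in the following steps.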
Preferably, the value of n × n is (2z + 1) × (2z + 1), z being a positive integer.
Preferably, calculating the horizontal gradient Gx(i, j)(m, n) of each pixel C'(i, j)(m, n) in each corrected primitive C'(i, j) includes:
setting a first side of the pixel C'(i, j)(m, n) as positive, and subtracting the value of the pixel C'(i, j)(m, n-1) adjacent on the second side, opposite to the first side, from the value of the pixel C'(i, j)(m, n+1) adjacent on the positive first side, to obtain the horizontal gradient;
calculating the vertical gradient Gy(i, j)(m, n) of each pixel C'(i, j)(m, n) in each corrected primitive C'(i, j) includes:
setting a third side of the pixel C'(i, j)(m, n) as positive, and subtracting the value of the pixel C'(i, j)(m-1, n) adjacent on the fourth side, opposite to the third side, from the value of the pixel C'(i, j)(m+1, n) adjacent on the positive third side, to obtain the vertical gradient.
Preferably, the value range [0°, 360°) of the gradient direction is divided into N classes, where the value range corresponding to each class is (360/N)°, and the N classes include: [0°, 360°/N), [360°/N, 2 × 360°/N), [2 × 360°/N, 3 × 360°/N), ..., [(N-1) × 360°/N, 360°).
Preferably, the method further includes:
normalizing the histogram of oriented gradient feature H(i, j) corresponding to each primitive over all bins, so that the values of the histogram of oriented gradient feature of each primitive in the bins sum to 1, to obtain the normalized feature Hn, in which the value corresponding to each bin k is Hn(i, j)(k) = H(i, j)(k) / ∑k H(i, j)(k).
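A minimal sketch of this normalization step, in the same illustrative style as above; the handling of an all-zero feature (a completely flat primitive) is an added assumption, since the text does not specify it:

```python
import numpy as np

def normalize_hog(h):
    # Scale the HOG feature so its values over all bins sum to 1.
    s = h.sum()
    # Assumed handling of a flat (all-zero-gradient) primitive: uniform histogram.
    return h / s if s > 0 else np.full_like(h, 1.0 / h.size, dtype=float)
```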
Preferably, calculating the difference between the histogram of oriented gradient features of the pixels of the image to be detected and of the background image at the same position includes: calculating, by a p-order (Lp-norm) difference, the difference between the histogram of oriented gradient features HnI(i, j) and HnBk(i, j) of the pixel (i, j) at the same position on the image to be detected and on the background image:
DiffLp = ||HnI - HnBk||p = (∑k |HnI(k) - HnBk(k)|^p)^(1/p),
where p is a positive integer and k runs over the bins of the histogram of oriented gradients.
Preferably, if p = 1, then:
DiffL1 = ||HnI - HnBk||1 = ∑k |HnI(k) - HnBk(k)|, DiffL1 ∈ [0, 2].
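The p-order (Lp-norm) difference and its p = 1 special case could be computed as follows (a sketch; the function name is illustrative):

```python
import numpy as np

def hog_difference(hn_i, hn_bk, p=1):
    # Lp-norm difference between two normalized HOG features over all bins.
    return float(np.sum(np.abs(hn_i - hn_bk) ** p) ** (1.0 / p))

# For p = 1 the result lies in [0, 2], since each normalized feature sums to 1.
```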
Preferably, the difference map is binarized, whether each pixel belongs to the target region is judged, and the pixels are classified to obtain a target region and a background region, completing the segmentation of the difference map; the contour of the target region is then extracted. The segmented background region is used for the dynamic background correction update, and the extracted contour is used for measuring and calculating the relevant characteristic parameters.
Preferably, the dynamic background correction update includes:
taking the last image Bk_01 before the images to be detected in the plurality of thermal fluid images arranged in time sequence as the first background image, and performing image processing based on the first background image Bk_01 and the first image to be detected I_01 to detect the target region and the background region on I_01, completing the segmentation of the first image to be detected I_01;
placing the pixels of the background region of the first image to be detected I_01 at the same positions in the first background image Bk_01 to complete the background update correction, the updated first background image Bk_01 being the second background image Bk_02;
acquiring the second background image Bk_02, and performing image processing on the second image to be detected I_02 and the second background image Bk_02 to detect the target region and the background region on I_02, completing the segmentation of the next image to be detected;
and so on, until the segmentation of all the images to be detected among the thermal fluid images is completed.
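The dynamic background correction update described above could be organized as in the following sketch, where segment_image is a hypothetical helper (not defined in the patent) that returns a boolean mask of the target region for one image, given the current background image:

```python
import numpy as np

def process_sequence(images, segment_image):
    """Sketch of the dynamic background correction update.

    images        : list of 2-D arrays ordered in time; images[0] is the last
                    frame before the fluid appears (Bk_01 in the text above).
    segment_image : hypothetical callable (img, background) -> boolean mask,
                    True where a pixel belongs to the target region.
    """
    background = images[0].astype(float)   # first background image Bk_01
    masks = []
    for img in images[1:]:                 # images to be detected I_01, I_02, ...
        mask = segment_image(img, background)
        masks.append(mask)
        # Background update correction: copy background-region pixels of the
        # current image into the same positions of the background image.
        background = background.copy()
        background[~mask] = img[~mask]
    return masks
```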
According to another aspect of the present invention, there is provided a thermal fluid image processing system including:
a histogram of oriented gradients feature extraction module, which divides a plurality of thermal fluid images into images to be detected, containing the thermal fluid to be detected and recognized, and background images, which do not, and extracts the histogram of oriented gradient features of every pixel of the images to be detected and of the background images;
a difference map calculation module, which calculates the difference between the histogram of oriented gradient features of the pixels at the same position in the image to be detected and in the background image, to obtain a difference map of the histogram of oriented gradient features;
an image processing module, which binarizes the difference map, judges whether each pixel belongs to the target region, divides the difference map into a target region and a background region according to the judgment, and extracts the contour of the target region, completing the processing of the thermal fluid image.
According to a third aspect of the present invention, there is provided a terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program being operable to perform the method of any one of the above, or to operate the system as described above.
According to a fourth aspect of the invention, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, is operable to perform the method of any one of the above or to operate the system of the above.
Due to the adoption of the technical scheme, compared with the prior art, the invention has the following beneficial effects:
the thermal fluid image processing method, the thermal fluid image processing system, the thermal fluid image processing terminal and the thermal fluid image processing medium are based on Histogram of Oriented Gradient (HOG) characteristics, the hot fluid image under the condition of cleaner background and high background noise can be processed, the HOG characteristics of the image to be detected and the background image can be compared on the directional gradient histogram, the difference of the HOG characteristics can more clearly reflect the difference between the target area and the background area than the difference of the original brightness value, the difference and the distinguishability between the target area and the background area are enlarged, the image to be detected is conveniently segmented, and the outline of a target object is extracted, the method has low sensitivity to parameter setting, good universality, easy realization of programs, high processing speed and accurate and reliable processing results, and can be used for different cases, working conditions and optical systems.
The hot fluid image processing method, the hot fluid image processing system, the hot fluid image processing terminal and the hot fluid image processing medium can process hot fluid images under different conditions (including a clean background condition, high background noise, a static background, a dynamic background, different optical systems, different testing methods and the like), the HOG characteristics of the image to be detected and the HOG characteristics of the background image can be compared on a direction gradient histogram, the difference of the HOG characteristics can reflect the difference between a target area and the background area more clearly than the difference of original brightness values, the difference and the distinguishability between the target area and the background area are enlarged, and the image to be detected can be segmented conveniently and the outline of a target object can be extracted.
The method, the system, the terminal and the medium for processing the thermal fluid image can complete a difficult segmentation task when the difference between the background and the target to be detected on the image is small, and compared with the existing scheme, the method has the advantages of very good processing effect, low sensitivity of the proposed method on parameter setting, good universality, capability of facing different cases, working conditions and optical systems, easy realization of programs, high processing speed and accurate and reliable processing results.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of a thermal fluid image processing method according to an embodiment of the invention.
FIG. 2 is a flow chart of a thermal fluid image processing method according to a preferred embodiment of the present invention.
FIG. 3 is a schematic view of the thermal fluid image case taken as the processing object in an embodiment of the present invention; wherein (a) is a spray development image of engine fuel spray, and (b) is a background image.
FIG. 4 is a schematic diagram of a Histogram of Oriented Gradients (HOG) feature extraction procedure in an embodiment of the present invention.
FIG. 5 is a diagram illustrating Histogram of Oriented Gradients (HOG) feature difference calculation steps in an embodiment of the present invention.
Fig. 6 is a schematic diagram of the steps of binarization, segmentation, and contour extraction in an embodiment of the present invention.
FIG. 7 is a comparison of the effect of the thermal fluid image processing method in a preferred embodiment of the present invention with that of a conventional processing method (segmentation by thresholding the difference in brightness values between the background image and the image to be detected).
FIG. 8 is a block diagram of a thermal fluid image processing system according to an embodiment of the present invention.
Detailed Description
The following examples illustrate the invention in detail. The embodiments are implemented on the premise of the technical scheme of the invention, and detailed implementation modes and specific operation processes are given. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these all fall within the scope of the present invention.
Fig. 1 is a flowchart of a thermal fluid image processing method according to an embodiment of the present invention.
As shown in fig. 1, the thermal fluid image processing method provided by this embodiment may include the following steps:
s100, dividing a plurality of thermal fluid images (image sequences) arranged according to a time sequence into a to-be-detected diagram containing thermal fluid to be detected and identified and a background diagram not containing thermal fluid to be detected and identified respectively, and extracting the directional gradient histogram characteristics of each pixel point of the to-be-detected diagram and the background diagram respectively;
s200, calculating the difference between the directional gradient histogram characteristics of all the pixel points in the to-be-detected graph and the background graph at the same position to obtain a difference graph of the directional gradient histogram characteristics, namely a set of characteristic differences of all the pixel points;
s300, performing binarization processing on the difference map, judging whether each pixel point belongs to a target area, dividing the difference map into the target area and a background area according to a judgment result, extracting the outline of the target area, and finishing the processing of the thermal fluid image.
In S100 of this embodiment, as a preferred embodiment, extracting the histogram of oriented gradient features of every pixel of the image to be detected and of the background image may include the following steps:
S101, for each pixel (i, j) on the image to be detected I and on the background image Bk, taking the pixel (i, j) as the center, acquiring the n × n region around it as the primitive C(i, j) of the pixel (i, j), so that the image to be detected and the background image are divided into primitives;
S102, performing gamma correction on the gray values of each primitive C(i, j) by taking the square root, C'(i, j) = sqrt(C(i, j)); calculating the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n) of each pixel C'(i, j)(m, n) in each corrected primitive C'(i, j); performing vector synthesis of the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n) of the same pixel to obtain the gradient magnitude and gradient direction of each pixel in the two-dimensional plane;
S103, dividing the value range [0°, 360°) of the gradient direction into N classes, where N is the dimension of the histogram of oriented gradient feature to be obtained and each class corresponds to one bin; classifying the gradient direction of each pixel C'(i, j)(m, n) in the primitive C'(i, j) by angle into the corresponding bin, and superposing the gradient magnitudes of all pixels classified into the same bin to obtain the value of the histogram of oriented gradient feature in that bin, thereby obtaining the histogram of oriented gradient feature H(i, j) corresponding to each primitive.
In S101 of this embodiment, as a preferred embodiment, the value of n × n may be (2z + 1) × (2z + 1), z being a positive integer.
In a specific application example of this preferred embodiment, the value of n × n may be 3 × 3, 5 × 5, 7 × 7, 9 × 9, 11 × 11, 13 × 13, and so on.
In S101 of this embodiment, as a specific application example, the value of n × n may preferably be 3 × 3.
In S102 of this embodiment, as a preferred embodiment, the calculation of the horizontal gradient Gx(i, j)(m, n) of each pixel C'(i, j)(m, n) in each primitive C'(i, j) may include the following steps:
setting a first side of the pixel C'(i, j)(m, n) as positive, and subtracting the value of the pixel C'(i, j)(m, n-1) adjacent on the second side, opposite to the first side, from the value of the pixel C'(i, j)(m, n+1) adjacent on the positive first side, to obtain the horizontal gradient.
In S102 of this embodiment, as a preferred embodiment, the calculation of the vertical gradient Gy(i, j)(m, n) of each pixel C'(i, j)(m, n) in each primitive C'(i, j) may include the following steps:
setting a third side of the pixel C'(i, j)(m, n) as positive, and subtracting the value of the pixel C'(i, j)(m-1, n) adjacent on the fourth side, opposite to the third side, from the value of the pixel C'(i, j)(m+1, n) adjacent on the positive third side, to obtain the vertical gradient.
In S103 of this embodiment, as a preferred embodiment, the dimension N of the histogram of oriented gradients is the number of classes into which the direction range [0°, 360°) of the gradient is divided, where the value range corresponding to each class is (360/N)°, and the N classes may be, but are not limited to: [0°, 360°/N), [360°/N, 2 × 360°/N), [2 × 360°/N, 3 × 360°/N), ..., [(N-1) × 360°/N, 360°).
In this preferred embodiment, the purpose of the classification is to obtain the histogram of oriented gradient features, which are then used to calculate the difference between the histogram of oriented gradient features of the pixels of the image to be detected and of the background image at the same position.
In S103 of this embodiment, as a specific application example, the value of N is preferably, but not limited to, 6 to 12, and more preferably 9.
In S103 of this embodiment, as a preferred embodiment, the method may further include:
normalizing the histogram of oriented gradient feature H(i, j) corresponding to each primitive over all bins, so that the values of the histogram of oriented gradient feature of each primitive in the bins sum to 1, to obtain the normalized feature Hn, in which the value corresponding to each bin k is Hn(i, j)(k) = H(i, j)(k) / ∑k H(i, j)(k).
In S200 of this embodiment, as a preferred embodiment, the difference between the histogram of oriented gradient features of the pixels of the image to be detected and of the background image at the same position is calculated preferably, but not exclusively, by the following steps:
calculating, by a p-order (Lp-norm) difference, the difference between the histogram of oriented gradient features HnI(i, j) and HnBk(i, j) of the pixel (i, j) at the same position on the image to be detected and on the background image:
DiffLp = ||HnI - HnBk||p = (∑k |HnI(k) - HnBk(k)|^p)^(1/p),
where p is a positive integer and k runs over the bins of the histogram of oriented gradients.
In S200 of this embodiment, as a preferred embodiment, p is recommended to be, but not limited to, 1; as a preferred solution, using the 1-order difference:
DiffL1 = ||HnI - HnBk||1 = ∑k |HnI(k) - HnBk(k)|, DiffL1 ∈ [0, 2].
in S300 of this embodiment, as a specific application example, binarization processing is performed on the difference map, whether each pixel belongs to the target region is determined, each pixel is classified to obtain the target region and the background region, segmentation of the difference map is completed, the segmented background region is used for dynamic background correction and update, the contour of the target region is extracted, and the extracted contour is used for measurement and calculation of the relevant characteristic parameters.
In S300 of this embodiment, as a preferred embodiment, the dynamic background correction updating includes:
taking the last image Bk_01 before the images to be detected in the plurality of thermal fluid images (the image sequence) arranged in time sequence as the first background image, and performing image processing based on the first background image Bk_01 and the first image to be detected I_01 to detect the target region and the background region on I_01, completing the segmentation of the first image to be detected I_01;
placing the pixels of the background region of the first image to be detected I_01 at the same positions in the first background image Bk_01 to complete the background update correction, the updated first background image Bk_01 being the second background image Bk_02;
acquiring the second background image Bk_02, and performing image processing on the second image to be detected I_02 and the second background image Bk_02 to detect the target region and the background region on I_02, completing the segmentation of the next image to be detected;
and so on, until the segmentation of all the images to be detected in the image sequence is completed.
Fig. 2 is a flowchart of a thermal fluid image processing method according to a preferred embodiment of the present invention. The thermal fluid image processing method provided by the preferred embodiment is used for processing a digital image of a thermal fluid obtained by an optical test based on the histogram of oriented gradients, and performing target recognition, edge detection, segmentation and contour extraction on a region to be detected and a background region.
As shown in fig. 2, the thermal fluid image processing method provided by the preferred embodiment may include the following steps:
s1: the method comprises the steps of directional gradient Histogram (HOG) feature extraction, which mainly comprises the steps of a, element division, b, gradient size and direction calculation and c, directional histogram classification and superposition, wherein HOG features of pixel points of an inspection image and a background image are respectively extracted (the inspection image is generally an image containing hot fluid needing to be detected and identified, and the background image is an image not containing the hot fluid needing to be detected and identified), so that preparation is made for the subsequent second-step difference calculation.
S2: and (3) calculating the difference of the features of the directional gradient Histogram (HOG), calculating the difference between the HOG features of each pixel point of the image to be detected and the background image, wherein the difference of the HOG features can reflect the difference between the area to be detected and the background area more clearly than the difference of the original brightness value, and the difference and the distinguishability between the area to be detected and the background area are amplified, so that the subsequent binarization, segmentation and contour extraction are facilitated.
S3: and (4) binarization, segmentation and contour extraction, namely, performing binarization processing on the image with the difference in response obtained in the step (S2), namely classifying pixel points, dividing the pixel points into a target region and a background region, completing segmentation, wherein the segmented background region is used for dynamic background correction and updating, and can extract the contour of the target region, and the extracted contour can be used for measurement and calculation of related characteristic parameters.
The dynamic background correction update proceeds as follows. First, the last image (not containing the target to be detected) before the images to be detected in the image sequence is used as the initial background (which may be labeled Bk_01, i.e., the first background image). Image processing is performed based on Bk_01 and the first image to be detected I_01 to detect the target region and the background region on I_01, completing its segmentation. The pixels of the background region of I_01 are then placed at the same positions in Bk_01 to complete the background update correction, yielding the second background image Bk_02. The next image segmentation is performed based on the second image to be detected I_02 and the second background image Bk_02, and so on. Using the dynamic background correction update, the segmentation of the whole image sequence can be better completed image by image.
As a preferred embodiment, in the step S1 histogram of oriented gradients (HOG) feature extraction, HOG feature extraction is performed on each pixel (i, j) of the image to be detected I and of the background image Bk. When the HOG feature of a pixel is extracted, S1-a primitive division is performed: the nearby n × n region (3 × 3, 5 × 5, 7 × 7, etc.) centered on the point (i, j) is selected as the primitive C(i, j) of the pixel (i, j); pixels at the image boundary may be handled by padding the surroundings with copied values or a similar fill method.
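One plausible realization of the "copied value" fill at the image boundary is replicate (edge) padding; a minimal NumPy sketch under that assumption:

```python
import numpy as np

img = np.arange(12, dtype=float).reshape(3, 4)   # toy gray-value image
z = 1                                            # half-width of a 3 x 3 primitive
padded = np.pad(img, z, mode='edge')             # border pixels are duplicated outward
# padded[i:i+3, j:j+3] is now the 3 x 3 primitive of pixel (i, j) for every (i, j).
```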
As a preferred embodiment, in the step S1 histogram of oriented gradients (HOG) feature extraction, the primitive size in S1-a may be selected to be 3 × 3.
As a preferred embodiment, in the step S1 histogram of oriented gradients (HOG) feature extraction, before the S1-b gradient magnitude and direction calculation, gamma correction is typically performed on the gray values of the original image by taking the square root, C' = sqrt(C), and the subsequent gradient calculations are performed on the corrected values.
As a preferred embodiment, in the step S1 histogram of oriented gradients (HOG) feature extraction, after the S1-a primitive division, the S1-b gradient magnitude and direction calculation is performed: first, the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n) of each cell (pixel) C'(i, j)(m, n) in each primitive C'(i, j) are calculated.
As a preferred embodiment, in the step S1-b gradient magnitude and direction calculation, for the horizontal gradient Gx(i, j)(m, n) of each cell (pixel) C'(i, j)(m, n), the rightward direction is set as positive, and the value of the cell (pixel) C'(i, j)(m, n-1) adjacent on its left is subtracted from the value of the cell (pixel) C'(i, j)(m, n+1) adjacent on its right to obtain the horizontal gradient; for the vertical gradient Gy(i, j)(m, n) of the cell (pixel) C'(i, j)(m, n), the downward direction is set as positive, and the value of the cell (pixel) C'(i, j)(m-1, n) adjacent above it is subtracted from the value of the cell (pixel) C'(i, j)(m+1, n) adjacent below it to obtain the vertical gradient.
As a preferred embodiment, in the step S1-b gradient magnitude and direction calculation, the calculated horizontal and vertical gradients are combined as vectors to obtain the magnitude and direction of the gradient in the two-dimensional plane, which may be represented in polar coordinates (ρ, θ), where ρ denotes the magnitude of the synthesized gradient and θ denotes its direction. At this point the step S1-b gradient magnitude and direction calculation is complete, and each cell (pixel) in the n × n primitive corresponds to one gradient magnitude and direction.
As a preferred embodiment, in the step S1, after the S1-b gradient magnitude and direction calculation, the S1-c direction histogram classification and superposition is performed. In this step, the value range [0°, 360°) of the gradient direction is first divided into N classes, N being the dimension of the HOG feature and each class corresponding to one bin; the gradient direction of each cell (pixel) in the primitive obtained in the S1-b gradient magnitude and direction calculation is then classified by angle into its bin, and the gradient magnitudes of all cells (pixels) classified into the same class (bin) are superposed to obtain the value of the HOG feature in that bin. Through this step the HOG feature H(i, j) corresponding to each primitive is obtained.
As a preferred embodiment, in the step S1-c direction histogram classification and superposition, the dimension of the HOG feature is set to 9; the 9 classes then correspond to intervals of 40° each, and each cell can be classified into the corresponding bin according to the interval to which its gradient direction belongs.
As a preferred embodiment, in the step S1-c direction histogram classification and superposition, the obtained HOG feature H may be normalized over all bins, so that the values of the HOG feature of each primitive (pixel) in the bins sum to 1, giving the normalized HOG feature Hn, in which the value corresponding to each bin k is Hn(k) = H(k) / ∑k H(k).
As a preferred embodiment, after the step S1 histogram of oriented gradients (HOG) feature extraction, the step S2 histogram of oriented gradients (HOG) feature difference calculation is performed, computing the difference between the HOG features HnI(i, j) and HnBk(i, j) of the pixels at the same position (i, j) on the image to be detected and on the background image.
As a preferred embodiment, in the step S2 histogram of oriented gradients (HOG) feature difference calculation, the difference may be calculated as a p-order (Lp-norm) difference, with the formula ||HnI - HnBk||p = (∑k |HnI(k) - HnBk(k)|^p)^(1/p), where p is a positive integer and k runs over the bins of the HOG feature.
As a preferred embodiment, in the step S2 histogram of oriented gradients (HOG) feature difference calculation, the difference is calculated as the 1-order (L1-norm) difference, DiffL1 = ||HnI - HnBk||1 = ∑k |HnI(k) - HnBk(k)|, DiffL1 ∈ [0, 2].
As a preferred embodiment, after the step S2 histogram of oriented gradients (HOG) feature difference calculation, a difference map of the HOG features of each pixel of the image to be detected and of the background image is obtained, and then the step S3 binarization, segmentation and contour extraction is performed.
As a preferred embodiment, in the binarization, segmentation and contour extraction in step S3, the threshold value may be selected by referring to a threshold value selection manner of a conventional image processing method.
The core idea of the thermal fluid image processing method provided by this preferred embodiment is as follows: the target region and the background region are distinguished by the difference between them in the pixel brightness gradient direction, completing the segmentation and contour extraction. When judging whether a pixel belongs to the target to be detected or to the background, the judgment is not made simply from the brightness value of that pixel; instead, the gradient magnitude and direction are calculated by combining the information of the pixel and its surrounding pixels, binarization is then performed on the difference map of the histogram of oriented gradient features, and each pixel is judged to belong to the target region or not. This is advantageous for edge detection and region recognition under certain conditions.
During processing, a dynamic background correction update is adopted to cope with changes of the background region within the image sequence; the dynamic background correction update is executed after each image to be detected has been processed and segmented and before the next image to be detected is processed and segmented. In some measurements the background changes little and can be considered static; for the static case, the average of several background images can be used as the background image. In other cases the background changes greatly (for example, engine-like conditions of high temperature and high pressure); if the optical system is a schlieren setup, in which even density gradients can be seen, the ambient gas moves very visibly in the background of the image, and a new (dynamic) background is needed for each image.
The dynamic background correction update is as follows: the last image (not containing the target to be detected) before the images to be detected in the image sequence is first used as the initial background (which may be labeled Bk_01); image processing is performed based on Bk_01 and the first image to be detected I_01 to detect the target region and the background region on I_01, completing the segmentation; the pixels of the background region of I_01 are then placed at the same positions in Bk_01 to complete the background update correction, giving Bk_02; the next image segmentation is performed based on I_02 and Bk_02, and so on. Using the dynamic background correction update, the segmentation of the whole image sequence can be better completed image by image.
The technical solutions provided by the above embodiments of the present invention are further described in detail below with reference to the accompanying drawings and a specific application example.
FIG. 3 is a schematic diagram of the thermal fluid image case taken as the processing object in a specific application example, in which (a) is a spray development image of engine fuel spray. The optical test method is schlieren, and the image sequence contains 50 images: the first image is the initial background image Bk_01, which does not contain the fuel spray, and the remaining 49 images, I_01 to I_49, are images to be detected containing the fuel spray (i.e., the target to be detected). The pixel resolution of each image is 112 × 400 and the gray-value resolution is 8 bit, i.e., [0, 255]. (b) is a background image. In some measurements the background changes little and can be considered static; for the static case, the average of several background images can be used as the background image. In other cases the background changes greatly (e.g., engine-like conditions of high temperature and high pressure); if the optical system is a schlieren setup, in which even density gradients can be seen, the ambient gas moves very visibly in the background of the image, and a new (dynamic) background is required for each image.
For a thermal fluid digital image like that in fig. 3, the following difficulties exist in target recognition, edge detection, segmentation and contour extraction of the target region and the background region: (1) the brightness (gray) values of the target region to be detected (the fuel spray region) and of the surrounding background region are quite close, and in particular the difference between the gray values of the gas-phase region of the spray and of the background region is very small, so the target fuel spray region found by direct subtraction and comparison of brightness values is poor; (2) under the high ambient pressure, high ambient density and high ambient temperature of engine-like working conditions, a large amount of complex background noise appears in the image background (as shown in fig. 3(b)), and the complex texture formed by this noise greatly increases the difficulty of recognizing and segmenting the target to be detected in the image; (3) the background region changes from frame to frame, since the air flow around the spray is inevitably set in motion, which also increases the difficulty of the image processing.
Referring to fig. 2, the thermal fluid image processing method provided by this specific application example can complete the task of segmenting a complex, difficult image like that in fig. 3, and consists of 3 main steps, specifically as follows:
s1: and (3) extracting the features of a directional gradient Histogram (HOG), which mainly comprises the steps of a, element division, b, gradient size and direction calculation and c, classifying and superposing the directional histogram, and respectively extracting the HOG features of each pixel point of the fuel spray image and the background image to be detected to prepare for the subsequent second-step difference calculation.
S2: and (3) calculating the difference of the features of a directional gradient Histogram (HOG), calculating the difference between the HOG features of each pixel point of the fuel spray image to be detected and the background image, wherein the difference of the HOG features can more clearly reflect the difference between the region to be detected and the background region than the difference of the original brightness value, and the difference and the differentiability between the region to be detected and the background region are amplified, so that the subsequent binaryzation, segmentation and contour extraction are facilitated.
S3: and binarization, segmentation and contour extraction, namely performing binarization processing on the image with the reaction difference obtained in the step S2, namely classifying pixel points, dividing the pixel points into a target spraying area and a background area, completing segmentation, wherein the segmented background area is used for dynamic background correction and updating, and can extract the contour of the spraying area, and the extracted spraying contour can be used for measuring and calculating related characteristic parameters of spraying.
In addition, a dynamic background correction update is adopted during processing to cope with changes of the background region within the image sequence. It should be noted that in some measurements the background changes little and can be considered static; for the static case, the average of several background images can be used as the background image. In other cases the background changes greatly (e.g., the engine-like condition of fig. 3); if the optical system is a schlieren setup, in which even density gradients can be seen, the ambient gas moves very visibly in the background of the image, and a new (dynamic) background is required for each image. After the S3 binarization, segmentation and contour extraction step is completed, the algorithm provided by the invention performs the dynamic background correction update: the last image (not containing fuel spray) before the fuel spray images to be detected in the image sequence is used as the initial background (which may be labeled Bk_01); image processing is performed based on Bk_01 and the first fuel spray image to be detected I_01 to detect the target spray region and the background region on I_01, completing the segmentation; the pixels of the background region of I_01 are then placed at the same positions in Bk_01 to complete the background update correction, giving Bk_02; the next image segmentation is performed based on I_02 and Bk_02, and so on. Using the dynamic background correction update, the segmentation of the whole sequence of fuel spray images can be better completed image by image.
In the thermal fluid image processing method provided by this specific application example, in the step S1 histogram of oriented gradients (HOG) feature extraction, HOG feature extraction is performed on each pixel (i, j) of the fuel spray image to be detected I and of the background image Bk. Fig. 4 is a schematic diagram of the S1 HOG feature extraction step of the invention, taking as an example the extraction of the HOG feature of the pixel in row 22, column 207 of the 49th fuel spray image to be detected, I_49, and of the background image Bk_49.
In the thermal fluid image processing method provided by this specific application example, when the HOG feature of a pixel is extracted, S1-a primitive division is performed: the nearby n × n region (3 × 3, 5 × 5, 7 × 7, etc.) centered on the point (i, j) is selected as the primitive C(i, j) of the pixel (i, j); pixels at the image boundary may be handled by padding the surroundings with copied values or a similar fill method. In this example the primitive size is 3 × 3, and fig. 4 shows the specific values of the corresponding primitives of the image to be detected and of the background image. Before the S1-b gradient magnitude and direction calculation, gamma correction is first performed on the gray values of the original image by taking the square root, C' = sqrt(C), and the subsequent gradient calculations are performed on the corrected values, as shown in fig. 4.
After the S1-a primitive division, the S1-b gradient magnitude and direction calculation is performed. Fig. 4 shows the calculation process of the gradient magnitude and direction for the primitive of the image to be detected: the horizontal gradient Gx and the vertical gradient Gy of each of the 9 cells (pixels) in the primitive are calculated. For the horizontal gradient of a cell (pixel), the value of the cell (pixel) adjacent on its left is subtracted from the value of the cell (pixel) adjacent on its right; for the vertical gradient, the value of the cell (pixel) adjacent above it is subtracted from the value of the cell (pixel) adjacent below it. For cells at the primitive boundary, the surroundings are filled with copied values when computing the gradient, as shown in fig. 4. Taking as an example the upper-left cell of the primitive, with value 9.33: its horizontal gradient, -1.46, is obtained by subtracting the left-hand value filled by copying (9.33) from the value of the pixel on its right (7.87); its vertical gradient, 1.67, is obtained by subtracting the upper value filled by copying (9.33) from the value below it (11). The horizontal and vertical gradients of the other cells are calculated in the same way.
Next, the calculated horizontal and vertical gradients are combined as vectors to obtain the magnitude and direction of the gradient in the two-dimensional plane, which may be represented in polar coordinates (ρ, θ), where ρ denotes the magnitude of the synthesized gradient and θ denotes its direction. Taking as an example the upper-left cell of the primitive, with horizontal gradient -1.46 and vertical gradient 1.67: the magnitude of the synthesized vector is sqrt((-1.46)² + 1.67²) ≈ 2.22, and the vector direction makes an angle of 131.2° with the horizontal rightward direction. The vector synthesis of the horizontal and vertical gradients of the other cells is performed in the same way. At this point the step S1-b gradient magnitude and direction calculation is complete: each cell (pixel) in the 3 × 3 primitive corresponds to one vector, i.e., 9 vectors per primitive, as shown in fig. 4.
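The vector synthesis of this worked example can be checked directly with a few lines of Python (a sanity check only, not part of the patent):

```python
import math

gx, gy = -1.46, 1.67                       # horizontal and vertical gradients from fig. 4
rho = math.hypot(gx, gy)                   # magnitude of the synthesized gradient
theta = math.degrees(math.atan2(gy, gx)) % 360.0
print(round(rho, 2), round(theta, 1))      # -> 2.22 131.2
```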
After the S1-b gradient magnitude and direction calculation, the S1-c direction histogram classification and superposition is performed. In this step, the direction range [0°, 360°) of the gradient is first divided into N classes, N being the dimension of the HOG feature and each class corresponding to one bin; N = 9 in this example, so the 9 classes correspond to intervals of 40° each. The 9 gradient vectors of the primitive obtained in the S1-b gradient magnitude and direction calculation are then classified by angle into the bins, and all the gradient magnitudes classified into the same class (bin) are superposed to obtain the value of the HOG feature in that bin. For example, in the fuel spray image to be detected I_49 of fig. 4, the value of the HOG feature of the pixel (22, 207) in the 9th bin is 2.01 + 1.55 + 3.78 = 7.34, obtained by superposing the 3 vectors, among the 9 vectors of the primitive, whose directions fall in the range [90°, 130°). Through these steps, the HOG feature corresponding to the primitive of the fuel spray image to be detected and the HOG feature corresponding to the primitive of the background image can be obtained. The obtained HOG features may be normalized over all bins, so that the values of the HOG feature of each primitive (pixel) in the bins sum to 1, giving the normalized HOG feature Hn, in which the value corresponding to each bin k is Hn(k) = H(k) / ∑k H(k). As shown in fig. 4, the normalized HOG features of the image to be detected and of the background image are obtained; they can also be represented in the form of histograms.
After the step S1 histogram of oriented gradients (HOG) feature extraction, the method performs the step S2 histogram of oriented gradients (HOG) feature difference calculation, computing the difference between the HOG features HnI(i, j) and HnBk(i, j) of the pixels at the same position (i, j) on the fuel spray image to be detected and on the background image. The difference is calculated as a p-order (Lp-norm) difference, with the formula ||HnI - HnBk||p = (∑k |HnI(k) - HnBk(k)|^p)^(1/p), where p is a positive integer and k runs over the bins of the HOG feature.
As shown in fig. 5, in the step S2 histogram of oriented gradients (HOG) feature difference calculation, the difference may be calculated as the 1-order (L1-norm) difference, DiffL1 = ||HnI - HnBk||1 = ∑k |HnI(k) - HnBk(k)|, whose value range is [0, 2]. In this example, the difference between the HOG features of the spray image to be detected and of the background image at the pixel (22, 207), computed from the normalized HOG features shown in fig. 5, is 1.854.
After the step S2 histogram of oriented gradients (HOG) feature difference calculation, a difference map of the HOG features of each pixel of the image to be detected and of the background image is obtained, as shown in the upper right of fig. 6. The pixel resolution of this map is consistent with that of the original image, 112 × 400. It can be seen clearly that the gray values of the region where the fuel spray, the target to be detected, is located are higher, the gray values of the background region are lower, the difference between the two regions is obvious, and the distinguishability is greatly increased compared with the original image; this is the essential principle behind the core idea of the invention. The core idea is: the target region and the background region are distinguished by the difference between them in the pixel brightness gradient direction, completing the segmentation and contour extraction. When judging whether a pixel belongs to the target to be detected or to the background, unlike the conventional method (which performs binarization, contour extraction, segmentation and similar operations based on the difference between the brightness values at the same positions of the image to be detected and of the background image), the gradient magnitude and direction are calculated by combining the information of the pixel and its surrounding pixels, and binarization is then performed on the difference map of the histogram of oriented gradient features to judge whether each pixel belongs to the target region. The method provided by this patent performs binarization, contour extraction, segmentation and similar operations based on the difference between the HOG features of the image to be detected and of the background image, and has advantages in edge detection and region recognition under certain conditions.
From the difference map of the HOG features of the fuel spray image to be detected and of the background image, the binarization, segmentation and contour extraction of step S3 can be carried out. The threshold in this step can be chosen in the same way as in conventional image processing, for example with the Otsu threshold segmentation algorithm. In the case shown in this example, the threshold obtained when processing the HOG difference map of the 49th fuel spray image and the background image is 0.88: the part of the map where the difference of the two HOG features is greater than 0.88 is identified as the fuel spray area, and the part where it is not greater than 0.88 is the background area. The target spray area is then obtained by conventional erosion and filling, and the contour of this area is extracted to give the contour of the target fuel spray, which can be used to calculate the relevant characteristic parameters of the spray, such as penetration distance, cone angle and area.
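A possible realization of step S3 with standard OpenCV calls (Otsu thresholding, morphological cleanup and external-contour extraction); the rescaling of the difference map to 8 bit, the 5 × 5 kernel and the OpenCV 4.x findContours signature are assumptions of this sketch rather than requirements of the method.

```python
import cv2
import numpy as np

def segment_spray(diff_map):
    """Binarize the HOG-difference map, clean it up and extract the spray contour."""
    # Otsu thresholding in OpenCV works on 8-bit images, so rescale [0, 2] -> [0, 255]
    diff_u8 = np.clip(diff_map / 2.0 * 255.0, 0, 255).astype(np.uint8)
    _, binary = cv2.threshold(diff_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # erosion/filling step: a morphological close fills small holes inside the spray
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

    # keep the largest external contour as the target spray region
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    spray = max(contours, key=cv2.contourArea) if contours else None
    return binary, spray
```

From the returned contour, characteristic parameters such as the spray area (cv2.contourArea) or the penetration distance (the maximum extent of the contour points along the injection axis) can then be measured.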
As described before, the present invention applies dynamic background correction updates during processing. As in FIG. 6, after the segmentation of image I_49 is finished, the pixels of the background area outside the spray contour are written into Bk_49 at the same positions to complete the background update correction, giving Bk_50; the next image segmentation is then based on I_50 (if any) and Bk_50, and so on, so that the segmentation of the whole image sequence can be completed image by image with dynamic background correction updating.
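The dynamic background correction update can be sketched as a simple loop over the time-ordered sequence; segment_fn stands for the segmentation of one frame against its current background (for example, the HOG-difference steps above) and is an assumed callback, not part of the patent text.

```python
import numpy as np

def process_sequence(images, first_background, segment_fn):
    """Segment a time-ordered image sequence with dynamic background correction.

    segment_fn(image, background) is assumed to return a boolean mask that is
    True inside the detected spray region.
    """
    background = first_background.astype(np.float64).copy()
    masks = []
    for img in images:                       # I_49, I_50, ... in temporal order
        mask = segment_fn(img, background)   # segment I_k against Bk_k
        masks.append(mask)
        # background update: copy pixels outside the spray contour into the
        # background at the same positions, producing Bk_{k+1} for the next frame
        background[~mask] = img[~mask]
    return masks
```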
FIG. 7 shows a comparison between the effect of the image processing method of the above embodiment of the present invention and that of the conventional processing method (segmentation by the difference of brightness values between the background image and the image to be detected, followed by threshold selection). The processing object is the picture I_49 shown in FIG. 3(b); the image processing method and specific steps of the above embodiment of the present invention are as described above. The conventional method here also employs dynamic background correction; the only difference between the two procedures is that the conventional method calculates, for each pixel, only the difference in gray value between the background image and the fuel spray image to be detected, while the method of the present invention calculates the difference in HOG features. The threshold of the conventional method is set to the Otsu threshold (OTM) as well as 2.5 times and 5 times that value. It can be seen from the figure that, no matter how the threshold is selected, the spray profile extracted by the conventional method (represented by continuous iso-gray contour lines in the figure) is irregular and disordered, and there are large errors at the edge (some pixels in the gas-phase region of the spray are identified as background rather than spray). This is because the difference between the gray values of the spray area and the background area is small, so it is difficult to obtain an accurate target spray profile with the conventional method.
As shown in FIG. 7, compared with the conventional method, the fuel spray profile identified by the method of the above embodiment of the present invention (represented by the continuous iso-gray contour line in the figure) is more regular and the result is more accurate. Moreover, even when different parameters (primitive size, HOG feature dimension, difference norm) are used, the method still achieves a good result, which shows that it has good adaptability and generality.
Fig. 8 is a schematic diagram illustrating a thermal fluid image processing system according to an embodiment of the present invention.
As shown in fig. 8, the thermal fluid image processing system provided by this embodiment may include: the direction gradient histogram feature extraction module, the difference map calculation module and the image processing module; wherein:
the direction gradient histogram feature extraction module divides the thermal fluid images into a to-be-detected map containing the thermal fluid to be detected and identified and a background map not containing the thermal fluid to be detected and identified, and extracts the directional gradient histogram features of all pixel points of the to-be-detected map and the background map respectively;
the difference map calculation module is used for calculating the difference between the directional gradient histogram characteristics of the pixel points of the to-be-detected map and the background map at the same position to obtain a difference map;
and the image processing module is used for carrying out binarization processing on the difference map, dividing the difference map into a target area and a background area, extracting the outline of the target area and finishing the processing of the thermal fluid image.
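As a sketch only, the cooperation of the three modules could be wired up as follows; the class and method names are illustrative and not taken from this embodiment.

```python
class ThermalFluidImageProcessor:
    """Sketch of the three cooperating modules of the described system;
    names are illustrative, not taken from the patent."""

    def __init__(self, hog_extractor, diff_calculator, image_processor):
        self.hog_extractor = hog_extractor        # HOG feature extraction module
        self.diff_calculator = diff_calculator    # difference map calculation module
        self.image_processor = image_processor    # binarization / segmentation module

    def process(self, image_to_detect, background_image):
        hog_I = self.hog_extractor.extract(image_to_detect)
        hog_Bk = self.hog_extractor.extract(background_image)
        diff_map = self.diff_calculator.difference(hog_I, hog_Bk)
        return self.image_processor.segment(diff_map)
```

Keeping the three responsibilities in separate objects mirrors the module split above and makes it easy to swap, for example, the difference norm or the binarization strategy without touching the feature extraction.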
An embodiment of the present invention provides a terminal, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when executing the program, the processor is configured to perform the method of any one of the above embodiments of the present invention or to run the system of the above embodiments of the present invention.
Optionally, a memory is provided for storing a program. The memory may include a volatile memory, for example a random access memory (RAM) such as a static random access memory (SRAM) or a double data rate synchronous dynamic random access memory (DDR SDRAM); the memory may also include a non-volatile memory, such as a flash memory. The memory is used to store computer programs (e.g. applications, functional modules and the like implementing the above methods), computer instructions, etc.; the computer programs, computer instructions, data and so on may be stored in one or more memories in a partitioned manner and may be invoked by the processor.
A processor for executing the computer program stored in the memory to implement the steps of the method according to the above embodiments. Reference may be made in particular to the description relating to the preceding method embodiment.
The processor and the memory may be separate structures or may be an integrated structure integrated together. When the processor and the memory are separate structures, the memory, the processor may be coupled by a bus.
According to a fourth aspect of the present invention, there is provided a computer readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of any one of the above-mentioned embodiments of the present invention or to run the system of the above-mentioned embodiments of the present invention.
The thermal fluid image processing method, system, terminal and medium provided by the embodiments of the invention process a digital image containing an object to be detected against a background on the basis of directional gradient histogram features. First the directional gradient histogram features are extracted, including the steps of primitive division, calculation of gradient magnitude and direction, and histogram classification and superposition; then the difference of the directional gradient histogram features is calculated; and finally binarization, segmentation, contour extraction and related operations are performed, with the method also covering an algorithm for dynamic background correction and update. The core idea is that extracting the directional gradient histogram features of the target region to be detected and of the background region increases the difference between the two regions. The method, system, terminal and medium can be applied to the field of thermal fluid image processing for target identification, edge detection and segmentation of the region to be detected and the background region; they have advantages when the difference between the target to be detected and the background is small and when the background changes from frame to frame, show low sensitivity to parameter settings and good universality, can be used for different cases, working conditions and optical systems, are easy to implement in a program, are fast in processing, and give accurate and reliable results.
It should be noted that, the steps in the method provided by the present invention may be implemented by using corresponding modules, devices, units, and the like in the system, and those skilled in the art may implement the composition of the system by referring to the technical solution of the method, that is, the embodiment in the method may be understood as a preferred example for constructing the system, and will not be described herein again.
Those skilled in the art will appreciate that, in addition to implementing the system and its various devices provided by the present invention purely as computer-readable program code, the same functionality can be achieved by logically programming the method steps so that the system and its devices take the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its various devices provided by the present invention can be regarded as a hardware component, and the devices included in it for realizing the various functions can also be regarded as structures within the hardware component; devices for performing the functions can even be regarded both as software modules implementing the method and as structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (13)

1. A thermal fluid image processing method, comprising:
dividing a plurality of hot fluid images which are arranged according to a time sequence into a to-be-detected diagram containing hot fluid to be detected and identified and a background diagram not containing the hot fluid to be detected and identified respectively, and extracting the direction gradient histogram characteristics of each pixel point of the to-be-detected diagram and the background diagram respectively;
calculating the difference between the directional gradient histogram characteristics of all pixel points in the to-be-detected graph and the background graph at the same position to obtain a difference graph of the directional gradient histogram characteristics;
and carrying out binarization processing on the difference map, judging whether each pixel belongs to a target area, dividing the difference map into the target area and a background area according to a judgment result, extracting the outline of the target area, and finishing the processing of the thermal fluid image.
2. The thermal fluid image processing method according to claim 1, wherein the extracting direction gradient histogram features of each pixel point of the to-be-detected graph and the background graph respectively comprises:
for each pixel point (i, j) on the image I to be detected and on the background image Bk, taking the pixel point (i, j) as a center, acquiring the n × n area around the pixel point (i, j) as the primitive C(i, j) of that pixel point, thereby performing primitive segmentation of the image I to be detected and of the background image;
performing gamma correction on the gray values of each primitive C(i, j) by taking the square root, to obtain:
C′(i, j) = √(C(i, j));
calculating, for each pixel C′(i, j)(m, n) in each obtained primitive C′(i, j), the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n); performing vector synthesis of the horizontal gradient Gx(i, j)(m, n) and the vertical gradient Gy(i, j)(m, n) of the same pixel to obtain the gradient magnitude and the gradient direction of each pixel in the two-dimensional plane;
dividing the value range [0°, 360°) of the gradient direction into N classes, where N is the dimension of the directional gradient histogram feature to be obtained and each class corresponds to one dimension; classifying the gradient direction of each pixel C′(i, j)(m, n) in the primitive C′(i, j) by angle into the corresponding dimension, and superposing the gradient magnitudes of all pixels falling into the same dimension to obtain the value of the directional gradient histogram feature in that dimension, thereby obtaining the directional gradient histogram feature H(i, j) corresponding to each primitive.
3. The thermal fluid image processing method according to claim 2, wherein the value of n × n is: (2z + 1) × (2z + 1), z being a positive integer.
4. The thermal fluid image processing method according to claim 2, wherein the calculating of the horizontal gradient Gx(i, j)(m, n) for each pixel C′(i, j)(m, n) in each obtained primitive C′(i, j) comprises:
taking a first side of the pixel C′(i, j)(m, n) as the positive direction, and subtracting the value of the pixel C′(i, j)(m, n − 1) adjacent on a second side opposite to the first side from the value of the pixel C′(i, j)(m, n + 1) adjacent on the positive first side, to obtain the horizontal gradient;
and the calculating of the vertical gradient Gy(i, j)(m, n) for each pixel C′(i, j)(m, n) in each obtained primitive C′(i, j) comprises:
taking a third side of the pixel C′(i, j)(m, n) as the positive direction, and subtracting the value of the pixel C′(i, j)(m − 1, n) adjacent on a fourth side opposite to the third side from the value of the pixel C′(i, j)(m + 1, n) adjacent on the positive third side, to obtain the vertical gradient.
5. The thermal fluid image processing method according to claim 2, wherein the value range [0°, 360°) of the gradient direction is divided into N classes, each class corresponding to a value range of (360/N)°, the N classes being: [0°, 360°/N), [360°/N, 2 × 360°/N), [2 × 360°/N, 3 × 360°/N), ..., [(N − 1) × 360°/N, 360°).
6. The thermal fluid image processing method of claim 2, further comprising:
normalizing the directional gradient histogram feature H(i, j) corresponding to each primitive over all of its dimensions, so that the values of the directional gradient histogram feature of each primitive summed over its dimensions equal 1, to obtain the normalized directional gradient histogram feature Hn(i, j), whose value in each dimension k is:
Hn(i, j)(k) = H(i, j)(k) / ∑k H(i, j)(k), so that ∑k Hn(i, j)(k) = 1.
7. The thermal fluid image processing method according to claim 1, wherein the calculating of the difference between the directional gradient histogram features at the same position for each pixel of the image to be detected and of the background image comprises: calculating, with a p-order difference, the difference between the directional gradient histogram features Hn_I(i, j) and Hn_Bk(i, j) of the pixel points (i, j) at the same position on the to-be-detected graph and on the background graph:
||Hn_I − Hn_Bk||_p = (∑_i |Hn_I(i) − Hn_Bk(i)|^p)^(1/p),
wherein p is a positive integer, and i runs over the dimensions of the directional gradient histogram feature.
8. The thermal fluid image processing method according to claim 7, wherein if the value of p is 1, then:
Diff_L1 = ||Hn_I − Hn_Bk||_1 = ∑_i |Hn_I(i) − Hn_Bk(i)|, Diff_L1 ∈ [0, 2].
9. the thermal fluid image processing method according to claim 1, wherein the difference map is subjected to binarization processing, whether each pixel belongs to a target region is judged, each pixel is classified to obtain a target region and a background region, the difference map is segmented, and the contour of the target region is extracted; the segmented background area is used for dynamic background correction updating, and the extracted contour is used for measuring and calculating related characteristic parameters.
10. The thermal fluid image processing method of claim 9, wherein the dynamic background correction update comprises:
taking the last image Bk _01 before the image to be detected in a plurality of thermal fluid images which are arranged according to the time sequence as a first background image, and performing image processing on the basis of the first background image Bk _01 and the first image to be detected I _01 to detect a target region and a background region on the first image to be detected I _01 so as to complete the segmentation of the first image to be detected I _ 01;
pixels in a background area on the first image to be detected I _01 are placed at the same position in the first background image Bk _01 to complete background updating correction, and an image of the first background image Bk _01 after background updating is the second background image Bk _ 02;
acquiring a second background image Bk _02, and performing image processing on the second image to be detected I _02 and the second background image Bk _02 to detect a target area and a background area on the second image to be detected I _02 so as to complete the segmentation of the next image to be detected;
and so on until the segmentation of all the images to be detected in all the images of the thermal fluid is completed.
11. A thermal fluid image processing system, comprising:
the system comprises a directional gradient histogram feature extraction module, a difference map calculation module and an image processing module, wherein the directional gradient histogram feature extraction module is used for dividing a plurality of thermal fluid images into a to-be-detected map containing the thermal fluid to be detected and identified and a background map not containing the thermal fluid to be detected and identified, and for extracting the directional gradient histogram features of all pixel points of the to-be-detected map and the background map respectively;
the difference map calculation module is used for calculating the difference between the directional gradient histogram characteristics of all the pixel points in the to-be-detected map and the background map at the same position to obtain a difference map of the directional gradient histogram characteristics;
and the image processing module is used for carrying out binarization processing on the difference map, judging whether each pixel point belongs to a target area, segmenting the difference map into the target area and a background area according to a judgment result, extracting the outline of the target area and finishing the processing of the thermal fluid image.
12. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program is operable to perform the method of any one of claims 1 to 10 or to operate the system of claim 11.
13. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, is adapted to carry out the method of any one of claims 1 to 10 or to carry out the system of claim 11.
CN202110742298.3A 2021-07-01 2021-07-01 Thermal fluid image processing method, system, terminal and medium Active CN113689455B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110742298.3A CN113689455B (en) 2021-07-01 2021-07-01 Thermal fluid image processing method, system, terminal and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110742298.3A CN113689455B (en) 2021-07-01 2021-07-01 Thermal fluid image processing method, system, terminal and medium

Publications (2)

Publication Number Publication Date
CN113689455A true CN113689455A (en) 2021-11-23
CN113689455B CN113689455B (en) 2023-10-20

Family

ID=78576820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110742298.3A Active CN113689455B (en) 2021-07-01 2021-07-01 Thermal fluid image processing method, system, terminal and medium

Country Status (1)

Country Link
CN (1) CN113689455B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103197280A (en) * 2013-04-02 2013-07-10 中国科学院计算技术研究所 Access point (AP) location estimation method based on radio-frequency signal strength
CN103336965A (en) * 2013-07-18 2013-10-02 江西省电力公司检修分公司 Prospect and feature extraction method based on outline differences and principal direction histogram of block
CN105260709A (en) * 2015-09-28 2016-01-20 北京石油化工学院 Water meter detecting method, apparatus, and system based on image processing
CN108230365A (en) * 2017-12-26 2018-06-29 西安理工大学 SAR image change detection based on multi-source differential image content mergence
WO2019000653A1 (en) * 2017-06-30 2019-01-03 清华大学深圳研究生院 Image target identification method and apparatus
CN109214420A (en) * 2018-07-27 2019-01-15 北京工商大学 The high texture image classification method and system of view-based access control model conspicuousness detection
CN111612734A (en) * 2020-04-03 2020-09-01 苗锡奎 Background clutter characterization method based on image structure complexity


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhao Lulu: "Shallow-sea optical remote sensing bathymetry inversion combined with SVM substrate classification", China Master's Theses Full-text Database, Basic Sciences, pages 1-48 *

Also Published As

Publication number Publication date
CN113689455B (en) 2023-10-20

Similar Documents

Publication Publication Date Title
CN107543828B (en) Workpiece surface defect detection method and system
CN111507976B (en) Defect detection method and system based on multi-angle imaging
CN108447061B (en) Commodity information processing method and device, computer equipment and storage medium
US20220092856A1 (en) Crack detection, assessment and visualization using deep learning with 3d mesh model
CN112101361A (en) Target detection method, device and equipment for fisheye image and storage medium
CN104899888B (en) A kind of image sub-pixel edge detection method based on Legendre squares
CN112734761B (en) Industrial product image boundary contour extraction method
WO2020140826A1 (en) Infrared temperature sensing method employing contour extraction for target object, device, and storage medium
CN104318559A (en) Quick feature point detecting method for video image matching
CN111126393A (en) Vehicle appearance refitting judgment method and device, computer equipment and storage medium
CN115272234A (en) Bottle cap quality detection method and device, computer equipment and storage medium
CN112907626A (en) Moving object extraction method based on satellite time-exceeding phase data multi-source information
CN117557565B (en) Detection method and device for lithium battery pole piece
Han et al. Automatic borehole fracture detection and characterization with tailored Faster R-CNN and simplified Hough transform
CN116433661B (en) Method, device, equipment and medium for detecting semiconductor wafer by multitasking
CN105631849A (en) Polygon object change detection method and device
CN113689455B (en) Thermal fluid image processing method, system, terminal and medium
CN116935369A (en) Ship water gauge reading method and system based on computer vision
CN115830073A (en) Map element reconstruction method, map element reconstruction device, computer equipment and storage medium
CN115546130A (en) Height measuring method and device for digital twins and electronic equipment
CN115115889A (en) Instrument image analysis method and device
CN114820738A (en) Accurate registration method and device for star atlas, computer equipment and storage medium
CN110751189B (en) Ellipse detection method based on perception contrast and feature selection
CN115330705A (en) Skin paint surface defect detection method based on adaptive weighting template NCC
CN114241150A (en) Water area data preprocessing method in oblique photography modeling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant