CN114782268A - Low-illumination image enhancement method for improving SURF algorithm


Info

Publication number
CN114782268A
CN114782268A (application CN202210415824.XA)
Authority
CN
China
Prior art keywords
image
space
hsv
rgb
color space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210415824.XA
Other languages
Chinese (zh)
Inventor
梅劲松
沈洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202210415824.XA priority Critical patent/CN114782268A/en
Publication of CN114782268A publication Critical patent/CN114782268A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration using non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20064 Wavelet transform [DWT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a low-illumination image enhancement method based on an improved SURF algorithm, comprising the following steps: acquire an original image and convert its pixel-value matrix from the RGB color space to the HSV color space; take the V component of the image in HSV space and remap its brightness with a designed nonlinear function; convert the brightness-remapped image from HSV space back to RGB space; perform SURF feature detection on the enhanced image by constructing a Hessian matrix, building a scale space, and locating feature points; determine the dominant direction of each feature point and generate its descriptor; and extract and match the feature points to obtain the enhanced feature-matching image. Under insufficient illumination, the method compensates for the lighting conditions and improves both image contrast and the number of matched feature points without introducing additional interference noise.

Description

Low-illumination image enhancement method for improving SURF algorithm
Technical Field
The invention relates to the technical field of image processing, in particular to a low-illumination image enhancement method for improving SURF algorithm.
Background
Today's world is in an era of rapid information growth, and people acquire valuable information from the outside world through the visual system. Images are an important carrier of visual information: they provide better visual effects for humans and supply research material for fields such as target detection and face recognition. Picture quality is critical to how much information we can obtain, both in daily life and in scientific research; the higher the quality of an image, the more detailed information it holds and the more information it conveys.
With the development of image-acquisition hardware, capturing images has become simple, convenient, and fast. However, when the shooting environment is poor, for example at night or under overcast, rainy, or hazy conditions, the images collected by acquisition equipment are dark overall and low in contrast, so detail information is lost. People then cannot obtain the detail they need from the image, and later recognition, detection, and research are obstructed.
This motivates research into enhancement methods for low-illumination images; the improved SURF algorithm described here can raise the quality of such images.
Disclosure of Invention
The invention aims to provide a low-illumination image enhancement method based on an improved SURF algorithm, which increases the contrast and detail information of an image under low illumination and improves the number and quality of feature-point matches.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
a low-illumination image enhancement method for improving SURF algorithm is characterized by comprising the following steps:
step 1, acquiring an image shot under a low illumination condition, and converting a pixel value matrix of the image from an RGB color space to an HSV color space;
step 2, setting a nonlinear function, carrying out brightness remapping on the V vector in HSV space of the image, and converting the image with the remapped brightness from HSV color space to RGB color space again;
step 3, according to the enhanced image after the brightness remapping, SURF characteristic detection and matching are carried out on the enhanced image.
Further, the specific process of step 1 is as follows:
(1a) When performing low-illumination image enhancement, the color space of the original image must first be converted: because the HSV space directly separates brightness from color information, the original image is converted from RGB space to HSV space. The conversion formulas from RGB to HSV are as follows:
R' = R/255
G' = G/255
B' = B/255
Cmax = max(R', G', B')
Cmin = min(R', G', B')
Δ = Cmax − Cmin

$$H = \begin{cases} 0^\circ, & \Delta = 0 \\ 60^\circ \times \left(\dfrac{G' - B'}{\Delta} \bmod 6\right), & C_{max} = R' \\ 60^\circ \times \left(\dfrac{B' - R'}{\Delta} + 2\right), & C_{max} = G' \\ 60^\circ \times \left(\dfrac{R' - G'}{\Delta} + 4\right), & C_{max} = B' \end{cases}$$

$$S = \begin{cases} 0, & C_{max} = 0 \\ \dfrac{\Delta}{C_{max}}, & C_{max} \neq 0 \end{cases}$$

V = Cmax

where Cmax is the maximum and Cmin the minimum of R', G', B'.
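For concreteness, a minimal NumPy sketch of the conversion above follows; the function name and the divide-by-zero guards are illustrative additions, not taken from the patent.

```python
import numpy as np

def rgb_to_hsv(img):
    """img: H x W x 3 uint8 RGB array; returns H in degrees, S and V in [0, 1]."""
    rgb = img.astype(np.float64) / 255.0            # R' = R/255, G' = G/255, B' = B/255
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cmax = rgb.max(axis=-1)                          # Cmax
    cmin = rgb.min(axis=-1)                          # Cmin
    delta = cmax - cmin                              # Delta
    safe = np.where(delta == 0, 1.0, delta)          # guard against division by zero

    h = np.zeros_like(cmax)
    h = np.where(cmax == r, 60.0 * (((g - b) / safe) % 6), h)
    h = np.where(cmax == g, 60.0 * ((b - r) / safe + 2), h)
    h = np.where(cmax == b, 60.0 * ((r - g) / safe + 4), h)
    h = np.where(delta == 0, 0.0, h)                 # H = 0 when Delta = 0

    s = np.where(cmax == 0, 0.0, delta / np.where(cmax == 0, 1.0, cmax))
    v = cmax                                         # V = Cmax
    return h, s, v
```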
Further, the specific process of step 2 is as follows:
(2a) Inspired by the Sigmoid activation function used in neural networks, a modified Sigmoid function is applied to remap the brightness of the V component in the image's HSV space, enhancing image contrast and detail and improving the number and quality of feature-point matches. The modified Sigmoid function y(V) is derived from the Sigmoid function y = 1/(1 + e^(−x)); its exact expression appears only as an image in the original document.
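Since the exact constants of the modified Sigmoid are not recoverable from the text, the sketch below assumes a generic logistic remapping y(V) = 1/(1 + e^(−a(V−b))), rescaled so the endpoints 0 and 1 stay fixed; the parameters a and b are illustrative, not values from the patent.

```python
import numpy as np

def remap_v(v, a=8.0, b=0.5):
    """Logistic luminance remapping of the V component (v in [0, 1]).

    a controls the contrast gain and b the brightness pivot; both are
    illustrative assumptions, not constants from the patent.
    """
    y = 1.0 / (1.0 + np.exp(-a * (v - b)))           # shifted, scaled Sigmoid
    lo = 1.0 / (1.0 + np.exp(a * b))                 # y at v = 0
    hi = 1.0 / (1.0 + np.exp(-a * (1.0 - b)))        # y at v = 1
    return (y - lo) / (hi - lo)                      # rescale so y(0) = 0, y(1) = 1
```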
(2b) After the luminance remapping, the enhanced image is converted back to RGB space. The conversion from HSV to RGB is as follows:

$$(R', G', B') = \begin{cases} (V, T, P), & h_i = 0 \\ (Q, V, P), & h_i = 1 \\ (P, V, T), & h_i = 2 \\ (P, Q, V), & h_i = 3 \\ (T, P, V), & h_i = 4 \\ (V, P, Q), & h_i = 5 \end{cases}$$

wherein:

$$h_i = \left\lfloor \frac{H}{60^\circ} \right\rfloor \bmod 6$$

$$F = \frac{H}{60^\circ} - h_i$$

P = V·(1−S)
Q = V·(1−F·S)
T = V·(1−(1−F)·S)
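A sketch of the complete step 1–step 2 preprocessing pipeline, using OpenCV's built-in color-space conversions together with the remap_v sketch above; the file name is hypothetical.

```python
import cv2
import numpy as np

def enhance_low_light(bgr):
    """RGB->HSV, remap the V channel, HSV->RGB (OpenCV loads images as BGR)."""
    hsv = cv2.cvtColor(bgr.astype(np.float32) / 255.0, cv2.COLOR_BGR2HSV)
    hsv[..., 2] = remap_v(hsv[..., 2])               # V is channel index 2
    out = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)

enhanced = enhance_low_light(cv2.imread("night_scene.jpg"))
```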
further, the specific process of step 3 is as follows:
(3a) A Hessian matrix is constructed and its discriminant is evaluated at the image pixels to generate all candidate feature points for extraction. The Hessian matrix is the core of SURF feature-point extraction: it is the square matrix of second-order partial derivatives of a multivariate function and describes the local curvature of the function. For an image f(x, y), the Hessian matrix is as follows:

$$H(f(x, y)) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \partial y} \\ \dfrac{\partial^2 f}{\partial x \partial y} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$$
Before the Hessian matrix is constructed, the image must be Gaussian filtered. The Hessian matrix after Gaussian filtering, at point x and scale σ, is:

$$H(\mathbf{x}, \sigma) = \begin{bmatrix} L_{xx}(\mathbf{x}, \sigma) & L_{xy}(\mathbf{x}, \sigma) \\ L_{xy}(\mathbf{x}, \sigma) & L_{yy}(\mathbf{x}, \sigma) \end{bmatrix}$$

where L_xx(x, σ) is the convolution of the Gaussian second-order derivative with the image at x, and similarly for L_xy and L_yy.
When the discriminant of the Hessian matrix attains a local maximum, the current point is judged to be brighter or darker than the surrounding points in its neighborhood, which locates the position of a keypoint. The discriminant of the Hessian matrix is as follows:
det(H)=Dxx·Dyy-Dxy·Dxy
where Dxx, Dyy, and Dxy denote the second-order partial derivative of the current point in the horizontal direction, the second-order partial derivative in the vertical direction, and the mixed second-order partial derivative in the horizontal and vertical directions, respectively.
In the SURF algorithm, box filters are used instead of Gaussian filters to increase computation speed, and a weighting coefficient is introduced to balance the error this approximation causes. The Hessian discriminant used by SURF is therefore:
det(H)=Dxx·Dyy-(0.9·Dxy)2
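A dense-response sketch of this discriminant, assuming true Gaussian derivatives (smoothing plus Sobel operators) in place of SURF's box-filter approximations; 0.9 is the balancing weight from the formula above, and σ = 1.2 is the customary initial SURF scale.

```python
import cv2
import numpy as np

def hessian_response(gray, sigma=1.2):
    """det(H) = Dxx*Dyy - (0.9*Dxy)^2 at every pixel of a smoothed image."""
    g = cv2.GaussianBlur(gray.astype(np.float64), (0, 0), sigma)
    dxx = cv2.Sobel(g, cv2.CV_64F, 2, 0, ksize=3)    # second derivative in x
    dyy = cv2.Sobel(g, cv2.CV_64F, 0, 2, ksize=3)    # second derivative in y
    dxy = cv2.Sobel(g, cv2.CV_64F, 1, 1, ksize=3)    # mixed derivative
    return dxx * dyy - (0.9 * dxy) ** 2
```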
(3b) The scale space is constructed with an image-pyramid approach, but by varying the filter size rather than resampling the image. Hessian responses are obtained by applying box-filter templates of steadily increasing size to the integral image, and blobs at different scales are then obtained by non-maximum suppression over the response maps.
The initial filter is 9 × 9, and subsequent layers are built by progressively enlarging the filter template and filtering the image with the enlarged templates. The scale space is divided into octaves, where an octave is the set of response maps obtained by filtering the same input image with progressively larger templates; each octave consists of several layers. The minimum scale step between two layers is determined by the length l0 of the positive and negative lobes of the Gaussian second-order derivative filter in the derivative direction, where l0 is one third of the box-filter template size. The lobe length of the next layer is l0 plus two pixels (the resulting template-size schedule is sketched below).
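A small sketch of the template-size schedule this rule produces, assuming the usual SURF arrangement of four layers per octave with overlapping octaves; starting from 9 × 9, the lobe grows by 2 pixels per layer (scaled per octave).

```python
def surf_filter_sizes(octaves=3, layers=4):
    """Box-filter template sizes per octave, starting from the 9 x 9 filter.

    The size step within octave o is 6 * 2**o (a lobe-length step of
    2 * 2**o pixels, since the lobe is one third of the template size).
    """
    sizes = []
    for o in range(octaves):
        step = 6 * (2 ** o)
        first = 9 if o == 0 else sizes[o - 1][1]     # octaves overlap at the 2nd size
        sizes.append([first + i * step for i in range(layers)])
    return sizes

print(surf_filter_sizes())   # [[9, 15, 21, 27], [15, 27, 39, 51], [27, 51, 75, 99]]
```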
To locate feature points across image positions and scales, 3 × 3 × 3 neighborhood non-maximum suppression is applied (see the sketch after this paragraph): each pixel processed by the Hessian discriminant is compared with the 26 points of its three-dimensional neighborhood, and it is kept as a preliminary feature point only if it is the maximum or minimum among them. The points are then localized precisely via the local maxima, points below a set threshold are removed, and raising the threshold reduces the number of detected points, so that finally the feature points with the strongest responses are detected.
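A direct (unoptimized) sketch of the 26-neighbour comparison described above; the threshold value is illustrative.

```python
import numpy as np

def nms_3x3x3(responses, threshold=0.001):
    """responses: same-size Hessian response maps at adjacent scales.

    A point is kept when it exceeds the threshold and is the maximum of
    the 3 x 3 x 3 scale-space cube around it (itself plus 26 neighbours).
    """
    stack = np.stack(responses)                      # shape: (scales, H, W)
    keypoints = []
    for s in range(1, stack.shape[0] - 1):
        for y in range(1, stack.shape[1] - 1):
            for x in range(1, stack.shape[2] - 1):
                v = stack[s, y, x]
                if v > threshold and v >= stack[s-1:s+2, y-1:y+2, x-1:x+2].max():
                    keypoints.append((s, y, x))
    return keypoints
```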
(3c) After extracting the feature points, in order to ensure the rotation invariance of the feature vectors, a main direction needs to be assigned to each feature point.
Haar wavelet responses are computed over a circular region of radius 6s centered on the feature point. To obtain the dominant direction, a sector-shaped sliding window with a 60° opening angle, centered on the feature point, is rotated in steps of about 0.2 rad, and the sum of the Haar wavelet responses within the sector is computed at each position. The direction in which the summed wavelet response is largest is the dominant direction. The response sums are:
$$m_w = \sqrt{\left(\sum_w dx\right)^2 + \left(\sum_w dy\right)^2}$$

$$\theta_w = \arctan\left(\sum_w dy \Big/ \sum_w dx\right)$$

where dx and dy are the Haar wavelet responses in the horizontal and vertical directions, and (m_w, θ_w) is the vector obtained by accumulating the responses dx, dy within the window. The dominant direction corresponds to the maximum accumulated Haar response, i.e. the longest vector:
θw=θw|max{mw}
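A sketch of the sliding-window search, assuming the Haar responses dx, dy of the sample points inside the 6s disc have already been computed; the 0.2 rad step and 60° window follow the description above.

```python
import numpy as np

def dominant_orientation(dx, dy):
    """dx, dy: 1-D arrays of Haar responses for samples around a keypoint.

    Slides a 60-degree sector in 0.2 rad steps, accumulates (sum dx, sum dy)
    inside it, and returns the direction of the longest accumulated vector.
    """
    angles = np.arctan2(dy, dx) % (2 * np.pi)
    best_m, best_theta = -1.0, 0.0
    for start in np.arange(0.0, 2 * np.pi, 0.2):
        in_sector = (angles - start) % (2 * np.pi) < np.pi / 3
        sx, sy = dx[in_sector].sum(), dy[in_sector].sum()
        m = np.hypot(sx, sy)                          # m_w
        if m > best_m:
            best_m, best_theta = m, np.arctan2(sy, sx)  # theta_w
    return best_theta
```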
(3d) To generate the feature-point descriptor, Haar wavelet responses are computed in a rectangular region selected around the keypoint. A 20s × 20s image region, centered on the feature point and aligned with its dominant direction, is divided into 4 × 4 sub-blocks; the response of each sub-block is computed with a Haar template of size 2s, and the responses are accumulated into the vector (Σdx, Σ|dx|, Σdy, Σ|dy|).
Fast matching is achieved by adding the sign of the feature point's Laplacian response to the feature vector: when a feature point is detected, the sign of the trace of its Hessian matrix is recorded as a variable of the feature vector. The feature points thus fall into two groups, those with a positive Laplacian response and those with a negative one. During matching, only two feature points with the same sign can possibly match; feature points with different signs are not matched (a minimal sketch of this screen follows).
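A minimal sketch of the sign screen: the stored trace sign gates a candidate pair before any distance is computed.

```python
def sign_compatible(trace_a, trace_b):
    """Two SURF feature points can only match when the signs of their
    Hessian traces (Laplacian responses) agree; opposite signs mean
    opposite bright/dark contrast, so the pair is rejected outright."""
    return (trace_a >= 0) == (trace_b >= 0)
```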
The invention has the following beneficial effects:
(1) HSV-space brightness remapping is introduced into the SURF preprocessing stage, improving the optimization effect of the image enhancement;
(2) compared with other image enhancement methods, the proposed method markedly improves the visual effect, increases image contrast and the number of matched feature points, suppresses noise better, and preserves the detail information of brightness-saturated regions.
Drawings
FIG. 1 is an original image of the present invention;
FIG. 2 is an HSV space V component image of the present invention;
FIG. 3 is a diagram illustrating the matching effect of feature points of an original image according to the present invention;
FIG. 4 is a diagram illustrating the matching effect of histogram equalization enhanced image feature points according to the present invention;
FIG. 5 is a diagram illustrating the matching effect of feature points of an adaptive histogram equalization enhanced image according to the present invention;
FIG. 6 is a graph of the matching effect of the algorithm enhanced image feature points of the present invention;
FIG. 7 is a flowchart of the algorithm of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
In an image captured under low illumination, the brightness is very limited, so it is difficult to recover image detail from a global perspective, and the dark portions are strongly distorted. Since regions near light sources are brighter and contain more detail, the brightness of those regions is enhanced; the predominantly dark regions of the image are too dark and lack detail, so their image features are abandoned and their brightness is reduced. During SURF image preprocessing, the quality and brightness of the low-illumination regions are improved while the information in the saturated regions of the original image is unchanged, so the quality of the processed image improves without losing information from other regions.
Taking an everyday scene photographed at night as an example, histogram equalization and adaptive histogram equalization are applied to the original image and compared against the algorithm of the invention. The entire process of the invention is described in detail below.
1. An image shot under low-illumination conditions is obtained through an image acquisition device, and its pixel-value matrix is converted from the RGB color space to the HSV color space.
The nighttime everyday scene is taken as the original image, shown in FIG. 1. The original image is dark in overall brightness, but it also contains brightness-saturated areas. The original image is then converted from RGB space to HSV space by the color-space conversion formulas:
R' = R/255
G' = G/255
B' = B/255
Cmax = max(R', G', B')
Cmin = min(R', G', B')
Δ = Cmax − Cmin

$$H = \begin{cases} 0^\circ, & \Delta = 0 \\ 60^\circ \times \left(\dfrac{G' - B'}{\Delta} \bmod 6\right), & C_{max} = R' \\ 60^\circ \times \left(\dfrac{B' - R'}{\Delta} + 2\right), & C_{max} = G' \\ 60^\circ \times \left(\dfrac{R' - G'}{\Delta} + 4\right), & C_{max} = B' \end{cases}$$

$$S = \begin{cases} 0, & C_{max} = 0 \\ \dfrac{\Delta}{C_{max}}, & C_{max} \neq 0 \end{cases}$$

V = Cmax

The H, S, and V values are obtained from these conversion formulas; the V-component image of the original image in HSV space is shown in FIG. 2.
2. A nonlinear function is set, the brightness of the V component in the image's HSV space is remapped, and the brightness-remapped image is converted from the HSV color space back to the RGB color space.
Because dark regions and high-brightness regions coexist in the image, a nonlinear function is designed to remap the brightness of the low-illumination image, increasing its contrast without distortion. The nonlinear function adopted by the invention is the modified Sigmoid function y(V) described above (its exact expression appears only as an image in the original document).
The brightness-remapped image is converted from HSV space back to RGB space by the color-space conversion formulas:

$$(R', G', B') = \begin{cases} (V, T, P), & h_i = 0 \\ (Q, V, P), & h_i = 1 \\ (P, V, T), & h_i = 2 \\ (P, Q, V), & h_i = 3 \\ (T, P, V), & h_i = 4 \\ (V, P, Q), & h_i = 5 \end{cases}$$

wherein:

$$h_i = \left\lfloor \frac{H}{60^\circ} \right\rfloor \bmod 6$$

$$F = \frac{H}{60^\circ} - h_i$$

P = V·(1−S)
Q = V·(1−F·S)
T = V·(1−(1−F)·S)
As shown in FIG. 4, FIG. 5, and FIG. 6 respectively: histogram equalization introduces a large amount of noise and interference into the enhanced image; adaptive histogram equalization, because it processes the image in blocks, introduces less interference and noise; and the algorithm of the invention remaps the original brightness and effectively improves the brightness and detail of the regions near the highlights in the low-illumination image.
3. SURF feature detection is performed on the brightness-remapped, enhanced image: feature descriptors of the SURF feature points are constructed, and the feature points are extracted and matched.
The image pyramid is built from approximations of the Hessian matrix determinant, and the discriminant is evaluated at the image pixels to generate all candidate feature points. For an image f(x, y), the Hessian matrix is as follows:

$$H(f(x, y)) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \partial y} \\ \dfrac{\partial^2 f}{\partial x \partial y} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$$
the Hessian matrix is solved by first performing gaussian smoothing. The expression of the Hessian matrix after Gaussian filtering is as follows:
$$H(\mathbf{x}, \sigma) = \begin{bmatrix} L_{xx}(\mathbf{x}, \sigma) & L_{xy}(\mathbf{x}, \sigma) \\ L_{xy}(\mathbf{x}, \sigma) & L_{yy}(\mathbf{x}, \sigma) \end{bmatrix}$$
When the discriminant of the Hessian matrix attains a local maximum, the current point is judged to be brighter or darker than the surrounding points in its neighborhood, which locates the position of a keypoint. The discriminant of the Hessian matrix is as follows:
det(H)=Dxx·Dyy-(0.9·Dxy)2
and (3) continuously increasing the size of a filter template and the integral image on the basis of the initial filter to obtain Hessian matrix response and construct a scale space. And then, a non-maximum suppression method is adopted in the response image to preliminarily determine the characteristic points. And finally, carrying out accurate positioning through the local maximum value, removing points smaller than a set threshold value, and increasing extreme values to screen out a plurality of characteristic points with strongest characteristics.
After the feature points are determined, a dominant direction must be assigned to each. With the feature point as center and 6s as the radius of a circular region, the sums of the Haar wavelet responses in the horizontal and vertical directions are computed for all points inside a 60° sector; the responses within the sector form a new vector:
$$m_w = \sqrt{\left(\sum_w dx\right)^2 + \left(\sum_w dy\right)^2}$$

$$\theta_w = \arctan\left(\sum_w dy \Big/ \sum_w dx\right)$$
where dx, dy are the Haar wavelet responses in the horizontal and vertical directions. The sector is swept over the whole circular region, and the direction of the longest accumulated vector is selected as the dominant direction of the feature point.
A square region of side 20s is then taken around the feature point and divided equally into 4 × 4 sub-regions, each evaluated with a 2s Haar wavelet template. The sum of the horizontal responses, the sum of their absolute values, the sum of the vertical responses, and the sum of their absolute values are computed, giving the feature vector V:
V=[∑dx ∑|dx| ∑dy ∑|dy|]
each feature point is composed of 16 sub-region feature vectors, and then normalization processing is performed.
The degree of matching is determined by the Euclidean distance between two feature points: the shorter the distance, the better the match. A check on the trace of the Hessian matrix is added: if the traces of two feature points have the same sign, the two features exhibit contrast changes in the same direction; if the signs differ, the contrast changes are opposite, and the pair is rejected directly even if the Euclidean distance is 0.
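An end-to-end sketch using OpenCV's SURF implementation; SURF ships with the opencv-contrib package (cv2.xfeatures2d) and may be missing from some builds, the image file names are hypothetical, and Lowe's ratio test stands in here as a common screening step alongside the Euclidean-distance matching described above.

```python
import cv2

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
img1 = cv2.imread("enhanced_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("enhanced_b.jpg", cv2.IMREAD_GRAYSCALE)
kp1, des1 = surf.detectAndCompute(img1, None)
kp2, des2 = surf.detectAndCompute(img2, None)

# Euclidean-distance (L2) matching with a ratio test to screen weak matches.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.7 * n.distance]
result = cv2.drawMatches(img1, kp1, img2, kp2, good, None)
cv2.imwrite("matches.jpg", result)
```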
As shown in FIG. 5 and FIG. 6, adaptive histogram equalization can enhance feature-point matching locally and effectively, but the algorithm of the invention enhances the feature matches better.
The invention is applied to low-illumination images. The observability of the original image is poor: it was acquired at night with image-acquisition equipment, has low contrast and unclear texture detail, and is processed with the proposed algorithm. In the comparative experiment, the method is compared with the conventional histogram equalization and adaptive histogram equalization enhancement methods. Compared with the results of the other two algorithms, the proposed algorithm markedly improves the visual effect, increases the image contrast, and suppresses noise better.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the invention, and these improvements and modifications should also be regarded as falling within the protection scope of the invention.

Claims (3)

1. A low-illumination image enhancement method for improving SURF algorithm, the method comprising the steps of:
1) acquiring an image shot under a low illumination condition, and converting the image from an RGB color space to an HSV color space;
2) setting a nonlinear function y(V), performing brightness remapping on the V component in the HSV color space of the image through the nonlinear function y(V) to obtain an enhanced image, and converting the enhanced image from the HSV color space back to the RGB color space;
3) performing SURF feature detection and matching on the enhanced image, and outputting the enhanced image after feature matching.
2. The method of claim 1, wherein the conversion formula of the image from RGB color space to HSV color space in step 1) is as follows:
R'=R/255
G'=G/255
B'=B/255
Cmax=max(R',G',B')
Cmin=min(R',G',B')
Δ=Cmax-Cmin
$$H = \begin{cases} 0^\circ, & \Delta = 0 \\ 60^\circ \times \left(\dfrac{G' - B'}{\Delta} \bmod 6\right), & C_{max} = R' \\ 60^\circ \times \left(\dfrac{B' - R'}{\Delta} + 2\right), & C_{max} = G' \\ 60^\circ \times \left(\dfrac{R' - G'}{\Delta} + 4\right), & C_{max} = B' \end{cases}$$

$$S = \begin{cases} 0, & C_{max} = 0 \\ \dfrac{\Delta}{C_{max}}, & C_{max} \neq 0 \end{cases}$$

V = Cmax;
wherein:
H. s, V are hue, saturation, and value components of the HSV space, respectively; r, G, B R, G, B color components of RGB space, respectively; r ', G', B 'are the values normalized for the R, G, B components, respectively, such that the values of R', G ', B' are at [0,1], Cmax is the maximum value of R ', G', B ', and Cmin is the minimum value of R', G ', B'.
3. The method of claim 2, wherein the step 2) is implemented by:
2.1) the Sigmoid function

y = 1/(1 + e^(−x))

is modified into a function y(V) of the V component (the modified expression appears only as an image in the original document);
the brightness of the V component in the HSV space of the image is remapped through the function y(V) to obtain an enhanced image, whose color components are denoted H, S, and V';
2.2) converting the enhanced image into an RGB space, wherein the conversion formula of converting the HSV space into the RGB space is as follows:
$$(R', G', B') = \begin{cases} (V', T, P), & h_i = 0 \\ (Q, V', P), & h_i = 1 \\ (P, V', T), & h_i = 2 \\ (P, Q, V'), & h_i = 3 \\ (T, P, V'), & h_i = 4 \\ (V', P, Q), & h_i = 5 \end{cases}$$

wherein:

$$h_i = \left\lfloor \frac{H}{60^\circ} \right\rfloor \bmod 6, \qquad F = \frac{H}{60^\circ} - h_i$$

$$P = V' \cdot (1 - S), \qquad Q = V' \cdot (1 - F \cdot S), \qquad T = V' \cdot (1 - (1 - F) \cdot S)$$

R', G', B' are the color components after conversion back to RGB space; mod is the remainder operation; the parameters P, Q, T are computed from V', S, and F, the parameter F is computed from H, and the computed values of P, Q, V', T are assigned to the RGB color components according to the range of H.
CN202210415824.XA 2022-04-19 2022-04-19 Low-illumination image enhancement method for improving SURF algorithm Pending CN114782268A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210415824.XA CN114782268A (en) 2022-04-19 2022-04-19 Low-illumination image enhancement method for improving SURF algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210415824.XA CN114782268A (en) 2022-04-19 2022-04-19 Low-illumination image enhancement method for improving SURF algorithm

Publications (1)

Publication Number Publication Date
CN114782268A true CN114782268A (en) 2022-07-22

Family

ID=82430324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210415824.XA Pending CN114782268A (en) 2022-04-19 2022-04-19 Low-illumination image enhancement method for improving SURF algorithm

Country Status (1)

Country Link
CN (1) CN114782268A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116824511A (en) * 2023-08-03 2023-09-29 行为科技(北京)有限公司 Tool identification method and device based on deep learning and color space

Similar Documents

Publication Publication Date Title
CN102713938B (en) Scale space normalization technique for improved feature detection in uniform and non-uniform illumination changes
CN107945111B (en) Image stitching method based on SURF (speeded up robust features) feature extraction and CS-LBP (local binary Pattern) descriptor
CN109918971B (en) Method and device for detecting number of people in monitoring video
CN103020965B (en) A kind of foreground segmentation method based on significance detection
CN103020992B (en) A kind of video image conspicuousness detection method based on motion color-associations
CN110929593A (en) Real-time significance pedestrian detection method based on detail distinguishing and distinguishing
CN112184604B (en) Color image enhancement method based on image fusion
CN108875645B (en) Face recognition method under complex illumination condition of underground coal mine
CN111369605B (en) Infrared and visible light image registration method and system based on edge features
CN115063331B (en) Multi-scale block LBP operator-based ghost-free multi-exposure image fusion method
CN113177467A (en) Flame identification method, system, device and medium
CN108694349A (en) A kind of pantograph image extraction method and device based on line-scan digital camera
CN107122732B (en) High-robustness rapid license plate positioning method in monitoring scene
CN117496019B (en) Image animation processing method and system for driving static image
CN114782268A (en) Low-illumination image enhancement method for improving SURF algorithm
CN111311503A (en) Night low-brightness image enhancement system
CN116311212B (en) Ship number identification method and device based on high-speed camera and in motion state
CN106934395B (en) Rigid body target tracking method adopting combination of SURF (speeded Up robust features) and color features
Hua et al. Image segmentation algorithm based on improved visual attention model and region growing
CN108960285B (en) Classification model generation method, tongue image classification method and tongue image classification device
CN115035281B (en) Rapid infrared panoramic image stitching method
CN116167905A (en) Anti-screen robust watermark embedding and extracting method and system based on feature point detection
CN112532938B (en) Video monitoring system based on big data technology
CN114463379A (en) Dynamic capturing method and device for video key points
CN112070048B (en) Vehicle attribute identification method based on RDSNet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination