WO2023108933A1 - A vehicle detection method based on a clustering algorithm - Google Patents

A vehicle detection method based on a clustering algorithm

Info

Publication number
WO2023108933A1
WO2023108933A1 PCT/CN2022/081197 CN2022081197W WO2023108933A1 WO 2023108933 A1 WO2023108933 A1 WO 2023108933A1 CN 2022081197 W CN2022081197 W CN 2022081197W WO 2023108933 A1 WO2023108933 A1 WO 2023108933A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
pixel
image
distance
pixels
Prior art date
Application number
PCT/CN2022/081197
Other languages
English (en)
French (fr)
Inventor
顾伟
赵志伟
曹渊
张申浩
Original Assignee
江苏航天大为科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 江苏航天大为科技股份有限公司 filed Critical 江苏航天大为科技股份有限公司
Publication of WO2023108933A1 publication Critical patent/WO2023108933A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details
    • G06T2207/20032Median filtering

Definitions

  • the invention belongs to the technical field of target recognition, in particular to a vehicle detection method based on a clustering algorithm.
  • target detection and recognition technologies using virtual reality technology and image processing technology, such as face recognition, pedestrian feature detection, vehicle detection, etc.
  • video capture is used to analyze and track with visual technology, and to detect crowding and vehicle traffic conditions, which can facilitate traffic management and reduce traffic accidents.
  • the clustering algorithm mainly discriminates the dissimilarity and similarity of the data in the generated clusters, measuring similarity within objects to the greatest possible extent.
  • the existing clustering algorithms have low detection accuracy, large amount of calculation and low efficiency.
  • the present invention proposes a vehicle detection method based on a clustering algorithm.
  • the continuous iteration of the algorithm makes its quantization value more accurate, thereby improving the effect of subsequent detection.
  • in vehicle detection there is no need to specify a predefined number of clusters; only one parameter with low sensitivity is needed, achieving single-variable control and reducing the complexity of the algorithm.
  • the computational load of the entire algorithm is reduced through various methods, thereby improving the efficiency of the entire algorithm.
  • specifically, the vehicle detection method based on a clustering algorithm disclosed by the present invention comprises the following steps:
  • preprocessing the image, including filtering and denoising the image and resizing the image;
  • the image is converted from the RGB color space to the LAB color space, the color feature vector of each image pixel in the LAB color space is extracted, and the color feature vectors are assembled into a feature-value matrix;
  • the filtering and denoising method of the image is median filtering, which removes impulse noise and salt-and-pepper noise while retaining the edge details of the image; the image is resized to reduce the number of similarity-measure computations and increase running speed.
  • the steps of the color quantization processing include:
  • k is the index over the K RGB components
  • j is the index over the pixels
  • d_kj is the color distance between the j-th pixel and the k-th RGB component
  • S3: classify the pixels by color distance, as follows:
  • step S5: compute the color distance between the pixel values of each category and the newly computed K category pixel values, and judge whether the classification of any pixel changes; if no pixel in any category changes its classification, proceed to step S6, otherwise jump to step S2;
  • the image is converted from RGB color space to LAB color space, and the conversion steps are as follows:
  • the Gamma-corrected color components Rg, Gg, Bg are obtained by applying the first Gamma correction method to the RGB components;
  • Xl, Yl, Zl are converted into the color components l, a, b of the LAB color space;
  • the color information of the image pixels in the LAB color space is extracted, and the color feature vector of each pixel is generated.
  • the first Gamma correction method is as follows:
  • x is one of the original color components R, G, B
  • the second Gamma correction formula is as follows:
  • y is one of the color components X, Y, and Z in the XYZ color space
  • X, Y, and Z are the color components of the XYZ color space
  • the formula converting Xl, Yl, Zl into the LAB color components l, a, b is as follows:
  • i represents the i-th pixel point
  • j represents the j-th pixel point
  • d_ij represents the feature value between pixel i and pixel j, where l, a, b are the pixel's three parameters in the LAB color space;
  • i is the serial number of the current pixel point
  • j represents the serial number of other pixel points except the current pixel point
  • d_ij is the feature value between pixel point i and pixel point j
  • d_c is the cut-off distance
  • d_ij is the feature value between pixel point i and pixel point j
  • i is the serial number of the current pixel point
  • j represents the serial numbers of the other pixel points except the current pixel point, with ρ_i < ρ_j;
  • the aspect ratios of the clusters obtained after pixel clustering are screened against the vehicle aspect ratio k, and clusters whose aspect ratio deviates greatly from k are removed.
  • the generating the segmented image of the vehicle according to the clustering result includes: selecting the coordinates of the outermost pixel points of each cluster to form a rectangle according to the detection result of the vehicle, and generating the segmented image of the vehicle.
  • the present invention greatly reduces the calculation amount and improves the efficiency under the condition of ensuring the detection accuracy through the steps of reducing image size and color quantization;
  • the quantized color makes its quantized value more accurate through continuous iteration of the algorithm, thereby improving the effect of subsequent detection.
  • the vehicle detection algorithm proposed by the invention does not need to specify a predefined number of clusters, but only needs a parameter with low sensitivity. Compared with other density-based methods, the invention has higher calculation efficiency.
  • Fig. 1 is a flowchart of the vehicle detection method of the present invention;
  • Fig. 2 is a flowchart of the color quantization of the present invention;
  • Fig. 3 is a flowchart of the cluster screening of the present invention.
  • FIG. 1 is a flow chart of vehicle detection based on clustering algorithm, and the specific description of each step is as follows:
  • image preprocessing: the main preprocessing work in the present invention is filtering and denoising the image and adjusting its size.
  • the image filtering method is the median filtering method.
  • the advantage of this method is that it can remove impulse noise and salt and pepper noise. In addition, it can preserve the edge details of the image while removing noise.
  • the adjustment of the image size is to reduce the number of operations of the similarity measure and improve the running speed.
  • the specific scaling factor is selected according to the actual hardware conditions; the original image is generally reduced by half.
  • in step 1, the image is resized in order to reduce the number of similarity-measure computations and increase running speed, but for RGB images the computational load is still very large, and further reducing the resolution would lose a great deal of information from the original image.
  • the present invention proposes a color quantization method.
  • the color information of the original image is maximized while reducing the color types of the pixels, thereby improving the operating efficiency.
  • Figure 2 is a flow chart of color quantization, and the specific steps are as follows:
  • 2.1 K-value selection: randomly select K RGB components from the image.
  • the general selection rule is: select points as dispersed across the image as possible, so that they represent the main colors of the image to the greatest extent.
  • k is the index over the K RGB components
  • j is the index over the pixels
  • d_kj is the distance between the j-th pixel and the k-th RGB component.
  • pixel classification: classify the pixels by the color distances calculated in step 2.2; the specific classification method is:
  • this step is to recalculate the color values of the K RGB components selected in step 2.1 to obtain new category pixel values.
  • the specific calculation method is: according to the pixel classification result in step 2.3, calculate the color average value of all pixels in each category according to formula 2, and replace M k in step 2.1 with the calculated average value.
  • k is the index over the K RGB components
  • R′_k, G′_k, and B′_k are the R, G, and B components of the K selected points, respectively
  • n is the total number of pixels in the category
  • i is the index of the pixel within the category
  • R_i, G_i, and B_i are the RGB components of the i-th pixel in the category, respectively.
  • This step is an iterative step in the color quantization step, and the purpose of this step is to make the pixel classification more accurate and the category pixel values more reasonable.
  • This step is the specific execution step of the color quantization algorithm.
  • the method is as follows: using the K category pixel values and the pixels in each category from steps 2.1 to 2.5, replace the RGB components of every pixel in a category with the RGB components of that category's pixel value, completing the color quantization.
  • in the present invention the RGB color space needs to be converted into the LAB color space; the specific conversion steps are as follows:
  • Rg, Gg, Bg are Gamma correction color components
  • X, Y, Z are XYZ color space color components
  • LAB color space conversion: convert to the LAB color space according to formula 6, where l, a, b are the color components of the LAB color space and Xl, Yl, Zl are the linearly normalized values of X, Y, Z in the XYZ color space.
  • feature vector extraction: extract the color information of the image pixels in the LAB color space and generate the color feature vector of each pixel.
  • i represents the i-th pixel point
  • j represents the j-th pixel point
  • d_ij represents the feature value between pixel point i and pixel point j
  • l, a, b are the pixel's three parameters in the LAB color space.
  • the cut-off distance d_c in this step is determined as follows: sort the upper triangular part of the feature-value matrix D computed in step 5.1 in ascending order and compute d_c by the median method.
  • the local density ρ_i in this step represents the number of similar pixels around the pixel; the larger the value, the more similar pixels surround it, i.e. it counts the pixels whose distance from the i-th pixel is less than the cut-off distance d_c.
  • the specific local density method is: calculate the local density of each pixel according to formula 8.
  • i is the number of the current pixel point
  • j represents the number of other pixel points except the current pixel point
  • d_ij is the feature value between pixel point i and pixel point j
  • d_c is the cut-off distance
  • the distance ⁇ i calculated in this step from the higher density point is to find the distance between all the pixels with higher local density than the i-th pixel, and the i-th pixel The minimum distance between.
  • the specific calculation method is: calculate the distance ⁇ i from the higher density point according to Formula 9.
  • d_ij is the feature value between pixel i and pixel j
  • i is the number of the current pixel
  • j represents the numbers of the other pixels except the current pixel, with ρ_i < ρ_j
  • This step is to determine the cluster center of the pixel points in the image, that is, to initially find out the center of the vehicle in the image.
  • the specific determination method is: filter according to the local density calculated in step 5.3 and step 5.4 and the distance from the higher density point, and select the pixel points with larger local density and larger distance from the higher density point as clusters center.
  • Pixel point clustering perform pixel point clustering according to the cluster centers determined in step 5.5.
  • the specific clustering method is as follows: taking each determined cluster-center pixel as the center, assign all pixels whose distance to that cluster-center pixel is less than the cut-off distance d_c to that cluster center according to formula 8, completing the pixel clustering.
  • the clustering result of this step represents the preliminary detection result of the vehicle in the image.
  • cluster screening: this step screens the vehicle centers and vehicle detection results preliminarily detected in steps 5.5 and 5.6.
  • the specific screening method is: according to the aspect ratio k of actual vehicles in the image, screen the aspect ratios of the clusters obtained after the pixel clustering of step 5.6, and remove clusters whose aspect ratio deviates greatly from k.
  • This step is to segment the vehicle from the image or frame it in the image.
  • the specific method is: according to the vehicle detection result of step 5, select the coordinates of the outermost pixels of each cluster to form a rectangle, completing the segmentation of the vehicle and generating the vehicle segmentation image.
  • the present invention greatly reduces the calculation amount and improves the efficiency under the condition of ensuring the detection accuracy through the steps of reducing image size and color quantization;
  • the quantized color makes its quantized value more accurate through continuous iteration of the algorithm, thereby improving the effect of subsequent detection.
  • the vehicle detection algorithm proposed by the present invention does not need to specify a predefined number of clusters, but only requires a parameter with low sensitivity. Compared with other density-based methods, the present invention has higher computational efficiency.
  • the word "preferred" means serving as an example, instance, or illustration. Any aspect or design described herein as "preferred" is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word "preferred" is intended to present concepts in a concrete manner.
  • the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless otherwise specified or clear from context, "X employs A or B" is meant to include either of the natural permutations: if X employs A; X employs B; or X employs both A and B, then "X employs A or B" is satisfied in any of the foregoing instances.
  • Each functional unit in the embodiments of the present invention may be integrated into one processing module, each unit may exist physically on its own, or two or more of the above units may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. If the integrated modules are realized in the form of software function modules and sold or used as independent products, they can also be stored in a computer-readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk or an optical disk, and the like.
  • Each of the above devices or systems may execute the storage method in the corresponding method embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Geometry (AREA)
  • Color Image Communication Systems (AREA)
  • Image Analysis (AREA)

Abstract

A vehicle detection method based on a clustering algorithm, comprising the steps of: preprocessing an image; reducing the number of pixel colors through color quantization of the image; converting the image from the RGB color space to the LAB color space, extracting the color feature vector of each image pixel in the LAB color space, and assembling the color feature vectors into a feature-value matrix; computing each pixel's local density and its distance from higher-density points, selecting as cluster centers the pixels whose local density exceeds a threshold and whose distance from higher-density points exceeds a threshold, assigning the remaining pixels to those cluster centers, and clustering the pixels; and generating a vehicle segmentation image from the clustering result. The method greatly reduces the amount of computation; continuous iteration of the algorithm makes the quantized color values more accurate, improving subsequent detection; and no predefined number of clusters needs to be specified, so the computational efficiency is high.

Description

A vehicle detection method based on a clustering algorithm

Technical field
The invention belongs to the technical field of target recognition, and in particular relates to a vehicle detection method based on a clustering algorithm.

Background
With rising living standards, the number of vehicles keeps increasing, and vehicles differ in type, model, and frame structure. Target detection and recognition technologies that apply virtual reality and image processing, such as face recognition, pedestrian feature detection, and vehicle detection, are therefore increasingly applied in the traffic domain as intelligent transportation and sensing technologies develop, for example for vehicle collision prediction and warning and for emergencies such as lane departure. In intelligent transportation, video capture combined with vision techniques is used for analysis and tracking and for detecting crowding and vehicle traffic conditions, which enables convenient and efficient traffic management and reduces traffic accidents.

In vehicle detection technology, moving targets are detected using background images, texture, color, and so on to improve detection accuracy; during detection, algorithms such as high-speed hybrid modeling, classifiers, and decision trees are used for multi-target detection, tracking, and recognition. Common techniques for separating the background from video and images include background subtraction, prior knowledge, optical flow, and machine learning; among these, machine learning research and application has become the current frontier, and detection methods and techniques continue to multiply.

Clustering algorithms mainly discriminate the dissimilarity and similarity of the data in the generated clusters, measuring similarity within objects to the greatest possible extent. Existing clustering algorithms suffer from low detection accuracy, heavy computation, and low efficiency.

Summary of the invention
In view of this, the invention proposes a vehicle detection method based on a clustering algorithm. In the color quantization algorithm, continuous iteration makes the quantized values more accurate, improving the effect of subsequent detection. Vehicle detection requires no predefined number of clusters, only one parameter with low sensitivity, achieving single-variable control and reducing the complexity of the algorithm. In addition, the computational load of the whole algorithm is reduced in several ways, improving its overall efficiency.
Specifically, the vehicle detection method based on a clustering algorithm disclosed by the invention comprises the following steps:
preprocessing the image, including filtering and denoising the image and adjusting its size;
reducing the number of pixel colors through color quantization of the image;
converting the image from the RGB color space to the LAB color space, extracting the color feature vector of each image pixel in the LAB color space, and assembling the color feature vectors into a feature-value matrix;
computing each pixel's local density and its distance from higher-density points, selecting as cluster centers the pixels whose local density exceeds a threshold and whose distance from higher-density points exceeds a threshold, assigning the remaining pixels to those cluster centers, and clustering the pixels;
generating a vehicle segmentation image from the clustering result.
Further, the filtering and denoising method is median filtering, which removes impulse noise and salt-and-pepper noise while preserving the edge details of the image; the image is resized to reduce the number of similarity-measure computations and increase running speed.
Further, the color quantization comprises the steps of:
S1: randomly select K RGB components from the image, M_k = [R′_k, G′_k, B′_k], where k indexes the K components and R′_k, G′_k, B′_k are the R, G, B components of the K selected points;
S2: compute the color distance between each pixel and the K selected RGB components
d_kj = √((R_j − R′_k)² + (G_j − G′_k)² + (B_j − B′_k)²)
where k indexes the K RGB components, j indexes the pixels, and d_kj is the color distance between the j-th pixel and the k-th RGB component;
S3: classify the pixels by the color distances, as follows:
compare each pixel's K computed color distances and assign the pixel to the category of the RGB component with the smallest color distance;
S4: from the classification result, compute the color average of all pixels in each category and replace the value of M_k with the computed average;
S5: compute the color distance between the pixel values of each category and the newly computed K category pixel values, and judge whether the classification of any pixel changes; if no pixel in any category changes its classification, proceed to step S6, otherwise jump to step S2;
S6: using the K category pixel values and the pixels in each category obtained in steps S1 to S5, replace the RGB components of every pixel in a category with the RGB components of that category's pixel value, completing the color quantization.
Further, the image is converted from the RGB color space to the LAB color space as follows:
apply a first Gamma correction to the RGB components to obtain the Gamma-corrected color components Rg, Gg, Bg;
convert the Gamma-corrected color components to the XYZ color space to obtain the XYZ color components X, Y, Z;
apply a second Gamma correction to the color components X, Y, Z to obtain the corrected XYZ components Xl, Yl, Zl;
convert Xl, Yl, Zl into the color components l, a, b of the LAB color space;
extract the color information of the image pixels in the LAB color space and generate the color feature vector of each pixel.
Further, the first Gamma correction method is as follows:
Figure PCTCN2022081197-appb-000002
where x is one of the original color components R, G, B;
the second Gamma correction formula is as follows:
Figure PCTCN2022081197-appb-000003
where y is one of the XYZ color components X, Y, Z;
the formula converting the Gamma-corrected color components into the XYZ color space is as follows:
[X, Y, Z] = M·[Rg, Gg, Bg]
where X, Y, Z are the XYZ color components and
Figure PCTCN2022081197-appb-000004
the formulas converting Xl, Yl, Zl into the LAB color components l, a, b are as follows:
Figure PCTCN2022081197-appb-000005
a = 500(Xl − Yl)
b = 500(Yl − Zl)
Further, the vehicle detection:
compute the feature value of the color feature vectors according to the following formula, and assemble all feature values into the feature-value matrix D:
d_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)
where i denotes the i-th pixel, j the j-th pixel, d_ij the feature value between pixels i and j, and l, a, b the pixel's three parameters in the LAB color space;
sort the upper triangular part of the feature-value matrix D in ascending order and compute the cut-off distance d_c by the median method;
compute the local density ρ_i of each pixel according to:
ρ_i = Σ_j χ(d_ij − d_c)
where i is the index of the current pixel, j indexes the other pixels, d_ij is the feature value between pixels i and j, and d_c is the cut-off distance;
χ is defined as:
χ(x) = 1 if x < 0, and χ(x) = 0 otherwise;
compute the distance from higher-density points δ_i according to:
δ_i = min(d_ij)
where d_ij is the feature value between pixels i and j, i is the index of the current pixel, and j indexes the other pixels with ρ_i < ρ_j;
screen by the local density and the distance from higher-density points, selecting pixels with both a large local density and a large distance from higher-density points as cluster centers;
taking each determined cluster-center pixel as a center, assign every pixel whose distance to that cluster-center pixel is less than the cut-off distance d_c to that cluster center according to the local-density formula, completing the pixel clustering;
according to the aspect ratio k of actual vehicles in the image, screen the aspect ratios of the clusters and remove clusters whose aspect ratio deviates greatly from k.
Further, generating the vehicle segmentation image from the clustering result comprises: according to the vehicle detection result, selecting the coordinates of the outermost pixels of each cluster to form a rectangle, generating the vehicle segmentation image.
The beneficial effects of the invention are as follows:
by reducing the image size and quantizing colors, the invention greatly reduces the amount of computation and improves efficiency while maintaining detection accuracy;
in the proposed color quantization algorithm, continuous iteration makes the quantized color values more accurate, improving the effect of subsequent detection;
the proposed vehicle detection algorithm requires no predefined number of clusters, only one parameter with low sensitivity; compared with other density-based methods, it is computationally more efficient.
Brief description of the drawings
Fig. 1 is a flowchart of the vehicle detection method of the invention;
Fig. 2 is a flowchart of the color quantization of the invention;
Fig. 3 is a flowchart of the cluster screening of the invention.
Detailed description
The invention is further described below with reference to the drawings, without being limited in any way; any transformation or substitution made based on the teaching of the invention falls within its scope of protection.
The technical solution adopted by the invention comprises the following steps:
The invention uses a clustering algorithm to segment the vehicles in an image and obtain a vehicle target detection image. Fig. 1 is the flowchart of the clustering-based vehicle detection; each step is described in detail as follows:
1. Image preprocessing: the main preprocessing work in the invention is filtering and denoising the image and adjusting its size.
The image is filtered with a median filter, whose advantage is that it removes impulse noise and salt-and-pepper noise while preserving the edge details of the image. The image is resized to reduce the number of similarity-measure computations and increase running speed; the specific scaling factor is chosen according to the actual hardware conditions, and the original image is usually reduced by half.
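The preprocessing described above can be sketched as follows; this is a minimal illustration using SciPy, not the patent's own implementation, and the 3×3 window and half-size factor are the typical choices the text suggests:

```python
import numpy as np
from scipy.ndimage import median_filter, zoom

def preprocess(img):
    """Median-filter an H x W x 3 image, then halve its spatial size."""
    # A 3x3 median filter per channel removes impulse / salt-and-pepper
    # noise while preserving edges better than a mean filter would.
    filtered = np.stack(
        [median_filter(img[..., c], size=3) for c in range(img.shape[-1])],
        axis=-1,
    )
    # Halving the resolution reduces the number of similarity-measure
    # computations in the later clustering stage (order=1 is bilinear).
    return zoom(filtered, (0.5, 0.5, 1), order=1)
```

In practice the scaling factor would be chosen according to the available hardware, as the description notes.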
2. Color quantization: step 1 resizes the image to reduce the number of similarity-measure computations and increase running speed, but for an RGB image the computational load is still very large, and further reducing the resolution would lose a great deal of information from the original image.
To solve this problem, the invention proposes a color quantization method that reduces the number of pixel colors while preserving the color information of the original image as much as possible, improving running efficiency. Fig. 2 is the color quantization flowchart; the specific steps are as follows:
2.1 K-value selection: randomly select K RGB components from the image. The general rule is to select points as dispersed across the image as possible, so that they represent the main colors of the image to the greatest extent.
Here K is the number of quantization categories; it grows with the color complexity of the image scene, and K is usually at least 3. The selection is written M_k = [R′_k, G′_k, B′_k], where k indexes the K components and R′_k, G′_k, B′_k are the R, G, B components of the K selected points.
2.2 Color distance computation: according to formula 1, compute the distance between each pixel and the K RGB components selected in step 2.1.
d_kj = √((R_j − R′_k)² + (G_j − G′_k)² + (B_j − B′_k)²)   (1)
where k indexes the K RGB components, j indexes the pixels, and d_kj is the distance between the j-th pixel and the k-th RGB component.
2.3 Pixel classification: classify the pixels by the color distances computed in step 2.2. Specifically, compare each pixel's K computed color distances and assign the pixel to the category of the RGB component with the smallest color distance.
2.4 Category pixel value computation: recompute the color values of the K RGB components selected in step 2.1 to obtain new category pixel values. Specifically, from the classification result of step 2.3, compute the color average of all pixels in each category according to formula 2 and replace M_k of step 2.1 with the computed average.
R′_k = (1/n) Σ_{i=1..n} R_i,  G′_k = (1/n) Σ_{i=1..n} G_i,  B′_k = (1/n) Σ_{i=1..n} B_i   (2)
where k indexes the K RGB components, R′_k, G′_k, B′_k are the R, G, B components of the K selected points, n is the total number of pixels in the category, i indexes the pixels in the category, and R_i, G_i, B_i are the RGB components of the i-th pixel in the category.
2.5 Pixel category check: this is the iterative step of the color quantization; its purpose is to make the classification more accurate and the category pixel values more reasonable. Specifically, compute the color distance between the pixel values of each category and the new K category pixel values computed in step 2.4, and judge whether the classification of any pixel changes; if no pixel in any category changes its classification, proceed to the next step, otherwise jump to step 2.2.
2.6 Color replacement: this step performs the quantization itself. Using the K category pixel values and the pixels in each category from steps 2.1 to 2.5, replace the RGB components of every pixel in a category with the RGB components of that category's pixel value, completing the color quantization.
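Steps 2.1 to 2.6 amount to a K-means-style iteration in RGB space. A compact sketch follows; the initial centers are taken at evenly spaced pixel indices to approximate the "dispersed selection" rule, and all names are illustrative:

```python
import numpy as np

def quantize_colors(pixels, k, n_iter=50):
    """Steps 2.1-2.6: reduce an (N, 3) array of RGB pixels to k colors."""
    pixels = pixels.astype(float)
    # 2.1: pick k dispersed pixels as the initial category values M_k.
    centers = pixels[np.linspace(0, len(pixels) - 1, k).astype(int)].copy()
    for _ in range(n_iter):
        # 2.2: Euclidean color distance d_kj to every category value.
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        # 2.3: assign each pixel to its nearest category.
        labels = d.argmin(axis=1)
        # 2.4: replace M_k with the mean color of its category.
        new_centers = centers.copy()
        for c in range(k):
            if np.any(labels == c):
                new_centers[c] = pixels[labels == c].mean(axis=0)
        # 2.5: stop once no classification changes (centers are stable).
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    # 2.6: replace every pixel with its category's color value.
    return centers[labels]
```

The iteration converges when no pixel changes category, exactly the stopping condition of step 2.5.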
3. Color space conversion: the invention converts the RGB color space to the LAB color space as follows:
3.1 Gamma correction: according to formula 3, apply Gamma correction to R, G, B to obtain the corrected Rg = f(R), Gg = f(G), Bg = f(B), where R, G, B are the original color components and Rg, Gg, Bg are the Gamma-corrected color components.
Figure PCTCN2022081197-appb-000011
3.2 XYZ color space conversion: convert to the XYZ color space according to formula 4.
[X, Y, Z] = M·[Rg, Gg, Bg]   (4)
where Rg, Gg, Bg are the Gamma-corrected color components, X, Y, Z are the XYZ color components, and M is the conversion matrix:
Figure PCTCN2022081197-appb-000012
3.3 XYZ linear normalization: according to formula 5, apply Gamma correction to X, Y, Z to obtain the corrected Xl = g(X), Yl = g(Y), Zl = g(Z), where X, Y, Z are the XYZ color components and Xl, Yl, Zl are their values after linear normalization.
Figure PCTCN2022081197-appb-000013
3.4 LAB color space conversion: convert to the LAB color space according to formula 6, where l, a, b are the color components of the LAB color space and Xl, Yl, Zl are the linearly normalized values of X, Y, Z in the XYZ color space.
Figure PCTCN2022081197-appb-000014
4. Feature vector extraction: extract the color information of the image pixels in the LAB color space and generate the color feature vector of each pixel.
Each pixel's feature vector is L_i = [l, a, b], where l, a, b are the pixel's three parameters in the LAB color space and i is the pixel's index.
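A sketch of the RGB-to-LAB conversion and feature extraction. The exact Gamma constants and the matrix M appear only in figures omitted from this text, so the standard sRGB/D65 values are assumed here; note also that the standard CIELAB definition uses b = 200(Yl − Zl), whereas the text above writes 500:

```python
import numpy as np

# Assumed standard sRGB / D65 constants -- the patent's exact values
# are in figures that are omitted from this text.
M_RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                         [0.2126, 0.7152, 0.0722],
                         [0.0193, 0.1192, 0.9505]])
WHITE_D65 = np.array([0.95047, 1.0, 1.08883])

def rgb_to_lab_features(rgb):
    """Map an (N, 3) array of RGB values in [0, 255] to [l, a, b] vectors."""
    x = rgb / 255.0
    # 3.1: first Gamma correction (sRGB inverse companding, assumed form).
    g = np.where(x > 0.04045, ((x + 0.055) / 1.055) ** 2.4, x / 12.92)
    # 3.2 + 3.3: matrix conversion to XYZ, white-point normalization,
    # then the second Gamma correction (cube root above a small threshold).
    xyz = g @ M_RGB_TO_XYZ.T / WHITE_D65
    f = np.where(xyz > (6 / 29) ** 3,
                 np.cbrt(xyz),
                 xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    # 3.4 + 4: LAB components, stacked as per-pixel feature vectors.
    l = 116 * f[:, 1] - 16
    a = 500 * (f[:, 0] - f[:, 1])
    b = 200 * (f[:, 1] - f[:, 2])  # standard factor; the text writes 500
    return np.stack([l, a, b], axis=1)
```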
5. Vehicle detection:
5.1 Feature value computation: according to formula 7, compute the feature values of the feature vectors obtained in step 4 and place all feature values into the feature-value matrix D.
d_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)   (7)
where i denotes the i-th pixel, j the j-th pixel, d_ij the feature value between pixels i and j, and l, a, b the pixel's three parameters in the LAB color space.
The feature-value matrix D has the form D = (d_ij), an N×N matrix over all N pixels.
5.2 Cut-off distance determination: the cut-off distance d_c is determined by sorting the upper triangular part of the feature-value matrix D computed in step 5.1 in ascending order and taking the median.
5.3 Local density computation: the local density ρ_i represents the number of similar pixels around a pixel; the larger the value, the more similar pixels surround it, i.e. it counts the pixels whose distance from the i-th pixel is less than the cut-off distance d_c. Specifically, compute the local density of each pixel according to formula 8.
ρ_i = Σ_j χ(d_ij − d_c)   (8)
where i is the index of the current pixel, j indexes the other pixels, d_ij is the feature value between pixels i and j, and d_c is the cut-off distance. χ is defined as:
χ(x) = 1 if x < 0, and χ(x) = 0 otherwise.
5.4 Distance from higher-density points: δ_i is the minimum distance between the i-th pixel and any pixel whose local density is higher than that of the i-th pixel. Specifically, compute δ_i according to formula 9.
δ_i = min(d_ij)   (9)
where d_ij is the feature value between pixels i and j, i is the index of the current pixel, and j indexes the other pixels with ρ_i < ρ_j.
5.5 Cluster center determination: determine the cluster centers of the pixels in the image, i.e. preliminarily locate the vehicle centers. Specifically, screen by the local density of step 5.3 and the distance from higher-density points of step 5.4, selecting pixels with both a large local density and a large distance from higher-density points as cluster centers.
5.6 Pixel clustering: cluster the pixels according to the cluster centers determined in step 5.5. Specifically, taking each determined cluster-center pixel as a center, assign every pixel whose distance to that cluster-center pixel is less than the cut-off distance d_c to that cluster center according to formula 8, completing the pixel clustering. The clustering result of this step represents the preliminary detection of vehicles in the image.
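The density-peaks procedure of steps 5.1 to 5.6 can be sketched as follows. Two simplifications are assumed and labeled: the patent screens centers by thresholds on ρ and δ, while this sketch takes the n_centers pixels with the largest ρ·δ product, and it assigns every pixel to its nearest center rather than only those within d_c:

```python
import numpy as np

def density_peak_clusters(features, n_centers):
    """Steps 5.1-5.6 on (N, 3) [l, a, b] features; names are illustrative."""
    n = len(features)
    # 5.1: pairwise feature values d_ij (Euclidean distance in LAB).
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=2)
    # 5.2: cut-off distance d_c = median of the upper-triangular distances.
    dc = np.median(d[np.triu_indices(n, k=1)])
    # 5.3: local density rho_i = count of points closer than d_c
    # (chi(x) = 1 when x < 0); subtract 1 to exclude the point itself.
    rho = (d < dc).sum(axis=1) - 1
    # 5.4: delta_i = min distance to any point of higher density.
    delta = np.zeros(n)
    for i in range(n):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i, higher].min() if len(higher) else d[i].max()
    # 5.5: centers = pixels with both large rho and large delta
    # (here: the largest rho * delta products, a common surrogate).
    centers = np.argsort(rho * delta)[-n_centers:]
    # 5.6: assign every pixel to its nearest chosen center (simplified).
    labels = d[:, centers].argmin(axis=1)
    return centers, labels
```

The O(N²) distance matrix is why the patent first shrinks the image and quantizes its colors.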
5.7 Cluster screening: screen the vehicle centers and detection results preliminarily obtained in steps 5.5 and 5.6. Specifically, according to the aspect ratio k of actual vehicles in the image, screen the aspect ratios of the clusters obtained in step 5.6 and remove clusters whose aspect ratio deviates greatly from k.
6. Vehicle segmentation image generation: segment the vehicles from the image or frame them in it. Specifically, according to the vehicle detection result of step 5, select the coordinates of the outermost pixels of each cluster to form a rectangle, completing the segmentation and generating the vehicle segmentation image.
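Steps 5.7 and 6 in sketch form; the ratio k_ratio and tolerance tol are illustrative placeholders, since the patent leaves both as application-specific values:

```python
import numpy as np

def screen_and_box(coords_by_cluster, k_ratio=0.45, tol=0.2):
    """Keep clusters whose bounding-box aspect ratio is close to the
    expected vehicle ratio k, and return their rectangles.

    coords_by_cluster: list of (N_i, 2) arrays of (row, col) pixel coords.
    """
    boxes = []
    for pts in coords_by_cluster:
        r0, c0 = pts.min(axis=0)          # outermost pixels of the cluster
        r1, c1 = pts.max(axis=0)
        h, w = r1 - r0 + 1, c1 - c0 + 1
        ratio = h / w
        # 5.7: drop clusters whose aspect ratio deviates too far from k.
        if abs(ratio - k_ratio) <= tol:
            boxes.append((r0, c0, r1, c1))  # step 6: rectangle from extremes
    return boxes
```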
The beneficial effects of the invention are as follows:
by reducing the image size and quantizing colors, the invention greatly reduces the amount of computation and improves efficiency while maintaining detection accuracy;
in the proposed color quantization algorithm, continuous iteration makes the quantized color values more accurate, improving the effect of subsequent detection.
The proposed vehicle detection algorithm requires no predefined number of clusters, only one parameter with low sensitivity; compared with other density-based methods, it is computationally more efficient.
The word "preferred" as used herein means serving as an example, instance, or illustration. Any aspect or design described herein as "preferred" is not necessarily to be construed as advantageous over other aspects or designs; rather, the word is intended to present concepts in a concrete manner. The term "or" is intended to mean an inclusive "or" rather than an exclusive "or": unless otherwise specified or clear from context, "X employs A or B" naturally includes either permutation; that is, if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied in any of the foregoing instances.
Moreover, although the disclosure has been shown and described with respect to one or more implementations, equivalent variations and modifications will occur to those skilled in the art upon reading and understanding this specification and the drawings. The disclosure includes all such modifications and variations and is limited only by the scope of the appended claims. In particular, regarding the various functions performed by the above components (e.g. elements), the terms used to describe such components are intended to correspond to any component that performs the specified function of the component (i.e. is functionally equivalent), even if not structurally equivalent to the disclosed structure that performs the function in the exemplary implementations shown herein. Furthermore, although a particular feature may have been disclosed with respect to only one of several implementations, such a feature may be combined with one or more other features of the other implementations as may be desired and advantageous for a given or particular application. To the extent that the terms "includes", "having", "has", "with", or variants thereof are used in the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising".
The functional units in the embodiments of the invention may be integrated into one processing module, may exist physically on their own, or two or more units may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module; if implemented as a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like. Each of the above devices or systems may perform the method of the corresponding method embodiment.
In summary, the above embodiment is one implementation of the invention, but the implementations of the invention are not limited by it; any change, modification, substitution, combination, or simplification that departs from the spirit and principles of the invention should be an equivalent replacement and falls within the scope of protection of the invention.

Claims (7)

  1. A vehicle detection method based on a clustering algorithm, characterized by comprising the following steps:
    preprocessing the image, including filtering and denoising the image and adjusting its size;
    reducing the number of pixel colors through color quantization of the image;
    converting the image from the RGB color space to the LAB color space, extracting the color feature vector of each image pixel in the LAB color space, and assembling the color feature vectors into a feature-value matrix;
    computing each pixel's local density and its distance from higher-density points, selecting as cluster centers the pixels whose local density exceeds a threshold and whose distance from higher-density points exceeds a threshold, assigning the remaining pixels to those cluster centers, and clustering the pixels;
    generating a vehicle segmentation image from the clustering result.
  2. The vehicle detection method based on a clustering algorithm according to claim 1, characterized in that the filtering and denoising method is median filtering, which removes impulse noise and salt-and-pepper noise while preserving the edge details of the image; the image is resized to reduce the number of similarity-measure computations and increase running speed.
  3. The vehicle detection method based on a clustering algorithm according to claim 1, characterized in that the color quantization comprises the steps of:
    S1: randomly select K RGB components from the image, M_k = [R′_k, G′_k, B′_k], where k indexes the K components and R′_k, G′_k, B′_k are the R, G, B components of the K selected points;
    S2: compute the color distance between each pixel and the K selected RGB components
    d_kj = √((R_j − R′_k)² + (G_j − G′_k)² + (B_j − B′_k)²)
    where k indexes the K RGB components, j indexes the pixels, and d_kj is the color distance between the j-th pixel and the k-th RGB component;
    S3: classify the pixels by the color distances, as follows:
    compare each pixel's K computed color distances and assign the pixel to the category of the RGB component with the smallest color distance;
    S4: from the classification result, compute the color average of all pixels in each category and replace the value of M_k with the computed average;
    S5: compute the color distance between the pixel values of each category and the newly computed K category pixel values, and judge whether the classification of any pixel changes; if no pixel in any category changes its classification, proceed to step S6, otherwise jump to step S2;
    S6: using the K category pixel values and the pixels in each category obtained in steps S1 to S5, replace the RGB components of every pixel in a category with the RGB components of that category's pixel value, completing the color quantization.
  4. The vehicle detection method based on a clustering algorithm according to claim 1, characterized in that the image is converted from the RGB color space to the LAB color space as follows:
    apply a first Gamma correction to the RGB components to obtain the Gamma-corrected color components Rg, Gg, Bg;
    convert the Gamma-corrected color components to the XYZ color space to obtain the XYZ color components X, Y, Z;
    apply a second Gamma correction to the color components X, Y, Z to obtain the corrected XYZ components Xl, Yl, Zl;
    convert Xl, Yl, Zl into the color components l, a, b of the LAB color space;
    extract the color information of the image pixels in the LAB color space and generate the color feature vector of each pixel.
  5. The vehicle detection method based on a clustering algorithm according to claim 4, characterized in that the first Gamma correction method is as follows:
    Figure PCTCN2022081197-appb-100002
    where x is one of the original color components R, G, B;
    the second Gamma correction formula is as follows:
    Figure PCTCN2022081197-appb-100003
    where y is one of the XYZ color components X, Y, Z;
    the formula converting the Gamma-corrected color components into the XYZ color space is as follows:
    [X, Y, Z] = M·[Rg, Gg, Bg]
    where X, Y, Z are the XYZ color components and
    Figure PCTCN2022081197-appb-100004
    the formulas converting Xl, Yl, Zl into the LAB color components l, a, b are as follows:
    Figure PCTCN2022081197-appb-100005
    a = 500(Xl − Yl)
    b = 500(Yl − Zl)
  6. The vehicle detection method based on a clustering algorithm according to claim 1, characterized in that computing each pixel's local density and its distance from higher-density points, selecting as cluster centers the pixels whose local density exceeds a threshold and whose distance from higher-density points exceeds a threshold, assigning the remaining pixels to those cluster centers, and clustering the pixels comprises:
    computing the feature value of the color feature vectors according to the following formula and assembling all feature values into the feature-value matrix D:
    d_ij = √((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)
    where i denotes the i-th pixel, j the j-th pixel, d_ij the feature value between pixels i and j, and l, a, b the pixel's three parameters in the LAB color space;
    sorting the upper triangular part of the feature-value matrix D in ascending order and computing the cut-off distance d_c by the median method;
    computing the local density ρ_i of each pixel according to:
    ρ_i = Σ_j χ(d_ij − d_c)
    where i is the index of the current pixel, j indexes the other pixels, d_ij is the feature value between pixels i and j, and d_c is the cut-off distance;
    χ is defined as:
    χ(x) = 1 if x < 0, and χ(x) = 0 otherwise;
    computing the distance from higher-density points δ_i according to:
    δ_i = min(d_ij)
    where d_ij is the feature value between pixels i and j, i is the index of the current pixel, and j indexes the other pixels with ρ_i < ρ_j;
    screening by the local density and the distance from higher-density points, and selecting pixels with both a large local density and a large distance from higher-density points as cluster centers;
    taking each determined cluster-center pixel as a center, assigning every pixel whose distance to that cluster-center pixel is less than the cut-off distance d_c to that cluster center according to the local-density formula, completing the pixel clustering;
    according to the aspect ratio k of actual vehicles in the image, screening the aspect ratios of the clusters and removing clusters whose aspect ratio deviates greatly from k.
  7. The vehicle detection method based on a clustering algorithm according to claim 1, characterized in that generating the vehicle segmentation image from the clustering result comprises: according to the vehicle detection result, selecting the coordinates of the outermost pixels of each cluster to form a rectangle, generating the vehicle segmentation image.
PCT/CN2022/081197 2021-12-14 2022-03-16 A vehicle detection method based on a clustering algorithm WO2023108933A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111542446.3 2021-12-14
CN202111542446.3A CN114463570A (zh) 2021-12-14 2021-12-14 A vehicle detection method based on a clustering algorithm

Publications (1)

Publication Number Publication Date
WO2023108933A1 true WO2023108933A1 (zh) 2023-06-22

Family

ID=81406676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081197 WO2023108933A1 (zh) 2021-12-14 2022-03-16 一种基于聚类算法的车辆检测方法

Country Status (2)

Country Link
CN (1) CN114463570A (zh)
WO (1) WO2023108933A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116597188A (zh) * 2023-07-17 2023-08-15 山东北国发展集团有限公司 A vision-based solid-waste recycling method and system
CN116858991A (zh) * 2023-09-04 2023-10-10 济宁华晟服装股份有限公司 A monitoring method for the desizing treatment of cotton fabrics
CN117173175A (zh) * 2023-11-02 2023-12-05 湖南格尔智慧科技有限公司 A superpixel-based image similarity detection method
CN117689768A (zh) * 2023-11-22 2024-03-12 武汉纺织大学 A natural-scene-driven garment template coloring method and system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274405B (zh) * 2023-11-22 2024-02-02 深圳市蓝方光电有限公司 A machine-vision-based method for detecting the working color of LED lamps

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100231995A1 (en) * 2009-03-11 2010-09-16 Canon Kabushiki Kaisha Image processing apparatus, image processing apparatus control method, and storage medium storing program
CN104899899A (zh) * 2015-06-12 2015-09-09 天津大学 A color quantization method based on density peaks
CN107729812A (zh) * 2017-09-18 2018-02-23 南京邮电大学 A vehicle color recognition method for surveillance scenes
CN107766878A (zh) * 2017-09-28 2018-03-06 北京华航无线电测量研究所 A dangerous-goods detection method based on K-means clustering in the Lab color space
CN109035254A (zh) * 2018-09-11 2018-12-18 中国水产科学研究院渔业机械仪器研究所 A moving-fish shadow removal and image segmentation method based on improved K-means clustering


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116597188A (zh) * 2023-07-17 2023-08-15 山东北国发展集团有限公司 A vision-based solid-waste recycling method and system
CN116597188B (zh) * 2023-07-17 2023-09-05 山东北国发展集团有限公司 A vision-based solid-waste recycling method and system
CN116858991A (zh) * 2023-09-04 2023-10-10 济宁华晟服装股份有限公司 A monitoring method for the desizing treatment of cotton fabrics
CN116858991B (zh) * 2023-09-04 2023-12-01 济宁华晟服装股份有限公司 A monitoring method for the desizing treatment of cotton fabrics
CN117173175A (zh) * 2023-11-02 2023-12-05 湖南格尔智慧科技有限公司 A superpixel-based image similarity detection method
CN117173175B (zh) * 2023-11-02 2024-02-09 湖南格尔智慧科技有限公司 A superpixel-based image similarity detection method
CN117689768A (zh) * 2023-11-22 2024-03-12 武汉纺织大学 A natural-scene-driven garment template coloring method and system
CN117689768B (zh) * 2023-11-22 2024-05-07 武汉纺织大学 A natural-scene-driven garment template coloring method and system

Also Published As

Publication number Publication date
CN114463570A (zh) 2022-05-10

Similar Documents

Publication Publication Date Title
WO2023108933A1 (zh) A vehicle detection method based on a clustering algorithm
CN108985186B (zh) 一种基于改进YOLOv2的无人驾驶中行人检测方法
CN111080620B (zh) 一种基于深度学习的道路病害检测方法
CN107563372B (zh) 一种基于深度学习ssd框架的车牌定位方法
WO2017190574A1 (zh) 一种基于聚合通道特征的快速行人检测方法
CN109684922B (zh) 一种基于卷积神经网络的多模型对成品菜的识别方法
CN109190444B (zh) 一种基于视频的收费车道车辆特征识别***的实现方法
CN110688987A (zh) 一种行人位置检测与跟踪方法及***
CN109255326B (zh) 一种基于多维信息特征融合的交通场景烟雾智能检测方法
CN112836713A (zh) 基于图像无锚框检测的中尺度对流***识别与追踪方法
CN101470809B (zh) 一种基于扩展混合高斯模型的运动目标检测方法
CN107273832B (zh) 基于积分通道特征与卷积神经网络的车牌识别方法及***
CN107315990B (zh) 一种基于xcs-lbp特征的行人检测算法
CN113724231A (zh) 一种基于语义分割和目标检测融合模型的工业缺陷检测方法
Li et al. Robust vehicle detection in high-resolution aerial images with imbalanced data
CN110969171A (zh) 基于改进卷积神经网络的图像分类模型、方法及应用
CN109741358B (zh) 基于自适应超图学习的超像素分割方法
CN113435319B (zh) 一种联合多目标跟踪和行人角度识别的分类方法
CN114332921A (zh) A pedestrian detection method using a Faster R-CNN network with an improved clustering algorithm
CN111915583A (zh) 复杂场景中基于车载红外热像仪的车辆和行人检测方法
CN110889360A (zh) 一种基于切换卷积网络的人群计数方法及***
CN114529581A (zh) 基于深度学习及多任务联合训练的多目标跟踪方法
Orozco et al. Vehicular detection and classification for intelligent transportation system: A deep learning approach using faster R-CNN model
Babaei Vehicles tracking and classification using traffic zones in a hybrid scheme for intersection traffic management by smart cameras
Ren et al. Lane Detection in Video‐Based Intelligent Transportation Monitoring via Fast Extracting and Clustering of Vehicle Motion Trajectories

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22905704

Country of ref document: EP

Kind code of ref document: A1