CN111275698B - Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation


Info

Publication number
CN111275698B
CN111275698B
Authority
CN
China
Prior art keywords
image
value
road
gray value
gray
Prior art date
Legal status
Active
Application number
CN202010087143.6A
Other languages
Chinese (zh)
Other versions
CN111275698A (en)
Inventor
黄鹤
茹锋
王会峰
程慈航
胡凯益
陈永安
郭璐
许哲
黄莺
惠晓滨
Current Assignee
Xi'an Huizhi Information Technology Co ltd
Original Assignee
Xi'an Huizhi Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Huizhi Information Technology Co., Ltd.
Priority to CN202010087143.6A
Publication of CN111275698A
Application granted
Publication of CN111275698B

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation
    • G06T 2207/30192 Weather; Meteorology
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a foggy-day road visibility detection method based on unimodal offset maximum entropy threshold segmentation. The method acquires a road traffic image in foggy weather and converts it to a grayscale image; computes the gray-level distribution to find the gray value where the distribution is densest; starting from that value, shifts leftwards one gray level at a time while computing the entropy corresponding to each gray value, until a threshold meeting the road segmentation requirement is obtained; performs maximum entropy road segmentation with that threshold; extracts a connected band from the segmented image; finds the gray-value abrupt-change points inside the connected band and takes their median to obtain the sky-ground dividing line; and finally obtains the value v_i from that line and solves the atmospheric visibility. The method segments the gray image of the original image well, and alleviates the slow computation and low accuracy of methods based on region growing and inflection-point detection.

Description

Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a foggy road visibility detection method based on unimodal deviation maximum entropy threshold segmentation.
Background
Visibility drops markedly in foggy weather, and the human eye tends to overestimate the visible distance, so drivers travel too fast and traffic accidents follow. Measuring visibility in real time therefore lets drivers keep the vehicle speed within a suitable range, and plays an important role in air and ground traffic safety under adverse weather conditions. At present, detection methods based on multiple images of the same scene and methods based on a single image receive wide attention from experts at home and abroad. Because multi-image methods are not real-time, single-image fog detection has become a research hot spot. One existing approach defines a visible-pixel model based on the four-neighbourhood contrast in the video image and the brightness characteristics of road-surface pixels, and uses camera calibration to compute the farthest distance from the visible pixels to the camera, realising visibility measurement without manual marking; its drawback is that road-surface marking features (such as lane lines and road signs) must be extracted, and these features cannot be extracted reliably in complex scenes or under occlusion, which affects the efficiency and accuracy of the visibility measurement. To address these problems, another visibility detection method combines the foggy-day imaging model with the atmospheric contrast attenuation model. First, the second derivative of the foggy imaging model is taken to obtain the relation between the position of the inflection point of the image gray-value change and the atmospheric extinction coefficient; then a region-growing algorithm locates the inflection point in the video image and yields the value of the extinction coefficient; finally the extinction coefficient is substituted into the atmospheric contrast attenuation model to measure the visibility. This method avoids dependence on road markings and is more robust; however, its region-growing algorithm is sensitive to noise and has low accuracy, and the inflection-point detection requires continuous iteration, so the computational complexity is high and the real-time requirement cannot be met.
Disclosure of Invention
The invention provides a foggy-day road visibility detection method based on unimodal offset maximum entropy thresholding that overcomes the defects of the prior art. By adopting a unimodal offset maximum entropy threshold segmentation algorithm, the method segments the gray image of the original image well; it also introduces the concepts of a connected band and abrupt-change points, thereby improving on the slow computation and low accuracy of region-growing and inflection-point algorithms, and shows good performance.
In order to achieve the above purpose, the invention adopts the following technical scheme:
the method for detecting the visibility of the road in the foggy weather based on the peak shift maximum entropy threshold segmentation comprises the following steps:
step 1: acquiring a road traffic image in haze weather;
step 2: carrying out gray value processing on the traffic image obtained in the step 1 to obtain a gray value image;
step 3: calculating the gray value image in the step 2 to obtain the gray value of the area with the most dense gray value distribution;
step 4: performing offset operation according to the gray values calculated in the step 3, namely calculating entropy values by taking the gray values obtained in the step 3 as an origin, then gradually offsetting one unit gray value leftwards, simultaneously calculating the entropy value corresponding to each point gray value, and obtaining a threshold value meeting the road segmentation requirement;
step 5: carrying out maximum entropy road segmentation according to the threshold value obtained in the step 4;
step 6: Acquiring a connected band from the image after road segmentation in step 5;
step 7: Solving the gray-value abrupt-change points in the connected band region obtained in step 6, and taking their median to obtain the sky-ground dividing line;
step 8: Deriving the value v_i, i.e. the vertical coordinate of the sky-ground dividing line in the image, from the dividing line obtained in step 7, and further solving the foggy-weather visibility.
Further, in step 3, the gray-value distribution of the image is obtained from the grayscale image, giving the gray value corresponding to the region where the distribution is densest, i.e. the highest peak of the gray-level distribution, so that this region is found effectively and rapidly.
Further, in step 4, the image entropy is first calculated at the highest peak of the gray-value distribution; the gray value is then shifted leftwards one level at a time, the corresponding entropy is calculated and compared with the entropy at the highest peak, and the process is repeated until the entropy corresponding to the current gray value is 1.05-1.25 times the entropy at the highest peak, which gives the threshold meeting the road segmentation requirement.
Further, in step 5, road segmentation is performed on the foggy-day road traffic map obtained in step 1 by using a threshold value meeting the road segmentation requirement to obtain a road region.
Further, in step 6, the connected band is the region in which the sky is connected with the road after the road region has been segmented. The connected band is solved as follows: first, the pixel values of each column of the image matrix processed by the unimodal offset maximum entropy threshold segmentation algorithm are summed, and the sum of each column is stored in a new array; second, the elements of the new array are compared and the largest element is found, whose position is the column with the largest column-pixel sum in the image, i.e. the position closest to the centre of the road; then a range of column sums is determined to obtain all columns meeting the condition, the columns meeting the condition are drawn in the original image and only the leftmost and rightmost vertical lines are kept, and the region sandwiched between these two vertical lines is the connected band.
Further, in step 7, difference processing is carried out inside the connected band to obtain the points where the pixel value changes obviously, i.e. the gray-value abrupt-change points, specifically: first, the pixel value of each point of the grayscale image obtained in step 2 is stored in a matrix; second, the position coordinates of the two vertical lines of the connected band are found in the matrix; finally, taking these two positions as boundaries, all elements between them are regarded as a new matrix, the connected-band matrix. For each column of the connected-band matrix the difference between vertically adjacent elements is computed; a threshold is set, and when the difference of pixel values is larger than the threshold, the coordinate of the upper element of the adjacent pair is marked on the grayscale image and judged to be an abrupt-change point. When the marked coordinate points gather or join into one piece, the position of the sky-ground dividing line can be deduced; if this judgment condition cannot be met, a prompt is given that the image does not meet the requirement.
Further, the foggy-day visibility V in step 8 is calculated by the following formula:
V = 3Hf/[2(v_i - v_h)cos(θ)]
where H is the height of the camera above the ground plane, θ is the angle between the camera optical axis and the ground plane, f is the effective focal length of the camera, and v_h is the vertical coordinate of the horizon in the image.
Compared with the prior art, the invention has the following beneficial technical effects:
according to the method, the visibility value under the heavy fog weather is calculated by combining a visibility calculation model, an image segmentation algorithm and a gray value mutation point detection result, and the running time and the detection error are obviously optimized. When the image segmentation is carried out, the traditional image segmentation method generally needs to process the whole image or 256-point gray values, and on the basis of the traditional maximum entropy threshold segmentation algorithm, the invention can only process partial images and images corresponding to partial gray values on the premise of obtaining better image processing effect, thereby greatly reducing the calculated amount and the running time. Meanwhile, the image segmented by the method is subjected to target object extraction, so that the running time can be shortened again while part of noise is eliminated. And finally, on the basis of extracting the target object, according to the design of the mutation point concept, a more accurate detection result can be obtained, and the result is superior to the traditional foggy day visibility detection algorithm.
Drawings
FIG. 1 is a camera model in a traffic scene;
FIG. 2 is a schematic flow chart of the present invention;
FIG. 3 is a flow chart of a unimodal shifted maximum entropy threshold segmentation algorithm;
fig. 4 compares foggy-day visibility detection results: (a), (c), (e), (g) and (i) are images processed by the proposed abrupt-change-point foggy-day road visibility detection algorithm based on unimodal offset maximum entropy threshold segmentation, and (b), (d), (f), (h) and (j) are images processed by a fog visibility detection algorithm based on the inflection-point line.
Detailed Description
The invention is described in further detail below with reference to the attached drawing figures:
referring to fig. 2 and 3, the invention provides a foggy road visibility detection method based on a unimodal offset maximum entropy threshold segmentation, which uses a unimodal offset maximum entropy threshold segmentation algorithm to process images so as to realize accurate and rapid road segmentation. The concrete thought of solving the heaven and earth wires is as follows: by utilizing the principle of image segmentation, roads and sky in the image are separated from other elements by combining road features, and a white area which is communicated with the sky and the roads is obtained in the image after ideal processing. The mutation point solving is carried out in the connecting band, so that the data volume of algorithm operation can be greatly reduced, the operation time is shortened, and the algorithm efficiency is improved. The elements such as the road and the scenery in the image have continuity, namely the gray value generally does not have abrupt change phenomenon, and according to the characteristics of the earth and the sky, the gray value near the earth and the sky can have obvious abrupt change, namely a large number of abrupt change points exist. Therefore, according to the property, a deviation threshold is set to obtain the mutation point, and finally, the earth and the sky line are drawn.
The method comprises the following specific steps:
step 1, obtaining a road image in haze weather: and obtaining the road image in the haze weather to be processed by using the image acquisition equipment.
And 2, converting the image obtained in the step 1 into a gray level image, and waiting for the next processing.
The maximum entropy threshold segmentation algorithm calculates the total entropy of the image under every candidate segmentation threshold, finds the maximum entropy, and takes the corresponding segmentation threshold as the final threshold; pixels whose gray level is larger than the threshold are taken as foreground, and the remaining pixels as background. The unimodal offset algorithm can quickly find the optimal threshold k and the corresponding entropy according to the distribution characteristics of the image gray values. First, in order to use the unimodal offset maximum entropy threshold segmentation algorithm, the image must be converted to gray values so that the gray-value distribution of the image can be obtained.
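As an illustrative sketch of this criterion (not part of the original patent text; it assumes an 8-bit grayscale image supplied as a NumPy array, and the function names are chosen here only for illustration), the total entropy under a candidate threshold and the traditional exhaustive search can be written as:

import numpy as np

def total_entropy(hist, t):
    # Sum of background and foreground entropies for candidate threshold t.
    p = hist / hist.sum()
    h = 0.0
    for part in (p[:t + 1], p[t + 1:]):
        w = part.sum()
        if w > 0:
            q = part[part > 0] / w      # class-conditional gray-level probabilities
            h -= np.sum(q * np.log(q))  # Shannon entropy of the class
    return h

def max_entropy_threshold(gray):
    # Traditional approach: evaluate every possible threshold and keep the maximum.
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    return max(range(255), key=lambda t: total_entropy(hist, t))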
Step 3, calculating the grayscale image from step 2 to obtain the gray value of the region where the gray-value distribution is densest.
In step 3, the gray-value distribution of the image is obtained from the grayscale image, giving the gray value where the distribution is densest, i.e. the highest peak of the gray-level distribution, so that this region is found effectively and rapidly.
Step 4, performing the left offset operation starting from the value calculated in step 3, while calculating the entropy corresponding to each gray value, and obtaining a threshold meeting the road segmentation requirement.
In step 4, the image entropy is calculated taking the highest peak of the gray-value distribution as the starting point; the gray value is shifted one level to the left, the corresponding entropy is calculated and compared with the entropy at the highest peak, and the process is repeated until an entropy meeting the road segmentation requirement is obtained. The exact ratio between the entropy at the highest peak and the entropy suited to the image differs from image to image, but the entropy corresponding to the optimal gray value is 1.05-1.25 times the entropy at the highest peak. The traditional maximum entropy threshold segmentation algorithm has to evaluate all 256 gray levels and contains a large number of useless operations; the unimodal offset algorithm only needs to calculate the entropies of the gray levels between the highest peak and the suitable entropy, saving at least half of the image segmentation time.
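A sketch of this unimodal offset search (reusing total_entropy and numpy from the previous sketch; the stopping ratio of 1.15 is an assumed value inside the 1.05-1.25 range given in the text, not a value fixed by the patent):

def unimodal_offset_threshold(gray, ratio=1.15):
    # Step 3: gray level where the distribution is densest (highest histogram peak).
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    peak = int(np.argmax(hist))
    h_peak = total_entropy(hist, peak)
    # Step 4: shift one gray level to the left at a time until the entropy
    # reaches `ratio` times the entropy at the peak.
    t = peak
    while t > 0 and total_entropy(hist, t) < ratio * h_peak:
        t -= 1
    return t

# Step 5 (illustration): pixels brighter than the threshold form the white road/sky region.
# road = (gray > unimodal_offset_threshold(gray)).astype(np.uint8) * 255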
Step 5, carrying out maximum entropy road segmentation with the threshold obtained in step 4.
In step 5, road segmentation is performed on the foggy road traffic map by using the threshold value obtained by the unimodal shift operation to obtain a road area.
Step 6, acquiring a connected band from the image after road segmentation in step 5.
The connected band is obtained from the image as follows. The concept of a connected band is introduced: the foggy-day road traffic image is segmented with the unimodal offset maximum entropy threshold segmentation algorithm, and after the road region is segmented (the segmented road region is displayed in white) a white region in which the sky is connected with the road is obtained. The design idea is: first, by programming, the pixel values of each column of the image matrix processed by the unimodal offset maximum entropy threshold segmentation algorithm are summed, and the sum of each column is stored in a new array; second, the elements of the new array are compared and the largest element is found, whose position is the column with the largest column-pixel sum in the image, i.e. the position closest to the centre of the road; then a range of column sums is determined, for example all columns whose sum is no larger than the maximum column sum and larger than 0.97 times that maximum, which gives a number of columns meeting the condition. These columns are drawn in the original image and only the leftmost and rightmost vertical lines are kept; the region sandwiched between these two vertical lines is the required connected band. By changing the size of this parameter, the width of the connected band can conveniently be changed, trading off operation accuracy against running time. After a suitable connected band has been selected from the image obtained by the unimodal offset maximum entropy threshold segmentation algorithm, the corresponding band of the grayscale image is processed in the next step.
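A minimal sketch of this connected-band extraction (assuming `binary` is the white road segmentation result as a 0/255 NumPy array; the 0.97 factor follows the example given in the text, and the names are illustrative):

def connected_band(binary, factor=0.97):
    # Sum of pixel values of each column of the segmented image.
    col_sums = binary.astype(np.int64).sum(axis=0)
    # The largest column sum marks the column closest to the centre of the road.
    best = col_sums.max()
    # Keep every column whose sum exceeds factor * best, then only the
    # leftmost and rightmost of them; the region in between is the connected band.
    candidates = np.where(col_sums > factor * best)[0]
    return int(candidates.min()), int(candidates.max())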
Step 7, solving the gray-value abrupt-change points in the connected band obtained in step 6, and taking their median to obtain the sky-ground dividing line.
In step 7, difference processing is carried out inside the selected connected band to obtain the points where the pixel value changes obviously, i.e. the gray-value abrupt-change points. First, the pixel value of each point of the grayscale image is stored in a matrix; second, the position coordinates of the two vertical lines of the connected band are found in this matrix; finally, taking the two coordinates as boundaries, all elements between them are regarded as a new matrix, and for each column of this new matrix the difference between adjacent elements is computed. A threshold is given manually, and when the difference of pixel values is larger than the threshold, the coordinate of the corresponding element is marked on the image. The marked coordinate points are the gray-value abrupt-change points, i.e. the points closest to the sky-ground line; when these points gather or join together, the approximate position of the sky-ground line can be deduced. To reduce the error of the extracted sky-ground line, the median of the gray-value abrupt-change points inside the connected band is taken. Some interference remains in the experimental pictures, for example white lane lines, white street lamps and thick tree trunks. To eliminate this interference as far as possible, the median is clearly the best choice: among the abrupt-change points obtained above, some are always distributed around the interference, but the interference is only a small part, and most abrupt-change points are distributed around the sky-ground boundary. Taking the median, unlike taking the mean, removes the small part of the error caused by the interference and therefore gives a more accurate sky-ground line. Finally, the atmospheric visibility is calculated from the ordinate v_i of the sky-ground line.
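A sketch of this abrupt-change-point search and the median step (assuming `gray` is the grayscale image from step 2 and `left`, `right` come from the connected-band sketch above; the jump threshold of 20 gray levels is an assumed value, since the patent only says the threshold is given manually):

def sky_ground_line(gray, left, right, jump=20):
    # Connected-band matrix: all columns of the grayscale image between the two vertical lines.
    band = gray[:, left:right + 1].astype(np.int16)
    # Difference of vertically adjacent elements in every column.
    diffs = np.abs(np.diff(band, axis=0))
    # Rows of the upper elements whose difference exceeds the threshold: abrupt-change points.
    rows, _ = np.where(diffs > jump)
    if rows.size == 0:
        raise ValueError("image does not meet the requirement: no abrupt-change points found")
    # The median of the abrupt-change rows suppresses lane lines, street lamps and tree trunks.
    return int(np.median(rows))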
Step 8: obtaining the value v_i from the sky-ground dividing line obtained in step 7, and further solving the atmospheric visibility.
The value v_i obtained from the sky-ground dividing line of step 7 is used to solve the visibility. The principle of the process is as follows. Koschmieder states that there is the following relationship between the inherent light intensity of an object and the observed light intensity:
I(x) = t(x)J(x) + (1 - t(x))A (1)
where A represents the atmospheric light intensity and t(x) represents the scene transmissivity:
t(x) = e^(-kd(x)) (2)
where k represents the extinction coefficient. According to equation (2), for road areas and sky areas the pixel intensities change in a hyperbolic manner from top to bottom of the image, because the texture and illumination are uniform.
Substituting formula (2) into formula (1):
I(x) = e^(-kd(x))J(x) + (1 - e^(-kd(x)))A (3)
stewart et al demonstrate that the contrast decays with increasing distance in the following manner:
C=C 0 e -kd(x) (4)
wherein C is expressed in distanced contrast presented by the object, C 0 Representing the inherent contrast of the object against the background. The formula is only applicable to homogeneous foggy days. For an object to be barely visible C must be greater than a minimum threshold epsilon. In practical terms, the international adoption of the contrast threshold epsilon=0.05 defines the meteorological visibility distance, i.e. the maximum distance that an object can see with a suitable size when the background contrast is 1.
The camera used in the invention is erected in the middle of the road; the camera model in the traffic scene is shown in fig. 1. Coordinates are given as (u, v) in the image pixel coordinate system, where u and v denote the column and row of a pixel. Let the projection of the optical axis on the image plane be (u_0, v_0), let θ denote the angle between the camera optical axis and the horizon, and let v_h denote the vertical coordinate of the projection of the horizon. The camera intrinsic parameters include the focal length f and the horizontal and vertical dimensions t_u and t_v of a unit pixel. Let a_u = f/t_u and a_v = f/t_v; in general a_u and a_v are approximately equal, so let a_u = a_v = a.
By using a pinhole model, the coordinates of three-dimensional points in a scene can be found according to the following formula:
u = u_0 + aX/Z, v = v_0 + aY/Z (5)
where (X, Y, Z) are the coordinates of the point in the camera coordinate system.
In the image plane, the horizon line can be represented as:
v_h = v_0 - a·tanθ (6)
Combining formulas (5) and (6) yields:
v - v_h = a(Y/Z + tanθ) (7)
In the world coordinate system (S, X_w, Y_w, Z_w), formula (7) becomes:
v - v_h = aY_w/[cosθ·(Z_w cosθ + Y_w sinθ)] (8)
On the road surface, assuming a point M at a distance S from the origin, the coordinates of M satisfy:
Y_w = H, Z_w = S (9)
Substituting formula (9) into formula (8):
v - v_h = aH/[cosθ·(S cosθ + H sinθ)] (10)
The distance information of points in the road plane can then be calculated from formula (10). For any pixel point (u, v) in the image with v > v_h, the distance d to the camera can be defined by the following expression:
d = λ/(v - v_h) (11)
wherein
λ = Hf/cosθ
where H is the height of the camera above the ground plane, θ is the angle between the camera optical axis and the ground plane, f is the effective focal length of the camera, and v_h is the vertical coordinate of the horizon (or vanishing point) in the image.
The image optical model of foggy days is shown as follows:
I(x)=J(x)exp(-βd(x))+A(1-exp(-βd(x))) (12)
where I is the observed brightness of the object, J is the inherent brightness of the object itself, beta is the scattering coefficient of the atmosphere, and A is the intensity of the atmosphere light.
Substituting formula (11) into formula (12) and taking the second derivative with respect to v:
d²I/dv² = βλ(J - A)e^(-βλ/(v - v_h))·[βλ/(v - v_h) - 2]/(v - v_h)³ (13)
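This step can be checked symbolically; the following small sketch (an illustration using SymPy, not part of the patent) substitutes d = λ/(v - v_h) into formula (12), sets the second derivative with respect to v to zero, and reproduces the two solutions discussed in the next paragraph:

import sympy as sp

v, v_h, lam, beta, J, A = sp.symbols('v v_h lambda beta J A')
d = lam / (v - v_h)                                      # distance model, formula (11)
I = J * sp.exp(-beta * d) + A * (1 - sp.exp(-beta * d))  # fog imaging model, formula (12)
solutions = sp.solve(sp.Eq(sp.diff(I, v, 2), 0), beta)
print(solutions)  # expected: [0, 2*(v - v_h)/lambda], i.e. beta = 2/d at the inflection point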
let equation (3) equal to zero yields two solutions, one is meaningless β=0; the other is β=2 (v i -v h )/λ=2/d(v i )。v i Representing coordinates of an inflection point of the gray value change in the image coordinate system along the vertical direction of the image. The atmospheric extinction coefficient is approximately equal to the atmospheric scattering coefficient in the foggy environment.
The foggy day visibility V can be calculated by the following formula:
V = 3Hf/[2(v_i - v_h)cos(θ)] (14)
From the calculation model shown in formula (14), the foggy-day visibility can be calculated as long as the position of the abrupt-change point of the image gray value in the vertical direction is detected. H is the height of the camera above the ground plane, θ is the angle between the camera optical axis and the ground plane, f is the effective focal length of the camera, and v_h is the vertical coordinate of the horizon (or vanishing point) in the image, obtained from real-world measurement.
As can be seen from fig. 4, (a), (c), (e), (g) and (i) are images processed by the abrupt-change-point foggy-day road visibility detection algorithm based on unimodal offset maximum entropy threshold segmentation; the processing effect is good and the sky-ground line is calculated accurately. (b), (d), (f), (h) and (j) are images processed by the fog visibility detection algorithm based on the inflection-point line; its running time is too long and the calculated sky-ground line deviates more.

Claims (7)

1. The method for detecting road visibility in foggy weather based on unimodal offset maximum entropy threshold segmentation is characterized by comprising the following steps:
step 1: acquiring a road traffic image in haze weather;
step 2: carrying out gray value processing on the traffic image obtained in the step 1 to obtain a gray value image;
step 3: calculating the gray value image in the step 2 to obtain the gray value of the area with the most dense gray value distribution;
step 4: performing offset operation according to the gray values calculated in the step 3, namely calculating entropy values by taking the gray values obtained in the step 3 as an origin, then gradually offsetting one unit gray value leftwards, simultaneously calculating the entropy value corresponding to each point gray value, and obtaining a threshold value meeting the road segmentation requirement;
step 5: carrying out maximum entropy road segmentation according to the threshold value obtained in the step 4;
step 6: Acquiring a connected band from the image after road segmentation in step 5;
step 7: Solving the gray-value abrupt-change points in the connected band region obtained in step 6, and taking their median to obtain the sky-ground dividing line;
step 8: Deriving the value v_i, i.e. the vertical coordinate of the sky-ground dividing line in the image, from the dividing line obtained in step 7, and further solving the foggy-weather visibility.
2. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 1, wherein in step 3 the gray-value distribution of the image is obtained from the grayscale image, giving the gray value corresponding to the region where the distribution is densest, i.e. the highest peak of the gray-level distribution, so that this region is found effectively and rapidly.
3. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 2, wherein in step 4 the image entropy is calculated taking the highest peak of the gray-value distribution as the starting point; the gray value is shifted leftwards one level at a time, the corresponding entropy is calculated and compared with the entropy at the highest peak, and the process is repeated until the entropy corresponding to the current gray value is 1.05-1.25 times the entropy at the highest peak, so that the entropy meeting the road segmentation requirement is obtained.
4. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 1, wherein in step 5 the road region is obtained by performing road segmentation on the foggy-day road traffic map obtained in step 1 with the threshold meeting the road segmentation requirement.
5. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 4, wherein the connected band in step 6 is the region in which the sky is connected with the road after the road region has been segmented; the connected band is solved as follows: first, the pixel values of each column of the image matrix processed by the unimodal offset maximum entropy threshold segmentation algorithm are summed, and the sum of each column is stored in a new array; second, the elements of the new array are compared and the largest element is found, whose position is the column with the largest column-pixel sum in the image, i.e. the position closest to the centre of the road; then a range of column sums is determined to obtain all columns meeting the condition, the columns meeting the condition are drawn in the original image and only the leftmost and rightmost vertical lines are kept, and the region sandwiched between these two vertical lines is the connected band.
6. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 4, wherein in step 7 difference processing is carried out inside the connected band to obtain the points where the pixel value changes obviously, i.e. the gray-value abrupt-change points, specifically: first, the pixel value of each point of the grayscale image obtained in step 2 is stored in a matrix; second, the position coordinates of the two vertical lines of the connected band are found in the matrix; finally, taking these two positions as boundaries, all elements between them are regarded as a new matrix, the connected-band matrix; for each column of the connected-band matrix the difference between adjacent elements is computed, a threshold is set, and when the difference of pixel values is larger than the threshold, the coordinate of the upper element of the adjacent pair is marked on the grayscale image and judged to be an abrupt-change point; when the coordinate points gather or join into one piece, the position of the sky-ground dividing line is deduced; if the judgment condition cannot be met, a prompt is given that the image does not meet the requirement.
7. The method for detecting foggy-day road visibility based on unimodal offset maximum entropy threshold segmentation according to claim 4, wherein the foggy-day visibility V in step 8 is calculated by the following formula:
V = 3Hf/[2(v_i - v_h)cos(θ)]
where H is the height of the camera above the ground plane, θ is the angle between the camera optical axis and the ground plane, f is the effective focal length of the camera, and v_h is the vertical coordinate of the horizon in the image.
CN202010087143.6A 2020-02-11 2020-02-11 Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation Active CN111275698B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010087143.6A CN111275698B (en) 2020-02-11 2020-02-11 Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010087143.6A CN111275698B (en) 2020-02-11 2020-02-11 Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation

Publications (2)

Publication Number Publication Date
CN111275698A CN111275698A (en) 2020-06-12
CN111275698B (en) 2023-05-09

Family

ID=71000586

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010087143.6A Active CN111275698B (en) 2020-02-11 2020-02-11 Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation

Country Status (1)

Country Link
CN (1) CN111275698B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797848B (en) * 2023-01-05 2023-04-28 山东高速股份有限公司 Visibility detection early warning method based on video data in high-speed event prevention system
CN117094914B (en) * 2023-10-18 2023-12-12 广东申创光电科技有限公司 Smart city road monitoring system based on computer vision


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102175613A (en) * 2011-01-26 2011-09-07 南京大学 Image-brightness-characteristic-based pan/tilt/zoom (PTZ) video visibility detection method
WO2018058356A1 (en) * 2016-09-28 2018-04-05 驭势科技(北京)有限公司 Method and system for vehicle anti-collision pre-warning based on binocular stereo vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Foggy-day image visibility detection algorithm based on scene depth; Xu Min et al.; Process Automation Instrumentation (自动化仪表), No. 09; full text *
Upscaling method for maize canopy LAI based on the maximum entropy model; Su Wei et al.; Transactions of the Chinese Society of Agricultural Engineering (农业工程学报), No. 07; full text *

Also Published As

Publication number Publication date
CN111275698A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
CN107516077B (en) Traffic sign information extraction method based on fusion of laser point cloud and image data
CN102682292B (en) Method based on monocular vision for detecting and roughly positioning edge of road
US8121350B2 (en) Apparatus, method and computer program for determining a position on the basis of a camera image from a camera
CN109919951B (en) Semantic-associated object-oriented urban impervious surface remote sensing extraction method and system
CN105447838A (en) Method and system for infrared and low-level-light/visible-light fusion imaging
CN111275698B (en) Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation
US8340423B1 (en) Enhancing digital image mosaics using region-statistics
CN108596165A (en) Road traffic marking detection method based on unmanned plane low latitude Aerial Images and system
US20060290706A1 (en) Terrain map summary elements
CN113935428A (en) Three-dimensional point cloud clustering identification method and system based on image identification
TW201928913A (en) Apparatus for generating feature height-specific colored image, and program for generating feature height-specific colored image
CN103578083A (en) Single image defogging method based on joint mean shift
KR101549155B1 (en) Method of automatic extraction of building boundary from lidar data
CN113255452A (en) Extraction method and extraction system of target water body
CN111563852A (en) Dark channel prior defogging method based on low-complexity MF
CN112906616A (en) Lane line extraction and generation method
CN114998545A (en) Three-dimensional modeling shadow recognition system based on deep learning
CN109800693B (en) Night vehicle detection method based on color channel mixing characteristics
CN106097259B (en) A kind of Misty Image fast reconstructing method based on transmissivity optimisation technique
CN112598777B (en) Haze fusion method based on dark channel prior
CN112396572B (en) Composite insulator double-light fusion method based on feature enhancement and Gaussian pyramid
CN115439349A (en) Underwater SLAM optimization method based on image enhancement
CN115018859A (en) Urban built-up area remote sensing extraction method and system based on multi-scale space nesting
CN215117795U (en) Visibility determination device under low light level condition
Luo et al. Research on Traffic Signal Light Recognition Method in Complex Scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230424

Address after: Room 508, block a, Rongcheng cloud Valley, 57 Keji 3rd road, Zhangba Street office, high tech Zone, Xi'an City, Shaanxi Province, 710075

Applicant after: Xi'an Huizhi Information Technology Co.,Ltd.

Address before: 710064 middle section, south two ring road, Shaanxi, Xi'an

Applicant before: CHANG'AN University

GR01 Patent grant