CN114643926A - Method, system, device and storage medium for controlling high beam - Google Patents
Method, system, device and storage medium for controlling high beam
- Publication number
- CN114643926A (application CN202011501294.8A)
- Authority
- CN
- China
- Prior art keywords
- brightness
- image set
- high beam
- road condition
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/04—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
- B60Q1/14—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights having dimming means
- B60Q1/1415—Dimming circuits
- B60Q1/1423—Automatic dimming circuits, i.e. switching between high beam and low beam due to change of ambient light or light level in road traffic
Abstract
The embodiments of this application disclose a method, system, device, and storage medium for controlling a high beam. Information about the vehicle's driving environment is obtained by analyzing images representing the vehicle's current surroundings, and the high beam is turned on or off according to that information. The road-condition environment of the vehicle can thus be analyzed accurately, so that the moment to turn the high beam on or off is chosen intelligently. This overcomes the drawback of manual control, in which the high beam is easily turned on by mistake or left on by oversight, saves system resources, and strengthens the vehicle's driving-safety guarantees.
Description
Technical Field
The present disclosure relates to the field of vehicle control, and more particularly, to a method, a system, a device, and a storage medium for controlling a high beam.
Background
The high beam is a particularly important piece of equipment on a car. Compared with the low beam (passing lamp), high-beam light is emitted in parallel rays; it is concentrated and bright, can illuminate higher and more distant objects, and greatly helps the driver's vision at night. However, drivers often turn the high beam on by mistake or forget to turn it off, which troubles oncoming drivers and pedestrians and can even cause traffic accidents. When the eyes are struck by the strong light of a high beam, the pupil automatically contracts from about 5-8 mm to 1 mm or even smaller, so that the amount of incoming light drops to roughly 1/30 or less of its former level; after the vehicles pass, the pupil cannot recover in time, incoming light remains sharply reduced, a phenomenon similar to night blindness occurs, and many traffic accidents follow. At present the high beam is widely abused, and using it at the wrong time does not improve driving safety but instead increases the probability of dangerous accidents.
Manually controlling the high beam depends on the driver's subjective judgment. During long drives or in complex road conditions, a driver may carelessly turn the high beam on by mistake or forget to turn it off, creating safety hazards. A more intelligent and responsive high-beam control method is therefore needed.
Disclosure of Invention
An object of the present application is to provide a method, system, device, and storage medium for controlling a high beam. Images representing the current traffic conditions are processed and analyzed to determine whether a vehicle with its high beam on is present in front of the host vehicle, so that the host vehicle's high beam can be turned off or on intelligently, overcoming the drawback that manual control easily leads to the high beam being turned on by mistake or left on by oversight.
Another object of the present application is to determine first whether the overall brightness of the current road conditions meets a brightness condition, and then whether an oncoming vehicle is present, so as to analyze the vehicle's road-condition environment more accurately and choose the moment to turn the high beam on or off more intelligently. This avoids the situation where, through driver negligence, the high beam stays on in an environment that is already bright enough, thereby saving system resources and further enhancing driving safety.
In order to achieve the above objects, in a first aspect, an embodiment of the present application provides a method for controlling a high beam. The method includes: acquiring a traffic road condition image set; judging the driving environment of the vehicle according to the traffic road condition image set, where the driving environment includes the ambient brightness and whether there is an oncoming vehicle; and controlling the high beam to be turned on or off according to the driving environment.
After the images in the traffic road condition image set are analyzed, it can be determined whether the high beam should be on or off given the brightness of the current road environment, and whether a vehicle with its high beam on is present ahead, which in turn determines whether the host vehicle's high beam should be turned on or off.
By collecting images of the road environment in real time, the method judges the current road conditions and selects the moment to turn the high beam on or off more accurately and responsively. Compared with manual control, it is simpler and more convenient, and provides a greater guarantee of driving safety.
In a second aspect, an embodiment of the present application provides a high-beam control system comprising an acquisition module, a processing module, and a control module, where the acquisition module is communicatively connected to the processing module, and the processing module to the control module. The acquisition module acquires a traffic road condition image set; the processing module receives the image set from the acquisition module and judges the driving environment of the vehicle from it, where the driving environment includes the ambient brightness and whether there is an oncoming vehicle; and the control module receives the processing module's judgment of the driving environment and turns the high beam on or off accordingly.
It should be understood that the high-beam control system may be applied to cars, trucks, motorcycles, buses, recreational vehicles, amusement vehicles, construction equipment, and so on; the embodiments of the present application are not limited in this respect. The modules of the control system are interconnected and cooperate: by collecting images of the road environment in real time, the system judges the current road conditions and selects the moment to turn the high beam on or off more accurately and responsively, without relying on proper manual control, providing a greater guarantee of driving safety.
In a third aspect, an embodiment of the present application provides an electronic device comprising: a memory for storing a program; and a processor for executing the program stored in the memory, the processor being configured, when the program is executed, to perform the method of the first aspect or any optional implementation thereof.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program that includes program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect or any optional implementation thereof.
Drawings
To illustrate the technical solutions in the embodiments or the background art of the present application more clearly, the drawings needed for describing them are briefly introduced below.
Fig. 1 is a flowchart of a method for controlling a high beam provided in an embodiment of the present application;
fig. 2 is a flowchart of another high beam control method according to an embodiment of the present application;
fig. 3 is a comparison diagram of effects before and after image denoising according to the embodiment of the present application;
fig. 4 is a schematic structural diagram of a control system of a high beam provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control apparatus for a high beam provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application clearer, the present application is further described below with reference to the accompanying drawings.
The terms "first" and "second" and the like in the description, claims, and drawings of the present application are used solely to distinguish different objects, not to describe a particular order. Furthermore, the terms "comprising" and "having," and any variations thereof, are intended to cover non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those listed, but may include other steps or elements not listed or inherent to it.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearance of this phrase in various places in the specification does not necessarily refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will appreciate, both explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
In this application, "at least one" means one or more, "a plurality" means two or more, and "at least two" means two or more. "And/or" describes an association between objects and indicates three possible relationships; for example, "A and/or B" may mean: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the objects before and after it. "At least one of the following items" or similar expressions refer to any combination of those items; for example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c.
The embodiments of the present application provide a method, system, device, and storage medium for controlling a high beam. To describe the scheme more clearly, some of the knowledge involved is introduced first.
YUV image: YUV is a color encoding scheme commonly used in video processing components. When encoding photos or videos, YUV exploits human perception to allow reduced bandwidth for the chrominance components. YUV denotes a family of related color spaces; terms such as Y'UV, YUV, YCbCr, and YPbPr overlap and are often all loosely referred to as YUV. "Y" represents luminance (the gray-scale value), while "U" and "V" represent chrominance, describing the color and saturation of a pixel.
Binarization: binarization is one of the simplest methods of image segmentation; it converts a grayscale image into a binary image. Pixels whose gray level is greater than or equal to a chosen threshold are set to the maximum gray value (255), and pixels below the threshold are set to the minimum (0). In brief, a threshold is set and, for each row of the image matrix, pixel values are compared with the threshold from left to right: if a pixel's gray value is greater than or equal to the threshold, it is set to 255; otherwise it is set to 0, so that the whole image presents an obvious black-and-white appearance.
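As a minimal, illustrative sketch (not the patent's implementation; the threshold of 128 is an assumption), the thresholding step can be written in Python:

```python
def binarize(gray, threshold=128):
    """Set pixels >= threshold to 255 (white) and the rest to 0 (black)."""
    return [[255 if px >= threshold else 0 for px in row] for row in gray]

# A tiny 2x3 grayscale "image" as nested lists
gray = [[10, 200, 130],
        [140, 90, 255]]
print(binarize(gray))  # [[0, 255, 255], [255, 0, 255]]
```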
Connected domain: in general, a connected domain (connected region) is an image area composed of adjacent foreground pixels with the same pixel value. Connected component analysis is a common, basic method in many areas of image analysis and processing. It can be used wherever foreground objects must be extracted for subsequent processing, and its input is typically a binarized image.
Image erosion and dilation: erosion and dilation are two basic operations in mathematical morphology, typically applied to binary images. Erosion enlarges the dark areas (shrinking white regions), while dilation enlarges the light areas.
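To make these operations concrete, here is a small pure-Python sketch of 3x3 erosion and dilation on a binary image stored as nested lists (an illustrative toy, not the patent's code; border neighbourhoods are simply clipped):

```python
def dilate(img):
    """3x3 dilation: a pixel becomes white if any pixel in its (clipped) 3x3 neighbourhood is white."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if any(img[ny][nx]
                   for ny in range(max(0, y - 1), min(h, y + 2))
                   for nx in range(max(0, x - 1), min(w, x + 2))):
                out[y][x] = 255
    return out

def erode(img):
    """3x3 erosion: a pixel stays white only if its whole (clipped) 3x3 neighbourhood is white."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if all(img[ny][nx]
                   for ny in range(max(0, y - 1), min(h, y + 2))
                   for nx in range(max(0, x - 1), min(w, x + 2))):
                out[y][x] = 255
    return out
```

Note how erosion removes an isolated white pixel while dilation grows it; performing erosion followed by dilation is the opening operation used later for denoising.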
Automotive lighting consists mainly of the low beam and the high beam. Compared with the low beam, high-beam light is more parallel and concentrated, brighter, and reaches farther; for the driver it is the main means of illumination when driving at night. In daily driving, however, the high beam is abused for its illumination effect through carelessness or poor driving habits. Using the high beam at the wrong time not only impairs driving safety but greatly increases the probability of dangerous accidents, and manual control depends too heavily on the driver's subjective judgment, so the high beam frequently cannot be switched on and off reasonably.
In view of the above problems, embodiments of the present application provide a method for controlling a high beam, which accurately and sensitively selects a time for turning on or off the high beam by analyzing a driving environment of a vehicle in real time.
Next, a flow chart of a method for controlling a high beam according to an embodiment of the present application will be described, referring to fig. 1. As shown in fig. 1, the method comprises the steps of:
101. Acquire a traffic road condition image set.
The traffic road condition image set comprises multiple frames from a video stream captured of the current driving environment in real time. It should be understood that the vehicle is equipped with a device supporting a camera function, which captures the road environment around the vehicle to obtain a video stream representing it; the traffic road condition image set contains multiple frames from this video. In practice, because of performance differences among vehicle-mounted systems, the images in the set may be consecutive frames of the video stream, or frames sampled at an interval of every few frames.
102. Judge the driving environment of the vehicle according to the traffic road condition image set.
After the traffic road condition image set is obtained, the vehicle analyzes the images in it, extracting the foreground targets they contain and the average brightness of all pixels in each image, so as to judge the current driving environment. Specifically, the analysis consists of a brightness judgment and an oncoming-vehicle judgment on the images in the set. The brightness judgment computes the average brightness of each frame; if the average brightness of more than a certain proportion or number of the images (or of all the images) in the set exceeds a preset value, the vehicle is judged to be in a brightly lit environment. The oncoming-vehicle judgment binarizes the images in the set to obtain a binarized image set, then analyzes each binarized image to find the white connected domain with the largest area, taken as that image's target connected domain. When the areas (i.e., pixel counts) of the target connected domains of at least two binarized images exceed a preset value, it is judged that an oncoming vehicle with its high beam on is present in front of the host vehicle.
103. Control the high beam to be turned on or off according to the driving environment.
When the vehicle is in a brightly lit environment, the high beam is turned off if it is on; otherwise its current state is kept. When the vehicle is in a dimly lit environment and an oncoming vehicle with its lamps on (possibly low beam or high beam) is detected ahead, the host vehicle turns its high beam off if it is on. If no oncoming vehicle is detected, i.e., the areas (pixel counts) of the target connected domains of at least two binarized images in the binarized image set do not exceed the preset value, and the host vehicle's high beam is off, it is turned on.
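The decision logic of steps 102-103 can be summarised in a short sketch (a simplified illustration; the boolean inputs stand in for the image-analysis results described above):

```python
def desired_high_beam_state(ambient_bright, oncoming_vehicle):
    """Return True if the high beam should be on, given the environment analysis."""
    if ambient_bright:       # bright environment: high beam should be off
        return False
    if oncoming_vehicle:     # dim, but an oncoming vehicle with lamps on: off
        return False
    return True              # dim and no oncoming vehicle: on

print(desired_high_beam_state(False, False))  # True
```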
By analyzing the vehicle's current driving environment in real time, including the ambient brightness and oncoming traffic, the embodiments of the present application select the moment to turn the high beam on or off accurately and responsively. Compared with manual control, this is simpler and more convenient and provides a greater guarantee of driving safety.
It should be understood that in the above method the average brightness of several images in the traffic road condition image set reflects the ambient brightness of the vehicle's current route, while oncoming traffic is detected by checking whether an obvious bright aperture exists in the binarized image set. Building on this, and to further improve the accuracy of the image analysis and choose a better moment to turn the high beam back on after it has been turned off, the present application provides another, more detailed flowchart of a high beam control method; see fig. 2.
Fig. 2 is a flowchart of another high beam control method according to an embodiment of the present application, and as shown in fig. 2, the method includes the following steps:
201. Acquire a traffic road condition image set.
The traffic road condition image set comprises multiple frames from a video stream captured of the current driving environment in real time. As in step 101, the vehicle's camera-equipped device captures the surrounding road environment to obtain a video stream representing it, and the image set may consist of consecutive frames of the video stream or of frames sampled every few frames, depending on the performance of the vehicle-mounted system.
In an optional implementation, when the frames of the video stream captured by the camera device are not in YUV format, they are all converted to YUV format, so that the average brightness can be obtained and binarization performed in the subsequent steps.
202. Calculate the average brightness of each frame in the traffic road condition image set.
After the traffic road condition image set is obtained, the average brightness of each image is computed. It should be understood that the raw output of a typical camera is in YUV format; for a YUV picture, the Y channel is the luminance channel, and the mean of the Y channel is the image's average brightness. When the images in the set are in YUV format, the Y value (luminance) of every pixel is averaged for each frame, and the resulting mean is that image's average brightness. Images in other formats can be handled similarly. Taking an RGB image as an example, its brightness is computed as Y1 = 0.299*R1 + 0.587*G1 + 0.114*B1, where Y1 is the average luminance of the RGB image and R1, G1, B1 are the means of the R, G, and B channel values, respectively.
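As an illustrative sketch of this brightness computation (not the patent's code), the Y-channel average and the RGB luma formula can be written:

```python
def average_brightness(y_channel):
    """Mean of the Y (luma) values of all pixels in one frame."""
    values = [y for row in y_channel for y in row]
    return sum(values) / len(values)

def rgb_luminance(r_mean, g_mean, b_mean):
    """Y1 = 0.299*R1 + 0.587*G1 + 0.114*B1 (BT.601 luma weights)."""
    return 0.299 * r_mean + 0.587 * g_mean + 0.114 * b_mean

print(average_brightness([[100, 200], [50, 50]]))  # 100.0
```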
203. Judge whether the average brightness of N consecutive frames in the traffic road condition image set is greater than or equal to a brightness threshold.
It should be understood that the average brightness of the images reflects the brightness of the vehicle's driving environment, so once the averages are obtained, the ambient brightness can be judged from them. If the average brightness of N consecutive frames in the traffic road condition image set is greater than or equal to the preset brightness threshold, the current driving environment is considered bright; in that case, to avoid wasting resources, the vehicle's high beam is turned off directly (assuming it was on before step 201 was executed). Otherwise, the environment is judged to be dim, the high beam is kept on, and step 204 is executed. N typically ranges from 2 to 10.
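The N-consecutive-frames criterion can be sketched as follows (an illustrative reading of the condition: a run of at least N frames at or above the threshold):

```python
def bright_for_n_frames(avg_brightness_per_frame, threshold, n):
    """True if at least n consecutive frames have average brightness >= threshold."""
    run = 0
    for b in avg_brightness_per_frame:
        run = run + 1 if b >= threshold else 0
        if run >= n:
            return True
    return False

print(bright_for_n_frames([10, 90, 95, 92, 10], threshold=80, n=3))  # True
```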
204. Binarize the images in the traffic road condition image set to obtain a binarized image set.
After determining that the current driving environment is dim, it is determined whether an oncoming vehicle is present. Because machine recognition of full-color images is difficult, image recognition often uses binarization: the image is reduced to black and white, with the useful or characteristic elements set to one color according to their features and the remaining interfering elements set to the other. Therefore, to obtain a more accurate result, the images in the traffic road condition image set are binarized: the pixels of each image are divided into two classes by brightness, namely pixels whose brightness exceeds a preset brightness value (hereinafter "highlight points") and pixels below that value (hereinafter "low points"). The brightness of the highlight points is uniformly set to a large value, for example 255, and that of the low points to a small value, for example 0. After this processing the image contains clearly separated black and white regions. The preset brightness value may be a fixed value, or may be derived from the overall brightness of each image by a preset function. Binarizing all images in the traffic road condition image set yields the binarized image set.
Specifically, the binarized image can be obtained as follows: create a background image of the same size as the images in the traffic road condition image set, with every pixel's gray value set to 0. Scan the brightness values of the pixels of the preprocessed YUV image obtained in step 202 column by column, first from top to bottom and then, in a second pass, from bottom to top. During both passes, whenever a pixel's brightness exceeds the preset brightness value, set the gray value of the pixel at the corresponding position of the background image to 255. The result on the background image is the binarized feature image.
205. Denoise each binarized image in the binarized image set.
It should be understood that although each binarized image obtained in step 204 contains only two visually distinct colors (black and white), when the road environment is complicated (for example, when other light sources such as motorcycle headlights or dim street lamps are present), the same binarized image may contain multiple white and black regions. The boundary between white and black regions may then be unclear, or the white regions too small to be distinct. To address this, the binarized image can be denoised to make the white regions clearer, which facilitates the subsequent extraction of the target connected domain and makes the oncoming-vehicle judgment more accurate.
In an optional implementation, the denoising consists of eroding the binarized image and then dilating it, i.e., performing a morphological opening. After the opening, object contours in the binarized image become smoother, narrow connections between objects are broken, sharp protrusions on object edges are removed, and the white regions become more distinct. For the images before and after processing, refer to fig. 3.
Fig. 3 is a comparison of the effect before and after image denoising provided in an embodiment of the present application. As shown in fig. 3, image 301 is one frame of the traffic road condition image set after the binarization processing of step 204; it clearly contains an oncoming vehicle that has turned on its high beam. Image 302 is the result of applying the denoising of step 205 to image 301. Compared with image 301, the black and white regions of image 302 are more clearly separated, and the white regions (the regions formed in the image by the lamps of the oncoming vehicle, whether high beam or low beam) are clearer.
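The opening operation of step 205 (erosion followed by dilation) can be sketched with SciPy's morphology routines; the library choice and the 3x3 structuring element are assumptions, since the patent names neither.

```python
import numpy as np
from scipy.ndimage import binary_opening

def denoise(binary_img, ksize=3):
    """Morphological opening: erosion followed by dilation.

    Smooths object contours, breaks narrow connections between objects,
    and removes small bright specks, making the white regions more
    distinct, as described for step 205.
    """
    structure = np.ones((ksize, ksize), dtype=bool)
    opened = binary_opening(binary_img > 0, structure=structure)
    return opened.astype(np.uint8) * 255  # back to 0/255 gray values
```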
206. Acquire the target connected domain of each binarized image in the binarized image set.
After denoising, each binarized image in the binarized image set contains black regions and white regions. To eliminate interference from other light sources in the driving environment and to establish whether a white region caused by the headlights of an oncoming vehicle exists in a binarized image, the binarized image set is processed as follows:
count the pixels of each of the bright connected domains (the white regions in image 302 of fig. 3) in the binarized image to obtain the pixel count of each white connected domain; sort the white connected domains by pixel count; and take the white connected domain with the most pixels in each image as the target connected domain of the binarized image it belongs to.
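The extraction of the target connected domain can be sketched with SciPy's labeling, using the eight-neighborhood connectivity that step 207 specifies; the helper name is my own.

```python
import numpy as np
from scipy.ndimage import label

def target_connected_domain(binary_img):
    """Label white regions with eight-neighborhood connectivity and
    return (pixel count, mask) of the largest one; (0, empty mask)
    when no white region exists."""
    structure = np.ones((3, 3), dtype=int)   # eight-neighborhood
    labels, count = label(binary_img > 0, structure=structure)
    if count == 0:
        return 0, np.zeros_like(binary_img)
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0                             # ignore the black background
    biggest = int(sizes.argmax())
    mask = (labels == biggest).astype(np.uint8) * 255
    return int(sizes[biggest]), mask
```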
207. Judge whether the area of the target connected domain is greater than a first threshold in M frames of the binarized image set.
When the area (that is, the pixel count) of the target connected domain is greater than the first threshold in more than M frames of the binarized image set, an oncoming vehicle is considered present in the driving environment represented by the set, that vehicle has turned on its high beam, and the target connected domain shown in the images is that high beam; step 208 is then executed. Otherwise, no oncoming vehicle with its high beam on is present in the driving environment, and step 209 is executed. M ranges from 2 to 10, and connected domains are identified with an eight-neighborhood connected-domain labeling method. Because vehicle-mounted systems differ in performance, the imaging area of the binarized image set also differs; the first threshold is therefore determined according to that imaging area, and its value is not limited by the embodiments of this application.
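The per-frame vote of step 207 reduces to counting how many frames exceed the first threshold. A minimal sketch, assuming "at least M frames" (the claim language says "at least two"); m and the threshold are tuning parameters:

```python
def oncoming_high_beam_present(target_areas, first_threshold, m=2):
    """target_areas: per-frame pixel counts of each frame's target
    connected domain. Reports an oncoming vehicle with its high beam
    on when at least m frames exceed the first threshold; m=2 is the
    low end of the 2-10 range given above."""
    hits = sum(1 for area in target_areas if area > first_threshold)
    return hits >= m
```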
208. The high beam is turned off.
After it is determined that an oncoming vehicle has its high beam on, the high beam of the own vehicle is turned off to ensure driving safety.
209. The current state of the high beam is maintained.
After it is determined that the brightness of the driving environment is low and that no oncoming vehicle ahead has its high beam on, the high beam of the vehicle is kept in its on state.
The embodiments of this application first judge the overall brightness of the current road conditions and then, after preprocessing the traffic road condition image set, judge whether an oncoming vehicle is present. The road condition environment of the vehicle can thus be analyzed more accurately and the moment to turn the high beam on or off chosen more intelligently, avoiding the defect of the high beam remaining on through driver negligence in an environment that is already bright enough. This saves system resources and further strengthens the driving safety of the vehicle.
A schematic structural diagram of a high beam control system according to an embodiment of the present application is described below with reference to fig. 4. The control system shown in fig. 4 can execute the flow of the high beam control method of fig. 1 or fig. 2. The system includes an acquisition module, a processing module and a control module; the acquisition module is communicatively connected to the processing module, and the processing module is communicatively connected to the control module. The acquisition module is configured to acquire a traffic road condition image set. The processing module is configured to receive the traffic road condition image set acquired by the acquisition module and to judge the driving environment of the vehicle from it, where the driving environment includes the ambient brightness and whether an oncoming vehicle is present. The control module is configured to receive the processing module's judgment of the driving environment and to control the high beam to turn on or off accordingly.
In an optional implementation, the control module is specifically configured to: if the ambient brightness is greater than or equal to a brightness threshold, control the high beam to turn off; if the ambient brightness is less than the brightness threshold and an oncoming vehicle is present, control the high beam to turn off; and if the ambient brightness is less than the brightness threshold and no oncoming vehicle is present, control the high beam to turn on.
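The three rules of the control module form a small decision table, sketched here with hypothetical names and string states:

```python
def control_high_beam(ambient_brightness, brightness_threshold, oncoming):
    """Returns the commanded high-beam state per the three rules above:
    bright environment -> off; dark with an oncoming vehicle -> off;
    dark without one -> on."""
    if ambient_brightness >= brightness_threshold:
        return "off"                      # environment already bright enough
    return "off" if oncoming else "on"    # dark: defer to oncoming traffic
```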
In an optional implementation, the processing module includes a binarization unit and a judging unit. The binarization unit is configured to binarize the traffic road condition image set acquired by the acquisition module to obtain a binarized image set, the binarization consisting of: setting the brightness value of pixels whose brightness is greater than n to a first value and the brightness value of pixels whose brightness is less than or equal to n to a second value, where the first value is greater than the second value and n is a value greater than 0. The judging unit is configured to judge from the binarized image set whether an oncoming vehicle is present.
In an optional implementation, the judging unit is specifically configured to: acquire the target connected domain of each binarized image in the binarized image set, the target connected domain being the largest-area connected domain among those whose brightness value is the first value; if the areas of the target connected domains of at least two binarized images are greater than a first threshold, judge that an oncoming vehicle is present; otherwise, judge that none is present.
In an optional implementation, the system further includes a preprocessing module configured to convert an initial image set into YUV format to obtain the traffic road condition image set, where the initial image set includes multiple frames of a video stream, collected by the vehicle, that represents the traffic road conditions.
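The conversion to YUV can be sketched per pixel; the patent says only "YUV format", so the analog BT.601 weighting matrix below is an assumed convention, not one the patent specifies.

```python
import numpy as np

def rgb_to_yuv(frame_rgb):
    """Per-pixel RGB -> YUV conversion (BT.601 analog weights).

    frame_rgb: H x W x 3 array of RGB values.
    The Y (luma) channel is what the later brightness and
    binarization steps operate on.
    """
    m = np.array([[ 0.299,    0.587,    0.114  ],
                  [-0.14713, -0.28886,  0.436  ],
                  [ 0.615,   -0.51499, -0.10001]])
    return frame_rgb.astype(np.float64) @ m.T
```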
In an optional implementation, the judging unit is further configured to: acquire the brightness of each image in the traffic road condition image set, compute the average brightness from these, and take the average brightness as the ambient brightness.
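The ambient-brightness computation just described is a mean of per-frame means over the luma channel; a minimal sketch:

```python
import numpy as np

def ambient_brightness(y_frames):
    """Mean luma of each frame, averaged across all frames of the
    traffic road condition image set; the result serves as the
    ambient brightness compared against the brightness threshold."""
    per_frame = [float(np.mean(frame)) for frame in y_frames]
    return sum(per_frame) / len(per_frame)
```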
It should be understood that the division of the modules of the high beam control device above is only a division of logical functions; in an actual implementation they may be wholly or partially integrated into one physical entity or physically separated. For example, the modules may be separate processing elements, may be integrated into the same chip, or may be stored as program code in a storage element of the controller and called and executed by a processing element of the processor. Likewise, the modules may be integrated together or implemented independently. The processing element may be an integrated circuit chip with signal processing capability. In implementation, the steps of the method or of the modules above may be completed by integrated logic circuits of hardware in the processor element or by instructions in software form. The processing element may be a general-purpose processor, such as a CPU, or one or more integrated circuits configured to implement the above method, for example one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs).
A schematic structural diagram of a high beam control apparatus according to an embodiment of the present application is described below with reference to fig. 5.
Fig. 5 is a schematic structural diagram of another high beam control apparatus provided in an embodiment of the present application. As shown in fig. 5, the apparatus includes a camera 501 (corresponding to the acquisition module in fig. 4), a data processor 502 (corresponding to the processing module and control module in fig. 4), and the high beam 503 of the vehicle; the camera 501, the data processor 502 and the high beam 503 are connected by a bus. The camera 501 may be the camera carried by the vehicle-mounted terminal or another camera mounted on the vehicle; it captures the road traffic video stream in real time and outputs it to the data processor 502. The data processor 502 decodes the received video stream into multiple frames of traffic road condition images, and is further configured to binarize the images, obtain their average brightness value, and detect the high beam region in them. In an alternative implementation, the data processor 502 may also preprocess the images in the traffic road condition image set. When the images in the set meet the preset conditions, the data processor 502 is further configured to control the turning on and off of the high beam 503.
In an alternative implementation, the camera 501 and the data processor 502 in fig. 5 may belong to the same electronic device, please refer to fig. 6.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 6, the electronic device 60 includes a processor 601, a memory 602, a camera 603, and a communication interface 604; the processor 601, the memory 602, the camera 603 and the communication interface 604 are connected to each other via a bus 605, and the electronic apparatus can be used in an in-vehicle system.
The memory 602 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and stores related instructions and data. The camera 603 is configured to photograph the scene and transmit the resulting video stream to the processor 601 through the bus 605, and the communication interface 604 is configured to receive and transmit data, such as the turn-off or turn-on instruction sent to the high beam 503 in fig. 5.
The processor 601 may be one or more central processing units (CPUs); when the processor 601 is one CPU, it may be a single-core or multi-core CPU. The steps performed by the high beam control system in the foregoing embodiments may be based on the structure of the electronic device shown in fig. 6. In particular, the processor 601 may implement the functions of the modules in fig. 4.
In an embodiment of the present application, there is provided another computer-readable storage medium storing a computer program which, when executed by a processor, implements: acquiring a traffic road condition image set; judging the driving environment of the vehicle according to the traffic road condition image set, wherein the driving environment comprises the environment brightness and whether the vehicle comes in opposite directions; and controlling the high beam to be turned on or off according to the driving environment.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (14)
1. A method for controlling a high beam, comprising the steps of:
acquiring a traffic road condition image set;
judging the driving environment of the vehicle according to the traffic road condition image set, wherein the driving environment comprises the environment brightness and whether the vehicle comes in opposite directions;
and controlling the high beam to be turned on or off according to the driving environment.
2. The method according to claim 1, wherein said controlling the high beam to be turned on or off according to said driving environment comprises the steps of:
if the ambient brightness is greater than or equal to the brightness threshold, controlling the high beam to be turned off;
if the ambient brightness is smaller than the brightness threshold value and an oncoming vehicle is present, controlling the high beam to be turned off;
and if the ambient brightness is less than the brightness threshold value and no vehicle comes in the opposite direction, controlling the high beam to be turned on.
3. The method according to claim 1 or 2, wherein the step of determining the driving environment of the vehicle according to the traffic road condition image set comprises the following steps:
carrying out binarization processing on the traffic road condition image set to obtain a binarization image set, wherein the binarization processing comprises the following steps: setting the brightness value of the pixel point with the brightness larger than n in the traffic road condition image set as a first value, and setting the brightness value of the pixel point with the brightness smaller than or equal to n as a second value, wherein the first value is larger than the second value, and n is a numerical value larger than 0;
and judging whether the opposite vehicle comes or not according to the binary image set.
4. The method of claim 3, said determining from said binarized image set whether an oncoming vehicle is present, comprising the steps of:
acquiring a target connected region of each binarized image in the binarized image set, wherein the target connected region is a connected region with the largest area in the connected regions of which the brightness values are the first values in the binarized image;
if the areas of the target connected regions of at least two binary images in the binary images are larger than a first threshold value, judging that the vehicles come in opposite directions; otherwise, judging that no vehicle comes to the opposite direction.
5. The method according to claim 4, wherein the acquiring of the traffic road condition image set comprises the following steps:
and converting the initial image set into a YUV format to obtain the traffic road condition image set, wherein the initial image set comprises a plurality of frames of images in a video stream which is collected by the vehicle and represents the traffic road condition.
6. The method according to claim 5, wherein the step of obtaining the driving environment of the vehicle according to the traffic road condition image set comprises the following steps:
and acquiring the brightness of each image in the traffic road condition image set, acquiring the average brightness according to the brightness of each image, and taking the average brightness as the environment brightness.
7. A control system for a high beam, characterized by comprising an acquisition module, a processing module and a control module, the acquisition module being communicatively connected to the processing module, and the processing module being communicatively connected to the control module, wherein:
the acquisition module is used for acquiring a traffic road condition image set;
the processing module is used for receiving the traffic road condition image set acquired by the acquisition module and judging the driving environment of the vehicle according to the traffic road condition image set, wherein the driving environment comprises the environment brightness and whether the vehicle comes in an opposite direction;
and the control module is used for receiving the judgment information of the processing module on the driving environment and controlling the high beam to be turned on or off according to the judgment information.
8. The system of claim 7, the control module being specifically configured to:
if the ambient brightness is greater than or equal to the brightness threshold, controlling the high beam to be turned off;
if the ambient brightness is smaller than the brightness threshold value and an oncoming vehicle is present, controlling the high beam to be turned off;
and if the ambient brightness is less than the brightness threshold value and no vehicle comes in the opposite direction, controlling the high beam to be turned on.
9. The system according to claim 7 or 8, the processing module comprising a binarization unit and a decision unit,
the binarization unit is used for carrying out binarization processing on the traffic road condition image set acquired by the acquisition module to acquire a binarization image set, and the binarization processing comprises the following steps: setting the brightness value of the pixel point with the brightness larger than n in the traffic road condition image set as a first value, and setting the brightness value of the pixel point with the brightness smaller than or equal to n as a second value, wherein the first value is larger than the second value, and n is a numerical value larger than 0;
the judging unit is used for judging whether the oncoming vehicle comes or not according to the binary image set.
10. The system of claim 9, the determination unit being specifically configured to:
acquiring a target connected region of each binarized image in the binarized image set, wherein the target connected region is a connected region with the largest area in the connected regions of which the brightness values are the first values in the binarized images;
if the areas of the target connected regions of at least two binary images in the binary images are larger than a first threshold value, judging that the vehicles come in opposite directions; otherwise, judging that no vehicle comes to the opposite direction.
11. The system of claim 10, further comprising:
and the preprocessing module is used for converting an initial image set into a YUV format to obtain the traffic road condition image set, wherein the initial image set comprises a plurality of frames of images in a video stream which represents traffic road conditions and is acquired by the vehicle.
12. The system of claim 9 or 10, the decision unit further to:
and acquiring the brightness of each image in the traffic road condition image set, acquiring the average brightness according to the brightness of each image, and taking the average brightness as the environment brightness.
13. An electronic device, comprising: a memory for storing a program; a processor for executing the program stored by the memory, the processor being configured to perform the method of any of claims 1 to 6 when the program is executed.
14. A computer-readable storage medium, in which a computer program is stored which, when run on one or more processors, performs the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011501294.8A CN114643926B (en) | 2020-12-17 | 2020-12-17 | Control method, system, equipment and storage medium for high beam |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011501294.8A CN114643926B (en) | 2020-12-17 | 2020-12-17 | Control method, system, equipment and storage medium for high beam |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114643926A true CN114643926A (en) | 2022-06-21 |
CN114643926B CN114643926B (en) | 2024-02-27 |
Family
ID=81990543
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011501294.8A Active CN114643926B (en) | 2020-12-17 | 2020-12-17 | Control method, system, equipment and storage medium for high beam |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114643926B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008293116A (en) * | 2007-05-22 | 2008-12-04 | Nippon Soken Inc | Vehicle light detecting apparatus to be mounted on vehicle, and illuminating apparatus for vehicle |
US20100052550A1 (en) * | 2008-08-28 | 2010-03-04 | Koito Manufacturing Co., Ltd. | Headlamp control device and vehicle headlamp having headlamp control device |
CN103295399A (en) * | 2013-05-14 | 2013-09-11 | 西安理工大学 | On-state judging method of headlights on full beam of night-driving cars based on morphological characteristics |
CN104097565A (en) * | 2014-06-24 | 2014-10-15 | 奇瑞汽车股份有限公司 | Automobile high beam and low beam control method and device |
CN106915295A (en) * | 2017-03-21 | 2017-07-04 | 青岛海信移动通信技术股份有限公司 | The control method and device of automobile front lamp state |
CN108052893A (en) * | 2017-12-11 | 2018-05-18 | 浙江大华技术股份有限公司 | A kind of method and apparatus for identifying high beam and whether opening |
DE102017001893A1 (en) * | 2017-02-28 | 2018-08-30 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | Method for controlling a headlight system of a vehicle |
CN109094451A (en) * | 2018-07-23 | 2018-12-28 | 华南师范大学 | Night meeting high beam control method and its system, computer readable storage medium |
CN109263538A (en) * | 2018-08-02 | 2019-01-25 | 惠州市德赛西威汽车电子股份有限公司 | It is a kind of can intelligent recognition close high beam preceding viewing system and control method |
WO2019024228A1 (en) * | 2017-08-04 | 2019-02-07 | 西安中兴新软件有限责任公司 | Automobile headlamp control method and device |
CN110371017A (en) * | 2019-06-26 | 2019-10-25 | 江铃汽车股份有限公司 | A kind of distance light lamp control method and system |
CN110738158A (en) * | 2019-10-11 | 2020-01-31 | 奇点汽车研发中心有限公司 | Vehicle light control method and device, electronic equipment and storage medium |
CN211827564U (en) * | 2019-04-22 | 2020-10-30 | 桂林金铱星科技发展有限公司 | A recognition device for detecting whether vehicle opens high beam night |
- 2020-12-17 CN CN202011501294.8A patent/CN114643926B/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008293116A (en) * | 2007-05-22 | 2008-12-04 | Nippon Soken Inc | Vehicle light detecting apparatus to be mounted on vehicle, and illuminating apparatus for vehicle |
US20100052550A1 (en) * | 2008-08-28 | 2010-03-04 | Koito Manufacturing Co., Ltd. | Headlamp control device and vehicle headlamp having headlamp control device |
CN103295399A (en) * | 2013-05-14 | 2013-09-11 | 西安理工大学 | On-state judging method of headlights on full beam of night-driving cars based on morphological characteristics |
CN104097565A (en) * | 2014-06-24 | 2014-10-15 | 奇瑞汽车股份有限公司 | Automobile high beam and low beam control method and device |
DE102017001893A1 (en) * | 2017-02-28 | 2018-08-30 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | Method for controlling a headlight system of a vehicle |
CN106915295A (en) * | 2017-03-21 | 2017-07-04 | 青岛海信移动通信技术股份有限公司 | The control method and device of automobile front lamp state |
WO2019024228A1 (en) * | 2017-08-04 | 2019-02-07 | 西安中兴新软件有限责任公司 | Automobile headlamp control method and device |
CN108052893A (en) * | 2017-12-11 | 2018-05-18 | 浙江大华技术股份有限公司 | A kind of method and apparatus for identifying high beam and whether opening |
CN109094451A (en) * | 2018-07-23 | 2018-12-28 | 华南师范大学 | Night meeting high beam control method and its system, computer readable storage medium |
CN109263538A (en) * | 2018-08-02 | 2019-01-25 | 惠州市德赛西威汽车电子股份有限公司 | It is a kind of can intelligent recognition close high beam preceding viewing system and control method |
CN211827564U (en) * | 2019-04-22 | 2020-10-30 | 桂林金铱星科技发展有限公司 | A recognition device for detecting whether vehicle opens high beam night |
CN110371017A (en) * | 2019-06-26 | 2019-10-25 | 江铃汽车股份有限公司 | A kind of distance light lamp control method and system |
CN110738158A (en) * | 2019-10-11 | 2020-01-31 | 奇点汽车研发中心有限公司 | Vehicle light control method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114643926B (en) | 2024-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11003931B2 (en) | Vehicle monitoring method and apparatus, processor, and image acquisition device | |
US8036427B2 (en) | Vehicle and road sign recognition device | |
CN110688907B (en) | Method and device for identifying object based on night road light source | |
O'Malley et al. | Vehicle detection at night based on tail-light detection | |
US11700457B2 (en) | Flicker mitigation via image signal processing | |
CN107808392B (en) | Automatic tracking and positioning method and system for security check vehicle in open scene | |
US9493108B2 (en) | Apparatus for detecting other vehicle lights and light control apparatus for vehicles | |
CN106991707B (en) | Traffic signal lamp image strengthening method and device based on day and night imaging characteristics | |
US10922827B2 (en) | Distance estimation of vehicle headlights | |
CN108875458B (en) | Method and device for detecting turning-on of high beam of vehicle, electronic equipment and camera | |
CN106778534B (en) | Method for identifying ambient light during vehicle running | |
CN110084111B (en) | Rapid night vehicle detection method applied to self-adaptive high beam | |
CN109215364B (en) | Traffic signal recognition method, system, device and storage medium | |
CN106161984B (en) | Video image highlight suppression, contour and detail enhancement processing method and system | |
US11068729B2 (en) | Apparatus and method for detecting a traffic light phase for a motor vehicle | |
Lin et al. | Adaptive IPM-based lane filtering for night forward vehicle detection | |
CN113306486B (en) | In-vehicle lighting device control method, storage medium, and electronic apparatus | |
CN111046741A (en) | Method and device for identifying lane line | |
US20230021116A1 (en) | Lateral image processing apparatus and method of mirrorless car | |
CN109094451A (en) | Night meeting high beam control method and its system, computer readable storage medium | |
CN111688568B (en) | Brightness detection method, vehicle lamp control method, system thereof and storage medium | |
US20210203901A1 (en) | Low-light imaging system | |
CN114643926B (en) | Control method, system, equipment and storage medium for high beam | |
CN109800693B (en) | Night vehicle detection method based on color channel mixing characteristics | |
TWI630818B (en) | Dynamic image feature enhancement method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||