CN116416599A - Method, device, equipment and storage medium for determining front vehicle brake - Google Patents


Info

Publication number
CN116416599A
CN116416599A (application CN202310275725.0A)
Authority
CN
China
Prior art keywords
image
determining
average value
vehicle
communication
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310275725.0A
Other languages
Chinese (zh)
Inventor
何中杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huizhou Desay SV Automotive Co Ltd
Original Assignee
Huizhou Desay SV Automotive Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huizhou Desay SV Automotive Co Ltd filed Critical Huizhou Desay SV Automotive Co Ltd
Priority to CN202310275725.0A priority Critical patent/CN116416599A/en
Publication of CN116416599A publication Critical patent/CN116416599A/en
Pending legal-status Critical Current

Classifications

    • G06V20/58 — Recognition of moving objects or obstacles (e.g. vehicles or pedestrians) or traffic objects (e.g. traffic signs, traffic lights, roads) exterior to a vehicle, using sensors mounted on the vehicle
    • G06V20/584 — Recognition of vehicle lights or traffic lights
    • B60Q9/00 — Arrangement or adaptation of signal devices not provided for in main groups B60Q1/00–B60Q7/00, e.g. haptic signalling
    • G06V10/25 — Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/273 — Segmentation of patterns in the image field: removing elements interfering with the pattern to be recognised
    • G06V10/28 — Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V10/34 — Smoothing or thinning of the pattern; morphological operations; skeletonisation
    • G06V10/56 — Extraction of image or video features relating to colour
    • G06V2201/08 — Detecting or categorising vehicles
    • Y02T10/40 — Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the disclosure provides a method, a device, equipment and a storage medium for determining braking of a preceding vehicle. The method comprises the following steps: acquiring an image of the vehicle ahead, the image comprising a current frame image and a previous frame image; preprocessing the image to obtain a target image; determining a plurality of connected regions from the target image; grouping the plurality of connected regions to obtain connected-region groups, each group comprising at least one connected region; determining the average brightness within the connected-region groups of the current frame image and of the previous frame image, obtaining a first brightness average and a second brightness average; and determining the braking state of the preceding vehicle, braking or not braking, from the first and second brightness averages. Because the braking state is inferred from the brightness averages within the connected-region groups of the current and previous frame images, the accuracy of determining the braking state of the preceding vehicle can be improved, and driving safety with it.

Description

Method, device, equipment and storage medium for determining front vehicle brake
Technical Field
The embodiment of the disclosure relates to the technical field of vehicles, in particular to a method, a device, equipment and a storage medium for determining front vehicle braking.
Background
When a vehicle follows another vehicle, millimeter-wave radar is generally used to identify obstacles ahead or to measure the distance between the current vehicle and the preceding vehicle. In a first scheme, the radar is mounted directly on the vehicle; it measures the distance to the preceding vehicle and a prompt is shown on a display screen. In a second scheme, millimeter-wave radar and a high-definition camera are installed on the road: the radar measures distance, the camera identifies vehicles, the preceding-vehicle information is sent to each vehicle by wireless communication, and the driver is then reminded on the in-vehicle display of events such as braking of the vehicle ahead.
However, both schemes, radar alone and radar combined with a high-definition camera, have high complexity; they can miss vehicles or misclassify non-vehicles, so the accuracy of judging braking of the preceding vehicle is low, and they require additional hardware on the vehicle or in the infrastructure, which makes them costly and of limited practicality. When wireless communication is added, the communication algorithm is complex and strongly affected by the communication environment, which can likewise cause missed or false judgments.
Existing camera-only methods for judging braking of the preceding vehicle work only in a narrow range of scenes and are prone to misjudgment; in dynamic driving scenes they face many interfering targets, high algorithmic complexity, and high latency, so their practicality is weak.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device, equipment and a storage medium for determining the braking state of a front vehicle, which can improve the accuracy of determining the braking state of the front vehicle.
In a first aspect, an embodiment of the present disclosure provides a method for determining braking of a preceding vehicle, including: acquiring an image of the vehicle ahead, the image comprising a current frame image and a previous frame image; preprocessing the image to obtain a target image; determining a plurality of connected regions from the target image, each connected region containing a target object; grouping the plurality of connected regions to obtain connected-region groups, where a connected-region group characterizes the connected regions belonging to the same vehicle and comprises at least one connected region; determining the brightness averages within the connected-region groups of the current frame image and of the previous frame image, obtaining a first brightness average and a second brightness average; and determining the braking state of the preceding vehicle, braking or not braking, from the first and second brightness averages.
In a second aspect, an embodiment of the present disclosure further provides a device for determining braking of a preceding vehicle, including: an image acquisition module for acquiring an image of the vehicle ahead, the image comprising a current frame image and a previous frame image; a target image acquisition module for preprocessing the image to obtain a target image; a connected-region determining module for determining a plurality of connected regions from the target image, each containing a target object; a grouping module for grouping the connected regions into connected-region groups, each group characterizing the connected regions of the same vehicle and comprising at least one connected region; a brightness-average determining module for determining the brightness averages within the connected-region groups of the current and previous frame images, obtaining a first and a second brightness average; and a braking-state determining module for determining the braking state of the preceding vehicle, braking or not braking, from the two averages.
In a third aspect, embodiments of the present disclosure further provide an electronic device, including:
One or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement a method of determining a front vehicle brake as described in embodiments of the present disclosure.
In a fourth aspect, the disclosed embodiments also provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a method of determining a front vehicle brake as described in the disclosed embodiments.
According to the technical scheme of the disclosed embodiment, an image of the preceding vehicle is acquired, the image comprising a current frame image and a previous frame image; the image is preprocessed to obtain a target image; a plurality of connected regions, each containing a target object, are determined from the target image; the connected regions are grouped into connected-region groups, each group characterizing the connected regions of the same vehicle and comprising at least one connected region; the brightness averages within the connected-region groups of the current and previous frame images are determined, yielding a first and a second brightness average; and the braking state of the preceding vehicle, braking or not braking, is determined from the two averages. Because the braking state is inferred from the brightness averages within the connected-region groups of the current and previous frame images, the accuracy of determining the braking state of the preceding vehicle can be improved, and driving safety with it.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a schematic flow chart of a method for determining a front vehicle brake according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an installation position of a warning light according to an embodiment of the present invention;
FIG. 3 is a schematic circuit diagram of an ARM processor controlling three warning lamps according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a 12V to 5V chip AMS1117-3.3 wiring provided by an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a device for determining a front vehicle brake according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that the modifiers "a", "an", and "a plurality of" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will understand them as "one or more" unless the context clearly indicates otherwise.
It will be appreciated that the data (including but not limited to the data itself, the acquisition or use of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
FIG. 1 is a schematic flow chart of a method for determining a front vehicle brake according to an embodiment of the present invention. This embodiment is applicable to determining whether the vehicle ahead is braking; the method can be executed by a device for determining a front vehicle brake and specifically comprises the following steps:
s110, acquiring an image of a front vehicle.
Wherein the image includes a current frame image and a previous frame image. In this embodiment, the image of the preceding vehicle may be acquired at the set frequency and/or the sampling frequency.
Optionally, the image of the vehicle ahead may be acquired as follows: acquire an image of the vehicle ahead and identify a target object in it; if no target object is identified, continue acquiring images at the set frequency; if a target object is identified, determine a sampling frequency from the relative speed of the current vehicle and the vehicle ahead and acquire images at that sampling frequency.
In this embodiment, the image of the vehicle ahead may be collected by a camera and stored in a storage module as it is collected, and an Advanced RISC Machines (ARM) processor may identify the target object in the collected image. When the vehicle starts, images of the vehicle ahead are acquired at the set frequency; during driving, if no target object is identified, acquisition continues at the set frequency, and once a target object is identified, the sampling frequency is determined from the relative speed of the current vehicle and the vehicle ahead and images are acquired at that sampling frequency. The target object may be a brake light at the tail of the vehicle. The set frequency may be 1 frame every 3 seconds, and the sampling frequency may be the relative speed divided by 120.
Acquiring at the set frequency of 1 frame per 3 seconds is justified as follows: given a 200 m identification distance, a vehicle travelling at 120 km/h for 3 seconds still keeps more than the 100 m standard safety distance. After the target object is identified, sampling is proportional to the relative speed: the braking distance at 120 km/h is about 45 m and a normal driver reaction time is within 1.5 s, so the distance from braking to reaction is about 100 m; keeping a minimum safe distance of 50 m leaves the remaining 50 m, about 1.5 s, for image processing. Excluding a 0.5 s reserve for data transmission and processing, the sampling frequency at 120 km/h should therefore be 1 frame per second; in general, the sampling frequency after the target object is identified is the relative speed divided by 120. In addition, a target object within 200 m of the current vehicle can be considered identified.
By setting different sampling frequencies, this embodiment samples and processes images appropriately at different vehicle speeds, so that the driver can be reminded in time when the vehicle ahead brakes.
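The sampling rule described above can be sketched as a small helper. The 3-second default and the relative-speed-over-120 rule follow the text; the function name and return convention are illustrative assumptions:

```python
def sampling_period_s(relative_speed_kmh=None):
    """Return the interval between sampled frames, in seconds.

    Sketch of the sampling rule described above:
    - no target object identified: set frequency of 1 frame every 3 seconds;
    - target object identified: sampling frequency = relative speed / 120
      (frames per second), i.e. 1 frame per second at 120 km/h.
    """
    if relative_speed_kmh is None or relative_speed_kmh <= 0:
        return 3.0  # set frequency: 1 frame per 3 s
    frames_per_second = relative_speed_kmh / 120.0
    return 1.0 / frames_per_second
```

For example, at a relative speed of 60 km/h the rule yields one frame every 2 seconds, halving the rate when the closing speed is lower.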
S120, preprocessing the image to obtain a target image.
In this embodiment, the image may be preprocessed to obtain the binarized target image of interest, where the preprocessing may include color model conversion, filtering, binarization, erosion, and dilation, among other image preprocessing operations.
Optionally, the image may be preprocessed to obtain the target image as follows: convert the image into the HSV color model space to obtain a first image; keep the pixels of the first image that fall within a set color range, as a second image; binarize the second image to obtain a binarized image; apply erosion and dilation to the binarized image to obtain a third image; and delete every region of the third image containing fewer pixels than a set threshold, obtaining the target image.
In this embodiment, the image obtained from the camera is an RGB image; it can be converted into the HSV color model space to obtain a first image. Pixels that fall within the set color range are kept and the remaining pixels are set to transparent, yielding a second image; this eliminates most light sources irrelevant to brake lights. Binarizing the second image gives a binarized image; erosion and dilation on the binarized image give a third image; and deleting every region of the third image that contains fewer pixels than the set threshold yields the target image of interest. This removes target objects whose area is below the threshold, which may be non-brake lamps, non-marker lamps, or brake or marker lamps that are too far away. The set color range may be the color range corresponding to red pixels.
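A minimal, dependency-free sketch of the colour screening and binarization steps, using the standard library's `colorsys` for the RGB-to-HSV conversion. The hue window wrapping around 0 and the saturation/value floors are illustrative assumptions, since the text only specifies "a set color range" corresponding to red:

```python
import colorsys

def screen_red_and_binarize(rgb_image, h_lo=0.95, h_hi=0.05, s_min=0.4, v_min=0.3):
    """Convert an RGB image to HSV, keep pixels in a set (red) colour
    range, and binarize: 1 for an in-range pixel, 0 otherwise.

    rgb_image is a nested list of (r, g, b) tuples in [0, 255]. The
    red hue window wraps around 0, so a pixel passes if its hue is
    >= h_lo or <= h_hi (hue normalised to [0, 1)).
    """
    binary = []
    for row in rgb_image:
        out_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            in_red = (h >= h_lo or h <= h_hi) and s >= s_min and v >= v_min
            out_row.append(1 if in_red else 0)
        binary.append(out_row)
    return binary
```

A production version would operate on whole channel arrays at once (e.g. with OpenCV's `inRange`), but the per-pixel logic is the same.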
After the binarized image is obtained, the largest inscribed circle of a single brake lamp in the image can be used as the template for the erosion and dilation operations, removing targets and interference that are too far away; if the area of an eroded region is still larger than the area of the template circle, it belongs to the target image of interest.
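The erosion and dilation operations can be sketched on a plain binary grid as follows. For brevity this uses a square 3×3 structuring element rather than the inscribed-circle template described above, so it is a simplification of the patent's step, not a faithful implementation:

```python
def erode(binary, k=1):
    """Erosion: a pixel stays 1 only if every neighbour within
    Chebyshev distance k (including itself) is 1; pixels outside
    the image count as 0."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = 1 if all(
                0 <= y + dy < h and 0 <= x + dx < w and binary[y + dy][x + dx]
                for dy in range(-k, k + 1) for dx in range(-k, k + 1)) else 0
    return out

def dilate(binary, k=1):
    """Dilation: a pixel becomes 1 if any neighbour within
    Chebyshev distance k (including itself) is 1."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = 1 if any(
                0 <= y + dy < h and 0 <= x + dx < w and binary[y + dy][x + dx]
                for dy in range(-k, k + 1) for dx in range(-k, k + 1)) else 0
    return out
```

Erosion followed by dilation (morphological opening) removes blobs smaller than the structuring element while approximately preserving larger ones, which is why distant or spurious lamps disappear.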
S130, determining a plurality of communication areas according to the target image.
Wherein the connected region includes a target object; a connected region may be a set of adjacent pixels in the image having the same pixel value.
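Extracting connected regions from the binarized target image is a standard labeling problem; a plain breadth-first-search sketch follows. The 4-connectivity is an assumption, since the text does not specify the connectivity used:

```python
from collections import deque

def connected_regions(binary):
    """Label the 4-connected regions of 1-pixels in a binary image.
    Returns a list of regions, each a list of (x, y) pixel coordinates."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                seen[y][x] = True
                queue, region = deque([(x, y)]), []
                while queue:
                    cx, cy = queue.popleft()
                    region.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h \
                                and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                regions.append(region)
    return regions
```

Regions below the pixel-count threshold of S120 can then be dropped with, e.g., `[r for r in regions if len(r) >= threshold]`.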
S140, grouping the plurality of connected areas to obtain a connected area group.
Wherein a connected-region group characterizes the connected regions within the same vehicle and includes at least one connected region. In this embodiment, the center-of-gravity point of each connected region may be determined, and the regions may be grouped by the differences between these points, obtaining the connected-region groups.
Optionally, the plurality of connected regions may be grouped as follows: determine the center-of-gravity point of each connected region, where the point is characterized by a first set coordinate component and a second set coordinate component; determine first difference information between the center-of-gravity points on the second set coordinate component; and assign connected regions whose first difference information falls within an error range to the same connected-region group.
In this embodiment, the center of gravity of a connected region, i.e. its centroid coordinate, may be determined by the mean method. After the center-of-gravity points of all connected regions are determined, the first difference information between them on the second set coordinate component is computed, and, scanning left to right in the coordinate system, connected regions whose first difference information falls within the error range are assigned to one connected-region group; that is, two center-of-gravity points that are approximately equal on the second set coordinate component are marked as the same group. Each group may comprise two connected regions, and the two connected regions in a group represent the connected regions of the same vehicle. The first set coordinate component may be the X axis and the second the Y axis, with the camera at the origin.
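The mean-method centroid and the left-to-right pairing by the Y (second) coordinate component can be sketched as follows; the `y_tol` error range is an illustrative assumption, since the text only speaks of an "error range":

```python
def centroid(region):
    """Mean-method centre of gravity of a region given as (x, y) pixels."""
    xs = [p[0] for p in region]
    ys = [p[1] for p in region]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def group_by_row(regions, y_tol=5.0):
    """Pair connected regions whose centroid Y components differ by at
    most y_tol pixels, scanning left to right: such a pair is taken as
    the two tail lamps of one vehicle."""
    items = sorted((centroid(r), r) for r in regions)  # sort by centroid X
    groups, used = [], [False] * len(items)
    for i, ((_, yi), ri) in enumerate(items):
        if used[i]:
            continue
        for j in range(i + 1, len(items)):
            (_, yj), rj = items[j]
            if not used[j] and abs(yi - yj) <= y_tol:
                groups.append((ri, rj))
                used[i] = used[j] = True
                break
    return groups
```

A stray light source whose centroid sits at a different image height is simply left unpaired, which is the point of grouping on the Y component.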
S150, respectively determining the brightness average value in the connected region groups of the current frame image and the previous frame image, and obtaining a first brightness average value and a second brightness average value.
In this embodiment, the brightness average within a connected-region group of the current frame image and of the previous frame image may be determined from the brightness of the center-of-gravity pixel and its adjacent pixels, obtaining the first and second brightness averages.
Optionally, the first and second brightness averages may be obtained as follows: for the current frame image, determine the average brightness of the center-of-gravity pixel of each connected region in the group and of its adjacent pixels, obtaining the first brightness average; for the previous frame image, determine the same quantity, obtaining the second brightness average.
In this embodiment, determining the center-of-gravity point is equivalent to determining it in the HSV color model space, so the brightness average can be computed around that point. Specifically, because a connected-region group comprises a first connected region corresponding to the left tail lamp and a second connected region corresponding to the right tail lamp, the first brightness average comprises the brightness average of the first connected region of the current frame image and that of the second connected region of the current frame image; likewise, the second brightness average comprises the brightness average of the first connected region of the previous frame image and that of the second connected region of the previous frame image. In each case, the brightness average of a connected region is the average brightness of its center-of-gravity pixel and the adjacent pixels.
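The per-region brightness average over the centre-of-gravity pixel and its neighbours might look like this. The 8-neighbourhood and the clipping at the image border are assumptions, since the text only says "adjacent pixel points":

```python
def brightness_average(v_channel, cx, cy):
    """Average brightness (HSV V channel) over the centre-of-gravity
    pixel and its 8-neighbourhood, clipped at the image border.

    v_channel is a nested list of brightness values; (cx, cy) is a
    region's centroid rounded to the nearest pixel.
    """
    h, w = len(v_channel), len(v_channel[0])
    vals = [v_channel[y][x]
            for y in range(max(0, cy - 1), min(h, cy + 2))
            for x in range(max(0, cx - 1), min(w, cx + 2))]
    return sum(vals) / len(vals)
```

Averaging over the small neighbourhood rather than a single pixel makes the measurement less sensitive to sensor noise on any one pixel.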
S160, determining the braking state of the front vehicle according to the first brightness average value and the second brightness average value.
Wherein the braking state includes braking and not braking. In this embodiment, the braking state of the vehicle ahead may be determined from the error between the first and second brightness averages: if the difference between them is large, the vehicle ahead can be considered to be braking.
Optionally: if the error between the first brightness average and the second brightness average falls within the set brightness threshold range, the braking state of the vehicle ahead is determined to be braking.
Specifically, if the error between the brightness average of the first connected region of the current frame image and that of the first connected region of the previous frame image falls within the set brightness threshold range, or the corresponding error for the second connected region does, the braking state of the vehicle ahead is determined to be braking.
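The decision rule can be sketched as follows. The text states only that the error between the averages must fall within a set brightness threshold range; the concrete threshold value and the reading of "braking" as a brightness *increase* between frames are assumptions:

```python
def lamp_braking(current_avg, previous_avg, threshold=0.2):
    """Per-lamp test: braking if the brightness increase from the
    previous frame to the current frame exceeds the set threshold.
    The 0.2 threshold is illustrative, not a value from the text."""
    return (current_avg - previous_avg) > threshold

def vehicle_braking(first_avgs, second_avgs, threshold=0.2):
    """A vehicle is judged braking if either tail lamp passes the
    per-lamp test (left OR right, as described above).

    first_avgs / second_avgs are (left, right) brightness averages for
    the current and previous frame images respectively."""
    return any(lamp_braking(a, b, threshold)
               for a, b in zip(first_avgs, second_avgs))
```

Requiring only one lamp keeps the detection robust when one brake lamp is occluded or defective.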
Optionally, after determining that the vehicle ahead is braking, the method further includes: determining the azimuth information of the braking vehicle from the position of the center of gravity of each connected region in the group, where that position is characterized by the first set coordinate component and the azimuth information includes front-left, front-right, and straight ahead; and determining the voice prompt and the direction of the light prompt from the azimuth information.
In this embodiment, if the centers of gravity of all connected regions in the group lie in the negative direction of the first set coordinate component (e.g. the X axis), the braking vehicle is considered to be at the front-left; the voice prompt may be "please note, the vehicle at the front-left is braking" and the light prompt is the left lamp. If they all lie in the positive direction, the braking vehicle is at the front-right; the voice prompt may be "please note, the vehicle at the front-right is braking" and the light prompt is the right lamp. If one center of gravity lies in the positive direction and the other in the negative direction, the braking vehicle is straight ahead; the voice prompt may be "please note, the vehicle ahead is braking" and the light prompt is the middle lamp.
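The sign-based mapping from the tail-lamp centroid X components (camera at the origin) to the prompt direction can be written directly:

```python
def brake_azimuth(left_cx, right_cx):
    """Map the centre-of-gravity X components of a braking vehicle's two
    tail-lamp regions to the prompt direction described above:
    both negative -> front-left, both positive -> front-right,
    straddling the origin -> straight ahead."""
    if left_cx < 0 and right_cx < 0:
        return "front-left"      # light the left warning lamp
    if left_cx > 0 and right_cx > 0:
        return "front-right"     # light the right warning lamp
    return "straight ahead"      # light the middle warning lamp
```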
In this embodiment, the voice prompt may be broadcast through the vehicle-mounted audio system, and the light prompt may be given by warning lamps arranged on the dashboard below the front windshield, in front of the steering wheel. The warning lamps include a left lamp, a right lamp and a middle lamp, as shown in fig. 2, which is a schematic diagram of the mounting positions of the warning lamps according to an embodiment of the invention. When a vehicle at the front-left brakes, the left lamp is lit; when a vehicle at the front-right brakes, the right lamp is lit; and when a vehicle straight ahead brakes, the middle lamp is lit.
Optionally, the connected region group includes a first connected region corresponding to the left tail lamp and a second connected region corresponding to the right tail lamp, and after the braking state of the preceding vehicle is determined to be braking, the method further includes: determining second difference information between the center of gravity of the first connected region and the center of gravity of the second connected region on the first set coordinate component; determining the spacing between the current vehicle and the preceding vehicle based on the second difference information; determining a braking urgency level based on the spacing; and determining the brightness of the light prompt based on the braking urgency level.
In this embodiment, for the current frame image or the previous frame image, the second difference information between the centers of gravity of the first and second connected regions on the first set coordinate component (for example, their difference on the X axis) may be determined; the spacing between the current vehicle and the preceding vehicle is determined from this difference, the braking urgency level is determined from the spacing, and the brightness of the light prompt is determined from the urgency level. That is, the smaller the spacing, the higher the urgency and the brighter the lamp. The urgency may be divided into three levels: low, medium and high.
For example, if the preceding vehicle brakes and the spacing between the current vehicle and the preceding vehicle falls within the highest urgency range, the middle warning lamp is lit at maximum brightness, and the corresponding voice prompt may be "please note, the vehicle ahead is braking".
In general, the width of a common vehicle is 1.85 m to 2.5 m; the maximum value is taken as the unit standard so that the calculation covers all cases. The brake lamps (left and right tail lamps) of a vehicle thus span 2.5 units. If the camera view angle is α, an image captured 10 meters behind the tested preceding vehicle spans 10·tanα units, so the tail lamps occupy a fraction 2.5/(10·tanα) of the image width. Generalizing, if the tail lamps occupy a fraction k of the image width at distance S, then k = 2.5/(S·tanα), and the spacing between the current vehicle and the preceding vehicle can be calculated by the formula: S = 2.5/(k·tanα). Here S is the spacing, k is the second difference information expressed as a fraction of the image width, α is the camera view angle, and tanα relates distance to the image width the view angle can contain.
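The spacing formula can be checked numerically. A short sketch, assuming the view angle is given in degrees and k is the fractional tail-lamp separation as described above; the function name is illustrative:

```python
import math

VEHICLE_WIDTH_UNITS = 2.5  # maximum common vehicle width, taken as the unit standard


def inter_vehicle_distance(k, alpha_deg):
    """Spacing S between the current vehicle and the preceding vehicle,
    from the fractional tail-lamp separation k (second difference
    information / image width) and camera view angle alpha:
        S = 2.5 / (k * tan(alpha))
    """
    return VEHICLE_WIDTH_UNITS / (k * math.tan(math.radians(alpha_deg)))
```

Self-consistency check: at S = 10 m with α = 45°, the lamps occupy k = 2.5/(10·tan 45°) = 0.25 of the image width, and plugging k = 0.25 back in recovers S = 10.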
According to the technical scheme disclosed by this embodiment, an image of the preceding vehicle is acquired, the image including a current frame image and a previous frame image; the image is preprocessed to obtain a target image; a plurality of connected regions is determined from the target image, each connected region containing a target object; the connected regions are grouped to obtain a connected region group, where a connected region group characterizes connected regions belonging to the same vehicle and comprises at least one connected region; the brightness averages within the connected region group of the current frame image and the previous frame image are determined respectively, to obtain a first brightness average and a second brightness average; and the braking state of the preceding vehicle is determined according to the first and second brightness averages, the braking state including braking and not braking. Because the braking state is determined from the brightness averages within the same connected region group of the current and previous frame images, the accuracy of determining the braking state of the preceding vehicle, and thereby driving safety, can be improved.
In this embodiment, by simulating the way a person filters out non-vehicle targets when looking at the vehicle ahead, the left and right tail lamps of a vehicle are treated as one group. The braking state of the preceding vehicle can be determined from the brightness of the same group of brake lamps in the front and rear frame images, and the azimuth of the braking vehicle can then be determined and used as the basis for both the voice broadcast of the vehicle-mounted audio system and the warning-lamp prompt. Meanwhile, the brightness of the warning-lamp prompt can be determined from the spacing between the current vehicle and the preceding vehicle, realizing a safety-assisted driving function for determining the braking state of the vehicle ahead.
In this embodiment, the image of the preceding vehicle may be collected by a camera module. An SDRAM (synchronous dynamic random-access memory) module stores the video captured by the camera module and the sampled video frames; a vehicle-speed microcontroller module detects and outputs vehicle-speed information, and the ARM processor controls the image sampling frequency of the camera module according to the vehicle-speed information. The ARM processor module serves as the image-processing module: it preprocesses the sampled images or video frames, determines the braking state of the preceding vehicle, and outputs a signal corresponding to that state. The vehicle-mounted audio module receives the output signal from the ARM processor module and broadcasts the corresponding prompt tone; the power conversion module supplies converted power to the ARM processor; and the warning-lamp module, when braking of the preceding vehicle is detected, indicates the azimuth of the braking vehicle and warns of the braking situation with a brightness corresponding to the urgency level. The GM9910B chip of the warning-lamp module is a pulse width modulation (PWM) controlled high-efficiency light-emitting diode (LED) driver IC; it provides a low-frequency PWM dimming input and accepts a duty cycle of 0 to 100%.
The PWM pins of the three warning-lamp modules are connected to three PWM output pins of the ARM processor to adjust the brightness of the warning lamps, achieving warning effects of different urgency levels. The three dimming circuit boards are connected to the same 12 V supply, the PWMD pin of each GM9910B chip is driven by a general-purpose input/output pin of the ARM processor, and the ground terminal GND of the GM9910B and the GND of the ARM processor share a common ground. The vehicle-mounted audio module selects an audio file according to the result of the ARM image processing, reads the file, and performs the corresponding voice broadcast to warn the driver that the vehicle ahead is braking.
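The mapping from spacing to urgency level to PWM dimming duty cycle can be sketched as below. The concrete spacing thresholds and duty-cycle values are hypothetical (the patent only states three levels and that smaller spacing means higher brightness); the GM9910B's 0-100% duty-cycle range is taken from the text above.

```python
# Hypothetical (threshold in metres, urgency level, PWM duty 0.0-1.0) table,
# ordered from nearest to farthest; the final threshold is unbounded.
URGENCY_LEVELS = [
    (10.0, "high", 1.00),          # very close: maximum brightness
    (25.0, "medium", 0.60),
    (float("inf"), "low", 0.25),
]


def pwm_duty_for_spacing(spacing):
    """Map the current-to-preceding-vehicle spacing to an urgency level and
    the corresponding warning-lamp PWM duty cycle: the smaller the spacing,
    the higher the urgency and the brighter the lamp."""
    for threshold, level, duty in URGENCY_LEVELS:
        if spacing < threshold:
            return level, duty
```

An MCU port of this table would simply write the returned duty to the PWM peripheral driving the GM9910B's PWMD pin.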
Fig. 3 is a schematic circuit diagram of the ARM processor controlling the three warning lamps according to an embodiment of the present invention. The PWM pin of each warning-lamp circuit is connected to a PWM pin of the ARM processor, which outputs a continuous low level when the warning lamp is not active. The input capacitor C_IN reduces input ripple and stabilizes the input; the output capacitor C0 reduces output ripple; the inductor L1 supplies current to the lamp beads when the gate is closed, providing the freewheeling function; the diode D1, together with C0 at the output, separates the different current loops of the L1 freewheeling period and the normal gate-on period; the resistor R_CS is the sampling resistor, and the GATE is opened and closed according to the voltage across it, achieving constant-current regulation. VDD is the power supply, LD is the linear dimming pin, RT is the pin for setting the oscillation frequency, C_DD is a capacitor and R_T is a resistor.
The CS pin is the sampling pin. When the sampled voltage is below the threshold, the current loop runs from the positive supply terminal through the working lamp, through the capacitor C0 (which absorbs ripple and charges), through the inductor L1, then from the source of the MOS transistor to its drain, and finally through the sampling resistor to ground. When the sampled voltage exceeds the threshold, the control chip closes the GATE and the current loop changes: the capacitor discharges through the positive terminal of the working lamp, and the current flows from the inductor through the diode D1 back to the negative terminal of the working lamp. Adjusting the value of the sampling resistor changes the current threshold through it, stabilizing the output current, and changing the duty cycle of the PWM waveform applied to PWMD yields output voltages of different magnitudes.
FIG. 4 is a wiring schematic of the 12 V to 3.3 V conversion using the AMS1117-3.3 regulator according to an embodiment of the present invention. The battery output voltage on a car is normally +12 V, while different blocks of the ARM processor run at several different voltages. The supply requirements typically include the core (powering the internal logic array), I/O (driving the I/O buffers, which can be grouped per bank, each operating from a different voltage), the phase-locked loop (PLL) (powering the PLL core), and the transceiver (supplying the digital and analog circuits in the transceiver, receiver and transmitter). In this embodiment, the AMS1117-3.3 voltage regulator first converts the voltage to 3.3 V, and the conventional ARM power conversion chip LM26480 then supplies power to each block of the ARM. The LM26480 integrates two 1.5 A buck switching regulators and two 300 mA linear regulators; it operates from a 2.8 to 5.5 V supply, with the first switching regulator providing 0.8 to 2.0 V and the second providing 1.0 to 3.3 V, thereby meeting the supply requirements of each ARM block. The wiring of the AMS1117-3.3 is shown in fig. 4, where C3 and C4 are output filter capacitors that suppress self-oscillation; without these two capacitors, the output of a typical linear regulator would be an oscillating waveform. C1 and C2 are input capacitors that prevent voltage reversal after power-off. uF is the capacitance unit.
Fig. 5 is a schematic structural diagram of a determining device for front vehicle braking according to an embodiment of the present disclosure, as shown in fig. 5, where the determining device includes:
an image acquisition module 501 for acquiring an image of a preceding vehicle; wherein the image comprises a current frame image and a previous frame image;
a target image obtaining module 502, configured to pre-process the image to obtain a target image;
a connected region determining module 503, configured to determine a plurality of connected regions from the target image; wherein each connected region contains a target object;
a grouping module 504, configured to group the plurality of connected regions to obtain a connected region group; wherein the connected region group characterizes connected regions within the same vehicle and comprises at least one connected region;
the brightness average value determining module 505 is configured to determine brightness average values in the connected region groups of the current frame image and the previous frame image, respectively, to obtain a first brightness average value and a second brightness average value;
a braking state determining module 506, configured to determine a braking state of the front vehicle according to the first luminance average value and the second luminance average value; wherein, the braking state includes braking and unbraking.
According to the technical scheme disclosed by this embodiment, an image of the preceding vehicle, including a current frame image and a previous frame image, is acquired by the image acquisition module; the image is preprocessed by the target image obtaining module to obtain a target image; a plurality of connected regions is determined from the target image by the connected region determining module, each connected region containing a target object; the connected regions are grouped by the grouping module to obtain a connected region group, which characterizes connected regions within the same vehicle and comprises at least one connected region; the brightness averages within the connected region group of the current and previous frame images are determined by the brightness average value determining module, to obtain a first brightness average and a second brightness average; and the braking state of the preceding vehicle, braking or not braking, is determined by the braking state determining module according to the first and second brightness averages. Because the braking state is determined from the brightness averages within the same connected region group of the current and previous frame images, the accuracy of determining the braking state of the preceding vehicle, and thereby driving safety, can be improved.
Optionally, the image acquisition module is configured to: acquire an image of the preceding vehicle; identify a target object in the acquired image; if no target object is identified, acquire images of the preceding vehicle at a set frequency; and if a target object is identified, determine a sampling frequency according to the relative speed of the current vehicle and the preceding vehicle, and acquire images of the preceding vehicle at that sampling frequency.
Optionally, the target image obtaining module is specifically configured to: convert the image into the HSV color model space to obtain a first image; screen out the pixels of the first image that fall within a set color range to form a second image; binarize the second image to obtain a binarized image; perform erosion and dilation operations on the binarized image to obtain a third image; and delete from the third image every region containing fewer pixels than a set threshold, to obtain the target image.
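The color-screening and binarization steps of this pipeline can be sketched with NumPy as below. The HSV range for brake-light red is a hypothetical "set color range" (the patent does not give concrete values), and the input is assumed to be already converted to HSV; in practice the conversion and the erosion/dilation steps would typically use an image-processing library such as OpenCV (`cv2.cvtColor`, `cv2.erode`, `cv2.dilate`).

```python
import numpy as np

# hypothetical HSV bounds for the "set color range" of red brake lights
H_RANGE, S_RANGE, V_RANGE = (0, 10), (100, 255), (150, 255)


def screen_and_binarize(hsv):
    """Keep only pixels whose H, S and V all fall within the set color
    range, and return the binarized mask (1 = candidate lamp pixel).
    `hsv` is an (H, W, 3) uint8 array already in HSV space."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    mask = ((H_RANGE[0] <= h) & (h <= H_RANGE[1])
            & (S_RANGE[0] <= s) & (s <= S_RANGE[1])
            & (V_RANGE[0] <= v) & (v <= V_RANGE[1]))
    return mask.astype(np.uint8)
```

After erosion/dilation, regions with fewer pixels than the set threshold would be dropped, e.g. by filtering the label statistics of a connected-component pass.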
Optionally, the grouping module is specifically configured to: determine the center of gravity of each connected region, the center of gravity being characterized by a first set coordinate component and a second set coordinate component; determine first difference information between the centers of gravity on the second set coordinate component; and divide the connected regions whose first difference information falls within an error range into one connected region group.
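The centroid grouping just described (regions whose second coordinate component, typically Y, agrees to within an error range belong to the same vehicle, since a vehicle's two tail lamps sit at the same height) can be sketched as follows. The tolerance value and the greedy first-member comparison are illustrative assumptions.

```python
def group_by_row(centroids, tol=5):
    """Group connected-region centres of gravity whose Y coordinates
    (second set coordinate component) differ by at most `tol` pixels.
    `centroids` is a list of (x, y) points; returns a list of groups."""
    groups = []
    for x, y in sorted(centroids, key=lambda c: c[1]):
        for g in groups:
            if abs(g[0][1] - y) <= tol:   # within the error range of the group
                g.append((x, y))
                break
        else:
            groups.append([(x, y)])       # start a new vehicle group
    return groups
```

Two tail lamps at (10, 100) and (90, 102) land in one group, while a lamp at (50, 200), e.g. from another vehicle, forms its own group.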
Optionally, the brightness average value determining module is specifically configured to: for the current frame image, determine the brightness average of the center-of-gravity pixel of each connected region in the connected region group and its adjacent pixels, to obtain the first brightness average; and for the previous frame image, determine the brightness average of the center-of-gravity pixel of each connected region in the connected region group and its adjacent pixels, to obtain the second brightness average.
Optionally, the braking state determining module is specifically configured to: determine the braking state of the preceding vehicle as braking if the difference between the first brightness average and the second brightness average falls within a set brightness threshold range.
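The brightness-difference decision can be sketched in a few lines. The threshold range is hypothetical (the patent leaves the "set brightness threshold range" unspecified); the idea is that a brake-lamp turn-on produces a brightness jump between consecutive frames that falls inside the range.

```python
def is_braking(first_avg, second_avg, low=30, high=255):
    """Braking if the jump from the previous frame's brightness average
    (second_avg) to the current frame's (first_avg) within the same
    connected region group falls inside the set threshold range."""
    return low <= (first_avg - second_avg) < high
```

A jump from 120 to 200 is classified as braking; a small flicker from 125 to 130 is not.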
Optionally, the device further comprises an azimuth information determining module, configured to determine azimuth information of the braking vehicle according to the position information of the center of gravity of each connected region in the connected region group; wherein the position information of the center of gravity is characterized by the first set coordinate component; the azimuth information comprises front-left, front-right and straight ahead; and to determine the voice prompt information and the direction of the light prompt according to the azimuth information.
Optionally, the connected region group includes a first connected region corresponding to the left tail lamp and a second connected region corresponding to the right tail lamp. Optionally, the apparatus further includes a brightness information determining module, configured to: determine second difference information between the center of gravity of the first connected region and the center of gravity of the second connected region on the first set coordinate component; determine the spacing between the current vehicle and the preceding vehicle based on the second difference information; determine a braking urgency level based on the spacing; and determine the brightness of the light prompt based on the braking urgency level.
The front vehicle brake determining device provided by the embodiment of the disclosure can execute the front vehicle brake determining method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the executing method.
It should be noted that each unit and module included in the above apparatus are only divided according to the functional logic, but not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for convenience of distinguishing from each other, and are not used to limit the protection scope of the embodiments of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring now to fig. 6, a schematic diagram of an electronic device (e.g., a terminal device or server in fig. 6) 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage means 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The electronic device provided in the embodiment of the present disclosure and the method for determining a front vehicle brake provided in the foregoing embodiment belong to the same inventive concept, and technical details not described in detail in the present embodiment may be referred to the foregoing embodiment, and the present embodiment has the same beneficial effects as the foregoing embodiment.
The embodiment of the present disclosure provides a computer storage medium having a computer program stored thereon, which when executed by a processor, implements the method for determining a front vehicle brake provided by the above embodiment.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an image of the preceding vehicle, the image including a current frame image and a previous frame image; preprocess the image to obtain a target image; determine a plurality of connected regions from the target image, each connected region containing a target object; group the plurality of connected regions to obtain a connected region group, which characterizes connected regions within the same vehicle and comprises at least one connected region; determine the brightness averages within the connected region group of the current frame image and the previous frame image respectively, to obtain a first brightness average and a second brightness average; and determine the braking state of the preceding vehicle, braking or not braking, according to the first brightness average and the second brightness average.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example the first acquisition unit may also be described as "unit acquiring at least two internet protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and description of the principles of the technology being employed. It will be appreciated by persons skilled in the art that the scope of the disclosure referred to in this disclosure is not limited to the specific combinations of features described above, but also covers other embodiments which may be formed by any combination of features described above or equivalents thereof without departing from the spirit of the disclosure. Such as those described above, are mutually substituted with the technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (11)

1. A method for determining a front vehicle brake, comprising:
acquiring an image of a vehicle in front; wherein the image comprises a current frame image and a previous frame image;
preprocessing the image to obtain a target image;
determining a plurality of connected regions according to the target image; wherein each connected region includes a target object;
grouping the plurality of connected regions to obtain a connected region group; wherein the connected region group characterizes connected regions belonging to the same vehicle; the connected region group comprises at least one connected region;
respectively determining the brightness average value in the connected region groups of the current frame image and the previous frame image to obtain a first brightness average value and a second brightness average value;
determining a braking state of the front vehicle according to the first brightness average value and the second brightness average value; wherein the braking state includes braking and not braking.
2. The method of claim 1, wherein acquiring an image of a vehicle in front comprises:
acquiring an image of a vehicle in front;
identifying a target object for the acquired image;
if the target object is not identified, acquiring images of the front vehicle at a set frequency;
if the target object is identified, determining a sampling frequency according to the relative speed between the current vehicle and the front vehicle, and acquiring an image of the front vehicle based on the sampling frequency.
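The adaptive sampling in claim 2 can be sketched as a simple rate schedule: sample at the set base frequency until a tail lamp is seen, then raise the rate with the closing speed. The base rate, gain, and cap below are illustrative values, not taken from the patent.

```python
def sampling_hz(relative_speed_mps, base_hz=10.0, gain=0.5, max_hz=60.0):
    """Sketch of claim 2: once a target (tail lamp) is detected, sample
    faster the quicker the gap to the front vehicle is closing; otherwise
    stay at the set base frequency. All constants are assumed.

    relative_speed_mps > 0 means the front vehicle is approaching."""
    if relative_speed_mps <= 0:
        # gap not closing: the set (base) frequency is sufficient
        return base_hz
    return min(base_hz + gain * relative_speed_mps, max_hz)
```

A faster closing speed means brake onset must be caught within fewer frames, which is why the rate rises monotonically with relative speed up to a hardware-bound cap.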
3. The method of claim 1, wherein preprocessing the image to obtain a target image comprises:
converting the image into the HSV color model space to obtain a first image;
screening out pixel points in the first image that fall within a set color range to serve as a second image;
performing binarization processing on the second image to obtain a binarized image;
performing erosion and dilation operations on the binarized image to obtain a third image;
and deleting, from the third image, regions whose number of pixel points is smaller than a set threshold, to obtain the target image.
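The preprocessing chain of claim 3 can be illustrated with plain NumPy, assuming the input array is already in HSV space. The red hue range and the 3x3 structuring element are assumed example values (the patent fixes neither), and the final small-region deletion step is omitted for brevity.

```python
import numpy as np

def preprocess(hsv_img, lo=(170, 80, 80), hi=(180, 255, 255)):
    """Sketch of claim 3: keep pixels within a set HSV color range
    (a tail-light red band is the assumed example), binarize the result,
    then erode and dilate with a 3x3 structuring element.

    hsv_img: HxWx3 uint8 array already converted to HSV."""
    lo_a, hi_a = np.array(lo), np.array(hi)
    # color screening + binarization in one step: 1 inside the range, else 0
    mask = np.all((hsv_img >= lo_a) & (hsv_img <= hi_a), axis=-1).astype(np.uint8)

    def shifts(m):
        # the nine 3x3-neighbourhood views of m, zero-padded at the border
        p = np.pad(m, 1, constant_values=0)
        return np.stack([p[i:i + m.shape[0], j:j + m.shape[1]]
                         for i in range(3) for j in range(3)])

    eroded = shifts(mask).min(axis=0)    # pixel survives only if whole 3x3 is set
    dilated = shifts(eroded).max(axis=0) # grow the surviving cores back out
    return dilated
```

Erosion followed by dilation (morphological opening) removes isolated noise pixels while restoring the extent of genuine lamp blobs, which is why the claim orders the two operations this way.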
4. The method of claim 1, wherein grouping the plurality of connected regions to obtain a connected region group comprises:
determining the center of gravity of each connected region; wherein the center of gravity is characterized by a first set coordinate component and a second set coordinate component;
determining first difference information between centers of gravity on the second set coordinate component;
and dividing connected regions whose first difference information falls within an error range into one connected region group.
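The grouping of claim 4 amounts to clustering centroids by their vertical (second) coordinate, since the two tail lamps of one vehicle sit at roughly the same image height. A minimal sketch, with an assumed pixel tolerance:

```python
def group_by_row(centroids, tol=5.0):
    """Sketch of claim 4: put connected regions whose centroid
    y-coordinates (the 'second set coordinate component') differ by at
    most tol pixels into the same group. centroids is a list of (x, y)
    tuples; tol is an assumed error range."""
    groups = []
    for cx, cy in sorted(centroids, key=lambda c: c[1]):
        for g in groups:
            # compare against the group's first member's y-coordinate
            if abs(g[0][1] - cy) <= tol:
                g.append((cx, cy))
                break
        else:
            groups.append([(cx, cy)])
    return groups
```

With centroids at y = 50, 52, and 120, the first two land in one group (same vehicle's lamp pair) and the third forms its own group, matching the claim's "at least one connected region" per group.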
5. The method of claim 4, wherein determining the brightness average value in the connected region groups of the current frame image and the previous frame image, respectively, to obtain the first brightness average value and the second brightness average value, comprises:
for the current frame image, determining the brightness average value of the center-of-gravity pixel point and its adjacent pixel points for each connected region in the connected region group, to obtain the first brightness average value;
and for the previous frame image, determining the brightness average value of the center-of-gravity pixel point and its adjacent pixel points for each connected region in the connected region group, to obtain the second brightness average value.
6. The method of claim 5, wherein determining the braking state of the front vehicle according to the first brightness average value and the second brightness average value comprises:
if the error between the first brightness average value and the second brightness average value falls within a set brightness threshold range, determining the braking state of the front vehicle as braking.
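Claims 5 and 6 together reduce to two small functions: average the brightness channel around each group's centroids, then compare the averages across frames. The neighbourhood size and decision threshold below are assumed; the patent states only that the error between the two averages is tested against a set range, and reading that as a minimum brightness rise is one plausible interpretation.

```python
import numpy as np

def group_brightness(v_channel, centroids, k=1):
    """Sketch of claim 5: average the brightness (V channel) of each
    region's center-of-gravity pixel and its (2k+1)x(2k+1) neighbourhood,
    over all regions in one group. centroids is a list of (x, y) ints."""
    vals = []
    for cx, cy in centroids:
        patch = v_channel[max(cy - k, 0):cy + k + 1,
                          max(cx - k, 0):cx + k + 1]
        vals.append(patch.mean())
    return float(np.mean(vals))

def is_braking(curr_avg, prev_avg, rise_threshold=30.0):
    """Sketch of claim 6: flag braking when the brightness of the lamp
    group rises by at least rise_threshold between the previous and the
    current frame. rise_threshold is an assumed value."""
    return (curr_avg - prev_avg) >= rise_threshold
```

Comparing consecutive frames rather than absolute brightness makes the test robust to ambient light: a brake light that was already on stays bright in both frames, while a brake onset produces a sharp frame-to-frame jump.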
7. The method of claim 6, further comprising, after determining the braking state of the front vehicle as braking:
determining azimuth information of the braking vehicle according to position information of the center of gravity of each connected region in the connected region group; wherein the position information of the center of gravity is characterized by the first set coordinate component; the azimuth information includes front-left, directly ahead, and front-right;
and determining voice prompt information and the direction of a light prompt according to the azimuth information.
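Claim 7's bearing classification uses only the horizontal (first) coordinate of the lamp centroids: left of a central band means front-left, right of it front-right, inside it directly ahead. The width of the central band is an assumed parameter.

```python
def azimuth(center_x, image_width, band=0.2):
    """Sketch of claim 7: classify the braking vehicle's bearing from
    the centroid x-coordinate (the 'first set coordinate component').
    band is the assumed fraction of the image width treated as
    'directly ahead'."""
    lo = image_width * (0.5 - band / 2)
    hi = image_width * (0.5 + band / 2)
    if center_x < lo:
        return "front-left"
    if center_x > hi:
        return "front-right"
    return "directly ahead"
```

The returned bearing would then select both the wording of the voice prompt and which side's light prompt to activate.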
8. The method of claim 7, wherein the connected region group includes a first connected region corresponding to a left tail lamp and a second connected region corresponding to a right tail lamp, the method further comprising, after determining the braking state of the front vehicle as braking:
determining second difference information between the center of gravity of the first connected region and the center of gravity of the second connected region on the first set coordinate component;
determining a distance between the current vehicle and the front vehicle based on the second difference information;
determining a braking urgency level based on the distance;
and determining brightness information of the light prompt based on the braking urgency level.
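The distance step in claim 8 follows from similar triangles: the real lamp-to-lamp span projects onto a pixel gap that shrinks inversely with distance. The lamp span, focal length, and urgency thresholds below are assumed calibration values, not taken from the patent.

```python
def estimate_distance(x_left, x_right, lamp_span_m=1.5, focal_px=1000.0):
    """Sketch of claim 8: map the horizontal pixel gap between the two
    tail-lamp centroids (the 'second difference information') to a
    distance via the pinhole model: distance = f * real_span / pixel_gap.
    lamp_span_m and focal_px are assumed calibration values."""
    gap_px = abs(x_right - x_left)
    if gap_px == 0:
        raise ValueError("tail-lamp centroids coincide; cannot estimate distance")
    return focal_px * lamp_span_m / gap_px

def urgency_level(distance_m, near=10.0, mid=25.0):
    """Closer braking vehicle -> higher urgency; thresholds are illustrative."""
    if distance_m < near:
        return 2   # high urgency: brightest light prompt
    if distance_m < mid:
        return 1   # medium urgency
    return 0       # low urgency
```

The urgency level would then scale the brightness of the in-cabin light prompt, so a vehicle braking close ahead produces a more conspicuous warning than one braking far away.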
9. A front vehicle brake determining apparatus, comprising:
An image acquisition module for acquiring an image of a preceding vehicle; wherein the image comprises a current frame image and a previous frame image;
the target image acquisition module is used for preprocessing the image to acquire a target image;
a connected region determining module for determining a plurality of connected regions according to the target image; wherein each connected region includes a target object;
a grouping module for grouping the plurality of connected regions to obtain a connected region group; wherein the connected region group characterizes connected regions belonging to the same vehicle; the connected region group comprises at least one connected region;
a brightness average value determining module for respectively determining brightness average values in the connected region groups of the current frame image and the previous frame image to obtain a first brightness average value and a second brightness average value;
and a braking state determining module for determining the braking state of the front vehicle according to the first brightness average value and the second brightness average value; wherein the braking state includes braking and not braking.
10. An electronic device, the electronic device comprising:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for determining a front vehicle brake as recited in any one of claims 1-8.
11. A storage medium containing computer-executable instructions which, when executed by a computer processor, perform the method for determining a front vehicle brake as claimed in any one of claims 1 to 8.
CN202310275725.0A 2023-03-21 2023-03-21 Method, device, equipment and storage medium for determining front vehicle brake Pending CN116416599A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310275725.0A CN116416599A (en) 2023-03-21 2023-03-21 Method, device, equipment and storage medium for determining front vehicle brake

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310275725.0A CN116416599A (en) 2023-03-21 2023-03-21 Method, device, equipment and storage medium for determining front vehicle brake

Publications (1)

Publication Number Publication Date
CN116416599A true CN116416599A (en) 2023-07-11

Family

ID=87050713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310275725.0A Pending CN116416599A (en) 2023-03-21 2023-03-21 Method, device, equipment and storage medium for determining front vehicle brake

Country Status (1)

Country Link
CN (1) CN116416599A (en)

Similar Documents

Publication Publication Date Title
US11386673B2 (en) Brake light detection
CN108737992B (en) Camera-assisted diagnosis of vehicle lights via inter-vehicle communication
US9813593B2 (en) Outside recognition apparatus and lens dirtiness detection method
CN107972569B (en) Vehicle lamp set control method, device and system and vehicle
CN103448653A (en) Vehicle collision warning system and method
US11341753B2 (en) Emergency vehicle detection
US10748012B2 (en) Methods and apparatus to facilitate environmental visibility determination
CN111669513A (en) System and method for low-light vision through pulse illumination
CN105374221A (en) Reminder system and reminder method of states of traffic lights
EP3737084A1 (en) Solid-state imaging element, imaging device, and method for controlling solid-state imaging element
US11490023B2 (en) Systems and methods for mitigating light-emitting diode (LED) imaging artifacts in an imaging system of a vehicle
JP2018077828A (en) Image processing algorithm
CN205059421U (en) Advanced driver assistance systems's improvement structure
EP3806451A1 (en) Solid-state imaging element, imaging device, and method for controlling solid-state imaging element
CN112793586A (en) Automatic driving control method and device for automobile and computer storage medium
US9969332B1 (en) Reduction of LED headlight flickering in electronic mirror applications
CN116416599A (en) Method, device, equipment and storage medium for determining front vehicle brake
EP3518524A1 (en) Signal processing device, image-capturing device, and signal processing method
CN115131749A (en) Image processing apparatus, image processing method, and computer-readable storage medium
CN112435475A (en) Traffic state detection method, device, equipment and storage medium
CN113129617A (en) Driving prompting method and device
JP2020136731A (en) Abnormality detection system, mobile object, abnormality detection method, and program
WO2019044434A1 (en) Object detection system
CN117261747A (en) Automatic switching method and system for high beam and low beam of vehicle and vehicle
CN116424215A (en) Vehicle-mounted control system, vehicle and control method of vehicle-mounted control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination