CN115147791A - Vehicle lane change detection method and device, vehicle and storage medium - Google Patents

Vehicle lane change detection method and device, vehicle and storage medium Download PDF

Info

Publication number
CN115147791A
CN115147791A CN202210741525.5A
Authority
CN
China
Prior art keywords
lane
target vehicle
lane change
vehicle
tire
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210741525.5A
Other languages
Chinese (zh)
Inventor
杨伟嘉
韩旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Weride Technology Co Ltd
Original Assignee
Guangzhou Weride Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Weride Technology Co Ltd filed Critical Guangzhou Weride Technology Co Ltd
Priority to CN202210741525.5A priority Critical patent/CN115147791A/en
Publication of CN115147791A publication Critical patent/CN115147791A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a vehicle lane change detection method and device, a vehicle and a storage medium. The vehicle lane change detection method comprises the following steps: acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle; acquiring the distance between the tire landing point and the lane line according to the identified lane line and the tire landing point; calculating a lane change weight of the target vehicle; and acquiring the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight. The lane change intention of the target vehicle is obtained by combining the lane change weight with the distance between the tire landing point and the lane line, and the lane change intention obtained in this way has higher accuracy.

Description

Vehicle lane change detection method and device, vehicle and storage medium
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle lane change detection method and device, a vehicle and a storage medium.
Background
Autonomous driving (also known as automated driving, or driving automation) refers to a vehicle continuously performing some or all of the dynamic driving tasks in an automated fashion. In the Chinese national standard for automobile driving automation (GB/T 40429-2021), driving automation is classified into levels 0 to 5, where level 0 is emergency assistance and level 5 is full driving automation.
During the driving of an automatic driving vehicle, predicting the behavior of surrounding vehicles is an important expression of its perception capability, and the lane change behavior of surrounding vehicles is an important factor influencing driving decisions. At present, the position of a vehicle is mainly obtained through a laser radar, and the lane change intention of the vehicle is predicted by combining it with the lane line area of a semantic map. However, when there is no semantic map, or when the vehicle is turning, it is difficult for the current method to obtain an accurate prediction result.
In the prior art, there is also a scheme for recognizing whether a vehicle has a lane change intention based on image analysis. For example, in one conventional method, an image captured by a vehicle-mounted camera is analyzed: a lane line is identified from the image by using the characteristic that the lane line is white, wheels are identified based on their image characteristics, and the distance between the wheels and the lane line is determined; when that distance shows a shrinking trend and the amount of change is greater than a preset length, the target vehicle can be determined to have a lane change tendency.
However, in the above-described method, the tire landing point is detected from the wheel frame, and the landing point may be offset due to the viewing angle. Therefore, the accuracy of the existing approaches needs to be improved.
Disclosure of Invention
The invention mainly aims to provide a vehicle lane change detection method and device, a vehicle, and a storage medium, which can improve the accuracy of identifying a vehicle's lane change intention.
In order to achieve the above object of the invention, the present invention provides a lane change detection method for a vehicle, the method comprising the steps of:
acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
calculating lane change weight of the target vehicle;
and determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
Optionally, the calculating the lane change weight of the target vehicle includes:
identifying the lane line type of the lane line, and acquiring a first lane change weight according to the lane line type; and/or,
acquiring the type of the road where the target vehicle is located, and acquiring a second lane change weight according to the type of the road; and/or,
obtaining a third lane change weight according to whether the target vehicle runs in the tunnel or not; and/or,
obtaining a fourth lane change weight according to whether a running vehicle exists in the specified range of the target vehicle;
and accumulating the first lane change weight and/or the second lane change weight and/or the third lane change weight and/or the fourth lane change weight to obtain the lane change weight of the target vehicle.
Optionally, the extracting a two-dimensional structure of a target vehicle from the acquired image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle includes:
according to the three-dimensional structure of the target vehicle detected by the radar, obtaining the two-dimensional structure of the target vehicle from the image;
inputting the two-dimensional structure into a segmentation model to obtain coordinates of a contact point between a tire of the target vehicle and the ground; and
and converting the coordinates of the contact point between the tire of the target vehicle and the ground into the coordinates of the tire of the target vehicle in the image, wherein the coordinates are the tire landing point of the target vehicle.
Optionally, obtaining a distance between the tire landing point and the lane line according to the identified lane line and the tire landing point includes:
mapping the lane line and the tire landing point to a world coordinate system;
and obtaining the distance between the tire landing point and the lane line from a world coordinate system.
Optionally, the tire landing point includes a front wheel landing point and a rear wheel landing point, and a distance between the tire landing point and the lane line includes: a distance between a front wheel of the target vehicle and the lane line, and a distance between a rear wheel of the target vehicle and the lane line.
Optionally, the determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight includes the following steps:
inputting the distance between the tire landing point and the lane line into a network model to obtain a first lane change probability of the target vehicle;
acquiring a second lane change probability of the target vehicle according to the lane change weight;
and multiplying the first lane change probability and the second lane change probability to obtain the lane change probability of the target vehicle.
In addition, to achieve the above object, the present invention also provides a vehicle lane change detection apparatus, including:
the landing point identification module is used for acquiring a two-dimensional structure of a target vehicle in an image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
the distance calculation module is used for acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
the weight calculation module is used for calculating lane changing weight of the target vehicle;
and the lane change probability calculation module is used for determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
Further, to achieve the above object, the present invention also proposes a vehicle comprising: a memory, a processor, and a vehicle lane change detection program stored on the memory and executable on the processor, the vehicle lane change detection program configured to implement the steps of any of the vehicle lane change detection methods as described above.
Furthermore, to achieve the above object, the present invention also proposes a computer-readable storage medium storing a computer program executable by a processor to implement the steps of any one of the vehicle lane change detection methods as described above.
The embodiment of the invention has the following beneficial effects:
embodiments of the present invention determine the tire landing point of a target vehicle by segmenting a two-dimensional structure of the target vehicle extracted from an image, so that the position information of the landing point can be obtained directly without processing a wheel frame. Meanwhile, a lane change weight is obtained according to the environment of the target vehicle, and the lane change intention of the target vehicle is obtained according to the lane change weight and the distance between the tire landing point and the lane line. The lane change intention of the target vehicle obtained by this scheme has higher accuracy.
Drawings
Fig. 1 is a schematic flow chart of a vehicle lane change detection method provided by the present invention.
Fig. 2 is a schematic flow chart of obtaining lane change weight of a target vehicle according to the present invention.
Fig. 3 is a schematic flow chart of determining a tire landing point of a target vehicle according to the present invention.
Fig. 4 is a schematic flow chart of obtaining lane change probability of a target vehicle according to the present invention.
Fig. 5 is a schematic structural diagram of an embodiment of the vehicle lane change detection apparatus of the present invention.
Fig. 6 is a schematic vehicle structure diagram of a hardware operating environment according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the present invention clearer and clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The present invention will be described below by way of examples with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a vehicle lane change detection method according to an embodiment of the present invention. The scheme of fig. 1 is used for identifying whether a target vehicle has a lane change intention, for example, whether a vehicle in a front adjacent lane (on the front right side or the front left side) intends to change into the own lane. Specifically, the scheme of fig. 1 may include the following steps:
s1, acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle.
The vehicle is a vehicle controlled by an automatic driving system, and the vehicle is loaded with software and hardware systems supporting the automatic driving function, such as an automatic driving operation system, a laser radar, a camera, and the like. The automatic driving here includes full automatic driving, assist driving, and the like, for example, automatic driving on the L0 to L5 levels.
In this step, the vehicle-mounted camera may be used to capture an image of the environment of the vehicle, so as to obtain an image of the environment.
There are various ways to identify the lane line. For example, the lane line may be identified from the image by using the characteristic that the lane line is white, as described in the background art. However, obtaining the lane line area through color information is very easily influenced by other factors, and a lane line fitted in this way is easily different from the actual lane line. Therefore, the present embodiment preferably employs the LaneATT network model to identify the lane lines in the image. LaneATT is a real-time, high-performance lane line detection model; it is anchor-based and applies an attention mechanism. Of course, lane line detection schemes based on other network models are equally applicable to this embodiment.
The target vehicle mainly refers to a vehicle located in a front adjacent lane (the left lane and/or the right lane) of the host vehicle; in automatic driving, the host vehicle needs to pay attention to whether such vehicles have a lane change intention. There may be several target vehicles, for example vehicle A in the right-hand lane in front, vehicle B, and vehicle C in the left-hand lane in front. In this embodiment and other embodiments, one target vehicle is taken as an example for illustration, but those skilled in the art will appreciate that these embodiments are equally applicable to scenarios with multiple target vehicles.
Additionally, radar may be utilized to lock the target vehicle. The radar arranged on the vehicle can detect the direction (front, rear, side and the like) and the corresponding distance of other vehicles in a certain range, and can determine which vehicles need special attention by combining the information, namely the target vehicle can be locked.
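For illustration only, one way of selecting such target vehicles from radar detections is sketched below; the ego-frame coordinate convention, the lane width and the attention range are assumptions, not values from the description.

```python
def select_target_vehicles(detections, lane_width=3.5, max_range=80.0):
    """Pick detections that lie ahead of the ego vehicle in the left or right adjacent lane.

    detections: list of (track_id, x, y) in the ego frame, x forward (m), y to the left (m).
    Returns the track ids of vehicles that deserve attention as potential lane changers.
    """
    targets = []
    for track_id, x, y in detections:
        ahead = 0.0 < x < max_range
        in_adjacent_lane = 0.5 * lane_width < abs(y) < 1.5 * lane_width
        if ahead and in_adjacent_lane:
            targets.append(track_id)
    return targets

# Example: one vehicle ahead in the right adjacent lane, one in the own lane, one far to the side
print(select_target_vehicles([(1, 30.0, -3.6), (2, 25.0, 0.2), (3, 40.0, 9.0)]))  # [1]
```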
In this step, the tire landing point of the target vehicle is directly recognized based on the segmentation method without using a wheel rim method, thereby improving the recognition accuracy.
Specifically, fig. 3 is a schematic diagram of one implementation of step S1. As shown in fig. 3, in step S301, a two-dimensional structure of the target vehicle, i.e. a 2D frame whose position in the image may be represented by coordinates (x, y), is obtained from an image acquired by the autonomous vehicle according to the three-dimensional structure (i.e. 3D frame) of the target vehicle detected by the radar.
In step S302, the two-dimensional structure is input into a segmentation model to obtain the coordinates (x', y') of the contact point between a tire of the target vehicle and the ground within the 2D frame. Finally, in step S303, the coordinates of the tire of the target vehicle in the image are obtained from the contact point coordinates as u = x + x', v = y + y'; this point is the tire landing point of the identified target vehicle.
The segmentation model may be, for example, a U-Net network model. As indicated above, the coordinates of the tires of the target vehicle in the image include an x coordinate and a y coordinate. In the prior art, the manner of identifying the tire landing point through the wheel rim generally determines only the y coordinate without the x coordinate, whereas the tire landing point determined in the present embodiment includes both coordinates and therefore has higher accuracy.
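For illustration only, the coordinate conversion of step S303 could be sketched as follows, assuming the 2D frame's top-left corner lies at (x, y) in the full image and the segmentation model returns contact points (x', y') in the coordinates of the cropped frame; the function and variable names are hypothetical and not part of the original description.

```python
import numpy as np

def crop_to_image_coords(box_xy, contact_points_crop):
    """Convert tire-ground contact points from crop coordinates to full-image coordinates.

    box_xy: (x, y) pixel position of the 2D frame's top-left corner in the image.
    contact_points_crop: array of (x', y') contact points inside the cropped frame,
        e.g. as produced by a U-Net-style segmentation of the vehicle crop.
    Returns an array of (u, v) = (x + x', y + y') tire landing points in the image.
    """
    box_xy = np.asarray(box_xy, dtype=float)
    pts = np.asarray(contact_points_crop, dtype=float)
    return pts + box_xy  # u = x + x', v = y + y'

# Example: frame at (640, 360), two contact points found inside the crop
landing_points = crop_to_image_coords((640, 360), [(35.0, 180.0), (210.0, 176.0)])
print(landing_points)  # [[675. 540.] [850. 536.]]
```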
And S2, acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point.
In one embodiment, the distance of the tires of the target vehicle from the lane line is determined by:
firstly, mapping the tire landing point of the target vehicle to a world coordinate system, and simultaneously mapping each point on the lane line to the world coordinate system; and then obtaining the distance between the tire of the target vehicle and the lane line in the world coordinate system. The distance between the tire and the lane line includes: the distance d1 from the front wheel of the target vehicle to the lane line and the distance d2 from the rear wheel to the lane line. Specifically, because the tires and the lane lines are on the ground, the tire points and the lane line points of each vehicle can be mapped into the world coordinate system by using a projection transformation formula in combination with the internal and external parameters of the camera.
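For illustration only, one possible form of this projection is sketched below. It assumes a pinhole camera with known intrinsic matrix K and known extrinsics (rotation R_wc and camera center C_w) and a flat ground plane z = 0 in the world frame; the description only states that a projection transformation with the camera's internal and external parameters is used, so this specific formulation and the helper names are assumptions.

```python
import numpy as np

def pixel_to_ground(uv, K, R_wc, C_w):
    """Back-project an image pixel onto the ground plane z = 0 of the world frame.

    uv:   (u, v) pixel coordinates of a point assumed to lie on the ground
          (a tire landing point or a sampled lane line point).
    K:    3x3 camera intrinsic matrix.
    R_wc: 3x3 rotation taking camera-frame directions to world-frame directions.
    C_w:  camera center in world coordinates (C_w[2] is the camera height).
    """
    ray_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    ray_world = R_wc @ ray_cam
    t = -C_w[2] / ray_world[2]            # scale so the ray reaches z = 0
    return (C_w + t * ray_world)[:2]      # (X, Y) on the ground plane

def distance_to_polyline(point, polyline):
    """Minimum distance from a 2D point to a lane line given as a polyline of ground points."""
    p = np.asarray(point, float)
    best = np.inf
    for a, b in zip(polyline[:-1], polyline[1:]):
        a, b = np.asarray(a, float), np.asarray(b, float)
        ab = b - a
        s = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        best = min(best, np.linalg.norm(p - (a + s * ab)))
    return best
```

The distances d1 and d2 used below could then be obtained by applying distance_to_polyline to the projected front-wheel and rear-wheel landing points.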
And S3, calculating the lane change weight of the target vehicle.
When a vehicle runs on different roads, the probability of a lane change differs. That is, even if the distance between the vehicle's tire landing point and the lane line is the same, the lane change intention can differ under different road conditions. Taking a yellow solid line and a white dotted line as an example, with the same distance between the tire landing point and the lane line, the lane change intention of a target vehicle driving next to a white dotted line is typically greater than that of one driving next to a yellow solid line. Therefore, the lane change weight of the vehicle needs to be obtained in real time according to the actual situation of the vehicle, referring to the flow illustrated in fig. 2.
Step S201, identifying the lane line type of the lane line, and acquiring a first lane change weight according to the lane line type.
Image recognition is performed on the lane line in the acquired image to obtain its lane line type. A network model may be used for this identification; the technical scheme is not limited to a specific network model. The lane line types may include, for example, those listed in the table below, and can be set according to actual requirements.
Alternatively, the current position information can be acquired in real time through the positioning equipment of the automatic driving vehicle, the distance and direction between the target vehicle and the automatic driving vehicle can be obtained with the ranging scheme of the vehicle radar, and the position of the target vehicle can be derived by combining the two. The high-precision map is then queried in real time with the position of the target vehicle to obtain the lane line type at the current position of the target vehicle.
Each lane line type corresponds to a lane change weight; the higher the lane change weight, the higher the probability that a vehicle driving next to that lane line will change lanes. The lane change weight corresponding to each lane line type is shown in the following table:
Serial number | Lane line type | First lane change weight
1 | White dotted line | 50
2 | White solid line | 40
3 | Yellow dotted line | 35
4 | Single yellow solid line | 20
5 | Double yellow solid line | 10
6 | Yellow dotted-solid line | 28
7 | Double white dotted line | 30
8 | Double white solid line | 25
The weight corresponding to each lane line type can be set according to actual requirements, and the technical scheme is not limited.
And S202, acquiring the road type of the target vehicle, and acquiring a second lane change weight according to the road type.
Image processing is performed on the image acquired by the automatic driving vehicle to judge whether the road where the target vehicle is located is a straight road or a turning road. A network model may be used for this image recognition; the technical scheme is not limited to a specific network model.
Alternatively, the current position information can be acquired in real time through the positioning equipment of the automatic driving vehicle, the distance and direction between the target vehicle and the automatic driving vehicle can be obtained with the ranging scheme of the vehicle radar, and the position of the target vehicle can be derived by combining the two. The high-precision map is then queried in real time with the position of the target vehicle to determine whether the road where the target vehicle is currently located is a straight road or a turning road.
Each road type corresponds to a lane change weight; the higher the lane change weight, the higher the probability that a vehicle driving on that road will change lanes. The lane change weight corresponding to each road type is shown in the following table:
Serial number | Road type | Second lane change weight
1 | Straight road | 50
2 | Turning road | 30
The weight corresponding to each road type can be set according to actual requirements, and the technical scheme is not limited.
And step S203, obtaining a third lane change weight according to whether the target vehicle runs in the tunnel.
The probability of lane change occurring when the vehicle travels in the tunnel is small, so it is necessary to determine whether the current target vehicle travels in the tunnel.
Image processing is performed on the image acquired by the automatic driving vehicle to judge whether the road where the target vehicle is located is a tunnel. A network model may be used for this image recognition; the technical scheme is not limited to a specific network model.
Alternatively, the current position information can be acquired in real time through the positioning equipment of the automatic driving vehicle, the distance and direction between the target vehicle and the automatic driving vehicle can be obtained with the ranging scheme of the vehicle radar, and the position of the target vehicle can be derived by combining the two. The high-precision map is then queried in real time with the position of the target vehicle to determine whether the road where the target vehicle is currently located is a tunnel.
Whether or not the road being driven is a tunnel corresponds to different lane change weights; the higher the lane change weight, the higher the probability that the vehicle will change lanes. The lane change weights corresponding to tunnel and non-tunnel roads are shown in the following table:
Serial number | In tunnel | Third lane change weight
1 | No | 50
2 | Yes | 20
The lane change weights corresponding to tunnel and non-tunnel roads can be set according to actual requirements; the technical scheme is not limited thereto.
Step S204, obtaining a fourth lane change weight according to whether a running vehicle exists within the specified range of the target vehicle.
When other vehicles are running within a certain range around the target vehicle, the lane change weight of the target vehicle is reduced. Whether a vehicle is running within a certain range of the target vehicle, for example within 30 meters to its left or right, is judged by performing image processing on the image acquired by the automatic driving vehicle. The judgment may be made by a network model, such as a machine learning network model; the technical scheme is not limited to a specific network model.
Whether or not a vehicle is running within a certain range of the target vehicle corresponds to different lane change weights; the higher the lane change weight, the higher the probability of a lane change. The lane change weights corresponding to whether a running vehicle exists within a certain range of the target vehicle are shown in the following table:
Serial number | Vehicle running within a certain range of the target vehicle | Fourth lane change weight
1 | No | 50
2 | Yes | 30
These lane change weights can be set according to actual requirements; the technical scheme is not limited thereto.
And S205, accumulating the first lane changing weight and/or the second lane changing weight and/or the third lane changing weight and/or the fourth lane changing weight to obtain the lane changing weight of the target vehicle.
The lane change weights for the environment in which the target vehicle is located are accumulated (the accumulation table appears only as an image in the source and is not reproduced here). In the example, the lane change weight of the current target vehicle is found to be 140.
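For illustration only, the accumulation of step S205 could be sketched as follows, using the weight tables given above; the dictionary layout and the example scenario are illustrative and not taken from the accumulation table of the source.

```python
# Lane change weight tables taken from the tables above; the dictionary layout
# and the example scenario are illustrative, not part of the original filing.
FIRST = {"white dotted": 50, "white solid": 40, "yellow dotted": 35,
         "single yellow solid": 20, "double yellow solid": 10,
         "yellow dotted-solid": 28, "double white dotted": 30,
         "double white solid": 25}
SECOND = {"straight": 50, "turning": 30}
THIRD = {"not in tunnel": 50, "in tunnel": 20}
FOURTH = {"no vehicle nearby": 50, "vehicle nearby": 30}

def lane_change_weight(lane_line=None, road=None, tunnel=None, nearby=None):
    """Accumulate whichever of the four weights are available (the 'and/or' in the text)."""
    parts = [FIRST.get(lane_line, 0), SECOND.get(road, 0),
             THIRD.get(tunnel, 0), FOURTH.get(nearby, 0)]
    return sum(parts)

# Illustrative scenario: white dotted line, straight road, not in a tunnel, another vehicle nearby
print(lane_change_weight("white dotted", "straight", "not in tunnel", "vehicle nearby"))  # 180
```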
And S4, determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
And acquiring the lane change probability of the target vehicle according to the lane change weight and the distance between the tire landing point and the lane line, and referring to the flow shown in fig. 4.
Step S401, inputting the distance between the tire landing point and the lane line into a network model to obtain a first lane change probability of the target vehicle.
After the distance between the tire landing point of the target vehicle and the lane line is obtained, the lane change probability of the target vehicle is judged according to the determined distance. This may be done by predicting whether the target vehicle will change lanes with a time-series model, using the distance d1 from the front wheel to the lane line and the distance d2 from the rear wheel to the lane line on the same side of the target vehicle. For example, when d1 and d2 are both small, the target vehicle is close to the lane line; if d1 is close to d2, the target vehicle is roughly parallel to the lane line and the lane change probability may be low; if d1 << d2, the vehicle body is inclined toward the lane line and a lane change is more likely; over time, d1 and d2 both decreasing indicates that the vehicle is approaching the lane line and tends to change lanes. Therefore, the lane change probability can be obtained from the situation that d1 << d2 and that d1 and d2 decrease over time. This determination may be implemented with a Long Short-Term Memory (LSTM) network model: the distances between the tires of the target vehicle and the lane line are input to the LSTM model, and the lane change probability of the target vehicle is obtained from its output. In this way, a first lane change probability of, for example, 80% is obtained.
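For illustration only, a minimal sketch of such an LSTM-based predictor is given below. The description names an LSTM as one possible time-series model; the network size, the feature layout (one (d1, d2) pair per frame) and the example input are assumptions, and the model would have to be trained on labelled sequences before its output is meaningful.

```python
import torch
import torch.nn as nn

class LaneChangeLSTM(nn.Module):
    """Toy LSTM classifier: a sequence of (d1, d2) distances -> lane change probability."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, seq):             # seq: (batch, time, 2) with columns (d1, d2)
        _, (h_n, _) = self.lstm(seq)    # h_n: (1, batch, hidden)
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)

# Example: 10 frames of front/rear wheel distances for one target vehicle (untrained model,
# so the printed value is meaningless until the network is trained on labelled sequences)
model = LaneChangeLSTM()
distances = torch.tensor([[[1.2 - 0.1 * t, 1.5 - 0.05 * t] for t in range(10)]])
print(model(distances))
```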
And S402, acquiring a second lane change probability of the target vehicle according to the lane change weight.
The lane change weight of the target vehicle is obtained according to its running environment; in the example above it is 140. This lane change weight is then divided by the sum of the maximum values of the first, second, third and fourth lane change weights to obtain the second lane change probability, i.e. 140 / (50 + 50 + 50 + 50) = 70%.
And S403, multiplying the first lane change probability and the second lane change probability to obtain the lane change probability of the target vehicle.
The first lane change probability obtained from the distance between the tires of the target vehicle and the lane line is multiplied by the second lane change probability obtained from the lane change weight to obtain the lane change probability of the target vehicle. If the first lane change probability is 80% and the second lane change probability is 70%, the lane change probability = 80% × 70% = 56%.
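For illustration only, the combination of the two probabilities could be sketched as follows, using the worked numbers above (a distance-based probability of 0.8 and a lane change weight of 140 against a maximum of 50 + 50 + 50 + 50 = 200).

```python
MAX_WEIGHT = 50 + 50 + 50 + 50           # sum of the four maximum weights

def lane_change_probability(first_prob, weight):
    """Combine the distance-based probability with the environment-based one."""
    second_prob = weight / MAX_WEIGHT    # e.g. 140 / 200 = 0.7
    return first_prob * second_prob

print(lane_change_probability(0.8, 140))  # ~0.56, i.e. 56%
```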
After obtaining the lane change probability, the lane change probability may be used when performing an automatic driving operation, for example, if it is recognized that the target vehicle has an intention to change to the own lane, the own vehicle is controlled to decelerate so as to keep within a reasonable vehicle distance from the target vehicle.
According to the embodiment of the invention, the tire landing point of the target vehicle is determined based on the segmentation of the two-dimensional structure of the target vehicle extracted from the image, so that the position information of the landing point is directly obtained without processing the wheel frame; and meanwhile, obtaining a lane change weight according to the environment where the target vehicle is located, and obtaining a lane change intention of the target vehicle according to the lane change weight and the distance between the tire landing point and the lane line. The lane-changing intention of the target vehicle obtained by the scheme has higher accuracy.
In addition, the embodiment of the invention also provides a vehicle lane change detection device, which corresponds to the embodiment shown in fig. 1. As shown in fig. 5, the vehicle lane change detecting device includes:
the landing point identification module 10 is configured to acquire a two-dimensional structure of a target vehicle in an image, and segment the two-dimensional structure to determine a tire landing point of the target vehicle;
the distance calculation module 20 is configured to obtain a distance between the tire landing point and the lane line according to the lane line and the tire landing point;
a weight calculation module 30 for calculating lane change weights of the target vehicle;
and the lane change probability calculation module 40 is configured to determine the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
According to the embodiment of the invention, the tire landing point of the target vehicle is determined based on the segmentation of the two-dimensional structure of the target vehicle extracted from the image, so that the position information of the landing point is directly obtained without processing the wheel frame; and meanwhile, obtaining a lane change weight according to the environment of the target vehicle, and obtaining the lane change intention of the target vehicle according to the lane change weight and the distance between the tire landing point and the lane line. The lane-changing intention of the target vehicle obtained by the scheme has higher accuracy.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a vehicle in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 6, the vehicle may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to implement connection communication among these components. The user interface 1003 may include a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include standard wired interfaces, wireless interfaces (e.g., WI-FI, 4G, 5G interfaces). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 6 does not constitute a limitation of the vehicle and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 6, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a vehicle lane change detection program.
In the vehicle shown in fig. 6, the network interface 1004 is mainly used for data communication with an external network; the user interface 1003 is mainly used for receiving input instructions of a user; the processor 1001 of the vehicle calls the vehicle lane change detection program stored in the memory 1005, and performs the following operations:
acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
calculating lane change weight of the target vehicle;
and determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
Optionally, the calculating the lane change weight of the target vehicle includes:
identifying the lane line type of the lane line, and acquiring a first lane change weight according to the lane line type; and/or,
acquiring the type of the road where the target vehicle is located, and acquiring a second lane change weight according to the type of the road; and/or,
obtaining a third lane change weight according to whether the target vehicle runs in the tunnel or not; and/or,
obtaining a fourth lane change weight according to whether a running vehicle exists in the specified range of the target vehicle;
and accumulating the first lane change weight and/or the second lane change weight and/or the third lane change weight and/or the fourth lane change weight to obtain the lane change weight of the target vehicle.
Optionally, the extracting a two-dimensional structure of a target vehicle from the acquired image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle includes:
obtaining a two-dimensional structure of the target vehicle from the image according to the three-dimensional structure of the target vehicle detected by the radar;
inputting the two-dimensional structure into a segmentation model to obtain coordinates of a contact point between a tire of the target vehicle and the ground; and
and converting the coordinates of the contact point between the tire of the target vehicle and the ground into the coordinates of the tire of the target vehicle in an image, wherein the coordinates are the tire landing point of the target vehicle.
Optionally, obtaining a distance between the tire landing point and the lane line according to the identified lane line and the tire landing point includes:
mapping the lane line and the tire landing point to a world coordinate system;
and obtaining the distance between the tire landing point and the lane line from a world coordinate system.
Optionally, the tire landing point includes a front wheel landing point and a rear wheel landing point, and a distance between the tire landing point and the lane line includes: a distance between a front wheel of the target vehicle and the lane line, and a distance between a rear wheel of the target vehicle and the lane line.
Optionally, the determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight includes the following steps:
inputting the distance between the tire landing point and the lane line into a network model to obtain a first lane change probability of the target vehicle;
acquiring a second lane change probability of the target vehicle according to the lane change weight;
and multiplying the first lane change probability and the second lane change probability to obtain the lane change probability of the target vehicle.
According to the embodiment of the invention, the tire landing point of the target vehicle is determined based on the segmentation of the two-dimensional structure of the target vehicle extracted from the image, so that the position information of the landing point is directly obtained without processing the wheel frame; and meanwhile, obtaining a lane change weight according to the environment of the target vehicle, and obtaining the lane change intention of the target vehicle according to the lane change weight and the distance between the tire landing point and the lane line. The lane-changing intention of the target vehicle obtained by the scheme has higher accuracy.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, where a vehicle lane change detection program is stored on the computer-readable storage medium, and when executed by a processor, the computer-readable storage medium implements the following operations:
acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
calculating lane change weight of the target vehicle;
and determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
Optionally, the calculating the lane change weight of the target vehicle includes:
identifying the lane line type of the lane line, and acquiring a first lane change weight according to the lane line type; and/or,
acquiring the type of the road where the target vehicle is located, and acquiring a second lane change weight according to the type of the road; and/or,
obtaining a third lane change weight according to whether the target vehicle runs in the tunnel or not; and/or,
obtaining a fourth lane change weight according to whether a running vehicle exists in the specified range of the target vehicle;
and accumulating the first lane change weight and/or the second lane change weight and/or the third lane change weight and/or the fourth lane change weight to obtain the lane change weight of the target vehicle.
Optionally, the extracting a two-dimensional structure of a target vehicle from the acquired image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle includes:
according to the three-dimensional structure of the target vehicle detected by the radar, obtaining the two-dimensional structure of the target vehicle from the image;
inputting the two-dimensional structure into a segmentation model to obtain coordinates of a contact point between a tire of the target vehicle and the ground; and
and converting the coordinates of the contact point between the tire of the target vehicle and the ground into the coordinates of the tire of the target vehicle in an image, wherein the coordinates are the tire landing point of the target vehicle.
Optionally, obtaining a distance between the tire landing point and the lane line according to the identified lane line and the tire landing point includes:
mapping the lane lines and the tire landing points to a world coordinate system;
and obtaining the distance between the tire landing point and the lane line from a world coordinate system.
Optionally, the tire landing point includes a front wheel landing point and a rear wheel landing point, and a distance between the tire landing point and the lane line includes: a distance between a front wheel of the target vehicle and the lane line, and a distance between a rear wheel of the target vehicle and the lane line.
Optionally, the determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight includes the following steps:
inputting the distance between the tire landing point and the lane line into a network model to obtain a first lane change probability of the target vehicle;
acquiring a second lane change probability of the target vehicle according to the lane change weight;
and multiplying the first lane change probability and the second lane change probability to obtain the lane change probability of the target vehicle.
According to the embodiment of the invention, the tire landing point of the target vehicle is determined based on the segmentation of the two-dimensional structure of the target vehicle extracted from the image, so that the position information of the landing point is directly obtained without processing the wheel frame; and meanwhile, obtaining a lane change weight according to the environment of the target vehicle, and obtaining the lane change intention of the target vehicle according to the lane change weight and the distance between the tire landing point and the lane line. The lane-changing intention of the target vehicle obtained by the scheme has higher accuracy.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system comprising that element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solution of the present invention or the portions contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) as described above and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, a controller, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A vehicle lane change detection method is characterized by comprising the following steps:
acquiring a two-dimensional structure of a target vehicle in an image, and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
calculating lane change weight of the target vehicle;
and determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
2. The vehicle lane-change detection method according to claim 1, wherein the calculating of the lane-change weight of the target vehicle includes the steps of:
identifying the lane line type of the lane line, and acquiring a first lane change weight according to the lane line type; and/or,
acquiring the type of the road where the target vehicle is located, and acquiring a second lane change weight according to the type of the road; and/or,
obtaining a third lane change weight according to whether the target vehicle runs in the tunnel or not; and/or,
obtaining a fourth lane change weight according to whether a running vehicle exists in the specified range of the target vehicle;
and accumulating the first lane change weight and/or the second lane change weight and/or the third lane change weight and/or the fourth lane change weight to obtain the lane change weight of the target vehicle.
3. The vehicle lane-change detection method of claim 1, wherein the extracting a two-dimensional structure of a target vehicle from the captured image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle comprises:
according to the three-dimensional structure of the target vehicle detected by the radar, obtaining the two-dimensional structure of the target vehicle from the image;
inputting the two-dimensional structure into a segmentation model to obtain coordinates of a contact point between a tire of the target vehicle and the ground; and
and converting the coordinates of the contact point between the tire of the target vehicle and the ground into the coordinates of the tire of the target vehicle in the image, wherein the coordinates are the tire landing point of the target vehicle.
4. The method of claim 1, wherein obtaining the distance between the tire landing point and the lane line according to the identified lane line and the tire landing point comprises:
mapping the lane lines and the tire landing points to a world coordinate system;
and obtaining the distance between the tire landing point and the lane line from a world coordinate system.
5. The vehicle lane change detection method of claim 4, wherein the tire landing spots comprise front wheel landing spots and rear wheel landing spots, and the distance between the tire landing spots and the lane line comprises: a distance between a front wheel of the target vehicle and the lane line, and a distance between a rear wheel of the target vehicle and the lane line.
6. The vehicle lane-change detection method according to claim 1, wherein the determining of the lane-change probability of the target vehicle based on the distance between the tire landing point and the lane line and the lane-change weight includes:
inputting the distance between the tire landing point and the lane line into a network model to obtain a first lane change probability of the target vehicle;
acquiring a second lane change probability of the target vehicle according to the lane change weight;
and multiplying the first lane change probability and the second lane change probability to obtain the lane change probability of the target vehicle.
7. A vehicle lane change detection device, characterized by comprising:
the landing point identification module is used for acquiring a two-dimensional structure of a target vehicle in an image and segmenting the two-dimensional structure to determine a tire landing point of the target vehicle;
the distance calculation module is used for acquiring the distance between the tire landing point and the lane line according to the lane line and the tire landing point;
the weight calculation module is used for calculating lane changing weight of the target vehicle;
and the lane change probability calculation module is used for determining the lane change probability of the target vehicle according to the distance between the tire landing point and the lane line and the lane change weight.
8. A vehicle, characterized in that the vehicle comprises: a memory, a processor and a vehicle lane change detection program stored on the memory and executable on the processor, the vehicle lane change detection program configured to implement the steps of the vehicle lane change detection method according to any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program, characterized in that the computer program is executable by a processor to implement the steps of the vehicle lane-change detection method according to any one of claims 1 to 6.
CN202210741525.5A 2022-06-28 2022-06-28 Vehicle lane change detection method and device, vehicle and storage medium Pending CN115147791A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210741525.5A CN115147791A (en) 2022-06-28 2022-06-28 Vehicle lane change detection method and device, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210741525.5A CN115147791A (en) 2022-06-28 2022-06-28 Vehicle lane change detection method and device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN115147791A true CN115147791A (en) 2022-10-04

Family

ID=83410514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210741525.5A Pending CN115147791A (en) 2022-06-28 2022-06-28 Vehicle lane change detection method and device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN115147791A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110077399A (en) * 2019-04-09 2019-08-02 魔视智能科技(上海)有限公司 A kind of vehicle collision avoidance method merged based on roadmarking, wheel detection
CN110843789A (en) * 2019-11-19 2020-02-28 苏州智加科技有限公司 Vehicle lane change intention prediction method based on time sequence convolution network
US20200346644A1 (en) * 2019-04-30 2020-11-05 Ford Global Technologies, Llc Lane change intention estimation of a vehicle
CN111942389A (en) * 2019-05-17 2020-11-17 罗伯特·博世有限公司 Driving assistance system, lane change determination unit and lane change determination method
CN112164238A (en) * 2020-09-17 2021-01-01 北京百度网讯科技有限公司 Navigation lane change guiding method, device, equipment and storage medium
CN113044042A (en) * 2021-06-01 2021-06-29 禾多科技(北京)有限公司 Vehicle predicted lane change image display method and device, electronic equipment and readable medium
CN113066298A (en) * 2021-03-17 2021-07-02 北京航迹科技有限公司 Vehicle travel control method, device, vehicle, server, and storage medium
CN113895462A (en) * 2021-11-19 2022-01-07 天津天瞳威势电子科技有限公司 Method, device, computing equipment and storage medium for predicting lane change of vehicle
CN114120266A (en) * 2021-10-29 2022-03-01 际络科技(上海)有限公司 Vehicle lane change detection method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110687539B (en) Parking space detection method, device, medium and equipment
US9170115B2 (en) Method and system for generating road map using data of position sensor of vehicle
CN110443225B (en) Virtual and real lane line identification method and device based on feature pixel statistics
WO2018105179A1 (en) Vehicle-mounted image processing device
CN113370977B (en) Intelligent vehicle forward collision early warning method and system based on vision
CN110867132B (en) Environment sensing method, device, electronic equipment and computer readable storage medium
US10867403B2 (en) Vehicle external recognition apparatus
CN110341621B (en) Obstacle detection method and device
CN110610137B (en) Method and device for detecting vehicle running state, electronic equipment and storage medium
CN110659548B (en) Vehicle and target detection method and device thereof
JP2018048949A (en) Object recognition device
CN108319931B (en) Image processing method and device and terminal
JP2018092596A (en) Information processing device, imaging device, apparatus control system, mobile body, information processing method, and program
CN111857135A (en) Obstacle avoidance method and apparatus for vehicle, electronic device, and computer storage medium
CN114639085A (en) Traffic signal lamp identification method and device, computer equipment and storage medium
CN114779276A (en) Obstacle detection method and device
JP2016045767A (en) Motion amount estimation device and program
CN113536867B (en) Object identification method, device and system
CN114419573A (en) Dynamic occupancy grid estimation method and device
CN115147791A (en) Vehicle lane change detection method and device, vehicle and storage medium
CN116086429A (en) Map updating method, device, equipment and computer readable storage medium
EP3825648A1 (en) Object detection device
CN115995163B (en) Vehicle collision early warning method and system
US20230186638A1 (en) Device for determining a topography of a vehicle environment, vehicle and method
US11670095B2 (en) Method for determining support points for estimating a progression of roadside development of a road

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination