CN110531661B - Automatic following control method, device and equipment for vehicle - Google Patents

Automatic following control method, device and equipment for vehicle

Info

Publication number
CN110531661B
Authority
CN
China
Prior art keywords
vehicle
information
target vehicle
following target
following
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910779383.XA
Other languages
Chinese (zh)
Other versions
CN110531661A (en)
Inventor
王挺
Current Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Geely Automobile Research Institute Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Geely Automobile Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Zhejiang Geely Automobile Research Institute Co Ltd filed Critical Zhejiang Geely Holding Group Co Ltd
Priority to CN201910779383.XA
Publication of CN110531661A
Application granted
Publication of CN110531661B
Legal status: Active


Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 — Programme-control systems
    • G05B19/02 — Programme-control systems electric
    • G05B19/04 — Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 — Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 — Input/output
    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 — Program-control systems
    • G05B2219/20 — Pc systems
    • G05B2219/25 — Pc structure of the system
    • G05B2219/25257 — Microcontroller

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an automatic following control method, device and equipment for a vehicle, wherein the method comprises the following steps: acquiring a following target vehicle according to first image information in front of the host vehicle; acquiring second image information in front of the host vehicle in real time and judging whether the following target vehicle exists in the second image information; if the following target vehicle does not exist in the second image information, acquiring road condition information around the host vehicle in real time and controlling the host vehicle to change lanes according to the road condition information; acquiring second image information in front during the lane change and judging whether the following target vehicle exists in it; if the following target vehicle exists in the second image information, generating following control information of the host vehicle according to the second image information, the road condition information and the running state information of the following target vehicle; and controlling the host vehicle to follow the following target vehicle according to the following control information. By adopting the invention, stable following on urban expressways and highways can be realized.

Description

Automatic following control method, device and equipment for vehicle
Technical Field
The invention relates to the technology of automatic driving of vehicles, in particular to a method, a device and equipment for controlling automatic following of a vehicle.
Background
An Advanced Driver Assistance System (ADAS) continuously senses the environment around a vehicle during driving by using various on-board sensors, collects data, identifies, detects and tracks static and dynamic objects, and performs systematic calculation and analysis in combination with navigation map data, thereby enabling the driver to perceive possible dangers in advance and effectively increasing driving comfort and safety. With the maturing of ADAS functions, current mass-produced vehicles already offer driver-assistance functions such as lane keeping, intelligent piloting, highway assist and traffic jam assist, relieving the physical and mental fatigue caused by long periods of driving and improving driving safety.
According to the SAE (Society of Automotive Engineers) classification of driving automation, the systems fitted to most new vehicles on the market reach level L2 (partial automation): based on the driving environment, the system provides combined support for steering and for acceleration and deceleration, while the human driver performs all remaining driving tasks. In an L2-level system the human driver must monitor the driving environment at all times and be ready to take over. In an L3-level (conditional automation) system, by contrast, the system performs all dynamic driving tasks; the human driver need not continuously monitor the driving environment but must respond appropriately when the system requests a takeover.
With rising living standards, road trips on weekends and holidays are an increasingly popular leisure choice for working people. However, a vehicle using an L2-level following function often loses the following target vehicle, or misses an on-ramp and the like, because of momentary driver inattention during travel, which reduces travel efficiency and wastes valuable leisure time.
Meanwhile, the following function of existing mass-produced vehicles can only work at low speed: the front millimeter-wave radar tracks the position of the preceding vehicle to perform small-angle following and lane-change operations at low speed without lane lines, and stable following on an urban expressway or a highway cannot be realized.
Disclosure of Invention
The invention aims to solve the technical problem that the following target vehicle is easily lost, so as to realize a stable vehicle-following function on urban expressways and highways.
In order to solve the technical problems, the invention discloses a vehicle automatic following control method, device and equipment, wherein an auxiliary following function based on feature recognition matching is designed to help a driver to stably drive along with a specific target vehicle on an urban expressway and an expressway.
In a first aspect, the present invention provides a vehicle automatic following control method including:
acquiring a following target vehicle according to first image information in front of the vehicle;
acquiring second image information in front of the self vehicle in real time and judging whether the following target vehicle exists in the second image information or not;
if the following target vehicle does not exist in the second image information, acquiring road condition information around the vehicle in real time and controlling the vehicle to change lanes according to the road condition information;
acquiring second image information in front in the self-vehicle lane changing process and judging whether the following target vehicle exists in the second image information;
if the following target vehicle exists in the second image information, generating following control information of the vehicle according to the second image information, the road condition information and the running state information of the following target vehicle;
and controlling the self vehicle to run along with the following target vehicle according to the following control information.
Further, the acquiring of the following target vehicle from the first image information in front of the own vehicle includes:
trapezoidal filtering is carried out on the first image information through a target detection algorithm, and first vehicle boundary frame area information in front of the self vehicle is obtained;
acquiring the pixel area of the first vehicle bounding box area according to the first vehicle bounding box area information;
acquiring the proportion of the first vehicle boundary frame area in a preset trapezoidal area;
selecting a front vehicle with the pixel area and the occupation ratio meeting preset conditions as the following target vehicle;
carrying out gray level processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing the image characteristics of the following target vehicle;
and carrying out image processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing license plate characters of the following target vehicle.
Further, the determining whether the following target vehicle is present in the second image information includes:
acquiring image characteristics of the following target vehicle;
and performing feature matching on the second image information according to the image features of the following target vehicle, acquiring the position of a matching point in the second image information according to the image features of the following target vehicle, and judging whether the following target vehicle exists according to the position of the matching point.
Further, the determining whether the following target vehicle exists in the second image information further includes:
acquiring license plate characters of the following target vehicle;
trapezoidal filtering is carried out on the second image information through a target detection algorithm, and second vehicle boundary frame area information in front of the self vehicle is obtained;
performing image processing on the second vehicle boundary frame region information to acquire and store license plate characters in front of the vehicle;
and comparing the license plate characters in front of the vehicle with the license plate characters of the following target vehicle, and judging whether the following target vehicle exists in front of the vehicle.
Further, the generating of the following control information of the host vehicle according to the second image information, the road condition information, and the driving state information of the following target vehicle includes:
acquiring the second image information and the road condition information to obtain the transverse following control information of the self-vehicle;
and obtaining the longitudinal following control information of the self vehicle by obtaining the running state information of the following target vehicle.
Further, the vehicle automatic following control method further includes:
if the following target vehicle exists in the second image information, obtaining second vehicle boundary frame area information of the following target vehicle according to the second image information;
and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the second vehicle boundary frame area information of the following target vehicle.
Further, the determining whether the following target vehicle is located within the own vehicle lane according to the second vehicle boundary frame area information of the following target vehicle includes:
acquiring second vehicle boundary frame area information of the following target vehicle;
constructing a lane line as a boundary line of the self-vehicle lane according to the second vehicle boundary frame region information of the following target vehicle;
and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the occupation ratio of the lower line of the second vehicle boundary frame area of the following target vehicle to the boundary line.
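One possible reading of the in-lane judgment above can be sketched in Python: the target is treated as in-lane when a sufficient fraction of its bounding-box lower edge lies between the constructed lane boundary lines. The 50% threshold and the use of a simple horizontal overlap are assumptions for illustration; the patent only specifies that an occupation ratio is used.

```python
def in_own_lane(box, lane_left_x, lane_right_x, min_overlap=0.5):
    """Judge whether a following target is inside the host lane from the
    ratio of its bounding-box lower edge lying between the lane lines.
    box: (x_min, y_min, x_max, y_max) in image pixels.
    min_overlap: assumed threshold, not specified by the patent."""
    x_min, _, x_max, _ = box
    width = x_max - x_min
    # Horizontal overlap of the lower edge with the lane-line interval.
    overlap = max(0, min(x_max, lane_right_x) - max(x_min, lane_left_x))
    return width > 0 and overlap / width >= min_overlap
```

For example, a box spanning x = 400–600 between lane lines at x = 350 and x = 650 is fully in-lane, while one spanning x = 100–300 is not.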
Furthermore, the sensor of the self vehicle comprises a camera and a vehicle-mounted radar, the camera acquires image information of the vehicle in front of the self vehicle in real time, and the vehicle-mounted radar acquires driving state information of the following target vehicle and road condition information around the self vehicle in real time.
In a second aspect, the present invention provides a vehicle automatic following control device including:
a following target vehicle acquisition unit configured to acquire a following target vehicle based on first image information in front of a host vehicle;
the first judging unit is used for acquiring second image information in front of the self vehicle in real time and judging whether the following target vehicle exists in the second image information or not;
the lane changing control unit is used for acquiring road condition information around the self-vehicle in real time and controlling the self-vehicle to change lanes according to the road condition information if the following target vehicle does not exist in the second image information;
the second judgment unit is used for acquiring second image information in front in the self-vehicle lane changing process and judging whether the following target vehicle exists in the second image information or not;
a following control information generating unit, configured to generate, if the following target vehicle exists in the second image information, following control information of the vehicle according to the second image information, the road condition information, and the driving state information of the following target vehicle;
and the following control unit is used for controlling the self vehicle to follow the following target vehicle to run according to the following control information.
In a third aspect, the present invention provides a vehicle automatic following control device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes or set of instructions, the at least one instruction, the at least one program, the set of codes or set of instructions being loaded and executed by the processor to implement a vehicle automatic following control method according to any one of claims 1 to 8.
By adopting the technical scheme, the method, the device and the equipment for controlling the automatic following of the vehicle have the following beneficial effects:
Without adding to the existing on-board sensors, continuous tracking control of a following target vehicle can be performed through camera-based image feature recognition and matching of the vehicle ahead, and L3-level automated driving functions such as stable following and following lane change can be realized in low-traffic conditions on urban expressways, highways and the like.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a vehicle automatic following control method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of step S100 in an automatic vehicle following control method according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of step S200 in an automatic vehicle following control method according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of step S500 in an automatic vehicle following control method according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of step S510 in an automatic vehicle following control method according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of a method for controlling automatic following of a vehicle according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an automatic vehicle following control device according to an embodiment of the present invention;
the following is a supplementary description of the drawings:
1-vehicle automatic following control device; 101-a following target vehicle acquisition unit; 102-a first judgment unit; 103-lane change control unit; 104-a second judging unit; 105-a following control information generating unit; 106-the following control unit.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic may be included in at least one implementation of the invention. In the description of the present invention, it is to be understood that the terms "upper", "lower", "top", "bottom", and the like indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, do not indicate or imply that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. Moreover, the terms "first", "second", and the like are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein.
Embodiment:
Fig. 1 is a schematic flow chart of a method for controlling automatic vehicle following according to an embodiment of the present invention. The present specification provides the method operation steps as described in the embodiment or the flowchart, but more or fewer operation steps may be included based on conventional or non-inventive effort. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. In practice, the system or server product may execute the steps sequentially or in parallel (for example, in a parallel-processor or multi-threaded environment) according to the embodiments or the methods shown in the figures. Specifically, as shown in fig. 1, the method may include:
s100: and acquiring a following target vehicle according to first image information in front of the vehicle.
In some possible embodiments, the vehicle may acquire video image information in front of the vehicle as the first image information through a camera.
It is understood that the first image information is the current video image information acquired after the driver activates the following function of the host vehicle by key press, and that a following target vehicle needs to be selected before the host vehicle follows autonomously.
In some possible embodiments, as shown in fig. 2, the step S100 may include the following steps:
S101: Perform trapezoidal filtering on the first image information through a target detection algorithm to obtain first vehicle boundary frame area information in front of the self vehicle.
In some possible embodiments, the vehicle identification and classification may be performed on the first image information acquired by the camera through a deep learning YOLOv3 framework, all vehicles appearing in the image field of view are detected, all detected vehicles in front of the vehicle are screened through a preset trapezoidal area in the image, vehicles appearing in the trapezoidal area are retained, all vehicles outside the trapezoidal area are filtered, and the vehicle boundary frame area information of the vehicles appearing in the trapezoidal area is acquired as the first vehicle boundary frame area information.
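The trapezoidal filtering above can be sketched as a point-in-polygon screen over the boxes returned by the detector (such as YOLOv3). This is an illustrative sketch, not the patent's implementation: the trapezoid coordinates and the use of the box center as the test point are assumptions.

```python
def point_in_trapezoid(x, y, trap):
    """Ray-casting point-in-polygon test; trap is a list of (x, y) vertices."""
    inside = False
    n = len(trap)
    for i in range(n):
        x1, y1 = trap[i]
        x2, y2 = trap[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def filter_boxes(boxes, trap):
    """Keep only detections whose bounding-box center lies inside the
    preset trapezoidal region in front of the host vehicle."""
    kept = []
    for (x_min, y_min, x_max, y_max) in boxes:
        cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
        if point_in_trapezoid(cx, cy, trap):
            kept.append((x_min, y_min, x_max, y_max))
    return kept
```

With a trapezoid covering the host lane in a 1280x720 frame, a detection centered in the lane survives the filter while one at the image edge is discarded.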
S102: and acquiring the pixel area of the first vehicle bounding box area according to the first vehicle bounding box area information.
S103: and acquiring the occupation ratio of the first vehicle boundary frame area in the trapezoidal area according to the preset trapezoidal area.
S104: and selecting a front vehicle with the pixel area and the occupation ratio meeting preset conditions as the following target vehicle.
It is understood that the larger the pixel area and the occupation ratio of a vehicle bounding box area, the higher the confidence in that vehicle, and the more favorable it is for stable following by the host vehicle.
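Steps S102–S104 can be sketched as a scoring pass over the filtered boxes. The concrete thresholds below are illustrative assumptions; the patent only requires the pixel area and occupation ratio to meet "preset conditions".

```python
def select_following_target(boxes, trap_area, min_area=5000, min_ratio=0.02):
    """S102-S104 sketch: among candidate boxes, compute pixel area and
    its proportion of the preset trapezoidal area, then pick the largest
    candidate that clears both (assumed) thresholds."""
    best, best_area = None, 0
    for box in boxes:
        x_min, y_min, x_max, y_max = box
        area = (x_max - x_min) * (y_max - y_min)   # S102: pixel area
        ratio = area / trap_area                   # S103: occupation ratio
        if area >= min_area and ratio >= min_ratio and area > best_area:
            best, best_area = box, area            # S104: best candidate
    return best
```

A nearby vehicle with a large, well-placed box wins the selection; distant or marginal detections are rejected and no target is chosen.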
S105: and carrying out gray level processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing the image characteristics of the following target vehicle.
It is understood that the first vehicle bounding box area information of the following target vehicle is obtained from the first image information.
It is understood that the image feature of the following target vehicle is acquired and saved in order to determine the following target vehicle in real time during the following of the own vehicle, and in order to attempt to re-search for the following target vehicle in the case where the following target vehicle is lost.
In some possible embodiments, the first vehicle bounding box area information of the following target vehicle may be converted to grayscale, and the strongest N feature points in the first vehicle bounding box area of the following target vehicle may be selected and stored through the Speeded-Up Robust Features (SURF) algorithm.
In some possible embodiments, N may be selected to be 100.
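The "strongest N feature points" selection reduces to sorting keypoints by detector response. The sketch below is detector-agnostic: any keypoint source reporting a response value (SURF in the embodiment) can feed it; the data layout is an assumption for illustration.

```python
def strongest_features(keypoints, n=100):
    """S105 sketch: keep the n keypoints with the strongest detector
    response (the embodiment stores the 100 strongest SURF points).
    keypoints: list of ((x, y), response) tuples."""
    return sorted(keypoints, key=lambda kp: kp[1], reverse=True)[:n]
```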
S106: and carrying out image processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing license plate characters of the following target vehicle.
It is understood that the first vehicle bounding box area information of the following target vehicle is obtained from the first image information.
It is understood that the license plate characters of the following target vehicle are acquired and saved in order to identify the following target vehicle in real time during following, and to attempt to re-find it if it is lost. During following, several vehicles with similar image features may appear ahead, confusing the feature matching so that the following target vehicle cannot be determined; the license plate characters resolve this ambiguity.
In some possible embodiments, the first vehicle bounding box area of the following target vehicle may be filtered by color block (for example, a blue block) to determine the pixel position of the license plate in the image, the license plate region may be cropped from the original image, each character may be separated through image binarization, mean filtering, dilation or erosion processing and the like, and the characters may be extracted and stored as the license plate characters of the following target vehicle.
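The character-separation step can be illustrated with a column-projection pass over a binarized plate image: contiguous runs of non-empty columns become character slices. This is a simplified stand-in for the binarization / mean-filtering / dilation / erosion pipeline the embodiment describes.

```python
def segment_characters(binary_rows):
    """Column-projection segmentation of a binarized plate image
    (0 = background, 1 = character pixel). Returns a list of
    (start_col, end_col) half-open intervals, one per character."""
    width = len(binary_rows[0])
    col_sum = [sum(row[c] for row in binary_rows) for c in range(width)]
    segments, start = [], None
    for c, s in enumerate(col_sum):
        if s > 0 and start is None:
            start = c                      # character run begins
        elif s == 0 and start is not None:
            segments.append((start, c))    # character run ends
            start = None
    if start is not None:
        segments.append((start, width))    # run touches right edge
    return segments
```

Each returned interval would then be cropped and passed to character recognition.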
S200: and acquiring second image information in front of the self-vehicle in real time and judging whether the following target vehicle exists in the second image information.
In some possible embodiments, the vehicle may acquire video image information in front of the vehicle as the second image information through a camera.
It can be understood that the second image information is front video image information acquired by the camera in real time during the driving process of the vehicle after the vehicle following function is started and the following target vehicle is selected.
In some possible embodiments, as shown in fig. 3, the step S200 may include the following steps:
S201: Acquire the image characteristics of the following target vehicle.
It is understood that the image feature of the following target vehicle is the image feature of the following target vehicle that is held in step S100 based on the first vehicle bounding box area information of the following target vehicle in the first image information.
S202: and performing feature matching on the second image information according to the image features of the following target vehicle, acquiring the position of a matching point in the second image information according to the image features of the following target vehicle, and judging whether the following target vehicle exists according to the position of the matching point.
In some possible embodiments, during the following process of the self vehicle, the second image information may be subjected to feature matching through the stored image features of the following target vehicle at a certain preset frequency, and a front vehicle in an area where matching points are concentrated is selected and determined as the following target vehicle through the position of the matching point matched with the features in the second image information.
In some possible embodiments, the preset frequency may be selected as 2 Hz, and 3N feature points may be selected from the second image information for matching, where N is the number of stored image features of the following target vehicle.
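The "area where matching points are concentrated" criterion of S202 can be sketched by counting match points inside each candidate box. The minimum-match threshold is an assumption; the patent only requires the matches to concentrate in one region.

```python
def locate_target(match_points, boxes, min_matches=30):
    """S202 sketch: count feature-match points falling inside each
    candidate bounding box and pick the densest box; return None when
    no box collects enough matches (target absent in this frame).
    match_points: list of (x, y); boxes: (x_min, y_min, x_max, y_max)."""
    best_box, best_count = None, 0
    for (x_min, y_min, x_max, y_max) in boxes:
        count = sum(1 for (px, py) in match_points
                    if x_min <= px <= x_max and y_min <= py <= y_max)
        if count > best_count:
            best_box, best_count = (x_min, y_min, x_max, y_max), count
    return best_box if best_count >= min_matches else None
```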
S203: and acquiring the license plate characters of the following target vehicle.
It can be understood that when a plurality of vehicles with the same characteristics appear in the second image information, a phenomenon that the following target vehicle cannot be determined due to disordered matching of the characteristic points occurs, and at this time, secondary confirmation of the license plate characters is needed to help to judge whether the following target vehicle exists in front or not.
It is understood that the license plate characters of the following target vehicle are the license plate characters of the following target vehicle stored according to the first vehicle border frame area information of the following target vehicle in the first image information in step S100.
S204: and performing trapezoidal filtering on the second image information through a target detection algorithm to obtain the second vehicle boundary frame area information in front of the self vehicle.
In some possible embodiments, the vehicle boundary frame area information of the vehicle ahead of the host vehicle is acquired as the second vehicle boundary frame area information by the trapezoidal filtering method described in step S101.
It is understood that the second vehicle bounding box area information is obtained according to second image information acquired by the camera in real time during the driving process of the vehicle.
S205: and carrying out image processing on the second vehicle boundary frame region information, and acquiring and storing license plate characters in front of the vehicle.
In some possible embodiments, the vehicle characters of the vehicle in front of the vehicle are acquired and saved by the image processing method described in step S106.
S206: and comparing the license plate characters in front of the vehicle with the license plate characters of the following target vehicle, and judging whether the following target vehicle exists in front of the vehicle.
It can be understood that the secondary confirmation of the vehicle ahead through license plate character recognition avoids the situation in which feature matching fails because multiple vehicles share similar image features, making the following target vehicle impossible to determine.
S300: and if the following target vehicle does not exist in the second image information, acquiring road condition information around the vehicle in real time and controlling the vehicle to change lanes according to the road condition information.
In some possible embodiments, when the following target vehicle does not exist in the second image information, the host vehicle is controlled to keep driving in its lane at the set speed or at the last known speed of the following target vehicle; meanwhile, feature matching is performed on the second image information acquired in real time at a certain frequency within a certain time, and if the following target vehicle still cannot be matched, the host vehicle is controlled to change lanes according to the road condition information.
In some possible embodiments, when there is no following target vehicle in the second image information, the host vehicle is controlled to keep driving in its lane at the set speed or the previous following speed, while feature matching is performed on 8 frames of images at a frequency of 2 Hz within 4 seconds.
In some possible embodiments, the vehicle may obtain the road condition information around the vehicle by using a vehicle-mounted radar.
Further, the vehicle-mounted radar includes millimeter-wave radar, corner radar, laser radar, ultrasonic radar, and the like.
Further, the road condition information may be comprehensive information about the driving environment reflected by obstacle information, indication information, approaching vehicle information, road information, drivable area information and the like; or it may be driving environment information reflecting a single item. The corresponding information can be acquired according to the actual road conditions. For example: when the vehicle drives on an open road, the road condition information does not contain obstacle information; when the vehicle drives on a crowded road, the road condition information includes obstacle information, approaching vehicle information and drivable area information.
In some possible embodiments, when an obstacle or an approaching vehicle is detected in the adjacent lane, or no drivable area exists in the adjacent lane within a certain time, the host vehicle exits the following function and reminds the driver to take over.
In some feasible embodiments, when no obstacle or approaching vehicle is detected in the adjacent lanes and a drivable area exists within a certain time, the left lane is preferentially selected and the lane change and overtaking are completed at a speed 10 km/h above the current vehicle speed, without exceeding the speed limit.
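The overtaking-speed rule above (10 km/h above the current speed, without speeding) amounts to a clamped addition. The sketch below assumes speeds in km/h and a known posted limit; the function name and parameters are illustrative.

```python
def overtake_speed(current_kmh, limit_kmh, boost_kmh=10.0):
    """Target speed for the lane-change overtake: boost_kmh above the
    current speed, but never above the posted limit (no speeding)."""
    return min(current_kmh + boost_kmh, limit_kmh)
```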
S400: and acquiring second image information in front in the self-vehicle lane changing process and judging whether the following target vehicle exists in the second image information.
It can be understood that the second image information is front video image information acquired by the camera in real time during the driving process of the vehicle after the vehicle following function is started and the following target vehicle is selected.
In some possible embodiments, the host vehicle searches for the following target vehicle during the overtaking process using the stored features of the target; if it does not find the target within a certain time after completing the lane change, it exits the following function and reminds the driver to take over.
In some possible embodiments, if the own vehicle does not find the following target vehicle within 15 seconds of completing the lane change, the own vehicle exits the following function and reminds the driver to take over the vehicle.
In some possible embodiments, the determination method in step S400 may adopt the same determination method as in step S200.
It can be understood that, within a certain time after the following target vehicle is lost, the host vehicle changes lanes and overtakes according to the surrounding road condition information. During the lane change it acquires image information of the vehicles ahead in real time through the camera and judges whether a vehicle satisfying both feature matching and license plate character matching exists. If the following target vehicle is not detected within a certain time after the lane change is completed, the system exits the following function and reminds the driver to take over. This further reduces the degree of driver involvement required and improves the automation of vehicle driving.
S500: and if the following target vehicle exists in the second image information, generating the following control information of the vehicle according to the second image information, the road condition information and the running state information of the following target vehicle.
It can be understood that the second image information is front video image information acquired by the camera in real time during the driving process of the vehicle after the vehicle following function is started and the following target vehicle is selected.
In some possible embodiments, as shown in fig. 4, the step S500 may include the following steps:
s510: and obtaining the transverse following control information of the self-vehicle by obtaining the second image information and the road condition information.
It can be understood that the second image information and the road condition information are both acquired by the host vehicle in real time.
In some possible embodiments, if the following target vehicle exists in the second image information, second vehicle boundary frame area information of the following target vehicle is obtained according to the second image information; and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the second vehicle boundary frame area information of the following target vehicle.
It can be understood that, if the following target vehicle deviates from the host lane beyond a certain range, it is judged to be changing lanes, and the host vehicle attempts to follow the lane change according to the road condition information; therefore the second image information and the road condition information must be fused to generate the lateral following control information.
S520: and obtaining the longitudinal following control information of the self vehicle by obtaining the running state information of the following target vehicle.
It is understood that the running state information of the following target vehicle is acquired by the host vehicle in real time.
Further, the longitudinal following control information sets the running state of the host vehicle according to the real-time running state information of the following target vehicle.
In some possible embodiments, the host vehicle may acquire the driving state information of the following target vehicle by using an on-vehicle radar.
Further, the vehicle-mounted radar includes millimeter-wave radar, corner radar, lidar, ultrasonic radar, and the like.
Further, the running state information of the following target vehicle includes speed information, direction information, position information, yaw angle information, and the like.
In some possible embodiments, as shown in fig. 5, the step S510 includes:
s511: acquiring second vehicle boundary frame area information of the following target vehicle;
it is understood that the second vehicle boundary frame area information of the following-target vehicle is obtained from the second image information acquired in real time.
S512: constructing a lane line as a boundary line of the self-vehicle lane according to the second vehicle boundary frame region information of the following target vehicle;
in some possible embodiments, two lane lines in the second image information may be detected and reconstructed by a random Hough transform and used as the two boundary lines of the host lane.
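A minimal randomized Hough transform in the spirit of the lane-line detection described above might look as follows. It is a sketch over a set of edge points, not the patent's implementation; the angle/distance resolutions and vote threshold are illustrative assumptions.

```python
import math
import random

def random_hough_lines(points, n_samples=2000, theta_res=0.05, rho_res=2.0,
                       min_votes=30):
    """Randomized Hough transform: repeatedly sample two edge points,
    derive the line through them in normal form
    x*cos(theta) + y*sin(theta) = rho, and vote in a sparse accumulator.
    Returns (theta, rho, votes) triples, strongest first."""
    acc = {}
    for _ in range(n_samples):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        dx, dy = x2 - x1, y2 - y1
        theta = math.atan2(dx, -dy) % math.pi          # normal angle in [0, pi)
        rho = x1 * math.cos(theta) + y1 * math.sin(theta)
        key = (round(theta / theta_res), round(rho / rho_res))
        acc[key] = acc.get(key, 0) + 1
    lines = [(k[0] * theta_res, k[1] * rho_res, v)
             for k, v in acc.items() if v >= min_votes]  # well-supported only
    return sorted(lines, key=lambda t: -t[2])
```

The two strongest returned lines would then serve as the two boundary lines of the host lane.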
Further, the occupation ratio α is calculated as the portion of the lower edge line of the following target vehicle's bounding box that lies between the two lane lines, relative to the entire lower edge line.
In some possible embodiments, the determination condition is set as:
if α is smaller than the preset threshold, it is judged that the following target vehicle is changing lanes, and longitudinal and lateral following control of the host vehicle is performed according to the running state information of the following target vehicle and the road condition information;
and if α is greater than or equal to the preset threshold, it is judged that the following target vehicle is not changing lanes; the host vehicle stays close to the center line of its lane, and only longitudinal following control is performed according to the running state information of the following target vehicle.
In some possible embodiments, the preset threshold may be 60%.
S513: and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the occupation ratio of the lower line of the second vehicle boundary frame area of the following target vehicle to the boundary line.
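Assuming the two lane boundaries have been reduced to x-coordinates at the row of the bounding box's lower edge, the occupation ratio α of step S513 reduces to a one-dimensional interval overlap. The sketch below uses the 60% threshold from the embodiment above; all names are illustrative.

```python
def lane_occupancy_ratio(box_left, box_right, lane_left, lane_right):
    """Fraction alpha of the bounding box's lower edge that lies between
    the two lane lines (all values are x-coordinates at the box's bottom row)."""
    width = box_right - box_left
    if width <= 0:
        return 0.0
    overlap = min(box_right, lane_right) - max(box_left, lane_left)
    return max(0.0, overlap) / width

def target_in_own_lane(box_left, box_right, lane_left, lane_right,
                       threshold=0.60):
    """alpha >= threshold: target still in the host lane (longitudinal-only
    control); alpha < threshold: target is judged to be changing lanes."""
    return lane_occupancy_ratio(box_left, box_right,
                                lane_left, lane_right) >= threshold
```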
S600: and controlling the self vehicle to run along with the following target vehicle according to the following control information.
It is understood that, according to the following control information, the host vehicle may be controlled to follow the target vehicle longitudinally or laterally, or to quit the following function and remind the driver to take over, etc.
The following briefly describes a vehicle automatic following control method provided in the embodiments of the present specification in conjunction with actual situations and specific information.
As shown in fig. 2, after the following function is started, the front camera acquires image information ahead of the host vehicle, the bounding boxes of all vehicles appearing in the image are extracted using the trapezoidal filter, and a suitable vehicle is selected as the following target according to the occupation ratio of each bounding box within the trapezoidal area and its pixel area.
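The trapezoidal-filter selection described above might be sketched as follows, assuming a detector has already produced axis-aligned bounding boxes. The trapezoid is modeled as a horizontal extent that interpolates linearly between a top row and a bottom row; the thresholds are illustrative assumptions, not values from the patent.

```python
def trapezoid_overlap_ratio(box, trap_top, trap_bottom, y_top, y_bottom):
    """Fraction of the box's pixels inside a trapezoid whose horizontal
    extent interpolates linearly from trap_top=(xl, xr) at row y_top to
    trap_bottom=(xl, xr) at row y_bottom (image rows grow downward).
    box = (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    area = (x2 - x1) * (y2 - y1)
    if area <= 0:
        return 0.0
    inside = 0.0
    for y in range(int(y1), int(y2)):
        t = (y - y_top) / (y_bottom - y_top)           # 0 at top, 1 at bottom
        t = min(max(t, 0.0), 1.0)
        xl = trap_top[0] + t * (trap_bottom[0] - trap_top[0])
        xr = trap_top[1] + t * (trap_bottom[1] - trap_top[1])
        inside += max(0.0, min(x2, xr) - max(x1, xl))  # overlap on this row
    return inside / area

def select_following_target(boxes, trap_top, trap_bottom, y_top, y_bottom,
                            min_ratio=0.5, min_area=2000):
    """Pick the largest-area box among those whose trapezoid-overlap ratio
    and pixel area pass the preset thresholds; None if no box qualifies."""
    candidates = [b for b in boxes
                  if (b[2] - b[0]) * (b[3] - b[1]) >= min_area
                  and trapezoid_overlap_ratio(b, trap_top, trap_bottom,
                                              y_top, y_bottom) >= min_ratio]
    return max(candidates, key=lambda b: (b[2] - b[0]) * (b[3] - b[1]),
               default=None)
```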
After the following target vehicle is selected, the image characteristics and the license plate characters of the following target vehicle need to be extracted and stored so as to be compared and matched in the driving process.
After the following target vehicle is selected, the self vehicle collects surrounding road condition information and driving state information of the following target vehicle, and starts to enter a following driving process.
In the following driving process of the self-vehicle, the self-vehicle acquires image information in front in real time through a front camera, matches the stored image characteristics and license plate characters of a following target vehicle with the front vehicle, and continuously judges whether the front vehicle is the following target vehicle.
If the vehicle ahead is the following target vehicle, the host vehicle acquires its running state information in real time for following control, and judges from the real-time front image information whether it is in the host lane. If it is in the host lane, the host vehicle follows it longitudinally and holds the lane near the center line laterally; if it is not, the host vehicle may follow the lane change according to the real-time road condition information, following longitudinally and changing lanes laterally when the lane change is allowed.
If the vehicle ahead is not the following target vehicle, the host vehicle attempts a lane change according to the real-time road condition information. If the lane change is not allowed, it exits the following function and reminds the driver to take over. If the lane change is allowed, the left lane is preferentially selected for the autonomous lane change. During the lane change and for a certain time after it is completed, the host vehicle keeps acquiring front image information and searches for the following target vehicle using its stored image features or license plate characters. If the target is not found within that time, the host vehicle exits the following function and reminds the driver to take over; if the target is found again, the host vehicle acquires the target's running state information, the front image information, and the surrounding road condition information in real time and resumes following.
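The decision flow of the two paragraphs above can be summarized as a single decision step. This is a sketch of the branching logic only, not the patent's control law; in particular, exiting when the target leaves the lane but a lane change is not allowed is an assumption.

```python
from enum import Enum, auto

class Action(Enum):
    FOLLOW_IN_LANE = auto()      # longitudinal follow, hold lane center
    FOLLOW_LANE_CHANGE = auto()  # longitudinal follow + lateral lane change
    CHANGE_LANE_SEARCH = auto()  # overtake and keep searching for the target
    EXIT_AND_ALERT = auto()      # quit following, driver must take over

def decide(front_is_target, target_in_own_lane, lane_change_allowed,
           search_timed_out):
    """One decision step of the following flow described above."""
    if front_is_target:
        if target_in_own_lane:
            return Action.FOLLOW_IN_LANE
        # Assumption: if the target leaves the lane and no lane change is
        # possible, exit and alert (the text does not specify this branch).
        return (Action.FOLLOW_LANE_CHANGE if lane_change_allowed
                else Action.EXIT_AND_ALERT)
    if search_timed_out or not lane_change_allowed:
        return Action.EXIT_AND_ALERT
    return Action.CHANGE_LANE_SEARCH
```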
An embodiment of the present invention further provides a vehicle automatic following control device, as shown in fig. 7, where the vehicle automatic following control device 1 includes:
a following target vehicle acquisition unit 101 configured to acquire a following target vehicle from first image information in front of a host vehicle;
a first judging unit 102, configured to acquire second image information in front of the host vehicle in real time and judge whether the following target vehicle exists in the second image information;
a lane change control unit 103, configured to, if the following target vehicle does not exist in the second image information, obtain road condition information around the vehicle in real time and control the vehicle to change lanes according to the road condition information;
a second judging unit 104, configured to acquire second image information of a front side in the vehicle lane change process and judge whether the following target vehicle exists in the second image information;
a following control information generating unit 105, configured to generate, if the following target vehicle exists in the second image information, following control information of the vehicle according to the second image information, the road condition information, and the driving state information of the following target vehicle;
a following control unit 106 for controlling the own vehicle to travel following the following target vehicle in accordance with the following control information.
The vehicle automatic following control device and the method embodiment of the invention are based on the same inventive concept, and please refer to the method embodiment for details, which are not described herein again.
An embodiment of the present invention further provides a vehicle automatic following control device, which includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement any one of the vehicle automatic following control methods as claimed above.
According to the vehicle automatic following control method, device, and equipment provided by the embodiments of the invention, continuous tracking control of the following target vehicle can be performed, without adding sensors beyond the conventional on-board set, by recognizing and matching the image features of the vehicle ahead through the camera, realizing L3-level automatic driving functions such as stable following and following lane changes on urban expressways, highways, and other roads with low traffic flow. Meanwhile, recognizing the license plate characters of the vehicle ahead provides a secondary confirmation, avoiding the situation in which feature matching fails, because the image features of two vehicles coincide, and the following target cannot be determined. In addition, within a certain time after the following target is lost, the host vehicle changes lanes and overtakes according to the surrounding road condition information, acquires front image information in real time through the camera during the lane change, and judges whether a vehicle satisfying both feature matching and license plate character matching exists; if the target is not detected within a certain time after the lane change is completed, the system exits the following function and reminds the driver to take over. This further reduces the degree of driver involvement required and improves the automation of vehicle driving.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus, system and server embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A vehicle automatic following control method characterized by comprising:
determining a following target vehicle according to first image information in front of the vehicle based on preset conditions of pixel area and proportion;
acquiring second image information in front of the self vehicle in real time and judging whether the following target vehicle exists in the second image information or not;
if the following target vehicle does not exist in the second image information, acquiring road condition information around the vehicle in real time and controlling the vehicle to change lanes according to the road condition information;
acquiring second image information in front in the self-vehicle lane changing process and judging whether the following target vehicle exists in the second image information;
if the following target vehicle exists in the second image information, generating following control information of the vehicle according to the second image information, the road condition information and the running state information of the following target vehicle;
and controlling the self vehicle to run along with the following target vehicle according to the following control information.
2. The vehicle automatic following control method according to claim 1, wherein the determining of the following target vehicle from the first image information in front of the own vehicle based on the preset conditions of the pixel area and the occupancy includes:
trapezoidal filtering is carried out on the first image information through a target detection algorithm, and first vehicle boundary frame area information in front of the self vehicle is obtained;
acquiring the pixel area of the first vehicle bounding box area according to the first vehicle bounding box area information;
acquiring the proportion of the first vehicle boundary frame area in a preset trapezoidal area;
selecting a front vehicle with the pixel area and the occupation ratio meeting preset conditions as the following target vehicle;
carrying out gray level processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing the image characteristics of the following target vehicle;
and carrying out image processing on the first vehicle boundary frame area information of the following target vehicle, and acquiring and storing license plate characters of the following target vehicle.
3. The vehicle automatic following control method according to claim 2, wherein the determining whether the following target vehicle is present in the second image information includes:
acquiring image characteristics of the following target vehicle;
and performing feature matching on the second image information according to the image features of the following target vehicle, acquiring the position of a matching point in the second image information according to the image features of the following target vehicle, and judging whether the following target vehicle exists according to the position of the matching point.
4. The vehicle automatic following control method according to claim 2, wherein the determining whether the following target vehicle is present in the second image information further includes:
acquiring license plate characters of the following target vehicle;
trapezoidal filtering is carried out on the second image information through a target detection algorithm, and second vehicle boundary frame area information in front of the self vehicle is obtained;
performing image processing on the second vehicle boundary frame region information to acquire and store license plate characters in front of the vehicle;
and comparing the license plate characters in front of the vehicle with the license plate characters of the following target vehicle, and judging whether the following target vehicle exists in front of the vehicle.
5. The vehicle automatic following control method according to claim 1, wherein the generating of the following control information of the own vehicle from the second image information, the road condition information, and the driving state information of the following target vehicle includes:
acquiring the second image information and the road condition information to obtain the transverse following control information of the self-vehicle;
and obtaining the longitudinal following control information of the self vehicle by obtaining the running state information of the following target vehicle.
6. The vehicle automatic following control method according to claim 1, characterized by further comprising:
if the following target vehicle exists in the second image information, obtaining second vehicle boundary frame area information of the following target vehicle according to the second image information;
and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the second vehicle boundary frame area information of the following target vehicle.
7. The vehicle automatic following control method according to claim 6, wherein the determining whether the following target vehicle is located within the own vehicle lane according to second vehicle boundary frame area information of the following target vehicle includes:
acquiring second vehicle boundary frame area information of the following target vehicle;
constructing a lane line as a boundary line of the self-vehicle lane according to the second vehicle boundary frame region information of the following target vehicle;
and judging whether the following target vehicle is positioned in the own vehicle lane or not according to the occupation ratio of the lower line of the second vehicle boundary frame area of the following target vehicle to the boundary line.
8. The vehicle automatic following control method according to claim 1, wherein the sensor of the own vehicle includes a camera and a vehicle-mounted radar, image information of the vehicle in front of the own vehicle is acquired in real time by the camera, and traveling state information of the following target vehicle and road condition information around the own vehicle are acquired in real time by the vehicle-mounted radar.
9. A vehicle automatic following control device characterized by comprising:
a following target vehicle determination unit for determining a following target vehicle from first image information in front of the own vehicle based on a pixel area and a preset condition of a proportion;
the first judging unit is used for acquiring second image information in front of the self vehicle in real time and judging whether the following target vehicle exists in the second image information or not;
the lane change control unit is used for acquiring road condition information around the host vehicle in real time and controlling the host vehicle to change lanes according to the road condition information, if the following target vehicle does not exist in the second image information and still cannot be matched while the host vehicle drives automatically at a preset speed or follows the current front vehicle for a preset time;
the second judgment unit is used for acquiring second image information in front in the self-vehicle lane changing process and judging whether the following target vehicle exists in the second image information or not;
a following control information generating unit, configured to generate, if the following target vehicle exists in the second image information, following control information of the vehicle according to the second image information, the road condition information, and the driving state information of the following target vehicle;
and the following control unit is used for controlling the self vehicle to follow the following target vehicle to run according to the following control information.
10. A vehicle automatic following control apparatus, comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, the at least one instruction, the at least one program, set of codes, or set of instructions being loaded and executed by the processor to implement a vehicle automatic following control method according to any one of claims 1 to 8.
CN201910779383.XA 2019-08-22 2019-08-22 Automatic following control method, device and equipment for vehicle Active CN110531661B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910779383.XA CN110531661B (en) 2019-08-22 2019-08-22 Automatic following control method, device and equipment for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910779383.XA CN110531661B (en) 2019-08-22 2019-08-22 Automatic following control method, device and equipment for vehicle

Publications (2)

Publication Number Publication Date
CN110531661A CN110531661A (en) 2019-12-03
CN110531661B true CN110531661B (en) 2021-06-22

Family

ID=68662561

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910779383.XA Active CN110531661B (en) 2019-08-22 2019-08-22 Automatic following control method, device and equipment for vehicle

Country Status (1)

Country Link
CN (1) CN110531661B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111103882A (en) * 2019-12-30 2020-05-05 合肥一辂汽车科技有限公司 Autonomous following control method for unmanned electric vehicle
CN111273673A (en) * 2020-03-09 2020-06-12 新石器慧通(北京)科技有限公司 Automatic driving following method and system of unmanned vehicle and unmanned vehicle
CN111619589B (en) * 2020-06-09 2022-12-30 南京工业职业技术学院 Automatic driving control method for complex environment
CN111674394B (en) * 2020-06-09 2023-04-11 南京工业职业技术学院 Automatic driving following keeping method capable of realizing microscopic regulation and control
CN112356848A (en) * 2020-11-06 2021-02-12 北京经纬恒润科技股份有限公司 Target monitoring method and automatic driving system
CN114999165A (en) * 2021-03-01 2022-09-02 上海博泰悦臻网络技术服务有限公司 Vehicle speed determination method, system, medium, and apparatus
CN113744420B (en) * 2021-09-06 2023-03-10 浙江创泰科技有限公司 Road parking charge management method, system and computer readable storage medium
CN113885505A (en) * 2021-10-12 2022-01-04 上海仙塔智能科技有限公司 Following processing method and device for vehicle, electronic equipment and storage medium
CN114379557B (en) * 2021-12-23 2023-12-15 浙江吉利控股集团有限公司 Automatic lane changing method, automatic lane changing control device and automatic lane changing system
CN116339194B (en) * 2023-02-28 2024-07-09 合肥工业大学 Following control method, system, terminal and storage medium for automatic driving vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859494A (en) * 2009-04-06 2010-10-13 通用汽车环球科技运作公司 Autonomous vehicle management
CN102859568A (en) * 2010-04-12 2013-01-02 罗伯特·博世有限公司 Video based intelligent vehicle control system
CN103927508A (en) * 2013-01-11 2014-07-16 浙江大华技术股份有限公司 Target vehicle tracking method and device
CN107264531A (en) * 2017-06-08 2017-10-20 中南大学 The autonomous lane-change of intelligent vehicle is overtaken other vehicles motion planning method in a kind of semi-structure environment
CN110007305A (en) * 2019-04-15 2019-07-12 北京行易道科技有限公司 Vehicle front target determines method, apparatus, server and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005045018A1 (en) * 2005-09-21 2007-03-22 Robert Bosch Gmbh Device for longitudinal guidance of a motor vehicle
EP2535883B1 (en) * 2008-07-10 2014-03-19 Mitsubishi Electric Corporation Train-of-vehicle travel support device
US20110313665A1 (en) * 2009-03-04 2011-12-22 Adc Automotive Distance Control Systems Gmbh Method for Automatically Detecting a Driving Maneuver of a Motor Vehicle and a Driver Assistance System Comprising Said Method
CN105023429B (en) * 2014-04-24 2017-08-01 上海汽车集团股份有限公司 Automobile-used wireless vehicle tracking and device
CN106209546A (en) * 2016-07-20 2016-12-07 张家港长安大学汽车工程研究院 Based on binocular camera and area array cameras automatic with car system
CN106994969B (en) * 2017-03-24 2019-06-14 奇瑞汽车股份有限公司 A kind of fleet's formation control loop and method
US10372131B2 (en) * 2017-07-06 2019-08-06 Ford Global Technologies, Llc Vehicles changing lanes based on trailing vehicles
CN108491782B (en) * 2018-03-16 2020-09-08 重庆大学 Vehicle identification method based on driving image acquisition
CN108801283A (en) * 2018-06-14 2018-11-13 淮阴工学院 The fleet's automatic driving control system and its automatic Pilot method of enclosed type road
CN109948504B (en) * 2019-03-13 2022-02-18 东软睿驰汽车技术(沈阳)有限公司 Lane line identification method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859494A (en) * 2009-04-06 2010-10-13 通用汽车环球科技运作公司 Autonomous vehicle management
CN102859568A (en) * 2010-04-12 2013-01-02 罗伯特·博世有限公司 Video based intelligent vehicle control system
CN103927508A (en) * 2013-01-11 2014-07-16 浙江大华技术股份有限公司 Target vehicle tracking method and device
CN107264531A (en) * 2017-06-08 2017-10-20 中南大学 The autonomous lane-change of intelligent vehicle is overtaken other vehicles motion planning method in a kind of semi-structure environment
CN110007305A (en) * 2019-04-15 2019-07-12 北京行易道科技有限公司 Vehicle front target determines method, apparatus, server and storage medium

Also Published As

Publication number Publication date
CN110531661A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN110531661B (en) Automatic following control method, device and equipment for vehicle
US11878683B1 (en) Automated system and method for modeling the behavior of vehicles and other agents
US9815460B2 (en) Method and device for safe parking of a vehicle
US11938967B2 (en) Preparing autonomous vehicles for turns
KR102222323B1 (en) Dynamic routing for autonomous vehicles
CN110194160B (en) Automatic driving system
CN111559388B (en) Target vehicle screening method, device, equipment and storage medium
KR20200014931A (en) Vehicle information storage method, vehicle driving control method, and vehicle information storage device
CN113324554B (en) Automatic driving route planning method and device, readable storage medium and electronic equipment
CN112141114B (en) Narrow passage auxiliary system and method
US11285957B2 (en) Traveling control apparatus, traveling control method, and non-transitory computer-readable storage medium storing program
US7440830B2 (en) Driving support system based on driver visual acquisition capacity
CN112124304B (en) Library position positioning method and device and vehicle-mounted equipment
CN114126940A (en) Electronic control device
CN113428160B (en) Dangerous scene prediction method, device and system, electronic equipment and storage medium
CN113879211A (en) Reminding method and system for preventing conflict between muck vehicle and non-motor vehicle in right turning process
CN111223319A (en) Driving strategy planning method and device and vehicle
US20230032741A1 (en) Road model generation method and device
EP4125051A1 (en) Method and device for determining reliability of visual detection
CN112677976B (en) Vehicle driving method, device, vehicle and storage medium
CN115472032B (en) Automatic lane change decision system and method for vehicles in ramp confluence area of expressway
CN114194187B (en) Vehicle travel control device
JP2004070383A (en) Vehicle travel controller
JP7067353B2 (en) Driver information judgment device
JP5154286B2 (en) Operation support device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant