CN111103882A - Autonomous following control method for unmanned electric vehicle - Google Patents


Info

Publication number
CN111103882A
CN111103882A
Authority
CN
China
Prior art keywords
vehicle
driving
scene
autonomous
control module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911399163.0A
Other languages
Chinese (zh)
Inventor
张志林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Yilu Automobile Technology Co ltd
Original Assignee
Hefei Yilu Automobile Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Yilu Automobile Technology Co ltd filed Critical Hefei Yilu Automobile Technology Co ltd
Priority to CN201911399163.0A priority Critical patent/CN111103882A/en
Publication of CN111103882A publication Critical patent/CN111103882A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231: using optical position detecting means
    • G05D 1/0257: using a radar
    • G05D 1/0276: using signals provided by a source external to the vehicle
    • G05D 1/0278: using satellite positioning signals, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an autonomous following control method for an unmanned electric vehicle. The method analyzes a global road-network file database together with GPS positioning data to recognize and distinguish the vehicle's driving scene, and switches the control strategy accordingly.

Description

Autonomous following control method for unmanned electric vehicle
Technical Field
The invention belongs to the field of unmanned driving of automobiles, and particularly relates to an autonomous following control method for an unmanned electric automobile.
Background
At present, with breakthroughs in unmanned-vehicle technology and the year-on-year decline in the cost of perception-system components, unmanned driving has become a focus of automotive research. Mainstream research faces two main trade-offs: between sensor performance and cost, and between decision-control performance and real-time performance. Some experts and scholars propose replacing lidar with a combined configuration of millimeter-wave radars and suitable algorithm design. By comparison, millimeter-wave radar costs far less than lidar: a long-wave millimeter-wave radar detects to roughly 100-200 meters and a short-wave unit to roughly 20-35 meters, and through data fusion and algorithm optimization across radars of different bands, millimeter-wave sensing can substitute for lidar to a certain extent. In control-algorithm design, the control system drives along the globally planned path while an onboard computer performs real-time computation and local path planning and commands the corresponding actuators. However, in some scenes, such as highway driving, the scene is relatively simple yet the control algorithm still performs a large amount of real-time computation, which clearly wastes computing resources.
To improve this situation, if the control strategy is switched through scene recognition and discrimination while the path is being planned, and cruise control is realized by following the preceding vehicle using the millimeter-wave radar and the forward-looking camera, it becomes possible to save computing resources while still accomplishing the drive.
Disclosure of Invention
The invention aims to provide an autonomous following control method for an unmanned electric vehicle that analyzes the acquired data and, together with the control-algorithm design, realizes automatic driving in suitable scenes by following the preceding vehicle, thereby reducing the running time of the computer control algorithm and avoiding waste of computing resources.
The autonomous following control method for an unmanned electric vehicle disclosed by the invention comprises:
the driving scene recognition module, used to recognize and confirm the driving scene of the current trip;
the strategy conversion control module, used to switch between the unmanned-driving control state and the autonomous following control state;
the monitoring and forced-exit control module, used to monitor the vehicle's perception-system data in real time and to provide timely safety intervention;
the driving scene recognition module comprises the following working steps:
a) reading the vehicle's global task road-network file database, analyzing the global driving path of the whole vehicle, and judging the driving scene of the current trip in combination with the onboard GPS positioning information; if the scene is an expressway scene, proceed to step b); if it is a non-expressway scene, proceed to step f);
b) once an expressway scene is confirmed, the driving scene recognition module separates the expressway driving sections in the road-network information base and further divides the driving path into road-repair sections and normal sections according to the global task road-network file database;
c) the driving scene recognition module confirms the state of the current road section from the GPS positioning information; if the vehicle is on a road-repair section, proceed to step f); if on a normal section, proceed to step d);
d) the strategy conversion control module acquires information on the vehicle ahead via the onboard millimeter-wave radar; when the distance and speed of the vehicle ahead meet the requirements for autonomous following, the module switches to the autonomous following control state and enters the autonomous following mode; when warning information from the monitoring and forced-exit control module is received, proceed to step e);
e) the strategy conversion control module exits the autonomous following mode and proceeds to step f);
f) the strategy conversion control module switches to the unmanned-driving control state and starts the unmanned-driving path-planning and behavior-control strategy.
The technical scheme of the invention has the following beneficial effects:
The method mainly uses the global road-network file database and GPS positioning data to recognize and distinguish driving scenes and switch control strategies accordingly. It then uses the millimeter-wave radar and front camera to monitor the driving state of the preceding vehicle and the lane-line condition in real time and, considering passenger safety and comfort, realizes autonomous following cruise control by driving along with the preceding vehicle. This effectively reduces the real-time computation and analysis performed by the computer in the unmanned-driving system, avoids wasting computing resources, and prolongs the service life of the system.
Drawings
FIG. 1 is a schematic view of the operation logic by which the driving scene recognition module recognizes the driving scene and switches the driving mode;
FIG. 2 is a simplified logic diagram of the autonomous following mode start-up operation;
FIG. 3 is a schematic view illustrating driving-scene state determination.
Detailed Description
To help those skilled in the art understand the technical solutions of the invention, they are further described below with reference to the accompanying drawings.
As shown in fig. 1, the autonomous following control method for the unmanned electric vehicle according to the invention mainly comprises a driving scene recognition module, a strategy conversion control module, and a monitoring and forced-exit control module.
The driving scene recognition module is mainly used to recognize and confirm the driving scene of the current trip. Driving scenes are generally divided into highway scenes, urban scenes, and suburban scenes. Urban and suburban scenes are unsuitable for vehicle following because of heavy traffic, many pedestrians and obstacles, and frequent, complex changes in junctions and road conditions; they mainly use unmanned-driving control. Highway scenes, with good road conditions, no pedestrians, few obstacles, largely straight road surfaces, and lighter traffic, are suitable for following. When the unmanned vehicle follows a preceding vehicle, the real-time computation and analysis performed by the computer in the unmanned-driving system can be effectively reduced, avoiding wasted computing resources and prolonging the service life of the system.
The strategy conversion control module is mainly used to switch between the unmanned-driving control state and the autonomous following control state. When the driving scene recognition module judges that autonomous following cannot currently be performed, the module switches to the unmanned-driving control state to realize unmanned driving.
The monitoring and forced-exit control module is mainly used to monitor the vehicle's perception-system data in real time and to provide timely safety intervention. It is connected by signal to the onboard GPS (global positioning system), the millimeter-wave radar, the forward-looking camera, the strategy conversion control module, the driving scene recognition module, and the unmanned-driving control system, enabling real-time monitoring and acquisition of driving information. When the computer control algorithm in the unmanned-driving control system judges that following is no longer appropriate, the module forces the vehicle out of the autonomous following state and back into the unmanned-driving control state.
As shown in fig. 1, in the technical solution of the invention the driving scene recognition module includes the following working steps:
a) the driving scene recognition module reads the vehicle's global task road-network file database, analyzes the global driving path of the whole vehicle, and judges the driving scene of the current trip in combination with the onboard GPS positioning information; if the scene is an expressway scene, proceed to step b); if it is a non-expressway scene, proceed to step f).
b) Once an expressway scene is confirmed, the driving scene recognition module separates the expressway driving sections in the road-network information base and further divides the driving path into road-repair sections and normal sections according to the global task road-network file database.
c) The driving scene recognition module confirms the state of the current road section from the GPS positioning information; if the vehicle is on a road-repair section, proceed to step f); if on a normal section, proceed to step d).
d) The strategy conversion control module acquires information on the vehicle ahead via the onboard millimeter-wave radar; when the distance and speed of the vehicle ahead meet the requirements for autonomous following, the module switches to the autonomous following control state and enters the autonomous following mode; when warning information from the monitoring and forced-exit control module is received, proceed to step e).
e) The strategy conversion control module exits the autonomous following mode and proceeds to step f).
f) The strategy conversion control module switches to the unmanned-driving control state and starts the unmanned-driving path-planning and behavior-control strategy.
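Steps a) to f) above amount to a small mode-selection decision. The following is a minimal sketch of that logic; the function and parameter names are assumptions for illustration, not part of the patent:

```python
# Illustrative sketch of steps a)-f). The inputs are assumed to be produced
# upstream: 'scene' from the road-network database plus GPS (step a), 'section'
# from the section split (steps b-c), and 'follow_ok' from the radar check (step d).

def select_control_mode(scene, section, follow_ok):
    """Return 'autonomous_following' or 'unmanned_driving' per steps a)-f)."""
    if scene != "highway":          # step a): non-expressway -> step f)
        return "unmanned_driving"
    if section == "road_repair":    # step c): road-repair section -> step f)
        return "unmanned_driving"
    if follow_ok:                   # step d): enter the autonomous following mode
        return "autonomous_following"
    return "unmanned_driving"       # steps e)/f): fall back to unmanned control
```

A warning from the monitoring and forced-exit control module would simply re-run this selection with `follow_ok=False`, reproducing steps e) and f).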
As shown in fig. 2, in step d) the strategy conversion control module determines whether the information on the vehicle ahead satisfies the autonomous-following requirements as follows:
S1) the strategy conversion control module combines the data on the lane ahead transmitted in real time by the millimeter-wave radar and the forward-looking camera, uses region blocks in the automatic driving system to partition the fused data from the environment-perception system, extracts the data needed along the vehicle's direction of travel, calibrates the lane position and speed of the preceding vehicle and the distance to it, and proceeds to step S2);
S2) if the lane information shows the host vehicle and the preceding vehicle are in the same lane, proceed to step S3); if they are in different lanes, proceed to step S6);
S3) measure the running speed of the preceding vehicle; if it is greater than or equal to 50 km/h, proceed to step S4); if it is below 50 km/h, exit the autonomous following mode;
S4) control the host vehicle's acceleration and braking through the automatic driving system to keep the distance to the preceding vehicle constant; keep tracking the lane line by controlling the front-wheel steering angle, and proceed to step S5);
S5) maintain the autonomous following state; when the monitoring and forced-exit control module outputs a warning signal, proceed to step S6);
S6) the strategy conversion control module switches to the unmanned-driving control state and starts the unmanned-driving path-planning and behavior-control strategy.
An example of the autonomous following control method for the unmanned electric vehicle is provided below with reference to figs. 1 to 3, so that those skilled in the art can understand the technical scheme more fully.
The control system of the autonomous following control method comprises three main parts: the driving scene recognition module, the strategy conversion control module, and the monitoring and forced-exit control module.
(I) Working process of the driving scene recognition module
First, the driving scene recognition module acquires the relevant data. Its input signals comprise the global task road-network file database and the driving GPS positioning information. The module parses the task road-network file database and distinguishes the entry and exit points and the areas covered by highway scenes (highway), urban scenes (city), and suburban scenes (suburb).
For example, as shown in fig. 3, if the entry point of a highway section is A and the exit point is B, the section is marked highway[A, B]. The driving scene recognition module compares the position information in the task road-network file database with the driving GPS positioning information to determine the current scene state. Finally, the module issues the confirmation instruction to the autonomous following mode control system. Referring to the logic flow in fig. 1, the module runs through steps a) to f) to switch between the autonomous following mode and the unmanned-driving mode.
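The highway[A, B] comparison described above can be sketched as an interval test. Representing each section by its chainage interval (metres along the route) is an assumption for illustration; the patent only states that the two sources of position information are compared:

```python
# Sketch of the scene check: the GPS fix, reduced to a distance along the
# planned route, is compared against highway[A, B] entry/exit points stored
# in the task road-network file database.

def classify_scene(chainage, highway_sections):
    """highway_sections: list of (entry_A, exit_B) chainage pairs, i.e. highway[A, B]."""
    for a, b in highway_sections:
        if a <= chainage <= b:
            return "highway"
    return "non-highway"

# One hypothetical section: entry point A at 1 km, exit point B at 9 km.
sections = [(1000.0, 9000.0)]
```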
(II) Working process of the strategy conversion control module
The strategy conversion control module mainly switches between the unmanned-driving control state and autonomous following control. Its input signals include the fused perception data, vehicle state information (e.g., acceleration, yaw rate), and real-time map information. For its logic, see fig. 2: the module runs through steps S1) to S6), judging from the information on the vehicle ahead, on a normal road section in a highway scene, whether autonomous following is possible, thereby switching between the autonomous following mode and the unmanned-driving mode.
The strategy conversion control module makes its more specific judgments in the following steps (1) to (3):
(1) Partition the fused data in the environment-perception system using region blocks, extract the data needed along the vehicle's direction of travel, and calibrate the lane position and speed of the preceding vehicle and the distance to it.
M1. Calibrate the data to obtain the data conversion matrices.
Assume the fused data are D_all, the data collected by the front millimeter-wave radar are D_f_rad, and the data collected by the front camera are D_f_cam; data fusion takes the whole-vehicle coordinate system as the reference.
Suppose a point has coordinates [x, y, z] in the whole-vehicle coordinate system, [x_m, y_m, z_m] in the millimeter-wave radar data, and [x_c, y_c, z_c] in the front-view camera data. Then:
[x, y, z]^T = R_m · [x_m, y_m, z_m]^T + T_m
[x, y, z]^T = R_c · [x_c, y_c, z_c]^T + T_c
where R_m and R_c are the coordinate-change rotation matrices of the millimeter-wave radar and the front camera, respectively, and T_m and T_c are the corresponding coordinate-change translation matrices. The specific values of the conversion matrices are obtained from real-vehicle data acquisition and calculation.
M2. After data conversion, create separate data groups for the front millimeter-wave radar and the front camera and extract the required data.
The data provided by the front millimeter-wave radar include the position and speed of each forward vehicle and can be defined as:
Lead-Car(n)[X_n, Y_n, V_n, ΔS_n]
where n is the index of a measured vehicle, X_n is its position along the direction of travel, Y_n its position perpendicular to the direction of travel, V_n its running speed, and ΔS_n the distance between the host vehicle and vehicle n.
The data provided by the front-view camera include the distribution of the lane lines and can be defined as:
Lane-Car(n)[ΔS_n, L_n]
where n is the index of a measured vehicle, ΔS_n the distance from the host vehicle to forward vehicle n, and L_n the attribute of the lane line in which the forward vehicle is located.
(2) Judge the lane information and measure the running speed of the preceding vehicle.
N1. Confirmation of the vehicle to follow
Check the lane-line attribute and pre-lock as target the forward vehicle whose lane-line attribute matches the host vehicle's.
N2. Speed confirmation
Set the follow-start speed threshold to 50 km/h ≤ V_n ≤ 120 km/h, call the front millimeter-wave radar data group, and judge whether the preceding vehicle's running speed meets the threshold; if so, carry out acceleration, deceleration, and steering control.
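Steps N1 and N2 can be sketched as a target-selection function over the Lead-Car(n) and Lane-Car(n) data. The dataclass field names are assumptions mirroring [X_n, Y_n, V_n, ΔS_n] and L_n:

```python
from dataclasses import dataclass

# Sketch of N1 (same-lane pre-lock) and N2 (50 km/h <= V_n <= 120 km/h).

@dataclass
class LeadCar:
    x: float        # X_n: position along the direction of travel (m)
    y: float        # Y_n: position perpendicular to the direction of travel (m)
    v: float        # V_n: running speed (km/h)
    ds: float       # dS_n: distance from the host vehicle (m)
    lane: str       # L_n: lane-line attribute from Lane-Car(n)

def pick_follow_target(cars, host_lane, v_min=50.0, v_max=120.0):
    """Return the nearest same-lane forward car inside the speed window, else None."""
    candidates = [c for c in cars
                  if c.lane == host_lane and v_min <= c.v <= v_max]
    return min(candidates, key=lambda c: c.ds) if candidates else None
```

When `pick_follow_target` returns None, the flow of fig. 2 would fall through to step S6) and stay in the unmanned-driving control state.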
(3) Keep the distance between the two vehicles constant by controlling acceleration and braking; keep tracking the lane line by controlling the front-wheel steering angle.
W1. Spacing control. The controller is designed from the formula:
ΔS_n = (v_n − v) · T
where v is the current speed of the host vehicle and T is the preview time determined by the vehicle's visual far point, typically 2 seconds. Controlling v is in fact controlling the vehicle's acceleration, concretely the acceleration torque and braking torque. Considering passenger comfort and ride smoothness, the speed control is limited to:
−0.2 m/s ≤ v − v_d ≤ 0.2 m/s
−0.05 m/s ≤ Δv ≤ 0.05 m/s
where v_d is the desired speed of the vehicle and Δv is the speed increment per control cycle.
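A minimal sketch of the W1 spacing controller follows, using the 2-second time T and the two comfort limits from the text. The proportional gain k and the exact control law for v_d are assumptions; the patent gives only the spacing relation and the limits:

```python
# One cycle of the W1 spacing controller: drive the gap toward v * T while
# respecting |v - v_d| <= 0.2 m/s and a per-cycle increment |dv| <= 0.05 m/s.

T_PREVIEW = 2.0      # s, typical value from the text
DV_MAX    = 0.05     # m/s, speed increment limit per control cycle
ERR_MAX   = 0.2      # m/s, allowed band around the desired speed v_d

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def speed_command(v, v_lead, gap, k=0.1):
    """Return the host speed for the next cycle (all speeds in m/s, gap in m)."""
    desired_gap = v * T_PREVIEW
    v_d = v_lead + k * (gap - desired_gap)       # assumed proportional law
    v_d = clamp(v_d, v - ERR_MAX, v + ERR_MAX)   # -0.2 <= v - v_d <= 0.2
    dv  = clamp(v_d - v, -DV_MAX, DV_MAX)        # -0.05 <= dv <= 0.05
    return v + dv
```

At the matched steady state (same speed, gap equal to v·T) the command leaves v unchanged, which is the "constant distance" behaviour step S4) asks for.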
W2. Tracking control is performed by controlling the steering according to the state of the lane:
Step 1: read in the video data from the front camera and run two processes on the image in parallel:
L1. contour extraction, using a locally adaptive-threshold binarization method;
L2. road-edge extraction, using Canny edge detection.
Step 2: filter the two parallel results of step 1 separately, then fuse the filtered images to obtain an image processed in both the transverse and longitudinal directions.
Step 3: process the fused image with a linear clustering algorithm based on density features.
Step 4: fit with a least-squares curve-fitting algorithm, using the general equation of a quadratic curve.
Step 5: establish a tracking model for the curve:
K1. maintain the following state; when the monitoring and forced-exit control module outputs a warning signal, exit the following mode;
K2. activate the unmanned-driving behavior decision and path-planning control strategy.
Considering passenger comfort and ride smoothness, the front-wheel steering angle and its per-cycle increment are limited to:
−25° ≤ δ ≤ 25°
−0.47° ≤ Δδ ≤ 0.47°.
(III) Working process of the monitoring and forced-exit control module
The monitoring and forced-exit control module mainly executes an immediate control interruption and forcibly switches to the unmanned-driving behavior decision and path-planning control strategy. It is a restrictive module that guarantees driving safety; it holds higher-priority intervention rights and is the last safety guarantee. Its intervention conditions are as follows:
J1. the speed of the forward vehicle exceeds the speed-limit value of the road section;
J2. the forward vehicle brakes or steers in an emergency;
J3. an intersection switching instruction is issued;
J4. a scene switching instruction is issued;
J5. forward congestion is confirmed.
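Conditions J1 to J5 are a simple disjunction: any one of them triggers the forced exit. A minimal sketch, with flag names that are assumptions:

```python
# Sketch of the J1-J5 intervention check. A True result means the module
# interrupts control and forces the switch back to the unmanned-driving
# behavior decision and path-planning strategy.

def must_force_exit(lead_speed, speed_limit, lead_emergency,
                    junction_switch, scene_switch, congestion_ahead):
    return any([
        lead_speed > speed_limit,   # J1: forward vehicle over the road's limit
        lead_emergency,             # J2: emergency braking or steering ahead
        junction_switch,            # J3: intersection switching instruction
        scene_switch,               # J4: scene switching instruction
        congestion_ahead,           # J5: forward congestion confirmed
    ])
```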
The technical solution of the invention has been described above with reference to the accompanying drawings. Obviously, the specific implementation of the invention is not limited to the above manner; various insubstantial modifications of the inventive method concept and technical solution, or direct applications of the inventive concept and technical solution to other occasions without modification, all fall within the protection scope of the invention.

Claims (1)

1. An autonomous following control method of an unmanned electric vehicle, comprising:
the driving scene recognition module, used to recognize and confirm the driving scene of the current trip;
the strategy conversion control module, used to switch between the unmanned-driving control state and the autonomous following control state;
the monitoring and forced-exit control module, used to monitor the vehicle's perception-system data in real time and to provide timely safety intervention;
the driving scene recognition module comprising the following working steps:
a) reading the vehicle's global task road-network file database, analyzing the global driving path of the whole vehicle, and judging the driving scene of the current trip in combination with the onboard GPS positioning information; if the scene is an expressway scene, proceed to step b); if it is a non-expressway scene, proceed to step f);
b) once an expressway scene is confirmed, the driving scene recognition module separates the expressway driving sections in the road-network information base and further divides the driving path into road-repair sections and normal sections according to the global task road-network file database;
c) the driving scene recognition module confirms the state of the current road section from the GPS positioning information; if the vehicle is on a road-repair section, proceed to step f); if on a normal section, proceed to step d);
d) the strategy conversion control module acquires information on the vehicle ahead via the onboard millimeter-wave radar; when the distance and speed of the vehicle ahead meet the requirements for autonomous following, the module switches to the autonomous following control state and enters the autonomous following mode; when warning information from the monitoring and forced-exit control module is received, proceed to step e);
e) the strategy conversion control module exits the autonomous following mode and proceeds to step f);
f) the strategy conversion control module switches to the unmanned-driving control state and starts the unmanned-driving path-planning and behavior-control strategy.
CN201911399163.0A 2019-12-30 2019-12-30 Autonomous following control method for unmanned electric vehicle Pending CN111103882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911399163.0A CN111103882A (en) 2019-12-30 2019-12-30 Autonomous following control method for unmanned electric vehicle


Publications (1)

Publication Number Publication Date
CN111103882A true CN111103882A (en) 2020-05-05

Family

ID=70425391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911399163.0A Pending CN111103882A (en) 2019-12-30 2019-12-30 Autonomous following control method for unmanned electric vehicle

Country Status (1)

Country Link
CN (1) CN111103882A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113320545A (en) * 2021-07-01 2021-08-31 江苏理工学院 Intersection behavior prediction decision method based on line-control intelligent vehicle
CN114475597A (en) * 2022-02-28 2022-05-13 东风汽车集团股份有限公司 Method and system for controlling following distance of automatic driving vehicle
US11938941B2 (en) 2020-08-31 2024-03-26 Denso International America, Inc. Mode selection according to system conditions

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000112523A (en) * 1998-09-30 2000-04-21 Honda Motor Co Ltd Automatic follow-up traveling system
US20090037070A1 (en) * 2007-08-03 2009-02-05 Nissan Motor Co., Ltd. System and method for controlling running of a vehicle
CN108536148A (en) * 2018-04-17 2018-09-14 陈明 A novel vehicle automatic driving method
CN109649390A (en) * 2018-12-19 2019-04-19 清华大学苏州汽车研究院(吴江) Autonomous car-following system and method for an autonomous driving vehicle
CN109976346A (en) * 2019-04-08 2019-07-05 广州小鹏汽车科技有限公司 Vehicle automatic following control method and system
CN110099832A (en) * 2016-12-28 2019-08-06 罗伯特·博世有限公司 Adaptive speed control system for an autonomous vehicle
CN110531661A (en) * 2019-08-22 2019-12-03 浙江吉利汽车研究院有限公司 Automatic vehicle-following control method, device and equipment

Similar Documents

Publication Publication Date Title
CN109520498B (en) Virtual turnout system and method for virtual rail vehicle
CN109987092B (en) Method for determining vehicle obstacle avoidance and lane change time and method for controlling obstacle avoidance and lane change
CN112193244B (en) Automatic driving vehicle motion planning method based on linear constraint
CN109976329B (en) Planning method for vehicle obstacle avoidance and lane change path
CN110304074B (en) Hybrid driving method based on layered state machine
CN108445885A (en) A kind of automated driving system and its control method based on pure electric vehicle logistic car
JP6641583B2 (en) Vehicle control device, vehicle control method, and program
CN111469847B (en) Lane change path planning method and system
CN106428007A (en) Autonomous driving control apparatus and method for vehicle
CN111103882A (en) Autonomous following control method for unmanned electric vehicle
Nobe et al. An overview of recent developments in automated lateral and longitudinal vehicle controls
CN113677581A (en) Lane keeping method, vehicle-mounted device and storage medium
KR20150061781A (en) Method for controlling cornering of vehicle and apparatus thereof
JP2019160032A (en) Vehicle control device, vehicle control method, and program
CN110389589A (en) Intelligent driving vehicle obstacle-avoidance system and method
CN104999958A (en) Automatic control system and method for turn signals
CN109501798B (en) Travel control device and travel control method
JP2019160031A (en) Vehicle control device, vehicle control method, and program
CN111994075A (en) Driving assistance method based on artificial intelligence
CN115892063A (en) Road condition monitoring and coping method for unmanned commercial vehicle
CN110737261A (en) Automatic stop control method and system for vehicles
CN116564084A (en) Connected assisted-driving control method and system based on pure roadside perception
CN116394979A (en) Automatic driving decision control method based on road side fusion perception
CN116461525A (en) Vehicle lane changing method, device, equipment, medium and vehicle
CN114572250A (en) Method for automatically driving through intersection and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination