CN114435404A - Intelligent driving control method based on environment perception - Google Patents


Info

Publication number
CN114435404A
CN114435404A
Authority
CN
China
Prior art keywords
vehicle
driving
obstacle
information
control method
Prior art date
Legal status
Pending
Application number
CN202210215963.8A
Other languages
Chinese (zh)
Inventor
魏晓娟
贾猛
朱永宽
卢春光
王俊平
孙玉敏
Current Assignee
Henan Polytechnic Institute
Original Assignee
Henan Polytechnic Institute
Priority date
Filing date
Publication date
Application filed by Henan Polytechnic Institute filed Critical Henan Polytechnic Institute
Priority to CN202210215963.8A
Publication of CN114435404A
Legal status: Pending


Classifications

    • B: Performing operations; transporting
    • B60: Vehicles in general
    • B60W: Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001: Planning or execution of driving tasks
    • B60W2420/00: Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40: Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403: Image sensing, e.g. optical camera
    • B60W2420/408: Radar; laser, e.g. lidar
    • B60W2556/00: Input parameters relating to data
    • B60W2556/35: Data fusion
    • B60W2556/40: High definition maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an intelligent driving control method based on environmental perception, which comprises the following steps: based on a high-precision map and high-precision positioning, marking the navigation information of a vehicle on the high-precision map in units of lanes, marking the lane carrying the navigation information as the planned driving lane, and marking the positioned area occupied by the vehicle on the high-precision map; sensing obstacle information in all lanes within a preset range ahead of the moving vehicle, and marking that information on the high-precision map; obtaining a driving reference line for the vehicle based on the vehicle passing data and the obstacle information; and controlling the vehicle to travel along the driving reference line. The disclosed method analyzes and evades the dynamic trajectories of obstacles on the road in advance, protecting the safety of pedestrians on the road while ensuring that the vehicle itself can pass safely. Further, the method analyzes whether an operation by the driver is feasible in order to decide whether the vehicle should respond to it, thereby guarding against driver misoperation.

Description

Intelligent driving control method based on environment perception
Technical Field
The application belongs to the technical field of automatic driving of automobiles, and particularly relates to an intelligent driving control method based on environmental perception.
Background
With the development of science and technology, vehicle automatic driving technology has matured: the vehicle is continuously controlled in real time by means of artificial intelligence, visual computing, radar, and monitoring devices. Passengers using an automatic driving vehicle need only input a destination; the vehicle then generates a driving route based on its current position and the destination and drives along the generated route.
A vehicle's intelligent automatic driving system derives its control strategy by aggregating and analyzing various detection results, with safety as the overall premise. However, current intelligent vehicle control is typically developed against the background of an empty site or a closed road section with very little traffic. In everyday driving, obstacles on the road are still judged against the standard of a fixed obstacle: the position of a pedestrian or other moving object is treated as the fixed position it occupied at some point in time. As a result, when the vehicle is on a congested road section with many pedestrians, its detection concludes that obstacles surround it on all sides, and it cannot move forward. On the other hand, during intelligently controlled automatic driving, the driver may intervene in the operation of the vehicle for various reasons, including misoperation in certain specific situations. Because the intelligence level of current vehicles is still limited, the vehicle's control strategy can be overridden by the driver, so traffic accidents caused by driver misoperation can occur.
In principle, intelligent automatic vehicle control puts safety above everything else, but handling dynamic traffic environments and resolving operational conflicts with the driver requires more intelligent sensing and analysis techniques, together with a corresponding vehicle control architecture.
Disclosure of Invention
The present application provides an intelligent driving control method based on environment perception. An anticipatory intelligent control strategy is formulated according to dynamic changes in the surrounding road environment, so that obstacles on the road are analyzed and avoided in advance; on that basis, environment perception is further used to analyze whether an operation by the driver is feasible, in order to decide whether the vehicle should respond to it.
In order to achieve the above purpose, the present application provides the following solutions:
an intelligent driving control method based on environment perception comprises the following steps:
based on a high-precision map and high-precision positioning, marking navigation information of a vehicle on the high-precision map by taking a lane as a unit, marking the lane marked with the navigation information as a planned driving lane, and marking a positioned vehicle occupation area on the high-precision map;
sensing obstacle information in all lanes in a preset range in front of the running vehicle, and marking the obstacle information on the high-precision map;
obtaining a driving reference line of the vehicle based on vehicle passing data and the obstacle information;
and controlling the vehicle, based on the driving reference line, to travel along it.
Optionally, the high-precision map displays road data in units of lanes;
and the high-precision positioning data is the occupied area of the vehicle relative to the lane.
Optionally, the method for obtaining the obstacle information comprises the following steps:
collecting front video information covering all lanes within the preset range, side video information beside the vehicle, and rear video information behind the vehicle; integrating the front, side and rear video information; and converting the integrated information into a two-dimensional image centered on the vehicle;
identifying imagery of an obstacle in the two-dimensional image using visual analysis techniques;
mapping a radar detection range to the two-dimensional image, and obtaining position data of the obstacle through a radar detection technology;
and obtaining the obstacle information based on the image and the position data of the obstacle.
Optionally, the obstacle comprises a fixed obstacle and a moving obstacle;
the obstacle information comprises fixed obstacle information and moving obstacle information, and the moving obstacle information comprises the movement trend of the moving obstacle;
identifying the fixed obstacle and the moving obstacle in the two-dimensional image, and a movement trend of the moving obstacle using an optical flow analysis method.
Optionally, the method for obtaining the driving reference line includes:
obtaining driving safety critical points based on the movement trend of the moving obstacle, and also marking the fixed obstacle information as driving safety critical points;
obtaining a drivable area based on the driving safety critical point;
and obtaining the driving reference line based on the drivable area and the vehicle passing data.
Optionally, if the drivable area cannot meet the requirements of the vehicle passing data, the driving reference line terminates, and the vehicle stops traveling at the termination point of the driving reference line.
Optionally, the driving operation of the vehicle driver on the vehicle is monitored in real time;
responding to the driving operation of a vehicle driver if the vehicle runs in the drivable area;
if the vehicle reaches the edge of the drivable area, stopping the vehicle from driving or correcting the driving direction of the vehicle;
and if the drivable area cannot be obtained according to the driving safety critical point, the vehicle does not respond to the driving operation of the vehicle driver on the vehicle.
Optionally, the driving speed of the vehicle is determined according to the drivable area and the vehicle passing data.
The beneficial effects of the present application are as follows:
The application discloses an intelligent driving control method based on environment perception. On the basis of existing navigation technology, an anticipatory intelligent control strategy is formulated according to dynamic changes in the road environment around the vehicle, and a driving route is generated, so that the dynamic trajectories of obstacles on the road can be analyzed and evaded in advance; on the premise that the vehicle can pass safely, the safety of pedestrians on the road is protected. Further, the technical scheme does not allow the driver to intervene arbitrarily: on the premise of safe driving, environment perception is used to analyze whether an operation by the driver is feasible, in order to decide whether the vehicle should respond to it, thereby avoiding driver misoperation.
Drawings
To illustrate the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an intelligent driving control method based on environmental awareness according to an embodiment of the present application;
fig. 2 is a schematic view of a simulated road condition in the embodiment of the present application.
Detailed Description
The present application provides an intelligent strategy technique related to autonomous driving, but is not limited to one that provides full operation and control of an intelligently controlled vehicle. To ease understanding of the technologies and terms used in this application, some related intelligent vehicle control technologies are first described.
Autonomous driving system (ADS) technology refers to systems that perform driving tasks for a vehicle (e.g., lateral and longitudinal control) and allow the vehicle to drive with reduced or no human control of those tasks.
GPS is an example of a Global Navigation Satellite System (GNSS), which provides geographic location and time information to a receiver. Examples of GNSS include, but are not limited to, the Global Positioning System developed in the United States, the Differential Global Positioning System (DGPS), the BeiDou Navigation Satellite System (BDS), the GLONASS global navigation satellite system, and the European Union's Galileo positioning system.
An automated vehicle ("AV") refers to a vehicle that operates in an automated mode (e.g., at any automation level).
The automation or intelligence level of a vehicle is described in terms of an "intelligence level" or "automation level". The vehicle intelligence or automation level is one of the following:
  • V0: no automation function;
  • V1: basic functions that assist a human driver in controlling the vehicle;
  • V2: the system assists a human driver in controlling the vehicle to complete simple tasks and provides a basic sensing function;
  • V3: the system has a detailed real-time environment sensing function and can complete relatively complex driving tasks;
  • V4: the vehicle can run independently under defined conditions, with the support of a human driver;
  • V5: the vehicle can run independently in any situation without the support of a human driver.
The system intelligence and/or automation level is one of the following:
  • S0: no function;
  • S1: the system provides simple functions for individual vehicles, such as cruise control and passive safety; it detects the speed, position and distance of the vehicle;
  • S2: the system consists of individual intelligence and can detect the functional state of the vehicle, the acceleration of the vehicle, and/or traffic signs and signals; the individual vehicle makes decisions from its own information, partially realizing automatic driving, and provides complex assistance functions such as adaptive cruise control, lane keeping, and lane-changing automatic parking;
  • S3: the system integrates information from a group of vehicles, has point-to-point intelligence and prediction capability, can make intelligent decisions for the vehicle group, and can complete complex conditional automatic driving tasks such as cooperative cruise control, vehicle platooning, and vehicle navigation at intersections, merges and diverges;
  • S4: the system optimally integrates driving behaviors in a local network; it detects and transmits detailed information within the local network, makes decisions according to vehicle and traffic information in the network, and handles complex, high-level automatic driving tasks, such as guiding a traffic signal corridor and providing optimal trajectories for vehicles in a small traffic network;
  • S5: vehicle automation and system traffic automation, where the system optimally manages the entire transportation network; it detects and transmits detailed information within the network and makes decisions based on all available information; the system handles fully automated driving tasks, including individual vehicle tasks and traffic transportation tasks, and coordinates all vehicles to manage traffic.
Some other common standards in this field can be found in SAE International standard J3016.
In the embodiments of the present application, wherever automatic driving or intelligent driving control is involved, the above technical content may be referenced or cited.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As is well known, a vehicle generally travels on a road shared with a large number of other vehicles, and some road sections carry a large number of pedestrians. In practice, the other vehicles and pedestrians around the vehicle are moving, not stationary, which greatly affects the determination of obstacle positions. If these moving vehicles and pedestrians are treated as fixed obstacles, the vehicle must continuously revise its own driving control strategy over time until it can no longer move forward. Worse, the driver may take over control of the vehicle at that point, and this intervention can directly cause an accident.
Therefore, the present application pre-judges the movement trajectories of various obstacles, especially pedestrians, according to the dynamic changes of the road environment, combines the road surface area required as the vehicle advances with the obstacle positions and pedestrian trajectories, and generates an anticipatory driving control strategy, thereby ensuring the safety of obstacles and pedestrians on the road as well as the safety of the advancing vehicle.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
As shown in fig. 1, a schematic flow chart of an intelligent driving control method based on environmental awareness according to an embodiment of the present application mainly includes the following steps.
S102, marking navigation information of the vehicle and the occupied area of the positioned vehicle on the high-precision map by taking the lane as a unit based on the high-precision map and the high-precision positioning.
At present, navigation and positioning technologies have made great progress, and many related technologies exist. Common navigation technologies include GPS, BeiDou, Galileo, and the like, and everyday service providers such as AMap (Gaode) and Baidu offer complete navigation and positioning functions. However, these functions still stop at the road level: only the road path is displayed. Such navigation technology falls far short of what a vehicle needs to reasonably avoid road obstacles.
In fact, existing navigation technology, especially BeiDou navigation and positioning, has reached a precision on the order of 10 cm. Based on such high-precision maps and high-precision positioning, navigation information can be marked on a specific lane; that is, navigation and road data are displayed in units of individual lanes on the road surface, and the vehicle is guided to travel within a specific lane. The lane marked with navigation information serves as the planned driving lane for the vehicle to advance in, and the other lanes can be defined as standby driving areas. In this embodiment, when there is only one lane, that lane is taken as the planned driving lane; when there are two lanes, the right lane is taken as the planned driving lane; and when there are multiple lanes, the middle lane is taken as the planned driving lane. Meanwhile, based on high-precision positioning, the position of the vehicle is located, and the area the vehicle occupies relative to the lane is marked based on the vehicle's external dimensions. Marking the planned driving lane and the vehicle's occupied area on the high-precision map allows the different lane spaces on the road, and even the whole road surface, to be fully utilized while complying with the relevant traffic regulations, so that the vehicle can still plan a safe forward route while ensuring the safety of obstacles, especially pedestrians.
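The lane-selection rule described above (one lane: use it; two lanes: the right lane; multiple lanes: the middle lane) can be sketched as follows. The 0-based, left-to-right lane numbering is an illustrative convention, not part of the original text.

```python
def choose_planned_lane(num_lanes: int) -> int:
    """Return the index of the planned driving lane.
    Lanes are numbered 0..num_lanes-1 from left to right (illustrative
    convention): one lane -> that lane; two lanes -> the right lane;
    three or more -> the middle lane."""
    if num_lanes < 1:
        raise ValueError("at least one lane is required")
    if num_lanes == 1:
        return 0
    if num_lanes == 2:
        return 1            # right lane
    return num_lanes // 2   # middle lane
```

For example, on a three-lane road the planned driving lane is index 1, the center lane.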
And S104, sensing the obstacle information in all lanes in the preset range in front of the running vehicle.
As is well known, a navigation route runs from a starting point to a destination, but the road environment changes in real time, especially where pedestrians are moving. The vehicle's surroundings therefore need to be sensed in real time and the positions and actions of obstacles analyzed, providing data support for confirming a safe route.
In the embodiment of the application, the following method is adopted for sensing various obstacles:
firstly, collecting front video information of all lanes in a preset range, side video information of the side of a vehicle and rear video information of the rear of the vehicle, integrating the front video information, the rear video information and the rear video information, and converting the integrated information into a two-dimensional image taking the vehicle as a center.
As mentioned above, the navigation route runs from the starting point to the destination, but the road environment changes in real time, especially where pedestrians are moving. The road sections that the vehicle needs to sense in real time and apply a control strategy to are therefore those based on its current position, including the sections within a preset distance in all lanes in front of, beside, and behind the vehicle. In this embodiment, the length of the road section ahead of the vehicle is tied to the vehicle speed, because the higher the speed, the longer the distance travelled in a given time, and the further ahead analysis and anticipation must reach. Specifically, this embodiment uses the distance travelled in 10 seconds at the vehicle's current speed as the distance range of the road section ahead, with a floor of 50 meters. For example, at a current speed of 36 km/h, i.e. 10 m/s, the 10-second travel distance is 100 m, which exceeds the 50 m floor, so 100 m is taken as the distance range for obstacle sensing ahead of the vehicle. At a faster speed, say 72 km/h, i.e. 20 m/s, the 10-second travel distance is 200 m, so 200 m is used. If, however, the current speed is below 18 km/h, i.e. below 5 m/s, the distance travelled in 10 seconds does not exceed 50 m, and 50 m is used as the distance range for obstacle sensing ahead. Taking the forward distance in this way reserves sufficient safety distance and route-analysis time for the vehicle.
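The forward sensing range just described (10 seconds of travel at the current speed, floored at 50 m) can be computed directly:

```python
def front_sensing_range_m(speed_kmh: float) -> float:
    """Distance range of the road section ahead used for obstacle sensing:
    the distance covered in 10 s at the current speed, with a 50 m floor."""
    speed_mps = speed_kmh / 3.6
    return max(50.0, 10.0 * speed_mps)
```

This reproduces the examples in the text: 36 km/h gives 100 m, 72 km/h gives 200 m, and any speed below 18 km/h gives the 50 m floor.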
To the side of the vehicle, an obstacle could be struck if the vehicle changes lanes, so obstacles must be sensed across all lanes on both sides of the vehicle (with a minimum of 10 m on each side).
Behind the vehicle, there may be other fast-moving vehicles, and a lane change could cause a collision with a fast vehicle approaching from behind; the sensing range behind the vehicle is therefore the same as that in front, and is not repeated here.
In this embodiment, video acquisition equipment such as high-definition cameras collects video of the front, sides and rear of the vehicle, and mature 360-degree panorama technology stitches it into 360-degree panoramic video data centered on the vehicle; video frames are then captured at intervals of 0.1 s to generate a continuous sequence of 360-degree panoramic two-dimensional images centered on the vehicle.
Then, existing visual analysis techniques are used to identify the obstacle imagery in each two-dimensional image.
In this embodiment, obstacles are divided into fixed obstacles and moving obstacles; accordingly, the obstacle information includes fixed obstacle information and moving obstacle information. Fixed obstacle information comprises position, shape and spatial size; moving obstacle information additionally comprises the movement trend of the moving obstacle, i.e. its movement direction and speed. Moving obstacles mainly mean moving objects (vehicles) and pedestrians; this embodiment is described taking a pedestrian crossing the road as an example. Obstacle recognition uses existing mature visual analysis and recognition technology, which is not limited here. Further, an optical flow analysis method based on mature theory is used to identify the position of the same obstacle or pedestrian across consecutive images, and to distinguish fixed obstacles, moving obstacles, and the movement trends of the moving obstacles.
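A minimal sketch of the fixed/moving distinction: given each obstacle's positions across consecutive 0.1 s panoramic frames (as an optical-flow tracker would supply), displacement over time separates fixed from moving obstacles and yields the movement trend. The stillness threshold, the metric ground-plane coordinates, and the heading convention are illustrative assumptions, not part of the original method.

```python
import math

def classify_obstacles(tracks, frame_dt=0.1, still_thresh=0.2):
    """Classify tracked obstacles as fixed or moving and estimate the
    movement trend (speed in m/s, heading in degrees via atan2(dy, dx)).
    `tracks` maps obstacle id -> list of (x, y) ground-plane positions in
    metres, one per frame captured every `frame_dt` seconds.
    Returns id -> ("fixed", 0.0, None) or ("moving", speed, heading)."""
    result = {}
    for oid, pts in tracks.items():
        (x0, y0), (x1, y1) = pts[0], pts[-1]
        dt = frame_dt * (len(pts) - 1)
        dx, dy = x1 - x0, y1 - y0
        speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
        if speed < still_thresh:          # below threshold: treat as fixed
            result[oid] = ("fixed", 0.0, None)
        else:
            heading = math.degrees(math.atan2(dy, dx))
            result[oid] = ("moving", speed, heading)
    return result
```

A stationary obstacle yields near-zero displacement across frames; a pedestrian crossing the road yields a steady speed and heading, which becomes the movement trend used in the steps that follow.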
Then, in combination with a radar detection device, such as the lidar widely fitted on current vehicles, the detection range of the lidar is mapped into the two-dimensional image, and the position of each obstacle detected by the radar is mapped into that range, yielding the obstacle's position data, including its angle relative to the vehicle's forward direction and its distance from the vehicle. The position, spatial shape, occupied range of the road surface, and movement trend of an obstacle together constitute the obstacle information.
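Mapping a radar detection (angle relative to the vehicle's forward direction, distance from the vehicle) into the vehicle-centred two-dimensional image reduces to a polar-to-pixel conversion. The pixel scale, image size, and "forward is up" orientation below are illustrative assumptions.

```python
import math

def radar_to_image(angle_deg, range_m, px_per_m=10, img_size=800):
    """Convert a radar detection given as (angle from the vehicle's forward
    direction in degrees, distance in metres) to (col, row) pixel
    coordinates of the vehicle-centred image. The vehicle sits at the image
    centre; forward is up (decreasing row index); positive angles are to
    the right."""
    cx = cy = img_size // 2
    x = range_m * math.sin(math.radians(angle_deg))   # lateral offset, m
    y = range_m * math.cos(math.radians(angle_deg))   # forward offset, m
    return (cx + round(x * px_per_m), cy - round(y * px_per_m))
```

An obstacle dead ahead at 10 m lands directly above the image centre; one at 90 degrees (to the right) at 5 m lands level with the centre on its right.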
Finally, the information of all obstacles within the sensing range is marked on the high-precision map.
And S106, obtaining a driving reference line of the vehicle based on the vehicle passing data and the obstacle information.
The driving reference line is the reference line of the vehicle's target track as it advances, embodying the vehicle's driving direction and trajectory. The reference line depends not only on obstacle positions but also on the vehicle passing data, i.e. the road space the vehicle occupies while travelling and the travel functions and performance the vehicle can achieve, including the body shape, external dimensions, road surface area occupied while travelling, and performance characteristics such as steering: if the vehicle cannot steer at a certain angle, a driving reference line requiring that angle is unrealistic.
As shown in fig. 2, a road situation is simulated: the road is a two-way, two-lane design with a road marking in the middle, a solid line the vehicle must not cross; the vehicle is in the north lane, driving north. A pedestrian is sensed crossing the road at point A, and analysis of the vehicle's speed, the distance involved, and the pedestrian's movement trend (direction and speed) shows that the vehicle would make contact with the pedestrian at point B after 7 s; in addition, a fixed obstacle is sensed at point C, which will adversely affect traffic.
Based on the simulated road condition, the method for obtaining the driving reference line in the embodiment of the application comprises the following steps:
firstly, a driving safety critical point is obtained based on the movement trend of the moving barrier, and meanwhile, the fixed barrier information is also marked as the driving safety critical point.
In this embodiment, a driving safety critical point is a position point that the vehicle must avoid: the position of a fixed obstacle, or the position where a moving obstacle would make contact with the vehicle. In fig. 2, the fixed obstacle at point C adversely affects driving and is marked as a driving safety critical point; according to the sensing and analysis above, when the pedestrian moves from point A to point B, the vehicle arrives at point B at the same time, so point B is also marked as a driving safety critical point. The vehicle must bypass the two driving safety critical points C and B to avoid collision contact with the fixed obstacle and the pedestrian; at the same time, the vehicle must not cross the solid line in the road, or the relevant traffic regulations would be violated. Points A, B and C are marked on the figure, together with the pedestrian's line of motion from A to B.
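The contact analysis above, where the pedestrian and the vehicle both reach point B (after 7 s in the example), can be sketched as a simple comparison of arrival times under constant speeds; the 1 s tolerance is an illustrative assumption, and the distances and speeds in the usage example are hypothetical values chosen to reproduce the 7 s figure.

```python
def contact_time(dist_vehicle_to_b_m, vehicle_speed_mps,
                 dist_pedestrian_to_b_m, pedestrian_speed_mps,
                 tol_s=1.0):
    """Estimate whether the vehicle and a crossing pedestrian would reach
    point B at roughly the same time, assuming constant speeds.
    Returns the contact time in seconds if the two arrival times differ by
    no more than `tol_s`, else None (no critical point at B)."""
    t_vehicle = dist_vehicle_to_b_m / vehicle_speed_mps
    t_pedestrian = dist_pedestrian_to_b_m / pedestrian_speed_mps
    return t_vehicle if abs(t_vehicle - t_pedestrian) <= tol_s else None
```

For instance, a vehicle 70 m from B at 10 m/s and a pedestrian 8.4 m from B at 1.2 m/s both arrive after 7 s, so B is marked as a driving safety critical point; if the pedestrian were only 3 m from B, the arrivals would not coincide and no critical point would be set at B.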
Then, a drivable area is constructed on the basis of avoiding the driving safety critical points. As shown by the area inside the broken line in fig. 2, avoiding points B and C and not crossing the middle solid line leaves all remaining areas drivable; the drivable area is clearly very wide.
Finally, within the drivable area, a strip-shaped driving reference line is planned in combination with the vehicle passing data, particularly the area the vehicle occupies in the lane and its turning performance. To avoid the fixed obstacle at point C, the reference line first shifts to the left; it then shifts to the right to avoid point B. The vehicle's route thus shifts left, bypasses the fixed obstacle at C, then shifts right and passes behind the pedestrian.
S108: based on the driving reference line, the vehicle's intelligent control system controls the vehicle to travel along the driving reference line.
In the embodiment of the present application, the case where the drivable area cannot satisfy the requirements of the vehicle passing data is further considered: for example, the road surface at some position is too narrow to meet the vehicle width requirement, or a sharp turn is required whose turning angle the vehicle cannot achieve. In that case the driving reference line is terminated, and the vehicle stops traveling at the end point of the driving reference line.
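A minimal feasibility check of this kind might look as follows; the segment representation and parameter names are assumptions for illustration.

```python
# Hypothetical feasibility check for terminating the reference line: the
# corridor must be at least as wide as the vehicle, and every bend must stay
# within the vehicle's maximum achievable turning angle.

def reference_line_end(segments, vehicle_width, max_turn_deg):
    """`segments` is a list of (corridor_width_m, turn_angle_deg) along the
    route; return the index of the first infeasible segment, or None."""
    for i, (width, turn) in enumerate(segments):
        if width < vehicle_width or abs(turn) > max_turn_deg:
            return i  # the vehicle must stop before this segment
    return None

# A 1.9 m wide corridor is too narrow for a 2.0 m wide vehicle, so the
# reference line ends before the third segment:
stop_at = reference_line_end([(3.5, 0), (3.0, 10), (1.9, 0)], 2.0, 35)
```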
Further, the vehicle monitors the driver's driving operations in real time, including steering-wheel input and accelerator input. If the vehicle remains within the drivable area, meaning it will not collide with the fixed obstacle at point C, the vehicle may respond to the driver's operation; but if the vehicle approaches the pedestrian's line of motion, the intelligent control system cuts off the driver's input and brings the vehicle to a stop, for example at a distance of 2 m from the pedestrian's line of motion.
If, while responding to the driver's operation, the vehicle reaches the edge of the drivable area, for example comes within 30 cm of the edge, the intelligent control system cuts off the driver's input and either stops the vehicle or corrects its traveling direction so that it returns to the drivable area or to the driving reference line.
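The override logic in these two paragraphs can be condensed into one decision function. The 2 m and 30 cm thresholds come from the examples above; everything else (names, return values) is an assumed sketch, not the patent's implementation.

```python
# Sketch of the supervision logic: decide, for the current driver input,
# whether to pass it through, cut it off, or ignore it entirely.
# Thresholds 2.0 m and 0.3 m are taken from the examples in the text.

def supervise(dist_to_ped_line_m, dist_to_edge_m, area_exists=True):
    """Return the control decision for the current driver input."""
    if not area_exists:
        return "ignore_driver"          # no drivable area could be constructed
    if dist_to_ped_line_m <= 2.0:
        return "cut_off_and_stop"       # about to cross the pedestrian's path
    if dist_to_edge_m <= 0.3:
        return "cut_off_and_correct"    # at the edge of the drivable area
    return "respond_to_driver"

decision = supervise(dist_to_ped_line_m=1.5, dist_to_edge_m=1.0)
```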
If there is not enough space for the vehicle to pass, for example because there are many obstacles or many pedestrians in the area constructed from the driving safety critical points, it is determined that no drivable area can be constructed and no driving reference line can be generated; in this case the vehicle does not respond to the driver's driving operations (for example, acceleration).
In addition, the speed of the vehicle inside the drivable area is limited accordingly: it should be neither too fast nor too slow. The traveling speed can be calculated from parameters such as the width, length and turning angles of the drivable area, combined with the vehicle passing data.
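One plausible form of such a speed bound is sketched below. The patent does not give a formula, so the scaling factors, base limit, and floor are assumptions; the point is only that narrow corridors and sharp turns reduce the limit while a minimum speed is preserved.

```python
# Illustrative speed bound (formula assumed, not from the patent): a base
# limit scaled down for narrow corridors and sharp turns, with a floor so
# the vehicle is not too slow either.

def travel_speed(area_width_m, vehicle_width_m, turn_deg,
                 v_max=50.0, v_min=10.0):
    """Return a speed in km/h between v_min and v_max."""
    # fraction of the corridor left free beside the vehicle
    width_factor = max(0.0, (area_width_m - vehicle_width_m) / area_width_m)
    # straight road -> 1.0, a 90-degree bend -> 0.0
    turn_factor = max(0.0, 1.0 - abs(turn_deg) / 90.0)
    v = v_max * width_factor * turn_factor
    return max(v_min, min(v_max, v))

# Wide, straight corridor -> near the base limit; narrow sharp bend -> floor:
print(travel_speed(8.0, 2.0, 0.0))   # 37.5
print(travel_speed(2.5, 2.0, 60.0))  # 10.0
```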
The above-described embodiments merely illustrate preferred implementations of the present application and do not limit its scope; various modifications and improvements made by those skilled in the art to the technical solutions of the present application, without departing from its design spirit, shall fall within the protection scope defined by the claims of the present application.

Claims (8)

1. An intelligent driving control method based on environment perception is characterized by comprising the following steps:
based on a high-precision map and high-precision positioning, marking navigation information of a vehicle on the high-precision map by taking a lane as a unit, marking the lane marked with the navigation information as a planned driving lane, and marking a positioned vehicle occupation area on the high-precision map;
sensing obstacle information in all lanes in a preset range in front of the running vehicle, and marking the obstacle information on the high-precision map;
obtaining a driving reference line of the vehicle based on vehicle passing data and the obstacle information;
and controlling the vehicle to travel along the driving reference line based on the driving reference line.
2. The intelligent driving control method based on environmental perception according to claim 1, wherein the high-precision map displays road data in units of lanes;
and the high-precision positioning data is the occupied area of the vehicle relative to the lane.
3. The intelligent driving control method based on environmental perception according to claim 2, wherein the method of obtaining the obstacle information comprises:
collecting front video information of all lanes in the preset range, side video information of the sides of the vehicle, and rear video information behind the vehicle, integrating the front video information, the side video information and the rear video information, and converting the integrated information into a two-dimensional image with the vehicle as the center;
identifying imagery of an obstacle in the two-dimensional image using visual analysis techniques;
mapping a radar detection range to the two-dimensional image, and obtaining position data of the obstacle through a radar detection technology;
and obtaining the obstacle information based on the image and the position data of the obstacle.
4. The intelligent driving control method based on environmental perception according to claim 3, wherein the obstacles comprise fixed obstacles and moving obstacles;
the obstacle information comprises fixed obstacle information and moving obstacle information, and the moving obstacle information comprises the movement trend of the moving obstacle;
identifying the fixed obstacles, the moving obstacles, and the movement trend of each moving obstacle in the two-dimensional image using an optical flow analysis method.
5. The intelligent driving control method based on environmental perception according to claim 4, wherein the method for obtaining the driving reference line comprises:
obtaining driving safety critical points based on the movement trend of the moving obstacle, and marking the fixed obstacle information as driving safety critical points;
obtaining a drivable area based on the driving safety critical point;
and obtaining the driving datum line based on the drivable area and the vehicle passing data.
6. The intelligent driving control method based on environmental perception according to claim 5, wherein if the drivable area cannot meet the requirements of the vehicle passing data, the driving reference line is terminated, and the vehicle stops traveling at the end point of the driving reference line.
7. The intelligent driving control method based on environmental perception according to claim 5, wherein the driving operation of a vehicle driver on the vehicle is monitored in real time;
responding to the driving operation of a vehicle driver if the vehicle runs in the drivable area;
if the vehicle reaches the edge of the drivable area, stopping the vehicle from driving or correcting the driving direction of the vehicle;
and if the drivable area cannot be obtained according to the driving safety critical point, the vehicle does not respond to the driving operation of the vehicle driver on the vehicle.
8. The intelligent driving control method based on environmental perception according to claim 5, wherein the traveling speed of the vehicle is obtained according to the drivable area and the vehicle passing data.
CN202210215963.8A 2022-03-07 2022-03-07 Intelligent driving control method based on environment perception Pending CN114435404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210215963.8A CN114435404A (en) 2022-03-07 2022-03-07 Intelligent driving control method based on environment perception


Publications (1)

Publication Number Publication Date
CN114435404A true CN114435404A (en) 2022-05-06

Family

ID=81358850


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006178673A (en) * 2004-12-21 2006-07-06 Nissan Motor Co Ltd Traveling support device
CN102792349A (en) * 2010-03-16 2012-11-21 丰田自动车株式会社 Driving assistance device
US20150344030A1 (en) * 2014-05-30 2015-12-03 Honda Research Institute Europe Gmbh Method and vehicle with an advanced driver assistance system for risk-based traffic scene analysis
CN106004879A (en) * 2015-03-27 2016-10-12 株式会社万都 Driving assistant system of vehicle and method for controlling same
CN108227695A (en) * 2016-12-14 2018-06-29 现代自动车株式会社 Automatic Pilot control device, the system and method including the device
CN110672111A (en) * 2019-09-24 2020-01-10 广州大学 Vehicle driving path planning method, device, system, medium and equipment
CN110936953A (en) * 2018-09-21 2020-03-31 大众汽车有限公司 Method and device for providing an image of the surroundings and motor vehicle having such a device
US20200174475A1 (en) * 2018-11-29 2020-06-04 Electronics And Telecommunications Research Institute Autonomous driving method and system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination