CN109709965B - Control method for automatic driving vehicle and automatic driving system - Google Patents

Control method for automatic driving vehicle and automatic driving system

Info

Publication number
CN109709965B
CN109709965B (application CN201910007648.4A)
Authority
CN
China
Prior art keywords
decision
vehicle
driving
autonomous
automatic driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910007648.4A
Other languages
Chinese (zh)
Other versions
CN109709965A (en)
Inventor
林伟
冯威
张宇
石磊
刘晓彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uisee Technologies Beijing Co Ltd
Original Assignee
Uisee Technologies Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uisee Technologies Beijing Co Ltd filed Critical Uisee Technologies Beijing Co Ltd
Publication of CN109709965A
Application granted
Publication of CN109709965B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • H04W4/38: Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H04W4/40: Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/44: Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • H04W4/46: Services specially adapted for vehicles, for vehicle-to-vehicle communication [V2V]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The application provides a control method for an autonomous vehicle and an automatic driving system. The method comprises: acquiring real-time driving data of the autonomous vehicle through the vehicle-mounted automatic driving system of the autonomous vehicle; generating, by the vehicle-mounted automatic driving system, a first decision based on the real-time driving data; sending, by the vehicle-mounted automatic driving system, the real-time driving data to a remote data processing system; receiving, by the vehicle-mounted automatic driving system, a second decision from the remote data processing system, the second decision being generated by the remote data processing system based on the real-time driving data; checking the second decision against the first decision; and issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the result of the comparison. The method and system improve the safety of the autonomous vehicle; they can be applied in a 4G network environment but are better suited to a 5G network environment.

Description

Control method of automatic driving vehicle and automatic driving system
Technical Field
The invention relates to the technical field of automatic driving, and in particular to a control method for an automatic driving vehicle and to an automatic driving system.
Background
With the development of science and technology, intelligent vehicles have become an important direction for the automobiles of the future. Autonomous vehicles can not only improve the convenience and experience of travel but also greatly improve its efficiency. However, the safety of autonomous vehicles remains one of the major issues that must be addressed. Among the factors affecting safety, the decision-making and control of the autonomous vehicle are among the most critical: they directly determine how safely and reasonably the vehicle behaves. Improving the sensitivity and accuracy of decision-making and control is therefore a key task in improving autonomous vehicles.
The existing automatic driving system of an autonomous vehicle is limited in its data storage and processing capacity. It usually adopts a relatively simple algorithmic model to process the driving information and environmental information of the vehicle, so the precision, breadth, and depth of its computation are limited to a certain extent. This affects the accuracy of the decision instructions issued by the automatic driving system and, in turn, the safety of the autonomous vehicle.
Therefore, it is necessary to provide a control method of an autonomous vehicle and an autonomous driving system to solve the above technical problems.
Disclosure of Invention
The application discloses a control method for an autonomous vehicle and an automatic driving system, which improve the accuracy of the vehicle's decision instructions and thereby its driving safety.
One aspect of the present application provides a control method for an autonomous vehicle, comprising: acquiring real-time driving data of the autonomous vehicle through the vehicle-mounted automatic driving system of the autonomous vehicle; generating, by the vehicle-mounted automatic driving system, a first decision based on the real-time driving data; sending, by the vehicle-mounted automatic driving system, the real-time driving data to a remote data processing system; receiving, by the vehicle-mounted automatic driving system, a second decision from the remote data processing system, the second decision being generated by the remote data processing system based on the real-time driving data; checking the second decision against the first decision; and issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the result of the comparison.
The first decision is obtained by the vehicle-mounted automatic driving system from the real-time driving data using a first decision model.
The second decision is obtained by the remote data processing system from the real-time driving data using a second decision model.
The remote data processing system is a cloud server, and the communication means is 5G communication.
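The end-to-end flow described above (acquire real-time driving data, run the on-board and remote decision models, compare, issue an instruction) can be sketched as follows. This is a minimal illustration rather than the patented implementation: both decision models are stand-ins reduced to a single target speed, and all names and the threshold value are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    speed: float      # target speed in m/s
    steering: float   # steering angle in degrees

def first_decision(driving_data: dict) -> Decision:
    # Stand-in for the on-board (first) decision model.
    return Decision(speed=driving_data["speed_limit"] * 0.8, steering=0.0)

def second_decision(driving_data: dict) -> Decision:
    # Stand-in for the remote data processing system's (second) decision model.
    return Decision(speed=driving_data["speed_limit"] * 0.82, steering=0.0)

def control_step(driving_data: dict, threshold: float = 1.0) -> Decision:
    """One cycle: generate both decisions, compare them, issue an instruction."""
    first = first_decision(driving_data)
    second = second_decision(driving_data)
    if abs(first.speed - second.speed) < threshold:
        return first           # decisions agree: issue the first decision
    return Decision(0.0, 0.0)  # decisions diverge: fall back to stopping
```

In practice the comparison would cover a multi-dimensional decision (trajectory, speed, steering), but the structure of the cycle is the same.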
Issuing a decision instruction to the autonomous vehicle according to the comparison result may comprise: when the difference between the first decision and the second decision is smaller than a preset threshold, issuing an instruction according to the first decision, the second decision, or a third decision derived from the first and second decisions.
Issuing a decision instruction according to the comparison result may alternatively comprise: when the difference between the first decision and the second decision is larger than the preset threshold, the control module brings the autonomous vehicle to an immediate stop, or drives it out of the current driving environment as quickly as possible and stops after reaching a safe environment.
Issuing a decision instruction according to the comparison result may also comprise: when the difference between the first decision and the second decision is larger than the preset threshold, acquiring the first decision and the second decision again.
Issuing a decision instruction according to the comparison result may further comprise: when, after re-acquisition, the difference between the first decision and the second decision is still larger than the preset threshold, the control module brings the autonomous vehicle to an immediate stop, or drives it out of the current driving environment as quickly as possible and stops after reaching a safe environment.
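The comparison outcomes above (agreement, divergence followed by re-acquisition, persistent divergence) can be sketched as one arbitration routine. This is an illustrative simplification in which each decision is reduced to a single scalar; the function names and return strings are hypothetical.

```python
def arbitrate(first: float, second: float, threshold: float, reacquire=None) -> str:
    """Decide what the on-board system does after comparing two decisions.

    `reacquire`, if given, is a callable returning a fresh (first, second)
    pair, modeling one re-acquisition of both decisions after a divergence.
    """
    if abs(first - second) < threshold:
        return "issue_decision"    # agreement: issue first, second, or a blend
    if reacquire is not None:
        first, second = reacquire()
        if abs(first - second) < threshold:
            return "issue_decision"
    # Still divergent: stop immediately, or leave the driving environment
    # and stop after reaching a safe environment.
    return "safe_stop"
```

For example, `arbitrate(1.0, 5.0, 0.5)` yields the safety fallback, while supplying a `reacquire` callback that returns agreeing decisions allows a normal instruction to be issued.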
Wherein the vehicle-mounted automatic driving system can also receive perception results from the remote data processing system.
In another aspect, the present application provides an automatic driving system comprising: a memory including at least one set of instructions for implementing a driving strategy of an autonomous vehicle; and a processor that, in operation, reads the at least one set of instructions from the memory and, according to the instructions: obtains real-time driving data of the autonomous vehicle; generates first decision information based on the real-time driving data; sends the real-time driving data to a remote data processing system; receives a second decision from the remote data processing system, the second decision being generated by the remote data processing system based on the real-time driving data; checks the second decision against the first decision; and issues a decision instruction to the autonomous vehicle according to the result of the comparison.
The first decision is obtained by the vehicle-mounted automatic driving system from the real-time driving data using a first decision model.
The second decision is obtained by the remote data processing system from the real-time driving data using a second decision model.
The remote data processing system is a cloud server, and the communication means is 5G communication.
Issuing a decision instruction to the autonomous vehicle according to the comparison result may comprise: when the difference between the first decision and the second decision is smaller than a preset threshold, issuing an instruction according to the first decision, the second decision, or a third decision derived from the first and second decisions.
Issuing a decision instruction according to the comparison result may alternatively comprise: when the difference between the first decision and the second decision is larger than the preset threshold, the control module brings the autonomous vehicle to an immediate stop, or drives it out of the current driving environment as quickly as possible and stops after reaching a safe environment.
Issuing a decision instruction according to the comparison result may also comprise: when the difference between the first decision and the second decision is larger than the preset threshold, acquiring the first decision and the second decision again.
Issuing a decision instruction according to the comparison result may further comprise: when, after re-acquisition, the difference between the first decision and the second decision is still larger than the preset threshold, the control module brings the autonomous vehicle to an immediate stop, or drives it out of the current driving environment as quickly as possible and stops after reaching a safe environment.
Wherein the vehicle-mounted automatic driving system can also receive perception results from the remote data processing system.
In another aspect of the present application, there is also provided an autonomous vehicle configured with the autonomous system described herein.
In summary, the present application provides a control method for an autonomous vehicle and an automatic driving system that optimize existing control systems and methods for autonomous vehicles, improve the accuracy of the decision instructions they issue, and improve the driving safety of the autonomous vehicle.
The control method of the autonomous vehicle and the automatic driving system place high demands on network latency and data transmission speed. For example, the techniques disclosed in this application may be applied in a 4G network environment but are better suited to a 5G network environment. 4G offers a data rate of about 100 Mbps, a latency of 30-50 ms, a maximum of about 10,000 connections per square kilometer, and support for mobility up to about 350 km/h; 5G offers a transmission rate of about 10 Gbps, a latency of about 1 ms, on the order of a million connections per square kilometer, and mobility up to about 500 km/h. 5G thus provides higher transmission rates, shorter latency, more connections per square kilometer, and higher speed tolerance. 5G also changes the transmission path: previously, signals for calls or photo transfers were relayed through a base station, whereas with 5G, signals can be transmitted directly between devices without passing through a base station. Therefore, although the application is also suitable for a 4G environment, operation in a 5G environment yields better technical performance and represents higher commercial value.
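Using the nominal figures above, a rough round-trip estimate illustrates why 5G suits remote decision-making better. The payload size and the cloud compute time in this sketch are assumptions for illustration, not figures from the application.

```python
def remote_round_trip_ms(payload_mb: float, rate_mbps: float,
                         one_way_latency_ms: float, compute_ms: float = 5.0) -> float:
    """Uplink latency + transfer time + assumed cloud compute + downlink latency."""
    transfer_ms = payload_mb * 8.0 / rate_mbps * 1000.0  # MB -> megabits -> ms
    return one_way_latency_ms + transfer_ms + compute_ms + one_way_latency_ms

# Nominal figures from the text: 4G ~100 Mbps / 30-50 ms; 5G ~10 Gbps / 1 ms.
t_4g = remote_round_trip_ms(1.0, 100, 40)      # roughly 165 ms per remote decision
t_5g = remote_round_trip_ms(1.0, 10_000, 1)    # roughly 7.8 ms per remote decision
```

Under these assumptions, the 4G round trip alone consumes much of a typical control-cycle budget, while the 5G round trip leaves ample headroom for the check-and-compare step.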
Additional features of the present application will be set forth in part in the description which follows, and the descriptions of the figures and examples below will make them apparent to those of ordinary skill in the art. The inventive aspects of the present application can be fully understood through the practice or use of the methods, instrumentalities, and combinations set forth in the detailed examples discussed below.
Drawings
The following drawings describe in detail exemplary embodiments disclosed in the present application. Wherein like reference numerals represent similar structures throughout the several views of the drawings. Those of ordinary skill in the art will understand that the present embodiments are non-limiting, exemplary embodiments and that the accompanying drawings are for illustrative and descriptive purposes only and are not intended to limit the scope of the present disclosure, as other embodiments may equally fulfill the inventive intent of the present application. It should be understood that the drawings are not to scale. Wherein:
fig. 1 is an embodiment of a wireless communication system for mobile device network management in the present application.
FIG. 2 is a block diagram of an exemplary vehicle with autopilot capability according to some embodiments of the present application.
FIG. 3 is a schematic view of a scenario of an embodiment of the present application based on an autonomous vehicle control method and an autonomous driving system.
FIG. 4 is a block diagram of an exemplary vehicle and autonomous driving system with autonomous driving capability according to some embodiments of the present application.
FIG. 5 is a schematic diagram of exemplary hardware and software components of an information processing unit in the present application.
FIG. 6 is a process flow diagram of a control method of an autonomous vehicle of the present application.
Fig. 7 is a block diagram of a control method of an autonomous vehicle and a remote data processing system in an autonomous system according to the present application.
Detailed Description
The application discloses a control method for an autonomous vehicle and an automatic driving system. The real-time driving data acquired by the vehicle's automatic driving system is sent to a remote data processing system, whose stronger information processing capability is used to form a second decision. The second decision is checked against the first decision to form a better-optimized decision instruction. This improves the accuracy of the decision instructions issued by existing automatic driving systems and methods and improves the driving safety of the autonomous vehicle.
In the following detailed description, specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure to those of ordinary skill in the art. However, the disclosure should be understood to be consistent with the scope of the claims and not limited to the specific inventive details. For example, various modifications to the embodiments disclosed herein will be readily apparent to those skilled in the art, and those skilled in the art may apply the general principles defined herein to other embodiments and applications without departing from the spirit and scope of the present application. For another example, it will be apparent to one skilled in the art that the present application may be practiced without these specific details. In other instances, well-known methods, procedures, systems, components, and/or circuits have been described in general terms, but not in detail, so as not to unnecessarily obscure aspects of the present application. Thus, the disclosure is not limited to the embodiments shown, but is to be accorded the scope consistent with the claims.
The terminology used in the description presented herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, the singular forms "a", "an", and "the" may include plural referents unless the context clearly dictates otherwise. The terms "including" and/or "comprising" as used in this application are open-ended: stating that A includes B merely indicates the presence of B in A, and does not exclude the possibility that other elements (such as C) are present in or added to A.
It is to be understood that terms such as "system," "unit," "module," and/or "block" used herein are a means for distinguishing between different components, elements, components, parts, or assemblies at different levels. However, other terms may be used in the present application instead of the above terms if they can achieve the same purpose.
The modules (or units, blocks) described in this application may be implemented as software and/or hardware modules. Unless the context clearly indicates otherwise, when a unit or module is described as being "on," "connected to," or "coupled to" another unit or module, the expression may mean that the unit or module is directly on, linked, or coupled to the other unit or module, or that the unit or module is indirectly on, connected, or coupled to the other unit or module in some way. In this application, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In this application, the term "autonomous vehicle" may refer to a vehicle that is capable of sensing its environment and automatically sensing, determining, and making decisions about the external environment without human (e.g., driver, pilot, etc.) input and/or intervention. The terms "autonomous vehicle" and "vehicle" may be used interchangeably. The term "autopilot" may refer to the ability to intelligently judge and navigate the surrounding environment without human (e.g., driver, pilot, etc.) input.
These and other features of the present application, as well as the operation and function of the related elements of structure and the combination of parts and economies of manufacture, may be significantly improved upon consideration of the following description. All of which form a part of this application, with reference to the accompanying drawings. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the application. It should be understood that the drawings are not to scale.
The flow charts used in this application illustrate the operation of system implementations according to some embodiments of the present application. It should be clearly understood that the operations of the flow diagrams may be performed out of order. Rather, the operations may be performed in reverse order or simultaneously. In addition, one or more other operations may be added to the flowchart. One or more operations may be removed from the flowchart.
The positioning techniques used in the present application may be based on the Global Positioning System (GPS), the global navigation satellite system (GLONASS), the COMPASS navigation system (COMPASS), the galileo positioning system, the quasi-zenith satellite system (QZSS), wireless fidelity (WiFi) positioning techniques, etc., or any combination thereof. One or more of the above-described positioning systems may be used interchangeably in this application.
Further, while the systems and methods herein have been described primarily in terms of a control method for an autonomous vehicle and an automatic driving system, it should be understood that this is merely an exemplary embodiment. The system or method of the present application may be applied to any other type of navigation system. For example, the systems or methods of the present application may be applied to transportation systems in different environments, including terrestrial, marine, aerospace, and the like, or any combination thereof. The vehicles of the transportation system may include taxis, private cars, trailers, buses, trains, bullet trains, high-speed rail, subways, ships, airplanes, spacecraft, hot-air balloons, driverless vehicles, and the like, or any combination thereof. In some embodiments, the system or method may find application in, for example, logistics warehouses or military settings.
Fig. 1 is an embodiment of a wireless communication system 100 for mobile device network management. The mobile device network management system may be applied as a support network in the invention described in this disclosure.
The wireless communication system 100 includes remote units 142, 144, 146, base stations 110, and wireless communication links 115, 148. A particular number of remote units 142, 144, 146, base stations 110, and wireless communication links 115, 148 are depicted in fig. 1, but one skilled in the art will recognize that any number of remote units 142, 144, 146, base stations 110, and wireless communication links 115, 148 may be included in the wireless communication system 100.
In some embodiments, the remote units 142, 144, 146 may be mobile devices such as on-board computers 142, 144 (including on-board computers for manned vehicles and/or autonomous vehicles with autonomous driving capabilities) and other mobile devices 146 such as cell phones, laptops, personal digital assistants ("PDAs"), tablet computers, smart watches, exercise bands, optical head-mounted displays, and the like. The remote units 142, 144, 146 may also include non-mobile computing devices such as desktop computers, smart televisions (e.g., television sets connected to the internet), set-top boxes, game consoles, security systems (including security cameras), fixed network devices (e.g., routers, switches, modems), and so forth. Moreover, the mobile remote units 142, 144, 146 may be referred to as mobile stations, mobile devices, users, terminals, mobile terminals, fixed terminals, subscriber stations, UEs, user terminals, devices, or other terminology used in the art.
The wireless links between the remote units 142, 144, 146 are denoted 148. These links may carry 5G communication as well as other forms of wireless interaction, such as Bluetooth or Wi-Fi. The base stations 110 form a radio access network (RAN) 120; the wireless links between the base stations 110 are denoted 115. The RAN 120 may be communicatively coupled to a mobile core network 130. The mobile core network 130 may be a 5G network, or may be a 4G, 3G, 2G, or other type of network. In this disclosure, the invention is illustrated by way of example with a 5G network.
The 5G mobile core network 130 may belong to a single public land mobile network (PLMN). The mobile core network 130 may provide services with low-latency and high-reliability requirements, such as applications in the field of autonomous driving. It may also serve other application requirements: for example, high-data-rate, medium-latency traffic for mobile devices such as handsets, or low-mobility, low-data-rate services.
The base station 110 can serve a plurality of remote units 142, 144, 146 within a service area, e.g., a cell or cell sector, via wireless communication links. The base station 110 may communicate directly with one or more remote units 142, 144, 146 via communication signals. Remote units 142, 144, 146 may communicate directly with one or more base stations 110 via uplink ("UL") communication signals, which may be carried over the wireless communication links 115, 148. The base station 110 may also transmit downlink (DL) communication signals to serve the remote units 142, 144, 146 in the time, frequency, and/or space domain; the DL communication signals may be carried over a wireless communication link 115. The wireless communication link 115 may be any suitable carrier in the licensed or unlicensed radio spectrum and can communicate with one or more remote units 142, 144, 146 and/or one or more base stations 110. In some embodiments, the wireless communication system 100 conforms to the Long-Term Evolution (LTE) of the 3GPP protocol, in which the base station 110 transmits on the DL using an orthogonal frequency-division multiplexing (OFDM) modulation scheme and the remote units 142, 144, 146 transmit on the UL using a single-carrier frequency-division multiple access (SC-FDMA) scheme. More generally, however, the wireless communication system 100 may implement other open or proprietary communication protocols, such as WiMAX. The present disclosure is not intended to be limited to any particular wireless communication system architecture or protocol.
The base station 110 and the remote units 142, 144, 146 may be distributed over a geographic area. In some embodiments, the base stations 110 and remote units 142, 144, 146 may also be referred to as access points, access terminals, or any other terminology used in the art. Typically, two or more geographically adjacent base stations 110 or remote units 142, 144, 146 are combined together into a routing area. In some embodiments, a routing area may also be referred to as a location area, a paging area, a tracking area, or any other terminology used in the art. Each "routing area" has an identifier transmitted from its serving base station 110 to the remote units 142, 144, 146 (or transmitted between the remote units 142, 144, 146).
When the mobile remote unit 142, 144, 146 moves to a new cell broadcasting a different "routing area" (e.g., moves within range of a new base station 110), the mobile remote unit 142, 144, 146 detects a change in routing area. The RAN 120, in turn, pages the mobile remote units 142, 144, 146 in idle mode through the base stations 110 in its current routing area. The RAN 120 contains multiple routing areas. As is known in the art, the size of the routing area (e.g., the number of base stations included in the routing area) may be selected to balance the routing area update signaling load with the paging signaling load.
In some embodiments, the remote units 142, 144, 146 may be attached to the core network 130. When the remote unit 142, 144, 146 detects a mobile device network management event (e.g., a change in routing area), the remote unit 142, 144, 146 may send a mobile device network management request message to the core network 130 (e.g., a service requiring low latency and high reliability for autonomous driving or a service requiring high data rate and medium latency for cell phones). Thereafter, the core network 130 forwards the mobile device network management request to one or more secondary network slices connected to the remote units 142, 144, 146 to provide the corresponding service.
At some point, the remote units 142, 144, 146 may no longer require some network service (e.g., a service requiring low latency and high reliability for autonomous driving or a service requiring high data rate and medium latency traffic for a cell phone). In this case, the remote units 142, 144, 146 may send detach request messages, such as data connection release messages, to detach from the network.
FIG. 2 is a block diagram of an exemplary vehicle with autopilot capability according to some embodiments of the present disclosure. The vehicle 200 with autopilot capability may be one of the vehicles 142, 144 in the wireless communication system 100 for mobile device network management shown in fig. 1. For example, the vehicle 200 with autopilot capability may include a control module, a plurality of sensors, a memory, an instruction module, a controller area network (CAN) bus, and actuators.
The actuators may include, but are not limited to, the throttle, the engine, the braking system, and the steering system (including steering of the tires and/or operation of the turn signals).
The plurality of sensors may include various internal and external sensors that provide data to the vehicle 200. As shown in FIG. 2, the plurality of sensors may include vehicle component sensors and environmental sensors. The vehicle component sensors are coupled to the actuators of the vehicle 200 and can sense the operating conditions and parameters of the actuators' various components.
The environmental sensors allow the vehicle to understand and potentially respond to its environment, assisting in navigation and path planning and protecting the passengers as well as people and property in the surrounding environment of the autonomous vehicle 200. The environmental sensors may also be used to identify, track, and predict the movement of objects, such as pedestrians and other vehicles. The environmental sensors may include position sensors and external object sensors.
The position sensors may include a GPS receiver, an accelerometer, and/or a gyroscope. The position sensors may sense and/or determine the geographic location and orientation of the autonomous vehicle 200, for example, its latitude, longitude, and altitude.
The external object sensors may detect objects external to the vehicle, such as other vehicles, obstacles in the road, traffic signals, signs, trees, etc. The external object sensors may include laser sensors, radar, cameras, sonar, and/or other detection devices.
The laser sensor can measure the distance between the vehicle and the surface of an object facing the vehicle by rotating on its axis and varying its pitch. Laser sensors may also be used to identify changes in surface texture or reflectivity. Thus, a laser sensor may be configured to detect lane lines by distinguishing the amount of light reflected by painted lane lines from that reflected by the unpainted dark road surface.
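The reflectivity-based lane-line detection just described can be sketched as a simple intensity filter over lidar returns. This is an illustrative sketch only: the (x, y, intensity) point format and the threshold value are assumptions, not part of this disclosure.

```python
# Illustrative sketch of reflectivity-based lane-line detection.
# The (x, y, intensity) point format and the 0.6 cutoff are
# hypothetical; real lidar drivers expose richer return data.
LANE_PAINT_INTENSITY = 0.6  # assumed normalized reflectivity cutoff

def extract_lane_points(points):
    """Keep only returns bright enough to be painted lane markings."""
    return [(x, y) for (x, y, intensity) in points
            if intensity >= LANE_PAINT_INTENSITY]

scan = [(1.0, 0.2, 0.9),   # bright return: painted line
        (1.1, 0.3, 0.1),   # dark asphalt
        (1.2, 0.4, 0.7)]   # bright return: painted line
print(extract_lane_points(scan))  # [(1.0, 0.2), (1.2, 0.4)]
```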
The radar sensors may be located at the front and rear of the car and on either side of the front bumper. In addition to determining the relative position of external objects, other types of radar may serve other purposes: for example, a conventional speed-detection radar may be used, and a short-wave radar may determine the depth of snow on a road as well as the location and condition of the road surface.
The camera may capture visual images of the surroundings of the vehicle 200, from which content can be extracted. For example, the camera may capture the street signs on both sides of the road, and the control module may recognize their meaning, such as determining the speed limit of the road. The vehicle 200 can also calculate the distance between a surrounding object and the vehicle 200 from the parallax between images taken by multiple cameras.
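The multi-camera distance calculation mentioned above follows the standard pinhole stereo model: depth equals focal length times camera baseline divided by the pixel disparity between the two images. The sketch below assumes a calibrated, rectified camera pair; the numeric values are illustrative.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: depth = focal_length * baseline / disparity.
    focal_px: focal length in pixels; baseline_m: camera separation in
    meters; disparity_px: horizontal pixel shift of the same object
    between the two rectified images."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px

# Assumed calibration: 700 px focal length, 0.5 m baseline.
# An object with a 35 px disparity is 700 * 0.5 / 35 = 10 m away.
print(depth_from_disparity(700, 0.5, 35))  # 10.0
```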
The sonar may detect the distance between the vehicle 200 and surrounding obstacles. For example, the sonar may be an ultrasonic range finder. Ultrasonic range finders may be installed on both sides and at the rear of the vehicle and, when parking, are turned on to detect obstacles around a parking space and the distances between the vehicle 200 and those obstacles.
Upon receiving the information sensed by the plurality of sensors, the control module may process information and/or data related to vehicle driving (e.g., autonomous driving) to perform one or more of the functions described in this disclosure. In some embodiments, the control module may be configured to drive the vehicle autonomously. For example, the control module may output a plurality of control signals. The plurality of control signals may be configured to be received by one or more Electronic Control Units (ECUs) to control driving of the vehicle. In some embodiments, the control module may determine a reference path and one or more candidate paths based on environmental information of the vehicle.
In some embodiments, the control module may include one or more central processors (e.g., single-core or multi-core processors). By way of example only, the control module may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction-Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The memory may store data and/or instructions. In some embodiments, the memory may store data obtained from the sensors of the autonomous vehicle. In some embodiments, the memory may store data and/or instructions that the control module may execute or use to perform the example methods described in this disclosure. In some embodiments, the memory may include mass storage, removable storage, volatile read-and-write memory, read-only memory (ROM), or the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid-state drives, and the like; removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, or a magnetic tape; volatile read-and-write memory may include random access memory (RAM); RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), and zero-capacitor RAM (Z-RAM); ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), and digital versatile disc ROM. In some embodiments, the memory may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the memory may be a local memory, i.e., part of the autonomous vehicle 200. In some embodiments, the memory may also be a remote memory. The central processor may connect to the remote memory via the network 100 to communicate with one or more components (e.g., the control module, the sensor modules) of the autonomous vehicle 200. One or more components of the autonomous vehicle 200 may access data or instructions stored in the remote memory via the network 100. In some embodiments, the memory may be directly connected to or in communication with one or more components of the autonomous vehicle 200 (e.g., the control module, the sensors).
The instruction module receives the information transmitted by the control module, converts it into instructions for driving the actuators, and transmits those instructions onto a Controller Area Network (CAN) bus. For example, the control module sends a driving strategy (acceleration, deceleration, turning, etc.) of the autonomous vehicle 200 to the instruction module, which converts it into driving commands for the actuators (commands for the accelerator, brake, and steering) and issues them to the actuators over the CAN bus. The vehicle component sensors then detect the actuators' execution of the commands and feed the results back to the control module, completing the closed-loop control and driving of the autonomous vehicle 200.
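The instruction module's conversion of a driving strategy into a CAN-bus command might look like the following sketch. The 8-byte payload layout (throttle %, brake %, steering angle in tenths of a degree) is invented for illustration; a production system would use the vehicle's actual CAN message definitions and a real bus driver.

```python
import struct

def strategy_to_can_payload(throttle_pct, brake_pct, steer_deg):
    """Pack a driving command into a classic 8-byte CAN data field.
    Assumed layout: 1 byte throttle %, 1 byte brake %, 2-byte signed
    steering angle in 0.1-degree units, 4 padding bytes."""
    steer_tenths = int(round(steer_deg * 10))
    return struct.pack(">BBh4x", int(throttle_pct), int(brake_pct), steer_tenths)

payload = strategy_to_can_payload(30, 0, -12.5)  # gentle left turn, 30% throttle
print(len(payload))  # 8 -- the classic CAN frame's maximum data length
```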
FIG. 3 is a schematic view of a scenario of one embodiment of the autonomous-vehicle-based control system and method of the present application. As shown in FIG. 3, an autonomous vehicle 200 (hereinafter simply referred to as a "vehicle") can travel on a road 321 along its autonomously planned path 320 without human input of the path. When traveling on the road 321, the autonomous vehicle 200 must not violate the traffic rules of the road 321; for example, it must not exceed the maximum speed limit of the road 321 and must not run a red light when traveling through a traffic-light intersection.
The autonomous vehicle 200 may include some conventional structure of a non-autonomous vehicle, such as an engine, wheels, steering wheel, etc., and may also include a sensing module 340, a control module 350, and a decision-making module 360.
A traffic light 310, a stop line 311, a zebra crossing 312, and a sign 313 are provided at an intersection of the road 321. The autonomous vehicle 200 can recognize and acquire information about the intersection, including the state of the traffic light 310 (e.g., its color and countdown time), the distances to the intersection stop line 311 and the zebra crossing 312, the contents of the sign 313, and the like. The sign 313 is a graphic symbol displaying traffic regulations and road information, including, but not limited to, warning signs, prohibition signs, direction signs, tourist-area signs, road-construction safety signs, speed-limit signs (e.g., a maximum speed limit), and the like. While driving toward the intersection, the autonomous vehicle 200 may determine its driving speed based on the state of the traffic light 310. For example, it may determine whether it can pass the intersection stop line 311 based on the color and countdown time of the traffic light 310, the distance to the stop line 311, the current real-time speed, and other parameters, and then generate and execute a corresponding driving strategy based on the result: when the traffic light 310 is green and the countdown time is long enough, the autonomous vehicle 200 accelerates to pass the stop line 311; when the traffic light 310 is red and the countdown time is long, the autonomous vehicle 200 decelerates and stops before the stop line 311.
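The pass-or-stop judgment at the stop line reduces to a simple kinematic check: at its current speed, can the vehicle cover the remaining distance before the countdown (less a safety margin) expires? The constant-speed model and the margin value in this sketch are simplifying assumptions, not taken from the disclosure.

```python
def can_pass_stop_line(distance_m, speed_mps, countdown_s, margin_s=1.0):
    """True if the vehicle, holding its current speed, reaches the stop
    line before the remaining green time (minus a safety margin) runs out.
    Constant speed and the 1 s margin are illustrative assumptions."""
    if speed_mps <= 0:
        return False
    return distance_m / speed_mps <= countdown_s - margin_s

# 40 m from the stop line at 10 m/s with 6 s of green left: 4 s travel time -> pass
print(can_pass_stop_line(40, 10, 6))  # True
# Same position with only 3 s of green left: brake and stop before the line
print(can_pass_stop_line(40, 10, 3))  # False
```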
One embodiment of the present application provides a control method of an autonomous vehicle, as shown with reference to fig. 6, including:
step S101: acquiring real-time driving data of an autonomous vehicle through a vehicle-mounted autonomous driving system of the autonomous vehicle;
step S102: generating, by the vehicle-mounted automatic driving system, a first decision based on the real-time driving data;
step S103: the vehicle-mounted automatic driving system sends the real-time driving data to a remote data processing system;
step S104: receiving, by the vehicle-mounted automatic driving system, a second decision from the remote data processing system, the second decision being generated by the remote data processing system based on the real-time driving data;
step S105: checking and comparing, by the vehicle-mounted automatic driving system, the second decision with the first decision;
step S106: issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the result of the checking and comparing.
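Steps S101 to S106 above could be sketched as the following check-and-compare routine. The arbitration rule here (issue the shared decision when they agree, fall back to a conservative action on disagreement, and trust the onboard decision if the remote link drops) is an illustrative assumption; the disclosure leaves the exact policy open.

```python
def issue_decision(first_decision, second_decision, fallback="decelerate_and_stop"):
    """Steps S105-S106: compare the onboard (first) decision with the
    remote (second) decision and pick the instruction actually issued.
    The fallback action and the None-means-link-lost convention are
    assumptions made for this sketch."""
    if second_decision is None:            # remote system unreachable
        return first_decision
    if first_decision == second_decision:  # decisions agree
        return first_decision
    return fallback                        # disagreement: be conservative

print(issue_decision("accelerate", "accelerate"))  # accelerate
print(issue_decision("accelerate", "stop"))        # decelerate_and_stop
```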
Possible embodiments of the autonomous vehicle and the autonomous driving system described above are provided below with reference to FIGS. 2 to 4.
An autonomous vehicle according to an embodiment of the present application is, for example, an autonomous vehicle 200 illustrated in fig. 2 and 3. The onboard devices of the autonomous vehicle include all the electronic and mechanical devices that the autonomous vehicle is equipped with, and can acquire all the data and information detected, sensed or generated by the autonomous vehicle 200. In some embodiments of the present application, the onboard equipment of the autonomous vehicle 200 includes an autonomous system 400 of the autonomous vehicle 200.
Fig. 4 is a block diagram of an exemplary vehicle with autonomous driving capability and its autonomous driving system 400 according to some embodiments of the present application. As shown in FIG. 4, the autonomous driving system 400 may include a sensing module 340, a control module 350, a decision-making module 360, a memory 420, a network 430, a gateway module 440, a Controller Area Network (CAN) 450, an Engine Management System (EMS) 460, an Electronic Stability Control (ESC) 470, an Electric Power System (EPS) 480, a Steering Column Module (SCM) 490, a throttle system 465, a braking system 475, a steering system 495, and the like.
The sensing module 340 may collect driving data and environmental information of the vehicle, including but not limited to: the real-time speed of the vehicle, the distance between the vehicle and a target, the vehicle's travel route, the traffic conditions along that route, the color and countdown time of the traffic light, the maximum speed limit at the intersection, information about other vehicles or pedestrians in front of and behind the vehicle, visual information from both sides of the road, positioning information of the vehicle, and the like. In some embodiments, the sensing module 340 may include a vision sensor 342, a distance sensor 344, a speed sensor 346, an acceleration sensor 348, and a positioning unit 349. The vision sensor 342 may detect the state of the traffic light 310 (including its color and countdown time), lane lines, the sign 313, other vehicles, etc., and transmit the detected visual information to the decision-making module 360. In some embodiments, the vision sensor 342 may be a binocular camera, a LIDAR system, or the like, all of which are known to those skilled in the art. The distance sensor 344 may measure the distance between the autonomous vehicle 200 and a particular object in the environment (e.g., the intersection stop line 311 or other vehicles around the autonomous vehicle 200) and communicate its measurement to the decision-making module 360. In some embodiments, the distance sensor 344 may measure the distance between the autonomous vehicle 200 and a target based on the positioning information of the autonomous vehicle and the position of the target on a map. In some embodiments, the distance sensor 344 is a lidar or millimeter-wave radar that models the surroundings of the autonomous vehicle 200 in three dimensions.
The speed sensor 346 may measure the real-time travel speed of the autonomous vehicle 200 and communicate its measurement to the decision-making module 360. The acceleration sensor 348 may measure the real-time acceleration of the autonomous vehicle 200 and communicate the measurement to the decision-making module 360. The positioning unit 349 can locate the autonomous vehicle 200 in real time and transmit the positioning information to the decision-making module 360. In some embodiments, the positioning unit 349 is a high-precision GPS positioning unit.
The decision-making module 360 may receive the driving information and environmental information, such as traffic signal information, obstacle information, surrounding vehicle information, and pedestrian information, and generate determination information and corresponding driving decision information from them. In some embodiments, the determination information includes, but is not limited to: when the traffic light 310 is green, whether the autonomous vehicle 200 can pass the intersection stop line 311 within the time corresponding to the traffic-light countdown; when the traffic light 310 is red or yellow, whether the autonomous vehicle 200 can pass the intersection stop line 311 within the corresponding countdown time; and, when there is an obstacle, a pedestrian, or another vehicle in the vehicle's path, whether the autonomous vehicle 200 should decelerate, detour, or stop. In some embodiments, the decision information includes, but is not limited to: issuing to the autonomous vehicle 200 a driving instruction to maintain the current real-time speed, accelerate, decelerate, or stop. In some embodiments, the acceleration instructions include, but are not limited to, uniform acceleration or variable acceleration. In some embodiments, the deceleration instructions include, but are not limited to, uniform deceleration or variable deceleration.
The control module 350 may process information and/or data related to vehicle driving (e.g., autonomous driving) to perform one or more of the functions described herein. In some embodiments, the control module 350 may receive the decision information and control the autonomous vehicle 200 to execute the corresponding driving instructions. In some embodiments, the control module 350 may be configured to drive the vehicle autonomously. For example, the control module 350 may output a plurality of control signals. The plurality of control signals may be configured to be received by a plurality of Electronic Control Units (ECUs) to control driving of the vehicle. In some embodiments, the control module 350 may determine the travel speed of the vehicle based on environmental information (e.g., the state of the traffic light 310). In some embodiments, the control module 350 may include one or more processing engines (e.g., single-core or multi-core processing engines). By way of example only, the control module 350 may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction-Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The memory 420 may store data and/or instructions. In some embodiments, the memory 420 may store data obtained from the autonomous vehicle 200 (e.g., data measured by the various sensors in the sensing module 340). In some embodiments, the memory 420 may store a high-precision map, which also includes information such as the number of lanes, lane width, road curvature, road grade, maximum speed, and recommended travel speed. In some embodiments, the memory 420 may store data and/or instructions that the control module 350 may execute or use to perform the example methods described herein. In some embodiments, the memory 420 may include mass storage, removable storage, volatile read-and-write memory, read-only memory (ROM), or the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid-state drives, and the like; removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, or a magnetic tape; volatile read-and-write memory may include random access memory (RAM); RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), and zero-capacitor RAM (Z-RAM); ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), and digital versatile disc ROM.
In some embodiments, the memory 420 may be connected to the network 430 to communicate with one or more components of the autonomous vehicle 200 (e.g., control module 350, vision sensors 342). One or more components in the autonomous vehicle 200 may access data or instructions stored in the memory 420 via the network 430. In some embodiments, the memory 420 may be directly connected to or in communication with one or more components in the autonomous vehicle 200 (e.g., control module 350, vision sensors 342). In some embodiments, the memory 420 may be part of the autonomous vehicle 200.
The network 430 may facilitate the exchange of information and/or data. In some embodiments, one or more components of the autonomous vehicle 200 (e.g., the control module 350, the vision sensor 342) may send information and/or data to other components of the autonomous vehicle 200 via the network 430. For example, the control module 350 may obtain the dynamic state of the vehicle and/or environmental information around the vehicle via the network 430. In some embodiments, the network 430 may be any type of wired or wireless network, or a combination thereof. By way of example only, the network 430 may include a wired network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 430 may include one or more network access points. For example, the network 430 may include wired or wireless network access points, such as base stations and/or internet exchange points 430-1, …. One or more components of the autonomous vehicle 200 may connect to the network 430 through these access points to exchange data and/or information.
The gateway module 440 may determine command sources for a plurality of ECUs (e.g., EMS460, EPS 480, ESC 470, SCM 490) based on a current driving state of the vehicle. The command source may be from a human driver, from the control module 350, etc., or any combination thereof.
The gateway module 440 may determine the current driving state of the vehicle. The driving state of the vehicle may include a manual driving state, an autonomous driving state, a semi-autonomous driving state, an error state, or the like, or any combination thereof. For example, the gateway module 440 may determine the current driving state of the vehicle to be the manual driving state based on input from a human driver. As another example, when the current road conditions are complicated, the gateway module 440 may determine the current driving state of the vehicle to be the semi-autonomous driving state. As yet another example, the gateway module 440 may determine the current driving state of the vehicle to be the error state when an anomaly (e.g., a signal interruption, a processor crash) occurs.
In some embodiments, in response to determining that the current driving state of the vehicle is the manual driving state, the gateway module 440 may send the human driver's operations to the plurality of ECUs. For example, upon determining that the current driving state is the manual driving state, the gateway module 440 may transmit the human driver's pressing of the accelerator of the autonomous vehicle 200 to the EMS 460. Upon determining that the current driving state is the autonomous driving state, the gateway module 440 may transmit control signals of the control module 350 to the plurality of ECUs; for example, it may transmit control signals associated with steering operations to the SCM 490. Upon determining that the current driving state is the semi-autonomous driving state, the gateway module 440 may transmit both the human driver's operations and the control signals of the control module 350 to the plurality of ECUs. Upon determining that the current driving state is the error state, the gateway module 440 may transmit an error signal to the plurality of ECUs.
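The gateway module's routing of command sources by driving state can be summarized as a small dispatch table. The state and source names below are illustrative labels; only the mapping mirrors the behavior described above.

```python
def select_command_sources(driving_state):
    """Map the current driving state to the ECU command source(s),
    mirroring the gateway module 440's behavior described above.
    State and source names are assumptions made for this sketch."""
    routing = {
        "manual": ["human_driver"],
        "autonomous": ["control_module"],
        "semi_autonomous": ["human_driver", "control_module"],
        "error": ["error_signal"],
    }
    return routing[driving_state]

print(select_command_sources("semi_autonomous"))  # ['human_driver', 'control_module']
```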
The Controller Area Network (CAN) 450 is a robust vehicle bus standard (e.g., a message-based protocol) that allows microcontrollers (e.g., the control module 350) and devices (e.g., the EMS 460, EPS 480, ESC 470, SCM 490, etc.) to communicate with each other without a host computer. The CAN 450 may be configured to connect the control module 350 with the plurality of ECUs (e.g., EMS 460, EPS 480, ESC 470, SCM 490).
The EMS 460 may determine engine performance of the autonomous vehicle 200. In some embodiments, the EMS 460 may determine engine performance of the autonomous vehicle 200 based on control signals from the control module 350. For example, when the current driving state is the autonomous driving state, the EMS 460 may determine engine performance based on acceleration-related control signals from the control module 350. In some embodiments, the EMS 460 may determine engine performance based on the operations of a human driver. For example, when the current driving state is the manual driving state, the EMS 460 may determine engine performance based on the human driver's depression of the accelerator.
The EMS 460 may include a plurality of sensors and at least one microprocessor. The plurality of sensors may be configured to detect one or more physical signals and convert them into electrical signals for processing. In some embodiments, the plurality of sensors may include various temperature sensors, air-flow sensors, throttle position sensors, pump pressure sensors, speed sensors, oxygen sensors, load sensors, knock sensors, etc., or any combination thereof. The one or more physical signals may include, but are not limited to, engine temperature, engine intake air quantity, cooling-water temperature, engine speed, or the like, or any combination thereof. The microprocessor may determine a plurality of engine control parameters from the electrical signals, choosing them to optimize engine performance. The plurality of engine control parameters may include ignition timing, fuel delivery, idle airflow, etc., or any combination thereof.
The throttle system 465 may alter the motion of the autonomous vehicle 200. For example, the throttle system 465 may determine the speed of the autonomous vehicle 200 based on engine output. As another example, the throttle system 465 may cause acceleration of the autonomous vehicle 200 based on engine output. The throttle system 465 may include fuel injectors, fuel pressure regulators, auxiliary air valves, temperature switches, throttles, idle speed motors, fault indicators, ignition coils, relays, and the like, or any combination thereof. In some embodiments, the throttling system 465 may be an external actuator of the EMS 460. The throttle system 465 may be configured to control engine output based on a plurality of engine control parameters determined by the EMS 460.
The ESC 470 may improve vehicle stability by detecting and reducing loss of traction. In some embodiments, in response to detecting a loss of steering control, the ESC 470 may control the operation of the braking system 475 to help steer the vehicle. For example, the ESC 470 may improve the stability of the braking system 475: when the vehicle starts on an uphill slope, braking prevents it from rolling backwards and facilitates a smooth hill start. In some embodiments, the ESC 470 may further control engine performance to improve vehicle stability. For example, the ESC 470 may reduce engine power in the event of a potential loss of steering control. Scenarios in which loss of steering control may occur include skidding during an emergency avoidance turn, and understeering or oversteering, e.g., on a wet road surface with poor traction.
The braking system 475 may control the motion state of the autonomous vehicle 200. For example, the braking system 475 may slow the autonomous vehicle 200. As another example, the braking system 475 may stop the autonomous vehicle 200 from traveling forward under one or more road conditions (e.g., a downhill). As yet another example, the braking system 475 may keep the autonomous vehicle 200 at a constant speed while traveling downhill. The braking system 475 may include mechanical control components, a hydraulic unit, a power unit (e.g., a vacuum pump), an actuation unit, etc., or any combination thereof. The mechanical control components may include pedals, a hand brake, and the like. The hydraulic unit may include hydraulic oil, hydraulic hoses, brake pumps, etc. The actuation unit may include calipers, brake pads, brake discs, and the like.
The EPS 480 may control the supply of electric power to the autonomous vehicle 200. The EPS 480 may supply, transmit, and/or store electric power for the autonomous vehicle 200. For example, the EPS 480 may include one or more batteries and an alternator; the alternator may charge the batteries, and the batteries may be connected to other parts of the autonomous vehicle 200 (e.g., a starter, to provide power). In some embodiments, the EPS 480 may control the supply of electric power to the steering system 495. For example, when the autonomous vehicle 200 determines that a sharp turn is required (e.g., the steering wheel turned all the way to the left or all the way to the right), the EPS 480 may, in response, provide high power to the steering system 495 to generate a large steering torque.
The SCM 490 may control a steering wheel of a vehicle. The SCM 490 may lock/unlock a steering wheel of a vehicle. The SCM 490 may lock/unlock a steering wheel of a vehicle based on a current driving state of the vehicle. For example, the SCM 490 may lock the steering wheel of the vehicle in response to determining that the current driving state is an autonomous driving state. In response to determining that the current driving state is an autonomous driving state, the SCM 490 may further retract the steering column shaft. As another example, the SCM 490 may unlock a steering wheel of the vehicle in response to determining that the current driving state is a semi-autonomous driving state, a manual driving state, and/or an error state. The SCM 490 may control steering of the autonomous vehicle 200 based on control signals of the control module 350. The control signals may include information related to a turning direction, a turning position, a turning angle, etc., or any combination thereof.
The steering system 495 may steer the autonomous vehicle 200. In some embodiments, the steering system 495 may steer the autonomous vehicle 200 based on signals sent from the SCM 490. For example, the steering system 495 may direct the autonomous vehicle 200 based on control signals of the control module 350 transmitted from the SCM 490 in response to determining that the current driving state is an autonomous driving state. In some embodiments, the steering system 495 may steer the autonomous vehicle 200 based on the operation of a human driver. For example, the steering system 495 may steer the autonomous vehicle 200 to the left when a human driver steers the steering wheel to the left in response to determining that the current driving state is the manual driving state.
Fig. 5 is a schematic diagram of exemplary hardware and software components of the information processing unit 500. The information processing unit 500 may host the control module 350, the EMS 460, the ESC 470, the EPS 480, and the SCM 490. For example, the control module 350 may be implemented on the information processing unit 500 to perform the functions of the control module 350 disclosed herein.
The information processing unit 500 may be a dedicated computer device specifically designed to process signals from sensors and/or components of the autonomous vehicle 200 and to send instructions to sensors and/or components of the vehicle 200.
For example, the information processing unit 500 may include a COM port 550 connected to a network to facilitate data communication. The information processing unit 500 may also include a processor 520 in the form of one or more processors for executing computer instructions. The computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions that perform the particular functions described herein. For example, the processor 520 may obtain one or more path sample features associated with a plurality of candidate paths. The one or more sample features associated with a candidate path may include a path start location, a path destination, a path speed of a vehicle associated with the candidate path, a path acceleration of the vehicle, a path instantaneous curvature of the candidate path, or the like, or any combination thereof.
In some embodiments, the processor 520 may include one or more hardware processors, such as microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASICs), application specific instruction set processors (ASIPs), Central Processing Units (CPUs), Graphics Processing Units (GPUs), Physical Processing Units (PPUs), microcontroller units, Digital Signal Processors (DSPs), Field Programmable Gate Arrays (FPGAs), Advanced RISC Machines (ARMs), Programmable Logic Devices (PLDs), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
The information processing unit 500 may include an internal communication bus 510, program storage, and various forms of data storage (e.g., a disk 570, a Read-Only Memory (ROM) 530, or a Random Access Memory (RAM) 540) for the various data files to be processed and/or transmitted by the computer. The information processing unit 500 may also include program instructions stored in the ROM 530, the RAM 540, and/or other types of non-transitory storage media to be executed by the processor 520. The methods and/or processes of the present application may be implemented as program instructions. The information processing unit 500 also includes I/O components 560 supporting input/output between the computer and other components (e.g., user interface elements). The information processing unit 500 may also receive programming and data through network communication.
For illustrative purposes only, only one processor is depicted in the information processing unit 500 described in the present application. It should be noted, however, that the information processing unit 500 may also include multiple processors; thus, the operations and/or method steps disclosed herein may be performed by one processor as described, or jointly by multiple processors. For example, if the processor 520 of the information processing unit 500 performs steps A and B in the present application, it should be understood that steps A and B may also be performed jointly or separately by two different processors in the information processing unit 500 (e.g., a first processor performs step A and a second processor performs step B, or the first and second processors perform steps A and B together).
Based on the above embodiments of the present application, step S101 is executed to obtain real-time driving data generated, sensed, or detected during driving of the autonomous vehicle 200, for example the driving data and environmental information of the autonomous vehicle, through an on-board device of the autonomous vehicle such as the sensing module 340 of the automatic driving system 400. The driving data and environmental information include, but are not limited to: the real-time speed of the autonomous vehicle, the distance between the vehicle and a target, the traveling route of the vehicle, traffic conditions along the traveling route, the color of traffic lights, the countdown time of traffic lights and the highest speed limit of the intersection, information on other vehicles or pedestrians in front of and behind the vehicle, visual information on both sides of the road, positioning information of the vehicle, and the like. In some embodiments of the present application, the driving data and environmental information may be obtained through the vision sensor 342, the distance sensor 344, the speed sensor 346, the acceleration sensor 348, the positioning unit 349, and the like of the sensing module 340.
In one embodiment of the present application, the real-time driving data may be stored in the memory 420 of the automatic driving system 400. The storage of the real-time driving data may be implemented by mass storage, removable storage, volatile read-write memory, read-only memory, or any combination thereof. In some embodiments of the present application, the real-time driving data may also be stored in a cloud; that is, the memory 420 is a cloud memory.
Further, the real-time driving data is sent to the decision module 360 of the automatic driving system 400. In an embodiment of the present application, the decision module 360 may process the received real-time driving data to convert it into a file format suitable for performing the determining step.
In an embodiment of the present application, the decision module 360 determines the driving state of the autonomous vehicle and the current environmental condition according to the received real-time driving data, and forms a first decision for driving the autonomous vehicle 200 according to the judgment result (step S102). For example, the first decision is to decrease or increase the driving speed of the vehicle, to control the autonomous vehicle to perform a lane change, to determine the accurate positioning of the autonomous vehicle, or to control the vehicle to stop driving or enter a nearby parking lot.
In some embodiments of the present application, the decision module 360 processes the real-time driving data by executing an algorithm model configured in the automatic driving system 400 of the autonomous vehicle 200 and forms the first decision. That is, the first decision is obtained by the vehicle-mounted automatic driving system through a first decision model using the real-time driving data. Since the real-time driving data may include various information, such as driving route information of the vehicle, driving speed information of the vehicle, driving information of other vehicles around the vehicle during driving, traffic light information, road condition information, and obstacle information on the driving route, the first decision model adopted by the automatic driving system differs for different information and data.
For example, if the data is the information of the traffic light 310 identified by the sensing module 340 when a traffic light is encountered at an intersection during driving of the autonomous vehicle 200, the first decision model executed by the decision module may be: judge whether the distance between the head of the autonomous vehicle 200 and the intersection stop line 311 is greater than the deceleration zone, and if so, directly form a first decision. In some embodiments, the deceleration zone is the glide distance required for the autonomous vehicle 200 to decelerate from its current real-time speed to zero according to a predetermined deceleration strategy. The first decision model may judge whether the distance between the head of the autonomous vehicle 200 and the intersection stop line 311 is greater than the deceleration zone by the following formula:

D > V² / (2a)

wherein D is the distance between the head of the autonomous vehicle and the intersection stop line 311, V is the specified gliding speed of the vehicle, and a is the magnitude of the deceleration in the parking stage.
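As an illustrative sketch (the function and variable names are my own, not from the patent), the distance check of the first decision model above can be expressed as follows, where the deceleration zone V²/(2a) is the glide distance needed to brake to a standstill:

```python
def deceleration_zone(v: float, a: float) -> float:
    """Glide distance needed to decelerate from speed v (m/s) to zero
    at a constant deceleration of magnitude a (m/s^2)."""
    return v ** 2 / (2 * a)

def first_decision(d: float, v: float, a: float) -> str:
    """Keep cruising while the distance d to the stop line exceeds the
    deceleration zone; otherwise start braking for the stop line."""
    return "cruise" if d > deceleration_zone(v, a) else "brake"

# Example: at 10 m/s with 2 m/s^2 braking, the zone is 25 m.
print(first_decision(30.0, 10.0, 2.0))  # cruise: 30 m > 25 m
print(first_decision(20.0, 10.0, 2.0))  # brake: 20 m <= 25 m
```

The threshold comparison, not the specific braking profile, is the decision; a non-uniform deceleration strategy would only change how `deceleration_zone` is computed.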
In one embodiment of the present application, after the first decision is formed, it is stored in the memory 420 rather than being sent directly to the control module 350.
Step S103 is executed: the vehicle-mounted automatic driving system 400 sends the real-time driving data to a remote data processing system 600, which processes the information and data to generate a second decision.
FIG. 7 is a block diagram illustrating a remote data processing system 600. The remote data processing system 600 includes at least: a data transmitting and receiving module 610; a second memory 620; a second decision making module 630 and a network 640.
The data transmitting and receiving module 610 is configured to receive real-time driving data transmitted from the autonomous driving system 400 of the autonomous driving vehicle 200, and to transmit processed real-time driving data and decision information back to the autonomous driving system 400.
The second memory 620 may store data and/or instructions. In some embodiments, the second memory 620 may store data transmitted from the autonomous vehicle. In some embodiments, the second memory 620 may store information and data processed by the remote data processing system, as well as the second decisions obtained after such processing, to perform the example methods described in this disclosure. In some embodiments, the storage function of the second memory 620 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, the second memory 620 is a remote memory and may include mass storage, removable storage, the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; for example, the removable memory may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, and a magnetic tape.
The second decision module 630 may process the information and data sent by the automatic driving system and form a second decision. The information and data may include real-time driving data of the autonomous vehicle, such as traffic signal information of the vehicle's surroundings, obstacle information during traveling, surrounding vehicle information, pedestrian information, acceleration information of the vehicle, positioning information of the vehicle, traveling route information of the vehicle, and the like.
In one embodiment of the present application, the second decision module 630 determines the driving state of the autonomous vehicle and the current environmental condition according to the received real-time driving data, and forms a second decision according to the judgment result. For example, the second decision is to decrease or increase the driving speed of the vehicle, to control the autonomous vehicle to perform a lane change, to determine the accurate positioning of the autonomous vehicle, or to control the vehicle to stop driving or enter a nearby parking lot.
In some embodiments of the present application, the second decision module 630 processes the same real-time driving data, obtained during driving of the autonomous vehicle, as that processed by the decision module 360 of the automatic driving system 400, so that the second decision it forms corresponds one-to-one with the first decision. For example, in actual practice, if the decision module 360 of the automatic driving system 400 makes a first decision based on the information of the traffic light 310 identified by the sensing module 340 when a traffic light is encountered at an intersection during driving of the autonomous vehicle 200, the second decision module 630 of the remote data processing system 600 also makes a second decision based on that same traffic light information.
In some other embodiments of the present application, the amount of data processed by the second decision module 630 is greater than the amount of real-time driving data processed by the decision module 360 of the automatic driving system 400 during driving of the autonomous vehicle. This is because the second decision module 630 may also store or obtain information from other data sources. For example, during driving, the map information acquired by the vehicle's automatic driving system is limited to a certain distance around the vehicle body, whereas the remote data processing system 600 may also acquire, from the cloud, traffic jam information, road condition information, and the like provided by other devices over a wider range.
In some embodiments of the present application, the second decision making module 630 processes the information and data by executing a second decision model configured in the remote data processing system 600 and forms the second decision. Since the information and data may include various information, such as the driving route information of the vehicle, the driving speed information of the vehicle, the driving information of other vehicles around the vehicle during the driving process, traffic light information, road condition information, and obstacle information on the driving route, the second decision model adopted by the remote data processing system 600 is different for different information and data.
In some embodiments of the present application, since the decision models used by the decision module 360 and the second decision module 630 differ, the first decision and the second decision may be the same or different. In some embodiments, the depth, breadth, and fineness of data processing by the second decision module 630 exceed those of the decision module 360, so the accuracy and fineness of the second decision may be greater than those of the first decision. Because the decision module 360 is disposed in the vehicle-mounted automatic driving system and is limited by its data storage and computing capability, the first decision model is relatively simple compared with the second decision model, its computing breadth, precision, and depth are limited, and the computing capability of the decision module 360 is smaller than that of the second decision module 630. Accordingly, in some embodiments, the accuracy of the second decision is greater than that of the first decision. For example, when planning a driving route of the autonomous vehicle, the second decision model can incorporate traffic jam information, road condition information, and the like over a longer range, so the accuracy and practicability of the second decision are higher than those of the first decision. Moreover, in some embodiments of the present application, the computational complexity, e.g., the number of computational layers, of the second decision model is much higher than that of the first decision model.
In some embodiments of the present application, if the data is the information of the traffic light 310 identified by the sensing module 340 when the traffic light is encountered at the intersection during the driving process of the autonomous vehicle 200, the second decision-making model executed by the second decision-making module is, for example:
judge whether, if the autonomous vehicle 200 accelerates from its current real-time speed to the highest speed limit according to a predetermined acceleration strategy, the time at which the autonomous vehicle 200 reaches the intersection stop line 311 is not greater than the remaining countdown time of the green light. In some embodiments, the predetermined acceleration strategy may include, but is not limited to: accelerating uniformly from the current real-time speed to the highest speed limit within a specified distance, or accelerating from the current real-time speed to the highest speed limit in a varying manner (for example, accelerating to the highest speed limit of the road within a specified distance according to a trigonometric function). The passage of the autonomous vehicle 200 through the intersection stop line may be calculated according to the vehicle head, the vehicle body, or the vehicle tail passing the intersection stop line 311. For example, taking the vehicle tail passing the stop line 311 as the criterion (i.e., the vehicle does not run the red light as long as its tail has passed the stop line by the time the green light turns red), the second decision model judges, under the uniform acceleration strategy, whether the time for the vehicle to clear the intersection exceeds the remaining countdown time of the green light:

2(D + Lv) / (V + Vmax) ≤ ta

wherein ta is the remaining countdown time of the green light, D is the distance between the vehicle head and the intersection stop line, Lv is the length of the vehicle body, V is the current real-time speed of the vehicle, and Vmax is the highest speed limit of the intersection.
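As a hedged sketch of this second decision model (function names are illustrative, not from the patent): under uniform acceleration from V to Vmax over the distance D + Lv, the average speed is (V + Vmax)/2, so the tail clears the stop line after 2(D + Lv)/(V + Vmax) seconds:

```python
def time_to_clear(d: float, lv: float, v: float, v_max: float) -> float:
    """Time (s) for the vehicle tail to pass the stop line, assuming uniform
    acceleration from v to v_max over the distance d + lv, i.e. an average
    speed of (v + v_max) / 2."""
    return 2 * (d + lv) / (v + v_max)

def second_decision(d: float, lv: float, v: float,
                    v_max: float, t_green: float) -> str:
    """Accelerate through the intersection only if the clearing time does
    not exceed the remaining green-light countdown t_green."""
    return "accelerate" if time_to_clear(d, lv, v, v_max) <= t_green else "decelerate"

# Example: 40 m to the stop line, 5 m body, 10 m/s now, 20 m/s limit
# -> clearing time is 2 * 45 / 30 = 3.0 s.
print(second_decision(40.0, 5.0, 10.0, 20.0, 5.0))  # accelerate: 3.0 s <= 5 s
print(second_decision(40.0, 5.0, 10.0, 20.0, 2.0))  # decelerate: 3.0 s > 2 s
```

A varying-acceleration strategy (e.g., the trigonometric profile mentioned above) would replace only `time_to_clear`; the threshold comparison against the countdown stays the same.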
The remote data processing system 600 further includes a network 640 for transmitting and exchanging the real-time driving data. In some embodiments of the present application, the network 640 may be the same network as the network 430 in the automatic driving control system or a different network; when different networks are used, it should be ensured that data can be transmitted and exchanged between them.
In some embodiments, the network 640 may be any type of wired or wireless network, or a combination thereof. By way of example only, the network 640 may include a wired network, a fiber optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, the like, or any combination thereof. In some embodiments, the network 640 may include one or more network access points.
In other embodiments of the present application, the remote data processing system 600 further includes a second sensing module configured to perform deep sensing on the driving data and environmental information collected by the sensing module 340 and to send the sensing result back to the vehicle-mounted automatic driving system. The driving data and environmental information include, but are not limited to: the real-time speed of the vehicle, the distance between the vehicle and a target, the traveling route of the vehicle, traffic conditions along the traveling route, the color of traffic lights, the countdown time of traffic lights and the highest speed limit of the intersection, information on other vehicles or pedestrians in front of and behind the vehicle, visual information on both sides of the road, positioning information of the vehicle, and the like. In some embodiments of the present application, the remote data processing system 600 perceives the data over a wider range, so the breadth and accuracy of its sensing results exceed those of the automatic driving system.
Step S104 is executed: the data transmitting and receiving module sends the second decision back to the automatic driving system 400, and the control module 350 of the automatic driving system receives the second decision and stores it in the memory 420.
Step S105 is executed: the decision module 360 of the automatic driving system performs a check comparison, which may also be referred to as a redundant comparison, between the second decision and the first decision. In communication engineering, redundancy refers to deliberately duplicating key components or functions for the sake of system safety and reliability. When a system fails, for example when a certain device is damaged, the redundantly configured components can serve as a backup, intervening in time to take over the work of the failed components and thereby reducing the system's downtime. Redundancy is particularly useful for emergency handling and may exist at different levels, such as network redundancy, server redundancy, disk redundancy, and data redundancy.
Step S106 is executed: according to the check comparison result, the vehicle-mounted automatic driving system issues a decision instruction to the autonomous vehicle. In some embodiments of the present application, if the first decision and the second decision are the same, the control module 350 of the automatic driving system receives the first decision or the second decision and issues a decision instruction to the autonomous vehicle. In some embodiments, the control module 350 may be configured to autonomously drive the vehicle. For example, the control module 350 may output a plurality of control signals, which may be configured to be received by a plurality of Electronic Control Units (ECUs) to control driving of the vehicle.
In some embodiments of the present application, the control module 350 autonomously issues execution commands to the gateway module 440, the Controller Area Network (CAN) 450, the Engine Management System (EMS) 460, the Electronic Stability Control (ESC) 470, the Electric Power Steering (EPS) 480, the Steering Column Module (SCM) 490, the throttle system 465, the brake system 475, the steering system 495, and the like, to control the autonomous vehicle to perform acceleration, deceleration, lane change, turning, and other operations.
In other embodiments of the present application, the first decision and the second decision differ only slightly (the difference between them is smaller than a preset threshold), for example in determining the coordinate position of the autonomous vehicle or its running speed and running route; in this case, either the first decision or the second decision may be selected directly. The selection scheme may differ for different types of decision information. For example, if the first and second decisions determine the vehicle speed and body coordinates of the autonomous vehicle, the value of the second decision is preferably selected, since the second decision processes the data with higher precision and over a wider range. Likewise, under traffic congestion, when planning the driving route of the autonomous vehicle, the second decision model acquires data over a wider range (from other data acquisition centers) and processes a larger amount of data, so the accuracy of the second decision is higher. The accuracy of the second decision may also be higher when planning a path to determine the vehicle's expected arrival time.
If the first decision and the second decision concern whether the vehicle should speed through a traffic-light intersection, the relatively safer of the two decision instructions is employed. In some embodiments, the average of the first decision and the second decision may also be chosen.
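As an illustrative sketch of the "choose the safer instruction" rule (the safety ordering and names below are my own assumptions, not specified in the text), decelerating at a traffic light can be ranked as safer than accelerating:

```python
# Lower rank = safer. The ordering is an illustrative assumption.
SAFETY_RANK = {"decelerate": 0, "accelerate": 1}

def safer_decision(first: str, second: str) -> str:
    """Return whichever of the two decisions has the lower (safer) rank."""
    return min(first, second, key=lambda d: SAFETY_RANK[d])

print(safer_decision("accelerate", "decelerate"))  # decelerate
print(safer_decision("accelerate", "accelerate"))  # accelerate
```

Averaging, by contrast, only makes sense for numeric decisions such as target speeds, not for discrete instructions like these.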
In the embodiments of the present application, the difference between the first decision and the second decision is judged against a preset threshold based on empirical values and theoretical data. The judgment method and standard differ for different types of decision information, and the judgment of the difference can be further optimized as empirical data accumulate and the technical scheme improves.
In some embodiments of the present application, the decision may also be a third decision obtained based on the first decision and the second decision. For example, the third decision is a function of the first decision and the second decision.
In other embodiments of the present application, if the difference between the first decision and the second decision is determined to be large (greater than a preset threshold), it is determined that the autonomous vehicle and the automatic driving system are in a failure state. The control module then issues a parking instruction, drives the autonomous vehicle off the road surface as soon as possible, for example into a parking lot or a roadside location where parking is allowed, and notifies the gateway module or issues an early warning message.
In other embodiments of the present application, if the difference between the first decision and the second decision is determined to be large (greater than a preset threshold), the control method for an autonomous vehicle provided in the embodiments of the present application is executed again, to rule out calculation errors that may arise in obtaining and comparing the first and second decisions. If, after the comparison is recalculated, the difference between the resulting first and second decisions is smaller than the preset threshold, the first or second decision is selected. If the difference is still greater than the preset threshold, the control module drives the autonomous vehicle to stop immediately or to leave the driving environment as soon as possible, stopping after entering a safe environment.
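A minimal sketch of the check-comparison flow of steps S105–S106 for numeric decisions (the retry count, threshold handling, and function names are illustrative assumptions, not values from the text): near-identical decisions are issued directly, preferring the finer remote decision, while a persistent large difference triggers a safe stop:

```python
from typing import Callable, Tuple

def arbitrate(first: float, second: float, threshold: float) -> Tuple[str, float]:
    """Compare two numeric decisions (e.g., target speeds) against a threshold."""
    if abs(first - second) <= threshold:
        # Small difference: prefer the second decision, whose model is finer.
        return ("issue", second)
    return ("recompute", first)

def control_cycle(make_first: Callable[[], float],
                  make_second: Callable[[], float],
                  threshold: float, retries: int = 1) -> Tuple[str, float]:
    """Run the comparison, re-running it on a large difference before
    falling back to a safe-stop instruction."""
    for _ in range(retries + 1):
        action, value = arbitrate(make_first(), make_second(), threshold)
        if action == "issue":
            return ("issue", value)
    return ("safe_stop", 0.0)

# Consistent decisions are issued; persistent disagreement stops the car.
print(control_cycle(lambda: 10.0, lambda: 10.2, threshold=0.5))  # ('issue', 10.2)
print(control_cycle(lambda: 10.0, lambda: 15.0, threshold=0.5))  # ('safe_stop', 0.0)
```

The `make_first`/`make_second` callables stand in for re-running the first and second decision models on fresh data during the retry.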
Embodiments of the present application further provide an automatic driving system, comprising: a memory including at least one set of instructions configured to implement a driving strategy for an autonomous vehicle; and a processor that, in an operating state, reads the at least one set of instructions from the memory and, according to the at least one set of instructions:
obtaining real-time driving data of the autonomous vehicle;
generating first decision information based on the real-time driving data;
sending the real-time driving data to a remote data processing system;
receiving a second decision from the remote data processing system, the second decision being generated by the remote data processing system based on the real-time driving data;
and checking and comparing the second decision with the first decision, and issuing a decision instruction to the automatic driving vehicle according to the checking and comparing result.
The embodiment of the application also provides an automatic driving vehicle equipped with the automatic driving system.
In conclusion, upon reading the present detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure is presented by way of example only and is not limiting. Although not explicitly stated herein, those skilled in the art will appreciate that the present application is intended to cover various reasonable variations, adaptations, and modifications of the embodiments described herein. Such alterations, improvements, and modifications are suggested by this application and are within the spirit and scope of its exemplary embodiments.
Furthermore, certain terminology has been used in this application to describe embodiments of the application. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the application.
It should be appreciated that in the foregoing description of embodiments of the present application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more features. This manner of disclosure, however, is not to be taken as implying that all of those features are essential; upon reading the present application, those skilled in the art may well extract some of them as separate embodiments. That is, the embodiments in the present application may also be understood as an integration of a plurality of sub-embodiments, and each sub-embodiment may be valid with less than all features of a single foregoing disclosed embodiment.
In some embodiments, numbers expressing quantities or properties used to describe and claim certain embodiments of the application are to be understood as being modified in certain instances by the term "about", "approximately" or "substantially". For example, "about," "approximately," or "substantially" can mean a ± 20% variation of the value it describes, unless otherwise specified. Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as possible.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, specifications, publications, documents, and the like, cited herein is hereby incorporated by reference, except for any prosecution history associated therewith, any material that is inconsistent with or in conflict with this document, and any material that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any incorporated material and that associated with this document, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the present application. Other modified embodiments are also within the scope of the present application. Accordingly, the embodiments disclosed herein are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the invention of the present application in alternative configurations according to the embodiments herein. Thus, the embodiments of the present application are not limited to the embodiments precisely as shown and described.

Claims (19)

1. A control method of an autonomous vehicle, characterized by comprising:
acquiring real-time driving data of an autonomous vehicle through a vehicle-mounted autonomous driving system of the autonomous vehicle;
generating, by the vehicle-mounted automatic driving system, a first decision based on the real-time driving data;
the vehicle-mounted automatic driving system sends the real-time driving data to a remote data processing system;
receiving, by the vehicle-mounted autopilot system, a second decision from the remote data processing system, the second decision being generated by the remote data processing system;
determining, by the vehicle-mounted automatic driving system, a difference between the second decision and the first decision;
issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and a preset threshold;
wherein at least one of the data storage capacity, the computing capacity, the complexity of the decision model, and the input data used by the remote data processing system to generate the second decision is greater than the corresponding one of the vehicle-mounted automatic driving system.
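The arbitration loop recited in claim 1 can be sketched as follows. This is an illustrative sketch only: the function names (`local_decide`, `remote_decide`, `control_step`), the scalar decision values, and the fused "third decision" are assumptions for exposition, not part of the claimed system.

```python
# Illustrative sketch of the claim-1 control loop. All names and the scalar
# "difference" metric are hypothetical; real decisions would be structured
# trajectories or control commands, not single numbers.

THRESHOLD = 1.0  # preset threshold (assumed scalar)

def local_decide(driving_data):
    # First decision: generated on-board from the real-time driving data.
    return driving_data["speed_setpoint"] * 0.9

def remote_decide(driving_data):
    # Second decision: generated by the more capable remote data
    # processing system (received over the network in practice).
    return driving_data["speed_setpoint"] * 0.95

def control_step(driving_data):
    first = local_decide(driving_data)
    second = remote_decide(driving_data)
    if abs(second - first) < THRESHOLD:
        # Agreement: issue the first, the second, or a fused third decision.
        return (first + second) / 2
    # Disagreement beyond the threshold: fall back to a safe behavior
    # (immediate stop or exit to a safe environment, per claims 6-8).
    return "SAFE_STOP"

print(control_step({"speed_setpoint": 10.0}))  # agreement case
```

The key design point is that the on-board system remains the arbiter: the remote decision is used to cross-check, not to override, the local one.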
2. The method of claim 1, wherein the first decision is obtained by the vehicle-mounted automatic driving system through a first decision model using the real-time driving data.
3. The method of claim 1, wherein the second decision is obtained by the remote data processing system through a second decision model using the real-time driving data.
4. The method of claim 1, wherein the remote data processing system is a cloud server and the communication means is 5G communication.
5. The method of claim 1, wherein issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is smaller than the preset threshold,
issuing an instruction according to the first decision, the second decision, or a third decision obtained based on the first decision and the second decision.
6. The method of claim 1, wherein issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is greater than the preset threshold, driving, by the control module, the autonomous vehicle to stop immediately, or to leave the current driving environment as soon as possible and stop after entering a safe environment.
7. The method of claim 1, wherein issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is greater than the preset threshold, acquiring the first decision and the second decision again.
8. The method of claim 7, wherein issuing, by the vehicle-mounted automatic driving system, a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the re-acquired first decision and second decision is still greater than the preset threshold, driving, by the control module, the autonomous vehicle to stop immediately, or to leave the current driving environment as soon as possible and stop after entering a safe environment.
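The re-acquire-then-stop policy of claims 7 and 8 can be sketched as below; the iterator-based interface, the single re-acquisition, and the `"SAFE_STOP"` sentinel are hypothetical simplifications chosen for illustration.

```python
# Hypothetical sketch of the claim 7/8 policy: on disagreement beyond the
# threshold, both decisions are acquired once more; if they still disagree,
# the vehicle falls back to an immediate or safe-environment stop.

def arbitrate(first_decisions, second_decisions, threshold):
    for _ in range(2):  # initial acquisition plus one re-acquisition
        first = next(first_decisions)
        second = next(second_decisions)
        if abs(second - first) <= threshold:
            return first  # agreement: act on the decision
    return "SAFE_STOP"    # persistent disagreement: stop the vehicle

# Disagreement on the first pair, agreement after re-acquisition:
print(arbitrate(iter([1.0, 1.0]), iter([5.0, 1.2]), 0.5))
# Persistent disagreement triggers the safe-stop fallback:
print(arbitrate(iter([1.0, 1.0]), iter([5.0, 5.0]), 0.5))
```

Re-acquiring before stopping distinguishes a transient fault (e.g., a stale remote decision delayed by the network) from a genuine, persistent disagreement that warrants the safe-stop fallback.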
9. The method of claim 1, wherein the in-vehicle autopilot system receives the perception result from the remote data processing system.
10. An autopilot system, comprising:
a memory including at least one set of instructions structured to implement a driving strategy for an autonomous vehicle;
a processor that, in an operating state, reads the at least one set of instructions from the memory and, according to the at least one set of instructions:
obtaining real-time driving data of the autonomous vehicle;
generating first decision information based on the real-time driving data;
sending the real-time driving data to a remote data processing system;
receiving a second decision from the remote data processing system, the second decision being generated by the remote data processing system;
determining the difference between the second decision and the first decision, and issuing a decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and a preset threshold;
wherein at least one of the data storage capacity, the computing capacity, the complexity of the decision model, and the input data used by the remote data processing system to generate the second decision is greater than the corresponding one of the vehicle-mounted automatic driving system.
11. The autopilot system of claim 10, wherein the first decision is obtained by the on-board autopilot system through a first decision model using the real-time driving data.
12. The autopilot system of claim 10, wherein the second decision is obtained by the remote data processing system through a second decision model using the real-time driving data.
13. The autopilot system of claim 10 wherein the remote data processing system is a cloud server and the communication means is 5G communication.
14. The autonomous driving system of claim 10, wherein issuing the decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is smaller than the preset threshold,
issuing an instruction according to the first decision, the second decision, or a third decision obtained based on the first decision and the second decision.
15. The autonomous driving system of claim 10, wherein issuing the decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is greater than the preset threshold, driving, by the control module, the autonomous vehicle to stop immediately, or to leave the current driving environment as soon as possible and stop after entering a safe environment.
16. The autonomous driving system of claim 10, wherein issuing the decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the first decision and the second decision is greater than the preset threshold, acquiring the first decision and the second decision again.
17. The autonomous driving system of claim 16, wherein issuing the decision instruction to the autonomous vehicle according to the difference between the second decision and the first decision and the preset threshold comprises:
when the difference between the re-acquired first decision and second decision is still greater than the preset threshold, driving, by the control module, the autonomous vehicle to stop immediately, or to leave the current driving environment as soon as possible and stop after entering a safe environment.
18. The autopilot system of claim 10 wherein the in-vehicle autopilot system receives the perception result from the remote data processing system.
19. An autonomous vehicle, characterized in that the autonomous vehicle is equipped with the autonomous driving system according to any one of claims 9-16.
CN201910007648.4A 2018-12-28 2019-01-04 Control method for automatic driving vehicle and automatic driving system Active CN109709965B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/124848 WO2020133208A1 (en) 2018-12-28 2018-12-28 Control method for self-driving vehicle, and self-driving system
CNPCT/CN2018/124848 2018-12-28

Publications (2)

Publication Number Publication Date
CN109709965A CN109709965A (en) 2019-05-03
CN109709965B true CN109709965B (en) 2022-05-13

Family

ID=66259868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910007648.4A Active CN109709965B (en) 2018-12-28 2019-01-04 Control method for automatic driving vehicle and automatic driving system

Country Status (2)

Country Link
CN (1) CN109709965B (en)
WO (1) WO2020133208A1 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110149340A (en) * 2019-05-29 2019-08-20 武汉阳光尼特智能科技有限公司 A kind of remote online drives managing and control system and method
US11150644B2 (en) * 2019-05-29 2021-10-19 GM Cruise Holdings, LLC Rotating sensor suite for autonomous vehicles
US20200409362A1 (en) * 2019-06-26 2020-12-31 Byton North America Corporation Autonomous driving handoff systems and methods
CN110239597A (en) * 2019-07-03 2019-09-17 中铁轨道交通装备有限公司 A kind of active Unmanned Systems of Straddle type monorail train
CN110316204B (en) * 2019-07-09 2024-07-12 威马智慧出行科技(上海)有限公司 Control method of vehicle driving mode, gateway and automobile
CN110751847B (en) * 2019-10-15 2021-03-30 清华大学 Decision-making method and system for automatically driving vehicle behaviors
CN112835346A (en) * 2019-11-04 2021-05-25 大众汽车(中国)投资有限公司 Method and system for controlling vehicle and vehicle-mounted automatic driving system
CN111026112B (en) * 2019-12-02 2021-03-26 华中科技大学 Control system for automatic walking of wheeled robot along slope center line
CN111061268A (en) * 2019-12-12 2020-04-24 长城汽车股份有限公司 Remote supervision method, device and system for automatic driving vehicle
CN111157996B (en) * 2020-01-06 2022-06-14 珠海丽亭智能科技有限公司 Parking robot running safety detection method
CN112148010A (en) * 2020-09-23 2020-12-29 北京百度网讯科技有限公司 Automatic driving function control method, automatic driving function control device, electronic equipment and storage medium
CN112148615A (en) * 2020-09-30 2020-12-29 知行汽车科技(苏州)有限公司 Automatic driving test method, device and storage medium
CN112286166A (en) * 2020-10-12 2021-01-29 上海交通大学 Vehicle remote driving control system and method based on 5G network
CN112298211A (en) * 2020-11-19 2021-02-02 北京清研宏达信息科技有限公司 Automatic pedestrian yielding driving scheme based on 5G grading decision
CN112590784B (en) * 2020-12-28 2022-09-13 中通客车股份有限公司 Domain control system and method for passenger car
CN112987029A (en) * 2021-02-09 2021-06-18 上海振华重工(集团)股份有限公司 Positioning method, system, equipment and medium suitable for driving equipment
CN113365245B (en) * 2021-07-01 2024-03-22 腾讯科技(深圳)有限公司 Communication method and device applied to remote driving, medium and electronic equipment
CN113467324B (en) * 2021-07-22 2023-12-05 东风悦享科技有限公司 Adaptive 5G network cell switching parallel driving system and method
EP4336297A4 (en) * 2021-07-30 2024-07-10 Huawei Tech Co Ltd Fault detection method, fault detection apparatus, server, and vehicle
CN113655790A (en) * 2021-08-05 2021-11-16 阿波罗智联(北京)科技有限公司 Vehicle control method, device, equipment, storage medium and program product
CN113904959B (en) * 2021-11-02 2023-04-07 广州小鹏自动驾驶科技有限公司 Time delay analysis method and device, vehicle and storage medium
CN114205223B (en) * 2021-11-29 2024-05-28 中汽研(天津)汽车工程研究院有限公司 Tracing positioning method and device for abnormal events of intelligent driving function of vehicle
CN114485725B (en) * 2021-12-22 2024-05-24 深圳元戎启行科技有限公司 Data abnormality detection method, autopilot platform, and computer-readable storage medium
CN114274976B (en) * 2021-12-27 2023-09-12 广西汽车集团有限公司 Takeover algorithm module and method after automatic driving program breakdown
CN114384794A (en) * 2022-03-24 2022-04-22 苏州挚途科技有限公司 Vehicle remote driving control system and method
CN114987514A (en) * 2022-06-21 2022-09-02 零束科技有限公司 Control method and device for automatic driving vehicle and electronic equipment
CN117341723A (en) * 2022-06-28 2024-01-05 深圳市中兴微电子技术有限公司 Automatic driving method and system
CN115165400B (en) * 2022-09-08 2022-11-18 江苏天一航空工业股份有限公司 Parallel driving test system and method for automatic driving test field
CN115185323A (en) * 2022-09-08 2022-10-14 苏州洪昇新能源科技有限公司 Remote control method and system for new energy equipment

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9690290B2 (en) * 2015-06-04 2017-06-27 Toyota Motor Engineering & Manufacturing North America, Inc. Situation-based transfer of vehicle sensor data during remote operation of autonomous vehicles
CN106601001A (en) * 2015-10-16 2017-04-26 普天信息技术有限公司 Vehicle communication method and system
US10802484B2 (en) * 2016-11-14 2020-10-13 Baidu Usa Llc Planning feedback based decision improvement system for autonomous driving vehicle
CN108205923A (en) * 2016-12-19 2018-06-26 乐视汽车(北京)有限公司 A kind of automatic Pilot decision-making technique and system
CN108205922A (en) * 2016-12-19 2018-06-26 乐视汽车(北京)有限公司 A kind of automatic Pilot decision-making technique and system
US10139834B2 (en) * 2017-01-12 2018-11-27 GM Global Technology Operations LLC Methods and systems for processing local and cloud data in a vehicle and a cloud server for transmitting cloud data to vehicles
CN108462726A (en) * 2017-02-14 2018-08-28 广州市联奥信息科技有限公司 Vehicle assistant drive decision system and device towards unknown situation
CN107045345A (en) * 2017-03-06 2017-08-15 吉林大学 Endless-track vehicle remote control and automated driving system based on internet
CN108334072A (en) * 2017-12-29 2018-07-27 同济大学 A kind of double driving mode control systems of the sweeper based on Beidou navigation
CN108445885A (en) * 2018-04-20 2018-08-24 鹤山东风新能源科技有限公司 A kind of automated driving system and its control method based on pure electric vehicle logistic car
CN108583578B (en) * 2018-04-26 2019-12-31 北京领骏科技有限公司 Lane decision method based on multi-objective decision matrix for automatic driving vehicle
CN108845556A (en) * 2018-04-28 2018-11-20 济南浪潮高新科技投资发展有限公司 A kind of automatic driving vehicle test method and test device
CN108549384A (en) * 2018-05-21 2018-09-18 济南浪潮高新科技投资发展有限公司 A kind of remote control automatic Pilot method under 5G environment

Also Published As

Publication number Publication date
CN109709965A (en) 2019-05-03
WO2020133208A1 (en) 2020-07-02

Similar Documents

Publication Publication Date Title
CN109709965B (en) Control method for automatic driving vehicle and automatic driving system
AU2020104467A4 (en) Systems and methods for path determination
CN109693668B (en) System and method for controlling speed of automatic driving vehicle
CN110550029B (en) Obstacle avoiding method and device
CN109582022B (en) Automatic driving strategy decision system and method
EP4071661A1 (en) Automatic driving method, related device and computer-readable storage medium
CN112689588B (en) Control method and device for automatically driving vehicle
CN112859830B (en) Design operation region ODD judgment method, device and related equipment
CN110979327A (en) Longitudinal control method and system for automatic driving vehicle
CN109360438B (en) Vehicle speed decision system and method based on traffic lights
CN110087964B (en) Vehicle control system, vehicle control method, and storage medium
US9956958B2 (en) Vehicle driving control device and control device
US20200148204A1 (en) Using Discomfort For Speed Planning In Responding To Tailgating Vehicles For Autonomous Vehicles
CN112512887B (en) Driving decision selection method and device
CN112429016B (en) Automatic driving control method and device
CN113365878A (en) Redundant structure of automatic driving system
US11243542B2 (en) Vehicle control system, vehicle control method, vehicle control device, and vehicle control program
CN116135654A (en) Vehicle running speed generation method and related equipment
CN115042814A (en) Traffic light state identification method and device, vehicle and storage medium
CN113799794B (en) Method and device for planning longitudinal movement parameters of vehicle
CN115205848A (en) Target detection method, target detection device, vehicle, storage medium and chip
CN117058867A (en) Car meeting method and related device
CN114056346B (en) Automatic driving control method and device
CN114877911B (en) Path planning method, device, vehicle and storage medium
CN115063987B (en) Vehicle control method and device, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant