CN108292473B - Method and system for controlling a motion profile of an autonomous vehicle

Method and system for controlling a motion profile of an autonomous vehicle

Info

Publication number
CN108292473B
Authority
CN
China
Prior art keywords
data
trajectory
autonomous vehicle
contingency
trajectories
Prior art date
Legal status
Active
Application number
CN201680064611.XA
Other languages
Chinese (zh)
Other versions
CN108292473A (en)
Inventor
J. S. Levinson
T. D. Kentley-Klay
G. T. Sibley
Current Assignee
Zoox Inc
Original Assignee
Zoox Inc
Priority date
Filing date
Publication date
Application filed by Zoox Inc
Publication of CN108292473A
Application granted
Publication of CN108292473B
Legal status: Active

Classifications

    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W60/0011 Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B60W60/0027 Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W2050/0064 Manual parameter input, manual setting means, manual initialising or calibrating means using a remote, e.g. cordless, transmitter or receiver unit, e.g. remote keypad or mobile phone
    • B60W2420/403 Sensor type: image sensing, e.g. optical camera
    • B60W2420/408 Sensor type: radar; laser, e.g. lidar
    • B60W2420/54 Sensor type: audio sensitive means, e.g. ultrasound
    • B60W2554/20 Input parameters relating to objects: static objects
    • B60W2554/4026 Input parameters relating to dynamic objects: cycles
    • B60W2554/4029 Input parameters relating to dynamic objects: pedestrians
    • B60W2554/406 Input parameters relating to dynamic objects: traffic density
    • B60W2555/20 Input parameters relating to exterior conditions: ambient conditions, e.g. wind or rain
    • G01C21/34 Route searching; route guidance
    • G01C21/3407 Route searching; route guidance specially adapted for specific applications
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S2013/9316 Anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations
    • G01S2013/9322 Anti-collision radar for land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S7/4972 Alignment of sensor
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0297 Fleet control by controlling means in a control room

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Navigation (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Acoustics & Sound (AREA)

Abstract

Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications to provide a fleet of autonomous vehicles as a service. More specifically, the systems, devices, and methods are configured to generate trajectories to affect the navigation of an autonomous vehicle. In particular, the method may include receiving path data for navigating from a first geographic location to a second geographic location, generating data representing a trajectory to control motion of the autonomous vehicle based on the path data, generating data representing a contingency trajectory, monitoring the generation of the trajectory, and implementing the contingency trajectory in the absence of the trajectory.
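As a rough, hedged illustration of the monitoring-and-fallback behavior summarized above, the Python sketch below keeps the most recently generated contingency trajectory on hand and substitutes it when no fresh primary trajectory has arrived; the class name, the watchdog timeout, and the waypoint representation are assumptions for illustration, not the claimed implementation.

```python
import time
from typing import List, Optional, Tuple

Point = Tuple[float, float]

class TrajectoryMonitor:
    """Watch the planner's trajectory stream; if a new primary trajectory is
    absent for longer than a timeout, fall back to the stored contingency
    trajectory (e.g., a safe-stop path)."""

    def __init__(self, timeout_s: float = 0.2):   # timeout value is assumed
        self.timeout_s = timeout_s
        self.primary: Optional[List[Point]] = None
        self.contingency: Optional[List[Point]] = None
        self.last_update = time.monotonic()

    def update(self, primary: List[Point], contingency: List[Point]) -> None:
        """Called whenever the planner emits a fresh primary/contingency pair."""
        self.primary, self.contingency = primary, contingency
        self.last_update = time.monotonic()

    def active_trajectory(self) -> Optional[List[Point]]:
        """Return the trajectory to execute right now."""
        if time.monotonic() - self.last_update > self.timeout_s:
            return self.contingency        # primary trajectory is absent
        return self.primary
```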

Description

Method and system for controlling a motion profile of an autonomous vehicle
Cross Reference to Related Applications
This PCT international application is a continuation of U.S. Application No. 14/756,992, filed on November 4, 2015, which is related to: U.S. Patent Application Ser. No. 14/932,959, entitled "AUTONOMOUS VEHICLE FLEET SERVICE AND SYSTEM," filed November 4, 2015; U.S. Patent Application Ser. No. 14/932,963, entitled "ADAPTIVE MAPPING TO NAVIGATE AUTONOMOUS VEHICLES RESPONSIVE TO PHYSICAL ENVIRONMENT CHANGES," filed November 4, 2015; U.S. Patent Application Ser. No. 14/932,966, entitled "TELEOPERATION SYSTEM AND METHOD FOR TRAJECTORY MODIFICATION OF AUTONOMOUS VEHICLES," filed November 4, 2015; U.S. Patent Application Ser. No. 14/932,940, entitled "AUTOMATED EXTRACTION OF SEMANTIC INFORMATION TO ENHANCE INCREMENTAL MAPPING MODIFICATIONS FOR ROBOTIC VEHICLES," filed November 4, 2015; U.S. Patent Application Ser. No. 14/756,995, entitled "ADAPTIVE AUTONOMOUS VEHICLE PLANNER LOGIC," filed November 4, 2015; U.S. Patent Application Ser. No. 14/756,991, entitled "SENSOR-BASED OBJECT-DETECTION OPTIMIZATION FOR AUTONOMOUS VEHICLES," filed November 4, 2015; and U.S. Patent Application Ser. No. 14/756,996, entitled "CALIBRATION FOR AUTONOMOUS VEHICLE OPERATION," filed November 4, 2015; all of which are incorporated herein by reference in their entirety for all purposes.
Technical Field
Various embodiments relate generally to autonomous vehicles and associated mechanical, electrical, and electronic hardware, computer software and systems, and wired and wireless network communications for providing a fleet of autonomous vehicles as a service. More specifically, the systems, devices, and methods are configured to generate a trajectory to affect navigation of an autonomous vehicle.
Background
Various approaches to developing unmanned vehicles have focused primarily on automating conventional vehicles (e.g., manually driven automotive vehicles) with the aim of producing unmanned vehicles for consumer purchase. For example, a number of car companies and their affiliates are modifying conventional cars and control mechanisms, such as steering, to give consumers the ability to own a vehicle that can operate without a driver. In some approaches, a conventional unmanned vehicle performs safety-critical driving functions under some conditions, but requires the driver to assume control (e.g., steering, etc.) should the vehicle controller fail to resolve certain issues that might jeopardize the safety of the occupants.
Functionally, however, conventional unmanned vehicles typically have a number of drawbacks. For example, a large number of unmanned vehicles under development have evolved from vehicles that require manual (i.e., human-controlled) steering and other similar vehicle functions. Thus, most unmanned vehicles are based on a paradigm in which a vehicle is designed to accommodate a licensed driver, for whom a specific seat or location is reserved within the vehicle. As such, unmanned vehicles are designed suboptimally and generally forgo opportunities to simplify vehicle design and conserve resources (e.g., reduce the cost of producing an unmanned vehicle). Other drawbacks are also present in conventional unmanned vehicles.
Other drawbacks also exist in conventional transportation services, which, owing to the usual approaches to providing conventional transportation and ride-sharing services, are not well suited to effectively manage, for example, vehicle inventory. In one conventional approach, passengers are required to access a mobile application to request transportation services via a centralized service that assigns a human driver and vehicle (e.g., under private ownership) to a passenger. Because differently owned vehicles are used, maintenance of private vehicles and their safety systems generally goes unchecked. In another conventional approach, some entities enable ride-sharing of a group of vehicles by allowing drivers who enroll as members to access vehicles that are shared among the members. This approach is not well suited to provide convenient transportation services because drivers need to pick up and drop off the shared vehicles at specific locations, which are rare and sparse in urban environments and require access to relatively expensive real estate (i.e., parking lots) at which to park the ride-shared vehicles. In the conventional approaches described above, the conventional vehicles used to provide transportation services are generally underutilized from an inventory perspective, because the vehicles are rendered immobile once the driver departs. Further, ride-sharing approaches (as well as individually owned vehicle transportation services) are generally not well suited to rebalance inventory to match the demand for transportation services to accommodate usage and typical travel patterns. Note, too, that some conventionally described vehicles having limited self-driving automation capabilities are also not well suited to rebalance inventory, as a human driver may generally be required. An example of a vehicle with limited self-driving automation is a vehicle designated as a Level 3 ("L3") vehicle according to the National Highway Traffic Safety Administration ("NHTSA") of the U.S. Department of Transportation.
As another drawback, typical approaches to unmanned vehicles are generally not well suited for detecting and navigating vehicles relative to interactions (e.g., social interactions) between a vehicle in motion and drivers of other vehicles or individuals. For example, some conventional approaches are not sufficiently able to identify pedestrians, cyclists, etc., and the associated interactions, such as eye contact and gestures, for the purpose of addressing safety risks to occupants of an unmanned vehicle, as well as to drivers of other vehicles, pedestrians, etc.
Thus, what is needed is a solution for implementing an autonomous vehicle without the limitations of conventional technology.
Drawings
Various embodiments or examples ("examples") of the invention are disclosed in the following detailed description and the accompanying drawings:
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles communicatively networked to an autonomous vehicle service platform, in accordance with some embodiments;
FIG. 2 is an example of a flow chart of monitoring a fleet of autonomous vehicles according to some embodiments;
FIG. 3A is a diagram depicting an example of sensors and other autonomous vehicle components according to some examples;
FIGS. 3B-3E are diagrams depicting examples of sensor field redundancy and an autonomous vehicle's adaptation to a loss of a sensor field, according to some examples;
FIG. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform communicatively coupled to an autonomous vehicle controller via a communication layer, according to some examples;
FIG. 5 is an example of a flow chart for controlling an autonomous vehicle according to some embodiments;
FIG. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, in accordance with some embodiments;
FIG. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communication with a fleet of autonomous vehicles, in accordance with some embodiments;
FIG. 8 is a diagram depicting an example of a messaging application configured to exchange data between various applications, in accordance with some embodiments;
FIG. 9 is a diagram depicting types of data for facilitating remote operation using the communication protocol depicted in FIG. 8, in accordance with some examples;
FIG. 10 is a diagram illustrating an example of a remote operator interface that a remote operator may use to affect a path plan, in accordance with some embodiments;
FIG. 11 is a diagram depicting an example of a planner configured to initiate remote operations, in accordance with some examples;
FIG. 12 is an example of a flowchart configured to control an autonomous vehicle, according to some embodiments;
FIG. 13 depicts an example in which a planner may generate trajectories according to some examples;
FIG. 14 is a diagram depicting another example of an autonomous vehicle service platform in accordance with some embodiments;
FIG. 15 is an example of a flow chart for controlling an autonomous vehicle according to some embodiments;
FIG. 16 is a diagram of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples;
FIG. 17 is an example of a flow chart for managing a fleet of autonomous vehicles, according to some embodiments;
FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communication link manager, according to some embodiments;
FIG. 19 is an example of a flow chart of determining actions for an autonomous vehicle during an event, according to some embodiments;
FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments;
FIG. 21 is an example of a flow chart for generating local pose data based on integrated sensor data, according to some embodiments;
FIG. 22 is a diagram depicting another example of a localizer according to some embodiments;
FIG. 23 is a diagram depicting an example of a perception engine in accordance with some embodiments;
FIG. 24 is an example of a flow chart for generating perception engine data, in accordance with some embodiments;
FIG. 25 is an example of a segmentation processor according to some embodiments;
FIG. 26A is a diagram depicting an example of an object tracker and classifier in accordance with various embodiments;
FIG. 26B is a diagram depicting another example of an object tracker in accordance with at least some examples;
FIG. 27 is an example of a front-end processor for a perception engine according to some examples;
FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, in accordance with various embodiments;
FIG. 29 is an example of a flow chart simulating various aspects of an autonomous vehicle, according to some embodiments;
FIG. 30 is an example of a flow chart for generating map data, according to some embodiments;
FIG. 31 is a diagram depicting the architecture of a map construction engine, in accordance with some embodiments;
FIG. 32 is a diagram depicting an autonomous vehicle application according to some examples;
FIGS. 33-35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments;
FIG. 36 is a diagram depicting an example of at least a portion of a planner for generating trajectories according to some examples;
FIG. 37 is a diagram depicting an example of a trajectory tracker, according to some examples;
FIG. 38 is a diagram depicting an example of redundant implementation of an autonomous vehicle controller according to some examples;
FIG. 39 is a diagram depicting an example of a status and event manager, or portion thereof, in accordance with some examples;
FIG. 40 is a flowchart illustrating an example of implementing one or more track types according to some examples; and
FIG. 41 illustrates an example of various computing platforms configured to provide various generation-related functionalities and/or structures to components of an autonomous vehicle service, in accordance with various embodiments.
Description of the preferred embodiments
The various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or as a series of program instructions on a computer readable medium, such as a computer readable storage medium, or a computer network in which the program instructions are sent over optical, electronic, or wireless communication links. In general, the operations of the disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
One or more exemplary embodiments are provided below along with the accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims, and numerous alternatives, modifications, and equivalents are encompassed. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example, and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail to avoid unnecessarily obscuring the description.
FIG. 1 is a diagram depicting an implementation of a fleet of autonomous vehicles communicatively networked to an autonomous vehicle service platform, in accordance with some embodiments. The illustration 100 depicts a fleet of autonomous vehicles 109 (e.g., one or more autonomous vehicles 109a to 109e) operating as a service, each autonomous vehicle 109 being configured to self-drive on a road network 110 and to establish a communication link 192 with an autonomous vehicle service platform 101. In an example in which the fleet of autonomous vehicles 109 constitutes a service, a user 102 may send a request 103 for autonomous transportation to the autonomous vehicle service platform 101 via one or more networks 106. In response, the autonomous vehicle service platform 101 may dispatch one of the autonomous vehicles 109 to transport the user 102 autonomously from geographic location 119 to geographic location 111. The autonomous vehicle service platform 101 may dispatch an autonomous vehicle from a station 190 to geographic location 119, or may divert an autonomous vehicle 109c that is already in transit (e.g., without occupants) to service the transportation request of the user 102. The autonomous vehicle service platform 101 may also be configured to divert an in-transit autonomous vehicle 109c carrying passengers in response to a request from the user 102 (e.g., as a passenger). In addition, the autonomous vehicle service platform 101 may be configured to reserve an in-transit autonomous vehicle 109c carrying passengers to divert to service the request of the user 102 after the existing passengers have been dropped off. Note that multiple autonomous vehicle service platforms 101 (not shown) and one or more stations 190 may be implemented to service one or more autonomous vehicles 109 in conjunction with the road network 110. One or more stations 190 may be configured to store, service, manage, and/or maintain an inventory of autonomous vehicles 109 (e.g., a station 190 may include one or more computing devices implementing the autonomous vehicle service platform 101).
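The dispatch choice described above (send a vehicle from a station, or divert an unoccupied in-transit vehicle) can be illustrated with a minimal Python sketch; the data structures, the straight-line travel-time estimate, and the vehicle identifiers are assumptions for illustration only, not the platform's actual logic.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Vehicle:
    vehicle_id: str
    location: Tuple[float, float]   # (x, y) in arbitrary map units
    occupied: bool                  # True if passengers are currently on board

def eta(a: Tuple[float, float], b: Tuple[float, float], speed: float = 10.0) -> float:
    """Crude straight-line travel-time estimate."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 / speed

def dispatch(fleet: List[Vehicle], pickup: Tuple[float, float]) -> Optional[Vehicle]:
    """Pick the unoccupied vehicle (parked at a station or already in transit)
    that can reach the requested pickup location soonest."""
    candidates = [v for v in fleet if not v.occupied]
    if not candidates:
        return None   # e.g., queue the request or reserve an occupied vehicle
    return min(candidates, key=lambda v: eta(v.location, pickup))

fleet = [Vehicle("109a", (0.0, 0.0), False),   # parked at a station
         Vehicle("109c", (4.0, 1.0), False),   # in transit, no occupants
         Vehicle("109d", (2.0, 2.0), True)]    # carrying passengers
print(dispatch(fleet, pickup=(5.0, 1.0)).vehicle_id)   # -> "109c"
```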
According to some examples, at least some of the autonomous vehicles 109a to 109e are configured as bi-directional autonomous vehicles, such as bi-directional autonomous vehicle ("AV") 130. The bi-directional autonomous vehicle 130 may be configured to travel in either direction principally along, but not limited to, a longitudinal axis 131. Accordingly, the bi-directional autonomous vehicle 130 may be configured to implement active lighting external to the vehicle to alert others in the adjacent vicinity (e.g., other drivers, pedestrians, cyclists, etc.), as well as to indicate the direction in which the bi-directional autonomous vehicle 130 is traveling. For example, the active light sources 136 may be implemented as active lights 138a when traveling in a first direction, or may be implemented as active lights 138b when traveling in a second direction. The active lights 138a may be implemented using a first subset of one or more colors, with optional animation (e.g., light patterns of variable intensities of light, or colors that may change over time). Similarly, the active lights 138b may be implemented using a second subset of one or more colors and light patterns that may be different from those of active lights 138a. For example, the active lights 138a may be implemented using white-colored lights as "headlights," whereas the active lights 138b may be implemented using red-colored lights as "taillights." The active lights 138a and 138b, or portions thereof, may be configured to provide other light-related functionalities, such as providing "turn signal indication" functions (e.g., using yellow light). According to various examples, logic in the autonomous vehicle 130 may be configured to adapt the active lights 138a and 138b to comply with various safety requirements and traffic regulations or laws of any number of jurisdictions.
In some embodiments, the bi-directional autonomous vehicle 130 may be configured to have similar structural elements and components in each quad portion, such as quad portion 194. The quad portions are depicted, at least in this example, as portions of the bi-directional autonomous vehicle 130 defined by the intersection of plane 132 and plane 134, both of which pass through the vehicle to form two similar halves on each side of planes 132 and 134. Further, the bi-directional autonomous vehicle 130 may include an autonomous vehicle controller 147 that includes logic (e.g., hardware or software, or a combination thereof) configured to control a majority of the vehicle functions, including driving control (e.g., propulsion, steering, etc.) and the active light sources 136, among other functions. The bi-directional autonomous vehicle 130 also includes a number of sensors 139 disposed at various locations on the vehicle (other sensors are not shown).
The autonomous vehicle controller 147 may further be configured to determine a local pose (e.g., local position) of the autonomous vehicle 109 and to detect external objects relative to the vehicle. For example, consider that the bi-directional autonomous vehicle 130 is traveling in the road network 110 in direction 119. A localizer (not shown) of the autonomous vehicle controller 147 can determine the local pose at geographic location 111. In doing so, the localizer may use acquired sensor data, such as sensor data associated with the surfaces of buildings 115 and 117, which can be compared against reference data, such as map data (e.g., 3D map data, including reflectance data), to determine the local pose. Further, a perception engine (not shown) of the autonomous vehicle controller 147 may be configured to detect, classify, and predict the behavior of external objects, such as external object 112 (a "tree") and external object 114 (a "pedestrian"). Classification of external objects may broadly categorize objects as static objects, such as external object 112, and dynamic objects, such as external object 114. The localizer and the perception engine, together with other components of the AV controller 147, cooperate to cause the autonomous vehicle 109 to drive autonomously.
According to some examples, the autonomous vehicle service platform 101 is configured to provide remote operator services should an autonomous vehicle 109 request teleoperation. For example, consider that an autonomous vehicle controller 147 in autonomous vehicle 109d detects an object 126 obscuring a path 124 on roadway 122 at point 191, as depicted in illustration 120. If the autonomous vehicle controller 147 cannot ascertain, with a relatively high degree of certainty, a path or trajectory over which vehicle 109d may travel safely, the autonomous vehicle controller 147 may transmit a request message 105 for teleoperation services. In response, a remote operator computing device 104 may receive instructions from a remote operator 108 to perform a course of action to successfully (and safely) negotiate obstacle 126. Response data 107 can then be transmitted back to autonomous vehicle 109d to cause the vehicle, for example, to safely cross a set of double lines as it travels along an alternate path 121. In some examples, the remote operator computing device 104 may generate a response identifying a geographic area to be excluded from path planning. In particular, rather than providing a path to follow, the remote operator 108 may define areas or locations that the autonomous vehicle must avoid.
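A minimal sketch of this escalation logic follows, assuming a scalar planner confidence and circular exclusion zones; the threshold value, class names, and zone representation are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Point = Tuple[float, float]

@dataclass
class ExclusionZone:
    center: Point
    radius: float
    def contains(self, p: Point) -> bool:
        return (p[0] - self.center[0]) ** 2 + (p[1] - self.center[1]) ** 2 <= self.radius ** 2

@dataclass
class TeleopResponse:
    guide_path: Optional[List[Point]] = None                            # explicit path to follow, or
    exclusion_zones: List[ExclusionZone] = field(default_factory=list)  # areas to avoid

CONFIDENCE_THRESHOLD = 0.8   # assumed value; the patent does not give a number

def needs_teleoperation(planner_confidence: float) -> bool:
    """Request remote-operator guidance when the controller cannot commit to a
    trajectory with sufficiently high certainty (e.g., object 126 blocking path 124)."""
    return planner_confidence < CONFIDENCE_THRESHOLD

def apply_response(candidates: List[List[Point]], resp: TeleopResponse) -> List[List[Point]]:
    """Either adopt the operator-supplied path outright, or drop candidate
    trajectories that enter any operator-defined exclusion zone."""
    if resp.guide_path is not None:
        return [resp.guide_path]
    return [t for t in candidates
            if not any(z.contains(p) for p in t for z in resp.exclusion_zones)]
```

The second branch mirrors the exclusion-region case described above: the operator constrains where the vehicle may plan rather than dictating a single path.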
In view of the foregoing, the structures and/or functionalities of the autonomous vehicle 130 and/or the autonomous vehicle controller 147, as well as their components, can perform real-time (or near real-time) trajectory calculations and autonomy-related operations, such as localization and perception, to enable the autonomous vehicle 109 to self-drive.
In some cases, the bi-directional nature of the bi-directional autonomous vehicle 130 provides a vehicle having quad portions 194 (or any other number of symmetric portions) that are similar, or substantially similar, to one another. Such symmetry reduces design complexity and relatively decreases the number of unique parts or structures, thereby reducing inventory and manufacturing complexity. For example, a drivetrain and a wheel system may be disposed in any of the quad portions 194. Further, the autonomous vehicle controller 147 is configured to invoke teleoperation services to reduce the likelihood that an autonomous vehicle 109 is delayed in transit while resolving an event or issue that may otherwise affect the safety of the occupants. In some cases, the visible portion of the road network 110 depicts a geo-fenced region that may limit, or otherwise control, the movement of autonomous vehicles 109 to the road network shown in FIG. 1. According to various examples, the autonomous vehicles 109, and fleets thereof, may be configurable to operate as Level 4 ("full self-driving automation," or L4) vehicles that can provide transportation on demand, with the convenience and privacy of point-to-point personal mobility while providing the efficiency of shared vehicles. In some examples, the autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a steering wheel or any other mechanical means of providing manual (i.e., human-controlled) steering of the autonomous vehicle 109. Further, the autonomous vehicle 109, or any autonomous vehicle described herein, may be configured to omit a seat or location reserved within the vehicle for an occupant to engage a steering wheel or any mechanical interface for a steering system.
FIG. 2 is an example of a flow chart of monitoring a fleet of autonomous vehicles, according to some embodiments. At 202, flow 200 begins when a fleet of autonomous vehicles is monitored. At least one autonomous vehicle includes an autonomous vehicle controller configured to cause the vehicle to autonomously transit from a first geographic region to a second geographic region. At 204, data representing an event associated with a calculated confidence level for the vehicle is detected. An event may be a condition or situation affecting the operation, or potentially affecting the operation, of an autonomous vehicle. The event may be internal or external to the autonomous vehicle. For example, an obstacle obstructing a roadway may be considered an event, as may a reduction or loss of communication. An event may include traffic conditions or congestion, as well as an unexpected or unusual number or type of external objects (or tracks) perceived by a perception engine. An event may include weather-related conditions (e.g., loss of friction due to ice or rain) or the angle at which the sun is shining (e.g., at sunset), such as a low angle relative to the horizon that causes the sun to shine brightly into the eyes of human drivers of other vehicles. These and other conditions may be regarded as events that cause invocation of the remote operator service or cause the vehicle to execute a safe-stop trajectory.
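As one hedged illustration of how a detected event and its computed confidence level might map to an action (continue, invoke the remote operator service, or execute a safe-stop trajectory), consider the sketch below; the numeric thresholds and the communications check are assumptions, since the text describes confidence levels only qualitatively.

```python
from enum import Enum

class Response(Enum):
    CONTINUE = "continue"
    REQUEST_TELEOPERATION = "request_teleoperation"
    SAFE_STOP = "safe_stop"

# Assumed thresholds; the patent does not specify numeric values.
TELEOP_THRESHOLD = 0.7
SAFE_STOP_THRESHOLD = 0.3

def classify_event(confidence: float, comms_available: bool) -> Response:
    """Map a computed confidence level for the vehicle onto a coarse action:
    keep driving, ask a remote operator for guidance, or execute a safe-stop
    trajectory (e.g., when confidence is very low or no operator is reachable)."""
    if confidence >= TELEOP_THRESHOLD:
        return Response.CONTINUE
    if confidence >= SAFE_STOP_THRESHOLD and comms_available:
        return Response.REQUEST_TELEOPERATION
    return Response.SAFE_STOP

print(classify_event(0.55, comms_available=True))    # Response.REQUEST_TELEOPERATION
print(classify_event(0.55, comms_available=False))   # Response.SAFE_STOP
```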
At 206, data representing a subset of candidate trajectories may be received from the autonomous vehicle responsive to the detection of the event. For example, a planner of the autonomous vehicle controller may calculate and evaluate large numbers of trajectories (e.g., thousands or more) per unit time, such as per second. In some embodiments, the candidate trajectories are a subset of the trajectories that are associated with relatively higher confidence levels that the autonomous vehicle may move forward safely in view of the event (e.g., using an alternate path provided by a remote operator). Note that some candidate trajectories may be ranked higher, or associated with higher confidence levels, than other candidate trajectories. According to some examples, subsets of candidate trajectories may originate from any number of sources, such as a planner, a remote operator computing device (e.g., a remote operator can determine and provide an approximate path), etc., and may be combined into a superset of candidate trajectories. At 208, path guidance data may be identified at one or more processors. The path guidance data may be configured to assist a remote operator in selecting a guided trajectory from one or more of the candidate trajectories. In some instances, the path guidance data specifies a value indicative of a confidence level, or probability, that indicates a degree of certainty that a particular candidate trajectory may reduce or negate the probability that the event will affect the operation of the autonomous vehicle. A guided trajectory, as a selected candidate trajectory, may be received at 210 responsive to input from a remote operator (e.g., a remote operator may select at least one candidate trajectory as the guided trajectory from a set of differently ranked candidate trajectories). The selection may be made via an operator interface listing a number of candidate trajectories, for example, in order from the highest confidence level to the lowest confidence level. At 212, the selection of the candidate trajectory as the guided trajectory may be transmitted to the vehicle, which, in turn, implements the guided trajectory to resolve the condition by causing the vehicle to perform the remote operator-specified maneuver. As such, the autonomous vehicle may transition out of a non-normative operational state.
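The ranking and selection of steps 206 through 212 can be sketched as follows; the dataclass, the top-N cutoff, and the numeric confidence values are illustrative assumptions rather than details taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CandidateTrajectory:
    trajectory_id: str
    waypoints: List[Tuple[float, float]]
    confidence: float          # probability that the event will not affect operation

def build_path_guidance(candidates: List[CandidateTrajectory],
                        top_n: int = 5) -> List[CandidateTrajectory]:
    """Rank the subset of candidate trajectories from highest to lowest
    confidence so an operator interface can list them (steps 206-208)."""
    return sorted(candidates, key=lambda c: c.confidence, reverse=True)[:top_n]

def select_guided_trajectory(ranked: List[CandidateTrajectory],
                             operator_choice: int) -> CandidateTrajectory:
    """Step 210: the remote operator picks one entry from the ranked list;
    step 212 would transmit it back to the vehicle for execution."""
    return ranked[operator_choice]

candidates = [CandidateTrajectory("t1", [(0, 0), (1, 0)], 0.62),
              CandidateTrajectory("t2", [(0, 0), (1, 1)], 0.91),
              CandidateTrajectory("t3", [(0, 0), (0, 1)], 0.47)]
ranked = build_path_guidance(candidates)
print(select_guided_trajectory(ranked, operator_choice=0).trajectory_id)  # -> "t2"
```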
FIG. 3A is a diagram depicting examples of sensors and other autonomous vehicle components, according to some examples. Diagram 300 depicts an interior view of a bi-directional autonomous vehicle 330 that includes sensors, a signal router 345, a transmission system 349, a removable battery 343, an audio generator 344 (e.g., a speaker or transducer), and autonomous vehicle ("AV") control logic 347. The sensors shown in diagram 300 include image capture sensors 340 (e.g., light capture devices or cameras of any type), audio capture sensors 342 (e.g., microphones of any type), radar devices 348, sonar devices 341 (or other like sensors, including ultrasonic sensors or acoustic-related sensors), and other sensor types and modalities (some of which are not shown), such as LIDAR devices 346, inertial measurement units ("IMUs"), global positioning system ("GPS") sensors, sonar sensors, etc. Note that quad portion 350 is representative of the symmetry of each of the four "quad portions" of the bi-directional autonomous vehicle 330 (e.g., each quad portion 350 may include a wheel, a transmission 349, similar steering mechanisms, similar structural support and members, etc., beyond those depicted). As depicted in FIG. 3A, similar sensors may be placed in similar locations in each quad portion 350; however, any other configuration may be implemented. Each wheel may be steerable individually and independently of the others. Note, too, that the removable battery 343 may be configured to facilitate being swapped in and out rather than being charged in place, thereby ensuring reduced or negligible downtime due to the necessity of charging the battery 343. While the autonomous vehicle controller 347a is depicted as being used in the bi-directional autonomous vehicle 330, the autonomous vehicle controller 347a is not so limited and may be implemented in unidirectional autonomous vehicles or any other type of vehicle, whether on land, in air, or at sea. Note that the depicted and described positions, locations, orientations, quantities, and types of sensors shown in FIG. 3A are not intended to be limiting, and, as such, there may be any number and type of sensors, and any sensor may be located and oriented anywhere on the autonomous vehicle 330.
According to some embodiments, portions of the autonomous vehicle ("AV") control logic 347 may be implemented using clusters of graphics processing units ("GPUs") that implement a framework and programming model suitable for programming such GPU clusters. For example, a compute unified device architecture ("CUDA")-compatible programming language and application programming interface ("API") model may be used to program the GPUs. CUDA™ is produced and maintained by NVIDIA of Santa Clara, California. Note that other programming languages, such as OpenCL, or any other parallel programming language, may be implemented.
According to some embodiments, the autonomous vehicle control logic 347 may be implemented in hardware and/or software as autonomous vehicle controller 347a, which is shown to include a motion controller 362, a planner 364, a perception engine 366, and a localizer 368. As shown, the autonomous vehicle controller 347a is configured to receive camera data 340a, lidar data 346a, and radar data 348a, or any other range-sensing or localization data, including sonar data 341a and the like. The autonomous vehicle controller 347a is also configured to receive positioning data, such as GPS data 352, IMU data 354, and other position-sensing data (e.g., wheel-related data, such as steering angles, angular velocity, etc.). Further, the autonomous vehicle controller 347a may receive any other sensor data 356, as well as reference data 339. In some cases, the reference data 339 includes map data (e.g., 3D map data, 2D map data, 4D map data (e.g., including time determination)) and route data (e.g., road network data, including, but not limited to, RNDF data (or similar data), MDF data (or similar data), etc.).
The localizer 368 is configured to receive sensor data from one or more sources, such as GPS data 352, wheel data, IMU data 354, lidar data 346a, camera data 340a, radar data 348a, and the like, as well as reference data 339 (e.g., 3D map data and route data). The localizer 368 integrates (e.g., fuses) the sensor data and analyzes the data by comparing the sensor data against map data to determine a local pose (or position) of the bi-directional autonomous vehicle 330. According to some examples, the localizer 368 may generate or update the pose or position of any autonomous vehicle in real time or near real time. Note that the localizer 368 and its functionality need not be limited to "bi-directional" vehicles and can be implemented in any vehicle of any type. Therefore, the localizer 368 (and other components of the AV controller 347a) may be implemented in a "unidirectional" vehicle or any non-autonomous vehicle. According to some embodiments, the data describing the local pose may include one or more of an x-coordinate, a y-coordinate, a z-coordinate (or any coordinate of any coordinate system, including polar or cylindrical coordinate systems, or the like), a yaw value, a roll value, a pitch value (e.g., an angle value), a velocity (e.g., a speed), latitude, and the like.
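To make "comparing the sensor data against map data to determine a local pose" concrete, here is a toy pose container and a brute-force scan-to-map match; the field names, the point map, and the candidate-pose search are assumptions for illustration and are not how the localizer is actually implemented.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LocalPose:
    # Illustrative container for the pose quantities listed above.
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

# Toy "map": surface points in the map frame (e.g., facades of buildings 115 and 117).
MAP_POINTS = {(5, 0), (5, 1), (5, 2), (0, 5), (1, 5), (2, 5)}

def match_score(scan: List[Tuple[float, float]], pose: LocalPose) -> int:
    """Count scan points that, transformed into the map frame by the candidate
    pose, land on a known map surface."""
    c, s = math.cos(pose.yaw), math.sin(pose.yaw)
    hits = 0
    for px, py in scan:
        mx, my = pose.x + c * px - s * py, pose.y + s * px + c * py
        hits += (round(mx), round(my)) in MAP_POINTS
    return hits

def localize(scan, candidate_poses):
    """Pick the candidate pose whose transformed scan best agrees with the map."""
    return max(candidate_poses, key=lambda p: match_score(scan, p))

scan = [(3.0, 0.0), (3.0, 1.0)]           # lidar returns in the vehicle frame
candidates = [LocalPose(2, 0, 0, 0, 0, 0.0), LocalPose(1, 0, 0, 0, 0, 0.0)]
print(localize(scan, candidates).x)       # -> 2
```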
The perception engine 366 is configured to receive sensor data, such as lidar data 346a, camera data 340a, radar data 348a, and the like, from one or more sources, as well as local pose data. The perception engine 366 may be configured to determine the locations of external objects based on the sensor data and other data. An external object may be, for example, an object that is not part of the drivable surface. For instance, the perception engine 366 can detect and classify external objects as pedestrians, cyclists, dogs, other vehicles, etc. (e.g., the perception engine 366 is configured to classify objects by classification type, which may be associated with semantic information, including a label). Based on the classification of these external objects, the external objects may be labeled as dynamic objects or static objects. For example, an external object classified as a tree may be labeled a static object, whereas an external object classified as a pedestrian may be labeled a dynamic object. External objects labeled as static may or may not be described in the map data. Examples of external objects likely to be labeled static include traffic cones, cement barriers arranged across a roadway, lane closure signs, newly placed mailboxes or trash cans adjacent to a roadway, and the like. Examples of external objects likely to be labeled dynamic include cyclists, pedestrians, animals, other vehicles, and the like. If an external object is labeled as dynamic, further data about the external object may indicate a typical level of activity and velocity, as well as behavior patterns associated with its classification type. The further data about the external object may be generated by tracking the external object. As such, the classification type can be used to predict or otherwise determine the likelihood that an external object may, for example, interfere with an autonomous vehicle traveling along a planned path. For example, an external object classified as a pedestrian may be associated with some maximum speed, as well as an average speed (e.g., based on tracking data). The velocity of the pedestrian relative to the velocity of the autonomous vehicle can be used to determine whether a collision is likely. Further, the perception engine 366 may determine a level of uncertainty associated with a current and a future state of objects. In some examples, the level of uncertainty may be expressed as an estimated value (or probability).
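A small sketch of such a class-based interference check follows; the class table, the speed bounds, and the time-to-pass test are assumptions used only to illustrate how a classification type plus tracking data could feed a likelihood-of-interference estimate.

```python
from dataclasses import dataclass

# Illustrative classification-to-attribute table; the speeds are assumptions,
# not values from the patent.
CLASS_INFO = {
    "pedestrian":   {"dynamic": True,  "max_speed_mps": 3.0},
    "cyclist":      {"dynamic": True,  "max_speed_mps": 12.0},
    "tree":         {"dynamic": False, "max_speed_mps": 0.0},
    "traffic_cone": {"dynamic": False, "max_speed_mps": 0.0},
}

@dataclass
class DetectedObject:
    label: str                 # semantic classification from the perception engine
    distance_m: float          # distance from the autonomous vehicle's planned path
    closing_speed_mps: float   # tracked speed toward the planned path

def is_dynamic(obj: DetectedObject) -> bool:
    return CLASS_INFO.get(obj.label, {"dynamic": True})["dynamic"]

def may_interfere(obj: DetectedObject, time_to_pass_s: float) -> bool:
    """Rough check: could the object plausibly reach the planned path before the
    vehicle has passed, given the class's maximum speed and its tracked speed?"""
    if not is_dynamic(obj):
        return False
    speed = max(obj.closing_speed_mps, 0.0)
    bound = CLASS_INFO[obj.label]["max_speed_mps"]
    return min(speed, bound) * time_to_pass_s >= obj.distance_m

print(may_interfere(DetectedObject("pedestrian", 4.0, 1.5), time_to_pass_s=2.0))  # False
print(may_interfere(DetectedObject("cyclist", 4.0, 6.0), time_to_pass_s=2.0))     # True
```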
The planner 364 is configured to receive perception data from the perception engine 366, and may also receive localizer data from the localizer 368. According to some examples, the perception data may include an obstacle map specifying static and dynamic objects located in the vicinity of the autonomous vehicle, whereas the localizer data may include a local pose or position. In operation, the planner 364 generates numerous trajectories and evaluates the trajectories based at least on the location of the autonomous vehicle relative to the locations of the external dynamic and static objects. The planner 364 selects an optimal trajectory based on a variety of criteria by which to direct the autonomous vehicle in a way that provides for collision-free travel. In some examples, the planner 364 may be configured to calculate trajectories as probabilistically determined trajectories. Further, the planner 364 may transmit steering and propulsion commands (as well as deceleration or braking commands) to the motion controller 362. The motion controller 362 may subsequently convert any of the commands, such as a steering command, a throttle or propulsion command, and a braking command, into control signals (e.g., for application to actuators or other mechanical interfaces) to implement changes in steering or wheel angles 351 and/or velocity 353.
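The generate-evaluate-select loop attributed to the planner can be caricatured in a short Python sketch; the constant-heading trajectory family, the collision radius, and the scoring function are assumptions chosen for brevity and are not the planner's actual criteria.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def generate_trajectories(start: Point, speed: float, horizon_s: float,
                          headings_deg: List[float]) -> List[List[Point]]:
    """Enumerate simple constant-heading candidate trajectories; a real planner
    would generate thousands per second with far richer vehicle dynamics."""
    trajs = []
    for h in headings_deg:
        rad = math.radians(h)
        trajs.append([(start[0] + speed * t * math.cos(rad),
                       start[1] + speed * t * math.sin(rad))
                      for t in (0.5, 1.0, 1.5, 2.0) if t <= horizon_s])
    return trajs

def score(traj: List[Point], obstacles: List[Point], goal: Point) -> float:
    """Penalize proximity to obstacles, reward progress toward the goal."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    if any(dist(p, o) < 1.0 for p in traj for o in obstacles):
        return float("-inf")               # collision: never selected
    return -dist(traj[-1], goal)           # a closer end point is better

def select_optimal(trajs, obstacles, goal):
    return max(trajs, key=lambda t: score(t, obstacles, goal))

trajs = generate_trajectories((0.0, 0.0), speed=5.0, horizon_s=2.0,
                              headings_deg=[-30, 0, 30])
best = select_optimal(trajs, obstacles=[(5.0, 0.0)], goal=(10.0, 0.0))
```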
FIGS. 3B-3E are diagrams depicting examples of sensor field redundancy and an autonomous vehicle's adaptation to a loss of a sensor field, according to some examples. Diagram 391 of FIG. 3B depicts a sensor field 301a in which sensor 310a detects objects (e.g., for determining range or distance, or other information). While sensor 310a may implement any type of sensor or sensor modality, sensor 310a and similarly described sensors, such as sensors 310b, 310c, and 310d, may include lidar devices. Thus, sensor fields 301a, 301b, 301c, and 301d each include a field into which the lasers extend. Diagram 392 of FIG. 3C depicts four overlapping sensor fields, each of which is generated by a corresponding lidar sensor 310 (not shown). As shown, portions 301 of the sensor fields include no overlapping sensor fields (e.g., a single lidar field), portions 302 of the sensor fields include two overlapping sensor fields, and portions 303 include three overlapping sensor fields, whereby the sensors provide multiple levels of redundancy should a lidar sensor fail.
FIG. 3D depicts a loss of a sensor field due to failed operation of lidar 309, according to some examples. The sensor field 302 of FIG. 3C is transformed into a single sensor field 305, one of the sensor fields 301 of FIG. 3C is lost at gap 304, and three of the sensor fields 303 of FIG. 3C are transformed into sensor fields 306 (i.e., limited to two overlapping fields). Should autonomous vehicle 330c be traveling in the direction of travel 396, the sensor field in front of the moving autonomous vehicle may be less robust than the one at the trailing end portion. According to some examples, an autonomous vehicle controller (not shown) is configured to leverage the bi-directional nature of autonomous vehicle 330c to address the loss of sensor field at the leading area in front of the vehicle. FIG. 3E depicts a bi-directional maneuver for restoring a certain degree of robustness of the sensor field in front of autonomous vehicle 330d. As shown, a more robust sensor field 302 is disposed at the rear of vehicle 330d, coextensive with taillights 348. When convenient, autonomous vehicle 330d performs the bi-directional maneuver by pulling into roadway 397 and switching its directionality such that taillights 348 actively switch to the other side (e.g., the trailing edge) of autonomous vehicle 330d. As shown, autonomous vehicle 330d regains a robust sensor field 302 in front of the vehicle as it travels along direction of travel 398. Further, the above-described bi-directional maneuver obviates the need for a more complicated maneuver that would require backing up into a busy roadway.
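The decision to switch travel direction when the trailing end retains better sensor coverage can be sketched as follows; the sensor identifiers, the region-to-sensor mapping, and the count-based redundancy measure are assumptions for illustration only.

```python
from typing import Dict, List

# Hypothetical layout: which lidar units contribute coverage to the leading and
# trailing regions of the vehicle (identifiers are made up).
REGION_SENSORS: Dict[str, List[str]] = {
    "front": ["lidar_fl", "lidar_fr"],
    "rear":  ["lidar_rl", "lidar_rr"],
}

def redundancy(region: str, healthy: Dict[str, bool]) -> int:
    """Number of still-operating sensors covering a region (cf. the one-, two-,
    and three-fold overlap portions 301, 302, and 303)."""
    return sum(healthy[s] for s in REGION_SENSORS[region])

def should_switch_direction(healthy: Dict[str, bool]) -> bool:
    """Switch the bi-directional vehicle's travel direction when the trailing
    end currently has strictly better sensor redundancy than the leading end."""
    return redundancy("rear", healthy) > redundancy("front", healthy)

# One front lidar has failed (like lidar 309 in FIG. 3D); the rear is intact.
health = {"lidar_fl": False, "lidar_fr": True, "lidar_rl": True, "lidar_rr": True}
print(should_switch_direction(health))   # -> True
```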
Fig. 4 is a functional block diagram depicting a system including an autonomous vehicle service platform communicatively coupled to an autonomous vehicle controller via a communication layer, according to some examples. The diagram 400 depicts an autonomous vehicle controller ("AV") 447 disposed in an autonomous vehicle 430, the autonomous vehicle 430 in turn comprising a plurality of sensors 470 coupled to the autonomous vehicle controller 447. The sensors 470 include one or more lidar devices 472, one or more cameras 474, one or more radars 476, one or more global positioning system ("GPS") data receiver-sensors, one or more inertial measurement units ("IMUs") 475, one or more ranging sensors 477 (e.g., wheel encoder sensors, wheel speed sensors, etc.), and any other suitable sensors 478, such as infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic-energy-based sensors), radio-frequency-based sensors, and the like. In some cases, a wheel angle sensor configured to sense a steering angle of the wheels may be included as a ranging sensor 477 or a suitable sensor 478. In a non-limiting example, autonomous vehicle controller 447 may include four or more lidars 472, sixteen or more cameras 474, and four or more radar units 476. Further, the sensors 470 may be configured to provide sensor data to components of the autonomous vehicle controller 447 as well as to elements of the autonomous vehicle service platform 401. As shown in diagram 400, the autonomous vehicle controller 447 includes a planner 464, a motion controller 462, a localizer 468, a perception engine 466, and a local map generator 440. It should be noted that the elements depicted in diagram 400 of fig. 4 may include the structure and/or functionality of similarly-named elements described in connection with one or more other diagrams.
The localizer 468 is configured to localize the autonomous vehicle 430 (i.e., determine a local pose) relative to reference data, which may include map data, route data (e.g., road network data, such as RNDF-like data), and the like. In some cases, the localizer 468 is configured to identify, for example, a point in space that may represent the location of the autonomous vehicle 430 relative to features of a representation of the environment. The localizer 468 is shown as including a sensor data integrator 469, which can be configured to integrate multiple subsets of sensor data (e.g., of different sensor modalities) to reduce the uncertainty associated with each individual sensor type. According to some examples, the sensor data integrator 469 is configured to fuse sensor data (e.g., lidar data, camera data, radar data, etc.) to form integrated sensor data values for use in determining a local pose. According to some examples, the localizer 468 retrieves reference data from a reference data repository 405, which includes a map data repository 405a for storing 2D map data, 3D map data, 4D map data, and the like. The localizer 468 may be configured to identify at least a subset of features in the environment to match against map data to identify or otherwise confirm the pose of the autonomous vehicle 430. According to some examples, the localizer 468 may be configured to identify any number of features in the environment, such that the set of features can be one or more features, or all features. In a particular example, any amount of lidar data (e.g., most or substantially all lidar data) may be compared against data representing a map for localization purposes. Generally, non-matched objects resulting from the comparison of environmental features against map data may be dynamic objects, such as vehicles, cyclists, pedestrians, and the like. Note that the detection of dynamic objects, including obstacles, may be performed with or without map data. In particular, dynamic objects may be detected and tracked independently of (i.e., without) map data. In some cases, the 2D map data and the 3D map data may be regarded as "global map data," or map data that has been validated by the autonomous vehicle service platform 401 at a point in time. Because the map data in the map data repository 405a may be updated and/or validated only periodically, a discrepancy may exist between the map data and the actual environment in which the autonomous vehicle is located. Thus, the localizer 468 can retrieve locally derived map data generated by the local map generator 440 to enhance localization. The local map generator 440 is configured to generate local map data in real time or near real time. Optionally, the local map generator 440 may receive static and dynamic object map data to enhance the accuracy of the locally generated map by, for example, disregarding dynamic objects in the locality. According to at least some embodiments, the local map generator 440 can be integrated with, or form part of, the localizer 468. In at least one case, the local map generator 440, either individually or in cooperation with the localizer 468, can be configured to generate map and/or reference data based on simultaneous localization and mapping ("SLAM") or the like.
It should be noted that the localizer 468 may implement a "hybrid" approach to using map data, whereby logic in the localizer 468 may be configured to select various amounts of map data from the map data store 405a or local map data from the local map generator 440, depending on the reliability of each source of map data. Thus, the localizer 468 may still use obsolete map data in view of locally generated map data.
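A minimal sketch of such a hybrid weighting, assuming simple per-source reliability scores (the function name and weighting scheme are illustrative, not the patented logic):

    # Illustrative sketch of the "hybrid" idea above: weight global map data
    # against locally generated map data according to per-source reliability.
    # The weighting scheme and argument names are assumptions.

    def blend_map_sources(global_map_reliability: float,
                          local_map_reliability: float):
        """Return normalized weights for global vs. locally generated map data."""
        total = global_map_reliability + local_map_reliability
        if total == 0:
            raise ValueError("at least one map source must be usable")
        return {"global_map_weight": global_map_reliability / total,
                "local_map_weight": local_map_reliability / total}

    # Example: global map tiles are somewhat stale, the local SLAM map is fresh.
    print(blend_map_sources(global_map_reliability=0.4, local_map_reliability=0.9))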
The perception engine 466 is configured to assist the planner 464 in planning routes and generating trajectories, for example, by identifying objects of interest in the surrounding environment in which the autonomous vehicle 430 is traveling. Furthermore, a probability may be associated with each object of interest, whereby the probability may represent the likelihood that the object of interest may be a threat to safe travel (e.g., a fast-moving motorcycle may require enhanced tracking compared to a person sitting on a bus stop bench reading a newspaper). As shown, the perception engine 466 includes an object detector 442 and an object classifier 444. The object detector 442 is configured to distinguish objects from other features in the environment, and the object classifier 444 may be configured to classify objects as dynamic or static objects and track the positions of the dynamic and static objects relative to the autonomous vehicle 430 for planning purposes. Further, the perception engine 466 may be configured to assign an identifier to a static or dynamic object that specifies whether the object is (or has the potential to be) an obstacle that may affect path planning at the planner 464. Although not shown in fig. 4, it is noted that the perception engine 466 may perform other perception-related functions, such as segmentation and tracking, examples of which are described below.
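As a purely illustrative sketch of the classification and tracking-priority idea (the labels, thresholds, and field names are hypothetical):

    # Hypothetical sketch: classify detected objects as static or dynamic and
    # attach a probability used to prioritize tracking, echoing the motorcycle
    # vs. seated-person example above. Thresholds and labels are illustrative.
    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        label: str
        speed_mps: float
        threat_probability: float  # likelihood the object threatens safe travel

    def classify(obj: DetectedObject) -> str:
        return "dynamic" if obj.speed_mps > 0.2 else "static"

    def tracking_priority(obj: DetectedObject) -> str:
        if classify(obj) == "dynamic" and obj.threat_probability > 0.5:
            return "enhanced"
        return "standard"

    motorcycle = DetectedObject("motorcycle", speed_mps=15.0, threat_probability=0.8)
    seated_person = DetectedObject("person on bench", speed_mps=0.0, threat_probability=0.05)
    print(tracking_priority(motorcycle), tracking_priority(seated_person))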
The planner 464 is configured to generate a plurality of candidate trajectories for the purpose of reaching a destination via a plurality of available paths or routes. The trajectory evaluator 465 is configured to evaluate the candidate trajectories and identify which subsets of candidate trajectories are associated with a higher confidence level of providing a collision-free path to the destination. In this way, the trajectory evaluator 465 can select an optimal trajectory based on relevant criteria and cause commands to generate control signals for the vehicle components 450 (e.g., actuators or other mechanisms). It should be noted that the relevant criteria may include any number of factors defining an optimal trajectory, the selection of which need not be limited to reducing collisions. For example, the selection of a trajectory may be made to optimize the user experience (e.g., user comfort) as well as to provide collision-free trajectories that comply with traffic regulations and laws. The user experience may be optimized by moderating acceleration (e.g., reducing jerky or other unpleasant movement) in the various linear and angular directions. In some cases, at least part of the relevant criteria can specify which other criteria to override or replace while maintaining optimal collision-free travel. For example, legal constraints may be temporarily lifted or relaxed when generating a trajectory under constrained circumstances (e.g., crossing a double yellow line to travel around a cyclist, or traveling at a speed higher than the posted speed limit to match the flow of traffic). In this way, the control signals are configured to cause changes in propulsion and direction at the drivetrain and/or wheels. In this example, the motion controller 462 is configured to translate commands into control signals (e.g., speed, wheel angle, etc.) for controlling the mobility of the autonomous vehicle 430. Where the trajectory evaluator 465 has insufficient information to ensure a confidence level high enough to provide collision-free, optimized travel, the planner 464 can generate a request to the remote operator 404 for remote operator support.
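The following hypothetical sketch illustrates a trajectory evaluation of this kind, trading a collision-free confidence threshold against a crude comfort (jerk) penalty and deferring to remote operation when no candidate qualifies; all weights, thresholds, and names are assumptions:

    # Illustrative sketch: prefer the smoothest trajectory among those that
    # clear a collision-free confidence threshold; return None when a
    # teleoperation request would instead be issued.
    from typing import List, Optional

    def comfort_penalty(accelerations: List[float]) -> float:
        """Penalize abrupt changes in acceleration (a crude jerk measure)."""
        return sum(abs(accelerations[i + 1] - accelerations[i])
                   for i in range(len(accelerations) - 1))

    def evaluate(candidates, confidence_threshold=0.9) -> Optional[dict]:
        viable = [c for c in candidates if c["collision_free_prob"] >= confidence_threshold]
        if not viable:
            return None  # caller would issue a remote operation request
        return min(viable, key=lambda c: comfort_penalty(c["accelerations"]))

    candidates = [
        {"id": "t1", "collision_free_prob": 0.95, "accelerations": [0.0, 0.5, 0.4]},
        {"id": "t2", "collision_free_prob": 0.97, "accelerations": [0.0, 1.5, -1.0]},
    ]
    best = evaluate(candidates)
    print("request remote operation" if best is None else best["id"])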
Autonomous vehicle service platform 401 includes a remote operator 404 (e.g., a remote operator computing device), a reference data repository 405, a map updater 406, a vehicle data controller 408, a calibrator 409, and an offline object classifier 410. It is noted that each element of the autonomous vehicle service platform 401 may be independently disposed or distributed and in communication with other elements in the autonomous vehicle service platform 401. Further, elements of the autonomous vehicle service platform 401 may independently communicate with the autonomous vehicle 430 via the communication layer 402. The map updater 406 is configured to receive map data (e.g., from the local map generator 440, the sensors 470, or any other component of the autonomous vehicle controller 447), and is also configured to detect deviations between, for example, the map data in the map data repository 405a and a locally generated map. The vehicle data controller 408 enables the map updater 406 to update reference data within the repository 405 and facilitates updating of 2D, 3D, and/or 4D map data. In some cases, the vehicle data controller 408 can control the rate at which local map data is received into the autonomous vehicle service platform 401 and the frequency at which the map updater 406 performs updates of the map data.
The calibrator 409 is configured to perform calibration of various sensors of the same or different types. The calibrator 409 may be configured to determine the relative poses of the sensors (e.g., in Cartesian space (x, y, z)) and the orientations of the sensors (e.g., roll, pitch, and yaw). The pose and orientation of sensors such as cameras, lidar sensors, radar sensors, etc. may be calibrated relative to other sensors, as well as globally relative to the vehicle's frame of reference. Offline self-calibration can also calibrate or estimate other parameters, such as the vehicle inertia tensor, wheelbase, wheel radius, or surface road friction. According to some examples, calibration can also be performed online to detect parameter changes. It is also noted that calibration by the calibrator 409 may include intrinsic parameters of the sensors (e.g., optical distortion, beam angle, etc.) and may include extrinsic parameters. In some cases, as an example, calibration may be performed by maximizing a correlation between depth discontinuities in 3D laser data and edges of image data. The offline object classifier 410 is configured to receive data, such as sensor data, from the sensors 470 or any other component of the autonomous vehicle controller 447. According to some embodiments, an offline classification pipeline of the offline object classifier 410 may be configured to pre-collect and annotate objects (e.g., manually by a person and/or automatically using an offline labeling algorithm), and may also be configured to train an online classifier (e.g., object classifier 444) that can provide real-time classification of object types during online autonomous operation.
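As a toy, one-dimensional illustration of the depth-discontinuity/image-edge correlation idea (real extrinsic calibration optimizes a full six-degree-of-freedom pose; the data, shift parameter, and search range below are invented):

    # Hypothetical 1-D sketch: search for the shift that best aligns depth
    # discontinuities (from laser data) with edges in an image row by
    # maximizing a simple correlation score.

    def discontinuities(signal):
        return [abs(signal[i + 1] - signal[i]) for i in range(len(signal) - 1)]

    def correlation(a, b):
        n = min(len(a), len(b))
        return sum(a[i] * b[i] for i in range(n))

    def calibrate_offset(depth_profile, image_row, max_shift=3):
        """Find the shift that best aligns depth edges with image edges."""
        depth_edges = discontinuities(depth_profile)
        image_edges = discontinuities(image_row)
        best_offset, best_score = 0, float("-inf")
        for offset in range(-max_shift, max_shift + 1):
            if offset >= 0:
                score = correlation(depth_edges[offset:], image_edges)
            else:
                score = correlation(depth_edges, image_edges[-offset:])
            if score > best_score:
                best_offset, best_score = offset, score
        return best_offset

    depth = [10, 10, 10, 2, 2, 2, 8, 8]      # invented depth profile
    pixels = [100, 100, 100, 100, 20, 20, 20, 90]  # invented image row
    print(calibrate_offset(depth, pixels))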
FIG. 5 is an example of a flow chart for controlling an autonomous vehicle according to some embodiments. At 502, the process 500 begins when sensor data from multiple modalities of sensors at an autonomous vehicle is received, for example, by an autonomous vehicle controller. One or more subsets of the sensor data may be integrated to generate fused data to improve, for example, estimation. In some examples, sensor streams of one or more sensors (e.g., of the same or different modalities) may be fused at 504 to form fused sensor data. In some examples, subsets of lidar sensor data and camera sensor data may be fused at 504 to facilitate localization. At 506, data representing objects based on at least two subsets of the sensor data may be derived at a processor. For example, data identifying a static object or a dynamic object may be derived (e.g., at a perception engine) from at least the lidar and camera data. At 508, a detected object is determined to affect a planned path, and at 510, a subset of trajectories is evaluated (e.g., at a planner) in response to the detected object. At 512, a confidence level is determined to lie outside a range of acceptable confidence levels associated with normal operation of the autonomous vehicle. In this case, therefore, the confidence level may reflect reduced certainty in selecting an optimized path, whereby an optimized path may be determined as a function of factors such as facilitating collision-free travel, conforming to traffic laws, providing a comfortable user experience (e.g., a comfortable ride), and/or any other factor used in generating candidate trajectories. Thus, a request for an alternate path may be sent to a remote operator computing device at 514. Thereafter, the remote operator computing device may provide the planner with an optimal trajectory over which the autonomous vehicle travels. In some circumstances, the vehicle may also determine that performing a safe-stop maneuver is the best course of action (e.g., safely and automatically stopping the autonomous vehicle at a location with a relatively low probability of risk). It should be noted that the order depicted in this and other flow diagrams is not intended to imply that the various functions must be performed linearly, as each portion of a flow diagram may be performed serially or in parallel with any one or more other portions of the flow diagram, and may be performed independently of, or dependent upon, other portions of the flow diagram.
Fig. 6 is a diagram depicting an example of an architecture for an autonomous vehicle controller, in accordance with some embodiments. The diagram 600 depicts a number of processes including a motion controller process 662, a planner process 664, a perception process 666, a mapping process 640, and a localization process 668, some of which may generate or receive data relative to other processes. Other processes, such as those of blocks 670 and 650, may facilitate interaction with one or more mechanical components of the autonomous vehicle. For example, the perception process 666, the mapping process 640, and the localization process 668 are configured to receive sensor data from sensors 670, while the planner process 664 and the perception process 666 are configured to receive guidance data 606, which may include route data, such as road network data. Further to diagram 600, the localization process 668 is configured to receive map data 605a (i.e., 2D map data), map data 605b (i.e., 3D map data), and local map data 642, as well as other types of map data. For example, the localization process 668 may also receive other forms of map data, such as 4D map data, which may include, for example, an epoch determination. The localization process 668 is configured to generate local position data 641 representing a local pose. The local position data 641 is provided to the motion controller process 662, the planner process 664, and the perception process 666. The perception process 666 is configured to generate static and dynamic object map data 667, which in turn can be sent to the planner process 664. In some examples, the static and dynamic object map data 667 may be sent along with other data, such as semantic classification information and predicted object behavior. The planner process 664 is configured to generate trajectory data 665 describing a plurality of trajectories generated by the planner 664. The motion controller process 662 uses the trajectory data 665 to generate low-level commands or control signals to apply to actuators 650 to cause changes in steering angle and/or speed.
Fig. 7 is a diagram depicting an example of an autonomous vehicle service platform implementing redundant communication channels to maintain reliable communication with a fleet of autonomous vehicles, in accordance with some embodiments. The illustration 700 depicts an autonomous vehicle service platform 701 comprising a reference data generator 705, a vehicle data controller 702, an autonomous vehicle fleet manager 703, a remote operator manager 707, a simulator 740, and a policy manager 742. The reference data generator 705 is configured to generate and modify map data and route data (e.g., RNDF data). Further, the reference data generator 705 may be configured to access 2D maps in the 2D map data repository 720, access 3D maps in the 3D map data repository 722, and access route data in the route data repository 724. Other map representation data and repositories may be implemented in some examples, such as 4D map data including epoch determinations. The vehicle data controller 702 may be configured to perform various operations. For example, the vehicle data controller 702 may be configured to vary the rate at which data is exchanged between the fleet of autonomous vehicles and the platform 701 based on the quality level of communication over channel 770. For example, during bandwidth-constrained periods, data communications may be prioritized such that teleoperation requests from the autonomous vehicle 730 are given high priority to ensure delivery. In addition, varying levels of data abstraction may be sent from the vehicle over channel 770, depending on the bandwidth available for a particular channel. For example, full lidar data (e.g., substantially all lidar data, but possibly less) may be transmitted in the presence of a robust network connection, while simpler or more abstract depictions of the data (e.g., bounding boxes with associated metadata, etc.) may be transmitted in the presence of a degraded or low-speed connection. The autonomous vehicle fleet manager 703 is configured to coordinate the dispatch of the autonomous vehicles 730 to optimize multiple variables, including the effective use of battery power, travel times, whether an air conditioning unit in an autonomous vehicle 730 may be used during a low state of charge of the battery, etc., any or all of which may be monitored in view of optimizing the costs associated with operating the autonomous vehicle service. Algorithms may be implemented to analyze various variables to minimize the travel time or cost for the fleet of autonomous vehicles. In addition, to maximize the uptime of the fleet, the autonomous vehicle fleet manager 703 maintains an inventory of autonomous vehicles and parts for accommodating service schedules.
Remote operator manager 707 is configured to manage a plurality of remote operator computing devices 704 used by remote operators 708 to provide input. Simulator 740 is configured to simulate the operation of one or more autonomous vehicles 730, as well as the interaction between the remote operator manager 707 and an autonomous vehicle 730. Simulator 740 may also simulate the operation of a plurality of sensors disposed in the autonomous vehicle 730 (including the introduction of simulated noise). Further, an environment such as a city may be simulated such that a simulated autonomous vehicle can be introduced into the synthetic environment, whereby simulated sensors may receive simulated sensor data, such as simulated laser returns. Simulator 740 may also provide other functions, including validating software updates and/or map data. The policy manager 742 is configured to maintain data representing policies or rules that the autonomous vehicles should adhere to in view of various conditions or events encountered while traveling over a network of roadways. In some cases, updated policies and/or rules may be simulated in the simulator 740 to confirm safe operation of the fleet of autonomous vehicles in view of the policy changes. Some of the above elements of the autonomous vehicle service platform 701 are further described below.
The communication channel 770 is configured to provide a networked communication link between the fleet of autonomous vehicles 730 and the autonomous vehicle service platform 701. For example, the communication channel 770 includes a plurality of different types of networks 771, 772, 773, and 774 having corresponding subnetworks (e.g., 771a through 771 n) to ensure a level of redundancy for reliably operating the autonomous vehicle service. For example, different types of networks in the communication channel 770 may include different cellular network providers, different types of data networks, etc., to ensure adequate bandwidth in the event of reduced or lost communication caused by an interruption in one or more of the networks 771, 772, 773, and 774.
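A hypothetical sketch of such failover together with the bandwidth-dependent data abstraction mentioned earlier (the network names, availability flags, and thresholds are invented for illustration):

    # Illustrative sketch of channel redundancy: try networks in priority order,
    # fall back when one is unavailable, then choose a data abstraction level
    # that matches the surviving link's bandwidth (full lidar vs. bounding boxes).

    NETWORKS = [  # (name, available, bandwidth in Mbps) -- assumed values
        ("cellular_provider_A", False, 50.0),
        ("cellular_provider_B", True, 4.0),
        ("satellite_link", True, 1.0),
    ]

    def select_channel(networks):
        for name, available, bandwidth in networks:
            if available:
                return name, bandwidth
        raise RuntimeError("no communication channel available")

    def abstraction_level(bandwidth_mbps):
        return "full_lidar_stream" if bandwidth_mbps >= 20.0 else "bounding_boxes_with_metadata"

    name, bandwidth = select_channel(NETWORKS)
    print(name, abstraction_level(bandwidth))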
Fig. 8 is a diagram depicting an example of a messaging application configured to exchange data between various applications, in accordance with some embodiments. The diagram 800 depicts a remote operator application 801 disposed in a remote operator manager and an autonomous vehicle application 830 disposed in an autonomous vehicle, whereby the remote operator application 801 and the autonomous vehicle application 830 exchange message data via protocols that facilitate communication over various networks, such as networks 871 and 872, and other networks 873. According to some examples, the communication protocol is implemented as a Data Distribution Service™ having a specification maintained by the Object Management Group consortium. According to the communication protocol, the remote operator application 801 and the autonomous vehicle application 830 may each include a message router 854 disposed in a message domain, the message router being configured to interface with the remote operator API 852. In some examples, the message router 854 is a routing service. In some examples, the message domain 850a in the remote operator application 801 may be identified by a remote operator identifier, while the message domain 850b may be identified as a domain associated with a vehicle identifier. The remote operator API 852 in the remote operator application 801 is configured to interface with remote operator processes 803a through 803c, whereby remote operator process 803b is associated with an autonomous vehicle identifier 804 and remote operator process 803c is associated with an event identifier 806 (e.g., an identifier specifying an intersection that may be problematic for collision-free path planning). The remote operator API 852 in the autonomous vehicle application 830 is configured to interface with an autonomous vehicle operating system 840, which includes a sensing application 842, a perception application 844, a localization application 846, and a control application 848. In view of the foregoing, the above-described communication protocol may facilitate data exchange to facilitate remote operation as described herein. Furthermore, the above-described communication protocol may be adapted to provide secure data exchange between one or more autonomous vehicles and one or more autonomous vehicle service platforms. For example, the message routers 854 may be configured to encrypt and decrypt messages to provide protected interaction between, for example, a remote operator process 803 and the autonomous vehicle operating system 840.
Fig. 9 is a diagram depicting types of data for facilitating remote operation using the communication protocol depicted in fig. 8, in accordance with some examples. The diagram 900 depicts a remote operator 908 interfacing with a remote operator computing device 904 coupled to a remote operator application 901, the remote operator application 901 being configured to exchange data via a data-centric messaging bus 972 implemented in one or more networks 971. The data-centric messaging bus 972 provides a communication link between the remote operator application 901 and the autonomous vehicle application 930. The remote operator API 962 of the remote operator application 901 is configured to receive message service configuration data 964 and route data 960, such as road network data (e.g., RNDF-like data), task data (e.g., MDF data), and the like. Similarly, a messaging service bridge 932 is also configured to receive messaging service configuration data 934. Messaging service configuration data 934 and 964 provide configuration data with which to configure the messaging service between the remote operator application 901 and the autonomous vehicle application 930. An example of messaging service configuration data 934 and 964 includes quality of service ("QoS") configuration data implemented to configure a Data Distribution Service™ application.
An example of facilitating data exchange via a communication protocol for remote operation is described below. Consider that obstacle data 920 is generated by a perception system of an autonomous vehicle controller. Further, planner option data 924 is generated by a planner to notify a remote operator of a subset of candidate trajectories, and position data 926 is generated by a localizer. The obstacle data 920, the planner option data 924, and the position data 926 are sent to a messaging service bridge 932, which, in accordance with the message service configuration data 934, generates telemetry data 940 and query data 942, both of which are sent via the data-centric messaging bus 972 to the remote operator application 901 as telemetry data 950 and query data 952. The remote operator API 962 receives the telemetry data 950 and the query data 952, which are in turn processed in view of the route data 960 and the message service configuration data 964. The resulting data is then presented to the remote operator 908 via the remote operator computing device 904 and/or a collaborating display (e.g., a dashboard display visible to the remote operator 908). The remote operator 908 reviews the candidate trajectory options presented on the display of the remote operator computing device 904 and selects a guided trajectory, whereupon the remote operator computing device 904 generates command data 982 and query response data 980, both of which are passed through the remote operator API 962 as query response data 954 and command data 956. In turn, the query response data 954 and the command data 956 are transmitted via the data-centric messaging bus 972 into the autonomous vehicle application 930 as query response data 944 and command data 946. The messaging service bridge 932 receives the query response data 944 and the command data 946 and generates remote operator command data 928, which is configured to generate a remote-operator-selected trajectory for implementation by the planner. It should be noted that the above-described messaging processes are not intended to be limiting, and other messaging protocols may be implemented as well.
FIG. 10 is a diagram illustrating an example of a remote operator interface that a remote operator may use to affect path planning, in accordance with some embodiments. The illustration 1000 depicts an example of an autonomous vehicle 1030 in communication with an autonomous vehicle service platform 1001, the autonomous vehicle service platform 1001 including a remote operator manager 1007 configured to facilitate remote operation. In a first example, the remote operator manager 1007 receives data that prompts the remote operator 1008 to preferentially view paths of autonomous vehicles approaching areas of low planner confidence levels or potential obstacles, so that the remote operator 1008 can address the issue in advance. For example, consider that an intersection an autonomous vehicle is approaching may be marked as problematic. As such, the user interface 1010 displays a representation 1014 of a corresponding autonomous vehicle 1030 traveling along a path 1012, the path 1012 having been predicted from a plurality of trajectories generated by the planner. Also shown are other vehicles 1011 and dynamic objects 1013, such as pedestrians, which may cause sufficient confusion at the planner to require remote operation support. The user interface 1010 also presents the current speed 1022, the speed limit 1024, and the current charge 1026 in the battery to the remote operator 1008. According to some examples, the user interface 1010 may display other data, such as sensor data collected from the autonomous vehicle 1030. In a second example, consider that the planner 1064 has generated a number of trajectories coextensive with a planner-generated path 1044, notwithstanding a detected unidentified object 1046. The planner 1064 may also generate a subset of candidate trajectories 1040, but in this example the planner cannot proceed at the current confidence level. If the planner 1064 is unable to determine an alternate path, a remote operation request may be sent. In this case, the remote operator may select one of the candidate trajectories 1040 to facilitate travel of the autonomous vehicle 1030 consistent with the remote-operator-based path 1042.
FIG. 11 is a diagram depicting an example of a planner configured to initiate remote operation, according to some examples. Diagram 1100 depicts a planner 1164 including a terrain manager 1110, a route manager 1112, a path generator 1114, a trajectory evaluator 1120, and a trajectory tracker 1128. The terrain manager 1110 is configured to receive map data, such as 3D map data or other similar map data specifying terrain features. The terrain manager 1110 is also configured to identify candidate paths based on terrain-related features along a path to the destination. According to various examples, the terrain manager 1110 receives 3D maps generated by sensors associated with one or more autonomous vehicles in the fleet. The route manager 1112 is configured to receive environmental data 1103, which may include traffic-related information associated with one or more routes that may be selected as a path to the destination. The path generator 1114 accepts data from the terrain manager 1110 and the route manager 1112 and generates one or more paths or path segments suitable for guiding the autonomous vehicle toward the destination. Data representing the one or more paths or path segments is sent to the trajectory evaluator 1120.
The trajectory evaluator 1120 includes a state and event manager 1122, which in turn may include a confidence level generator 1123. The trajectory evaluator 1120 also includes a guided trajectory generator 1126 and a trajectory generator 1124. In addition, the planner 1164 is configured to receive policy data 1130, perception engine data 1132, and localizer data 1134.
According to some examples, the policy data 1130 may include criteria used by the planner 1164 to determine a path having a confidence level sufficient to generate trajectories. Examples of policy data 1130 include a policy specifying that trajectory generation be bounded by a stand-off distance to external objects (e.g., maintaining a safety buffer of, where possible, at least 3 feet from a cyclist), a policy requiring that trajectories must not cross a center double yellow line, a policy requiring that trajectories be limited to a single lane on a 4-lane roadway (e.g., based on past events, such as people typically congregating in the lane closest to a bus stop), and any other similar criteria specified by policies. The perception engine data 1132 includes maps of the locations of static and dynamic objects of interest, and the localizer data 1134 includes at least a local pose or position.
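By way of illustration, policies of this kind could be modeled as predicates that every candidate trajectory must satisfy; the field names and the particular checks below are assumptions mirroring the examples above:

    # Illustrative sketch of applying policy data as trajectory filters: each
    # policy is a predicate that a candidate trajectory must satisfy.

    def min_standoff_ok(trajectory, min_feet=3.0):
        return trajectory["min_distance_to_cyclist_ft"] >= min_feet

    def no_double_yellow_crossing(trajectory):
        return not trajectory["crosses_double_yellow"]

    POLICIES = [min_standoff_ok, no_double_yellow_crossing]

    def compliant(trajectory, policies=POLICIES):
        return all(policy(trajectory) for policy in policies)

    candidate = {"min_distance_to_cyclist_ft": 4.2, "crosses_double_yellow": False}
    print(compliant(candidate))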
The state and event manager 1122 may be configured to probabilistically determine an operating state of the autonomous vehicle. For example, a first operational state (i.e., "normal operation") may describe a situation in which trajectories are collision-free, while a second operational state (i.e., "abnormal operation") may describe another situation in which the confidence level associated with the possible trajectories is insufficient to guarantee collision-free travel. According to some examples, the state and event manager 1122 is configured to use the perception data 1132 to determine whether the state of the autonomous vehicle is normal or abnormal. The confidence level generator 1123 may be configured to analyze the perception data 1132 to determine the state of the autonomous vehicle. For example, the confidence level generator 1123 may use semantic information associated with static and dynamic objects, as well as associated probability estimates, to increase the planner 1164's degree of certainty in determining a safe course of action. For example, the planner 1164 may use perception engine data 1132 specifying the probability that an object is or is not a person to determine whether the planner 1164 can operate safely (e.g., the planner 1164 may receive a degree of certainty that an object has a 98% probability of being a person and a 2% probability of not being a person).
Once it is determined that the confidence level (e.g., based on statistical and probabilistic determinations) is below a threshold required for predicted safe operation, a relatively low confidence level (e.g., a single probabilistic score) may trigger the planner 1164 to send a request 1135 for remote operation support to the autonomous vehicle service platform 1101. In some cases, telemetry data and a set of candidate trajectories may accompany the request. Examples of telemetry data include sensor data, localization data, perception data, and the like. The remote operator 1108 may send a selected trajectory 1137 to the guided trajectory generator 1126 via the remote operator computing device 1104. Thus, the selected trajectory 1137 is a trajectory formed with guidance from the remote operator. Upon confirming that the state has not changed (e.g., that the abnormal state has not been resolved), the guided trajectory generator 1126 passes data to the trajectory generator 1124, which in turn causes the trajectory tracker 1128, acting as a trajectory tracking controller, to use the remote-operator-specified trajectory to generate control signals 1170 (e.g., steering angle, speed, etc.). It is noted that the planner 1164 may trigger the transmission of the request 1135 for remote operation support before transitioning to a state other than the normal state. In particular, the autonomous vehicle controller and/or its components can predict that a distant obstacle may be problematic and preemptively cause the planner 1164 to initiate remote operation before the autonomous vehicle reaches the obstacle. Otherwise, upon encountering the obstacle or scenario, the autonomous vehicle may incur a delay by transitioning to a safe state (e.g., pulling over and off the roadway). In another example, remote operation may be initiated automatically before the autonomous vehicle approaches a particular location that is known to be difficult to navigate. This determination may optionally take into account other factors, including the time of day and the position of the sun, if such circumstances are likely to interfere with the reliability of sensor readings, as well as traffic or accident data derived from various sources.
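A minimal sketch of this trigger, assuming a single probabilistic confidence score, an assumed threshold, and a hypothetical request structure:

    # Illustrative sketch: when the confidence score drops below the level
    # assumed necessary for safe operation, package telemetry and candidate
    # trajectories into a remote operation support request.

    def maybe_request_teleoperation(confidence: float,
                                    candidate_trajectories,
                                    telemetry,
                                    threshold: float = 0.9):
        if confidence >= threshold:
            return None  # remain in the normal operational state
        return {
            "type": "teleoperation_request",
            "confidence": confidence,
            "candidate_trajectories": candidate_trajectories,
            "telemetry": telemetry,  # e.g., sensor, localization, perception data
        }

    request = maybe_request_teleoperation(
        confidence=0.72,
        candidate_trajectories=["traj_A", "traj_B"],
        telemetry={"local_pose": (12.3, 4.5, 0.02)},
    )
    print(request["type"] if request else "normal operation")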
Fig. 12 is an example of a flow chart for controlling an autonomous vehicle, according to some embodiments. At 1202, flow 1200 begins. Data representing a subset of objects is received at a planner in the autonomous vehicle, the subset of objects including at least one object associated with data representing a degree of certainty for a classification type. For example, the perception engine data may include metadata associated with the object, whereby the metadata specifies a degree of certainty associated with a particular classification type. For example, a dynamic object may be classified as an "elderly pedestrian" with an 85% confidence level of being correct. At 1204, localizer data may be received (e.g., at the planner). The localizer data can include map data generated locally within the autonomous vehicle. The local map data may specify a degree of certainty (including a degree of uncertainty) that an event may occur in a geographic area. An event may be a condition or situation that affects, or potentially affects, the operation of the autonomous vehicle. The event may be internal to the autonomous vehicle (e.g., a failed or impaired sensor) or external (e.g., a roadway obstruction). Examples of events are described herein, such as in fig. 2 and in other figures and passages. A path coextensive with the geographic area of interest may be determined at 1206. For example, consider that the event is the position of the sun in the sky at a time of day at which the intensity of the sunlight impairs drivers' vision during rush hour traffic. As such, traffic may be expected or predicted to slow down in response to the bright sunlight. Accordingly, the planner may preferentially initiate remote operation if an alternate path that avoids the event is unlikely. At 1208, a local position is determined at the planner based on local pose data. At 1210, an operating state of the autonomous vehicle may be determined (e.g., probabilistically) based on the degree of certainty of the classification type and the degree of certainty of the event, which may be based on any number of factors, such as speed, position, and other state information. To illustrate, consider an example in which an elderly pedestrian is detected by the autonomous vehicle during an event in which the vision of other drivers is likely to be impaired by the sun, thereby causing an unsafe situation for the elderly pedestrian. Thus, a relatively unsafe situation can be detected as a probabilistic event that may occur (i.e., an unsafe situation for which remote operation can be initiated). At 1212, a likelihood that the operational state is a normal state is determined, and based on this determination, a message is sent to the remote operator computing device requesting remote operation to preempt a transition to a next operational state (e.g., to preempt a transition from a normal operational state to an abnormal operational state, such as an unsafe operational state).
FIG. 13 depicts an example in which a planner may generate trajectories according to some examples. The illustration 1300 includes a trajectory evaluator 1320 and a trajectory generator 1324. The trajectory evaluator 1320 includes a confidence level generator 1322 and a remote operator inquiry messenger 1329. As shown, the trajectory evaluator 1320 is coupled to the perception engine 1366 to receive the static map data 1301, as well as the current and predicted object state data 1303. The trajectory evaluator 1320 also receives the local pose data 1305 from the localizer 1368 and the planning data 1307 from the global planner 1369. In one operational state (e.g., abnormal), the confidence level generator 1322 receives static map data 1301 and current and predicted object data 1303. Based on this data, the confidence level generator 1322 may determine that the detected trajectory is associated with an unacceptable confidence level value. In this way, the confidence level generator 1322 sends detected trajectory data 1309 (e.g., data that includes candidate trajectories) to notify the remote operator via the remote operator inquiry messenger 1329, which in turn sends a request 1370 for remote operator assistance.
In another operational state (e.g., a normal state), the static map data 1301, the current and predicted object data 1303, the local pose data 1305, and the planning data 1307 (e.g., global planning data) are received into a trajectory calculator 1325, the trajectory calculator 1325 being configured to calculate trajectories (e.g., iteratively) to determine an optimal path or paths. Next, at least one path is selected and transmitted as selected path data 1311. According to some embodiments, the trajectory calculator 1325 is configured to implement re-planning of trajectories, as an example. The nominal driving trajectory generator 1327 is configured to generate trajectories in a refined manner, such as by generating trajectories based on a receding horizon control technique. The nominal driving trajectory generator 1327 may subsequently send nominal driving trajectory path data 1372 to, for example, a trajectory tracker or a vehicle controller to effect physical changes in steering, acceleration, and other components.
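As an illustrative, one-dimensional sketch of the receding horizon idea (plan over a short horizon, execute only the first command, then replan), with toy dynamics and invented parameter values:

    # Hypothetical sketch of receding horizon control in one dimension: at each
    # step, plan a short sequence of speed commands toward a target, execute
    # only the first command, then replan from the newly observed state.

    def plan_horizon(position, target, horizon=5, step=1.0):
        """Plan a sequence of speed commands that move toward the target."""
        commands, p = [], position
        for _ in range(horizon):
            error = target - p
            speed = max(-step, min(step, error))  # clamp to a comfortable speed
            commands.append(speed)
            p += speed
        return commands

    def receding_horizon_drive(position, target, iterations=8):
        trail = [position]
        for _ in range(iterations):
            commands = plan_horizon(position, target)
            position += commands[0]       # execute only the first command
            trail.append(round(position, 3))
        return trail

    print(receding_horizon_drive(position=0.0, target=6.5))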
Fig. 14 is a diagram depicting another example of an autonomous vehicle service platform, according to some embodiments. Illustration 1400 depicts an autonomous vehicle service platform 1401 that includes a remote operator manager 1407, the remote operator manager 1407 configured to manage interactions and/or communications between a remote operator 1408, a remote operator computing device 1404, and other components of the autonomous vehicle service platform 1401. Further to illustration 1400, the autonomous vehicle service platform 1401 includes a simulator 1440, a repository 1441, a policy manager 1442, a reference data updater 1438, a 2D map data repository 1420, a 3D map data repository 1422, and a route data repository 1424. Other map data, such as 4D map data (e.g., including epoch determinations), may be implemented and stored in a repository (not shown).
The remote operator action recommendation controller 1412 includes logic configured to receive and/or control remote operation service requests via autonomous vehicle ("AV") planner data 1472, which includes requests for remote operator assistance as well as telemetry and other data. As such, the planner data 1472 may include recommended candidate trajectories or paths from which the remote operator 1408 may select via the remote operator computing device 1404. According to some examples, the remote operator action recommendation controller 1412 may be configured to access other sources of recommended candidate trajectories from which to select an optimal trajectory. For example, candidate trajectories contained in the autonomous vehicle planner data 1472 may be introduced in parallel into the simulator 1440, which is configured to simulate the event or condition being experienced by the autonomous vehicle requesting remote operator assistance. The simulator 1440 can access the map data and other data necessary to perform a simulation on the set of candidate trajectories, so the simulator 1440 need not exhaustively repeat simulations to confirm sufficiency. Rather, the simulator 1440 may either provide confirmation of the suitability of the candidate trajectories or otherwise alert the remote operator to be cautious in their selection.
The remote operator interaction capture analyzer 1416 may be configured to capture a large number of remote operator transactions or interactions for storage in the repository 1441, which may, for example, accumulate data relating to a number of remote operator interactions, at least in some cases, for analysis and generation of policies. According to some embodiments, the repository 1441 may also be configured to store policy data for access by the policy manager 1442. Further, the remote operator interaction capture analyzer 1416 may apply machine learning techniques to empirically determine how best to respond to events or conditions that give rise to requests for remote operation assistance. In some cases, the policy manager 1442 may be configured to update a particular policy or generate a new policy (e.g., after applying the machine learning techniques) in response to analyzing a large set of remote operator interactions. The policy manager 1442 manages policies that may be regarded as rules or guidelines in accordance with which the autonomous vehicle controller and its components operate to comply with autonomous operation of the vehicle. In some cases, a modified or updated policy may be applied to the simulator 1440 to confirm the efficacy of permanently releasing or implementing the policy change.
The simulator interface controller 1414 is configured to provide an interface between the simulator 1440 and the remote operator computing device 1404. For example, consider that sensor data from a fleet of autonomous vehicles is applied to the reference data updater 1438 via autonomous vehicle ("AV") fleet data 1470, whereby the reference data updater 1438 is configured to generate updated map and route data 1439. In some implementations, the updated map and route data 1439 may be released preliminarily as an update to the data in the map data repositories 1420 and 1422, or as an update to the data in the route data repository 1424. In this case, such data may be tagged as a "beta version," whereby a lower threshold for requesting remote operator service may be implemented when, for example, a map tile including the preliminarily updated information is used by an autonomous vehicle. In addition, the updated map and route data 1439 may be introduced into the simulator 1440 for verification of the updated map data. Upon full release (e.g., at the close of beta testing), the previously lowered threshold for requesting remote operator service associated with the map tile is cancelled. The user interface graphics controller 1410 provides rich graphics to the remote operator 1408, whereby a fleet of autonomous vehicles may be simulated within the simulator 1440 and accessed via the remote operator computing device 1404 as if the simulated fleet of autonomous vehicles were real.
Fig. 15 is an example of a flow chart for controlling an autonomous vehicle according to some embodiments. At 1502, flow 1500 begins. Message data may be received at a remote operator computing device for managing a fleet of autonomous vehicles. The message data may indicate event attributes associated with an abnormal operating state in the context of a planned path for an autonomous vehicle. For example, an event may be characterized as a particular intersection that becomes problematic due to, for example, a large number of pedestrians hurriedly crossing the street against a traffic light. The event attributes describe characteristics of the event, such as, for example, the number of people crossing the street, the traffic delays resulting from the increased number of pedestrians, etc. At 1504, a remote operations repository may be accessed to retrieve a first subset of recommendations based on simulated operations of aggregated data associated with a set of autonomous vehicles. In this case, a simulator may be a source of recommendations that the remote operator may choose to implement. In addition, the remote operations repository may also be accessed to retrieve a second subset of recommendations based on an aggregation of remote operator interactions responsive to similar event attributes. In particular, a remote operator interaction capture analyzer may apply machine learning techniques to empirically determine how best to respond to events having similar attributes, based on previous requests for remote operation assistance. At 1506, the first subset and the second subset of recommendations are combined to form a set of recommended courses of action for the autonomous vehicle. At 1508, representations of the set of recommended courses of action may be presented visually on a display of the remote operator computing device. At 1510, a data signal representing a selection of a recommended course of action (e.g., by the remote operator) may be detected.
FIG. 16 is an illustration of an example of an autonomous vehicle fleet manager implementing a fleet optimization manager, according to some examples. Illustration 1600 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1630 traveling within a road network 1650. The autonomous vehicle fleet manager 1603 is coupled to a remote operator 1608 via a remote operator computing device 1604 and is also coupled to a fleet management data repository 1646. The autonomous vehicle fleet manager 1603 is configured to receive policy data 1602 and environment data 1606, among other data. Further to illustration 1600, the fleet optimization manager 1620 is shown to include a pass request processor 1631, which in turn includes a fleet data extractor 1632 and an autonomous vehicle dispatch optimization calculator 1634. The pass request processor 1631 is configured to process pass requests, such as a request from a user 1688 for autonomous vehicle service. The fleet data extractor 1632 is configured to extract data relating to the autonomous vehicles in the fleet. The data associated with each autonomous vehicle is stored in the repository 1646. For example, the data for each vehicle may describe maintenance issues, scheduled service calls, daily usage, battery charge and discharge rates, and any other data that may be updated in real time and used for the purpose of optimizing the fleet of autonomous vehicles to minimize downtime. The autonomous vehicle dispatch optimization calculator 1634 is configured to analyze the extracted data and calculate optimized usage of the fleet to ensure that the next vehicle dispatched, such as from station 1652, provides the minimum travel time and/or cost, in the aggregate, for the autonomous vehicle service.
The fleet optimization manager 1620 is shown to include a hybrid autonomous vehicle/non-autonomous vehicle processor 1640, which in turn includes an AV/non-AV optimization calculator 1642 and a non-AV selector 1644. According to some examples, the hybrid autonomous vehicle/non-autonomous vehicle processor 1640 is configured to manage a hybrid fleet of autonomous vehicles and human-driven vehicles (e.g., driven by independent contractors). As such, the autonomous vehicle service may employ non-autonomous vehicles to meet excess demand, or in areas, such as non-AV service area 1690, that may be beyond a geofence, or in areas of poor communication coverage. The AV/non-AV optimization calculator 1642 is configured to optimize usage of the autonomous fleet and to invite non-AV drivers into the transportation service (e.g., with minimal or no detriment to the autonomous vehicle service). The non-AV selector 1644 includes logic for selecting a number of non-AV drivers to assist, based on the calculations derived by the AV/non-AV optimization calculator 1642.
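For illustration only, hybrid dispatch of this sort might be sketched as follows; the fleet state, scoring, and geofence test are assumptions, not the disclosed optimization:

    # Illustrative sketch of hybrid dispatch: score available autonomous
    # vehicles on travel time and battery margin, and fall back to a
    # human-driven vehicle when the pickup lies outside the geofenced service
    # area or no AV qualifies. Fields and thresholds are invented.

    FLEET = [  # hypothetical fleet state
        {"id": "AV-1", "minutes_to_pickup": 4, "battery_pct": 62},
        {"id": "AV-2", "minutes_to_pickup": 2, "battery_pct": 18},
    ]

    def dispatch(pickup_in_geofence: bool, fleet=FLEET, min_battery_pct=25):
        if not pickup_in_geofence:
            return {"type": "non_AV_driver"}
        qualified = [v for v in fleet if v["battery_pct"] >= min_battery_pct]
        if not qualified:
            return {"type": "non_AV_driver"}
        best = min(qualified, key=lambda v: v["minutes_to_pickup"])
        return {"type": "autonomous_vehicle", "id": best["id"]}

    print(dispatch(pickup_in_geofence=True))
    print(dispatch(pickup_in_geofence=False))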
Fig. 17 is an example of a flow chart for managing a fleet of autonomous vehicles, according to some embodiments. At 1702, flow 1700 begins. At 1702, policy data is received. The policy data may include parameters defining how best to select an autonomous vehicle to service a pass request. At 1704, fleet management data may be extracted from a repository. The fleet management data includes subsets of data for a pool of autonomous vehicles (e.g., data describing the readiness of vehicles to service transportation requests). At 1706, data representing a pass request is received. For exemplary purposes, the pass request can be for transportation from a first geographic location to a second geographic location. At 1708, attributes based on the policy data are calculated to determine a subset of autonomous vehicles that are available to service the request. For example, the attributes may include battery charge level and time until the next scheduled maintenance. At 1710, an autonomous vehicle is selected to provide transportation from the first geographic location to the second geographic location, and data is generated to dispatch the autonomous vehicle to a third geographic location associated with the origin of the pass request.
FIG. 18 is a diagram illustrating an autonomous vehicle fleet manager implementing an autonomous vehicle communication link manager, according to some embodiments. The illustration 1800 depicts an autonomous vehicle fleet manager configured to manage a fleet of autonomous vehicles 1830 traveling within a road network 1850 that coincides with a communication outage at an area identified as a "reduced communication area" 1880. The autonomous vehicle fleet manager 1803 is coupled to a remote operator 1808 via a remote operator computing device 1804. The autonomous vehicle fleet manager 1803 is configured to receive policy data 1802 and environmental data 1806, among other data. Further to illustration 1800, the autonomous vehicle communication link manager 1820 is shown to include an environmental event detector 1831, a policy adaptation determiner 1832, and a pass request processor 1834. The environmental event detector 1831 is configured to receive environmental data 1806 specifying a change within the environment in which the autonomous vehicle service is implemented. For example, the environmental data 1806 may specify that area 1880 has degraded communication services, which may affect the autonomous vehicle service. The policy adaptation determiner 1832 may specify parameters to apply when accepting pass requests during such an event (e.g., during a loss of communication). The pass request processor 1834 is configured to process pass requests in view of the degraded communications. In this example, a user 1888 is requesting autonomous vehicle service. Further, the pass request processor 1834 includes logic for applying an adapted policy to modify the manner in which autonomous vehicles are dispatched so as to avoid complications due to poor communication.
The communication event detector 1840 includes a policy download manager 1842 and a communications-configured ("COMM-configured") AV dispatcher 1844. The policy download manager 1842 is configured to provide updated policies to the autonomous vehicles 1830 in view of the reduced communication area 1880, whereby an updated policy may specify routes for quickly exiting the area 1880 should an autonomous vehicle enter that area. For example, the autonomous vehicle 1864 may receive the updated policy moments before driving into the area 1880. Upon loss of communication, the autonomous vehicle 1864 implements the updated policy and selects route 1866 to drive out of the area 1880 quickly. The COMM-configured AV dispatcher 1844 may be configured to identify a point 1865 at which to park an autonomous vehicle configured to act as a relay for establishing a peer-to-peer network over the area 1880. As such, the COMM-configured AV dispatcher 1844 is configured to dispatch an autonomous vehicle 1862 (without passengers) to park at the location 1865 for the purpose of operating as a communication tower in a peer-to-peer ad hoc network.
FIG. 19 is an example of a flow chart for determining actions for autonomous vehicles during an event, such as degradation or loss of communication, according to some embodiments. At 1901, process 1900 begins. Policy data is received, whereby the policy data defines parameters with which to apply to pass requests in a geographic region during the event. At 1902, one or more of the following actions may be implemented: (1) dispatching a subset of autonomous vehicles to geographic locations within the portion of the geographic region, the subset of autonomous vehicles being configured either to park at specific geographic locations and each act as a static communication relay, or to travel within the geographic region to each act as a mobile communication relay; (2) implementing peer-to-peer communication among portions of the pool of autonomous vehicles associated with the portion of the geographic region; (3) providing the autonomous vehicles with an event policy describing routes for exiting the portion of the geographic region during the event; (4) initiating remote operation; and (5) recalculating paths so as to avoid the portion of the geographic region. After the action is implemented, the fleet of autonomous vehicles is monitored at 1914.
FIG. 20 is a diagram depicting an example of a localizer, according to some embodiments. Illustration 2000 includes a localizer 2068 configured to receive sensor data from sensors 2070, such as lidar data 2072, camera data 2074, radar data 2076, and other data 2078. Further, the localizer 2068 is configured to receive reference data 2020, such as 2D map data 2022, 3D map data 2024, and 3D local map data. According to some examples, other map data, such as 4D map data 2025 and semantic map data (not shown), including corresponding data structures and repositories, may also be implemented. Further to illustration 2000, the localizer 2068 includes a positioning system 2010 and a localization system 2012, each configured to receive sensor data from the sensors 2070 as well as reference data 2020. A localization data integrator 2014 is configured to receive data from the positioning system 2010 and data from the localization system 2012, whereby the localization data integrator 2014 is configured to integrate or fuse the sensor data from multiple sensors to form local pose data 2052.
FIG. 21 is an example of a flow chart for generating local pose data based on integrated sensor data, according to some embodiments. At 2101, process 2100 begins. At 2102, reference data is received, the reference data including three-dimensional map data. In some examples, reference data, such as 3D or 4D map data, may be received via one or more networks. At 2104, localization data from one or more localization sensors is received into a localization system. At 2106, positioning data from one or more positioning sensors is received into a positioning system. At 2108, the localization data and the positioning data are integrated. At 2110, the localization data and the positioning data are integrated to form local pose data specifying a geographic position of the autonomous vehicle.
FIG. 22 is a diagram depicting another example of a localizer, according to some embodiments. Diagram 2200 includes a localizer 2268, which in turn includes a localization system 2210 and a relative localization system 2212 to generate localization-based data 2250 and relative localization-based data 2251, respectively. The localization system 2210 includes a projection processor 2254a for processing GPS data 2273, GPS data 2211, and 3D map data 2222, among other optional data (e.g., 4D map data). The localization system 2210 also includes an odometry processor 2254b for processing wheel data 2275 (e.g., wheel speed), vehicle model data 2213, and 3D map data 2222, among other optional data. Further, the localization system 2210 includes an integrator processor 2254c for processing IMU data 2257, vehicle model data 2215, and 3D map data 2222, among other optional data. Similarly, the relative localization system 2212 includes a lidar localization processor 2254d for processing lidar data 2272, 2D tile map data 2220, 3D map data 2222, and 3D local map data 2223, among other optional data. The relative localization system 2212 also includes a visual registration processor 2254e for processing camera data 2274, 3D map data 2222, and 3D local map data 2223, among other optional data. Further, the relative localization system 2212 includes a radar echo processor 2254f for processing radar data 2276, 3D map data 2222, and 3D local map data 2223, among other optional data. It should be noted that, in various examples, other types of sensor data and sensors or processors, such as sonar data, may be implemented.
Further to diagram 2200, the localization-based data 2250 and the relative localization-based data 2251 may be fed into a data integrator 2266a and a localization data integrator 2266, respectively. The data integrator 2266a and the localization data integrator 2266 may be configured to fuse the corresponding data, whereby the localization-based data 2250 may be fused at the data integrator 2266a prior to being fused with the relative localization-based data 2251 at the localization data integrator 2266. According to some embodiments, the data integrator 2266a is formed as part of the localization data integrator 2266, or is absent. Regardless, both the localization-based data 2250 and the relative localization-based data 2251 can be fed into the localization data integrator 2266, which fuses the data to generate local position data 2252. The localization-based data 2250 may include unary-constrained data (and uncertainty values) from the projection processor 2254a, as well as binary-constrained data (and uncertainty values) from the odometry processor 2254b and the integrator processor 2254c. The relative localization-based data 2251 may include unary-constrained data (and uncertainty values) from the lidar localization processor 2254d and the visual registration processor 2254e, and optionally from the radar echo processor 2254f. According to some embodiments, the localization data integrator 2266 may implement non-linear smoothing functionality, such as Kalman filtering (e.g., gated Kalman filtering), a relative bundle adjuster, pose-graph relaxation, particle filtering, histogram filtering, and the like.
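By way of illustration only, the following Python sketch shows how pose estimates accompanied by uncertainty values might be fused using a simple one-dimensional Kalman-style update; the variable names and the scalar simplification are assumptions and are not taken from the disclosure, which contemplates richer techniques such as gated Kalman filtering, bundle adjustment, pose-graph relaxation, particle filtering, and histogram filtering.

```python
def fuse_estimates(estimates):
    """Fuse (value, variance) pose estimates by inverse-variance weighting.

    This is equivalent to a sequence of scalar Kalman measurement updates and
    serves only to illustrate how uncertainty values accompany the data being
    integrated (e.g., by a localization data integrator).
    """
    fused_value, fused_var = estimates[0]
    for value, var in estimates[1:]:
        gain = fused_var / (fused_var + var)          # Kalman gain for a scalar state
        fused_value = fused_value + gain * (value - fused_value)
        fused_var = (1.0 - gain) * fused_var
    return fused_value, fused_var

# Hypothetical x-position estimates (meters) from GPS projection, odometry,
# and lidar localization, each paired with an associated variance.
local_x, local_x_var = fuse_estimates([(12.4, 4.0), (12.1, 1.0), (12.2, 0.25)])
print(local_x, local_x_var)
```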
FIG. 23 is a diagram depicting an example of a perception engine, in accordance with some embodiments. Diagram 2300 includes a perception engine 2366, which includes a segmentation processor 2310, an object tracker 2330, and a classifier 2360. Further, the perception engine 2366 is configured to receive, for example, local pose data 2352, lidar data 2372, camera data 2374, and radar data 2376. It should be noted that other sensor data, such as sonar data, may be accessed to provide the functionality of the perception engine 2366. The segmentation processor 2310 is configured to extract ground plane data and/or to segment portions of an image to distinguish objects from one another and from static imagery (e.g., background). In some cases, 3D blobs may be segmented to distinguish them from one another. In some examples, a blob may refer to a set of features identifying an object in a spatially rendered environment, and may be composed of elements (e.g., pixels of camera data, points of laser echo data, etc.) having similar characteristics, such as intensity and color. In some examples, a blob may also refer to a point cloud (e.g., composed of colored laser echo data) or other elements constituting an object. The object tracker 2330 is configured to perform frame-to-frame estimation of motion for blobs or other segmented image portions. Further, data association is used to associate a blob at one location in a first frame at time t1 with a blob at a different location in a second frame at time t2. In some examples, the object tracker 2330 is configured to perform real-time probabilistic tracking of 3D objects, such as blobs. The classifier 2360 is configured to identify an object and to classify the object by classification type (e.g., as a pedestrian, a rider, etc.) and by energy/activity (e.g., whether the object is dynamic or static), whereby data representing the classification is described by a semantic label. According to some embodiments, probabilistic estimation of object categories may be performed, such as classifying an object as a vehicle, a cyclist, a pedestrian, etc., with a different confidence level for each object category. The perception engine 2366 is configured to determine perception engine data 2354, which may include static object maps and/or dynamic object maps, as well as semantic information, so that, for example, a planner may use this information to enhance path planning. According to various examples, one or more of the segmentation processor 2310, the object tracker 2330, and the classifier 2360 may apply machine learning techniques to generate the perception engine data 2354.
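As a purely illustrative sketch (the class list, scores, and threshold below are assumptions, not part of the disclosure), probabilistic classification of the kind attributed to a classifier such as classifier 2360 can be represented as a per-class confidence distribution paired with a dynamic/static determination and a semantic label:

```python
import math

def softmax(scores):
    """Convert raw classifier scores into per-class confidence levels."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify_blob(raw_scores, speed_mps, classes=("vehicle", "cyclist", "pedestrian")):
    """Return a semantic label, per-class confidences, and a dynamic/static flag."""
    confidences = dict(zip(classes, softmax(raw_scores)))
    label = max(confidences, key=confidences.get)
    is_dynamic = speed_mps > 0.2          # assumed threshold for calling an object dynamic
    return {"label": label, "confidences": confidences, "dynamic": is_dynamic}

# Hypothetical raw scores for a tracked blob moving at 1.4 m/s
print(classify_blob([0.3, 0.9, 2.1], speed_mps=1.4))
```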
FIG. 24 is an example of a flow chart for generating perception engine data, in accordance with some embodiments. Flow chart 2400 begins at 2402, at which data representing a local position of an autonomous vehicle is retrieved. At 2404, localization data from one or more localization sensors is received, and at 2406 features of the environment in which the autonomous vehicle is disposed are segmented to form segmented objects. At 2408, one or more portions of a segmented object are spatially tracked to form at least one tracked object having motion (e.g., an estimated motion). At 2410, the tracked object is classified at least as either a static object or a dynamic object. In some cases, a static or dynamic object may be associated with a classification type. At 2412, data identifying the classified object is generated. For example, the data identifying the classified object may include semantic information.
FIG. 25 is an example of a segmentation processor, according to some embodiments. Illustration 2500 depicts a segmentation processor 2510 receiving lidar data from one or more lidars 2572 and camera image data from one or more cameras 2574. Local pose data 2552, the lidar data, and the camera image data are received into a meta spin generator 2521. In some examples, the meta spin generator is configured to partition an image into distinguishable regions (e.g., clusters or groups of a point cloud) based on various attributes (e.g., color, intensity, etc.), at least two or more of which may be updated simultaneously or nearly simultaneously. Meta spin data 2522 is used to perform object segmentation and ground segmentation at a segmentation processor 2523, whereby both the meta spin data 2522 and segmentation-related data from the segmentation processor 2523 are applied to a scan difference processor 2513. The scan difference processor 2513 is configured to predict the motion and/or relative velocity of segmented image portions, which can be used to identify dynamic objects at 2517. Data indicating objects with velocity detected at 2517 is optionally sent to the planner to enhance path planning decisions. Further, data from the scan difference processor 2513 may be used to approximate the locations of objects to form a map of such objects (and optionally to identify a level of motion). In some examples, an occupancy grid map 2515 may be generated. Data representing the occupancy grid map 2515 may be sent to the planner to further enhance path planning decisions (e.g., by reducing uncertainty). Further to illustration 2500, image camera data from the one or more cameras 2574 is used in a blob classifier 2520 to classify blobs, the blob classifier 2520 also receiving blob data 2524 from the segmentation processor 2523. The segmentation processor 2510 may also receive raw radar echo data 2512 from one or more radars 2576 to perform segmentation at a radar segmentation processor 2514, which generates radar-related blob data 2516. Further to FIG. 25, the segmentation processor 2510 may also receive and/or generate tracked blob data 2518 related to the radar data. The blob data 2516, the tracked blob data 2518, the data from the blob classifier 2520, and the blob data 2524 may be used to track objects or portions thereof. According to some examples, one or more of the following may be optional: the scan difference processor 2513, the blob classifier 2520, and the data from the radars 2576.
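The occupancy grid map mentioned above can be illustrated with the following minimal Python sketch; the grid resolution, log-odds increments, and helper names are assumptions introduced for illustration and are not drawn from the disclosure.

```python
import math

class OccupancyGrid:
    """A tiny log-odds occupancy grid keyed by (x, y) cell indices."""

    def __init__(self, resolution_m=0.5):
        self.resolution = resolution_m
        self.log_odds = {}                      # cell -> accumulated log-odds

    def _cell(self, x, y):
        return (int(x // self.resolution), int(y // self.resolution))

    def update(self, x, y, occupied, weight=0.4):
        """Shift a cell's log-odds toward occupied or free."""
        cell = self._cell(x, y)
        delta = weight if occupied else -weight
        self.log_odds[cell] = self.log_odds.get(cell, 0.0) + delta

    def probability(self, x, y):
        """Return the occupancy probability of the cell containing (x, y)."""
        lo = self.log_odds.get(self._cell(x, y), 0.0)
        return 1.0 / (1.0 + math.exp(-lo))

grid = OccupancyGrid()
for _ in range(3):                              # three hits on the same cell
    grid.update(4.2, -1.3, occupied=True)
print(round(grid.probability(4.2, -1.3), 3))    # > 0.5 after repeated hits
```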
FIG. 26A is a diagram depicting an example of an object tracker and classifier, in accordance with various embodiments. The object tracker 2630 of illustration 2600 is configured to receive blob data 2516, tracked blob data 2518, data from the blob classifier 2520, blob data 2524, and camera image data from one or more cameras 2676. An image tracker 2633 is configured to receive the camera image data from the one or more cameras 2676 to generate tracked image data, which in turn may be provided to a data association processor 2632. As shown, the data association processor 2632 is configured to receive the blob data 2516, the tracked blob data 2518, the data from the blob classifier 2520, the blob data 2524, and the tracked image data from the image tracker 2633, and is further configured to identify one or more associations among the above types of data. The data association processor 2632 is configured to track various blob data, for example, from one frame to a next frame, for instance, to estimate motion and the like. Further, data generated by the data association processor 2632 may be used by a track updater 2634 to update one or more tracks, or tracked objects. In some examples, the track updater 2634 may implement a Kalman filter or the like to form updated data for tracked objects, which may be stored on-line in a track database ("DB") 2636. Feedback data may be exchanged between the data association processor 2632 and the track database 2636 via path 2699. In some examples, the image tracker 2633 may be optional and may be excluded. The object tracker 2630 may also use other sensor data, such as radar or sonar, as well as any other types of sensor data, for example.
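For illustration only, the following Python sketch pairs a nearest-neighbor data association step with a simple per-axis Kalman-style position update of the kind a track updater might perform; the gating distance, noise values, and data structures are assumptions and are not taken from the disclosure.

```python
import math

def associate(tracks, detections, gate_m=2.0):
    """Greedy nearest-neighbor association of detections to existing tracks."""
    pairs, unmatched = [], []
    for det in detections:
        best, best_d = None, gate_m
        for tid, trk in tracks.items():
            d = math.hypot(det[0] - trk["x"], det[1] - trk["y"])
            if d < best_d:
                best, best_d = tid, d
        if best is not None:
            pairs.append((best, det))
        else:
            unmatched.append(det)
    return pairs, unmatched

def kalman_update(trk, det, meas_var=0.5):
    """Scalar-per-axis Kalman measurement update of a track's position."""
    gain = trk["var"] / (trk["var"] + meas_var)
    for axis, z in zip(("x", "y"), det):
        trk[axis] += gain * (z - trk[axis])
    trk["var"] *= (1.0 - gain)                  # posterior variance shrinks
    return trk

tracks = {7: {"x": 10.0, "y": 5.0, "var": 1.0}}
pairs, new_dets = associate(tracks, [(10.4, 5.3), (40.0, -2.0)])
for tid, det in pairs:
    tracks[tid] = kalman_update(tracks[tid], det)
print(tracks, new_dets)   # matched track refined; far detection left to spawn a new track
```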
FIG. 26B is a diagram depicting another example of an object tracker in accordance with at least some examples. The illustration 2601 includes an object tracker 2631, which object tracker 2631 may include the structure and/or functionality of similarly named elements described in connection with one or more other figures (e.g., fig. 26A). As shown, the object tracker 2631 includes an optional registration portion 2699, the optional registration portion 2699 including a processor 2696 configured to perform object scan registration and data fusion. The processor 2696 is further configured to store the resulting data in a 3D object database 2698.
Referring back to FIG. 26A, illustration 2600 also includes a classifier 2660, which may include a track classification engine 2662 for generating static obstacle data 2672 and dynamic obstacle data 2674, each of which may be sent to the planner for path planning purposes. In at least one example, the track classification engine 2662 is configured to determine whether an obstacle is static or dynamic, as well as another classification type for the object (e.g., whether the object is a vehicle, a pedestrian, a tree, a rider, a dog, a cat, a paper bag, etc.). The static obstacle data 2672 may be formed as part of an obstacle map (e.g., a 2D occupancy map), and the dynamic obstacle data 2674 may be formed to include bounding boxes with data indicative of velocity and classification type. The dynamic obstacle data 2674, at least in some cases, includes 2D dynamic obstacle map data.
FIG. 27 is an example of a front-end processor for a perception engine, according to some examples. According to various examples, illustration 2700 includes a ground segmentation processor 2723a for performing ground segmentation and an over-segmentation processor 2723b for performing "over-segmentation". Processors 2723a and 2723b are configured to receive optionally colored lidar data 2775. The over-segmentation processor 2723b generates data 2710 of a first blob type (e.g., relatively small blobs), which is provided to an aggregate classification and segmentation engine 2712 that generates data 2714 of a second blob type. The data 2714 is provided to a data association processor 2732, which is configured to detect whether the data 2714 is present in a track database 2736. At 2740, it is determined whether the data 2714 of the second blob type (e.g., a relatively large blob, which may include one or more smaller blobs) constitutes a new track. If so, a track is initialized at 2742; otherwise, the track and the tracked object data stored in the track database 2736 may be extended or updated by track updater 2742. A track classification engine 2762 is coupled to the track database 2736 to identify and update/modify tracks, for example, by adding, removing, or modifying track-related data.
FIG. 28 is a diagram depicting a simulator configured to simulate an autonomous vehicle in a synthetic environment, in accordance with various embodiments. Diagram 2800 includes a simulator 2840 configured to generate a simulated environment 2803. As shown, the simulator 2840 is configured to use reference data 2822 (e.g., 3D map data and/or other map or route data, including RNDF data or similar road network data) to generate simulated geometry, such as simulated surfaces 2892a and 2892b, within the simulated environment 2803. The simulated surfaces 2892a and 2892b may simulate walls or front sides of buildings adjacent to a roadway. The simulator 2840 may also use pre-generated or programmatically generated dynamic object data 2825 to simulate dynamic agents in the synthetic environment. An example of a dynamic agent is simulated dynamic object 2801, which represents a simulated rider having a velocity. The simulated dynamic agents may optionally respond to other static and dynamic agents in the simulated environment, including the simulated autonomous vehicle. For example, simulated object 2810 may slow down due to other obstacles in the simulated environment 2803 rather than follow a preset trajectory, thereby creating a more realistic simulation of the actual dynamic environments that exist in the real world.
The simulator 2840 may be configured to generate a simulated autonomous vehicle controller 2847, which includes synthetic adaptations of a localizer 2868, a motion controller 2862, a planner 2864, and a perception engine 2866, each of which may have the functionality described herein within the simulated environment 2803. The simulator 2840 may also generate simulated interfaces ("I/F") 2849 to simulate data exchanges with different sensor modalities and different sensor data formats. As such, a simulated interface 2849 may simulate a software interface for packetized data from, for example, a simulated lidar sensor 2872. Further, the simulator 2840 may also be configured to generate a simulated autonomous vehicle 2830 that implements the simulated AV controller 2847. The simulated autonomous vehicle 2830 includes simulated lidar sensors 2872, simulated camera or image sensors 2874, and simulated radar sensors 2876. In the example shown, the simulated lidar sensor 2872 may be configured to generate a simulated laser consistent with ray trace 2892, which causes a simulated sensor echo 2891 to be generated. Note that the simulator 2840 may simulate the addition of noise or other environmental effects on sensor data (e.g., added diffusion or reflections that affect the simulated sensor echo 2891, etc.). Further still, the simulator 2840 may be configured to simulate a variety of sensor defects, including sensor failure, sensor misalignment, intermittent data interruptions, and the like.
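As an illustrative sketch only (the noise magnitude, dropout rate, and names are assumptions), simulating noise and intermittent data interruptions on lidar range returns might look like the following:

```python
import random

def simulate_lidar_returns(true_ranges_m, noise_sigma_m=0.03, dropout_rate=0.01):
    """Add Gaussian range noise and occasional dropouts to ideal ray-traced ranges."""
    simulated = []
    for r in true_ranges_m:
        if random.random() < dropout_rate:
            simulated.append(None)                     # intermittent data interruption
        else:
            simulated.append(r + random.gauss(0.0, noise_sigma_m))
    return simulated

random.seed(0)                                          # reproducible example
print(simulate_lidar_returns([5.0, 5.1, 20.4, 33.7]))
```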
The simulator 2840 includes a physics processor 2850 for simulating the mechanical, static, dynamic, and kinematic aspects of an autonomous vehicle for use in simulating the behavior of the simulated autonomous vehicle 2830. For example, the physics processor 2850 includes a contact mechanics module 2851 for simulating contact mechanics, a collision detection module 2852 for simulating interactions between simulated bodies, and a multibody dynamics module 2854 for simulating interactions among simulated mechanical bodies.
The simulator 2840 also includes a simulator controller 2856 configured to control the simulation to adapt the functionality of any synthetically generated element of the simulated environment 2803, for example, to determine cause-and-effect relationships. The simulator 2840 includes a simulator evaluator 2858 to evaluate the performance of the synthetically generated elements of the simulated environment 2803. For example, the simulator evaluator 2858 may analyze simulated vehicle commands 2880 (e.g., a simulated steering angle and a simulated velocity) to determine whether such commands are an appropriate response to the simulated activities within the simulated environment 2803. Further, the simulator evaluator 2858 may evaluate the interactions of a remote operator 2808 with the simulated autonomous vehicle 2830 via a remote operator computing device 2804. The simulator evaluator 2858 may evaluate the effects of updated reference data 2827, including updated map tiles and route data, which may be added to guide the responses of the simulated autonomous vehicle 2830. The simulator evaluator 2858 may also evaluate the responses of the simulator AV controller 2847 when policy data 2829 is updated, deleted, or added. The above description of the simulator 2840 is not intended to be limiting. As such, the simulator 2840 is configured to perform a variety of simulations of an autonomous vehicle relative to a simulated environment, including both static and dynamic features. For example, the simulator 2840 may be used to validate changes in software versions to ensure reliability. The simulator 2840 may also be used to determine vehicle dynamics properties and for calibration purposes. Further, the simulator 2840 may be used to explore the space of applicable controls and resulting trajectories so as to effect learning via self-simulation.
FIG. 29 is an example of a flow chart for simulating various aspects of an autonomous vehicle, according to some embodiments. Flow chart 2900 begins at 2902, at which reference data including three-dimensional map data is received into a simulator. Dynamic object data defining motion patterns for a classified object may be retrieved at 2904. At 2906, a simulated environment is formed based on at least the 3D map data and the dynamic object data. The simulated environment may include one or more simulated surfaces. At 2908, an autonomous vehicle is simulated, the simulation including a simulated autonomous vehicle controller that forms part of the simulated environment. The simulated autonomous vehicle controller may include a simulated localizer and a simulated perception engine configured to receive sensor data. At 2910, simulated sensor data is generated based on data for at least one simulated sensor echo, and at 2912 simulated vehicle commands are generated to cause motion (e.g., vectored propulsion) of the simulated autonomous vehicle in the synthetic environment. At 2914, the simulated vehicle commands are evaluated to determine whether the behavior of the simulated autonomous vehicle is consistent with expected behaviors (e.g., consistent with a policy).
FIG. 30 is an example of a flow chart for generating map data, according to some embodiments. Flow chart 3000 begins at 3002, at which trajectory data is retrieved. The trajectory data may include trajectories (e.g., logged trajectories) captured over a duration of time. At 3004, at least localization data may be retrieved. The localization data (e.g., logged localization data) may be captured over the duration of time. At 3006, a camera or other image sensor may be implemented to generate a subset of the localization data. As such, the retrieved localization data may include image data. At 3008, subsets of the localization data are aligned to identify a global position (e.g., a global pose). At 3010, 3D map data is generated based on the global position, and at 3012 the 3D map data is made available for implementation by, for example, a manual route data editor (e.g., including a manual road network data editor, such as an RNDF editor), an automated route data generator (e.g., including an automated road network generator, including an automated RNDF generator), a fleet of autonomous vehicles, a simulator, a remote operator computing device, and any other component of the autonomous vehicle service.
FIG. 31 is a diagram depicting an architecture of a mapping engine, in accordance with some embodiments. Illustration 3100 includes a 3D mapping engine configured to receive trajectory log data 3140, lidar log data 3172, camera log data 3174, radar log data 3176, and, optionally, other logged sensor data (not shown). Logic 3141 includes a loop-closure detector 3150 configured to detect, among other things, whether sensor data indicates that a nearby point in space has previously been observed. Logic 3141 also includes a registration controller 3152 for aligning map data, including, in some cases, 3D map data, relative to one or more registration points. Further, logic 3141 provides data 3142 representing the states of loop closures for use by a global pose graph generator 3143, which is configured to generate pose graph data 3145. In some examples, the pose graph data 3145 may also be generated based on data from a registration accuracy module 3146. Logic 3144 includes a 3D map builder 3154 and a lidar self-calibration unit 3156. Further, logic 3144 receives sensor data and the pose graph data 3145 to generate 3D map data 3120 (or other map data, such as 4D map data). In some examples, logic 3144 may implement a truncated signed distance function ("TSDF") to fuse sensor data and/or map data to form an optimal three-dimensional map. Further, logic 3144 is configured to include texture and reflectance properties. The 3D map data 3120 may be released for use by a manual route data editor 3160 (e.g., an editor for manipulating route data or other types of route or reference data), an automated route data generator 3162 (e.g., logic configured to generate route data or other types of road network or reference data), a fleet of autonomous vehicles 3164, a simulator 3166, a remote operator computing device 3168, and any other component of the autonomous vehicle service. The mapping engine 3110 may capture semantic information from manual annotations or automatically generated annotations, as well as from other sensors, such as sonar, or from instrumented environments (e.g., smart stop lights).
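For illustration only, a truncated signed distance function update along a single measured ray can be sketched as follows; the voxel size, truncation distance, and weighting scheme are assumptions, not details taken from the disclosure, and the sketch only updates voxels in front of the measured surface.

```python
def tsdf_update(voxels, sensor_origin, hit_point, voxel_size=0.2, truncation=0.6):
    """Update voxels along one ray with a truncated signed distance and a weight.

    voxels maps a voxel index to (tsdf_value, weight); values are averaged
    across observations, which is how multiple scans are fused into one map.
    """
    ox, oy, oz = sensor_origin
    hx, hy, hz = hit_point
    ray = (hx - ox, hy - oy, hz - oz)
    length = sum(c * c for c in ray) ** 0.5
    steps = int(length / voxel_size) + 1
    for i in range(steps + 1):
        t = i / steps
        p = (ox + ray[0] * t, oy + ray[1] * t, oz + ray[2] * t)
        signed_dist = length - t * length            # distance to the surface along the ray
        if signed_dist > truncation:
            continue                                  # too far in front of the surface
        sdf = max(-truncation, min(truncation, signed_dist))
        idx = tuple(int(c / voxel_size) for c in p)
        old_sdf, old_w = voxels.get(idx, (0.0, 0.0))
        new_w = old_w + 1.0
        voxels[idx] = ((old_sdf * old_w + sdf) / new_w, new_w)

voxels = {}
tsdf_update(voxels, sensor_origin=(0.0, 0.0, 0.0), hit_point=(2.0, 0.0, 0.0))
print(len(voxels), voxels[(9, 0, 0)])   # voxel near the surface has a small TSDF value
```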
FIG. 32 is a diagram depicting an autonomous vehicle application, in accordance with some examples. Illustration 3200 depicts a mobile computing device 3203 including an autonomous vehicle service application 3240 that is configured to contact an autonomous vehicle service platform 3201 to arrange transportation of a user 3202 via an autonomous vehicle 3230. As shown, the autonomous vehicle service application 3240 may include a transport controller 3242, which may be a software application residing on a computing device (e.g., a mobile phone 3203, etc.). The transport controller 3242 is configured to receive, plan, select, or perform operations related to autonomous vehicles and/or autonomous vehicle fleets for which the user 3202 may arrange transportation from the user's location to a destination. For example, the user 3202 may open the application to request vehicle 3230. The application may display a map, and the user 3202 may drop a pin to indicate their destination within, for example, a geo-fenced region. Alternatively, the application may display a list of nearby pre-specified pick-up locations, or provide the user with a text entry field in which to type a destination by address or by name.
Further to the example shown, the autonomous vehicle application 3240 may also include a user identification controller 3246 that may be configured to detect whether the user 3202 is in a geographic region, or in the vicinity thereof, near the autonomous vehicle 3230 as it approaches. In some situations, the user 3202 may not readily perceive or identify the autonomous vehicle 3230 as it approaches for use by the user 3202 (e.g., due to various other vehicles, including trucks, cars, taxis, and other obstructions that are typical in urban environments). In one example, the autonomous vehicle 3230 may establish a wireless communication link 3262 (e.g., via a radio frequency ("RF") signal, such as WiFi or Bluetooth, including BLE, or the like) for communicating and/or determining a spatial location of the user 3202 relative to the autonomous vehicle 3230 (e.g., using relative direction and signal strength of the RF signal). In some cases, the autonomous vehicle 3230 may detect an approximate geographic location of the user 3202 using, for example, GPS data or the like. A GPS receiver (not shown) of the mobile computing device 3203 may be configured to provide GPS data to the autonomous vehicle service application 3240. Thus, the user identification controller 3246 may provide GPS data via link 3260 to the autonomous vehicle service platform 3201, which, in turn, may provide that location to the autonomous vehicle 3230 via link 3261. The autonomous vehicle 3230 may then determine a relative distance and/or direction of the user 3202 by comparing the user's GPS data to the vehicle's GPS-derived location.
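A minimal sketch of how a relative distance and bearing might be computed from two GPS fixes is shown below; the haversine and bearing formulas are standard, while the coordinates and function names are illustrative assumptions rather than details from the disclosure.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def distance_and_bearing(vehicle, user):
    """Return (distance in meters, bearing in degrees) from vehicle to user.

    Both arguments are (latitude, longitude) pairs in decimal degrees.
    """
    lat1, lon1 = map(math.radians, vehicle)
    lat2, lon2 = map(math.radians, user)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))     # haversine distance
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0  # initial great-circle bearing
    return distance, bearing

# Hypothetical fixes: vehicle approaching a user roughly 120 m to the northeast
print(distance_and_bearing((37.7749, -122.4194), (37.7757, -122.4185)))
```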
The autonomous vehicle 3230 may also include additional logic to identify the presence of the user 3202, such that the logic is configured to execute face detection algorithms to detect the user 3202 generally, or to specifically identify the identity (e.g., name, phone number, etc.) of the user 3202 based on the user's unique facial characteristics. Further, the autonomous vehicle 3230 may include logic to detect codes for identifying the user 3202. Examples of such codes include specialized visual codes, such as QR codes, color codes, etc., specialized audio codes, such as voice-activated or recognized codes, and the like. In some cases, a code may be an encoded security key that may be transmitted digitally to the autonomous vehicle 3230 via link 3262 to ensure secure ingress and/or egress. Further, one or more of the above-described techniques for identifying the user 3202 may be used as a security measure to grant ingress and egress privileges to the user 3202 so as to prevent others from entering the autonomous vehicle 3230 (e.g., to ensure that third-party persons do not enter an unoccupied autonomous vehicle prior to reaching the user 3202). According to various examples, any other means for identifying the user 3202 and providing secure ingress and egress may also be implemented in one or more of the autonomous vehicle service application 3240, the autonomous vehicle service platform 3201, and the autonomous vehicle 3230.
To assist the user 3202 in identifying the arrival of their requested transportation, the autonomous vehicle 3230 may be configured to notify or otherwise alert the user 3202 to the presence of the autonomous vehicle 3230 as it approaches the user 3202. For example, the autonomous vehicle 3230 may activate one or more light-emitting devices 3280 (e.g., LEDs) in accordance with specific light patterns. In particular, certain light patterns are created so that the user 3202 may readily perceive that the autonomous vehicle 3230 is reserved to service the transportation needs of the user 3202. As an example, the autonomous vehicle 3230 may generate light patterns 3290 that may be perceived by the user 3202 as a "blinking", or other animation of its exterior and interior lights, in such a visual and temporal manner. The light patterns 3290 may be generated with or without patterns of sound so that the user 3202 can identify that this is the vehicle they booked.
According to some embodiments, an autonomous vehicle user controller 3244 may implement a software application configured to control various functions of the autonomous vehicle. Further, the application may be configured to redirect or reroute the autonomous vehicle during transit to its initial destination. Further, the autonomous vehicle user controller 3244 may be configured to cause on-board logic to modify the interior lighting of the autonomous vehicle 3230 to effect, for example, mood lighting. The controller 3244 may also control a source of audio (e.g., an external source, such as Spotify, or audio stored locally on the mobile computing device 3203), select a type of ride (e.g., modify the desired acceleration and braking aggressiveness, modify active suspension parameters to select a set of "road-handling" characteristics to implement aggressive driving characteristics, including vibrations, or to select "soft-ride" qualities with vibrations suppressed for comfort), and the like. For example, the mobile computing device 3203 may also be configured to control HVAC functions, such as ventilation and temperature.
FIGS. 33 to 35 illustrate examples of various computing platforms configured to provide various functionalities to components of an autonomous vehicle service, according to various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques.
It should be noted that the various structures and/or functionalities of fig. 33 may be applied to fig. 34 and 35, and as such, some elements of those figures may be discussed in the context of fig. 33.
In some cases, computing platform 3300 can be disposed in any device, such as computing device 3390a and/or mobile computing device 3390b, where the computing device 3390a may be disposed in an autonomous vehicle 3391 or in one or more computing devices of an autonomous vehicle service platform.
Computing platform 3300 includes a bus 3302 or other communication mechanism for communicating information, which interconnects subsystems and devices, such as a processor 3304, a system memory 3306 (e.g., RAM, etc.), a storage device 3308 (e.g., ROM, etc.), a memory cache (which may be implemented in RAM 3306 or other portions of computing platform 3300), and a communication interface 3313 (e.g., an Ethernet or wireless controller, a Bluetooth controller, NFC logic, etc.) to facilitate communication via a port on communication link 3321, for example, to communicate with a computing device, including mobile computing and/or communication devices having processors. The processor 3304 can be implemented with one or more graphics processing units ("GPUs"), one or more central processing units ("CPUs"), such as those manufactured by Intel Corporation, or one or more virtual processors, as well as any combination of CPUs and virtual processors. Computing platform 3300 exchanges data representing inputs and outputs via input-and-output devices 3301, including, but not limited to, keyboards, mice, audio inputs (e.g., voice-to-text devices), user interfaces, displays, monitors, cursors, touch-sensitive displays, LCD or LED displays, and other I/O-related devices.
According to some examples, computing platform 3300 performs certain operations by processor 3304 executing one or more sequences of one or more instructions stored in system memory 3306, and computing platform 3300 can be implemented in a client-server arrangement, a peer-to-peer arrangement, or as any mobile computing device, including smartphones, and the like. The instructions or data may be read into the system memory 3306 from another computer-readable medium, such as the storage device 3308. In some examples, hardwired circuitry may be used in place of or in combination with software instructions for implementation. The instructions may be embedded in software or firmware. The term "computer-readable medium" refers to any tangible medium that participates in providing instructions to processor 3304 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks, and the like. Volatile media includes dynamic memory, such as system memory 3306.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may also be transmitted or received using a transmission medium. The term "transmission medium" may include any tangible or intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 3302 for transmitting computer data signals.
In some examples, execution of the sequences of instructions may be performed by computing platform 3300. According to some examples, computing platform 3300 can be coupled via communication link 3321 (e.g., a wired network, such as a LAN or PSTN, or any wireless network, including Bluetooth, NFC, Zigbee, WiFi of various standards and protocols, etc.) to any other processor to execute sequences of instructions in coordination with (or asynchronously to) one another. Computing platform 3300 may transmit and receive messages, data, and instructions, including program code (e.g., application code), through communication link 3321 and communication interface 3313. Received program code may be executed by processor 3304 as it is received, and/or stored in memory 3306 or other non-volatile storage for later execution.
In the illustrated example, the system memory 3306 can include various modules including executable instructions to implement the functionality described herein. The system memory 3306 may include an operating system ("O/S") 3332, as well as applications 3336 and/or logic modules 3359. In the example shown in fig. 33, the system memory 3306 includes an autonomous vehicle ("AV") controller module 3350 and/or components thereof (e.g., a perception engine module, a localization module, a planner module, and/or a motion controller module), any one or more portions of which can be configured to facilitate autonomous vehicle service by implementing one or more of the functions described herein.
Referring to the example shown in FIG. 34, the system memory 3306 includes an autonomous vehicle service platform module 3450 and/or components thereof (e.g., a remote operator manager, a simulator, etc.), any one or more portions of which can be configured to facilitate managing an autonomous vehicle service by implementing one or more of the functions described herein.
Referring to the example shown in FIG. 35, the system memory 3306 includes an autonomous vehicle ("AV") module 3550 and/or components thereof for use, for example, in a mobile computing device. One or more portions of module 3550 can be configured to facilitate the delivery of autonomous vehicle services by implementing one or more of the functions described herein.
Referring back to fig. 33, the structure and/or functionality of any of the features described above can be implemented in software, hardware, firmware, circuitry, or a combination thereof. It should be noted that the above structures and constituent elements and their functionalities may be aggregated with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if present. As software, the techniques described above may be implemented using various types of programming or formatting languages, frameworks, grammars, applications, protocols, objects, or technologies. As hardware and/or firmware, the techniques described above may be implemented using various types of programming or integrated circuit design languages, including a hardware description language, such as any register transfer language ("RTL") configured to design a field programmable gate array ("FPGA"), an application specific integrated circuit ("ASIC"), or any other type of integrated circuit. According to some embodiments, the term "module" can refer to, for example, an algorithm or portion thereof, and/or logic implemented in hardware circuitry or software, or a combination thereof. These can vary and are not limited to the examples or descriptions provided.
In some embodiments, module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35, or one or more of their components, or any process or device depicted herein, can communicate (e.g., wired or wireless) with or can be disposed in a mobile device, such as a mobile phone or computing device.
In some cases, a mobile device or any networked computing device (not shown) in communication with one or more modules 3359 (module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35) or one or more of its components (or any process or device described herein) can provide at least some of the structure and/or functionality of any features described herein. As depicted in the above-described figures, the structure and/or function of any of the above-described features can be implemented in software, hardware, firmware, circuitry, or a combination thereof. It should be noted that the above structures and constituent elements and their functionalities may be aggregated or combined with one or more other structures or elements. Alternatively, the elements and their functionality may be subdivided into constituent sub-elements, if present. As software, at least some of the techniques described above may be implemented using various types of programming or formatting languages, frameworks, grammars, applications, protocols, objects, or technologies. For example, at least one of the elements depicted in any of the figures can represent one or more algorithms. Alternatively, at least one element can represent a portion of logic comprising hardware configured to provide constituent structure and/or functionality.
For example, module 3350 of FIG. 33, module 3450 of FIG. 34, and module 3550 of FIG. 35, or one or more components thereof, or any process or device described herein, can be implemented in one or more computing devices (i.e., any mobile computing device, such as a wearable device, an audio device (such as a headset) or a mobile phone, whether worn or carried) that include one or more processors configured to execute one or more algorithms in memory.
The above-described structures and techniques may be implemented as hardware and/or firmware using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language ("RTL") configured to design a field programmable gate array ("FPGA"), an application specific integrated circuit ("ASIC"), a multi-chip module, or any other type of integrated circuit.
For example, module 3350 of fig. 33, module 3450 of fig. 34, and module 3550 of fig. 35, or one or more components thereof, or any process or device described herein, can be implemented in one or more computing devices that include one or more circuits. Thus, at least one element in the above-described figures can represent one or more components of hardware. Alternatively, at least one element can represent a portion of logic comprising circuitry configured to provide constituent structure and/or functionality.
According to some embodiments, the term "circuit" can refer to, for example, any system comprising a plurality of components including discrete components and composite components through which current flows to perform one or more functions. Examples of discrete components include transistors, registers, capacitors, inductors, diodes, and so forth, and examples of composite components include memory, processors, analog circuits, digital circuits, and so forth, including field programmable gate array ("FPGA"), application specific integrated circuits ("ASIC"). Thus, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions such that, for example, a set of instructions of an algorithm are executable and thus are components of a circuit). According to some embodiments, the term "module" can refer to, for example, an algorithm or portion thereof, and/or logic implemented in hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit). In some embodiments, the algorithm and/or the memory in which the algorithm is stored are "parts" of a circuit. Thus, the term "circuitry" can also refer to a system of components, e.g., comprising an algorithm. These can vary and are not limited to the examples or descriptions provided.
FIG. 36 is an illustration depicting an example of at least a portion of a planner configured to generate trajectories, according to some examples. Illustration 3600 includes a trajectory evaluator 3620 and a trajectory generator 3624. In some examples, the trajectory evaluator 3620 is configured to at least receive path data to guide movement of an autonomous vehicle from a first geographic location to a second geographic location. In some examples, the path data may include any type of data for planning purposes, such as mission data, road data (e.g., road network data), or any variant thereof, including intermediate paths from one road segment to a next road segment, expressed as travel toward a destination. The trajectory generator 3624 may be configured to generate data representing trajectories for controlling movement of the autonomous vehicle based on the path data, and to generate data representing contingency trajectories. According to various examples, a trajectory provides intermediate navigation of the autonomous vehicle (e.g., incrementally over portions of road segments, such as over a first 200 m of trajectory to a subsequent trajectory), whereas a contingency trajectory may be provided to, for example, guide the autonomous vehicle through a "safe stop" maneuver. Such a maneuver may direct the autonomous vehicle to an area in the physical environment that allows the vehicle to stop safely, for example, at a location at which its occupants may be removed from a hazardous roadway. In some cases, contingency trajectories may be implemented when one or more components of the planner, such as the trajectory evaluator 3620 and the trajectory generator 3624, are inoperable or otherwise degrade in a manner that affects trajectory generation. Further, the trajectory generator 3624 may be configured to select a set of trajectories (e.g., from a subset of candidate trajectories) to be applied to vehicle components (e.g., a propulsion unit or drivetrain, a steering unit, a braking unit, etc.) during a normal (e.g., nominal) operating state, and the trajectory generator 3624 may also be configured to select a set of contingency trajectories (e.g., from a subset of candidate contingency trajectories) to be applied to the vehicle components during an abnormal operating state (e.g., a non-operational state, whether based on conditions or on feature vectors).
It is noted that the elements depicted in diagram 3600 of FIG. 36 may include structures and/or functionalities of similarly named elements described in connection with one or more other figures, such as FIGS. 11 to 13, as described herein. Further to the example shown, according to some examples, the trajectory generator 3624 includes a contingency trajectory generator 3695 and a nominal contingency trajectory generator 3697 for generating contingency trajectories (or data thereof), as well as a trajectory calculator 3625 and a nominal driving trajectory generator 3627 for calculating trajectories (or data thereof). Thus, the contingency trajectory generator 3695 may be configured to calculate (e.g., iteratively recalculate) contingency trajectories to determine one or more optimal contingency paths. At least one path may be selected and transmitted as contingency path data 3691. Optionally, the nominal contingency trajectory generator 3697 is configured to generate contingency trajectories based on, for example, a receding horizon control technique. The nominal contingency trajectory generator 3697 may subsequently transmit nominal contingency trajectory path data 3692 to, for example, a trajectory tracker or a vehicle/motion controller to effect steering, acceleration, and other physical changes of components. Note that the nominal contingency trajectory generator 3697 may be omitted in some examples of the trajectory generator 3624. According to some examples, the nominal contingency trajectory path data 3692 may be used upon receipt of a command (e.g., to implement a safe stop), such as in a case in which a remote operator has not been contacted or has not responded to provide trajectory data in a preferred manner. In some cases, detected trajectory data 3609 may initiate the use of the contingency path data 3691 by a trajectory controller (not shown).
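The division of labor described above can be illustrated with the following Python sketch, in which a planner-level component selects a nominal driving trajectory when one is available and otherwise falls back to a contingency ("safe stop") trajectory; the cost function, data shapes, and names are illustrative assumptions only.

```python
from typing import Optional, Sequence

Trajectory = Sequence[tuple]          # e.g., a sequence of (x, y, speed) waypoints

def select_trajectory(nominal_candidates: Sequence[Trajectory],
                      contingency_candidates: Sequence[Trajectory],
                      planner_healthy: bool):
    """Pick a nominal trajectory during normal operation, otherwise a contingency one."""
    def cost(traj: Trajectory) -> float:
        # Illustrative cost: prefer shorter trajectories that end at low speed.
        return len(traj) + traj[-1][2]

    if planner_healthy and nominal_candidates:
        return "nominal", min(nominal_candidates, key=cost)
    if contingency_candidates:
        return "contingency", min(contingency_candidates, key=cost)
    return "none", None               # nothing available; the caller must handle this

nominal = [[(0, 0, 8.0), (10, 0, 8.0), (20, 0, 8.0)]]
safe_stop = [[(0, 0, 8.0), (6, 1, 4.0), (9, 2, 0.0)]]
print(select_trajectory(nominal, safe_stop, planner_healthy=False))
```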
The trajectory calculator 3625 is configured to calculate (e.g., iteratively recalculate) trajectories to determine one or more optimal paths based on, for example, static map data 3601, current and predicted object state data 3603, local pose data 3605, and planning data 3607 (e.g., global planning data). At least one path is then selected and transmitted as selected path data 3611. According to some embodiments, the trajectory calculator 3625 is configured to implement re-planning of trajectories, as one example. The nominal driving trajectory generator 3627 is configured to generate trajectories in a refined manner, such as by generating trajectories based on a receding horizon control technique. The nominal driving trajectory generator 3627 may subsequently transmit nominal driving trajectory path data 3672 to, for example, a trajectory tracker or a vehicle/motion controller to effect steering, acceleration, and other physical changes of components.
Further to the example shown, the trajectory evaluator 3620 includes a state and event manager 3622, a confidence level generator 3623, and a remote operator query messenger 3629. As shown, the trajectory evaluator 3620 is coupled to a perception engine 3666 to receive static map data 3601 and current and predicted object state data 3603. The trajectory evaluator 3620 also receives local pose data 3605 from a localizer 3668 and planning data 3607 from a global planner 3669. The confidence level generator 3623 receives at least the static map data 3601 and the current and predicted object state data 3603 to determine one or more operational states (e.g., at least one abnormal state or other states). Based on this data, the confidence level generator 3623 may determine that a detected trajectory is associated with an unacceptable (or less preferred) confidence level value. As such, the confidence level generator 3623 transmits detected trajectory data 3609 (e.g., data including candidate trajectories) to notify a remote operator via the remote operator query messenger 3629, which, in turn, transmits a request 3670 for remote operator assistance.
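A minimal, purely illustrative sketch of how a confidence level might gate a request for remote operator assistance is given below; the threshold value and the way confidence is derived from per-object collision probabilities are assumptions, not details from the disclosure.

```python
def trajectory_confidence(collision_probabilities):
    """Map per-object collision probabilities along a trajectory to a confidence level."""
    worst = max(collision_probabilities, default=0.0)
    return 1.0 - worst

def maybe_request_remote_assistance(candidate_trajectories, threshold=0.85):
    """Return ('execute', best) or ('request_remote_assistance', candidates)."""
    scored = [(trajectory_confidence(probs), traj)
              for traj, probs in candidate_trajectories]
    best_conf, best_traj = max(scored, key=lambda s: s[0])
    if best_conf >= threshold:
        return "execute", best_traj
    # No candidate meets the confidence bar: send candidates to the remote operator.
    return "request_remote_assistance", [t for _, t in scored]

candidates = [("traj-A", [0.02, 0.30]), ("traj-B", [0.05, 0.22])]
print(maybe_request_remote_assistance(candidates))   # confidence too low -> request assistance
```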
FIG. 37 is a diagram depicting an example of a trajectory tracker, according to some examples. The trajectory tracker 3728 may be implemented as a trajectory tracking controller configured to apply contingency trajectory ("NCT") data 3692 (e.g., as a nominal contingency trajectory) and/or nominal driving trajectory ("NDT") data 3672 to vehicle components, such as a propulsion unit or system (e.g., one or more drivetrains), a steering system, a braking system, heating and air conditioning systems, communication systems, and the like. Diagram 3700 depicts the trajectory tracker 3728 as including a validator 3741, a trajectory generator monitor 3743, a contingency trajectory execution processor 3745, and a driving trajectory execution processor 3747. In some examples, the contingency trajectory data 3692 and the nominal driving trajectory data 3672 may be generated simultaneously, or during a common time interval during which trajectory calculation and generation may be determined iteratively or sequentially. It should be noted, however, that at least in some cases data 3692 and 3672 may be generated asynchronously.
The validator 3741 may be configured to validate the contingency trajectory data 3692 and the nominal driving trajectory data 3672, as well as related uncertainties or probability distributions (e.g., AI statistics, or statistics derived from a state and event manager (not shown), or the like), against validation criteria, for example, to confirm that data 3692 and 3672 are within expected values or tolerances. The driving trajectory execution processor 3747 is configured to control the application of the trajectory data 3672 to the various physical vehicle components. The trajectory generator monitor 3743 is configured to monitor the generation of trajectories associated with the nominal driving trajectory data 3672. In some cases, the trajectory generator monitor 3743 may detect an absence or a termination of the nominal driving trajectory data 3672 (or other data indicating that higher-level logic of the planner is inoperable). In such a case, the trajectory tracker 3728 facilitates operation of the contingency trajectory execution processor 3745 to control the application of the contingency trajectory data 3692 to the various physical vehicle components so as to implement a "safe stop" maneuver, whereby movement of the autonomous vehicle is halted and the vehicle is brought to rest at a safe location. As shown, the trajectory tracker 3728 and/or the contingency trajectory execution processor 3745 may be configured to receive a subset of sensor data 3770 to facilitate the application of the contingency trajectory. In some examples, the subset of sensor data 3770 may be regarded as coming from reactive sensors, as these may be implemented during a safe-stop maneuver (e.g., when other trajectory types may not be able to be generated). According to some examples, the reactive sensors include one or more sonar sensors and/or one or more radar sensors.
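For illustration only, the fallback behavior described above can be sketched as a watchdog over the stream of nominal driving trajectories; the timeout value, message shapes, and names are assumptions introduced for this sketch.

```python
import time

class TrajectoryTrackerSketch:
    """Watchdog-style sketch: execute nominal trajectories, fall back on silence."""

    def __init__(self, contingency_trajectory, timeout_s=0.2):
        self.contingency = contingency_trajectory
        self.timeout_s = timeout_s
        self.last_nominal_time = time.monotonic()
        self.active = None

    def on_nominal_trajectory(self, trajectory):
        """Called whenever the planner delivers fresh nominal driving trajectory data."""
        self.last_nominal_time = time.monotonic()
        self.active = ("nominal", trajectory)

    def tick(self):
        """Periodic check: if the nominal stream has gone silent, apply the contingency."""
        if time.monotonic() - self.last_nominal_time > self.timeout_s:
            self.active = ("contingency", self.contingency)   # safe-stop maneuver
        return self.active

tracker = TrajectoryTrackerSketch(contingency_trajectory=[(0, 0, 5.0), (4, 1, 0.0)])
tracker.on_nominal_trajectory([(0, 0, 8.0), (20, 0, 8.0)])
time.sleep(0.3)                      # simulate loss of planner output
print(tracker.tick()[0])             # -> "contingency"
```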
FIG. 38 is a diagram depicting examples of redundant implementations of autonomous vehicle controllers, according to some examples. Illustration 3800 depicts a first implementation 3801 that includes an autonomous vehicle ("AV") controller 3874, which includes a planner 3864, which in turn includes a trajectory evaluator 3865. The autonomous vehicle controller 3874 may also include a localizer 3868 and a perception engine 3866, each of which is shown as receiving sensor data 3870a. The sensor data 3870a includes one or more sets of sensor data for one or more types of sensors in a sensor suite 3870. The sensors 3870 include one or more lidar devices 3872, one or more cameras 3874, one or more radars 3876, one or more global positioning system ("GPS") data receiver-sensors 3873, one or more inertial measurement units ("IMUs") 3875, one or more odometry sensors 3877 (e.g., wheel encoder sensors, wheel speed sensors, etc.), one or more sonar sensors, and any other suitable sensors 3878, such as infrared cameras or sensors, hyperspectral-capable sensors, ultrasonic sensors (or any other acoustic-energy-based sensors), radio-frequency-based sensors, and the like. In some cases, wheel angle sensors configured to sense steering angles of the wheels may be included as odometry sensors 3877 or suitable sensors 3878. In some examples, the above-described components of the autonomous vehicle controller 3874 may be disposed in a first architectural layer, such as an artificial intelligence ("AI") layer 3891.
Further to the example of the first implementation 3801, the autonomous vehicle ("AV") controller 3874 further includes a trajectory tracker 3862 configured to receive a subset of the sensor data 3870b, which sensor data 3870b may include reaction sensor data (e.g., radar data and sonar data). Track tracker 3862 is shown as being configured to be coupled to vehicle component 3850. As shown, the track tracker 3862 may be disposed in a second architecture layer lower than the first architecture layer, and the second architecture layer is depicted as a real-time operating system ("RTOS") layer 3892. The vehicle component 3850 may be disposed in a physical platform layer 3893 that is lower than the second layer.
In a second implementation 3803 of illustration 3800, at least the structures and/or functionalities of the planner 3864, the localizer 3868, and the perception engine 3866 are provided in each of autonomous vehicle controllers 3847a, 3847b, and 3847c, each of which is configured to receive sensor data 3870a. Thus, the AI layer 3891 of the second implementation 3803 provides triple redundancy for autonomous vehicle control. According to some examples, each of the autonomous vehicle controllers 3847a, 3847b, and 3847c may include one or more processors (e.g., GPUs), and may further include one or more clusters of processors (e.g., one or more clusters of GPUs), as well as any amount or type of memory. According to some examples, the multiple autonomous vehicle controllers 3847a, 3847b, and 3847c are configured to generate multiple trajectories and multiple contingency trajectories. The multiple trajectories may include trajectories such as nominal driving trajectories, and the multiple contingency trajectories may include contingency trajectories to be applied.
The real-time operating system layer 3892 is depicted as including a plurality of trajectory trackers 3862a, 3862b, and 3862c, each of which is configured to receive a subset of the sensor data 3870b. As such, the RTOS layer 3892 may include redundant implementations of the trajectory tracker 3862, each configured to receive contingency trajectory data or nominal driving trajectory data, or both, from each of the autonomous vehicle controllers 3847a, 3847b, and 3847c via a communication channel 3805. The RTOS layer 3892 is configured to operate with a relatively high degree of reliability such that if logic in the AI layer 3891 fails or becomes degraded, the RTOS layer 3892 may be used to respond to the emergency, such as by performing a safe-stop maneuver. In some examples, the communication channel 3805 may include a mesh link or a mesh network. According to some examples, each of the trajectory trackers 3862a, 3862b, and 3862c may include one or more processors (e.g., GPUs) and may also include one or more clusters of processors (e.g., one or more clusters of GPUs), and any amount or type of memory. According to some examples, the trajectory trackers 3862a, 3862b, and 3862c may be configured to detect, for example, termination of the generation of data representing a trajectory. In addition, one or more outputs of the trajectory trackers 3862a, 3862b, and 3862c may be applied to execute instructions that implement a safe-stop maneuver. As shown, the vehicle control unit 3899 may be configured to select a contingency trajectory from a plurality of contingency trajectories for application to the physical platform layer 3893. In some cases, the vehicle control unit 3899 may be implemented as a consensus-voting control unit configured to determine an optimal nominal driving trajectory or an optimal contingency trajectory based on a comparison of the outputs of the trajectory trackers 3862a, 3862b, and 3862c. As shown, the implementation 3803 includes one or more steering units 3850a, one or more propulsion units 3850b, one or more braking units 3850c, and other physical vehicle platform components disposed in the physical platform layer 3893. The above examples are not intended to be limiting, and thus, in other implementations, there may be a smaller or greater number of autonomous vehicle controllers and/or trajectory trackers.
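The following sketch, under assumed data types, illustrates one plausible consensus-style selection over the outputs of redundant trajectory trackers such as 3862a, 3862b, and 3862c: healthy outputs vote on a proposed trajectory, and the highest-confidence agreeing output is returned. It is an illustrative approximation, not the claimed voting logic; the TrackerOutput fields are assumptions.

```python
# A sketch of consensus voting over redundant trajectory tracker outputs.
from collections import Counter
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class TrackerOutput:
    trajectory_id: str   # identifier of the proposed trajectory (assumed)
    confidence: float    # confidence reported by that tracker
    healthy: bool        # False if generation terminated or degraded

def vote(outputs: Sequence[TrackerOutput]) -> Optional[TrackerOutput]:
    """Pick the trajectory proposed by the majority of healthy trackers,
    preferring the highest-confidence output among those that agree."""
    healthy = [o for o in outputs if o.healthy]
    if not healthy:
        return None                       # no usable output; caller escalates
    winner_id, _count = Counter(o.trajectory_id for o in healthy).most_common(1)[0]
    candidates = [o for o in healthy if o.trajectory_id == winner_id]
    return max(candidates, key=lambda o: o.confidence)
```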
FIG. 39 is a diagram depicting an example of a state and event manager, or a portion thereof, according to some examples. The illustration 3900 depicts a state and event manager 3922 configured to receive various amounts and types of input data 3992a, 3992b, and 3992n for generating trajectory data 3982, which may include contingency trajectory data or nominal driving trajectory data, or both. The state and event manager 3922 (or any other logic in the planner) may also include one or more inference engines 3990, such as inference engines 3990a, 3990b, and 3990n, and one or more classifiers 3991, such as classifiers 3991a, 3991b, and 3991n.
According to some examples, the inference engines 3990 may include any number or type of inference algorithms configured to infer values, amounts, states, qualities, attributes, or properties of one or more characteristics based on, for example, any input data 3992 generated in the autonomous vehicle or received from an autonomous vehicle service platform or other data source. Examples of input data 3992 include: any data from the perception engine, such as data related to static and dynamic objects, including object classification data (and its uncertainty); object tracking data; predicted shifts of objects over time; object orientation or pose data; and localization data, including data related to local poses and data related to global poses. In addition, any degree of uncertainty or probability distribution associated with the above data, as well as descriptor inputs, may also accompany or be input with other input data 3992. The input data 3992 may also include any amount or type of sensor data, as well as any measurable characteristic of the sensed data, such as the amount of photons impinging on an image sensor (e.g., a CCD sensor), or sensed laser characteristics (e.g., intensity, etc.) of lidar laser return data, as well as other sensed data from radar, sonar, and the like. Further, the input data 3992 may include data derived from sensed data, or any combination thereof, as well as uncertainties and/or probability distributions associated with measured sensor data or sensed data.
To illustrate the operation of the inference engines 3990, consider that the inference engine 3990a is configured to infer a distance between the autonomous vehicle and an external object, and the inference engine 3990b is configured to infer a location on a roadway where the autonomous vehicle is located. Other inference engines, such as inference engine 3990n, may be configured to perform other inference operations based on other input data. The inference engines 3990a and 3990b may receive any suitable input data 3992 to generate inferred characteristics and, for example, corresponding uncertainties or probability distributions. The output data (e.g., values and probability distributions) of the inference engines 3990a and 3990b may be sent as inputs to the classifiers 3991.
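A minimal sketch of the inference-engine interface implied above follows: an engine consumes input data and returns an inferred value together with an uncertainty, as inference engine 3990a might for distance to an external object. The Gaussian-style summary, the field names, and the fusion method are assumptions introduced for illustration.

```python
# A sketch of an inference engine returning a value with an uncertainty.
from dataclasses import dataclass
from typing import List
import math

@dataclass
class Inference:
    value: float     # e.g., inferred distance to an external object, in meters
    std_dev: float   # spread of the associated probability distribution

def infer_distance(range_samples: List[float]) -> Inference:
    """Fuse noisy range measurements into a single distance estimate with an
    uncertainty, as one inference engine (such as 3990a) might do.
    Assumes at least one sample is provided."""
    n = len(range_samples)
    mean = sum(range_samples) / n
    var = sum((s - mean) ** 2 for s in range_samples) / max(n - 1, 1)
    return Inference(value=mean, std_dev=math.sqrt(var))
```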
The outputs from the inference engines 3990, and any other data generated or obtained by the autonomous vehicle, may be input into the classifiers 3991. According to various examples, the classifiers 3991 are configured to perform statistical classification (e.g., regression analysis) on relatively large, multimodal distributions to form, for example, probabilistic feature vectors as outputs of the classifiers 3991, whereby the various feature vectors may represent a state of a planner, a trajectory generator, or the autonomous vehicle. Thus, according to some examples, the output of a classifier 3991 may specify one or more types of actions to take based on the inputs to the classifier 3991. The one or more types of actions may be associated with one or more generated trajectories. According to some examples, the confidence level generator 3923 may characterize a confidence level for the output of the classifier 3991. According to an alternative example, the classifier 3991 may include, or perform functions similar to those provided by, the confidence level generator 3923. Thus, the classifier 3991 may generate a trajectory having a confidence level based on the inputs to the classifier 3991. For example, the classifier 3991 may determine ranges of acceptable (or more preferred) and unacceptable (or less preferred) confidence level values associated with acceptable and unacceptable trajectories. According to various examples, the classifiers 3991 may be any type of classifier, in any combination. For example, classifier 3991a may be implemented as a naive Bayes classifier.
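For illustration, the sketch below turns a classifier's probabilistic output into an action and an associated confidence level, and then gates trajectories on acceptability thresholds; the class labels, threshold values, and function names are assumptions rather than values taken from the disclosure.

```python
# A sketch of confidence-gated action selection from classifier outputs.
from typing import Dict, Optional, Tuple

ACCEPTABLE = 0.75     # assumed "more preferred" confidence threshold
UNACCEPTABLE = 0.40   # below this, fall back to a contingency trajectory

def select_action(class_probs: Dict[str, float]) -> Tuple[str, float]:
    """Pick the most probable action (e.g., 'follow_nominal', 'safe_stop')
    and report its probability as the confidence level."""
    action = max(class_probs, key=class_probs.get)
    return action, class_probs[action]

def gate_trajectory(class_probs: Dict[str, float]) -> Optional[str]:
    """Accept the action when confidence is high, fall back when it is low,
    and defer (e.g., to a remote operator) when it is ambiguous."""
    action, confidence = select_action(class_probs)
    if confidence >= ACCEPTABLE:
        return action                   # acceptable / more preferred
    if confidence < UNACCEPTABLE:
        return "safe_stop"              # unacceptable; use contingency
    return None                         # ambiguous; defer the decision
```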
FIG. 40 is a flow chart of an example of implementing one or more trajectory types according to some examples. The process 4000 begins at 4002 with receiving path data that guides movement of an autonomous vehicle. At 4004, data representing a trajectory is generated to control movement of the autonomous vehicle, the trajectory being a first type of trajectory. At 4006, data representing a contingency trajectory is generated as a second type of trajectory. At 4008, generation of one or more trajectories is monitored to detect whether generation of the trajectories ceases. At 4010, a contingency trajectory can be implemented to cause a change to the movement and/or direction of the autonomous vehicle. It should be noted that the order depicted herein and in other flow diagrams is not intended to imply that the various functions are necessarily performed linearly, as each portion of a flow diagram may be performed serially or in parallel with any one or more other portions of the flow diagram, and may be independent of or dependent on other portions of the flow diagram.
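A compact sketch of the flow of FIG. 40, under assumed generator and controller interfaces, is shown below; representing a cessation of nominal generation as a None return value is an assumption made for the example.

```python
# A hedged sketch of process 4000 (FIG. 40); all interfaces are assumptions.
def run_process_4000(path_data, generate_nominal, generate_contingency,
                     apply_trajectory, should_continue):
    # path_data corresponds to 4002 (received path data guiding movement).
    contingency = generate_contingency(path_data)        # 4006
    while should_continue():
        nominal = generate_nominal(path_data)             # 4004
        if nominal is None:                               # 4008: generation ceased
            apply_trajectory(contingency)                 # 4010: alter movement/direction
            break
        apply_trajectory(nominal)
        contingency = generate_contingency(path_data)     # keep the contingency fresh
```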
FIG. 41 illustrates an example of various computing platforms configured to provide various functionalities and/or structures to components of an autonomous vehicle service, in accordance with various embodiments. In some examples, computing platform 3300 may be used to implement computer programs, applications, methods, processes, algorithms, or other software to perform the above-described techniques. It should be noted that the various structures and/or functionality of FIG. 33 may be applied to FIG. 41, and as such, some of the elements of those figures may be discussed in the context of FIG. 33. It is also noted that the elements depicted in diagram 4100 of FIG. 41 may include the structure and/or functionality of similarly named elements described in connection with one or more other figures herein.
Referring to the example shown in FIG. 41, the system memory 3306 includes an autonomous vehicle controller module 4150 and/or components thereof (e.g., a planner module 4152, a trajectory tracking module 4154, etc.), any of which, or one or more portions thereof, can be configured to facilitate navigation of an autonomous vehicle service by implementing one or more of the functions described herein. In some cases, computing platform 3300 can be disposed in any device, such as a computing device 3390a and/or a mobile computing device 3390b; the computing device 3390a may be disposed in an autonomous vehicle 3391.
Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the inventive techniques described above are not limited to the details provided. There are many alternative ways of implementing the inventive techniques described above. The disclosed examples are illustrative and not limiting.

Claims (38)

1. A method for controlling one or more autonomous vehicle components, comprising:
receiving path data to guide movement of the autonomous vehicle from the first geographic location to the second geographic location;
receiving perception data from a perception engine, the perception data comprising static map data, current object state data, and predicted object state data;
receiving local pose data of the autonomous vehicle from a localizer;
generating at least a plurality of trajectories and a plurality of contingency trajectories based on the path data, the local pose data, the static map data, the current object state data, and the predicted object state data;
selecting a trajectory from the plurality of trajectories to control one or more vehicle components;
selecting an contingency trajectory from the plurality of contingency trajectories for controlling the one or more vehicle components;
controlling the one or more vehicle components of the autonomous vehicle according to the trajectory to guide the autonomous vehicle to move along the trajectory;
monitoring the generation of the trajectory;
detecting a stop of the generation of the trajectory; and
in response to detecting the stop of the generation of the trajectory, controlling the one or more vehicle components in accordance with the contingency trajectory.
2. The method of claim 1, further comprising:
generating one or more commands for controlling movement of the autonomous vehicle; wherein controlling movement of the autonomous vehicle according to the contingency trajectory comprises: causing the autonomous vehicle to execute the one or more commands to perform a safe-stop maneuver to a geographic location based on the contingency trajectory.
3. The method of claim 1, further comprising:
receiving a subset of the sensor data;
in response to receiving the subset of sensor data, controlling the one or more vehicle components according to the contingency trajectory; wherein one or more of the static map data, the current object state data, the local pose data, and the predicted object state data are based on a subset of the sensor data, the subset of sensor data including one or more of lidar data, radar data, or camera data.
4. The method of claim 3, wherein:
generating the plurality of trajectories includes: calculating the plurality of trajectories based at least in part on the path data, the local pose data, the static map data, the current object state data, and the predicted object state data; and
generating the plurality of contingency tracks includes: the plurality of contingency tracks is calculated based at least in part on the path data, the local pose data, the static map data, the current object state data, and the predicted object state data.
5. The method of claim 1, wherein generating at least the plurality of trajectories and the plurality of contingency trajectories further comprises:
iteratively generating a set of trajectories on a plurality of autonomous vehicle controllers, the set of trajectories including the trajectory; and
iteratively generating a set of contingency trajectories on the plurality of autonomous vehicle controllers, the set of contingency trajectories including the contingency trajectory.
6. The method of claim 1, wherein controlling the one or more vehicle components according to the contingency trajectory comprises:
transmitting a control signal to the one or more vehicle components to achieve a safe stop of the autonomous vehicle, wherein the control signal is determined based on a rolling horizon.
7. The method of claim 1, further comprising:
determining a confidence level, the confidence level indicating an ability of the autonomous vehicle to traverse a path;
based on the confidence level, sending a message to a remote operator; and
receiving a command from a remote operator to control the autonomous vehicle according to the contingency trajectory;
wherein the perception data comprises one or more dynamic objects and one or more static objects.
8. The method of claim 1, further comprising:
receiving sensor data as a probability distribution;
forming the perception data associated with an external object based on the probability distribution; and
inferring a characteristic value associated with the external object to form an inferred characteristic value; wherein generating at least the plurality of trajectories and the plurality of contingency trajectories is further based on the inferred characteristic value.
9. The method of claim 1, wherein generating the trajectory comprises:
classifying a subset of input data comprising the path data as a subset of actions to form a classified subset of data; and
identifying, based on the classified subset of data, a subset of trajectories that includes the trajectory.
10. The method of claim 1, wherein detecting a stop of the generation of the trajectory comprises detecting data indicative of inoperable high-level logic of a planner module of the autonomous vehicle.
11. A system for controlling one or more autonomous vehicle components, comprising:
one or more vehicle components of an autonomous vehicle for controlling movement of the autonomous vehicle;
one or more processors communicatively coupled to the one or more vehicle components;
a localizer;
a perception engine;
a trajectory tracker; and
a memory storing instructions executable by the one or more processors, the instructions when executed by the one or more processors configuring the system to:
receiving path data to guide movement of the autonomous vehicle from a first geographic location to a second geographic location;
receiving local pose data from the localizer;
receiving an object, a current object trace, and a predicted object trace from the perception engine;
generating at least a plurality of candidate trajectories and a plurality of candidate contingency trajectories based on the path data, the local pose data, the object, the current object trace, and the predicted object trace, the plurality of candidate trajectories and the plurality of candidate contingency trajectories being generated as the autonomous vehicle traverses a path;
selecting, by the trajectory tracker, a trajectory from the plurality of candidate trajectories;
selecting, by the trajectory tracker, a contingency trajectory from the plurality of candidate contingency trajectories;
controlling, by the trajectory tracker, the one or more vehicle components to guide movement of the autonomous vehicle according to the trajectory;
monitoring, by the trajectory tracker, generation of the trajectory;
detecting, by the trajectory tracker, a degradation in the generation of the trajectory; and
in response to detecting the degradation in the generation of the trajectory, controlling, by the trajectory tracker, the one or more vehicle components to guide movement of the autonomous vehicle according to the contingency trajectory.
12. The system of claim 11, further comprising:
one or more trajectory trackers configured to:
monitor the generation of the trajectory;
detect degradation of the generation of the trajectory; and
send the contingency trajectory to one or more vehicle components of the autonomous vehicle to control movement of the autonomous vehicle.
13. The system of claim 12, wherein the operations further comprise:
sending a message to a remote operator; and
receiving a response from the remote operator, wherein at least one of the one or more trajectory trackers is configured to receive one or more of lidar data, radar data, or sonar data.
14. The system of claim 11, further comprising:
one or more of a sensor for capturing sensor data or a communication interface for receiving the sensor data; wherein the operations further comprise:
generating map data from the sensor data or receiving the map data via the communication interface, the map data comprising a three-dimensional model of an environment of the autonomous vehicle; and
generating sensory data from the sensor data or receiving the sensory data via the communication interface, the sensory data comprising data associated with an external object; and
wherein:
generating at least the plurality of candidate trajectories and the plurality of candidate contingency trajectories is further based on one or more of the map data, the perception data, or the path data, each candidate trajectory and each contingency trajectory having a confidence level associated therewith;
selecting the trajectory from the plurality of candidate trajectories is based on a confidence level associated with the trajectory; and
selecting the contingency track from the plurality of candidate contingency tracks is based on a confidence level associated with the contingency track.
15. The system of claim 14, wherein:
the trajectory is generated for controlling movement of the autonomous vehicle in accordance with a rolling horizon control technique, and the contingency trajectory is generated for controlling movement of the autonomous vehicle in response to detecting that the generation of the trajectory performed by the system is degraded.
16. The system of claim 15, wherein the trajectory is generated by a planner module stored in the memory, degradation of the generation of the trajectory being caused by a failure of the planner module.
17. The system of claim 11, wherein the one or more vehicle components comprise one or more of the following:
a propulsion unit;
a power transmission system;
a steering unit; or
a braking unit.
18. The system of claim 11, wherein detecting degradation in the generation of the trajectory comprises: detecting data indicating inoperable high-level logic of a planner module of the autonomous vehicle.
19. A method for controlling an autonomous vehicle, comprising:
receiving path data to guide movement of an autonomous vehicle from a first geographic location to a second geographic location;
generating a trajectory based on the path data;
generating a contingency trajectory based on the path data;
controlling the autonomous vehicle according to the trajectory;
monitoring the generation of the track;
detecting that the generation of the trajectory is affected; and
controlling the autonomous vehicle according to the contingency trajectory after the generation of the trajectory is affected.
20. The method of claim 19, further comprising receiving sensor data from a sensor of the autonomous vehicle, and wherein generating the contingency trajectory comprises:
determining a plurality of contingency trajectories based on the path data and the sensor data;
comparing the plurality of contingency trajectories; and
selecting one of the plurality of contingency trajectories as the contingency trajectory based on the comparison.
21. The method of claim 19, wherein generating the contingency trajectory includes generating a nominal contingency trajectory and generating a safe-stop contingency trajectory.
22. The method of claim 21, wherein generating the nominal contingency trajectory comprises iteratively generating the nominal contingency trajectory based at least in part on a rolling horizon; and
wherein the safe-stop contingency trajectory is configured to cause the autonomous vehicle to safely stop.
23. The method of claim 19, wherein generating the contingency trajectory comprises generating the contingency trajectory using an artificial intelligence layer, and
wherein controlling the autonomous vehicle according to the contingency trajectory includes controlling the autonomous vehicle using a real-time operating system (RTOS) layer.
24. The method of claim 23, wherein detecting the effect comprises detecting a failure or degradation of the artificial intelligence layer.
25. The method of claim 19, further comprising determining whether contingency data is within a range of predicted values, the contingency data including data for controlling the autonomous vehicle in accordance with the contingency trajectory, an uncertainty associated with the contingency trajectory, or a probability associated with the contingency trajectory.
26. The method of claim 19, wherein the affecting comprises: inoperable high-level logic, determined by the trajectory evaluator, that is not executable by at least one of a processor or a component of the autonomous vehicle; a failure of the trajectory evaluator; or a degradation of the trajectory evaluator.
27. A system for controlling an autonomous vehicle, comprising:
one or more processors;
a memory storing processor-executable instructions that, when executed by the one or more processors, cause the system to:
receiving path data to guide movement of an autonomous vehicle from a first geographic location to a second geographic location;
generating a trajectory to control movement of the autonomous vehicle based on the path data;
generating a contingency trajectory;
monitoring the generation of the trajectory;
detecting that the generation of the trajectory is affected; and
controlling movement of the autonomous vehicle according to the contingency trajectory in response to detecting the effect.
28. The system of claim 27, wherein:
the system further includes a sensor configured to capture sensor data,
wherein generating the contingency trajectory comprises:
determining a plurality of contingency trajectories based on the sensor data;
comparing the plurality of contingency trajectories; and
selecting one of the plurality of contingency trajectories as the contingency trajectory based on the comparison.
29. The system of claim 27, wherein the instructions, when executed, further cause the system to:
receiving static map data;
receiving current object state data and predicted object state data;
receiving local pose data;
determining a confidence level associated with the trajectory based on the static map data, the current object state data, and the predicted object state data; and
determining an operating state of the autonomous vehicle based on the confidence level,
wherein detecting the effect includes determining that the operating state is an abnormal operating state.
30. The system of claim 27, wherein detecting the effect comprises detecting a stop of generation of the trajectory or detecting data indicative of non-operational high-level logic of a planner.
31. The system of claim 27, further comprising:
an artificial intelligence layer comprising the one or more processors and the memory; and
a real-time operating system (RTOS) layer configured to cause the autonomous vehicle to traverse the trajectory or the contingency trajectory.
32. The system of claim 27, wherein the instructions, when executed, further cause the system to generate a nominal contingency trajectory based on a rolling horizon,
wherein the contingency trajectory includes a safe-stop maneuver, and
wherein controlling movement of the autonomous vehicle according to the contingency trajectory includes causing the autonomous vehicle to traverse the contingency trajectory or the nominal contingency trajectory.
33. The system of claim 32, wherein generating the trajectory comprises:
generating a plurality of trajectories at a trajectory generator;
determining, at a trajectory evaluator, a set of confidence levels associated with the plurality of trajectories; and
selecting, as the trajectory, the one of the plurality of trajectories having the highest confidence level,
wherein detecting the effect comprises detecting a failure of the trajectory generator, a failure of the trajectory evaluator, a degradation of the trajectory generator, or a degradation of the trajectory evaluator.
34. A non-transitory computer-readable medium storing processor-executable instructions that, when executed, cause one or more processors to perform operations comprising:
receiving path data to guide movement of an autonomous vehicle from a first geographic location to a second geographic location;
generating a plurality of trajectories based on the path data, the plurality of trajectories including a trajectory to control movement of the autonomous vehicle and a contingency trajectory;
guiding movement of the autonomous vehicle according to the trajectory;
monitoring the generation of the track;
detecting that the generation of the trajectory is affected; and
directing the autonomous vehicle according to the contingency trajectory in response to detecting that the generation of the trajectory is affected.
35. The non-transitory computer-readable medium of claim 34, wherein:
the trajectory is generated to move the autonomous vehicle at least a portion of a distance between the first geographic location and the second geographic location; and
the contingency trajectory is generated to control movement of the autonomous vehicle to a safe stop in response to detecting that the generation of the trajectory is affected.
36. The non-transitory computer-readable medium of claim 34, wherein generating the contingency trajectory comprises:
receiving sensor data from sensors of the autonomous vehicle;
determining a plurality of contingency trajectories based on the sensor data and a rolling horizon technique;
comparing the plurality of contingency trajectories; and
selecting one of the plurality of contingency trajectories as the contingency trajectory based on the comparison.
37. The non-transitory computer-readable medium of claim 34, wherein detecting the effect comprises at least one of:
detecting a stop of the generation of the trajectory;
detecting data indicative of inoperable high-level logic of the planner; or
detecting a failure or degradation of an artificial intelligence layer of the autonomous vehicle.
38. The non-transitory computer-readable medium of claim 34, the operations further comprising:
receiving static map data;
receiving current object state data and predicted object state data;
receiving local pose data;
determining a confidence level associated with the trajectory based on the static map data, the current object state data, and the predicted object state data; and
determining an operating state of the autonomous vehicle based on the confidence level,
wherein detecting the effect comprises determining that the operating state is an abnormal operating state based on the confidence level meeting or exceeding a threshold confidence.
CN201680064611.XA 2015-11-04 2016-11-02 Method and system for controlling a motion profile of an autonomous vehicle Active CN108292473B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/756,992 2015-11-04
US14/756,992 US9910441B2 (en) 2015-11-04 2015-11-04 Adaptive autonomous vehicle planner logic
PCT/US2016/060029 WO2017079228A2 (en) 2015-11-04 2016-11-02 Adaptive autonomous vehicle planner logic

Publications (2)

Publication Number Publication Date
CN108292473A CN108292473A (en) 2018-07-17
CN108292473B true CN108292473B (en) 2023-06-06

Family

ID=58634674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680064611.XA Active CN108292473B (en) 2015-11-04 2016-11-02 Method and system for controlling a motion profile of an autonomous vehicle

Country Status (5)

Country Link
US (2) US9910441B2 (en)
EP (1) EP3371794A4 (en)
JP (1) JP7195143B2 (en)
CN (1) CN108292473B (en)
WO (1) WO2017079228A2 (en)

US20140129302A1 (en) 2012-11-08 2014-05-08 Uber Technologies, Inc. Providing a confirmation interface for on-demand services through use of portable computing devices
US9671233B2 (en) 2012-11-08 2017-06-06 Uber Technologies, Inc. Dynamically providing position information of a transit object to a computing device
USD743978S1 (en) 2012-11-08 2015-11-24 Uber Technologies, Inc. Display screen of a computing device with a computer-generated electronic panel for providing confirmation for a service request
DE102012022392B4 (en) 2012-11-15 2016-02-04 Audi Ag Method and device for controlling a safety belt connected to a seatbelt device of a vehicle with a predictive collision detection unit
CN103035121A (en) * 2012-12-06 2013-04-10 南京航空航天大学 Planning method of intelligent vehicle autonomous running dynamic trajectory and system of the same
FR3000005B1 (en) 2012-12-21 2015-10-09 Valeo Securite Habitacle REMOTE CONTROL BOX OF A PARKING MANEUVER CONTROL SYSTEM OF A VEHICLE, AND ASSOCIATED METHOD
US20140188347A1 (en) 2012-12-31 2014-07-03 Joseph Akwo Tabe Smart supplemental restraint and occupant classification system
US9367065B2 (en) 2013-01-25 2016-06-14 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
US8948993B2 (en) 2013-03-08 2015-02-03 Richard Schulman Method and system for controlling the behavior of an occupant of a vehicle
US8996224B1 (en) 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
US9395727B1 (en) 2013-03-22 2016-07-19 Google Inc. Single layer shared aperture beam forming network
US9342074B2 (en) 2013-04-05 2016-05-17 Google Inc. Systems and methods for transitioning control of an autonomous vehicle to a driver
US9141107B2 (en) 2013-04-10 2015-09-22 Google Inc. Mapping active and inactive construction zones for autonomous driving
US9025140B2 (en) 2013-05-07 2015-05-05 Google Inc. Methods and systems for detecting weather conditions including sunlight using vehicle onboard sensors
US9632210B2 (en) 2013-05-07 2017-04-25 Google Inc. Methods and systems for detecting weather conditions using vehicle onboard sensors
US8977007B1 (en) 2013-04-23 2015-03-10 Google Inc. Detecting a vehicle signal through image differencing and filtering
US9384443B2 (en) 2013-06-14 2016-07-05 Brain Corporation Robotic training apparatus and methods
WO2014202258A1 (en) 2013-06-21 2014-12-24 National University Of Ireland, Maynooth A method for mapping an environment
RU140935U1 (en) 2013-06-24 2014-05-20 Елена Николаевна Абышева NAVIGATION DEVICE
DE102013213171A1 (en) 2013-07-04 2015-01-08 Robert Bosch Gmbh Method and device for operating a motor vehicle in an automated driving operation
DE102013213169A1 (en) * 2013-07-04 2015-01-08 Robert Bosch Gmbh Method and device for operating a motor vehicle in an automated driving operation
US9579789B2 (en) 2013-09-27 2017-02-28 Brain Corporation Apparatus and methods for training of robotic control arbitration
US9425654B2 (en) 2013-09-30 2016-08-23 Google Inc. Contactless electrical coupling for a rotatable LIDAR device
US9528834B2 (en) 2013-11-01 2016-12-27 Intelligent Technologies International, Inc. Mapping techniques using probe vehicles
US20150127224A1 (en) 2013-11-02 2015-05-07 Joseph Akwo Tabe Advanced weight responsive supplemental restraint and occupant classification system
CN104616516B (en) * 2013-11-04 2017-11-14 深圳市赛格导航科技股份有限公司 A kind of traffic safety auxiliary control method and system
US20150268665A1 (en) 2013-11-07 2015-09-24 Google Inc. Vehicle communication using audible signals
US9212926B2 (en) 2013-11-22 2015-12-15 Ford Global Technologies, Llc In-vehicle path verification
US9200910B2 (en) 2013-12-11 2015-12-01 Here Global B.V. Ranking of path segments based on incident probability
US9984574B2 (en) 2014-01-21 2018-05-29 Tribal Rides, Inc. Method and system for anticipatory deployment of autonomously controlled vehicles
US9201426B1 (en) 2014-02-19 2015-12-01 Google Inc. Reverse iteration of planning data for system control
US9720410B2 (en) 2014-03-03 2017-08-01 Waymo Llc Remote assistance for autonomous vehicles in predetermined situations
US9547989B2 (en) 2014-03-04 2017-01-17 Google Inc. Reporting road event data and sharing with other vehicles
JP6137001B2 (en) 2014-03-14 2017-05-31 株式会社デンソー In-vehicle device
EP2921362B1 (en) * 2014-03-18 2020-08-12 Volvo Car Corporation Vehicle, vehicle system and method for increasing safety and/or comfort during autonomous driving
US9960986B2 (en) 2014-03-19 2018-05-01 Uber Technologies, Inc. Providing notifications to devices based on real-time conditions related to an on-demand service
US20150292894A1 (en) 2014-04-11 2015-10-15 Telecommunication Systems, Inc. Travel route
US9475422B2 (en) 2014-05-22 2016-10-25 Applied Invention, Llc Communication between autonomous vehicle and external observers
US20150338226A1 (en) 2014-05-22 2015-11-26 Telogis, Inc. Context-based routing and access path selection
US9631933B1 (en) 2014-05-23 2017-04-25 Google Inc. Specifying unavailable locations for autonomous vehicles
US9393922B2 (en) 2014-05-23 2016-07-19 Google Inc. Devices and methods for an energy-absorbing end of a vehicle
US10424036B2 (en) 2014-06-02 2019-09-24 Uber Technologies, Inc. Maintaining data for use with a transport service during connectivity loss between systems
US9235775B2 (en) 2014-06-08 2016-01-12 Uber Technologies, Inc. Entrance detection from street-level imagery
US9494937B2 (en) 2014-06-20 2016-11-15 Verizon Telematics Inc. Method and system for drone deliveries to vehicles in route
US9283678B2 (en) 2014-07-16 2016-03-15 Google Inc. Virtual safety cages for robotic devices
US9381949B2 (en) 2014-10-15 2016-07-05 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles having a cross-vehicle stabilizing structure
USPP28774P3 (en) 2014-11-27 2017-12-19 Agro Selections Fruits Peach tree named ‘CRISPONDA’
US9371093B1 (en) 2014-12-03 2016-06-21 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicles having upper side member reinforcement portions
JP6277948B2 (en) 2014-12-05 2018-02-14 マツダ株式会社 Lower body structure of automobile
USPP28706P3 (en) 2014-12-08 2017-11-28 Syngenta Participations Ag Dahlia plant named ‘DAHZ0001’
GB2535718A (en) 2015-02-24 2016-08-31 Addison Lee Ltd Resource management
US20160247106A1 (en) 2015-02-24 2016-08-25 Siemens Aktiengesellschaft Managing a fleet of autonomous electric vehicles for on-demand transportation and ancillary services to electrical grid
US10023231B2 (en) 2015-08-12 2018-07-17 Madhusoodhan Ramanujam Parking autonomous vehicles
US9805605B2 (en) 2015-08-12 2017-10-31 Madhusoodhan Ramanujam Using autonomous vehicles in a taxi service
US10220705B2 (en) 2015-08-12 2019-03-05 Madhusoodhan Ramanujam Sharing autonomous vehicles
WO2017058961A2 (en) * 2015-09-28 2017-04-06 Uber Technologies, Inc. Autonomous vehicle with independent auxiliary control units
US9910441B2 (en) 2015-11-04 2018-03-06 Zoox, Inc. Adaptive autonomous vehicle planner logic

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101311680A (en) * 2007-05-23 2008-11-26 株式会社电装 Apparatus and program for navigation
CN102084219A (en) * 2007-06-15 2011-06-01 微软公司 Route modifications
CN102227612A (en) * 2008-10-24 2011-10-26 格瑞股份公司 Control and systems for autonomously driven vehicles
CN102334151A (en) * 2009-02-27 2012-01-25 丰田自动车株式会社 Movement trajectory generator
CN102859323A (en) * 2009-11-24 2013-01-02 特洛吉斯有限公司 Vehicle route selection based on energy usage
CN102235875A (en) * 2010-03-23 2011-11-09 株式会社电装 Vehicular navigation device
CN103069153A (en) * 2010-07-07 2013-04-24 罗伯特·博世有限公司 System and method for controlling the engine of a vehicle
CN102901510A (en) * 2011-07-25 2013-01-30 通用汽车环球科技运作有限责任公司 Autonomous convoying technique for vehicles
CN104185775A (en) * 2012-01-20 2014-12-03 丰田自动车工程及制造北美公司 Intelligent navigation system
CN104937512A (en) * 2012-09-21 2015-09-23 罗伯特·博世有限公司 Method and device for operating motor vehicle in automated driving mode
CN104812645A (en) * 2012-09-27 2015-07-29 谷歌公司 Determining changes in a driving environment based on vehicle behavior
US9008890B1 (en) * 2013-03-15 2015-04-14 Google Inc. Augmented trajectories for autonomous vehicles
EP2858060A2 (en) * 2013-10-02 2015-04-08 Audi Ag Method for controlling a motor vehicle and motor vehicle

Also Published As

Publication number Publication date
JP2019501072A (en) 2019-01-17
US9910441B2 (en) 2018-03-06
US20180196439A1 (en) 2018-07-12
US10921811B2 (en) 2021-02-16
EP3371794A4 (en) 2019-05-01
CN108292473A (en) 2018-07-17
JP7195143B2 (en) 2022-12-23
WO2017079228A2 (en) 2017-05-11
EP3371794A2 (en) 2018-09-12
US20170123429A1 (en) 2017-05-04
WO2017079228A3 (en) 2017-06-15

Similar Documents

Publication Publication Date Title
CN108292473B (en) Method and system for controlling a motion profile of an autonomous vehicle
US11796998B2 (en) Autonomous vehicle fleet service and system
US11061398B2 (en) Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US11301767B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
CN114822008B (en) Coordination of dispatch and maintenance of fleet of autonomous vehicles
US11106218B2 (en) Adaptive mapping to navigate autonomous vehicles responsive to physical environment changes
CN108475406B (en) Software application for requesting and controlling autonomous vehicle services
US20200074024A1 (en) Simulation system and methods for autonomous vehicles
CN108700876B (en) Remote operation system and method for autonomous vehicle trajectory modification
US9734455B2 (en) Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
WO2017079229A1 (en) Simulation system and methods for autonomous vehicles
JP2019504800A (en) Simulation system and method for autonomous vehicles
JP2022137160A (en) Machine learning system and technique for optimizing remote control and/or planner determination
US20240028031A1 (en) Autonomous vehicle fleet service and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant