CN111538322B - Sensor data selection method and device for automatic driving vehicle and vehicle-mounted equipment - Google Patents

Publication number: CN111538322B
Application number: CN201910106037.5A
Authority: CN (China)
Prior art keywords: sensor data, vehicle, planning information, backward, selecting
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN111538322A
Inventors: 张宇, 林伟, 石磊, 冯威, 刘晓彤
Assignee (original and current): Uisee Technologies Beijing Co Ltd
Application filed by Uisee Technologies Beijing Co Ltd; priority to CN201910106037.5A

Classifications

    All within G05D1/021 (G PHYSICS › G05 CONTROLLING; REGULATING › G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles › G05D1/02 Control of position or course in two dimensions › G05D1/021 specially adapted to land vehicles):
    • G05D1/0214 — with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 — with means for defining a desired trajectory involving a learning process
    • G05D1/0223 — with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0238 — using optical position detecting means using obstacle or wall sensors
    • G05D1/024 — using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0246 — using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0255 — using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 — using a radar
    • G05D1/0276 — using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Traffic Control Systems (AREA)

Abstract

Embodiments of the present application relate to a sensor data selection method and device for an autonomous vehicle, and to vehicle-mounted equipment. The autonomous vehicle includes a plurality of sensors, the plurality of sensors including a plurality of cameras and a laser radar and/or an ultrasonic radar, and the plurality of cameras including a forward camera, a backward camera, a left camera, and a right camera. The method comprises the following steps: acquiring data of the plurality of sensors; generating perceived positioning information based on sensor data, the sensor data being selected from the plurality of sensor data based on vehicle planning information; and generating vehicle planning information based on the perceived positioning information. According to the application, sensor data are selected based on vehicle planning information, and perception and positioning are performed based on the selected sensor data to generate vehicle planning information. The automatic driving system therefore does not need to process all sensor data, which reduces data processing time, meets the low-latency requirement for planning an automatic driving strategy, improves the accuracy of the strategy, and helps ensure driving safety.

Description

Sensor data selection method and device for automatic driving vehicle and vehicle-mounted equipment
Technical Field
Embodiments of the present application relate to the technical field of vehicle autonomous driving, and in particular to a method and device for selecting sensor data of an autonomous vehicle, and to vehicle-mounted equipment.
Background
With the rapid development of vehicle autonomous driving technology, the requirements for perceiving the environment around the vehicle keep increasing. The sensors mainly used in current vehicle automatic driving systems include image sensors and radars. An image sensor is, for example, a camera; radars include millimeter wave radar, ultrasonic radar, laser radar, and the like. Different sensors have different characteristics and different sensing ranges.
Current vehicle automatic driving systems typically use multiple sensors, which collect a significant amount of data. For example, an autonomous vehicle includes a plurality of cameras that simultaneously capture multiple images. Processing all of these images causes a long delay. The automatic driving system requires low latency when planning an automatic driving strategy; a longer delay reduces the accuracy of the strategy and affects driving safety.
Disclosure of Invention
In order to solve at least one problem existing in the prior art, at least one embodiment of the present application provides a method and apparatus for selecting sensor data of an autonomous vehicle, and a vehicle-mounted device.
In a first aspect, an embodiment of the present application provides a method for selecting sensor data of an autonomous vehicle, where the autonomous vehicle includes a plurality of sensors, the plurality of sensors include a plurality of cameras, a laser radar and/or an ultrasonic radar, the plurality of cameras include a forward camera, a backward camera, a left camera and a right camera, and the method includes:
acquiring data of the plurality of sensors;
generating perceived positioning information based on sensor data, the sensor data being selected from the plurality of sensor data based on vehicle planning information;
and generating vehicle planning information based on the perceived positioning information.
In some embodiments, the vehicle planning information includes: heading of the vehicle;
the selecting from a plurality of sensor data based on vehicle planning information includes:
selecting the sensor data corresponding to the vehicle heading.
In some embodiments, the selecting the sensor data corresponding to the vehicle heading includes:
when the vehicle heading is the forward direction of the vehicle, selecting forward sensor data in the plurality of sensor data;
and when the vehicle heading is backward of the vehicle, backward sensor data in the plurality of sensor data is selected.
In some embodiments, the selecting the sensor data corresponding to the vehicle heading includes:
when the vehicle heading is the left forward direction of the vehicle, selecting forward sensor data and left sensor data in the plurality of sensor data;
when the vehicle heading is the right forward direction of the vehicle, selecting forward sensor data and right sensor data in the plurality of sensor data;
when the vehicle heading is the left backward direction of the vehicle, backward sensor data and left sensor data in the plurality of sensor data are selected;
and when the vehicle heading is the right backward direction of the vehicle, selecting backward sensor data and right sensor data in the plurality of sensor data.
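The heading-to-sensor mapping enumerated in the embodiments above can be pictured as a lookup table. The following Python sketch is illustrative only; the `Heading` enum, the direction labels, and the dictionary layout of sensor data are assumptions made for the example, not the patent's implementation.

```python
# Hypothetical sketch of heading-based sensor data selection.
from enum import Enum


class Heading(Enum):
    FORWARD = "forward"
    BACKWARD = "backward"
    LEFT_FORWARD = "left_forward"
    RIGHT_FORWARD = "right_forward"
    LEFT_BACKWARD = "left_backward"
    RIGHT_BACKWARD = "right_backward"


# Directional sensor groups to keep for each vehicle heading.
SELECTION = {
    Heading.FORWARD: {"forward"},
    Heading.BACKWARD: {"backward"},
    Heading.LEFT_FORWARD: {"forward", "left"},
    Heading.RIGHT_FORWARD: {"forward", "right"},
    Heading.LEFT_BACKWARD: {"backward", "left"},
    Heading.RIGHT_BACKWARD: {"backward", "right"},
}


def select_sensor_data(all_data: dict, heading: Heading) -> dict:
    """Keep only the sensor data whose direction matches the heading."""
    wanted = SELECTION[heading]
    return {name: d for name, d in all_data.items()
            if d["direction"] in wanted}
```

With this mapping, a left-forward heading keeps only the forward and left sensor streams, so the perception stage never touches the backward and right data.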
In some embodiments, the vehicle planning information includes: a desired path;
the selecting from a plurality of sensor data based on vehicle planning information includes:
based on the desired path, sensor data is selected.
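The desired-path variant is not detailed further in this embodiment, but one plausible reading is to derive the relevant sensor directions from where the next waypoint on the desired path lies relative to the vehicle. The sketch below is a hypothetical interpretation; the coordinate convention (yaw in degrees counterclockwise from the +x axis, positive relative bearings to the vehicle's left) and the 30° side threshold are assumptions.

```python
import math


def path_directions(vehicle_xy, vehicle_yaw_deg, waypoint_xy,
                    side_threshold_deg=30.0):
    """Map the bearing of the next waypoint, relative to the vehicle's
    yaw, to the directional sensor groups worth processing."""
    dx = waypoint_xy[0] - vehicle_xy[0]
    dy = waypoint_xy[1] - vehicle_xy[1]
    rel = math.degrees(math.atan2(dy, dx)) - vehicle_yaw_deg
    rel = (rel + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)

    # Ahead vs. behind, plus a lateral group if the bearing is off-axis.
    dirs = {"forward"} if abs(rel) <= 90.0 else {"backward"}
    if side_threshold_deg < rel < 180.0 - side_threshold_deg:
        dirs.add("left")
    elif -(180.0 - side_threshold_deg) < rel < -side_threshold_deg:
        dirs.add("right")
    return dirs
```

The returned direction set can then drive the same kind of filtering as the heading-based selection: only sensor data for the directions the desired path actually passes through need to be processed.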
In a second aspect, an embodiment of the present application further provides a sensor data selection apparatus for an autonomous vehicle, where the autonomous vehicle includes a plurality of sensors, the plurality of sensors including a plurality of cameras, a laser radar, and/or an ultrasonic radar, the plurality of cameras including a forward camera, a backward camera, a left camera, and a right camera, and the apparatus includes:
an acquisition unit configured to acquire data of the plurality of sensors;
a first generation unit configured to generate perceived positioning information based on sensor data selected from a plurality of sensor data based on vehicle planning information;
and the second generation unit is used for generating vehicle planning information based on the perceived positioning information.
In some embodiments, the vehicle planning information includes: heading of the vehicle;
the sensor data is sensor data corresponding to the heading of the vehicle.
In some embodiments, the sensor data corresponding to the vehicle heading comprises:
when the vehicle heading is the forward direction of the vehicle, the corresponding sensor data are the forward sensor data;
and when the vehicle heading is the backward direction of the vehicle, the corresponding sensor data are the backward sensor data.
In some embodiments, the sensor data corresponding to the vehicle heading comprises:
when the vehicle heading is the left forward direction of the vehicle, the corresponding sensor data are forward sensor data and left sensor data;
when the vehicle heading is the right forward direction of the vehicle, the corresponding sensor data are forward sensor data and right sensor data;
when the vehicle heading is the left backward direction of the vehicle, the corresponding sensor data are backward sensor data and left sensor data;
when the vehicle heading is the right backward direction of the vehicle, the corresponding sensor data are backward sensor data and right sensor data.
In some embodiments, the vehicle planning information includes: a desired path;
the sensor data is selected from a plurality of sensor data based on the desired path.
In a third aspect, an embodiment of the present application further provides an in-vehicle apparatus, including:
a processor, a memory, and a user interface;
the processor, memory, and user interface are coupled together by a bus system;
the processor is configured to perform the steps of the method according to the first aspect by calling a program or instructions stored in the memory.
It can be seen that in at least one embodiment of the present application, sensor data are selected based on vehicle planning information, and perception and positioning are performed based on the selected sensor data to generate perceived positioning information, from which vehicle planning information is generated. Because the automatic driving system does not need to process all sensor data, part of the computing resources are saved and the data processing time is reduced; this meets the low-latency requirement of the automatic driving system when planning an automatic driving strategy, improves the accuracy of the strategy, and helps ensure driving safety.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a functional architecture diagram of an automatic driving system according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of a vehicle-mounted device according to an embodiment of the present application;
Fig. 3 is a flowchart of a method for selecting sensor data of an autonomous vehicle according to an embodiment of the present application;
Fig. 4 is a block diagram of a sensor data selection device for an autonomous vehicle according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Fig. 1 is a functional architecture diagram of an automatic driving system according to an embodiment of the present application. As shown in Fig. 1, the automatic driving system includes: a sensor data receiving module, a sensor data selection module, a sensor data processing module, and a planning control module. In addition, the execution control module shown in Fig. 1 belongs to the vehicle's low-level execution system, which includes the steering, braking, and power systems.
The sensor data receiving module receives data from a sensor group, where the sensor group includes a plurality of cameras, one or more laser radars, a plurality of millimeter wave radars, ultrasonic radars, and the like. During autonomous driving, all sensors in the sensor group transmit data at a high frequency.
The sensor data selection module selects sensor data from the plurality of sensor data based on the vehicle planning information generated by the planning control module.
The sensor data processing module performs perception and positioning based on the sensor data selected by the sensor data selection module and generates perceived positioning information. Specifically, performing perception and positioning includes, for example, obstacle detection and recognition of the drivable area in camera images.
The planning control module plans the vehicle's behavior based on the perceived positioning information generated by the sensor data processing module, generates vehicle planning information, and generates vehicle control instructions based on the vehicle planning information so as to control the vehicle to travel along the desired path. Vehicle planning information includes, for example, but is not limited to: vehicle heading, desired path, vehicle speed, and the like.
The execution control module controls vehicle travel based on the vehicle control instructions of the planning control module, for example controlling the steering wheel, brake, and accelerator to control the vehicle laterally and longitudinally.
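The modules above form a closed loop: planning output feeds back into sensor selection on the next cycle. A minimal sketch of that wiring, with assumed function signatures (the patent does not prescribe an API), might look like:

```python
# Illustrative wiring of the modules in Fig. 1; class and method names
# are assumptions for this sketch, not the patent's implementation.
class AutoDrivePipeline:
    def __init__(self, select_fn, perceive_fn, plan_fn):
        self.select_fn = select_fn      # sensor data selection module
        self.perceive_fn = perceive_fn  # sensor data processing module
        self.plan_fn = plan_fn          # planning control module
        self.plan = None                # no planning info at start-up

    def step(self, all_sensor_data):
        # Before the first plan exists, all sensor data are processed.
        data = (all_sensor_data if self.plan is None
                else self.select_fn(all_sensor_data, self.plan))
        perception = self.perceive_fn(data)
        self.plan = self.plan_fn(perception)  # fed back to the next cycle
        return self.plan
```

On the first cycle no plan exists yet, so all sensor data are processed; from the second cycle on, the previously generated plan narrows the data passed to perception.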
Fig. 2 is a schematic structural diagram of a vehicle-mounted device according to an embodiment of the present application. The vehicle-mounted device shown in Fig. 2 includes: at least one processor 201, at least one memory 202, and other user interfaces 203. The various components in the vehicle-mounted device are coupled together by a bus system 204, which enables communication between these components. In addition to the data bus, the bus system 204 includes a power bus, a control bus, and a status signal bus; for clarity of illustration, however, the various buses are all labeled as the bus system 204 in Fig. 2.
The user interface 203 may include a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, or a touch pad), among others.
It will be appreciated that the memory 202 in this embodiment may be either volatile or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory, among others. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 202 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 202 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 2021 and application programs 2022.
The operating system 2021 contains various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 2022 includes various application programs such as a media player (MediaPlayer), a Browser (Browser), and the like for implementing various application services. The program implementing the method of the embodiment of the present application may be contained in the application program 2022.
In the embodiment of the present application, the processor 201 executes the steps of the embodiments of the sensor data selection method for an autonomous vehicle by calling a program or instructions stored in the memory 202, specifically a program or instructions stored in the application program 2022. The autonomous vehicle includes a plurality of sensors, the plurality of sensors including a plurality of cameras and at least one of a laser radar, an ultrasonic radar, and the like, and the plurality of cameras including a forward camera, a backward camera, a left camera, and a right camera. The method includes, for example, the following steps one to three:
step one, acquiring data of a plurality of sensors;
step two, generating perceived positioning information based on sensor data, wherein the sensor data are selected from the plurality of sensor data based on vehicle planning information;
and step three, generating vehicle planning information based on the perceived positioning information.
The method disclosed in the above embodiment of the present application may be applied to the processor 201 or implemented by the processor 201. The processor 201 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 201 or by instructions in the form of software. The processor 201 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the method disclosed in connection with the embodiments of the present application may be embodied as being executed directly by a hardware decoding processor, or by a combination of hardware and software units in a decoding processor. The software units may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, or a register. The storage medium is located in the memory 202, and the processor 201 reads the information in the memory 202 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, micro-controllers, microprocessors, other electronic units adapted to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present application, it should be understood that the order of execution may be arbitrarily adjusted, unless there is an explicit order of precedence between the steps of the method embodiments. The disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be embodied in essence or a part contributing to the prior art or a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
Fig. 3 is a flowchart of a method for selecting sensor data of an automatic driving vehicle according to an embodiment of the present application.
As shown in Fig. 3, in the sensor data selection method for an autonomous vehicle according to this embodiment, the autonomous vehicle includes a plurality of sensors; the plurality of sensors include a plurality of cameras and at least one of a laser radar, an ultrasonic radar, and the like, the plurality of cameras including a forward camera, a backward camera, a left camera, and a right camera. The method may include the following steps 301 to 303:
301. data of the plurality of sensors is acquired.
302. Perceived positioning information is generated based on sensor data, the sensor data being selected from the plurality of sensor data based on the vehicle planning information.
303. Vehicle planning information is generated based on the perceived positioning information.
In this embodiment, since the plurality of sensors are transmitting data at a high frequency during the automatic driving of the vehicle, the automatic driving system continuously acquires the data of the plurality of sensors.
The automated driving system selects data of the sensor based on the generated vehicle planning information and generates perceived location information based on the selected sensor data, thereby continuously generating vehicle planning information based on the perceived location information to continuously select data of the sensor based on the generated vehicle planning information.
The autopilot system does not generate vehicle planning information for a period of time during which operation is initiated, and therefore, all sensor data is used for perceived positioning during the period of time, perceived positioning information is generated, and vehicle planning information is generated based on the perceived positioning information.
After the automatic driving system generates the vehicle planning information, the acquired sensor data are not all used for sensing positioning, but the sensor data used for sensing positioning are selected based on the generated vehicle planning information.
After the automatic driving system selects the sensor data, perceived positioning information is generated based on the selected sensor data, and vehicle planning information is then generated based on that perceived positioning information. The vehicle planning information is not re-planned from scratch; it is generated continuously based on the previously generated vehicle planning information.
The automated driving system uses the generated vehicle planning information as feedback to influence the selection of sensor data.
Therefore, in the sensor data selection method for an automatic driving vehicle disclosed in this embodiment, sensor data is selected based on the vehicle planning information, perceived positioning is performed based on the selected sensor data to generate perceived positioning information, and vehicle planning information is generated from that information.
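The feedback loop above can be sketched as follows. This is a minimal, hypothetical illustration, not the patent's actual implementation: the function name `select_sensor_data`, the `"direction"` key on each frame, and the plan structure are all assumptions chosen for clarity. At start-up no plan exists yet, so every frame is used; once a plan is available, only the frames whose direction the plan asks for are kept.

```python
def select_sensor_data(frames, planning_info=None):
    """Return the subset of sensor frames used for perceived positioning.

    Before the first planning cycle (planning_info is None), all sensor
    data is used; afterwards only the frames relevant to the plan remain.
    """
    if planning_info is None:              # start-up: no plan generated yet
        return list(frames)
    wanted = planning_info["directions"]   # e.g. {"forward", "left"}
    return [f for f in frames if f["direction"] in wanted]

frames = [
    {"sensor": "camera", "direction": "forward"},
    {"sensor": "lidar",  "direction": "backward"},
    {"sensor": "camera", "direction": "left"},
]

# First cycle: no plan yet, so all three frames feed perceived positioning.
all_data = select_sensor_data(frames)
# Later cycle: the previous plan says the vehicle is heading forward.
forward_only = select_sensor_data(frames, {"directions": {"forward"}})
```

Each new plan produced from the perceived positioning information becomes the `planning_info` argument of the next cycle, closing the loop the embodiment describes.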
In some embodiments, the vehicle planning information includes a vehicle heading. Selecting from the plurality of sensor data based on the vehicle planning information includes selecting the sensor data corresponding to the vehicle heading, specifically the following (1) and (2):
(1) When the vehicle heading is the forward direction of the vehicle, selecting forward sensor data in the plurality of sensor data;
(2) When the vehicle heading is the backward direction of the vehicle, selecting backward sensor data in the plurality of sensor data.
In this embodiment, while the vehicle travels forward, the forward sensor data may be selected, such as forward camera data, forward laser radar data, forward millimeter wave radar data, and the like.
The sensor data includes direction information; that is, the plurality of sensor data includes sensor data with different direction information. Selecting forward sensor data may be understood as selecting the sensor data whose direction information is forward. For example, laser radar data includes direction information, and forward laser radar data may be understood as laser radar data whose direction information is forward.
In other embodiments, selecting the sensor data corresponding to the vehicle heading specifically includes the following (3) to (6):
(3) When the vehicle heading is the left forward direction of the vehicle, selecting forward sensor data and left sensor data in the plurality of sensor data;
(4) When the vehicle heading is the right forward direction of the vehicle, selecting forward sensor data and right sensor data in the plurality of sensor data;
(5) When the vehicle heading is the left backward direction of the vehicle, selecting backward sensor data and left sensor data in the plurality of sensor data;
(6) When the vehicle heading is the right backward direction of the vehicle, selecting backward sensor data and right sensor data in the plurality of sensor data.
In this embodiment, when the vehicle is backing toward the right rear, the backward sensor data and the right sensor data may be selected, such as backward camera data, backward laser radar data, backward millimeter wave radar data, right camera data, right laser radar data, right millimeter wave radar data, and the like.
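Items (1) to (6) amount to a mapping from the planned heading to a set of sensor directions, with diagonal headings taking the union of their two component directions. A sketch of that mapping follows; the heading labels and the `"direction"` key are illustrative assumptions, not terms from the patent.

```python
# Heading -> set of sensor-data directions to keep, per items (1)-(6).
HEADING_TO_DIRECTIONS = {
    "forward":        {"forward"},
    "backward":       {"backward"},
    "left-forward":   {"forward", "left"},
    "right-forward":  {"forward", "right"},
    "left-backward":  {"backward", "left"},
    "right-backward": {"backward", "right"},
}

def select_by_heading(frames, heading):
    """Keep only the sensor frames whose direction the heading calls for."""
    wanted = HEADING_TO_DIRECTIONS[heading]
    return [f for f in frames if f["direction"] in wanted]

frames = [{"direction": d} for d in ("forward", "backward", "left", "right")]
# Backing toward the right rear keeps the backward and right frames.
rear_right = select_by_heading(frames, "right-backward")
```

A table-driven mapping like this keeps the selection rule declarative, so adding a heading means adding one dictionary entry rather than another branch.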
In some embodiments, the vehicle planning information includes a desired path. Selecting from the plurality of sensor data based on the vehicle planning information includes selecting sensor data based on the desired path.
For example, which sensors' data is needed may be calculated based on the desired path, and that sensor data selected. Such calculation methods are well established in the field and are not described in detail here.
In this embodiment, when the desired path is a left lane change to overtake, forward sensor data, left sensor data, and backward sensor data may be selected, such as forward camera data, forward laser radar data, forward millimeter wave radar data, left camera data, left laser radar data, left millimeter wave radar data, backward camera data, backward laser radar data, backward millimeter wave radar data, and the like.
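Path-based selection can be sketched the same way: a maneuver-level label derived from the desired path determines which direction groups are needed. The maneuver names and the mapping below are hypothetical assumptions for illustration; a real system would compute the needed directions from the path geometry itself.

```python
# Maneuver label (derived from the desired path) -> needed directions.
# "lane-change-left" matches the overtaking example in the embodiment.
PATH_TO_DIRECTIONS = {
    "keep-lane":         {"forward"},
    "lane-change-left":  {"forward", "left", "backward"},
    "lane-change-right": {"forward", "right", "backward"},
}

def select_by_path(frames, maneuver):
    """Keep only the sensor frames relevant to the planned maneuver."""
    wanted = PATH_TO_DIRECTIONS[maneuver]
    return [f for f in frames if f["direction"] in wanted]

frames = [{"direction": d} for d in ("forward", "backward", "left", "right")]
lane_change = select_by_path(frames, "lane-change-left")
```

For a left lane change this keeps the forward, backward, and left frames and drops the right ones, matching the example above.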
As shown in fig. 4, the present embodiment discloses a sensor data selection apparatus for an autonomous vehicle. The autonomous vehicle includes a plurality of sensors, the plurality of sensors including a plurality of cameras (a forward camera, a backward camera, a left camera and a right camera) and a laser radar and/or an ultrasonic radar. The apparatus includes: an acquisition unit 41, a first generation unit 42, and a second generation unit 43, described as follows:
an acquisition unit 41 for acquiring data of the plurality of sensors;
a first generation unit 42 for generating perceived positioning information based on sensor data selected from a plurality of sensor data based on vehicle planning information;
a second generation unit 43 for generating vehicle planning information based on the perceived positioning information.
In some embodiments, the vehicle planning information includes: heading of the vehicle;
the sensor data is sensor data corresponding to the heading of the vehicle.
In some embodiments, the sensor data corresponding to the vehicle heading comprises:
when the vehicle course is the forward direction of the vehicle, the corresponding sensor data is the forward sensor data;
and when the vehicle course is the backward direction of the vehicle, the corresponding sensor data is backward sensor data.
In some embodiments, the sensor data corresponding to the vehicle heading comprises:
when the vehicle course is the left forward direction of the vehicle, the corresponding sensor data are forward sensor data and left sensor data;
when the vehicle course is the right forward direction of the vehicle, the corresponding sensor data are forward sensor data and right sensor data;
when the vehicle course is the left backward direction of the vehicle, the corresponding sensor data are backward sensor data and left sensor data;
when the vehicle course is the right backward direction of the vehicle, the corresponding sensor data are backward sensor data and right sensor data.
In some embodiments, the vehicle planning information includes: a desired path;
the sensor data is selected from a plurality of sensor data based on the desired path.
The sensor data selection apparatus for an automatic driving vehicle disclosed in the above embodiments can implement the flow of the sensor data selection method for an automatic driving vehicle disclosed in the above method embodiments; to avoid repetition, the description is not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments.
Although embodiments of the present application have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the application, and such modifications and variations fall within the scope of the application as defined by the appended claims.

Claims (11)

1. A method of sensor data selection for an autonomous vehicle, the autonomous vehicle comprising a plurality of sensors, the plurality of sensors comprising a plurality of cameras, a lidar and/or an ultrasonic radar, the plurality of cameras comprising a forward camera, a rearward camera, a left camera and a right camera, the method comprising:
acquiring data of the plurality of sensors;
generating perceived positioning information based on sensor data selected from a plurality of sensor data based on vehicle planning information;
generating vehicle planning information based on the perceived positioning information;
the generating perceived positioning information based on sensor data selected from a plurality of sensor data based on vehicle planning information, comprising:
generating perceived positioning information based on the acquired plurality of sensor data;
generating vehicle planning information based on the perceived positioning information generated from the acquired plurality of sensor data;
selecting sensor data for perceived positioning based on the vehicle planning information generated from the perceived positioning information generated from the acquired plurality of sensor data;
the vehicle planning information includes:
heading of the vehicle;
a path is desired.
2. The method of claim 1, wherein the selecting from a plurality of sensor data based on vehicle planning information comprises:
and selecting the sensor data corresponding to the vehicle course.
3. The method of claim 2, wherein the selecting the sensor data corresponding to the vehicle heading comprises:
when the vehicle heading is the forward direction of the vehicle, selecting forward sensor data in the plurality of sensor data;
when the vehicle heading is the backward direction of the vehicle, selecting backward sensor data in the plurality of sensor data.
4. The method of claim 2, wherein the selecting the sensor data corresponding to the vehicle heading comprises:
when the vehicle heading is the left forward direction of the vehicle, selecting forward sensor data and left sensor data in the plurality of sensor data;
when the vehicle heading is the right forward direction of the vehicle, selecting forward sensor data and right sensor data in the plurality of sensor data;
when the vehicle heading is the left backward direction of the vehicle, selecting backward sensor data and left sensor data in the plurality of sensor data;
when the vehicle heading is the right backward direction of the vehicle, selecting backward sensor data and right sensor data in the plurality of sensor data.
5. The method of claim 1, wherein the selecting from a plurality of sensor data based on vehicle planning information comprises:
based on the desired path, sensor data is selected.
6. A sensor data selection apparatus for an autonomous vehicle, the autonomous vehicle comprising a plurality of sensors, the plurality of sensors comprising a plurality of cameras, a lidar and/or an ultrasonic radar, the plurality of cameras comprising a forward camera, a rearward camera, a left camera and a right camera, the apparatus comprising:
an acquisition unit configured to acquire data of the plurality of sensors;
a first generation unit configured to generate perceived positioning information based on sensor data selected from a plurality of sensor data based on vehicle planning information;
a second generation unit configured to generate vehicle planning information based on the perceived positioning information;
the first generation unit is specifically configured to generate perceived positioning information based on the acquired plurality of sensor data;
the second generation unit is further configured to generate vehicle planning information based on the perceived positioning information generated from the acquired plurality of sensor data;
the acquisition unit is further configured to acquire the vehicle planning information generated from the perceived positioning information generated from the acquired plurality of sensor data, and to select the sensor data used for perceived positioning;
the second generation unit generates vehicle planning information based on the perceived positioning information including:
heading of the vehicle;
a path is desired.
7. The apparatus according to claim 6, wherein:
the sensor data is sensor data corresponding to the heading of the vehicle.
8. The apparatus of claim 7, wherein the sensor data corresponding to the vehicle heading comprises:
when the vehicle course is the forward direction of the vehicle, the corresponding sensor data is the forward sensor data;
and when the vehicle course is the backward direction of the vehicle, the corresponding sensor data is backward sensor data.
9. The apparatus of claim 7, wherein the sensor data corresponding to the vehicle heading comprises:
when the vehicle course is the left forward direction of the vehicle, the corresponding sensor data are forward sensor data and left sensor data;
when the vehicle course is the right forward direction of the vehicle, the corresponding sensor data are forward sensor data and right sensor data;
when the vehicle course is the left backward direction of the vehicle, the corresponding sensor data are backward sensor data and left sensor data;
when the vehicle course is the right backward direction of the vehicle, the corresponding sensor data are backward sensor data and right sensor data.
10. The apparatus according to claim 6, wherein:
the sensor data is selected from a plurality of sensor data based on the desired path.
11. An in-vehicle apparatus, characterized by comprising:
a processor, a memory, and a user interface;
the processor, memory, and user interface are coupled together by a bus system;
the processor is adapted to perform the steps of the method according to any one of claims 1 to 5 by invoking a program or instruction stored in the memory.
CN201910106037.5A 2019-01-18 2019-01-18 Sensor data selection method and device for automatic driving vehicle and vehicle-mounted equipment Active CN111538322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910106037.5A CN111538322B (en) 2019-01-18 2019-01-18 Sensor data selection method and device for automatic driving vehicle and vehicle-mounted equipment

Publications (2)

Publication Number Publication Date
CN111538322A CN111538322A (en) 2020-08-14
CN111538322B true CN111538322B (en) 2023-09-15

Family

ID=71972846


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113075716A (en) * 2021-03-19 2021-07-06 地平线(上海)人工智能技术有限公司 Image-based vehicle positioning method and device, storage medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103587469A (en) * 2013-11-19 2014-02-19 浙江吉利汽车研究院有限公司 Device and method for assisting automobile in turning at night
CN103786632A (en) * 2012-10-31 2014-05-14 现代摩比斯株式会社 Lighting system for vehicle and control method thereof
CN105511469A (en) * 2015-12-18 2016-04-20 北京联合大学 Unmanned intelligent patrol electric vehicle and patrol system
CN106335430A (en) * 2016-08-05 2017-01-18 浙江金刚汽车有限公司 Device for eliminating column A blind zone during vehicle steering
CN106681319A (en) * 2016-12-09 2017-05-17 重庆长安汽车股份有限公司 Automatic lane-changing system and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10369994B2 (en) * 2016-07-20 2019-08-06 Ford Global Technologies, Llc Rear camera stub detection
US10460180B2 (en) * 2017-04-20 2019-10-29 GM Global Technology Operations LLC Systems and methods for visual classification with region proposals
US10386856B2 (en) * 2017-06-29 2019-08-20 Uber Technologies, Inc. Autonomous vehicle collision mitigation systems and methods




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant