WO2022022284A1 - Method and apparatus for detecting a target object


Info

Publication number
WO2022022284A1
Authority
WO
WIPO (PCT)
Prior art keywords
endpoint
uncertainty
point
feature
feature points
Prior art date
Application number
PCT/CN2021/106261
Other languages
English (en)
Chinese (zh)
Inventor
曹彤彤
李向旭
刘冰冰
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022022284A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762: Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle

Definitions

  • The present application relates to the field of autonomous driving, and more particularly, to a method and an apparatus for perceiving a target object.
  • The set of point data describing the shape of a target object obtained by a collection device is also called a point cloud.
  • A commonly used type of point cloud is the laser point cloud: when a laser beam irradiates the surface of a target object, the reflected laser light carries information such as the azimuth and distance of the target object. If the laser beam is scanned along a certain trajectory, the reflected laser point information is recorded during scanning; because the scanning is relatively fine, a large number of laser points can be obtained, forming a laser point cloud. At present, laser point clouds are widely used in the field of autonomous driving or unmanned driving to perceive objects.
  • In existing solutions, the point cloud data containing the target object is first processed to obtain a point cloud cluster representing the target object, the geometric center or center of gravity of the point cloud cluster is determined, and the position and velocity of the target object are then calculated from the position and velocity of that geometric center or center of gravity, so as to perceive the target object.
  • In view of this, the present application provides a method and an apparatus for perceiving a target object, so as to improve the accuracy of calculating the position or velocity of the target object.
  • In a first aspect, a method for perceiving a target object is provided, comprising: acquiring a plurality of feature points of a point cloud cluster, the point cloud cluster representing the target object; determining an uncertainty of each feature point in the plurality of feature points, the uncertainty indicating the error introduced when the position of that feature point in the point cloud cluster is collected by a collection device; obtaining, based on the state of each feature point in the plurality of feature points, a first state of the target object corresponding to that feature point, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target object; and determining a second state of the target object based on the first state of the target object corresponding to each feature point and the uncertainty corresponding to each feature point, the second state including a second velocity and/or a second position of the target object.
  • In the embodiments of the present application, the first state of the target object corresponding to each feature point is calculated from the state of that feature point, and the second state of the target object is then determined based on the first state of the target object corresponding to each feature point and the uncertainty corresponding to each feature point. This helps improve the accuracy of the determined second state of the target object and avoids the situation in the prior art where the state of the target object is determined only from the state of the geometric center or center of gravity of the point cloud cluster, so that the accuracy of the determined state drops when that geometric center or center of gravity is occluded.
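  • As an informal overview, and not the claimed implementation itself, the flow of the first aspect can be sketched as follows; the data layout, the field names, and the inverse-uncertainty weighting are assumptions made only for illustration.

```python
# Informal sketch of the claimed flow: feature points -> per-point first state
# and uncertainty -> confidence-weighted second state. Field names are assumed.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FeaturePoint:
    position: Tuple[float, float]   # (x, y) of the feature point
    velocity: Tuple[float, float]   # (vx, vy) of the feature point
    uncertainty: float              # collection error associated with this point

def perceive_target(points: List[FeaturePoint]):
    # First state of the target object per feature point (here taken directly from the point's own state).
    first_positions = [p.position for p in points]
    first_velocities = [p.velocity for p in points]
    # Confidence from uncertainty (assumed inverse weighting), then fuse into the second state.
    weights = [1.0 / (p.uncertainty + 1e-6) for p in points]
    total = sum(weights)
    second_position = tuple(sum(w * s[i] for w, s in zip(weights, first_positions)) / total for i in range(2))
    second_velocity = tuple(sum(w * s[i] for w, s in zip(weights, first_velocities)) / total for i in range(2))
    return second_position, second_velocity
```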
  • the plurality of feature points include a plurality of endpoints of the point cloud cluster, where the endpoints are also referred to as "interest points" and generally refer to the intersection of two adjacent edges in the point cloud cluster.
  • Using multiple endpoints of the point cloud cluster as the aforementioned multiple feature points helps simplify the process of determining the feature points.
  • Determining the uncertainty of each feature point in the plurality of feature points includes: determining the type of each edge connected to each endpoint in the plurality of endpoints, where the edge types include visible edges that are directly collected by the collection device and invisible edges that cannot be directly collected by the collection device; and determining the uncertainty of each endpoint based on the types of the two edges connected to that endpoint.
  • Determining the uncertainty of each endpoint in the plurality of endpoints based on the types of the two edges connected to that endpoint helps improve the accuracy of the determined uncertainty of the endpoint.
  • The plurality of endpoints include a first endpoint; a first edge connected to the first endpoint is a visible edge, and a second edge connected to the first endpoint is an invisible edge. In this case, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the collection device in the orientation direction of the target object.
  • Determining the uncertainty of the first endpoint based on the component of the detection uncertainty of the collection device in the orientation direction of the target object helps improve the accuracy of the determined uncertainty of the first endpoint.
  • The uncertainty d 1 of the first endpoint is determined by a formula in terms of R 1, C 0, θ 1, and the azimuth of the orientation of the target object, where R 1 represents the measured distance between the collection device and the first endpoint; C 0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; and θ 1 represents the coordinate azimuth at which the collection device collects the first endpoint.
  • The multiple endpoints include a second endpoint. If the two edges connected to the second endpoint are both visible edges, the measured distance between the second endpoint and the collection device is positively correlated with the uncertainty of the second endpoint.
  • Determining the uncertainty of the second endpoint based on the measured distance between the second endpoint and the collection device helps improve the accuracy of the determined uncertainty of the second endpoint.
  • The uncertainty of the second endpoint is determined by a formula in terms of the measured distance between the collection device and the second endpoint and C 1, where C 1 represents a preset uncertainty, in radians.
  • Determining the uncertainty of each endpoint in the plurality of endpoints based on the types of the two edges connected to that endpoint includes: if a first reference point is not occluded by other objects, determining the uncertainty of each endpoint based on the types of the two edges connected to that endpoint. The first reference point is a point at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are objects, other than the target object and the collection device, in the image in which the point cloud cluster is located.
  • In this case, the uncertainty of each of the multiple endpoints may be determined based on the types of the two edges connected to that endpoint, which helps improve the accuracy of the determined uncertainty of the endpoint.
  • The method further includes: if the first reference point is occluded by the other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • Determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint helps improve the accuracy of the uncertainty of the first endpoint.
  • Determining the uncertainty of the first endpoint includes: if the first reference point is occluded by the other objects, determining the uncertainty d 3 of the first endpoint by a formula in terms of the difference Δω between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, R 1, C 0, θ 1, and the azimuth of the orientation of the target object, where R 1 represents the measured distance between the collection device and the first endpoint; C 0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians; and θ 1 represents the coordinate azimuth at which the collection device collects the first endpoint.
  • When the first reference point is occluded by other objects, determining the uncertainty of the first endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, together with the component of the detection uncertainty of the collection device in the orientation direction of the target object, helps improve the accuracy of the uncertainty of the first endpoint.
  • The method further includes: if the first reference point is occluded by the other objects, determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • Determining the uncertainty of the second endpoint based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint helps improve the accuracy of the uncertainty of the second endpoint.
  • Determining the uncertainty of the second endpoint includes: if the first reference point is not occluded by the other objects, determining the uncertainty of the second endpoint based on the measured distance between the second endpoint and the collection device.
  • Determining the uncertainty of the second endpoint based on that measured distance helps improve the accuracy of the uncertainty of the second endpoint.
  • Determining the second state of the target object based on the first state of the target object corresponding to each feature point in the plurality of feature points and the uncertainty corresponding to each feature point includes: determining, based on the uncertainty corresponding to each feature point, a confidence level corresponding to that feature point; and determining the second state of the target object based on the first state of the target object corresponding to each feature point and the confidence level corresponding to each feature point.
  • the coordinate azimuth in the above can be understood as the angle between the line connecting the end point of the target object and the acquisition device and the x-axis in the coordinate system.
  • The azimuth angle of the orientation of the target object can be understood as the horizontal angle, measured clockwise, from the orientation of the target object to the x-axis of the coordinate system.
  • An apparatus for perceiving a target object is provided, including units for implementing the method in the first aspect or any possible implementation of the first aspect.
  • An apparatus for perceiving a target object is provided, which has the function of implementing the behavior in the method design of the above first aspect.
  • These functions may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more units corresponding to the above functions.
  • a computing device including an input-output interface, a processor, and a memory.
  • the processor is used to control the input and output interface to send and receive signals or information
  • the memory is used to store a computer program
  • the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
  • a computer program product comprising: computer program code, when the computer program code is run on a computer, causing the computer to perform the methods in the above aspects.
  • a computer-readable medium stores program codes, which, when executed on a computer, cause the computer to execute the methods in the above-mentioned aspects.
  • A seventh aspect provides a system-on-chip.
  • The system-on-chip includes a processor, configured to support a computing device in implementing the functions involved in the above aspects, for example, generating, receiving, sending, or processing the data and/or information involved in the above methods.
  • the chip system further includes a memory for storing necessary program instructions and data of the computing device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • a vehicle including an input-output interface, a processor and a memory.
  • the processor is used to control the input and output interface to send and receive signals or information
  • the memory is used to store a computer program
  • the processor is used to call and run the computer program from the memory, so that the computing device executes the method in the first aspect.
  • the above-mentioned vehicle may have an automatic driving function.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a method for sensing a target according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the acquisition device 500 in the coordinate system according to the embodiment of the present application.
  • FIG. 6 is a schematic diagram of a positional relationship between a target 400 and a collection device 500 in a coordinate system according to another embodiment of the present application.
  • FIG. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a device for sensing a target according to an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • The vehicle 100 can control itself while in the autonomous driving mode, and can determine the current state of the vehicle and its surrounding environment through human operation, determine a possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the likelihood that the other vehicle will perform the possible behavior, and control the vehicle 100 based on the determined information.
  • the vehicle 100 may be placed to operate without human interaction.
  • Vehicle 100 may include various subsystems, such as travel system 102 , sensor system 104 , control system 106 , one or more peripherals 108 and power supply 110 , computer system 112 , and user interface 116 .
  • vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
  • the travel system 102 may include components that provide powered motion for the vehicle 100 .
  • travel system 102 may include engine 118 , energy source 119 , transmission 120 , and wheels/tires 121 .
  • the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a gasoline engine and electric motor hybrid engine, an internal combustion engine and an air compression engine hybrid engine.
  • Engine 118 converts energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy to other systems of the vehicle 100 .
  • Transmission 120 may transmit mechanical power from engine 118 to wheels 121 .
  • Transmission 120 may include a gearbox, a differential, and a driveshaft.
  • transmission 120 may also include other devices, such as clutches.
  • the drive shaft may include one or more axles that may be coupled to one or more wheels 121 .
  • the sensor system 104 may include several sensors that sense information about the environment surrounding the vehicle 100 .
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS) system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, Radar 126 , laser rangefinder 128 and camera 130 .
  • the sensor system 104 may also include sensors of the internal systems of the vehicle 100 being monitored (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the autonomous vehicle 100 .
  • the positioning system 122 may be used to estimate the geographic location of the vehicle 100 .
  • the IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
  • IMU 124 may be a combination of an accelerometer and a gyroscope.
  • Radar 126 may utilize radio signals to sense objects within the surrounding environment of vehicle 100 .
  • the radar 126 may also be used to sense one or more of the target's speed, position, and heading.
  • the laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
  • Camera 130 may be used to capture multiple images of the surrounding environment of vehicle 100 .
  • Camera 130 may be a still camera or a video camera.
  • Control system 106 controls the operation of the vehicle 100 and its components.
  • Control system 106 may include various elements including steering system 132 , throttle 134 , braking unit 136 , computer vision system 140 , route control system 142 , and obstacle avoidance system 144 .
  • the steering system 132 is operable to adjust the heading of the vehicle 100 .
  • it may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100 .
  • the braking unit 136 is used to control the deceleration of the vehicle 100 .
  • the braking unit 136 may use friction to slow the wheels 121 .
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electrical current.
  • the braking unit 136 may also take other forms to slow the wheels 121 to control the speed of the vehicle 100 .
  • Computer vision system 140 may be operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment surrounding vehicle 100 .
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • Computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 142 is used to determine the travel route of the vehicle 100 .
  • route control system 142 may combine data from sensors, GPS 122, and one or more predetermined maps to determine a driving route for vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
  • control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
  • Peripherals 108 may include a wireless communication system 146 , an onboard computer 148 , a microphone 150 and/or a speaker 152 .
  • peripherals 108 provide a means for a user of vehicle 100 to interact with user interface 116 .
  • the onboard computer 148 may provide information to the user of the vehicle 100 .
  • User interface 116 may also operate on-board computer 148 to receive user input.
  • the onboard computer 148 can be operated via a touch screen.
  • peripheral devices 108 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 150 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speakers 152 may output audio to a user of vehicle 100 .
  • Wireless communication system 146 may wirelessly communicate with one or more devices, either directly or via a communication network.
  • The wireless communication system 146 may use 3G cellular communication, such as code division multiple access (CDMA) or Global System for Mobile Communications (GSM)/GPRS, fourth generation (4G) communication such as LTE, or fifth generation (5G) communication.
  • the wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi.
  • the wireless communication system 146 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communication between vehicles and/or roadside stations.
  • the power supply 110 may provide power to various components of the vehicle 100 .
  • the power source 110 may be a rechargeable lithium-ion or lead-acid battery.
  • One or more battery packs of such a battery may be configured as a power source to provide power to various components of the vehicle 100 .
  • power source 110 and energy source 119 may be implemented together, such as in some all-electric vehicles.
  • Computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as data memory 114 .
  • Computer system 112 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
  • the processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in the same block, one of ordinary skill in the art will understand that the processor, the computer, or the memory may actually include multiple processors, computers, or memories that may or may not be stored within the same physical housing.
  • the memory may be a hard drive or other storage medium located within an enclosure other than computer 110 .
  • reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel.
  • some components such as the steering and deceleration components may each have their own processor that only performs computations related to component-specific functions .
  • a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
  • the memory 114 may contain instructions 115 (eg, program logic) executable by the processor 113 to perform various functions of the vehicle 100 , including those described above.
  • Memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripherals 108.
  • memory 114 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
  • the above-mentioned processor 113 may also execute the planning scheme for the longitudinal motion parameters of the vehicle according to the embodiments of the present application, so as to help the vehicle to plan the longitudinal motion parameters.
  • For the specific planning method, reference may be made to the description of FIG. 3 below; details are not repeated here for brevity.
  • a user interface 116 for providing information to or receiving information from a user of the vehicle 100 .
  • user interface 116 may include one or more input/output devices within the set of peripheral devices 108 , such as wireless communication system 146 , onboard computer 148 , microphone 150 and speaker 152 .
  • Computer system 112 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 102 , sensor system 104 , and control system 106 ) and from user interface 116 .
  • computer system 112 may utilize input from control system 106 in order to control steering unit 132 to avoid obstacles detected by sensor system 104 and obstacle avoidance system 144 .
  • computer system 112 is operable to provide control of various aspects of vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • memory 114 may exist partially or completely separate from vehicle 100 .
  • the above-described components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation on the embodiment of the present invention.
  • a self-driving car traveling on a road can recognize objects within its surroundings to determine adjustments to the current speed.
  • the objects may be other vehicles, traffic control equipment, or other types of objects.
  • Each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
  • The autonomous vehicle 100, or a computing device associated with the autonomous vehicle 100 (e.g., the computer system 112, the computer vision system 140, or the memory 114 of FIG. 1), may predict the behavior of an identified object based on the characteristics of the identified object and the state of the surrounding environment (for example, traffic, rain, ice on the road, and the like).
  • each identified object is dependent on the behavior of the other, so it is also possible to predict the behavior of a single identified object by considering all identified objects together.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
  • the self-driving car can determine that the vehicle will need to adjust to a steady state (eg, accelerate, decelerate, or stop) based on the predicted behavior of the object.
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • The computing device may also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the self-driving car (for example, cars in adjacent lanes on the road).
  • the above-mentioned vehicle 100 can be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, construction equipment, a tram, a golf cart, a train, a cart, etc.
  • the embodiments of the invention are not particularly limited.
  • FIG. 2 is a schematic diagram of a suitable automatic driving system according to an embodiment of the present application.
  • the computer system 101 includes a processor 103 , and the processor 103 is coupled to a system bus 105 .
  • the processor 103 may be one or more processors, each of which may include one or more processor cores.
  • a video adapter 107 which can drive a display 109, is coupled to the system bus 105.
  • the system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111 .
  • I/O interface 115 is coupled to the I/O bus.
  • The I/O interface 115 communicates with a variety of I/O devices, such as an input device 117 (e.g., a keyboard, a mouse, or a touch screen), a media tray 121 (e.g., a CD-ROM or a multimedia interface), a transceiver 123 (which can transmit and/or receive radio communication signals), a camera 155 (which can capture static and dynamic digital video images), and an external USB interface 125.
  • the interface connected to the I/O interface 115 may be a USB interface.
  • the processor 103 may be any conventional processor, including a Reduced Instruction Set Computing (Reduced Instruction Set Computing, RISC) processor, a Complex Instruction Set Computing (Complex Instruction Set Computer, CISC) processor or a combination of the above.
  • the processor may be a special purpose device such as an application specific integrated circuit ASIC.
  • the processor 103 may be a neural network processor or a combination of a neural network processor and the above-mentioned conventional processors.
  • computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle.
  • some of the processes described herein are performed on a processor disposed within the autonomous vehicle, others are performed by a remote processor, including taking actions required to perform a single maneuver.
  • Network interface 129 is a hardware network interface, such as a network card.
  • the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (Virtual Private Network, VPN).
  • the network 127 may also be a wireless network, such as a Wi-Fi network, a cellular network, and the like.
  • the hard disk drive interface is coupled to the system bus 105 .
  • the hard drive interface is connected to the hard drive.
  • System memory 135 is coupled to system bus 105 . Data running in system memory 135 may include operating system 137 and application programs 143 of computer 101 .
  • the operating system includes a shell 139 and a kernel 141 .
  • Shell 139 is an interface between the user and the kernel of the operating system.
  • Shell 139 is the outermost layer of the operating system.
  • Shell 139 manages the interaction between the user and the operating system: waiting for user input, interpreting user input to the operating system, and processing various operating system outputs.
  • Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with hardware, the operating system kernel usually runs processes and provides inter-process communication, providing CPU time slice management, interrupts, memory management, IO management, and more.
  • Application 143 includes programs that control the autonomous driving of the car, for example, programs that manage the interaction of the autonomous car with obstacles on the road, programs that control the route or speed of the autonomous car, and programs that control the interaction of the autonomous car with other autonomous vehicles on the road. .
  • Application 143 also exists on the system of software deploying server 149 .
  • computer system 101 may download application 143 from software deploying server 149 when application 147 needs to be executed.
  • The above application programs may further include an application program corresponding to the target object perception scheme provided by the embodiments of the present application; the target object perception scheme of the embodiments of the present application is described in detail below and, for brevity, is not repeated here.
  • Sensors 153 are associated with computer system 101 .
  • the sensor 153 is used to detect the environment around the computer 101 .
  • The sensor 153 can detect objects, such as animals, cars, and obstacles, and can further detect the environment around those objects, for example, other animals around an animal, the weather conditions, and the brightness of the surrounding environment.
  • the sensors may be lidars, cameras, infrared sensors, chemical detectors, microphones, and the like.
  • In existing solutions, the point cloud data containing the target object is first processed to obtain a point cloud cluster representing the target object, the geometric center or center of gravity of the point cloud cluster is determined, and the position and velocity of the target object are then calculated from the position and velocity of that geometric center or center of gravity, so as to perceive the target object.
  • However, when the geometric center or the center of gravity of the point cloud cluster is occluded, the accuracy of the calculated position and velocity of the target object is reduced.
  • In view of this, the present application provides a method for perceiving a target object, in which a first state of the target object (for example, a first position and/or a first velocity) is determined for each of a plurality of feature points of the point cloud cluster.
  • The method of the embodiment of the present application is described below with reference to FIG. 3. It should be understood that the method shown in FIG. 3 may be executed by the automatic driving system shown in FIG. 2, or by the control system 106 in the vehicle 100; optionally, the second state of the target object may also be sent to the obstacle avoidance system 144 to plan the driving route of the vehicle 100 and the like.
  • FIG. 3 is a schematic flowchart of a method for sensing a target according to an embodiment of the present application.
  • the method shown in FIG. 3 includes steps 310 to 340 .
  • the point cloud cluster represents the target object, or in other words, the point cloud cluster is used to represent part or all of the contour or shape of the target object.
  • the above-mentioned multiple feature points may be contour points of a point cloud cluster, for example, multiple feature points are multiple endpoints of the point cloud cluster.
  • the above-mentioned multiple feature points may also include the geometric center, the center of gravity, etc. of the point cloud cluster, which is not limited in this embodiment of the present application.
  • the above-mentioned endpoints also known as "interest points", usually refer to the intersection of two adjacent edges in a point cloud cluster.
  • the above-mentioned multiple endpoints can be acquired by using the existing endpoint detection technology.
  • the above-mentioned point cloud cluster can be obtained based on an existing point cloud cluster acquisition scheme.
  • For example, a laser signal can be transmitted and received by the lidar sensor, and the time difference between transmission and reception can be used to determine the detection distance corresponding to each emission angle, so that a three-dimensional point cloud of the surrounding environment can be obtained through multi-layer scanning. The obtained point cloud is then converted by the lidar driver into the format required for target perception, and the point cloud data is continuously sent to the controller.
  • the controller can cluster the point cloud data, and filter out the clusters that do not meet the target characteristics according to the number of clustering points and sizes of the clusters, and the rest are the point cloud clusters corresponding to the target.
  • Commonly used clustering methods include density-based spatial clustering of applications with noise (DBSCAN), K-nearest neighbors (KNN), and the like.
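  • As an illustration, the clustering step might be sketched as follows using the scikit-learn DBSCAN implementation; the eps/min_samples values and the size filter are assumptions made for the example, not parameters specified in this application.

```python
# Illustrative sketch: cluster lidar points and keep clusters that could be targets.
# The eps/min_samples values and the size filter are assumed for illustration only.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_point_cloud(points: np.ndarray, eps: float = 0.5, min_samples: int = 10):
    """points: (N, 2) or (N, 3) array of lidar returns in the vehicle frame."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    clusters = []
    for label in set(labels) - {-1}:                  # label -1 marks noise points
        cluster = points[labels == label]
        extent = cluster.max(axis=0) - cluster.min(axis=0)
        # Discard clusters whose point count or footprint cannot match a vehicle-like target.
        if len(cluster) >= 20 and np.all(extent[:2] < 10.0):
            clusters.append(cluster)
    return clusters
```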
  • the controller can also use the L-Shape feature extraction algorithm or the trained neural network model to extract the orientation and shape of the point cloud cluster.
  • the orientation of the point cloud cluster is determined by the trajectory or the historical movement direction, and then the shape is calculated by traversing the points in the point cloud cluster according to the orientation.
  • FIG. 4 is a schematic diagram of a point cloud cluster corresponding to a target in an embodiment of the present application.
  • the outline of the point cloud cluster corresponding to the target 400 is a rectangle with a length of l and a width of w.
  • The rectangle includes four endpoints, namely endpoint 0, endpoint 1, endpoint 2, and endpoint 3.
  • The coordinates of the geometric center of the rectangle are (x, y), and its coordinate azimuth is the angle between the orientation of the point cloud cluster and the x-axis of the coordinate system whose origin is the geometric center of the lidar; the coordinates of endpoint 0, endpoint 1, endpoint 2, and endpoint 3 are obtained from the geometric center, the length l, the width w, and this coordinate azimuth.
  • the length and width of the above-mentioned point cloud clusters can be obtained from the statistics of multi-frame point cloud images.
  • The length and width of the above point cloud cluster may also be determined based on the collected positions of the above endpoints, which is not limited in this embodiment of the present application.
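  • For illustration, the four endpoint coordinates of such an oriented rectangle can be recovered from the geometric center (x, y), the length l, the width w, and the orientation azimuth; the corner ordering below is an assumption, since the exact coordinate expressions appear only as figures in the original filing.

```python
# Illustrative sketch: corners of an oriented rectangle from center, size, and azimuth.
# The corner numbering (0..3) is assumed; the original formulas are only shown as figures.
import math

def rectangle_endpoints(x: float, y: float, l: float, w: float, azimuth: float):
    """Return four (x, y) corners of a rectangle of length l and width w,
    centered at (x, y) and rotated by `azimuth` radians about its center."""
    half_offsets = [(+l / 2, +w / 2), (+l / 2, -w / 2), (-l / 2, -w / 2), (-l / 2, +w / 2)]
    cos_a, sin_a = math.cos(azimuth), math.sin(azimuth)
    return [(x + dx * cos_a - dy * sin_a, y + dx * sin_a + dy * cos_a) for dx, dy in half_offsets]
```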
  • When the collection device collects the target object, the inherent error of the collection device itself and whether the position of a feature point in the point cloud cluster can be directly observed by the collection device both affect the accuracy of the collected feature point positions. Therefore, in this application, the uncertainty of an endpoint can be set based on whether the edges connected to that endpoint can be directly observed by the collection device.
  • The above step 320 includes: determining the type of each edge connected to each of the multiple endpoints, where the edge types include visible edges that are directly observed by the collection device and invisible edges that cannot be directly observed by the collection device; and determining the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint.
  • the above-mentioned multiple endpoints can usually be divided into the following three types.
  • Endpoint of the first type: one of the two edges connected to an endpoint of this type is a visible edge, and the other is an invisible edge.
  • Endpoint of the second type: both edges connected to an endpoint of this type are visible edges, for example, endpoint 1 shown in FIG. 4.
  • Endpoint of the third type: both edges connected to an endpoint of this type are invisible edges, for example, endpoint 3 shown in FIG. 4.
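  • As an informal illustration, the three endpoint types can be derived from per-edge visibility flags; the representation below (two boolean flags per endpoint) is an assumption made only for the example.

```python
# Illustrative sketch: classify an endpoint by the visibility of its two adjacent edges.
# The (visible_a, visible_b) flags per endpoint are an assumed input representation.
from enum import Enum

class EndpointType(Enum):
    FIRST = 1   # one visible edge, one invisible edge
    SECOND = 2  # both adjacent edges visible
    THIRD = 3   # both adjacent edges invisible

def classify_endpoint(visible_a: bool, visible_b: bool) -> EndpointType:
    if visible_a and visible_b:
        return EndpointType.SECOND
    if visible_a or visible_b:
        return EndpointType.FIRST
    return EndpointType.THIRD
```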
  • the methods for calculating uncertainty based on the three different types of endpoints are described below.
  • the first endpoint belongs to the first type of endpoint
  • the second endpoint belongs to the second type of endpoint
  • the third endpoint belongs to the third type of endpoint.
  • Since the type of the first edge connected to the first endpoint is a visible edge and the type of the second edge connected to the first endpoint is an invisible edge, the actual position of the first endpoint is usually located on the extension line of the visible edge (that is, the first edge). Therefore, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the collection device in the orientation direction of the target object.
  • Determining the uncertainty of the first endpoint based on the component of the detection uncertainty of the collection device in the orientation direction of the target object can be understood as considering, in the process of determining that uncertainty, both this component and the influence of other factors on the uncertainty of the first endpoint. It can also be understood as directly using the component of the detection uncertainty of the collection device in the orientation direction of the target object as the uncertainty of the first endpoint; this is not limited in this embodiment of the present application.
  • For example, the inherent uncertainty of the collection device may be projected onto the orientation direction of the target object, and the uncertainty obtained after the projection may be used as the uncertainty of the first endpoint. That is, the uncertainty d 1 of the first endpoint is obtained from a formula in terms of R 1, C 0, θ 1, and the azimuth of the orientation of the target object, where R 1 represents the measured distance between the collection device and the first endpoint; θ 1 represents the coordinate azimuth at which the collection device collects the first endpoint; and C 0 is a preset value, negatively correlated with the collection accuracy of the collection device, in radians (rad).
  • the coordinate azimuth angle in the above can be understood as the angle between the connection line between the first end point of the target object and the acquisition device and the x-axis in the coordinate system.
  • The azimuth angle of the orientation of the target object can be understood as the horizontal angle, measured clockwise, from the orientation of the target object to the x-axis of the coordinate system.
  • C 0 represents the uncertainty of the laser scanning and can be set according to the scanning resolution of the laser, to which it is proportional. For example, when the scanning resolution of the laser is 0.2°, C 0 can be set to 0.01.
  • Further, the uncertainty d 1 of the first endpoint can be projected onto the x-axis and the y-axis to obtain the uncertainty D 1x of the first endpoint on the x-axis and the uncertainty D 1y of the first endpoint on the y-axis, where D x0 represents the initial uncertainty in the x-axis direction and D y0 represents the initial uncertainty in the y-axis direction.
  • D x0 and D y0 are related to the first uncertainty C 0 and/or the scanning accuracy of the acquisition device.
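  • A minimal numerical sketch of this first-type-endpoint computation is given below. The exact expressions appear only as figures in the original publication, so the forms used here (the arc-length uncertainty R 1 · C 0 projected onto the orientation direction and then onto the axes, plus the initial uncertainties) are assumptions made for illustration.

```python
# Illustrative sketch only: the published formulas are figures, so the exact
# trigonometric form below (projection of R1*C0 onto the target orientation,
# then onto the x/y axes) is an assumption, not the patented expression.
import math

def first_endpoint_uncertainty(r1: float, c0: float, theta1: float, phi: float,
                               dx0: float = 0.0, dy0: float = 0.0):
    """r1: measured distance to the first endpoint (m);
    c0: preset scanning uncertainty (rad);
    theta1: coordinate azimuth of the endpoint as seen by the sensor (rad);
    phi: azimuth of the target object's orientation (rad)."""
    d1 = r1 * c0 * abs(math.cos(theta1 - phi))   # component along the orientation (assumed form)
    d1x = dx0 + d1 * abs(math.cos(phi))          # projection onto the x-axis (assumed form)
    d1y = dy0 + d1 * abs(math.sin(phi))          # projection onto the y-axis (assumed form)
    return d1, d1x, d1y
```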
  • FIG. 5 is a schematic diagram of the positional relationship between the target object 400 and the acquisition device 500 in the coordinate system according to the embodiment of the present application.
  • In FIG. 5, the positive directions of the x-axis and the y-axis are as shown in the figure, with the counterclockwise direction taken as positive; the relevant azimuth angle is θ 1′, and the inherent uncertainty of the collection device 500 is C 0′ (rad).
  • The measured distance between the collection device 500 and endpoint 0 is R 1′. Since the visible edge connected to endpoint 0 is parallel to the orientation of the target 400, the azimuth of the orientation of the target 400 is equal to the angle between that visible edge and the x-axis.
  • Accordingly, the uncertainty d 1′ of endpoint 0 and its components on the x-axis and the y-axis are obtained as described above, where D x0′ represents the initial uncertainty in the x-axis direction and D y0′ represents the initial uncertainty in the y-axis direction.
  • Endpoint 2 in FIG. 5 also belongs to the first type of endpoint, and the above calculation method for the uncertainty of the first endpoint can be used for it; details are not repeated here for brevity.
  • For the second endpoint, the factor affecting its uncertainty is usually the measured distance between the second endpoint and the collection device, which is positively correlated with the uncertainty of the second endpoint. Therefore, the uncertainty of the second endpoint can be determined based on the measured distance between the second endpoint and the collection device in the coordinate system.
  • The above C 1 may be set according to the horizontal opening angle used to observe the target object, to which it is proportional. For example, if the horizontal opening angle for observing the target object is 10°, C 1 can be set to 0.17.
  • the above-mentioned second uncertainty may also be the same as the first uncertainty, which is not limited in this embodiment of the present application.
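  • The second-type-endpoint computation might be sketched as follows. Because the published expression is again shown only as a figure, the assumed form d 2 = R 2 · C 1, which simply scales the preset angular uncertainty by the measured range, is an illustration rather than the patented formula.

```python
# Illustrative sketch only: the assumed form d2 = R2 * C1 scales the preset
# angular uncertainty C1 (rad) by the measured distance R2 (m).
def second_endpoint_uncertainty(r2: float, c1: float) -> float:
    """r2: measured distance between the collection device and the second endpoint (m);
    c1: preset uncertainty, roughly the horizontal opening angle in radians."""
    return r2 * c1
```

  • Under this assumed form, with C 1 = 0.17 as in the example above and a measured distance of 20 m, the uncertainty of a second-type endpoint would be about 3.4 m.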
  • FIG. 6 is a schematic diagram of the positional relationship between the target 400 and the acquisition device 500 in the coordinate system according to another embodiment of the present application.
  • In FIG. 6, the relevant azimuth angle is θ 2′, and the inherent uncertainty of the collection device 500 is C 1′ (rad).
  • For the third endpoint, whose two connected edges are both invisible, the uncertainty can be set larger than the uncertainty of the first endpoint and the uncertainty of the second endpoint; for example, it can be set to infinity.
  • Correspondingly, the components of the uncertainty of the third endpoint in the x and y directions of the coordinate system can also be set larger than the corresponding components of the uncertainties of the first endpoint and the second endpoint in the x and y directions of the coordinate system.
  • In some cases, the point cloud cluster of the target object can only represent part of the shape or outline of the target object: the obtained endpoint may not be the actual endpoint of the target object, and the actual endpoint position of the target object may be occluded by other objects. In this case, in order to improve the accuracy of determining the uncertainty of the endpoint, the present application also provides a method for calculating the uncertainty of the endpoint, which is described below with reference to FIG. 7. It should be understood that, when the target object is occluded by other objects, the uncertainty can also be calculated directly according to the calculation methods for the uncertainties of the first endpoint, the second endpoint, and the third endpoint introduced above; this is not limited in this application.
  • Whether other objects occlude the space between the target object and the collection device can be determined by generating an environment map.
  • Specifically, the surroundings are scanned by the collection device to obtain an environment map containing the target object, and the environment map is segmented according to a preset angle (for example, the azimuth angle of the collection device); the features of each segmented space include the corresponding azimuth angle, the measured distance of the object closest to the collection device at that azimuth angle, and the number of that object.
  • The measured distance of the nearest object at each azimuth angle and the number of that object can be obtained as follows: calculate the minimum circumscribed convex polygon of each object according to its point cloud cluster in the environment map, traverse the circumscribed convex polygons of all objects in the environment map, and obtain, for each azimuth angle in the environment map, the measured distance of the object closest to the collection device and the number of that closest object.
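  • A rough sketch of building such an azimuth-binned environment map is shown below; the bin width and the approximation of each convex polygon by its hull vertices are assumptions made for the example (the hulls here come from scipy.spatial.ConvexHull).

```python
# Illustrative sketch: build an azimuth-binned "nearest object" map from point cloud clusters.
# The bin width and the vertex-sampling approximation of each convex hull are assumptions.
import math
import numpy as np
from scipy.spatial import ConvexHull

def build_environment_map(clusters, bin_width_deg: float = 1.0):
    """clusters: list of (N_i, 2) arrays, one per object, in the sensor frame.
    Returns (min_range, object_id) arrays indexed by azimuth bin."""
    n_bins = int(round(360.0 / bin_width_deg))
    min_range = np.full(n_bins, np.inf)
    object_id = np.full(n_bins, -1, dtype=int)
    for obj_id, pts in enumerate(clusters):
        hull_pts = pts[ConvexHull(pts).vertices]      # minimum circumscribed convex polygon
        for x, y in hull_pts:                         # approximate the polygon by its vertices
            rng = math.hypot(x, y)
            az = math.degrees(math.atan2(y, x)) % 360.0
            b = int(az // bin_width_deg)
            if rng < min_range[b]:
                min_range[b], object_id[b] = rng, obj_id
    return min_range, object_id
```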
  • Then, the reference point corresponding to each endpoint of the current target object is determined based on the historical data of the target object and marked in the above environment map; whether the reference point corresponding to an endpoint of the target object is occluded is then determined by combining, for the corresponding azimuth angle in the environment map, the measured distance of the object closest to the collection device and the number of that closest object.
  • If the measured distance from the collection device to the reference point corresponding to an endpoint is equal to the measured distance of the object closest to the collection device at the azimuth angle corresponding to that endpoint, the space between that reference point and the collection device is not occluded by other objects.
  • If the measured distance from the collection device to the reference point corresponding to an endpoint is greater than the measured distance of the object closest to the collection device at the azimuth angle corresponding to that endpoint, the space between that reference point and the collection device is occluded by other objects.
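  • Given the map above, this occlusion test can be sketched as follows; the small tolerance used to compare the two ranges is an assumption added only for numerical robustness.

```python
# Illustrative sketch: decide whether a reference point is occluded, using the
# azimuth-binned map built above. The comparison tolerance is an assumption.
import math

def reference_point_occluded(ref_x: float, ref_y: float, min_range,
                             bin_width_deg: float = 1.0, tol: float = 0.1) -> bool:
    s = math.hypot(ref_x, ref_y)                      # measured distance S to the reference point
    az = math.degrees(math.atan2(ref_y, ref_x)) % 360.0
    s_min = min_range[int(az // bin_width_deg)]       # nearest object range S_min in that azimuth bin
    return s > s_min + tol                            # S_min < S means something blocks the view
```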
  • The historical data of the target object may be characteristics of the target object obtained in scans performed before the current point cloud cluster of the target object is obtained, for example, parameters such as the length and width of the target object, or the coordinates of each endpoint of the target object.
  • FIG. 7 is a schematic diagram of an environment map according to an embodiment of the present application.
  • As shown in FIG. 7, the collection device 500 scans the surroundings to acquire an environment map including the target 400, and the environment map is segmented according to a preset angle (for example, the azimuth angle of the collection device); the features of each segmented space include the azimuth angle corresponding to that space, the measured distance of the object closest to the collection device at that azimuth angle, and the number of that object.
  • The measured distance of the nearest object at each azimuth angle and the number of that object can be obtained as follows: calculate the minimum circumscribed convex polygon of each object according to the point cloud clusters of the object 710 and the target object 400 in the environment map, traverse the minimum circumscribed convex polygons of all objects in the environment map (that is, of the object 710 and the target object 400), and obtain, for each azimuth angle in the environment map, the measured distance of the object closest to the collection device and the number of that closest object.
  • Next, the position of reference point 1 corresponding to endpoint 0 of the current target object is determined based on the historical data of the target object, reference point 1 is marked in the above environment map, the measured distance S between the collection device and reference point 1 is determined, and the measured distance S min of the object closest to the collection device in the segmented space corresponding to azimuth angle 1 of endpoint 0, together with the number of that closest object, is determined. Referring to FIG. 7, S min < S, so the object 710 occludes the space between reference point 1 and the collection device.
  • the uncertainty of the endpoints can be calculated according to the types of endpoints above.
  • For the first endpoint (an endpoint of the first type), if the first reference point is occluded by other objects, the uncertainty of the first endpoint is determined based on the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • the above-mentioned horizontal opening angle corresponding to the first end point can be understood as the horizontal opening angle used by the collecting device to collect the entire target object when the first end point is used as the end point of the target object.
  • the horizontal opening angle corresponding to the above-mentioned first reference point can be understood as the horizontal opening angle used by the collecting device to collect the entire target object when the first reference point is used as the end point of the target object.
  • Taking FIG. 7 as an example, the horizontal opening angle corresponding to endpoint 0 is θ0, and the horizontal opening angle corresponding to reference point 1 is θ1; the degree of change between them is the difference Δθ = θ1 − θ0.
  • The uncertainty d3 of the first endpoint can be projected onto the x-axis and the y-axis, that is, into an uncertainty D_3x of the first endpoint in the x-axis direction and an uncertainty D_3y of the first endpoint in the y-axis direction, where D_x0 represents the initial uncertainty in the x-axis direction and D_y0 represents the initial uncertainty in the y-axis direction.
  • Since the position of the first endpoint affects the position of the second endpoint, if the first reference point is blocked by other objects, the determination of the position of the second endpoint is also affected to a certain extent. Therefore, the uncertainty of the second endpoint is likewise positively correlated with the degree of change between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint.
  • The uncertainty d4 of the second endpoint is determined by the formula d4 = D0 + R2 × (C1 + Δθ), where R2 represents the measured distance between the acquisition device and the second endpoint; C1 represents a preset uncertainty, in radians (rad); and D0 represents the preset initial uncertainty of the second endpoint.
  • Similarly, the uncertainty d4 of the second endpoint can be projected onto the x-axis and the y-axis, that is, into an uncertainty D_4x of the second endpoint in the x-axis direction and an uncertainty D_4y of the second endpoint in the y-axis direction, where D_x0 represents the initial uncertainty in the x-axis direction and D_y0 represents the initial uncertainty in the y-axis direction.
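  • Reading the relation for d4 literally, a minimal sketch of the second-endpoint uncertainty and of one possible axis projection is given below; the cosine/sine projection and the default constants are assumptions made only for illustration, since the projection expressions themselves are not reproduced in the text above.

    import math

    def second_endpoint_uncertainty(R2, delta_theta, C1=0.01, D0=0.05):
        """d4 = D0 + R2 * (C1 + delta_theta), with C1 and delta_theta in radians."""
        return D0 + R2 * (C1 + delta_theta)

    def project_uncertainty(d, azimuth, Dx0=0.0, Dy0=0.0):
        """Assumed projection of an endpoint uncertainty d onto the x and y axes.

        azimuth is the coordinate azimuth of the endpoint as seen from the
        acquisition device; Dx0 / Dy0 are the preset initial uncertainties.
        """
        return Dx0 + d * abs(math.cos(azimuth)), Dy0 + d * abs(math.sin(azimuth))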
  • The first state of the target object corresponding to each feature point in the multiple feature points is calculated based on the state of each feature point. The state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target object.
  • the above-mentioned first state of the target can be understood as the position or velocity of the geometric center of the target.
  • The above step 340 includes: determining, based on the uncertainty corresponding to each feature point in the plurality of feature points, the confidence level corresponding to each feature point; and determining the second state of the target object based on the first state of the target object corresponding to each feature point and the confidence level corresponding to each feature point.
  • Determining the confidence level corresponding to each feature point in the plurality of feature points based on the uncertainty corresponding to each feature point includes computing, for the k-th feature point, a confidence level M_k from the uncertainty d_k and the change Δ_k between the historical state and the first state of that feature point. The relative weights of d_k and Δ_k in calculating the confidence level M_k are usually adjusted by setting the values of C3 and C4.
  • the values of C 3 and C 4 can be set to 0.5.
  • The above confidence level may be divided into a confidence level in the x-axis direction and a confidence level in the y-axis direction, that is, a confidence level M_kx of the k-th feature point in the x-axis direction and a confidence level M_ky of the k-th feature point in the y-axis direction, where d_kx represents the uncertainty of the k-th feature point in the x-axis direction; Δ_kx represents the change between the historical state and the first state of the k-th feature point in the x-axis direction; d_ky represents the uncertainty of the k-th feature point in the y-axis direction; and Δ_ky represents the change between the historical state and the first state of the k-th feature point in the y-axis direction.
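  • Because the expressions for M_kx and M_ky are not reproduced above, the sketch below substitutes one plausible monotone combination of d_k and Δ_k in which a larger uncertainty or a larger state change lowers the confidence and C3, C4 weight the two terms; the exact functional form used in the embodiment may differ.

    def feature_confidence(d_k, delta_k, C3=0.5, C4=0.5):
        """Assumed confidence of the k-th feature point: grows as the uncertainty d_k
        and the state change delta_k shrink."""
        return 1.0 / (1.0 + C3 * d_k + C4 * delta_k)

    def feature_confidence_xy(d_kx, delta_kx, d_ky, delta_ky, C3=0.5, C4=0.5):
        """Per-axis confidences M_kx and M_ky, using the same assumed form."""
        return (feature_confidence(d_kx, delta_kx, C3, C4),
                feature_confidence(d_ky, delta_ky, C3, C4))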
  • Each feature point of the current round (also called an "observed feature point") can be associated with the historically calculated feature points (also called "tracked feature points") to update the state of each feature point of the target. Specifically, according to information such as the position and orientation of each feature point, each observed feature point is associated with a tracked feature point, and the state of the tracked feature point is updated accordingly to obtain the updated state of each feature point.
  • The state of each feature point can be updated based on Kalman filtering, extended Kalman filtering, and the like, or the state of each feature point can be updated based on the maximum a posteriori probability calculated through Bayesian reasoning, which is not limited in this embodiment of the present application.
  • the state of each feature point may not be updated, and the state of the feature point observed in the current round may be directly determined as the state of the target.
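  • For the association-and-update step, a per-axis Kalman filter is one of the options mentioned above; the sketch below shows only that variant, with an illustrative constant-position prediction model and made-up noise parameters.

    def kalman_update_1d(x_pred, p_pred, z, r_meas):
        """One scalar Kalman update: fuse the tracked (predicted) state x_pred with
        variance p_pred and the observed feature-point state z with variance r_meas."""
        k_gain = p_pred / (p_pred + r_meas)
        return x_pred + k_gain * (z - x_pred), (1.0 - k_gain) * p_pred

    def update_tracked_feature(tracked, observed, dt, q=0.1):
        """tracked / observed: dicts holding per-axis 'pos' and 'var' (illustrative layout).
        Predict with a constant-position model, then correct with the observation."""
        updated = {}
        for axis in ("x", "y"):
            p_pred = tracked["var"][axis] + q * dt   # inflate by process noise
            updated[axis] = kalman_update_1d(tracked["pos"][axis], p_pred,
                                             observed["pos"][axis],
                                             observed["var"][axis])
        return updated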
  • The method for sensing a target object of the embodiments of the present application is described above with reference to FIGS. 1 to 7, and the apparatus of the embodiments of the present application is described below with reference to FIGS. 8 to 9. It should be noted that the apparatuses shown in FIGS. 8 to 9 can implement each step of the above method, which is not repeated here for brevity.
  • FIG. 8 is a schematic diagram of a device for sensing a target according to an embodiment of the present application.
  • the apparatus 800 shown in FIG. 8 includes: an obtaining unit 810 and a processing unit 820 .
  • the foregoing apparatus 800 may be the apparatus for running the automatic driving system in FIG. 1
  • the foregoing apparatus 800 may also be the apparatus for running the control system shown in FIG. 2 , which is not specifically limited in this embodiment of the present application.
  • the above obtaining unit 810 is configured to obtain a plurality of feature points of a point cloud cluster, and the point cloud cluster represents a target object.
  • the above-mentioned processing unit 820 is configured to determine the uncertainty of each feature point in the plurality of feature points, wherein the uncertainty is used to indicate the error generated when the position of each feature point in the point cloud cluster is collected by the collection device.
  • The above-mentioned processing unit 820 is further configured to obtain the first state of the target object corresponding to each feature point based on the state of each feature point in the plurality of feature points, where the state of each feature point includes the position and/or velocity of that feature point, and the first state includes a first velocity and/or a first position of the target object.
  • The above processing unit 820 is further configured to determine the second state of the target based on the first state of the target corresponding to each feature point and the uncertainty corresponding to each feature point, where the second state includes a second velocity and/or a second position of the target.
  • the plurality of feature points include a plurality of endpoints of the point cloud cluster.
  • The processing unit 820 is further configured to: determine the type of each edge connected to each of the multiple endpoints, where the type of an edge includes a visible edge that can be directly collected by the collection device and an invisible edge that cannot be directly collected by the collection device; and determine the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint.
  • The plurality of endpoints includes a first endpoint; if the type of the first edge connected to the first endpoint is a visible edge and the type of the second edge connected to the first endpoint is an invisible edge, the uncertainty of the first endpoint is determined based on the component of the detection uncertainty of the acquisition device in the orientation direction of the target.
  • The uncertainty d1 of the first endpoint is determined by a preset formula in terms of R1, C0, θ1 and the azimuth of the orientation of the target, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value in radians that is negatively correlated with the acquisition accuracy of the acquisition device; and θ1 represents the coordinate azimuth at which the acquisition device collects the first endpoint.
  • The multiple endpoints include a second endpoint; if the types of the two edges connected to the second endpoint are both visible edges, the uncertainty of the second endpoint is positively correlated with the measured distance between the second endpoint and the collection device (both endpoint types are illustrated in the sketch below).
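  • The exact expressions for d1 and d2 are not reproduced in the text above, so the sketch below only encodes the stated relationships: d1 is built from the component of the angular detection uncertainty along the target's orientation, and d2 grows with the measured distance; the concrete formulas and the default C0 are illustrative assumptions.

    import math

    def first_type_endpoint_uncertainty(R1, theta1, phi, C0=0.003):
        """Endpoint joining a visible edge and an invisible edge.

        Assumed reading: the angular detection uncertainty C0 (rad) produces a
        tangential position error of roughly R1 * C0, and d1 is its component
        along the target's orientation azimuth phi."""
        return R1 * C0 * abs(math.sin(theta1 - phi))

    def second_type_endpoint_uncertainty(R2, C0=0.003):
        """Endpoint joining two visible edges: positively correlated with the
        measured distance (simplest assumed form)."""
        return R2 * C0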
  • The processing unit 820 is further configured to: if the first reference point is not blocked by other objects, determine the uncertainty of each of the multiple endpoints based on the types of the two edges connected to that endpoint, where the first reference point is a point located at a preset distance from the first endpoint in the direction of the extension line of the first edge, and the other objects are the objects in the image where the point cloud cluster is located other than the target object and the acquisition device (the reference point construction is sketched below).
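  • The first reference point itself follows directly from the geometry described in this paragraph; the helper below is a sketch under the assumption that the endpoint and the other vertex of the first edge are given as 2-D coordinates in the device frame.

    import math

    def first_reference_point(first_endpoint, other_vertex_of_first_edge, preset_distance):
        """Point lying preset_distance beyond the first endpoint, on the extension
        line of the first (visible) edge."""
        ex = first_endpoint[0] - other_vertex_of_first_edge[0]
        ey = first_endpoint[1] - other_vertex_of_first_edge[1]
        norm = math.hypot(ex, ey)
        if norm == 0.0:
            raise ValueError("degenerate first edge")
        return (first_endpoint[0] + preset_distance * ex / norm,
                first_endpoint[1] + preset_distance * ey / norm)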
  • the processing unit 820 is further configured to: if the first reference point is blocked by other objects, based on the degree of change of the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, Determine the uncertainty of the first endpoint.
  • The processing unit 820 is further configured to: if the first reference point is blocked by other objects, determine the uncertainty d3 of the first endpoint by a preset formula based on the difference Δθ between the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, where R1 represents the measured distance between the acquisition device and the first endpoint; C0 is a preset value in radians that is negatively correlated with the acquisition accuracy of the acquisition device; θ1 represents the coordinate azimuth at which the acquisition device collects the first endpoint; and the formula further depends on the azimuth of the orientation of the target.
  • the processing unit 820 is further configured to: if the first reference point is blocked by other objects, based on the degree of change of the horizontal opening angle corresponding to the first reference point and the horizontal opening angle corresponding to the first endpoint, Determine the uncertainty of the second endpoint.
  • The processing unit 820 is further configured to: determine, based on the uncertainty corresponding to each of the multiple feature points, the confidence level corresponding to each of the multiple feature points; and determine the second state of the target based on the first state of the target object corresponding to each of the multiple feature points and the confidence level corresponding to each of the multiple feature points.
  • In the confidence level calculation, k indexes the feature points among the total number of feature points; d_k represents the uncertainty of the k-th feature point; Δ_k represents the change between the historical state of the k-th feature point and the first state; and C3 and C4 are preset values. A confidence-weighted combination of the first states is sketched below.
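  • Putting the confidences to use, the second state can be formed as a confidence-weighted combination of the per-feature-point first states; the normalized weighted average below is a sketch of one such combination, not necessarily the exact rule of the embodiment.

    def fuse_second_state(first_states, confidences):
        """first_states: one (x, y) position or (vx, vy) velocity tuple per feature point;
        confidences: the corresponding M_k values."""
        total = sum(confidences)
        if total == 0.0:
            raise ValueError("all confidences are zero")
        sx = sum(m * s[0] for m, s in zip(confidences, first_states)) / total
        sy = sum(m * s[1] for m, s in zip(confidences, first_states)) / total
        return sx, sy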
  • The processing unit 820 may be a processor 920, the obtaining unit 810 may be a communication interface 930, and the apparatus may further include a memory 910, as shown in FIG. 9.
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • the computing device 900 shown in FIG. 9 may include: a memory 910 , a processor 920 , and a communication interface 930 .
  • The memory 910, the processor 920, and the communication interface 930 are connected through an internal connection path; the memory 910 is used to store instructions, and the processor 920 is used to execute the instructions stored in the memory 910 to control the communication interface 930 to receive/send information or data.
  • the memory 910 may be coupled with the processor 920 through an interface, or may be integrated with the processor 920 .
  • The above-mentioned communication interface 930 uses a transceiver apparatus, such as but not limited to an input/output interface, to implement communication between the computing device 900 and other devices.
  • each step of the above-mentioned method may be completed by an integrated logic circuit of hardware in the processor 920 or an instruction in the form of software.
  • the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory 910, and the processor 920 reads the information in the memory 910, and completes the steps of the above method in combination with its hardware. To avoid repetition, detailed description is omitted here.
  • The processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor.
  • a portion of the processor may also include non-volatile random access memory.
  • the processor may also store device type information.
  • The size of the sequence numbers of the above-mentioned processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • The division of the units is only a logical function division, and there may be other division manners in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • In essence, the technical solutions of the present application, or the part that contributes to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or another medium that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present application relates to a target object detection method and apparatus, making it possible to improve the accuracy of calculations of the position or velocity of the target object. The present solution relates to the technical field of autonomous driving. The method comprises the steps of: obtaining a plurality of feature points of a point cloud cluster, the point cloud cluster representing the target object; determining the degree of uncertainty of each feature point among the plurality of feature points, the degree of uncertainty being used to indicate the error generated when the position of each feature point in the point cloud cluster is collected by means of a collection device; on the basis of the state of each feature point among the plurality of feature points, obtaining a first state of the target object corresponding to each feature point among the plurality of feature points; and, on the basis of the first state of the target object corresponding to each feature point among the plurality of feature points and the corresponding degree of uncertainty of each feature point among the plurality of feature points, determining a second state of the target object, the state comprising velocity and/or position.
PCT/CN2021/106261 2020-07-31 2021-07-14 Procédé et appareil de détection d'objet cible WO2022022284A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010755668.2 2020-07-31
CN202010755668.2A CN114092898A (zh) 2020-07-31 2020-07-31 目标物的感知方法及装置

Publications (1)

Publication Number Publication Date
WO2022022284A1 true WO2022022284A1 (fr) 2022-02-03

Family

ID=80037525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/106261 WO2022022284A1 (fr) 2020-07-31 2021-07-14 Procédé et appareil de détection d'objet cible

Country Status (2)

Country Link
CN (1) CN114092898A (fr)
WO (1) WO2022022284A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114577215B (zh) * 2022-03-10 2023-10-27 山东新一代信息产业技术研究院有限公司 一种移动机器人的特征地图更新方法、设备及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810475A (zh) * 2014-02-19 2014-05-21 百度在线网络技术(北京)有限公司 一种目标物识别方法及装置
US20180341021A1 (en) * 2017-05-24 2018-11-29 Jena-Optronik Gmbh Method For Detecting And Autonomously Tracking A Target Object Using A LIDAR Sensor
CN109831736A (zh) * 2017-11-23 2019-05-31 腾讯科技(深圳)有限公司 一种数据处理方法、装置、服务器及客户端
CN111060024A (zh) * 2018-09-05 2020-04-24 天目爱视(北京)科技有限公司 旋转中心轴与图像采集装置相交的3d测量及获取装置
CN111199579A (zh) * 2020-01-02 2020-05-26 腾讯科技(深圳)有限公司 一种目标物的三维模型构建方法、装置、设备及介质


Also Published As

Publication number Publication date
CN114092898A (zh) 2022-02-25

Similar Documents

Publication Publication Date Title
WO2022001773A1 (fr) Procédé et appareil de prédiction de trajectoire
CN112639883B (zh) 一种相对位姿标定方法及相关装置
CN112543877B (zh) 定位方法和定位装置
CN113792566A (zh) 一种激光点云的处理方法及相关设备
CN113498529B (zh) 一种目标跟踪方法及其装置
WO2022001366A1 (fr) Procédé et appareil de détection de ligne de voie
WO2022156309A1 (fr) Procédé et appareil de prédiction de trajectoire, et carte
WO2022062825A1 (fr) Procédé, dispositif de commande de véhicule et véhicule
WO2021110166A1 (fr) Procédé et dispositif de détection de structure de route
WO2022051951A1 (fr) Procédé de détection de ligne de voie de circulation, dispositif associé et support de stockage lisible par ordinateur
CN114693540A (zh) 一种图像处理方法、装置以及智能汽车
WO2021163846A1 (fr) Procédé de suivi de cible et appareil de suivi de cible
CN112810603B (zh) 定位方法和相关产品
WO2022089577A1 (fr) Procédé de détermination de pose et dispositif associé
WO2022052881A1 (fr) Procédé de construction de carte et dispositif informatique
WO2022022284A1 (fr) Procédé et appareil de détection d'objet cible
CN115546781A (zh) 一种点云数据的聚类方法以及装置
WO2021000787A1 (fr) Procédé et dispositif de reconnaissance de géométrie de route
WO2021217646A1 (fr) Procédé et dispositif de détection d'un espace libre pour un véhicule
CN114167404A (zh) 目标跟踪方法及装置
WO2021159397A1 (fr) Procédé de détection et dispositif de détection de région pouvant être parcourue par un véhicule
WO2022033089A1 (fr) Procédé et dispositif permettant de déterminer des informations tridimensionnelles d'un objet qui doit subir une détection
CN115508841A (zh) 一种路沿检测的方法和装置
CN113128497A (zh) 目标形状估计方法及装置
WO2022061725A1 (fr) Procédé et appareil d'observation d'élément de circulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21850793

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21850793

Country of ref document: EP

Kind code of ref document: A1