WO2022086739A3 - Systems and methods for camera-lidar fused object detection - Google Patents

Systems and methods for camera-lidar fused object detection

Info

Publication number
WO2022086739A3
Authority
WO
WIPO (PCT)
Prior art keywords
lidar
dataset
methods
object detection
segments
Prior art date
Application number
PCT/US2021/054333
Other languages
French (fr)
Other versions
WO2022086739A2 (en)
Inventor
Arsenii Saranin
Basel ALGHANEM
Benjamin D. BALLARD
Jason Ziglar
G. Peter K. CARR
Original Assignee
Argo AI, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/078,532 (published as US12050273B2)
Priority claimed from US17/078,575 (published as US11430224B2)
Priority claimed from US17/078,561 (published as US20220126873A1)
Priority claimed from US17/078,548 (published as US20220128702A1)
Priority claimed from US17/078,543 (published as US11885886B2)
Application filed by Argo AI, LLC filed Critical Argo AI, LLC
Priority to CN202180085904.7A (published as CN116685874A)
Priority to DE112021005607.7T (published as DE112021005607T5)
Publication of WO2022086739A2
Publication of WO2022086739A3

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • B60W60/0011 - Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 - Conjoint control of vehicle sub-units of different type or different function
    • B60W10/04 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units
    • B60W10/06 - Conjoint control of vehicle sub-units of different type or different function including control of propulsion units including control of combustion engines
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 - Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • B60W10/184 - Conjoint control of vehicle sub-units of different type or different function including control of braking systems with wheel brakes
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 - Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • B60W60/0015 - Planning or execution of driving tasks specially adapted for safety
    • B60W60/0017 - Planning or execution of driving tasks specially adapted for safety of other traffic participants
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • B60W60/0027 - Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 - Evaluating distance, position or velocity data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 - Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 - Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408 - Radar; Laser, e.g. lidar
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 - Input parameters relating to objects
    • B60W2554/40 - Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 - Type
    • B60W2554/4026 - Cycles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

Systems and methods for object detection. The methods comprise obtaining a LiDAR dataset and using the LiDAR dataset together with one or more camera images to detect an object by: matching points of the LiDAR dataset to pixels in an image; generating a pruned LiDAR dataset by reducing the total number of points in the LiDAR dataset; computing, for each point of the LiDAR dataset, a distribution over the object detections in which the point is likely to lie; creating a plurality of segments of LiDAR data points using that distribution; merging the plurality of segments to generate merged segments; and/or detecting the object in the point cloud defined by the LiDAR dataset based on the merged segments. The object detection may be used to facilitate at least one autonomous driving operation.
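
The abstract describes a staged camera-LiDAR fusion pipeline: LiDAR points are matched to image pixels, the point set is pruned, each remaining point receives a distribution over the image-space object detections it may belong to, points are grouped into segments using that distribution, and segments are merged before objects are extracted from the point cloud. The sketch below illustrates one way such a pipeline could be organized; it is not the publication's implementation. The pinhole projection, the uniform per-box assignment, the centroid-distance merge rule, and all function names (project_points, prune_points, detection_distribution, segment_and_merge) are assumptions introduced for this example.

```python
import numpy as np

def project_points(points_xyz, K, T_cam_lidar):
    """Match LiDAR points to image pixels: rigid transform into the camera frame, then pinhole projection."""
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])  # N x 4 homogeneous points
    cam = (T_cam_lidar @ pts_h.T).T[:, :3]                          # N x 3 in the camera frame
    in_front = cam[:, 2] > 0.1                                      # ignore points behind or at the camera
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                                     # perspective divide -> pixel coordinates
    return uv, in_front

def prune_points(uv, in_front, img_w, img_h):
    """Pruned LiDAR dataset: keep only points whose projection lands inside the image."""
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < img_w) & (uv[:, 1] >= 0) & (uv[:, 1] < img_h)
    return in_front & inside  # boolean mask over the original points

def detection_distribution(uv, boxes):
    """Per-point distribution over 2D object detections (uniform over the boxes covering the pixel)."""
    dist = np.zeros((len(uv), len(boxes)))
    for j, (x0, y0, x1, y1) in enumerate(boxes):
        hit = (uv[:, 0] >= x0) & (uv[:, 0] <= x1) & (uv[:, 1] >= y0) & (uv[:, 1] <= y1)
        dist[hit, j] = 1.0
    total = dist.sum(axis=1, keepdims=True)
    return np.divide(dist, total, out=np.zeros_like(dist), where=total > 0)

def segment_and_merge(points_xyz, dist, merge_radius=1.0):
    """Segment points by their most likely detection, then merge segments with nearby centroids."""
    if dist.shape[1] == 0:
        return []
    labels = np.where(dist.sum(axis=1) > 0, dist.argmax(axis=1), -1)
    segments = [points_xyz[labels == j] for j in np.unique(labels) if j >= 0]
    merged, used = [], set()
    for i, seg in enumerate(segments):
        if i in used:
            continue
        group = [seg]
        for k in range(i + 1, len(segments)):
            if k not in used and np.linalg.norm(seg.mean(axis=0) - segments[k].mean(axis=0)) < merge_radius:
                group.append(segments[k])
                used.add(k)
        merged.append(np.vstack(group))  # one fused point cluster per detected object
    return merged
```

A caller would run the stages in order: project the point cloud, apply the pruning mask, compute the distribution against the camera detector's 2D boxes, and hand the merged clusters to downstream object extraction, tracking, or motion planning.
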
PCT/US2021/054333 2020-10-23 2021-10-11 Systems and methods for camera-lidar fused object detection WO2022086739A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180085904.7A CN116685874A (en) 2020-10-23 2021-10-11 Camera-laser radar fusion object detection system and method
DE112021005607.7T DE112021005607T5 (en) 2020-10-23 2021-10-11 Systems and methods for camera-LiDAR-fused object detection

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US17/078,532 (US12050273B2) 2020-10-23 2020-10-23 Systems and methods for camera-LiDAR fused object detection with point pruning
US17/078,575 (US11430224B2) 2020-10-23 2020-10-23 Systems and methods for camera-LiDAR fused object detection with segment filtering
US17/078,561 (US20220126873A1) 2020-10-23 2020-10-23 Systems and methods for camera-LiDAR fused object detection with segment merging
US17/078,548 (US20220128702A1) 2020-10-23 2020-10-23 Systems and methods for camera-LiDAR fused object detection with local variation segmentation
US17/078,543 (US11885886B2) 2020-10-23 2020-10-23 Systems and methods for camera-LiDAR fused object detection with LiDAR-to-image detection matching

Publications (2)

Publication Number Publication Date
WO2022086739A2 (en) 2022-04-28
WO2022086739A3 (en) 2022-06-23

Family

ID=81291747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/054333 WO2022086739A2 (en) 2020-10-23 2021-10-11 Systems and methods for camera-lidar fused object detection

Country Status (2)

Country Link
DE (1) DE112021005607T5 (en)
WO (1) WO2022086739A2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020168313A1 (en) * 2019-02-15 2020-08-20 Arizona Board Of Regents On Behalf Of The University Of Arizona Mobile 3d imaging system and method
US20230303084A1 (en) * 2022-03-23 2023-09-28 Robert Bosch Gmbh Systems and methods for multi-modal data augmentation for perception tasks in autonomous driving
CN115035184B * 2022-06-13 2024-05-28 Zhejiang University Honey pomelo volume estimation method based on lateral multi-view reconstruction
CN117706942B * 2024-02-05 2024-04-26 Sichuan University Environment sensing and self-adaptive driving auxiliary electronic control method and system
CN117893412B * 2024-03-15 2024-06-11 Beijing Institute of Technology Point cloud data filtering method, device, equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190236381A1 (en) * 2018-01-30 2019-08-01 Wipro Limited. Method and system for detecting obstacles by autonomous vehicles in real-time
US20190340775A1 (en) * 2018-05-03 2019-11-07 Zoox, Inc. Associating lidar data and image data
US20190387216A1 (en) * 2018-06-13 2019-12-19 Luminar Technologies, Inc. Post-processing by lidar system guided by camera information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VARUNA DE SILVA; JAMIE ROCHE; AHMET KONDOZ: "Fusion of LiDAR and Camera Sensor Data for Environment Sensing in Driverless Vehicles", ARXIV.ORG, 17 October 2017 (2017-10-17), pages 1 - 21, XP081310344 *

Also Published As

Publication number Publication date
DE112021005607T5 (en) 2023-08-24
WO2022086739A2 (en) 2022-04-28

Similar Documents

Publication Publication Date Title
WO2022086739A3 (en) Systems and methods for camera-lidar fused object detection
EP3961579A3 (en) Target detection method, apparatus, medium and computer program product
WO2018069757A3 (en) Navigating a vehicle based on a detected barrier
EP2713310A3 (en) System and method for detection and tracking of moving objects
MX2021009911A (en) Multicamera image processing.
EP3203417A3 (en) Method for detecting texts included in an image and apparatus using the same
MX2022004757A (en) Systems and methods for contextual image analysis.
EP4276628A3 (en) Analyzing large-scale data processing jobs
EP3968286A3 (en) Method, apparatus, electronic device and storage medium for detecting change of building
US11417129B2 (en) Object identification image device, method, and computer program product
Lu et al. A method of SAR target recognition based on Gabor filter and local texture feature extraction
MX2022011994A (en) Cloud-based framework for processing, analyzing, and visualizing imaging data.
WO2018163786A3 (en) Target subject analysis apparatus, target subject analysis method, learning apparatus, and learning method
MY189696A (en) Weather data processing apparatus and method using weather radar
CN112257604A (en) Image detection method, image detection device, electronic equipment and storage medium
Bhardwaj et al. Image processing based smart traffic control system for smart city
Wu et al. Automatic gear sorting system based on monocular vision
KR20190138377A (en) Aircraft identification and location tracking system using CCTV and deep learning
Pise et al. Text detection and recognition in natural scene images
EP4102484A3 (en) Aircraft identification
Lu et al. Automated bullet identification based on striation feature using 3D laser color scanner
Wang et al. Research of method for detection of rail fastener defects based on machine vision
EP3026598A3 (en) Image processing apparatus, image processing method, and program
Yang et al. An algorithm for fast extraction and identification of star target
Li et al. Texture image segmentation based on GLCM

Legal Events

Date Code Title Description
WWE WIPO information: entry into national phase
    Ref document number: 202180085904.7
    Country of ref document: CN

122 EP: PCT application non-entry in European phase
    Ref document number: 21883552
    Country of ref document: EP
    Kind code of ref document: A2