US10209717B2 - Autonomous guidance system - Google Patents
Autonomous guidance system
- Publication number
- US10209717B2
- Authority
- US
- United States
- Prior art keywords
- vehicle
- radar
- controller
- signal
- small
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/411—Identification of targets based on measurements of radar reflectivity
- G01S7/412—Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93276—Sensor installation details in the windshield area
-
- G01S2013/9392—
-
- G05D2201/0213—
Definitions
- This disclosure generally relates to an autonomous guidance system, and more particularly relates to a controller that classifies the object as small when a magnitude of a radar reflection signal associated with the object is less than a signal-threshold.
- an autonomous guidance system that operates a vehicle in an autonomous mode.
- the system includes a camera module, a radar module, and a controller.
- the camera module outputs an image signal indicative of an image of an object in an area about a vehicle.
- the radar module outputs a reflection signal indicative of a reflected signal reflected by the object.
- the controller determines an object-location of the object on a map of the area based on a vehicle-location of the vehicle on the map, the image signal, and the reflection signal.
- the controller classifies the object as small when a magnitude of the reflection signal associated with the object is less than a signal-threshold.
- FIG. 1 is a top view of a vehicle equipped with an autonomous guidance system that includes a sensor assembly, according to one embodiment
- FIG. 2 is a block diagram of the assembly of FIG. 1 , according to one embodiment
- FIG. 3 is a perspective view of the assembly of FIG. 1 , according to one embodiment.
- FIG. 4 is a side view of the assembly of FIG. 1 , according to one embodiment.
- FIG. 1 illustrates a non-limiting example of an autonomous guidance system, hereafter referred to as the system 110 , which operates a vehicle 10 in an autonomous mode that autonomously controls, among other things, the steering-direction, and the speed of the vehicle 10 without intervention on the part of an operator (not shown).
- the means to change the steering-direction, apply brakes, and control engine power for the purpose of autonomous vehicle control are known so these details will not be explained herein.
- the disclosure that follows is generally directed to how radar and image processing can be cooperatively used to improve autonomous control of the vehicle 10 , in particular how maps used to determine where to steer the vehicle can be generated, updated, and otherwise improved for autonomous vehicle guidance.
- the vehicle 10 is equipped with a sensor assembly, hereafter the assembly 20 , which is shown in this example located in an interior compartment of the vehicle 10 behind a window 12 of the vehicle 10 . While an automobile is illustrated, it will be evident that the assembly 20 may also be suitable for use on other vehicles such as heavy duty on-road vehicles like semi-tractor-trailers, and off-road vehicles such as construction equipment. In this non-limiting example, the assembly 20 is located behind the windshield and forward of a rearview mirror 14 so is well suited to detect an object 16 in an area 18 forward of the vehicle 10 .
- the assembly 20 may be positioned to ‘look’ through a side or rear window of the vehicle 10 to observe other areas about the vehicle 10 , or the assembly may be integrated into a portion of the vehicle body in an unobtrusive manner. It is emphasized that the assembly 20 is advantageously configured to be mounted on the vehicle 10 in such a way that it is not readily noticed. That is, the assembly 20 is more aesthetically pleasing than previously proposed autonomous systems that mount a sensor unit in a housing that protrudes above the roofline of the vehicle on which it is mounted. As will become apparent in the description that follows, the assembly 20 includes features particularly directed to overcoming problems with detecting small objects.
- FIG. 2 illustrates a non-limiting example of a block diagram of the system 110 , i.e. a block diagram of the assembly 20 .
- the assembly 20 may include a controller 120 that may include a processor such as a microprocessor or other control circuitry such as analog and/or digital control circuitry including an application specific integrated circuit (ASIC) for processing data as should be evident to those in the art.
- the controller 120 may include memory, including non-volatile memory, such as electrically erasable programmable read-only memory (EEPROM) for storing one or more routines, thresholds, and captured data. The one or more routines may be executed by the processor to perform steps for processing signals received by the controller 120 in order to detect the object 16 as described herein.
- the controller 120 includes a radar module 30 for transmitting radar signals through the window 12 to detect an object 16 through the window 12 and in an area 18 about the vehicle 10 .
- the radar module 30 outputs a reflection signal 112 indicative of a reflected signal 114 reflected by the object 16 .
- the area 18 is shown as generally forward of the vehicle 10 and includes a radar field of view defined by dashed lines 150 .
- the radar module 30 receives reflected signal 114 reflected by the object 16 when the object 16 is located in the radar field of view.
- the controller 120 also includes a camera module 22 for capturing images through the window 12 in a camera field of view defined by dashed line 160 .
- the camera module 22 outputs an image signal 116 indicative of an image of the object 16 in the area about a vehicle.
- the controller 120 is generally configured to detect one or more objects relative to the vehicle 10 . Additionally, the controller 120 may have further capabilities to estimate the parameters of the detected object(s) including, for example, the object position and velocity vectors, target size, and classification, e.g., vehicle versus pedestrian.
- the assembly 20 may be employed onboard the vehicle 10 for automotive safety applications including adaptive cruise control (ACC), forward collision warning (FCW), and collision mitigation or avoidance via autonomous braking and lane departure warning (LDW).
- the controller 120 or the assembly 20 advantageously integrates both radar module 30 and the camera module 22 into a single housing.
- the integration of the camera module 22 and the radar module 30 into a common single assembly (the assembly 20 ) advantageously provides a reduction in sensor costs.
- the camera module 22 and radar module 30 integration advantageously employs common or shared electronics and signal processing as shown in FIG. 2 .
- placing the radar module 30 and the camera module 22 in the same housing simplifies aligning these two parts, so a location of the object 16 relative to the vehicle 10 based on a combination of radar and image data (i.e., radar-camera data fusion) is more readily determined.
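The co-location benefit described above can be sketched as follows. This is an illustrative Python fragment, not part of the patent disclosure; the function name and the simple range/bearing model are assumptions. With both sensors sharing one mounting point, the radar range and the camera azimuth can be combined directly into one object position in the vehicle frame, without a separate extrinsic calibration between the two sensors.

```python
import math

def fuse_range_bearing(radar_range_m, camera_azimuth_deg):
    """Combine a radar range with a camera bearing into an (x, y)
    position relative to the shared sensor mounting point."""
    az = math.radians(camera_azimuth_deg)
    x = radar_range_m * math.cos(az)  # forward distance
    y = radar_range_m * math.sin(az)  # lateral offset (sign is a convention)
    return x, y

# Object detected 25 m out by radar, 5 degrees off-axis by the camera.
x, y = fuse_range_bearing(25.0, 5.0)
```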
- the assembly 20 may advantageously employ a housing 100 comprising a plurality of walls as shown in FIGS. 3 and 4 , according to one embodiment.
- the controller 120 may incorporate a radar-camera processing unit 50 for processing the captured images and the received reflected radar signals and providing an indication of the detection of the presence of one or more objects detected in the coverage zones defined by the dashed lines 150 and the dashed lines 160 .
- the controller 120 may also incorporate or combine the radar module 30 , the camera module 22 , the radar-camera processing unit 50 , and a vehicle control unit 72 .
- the radar module 30 and camera module 22 both communicate with the radar-camera processing unit 50 to process the received radar signals and camera generated images so that the sensed radar and camera signals are useful for various radar and vision functions.
- the vehicle control unit 72 may be integrated within the radar-camera processing unit or may be separate therefrom.
- the vehicle control unit 72 may execute any of a number of known applications that utilize the processed radar and camera signals including, but not limited to autonomous vehicle control, ACC, FCW, and LDW.
- the camera module 22 is shown in FIG. 2 including both the optics 24 and an imager 26 . It should be appreciated that the camera module 22 may include a commercially available off the shelf camera for generating video images. For example, the camera module 22 may include a wafer scale camera, or other image acquisition device. Camera module 22 receives power from the power supply 58 of the radar-camera processing unit 50 and communicates data and control signals with a video microcontroller 52 of the radar-camera processing unit 50 .
- the radar module 30 may include a transceiver 32 coupled to an antenna 48 .
- the transceiver 32 and antenna 48 operate to transmit radar signals within the desired coverage zone or beam defined by the dashed lines 150 and to receive reflected radar signals reflected from objects within the coverage zone defined by the dashed lines 150 .
- the radar module 30 may transmit a single fan-shaped radar beam and form multiple receive beams by receive digital beam-forming, according to one embodiment.
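The receive digital beam-forming mentioned above can be illustrated with a minimal sketch, assuming a uniform linear array with half-wavelength element spacing; the array size, function names, and steering approach are assumptions for illustration, not the patent's implementation. Each receive beam is formed in software by applying per-element phase weights and summing, so one transmitted fan-shaped beam yields multiple receive beams.

```python
import cmath
import math

def steer_weights(n_elements, steer_deg):
    """Phase weights that point a receive beam toward steer_deg
    (uniform linear array, half-wavelength spacing assumed)."""
    phase_step = math.pi * math.sin(math.radians(steer_deg))
    return [cmath.exp(-1j * k * phase_step) for k in range(n_elements)]

def beam_output(element_samples, steer_deg):
    """Coherently sum one snapshot of element samples toward steer_deg."""
    w = steer_weights(len(element_samples), steer_deg)
    return sum(wi * si for wi, si in zip(w, element_samples))

# A target at 10 degrees produces element phases that match the 10-degree
# weights, so that beam sums coherently while other beams do not.
n = 8
incident = [cmath.exp(1j * k * math.pi * math.sin(math.radians(10)))
            for k in range(n)]
on_beam = abs(beam_output(incident, 10))    # coherent sum, close to n
off_beam = abs(beam_output(incident, -30))  # incoherent, much smaller
```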
- the antenna 48 may include a vertical polarization antenna for providing vertical polarization of the radar signal, which provides good propagation over incidence (rake) angles of interest for the windshield, such as a seventy degree (70°) incidence angle.
- a horizontal polarization antenna may be employed; however, horizontal polarization is more sensitive to the RF properties and parameters of the windshield at high incidence angles.
- the radar module 30 may also include a switch driver 34 coupled to the transceiver 32 and further coupled to a programmable logic device (PLD 36 ).
- the programmable logic device (PLD) 36 controls the switch driver in a manner synchronous with the analog-to-digital converter (ADC 38 ) which, in turn, samples and digitizes signals received from the transceiver 32 .
- the radar module 30 also includes a waveform generator 40 and a linearizer 42 .
- the radar module 30 may generate a fan-shaped output which may be achieved using electronic beam forming techniques.
- One example of a suitable radar sensor operates at a frequency of 76.5 gigahertz. It should be appreciated that the automotive radar may operate in one of several other available frequency bands, including 24 GHz ISM, 24 GHz UWB, 76.5 GHz, and 79 GHz.
- the radar-camera processing unit 50 is shown employing a video microcontroller 52 , which includes processing circuitry, such as a microprocessor.
- the video microcontroller 52 communicates with memory 54 which may include SDRAM and flash memory, amongst other available memory devices.
- a device 56 characterized as a debugging USB2 device is also shown communicating with the video microcontroller 52 .
- the video microcontroller 52 communicates data and control with each of the radar module 30 and camera module 22 . This may include the video microcontroller 52 controlling the radar module 30 and camera module 22 and includes receiving images from the camera module 22 and digitized samples of the received reflected radar signals from the radar module 30 .
- the video microcontroller 52 may process the received radar signals and camera images and provide various radar and vision functions.
- the radar functions executed by video microcontroller 52 may include radar detection 60 , tracking 62 , and threat assessment 64 , each of which may be implemented via a routine, or algorithm.
- the video microcontroller 52 may implement vision functions including lane tracking function 66 , vehicle detection 68 , and pedestrian detection 70 , each of which may be implemented via routines or algorithms. It should be appreciated that the video microcontroller 52 may perform various functions related to either radar or vision utilizing one or both of the outputs of the radar module 30 and camera module 22 .
- the vehicle control unit 72 is shown communicating with the video microcontroller 52 by way of a controller area network (CAN) bus and a vision output line.
- the vehicle control unit 72 includes an application microcontroller 74 coupled to memory 76 which may include electronically erasable programmable read-only memory (EEPROM), amongst other memory devices.
- the memory 76 may also be used to store a map 122 of roadways that the vehicle 10 may travel. As will be explained in more detail below, the map 122 may be created and/or modified using information obtained from the radar module 30 and/or the camera module 22 so that the autonomous control of the vehicle 10 is improved.
- the vehicle control unit 72 is also shown including an RTC watchdog 78 , temperature monitor 80 , and input/output interface for diagnostics 82 , and CAN/HW interface 84 .
- the vehicle control unit 72 includes a twelve volt (12 V) power supply 86 which may be a connection to the vehicle battery. Further, the vehicle control unit 72 includes a private CAN interface 88 and a vehicle CAN interface 90 , both shown connected to an electronic control unit (ECU) that is connected to an ECU connector 92 .
- the vehicle control unit 72 may be implemented as a separate unit integrated within the assembly 20 or may be located remote from the assembly 20 and may be implemented with other vehicle control functions, such as a vehicle engine control unit. It should further be appreciated that functions performed by the vehicle control unit 72 may be performed by the video microcontroller 52 , without departing from the teachings of the present invention.
- the camera module 22 generally captures camera images of an area in front of the vehicle 10 .
- the radar module 30 may emit a fan-shaped radar beam so that objects generally in front of the vehicle reflect the emitted radar back to the sensor.
- the radar-camera processing unit 50 processes the radar and vision data collected by the corresponding camera module 22 and radar module 30 and may process the information in a number of ways.
- One example of processing of radar and camera information is disclosed in U.S. Patent Application Publication No. 2007/0055446, which is assigned to the assignee of the present application, the disclosure of which is hereby incorporated herein by reference.
- the assembly 20 is generally illustrated having a housing 100 containing the various components thereof.
- the housing 100 may include a polymeric or metallic material having a plurality of walls that generally contain and enclose the components therein.
- the housing 100 has an angled surface 102 shaped to conform to the interior shape of the window 12 .
- Angled surface 102 may be connected to window 12 via an adhesive, according to one embodiment.
- housing 100 may otherwise be attached to window 12 or to another location behind the window 12 within the passenger compartment of the vehicle 10 .
- the assembly 20 has the camera module 22 generally shown mounted near an upper end and the radar module 30 is mounted below. However, the camera module 22 and radar module 30 may be located at other locations relative to each other.
- the radar module 30 may include an antenna 48 that is vertically oriented and mounted generally at the forward side of the radar module 30 for providing a vertically polarized signal.
- the antenna 48 may be a planar antenna such as a patch antenna.
- a glare shield 28 is further provided shown as a lower wall of the housing 100 generally below the camera module 22 .
- the glare shield 28 generally shields light reflection or glare from adversely affecting the light images received by the camera module 22 . This includes preventing glare from reflecting off of the vehicle dash or other components within the vehicle and into the imaging view of the camera module 22 .
- an electromagnetic interference (EMI) shield may be located in front or below the radar module 30 .
- the EMI shield may generally be configured to constrain the radar signals to a generally forward direction passing through the window 12 , and to prevent or minimize radar signals that may otherwise pass into the vehicle 10 .
- the camera module 22 and radar module 30 may be mounted onto a common circuit board which, in turn, communicates with the radar-camera processing unit 50 , all housed together within the housing 100 .
- the system 110 includes a camera module 22 and a radar module 30 .
- the camera module 22 outputs an image signal 116 indicative of an image of an object 16 in the area 18 about a vehicle 10 .
- the radar module 30 outputs a reflection signal 112 indicative of a reflected signal 114 reflected by the object 16 .
- the controller 120 may be used to generate from scratch and store a map 122 of roadways traveled by the vehicle 10 , and/or update a previously stored/generated version of the map 122 .
- the controller 120 may include a global-positioning-unit, hereafter the GPS 124 to provide a rough estimate of a vehicle-location 126 of the vehicle 10 relative to selected satellites (not shown).
- the system 110 advantageously is able to accurately determine an object-location 128 of the object 16 relative to the vehicle 10 so that small objects that are not normally included in typical GPS based maps can be avoided by the vehicle when being autonomously operated.
- the object 16 illustrated in FIG. 1 is a small mound in the roadway, the kind of which is sometimes used to designate a lane boundary at intersections.
- the object 16 could be driven over by the vehicle 10 without damage to the vehicle 10 .
- jostling of passengers by wheels of the vehicle 10 driving over the object 16 may cause undesirable motion of the vehicle 10 that may annoy passengers in the vehicle 10 , or possibly spill coffee in the vehicle 10 .
- Another example of a small object that may warrant some action on the part of an autonomous driving system is a rough rail-road crossing, where the system 110 may slow the vehicle 10 shortly before reaching the rail-road crossing.
- the controller 120 is configured to generate the map 122 of the area 18 based on the vehicle-location 126 of the vehicle 10 . That is, the controller 120 is not preloaded with a predetermined map such as those provided with a typical commercially available navigation assistance device. Instead, the controller 120 builds or generates the map 122 from scratch based on the image signal 116 , the reflection signal 112 , and global position coordinates provided by the GPS 124 . For example, the width of the roadways traveled by the vehicle 10 may be determined from the image signal 116 , and various objects such as signs, bridges, buildings, and the like may be recorded or classified by a combination of the image signal 116 and the reflection signal 112 .
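The from-scratch map-building described above can be sketched as follows; the class name, the record layout, and the assumption that the sensor-relative offset has already been converted to degrees are all illustrative, not from the patent.

```python
class GeneratedMap:
    """Minimal sketch of a map built from scratch: classified objects
    recorded at absolute coordinates derived from the GPS vehicle-location
    plus the sensor-relative object-location."""

    def __init__(self):
        self.objects = []  # each entry: position plus classification

    def record(self, vehicle_location, relative_offset, classification):
        lat, lon = vehicle_location
        d_lat, d_lon = relative_offset  # offset already in degrees (assumed)
        self.objects.append({
            "position": (lat + d_lat, lon + d_lon),
            "class": classification,
        })

road_map = GeneratedMap()
# A sign detected slightly north of the current GPS fix.
road_map.record((42.0, -83.0), (0.0001, 0.0), "sign")
```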
- vehicle radar systems ignore small objects detected by the radar module 30 .
- small objects include curbs, lamp-posts, mail-boxes, and the like.
- these small objects are typically not relevant to determining when the next turn should be made by an operator of the vehicle.
- prior knowledge of small targets can help the system keep the vehicle 10 centered in a roadway, and can flag an unexpected small object as a potential threat when one is detected by the system 110 .
- the controller 120 may be configured to classify the object 16 as small when a magnitude of the reflection signal 112 associated with the object 16 is less than a signal-threshold.
- the system may also be configured to ignore an object classified as small if the object is well away from the roadway, more than five meters (5 m) for example.
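The two rules above can be sketched together as follows; the threshold value and function name are assumptions for illustration (the patent does not specify a numeric signal-threshold), while the five-meter roadway margin mirrors the example in the text.

```python
SIGNAL_THRESHOLD = 10.0   # illustrative value; units are sensor-specific
ROADWAY_MARGIN_M = 5.0    # "well away from the roadway" per the example

def classify(reflection_magnitude, distance_from_roadway_m):
    """Classify an object as small when its radar return is below the
    signal-threshold; ignore small objects well away from the roadway."""
    if reflection_magnitude >= SIGNAL_THRESHOLD:
        return "large"
    if distance_from_roadway_m > ROADWAY_MARGIN_M:
        return "ignore"   # small, but too far from the roadway to matter
    return "small"
```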
- the controller 120 may be preprogrammed or preloaded with a predetermined map such as those provided with a typical commercially available navigation assistance device.
- maps typically do not include information about all objects proximate to a roadway, for example, curbs, lamp-posts, mail-boxes, and the like.
- the controller 120 may be configured or programmed to determine the object-location 128 of the object 16 on the map 122 of the area 18 based on the vehicle-location 126 of the vehicle 10 on the map 122 , the image signal 116 , and the reflection signal 112 .
- the controller 120 may add details to the preprogrammed map in order to identify various objects, to assist the system 110 in avoiding collisions with those objects, and to keep the vehicle 10 centered in the lane or roadway on which it is traveling. As mentioned before, prior radar-based systems may ignore small objects. However, in this example, the controller 120 classifies the object as small when the magnitude of the reflection signal 112 associated with the object 16 is less than a signal-threshold. Accordingly, small objects such as curbs, lamp-posts, mail-boxes, and the like can be remembered by the system 110 to help the system 110 safely navigate the vehicle 10 .
- the controller may be configured to keep track of each time a small object is detected, but not add that small object to the map 122 until the small object has been detected multiple times. In other words, the controller classifies the object 16 as verified if the object 16 is classified as small and the object 16 is detected a plurality of occasions that the vehicle 10 passes through the area 18 . It follows that the controller 120 adds the object 16 to the map 122 after the object 16 is classified as verified after having been classified as small.
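The verified-after-multiple-passes rule above can be sketched as follows; the required count of three and the identifiers are assumptions for illustration.

```python
from collections import defaultdict

REQUIRED_DETECTIONS = 3  # illustrative; the patent specifies no fixed count
detection_counts = defaultdict(int)

def observe(object_id, object_map):
    """Count a detection of a small object; add it to the map only once
    it has been detected on enough passes through the area."""
    detection_counts[object_id] += 1
    if detection_counts[object_id] >= REQUIRED_DETECTIONS:
        object_map.add(object_id)  # classified as verified; persist to map

verified_map = set()
for _ in range(3):          # three passes through the area
    observe("curb-17", verified_map)
```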
- the controller 120 may be configured or programmed to determine a size of the object 16 based on the image signal 116 and the reflection signal 112 , and then classify the object 16 as verified if the object is classified as small and a confidence level assigned to the object 16 is greater than a confidence-threshold, where the confidence-threshold is based on the magnitude of the reflection signal 112 and a number of occasions that the object is detected. For example, if the magnitude of the reflection signal 112 is only a few percent below the signal-threshold used to determine that an object is small, then the object 16 may be classified as verified after only two or three encounters.
- the object 16 may be classified as verified only after many encounters, eight encounters for example. As before, the controller 120 then adds the object 16 to the map 122 after the object 16 is classified as verified.
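The magnitude-dependent confidence rule can be sketched as follows. The two-to-eight encounter range mirrors the examples in the text, but the linear interpolation, threshold value, and function names are assumptions for illustration.

```python
SIGNAL_THRESHOLD = 10.0  # illustrative magnitude threshold for "small"

def encounters_required(reflection_magnitude):
    """Map a below-threshold reflection magnitude to the number of
    detections needed before the object is classified as verified:
    near-threshold (strong) returns verify after ~2 encounters,
    very weak returns only after ~8."""
    fraction = max(0.0, min(1.0, reflection_magnitude / SIGNAL_THRESHOLD))
    return round(8 - 6 * fraction)

def is_verified(reflection_magnitude, times_detected):
    return times_detected >= encounters_required(reflection_magnitude)
```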
- Other objects may be classified based on when they appear. For example, if the vehicle autonomously travels the same roadway every weekday to, for example, convey a passenger to work, objects such as garbage cans may appear adjacent to the roadway on one particular day, Wednesday for example.
- the controller 120 may be configured to log the date, day of the week, and/or time of day that an object is encountered, and then look for a pattern so the presence of that object can be anticipated in the future and the system 110 can direct the vehicle 10 to give the garbage can a wide berth.
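The pattern-logging idea above can be sketched as follows; the 75% dominance rule and minimum sample count are assumptions for illustration (the patent does not specify how a pattern is recognized).

```python
from collections import Counter

encounter_log = Counter()  # weekday name -> number of encounters

def log_encounter(weekday):
    encounter_log[weekday] += 1

def anticipated_weekday():
    """Return the weekday on which the transient object (e.g., a garbage
    can) should be anticipated, or None if no clear pattern exists yet."""
    total = sum(encounter_log.values())
    if total < 3:
        return None  # too few samples to claim a pattern
    day, count = encounter_log.most_common(1)[0]
    return day if count / total >= 0.75 else None

for _ in range(4):
    log_encounter("Wednesday")
```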
- an autonomous guidance system (the system 110 ), and a controller 120 for the system 110 is provided.
- the controller 120 learns the location of small objects that are not normally part of navigation maps but are a concern when the vehicle 10 is being operated in an autonomous mode. If a weather condition such as snow obscures or prevents the detection of certain objects by the camera module 22 and/or the radar module 30 , the system 110 can still direct the vehicle 10 to avoid the object 16 because the object-location 128 relative to other un-obscured objects is present in the map 122 .
Abstract
Description
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/545,944 US10209717B2 (en) | 2015-02-06 | 2015-12-07 | Autonomous guidance system |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562112770P | 2015-02-06 | 2015-02-06 | |
US15/545,944 US10209717B2 (en) | 2015-02-06 | 2015-12-07 | Autonomous guidance system |
PCT/US2015/064225 WO2016126315A1 (en) | 2015-02-06 | 2015-12-07 | Autonomous guidance system |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180004220A1 US20180004220A1 (en) | 2018-01-04 |
US10209717B2 true US10209717B2 (en) | 2019-02-19 |
Family
ID=56564481
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/545,944 Active 2035-12-22 US10209717B2 (en) | 2015-02-06 | 2015-12-07 | Autonomous guidance system |
Country Status (2)
Country | Link |
---|---|
US (1) | US10209717B2 (en) |
WO (1) | WO2016126315A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10479376B2 (en) * | 2017-03-23 | 2019-11-19 | UATC, LLC | Dynamic sensor selection for self-driving vehicles |
CN107380162B (en) * | 2017-06-08 | 2019-05-31 | 南京航空航天大学 | Cooperative collision avoidance method based on function allocation and multi-objective fuzzy decision |
CN107945562B (en) * | 2017-09-30 | 2021-03-19 | 百度在线网络技术(北京)有限公司 | Parking lot information recommendation method, server device and readable medium |
TWI684778B (en) * | 2018-03-26 | 2020-02-11 | 為升電裝工業股份有限公司 | Anti-collision radar device for trailer-mounted vehicle |
CN108922242A (en) * | 2018-06-05 | 2018-11-30 | 宁波金洋化工物流有限公司 | Preventive tracking and control platform for hazardous-chemical transport vehicles |
CN110091875A (en) * | 2019-05-14 | 2019-08-06 | 长沙理工大学 | Deep learning type intelligent driving context aware systems based on Internet of Things |
CN111775961B (en) * | 2020-06-29 | 2022-01-04 | 阿波罗智能技术(北京)有限公司 | Automatic driving vehicle planning method and device, electronic equipment and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6526352B1 (en) * | 2001-07-19 | 2003-02-25 | Intelligent Technologies International, Inc. | Method and arrangement for mapping a road |
WO2009070069A1 (en) | 2007-11-26 | 2009-06-04 | Autoliv Development Ab | A system for classifying objects in the vicinity of a vehicle |
US20090140887A1 (en) * | 2007-11-29 | 2009-06-04 | Breed David S | Mapping Techniques Using Probe Vehicles |
US20100013615A1 (en) * | 2004-03-31 | 2010-01-21 | Carnegie Mellon University | Obstacle detection having enhanced classification |
US8199046B2 (en) | 2006-06-30 | 2012-06-12 | Toyota Jidosha Kabushiki Kaisha | Radar system to determine whether an object is subject of detection based on intensity of a radio wave emission of the object |
US20140032093A1 (en) | 2012-07-30 | 2014-01-30 | Ford Global Technologies, Llc | Collision detection system with a plausibility module |
US20150019080A1 (en) * | 2013-07-09 | 2015-01-15 | GM Global Technology Operations LLC | Driver assistance system for a motor vehicle |
2015
- 2015-12-07 WO PCT/US2015/064225 patent/WO2016126315A1/en active Application Filing
- 2015-12-07 US US15/545,944 patent/US10209717B2/en active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10948924B2 (en) | 2015-02-06 | 2021-03-16 | Aptiv Technologies Limited | Method and apparatus for controlling an autonomous vehicle |
US10991247B2 (en) | 2015-02-06 | 2021-04-27 | Aptiv Technologies Limited | Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles |
US11543832B2 (en) | 2015-02-06 | 2023-01-03 | Aptiv Technologies Limited | Method and apparatus for controlling an autonomous vehicle |
US11763670B2 (en) | 2015-02-06 | 2023-09-19 | Aptiv Technologies Limited | Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles |
US11150326B2 (en) * | 2018-09-17 | 2021-10-19 | Cubtek Inc. | Radar system with angle error determination function and method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20180004220A1 (en) | 2018-01-04 |
WO2016126315A1 (en) | 2016-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10209717B2 (en) | Autonomous guidance system | |
US11597378B2 (en) | Vehicular sensing system for anticipating cut-in by other vehicle | |
US20200341487A1 (en) | System and Method to Operate an Automated Vehicle | |
US11275175B2 (en) | Method for detecting objects via a vehicular sensing system | |
US9863775B2 (en) | Vehicle localization system | |
CN106485194B (en) | Object recognition device, vehicle with object recognition device, and control method thereof | |
US9151626B1 (en) | Vehicle position estimation system | |
US6888447B2 (en) | Obstacle detection device for vehicle and method thereof | |
US20180068566A1 (en) | Trailer lane departure warning and sway alert | |
EP3159866A1 (en) | Object recognition apparatus and vehicle travel controller using same | |
US7275431B2 (en) | Vehicle mounted system for detecting objects | |
US20170299707A1 (en) | Systems and methods for adaptive sensor angle positioning in vehicles | |
US20180004221A1 (en) | Autonomous guidance system | |
US8370055B2 (en) | Driver assistance system | |
CN110214346B (en) | Sensor arrangement and method for detecting objects around a trailer of a vehicle | |
US6597984B2 (en) | Multisensory correlation of traffic lanes | |
US10222803B2 (en) | Determining objects of interest for active cruise control | |
JP7324057B2 (en) | Vehicle object detection device | |
WO2019112514A1 (en) | Rain filtering techniques for autonomous vehicle | |
CN110546528A (en) | Sensor system for a vehicle and method for determining a threat assessment | |
JP2010162975A (en) | Vehicle control system | |
US11867797B2 (en) | Sensor cluster device and vehicle including the same | |
US10356307B2 (en) | Vehicle camera system | |
US20230177843A1 (en) | Object assessment device, storage medium storing computer program for object assessment, and object assessment method | |
US20240036212A1 (en) | Lane boundary detection using sub-short range active light sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DELPHI TECHNOLOGIES, INC., MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAZELTON, LAWRENCE D.;REEL/FRAME:043082/0366 Effective date: 20170719 |
|
AS | Assignment |
Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DELPHI TECHNOLOGIES INC.;REEL/FRAME:047153/0902 Effective date: 20180101 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: APTIV TECHNOLOGIES (2) S.A R.L., LUXEMBOURG Free format text: ENTITY CONVERSION;ASSIGNOR:APTIV TECHNOLOGIES LIMITED;REEL/FRAME:066746/0001 Effective date: 20230818 |
Owner name: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L., LUXEMBOURG Free format text: MERGER;ASSIGNOR:APTIV TECHNOLOGIES (2) S.A R.L.;REEL/FRAME:066566/0173 Effective date: 20231005 |
Owner name: APTIV TECHNOLOGIES AG, SWITZERLAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L.;REEL/FRAME:066551/0219 Effective date: 20231006 |