GB2598142A - Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle - Google Patents

Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle Download PDF

Info

Publication number
GB2598142A
GB2598142A GB2013033.2A GB202013033A GB2598142A GB 2598142 A GB2598142 A GB 2598142A GB 202013033 A GB202013033 A GB 202013033A GB 2598142 A GB2598142 A GB 2598142A
Authority
GB
United Kingdom
Prior art keywords
vehicle
observation
test
timebase
test vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2013033.2A
Other versions
GB202013033D0 (en)
Inventor
Komorkiewicz Mateusz
Pankiewicz Nikodem
Ciepiela Filip
Skruch Pawel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aptiv Technologies Ltd
Original Assignee
Aptiv Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aptiv Technologies Ltd filed Critical Aptiv Technologies Ltd
Priority to GB2013033.2A priority Critical patent/GB2598142A/en
Publication of GB202013033D0 publication Critical patent/GB202013033D0/en
Publication of GB2598142A publication Critical patent/GB2598142A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

A testing method for vehicle-mounted sensors comprises providing, in an environment, a test vehicle 1100 having one or more sensors for detecting a spatial state of at least one other object 1300 relative to the test vehicle. The method includes providing one or more such objects and an observation vehicle 1200 in the environment. The observation vehicle has an observation sensor observing at least part of the environment. The method also involves establishing a common timebase for signals from the one or more test sensors and signals from the observation sensor. The arrangement preferably involves comparing the position of the object identified by the tested sensors with the relative position of the object and test vehicle identified by the observation sensor.

Description

Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle
Field
This disclosure relates to a testing method for vehicle-mounted sensors, as well as a testing system, an observation vehicle and a test vehicle which may be used to implement the method.
Background
Advanced driver assistance systems (ADAS) may be provided to road-going vehicles in order to improve driver comfort and safety, with the potential to provide autonomous operation of the vehicle.
Such ADAS employ a sensor fusion strategy, in which signals from a plurality of vehicle-mounted sensors are used to obtain an estimation of the position of the vehicle relative to its environment, and relative to other objects in the environment. A decision engine is provided to process these signals and, as a result, to provide information to the driver, to autonomously operate vehicle functions, or to provide other means of driver support.
ADAS are classified by the Society of Automotive Engineers into one of several levels, according to the J3016 (2018) standard. Level 0 represents no automation but allows for visible, audible and/or tactile warnings; level 1 represents driver assistance in which the system only controls or intervenes in a single driver function, such as steering or speed control while the driver remains responsible for all other functions; level 2 represents partial automation in which the system controls or intervenes in more than one driver function, such as both speed and steering, while the driver remains responsible for all other functions; level 3 represents conditional driving automation in which all driving functions can be performed with the system, while the driver remains ready to take back control to handle complex situations; level 4 represents high driving automation, in which the system can handle highly complex driving situations without driver intervention, while the driver retains the option to take back control; and level 5 represents full driving automation without the need for driver intervention.
Advanced driver assistance systems typically synthesise or fuse signals from various types of sensors integrated on the vehicle, such as ultrasonic sensors, radar sensors, lidar sensors and cameras, in order to obtain an improved estimation of the vehicle position in relation to its environment. Typical sensors which may be provided include short-range ultrasonic parking sensors, long-range forward-looking radar sensors and cameras.
In order to verify the performance of such systems, it is conventional to place a test vehicle equipped with such an advanced driver assistance system into a test environment in which simulated or actual obstacles and hazards exist, and to operate the vehicle in a series of test scenarios in that test environment to confirm that the vehicle performs as expected.
For example, where the advanced driver assistance system includes a collision prevention function, a test vehicle may be caused to travel on a collision course with an object in the environment, such as another vehicle while the ADAS system is engaged. The test vehicle is observed to determine whether appropriate collision avoidance strategies are put into effect. For example, the advanced driver assistance system may be required in such circumstances to actuate brakes of the vehicle in order to bring the vehicle to a controlled stop before a collision takes place.
To correctly verify the performance of such systems, and to obtain an understanding of when such systems fail to operate as intended, it is usual to provide a means of independently verifying the signals generated by the sensors mounted on the vehicle under test, and their correct interpretation by the ADAS and in particular the decision engine.
Under one such typical approach, the vehicle under test is provided with a second set of sensors, which are, for example, of higher performance than those associated with the ADAS of the vehicle. This second set of sensors is used to obtain an independent understanding of the environment and other objects located therein. For example, the test vehicle may additionally be fitted with a laser imaging, detection and ranging (LIDAR) system, mounted above a roofline of the vehicle, in order to provide an independent three-dimensional model of the surroundings. The advanced driver assistance system can then be verified against this model.
However, such a system may have difficulty correctly interpreting the environment in the presence of obstructions, for example when the line of sight from the LIDAR unit to an object of interest in the environment, such as another vehicle, is occluded. Such occlusion may be by another object in the environment such as a third vehicle or vegetation.
In an effort to overcome these disadvantages, it has been considered to provide other vehicles in the environment with a tracking system for recording their own movements in the environment, such as an on-board GPS unit. In such a situation, even when the test-vehicle-mounted LIDAR fails to adequately track other vehicles in the environment, the recorded GPS positions of the other vehicles in the environment can be accurately determined and included in the verification.
However, such an approach has the disadvantage of requiring further equipment to be installed on all other vehicles in the environment, which adds to the cost, especially in test scenarios which involve multiple vehicles, such as collision avoidance in two-way traffic.
Accordingly, there is a need for an improved method of testing vehicle-mounted sensors, and systems and apparatus to enable such a method to be performed.
Summary
According to a first aspect of the present disclosure, there is provided a testing method for vehicle-mounted sensors. The method may comprise providing, in an environment, a test vehicle. The test vehicle has one or more vehicle-mounted sensors. The vehicle-mounted sensors are for detecting a spatial state of at least one other object relative to the test vehicle. The method comprises providing, in the environment, one or more objects. The method comprises providing an observation vehicle in the environment. The observation vehicle has an observation sensor. The observation sensor observes at least part of the environment. The method comprises establishing a common timebase for signals from the one or more test sensors and signals from the observation sensor. The method comprises associating the signals with the timebase. The method comprises positioning the observation vehicle such that the observation sensor observes at least a part of the environment containing the test vehicle and at least one object.
In an implementation, the association comprises recording signals from the one or more test sensors in association with the common timebase. In the implementation, the method comprises recording signals from the observation sensor in association with the common timebase.
In an implementation, the method comprises determining, with signals from the vehicle-mounted sensors, the spatial state of the at least one object relative to the test vehicle in association with the common timebase. In the implementation, the method comprises determining, with signals derived from the observation sensor substantially simultaneously observing the test vehicle and the object, the spatial state of the at least one object relative to the test vehicle in association with the common timebase.
In an implementation, the method comprises comparing the spatial state of the at least one object relative to the test vehicle determined with the vehicle-mounted sensors with the spatial state of the at least one object relative to the test vehicle determined with the observation sensor using the common timebase.
In an implementation, the method comprises determining an error between the spatial state determined with the vehicle-mounted sensors and the spatial state of the at least one object determined with the observation sensor.
In an implementation, the common timebase is established by receiving, at the test vehicle and the observation vehicle, a common timebase signal defining the common timebase. The common timebase signal may optionally be a GPS timebase signal or a radio timebase signal. In an implementation, the common timebase may be established by transmitting, from the test vehicle to the observation vehicle, a timebase signal defining the common timebase. The timebase signal may optionally be an IEEE 1588 timebase signal or an NTP signal.
In an implementation, the common timebase is established by transmitting, from the observation vehicle to the test vehicle, a timebase signal defining the common timebase. The timebase signal may optionally be an IEEE 1588 timebase signal, an NTP signal, or a frame number of the observation sensor.
In an implementation, the common timebase is established to better than 10 milliseconds. The common timebase may optionally be established to better than 1 millisecond.
In an implementation, the observation sensor is an imaging sensor. The imaging sensor may optionally be a visible-spectrum or infrared camera.
In an implementation, the test vehicle has an observable test vehicle marker for locating the test vehicle in the environment. In an implementation, each object has an observable object marker for locating the object in the environment.
In an implementation, the determining of the spatial state of the at least one object relative to the test vehicle with signals derived from the observation sensor is performed by detecting the object marker and the test vehicle marker in the observed part of the environment containing the test vehicle and the at least one object.
In an implementation, each marker comprises a pattern unique in the environment to the respective vehicle or object. The pattern may be a 2D barcode pattern.
In an implementation, each marker comprises a beacon providing a signal unique in the environment to the respective vehicle or object. The signal may optionally be an optical pulse pattern.
In an implementation, each marker is removably attached to the respective vehicle or object. Each marker may be attached to an upper surface of the respective vehicle or object. The attachment may be by magnetic, adhesive or suction attachment.
In an implementation, the observation sensor has an imaging resolution sufficient to resolve each marker at a range of at least 10m, optionally at least 20m, further optionally at least 50m, further optionally at least 100m.
In an implementation, the observation vehicle has a controller operable autonomously to keep station relative to the test vehicle.
In an implementation, the observation vehicle has a controller remotely operated to keep station relative to the test vehicle.
In an implementation, the test vehicle is equipped with advanced driver-assistance systems satisfying at least SAE J3016 (06/2018) Level 1 requirements using the vehicle-mounted sensors.
In an implementation, the vehicle-mounted sensors comprise sensors selected from radar, lidar, ultrasonic and imaging sensors.
According to a second aspect of the present disclosure, there is provided a testing system for vehicle-mounted sensors. The system comprises a test vehicle. The test vehicle has one or more vehicle-mounted sensors for detecting a spatial state of at least one other vehicle relative to the test vehicle. The system comprises one or more objects. The system comprises an observation vehicle. The observation vehicle has an observation sensor for observing at least part of an environment in which the test vehicle and at least one object are operable. The system generates or receives a common timebase for signals from the one or more test sensors and signals from the observation sensor. The observation vehicle has a controller operable to cause the observation sensor to observe at least a part of the environment containing the test vehicle and at least one object.
According to a third aspect of the present disclosure, there is provided an observation vehicle for testing vehicle-mounted sensors. The observation vehicle comprises an observation sensor for observing at least part of an environment in which a test vehicle and at least one object are provided. The observation vehicle comprises a timebase transmitter or receiver for associating observations of the observation sensor with a common timebase.
According to a fourth aspect of the present disclosure, there is provided a test vehicle for testing vehicle-mounted sensors. The test vehicle has vehicle-mounted sensors mounted thereon for detecting a spatial state of at least one other object relative to the test vehicle. The test vehicle has a timebase transmitter or receiver for associating observations of the observation sensor with a common timebase.
Brief description of the drawings
For a better understanding of the present disclosure, and to show how the same may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings, in which: Figure 1 depicts schematically a test vehicle forming part of a system implementing the present disclosure; Figure 2 depicts schematically an observation vehicle forming part of a system implementing the present disclosure; Figure 3 depicts schematically a system implementing the present disclosure performing a test scenario; Figure 4 depicts a schematic block diagram of an ADAS system for implementing the present disclosure; Figure 5 depicts a schematic block diagram of an observation system for implementing the present disclosure; Figure 6 depicts schematically an alternative timebase strategy useable in the present disclosure; Figure 7 depicts schematically a further alternative timebase strategy usable in the present disclosure; Figure 8 depicts schematically an alternative system implementing the present disclosure performing a test scenario; Figure 9 depicts schematically an operation of a system implementing the present disclosure for analysing the results of testing; and Figure 10 depicts a flowchart exemplifying the operation of a system implementing the present disclosure.
Detailed description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practised without these specific details. In other instances, well-known methods, procedures, components, circuits and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
Figure 1 illustrates a test vehicle 1100 forming part of a test system 1000 being an embodiment of the present disclosure. Test vehicle 1100 is a road-going car, which may be, for example, a diesel, petrol, electric or hybrid electric vehicle.
Test vehicle 1100 is provided with an advanced driver assistance system (ADAS) which has a pair of cameras 1111 and 1112 as imaging sensors, a radar unit 1113 as a direction and ranging sensor, and a controller 1115 which fuses sensor signals from camera 1111, camera 1112, and radar sensor 1113 to provide an input to a decision algorithm which actuates driving control functions of test vehicle 1100 to perform advanced driver assistance functions.
Such advanced driver assistance functions may be categorised according to the Society of Automotive Engineers J3016 (06/2018) standard, and may range from simple single-function automation (level 1) to full autonomous vehicle control (level 5). In the present example, the test vehicle 1100 is equipped with advanced driver-assistance systems satisfying at least SAE J3016 (06/2018) Level 1 requirements using the vehicle-mounted sensors, but may optionally be capable of satisfying any one of the levels from Level 1 to Level 5.
Test vehicle 1100 is also provided with an observable marker 1121, the configuration and function of which will be described further below.
Test vehicle 1100 has been exemplified as being provided with cameras 1111 and 1112 and radar sensor 1113, but may alternatively or additionally be provided with ultrasonic sensors, lidar sensors and/or other sensors as required by the driver assistance functions to be provided by controller 1115.
Figure 2 illustrates an observation vehicle 1200 forming part of the test system 1000. Observation vehicle 1200 is an unmanned aerial vehicle (UAV) or drone.
Observation vehicle 1200 is also provided with four rotor units 1221 comprising a rotor blade driven by a variable-speed motor in accordance with signals from a flight controller 1225. Flight controller 1225 controls the rotation speeds of rotor units 1221 to perform required flight manoeuvres. The plane of the rotor blades of rotor units 1221 defines a plane of horizontal flight of the observation vehicle 1200.
Flight controller 1225 may control rotors 1221 to perform flight manoeuvres in accordance with remote control signals from, for example, a ground control station operated by a user, or may perform autonomous flight control in accordance with a pre-programmed plan or in response to predetermined operational requirements. For example, flight controller 1225 may be equipped with a navigation system (not shown) such as an inertial navigation system or a GPS navigation system, and may perform control in order to cause the observation vehicle 1200 to follow a predetermined path. Flight controller 1225 may also or additionally be equipped with an altimeter such as a barometric altimeter or radar altimeter to determine the height of observation vehicle 1200 in flight, as well as magnetic compass and tilt sensors to determine the attitude of observation vehicle 1200 in flight.
In variant configurations, observation vehicle 1200 may be provided in a different flight configuration from the four-rotor configuration disclosed above. For example, observation vehicle 1200 may be provided as another form of aerial vehicle, such as a rotorcraft such as a helicopter, a fixed-wing aircraft such as a glider or powered aircraft, or a lighter-than-air craft such as a dirigible or a balloon.
The flight control functionality of observation vehicle 1200 forms part of the state of the art, and may be implemented using a variety of algorithms and control strategies also known in the art. Flight controller 1225 may be provided with geo-fencing functionality to restrain movement of the observation vehicle 1200 to a particular region in the environment.
Alternatively, flight controller 1225 may be equipped with image processing functionality, and may receive imaging signals from, for example, an on-board camera and may control the motion of observation vehicle 1200 in response, for example to follow a defined object such as the test vehicle 1100 in its environment, or to keep station relative to particular landmarks in the environment. As explained further below, such functionality may alternatively be provided as part of observation controller 1215, or may be remotely commanded by an operator.
Observation vehicle 1200 is also provided with a camera 1211, functioning as an observation sensor, and an observation controller 1215 which receives observation data from camera 1211, for example imaging data. Camera 1211 may, for example, be a visible or infrared camera. Camera 1211 is provided on an underside of observation vehicle 1200 and arranged to look forward and downward in horizontal flight. For example, camera 1211 may have an observation direction (facing direction) dipped at, for example, 45 degrees to the plane of horizontal flight. Camera 1211 may be mounted in a fixed orientation to the body of observation vehicle 1200, or may be mounted on, for example, a gimbal mount to preserve the attitude of the camera relative to an external reference frame as the observation vehicle 1200 performs flight manoeuvres.
The observation vehicle 1200 is in the present embodiment powered by an onboard power source such as a battery, solar panel or fuel cell. In other embodiments, the observation vehicle may be powered, for example, by an electrical tether to a ground station providing a power source and so permitting extended operation times.
As shown in Figure 1, test vehicle 1100 is also provided with marker 1121, which takes the form, in the present embodiment, of an observable pattern located at an upper surface of test vehicle 1100, here on the roof of test vehicle 1100. Marker 1121 may, for example, be provided on a surface of a removable sheet or plate, which is fixable to the roof of test vehicle 1100 by adhesive or magnetic force. Alternatively, rather than being removable, marker 1121 may be formed directly on the surface of the vehicle, for example by painting or etching.
In a test scenario, test vehicle 1100 is placed in a test environment, such as a test track, and observation vehicle 1200 is controlled, either remotely or autonomously, to observe test vehicle 1100 with camera 1211 from an elevated position, for example by causing observation vehicle 1200 to perform flight manoeuvres above and behind test vehicle 1100.
In Figure 3, camera 1211 of observation vehicle 1200 has a defined field of vision F, which may, for example, be in excess of 90°. Camera 1211 also has a defined pixel resolution, for example Full HD resolution (1920x1080) or 4K resolution (3840x2160). The pattern of marker 1121 is sufficiently large, in both overall extent and in the size of the structure of the geometric pattern, and has sufficient contrast, such that it is observable by camera 1211 of observation vehicle 1200 at an appropriate distance, such as a distance of at least 10 metres, at least 20 metres, at least 50 metres or at least 100 metres. Due to the scale and contrast of the pattern of marker 1121, camera 1211 of observation vehicle 1200 may resolve the detail of marker 1121 throughout a test scenario, even when marker 1121 is observed from an oblique angle.
Observation controller 1215 of observation vehicle 1200 is equipped with an image-processing capability in order to extract from a scene viewed by camera 1211 the position and attitude, relative to camera 1211, of marker 1121, and hence that of test vehicle 1100. Algorithms for determining the position of a defined geometrical element in a scene captured by an imaging sensor such as camera 1211 form part of the state of the art, and the skilled person can select an appropriate algorithm as desired. Such algorithms are generally referred to, for example, as object detection and object tracking algorithms. Moreover, by determining the difference in position and attitude over time, observation controller 1215 can determine the motion of the marker 1121, such as the velocity vector, and hence that of test vehicle 1100.
Also shown in Figure 3 is an object vehicle 1300 which is another vehicle in the test environment. Object vehicle 1300 is here shown as crossing the path of test vehicle 1100 for the purposes of testing collision avoidance of the ADAS system of the test vehicle 1100. Object vehicle 1300 is here controlled by a human operator (driver) who may be operating object vehicle 1300 from a driver's position inside vehicle 1300. Alternatively, object vehicle 1300 may be operated by remote control, for example from a human operator at an external driver's station. Object vehicle 1300 is provided with object vehicle marker 1321 which is substantially the same in configuration as test vehicle marker 1121, except that the geometric pattern of object vehicle marker 1321 is distinguishable from that of test vehicle marker 1121 by camera 1211 when observing the scene containing test vehicle 1100 and object vehicle 1300 from an elevated position.
Observation controller 1215 is thus also able, by similar image-processing routines to those described above, to determine the relative position and attitude of marker 1321 relative to camera 1211 in addition to the relative position and attitude of marker 1121 relative to camera 1211. From these relative positions and attitudes, the observation controller 1215 can determine the relative positions and attitudes of markers 1121 and 1321 relative to each other, and hence the relative positions of test vehicle 1100 and object vehicle 1300.
In Figures 1 and 3, the patterns of markers 1121 and 1321 are each shown as a two-dimensional barcode, sometimes referred to as a QR code, which efficiently encodes digital data in a two-dimensional matrix. Different matrices encode different digital data, and hence can be used to distinguish markers 1121 and 1321. Such patterns are recognisable and distinguishable at a wide range of different scales and attitudes and in poor visibility or poor light, and are hence particularly suited to identifying test vehicle 1100 and object vehicle 1300 in a test environment. Moreover, algorithms for identifying and extracting such two-dimensional barcodes in a scene are widely known. Further, two-dimensional barcodes have rotational asymmetry, and can therefore establish both a position and attitude of the respective vehicle. However, other patterns may be used to distinguish test vehicle 1100 and object vehicle 1300, without limitation, such as geometric shapes, colours, alphanumeric codes, or the like.
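By way of illustration only, the following sketch shows how two such 2D barcode markers might be located and distinguished in a single camera frame using a generic QR-code detector. The use of OpenCV, the example file name and the marker payloads "TEST_VEHICLE" and "OBJECT_VEHICLE" are assumptions for the example, not features of the disclosure.

```python
# Illustrative sketch (not the implementation of the disclosure): locate and
# distinguish two 2D barcode markers in one observation frame using OpenCV's
# QR-code detector, then take the vector between their image centroids as a
# coarse relative position. The payload strings are assumed identifiers.
import cv2

def find_markers(frame):
    """Return a dict mapping decoded marker payload to its centroid in pixels."""
    detector = cv2.QRCodeDetector()
    ok, payloads, points, _ = detector.detectAndDecodeMulti(frame)
    markers = {}
    if ok:
        for payload, quad in zip(payloads, points):
            if payload:  # empty string: code detected but not decoded
                markers[payload] = quad.reshape(-1, 2).mean(axis=0)
    return markers

frame = cv2.imread("observation_frame.png")  # one frame from the observation camera (example file)
markers = find_markers(frame)
if "TEST_VEHICLE" in markers and "OBJECT_VEHICLE" in markers:
    # pixel-space offset; a full system would project this into ground coordinates
    offset_px = markers["OBJECT_VEHICLE"] - markers["TEST_VEHICLE"]
    print("object relative to test vehicle (pixels):", offset_px)
```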
Observation controller 1215 of observation vehicle 1200 is also arranged to define a common timebase for signals acquired from the sensors mounted to the test vehicle 1100 and signals obtained from camera 1211. In Figure 3, the establishment of the common timebase is represented by communication link L between observation vehicle 1200 and test vehicle 1100.
A common timebase is a common reference frame for timekeeping, represented, for example, by a common progression of defined divisions of time or periods starting with a reference time, such as a zero time, from which elapsed time may be measured. So, for example, the common timebase may be represented as a number of milliseconds from a defined reference time. Other representations of a common timebase are possible, such as a number of transitions of a stable oscillator from a defined reference time.
By using a common timebase shared among a plurality of systems, events which occur at the same time among these systems can be associated one with another to within the accuracy of the common timebase. Establishing the common timebase among these systems can be performed by appropriate communication with the systems, as described further below.
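As a minimal illustration of the concept, the following sketch represents a common timebase as a count of milliseconds from a shared reference instant and tags records from both controllers with it. The field names, and the assumption that both controllers already agree on the reference instant, are illustrative only.

```python
# Minimal sketch only: a common timebase counted in milliseconds from a shared
# reference instant. It is assumed here that both controllers already agree on
# the reference instant (for example via GPS or NTP); field names are illustrative.
import time

class CommonTimebase:
    def __init__(self, reference_epoch_s: float):
        # agreed zero time of the common timebase, as a Unix epoch value
        self.reference_epoch_s = reference_epoch_s

    def now_ms(self) -> int:
        # elapsed time since the shared reference instant, in milliseconds
        return int((time.time() - self.reference_epoch_s) * 1000)

timebase = CommonTimebase(reference_epoch_s=1_600_000_000.0)

# Records produced by either controller carry a time value from the same
# timebase, so they can later be associated to within its accuracy.
sensor_record = {"t_ms": timebase.now_ms(), "radar_range_m": 42.7}
camera_record = {"t_ms": timebase.now_ms(), "frame_number": 1234}
```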
One implementation for establishment of the common timebase will be described with reference to Figures 4 and 5, which depict, schematically, the functional modules of observation controller 1215 and ADAS controller 1115, respectively. Interfaces, denoted by IF, represent connections between the respective controllers and external devices and systems.
As shown in Figure 4, observation controller 1215 is connected to camera 1211 via an appropriate interface. Observation controller 1215 comprises camera controller 1212 which sets and adjusts operational parameters of camera 1211 and receives imaging data from camera 1211. The imaging data from camera 1211, which may be a video stream comprising a series of frames, is passed to image processor 1211. As described above, image processor 1211 performs object recognition and object tracking to identify features in the image such as markers 1121 and 1321 and to determine their positions relative to camera 1211.
Observation controller 1215 is also provided with command module 1213 which can, via a suitable interface, provide commands to flight controller 1225 described above in order to autonomously control observation vehicle 1200. By use of command module 1213 in conjunction with image processing module 1211, observation controller 1215 can cause observation vehicle 1200 to perform flight operations in such a way that test vehicle 1100 and, optionally, object vehicle 1300 remain under observation by camera 1211 for an extended period of time, such as throughout a test sequence.
For example, command module 1213 can determine that observation vehicle 1200 or test vehicle 1100 is moving towards the edge of the field of view of camera 1211, and can respond by commanding flight controller 1225 to perform flight manoeuvres to reposition observation vehicle 1200 such that observation vehicle 1200 and test vehicle 1100 remain within the field of view of camera 1211. In alternative command strategies, command module 1213 can command flight controller 1225 to perform flight manoeuvres to maintain test vehicle 1100 within a defined region of the field of view of camera 1211.
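A hedged sketch of this station-keeping behaviour is given below: if the tracked marker drifts towards the edge of the image, a repositioning command is issued. The frame size, the margin value and the command_reposition() interface are assumptions made for the example and do not correspond to any interface defined in the disclosure.

```python
# Hedged sketch of the station-keeping idea: if the test vehicle's marker drifts
# towards the edge of the camera frame, ask the flight controller to reposition.
# Frame size, margin and command_reposition() are assumptions for this example.
FRAME_W, FRAME_H = 1920, 1080
MARGIN = 0.15  # keep the marker inside the central region of the frame

def keep_in_view(marker_centroid_px, command_reposition):
    x, y = marker_centroid_px
    dx, dy = 0.0, 0.0
    if x < MARGIN * FRAME_W:
        dx = -1.0          # marker near left edge
    elif x > (1.0 - MARGIN) * FRAME_W:
        dx = 1.0           # marker near right edge
    if y < MARGIN * FRAME_H:
        dy = -1.0          # marker near top edge
    elif y > (1.0 - MARGIN) * FRAME_H:
        dy = 1.0           # marker near bottom edge
    if dx or dy:
        command_reposition(dx, dy)  # hypothetical flight-controller command

# Example: marker drifting towards the right-hand edge of the frame
keep_in_view((1800.0, 540.0), lambda dx, dy: print("reposition", dx, dy))
```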
To establish the common timebase, observation controller 1215 is provided with a timebase generator 1218 which defines a timebase for association with each observation performed by camera 1211. For example, timebase generator 1218 may be an independent timebase generator, including, for example, a temperature-stabilised quartz oscillator, which defines an elapsed time from an initial zero time. Each observation performed by camera 1211 is therefore associated with a unique time within the timebase established by timebase generator 1218.
Observation controller 1215 is equipped with a transmitter 1216 which, via, for example, an antenna 1217, broadcasts over a communication link L a signal representing the timebase generated by the timebase generator 1218. Thereby, other systems can use the timebase established by timebase generator 1218 as a common timebase, as will be explained below with reference to ADAS controller 1115.
In the present configuration, observation controller 1215 also includes datalogger 1219 which records the relative positions and attitudes of test vehicle marker 1121 and object vehicle marker 1321 derived from particular observations by camera 1211 in association with time values derived from the common timebase corresponding to the time at which the observation was performed. For example, an association can be made between frame numbers of the video stream obtained from camera 1211 and time values derived from the common timebase. In other configurations, the video stream may additionally or alternatively be recorded in datalogger 1219. Datalogger 1219 may, for example, be provided by a memory controller interfaced to a flash memory.
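The following sketch illustrates one possible form of such a datalogger, recording each observation-derived relative pose together with its frame number and a time value from the common timebase. The file format and field names are assumptions for the example.

```python
# A minimal sketch of the datalogging association described above: each
# observation-derived relative pose is written together with its frame number
# and a time value from the common timebase. File format and field names are
# assumptions for this example.
import csv

class PoseLogger:
    def __init__(self, path):
        self._file = open(path, "w", newline="")
        self._writer = csv.writer(self._file)
        self._writer.writerow(["t_ms", "frame", "rel_x_m", "rel_y_m", "rel_yaw_deg"])

    def log(self, t_ms, frame, rel_x_m, rel_y_m, rel_yaw_deg):
        self._writer.writerow([t_ms, frame, rel_x_m, rel_y_m, rel_yaw_deg])

    def close(self):
        self._file.close()

logger = PoseLogger("observation_log.csv")
logger.log(t_ms=120_050, frame=3601, rel_x_m=14.2, rel_y_m=-3.1, rel_yaw_deg=85.0)
logger.close()
```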
Figure 5 shows a block diagram of ADAS controller 1115. ADAS controller 1115 includes a sensor controller 1112 which controls and receives signals from vehicle-mounted sensors, here exemplified by cameras 1111 and 1112 and radar 1113 via appropriate interfaces and optionally provides initial processing such as filtering or scaling.
ADAS controller 1115 also includes a decision engine 1111 which receives signals derived from the sensors from sensor controller 1112 and performs a decision algorithm to evaluate the environment of the vehicle and to determine, for example, whether or not and to what extent to perform a driving function, such as operating the brakes of the vehicle. As part of the decision algorithm, decision engine 1111 may construct a virtual model of the environment around the vehicle. Decision engine 1111 may identify, based on the signals derived from the sensors, one or more objects in the environment and track the objects through known methods including statistical methods.
ADAS controller 1115 also has command module 1113 which, based on the determination of decision engine 1111, may provide command signals to vehicle systems (V) via an interface in order to actuate those vehicle systems. For example, command module 1113 may, on determination by decision engine 1111 that vehicle braking should be performed, cause the vehicle brakes to be applied.
As shown in Figure 5, ADAS controller 1115 of test vehicle 1100 is provided with receiver 1116 which, via antenna 1117, receives the timebase signal broadcast from antenna 1217 of observation controller 1215 via communications link L. ADAS controller 1115 also includes its own local timebase generator 1118, which receives signals from receiver 1116 representing the common timebase established by observation vehicle 1200 and which performs a synchronisation operation to synchronise the local timebase to the common timebase. In this configuration, timebase generator 1118, by replicating the common timebase, can provide timekeeping even when communications link L is disrupted or noisy. However, in other embodiments, receiver 1116 can provide the timebase signal received via antenna 1117 directly for use as a local timebase for ADAS controller 1115.
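As an illustration of one way such a one-way synchronisation could be approximated, the sketch below estimates the offset of a local clock from a stream of broadcast timebase messages by taking the smallest observed difference between receive time and sent time. This simple estimate still contains the minimum link latency, so it is offered only as an assumption-laden example and not as the synchronisation method of the disclosure.

```python
# Illustrative sketch only: estimating the offset of a local clock from one-way
# broadcast timebase messages. Each sample pairs the sender's timestamp with
# the local receive timestamp; the smallest difference is taken as the offset
# estimate. Note that this estimate still includes the minimum link latency,
# so it is an approximation rather than the method actually used.
def estimate_offset_ms(samples):
    """samples: list of (sent_t_ms, received_local_t_ms) pairs."""
    return min(local - sent for sent, local in samples)

samples = [(1000, 1007), (1100, 1105), (1200, 1209)]
offset = estimate_offset_ms(samples)   # 5 ms for these example values
local_event_t_ms = 1304
common_event_t_ms = local_event_t_ms - offset
print(common_event_t_ms)               # 1299 in the common timebase
```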
ADAS controller 1115 is also provided with datalogger 1119 which records the sensor data, values associated with or derived from the sensor data, decision data and/or command data of the ADAS controller in association with time values derived from the common timebase corresponding to the time at which the data or values were received or generated. The time values may be obtained from the local timebase controller or directly from the receiver 1116 as explained above.
Other ways of establishing a common timebase as between observation vehicle 1200 and test vehicle 1100 are possible within the scope of the present disclosure. For example, as shown in Figure 6, rather than the common timebase being derived from a timebase generator 1218 provided as part of observation controller 1215, a common external timebase generator may be used.
In such a case, for example, a satellite-based time reference S may be used, with ADAS controller 1115 and observation controller 1215 each being provided with a suitable antenna and receiver to receive a signal transmitted over a broadcast communications link L from the satellite representing the common external timebase.
With this approach, it is possible, for example, to synchronise respective local timebase generators 1118 and 1218 to the common external timebase, or to directly use the external timebase signal as a local timebase. A suitable external timebase in this case may, for example, be a GPS time reference, obtained from the GPS absolute time signal and/or the GPS pulse-per-second (PPS) signal.
In a further embodiment, as shown in Figure 7, an alternative external timebase generator may be available as a terrestrial broadcast time signal transmitted from a terrestrial transmitter R, such as the MSF/Rugby time signal transmitted by the UK National Physical Laboratory (NPL), or similar services provided by the WWV shortwave radio station broadcast from Fort Collins, Colorado, or the JJY low-frequency time signal station operated by the National Institute of Information and Communications Technology (NICT) in Japan.
In a yet further embodiment, rather than transmitting the timebase signal from observation controller 1215 to ADAS controller 1115, a time signal may be transmitted from ADAS controller 1115 to observation controller 1215, in like manner.
Further alternatively, a test control system may be located in the test environment, for example in the form of a test control workstation 1400 depicted in Figure 9 and discussed below in connection with that Figure, which may generate and broadcast locally a signal representing a common timebase to both ADAS controller 1115 and observation controller 1215.
Suitable protocols for transmission of a signal representing a common timebase include an NTP (Network Time Protocol) signal or an IEEE 1588 signal. Suitable technologies for establishing a communications link between observation controller 1215 and ADAS controller 1115 for establishing the common timebase include low-latency 433 MHz or 2.4 GHz radio technologies. Such configurations may realise a common timebase as between test vehicle 1100 and observation vehicle 1200 to within ten (10) milliseconds or to within one (1) millisecond.

The above embodiment has been described with reference to the use of spatial patterns, such as geometric patterns, for distinguishing the test vehicle 1100 from the object vehicle 1300. However, in some embodiments, rather than using spatial patterns, markers may be used which provide temporally distinct patterns. Such a configuration is shown in Figure 8.
In this configuration, test vehicle 1100 is provided with, for example, optical beacon 1121 which emits a predetermined repeating temporal sequence of optical pulses, such as flashes of light. In Figure 8, optical beacon 1121 emits a series of alternating short (.) and long (-) pulses, representing a repeated Morse code symbol 1 (one). In contrast, object vehicle 1300 is provided with optical beacon 1321 which repeats a distinct, predetermined sequence of optical pulses, here a series of alternating double short (.) and long (-) pulses, representing a repeated Morse code symbol 2 (two). Other distinct, identifying sequences are possible.
Each repeating sequence may be detected over time in the scene observed by camera 1211 and used both to locate and distinguish test vehicle 1100 and object vehicle 1300. Such an optical beacon can be provided by an optical emitter, such as a lamp or LED, operating in the visible or infrared wavelength bands, pulsed by a pulse controller, as is known in the art. Such a beacon may be permanently fixed to the respective vehicle, or may be removably attached through magnetic, adhesive or suction fixation.
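The sketch below indicates, purely by way of example, how such a repeating pulse sequence could be recovered from per-frame brightness samples at the beacon location. The brightness threshold, frame rate and pulse-length boundary are assumed values, not taken from the disclosure.

```python
# Example sketch only: recover a repeating pulse sequence from per-frame
# brightness samples at the beacon location. The threshold, frame rate and
# long-pulse boundary are assumed values, not taken from the disclosure.
def decode_pulses(brightness, on_threshold=128, fps=30, long_pulse_s=0.3):
    runs, count = [], 0
    for value in brightness:
        if value > on_threshold:
            count += 1              # beacon is "on" in this frame
        elif count:
            runs.append(count)      # end of an "on" run
            count = 0
    if count:
        runs.append(count)
    long_frames = long_pulse_s * fps
    return "".join("-" if run >= long_frames else "." for run in runs)

# A trace containing one short pulse (3 frames) followed by one long pulse (12 frames)
trace = [0] * 5 + [255] * 3 + [0] * 5 + [255] * 12 + [0] * 5
print(decode_pulses(trace))  # ".-" : one short pulse, then one long pulse
```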
In the above disclosure, it will be appreciated that using an optical beacon such as a pulsed light source having a single optical emitter may allow location information of object vehicle 1300 to be acquired, but not attitude information. This may be sufficient for simple scenarios and objects. However, attitude information can be acquired in an alternative configuration in which a plurality of optical emitters are provided in a defined spatial relationship one to another.
For example, a plurality of optical beacons may be provided at the vertices of an isosceles triangle, allowing the attitude of the vehicle to be identified. The arrangement of the emitters may be used in addition to or as an alternative to the pulsing of the optical beacons to identify the object.
In a further configuration, the plurality of optical emitters may have distinct wavelength bands of emission, and so, for example, may be configured to emit different colours of light. By providing, for example, a triangular configuration of emitters, in which at least one vertex of the triangle emits a different wavelength of light to the others, an orientation of the triangle and hence an attitude of the object vehicle can be identified. The colours of the emitters may be used in addition to or as an alternative to the pulsing of the optical beacons to identify the vehicle.
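A small geometric sketch of this idea follows: given the positions of the three emitters, with the distinctly coloured apex identified, the heading can be taken as the direction from the midpoint of the base towards the apex. The coordinates and the heading convention are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative geometry sketch (an assumption for this example): with three
# emitters at the vertices of an isosceles triangle and the apex emitter
# identified by its distinct colour, take the heading as the direction from
# the midpoint of the base towards the apex.
import math

def heading_deg(apex, base_a, base_b):
    """apex, base_a, base_b: (x, y) positions in a common ground frame."""
    mid_x = (base_a[0] + base_b[0]) / 2.0
    mid_y = (base_a[1] + base_b[1]) / 2.0
    return math.degrees(math.atan2(apex[1] - mid_y, apex[0] - mid_x))

# Apex ahead of the base midpoint along +x: heading is 0 degrees
print(heading_deg(apex=(10.0, 5.0), base_a=(0.0, 0.0), base_b=(0.0, 10.0)))
```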
The above embodiment has been described with reference to the use of spatial patterns, such as geometric patterns, or temporal patterns, such as pulse patterns, for distinguishing the test vehicle 1100 from the object vehicle 1300. However, in some embodiments, the shapes and, optionally, colours of the vehicles themselves can be used to distinguish the test vehicle 1100 from the object vehicle 1300, and to establish their relative position and attitude.
In such an embodiment, image processing module 1211 may be configured to perform object detection and tracking based on known parameters of the vehicles, such as side profiles, plan outlines and colours, in a manner known in the art. Such a configuration can avoid the need to provide any markers to vehicles 1100 and 1300. Such detection and tracking may also, for example, be provided by incorporating a pre-trained object classifier and tracker which is trained to classify and track a set of object vehicles which are expected to be in the environment.
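As a purely illustrative example of detection based on vehicle colour, the sketch below locates a vehicle of approximately known colour in an overhead frame by colour thresholding and contour extraction. The HSV bounds and minimum area are assumed values, and this is not the classifier or tracker of the disclosure.

```python
# Illustrative sketch only (not the classifier of the disclosure): locating a
# vehicle of approximately known colour in an overhead frame by colour
# thresholding and contour extraction with OpenCV. The HSV bounds and minimum
# area are assumed values for the example.
import cv2
import numpy as np

def find_vehicle_by_colour(frame_bgr, hsv_lo=(100, 80, 80), hsv_hi=(130, 255, 255), min_area=500):
    """Return the centroid (x, y) of the largest region within the colour bounds, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```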
Moreover, the above disclosure has been explained in an illustrative manner with reference to the observation, identification and determination of the spatial state of an object vehicle. However, it will be apparent that the object need not be a vehicle, but may, for example, be another movable object such as a pedestrian, animal or gate, or may be a static object such as an artificial barrier or other obstacle, or a natural feature such as a rock, cliff, boulder, tree or bush. The above disclosure is applicable without limitation to these or any other class of object.
The test system described herein can be used in a variety of ways to verify the performance of ADAS systems.
For example, as explained above, dataloggers 1119 and 1219 may individually be provided with local storage for recording test data in association with time values determined from the common timebase. The data recorded on the local storage may then be analysed at a later time. In another configuration, one of dataloggers 1119 and 1219 may, in addition to or instead of recording the data locally, transmit the data in association with the time values to the other datalogger, or each datalogger may transmit the data via respective data links L1 and L2 to a separate test controller, such as the test workstation 1400 depicted in Figure 9.
For example, in one configuration test vehicle 1100 may be provided with an analysis unit (not shown), which compares the data received from observation vehicle 1200 with data obtained from the vehicle-mounted sensors provided to test vehicle 1100. Alternatively, such an analysis module may be provided to observation vehicle 1200. However, it will typically be more convenient for each of test vehicle 1100 and observation vehicle 1200 to provide data to a test control system or test analysis system such as workstation 1400, for example via data links L1 and L2 as shown in Figure 9. As shown, data link L1 connects test vehicle 1100 to test workstation 1400 while data link L2 connects observation vehicle 1200 to test workstation 1400.
In the configuration shown in Figure 9, test vehicle 1100 transmits sensor data and other data relating to the operation of ADAS controller 1115 in real time via data link L1 to test workstation 1400, while observation vehicle 1200 transmits either imaging data from camera 1211 or data relating to the position of test vehicle 1100 and object vehicle 1300 in the environment as determined by observation vehicle 1200 in real time to test workstation 1400. In each case, the transmitted data is associated with time values derived from the common timebase.
At test workstation 1400, an analysis can be performed, either in real time or historically, to allow proper operation of ADAS controller 1115 to be confirmed. For example, in one operation, a comparison may be made between the determined position of object vehicle 1300 relative to test vehicle 1100 obtained from ADAS controller 1115, and the determined position of object vehicle 1300 relative to test vehicle 1100 as determined by observation vehicle 1200 at a defined time within the common timebase.
An error between these two positions may be determined, for example by subtraction of the determined positions, which may be used for improving the performance of the ADAS controller 1115. However, as discussed above, the functions of test workstation 1400 may also, at least partially, be integrated into test vehicle 1100 or observation vehicle 1200.
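The following sketch illustrates such a subtraction in its simplest form, pairing ADAS-determined and observation-determined relative positions at matching time values in the common timebase. The dictionary-based log format and field names are assumptions made for the example.

```python
# Minimal sketch of the comparison described above (log format and field names
# are assumptions for this example): for each time value of the common timebase
# present in both logs, subtract the observation-determined relative position
# from the ADAS-determined one to obtain an error vector.
def position_errors(adas_log, observation_log):
    """Both logs map t_ms -> (rel_x_m, rel_y_m); returns t_ms -> error vector."""
    errors = {}
    for t_ms, (ax, ay) in adas_log.items():
        if t_ms in observation_log:
            ox, oy = observation_log[t_ms]
            errors[t_ms] = (ax - ox, ay - oy)
    return errors

adas_log = {120_050: (14.5, -2.9), 120_100: (13.8, -2.6)}
observation_log = {120_050: (14.2, -3.1), 120_100: (13.9, -2.7)}
print(position_errors(adas_log, observation_log))
# errors of roughly (0.3, 0.2) m and (-0.1, 0.1) m at the two time values
```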
In another mode of operation, rather than transmitting in real-time data from test vehicle 1100 and observation vehicle 1200 to test workstation 1400, ADAS controller 1115 and observation controller 1215 can record relevant data using the respective dataloggers, which can later be downloaded via data carriers including, for example, flash memory, to test workstation 1400, which may be geographically and temporally separated from the test environment.
Figure 10 is a flowchart representing one example of a testing method which may be implemented with the disclosed system.
In a first step S1, a common timebase is established between ADAS controller 1115 of test vehicle 1100 and observation controller 1215 of observation vehicle 1200.
In a second step S2, observation data is recorded by observation controller 1215 using camera 1211, which observes at least part of a test environment containing the test vehicle 1100 and at least one object such as object vehicle 1300.
In a third step S3, signals are obtained from one or more sensors 1111, 1112, 1113 associated with ADAS controller 1115 and associated with the common timebase.
In a fourth step S4, signals are obtained from the camera 1211 by the observation controller 1215 and associated with the common timebase.
In a fifth step S5, a spatial state of the at least one object vehicle 1300 relative to the test vehicle 1100 is determined using signals from the sensors associated with ADAS controller 1115 and associated with the common timebase.
The spatial state here may include, for example, at least one of the position, attitude and movement vector of the object relative to the test vehicle. It is advantageous if the spatial state includes at least the position, further advantageous if the spatial state includes the attitude, and most advantageous if the spatial state includes all of the position, attitude and movement vector.
In a sixth step S6, a spatial state of the at least one object relative to the test vehicle is determined using signals recorded from camera 1211 by the observation controller 1215 and associated with the common timebase.
In a seventh step S7, an error is determined between the spatial state determined using observation vehicle 1200 and the spatial state determined using the sensors associated with ADAS controller 1115, the states being associated with corresponding times in the common timebase.
In the above disclosure, ADAS controller 1115 and observation controller 1215, for example, have been represented as data processing devices having distinct modules, for ease of explanation. However, the operations of these devices may be performed, for example, by an industrial programmable logic controller, or by commodity data processing hardware such as a portable computing device equipped with appropriate interfaces and executing modular or monolithic program code.
The above disclosure has been described in connection with a single test vehicle 1100 and a single object such as object vehicle 1300. However, advantages of the present disclosure become even more apparent when multiple object vehicles or other objects are expected or required to be present in a test scenario. In particular, in order to track multiple object vehicles or other objects, the image processing capability of the observation vehicle 1200 need only moderately be increased, while the object vehicles or other objects need only be provided with relatively inexpensive markers or beacons, or in appropriate embodiments need not be provided with any special distinguishing feature.
Moreover, as the number of objects in the test environment increases, the possibility of occlusion of the line of sight between the observation vehicle 1200 and the object vehicles remains minimal due to the elevated position of the observation vehicle in the environment.
Hence, using this approach, test scenarios with two, three, five, ten or more interacting vehicles or other objects may reliably be tested, without high cost or a decrease in reliability.
While this disclosure has been described in terms of the preferred embodiments, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow.

Claims (20)

  1. A testing method for vehicle-mounted sensors, comprising: providing, in an environment, a test vehicle, the test vehicle having one or more vehicle-mounted sensors for detecting a spatial state of at least one other object relative to the test vehicle; providing, in the environment, one or more objects; providing an observation vehicle in the environment, the observation vehicle having an observation sensor for observing at least part of the environment; establishing a common timebase for signals from the one or more test sensors and signals from the observation sensor and associating the signals with the timebase; and positioning the observation vehicle such that the observation sensor observes at least a part of the environment containing the test vehicle and at least one object.
  2. The method of claim 1, wherein the association comprises: recording signals from the one or more test sensors in association with the common timebase; and recording signals from the observation sensor in association with the common timebase.
  3. The method of claim 1 or 2, further comprising: determining, with signals derived from the vehicle-mounted sensors, the spatial state of the at least one object relative to the test vehicle in association with the common timebase; and determining, with signals derived from the observation sensor substantially simultaneously observing the test vehicle and the object, the spatial state of the at least one object relative to the test vehicle in association with the common timebase.
  4. The method of claim 3, further comprising comparing the spatial state of the at least one object relative to the test vehicle determined with the vehicle-mounted sensors with the spatial state of the at least one object relative to the test vehicle determined with the observation sensor using the common timebase.
  5. The method of claim 4, further comprising determining an error between the spatial state determined with the vehicle-mounted sensors and the spatial state of the at least one object determined with the observation sensor.
  6. The method of any preceding claim, wherein the common timebase is established by: receiving, at the test vehicle and the observation vehicle, a common timebase signal defining the common timebase, optionally a GPS timebase signal or a radio timebase signal; transmitting, from the test vehicle to the observation vehicle, a timebase signal defining the common timebase, optionally an IEEE 1588 timebase signal or an NTP signal; or transmitting, from the observation vehicle to the test vehicle, a timebase signal defining the common timebase, optionally an IEEE 1588 timebase signal, an NTP signal or a frame number of the observation sensor.
  7. The method of any preceding claim, wherein the common timebase is established to better than 10 milliseconds, optionally better than 1 millisecond.
  8. The method of any preceding claim, wherein the observation sensor is an imaging sensor, optionally a visible-spectrum or infrared camera.
  9. The method of any preceding claim, wherein the test vehicle has an observable test vehicle marker for locating the test vehicle in the environment and each object has an observable object marker for locating the object in the environment.
  10. The method of claim 9 as dependent, directly or indirectly, on claim 3, wherein the determining, with signals derived from the observation sensor, of the spatial state of the at least one object relative to the test vehicle is performed by detecting the object marker and the test vehicle marker in the observed part of the environment containing the test vehicle and the at least one object.
  11. The method of claim 9 or 10, wherein each marker comprises: a pattern unique in the environment to the respective vehicle, optionally a 2D barcode pattern; or a beacon providing a signal unique in the environment to the respective vehicle, optionally an optical pulse pattern, optionally having a plurality of optical emitters.
  12. The method of any one of claims 9 to 11, wherein each marker is removably attached to the respective vehicle or object, optionally to an upper surface of the respective vehicle, optionally by magnetic, adhesive or suction attachment.
  13. The method of any one of claims 9 to 12, wherein the observation sensor has an imaging resolution sufficient to resolve each marker at a range of at least 10m, optionally at least 20m, further optionally at least 50m, further optionally at least 100m.
  14. The method of any preceding claim, wherein the observation vehicle has a controller for autonomously keeping station relative to the test vehicle.
  15. The method of any preceding claim, wherein the observation vehicle is remotely operated to keep station relative to the test vehicle.
  16. The method of any preceding claim, wherein the test vehicle is equipped with advanced driver-assistance systems satisfying at least SAE J3016 (06/2018) Level 1 requirements using the vehicle-mounted sensors.
  17. The method of any preceding claim, wherein the vehicle-mounted sensors comprise sensors selected from radar, lidar, ultrasonic and imaging sensors.
  18. A testing system for vehicle-mounted sensors, the system comprising: a test vehicle having one or more vehicle-mounted sensors for detecting a spatial state of at least one other vehicle relative to the test vehicle; one or more objects; and an observation vehicle, the observation vehicle having an observation sensor for observing at least part of an environment in which the test vehicle and at least one object are operable, wherein the system has a timebase generator or receiver providing a common timebase for signals from the one or more test sensors and signals from the observation sensor; and the observation vehicle has a controller operable to cause the observation sensor to observe at least a part of the environment containing the test vehicle and at least one object.
  19. An observation vehicle for testing vehicle-mounted sensors, the observation vehicle comprising: an observation sensor for observing at least part of an environment in which a test vehicle and at least one object are provided; and a timebase transmitter or receiver for associating observations of the observation sensor with a common timebase.
  20. A test vehicle for testing vehicle-mounted sensors, the test vehicle having vehicle-mounted sensors mounted thereon for detecting a spatial state of at least one other object relative to the test vehicle; and a timebase transmitter or receiver for associating observations of the vehicle-mounted sensors with a common timebase.
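
For claims 2 to 5, the recording and comparison steps amount to logging time-stamped detections on both vehicles and aligning them on the common timebase before computing an error. Below is a minimal Python sketch under simplifying assumptions: each "detection" is a single 2D point already expressed relative to the test vehicle, and the names Detection, record, nearest and position_error are hypothetical, not taken from the application.

```python
# Illustrative only: record detections against a common timebase (claim 2) and
# compare the test-vehicle view with the observation-vehicle view (claims 3-5).
import bisect
import math
from dataclasses import dataclass

@dataclass
class Detection:
    t: float  # common-timebase timestamp [s]
    x: float  # object position relative to the test vehicle, longitudinal [m]
    y: float  # object position relative to the test vehicle, lateral [m]

def record(log: list[Detection], t: float, x: float, y: float) -> None:
    """Append a detection stamped with the common timebase."""
    log.append(Detection(t, x, y))

def nearest(log: list[Detection], t: float) -> Detection:
    """Return the logged detection whose timestamp is closest to t."""
    times = [d.t for d in log]
    i = bisect.bisect_left(times, t)
    candidates = log[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda d: abs(d.t - t))

def position_error(test_log: list[Detection], obs_log: list[Detection]) -> list[float]:
    """Euclidean error of each test-vehicle detection against the
    time-aligned observation-vehicle reference."""
    return [math.hypot(d.x - nearest(obs_log, d.t).x,
                       d.y - nearest(obs_log, d.t).y)
            for d in test_log]

# Example: the test sensors place the object about 0.3 m away from where the
# observation sensor (treated as reference) places it.
test_log, obs_log = [], []
record(test_log, 10.00, 20.3, 1.1)
record(obs_log, 10.01, 20.0, 1.0)
print(position_error(test_log, obs_log))  # ~[0.316]
```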
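For claim 6, when no shared GPS time source is available, a timebase signal exchanged between the vehicles can establish the common timebase. The sketch below shows the classic four-timestamp offset estimate that underlies NTP-style synchronisation; the transport, the clock sources and the name estimate_offset are assumptions for illustration, not the application's protocol.

```python
# Illustrative only: NTP-style clock-offset estimate between the test-vehicle
# clock and the observation-vehicle clock from one request/reply exchange.
def estimate_offset(t1: float, t2: float, t3: float, t4: float) -> tuple[float, float]:
    """t1: request sent (test-vehicle clock)
    t2: request received (observation-vehicle clock)
    t3: reply sent (observation-vehicle clock)
    t4: reply received (test-vehicle clock)
    Returns (offset of the observation clock relative to the test-vehicle
    clock, round-trip delay), both in seconds."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: the observation vehicle's clock runs 3 ms ahead; 2 ms round trip.
offset, delay = estimate_offset(t1=100.0000, t2=100.0040, t3=100.0045, t4=100.0025)
print(f"offset = {offset*1e3:.1f} ms, delay = {delay*1e3:.1f} ms")
```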
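For claims 9 to 11, once the test vehicle marker and an object marker have been detected by the observation sensor, the object's spatial state relative to the test vehicle follows from plane geometry. The sketch assumes marker detection and ground-plane pose recovery have been done elsewhere (for example by a fiducial-marker library), so that each marker already has a metric position and heading; MarkerPose and relative_state are hypothetical names.

```python
# Illustrative only: convert two observed marker poses into the object position
# as the test vehicle's own sensors would report it (x forward, y left).
import math
from typing import NamedTuple

class MarkerPose(NamedTuple):
    x: float        # ground-plane position in the observation frame [m]
    y: float
    heading: float  # marker orientation in the observation frame [rad]

def relative_state(test: MarkerPose, obj: MarkerPose) -> tuple[float, float]:
    """Object position expressed in the test vehicle's body frame."""
    dx, dy = obj.x - test.x, obj.y - test.y
    cos_h, sin_h = math.cos(test.heading), math.sin(test.heading)
    # Rotate the observation-frame offset into the test vehicle's frame.
    return cos_h * dx + sin_h * dy, -sin_h * dx + cos_h * dy

# Example: test vehicle heading 90 deg; object 5 m ahead and 2 m to its right.
test = MarkerPose(x=0.0, y=0.0, heading=math.pi / 2)
obj = MarkerPose(x=2.0, y=5.0, heading=math.pi / 2)
print(relative_state(test, obj))  # ~(5.0, -2.0)
```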
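For claim 13, whether the observation sensor can resolve a marker at a given range is a simple function of marker size, range, image width and field of view. The figures in the sketch (a 0.5 m marker, a 3840-pixel-wide image, a 60 degree horizontal field of view) are assumed for illustration only.

```python
# Illustrative only: approximate pixel footprint of a marker seen face-on.
import math

def pixels_across_marker(marker_size_m: float, range_m: float,
                         image_width_px: int, hfov_deg: float) -> float:
    """Pixels spanned by the marker at the given range, assuming a simple
    pinhole camera and the marker centred in the field of view."""
    angular_size = 2.0 * math.atan(marker_size_m / (2.0 * range_m))  # [rad]
    px_per_rad = image_width_px / math.radians(hfov_deg)
    return angular_size * px_per_rad

# At the ranges named in claim 13 the assumed camera still resolves the marker.
for r in (10, 20, 50, 100):
    print(f"{r:>3} m: {pixels_across_marker(0.5, r, 3840, 60.0):5.1f} px")
```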
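For claim 14, a controller that keeps station relative to the test vehicle can, in its simplest form, be a proportional law driving the observation vehicle towards a fixed offset from the test vehicle's position. The gain, the offset of 10 m behind and 15 m above, and the velocity clamp in the sketch are assumed values; station_keeping_velocity is a hypothetical name, and a real controller would add integral/derivative terms and feed-forward from the test vehicle's velocity.

```python
# Illustrative only: proportional station-keeping command for the observation
# vehicle, expressed as a 3D velocity in the same frame as the positions.
def station_keeping_velocity(obs_pos: tuple[float, float, float],
                             test_pos: tuple[float, float, float],
                             offset: tuple[float, float, float] = (-10.0, 0.0, 15.0),
                             gain: float = 0.8,
                             v_max: float = 5.0) -> tuple[float, float, float]:
    """Velocity command [m/s] steering the observation vehicle towards its
    station: the test-vehicle position plus a fixed offset."""
    target = tuple(t + o for t, o in zip(test_pos, offset))
    cmd = tuple(gain * (tg - ob) for tg, ob in zip(target, obs_pos))
    # Clamp each axis so the command stays within the platform's limits.
    return tuple(max(-v_max, min(v_max, c)) for c in cmd)

# Example: the observation vehicle is 4 m short of its station and 1 m too low.
print(station_keeping_velocity(obs_pos=(86.0, 0.0, 14.0), test_pos=(100.0, 0.0, 0.0)))
# -> (3.2, 0.0, 0.8)
```
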
GB2013033.2A 2020-08-20 2020-08-20 Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle Pending GB2598142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2013033.2A GB2598142A (en) 2020-08-20 2020-08-20 Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2013033.2A GB2598142A (en) 2020-08-20 2020-08-20 Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle

Publications (2)

Publication Number Publication Date
GB202013033D0 GB202013033D0 (en) 2020-10-07
GB2598142A true GB2598142A (en) 2022-02-23

Family

ID=72660768

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2013033.2A Pending GB2598142A (en) 2020-08-20 2020-08-20 Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle

Country Status (1)

Country Link
GB (1) GB2598142A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10254388A1 (en) * 2002-11-18 2004-05-27 Volkswagen Ag Motor vehicle assistance system testing method, in which the system evaluation unit is tested by supplying it with static and or dynamic test information via the system sensors and a test structure
DE102008063988A1 (en) * 2008-12-19 2010-07-29 Audi Ag Testing ground for vehicle driver assistance optical sensors has a test body, positioned accurately by a robot in relation to the test vehicle
US20130238166A1 (en) * 2012-03-07 2013-09-12 Audi Ag Method for testing the operability of a driver assistance system installed in a test vehicle
US8838322B1 (en) * 2012-08-14 2014-09-16 Google Inc. System to automatically measure perception sensor latency in an autonomous vehicle
CN104571111A (en) * 2015-01-09 2015-04-29 中国科学院合肥物质科学研究院 Method for testing outdoor environment sensing capability of mobile robot
US20170067764A1 (en) * 2015-08-28 2017-03-09 Robert Bosch Gmbh Method and device for detecting at least one sensor malfunction of at least one first sensor of at least one first vehicle
CN111191697A (en) * 2019-12-21 2020-05-22 武汉光庭信息技术股份有限公司 ADAS road test verification optimization method and device based on sensor fusion

Also Published As

Publication number Publication date
GB202013033D0 (en) 2020-10-07

Similar Documents

Publication Publication Date Title
US11480435B2 (en) Map generation systems and methods
US11218689B2 (en) Methods and systems for selective sensor fusion
US10943355B2 (en) Systems and methods for detecting an object velocity
US20200344464A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Defects
EP3428766B1 (en) Multi-sensor environmental mapping
CN108139202B (en) Image processing apparatus, image processing method, and program
CN112558608B (en) Vehicle-mounted machine cooperative control and path optimization method based on unmanned aerial vehicle assistance
US11450026B2 (en) Information processing apparatus, information processing method, and mobile object
US20190068829A1 (en) Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions
CN111448476A (en) Techniques for sharing drawing data between an unmanned aerial vehicle and a ground vehicle
US11544940B2 (en) Hybrid lane estimation using both deep learning and computer vision
CN111093907B (en) Robust navigation of robotic vehicles
CN105157708A (en) Unmanned aerial vehicle autonomous navigation system and method based on image processing and radar
CN110333735B (en) System and method for realizing unmanned aerial vehicle water and land secondary positioning
Perez et al. Autonomous collision avoidance system for a multicopter using stereoscopic vision
WO2019065431A1 (en) Information processing apparatus, movable apparatus, information processing method, movable-apparatus control method, and programs
CN112461249A (en) Sensor localization from external source data
CN113168692A (en) Information processing device, information processing method, program, moving object control device, and moving object
Urieva et al. Collision detection and avoidance using optical flow for multicopter UAVs
CN104539906A (en) Image/laser ranging/ABS-B monitoring integrated system
US20230260254A1 (en) Information processing device, information processing method, and program
CN110597275A (en) Method and system for generating map by using unmanned aerial vehicle
US11366237B2 (en) Mobile object, positioning system, positioning program, and positioning method
GB2598142A (en) Testing method for vehicle-mounted sensors, testing system, observation vehicle and test vehicle
SE1950992A1 (en) Method and control arrangement for autonomy enabling infrastructure features

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20240321 AND 20240327