WO2019182499A1 - Method, control arrangement and reference object for calibration of sensors in an autonomous vehicle - Google Patents


Info

Publication number
WO2019182499A1
Authority
WO
WIPO (PCT)
Prior art keywords
reference object
sensor
vehicle
appearance
control arrangement
Prior art date
Application number
PCT/SE2019/050244
Other languages
French (fr)
Inventor
Hjalmar LUNDIN
Christian Larsson
Gustav RISTING
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to DE112019000846.3T priority Critical patent/DE112019000846T5/en
Publication of WO2019182499A1 publication Critical patent/WO2019182499A1/en

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S1/00 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
    • G01S1/02 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using radio waves
    • G01S1/022 Means for monitoring or calibrating
    • G01S1/026 Means for monitoring or calibrating of associated receivers
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623 Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04 Systems determining the presence of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/7803 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G01S7/4004 Means for monitoring or calibrating of parts of a radar system
    • G01S7/4026 Antenna boresight
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733 Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
    • G08G1/096758 Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where no selection takes place on the transmitted or the received information
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096783 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G1/163 Decentralised systems, e.g. inter-vehicle communication involving continuous checking
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/02 Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9316 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Definitions

  • This document discloses a method, a control arrangement and a reference object. More particularly, a method, a control arrangement and a reference object are disclosed for calibration of a vehicle sensor based on measurements of electromagnetic radiation.
  • A vehicle, in particular an autonomous vehicle, comprises a large number of sensors of different kinds.
  • Vehicle sensors are often calibrated in the production line using a reference point or similar to ensure that the sensors are aligned as expected. When the vehicle is operational, sensor errors are often discovered by the driver.
  • Another common alternative is to compare the information from different sensors of the vehicle, and e.g. disable Advanced Driver Assistance Systems (ADAS) functions if some sensor information seems to be corrupt, i.e. different sensors present deviating results.
  • A human interaction by an ambulating mechanic for checking/ adjusting/ calibrating sensors of the vehicle may be required before allowing the vehicle to continue driving. This is unfortunately expensive and time consuming, which may cause substantial delay of the transportation.
  • Document US 20170166219 concerns diagnosing and supplementing vehicle sensor data.
  • Sensor data is shared between vehicles and a road infrastructure via radio.
  • Sensor data such as temperature is collected from a plurality of vehicle sensors, and a mean value is calculated. In case a sensor value is received, or discovered, which deviates substantially from the mean value, it is returned to the vehicle and the temperature sensor is adjusted.
  • However, the deviating sensor value may in fact be the only correct sensor value, while the other sensor values may be erroneous.
  • Also, the only sensor correction discussed in US 20170166219 is correction of the temperature sensor.
  • The temperature sensor of a vehicle is typically mounted in a protected position in the vehicle, while many other sensors such as radar, lidar, cameras etc. often are positioned at locations where they are subject to interference with the environment. It would for this reason be desirable to detect errors and calibrate sensors dedicated towards measuring distance, proximity, displacement, etc., based on usage of electromagnetic radiation such as visible light, infrared light, microwaves, radio waves, etc.
  • Document US 20170232974 describes a drive assistance system.
  • A vehicle comprises sensors. When one of the sensors in the vehicle is determined to be incorrect, automatic driving of the vehicle is disabled.
  • Document US 2017072967 concerns a system for autonomously guiding a vehicle.
  • A controller is designed to detect a malfunction of a sensor in the vehicle.
  • An auxiliary sensor signal (of another vehicle) is obtained via a communication network, and the vehicle is navigated assisted by the auxiliary sensor signal.
  • The presented solution is dependent upon a correct sensor signal of another vehicle. However, there may not be any other vehicle close by, or that vehicle may also have an incorrect sensor. It would be desirable to find a solution which is not dependent on sensors of other vehicles.
  • Document WO 2017003350 describes a solution wherein vehicle external sensors, i.e. sensors in other close-by vehicles, are used in order to verify that the sensor data of the own vehicle is correct. In case an error code of the temperature sensor in the own vehicle is detected, temperature information from a temperature sensor in another close-by vehicle may be obtained and utilised.
  • This objective is achieved by a method in a vehicle, for calibration of a vehicle sensor, which vehicle sensor is based on measurements of electromagnetic radiation.
  • The method comprises obtaining wireless signals comprising information describing position and appearance of a reference object. Further, the method also comprises detecting the reference object with the vehicle sensor. The method in addition comprises determining position and appearance of the reference object, based on the sensor detection of the reference object. Also, the method further comprises comparing the determined position and appearance of the reference object with the obtained position and appearance of the reference object. The method additionally comprises calibrating the sensor based on the obtained position and appearance of the reference object, when the difference between the determined position and appearance of the reference object and the obtained position and appearance of the reference object exceeds a threshold limit.
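The claimed sequence of steps can be sketched in code, purely as an illustration. All names, the scalar "position" model, the error metric and the threshold value below are invented simplifications and not part of the claims:

```python
# Illustrative sketch of the calibration method: compare the sensor-determined
# value with the broadcast ("obtained") value and calibrate only when the
# deviation exceeds a threshold limit. Names and values are hypothetical.

THRESHOLD = 0.01  # assumed relative deviation limit (e.g. 1 %)

def relative_error(determined: float, obtained: float) -> float:
    """Deviation between the sensor-determined value and the broadcast value."""
    return abs(determined - obtained) / abs(obtained)

def calibrate_if_needed(determined_position: float, obtained_position: float) -> float:
    """Return the correction to apply to the sensor, or 0.0 when the
    deviation stays within the threshold limit."""
    if relative_error(determined_position, obtained_position) > THRESHOLD:
        # Calibrate towards the obtained ("ground truth") position.
        return obtained_position - determined_position
    return 0.0
```

For example, a detection at 10.5 m against a broadcast distance of 10.0 m (5 % deviation) would trigger a correction, while a 0.005 % deviation would not.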
  • A control arrangement in a vehicle is configured for calibration of a vehicle sensor, which vehicle sensor is based on measurements of electromagnetic radiation.
  • The control arrangement is configured to obtain wireless signals comprising information describing position and appearance of a reference object, via a receiver.
  • The control arrangement is further configured to detect the reference object with the vehicle sensor.
  • The control arrangement is additionally configured to determine position and appearance of the reference object, based on the sensor detection of the reference object.
  • A reference object is configured for assisting a control arrangement according to the second aspect in calibration of a vehicle sensor in a vehicle.
  • the vehicle sensor is based on measurements of electromagnetic radiation.
  • The reference object comprises a memory device, comprising information describing position and appearance of the reference object.
  • The reference object also comprises a transmitter, configured to broadcast wireless signals comprising the information describing position and appearance of the reference object, stored in the memory device.
  • A calibration of the misaligned sensor may then be performed. This process may be repeated until no misalignment (exceeding a predetermined threshold limit) is detected of any sensor in/ on the vehicle. Thereby, traffic safety is enhanced. Also, it is avoided that the vehicle has to interrupt the current transportation and stop, waiting for a human service operator to come and check/ calibrate/ exchange sensors on the vehicle. Hereby, time and money are saved.
  • Figure 1 illustrates an example of a vehicle equipped with sensors and a control arrangement according to an embodiment of the invention;
  • Figure 2 illustrates a vehicle interior of a vehicle equipped with sensors and a control arrangement according to an embodiment of the invention;
  • Figure 3 illustrates an example of a vehicle according to an embodiment of the invention in an overview, also illustrating a reference object and an intermediate obstacle;
  • Figure 4 is a flow chart illustrating an embodiment of the method
  • Figure 5 is an illustration depicting a system according to an embodiment.
  • Embodiments of the invention described herein are defined as a method, a control arrangement and a reference object, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete. Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
  • Figure 1 illustrates a scenario with a vehicle 100 driving in a driving direction 105 on a road 110, approaching a reference object 120a such as e.g. a traffic sign/ light, a pole, a part of the road 110, an illumination arrangement, a parking lot, a building or other similar structure.
  • The reference object 120a may be situated beside the road 110, i.e. beside, in front of or behind the vehicle 100; in/ under the road 110, i.e. under the vehicle 100; above the road 110 and also above the vehicle 100, etc.
  • the vehicle 100 comprises at least one sensor 130, 140a, 140b configured for detecting traffic related objects such as e.g. reference objects 120a in the relative vicinity of the vehicle 100.
  • The vehicle 100 may comprise a truck, a bus, a car, a motorcycle, or similar means of conveyance.
  • the vehicle 100 may typically be autonomous/ driverless. However, the vehicle 100 may also, or alternatively be conducted by a human driver.
  • The sensor 130, 140a, 140b may comprise camera, laser, lidar or radar sensors already existing on the vehicle 100 for other driving assistance functions, besides detecting the reference object 120a.
  • the vehicle 100 comprises a wireless receiver 150, configured to receive wireless communication such as e.g. radio signals from a transmitter related to the reference object 120a.
  • the solution to the problem of malfunctioning or incorrect sensors 130, 140a, 140b is to deploy a functionality in the vehicle 100 that can detect misalignment and defects of sensors 130, 140a, 140b in/ on the vehicle 100 by using information from smart infrastructure, such as the reference object 120a.
  • the functionality may comprise detecting, via the sensors 130, 140a, 140b, that the vehicle 100 is approaching the reference object 120a.
  • The approach of the vehicle 100 to the reference object 120a may be detected by receiving short distance wireless communication by the wireless receiver 150 from the transmitter on, or in proximity of, the reference object 120a.
  • The wireless communication may comprise some sort of V2X communication such as e.g. WiFi, Wireless Local Area Network (WLAN), 3GPP LTE, Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), optical communication such as Infrared Data Association (IrDA), Low-Power Wide-Area Network (LPWAN) such as e.g. LoRa, or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
  • The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE).
  • IEEE 802.11p is an extension to the IEEE 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
  • Information concerning absolute and/or relative position, size, shape, colour/s, etc., of the reference object 120a may be transferred to the vehicle 100. This information may then act as "ground truth" and be used for making comparisons with the sensor data.
  • Sensor misalignment can be detected and adjusted electronically and/or mechanically to compensate for the current error between detections and the vehicle-to-infrastructure data, in case the error exceeds a predetermined threshold limit, such as e.g. 1%.
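One conceivable form of such an electronic compensation, sketched here under invented assumptions (a constant angular misalignment, bearing values in degrees), is to estimate the systematic offset between repeated detections and the vehicle-to-infrastructure "ground truth" and subtract it from subsequent raw readings:

```python
# Hypothetical electronic compensation for a constant bearing offset.
# Detections and ground-truth values are invented illustration data.

def estimate_offset(detections, ground_truth):
    """Mean signed difference (degrees) between detected and true bearings."""
    diffs = [d - g for d, g in zip(detections, ground_truth)]
    return sum(diffs) / len(diffs)

def compensate(raw_bearing, offset):
    """Electronically corrected bearing reading."""
    return raw_bearing - offset
```

With three detections each reading 0.2 degrees high, the estimated offset is 0.2 degrees, and later raw readings are corrected by that amount.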
  • The advantage with this solution is to be able to achieve automatic online calibration of sensors 130, 140a, 140b while driving, for example while stopping at or passing a traffic light, a tunnel, or other infrastructure. Since no human can take over in autonomous vehicles 100, there is a need to continuously ensure that the sensor data is correct, for security reasons.
  • the solution prevents vehicles 100 from stopping due to contradictory sensor reports. Thereby, transportation delays and service adjustments from a human operator are avoided. Hereby costs are saved while safety is increased.
  • The functionality may in some embodiments be extended to incorporate vehicle-to-vehicle information and use other moving or stationary vehicles as a reference point instead of infrastructure.
  • The sensors 130, 140a, 140b of the vehicle 100 may be continuously or regularly checked and/or calibrated while passing the area, leading to a safer and more reliable traffic situation, more effective transportation and fewer stops for manual sensor calibration.
  • Figure 2 illustrates an example of how the previous scenario in Figure 1 may be perceived by the driver of the vehicle 100, approaching a set of reference objects 120a, 120b, 120c.
  • The reference objects 120a, 120b, 120c, or one reference object 120a, 120b, 120c out of the set of reference objects 120a, 120b, 120c, may comprise or be associated with a memory device 210, comprising information describing position and appearance of the reference object/s 120a, 120b, 120c.
  • The information may comprise absolute position, relative position in relation to other reference objects 120a, 120b, 120c, the road 110, the vehicle position, and/ or to any other structure, etc.; shape, height, width, dimensions, colour/s, etc.
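A payload carrying this kind of information could, purely as a sketch, be structured as below. The field names and example values are invented for illustration; the document lists the kinds of information but prescribes no concrete format:

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceObjectInfo:
    """Hypothetical layout of the broadcast reference-object information."""
    object_id: str
    latitude: float                # absolute position
    longitude: float
    height_m: float                # physical dimensions
    width_m: float
    shape: str                     # e.g. "sign", "pole"
    colours: list = field(default_factory=list)

# Example record for a traffic sign acting as reference object:
sign = ReferenceObjectInfo("ref-120a", 59.33, 18.07, 2.5, 0.6, "sign",
                           ["red", "white"])
```

The vehicle would compare each received field against what its own sensors determine.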
  • the information concerning the reference objects 120a, 120b, 120c may be determined and stored into the memory device 210 by a human operator in some embodiments.
  • a positioning device such as a Global Positioning System (GPS) device may be comprised at the reference object 120a, 120b, 120c. Then, the position of the reference object 120a, 120b, 120c may be continuously measured automatically, and the memory device 210 may be continuously updated with the location information.
  • The reference objects 120a, 120b, 120c, or one reference object 120a, 120b, 120c out of the set of reference objects 120a, 120b, 120c, may comprise or be associated with a transmitter 220, configured to transmit or broadcast wireless signals comprising the information describing position and appearance of the reference object 120a, 120b, 120c, extracted from the memory device 210.
  • An advantage with broadcasting the information of the reference object 120a, 120b, 120c continuously or at a regular time interval is that several vehicles 100 (e.g. approaching the reference object 120a, 120b, 120c from different directions and/ or at different distances) may obtain the information and may perform the sensor calibration.
  • The transmitter 220 may be triggered to emit signalling comprising the information e.g. by a detection of the approaching vehicle 100 by a sensor on the reference object 120a, 120b, 120c.
  • An advantage therewith is that the transmitter 220 may hibernate when no vehicle 100 is detected, thereby saving transmission energy.
  • the information to be transmitted may be encrypted, and/ or signed with a digital signature before being broadcasted.
  • Thereby it could be ascertained by the vehicle 100 that the reference object 120a, 120b, 120c is a real reference object, managed by an authority; and that the information on the memory device 210 concerning the reference object 120a, 120b, 120c has not been tampered with, or otherwise degenerated.
  • A check-sum may be calculated on the information on the memory device 210, and the check-sum may be broadcasted together with the information on the memory device 210.
  • The vehicle 100 may thereby check that the information received from the transmitter 220 has not been tampered with, or otherwise degenerated, by calculating a check-sum on the information using the same algorithm and then comparing the calculated check-sum with the received check-sum.
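The check-sum exchange could be sketched as follows. CRC-32 from Python's standard library is used purely as an example algorithm, and the message layout is invented; the document does not prescribe either:

```python
import json
import zlib

def make_message(info: dict) -> bytes:
    """Reference-object side: serialise the info and attach its check-sum."""
    payload = json.dumps(info, sort_keys=True)
    return json.dumps({"payload": payload,
                       "crc32": zlib.crc32(payload.encode())}).encode()

def verify_message(message: bytes) -> bool:
    """Vehicle side: recompute the check-sum with the same algorithm
    and compare it with the received one."""
    envelope = json.loads(message)
    return zlib.crc32(envelope["payload"].encode()) == envelope["crc32"]
```

A message whose payload has been altered in transit no longer matches its attached check-sum and is rejected. Note that a plain check-sum only detects accidental corruption; deliberate tampering is addressed by the digital signature mentioned above, since an attacker could recompute a check-sum.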
  • Security issues in association with autonomous vehicles 100 and calibration of their sensors 130, 140a, 140b may be important as they otherwise may be easy targets for terrorists, anarchists and/ or criminals.
  • The reference object 120a, 120b, 120c may comprise an impact monitor/ shock sensor 230, configured to detect an impact on the reference object 120a, 120b, 120c.
  • the impact monitor 230 may indicate whether a physical shock or impact has occurred.
  • The impact monitor 230 may have a binary output (go/no-go); such devices are sometimes called shock overload devices. It may thereby be determined that the reference object 120a, 120b, 120c has not been dislocated, for example tilted by another vehicle.
  • the transmitter 220 may in some embodiments be further configured to emit a confirmation that the reference object 120a, 120b, 120c has not been dislocated when the impact monitor 230 has not detected any impact.
  • The impact monitor 230 may comprise or be based on e.g. a sensor such as an accelerometer, a spring-mass system which can be triggered by a shock, disruption of the surface tension of a liquid, magnetic balls which can be dislodged from a holder, breakage of an inexpensive brittle component with a known fragility, etc.
  • the set of reference objects 120a, 120b, 120c may be situated in different directions around the vehicle 100, and / or at different heights.
  • Different reference objects 120a, 120b, 120c may be dedicated for calibration of different sensor types in some embodiments; e.g. one reference object 120a, 120b, 120c may be dedicated for calibration of a camera, one reference object 120a, 120b, 120c may be dedicated for calibration of a laser, one reference object 120a, 120b, 120c may be dedicated for calibration of an ultrasound sensor, one reference object 120a, 120b, 120c may be dedicated for calibration of an infrared sensor, etc.
  • the vehicle 100 comprises a control arrangement 200, for calibration of the vehicle sensor 130, 140a, 140b.
  • The control arrangement 200 is configured to obtain wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c, via a receiver 150. Further, the control arrangement 200 is configured to detect the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b. In addition, the control arrangement 200 is configured to determine position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection of the reference object 120a, 120b, 120c. Furthermore, the control arrangement 200 is configured to compare the determined position and appearance of the reference object 120a, 120b, 120c with the obtained position and appearance of the reference object 120a, 120b, 120c.
  • The control arrangement 200 is additionally configured to calibrate the sensor 130, 140a, 140b based on the obtained position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined position and appearance of the reference object 120a, 120b, 120c and the obtained position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
  • The control arrangement 200 may be further configured to obtain wireless signals comprising information describing position and appearance of at least two distinct reference objects 120a, 120b, 120c, via the receiver 150; i.e. a set of distinct reference objects 120a, 120b, 120c.
  • the control arrangement 200 may also be configured to detect the at least two reference objects 120a, 120b, 120c via the sensor 130, 140a, 140b.
  • the control arrangement 200 may also be configured to determine the positions and appearances of the reference objects 120a, 120b, 120c, based on the sensor detection of the reference objects 120a, 120b, 120c.
  • The control arrangement 200 may also be configured to compare the determined positions and appearances of the reference objects 120a, 120b, 120c (as determined by the sensors 130, 140a, 140b) with the obtained positions and appearances of the reference objects 120a, 120b, 120c (as received from the transmitter 220). Also, the control arrangement 200 may further be configured to calibrate the sensor 130, 140a, 140b, based on the obtained positions and appearances of the reference objects 120a, 120b, 120c.
  • The control arrangement 200 may be further configured to obtain a wireless signal comprising a confirmation that the reference object 120a, 120b, 120c has not been dislocated.
  • An advantage with arranging a plurality of reference objects 120a, 120b, 120c at different places and directions around the vehicle position is that several vehicle sensors 130, 140a, 140b, oriented in different directions and/ or being of different types may be calibrated.
  • Figure 3 illustrates another scenario wherein a vehicle 100 as illustrated in Figure 1 and/or Figure 2 is approaching a reference object 120a, 120b, 120c while driving in a driving direction 105, as regarded from an above perspective.
  • the reference object 120a is obscured by an intermediate obstacle 300.
  • the intermediate obstacle 300 may be e.g. another vehicle, a human, an edifice structure, dirt on the sensor lens, a weather phenomenon, etc.
  • a weather phenomenon may comprise a reduced visibility situation such as e.g. twilight, night, rain, fog, smoke, pollution, snowfall, hail, blizzard, sun dazzle or similar weather conditions.
  • the sensor 130, 140a, 140b may become dazzled by the headlights of an oncoming vehicle driving in the opposite direction.
  • Different kinds of vehicle sensors 130, 140a, 140b may be sensitive to different kinds of intermediate obstacles 300, e.g., a camera may not be able to detect the reference object 120a, 120b, 120c due to darkness, while darkness is no obstacle for a laser or lidar based sensor; a transparent glass surface (vehicle window) may block the infrared and ultraviolet signals of a PIR, but not the visible light used by a camera/video camera, etc.
  • the control arrangement 200 may be configured to detect that the reference object 120a, 120b, 120c cannot be observed due to the intermediate obstacle 300.
  • the detection may be made e.g. by receiving information over a wireless communication from the transmitter 220 associated with the reference object 120a, 120b, 120c, concerning position of the reference object 120a, 120b, 120c.
  • the sensor calibration may be postponed until a later moment when the intermediate obstacle 300 has left.
  • sensors 130, 140a, 140b of the vehicle 100 being able to detect the reference object 120a, 120b, 120c may be calibrated.
  • the sensors 130, 140a, 140b comprising laser, lidar, ultrasound, and/or infrared light, etc., may be calibrated anyway, while camera sensors may not be calibrated.
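The category-wise gating described above could be sketched as follows. The mapping from obstacle type to blocked sensor categories is purely illustrative (it depends on the actual sensor hardware), as are the function and variable names:

```python
# Hypothetical mapping: which sensor categories a given intermediate
# obstacle 300 prevents from observing the reference object.
BLOCKED_BY = {
    "darkness":  {"camera"},           # no obstacle for laser/lidar
    "fog":       {"camera", "lidar"},  # radar typically unaffected
    "glass":     {"pir"},              # blocks IR/UV, passes visible light
    "lens_dirt": {"camera"},
}

def calibratable_sensors(sensors, obstacle):
    """Corresponding to steps 402-403: discontinue calibration only for
    the sensor categories that cannot observe the reference object."""
    blocked = BLOCKED_BY.get(obstacle, set())
    return [s for s in sensors if s not in blocked]
```

With this kind of table, calibration continues for the unaffected categories instead of being discontinued for the whole vehicle.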
  • Figure 4 illustrates an example of a method 400 according to an embodiment.
  • the flow chart in Figure 4 shows the method 400 for use in a vehicle 100 for calibration of a vehicle sensor 130, 140a, 140b, which vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation.
  • the sensor calibration may be made based on detection of a reference object 120a, 120b, 120c.
  • the reference object 120a, 120b, 120c may comprise e.g. a traffic sign, a bus stop, a loading dock, a driveway, a garage entrance, a tunnel entrance, an edifice structure, etc.
  • the method 400 may comprise a number of steps 401-407. However, some of these steps 401-407 may be performed solely in some alternative embodiments, like e.g. steps 402-403. Further, the described steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests.
  • the method 400 may comprise the subsequent steps:
  • Step 401 comprises obtaining wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c.
  • the obtained wireless signals may comprise information describing position and appearance of at least two distinct reference objects 120a, 120b, 120c.
  • the obtained wireless signal may comprise a confirmation that the reference object 120a, 120b, 120c has not been dislocated, based on impact detection observed by an impact monitor 230 arranged on the reference object 120a, 120b, 120c, configured to detect an impact on the reference object 120a, 120b, 120c.
  • Step 402 may be performed only in some particular embodiments. Step 402 comprises detecting that the reference object 120a, 120b, 120c cannot be observed due to an intermediate obstacle 300, such as a vehicle, a structure, a weather phenomenon and/or dirt on the sensor lens (in case the sensor 130, 140a, 140b is a camera, video camera or similar).
  • the nature of the intermediate obstacle 300 may be determined based on the sensor signals, and/ or information obtained from environmental sensors, e.g. in other close-by vehicles, weather information services, rain sensor information, etc.
  • Step 403, which may be performed only in some particular embodiments, comprises discontinuing the method 400 for at least some sensor categories of the vehicle 100, when the obstacle 300 is detected 402.
  • the vehicle 100 may be ordered to execute a sensor lens cleaning measure, and / or to pass a carwash or other cleaning facility.
  • Step 404 comprises detecting the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b.
  • the sensor 130, 140a, 140b may detect at least two distinct reference objects 120a, 120b, 120c, in some embodiments.
  • the at least two distinct reference objects 120a, 120b, 120c may be situated in different directions, at different heights, at different distances and be dedicated towards different categories of sensors in some embodiments.
  • Step 405 comprises determining position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection 404 of the reference object 120a, 120b, 120c.
  • the positions and appearances of the reference objects 120a, 120b, 120c may be determined, based on the sensor detection 404 of the reference objects 120a, 120b, 120c, in some embodiments.
  • Different sensor types of the vehicle sensors 130, 140a, 140b may determine different aspects of the position/appearance of the reference object 120a, 120b, 120c.
  • a sensor based on laser or lidar may determine distance to the reference object 120a, 120b, 120c while a vehicle sensor 130, 140a, 140b embodied as a camera may determine colour and/ or shape of the reference object 120a, 120b, 120c.
  • Step 406 comprises comparing the determined 405 position and appearance of the reference object 120a, 120b, 120c, with the obtained 401 position and appearance of the reference object 120a, 120b, 120c.
  • the comparison may in some embodiments be made between the determined 405 positions and appearances of the plurality of reference objects 120a, 120b, 120c, with the obtained 401 positions and appearances of the plurality of reference objects 120a, 120b, 120c.
  • Step 407 comprises calibrating the sensor 130, 140a, 140b based on the obtained 401 position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined 405 position and appearance of the reference object 120a, 120b, 120c and the obtained 401 position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
  • the calibration of the sensor 130, 140a, 140b may be made, based on the obtained 401 position and appearance of the plurality of reference objects 120a, 120b, 120c.
  • This process may be repeated until no misalignment exceeding the predetermined threshold limit is detected of any sensor 130, 140a, 140b in/ on the vehicle 100.
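As an illustration, the comparison and conditional calibration of steps 406-407 could be sketched as below. The 2D pose representation, the relative-error metric, the 1% default threshold and all names are assumptions made for this example, not taken from the document:

```python
import math

def pose_error(determined, obtained):
    """Relative difference between the position determined by the sensor
    (step 405) and the position obtained over wireless signals (step 401)."""
    dx = determined["x"] - obtained["x"]
    dy = determined["y"] - obtained["y"]
    ref = math.hypot(obtained["x"], obtained["y"]) or 1.0
    return math.hypot(dx, dy) / ref

def calibrate_if_needed(sensor_offset, determined, obtained, threshold=0.01):
    """Steps 406-407: when the difference exceeds the threshold limit,
    adjust the sensor's alignment offset towards the obtained position."""
    if pose_error(determined, obtained) > threshold:
        sensor_offset["x"] += obtained["x"] - determined["x"]
        sensor_offset["y"] += obtained["y"] - determined["y"]
        return True    # calibration was performed
    return False       # sensor already within tolerance
```

The process could then be repeated for each sensor until the function reports no further adjustment.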
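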
  • if the misalignment remains after repeated calibration attempts, the reason may be that the sensor 130, 140a, 140b has to be replaced.
  • the vehicle 100 may then be ordered to park in a convenient close-by location and a request for assistance of a service technician may be outputted, as the sensor 130, 140a, 140b may have to be repaired or replaced.
  • Figure 5 illustrates an embodiment of a system 500 for calibration of a vehicle sensor 130, 140a, 140b in a vehicle 100.
  • the system 500 comprises the vehicle sensor 130, 140a, 140b, configured for measurements of electromagnetic radiation, comprised in the vehicle 100. Further, the system 500 may comprise a control arrangement 200 for calibration of a vehicle sensor 130, 140a, 140b. The control arrangement 200 is comprised in the vehicle 100.
  • the vehicle sensor 130, 140a, 140b may be based on e.g.: visible light (camera, video camera); monochrome, coherent light (laser); microwaves (maser); infrared light (Passive Infrared Sensor (PIR)); pulsed laser light (lidar), etc.
  • the system 500 may perform at least some of the previously described steps 401-407 according to the method 400 described above and illustrated in Figure 4.
  • the control arrangement 200 is configured to obtain wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c, via a receiver 150. Further, the control arrangement 200 is configured to detect the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b. Also, the control arrangement 200 is furthermore configured to determine position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection of the reference object 120a, 120b, 120c. The control arrangement 200 is additionally configured to compare the determined position and appearance of the reference object 120a, 120b, 120c, with the obtained position and appearance of the reference object 120a, 120b, 120c.
  • the control arrangement 200 is configured to calibrate the sensor 130, 140a, 140b based on the obtained position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined position and appearance of the reference object 120a, 120b, 120c and the obtained position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
  • the control arrangement 200 may comprise a processing circuit 520 configured to perform at least some of the method steps 401-407 of the method 400, in some embodiments.
  • Such processing circuit 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions.
  • the herein utilised expression “processor” may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
  • the control arrangement 200 also comprises a receiving circuit 510 configured for receiving a signal from the sensor 130, 140a, 140b and/ or the transmitter 220.
  • the control arrangement 200 may comprise a memory 525 in some embodiments.
  • the optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis.
  • the memory 525 may comprise integrated circuits comprising silicon-based transistors.
  • the memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
  • the control arrangement 200 may comprise a signal transmitter 530.
  • the signal transmitter 530 may be configured for transmitting a control signal to be received by one or several motors or similar directing devices for directing and calibrating the vehicle sensor 130, 140a, 140b.
  • the system 500 additionally comprises a reference object 120a, 120b, 120c for assisting the control arrangement 200 in the vehicle 100 in calibration of a vehicle sensor 130, 140a, 140b.
  • the vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation.
  • the reference object 120a, 120b, 120c comprises a memory device 210, comprising information describing position and appearance of the reference object 120a, 120b, 120c.
  • the system 500 comprises a transmitter 220, configured to broadcast wireless signals comprising the information describing position and appearance of the reference object 120a, 120b, 120c of the memory device 210.
  • the transmitter 220 may be arranged at the reference object 120a, 120b, 120c, or in the vicinity of the reference object 120a, 120b, 120c.
  • the system 500 also comprises a receiver 150 in the vehicle 100, configured to receive the information describing position and appearance of the reference object 120a, 120b, 120c from the transmitter 220.
  • the system 500 may comprise additional units for performing the method 400 according to steps 401-407.
  • the above described method steps 401-407 to be performed in the system 500 may be implemented through the one or more processing circuits 520 within the control arrangement 200, together with a computer program product for performing at least some of the functions of the method steps 401-407.
  • a computer program product comprising instructions for performing the method steps 401-407 in the control arrangement 200 may perform the method 400 comprising at least some of the steps 401-407 for calibration of a vehicle sensor 130, 140a, 140b, which vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation.
  • the computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the method steps 401-407 according to some embodiments when being loaded into the one or more processing circuits 520 of the control arrangement 200.
  • the data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner.
  • the computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 200 remotely, e.g., over an Internet or an intranet connection.
  • the term “and/ or” comprises any and all combinations of one or more of the associated listed items.
  • the term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless ex pressly stated otherwise.
  • the singular forms “a”, “an” and “the” are to be inter preted as “at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise.

Abstract

Method (400), control arrangement (200) and reference object (120a, 120b, 120c) for calibration of a vehicle sensor (130, 140a, 140b), which vehicle sensor (130, 140a, 140b) is based on measurements of electromagnetic radiation. The method (400) comprises: obtaining (401) wireless signals comprising information describing position and appearance of the reference object (120a, 120b, 120c); detecting (404) the reference object (120a, 120b, 120c) with the vehicle sensor (130, 140a, 140b); determining (405) position and appearance of the reference object (120a, 120b, 120c), based on the sensor detection (404) of the reference object (120a, 120b, 120c); comparing (406) the determined (405) position and appearance of the reference object (120a, 120b, 120c), with the obtained (401) position and appearance of the reference object (120a, 120b, 120c); and calibrating (407) the sensor (130, 140a, 140b) when the difference exceeds a threshold limit.

Description

METHOD, CONTROL ARRANGEMENT AND REFERENCE OBJECT FOR CALIBRATION OF SENSORS IN AN AUTONOMOUS VEHICLE
TECHNICAL FIELD
This document discloses a method, a control arrangement and a reference object. More particularly, a method, a control arrangement and a reference object are disclosed for calibration of a vehicle sensor based on measurements of electromagnetic radiation.
BACKGROUND
A vehicle, in particular an autonomous vehicle, comprises a large number of sensors of different kinds. Vehicle sensors are often calibrated in the production line using a reference point or similar to ensure that the sensors are aligned as expected. When the vehicle is operational, sensor errors are often discovered by the driver. Another common alternative is to compare the information from different sensors of the vehicle, and e.g. disable Advanced Driver Assistance Systems (ADAS) functions if some sensor information seems to be corrupt, i.e. different sensors present deviating results.
Autonomous vehicles cannot rely on human interaction, which is why it is important to ensure that the sensor information of the environment is correct. A possible sensor error could lead to a vehicle-off-road situation if the erroneous sensor cannot be identified, which typically may be the case.
A human interaction by an ambulating mechanic for checking/adjusting/calibrating sensors of the vehicle may be required before allowing the vehicle to continue driving. This is unfortunately expensive and time consuming, which may cause substantial delay of the transportation.
Document US 20170166219 concerns diagnosing and supplementing vehicle sensor data. Sensor data is shared between vehicles and a road infrastructure via radio. Sensor data such as temperature is collected from a plurality of vehicle sensors, and a mean value is calculated. In case a sensor value is received, or discovered, which deviates substantially from the mean value, it is returned to the vehicle and the temperature sensor is adjusted.
The document suffers from several drawbacks. Firstly, the deviating sensor value may in fact be the only correct sensor value, while the other sensor values may be erroneous. Secondly, the only sensor correction discussed in D1 is correction of the temperature sensor. The temperature sensor of a vehicle is typically mounted in a protected position in the vehicle, while many other sensors such as radar, lidar, cameras etc., often are positioned at locations where they are subject to interference with the environment. It would for this reason be desirable to detect errors and calibrate sensors dedicated towards measuring distance, proximity, displacement, etc., based on usage of electromagnetic radiation such as visible light, infrared light, microwaves, radio waves, etc.
Document US 20170232974 describes a drive assistance system. A vehicle comprises sensors. When one of the sensors in the vehicle is determined to be incorrect, automatic driving of the vehicle is disabled.
It is not clear from the document how the malfunctioning sensor is identified. Further, no calibration of the sensor is made, which is why the solution discussed in this document requires human interaction, which would be desirable to avoid as much as possible.
Document US 2017072967 concerns a system for autonomously guiding a vehicle. A controller is designed to detect a malfunction of a sensor in the vehicle. An auxiliary sensor signal (of another vehicle) is obtained via a communication network, and the vehicle is navigated assisted by the auxiliary sensor signal.
The presented solution is dependent upon a correct sensor signal of another vehicle. However, there may not be any other vehicle close by, or that vehicle may also have an incorrect sensor. It would be desirable to find a solution which is not dependent on sensors of other vehicles.
Document WO 2017003350 describes a solution wherein vehicle external sensors, i.e. sensors in other close-by vehicles, are used in order to verify that the sensor data of the own vehicle is correct. In case an error code of the temperature sensor in the own vehicle is detected, temperature information from a temperature sensor in another close-by vehicle may be obtained and utilised.
Again, the solution is completely dependent on a close-by other vehicle having a correct sensor. Also, only sensors related to outside temperature, air pressure, humidity and/or altitude are discussed. As stated before, it would be desirable to rather calibrate sensors dedicated towards measuring distance, proximity, displacement, etc., based on usage of electromagnetic radiation such as visible light, infrared light, microwaves, radio waves, etc.
Since no human driver, who could evaluate the sensor status during operation, is present in an autonomous vehicle, it would be desirable to continuously, or repetitively, check and/or calibrate sensors automatically during transportation.

SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and improve traffic security, in particular when driving an autonomous vehicle.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle, for calibration of a vehicle sensor, which vehicle sensor is based on measurements of electromagnetic radiation. The method comprises obtaining wireless signals comprising information describing position and appearance of a reference object. Further, the method also comprises detecting the reference object with the vehicle sensor. The method in addition comprises determining position and appearance of the reference object, based on the sensor detection of the reference object. Also, the method further comprises comparing the determined position and appearance of the reference object, with the obtained position and appearance of the reference object. The method additionally comprises calibrating the sensor based on the obtained position and appearance of the reference object, when the difference between the determined position and appearance of the reference object and the obtained position and appearance of the reference object exceeds a threshold limit.
According to a second aspect of the invention, this objective is achieved by a control arrangement in a vehicle. The control arrangement is configured for calibration of a vehicle sensor, which vehicle sensor is based on measurements of electromagnetic radiation. Further, the control arrangement is configured to obtain wireless signals comprising information describing position and appearance of a reference object, via a receiver. In addition, the control arrangement is further configured to detect the reference object with the vehicle sensor. Furthermore, the control arrangement is additionally configured to determine position and appearance of the reference object, based on the sensor detection of the reference object.
According to a third aspect of the invention, this objective is achieved by a reference object. The reference object is configured for assisting a control arrangement according to the second aspect in calibration of a vehicle sensor in a vehicle. The vehicle sensor is based on measurements of electromagnetic radiation. The reference object comprises a memory device, comprising information describing position and appearance of the reference object. Furthermore, the reference object also comprises a transmitter, configured to broadcast wireless signals comprising the information describing position and appearance of the reference object, of the memory device.

Thanks to the described aspects, the functionality of various sensors in a vehicle may be checked regularly while passing or stopping at places comprising one or several reference objects with known positions and properties. By comparing received information concerning position and properties with the corresponding information determined by the sensors, a misalignment may be detected. Further, a calibration of the misaligned sensor may be performed. This process may be repeated until no misalignment (exceeding a predetermined threshold limit) is detected of any sensor in/on the vehicle. Thereby, traffic safety is enhanced. Also, it is avoided that the vehicle has to interrupt the current transportation and stop, waiting for a human service operator to come and check/calibrate/exchange sensors on the vehicle. Hereby, time and money are saved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.

FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:
Figure 1 illustrates an example of a vehicle equipped with sensors and a control arrangement according to an embodiment of the invention;
Figure 2 illustrates a vehicle interior of a vehicle equipped with sensors and a control arrangement according to an embodiment of the invention;
Figure 3 illustrates an example of a vehicle according to an embodiment of the invention in an overview, also illustrating a reference object and an intermediate obstacle;
Figure 4 is a flow chart illustrating an embodiment of the method;
Figure 5 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as a method, a control arrangement and a reference object, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete. Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1 illustrates a scenario with a vehicle 100 driving in a driving direction 105 on a road 110, approaching a reference object 120a such as e.g. a traffic sign/light, a pole, a part of the road 110, an illumination arrangement, a parking lot, a building or other similar structure. The reference object 120a may be situated beside the road 110, i.e. beside, in front of or behind the vehicle 100; in/under the road 110, i.e. under the vehicle 100; above the road 110 and also above the vehicle 100, etc. The vehicle 100 comprises at least one sensor 130, 140a, 140b configured for detecting traffic related objects such as e.g. reference objects 120a in the relative vicinity of the vehicle 100.
The vehicle 100 may comprise a truck, a bus, a car, a motorcycle, or similar means of conveyance. The vehicle 100 may typically be autonomous/driverless. However, the vehicle 100 may also, or alternatively, be conducted by a human driver.
According to some embodiments, the sensor 130, 140a, 140b may comprise camera, laser, lidar or radar sensors already existing on the vehicle 100 for other driving assistance functions, besides detecting the reference object 120a.
Further, the vehicle 100 comprises a wireless receiver 150, configured to receive wireless communication such as e.g. radio signals from a transmitter related to the reference object 120a.
The solution to the problem of malfunctioning or incorrect sensors 130, 140a, 140b is to deploy a functionality in the vehicle 100 that can detect misalignment and defects of sensors 130, 140a, 140b in/on the vehicle 100 by using information from smart infrastructure, such as the reference object 120a. The functionality may comprise detecting, via the sensors 130, 140a, 140b, that the vehicle 100 is approaching the reference object 120a. Alternatively, e.g. in case the sensors 130, 140a, 140b are malfunctioning or in case the reference object 120a is obscured by another object or weather phenomenon, the approach of the vehicle 100 to the reference object 120a may be detected by receiving short distance wireless communication by the wireless receiver 150 from the transmitter on, or in proximity of, the reference object 120a.
The wireless communication may comprise some sort of V2X communication such as e.g. WiFi, Wireless Local Area Network (WLAN), 3GPP LTE, Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), optical communication such as Infrared Data Association (IrDA), Low-Power Wide-Area Network (LPWAN) such as e.g. LoRa, or infrared transmission, to name but a few possible examples of wireless communications, in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the IEEE 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
Via the wireless communication, information concerning absolute and/or relative position, size, shape, colour/s, etc., of the reference object 120a may be transferred to the vehicle 100. This information may then act as “ground truth” and be used for making comparisons with the sensor data.
By comparing the detection data from the sensors 130, 140a, 140b with the position and size reported by the reference object 120a, sensor misalignment can be detected and adjusted electronically and/or mechanically to compensate for the current error between the detections and the vehicle-to-infrastructure data, in case the error exceeds a predetermined threshold limit, such as e.g. 1%.
To ensure a robust calibration, several reference objects 120a that can be detected at the same time may be used to achieve a more accurate result, according to some embodiments.
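One simple way to combine several simultaneously detected reference objects is to average the per-object offsets between obtained and determined positions. The document does not specify an aggregation method, so the plain mean and the data layout used here are assumptions for the sketch:

```python
def combined_offset(detections):
    """Average (obtained - determined) position offset over all
    simultaneously detected reference objects.

    `detections` is a list of (determined, obtained) position pairs,
    one pair per detected reference object."""
    n = len(detections)
    dx = sum(obt["x"] - det["x"] for det, obt in detections) / n
    dy = sum(obt["y"] - det["y"] for det, obt in detections) / n
    return {"x": dx, "y": dy}
```

Averaging over several objects reduces the influence of noise in any single detection, which is one way to obtain the more accurate result mentioned above.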
The advantage with this solution is to be able to achieve automatic online calibration of sensors 130, 140a, 140b while driving, for example while stopping at or passing a traffic light, a tunnel, or other infrastructure. Since no human can take over in autonomous vehicles 100, there is a need to continuously ensure that the sensor data is correct, for security reasons. The solution prevents vehicles 100 from stopping due to contradictory sensor reports. Thereby, transportation delays and service adjustments from a human operator are avoided. Hereby costs are saved while safety is increased. The functionality may in some embodiments be extended to incorporate vehicle-to-vehicle information and use other moving or stationary vehicles as a reference point instead of infrastructure.
By providing reference objects 120a at a large number of places, spread over an area such as a city or other region, the sensors 130, 140a, 140b of the vehicle 100 may be continuously or regularly checked and/or calibrated while passing the area, leading to a safer and more reliable traffic situation, more effective transportation and fewer stops for manual sensor calibration.
Figure 2 illustrates an example of how the previous scenario in Figure 1 may be perceived by the driver of the vehicle 100, approaching a set of reference objects 120a, 120b, 120c.
In this case, the vehicle 100 is passing, or stopping at, a pedestrian crossing. The reference objects 120a, 120b, 120c, or one reference object 120a, 120b, 120c out of the set of reference objects 120a, 120b, 120c, may comprise or be associated with a memory device 210, comprising information describing position and appearance of the reference object/s 120a, 120b, 120c. The information may comprise absolute position, relative position in relation to other reference objects 120a, 120b, 120c, the road 110, the vehicle position, and/or any other structure, etc.; shape, height, width, dimensions, colour/s, etc.
The information concerning the reference objects 120a, 120b, 120c may be determined and stored into the memory device 210 by a human operator in some embodiments. In case a reference object 120a, 120b, 120c is mobile, i.e. non-static, a positioning device such as a Global Positioning System (GPS) device may be comprised at the reference object 120a, 120b, 120c. Then, the position of the reference object 120a, 120b, 120c may be continuously measured automatically, and the memory device 210 may be continuously updated with the location information.
Further, the reference objects 120a, 120b, 120c, or one reference object 120a, 120b, 120c out of the set of reference objects 120a, 120b, 120c may comprise or be associated with a transmitter 220, configured to transmit or broadcast wireless signals comprising the information describing position and appearance of the reference object 120a, 120b, 120c, extracted from the memory device 210.
An advantage with broadcasting the information of the reference object 120a, 120b, 120c continuously or at a regular time interval is that several vehicles 100 (e.g. approaching the reference object 120a, 120b, 120c from different directions and/ or at different distances) may obtain the information and may perform the sensor calibration.
In some embodiments, the transmitter 220 may be triggered to emit signalling comprising the information e.g. by a detection of the approaching vehicle 100 by a sensor on the reference object 120a, 120b, 120c. An advantage therewith is that the transmitter 220 may hibernate when no vehicle 100 is detected, thereby saving transmission energy.
In some embodiments, the information to be transmitted may be encrypted, and / or signed with a digital signature before being broadcasted. Thereby, the vehicle 100 can ascertain that the reference object 120a, 120b, 120c is a real reference object, managed by an authority, and that the information on the memory device 210 concerning the reference object 120a, 120b, 120c has not been tampered with, or otherwise degenerated. Alternatively, or additionally, a check-sum may be calculated on the information on the memory device 210, and the check-sum may be broadcasted together with the information on the memory device 210. The vehicle 100 may thereby check that the information received from the transmitter 220 has not been tampered with, or otherwise degenerated, by calculating a check-sum on the information using the same algorithm and comparing the calculated check-sum with the received check-sum.
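As a purely illustrative sketch (not part of the disclosed embodiments), the check-sum scheme described above can be expressed as follows; the message layout, field names and the choice of SHA-256 are assumptions made for the example:

```python
import hashlib
import json

def make_broadcast(info):
    """Pack the reference-object information together with a check-sum,
    so that a receiver can detect tampering or degeneration in transit.
    (Illustrative: field names and SHA-256 are assumed, not specified.)"""
    payload = json.dumps(info, sort_keys=True).encode()
    return json.dumps({"info": info,
                       "checksum": hashlib.sha256(payload).hexdigest()},
                      sort_keys=True).encode()

def verify_broadcast(message):
    """Recompute the check-sum with the same algorithm and compare it with
    the received one; return the information only when they match."""
    parsed = json.loads(message)
    payload = json.dumps(parsed["info"], sort_keys=True).encode()
    if hashlib.sha256(payload).hexdigest() == parsed["checksum"]:
        return parsed["info"]
    return None  # tampered or degenerated broadcast
```

A digital signature scheme would replace the plain hash with a keyed or asymmetric signature, additionally proving that the broadcast originates from the managing authority.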
Security issues in association with autonomous vehicles 100 and calibration of their sensors 130, 140a, 140b may be important as they otherwise may be easy targets for terrorists, anarchists and / or criminals.
Further, according to some embodiments, the reference object 120a, 120b, 120c may comprise an impact monitor/ shock sensor 230, configured to detect impact of the reference object 120a, 120b, 120c. The impact monitor 230 may indicate whether a physical shock or impact has occurred. The impact monitor 230 may have a binary output (go/no-go) and is sometimes called a shock overload device. It may thereby be determined that the reference object 120a, 120b, 120c has not been dislocated, for example tilted by another vehicle. The transmitter 220 may in some embodiments be further configured to emit a confirmation that the reference object 120a, 120b, 120c has not been dislocated when the impact monitor 230 has not detected any impact. The impact monitor 230 may comprise or be based on e.g. a sensor such as an accelerometer, a spring-mass system which can be triggered by a shock, disruption of the surface tension of a liquid, magnetic balls which can be dislodged from a holder, breakage of an inexpensive brittle component with a known fragility, etc. Hereby, the risk of calibrating the vehicle sensors 130, 140a, 140b in relation to a dislocated reference object 120a, 120b, 120c may be avoided or at least reduced.
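The binary (go/no-go) behaviour of such an impact monitor can be sketched as a latching device; the class, its trigger level and the accelerometer-based model are illustrative assumptions, not taken from the publication:

```python
class ImpactMonitor:
    """Illustrative latching shock-overload device: once an acceleration
    above the trigger level is seen, the monitor trips permanently and the
    'not dislocated' confirmation is withheld."""

    def __init__(self, trigger_g=5.0):  # trigger level is an assumed value
        self.trigger_g = trigger_g
        self.tripped = False

    def feed_acceleration(self, accel_g):
        """Latch on any shock at or above the trigger level (binary output)."""
        if abs(accel_g) >= self.trigger_g:
            self.tripped = True

    def confirm_not_dislocated(self):
        """True only while no impact has ever been detected."""
        return not self.tripped
```

The transmitter would then only include the confirmation in its broadcast while `confirm_not_dislocated()` holds.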
In some embodiments, the set of reference objects 120a, 120b, 120c may be situated in different directions around the vehicle 100, and / or at different heights. Furthermore, different reference objects 120a, 120b, 120c may be dedicated for calibration of different sensor types in some embodiments; e.g. one reference object 120a, 120b, 120c may be dedicated for calibration of a camera, one reference object 120a, 120b, 120c may be dedicated for calibration of a laser, one reference object 120a, 120b, 120c may be dedicated for calibration of an ultrasound sensor, one reference object 120a, 120b, 120c may be dedicated for calibration of an infrared sensor, etc.
The vehicle 100 comprises a control arrangement 200, for calibration of the vehicle sensor 130, 140a, 140b. The control arrangement 200 is configured to obtain wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c, via a receiver 150. Further, the control arrangement 200 is configured to detect the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b. In addition, the control arrangement 200 is configured to determine position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection of the reference object 120a, 120b, 120c. Furthermore, the control arrangement 200 is configured to compare the determined position and appearance of the reference object 120a, 120b, 120c, with the obtained position and appearance of the reference object 120a, 120b, 120c. The control arrangement 200 is additionally configured to calibrate the sensor 130, 140a, 140b based on the obtained position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined position and appearance of the reference object 120a, 120b, 120c and the obtained position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
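The threshold comparison performed by the control arrangement can be sketched as below; the dictionary layout, the 2-D position model, the single "size" stand-in for appearance and the numeric threshold are all assumptions for illustration, since the publication does not specify units or a metric:

```python
import math

def needs_calibration(determined, obtained, threshold=0.5):
    """Illustrative comparison step: evaluate the difference between the
    sensor-determined and the broadcast (obtained) position/appearance of a
    reference object against a threshold limit. `determined` and `obtained`
    are hypothetical dicts holding a 2-D position and a scalar size."""
    pos_err = math.dist(determined["position"], obtained["position"])
    size_err = abs(determined["size"] - obtained["size"])
    # Calibration is triggered only when some difference exceeds the limit.
    return max(pos_err, size_err) > threshold
```

A real implementation would compare richer appearance attributes (shape, colour, dimensions) per sensor type, with a per-attribute threshold.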
In some embodiments, the control arrangement 200 may be further configured to obtain wireless signals comprising information describing position and appearance of at least two distinct reference objects 120a, 120b, 120c, via the receiver 150; i.e. a set of distinct reference objects 120a, 120b, 120c. The control arrangement 200 may also be configured to detect the at least two reference objects 120a, 120b, 120c via the sensor 130, 140a, 140b. In addition, the control arrangement 200 may also be configured to determine the positions and appearances of the reference objects 120a, 120b, 120c, based on the sensor detection of the reference objects 120a, 120b, 120c. Furthermore, the control arrangement 200 may also be configured to compare the determined positions and appearances of the reference objects 120a, 120b, 120c (as determined by the sensors 130, 140a, 140b), with the obtained positions and appearances of the reference objects 120a, 120b, 120c (as received from the transmitter 220). Also, the control arrangement 200 may further be configured to calibrate the sensor 130, 140a, 140b, based on the obtained positions and appearances of the reference objects 120a, 120b, 120c.
Furthermore, the control arrangement 200 may be further configured to obtain a wireless signal comprising a confirmation that the reference object 120a, 120b, 120c has not been dislocated.
An advantage with arranging a plurality of reference objects 120a, 120b, 120c at different places and directions around the vehicle position is that several vehicle sensors 130, 140a, 140b, oriented in different directions and/ or being of different types may be calibrated.
Figure 3 illustrates another scenario wherein a vehicle 100 as illustrated in Figure 1 and / or Figure 2 is approaching reference objects 120a, 120b, 120c while driving in a driving direction 105, as regarded from an above perspective.
As illustrated in Figure 3, the reference object 120a is obscured by an intermediate obstacle 300. The intermediate obstacle 300 may be e.g. another vehicle, a human, an edifice structure, dirt on the sensor lens, a weather phenomenon, etc. Such weather phenomenon may comprise a reduced visibility situation such as e.g. twilight, night, rain, fog, smoke, pollution, snow fall, hail, blizzard, sun dazzle or similar weather conditions. Also, the sensor 130, 140a, 140b may become dazzled by the headlights of an oncoming vehicle driving in the opposite direction.
Different kinds of vehicle sensors 130, 140a, 140b may be sensitive to different kinds of intermediate obstacles 300; e.g., a camera may not be able to detect the reference object 120a, 120b, 120c due to darkness, while darkness is no obstacle for a laser or lidar based sensor; a transparent glass surface (vehicle window) may block infrared and ultraviolet signals of a PIR, but not visible light used by a camera/ video camera, etc.
The control arrangement 200 may be configured to detect that the reference object 120a, 120b, 120c cannot be observed due to the intermediate obstacle 300.
The detection may be made e.g. by receiving information via wireless communication from the transmitter 220 associated with the reference object 120a, 120b, 120c, concerning the position of the reference object 120a, 120b, 120c.
Upon detection of the intermediate obstacle 300, the sensor calibration may be postponed until a later moment when the intermediate obstacle 300 has left. Alternatively, only those sensors 130, 140a, 140b of the vehicle 100 that are able to detect the reference object 120a, 120b, 120c may be calibrated.
In case the intermediate obstacle 300 comprises e.g. darkness, the sensors 130, 140a, 140b comprising laser, lidar, ultrasound, and / or infrared light, etc., may be calibrated anyway, while camera sensors may not be calibrated.
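The sensor-type filtering described above can be sketched as a small lookup; the mapping below is an illustrative assumption derived from the examples in the text (darkness blocks cameras, vehicle-window glass blocks a PIR), not an exhaustive rule set from the publication:

```python
# Illustrative mapping of intermediate-obstacle kinds to the sensor
# categories they block (assumed for this example).
BLOCKED_BY = {
    "darkness": {"camera"},          # no obstacle for laser/lidar/ultrasound
    "glass": {"infrared"},           # PIR blocked, visible-light camera is not
    "fog": {"camera", "lidar"},
}

def calibratable_sensors(sensors, obstacle):
    """Return the sensor categories that may still be calibrated despite the
    detected intermediate obstacle; calibration of the rest is postponed."""
    blocked = BLOCKED_BY.get(obstacle, set())
    return [s for s in sensors if s not in blocked]
```

With this split, step 403 discontinues the method only for the blocked categories while the remaining sensors proceed to steps 404-407.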
Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100 for calibration of a vehicle sensor 130, 140a, 140b, which vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation.
The sensor calibration may be made based on detection of a reference object 120a, 120b, 120c. The reference object 120a, 120b, 120c may comprise e.g. a traffic sign, a bus stop, a loading dock, a driveway, a garage entrance, a tunnel entrance, an edifice structure, etc.
In order to be able to correctly calibrate the vehicle sensors 130, 140a, 140b, the method 400 may comprise a number of steps 401-407. However, some of these steps 401-407 may be performed solely in some alternative embodiments, like e.g. steps 402-403. Further, the described steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:
Step 401 comprises obtaining wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c.
The obtained wireless signals may comprise information describing position and appearance of at least two distinct reference objects 120a, 120b, 120c.
The obtained wireless signal may comprise a confirmation that the reference object 120a, 120b, 120c has not been dislocated, based on impact detection observed by an impact monitor 230 arranged on the reference object 120a, 120b, 120c, configured to detect impact of the reference object 120a, 120b, 120c.

Step 402 may be performed only in some particular embodiments. Step 402 comprises detecting that the reference object 120a, 120b, 120c cannot be observed due to an intermediate obstacle 300, such as a vehicle, a structure, a weather phenomenon and/ or dirt on the sensor lens (in case the sensor 130, 140a, 140b is a camera, video camera or similar).
In some embodiments, the nature of the intermediate obstacle 300 may be determined based on the sensor signals, and/ or information obtained from environmental sensors, e.g. in other close-by vehicles, weather information services, rain sensor information, etc.
Step 403, which may be performed only in some particular embodiments, comprises discontinuing the method 400 for at least some sensor categories of the vehicle 100, when the obstacle 300 is detected 402.
In case the obstacle 300 is determined to be dirt on the lens, the vehicle 100 may be ordered to execute a sensor lens cleaning measure, and / or to pass a carwash or other cleaning facility.
Step 404 comprises detecting the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b.
The sensor 130, 140a, 140b may detect at least two distinct reference objects 120a, 120b, 120c, in some embodiments.
The at least two distinct reference objects 120a, 120b, 120c may be situated in different directions, at different heights, at different distances and be dedicated towards different categories of sensors in some embodiments.
Step 405 comprises determining position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection 404 of the reference object 120a, 120b, 120c.
The positions and appearances of the reference objects 120a, 120b, 120c may be determined, based on the sensor detection 404 of the reference objects 120a, 120b, 120c, in some embodiments.
Different sensor types of the vehicle sensors 130, 140a, 140b may determine different aspects of the position/ appearance of the reference object 120a, 120b, 120c. For example, a sensor based on laser or lidar may determine distance to the reference object 120a, 120b, 120c, while a vehicle sensor 130, 140a, 140b embodied as a camera may determine colour and/ or shape of the reference object 120a, 120b, 120c.
Step 406 comprises comparing the determined 405 position and appearance of the reference object 120a, 120b, 120c, with the obtained 401 position and appearance of the reference object 120a, 120b, 120c.
The comparison may in some embodiments be made between the determined 405 positions and appearances of the plurality of reference objects 120a, 120b, 120c and the obtained 401 positions and appearances of the plurality of reference objects 120a, 120b, 120c.
Step 407 comprises calibrating the sensor 130, 140a, 140b based on the obtained 401 position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined 405 position and appearance of the reference object 120a, 120b, 120c and the obtained 401 position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
The calibration of the sensor 130, 140a, 140b may be made, based on the obtained 401 position and appearance of the plurality of reference objects 120a, 120b, 120c.
This process may be repeated until no misalignment exceeding the predetermined threshold limit is detected for any sensor 130, 140a, 140b in/ on the vehicle 100.
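The repeated detect-compare-calibrate cycle (steps 404-407) can be sketched with a toy one-dimensional sensor; the `ToySensor` class, its offset-bias model and all names are hypothetical illustrations, not the disclosed implementation:

```python
class ToySensor:
    """Hypothetical 1-D sensor whose measurements carry an unknown offset bias."""

    def __init__(self, bias):
        self.bias = bias

    def measure(self, true_position):
        return true_position + self.bias          # steps 404-405 (detect/determine)

    def calibrate(self, measured, obtained):
        # Align the sensor so the reference object is seen where broadcast.
        self.bias -= (measured - obtained)        # step 407

def calibration_cycle(sensor, true_position, obtained_position, threshold=0.01):
    """Repeat steps 404-407 until the misalignment against the broadcast
    (obtained) position no longer exceeds the threshold limit."""
    while True:
        measured = sensor.measure(true_position)
        if abs(measured - obtained_position) <= threshold:   # step 406
            return measured
        sensor.calibrate(measured, obtained_position)
```

In this toy model the broadcast position is taken as ground truth, so a single correction removes the bias; a real multi-sensor, multi-object calibration would iterate over all sensors until none exceeds its limit.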
In case calibration attempts according to step 407 fail, the reason may be that the sensor 130, 140a, 140b has to be replaced. The vehicle 100 may then be ordered to park in a convenient nearby location, and a request for assistance of a service technician may be outputted, as the sensor 130, 140a, 140b may have to be repaired or replaced.
Figure 5 illustrates an embodiment of a system 500 for calibration of a vehicle sensor 130, 140a, 140b in a vehicle 100.
The system 500 comprises the vehicle sensor 130, 140a, 140b, configured for measurements of electromagnetic radiation, comprised in the vehicle 100. Further, the system 500 may comprise a control arrangement 200 for calibration of a vehicle sensor 130, 140a, 140b. The control arrangement 200 is comprised in the vehicle 100. The vehicle sensor 130, 140a, 140b may be based on measurements of e.g.: visible light (camera, video camera); monochrome, coherent light (laser); microwaves (maser); infrared light (Passive Infrared Sensor (PIR)); pulsed laser light (lidar), etc.
The system 500 may perform at least some of the previously described steps 401-407 according to the method 400 described above and illustrated in Figure 4.
The control arrangement 200 is configured to obtain wireless signals comprising information describing position and appearance of a reference object 120a, 120b, 120c, via a receiver 150. Further, the control arrangement 200 is configured to detect the reference object 120a, 120b, 120c with the vehicle sensor 130, 140a, 140b. Also, the control arrangement 200 is configured to determine position and appearance of the reference object 120a, 120b, 120c, based on the sensor detection of the reference object 120a, 120b, 120c. The control arrangement 200 is additionally configured to compare the determined position and appearance of the reference object 120a, 120b, 120c, with the obtained position and appearance of the reference object 120a, 120b, 120c. Furthermore, the control arrangement 200 is configured to calibrate the sensor 130, 140a, 140b based on the obtained position and appearance of the reference object 120a, 120b, 120c, when the difference between the determined position and appearance of the reference object 120a, 120b, 120c and the obtained position and appearance of the reference object 120a, 120b, 120c exceeds a threshold limit.
The control arrangement 200 may comprise a processing circuit 520 configured to perform at least some of the method 400 according to method steps 401 -407, in some embodiments.
Such processing circuit 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
The control arrangement 200 also comprises a receiving circuit 510 configured for receiving a signal from the sensor 130, 140a, 140b and/ or the transmitter 220.
Furthermore, the control arrangement 200 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 200 may comprise a signal transmitter 530. The signal transmitter 530 may be configured for transmitting a control signal to be received by one or several motors or similar directing devices for directing and calibrating the vehicle sensor 130, 140a, 140b.
The system 500 additionally comprises a reference object 120a, 120b, 120c for assisting the control arrangement 200 in the vehicle 100 in calibration of a vehicle sensor 130, 140a, 140b. The vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation. The reference object 120a, 120b, 120c comprises a memory device 210, comprising information describing position and appearance of the reference object 120a, 120b, 120c. Further, the system 500 comprises a transmitter 220, configured to broadcast wireless signals comprising the information describing position and appearance of the reference object 120a, 120b, 120c of the memory device 210. The transmitter 220 may be arranged at the reference object 120a, 120b, 120c, or in the vicinity of the reference object 120a, 120b, 120c. The system 500 also comprises a receiver 150 in the vehicle 100, configured to receive the information describing position and appearance of the reference object 120a, 120b, 120c from the transmitter 220.
However, in some alternative embodiments, the system 500 may comprise additional units for performing the method 400 according to steps 401-407.
The above described method steps 401-407 to be performed in the system 500 may be implemented through the one or more processing circuits 520 within the control arrangement 200, together with a computer program product for performing at least some of the functions of the method steps 401-407. Thus, a computer program product comprising instructions for performing the method steps 401-407 in the control arrangement 200 may perform the method 400 comprising at least some of the steps 401-407 for calibration of a vehicle sensor 130, 140a, 140b, which vehicle sensor 130, 140a, 140b is based on measurements of electromagnetic radiation.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the method steps 401-407 according to some embodiments when being loaded into the one or more processing circuits 520 of the control arrangement 200. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 200 remotely, e.g., over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400; the control arrangement 200; the computer program; the system 500 and/ or the vehicle 100. Various changes, substitutions and/ or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term "or", as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and / or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and / or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via the Internet or other wired or wireless communication systems.

Claims

PATENT CLAIMS
1. A method (400) in a vehicle (100) for calibration of a vehicle sensor (130, 140a, 140b), which vehicle sensor (130, 140a, 140b) is based on measurements of electromagnetic radiation, wherein the method (400) comprises:
obtaining (401) wireless signals comprising information describing position and appearance of a reference object (120a, 120b, 120c);
detecting (404) the reference object (120a, 120b, 120c) with the vehicle sensor (130, 140a, 140b);
determining (405) position and appearance of the reference object (120a, 120b, 120c), based on the sensor detection (404) of the reference object (120a, 120b, 120c); comparing (406) the determined (405) position and appearance of the reference object (120a, 120b, 120c), with the obtained (401) position and appearance of the reference object (120a, 120b, 120c); and
calibrating (407) the sensor (130, 140a, 140b) based on the obtained (401) position and appearance of the reference object (120a, 120b, 120c), when the difference between the determined (405) position and appearance of the reference object (120a, 120b, 120c) and the obtained (401) position and appearance of the reference object (120a, 120b, 120c) exceeds a threshold limit.
2. The method (400) according to claim 1, wherein:
the obtained (401) wireless signals comprise information describing position and appearance of at least two distinct reference objects (120a, 120b, 120c);
the sensor (130, 140a, 140b) detects (404) the at least two reference objects (120a, 120b, 120c);
the positions and appearances of the reference objects (120a, 120b, 120c) are determined (405), based on the sensor detection (404) of the reference objects (120a, 120b, 120c);
the comparison (406) is made between the determined (405) positions and appearances of the reference objects (120a, 120b, 120c), with the obtained (401) positions and appearances of the reference objects (120a, 120b, 120c); and
the calibration (407) of the sensor (130, 140a, 140b) is made, based on the obtained (401) positions and appearances of the reference objects (120a, 120b, 120c).
3. The method (400) according to any one of claim 1 or claim 2, wherein the obtained (401) wireless signal comprises a confirmation that the reference object (120a, 120b, 120c) has not been dislocated.
4. The method (400) according to any one of claims 1-3, further comprising:
detecting (402) that the reference object (120a, 120b, 120c) cannot be observed due to an intermediate obstacle (300); and
discontinuing (403) the method (400) for at least some sensor categories of the vehicle (100) when the obstacle (300) is detected (402).
5. A control arrangement (200) in a vehicle (100) for calibration of a vehicle sensor (130, 140a, 140b), which vehicle sensor (130, 140a, 140b) is based on measurements of electromagnetic radiation, wherein the control arrangement (200) is configured to:
obtain wireless signals comprising information describing position and appearance of a reference object (120a, 120b, 120c), via a receiver (150);
detect the reference object (120a, 120b, 120c) with the vehicle sensor (130, 140a,
140b);
determine position and appearance of the reference object (120a, 120b, 120c), based on the sensor detection of the reference object (120a, 120b, 120c);
compare the determined position and appearance of the reference object (120a, 120b, 120c), with the obtained position and appearance of the reference object (120a, 120b, 120c); and
calibrate the sensor (130, 140a, 140b) based on the obtained position and appearance of the reference object (120a, 120b, 120c), when the difference between the determined position and appearance of the reference object (120a, 120b, 120c) and the obtained position and appearance of the reference object (120a, 120b, 120c) exceeds a threshold limit.
6. The control arrangement (200) according to claim 5, further configured to
obtain wireless signals comprising information describing position and appearance of at least two distinct reference objects (120a, 120b, 120c), via the receiver (150);
detect the at least two reference objects (120a, 120b, 120c) via the sensor (130, 140a, 140b);
determine the positions and appearances of the reference objects (120a, 120b, 120c), based on the sensor detection of the reference objects (120a, 120b, 120c);
compare the determined positions and appearances of the reference objects (120a, 120b, 120c), with the obtained positions and appearances of the reference objects (120a, 120b, 120c); and
calibrate the sensor (130, 140a, 140b), based on the obtained positions and appearances of the reference objects (120a, 120b, 120c).
7. The control arrangement (200) according to any one of claim 5 or claim 6, further configured to obtain a wireless signal comprising a confirmation that the reference object (120a, 120b, 120c) has not been dislocated.
8. The control arrangement (200) according to any one of claims 5-7, further configured to detect that the reference object (120a, 120b, 120c) cannot be observed due to an intermediate obstacle (300); and
discontinue the calibration of the vehicle sensor (130, 140a, 140b) for at least some sensor categories of the vehicle (100), when the obstacle (300) is detected.
9. A computer program comprising program code for performing a method (400) according to any of claims 1-4 when the computer program is executed in a processing circuit (520) of the control arrangement (200), according to any one of claims 5-8.
10. A vehicle (100) comprising a control arrangement (200), according to any one of claims 5-8.
11. A reference object (120a, 120b) for assisting a control arrangement (200) according to any one of claims 5-8 in calibration of a vehicle sensor (130, 140a, 140b), which vehicle sensor (130, 140a, 140b) is based on measurements of electromagnetic radiation, wherein the reference object (120a, 120b, 120c) comprises:
a memory device (210), comprising information describing position and appearance of the reference object (120a, 120b, 120c); and
a transmitter (220), configured to broadcast wireless signals comprising the information describing position and appearance of the reference object (120a, 120b, 120c) of the memory device (210).
12. The reference object (120a, 120b, 120c) according to claim 11, further comprising: an impact monitor (230), configured to detect impact of the reference object (120a, 120b, 120c); and wherein the transmitter (220) is further configured to emit a confirmation that the reference object (120a, 120b, 120c) has not been dislocated when the impact monitor (230) has not detected any impact.
13. A system (500) for calibration of a vehicle sensor (130, 140a, 140b), which system (500) comprises:
the vehicle sensor (130, 140a, 140b), configured for measurements of electromag netic radiation, comprised in a vehicle (100); a control arrangement (200), according to any one of claims 5-8;
a reference object (120a, 120b, 120c), according to any one of claims 1 1 -12; and a transmitter (220), configured to broadcast wireless signals comprising the infor mation describing position and appearance of the reference object (120a, 120b, 120c); and a receiver (150) in the vehicle (100), configured to receive the information describing position and appearance of the reference object (120a, 120b, 120c) from the transmitter
PCT/SE2019/050244 2018-03-20 2019-03-19 Method, control arrangement and reference object for calibration of sensors in an autonomous vehicle WO2019182499A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112019000846.3T DE112019000846T5 (en) 2018-03-20 2019-03-19 Method, control arrangement and reference object for calibrating sensors in an autonomous vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1850314-4 2018-03-20
SE1850314A SE1850314A1 (en) 2018-03-20 2018-03-20 Method, control arrangement and reference object for calibration of sensors

Publications (1)

Publication Number Publication Date
WO2019182499A1 true WO2019182499A1 (en) 2019-09-26

Family

ID=67986332

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2019/050244 WO2019182499A1 (en) 2018-03-20 2019-03-19 Method, control arrangement and reference object for calibration of sensors in an autonomous vehicle

Country Status (3)

Country Link
DE (1) DE112019000846T5 (en)
SE (1) SE1850314A1 (en)
WO (1) WO2019182499A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020213980A1 (en) * 2020-11-06 2022-05-12 Robert Bosch Gesellschaft mit beschränkter Haftung Calibration method and calibration system for a vehicle sensor

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3697941A (en) * 1971-07-01 1972-10-10 Devenco Inc Vehicle location system and method
JPH0779183A (en) * 1993-09-07 1995-03-20 Sumitomo Electric Ind Ltd Road side beacon equipment and beacon system using the road side beacon equipment
JPH10311731A (en) * 1997-05-09 1998-11-24 Japan Radio Co Ltd Road traffic information display device and method
US9719801B1 (en) * 2013-07-23 2017-08-01 Waymo Llc Methods and systems for calibrating sensors using road map data
CN206954219U (en) * 2017-01-13 2018-02-02 驭势科技(北京)有限公司 Intelligent automobile multisensor self-checking system, accessory system and intelligent automobile
EP3410145A1 (en) * 2017-05-31 2018-12-05 Valeo Schalter und Sensoren GmbH Method for calibrating a radar sensor of a motor vehicle during a movement of the motor vehicle, radar sensor, driver assistance system and motor vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210232149A1 (en) * 2018-10-16 2021-07-29 Brain Corporation Systems and methods for persistent mapping of environmental parameters using a centralized cloud server and a robotic network
US20220239159A1 (en) * 2021-01-26 2022-07-28 Tdk Corporation Object detection apparatus, power transmission apparatus, and power transmission system
US11616405B2 (en) * 2021-01-26 2023-03-28 Tdk Corporation Object detection apparatus, power transmission apparatus, and power transmission system

Also Published As

Publication number Publication date
DE112019000846T5 (en) 2020-10-29
SE1850314A1 (en) 2019-09-21

Similar Documents

Publication Publication Date Title
EP3784989B1 (en) Systems and methods for autonomous vehicle navigation
US10569768B2 (en) Information processing apparatus, information processing method, and non-transitory recording medium
CN109070890B (en) Method and control unit in a vehicle for estimating the extension of a road based on a set of trajectories of a further vehicle
WO2019182499A1 (en) Method, control arrangement and reference object for calibration of sensors in an autonomous vehicle
US10890453B2 (en) Vehicle localization device
US11208085B2 (en) Automotive braking control system, apparatus, and method considering weather condition
RU2611289C2 (en) System for vehicles positioning and direction
EP3913328A1 (en) Vehicle positioning apparatus, system and method, and vehicle
US11514790B2 (en) Collaborative perception for autonomous vehicles
US11099264B2 (en) Variable range and frame-rate radar operation for automated vehicle
US11852742B2 (en) Method for generating a map of the surroundings of a vehicle
KR101865254B1 (en) Radar apparatus for monitoring a vehicle traveling the blind spot and method thereof
US20200191951A1 (en) Under vehicle inspection
WO2015009218A1 (en) Determination of lane position
US20220229153A1 (en) Abnormality diagnosis system
KR20190105613A (en) Method and control unit for ground bearing analysis
WO2020055804A1 (en) Intelligent vehicular system for reducing roadway degradation
WO2021040604A1 (en) Method and control arrangement for autonomy enabling infra-structure features
US11435191B2 (en) Method and device for determining a highly precise position and for operating an automated vehicle
CN116575371A (en) Anti-smashing and anti-misoperation barrier gate opening system and method
GB2611114A (en) Calibration courses and targets
SE544728C2 (en) Method and control arrangement for estimating relevance of location-based information of another vehicle
WO2023150465A1 (en) Methods and systems for measuring sensor visibility
SE2050647A1 (en) Method, control arrangement and reference object for calibration of sensors
CN118212805A (en) Method and apparatus for assisting right turn of vehicle based on UWB communication at intersection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19772160

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19772160

Country of ref document: EP

Kind code of ref document: A1