SE1951476A1 - Method and control arrangement for relational position displacement between two bodies of a multibody vehicle - Google Patents

Method and control arrangement for relational position displacement between two bodies of a multibody vehicle

Info

Publication number
SE1951476A1
Authority
SE
Sweden
Prior art keywords
sensor
marker
relative position
control arrangement
position displacement
Prior art date
Application number
SE1951476A
Inventor
Mikael Johansson
Nazre Batool
Original Assignee
Scania Cv Ab
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1951476A
Publication of SE1951476A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D 53/00: Tractor-trailer combinations; Road trains
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method (500) and a control arrangement (120) for estimating relative position displacement (Δ) between a first body (100a) and a second body (100b) of a multibody vehicle (110). The control arrangement (120) is configured to obtain a sensor measurement (400b) from a sensor (130) on the first body (100a), of a marker (140) on the second body (100b). The control arrangement (120) is also configured to estimate relative position displacement (Δ) between the first body (100a) and the second body (100b), based on the obtained sensor measurement (400b) and a reference position (400a) of the marker (140).

Description

METHOD AND CONTROL ARRANGEMENT FOR RELATIONAL POSITION DISPLACEMENT BETWEEN TWO BODIES OF A MULTIBODY VEHICLE

TECHNICAL FIELD

This document relates to a method and a control arrangement of a vehicle. More particularly, a method and a control arrangement are described for estimating relative position displacement between a first body and a second body of a multibody vehicle.
BACKGROUND

A vehicle, such as e.g. a truck, may have sensors mounted both on the chassis and the cabin to gather data from areas of interest around the vehicle. On a truck the cabin is mounted to the chassis via a suspension that, to some extent, allows the cabin and chassis to move independently from each other. This relative movement between chassis and cabin results in an ever-changing spatial relationship between chassis- and cabin-mounted sensors, which in turn makes the fusion of readings from cabin- and chassis-mounted sensors difficult. The motion itself, especially the cabin motion, also introduces noise to sensor data. For example, when observing detections from a radar or lidar, the motion of the sensor may cause reflections from the very same object to appear in different parts of the sensor field of view. Hence, to compensate for these effects it is valuable to know the cabin motion with respect to the chassis.
Thus, the dynamic behaviour of the interconnected bodies of a multibody vehicle leads to problems when it comes to synchronisation of sensor detections from different sensors on different bodies of the multibody vehicle.
Two main approaches are known for measuring or estimating the cabin motion/position in relation to the chassis:

Using IMU(s) (Inertial Measurement Units) to measure the accelerations of the cabin and chassis and use that information to reconstruct the motions and extrapolate/interpolate the exact locations at a given time.
Using visual odometry (or lidar odometry) to reconstruct the motions and from this calculate the positions at a given time. However, no method has been disclosed for solving the herein described problems.

IMUs with precise accuracy are expensive and might still not capture the motion correctly, given the wide spectrum of motion frequencies. Another downside with this approach is that a related motion model is also introduced, which introduces further inaccuracy, especially if the sample time is too slow. Typically, IMUs tend to drift over time, introducing a growing bias error. Another drawback is the lack of real-time information, in case one has to wait for an IMU reading to be able to estimate the position at a certain time.
The visual odometry method depends on finding salient image features to track from frame to frame in order to backtrack the position. There will be situations where either no such features are found, or they are found only at considerable computational cost, in which case a real-time estimation of the position will be difficult. The accuracy of the output from such a method will also be highly correlated with the sensor calibration accuracy, especially at larger distances to the tracked features. Finally, such a method is quite complex and hard to master.
Document US20190066323 describes a method for motion/position tracking of an excavator's bucket by placing an optical target on the bucket and tracking the motion with a single image sensor placed in a cabin. However, the motion tracking is made for the bucket, not for the chassis with regard to the cabin. The method is not intended for synchronising sensors on different bodies.
Document US20180372884 shows an alternative method to measure the relative position between truck cabin and chassis by using a satellite-based navigation device. The method is not intended for synchronising sensors on different bodies, but for determining the position of the vehicle.
None of the discussed documents presents a convenient solution to the problem of continuously determining relative position displacement between a first body and a second body of a multibody system, enabling sensor synchronisation between sensors on the different vehicle bodies. It would thus be desirable to improve estimation of relative position displacement between different vehicle bodies comprised in a multibody system.
SUMMARY

It would be advantageous to achieve a solution overcoming, or at least alleviating, at least some of the above-mentioned drawbacks. In particular, it would be desirable to improve estimation of relative position displacement between vehicle bodies of a multibody vehicle. To better address one or more of these concerns, a control arrangement and a method having the features defined in the independent claims are provided.
According to a first aspect, this objective is achieved by a control arrangement for estimating relative position displacement between a first body and a second body of a multibody vehicle. The control arrangement is configured to obtain a sensor measurement from a sensor on the first body, of a marker on the second body. Also, the control arrangement is configured to estimate relative position displacement between the first body and the second body, based on the obtained sensor measurement and a reference position of the marker.
According to a second aspect, this objective is achieved by a method in a control arrangement for estimating relative position displacement between a first body and a second body of a multibody vehicle. The method comprises the step of obtaining a sensor measurement from a sensor on the first body, of a marker on the second body. Further, the method also comprises estimating relative position displacement between the first body and the second body, based on the obtained sensor measurement and a reference position of the marker.
Thanks to the described aspects, it becomes possible to estimate cabin position/motion in relation to the chassis of the vehicle, or between two or more other bodies of a multibody vehicle for that matter. By using the sensor on one of the bodies to detect and track movements over time of the marker situated on the other body, the relative position/motion can be estimated. Synchronisation may then be made of sensor detections made with sensors situated on the first body and the second body respectively, based on the relative position/motion estimation. Hereby, sensor detections of the environment are improved, which may be valuable in particular for autonomous vehicles, leading to improved detection of obstacles, etc., reducing the risk of an accident. Alternatively, the vehicle may drive faster at the same safety level as according to prior art. Thus, increased traffic safety is achieved.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1A illustrates a multibody vehicle comprising a cabin and a chassis according to an embodiment of the invention;
Figure 1B illustrates a multibody vehicle comprising a tractor and a trailer according to an embodiment of the invention;
Figure 1C illustrates a multibody vehicle comprising a tractor and a trailer according to an embodiment of the invention;
Figure 1D illustrates a multibody vehicle comprising a tractor and a trailer according to an embodiment of the invention;
Figure 1E illustrates a multibody vehicle comprising a tractor and a trailer according to an embodiment of the invention;
Figure 1F illustrates a multibody vehicle comprising a tractor and a trailer according to an embodiment of the invention;
Figure 2 illustrates various embodiments of a marker on the second body;
Figure 3A illustrates a vehicle combination comprising a tractor and a trailer according to an embodiment of the invention;
Figure 3B illustrates a vehicle combination comprising a tractor and a trailer as regarded from above, according to an embodiment of the invention;
Figure 4A illustrates an example of displacement between a sensor measurement and a reference position of the marker;
Figure 4B illustrates an example of six degrees of freedom;
Figure 5 is a flow chart illustrating an embodiment of the method;
Figure 6 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION

Embodiments of the solution described herein are defined as a method and a control arrangement, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates an embodiment in which a first body 100a in the form of a vehicle cabin, and a second body 100b in the form of a vehicle chassis, together form a multibody vehicle 110 driving on a road 105.
The multibody vehicle 110 comprises a control arrangement 120. In the illustrated embodiment of Figure 1A, the control arrangement 120 may be comprised in/on the first body 100a. However, in other embodiments, the control arrangement 120 may be comprised in/on the second body 100b, or comprised neither in the first body 100a, the second body 100b, nor the multibody vehicle 110.
The control arrangement 120 may comprise e.g. one or several Electronic Control Units (ECUs), typically a plurality of interacting ECUs. The control arrangement 120 may comprise a digital computer that controls one or more electrical systems, or electrical subsystems, of the multibody vehicle 110, based on e.g. information read from the sensors placed at various parts and in different components of the multibody vehicle 110. ECU is a generic term that is often used in automotive electronics for any embedded system that controls one or more of the electrical systems or subsystems in the multibody vehicle 110. The control arrangement 120 may be particularly designated to implement estimation of relative position displacement between the first body 100a and the second body 100b of the multibody vehicle 110.
The first body 100a also comprises a sensor 130 and the second body 100b comprises a marker 140.
The sensor 130 may comprise a visual sensor such as a camera, a stereo camera, an infrared camera, a video camera, or a similar device, in different embodiments. Other sensor types such as e.g. a radar, a lidar, an ultrasound device, or a time-of-flight camera may be used in some alternative embodiments, or in addition to the visual sensor.

In an example, the sensor 130 may comprise a rearward looking sensor/s, for example integrated in, or alternatively replacing, a rearward looking mirror.
The marker 140 may be particularly designed for being detectable by the sensor 130 and may comprise a photogrammetry target/code/marker.
The photogrammetry coded marker 140 may easily be detected by the sensor 130 and be monitored in camera frames in real-time. From the known size, location and shape of the marker 140, in combination with the known intrinsic parameters of the sensor 130, the exact pose of the sensor 130 with respect to the coded marker 140, and hence the relative displacement between the bodies 100a, 100b of the multibody vehicle 110, can be determined, by knowing the exact location of the sensor 130 on the first body 100a.
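As a simplified illustration of this geometric principle, the depth to the marker can be recovered from its known physical size and its apparent size in the image via the pinhole camera model; the marker centre can then be back-projected to metric coordinates. The sketch below is an assumption for illustration only (function names, parameters and values are not taken from the patent), and a full pose estimate would additionally solve for orientation from several marker points, e.g. with a perspective-n-point solver.

```python
import math


def marker_pose_from_image(u, v, pixel_width, marker_width_m,
                           fx, fy, cx, cy):
    """Estimate the marker centre position in the camera frame from a
    single image observation, using the pinhole camera model.

    u, v           -- pixel coordinates of the marker centre
    pixel_width    -- apparent width of the marker in pixels
    marker_width_m -- known physical width of the marker (metres)
    fx, fy, cx, cy -- camera intrinsics (focal lengths, principal point)
    """
    # Depth from the known physical size and the apparent size:
    # pixel_width = fx * marker_width_m / Z
    z = fx * marker_width_m / pixel_width
    # Back-project the marker centre to metric camera coordinates.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)
```

The size-to-depth relation is the core idea: as the cabin moves relative to the chassis, the marker's apparent position and size in the camera frame change, and the change maps directly to a metric displacement.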
The first body 100a and the second body 100b may be approximated as rigid bodies for which deformation may be zero, or alternatively so small that it can be neglected.
For increased robustness and accuracy, multiple sensors 130 and/or markers 140 can be used. If possible, an existing sensor 130 on any of the bodies 100a, 100b of the multibody vehicle 110, i.e. an existing sensor 130 which is primarily intended for another usage, may be used, as long as a large enough part of the other body 100a, 100b is visible in the field of view to fit the marker 140.

In the illustrated embodiment, the sensor 130 is arranged on the cabin 100a and the marker 140 is arranged on the vehicle chassis 100b. However, the situation may be the opposite in other embodiments, i.e. having the sensor 130 arranged on the vehicle chassis 100b and the marker 140 arranged on the cabin 100a.
Hereby, any movement and/or displacement between the first body 100a and the second body 100b of the multibody vehicle 110 is detectable by the sensor 130 by monitoring the marker 140, mounted on a specific point on the other body 100a, 100b of the multibody vehicle 110. Thus, the reference point, on which the marker 140 is arranged on the chassis 100b, is tracked by the sensor 130 in camera frames.
Hereby, a solution is provided to the above described problem of cabin 100a versus chassis 100b movement with high accuracy, at a potentially low cost, while dealing with the sensor synchronisation problems.
To deal with potential blurring of the image in highly dynamic motions, a high frame rate camera may be used as sensor 130 in some embodiments.
The high frame rate camera may be a device capable of capturing moving images with exposures of less than approximately 1/1000 second, or frame rates in excess of about 250 frames per second.
By measuring relative position displacement between the various bodies 100a, 100b of the multibody vehicle 110, it becomes possible to synchronise sensor readings of the environment from sensors 130 situated on the different bodies 100a, 100b by compensating for the detected displacement.
An advantage provided by the disclosed solution is that no time synchronisation may be required when the position is extracted from the same sensor 130/camera image as is used for visual feature extraction for the camera perception.
Further, better accuracy in the displacement estimation is achieved, since no errors are introduced through time compensation estimation or motion extrapolation. Also, less computational effort is required in comparison with previously known displacement estimation methods.
When already existing sensors 130 are used, the provided solution may be applied without additional hardware costs involved. In addition, the disclosed solution is accurate and robust in comparison with previously known solutions based on inertial measurement units (IMUs), such as accelerometers.
Figure 1B illustrates a scenario with a powered vehicle/tractor 100a and a semi-trailer 100b, together forming the vehicle combination/multibody vehicle 110 driving on the road 105. The vehicle combination 110 may comprise one or several trailers/cargo carrying entities 100b, such as e.g. 2, 3, etc.
The tractor 100a/multibody vehicle 110 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the tractor 100a is subsequently described as having a driver.
The semi-trailer 100b may also be referred to as a trailer in some particular embodiments.
The semi-trailer 100b may typically be unpowered but may also be powered in some embodiments. The semi-trailer 100b is normally unmanned but may in some particular embodiments be manned. Further, the semi-trailer 100b may be autonomous in some particular embodiments, configured for making at least minor position adjustments, e.g. adjusting the geographical position at a loading bay and/or adjusting the height of the coupling device.
The tractor 100a and the semi-trailer 100b may exchange information over a datalink, e.g. via a bus such as e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the datalink may alternatively be made over a wireless connection comprising, or at least inspired by, wireless communication technology such as Wi-Fi, Ethernet, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), or optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.

In some embodiments, the communication between the tractor 100a and the semi-trailer 100b in the multibody vehicle 110 may be performed via Vehicle-to-Vehicle (V2V) communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC may work in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m in some embodiments.
The wireless communication may alternatively be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
The tractor 100a comprises an anterior coupling device 101 and the semi-trailer 100b comprises a rear coupling device 111. The multibody vehicle 110 is composed by attaching the anterior coupling device 101 of the tractor 100a to the rear coupling device 111 of the semi-trailer 100b. The coupling devices 101, 111 may be referred to as a towing coupling/towing eye/king pin. Other mechanical arrangements may be used for attaching the semi-trailer 100b to the tractor 100a in other embodiments.
The respective coupling devices 101, 111 may have different heights; also, the tractor 100a and the semi-trailer 100b may have been made by different manufacturers. The tractor 100a may not know the height of the semi-trailer 100b.
Figure 1C illustrates an example of a multibody vehicle 110 formed by a tractor 100a and a semi-trailer 100b. The semi-trailer 100b has been attached to the tractor 100a with an inclination α in relation to the road 105.
The inclination α of the semi-trailer 100b causes the upper front end of the semi-trailer 100b to form a maximum height point of the multibody vehicle 110, with a maximum height.
Figure 1D illustrates yet an example of a multibody vehicle 110 formed by a tractor 100a and a semi-trailer 100b.
The semi-trailer 100b has been deformed, momentarily or permanently, due to heavy weight. Typically, a U-shape of the semi-trailer 100b may be assumed. The deformation profile may be estimated e.g. by a polynomial function, such as a second-degree polynomial function, and/or by selecting a predetermined load case. Based thereupon, the maximum height point of the multibody vehicle 110 may be identified; typically, the upper front end of the semi-trailer 100b may be determined and the maximum height of the multibody vehicle 110 may be estimated.
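As an illustrative sketch of how such a second-degree polynomial deformation profile could be estimated (the function names and sample values below are assumptions for illustration, not taken from the patent), three height samples along the trailer can be interpolated by a quadratic, which can then be evaluated to locate the maximum height point:

```python
def quadratic_through(p0, p1, p2):
    """Return coefficients (a, b, c) of the second-degree polynomial
    h(x) = a*x**2 + b*x + c passing through the three measured points
    p0, p1, p2, each given as (position_along_trailer, height)."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    # Newton divided differences for the interpolating quadratic.
    d1 = (y1 - y0) / (x1 - x0)
    d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)
    a = d2
    b = d1 - d2 * (x0 + x1)
    c = y0 - d1 * x0 + d2 * x0 * x1
    return a, b, c


def height_at(coeffs, x):
    """Evaluate the fitted deformation profile at position x."""
    a, b, c = coeffs
    return a * x * x + b * x + c
```

For a U-shaped (sagging) trailer the fitted parabola opens upwards, so the maximum height is found at one of the trailer ends, consistent with identifying the upper front end as the maximum height point.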
Figure 1E illustrates yet an example of a multibody vehicle 110 formed by a tractor 100a and a semi-trailer 100b, driving on an inclined road segment 105.
Due to the inclined road, the maximum height point may be identified as the upper front end of the semi-trailer 100b, having the maximum height.
Figure 1F illustrates yet an example of a multibody vehicle 110 formed by a tractor, a first body 100a comprising a first trailer, and a second body 100b comprising a second trailer.
The multibody vehicle 110 may comprise multiple bodies 100a, 100b and, thanks to the disclosed solution, sensor detections made by sensors 130 on the various multiple bodies 100a, 100b of the multibody vehicle 110 may be synchronised.
The different versions or examples of Figures 1A-1F may in some embodiments be combined.
Figure 2 illustrates some examples of different markers 140, such as photogrammetry coded targets.
The marker 140 may comprise distinctive visual patterns designed to be automatically detected and measured in camera images by a sensor 130. There are various per se known visual target designs that potentially may be used, preferably having the properties of low similarity with common environmental features, robust disambiguation between multiple instances, usually by means of signals encoded in the marker 140, and being prepared for a cheap and robust detection algorithm.
The design of the marker 140 may be for example square, circular, based on concentric circles, etc., in black and white. By providing white dots in a black annulus, a robust target detection and decoding is provided, in some embodiments. In other embodiments, the marker 140 may comprise one or several colours.
The encoding of the marker 140 may enable identification and/or orientation of the marker 140 upon detection by the sensor 130.
Figure 3A illustrates yet a multibody vehicle 110 formed by a first body 100a and a second body 100b, driving on a road segment 105 in a driving direction 305.
The first body 100a may for example comprise a vehicle cabin while the second body 100b may comprise a vehicle chassis. In other embodiments, the first body 100a may comprise a tractor while the second body 100b comprises a semi-trailer.
Figure 3B illustrates the multibody vehicle 110 of Figure 3A as regarded from above.
The first body 100a in this embodiment comprises rearward looking sensors 130a, 130b, for example integrated in, or alternatively replacing, a rearward looking mirror.
The sensors 130a, 130b comprise a device, module, or subsystem whose purpose is to detect events or changes in its environment and send the information to other electronics within or outside the first body 100a, such as a control arrangement of the first body 100a/multibody vehicle 110.
The second body 100b comprises at least one marker 140a, 140b, dedicated to being recognised by at least one of the sensors 130a, 130b. Thereby, the relative displacement between the first body 100a and the second body 100b can be determined.
The sensor 130a, 130b may comprise e.g. a lidar, a camera, a stereo camera, an infrared camera, a video camera, a radar, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments. Further, the sensor 130a, 130b may comprise a plurality of different sensors of the same or different types.
Figure 4A illustrates an example of displacement Δ between a sensor measurement of a sensor 130 situated on a first body 100a and a reference position of a marker 140 situated on a second body 100b, at two different moments in time t1, t2.
A first sensor measurement 400a of the marker 140 is made by the sensor 130 at a first moment in time t1. The first sensor measurement 400a may be regarded as a starting point or reference position.
At a second moment in time t2, a second sensor measurement 400b of the marker 140 is performed by the sensor 130.
Based on the made measurements 400a, 400b, the relative position displacement Δ between the first body 100a and the second body 100b may be estimated, by using the first sensor measurement 400a as a reference position of the marker 140 (and thereby also of the second body 100b on which the marker 140 is situated) and then making a comparison between the first sensor measurement 400a and the second sensor measurement 400b.
The estimated relative position displacement Δ between the bodies 100a, 100b may then be utilised to synchronise sensor measurements of sensors 130 situated on the first body 100a and the second body 100b with each other, based on the estimated relative position displacement Δ.

In the embodiment illustrated in Figure 4A, only the relative position displacement Δ in one dimension is disclosed, for facilitating understanding of the concept. However, in a realistic scenario, the relative position displacement Δ between the first sensor measurement 400a and the second sensor measurement 400b occurs in six degrees of freedom, as illustrated in Figure 4B.
Six degrees of freedom (6DoF) refers to the freedom of movement of a rigid body in three-dimensional space. Specifically, the body is free to change position by forward/backward (surge), up/down (heave), and left/right (sway) translation along three perpendicular axes, combined with changes in orientation through rotation about three perpendicular axes, often termed yaw (normal axis), pitch (transverse axis), and roll (longitudinal axis).
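A 6DoF pose, and the displacement between a reference pose and a currently measured pose, can be represented with 4x4 homogeneous transforms. The minimal sketch below is an illustrative assumption (names and conventions are not from the patent text): it builds a pose from yaw-pitch-roll angles plus a translation, and computes the relative displacement as the transform taking the reference marker pose to the current one.

```python
import math


def pose_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 homogeneous transform from a 6DoF pose:
    translation (x, y, z) and rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]


def invert(T):
    """Invert a rigid transform: R^T on the rotation, -R^T t on the translation."""
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    ti = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]


def matmul(A, B):
    """Multiply two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]


def displacement(reference, current):
    """Relative displacement: the transform taking the reference marker
    pose (400a) to the currently measured marker pose (400b)."""
    return matmul(invert(reference), current)
```

If the marker has only heaved upwards between t1 and t2, the resulting displacement transform is a pure translation along that axis; rotations of the cabin show up in the 3x3 rotation part of the same matrix.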
Figure 5 illustrates an example of a method 500 according to an embodiment. The flow chart in Figure 5 shows the method 500 for use in a control arrangement 120 for estimating relative position displacement Δ between a first body 100a and a second body 100b of a multibody vehicle 110.
The multibody vehicle 110 may be e.g. a truck, a bus, a car, or a similar means of conveyance, comprising at least two bodies 100a, 100b that may move in relation to each other. At least one of the bodies 100a, 100b comprises at least one sensor 130 pointable towards at least one marker 140 situated on the other vehicle body 100a, 100b, in some embodiments simultaneously, shifted, or sequentially in time.

In order to correctly estimate the relative position displacement Δ between the bodies 100a, 100b, the method 500 may comprise a number of steps 501-505. However, some of these steps 501-505 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as e.g. steps 502, 503, and/or 505. Further, the described steps 501-505 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps:

Step 501 comprises obtaining a sensor measurement 400b from a sensor 130 on the first body 100a, of a marker 140 situated on a dedicated position on the second body 100b.
According to some embodiments, the sensor measurements 400b may be obtained from a plurality of sensors 130 of the first body 100a, of a respective marker 140 on the second body 100b of the multibody vehicle 110. Hereby, measurement precision may be improved, in some embodiments.
Further, in some particular embodiments, the sensor measurement 400b of the marker 140 made by the sensor 130 may also comprise identifying the marker 140 and/or extracting information from the marker 140 by image recognition/computer vision and object recognition.
Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
The image data of the sensor 130 may take many forms, such as e.g. images, video sequences, views from one or multiple cameras, or multi-dimensional data from a scanner.
By using encoded information on the marker 140 to identify the marker 140, the position of the marker 140 may be determined. Different parts of the bodies 100a, 100b may have different displacement in relation to the other body 100a, 100b. For example, the sensor 130 may be displaced upwards in relation to a marker 140 on the left side of the second body 100b while at the same time being displaced downwards in relation to a marker 140 on the right side of the second body 100b. By identifying the marker and associating it with a particular part of the second body 100b, this may be determined, and a corresponding adjustment of sensor signals obtained from sensors on the respective parts of the second body 100b may be made.
Step 502, which only may be comprised in some embodiments, comprises comparing the obtained 501 sensor measurement 400b with a reference position 400a of the marker 140.
The reference position 400a of the marker 140 may be a previously made sensor measurement of the marker 140, made by the sensor 130.

The reference position 400a of the marker 140 may be updated/cleaned/replaced by a new sensor measurement at regular time intervals. It is hereby avoided that dirt or wear occludes the marker 140.
Step 503, which only may be comprised in some embodiments, comprises calculating the relative position displacement Δ between the obtained 501 sensor measurement 400b and a reference position 400a of the marker 140 in six degrees of freedom.
Step 504 comprises estimating the relative position displacement A between the first body 100a and the second body 100b, based on the obtained 501 sensor measurement 400b and a reference position 400a of the marker 140.

In some embodiments wherein step 502 has been performed, the relative position displacement A between the first body 100a and the second body 100b may be estimated based on the made comparison 502.

In some embodiments wherein step 503 has been performed, the relative position displacement A between the first body 100a and the second body 100b may be estimated based on the made calculation 503.
The estimation of the relative position displacement A between the first body 100a and the second body 100b may be based on the obtained 501 plurality of sensor measurements 400b, in some embodiments.
When obtaining 501 several sensor measurements 400b, the precision and/ or reliability of the estimation is increased.
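A plain average over the per-marker estimates is one simple way to combine several measurements; weighting each estimate by sensor confidence would be a natural refinement. This sketch is an assumption, not a method prescribed by the document:

```python
# Sketch only: combining 6-DOF displacement estimates from several markers
# by a plain average. A real system might instead weight each estimate by
# the confidence of the detecting sensor.

def fuse_estimates(estimates):
    """Averages a non-empty list of 6-DOF displacement tuples."""
    n = len(estimates)
    return tuple(sum(e[i] for e in estimates) / n for i in range(6))
```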
Step 505, which only may be comprised in some embodiments, comprises synchronising sensor measurements of sensors 130 situated on the first body 100a and the second body 100b with each other based on the estimated 504 relative position displacement A between the first body 100a and the second body 100b.
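As a hedged illustration of what such alignment could look like, the translational part of the estimated displacement can be applied to points reported by sensors on the other body; rotation is deliberately omitted to keep the sketch short:

```python
# Hedged illustration of step 505: applying the translational part (x, y, z)
# of the estimated displacement to a point reported by a sensor on the other
# body, so measurements from both bodies share one frame. Rotation is
# omitted for brevity; a full implementation would apply it as well.

def align_point(point, displacement_6dof):
    """Shifts a 3-D point by the translational part of a 6-DOF displacement."""
    return tuple(p + d for p, d in zip(point, displacement_6dof[:3]))
```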
Hereby, sensor measurements of sensors 130 situated on the first body 100a and the second body 100b respectively may be aligned, leading to improved sensor detection of the environment of the multibody vehicle 110.

It may be an advantage to use this adjustment in particular for the sensor measurements of the sensor 130 that has been used for the displacement measurement, as no time synchronisation is required, since the position is extracted from the same camera image that is used both for visual feature extraction of the marker 140 and for the camera perception. Hereby computational efforts are saved and accuracy is improved, since no errors are introduced through time-compensation estimation or motion extrapolation.

In some alternative embodiments, the relative speed difference between the second body 100b in relation to the first body 100a may be calculated, based on signals received from the sensor 130. Hereby a prediction of a near-future relative displacement A between the first body 100a and the second body 100b may be made, compensating for lag.
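The lag-compensating prediction mentioned above could, for instance, use a finite-difference rate between two consecutive displacement estimates; this is a sketch under assumed inputs, not the document's specified method:

```python
# Sketch of the lag-compensating prediction: a finite-difference rate from
# two consecutive 6-DOF displacement estimates, extrapolated a short lead
# time ahead. Sampling interval and lead time are hypothetical parameters.

def predict_displacement(prev, curr, dt, lead_time):
    """Extrapolates the 6-DOF displacement `lead_time` seconds past `curr`,
    given the previous estimate `prev` taken `dt` seconds earlier."""
    rate = tuple((c - p) / dt for c, p in zip(curr, prev))
    return tuple(c + r * lead_time for c, r in zip(curr, rate))
```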
Figure 6 illustrates an embodiment of a system 600 in a multibody vehicle 110 comprising a first body 100a and a second body 100b. The first body 100a comprises a sensor 130 while the second body 100b comprises a marker 140.
The system 600 may perform at least some of the previously described steps 501-505 according to the method 500 described above and schematically illustrated in Figure 5.
The system 600 comprises at least one control arrangement 120 for the multibody vehicle 110. The control arrangement 120 may be situated on board the multibody vehicle 110; alternatively off board the multibody vehicle 110.
The control arrangement 120 aims at estimating the relative position displacement A between the first body 100a and the second body 100b of the multibody vehicle 110. Further, the control arrangement 120 is configured to obtain a sensor measurement 400b from a sensor 130 on the first body 100a, of a marker 140 on the second body 100b. Also, the control arrangement 120 is configured to estimate the relative position displacement A between the first body 100a and the second body 100b, based on the obtained sensor measurement 400b and a reference position 400a of the marker 140.

In some embodiments, the control arrangement 120 may be configured to synchronise sensor measurements of sensors 130 situated on the first body 100a and the second body 100b with each other based on the estimated relative position displacement A.

In yet some embodiments, the control arrangement 120 may be configured to compare the obtained sensor measurement 400b with a reference position 400a of the marker 140. Also, the estimation of the relative position displacement A between the first body 100a and the second body 100b may be based on the comparison.
The control arrangement 120 may furthermore be configured to calculate the relative position displacement A between the obtained sensor measurement 400b and a reference position 400a of the marker 140 in six degrees of freedom. Furthermore, the estimation of the relative position displacement A between the first body 100a and the second body 100b may be based on the calculation.

In yet some alternative embodiments, the control arrangement 120 may also be configured to obtain sensor measurements 400b from a plurality of sensors 130 on the first body 100a, of a respective marker 140 on the second body 100b. The control arrangement 120 may additionally be configured to estimate the relative position displacement A between the first body 100a and the second body 100b based on the obtained plurality of sensor measurements 400b and a respective reference position 400a of the markers 140.
The control arrangement 120 comprises a receiving circuit 610 configured for receiving a signal from the sensor 130.
Further, the control arrangement 120 comprises a processing circuitry 620 configured for performing at least some steps of the method 500, according to some embodiments.
Such processing circuitry 620 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus comprise a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control arrangement 120 may comprise a memory 625 in some embodiments. The optional memory 625 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 625 may comprise integrated circuits comprising silicon-based transistors. The memory 625 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 120 may comprise a signal transmitter 630 in some embodiments. The signal transmitter 630 may be configured for transmitting a signal to e.g. the sensor 130 for adjusting sensor measurements according to the relative position displacement A between the first body 100a and the second body 100b.

In addition, the system 600 also comprises at least one sensor 130 on the first body 100a of the multibody vehicle 110. The at least one sensor 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasonic sensor, a time-of-flight camera, or a thermal camera or similar. The at least one sensor 130 utilised for performing at least a part of the method 500 may in some embodiments have another main purpose than performing the method 500, i.e. it may already exist in the multibody vehicle 110.
The above described steps 501-505 to be performed in the multibody vehicle 110 may be implemented through the one or more processing circuitries 620 of the control arrangement 120, together with a computer program product for performing at least some of the functions of the steps 501-505. Thus, a computer program product comprising instructions for performing the steps 501-505 in the control arrangement 120 may perform the method 500 comprising at least some of the steps 501-505 for estimating the relative position displacement A between the first body 100a and the second body 100b of the multibody vehicle 110, when the computer program is loaded into the one or more processing circuitries 620 of the control arrangement 120.
Further, some embodiments of the invention may comprise a multibody vehicle 110, comprising the control arrangement 120, for estimating the relative position displacement A between the first body 100a and the second body 100b of the multibody vehicle 110, according to at least some of the steps 501-505.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 501-505 according to some embodiments when being loaded into the one or more processing circuitries 620 of the control arrangement 120. The data carrier may be, e.g., a hard disk, a CD-ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 120 remotely, e.g. over the Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 500; the control arrangement 120; the computer program; the computer-readable medium; the system 600 and/ or the multibody vehicle 110. Various changes, substitutions and/ or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/ or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/ or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless communication system.

Claims (14)

1. A control arrangement (120) for estimating relative position displacement (A) between a first body (100a) and a second body (100b) of a multibody vehicle (110); wherein the control arrangement (120) is configured to obtain a sensor measurement (400b) from a sensor (130) on the first body (100a), of a marker (140) on the second body (100b); and estimate relative position displacement (A) between the first body (100a) and the second body (100b), based on the obtained sensor measurement (400b) and a reference position (400a) of the marker (140).

2. The control arrangement (120) according to claim 1, further configured to synchronise sensor measurements of sensors (130) situated on the first body (100a) and the second body (100b) with each other based on the estimated relative position displacement (A).

3. The control arrangement (120) according to any one of the preceding claims, further configured to compare the obtained sensor measurement (400b) with a reference position (400a) of the marker (140); and wherein the estimation of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the comparison.

4. The control arrangement (120) according to any one of the preceding claims, further configured to calculate the relative position displacement (A) between the obtained sensor measurement (400b) and a reference position (400a) of the marker (140) in six degrees of freedom; and wherein the estimation of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the calculation.

5. The control arrangement (120) according to any one of the preceding claims, further configured to obtain sensor measurements (400b) from a plurality of sensors (130) on the first body (100a), of a respective marker (140) on the second body (100b); and wherein the estimation of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the obtained plurality of sensor measurements (400b) and a respective reference position (400a) of the markers (140).
6. A method (500) in a control arrangement (120) for estimating relative position displacement (A) between a first body (100a) and a second body (100b) of a multibody vehicle (110); wherein the method (500) comprises the steps of obtaining (501) a sensor measurement (400b) from a sensor (130) on the first body (100a), of a marker (140) on the second body (100b); and estimating (504) relative position displacement (A) between the first body (100a) and the second body (100b), based on the obtained (501) sensor measurement (400b) and a reference position (400a) of the marker (140).

7. The method (500) according to claim 6, further comprising the step of synchronising (505) sensor measurements of sensors (130) situated on the first body (100a) and the second body (100b) with each other based on the estimated (504) relative position displacement (A) between the first body (100a) and the second body (100b).

8. The method (500) according to any one of claim 6 or claim 7, further comprising the step of comparing (502) the obtained (501) sensor measurement (400b) with a reference position (400a) of the marker (140); and wherein the estimation (504) of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the comparison (502).
9. The method (500) according to any one of claims 6-8, further comprising the step of calculating (503) the relative position displacement (A) between the obtained (501) sensor measurement (400b) and a reference position (400a) of the marker (140) in six degrees of freedom; and wherein the estimation (504) of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the calculation (503).

10. The method (500) according to any one of claims 6-9, wherein sensor measurements (400b) are obtained (501) from a plurality of sensors (130) of the first body (100a), of a respective marker (140) on the second body (100b); and wherein the estimation (504) of the relative position displacement (A) between the first body (100a) and the second body (100b) is based on the obtained (501) plurality of sensor measurements (400b).

11. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method (500) according to any one of claims 6-10.

12. A computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method (500) according to any one of claims 6-10.
13. A system (600) for estimating relative position displacement (A) between a first body (100a) and a second body (100b) of a multibody vehicle (110); wherein the system (600) comprises a marker (140), situated on the second body (100b); a sensor (130) situated on the first body (100a), configured to capture a sensor measurement (400b) of the marker (140); and a control arrangement (120) according to any one of claims 1-5.
14. A multibody vehicle (110) comprising a system (600) according to claim 13.
SE1951476A 2019-12-17 2019-12-17 Method and control arrangement for relational position displacement between two bodies of a multibody vehicle SE1951476A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
SE1951476A SE1951476A1 (en) 2019-12-17 2019-12-17 Method and control arrangement for relational position displacement between two bodies of a multibody vehicle


Publications (1)

Publication Number Publication Date
SE1951476A1 true SE1951476A1 (en) 2021-06-18

Family

ID=76756090

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1951476A SE1951476A1 (en) 2019-12-17 2019-12-17 Method and control arrangement for relational position displacement between two bodies of a multibody vehicle

Country Status (1)

Country Link
SE (1) SE1951476A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024085797A1 (en) * 2022-10-17 2024-04-25 Scania Cv Ab Method and control arrangement for vehicle height estimation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002133597A (en) * 2000-10-23 2002-05-10 Isuzu Motors Ltd Rear-view mirror device
US20180040129A1 (en) * 2016-08-02 2018-02-08 Denso International America, Inc. Trailer articulation calculating system and method for calculating articulation angle of trailer
EP3318469A1 (en) * 2016-11-02 2018-05-09 LG Electronics Inc. Apparatus for providing around view image, and vehicle
US20180284243A1 (en) * 2017-03-31 2018-10-04 Uber Technologies, Inc. Autonomous Vehicle Sensor Calibration System
US20180372884A1 (en) * 2015-12-14 2018-12-27 Robert Bosch Gmbh Method, Electronic Control Device and System for Position Determination
US20190066323A1 (en) * 2017-08-24 2019-02-28 Trimble Inc. Excavator Bucket Positioning Via Mobile Device
US20190094331A1 (en) * 2017-09-25 2019-03-28 Continental Automotive Systems, Inc. System and method of infrastructure sensor self-calibration




Legal Events

Date Code Title Description
NAV Patent application has lapsed