GB2598794A - Controlling or monitoring a remote controlled vehicle - Google Patents

Controlling or monitoring a remote controlled vehicle

Info

Publication number
GB2598794A
GB2598794A GB2014505.8A GB202014505A
Authority
GB
United Kingdom
Prior art keywords
vehicle
remote controlled
control
controlled vehicle
manoeuvre
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2014505.8A
Other versions
GB202014505D0 (en)
Inventor
Keene David
Brewerton Simon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Richmond Design and Marketing Ltd
Original Assignee
Richmond Design and Marketing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Richmond Design and Marketing Ltd filed Critical Richmond Design and Marketing Ltd
Priority to GB2014505.8A
Publication of GB202014505D0
Publication of GB2598794A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0297Fleet control by controlling means in a control room
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64FGROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00Ground or aircraft-carrier-deck installations

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Control or monitoring of a remote controlled vehicle 200 is performed by providing plural remote sensors 120 to determine relative distances, executing a planned manoeuvre of the remote controlled vehicle 200, and using the determined distances to control or monitor a position and movement of the remote controlled vehicle 200 when executing the planned manoeuvre. The manoeuvre is a bodily movement of the remote controlled vehicle and/or a movement of a component of the remote controlled vehicle 200, such as a platform of a scissor lift. The remote sensors 120 may be provided on control vehicles. The planned manoeuvre may be automatically controlled or executed by a human operator. The remote controlled vehicle 200 and the control vehicles 100 may be airside vehicles of an airport, e.g. cargo or baggage handling vehicle. Also provided is a control vehicle and a transportation system.

Description

CONTROLLING OR MONITORING A REMOTE CONTROLLED
VEHICLE
TECHNICAL FIELD
The present invention relates to methods of controlling or monitoring remote controlled vehicles, particularly but not exclusively to computer-implemented methods of controlling or monitoring remote controlled vehicles, and particularly but not exclusively to computer-implemented methods of controlling airside support vehicles. The invention also relates to control vehicles for controlling remote controlled vehicles, retro-fit apparatuses for converting a vehicle into a remote controlled vehicle, remote controlled vehicles, and transportation and/or logistics systems, particularly but not exclusively airside transportation and/or logistics systems.
BACKGROUND
Airside environments, such as in airports, and other logistically intensive environments require a lot of personnel to be run effectively. Whilst human control is good for some tasks there is an inherent risk of human error in any human controlled system. It is therefore desirable to increase efficiency of operations. Efficiency can be measured in terms of minimising downtime such as wait-times, maximising speed of delivery and embarkation and disembarkation of passengers, maximising utilisation of available space and vehicles, and minimising energy usage. Another aspect of efficiency can be reducing accidents. Airside vehicles under human control do sometimes strike aircraft, causing costly delays to flights, and costly repairs being needed for aircraft. It is common for aircraft operators to have a spare aircraft or two at an airport in case a scheduled aircraft is damaged and cannot fly. This is expensive. Due to constraints such as regulatory constraints and practical constraints, known airside logistics systems include a number of drawbacks, many of which impact upon efficiency. Similar drawbacks are also applicable to other logistics systems, of which airside logistics systems may be considered a subset. For example, storing and retrieving goods in warehouses or at a shipyard or railway or road freight distribution centre are other areas where the present invention also can be used.
Whilst some automation of systems has been posited in some environments, autonomous systems are expensive and require the replacement of a lot of equipment. This expense and complexity make converting existing systems into fully autonomous systems unattractive, as the costs and delays incurred in making the changes can render them unachievable.
It is an aim of the invention to alleviate or solve problems associated with the prior art.
STATEMENTS OF INVENTION
According to an aspect the invention comprises a computer-implemented method of controlling or monitoring a remote controlled vehicle, the method comprising: providing a first sensor remote to the remote controlled vehicle; providing a second sensor remote to the remote controlled vehicle and the first sensor; using information from the first and second sensors to determine relevant distances in an operating environment of the remote controlled vehicle; executing a planned manoeuvre of the remote controlled vehicle, the planned manoeuvre comprising: a bodily movement of the remote controlled vehicle in the operating environment, and/or a movement of a component of the remote controlled vehicle relative to a body of the remote controlled vehicle; and using the determined distances to control or monitor a position and movement of the remote controlled vehicle when executing the planned manoeuvre.
The planned manoeuvre may be stored in a computer readable memory and a computer processor may access the memory to control the vehicle.
The remote controlled vehicle comprises a body which is capable of moving across the ground of the operating environment. For example, the body may comprise a chassis and one or more wheels rotatably mounted to the chassis. The wheels may be in contact with the ground of the operating environment, in use, to enable the body to move across the ground of the operating environment.
A 'bodily movement' of the remote controlled vehicle refers to translation of this body of the remote controlled vehicle across the ground of the operating environment from a first location to a second location.
The remote controlled vehicle may comprise a component such as a platform of a scissor lift. The planned manoeuvre of the remote controlled vehicle may comprise a movement of the component relative to a body of the remote controlled vehicle, i.e. relative to another part of the remote controlled vehicle. The body may be the body capable of moving across the ground of the operating environment as described above, or any other body of the remote controlled vehicle.
Other examples of components of the remote controlled vehicle which are moveable relative to a body of the remote controlled vehicle include: passenger embarkation/disembarkation steps which are moveable relative to the body of the remote controlled vehicle to extend towards or away from an aircraft door; an airbridge which can be extended towards or away from an aircraft door; and a height-adjustable luggage conveyor belt.
Relevant distances may comprise distances between the remote controlled vehicle and objects in the operating environment that the remote controlled vehicle is to interact with or avoid, or between objects in the operating environment. The method may comprise providing one or more further sensors, in addition to the first and second sensors, remote from the remote controlled vehicle and from each other. The method may comprise using information from the one or more further sensors to determine relevant distances in the operating environment and using the determined distances to control or monitor a position and movement of the remote controlled vehicle when executing the planned manoeuvre.
Having at least two sensors remote to the remote controlled vehicle allows for better measuring/establishment of distances between the remote controlled vehicle and objects in the operating environment (e.g. an aircraft at an airport) by having different fields of view from the at least two sensors. This avoids blind spots and enables trigonometric assessment of distances (and the use of time stamps and processing of distances over time to determine speeds of movement). It also allows the detection of "foreign", unexpected bodies that may interpose themselves in the operating environment, such as people who step into the way, or a door that swings open, or some other unexpected obstruction that might not be properly visible from just the one sensor.
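The trigonometric assessment of distances from two sensors with different fields of view can be illustrated with a short sketch. The following Python snippet is illustrative only (the function name and the 2D angle-only simplification are assumptions, not part of the application): it triangulates a target position from two bearing observations taken at two known, separated sensor positions.

```python
import math


def triangulate(p1, bearing1, p2, bearing2):
    """Estimate a target position from two angle-only observations.

    p1, p2 are (x, y) sensor positions; bearings are in radians,
    measured anticlockwise from the positive x-axis.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))  # sight-line direction from sensor 1
    d2 = (math.cos(bearing2), math.sin(bearing2))  # sight-line direction from sensor 2
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using Cramer's rule.
    denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(denom) < 1e-9:
        raise ValueError("sensor sight lines are parallel; no position fix")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

The parallel-sight-line failure case is precisely the blind spot a single sensor (or two poorly placed sensors) suffers from, which is why the method positions the sensors to obtain different angles of view.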
The remote controlled vehicle itself may have one or more sensors in addition to the remote first and second sensors. In some embodiments one of the first and second sensors is on the vehicle rather than being off-vehicle.
The method may comprise using the determined distances to determine a measured position and movement of the remote controlled vehicle, and comparing the measured position and movement of the remote controlled vehicle to an expected position and movement of the remote controlled vehicle expected in the planned manoeuvre, and using the comparison in controlling or monitoring the remote controlled vehicle when executing the planned manoeuvre.
Whilst "vehicle" primarily takes its usual definition, this term is also intended to include other moving apparatuses that may be found in various logistics environments. For example, in an airport the remote controlled vehicle may be a jetway (nominally part of a building, but movable), mobile stairs for driving to an aircraft for passenger use, catering or luggage or cargo transport platforms such as conveyor belts or scissor lifts, de-icing machines, and other things that are moved relative to aircraft on the airport apron during the operation of aircraft.
The planned manoeuvre may have an operational envelope of acceptable positions and speeds of movements of the remote controlled vehicle and/or the component of the remote controlled vehicle, for example relative to the operational environment (often relative to an aircraft), and the method may comprise monitoring the actual movement of the remote controlled vehicle and/or the component of the remote controlled vehicle to ensure it is within the operational envelope. The method may comprise modifying an existing planned manoeuvre if the remote controlled vehicle and/or the component of the remote controlled vehicle is determined to be outside of the operational envelope. The modifying of an existing planned manoeuvre may comprise stopping the movement of the remote controlled vehicle and/or the component of the remote controlled vehicle. This could, for example, take the form of an intervention to prevent collision.
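As a minimal sketch of the envelope check described above, the function below compares a measured position and speed against the expected state of the planned manoeuvre. The function name, return values, and tolerance parameters are illustrative assumptions, not part of the application.

```python
import math


def check_envelope(measured_pos, measured_speed_ms,
                   expected_pos, pos_tolerance_m, max_speed_ms):
    """Compare a measured state against the planned manoeuvre's operational envelope.

    Positions are (x, y) tuples in metres; speeds in metres per second.
    """
    drift_m = math.dist(measured_pos, expected_pos)  # deviation from the planned path
    if drift_m > pos_tolerance_m or measured_speed_ms > max_speed_ms:
        return "STOP"  # modify the manoeuvre, e.g. stop movement to prevent a collision
    return "CONTINUE"
```

In practice the envelope would be time-varying along the manoeuvre, but the comparison at each time step reduces to a check of this form.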
The method may comprise alerting a human operator (for example a driver) of the remote controlled vehicle if a collision is predicted. Alerting the human operator may be done as well as or instead of controlling the remote controlled vehicle. The method may comprise monitoring an implementation of the planned manoeuvre by a human operator of the remote controlled vehicle and outputting information to the human operator to alert them when a collision is potentially likely if they do not alter the way they are controlling the vehicle. The method may not comprise a computer executing a stored planned manoeuvre of the vehicle; the human driver/operator may just decide what to do on their own.
The method may comprise automatically implementing a modifier on a human controlled input, for example when movement of the remote controlled vehicle and/or the component of the remote controlled vehicle is outside an operational envelope. The modifier may be a scaling factor. The modifier may reduce a speed request, or the output speed provided in dependence upon the speed request. In this way controls may provide finer, slower control when in proximity to obstacles. For example, the remote controlled vehicle may be powered by an electric motor and a normal speed request (for example input via a pedal or lever) may implement between 0 and 100% of the motor's power or speed of movement of the vehicle or component. The method may comprise modifying this so that the speed request can only implement between 0 and 20% of the motor speed output, scaling the input speed request back by a factor of 5. This scaling factor may increase or decrease in dependence upon object proximity.
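The proximity-dependent scaling factor described above might be sketched as follows. The 20% cap (a factor-of-5 scale-back) comes from the example in the text; the 2 m and 10 m proximity thresholds and the linear blend between them are assumptions for illustration.

```python
def scaled_speed_request(request_fraction, obstacle_distance_m,
                         near_m=2.0, far_m=10.0):
    """Scale a 0..1 operator speed request in dependence upon obstacle proximity."""
    request = max(0.0, min(1.0, request_fraction))  # clamp the raw pedal/lever input
    if obstacle_distance_m >= far_m:
        scale = 1.0        # clear of obstacles: full motor authority
    elif obstacle_distance_m <= near_m:
        scale = 0.2        # close in: cap at 20% output, i.e. scale back by a factor of 5
    else:
        # Blend linearly between the two regimes as the obstacle gets closer.
        scale = 0.2 + 0.8 * (obstacle_distance_m - near_m) / (far_m - near_m)
    return request * scale
```

A full pedal press (request 1.0) one metre from an obstacle thus yields only 20% of motor output, giving the operator finer, slower control where it matters.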
At least one of the first and second sensors may be provided on a control vehicle, separate from the vehicle being controlled to perform a work task. The first and second remote sensors may be provided on first and second control vehicles respectively.
The first and second control vehicles may be positioned relative to the remote controlled vehicle so as to obtain different fields and angles of view for the planned manoeuvre.
The or each control vehicle may be an autonomous vehicle.
The or each control vehicle may be an airside vehicle in an airport environment, such as a cargo or baggage handling vehicle.
The method may comprise establishing a control link between the or each control vehicle and the remote controlled vehicle, and providing control commands via the control link to execute the planned manoeuvre. The method may comprise removing the control link after the planned manoeuvre has been completed.
A computer may use data from the first and second sensors and monitor an implementation of the planned manoeuvre by a human operator of the remote controlled vehicle. The computer may override the human implemented manoeuvre if the human implemented manoeuvre is predicted to result in a collision. The computer may issue a warning to a human operator of the remote controlled vehicle before a collision occurs.
The method may further comprise outputting an intervention report if the planned manoeuvre prevented a collision.
The first sensor may provide data comprising remote controlled vehicle positional data and first sensor position data. The second sensor may provide remote controlled vehicle positional data and second sensor position data. A computer may apply a modifier to a human control input, for example a reduction in the speed of the bodily movement of the remote controlled vehicle and/or the movement of the component of the remote controlled vehicle.
The method may further comprise creating or populating a 3D live map. The 3D live map may include position data for the remote controlled vehicle, the first sensor and the second sensor.
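One minimal way to represent such a 3D live map is a timestamped store of last-known entity positions. The class below is an illustrative sketch (the class name, method names, and entity identifiers are assumptions, not part of the application):

```python
import math


class LiveMap3D:
    """Live map holding last-known 3D positions of tracked entities."""

    def __init__(self):
        self._entries = {}

    def update(self, entity_id, position_xyz, timestamp_s):
        # Record (or overwrite) the latest position fix for this entity.
        self._entries[entity_id] = (tuple(position_xyz), timestamp_s)

    def position_of(self, entity_id):
        return self._entries[entity_id][0]

    def distance_between(self, id_a, id_b):
        # Straight-line distance between two tracked entities.
        return math.dist(self.position_of(id_a), self.position_of(id_b))
```

Populating the map with the remote controlled vehicle and each remote sensor then makes the relevant distances (vehicle-to-sensor, vehicle-to-obstacle) direct lookups.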
The remote controlled vehicle may be an airside support vehicle. The airside support vehicle may be, for example, a baggage handling or transporting vehicle, mobile stairs for passenger use in boarding and leaving an aircraft, a movable air bridge walkway adapted to connect to an aircraft, a jetway connected to a building and adapted to connect to an aircraft, a baggage handling conveyor belt vehicle, a fuel bowser, de-icing vehicle or equipment, a push back tug or aircraft towing tug, a personnel or passenger transport vehicle, a catering vehicle adapted to bring food to and from an aircraft, or an effluent disposal vehicle adapted to remove human effluent from an aircraft.
The or each control vehicle may have multiple functions. At least one of the functions may be to provide a sensing platform for use in the method.
Upon completion of the planned manoeuvre, the control vehicle may proceed to carrying out one or more of its other functions.
One of the other functions the control vehicle may have is loading or unloading cargo.
The method may further comprise logging a last known location of the remote controlled vehicle.
The method may further comprise logging the manoeuvre.
According to another aspect of the invention there is provided a control vehicle configured to control or monitor a remote controlled vehicle, the control vehicle comprising: a sensor for measuring distances in an operating environment of the remote controlled vehicle; a transceiver for communicating with the remote controlled vehicle and a further control vehicle; and a processor for planning a manoeuvre for the remote controlled vehicle or for executing a previously planned manoeuvre.
The control vehicle may be configured to provide remote control commands to the remote controlled vehicle and/or it may be configured to provide information from its sensor to the remote controlled vehicle.
The sensor may be configured to measure distances in an operating environment of the remote controlled vehicle. The transceiver may be configured to communicate with the remote controlled vehicle and a further control vehicle. The processor may be configured to plan a manoeuvre for the remote controlled vehicle or execute a previously planned manoeuvre.
The control vehicle may comprise two separate transceivers, a first transceiver for communicating with the remote controlled vehicle and a second transceiver for communicating with the further control vehicle.
The control vehicle may further comprise a cargo carrying portion.
The control vehicle may be an airside support vehicle, such as an autonomously driven, self-propelled, airside dolly.
According to another aspect of the invention there is provided a retro-fit apparatus for converting a vehicle into a remote controlled vehicle for use in the method of the preceding aspects.
The retro-fit apparatus may be for converting an airside support vehicle.
According to another aspect of the invention there is provided a transportation system for reducing collisions due to human error, the system comprising: a remote controlled vehicle; a first control vehicle having a first sensor; a second control vehicle having a second sensor; and a processor, wherein the first sensor and second sensor are in communication with the processor, and wherein the processor is configured to execute a planned manoeuvre of the remote controlled vehicle, or intervene in human implementation of a planned manoeuvre, the processor being adapted to use information from the first and second sensors to control or monitor the remote controlled vehicle when executing the planned manoeuvre and/or provide a warning if a planned manoeuvre is at risk of going wrong.
The remote controlled vehicle may comprise one or more sensors. A second control vehicle may therefore not be comprised within the system, the sensor(s) of the remote controlled vehicle being used in place of the second sensor.
The transportation system may be an airside transportation system, comprising airside support vehicles. The processor may be adapted to control the first and second control vehicles automatically to position themselves relative to the remote control vehicle so as to obtain different fields and angles of view for the planned manoeuvre. This may provide the sensors with different and good views of the remote control vehicle and of a task it is to perform. The remote control vehicle may have manual controls to manoeuvre it to execute the planned manoeuvre, or perform its intended task. The processor may be adapted to monitor the remote controlled vehicle when executing the planned manoeuvre and to issue a warning to a user of the vehicle and/or override the manual controls if the planned manoeuvre is in danger of resulting in a collision, to slow the speed of movement of the remote control vehicle under human control, or take over control from the human, or stop movement of the remote controlled vehicle.
In many embodiments, the remote control vehicle is incapable of automatically self-manoeuvring to perform its intended task and needs at least one of or both of the first and second control vehicles in order to perform its intended task or to execute the planned manoeuvre automatically.
The first and second sensors may comprise one or more of GPS sensors, gyroscopic sensors, camera sensors, and LIDAR sensors. Image analysis performed on the images obtained by camera sensors is one implementation of the invention. The use of two different kinds of sensors can be advantageous.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described with reference to the accompanying drawings, by way of example only, in which:
Figure 1 provides a simplified, schematic, side view of an autonomous airside support vehicle;
Figure 2 provides a block diagram of the systems and controls of the airside support vehicle of Figure 1;
Figure 3 provides a simplified, schematic, side view of a remote controlled vehicle;
Figure 4 provides a block diagram of the systems and controls of the remote controlled vehicle of Figure 3;
Figure 5 provides a rendered image of an airport logistics system comprising a pair of the airside support vehicles of Figure 1 and a remote controlled vehicle of Figure 3;
Figure 6 provides a simplified, schematic, plan view of the operation of a remote controlled vehicle using an autonomous airside support vehicle;
Figure 7 provides a simplified, schematic, plan view of the operation of a remote controlled vehicle using a pair of autonomous airside support vehicles;
Figure 8 provides a simplified, schematic, plan view of the operation of a remote controlled vehicle using an autonomous airside support vehicle and infrastructure sensors;
Figure 9 provides a simplified, schematic, plan view of the operation of a remote controlled vehicle using an autonomous airside support vehicle and sensors of the remote controlled vehicle;
Figure 10 provides a simplified, schematic, plan view of the operation of a disabled autonomous airside support vehicle controlled using a pair of autonomous airside support vehicles; and
Figure 11 provides a flow diagram of the steps for controlling a remote controlled airside vehicle.
DETAILED DESCRIPTION
The below description describes vehicles, systems and methods for use in an airside environment. It will be appreciated that minor modifications could be made in order for the various features to be applicable in other environments, for example warehouses, docksides, or other logistical environments and scenarios. The vehicles, systems and methods described below may be based on those disclosed in UK Patent Application No. 1821134.2 (published as GB2576800) and International application No. PCT/GB2019/053562 (published as WO2020/128442), the entire contents of which are incorporated within this specification by reference. References to vertical, horizontal, longitudinal and lateral are made with reference to a vehicle when in use and all may be considered functional rather than absolute axes. Vertical refers substantially to a top to bottom or bottom to top direction, horizontal to a front to back or back to front direction, longitudinal similarly refers to a front to back or back to front direction, and lateral refers to a side to side direction.
Figure 1 shows an airside support vehicle 100, specifically an airside dolly. Further details on such a vehicle can be found in UK Patent Application No. 1821134.2 with regard to cargo and baggage dollies. Figure 2 provides a block diagram showing the connections between the various systems of the dolly 100. The dolly 100 is propelled by a drive system 108 comprising four wheels 110 provided in pairs towards each end of the vehicle and a series of electric motors 112 that provide motive power to the wheels 110. In the present embodiment, a motor 112 is provided for each wheel 110, but a motor 112 could instead be provided for each pair of wheels 110 or a single motor 112 could power all of the wheels 110. Although all four wheels 110 of the present embodiment are powered, any number of wheels 110 could be provided, and any number could be powered.
The drive system 108 is controlled by a controller 114 that receives control signals from a processor 116. In response to these control signals, the drive system 108 can control the baggage dolly 100 to move forwards, backwards, and steer, providing full control of the motion of the baggage dolly 100.
The baggage dolly 100 includes a number of other systems that operate in conjunction with the processor 116 to provide additional features to the baggage dolly 100. As will become clear in the present disclosure, unless otherwise stated, any of these features may be used on their own or in conjunction with any other system in order to provide the benefits of each system separately.
Systems of the dolly 100, including the drive system 108, controller 114, and processor 116, are powered by an on-board electrical power supply, which in the present embodiment is a battery 118. More specifically, the electrical power supply is provided by a number of lead-acid batteries. A benefit of these batteries 118 is that they are cheap whilst retaining a power density that is sufficient for operation of the baggage dollies 100. Although lead-acid batteries are not as power-dense as similarly-sized Li-ion or Li-Po batteries, they are sufficient for operation and offer advantages such as high reliability, broad range of operating temperatures, and have a long lifecycle.
In order to enable autonomy of the dolly 100 (whether full autonomy or partial autonomy) a sensing system 120 is provided in the baggage dolly 100. The sensing system 120 also enables control of a remote controlled vehicle by the dolly. The sensing system 120 as shown includes a GPS sensor 122, a gyroscopic sensor 124, four camera sensors 126, and four LIDAR sensors 128. One of each of the camera sensors 126 and LIDAR sensors 128 is positioned towards each end of the vehicle. The GPS sensor 122 and gyroscopic sensor 124 are positioned centrally within the vehicle, adjacent to the processor 116. The camera sensors 126 and LIDAR sensors 128 are mounted on pylons positioned at the four corners of the platform. The sensing system 120 may not comprise all of the detailed sensor types; conversely the sensing system could comprise further or different sensing types. The sensing system is configured to sense an environment in which the dolly is operating.
Each of the sensors of the sensing system 120 communicates with the processor 116 to provide sensing data to the processor 116. The sensing data can include position data of the baggage dolly 100, orientation data of the baggage dolly 100, image or visual data of the surroundings of the baggage dolly 100, speed and direction data of the baggage dolly 100, and distance data of objects surrounding the baggage dolly 100. Other forms of sensing data may also be provided, as will be known to the skilled person when considering providing autonomy to a vehicle. The sensing data can therefore be processed by the processor 116 in order to obtain information about the baggage dolly 100 and its surroundings.
For example, image data provided by the camera sensors 126 may allow the processor 116 to detect objects within a field of view provided by each camera sensor 126. In order to provide depth perception, each camera sensor 126 may include two sensing elements, allowing determination of depth through the use of parallax. Alternatively, or in addition, the image data may be augmented by use of distance data provided by the LIDAR sensors 128. Other sensors may also be used for measuring distance. Distance data may be measured using ultrasonic sensors. Other distance sensors may also be used, particularly for near field sensing, allowing the LIDAR sensors 128 to be focused on further field distance sensing. The sensing data can provide the baggage dolly 100 with information about its position, either absolute or relative to known objects, and help it to complete a task or mission through use of the sensing data. Any number of sensors may be provided in order to provide sensing data to the processor 116. Such sensors may include those described above and may include in addition or alternatively any other sensors, such as radar sensors, magnetic field sensors, rotating camera sensors, differential GPS, or any other form of sensor.
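In the standard rectified stereo model, the parallax-based depth determination from a two-element camera sensor reduces to depth = focal length x baseline / disparity. The sketch below illustrates this relationship (the function name and example values are assumptions, not taken from the application):

```python
def stereo_depth_m(focal_length_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: z = f * B / d.

    focal_length_px: camera focal length in pixels
    baseline_m: separation of the two sensing elements in metres
    disparity_px: horizontal pixel offset of the same feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: object at infinity or bad match")
    return focal_length_px * baseline_m / disparity_px
```

Note that depth resolution degrades with distance (disparity shrinks), which is consistent with delegating far-field ranging to the LIDAR sensors 128.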
The sensing system 120 allows the dolly to sense its environment, measuring relevant distances (such as distances between itself and potential obstacles). The sensing system 120 also allows the dolly 100 to have its location determined, both within an environment (such as an airport) and relative to other objects and vehicles (such as other dollies or other airside vehicles, or aircraft).
The sensing system 120 of the depicted embodiment provides enough sensing data to allow the baggage dolly 100 to operate autonomously. The processor 116 contains the requisite circuits and processing power to process the sensing data to provide control signals in response to operate in an autonomous mode. In the autonomous mode, the baggage dolly 100 is able to drive itself around using control signals generated by the processor 116 in response to the sensing system 120, the control signals being provided to the drive system 108 and the other systems, as required.
Autonomous operation of the baggage dolly 100 may enable it to travel with zero or low operator input, depending on the level of autonomy required in the circumstances. Different levels of autonomy are defined by the Society of Automotive Engineers (SAE) as SAE Autonomy Levels. The SAE Autonomy Levels are summarised in the following Table 1:

Level 0 - No automation: Full-time performance of all aspects of driving by a human driver, possibly supplemented by enhanced warning or intervention systems.
Level 1 - Driver assistance: Driving-mode specific assistance relating to steering, acceleration, and/or deceleration, using information about the driving environment, with expectation of all remaining aspects being performed by a human driver.
Level 2 - Partial automation: Driving-mode specific execution of steering, acceleration, and deceleration, using information about the driving environment, with expectation of all remaining aspects being performed by a human driver.
Level 3 - Conditional automation: Driving-mode specific performance of all aspects of a dynamic driving task by an automated driving system, with expectation of appropriate intervention of a human driver when requested.
Level 4 - High automation: Driving-mode specific performance of all aspects of a dynamic driving task by an automated driving system, even when a human driver fails to respond to a request for intervention.
Level 5 - Full automation: Full-time performance of all aspects of a dynamic driving task by an automated driving system, under all conditions that would otherwise be expected to be managed by a human driver.

Table 1 - SAE Automation Levels

With the above definitions in mind, the sensing system 120 may enable the baggage dolly 100 to operate in an SAE Level 3 Autonomy Mode, an SAE Level 4 Autonomy Mode, and/or an SAE Level 5 Autonomy Mode. In the SAE Level 3 Autonomy Mode, the baggage dolly 100 may be able to operate fully-autonomously up to a point at which an unexpected event occurs, such as the presence of an unexpected object in the path of the baggage dolly 100, for example the presence of a human outside of a designated walkway. In such an autonomy mode, the baggage dolly 100 will then request intervention from a central controller, where a human operator may provide an input to allow the baggage dolly 100 to proceed or a manual input to determine the next steps taken by the baggage dolly 100. Without intervention from the central controller, the baggage dolly 100 will not resume normal operations.
In comparison, when operating in the SAE Level 4 Autonomy Mode, the baggage dolly 100 may request intervention in the same circumstances as when operating in the SAE Level 3 Autonomy Mode. However, if a response from the central controller is not forthcoming after a request for intervention, the baggage dolly 100 will proceed to deal with the unexpected event in the way it deems appropriate, depending on its programming.
Finally, when the baggage dolly 100 is operating in the SAE Level 5 Autonomy Mode, the baggage dolly 100 will continue to operate autonomously in all circumstances, even when confronted with an unexpected event, without any request or requirement for intervention from the central controller or human operator.
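The three intervention behaviours described above can be summarised as a small decision routine. This is a minimal illustrative sketch; the enum and the action labels are assumed names for the purposes of the example, not part of the disclosure.

```python
from enum import Enum

class Autonomy(Enum):
    LEVEL_3 = 3
    LEVEL_4 = 4
    LEVEL_5 = 5

def handle_unexpected_event(level: Autonomy, controller_responded: bool) -> str:
    """Sketch of the Level 3/4/5 behaviour on encountering an unexpected event.

    Return values are illustrative action labels, not terms from the patent.
    """
    if level is Autonomy.LEVEL_5:
        # Level 5: continue autonomously in all circumstances, no request made.
        return "proceed_autonomously"
    if level is Autonomy.LEVEL_3:
        # Level 3: request intervention and halt until the central controller answers.
        return "follow_controller_input" if controller_responded else "halt_and_wait"
    # Level 4: request intervention, but proceed on its own if no answer arrives.
    return "follow_controller_input" if controller_responded else "proceed_autonomously"
```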
It will be apparent to the skilled person how to provide the desired autonomy levels to the baggage dolly 100. Moreover, it will be known in the context of the present disclosure that many different autonomy levels may be provided, with different instructions for operation, for example as to what unexpected events should be dealt with autonomously and what unexpected events should be referred for intervention.
Providing the baggage dolly 100 with autonomous operation allows each baggage dolly 100 to operate without the need for a baggage handler driving a baggage tractor to pull the individual baggage dolly 100. Moreover, in contrast to known baggage dollies, which are configured as trains of two or three baggage dollies behind a baggage tractor, an autonomous baggage dolly 100 can operate independently of other baggage dollies 100. The advantages of such a system are numerous. For example, autonomy allows each baggage dolly 100 to collect and deliver baggage independently of any other vehicle, including other baggage dollies 100 and baggage tractors.
The dolly 100 further comprises a communication system 150. The communication system allows the dolly 100 to communicate wirelessly with other dollies (or other communication-system-equipped vehicles) and/or a central controller. The communication system is configured to send and receive data. The communication system can also be operable to send command signals, wherein the command signals are configured to provide control instructions to another vehicle. The communication system 150 comprises a transceiver 151 for sending and receiving data. The communication system may also comprise multiple emitters, receivers and/or transceivers for communicating with different vehicles, controllers and/or systems.
As well as using the sensors and computational ability of the dolly 100 to automate its own movement, they can also be used to automate other vehicles that may not possess as high a level of sophistication. A dolly 100 operated in this manner may be considered a control vehicle 100. Various other vehicles may be configured to operate as a control vehicle 100; the dolly 100 is provided as just one such example. The less sophisticated vehicles could be other pieces of airside equipment, in the case of an automated airside dolly. Of course, for autonomous vehicles analogous to the dolly 100 operating in other environments, the less sophisticated vehicles can be vehicles for use in those environments. The sensing system 120 is redeployed in order to sense the environment surrounding the less sophisticated vehicle. The less sophisticated vehicle is fitted with a remote control system, such that it can be remotely controlled by the dolly 100. The less sophisticated vehicle may therefore comprise a remote controlled (RC) vehicle. The RC vehicle does not comprise any sensing or processing capabilities itself, or at least does not comprise a sufficiently complete sensing system or sufficient processing capabilities to enable its own autonomy. The sensing system 120 and processor 116 of the dolly are therefore deployed to enable remote, autonomous control of the RC vehicle. The dolly and its sensors become the "eyes" of the "dumb"/"blind" less sophisticated vehicle.
An example of an RC vehicle 200 is shown in Figure 3. Figure 4 provides a block diagram showing the connections between the various systems of the RC vehicle 200. In this example the RC vehicle is another type of airside support vehicle: a set of stairs. Conventionally, the stairs may be either towed into position or, in the case of a self-propelled set of stairs that comprises a drive system, driven into position. The RC vehicle 200 in this example is the latter, as it comprises a drive system 208. It will be appreciated that a non-self-propelled vehicle could be converted to a self-propelled vehicle with the provision of a drive system 208. The RC vehicle is propelled by the drive system 208. The drive system comprises four wheels 210 provided in pairs towards each end of the vehicle and a series of electric motors 212 that provide motive power to the wheels 210. In the present embodiment, a motor 212 is provided for each wheel 210, but a motor 212 could instead be provided for each pair of wheels 210, a single motor 212 could power all of the wheels 210, or only a single pair of wheels 210 may be powered. Although all four wheels 210 of the present embodiment are powered, any number of wheels 210 could be provided, and any number of them could be powered.
The drive system 208 is controlled by a controller 214 that receives control signals from a communication system 250. The communication system comprises a transceiver 251 for sending and receiving signals. In response to these control signals, the drive system 208 can control the RC vehicle 200 to move forwards, move backwards, and steer, providing full control of the motion of the RC vehicle.
Autonomously and remotely controlling the RC vehicle 200, for example by means of one or more control vehicles 100 as described above, can help to prevent impacts, scrapes, and damage. By preventing or limiting any sort of activity where the RC vehicle 200 could be damaged, it is no longer a requirement that the RC vehicle is resistant to damage. Therefore, the structure of the RC vehicle 200 can be altered so that it is lighter and may be formed from different materials, for example aluminium or fibre composites. In this way, weight can be saved, making the RC vehicle easier to manoeuvre and more energy-efficient to propel. Having all the systems for enabling autonomy on a separate vehicle (e.g. the dolly 100) also means that the RC vehicle 200 is easier to service and cheaper to run and maintain than if it were an autonomous vehicle itself. The RC vehicle 200 can therefore enjoy the majority of the benefits provided by autonomy whilst reducing the drawbacks of expense, complexity and required processing power.
Also, it is possible to have multiple "dumb"/"blind" vehicles, i.e. RC vehicles, possibly of different kinds, that are controlled at different times by the same "visioned" control vehicles. For example, the control vehicles can control a first dumb vehicle at one time, and then move to another location and control another dumb vehicle. This means that a single expensive "visioned" control vehicle can be used to sense and control multiple dumb vehicles, which helps maximise the value obtained from the expensive visioned, smart, control vehicles, and allows the use of cheaper dumb vehicles needing less maintenance.
Figure 5 provides an example of an operation of the vehicles 100, 200 together in an airport logistics system. A pair of dollies 100 is provided, each comprising its own sensor system 120. The dollies 100 are autonomous in this example and drive themselves into position on either side of a remote controlled (RC) vehicle 200 that is being aligned with an aircraft 500. The sensing system 120 of each of the dollies 100 is configured to provide information to determine relevant distances in an operating environment of the RC vehicle 200. In the example of Figure 5, the operating environment includes the RC vehicle and the aircraft 500. The RC vehicle 200 may have either been towed or driven into position manually, or already been controlled autonomously into position. Airside apparatuses such as stairs are not required to move as far or as often as other pieces of equipment, such as baggage or cargo dollies, and so may simply be remotely controlled to be moved out of the way and then back into position as aircraft come and go from an operating environment. The dollies are positioned such that they have a field of view that encompasses the RC vehicle's current position, a target destination for the RC vehicle, and an area in which a path from the RC vehicle's current position to the target destination can be plotted. Using the sensor data from both dollies 100, a processor - which is preferably one of the processors of the dollies, but could also be a further processor incorporated within a further, master controller positioned elsewhere, such as in a command booth - processes the data to plot a path. Plotting the path comprises using the information provided by the sensing system 120 of each of the dollies to determine relevant distances in the operating environment.
The plotted path can then be converted into commands which are sent from one or both of the dollies 100 to the RC vehicle 200, to be executed by its controller. The RC vehicle 200 can therefore be autonomously manoeuvred into position at the aircraft doors.
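One way to realise such path plotting is to fuse the obstacle observations from both dollies into a shared occupancy grid, search the grid for a collision-free route, and translate the resulting waypoints into drive commands for the RC vehicle's controller. The sketch below is a hedged illustration using a simple breadth-first search; the grid representation, function names, and command vocabulary are assumptions for the example, not details from the disclosure.

```python
from collections import deque

def plot_path(obstacles: set, start: tuple, goal: tuple, size: int = 10):
    """Breadth-first search over a fused occupancy grid.

    `obstacles` is the union of grid cells reported occupied by each dolly's
    sensing system; the cell coordinates and grid size are illustrative.
    Returns a list of waypoints, or None if no clear route exists.
    """
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in obstacles and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # no safe path found: do not move the RC vehicle

def to_commands(path):
    """Convert successive waypoints into simple drive commands."""
    moves = {(1, 0): "east", (-1, 0): "west", (0, 1): "north", (0, -1): "south"}
    return [moves[(b[0] - a[0], b[1] - a[1])] for a, b in zip(path, path[1:])]
```

In practice the commands would be transmitted over the communication systems 150, 250 to the controller 214, but the routing step itself is independent of the transport.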
A single dolly 100 can also be used to control an RC vehicle 200, as shown in Figure 6. In this example, the single dolly 100 comprises two sensing systems 120, each configured to provide information to determine relevant distances in an operating environment of the RC vehicle 200. The dolly is in a position that allows it to view the path that the RC vehicle 200 needs to take. As there are no obstructions within the operating environment of the RC vehicle 200, the dolly 100 can control the RC vehicle with a high degree of confidence. In more confined spaces, or in environments with many obstacles present, this may not be suitable. The dolly 100 assesses the environment for blind spots and the possibility of obstructions. The dolly can then pilot the RC vehicle 200 only if it has a very high degree of confidence (for example 99% or greater) that there are no obstructions. If there is any risk of possible human obstructions then the dolly will not pilot the RC vehicle.
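The confidence gate described above can be sketched as a simple predicate. The 0.99 threshold mirrors the 99% example figure given in the text, and treating any possible human obstruction as an absolute veto follows the text; the function and parameter names themselves are illustrative assumptions.

```python
def may_pilot(confidence_no_obstruction: float, human_possibly_present: bool) -> bool:
    """Decide whether the dolly may pilot the RC vehicle.

    Pilot only with very high confidence (illustratively >= 0.99) that the
    path is clear, and never when a human obstruction is possible.
    """
    if human_possibly_present:
        return False  # any risk of a human obstruction vetoes piloting
    return confidence_no_obstruction >= 0.99
```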
The single dolly may gather data from more than one position relative to the RC vehicle, for example it may take images/gather data from a first position relative to the RC vehicle and an aircraft that the RC vehicle is to service, drive to a second position relative to the aircraft and the RC vehicle and gather more data, and may use data from both positions to plan the manoeuvre of the RC vehicle/control the RC vehicle.
Figure 7 shows a simplified, plan view of operation of the vehicles 100, 200 in a scenario similar to Figure 5. In this example, two of the dollies 100 are provided and each dolly 100 comprises two sensing systems 120. In other embodiments, each dolly 100 may comprise a single sensing system 120. At least one of the sensing systems 120 of each of the dollies 100 is configured to provide information to determine relevant distances in an operating environment of the RC vehicle 200. The RC vehicle 200 can be piloted between the dollies 100, as the sensor coverage afforded by the dolly positions allows a path to be planned and vehicle motion to be controlled within it.
The field of view offered by using multiple dollies 100 is much greater than a human operator can conventionally access. This allows blind spots to be eliminated and provides viewing angles not afforded to a human operator. Even where spotters may be used, the communication and control between the dollies 100 and the remote control vehicle 200 is much faster than human communication, as are responses to commands. Operator safety is therefore increased over known systems and methods, as there is a reduced chance of collisions or other accidents.
Even operators not directly involved in the immediate process can be aided, as they may be warned of 'unseen' dangers that may not have been visible without the provision of the sensing systems 120.
As well as (or instead of) the sensor systems in place on the dolly 100, there can also be sensor systems 300 embedded within the airport/apron infrastructure. For example, scanning and distance-measuring sensors (such as stereo vision cameras and radar) can be provided within the airport environment. The sensor systems 300 may be provided on masts or on existing structures in order to increase the field of view of the sensors by placing them at an increased elevation relative to ground level. Figure 8 provides an example of such an arrangement. Infrastructure sensors 300 are positioned in fixed locations around the airport. Each sensor is configured to provide information to determine relevant distances in an operating environment of an RC vehicle 200. In this example, a dolly 100 is provided in addition to the infrastructure sensors 300. In other embodiments, the dolly 100 may not be present. The dolly 100 can manoeuvre to cover any blind spots that are missed by the infrastructure sensors 300. These blind spots may be intrinsic to the positioning of the infrastructure sensors, or be caused by the passage of vehicles, for example the RC vehicle 200 itself.
The dolly 100 and the infrastructure sensors 300 can therefore combine data to plot a path between them for the RC vehicle and then the RC vehicle can be controlled accordingly.
In some examples the infrastructure sensors 300 may be numerous enough, and have enough coverage, that no vehicle-mounted sensors are required. In that case most, if not all, remote controlled vehicles can be operated remotely, and autonomous dollies 100 are not required.

Figure 9 shows another example in which the RC vehicle comes equipped with its own sensor system 220. The sensor system may be a complex sensor suite such as that fitted to the dolly 100, or may be a more streamlined set, perhaps consisting of only one or two sensor types. The RC vehicle can broadcast its sensor data to the dolly 100, which is operable to interpret the data and combine it with its own sensor data in order to plot a path and control the RC vehicle 200.

In any of the above examples, instead of taking control of the RC vehicle completely, the control may instead take the form of placing a modifier on a control input, or on a controlled output in response to a control input, from a human driver. For example, as a driver pilots an airside support vehicle, such as a set of stairs, up to a hatch of an aircraft, the throttle input (or the motor response to it) can be modified to scale the response such that the RC vehicle travels at a slower speed as it approaches the aircraft. Other examples include priming or pre-loading the brakes such that the RC vehicle 200 can be brought to a stop more quickly than if the brakes were not modified.
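One possible form of the throttle modifier described above is a linear scaling of the driver's input inside a slow zone around the aircraft. The zone size and minimum scale below are illustrative assumptions; the disclosure does not specify particular values.

```python
def modified_throttle(driver_throttle: float, distance_to_aircraft_m: float,
                      slow_zone_m: float = 20.0, min_scale: float = 0.2) -> float:
    """Scale a human driver's throttle input as the vehicle nears the aircraft.

    Outside the slow zone the input passes through unchanged; inside it, the
    response ramps down linearly towards `min_scale` at the aircraft itself.
    The 20 m zone and 0.2 floor are assumed example values, not from the patent.
    """
    if distance_to_aircraft_m >= slow_zone_m:
        return driver_throttle
    scale = min_scale + (1.0 - min_scale) * (distance_to_aircraft_m / slow_zone_m)
    return driver_throttle * scale
```

The same shape of modifier could equally be applied to a motor output rather than the input, as the text allows for either.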
Using the communication system 150, the dolly 100 is operable to report any interventions that occur. This can allow for the monitoring of systems for health and safety and performance purposes. If a particular RC vehicle has multiple 'near misses', in which the dolly 100 is required to intervene, then this may be indicative of a malfunction in the RC vehicle 200. A service request can therefore be created for the malfunctioning vehicle to be diagnosed and/or repaired. If a driver similarly has several 'near misses', then this may flag a retraining requirement to improve the driver's performance. This logging of data can allow an audit trail to be provided. In a health-and-safety-conscious world, having an audit trail and practices that help train drivers on safety, and that ensure malfunctions result in maintenance checks, can help a company demonstrate appropriate corporate due diligence in taking its responsibilities seriously. A system that facilitates this, whilst reducing accidents in the first place, can be attractive.
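The near-miss logging described above might be sketched as a simple counter that raises a service request or a retraining flag once a threshold is crossed. The threshold value, identifiers, and action strings are illustrative assumptions, not details from the disclosure.

```python
from collections import Counter

class InterventionLog:
    """Minimal sketch of the audit trail: count interventions per RC vehicle
    and per driver, and flag follow-up actions once an (assumed) near-miss
    threshold is crossed."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold          # illustrative near-miss limit
        self.by_vehicle = Counter()
        self.by_driver = Counter()

    def record(self, vehicle_id: str, driver_id: str) -> list:
        self.by_vehicle[vehicle_id] += 1
        self.by_driver[driver_id] += 1
        actions = []
        if self.by_vehicle[vehicle_id] >= self.threshold:
            actions.append(f"service_request:{vehicle_id}")   # possible malfunction
        if self.by_driver[driver_id] >= self.threshold:
            actions.append(f"retraining_flag:{driver_id}")    # driver performance
        return actions
```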
As the dolly 100 is also operable to locate both itself and the RC vehicle 200, this data can be sent to a main controller or control room, so that the positions of vehicles can be monitored. This allows positional data for all vehicles to be tracked without fitting every vehicle with GPS or other positional sensors. This data can therefore be logged and maintained so that an up-to-date map of vehicles' last known locations can be kept and verified. The dolly might be able to identify the RC vehicle (possibly distinguishing which one of a number of identical RC vehicles it is, i.e. a unique identity for the RC vehicle). The identity of the RC vehicle may be used in determining an allowable operational envelope for the movement of the RC vehicle.
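A minimal sketch of such a last-known-location log, keyed by each RC vehicle's unique identity, might look as follows. The class and method names are assumptions for illustration only.

```python
import time

class FleetMap:
    """Sketch of the last-known-location log: each sighting reported by a
    dolly updates a central map keyed by the RC vehicle's unique identity."""

    def __init__(self):
        self._last_seen = {}

    def report(self, vehicle_id: str, position: tuple, timestamp: float = None):
        # A dolly reports where (and when) it last observed a vehicle.
        self._last_seen[vehicle_id] = (position, timestamp or time.time())

    def last_known(self, vehicle_id: str):
        # Returns (position, timestamp), or None for an unseen vehicle.
        return self._last_seen.get(vehicle_id)
```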
The same systems and methods for controlling an RC vehicle 200 can also be applied to the control of another dolly 100d that may have suffered sensor damage and/or experienced other malfunctions, as shown in Figure 10. This can allow for recovery of the disabled vehicle without having to send out specific recovery equipment, such as a tow truck, and can even enable the disabled dolly 100d to complete its task(s) before going for diagnosis and/or repair.
Depending on the nature of the disability of the disabled dolly, other dollies can provide sensory input data to its controller, or actually provide control signals to the controller of the disabled dolly, or provide motive force (e.g. couple the disabled dolly to a mobile dolly to tow it or give it power).
In each of the above examples, the control signals sent from the or each control vehicle 100 to the RC vehicle 200 or the disabled dolly 100d may be delivered via a control link established between the dollies 100 and the RC vehicle 200 or disabled dolly 100d. Once the RC vehicle 200 has been manoeuvred into position or has completed a planned manoeuvre or task, the control link may be ended such that the RC vehicle 200 is returned to a fully "dumb" state in which it cannot operate autonomously.
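The control-link lifecycle described here (established before the manoeuvre, used to deliver commands, then ended so that the RC vehicle returns to its fully "dumb" state) can be sketched as follows. All names are illustrative assumptions.

```python
class ControlLink:
    """Sketch of the control link between a control vehicle and an RC vehicle."""

    def __init__(self):
        self.active = False
        self.delivered = []

    def establish(self):
        # Link established before the planned manoeuvre begins.
        self.active = True

    def send(self, command: str):
        if not self.active:
            raise RuntimeError("no control link: RC vehicle is in its dumb state")
        self.delivered.append(command)

    def end(self):
        # Manoeuvre complete: drop the link so the vehicle can no longer
        # be driven, returning it to its fully "dumb" state.
        self.active = False
```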
In some examples the systems and methods do not comprise taking control of the RC vehicle. Instead, if a collision is likely, or some other hazard is identified, a warning is provided. The warning may be in the form of a siren, a light, or a notification on a screen.
In some examples, as well as, or instead of, controlling the motion of the whole RC vehicle, just a part of the vehicle is controlled. For example, the vehicle part could be a loading ramp or arm, an extending staircase, a crane boom, or any other movable part.
Each concept discussed in the present disclosure, except where otherwise provided, may be utilised independently or in combination with any other concept discussed. The skilled person will understand that the specific examples discussed are simply embodiments of the discussed concepts for illustrative purposes and that combinations disclosed in relation to one specific example are not intended to limit the different combinations that could be provided without departing from the scope of the disclosure.
The examples given above with relation to the various vehicle types are equally applicable to other vehicle types, both airside and otherwise. Where an aspect of the disclosure is discussed in relation to an airside vehicle or dolly, unless otherwise necessary, any feature of the described vehicle may be provided as part of any vehicle, such as a land vehicle, water vehicle, air vehicle, or road vehicle.

Claims (1)

CLAIMS

1. A computer-implemented method of controlling or monitoring a remote controlled vehicle, the method comprising: providing a first sensor remote to the remote controlled vehicle; providing a second sensor remote to the remote controlled vehicle and the first sensor; using information from the first and second sensors to determine relevant distances in an operating environment of the remote controlled vehicle; executing a planned manoeuvre of the remote controlled vehicle, the planned manoeuvre comprising: a bodily movement of the remote controlled vehicle in the operating environment, and/or a movement of a component of the remote controlled vehicle relative to a body of the remote controlled vehicle; and using the determined distances to control or monitor a position and movement of the remote controlled vehicle when executing the planned manoeuvre.

2. A method according to claim 1 comprising using the determined distances to determine a measured position and movement of the remote controlled vehicle, and comparing the measured position and movement of the remote controlled vehicle to an expected position and movement of the remote controlled vehicle expected in the planned manoeuvre, and using the comparison in controlling or monitoring the remote controlled vehicle when executing the planned manoeuvre.

3. A method according to claim 1 or claim 2 wherein at least one of the first and second sensors is provided on a control vehicle.

4. A method according to claim 3 wherein the first and second remote sensors are provided on first and second control vehicles respectively, and preferably wherein the first and second vehicles are positioned relative to the remote controlled vehicle so as to obtain different fields and angles of view for the planned manoeuvre.

5. A method according to claim 3 or claim 4 wherein the or each control vehicle is an airside vehicle in an airport environment, such as a cargo or baggage handling vehicle.

6. A method according to any one of claims 3 to 5 wherein the or each control vehicle is an autonomous vehicle.

7. A method according to any one of claims 3 to 6 comprising establishing a control link between the or each control vehicle and the remote controlled vehicle, and providing control commands via the control link to execute the planned manoeuvre.

8. A method according to claim 7 comprising removing the control link after the planned manoeuvre has been completed.

9. A method according to any preceding claim wherein a computer uses data from the first and second sensors and monitors an implementation of the planned manoeuvre by a human operator of the remote controlled vehicle and overrides the human implemented manoeuvre if the human implemented manoeuvre is predicted to result in a collision, or wherein the computer issues a warning to the human operator before a collision occurs.

10. A method according to claim 9 wherein the method further comprises outputting an intervention report if the planned manoeuvre prevented a collision.

11. A method according to any preceding claim wherein the first sensor provides data comprising remote controlled vehicle positional data and first sensor position data and the second sensor provides remote controlled vehicle positional data and second sensor position data.

12. A method according to any preceding claim wherein a computer applies a modifier to a human control input, for example a speed reduction in the speed of the bodily movement of the remote controlled vehicle and/or the movement of the component of the remote controlled vehicle.

13. A method according to any preceding claim wherein the method further comprises creating or populating a 3D live map, and wherein the 3D live map includes position data for the remote controlled vehicle, the first sensor and the second sensor.

14. A method according to any preceding claim wherein the remote controlled vehicle is an airside support vehicle.

15. A method according to claim 3 or claim 4 or any claim dependent via claim 3 or claim 4, wherein the or each control vehicle has multiple functions, wherein at least one of the functions is to provide a sensing platform for use in the method, and optionally wherein a second function is that of a baggage or luggage dolly.

16. A method according to claim 15, wherein, upon completion of the planned manoeuvre, the control vehicle proceeds to carrying out one or more of its other functions.

17. A method according to any preceding claim wherein the method further comprises logging a last known location of the remote controlled vehicle and/or logging the manoeuvre.

18. A control vehicle configured to control or monitor a remote controlled vehicle, the control vehicle comprising: a sensor for measuring distances in an operating environment of the remote controlled vehicle; a transceiver for communicating with the remote controlled vehicle and a further control vehicle; and a processor for planning a manoeuvre for the remote controlled vehicle or for executing a previously planned manoeuvre.

19. A control vehicle according to claim 18, wherein the control vehicle further comprises a cargo carrying portion.

20. A control vehicle according to claim 18 or claim 19, wherein the control vehicle is an airside support vehicle, such as an autonomously driven, self-propelled, airside dolly.

21. A retro-fit apparatus for converting a vehicle into a remote controlled vehicle for use in the method of any one of claims 1 to 17, the retro-fit apparatus comprising a transceiver, and optionally a controller adapted to use signals from the transceiver to control the vehicle.

22. A retro-fit apparatus according to claim 21 wherein the retro-fit apparatus is for converting an airside support vehicle.

23. A transportation system for reducing collisions due to human error, the system comprising: a remote controlled vehicle; a first control vehicle having a first sensor; a second control vehicle having a second sensor and a processor, wherein the first sensor and second sensor are in communication with the processor, and wherein the processor is configured to execute a planned manoeuvre of the remote controlled vehicle, or intervene in human implementation of a planned manoeuvre, the processor being adapted to use information from the first and second sensors to control or monitor the remote controlled vehicle when executing the planned manoeuvre and/or provide a warning if a planned manoeuvre is at risk of going wrong.

24. A transportation system according to claim 23 wherein the transportation system is an airside transportation system, comprising airside support vehicles.

25. A transportation system according to claim 23 or claim 24 wherein the processor is adapted to control the first and second control vehicles automatically to position themselves relative to the remote control vehicle so as to obtain different fields and angles of view for the planned manoeuvre.

26. A transportation system according to claim 23 or claim 24 or claim 25 wherein the remote control vehicle has manual controls to manoeuvre it to execute the planned manoeuvre, and wherein the processor is adapted to monitor the remote controlled vehicle when executing the planned manoeuvre and to override the manual controls if the planned manoeuvre is in danger of resulting in a collision, to slow the speed of movement of the remote control vehicle under human control, or take over control from the human, or stop movement of the remote controlled vehicle.

27. A transportation system according to any one of claims 23 to 26, wherein the remote control vehicle is incapable of automatically self-manoeuvring and needs at least one of or both of the first and second control vehicles to execute the planned manoeuvre automatically.

28. A transportation system according to any one of claims 23 to 27, wherein the first and second control vehicles are configured to establish a control link with the remote controlled vehicle and provide control commands via the control link to execute the planned manoeuvre.

29. A transportation system according to claim 28, wherein the first and second control vehicles and/or the remote controlled vehicle is configured to remove the control link after the planned manoeuvre has been completed.
GB2014505.8A 2020-09-15 2020-09-15 Controlling or monitoring a remote controlled vehicle Pending GB2598794A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2014505.8A GB2598794A (en) 2020-09-15 2020-09-15 Controlling or monitoring a remote controlled vehicle


Publications (2)

Publication Number Publication Date
GB202014505D0 GB202014505D0 (en) 2020-10-28
GB2598794A true GB2598794A (en) 2022-03-16

Family

ID=73149707

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2014505.8A Pending GB2598794A (en) 2020-09-15 2020-09-15 Controlling or monitoring a remote controlled vehicle

Country Status (1)

Country Link
GB (1) GB2598794A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010008807A1 (en) * 2010-02-22 2011-08-25 Engelskirchen, Jürgen, Dipl.-Ing., 22395 Method for automatic remote controlling of controllable object on predetermined arbitrary virtual path, involves determining actual position of object on predetermined virtual path by one or multiple external sensors
US20200043346A1 (en) * 2018-08-02 2020-02-06 Joseph James Vacek Unmanned aerial system detection and mitigation


Also Published As

Publication number Publication date
GB202014505D0 (en) 2020-10-28

Similar Documents

Publication Publication Date Title
US11203337B2 (en) Vehicle with autonomous driving capability
US9944213B2 (en) Robotic cargo system
KR102625918B1 (en) Container self-guided transport vehicle and method of operation thereof, and system having self-driving transport vehicle
US20220024603A1 (en) Self-Propelled Airside Dolly, Baggage Handling System, Baggage Handling Facility, and Related Apparatus and Methods
US11846940B2 (en) Methods and apparatus for vehicle control
US20110127366A1 (en) Automated system for maneuvering aircrafts on the ground
US11927955B2 (en) Methods for transitioning between autonomous driving modes in large vehicles
US8694238B2 (en) Automated ground handling of aircraft
CN110622082A (en) Method and system for operating an automatically guided transport vehicle for containers
WO2020030508A1 (en) Railway vehicle system and method for improving the safety of a railway vehicle
CN112947496A (en) Unmanned trackless rubber-tyred vehicle standardized transportation platform and control method thereof
EP3833591A1 (en) Railway drone vehicle and railway vehicle system
CN115179860A (en) Detection of small objects under an autonomous vehicle chassis
CN116062032A (en) Driver assistance system for a heavy vehicle with overhang
KR102433595B1 (en) Unmanned transportation apparatus based on autonomous driving for smart maintenance of railroad vehicles
CN112793804A (en) Autopilot luggage trailer for airports
GB2598794A (en) Controlling or monitoring a remote controlled vehicle
WO2021021427A1 (en) Methods for transitioning between autonomous driving modes in large vehicles
WO2023161596A1 (en) Powering and controlling or monitoring of vehicles
CN215591033U (en) Autopilot luggage trailer for airports
GB2609201A (en) An airside vehicle system and a method of operating an airside vehicle system
JP2022160506A (en) Vehicle with autonomous driving capability