GB2598386A - Unmanned moving objects and method of localization


Info

Publication number
GB2598386A
Authority
GB
United Kingdom
Prior art keywords
unmanned moving
feature map
moving object
moving objects
unmanned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2013641.2A
Other versions
GB202013641D0 (en)
Inventor
Horst Meier Matthias
Das Mithun
Li Lei
Sanyal Krishnendu
Kumar Thakur Sunil
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Priority to GB2013641.2A
Publication of GB202013641D0
Publication of GB2598386A


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00-G01C 19/00
    • G01C 21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C 21/3804 Creation or updating of map data
    • G01C 21/3807 Creation or updating of map data characterised by the type of data
    • G01C 21/3833 Creation or updating of map data characterised by the source of data
    • G01C 21/3841 Data obtained from two or more sources, e.g. probe vehicles
    • G01C 21/3885 Transmission of map data to client devices; Reception of map data by client devices
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0268 Control using internal positioning means
    • G05D 1/0274 Control using mapping information stored in a memory device
    • G05D 1/0287 Control involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D 1/0291 Fleet control

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

There is provided a method of localizing on a map, a group of unmanned moving objects operating within an area. Localization is achieved by generating, by a first unmanned moving object, a feature map based on data obtained by one or more obstacle detecting sensors affixed to the first unmanned moving object. A georeferenced position and heading of the first unmanned moving object on the feature map are ascertained by communication between the first unmanned moving object and a plurality of anchors affixed in the area. The feature map is communicated to the other unmanned moving objects by publishing the feature map to a topic subscribed by the other unmanned moving objects. Each of the other unmanned moving objects determines its position and heading on the received feature map by communication between that object and the plurality of anchors. One stated benefit is reduced overall cost: not every unmanned moving object in the group needs its own localization and mapping capability, because maps can be shared efficiently.

Description

Unmanned Moving Objects and Method of Localization
Field of Invention
[001] The invention relates to a method of localization for a group of unmanned moving objects operating within an area. The invention also relates to a group of unmanned moving objects operating within an area.
Background of Invention
[002] In the absence of human intervention, autonomous vehicles need to be able to "see" where they are going and avoid obstacles along the way, as well as autonomously control their movement. The technology that enables such autonomous behaviour makes autonomous vehicles generally expensive.
[003] With respect to localization technology, localizing obstacles and/or mapping the environment, although essential to autonomous vehicles, is generally more expensive than ascertaining self-position. Typically, mapping of the environment including obstacles is done using sensors or scanners such as LIDAR, which are currently expensive. On the other hand, particularly for localization in an indoor environment, there are several alternative solutions for ascertaining self-position, which bring costs down. Obstacle positions could in principle be ascertained the same way, but this would necessitate labelling every obstacle to be identified, resulting in an inability to react to new obstacles.
[004] There is therefore a need to provide autonomous vehicles and localization methods that overcome or at least ameliorate one or more of the disadvantages discussed above and other disadvantages.
Summary
[005] It is an object to provide a group of unmanned moving objects with a localization method to address the problems discussed above.
[006] To accomplish this and other objects, there is provided, in an aspect, a method of localizing on a map, a group of unmanned moving objects operating within an area, the method comprising: generating, by a first unmanned moving object, a feature map based on data obtained by one or more obstacle detecting sensors affixed to the first unmanned moving object; ascertaining a georeferenced position and heading of the first unmanned moving object on the feature map, by communication between the first unmanned moving object and a plurality of anchors affixed in the area; communicating, by the first unmanned moving object, the feature map to the other unmanned moving objects by publishing the feature map to a topic subscribed by the other unmanned moving objects; and ascertaining, by each of the other unmanned moving objects, its georeferenced position and heading on the received feature map, by communication between each of the other unmanned moving objects and the plurality of anchors affixed in the area.
[007] In another aspect, there is provided a group of unmanned moving objects configured to operate within an area, the group comprising: a first unmanned moving object comprising, affixed to the first unmanned moving object: communication means configured to communicate with a plurality of anchors affixed in the area and ascertain a georeferenced position and heading of the first unmanned moving object, and one or more obstacle detecting sensors configured to obtain data to generate a feature map; and the other unmanned moving objects comprising communication means, affixed to each of the other unmanned moving objects, configured to communicate with the plurality of anchors affixed in the area and ascertain its georeferenced position and heading, wherein the other unmanned moving objects are further configured to receive the feature map from the first unmanned moving object by subscribing to a topic to which the first unmanned moving object publishes the feature map, and wherein each of the other unmanned moving objects is configured to ascertain its georeferenced position and heading on the received feature map.
[008] Advantageously, a feature map generated from obstacle detecting sensor(s) of an unmanned moving object is communicated to the other unmanned moving objects so that all unmanned moving objects in the group are in possession of the feature map. It may advantageously not be necessary for all unmanned moving objects in a group to be provided with technology for mapping the environment and/or detecting obstacles. The present application advantageously provides for a group of unmanned moving objects operating within an area to be economically implemented.
Detailed Description
[009] Hereinafter, exemplary embodiments of the present invention will be described in detail. The detailed description of this invention will be provided for the purpose of explaining the principles of the invention and its practical application, thereby enabling a person skilled in the art to understand the invention for various exemplary embodiments and with various modifications as are suited to the particular use contemplated. The detailed description is not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Modifications and equivalents will be apparent to practitioners skilled in this art and are encompassed within the spirit and scope of the appended claims.
[010] Unmanned moving objects, such as robots, unmanned aerial vehicles or unmanned ground vehicles, may be configured to assist humans to perform task(s). Unmanned moving objects may be configured to autonomously operate given a task. A group of unmanned moving objects may collectively be configured to perform task(s). Some examples of tasks that an unmanned moving object or a group of unmanned moving objects can do include moving to a target position to do repair works or to pick up materials. The group of unmanned moving objects may be deployed in a warehouse to organize or collect materials in the warehouse. Other than the first unmanned moving object, the number of the other unmanned moving objects in the group may be arbitrary. The number of the other unmanned moving objects in the group may depend on the tasks that the group is configured to do.
[011] An unmanned moving object may be configured to autonomously operate or move, or may be remotely controlled by a human operator, or be configured to allow a human operator to remotely intervene a task or movement of the unmanned moving object if necessary. The unmanned moving object(s) may be configured to operate in an outdoor environment or indoor environment. The unmanned moving object(s) may be configured to operate within an area.
[012] To be capable of autonomously operating and/or moving for at least some of the time, an unmanned moving object would need to ascertain its own position. An unmanned moving object would also need to avoid obstacles while autonomously operating and/or moving.
[013] Accordingly, in an embodiment, there is provided a method of localizing on a map, a group of unmanned moving objects operating within an area. The method comprises generating, by a first unmanned moving object, a feature map based on data obtained by one or more obstacle detecting sensors affixed to the first unmanned moving object. The method comprises ascertaining a georeferenced position and heading of the first unmanned moving object on the feature map, by communication between the first unmanned moving object and a plurality of anchors affixed in the area. The method comprises communicating, by the first unmanned moving object, the feature map to the other unmanned moving objects by publishing the feature map to a topic subscribed by the other unmanned moving objects. The method comprises ascertaining, by each of the other unmanned moving objects, its georeferenced position and heading on the received feature map, by communication between each of the other unmanned moving objects and the plurality of anchors affixed in the area.
[014] In another embodiment, there is provided a group of unmanned moving objects configured to operate within an area. The group comprises a first unmanned moving object and the remaining unmanned moving objects. The first unmanned moving object comprises, affixed to it: communication means, and one or more obstacle detecting sensors configured to obtain data to generate a feature map. The other unmanned moving objects comprise communication means, affixed to each of them. The communication means of each unmanned moving object is configured to communicate with a plurality of anchors affixed in the area and ascertain a georeferenced position and heading of each unmanned moving object. The other unmanned moving objects are further configured to receive the feature map from the first unmanned moving object by subscribing to a topic to which the first unmanned moving object publishes the feature map. Each of the other unmanned moving objects is configured to ascertain its georeferenced position and heading on the received feature map.
[015] The first unmanned moving object may be termed as a "scout robot". The first unmanned moving object may be equipped with one or more obstacle detecting sensors. The first unmanned moving object may comprise one or more obstacle detecting sensors affixed to it. The obstacle detecting sensor(s) may be configured to obtain data to generate a feature map.
[016] A feature map plots one or more specific features of the area in which the group of unmanned moving objects is operating. The one or more features correspond to, or are derived from, the data obtained from the obstacle detecting sensor(s). The feature map may be in the form of an occupancy matrix, where the map is divided into grid cells and a cell in which any obstacle is detected is marked as occupied; otherwise the cell is marked as available.
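For illustration, a minimal sketch of such an occupancy matrix follows. The grid size, resolution, cell-value convention (-1 unknown, 0 available, 100 occupied, as used by common mapping stacks) and the helper name are assumptions for the example, not values specified in the present application:

```python
import numpy as np

RESOLUTION = 0.1                               # metres per grid cell (assumed)
grid = np.full((100, 100), -1, dtype=np.int8)  # 10 m x 10 m area, all unknown

def mark_obstacle(grid, x_m, y_m):
    """Mark the grid cell containing world coordinate (x_m, y_m) as occupied."""
    row, col = int(y_m / RESOLUTION), int(x_m / RESOLUTION)
    if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
        grid[row, col] = 100                   # obstacle detected in this cell

mark_obstacle(grid, 3.25, 7.8)                 # a detection at (3.25 m, 7.8 m)
```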
[017] The obstacle detecting sensors may be any suitable sensor, for example sensors that detect range information of surfaces within the area or sensors that detect electromagnetic radiation from surfaces within the area. The obstacle detecting sensors may be at least one of a LIDAR sensor, an infrared sensor, or an ultrasonic sensor. The obstacle detecting sensor(s) may provide the first unmanned moving object with obstacle sensing and avoidance abilities, as well as map generation abilities. Alternatively, the first unmanned moving object may be configured to generate the feature map based on data from the obstacle detecting sensor(s).
[018] The one or more obstacle detecting sensors may be configured to obtain range data of surfaces within the area to generate the feature map. Where the obstacle detecting sensor detects range information of surfaces within the area, the data obtained may include the distance or travel time of signals reflected off a surface or all surfaces in the area. The signals may be energy emitted by the sensor itself, e.g. light or sound, which reflects off a surface; the reflected energy is then detected by the sensor. The feature extracted from the data obtained from such range-detecting sensors may be the distance from the obstacle detecting sensor or the first unmanned moving object to surfaces in the area. The feature map may therefore comprise the distances from the obstacle detecting sensor or the first unmanned moving object to surfaces in the area plotted thereon. The feature map may provide information on where obstacle surfaces are located. The step of generating the feature map may comprise obtaining range data of surfaces within the area from the obstacle detecting sensor(s). As may be appreciated, such obstacle detecting sensors are useful for detecting where surfaces are in the area, which may be considered obstacles to the movement or operation of the unmanned moving objects. Examples of such obstacle detecting sensors include LIDAR sensors, laser ranging sensors, depth sensing cameras such as RGB-D cameras, and ultrasonic sensors.
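As a worked illustration of such range data, the sketch below converts a round-trip echo time into a distance. The propagation speeds are standard physical constants; the sample times and the function name are invented for the example:

```python
SPEED_OF_LIGHT = 299_792_458.0   # m/s, for LIDAR / laser ranging
SPEED_OF_SOUND = 343.0           # m/s in air at about 20 degC, for ultrasonic sensing

def range_from_echo(round_trip_time_s, propagation_speed):
    """Distance to the reflecting surface, from a round-trip echo time."""
    return propagation_speed * round_trip_time_s / 2.0   # half: out and back

print(range_from_echo(33.4e-9, SPEED_OF_LIGHT))   # ~5.0 m LIDAR return
print(range_from_echo(29.2e-3, SPEED_OF_SOUND))   # ~5.0 m ultrasonic echo
```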
[019] Where the obstacle detecting sensor detects electromagnetic radiation from surfaces within the area, the data obtained may include the amount of radiation emitted from surfaces or bodies within the area. The amount of radiation detected may be an indication of how near or far the emitting body is located from the obstacle detecting sensor or first unmanned moving object. The feature map may comprise the intensity of radiation from emitting bodies in the area plotted thereon. The feature map may comprise the distances from the obstacle detecting sensor or the first unmanned moving object to the emitting bodies in the area plotted thereon. The feature map may provide information on where obstacle surfaces or bodies are located in the area. As may be appreciated, such obstacle detecting sensors are useful for detecting radiation-emitting bodies, such as humans or animals, which may be considered obstacles to the movement or operation of the unmanned moving objects. An example of such an obstacle detecting sensor is an infrared sensor.
[020] Each unmanned moving object may ascertain its georeferenced position by communication between the unmanned moving object and a plurality of anchors affixed in the area. The unmanned moving object and the plurality of anchors may communicate using electromagnetic radiation, such as radio waves. Each unmanned moving object may comprise, or may have affixed to it, communication means configured to communicate with the plurality of anchors. Communication between each unmanned moving object and the plurality of anchors may comprise transmitting signals using a wireless protocol, by communication means of each unmanned moving object, to be received by the plurality of anchors. The communication means of the unmanned moving object may be a transmitter, while the plurality of anchors may each be receivers. Alternatively, the communication means of the unmanned moving object may be a receiver, while the plurality of anchors may each be transmitters. The time for a signal transmitted by a transmitter to reach a receiver may be measured in order to calculate the distance between the receiver and transmitter. Each point between the receiver and transmitter may therefore be georeferenced. The georeferenced position may thus be ascertained. The communication means to communicate with the anchors affixed in the area may include emitters of light, radio waves, magnetic fields, or acoustic signals. The communication means may be configured to transmit signals using a wireless protocol to be received by the plurality of anchors. In an example, ultra-wideband (UWB), a radio frequency technology, may be used. The unmanned moving object may comprise a UWB tag, while the plurality of anchors may be UWB anchors. Other examples of the wireless protocol that may be used include Bluetooth, WLAN, e.g. based on 802.11x wireless protocol, RFID, Zigbee, Z-Wave, or WiMax.
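A minimal sketch of how a georeferenced position could be computed from such anchor ranging follows. The anchor layout, the measured distances and the linearised least-squares approach are illustrative assumptions rather than the prescribed algorithm of the present application:

```python
import numpy as np

# Known anchor positions bounding the area (metres) and measured distances
# from the tag to each anchor; values are consistent with a tag near (4, 5).
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
distances = np.array([6.40, 7.81, 6.40, 7.81])

# Subtract the first anchor's range equation from the others to linearise:
# 2 (a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
A = 2.0 * (anchors[1:] - anchors[0])
b = (distances[0] ** 2 - distances[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(position)   # least-squares estimate of the tag's georeferenced position
```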
[021] Where the communication between the unmanned moving object and the plurality of anchors uses UWB, each unmanned moving object may be equipped with a UWB tag. Several UWB anchors may be affixed in the area at positions suitable for the unmanned moving objects to communicate with them. The anchors may be located at known positions for the unmanned moving objects to have a direct line of sight to the anchors.
[022] Each unmanned moving object or an unmanned moving object may generate a georeferenced map of the area by communication between the unmanned moving object and the plurality of anchors. The georeferenced map may be generated based on the data or positioning data obtained from communication between the unmanned moving object and the plurality of anchors. The method may comprise generating a georeferenced map based on positioning data obtained by communication between each or an unmanned moving object and the plurality of anchors. Each or an unmanned moving object may be configured to generate a georeferenced map based on positioning data obtained by communication between the unmanned moving object and the anchors.
[023] A georeferenced map is a map that associates an area with locations, e.g. geographic coordinates, in physical space. The plurality of anchors may define boundaries of the area or of the map, where each point within the area can be addressed with an x, y, z coordinate. The origin of the georeferenced map may be the position of, e.g. an anchor or an unmanned moving object. Where UWB is used, the georeferenced map may be termed as a "UWB map". The georeferenced map may be used by an unmanned moving object for path planning.
[024] The first unmanned moving object may use the positioning data from communication with the plurality of anchors for path planning. Alternatively, in another example, the first unmanned moving object may use the data obtained from the obstacle detecting sensor(s) for path planning. In this example, the feature map may further comprise positioning data or georeferenced data to enable the first unmanned moving object to plan its path. The first unmanned moving object may generate a georeferenced feature map of the area based on the data obtained from the obstacle detecting sensor(s). The method may comprise generating, by the first unmanned moving object, a georeferenced feature map based on data obtained from the obstacle detecting sensor(s).
[025] The first unmanned moving object or each of the unmanned moving objects may comprise a sensor to detect heading. An example of such a sensor is an inertial measurement unit (IMU). The sensor that detects heading may determine the current heading of the unmanned moving object with respect to the origin of the georeferenced map. A dead reckoning operation may be performed on the positioning data obtained from the communication between the unmanned moving object and the anchors or from the obstacle detecting sensor(s). The positioning data obtained from the communication between the unmanned moving object and the anchors, or from the obstacle detecting sensor(s), may be combined or fused with the heading data. A Kalman filter operation may be performed on the heading data. A Kalman filter operation may be performed on the positioning data. Kalman filter operation(s) may be performed before or after combining or fusing the positioning and heading data. The unmanned moving object may be configured to perform the Kalman filter operation. Alternatively, the sensor or communication means itself may perform the Kalman filter operation. The unmanned moving object may ascertain its georeferenced position and heading on the feature map or the georeferenced map. Kalman filter operation(s) may be performed before or after combining or fusing the georeferenced position and/or heading with the georeferenced map or the feature map. The filtered data may be used in movement models of each unmanned moving object, thereby improving the accuracy with which the physical movement of the unmanned moving object is controlled.
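A minimal sketch of such a fusion follows, assuming a simple dead-reckoning prediction from speed and IMU heading, corrected by UWB position fixes through a Kalman update. All noise parameters, names and numeric values are illustrative assumptions:

```python
import numpy as np

x = np.array([0.0, 0.0])   # state: estimated (x, y) position
P = np.eye(2) * 1.0        # state covariance
Q = np.eye(2) * 0.05       # process noise: dead-reckoning drift (assumed)
R = np.eye(2) * 0.3        # measurement noise: UWB fix accuracy (assumed)

def predict(x, P, speed, heading_rad, dt):
    """Dead-reckoning prediction from odometry speed and IMU heading."""
    x = x + dt * speed * np.array([np.cos(heading_rad), np.sin(heading_rad)])
    P = P + Q
    return x, P

def update(x, P, uwb_fix):
    """Kalman correction with a georeferenced UWB position fix."""
    K = P @ np.linalg.inv(P + R)          # Kalman gain (H = identity here)
    x = x + K @ (uwb_fix - x)
    P = (np.eye(2) - K) @ P
    return x, P

x, P = predict(x, P, speed=0.5, heading_rad=np.pi / 4, dt=1.0)
x, P = update(x, P, np.array([0.40, 0.32]))
print(x)                                  # fused estimate on the georeferenced map
```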
[026] The first unmanned moving object may ascertain its georeferenced position and heading on the georeferenced map. The first unmanned moving object may ascertain its georeferenced position and heading on the georeferenced feature map. The first unmanned moving object may store the georeferenced map, georeferenced feature map, the georeferenced position or positioning data and/or the heading data in a transitory memory or non-transitory memory.
[027] The first unmanned moving object may autonomously traverse the area to generate a feature map or a map of the area or environment using the one or more obstacle detecting sensors. The first unmanned moving object may be configured to extract one or more features of the area from data obtained by the obstacle detecting sensor(s) to generate the feature map. The disclosed method may comprise extracting one or more features of the area from data obtained by the obstacle detecting sensor(s) to generate the feature map. A simultaneous localization and mapping (SLAM) algorithm may be performed on the data obtained by the obstacle detecting sensor(s) to generate the feature map. A SLAM algorithm may be performed on the feature(s) extracted from data obtained by the obstacle detecting sensor(s) to generate the feature map. The first unmanned moving object may be configured to perform the SLAM algorithm. Where LIDAR sensor(s) are used, the feature map may contain range information to all surfaces within the environment, thereby enabling the first unmanned moving object to identify and avoid obstacles. Where LIDAR sensor(s) are used, the feature map may be termed as a "LIDAR map". The first unmanned moving object may ascertain its georeferenced position and heading on the feature map. The first unmanned moving object may store the feature map, data obtained by the obstacle detecting sensor(s), the one or more features of the area extracted from the data obtained, and/or range information or data of surfaces within the area in the transitory memory or non-transitory memory.
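The mapping half of this step can be sketched as follows, assuming a pose estimate is already available (a full SLAM algorithm would jointly refine the pose as well). The grid dimensions, resolution and the synthetic scan are illustrative assumptions:

```python
import numpy as np

RESOLUTION = 0.1                                  # metres per cell (assumed)
grid = np.full((200, 200), -1, dtype=np.int8)     # 20 m x 20 m feature map

def integrate_scan(grid, pose_xy, pose_heading, bearings, ranges):
    """Mark grid cells hit by LIDAR returns from the given robot pose."""
    for bearing, rng in zip(bearings, ranges):
        angle = pose_heading + bearing            # beam angle in the world frame
        hit_x = pose_xy[0] + rng * np.cos(angle)
        hit_y = pose_xy[1] + rng * np.sin(angle)
        row, col = int(hit_y / RESOLUTION), int(hit_x / RESOLUTION)
        if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
            grid[row, col] = 100                  # a surface was detected here

bearings = np.linspace(-np.pi / 2, np.pi / 2, 181)   # a 180-degree scan
ranges = np.full(181, 4.0)                           # dummy 4 m returns
integrate_scan(grid, pose_xy=(10.0, 10.0), pose_heading=0.0,
               bearings=bearings, ranges=ranges)
```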
[028] The first unmanned moving object may continue traversing the area to update the feature map using the one or more obstacle detecting sensors. The step of generating the feature map may comprise continuously updating the feature map. The one or more obstacle detecting sensors may be configured to continuously obtain data to update the feature map. The first unmanned moving object may be configured to extract, or continuously extract, one or more features of the area from data obtained by the obstacle detecting sensor(s) to continuously update the feature map. The disclosed method may comprise extracting, or continuously extracting, one or more features of the area from data obtained by the obstacle detecting sensor(s) to continuously update the feature map. A SLAM algorithm may be performed on the data obtained by the obstacle detecting sensor(s) to update the feature map. A SLAM algorithm may be performed on the feature(s) extracted from data obtained by the obstacle detecting sensor(s) to update the feature map. The first unmanned moving object may store data obtained by the obstacle detecting sensor(s), the one or more features of the area extracted from the data obtained, range information or data of surfaces within the area, and/or the updated feature map in the transitory memory or non-transitory memory. The scout robot may advantageously continuously update the feature map in case any objects, obstacles and/or radiation-emitting bodies are moved, added or removed. Examples of obstacle detecting sensors suitable for sensing such dynamic obstacles include infrared sensors and ultrasonic sensors.
[029] The first unmanned moving object may communicate the feature map, georeferenced feature map or the feature map with its georeferenced position and heading, to the other unmanned moving objects by publishing the feature map to a topic subscribed by the other unmanned moving objects. The other unmanned moving objects may be configured to receive such feature map from the first unmanned moving object by subscribing to a topic to which the first unmanned moving object publishes the feature map. The first unmanned moving object may publish such feature map, or other information that the other unmanned moving objects may not yet possess, onto a common platform accessible by the other unmanned moving objects. Examples of common platforms include a website hosted on a server remote from the area or a website hosted on a local server in the area. The other unmanned moving objects may be configured to retrieve or receive the information from the website at periodic intervals, e.g. every minute. The first unmanned moving object may be configured to publish the information onto the website at periodic intervals.
[030] In another example, the first unmanned moving object may act as a local server from which the other unmanned moving objects access the feature map, or other information that the other unmanned moving objects may not yet possess. The other unmanned moving objects may be configured to receive the feature map from the first unmanned moving object in a message defined by the Robot Operating System. The feature map may be communicated to the other unmanned moving objects in a message defined by the Robot Operating System.
[031] In yet another example, the group of unmanned moving objects may be organized in a mesh network. In this example, the first unmanned moving object may be configured to continuously communicate the feature map, or other information that the other unmanned moving objects may not yet possess, with other unmanned moving objects in its vicinity.
[032] For the unmanned moving objects to communicate with each other or with a remote server, e.g. for the first unmanned moving object to publish information or for the other unmanned moving objects to subscribe or retrieve such information, the unmanned moving objects may be connected to the Internet via a wireless protocol, e.g. WLAN, e.g. based on 802.11x wireless protocol, WiFi, or cellular networks.
[033] The other unmanned moving objects may receive the feature map or other information from the first unmanned moving object or remote server. Each of the other unmanned moving objects may ascertain its georeferenced position and heading on the received feature map. The other unmanned moving objects may already have generated a georeferenced map by communicating with the plurality of anchors, and may have ascertained their georeferenced positions and headings on this georeferenced map. Hence, the other unmanned moving objects may need to reconcile the georeferenced map and the received feature map.
[034] Each of the other unmanned moving objects may generate a combined map comprising the received feature map and the georeferenced map. The coordinate systems of both maps may be transformed into a uniform coordinate system for use in the combined map. The coordinates of the georeferenced map may be transformed into the coordinate system of the feature map. Alternatively, the coordinates of the feature map may be transformed into the coordinate system of the georeferenced map. The ascertained georeferenced position and heading of the first unmanned moving object on the received feature map may be designated as the origin in order to translate or transform the coordinate system of one map into the coordinate system of the other map. The georeferenced position and heading of each of the other unmanned moving objects may then be plotted on the combined map. Each of the other unmanned moving objects may store the received feature map, the combined map, its georeferenced position or positioning data and/or heading data in the transitory memory or non-transitory memory.
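A minimal sketch of such a coordinate transformation follows, assuming the first unmanned moving object's pose is known in both frames and acts as the shared reference. All numeric values and names are illustrative:

```python
import numpy as np

def to_feature_frame(point_geo, origin_geo, origin_map, heading_offset):
    """Transform a georeferenced (UWB-frame) point into the feature map's frame."""
    c, s = np.cos(heading_offset), np.sin(heading_offset)
    rot = np.array([[c, -s], [s, c]])              # 2-D rotation between frames
    return origin_map + rot @ (np.asarray(point_geo) - origin_geo)

# First robot's pose expressed in both frames (the shared reference):
origin_geo = np.array([2.0, 3.0])       # its position on the georeferenced map
origin_map = np.array([12.5, 8.0])      # the same pose on the feature map
heading_offset = np.deg2rad(15.0)       # rotation between the two frames

print(to_feature_frame([4.0, 3.0], origin_geo, origin_map, heading_offset))
```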
[035] As the other unmanned moving objects may subscribe to or retrieve the feature map, or other information regarding the environment, from the first unmanned moving object, each of the other unmanned moving objects need not comprise any obstacle detecting sensors. Each of the other unmanned moving objects may not require any step of generating a feature map. The disclosed method may exclude a step of generating, by the other unmanned moving objects, a feature map based on data obtained by one or more obstacle detecting sensors.
[036] The present disclosure is advantageous over implementations where autonomous systems supplement localization techniques such as GPS and dead reckoning with real-time UWB, LIDAR and optical sensors. The present disclosure is also advantageous over implementations where autonomous vehicles are only capable of UWB localization or only capable of LIDAR localization.
[037] Each unmanned moving object may comprise one or more computing devices, for example, an embedded system or a general computing device. The computing device may comprise one or more computer-readable storage media or memory modules, which may comprise transitory and non-transitory memory. The computer-readable storage media may encompass any electronic component capable of storing electronic information. The computer-readable storage media or memory may include transitory processor-readable media such as random access memory (RAM) or cache memory. The computer-readable storage media or memory may include non-transitory processor-readable media such as read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable PROM (EEPROM), flash memory, magnetic or optical data storage, registers, etc. The memory is in electronic communication with a processor as disclosed herein and/or other processors of the computing device. Computer-readable instructions, such as an operating system, middleware, firmware or other software framework, may reside in the non-transitory computer-readable storage medium. Computer-readable instructions may be implemented as a program or a code that can be read by the processor. The disclosed method may be implemented as a program or a code that can be read by the processor of the applicable unmanned moving object. The memory configured to store any data, information or message disclosed herein may be pre-allocated by an operating system of the unmanned moving object. The memory may be a shared memory, where the shared memory refers to a memory accessible to different processes in a multi-processor computing device. Exemplary processor(s) of the computing device include a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a controller, a microcontroller, a state machine, programmable gate arrays, systems-on-chip (SoC), programmable SoCs, or other suitable devices. The term "processor" may include a combination of processing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration suitable for the disclosed system.
[038] Each unmanned moving object may comprise computer-readable instructions stored in non-transitory memory. The instructions in each unmanned moving object may comprise one or more of the steps disclosed herein. The instructions in the first unmanned moving object may comprise steps including tasks like retrieving data from the one or more obstacle detecting sensors or other sensors and/or controlling a motor of the first unmanned moving object. The instructions in the other unmanned moving objects may comprise steps including tasks like retrieving data from the first unmanned moving object and/or controlling a motor of the unmanned moving object. When executed by a processor of an unmanned moving object, the computer-readable instructions may cause the processor to execute the method disclosed herein or at least some steps of the method disclosed herein. Accordingly, in an embodiment, there is provided a non-transitory memory storing one or more programs, the one or more programs, when executed by an unmanned moving object, causing the unmanned moving object to perform one or more of the steps disclosed herein.
[039] Each step may be triggered by clock time. For example, at a certain clock time, the feature map may be wirelessly transmitted by the first unmanned moving object to the other unmanned moving objects. Each step may be triggered by an action, e.g. the completion of a prior step. For example, when an update of the feature map is detected, the feature map may be wirelessly transmitted by the first unmanned moving object to the other unmanned moving objects. Each step may be triggered by receipt of a message, e.g. sent or published by a previous step, from a sensor or from another unmanned moving object. Such message may be standardized for ease of implementation. An example includes message types defined by the Robot Operating System (www.ros.org).
[040] The Robot Operating System (ROS) is a middleware software framework that provides important services for robot systems, e.g. hardware abstraction and message passing between different processes. A ROS application running on an unmanned moving object is divided into different processes called nodes. Nodes are pieces or blocks of software dedicated to specific tasks. In the context of the present application, exemplary tasks include reading from the IMU sensor or controlling the motor. The nodes communicate with each other mainly via topics. A topic has a defined message type, e.g. an integer or a more complex structure, and a node can either publish or subscribe to a topic. A node subscribing to a topic can receive the messages another node publishes to this topic. ROS provides a variety of predefined default message types for different purposes, e.g. an IMU message containing the orientation, velocity and acceleration of an unmanned moving object. This allows for fast implementation of nodes that publish or subscribe data of commonly used sensors, such as the obstacle detecting sensor(s) of the first unmanned moving object in the present application.
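A minimal rospy sketch of this publish/subscribe pattern follows. The node and topic names are illustrative, and each function would run as its own node (i.e. its own process):

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import Int32

def publisher_node():
    """Publish an increasing counter to a topic, once per second."""
    rospy.init_node('talker')
    pub = rospy.Publisher('/counter', Int32, queue_size=10)
    rate = rospy.Rate(1)
    value = 0
    while not rospy.is_shutdown():
        pub.publish(Int32(data=value))   # delivered to every subscriber
        value += 1
        rate.sleep()

def subscriber_node():
    """Receive every message another node publishes to the topic."""
    rospy.init_node('listener')
    rospy.Subscriber('/counter', Int32,
                     lambda msg: rospy.loginfo('received %d', msg.data))
    rospy.spin()                         # process callbacks until shutdown
```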
[041] In a specific embodiment, each of the disclosed unmanned moving objects is configured to run on a Robot Operating System.
[042] The first unmanned moving object may comprise a UWB tag, being the communication means configured to communicate with the plurality of UWB anchors affixed in the area; an IMU, being the sensor to detect heading; and a LIDAR sensor, being the obstacle detecting sensor. The first unmanned moving object may implement a UWB node configured to obtain raw data from the UWB tag and publish it to a first topic. The first unmanned moving object may implement an IMU node configured to obtain raw data from the IMU and publish it to a second topic. The first unmanned moving object may implement a LIDAR node configured to obtain raw data from the LIDAR sensor and publish it to a third topic. The first unmanned moving object may implement a SLAM node configured to subscribe to the third topic to obtain the raw LIDAR data and configured to process the LIDAR data to generate a feature map. The SLAM node may also be configured to plot positioning data or georeferenced data obtained from the third topic on the feature map, thereby generating a georeferenced feature map. The SLAM node may also be configured to publish the feature map to a fourth topic. The fourth topic may have a default message type named "OccupancyGrid". This message may contain the georeferenced feature map in the form of an occupancy matrix and additional map meta data, e.g. the map resolution, width and height. As the first unmanned moving object uses the LIDAR sensor also for localization, the "OccupancyGrid" message may further contain the necessary information for navigation. The first unmanned moving object may implement a control node or path planning node configured to subscribe to the first and second topic, to log the initial and/or current georeferenced positions and headings of the first unmanned moving object. The path planning node may also be configured to publish the initial and/or current georeferenced positions and headings to a fifth topic.
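A sketch of the SLAM node's map publication follows, using the standard nav_msgs/OccupancyGrid message, whose info field carries the resolution, width and height metadata mentioned above. The topic name, frame id and placeholder grid are assumptions:

```python
#!/usr/bin/env python
import numpy as np
import rospy
from nav_msgs.msg import OccupancyGrid

rospy.init_node('slam_node')
# latch=True so late-joining subscribers still receive the last map
pub = rospy.Publisher('/feature_map', OccupancyGrid, queue_size=1, latch=True)

grid = np.full((100, 100), -1, dtype=np.int8)     # placeholder feature map

msg = OccupancyGrid()
msg.header.stamp = rospy.Time.now()
msg.header.frame_id = 'map'
msg.info.resolution = 0.1                         # metres per cell (assumed)
msg.info.width = grid.shape[1]
msg.info.height = grid.shape[0]
msg.data = grid.flatten().tolist()                # row-major occupancy values
pub.publish(msg)
rospy.spin()
```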
[043] In implementations where every autonomous system includes obstacle detection techniques and localization techniques, or where every autonomous system uses obstacle detecting sensors also as a means of localization, there is no need to communicate the feature map or georeferenced feature map to the other unmanned moving objects.
[044] In the present application, the first unmanned moving object may implement a new node to generate the information needed by the other unmanned moving objects and publish it to a new topic for the other unmanned moving objects to subscribe to. The other unmanned moving objects may require the feature map. The other unmanned moving objects may require a means to reconcile the received feature map with any map they use. Accordingly, in an embodiment, the first unmanned moving object may implement an intermediary node configured to subscribe to the fourth and fifth topics and configured to ascertain its initial georeferenced position and heading on the feature map to generate a georeferenced feature map. Specifically, the intermediary node may be configured to obtain the georeferenced feature map in the "OccupancyGrid" message and add the initial georeferenced position and heading data into a message type named "GeoreferencedOccupancyGrid". The intermediary node may also be configured to publish the georeferenced feature map to a sixth topic having the message type "GeoreferencedOccupancyGrid".
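A sketch of such an intermediary node follows. The "GeoreferencedOccupancyGrid" message is the custom type described above, so the package name, field names and topic names below are all assumptions; a plausible message definition is given in the comments:

```python
#!/usr/bin/env python
# Hypothetical intermediary node. The custom message would be defined in a
# package, e.g. with the fields:
#     nav_msgs/OccupancyGrid grid
#     geometry_msgs/Pose2D   initial_pose
import rospy
from nav_msgs.msg import OccupancyGrid
from geometry_msgs.msg import Pose2D
from scout_msgs.msg import GeoreferencedOccupancyGrid  # hypothetical package

latest_pose = Pose2D()

def on_pose(msg):                      # fifth topic: georeferenced pose
    global latest_pose
    latest_pose = msg

def on_map(msg):                       # fourth topic: the feature map
    out = GeoreferencedOccupancyGrid()
    out.grid = msg                     # forward the occupancy matrix as-is
    out.initial_pose = latest_pose     # attach the shared reference pose
    pub.publish(out)                   # sixth topic, for the other robots

rospy.init_node('intermediary_node')
pub = rospy.Publisher('/georeferenced_feature_map',
                      GeoreferencedOccupancyGrid, queue_size=1, latch=True)
rospy.Subscriber('/scout_pose', Pose2D, on_pose)
rospy.Subscriber('/feature_map', OccupancyGrid, on_map)
rospy.spin()
```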
[045] Each of the other unmanned moving objects may comprise a UWB tag, being the communication means configured to communicate with the plurality of UWB anchors affixed in the area; and an IMU, being the sensor to detect heading. Each of the other unmanned moving objects does not possess any obstacle detecting sensor. Each of the other unmanned moving objects may implement a UWB node configured to obtain raw data from the UWB tag and publish it to a topic named herein as the 11th topic. Each of the other unmanned moving objects may implement an IMU node configured to obtain raw data from the IMU and publish it to a topic named herein as the 12th topic. Each of the other unmanned moving objects may implement a control node or path planning node configured to subscribe to the 11th and 12th topics, to log the initial and/or current georeferenced positions and headings of the unmanned moving object. The path planning node of the other unmanned moving objects may additionally be configured to generate the georeferenced map. The path planning node may also be configured to subscribe to the sixth topic published by the first unmanned moving object and receive the feature map. Specifically, the path planning node may be configured to obtain the georeferenced feature map in the "GeoreferencedOccupancyGrid" message and process the message to read the occupancy matrix and the first unmanned moving object's initial georeferenced position and heading, which is designated as the origin for the other unmanned moving objects to transform coordinate systems. The path planning node may further be configured to reconcile the coordinate systems of the received feature map and the georeferenced map and generate a combined map. The path planning node may be configured to plot the initial and/or current georeferenced position and heading of the unmanned moving object on the combined map. Advantageously, these other unmanned moving objects may be capable of planning their paths and avoiding obstacles on the way, even without obstacle detecting sensor(s).

Claims (17)

  1. A method of localizing on a map, a group of unmanned moving objects operating within an area, the method comprising: generating, by a first unmanned moving object, a feature map based on data obtained by one or more obstacle detecting sensors affixed to the first unmanned moving object; ascertaining a georeferenced position and heading of the first unmanned moving object on the feature map, by communication between the first unmanned moving object and a plurality of anchors affixed in the area; communicating, by the first unmanned moving object, the feature map to the other unmanned moving objects by publishing the feature map to a topic subscribed by the other unmanned moving objects; ascertaining, by each of the other unmanned moving objects, its georeferenced position and heading on the received feature map, by communication between each of the other unmanned moving objects and the plurality of anchors affixed in the area.
  2. The method of claim 1, wherein the step of generating the feature map comprises obtaining range data of surfaces within the area.
  3. The method of claim 1 or 2, wherein the step of generating the feature map comprises continuously updating the feature map.
  4. The method of any preceding claim, wherein the obstacle detecting sensor is at least one of a LIDAR sensor, an infrared sensor, or an ultrasonic sensor.
  5. The method of any preceding claim, wherein the method excludes a step of generating, by the other unmanned moving objects, a feature map based on data obtained by one or more obstacle detecting sensors.
  6. The method of any preceding claim, wherein communication between each unmanned moving object and the plurality of anchors comprises: transmitting signals using a wireless protocol, by communication means of each unmanned moving object, to be received by the plurality of anchors.
  7. The method of claim 6, wherein the wireless protocol is UWB, Bluetooth, WLAN, Zigbee, Z-Wave, or WiMax.
  8. The method of any preceding claim, wherein the feature map is communicated to the other unmanned moving objects in a message defined by the Robot Operating System.
  9. A group of unmanned moving objects configured to operate within an area, the group comprising: a first unmanned moving object comprising, affixed to the first unmanned moving object: communication means configured to communicate with a plurality of anchors affixed in the area and ascertain a georeferenced position and heading of the first unmanned moving object, and one or more obstacle detecting sensors configured to obtain data to generate a feature map; and the other unmanned moving objects comprising communication means, affixed to each of the other unmanned moving objects, configured to communicate with the plurality of anchors affixed in the area and ascertain its georeferenced position and heading, wherein the other unmanned moving objects are further configured to receive the feature map from the first unmanned moving object by subscribing to a topic to which the first unmanned moving object publishes the feature map, and wherein each of the other unmanned moving objects is configured to ascertain its georeferenced position and heading on the received feature map.
  10. The group as claimed in claim 9, wherein the one or more obstacle detecting sensors are configured to obtain range data of surfaces within the area to generate the feature map.
  11. The group as claimed in claim 9 or 10, wherein the one or more obstacle detecting sensors are configured to continuously update the feature map.
  12. The group as claimed in any one of claims 9-11, wherein the obstacle detecting sensor is at least one of a LIDAR sensor, an infrared sensor, or an ultrasonic sensor.
  13. The group as claimed in any one of claims 9-12, wherein each of the other unmanned moving objects does not comprise any obstacle detecting sensors.
  14. The group as claimed in any one of claims 9-13, wherein each unmanned moving object comprises a sensor to detect heading.
  15. The group as claimed in any one of claims 9-14, wherein the communication means are configured to transmit signals using a wireless protocol to be received by the plurality of anchors.
  16. The group as claimed in claim 15, wherein the wireless protocol is UWB, Bluetooth, WLAN, Zigbee, Z-Wave, or WiMax.
  17. The group as claimed in any one of claims 9-16, wherein the other unmanned moving objects are configured to receive the feature map from the first unmanned moving object in a message defined by the Robot Operating System.

Priority Applications (1)

Application Number: GB2013641.2A · Priority Date: 2020-08-31 · Filing Date: 2020-08-31 · Title: Unmanned moving objects and method of localization (GB2598386A)


Publications (2)

Publication Number Publication Date
GB202013641D0 (en) 2020-10-14
GB2598386A 2022-03-02

Family

Family ID: 72749719

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2013641.2A Withdrawn GB2598386A (en) 2020-08-31 2020-08-31 Unmanned moving objects and method of localization

Country Status (1)

Country Link
GB (1) GB2598386A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112595320B (en) * 2020-11-23 2023-06-30 Beijing Union University Indoor intelligent wheelchair high-precision positioning autonomous navigation method and system based on ROS

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170122751A1 (en) * 2015-10-29 2017-05-04 Leauto Intelligent Technology (Beijing) Co. Ltd Method for acquiring map information, navigation method and equipment
WO2017172778A1 (en) * 2016-03-28 2017-10-05 Sri International Collaborative navigation and mapping
US20180246524A1 (en) * 2017-02-27 2018-08-30 Vorwerk & Co. Interholding Gmbh Method for operating a self-traveling robot
US20190212752A1 (en) * 2018-01-05 2019-07-11 Irobot Corporation Mobile cleaning robot teaming and persistent mapping
US20190220003A1 (en) * 2019-03-27 2019-07-18 Intel Corporation Collaborative 3-d environment map for computer-assisted or autonomous driving vehicles


Also Published As

Publication number Publication date
GB202013641D0 (en) 2020-10-14


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20230223 AND 20230301