SE2151621A1 - Improved navigation for a robotic work tool system - Google Patents

Improved navigation for a robotic work tool system

Info

Publication number
SE2151621A1
Authority
SE
Sweden
Prior art keywords
area
work tool
robotic
range
sensor
Prior art date
Application number
SE2151621A
Inventor
Adam Tengblad
Herman Jonsson
Kamila Kowalska
Mikaela Ahlen
Sebastian Bergström
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Priority to SE2151621A priority Critical patent/SE2151621A1/en
Priority to PCT/SE2022/050931 priority patent/WO2023121528A1/en
Publication of SE2151621A1 publication Critical patent/SE2151621A1/en


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D HARVESTING; MOWING
    • A01D34/00 Mowers; Mowing apparatus of harvesters
    • A01D34/006 Control or measuring arrangements
    • A01D34/008 Control or measuring arrangements for automated or remotely controlled operation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3841 Data obtained from two or more sources, e.g. probe vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3885 Transmission of map data to client devices; Reception of map data by client devices
    • G01C21/3896 Transmission of map data from central databases
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/43 Control of position or course in two dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/43 Control of position or course in two dimensions
    • G05D1/437 Control of position or course in two dimensions for aircraft during their ground movement, e.g. taxiing
    • G05D1/439 Control of position or course in two dimensions for aircraft during their ground movement, e.g. taxiing on the runway during take-off or landing
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/69 Coordinated control of the position or course of two or more vehicles
    • G05D1/693 Coordinated control of the position or course of two or more vehicles for avoiding collisions between vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3848 Data obtained from both position sensors and additional sensors
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method for a robotic work tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising at least one navigation sensor and at least one long-range object detection sensor, and a second robotic work tool comprising at least one navigation sensor and at least one short-range object detection sensor. The method comprises the first device: determining an initial mapping of the operating area in a first resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detecting one or more objects utilizing the at least one long-range object detection sensor; and determining at least one area in the initial mapping comprising at least one of the one or more detected objects. The method further comprises the second robotic work tool: determining a supplemental mapping of the operating area by mapping out the at least one area in a second resolution utilizing the at least one navigation sensor and the at least one short-range object detection sensor, wherein the first resolution is lower than the second resolution.

Description

IMPROVED NAVIGATION FOR A ROBOTIC WORK TOOL SYSTEM

TECHNICAL FIELD
This application relates to a robotic work tool, and in particular to a system and a method for providing improved navigation for robotic work tools, such as lawnmowers, in such a system.
BACKGROUND
Automated or robotic work tools, such as robotic lawnmowers, are becoming increasingly advanced, and so is the need for proper mapping of an operational area. At the same time, operational areas are growing in size, and so the cost (in time) of mapping an area is increasing.
Thus, there is a need for an improved manner of mapping an operational area.
SUMMARY
It is therefore an object of the teachings of this application to overcome or at least reduce those problems by providing a robotic work tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising a controller, at least one navigation sensor and at least one long-range object detection sensor, and a second robotic work tool comprising a controller, at least one navigation sensor and at least one short-range object detection sensor, wherein the controller of the first device is configured to: determine an initial mapping of the operating area in a first resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detect one or more objects utilizing the at least one long-range object detection sensor; and determine at least one area in the initial mapping comprising at least one of the one or more detected objects; wherein the controller of the second robotic work tool is configured to determine a supplemental mapping of the operating area, by mapping out the at least one area utilizing the at least one navigation sensor and the at least one short-range object detection sensor in a second resolution, wherein the first resolution is lower than the second resolution.
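The two-stage division of labour described above can be illustrated with a minimal sketch. The application does not specify an implementation, so every name, data structure and the grid-sampling strategy below is an assumption for illustration only; note that a coarser map is represented here by a larger spacing between data points.

```python
from dataclasses import dataclass, field

@dataclass
class AreaOfInterest:
    # Axis-aligned rectangle flagged for high-resolution re-mapping.
    x_min: float
    y_min: float
    x_max: float
    y_max: float

@dataclass
class GridMap:
    resolution: float  # metres between data points (smaller = finer)
    points: list = field(default_factory=list)

def initial_mapping(long_range_scans, first_resolution=5.0, margin=2.0):
    """Coarse pass by the first device: record sparse data points from
    long-range sensing and flag an area around each detected object."""
    coarse = GridMap(resolution=first_resolution)
    areas = []
    for scan in long_range_scans:
        coarse.points.append(scan["position"])
        if scan.get("object_detected"):
            x, y = scan["object_position"]
            areas.append(AreaOfInterest(x - margin, y - margin,
                                        x + margin, y + margin))
    return coarse, areas

def supplemental_mapping(areas, short_range_probe, second_resolution=0.1):
    """Fine pass by the second robotic work tool: densely sample only
    the flagged areas, at a higher resolution than the initial pass."""
    fine = GridMap(resolution=second_resolution)
    for a in areas:
        y = a.y_min
        while y <= a.y_max:
            x = a.x_min
            while x <= a.x_max:
                fine.points.append((x, y, short_range_probe(x, y)))
                x += second_resolution
            y += second_resolution
    return fine
```

The design point made by the claims is visible in the sketch: the expensive long-range pass touches every position only sparsely, while the dense sampling is confined to the small flagged areas.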
In some embodiments the at least one navigation sensor of the first device comprises a GNSS sensor and the first device is configured to determine the initial mapping based on coordinates received through the GNSS sensor.
In some embodiments the at least one navigation sensor of the first device comprises a radar or a LIDAR and the first device is configured to determine the initial mapping based on SLAM (simultaneous localization and mapping).
In some embodiments the at least one navigation sensor of the first device comprises a camera and the first device is configured to determine the initial mapping based on VSLAM (visual SLAM).
In some embodiments the first device is configured to determine the initial mapping of the operating area while moving at a first speed and the second robotic work tool is configured to determine the supplemental mapping of the operating area while moving at a second speed, wherein the first speed is higher than the second speed. In some embodiments the first speed is higher than two times the second speed.
In some embodiments the first device is configured to determine the initial mapping of the operating area by moving along the boundary of the operating area.
In some embodiments the controller of the first device is configured to determine the initial mapping of the operating area by traversing the operating area in a pattern where adjacent partial paths are at least a traversal distance from one another. In some embodiments the traversal distance is 5, 10 or 20 meters, or any range therebetween.
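The traversal pattern of this embodiment can be sketched as a back-and-forth sweep whose parallel passes are one traversal distance apart. The function name and the assumption of a rectangular operating area are illustrative only, not from the application:

```python
def coarse_traversal_waypoints(width, height, traversal_distance=10.0):
    """Generate waypoints for a back-and-forth sweep of a rectangular
    operating area, with adjacent partial paths separated by
    traversal_distance metres."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints.append((0.0, y))
            waypoints.append((width, y))
        else:
            waypoints.append((width, y))
            waypoints.append((0.0, y))
        left_to_right = not left_to_right
        y += traversal_distance
    return waypoints
```

For a 20 m by 20 m area with a 10 m traversal distance, this yields three passes (six waypoints), alternating direction so the device never backtracks along a pass.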
In some embodiments the controller of the first device is configured to determine at least one area in the initial mapping comprising at least one of the one or more detected objects by determining an enveloping area around objects that are less than a closeness distance from one another. In some embodiments the closeness distance is 2, 3, 4, 5 or 10 meters, or any range therebetween.
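The enveloping-area embodiment can be sketched as grouping detected objects that lie within the closeness distance of one another and determining one axis-aligned envelope per group. The single-pass greedy grouping below is a simplification chosen for illustration (it does not merge groups transitively); all names are assumptions:

```python
import math

def envelope_areas(objects, closeness=5.0):
    """objects: list of (x, y) positions of detected objects.
    Returns one (x_min, y_min, x_max, y_max) envelope per group of
    objects that are less than `closeness` metres from one another."""
    clusters = []
    for p in objects:
        placed = False
        for c in clusters:
            # Join the first existing group containing a nearby object.
            if any(math.dist(p, q) < closeness for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return [(min(x for x, _ in c), min(y for _, y in c),
             max(x for x, _ in c), max(y for _, y in c))
            for c in clusters]
```

Two objects 3 m apart thus share one envelope, while an object 20 m away gets its own, so the second robotic work tool is sent to two separate areas rather than one oversized one.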
In some embodiments a long-range object detection sensor is a sensor capable of detecting an object at a distance of over 10 m.
In some embodiments the at least one long-range object detection sensor comprises a camera and the first device is configured to detect the one or more objects through image recognition.
In some embodiments the at least one long-range object detection sensor comprises a radar and/or a LIDAR and the first device is configured to detect the one or more objects through range analysis based on data received from the long-range object detection sensor.
In some embodiments the robotic work tool system further comprises a server, and the first device is configured to determine the initial mapping by transmitting navigation data to the server.
In some embodiments the first device is configured to detect the one or more objects, and to determine at least one area in the initial mapping comprising at least one of the one or more detected objects, by transmitting sensor data to the server.
In some embodiments the controller of the second robotic work tool is configured to map out the at least one area by traversing the whole area.
In some embodiments the at least one short-range object detection sensor operates by detecting an object through physical contact.
In some embodiments the at least one short-range object detection sensor comprises a collision sensor and the controller of the second robotic work tool is configured to map out the at least one area by noting the locations where collisions occur.
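The collision-based embodiment amounts to recording each position at which the collision sensor triggers while the area is traversed. A minimal sketch, with an assumed predicate standing in for the collision sensor:

```python
def map_by_collisions(path, collision_at):
    """path: iterable of (x, y) positions visited while traversing an
    area of interest; collision_at: predicate standing in for the
    collision sensor. Returns the noted obstacle locations."""
    return [pos for pos in path if collision_at(pos)]
```

In a real system the predicate would be replaced by reading the collision sensor as the robotic work tool moves, with the noted locations becoming high-resolution obstacle entries in the supplemental map.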
In some embodiments the first resolution is in a range of 1 data point every 1, 5 or 10 meters, or any range therebetween.
In some embodiments the first resolution is in a range of 1 data point every 1%, 2%, 3%, 4% or 5% of the length of the boundary of the operating area, or any range therebetween.
In some embodiments the second resolution is in a range of 1 data point every 0.01, 0.02, 0.1 or 0.5 meters, or any range therebetween.
In some embodiments a data point is an entry in a map.
In some embodiments the first device is a first robotic work tool.
In some embodiments the first robotic work tool is the second robotic work tool.
In some embodiments the first device is a drone.
In some embodiments the first device is a smartphone or tablet computer.
In some embodiments an object is any area that is not a grass area.
In some embodiments the server comprises a controller configured to receive navigation data and sensor data from the first robotic work tool and, in response thereto, determine the initial mapping.
In some embodiments the controller is further configured to detect any objects in the operational area.
In some embodiments the controller is further configured to receive navigation data from the second robotic work tool and, in response thereto, determine the supplemental mapping and/or the detection of any objects in the operational area.
It is also an object of the teachings of this application to overcome the problems by providing a method for use in a robotic work tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising at least one navigation sensor and at least one long-range object detection sensor and a second robotic work tool comprising at least one navigation sensor and at least one short-range object detection sensor, wherein the method comprises the first device: determining an initial mapping of the operating area in a first resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detecting one or more objects utilizing the at least one long-range object detection sensor; and determining at least one area in the initial mapping comprising at least one of the one or more detected objects; wherein the method further comprises the second robotic work tool: determining a supplemental mapping of the operating area, by mapping out the at least one area utilizing the at least one navigation sensor and the at least one short-range object detection sensor in a second resolution, wherein the first resolution is lower than the second resolution.
In some embodiments the supplemental mapping is performed during at least one work session in the operational area.
In some embodiments the (second) robotic work tool is a robotic lawnmower.
In some embodiments the first robotic work tool is a mapping robot.
Some benefits of the teachings herein include a lower cost: a robotic work tool with high-precision long-range detection sensors is used initially during installation and can then be moved on to another initial site installation. The robotic work tools performing the supplemental mapping thus do not need to carry the expensive high-precision long-range sensors. The teachings also provide a higher quality of mapping, as the highest-quality sensors can be used for the mapping. Moreover, more time can be spent analyzing areas of interest, providing detailed information where needed. The teachings herein also save time and labor: the manual work needed to map the site is minimal. This also allows robotic work tools with a more limited sensor array to perform the actual work, which keeps the cost of the system low for the end user.
Further embodiments and aspects are as in the attached patent claims and as discussed in the detailed description.
Other features and advantages of the disclosed embodiments will appear from the following detailed disclosure, from the attached dependent claims as well as from the drawings. Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of the element, device, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be described in further detail with reference to the accompanying drawings, in which:
Figure 1A shows an example of a robotic lawnmower according to some embodiments of the teachings herein;
Figure 1B shows a schematic view of the components of an example of a robotic work tool, being a robotic lawnmower, according to some example embodiments of the teachings herein;
Figure 2 shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 3A shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein;
Figure 3B shows a schematic view of a robotic work tool system according to some example embodiments of the teachings herein; and
Figure 4 shows a corresponding flowchart for a method according to some example embodiments of the teachings herein.
DETAILED DESCRIPTION
The disclosed embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
Like reference numbers refer to like elements throughout.
It should be noted that even though the description given herein is focused on robotic lawnmowers, the teachings herein may also be applied to robotic ball collectors, robotic mine sweepers, robotic farming equipment, or other robotic work tools where the work tool is to be safeguarded against accidentally extending beyond or too close to the edge of the work area.
Figure 1A shows a perspective view of a robotic work tool 100, here exemplified by a robotic lawnmower 100, having a body 140 and a plurality of wheels 130 (only one side is shown). The robotic work tool 100 may be of a multi-chassis type or a mono-chassis type (as in figure 1A). A multi-chassis type comprises more than one main body part, the parts being movable with respect to one another. A mono-chassis type comprises only one main body part.
It should be noted that robotic lawnmowers may be of different sizes, ranging from merely a few decimetres for small garden robots to more than 1 or 1.5, or even over 2, meters for large robots arranged to service, for example, sports fields or airfields.
It should be noted that even though the description herein is focused on the example of a robotic lawnmower, the teachings may equally be applied to other types of robotic work tools, such as robotic watering tools, robotic golf ball collectors, and robotic mulchers to mention a few examples. It should also be noted that more than one robotic working tool may be set to operate in a same operational area, and that all of these robotic working tools need not be of the same type. In some embodiments the (first) robotic work tool is a mapping robot to be used only during a mapping of the operational area.
It should also be noted that the robotic work tool is a self-propelled robotic work tool, capable of autonomous navigation within a work area, where the robotic work tool propels itself across or around the work area in a pattern (random or predetermined) without user control (except, of course, possibly for a start and/or stop command).
Figure 1B shows a schematic overview of the robotic work tool 100, also exemplified here by a robotic lawnmower 100. In this example embodiment the robotic lawnmower 100 is of a mono-chassis type, having a main body part 140. The main body part 140 substantially houses all components of the robotic lawnmower 100. The robotic lawnmower 100 has a plurality of wheels 130. In the exemplary embodiment of figure 1B the robotic lawnmower 100 has four wheels 130: two front wheels and two rear wheels. At least some of the wheels 130 are drivably connected to at least one electric motor 155. It should be noted that even if the description herein is focused on electric motors, combustion engines may alternatively be used, possibly in combination with an electric motor. In the example of figure 1B, each of the wheels 130 is connected to a respective electric motor 155, but it would also be possible for two or more wheels to be connected to a common electric motor 155 for driving the wheels 130 to navigate the robotic lawnmower 100 in different manners. The wheels 130, the motor 155 and possibly the battery 150 are thus examples of components making up a propulsion device. By controlling the motors 155, the propulsion device may be controlled to propel the robotic lawnmower 100 in a desired manner, and the propulsion device will therefore be seen as synonymous with the motor(s) 155.
It should be noted that wheels 130 driven by electric motors are only one example of a propulsion system, and other variants are possible, such as caterpillar tracks.
The robotic lawnmower 100 also comprises a controller 110 and a computer readable storage medium or memory 120. The controller 110 may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on the memory 120 to be executed by such a processor. The controller 110 is configured to read instructions from the memory 120 and execute these instructions to control the operation of the robotic lawnmower 100 including, but not being limited to, the propulsion and navigation of the robotic lawnmower.
The controller 110 in combination with the electric motor 155 and the wheels 130 forms the base of a navigation system (possibly comprising further components) for the robotic lawnmower, enabling it to be self-propelled as discussed under figure 1A. The controller 110 may be implemented using any suitable, available processor or Programmable Logic Circuit (PLC). The memory 120 may be implemented using any commonly known technology for computer-readable memories such as ROM, FLASH, DDR, or some other memory technology.
The robotic lawnmower 100 is further arranged with a wireless communication interface 115 for communicating with a server and, in some embodiments, also with other devices, such as a personal computer, a smartphone, the charging station, and/or other robotic work tools (directly or indirectly). Examples of such wireless communication technologies are Bluetooth®, WiFi® (IEEE 802.11b), Global System for Mobile Communications (GSM), LTE (Long Term Evolution) and 5G, to name a few. The robotic lawnmower 100 is thus arranged to communicate with a server (referenced 240 in figure 2) for providing information regarding status, location, and/or progress of operation, as well as for receiving commands or settings from the server.
The robotic lawnmower 100 also comprises a grass cutting device 160, such as a rotating blade 160 driven by a cutter motor 165. The grass cutting device is one example of a work tool 160 for a robotic work tool 100.
The robotic lawnmower 100 may further comprise at least one navigation sensor 185, such as any, some or all of an optical navigation sensor, a distance-based navigation sensor, a beacon navigation sensor and/or a satellite navigation sensor 185. The optical navigation sensor may be a camera-based sensor and/or a laser-based sensor. The distance-based navigation sensor may be radar-based or LIDAR-based.
The satellite navigation sensor may be a GPS (Global Positioning System) device, an RTK (Real-Time Kinematic) device or another Global Navigation Satellite System (GNSS) device. In embodiments where the robotic lawnmower 100 is arranged with a navigation sensor, the magnetic sensors 170, as will be discussed below, are optional. In embodiments relying (at least partially) on a navigation sensor 185, the work area may be specified as a virtual work area in a map application stored in the memory 120 of the robotic lawnmower 100. The virtual work area may be defined by a virtual boundary. As will be discussed below, virtual borders may be used to define a work area 205.
The robotic lawnmower 100 may also, or alternatively, comprise navigation sensors based on deduced reckoning sensors 180. The deduced reckoning sensors may be odometers, (multi-axis) accelerometers or other deduced reckoning sensors. In some embodiments, the deduced reckoning sensors are comprised in the propulsion device, wherein deduced reckoning navigation may be provided by knowing the current supplied to a motor and the time the current is supplied, which gives an indication of the speed and thereby the distance travelled for the corresponding wheel.
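The deduced reckoning described above can be sketched as integrating per-wheel speed estimates into a pose estimate. The differential-drive model and all names below are assumptions chosen for illustration; the application itself only describes inferring wheel speed from motor current and time:

```python
import math

def dead_reckon(pose, v_left, v_right, wheel_base, dt):
    """Advance a pose (x, y, heading in radians) by dt seconds, given
    the left and right wheel speeds (m/s) of a differential-drive
    robot with the given wheel base (m)."""
    x, y, th = pose
    v = (v_left + v_right) / 2.0         # forward speed
    w = (v_right - v_left) / wheel_base  # turn rate
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)
```

Repeatedly applying this update to successive speed estimates yields the deduced reckoning trajectory; in practice it would be fused with the navigation sensor 185 to bound the accumulated drift.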
For enabling the robotic lawnmower 100 to navigate with reference to a boundary wire emitting a magnetic field caused by a control signal transmitted through the boundary wire, the robotic lawnmower 100 is, in some embodiments, further configured to have at least one magnetic field sensor 170 arranged to detect the magnetic field, for detecting the boundary wire and/or for receiving (and possibly also sending) information to/from a signal generator (to be discussed with reference to figure 1). In some embodiments, the sensors 170 may be connected to the controller 110, possibly via filters and an amplifier, and the controller 110 may be configured to process and evaluate any signals received from the sensors 170. The sensor signals are caused by the magnetic field being generated by the control signal being transmitted through the boundary wire. This enables the controller 110 to determine whether the robotic lawnmower 100 is close to or crossing the boundary wire, or inside or outside an area enclosed by the boundary wire.
As mentioned above, the robotic lawnmower 100 is in some embodiments arranged to operate according to a map application representing one or more work areas (and possibly the surroundings of the work area(s)) stored in the memory 120 of the robotic lawnmower 100. The map application may be generated or supplemented as the robotic lawnmower 100 operates or otherwise moves around in the work area 205. In some embodiments, the map application includes one or more start regions and one or more goal regions for each work area. In some embodiments, the map application also includes one or more transport areas.
As discussed above, the map application is in some embodiments stored in the memory 120 of the robotic working tool(s) 100. In some embodiments the map application is stored in the server (referenced 240 in figure 2). In some embodiments maps are stored both in the memory 120 of the robotic working tool(s) 100 and in the server, wherein the maps may be the same maps or show subsets of features of the area.
As discussed above, the robotic work tool is exemplified mainly as an autonomous robotic work tool. However, the teachings herein may also be applied to remote-controlled robotic work tools 100.
The robotic work tool comprises at least one detection sensor 190.
In some embodiments the robotic work tool comprises at least one long-range object detection sensor. Such sensors enable the robotic work tool to detect an object at a location remote from the robotic work tool. In some embodiments, such a distance is more than 10 meters.
In some embodiments the at least one long-range object detection sensor comprises a camera and the first device is configured to detect the one or more objects through image recognition.
In some embodiments the at least one long-range object detection sensor comprises a radar and/or a LIDAR, allowing the robotic work tool to receive range or distance indications to a potential object and to detect one or more objects through range analysis based on the data (range or distance indications) received from the long-range object detection sensor.
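Range analysis of this kind can be sketched as segmenting a scan of range readings into candidate objects (the function name, the threshold values and the index-based grouping are illustrative assumptions, not part of the embodiments):

```python
def detect_objects_from_ranges(ranges, max_range, min_points=3):
    """Split a scan of range readings into candidate objects.

    Consecutive readings below max_range are treated as returns from one
    object; very short runs are discarded as noise. Returns a list of
    index groups, one per candidate object.
    """
    objects, current = [], []
    for i, r in enumerate(ranges):
        if r < max_range:
            current.append(i)
        else:
            if len(current) >= min_points:
                objects.append(current)
            current = []
    if len(current) >= min_points:
        objects.append(current)
    return objects
```

For example, a scan with one three-reading cluster of short ranges yields a single candidate object, while a two-reading blip is rejected.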
The long-range object detection sensors further enable the robotic work tool to determine a location of the object based on the sensor data (which provides a location of the object relative to the robotic work tool) and the navigation sensor 185 (which provides a location of the robotic work tool).
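Combining the two, the absolute object location follows from the robot pose given by the navigation sensor and the relative detection, here assumed to be given as a range and a bearing (the names and the range/bearing form of the sensor data are illustrative assumptions):

```python
import math

def object_location(robot_x, robot_y, robot_heading, rng, bearing):
    """Absolute object position from the tool's pose (navigation sensor)
    and a long-range detection given relative to the tool as a range in
    metres and a bearing in radians."""
    angle = robot_heading + bearing
    return (robot_x + rng * math.cos(angle),
            robot_y + rng * math.sin(angle))
```

An object detected 10 m straight ahead of a tool at the origin facing along the x-axis is thus placed at (10, 0).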
In some embodiments the robotic work tool comprises at least one short-range object detection sensor which enables the robotic work tool to detect an object by making a physical connection with the object. The physical connection may be an electromagnetic connection, such as sensing or receiving an electromagnetic signal from the object. In some embodiments the physical connection may be to make actual physical contact, such as by colliding with the object.
In some embodiments the at least one short-range object detection sensor comprises a collision sensor and wherein the robotic work tool is configured to detect an object by noting locations where collisions occur.
In some embodiments the at least one short-range object detection sensor comprises one or more deduced reckoning sensors, such as an acceleration sensor (such as a gyroscope or inertial measurement unit) and/or an odometer, enabling the robotic work tool to detect elevation changes, slopes, slippage and possibly other features of an obstacle (as will be discussed further below).
The short-range object detection sensors enable the robotic work tool to determine a location of the object based on the navigation sensor 185, which provides a location of the robotic work tool, that location indicating the location of the object.
It should be noted that any of the long-range object detection sensors may also be used for close-range (such as physical contact) detection of objects. In such embodiments the resolution or range of the sensor can be set lower than when used for long-range detection.
Figure 2 shows a robotic work tool system 200 in some embodiments. The schematic view is not to scale. The robotic work tool system 200 comprises one or more robotic work tools 100 according to the teachings herein. It should be noted that the operational area 205 shown in figure 2 is simplified for illustrative purposes. The robotic work tool system comprises a boundary 220 that may be virtual and/or electromechanical. An example of an electromechanical border is one generated by a magnetic field caused by a control signal being transmitted through a boundary wire, which magnetic field is sensed by sensors 170 in the robotic work tool 100. An example of a virtual border is one defined by coordinates and navigated using a location-based navigation system, such as a GPS (or RTK) system.
The robotic work tool system 200 further comprises a station 210, possibly at a station location. A station location may alternatively or additionally indicate a service station, a parking area, a charging station or a safe area where the robotic work tool may remain for a time period between or during operation sessions.
As with figures 1A and 1B, the robotic work tool(s) is exemplified by a robotic lawnmower, whereby the robotic work tool system may be a robotic lawnmower system or a system comprising a combination of robotic work tools, one being a robotic lawnmower, but the teachings herein may also be applied to other robotic work tools adapted to operate within a work area.
The one or more robotic working tools 100 of the robotic work tool system 200 are arranged to operate in an operational area 205, which in this example comprises a first work area 205A and a second work area 205B connected by a transport area TA. However, it should be noted that an operational area may comprise a single work area or one or more work areas, possibly arranged adjacent to one another for easy transition between the work areas, or connected by one or more transport paths or areas, also referred to as corridors. In the following, work areas and operational areas will be referred to interchangeably, unless specifically indicated.
The operational area 205 is in this application exemplified as a garden, but can also be other work areas as would be understood, such as a (part of a) neighbourhood, or a sports field to mention a few examples. A garden and a (part of a) neighbourhood are both examples of domestic areas.
As discussed above, the garden may contain a number of obstacles and/or objects, for example a number of trees, stones, slopes and houses or other structures.
In some embodiments the robotic work tool is arranged or configured to traverse and operate in work areas that are not essentially flat, but contain terrain of varying altitude, such as undulating terrain comprising hills or slopes or such. The ground of such terrain is not flat and it is not straightforward how to determine an angle between a sensor mounted on the robotic work tool and the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are not easily discerned from the ground. Examples of such are grass- or moss-covered rocks, roots or other obstacles that are close to the ground and of a similar colour or texture as the ground. The robotic work tool is also or alternatively arranged or configured to traverse and operate in a work area that contains obstacles that are overhanging, i.e. obstacles that may not be detectable from the ground up, such as low-hanging branches of trees or bushes. Such a garden is thus not simply a flat lawn to be mowed or similar, but a work area of unpredictable structure and characteristics. The operational area, or any of its work areas 205 exemplified with reference to figure 2, may thus be such a non-uniform area as disclosed in this paragraph that the robotic work tool is arranged to traverse and/or operate in.
For the purpose of the teachings herein, an object in an operational area is taken to be any obstacle or object discussed herein. In some embodiments, an object may also be any area that is not a grass area, such as a gravel area, a sand area, a walkway, a road, a path or any other area that is not covered by grass.
As shown in figure 2, the robotic working tool(s) 100 is arranged to navigate in one or more work areas 205A, 205B, possibly connected by a transport area TA.
The robotic working tool system 200 may alternatively or additionally comprise or be arranged to be connected to a server 240, such as a cloud service, a cloud server application or a dedicated server 240. The connection to the server 240 may be direct from the robotic working tool 100, indirect from the robotic working tool 100 via the service station 210, and/or indirect from the robotic working tool 100 via user equipment (not shown).
As a skilled person would understand, a server, a cloud server or a cloud service may be implemented in a number of ways utilizing one or more controllers 240A and one or more memories 240B that may be grouped in the same server or over a plurality of servers.
In the following, several embodiments of how the robotic work tool 100 may be adapted will be disclosed. It should be noted that all embodiments may be combined in any combination providing a combined adaptation of the robotic work tool.
Figure 3A shows a schematic view of an example operating area 205, possibly one such as discussed in relation to figure 2, for use with a robotic work tool system 200 as discussed in relation to figure 2. A first robotic work tool 100:1 is set to navigate the operational area 205. It should be noted that for the teachings herein an operational area 205 may, in some embodiments, be thought of as any of its sub areas. In some embodiments the robotic work tool 100:1 is configured to navigate the operational area 205 by following a boundary of the operational area 205, at least partially. In some embodiments the robotic work tool 100:1 is configured to navigate the operational area 205 by following a pattern in the operational area 205, at least partially, such as a zig-zag pattern. In some such embodiments, the robotic work tool is configured to traverse the pattern so that each part of the pattern is at a traversal distance td of at least 5, 10 or more meters, or as large as possible while ensuring sensor functionality.
As the robotic work tool 100:1 navigates or traverses the operational area, it does so at a high speed, detecting at least some of the objects in the operational area 205. The robotic work tool utilizes long-range object detection sensors 190 to detect objects remotely. In the illustrative example of figure 3A, the objects are two trees referenced T, a boulder referenced B, a flower bed referenced FB, a slope referenced S (as an example of an obstacle that is not an object per se) and a sandy patch referenced SP (as an example of an object being an area that is not an area of grass). In some embodiments, the speed is 1 m/s, 2 m/s, 3 m/s or higher.
As an object is detected, a location is noted for the object in the map. Also, as the robotic work tool navigates the operational area, its location is also noted, thereby providing an initial mapping of the operational area 205. The initial mapping is made at a low resolution. In some embodiments the low resolution of the initial mapping is in a range of 1 data point at every 1, 5 or 10 meters or any range thereinbetween. In some embodiments the low resolution of the initial mapping is in a range of 1 data point at every 1%, 2%, 3%, 4% or 5% of the length of the boundary of the operating area or any range thereinbetween. In this context, a data point is an entry noting a location in the map.
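The low-resolution sampling described above can be sketched as keeping one data point per travelled interval (a minimal illustration; the function name, the 2D coordinates and the 1-metre interval in the example are assumptions, not taken from the embodiments):

```python
import math

def record_track(positions, interval):
    """Thin a stream of positions to at most one data point per
    `interval` metres travelled - one simple way to realise the
    low-resolution initial mapping."""
    kept = []
    last = None
    for p in positions:
        if last is None or math.dist(p, last) >= interval:
            kept.append(p)
            last = p
    return kept
```

With a 1-metre interval, positions closer than a metre to the last kept data point are dropped from the map.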
The robotic work tool is also configured to determine or generate an area 206 around one or more of the detected objects. In some such embodiments, the area represents or is defined by the extent of the object. In some alternative or additional such embodiments, the area is defined by an area enveloping the extent of the object. In some alternative or supplemental such embodiments, the area represents or is defined by the extent of the object plus a range of 1, 2, 3 or 5 meters extending out from (an edge of) the object in all directions.
In some embodiments, the area is defined as an area that envelopes more than one object, and the robotic work tool is configured to determine that a distance between two detected objects is less than or equal to a closeness distance cd, and if so, the two objects should be enveloped by the same area 206. In some such embodiments, a closeness distance is 2, 3, 4, 5, 10 meters or any range thereinbetween.
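Grouping detected objects so that objects within the closeness distance share one enveloping area can be sketched with a small union-find pass (an illustrative sketch; the function names and the point representation of objects are assumptions):

```python
import math

def group_objects(locations, closeness):
    """Group detected object locations so that objects closer than or
    equal to `closeness` end up in the same group, i.e. would share one
    enveloping area 206. Uses union-find so closeness is transitive."""
    parent = list(range(len(locations)))

    def find(i):
        # Find the group root, compressing the path as we go.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(locations)):
        for j in range(i + 1, len(locations)):
            if math.dist(locations[i], locations[j]) <= closeness:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(len(locations)):
        groups.setdefault(find(i), []).append(locations[i])
    return list(groups.values())
```

Three objects where two lie within the closeness distance of each other thus yield two areas: one enveloping the close pair and one for the remaining object.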
In the illustrative example of figure 3A, the robotic work tool has assigned or determined five (5) such areas, some around a single object, some around multiple objects.
Figure 3B shows a schematic view of the example of figure 3A, but where a second robotic work tool is caused to provide a more detailed supplemental mapping of the operational area, but only in the areas of special interest, namely the areas enveloping one or more objects. It should be noted that in some embodiments the supplemental mapping is performed during at least one work session in the operational area, and not necessarily as part of an initial setup or installation of the operational area, but as part of ongoing operations in the operational area.
The second robotic work tool is caused to move out to the area(s) and to navigate the entire area, or at least as much as is possible, as some of the area will be occupied by the object(s) in the area. As a skilled person would understand, there are many ways for the second robotic work tool to navigate the area. As the area 206 is navigated, it is mapped at a higher resolution than the initial mapping. In some embodiments, the second higher resolution is in a range of 1 data point at every 0.01, 0.1, 0.02, or 0.5 meters or any range thereinbetween.
In some embodiments, the second robotic work tool maps the area(s) at a lower (second) speed. In some embodiments, the second speed is 0.01, 0.1, 0.02, or 0.5 m/s or any range thereinbetween.
This provides for a detailed mapping of areas of interest and a low-resolution mapping of areas where there is no change, which speeds up the mapping process significantly.
During this high-resolution mapping, the objects may be investigated in further detail, as to their extent, possibility to traverse, and/or other properties, such as slope or wheel slip, to mention a few examples.

It should be noted that although the teachings herein have been focussed on the first and second robotic work tools performing the determinations and detections, some or all of the processing may actually be performed by the server, by the robotic work tool(s) transmitting the gathered data from the navigation sensors and/or the object detection sensors to the server for further determinations and analysis.
In some such embodiments the robotic work tool system comprises a server 240, and the first robotic work tool is configured to determine the initial mapping by transmitting navigation data to the server. In some such embodiments the first robotic work tool is configured to detect the one or more objects and determine at least one area in the initial mapping comprising at least one of the one or more detected objects by transmitting sensor data to the server.
In some such embodiments the second robotic work tool is configured to determine the supplemental mapping by transmitting navigation data and sensor data to the server.
The server is thus, in some embodiments, configured to receive navigation data and sensor data from the first robotic work tool and in response thereto determine the initial mapping, and/or detection of any objects in the operational area.
The server is, in some embodiments, further configured to receive navigation data from the second robotic work tool and in response thereto determine the supplemental mapping, and/or detection of any objects in the operational area.
It should be noted that any determination or detection may be performed by any of the first robotic work tool, the second robotic work tool and/or the server.
It should also be noted that in some embodiments, the first robotic work tool is the second robotic work tool, wherein the same robotic work tool is simply used to map out the operational area in two phases.
In some embodiments, the initial mapping is made by a device not being a robotic work tool, for example a remote-controlled drone or a user carrying a smartphone or a tablet computer.

Figure 4 shows a general flowchart of a method according to the teachings herein for use in a robotic work tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising at least one navigation sensor and at least one long-range object detection sensor and a second robotic working tool comprising at least one navigation sensor and at least one short-range object detection sensor. The method comprises the first device: determining 410 an initial mapping of the operating area in a first low resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detecting 420 one or more objects utilizing the at least one long-range object detection sensor; and determining 430 at least one area in the initial mapping comprising at least one of the one or more detected objects, the areas being areas of interest. The method further comprises the second robotic work tool: going to the at least one area; and determining 440 a supplemental mapping of the operating area, by mapping out the at least one area utilizing the at least one navigation sensor and at least one short-range object detection sensor for detecting 450 the same and/or more objects in a second resolution, wherein the first resolution is lower than the second resolution. A robotic work tool system may thus in some embodiments be configured to perform the method according to figure 4, as discussed above for example in relation to figures 3A and 3B.

Claims (32)

Claims
1. A robotic working tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising a controller (110), at least one navigation sensor and at least one long-range object detection sensor and a second robotic working tool comprising a controller (110), at least one navigation sensor and at least one short-range object detection sensor, wherein the controller of the first device is configured to: determine an initial mapping of the operating area in a first resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detect one or more objects utilizing the at least one long-range object detection sensor; and determine at least one area in the initial mapping comprising at least one of the one or more detected objects; wherein the controller of the second robotic work tool is configured to determine a supplemental mapping of the operating area, by mapping out the at least one area utilizing the at least one navigation sensor and at least one short-range object detection sensor in a second resolution, wherein the first resolution is lower than the second resolution.
2. The system according to claim 1, wherein the at least one navigation sensor of the first device comprises a GNSS sensor and the first device is configured to determine the initial mapping based on coordinates received through the GNSS sensor.
3. The system according to claim 1 or 2, wherein the at least one navigation sensor of the first device comprises a radar or LIDAR and the first device is configured to determine the initial mapping based on SLAM.
4. The system according to any preceding claim, wherein the at least one navigation sensor of the first device comprises a camera and the first device is configured to determine the initial mapping based on VSLAM.
5. The system according to any preceding claim, wherein the first device is configured to determine the initial mapping of the operating area while moving at a first speed and wherein the second robotic work tool is configured to determine the supplemental mapping of the operating area while moving at a second speed, wherein the first speed is higher than the second speed.
6. The system according to claim 5, wherein the first speed is higher than two times the second speed.
7. The system according to any preceding claim, wherein the first device is configured to determine the initial mapping of the operating area by moving along the boundary of the operating area.
8. The system according to any preceding claim, wherein the controller of the first device is configured to determine the initial mapping of the operating area by traversing the operating area in a pattern where each partial path is at least a traversal distance from one another.
9. The system according to claim 8, wherein the traversal distance is 5, 10 or 20 meters or any range thereinbetween.
10. The system according to any preceding claim, wherein the controller of the first device is configured to determine at least one area in the initial mapping comprising at least one of the one or more detected objects by determining an enveloping area around objects that are less than a closeness distance from one another.
11. The system according to claim 10, wherein the closeness distance is 2, 3, 4, 5, 10 meters or any range thereinbetween.
12. The system according to any preceding claim, wherein a long-range object detection sensor is a sensor capable of detecting an object at a distance of over 10 m.
13. The system according to any preceding claim, wherein the at least one long-range object detection sensor comprises a camera and the first device is configured to detect the one or more objects through image recognition.
14. The system according to any preceding claim, wherein the at least one long-range object detection sensor comprises a radar and/or a LIDAR and the first device is configured to detect the one or more objects through range analysis based on data received from the long-range object detection sensor.
15. The system according to any preceding claim, wherein the robotic work tool system further comprises a server, and wherein the first device is configured to determine the initial mapping by transmitting navigation data to the server.
16. The system according to claim 15, wherein the first device is configured to detect the one or more objects and determine at least one area in the initial mapping comprising at least one of the one or more detected objects by transmitting sensor data to the server.
17. The system according to any preceding claim, wherein the controller of the second robotic work tool is configured to map out the at least one area by traversing the whole area.
18. The system according to any preceding claim, wherein the at least one short-range object detection sensor operates by detecting an object through physical connection.
19. The system according to any preceding claim, wherein the at least one short-range object detection sensor comprises a collision sensor and wherein the controller of the second robotic work tool is configured to map out the at least one area by noting locations where collisions occur.
20. The system according to any preceding claim, wherein the first resolution is in a range of 1 data point at every 1, 5 or 10 meters or any range thereinbetween.
21. The system according to any preceding claim, wherein the first resolution is in a range of 1 data point at every 1%, 2%, 3%, 4% or 5% of the length of the boundary of the operating area or any range thereinbetween.
22. The system according to any preceding claim, wherein the second resolution is in a range of 1 data point at every 0.01, 0.1, 0.02, or 0.5 or any range thereinbetween.
23. The system according to any of claims 20, 21 and/or 22, wherein a data point is an entry in a map.
24. The system according to any preceding claim, wherein the first device is a first robotic work tool.
25. The system according to claim 24, wherein the first robotic work tool is the second robotic work tool.
26. The system according to any of claims 1 to 23, wherein the first device is a drone.
27. The system according to any of claims 1 to 23, wherein the first device is a smartphone or tablet computer.
28. The system according to any preceding claim, wherein an object is any area not being a grass area.
29. A server configured to be used in a robotic work tool system according to any previous claim, wherein the server comprises a controller (240A) configured to receive navigation data and sensor data from the first robotic work tool and in response thereto determine the initial mapping.
30. The server according to claim 29, wherein the controller is further configured to detect any objects in the operational area (205).
31. The server according to claim 29 or 30, wherein the controller is further configured to receive navigation data from the second robotic work tool and in response thereto determine the supplemental mapping, and/or detection of any objects in the operational area.
32. A method for a robotic working tool system for mapping an operating area having a boundary, the robotic work tool system comprising a first device comprising at least one navigation sensor and at least one long-range object detection sensor and a second robotic working tool comprising at least one navigation sensor and at least one short-range object detection sensor, wherein the method comprises the first device: determining an initial mapping of the operating area in a first resolution utilizing the at least one navigation sensor and the at least one long-range object detection sensor; detecting one or more objects utilizing the at least one long-range object detection sensor; and determining at least one area in the initial mapping comprising at least one of the one or more detected objects; wherein the method further comprises the second robotic work tool: determining a supplemental mapping of the operating area, by mapping out the at least one area utilizing the at least one navigation sensor and at least one short-range object detection sensor in a second resolution, wherein the first resolution is lower than the second resolution.
SE2151621A 2021-12-25 2021-12-25 Improved navigation for a robotic work tool system SE2151621A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2151621A SE2151621A1 (en) 2021-12-25 2021-12-25 Improved navigation for a robotic work tool system
PCT/SE2022/050931 WO2023121528A1 (en) 2021-12-25 2022-10-14 Improved navigation for a robotic work tool system


Publications (1)

Publication Number Publication Date
SE2151621A1 true SE2151621A1 (en) 2023-06-26

Family

ID=84044676

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2151621A SE2151621A1 (en) 2021-12-25 2021-12-25 Improved navigation for a robotic work tool system

Country Status (2)

Country Link
SE (1) SE2151621A1 (en)
WO (1) WO2023121528A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170108867A1 (en) * 2015-10-15 2017-04-20 Honda Research Institute Europe Gmbh Autonomous vehicle with improved simultaneous localization and mapping function
US20180364045A1 (en) * 2015-01-06 2018-12-20 Discovery Robotics Robotic platform with mapping facility
US20200150655A1 (en) * 2017-03-02 2020-05-14 RobArt GmbH Method for controlling an autonomous, mobile robot
SE1951435A1 (en) * 2019-12-12 2021-06-13 Husqvarna Ab Improved navigation control for a robotic work tool
SE2050264A1 (en) * 2020-03-10 2021-09-11 Husqvarna Ab Improved navigation for a robotic work tool
AU2020300962A1 (en) * 2019-04-06 2021-11-04 Electric Sheep Robotics, Inc. System, devices and methods for tele-operated robotics

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10789472B1 (en) * 2017-06-14 2020-09-29 Amazon Technologies, Inc. Multiple image processing and sensor targeting for object detection
GB2589419A (en) * 2019-08-09 2021-06-02 Quantum Leap Tech Limited Fabric maintenance sensor system


Also Published As

Publication number Publication date
WO2023121528A1 (en) 2023-06-29
