CN111201497A - Autonomous robot system - Google Patents

Autonomous robot system

Info

Publication number
CN111201497A
CN111201497A (application CN201880048036.3A)
Authority
CN
China
Prior art keywords: electronic device, processor, mobile electronic, movement, obstacle
Prior art date
Legal status: Pending
Application number
CN201880048036.3A
Other languages
Chinese (zh)
Inventor
列奥尼德·科夫顿
马克西米立安·科夫顿
列奥尼德·雷振科
塔拉斯·叶尔马科夫
Current Assignee
Companion Robot Co Ltd
Original Assignee
Companion Robot Co Ltd
Priority date
Filing date
Publication date
Priority claimed from US 15/725,656 (published as US 2018/0360177 A1)
Application filed by Companion Robot Co Ltd filed Critical Companion Robot Co Ltd
Publication of CN111201497A

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 — Instruments for performing navigational calculations
    • A — HUMAN NECESSITIES
    • A45 — HAND OR TRAVELLING ARTICLES
    • A45C — PURSES; LUGGAGE; HAND CARRIED BAGS
    • A45C15/00 — Purses, bags, luggage or other receptacles covered by groups A45C1/00 - A45C11/00, combined with other objects or articles
    • A45C5/00 — Rigid or semi-rigid luggage
    • A45C5/03 — Suitcases
    • A45C5/14 — Rigid or semi-rigid luggage with built-in rolling means
    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 — Control of position or course in two dimensions
    • G05D1/021 — Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0276 — Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028 — Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 — Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Purses, Travelling Bags, Baskets, Or Suitcases (AREA)

Abstract

A system for identifying and following a mobile electronic device, the system comprising: an antenna for receiving and transmitting signals; a plurality of sensors for distance measurement; a processor; and a memory in communication with the processor. The memory stores instructions that, when executed by the processor, cause the processor to: determine a speed and a direction of the mobile electronic device; adjust a movement path of the system based on the determined speed and direction of the mobile electronic device; determine a distance between the mobile electronic device and the system; and command the system to follow the mobile electronic device within a predetermined range of that distance while identifying and avoiding obstacles in the system's path of movement.

Description

Autonomous robot system
Cross Reference to Related Applications
The present application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/651,023, filed March 30, 2018, and is a continuation-in-part of PCT Application No. PCT/US17/57319, filed October 19, 2017, and of U.S. Patent Application No. 15/725,656, filed October 5, 2017, which claims priority to and the benefit of U.S. Provisional Patent Application No. 62/530,744, filed July 10, 2017, all of which are incorporated herein by reference in their entirety.
Technical Field
The present invention relates generally to robotics, and more particularly to methods and apparatus for locating a moving target and following it while detecting and avoiding obstacles in the path of movement.
Background
People often travel with items such as bags or suitcases. These items come in different sizes; some are heavy and large, making them difficult to handle. People also frequently lose their bags or suitcases while traveling, which is stressful for most travelers. A solution for traveling safely and reliably with heavy or large items has therefore long been sought. An autonomous robotic system that locates and follows its user provides security and adapts to different environments.
Previous solutions for autonomous robotic systems that locate and follow their users include camera-based vision systems. These work only when there is a single person in the camera or video coverage area, because systems that rely solely on visual images have difficulty distinguishing the target user from a group of people who dress alike or look similar. Furthermore, tracking by camera or video requires a great deal of computation and may raise security concerns. Other approaches to tracking a target include sound tracking, thermal sensors, RFID, and Bluetooth technologies. Tracking by sound is impractical because it would require the continuous emission of sounds recognizable by the system. Tracking by thermal sensors becomes unreliable when the system is in an environment with multiple heat sources in range (e.g., more than one person or animal). RFID and remote-control technologies work only when the target is directly visible to the device. Currently available Bluetooth-based solutions face three problems. First, a person's body can attenuate and scatter Bluetooth signals. Second, many signals are reflected around the Bluetooth device itself, and the signals depend heavily on the location of the source, e.g., the cell phone emitting the Bluetooth signal. Third, whenever a Bluetooth device changes location, all of the signal parameters change, making it difficult to determine the speed of the system and of the moving target, and the distance between them.
Disclosure of Invention
In some embodiments, a system for identifying and following a mobile electronic device comprises: an antenna for receiving and transmitting signals; a plurality of sensors for distance measurement; a processor; and a memory in communication with the processor. In some embodiments, the memory stores instructions that, when executed by the processor, cause the processor to: determine a speed and a direction of the mobile electronic device; adjust a movement path of the system based on the determined speed and direction of the mobile electronic device; determine a distance between the mobile electronic device and the system; command the system to follow the mobile electronic device within a predetermined range of said distance; identify an obstacle in the path of movement of the system; command the system to stop for a predetermined period of time when an obstacle is identified; determine whether the obstacle is still in the path of movement of the system after the predetermined period of time; adjust the movement path of the system when it is determined that the obstacle is still in the movement path of the system; and command the system to continue following the mobile electronic device within the predetermined distance range when it is determined that the obstacle is no longer in the path of movement of the system.
A method for identifying and following a mobile electronic device by a system comprises: determining, by a processor, a speed and a direction of the mobile electronic device; adjusting, by the processor, a movement path of the system based on the determined speed and direction of the mobile electronic device; determining, by the processor, a distance between the mobile electronic device and the system; instructing, by the processor, the system to follow the mobile electronic device within a predetermined range of the distance; identifying, by the processor, an obstacle in the path of movement of the system; commanding, by the processor, the system to stop for a predetermined period of time when the obstacle is identified; determining, by the processor, whether the obstacle is still in the path of movement of the system after the predetermined period of time; adjusting, by the processor, the movement path of the system when it is determined that the obstacle is still in the movement path of the system; and instructing, by the processor, the system to continue following the mobile electronic device within the predetermined distance range when it is determined that the obstacle is no longer in the path of movement of the system.
In some embodiments, the processor pairs with the mobile electronic device via Bluetooth, and the system follows the mobile electronic device only after Bluetooth pairing.
In some embodiments, the system includes a camera to perform object recognition to identify obstacles and to send object recognition signals.
In some embodiments, the command to stop the system is based on a predetermined threshold of distance between the system and the obstacle.
In some embodiments, the system includes an engine controller that controls movement of the system.
In some embodiments, the command to cause the system to follow the mobile electronic device within a predetermined distance range is based on the speed and direction of the system.
In some embodiments, the processor commands the system to increase its speed when the system is physically pulled at a predetermined angle relative to the ground.
In some embodiments, the processor commands the system to rotate a plurality of omni-wheels of the system 180 degrees when the mobile electronic device is at a predetermined threshold angle relative to the system.
In some embodiments, the system includes a joystick to control movement of the system.
In some embodiments, the system includes one or more of a suitcase, a bag, a cargo, a cart, a pallet, and a container.
One skilled in the art will recognize that any combination of the above embodiments is within the scope of the present invention. For example, the system may also include a camera, joystick, and/or engine controller.
Drawings
FIG. 1 is a front view of an autonomous robotic system according to some embodiments of the invention;
FIG. 2 is a side view of an autonomous robotic system according to some embodiments of the invention;
FIG. 3 is a rear view of an autonomous robotic system according to some embodiments of the invention;
FIG. 4 illustrates a process flow for an autonomous robotic system to locate a target, according to some embodiments of the invention;
FIG. 5 illustrates a process flow of an autonomous robotic system moving towards a fixed target in accordance with some embodiments of the invention;
FIG. 6 illustrates a process flow for an autonomous robotic system to react to a discovered obstacle in accordance with some embodiments of the invention;
FIG. 7 illustrates a process flow for an autonomous robotic system to follow a moving target, in accordance with some embodiments of the invention;
FIG. 8 illustrates a process flow for an autonomous robotic system assisting a user, according to some embodiments of the present invention.
Detailed Description
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the invention relates.
An autonomous robotic system, such as a suitcase, bag, cargo, cart, tray, container, or similar wheeled item (the "system"), locates a target, such as an electronic device (e.g., a smartphone, laptop, or tablet) carried by a user, and follows the target while detecting and avoiding obstacles in its path of movement. The system connects wirelessly with a target, for example a handheld electronic device such as a smartphone, and specifically "pairs" with that target in order to follow it as it moves. The system navigates through crowds of people, recognizing and avoiding objects in its way while tracking the target's path. In some aspects, the system is able to move through people and obstacles without any additional peripheral equipment. In some aspects, the autonomous robotic system includes omni-wheels that allow multi-directional movement, both vertical and horizontal, and better stability. While following the moving target, the system moves at a speed within a predetermined threshold of the target's moving speed.
The system uses a camera or video recorder to survey its environment. In some aspects, the camera or recorder may be remotely controlled. In some aspects, the system includes a position-identification application, such as a Global Positioning System (GPS) chip, to orient itself and track its position. In some aspects, the GPS chip is movable. In some aspects, the system may include two additional GPS chips. In some embodiments, the system uses artificial intelligence (AI) and machine learning to optimize its movements. The system may include an integrated adaptive AI that identifies its environment (e.g., aboard an airplane in flight) and adjusts its movements accordingly. The system may include virtual reality (VR) and camera integration that can be used to reconstruct images of the system's path of movement. The system may also include a directional indicator, such as a speaker, for guiding sight-impaired users.
Fig. 1 is a front view of an autonomous robotic system 100 according to some embodiments of the invention. The basic components of the system 100 may include directional antennas 102, 104, 106, 108; ranging sensors 110, 112; a processor and memory 114; and wheels 116, 118. The system may include a user-transmitter detection module that comprises specially designed directional antennas 102, 104, 106, 108 and a Bluetooth Low Energy module with algorithms for data processing. The directional antennas 102, 104, 106, 108 detect the target by searching for its wireless signal transmitter (e.g., a smartphone, smartwatch, or electronic bracelet). The differences in the strength of the signals received by the directional antennas 102, 104, 106, 108 are used to determine the range and angle of the target relative to the system, for example by sector and difference methods.
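For illustration only, one way such a sector-and-difference computation could work is sketched below. The antenna headings, the weighting of the strength difference, and the function names are assumptions of this sketch; they are not taken from the embodiment.

```python
import math

# Hypothetical headings (degrees) of the four directional antennas 102/104/106/108,
# assumed here to face the front, right, back, and left of the system.
ANTENNA_HEADINGS = [0.0, 90.0, 180.0, 270.0]

def estimate_target_angle(rssi_dbm):
    """Estimate the bearing of the transmitter from four RSSI readings (dBm).

    Sketch only: pick the sector of the strongest antenna, then refine the
    bearing from the difference between its two neighbouring antennas.
    """
    if len(rssi_dbm) != 4:
        raise ValueError("expected one RSSI value per antenna")
    best = max(range(4), key=lambda i: rssi_dbm[i])
    left = rssi_dbm[(best - 1) % 4]
    right = rssi_dbm[(best + 1) % 4]
    diff = right - left                       # positive: target offset to the right
    span = max(abs(right) + abs(left), 1e-6)
    offset = 45.0 * diff / span               # stay within the 90-degree sector
    return (ANTENNA_HEADINGS[best] + offset) % 360.0

# Example: strongest signal on the front antenna, slightly stronger to its right.
print(estimate_target_angle([-52.0, -60.0, -75.0, -66.0]))
```

In practice, the mapping from signal-strength differences to an angle would be tuned by the per-transmitter calibration described later.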
The system may also detect the target using ranging sensors. Suitable distance sensors include ultrasonic ranging sensors and/or laser ranging sensors. In addition, the ranging sensors can be used to detect obstacles in the system's path of movement, such as people, fixtures, and buildings.
The system may locate the target through its visual recognition module when the target is within visual range. As shown in Fig. 1, the visual recognition module may include a camera 120. In some embodiments, the system may include a visual recognition and target-person fixation module that uses person image recognition. In some aspects, the module may include at least one camera 120 centrally located on the top of the system. The module covers an arc from 0° to 170° and activates when a person is within a visual range set at a predetermined distance (e.g., 25 cm from the ground) and a predetermined angle (e.g., a 45° angle). The module also includes an algorithm that processes the image of the target person to determine whether a person within range is the intended target. In some aspects, the camera 120 is configured as a movable 360-degree virtual reality camera.
Fig. 2 is a side view of an autonomous robotic system 200 according to some embodiments of the invention. In some aspects, the system may include a plurality of ultrasonic ranging sensors 110, 112 located at the front of the system, and a plurality of laser ranging sensors 202, 204 located at the top of the system. The system may also include a bio-lock system 206 activated by, for example, a fingerprint, face, or iris. In some aspects, the system may further comprise a mechanism for manual locking.
Fig. 3 is a rear view 300 of an autonomous robotic system according to some embodiments of the invention. The system includes a notification module, such as indicator lights and/or audible indicators; for example, the system may include an addressable RGB LED strip 208. The notification module may also include a speaker 302. The RGB LED strip 208 and speaker 302 are configured to provide a variety of light patterns and sound effects. The notification module need not act autonomously; it may be configured to activate under various circumstances, such as when the system is powered on from an off state; when an obstacle is detected in the system's path of movement; when an obstacle cannot be circumvented or avoided, such as when a step is detected; when the connection is interrupted; when entering a turn; when rotating about the system's axis; and on unexpected removal, for example when someone attempts to steal the system. The power source of the system includes batteries, solar panels, and other means of providing long-lasting energy; one example is a removable battery 304, which may be charged wirelessly.
The system may also include a decision-making module; a "decision" is the result of sequential processing by the system's "working components" (a pipeline). In some embodiments, the decision-making process includes receiving data related to the system engine, such as data from an odometer, and setting the primary movement speed and angle of the system. The stages of decision making may include identifying a target, such as a handheld electronic device or a target person. The system communicates with the electronic device, or uses facial recognition data, to obtain target location information, including angle and distance. An electronic device (e.g., a smartphone, smartwatch, smart band, etc.) may be a mobile target. The system also calculates the velocity of the target, corrects its own rotation angle, and sets its direction of movement based on the position of the target. If the system detects that it is too close to or too far from the target, or completely loses its connection with the user, it may stop moving and send a notification to the electronic device it is following.
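A minimal sketch of how such a pipeline stage could be organized is shown below. The data fields, follow-range limits, and gentle-acceleration rule are illustrative assumptions chosen for readability; they are not specified by the embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetFix:
    angle_deg: float       # bearing of the target relative to the system
    distance_m: float      # range to the target
    speed_mps: float       # estimated target speed

@dataclass
class EngineCommand:
    wheel_angle_deg: float
    wheel_speed_mps: float

# Hypothetical limits; the embodiment only says the spacing is a predetermined range.
MIN_FOLLOW_M, MAX_FOLLOW_M = 1.0, 10.0

def decide(odometry_speed_mps: float, fix: Optional[TargetFix]) -> EngineCommand:
    """One pass through the decision pipeline: odometry and a target fix in, engine command out."""
    if fix is None:
        # Connection with the user lost: stop (a notification is sent elsewhere).
        return EngineCommand(0.0, 0.0)
    if fix.distance_m < MIN_FOLLOW_M or fix.distance_m > MAX_FOLLOW_M:
        return EngineCommand(0.0, 0.0)              # too close or too far: stop
    # Steer toward the target and roughly match its speed, accelerating gently.
    speed = min(fix.speed_mps, odometry_speed_mps + 0.5)
    return EngineCommand(wheel_angle_deg=fix.angle_deg, wheel_speed_mps=speed)

print(decide(1.0, TargetFix(angle_deg=15.0, distance_m=3.0, speed_mps=1.4)))
```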
The system searches a pre-approved list of targets/devices to establish a "pairing". For example, the system may search for a particular person's car, smartphone, smartwatch, and/or tablet, which may be "pre-approved" targets that the system may follow. Once the initial pairing of the target device and the system succeeds, the paired device is considered a trusted device and becomes the target. From that point on, the system will not pair with any other target unless it receives a further command. When the system or the target device is shut down, the connection between the system and the target is terminated.
In some aspects, the system establishes exclusive target determination by verifying the target's Bluetooth protocol during the exchange of identification codes between the system and the server and during the initial connection. After the system is first activated and a connection to the target is established, the system enters a calibration procedure, which may optionally be selected by the user in the mobile application. Individual transmitters of different types may have different receive and transmit antennas with different characteristics. To be compatible with all types of wireless signal transmitters, an initial calibration of each transmitter relative to the system is required, balancing the effects of the different individual signal transmitters so that the distance and angle of the system relative to the target can be determined with better precision.
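One common way to perform such a per-transmitter calibration is to fit a log-distance path-loss model to signal-strength samples taken at known distances. The sketch below assumes that model, which the embodiment does not specify, so the constants, sample values, and function names are illustrative only.

```python
import math

def fit_path_loss(samples):
    """Fit RSSI(d) = A - 10*n*log10(d) to (distance_m, rssi_dbm) samples.

    Returns (A, n): A is the RSSI at 1 m, n the path-loss exponent.
    Plain least squares on x = log10(d); assumes at least two distinct distances.
    """
    xs = [math.log10(d) for d, _ in samples]
    ys = [r for _, r in samples]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - slope * mean_x       # intercept = RSSI at log10(d) = 0, i.e. at 1 m
    return a, -slope / 10.0

def distance_from_rssi(rssi_dbm, a, n):
    """Invert the calibrated model to estimate distance in metres."""
    return 10 ** ((a - rssi_dbm) / (10.0 * n))

# Example calibration walk: the user holds the transmitter at a few known distances.
A, N = fit_path_loss([(1.0, -45.0), (2.0, -52.0), (4.0, -58.0), (8.0, -65.0)])
print(round(distance_from_rssi(-55.0, A, N), 2))
```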
FIG. 4 illustrates a process flow of an autonomous robotic system for locating a target according to some embodiments of the invention. The target that the system detects and follows may be a wireless electronic device, such as a smartphone, that supports the Bluetooth protocol for pairing with the system. The user may activate the pairing process, for example, through a mobile application previously downloaded onto the wireless electronic device. In some aspects, the system automatically searches for devices to establish a "pairing". To prevent the system from following a wrong target, e.g., pairing with an unintended smartphone, the target wireless electronic device is registered and authenticated.
Thus, when the system detects a wireless electronic device in block 402 and determines that the device is a possible target, the system begins the authentication process by determining whether the device has been registered. Each registered target/device has a unique serial number; only authenticated, registered targets/devices are able to control and monitor the system. The device authentication process registers the serial number with the server, e.g., under a remote user account stored in the cloud. As shown in block 410, the system first determines whether the serial number of the device is present on the server, and then verifies in block 412 that the device associated with the serial number has been registered. If the device has not been registered, the system asks the user to confirm during the initial connection in block 404, seeking permission to register the device for verification in block 406. If the user grants permission, the system registers the device on the server and runs the verification process in block 408. Examples of registration methods include using an email address and/or a phone number associated with the smartphone. User permissions include, for example, Bluetooth usage, access to GEO data, and the like. In some aspects, the mobile application control panel is the primary application control panel, with a set of functions for controlling and monitoring devices. The mobile application may also include an option to link a particular system to the user account to enhance protection against unauthorized access and theft. The system detects the target position and follows the user while observing the set distance between itself and the user and maintaining an optimal speed. Optionally, an indicator alarm, such as a light or sound, may be triggered when the system loses its connection with its target.
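A minimal sketch of the registration check in blocks 402-412 follows, assuming the server exposes a simple lookup of registered serial numbers. The in-memory store, function names, and confirmation callback are hypothetical, since the embodiment describes the flow only at the level of blocks.

```python
# Hypothetical in-memory stand-in for the remote server / cloud account store.
REGISTERED_SERIALS = {"SN-0001": "user@example.com"}

def authenticate_device(serial: str, ask_user_permission) -> bool:
    """Blocks 402-412: verify a detected device, registering it if the user allows."""
    if serial in REGISTERED_SERIALS:          # blocks 410/412: already registered
        return True
    if ask_user_permission(serial):           # blocks 404/406: confirm during pairing
        REGISTERED_SERIALS[serial] = "pending-verification"   # block 408
        return True
    return False                              # unregistered device: do not follow

# Example: a new phone is detected and the mobile app asks the user to confirm.
print(authenticate_device("SN-0042", ask_user_permission=lambda s: True))
```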
FIG. 5 illustrates a process flow for an autonomous robotic system moving toward a fixed target (the "find-me flow") in accordance with some embodiments of the disclosed invention. In some embodiments, the find-me flow 500 begins after the system successfully verifies the target; when the target is stationary at its location, the system autonomously travels toward the target until the distance between the system and the target is within a predetermined threshold. In some aspects, the user activates the find-me flow using a mobile application in block 502. In the find-me flow, the system moves at a predetermined constant speed. In some aspects, the predetermined speed is set by the system user and can be changed through the mobile application. In some aspects, the system navigates using only data received from the directional antennas (e.g., directional antennas 102, 104, 106, 108 in Fig. 1).
At the start of the find-me flow in block 500, the system receives odometer data about the system engine in block 506. In block 505, the system also receives data relating to the target location. In block 508, the system determines the angle and distance of the target relative to the system using the two data sets. The system compares its distance to the target against a predetermined threshold in block 510 to determine whether to send a command to the system engine controller. If the distance to the target is greater than the predetermined threshold, the system determines the angle and distance to the target and sets the path it will travel to reach the target in block 512. On the path to the target, the system may detect one or more obstacles in block 514. When an obstacle is in the travel path, the system retrieves from the separate operational flow in block 600 the data relating to the position of the obstacle, adjusts its travel path in block 516, and sends a command to the system engine controller to adjust its movement accordingly. If the system does not detect any obstacle on the path to the target, the system determines in block 520 whether the target has moved since first pairing with the system, by retrieving data relating to the target's location and comparing it with the previous target data in block 505. If the target has moved, the system receives target motion data, which includes the angle and distance to the target, in block 522. The system analyzes the target motion data and determines the engine data in block 524. In block 526, a command is sent to the system engine to set the next movement of the system by adjusting the rotation angle and rotation speed of the system's wheels.
When the target is reached, the mode may be turned off automatically, and the motion management component responsible for that operating mode enters another mode, such as a "sleep" mode, which prevents the system from moving. The system stops following the electronic device and waits for a period of time until an obstacle disappears (e.g., is removed). The waiting period is predetermined based on the particular environment in which the system operates. If an insurmountable obstacle (such as a pit or blind spot) is detected, an alert, such as an alarm or visual indication (e.g., an LED light or a notification), is generated and sent to the mobile application installed on the user's smartphone.
The system travels toward the target until the distance to the target is less than the threshold. The threshold is typically the optimal distance that the system maintains while following a moving target, and it may be set by the user.
When the distance between the system and the target is less than the threshold, the system is considered to have "reached" its target. The system then stops. In some aspects, the system enters a standby mode upon reaching the target.
FIG. 6 illustrates a process flow 600 for an autonomous robotic system reacting to a detected obstacle in accordance with some embodiments of the disclosed invention. If the system finds an obstacle in its path of movement in block 602, the system stops and waits for a period of time in block 604. Often, an obstacle is a moving object or person that moves away after a short time. The length of the wait time in block 604 is predetermined or user-selectable based on the particular circumstances in which the system is expected to operate. For example, an airport may have more temporary obstacles, such as people, that "move away" quickly than permanent obstacles such as roadblocks. Thus, a user at an airport may choose a shorter wait time than a user on a street sidewalk.
After the "pause", the system determines whether the obstacle has been removed from its path of movement and, if so, continues moving on a path toward the target in block 608. If the obstacle is still present, the system generates a command to the engine controller to adjust its movement by changing the rotation angle and/or speed of the wheels in block 610. Sometimes, in block 612, the obstacle is insurmountable, such as a wall, through which the system cannot move even if the wheels are adjusted. An insurmountable obstacle may also be one that cannot be bypassed by the operations available to the system's geometry, for example, when the system is on a path that requires it to ascend or descend stairs. In that case, in block 614, the system stops and generates a notification to alert the user of the obstacle, for example, by an alarm or a notification sent to the user's handheld device.
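The flow of FIG. 6 could be sketched as below. The callables standing in for the sensor, planner, and notification components, and the default wait time, are placeholders chosen for illustration, not elements of the embodiment.

```python
import time

def react_to_obstacle(obstacle_ahead, can_bypass, replan, notify, wait_s=3.0):
    """Blocks 602-614 of FIG. 6: pause, re-check, re-route, or alert the user."""
    if not obstacle_ahead():                  # block 602
        return "continue"
    time.sleep(wait_s)                        # block 604: predetermined pause
    if not obstacle_ahead():                  # block 608: obstacle moved away
        return "continue"
    if can_bypass():                          # block 610: steer around it
        replan()
        return "rerouted"
    notify("Obstacle cannot be bypassed")     # blocks 612-614: e.g. stairs or a wall
    return "stopped"

# Example with stand-in callables: a wall that never moves and cannot be bypassed.
print(react_to_obstacle(lambda: True, lambda: False, lambda: None,
                        print, wait_s=0.1))
```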
FIG. 7 illustrates a process flow (the "follow-me flow") for an autonomous robotic system following a moving target in accordance with some embodiments of the disclosed invention. The follow-me flow in block 700 begins after the system successfully verifies the target. When the target is moving, the system determines an initial distance between itself and the target and accelerates toward the target until it is within an optimal distance of the target. The system then adjusts its speed to stay within the optimal distance of the moving target. The system analyzes data collected from all internal components (e.g., antennas, sensors, and/or cameras) and from external sources, including data collected from the user's mobile application, to determine the speed and angle of movement of the target. In some embodiments, the follow-me flow uses AI and autonomous movement techniques to determine the direction and speed of movement of the target based on its speed and/or angle of movement.
According to some embodiments, as shown in FIG. 7, a user may activate the follow-me flow using a mobile application in block 702. The follow-me flow begins by receiving odometer data from the system engine in block 706. The system collects data from the antennas in block 701, the ranging sensors in block 703, and the cameras in block 705 to determine the angle and distance of the moving target relative to the system's position in block 710. In some aspects, the system uses the antenna data in block 701 to determine its own movement, in particular its angle and distance relative to the moving target, and to monitor its own travel speed. The antennas detect the direction of movement, and the antenna data are sent to the system, where the rotation angle required for the wheels' next movement is determined in block 710. The ranging sensors in block 703 detect the distance between the moving target and the system itself. In block 710, the sensor data are sent to the system processor, where the required travel speed of the wheels for their next movement is determined. If the system is within a predetermined threshold, the wheels rotate at a constant speed to maintain the optimal distance. When the distance is too large or too small with respect to the threshold value, the wheels accelerate or decelerate, respectively. The threshold is the optimal distance the system maintains from the moving target while following it. In some aspects, the camera in block 705 identifies the target and obtains information about the distance and angle of the target. The camera data are also sent to the system processor for use in determining the next movement based on the distance and angle relative to the target in block 710.
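The accelerate/decelerate behavior around the optimal distance amounts to a simple proportional speed rule; the sketch below assumes specific values for the optimal distance, dead band, gain, and speed limit, none of which are specified in the embodiment.

```python
OPTIMAL_DISTANCE_M = 2.0   # assumed optimal following distance (the "threshold")
DEAD_BAND_M = 0.3          # within this band the wheels keep a constant speed
GAIN = 0.8                 # proportional gain, m/s per metre of distance error
MAX_SPEED_MPS = 2.5

def follow_speed(current_speed_mps: float, distance_m: float) -> float:
    """Speed command for the wheels while following a moving target."""
    error = distance_m - OPTIMAL_DISTANCE_M
    if abs(error) <= DEAD_BAND_M:
        return current_speed_mps               # keep the optimal distance
    # Too far behind: speed up; too close: slow down (never reverse here).
    return max(0.0, min(MAX_SPEED_MPS, current_speed_mps + GAIN * error))

print(follow_speed(1.2, 3.1))   # target pulling ahead -> accelerate
print(follow_speed(1.2, 1.5))   # too close -> decelerate
```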
According to some embodiments, the system determines the approximate speed and angle of the moving target in block 710 and, in block 712, sets the movement path it will follow toward the target, based on the analysis of the target-related data in block 708 (e.g., target distance and target angle) and the odometer data in block 706.
On the path of movement while following the moving target, the system may detect one or more obstacles in block 714. When an obstacle is identified in the movement path, the system retrieves data from a separate operational flow (e.g., the flow of FIG. 6) regarding the position of the obstacle, adjusts its movement path in block 716, and sends a command to the system engine controller to adjust the system's next movement accordingly; for example, a camera may identify the obstacle through object recognition.
The system retrieves target motion data associated with the location of the moving target, including the angle and distance of the target, in block 718, and determines the engine data by analyzing the target motion data in block 720. The command sent to the system engine controller in block 722 sets the next movement of the system by adjusting the rotation angle and rotation speed of the system's wheels. The follow-me flow ends when the user terminates the target-following mode, or when the target stops traveling and the system reaches a predetermined distance from the target (e.g., the distance between the system and the target is less than 1 meter).
According to some embodiments of the disclosed invention, the system engine controller may directly control the motor driver by generating a Pulse Width Modulation (PWM) control signal of a desired duty cycle for the motor driver. In some aspects, the system engine controller, upon receiving a command, calculates the required wheel speed and rotation angle based on the speed and angle between the system and its target. In some aspects, the engine controller may determine to rotate the wheels backwards based on a predetermined threshold of the angle between the system and its target (e.g., when the angle between the system and its target is 180 degrees) so that the system turns immediately to follow the electronic device.
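As a rough illustration of how an engine controller might turn a commanded wheel speed into a PWM duty cycle, consider the sketch below. The wheel-speed limit, the 180-degree reversal rule, and the returned tuple are assumptions; no particular motor-driver API is implied.

```python
MAX_WHEEL_SPEED_MPS = 2.5     # assumed wheel speed at 100% duty cycle
REVERSE_ANGLE_DEG = 180.0     # assumed threshold for turning the wheels backwards

def pwm_command(target_speed_mps: float, angle_to_target_deg: float):
    """Map a speed command to (duty_cycle_percent, forward) for a motor driver."""
    forward = abs(angle_to_target_deg % 360.0 - REVERSE_ANGLE_DEG) > 90.0
    duty = min(abs(target_speed_mps) / MAX_WHEEL_SPEED_MPS, 1.0) * 100.0
    return round(duty, 1), forward

print(pwm_command(1.5, 10.0))    # target roughly ahead   -> (60.0, True)
print(pwm_command(1.0, 180.0))   # target directly behind -> (40.0, False)
```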
The system's flows may also include a manual movement mode that enables a user to control the movement of the system using a joystick in a mobile application. For example, a user may activate the joystick mode using the mobile application and operate the joystick in multiple sensitivity modes (e.g., low, medium, high). During operation, the mobile application sends (x, y) coordinates in the range [0, 100] to the system processor; upon receiving the coordinates, the system calculates the rotation angle and speed of the wheels and sends commands to the system engine to control the movement of the wheels accordingly.
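The (x, y)-to-wheel mapping in the joystick mode might look like the sketch below, which assumes that (50, 50) is the neutral stick position and that the three sensitivity modes scale the speed; the exact mapping is not given in the embodiment.

```python
import math

SENSITIVITY = {"low": 0.5, "medium": 1.0, "high": 1.5}   # assumed scale factors
MAX_SPEED_MPS = 2.0

def joystick_to_wheels(x: int, y: int, mode: str = "medium"):
    """Convert app joystick coordinates in [0, 100] to (angle_deg, speed_mps)."""
    dx = (x - 50) / 50.0                                  # -1 .. 1, neutral at (50, 50)
    dy = (y - 50) / 50.0
    angle = math.degrees(math.atan2(dx, dy)) % 360.0      # 0 deg = straight ahead
    magnitude = min(math.hypot(dx, dy), 1.0)
    speed = magnitude * MAX_SPEED_MPS * SENSITIVITY[mode]
    return round(angle, 1), round(speed, 2)

print(joystick_to_wheels(50, 100))           # stick pushed fully forward
print(joystick_to_wheels(100, 50, "high"))   # stick pushed fully right
```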
FIG. 8 illustrates a process flow (the "help-me flow") for an autonomous robotic system assisting a user in accordance with some embodiments of the disclosed invention. When the system reaches its target, the user can retrieve or handle it directly. For example, the user may physically pull the system by the handle rather than having the system follow while the user walks. When a user is physically pulling the system, the system's engine may automatically increase its power so that the user does not need to pull the entire weight of the system, which helps the user move a system that might otherwise be too heavy to handle. The wheels turn to an angle corresponding to the direction in which the user pulls the system, and move at a speed determined by an algorithm based on the inclination of the system relative to its path of movement and the weight of the system itself.
According to FIG. 8, the user may activate an assistance mode using the mobile application in block 802. The system is equipped with a gyroscope in block 808 and an internal scale in block 809, and it monitors the gyroscope data in block 804 to detect tilt in block 806. In block 810, the system determines whether the tilt angle exceeds a predetermined threshold, for example, whether the system is tilted at 45 degrees to the ground. If the angle between the system and the ground is within the predetermined threshold, the system determines in block 812 the movement speed corresponding to that angle, so that the user does not need to pull with excessive force. The system determines the engine data in block 814 and sends a command to the system engine controller in block 816 to set the next movement of the system by adjusting the rotation angle and rotation speed of the system's wheels.
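A sketch of the tilt-based assist of blocks 804-816 follows. The 45-degree threshold comes from the example in the text, while the tilt convention (measured from upright), the speed mapping, and the weight-based boost are illustrative assumptions.

```python
TILT_THRESHOLD_DEG = 45.0     # example threshold from the embodiment
MAX_ASSIST_SPEED_MPS = 1.5    # assumed maximum assist speed

def assist_speed(tilt_from_upright_deg: float, payload_kg: float) -> float:
    """Blocks 810-816: choose an assist speed so the user does not drag full weight.

    Assumptions: tilt is measured from the upright position by the gyroscope,
    a stronger pull produces a larger tilt, and assistance grows with the tilt
    up to the predetermined threshold.
    """
    if tilt_from_upright_deg <= 0.0:
        return 0.0                                            # not being pulled
    tilt = min(tilt_from_upright_deg, TILT_THRESHOLD_DEG)     # block 810
    base = MAX_ASSIST_SPEED_MPS * tilt / TILT_THRESHOLD_DEG   # block 812
    # Heavier contents (from the internal scale, block 809) get a little extra push.
    boost = 1.0 + min(payload_kg, 20.0) / 40.0
    return round(min(MAX_ASSIST_SPEED_MPS, base * boost), 2)

print(assist_speed(30.0, 12.0))   # moderate pull, medium payload
print(assist_speed(0.0, 12.0))    # upright: the engine does not assist
```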
In some embodiments, the system includes a peripheral platform. The user mobile application that controls the system may also include user registration, optional device authentication, user permissions, and control functions. In some aspects, target device authentication may include registering and authenticating devices on a remote server and/or in the cloud.
The autonomous robotic system may be fully integrated with other software applications to provide additional functionality. For example, in some aspects, the system may be integrated with an application capable of presenting travel advice, airport information, and airport gate information. In some aspects, the functionality of the autonomous robotic system may be continuously improved through machine learning. For example, as operating time increases, the autonomous robotic system automatically uploads its own movement data into the autonomous robotic system application to refine the system. In some aspects, however, the self-learning function may be disabled for security purposes.
In some embodiments, the autonomous robotic system may carry additional items on top of it, e.g., another suitcase, as it travels autonomously in a horizontal mode. In some embodiments, the autonomous robotic system may include a built-in scale that measures the weight of its contents. In some aspects, the autonomous robotic system may include a display that shows its total weight. In some aspects, the autonomous robotic system may include a unique handle that can become a portable table, which may be used for laptops, books, documents, and other items. In some aspects, the autonomous robotic system may include a separate, readily accessible compartment for storage.
It should be noted that the above embodiments are intended only to describe the technical solutions of the embodiments of the present invention, and the embodiments of the present invention are not limited thereto. Although various aspects of the embodiments of the present invention have been described in detail with reference to the above exemplary embodiments, it should be understood by those skilled in the art that various changes, modifications, and equivalents may be made to the technical features described in the exemplary embodiments. It will also be apparent to those skilled in the art that the above-described embodiments are specific examples of a single broader invention, which may have a scope broader than any single description, without departing from the spirit and scope of the invention.

Claims (18)

1. A system for identifying and following a mobile electronic device, the system comprising:
an antenna for receiving and transmitting signals;
a plurality of sensors for distance measurement;
a processor; and
a memory in communication with the processor, the memory having stored therein instructions that, when executed by the processor, cause the processor to:
determining a speed and a direction of the mobile electronic device;
adjusting a movement path of the system based on the determined speed and direction of the mobile electronic device;
determining a distance between the mobile electronic device and the system;
instructing the system to follow the mobile electronic device within a predetermined range of the distance;
identifying an obstacle in a path of movement of the system;
commanding the system to stop for a predetermined period of time when the obstacle is identified;
determining whether the obstacle is still in a path of movement of the system after the predetermined period of time;
adjusting a path of movement of the system when it is determined that the obstacle is still in the path of movement of the system; and
when it is determined that the obstacle is no longer in the path of movement of the system, commanding the system to continue following the mobile electronic device within a predetermined range of the distance.
2. The system of claim 1, wherein the processor is further configured to pair with the mobile electronic device via Bluetooth and to follow the mobile electronic device only after Bluetooth pairing.
3. The system of any one of claims 1 to 2, further comprising a camera for object recognition to identify the obstacle and to send an object recognition signal.
4. The system of any one of claims 1 to 3, wherein the commanding the system to stop is based on a predetermined threshold of distance between the system and the obstacle.
5. The system of any one of claims 1 to 4, further comprising an engine controller to control the system movement.
6. The system of any of claims 1-5, wherein the commanding the system to follow the mobile electronic device within the predetermined range of distances is based on a speed and a direction of the system.
7. The system of claim 6, wherein the processor further commands the system to increase the speed of the system when the system is physically pulled at a predetermined angle relative to the ground.
8. The system of any of claims 1-7, wherein the processor further commands the system to rotate a plurality of omni-wheels of the system 180 degrees when the mobile electronic device is at a predetermined threshold angle relative to the system.
9. The system of any one of claims 1 to 8, further comprising a joystick to control movement of the system.
10. The system of any one of claims 1 to 9, wherein the system comprises one or more of a suitcase, a bag, a cargo, a cart, a pallet, and a container.
11. A method for identifying and following a mobile electronic device by a system, the method comprising:
determining, by a processor, a speed and a direction of the mobile electronic device;
adjusting, by the processor, a movement path of the system based on the determined speed and direction of the mobile electronic device;
determining, by the processor, a distance between the mobile electronic device and the system;
instructing, by the processor, the system to follow the mobile electronic device within a predetermined range of the distance;
identifying, by the processor, an obstacle in a path of movement of the system;
commanding, by the processor, the system to stop for a predetermined period of time when the obstacle is identified;
determining, by the processor, whether the obstacle is still in a path of movement of the system after the predetermined period of time;
adjusting, by the processor, a path of movement of the system when it is determined that the obstacle is still in the path of movement of the system; and
commanding, by the processor, the system to continue following the mobile electronic device within a predetermined range of the distance when it is determined that the obstacle is no longer in the path of movement of the system.
12. The method of claim 11, further comprising pairing, by the processor, with the mobile electronic device via Bluetooth, and following the mobile electronic device only after Bluetooth pairing.
13. The method of any of claims 11 to 12, wherein commanding the system to stop is based on a predetermined threshold of distance between the system and the obstacle.
14. The method of any of claims 11-13, wherein commanding the system to follow the mobile electronic device within the predetermined range of distances is based on a speed and a direction of the system.
15. The method of claim 14, further comprising increasing, by the processor, a speed of the system when the system is physically pulled at a predetermined angle relative to the ground.
16. The method of any of claims 11-15, further comprising rotating, by the processor, a plurality of omni-wheels of a system 180 degrees when the mobile electronic device is at a predetermined threshold angle relative to the system.
17. The method of any of claims 11-16, further comprising adjusting, by the processor, a speed and a rotational angle of the system to change a direction of the movement path.
18. The method of any one of claims 11 to 17, wherein the system is one or more of a suitcase, a bag, a cargo, a cart, a pallet, and a container.
CN201880048036.3A 2017-07-10 2018-07-10 Autonomous robot system Pending CN111201497A (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US201762530744P 2017-07-10 2017-07-10
US62/530744 2017-07-10
US15/725656 2017-10-05
US15/725,656 US20180360177A1 (en) 2017-06-15 2017-10-05 Robotic suitcase
PCT/US2017/057319 WO2018231270A1 (en) 2017-06-15 2017-10-19 Robotic suitcase
USPCT/US17/57319 2017-10-19
US201862651023P 2018-03-30 2018-03-30
US62/651023 2018-03-30
PCT/US2018/041525 WO2019014277A1 (en) 2017-07-10 2018-07-10 Autonomous robot system

Publications (1)

Publication Number Publication Date
CN111201497A true CN111201497A (en) 2020-05-26

Family

ID=65002331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880048036.3A Pending CN111201497A (en) 2017-07-10 2018-07-10 Autonomous robot system

Country Status (4)

Country Link
EP (1) EP3652600A4 (en)
JP (1) JP2020527266A (en)
CN (1) CN111201497A (en)
WO (1) WO2019014277A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504316B (en) * 2019-01-30 2023-03-31 北京优位智停科技有限公司 Method for vehicle navigation and ground moving device
JP7225262B2 (en) 2020-02-26 2023-02-20 バイドゥドットコム タイムズ テクノロジー (ベイジン) カンパニー リミテッド Trajectory planning for obstacle avoidance of self-driving cars
CN111273672B (en) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 Unmanned aerial vehicle line inspection method and system based on known coordinate radio frequency tag and unmanned aerial vehicle
CN111352422B (en) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 Unmanned vehicle line inspection method and system based on self-learning radio frequency tag and unmanned vehicle
CN111324134B (en) * 2020-03-06 2023-04-28 陕西雷神智能装备有限公司 Unmanned vehicle line inspection method and system based on preset sequence radio frequency tags and unmanned vehicle
CN111966023B (en) * 2020-08-28 2024-04-30 王旭飞 Intelligent following method and device and electronic equipment
CN112487869A (en) * 2020-11-06 2021-03-12 深圳优地科技有限公司 Robot intersection passing method and device and intelligent equipment
US20220382282A1 (en) * 2021-05-25 2022-12-01 Ubtech North America Research And Development Center Corp Mobility aid robot navigating method and mobility aid robot using the same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104991560A (en) * 2015-07-12 2015-10-21 仲恺农业工程学院 Autonomous mobile intelligent robot
CN105955267A (en) * 2016-05-11 2016-09-21 上海慧流云计算科技有限公司 Motion control method and motion control system
CN106155065A (en) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 A kind of robot follower method and the equipment followed for robot
ES2607223A1 (en) * 2016-06-08 2017-03-29 Pablo VIDAL ROJAS Autonomous case (Machine-translation by Google Translate, not legally binding)
US20170108860A1 (en) * 2015-10-16 2017-04-20 Lemmings LLC Robotic Golf Caddy

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4830452B2 (en) * 2005-11-04 2011-12-07 カシオ計算機株式会社 Forward tilt angle detection device and wheel drive type mobile device
US8948913B2 (en) * 2009-10-26 2015-02-03 Electronics And Telecommunications Research Institute Method and apparatus for navigating robot
US20140107868A1 (en) * 2012-10-15 2014-04-17 Mirko DiGiacomcantonio Self-propelled luggage
JP2014092861A (en) * 2012-11-01 2014-05-19 Symtec Hozumi:Kk Follow-up carriage system
US20140277841A1 (en) * 2013-03-15 2014-09-18 Elizabeth Klicpera Motorized Luggage or Luggage Platform with Wired or Wireless Guidance and Distance Control
JP5915690B2 (en) * 2014-04-18 2016-05-11 株式会社豊田自動織機 Transport assist device
JP5792361B1 (en) * 2014-06-25 2015-10-07 シャープ株式会社 Autonomous mobile device
EP3574797B1 (en) * 2015-03-02 2022-06-15 Bruner, Boyd Motorized luggage
JP6623341B2 (en) * 2015-08-19 2019-12-25 Cyberdyne株式会社 Facility management and operation system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104991560A (en) * 2015-07-12 2015-10-21 仲恺农业工程学院 Autonomous mobile intelligent robot
US20170108860A1 (en) * 2015-10-16 2017-04-20 Lemmings LLC Robotic Golf Caddy
CN105955267A (en) * 2016-05-11 2016-09-21 上海慧流云计算科技有限公司 Motion control method and motion control system
ES2607223A1 (en) * 2016-06-08 2017-03-29 Pablo VIDAL ROJAS Autonomous case (Machine-translation by Google Translate, not legally binding)
CN106155065A (en) * 2016-09-28 2016-11-23 上海仙知机器人科技有限公司 A kind of robot follower method and the equipment followed for robot

Also Published As

Publication number Publication date
EP3652600A4 (en) 2021-08-04
JP2020527266A (en) 2020-09-03
EP3652600A1 (en) 2020-05-20
WO2019014277A1 (en) 2019-01-17

Similar Documents

Publication Publication Date Title
CN111201497A (en) Autonomous robot system
US11160340B2 (en) Autonomous robot system
KR102254881B1 (en) A moving robot and a controlling method for the same
US11266067B2 (en) Moving robot, method for controlling moving robot, and moving robot system
US20170368691A1 (en) Mobile Robot Navigation
JP4871160B2 (en) Robot and control method thereof
US10509412B2 (en) Movable body control system
US11762397B2 (en) Autonomous driving cart
CN107128282A (en) The mobile device control of electric door
US11200765B2 (en) Luggage delivery system
EP3919238B1 (en) Mobile robot and control method therefor
KR101783890B1 (en) Mobile robot system
WO2021109890A1 (en) Autonomous driving system having tracking function
CN110858098A (en) Self-driven mobile robot using human-robot interaction
US11635759B2 (en) Method of moving robot in administrator mode and robot of implementing method
WO2019036321A1 (en) Systems and methods for controlling unmanned transport vehicles via intermediate control vehicles
US20220055654A1 (en) Methods and Apparatus for User Interactions with Autonomous Vehicles
WO2018101962A1 (en) Autonomous storage container
US11861054B2 (en) Moving robot and method for controlling the same
KR102433859B1 (en) A device and method for tracking user based on infrared signal detection
US20210382494A1 (en) Marker for space recognition, method of moving and lining up robot based on space recognition and robot of implementing thereof
KR20080050278A (en) Automatic charging system for moving robot using infrared sensor and camera and method thereof
JP2009223632A (en) Autonomous mobile device and autonomous mobile device system
US11358274B2 (en) Autonomous mobile robot with adjustable display screen
Rao et al. Sensor guided docking of autonomous mobile robot for battery recharging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2020-05-26