WO2017097170A1 - Autonomous positioning and navigation device, positioning and navigation method, and autonomous positioning and navigation system - Google Patents
Autonomous positioning and navigation device, positioning and navigation method, and autonomous positioning and navigation system
- Publication number
- WO2017097170A1 (PCT/CN2016/108594)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- host device
- positioning navigation
- data
- motion
- information
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
Definitions
- the invention relates to the field of robots, and in particular to a technique for positioning navigation.
- the autonomous positioning and navigation function is key to the practical application of service robot equipment. It allows the robot, without human assistance, to autonomously construct map information of the environment from sensor data and to determine its own location in that environment in real time. Further, by using the constructed map data and location information for navigation, the robot can intelligently plan a route to the target location of its mission while effectively avoiding obstacles such as pedestrians and furniture in the environment.
- the above-mentioned autonomous positioning and navigation function is implemented in the industry by SLAM (Simultaneous Localization and Mapping) and motion planning algorithms, respectively.
- the simultaneous localization and mapping algorithm allows the robot, in any unknown environment, to build a map while positioning itself in real time using specific sensor data; it is the most effective algorithm in autonomous positioning and navigation. In order for the robot to take action, a motion planning algorithm is needed to plan the robot's trajectory, letting it dynamically avoid obstacles while moving and safely reach its destination.
- since ROS is only a software-level system and lacks the ability to cooperate with the underlying and upper layers of a specific robot system, it does not alleviate the above-mentioned difficulty in using such algorithms.
- due to the complexity of such algorithms, even mainstream computer systems currently bear a heavy load when running them.
- in order to run such algorithms efficiently on service robots with embedded computing systems of lower computational performance, developers must optimize the existing algorithms, which further increases the difficulty of using such algorithms directly.
- in order to build maps, locate in real time, and avoid obstacles, developers must equip the robot with a variety of sensors to provide data to the above algorithms. The performance differences among sensor types, and the quality of their calibration, also greatly affect the results achieved by the positioning and navigation algorithms.
- robots with autonomous positioning and navigation that have appeared so far are mostly produced by large enterprises and research institutions, and because of the high coupling and specialization of their systems, current robot software is difficult to reuse across different robots, which hinders the industrialization of service robots.
- the root cause of this problem is that this type of positioning and navigation algorithm depends heavily on the sensor configuration, robot size, and drive mode of each robot platform. That is, the autonomous positioning navigation device and the robot host device are highly coupled. This coupling forces the developer of the robot system acting as host to make extensive preparations in order to adapt an autonomous positioning navigation device.
- since a robot's specific working behavior is defined by its purpose, different robots need different planning: for example, a cleaning robot requires a path planning mode in which the motion planning algorithm walks along wall edges and then performs a bow-shaped reciprocating sweep, while a security patrol robot requires the robot to complete a patrol of the environment at as little cost as possible. At present, no autonomous positioning navigation device handles this differentiation of business logic well.
- an object of the present invention is to provide a highly modular autonomous positioning navigation device, together with a positioning and navigation method and an autonomous positioning navigation system based on that device, so as to reduce the device's dependence on the host device and improve its scalability.
- an autonomous positioning navigation device for positioning and navigating a host device is provided according to an aspect of the present application.
- the autonomous positioning navigation device includes: a first transmission device, a second transmission device, and a processing device; wherein
- the first transmission device performs data communication with the underlying control device of the host device to acquire underlying positioning navigation related information and to transmit motion control commands for controlling the motion of the host device;
- the second transmission device performs data communication with the upper layer control device of the host device to acquire upper layer positioning navigation related information and to transmit motion related logical data for the host device to perform business logic analysis;
- the processing device acquires a plurality of sensing information items, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion related logical data and the motion control commands.
- a method for positioning and navigation using an autonomous positioning navigation device is provided, wherein the autonomous positioning navigation device is configured to perform positioning and navigation for a host device, and includes a processing device, a first transmission device, and a second transmission device; wherein the method comprises:
- A. the first transmission device acquires the underlying positioning navigation related information from the underlying control device of the host device, and the second transmission device acquires the upper layer positioning navigation related information from the upper layer control device of the host device;
- B. the processing device acquires a plurality of sensing information items, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates motion control commands for controlling the movement of the host device and motion related logical data for the host device to perform business logic analysis;
- C. the first transmission device sends the motion control commands to the underlying control device of the host device, and the second transmission device sends the motion related logical data to the upper layer control device of the host device.
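The three method steps above can be sketched as a single control cycle. This is an illustrative skeleton only: the class and function names are assumptions, and the processing step is a placeholder for the SLAM and motion planning internals, which the disclosure does not specify.

```python
from dataclasses import dataclass


@dataclass
class MotionControlCommand:
    v: float  # desired linear velocity (m/s)
    w: float  # desired angular velocity (rad/s)


def positioning_navigation_cycle(underlying_info, upper_info, sensor_readings):
    """One cycle of steps A-C: acquire inputs, process, emit outputs."""
    # B. the processing device combines the sensing information with the
    # underlying and upper layer positioning navigation related information
    fused = {
        "sensors": sensor_readings,
        "underlying": underlying_info,
        "upper": upper_info,
    }
    # C1. a motion control command for the underlying control device
    command = MotionControlCommand(v=0.2, w=0.0)  # placeholder values
    # C2. motion related logical data for the upper layer control device
    logical_data = {"pose": (0.0, 0.0, 0.0), "fused_inputs": len(fused)}
    return command, logical_data
```

In this sketch, the first transmission device would carry `command` down to the underlying control device, while the second transmission device would carry `logical_data` up for business logic analysis.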
- an autonomous positioning navigation device comprising:
- a first device configured to acquire the underlying positioning navigation related information and the upper layer positioning navigation related information of the host device;
- a second device configured to acquire a plurality of sensing information items and to perform pre-processing and pre-fusion on them;
- a third device configured to generate, based on the pre-processed and pre-fused sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, motion control commands for controlling the motion of the host device and motion related logical data for the host device to perform business logic analysis;
- the first device is further configured to send the synchronous positioning data, the map data, the motion planning logic data, and the motion control commands to the host device.
- the autonomous positioning navigation device of the present application is highly modular, greatly reduces coupling with the host device, facilitates rapid integration into an existing host device, and has the advantage of flexible expansion. As a result, a host device such as a robot has a simpler and clearer system configuration, which greatly reduces the development difficulty and time period for building a host device with autonomous positioning and navigation.
- the autonomous positioning navigation device reduces coupling with the host device by summarizing the sensing information dependencies required by most autonomous positioning and navigation applications and integrating the processing of this sensing information into the device itself.
- the autonomous positioning navigation device forms a highly flexible, unified external communication interface and protocol specification through the first transmission device and the second transmission device, so that any host device conforming to the interface protocol specification can easily dock with the autonomous positioning navigation device 1 and extend its functions.
- FIG. 1 is a schematic diagram showing a cooperative structure of an autonomous positioning navigation device and a host device according to an aspect of the present application
- FIG. 2 is a schematic structural diagram of an autonomous positioning navigation device according to a preferred embodiment of the present application.
- FIG. 3 is a schematic diagram showing data transmission during cooperation of a first transmission device of an autonomous positioning navigation device and an underlying control device of a host device according to a preferred embodiment of the present application.
- FIG. 4 illustrates a positioning and navigation method of an autonomous positioning navigation device according to another aspect of the present application
- FIG. 5 is a schematic diagram showing a cooperation structure between an autonomous positioning navigation device and a host device according to a preferred embodiment of the present application.
- FIG. 6 is a schematic diagram showing a cooperation structure between an autonomous positioning navigation device and a host device according to a preferred embodiment of the present application.
- FIG. 7 is a schematic diagram showing a cooperation structure between an autonomous positioning navigation device and a host device according to still another preferred embodiment of the present application.
- FIG. 8 illustrates a positioning navigation method according to a preferred embodiment of the present application.
- the present application aims to propose a highly modular autonomous positioning navigation device and an autonomous positioning navigation system, to reduce dependence on the host device and improve scalability.
- FIG. 1 is a schematic diagram of the cooperative structure of an autonomous positioning navigation device and a host device according to an aspect of the present application, wherein the autonomous positioning navigation device 1 is configured to provide a positioning and navigation function for the host device 2, and the autonomous positioning navigation device 1 comprises a first transmission device 11, a second transmission device 12, and a processing device 13.
- the first transmission device 11 performs data communication with the underlying control device of the host device 2 to acquire underlying positioning navigation related information and to transmit motion control commands for controlling the motion of the host device 2;
- the second transmission device 12 performs data communication with the upper layer control device of the host device 2 to acquire upper layer positioning navigation related information and to transmit motion related logical data for the host device 2 to perform business logic analysis;
- the processing device 13 acquires a plurality of sensing information items, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion related logical data and the motion control commands.
- the host device 2 may be a machine device that automatically performs work, such as a robot or the like.
- the host device 2 can accept human command, run a pre-programmed program, or act according to a principle program established by artificial intelligence technology to assist or replace human work.
- the host device 2 has an upper layer control device that processes business logic and analyzes and formulates action targets, and an underlying control device that drives the movement of actuators; that is, the host device 2 can perform various actions through its power components according to control signals sent by the control device, where the input control signal is an electrical signal and the output is linear and angular displacement.
- the driving device used by the host device 2 may be an electric driving device, such as a stepping motor or servo motor driving a mechanical wheel set, or may be a hydraulic or pneumatic driving device, etc.
- the autonomous positioning navigation device 1 is mounted on the host device 2.
- the underlying positioning navigation related information may include wheel group status information of the host device 2, and may further include parameter information of the host device 2; the upper layer positioning navigation related information may include requests for motion planning that the host device 2 needs performed and/or requests by the host device 2 for motion control by its underlying control device; the motion related logical data includes map data, synchronous positioning data, and motion planning logic data.
- the first transmission device 11 is in data communication with the underlying control device of the host device 2 to acquire the underlying positioning navigation related information and to transmit motion control commands for controlling the movement of the host device 2.
- the first transmission device 11 (control signal interface) is mainly used to acquire the underlying operating state of the host device 2, such as motor working conditions and wheel encoder data; the commands by which the autonomous positioning navigation device 1 controls the motion of the host device 2 are also transmitted through the first transmission device 11.
- the autonomous positioning navigation device 1 and the host device 2 exchange data over the first transmission device 11 using a predefined unified communication protocol.
- the first transmission device 11 preferably adopts a UART (Universal Asynchronous Receiver Transmitter) serial port, because the UART serial port is supported by almost all single-chip microcontrollers and embedded devices; the host device 2 only needs to implement the processing of the predefined communication protocol to realize cooperation with the autonomous positioning navigation device 1, which facilitates integration of the host device 2 and the autonomous positioning navigation device 1 to the greatest extent.
- alternatively, the first transmission device 11 may use a CAN bus (Controller Area Network), an SPI bus (Serial Peripheral Interface), an I²C bus, etc.
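Since the serial link carries a predefined unified communication protocol, a minimal framed message codec can illustrate the idea. The framing below (start byte, length, type, XOR checksum) is a hypothetical example for illustration, not the actual protocol of the disclosure.

```python
def build_frame(msg_type: int, payload: bytes) -> bytes:
    """Build a frame: 0xAA | length | type | payload | XOR checksum."""
    body = bytes([len(payload), msg_type]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return b"\xaa" + body + bytes([checksum])


def parse_frame(frame: bytes):
    """Validate and decode one frame; returns (msg_type, payload)."""
    if frame[0] != 0xAA:
        raise ValueError("bad start byte")
    length, msg_type = frame[1], frame[2]
    payload = frame[3:3 + length]
    checksum = 0
    for b in frame[1:-1]:
        checksum ^= b
    if checksum != frame[-1]:
        raise ValueError("checksum mismatch")
    return msg_type, payload
```

The same codec shape works over UART, CAN, SPI, or I²C; only the physical transport changes.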
- the autonomous positioning navigation device 1 may further include any number of physical interfaces of different types to implement the above-mentioned control signal transmission of the first transmission device 11.
- an abstract external sensor data acquisition protocol is defined on the first transmission device 11, and support for any type of sensor can be implemented.
- the protocol data types transmitted by the first transmission device 11 include: parameter information and wheel group status information of the host device 2, sent from the host device 2 to the autonomous positioning navigation device 1; host device 2 sensing information; and motion control commands, sent from the autonomous positioning navigation device 1 to the host device 2. The parameter information of the host device 2 describes relevant configuration parameters of the host device 2, for example but not limited to device size, drive mode, and installed sensor type and location;
- the wheel set status information describes each wheel set operation data of the host device 2, such as but not limited to odometer information;
- the host device 2 sensing information describes an abstract data definition of additional sensors on the host device 2 that are intended to be processed by the autonomous positioning navigation device 1;
- the motion control command includes a description of how the autonomous positioning navigation device 1 expects the host device 2 to move.
- FIG. 3 is a schematic diagram showing data transmission during the cooperation of the first transmission device of the autonomous positioning navigation device 1 and the underlying control device of the host device 2 according to a preferred embodiment of the present application.
- the host device 2 first needs to provide the autonomous positioning navigation device 1 with parameter information describing itself. This parameter information describes the platform characteristics of the current host device 2, such as its size, drive mode (two-wheel differential drive, omnidirectional wheel structure, etc.), and the mounting position and angle of each external sensor; if additional sensors are installed, the related description information for those sensors must also be provided to the autonomous positioning navigation device 1.
- after receiving the parameter information of the host device 2, the autonomous positioning navigation device 1 performs the necessary initialization work to adapt to the current host device 2. Subsequently, the autonomous positioning navigation device 1 periodically transmits motion control commands to the host device 2.
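The parameter information message described above might be modeled as follows. All field names are illustrative assumptions; the disclosure specifies only the kinds of information carried (size, drive mode, sensor mounting).

```python
from dataclasses import dataclass, field


@dataclass
class HostParameterInfo:
    """Hypothetical shape of the host device parameter information message."""
    width_m: float                 # host device size
    length_m: float
    drive_mode: str                # e.g. "two_wheel_differential" or "omni_wheel"
    # each mount: (sensor_type, x, y, yaw_deg) in the host body frame
    sensor_mounts: list = field(default_factory=list)
```

On receipt, the navigation device would run its initialization against these fields before starting the periodic motion control exchange.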
- the motion control command describes the mode in which the autonomous positioning navigation device 1 expects the host device 2 to move next.
- for a two-wheel differential robot, the motion control command may be the desired running speeds of the left and right wheel sets; for a robot adopting a universal wheel mode, the motion control command may be the linear velocity (v) and angular velocity (w) of the translation and rotation the robot is to perform at the next moment.
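For a two-wheel differential platform, a (v, w) command maps onto left and right wheel speeds by standard differential-drive kinematics. The disclosure only states that the command carries v and w, so the conversion below, with an assumed track-width parameter, is shown purely as an illustration.

```python
def diff_drive_wheel_speeds(v: float, w: float, track_width: float):
    """Convert linear velocity v (m/s) and angular velocity w (rad/s)
    into left/right wheel linear speeds (m/s) for a differential drive."""
    v_left = v - w * track_width / 2.0
    v_right = v + w * track_width / 2.0
    return v_left, v_right
```

For example, a pure rotation (v = 0, w > 0) drives the left wheel backward and the right wheel forward at equal speed, spinning the robot in place.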
- while the autonomous positioning navigation device 1 periodically transmits motion control commands, the host device 2 also needs to periodically transmit wheel group status information describing its motion to the autonomous positioning navigation device 1. This information generally includes the change in displacement and heading angle of the host device 2 relative to the previous moment. For a host device 2 that uses a two-wheel differential drive, the wheel group status information may directly carry the cumulative revolutions of the left and right wheels or accumulated odometer information.
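The displacement and heading-angle changes reported in the wheel group status information let the navigation device dead-reckon the host pose. A common midpoint-heading integration is sketched below; this is an illustration, not the disclosed algorithm.

```python
import math


def integrate_odometry(pose, delta_distance, delta_heading):
    """Advance pose (x, y, theta) by one wheel-group status update,
    where delta_distance (m) and delta_heading (rad) are the changes
    relative to the previous moment."""
    x, y, theta = pose
    mid = theta + delta_heading / 2.0  # midpoint heading approximation
    x += delta_distance * math.cos(mid)
    y += delta_distance * math.sin(mid)
    theta = (theta + delta_heading) % (2.0 * math.pi)
    return (x, y, theta)
```

Dead reckoning like this accumulates error, which is why the device fuses it with the other sensing information rather than relying on odometry alone.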
- the host device 2 sensing information, including sensor data description information with a uniform definition, may be periodically transmitted to the autonomous positioning navigation device 1.
- by accepting the sensor data description information, the autonomous positioning navigation device 1 can extend its own functions to process additional external sensors.
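The uniformly defined sensor data description might look like the structure below. The field names are assumptions made for illustration; they reflect the abstract definition (sensor type, mounting, readings) rather than any concrete sensor driver.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class SensorDataDescription:
    """Abstract, uniformly defined sensor message (hypothetical fields)."""
    sensor_type: str   # e.g. "ultrasonic", "bumper", "cliff"
    mount_pose: tuple  # (x, y, yaw) relative to the host body frame
    timestamp_ms: int
    values: tuple      # sensor-specific readings
```

Because the navigation device consumes only this abstract shape, a new sensor type can be supported without changing the device, as long as the host packages its readings in this form.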
- the above data constitute the minimum set of data types that must be transmitted through the first transmission device 11 to ensure the normal operation of the autonomous positioning navigation device 1, and are only a subset of the protocol types applicable to the present application; other types of data transferred between the first transmission device 11 of the autonomous positioning navigation device 1 and the underlying control device 21 of the host device 2 still fall within the scope of the present application.
- the second transmission device 12 is connected to an upper layer control device of the host device 2 for data communication.
- the second transmission device 12 (high-speed signal interface) is used to implement data interaction between the autonomous positioning navigation device 1 and the upper layer control device of the host device 2; cooperative data related to business logic, such as map data, positioning coordinates, path planning data, and host device 2 behavior data, is transmitted through the second transmission device 12.
- the second transmission device 12 preferably implements high-data-throughput communication with the external host device 2 by using an Ethernet interface of the 802.11 specification.
- the second transmission device 12 may also include a WiFi wireless communication interface, a USB interface, an optical fiber interface, and the like, which can likewise realize large-volume data interaction.
- the high-speed signal interface can include multiple sets of Ethernet interfaces and multiple different types of interface formats: for example, a wired Ethernet interface and a wireless WIFI interface.
- the second transmission device 12 is responsible for transmitting, from the autonomous positioning navigation device 1 to the upper layer control device of the host device 2: map data, synchronous positioning data including position and posture information and positioning state information, and motion planning logic data including motion state information; and, from the upper layer control device of the host device 2 to the autonomous positioning navigation device 1: motion execution requests and underlying motion control requests.
- the map data includes map data of a specific area constructed by the autonomous positioning navigation device 1; the position and posture information includes spatial position and posture information of the current host device 2 calculated by the autonomous positioning navigation device 1;
- the positioning status information includes the map size calculated by the autonomous positioning navigation device 1 and the positioning status (e.g., covariance, whether the positioning is successful);
- the motion status information includes information on the motion planning algorithm currently being executed by the autonomous positioning navigation device 1, for example, but not limited to, the path planning currently in progress;
- the motion execution request includes a request packet by which the host device 2 invokes a built-in motion planning algorithm of the autonomous positioning navigation device 1; the underlying motion control request includes a request packet by which the host device 2 requires the autonomous positioning navigation device 1 to directly control the underlying system of the host device 2 to move, such as, but not limited to, requesting that the robot be controlled to a particular destination.
- the map data describes map data information of interest to the host device 2.
- the map data is always a portion of the environment map pre-built by the autonomous positioning navigation device 1.
- the host device 2 can acquire this data from the autonomous positioning navigation device 1 at any time as needed.
- the position and posture information includes current position coordinates and posture information of the host device 2 calculated by the autonomous positioning navigation device 1.
- the information may be the coordinates (x, y) of the robot in the plane and the heading angle θ.
- the host device 2 can acquire this data from the autonomous positioning navigation device 1 at any time according to its own needs, or the autonomous positioning navigation device 1 can actively push the data to the host device 2.
- the positioning status information is used to describe the current working situation of the autonomous positioning navigation device 1 for positioning and map construction.
- the information it contains includes the total size of the map constructed so far, the positioning accuracy information, whether the positioning is successful, and other data sets required by the host device 2.
- the host device 2 can acquire this data from the autonomous positioning navigation device 1 at any time according to its own needs, or the autonomous positioning navigation device 1 can actively push the data to the host device 2.
- the motion state information describes the execution of the motion planning algorithm currently being performed by the autonomous positioning navigation device 1. For example, the type of motion planning algorithm currently being worked on (idle, path planning, autonomous return charging, etc.), the planned path data to the target location, and the amount of motion control required by the host device 2.
- the host device 2 can acquire this data from the autonomous positioning navigation device 1 at any time according to its own needs, or the autonomous positioning navigation device 1 can actively push the data to the host device 2.
- the motion execution request is used by the host device 2 to send to the autonomous positioning navigation device 1 description data of the motion planning algorithm built into the autonomous positioning navigation device 1 that it wishes to invoke.
- a general implementation includes the type of motion planning algorithm that the host device 2 wishes to perform (stop all actions, path planning, autonomous return charging, etc.) and related parameters (target location coordinates, moving speed, etc.). This information is actively initiated by the host device 2 toward the autonomous positioning navigation device 1.
- the underlying motion control request is used by the host device 2 to issue motion related control command requests directly to the underlying control device 21 of the host device 2 via the autonomous positioning navigation device 1.
- the autonomous positioning navigation device 1 forwards this data packet to the underlying control device.
- such a request can make the underlying motion system of the host device 2 directly advance, retreat, rotate, etc. at a specific speed.
- the underlying motion control request can also contain direct control data for the left and right wheel motor speeds.
- the data transmitted during the communication between the second transmission device 12 of the autonomous positioning navigation device 1 and the upper layer control device 22 of the host device 2 described above is a preferred example, representing the minimum data that should be supported.
- the autonomous positioning navigation device 1 of the present application, through the cooperation of the first transmission device 11 and the second transmission device 12, clarifies the communication specifications and dependencies between the autonomous positioning navigation device 1 and the host device 2; all interaction and data dependency between the autonomous positioning navigation device 1 and the host device 2 occurs on one of the communication interfaces of the first transmission device 11 and the second transmission device 12.
- the processing device 13 acquires the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion related logic data and the motion control command. Specifically, the processing device 13 generates map data and synchronous positioning data based on the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates motion planning logic data and the motion control command based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device 2.
- the first transmission device 11 may further acquire sensing information of the host device 2 from the underlying control device of the host device 2, and the processing device 13 may process this sensing information together with the plurality of sensing information. Specifically, the first transmission device 11 further acquires sensing information of the host device 2 from the underlying control device of the host device 2, and the processing device 13 generates motion control related information of the host device 2 based on the sensing information of the host device 2, the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information.
- the underlying positioning navigation related information further includes parameter information of the host device 2.
- the first transmission device 11 acquires the parameter information,
- the processing device 13 further generates an initial motion control command based on the parameter information, and
- the first transmission device 11 transmits the initial motion control command to the underlying control device of the host device 2.
- the autonomous positioning navigation device 1 further includes a built-in sensor 14 and an external sensor 15; wherein the processing device 13 acquires a plurality of the sensing information from the built-in sensor 14 and the external sensor 15.
- the built-in sensor 14 includes at least one of the following: a gyroscope, an acceleration sensor, an electronic compass, a temperature sensor, a humidity sensor, and a barometric pressure sensor.
- the external sensor includes at least one of the following: a laser radar, a sonar radar, a visual sensor, and a UWB beacon sensor.
- the built-in sensor 14 is a series of sensors integrated inside the autonomous positioning navigation device 1.
- the built-in sensor 14 may include an inertial navigation sensor such as a gyro, an accelerometer, an electronic compass, or the like, and a combination of one or more of a temperature sensor, a humidity sensor, a pressure sensor, and the like.
- the built-in sensor 14 is characterized in that it can be physically integrated directly into the autonomous positioning navigation device 1, for example mounted on the PCB inside the autonomous positioning navigation device 1, and can collect built-in sensing information by itself without the assistance of the external host device 2.
- the built-in sensor 14 can contain more different types of sensors, depending on the specific implementation and application requirements.
- the built-in sensing information acquired by the built-in sensor 14 can be used to determine the pitch angle, roll angle, heading angle, height, ambient temperature, humidity, and the like of the environment in which the autonomous positioning navigation device 1 is currently located, facilitating the posture solving task of the host device 2 performed by the processing device 13.
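One common way to obtain a pitch or roll angle from a gyroscope and an accelerometer is a complementary filter. This is a minimal sketch of that general technique, not the patent's actual posture solving algorithm; the filter gain and sample period are illustrative assumptions.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (smooth but drifting) with the
    accelerometer-derived angle (noisy but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_pitch(ax, ay, az):
    # pitch angle implied by the gravity vector seen by the accelerometer
    return math.atan2(-ax, math.hypot(ay, az))

# one update step: device at rest, gravity along +z, no rotation rate
pitch = complementary_filter(0.0, gyro_rate=0.0,
                             accel_angle=accel_pitch(0.0, 0.0, 9.81),
                             dt=0.01)
```

The heading angle cannot be recovered from gravity alone, which is why the electronic compass is listed among the built-in sensors.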
- the external sensor 15 preferably includes one of, or a combination of, a laser radar, a visual sensor (e.g., a camera), a UWB (Ultra-Wideband) beacon sensor, and the like; the specific selection depends on the implementation of the autonomous positioning navigation device 1.
- the difference between the external sensor 15 and the built-in sensor 14 is that the former requires direct measurement and observation of the external environment; it therefore cannot, like the built-in sensor 14, be physically installed entirely inside the autonomous positioning navigation device 1, and must be exposed to the outside to facilitate direct measurement of the physical environment.
- the installation position and angle of the external sensor 15, and the description information of any additionally equipped sensors, are transmitted through the first transmission device 11 during the initialization phase of the autonomous positioning navigation device 1.
- depending on the specific hardware chip selected for the autonomous positioning navigation device 1, the foregoing devices may be physically designed into the same chip, or the same component may be composed of several different discrete hardware parts.
- additional functional units, such as internal power management devices, may be added in specific implementations, but these portions are not necessary hardware components of the autonomous positioning navigation device 1 of the present invention.
- the processing device 13 is mainly used to run automatic positioning navigation related algorithms, such as, but not limited to: simultaneous localization and mapping (SLAM), path planning algorithms, obstacle avoidance algorithms, and algorithms for calculating the spatial pose of the robot from built-in sensor data.
- the processing device 13 may be composed of one or more computer systems, or may be a purely hardware implementation such as an Application Specific Integrated Circuit (ASIC) or a Field-Programmable Gate Array (FPGA).
- when implemented in a general-purpose computer system, the unit will contain one or more CPU (Central Processing Unit) cores, random access memory (RAM), and read-only memory (ROM) for storing permanent programs and data.
- the processing device 13 includes a main processing unit and a slave processing unit. The main processing unit generates the motion related logic data of the host device 2 and the motion control command based on the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information; the slave processing unit acquires the sensing information from the built-in sensor in real time to perform the posture solving task, and the motion control command is sent by the first transmission device to the underlying control device of the host device 2.
- FIG. 2 is a schematic structural diagram of the autonomous positioning navigation device 1 according to a preferred embodiment of the present application. The control signal interface shown corresponds to the first transmission device 11 of FIG. 1, and the high-speed signal interface shown corresponds to the second transmission device 12 of FIG. 1. The processing device 13 includes a main processing unit and a slave processing unit; the illustrated main operation unit (preferably a CPU) corresponds to the main processing unit of FIG. 1, and the illustrated slave operation unit (preferably an MCU, Microcontroller Unit) corresponds to the slave processing unit of FIG. 1.
- herein, the terms slave processing unit and slave operation unit are used interchangeably.
- in this implementation, the processing device 13 uses one main operation unit and one slave operation unit.
- the main operation unit has strong computing power, and most of the positioning navigation algorithms are arranged to execute on it. The slave operation unit is implemented by a single-chip microcomputer; its computing power is relatively weak, but it has good real-time performance. It is therefore used to perform the posture solving task on data acquired from the built-in sensors, and is also responsible for implementing the control signal interface defined in this device, that is, for communicating with the underlying control device of the external host device 2.
- the above implementation provides two physical interfaces that implement the high-speed signal interface: a 100M Ethernet interface and an 802.11b/g WIFI wireless network interface.
- the host device 2 can communicate with the positioning navigation module through either specific physical interface according to its own requirements.
- the electronic compass, gyroscope, accelerometer, and barometer in FIG. 2 constitute the built-in sensors, which can collect the pitch angle, roll angle, heading angle, and height information of the autonomous positioning navigation device 1 in its current environment.
- the external sensor uses a laser radar in the above implementation.
- the autonomous positioning navigation device 1 integrates the sensors required by most autonomous positioning and navigation applications, such as the inertial navigation sensors (gyroscope, accelerometer, and electronic compass).
- these sensors are physically integrated into the interior of the autonomous positioning navigation device 1 and are used together with external sensors such as a laser radar or a visual sensor, so that almost all sensor data on which positioning and navigation depend is processed in the processing device of the autonomous positioning navigation device 1.
- the reliance on the sensors of the host device 2 is thus greatly reduced, and the host device 2 can perform positioning and navigation work with the autonomous positioning navigation device 1 even without additional sensor equipment, thereby well solving the high coupling between existing navigation positioning devices and the host device 2 while ensuring flexible scalability.
- FIG. 4 illustrates a positioning and navigation method using the autonomous positioning navigation device 1 according to an aspect of the present application, wherein the autonomous positioning navigation device 1 is configured to perform positioning and navigation for a host device 2 and includes a processing device, a first transmission device, and a second transmission device; the method comprises step S11, step S12, and step S13.
- in step S11, the first transmission device acquires underlying positioning navigation related information from the underlying control device of the host device 2, and the second transmission device acquires upper layer positioning navigation related information from the upper layer control device of the host device 2.
- in step S12, the processing device acquires a plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates a motion control command for controlling the movement of the host device 2 and motion related logic data for business logic analysis by the host device 2;
- in step S13, the first transmission device sends the motion control command to the underlying control device of the host device 2, and the second transmission device transmits the motion related logic data to the upper layer control device of the host device 2.
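The three steps above can be sketched as one iteration of a control loop. All class and method names below are hypothetical stand-ins for the devices described in the text, not an actual API.

```python
class _Interface:
    """Stand-in for a transmission device (hypothetical API)."""
    def __init__(self, incoming):
        self.incoming, self.sent = incoming, []
    def receive_underlying_info(self):
        return self.incoming
    def receive_upper_info(self):
        return self.incoming
    def send_motion_control(self, command):
        self.sent.append(command)
    def send_motion_logic(self, data):
        self.sent.append(data)

class _Processing:
    """Stand-in for the processing device."""
    def compute(self, lower_info, upper_info):
        command = ("v", 0.2, "w", 0.0)          # motion control command
        logic_data = {"pose": (0.0, 0.0, 0.0)}  # motion related logic data
        return command, logic_data

def positioning_navigation_step(first_tx, second_tx, processing):
    # S11: acquire information from both control layers of the host device
    lower_info = first_tx.receive_underlying_info()
    upper_info = second_tx.receive_upper_info()
    # S12: generate the motion control command and motion related logic data
    command, logic_data = processing.compute(lower_info, upper_info)
    # S13: command goes to the underlying layer, logic data to the upper layer
    first_tx.send_motion_control(command)
    second_tx.send_motion_logic(logic_data)
    return command, logic_data

first = _Interface({"wheels": (0, 0)})
second = _Interface({"request": None})
cmd, logic = positioning_navigation_step(first, second, _Processing())
```

The split mirrors the two interfaces: the motion control command only ever travels over the first transmission device, and the logic data only over the second.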
- the host device 2 may be a machine device that automatically performs work, such as a robot or the like.
- the host device 2 can accept human commands, run pre-programmed programs, or act according to principles established by artificial intelligence technology, in order to assist or replace human work.
- the host device 2 has an upper layer control device that processes business logic and analyzes and formulates action targets, and an underlying control device that drives the movement of actuators; that is, the host device 2 can perform various actions through its power components according to control signals sent by the control device, where the input control signal is an electrical signal and the output is linear and angular displacement.
- the driving device used by the host device 2 may be an electric driving device (mechanical wheel set) such as a stepping motor, a servo motor, or the like, or may be a hydraulic or pneumatic driving device or the like.
- the underlying positioning navigation related information may include wheel group status information of the host device 2; the upper layer positioning navigation related information may include a request by the host device 2 to perform motion planning and/or a request by the host device 2 for motion control of its underlying control device; the motion related logic data includes map data, synchronous positioning data, and motion planning logic data.
- the first transmission device 11 is in data communication with the underlying control device of the host device 2, and the second transmission device 12 is in data communication with the upper layer control device of the host device 2.
- the first transmission device acquires the underlying positioning navigation related information from the underlying control device of the host device 2, and the second transmission device acquires the upper layer positioning navigation related information from the upper layer control device of the host device 2.
- the first transmission device sends the motion control command to the underlying control device of the host device 2, and the second transmission device transmits the motion related logic data to the upper layer control device of the host device 2.
- the content of the first transmission device here is the same as or substantially the same as that of the first transmission device 11 shown in FIG. 1, and the content of the second transmission device is the same as or substantially the same as that of the second transmission device 12 shown in FIG. 1; for the sake of brevity, they are not described again and are only included herein by reference.
- the positioning and navigation method of the present application, through the cooperation of the first transmission device 11 and the second transmission device 12, clarifies the communication specifications and dependencies between the autonomous positioning navigation device 1 and the host device 2; all interaction and data dependency between the autonomous positioning navigation device 1 and the host device 2 occurs on one of the communication interfaces of the first transmission device 11 and the second transmission device 12.
- FIG. 3 is a schematic diagram showing data transmission during the cooperation of the first transmission device of the autonomous positioning navigation device 1 and the underlying control device of the host device 2 according to a preferred embodiment of the present application.
- the host device 2 needs to first provide the autonomous positioning navigation device 1 with parameter information describing the host device 2 itself. This parameter information describes the platform characteristics of the current host device 2, such as its size, its drive mode (two-wheel differential drive, omnidirectional wheel structure, etc.), and the installation position and angle of the external sensors; if additional sensors are installed, their description information must also be provided to the autonomous positioning navigation device 1 at this time.
- after receiving the parameter information of the host device 2, the autonomous positioning navigation device 1 performs the necessary initialization work to adapt to the current host device 2. Subsequently, the autonomous positioning navigation device 1 periodically transmits motion control commands to the host device 2.
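The initialization handshake followed by periodic command transmission can be sketched as a message sequence. The message names, the acknowledgment, and the use of a generator are illustrative assumptions, not the patent's actual protocol.

```python
import itertools

def control_channel(params, command_source, cycles=3):
    """Yield the messages the navigation device would send over the control
    signal interface: one initialization acknowledgment, then a fixed number
    of periodic motion control commands (hypothetical message shapes)."""
    # initialization: adapt to the host platform described by `params`
    yield ("init_ok", params["drive_mode"])
    # periodic phase: one motion control command per control cycle
    for _ in range(cycles):
        yield ("motion_control", next(command_source))

# example: a differential-drive host receiving a constant (v, w) command
msgs = list(control_channel({"drive_mode": "diff_drive"},
                            itertools.repeat((0.2, 0.0))))
```

In a real system the periodic phase would run on a timer for as long as the devices cooperate, rather than for a fixed cycle count.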
- the motion control command is used to describe a mode in which the autonomous positioning navigation device 1 expects the host device 2 to move next.
- the motion control command may be a desired speed of operation of the left and right wheel sets.
- the motion control command may be a linear velocity (v) and an angular velocity (w) at which the robot performs translation and rotation at the next moment.
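For the two-wheel differential drive mode mentioned in the text, a (v, w) command maps to left and right wheel speeds by standard differential-drive kinematics. This is a general-purpose sketch; the wheel base value is an illustrative assumption.

```python
def vw_to_wheel_speeds(v, w, wheel_base=0.3):
    """Convert a linear velocity v (m/s) and angular velocity w (rad/s)
    into left/right wheel linear speeds for a differential-drive base."""
    v_left = v - w * wheel_base / 2.0
    v_right = v + w * wheel_base / 2.0
    return v_left, v_right

# pure rotation: the wheels spin at equal speed in opposite directions
left, right = vw_to_wheel_speeds(0.0, 1.0)
```

This conversion is why the motion control command can equivalently be expressed as a (v, w) pair or as desired speeds of the left and right wheel sets.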
- while the autonomous positioning navigation device 1 periodically transmits the motion control command, the host device 2 also needs to periodically transmit to the autonomous positioning navigation device 1 the wheel group status information describing its motion. This information generally includes the change in displacement and heading angle of the host device 2 relative to the previous moment. For a host device 2 that uses a two-wheel differential drive, the wheel group status information may directly carry the cumulative number of revolutions of the left and right wheels or the accumulated odometer information.
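The displacement and heading-angle changes carried in the wheel group status can be turned into a pose estimate by standard dead-reckoning odometry. The update below is a common textbook form, not the patent's algorithm; the wheel base is an illustrative assumption.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, wheel_base=0.3):
    """Dead-reckoning pose update from the per-wheel travel distances (m)
    accumulated since the previous wheel group status report."""
    d_center = (d_left + d_right) / 2.0        # forward displacement
    d_theta = (d_right - d_left) / wheel_base  # heading change
    # integrate along the mid-arc heading for a second-order-accurate step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# straight-line motion: both wheels travel 0.1 m
x, y, th = odometry_update(0.0, 0.0, 0.0, d_left=0.1, d_right=0.1)
```

When the status carries cumulative revolution counts instead of distances, each delta is first multiplied by the wheel circumference before this update.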
- the host device 2 may also periodically transmit to the autonomous positioning navigation device 1 its own sensing information, carried as sensor data description information with a uniform definition.
- the autonomous positioning navigation device 1 can expand its own functions by accepting the sensor data description information, thereby processing additional external sensors.
- the above data is the minimum set of data types that must be transmitted through the first transmission device 11 to ensure normal operation of the autonomous positioning navigation device 1, and is only a preferred example of the data carried by the first transmission device 11.
- other protocol types applicable to the present application for transferring these data types between the first transmission device 11 of the autonomous positioning navigation device 1 and the underlying control device 21 of the host device 2 are likewise incorporated herein by reference.
- the data transmitted through the second transmission device 12 between the autonomous positioning navigation device 1 and the upper layer control device 22 of the host device 2 is the same as described above; for the sake of brevity, it is not described again and is only included herein by reference.
- the processing device acquires a plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates a motion control command for controlling the movement of the host device 2 and motion related logic data for business logic analysis by the host device 2.
- the content of the processing device is the same as or substantially the same as that of the processing device 13 shown in FIG. 1; for the sake of brevity, it is not described again and is only included herein by reference.
- the autonomous positioning navigation device 1 further includes a built-in sensor and an external sensor; the step S12 further includes: the processing device acquiring a plurality of the sensing information from the built-in sensor and an external sensor.
- the content of the built-in sensor is the same as or substantially the same as that of the built-in sensor 14 shown in FIG. 1, and the content of the external sensor is the same as or substantially the same as that of the external sensor 15 shown in FIG. 1; for the sake of brevity, they are not described again and are only included herein by reference.
- the step S11 further includes: the first transmission device acquires sensing information of the host device 2 from the underlying control device of the host device 2; the step S12 includes: the processing device generates motion control related information of the host device 2 based on the sensing information of the host device 2, the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information.
- the underlying positioning navigation related information includes wheel group status information of the host device 2; the upper layer positioning navigation related information includes a request by the host device 2 to perform motion planning and/or a request by the host device 2 for motion control of its underlying control device; the motion related logic data includes map data, synchronous positioning data, and motion planning logic data.
- the step S12 includes: the processing device generates map data and synchronous positioning data based on the plurality of sensing information, the underlying positioning navigation related information, and the upper layer positioning navigation related information, and generates motion planning logic data and the motion control command based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device 2.
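The two-stage generation in step S12 can be viewed as a small pipeline: a SLAM-style stage produces the map and pose, and a planning stage consumes them. The stage functions below are trivial hypothetical stand-ins so the sketch runs end to end; they do not represent real SLAM or planning algorithms.

```python
def step_s12(sensing, lower_info, upper_info):
    """Sketch of step S12's two stages (hypothetical stage functions)."""
    # stage 1: fusion yields map data and synchronous positioning data
    map_data, sync_pose = build_map_and_pose(sensing, lower_info, upper_info)
    # stage 2: planning uses the map, the pose, and the upper-layer requests
    logic_data, command = plan_motion(sync_pose, map_data, upper_info)
    return map_data, sync_pose, logic_data, command

def build_map_and_pose(sensing, lower_info, upper_info):
    # stand-in: empty occupancy map and the origin pose (x, y, theta)
    return {"cells": []}, (0.0, 0.0, 0.0)

def plan_motion(pose, map_data, upper_info):
    # stand-in: a one-point path and a zero-velocity motion control command
    return {"path": [pose]}, ("v", 0.0, "w", 0.0)

m, p, l, c = step_s12({}, {}, {})
```

The ordering matters: the command and logic data depend on the synchronous positioning data and map produced in the first stage, which is exactly the dependency the step describes.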
- the step S12 further includes: the processing device further generates an initial motion control command based on the parameter information; the step S13 further includes: the first transmission device transmits the initial motion control command to the underlying control device of the host device 2.
- FIG. 5 is a schematic diagram showing the cooperation structure between the autonomous positioning navigation device 1 and the host device 2 according to a preferred embodiment of the present application.
- the autonomous positioning navigation device 1 includes: a first device 31, a second device 32, and a third device 33.
- the first device 31 is configured to acquire the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device 2; the second device 32 is configured to acquire a plurality of pieces of sensing information and to perform pre-processing and pre-fusion on them; the third device 33 is configured to generate, based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, a motion control command for controlling the motion of the host device 2 and motion related logic data for the host device 2 to perform business logic analysis.
- the first device 31 is further configured to send the synchronous positioning data, the map data, the motion planning logic data, and the motion control command to the host device 2.
- the motion related logic data includes map data, synchronous positioning data, and motion planning logic data; the third device 33 includes a first unit 331, configured to generate the map data and the synchronous positioning data based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information;
- the second unit 332 is configured to generate motion planning logic data and motion control commands for controlling the motion of the host device 2 based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device 2.
- the second device 32 includes: a third unit, configured to acquire a plurality of pieces of the sensing information, the plurality of pieces of sensing information including at least one of: built-in sensing information, external sensing information, and sensing information of the host device 2; and a fourth unit, configured to perform pre-processing and pre-fusion on the sensing information.
- the first device 31 further includes: a fifth unit, configured to encapsulate the synchronous positioning data, the map data, the motion planning logic data, and the motion control command according to a unified data protocol format; and a sixth unit, configured to send the encapsulated data to the host device 2.
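The patent leaves the "unified data protocol format" unspecified, so the encapsulation performed by the fifth and sixth units can only be sketched under assumptions. The sketch below frames each message as [magic][type][length][payload][CRC32]; all constants, message types, and the JSON payload encoding are illustrative, not part of the disclosure.

```python
import json
import struct
import zlib

# Assumed frame layout: 2-byte magic, 1-byte type, 4-byte length (little-endian),
# JSON payload, 4-byte CRC32 over header + payload.
MAGIC = 0xA55A

MSG_MAP_DATA = 0x01    # map data
MSG_POSE = 0x02        # synchronous positioning data
MSG_PLAN_LOGIC = 0x03  # motion planning logic data
MSG_MOTION_CMD = 0x04  # motion control command

def encapsulate(msg_type: int, payload: dict) -> bytes:
    """Pack one message according to the assumed unified frame layout."""
    body = json.dumps(payload).encode("utf-8")
    header = struct.pack("<HBI", MAGIC, msg_type, len(body))
    crc = struct.pack("<I", zlib.crc32(header + body))
    return header + body + crc

def decapsulate(frame: bytes):
    """Unpack and verify one frame; returns (msg_type, payload)."""
    magic, msg_type, length = struct.unpack_from("<HBI", frame, 0)
    assert magic == MAGIC, "bad magic"
    body = frame[7:7 + length]  # header is 7 bytes: H(2) + B(1) + I(4)
    (crc,) = struct.unpack_from("<I", frame, 7 + length)
    assert crc == zlib.crc32(frame[:7 + length]), "corrupt frame"
    return msg_type, json.loads(body.decode("utf-8"))

frame = encapsulate(MSG_POSE, {"x": 1.2, "y": -0.4, "theta": 0.3})
kind, pose = decapsulate(frame)
```

A CRC lets the host-side SDK reject frames corrupted on the UART or CAN link before parsing them.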
- FIG. 6 is a schematic diagram showing a cooperation structure between an autonomous positioning navigation device 1 and a host device 2 according to a preferred embodiment of the present application. Referring to FIG. 5 and FIG. 6, the autonomous positioning navigation device 1 includes a positioning and map building module, a motion planning module, a motion control and state acquiring module, and a communication interaction management module.
- the host device 2 includes a behavior control and expansion module, wherein the positioning and map building module corresponds to the first unit 331, the motion planning module corresponds to the second unit 332, the motion control and state acquiring module corresponds to the second device 32, the communication interaction management module corresponds to the first device 31, and the behavior control and expansion module corresponds to the fourth device 41; these terms are used interchangeably below.
- the behavior control and expansion module cooperates with the communication interaction management module to perform data transmission between the autonomous positioning navigation device 1 and the host device 2; physically it is generally implemented inside the computer of the host device 2, but from a software perspective it still belongs to the autonomous positioning navigation system as a whole.
- the positioning and map building module constructs map data and synchronous positioning data.
- the positioning and map building module is implemented by a specific simultaneous localization and mapping (SLAM) algorithm, which may be a SLAM algorithm based on particle filtering and a grid map model that uses lidar as its main input signal, or a visual SLAM algorithm that uses two-dimensional image data provided by a camera.
- the module obtains input data through the built-in sensor and the external sensor, and provides the calculated map information and the positioning coordinate information to other modules inside the autonomous positioning navigation device 1.
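The particle-filtering idea behind such a grid-map SLAM algorithm can be sketched in miniature. The example below is pure 1-D localization against a known wall position (full SLAM would also estimate the map, which is omitted); all numbers, noise levels, and function names are illustrative assumptions.

```python
import math
import random

# A wall sits at x = 10.0 and a forward-facing range sensor measures the
# distance to it. Particles are candidate robot positions on the x axis.
WALL_X = 10.0

def move(particles, dx, noise=0.05):
    """Motion update: shift every particle, adding odometry noise."""
    return [p + dx + random.gauss(0.0, noise) for p in particles]

def weight(particles, measured_range, sigma=0.2):
    """Measurement update: likelihood of each particle given the range reading."""
    return [math.exp(-((WALL_X - p) - measured_range) ** 2 / (2 * sigma ** 2))
            for p in particles]

def resample(particles, weights):
    """Draw a new particle set proportional to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
true_x = 2.0
for _ in range(10):
    true_x += 0.5                                   # robot drives forward
    particles = move(particles, 0.5)
    measured = WALL_X - true_x + random.gauss(0.0, 0.1)
    particles = resample(particles, weight(particles, measured))

estimate = sum(particles) / len(particles)          # posterior mean position
```

After a few cycles the particle cloud collapses around the true position; a grid-map SLAM algorithm applies the same predict/weight/resample loop to full robot poses against lidar scans.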
- the motion planning module is responsible for performing the action control of the host device 2.
- it includes map-based path planning algorithms such as A* and D*, and an obstacle avoidance algorithm that guides the host device 2 (robot) to avoid obstacles in real time.
- the module may also include a charging pile docking algorithm, such as autonomous return-to-charge, or a ground coverage algorithm required by a cleaning robot.
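As an illustration of the map-based path planning mentioned above, here is a minimal A* search on a 2-D occupancy grid (D* adds incremental replanning when the map changes and is not shown). The grid, coordinates, and function name are illustrative, not taken from the patent.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid; 0 = free cell, 1 = occupied."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]          # (f = g + h, cell)
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:              # reconstruct path back to start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cur
                    # Manhattan distance: admissible heuristic on a 4-connected grid.
                    h = abs(goal[0] - nr) + abs(goal[1] - nc)
                    heapq.heappush(open_set, (ng + h, (nr, nc)))
    return None                      # goal unreachable

grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],                    # a wall with one gap at the right edge
    [0, 0, 0, 0],
]
path = astar(grid, (0, 0), (2, 0))
```

In the device, the grid would come from the SLAM module's occupancy map, and the resulting path would feed the obstacle avoidance layer.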
- another core function of the module is to accept extended control instructions from the behavior control and expansion module in the external host device 2, including requests that the host device needs to perform motion planning or that the host device needs its underlying control device to perform motion control; these requests are merged with the module's own motion planning logic to implement more complex control logic, extending and modifying the existing motion planning algorithms.
- the motion control and state acquisition module is responsible for collecting built-in sensor information, external sensor information, and host device sensing information from the host device 2, performing the necessary data pre-processing and fusion, and providing the result to the other modules of the autonomous positioning navigation device 1. In addition, the module acts as an abstraction layer for host device differences, concealing the differences between different host device 2 platforms and performing the necessary emulation, so that the positioning and map building module and the motion planning module running on top of it are minimally affected by the specifics of a given host device 2 and can adopt relatively universal algorithm implementations.
- the communication and interaction management module acquires the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device 2, and is responsible for interacting directly with the host device 2 through the high-speed signal interface and the control signal interface of the autonomous positioning navigation device 1; it can be considered the abstraction layer for a specific communication interface.
- the module is responsible for acquiring, through the corresponding interfaces, the data required by the other modules of the autonomous positioning navigation device 1 from the host device 2, and for encapsulating the data destined for the host device 2 according to a unified data protocol format before transmitting it.
- the communication and interaction management module encapsulates the motion control instruction and the motion related logic data according to a unified protocol rule, and sends the encapsulated data to the host device 2.
- the behavior control and expansion module cooperates with the communication interaction management module to perform data transmission between the autonomous positioning navigation device 1 and the host device 2, assisting the software system in the host device 2 to interact with the autonomous positioning navigation device 1; it therefore generally runs in the computer system of the host device 2.
- the module can obtain state information such as maps and position coordinates provided by the other modules of the autonomous positioning navigation device 1 through the high-speed signal interface, and can invoke, extend, and modify the existing algorithms in the motion planning module through predefined motion planning extension commands.
- the module is generally provided to the host device 2 in the form of a software development kit (SDK) and integrated with other software modules in the host device 2.
- an autonomous positioning navigation system includes the autonomous positioning navigation device 1 and the host device 2, wherein the host device 2 includes a fourth device for sending the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device 2 to the autonomous positioning navigation device 1, and for acquiring the motion control commands for controlling the motion of the host device 2 and the motion related logic data for the host device 2 to perform business logic analysis.
- FIG. 7 is a schematic diagram showing the cooperation structure of the autonomous positioning navigation device 1 and the host device 2 according to another preferred embodiment of the present application.
- the behavior control and extension module running on the host device 2 is divided by responsibility into two parts, host-oriented underlying control and host-oriented upper layer business logic, corresponding to the underlying SDK (software development kit) and the upper layer SDK in the figure.
- the host-oriented underlying control part faces the underlying control device of the host device 2; it communicates with the body of the autonomous positioning navigation device 1 through the control signal interface, and is responsible for relaying robot motion signals with the host device 2 and for transmitting additional extended sensor data from the host device 2.
- the host-oriented upper layer business logic part communicates with the body of the autonomous positioning navigation device 1 through the high-speed signal interface; it provides the host device 2 with information such as the maps and positioning coordinates generated by the autonomous positioning navigation device 1, and includes an extensible motion planning framework sub-module used to implement the host device 2's invocation, extension, and behavior modification of the motion planning algorithm logic inside the positioning navigation module.
- the above example uses a lidar as the external sensor, so the positioning and map building module is implemented by a SLAM algorithm using a particle-filtered grid map.
- the sensor data required by SLAM is acquired by the other modules and subjected to the necessary data pre-fusion before finally being read in.
- after the SLAM module completes its processing, the resulting map and coordinate data are temporarily cached in the memory of the autonomous positioning navigation device 1 for use by the other modules and the external host device 2.
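The in-memory caching of the latest map and coordinate data might look like the following thread-safe snapshot cache; the class and method names are illustrative assumptions, chosen only to show how concurrent readers (other modules, the host-side SDK) can safely consume the SLAM output.

```python
import copy
import threading

class SlamResultCache:
    """Holds the most recent map and pose produced by the SLAM module."""

    def __init__(self):
        self._lock = threading.Lock()
        self._map = None
        self._pose = None

    def publish(self, map_data, pose):
        """Called by the SLAM thread after each processing cycle."""
        with self._lock:
            # Deep-copy so later mutation by the producer cannot race readers.
            self._map = copy.deepcopy(map_data)
            self._pose = tuple(pose)

    def snapshot(self):
        """Called by other modules or the host interface; returns a consistent
        (map, pose) pair taken under the lock."""
        with self._lock:
            return copy.deepcopy(self._map), self._pose

cache = SlamResultCache()
cache.publish({"cells": [0, 1, 0]}, (1.0, 2.0, 0.5))
m, p = cache.snapshot()
```

Taking both values under one lock guarantees the host never sees a map from one SLAM cycle paired with a pose from another.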
- a D* path planning algorithm that can compute the shortest path between any two points is built into the motion planning module, together with an obstacle avoidance algorithm that uses various sensor data to assist the host device 2 in avoiding obstacles in real time during motion.
- FIG. 8 illustrates a positioning and navigation method according to a preferred embodiment of another aspect of the present application, wherein the method includes step S31, step S32, step S33, and step S34.
- step S31: acquiring the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device 2; step S32: acquiring a plurality of pieces of sensing information and performing pre-processing and pre-fusion on them; step S33: generating, based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, a motion control command for controlling the movement of the host device 2 and motion related logic data for the host device 2 to perform business logic analysis; step S34: transmitting the synchronous positioning data, the map data, the motion planning logic data, and the motion control command to the host device 2.
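Steps S31 to S34 can be sketched as a single processing cycle. Every name below (`get_lower_layer_info`, `get_upper_layer_info`, `receive`, and the fake host/sensor stubs) is an illustrative placeholder, since the method defines a data flow rather than an API, and the "fusion" here is a trivial stand-in.

```python
def positioning_navigation_cycle(host, sensors):
    # S31: acquire lower-layer and upper-layer positioning navigation information.
    lower_info = host.get_lower_layer_info()   # e.g. wheel-group status
    upper_info = host.get_upper_layer_info()   # e.g. motion planning requests

    # S32: acquire several pieces of sensing information, pre-process and pre-fuse.
    readings = [s.read() for s in sensors]
    fused = sum(readings) / len(readings)      # trivial stand-in for real fusion

    # S33: generate synchronous positioning data, map data, motion planning
    # logic data, and the motion control command.
    pose = {"x": fused, "odom": lower_info}
    map_data = {"resolution": 0.05}
    plan = {"goal": upper_info.get("goal")}
    command = {"linear": 0.2, "angular": 0.0}

    # S34: send everything back to the host device.
    host.receive(pose, map_data, plan, command)
    return command

class FakeSensor:
    def __init__(self, value):
        self.value = value
    def read(self):
        return self.value

class FakeHost:
    def get_lower_layer_info(self):
        return {"wheels": "ok"}
    def get_upper_layer_info(self):
        return {"goal": (3, 4)}
    def receive(self, *messages):
        self.last = messages

host = FakeHost()
cmd = positioning_navigation_cycle(host, [FakeSensor(1.0), FakeSensor(3.0)])
```

The stubs stand in for the transmission devices; in the real system S31 and S34 would go over the control and high-speed signal interfaces.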
- the motion-related logic data includes map data, synchronization positioning data, and motion planning logic data.
- the step S33 includes: generating map data and synchronous positioning data based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information; and generating motion planning logic data and motion control commands for controlling the movement of the host device 2 based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device 2.
- the step S32 includes: acquiring a plurality of pieces of the sensing information, the plurality of pieces of sensing information including at least one of: built-in sensing information, external sensing information, and host device sensing information; and performing pre-processing and pre-fusion on them.
- step S34 includes: encapsulating the synchronous positioning data, the map data, the motion planning logic data, and the motion control command according to a unified data protocol format; and sending the encapsulated data to the host device 2.
- the autonomous positioning navigation device 1 includes: a positioning and map building module, a motion planning module, a motion control and state acquiring module, and a communication interaction management module.
- the host device 2 includes a behavior control and expansion module, wherein the positioning and map construction module corresponds to the first unit 331, the motion planning module corresponds to the second unit 332, the motion control and state acquiring module corresponds to the second device 32, the communication interaction management module corresponds to the first device 31, and the behavior control and expansion module corresponds to the fourth device 41; these terms are used interchangeably below.
- the communication and interaction management module acquires the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device 2, and is responsible for interacting directly with the host device 2 through the high-speed signal interface (a hardware-form or software-form interface) and the control signal interface (a hardware-form or software-form interface) of the autonomous positioning navigation device 1; it can be considered the abstraction layer for a specific communication interface.
- the module is responsible for acquiring, through the corresponding interfaces, the data required by the other modules of the autonomous positioning navigation device 1 from the host device 2, and for encapsulating the data destined for the host device 2 according to a unified data protocol format before transmitting it.
- the motion control and state acquisition module is responsible for collecting built-in sensing information, external sensing information, and host device sensing information from the host device 2, and for performing the necessary data pre-processing and fusion; the result is provided for use by the other modules in the autonomous positioning navigation device 1.
- the module acts as an abstraction layer for host device differences, concealing the differences between different host device 2 platforms and performing the necessary emulation, so that the positioning and map building module and the motion planning module running on top of it are minimally affected by the specifics of a given host device 2 and can adopt relatively universal algorithm implementations.
- the positioning and map construction module constructs map data and synchronous positioning data, and the motion planning module generates motion control instructions and motion related logic data. The positioning and map construction module is implemented by a specific simultaneous localization and mapping (SLAM) algorithm, which may be a SLAM algorithm based on particle filtering and a grid map model that uses lidar as its main input signal, or a visual SLAM algorithm that uses two-dimensional image data provided by the camera.
- the positioning and map construction module obtains input data through the built-in sensor and the external sensor, and provides the calculated map information and the positioning coordinate information to other modules inside the autonomous positioning navigation device 1.
- the motion planning module is responsible for performing the action control of the host device 2; it includes map-based path planning algorithms such as A* and D*, and an obstacle avoidance algorithm for the host device 2 (robot) to avoid obstacles in real time.
- the module may also include a charging pile docking algorithm, such as autonomous return-to-charge, or a ground coverage algorithm required by a cleaning robot.
- another core function of the module is to accept extended control instructions from the behavior control and expansion module in the external host device 2, including requests that the host device needs to perform motion planning or that the host device needs its underlying control device to perform motion control; these requests are merged with the module's own motion planning logic to implement more complex control logic, extending and modifying the existing motion planning algorithms.
- the communication and interaction management module encapsulates the motion control instruction and the motion related logic data according to a unified protocol rule, and sends the encapsulated data to the host device 2.
- the behavior control and expansion module cooperates with the communication interaction management module to perform data transmission between the autonomous positioning navigation device 1 and the host device 2.
- the behavior control and expansion module is generally implemented physically inside the computer of the host device 2, but from a software perspective it still belongs to the autonomous positioning navigation system as a whole.
- the purpose of the behavior control and extension module is to assist the software system in the host device 2 to interact with the autonomous positioning navigation device 1, so it generally runs in the computer system of the host device 2.
- the module can obtain state information such as maps and position coordinates provided by the other modules of the autonomous positioning navigation device 1 through the high-speed signal interface, and can invoke, extend, and modify the existing algorithms in the motion planning module through predefined motion planning extension commands.
- the module is generally provided to the host device 2 in the form of a software development kit (SDK) and integrated with other software modules in the host device 2.
- the behavior control and extension module running on the host device 2 is divided by responsibility into two parts, host-oriented underlying control and host-oriented upper layer business logic, corresponding respectively to the underlying SDK (software development kit) and the upper layer SDK in the figure.
- the host-oriented underlying control part faces the underlying control device of the host device 2; it communicates with the body of the autonomous positioning navigation device 1 through the control signal interface, and is responsible for relaying robot motion signals with the host device 2 and for transmitting additional extended sensor data from the host device 2.
- the host-oriented upper layer business logic part communicates with the body of the autonomous positioning navigation device 1 through the high-speed signal interface; it provides the host device 2 with information such as the maps and positioning coordinates generated by the autonomous positioning navigation device 1, and includes an extensible motion planning framework sub-module used to implement the host device 2's invocation, extension, and behavior modification of the motion planning algorithm logic inside the positioning navigation module.
- the above example uses a lidar as the external sensor, so the positioning and map building module is implemented by a SLAM algorithm using a particle-filtered grid map.
- the sensor data required by SLAM is acquired by the other modules and subjected to the necessary data pre-fusion before finally being read in.
- after the SLAM module completes its processing, the resulting map and coordinate data are temporarily cached in the memory of the autonomous positioning navigation device 1 for use by the other modules and the external host device 2.
- built into the motion planning module are a D* path planning algorithm that can compute the shortest path between any two points, an obstacle avoidance algorithm that uses various sensor data to assist the host device 2 in avoiding obstacles in real time during motion, and autonomous return-to-charging-pile docking logic.
- the control signals generated by these algorithms are ultimately converted into a set of control commands for the host device 2 and passed to it via the control signal interface.
- the autonomous positioning navigation device 1 of the present application is highly modular, which greatly reduces its coupling with the host device 2 and gives it the advantages of rapid integration into an existing host device 2 and flexible expansion. As a result, a host device 2 such as a robot has a simpler and clearer system configuration, the difficulty and time needed to develop a host device 2 equipped with the autonomous positioning navigation device 1 are greatly reduced, and the high degree of modularity of the system also makes a small host device 2 possible.
- by summarizing the sensing information dependencies required by most autonomous positioning navigation systems, the autonomous positioning navigation device 1 integrates the processing of the plurality of pieces of sensing information into the device itself, thereby reducing the burden on the host device 2.
- through the first transmission device and the second transmission device, the autonomous positioning navigation device 1 forms a highly flexible unified external communication interface and protocol specification, so that any host device 2 conforming to the interface protocol specification can easily interface with the autonomous positioning navigation device 1 and extend its functions.
- the present invention can be implemented in software and/or a combination of software and hardware, for example, using an application specific integrated circuit (ASIC), a general purpose computer, or any other similar hardware device.
- the software program of the present invention may be executed by a processor to implement the steps or functions described above.
- the software program (including related data structures) of the present invention can be stored in a computer readable recording medium such as a RAM memory, a magnetic or optical drive or a floppy disk and the like.
- some of the steps or functions of the present invention may be implemented in hardware, for example, as a circuit that cooperates with a processor to perform various steps or functions.
- a portion of the invention can be applied as a computer program product, such as computer program instructions, which, when executed by a computer, can invoke or provide a method and/or solution in accordance with the present invention.
- the program instructions for invoking the method of the present invention may be stored in a fixed or removable recording medium, and/or transmitted as a data stream in a broadcast or other signal-bearing medium, and/or stored in the working memory of a computer device that runs according to the program instructions.
- an embodiment in accordance with the present invention includes a device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the device is triggered to operate based on the aforementioned methods and/or technical solutions according to various embodiments of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Electromagnetism (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
Claims (22)
- An autonomous positioning navigation device, wherein the autonomous positioning navigation device is used to perform positioning navigation for a host device, the autonomous positioning navigation device comprising: a first transmission device, a second transmission device, and a processing device; wherein the first transmission device performs data communication with an underlying device of the host device to acquire bottom layer positioning navigation related information and to send motion control commands for controlling the motion of the host device; the second transmission device performs data communication with an upper layer device of the host device to acquire upper layer positioning navigation related information and to send motion related logic data for the host device to perform business logic analysis; the processing device acquires a plurality of pieces of sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion related logic data and the motion control commands.
- The autonomous positioning navigation device according to claim 1, wherein the autonomous positioning navigation device further comprises a built-in sensor and an external sensor; wherein the processing device acquires a plurality of pieces of the sensing information from the built-in sensor and the external sensor.
- The autonomous positioning navigation device according to claim 2, wherein the built-in sensor comprises at least any one of: a gyroscope, an acceleration sensor, an electronic compass, a temperature sensor, a humidity sensor, and an air pressure sensor; and the external sensor comprises at least any one of: a lidar, a sonar radar, a vision sensor, and a UWB beacon sensor.
- The autonomous positioning navigation device according to claim 2 or 3, wherein the processing device comprises a master processing unit and a slave processing unit, wherein the master processing unit generates the motion related logic data and the motion control commands of the host device based on a plurality of pieces of the sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information; the slave processing unit acquires the sensing information from the built-in sensor in real time to perform an attitude solution task, and sends the motion control commands to the underlying control device of the host device through the first transmission device.
- The autonomous positioning navigation device according to any one of claims 1 to 4, wherein the first transmission device further acquires host device sensing information from the underlying control device of the host device; the processing device generates motion control related information of the host device based on the host device sensing information, a plurality of pieces of the sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information.
- The autonomous positioning navigation device according to any one of claims 1 to 5, wherein the first transmission device comprises at least any one of: a UART serial port, a CAN bus, an SPI bus, and an I2C bus; the second transmission device comprises at least any one of: an Ethernet interface, a wireless network interface, a USB interface, and an optical fiber interface.
- The autonomous positioning navigation device according to any one of claims 1 to 6, wherein the bottom layer positioning navigation related information comprises wheel group status information of the host device, the upper layer positioning navigation related information comprises a request that the host device needs to perform motion planning and/or a request that the host device needs its underlying control device to perform motion control, and the motion related logic data comprises map data, synchronous positioning data, and motion planning logic data; the processing device generates the map data and the synchronous positioning data based on a plurality of pieces of the sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion planning logic data and the motion control commands based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device.
- The autonomous positioning navigation device according to any one of claims 1 to 7, wherein the bottom layer positioning navigation related information further comprises parameter information of the host device; the processing device further generates a motion initial control command based on the parameter information; and the first transmission device sends the motion initial control command to the underlying device of the host device.
- A method for positioning navigation using an autonomous positioning navigation device, wherein the autonomous positioning navigation device is used to perform positioning navigation for a host device and comprises a processing device, a first transmission device, and a second transmission device; wherein the method comprises: A. the first transmission device acquires bottom layer positioning navigation related information from the underlying control device of the host device, and the second transmission device acquires upper layer positioning navigation related information from the upper layer control device of the host device; B. the processing device acquires a plurality of pieces of sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, and generates motion control commands for controlling the motion of the host device and motion related logic data for the host device to perform business logic analysis; C. the first transmission device sends the motion control commands to the underlying control device of the host device, and the second transmission device sends the motion related logic data to the upper layer control device of the host device.
- The method according to claim 9, wherein the autonomous positioning navigation device further comprises a built-in sensor and an external sensor; the step B further comprises: the processing device acquires a plurality of pieces of the sensing information from the built-in sensor and the external sensor.
- The method according to claim 9 or 10, wherein the step A further comprises: the first transmission device acquires host device sensing information from the underlying control device of the host device; the step B comprises: the processing device generates motion control related information of the host device based on the host device sensing information, a plurality of pieces of sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information.
- The method according to any one of claims 9 to 11, wherein the bottom layer positioning navigation related information comprises wheel group status information of the host device, the upper layer positioning navigation related information comprises a request that the host device needs to perform motion planning and/or a request that the host device needs its underlying control device to perform motion control, and the motion related logic data comprises map data, synchronous positioning data, and motion planning logic data; the step B comprises: the processing device generates the map data and the synchronous positioning data based on a plurality of pieces of the sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, and generates the motion planning logic data and the motion control commands based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device.
- The method according to claim 12, wherein the bottom layer positioning navigation related information further comprises parameter information of the host device; the step B further comprises: the processing device further generates a motion initial control command based on the parameter information; the step C further comprises: the first transmission device sends the motion initial control command to the underlying device of the host device.
- A positioning navigation method, wherein the positioning navigation method comprises: a. acquiring bottom layer positioning navigation related information and upper layer positioning navigation related information of a host device; b. acquiring a plurality of pieces of sensing information, and performing pre-processing and pre-fusion on them; c. generating, based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, motion control commands for controlling the motion of the host device and motion related logic data for the host device to perform business logic analysis; d. sending the synchronous positioning data, the map data, the motion planning logic data, and the motion control commands to the host device.
- The positioning navigation method according to claim 14, wherein the motion related logic data comprises map data, synchronous positioning data, and motion planning logic data; the step c comprises: generating the map data and the synchronous positioning data based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information; and generating the motion planning logic data and the motion control commands for controlling the motion of the host device based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device.
- The positioning navigation method according to claim 15, wherein the step d comprises: encapsulating the acquired synchronous positioning data, map data, motion planning logic data, and motion control commands according to a unified data protocol format; and sending the encapsulated data to the host device.
- The positioning navigation method according to any one of claims 14 to 16, wherein the step b comprises: acquiring a plurality of pieces of the sensing information, the plurality of pieces of sensing information comprising at least any one of: built-in sensing information, external sensing information, and host device sensing information; and performing pre-processing and pre-fusion on them.
- An autonomous positioning navigation device, wherein the autonomous positioning navigation device comprises: a first device for acquiring bottom layer positioning navigation related information and upper layer positioning navigation related information of a host device; a second device for acquiring a plurality of pieces of sensing information and performing pre-processing and pre-fusion on them; a third device for generating, based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information, motion control commands for controlling the motion of the host device and motion related logic data for the host device to perform business logic analysis; the first device is further used to send the synchronous positioning data, the map data, the motion planning logic data, and the motion control commands to the host device.
- The autonomous positioning navigation device according to claim 18, wherein the motion related logic data comprises map data, synchronous positioning data, and motion planning logic data; the third device comprises: a first unit for generating the map data and the synchronous positioning data based on the pre-processed and pre-fused sensing information, the bottom layer positioning navigation related information, and the upper layer positioning navigation related information; and a second unit for generating the motion planning logic data and the motion control commands for controlling the motion of the host device based on the synchronous positioning data, the map data, and the upper layer positioning navigation related information of the host device.
- The autonomous positioning navigation device according to claim 19, wherein the first device further comprises: a fifth unit for encapsulating the acquired synchronous positioning data, map data, motion planning logic data, and motion control commands according to a unified data protocol format; and a sixth unit for sending the encapsulated data to the host device.
- The autonomous positioning navigation device according to any one of claims 18 to 20, wherein the second device comprises: a third unit for acquiring a plurality of pieces of the sensing information, the plurality of pieces of sensing information comprising at least any one of: built-in sensing information, external sensing information, and host device sensing information; and a fourth unit for performing pre-processing and pre-fusion on the sensing information.
- An autonomous positioning navigation system, wherein the autonomous positioning navigation system comprises: the autonomous positioning navigation device according to any one of claims 18 to 21; and a host device, the host device comprising: a fourth device for sending the bottom layer positioning navigation related information and the upper layer positioning navigation related information of the host device to the autonomous positioning navigation device, and for acquiring the motion control commands, sent to the host device, for controlling the motion of the host device and the motion related logic data for the host device to perform business logic analysis.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/779,521 US10974390B2 (en) | 2015-12-10 | 2016-12-05 | Autonomous localization and navigation equipment, localization and navigation method, and autonomous localization and navigation system |
JP2018530092A JP6868028B2 (ja) | 2015-12-10 | 2016-12-05 | 自律測位航法設備、測位航法方法及び自律測位航法システム |
EP16872361.7A EP3388786A4 (en) | 2015-12-10 | 2016-12-05 | DEVICE FOR AUTONOMOUS POSITIONING AND NAVIGATION, POSITIONING AND NAVIGATION METHOD AND SYSTEM FOR AUTONOMOUS POSITIONING AND NAVIGATION |
AU2016368234A AU2016368234A1 (en) | 2015-12-10 | 2016-12-05 | Autonomous positioning and navigation device, positioning and navigation method and autonomous positioning and navigation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510916639.9A CN106323269B (zh) | 2015-12-10 | 2015-12-10 | 自主定位导航设备、定位导航方法及自主定位导航*** |
CN201510916639.9 | 2015-12-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017097170A1 true WO2017097170A1 (zh) | 2017-06-15 |
Family
ID=57725830
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/108594 WO2017097170A1 (zh) | 2015-12-10 | 2016-12-05 | 自主定位导航设备、定位导航方法及自主定位导航*** |
Country Status (6)
Country | Link |
---|---|
US (1) | US10974390B2 (zh) |
EP (1) | EP3388786A4 (zh) |
JP (1) | JP6868028B2 (zh) |
CN (1) | CN106323269B (zh) |
AU (2) | AU2016102440A4 (zh) |
WO (1) | WO2017097170A1 (zh) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107450571A (zh) * | 2017-09-30 | 2017-12-08 | 江西洪都航空工业集团有限责任公司 | 一种基于ros的agv小车激光导航*** |
CN109782768A (zh) * | 2019-01-26 | 2019-05-21 | 哈尔滨玄智科技有限公司 | 一种适配于内行星式复合轮系搬运机器人的自主导航*** |
CN109976327A (zh) * | 2017-12-28 | 2019-07-05 | 沈阳新松机器人自动化股份有限公司 | 一种巡逻机器人 |
CN112346466A (zh) * | 2020-12-07 | 2021-02-09 | 苏州云骐智能科技有限公司 | 一种基于5g的多传感器融合agv冗余控制***及方法 |
TWI749656B (zh) * | 2020-07-22 | 2021-12-11 | 英屬維爾京群島商飛思捷投資股份有限公司 | 定位圖資建立系統及建立方法 |
CN114509064A (zh) * | 2022-02-11 | 2022-05-17 | 上海思岚科技有限公司 | 一种自主扩展传感器数据处理的方法、接口及设备 |
US11438886B2 (en) | 2020-02-27 | 2022-09-06 | Psj International Ltd. | System for establishing positioning map data and method for the same |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106842230A (zh) * | 2017-01-13 | 2017-06-13 | 深圳前海勇艺达机器人有限公司 | Mobile robot navigation method and system |
CN107063242A (zh) * | 2017-03-24 | 2017-08-18 | 上海思岚科技有限公司 | Positioning and navigation device with virtual wall function, and robot |
CN107357297A (zh) * | 2017-08-21 | 2017-11-17 | 深圳市镭神智能系统有限公司 | Sweeping robot navigation system and navigation method thereof |
CN107665503A (zh) * | 2017-08-28 | 2018-02-06 | 汕头大学 | Method for constructing a multi-floor three-dimensional map |
CN109101012A (zh) * | 2017-12-12 | 2018-12-28 | 上海魔龙机器人科技有限公司 | SLAM-algorithm-based robot navigation system and navigation method |
US10705538B2 (en) * | 2018-01-31 | 2020-07-07 | Metal Industries Research & Development Centre | Auto guided vehicle system and operating method thereof |
CN108469819A (zh) * | 2018-03-19 | 2018-08-31 | 杭州晶智能科技有限公司 | Zigzag return path planning method for an automatic vacuum-cleaning robot |
US11119507B2 (en) * | 2018-06-27 | 2021-09-14 | Intel Corporation | Hardware accelerator for online estimation |
CN108969858B (zh) * | 2018-08-08 | 2021-04-06 | 贵州中医药大学 | Oxygen supply method and system for a fully automatic oxygen delivery robot |
CN109725330A (zh) * | 2019-02-20 | 2019-05-07 | 苏州风图智能科技有限公司 | Vehicle body positioning method and device |
CN109917791B (zh) * | 2019-03-26 | 2022-12-06 | 深圳市锐曼智能装备有限公司 | Method for a mobile device to automatically explore and build a map |
CN110262518B (zh) * | 2019-07-22 | 2021-04-02 | 上海交通大学 | Vehicle navigation method, system and medium based on trajectory topological map and obstacle avoidance |
CN110519689A (zh) * | 2019-09-03 | 2019-11-29 | 广东博智林机器人有限公司 | Automatic charging and docking system for a robot, and docking method |
CN110658816B (zh) * | 2019-09-27 | 2022-10-25 | 东南大学 | Mobile robot navigation and control method based on intelligent components |
CN110763245A (zh) * | 2019-10-25 | 2020-02-07 | 江苏海事职业技术学院 | Map creation method based on stream computing, and system thereof |
KR102186830B1 (ko) * | 2020-03-13 | 2020-12-04 | 주식회사 자오스모터스 | LiDAR system adapted to artificial intelligence |
US11768504B2 (en) | 2020-06-10 | 2023-09-26 | AI Incorporated | Light weight and real time slam for robots |
US11454974B2 (en) * | 2020-06-29 | 2022-09-27 | Baidu Usa Llc | Method, apparatus, device, and storage medium for controlling guide robot |
US11789110B2 (en) | 2020-09-03 | 2023-10-17 | Honeywell International Inc. | Fault detection, exclusion, isolation, and re-configuration of navigation sensors using an abstraction layer |
CN113110510A (zh) * | 2021-05-19 | 2021-07-13 | 悟空智能科技常州有限公司 | Autonomous navigation control system for a fire-fighting robot |
CN114136334B (zh) * | 2021-11-30 | 2024-03-19 | 北京经纬恒润科技股份有限公司 | Positioning method and device based on a vehicle positioning module |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2911609Y (zh) * | 2006-05-09 | 2007-06-13 | 南京恩瑞特实业有限公司 | Embedded GPS autonomous navigation device |
CN103398702A (zh) * | 2013-08-05 | 2013-11-20 | 青岛海通机器人系统有限公司 | Remote control device for a mobile robot and control technique thereof |
CN204595519U (zh) * | 2015-04-20 | 2015-08-26 | 安徽工程大学 | Autonomous mobile robot control system |
CN105137949A (zh) * | 2015-09-23 | 2015-12-09 | 珠海创智科技有限公司 | AGV control system |
CN106114633A (zh) * | 2016-07-27 | 2016-11-16 | 苏州博众机器人有限公司 | Modular AGV cart |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3763476B2 (ja) * | 2003-05-29 | 2006-04-05 | 三菱電機株式会社 | Vehicle and driver behavior analysis system |
US20110046784A1 (en) * | 2009-08-18 | 2011-02-24 | Noel Wayne Anderson | Asymmetric stereo vision system |
US20110106338A1 (en) * | 2009-10-29 | 2011-05-05 | Allis Daniel P | Remote Vehicle Control System and Method |
JP5370568B2 (ja) * | 2012-10-24 | 2013-12-18 | 株式会社アドヴィックス | Vehicle body speed control device |
JP6537780B2 (ja) * | 2014-04-09 | 2019-07-03 | 日立オートモティブシステムズ株式会社 | Travel control device, in-vehicle display device, and travel control system |
US10282697B1 (en) * | 2014-09-30 | 2019-05-07 | Amazon Technologies, Inc. | Spatially aware mounting system |
2015
- 2015-12-10 CN CN201510916639.9A patent/CN106323269B/zh active Active

2016
- 2016-12-05 WO PCT/CN2016/108594 patent/WO2017097170A1/zh active Application Filing
- 2016-12-05 AU AU2016102440A patent/AU2016102440A4/en active Active
- 2016-12-05 US US15/779,521 patent/US10974390B2/en active Active
- 2016-12-05 JP JP2018530092A patent/JP6868028B2/ja active Active
- 2016-12-05 EP EP16872361.7A patent/EP3388786A4/en not_active Ceased
- 2016-12-05 AU AU2016368234A patent/AU2016368234A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP3388786A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN106323269B (zh) | 2019-06-07 |
AU2016102440A4 (en) | 2020-04-16 |
EP3388786A4 (en) | 2019-11-13 |
CN106323269A (zh) | 2017-01-11 |
EP3388786A1 (en) | 2018-10-17 |
JP2019501384A (ja) | 2019-01-17 |
US10974390B2 (en) | 2021-04-13 |
AU2016368234A1 (en) | 2018-06-28 |
JP6868028B2 (ja) | 2021-05-12 |
US20180345504A1 (en) | 2018-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017097170A1 (zh) | Autonomous positioning and navigation device, positioning and navigation method and autonomous positioning and navigation system | |
Asadi et al. | An integrated UGV-UAV system for construction site data collection | |
Cruz et al. | Decentralized cooperative control-a multivehicle platform for research in networked embedded systems | |
JP2020502627A (ja) | System and method for robot mapping |
Li et al. | Localization and navigation for indoor mobile robot based on ROS | |
CN103389699A (zh) | Operation method of a robot monitoring and autonomous mobile system based on distributed intelligent monitoring and control nodes |
WO2023116667A1 (zh) | Charging device, and method for controlling charging of a robotic arm |
Hebert et al. | Supervised remote robot with guided autonomy and teleoperation (SURROGATE): a framework for whole-body manipulation | |
Hu et al. | A new ROS-based hybrid architecture for heterogeneous multi-robot systems | |
Konomura et al. | Phenox: Zynq 7000 based quadcopter robot | |
Do Quang et al. | Mapping and navigation with four-wheeled omnidirectional mobile robot based on robot operating system | |
Liu et al. | The multi-sensor fusion automatic driving test scene algorithm based on cloud platform | |
CN112318507A (zh) | Robot intelligent control system based on SLAM technology |
Jo et al. | Towards a ROS2-based software architecture for service robots | |
CN111380527A (zh) | Navigation method and navigation controller for an indoor service robot |
Zhang et al. | An interactive control system for mobile robot based on cloud services | |
Fan et al. | Collaborative robot transport system based on edge computing | |
Rui et al. | Design and implementation of tour guide robot for red education base | |
CN216697069U (zh) | Mobile robot control system based on ROS2 |
Cheema et al. | Development of a Control and Vision Interface for an AR. Drone | |
Choi et al. | Development of smart mobile manipulator controlled by a single windows PC equipped with real-time control software | |
Makhal et al. | Path planning through maze routing for a mobile robot with nonholonomic constraints | |
Rea | AMR system for autonomous indoor navigation in unknown environments | |
Witkowski et al. | Learning Vision Based Navigation with a Smartphone Mobile Robot | |
Duzhen et al. | VSLAM and Navigation System of Unmanned Ground Vehicle Based on RGB-D camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16872361 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018530092 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2016368234 Country of ref document: AU Date of ref document: 20161205 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016872361 Country of ref document: EP |