WO2020182146A1 - Robotic system, mapping system and method for robotic navigation map - Google Patents
- Publication number: WO2020182146A1
- Application: PCT/CN2020/078789
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the invention relates to a robot, in particular to a navigation system and method of the robot.
- Electromagnetic navigation embeds metal wires in the driving path of the AGV and loads a guidance frequency onto the wires; the AGV navigates by identifying that frequency.
- Magnetic stripe navigation lays magnetic tape on the ground instead of burying wires underground, and achieves navigation by sensing the tape's magnetic signal.
- Two-dimensional code navigation lays two-dimensional codes at intervals along the path, and calculates and corrects the pose of the AGV by comparing the position of each code in the camera's view.
- Laser navigation is based on lidar, which scans the surrounding environment and collects reflected light information to determine the vehicle's own position in the scene.
- the inventor of the present invention found that, in the prior art, electromagnetic navigation and magnetic stripe navigation both require extensive modification of the ground.
- electromagnetic navigation even requires pre-embedded magnetic nails, so its industrial application scenarios are narrow.
- these two technologies offer almost no human-computer interaction, and the cost of avoiding obstacles or changing the preset path is extremely high.
- these two technologies cannot achieve dense operation or parallel multi-machine deployment in a single scene.
- although two-dimensional code navigation solves the problem of high ground-laying costs, many scenarios (such as medical scenarios) do not allow ground markers at all.
- the QR code is prone to damage and dirt, causing it to go unrecognized or be misrecognized, which incurs high labor and maintenance costs.
- laser navigation currently relies on reflectors on a large scale, places certain requirements on the surrounding environment and lighting conditions, and adapts very poorly to dynamic environments; it can only be used in simple indoor scenes and cannot cope with complex environments containing many goods and many machines. In addition, the cost of laser navigation is extremely high, with no prospect of cost reduction in the short term. Traditional visual navigation suffers from low recognition accuracy, strong dependence on environmental features, and slow operation speed.
- the purpose of the present invention is to provide a robot system, a robot navigation map mapping system and a mapping method, which can be used in large-area scenes without laying permanent positioning markers.
- the present invention provides a robot navigation map mapping method.
- a motion path is preset, a plurality of removable markers are set on the motion path, and the mapping robot is located on the motion path.
- the mapping method includes the steps:
- the feature acquisition module of the mapping robot records features along the way and, when the mapping robot moves to a removable marker, obtains pose calibration information about the mapping robot for calibration;
- the features and corresponding pose information recorded by the feature acquisition module are processed locally, or are sent to a server for processing, to obtain a navigation map.
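The mapping steps above can be sketched in simplified form. All names here are illustrative, and the additive drift model is a stand-in for the pose calibration the text leaves to the implementer:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    theta: float  # heading in radians

@dataclass
class MapEntry:
    feature: str  # stand-in for an image-patch or texture descriptor
    pose: Pose    # pose at which the feature was recorded

def build_navigation_map(path_samples, markers):
    """Record features along the path; re-calibrate at each removable marker.

    path_samples: list of (feature, odometry_pose) recorded along the way.
    markers: dict from a marker's feature id to the reference Pose it encodes.
    Drift is modeled as a simple additive offset, which ignores frame
    rotation; a real system would compose poses properly.
    """
    nav_map = []
    drift = Pose(0.0, 0.0, 0.0)  # accumulated correction from calibrations
    for feature, odom in path_samples:
        if feature in markers:
            ref = markers[feature]
            # Calibrate: align drifted odometry with the marker's reference pose.
            drift = Pose(ref.x - odom.x, ref.y - odom.y, ref.theta - odom.theta)
        corrected = Pose(odom.x + drift.x, odom.y + drift.y,
                         odom.theta + drift.theta)
        nav_map.append(MapEntry(feature, corrected))
    return nav_map
```

The resulting list of feature-pose entries plays the role of the navigation map that is later queried for positioning.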
- the mapping method further includes the step of calibrating the coordinate origin of the feature acquisition module of the mapping robot and the coordinate origin of the motion path.
- the feature collection module registers the features collected by the sensor in its own coordinate system.
- the removable marker is removed.
- the mapping method further includes, after obtaining the navigation map, causing the mapping robot to continue to record features along the way through the feature acquisition module, and updating the newly recorded features and corresponding pose information to the navigation map, or sending them to the server to update the navigation map.
- the removable marker includes identifiable reference pose information.
- the removable marker is an artificially identifiable marker, and the artificially identifiable marker corresponds to the reference pose information.
- the feature acquisition module is a camera, and the ground pattern features along the way are captured by the camera of the mapping robot.
- the feature collection module includes multiple cameras and/or laser sensors, and features along the way are recorded by the multiple cameras and/or laser sensors.
- the present application further provides a machine-readable medium having instructions stored on the machine-readable medium, and when the instructions are executed on a machine, the machine executes the above-mentioned robot navigation map mapping method.
- the application further provides a system, which includes a memory for storing instructions executed by one or more processors of the system; and a processor, which is one of the processors of the system, for executing the above-mentioned robot navigation map mapping method.
- This application further provides a robot navigation map mapping system, the mapping system includes:
- removable markers, the removable markers being arranged on a movement path;
- a feature collection module, which is configured to record features along the path when the mapping robot travels along the motion path, and to obtain pose calibration information about the mapping robot for calibration when the mapping robot reaches the position of a removable marker;
- a feature processing module, which is configured to process the features and corresponding pose information recorded by the feature collection module, or to send them to a server for processing, to obtain a navigation map.
- the removable marker includes identifiable reference pose information.
- the removable marker is manually identifiable, and the manually identifiable marker corresponds to the reference pose information.
- the motion path is composed of multiple straight paths.
- the feature collection module is a plurality of cameras and/or laser sensors arranged on the mapping robot.
- the feature collection module is a camera provided on the mapping robot, and the camera is configured to record features of ground patterns along the way.
- the application further provides a robot system, which includes a mapping robot, a working robot, and a robot management system;
- the mapping robot includes:
- a feature collection module configured to record features along a path when the mapping robot travels along a movement path
- a feature processing module configured to send the features and corresponding pose information recorded by the feature acquisition module to the robot management system for processing
- the robot management system is configured to receive and process features and corresponding pose information recorded by the mapping robot to obtain or update a navigation map;
- the working robot is configured to obtain the navigation map from the robot management system for positioning.
- a removable marker is arranged on the movement path, and the feature acquisition module is further configured to obtain pose calibration information about the mapping robot when the mapping robot reaches the position of the removable marker, to perform calibration.
- the removable marker includes identifiable reference pose information.
- the removable marker is an artificially identifiable marker, and the artificially identifiable marker corresponds to the reference pose information.
- the working robot is configured to compare the recorded features with the features in the navigation map during operation to obtain the current pose information of the working robot.
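The comparison described above amounts to looking up recorded features in the navigation map. A minimal sketch, with a dictionary standing in for the map (an assumption, not a structure given in the text):

```python
def localize(recorded_features, nav_map):
    """Return the map pose of the first matching recorded feature.

    nav_map: dict {feature_descriptor: (x, y, theta)}, a stand-in for the
    navigation map produced during mapping. Returns None when nothing
    matches, which is the situation that prompts a mapping request.
    """
    for feature in recorded_features:
        if feature in nav_map:
            return nav_map[feature]
    return None
```

A real system would match many features and fuse the results rather than take the first hit; this only illustrates the lookup.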
- the working robot is configured to issue an instruction to create a map to the robot management system when it is confirmed that the recorded feature cannot match the feature in the navigation map.
- the robot management system is configured, when receiving an instruction requiring mapping from the working robot, to instruct the mapping robot to record features along the local movement path near the working robot to update the navigation map.
- the feature collection module is configured to record the features of ground patterns along the movement path.
- the feature collection module includes multiple cameras and/or laser sensors to record features along the way.
- the robot system includes multiple mapping robots and/or multiple working robots that are cooperatively controlled by the robot management system.
- the working robot is a handling robot.
- the present invention also provides a robot system, which includes a working robot and a robot management system;
- the working robot includes:
- a conversion module configured to switch the working robot from a working mode to a mapping mode under a first predetermined condition
- An information collection module configured to record features along a movement path when the working robot travels along a movement path in the mapping mode
- An information processing module configured, in the mapping mode, to send the features and corresponding pose information recorded by the information collection module to the robot management system for processing, and, in the working mode, to obtain a navigation map from the robot management system for positioning;
- the robot management system is configured to receive and process features and corresponding pose information recorded from the working robot to obtain or update the navigation map.
- the information collection module is configured to record features along the way in the working mode
- the information processing module is configured, in the working mode, to compare the features recorded by the information collection module with the features in the navigation map to obtain the current pose information of the working robot
- the first predetermined condition includes that the information processing module confirms that the feature recorded by the information collection module cannot match the feature in the navigation map in the working mode.
- the working robot records features along the local movement path near the working robot in the mapping mode to update the local navigation map.
- the conversion module is configured to switch the working robot to the working mode after completing the update of the navigation map.
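The conversion module's behaviour can be sketched as a small state machine. The match-ratio threshold used here as the "first predetermined condition" is an illustrative assumption; the text only requires that recorded features fail to match the map:

```python
class WorkingRobot:
    """Sketch of the working/mapping mode switch (hypothetical API)."""

    def __init__(self):
        self.mode = "working"

    def on_features(self, recorded, nav_map, match_threshold=0.5):
        """Switch to mapping mode when too few recorded features match the
        navigation map; nav_map is any container supporting membership tests."""
        matched = sum(1 for f in recorded if f in nav_map)
        if matched / max(len(recorded), 1) < match_threshold:
            self.mode = "mapping"

    def on_map_update_complete(self):
        """Switch back to working mode once the map update is finished."""
        self.mode = "working"
```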
- the robot system further includes a mapping robot
- the mapping robot includes:
- a feature collection module configured to record features along the path when the mapping robot travels along the movement path
- a feature processing module configured to send the features and corresponding pose information recorded by the feature acquisition module to the robot management system for processing
- the robot management system is further configured to receive and process features and corresponding pose information recorded by the mapping robot to obtain or update the navigation map.
- the robot management system instructs the mapping robot to replace part or all of the working robots to perform the tasks of the mapping mode under a second predetermined condition.
- the robot system of the present application is a camera-based cluster robot system, which relies on visual features to realize positioning. Compared with the existing cluster robot positioning system, it has the advantages of no need to lay positioning markers, can be used in large-area scenes, high robot storage density, high robot running speed, high positioning accuracy, and less manual intervention in the map update process.
- FIG. 1 is a schematic diagram of the system composition of the robot navigation map mapping system during the first mapping of an embodiment of the present application.
- FIG. 2 is a schematic diagram of the system composition of the robot navigation map mapping system when updating the map according to an embodiment of the present application.
- Fig. 3 is a flowchart of a method for building a robot navigation map according to an embodiment of the present application.
- Fig. 4 is a schematic diagram of the system composition of a robot system according to an embodiment of the present application.
- the purpose of the navigation map mapping system of the present application is to form a navigation map that enables the robot to navigate in an area.
- This area can be an outdoor area, or an indoor area where positioning signals such as GPS cannot be received.
- This system uses ground pattern feature recognition: the camera captures the ground, and the captured image is processed to recognize the current position and posture.
- the ground texture feature refers to any feature on the ground, such as cracks, lines, protrusions, recesses, and possible objects on the ground.
- the image can be a photo or a frame in a video.
- the mapping is done using a separate mapping robot.
- the removable marker includes identifiable reference pose information or corresponds to reference pose information.
- the reference pose information refers to relative or absolute position information and pose information that can be referred to when the robot performs pose calibration.
- the mapping robot can proceed along the navigation path on the basis of the navigation map established for the first time, and further capture images of ground texture features, and then process the images to update the navigation map.
- the navigation of this application uses ground texture recognition technology with feature point matching, multi-robot map data cloud sharing technology, and a robot scheduling system, which can dispatch thousands of robots in the same system, according to the position information returned by each robot, for intensive collaborative work in the same scene.
- the navigation map mapping system includes a mapping robot 1.
- the mapping robot 1 may be one or more.
- the mapping robot 1 can travel along the set movement path 2, which can be performed semi-automatically (for example, by remote control) or manually. In the case that a rough preliminary navigation map has been obtained, the mapping robot 1 can also automatically travel along the set movement path 2 to obtain a more refined navigation map.
- the movement path 2 refers to the area where the robot can travel.
- Figures 1 to 4 exemplarily show a route of the motion path 2, that is, the way the mapping robot travels along the motion path 2.
- those skilled in the art should understand that other forms of routes can be set according to actual needs, as long as a navigation map can be built.
- the mapping robot 1 has its own IMU (Inertial Measurement Unit), which enables high-precision straight-line travel within a small local area.
- the mapping robot 1 is provided with a camera, which is used to photograph the ground pattern features in the motion path.
- the camera is preferably a high-speed camera that provides high-frame-rate, high-resolution images while the robot runs at high speed, enabling near-real-time image processing and feedback, so that mapping can be carried out even at the mapping robot's higher running speeds.
- the ground pattern is more resistant to abrasion, which ensures a longer time of operation.
- the camera of the mapping robot 1 can capture scene images other than the ground and record scene features, or the mapping robot 1 can record other features along the path of movement through other feature collection modules, such as laser sensors, to obtain scene characteristics.
- the mapping robot 1 may include multiple cameras and/or laser sensors, so as to be able to acquire multiple types of feature information to cooperate with mapping.
- the mapping robot 1 may be provided with a light supplement device, which can be used to supplement light in a poor light environment, thereby improving the imaging quality.
- the mapping robot 1 may be provided with a communication module to transmit the captured images and/or recorded features to an external processing device, such as a robot management system 4 that controls one or more mapping robots and working robots. The robot management system 4 can be a part of the navigation map mapping system.
- the robot management system 4 is configured to receive and process the images taken by the camera of the mapping robot or the feature information extracted from the captured images, and/or the feature information and corresponding pose information recorded by other feature collection modules, to obtain a navigation map.
- pose information refers to position information and posture information. It can be understood that the corresponding pose information can be obtained by the measuring device equipped with the mapping robot itself, and since it is well known to those skilled in the art, it will not be repeated here. It should be understood that, if only the mapping is realized, the image and/or feature processing can also be performed by the feature processing module carried by the mapping robot itself, so that the navigation map can also be established only by the mapping robot itself.
- the navigation map mapping system may include removable markers 3 distributed on the movement path 2. It can be seen from FIG. 1 that the removable markers 3 can be evenly arranged on a path section or non-uniformly arranged on a path section, and can be set according to site conditions or needs. For example, at locations requiring high accuracy (a place where precise turns are required), relatively more removable markers are arranged to improve accuracy.
- the removable marker 3 includes readable coordinate position information and posture information. Thus, when the mapping robot 1 recognizes the removable marker, it can read its information as the reference pose information to correct the pose of the mapping robot.
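One plausible encoding of the readable coordinate position and posture information is a key-value payload inside the marker. The `x=..;y=..;theta=..` format below is an assumption; the text only requires that the marker carry readable reference pose information:

```python
def decode_marker(payload):
    """Parse a removable marker payload into reference pose information.

    The 'x=..;y=..;theta=..' payload format is illustrative, not specified
    by the source.
    """
    fields = dict(part.split("=") for part in payload.split(";"))
    return float(fields["x"]), float(fields["y"]), float(fields["theta"])

def pose_correction(odom_pose, payload):
    """Offset between the drifted odometry pose and the marker's reference
    pose, applied to correct the mapping robot's pose at the marker."""
    ref = decode_marker(payload)
    return tuple(r - o for r, o in zip(ref, odom_pose))
```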
- the removable marker may be a QR code or other customized graphic codes.
- the removable marker is an artificially identifiable removable marker, and the artificially identifiable marker corresponds to the reference pose information. Therefore, after the mapping robot 1 captures the manually identifiable marker, an operator can correct the pose of the mapping robot 1 in the background, for example through the robot management system 4 or directly through a display screen on the mapping robot 1, which facilitates manual review of the mapping result and adjustment of the mapping parameters.
- the manually identifiable marker may be any form that can be preset to indicate the ground coordinate direction or the scene orientation and correspond to relative position information or absolute position information.
- multiple types of removable markers can be arranged on the motion path to correct the pose of the mapping robot. The mapping robot 1 then transmits the corrected pose information to a server for processing, such as the aforementioned robot management system 4, or to the feature processing module it carries itself.
- the removable marker can be removed after the first mapping is completed. After the first mapping, the mapping robot continues to travel along the path of the removed markers to complete or update the navigation map, as shown in Figure 2.
- the setting of removable markers can temporarily enhance the environmental characteristics, improve the accuracy of the first mapping, and remove it after the first mapping is completed, so as to adapt to various scenarios where markers are not allowed. In some embodiments, if markers can be retained in the scene, the removable markers may not be removed after the first mapping is completed.
- mapping is continuous work, which can be carried on as needed.
- the ground texture is not completely unchanging in an industrial environment: it will change over time, and heavy machinery working on and rolling over the ground can change the established ground texture pattern significantly. Like the ground texture, other features will also change over time as the scene is adjusted.
- the mapping robot therefore needs to repeat the mapping and upload the new ground features and/or other features to the system server, patrolling the working path regularly to detect whether the ground texture and/or other features in the path have changed partially or completely due to time or external forces.
- when the working robot finds a feature pattern that cannot be matched in the navigation map, it will also send a request to the system to call the mapping robot to re-judge, store, and update ground features and/or other features, guaranteeing the stability and accuracy of the navigation map in a complex environment. It should be understood that, when updating the navigation map, the mapping robot need not follow the original motion path; its motion path can be reset according to the working conditions of the working robot, ensuring the stability and accuracy of the navigation map without affecting the normal work of the working robot.
- the above-mentioned navigation map update uses SLAM (Simultaneous Localization And Mapping) technology.
- This technology matches the image taken at the current location and/or the recorded features against feature points in the pre-established map library, so as to determine the accurate coordinate position of the current location in the calibrated map.
- the acquired new data can be continuously updated to the original map library to achieve dynamic optimization of the map library data.
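The match-then-fold-in cycle described above can be sketched as follows. A dictionary map library and string feature descriptors are stand-ins for real descriptor matching:

```python
def update_map_library(map_lib, observations):
    """Match observations against the map library and fold new data back in.

    map_lib: dict {feature: pose}; observations: list of (feature, pose).
    Matched features yield pose fixes (localization); unmatched features are
    added to the library, the 'dynamic optimization' of the map data.
    """
    fixes = []
    for feature, pose in observations:
        if feature in map_lib:
            fixes.append(map_lib[feature])  # localization from the match
        else:
            map_lib[feature] = pose         # map update with new data
    return fixes
```

Real SLAM pipelines also refine already-stored poses when re-observing features; this sketch only shows the insert-or-match skeleton.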
- the above-mentioned navigation map update also uses map data cloud sharing technology. After a robot obtains and updates map data through the above-mentioned SLAM technology, the data is uploaded to the map data management center through the robot's own communication device. The management center optimizes the map data and then shares it with all devices in the current system, ensuring that the map data of every device in the system stays up to date and improving the stability and effectiveness of the overall map.
- a motion path is set in the area of the navigation map to be created, and the motion path may be a straight line or a curve.
- the marker can be a QR code or a manually identifiable marker.
- place the mapping robot on the motion path. Determine the direction of the X and Y coordinates of the area and the origin of the robot running map.
- calibrate so that the center of the robot is at the coordinate origin and the origin lies in the field of view of the main camera, then turn on the camera of the mapping robot and make the mapping robot move along the motion path.
- the camera records ground pattern features along the way and/or other features are recorded through other feature acquisition modules, and the pose of the mapping robot is calibrated whenever it moves to a removable marker.
- the mapping robot continues to move along the movement path, using the camera to record ground pattern features along the way and/or other feature acquisition modules to record other features, and the newly captured ground texture features and/or newly recorded features are updated to the navigation map.
- the update of the navigation map is completed.
- the image captured by the camera and/or the features recorded by other feature acquisition modules can be uploaded to the remote robot management system and processed in the robot management system to obtain a navigation map.
- the instruction code can be stored in any type of computer-accessible memory (for example, permanent or modifiable, volatile or nonvolatile, solid-state or non-solid-state, fixed or replaceable media, etc.).
- the memory may be, for example, Programmable Array Logic ("PAL"), Random Access Memory ("RAM"), Programmable Read-Only Memory ("PROM"), Read-Only Memory ("ROM"), Electrically Erasable Programmable ROM ("EEPROM"), a magnetic disk, an optical disc, a Digital Versatile Disc ("DVD"), and so on.
- Fig. 4 shows a schematic diagram of the system composition of the robot system composed of the above-mentioned mapping robot and working robot.
- the robot system includes a mapping robot 1, a working robot 5 and a robot management system 4.
- the robot management system 4 cooperatively controls multiple mapping robots 1 and/or multiple working robots 5.
- the mapping robot 1 and the working robot 5 can work simultaneously in the same working area.
- the mapping robot will move to the position where the mapping is needed after receiving the mapping instruction.
- the mapping robot moves along a motion path, and the camera on it can capture the ground texture features in the motion path and/or record the features through other feature acquisition modules.
- the feature processing module of the mapping robot transmits the captured images and/or recorded features to the robot management system in real time.
- After the robot management system processes the images and/or features, it updates the original navigation map if necessary.
- the robot management system 4 communicates with the working robot 5, and transmits the updated navigation map to the working robot in real time, so that the working robot can locate according to the updated navigation map.
- Working robots are generally used to carry goods in warehouses and other occasions.
- the working robot 5 stores a navigation map and is provided with a camera or other information collection module.
- the working robot can compare the image captured by the camera and/or the features recorded by other information collection modules with the images and/or features stored in the navigation map to obtain coordinate position information of the working robot for navigation.
- the working robot is arranged to compare the image captured by its camera and/or the features recorded by other information acquisition modules with nearby images and/or features stored in the navigation map, to obtain the working robot's displacement and rotation angle relative to a feature position with known coordinates, and then to locate the working robot's coordinate position in the navigation map to realize navigation.
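Recovering the displacement and rotation angle from matched feature positions is a 2D rigid-transform estimate. The two-point closed form below is a minimal sketch; production systems would fit many matches by least squares:

```python
import math

def rigid_transform_2d(robot_pts, map_pts):
    """Estimate the rotation and translation taking two robot-frame feature
    positions onto their stored map positions.

    robot_pts, map_pts: two (x, y) pairs each, matched in order.
    Returns (theta, tx, ty) such that map_pt = R(theta) @ robot_pt + t.
    """
    (ax, ay), (bx, by) = robot_pts
    (mx, my), (nx, ny) = map_pts
    # Rotation: angle between the matched segment in each frame.
    theta = math.atan2(ny - my, nx - mx) - math.atan2(by - ay, bx - ax)
    c, s = math.cos(theta), math.sin(theta)
    # Translation: carry the first robot point onto its map position.
    tx = mx - (c * ax - s * ay)
    ty = my - (s * ax + c * ay)
    return theta, tx, ty
```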
- the basic workflow of the working robot includes:
- the working robot receives tasks such as cargo handling and moves to the starting origin or a specific coordinate point in any path;
- if the working robot recognizes that the ground texture and/or other features do not match the stored map, it issues a map-building instruction to the robot management system, detours to avoid the mismatched area if necessary, waits for the new mapping information, and continues working after receiving it.
- a feature mismatch here means that a certain proportion of the features fails to match, and the proportion can be set as required.
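A mismatch test of this kind might look like the following sketch, where the function name and the default threshold of 30% are purely illustrative assumptions:

```python
def is_mismatch(matched, total, threshold=0.3):
    """Report a mismatch when the fraction of features that failed to
    match exceeds the configurable threshold (0.3 is illustrative)."""
    if total == 0:
        return True  # nothing recognizable at all counts as a mismatch
    return (total - matched) / total > threshold
```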
- the working robot may first carry out work in other areas and wait for the local area's map to be updated before performing work in that area.
- the robot management system dispatches suitable mapping robots to the area according to each mapping robot's remaining battery power, its distance from the area, and so on, and updates the area's local navigation map in time, thereby ensuring stability and accuracy.
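The dispatch decision could be sketched as a simple scoring rule. The field names and the weighting of battery against distance below are assumptions for illustration only, not part of the application:

```python
import math

def pick_mapping_robot(robots, area_xy):
    """Pick the mapping robot to send to a mismatched area.

    robots: list of dicts with 'battery' (0..1) and 'pos' (x, y).
    Prefers high remaining battery and short distance to the area;
    the 0.1 distance weight is an arbitrary illustrative choice.
    """
    def score(r):
        dx = r["pos"][0] - area_xy[0]
        dy = r["pos"][1] - area_xy[1]
        return r["battery"] - 0.1 * math.hypot(dx, dy)
    return max(robots, key=score)
```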
- the mismatched area can be determined based on the previous pose information of the working robot, for example by extending outward from the position given by that pose information.
- the warehouse may be divided into blocks in advance; the block in which the working robot is located is determined from the robot's previous pose information, and that block and/or its adjacent blocks are treated as the mismatched area.
- other methods can also be used to determine the mismatched area, as long as the navigation map can be updated at the mismatch location identified by the working robot.
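The block-based variant described above can be sketched as follows; the function name, the grid block size, and the choice to include the eight neighboring blocks are illustrative assumptions:

```python
def mismatch_blocks(last_pose, block_size, include_neighbors=True):
    """Map the working robot's last known (x, y) to a grid block and,
    optionally, treat the eight surrounding blocks as mismatched too."""
    bx = int(last_pose[0] // block_size)
    by = int(last_pose[1] // block_size)
    blocks = {(bx, by)}
    if include_neighbors:
        for i in (-1, 0, 1):
            for j in (-1, 0, 1):
                blocks.add((bx + i, by + j))
    return blocks
```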
- when more than a predetermined number of working robots issue mapping instructions to the robot management system within a short period of time, this indicates that the environment has changed significantly, and the robot management system instructs the mapping robots to re-map the entire area.
- the navigation map is updated to ensure the effective operation of the entire system.
- the robot management system may also instruct the mapping robot to perform other operations according to actual conditions to ensure the effectiveness of the navigation map.
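The "many robots report mismatches in a short period" trigger described above can be sketched with a sliding time window; the class name, parameter names, and the use of distinct robot IDs are hypothetical choices, not from the application:

```python
from collections import deque

class RemapTrigger:
    """Fire a full-area remap when at least `count_threshold` distinct
    working robots report a mismatch within `window_seconds`."""

    def __init__(self, count_threshold, window_seconds):
        self.count_threshold = count_threshold
        self.window = window_seconds
        self.events = deque()  # (timestamp, robot_id), oldest first

    def report(self, robot_id, timestamp):
        """Record one mismatch report; return True if a full remap is due."""
        self.events.append((timestamp, robot_id))
        cutoff = timestamp - self.window
        # Drop reports that have aged out of the window.
        while self.events and self.events[0][0] < cutoff:
            self.events.popleft()
        distinct = {rid for _, rid in self.events}
        return len(distinct) >= self.count_threshold
```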
- the working robot itself has a mapping function.
- the first mapping can be performed by the mapping robot, the working robot, or both the mapping robot and the working robot.
- the working robot has a working mode and a mapping mode, and can switch between the two modes.
- the working robot has a conversion module configured to control the working robot to switch between a working mode and a mapping mode under a first predetermined condition.
- in the mapping mode, the working robot records the characteristics of the local motion path through its information acquisition module and transmits the recorded characteristics to the robot management system through its information processing module; when a mismatch is recognized in the working mode, the robot switches to the mapping mode to re-map the local path so that the navigation map is updated in time.
- after the navigation map update is completed, the working robot switches back to the working mode.
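The working/mapping mode switch handled by the conversion module can be sketched as a two-state machine; the class, enum, and event names are assumptions tied to the conditions described above:

```python
from enum import Enum

class Mode(Enum):
    WORKING = "working"
    MAPPING = "mapping"

class ConversionModule:
    """Two-state sketch: switch to mapping mode on a feature mismatch
    (the first predetermined condition, as described), and back to
    working mode once the local map update completes."""

    def __init__(self):
        self.mode = Mode.WORKING

    def on_mismatch(self):
        if self.mode is Mode.WORKING:
            self.mode = Mode.MAPPING

    def on_map_updated(self):
        if self.mode is Mode.MAPPING:
            self.mode = Mode.WORKING
```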
- under the second predetermined condition, the robot management system instructs the mapping robot to perform some or all of the working robot's mapping-mode tasks.
- the robot management system instructs the mapping robot to update the navigation map in place of the working robot, and the working robot maintains the working mode for a predetermined period of time without switching to the mapping mode.
- the robot management system instructs the mapping robot to update the navigation map in place of some or all of the working robots, ensuring that a certain number of working robots remain in the working mode.
- the robot management system instructs the mapping robot to replace the working robot to update the navigation map.
- the mapping robot can also replace the working robot to perform tasks in the mapping mode in other situations.
- the first predetermined condition and the second predetermined condition can be set appropriately across the robot system to dynamically balance work effectiveness against the freshness of map data.
- RMS: Robot Management System
- This application uses multi-robot map data cloud sharing, which can be used not only for ground pattern recognition, but also for other navigation methods such as laser navigation.
- the tasks of the mapping robot and the working robot in the robot system of the present application are separated, so the map can be updated at shorter time intervals or under a smaller environment-change threshold without affecting the work, thereby reducing manual effort (for example, reducing the frequency of manual map recalibration).
- a logical unit and/or module may be a physical unit and/or module, a part of a physical unit and/or module, or a combination of multiple physical units and/or modules.
- the physical implementation of these logical units and/or modules is not the most important.
- the combination of functions realized by these logical units and/or modules is the key to solving the technical problem proposed by the present invention.
- the foregoing system embodiments of the present invention do not introduce units and/or modules that are not closely related to solving the technical problems proposed by the present invention; this does not mean that no other units and/or modules exist.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Claims (31)
- A method for building a robot navigation map, characterized in that a motion path is preset, a plurality of removable markers are arranged on the motion path, and a mapping robot is located on the motion path, the mapping method comprising the steps of: while the mapping robot travels along the motion path, recording features along the way via a feature collection module of the mapping robot, and obtaining, upon moving to a removable marker, information for calibrating the pose of the mapping robot so as to perform calibration; and processing the features and corresponding pose information recorded by the feature collection module, or sending the features and corresponding pose information recorded by the feature collection module to a server for processing, to obtain a navigation map.
- The robot navigation map building method according to claim 1, characterized in that the mapping method further comprises the step of calibrating the coordinate origin of the feature collection module of the mapping robot against the coordinate origin of the motion path.
- The robot navigation map building method according to claim 1, characterized in that the removable markers are removed after the navigation map is obtained.
- The robot navigation map building method according to claim 1, characterized in that the mapping method further comprises, after the navigation map is obtained, having the mapping robot continue to record features along the way via the feature collection module, and updating the navigation map with the newly recorded features and corresponding pose information, or sending the newly recorded features and corresponding pose information to a server to update the navigation map.
- The robot navigation map building method according to claim 1, characterized in that the removable marker contains identifiable reference pose information.
- The robot navigation map building method according to claim 1, characterized in that the removable marker is a manually recognizable marker, and the manually recognizable marker corresponds to reference pose information.
- The robot navigation map building method according to claim 1, characterized in that the feature collection module is a camera, and ground texture features along the way are captured by the camera of the mapping robot.
- The robot navigation map building method according to claim 1, characterized in that the feature collection module comprises a plurality of cameras and/or laser sensors, and features along the way are recorded by the plurality of cameras and/or laser sensors.
- A robot navigation map building system, characterized in that the mapping system comprises: removable markers arranged on a motion path; a feature collection module configured to record features along the way while a mapping robot travels along the motion path, and to obtain information for calibrating the pose of the mapping robot when the mapping robot reaches the position of a removable marker; and a feature processing module configured to process the features and corresponding pose information recorded by the feature collection module, or to send the features and corresponding pose information recorded by the feature collection module to a server for processing, to obtain a navigation map.
- The robot navigation map building system according to claim 9, characterized in that the removable marker contains identifiable reference pose information.
- The robot navigation map building system according to claim 9, characterized in that the removable marker is a manually identifiable removable marker, and the manually identifiable marker corresponds to reference pose information.
- The robot navigation map building system according to claim 9, characterized in that the motion path is composed of a plurality of straight-line paths.
- The robot navigation map building system according to claim 9, characterized in that the feature collection module is a plurality of cameras and/or laser sensors arranged on the mapping robot.
- The robot navigation map building system according to claim 9, characterized in that the feature collection module is a camera arranged on the mapping robot, the camera being configured to record ground texture features along the way.
- A robot system, characterized in that the robot system comprises a mapping robot, a working robot, and a robot management system; the mapping robot comprising: a feature collection module configured to record features along the way while the mapping robot travels along a motion path; and a feature processing module configured to send the features and corresponding pose information recorded by the feature collection module to the robot management system for processing; the robot management system being configured to receive and process the features and corresponding pose information recorded by the mapping robot to obtain or update a navigation map; and the working robot being configured to obtain the navigation map from the robot management system for positioning.
- The robot system according to claim 15, characterized in that removable markers are arranged on the motion path, and the feature collection module is further configured to obtain information for calibrating the pose of the mapping robot when the mapping robot reaches the position of a removable marker.
- The robot system according to claim 16, characterized in that the removable marker contains identifiable reference pose information.
- The robot system according to claim 16, characterized in that the removable marker is a manually recognizable marker, and the manually recognizable marker corresponds to reference pose information.
- The robot system according to claim 15, characterized in that the working robot is configured to compare recorded features with the features in the navigation map during operation to obtain current pose information of the working robot.
- The robot system according to claim 15, characterized in that the working robot is configured to issue an instruction requiring mapping to the robot management system upon confirming that recorded features cannot be matched with the features in the navigation map.
- The robot system according to claim 15, characterized in that the robot management system is configured to, upon receiving an instruction requiring mapping from the working robot, instruct the mapping robot to record features along the local portion of the motion path near the working robot so as to update the navigation map.
- The robot system according to claim 15, characterized in that the feature collection module is configured to record ground texture features along the motion path.
- The robot system according to claim 15, characterized in that the feature collection module comprises a plurality of cameras and/or laser sensors to record features along the way.
- The robot system according to claim 15, characterized in that the robot system comprises a plurality of mapping robots and/or a plurality of working robots cooperatively controlled by the robot management system.
- The robot system according to claim 15, characterized in that the working robot is a handling robot.
- A robot system, characterized in that the robot system comprises a working robot and a robot management system; the working robot comprising: a conversion module configured to switch the working robot from a working mode to a mapping mode under a first predetermined condition; an information collection module configured to record features along a motion path while the working robot travels along it in the mapping mode; and an information processing module configured to send the features and corresponding pose information recorded by the information collection module to the robot management system for processing in the mapping mode, and to obtain a navigation map from the robot management system for positioning in the working mode; the robot management system being configured to receive and process the features and corresponding pose information recorded by the working robot to obtain or update the navigation map.
- The robot system according to claim 26, characterized in that the information collection module is configured to record features along the way in the working mode, and the information processing module is configured to compare the features recorded by the information collection module in the working mode with the features in the navigation map to obtain current pose information of the working robot; the first predetermined condition comprising the information processing module confirming, in the working mode, that the features recorded by the information collection module cannot be matched with the features in the navigation map.
- The robot system according to claim 26, characterized in that the working robot, in the mapping mode, records features along the local portion of the motion path near the working robot so as to update the local navigation map.
- The robot system according to claim 26, characterized in that the conversion module is configured to switch the working robot to the working mode after the navigation map update is completed.
- The robot system according to claim 26, characterized in that the robot system further comprises a mapping robot, the mapping robot comprising: a feature collection module configured to record features along the way while the mapping robot travels along the motion path; and a feature processing module configured to send the features and corresponding pose information recorded by the feature collection module to the robot management system for processing; the robot management system being further configured to also receive and process the features and corresponding pose information recorded by the mapping robot to obtain or update the navigation map.
- The robot system according to claim 30, characterized in that the robot management system, under a second predetermined condition, instructs the mapping robot to perform the mapping-mode tasks in place of some or all of the working robots.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910196698.1 | 2019-03-13 | ||
CN201910196698.1A CN111693046A (en) | 2019-03-13 | 2019-03-13 | Robot system and robot navigation map building system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020182146A1 true WO2020182146A1 (en) | 2020-09-17 |
Family
ID=72426123
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/078789 WO2020182146A1 (en) | 2019-03-13 | 2020-03-11 | Robotic system, mapping system and method for robotic navigation map |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111693046A (en) |
WO (1) | WO2020182146A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113701767B (en) * | 2020-05-22 | 2023-11-17 | 杭州海康机器人股份有限公司 | Triggering method and system for map updating |
CN112146662B (en) * | 2020-09-29 | 2022-06-10 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
CN112731923B (en) * | 2020-12-17 | 2023-10-03 | 武汉万集光电技术有限公司 | Cluster robot co-positioning system and method |
CN113829353B (en) * | 2021-06-07 | 2023-06-13 | 深圳市普渡科技有限公司 | Robot, map construction method, apparatus and storage medium |
CN113532421B (en) * | 2021-06-30 | 2024-04-26 | 同济人工智能研究院(苏州)有限公司 | Dynamic laser SLAM method based on subgraph updating and reflector optimization |
CN114355877B (en) * | 2021-11-25 | 2023-11-03 | 烟台杰瑞石油服务集团股份有限公司 | Multi-robot operation area distribution method and device |
CN114873178A (en) * | 2022-05-18 | 2022-08-09 | 上海飒智智能科技有限公司 | Production workshop deployment-free AMR system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012136555A1 (en) * | 2011-04-08 | 2012-10-11 | Siemens Aktiengesellschaft | Device for tracking and navigating autonomous vehicles and method for the operation thereof |
CN103777637A (en) * | 2014-02-13 | 2014-05-07 | 苏州工业园区艾吉威自动化设备有限公司 | Non-baffle-board laser AGV and navigation method thereof |
CN104679004A (en) * | 2015-02-09 | 2015-06-03 | 上海交通大学 | Flexible path and fixed path combined automated guided vehicle and guide method thereof |
CN107702722A (en) * | 2017-11-07 | 2018-02-16 | 云南昆船智能装备有限公司 | A kind of las er-guidance AGV natural navigation localization methods |
CN107703940A (en) * | 2017-09-25 | 2018-02-16 | 芜湖智久机器人有限公司 | A kind of air navigation aid based on ceiling Quick Response Code |
CN108225303A (en) * | 2018-01-18 | 2018-06-29 | 水岩智能科技(宁波)有限公司 | Two-dimensional code positioning label, and positioning navigation system and method based on two-dimensional code |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103884330B (en) * | 2012-12-21 | 2016-08-10 | 联想(北京)有限公司 | Information processing method, mobile electronic equipment, guiding equipment and server |
CN104067145B (en) * | 2014-05-26 | 2016-10-05 | 中国科学院自动化研究所 | Beta pruning robot system |
CN105203094B (en) * | 2015-09-10 | 2019-03-08 | 联想(北京)有限公司 | The method and apparatus for constructing map |
CN105425807B (en) * | 2016-01-07 | 2018-07-03 | 朱明� | A kind of Indoor Robot air navigation aid and device based on artificial landmark |
US9864377B2 (en) * | 2016-04-01 | 2018-01-09 | Locus Robotics Corporation | Navigation using planned robot travel paths |
CN108919811A (en) * | 2018-07-27 | 2018-11-30 | 东北大学 | A kind of indoor mobile robot SLAM method based on tag label |
- 2019-03-13: CN CN201910196698.1A patent/CN111693046A/en active Pending
- 2020-03-11: WO PCT/CN2020/078789 patent/WO2020182146A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012136555A1 (en) * | 2011-04-08 | 2012-10-11 | Siemens Aktiengesellschaft | Device for tracking and navigating autonomous vehicles and method for the operation thereof |
CN103777637A (en) * | 2014-02-13 | 2014-05-07 | 苏州工业园区艾吉威自动化设备有限公司 | Non-baffle-board laser AGV and navigation method thereof |
CN104679004A (en) * | 2015-02-09 | 2015-06-03 | 上海交通大学 | Flexible path and fixed path combined automated guided vehicle and guide method thereof |
CN107703940A (en) * | 2017-09-25 | 2018-02-16 | 芜湖智久机器人有限公司 | A kind of air navigation aid based on ceiling Quick Response Code |
CN107702722A (en) * | 2017-11-07 | 2018-02-16 | 云南昆船智能装备有限公司 | A kind of las er-guidance AGV natural navigation localization methods |
CN108225303A (en) * | 2018-01-18 | 2018-06-29 | 水岩智能科技(宁波)有限公司 | Two-dimensional code positioning label, and positioning navigation system and method based on two-dimensional code |
Also Published As
Publication number | Publication date |
---|---|
CN111693046A (en) | 2020-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020182146A1 (en) | Robotic system, mapping system and method for robotic navigation map | |
CN108287544B (en) | Method and system for intelligent robot route planning and returning along original path | |
CN111958591B (en) | Autonomous inspection method and system for semantic intelligent substation inspection robot | |
CN109720340B (en) | Automatic parking system and method based on visual identification | |
CN107907131B (en) | positioning system, method and applicable robot | |
US10278333B2 (en) | Pruning robot system | |
JP5079703B2 (en) | System and method for calculating position in real time | |
CN106774310A (en) | A kind of robot navigation method | |
CN107179082B (en) | Autonomous exploration method and navigation method based on fusion of topological map and measurement map | |
US11846949B2 (en) | Systems and methods for calibration of a pose of a sensor relative to a materials handling vehicle | |
CN104635735A (en) | Novel AGV visual navigation control method | |
EP3745085A1 (en) | Multi-device visual navigation method and system in variable scene | |
CN109459032B (en) | Mobile robot positioning method, navigation method and grid map establishing method | |
JP2011039968A (en) | Vehicle movable space detection device | |
CN111161334B (en) | Semantic map construction method based on deep learning | |
KR101319525B1 (en) | System for providing location information of target using mobile robot | |
CN115014338A (en) | Mobile robot positioning system and method based on two-dimensional code vision and laser SLAM | |
CN112204345A (en) | Indoor positioning method of mobile equipment, mobile equipment and control system | |
WO2022027611A1 (en) | Positioning method and map construction method for mobile robot, and mobile robot | |
CN112833890A (en) | Map construction method, map construction device, map construction equipment, robot and storage medium | |
CN112388626B (en) | Robot-assisted navigation method | |
CN109900272B (en) | Visual positioning and mapping method and device and electronic equipment | |
KR101319526B1 (en) | Method for providing location information of target using mobile robot | |
CN109857122A (en) | Controlling of path thereof, device and the warehouse transportation system of warehouse haulage vehicle | |
CN114445494A (en) | Image acquisition and processing method, image acquisition device and robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20769663; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20769663; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25.01.2022) |