CN112388626B - Robot-assisted navigation method


Info

Publication number
CN112388626B
CN112388626B (application CN201910755796.4A)
Authority
CN
China
Prior art keywords
reference light
robot
distance
pose
light ray
Prior art date
Legal status
Active
Application number
CN201910755796.4A
Other languages
Chinese (zh)
Other versions
CN112388626A (en)
Inventor
蒋星 (Jiang Xing)
陈刚 (Chen Gang)
Current Assignee
Guangdong Bozhilin Robot Co Ltd
Original Assignee
Guangdong Bozhilin Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Bozhilin Robot Co Ltd filed Critical Guangdong Bozhilin Robot Co Ltd
Priority to CN201910755796.4A
Publication of CN112388626A
Application granted
Publication of CN112388626B
Legal status: Active
Anticipated expiration

Classifications

    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J 9/1661: Programme controls characterised by programming, planning systems for manipulators; characterised by task planning, object-oriented languages
    • B25J 9/1664: Programme controls characterised by programming, planning systems for manipulators; characterised by motion, path, trajectory planning
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems
    • G01C 21/00: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01S 17/46: Indirect determination of position data (systems using reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar)
    • G06V 20/10: Terrestrial scenes (scenes; scene-specific elements)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot-assisted navigation method. The method comprises the following steps: emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray; capturing an image of a reference point and the reference light; determining, from the image, the distances and the pose angle between the reference point and the reference light; and adjusting the position and pose of the robot according to those distances and the pose angle so as to navigate the robot. The invention solves the technical problem in the related art that a fluid robot moves with poor accuracy when laying material, which degrades laying precision.

Description

Robot-assisted navigation method
Technical Field
The invention relates to the field of robots, in particular to a robot-assisted navigation method.
Background
Existing mortar robots use an AGV (automated guided vehicle) as the moving mechanism and navigate by lidar. Under laboratory conditions the navigation precision can reach ±4 mm and the angular deviation ±0.1°. However, because construction sites are complex environments with many sources of interference, the navigation precision can degrade to ±40 mm and the angular deviation to ±0.5°, which does not meet the construction requirements for automatically laying floor tiles and paving mortar. The mortar robot therefore needs secondary positioning while moving and paving mortar to guarantee the shape and size of the mortar bed.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a robot-assisted navigation method that at least solves the technical problem in the related art that a fluid robot moves with poor accuracy when laying material, which degrades laying precision.
According to one aspect of the embodiments of the invention, a robot-assisted navigation method is provided, comprising: emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray; capturing an image of a reference point and the reference light; determining, from the image, the distances and the pose angle between the reference point and the reference light; and adjusting the position and pose of the robot according to those distances and the pose angle so as to navigate the robot.
Optionally, adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light includes: determining the distances between the reference point and the reference light, wherein the distance between the reference point and the first reference light ray is a first distance and the distance between the reference point and the second reference light ray is a second distance; determining the pose angle between the reference point and the reference light, wherein the first reference light ray or the second reference light ray serves as a first pose line, and the angle between the first pose line and a first pose reference line is a first pose angle; and adjusting the position and pose of the robot according to a first distance difference between the first distance and a first reference distance, a second distance difference between the second distance and a second reference distance, and a first angle difference between the first pose angle and a first reference pose angle.
Optionally, after adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light so as to navigate the robot, the method includes: controlling the robot to move along a preset direction for navigation; and, after the robot finishes moving, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays.
Optionally, before adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays, the method includes: judging whether the first reference light ray and/or the second reference light ray is outside the field of view; and, where the first reference light ray and/or the second reference light ray is outside the field of view, replacing it with a third reference light ray and/or a fourth reference light ray, wherein the reference light rays further include the third reference light ray and/or the fourth reference light ray, the third reference light ray is parallel to the first reference light ray, and the fourth reference light ray is parallel to the second reference light ray.
Optionally, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays includes: replacing the first reference light ray with the third reference light ray when the first reference light ray is outside the field of view; in that case, the adjustment becomes: adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays.
Optionally, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays includes: determining the distances and the pose angle between the reference point and the second and third reference light rays, wherein the distance between the reference point and the third reference light ray is a third distance, the third reference light ray or the second reference light ray serves as a second pose line, and the angle between the second pose line and a second pose reference line is a second pose angle; and adjusting the position and pose of the robot according to a third distance difference between the third distance and a third reference distance, a second distance difference between the second distance and a second reference distance, and a second angle difference between the second pose angle and a second reference pose angle.
According to another aspect of the embodiments of the invention, a fluid laying method of a fluid robot is also provided, comprising: emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray, and the first reference light ray is parallel to the laying direction of the fluid; capturing an image of a reference point and the reference light; determining, from the image, the distances and the pose angle between the reference point and the reference light; adjusting the position and pose of the fluid robot according to those distances and the pose angle; and controlling the fluid robot to lay the fluid according to that position and pose.
According to another aspect of the embodiments of the invention, a robot-assisted navigation apparatus is also provided, comprising: a light source for emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray; a camera for capturing images of the reference point and the reference light; and a processor for adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light so as to navigate the robot.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium, where the storage medium includes a stored program, and when the program runs, a device in which the storage medium is located is controlled to execute any one of the above methods.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program executes to perform the method described in any one of the above.
In the embodiments of the invention, reference light rays are emitted, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray; an image of a reference point and the reference light is captured; the distances and the pose angle between the reference point and the reference light are determined from the image; and the position and pose of the robot are adjusted according to those distances and the pose angle. By feeding the reference light and the reference point back to the moving robot as navigation cues, the robot is kept working at high precision, achieving the technical effect of improving the robot's movement precision and solving the technical problem in the related art that a fluid robot moves with poor accuracy when laying material, which degrades laying precision.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
Fig. 1 is a flowchart of a robot-assisted navigation method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a fluid laying method of a fluid robot according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a machine-vision-based mortar robot positioning system according to an embodiment of the present invention;
Fig. 4 is a schematic view of the mortar robot's mortar laying positions according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the mortar robot laying a first mortar area according to an embodiment of the present invention;
Fig. 6 is another schematic diagram of the mortar robot laying a first mortar area according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a robot-assisted navigation device according to an embodiment of the present invention.
The reference numerals in the drawings are as follows:
1 - first laser line; 2 - second laser line; 3 - third laser line; 4 - tile position at coordinates (1, 1) where mortar is to be laid; 5 - camera; 6 - mortar robot; 7 - light source; 8 - tile position at coordinates (1, 2) where mortar is to be laid; 9 - tile position at coordinates (2, 1) where mortar is to be laid; 10 - mortar area at tile coordinates (1, 1); 11 - camera field of view when starting to lay the mortar area at tile coordinates (1, 1); 12 - camera field of view after the mortar area at tile coordinates (1, 1) has been laid; 13 - camera field of view when starting to lay the mortar area at tile coordinates (2, 1); 14 - camera field of view after the mortar area at tile coordinates (2, 1) has been laid.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, a method embodiment of a robot-assisted navigation method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, for example as a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order.
Fig. 1 is a flowchart of a robot-assisted navigation method according to an embodiment of the present invention. As shown in Fig. 1, the method comprises the following steps:
step S102, emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray;
step S104, capturing an image of the reference point and the reference light;
step S106, determining, from the image, the distances and the pose angle between the reference point and the reference light;
step S108, adjusting the position and pose of the robot according to those distances and the pose angle so as to navigate the robot.
Through the above steps, reference light rays are emitted, comprising a first reference light ray and a second reference light ray perpendicular to it; an image of the reference point and the reference light is captured; the distances and the pose angle between the reference point and the reference light are determined from the image; and the position and pose of the robot are adjusted accordingly to guide its movement. Feeding the reference light and the reference point back to the moving robot keeps it working at high precision, improving its movement precision and solving the related-art problem that a fluid robot moves with poor accuracy when laying material. The fluid includes mortar, binder, and the like.
The reference light forms a reference against objects in the real environment, so that the robot can recognize whether its current position matches the theoretically predicted position, where "matches" may mean that the real position and the theoretical position agree within a preset error range. Specifically, the reference light serves as a reference line and may follow a fixed direction of a real-environment object, for example parallel to a straight edge of the object, or parallel or perpendicular to a corner line. The reference light may also follow a fixed direction of the robot, for example parallel or perpendicular to the robot's heading. By referencing the light against real objects, the difference between the robot's real position and pose and its theoretical position and pose can be identified, and hence whether the robot's error lies within the allowed range.
The reference light may be emitted by an emitting device mounted on the robot, or the emitting device may be fixed in the environment where the robot operates, for example on a side wall. In this embodiment the emitting device is mounted on the robot, so it moves with the robot while still allowing the robot's position and pose to be calibrated: the illumination angle of the reference light is fixed, so the relative relationship between the light and the robot, both relative position and relative angle, does not change. The relative position of a real-environment object and the robot can therefore be determined from the relative position of the reference light and that object. The reference light may be laser light, colored visible light, or the like; colored visible light, such as red or blue light, or another color that differs strongly from the dominant tone of the environment, eases identification.
The reference light comprises a first reference light ray and a second reference light ray perpendicular to each other, which together form a planar coordinate system; the relative position of the robot and real-environment objects can be described more accurately through the relative positions of the first and second reference light rays.
The reference point that forms a reference with the reference light needs to be selected on a real object, because an object's apparent shape changes with viewing angle and pose; the reference point should lie on the reference object, for example at its geometric center or at any point within a central region. The relative relationship between the real object and the reference light is thereby fixed. Note that the position of the reference point should not change between images of the same reference point and reference light, so that processing and computation remain consistent.
Determining the distances and the pose angle between the reference point and the reference light from the image means determining, from the image, the distance between the reference point and the first reference light ray, the distance between the reference point and the second reference light ray, and the included angle between the first or second reference light ray and a pose reference line. The pose reference line may be a corner line of the real environment, an edge of the image frame, or a straight line computed from the robot's pose.
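In code, this step reduces to two small geometric computations. The sketch below is an illustrative reconstruction, not code from the patent: it assumes the vision pipeline returns each detected laser line as two pixel points and the reference point as a pixel coordinate.

```python
import math

def point_to_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b.
    All points are (x, y) pixel coordinates."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    # Cross-product magnitude divided by the segment length gives the distance.
    return abs(dx * (ay - py) - dy * (ax - px)) / math.hypot(dx, dy)

def pose_angle_deg(a, b, ref_a, ref_b):
    """Angle between a pose line (a-b) and a pose reference line
    (ref_a-ref_b), e.g. a laser line versus an image edge, in degrees."""
    ang1 = math.atan2(b[1] - a[1], b[0] - a[0])
    ang2 = math.atan2(ref_b[1] - ref_a[1], ref_b[0] - ref_a[0])
    # Normalise to [-90, 90) so the result ignores the lines' directions.
    diff = math.degrees(ang1 - ang2)
    return (diff + 90.0) % 180.0 - 90.0

# Example: reference point at the centre of a 1280x960 frame.
center = (640, 480)
d1 = point_to_line_distance(center, (100, 0), (110, 960))    # first ray
d2 = point_to_line_distance(center, (0, 200), (1280, 195))   # second ray
alpha = pose_angle_deg((100, 0), (110, 960), (0, 0), (0, 960))
```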
Adjusting the position and pose of the robot according to these distances and the pose angle so as to navigate it means comparing the image-derived distances from the reference point to the first and second reference light rays, and the included angle between the first or second reference light ray and the pose reference line, against their corresponding reference values to determine whether the allowed error threshold is exceeded. The threshold can be preset so that it does not interfere with the robot's normal operation.
In this embodiment the robot may be a mortar-paving robot. Existing mortar robots use an AGV (automated guided vehicle) as the moving mechanism and navigate by lidar; because lidar navigation alone cannot reach the required precision on a construction site, the mortar robot needs secondary positioning while moving and paving mortar to guarantee the shape and size of the mortar bed. The secondary positioning is performed by the present method, as follows:
a set of vision system is installed on the mortar robot and mainly comprises a light source, an industrial camera and an industrial personal computer. The camera shoots a ground reference laser line or a paved floor tile, the industrial personal computer processes real-time image information and feeds results (left and right direction offset and angle offset) back to the AGV control system, and the mortar robot is guided to move and pave the mortar.
Optionally, adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light includes: determining the distances between the reference point and the reference light, wherein the distance to the first reference light ray is a first distance and the distance to the second reference light ray is a second distance; determining the pose angle, wherein the first or second reference light ray serves as a first pose line and the angle between the first pose line and a first pose reference line is a first pose angle; and adjusting the position and pose of the robot according to a first distance difference between the first distance and a first reference distance, a second distance difference between the second distance and a second reference distance, and a first angle difference between the first pose angle and a first reference pose angle.
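A minimal sketch of this comparison, assuming the measured first and second distances and the first pose angle come from the image processing above; the tolerance values, the sign convention, and all names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class PoseCorrection:
    dx_mm: float       # correction normal to the first reference ray
    dy_mm: float       # correction normal to the second reference ray
    dtheta_deg: float  # rotation correcting the first pose angle

def compute_correction(d1, d2, angle,              # measured values
                       ref_d1, ref_d2, ref_angle,  # reference values
                       tol_mm=2.0, tol_deg=0.1):   # allowed error thresholds
    """Return the correction derived from the first and second distance
    differences and the first angle difference, or None if within tolerance."""
    dd1, dd2, da = d1 - ref_d1, d2 - ref_d2, angle - ref_angle
    if abs(dd1) <= tol_mm and abs(dd2) <= tol_mm and abs(da) <= tol_deg:
        return None  # pose already within the allowed error threshold
    # Moving opposite each difference drives the error toward zero
    # (the sign convention here is an illustrative assumption).
    return PoseCorrection(dx_mm=-dd1, dy_mm=-dd2, dtheta_deg=-da)
```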
That the first reference light ray or the second reference light ray serves as the first pose line means that, in this embodiment, either the first reference light ray or the second reference light ray may act as the first pose line.
Optionally, after adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light so as to navigate the robot, the method includes: controlling the robot to move along a preset direction for navigation; and, after the robot finishes moving, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays.
Controlling the robot to move along the preset direction for navigation means, for example, that the mortar robot moves along the preset direction while laying mortar, i.e. work navigation. After the movement, the robot is adjusted again to correct errors accumulated while moving.
Optionally, before adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays, the method includes: judging whether the first reference light ray and/or the second reference light ray is outside the field of view; and, where it is, replacing the first reference light ray and/or the second reference light ray with a third reference light ray and/or a fourth reference light ray, wherein the reference light rays further include the third and/or fourth reference light rays, the third reference light ray being parallel to the first and the fourth parallel to the second.
Judging whether the first and/or second reference light ray is outside the field of view, and replacing it with the third and/or fourth reference light ray when it is, covers three cases: the first reference light ray alone is outside the field of view, in which case the third reference light ray replaces it; the second reference light ray alone is outside the field of view, in which case the fourth reference light ray replaces it; or both the first and second reference light rays are outside the field of view, in which case the third and fourth reference light rays replace them.
When one or both of the first and second reference light rays is outside the field of view, the third and/or fourth reference light ray is substituted; note that each substitute ray is parallel to the ray it replaces.
The field of view here is that of the image capturing the reference point and the reference light. When the first and/or second reference light ray is outside this field of view, the adjustment cannot be performed, so the third and/or fourth reference light ray is provided as a substitute.
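The substitution logic is a small case analysis over which rays are visible. A sketch under stated assumptions: the vision system is assumed to report the set of currently visible rays, and the string identifiers are illustrative.

```python
def select_reference_rays(visible):
    """Pick the working pair of reference rays.

    'first'/'second' are the primary rays; 'third' is parallel to 'first'
    and 'fourth' is parallel to 'second', so either can stand in when its
    counterpart leaves the camera's field of view.
    """
    ray_x = 'first' if 'first' in visible else (
            'third' if 'third' in visible else None)
    ray_y = 'second' if 'second' in visible else (
            'fourth' if 'fourth' in visible else None)
    if ray_x is None or ray_y is None:
        raise RuntimeError('no usable reference ray in this direction')
    return ray_x, ray_y

# Example: the first ray has left the field of view, so the third replaces it.
assert select_reference_rays({'third', 'second'}) == ('third', 'second')
```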
Specifically, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays includes: replacing the first reference light ray with the third reference light ray when the first reference light ray is outside the field of view; in that case, the adjustment becomes: adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays.
Adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays includes: determining the distances and the pose angle between the reference point and the second and third reference light rays, wherein the distance to the third reference light ray is a third distance, the third or second reference light ray serves as a second pose line, and the angle between the second pose line and a second pose reference line is a second pose angle; and adjusting the position and pose of the robot according to a third distance difference between the third distance and a third reference distance, a second distance difference between the second distance and a second reference distance, and a second angle difference between the second pose angle and a second reference pose angle.
Fig. 2 is a flowchart of a fluid laying method of a fluid robot according to an embodiment of the present invention. As shown in Fig. 2, according to another aspect of the embodiments of the invention, a fluid laying method of a fluid robot is also provided, comprising the following steps:
step S202, emitting reference light rays, wherein the reference light rays comprise first reference light rays and second reference light rays perpendicular to the first reference light rays, and the first reference light rays are parallel to the laying direction of the fluid;
step S204, capturing an image of the reference point and the reference light;
step S206, determining, from the image, the distances and the pose angle between the reference point and the reference light;
step S208, adjusting the position and pose of the fluid robot according to those distances and the pose angle;
step S210, controlling the fluid robot to lay the fluid according to that position and pose.
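Steps S202 to S210 amount to a correction loop that runs while the robot advances along the laying direction. The sketch below is a hypothetical outline: measure_offsets, adjust_pose, advance, lay_fluid, and strip_done stand in for the vision and robot interfaces, which the patent does not specify at this level.

```python
def lay_fluid_strip(measure_offsets, adjust_pose, advance, lay_fluid,
                    strip_done, tol_mm=2.0, tol_deg=0.1, step_mm=50):
    """Advance along the laying direction (parallel to the first reference
    ray), correcting lateral and angular drift after every short step."""
    while not strip_done():
        d_lateral, d_angle = measure_offsets()  # steps S204-S206
        if abs(d_lateral) > tol_mm or abs(d_angle) > tol_deg:
            adjust_pose(-d_lateral, -d_angle)   # step S208: secondary positioning
        advance(step_mm)                        # move along the preset direction
        lay_fluid()                             # step S210: lay fluid as we go
```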
The fluid may be mortar, an adhesive, or another flowable material. It may solidify after another material is laid on it, for example after floor tiles are laid on mortar, or it may solidify directly after laying, as with an adhesive.
Through the above steps, reference light rays are emitted, comprising a first reference light ray and a second reference light ray perpendicular to it; an image of the reference point and the reference light is captured; the distances and the pose angle are determined from the image; and the position and pose of the robot are adjusted accordingly to guide its movement. Feeding the reference light and the reference point back to the moving robot keeps it working at high precision, improving its movement precision and solving the related-art problem that a fluid robot moves with poor accuracy when laying material.
It should be noted that this embodiment also provides an alternative implementation, which is described in detail below.
This embodiment differs from and improves on the original mortar robot, which relied only on lidar while moving to pave cement mortar and therefore could not navigate accurately, degrading the paving result. A vision system consisting mainly of a light source, an industrial camera, and an industrial personal computer is installed on the mortar robot. The camera photographs a reference laser line on the ground or an already-laid floor tile; the industrial personal computer processes the real-time image and feeds the result, including lateral offset and angular offset, back to the AGV control system to guide the mortar robot as it moves and paves mortar.
The technical problem this embodiment solves is: where the existing mortar robot cannot navigate accurately using lidar alone, so that mortar is laid poorly, the robot's path can be adjusted a second time to achieve accurate mortar laying.
By installing the vision system on the mortar robot, this embodiment compensates for the limited navigation precision of conventional lidar and completes the mortar laying work more accurately, while also offering advantages in cost control and ease of operation.
Fig. 3 is a schematic structural diagram of a machine-vision-based mortar robot positioning system according to an embodiment of the present invention. As shown in Fig. 3, this embodiment provides a machine-vision-based assisted navigation system for a mortar robot, preferably applied to automated floor-tile paving. The device mainly comprises: a first laser line 1, a second laser line 2, a camera 5, a mortar robot 6, and a light source 7. The first laser line 1 and the second laser line 2 correspond to the first and second reference light rays. The camera 5 captures images, and the light source 7 emits the first laser line 1 and the second laser line 2. The first laser line 1 is the mortar robot's X-direction reference laser line, and the second laser line 2 its Y-direction reference laser line.
The first laser line 1 and the second laser line 2 serve as reference lines parallel to the wall edge (which may be the skirting line); their distance from the wall can be set as desired. Fig. 4 is a schematic diagram of the mortar robot's mortar laying positions according to an embodiment of the present invention. As shown in Fig. 4, before laying begins, the areas of the room to receive mortar are named with the X and Y directions as coordinate axes: for example, the first column of tiles is laid in the Y direction as shown in Fig. 3, the first tile is named (1, 1), and the first tile of the second column in the X direction is named (2, 1). The mortar robot 6 first moves to the vicinity of area 10, where mortar is to be laid, by lidar navigation; at this point the field of view of the camera 5, mounted on one side of the mortar robot 6, covers the reference laser lines 1 and 2. After capturing an image, the vision system computes the distances L1 and L2 from the image center to the first laser line 1 and the second laser line 2, and the angle α2 of laser line 2; it then computes the differences ΔL1, ΔL2, and Δα between the current L1, L2, and α2 and the reference distances L1, L2 and reference angle α. After receiving ΔL1, ΔL2, and Δα from the vision system, the mortar robot 6 adjusts its pose accordingly and fixes the position where mortar laying begins. It then lays mortar in the Y direction, adjusting its rotation angle and lateral position according to the values of ΔL2 and Δα. When the camera's field of view moves with the mortar robot 6 into area 11, the vision system can recognize the third laser line 3 and the second laser line 2 simultaneously; it computes the distances L3 and L2 from the image center to these lines and the angle α2 of laser line 2, then the differences ΔL3, ΔL2, and Δα from the reference distances L3, L2 and reference angle α. When ΔL3 is less than 2 mm, the mortar robot 6 stops moving, and on receiving the signal the tile-laying robot immediately executes the first tile-laying procedure.
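The 2 mm stopping criterion implies converting pixel measurements into millimetres. A minimal sketch, assuming the perspective-corrected image has a uniform scale obtained from calibration; the 0.5 mm/pixel figure is an assumed example, not a value from the patent.

```python
MM_PER_PIXEL = 0.5  # assumed calibration scale after perspective correction

def should_stop(l3_pixels, ref_l3_mm, threshold_mm=2.0):
    """Stop condition from the embodiment: halt once the distance to the
    third laser line is within 2 mm of its reference value."""
    delta_l3 = l3_pixels * MM_PER_PIXEL - ref_l3_mm
    return abs(delta_l3) < threshold_mm

# 204 px at 0.5 mm/px is 102 mm; with a 100 mm reference, dL3 = 2 mm,
# so the robot keeps moving; at 202 px (101 mm) it stops.
print(should_stop(204, 100.0))  # False
print(should_stop(202, 100.0))  # True
```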
Fig. 5 is a schematic diagram of the mortar robot laying a first mortar area according to an embodiment of the present invention. As shown in Fig. 5, after the first tile (1, 1) is laid, the mortar robot 6 continues laying mortar for (1, 2) in the Y direction. Because tiles reflect better than the cement floor, the contrast between tile and floor in the camera image is high and the tile edge is easy to recognize. At the start, the edge of the already-laid tile (1, 1) in the same column that is closest to reference laser line 2, i.e. its right edge, serves as the mortar robot's reference edge. As in the calculation for the first mortar area, the distance L2' from the image center to the tile's right edge and the angle α2' of that edge are computed, along with their offsets from the reference distance L2' and reference angle α; these are the angle and distance the mortar robot 6 must adjust, so its walking pose is corrected continuously. When the field of view of the camera 5 on the mortar robot 6 moves into area 12, the vision system recognizes the right and lower edges of the first tile, computes the distances L2' and L3' from the image center to those edges and the angle α2' of the right edge, and computes their offsets from the corresponding references L2', L3', and α; once the set values are reached, the mortar robot 6 has finished laying the second mortar area in the Y direction and stops moving. All remaining mortar areas in the Y direction, other than the first in each column, can be laid in this way.
Fig. 6 is another schematic diagram of the mortar robot laying a first mortar area according to an embodiment of the present invention. As shown in Fig. 6, laying the mortar areas in the X direction (other than (1, 1)) is similar to laying (1, 1). When the mortar area for tile (2, 1) is laid, the initial position is referenced to the left and upper edges of tile (1, 1) in the previous column; the vision system computes the deviations of the current values from the references L1″, L2″, and α, and the robot's position is adjusted accordingly. While the mortar robot 6 moves in the Y direction, the reference edge on the right of the image becomes the left edge of the previous column's tile (1, 1), as shown in field of view 13 in Fig. 6, and the reference edge below the (2, 1) laying area is the lower edge of the previous column's tile (1, 1), as shown in field of view 14 in Fig. 6; the deviations from the reference distances L2′ and L3′ are computed, and since the joint width between tiles is known, the robot's offset can be obtained. All columns except the first can lay mortar in this way.
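Because the laid tiles contrast strongly with the cement floor, their edges can be recovered with standard edge detection. A sketch using OpenCV on a perspective-corrected grayscale image; the Canny and Hough parameters are illustrative assumptions.

```python
import cv2
import numpy as np

def find_tile_edge(gray):
    """Return the most prominent straight edge as two endpoints, or None.
    gray: perspective-corrected 8-bit grayscale image."""
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=200, maxLineGap=10)
    if lines is None:
        return None
    # Keep the longest detected segment as the tile edge.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return (x1, y1), (x2, y2)
```

The returned endpoints can be fed to the point-to-line distance and pose-angle helpers sketched earlier to obtain L2', L3', and α2'.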
In this embodiment, because the camera is mounted at an angle to the ground, every captured image must be perspective-corrected before any values are calculated.
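A homography maps the obliquely viewed floor to a top-down view. A sketch with OpenCV; the four point correspondences would come from a one-time calibration, and the coordinates below are placeholders.

```python
import cv2
import numpy as np

# Four floor points as seen in the raw image (pixels)...
src = np.float32([[220, 180], [1060, 195], [1180, 900], [95, 880]])
# ...and where those points should land in the corrected top-down image.
dst = np.float32([[0, 0], [1000, 0], [1000, 1000], [0, 1000]])

H = cv2.getPerspectiveTransform(src, dst)

def correct_image(raw):
    """Warp the raw camera frame to a top-down view before measuring."""
    return cv2.warpPerspective(raw, H, (1000, 1000))
```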
In this embodiment, the laser lines are extracted as follows:
because the captured image is an RGB image, the laser lines 1, 2, and 3 are red laser light, and the indoor cement floor is gray, separating the R channel effectively extracts the laser-line image;
gray-level histogram statistics are then computed on the R channel image and dynamic threshold segmentation is applied to it; because the gray value of the red cross laser lines always stays within 20% of the maximum, this yields the regions containing laser lines 1, 2, and 3.
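These two steps translate directly into a few array operations. A sketch with OpenCV and NumPy, assuming a BGR frame as delivered by OpenCV; the 80% cut-off encodes the observation that the red laser stays within 20% of the maximum gray value.

```python
import cv2
import numpy as np

def extract_laser_mask(bgr):
    """Segment the red laser lines from a BGR camera frame.

    Step 1: take the R channel (OpenCV stores images as B, G, R).
    Step 2: dynamic threshold at 80% of the brightest occupied gray level,
    since the red laser stays within 20% of the maximum gray value.
    """
    r = bgr[:, :, 2]
    hist = cv2.calcHist([r], [0], None, [256], [0, 256])  # gray-level histogram
    max_level = int(np.flatnonzero(hist)[-1])             # brightest occupied bin
    thresh = int(0.8 * max_level)
    _, mask = cv2.threshold(r, thresh, 255, cv2.THRESH_BINARY)
    return mask
```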
In this embodiment, identifying the tiles in the camera's current field of view requires the following steps:
the tile-laying robot counts the tiles already laid and sends the count to the vision system, which derives the coordinates of the mortar area to be laid next from the arrangement rule;
the vision system then selects the recognition scheme matching the current arrangement rule. For example, when the mortar robot 6 is about to lay mortar at coordinates (1, 1), the vision system starts the laser-line recognition scheme; when it is about to lay mortar at (2, 1), the vision system instead recognizes the left and lower edges of the tile at coordinates (1, 1) on its right and computes the related parameters.
Fig. 7 is a schematic diagram of a robot-assisted navigation device according to an embodiment of the present invention. As shown in Fig. 7, according to another aspect of the embodiments of the invention, a robot-assisted navigation device is also provided, including a light source 72, a camera 74, and a processor 76, each described in more detail below.
A light source 72 for emitting reference light rays, wherein the reference light rays include a first reference light ray and a second reference light ray perpendicular to the first reference light ray; a camera 74 for capturing images of the reference point and the reference light; and a processor 76 for adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light so as to navigate the robot.
With this device, reference light rays are emitted, comprising a first reference light ray and a second reference light ray perpendicular to it; an image of the reference point and the reference light is captured; the distances and the pose angle are determined from the image; and the position and pose of the robot are adjusted accordingly to guide its movement. Feeding the reference light and the reference point back to the moving robot keeps it working at high precision, improving its movement precision and solving the related-art problem that a fluid robot moves with poor accuracy when laying material.
The light source and camera are mounted on the robot, and the processor includes a local processing module and a remote processing module. The local processing module processes and forwards the images of the reference point and the reference light, receives adjustment instructions from the remote processing module, and controls the robot according to those instructions. The remote processing module computes the distances and the pose angle from the data sent by the local processing module and generates the adjustment instructions for the robot from them.
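The division of labour between the two modules can be sketched as two cooperating classes. The message format and transport below are illustrative assumptions; the patent only specifies what each module is responsible for.

```python
import json

class LocalModule:
    """Runs on the robot: forwards images, applies received adjustments."""
    def __init__(self, transport, robot):
        self.transport, self.robot = transport, robot

    def step(self, image_bytes):
        self.transport.send(image_bytes)           # forward the image
        reply = json.loads(self.transport.recv())  # adjustment instruction
        self.robot.adjust(reply['dx'], reply['dy'], reply['dtheta'])

class RemoteModule:
    """Runs off-robot: computes distances/pose angle, returns an adjustment."""
    def __init__(self, transport, measure):
        self.transport, self.measure = transport, measure

    def step(self):
        image_bytes = self.transport.recv()
        dx, dy, dtheta = self.measure(image_bytes)  # distances + pose angle
        self.transport.send(json.dumps(
            {'dx': dx, 'dy': dy, 'dtheta': dtheta}))
```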
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein when the program runs, a device in which the storage medium is located is controlled to execute the method of any one of the above.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program executes to perform the method of any one of the above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the invention, and these improvements and refinements should also be regarded as falling within the protection scope of the invention.

Claims (8)

1. A robot-assisted navigation method, comprising:
emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray;
capturing an image of a reference point and the reference light;
determining, from the image, the distances and the pose angle between the reference point and the reference light;
adjusting the position and pose of the robot according to those distances and the pose angle so as to navigate the robot;
wherein the reference point is selected on a real object;
wherein adjusting the position and pose of the robot according to the distances and the pose angle so as to navigate the robot comprises:
controlling the robot to move along a preset direction for navigation;
after the robot finishes moving, adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays;
and wherein, after the robot finishes moving and before adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays, the method comprises:
judging whether the first reference light ray and/or the second reference light ray is outside the field of view;
replacing the first reference light ray and/or the second reference light ray with a third reference light ray and/or a fourth reference light ray when the first reference light ray and/or the second reference light ray is outside the field of view, wherein the reference light rays further include the third reference light ray and/or the fourth reference light ray, the third reference light ray is parallel to the first reference light ray, and the fourth reference light ray is parallel to the second reference light ray.
2. The method of claim 1, wherein adjusting the position and pose of the robot according to the distances and the pose angle between the reference point and the reference light comprises:
determining the distances between the reference point and the reference light, wherein the distance between the reference point and the first reference light ray is a first distance, and the distance between the reference point and the second reference light ray is a second distance;
determining the pose angle between the reference point and the reference light, wherein the first reference light ray or the second reference light ray serves as a first pose line, and the angle between the first pose line and a first pose reference line is a first pose angle; and
adjusting the position and pose of the robot according to a first distance difference between the first distance and a first reference distance, a second distance difference between the second distance and a second reference distance, and a first angle difference between the first pose angle and a first reference pose angle.
3. The method of claim 2, wherein adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the first and second reference light rays comprises:
replacing the first reference light ray with the third reference light ray when the first reference light ray is outside the field of view;
in which case adjusting the position and pose angle of the robot comprises:
adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays.
4. The method of claim 3, wherein adjusting the position and pose angle of the robot according to the distances and the pose angle between the reference point and the third and second reference light rays comprises:
determining the distances and the pose angle between the reference point and the second and third reference light rays, wherein the distance between the reference point and the third reference light ray is a third distance, the third reference light ray or the second reference light ray serves as a second pose line, and the angle between the second pose line and a second pose reference line is a second pose angle; and
adjusting the position and pose of the robot according to a third distance difference between the third distance and a third reference distance, a second distance difference between the second distance and a second reference distance, and a second angle difference between the second pose angle and a second reference pose angle.
5. A fluid laying method for a fluid-laying robot, comprising:
emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray, and the first reference light ray is parallel to the laying direction of the fluid;
collecting an image of a reference point and the reference light rays;
determining the distance and the pose angle between the reference point and the reference light rays according to the image;
adjusting the position and the pose of the fluid-laying robot according to the distance and the pose angle between the reference point and the reference light rays;
and controlling the fluid-laying robot to lay the fluid according to the position and the pose;
wherein the reference point is selected from a physical object;
wherein, after adjusting the position and the pose of the fluid-laying robot according to the distance and the pose angle between the reference point and the reference light rays, the method further comprises:
controlling the fluid-laying robot to move along a preset direction so as to navigate;
and after the fluid-laying robot finishes moving, adjusting the position and the pose angle of the fluid-laying robot according to the distance and the pose angle between the reference point and the first and second reference light rays;
and wherein, after the fluid-laying robot finishes moving and before adjusting the position and the pose angle of the fluid-laying robot according to the distance and the pose angle between the reference point and the first and second reference light rays, the method comprises:
judging whether the first reference light ray and/or the second reference light ray is not within the field of view;
and replacing the first reference light ray and/or the second reference light ray with a third reference light ray and/or a fourth reference light ray if it is not within the field of view, wherein the reference light rays further comprise the third reference light ray and/or the fourth reference light ray, the third reference light ray is parallel to the first reference light ray, and the fourth reference light ray is parallel to the second reference light ray.
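Claim 5 chains the steps into a lay-move-recheck loop. The Python sketch below shows that flow only; every helper (emit_rays, capture_image, detect, adjust_pose, lay_fluid, move_along) is a hypothetical placeholder for hardware- and vision-specific code, not an API taken from the patent:

def lay_fluid_with_navigation(robot, camera, light_source, segments):
    # The first reference light ray is emitted parallel to the laying direction.
    light_source.emit_rays()
    for segment in segments:
        image = camera.capture_image()     # reference point + visible rays
        point, rays = detect(image)        # vision step (placeholder)
        robot.adjust_pose(point, rays)     # distance and pose-angle correction
        robot.lay_fluid(segment)           # lay while holding the corrected pose
        robot.move_along(segment)          # advance, then re-correct next pass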
6. A robot-assisted navigation device, comprising:
a light source for emitting reference light rays, wherein the reference light rays comprise a first reference light ray and a second reference light ray perpendicular to the first reference light ray;
a camera for collecting an image of a reference point and the reference light rays;
and a processor for adjusting the position and the pose of the robot according to the distance and the pose angle between the reference point and the reference light rays so as to navigate the robot;
wherein the reference point is selected from a physical object;
wherein adjusting the position and the pose of the robot according to the distance and the pose angle between the reference point and the reference light rays so as to navigate the robot comprises:
controlling the robot to move along a preset direction so as to navigate;
and after the robot finishes moving, adjusting the position and the pose angle of the robot according to the distance and the pose angle between the reference point and the first and second reference light rays;
and wherein, after the robot finishes moving and before adjusting the position and the pose angle of the robot according to the distance and the pose angle between the reference point and the first and second reference light rays, the device is further configured to:
judge whether the first reference light ray and/or the second reference light ray is not within the field of view;
and replace the first reference light ray and/or the second reference light ray with a third reference light ray and/or a fourth reference light ray if it is not within the field of view, wherein the reference light rays further comprise the third reference light ray and/or the fourth reference light ray, the third reference light ray is parallel to the first reference light ray, and the fourth reference light ray is parallel to the second reference light ray.
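Claim 6 decomposes the device into three components. One way to mirror that decomposition in code, with every class and method name assumed for illustration:

from dataclasses import dataclass

@dataclass
class NavigationDevice:
    light_source: object   # emits the two perpendicular reference light rays
    camera: object         # images the reference point and the rays
    processor: object      # computes distances/angles and commands the pose

    def navigate_step(self):
        # One correction cycle: emit, observe, adjust.
        self.light_source.emit()
        image = self.camera.capture()
        self.processor.adjust_pose(image)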
7. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program, when executed, controls an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 5.
8. A processor, characterized in that the processor is configured to run a program, wherein the program, when run, performs the method of any one of claims 1 to 5.
CN201910755796.4A 2019-08-15 2019-08-15 Robot-assisted navigation method Active CN112388626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910755796.4A CN112388626B (en) 2019-08-15 2019-08-15 Robot-assisted navigation method

Publications (2)

Publication Number Publication Date
CN112388626A CN112388626A (en) 2021-02-23
CN112388626B (en) 2022-04-22

Family

ID=74601763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910755796.4A Active CN112388626B (en) 2019-08-15 2019-08-15 Robot-assisted navigation method

Country Status (1)

Country Link
CN (1) CN112388626B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113776518B (en) * 2021-09-07 2024-04-23 深圳大方智能科技有限公司 Indoor construction robot positioning navigation method and system
CN114249078A (en) * 2021-12-10 2022-03-29 广东智源机器人科技有限公司 Track identification positioning method
CN115359114B (en) * 2022-08-16 2023-07-25 中建一局集团第五建筑有限公司 Positioning method, positioning device, electronic equipment and computer readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0810949A (en) * 1994-06-23 1996-01-16 Fanuc Ltd Method for controlling welding robot system in multi-layer over laying
CN102645219B (en) * 2012-05-16 2014-12-03 航天科工哈尔滨风华有限公司 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection
CN105783935A (en) * 2016-03-07 2016-07-20 河北科技大学 Visual navigation method for agricultural machine
CN106052676B (en) * 2016-05-26 2019-03-15 深圳市神州云海智能科技有限公司 A kind of robot navigation's localization method, device and robot
CN109343543A (en) * 2018-12-13 2019-02-15 合肥泰禾光电科技股份有限公司 A kind of vehicle straight trip air navigation aid and vehicle are kept straight on navigation device
CN109782772A (en) * 2019-03-05 2019-05-21 浙江国自机器人技术有限公司 A kind of air navigation aid, system and cleaning robot

Also Published As

Publication number Publication date
CN112388626A (en) 2021-02-23

Similar Documents

Publication Publication Date Title
CN112388626B (en) Robot-assisted navigation method
CN109720340B (en) Automatic parking system and method based on visual identification
CN108571971B (en) AGV visual positioning system and method
US9896810B2 (en) Method for controlling a self-propelled construction machine to account for identified objects in a working direction
CN104483966B (en) A kind of binocular vision navigation control method for submarine AGV
US8872920B2 (en) Camera calibration apparatus
EP2187166B1 (en) Industrial Machine
US20160314593A1 (en) Providing a point cloud using a surveying instrument and a camera device
KR101703177B1 (en) Apparatus and method for recognizing position of vehicle
CN110864691B (en) Magnetic stripe imitation positioning method and device based on ceiling type two-dimensional code
EP3783385A1 (en) Combined point cloud generation using a stationary laser scanner and a mobile scanner
CN110361717B (en) Laser radar-camera combined calibration target and combined calibration method
US9719217B2 (en) Self-propelled construction machine and method for visualizing the working environment of a construction machine moving on a terrain
KR101379787B1 (en) An apparatus and a method for calibration of camera and laser range finder using a structure with a triangular hole
CN109387194B (en) Mobile robot positioning method and positioning system
CN110597265A (en) Recharging method and device for sweeping robot
CN103993431A (en) Vision correction method and system used for sewing
CN111693046A (en) Robot system and robot navigation map building system and method
CN111806418B (en) Road center detection for autonomous vehicle control
CN106444774B (en) Vision navigation method of mobile robot based on indoor illumination
CN110640735B (en) Deviation rectifying method, deviation rectifying device and robot
JPH11149557A (en) Surrounding environment recognizing device for autonomous traveling vehicle
CN105184768A (en) Indoor multi-camera synchronization high-precision positioning method
JPH10307627A (en) Working border detecting device and copy travel controller for autonomous traveling work vehicle
Mazzei et al. A lasers and cameras calibration procedure for VIAC multi-sensorized vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant