CN111192301A - Floor installation method and device, robot and storage medium - Google Patents


Info

Publication number
CN111192301A
CN111192301A (application CN201911415161.6A)
Authority
CN
China
Prior art keywords
point
camera
pixel
coordinates
flitch
Prior art date
Legal status
Granted
Application number
CN201911415161.6A
Other languages
Chinese (zh)
Other versions
CN111192301B (en)
Inventor
廖建国
毛淑艺
郑小林
李江
Current Assignee
Guangdong Bozhilin Robot Co Ltd
Original Assignee
Guangdong Bozhilin Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Bozhilin Robot Co Ltd filed Critical Guangdong Bozhilin Robot Co Ltd
Priority to CN201911415161.6A priority Critical patent/CN111192301B/en
Publication of CN111192301A publication Critical patent/CN111192301A/en
Application granted granted Critical
Publication of CN111192301B publication Critical patent/CN111192301B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • EFIXED CONSTRUCTIONS
    • E04BUILDING
    • E04FFINISHING WORK ON BUILDINGS, e.g. STAIRS, FLOORS
    • E04F21/00Implements for finishing work on buildings
    • E04F21/20Implements for finishing work on buildings for laying flooring
    • E04F21/22Implements for finishing work on buildings for laying flooring of single elements, e.g. flooring cramps ; flexible webs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/04Architectural design, interior design
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Architecture (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Civil Engineering (AREA)
  • Structural Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a floor installation method, which comprises the following steps: paving a flitch in a preset scene to serve as a reference flitch, then teaching the robot, and during teaching acquiring the first teaching point and the second teaching point used by the robot when paving the first flitch based on the reference flitch, together with the coordinates of the rotation center of the mechanical arm. In each subsequent paving operation, a first characteristic point and a second characteristic point are obtained by photographing; the angular deviation through which the mechanical arm must rotate is calculated from the first teaching point, the second teaching point, the first characteristic point and the second characteristic point; the displacement deviation of the flitch to be paved is calculated from the first teaching point and the first characteristic point; and the mechanical arm is controlled to align the next flitch to be paved with the reference flitch according to the angular deviation and the displacement deviation. The invention also provides a floor installation device, a robot and a storage medium, which can automatically control the robot to install the flitch to be paved according to the reference flitch.

Description

Floor installation method and device, robot and storage medium
Technical Field
The invention relates to the technical field of machine vision, in particular to a floor installation method and device, a robot and a storage medium.
Background
Floors are currently generally installed manually by workers. During installation, the accuracy of the alignment between the flitch to be paved and the reference flitch cannot be guaranteed, and the whole installation process is time-consuming and laborious.
Disclosure of Invention
In view of the above, there is a need to provide a floor mounting method and apparatus, a robot and a storage medium, which can automatically control the robot to mount the flitches to be pasted according to the reference flitch.
A first aspect of the present application provides a floor installation method, the method comprising:
respectively establishing mapping relations between the pixel coordinate system of the first camera, the pixel coordinate system of the second camera, and the world coordinate system of the mechanical arm;
acquiring a pixel coordinate of a first teaching point and a pixel coordinate of a second teaching point based on the reference flitch;
acquiring coordinates of a rotation center of the mechanical arm in a world coordinate system;
controlling the mechanical arm to grab and carry the material plates to be paved to the paving position;
after the robot is controlled to rotate by a preset angle, the first camera is controlled to acquire a first image under a first visual field, and the second camera is controlled to acquire a second image under a second visual field;
obtaining the pixel coordinates of a first characteristic point according to the first image, and obtaining the pixel coordinates of a second characteristic point according to the second image;
calculating the angle deviation of the mechanical arm needing to rotate according to the pixel coordinate of the first teaching point, the pixel coordinate of the second teaching point, the pixel coordinate of the first characteristic point and the pixel coordinate of the second characteristic point;
respectively converting the pixel coordinates of the first teaching point and the first characteristic point into a first world coordinate and a second world coordinate according to the mapping relation;
calculating a third world coordinate of the mechanical arm rotating to a characteristic position based on the first world coordinate, the second world coordinate, the preset angle and a coordinate of a rotation center of the mechanical arm in a world coordinate system;
obtaining displacement deviation of the board to be paved based on the difference value between the third world coordinate and the second world coordinate;
and controlling the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
Preferably, the acquiring the pixel coordinates of the first teaching point and the pixel coordinates of the second teaching point based on the reference flitch includes:
starting the first camera to obtain a first view, and starting the second camera to obtain a second view;
controlling a laser installed on the robot to emit laser to irradiate the reference material plate in the first visual field and the second visual field;
controlling the first camera to acquire a third image under a first visual field, and controlling the second camera to acquire a fourth image under a second visual field;
and obtaining the pixel coordinate of the first teaching point according to the third image, and obtaining the pixel coordinate of the second teaching point according to the fourth image.
Preferably, the acquiring coordinates of the rotation center of the mechanical arm in the world coordinate system includes:
controlling the mechanical arm to grab a material plate to be paved and pasted to a preset position, wherein a circular label object is pasted on the surface of the material plate to be paved and pasted;
controlling the first camera to take a picture to obtain an image;
extracting the pixel coordinates of the central point of the circular label object in the image;
converting the pixel coordinates into world coordinates (rx0, ry0) according to the mapping relation;
after controlling the mechanical arm to rotate by a certain angle, controlling the first camera again to take a picture to obtain a new image;
extracting the pixel coordinates of the central point of the circular label object in the image obtained by re-photographing;
converting the pixel coordinates into world coordinates (rx1, ry1) according to the mapping relation;
according to the world coordinates (rx0, ry0), the world coordinates (rx1, ry1) and the rotation angle θ, calculating the coordinates (fx0, fy0) of the rotation center of the mechanical arm in the world coordinate system, wherein:
fx0 = (rx0 + rx1)/2 + (ry0 - ry1)/2 · cot(θ/2)
fy0 = (ry0 + ry1)/2 + (rx1 - rx0)/2 · cot(θ/2)
preferably, before controlling the first camera to acquire a first image and controlling the second camera to acquire a second image, the method further comprises:
and controlling a laser installed on the robot to emit laser to irradiate the reference material plate in the first visual field and the second visual field.
Preferably, the method further comprises:
controlling the laser to emit two beams of laser to irradiate the long material plate of the reference material plate under the first visual field, so that the projections of the two beams of laser are intersected with the long edge of the long material plate to form a first intersection point and a second intersection point;
controlling the laser to emit two other beams of laser to irradiate the short flitch of the reference flitch under the first visual field, and enabling projections of the two other beams of laser to intersect with the short side of the short flitch to form a third intersection point and a fourth intersection point, wherein an intersection point of a straight line formed by the first intersection point and the second intersection point and a straight line formed by the third intersection point and the fourth intersection point is the first characteristic point;
and controlling the laser to emit a beam of laser to irradiate the long material plate of the reference material plate under the second visual field, so that the projection of the beam of laser and the long edge of the long material plate form an intersection point, wherein the intersection point is the second characteristic point.
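As a rough illustration of how the first characteristic point can be derived from the four laser intersection points, the Python sketch below intersects the line through the first and second intersection points with the line through the third and fourth (the point values are invented for illustration):

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-12:
        raise ValueError("lines are parallel")
    a = x1 * y2 - y1 * x2          # cross product of p1, p2
    b = x3 * y4 - y3 * x4          # cross product of p3, p4
    x = (a * (x3 - x4) - (x1 - x2) * b) / d
    y = (a * (y3 - y4) - (y1 - y2) * b) / d
    return (x, y)

# Two laser projections on the long side give points (0,0) and (4,0);
# two on the short side give (1,-1) and (1,3): feature point at (1, 0).
print(line_intersection((0, 0), (4, 0), (1, -1), (1, 3)))  # → (1.0, 0.0)
```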
Preferably, the angular deviation is calculated by the following formula:
dR=arctan((x4-x3)/(y4-y3))-arctan((x1-x2)/(y1-y2))
wherein dR is the angular deviation, (x1, y1) are the pixel coordinates of the first teaching point, (x2, y2) are the pixel coordinates of the second teaching point, (x3, y3) are the pixel coordinates of the first characteristic point, and (x4, y4) are the pixel coordinates of the second characteristic point.
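The angular-deviation formula can be exercised with a minimal sketch (the pixel coordinates below are invented for illustration):

```python
from math import atan, degrees

def angular_deviation(t1, t2, f1, f2):
    """dR = arctan((x4-x3)/(y4-y3)) - arctan((x1-x2)/(y1-y2)),
    with t1, t2 the teaching points and f1, f2 the feature points."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = t1, t2, f1, f2
    return atan((x4 - x3) / (y4 - y3)) - atan((x1 - x2) / (y1 - y2))

# Teaching points along a vertical line, feature points tilted slightly:
dR = angular_deviation((100, 0), (100, 200), (100, 0), (110, 200))
print(round(degrees(dR), 2))  # → 2.86
```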
Preferably, the respectively establishing a mapping relationship between the pixel coordinate system of the first camera and the pixel coordinate system of the second camera and the world coordinate system of the mechanical arm includes:
calculating a first transformation matrix from a pixel coordinate system of a first camera to the world coordinate system;
calculating a second transformation matrix from the pixel coordinate system of the first camera to the pixel coordinate system of the second camera;
and calculating a third transformation matrix from the pixel coordinate system of the second camera to the world coordinate system according to the first transformation matrix and the second transformation matrix.
Preferably, converting the pixel coordinates of the first teach point and the first feature point into first world coordinates and second world coordinates, respectively, according to the mapping relationship includes:
converting the pixel coordinates of the first teaching point into first world coordinates (px0, py0) through the first transformation matrix;
converting the pixel coordinates of the first characteristic point into second world coordinates (px2, py2) through the first transformation matrix.
Preferably, the third world coordinate (px1, py1) is calculated by the following formula:
px1 = (px0 - fx0)·cos θ - (py0 - fy0)·sin θ + fx0
py1 = (px0 - fx0)·sin θ + (py0 - fy0)·cos θ + fy0
wherein the first world coordinate is (px0, py0), the second world coordinate is (px2, py2), the preset angle is θ, and the coordinates of the rotation center of the mechanical arm in the world coordinate system are (fx0, fy0).
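One consistent reading of this step is that the first world coordinate is rotated about the rotation center of the mechanical arm by the preset angle to obtain the third world coordinate, after which the displacement deviation is the difference between the third and second world coordinates. A sketch under that assumption (all values invented):

```python
from math import sin, cos, radians

def rotate_about(p, center, theta):
    """Rotate 2-D point p about center by theta (radians)."""
    px0, py0 = p
    fx0, fy0 = center
    px1 = (px0 - fx0) * cos(theta) - (py0 - fy0) * sin(theta) + fx0
    py1 = (px0 - fx0) * sin(theta) + (py0 - fy0) * cos(theta) + fy0
    return (px1, py1)

# Third world coordinate: first teaching point rotated by the preset angle.
third = rotate_about((10.0, 0.0), (0.0, 0.0), radians(90))  # ~(0.0, 10.0)
# Displacement deviation against the second world coordinate
# (the measured feature point):
second = (0.5, 9.0)
dx, dy = third[0] - second[0], third[1] - second[1]
```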
A second aspect of the present application provides a floor mounting apparatus, the apparatus comprising:
the establishing module is used for respectively establishing a mapping relation between a pixel coordinate system of the first camera and a pixel coordinate system of the second camera and a world coordinate system of the mechanical arm;
the acquisition module is used for acquiring the pixel coordinate of the first teaching point and the pixel coordinate of the second teaching point based on the reference flitch;
the acquisition module is further used for acquiring the coordinates of the rotation center of the mechanical arm in a world coordinate system;
the control module is used for controlling the mechanical arm to grab and carry the material plates to be paved to the paving position;
the control module is further used for controlling the first camera to acquire a first image in a first view and controlling the second camera to acquire a second image in a second view after the robot is controlled to rotate by a preset angle;
the processing module is used for obtaining the pixel coordinate of a first characteristic point according to the first image and obtaining the pixel coordinate of a second characteristic point according to the second image;
the calculation module is used for calculating the angle deviation of the mechanical arm needing to rotate according to the pixel coordinate of the first teaching point, the pixel coordinate of the second teaching point, the pixel coordinate of the first characteristic point and the pixel coordinate of the second characteristic point;
the conversion module is used for respectively converting the pixel coordinates of the first teaching point and the first characteristic point into a first world coordinate and a second world coordinate according to the mapping relation;
the processing module is further used for calculating a third world coordinate of the mechanical arm rotating to a characteristic position based on the first world coordinate, the second world coordinate, the preset angle and a coordinate of a rotation center of the mechanical arm in a world coordinate system;
the processing module is further used for obtaining displacement deviation of the board to be paved based on the difference value between the third world coordinate and the second world coordinate;
and the control module is also used for controlling the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
A third aspect of the present application provides a robot comprising:
the camera is used for shooting images of the flitch and the flitch laying environment;
the mechanical arm is used for moving the material plates to be paved to the paving position and aligning the material plates to be paved with the reference material plates;
a controller in which a plurality of program modules are stored, the program modules being loaded and executed by the controller to perform the floor installation method described above.
Preferably, the cameras are fixedly mounted on a support of the robot and maintain the same horizontal plane.
Preferably, the number of the cameras is two, and the two cameras are arranged along the long side direction of the flitch.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a controller, implements a floor installation method as described above.
Compared with the prior art, in the present invention a flitch is paved in a preset scene to serve as the reference flitch, the robot is then taught, and during teaching the first teaching point and the second teaching point used when paving the first flitch based on the reference flitch are acquired, together with the coordinates of the rotation center of the mechanical arm. In each subsequent paving operation, a first characteristic point and a second characteristic point are obtained by photographing; the angular deviation through which the mechanical arm must rotate is calculated from the first teaching point, the second teaching point, the first characteristic point and the second characteristic point; the displacement deviation of the flitch to be paved is calculated from the first teaching point and the first characteristic point; and the mechanical arm is controlled to align the next flitch to be paved with the reference flitch according to the angular deviation and the displacement deviation. The robot can thus install the flitches to be paved quickly and accurately, saving labor.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a schematic structural diagram of a robot according to an embodiment of the present invention.
Fig. 2 is a schematic view of a first field of view and a second field of view of a first camera and a second camera according to an embodiment of the present invention.
Fig. 3 is a flow chart of a method of installing a floor in accordance with an embodiment of the present invention.
Fig. 4 is a schematic view of a reference plate according to an embodiment of the present invention.
Fig. 5 is a detailed flowchart of step S2 of the floor installation method according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a projection obtained by irradiating a reference material plate with laser light emitted by a laser in an embodiment of the invention.
Fig. 7 is a schematic diagram of a first teach point and a second teach point in an embodiment of the invention.
FIG. 8 is a schematic diagram of a deflection angle according to an embodiment of the present invention.
Fig. 9 is a schematic view of a robot arm in the center of rotation of a world coordinate system according to an embodiment of the present invention.
Fig. 10 is a functional block diagram of a floor mounting apparatus according to an embodiment of the present invention.
Description of the main elements
Robot 1
Mechanical arm 2
Camera 3
Laser 4
Controller 5
Support 6
First camera 31
Second camera 32
Reference material plate 51
To-be-laid flitch 50
Floor mounting device 10
Building block 101
Acquisition module 102
Control module 103
Processing module 104
Calculation Module 105
Conversion module 106
Steps S1-S11, S50-S53
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to fig. 1, an application environment diagram of the floor installation method according to an embodiment of the invention is shown. In the present embodiment, the floor mounting method is applied to the robot 1. The robot 1 may be an Automated Guided Vehicle (AGV), and may automatically install a material plate in a preset scene quickly and accurately. As shown in fig. 1, the robot 1 includes, but is not limited to, a robot arm 2, a camera 3, a laser 4, and a controller 5. The mechanical arm 2 is used for moving the material plates 50 to be paved to a paving position and aligning the material plates 50 to be paved with the reference material plates 51; the camera 3 is used for shooting images of the flitch and the flitch laying environment; the laser 4 is used for emitting laser to the reference flitch to determine the positions of the teaching points and the characteristic points; the controller 5 is electrically connected with the mechanical arm 2, the camera 3 and the laser 4. The controller 5 is used for controlling the camera 3 and the mechanical arm 2 to realize the flitch installation process.
The flitch is a floor installed on the ground, and may be a wood floor, a composite floor, or the like. The flitch is generally rectangular.
In the present embodiment, the camera 3 is fixedly attached to a bracket 6 of the robot 1, as shown in fig. 2. In this embodiment, there are two cameras 3, and two identical cameras 3 are fixedly mounted on the bracket 6 and are maintained on the same horizontal plane.
In the present embodiment, the field of view of the camera is required to cover at least the flitch to be pasted 50 and the reference flitch 51. The field of view of the camera is determined by the working distance of the camera, the size of the target surface and the focal length. Specifically, the visual field range may be calculated by the following formula:
field of view = (working distance × target surface size) / focal length
that is, FOV width = wd × (sensor width) / f, and FOV height = wd × (sensor height) / f.
as shown in FIG. 2, the view range of the cameras 31 is required to simultaneously acquire images of two corners α and β spliced by the flitch 50 to be pasted and the reference flitch 51, the distance between the two cameras 32 is reasonably selected according to the length of the flitch, and the distance between the two cameras is required to be pulled open as much as possible, so that the other camera 32 is installed along the long edge direction of the flitch as much as possible and is far away from the camera 31 without exceeding the long edge range of the flitch.
In the present embodiment, the cameras are industrial cameras; the working distance wd of the two cameras is 400 mm to 800 mm, the target surface size is 3/5 to 3/4 inch, and the focal length of the lens is 8 mm to 60 mm. For example, the focal length of the cameras in this embodiment is 24 mm and the resolution is 2448 × 2048. At a working distance of 700 mm, the field of view of each of the cameras 31 and 32 is 257 mm × 193 mm.
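These field-of-view numbers can be reproduced with the similar-triangles relation FOV = working distance × sensor dimension / focal length; the 2/3-inch sensor dimensions used below (about 8.8 mm × 6.6 mm) are an assumption for the sketch, not values taken from the patent:

```python
def field_of_view(working_distance_mm, sensor_mm, focal_length_mm):
    """Pinhole approximation: FOV = wd * sensor dimension / focal length."""
    return working_distance_mm * sensor_mm / focal_length_mm

# wd = 700 mm, f = 24 mm, assumed 2/3-inch sensor (8.8 mm x 6.6 mm):
w = field_of_view(700, 8.8, 24)   # ~256.7 mm
h = field_of_view(700, 6.6, 24)   # ~192.5 mm
```

which is close to the 257 mm × 193 mm field of view stated above.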
Fig. 1 is only an exemplary robot 1. In other embodiments, the robot 1 may also include more or fewer elements, or have a different arrangement of elements. Although not shown, the robot 1 may further include other components such as a walking device, a wireless fidelity (WiFi) unit, a bluetooth unit, and a battery, which are not described in detail herein.
Referring to fig. 3, a flow chart of a floor installation method according to an embodiment of the invention is shown. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
And step S1, respectively establishing a mapping relation between the pixel coordinate system of the first camera and the pixel coordinate system of the second camera and the world coordinate system of the mechanical arm.
In this embodiment, the two cameras are a first camera and a second camera, respectively. The method for establishing the mapping relation between the pixel coordinate systems of the first camera and the second camera and the world coordinate system of the mechanical arm comprises the following steps: calculating a first transformation matrix from a pixel coordinate system of a first camera to the world coordinate system; calculating a second transformation matrix from the pixel coordinate system of the first camera to the pixel coordinate system of the second camera; and calculating a third transformation matrix from the pixel coordinate system of the second camera to the world coordinate system according to the first transformation matrix and the second transformation matrix.
Specifically, the mechanical arm is driven to enable a marker on the mechanical arm to sequentially pass through a plurality of first preset positions in the visual field range of a first camera, and coordinates of the first preset positions in a world coordinate system and coordinates of the first preset positions in a pixel coordinate system of the first camera are obtained; and calculating a first transformation matrix from the pixel coordinate system of the first camera to the world coordinate system according to the coordinates of the first preset position in the pixel coordinate system of the first camera and the coordinates of the first preset position in the world coordinate system.
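A least-squares estimate of such a pixel-to-world transformation from point correspondences might look like the following sketch, which assumes a 2-D affine model; the synthetic points stand in for the recorded marker positions:

```python
import numpy as np

def fit_affine(pixel_pts, world_pts):
    """Least-squares 2-D affine map pixel -> world from >= 3 point pairs,
    returned as a 3x3 homogeneous matrix."""
    A, b = [], []
    for (u, v), (x, y) in zip(pixel_pts, world_pts):
        A.append([u, v, 1, 0, 0, 0]); b.append(x)
        A.append([0, 0, 0, u, v, 1]); b.append(y)
    p, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return np.array([[p[0], p[1], p[2]], [p[3], p[4], p[5]], [0, 0, 1]])

# Synthetic check: pixels scaled by 0.1 mm/px and shifted by (50, 20) mm.
pix = [(0, 0), (100, 0), (0, 100), (100, 100)]
wld = [(50 + 0.1 * u, 20 + 0.1 * v) for u, v in pix]
T1 = fit_affine(pix, wld)
```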
And acquiring coordinates of a plurality of second preset positions in a pixel coordinate system of the first camera and coordinates in a pixel coordinate system of the second camera in a superposition interval of the visual field ranges of the first camera and the second camera. The second preset position can be defined in a coincidence interval of the visual field ranges of the first camera and the second camera by self, or a checkerboard calibration plate can be arranged in the coincidence interval, and the characteristic points on the checkerboard calibration plate are selected as the second preset position.
And calculating a second transformation matrix from the pixel coordinate system of the first camera to the pixel coordinate system of the second camera according to the coordinates of the second preset position in the pixel coordinate system of the first camera and the coordinates in the pixel coordinate system of the second camera.
And calculating a third transformation matrix from the pixel coordinate system of the second camera to the world coordinate system according to the first transformation matrix and the second transformation matrix. To this end, the pixel coordinate systems of the first camera and the second camera can be converted to the same world coordinate system through respective transformation matrixes.
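The chaining of the three matrices can be sketched with homogeneous 3×3 affines (the calibration values below are invented). Since the second transformation matrix maps first-camera pixels to second-camera pixels, second-camera pixels reach the world via the first matrix composed with the inverse of the second:

```python
import numpy as np

# T1: first-camera pixels -> world; T2: first-camera pixels -> second-camera
# pixels (both 3x3 homogeneous affines). Then T3 = T1 @ inv(T2) maps
# second-camera pixels -> world.
T1 = np.array([[0.1, 0, 50], [0, 0.1, 20], [0, 0, 1]])  # hypothetical calibration
T2 = np.array([[1, 0, -2000], [0, 1, 0], [0, 0, 1]])    # cam2 shifted 2000 px
T3 = T1 @ np.linalg.inv(T2)

# A point at second-camera pixel (0, 0) is first-camera pixel (2000, 0),
# so both routes must give the same world point:
print(T3 @ np.array([0, 0, 1]))
```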
And step S2, acquiring the pixel coordinates of the first teaching point and the pixel coordinates of the second teaching point based on the reference flitch.
In this embodiment, a flitch may be laid as a reference flitch in a preset scene. The reference flitch comprises a long flitch and a short flitch, both rectangular. The long flitch comprises a first long side l11, a second long side l12, a first short side l13 and a second short side l14. The short flitch comprises a first long side l21, a second long side l22, a first short side l23 and a second short side l24. The long flitch and the short flitch are laid in an L shape in the preset scene. For example, the first short side l13 of the long flitch is aligned with the first short side l23 of the short flitch, and they are laid in the upper left corner of the preset scene, as shown in fig. 4.
In this embodiment, the acquired pixel coordinates of the first teaching point, the acquired pixel coordinates of the second teaching point, and the acquired coordinates of the rotation center of the robot may be used as teaching data for guiding the subsequent installation of the next flitch to be pasted. When the flitch to be paved is paved, the robot is controlled to photograph to obtain the pixel coordinates of the characteristic points after moving to the paving position; calculating to obtain displacement deviation and angle deviation according to the pixel coordinates of the first teaching point, the pixel coordinates of the second teaching point, the coordinates of the rotation center of the robot and the pixel coordinates of the characteristic points; and controlling the mechanical arm to align the flitch to be paved and the reference flitch according to the displacement deviation and the angle deviation, and finally finishing paving and pasting the wood floor.
In the present embodiment, a specific step of acquiring the pixel coordinates of the first teaching point and the pixel coordinates of the second teaching point based on the reference flit is described in detail in fig. 5.
And step S3, acquiring the coordinates of the rotation center of the mechanical arm in a world coordinate system.
In this embodiment, the acquiring coordinates of the rotation center of the robot arm in the world coordinate system includes:
controlling the mechanical arm to grab a material plate to be paved and pasted to a preset position, wherein a circular label object is pasted on the surface of the material plate to be paved and pasted;
controlling the first camera to take a picture to obtain an image;
extracting the pixel coordinates of the central point of the circular label object in the image;
converting the pixel coordinates into world coordinates (rx0, ry0) according to the mapping relation;
after controlling the mechanical arm to rotate by a certain angle, controlling the first camera again to take a picture to obtain a new image;
extracting the pixel coordinates of the central point of the circular label object in the image obtained by re-photographing;
converting the pixel coordinates into world coordinates (rx) according to the mapping relation1,ry1);
According to world coordinates (rx)0,ry0) World coordinate (rx)1,ry1) And calculating the angle to obtain the coordinates (fx) of the rotation center of the mechanical arm in the world coordinate system0,fy0)。
Specifically, when the mechanical arm rotates by a certain angle α, the center point of the circular label object moves about the rotation center of the mechanical arm from R0 (the point corresponding to the world coordinates (rx0, ry0)) to R1 (the point corresponding to the world coordinates (rx1, ry1)). Their relationship can be described by a rotation matrix, written componentwise as:

rx1 - fx0 = (rx0 - fx0)cos α - (ry0 - fy0)sin α
ry1 - fy0 = (rx0 - fx0)sin α + (ry0 - fy0)cos α

Solving this system gives the coordinates (fx0, fy0) of the rotation center of the mechanical arm in the world coordinate system:

fx0 = (rx0 + rx1)/2 - ((ry1 - ry0)/2)·cot(α/2)
fy0 = (ry0 + ry1)/2 + ((rx1 - rx0)/2)·cot(α/2)
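The rotation-center recovery described above can be sketched as a short computation. The following Python snippet is illustrative only (not part of the patent); it solves the linear system (I - R)·c = p1 - R·p0 for the center c, assuming the rotation angle is known in radians and is not a multiple of 2π:

```python
import math

def rotation_center(p0, p1, alpha):
    """Recover the center (fx0, fy0) of a planar rotation by angle
    alpha (radians, counterclockwise) that maps point p0 = (rx0, ry0)
    to point p1 = (rx1, ry1), by solving (I - R) c = p1 - R p0."""
    c, s = math.cos(alpha), math.sin(alpha)
    # right-hand side: p1 - R p0
    bx = p1[0] - (c * p0[0] - s * p0[1])
    by = p1[1] - (s * p0[0] + c * p0[1])
    # det(I - R) = 2 (1 - cos alpha), nonzero when alpha != 0 mod 2*pi
    det = 2.0 * (1.0 - c)
    fx0 = ((1.0 - c) * bx - s * by) / det
    fy0 = (s * bx + (1.0 - c) * by) / det
    return fx0, fy0
```

For example, a label center observed at (3, 3) that lands at (2, 4) after a 90° rotation yields the rotation center (2, 3).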
In the present embodiment, the coordinates of the rotation center of the robot arm in the world coordinate system may also be calculated from an image taken by the second camera.
And step S4, controlling the mechanical arm to grab and convey the material plate to be paved to a paving position.
In this embodiment, the mechanical arm is controlled to grab and transport the flitch to be paved to the paving position so as to align it with the reference flitch. Owing to factors such as the limited positioning precision of the robot and height differences in the ground, each time the robot moves to the paving position and the mechanical arm places the flitch to be paved against the reference flitch, the width of the gap on the two sides of the flitch may fail to meet the alignment requirement during paving (20 ± 1.0 mm). Therefore, the displacement deviation and the angle deviation between the current flitch to be paved and the reference flitch need to be calculated and fed back to the mechanical arm for compensation, so that the compensated position of the flitch to be paved meets the alignment requirement.
And step S5, after the robot is controlled to rotate by a preset angle, the first camera is controlled to acquire a first image in a first view, and the second camera is controlled to acquire a second image in a second view.
In this embodiment, the robot needs to rotate in order to mount the flitch to be paved at the paving position. During this rotation, the support rotates with the robot. The first camera is then controlled to acquire a first image, and the second camera is controlled to acquire a second image.
It should be noted that before controlling the first camera to acquire the first image and controlling the second camera to acquire the second image, the laser mounted on the robot is controlled to emit laser to irradiate the reference plate in the first view and the second view. Specifically, the laser emits four beams of laser light to irradiate the reference material plate under the first view. And the laser emits a beam of laser to irradiate the reference material plate under the second visual field.
In the present embodiment, the preset angle is θ.
Step S6, obtaining the pixel coordinate of the first feature point according to the first image, and obtaining the pixel coordinate of the second feature point according to the second image.
In the embodiment, the laser is controlled to emit two laser beams to irradiate the long material plate of the reference material plate under the first view field, so that the projection of the two laser beams and the long edge of the long material plate are intersected to form a first intersection point and a second intersection point; controlling the laser to emit two other beams of laser to irradiate the short flitch of the reference flitch under the first visual field, so that the projections of the two other beams of laser intersect with the short edge of the short flitch to form a third intersection point and a fourth intersection point; the intersection point of a straight line formed by the first intersection point and the second intersection point and a straight line formed by the third intersection point and the fourth intersection point is the first characteristic point; and controlling the laser to emit a beam of laser to irradiate the long material plate of the reference material plate under the second visual field, so that the projection of the beam of laser and the long edge of the long material plate form an intersection point, wherein the intersection point is the second characteristic point.
Specifically, the laser emits two laser beams onto the long flitch of the reference flitch in the first view, so that the projections of the two beams intersect the long edge of the long flitch (for example, the second long edge l12 in fig. 4) and produce a break effect at that edge, forming two intersection points with the long edge, such as points A and B in fig. 6. The laser emits two further beams onto the short flitch of the reference flitch in the first view, so that their projections intersect the short edge of the short flitch (for example, the second short edge l24 in fig. 4) and produce a break effect at that edge, forming two intersection points with the short edge, such as points C and D in fig. 6. The intersection of the straight line through points A and B with the straight line through points C and D is the first feature point.
The laser emits one laser beam onto the long flitch of the reference flitch in the second view, so that its projection intersects the long edge of the long flitch (for example, the second long edge l12 in fig. 4) and produces a break effect at that edge, forming an intersection point with the long edge, which is the second feature point.
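The first feature point is thus the intersection of the line through A and B with the line through C and D. A minimal sketch of that line-line intersection (illustrative, not from the patent; the point names match the description):

```python
def line_intersection(a, b, c, d):
    """Intersection of the line through points a, b with the line
    through points c, d (2-tuples), via Cramer's rule on the
    parametric form a + t*(b - a) = c + u*(d - c)."""
    rx, ry = b[0] - a[0], b[1] - a[1]   # direction of line AB
    sx, sy = d[0] - c[0], d[1] - c[1]   # direction of line CD
    denom = rx * sy - ry * sx
    if abs(denom) < 1e-12:
        raise ValueError("lines AB and CD are parallel")
    t = ((c[0] - a[0]) * sy - (c[1] - a[1]) * sx) / denom
    return a[0] + t * rx, a[1] + t * ry
```

For instance, the line through (0, 0) and (2, 2) meets the line through (0, 2) and (2, 0) at (1, 1).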
In this embodiment, a first coordinate system (XOY) may be established with the lower left corner of the first image (when the image is placed upright) as the origin O, the horizontal direction as the X axis, and the vertical direction as the Y axis; the coordinates of the first feature point in the first coordinate system (XOY) then correspond to pixel points in the first image. The coordinates determined in the first coordinate system (XOY) are therefore the pixel coordinates of the first feature point. The pixel coordinates of the second feature point can be obtained in the same way.
And step S7, calculating the angle deviation through which the mechanical arm needs to rotate according to the pixel coordinates of the first teaching point, the pixel coordinates of the second teaching point, the pixel coordinates of the first feature point, and the pixel coordinates of the second feature point.
In this embodiment, in the process of aligning the flitch to be pasted and the reference flitch, the angular deviation of the mechanical arm required to rotate needs to be calculated.
Specifically, let the first teaching point be A1 with pixel coordinates (x1, y1), the second teaching point be A2 with pixel coordinates (x2, y2), the first feature point be A3 with pixel coordinates (x3, y3), and the second feature point be A4 with pixel coordinates (x4, y4), as shown in fig. 8. The angular deviation dR may then be calculated according to the following formula:
dR=arctan((x4-x3)/(y4-y3))-arctan((x1-x2)/(y1-y2))。
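A direct transcription of this formula into Python (illustrative only; it assumes y4 ≠ y3 and y1 ≠ y2, i.e. neither edge is horizontal in pixel coordinates):

```python
import math

def angle_deviation(a1, a2, a3, a4):
    """Angular deviation dR between the taught edge direction
    (A1 -> A2) and the observed edge direction (A3 -> A4), per
    dR = arctan((x4-x3)/(y4-y3)) - arctan((x1-x2)/(y1-y2)).
    Assumes y4 != y3 and y1 != y2."""
    (x1, y1), (x2, y2) = a1, a2
    (x3, y3), (x4, y4) = a3, a4
    return math.atan((x4 - x3) / (y4 - y3)) - math.atan((x1 - x2) / (y1 - y2))
```

For example, a taught edge along the vertical axis and an observed edge at 45° give dR = π/4.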
and step S8, respectively converting the pixel coordinates of the first teaching point and the first characteristic point into a first world coordinate and a second world coordinate according to the mapping relation.
In the present embodiment, the first world coordinates of the first teaching point are set to (px0, py0), the corresponding point being P0; the second world coordinates of the first feature point are (px2, py2), the corresponding point being P2.
In the present embodiment, the pixel coordinates of the first teaching point may be converted into first world coordinates by a first transformation matrix; the pixel coordinates of the first feature point may be converted into second world coordinates by the first transformation matrix.
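As a sketch of this conversion (the patent does not give the exact form of the first transformation matrix; a 3x3 homogeneous matrix mapping pixel coordinates to world coordinates is assumed here):

```python
import numpy as np

def pixel_to_world(pixel_xy, T):
    """Map a pixel coordinate (u, v) to a world coordinate using a
    3x3 homogeneous transformation matrix T, standing in for the
    first transformation matrix obtained during calibration."""
    u, v = pixel_xy
    w = T @ np.array([u, v, 1.0])
    # normalize by the homogeneous component
    return w[0] / w[2], w[1] / w[2]
```

With a pure scale-and-translate matrix (scale 2, offset (1, 3)), pixel (1, 1) maps to world (3, 5).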
And step S9, calculating a third world coordinate of the mechanical arm rotating to the characteristic position based on the first world coordinate, the second world coordinate, the preset angle and the coordinate of the rotation center of the mechanical arm in the world coordinate system.
In this embodiment, the first world coordinates are (px0, py0), the second world coordinates are (px2, py2), the preset angle is θ, and the coordinates of the rotation center of the mechanical arm in the world coordinate system are (fx0, fy0), as shown in fig. 9. Let the third world coordinates be (px1, py1). Since P1 is the position of P0 after the mechanical arm rotates by θ about the rotation center, the following relationship holds:

px1 - fx0 = (px0 - fx0)cos θ - (py0 - fy0)sin θ
py1 - fy0 = (px0 - fx0)sin θ + (py0 - fy0)cos θ

Rearranging, the third world coordinates are calculated by the following formula:

px1 = fx0 + (px0 - fx0)cos θ - (py0 - fy0)sin θ
py1 = fy0 + (px0 - fx0)sin θ + (py0 - fy0)cos θ
step S10, based on the third world coordinate (px)1,py1) And second world coordinate (px)2,py2) And obtaining the displacement deviation of the board to be paved according to the difference value.
In the present embodiment, the displacement deviation comprises the displacement deviation dx of the short-side seam and the displacement deviation dy of the long-side seam. The specific calculation formulas are as follows:
dx=px2-px1
dy=py2-py1
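Steps S9 and S10 together reduce to one small computation: rotate the taught point P0 by the preset angle θ about the arm's rotation center to obtain P1, then subtract it from P2. A minimal sketch (illustrative, not from the patent):

```python
import math

def displacement_deviation(p0, p2, theta, center):
    """Compute (dx, dy): rotate the taught point p0 = (px0, py0) by
    theta (radians, counterclockwise) about the rotation center
    (fx0, fy0) to obtain p1 = (px1, py1), then return p2 - p1
    componentwise."""
    c, s = math.cos(theta), math.sin(theta)
    px1 = center[0] + (p0[0] - center[0]) * c - (p0[1] - center[1]) * s
    py1 = center[1] + (p0[0] - center[0]) * s + (p0[1] - center[1]) * c
    return p2[0] - px1, p2[1] - py1
```

For example, with P0 = (1, 0), center (0, 0), and θ = 90°, P1 is (0, 1); a feature point P2 at (0.5, 1.5) then gives dx = dy = 0.5.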
and step S11, controlling the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
In the present embodiment, the robot is controlled to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation, maintaining the gap between the flitch to be paved and the reference flitch at a preset width (e.g., 20 mm). For example, both the gap between the long sides and the gap between the short sides of the flitch to be paved and the reference flitch may be maintained at 20 mm.
Through steps S1 to S11, a flitch can first be paved in a preset scene as the reference flitch, and the robot is then taught; during teaching, the first teaching point, the second teaching point, and the coordinates of the rotation center of the mechanical arm are obtained while the first flitch is paved against the reference flitch. In each subsequent paving process, the first feature point and the second feature point are obtained by photographing; the angle deviation through which the mechanical arm needs to rotate is calculated from the first teaching point, the second teaching point, the first feature point, and the second feature point; the displacement deviation of the flitch to be paved is calculated from the first teaching point and the first feature point; and the mechanical arm is controlled to align the next flitch to be paved with the reference flitch according to the angle deviation and the displacement deviation.
Further, the floor installation method may further include: and step S12, controlling the robot to move to the next paving position, and then returning the process to step S4 to continue paving the material plates to be paved to the next paving position.
Referring to fig. 5, obtaining the pixel coordinates of the first teaching point and the pixel coordinates of the second teaching point based on the reference flitch may include the following specific steps:
step S50, the first camera is started to obtain a first view, and the second camera is started to obtain a second view.
And step S51, controlling a laser installed on the robot to emit laser to irradiate the reference material plate in the first visual field and the second visual field.
In this embodiment, the laser beams are emitted at a preset angle (e.g., 45°) to the reference flitch. Because the reference flitch has a certain thickness, the projected laser line exhibits a break where it crosses the edge of the flitch, which makes it convenient to extract the edge points of the reference flitch.
Specifically, the laser is controlled to emit two beams of laser to irradiate the long material plate of the reference material plate under the first view field, and the projections of the two beams of laser are intersected with the long edge of the long material plate to form a fifth intersection point and a sixth intersection point; controlling the laser to emit two other beams of laser to irradiate the short flitch of the reference flitch under the first visual field, so that the projections of the two other beams of laser intersect with the short edge of the short flitch to form a seventh intersection point and an eighth intersection point; an intersection point of a straight line formed by the fifth intersection point and the sixth intersection point and a straight line formed by the seventh intersection point and the eighth intersection point is the first teaching point (see point a1 in fig. 7);
and controlling the laser to emit a beam of laser to irradiate the long material plate of the reference material plate under the second view field, so that the projection of the beam of laser and the long edge of the long material plate form an intersection point, wherein the intersection point is the second teaching point (such as a point A2 in fig. 7).
And step S52, controlling the first camera to acquire a third image in the first view field and controlling the second camera to acquire a fourth image in the second view field.
In this embodiment, when the laser emits four beams of laser light to irradiate the reference material plate, the first camera is controlled to acquire a third image in a first view; and when the laser emits a beam of laser to irradiate the reference material plate, controlling the second camera to acquire a fourth image under a second visual field.
And step S53, obtaining the pixel coordinate of the first teaching point according to the third image, and obtaining the pixel coordinate of the second teaching point according to the fourth image.
In this embodiment, a second coordinate system (X'O'Y') may be established with the lower left corner of the third image (when the image is placed upright) as the origin O', the horizontal direction as the X' axis, and the vertical direction as the Y' axis; the coordinates of the first teaching point in the second coordinate system (X'O'Y') then correspond to pixel points in the third image. The coordinates determined in the second coordinate system (X'O'Y') are therefore the pixel coordinates of the first teaching point. The pixel coordinates of the second teaching point can be obtained in the same way.
Fig. 10 is a functional block diagram of a floor installation device according to the present invention.
In some embodiments, the floor mounting device 10 operates in a robot 1. The floor mounting device 10 may be divided into one or more modules that are stored in the controller 3 and executed by the controller 3 to complete the present application.
The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the floor mounting device 10 in the robot 1. For example, the floor mounting device 10 may be divided into a setup module 101, an acquisition module 102, a control module 103, a processing module 104, a calculation module 105, and a conversion module 106 in fig. 10.
The establishing module 101 is configured to respectively establish a mapping relationship between a pixel coordinate system of the first camera and a pixel coordinate system of the second camera and a world coordinate system of the mechanical arm;
the obtaining module 102 is configured to obtain a pixel coordinate of the first teaching point and a pixel coordinate of the second teaching point based on the reference flitch;
the acquiring module 102 is further configured to acquire coordinates of a rotation center of the robot arm in a world coordinate system;
the control module 103 is used for controlling the mechanical arm to grab and carry the material plates to be paved to the paving position;
the control module 103 is further configured to control the first camera to acquire a first image in a first view and control the second camera to acquire a second image in a second view after the robot is controlled to rotate by a preset angle;
the processing module 104 is configured to obtain a pixel coordinate of a first feature point according to the first image, and obtain a pixel coordinate of a second feature point according to the second image;
the calculating module 105 is configured to calculate an angle deviation that the mechanical arm needs to rotate according to the pixel coordinate of the first taught point, the pixel coordinate of the second taught point, the pixel coordinate of the first feature point, and the pixel coordinate of the second feature point;
the conversion module 106 is configured to convert the pixel coordinates of the first teaching point and the first feature point into a first world coordinate and a second world coordinate, respectively, according to the mapping relationship;
the processing module 104 is further configured to calculate a third world coordinate of the robot arm rotating to the feature position based on the first world coordinate, the second world coordinate, the preset angle, and a coordinate of a rotation center of the robot arm in a world coordinate system;
the processing module 104 is further configured to obtain a displacement deviation of the board to be tiled based on a difference between the third world coordinate and the second world coordinate;
the control module 103 is further configured to control the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
The controller 3 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the controller may be any other conventional processor.
If the integrated modules/units of the floor mounting device 10 are implemented as software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.

Claims (14)

1. A method of floor installation, the method comprising:
respectively establishing a mapping relation between a pixel coordinate system of the first camera and a pixel coordinate system of the second camera and a world coordinate system of the mechanical arm;
acquiring a pixel coordinate of a first teaching point and a pixel coordinate of a second teaching point based on the reference flitch;
acquiring coordinates of a rotation center of the mechanical arm in a world coordinate system;
controlling the mechanical arm to grab and carry the material plates to be paved to the paving position;
after the robot is controlled to rotate by a preset angle, the first camera is controlled to acquire a first image under a first visual field, and the second camera is controlled to acquire a second image under a second visual field;
obtaining the pixel coordinates of a first characteristic point according to the first image, and obtaining the pixel coordinates of a second characteristic point according to the second image;
calculating the angle deviation of the mechanical arm needing to rotate according to the pixel coordinate of the first teaching point, the pixel coordinate of the second teaching point, the pixel coordinate of the first characteristic point and the pixel coordinate of the second characteristic point;
respectively converting the pixel coordinates of the first teaching point and the first characteristic point into a first world coordinate and a second world coordinate according to the mapping relation;
calculating a third world coordinate of the mechanical arm rotating to a characteristic position based on the first world coordinate, the second world coordinate, the preset angle and a coordinate of a rotation center of the mechanical arm in a world coordinate system;
obtaining displacement deviation of the board to be paved based on the difference value between the third world coordinate and the second world coordinate;
and controlling the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
2. The floor installation method of claim 1, wherein said obtaining pixel coordinates of the first taught point and pixel coordinates of the second taught point based on the reference flitch comprises:
starting the first camera to obtain a first view, and starting the second camera to obtain a second view;
controlling a laser installed on the robot to emit laser to irradiate the reference material plate in the first visual field and the second visual field;
controlling the first camera to acquire a third image under a first visual field, and controlling the second camera to acquire a fourth image under a second visual field;
and obtaining the pixel coordinate of the first teaching point according to the third image, and obtaining the pixel coordinate of the second teaching point according to the fourth image.
3. The method of floor installation of claim 1, wherein said obtaining coordinates of a center of rotation of the robotic arm in a world coordinate system comprises:
controlling the mechanical arm to grab a material plate to be paved and pasted to a preset position, wherein a circular label object is pasted on the surface of the material plate to be paved and pasted;
controlling the first camera to take a picture to obtain an image;
extracting the pixel coordinates of the central point of the circular label object in the image;
converting the pixel coordinates into world coordinates (rx0, ry0) according to the mapping relation;
controlling the mechanical arm to rotate by an angle, and then controlling the first camera to take another picture to obtain a new image;
extracting the pixel coordinates of the central point of the circular label object in the image obtained by re-photographing;
converting the pixel coordinates into world coordinates (rx1, ry1) according to the mapping relation;
calculating the coordinates (fx0, fy0) of the rotation center of the mechanical arm in the world coordinate system according to the world coordinates (rx0, ry0), the world coordinates (rx1, ry1), and the rotation angle α, wherein:

fx0 = (rx0 + rx1)/2 - ((ry1 - ry0)/2)·cot(α/2)
fy0 = (ry0 + ry1)/2 + ((rx1 - rx0)/2)·cot(α/2)
4. the method of installing a floor of claim 1, wherein prior to controlling the first camera to capture a first image and controlling the second camera to capture a second image, the method further comprises:
and controlling a laser installed on the robot to emit laser to irradiate the reference material plate in the first visual field and the second visual field.
5. The method of installing a floor of claim 4, further comprising:
controlling the laser to emit two beams of laser to irradiate the long material plate of the reference material plate under the first visual field, so that the projections of the two beams of laser are intersected with the long edge of the long material plate to form a first intersection point and a second intersection point;
controlling the laser to emit two other beams of laser to irradiate the short flitch of the reference flitch under the first visual field, and enabling projections of the two other beams of laser to intersect with the short side of the short flitch to form a third intersection point and a fourth intersection point, wherein an intersection point of a straight line formed by the first intersection point and the second intersection point and a straight line formed by the third intersection point and the fourth intersection point is the first characteristic point;
and controlling the laser to emit a beam of laser to irradiate the long material plate of the reference material plate under the second visual field, so that the projection of the beam of laser and the long edge of the long material plate form an intersection point, wherein the intersection point is the second characteristic point.
6. A floor installation method according to claim 1, wherein the angular deviation is calculated by the following formula:
dR=arctan((x4-x3)/(y4-y3))-arctan((x1-x2)/(y1-y2))
wherein dR is the angular deviation, (x1, y1) are the pixel coordinates of the first teaching point, (x2, y2) are the pixel coordinates of the second teaching point, (x3, y3) are the pixel coordinates of the first feature point, and (x4, y4) are the pixel coordinates of the second feature point.
7. The floor mounting method of claim 3, wherein the separately mapping between the pixel coordinate system of the first camera and the pixel coordinate system of the second camera and the world coordinate system of the robotic arm comprises:
calculating a first transformation matrix from a pixel coordinate system of a first camera to the world coordinate system;
calculating a second transformation matrix from the pixel coordinate system of the first camera to the pixel coordinate system of the second camera;
and calculating a third transformation matrix from the pixel coordinate system of the second camera to the world coordinate system according to the first transformation matrix and the second transformation matrix.
8. The floor installation method of claim 7, wherein converting the pixel coordinates of the first teach point and the first feature point to first world coordinates and second world coordinates, respectively, according to the mapping comprises:
converting the pixel coordinates of the first teaching point into the first world coordinates (px0, py0) through a first transformation matrix;
converting the pixel coordinates of the first feature point into the second world coordinates (px2, py2) through the first transformation matrix.
9. A method of installing a floor as claimed in claim 8, wherein the third world coordinate is calculated by the formula:
px1 = fx0 + (px0 - fx0)cos θ - (py0 - fy0)sin θ
py1 = fy0 + (px0 - fx0)sin θ + (py0 - fy0)cos θ
wherein the third world coordinates are (px1, py1), the first world coordinates are (px0, py0), the second world coordinates are (px2, py2), the preset angle is θ, and the coordinates of the rotation center of the mechanical arm in the world coordinate system are (fx0, fy0).
10. A floor mounting apparatus, the apparatus comprising:
the establishing module is used for respectively establishing a mapping relation between a pixel coordinate system of the first camera and a pixel coordinate system of the second camera and a world coordinate system of the mechanical arm;
the acquisition module is used for acquiring the pixel coordinate of the first teaching point and the pixel coordinate of the second teaching point based on the reference flitch;
the acquisition module is further used for acquiring the coordinates of the rotation center of the mechanical arm in a world coordinate system;
the control module is used for controlling the mechanical arm to grab and carry the material plates to be paved to the paving position;
the control module is further used for controlling the first camera to acquire a first image in a first view and controlling the second camera to acquire a second image in a second view after the robot is controlled to rotate by a preset angle;
the processing module is used for obtaining the pixel coordinate of a first characteristic point according to the first image and obtaining the pixel coordinate of a second characteristic point according to the second image;
the calculation module is used for calculating the angle deviation of the mechanical arm needing to rotate according to the pixel coordinate of the first teaching point, the pixel coordinate of the second teaching point, the pixel coordinate of the first characteristic point and the pixel coordinate of the second characteristic point;
the conversion module is used for respectively converting the pixel coordinates of the first teaching point and the first characteristic point into a first world coordinate and a second world coordinate according to the mapping relation;
the processing module is further used for calculating a third world coordinate of the mechanical arm rotating to a characteristic position based on the first world coordinate, the second world coordinate, the preset angle and a coordinate of a rotation center of the mechanical arm in a world coordinate system;
the processing module is further used for obtaining displacement deviation of the board to be paved based on the difference value between the third world coordinate and the second world coordinate;
and the control module is also used for controlling the mechanical arm to align the flitch to be paved with the reference flitch according to the displacement deviation and the angle deviation.
11. A robot, characterized in that the robot comprises:
the camera is used for shooting images of the flitch and the flitch laying environment;
the mechanical arm is used for moving the material plates to be paved to the paving position and aligning the material plates to be paved with the reference material plates;
a controller having stored therein a plurality of program modules that are loaded by the controller and execute the floor installation method of any of claims 1 to 9.
12. A robot as claimed in claim 11, wherein the camera is fixedly mounted on a support of the robot and remains in the same horizontal plane.
13. A robot as set forth in claim 12, wherein the number of said cameras is two, two of said cameras being arranged in a direction of a long side of said flitch.
14. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the floor installation method as claimed in any one of the claims 1 to 9.
CN201911415161.6A 2019-12-31 2019-12-31 Floor mounting method and device, robot and storage medium Active CN111192301B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911415161.6A CN111192301B (en) 2019-12-31 2019-12-31 Floor mounting method and device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN111192301A true CN111192301A (en) 2020-05-22
CN111192301B CN111192301B (en) 2023-05-05

Family ID: 70709638

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911415161.6A Active CN111192301B (en) 2019-12-31 2019-12-31 Floor mounting method and device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN111192301B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104997529A (en) * 2015-06-30 2015-10-28 大连理工大学 Method for correcting cone beam CT system geometric distortion based on symmetrically repetitive template
US20170122735A1 (en) * 2014-06-15 2017-05-04 Cct Creative Construction Tools Ltd. Method and apparatus for assisting in tiling
CN109807885A (en) * 2018-12-29 2019-05-28 深圳市越疆科技有限公司 A kind of vision calibration method of manipulator, device and intelligent terminal
CN110259067A (en) * 2019-06-11 2019-09-20 清华大学 The tile loading position recognition methods of robot and system
CN110405773A (en) * 2019-08-19 2019-11-05 广东博智林机器人有限公司 A kind of floor mounting method and robot
CN110480634A (en) * 2019-08-08 2019-11-22 北京科技大学 A kind of arm guided-moving control method for manipulator motion control
CN110497386A (en) * 2019-08-26 2019-11-26 中科新松有限公司 A kind of cooperation Robot Hand-eye relationship automatic calibration device and method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111894247A (en) * 2020-08-04 2020-11-06 广东博智林机器人有限公司 Decorative surface material paving method, device and system and storage medium
CN112928324A (en) * 2021-01-29 2021-06-08 蜂巢能源科技有限公司 Module component assembling method and module component assembling device
CN112928324B (en) * 2021-01-29 2022-03-25 蜂巢能源科技有限公司 Module component assembling method and module component assembling device
CN114971948A (en) * 2021-02-20 2022-08-30 广东博智林机器人有限公司 Position adjusting method, device, equipment and medium
WO2022194019A1 (en) * 2021-03-17 2022-09-22 广东博智林机器人有限公司 Flooring installation apparatus, device, method, and medium
WO2023272529A1 (en) * 2021-06-29 2023-01-05 西门子(中国)有限公司 Dynamic assembly method, apparatus, and system
CN113482301A (en) * 2021-07-02 2021-10-08 北京建筑大学 Tile paving method and tile automatic paving control system
CN113482301B (en) * 2021-07-02 2022-07-01 北京建筑大学 Tile paving method and tile automatic paving control system
CN114260908A (en) * 2021-12-20 2022-04-01 深圳市如本科技有限公司 Robot teaching method, device, computer equipment and computer program product
CN114260908B (en) * 2021-12-20 2023-10-20 深圳市如本科技有限公司 Robot teaching method, apparatus, computer device and computer program product

Also Published As

Publication number Publication date
CN111192301B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN111192301B (en) Floor mounting method and device, robot and storage medium
CN108571971B (en) AGV visual positioning system and method
CN107611073B (en) Positioning method for typesetting of solar cell strings
CN103395301B (en) A kind of laser marking machine three-dimensional correction method and device
US10706532B1 (en) Digital projection system for workpiece assembly and associated method
CN111083381B (en) Image fusion method and device, double-optical camera and unmanned aerial vehicle
US20160236619A1 (en) Vehicle periphery image display device and camera adjustment method
WO2021073458A1 (en) Laying method and laying robot
WO2020073940A1 (en) Method and device for calibrating print positioning platform of crystalline silicon photovoltaic solar cell on the basis of machine vision
JP6844582B2 (en) Screw tightening device
TWI781240B (en) Double-side exposure device and double-side exposure method
CN110815205A (en) Calibration method, system and device of mobile robot
CN113496523A (en) System and method for three-dimensional calibration of visual system
CN113043334B (en) Robot-based photovoltaic cell string positioning method
WO2023103679A1 (en) Rapid and automatic calibration method and apparatus for vehicle-mounted surround-view camera
CN105323455A (en) Positioning compensation method based on machine vision
TW201443827A (en) Camera image calibrating system and method of calibrating camera image
CN112950724A (en) Screen printing visual calibration method and device
US6258495B1 (en) Process for aligning work and mask
JP6134166B2 (en) Position detection apparatus and position detection method
KR20110019990A (en) Method for aligning a substrate
KR102603530B1 (en) Double sided exposure apparatus
KR20120077884A (en) Automatic teaching method of wafer trasfer robot
CN113400662A (en) Method and device for attaching electronic element on PCB (printed Circuit Board) and storage medium
KR101430970B1 (en) Alligning method of display panel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant