WO2023193362A1 - Automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision - Google Patents

Automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision

Info

Publication number
WO2023193362A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
welding
coordinate system
camera
degree
Prior art date
Application number
PCT/CN2022/106029
Other languages
English (en)
French (fr)
Inventor
杨涛
李欢欢
彭磊
姜军委
马力
刘青峰
张妮妮
王芳
Original Assignee
西安知象光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西安知象光电科技有限公司 filed Critical 西安知象光电科技有限公司
Priority to US18/041,129 priority Critical patent/US11951575B2/en
Priority to EP22871144.6A priority patent/EP4279211A4/en
Publication of WO2023193362A1 publication Critical patent/WO2023193362A1/zh

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1615Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J9/162Mobile manipulator, movable base with manipulator arm mounted on it
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00Arc welding or cutting
    • B23K9/12Automatic feeding or moving of electrodes or work for spot or seam welding or cutting
    • B23K9/126Controlling the spatial relationship between the work and the gas torch
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K37/02Carriages for supporting the welding or cutting element
    • B23K37/0247Driving means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K37/02Carriages for supporting the welding or cutting element
    • B23K37/0294Transport carriages or vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00Arc welding or cutting
    • B23K9/0026Arc welding or cutting specially adapted for particular articles or work
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00Arc welding or cutting
    • B23K9/095Monitoring or automatic control of welding parameters
    • B23K9/0953Monitoring or automatic control of welding parameters using computing means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/005Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators using batteries, e.g. as a back-up power source
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/005Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39022Transform between measuring and manipulator coordinate system
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39045Camera on end effector detects reference pattern
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39398Convert hand to tool coordinates, derive transform matrix
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40298Manipulator on vehicle, wheels, mobile
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40584Camera, non-contact sensor mounted on wrist, indep from gripper
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40613Camera, laser scanner on end effector, hand eye manipulator, local
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45104Lasrobot, welding robot

Definitions

  • the present invention relates to the technical fields of industrial automation and machine vision, regarding the automated welding of large structural parts, and in particular to an automatic welding system and method for large structural parts using composite robots and three-dimensional vision.
  • Robotic automated welding is increasingly replacing manual welding tasks, but the arm span of the robot is limited, which means that in order to complete the welding of large-sized workpieces, an external axis must be used to expand the robot's working range.
  • such external axes usually take the form of ground rails, gantries, overhead rails, and similar structures.
  • Existing solutions usually use the above or similar solutions to expand the working range of the robot to meet the welding of large structural parts.
  • for example, the patent application with application number 202111298122.X discloses an automatic welding system and method for large structural parts based on three-dimensional vision, in which the base of the multi-degree-of-freedom robot is mounted on an external-axis ground rail via a slide. The disadvantages of this solution are: (1) High accuracy requirements for the motion mechanism. The stroke of the motion mechanism usually reaches several meters or more, which easily produces cumulative errors that significantly affect the robot's global positioning accuracy.
  • the purpose of the present invention is to provide an automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision, which uses a mobile robot to extend the working range of an industrial robot and then uses 3D vision technology for precise identification and positioning at the end effector, thereby completing welding tasks automatically, expanding the system's working range, improving its flexibility, and reducing its cost.
  • An automatic welding system for large structural parts based on a composite robot and three-dimensional vision includes: a composite robot system composed of a mobile robot and a multi-degree-of-freedom robot installed on top of the mobile robot; a welding system installed at the end of the multi-degree-of-freedom robot for welding the target workpiece; and a three-dimensional vision system installed at the end of the multi-degree-of-freedom robot or on the welding system.
  • the three-dimensional vision system performs global calibration and positioning of the composite robot system, welding system and target workpiece.
  • the mobile robot includes a mobile robot chassis, a shell is fastened on the chassis, and a rechargeable battery pack that provides energy for the entire system, a power supply, a controller of the multi-degree-of-freedom robot, and a welding machine of the welding system are provided in the shell.
  • the rechargeable battery pack can also be connected to an external power supply through the power interface on the casing;
  • the multi-degree-of-freedom robot includes a multi-degree-of-freedom robot body; the body and its teach pendant are both signal-connected via cables to the controller in the housing;
  • the welding system includes a welding machine located in the casing, and a welding gun connected to the welding machine.
  • the welding gun is arranged at the end of the multi-degree-of-freedom robot body;
  • the three-dimensional vision system includes a 3D camera.
  • the 3D camera is installed at the end of the multi-degree-of-freedom robot body or installed on the welding gun.
  • the 3D camera is connected to the industrial computer in the casing through cables.
  • the industrial computer is connected to the robot controller through cables;
  • the measurement accuracy of the 3D camera is not less than 0.5mm, and the depth map frame rate is greater than 1 frame per second;
  • the multi-degree-of-freedom robot has a robotic arm with six or more degrees of freedom, and the arm span of the robotic arm ranges from 0.5 m to 2 m.
  • the welding method of the automatic welding system for large structural parts based on the above-mentioned composite robot and three-dimensional vision includes the following steps:
  • aimed at large structural parts, which are large in scale and poor in consistency, this invention uses a composite robot and three-dimensional vision technology to effectively improve system flexibility and reduce system cost.
  • the present invention combines a composite robot with three-dimensional vision technology: in essence, rough positioning based on a mobile platform is combined with precise identification and positioning based on high-precision three-dimensional vision, expanding the working range of the multi-degree-of-freedom robot in the X, Y and Z directions and realizing flexible welding of large structural parts. Compared with the traditional solution of extending the robot's working range through external axes, this solution is low-cost, and the working range it provides in the XY directions is far larger than what external axes offer; as long as conditions permit, the working range can be extended arbitrarily in the XY directions while the added cost remains negligible. The solution therefore has a very high cost advantage for welding large and ultra-large workpieces; in addition, it occupies a small floor area and does not interfere with hoisting of the workpiece, which are further beneficial effects of this solution.
  • compared with traditional robotic welding systems that use touch sensing or laser seam finding, the present invention uses three-dimensional vision technology and has a stronger tolerance capability. The requirements on the fit-up accuracy of the workpiece, the positioning accuracy of the mobile robot, and the placement accuracy of the workpiece are therefore lower, which helps reduce cost, improve flexibility, and broaden the scope of application.
  • the present invention realizes automated welding of large structural parts. Compared with the current common manual welding methods, it is conducive to saving manpower, improving production efficiency, and improving welding quality.
  • Figure 1 is a system structure diagram of the present invention.
  • the present invention aims to propose an automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision. It uses a mobile robot to extend the working range of an industrial robot, then uses 3D vision technology to precisely identify and position at the end effector and complete welding tasks automatically, thereby expanding the system's working range, improving its flexibility, and reducing its cost.
  • An automatic welding system for large structural parts based on a composite robot and three-dimensional vision includes: a composite robot system composed of a mobile robot 3 and a multi-degree-of-freedom robot 2 installed on top of the mobile robot 3; a welding system installed at the end of the multi-degree-of-freedom robot 2 for welding the target workpiece 1; and a three-dimensional vision system installed at the end of the multi-degree-of-freedom robot 2 or on the welding system.
  • the three-dimensional vision system performs global calibration and positioning of the composite robot system, the welding system and the target workpiece.
  • the mobile robot 3 includes a mobile robot chassis 13.
  • the mobile robot chassis 13 includes a motion module, a control module, a navigation sensor and structural components.
  • the chassis 13 is fastened with a housing 15; inside the housing 15 are a rechargeable battery pack 11 and a power supply 12 that provide energy for the entire system, the controller 10 of the multi-degree-of-freedom robot 2, and the welder 9 of the welding system. The rechargeable battery pack 11 can also be connected to an external power supply through the power interface 14 on the housing 15; when the entire system is working, either the battery pack 11 or the power supply 12 can supply power.
  • the mobile robot 3 is a movable robot platform with a global navigation and positioning function, and its spatial positioning accuracy is better than 20mm.
  • the global navigation and positioning performs rough positioning through the navigation function, including the electromagnetic method, the QR code method, visual SLAM, visual tracking, inertial navigation, and combinations of several of the above methods.
  • when the electromagnetic or QR code navigation and positioning method is used, corresponding navigation lines 4 are laid on the ground.
  • when visual tracking is used to position the mobile robot, one or more tracking targets 16 are printed on the shell 15 of the mobile robot 3, and additional cameras determine the position of the mobile robot by photographing the targets.
  • the target can be a specially designed luminous or reflective structure mounted directly on the mobile robot.
  • the mobile robot 3 is preferably a mobile robot platform with a lifting function.
  • the robot controller 10 includes a controller for controlling the motion of the robot motor and a driver for driving the robot motor, collectively referred to as a controller.
  • the multi-degree-of-freedom robot 2 includes a multi-degree-of-freedom robot body 6 rigidly installed on the mobile robot 3.
  • the multi-degree-of-freedom robot body 6 and the robot teaching pendant are connected through cables and signals from the controller 10 in the housing 15.
  • the multi-degree-of-freedom robot 2 is carried by the mobile robot 3, which expands the working range of the multi-degree-of-freedom robot.
  • the multi-degree-of-freedom robot 2 preferably has a robotic arm with six or more degrees of freedom.
  • the robotic arm is an industrial robot or a collaborative robot.
  • the arm span of the robotic arm is preferably in the range of 0.5m-2m.
  • the welding system includes a welding machine 9 located in the housing 15, a welding gun connected to the welding machine 9, and other necessary components.
  • the welding gun 7 is provided at the end of the multi-degree-of-freedom robot body 6; the other components include a wire feeder, welding wire, a water tank, shielding gas with its storage device, and an air compressor, used to complete the full welding process.
  • the three-dimensional vision system includes a 3D camera 5.
  • the 3D camera 5 is installed at the end of the multi-degree-of-freedom robot body 6, or on the welding gun 7, or at any position from which the camera can conveniently photograph the target workpiece.
  • the 3D camera 5 is connected via a cable to the industrial computer 8 in the housing 15.
  • the industrial computer 8 can be installed at any suitable operating position on the housing 15.
  • the industrial computer 8 is connected to the robot controller 10 through a cable.
  • the 3D camera 5 obtains the three-dimensional feature information of the workpiece to be welded, the measurement accuracy is not less than 0.5mm, and the depth map frame rate is greater than 1 frame per second.
  • the 3D camera 5 is a camera with low power consumption, small volume, and low weight; it preferably uses a laser as the light source to improve ambient-light resistance.
  • the 3D camera 5 is preferably a MEMS-based structured light 3D camera to meet the above characteristics.
  • the 3D camera 5 is equipped with a protective device to protect the camera from being affected by high temperature, splashing, and smoke so that it can operate normally.
  • the target workpiece 1 is a large metal structure suitable for welding processing, which means that at least one dimension of the workpiece is within 5m-500m.
  • the target workpiece 1 is placed on a basic plane, which is a flat plane without undulations, preferably a horizontal plane;
  • the mobile robot 3 is equipped with a multi-degree-of-freedom robot 2, a three-dimensional vision system, and a welding system. It moves around the target workpiece on the base plane, performs rough positioning through the navigation function, and expands the working range in the direction perpendicular to the base plane through the lifting function.
  • the welding method of an automatic welding system for large structural parts based on a composite robot and three-dimensional vision includes the following steps:
  • the mobile robot 3 is equipped with the multi-degree-of-freedom robot 2, the three-dimensional vision system, and the welding system. It moves around the target workpiece 1 on the base plane, performs rough positioning through the navigation function, and expands the working range in the direction perpendicular to the base plane through the lifting function.
  • the methods for rough positioning through the navigation function include the electromagnetic method, the QR code method, visual SLAM, visual tracking, inertial navigation, and composite solutions of the above methods; when the electromagnetic or QR code navigation and positioning method is used, corresponding navigation lines 4 are laid on the ground.
  • one or more tracking targets 16 are printed on the housing 15 of the mobile robot 3, and two or more additional cameras determine the position of the mobile robot by photographing the targets.
  • the target can be a specially designed luminous or reflective structure installed directly on the mobile robot, to ensure that cameras at different angles can observe the target.
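Determining the mobile robot's position from two or more external cameras photographing a tracking target reduces to triangulation. Below is a minimal numpy sketch of linear (DLT) triangulation under assumed synthetic projection matrices; it illustrates the principle and is not code from the patent.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover the 3D position of one tracking
    target from its pixel observations uv1, uv2 in two calibrated cameras
    with 3x4 projection matrices P1, P2."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector belonging to
    # the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With more than two cameras, two rows per additional view are simply appended to the stacked matrix before the SVD.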
  • (1.2) Establish the coordinate systems of the composite robot system. Establish a coordinate system for the mobile robot 3: its origin is a fixed point on the base plane, its XY plane coincides with the base plane, and its Z direction points vertically up from the base plane; this coordinate system is the global base coordinate system. Establish the coordinate system of the multi-degree-of-freedom robot 2: its origin is on the base of the robotic arm, and its X, Y and Z directions coincide with those of the mobile robot 3's coordinate system. Establish the coordinate system of the three-dimensional vision system: its origin is at the optical center of the 3D camera lens, the X direction points from the optical center of the camera lens to the optical center of the projection system, the Y axis is perpendicular to the X axis and parallel to the camera imaging chip, and the Z axis is perpendicular to the XY plane and points away from the camera toward the scene directly in front of it.
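The frame conventions above can be represented as 4x4 homogeneous transforms. A minimal sketch follows; the offset of the arm-base frame is an illustrative placeholder, not a value from the patent.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a
    translation vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Global base frame: origin fixed on the base plane, XY in the plane,
# Z vertically up. By definition it is the identity.
T_base = np.eye(4)

# Arm-base frame: axes coincide with the global frame, origin at the
# (assumed) mounting point of the robotic arm on the mobile platform.
T_robot = make_transform(np.eye(3), np.array([1.0, 2.0, 0.5]))

def to_global(T_frame, p_local):
    """Express a point given in a local frame in the global base frame."""
    return (T_frame @ np.append(p_local, 1.0))[:3]
```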
  • the step (2) includes three parts:
  • the internal parameters include: focal length, principal point position, pixel size, resolution, and distortion parameters.
  • the hand-eye transformation matrix ^cam T_tool obtained in the previous steps is used as an initial value, and a further closed-loop iterative solution yields the optimized hand-eye transformation matrix ^cam T_tool.
  • in the following, the hand-eye transformation matrix refers to the optimal ^cam T_tool obtained in this step, regardless of whether closed-loop optimization is used.
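Hand-eye calibration of ^cam T_tool is classically posed as solving A X = X B, where A is a relative motion of the tool flange and B the corresponding relative motion observed by the camera. The numpy sketch below only verifies this constraint on synthetic poses (all values are illustrative); actually solving for X, e.g. by Tsai's method, is beyond this snippet.

```python
import numpy as np

def rot_z(a):
    """Rotation about the Z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def hom(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed ground-truth hand-eye transform (camera mounted near the tool).
X = hom(rot_z(0.3), np.array([0.02, 0.0, 0.10]))

# Synthetic relative flange motions A_i; the camera then observes the
# relative motions B_i = X^-1 A_i X, so each pair satisfies A X = X B.
A1 = hom(rot_z(0.5), np.array([0.1, 0.0, 0.0]))
A2 = hom(rot_z(-0.2), np.array([0.0, 0.2, 0.1]))
B1 = np.linalg.inv(X) @ A1 @ X
B2 = np.linalg.inv(X) @ A2 @ X

def handeye_residual(A, B, X):
    """Frobenius-norm residual of the hand-eye constraint A X = X B."""
    return np.linalg.norm(A @ X - X @ B)
```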
  • the robot's TCP calibration methods include: direct input method, four-point method, and six-point method.
  • the four-point method is used for calibration.
  • the TCP is made to coincide with the same fixed point in space from several different poses.
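The four-point TCP calibration can be written as a small least-squares problem: the unknown tool offset t (in flange coordinates) must map every recorded flange pose onto the same fixed point P. A numpy sketch on synthetic poses (pose values are assumptions for illustration):

```python
import numpy as np

def rot_axis(axis, a):
    """Rotation matrix about a coordinate axis ('x', 'y' or 'z') by a."""
    c, s = np.cos(a), np.sin(a)
    m = {'x': [[1, 0, 0], [0, c, -s], [0, s, c]],
         'y': [[c, 0, s], [0, 1, 0], [-s, 0, c]],
         'z': [[c, -s, 0], [s, c, 0], [0, 0, 1]]}
    return np.array(m[axis])

def calibrate_tcp(poses):
    """Solve R_i @ t + p_i = P for all flange poses (R_i, p_i): stack the
    linear system [R_i  -I] [t; P] = -p_i over all poses and take the
    least-squares solution. Returns the tool offset t and fixed point P."""
    rows, rhs = [], []
    for R, p in poses:
        rows.append(np.hstack([R, -np.eye(3)]))
        rhs.append(-p)
    sol, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs),
                              rcond=None)
    return sol[:3], sol[3:]
```

With four sufficiently different orientations the stacked system has full rank, so the offset is recovered uniquely.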
  • the coordinate origin of the workpiece coordinate system is set at a place with obvious working characteristics, preferably the intersection of multiple planes, or a corner point, to facilitate alignment.
  • the X, Y, and Z directions of the workpiece coordinate system should be consistent with the main structural feature directions as much as possible; preferably, the longest scale direction is selected as the X direction to facilitate placement.
  • two or more limiting mechanisms distributed along the X direction can be provided so that the workpiece is aligned with the X axis. That is, first place the target workpiece 1 in the working area of the base plane, then adjust the support structure so that the X, Y and Z directions of the workpiece coordinate system basically coincide with those of the base coordinate system. Here, basic coincidence means that the angular error between them is within 2°.
  • the specific implementation steps of step (3) are:
  • O_base(x, y, z) = ^cam T_robot · ^robot T_base · O(x, y, z)
  • the transformation values from the workpiece coordinate system to the base coordinate system in the XYZ directions are (-X T , -Y T , -Z T ).
  • the above transformation will be used to perform translation transformation on the position information in the robot program generated based on the workpiece coordinate system.
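Applying the stated translation (-X_T, -Y_T, -Z_T) to the program positions is a one-line transform. A minimal sketch follows; the offset values are illustrative placeholders, not values from the patent.

```python
import numpy as np

# Illustrative offset (X_T, Y_T, Z_T) of the workpiece coordinate system
# relative to the base coordinate system; assumed values for the example.
XT, YT, ZT = 3.0, 1.5, 0.0

def workpiece_to_base(points):
    """Translate program positions from the workpiece coordinate system
    into the base coordinate system. Because the axes of the two systems
    were aligned in step (3), only the translation (-XT, -YT, -ZT) stated
    above is applied; no rotation is needed."""
    return np.asarray(points, dtype=float) + np.array([-XT, -YT, -ZT])
```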
  • the motion path of the robot and the camera's photographing position and attitude are planned.
  • the camera's photographing position and attitude can correspond to one or more welding features in a single photo, or to one welding feature in multiple photos.
  • the welding feature is the target welding position, which may be a point, a straight line, or a curve; the camera shooting position and posture should put the camera in a pose convenient for photographing the target feature, namely: the camera is within its effective working distance, the target feature is within the camera's field of view, and the main normal direction of the area where the target feature is located is as parallel as possible to the camera's Z-axis direction, so as to obtain the best shooting effect.
  • the robot motion path is the shortest safe path for the robot to move to the target position.
  • the target position should be such that the robot's arm span and degrees of freedom allow the camera to reach the target shooting position and posture;
  • the method for planning the motion path of the robot and the camera's photographing position and attitude includes: planning through offline programming software, implementing it through parameterization, and implementing it through teaching.
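The shooting-pose conditions above (working distance, field of view, normal alignment) can be checked programmatically when planning viewpoints. A sketch follows; all threshold values are assumptions for illustration, not values from the patent.

```python
import numpy as np

def viewpoint_ok(p_cam, normal_cam, d_min=0.3, d_max=1.2,
                 half_fov_deg=30.0, max_tilt_deg=30.0):
    """Check a candidate viewpoint: the feature point p_cam (expressed in
    the camera frame) must lie within the working distance and inside the
    field-of-view cone, and its surface normal must be roughly
    anti-parallel to the camera's +Z viewing axis."""
    d = np.linalg.norm(p_cam)
    if not (d_min <= d <= d_max):
        return False
    # Angle between the +Z viewing axis and the ray to the feature.
    ray_angle = np.degrees(np.arccos(np.clip(p_cam[2] / d, -1.0, 1.0)))
    if ray_angle > half_fov_deg:
        return False
    n = np.asarray(normal_cam, dtype=float)
    n = n / np.linalg.norm(n)
    # Tilt between the feature normal and the direction back to the camera.
    tilt = np.degrees(np.arccos(np.clip(-n[2], -1.0, 1.0)))
    return tilt <= max_tilt_deg
```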
  • step (5) generating a robot motion control and welding program includes: a motion control program for the mobile robot 3, a control program for the multi-degree-of-freedom robot 2, and a welding program.
  • the multi-degree-of-freedom robot 2 is used as the main controller to communicate with and control the mobile robot 3, the 3D camera 5, and the welding system.
  • the industrial computer 8 is used as the main control to control the mobile robot 3, the multi-degree-of-freedom robot 2, and the welding system.
  • an external PLC is used as the main control to control the mobile robot 3, the multi-degree-of-freedom robot 2, and the welding system.
  • the control program of the multi-degree-of-freedom robot 2 includes a movement control program of the robotic arm; a communication program with the camera; and a communication program with the welding system and the motion robot.
  • the program generation methods include: a template program compiled manually offline, a program generated by offline programming software driven by the digital model, and a teaching program generated with the teach pendant.
  • the generated robot program conforms to the syntax rules and data format of the corresponding robot brand and can run directly on that brand of robot. If it is a template program compiled manually offline, or a program generated by offline programming software driven by the digital model, the generated robot program must first be sent to the robot controller before running.
  • the transfer can be wired or wireless, or a storage device can be used to copy the program.
  • the host computer controls the robot online in real time without sending the program to the robot controller.
  • the step (6) includes the following steps:
  • the network method identifies key point information of the welding features; the key point information includes the start point, end point and direction of a line segment; spline parameters; arc parameters; and the coordinates of multiple discrete points;
  • the new key feature information likewise includes the start point, end point and direction of a line segment; spline parameters; arc parameters; and the coordinates of multiple discrete points.
  • the described point position and direction information have been converted to the basic coordinate system of the system.
  • the mobile robot 3 moves to position P2 and executes a cycle: steps (6.3)-(6.6) are repeated until welding is completed or stopped.
  • the instructions described above and below may be implemented in software and may be executed on a data processing system or other processing tool by executing computer-executable instructions.
  • the instructions may be program code that is loaded into memory (eg, RAM) from a storage medium or from another computer via a computer network.
  • the features described may be implemented by hardwired circuitry instead of software, or a combination of hardwired circuitry and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Plasma & Fusion (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

An automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision. The system comprises a composite robot system composed of a mobile robot (3) and a multi-degree-of-freedom robot (2), a three-dimensional vision system, and a welding system for performing the welding function. Rough positioning based on the mobile platform is combined with precise identification and positioning based on high-precision three-dimensional vision, expanding the working range of the multi-degree-of-freedom robot (2) in the X, Y and Z directions and realizing flexible welding tasks on large structural parts.

Description

Automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision
Technical Field
The present invention relates to the technical fields of industrial automation and machine vision, concerns the automated welding of large structural parts, and in particular relates to an automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision.
Background Art
Robotic automated welding increasingly replaces manual labor in heavy welding tasks, but a robot's arm span is limited, which means that welding large workpieces requires external axes to extend the robot's working range. Such external axes usually take the form of ground rails, gantries, overhead rails and similar structures. Existing solutions usually adopt the above or similar approaches to extend the robot's working range so as to weld large structural parts. For example, the patent application with application number 202111298122.X discloses an automatic welding system and method for large structural parts based on three-dimensional vision, in which the base of the multi-degree-of-freedom robot is mounted on an external-axis ground rail via a slide. The disadvantages of this solution are: (1) High accuracy requirements for the motion mechanism. The stroke of the motion mechanism usually reaches several meters or more, which easily produces cumulative errors that significantly affect the robot's global positioning accuracy. To guarantee global positioning accuracy, precision guide rails are required, and even grating-scale feedback may be needed, which sharply increases system cost; moreover, the cost rises in proportion to the stroke. (2) The robot system needs additional drivers to control the extra external axes, as well as the corresponding external-axis motors, which also raises system cost. For these reasons, in robotic welding systems for large structural parts, the auxiliary external-axis facilities often account for most of the system cost.
Summary of the Invention
To overcome the above defects of the prior art, the purpose of the present invention is to provide an automatic welding system and method for large structural parts based on a composite robot and three-dimensional vision, which uses a mobile robot to extend the working range of an industrial robot and then uses 3D vision technology for precise identification and positioning at the end effector, thereby completing welding tasks automatically, expanding the system's working range, improving its flexibility, and reducing its cost.
To achieve the above purpose, the technical solution of the present invention is as follows:
An automatic welding system for large structural parts based on a composite robot and three-dimensional vision comprises: a composite robot system composed of a mobile robot and a multi-degree-of-freedom robot installed on top of the mobile robot; a welding system installed at the end of the multi-degree-of-freedom robot for welding the target workpiece; and a three-dimensional vision system installed at the end of the multi-degree-of-freedom robot or on the welding system. The three-dimensional vision system performs global calibration and positioning of the composite robot system, the welding system and the target workpiece.
The mobile robot includes a mobile robot chassis. A housing is fastened on the chassis; inside the housing are a rechargeable battery pack that provides energy for the entire system, a power supply, the controller of the multi-degree-of-freedom robot, and the welder of the welding system. The rechargeable battery pack can also be connected to an external power supply through the power interface on the housing;
The multi-degree-of-freedom robot includes a multi-degree-of-freedom robot body; the body and its teach pendant are both signal-connected via cables to the controller in the housing;
The welding system includes a welder located in the housing and a welding gun connected to the welder; the welding gun is arranged at the end of the multi-degree-of-freedom robot body;
The three-dimensional vision system includes a 3D camera installed at the end of the multi-degree-of-freedom robot body or on the welding gun; the 3D camera is connected via cable to the industrial computer in the housing, and the industrial computer is connected via cable to the robot controller;
The measurement accuracy of the 3D camera is not lower than 0.5 mm, and its depth-map frame rate is greater than 1 frame per second;
The multi-degree-of-freedom robot has a robotic arm with six or more degrees of freedom, and the arm span of the robotic arm is in the range of 0.5 m-2 m.
The welding method based on the above automatic welding system for large structural parts based on a hybrid robot and three-dimensional vision comprises the following steps:
(1) establishing the system coordinate frames of the hybrid robot system composed of the mobile robot and the multi-degree-of-freedom robot;
(2) calibrating the relationships between the welding system and the three-dimensional vision system on the one hand and the coordinate frames of the hybrid robot system on the other;
(3) placing the target workpiece in the working area and aligning the workpiece coordinate frame with the global coordinate frame;
(4) planning the motion path of the mobile robot and the shooting positions and orientations of the 3D camera of the three-dimensional vision system;
(5) generating the motion-control and welding programs;
(6) performing the welding with the mobile robot.
Advantageous effects of the present invention:
(1) Targeting the large scale and poor consistency of large structural parts, the invention uses a hybrid robot together with three-dimensional vision, effectively improving system flexibility and reducing system cost.
(2) The combination of a hybrid robot with three-dimensional vision essentially couples coarse positioning based on a mobile platform with precise recognition and positioning based on high-accuracy 3D vision, extending the working range of the multi-degree-of-freedom robot in the X, Y and Z directions and enabling flexible welding of large structural parts. Compared with the traditional approach of extending the robot's range through external axes, this solution is cheaper; the working range it provides in the X and Y directions is far larger than what external axes offer and, conditions permitting, can be extended arbitrarily in X and Y at negligible extra cost. The solution therefore has a strong cost advantage for welding large and very large workpieces; it also occupies little floor space and does not interfere with hoisting the workpiece.
(3) Compared with conventional robotic welding systems that use contact seam finding or laser seam finding, the invention's use of three-dimensional vision gives a larger error tolerance. The requirements on the machining and fit-up accuracy of the workpiece, on the positioning accuracy of the mobile robot, and on the placement accuracy of the workpiece are all lower, which helps cut cost, improve flexibility and widen the range of applications.
(4) The invention automates the welding of large structural parts; compared with the currently prevalent manual welding, it saves labor, raises productivity and improves weld quality.
Brief Description of the Drawing
Fig. 1 is a structural diagram of the system of the present invention.
Detailed Description of the Embodiments
The present invention is described in detail below with reference to the drawing.
Referring to Fig. 1, the present invention proposes an automatic welding system and method for large structural parts based on a hybrid robot and three-dimensional vision: a mobile robot extends the working range of an industrial robot, and 3D vision then performs precise end-point recognition and positioning so that the welding task is completed automatically, enlarging the system's working range, improving its flexibility and reducing its cost.
An automatic welding system for large structural parts based on a hybrid robot and three-dimensional vision comprises a hybrid robot system composed of a mobile robot 3 and a multi-degree-of-freedom robot 2 mounted on top of the mobile robot 3; a welding system mounted at the end of the multi-degree-of-freedom robot 2 for welding a target workpiece 1; and a three-dimensional vision system mounted at the end of the multi-degree-of-freedom robot 2 or on the welding system, which performs global calibration and positioning of the hybrid robot system, the welding system and the target workpiece.
The mobile robot 3 comprises a mobile robot chassis 13, which consists of a motion module, a control module, navigation sensors and structural parts. A housing 15 is firmly mounted on the chassis 13. Inside the housing 15 are a rechargeable battery pack 11 and a power supply 12 that power the whole system, the controller 10 of the multi-degree-of-freedom robot 2, and the welding machine 9 of the welding system. The rechargeable battery pack 11 can also be connected to external power through a power interface 14 on the housing 15; when the system is working, it can be powered either by the battery pack 11 or by the power supply 12.
The mobile robot 3 is a mobile robot platform with global navigation and positioning, with a spatial positioning accuracy better than 20 mm. Global navigation provides coarse positioning and may use the electromagnetic method, the QR-code method, visual SLAM, visual tracking, inertial navigation, or a combination of several of these methods. When electromagnetic or QR-code navigation is used, corresponding guide lines 4 are laid on the floor. When visual tracking is used to position the mobile robot, one or more tracking targets 16 are printed on the housing 15 of the mobile robot 3, and additional cameras determine the robot's position by imaging the targets. Alternatively, the targets may be specially designed luminous or reflective structures mounted directly on the mobile robot.
The mobile robot 3 is preferably a mobile platform with a lifting function. The robot controller 10 comprises the controller that commands the motion of the robot's motors and the drives that power them, collectively referred to as the controller.
The multi-degree-of-freedom robot 2 comprises a multi-degree-of-freedom robot body 6 rigidly mounted on the mobile robot 3; the body 6 and the robot's teach pendant are both connected by cables to the controller 10 inside the housing 15. The multi-degree-of-freedom robot 2 is carried by the mobile robot 3, which extends its working range. The multi-degree-of-freedom robot 2 preferably has an arm with six or more degrees of freedom; the arm is an industrial or collaborative robot, and its reach is preferably in the range of 0.5 m to 2 m.
The welding system comprises the welding machine 9 located inside the housing 15, the welding torch 7 connected to the welding machine 9, and the other components needed for a complete welding process; the torch 7 is mounted at the end of the multi-degree-of-freedom robot body 6. The other components include a wire feeder, welding wire, a water tank, shielding gas and its storage, and an air compressor.
The three-dimensional vision system comprises a 3D camera 5 mounted at the end of the multi-degree-of-freedom robot body 6, or on the welding torch 7, or anywhere that lets the camera conveniently image the target workpiece. The 3D camera 5 is connected by cable to an industrial PC 8 on the housing 15; the industrial PC 8 may be installed at any convenient position on the housing 15 and is connected by cable to the robot controller 10.
The 3D camera 5 acquires three-dimensional feature information of the workpiece to be welded, with a measurement accuracy no worse than 0.5 mm and a depth-map frame rate above one frame per second. The 3D camera 5 is a low-power, compact, lightweight camera, preferably with a laser light source for better resistance to ambient light, and preferably a MEMS-based structured-light 3D camera to meet these requirements. The 3D camera 5 is fitted with a protective device so that it works normally despite heat, spatter and fumes.
The target workpiece 1 is a large metal structural part suitable for welding, meaning that at least one of its dimensions is within 5 m to 500 m. The target workpiece 1 rests on a base plane, a flat surface without undulations, preferably horizontal. The mobile robot 3, carrying the multi-degree-of-freedom robot 2, the three-dimensional vision system and the welding system, moves around the target workpiece on the base plane, performs coarse positioning through its navigation function, and extends its working range perpendicular to the base plane through its lifting function.
The welding method based on the automatic welding system for large structural parts based on a hybrid robot and three-dimensional vision comprises the following steps:
Step (1): establish the system coordinate frames of the hybrid robot system composed of the mobile robot 3 and the multi-degree-of-freedom robot 2.
Step (2): calibrate the relationships between the welding system and the three-dimensional vision system on the one hand and the coordinate frames of the hybrid robot system on the other.
Step (3): place the target workpiece 1 in the working area and align the workpiece coordinate frame with the global coordinate frame.
Step (4): plan the motion path of the mobile robot 3 and the shooting positions and orientations of the 3D camera 5 of the three-dimensional vision system.
Step (5): generate the motion-control and welding programs.
Step (6): perform the welding with the mobile robot 3.
Step (1) is specifically as follows:
(1.1) The mobile robot 3, carrying the multi-degree-of-freedom robot 2, the three-dimensional vision system and the welding system, moves around the target workpiece 1 on the base plane, performs coarse positioning through its navigation function, and extends its working range perpendicular to the base plane through its lifting function.
The coarse-positioning navigation methods include the electromagnetic method, the QR-code method, visual SLAM, visual tracking, inertial navigation, and combinations of these. When electromagnetic or QR-code navigation is used, corresponding guide lines 4 are laid on the floor. When visual tracking is used to position the mobile robot 3, one or more tracking targets 16 are printed on the housing 15 of the mobile robot 3, and two or more additional cameras determine the robot's position by imaging the targets. Alternatively, the targets may be specially designed luminous or reflective structures mounted directly on the mobile robot, so that cameras at different angles can observe a target from any direction.
(1.2) Establish the coordinate frames of the hybrid robot system. A coordinate frame of the mobile robot 3 is established with its origin at a fixed point on the base plane, its XY plane coinciding with the base plane and its Z axis perpendicular to the base plane and pointing upward; this frame is the global base frame. A coordinate frame of the multi-degree-of-freedom robot 2 is established with its origin on the arm's base, for convenient operation, and its X, Y and Z directions coinciding with those of the frame of the mobile robot 3. A coordinate frame of the three-dimensional vision system is established with its origin at the optical center of the lens of the 3D camera 5, its X axis pointing from the optical center of the camera lens toward the optical center of the projection system, its Y axis perpendicular to X and parallel to the camera's imaging chip, and its Z axis perpendicular to the XY plane and pointing straight ahead, away from the camera.
Step (2) comprises three parts:
(2.1) Calibrate the coordinate relationship between the multi-degree-of-freedom robot 2 and the 3D camera 5 to obtain the transformation from the camera's vision-system frame to the frame of the multi-degree-of-freedom robot 2.
(2.11) First, make sure the 3D camera 5 itself has been calibrated, and obtain the camera's intrinsic parameters: focal length, principal point, pixel size, resolution and distortion coefficients.
(2.12) Next, calibrate the 3D camera 5 against the multi-degree-of-freedom robot 2. Define the homogeneous transformation from the end of the multi-degree-of-freedom robot 2 to the robot's base as robotT_base, and similarly the transformation from the 3D camera 5 to the target object as camT_obj. With the 3D camera 5 mounted on the multi-degree-of-freedom robot 2, photograph a calibration board whose point coordinates are known and record the position and orientation of the multi-degree-of-freedom robot 2; keeping the board fixed, change the position and orientation of the multi-degree-of-freedom robot 2 several times and photograph the board each time. Two different shots satisfy: robot1T_base · cam1T_robot1 · objT_cam1 = robot2T_base · cam2T_robot2 · objT_cam2
Since the coordinate relationship between the camera and the robot end does not change, cam1T_robot1 = cam2T_robot2 = camT_robot,
we have: ((robot2T_base)^-1 · robot1T_base) · camT_robot = camT_robot · (objT_cam2 · (objT_cam1)^-1)
Taking several shots and solving the above equation yields the coordinate transformation camT_robot between the 3D camera 5 and the multi-degree-of-freedom robot 2.
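The relation above is the classic hand-eye equation A·X = X·B, built from relative motions between shots. The sketch below, assuming poses are given as 4×4 homogeneous NumPy matrices with the hypothetical input names `robot_T_base` and `obj_T_cam`, solves it with the standard two-stage approach (rotation first via orthogonal Procrustes on the relative rotation axes, then translation by linear least squares); it illustrates the equation and is not the patent's own solver.

```python
import numpy as np

def rotation_log(R):
    """Axis-angle vector (log map) of a 3x3 rotation matrix."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = np.linalg.norm(v)
    return np.zeros(3) if n < 1e-12 else (angle / n) * v

def solve_hand_eye(robot_T_base, obj_T_cam):
    """Solve A·X = X·B for the fixed camera-to-end transform X = camT_robot.
    robot_T_base[i]: 4x4 end-to-base pose at shot i.
    obj_T_cam[i]:    4x4 board-to-camera pose at shot i."""
    As, Bs = [], []
    for i in range(len(robot_T_base) - 1):
        # Relative motions between consecutive shots, as in the text:
        # A = (robot2T_base)^-1 · robot1T_base, B = objT_cam2 · (objT_cam1)^-1
        As.append(np.linalg.inv(robot_T_base[i + 1]) @ robot_T_base[i])
        Bs.append(obj_T_cam[i + 1] @ np.linalg.inv(obj_T_cam[i]))
    # Stage 1: the rotation of X maps each B rotation axis onto the A axis.
    Ka = np.stack([rotation_log(A[:3, :3]) for A in As])
    Kb = np.stack([rotation_log(B[:3, :3]) for B in Bs])
    U, _, Vt = np.linalg.svd(Ka.T @ Kb)          # orthogonal Procrustes
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    # Stage 2: translation from (R_A - I)·t_X = R_X·t_B - t_A, stacked.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([R @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R, t
    return X
```

At least three shot pairs with non-parallel relative rotation axes are needed for a unique solution, which is why the procedure changes the arm's orientation, not just its position, between shots.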
The hand-eye transformation camT_tool of the 3D camera 5 is:
(equation image PCTCN2022106029-appb-000001)
(2.13) A further closed-loop step yields the transformation between the frame of the 3D camera 5 and the tool frame at the tip of the welding torch 7. Preferably, the following procedure is added to improve calibration accuracy. Touch known points on the calibration board with the torch tip to obtain their positions P′(x,y,z) in the tool frame of the multi-degree-of-freedom robot 2; photograph the board with the 3D camera to obtain the positions P″(x,y,z) of the same known points in the 3D camera frame. Substitute the energy function P, which measures the spatial distance between P′(x,y,z) and P″(x,y,z), into the above optimization; with the camT_tool obtained in the preceding step as the initial value, iterate in closed loop to solve for the optimal hand-eye matrix camT_tool. In what follows, the hand-eye matrix always refers to the optimal camT_tool obtained in this step, whether or not the closed-loop refinement is used.
The energy function is: P = |P′_1(x,y,z)P″_1(x,y,z)| + |P′_2(x,y,z)P″_2(x,y,z)| + ...
where |P′_1(x,y,z)P″_1(x,y,z)| denotes the Euclidean distance from point P′_1(x,y,z) to P″_1(x,y,z), and the subscripts index the points.
(2.2) Calibrate the robot's TCP tool frame to obtain the positional transformation of the tip of the welding torch 7 in the coordinate system of the multi-degree-of-freedom robot 2.
Robot TCP calibration methods include direct input, the four-point method and the six-point method. In this embodiment the four-point method is used:
(2.21) Create a new robot TCP frame.
(2.22) Place a fixed point, usually a conical tip, in the workspace of the multi-degree-of-freedom robot 2.
(2.23) Control the pose of the multi-degree-of-freedom robot 2 so that the TCP coincides with the fixed point in space.
(2.24) Repeat the above step three more times, each time changing the pose of the multi-degree-of-freedom robot 2 so that the TCP reaches the same point.
(2.25) Using the condition that the four TCP points have identical world coordinates, set up a system of equations and solve it, thereby calibrating the position of the tool frame and obtaining the pose transformation toolT_base of the torch 7 at the end of the welding system in the coordinate system of the multi-degree-of-freedom robot 2.
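The condition in (2.25) — that every flange pose maps one unknown tool offset onto one unknown fixed point — is linear in both unknowns, so the system can be solved directly. A minimal sketch, with hypothetical input names and 4×4 NumPy poses:

```python
import numpy as np

def calibrate_tcp(flange_T_base):
    """Four-point TCP calibration: each flange pose (R_i, t_i) must satisfy
    R_i @ p + t_i = w, where p is the tool offset in the flange frame and w
    is the fixed point touched in the base frame.  Stacking
    [R_i  -I] @ [p; w] = -t_i over all poses and least-squaring the result
    solves for both unknowns at once."""
    A_rows, b_rows = [], []
    for T in flange_T_base:
        R, t = T[:3, :3], T[:3, 3]
        A_rows.append(np.hstack([R, -np.eye(3)]))
        b_rows.append(-t)
    sol, *_ = np.linalg.lstsq(np.vstack(A_rows), np.concatenate(b_rows),
                              rcond=None)
    return sol[:3], sol[3:]   # (tool offset p, touched point w)
```

The four poses must differ enough in orientation for the stacked system to have full rank; with noisy touches, more than four poses simply tighten the least-squares fit.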
(2.3) Calibrate the multi-degree-of-freedom robot frame against the mobile robot frame to obtain the matrix that transforms the multi-degree-of-freedom robot frame into the mobile robot frame.
With the 3D camera mounted on the robot, photograph a calibration board whose point coordinates are known, and record the position and orientation of the mobile robot 3; keeping the board fixed, change the position of the mobile robot 3 several times, each change as large as practical, then adjust the arm so the camera photographs the board, using the already obtained camera-robot transformation camT_robot. Two different shots are expressed as: baseT_BASE = robot1T_base · camT_robot · objT_cam1, baseT_BASE = robot2T_base · camT_robot · objT_cam2
Solving the above equations over several shots yields the transformation baseT_BASE from the base frame of the mobile robot 3 to the global base frame of the system.
In step (3), the origin of the workpiece frame is placed at a distinct feature of the work, preferably the intersection point of several planes, or a corner point, for easy alignment. The X, Y and Z directions of the workpiece frame should follow the directions of the main structural features as closely as possible; preferably, the longest dimension is taken as the X direction for convenient placement. Alternatively, two or more stops distributed along the X direction may be provided so that the workpiece can be butted against the X axis. That is: first, place the target workpiece 1 in the working area on the base plane, then adjust the supports so that the X, Y and Z directions of the workpiece frame essentially coincide with those of the base frame, "essentially" meaning that their angular error is within 2°.
Next, photograph the position of the workpiece-frame origin with the 3D camera 5, extract the origin feature, and transform the origin's coordinates into the base frame of the robot system, obtaining the transformation between the workpiece frame and the base frame and hence the workpiece's position in the base frame.
Step (3) is implemented as follows:
(3.1) Drive the mobile robot 3 near the target point, and control the multi-degree-of-freedom robot 2 so that the 3D camera 5 can image the target point;
(3.2) capture a point cloud and upload it to the industrial PC 8 for processing;
(3.3) process the data according to the chosen type of workpiece-frame origin. In this embodiment, the intersection of three fillet-joined faces is taken as the example: fit planes to the 3D point cloud, find the three planes, and compute their intersection point O(x,y,z), which at this stage is expressed in the camera frame;
(3.4) transform O(x,y,z) into the base frame: O_base(x,y,z) = camT_robot · robotT_base · O(x,y,z)
Since the workpiece frame differs from the base frame only by a translation, the transformation from workpiece coordinates to base coordinates in the X, Y and Z directions is (-X_T, -Y_T, -Z_T).
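The plane fitting and intersection in (3.3) reduce to two small linear-algebra steps, sketched below with NumPy. Segmenting the cloud into the three faces, which a real pipeline would need beforehand, is assumed already done; the function names are illustrative.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an Nx3 cloud: returns (n, d) such that
    n·x + d = 0 for points x on the plane.  SVD on the centered cloud gives
    the normal as the direction of least variance."""
    c = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - c)
    n = Vt[-1]
    return n, -float(n @ c)

def workpiece_origin(face_clouds):
    """Intersection point O of the three fitted planes: stack the normals
    into N and solve the 3x3 linear system N @ O = -d."""
    planes = [fit_plane(pc) for pc in face_clouds]
    N = np.stack([n for n, _ in planes])
    d = np.array([d for _, d in planes])
    return np.linalg.solve(N, -d)
```

The 3×3 solve is well conditioned exactly when the three faces are not close to parallel, which matches the patent's preference for a three-plane corner as the origin feature.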
Alternatively, touch the workpiece-frame origin with the torch tip of the multi-degree-of-freedom robot, then transform the origin's coordinates into the base frame of the robot system, obtaining the transformation between the workpiece frame and the base frame and hence the workpiece's position in the base frame. The implementation is as above.
When the robot program is generated offline, the above transformation is used to translate the position information of the robot program generated in the workpiece frame.
In step (4), the robot's motion path and the camera's shooting positions and orientations are planned. A single shot may cover one or more welding features, or several shots may cover one welding feature; a welding feature is a target welding location and is a point, a straight line or a curve. A shooting position and orientation should put the camera where it can conveniently image the target feature, namely: the camera is within its effective working distance, the target feature lies within the camera's field of view, and the principal normal of the region containing the target feature is as parallel as possible to the camera's Z axis, for the best imaging. The robot's motion path is the shortest safe path that brings the robot to a target position at which the arm's reach and degrees of freedom allow the camera to reach the target shooting position and orientation.
The planning of the robot's motion path and the camera's shooting positions and orientations may be done with offline programming software, parametrically, or by teaching.
In step (5), generating the robot motion-control and welding programs covers: the motion-control program of the mobile robot 3, the control program of the multi-degree-of-freedom robot 2, and the welding program.
Preferably, the multi-degree-of-freedom robot 2 acts as the master, communicating with and controlling the mobile robot 3, the 3D camera 5 and the welding system.
Alternatively, the industrial PC 8 acts as the master, controlling the mobile robot 3, the multi-degree-of-freedom robot 2 and the welding system.
Alternatively, an external PLC acts as the master, controlling the mobile robot 3, the multi-degree-of-freedom robot 2 and the welding system.
The control program of the multi-degree-of-freedom robot 2 comprises the motion-control program of the arm, the communication program with the camera, and the communication programs with the welding system and with the mobile robot.
Programs may be produced as manually written offline template programs, as programs generated by offline programming software driven by the digital model, or as teach programs produced with the teach pendant. The generated robot programs follow the syntax rules and data formats of the robot brand concerned and can run directly on robots of that brand. For manually written template programs or programs generated offline from the digital model, the generated robot program must first be downloaded to the robot controller before running, by wired or wireless transfer, or by copying with a storage device. In another embodiment of the invention, a host computer controls the robot online in real time, and no program needs to be downloaded to the robot controller.
Step (6) comprises the following sub-steps:
(6.1) First, return the multi-degree-of-freedom robot 2 and the mobile robot 3 to their coordinate zero points;
(6.2) next, drive the mobile robot 3 to the first target position P1;
(6.3) then, at P1, move the multi-degree-of-freedom robot 2 to the first shooting position, take a shot, and send the data to the industrial PC 8 for processing; if there is another shooting position at P1, move on to it, shoot and upload, until all shooting tasks at P1 are done;
(6.4) fuse all the point-cloud information at P1 on the industrial PC 8 and identify the key information of the welding features. This identification takes the 3D point cloud as input and uses geometric algorithms or neural-network methods to extract the key point information of the welding features; the key point information includes the start point, end point and direction of a line segment; spline parameters; arc parameters; and the coordinates of several discrete points.
Alternatively, recognition may be done first, extracting the key feature information, and the features then fused to obtain new, fused key feature information, which likewise comprises the start point, end point and direction of a line segment; spline parameters; arc parameters; and the coordinates of several discrete points.
All point positions and directions have been transformed into the system's base frame.
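For a straight seam, the key point information named in (6.4) — start point, end point and direction — can be recovered from the fused seam points by a principal-direction fit. A minimal geometric sketch, assuming the input has already been segmented to one seam and expressed in the base frame:

```python
import numpy as np

def seam_key_info(seam_pts):
    """Fit a 3D line to Nx3 seam points: the principal direction of the
    centered cloud is the seam direction, and projecting the points onto
    that direction gives the start and end of the segment."""
    c = seam_pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(seam_pts - c)
    d = Vt[0]                        # unit direction of largest variance
    s = (seam_pts - c) @ d           # signed position along the line
    return c + s.min() * d, c + s.max() * d, d
```

The sign of the fitted direction is arbitrary, so a downstream trajectory planner would orient it to match the desired welding direction; curved seams would instead be fitted with the spline or arc parameters the text mentions.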
(6.5) From the above key point information, compute the torch trajectory, which comprises the torch's position and orientation information;
(6.6) send the trajectory information of the torch 7 to the robot controller 10; upon receiving the torch 7 trajectory, the robot controller 10 guides the mobile robot 3 into place and invokes the corresponding welding program to weld;
(6.7) the mobile robot 3 moves to position P2, and the loop repeats steps (6.3) to (6.6) until welding is complete, or stops.
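The station loop of (6.2)-(6.7) separates cleanly from the hardware: the mobile base, camera, industrial PC processing and controller can be injected as callables. The sketch below is structural only; all four interfaces are hypothetical stand-ins for the real devices.

```python
def run_stations(stations, move_and_shoot, fuse_and_plan, weld):
    """Visit each coarse station P_k in turn: collect every shot there,
    fuse the clouds into a torch trajectory, then hand the trajectory to
    the welder.  Returns the per-station weld results."""
    results = []
    for p in stations:
        clouds = move_and_shoot(p)           # (6.2)-(6.3): go to P_k, take all shots
        trajectory = fuse_and_plan(clouds)   # (6.4)-(6.5): fuse, extract, plan
        results.append(weld(p, trajectory))  # (6.6): guide into place and weld
    return results
```

Keeping the loop free of device code makes it straightforward to swap the master controller (arm, industrial PC, or external PLC, as step (5) allows) without touching the sequencing logic.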
It should be emphasized that when the term "comprising/including" is used in this specification, it is understood to specify the presence of the stated features, integers, steps or components, without excluding the presence or addition of one or more other features, integers, steps, components or groups thereof.
The features of the methods described above and below may be implemented in software and executed on a data processing system or other processing device by running computer-executable instructions. The instructions may be program code loaded into memory (e.g., RAM) from a storage medium, or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software, or by a combination of hardwired circuitry and software.

Claims (7)

  1. An automatic welding method for large structural parts based on a hybrid robot and three-dimensional vision, the welding system on which it is based comprising a hybrid robot system composed of a mobile robot (3) and a multi-degree-of-freedom robot (2) mounted on top of the mobile robot (3); a welding system mounted at the end of the multi-degree-of-freedom robot (2) for welding a target workpiece (1); and a three-dimensional vision system mounted at the end of the multi-degree-of-freedom robot (2) or on the welding system, the three-dimensional vision system performing global calibration and positioning of the hybrid robot system, the welding system and the target workpiece;
    the mobile robot (3) comprising a mobile robot chassis (13) on which a housing (15) is firmly mounted, the housing (15) containing a rechargeable battery pack (11) that powers the whole system, a power supply (12), the controller (10) of the multi-degree-of-freedom robot (2) and the welding machine (9) of the welding system, the rechargeable battery pack (11) also being connectable to external power through a power interface (14) on the housing (15);
    the multi-degree-of-freedom robot (2) comprising a multi-degree-of-freedom robot body (6), the body (6) and its teach pendant both being connected by cables to the controller (10) inside the housing (15);
    the welding system comprising the welding machine (9) located inside the housing (15) and a welding torch (7) connected to the welding machine (9), the torch (7) being mounted at the end of the multi-degree-of-freedom robot body (6);
    the three-dimensional vision system comprising a 3D camera (5) mounted at the end of the multi-degree-of-freedom robot body (6) or on the welding torch (7), the 3D camera (5) being connected by cable to an industrial PC (8) on the housing (15), and the industrial PC (8) being connected by cable to the robot controller (10);
    the 3D camera (5) having a measurement accuracy no worse than 0.5 mm and a depth-map frame rate above one frame per second;
    the multi-degree-of-freedom robot (2) having a robotic arm with six or more degrees of freedom, the arm's reach being in the range of 0.5 m to 2 m;
    characterized in that the welding method comprises the following steps:
    (1) establishing the system coordinate frames of the hybrid robot system composed of the mobile robot (3) and the multi-degree-of-freedom robot (2);
    (2) calibrating the relationships between the welding system and the three-dimensional vision system on the one hand and the coordinate frames of the hybrid robot system on the other;
    (3) placing the target workpiece (1) in the working area and aligning the workpiece coordinate frame with the global coordinate frame;
    (4) planning the motion path of the mobile robot (3) and the shooting positions and orientations of the 3D camera (5) of the three-dimensional vision system;
    (5) generating the motion-control and welding programs;
    (6) performing the welding with the mobile robot (3);
    step (2) comprising three parts:
    (2.1) calibrating the coordinate relationship between the multi-degree-of-freedom robot (2) and the 3D camera (5) to obtain the transformation from the camera's vision-system frame to the frame of the multi-degree-of-freedom robot (2);
    (2.11) first, making sure the 3D camera (5) itself has been calibrated and obtaining the camera's intrinsic parameters: focal length, principal point, pixel size, resolution and distortion coefficients;
    (2.12) next, calibrating the 3D camera (5) against the multi-degree-of-freedom robot (2): the homogeneous transformation from the end of the multi-degree-of-freedom robot (2) to the robot base being defined as robotT_base, and similarly the transformation from the 3D camera (5) to the target object as camT_obj; with the 3D camera (5) mounted on the multi-degree-of-freedom robot (2), photographing a calibration board whose point coordinates are known and recording the position and orientation of the multi-degree-of-freedom robot (2); keeping the board fixed, changing the position and orientation of the multi-degree-of-freedom robot (2) several times and photographing the board, two different shots being expressed as:
    robot1T_base · cam1T_robot1 · objT_cam1 = robot2T_base · cam2T_robot2 · objT_cam2  (1)
    since the coordinate transformation between the 3D camera and the end of the multi-degree-of-freedom robot does not change,
    cam1T_robot1 = cam2T_robot2 = camT_robot  (2)
    we have: ((robot2T_base)^-1 · robot1T_base) · camT_robot = camT_robot · (objT_cam2 · (objT_cam1)^-1)  (3)
    solving the above equation over several shots yields the coordinate transformation camT_robot between the 3D camera (5) and the multi-degree-of-freedom robot (2);
    the hand-eye transformation matrix camT_tool of the 3D camera (5) being:
    (equation image PCTCN2022106029-appb-100001)  (4)
    (2.13) further performing closed-loop control to obtain the transformation between the frame of the 3D camera (5) and the tool frame at the tip of the welding torch (7): touching known points on the calibration board with the torch tip, giving their positions P′(x,y,z) in the frame of the multi-degree-of-freedom robot (2); photographing the board with the 3D camera, giving the positions P″(x,y,z) of the same known points in the 3D camera frame; substituting the energy function P, which measures the spatial distance between P′(x,y,z) and P″(x,y,z), into formula (4), and, with the hand-eye matrix camT_tool so obtained as the initial value, iterating in closed loop to solve for the optimal hand-eye matrix camT_tool;
    the energy function being: P = |P′_1(x,y,z)P″_1(x,y,z)| + |P′_2(x,y,z)P″_2(x,y,z)| + ...
    where |P′_1(x,y,z)P″_1(x,y,z)| denotes the Euclidean distance from point P′_1(x,y,z) to P″_1(x,y,z), the subscripts indexing the points;
    (2.2) calibrating the TCP tool frame of the multi-degree-of-freedom robot (2) to obtain the positional transformation of the tip of the welding torch (7) in the coordinate system of the multi-degree-of-freedom robot (2);
    TCP calibration methods including direct input, the four-point method and the six-point method, the four-point method being specifically:
    (2.21) creating a new TCP tool frame for the multi-degree-of-freedom robot;
    (2.22) placing a fixed point, usually a conical tip, in the workspace of the multi-degree-of-freedom robot (2);
    (2.23) controlling the pose of the multi-degree-of-freedom robot (2) so that the TCP point coincides with the fixed point in space;
    (2.24) repeating the above step three more times, changing the pose of the multi-degree-of-freedom robot (2) so that the TCP point reaches the same point;
    (2.25) using the condition that the four TCP points have identical world coordinates to set up and solve a system of equations, thereby calibrating the position of the TCP tool frame and obtaining the pose transformation toolT_base of the torch (7) at the end of the welding system in the coordinate system of the multi-degree-of-freedom robot (2);
    (2.3) calibrating the system frame of the multi-degree-of-freedom robot against the mobile robot frame to obtain the matrix that transforms the system frame of the multi-degree-of-freedom robot into the mobile robot frame;
    with the 3D camera mounted on the robot, photographing a calibration board whose point coordinates are known and recording the position and orientation of the mobile robot (3); keeping the board fixed, changing the position of the mobile robot (3) several times, each change as large as practical, then adjusting the arm to photograph the board, using the already obtained coordinate transformation camT_robot between the 3D camera and the end of the multi-degree-of-freedom robot, two different shots being expressed as:
    baseT_BASE = robot1T_base · camT_robot · objT_cam1, baseT_BASE = robot2T_base · camT_robot · objT_cam2
    solving the above equations over several shots yields the transformation baseT_BASE from the base frame of the mobile robot (3) to the global base frame of the system.
  2. The welding method according to claim 1, characterized in that step (1) is specifically:
    (1.1) the mobile robot (3), carrying the multi-degree-of-freedom robot (2), the three-dimensional vision system and the welding system, moves around the target workpiece (1) on the base plane, performs coarse positioning through its navigation function, and extends its working range perpendicular to the base plane through its lifting function;
    the coarse-positioning navigation methods comprise the electromagnetic method, the QR-code method, visual SLAM, visual tracking, inertial navigation, and combinations thereof; when electromagnetic or QR-code navigation is used, corresponding guide lines (4) are laid on the floor; when visual tracking is used to position the mobile robot (3), one or more tracking targets (16) are printed on the housing (15) of the mobile robot (3), and additional cameras determine the mobile robot's position by imaging the targets;
    (1.2) a coordinate frame of the mobile robot (3) is established with its origin on the base plane, its XY plane coinciding with the base plane and its Z axis perpendicular to the base plane and pointing upward, this frame being the global base frame; a coordinate frame of the multi-degree-of-freedom robot (2) is established with its origin on the arm's base and its X, Y and Z directions coinciding with those of the frame of the mobile robot (3); a coordinate frame of the three-dimensional vision system is established with its origin at the optical center of the lens of the 3D camera (5), its X axis pointing from the optical center of the camera lens toward the optical center of the projection system, its Y axis perpendicular to X and parallel to the camera's imaging chip, and its Z axis perpendicular to the XY plane and pointing straight ahead, away from the camera.
  3. The welding method according to claim 1, characterized in that in step (3) the origin of the workpiece frame is placed at a distinct feature of the work, namely the intersection point of several planes, or a corner point, for easy alignment; the X, Y and Z directions of the workpiece frame follow the directions of the main structural features as closely as possible; that is: first, the target workpiece (1) is placed in the working area on the base plane, then the supports are adjusted so that the X, Y and Z directions of the workpiece frame essentially coincide with those of the base frame, "essentially" meaning that their angular error is within 2°; next, the position of the workpiece-frame origin is photographed with the 3D camera (5), the origin feature extracted, and the origin's coordinates transformed into the base frame, obtaining the transformation between the workpiece frame and the base frame and hence the workpiece's position in the base frame.
  4. The welding method according to claim 1, characterized in that step (3) is implemented as follows:
    (3.1) the mobile robot (3) is driven near the target point, and the multi-degree-of-freedom robot (2) is controlled so that the 3D camera (5) can image the target point;
    (3.2) a point cloud is captured and uploaded to the industrial PC (8) for processing;
    (3.3) the data are processed according to the chosen type of workpiece-frame origin: planes are fitted to the 3D point cloud, the three planes found, and their intersection point O(x,y,z) computed, O(x,y,z) at this stage being expressed in the camera frame;
    (3.4) O(x,y,z) is transformed into the base frame: O_base(x,y,z) = camT_robot · robotT_base · O(x,y,z)
    since the workpiece frame differs from the base frame only by a translation, the transformation from workpiece coordinates to base coordinates in the X, Y and Z directions is (-X_T, -Y_T, -Z_T).
  5. The welding method according to claim 1, characterized in that in step (4) the motion path of the mobile robot and the camera's shooting positions and orientations are planned; a single shot corresponds to one or more welding features, or several shots correspond to one welding feature; a welding feature is a target welding location and is a point, a straight line or a curve; a shooting position and orientation puts the camera where it can conveniently image the welding feature, namely: the camera is within its effective working distance, the welding feature lies within the camera's field of view, and the principal normal of the region containing the welding feature is as parallel as possible to the camera's Z axis; the robot's motion path is the shortest safe path that brings the robot to a target position at which the arm's reach and degrees of freedom allow the camera to reach the target shooting position and orientation;
    the planning of the robot's motion path and the camera's shooting positions and orientations may be done with offline programming software, parametrically, or by teaching.
  6. The welding method according to claim 1, characterized in that in step (5) generating the robot motion-control and welding programs covers: the motion-control program of the mobile robot (3), the control program of the multi-degree-of-freedom robot (2), and the welding program;
    the control program of the multi-degree-of-freedom robot (2) comprising the motion-control program of the arm, the communication program with the camera, and the communication programs with the welding system and with the mobile robot;
    programs being produced as manually written offline template programs, as programs generated by offline programming software driven by the digital model, or as teach programs produced with the teach pendant; the generated robot programs follow the syntax rules and data formats of the robot brand concerned and run directly on robots of that brand; for manually written template programs or programs generated offline from the digital model, the generated robot program is first downloaded to the robot controller before running, by wired or wireless transfer, or by copying with a storage device.
  7. The welding method according to claim 1, characterized in that step (6) comprises the following sub-steps:
    (6.1) first, the multi-degree-of-freedom robot (2) and the mobile robot (3) are returned to their coordinate zero points;
    (6.2) next, the mobile robot (3) is driven to the first target position P1;
    (6.3) then, at P1, the multi-degree-of-freedom robot (2) is moved to the first shooting position, a shot is taken and the data sent to the industrial PC (8) for processing; if there is another shooting position at P1, the robot moves on to it, shoots and uploads, until all shooting tasks at P1 are done;
    (6.4) all point-cloud information at P1 is fused on the industrial PC (8) and the key information of the welding features identified; this identification takes the 3D point cloud as input and uses geometric algorithms or neural-network methods to extract the key point information of the welding features; the key point information includes the start point, end point and direction of a line segment; spline parameters; arc parameters; and the coordinates of several discrete points;
    (6.5) from the above key point information, the torch trajectory is computed, comprising the torch's position and orientation information;
    (6.6) the trajectory information of the torch (7) is sent to the robot controller (10); upon receiving the torch (7) trajectory, the robot controller (10) guides the mobile robot (3) into place and invokes the corresponding welding program to weld;
    (6.7) the mobile robot (3) moves to position P2, and the loop repeats steps (6.3) to (6.6) until welding is complete, or stops.
PCT/CN2022/106029 2022-04-08 2022-07-15 一种复合机器人和三维视觉的大型结构件自动焊接***及方法 WO2023193362A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/041,129 US11951575B2 (en) 2022-04-08 2022-07-15 Automatic welding system and method for large structural parts based on hybrid robots and 3D vision
EP22871144.6A EP4279211A4 (en) 2022-04-08 2022-07-15 AUTOMATIC WELDING SYSTEM AND PROCESS FOR LARGE STRUCTURAL PARTS BASED ON HYBRID ROBOTS AND 3D VISION

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210362511.2A CN114434059B (zh) 2022-04-08 2022-04-08 复合机器人和三维视觉的大型结构件自动焊接***及方法
CN202210362511.2 2022-04-08

Publications (1)

Publication Number Publication Date
WO2023193362A1 true WO2023193362A1 (zh) 2023-10-12

Family

ID=81358983

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/106029 WO2023193362A1 (zh) 2022-04-08 2022-07-15 一种复合机器人和三维视觉的大型结构件自动焊接***及方法

Country Status (4)

Country Link
US (1) US11951575B2 (zh)
EP (1) EP4279211A4 (zh)
CN (1) CN114434059B (zh)
WO (1) WO2023193362A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117102725A (zh) * 2023-10-25 2023-11-24 湖南大学 一种钢混组合结构连接件焊接方法及***
CN117484508A (zh) * 2023-12-11 2024-02-02 嘉兴布鲁艾诺机器人有限公司 一种上下料用多关节机器人的智能控制***及方法
CN118023799A (zh) * 2024-04-11 2024-05-14 华南理工大学 一种基于相机的人形焊接机器人自主焊接作业及规划方法
CN118099503A (zh) * 2024-04-26 2024-05-28 无锡黎曼机器人科技有限公司 可实现多层级电池模块自动堆叠的生产***及其生产方法

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114434059B (zh) * 2022-04-08 2022-07-01 西安知象光电科技有限公司 复合机器人和三维视觉的大型结构件自动焊接***及方法
CN114749848A (zh) * 2022-05-31 2022-07-15 深圳了然视觉科技有限公司 一种基于3d视觉引导的钢筋焊接自动化***
CN118023786A (zh) * 2024-04-09 2024-05-14 河南威猛振动设备股份有限公司 一种多工位智能体感焊接设备及焊接方法
CN118061198A (zh) * 2024-04-18 2024-05-24 中国长江电力股份有限公司 用于水轮机顶盖熔覆加工的复合移动机器人自动编程方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014128840A (ja) * 2012-12-28 2014-07-10 Kanto Seiki Kk ロボット制御システム
CN112060103A (zh) * 2020-08-07 2020-12-11 北京卫星制造厂有限公司 一种可移动超快激光加工机器人装备及加工方法
CN112958959A (zh) * 2021-02-08 2021-06-15 西安知象光电科技有限公司 一种基于三维视觉的自动化焊接和检测方法
CN113634958A (zh) * 2021-09-27 2021-11-12 西安知象光电科技有限公司 一种基于三维视觉的大型结构件自动化焊接***及方法
US20220016776A1 (en) * 2020-07-17 2022-01-20 Path Robotics, Inc. Real time feedback and dynamic adjustment for welding robots
CN113954085A (zh) * 2021-09-08 2022-01-21 重庆大学 一种基于双目视觉与线激光传感数据融合的焊接机器人智能定位与控制方法
CN114434059A (zh) * 2022-04-08 2022-05-06 西安知象光电科技有限公司 复合机器人和三维视觉的大型结构件自动焊接***及方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2369845A1 (en) * 2002-01-31 2003-07-31 Braintech, Inc. Method and apparatus for single camera 3d vision guided robotics
SE1050763A1 (sv) * 2010-07-08 2010-07-12 Abb Research Ltd En metod för att kalibrera en mobil robot
CN105904107A (zh) * 2016-04-21 2016-08-31 大族激光科技产业集团股份有限公司 移动机器人激光打标***及激光打标方法
CN112958974A (zh) * 2021-02-08 2021-06-15 西安知象光电科技有限公司 一种基于三维视觉的可交互自动化焊接***
US20220258267A1 (en) * 2021-02-15 2022-08-18 Illinois Tool Works Inc. Helmet based weld tracking systems


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4279211A4

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117102725A (zh) * 2023-10-25 2023-11-24 湖南大学 一种钢混组合结构连接件焊接方法及***
CN117102725B (zh) * 2023-10-25 2024-01-09 湖南大学 一种钢混组合结构连接件焊接方法及***
CN117484508A (zh) * 2023-12-11 2024-02-02 嘉兴布鲁艾诺机器人有限公司 一种上下料用多关节机器人的智能控制***及方法
CN118023799A (zh) * 2024-04-11 2024-05-14 华南理工大学 一种基于相机的人形焊接机器人自主焊接作业及规划方法
CN118099503A (zh) * 2024-04-26 2024-05-28 无锡黎曼机器人科技有限公司 可实现多层级电池模块自动堆叠的生产***及其生产方法

Also Published As

Publication number Publication date
EP4279211A4 (en) 2024-05-01
CN114434059A (zh) 2022-05-06
US20230390853A1 (en) 2023-12-07
EP4279211A1 (en) 2023-11-22
US11951575B2 (en) 2024-04-09
CN114434059B (zh) 2022-07-01

Similar Documents

Publication Publication Date Title
WO2023193362A1 (zh) 一种复合机器人和三维视觉的大型结构件自动焊接***及方法
CN108274092B (zh) 基于三维视觉与模型匹配的坡口自动切割***及切割方法
CN109591011B (zh) 复合材料三维结构件单边缝合激光视觉路径自动跟踪方法
CN110666798B (zh) 一种基于透视变换模型的机器人视觉标定方法
CN111127568B (zh) 一种基于空间点位信息的相机位姿标定方法
CN114289934B (zh) 一种基于三维视觉的大型结构件自动化焊接***及方法
WO2020024178A1 (zh) 一种手眼标定方法、***及计算机存储介质
WO2018043525A1 (ja) ロボットシステム、ロボットシステム制御装置、およびロボットシステム制御方法
CN111781894B (zh) 利用机器视觉进行装配工具空间定位及姿态导航的方法
CN110936369B (zh) 一种基于双目视觉和机械臂的大型工件位姿精确测量与抓取的方法
CN113246142B (zh) 一种基于激光引导的测量路径规划方法
CN113146620A (zh) 基于双目视觉的双臂协作机器人***和控制方法
CN114043087A (zh) 一种三维轨迹激光焊接焊缝跟踪姿态规划方法
CN112958960B (zh) 一种基于光学靶标的机器人手眼标定装置
CN115042175A (zh) 一种机器人机械臂末端姿态的调整方法
CN114643577B (zh) 一种通用型机器人视觉自动标定装置和方法
CN115619877A (zh) 单目线激光传感器与二轴机床***的位置关系标定方法
CN113324538B (zh) 一种合作目标远距离高精度六自由度位姿测量方法
CN114888501A (zh) 一种基于三维重建的无示教编程建筑构件焊接装置及方法
CN113276115A (zh) 一种无需机器人运动的手眼标定方法及装置
Liu et al. Vehicle automatic charging system guided electric by 3d vision and f/t sensor
CN111283676B (zh) 三轴机械臂的工具坐标系标定方法以及标定装置
JP6343930B2 (ja) ロボットシステム、ロボット制御装置、及びロボット制御方法
CN114034205A (zh) 一种箱体装填***及装填方法
CN117817667B (zh) 一种基于svd分解法的机械臂末端姿态调整方法

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022871144

Country of ref document: EP

Effective date: 20230331