CN210847488U - Robot laser cleaning path planning device based on computer vision - Google Patents

Robot laser cleaning path planning device based on computer vision

Info

Publication number
CN210847488U
Authority
CN
China
Prior art keywords
mechanical arm
laser cleaning
teaching
module
computer workstation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201920685560.3U
Other languages
Chinese (zh)
Inventor
徐迟
刘翊
洪鑫
关泽彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences filed Critical China University of Geosciences
Priority to CN201920685560.3U priority Critical patent/CN210847488U/en
Application granted granted Critical
Publication of CN210847488U publication Critical patent/CN210847488U/en

Landscapes

  • Laser Beam Processing (AREA)

Abstract

The utility model discloses a robot laser cleaning path planning device based on computer vision. The device comprises a visual motion capture module, a computer workstation, a mechanical arm, a laser cleaning device arranged at the tail end of the mechanical arm, a depth camera and a teaching device. A teaching method based on computer vision is adopted: once the user has completed path planning with the teaching device, the robot is guided to carry out laser cleaning on a target workpiece in a human-robot cooperation mode. Applied to assembly-line operation on small workpieces, the device can improve working efficiency; for large workpieces to be processed, it can improve teaching efficiency while fully incorporating human experience, and it overcomes the high error rate of automatic visual recognition methods in real, complex environments.

Description

Robot laser cleaning path planning device based on computer vision
Technical Field
The utility model relates to robotics and its applications, and more specifically to a device that uses a machine vision method to plan a robot laser cleaning path.
Background
With the rapid development of rail transit, the output of subways and high-speed trains has increased greatly. Aluminum alloy and steel are the main materials in rail transit production, and welding is the principal method used to join them in high-speed rail applications. To avoid weld defects and improve welding quality, rust, oil stains and paint frequently have to be removed from complex structural materials during manufacturing and overhaul, and welding materials must be cleaned before welding.
As production processes and requirements have improved, traditional treatments such as mechanical polishing and chemical etching are gradually being abandoned because of their long processing time, low efficiency, substrate damage, high processing cost, harsh operating environment and environmental pollution. Laser cleaning solves these problems well.
Laser cleaning exploits the high energy density, strong focusability and good directivity of laser light. The beam is focused by a lens assembly and concentrated onto a small area; the vibration induced by the laser pulses, the photodecomposition or phase change of molecules, the interaction of the pulses with contaminant particles, or a combination of these effects overcomes the bonding force between the contamination and the substrate surface, so that the contamination detaches from the surface and cleaning is achieved. As the most promising new cleaning method currently available, laser cleaning is efficient, green, pollution-free, damage-free and non-contact; how to effectively raise the degree of automation of laser cleaning tasks, however, remains a difficult problem for the industry.
SUMMARY OF THE UTILITY MODEL
The technical problem to be solved by the utility model is to provide, in view of the defects of the prior art, a robot laser cleaning path planning device based on computer vision.
The technical solution adopted by the utility model to solve this problem is as follows: a robot laser cleaning path planning device based on computer vision is constructed, comprising a visual motion capture module, a computer workstation, a mechanical arm, a laser cleaning device arranged at the tail end of the mechanical arm, a depth camera and a teaching device; wherein:
the visual motion capture module comprises a plurality of infrared cameras for shooting the teaching device and the mechanical arm in real time;
the depth camera is arranged at the tail end of the mechanical arm and on the teaching device; the depth camera is connected to the computer workstation and is used for shooting and transmitting the color image and the depth image of the target workpiece to the computer workstation;
the teaching device is provided with a plurality of teaching mark points and is used for defining the motion path of the mechanical arm; the rigid body formed by the mark points on the teaching device is identified by the visual motion capture module through the mark points;
each joint of the mechanical arm is provided with a joint mark point, and the kinematic parameters of the mechanical arm are calibrated through the joint mark points; the mechanical arm is connected to the computer workstation, and the laser cleaning device is fixed at the tail end of the mechanical arm; the mechanical arm uses the laser cleaning device to carry out laser cleaning on a target workpiece along the motion path defined by the teaching device;
the computer workstation, on one hand, further plans the motion path of the mechanical arm according to the received data and drives the mechanical arm to move along the planned path; on the other hand, after the relative pose between the mechanical arm and the target workpiece has been adjusted, it drives the laser cleaning device to carry out laser cleaning on the target workpiece.
Furthermore, each infrared camera is connected to the computer workstation through a data exchange device, so that the data processed by each infrared camera is synchronously transmitted to the computer workstation.
Furthermore, when the plurality of infrared cameras are set up, the spatial coordinate system of the camera system they form is calibrated with T-shaped and L-shaped calibration tools.
Further, the teaching device further comprises a trajectory definition control switch, a depth camera control switch and a wireless communication module, wherein:
the wireless communication module is wirelessly connected to the computer workstation; the trajectory definition control switch and the depth camera control switch are connected to the wireless communication module and are used for controlling the operation modes of the mechanical arm and the laser cleaning device; the control signals generated during operation are transmitted to the computer workstation in real time through the wireless communication module, and the computer workstation then controls the start and end positions of the motion path of the mechanical arm and the laser cleaning device.
Further, the computer workstation comprises a computer host and control software running on the computer host; the control software comprises a data processing module for receiving and processing all real-time data, a mechanical arm control module and a mechanical arm calibration module for controlling and calibrating the motion of the mechanical arm, a three-dimensional reconstruction module for reconstructing the target workpiece in three dimensions, a visualization module for displaying or replaying the motion of the mechanical arm, and a laser control module for controlling the laser cleaning device.
Further, the computer workstation also comprises a display terminal connected to the computer host; the visualization module displays or replays the teaching process and the motion of the mechanical arm in real time through the display terminal; the real-time display of the visualization module includes real video recording and three-dimensional scene simulation using OpenGL.
In the robot laser cleaning path planning device based on computer vision, the user teaches the mechanical arm through the teaching device, and the tail end of the mechanical arm completes the corresponding motion along the path defined by the teaching rod. A teaching method based on computer vision is adopted: once the user has completed path planning with the teaching device, the robot is guided to carry out laser cleaning on the target workpiece in a human-robot cooperation mode. Applied to assembly-line operation on small workpieces, the device can improve working efficiency and has broad application prospects.
Drawings
The invention will be further explained with reference to the drawings and examples, wherein:
fig. 1 is a structural diagram of the robot laser cleaning path planning device based on computer vision;
fig. 2 is a flow chart of the method of the robot laser cleaning path planning device based on computer vision of the present invention;
fig. 3 is a structural diagram of the teaching device.
Detailed Description
In order to clearly understand the technical features, objects, and effects of the present invention, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Please refer to fig. 1, which is a structural diagram of the device provided by the utility model. The device comprises a visual motion capture module, a computer workstation, a teaching device and a mechanical arm, wherein:
1. Visual motion capture module:
The visual motion capture module comprises 8 infrared cameras, each fixed on a mounting bracket 2.5 meters above the ground. When the 8 infrared cameras are set up, on the one hand they are arranged around the site at a radius of 3 meters, with the center of each camera's field of view aligned with the operation area of the workpiece table; on the other hand, to ensure that the computer workstation can reliably identify the spatial coordinates of the mark points or rigid bodies, the spatial coordinate system of the camera system formed by the 8 cameras is calibrated with T-shaped and L-shaped calibration tools;
The visual motion capture module is connected to the computer workstation by wire: each infrared camera is connected by cable to an input port of the data switch, whose output is connected to the computer workstation, so that the data processed by each infrared camera is synchronously transmitted to the computer workstation; in this embodiment, the data acquisition frame rate is 240 FPS.
2. A teaching device:
In this embodiment, a hand-held teaching rod is used to predefine the motion path of the mechanical arm. The teaching rod is provided with 8 teaching mark points, a trajectory definition control switch, a depth camera and a wireless communication module; wherein:
In this embodiment, the teaching mark points are reflective spheres so that the cameras can locate the teaching rod. Turning on the control switch starts the teaching work of the teaching rod. Throughout the teaching process, the 8 reflective spheres mounted on the device together form a rigid body; the rigid body is photographed by the cameras and transmitted to the computer workstation, which processes the images to obtain the spatial coordinates of the rigid body and then derives the spatial attitude of the tail end of the teaching rod from those coordinates;
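For illustration only (the utility model does not disclose an implementation), the pose of such a marker rigid body can be recovered from the measured 3D mark-point coordinates with the standard Kabsch/SVD method, assuming the marker layout in the rod's own frame is known in advance. The function below is a minimal sketch under that assumption; the names and the fixed tip offset are hypothetical.

```python
import numpy as np

def rigid_body_pose(reference_markers, observed_markers):
    """Estimate the rotation R and translation t that map the known marker
    layout (rod frame) onto the marker coordinates measured by the camera
    system, using the Kabsch/SVD method."""
    ref = np.asarray(reference_markers, dtype=float)   # (N, 3), rod frame
    obs = np.asarray(observed_markers, dtype=float)    # (N, 3), camera frame

    ref_c = ref - ref.mean(axis=0)                     # center both point sets
    obs_c = obs - obs.mean(axis=0)

    H = ref_c.T @ obs_c                                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                 # proper rotation, det(R) = +1
    t = obs.mean(axis=0) - R @ ref.mean(axis=0)
    return R, t

# The attitude of the rod tip then follows from its known offset in the rod
# frame, e.g. tip_camera = R @ tip_in_rod_frame + t.
```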
The wireless communication module is wirelessly connected to the USB interface of the computer workstation; the trajectory definition control switch and the depth camera control switch are connected to the wireless communication module and are used for controlling the operation modes of the mechanical arm and the laser cleaning device; when the teaching rod is operated by turning on the trajectory definition control switch and the depth camera control switch, the control signals generated are transmitted to the computer workstation in real time through the wireless communication module, and the computer workstation then controls the start and end positions of the motion path of the mechanical arm and the laser cleaning device.
In this embodiment, a power supply module is also arranged in the teaching rod; it can consist of two AA batteries to power the rod, or the rod can be charged through a Micro-USB charging port. A power switch button turns the hand-held teaching rod on and off (please refer to fig. 3 for the overall design of the teaching rod, which comprises a trajectory definition switch button 1, a depth camera switch button 2, a wireless communication module 3, a power supply module 4, a power switch button 5, a Micro-USB charging port 6, a USB interface 7, a depth camera 8 and 8 mark points 9). The cleaning path is planned with the tail end of the hand-held teaching rod to teach the mechanical arm.
3. Computer workstation
The computer workstation comprises a computer host, a display terminal connected to the computer host, and control software running on the computer host; all data received by the computer host are processed by the control software;
the control software comprises a data processing module for processing data, a mechanical arm control module for planning a motion path of the mechanical arm, a mechanical arm calibration module for calibrating the mechanical arm, a laser control module for controlling the laser cleaning device, a three-dimensional reconstruction module for three-dimensionally reconstructing a target workpiece and a visualization module for displaying or replaying the motion process of the mechanical arm; the visualization module displays or plays back a teaching process and a mechanical arm movement process in real time through the display terminal.
4. Mechanical arm
The mechanical arm is connected to the computer workstation, and the laser cleaning device and a depth camera are fixed at the tail end of the mechanical arm; the mechanical arm is connected to a USB interface of the computer host. During teaching with the teaching rod, the computer workstation plans the motion path of the mechanical arm from the processed spatial attitude data of the tail end of the teaching rod and drives the tail end of the mechanical arm to complete the corresponding motion along the planned path. While the mechanical arm moves along the planned path, the target workpiece is photographed by the depth camera arranged at the tail end of the mechanical arm and on the teaching rod, and the color and depth images of the target workpiece are transmitted to the computer workstation in real time; after the data processing module and the three-dimensional reconstruction module in the computer workstation reconstruct the target workpiece in three dimensions using the ICP algorithm, the relative pose between the teaching rod and the target workpiece is obtained. Once the computer workstation has adjusted the relative pose between the laser cleaning device at the tail end of the mechanical arm and the target workpiece according to the obtained relative pose between the teaching rod and the target workpiece, the laser control module drives the laser cleaning device to carry out laser cleaning on the target workpiece.
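The disclosure names the ICP algorithm but does not specify an implementation. As a minimal sketch only, ICP registration between two depth-camera point clouds could be done with the open-source Open3D library as below; the voxel size, correspondence threshold and point-to-plane variant are assumptions, not part of the utility model.

```python
import numpy as np
import open3d as o3d

def register_icp(source_pcd, target_pcd, voxel=0.005, threshold=0.02):
    """Align a newly captured point cloud (source) to a reference cloud
    (target) and return the 4x4 transform describing their relative pose."""
    src = source_pcd.voxel_down_sample(voxel)          # thin out both clouds
    tgt = target_pcd.voxel_down_sample(voxel)
    for pcd in (src, tgt):                             # point-to-plane ICP needs normals
        pcd.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 4, max_nn=30))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, threshold, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation                       # 4x4 homogeneous matrix
```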
As a preferred embodiment, in order to ensure the positioning accuracy of the mechanical arm, before planning the path of the mechanical arm, the mechanical arm needs to be calibrated by using a plurality of joint mark points provided on the mechanical arm.
As a preferred embodiment, when the target workpiece is laser-cleaned a second time, the depth camera arranged at the tail end of the mechanical arm reconstructs the target workpiece in three dimensions again while the mechanical arm runs from its initial state to the starting point of the path planned by the mechanical arm control module; the 3D point cloud obtained during the current cleaning run is matched against the 3D point cloud template processed and stored in the three-dimensional reconstruction module during the previous cleaning run; if the relative pose between the target workpiece and the teaching device has deviated, the mechanical arm control module applies real-time motion compensation to the mechanical arm, further improving the accuracy of the laser cleaning process.
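One simple way to realize such compensation, offered here only as an assumed sketch, is to take the 4x4 transform returned by the registration step (mapping the stored template onto the currently observed workpiece pose) and apply it to every previously taught tool waypoint before execution:

```python
import numpy as np

def compensate_path(waypoints, T):
    """Re-express previously taught tool poses after the workpiece has moved.

    waypoints : iterable of 4x4 homogeneous tool poses from the original run
    T         : 4x4 transform mapping the stored workpiece template onto the
                workpiece pose observed in the current run (e.g. from ICP)
    """
    return [np.asarray(T) @ np.asarray(w) for w in waypoints]
```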
Please refer to fig. 2, which is a flow chart of the method carried out by the robot laser cleaning path planning device based on computer vision. The specific processing steps are as follows:
S1, environment construction: set up the visual motion capture module, specifically: arrange and install the 8 infrared cameras in a ring centered on the mechanical arm, with the center of each camera's field of view aligned with the operation area of the workpiece table; after installation, calibrate the spatial coordinate system of the camera system;
S2, mechanical arm calibration: control the tail end of the mechanical arm to move through the mechanical arm control software, use the infrared cameras set up in step S1 to photograph the position of each joint mark point on the mechanical arm and transmit it to the mechanical arm calibration module, and let that module calibrate the mechanical arm with the received data;
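The disclosure does not state how the calibration is computed. A generic sketch, assuming a forward-kinematics model fk whose parameters are fitted so that the predicted mark-point positions match the camera measurements, could use a nonlinear least-squares fit; the function and parameter names are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_kinematics(theta_log, marker_log, fk, x0):
    """Fit kinematic parameters x so that the forward-kinematics model fk
    reproduces the joint mark-point positions observed by the camera system.

    theta_log  : list of commanded joint-angle vectors during calibration
    marker_log : list of corresponding measured mark-point positions, each (M, 3)
    fk         : callable fk(x, theta) -> predicted mark-point positions (M, 3)
    x0         : initial guess for the kinematic parameters
    """
    def residuals(x):
        return np.concatenate([
            (fk(x, th) - np.asarray(m)).ravel()
            for th, m in zip(theta_log, marker_log)])
    return least_squares(residuals, x0).x
```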
S3, data processing: after calibration, the user drives the teaching rod to start teaching over the area of the target workpiece to be cleaned; during teaching, on one hand, the infrared cameras set up in step S1 photograph the spatial position of the rigid body formed by the teaching mark points and transmit it to the data processing module; on the other hand, the depth camera arranged on the teaching device photographs the color image and the depth image of the target workpiece and transmits them to the data processing module; the data processing module then computes the spatial attitude of the tail end of the teaching rod and the point cloud data of the target workpiece; the three-dimensional reconstruction module filters the point cloud data and reconstructs the target workpiece in three dimensions from it, yielding a 3D point cloud template of the target workpiece and the relative pose between the teaching rod and the target workpiece;
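The filtering step is not specified in the disclosure. As a sketch under assumed parameters, one RGB-D frame could be converted into a filtered point cloud with Open3D as follows; the camera intrinsics, voxel size and outlier settings are placeholders.

```python
import open3d as o3d

def build_filtered_cloud(color_path, depth_path, intrinsics):
    """Convert one color/depth image pair into a filtered point cloud.

    intrinsics : o3d.camera.PinholeCameraIntrinsic of the depth camera
    """
    color = o3d.io.read_image(color_path)
    depth = o3d.io.read_image(depth_path)
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, convert_rgb_to_intensity=False)
    pcd = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsics)
    pcd = pcd.voxel_down_sample(voxel_size=0.003)               # uniform density
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20,    # drop sparse
                                            std_ratio=2.0)      # outlier points
    return pcd
```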
S4, path planning: the mechanical arm control module plans the motion path of the mechanical arm from the spatial attitude of the tail end of the teaching rod obtained in step S3 and drives the mechanical arm to complete the corresponding motion; while the mechanical arm moves along the planned path, after the mechanical arm control module has adjusted the relative pose between the laser cleaning device and the target workpiece, the laser control module drives the laser cleaning device to carry out laser cleaning on the target workpiece along the motion path of the mechanical arm; during the motion of the mechanical arm, the motion is visually displayed on the display terminal connected to the visualization module.
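How the taught attitudes are turned into arm waypoints is not detailed. One common approach, sketched here as an assumption, is to resample the recorded tip positions by arc length so that consecutive waypoints are evenly spaced before they are sent to the arm controller; the spacing value is a placeholder.

```python
import numpy as np

def resample_path(points, spacing=0.005):
    """Resample a recorded sequence of 3D tip positions (N, 3) so that
    consecutive waypoints are roughly `spacing` meters apart."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    keep = np.concatenate([[True], seg > 1e-9])      # drop repeated samples
    pts = pts[keep]
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])      # cumulative arc length
    s_new = np.arange(0.0, s[-1], spacing)
    return np.column_stack(
        [np.interp(s_new, s, pts[:, k]) for k in range(3)])
```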
While the embodiments of the present invention have been described with reference to the accompanying drawings, the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many modifications may be made by one skilled in the art without departing from the spirit and scope of the present invention as defined in the appended claims.

Claims (6)

1. A robot laser cleaning path planning device based on computer vision is characterized by comprising a vision motion capture module, a computer workstation, a mechanical arm, a laser cleaning device arranged at the tail end of the mechanical arm, a depth camera and a teaching device; wherein:
the visual motion capture module comprises a plurality of infrared cameras for shooting the teaching device and the mechanical arm in real time;
the depth camera is arranged at the tail end of the mechanical arm and on the teaching device; the depth camera is connected to the computer workstation and is used for shooting and transmitting the color image and the depth image of the target workpiece to the computer workstation;
the teaching device is provided with a plurality of teaching mark points and is used for defining the motion path of the mechanical arm; the rigid body formed by the mark points on the teaching device is identified by the visual motion capture module through the mark points; the computer workstation further processes this to obtain the spatial coordinates of the rigid body and then derives the spatial attitude of the tail end of the teaching device from those coordinates;
each joint of the mechanical arm is provided with a joint mark point, and the kinematic parameters of the mechanical arm are calibrated through the joint mark points; the mechanical arm is connected to the computer workstation, and the laser cleaning device is fixed at the tail end of the mechanical arm; the mechanical arm uses the laser cleaning device to carry out laser cleaning on a target workpiece along the motion path defined by the teaching device;
the computer workstation, on one hand, further plans the motion path of the mechanical arm according to the received data and drives the mechanical arm to move along the planned path; on the other hand, after the relative pose between the mechanical arm and the target workpiece has been adjusted, it drives the laser cleaning device to carry out laser cleaning on the target workpiece.
2. The computer-vision-based robot laser cleaning path planning device according to claim 1, wherein each infrared camera is connected to the computer workstation through a data exchange device, so that the data processed by each infrared camera is synchronously transmitted to the computer workstation.
3. The robot laser cleaning path planning device based on computer vision according to claim 1, wherein, when the plurality of infrared cameras are set up, the spatial coordinate system of the camera system composed of the plurality of infrared cameras is calibrated with T-shaped and L-shaped calibration tools.
4. The computer-vision-based robot laser cleaning path planning device of claim 1, wherein the teaching device further comprises a trajectory definition control switch, a depth camera control switch, and a wireless communication module, wherein:
the wireless communication module is wirelessly connected to the computer workstation; the trajectory definition control switch and the depth camera control switch are connected to the wireless communication module and are used for controlling the operation modes of the mechanical arm and the laser cleaning device; the control signals generated during operation are transmitted to the computer workstation in real time through the wireless communication module, and the computer workstation then controls the start and end positions of the motion path of the mechanical arm and the laser cleaning device.
5. The computer-vision-based robot laser cleaning path planning device of claim 1, wherein the computer workstation comprises a computer host and control software running on the computer host; the control software comprises a data processing module for receiving and processing all real-time data, a mechanical arm control module and a mechanical arm calibration module for controlling and calibrating the motion of the mechanical arm, a three-dimensional reconstruction module for reconstructing the target workpiece in three dimensions, a visualization module for displaying or replaying the motion of the mechanical arm, and a laser control module for controlling the laser cleaning device.
6. The robot laser cleaning path planning device based on computer vision of claim 5, wherein the computer workstation further comprises a display terminal connected to the computer host; the visualization module displays or replays the teaching process and the motion of the mechanical arm in real time through the display terminal; the real-time display of the visualization module includes real video recording and three-dimensional scene simulation using OpenGL.
CN201920685560.3U 2019-05-14 2019-05-14 Robot laser cleaning path planning device based on computer vision Expired - Fee Related CN210847488U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201920685560.3U CN210847488U (en) 2019-05-14 2019-05-14 Robot laser cleaning path planning device based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201920685560.3U CN210847488U (en) 2019-05-14 2019-05-14 Robot laser cleaning path planning device based on computer vision

Publications (1)

Publication Number Publication Date
CN210847488U true CN210847488U (en) 2020-06-26

Family

ID=71303688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201920685560.3U Expired - Fee Related CN210847488U (en) 2019-05-14 2019-05-14 Robot laser cleaning path planning device based on computer vision

Country Status (1)

Country Link
CN (1) CN210847488U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113070290A (en) * 2021-04-12 2021-07-06 哈尔滨学院 Robot laser cleaning path planning device based on computer vision
CN115861494A (en) * 2023-02-20 2023-03-28 青岛大学 Cross-mode converter model type automatic dance generation method

Similar Documents

Publication Publication Date Title
CN108481323B (en) Augmented reality-based robot motion trajectory automatic programming system and method
CN110116116A (en) Robotic laser cleaning path planning system based on computer vision and method
CN110125944B (en) Mechanical arm teaching system and method
CN110170995B (en) Robot rapid teaching method based on stereoscopic vision
CN110170996B (en) Robot rapid teaching system based on stereoscopic vision
CN112643207B (en) Laser automatic derusting system and method based on computer vision
CN104325268A (en) Industrial robot three-dimensional space independent assembly method based on intelligent learning
CN210847488U (en) Robot laser cleaning path planning device based on computer vision
CN110142770B (en) Robot teaching system and method based on head-mounted display device
CN101770710A (en) Laser-vision sensing assisted remote teaching method for remote welding
CN111906788B (en) Bathroom intelligent polishing system based on machine vision and polishing method thereof
CN113333998A (en) Automatic welding system and method based on cooperative robot
CN113352300B (en) Spraying robot demonstrator and method
CN113246142B (en) Measuring path planning method based on laser guidance
CN111774775B (en) Three-dimensional vision system for gantry type robot welding of large-scale structural part and control method
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN112223292A (en) Online grinding system of structural member welding seam intelligent grinding and polishing robot
CN113021082A (en) Robot casting polishing method based on teleoperation and panoramic vision
CN113223071B (en) Workpiece weld joint positioning method based on point cloud reconstruction
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
CN111360789B (en) Workpiece processing teaching method, control method and robot teaching system
CN217618390U (en) Laser welding system based on visual identification
CN104181814A (en) Robot self-adaptation control method
CN114800574B (en) Robot automatic welding system and method based on double three-dimensional cameras
CN214583043U (en) Three-dimensional scanning system for workpiece coating

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200626