CN118080205A - Automatic spraying method and system based on vision


Info

Publication number
CN118080205A
CN118080205A
Authority
CN
China
Prior art keywords
workpiece
spraying
point cloud
coordinate system
flat
Prior art date
Legal status
Granted
Application number
CN202410498199.9A
Other languages
Chinese (zh)
Other versions
CN118080205B (en)
Inventor
虞静
唐海龙
黄贵余
黄陆君
马伍军
王海兵
邹刘敏
Current Assignee
SICHUAN UNIVERSITY OF ARTS AND SCIENCE
Sichuan Ji'e Intelligent Technology Co ltd
Original Assignee
SICHUAN UNIVERSITY OF ARTS AND SCIENCE
Sichuan Ji'e Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by SICHUAN UNIVERSITY OF ARTS AND SCIENCE, Sichuan Ji'e Intelligent Technology Co ltd filed Critical SICHUAN UNIVERSITY OF ARTS AND SCIENCE
Priority to CN202410498199.9A priority Critical patent/CN118080205B/en
Publication of CN118080205A publication Critical patent/CN118080205A/en
Application granted granted Critical
Publication of CN118080205B publication Critical patent/CN118080205B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a vision-based automatic spraying method and system. The method comprises: acquiring whole-workpiece point cloud data and, before the workpiece is loaded, selecting and determining how the workpiece is placed on the flat-plate railcar according to the initial structural form of the workpiece; acquiring cross sections of the workpiece with a two-dimensional scanner and reconstructing three-dimensional point cloud data of the workpiece in combination with the horizontal travel speed of the flat-plate railcar; unifying the coordinate system of the spraying process; unifying the position of the robot tool end into the unified coordinate system; calculating the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder; automatically separating the fixed supports from the workpiece according to point cloud data features; and solving for the workpiece surface normal vectors from the workpiece surface model, planning the spraying trajectory in combination with the spraying range, forming data comprising coordinate positions and spraying directions, and sending the data to the robot control mechanism. The invention achieves automatic spraying of various non-standard workpieces.

Description

Automatic spraying method and system based on vision
Technical Field
The invention belongs to the technical field of automation, and particularly relates to an automatic spraying method and system based on vision.
Background
Spraying of non-standard parts has traditionally been carried out manually. The noise and dust involved are harmful to the human body, the spraying process and quality are difficult to guarantee, and the waste of spraying material is considerable, which affects enterprise cost control, so replacing manual work with machines for automatic spraying is imperative. Robotic automatic spraying improves product quality and production efficiency, reduces the manual workload of the spraying stage, and, by spraying within a sealed space, makes the process more environmentally friendly.
Chinese granted invention patent CN110052347B discloses an automatic spraying method and system based on machine vision. The system comprises a PLC and, connected to the PLC, a frequency converter, a first infrared sensor, a second infrared sensor and a switch; the frequency converter is further connected to a motor, the switch is further connected to an industrial camera and an industrial personal computer, and the industrial personal computer is further connected to a spraying robot. When the first infrared sensor senses an infrared signal, the motor is started and image capture begins; when the second infrared sensor senses an infrared signal, the frequency converter stops the motor and the industrial camera stops capturing images. When the industrial personal computer matches a captured image to a corresponding local image, it obtains the spraying program corresponding to that local image and controls the spraying robot to perform the spraying operation according to that program.
The defect of this prior patent is that the system captures images of the workpiece to be sprayed, matches them to a corresponding local image, obtains the spraying program associated with that local image, and controls the spraying robot to spray according to that program. For non-standard workpieces, however, no preset spraying program exists, so no corresponding local image can be matched and no associated spraying program can be found. Existing vision-based automatic spraying systems therefore cannot automatically spray a variety of non-standard workpieces.
Disclosure of Invention
To address the inability of existing vision-based automatic spraying systems to automatically spray various non-standard workpieces, the invention provides a vision-based automatic spraying method and system.
In order to achieve the technical purpose, the invention adopts the following technical scheme:
an automatic spraying method based on vision, comprising the steps of:
S1, acquiring whole-workpiece point cloud data, and, before the workpiece is loaded, selecting and determining how the workpiece is placed on the flat-plate railcar according to the initial structural form of the workpiece;
S2, performing three-dimensional modeling of the workpiece surface: acquiring cross sections of the workpiece with a two-dimensional scanner and reconstructing three-dimensional point cloud data of the workpiece in combination with the horizontal travel speed of the flat-plate railcar;
S3, defining the coordinate system of the vision part with reference to the robot coordinate system, thereby unifying the coordinate system of the spraying process;
S4, unifying the position of the robot tool end into a unified coordinate system based on the coordinates of the robot mounting base;
S5, calculating the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
S6, automatically separating the fixed supports from the workpiece according to point cloud data features;
and S7, solving for the workpiece surface normal vectors from the workpiece surface model, planning the spraying trajectory in combination with the spraying range, and finally forming data comprising coordinate positions and spraying directions and sending the data to the robot control mechanism.
Further, the detailed steps of step S2 are as follows:
S201, acquiring contour data of the current cross-section of the workpiece surface with a large-field-of-view two-dimensional laser scanner, and measuring the position and movement speed of the positioning trolley with a high-precision, high-repetition-rate laser range finder;
S202, converting the laser range finder data into position data with time-stamp information;
S203, combining the contour data acquired by the two-dimensional laser scanner with the time information, and reconstructing three-dimensional model data of the workpiece surface after coordinate-conversion calculation.
Further, the coordinate system of the spraying process is unified as follows: the origin of the coordinate system is determined by the vision installation position, i.e. the center position directly below the vision installation position is the global coordinate origin of the whole process; the origin defined at the tail of the flat-plate railcar is translated from the global coordinate origin by a distance along the Y direction, this distance being the horizontal distance from the gantry to the tail of the flat-plate railcar as measured by the laser range finder.
Further, in step S4, since the robot carries an external axis (the external spraying travel-rail axis), the position of the robot tool end is unified into the unified coordinate system based on the coordinates of the robot base; the unified coordinate system has the same axis directions as the global coordinate system used for the workpiece position, and the offset between the two coordinate systems is taken as the difference between their origins.
Further, the detailed flow of unifying the position of the robot tool end into the unified coordinate system is as follows:
S401, scanning the workpiece once to acquire workpiece point cloud data;
S402, calculating the global coordinates of the workpiece point cloud data;
S403, a fixed offset exists between the origin of the robot mounting base and the global coordinate origin; taking this fixed offset as a compensation value and adding it yields the coordinates of the robot base origin in the global coordinate system;
S404, determining the coordinates of the robot tool end in the global coordinate system from the coordinates of the robot mounting base in the global coordinate system and the current external-axis position (i.e. the current coordinate of the robot's travel along the spraying rail). The workpiece point cloud data and the robot tool end are thus unified into the same global coordinate system.
Further, in step S5, in the workpiece coordinate system the tail of the flat-plate railcar is the origin. The scanner completes scanning while the flat-plate railcar travels, so the position of the workpiece relative to the railcar tail is a known quantity; the distance from the railcar tail to the origin of the global coordinate system is obtained from the ranging sensor, from which the position of the workpiece in the global coordinate system can be obtained.
Further, the workpiece position calculation flow includes:
S501, scanning the workpiece once, the point cloud coordinates obtained by scanning being workpiece coordinates;
S502, determining the distance between the tail of the flat-plate railcar and the origin of the global coordinate system;
S503, adding this distance to the Y-axis values of the workpiece coordinates to obtain the global coordinates of the point cloud.
Further, in step S6, the fixed supports and the workpiece are placed on the flat-plate railcar with a preset distance between them, and automatic separation is performed according to jumps in the laser point cloud data in the height layer below the fixed-support height.
Further, the detailed process of separating the fixed supports from the workpiece in step S6 includes:
S601, determining the workpiece reference height;
S602, taking the reference height as a dividing line and retaining the point cloud data below it;
S603, taking the origin as the current scanning point;
S604, judging whether the current scanning point is within the range of the flat-plate railcar; if so, ending the process of separating the fixed supports from the workpiece, and if not, proceeding to step S605;
S605, judging whether point cloud exists within a spatial distance of 5 cm of the current scanning point; if so, determining that the current scanning point is a fixed-support position, and if not, proceeding to step S606;
S606, advancing the current scanning point along the X-axis to obtain the next scanning point, and repeating the judgment of step S604.
A vision-based automatic spraying system comprises a whole-workpiece point cloud data acquisition unit, a workpiece placement mode judging unit, a workpiece three-dimensional point cloud data reconstruction unit, a spraying process coordinate system unit, a spraying robot coordinate conversion unit, a workpiece position calculation unit, a fixed support and workpiece separation unit, and a spraying trajectory planning unit;
the whole-workpiece point cloud data acquisition unit acquires workpiece contour point cloud data with a two-dimensional scanner;
the workpiece placement mode judging unit selects and determines, before the workpiece is loaded, how the workpiece is placed on the flat-plate railcar according to the preliminary structural form of the workpiece;
the workpiece three-dimensional point cloud data reconstruction unit acquires workpiece contour point cloud data with the two-dimensional scanner and reconstructs the workpiece three-dimensional point cloud data in combination with the horizontal travel speed of the flat-plate railcar;
the spraying process coordinate system unit determines the center position directly below the vision installation position as the global coordinate origin of the whole system;
the spraying robot coordinate conversion unit scans the workpiece once to obtain workpiece point cloud data, calculates the global coordinates of the workpiece point cloud data, takes the fixed offset between the robot mounting base origin and the global coordinate origin as a compensation value and adds it to obtain the coordinates of the robot base origin in the global coordinate system, and determines the coordinates of the robot tool end in the global coordinate system from the coordinates of the mounting base in the global coordinate system and the current external-axis position of the robot;
the workpiece position calculation unit calculates the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
the fixed support and workpiece separation unit automatically separates the fixed supports, which are placed on the flat-plate railcar together with the workpiece, from the workpiece according to jumps in the laser point cloud data in the height layer below the fixed-support height;
and the spraying trajectory planning unit solves for the workpiece surface normal vectors from the workpiece surface model, plans the spraying trajectory in combination with the spraying range, and finally forms data comprising coordinate positions and spraying directions and sends the data to the robot control mechanism.
Compared with the prior art, the invention has the following beneficial effects:
The method acquires whole-workpiece point cloud data, performs three-dimensional modeling of the workpiece surface and reconstructs the workpiece three-dimensional point cloud data, unifies the coordinate system of the spraying process, calculates the workpiece position by coordinate translation, automatically separates the fixed supports from the workpiece according to point cloud data features, and finally solves for the workpiece surface normal vectors from the workpiece surface model, plans the spraying trajectory in combination with the spraying range, forms data comprising coordinate positions and spraying directions, and sends the data to the robot control mechanism, thereby achieving automatic spraying of various non-standard workpieces.
Drawings
FIG. 1 is an overall flow chart of a vision-based automatic spray coating method in accordance with an embodiment of the present invention;
FIG. 2 is a block diagram of the overall architecture of a vision-based automatic spray coating system in accordance with an embodiment of the present invention;
FIG. 3 is a schematic view of a coordinate system of an automatic vision-based spray coating system according to an embodiment of the present invention;
FIG. 4 is a flowchart of the calculation of the position of a workpiece according to an embodiment of the invention;
FIG. 5 is a schematic diagram of a trajectory plan when an obstacle is encountered during a spraying process according to an embodiment of the present invention;
FIG. 6 is a detailed flow chart of the position of the end of the robotic tool unified into a unified coordinate system in an embodiment of the invention;
FIG. 7 is a schematic view illustrating the separation of the fixed support and the workpiece according to an embodiment of the invention;
FIG. 8 is a detailed flow chart of separating a fixed support from a workpiece in an embodiment of the invention.
Detailed Description
The invention will be further described below with reference to embodiments and the accompanying drawings, which are not intended to limit the scope of the invention.
As shown in FIG. 1, this embodiment provides a vision-based automatic spraying method, which includes the following steps:
S1, acquiring whole-workpiece point cloud data, and, before the workpiece is loaded, selecting and determining how the workpiece is placed on the flat-plate railcar according to the initial structural form of the workpiece;
S2, performing three-dimensional modeling of the workpiece surface: acquiring cross sections of the workpiece with a two-dimensional scanner and reconstructing three-dimensional point cloud data of the workpiece in combination with the horizontal travel speed of the flat-plate railcar;
S3, defining the coordinate system of the vision part with reference to the robot coordinate system, thereby unifying the coordinate system of the spraying process;
S4, unifying the position of the robot tool end into a unified coordinate system (a global coordinate system preset for the system, covering the robot arm, the workpiece, the flat-plate railcar and the spraying room) based on the coordinates of the robot mounting base; a unified coordinate system facilitates the subsequent point cloud data calculations;
S5, calculating the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
S6, automatically separating the fixed supports from the workpiece according to point cloud data features;
and S7, solving for the workpiece surface normal vectors from the workpiece surface model, planning the spraying trajectory in combination with the spraying range, and finally forming data comprising coordinate positions and spraying directions and sending the data to the robot control mechanism. Here the robot is understood to be the spraying actuator (robot arm). The whole trajectory always keeps a certain safety distance from the highest point of the workpiece surface; based on prior experience, this safety distance is 300-500 mm. When an obstacle is encountered, the spraying trajectory is as shown in FIG. 5.
As shown in FIG. 5, the workpiece is divided into a number of spraying areas bounded by obstacles. Within the same spraying area the robot can move directly to each point; after the current area has been sprayed, the robot first moves to a transition point and then moves from the transition point into the next spraying area.
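For illustration only, the sketch below shows one way the per-point spraying direction and stand-off of step S7 could be derived from the reconstructed point cloud: each surface normal is estimated by a local plane fit, and the gun position is offset along that normal by the 300-500 mm safety clearance quoted above. The function and parameter names, the neighbourhood size and the upward normal orientation are assumptions, not the patented implementation.

```python
import numpy as np

def estimate_normal(points: np.ndarray, idx: int, k: int = 20) -> np.ndarray:
    """Estimate the surface normal at points[idx] by PCA over its k nearest neighbours."""
    dists = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[np.argsort(dists)[:k]]
    centered = nbrs - nbrs.mean(axis=0)
    # The right singular vector with the smallest singular value approximates the normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    # Orient the normal upwards (+Z assumed to point away from the workpiece surface).
    return normal if normal[2] >= 0 else -normal

def plan_waypoints(points: np.ndarray, standoff_m: float = 0.4):
    """Return (gun position, spray direction) pairs: positions are offset from the
    surface along the normal by the safety stand-off; the spray direction points
    back toward the surface."""
    waypoints = []
    for i in range(len(points)):
        n = estimate_normal(points, i)
        waypoints.append((points[i] + standoff_m * n, -n))
    return waypoints
```

In practice the waypoints would additionally be ordered into passes and split into the obstacle-bounded spraying areas of FIG. 5, with transition points inserted between areas.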
The detailed steps of step S2 are as follows:
S201, acquiring contour data of the current cross-section of the workpiece surface with a large-field-of-view two-dimensional laser scanner, and measuring the position and movement speed of the positioning trolley with a high-precision, high-repetition-rate laser range finder;
S202, converting the laser range finder data into position data with time-stamp information;
S203, combining the contour data acquired by the two-dimensional laser scanner with the time information, and reconstructing three-dimensional model data of the workpiece surface after coordinate-conversion calculation.
To increase the point cloud density while reducing detection cost and the number of two-dimensional scanners, the running speed of the flat-plate railcar is reduced through control. A higher point cloud density captures the details of the workpiece and prevents the manipulator from colliding with the workpiece when the trajectory is planned from the point cloud data.
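As a hedged illustration of S201-S203 (not the patented implementation), the fragment below stacks timestamped 2D scanner profiles into a 3D point cloud by interpolating the railcar's Y position from the range-finder log at each profile's acquisition time; the argument names, units and frame conventions are assumptions.

```python
import numpy as np

def reconstruct_cloud(profiles: list,            # each item: (M, 2) array of (x, z) in the scan plane, metres
                      profile_times: np.ndarray,  # (P,) acquisition time of each profile, seconds
                      rf_times: np.ndarray,       # (R,) laser range finder timestamps, seconds
                      rf_positions: np.ndarray) -> np.ndarray:  # (R,) railcar Y position, metres
    """Stack 2D cross-section profiles into a 3D cloud using the railcar motion along Y."""
    # S202: the range-finder log gives position versus time; interpolate it at the
    # profile acquisition times to obtain the Y coordinate of each cross-section.
    y_at_profiles = np.interp(profile_times, rf_times, rf_positions)

    points = []
    for profile, y in zip(profiles, y_at_profiles):
        x, z = profile[:, 0], profile[:, 1]
        # S203: each scan-plane point (x, z) becomes (x, y, z) in the workpiece frame.
        points.append(np.column_stack([x, np.full_like(x, y), z]))
    return np.vstack(points)
```

Under this sketch, lowering the railcar speed, as discussed above, simply packs the reconstructed cross-sections more densely along Y.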
The coordinate system of the spraying process is unified as follows: the origin of the coordinate system is determined by the vision installation position, i.e. the center position directly below the vision installation position is the global coordinate origin of the whole process; the origin defined at the tail of the flat-plate railcar is translated from the global coordinate origin by a distance along the Y direction, this distance being the horizontal distance from the gantry to the tail of the flat-plate railcar as measured by the laser range finder.
In step S4, since the robot carries an external axis, the position of the robot tool end is unified into the unified coordinate system based on the robot base coordinates; the unified coordinate system has the same axis directions as the global coordinate system used for the workpiece position, and the offset between the robot tool end position and the workpiece position is taken as the difference between the origins of the two coordinate systems.
As shown in FIG. 6, the detailed flow of unifying the position of the robot tool end into the unified coordinate system, i.e. the detailed robot coordinate conversion flow, includes:
S401, scanning the workpiece once to acquire workpiece point cloud data;
S402, calculating the global coordinates of the workpiece point cloud data;
S403, a fixed offset exists between the origin of the robot mounting base and the global coordinate origin; taking this fixed offset as a compensation value and adding it to the robot mounting base origin yields the coordinates of the robot base origin in the global coordinate system;
S404, determining the coordinates of the robot tool end in the global coordinate system from the coordinates of the robot mounting base in the global coordinate system and the current external-axis position (i.e. the current coordinate of the robot arm's travel along the spraying rail). The workpiece point cloud data and the robot tool end are thus unified into the same global coordinate system.
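The following is a minimal sketch, under stated assumptions, of the coordinate bookkeeping in S401-S404: the robot base is displaced from the global origin by a fixed, pre-calibrated offset, and the external axis moves the base along the spraying rail. The calibration values, the rail axis direction and the parallel-axes simplification are placeholders for illustration.

```python
import numpy as np

# Hypothetical calibration constant (S403): fixed offset from the global origin
# to the robot mounting-base origin, measured once at installation, in metres.
BASE_OFFSET_GLOBAL = np.array([2.50, 0.00, 1.20])

# Assumed direction of the external spraying-rail axis in the global frame.
RAIL_AXIS = np.array([0.0, 1.0, 0.0])

def tool_end_in_global(tool_in_base: np.ndarray, external_axis_pos_m: float) -> np.ndarray:
    """S404: tool-end coordinates in the global frame = base offset
    + external-axis travel along the rail + tool-end position in the base frame
    (all frames are taken to have parallel axes, as the description states)."""
    return BASE_OFFSET_GLOBAL + external_axis_pos_m * RAIL_AXIS + tool_in_base

# Example: tool end 0.8 m ahead of and 0.3 m above the base, robot 4.2 m along the rail.
print(tool_end_in_global(np.array([0.0, 0.8, 0.3]), 4.2))
```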
In step S5, in the workpiece coordinate system the tail of the flat-plate railcar is the origin. The scanner completes scanning while the flat-plate railcar travels, so the position of the workpiece relative to the railcar tail is a known quantity; the distance from the railcar tail to the origin of the global coordinate system is obtained from the ranging sensor, from which the position of the workpiece in the global coordinate system can be obtained.
As shown in FIG. 4, the workpiece position calculation flow includes:
S501, scanning the workpiece once, the point cloud coordinates obtained by scanning being workpiece coordinates;
S502, determining the distance between the tail of the flat-plate railcar and the origin of the global coordinate system;
S503, adding this distance to the Y-axis values of the workpiece coordinates to obtain the global coordinates of the point cloud. When the flat-plate railcar moves forward into the workpiece spraying room, the position of the 3D scanning sensor is the origin of the global coordinate system; once the tail of the flat-plate railcar has passed the gantry, the workpiece leaves the detection range of the 3D scanning sensor and workpiece information can no longer be acquired.
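As an illustration of S501-S503, the sketch below translates a point cloud expressed in the workpiece (railcar-tail) frame into the global frame by adding the measured tail-to-origin distance to the Y coordinates; the sign convention and all names are assumptions.

```python
import numpy as np

def workpiece_cloud_to_global(cloud_workpiece: np.ndarray,
                              tail_to_origin_y_m: float) -> np.ndarray:
    """S503: global coordinates = workpiece coordinates with the distance measured
    by the laser range finder added along Y (X and Z are unchanged)."""
    cloud_global = cloud_workpiece.copy()
    cloud_global[:, 1] += tail_to_origin_y_m
    return cloud_global

# Example: one point 1.0 m along the workpiece, railcar tail 6.3 m from the global origin.
print(workpiece_cloud_to_global(np.array([[0.2, 1.0, 0.5]]), 6.3))
```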
In step S6, the fixed supports and the workpiece are placed on the flat-plate railcar with a preset distance between them, and automatic separation is performed according to jumps in the laser point cloud data in the height layer below the fixed-support height.
The heights of all the fixed supports on site are essentially the same, so a workpiece reference height is determined, and a dividing line is obtained by shifting down a certain distance from the reference height; the positions of the fixed supports are determined from the point cloud portion below the dividing line, and the vicinity of these regions can at the same time be treated as obstacles to be avoided.
As shown in FIG. 7 and FIG. 8, because the workpiece is placed with some deviation each time, the flat-plate railcar position data measured by the laser range finder does not directly reflect the position of the workpiece, so the workpiece position must be calculated by coordinate translation. The detailed process of separating the fixed supports from the workpiece in step S6 includes:
S601, determining the workpiece reference height;
S602, taking the reference height as a dividing line and retaining the point cloud data below it; the dividing line may also be set a certain distance (0-10 cm) below the reference height;
S603, taking the origin as the current scanning point;
S604, judging whether the current scanning point is within the range of the flat-plate railcar; if so, ending the process of separating the fixed supports from the workpiece, and if not, proceeding to step S605;
S605, judging whether point cloud exists within a spatial distance of 5 cm of the current scanning point; if so, determining that the current scanning point is a fixed-support position, and if not, proceeding to step S606; the neighbourhood considered is within the range of 0-10 cm;
S606, advancing the current scanning point along the X-axis by a fixed distance to obtain the next scanning point, and repeating the judgment of step S604.
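A rough, assumption-laden rendering of S601-S606 follows: points below the dividing line are retained, and the scan then steps along X, flagging positions that have point cloud within 5 cm as fixed-support locations. The step size, the X-only distance test and the use of an explicit X interval in place of the railcar-range check of S604 are simplifications, not the patented logic.

```python
import numpy as np

def find_support_x_positions(cloud: np.ndarray,          # (N, 3) points, Z assumed to be height
                             dividing_height_m: float,    # S601/S602 dividing line
                             x_start_m: float,
                             x_end_m: float,
                             step_m: float = 0.05,        # S606 fixed advance per scan point
                             radius_m: float = 0.05):     # S605 5 cm neighbourhood
    """March along X and return the scan positions judged to lie on a fixed support."""
    low = cloud[cloud[:, 2] <= dividing_height_m]          # S602: keep points below the line
    support_x = []
    x = x_start_m
    while x <= x_end_m:                                     # stand-in for the S604 range check
        # S605: is there retained point cloud within radius_m of this scan position?
        if low.size and np.any(np.abs(low[:, 0] - x) <= radius_m):
            support_x.append(x)
        x += step_m                                          # S606: advance to the next scan point
    return support_x
```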
As shown in FIG. 2 and FIG. 3, a vision-based automatic spraying system comprises a whole-workpiece point cloud data acquisition unit, a workpiece placement mode judging unit, a workpiece three-dimensional point cloud data reconstruction unit, a spraying process coordinate system unit, a spraying robot coordinate conversion unit, a workpiece position calculation unit, a fixed support and workpiece separation unit, and a spraying trajectory planning unit;
The whole-workpiece point cloud data acquisition unit acquires workpiece contour point cloud data with a two-dimensional scanner; it corresponds to the 3D scanning sensor and the laser range finder (a high-precision laser range finder) in the system coordinate diagram of FIG. 3. The 3D scanning sensor consists of 7 single-line laser scanners distributed around the gantry to ensure that the tube-panel workpiece is covered without blind spots.
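For illustration only, and assuming each of the 7 single-line scanners has been extrinsically calibrated to the global frame in advance, the fragment below merges the individual scanner clouds with one rigid transform per scanner; the calibration data themselves are placeholders that would come from the system's own calibration procedure.

```python
import numpy as np

def merge_scanner_clouds(clouds: list,        # one (Ni, 3) point array per scanner, in its own sensor frame
                         rotations: list,      # one 3x3 rotation matrix per scanner (sensor -> global)
                         translations: list) -> np.ndarray:  # one (3,) offset per scanner, metres
    """Apply each scanner's calibrated rigid transform and stack the results, so the
    gantry-mounted scanners together cover the workpiece without blind spots."""
    merged = [pts @ R.T + t for pts, R, t in zip(clouds, rotations, translations)]
    return np.vstack(merged)
```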
The workpiece placement mode judging unit selects and determines, before the workpiece is loaded, how the workpiece is placed on the flat-plate railcar according to the preliminary structural form of the workpiece;
the workpiece three-dimensional point cloud data reconstruction unit acquires workpiece contour point cloud data with the two-dimensional scanner and reconstructs the workpiece three-dimensional point cloud data in combination with the horizontal travel speed of the flat-plate railcar;
the spraying process coordinate system unit determines the center position directly below the vision installation position as the global coordinate origin of the whole system;
the spraying robot coordinate conversion unit scans the workpiece once to obtain workpiece point cloud data, calculates the global coordinates of the workpiece point cloud data, takes the fixed offset between the robot mounting base origin and the global coordinate origin as a compensation value and adds it to obtain the coordinates of the robot base origin in the global coordinate system, and determines the coordinates of the robot tool end in the global coordinate system from the coordinates of the mounting base in the global coordinate system and the current external-axis position of the robot;
the workpiece position calculation unit calculates the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
the fixed support and workpiece separation unit automatically separates the fixed supports, which are placed on the flat-plate railcar together with the workpiece, from the workpiece according to jumps in the laser point cloud data in the height layer below the fixed-support height;
and the spraying trajectory planning unit solves for the workpiece surface normal vectors from the workpiece surface model, plans the spraying trajectory in combination with the spraying range, and finally forms data comprising coordinate positions and spraying directions and sends the data to the robot control mechanism.
When the whole system is in operation, in addition to basic functions such as communication and control, it provides functions including tracking the working state of each part, querying and recording faults, setting system working parameters, counting material consumption, compiling working-efficiency statistics, and human-machine interaction.
Compared with the prior art, the invention has the following beneficial effects:
The method acquires whole-workpiece point cloud data, performs three-dimensional modeling of the workpiece surface and reconstructs the workpiece three-dimensional point cloud data, unifies the coordinate system of the spraying process, calculates the workpiece position by coordinate translation, automatically separates the fixed supports from the workpiece according to point cloud data features, and finally solves for the workpiece surface normal vectors from the workpiece surface model, plans the spraying trajectory in combination with the spraying range, forms data comprising coordinate positions and spraying directions, and sends the data to the robot control mechanism, thereby achieving automatic spraying of various non-standard workpieces.
The vision-based automatic spraying method and system provided by the application have been described in detail above. The description of the specific embodiments is only intended to aid understanding of the system of the present application and its core ideas. It should be noted that those skilled in the art may make various modifications and adaptations to the application without departing from its principles, and such modifications and adaptations are intended to fall within the scope of the application as defined by the following claims.

Claims (10)

1. A vision-based automatic spraying method, comprising the following steps:
S1, acquiring whole-workpiece point cloud data, and, before the workpiece is loaded, selecting and determining how the workpiece is placed on the flat-plate railcar according to the initial structural form of the workpiece;
S2, performing three-dimensional modeling of the workpiece surface: acquiring cross sections of the workpiece with a two-dimensional scanner and reconstructing three-dimensional point cloud data of the workpiece in combination with the horizontal travel speed of the flat-plate railcar;
S3, unifying a coordinate system of the spraying process;
S4, unifying the position of the robot tool end into a unified coordinate system based on the coordinates of the robot mounting base;
S5, calculating the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
S6, automatically separating the fixed supports from the workpiece according to point cloud data features;
and S7, solving for the workpiece surface normal vectors from the workpiece surface model, planning the spraying trajectory in combination with the spraying range, and finally forming data comprising coordinate positions and spraying directions and sending the data to the robot control mechanism.
2. The vision-based automatic spraying method according to claim 1, wherein the detailed steps of step S2 are as follows:
S201, acquiring contour data of the current cross-section of the workpiece surface with a large-field-of-view two-dimensional laser scanner, and measuring the position and movement speed of the positioning trolley with the laser range finder;
S202, converting the laser range finder data into position data with time-stamp information;
S203, combining the contour data acquired by the two-dimensional laser scanner with the time information, and reconstructing three-dimensional model data of the workpiece surface after coordinate-conversion calculation.
3. The vision-based automatic spraying method according to claim 2, wherein the coordinate system of the spraying process is unified as follows: the center position directly below the vision installation position is the global coordinate origin of the whole process; the origin defined at the tail of the flat-plate railcar is translated from the global coordinate origin by a distance along the Y-axis direction, this distance being the horizontal distance from the gantry to the tail of the flat-plate railcar as measured by the laser range finder.
4. The vision-based automatic spraying method according to claim 3, wherein in step S4 the unified coordinate system has the same axis directions as the global coordinate system used for the workpiece position, and the offset between the robot tool end position and the workpiece position is taken as the difference between the origins of the unified coordinate system and the global coordinate system.
5. The vision-based automatic spraying method according to claim 4, wherein the detailed flow of unifying the position of the robot tool end into the unified coordinate system is as follows:
S401, scanning the workpiece once to acquire workpiece point cloud data;
S402, calculating the global coordinates of the workpiece point cloud data;
S403, a fixed offset exists between the origin of the robot mounting base and the global coordinate origin; taking this fixed offset as a compensation value and adding it yields the coordinates of the robot base origin in the global coordinate system;
S404, determining the coordinates of the robot tool end in the global coordinate system from the coordinates of the robot mounting base in the global coordinate system and the current external-axis position.
6. The vision-based automatic spraying method according to claim 5, wherein in step S5, in the workpiece coordinate system the tail of the flat-plate railcar is the origin; the scanner completes scanning while the flat-plate railcar travels, so the position of the workpiece relative to the railcar tail is a known quantity, and the distance from the railcar tail to the origin of the global coordinate system is obtained from a ranging sensor, whereby the position of the workpiece in the global coordinate system is obtained.
7. The vision-based automatic spraying method according to claim 6, wherein the workpiece position calculation flow comprises:
S501, scanning the workpiece once, the point cloud coordinates obtained by scanning being workpiece coordinates;
S502, determining the distance between the tail of the flat-plate railcar and the origin of the global coordinate system;
S503, adding this distance to the Y-axis values of the workpiece coordinates to obtain the global coordinates of the point cloud.
8. The vision-based automatic spraying method according to claim 7, wherein in step S6 a distance is preset between the fixed supports and the workpiece, and the automatic separation is performed according to jumps in the laser point cloud data in the height layer below the fixed-support height.
9. The vision-based automatic spraying method according to claim 8, wherein the detailed process of separating the fixed supports from the workpiece in step S6 includes:
S601, determining the workpiece reference height;
S602, taking the reference height as a dividing line and retaining the point cloud data below it;
S603, taking the origin as the current scanning point;
S604, judging whether the current scanning point is within the range of the flat-plate railcar; if so, ending the process of separating the fixed supports from the workpiece, and if not, proceeding to step S605;
S605, judging whether point cloud exists within a spatial distance of 5 cm of the current scanning point; if so, determining that the current scanning point is a fixed-support position, and if not, proceeding to step S606;
S606, advancing the current scanning point along the X-axis to obtain the next scanning point, and repeating the judgment of step S604.
10. A vision-based automatic spraying system, characterized by comprising a whole-workpiece point cloud data acquisition unit, a workpiece placement mode judging unit, a workpiece three-dimensional point cloud data reconstruction unit, a spraying process coordinate system unit, a spraying robot coordinate conversion unit, a workpiece position calculation unit, a fixed support and workpiece separation unit, and a spraying trajectory planning unit;
the whole-workpiece point cloud data acquisition unit acquires workpiece contour point cloud data with a two-dimensional scanner;
the workpiece placement mode judging unit selects and determines, before the workpiece is loaded, how the workpiece is placed on the flat-plate railcar according to the preliminary structural form of the workpiece;
the workpiece three-dimensional point cloud data reconstruction unit acquires workpiece contour point cloud data with the two-dimensional scanner and reconstructs the workpiece three-dimensional point cloud data in combination with the horizontal travel speed of the flat-plate railcar;
the spraying process coordinate system unit determines the center position directly below the vision installation position as the global coordinate origin of the whole system;
the spraying robot coordinate conversion unit scans the workpiece once to obtain workpiece point cloud data, calculates the global coordinates of the workpiece point cloud data, takes the fixed offset between the robot mounting base origin and the global coordinate origin as a compensation value and adds it to obtain the coordinates of the robot base origin in the global coordinate system, and determines the coordinates of the robot tool end in the global coordinate system from the coordinates of the mounting base in the global coordinate system and the current external-axis position of the robot;
the workpiece position calculation unit calculates the position of the workpiece by coordinate translation from the flat-plate railcar position data measured by the laser range finder;
the fixed support and workpiece separation unit automatically separates the fixed supports, which are placed on the flat-plate railcar together with the workpiece, from the workpiece according to jumps in the laser point cloud data in the height layer below the fixed-support height;
and the spraying trajectory planning unit solves for the workpiece surface normal vectors from the workpiece surface model, plans the spraying trajectory in combination with the spraying range, and finally forms data comprising coordinate positions and spraying directions and sends the data to the robot control mechanism.
CN202410498199.9A 2024-04-24 2024-04-24 Automatic spraying method and system based on vision Active CN118080205B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410498199.9A CN118080205B (en) 2024-04-24 2024-04-24 Automatic spraying method and system based on vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410498199.9A CN118080205B (en) 2024-04-24 2024-04-24 Automatic spraying method and system based on vision

Publications (2)

Publication Number Publication Date
CN118080205A true CN118080205A (en) 2024-05-28
CN118080205B CN118080205B (en) 2024-07-23

Family

ID=91144372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410498199.9A Active CN118080205B (en) 2024-04-24 2024-04-24 Automatic spraying method and system based on vision

Country Status (1)

Country Link
CN (1) CN118080205B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118305021A (en) * 2024-06-11 2024-07-09 四川吉埃智能科技有限公司 Automatic spraying robot control software system based on three-dimensional imaging technology

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02295354A (en) * 1989-05-10 1990-12-06 Canon Inc Picture processor
RU2004110957A (en) * 2004-04-12 2005-10-20 ЗАО "Центр перспективных наукоемких технологий" (RU) METHOD FOR DIAGNOSIS OF LOCAL WEAR OF THE CONTACT NETWORK OF RAILWAY ELECTRIC POWER SUPPLY
CN106423656A (en) * 2016-08-11 2017-02-22 重庆大学 Automatic spraying system and automatic spraying method based on point cloud and image matching
CN108132025A (en) * 2017-12-24 2018-06-08 上海捷崇科技有限公司 A kind of vehicle three-dimensional outline scans construction method
CN108549087A (en) * 2018-04-16 2018-09-18 北京瑞途科技有限公司 A kind of online test method based on laser radar
CN111141252A (en) * 2018-12-14 2020-05-12 广东星舆科技有限公司 Monocular calibration ranging method and system
CN109967292A (en) * 2019-04-18 2019-07-05 中联西北工程设计研究院有限公司 A kind of automatic spraying system and its method based on the reconstruct of workpiece profile information three-dimensional
CN110052347A (en) * 2019-04-24 2019-07-26 佛山科学技术学院 A kind of automatic painting method and system based on machine vision
CN113514847A (en) * 2020-04-10 2021-10-19 深圳市镭神智能***有限公司 Vehicle outer contour dimension detection method and system and storage medium
CN112325796A (en) * 2020-10-26 2021-02-05 上海交通大学 Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing
CN114593691A (en) * 2020-12-04 2022-06-07 长安大学 Method and device capable of realizing single-line laser reconstruction of three-dimensional scene
CN112699820A (en) * 2021-01-04 2021-04-23 湖南长院悦诚装备有限公司 Method and system for recycling roller for laying steel rail
WO2022165973A1 (en) * 2021-02-05 2022-08-11 杭州思看科技有限公司 Three-dimensional scanning method and system, electronic device, and computer equipment
CN115131268A (en) * 2021-03-25 2022-09-30 南京知谱光电科技有限公司 Automatic welding system based on image feature extraction and three-dimensional model matching
CN113674345A (en) * 2021-10-25 2021-11-19 成都新西旺自动化科技有限公司 Two-dimensional pixel-level three-dimensional positioning system and positioning method
CN116152306A (en) * 2023-03-07 2023-05-23 北京百度网讯科技有限公司 Method, device, apparatus and medium for determining masonry quality
CN116603660A (en) * 2023-04-20 2023-08-18 东方电气集团东方锅炉股份有限公司 Spraying system based on laser three-dimensional point cloud
CN116809267A (en) * 2023-05-22 2023-09-29 山东科技大学 Method for automatically spraying large-sized workpiece
CN116630576A (en) * 2023-07-24 2023-08-22 四川吉埃智能科技有限公司 Casting structure reverse modeling method based on point cloud data
CN117824502A (en) * 2024-02-05 2024-04-05 四川吉埃智能科技有限公司 Laser three-dimensional scanning-based non-contact detection method for assembling complex assembled workpiece

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
卫洪春: "Fast coordinate system transformation and its application in realistic graphics", Journal of Sichuan University of Arts and Science, no. 05, 10 September 2017 (2017-09-10), pages 25-28 *
涂朴: "Research and implementation of laser ranging based on image processing", Journal of Sichuan University of Arts and Science, no. 02, 10 March 2017 (2017-03-10), pages 44-46 *
胡媛媛; 杨霞: "Research on robot-based 3D reconstruction technology for pipeline inner walls", Industrial Instrumentation & Automation, no. 04, 15 August 2016 (2016-08-15), pages 123-126 *

Also Published As

Publication number Publication date
CN118080205B (en) 2024-07-23

Similar Documents

Publication Publication Date Title
CN118080205B (en) Automatic spraying method and system based on vision
EP3469974B1 (en) Cooperative work system formed by mother robot and child robot, and operation method thereof
CN107414253B (en) Welding seam tracking control device and method based on cross laser
CN112059363B (en) Unmanned wall climbing welding robot based on vision measurement and welding method thereof
CN202438792U (en) Control system for welding robot
CN103170767B (en) Welding robot control method
CN102590245B (en) Intelligent X-ray digital flat imaging detection system device and detection method
CN114161048B (en) 3D vision-based parameterized welding method and device for tower legs of iron tower
CN202471622U (en) X-ray digital panel imaging intelligent detection system device
CN109914756A Indoor wall 3D putty printing and smoothing method based on an indoor construction intelligent robot
CN114769988B (en) Welding control method, system, welding equipment and storage medium
CN111830984B (en) Multi-machine cooperative car washing system and method based on unmanned car washing equipment
CN112223292A (en) Online grinding system of structural member welding seam intelligent grinding and polishing robot
CN107097122A (en) A kind of robot for independently grinding large-scale free form surface
CN112427777A (en) Robot self-adaptive intelligent welding system and welding method for assembly in ship
CN108788394B (en) Laser scanning welding seam tracking device and tracking method thereof
CN116603660A (en) Spraying system based on laser three-dimensional point cloud
CN204965141U Real-time online welding process adjustment system based on a 3D model
CN113348056A (en) Industrial robot device with improved tool path generation and method for operating an industrial robot device according to an improved tool path
CN114434036B (en) Three-dimensional vision system for gantry robot welding of large ship structural member and operation method
CN116117373A (en) Intelligent welding method and system for small assembly components in ship
CN111774775A (en) Three-dimensional vision system for gantry type robot welding of large-scale structural part and control method
CN110961583A (en) Steel ladle positioning device adopting laser scanning and using method thereof
CN112493926B Floor-sweeping robot for scanning furniture bottom profiles
CN117359625A (en) Robot special-shaped motion track automatic planning method based on point cloud data guiding technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant