CN113567452B - Burr detection method, device, equipment and storage medium - Google Patents

Burr detection method, device, equipment and storage medium

Info

Publication number
CN113567452B
CN113567452B CN202110849072.3A
Authority
CN
China
Prior art keywords
target
translation
motor driver
camera
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110849072.3A
Other languages
Chinese (zh)
Other versions
CN113567452A (en)
Inventor
赵亮
钞蓓英
南星佑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Shendian Vision Technology Co ltd
Original Assignee
Beijing Shendian Vision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Shendian Vision Technology Co ltd filed Critical Beijing Shendian Vision Technology Co ltd
Priority to CN202110849072.3A priority Critical patent/CN113567452B/en
Publication of CN113567452A publication Critical patent/CN113567452A/en
Application granted granted Critical
Publication of CN113567452B publication Critical patent/CN113567452B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01Arrangements or apparatus for facilitating the optical investigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8806Specially adapted optical and illumination features
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00Control of position or direction
    • G05D3/12Control of position or direction using feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01Arrangements or apparatus for facilitating the optical investigation
    • G01N2021/0106General arrangement of respective parts
    • G01N2021/0112Apparatus in one mechanical, optical or electronic block
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Automation & Control Theory (AREA)
  • Quality & Reliability (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a burr detection method, device, equipment and storage medium. The method comprises the following steps: after receiving, from the data processing module, the position coordinates of a target position on a target pole piece, the PLC controls the camera to translate through a translation motor driver; after receiving a first feedback signal sent by the translation motor driver, the data processing module acquires a first target difference through a ranging sensor and sends it to the PLC; after receiving the first target difference, the PLC controls the camera to move through a linear motor driver; after receiving a second feedback signal sent by the linear motor driver, the data processing module takes a target image acquired by the camera as input data of a burr detection model to obtain a target label, and takes the content of the target label as the category of the target burr at the target position. By this method, the manual workload is reduced.

Description

Burr detection method, device, equipment and storage medium
Technical Field
The present application relates to the field of burr detection, and in particular, to a burr detection method, apparatus, device, and storage medium.
Background
In the lithium battery production process, coating, rolling, slitting, die cutting, winding and other working procedures are carried out in sequence, and the anode and the cathode of the lithium battery are insulated by a separator. If, during slitting or die cutting, the length of a burr exceeds the thickness of the insulating separator, the anode and the cathode of the battery are short-circuited, creating risks such as fire and explosion and seriously endangering personal safety. Burrs therefore need to be strictly detected during the slitting or die cutting of lithium battery pole pieces.
At present, the detection of burrs mainly relies on manual spot checks: a section of the pole piece is cut off during the pole piece cutting process, and the cut piece is placed under a two-dimensional measuring instrument for manual observation. This detection process is cumbersome and requires manual participation throughout, so the manual workload is high.
Disclosure of Invention
In view of this, the embodiments of the present application provide a burr detection method, apparatus, device, and storage medium, so as to reduce the manual workload.
The application mainly comprises the following aspects:
In a first aspect, an embodiment of the present application provides a burr detection method, applied to a burr detection device. The burr detection device includes: a camera provided with a telecentric lens, a ranging sensor, a data processing module, a programmable logic controller (PLC) and a motor driver, wherein the data processing module is electrically connected to the camera, the ranging sensor and the PLC respectively, the PLC is electrically connected to the motor driver, and the motor driver includes a linear motor driver, a translation motor driver and a rotary motor driver. The burr detection method comprises the following steps:
After receiving the position coordinates of the target position on the target pole piece sent by the data processing module, the PLC sends a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece;
the data processing module acquires a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sends the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position;
after receiving the first target difference value, the PLC sends a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance;
And the data processing module acquires a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and takes the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
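As an illustration of the final step, the sketch below shows how the target image might be fed to a burr detection model and the content of the target label read back as the burr category. The label set and the model interface are assumptions for illustration only; the patent does not specify either.

```python
# Hypothetical burr categories; the patent does not enumerate them.
LABELS = {0: "no burr", 1: "burr within tolerance", 2: "burr out of tolerance"}

def classify_target_image(image, model):
    """Run the burr detection model on the target image and return the
    content of the target label as the category of the target burr."""
    scores = model(image)  # assumed: the model returns one score per category
    target_label = max(range(len(scores)), key=lambda i: scores[i])
    return LABELS[target_label]
```

For example, with a stub model returning the scores `[0.1, 0.7, 0.2]`, the function returns `"burr within tolerance"`.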
Optionally, before the translation instruction carrying the translation direction and the translation distance is sent to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the method further includes:
judging whether the target position is positioned in a target area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens;
and if the target position is not located in the target area, sending a rotation instruction carrying a preset rotation direction and a preset rotation angle to the rotary motor driver, so that the rotary motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle.
Optionally, before the translation instruction carrying the translation direction and the translation distance is sent to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the method further includes:
calculating a second target difference between the abscissa and the lateral position coordinate;
determining, according to a second sign representing positive or negative in the second target difference, a preset translation direction matching the second sign, and taking the preset translation direction as the translation direction;
and taking the second numerical value of the second target difference as the value of the translation distance.
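The optional computation above (the sign of the second target difference selects the translation direction; its numerical value gives the translation distance) can be sketched as follows. The direction names are assumptions; the patent only states that the sign-to-direction matching is preset.

```python
def translation_command(target_abscissa, camera_lateral):
    """Derive the translation direction and distance from the second
    target difference between the abscissa of the target position and
    the lateral position coordinate of the camera."""
    second_diff = target_abscissa - camera_lateral
    # Assumed preset mapping: non-negative difference -> translate right.
    direction = "right" if second_diff >= 0 else "left"
    return direction, abs(second_diff)
```

With the 10 cm example from the description, `translation_command(13.0, 3.0)` yields `("right", 10.0)`.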
Optionally, before the target image is used as input data of the burr detection model to obtain a target label for representing the burr category in the target image, the method further includes:
training a candidate burr detection model for a plurality of times by using a sample training set, wherein the sample training set comprises at least one historical burr image set and preset labels which are set for the historical burr image set, and the preset labels are used for representing types of burrs in each historical burr image included in the historical burr image set;
Judging whether the candidate burr detection model meets a preset condition or not, wherein the preset condition comprises: the accuracy of the candidate burr detection model is greater than a preset threshold value, and/or the training times are greater than or equal to preset times;
and if the preset condition is met, taking the candidate burr detection model as the burr detection model.
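The preset condition above (accuracy above a threshold and/or training count at a limit) can be sketched as the following training loop. The model interface, threshold and round limit are placeholders; the patent leaves all of them unspecified.

```python
def train_burr_model(candidate_model, training_set,
                     accuracy_threshold=0.95, max_rounds=100):
    """Train the candidate burr detection model until the preset
    condition is met, then promote it to the burr detection model."""
    for rounds in range(1, max_rounds + 1):
        candidate_model.fit(training_set)                # one training pass
        accuracy = candidate_model.evaluate(training_set)
        # Preset condition: accuracy above threshold and/or round limit hit.
        if accuracy > accuracy_threshold or rounds >= max_rounds:
            break
    return candidate_model
```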
Optionally, the method further comprises: and displaying the category of the target burr.
In a second aspect, an embodiment of the present application provides a burr detection device, including: a camera provided with a telecentric lens, a ranging sensor, a data processing module, a programmable logic controller (PLC) and a motor driver, wherein the data processing module is electrically connected to the camera, the ranging sensor and the PLC respectively, the PLC is electrically connected to the motor driver, and the motor driver includes a linear motor driver, a translation motor driver and a rotary motor driver;
the PLC is used for sending a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera after receiving the position coordinates of the target position on the target pole piece sent by the data processing module, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece;
The data processing module is used for acquiring a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sending the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position;
the PLC is used for sending a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value after receiving the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance;
and the data processing module is used for acquiring a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and taking the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
Optionally, before sending the translation instruction carrying the translation direction and the translation distance to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the PLC is further configured to:
judging whether the target position is positioned in a target area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens;
and if the target position is not located in the target area, send a rotation instruction carrying a preset rotation direction and a preset rotation angle to the rotary motor driver, so that the rotary motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle.
Optionally, before sending the translation instruction carrying the translation direction and the translation distance to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the PLC is further configured to:
calculate a second target difference between the abscissa and the lateral position coordinate;
determine, according to a second sign representing positive or negative in the second target difference, a preset translation direction matching the second sign, and take the preset translation direction as the translation direction;
and take the second numerical value of the second target difference as the value of the translation distance.
Optionally, before taking the target image as input data of the burr detection model, the data processing module is further configured to:
training a candidate burr detection model for a plurality of times by using a sample training set, wherein the sample training set comprises at least one historical burr image set and preset labels which are set for the historical burr image set, and the preset labels are used for representing types of burrs in each historical burr image included in the historical burr image set;
judging whether the candidate burr detection model meets a preset condition or not, wherein the preset condition comprises: the accuracy of the candidate burr detection model is greater than a preset threshold value, and/or the training times are greater than or equal to preset times;
and if the preset condition is met, taking the candidate burr detection model as the burr detection model.
Optionally, the burr detection device further includes: and the display module is used for displaying the category of the target burr.
In a third aspect, an embodiment of the present application provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the burr detection method according to any one of the first aspects when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method for detecting burrs as described in any of the first aspects above.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
According to the burr detection method provided by the embodiments of the present application, after the target pole piece to be detected is determined, the data processing module sends the position coordinates of the target position to be detected on the target pole piece to the PLC (programmable logic controller). After receiving the position coordinates, the PLC sends a translation instruction, carrying a translation direction and a translation distance, to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera. After receiving the translation instruction, the translation motor driver controls the camera to translate according to the translation direction and the translation distance, so that the camera moves to a position from which an image of the target position can be acquired. When the translation is finished, the translation motor driver sends a first feedback signal indicating the end of the translation to the data processing module through the PLC, indicating that the next operation can be performed. After receiving the first feedback signal, the data processing module acquires the first target difference through the ranging sensor and sends it to the PLC, so that the PLC sends a movement instruction, carrying a movement direction and a movement distance, to the linear motor driver according to the numerical value and the sign of the first target difference. The linear motor driver then controls the camera to move according to the movement direction and the movement distance until the distance between the camera and the target position equals the preset distance, at which point the camera can acquire a clear image of the target position.
After the movement of the camera is finished, the linear motor driver sends a second feedback signal indicating the end of the movement to the data processing module through the PLC, indicating that the next operation can be performed. After receiving the second feedback signal, the data processing module acquires a target image at the target position through the camera, inputs the target image into the burr detection model to obtain a target label representing the category of the burrs in the target image, and obtains, from the content of the target label, the category of the target burr of the target pole piece at the target position, thereby completing the burr detection task.
In the above process, obtaining the position information, moving the camera according to the position information, acquiring the target image with the camera and classifying the target image are all completed automatically by the burr detection device, so the burr detection process is fully automated. Compared with the manual detection process in the prior art, the burr detection process of the present application requires no manual participation, which helps reduce the manual workload. In addition, the burr detection method of the present application is implemented entirely by the burr detection device, so the detection steps follow one another without interruption, which helps improve detection efficiency. Moreover, the position of the camera is adjusted before each detection so as to obtain a clear image of the target position, which improves the accuracy with which the burr detection model classifies the target image and thus the accuracy of burr detection.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 shows a flowchart of a burr detection method according to a first embodiment of the present application;
fig. 2 is a schematic structural diagram of a burr detection device according to a second embodiment of the present application;
fig. 3 is a physical diagram of another burr detection apparatus according to the second embodiment of the present application;
fig. 4 shows a schematic structural diagram of a computer device according to a third embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
Based on this, the embodiment of the application provides a burr detection method, a device, equipment and a storage medium, and the following description is made by embodiments.
Example 1
Fig. 1 shows a flowchart of a burr detection method according to an embodiment of the present application. As shown in Fig. 1, the burr detection method is applied to a burr detection device, and the burr detection device includes: a camera provided with a telecentric lens, a ranging sensor, a data processing module, a PLC and a motor driver, wherein the data processing module is electrically connected to the camera, the ranging sensor and the PLC respectively, the PLC is electrically connected to the motor driver, and the motor driver includes a linear motor driver, a translation motor driver and a rotary motor driver. The burr detection method comprises the following steps:
step S101: after receiving the position coordinates of the target position on the target pole piece sent by the data processing module, the PLC sends a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece.
Specifically, the data processing module is located on a PC (personal computer). After the lithium battery pole piece is slit or die-cut, at least one lithium battery pole piece is obtained, and the resulting burrs are generally located at the cutting edge of the cut pole piece, namely the edge to be detected. A user inputs in advance, into the data processing module, the number of each lithium battery pole piece that requires burr detection and the position information of the edge to be detected on each pole piece, and the data processing module stores the numbers and position information input by the user. During burr detection, the user inputs the number of the pole piece currently to be detected on the display screen of the PC; after receiving the number, the data processing module takes the lithium battery pole piece corresponding to that number as the target pole piece, and then determines, in the database, the position information of the edge to be detected on the target pole piece, namely the position coordinates of the target position. After the position coordinates of the target position are determined, they are sent to the PLC. After the PLC receives the position coordinates, it sends a translation instruction to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, wherein the translation instruction carries a translation direction and a translation distance, and the translation direction includes left translation and right translation. After receiving the translation instruction, the translation motor driver controls the camera to translate according to the translation direction and the translation distance. For example, if the translation direction carried by the translation instruction is right translation and the translation distance is 10 cm, the translation motor driver controls the camera to translate 10 cm to the right.
It should be noted that the position coordinates of the target position may take the form of two-dimensional coordinates with the standard position of the camera as the origin, for example (2, -3). The standard position is the position at which the camera is vertically aligned with the target position and the distance between them equals the preset distance. The lateral position coordinate of the camera is a one-dimensional coordinate with the abscissa of the standard position of the camera as the origin, for example 3 cm.
It should be noted that, when the left edge and the right edge of the target pole piece are both cutting edges, the target pole piece has two edges to be detected, and when the left edge or the right edge of the target pole piece is a cutting edge, the target pole piece has one edge to be detected.
It should be noted that, after the translation, the camera reaches the preset position, namely: the camera is vertically aligned with the target location.
It should be noted that the above-mentioned electrical connection may be set according to practical situations, for example, by way of a communication cable, and the specific electrical connection is not specifically limited herein.
It should be noted that the camera may be selected according to the actual situation; for example, a camera with high resolution and a large depth of field may be chosen, and the specific selection is not limited herein. The ranging sensor may likewise be selected according to indexes such as repetition accuracy, linearity, measurement range, effective range, and accuracy error.
Step S102: the data processing module acquires a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sends the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position.
Specifically, after the camera has finished moving, the translation motor driver sends a first feedback signal indicating the end of translation to the data processing module, signalling that the camera is now located directly below or above the target position and that the next operation can proceed. After receiving the first feedback signal, the data processing module sends a first acquisition instruction to the ranging sensor so that the sensor acquires the first target difference value; after acquiring it, the ranging sensor sends the first target difference value to the data processing module, which in turn sends it to the PLC.
It should be noted that the positional relationship between the ranging sensor and the camera is fixed. When the camera is vertically aligned with the target position and the distance between the camera and the target position equals the preset distance, the camera is at the standard position, where it can capture the sharpest image of the target position; the user therefore sets the coordinate of the standard position to zero, and at that point the first target difference value acquired by the ranging sensor is zero. When a first target difference value exists between the target distance and the preset distance, that is, when an offset occurs relative to the zero point, the ranging sensor can acquire the first target difference value directly. Here the target distance refers to the distance between the camera's current position and the target position.
It should be noted that when the first target difference value is not 0, the camera is not at the standard position and the image acquired at the target position risks being unclear, so the camera needs to be moved to the standard position.
It should be noted that since there is no physical connection between the ranging sensor and the PLC, data transmission between them must be relayed by the data processing module.
Step S103: after receiving the first target difference value, the PLC sends a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance.
Specifically, after receiving the first target difference value, the PLC sends a movement instruction to the linear motor driver according to the first value, which represents the magnitude of the first target difference, and the first sign, which represents positive or negative in the first target difference. The movement instruction carries a movement direction and a movement distance; after receiving it, the linear motor driver controls the camera to move accordingly, where the movement direction is either upward or downward.
As an example of the first value and the first sign: when the first target difference value is -5, the first value is 5 and the first sign is "-".
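The decomposition of the first target difference into its first value and first sign can be sketched as follows (a minimal illustration; the function name is an assumption, not from the application):

```python
def decompose_difference(target_diff):
    """Split the signed first target difference (target distance minus
    preset distance) into its magnitude (the "first value") and its
    sign (the "first sign")."""
    return abs(target_diff), ("-" if target_diff < 0 else "+")

# Example from the text: a first target difference of -5 yields
# a first value of 5 and a first sign of "-".
value, sign = decompose_difference(-5)
```

The PLC would then map the sign to a movement direction (up or down) and use the value as the movement distance.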
Step S104: and the data processing module acquires a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and takes the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
Specifically, after the camera has finished moving, the linear motor driver sends a second feedback signal indicating the end of movement to the PLC, signalling that the camera is now at the standard position and image acquisition can begin. After receiving the second feedback signal, the PLC forwards it to the data processing module; after receiving it, the data processing module sends a second acquisition signal to the camera so that the camera acquires a target image of the target position. After acquiring the target image, the camera sends it to the data processing module, which uses the target image as input data for a burr detection model pre-stored in the data processing module, that is, inputs the target image into the burr detection model and obtains the target label output by the model. The target label represents the category of the burrs in the target image, that is, the category of the burrs of the target pole piece at the target position, so the content of the target label can be taken as the category of the target burrs of the target pole piece at the target position.
The types of the target burrs include burrs, remnants, dust, and defects.
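The classification step above can be sketched as follows. The model here is a stand-in stub, not the application's trained model; the category list mirrors the four categories named above, and all names are assumptions:

```python
# Stand-in sketch of the classification step: target image in, target
# label out. A real burr detection model would be a trained classifier.
CATEGORIES = ["burr", "remnant", "dust", "defect"]

def classify(target_image, model):
    """Feed the target image to the burr detection model and return the
    target label, i.e. the category of the burrs in the image."""
    index = model(target_image)      # assumed to return a category index
    return CATEGORIES[index]

stub_model = lambda image: 0         # always predicts the first category
label = classify("target-image", stub_model)
```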
In the burr detection method provided in fig. 1, after the target pole piece to be detected is determined, the data processing module sends the position coordinates of the target position to be detected on the target pole piece to the PLC (Programmable Logic Controller). After receiving the position coordinates, the PLC sends a translation instruction, carrying a translation direction and a translation distance, to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera. After receiving the translation instruction, the translation motor driver controls the camera to translate according to the translation direction and distance, so that the camera moves to a position where an image of the target position can be acquired. After the translation ends, the translation motor driver sends a first feedback signal indicating the end of translation to the data processing module, signalling that the next operation can proceed. After receiving the first feedback signal, the data processing module acquires a first target difference value through the ranging sensor and sends it to the PLC, so that the PLC sends a movement instruction to the linear motor driver according to the first value and the first sign of the first target difference value; the linear motor driver then controls the camera to move according to the movement direction and movement distance carried by the instruction, that is, the distance between the camera and the target position is adjusted to equal the preset distance, so that the camera can acquire a clear image of the target position. After the movement of the camera ends, the linear motor driver sends, through the PLC, a second feedback signal indicating the end of movement to the data processing module, signalling that the next operation can proceed. After receiving the second feedback signal, the data processing module acquires a target image of the target position through the camera, inputs the target image into the burr detection model, and obtains a target label representing the category of the burrs in the target image; the category of the target burrs of the target pole piece at the target position is then obtained from the content of the target label, completing the burr detection task.
In the above process, the steps of obtaining the position information, moving the camera according to the position information, acquiring the target image with the camera, and classifying the target image are all completed automatically by the burr detection apparatus, so the entire burr detection process is automated. Compared with the manual inspection of the prior art, the burr detection process of the present application requires no human participation, which helps reduce manual workload. In addition, the burr detection method of the present application is carried out entirely by the burr detection apparatus, so the detection process is well coordinated, which helps improve detection efficiency. Moreover, before each image acquisition the position of the camera is adjusted so as to obtain a clear image of the target position, which improves the accuracy with which the burr detection model classifies the target image and thereby improves the accuracy of burr detection.
In another possible embodiment, after the first target difference value is acquired by the ranging sensor in the step S102, the burr detection method further includes: and the data processing module sends a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to the numerical value of the first target difference value and a sign used for representing positive or negative in the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance.
Specifically, the data processing module can be directly connected with the linear motor driver, without passing through the PLC (programmable logic controller). This arrangement shortens the data transmission path, thereby reducing the risk of data loss during transmission, improving the efficiency and security of data transmission, and reducing hardware cost.
In a possible embodiment, before performing the above step S101, the burr detection method further includes the steps of:
Step S201: judging whether the target position is located in a target area facing the shooting direction, according to the position coordinates and the shooting direction of the telecentric lens.
Step S202: if the target position is not located in the target area, sending a rotation instruction carrying a preset rotation direction and a preset rotation angle to the rotating motor driver, so that the rotating motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle.
Specifically, the above position coordinates are coordinates with the standard position of the camera as the origin. The shooting direction of the telecentric lens refers to the orientation of the lens, and the target area refers to an area on the same side as the lens orientation; the size and position of this area may be set according to the actual situation and are not limited herein. If the position coordinates are not located in the target area, the target position lies in the area opposite the orientation of the telecentric lens, and the orientation of the camera must be rotated before an image at the position coordinates can be captured. The PLC therefore sends a rotation instruction, carrying a preset rotation direction and a preset rotation angle, to the rotating motor driver; after receiving the instruction, the rotating motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle, that is, to rotate by the preset rotation angle along the preset rotation direction.
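The decision in steps S201 and S202 can be sketched as follows. This is a minimal illustration under a simplifying assumption: the "target area" is reduced to the half-plane on the side the lens faces, and the function names and the rightward/180-degree preset are taken from the example settings, not mandated by the application:

```python
def needs_rotation(target_abscissa, lens_facing_right=True):
    """Return True when the target position lies on the side opposite the
    telecentric lens orientation, so the camera must be rotated before
    shooting (half-plane approximation of the 'target area')."""
    target_on_right = target_abscissa >= 0
    return target_on_right != lens_facing_right

def rotation_instruction():
    """Preset rotation carried by the rotation instruction; the text gives
    rightward rotation by 180 degrees as an example setting."""
    return {"direction": "right", "angle_deg": 180}

# A target at abscissa -4 while the lens faces right requires a rotation:
if needs_rotation(target_abscissa=-4, lens_facing_right=True):
    cmd = rotation_instruction()
```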
It should be noted that the preset rotation direction and the preset rotation angle may be set according to the actual situation; for example, the preset rotation direction may be set to rightward rotation and the preset rotation angle to 180 degrees. Their specific settings are not limited herein.
It should be noted that, the above step S201 and the above step S202 may be located after the above step S101, and at this time, the data processing module may further perform the operations in the steps S102 to S104 after receiving the third feedback signal sent by the rotary motor driver and indicating that the rotation is completed, to indicate that the camera is currently located at the standard position.
In another possible embodiment, before performing the step S101, the burr detection method further includes: after the data processing module acquires the position coordinates of the target position on the target pole piece, judging whether the target position is located in an area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens; and if the target position is not located in that area, sending a rotation instruction carrying a preset rotation direction and a preset rotation angle to the rotating motor driver, so that the rotating motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle.
Specifically, the data processing module can be directly connected with the rotating motor driver, without passing through the PLC (programmable logic controller). This arrangement shortens the data transmission path, thereby reducing the risk of data loss during transmission, improving the efficiency and security of data transmission, and reducing hardware cost.
In a possible embodiment, before performing the above step S101, the burr detection method further includes the steps of:
step S301: and calculating a second target difference between the abscissa and the transverse position coordinate.
Step S302: determining a preset translation direction matched with the second sign according to a second sign used for representing positive or negative in the second target difference value, and taking the preset translation direction as the translation direction;
step S303: and taking the second numerical value as the numerical value of the translation distance according to the second numerical value of the second target difference value.
Specifically, in order to determine the lateral deviation between the target position and the camera, a second target difference between the abscissa of the target position and the lateral position coordinate of the camera is calculated; the resulting second target difference is the lateral deviation. Before shooting, the camera must be vertically aligned with the target position, that is, the lateral deviation between them must be 0. A preset translation direction pre-matched with the second sign can therefore be determined according to the second sign, which represents positive or negative in the calculated second target difference, for example: if the second sign is positive, translate rightward; if it is negative, translate leftward. The specific matching may be set according to the actual situation and is not limited herein. After the translation direction is determined, the second value, which represents the magnitude of the second target difference, is taken as the translation distance, that is, the translation distance is determined.
To illustrate: suppose the preset translation direction pre-matched to the second sign is leftward translation when the second sign is positive and rightward translation when the second sign is negative; then when the second target difference value is +5, the determined translation direction is leftward and the determined translation distance is 5. For details of the second value and the second sign, refer to the description of the first value and the first sign above, which is not repeated here.
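The calculation in steps S301 to S303, including the configurable sign-to-direction matching described above, might look like this (a sketch; the function name and parameterization are assumptions):

```python
def translation_from_difference(abscissa, camera_lateral,
                                positive_means="left"):
    """Compute the second target difference (step S301) and derive the
    translation direction (S302) and translation distance (S303).
    The matching of sign to direction is configurable, as the text notes."""
    diff = abscissa - camera_lateral          # second target difference
    negative_means = "right" if positive_means == "left" else "left"
    direction = positive_means if diff >= 0 else negative_means
    return direction, abs(diff)

# With the example matching (positive -> left, negative -> right), a second
# target difference of +5 yields direction "left" and distance 5:
direction, distance = translation_from_difference(5, 0, positive_means="left")
```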
In a possible embodiment, before performing the step S103, the burr detection method further includes:
determining a preset moving direction matched with the first sign according to a first sign used for representing positive or negative in the first target difference value, and taking the preset moving direction as the moving direction;
and taking the first value as the value of the moving distance according to the first value of the first target difference value.
Specifically, the specific description of the manner of determining the moving direction and the moving distance refers to the descriptions of the steps S301 to S303, and will not be repeated here.
In a possible embodiment, before performing the step S104, the burr detection method further includes the steps of:
Step S401: and training the candidate burr detection model for a plurality of times by using a sample training set, wherein the sample training set comprises at least one historical burr image set and preset labels set for the historical burr image set, and the preset labels are used for representing types of burrs in each historical burr image included in the historical burr image set.
Step S402: judging whether the candidate burr detection model meets a preset condition or not, wherein the preset condition comprises: the accuracy of the candidate burr detection model is greater than a preset threshold value, and/or the training times are greater than or equal to preset times.
Step S403: and if the preset condition is met, taking the candidate burr detection model as the burr detection model.
Specifically, the data processing module obtains at least one historical burr image set input by the user and a preset label set for each historical burr image set. A historical burr image set comprises at least one historical burr image, and the preset label of each image in a set is the same as that of the set. A historical burr image is a burr image that has been classified manually from experience, and the preset label of a set represents the category of the burrs in each image in that set; the preset labels include burr, remnant, dust, defect, and the like. The sample training set comprises all historical burr image sets obtained by the data processing module together with the preset label corresponding to each set. Inputting the sample training set into the candidate burr detection model completes one round of training; to improve the accuracy of the candidate burr detection model, it must be trained multiple times with the sample training set, that is, the sample training set must be input into the candidate burr detection model multiple times.
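The assembly of the sample training set described above can be sketched as follows (the data layout is an assumed representation: each historical burr image set is a list of images sharing one preset label; names and example values are illustrative):

```python
def build_sample_training_set(image_sets):
    """Flatten {preset_label: [historical burr images]} into (image, label)
    pairs; every image inherits the preset label of its set."""
    samples = []
    for preset_label, images in image_sets.items():
        samples.extend((image, preset_label) for image in images)
    return samples

# Two historical burr image sets with their preset labels:
training_set = build_sample_training_set({
    "burr": ["img_001", "img_002"],
    "dust": ["img_003"],
})
```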
After each round of training, the accuracy of the candidate burr detection model is determined from its classification result for each historical burr image and the preset label of the historical burr image set to which that image belongs. When the accuracy is greater than the preset threshold, the candidate model satisfies the preset condition and is taken as the burr detection model; likewise, when the number of training rounds is greater than or equal to the preset number, the candidate burr detection model satisfies the preset condition and may also be taken as the burr detection model.
It should be noted that before the candidate burr detection model is trained with the sample training set, its parameters need to be set, for example the batch size and the learning rate of the candidate model.
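Steps S401 to S403 can be sketched as a simple training loop (the stubs, the 0.95 threshold, and the round count below are illustrative assumptions, not values from the application):

```python
def select_model(candidate, train_once, evaluate,
                 acc_threshold=0.95, max_rounds=100):
    """Train the candidate burr detection model repeatedly on the sample
    training set (step S401) and accept it once the preset condition holds
    (steps S402-S403): accuracy above the threshold, and/or the training
    count reaching the preset number of rounds."""
    for rounds in range(1, max_rounds + 1):
        train_once(candidate)                 # one pass over the sample set
        if evaluate(candidate) > acc_threshold:
            return candidate, rounds          # accuracy condition met
    return candidate, max_rounds              # round-count condition met

# Stub demonstration: accuracy improves by 0.2 per training round, so the
# accuracy condition is met after the fifth round.
state = {"acc": 0.0}
model, rounds = select_model(
    candidate=object(),
    train_once=lambda m: state.update(acc=state["acc"] + 0.2),
    evaluate=lambda m: state["acc"],
)
```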
In one possible embodiment, the burr detection method further includes: displaying the category of the target burrs.
Specifically, after the target image has been classified by the burr detection model, the category of the target burrs in the target image needs to be displayed. The display mode may be set according to the actual situation; for example, the target image may be displayed with the category of the target burrs shown as an annotation. The specific display mode is not limited herein.
Example two
Fig. 2 is a schematic structural diagram of a burr detection device according to a second embodiment of the present application. As shown in fig. 2, the burr detection device includes: a camera provided with a telecentric lens, a ranging sensor, a data processing module, a programmable logic controller (PLC), and motor drivers, wherein the data processing module is electrically connected with the camera, the ranging sensor, and the PLC, respectively; the PLC is electrically connected with the motor drivers; and the motor drivers include a linear motor driver, a translation motor driver, and a rotating motor driver;
the PLC is used for sending a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera after receiving the position coordinates of the target position on the target pole piece sent by the data processing module, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece;
the data processing module is used for acquiring a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sending the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position;
The PLC is used for sending a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value after receiving the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance;
and the data processing module is used for acquiring a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and taking the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
In a possible embodiment, before sending the translation instruction carrying the translation direction and the translation distance to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the PLC is further configured to:
Judging whether the target position is positioned in a target area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens;
and if the target position is not located in the target area, sending a rotation instruction carrying a preset rotation direction and a preset rotation angle to the rotating motor driver, so that the rotating motor driver controls the camera to rotate according to the preset rotation direction and the preset rotation angle.
In a possible embodiment, before sending the translation instruction carrying the translation direction and the translation distance to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera, the PLC is further configured to:
calculating a second target difference between the abscissa and the lateral position coordinate;
determining a preset translation direction matched with the second sign according to a second sign used for representing positive or negative in the second target difference value, and taking the preset translation direction as the translation direction;
and taking the second numerical value as the numerical value of the translation distance according to the second numerical value of the second target difference value.
In a possible embodiment, before taking the target image as input data of the burr detection model to obtain a target label representing the category of the burrs in the target image, the data processing module is further configured to:
Training a candidate burr detection model for a plurality of times by using a sample training set, wherein the sample training set comprises at least one historical burr image set and preset labels which are set for the historical burr image set, and the preset labels are used for representing types of burrs in each historical burr image included in the historical burr image set;
judging whether the candidate burr detection model meets a preset condition or not, wherein the preset condition comprises: the accuracy of the candidate burr detection model is greater than a preset threshold value, and/or the training times are greater than or equal to preset times;
and if the preset condition is met, taking the candidate burr detection model as the burr detection model.
In a possible embodiment, the burr detection apparatus further includes: and the display module is used for displaying the category of the target burr.
In one possible embodiment, fig. 3 shows a physical diagram of another burr detection apparatus provided in example two of the present application, as shown in fig. 3, the burr detection apparatus includes:
the apparatus comprises a camera a, a telecentric lens b, backlights c1 and c2, a ranging sensor d, a linear motor e, a rotating motor f, and a translation motor g. The backlight c1 is mounted above pole piece 1, pole piece 2, and pole piece 3 by a light source bracket; the backlight c2 is mounted below pole piece 4 and pole piece 5 by a light source bracket; the camera is mounted between pole piece 1 and pole piece 5 by a camera bracket; the linear motor e, the rotating motor f, and the translation motor g are connected to the camera through a motor module; and the ranging sensor d is located on one side of the camera, level with the camera.
The apparatus provided by the embodiments of the present application may be specific hardware on a device or software or firmware installed on a device, etc. The device provided in the embodiments of the present application has the same implementation principle and technical effects as those of the foregoing method embodiments, and for a brief description, reference may be made to corresponding matters in the foregoing method embodiments where the device embodiment section is not mentioned. It will be clear to those skilled in the art that, for convenience and brevity, the specific operation of the system, apparatus and unit described above may refer to the corresponding process in the above method embodiment, which is not described in detail herein.
According to the burr detection method provided by the embodiment of the present application, after the target pole piece to be detected is determined, the data processing module sends the position coordinates of the target position to be detected on the target pole piece to the PLC (Programmable Logic Controller). After receiving the position coordinates, the PLC sends a translation instruction, carrying a translation direction and a translation distance, to the translation motor driver according to the abscissa in the position coordinates and the lateral position coordinate of the camera. After receiving the translation instruction, the translation motor driver controls the camera to translate according to the translation direction and distance, so that the camera moves to a position where an image of the target position can be acquired. After the translation ends, the translation motor driver sends a first feedback signal indicating the end of translation to the data processing module, signalling that the next operation can proceed. After receiving the first feedback signal, the data processing module acquires a first target difference value through the ranging sensor and sends it to the PLC, so that the PLC sends a movement instruction to the linear motor driver according to the first value and the first sign of the first target difference value; the linear motor driver then controls the camera to move according to the movement direction and movement distance carried by the instruction, that is, the distance between the camera and the target position is adjusted to equal the preset distance, so that the camera can acquire a clear image of the target position. After the movement of the camera ends, the linear motor driver sends, through the PLC, a second feedback signal indicating the end of movement to the data processing module, signalling that the next operation can proceed. After receiving the second feedback signal, the data processing module acquires a target image of the target position through the camera, inputs the target image into the burr detection model, and obtains a target label representing the category of the burrs in the target image; the category of the target burrs of the target pole piece at the target position is then obtained from the content of the target label, completing the burr detection task.
In the above process, the steps of obtaining the position information, moving the camera according to the position information, acquiring the target image with the camera, and classifying the target image are all completed automatically by the burr detection apparatus, so the entire burr detection process is automated. Compared with the manual inspection of the prior art, the burr detection process of the present application requires no human participation, which helps reduce manual workload. In addition, the burr detection method of the present application is carried out entirely by the burr detection apparatus, so the detection process is well coordinated, which helps improve detection efficiency. Moreover, before each image acquisition the position of the camera is adjusted so as to obtain a clear image of the target position, which improves the accuracy with which the burr detection model classifies the target image and thereby improves the accuracy of burr detection.
Embodiment III
The embodiment of the present application further provides a computer device 400. Fig. 4 shows a schematic structural diagram of the computer device provided in the third embodiment of the present application. As shown in fig. 4, the device includes a memory 401, a processor 402, and a computer program stored in the memory 401 and executable on the processor 402, wherein the processor 402 implements the above burr detection method when executing the computer program.
Specifically, the memory 401 and the processor 402 may be a general-purpose memory and processor, which are not limited herein. When the processor 402 runs the computer program stored in the memory 401, the burr detection method can be executed, thereby solving the problem of heavy manual workload in the prior art.
Embodiment IV
The embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above-described burr detection method.
Specifically, the storage medium may be a general-purpose storage medium, such as a removable disk or a hard disk. When the computer program on the storage medium is executed, the burr detection method can be executed, thereby solving the problem of heavy manual workload in the prior art.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, devices or units, and may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments provided in the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
It should be noted that like reference numerals and letters denote like items in the figures; thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Furthermore, the terms "first," "second," "third," etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Any person skilled in the art may, within the technical scope disclosed in the present application, modify or easily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be encompassed within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A burr detection method, characterized in that the burr detection method is applied to a burr detection apparatus, the burr detection apparatus comprising: the camera is provided with a telecentric lens, a ranging sensor, a data processing module, a Programmable Logic Controller (PLC) and a motor driver, wherein the data processing module is respectively and electrically connected with the camera, the ranging sensor and the PLC, the PLC is electrically connected with the motor driver, and the motor driver comprises a linear motor driver, a translation motor driver and a rotary motor driver;
the burr detection method comprises the following steps:
after receiving the position coordinates of the target position on the target pole piece sent by the data processing module, the PLC sends a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece;
the data processing module acquires a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sends the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position;
After receiving the first target difference value, the PLC sends a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance;
and the data processing module acquires a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and takes the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
2. The method of claim 1, wherein before said sending a translation command to the translation motor driver carrying a translation direction and a translation distance according to an abscissa in the position coordinates and a lateral position coordinate of the camera, the method further comprises:
Judging whether the target position is positioned in a target area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens;
and if the camera is not positioned in the target area, sending a rotating instruction carrying a preset rotating direction and a preset rotating angle to the rotating motor driver, so that the rotating motor driver controls the camera to rotate according to the preset rotating direction and the preset rotating angle.
3. The method of claim 1, wherein before said sending a translation command to the translation motor driver carrying a translation direction and a translation distance according to an abscissa in the position coordinates and a lateral position coordinate of the camera, the method further comprises:
calculating a second target difference between the abscissa and the lateral position coordinate;
determining a preset translation direction matched with the second sign according to a second sign used for representing positive or negative in the second target difference value, and taking the preset translation direction as the translation direction;
and taking the second numerical value as the numerical value of the translation distance according to the second numerical value of the second target difference value.
4. The method of claim 1, wherein before using the target image as input data for a burr detection model to obtain a target label for representing the burr class in the target image, the method further comprises:
training a candidate burr detection model for a plurality of times by using a sample training set, wherein the sample training set comprises at least one historical burr image set and preset labels which are set for the historical burr image set, and the preset labels are used for representing types of burrs in each historical burr image included in the historical burr image set;
judging whether the candidate burr detection model meets a preset condition or not, wherein the preset condition comprises: the accuracy of the candidate burr detection model is greater than a preset threshold value, and/or the training times are greater than or equal to preset times;
and if the preset condition is met, taking the candidate burr detection model as the burr detection model.
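The training procedure of claim 4 amounts to a loop with a compound stopping condition. A minimal sketch follows, with `train_step` and `evaluate` as assumed callbacks, since the patent fixes neither the model type nor the training algorithm — only the condition that accuracy exceeds a preset threshold and/or the training count reaches a preset number:

```python
def train_until_ready(candidate_model, train_step, evaluate,
                      accuracy_threshold: float, max_rounds: int):
    """Train a candidate burr detection model until it meets the preset
    condition of claim 4: accuracy above the threshold, or the number of
    training rounds reaching the preset count."""
    rounds = 0
    while True:
        train_step(candidate_model)   # one round of training on the sample set
        rounds += 1
        if evaluate(candidate_model) > accuracy_threshold or rounds >= max_rounds:
            # Condition met: adopt the candidate as the burr detection model.
            return candidate_model, rounds
```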
5. The method of claim 1, wherein the method further comprises:
and displaying the category of the target burr.
6. A burr detection device, characterized in that the burr detection device comprises: the camera is provided with a telecentric lens, a ranging sensor, a data processing module, a Programmable Logic Controller (PLC) and a motor driver, wherein the data processing module is respectively and electrically connected with the camera, the ranging sensor and the PLC, the PLC is electrically connected with the motor driver, and the motor driver comprises a linear motor driver, a translation motor driver and a rotary motor driver;
The PLC is used for sending a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinates of the camera after receiving the position coordinates of the target position on the target pole piece sent by the data processing module, so that the translation motor driver controls the camera to translate according to the translation direction and the translation distance, wherein the target position comprises the position of at least one edge to be detected on the target pole piece;
the data processing module is used for acquiring a first target difference value through the ranging sensor after receiving a first feedback signal which is sent by the translation motor driver and used for indicating the end of translation, and sending the first target difference value to the PLC, wherein the first target difference value is a difference value between a target distance and a preset distance, and the target distance is a distance between the position of the camera and the target position;
the PLC is used for sending a moving instruction carrying a moving direction and a moving distance to the linear motor driver according to a first numerical value of the first target difference value and a first sign used for representing positive or negative in the first target difference value after receiving the first target difference value, so that the linear motor driver controls the camera to move according to the moving direction and the moving distance;
And the data processing module is used for acquiring a target image at the target position through the camera after receiving a second feedback signal which is sent by the linear motor driver and used for indicating the end of movement, and taking the target image as input data of a burr detection model to obtain a target label which is used for indicating the type of burrs in the target image, so that the content of the target label is used as the type of target burrs of the target pole piece at the target position.
7. The apparatus of claim 6, wherein the PLC is further configured to, before sending a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinate of the camera:
judging whether the target position is positioned in a target area facing the shooting direction according to the position coordinates and the shooting direction of the telecentric lens;
and if the camera is not positioned in the target area, sending a rotating instruction carrying a preset rotating direction and a preset rotating angle to the rotating motor driver, so that the rotating motor driver controls the camera to rotate according to the preset rotating direction and the preset rotating angle.
8. The apparatus of claim 6, wherein the PLC is further configured to, before sending a translation instruction carrying a translation direction and a translation distance to the translation motor driver according to the abscissa in the position coordinates and the transverse position coordinate of the camera:
calculating a second target difference between the abscissa and the lateral position coordinate;
determining a preset translation direction matched with the second sign according to a second sign used for representing positive or negative in the second target difference value, and taking the preset translation direction as the translation direction;
and taking the second numerical value as the numerical value of the translation distance according to the second numerical value of the second target difference value.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of the preceding claims 1-5 when the computer program is executed.
10. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program when executed by a processor performs the steps of the method of any of the preceding claims 1-5.
CN202110849072.3A 2021-07-27 2021-07-27 Burr detection method, device, equipment and storage medium Active CN113567452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110849072.3A CN113567452B (en) 2021-07-27 2021-07-27 Burr detection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113567452A CN113567452A (en) 2021-10-29
CN113567452B true CN113567452B (en) 2024-03-15

Family

ID=78167818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110849072.3A Active CN113567452B (en) 2021-07-27 2021-07-27 Burr detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113567452B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115541500A (en) * 2022-11-30 2022-12-30 苏州恒视智能科技有限公司 On-line burr detection visual mechanism for lithium battery pole piece
CN117849058A (en) * 2024-03-06 2024-04-09 宁德时代新能源科技股份有限公司 Detection system and detection method for pole piece

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102566594A (en) * 2012-01-18 2012-07-11 浙江大学 Micro-member sound control two-dimensional translation method based on micro-vision feedback
CN109613013A (en) * 2018-10-23 2019-04-12 广州量子激光智能装备有限公司 Tab pole piece burr detection method and system
WO2020038109A1 (en) * 2018-08-22 2020-02-27 Oppo广东移动通信有限公司 Photographing method and device, terminal, and computer-readable storage medium
CN110954554A (en) * 2019-12-16 2020-04-03 广州量子激光智能装备有限公司 Online burr detecting system
CN112529829A (en) * 2019-08-28 2021-03-19 银河水滴科技(北京)有限公司 Training method and device for burr positioning and burr detection model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on a machine-vision-based workpiece edge burr detection system; Dai Fengqiang; Liu Tao; Wang Hongbo; Shen Xiaodong; Agricultural Equipment & Vehicle Engineering; 2018-07-10 (No. 07); full text *
Research on vision-based caliper burr detection and localization methods; Han Jinyu; Wu Chaoqun; Digital Manufacturing Science; 2019-12-15 (No. 04); full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant