CN116625243B - Intelligent detection method, system and storage medium based on frame coil stock cutting machine


Info

Publication number
CN116625243B
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN202310919671.7A
Other languages
Chinese (zh)
Other versions
CN116625243A (en)
Inventor
肖国柱
石涛
郭启
蔡沛
Current Assignee
Hunan Longshen Hydrogen Energy Technology Co ltd
Original Assignee
Hunan Longshen Hydrogen Energy Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hunan Longshen Hydrogen Energy Technology Co ltd filed Critical Hunan Longshen Hydrogen Energy Technology Co ltd
Priority to CN202310919671.7A priority Critical patent/CN116625243B/en
Publication of CN116625243A publication Critical patent/CN116625243A/en
Application granted granted Critical
Publication of CN116625243B publication Critical patent/CN116625243B/en


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B26 - HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26D - CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
    • B26D5/00 - Arrangements for operating and controlling machines or devices for cutting, cutting-out, stamping-out, punching, perforating, or severing by means other than cutting
    • B26D5/005 - Computer numerical control means
    • B26D5/007 - Control means comprising cameras, vision or image processing systems
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W90/00 - Enabling technologies or technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation


Abstract

The application relates to the technical field of production equipment detection and provides an intelligent detection method, system and storage medium based on a frame coil stock cutting machine. The method comprises: acquiring a product image to be detected and a first reference product image of an object to be detected based on a preset camera; determining first processing characteristic information of a processing area to be detected according to the product image to be detected, and determining second processing characteristic information of a first reference processing area according to the first reference product image; and determining processing state information of the object to be detected according to the first processing characteristic information and the second processing characteristic information, wherein the processing state information comprises a state to be recovered or a state to be scrapped. The application helps manufacturers determine whether a frame film that does not meet production requirements can be reused, improves the recycling rate of production resources, and has strong practical value.

Description

Intelligent detection method, system and storage medium based on frame coil stock cutting machine
Technical Field
The application relates to the technical field of production equipment detection, in particular to an intelligent detection method, system and storage medium based on a frame coil stock cutting machine.
Background
Hydrogen fuel cells are widely used because of their high energy-conversion efficiency, low operating noise and environmental friendliness. A hydrogen fuel cell comprises a frame membrane and a proton exchange membrane; in the production of the frame membrane, a frame coil cutting machine is generally required to cut and process the frame membrane coil.
In the production of the frame film, when the frame coil cutting machine cuts the frame film coil, vibration or improperly set process parameters can introduce errors into the cutting accuracy, so the frame film needs to be inspected promptly after cutting to judge whether it meets production requirements. However, current detection methods can only indicate whether the frame film meets production requirements; when it does not, they provide no further reference information to production staff, who therefore cannot easily determine whether a frame film that does not meet production requirements can be reused. Further improvement is needed.
Disclosure of Invention
Based on the above, the embodiments of the present application provide an intelligent detection method, system and storage medium based on a frame coil stock cutting machine, to solve the prior-art problem that production staff cannot easily determine whether a frame film that does not meet production requirements can be reused.
In a first aspect, an embodiment of the present application provides an intelligent detection method based on a frame coil stock cutting machine, where the method includes:
acquiring a to-be-detected product image and a first reference product image of an object to be detected based on a preset camera, wherein the to-be-detected product image comprises a to-be-detected processing area, and the first reference product image comprises a first reference processing area;
determining first processing characteristic information of a processing area to be detected according to the product diagram to be detected, and determining second processing characteristic information of the first reference processing area according to the first reference product diagram, wherein the first processing characteristic information comprises processing area information and processing area position information, and the second processing characteristic information comprises reference area information and first reference area position information;
and determining processing state information of the object to be detected according to the first processing characteristic information and the second processing characteristic information, wherein the processing state information comprises a state to be recovered or a state to be scrapped.
Compared with the prior art, the beneficial effects are as follows. According to the intelligent detection method based on the frame coil stock cutting machine provided by the embodiment of the application, the terminal device first acquires the product image to be detected and the first reference product image of the object to be detected. It then determines the first processing characteristic information of the processing area to be detected according to the product image to be detected, determines the second processing characteristic information of the first reference processing area according to the first reference product image, and determines the processing state information of the object to be detected, comprising the state to be recovered or the scrapped state, according to the first and second processing characteristic information. When the frame film does not meet the production requirements, the processing state information thus serves as reference information for production staff, enabling them to know whether the nonconforming frame film can be recycled. This improves the recycling rate of production resources and alleviates, to a certain extent, the problem that production staff cannot know whether a frame film that does not meet production requirements can be reused.
In a second aspect, an embodiment of the present application provides an intelligent detection system based on a frame coil stock cutting machine, the system including:
the product image acquisition module: configured to acquire a product image to be detected and a first reference product image of an object to be detected based on a preset camera, wherein the product image to be detected comprises a processing area to be detected, and the first reference product image comprises a first reference processing area;
the processing characteristic information determining module: configured to determine first processing characteristic information of the processing area to be detected according to the product image to be detected, and determine second processing characteristic information of the first reference processing area according to the first reference product image, wherein the first processing characteristic information comprises processing area information and processing area position information, and the second processing characteristic information comprises reference area information and first reference area position information;
the processing state information determining module: configured to determine processing state information of the object to be detected according to the first processing characteristic information and the second processing characteristic information, wherein the processing state information comprises a state to be recovered or a state to be scrapped.
In a third aspect, an embodiment of the present application provides a terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method of the first aspect described above.
It will be appreciated that the advantages of the second to fourth aspects may be found in the relevant description of the first aspect and are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments or the prior art will be briefly described below.
FIG. 1 is a schematic flow chart of an intelligent detection method according to an embodiment of the application;
fig. 2 is a first schematic diagram of a frame film according to an embodiment of the present application, where (a) in fig. 2 is a first schematic diagram of a frame film to be detected, and (b) in fig. 2 is a first schematic diagram of a qualified frame film;
FIG. 3 is a flowchart of step S300 in an intelligent detection method according to an embodiment of the present application;
fig. 4 is a second schematic diagram of a frame film according to an embodiment of the present application, where (a) in fig. 4 is a second schematic diagram of a frame film to be detected, (b) in fig. 4 is a second schematic diagram of a qualified frame film, and (c) in fig. 4 is a third schematic diagram of a frame film to be detected;
Fig. 5 is a third schematic diagram of a frame film according to an embodiment of the present application, where (a) in fig. 5 is a fourth schematic diagram of a frame film to be detected, (b) in fig. 5 is a third schematic diagram of a qualified frame film, and (c) in fig. 5 is a fifth schematic diagram of a frame film to be detected;
fig. 6 is a fourth schematic diagram of a frame film according to an embodiment of the present application, where (a) in fig. 6 is a sixth schematic diagram of a frame film to be detected, (b) in fig. 6 is a fourth schematic diagram of a qualified frame film, and (c) in fig. 6 is a seventh schematic diagram of a frame film to be detected;
fig. 7 is a fifth schematic diagram of a frame film according to an embodiment of the present application, wherein (a) in fig. 7 is an eighth schematic diagram of the frame film to be detected, (b) in fig. 7 is a fifth schematic diagram of the qualified frame film, and (c) in fig. 7 is a ninth schematic diagram of the frame film to be detected;
fig. 8 is a schematic flow chart after step S320 in the intelligent detection method according to an embodiment of the present application;
FIG. 9 is a tenth schematic view of a frame film according to an embodiment of the present application;
fig. 10 is a schematic flow chart after step S310 in the intelligent detection method according to an embodiment of the present application;
FIG. 11 is a schematic illustration of a common area provided by an embodiment of the present application;
FIG. 12 is a schematic illustration of a region of overlap provided by an embodiment of the present application;
FIG. 13 is a schematic view of a remaining area provided by an embodiment of the present application;
FIG. 14 is a block diagram of a smart detection system according to an embodiment of the present application;
fig. 15 is a schematic diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In the description of the present specification and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Referring to fig. 1, fig. 1 is a flow chart of an intelligent detection method based on a frame coil stock cutting machine according to an embodiment of the application. In this embodiment, the execution body of the intelligent detection method is a terminal device. It will be appreciated that the types of terminal device include, but are not limited to, cell phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks and personal digital assistants (PDA); the embodiments of the present application do not limit the terminal device to any particular type.
Referring to fig. 1, the intelligent detection method provided by the embodiment of the application includes, but is not limited to, the following steps:
in S100, a to-be-detected product graph and a first reference product graph of an object to be detected are acquired based on a preset camera.
Without loss of generality, a high-precision camera can be pre-installed on the frame coil stock cutting machine. Referring to fig. 2 (a), the object to be detected may be a frame film cut by the frame coil cutting machine; for example, the frame film has four processing areas to be detected. The frame film may fail to meet the production requirements in several ways, for example: the position of a processing area to be detected does not meet the production requirements, and/or the size of a processing area to be detected does not meet the production requirements. Since it has only been determined that such a frame film does not meet the production requirements, but not whether it can be reused, it is defined as the frame film to be detected. The product image to be detected is then the image corresponding to the frame film to be detected; it comprises the processing area to be detected, i.e. the area where the frame coil stock cutting machine cuts the frame film. It should be noted that determining whether the position or size of the processing area to be detected meets the production requirements, and thereby whether the frame film meets the production requirements, is prior art and is therefore not described here.
For example, referring to (b) of fig. 2, the first reference product map may be an image corresponding to a qualified frame film (i.e., a frame film satisfying production requirements), the first reference product map includes a first reference processing region, and the first reference processing region may be a cut processing region corresponding to the qualified frame film.
Specifically, the terminal device may acquire the to-be-detected product map and the first reference product map of the to-be-detected object based on a preset high-precision camera.
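As an illustration of this acquisition step, the following minimal sketch grabs the product image to be detected from a camera and loads a stored reference image with OpenCV; the camera index and file names are assumptions, since the patent only specifies that a high-precision camera is pre-installed on the machine.

```python
import cv2

# Hypothetical camera index and file names: the patent only states that a
# high-precision camera is pre-installed on the cutting machine.
camera = cv2.VideoCapture(0)
ok, to_detect = camera.read()                  # product image to be detected
camera.release()
if not ok:
    raise RuntimeError("no frame available from the preset camera")
cv2.imwrite("product_to_detect.png", to_detect)

reference = cv2.imread("first_reference.png")  # stored qualified-product image
```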
In S200, first processing feature information of the processing area to be detected is determined according to the product diagram to be detected, and second processing feature information of the first reference processing area is determined according to the first reference product diagram.
Without loss of generality, the first processing characteristic information comprises processing area information and processing area position information, wherein the processing area information is used for describing the area corresponding to the processing area to be detected, and the processing area position information is used for describing the position of the processing area to be detected in the frame film; the second processing characteristic information includes reference region area information for describing an area corresponding to the first reference processing region and first reference region position information for describing a position of the first reference processing region in the frame film.
Specifically, the terminal device may determine the first processing characteristic information of the processing area to be detected, namely the processing area information and the processing area position information, according to the product image to be detected, and then determine the second processing characteristic information of the first reference processing area, namely the reference area information and the first reference area position information, according to the first reference product image. The terminal device may first determine the processing area position information corresponding to the processing area to be detected based on a preset YOLOv5 target detection algorithm. It may then convert the product image to be detected to grayscale, extract the contour corresponding to the processing area to be detected based on a preset binary-image edge detection method, divide the region enclosed by the contour into a plurality of regular sub-regions, and calculate and sum the areas of these sub-regions to determine the processing area information. The process of determining the reference area information and the first reference area position information is similar and is therefore not described again.
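A minimal sketch of this feature extraction step, assuming OpenCV; the binarization threshold is an assumption, the largest contour stands in for a detector output (a YOLOv5 detector, as named above, could instead supply the position box), and cv2.contourArea replaces the patent's divide-into-regular-regions summation:

```python
import cv2

def region_features(image_path, threshold=128):
    """Grayscale -> binarize -> contour extraction -> position and area (step S200)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Assumes the machined (cut) region appears dark against a bright film.
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    region = max(contours, key=cv2.contourArea)   # largest contour taken as the region
    x, y, w, h = cv2.boundingRect(region)         # processing area position information
    area = cv2.contourArea(region)                # processing area information (pixels)
    return (x, y, w, h), area
```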
In S300, processing state information of the object to be detected is determined according to the first processing feature information and the second processing feature information.
Specifically, the processing state information comprises a state to be recovered or a scrapped state. The state to be recovered describes an object to be detected that does not meet the production requirements of the production line to which the current frame coil stock cutting machine belongs, but that, after reprocessing, can meet the production requirements of that production line, or of another production line, and can therefore be recycled. The scrapped state describes an object to be detected that does not meet the production requirements of the production line to which the current frame coil stock cutting machine belongs and that, even after reprocessing (for example because the position of the processing area to be detected deviates severely), can meet neither the production requirements of that production line nor those of any other production line. The intelligent detection method provided by the embodiment of the application can therefore give producers a reference quantity, namely the processing state information, on whether the frame film can be reused. This helps producers know whether a frame film that does not meet the production requirements can be reused, which improves the reuse rate of production resources and reduces production costs.
In some possible implementations, referring to fig. 3, in order to improve the accuracy of the processing state information, step S300 includes, but is not limited to, the following steps:
in S310, region relation information is determined from the machining region position information and the first reference region position information.
Specifically, the region relation information comprises complete coverage relation information, partial coincidence relation information or complete dislocation information. The complete coverage relation information describes the case in which the first reference processing area completely covers the processing area to be detected; the partial coincidence relation information describes the case in which the first reference processing area does not completely cover the processing area to be detected but the two areas overlap; and the complete dislocation information describes the case in which the two areas do not overlap at all. The terminal device can accurately determine the region relation information according to the processing area position information and the first reference area position information.
For example, referring to fig. 4 (c), fig. 4 (c) is a schematic diagram of combining fig. 4 (a) and fig. 4 (b), in fig. 4 (c), the first reference processing region completely covers the processing region to be detected, and the region relation information is complete coverage relation information; referring to fig. 5 (c), fig. 5 (c) is a schematic diagram of combining fig. 5 (a) and fig. 5 (b), in fig. 5 (c), the first reference processing area does not completely cover the to-be-detected processing area, and there is a superposition area between the first reference processing area and the to-be-detected processing area, and the area relation information is local superposition relation information; referring to fig. 6 (c), fig. 6 (c) is a schematic diagram of combining fig. 6 (a) and fig. 6 (b), in fig. 6 (c), there is no overlapping area between the first reference processing area and the processing area to be detected, and the area relation information is complete misalignment information.
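A sketch of this three-way classification, assuming the two regions are available as corner-point polygons and using the shapely geometry library (an assumption; any polygon library or a mask-based test would serve):

```python
from shapely.geometry import Polygon

def region_relation(reference_pts, detected_pts):
    """Classify step S310's three cases from corner-point lists [(x, y), ...]."""
    ref, det = Polygon(reference_pts), Polygon(detected_pts)
    if ref.covers(det):                  # covers allows shared boundaries
        return "complete coverage"       # fig. 4 (c)
    if ref.intersects(det):
        return "partial coincidence"     # fig. 5 (c)
    return "complete dislocation"        # fig. 6 (c)
```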
In S320, if the region relation information is the complete coverage relation information, the processing state information is determined to be the state to be recovered.
Specifically, if the region relation information is the complete coverage relation information, this indicates that if the frame coil stock cutting machine reprocesses the frame film to be detected and cuts away the difference region between the first reference processing area and the processing area to be detected, the frame film will meet the production requirements of the production line to which the machine belongs. The frame film to be detected can then be reused, and the terminal device can determine that the processing state information is the state to be recovered.
In S330, if the region relation information is the partial overlapping relation information or the complete misalignment information, a second reference product map is acquired.
Specifically, if the region relation information is the partial coincidence relation information or the complete dislocation information, this indicates that even after reprocessing the object to be detected still cannot meet the production requirements of the production line to which the current frame coil stock cutting machine belongs. To improve the recycling rate of production resources, the terminal device can then determine whether the frame film can, after reprocessing, meet the production requirements of the production line of another frame coil stock cutting machine. For example, referring to fig. 7 (b), fig. 7 (b) is a second reference product image, namely the image corresponding to a qualified frame film of the other production line; the second reference product image comprises a second reference processing area, which may be the cutting processing area corresponding to a qualified frame film of the other production line.
In S340, second reference area position information of the second reference processing area is determined from the second reference product map.
Specifically, after the terminal device obtains the second reference product map, the terminal device may accurately determine the second reference area location information of the second reference processing area according to the second reference product map, and the specific processing procedure may refer to the corresponding content in step S200, so that details are not repeated.
In S350, it is determined whether the second reference machining region completely covers the machining region to be detected, based on the machining region position information and the second reference region position information.
Specifically, after the terminal device determines the second reference region position information, the terminal device may determine whether the second reference processing region completely covers the processing region to be detected based on the processing region position information and the second reference region position information.
In S360, if the second reference machining area completely covers the machining area to be detected, the machining state information is determined to be the state to be recovered, otherwise, the machining state information is determined to be the scrapped state.
For example, referring to fig. 7 (c), fig. 7 (c) is a schematic diagram combining fig. 7 (a) and fig. 7 (b); in fig. 7 (c), the second reference processing area completely covers the processing area to be detected. If the second reference processing area completely covers the processing area to be detected, this indicates that if the frame coil cutting machine of the other production line reworks the frame film to be detected and cuts away the difference region between the second reference processing area and the processing area to be detected, the frame film will meet the production requirements of the other production line. The frame film to be detected can then be reused, and the terminal device can determine that the processing state information is the state to be recovered.
Without loss of generality, if the second reference processing area does not completely cover the processing area to be detected, this indicates that even if the frame coil stock cutting machine of another production line reprocesses the frame film to be detected, the frame film still cannot meet the production requirements of that production line, and the terminal device can determine that the processing state information is the scrapped state.
In some possible implementations, to facilitate the production personnel to more fully analyze the border film to be inspected, referring to fig. 8, after step S320, the method further includes, but is not limited to, the following steps:
in S321, deviation region position information is determined from the machining region position information and the first reference region position information.
For example, referring to fig. 9, the deviation area position information is used to describe the deviation area between the processing area to be detected and the first reference processing area, i.e. the cross-hatched region in fig. 9; the terminal device may determine the deviation area position information based on the processing area position information and the first reference area position information.
In S322, the product map to be detected is highlighted based on the deviation area position information, and a highlighted product map is generated.
Specifically, the terminal device may highlight the product graph to be detected based on the deviation area position information, and highlight the deviation area in the product graph to be detected, so as to generate a highlighted product graph.
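A minimal sketch of this highlighting step, assuming OpenCV and a binary mask of the deviation area derived from the deviation area position information; the red colour and 50% opacity are illustrative choices, not patent-specified values:

```python
import cv2

def highlight_deviation(product_image, deviation_mask, color=(0, 0, 255), alpha=0.5):
    """Overlay the deviation area (step S322) on the product image to be detected.
    `deviation_mask` is a uint8 image that is 255 inside the deviation area."""
    overlay = product_image.copy()
    overlay[deviation_mask > 0] = color            # paint the deviation area
    return cv2.addWeighted(overlay, alpha, product_image, 1 - alpha, 0)
```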
In S323, a data analysis packet is generated from the highlight product drawing and the deviation area position information.
Specifically, after the terminal device generates the highlight product graph, the terminal device may compress and generate a data analysis packet according to the highlight product graph and the deviation area position information.
In S324, the data analysis package is uploaded to the cloud server.
Specifically, after the terminal device generates the data analysis packet, the terminal device can upload the data analysis packet to the cloud server, so that the robustness of the data is improved.
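A sketch of the packaging and upload of steps S323 and S324; the zip layout, the JSON encoding of the deviation area position information and the HTTP endpoint are assumptions, since the patent does not specify the package format or the transport:

```python
import json
import zipfile

import requests  # any HTTP client would do; the endpoint below is hypothetical

def upload_analysis_package(highlight_image_path, deviation_positions, server_url):
    """Compress the highlighted product image and the deviation area position
    information into a data analysis package and upload it to a cloud server."""
    package_path = "data_analysis_package.zip"
    with zipfile.ZipFile(package_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(highlight_image_path)
        zf.writestr("deviation_positions.json", json.dumps(deviation_positions))
    with open(package_path, "rb") as fh:
        requests.post(server_url, files={"package": fh}, timeout=30)
```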
In some possible implementations, to help producers know the extent to which the frame film to be detected fails to meet the production requirements, and thereby analyse in depth the production problem behind a nonconforming frame film, referring to fig. 10, after step S310 the method further includes, but is not limited to, the following steps:
in S311, if the region relation information is the partial overlapping relation information or the complete misalignment information, the corner number information of the first reference processing region is determined according to the first reference product map.
Specifically, the corner number information describes the number of intersection points of adjacent edge lines of the first reference processing area. If the region relation information is the partial coincidence relation information or the complete dislocation information, the terminal device can determine the corner number information of the first reference processing area according to the first reference product image. When a corner of the first reference processing area is arc-shaped, the terminal device may generate extension lines of the two adjacent edge lines of the first reference processing area, calculate the intersection point of the two extension lines, and define that intersection point as the corner.
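A sketch of the extension-line intersection used above for arc-shaped corners; plain 2-D line-line intersection with no library assumptions:

```python
def corner_from_edges(p1, p2, p3, p4):
    """Intersect the extension of edge p1-p2 with the extension of edge p3-p4,
    reconstructing a rounded corner as its ideal sharp corner (returns None
    for parallel edges). Points are (x, y) tuples."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None                       # parallel edges never intersect
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```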
In S312, reference corner position information of the first reference processing region is determined according to the first reference region position information, and processing corner position information of the processing region to be detected is determined according to the processing region position information.
Specifically, the first reference area position information includes the coordinates of each pixel of the first reference processing area in the first reference product image, and these pixels include the corners of the first reference processing area. The terminal device may determine the reference corner position information of the first reference processing area according to the first reference area position information, and may determine the processing corner position information of the processing area to be detected according to the processing area position information; the procedure for determining the processing corner position information mirrors that for the reference corner position information and is therefore not repeated.
In some possible implementations, the terminal device may determine the reference corner position information of the first reference processing area and the processing corner position information of the processing area to be detected using a Harris corner detection algorithm.
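A sketch of the Harris option mentioned above, assuming OpenCV; the block size, aperture, k and quality values are common OpenCV defaults rather than patent-specified parameters, and in practice the raw responses would still need clustering or non-maximum suppression:

```python
import cv2
import numpy as np

def harris_corner_positions(gray, quality=0.01):
    """Candidate corner positions (x, y) of a region image via cv2.cornerHarris."""
    response = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
    ys, xs = np.where(response > quality * response.max())
    return list(zip(xs.tolist(), ys.tolist()))
```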
In S313, offset information is generated from the reference corner position information and the processing corner position information.
Specifically, the offset information describes the shortest distance between a reference corner position and the corresponding processing corner position; the terminal device can generate the offset information from the reference corner position information, the processing corner position information and a preset two-point distance formula.
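The preset two-point distance formula is not reproduced in the text; in its standard Euclidean form, with (x_i, y_i) the i-th processing corner and (u_i, v_i) its matched reference corner, the offset would be:

```latex
% Offset of the i-th corner (step S313), assuming the standard Euclidean
% two-point distance between matched corners:
d_i = \sqrt{(x_i - u_i)^2 + (y_i - v_i)^2}
```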
In S314, common region area information is determined from the machining region position information and the first reference region position information.
For example, referring to fig. 11, taking the case in which the first reference processing area does not completely cover the processing area to be detected but the two areas overlap, the common area information describes the area corresponding to the combined region formed by adding the processing area to be detected and the first reference processing area together, i.e. the cross-hatched region in fig. 11; the terminal device may determine the common area information based on the processing area position information and the first reference area position information.
In S315, the qualified area information is determined according to the product map to be detected, the reference area information, and the processed area information.
For example, referring to fig. 12, taking an example that the first reference machining area does not completely cover the machining area to be detected and there is a coincident area between the first reference machining area and the machining area to be detected, the area information of the qualified area is used to describe an area corresponding to the coincident area between the machining area to be detected and the first reference machining area, that is, an area with a cross-hatching inside in fig. 12; the terminal equipment can determine the area information of the qualified area according to the product graph to be detected, the area information of the reference area and the area information of the processing area.
In S316, area difference information is generated from the reference area information and the common area information.
For example, referring to fig. 13, taking the same case, the region area difference information describes the result of subtracting the common area information from the reference area information, i.e. the area corresponding to the remaining portion of the first reference processing area after the processing area to be detected is removed from it; the cross-hatched region in fig. 13 is this remaining region. The terminal device may generate the region area difference information according to the reference area information and the common area information.
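A sketch of the area quantities of steps S314 to S316 from boolean region masks, with one stated assumption about the translated terms: following figs. 11 to 13, 'common' is read as the combined (union) region, 'qualified' as the coincident (intersection) region, and the difference as the part of the reference region left after removing the detected region:

```python
import numpy as np

def area_terms(ref_mask, det_mask, pixel_area=1.0):
    """Boolean masks are True inside a region; pixel_area converts to physical units."""
    common = np.logical_or(ref_mask, det_mask).sum() * pixel_area        # fig. 11
    qualified = np.logical_and(ref_mask, det_mask).sum() * pixel_area    # fig. 12
    difference = np.logical_and(ref_mask, ~det_mask).sum() * pixel_area  # fig. 13
    return common, qualified, difference
```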
In S317, the corner number information, the offset information, the common area information, the qualified area information, and the area difference information are input into a preset production deviation rate calculation formula, and the production deviation rate information is determined.
Specifically, the production deviation rate information describes the degree of deviation between the processing area to be detected and the first reference processing area. Through the production deviation rate information, a producer can know the extent to which the frame film to be detected fails to meet the production requirements, and can use this reference quantity to analyse the production problem behind the nonconforming frame film. After generating the region area difference information, the terminal device can input the corner number information, the offset information, the common area information, the qualified area information and the region area difference information into a preset production deviation rate calculation formula to determine the production deviation rate information.
In some possible implementations, to improve the relevance of the production deviation rate information, a preset production deviation rate calculation formula may be used in which P represents the production deviation rate information; S_diff represents the region area difference information; n represents the corner number information; d_i represents the offset information corresponding to the i-th corner of the processing area to be detected; S_q represents the qualified area information; and S_c represents the common area information.
The implementation principle of the intelligent detection method based on the frame coil stock cutting machine provided by the embodiment of the application is as follows: the terminal device first acquires the product image to be detected and the first reference product image of the object to be detected; it then determines the first processing characteristic information of the processing area to be detected according to the product image to be detected, determines the second processing characteristic information of the first reference processing area according to the first reference product image, and determines, according to the first and second processing characteristic information, whether the processing state information of the object to be detected is the state to be recovered or the scrapped state. When the frame film does not meet the production requirements, this provides production staff with more reference information, so that they can know whether the nonconforming frame film can be recycled, which improves the recycling rate of production resources and reduces production costs.
It should be noted that, the sequence number of each step in the above embodiment does not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not limit the implementation process of the embodiment of the present application in any way.
The embodiment of the present application also provides an intelligent detection system based on a frame coil cutting machine, for convenience of description, only the part relevant to the present application is shown, as shown in fig. 14, the system 140 includes:
the product image acquisition module 141: configured to acquire a product image to be detected and a first reference product image of an object to be detected based on a preset camera, wherein the product image to be detected comprises a processing area to be detected, and the first reference product image comprises a first reference processing area;
the processing characteristic information determination module 142: configured to determine first processing characteristic information of the processing area to be detected according to the product image to be detected, and determine second processing characteristic information of the first reference processing area according to the first reference product image, wherein the first processing characteristic information comprises processing area information and processing area position information, and the second processing characteristic information comprises reference area information and first reference area position information;
the processing state information determination module 143: configured to determine processing state information of the object to be detected according to the first processing characteristic information and the second processing characteristic information, wherein the processing state information comprises a state to be recovered or a state to be scrapped.
Optionally, the processing state information determining module 143 includes:
the region relation information determination submodule: configured to determine region relation information according to the processing area position information and the first reference area position information, wherein the region relation information comprises complete coverage relation information, partial coincidence relation information or complete dislocation information, and the complete coverage relation information is used for describing that the first reference processing area completely covers the processing area to be detected;
the to-be-recovered state determination submodule: configured to determine that the processing state information is the state to be recovered if the region relation information is the complete coverage relation information;
the reference product image acquisition submodule: configured to acquire a second reference product image if the region relation information is the partial coincidence relation information or the complete dislocation information, wherein the second reference product image comprises a second reference processing area;
the reference area position information determination submodule: configured to determine second reference area position information of the second reference processing area according to the second reference product image;
the area coverage determination submodule: configured to determine whether the second reference processing area completely covers the processing area to be detected according to the processing area position information and the second reference area position information;
the scrap state determination submodule: configured to determine that the processing state information is the state to be recovered if the second reference processing area completely covers the processing area to be detected, and otherwise determine that the processing state information is the scrapped state.
Optionally, the system 140 further includes:
the deviation area position information determining module: configured to determine deviation area position information according to the processing area position information and the first reference area position information, wherein the deviation area position information is used for describing a deviation area between the processing area to be detected and the first reference processing area;
the highlighted product image generation module: configured to highlight the product image to be detected based on the deviation area position information and generate a highlighted product image;
the data analysis packet generation module: configured to generate a data analysis packet according to the highlighted product image and the deviation area position information;
the data analysis packet uploading module: configured to upload the data analysis packet to a cloud server.
Optionally, the system 140 further includes:
the corner number information determining module: configured to determine corner number information of the first reference processing area according to the first reference product image if the region relation information is the partial coincidence relation information or the complete dislocation information;
the processing corner position information determining module: configured to determine reference corner position information of the first reference processing area according to the first reference area position information, and determine processing corner position information of the processing area to be detected according to the processing area position information;
the offset information generation module: configured to generate offset information according to the reference corner position information and the processing corner position information, wherein the offset information is used for describing the shortest distance between the reference corner position information and the processing corner position information;
the common area information determining module: configured to determine common area information according to the processing area position information and the first reference area position information, wherein the common area information is used for describing the area corresponding to the combined region of the processing area to be detected and the first reference processing area;
the qualified area information determining module: configured to determine qualified area information according to the product image to be detected, the reference area information and the processing area information;
the region area difference information generation module: configured to generate region area difference information according to the reference area information and the common area information, wherein the region area difference information is used for describing the result of subtracting the common area information from the reference area information;
the production deviation rate information determining module: configured to input the corner number information, the offset information, the common area information, the qualified area information and the region area difference information into a preset production deviation rate calculation formula to determine production deviation rate information, wherein the production deviation rate information is used for describing the degree of deviation between the processing area to be detected and the first reference processing area.
Optionally, in the above production deviation rate calculation formula, P is the production deviation rate information; S_diff is the region area difference information; n is the corner number information; d_i is the offset information corresponding to the i-th corner of the processing area to be detected; S_q is the qualified area information; and S_c is the common area information.
It should be noted that, because the content of information interaction and execution process between the modules and the embodiment of the method of the present application are based on the same concept, specific functions and technical effects thereof may be referred to in the method embodiment section, and details thereof are not repeated herein.
The embodiment of the present application also provides a terminal device. As shown in fig. 15, the terminal device 150 of this embodiment includes: a processor 151, a memory 152 and a computer program 153 stored in the memory 152 and executable on the processor 151. When the processor 151 executes the computer program 153, the steps in the above intelligent detection method embodiment are implemented, such as steps S100 to S300 shown in fig. 1; alternatively, when executing the computer program 153, the processor 151 may implement the functions of the modules in the above system embodiment, such as the functions of the modules 141 to 143 shown in fig. 14.
The terminal device 150 may be a desktop computer, a notebook computer, a palm computer, a cloud server or the like, and includes, but is not limited to, the processor 151 and the memory 152. It will be appreciated by those skilled in the art that fig. 15 is merely an example of the terminal device 150 and does not limit it; the terminal device 150 may include more or fewer components than shown, combine certain components or use different components, and may further include, for example, input-output devices, network access devices and buses.
The processor 151 may be a central processing unit (Central Processing Unit, CPU), but also other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.; a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 152 may be an internal storage unit of the terminal device 150, for example, a hard disk or a memory of the terminal device 150, or the memory 152 may be an external storage device of the terminal device 150, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like provided on the terminal device 150; further, the memory 152 may also include both an internal storage unit and an external storage device of the terminal device 150, the memory 152 may also store the computer program 153 and other programs and data required by the terminal device 150, and the memory 152 may also be used to temporarily store data that has been output or is to be output.
An embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the various method embodiments described above. Wherein the computer program comprises computer program code, the computer program code can be in the form of source code, object code, executable file or some intermediate form, etc.; the computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
The above embodiments are not intended to limit the scope of the present application; all equivalent changes in the method, principle and structure of the present application shall fall within its protection scope.

Claims (7)

1. An intelligent detection method based on a frame coil stock cutting machine, characterized by comprising the following steps:
acquiring a to-be-detected product image and a first reference product image of an object to be detected based on a preset camera, wherein the to-be-detected product image comprises a to-be-detected processing area, and the first reference product image comprises a first reference processing area;
determining first processing feature information of the to-be-detected processing area according to the to-be-detected product image, and determining second processing feature information of the first reference processing area according to the first reference product image, wherein the first processing feature information comprises processing area information and processing area position information, and the second processing feature information comprises reference area information and first reference area position information;
determining processing state information of the object to be detected according to the first processing feature information and the second processing feature information, wherein the processing state information comprises a to-be-recovered state or a to-be-scrapped state;
wherein the determining the processing state information of the object to be detected according to the first processing feature information and the second processing feature information comprises:
determining region relation information according to the processing area position information and the first reference area position information, wherein the region relation information comprises complete coverage relation information, partial coincidence relation information or complete misalignment information, and the complete coverage relation information is used for describing that the first reference processing area completely covers the to-be-detected processing area;
if the region relation information is the complete coverage relation information, determining that the processing state information is the to-be-recovered state;
if the region relation information is the partial coincidence relation information or the complete misalignment information, acquiring a second reference product image, wherein the second reference product image comprises a second reference processing area;
determining second reference area position information of the second reference processing area according to the second reference product image;
determining whether the second reference processing area completely covers the to-be-detected processing area according to the processing area position information and the second reference area position information;
and if the second reference processing area completely covers the to-be-detected processing area, determining that the processing state information is the to-be-recovered state; otherwise, determining that the processing state information is the to-be-scrapped state.
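Read as an algorithm, the claim-1 decision flow can be pictured as a minimal sketch, assuming each processing region is available as a polygon vertex list and using the shapely library for the coverage tests; the function names, state labels, and the `fetch_second_reference` callback are illustrative assumptions, not terms from the patent.

```python
from shapely.geometry import Polygon

TO_RECOVER, TO_SCRAP = "to-be-recovered", "to-be-scrapped"

def region_relation(reference_pts, detected_pts):
    """Classify the region relation information of claim 1: complete coverage,
    partial coincidence, or complete misalignment."""
    ref, det = Polygon(reference_pts), Polygon(detected_pts)
    if ref.contains(det):
        return "complete-coverage"
    if ref.intersects(det):
        return "partial-coincidence"
    return "complete-misalignment"

def processing_state(first_ref_pts, detected_pts, fetch_second_reference):
    """Decide the processing state, falling back to a second reference
    processing region when the first one does not completely cover the
    to-be-detected processing region."""
    if region_relation(first_ref_pts, detected_pts) == "complete-coverage":
        return TO_RECOVER
    # second reference product image; how it is acquired is left open in the claim
    second_ref_pts = fetch_second_reference()
    if Polygon(second_ref_pts).contains(Polygon(detected_pts)):
        return TO_RECOVER
    return TO_SCRAP
```

Here `fetch_second_reference` merely stands in for however the second reference product image is obtained, which the claim does not specify.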
2. The method according to claim 1, wherein, after the determining that the processing state information is the to-be-recovered state if the region relation information is the complete coverage relation information, the method further comprises:
determining deviation area position information according to the processing area position information and the first reference area position information, wherein the deviation area position information is used for describing a deviation region between the to-be-detected processing area and the first reference processing area;
highlight-marking the to-be-detected product image based on the deviation area position information to generate a highlighted product image;
generating a data analysis packet according to the highlighted product image and the deviation area position information;
and uploading the data analysis packet to a cloud server.
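As a rough illustration of claim 2's marking-and-upload steps, the sketch below draws the deviation region on the product image and bundles it with the position data. The OpenCV calls are standard; the axis-aligned rectangular deviation region, the file names, and the upload stub are assumptions, since the claim fixes neither a geometry nor a transport.

```python
import json
import cv2  # pip install opencv-python

def build_analysis_packet(image_path, deviation_box, out_path="highlighted.png"):
    """Highlight-mark the to-be-detected product image and package the result
    with the deviation area position information."""
    img = cv2.imread(image_path)
    x, y, w, h = deviation_box  # deviation region as an axis-aligned box (assumption)
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)  # red outline
    cv2.imwrite(out_path, img)
    return {"highlighted_image": out_path,
            "deviation_region": {"x": x, "y": y, "w": w, "h": h}}

def upload_to_cloud(packet, endpoint):
    """Stand-in for the cloud-server upload; the claim does not specify one."""
    print(f"would POST to {endpoint}: {json.dumps(packet)}")
```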
3. The method according to claim 1, wherein, after the determining region relation information according to the processing area position information and the first reference area position information, the method further comprises:
if the region relation information is the partial coincidence relation information or the complete misalignment information, determining corner point quantity information of the first reference processing area according to the first reference product image;
determining reference corner point position information of the first reference processing area according to the first reference area position information, and determining processing corner point position information of the to-be-detected processing area according to the processing area position information;
generating offset information according to the reference corner point position information and the processing corner point position information, wherein the offset information is used for describing, for each corner point of the to-be-detected processing area, the shortest distance to the reference corner points;
determining common area information according to the processing area position information and the first reference area position information, wherein the common area information is used for describing the area of the region shared by the to-be-detected processing area and the first reference processing area after superposition;
determining qualified area information according to the to-be-detected product image, the reference area information and the processing area information;
generating area difference information according to the reference area information and the common area information, wherein the area difference information is used for describing the result obtained by subtracting the common area information from the reference area information;
and inputting the corner point quantity information, the offset information, the common area information, the qualified area information and the area difference information into a preset production deviation rate calculation formula to determine production deviation rate information, wherein the production deviation rate information is used for describing the degree of deviation between the to-be-detected processing area and the first reference processing area.
4. The method according to claim 3, wherein the production deviation rate calculation formula is:
wherein the formula itself appears only as an image in the source publication; its symbols denote, in order: the production deviation rate information; the area difference information; the corner point quantity information; the offset information corresponding to the i-th corner point of the to-be-detected processing area; the qualified area information; and the common area information.
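Because the combining formula of claim 4 survives only as an image, the sketch below computes the claim-3 inputs (corner count, per-corner offsets, common area, area difference) with shapely and then combines them in an explicitly hypothetical way; the closing expression is a stand-in, not the patented formula, and `qualified_area` is assumed to be supplied by the caller.

```python
from math import dist
from shapely.geometry import Polygon

def production_deviation_rate(ref_pts, det_pts, qualified_area):
    """Compute the claim-3 quantities; the final combination is hypothetical."""
    ref, det = Polygon(ref_pts), Polygon(det_pts)
    n = len(ref_pts)  # corner point quantity information of the reference region
    # offset information: shortest distance from each corner point of the
    # to-be-detected region to the reference corner points
    offsets = [min(dist(p, q) for q in ref_pts) for p in det_pts]
    common_area = ref.intersection(det).area   # common area information
    area_diff = ref.area - common_area         # area difference information
    # Hypothetical combination (the patent's actual formula is not reproduced):
    return (area_diff / qualified_area) * (sum(offsets) / n)
```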
5. An intelligent detection system based on a frame coil stock cutting machine, the system comprising:
a product image acquisition module, configured to acquire a to-be-detected product image and a first reference product image of an object to be detected based on a preset camera, wherein the to-be-detected product image comprises a to-be-detected processing area, and the first reference product image comprises a first reference processing area;
a processing feature information determination module, configured to determine first processing feature information of the to-be-detected processing area according to the to-be-detected product image, and determine second processing feature information of the first reference processing area according to the first reference product image, wherein the first processing feature information comprises processing area information and processing area position information, and the second processing feature information comprises reference area information and first reference area position information;
a processing state information determination module, configured to determine processing state information of the object to be detected according to the first processing feature information and the second processing feature information, wherein the processing state information comprises a to-be-recovered state or a to-be-scrapped state;
wherein the processing state information determination module comprises:
a region relation information determination submodule, configured to determine region relation information according to the processing area position information and the first reference area position information, wherein the region relation information comprises complete coverage relation information, partial coincidence relation information or complete misalignment information, and the complete coverage relation information is used for describing that the first reference processing area completely covers the to-be-detected processing area;
a to-be-recovered state determination submodule, configured to determine that the processing state information is the to-be-recovered state if the region relation information is the complete coverage relation information;
a reference product image acquisition submodule, configured to acquire a second reference product image if the region relation information is the partial coincidence relation information or the complete misalignment information, wherein the second reference product image comprises a second reference processing area;
a reference area position information determination submodule, configured to determine second reference area position information of the second reference processing area according to the second reference product image;
an area coverage determination submodule, configured to determine whether the second reference processing area completely covers the to-be-detected processing area according to the processing area position information and the second reference area position information;
and a scrapped state determination submodule, configured to determine that the processing state information is the to-be-recovered state if the second reference processing area completely covers the to-be-detected processing area, and otherwise determine that the processing state information is the to-be-scrapped state.
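Structurally, the claim-5 system can be pictured as one class whose methods mirror the modules and submodules; this skeleton is purely illustrative, since the claim defines functional modules rather than a concrete class layout, and every name in it is an assumption.

```python
class FrameCoilCuttingInspector:
    """Skeleton mapping the claim-5 modules to methods (illustrative only)."""

    def acquire_images(self, camera):
        """Product image acquisition module: grab the to-be-detected and
        first reference product images from the preset camera."""
        raise NotImplementedError

    def extract_features(self, image):
        """Processing feature information determination module: derive area
        information and position information for a processing region."""
        raise NotImplementedError

    def region_relation(self, detected_features, reference_features):
        """Region relation information determination submodule: complete
        coverage, partial coincidence, or complete misalignment."""
        raise NotImplementedError

    def processing_state(self, detected_features, reference_features):
        """Processing state information determination module: combine the
        coverage submodules into a to-be-recovered or to-be-scrapped verdict."""
        raise NotImplementedError
```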
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
CN202310919671.7A 2023-07-26 2023-07-26 Intelligent detection method, system and storage medium based on frame coil stock cutting machine Active CN116625243B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310919671.7A CN116625243B (en) 2023-07-26 2023-07-26 Intelligent detection method, system and storage medium based on frame coil stock cutting machine

Publications (2)

Publication Number Publication Date
CN116625243A CN116625243A (en) 2023-08-22
CN116625243B true CN116625243B (en) 2023-09-19

Family

ID=87610304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310919671.7A Active CN116625243B (en) 2023-07-26 2023-07-26 Intelligent detection method, system and storage medium based on frame coil stock cutting machine

Country Status (1)

Country Link
CN (1) CN116625243B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6598898B2 (en) * 2018-02-27 2019-10-30 株式会社Screenホールディングス Core misalignment detection apparatus and core misalignment detection method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101907439A (en) * 2010-03-17 2010-12-08 中国二十二冶集团有限公司 Simulated measurement and detection method in architectural steel structure fabrication
JP2016112682A (en) * 2016-02-01 2016-06-23 住友化学株式会社 Cutting method and cutting device, manufacturing method for optical member
CN112223356A (en) * 2020-09-21 2021-01-15 安徽新涛光电科技有限公司 Automatic guillotine for PVC adhesive tape
CN113935997A (en) * 2021-12-16 2022-01-14 深圳致星科技有限公司 Image processing method, storage medium and image processing device for detecting material
CN115235375A (en) * 2022-07-18 2022-10-25 南京邮电大学 Multi-circle characteristic parameter measuring method, detecting method and device for cover plate type workpiece
CN116342609A (en) * 2023-05-30 2023-06-27 湖南隆深氢能科技有限公司 Real-time detection method, system and storage medium based on cutting device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of digital cameras in industrial parts inspection; Yu Jingtao, Chen Ying; Remote Sensing Information (03); full text *

Also Published As

Publication number Publication date
CN116625243A (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN113139559B (en) Training method of target detection model, and data labeling method and device
CN116168041B (en) Real-time detection method and system applied to laminating device
CN110796095B (en) Instrument template establishing method, terminal equipment and computer storage medium
CN113592886B (en) Drawing examining method and device for building drawing, electronic equipment and medium
CN112336342A (en) Hand key point detection method and device and terminal equipment
CN113673519A (en) Character recognition method based on character detection model and related equipment thereof
CN110910375A (en) Detection model training method, device, equipment and medium based on semi-supervised learning
CN115482186A (en) Defect detection method, electronic device, and storage medium
CN116342609B (en) Real-time detection method, system and storage medium based on cutting device
CN116523908B (en) Safe production method, system, equipment and medium based on coil coating production line
CN114549393B (en) Image labeling method, device, equipment and computer storage medium
CN116625243B (en) Intelligent detection method, system and storage medium based on frame coil stock cutting machine
CN116379927B (en) Accurate detection method and system applied to laminating production line and storage medium
CN112966719B (en) Method and device for recognizing instrument panel reading and terminal equipment
CN116129177A (en) Image labeling method and device and electronic equipment
CN108564571B (en) Image area selection method and terminal equipment
CN115223165B (en) Method and device for acquiring cell image to be interpreted
CN115086343B (en) Internet of things data interaction method and system based on artificial intelligence
CN116629606A (en) Industrial chain early warning method, device, equipment and medium based on power data
CN116468761A (en) Registration method, equipment and storage medium based on probability distribution distance feature description
CN114373078A (en) Target detection method and device, terminal equipment and storage medium
CN113639639A (en) Data processing method and device for position data and storage medium
CN113723370B (en) Chromosome detection method and device based on oblique frame
CN116630957B (en) Self-adaptive target detection method and system based on pseudo tag size in unsupervised field
CN116883398B (en) Detection method, system, terminal equipment and medium based on galvanic pile assembly production line

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant