CN112785554A - Quality estimation method, quality estimation device, electronic device and storage medium - Google Patents


Info

Publication number
CN112785554A
CN112785554A (application number CN202011614937.XA)
Authority
CN
China
Prior art keywords
density
information
volume
contour data
characteristic information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011614937.XA
Other languages
Chinese (zh)
Inventor
陈海波
李宗剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenlan Intelligent Technology Shanghai Co ltd
Original Assignee
DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DeepBlue AI Chips Research Institute Jiangsu Co Ltd filed Critical DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority claimed from application CN202011614937.XA
Publication of CN112785554A
Legal status: pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/40 Analysis of texture
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G06F 18/25 Fusion techniques
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a quality (mass) estimation method, an apparatus, an electronic device, and a storage medium. The quality estimation method comprises: detecting an object to identify feature information and contour data of the object; matching the feature information against a pre-stored material library to determine the material of the object and the density corresponding to that material; estimating the volume of the object based on the contour data; and calculating the mass of the object from the volume and the density, thereby enabling mass estimation with high accuracy and high stability through machine vision recognition.

Description

Quality estimation method, quality estimation device, electronic device and storage medium
Technical Field
The present application relates to the fields of computer vision and industrial inspection, and in particular to a quality estimation method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Machine vision inspection is an important technology in modern industry and is widely applied in fields such as content-based image retrieval, automotive safety, video surveillance, and robotics. As the technology develops, it is being applied in ever more fields to meet growing market demand.
The prior art has not realized mass estimation through visual detection, and existing estimation methods suffer from drawbacks such as susceptibility to distortion, large estimation deviation, low accuracy, and damage to the object caused by invasive detection.
Summary of the application
The application aims to provide a quality estimation method and apparatus, an electronic device, and a computer-readable storage medium that use machine vision recognition combined with multi-sensor fusion to perform density fusion calculation, thereby achieving mass estimation with high accuracy and high stability.
The purpose of the application is achieved by the following technical solutions:
In a first aspect, the present application provides a quality estimation method, the method comprising: detecting an object to identify feature information and contour data of the object; matching the feature information against a pre-stored material library to determine the material of the object and the density corresponding to that material; estimating a volume of the object based on the contour data; and calculating the mass of the object using the volume and the density.
This technical solution has the beneficial effect that mass estimation of an object can be realized with high accuracy and high stability through machine vision recognition.
In some optional embodiments, the object is detected using a two-dimensional deep learning model to obtain surface texture information of the object as the feature information, and the contour data is obtained by three-dimensional visual recognition. The surface texture information includes at least one of: color information, roughness information, and reflection information.
This technical solution has the advantage that, because a non-contact data acquisition mode is used, the integrity of the object is preserved, and the mass of an object of uniform material can be estimated with high accuracy and stability.
In some optional embodiments, the object is detected by multi-sensor fusion to obtain the contour data, together with internal structure information as the feature information; the material of the internal structure of the object is determined based on a deep learning model; and, when the object is determined to include multiple materials, a density fusion calculation is performed over those materials. The density fusion calculation weights the densities of the materials by the proportion of each material's volume in the total volume of the object to obtain the density of the object.
This technical solution has higher applicability: it enables mass estimation of objects that are multi-material, hollow, and the like, while ensuring high accuracy and high stability.
In some optional embodiments, the material library stores the correspondence between feature information, material, and density, so that the material and density can be determined from the feature information. When no material in the library matches the feature information, approximate material interval estimation is performed to predict a density range.
This technical solution has the beneficial effect that the density range of the object can be determined even when the feature information cannot be matched in the material library.
In a second aspect, the present application provides a quality estimation apparatus, the apparatus comprising: a detection module for detecting an object to identify feature information and contour data of the object; a matching module for matching the feature information against a pre-stored material library to determine the material of the object and the density corresponding to that material; a volume estimation module for estimating a volume of the object based on the contour data; and a mass estimation module for calculating the mass of the object using the volume and the density.
This technical solution has the beneficial effect that mass estimation of an object can be realized with high accuracy and high stability through machine vision recognition.
In some optional embodiments, the detection module comprises: a surface texture information extraction unit for detecting the object using a two-dimensional deep learning model to obtain surface texture information of the object as the feature information; and a contour data acquisition unit for obtaining the contour data by three-dimensional visual recognition. The surface texture information includes at least one of: color information, roughness information, and reflection information.
This technical solution has the advantage that, because a non-contact data acquisition mode is used, the integrity of the object is preserved, and the mass of an object of uniform material can be estimated with high accuracy and stability.
In some optional embodiments, the detection module comprises a multi-sensor fusion unit for detecting the object by multi-sensor fusion to obtain the contour data, together with internal structure information as the feature information. The matching module comprises: a deep learning unit for determining the material of the internal structure of the object based on a deep learning model; and a density fusion unit for performing a density fusion calculation over multiple materials when the object is determined to include multiple materials. The density fusion calculation weights the densities of the materials by the proportion of each material's volume in the total volume of the object to obtain the density of the object.
This technical solution has higher applicability: it enables mass estimation of objects that are multi-material, hollow, and the like, while ensuring high accuracy and high stability.
In some optional embodiments, the material library stores the correspondence between feature information, material, and density, so that the material and density can be determined from the feature information. The matching module comprises an approximate material interval estimation unit for performing approximate material interval estimation to predict a density range when no material in the library matches the feature information.
This technical solution has the beneficial effect that the density range of the object can be determined even when the feature information cannot be matched in the material library.
In a third aspect, the present application provides an electronic device comprising a memory, a processor, and hardware modules for performing tasks, where the memory stores a computer program and the processor, when executing the computer program, implements the steps of any of the methods described above.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the methods described above.
Drawings
The present application is further described below with reference to the drawings and examples.
Fig. 1 is a schematic flow chart of a quality estimation method provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of a quality estimation method according to an embodiment of the present disclosure;
fig. 3 is a schematic flow chart of a quality estimation method according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a quality estimation method provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a quality estimation apparatus provided in an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a quality estimation apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a quality estimation apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a quality estimation apparatus according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application; and
fig. 10 is a schematic structural diagram of a program product for implementing a quality estimation method according to an embodiment of the present application.
Detailed Description
The prior art provides a vision-based mine forklift capable of measuring stone material, comprising an ARM main controller, a camera, and two lidars, where the ARM main controller is connected to the camera and to the two lidars respectively. Using the camera and the lidars as information acquisition means, the weight of the stone blocks to be loaded and unloaded can be measured.
The prior art also proposes a method for determining the actual load weight of mining excavation equipment, comprising: scanning a surface and determining the surface shape; identifying a digging path; calculating the volume of excavated material based on at least the surface shape and the excavation path; and calculating the weight of the excavated material based on at least one density factor.
The above prior art still has many defects; for example, the material cannot be determined by machine vision recognition, and the accuracy of mass estimation is low. The present application is therefore proposed.
The present application is further described below with reference to the accompanying drawings and specific embodiments. It should be noted that, absent conflict, the embodiments and technical features described below may be combined arbitrarily to form new embodiments.
Referring to fig. 1, an embodiment of the present application provides a quality estimation method, which includes steps S101 to S104.
Wherein, step S101: a detection step of detecting an object to identify feature information and contour data of the object; step S102: matching the characteristic information by utilizing a pre-stored material library to determine the material of the object and the density corresponding to the material; step S103: a volume estimation step of estimating a volume of the object based on the contour data; step S104: a mass estimation step of calculating a mass of the object using the volume and the density.
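The four steps can be sketched end to end as a short pipeline. This is a minimal illustration only, not the patented implementation: the library entries, the texture descriptors, and the `estimate_volume` callback are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    feature_info: str   # surface texture descriptor from detection step S101
    contour_data: list  # 3-D contour points from detection step S101

# Hypothetical material library: descriptor -> (material, density in kg/m^3)
MATERIAL_LIBRARY = {
    "granite_texture": ("granite", 2700.0),
    "pine_texture": ("pine wood", 500.0),
}

def estimate_mass(detection, estimate_volume):
    """Steps S102-S104: match material, estimate volume (m^3), return mass (kg)."""
    material, density = MATERIAL_LIBRARY[detection.feature_info]  # S102
    volume = estimate_volume(detection.contour_data)              # S103
    return density * volume                                       # S104
```

For instance, a one-litre granite object (a volume estimator returning 0.001 m^3) yields 2700 × 0.001 = 2.7 kg.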
The above-described respective steps will be described below with reference to specific embodiments.
[Example 1]
Referring to fig. 2, one embodiment of the quality estimation method of the present application is shown. Wherein, the step S101 may include steps S201 and S202.
Step S201: surface texture information extraction step
And detecting the object by using a two-dimensional deep learning model to obtain surface texture information of the object as characteristic information.
The surface texture information may include color information, roughness information, reflection information (diffuse reflection, scattering, transmission, dispersion, etc., which can indicate thickness), and the like.
For objects with strong surface reflection or strong transparency, such as highly mirrored objects and high-transparency glass, the sensor may produce distorted surface contour acquisitions. For such objects, a sensor based on the spectral confocal technique can be used to acquire the surface data of the object, so that the surface texture information can still be obtained.
Step S202: contour data acquisition step
Contour data is obtained by three-dimensional visual recognition.
Note that when the object is a solid of uniform material, feature extraction and contour data extraction are performed using steps S201 and S202 above. When the object is multi-material, hollow, or the like, the results of steps S201 and S202 may deviate; in that case the internal structure of the object must be detected to determine the material of the interior, using step S203 described below (see the following description).
Thus, through steps S201 and S202, surface texture information can be extracted and contour data collected from the object via two-dimensional deep learning and three-dimensional visual recognition. Because a non-contact data acquisition mode is used, the integrity of the object is preserved, and the mass of an object of uniform material can be estimated with high accuracy and stability.
Step S102: matching step
The surface texture information obtained in step S101 is matched, in a big-data analysis manner, against a pre-stored material library to obtain the material of the object and the density corresponding to that material.
Specifically, visual big data is used to classify materials and material combinations according to surface texture information, and the density corresponding to each material is determined, so as to build a material library recording the correspondence among surface texture information, material, and density. The corresponding material and its density can then be retrieved from the library using the surface texture information obtained in step S101.
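One way to realize such a library lookup is nearest-neighbor matching over texture feature vectors. This is a sketch under assumptions: the feature vectors, the library rows, and the `max_dist` threshold below are illustrative, not taken from the application.

```python
import math

# Hypothetical library rows: (texture feature vector, material, density in kg/m^3)
LIBRARY = [
    ((0.8, 0.2, 0.1), "steel", 7850.0),
    ((0.3, 0.7, 0.5), "oak", 700.0),
]

def match_material(features, max_dist=0.5):
    """Return (material, density) of the closest library entry by Euclidean
    feature distance, or None when nothing is close enough (the unmatched
    case falls through to approximate material interval estimation, S303)."""
    best = min(LIBRARY, key=lambda row: math.dist(features, row[0]))
    if math.dist(features, best[0]) > max_dist:
        return None
    return best[1], best[2]
```

Returning `None` rather than the nearest entry regardless of distance keeps a clear boundary between the matched case (step S102) and the unmatched case handled later by interval estimation.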
Step S103: volume estimation step
From the contour data obtained in step S202, the volume of the object is estimated.
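The application leaves the volume-estimation algorithm unspecified. One standard approach, assuming the contour data has been triangulated into a closed surface mesh, is the divergence-theorem sum of signed tetrahedron volumes; the function below is an illustrative stand-in, not the patented method.

```python
def mesh_volume(vertices, triangles):
    """Volume enclosed by a closed triangulated surface: for each triangle
    (a, b, c), accumulate the signed tetrahedron volume a . (b x c) and
    divide the total by 6 (divergence theorem)."""
    total = 0.0
    for i, j, k in triangles:
        ax, ay, az = vertices[i]
        bx, by, bz = vertices[j]
        cx, cy, cz = vertices[k]
        total += (ax * (by * cz - bz * cy)
                  + ay * (bz * cx - bx * cz)
                  + az * (bx * cy - by * cx))
    return abs(total) / 6.0
```

For a unit tetrahedron with vertices at the origin and the three unit axis points, this returns 1/6, the expected volume.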
Step S104: quality estimation step
The mass of the object is calculated from the density of the object obtained in step S102 and the volume of the object estimated in step S103.
One embodiment of the quality estimation method of the present application is described above, and another embodiment of the quality estimation method of the present application will be described below with reference to fig. 3.
[Example 2]
As shown in fig. 3, the step S101 may include a step S203.
Step S203: multiple sensor fusion procedure
The object is detected by multi-sensor fusion to obtain the contour data and internal structure information as the feature information.
The multiple sensors include infrared sensors, laser sensors, and the like, which use spectral confocal techniques to detect the internal structure of the object.
Step S102 (matching step) may include steps S301 and S302.
Step S301: deep learning step
And determining the material of the internal structure of the object based on a deep learning model.
Specifically, the material of the internal structure is determined using the internal structure information obtained in step S203 in conjunction with the deep learning model.
The object detected in steps S203 and S301 may comprise multiple materials, or may be hollow; in these cases, density fusion over the multiple materials is performed in step S302.
Step S302: density fusion procedure
When the object is determined to include multiple materials, a density fusion calculation is performed over those materials. The density fusion calculation includes, for example, a weighted calculation (or averaging or the like) over the densities corresponding to the materials, based on the proportion of each material's volume in the volume of the object, to obtain a uniform density value as the density of the object.
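The weighted calculation described here amounts to a volume-weighted mean density. A minimal sketch follows, with hollow regions modeled as components of density zero; the function name and component encoding are illustrative.

```python
def fused_density(components):
    """Volume-weighted mean density of a multi-material object.
    Each component is a (volume, density) pair; hollow regions are
    represented as (volume, 0.0)."""
    total_volume = sum(volume for volume, _ in components)
    if total_volume == 0:
        raise ValueError("object has no volume")
    return sum(volume * density for volume, density in components) / total_volume
```

For example, an object that is 80% solid at 2500 kg/m^3 and 20% hollow fuses to 0.8 × 2500 = 2000 kg/m^3.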
Step S103: volume estimation step
From the contour data obtained in step S203, the volume of the object is estimated.
Step S104: Mass estimation step
The mass of the object is calculated from the density of the object obtained in step S302 and the volume of the object estimated in step S103.
The above-described embodiments enable quality estimation of an object including the cases of multi-material, hollow, and the like, and can ensure high accuracy and high stability.
[Example 3]
Steps of this embodiment that are the same as in the previously described embodiments are not repeated; the only difference lies in step S102, the matching step.
As shown in fig. 4, step S102 includes an approximate material interval estimation step S303.
Specifically, in the approximate material interval estimation step S303, when no material in the library can be matched from the feature information (internal structure information, surface texture information, or the like), approximate material interval estimation may be performed: a preset model predicts the density range of the object's internal structure from the densities of approximate (similar) materials.
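The "preset model" is not detailed in the application. As one plausible stand-in, the k nearest library entries in feature space can bound the predicted density range; the library rows, k, and function name below are hypothetical.

```python
import math

def density_interval(features, library, k=3):
    """Predict a (min, max) density range from the k library entries whose
    feature vectors are closest to the unmatched object's features.
    Each library row is (feature_vector, material_name, density)."""
    nearest = sorted(library, key=lambda row: math.dist(features, row[0]))[:k]
    densities = [density for _, _, density in nearest]
    return min(densities), max(densities)
```

The interval, rather than a point estimate, reflects the uncertainty of working from merely similar materials.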
According to this embodiment, the density range of the object can be determined when the feature information cannot be matched in the material library.
Several specific embodiments of the quality estimation method of the present application have been described above. The mass estimation method of the present application can estimate the mass of an object with high accuracy and high stability, and it has high applicability: it supports mass estimation of single-material objects as well as of multi-material, hollow, and similar objects.
Referring to fig. 5, an embodiment of the present application further provides a quality estimation apparatus 100. Its specific implementation is consistent with the implementations and technical effects described in the method embodiments above, and some details are not repeated.
The apparatus 100 comprises: a detection module 101, configured to detect an object to identify feature information and contour data of the object; a matching module 102, configured to match the feature information with a pre-stored material library to determine a material of the object and a density corresponding to the material; a volume estimation module 103 for estimating a volume of the object based on the contour data; and a mass estimation module 104 for calculating a mass of the object using the volume and the density.
The above-described respective modules will be described below with reference to specific embodiments.
[Example 4]
Referring to fig. 6, one embodiment of the mass estimation device of the present application is shown. Wherein the detection module 101 may comprise units 201 and 202.
Unit 201: Surface texture information extraction unit
The surface texture information extraction unit is used for detecting the object by using a two-dimensional deep learning model so as to obtain the surface texture information of the object as characteristic information.
The surface texture information may include color information, roughness information, reflection information (diffuse reflection, scattering, transmission, dispersion, etc., which can indicate thickness), and the like.
For objects with strong surface reflection or strong transparency, such as highly mirrored objects and high-transparency glass, the sensor may produce distorted surface contour acquisitions. For such objects, the surface texture information extraction unit 201 may acquire the surface data of the object using a sensor based on the spectral confocal technique, so that the surface texture information can still be obtained.
Unit 202: contour data acquisition unit
The contour data acquisition unit is used for acquiring contour data through three-dimensional visual recognition.
Note that when the object is a solid of uniform material, feature extraction and contour data extraction are performed by the surface texture information extraction unit 201 and the contour data acquisition unit 202 described above. When the object is multi-material, hollow, or the like, the results from units 201 and 202 may deviate; in that case the internal structure of the object must be detected by unit 203 described below to determine the material of the interior (see the following description).
Thus, through the surface texture information extraction unit 201 and the contour data acquisition unit 202, surface texture information can be extracted and contour data collected from the object via two-dimensional deep learning and three-dimensional visual recognition. Because a non-contact data acquisition mode is used, the integrity of the object is preserved, and the mass of an object of uniform material can be estimated with high accuracy and stability.
Module 102: Matching module
The matching module matches the surface texture information obtained by the surface texture information extraction unit 201, in a big-data analysis manner, against a pre-stored material library to obtain the material of the object and the density corresponding to that material.
Specifically, visual big data is used to classify materials and material combinations according to surface texture information, and the density corresponding to each material is determined, so as to build a material library recording the correspondence among surface texture information, material, and density. The matching module can then retrieve the corresponding material and its density from the library using the surface texture information obtained by the surface texture information extraction unit 201.
Module 103: Volume estimation module
The volume estimation module is configured to estimate a volume of the object according to the contour data obtained by the contour data obtaining unit 202.
Module 104: Mass estimation module
The mass of the object is calculated from the density of the object obtained by the matching module 102 and the volume of the object estimated by the volume estimation module 103.
One embodiment of the quality estimation device of the present application is described above, and another embodiment of the quality estimation device of the present application will be described below with reference to fig. 7.
[Example 5]
As shown in fig. 7, the detection module 101 may include a unit 203.
Unit 203: multi-sensor fusion unit
The multi-sensor fusion unit is configured to detect the object by multi-sensor fusion to obtain the contour data and internal structure information as the feature information.
The multiple sensors include infrared sensors, laser sensors, and the like, which use spectral confocal techniques to detect the internal structure of the object.
The matching module 102 may comprise a unit 301 and a unit 302.
Unit 301: deep learning unit
The deep learning unit is used for determining the material of the internal structure of the object based on a deep learning model.
Specifically, the deep learning unit determines the material of the internal structure using the internal structure information obtained by the multi-sensor fusion unit 203 in conjunction with the deep learning model.
The object detected by the multi-sensor fusion unit 203 and the deep learning unit 301 may comprise multiple materials, or may be hollow; in these cases, density fusion over the multiple materials is performed by unit 302 described below.
Unit 302: Density fusion unit
The density fusion unit performs a density fusion calculation over multiple materials when the object is determined to include multiple materials. The density fusion calculation includes, for example, a weighted calculation (or averaging or the like) over the densities corresponding to the materials, based on the proportion of each material's volume in the volume of the object, to obtain a uniform density value as the density of the object.
Module 103: Volume estimation module
The volume of the object is estimated from the contour data obtained by the multi-sensor fusion unit 203.
Module 104: Mass estimation module
The mass of the object is calculated from the density of the object obtained by the density fusion means 302 and the volume of the object estimated by the volume estimation module 103.
The above-described embodiments enable quality estimation of an object including the cases of multi-material, hollow, and the like, and can ensure high accuracy and high stability.
[Example 6]
Modules and/or units of this embodiment that are the same as in the previously described embodiments are not repeated; the only difference lies in the matching module 102.
As shown in fig. 8, the matching module 102 includes an approximate material interval estimation unit 303.
Specifically, the approximate material interval estimation unit 303 is configured, when no material in the library can be matched from the feature information (internal structure information, surface texture information, or the like), to perform approximate material interval estimation: a preset model predicts the density range of the object's internal structure from the densities of approximate (similar) materials.
Several specific embodiments of the quality estimation apparatus of the present application have been described above. The apparatus can estimate the mass of an object with high accuracy and high stability, and it has high applicability: it supports mass estimation of single-material objects as well as of multi-material, hollow, and similar objects.
Referring to fig. 9, an embodiment of the present application further provides an electronic device 200, where the electronic device 200 includes at least one memory 210, at least one processor 220, and a bus 230 connecting different platform systems.
The memory 210 may include readable media in the form of volatile memory, such as random access memory (RAM) 211 and/or cache memory 212, and may further include read-only memory (ROM) 213.
The memory 210 further stores a computer program that can be executed by the processor 220, causing the processor 220 to perform the steps of any one of the methods in the embodiments of the present application. The specific implementation is consistent with the implementation and technical effects described in the method embodiments, and some details are therefore not repeated here.
Memory 210 may also include a program/utility 214 having a set (at least one) of program modules 215, including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Accordingly, the processor 220 may execute the computer program described above, and may also execute the program/utility 214.
Bus 230 may represent one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor bus or local bus using any of a variety of bus architectures.
The electronic device 200 may also communicate with one or more external devices 240, such as a keyboard, pointing device, Bluetooth device, etc., and may also communicate with one or more devices capable of interacting with the electronic device 200, and/or with any devices (e.g., routers, modems, etc.) that enable the electronic device 200 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 250. Also, the electronic device 200 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 260. The network adapter 260 may communicate with other modules of the electronic device 200 via the bus 230. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 200, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiments of the present application further provide a computer-readable storage medium for storing a computer program. When the computer program is executed, the steps of any one of the methods in the embodiments of the present application are implemented; the specific implementation is consistent with the implementation and technical effects described in the method embodiments, and some details are not repeated here.
Fig. 10 shows a program product 300 provided by the present embodiment for implementing the method. It may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product 300 of the present application is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The program product 300 may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
The foregoing description and drawings are only for illustrating the preferred embodiments of the present application and are not intended to limit it; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application.

Claims (18)

1. A method of mass estimation, the method comprising:
detecting an object to identify feature information and contour data of the object;
matching the feature information against a pre-stored material library to determine the material of the object and the density corresponding to the material;
estimating a volume of the object based on the contour data; and
calculating the mass of the object using the volume and the density.
2. The method of claim 1,
detecting the object by using a two-dimensional deep learning model to obtain surface texture information of the object as the feature information; and
obtaining the contour data by three-dimensional visual recognition.
3. The method of claim 2,
the surface texture information includes at least one of: color information, roughness information, and reflection information.
4. The method of claim 1,
the object is detected by multi-sensor fusion to obtain the contour data and internal structure information as the feature information.
5. The method of claim 4,
determining the material of the internal structure of the object based on a deep learning model; and
performing density fusion calculation on the plurality of materials when the object is determined to comprise a plurality of materials.
6. The method of claim 5,
the density fusion calculation comprises performing a weighted calculation on the plurality of densities corresponding to the plurality of materials, according to the proportion of the volume of each material to the volume of the object, to obtain the density of the object.
7. The method according to any one of claims 1 to 6,
the material library stores the correspondence among the feature information, the material, and the density, so that the corresponding material and density can be determined according to the feature information.
8. The method of claim 7,
when no material matching the feature information exists in the material library, performing approximate material interval estimation to predict a density range.
9. An apparatus for estimating quality, the apparatus comprising:
a detection module for detecting an object to identify feature information and contour data of the object;
a matching module for matching the feature information against a pre-stored material library to determine the material of the object and the density corresponding to the material;
a volume estimation module for estimating a volume of the object based on the contour data; and
a mass estimation module for calculating a mass of the object using the volume and the density.
10. The apparatus of claim 9, wherein the detection module comprises:
a surface texture information extraction unit configured to detect the object by using a two-dimensional deep learning model to obtain surface texture information of the object as the feature information; and
a contour data acquisition unit for obtaining the contour data by three-dimensional visual recognition.
11. The apparatus of claim 10,
the surface texture information includes at least one of: color information, roughness information, and reflection information.
12. The apparatus of claim 9, wherein the detection module comprises:
a multi-sensor fusion unit for detecting the object by multi-sensor fusion to obtain the contour data and internal structure information as the feature information.
13. The apparatus of claim 12, wherein the matching module comprises:
a deep learning unit for determining a material of an internal structure of the object based on a deep learning model; and
a density fusion unit for performing density fusion calculation on the plurality of materials when the object is determined to comprise a plurality of materials.
14. The apparatus of claim 13,
the density fusion calculation comprises performing a weighted calculation on the plurality of densities corresponding to the plurality of materials, according to the proportion of the volume of each material to the volume of the object, to obtain the density of the object.
15. The apparatus according to any one of claims 9 to 14,
the material library stores the correspondence among the feature information, the material, and the density, so that the corresponding material and density can be determined according to the feature information.
16. The apparatus of claim 15, wherein the matching module comprises:
an approximate material interval estimation unit for performing approximate material interval estimation to predict a density range when no material matching the feature information exists in the material library.
17. An electronic device, characterized in that the electronic device comprises a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the method according to any one of claims 1-8.
18. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
CN202011614937.XA 2020-12-30 2020-12-30 Quality estimation method, quality estimation device, electronic device and storage medium Pending CN112785554A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011614937.XA CN112785554A (en) 2020-12-30 2020-12-30 Quality estimation method, quality estimation device, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN112785554A true CN112785554A (en) 2021-05-11

Family

ID=75754105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011614937.XA Pending CN112785554A (en) 2020-12-30 2020-12-30 Quality estimation method, quality estimation device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN112785554A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117773952A (en) * 2024-02-23 2024-03-29 浙江强脑科技有限公司 Bionic hand control method, storage medium, control device and bionic hand

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203493A (en) * 2016-07-04 2016-12-07 何广森 A kind of food identification device and recognition methods
CN109064509A (en) * 2018-06-29 2018-12-21 广州雅特智能科技有限公司 The recognition methods of food volume and fuel value of food, device and system
CN110287207A (en) * 2019-06-30 2019-09-27 北京健康有益科技有限公司 A kind of quality of food estimating and measuring method based on density meter
CN111819467A (en) * 2018-01-25 2020-10-23 海浪科技有限公司 Method and apparatus for estimating wave propagation and scattering parameters


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHEN Yue et al.: "Research and Application of Laser Sensors in Spray Target Detection", Research of Agricultural Modernization *


Similar Documents

Publication Publication Date Title
US11276158B2 (en) Method and apparatus for inspecting corrosion defect of ladle
CN114295324B (en) Fault detection method, device, equipment and storage medium
Malek et al. Methodology to integrate augmented reality and pattern recognition for crack detection
US20230290054A1 (en) Information processing device and method, and non-transitory computer-readable recording medium
CN112785554A (en) Quality estimation method, quality estimation device, electronic device and storage medium
CN112162294B (en) Robot structure detection method based on laser sensor
CN113218328A (en) Equipment maintenance method, device, equipment and medium based on three-dimensional laser scanning
CN111401229B (en) Automatic labeling method and device for small visual targets and electronic equipment
CN112529952B (en) Object volume measurement method and device and electronic equipment
CN112720496B (en) Control method and device for manipulator, pickup device and storage medium
CN112720500B (en) Control method and device for manipulator, pickup device and storage medium
CN113538558B (en) Volume measurement optimization method, system, equipment and storage medium based on IR diagram
CN114220011A (en) Goods quantity identification method and device, electronic equipment and storage medium
CN114266941A (en) Method for rapidly detecting annotation result data of image sample
CN114139622A (en) State monitoring method and device, electronic equipment and readable storage medium
CN114460599A (en) Station building structure safety monitoring method and device based on laser radar and electronic equipment
CN112720499B (en) Control method and device for manipulator, pickup device and storage medium
CN112785553A (en) Density estimation method, density estimation device, electronic equipment and computer-readable storage medium
CN116109932B (en) House security detection method, house security detection device, electronic equipment and readable storage medium
CN118034326B (en) Container inspection robot control method and device, robot and storage medium
CN112683221B (en) Building detection method and related device
CN117191950B (en) Rail hanging structure health monitoring method, system, storage medium and computing equipment
CN112784687B (en) Control method, device and equipment of manipulator and computer readable storage medium
CN117098106B (en) Bluetooth testing method and device, electronic equipment and storage medium
JP5222314B2 (en) Surface measurement method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220330

Address after: Building C, No.888, Huanhu West 2nd Road, Lingang New District, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: Shenlan Intelligent Technology (Shanghai) Co.,Ltd.

Address before: 213000 No.103, building 4, Chuangyan port, Changzhou science and Education City, No.18, middle Changwu Road, Wujin District, Changzhou City, Jiangsu Province

Applicant before: SHENLAN ARTIFICIAL INTELLIGENCE CHIP RESEARCH INSTITUTE (JIANGSU) Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20210511