CN116958135A - Texture detection processing method and device - Google Patents

Texture detection processing method and device

Info

Publication number
CN116958135A
CN116958135A (application CN202311205483.4A)
Authority
CN
China
Prior art keywords
texture
image
detection
ceramic product
factory
Prior art date
Legal status
Granted
Application number
CN202311205483.4A
Other languages
Chinese (zh)
Other versions
CN116958135B (en)
Inventor
沈晓东
王萌
陈亚莉
王向阳
张航
吴仁珂
李耀
李文
刘璇
王一清
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202410182375.8A (published as CN118051099A)
Priority to CN202311205483.4A (published as CN116958135B)
Publication of CN116958135A
Application granted
Publication of CN116958135B
Legal status: Active
Anticipated expiration

Classifications

    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06K7/10009 Sensing record carriers by electromagnetic radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
    • G06T7/11 Region-based segmentation
    • G06T7/12 Edge-based segmentation
    • G06T7/13 Edge detection
    • G06T7/194 Foreground-background segmentation
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06T2207/10004 Still image; photographic image
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of this specification provide a texture detection processing method and device. During texture detection, a server receives an access request submitted by a user terminal based on the identification information of a ceramic product, generates a detection page for detecting the texture of the ceramic product, and returns it to the user terminal. The server then receives a texture image of the ceramic product captured by the user terminal, performs feature extraction on the texture image to obtain texture features, performs texture detection based on those texture features and the factory texture features of the factory texture image bound to the identification information, and returns the detection result to the user terminal.

Description

Texture detection processing method and device
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a texture detection processing method and apparatus.
Background
With the continued development of internet technology and rising living standards, users increasingly purchase distinctive ceramic craftwork among geographical-indication products for everyday use, decoration, or collection. Such craftwork attracts more and more buyers with its appearance, unique workmanship, and practicality. However, some goods on the market are inferior: ordinary ceramic craftwork is passed off as the geographical-indication product, and users find it difficult to verify authenticity after purchase.
In the prior art, images of the original porcelain are typically captured under an ultraviolet light source and X-rays; images of the porcelain to be inspected are then captured under the same light sources and at the same angles, and the porcelain is uniquely identified by measuring the similarity between the two sets of image data.
Disclosure of Invention
One or more embodiments of the present specification provide a texture detection processing method applied to a server, including: receiving an access request submitted by a user terminal based on the identification information of a ceramic product; in response to the access request, generating a detection page for detecting the texture of the ceramic product and returning it to the user terminal; receiving a texture image of the ceramic product captured by the user terminal, and performing feature extraction on the texture image to obtain texture features; and performing texture detection based on the texture features and the factory texture features of the factory texture image bound to the identification information, obtaining a texture detection result, and returning it to the user terminal.
One or more embodiments of the present specification provide another texture detection processing method applied to a user terminal, including: parsing the identification information of a ceramic product, generating an access request based on the parsing result, and submitting the access request to a server; receiving a detection page returned by the server for detecting the texture of the ceramic product; capturing a texture image of the ceramic product by triggering an image acquisition interface configured on the detection page; uploading the texture image to the server for texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound to the identification information; and receiving a texture detection result of the ceramic product returned by the server.
One or more embodiments of the present specification provide a texture detection processing apparatus running on a server, including: an access request receiving module configured to receive an access request submitted by a user terminal based on the identification information of a ceramic product; a detection page generation module configured to, in response to the access request, generate a detection page for detecting the texture of the ceramic product and return it to the user terminal; a texture feature extraction module configured to receive a texture image of the ceramic product captured by the user terminal and perform feature extraction on the texture image to obtain texture features; and a texture detection module configured to perform texture detection based on the texture features and the factory texture features of the factory texture image bound to the identification information, obtain a texture detection result, and return it to the user terminal.
One or more embodiments of the present specification provide another texture detection processing apparatus running on a user terminal, including: an access request submission module configured to parse the identification information of a ceramic product, generate an access request based on the parsing result, and submit the access request to a server; a detection page receiving module configured to receive a detection page returned by the server for detecting the texture of the ceramic product; a texture image acquisition module configured to capture a texture image of the ceramic product by triggering an image acquisition interface configured on the detection page; a texture image uploading module configured to upload the texture image to the server for texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound to the identification information; and a detection result receiving module configured to receive a texture detection result of the ceramic product returned by the server.
One or more embodiments of the present specification provide a server, including: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: receive an access request submitted by a user terminal based on the identification information of a ceramic product; in response to the access request, generate a detection page for detecting the texture of the ceramic product and return it to the user terminal; receive a texture image of the ceramic product captured by the user terminal, and perform feature extraction on the texture image to obtain texture features; and perform texture detection based on the texture features and the factory texture features of the factory texture image bound to the identification information, obtain a texture detection result, and return it to the user terminal.
One or more embodiments of the present specification provide a user terminal, including: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: parse the identification information of a ceramic product, generate an access request based on the parsing result, and submit the access request to a server; receive a detection page returned by the server for detecting the texture of the ceramic product; capture a texture image of the ceramic product by triggering an image acquisition interface configured on the detection page; upload the texture image to the server for texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound to the identification information; and receive a texture detection result of the ceramic product returned by the server.
One or more embodiments of the present specification provide a storage medium storing computer-executable instructions that, when executed by a processor, implement the server-side method described above.
One or more embodiments of the present specification provide another storage medium storing computer-executable instructions that, when executed by a processor, implement the user-terminal-side method described above.
Drawings
To describe the technical solutions in one or more embodiments of the present specification or in the prior art more clearly, the drawings required in the following description are briefly introduced. The drawings described below are only some of the embodiments in this specification; those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram illustrating an implementation environment of a texture detection method according to one or more embodiments of the present disclosure;
FIG. 2 is a flow chart of a texture detection method according to one or more embodiments of the present disclosure;
FIG. 3 is a schematic diagram of a detection page provided in one or more embodiments of the present disclosure;
FIG. 4 is a schematic illustration of a ceramic article provided in one or more embodiments of the present disclosure;
FIG. 5 is a schematic diagram illustrating an edge detection effect provided by one or more embodiments of the present disclosure;
FIG. 6 is a schematic representation of a texture feature provided in one or more embodiments of the present disclosure;
FIG. 7 is a schematic diagram of image acquisition data carried by a detection page according to one or more embodiments of the present disclosure;
FIG. 8 is a processing sequence diagram of a texture detection method applied to a porcelain detection scenario according to one or more embodiments of the present disclosure;
FIG. 9 is a processing sequence diagram of a texture detection method applied to a ceramic detection scenario according to one or more embodiments of the present disclosure;
FIG. 10 is a process flow diagram of another texture detection processing method provided in one or more embodiments of the present disclosure;
FIG. 11 is a schematic diagram of an embodiment of a texture detection processing apparatus according to one or more embodiments of the present disclosure;
FIG. 12 is a schematic diagram of another texture detection processing apparatus according to one or more embodiments of the present disclosure;
FIG. 13 is a schematic diagram of a server according to one or more embodiments of the present disclosure;
fig. 14 is a schematic structural diagram of a user terminal according to one or more embodiments of the present disclosure.
Detailed Description
To enable those skilled in the art to better understand the technical solutions in one or more embodiments of the present specification, these solutions are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present specification. All other embodiments obtained by those of ordinary skill in the art from one or more embodiments of the present specification without inventive effort shall fall within the scope of protection of the present specification.
The texture detection processing method provided by one or more embodiments of the present specification is applicable to a texture detection implementation environment. Referring to fig. 1, the implementation environment includes at least a server 101, where the server 101 may be a single server, a server cluster formed by a plurality of servers, or a cloud server of a cloud computing platform. The server 101 is configured to perform texture detection to obtain a detection result.
The implementation environment further includes at least a user terminal 102, where the user terminal 102 may be a smart phone, a tablet computer, an e-book reader, a wearable device, an AR (augmented reality) or VR (virtual reality) device for information interaction, etc. The user terminal 102 is configured to receive the detection page sent by the server 101, upload the texture image to the server 101, and receive the texture detection result returned by the server 101. The user terminal 102 may be configured with a texture detection client, which receives and displays the detection page and captures and uploads the texture image; the client may take the form of an application program, a subprogram within an application program, or a service module within an application program.
The implementation environment may include a database 103, where the database 103 stores a data table of correspondence between identification information of the ceramic product and a factory texture image, and a factory texture picture to which the identification information is bound.
In addition, the functions of the database 103 may also be implemented by the server 101, for example: the server 101 stores a data table of correspondence between identification information of the ceramic product and a factory texture image, and a factory texture picture to which the identification information is bound.
In this implementation environment, during texture detection the user terminal 102 parses the identification information of the ceramic product, generates an access request based on the parsing result, and submits it to the server 101. The server 101 receives the access request, generates a detection page for texture detection of the ceramic product, and returns it to the user terminal 102. The user terminal 102 receives the detection page, captures a texture image of the ceramic product by triggering the image acquisition interface configured on the detection page, and uploads the texture image to the server 101. The server 101 receives the texture image, performs feature extraction on it to obtain texture features, reads the factory texture image bound to the identification information from the database 103, performs texture detection based on the texture features and the factory texture features of the factory texture image, and returns the detection result to the user terminal 102.
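As a rough illustration of the final comparison step in this flow, the texture features extracted from the uploaded image can be compared against the stored factory texture features with a similarity measure (classification G06V10/761 covers such proximity/similarity measures). The sketch below is a minimal, hypothetical example using cosine similarity over feature vectors; the actual feature extractor, feature dimensionality, and threshold are not specified by this description and are assumptions here.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def detect_texture(query_features: np.ndarray,
                   factory_features: np.ndarray,
                   threshold: float = 0.9) -> dict:
    """Compare uploaded texture features against the factory texture
    features bound to the product's identification information.
    The 0.9 threshold is illustrative only."""
    score = cosine_similarity(query_features, factory_features)
    return {"similarity": round(score, 4), "match": score >= threshold}
```

Any distance in feature space (Euclidean, learned metric) could replace cosine similarity; the point is only that the server reduces two feature vectors to a pass/fail detection result.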
One or more embodiments of a texture detection processing method provided in the present specification are as follows:
referring to fig. 2, the texture detection processing method provided in the present embodiment is applicable to a server, and specifically includes steps S202 to S208.
Step S202, an access request submitted by a user terminal based on the identification information of the ceramic product is received.
In practice, when purchasing ceramic products a user may encounter goods refilled into genuine packaging, or, in online transactions, receive a ceramic product inconsistent with the one purchased, and the user may find it difficult to verify whether the purchased product matches the one that left the factory. For this purpose, this embodiment identifies and marks ceramic products in advance when they leave the factory and uploads factory images of the produced products, so that a user can enter a detection flow that verifies the uniqueness of a purchased ceramic product based on its identification information.
In this embodiment, a ceramic product refers to a craft product made of ceramic with different types and/or different patterns of texture on the surface, which may be ornamental and/or practical, such as a porcelain bowl, a porcelain dish, a porcelain vase, or a porcelain ornament. A ceramic product may also refer to a product produced in a particular region and named by geographic name, that is, a ceramic product among geographical-indication products. Specifically, ceramic products can be divided into pottery products and porcelain products, and each can be further divided by function or style; for example, porcelain products can be divided by function into porcelain tableware, porcelain ornaments, porcelain vases, and porcelain tiles.
In specific implementation, the user terminal requests to enter a detection process for detecting the uniqueness of the ceramic product by submitting the access request of the ceramic product, specifically, the user terminal can generate and submit the access request based on the identification information of the ceramic product, and the submitted access request can also contain the identification information of the ceramic product.
In practical applications, to ensure the uniqueness of a ceramic product, an identification code can be set on the product when it leaves the factory, with the product's identification information encoded into it. The user then scans and parses the identification code through the user terminal, and an access request is generated and submitted based on the parsed identification information. The identification code may be arranged on the packaging of the ceramic product or on the product itself; for example, a product two-dimensional code may be placed on the bottom of a porcelain bowl. The identification information may be a unique identifier of the ceramic product, corresponding to production information and/or product information, and may also include fields corresponding to the product data of the ceramic product, such as the enterprise's product name, the factory code-scanning time, and the place code number. In an optional implementation provided by this embodiment, the identification information of the ceramic product is obtained as follows:
And scanning and decoding the identification code of the ceramic product through the user terminal, and obtaining the identification information of the ceramic product based on a decoding result.
Specifically, the user terminal scans the identification code configured on the ceramic product, the decoding component of the user terminal decodes the identification code to obtain a decoding result, and the unique identification information of the ceramic product is obtained based on the decoding result, so that the user terminal generates an access request based on that unique identification information and submits it to the server. Optionally, the unique identification information includes a factory number and/or a production number of the ceramic product.
For example, a user scans a two-dimensional code configured at the bottom of a ceramic product through a user terminal and decodes the two-dimensional code to obtain a decoding result, and a factory number of the ceramic product is obtained based on the decoding result.
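Assuming the decoded identification code carries delimited fields such as the enterprise's product name, factory code-scanning time, and place code number (fields the description mentions as examples), parsing the decoded payload might look like the sketch below. The field names, field order, and `|`-delimited payload format are illustrative assumptions, not defined by this specification.

```python
def parse_identification_info(decoded: str) -> dict:
    """Split a decoded identification-code payload into named fields.
    The '|' delimiter and field order are hypothetical."""
    keys = ("product_name", "factory_scan_time", "place_code", "serial_no")
    values = decoded.split("|")
    if len(values) != len(keys):
        raise ValueError("unexpected identification payload")
    return dict(zip(keys, values))

# Hypothetical payload as it might come out of the QR decoder:
info = parse_identification_info("PorcelainBowl|2023-09-01T10:00|0571|000123")
```

The server would use a field such as `serial_no` as the key that binds the product to its factory texture image.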
In addition, to improve convenience for the user, an NFC (Near Field Communication) tag may also be set when the ceramic product leaves the factory, with the identification information of the ceramic product configured in the tag. The user performs near field communication with the NFC tag through the near field communication component of the user terminal to obtain the identification information. The NFC tag may likewise be arranged on the packaging of the ceramic product or on the product itself.
And invoking a near field communication component of the user terminal to perform near field communication interaction with a near field communication tag of the ceramic product to obtain identification information of the ceramic product.
Specifically, the user terminal is brought close to the NFC tag of the ceramic product; once the distance reaches a threshold, the near field communication component of the user terminal is invoked to interact with the NFC tag and read the unique identification information of the ceramic product.
For example, after the distance reaches a threshold, the user calls a near field communication component of the user terminal to perform near field communication with the NFC tag of the ceramic product to obtain a field contained in the identification information of the ceramic product.
It should be noted that a ceramic product may be configured with both an identification code and an NFC tag. In that case, during unique identification, the identification information may be obtained by interacting with either of them: by scanning and decoding the identification code and obtaining the identification information from the decoding result, or by near field communication with the NFC tag.
And step S204, responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning to the user terminal.
In specific implementation, after the access request for detecting the uniqueness of the ceramic product submitted by the user terminal is received, a detection page for detecting the texture of the ceramic product is generated for the access request and returned to the user terminal. Once the detection page has been returned, the texture image of the ceramic product can be collected through it: specifically, an image acquisition interface can be configured on the detection page, and triggering this interface enters the texture image acquisition flow.
After the detection page is returned, the user terminal displays it, so that the user can capture texture images of the purchased ceramic product by interacting with the page. Specifically, after receiving the detection page, the user terminal can trigger the image acquisition interface configured on it to submit an image-acquisition triggering instruction. Optionally, after the detection page is returned to the user terminal, image acquisition of the ceramic product is performed by triggering the image acquisition interface configured on the detection page.
In the specific detection page generation process, the identification information of the ceramic product carried in the access request can be extracted, the factory image of the ceramic product corresponding to the identification information can be queried in the database, and the detection page can be generated based on the factory image and the image acquisition interface. As shown in fig. 3, the detection page contains a factory image 301 and an image acquisition interface 302.
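The page-generation step can be sketched as a simple server-side handler: extract the identification information from the access request, look up the bound factory texture image, and assemble a page payload containing the factory image and the image acquisition interface. This is an illustrative, framework-agnostic sketch; the in-memory dictionary standing in for the database and all field names are assumptions.

```python
# Hypothetical in-memory stand-in for the database binding
# identification information to factory texture images.
FACTORY_IMAGES = {"CN-0001": "factory/CN-0001_texture.png"}

def build_detection_page(access_request: dict) -> dict:
    """Generate the detection page payload for an access request."""
    ident = access_request["identification_info"]
    factory_image = FACTORY_IMAGES.get(ident)
    if factory_image is None:
        return {"error": "unknown identification information"}
    return {
        "factory_image": factory_image,       # element 301 in FIG. 3
        "image_acquisition_interface": True,  # element 302 in FIG. 3
    }
```

In a real deployment the lookup would hit the database 103 described above, and the payload would be rendered into the page returned to the user terminal 102.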
In practical applications, the surface textures of ceramic products with different purposes may be distributed at different positions. For example, the surface texture of a ceramic dish is mainly distributed on the dish face, so its texture image can be obtained by photographing the dish face; the surface texture of a ceramic vase is mainly distributed on the side of the vase body, so its texture image can be obtained by photographing the side of the body. In addition, some categories of ceramic products may require texture images captured from multiple angles to cover the complete texture; for example, a ceramic article with a relatively complex surface structure may need multi-angle acquisition.
For this reason, after the detection page configured with the image acquisition interface is returned to the user terminal, and before the image acquisition interface is triggered (that is, before the texture image of the ceramic product is captured), the shape classification of the ceramic product can be determined from a pre-captured image, and an acquisition reminder for the texture image can then be generated from that shape classification. This improves acquisition efficiency and avoids re-acquisition caused by texture images that do not meet the requirements. In an optional implementation provided by this embodiment, before the image acquisition interface is triggered, the user terminal performs the following operations by calling a pre-detection interface:
Identifying the shape classification of the ceramic product corresponding to the acquired first texture image, and determining image acquisition data corresponding to the shape classification;
and generating an image acquisition reminder carrying the image acquisition data, and displaying the image acquisition reminder through the detection page.
Specifically, when the detection page is displayed on the user terminal, a first texture image of the ceramic product is collected and uploaded by calling the pre-detection interface, the shape classification of the ceramic product is identified based on the first texture image, image collection data is determined based on the identification result, an image collection reminder carrying the image collection data is generated, and the returned image collection reminder is displayed on the detection page. The image acquisition reminder can be a text reminder, a voice reminder, or an image reminder displayed on the detection page, and the image acquisition data can be a schematic diagram, a thumbnail, or a watermarked image for image acquisition.
For example, as shown in fig. 4, the user terminal collects an image of the porcelain bowl through the pre-detection interface, identifies that the porcelain bowl belongs to the category of porcelain tableware based on the image, determines a thumbnail 401 of a top view of the porcelain bowl, generates a text reminder for image collection based on the thumbnail 401 of the top view, and displays the text reminder for image collection and the thumbnail 401 of the top view of the porcelain bowl on the detection page.
In this embodiment, in the process of performing texture image acquisition by the user terminal, image quality detection may also be performed on a texture image acquired by the user terminal, specifically, by extracting an image parameter of the acquired texture image, and calculating an image definition of the acquired texture image according to the image parameter, if the image definition meets a texture image acquisition condition, for example, is greater than a definition threshold for performing texture image acquisition, determining that the detection is passed, and in the case that the detection is determined to be passed, uploading, by the user terminal, the currently acquired texture image to the server; otherwise, the detection is determined to not pass, and a re-acquisition reminder can be generated.
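The sharpness-based image quality detection described above can be sketched as follows. This is a minimal illustration only: the Laplacian-variance measure, the function names, and the threshold value of 100 are assumptions, not values specified by this embodiment.

```python
import numpy as np

def sharpness_score(gray: np.ndarray) -> float:
    """Variance of the Laplacian response, a common sharpness proxy."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
    g = gray.astype(np.float64)
    h, w = g.shape
    out = np.zeros((h - 2, w - 2))
    # Manual 3x3 convolution over the valid region (no extra dependencies)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * g[i:h - 2 + i, j:w - 2 + j]
    return float(out.var())

def passes_quality_check(gray: np.ndarray, threshold: float = 100.0) -> bool:
    """Texture image acquisition condition: sharpness above a threshold."""
    return sharpness_score(gray) > threshold
```

If the check fails, the terminal would generate a re-acquisition reminder instead of uploading the image.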
In addition, in the process of collecting the texture image by the user terminal, the collected texture image is subjected to image pre-detection, or under the condition that the detection of the image quality detection is passed, the collected texture image is further subjected to image pre-detection; the image pre-detection of the collected texture image may be to calculate the similarity between the currently collected texture image and the factory texture image at the image level, if the similarity is low, the texture detection of the currently collected texture image may not be performed through the factory texture image, so that re-collection is required, specifically, the image pre-detection of the collected texture image includes: calculating the image similarity of the texture image acquired by the user terminal and the factory texture image, if the similarity is larger than a similarity threshold, determining that the detection of the image pre-detection is passed, and uploading the currently acquired texture image to a server by the user terminal under the condition of the detection passing; otherwise, if the similarity is smaller than or equal to the similarity threshold, determining that the detection of the image pre-detection is not passed, and generating a re-acquisition prompt.
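The image-level similarity comparison of the image pre-detection can be sketched as follows. Histogram intersection is used here as one possible similarity measure; the function names and the 0.6 threshold are illustrative assumptions, not values from this embodiment.

```python
import numpy as np

def image_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Coarse image-level similarity via grayscale histogram intersection."""
    ha, _ = np.histogram(a, bins=32, range=(0, 256))
    hb, _ = np.histogram(b, bins=32, range=(0, 256))
    ha = ha / max(ha.sum(), 1)  # normalize counts to probability mass
    hb = hb / max(hb.sum(), 1)
    return float(np.minimum(ha, hb).sum())  # 1.0 for identical histograms

def pre_detection_passes(captured, factory, threshold=0.6):
    """Image pre-detection: pass only if similarity exceeds the threshold."""
    return image_similarity(captured, factory) > threshold
```

When pre-detection fails, the terminal would generate a re-acquisition prompt rather than upload the image to the server.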
Step S206, receiving the texture image of the ceramic product acquired by the user terminal, and extracting the characteristics of the texture image to obtain texture characteristics.
In practical application, because different manufacturing and firing processes may give ceramic products different surface textures, in this embodiment the ceramic product is detected by feature comparison: a texture image of the ceramic product is collected, texture features are extracted, and the texture features are compared with the factory texture features of the factory image corresponding to the identification information. Performing the comparison at the feature level improves both the detection efficiency and the detection accuracy.
In a specific implementation, the user terminal acquires the texture image of the ceramic product and uploads it to the server; after receiving the texture image uploaded by the user terminal, the server performs feature extraction on the texture image through a specific algorithm to obtain the texture features of the ceramic product in the texture image for subsequent texture detection processing.
Specifically, in order to improve the extraction accuracy of the texture features, the image area corresponding to the ceramic product can first be extracted from the texture image, and texture feature extraction can then be performed on that image area. For example, the texture image can be subjected to image segmentation, the image area corresponding to the ceramic product can be obtained based on the segmentation result, and texture features can be extracted from that area. In addition, when noise is present in the texture image, after the image area corresponding to the ceramic product is extracted, the area can be divided, the noise-bearing sub-areas can be removed, and texture feature extraction can be performed on the image area from which the noise has been removed.
In a specific implementation process, in order to improve the extraction accuracy of texture features, edge detection may be applied to the texture image to extract a target image area corresponding to the ceramic product, that is, to remove the background image area around the ceramic product in the texture image, and texture features of the ceramic product are then extracted from the target image area. In an alternative implementation manner provided in this embodiment, extracting features from the texture image to obtain texture features includes:
performing edge detection on the texture image, and extracting a target image area corresponding to the ceramic product from the texture image based on an edge detection result;
and extracting texture features from the target image area to obtain the texture features.
Specifically, an edge detection algorithm is applied to the texture image, the image area coordinates corresponding to the ceramic product in the texture image are calculated, the target image area corresponding to the ceramic product in the texture image is extracted based on the image area coordinates, and texture feature extraction is performed on the target image area to obtain the texture features of the ceramic product.
For example, as shown in fig. 5, after receiving the texture image of the porcelain bowl acquired and uploaded by the user terminal, performing edge detection on the texture image of the porcelain bowl by using an edge detection algorithm to obtain an image as shown in fig. 6, extracting a target image area corresponding to the porcelain bowl, and performing texture feature extraction on the target image area by using a Canny operator to obtain texture features of the porcelain bowl as shown in fig. 7.
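The extraction of the target image area based on an edge detection result can be illustrated with a simplified sketch. A finite-difference gradient map stands in for the Canny operator here, and the function name and the gradient threshold are assumptions for illustration only.

```python
import numpy as np

def extract_target_region(gray: np.ndarray, grad_threshold: float = 30.0) -> np.ndarray:
    """Crop the texture image to the bounding box of strong-gradient pixels.

    A simplified stand-in for the edge-detection step: finite-difference
    gradients approximate an edge map, and the target image area is the
    tightest box containing all edge pixels.
    """
    gy, gx = np.gradient(gray.astype(np.float64))
    edges = np.hypot(gx, gy) > grad_threshold
    ys, xs = np.nonzero(edges)
    if ys.size == 0:  # no edges found: return the full image unchanged
        return gray
    return gray[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```

In the embodiment, the cropped region would then be passed on to texture feature extraction, with the background area around the ceramic product removed.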
Furthermore, on the basis of extracting the target image area by applying an edge detection algorithm to the texture image, part of the texture image of the ceramic product uploaded by the user terminal may contain image noise. In order to improve the accuracy of feature extraction, the noisy part of the image area can be subjected to noise reduction or removal. The image noise may be noise generated by reflection or by shadow. Specifically, different ceramic products have different manufacturing processes and materials, which reflect light differently: a bright overglaze has a higher reflectivity, so the image of a ceramic product with a bright overglaze may contain reflection noise, whereas a matte glaze has a lower reflectivity, so the image of a ceramic product with a matte glaze may contain shadow noise;
in view of this, the denoising method may be determined according to the category of the ceramic product in the target image area or the image noise type of the target image area, and in an optional implementation manner provided in this embodiment, after performing edge detection on the texture image, extracting the target image area corresponding to the ceramic product in the texture image based on the edge detection result, and before performing texture feature extraction on the target image area to obtain texture features, the method further includes:
Performing category identification based on the target image area to obtain the category of the ceramic product, and/or performing image noise type detection on the target image area to obtain an image noise type;
and determining an image denoising mode according to the category and/or the image noise type, and denoising the target image area according to the image denoising mode.
Correspondingly, the texture features are obtained by extracting the texture features of the denoised target image area.
Specifically, category identification is performed on the ceramic product contained in the target image area to obtain the category of the ceramic product, and/or image noise type detection is performed on the noise area in the target image area to obtain the image noise type; the image denoising mode is determined according to the category of the ceramic product and/or the image noise type, the target image area is denoised according to the image denoising mode, and texture feature extraction is performed on the denoised target image area to obtain the texture features.
For example, as shown in fig. 6, image noise type detection is performed on reflective areas 601, 602 and 603 in a porcelain bowl contained in a target image area to obtain an image noise type as reflective noise, an image denoising mode is determined according to the reflective noise to remove the reflective areas 601, 602 and 603, and denoising processing is performed on the target image area according to the denoising mode;
For another example, if the category of the porcelain bowl included in the target image area is identified as a matte glaze porcelain, the image denoising method is determined to improve brightness and reduce shadow noise according to the matte glaze porcelain, and the noise reduction treatment is performed on the target image area according to the denoising method.
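Selecting the image denoising mode from the detected noise type can be sketched as a simple dispatch. The routine names, the highlight cutoff, and the brightness gain below are illustrative assumptions rather than values from this embodiment.

```python
import numpy as np

def remove_specular(gray: np.ndarray, cutoff: float = 250.0) -> np.ndarray:
    """Replace specular (reflective) highlights with the image median."""
    out = gray.astype(np.float64).copy()
    out[out > cutoff] = np.median(out)
    return out

def lift_shadows(gray: np.ndarray, gain: float = 1.3) -> np.ndarray:
    """Raise overall brightness to suppress shadow noise on matte glaze."""
    return np.clip(gray.astype(np.float64) * gain, 0.0, 255.0)

# Dispatch table: detected image noise type -> denoising routine
DENOISERS = {"reflective": remove_specular, "shadow": lift_shadows}

def denoise(gray: np.ndarray, noise_type: str) -> np.ndarray:
    """Pick the denoising mode from the detected image noise type."""
    return DENOISERS.get(noise_type, lambda g: g)(gray)
```

The category-based variant would use the same dispatch keyed on the identified ceramic category (e.g. matte glaze porcelain mapping to the shadow routine).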
In practical application, the texture image of the ceramic product acquired and uploaded by the user terminal may deviate partially from the factory texture image acquired when the ceramic product left the factory, for example, in image angle or orientation, which may make the results of subsequent texture detection operations inaccurate;
in view of this, on the basis of extracting the target image area by applying an edge detection algorithm to the texture image, image correction can be performed on the target image area according to the factory texture image, and the corrected image is input into a texture feature extraction model for texture feature extraction, so as to improve the feature extraction accuracy and further improve the texture detection accuracy. In an alternative implementation manner provided in this embodiment, the feature extraction includes:
Determining a factory image area corresponding to the target image area in the factory texture image, and carrying out image correction on the target image area based on the factory image area;
And inputting the obtained corrected image area into a texture feature extraction model to extract texture features, and obtaining the texture features.
Specifically, a factory texture image is obtained, an image area corresponding to a factory ceramic product in the factory texture image is extracted, the image area is used as a factory image area corresponding to a target image area, image correction is carried out on the target image area based on the factory image area, a corrected image area is obtained, and a texture feature extraction model is input for texture feature extraction to obtain texture features of the ceramic product; the image correction may be performed by performing a rotation correction process on the target image area or by performing a trimming process on the target image area.
For example, determining a delivery image area corresponding to a target image area of the porcelain bowl in a delivery texture image of the porcelain bowl, performing rotation processing on the target image area based on the direction of the delivery image area, inputting the rotated target image area into a texture extraction model for texture feature extraction, and obtaining texture features of the porcelain bowl;
for another example, a factory image area corresponding to a target image area of the porcelain cup in the texture image in the factory texture image of the porcelain cup is determined, the handle area image of the cup in the target image area is cut based on the image contained in the factory area, and the cut target image area is input into a texture extraction model to extract texture features, so that the texture features of the porcelain cup are obtained.
Further, for the image correction processing, the target image area corresponding to the ceramic product in the texture image acquired and uploaded by the user terminal may not correspond in position to the factory image area corresponding to the factory ceramic product in the factory image. For this, the target image area needs to be moved or rotated to the position corresponding to the factory image area. In order to improve the accuracy and usability of the image correction, a coordinate mapping relationship between the target image area and the factory image area can be established, and a corrected image area is obtained by performing coordinate correction based on the coordinate mapping relationship. In an alternative implementation manner provided in this embodiment, the correcting of the target image area based on the factory image area includes:
determining a coordinate mapping relation between an image unit contained in the target image area and an image unit contained in the factory image area;
and carrying out coordinate correction on the target image area based on the coordinate mapping relation to obtain the corrected image area.
Specifically, a coordinate system for the image units contained in the texture image and a coordinate system for the image units contained in the factory image are established in advance, a coordinate mapping relation is established between the image units contained in the target image area and the image units contained in the factory image area, and coordinate correction is performed on the target image area based on the coordinate mapping relation, that is, the target image area is rotated according to the coordinate change to obtain the corrected image area.
For example, a coordinate system of the image units contained in the texture image and a coordinate system of the image units contained in the factory image are established in advance, a coordinate mapping relationship is established between the image unit of a character contained in the target image area and the image unit of the same character contained in the factory image, and the target image area is rotated based on this coordinate mapping relationship to obtain the corrected image area.
Furthermore, the rotation and movement processing can be performed on the target image area based on the coordinate mapping relation between the image unit contained in the target image area and the image unit contained in the factory image area, so as to further improve the accuracy of texture detection described below. Along the above example, a coordinate system of an image unit included in the texture image and a coordinate system of an image unit included in the factory image are established in advance, a coordinate mapping relationship is established based on the image unit of characters included in the target image area and the image unit of characters included in the factory image, coordinates of the target image area are moved by 2 units along the negative x-axis direction based on the coordinate mapping relationship, coordinate areas of the characters are overlapped, a first corrected image area is obtained, rotation processing is performed on the target image area based on the directions of the characters, the characters are overlapped, and the corrected image area is obtained.
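The coordinate correction based on a mapping between matched image units (rotation plus movement) can be sketched with the closed-form 2-D rigid alignment below. The function names are assumptions, and the anchor point pairs are assumed to be pre-matched, as in the character-unit example above.

```python
import numpy as np

def align_by_anchor(target_pts, factory_pts):
    """Estimate a rigid transform (rotation + translation) mapping anchor
    image units in the target area onto their factory counterparts.

    Uses the closed-form 2-D Procrustes/Kabsch solution on pre-matched
    point pairs (row-vector convention: corrected = p @ rot + shift).
    """
    t = np.asarray(target_pts, dtype=float)
    f = np.asarray(factory_pts, dtype=float)
    tc, fc = t - t.mean(0), f - f.mean(0)
    u, _, vt = np.linalg.svd(tc.T @ fc)
    d = np.sign(np.linalg.det(u @ vt))      # guard against reflections
    rot = u @ np.diag([1.0, d]) @ vt        # 2x2 rotation matrix
    shift = f.mean(0) - t.mean(0) @ rot
    return rot, shift

def apply_correction(points, rot, shift):
    """Apply the estimated rotation and movement to target coordinates."""
    return np.asarray(points, dtype=float) @ rot + shift
```

With noiseless, non-collinear anchor points this recovers the exact rotation and the exact shift (e.g. the 2-unit movement along the x-axis in the example above).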
In addition, in order to improve the convenience of data processing and reduce its complexity, the image area corresponding to the ceramic product can be segmented out by performing image segmentation on the texture image, and texture features are extracted from the segmented image area. For example, pixel segmentation is performed on the texture image by using SAM (Segment Anything Model, an image segmentation foundation model), the target image area corresponding to the ceramic product is obtained based on the segmentation result, and feature extraction is performed on the target image area by using a neural network model to obtain the texture features.
Alternatively, when the amount of data is large, in order to improve the processing efficiency and the extraction accuracy of texture extraction, feature extraction can be performed through a trained texture detection model: the texture image is input into the texture detection model for feature extraction, and texture detection processing is then performed based on the extracted texture features and the factory texture features. Optionally, the feature extraction performed on the texture image to obtain the texture features is performed based on a texture detection model.
Furthermore, on the basis of extracting texture features after image segmentation of the texture image, the noise areas within the image area corresponding to the segmented ceramic product can be removed, so as to reduce the complexity of data processing and improve the feature extraction efficiency. For example, image segmentation is performed on the texture image, the candidate image area corresponding to the ceramic product and the noise areas within it are screened out from the obtained candidate image areas, and the noise areas are then removed from the candidate image area corresponding to the ceramic product to obtain the target image area.
In addition, for the texture feature extraction processing of the texture image, the background area of the ceramic product in the texture image and the interference area in the image area corresponding to the ceramic product can be extracted and removed, so as to improve the efficiency and accuracy of the texture feature extraction, for example: and performing edge detection on the texture image, determining an interference area except an image area corresponding to the ceramic product in the texture image based on an edge detection result, and removing the interference area, wherein the interference area can be a background area except the ceramic product in the texture image and/or an image noise area in the image area corresponding to the ceramic product.
In addition to the above-mentioned method for extracting texture features, texture sub-features may be extracted from multiple sub-feature dimensions of the texture features, texture sub-features may be extracted from multiple dimensions based on the target image region, and the extracted multiple texture sub-features may be used together as texture features of the texture image for subsequent texture detection processing, where the texture sub-features may be texture shape sub-features and/or texture distribution sub-features.
And step S208, performing texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
In the specific implementation, based on the identification information of the ceramic product carried in the access request, the factory texture characteristics of the factory texture image of the ceramic product bound by the identification information are queried, the texture detection is performed based on the texture characteristics and the factory texture characteristics, the texture detection result is returned to the user terminal, and after the texture detection result is returned to the user terminal, the texture detection result can be displayed on the detection page. Specifically, texture matching calculation can be performed on texture features and factory texture features, texture detection can be performed on the basis of matching degrees of the texture features and the factory texture features, texture images and factory texture images can also be input into a trained texture detection model, and the texture detection can be performed through the texture detection model.
The texture detection result in this embodiment characterizes whether the ceramic product in the texture image collected and uploaded by the user terminal is the ceramic product in the factory texture image. Specifically, the texture detection result can be divided into three cases: detection passed, detection not passed, and detection failed. Detection passed indicates that the ceramic product in the texture image collected and uploaded by the user terminal and the ceramic product in the factory texture image are the same piece; detection not passed indicates that the ceramic product in the texture image collected and uploaded by the user terminal and the ceramic product in the factory texture image may not be the same piece, or that the detection did not pass because of poor image quality of the texture image; detection failed indicates that the texture image collected and uploaded by the user terminal does not meet the requirements, for example, the image noise of the texture image exceeds a threshold value, or the sharpness of the texture image is below a threshold value.
In a specific implementation process, because texture features such as shape features and distribution features of textures of ceramic products with different types of purposes have larger differences, for this purpose, in order to improve efficiency and accuracy of texture detection, texture detection can be performed according to matching degrees of the texture features such as shape features and distribution features of the textures of the ceramic products, and specifically, target texture features corresponding to a target image area can be extracted from the factory texture features, and matching degrees of the target texture features and the texture features are detected, and in an alternative implementation manner provided in this embodiment, the texture detection is performed according to the factory texture features of the factory texture image bound by the texture features and identification information, including:
extracting target texture features corresponding to the target image area from the factory texture features, and performing texture matching calculation on the texture features and the target texture features;
if the texture matching degree obtained through calculation is larger than a matching degree threshold value, determining that the texture detection of the ceramic product is passed;
and if the texture matching degree obtained through calculation is smaller than or equal to the matching degree threshold value, determining that the texture detection of the ceramic product is failed.
Specifically, based on the image range of the target image area, the target texture features corresponding to the image range are extracted from the factory texture, texture matching calculation is performed on the texture features extracted from the target image area and the target texture features, and if the calculated texture matching degree is greater than the matching degree threshold value, it is determined that the texture detection of the ceramic product passes; if the texture matching degree is less than or equal to the matching degree threshold value, it is determined that the texture detection fails.
For example, if the texture of the porcelain bowl is ice cracks, target texture shape features corresponding to the target image area are extracted from the factory texture features, texture matching calculation is performed on the texture shape features of the ice cracks and the target texture shape features, and if the calculated texture matching degree is greater than the matching degree threshold value, it is determined that the texture detection of the porcelain bowl passes.
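The texture matching calculation against a matching degree threshold can be sketched as follows, assuming (as one common choice) cosine similarity between feature vectors; the function names and the 0.85 threshold are illustrative assumptions.

```python
import numpy as np

def texture_match_degree(feat: np.ndarray, target_feat: np.ndarray) -> float:
    """Cosine similarity between texture feature vectors, in [-1, 1]."""
    num = float(np.dot(feat, target_feat))
    den = float(np.linalg.norm(feat) * np.linalg.norm(target_feat))
    return num / den if den else 0.0

def texture_detection_passes(feat, target_feat, threshold=0.85):
    """Detection passes only if the matching degree exceeds the threshold."""
    return texture_match_degree(feat, target_feat) > threshold
```

Any other similarity between extracted feature vectors could substitute for cosine similarity without changing the thresholding logic.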
In addition, since the texture shapes of ceramic products of some categories may be relatively similar, such as ice cracks and fish-skin cracks, the above matching calculation may be performed across a plurality of feature dimensions, for example, from the two feature dimensions of texture shape features and texture distribution features. Specifically, the target texture features corresponding to the image range of the target image area are extracted from the factory texture, texture matching calculation is performed on the texture shape features of the texture features and the target texture shape features of the target texture features to obtain a shape matching degree, texture matching calculation is performed on the texture distribution features of the texture features and the target texture distribution features of the target texture features to obtain a distribution matching degree, and the texture matching degree is calculated based on the shape matching degree and the distribution matching degree; if the calculated texture matching degree is greater than the matching degree threshold, it is determined that the texture detection of the ceramic product passes, and if it is less than or equal to the matching degree threshold, it is determined that the texture detection fails.
Alternatively, the texture shape of the ceramic product of some categories may be represented by a picture or a more complex pattern, such as a printed glaze in the ceramic product, or a printed surface of a ceramic dish, for which the matching degree calculation may be performed on the texture of the key region, specifically, the key image region of the target image region is extracted, the target texture feature corresponding to the key image region is extracted in the factory texture, the texture matching degree calculation is performed on the texture feature of the key image region and the target texture feature, if the texture matching degree obtained by the calculation is greater than the matching degree threshold value, the texture detection of the ceramic product is determined to pass, and if the texture matching degree is less than or equal to the matching degree threshold value, the texture detection is determined to fail.
Further, the texture image may contain a plurality of views, that is, the target image area may contain a plurality of image areas. In this case, the texture matching degrees calculated for the plurality of areas can be normalized, and the texture detection result is determined based on the normalized matching degree and the matching degree threshold value. For example, if the texture image comprises a front view and a side view, normalization calculation can be performed on the texture matching degree of the front view and the texture matching degree of the side view; if the calculated texture matching degree is greater than the matching degree threshold value, it is determined that the texture detection of the ceramic product passes, and if it is less than or equal to the matching degree threshold value, it is determined that the texture detection fails. Alternatively, the texture image may contain a plurality of target image areas, for example, when the ceramic product has broken into a plurality of pieces during transportation. For these, the factory image areas corresponding to the plurality of target image areas in the factory texture image can be extracted, matching degree calculation is performed between the texture features of the plurality of target image areas and the texture features of the corresponding factory image areas, and normalization calculation is performed on the matching degrees thus obtained to obtain the texture matching degree; if the calculated texture matching degree is greater than the matching degree threshold value, it is determined that the texture detection of the ceramic product passes, and if it is less than or equal to the matching degree threshold value, it is determined that the texture detection fails.
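The normalization of per-area matching degrees into a single texture matching degree can be sketched as below. A simple mean is assumed here, since the embodiment does not fix a particular normalization formula; the function names are likewise illustrative.

```python
import numpy as np

def combined_match_degree(per_area_degrees) -> float:
    """Normalize per-area texture matching degrees into a single score.

    The 'normalization calculation' is taken here as a simple mean over
    the matching degrees of all views or target image areas.
    """
    degrees = np.asarray(per_area_degrees, dtype=float)
    return float(degrees.mean())

def multi_area_detection_passes(per_area_degrees, threshold=0.85) -> bool:
    """Compare the combined matching degree against the threshold."""
    return combined_match_degree(per_area_degrees) > threshold
```

A weighted mean (e.g. weighting the front view more than the side view) would be a drop-in alternative normalization.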
Further, on the basis of performing texture matching based on the texture features and the target texture features, for the situation that the angle of the texture image acquired and uploaded by the user terminal may not be consistent with the factory image, before performing texture matching calculation, the area overlapping ratio of the target image area and the factory image area corresponding to the target image area in the factory texture image may be calculated, so as to detect whether the angle of the target image area matches with the angle of the factory image area, and in an alternative implementation manner provided in this embodiment, before performing texture matching calculation based on the texture features and the target texture features, the method further includes:
calculating the region overlap ratio of the target image region and a factory image region corresponding to the target image region in the factory texture image;
and if the region overlap ratio is greater than or equal to the overlap ratio threshold, executing the texture matching calculation operation based on the texture features and the target texture features.
Specifically, a factory image area corresponding to a target image area in a factory texture image is extracted, the area overlap ratio of the factory image area and the target image area is calculated, and under the condition that the area overlap ratio is not smaller than an overlap ratio threshold value, the texture matching calculation processing based on the texture characteristics and the target texture characteristics is executed.
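The region overlap ratio check can be sketched as follows, assuming intersection-over-union of boolean region masks as the overlap measure; the embodiment does not pin down the exact formula, so both the function name and the measure are assumptions.

```python
import numpy as np

def region_overlap_ratio(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean region masks in [0, 1]."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter / union) if union else 0.0
```

Texture matching would only proceed when this ratio is not smaller than the overlap ratio threshold; otherwise the detection fails and a re-acquisition page is issued.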
Further, if the area overlap ratio is smaller than the overlap ratio threshold, it indicates that the angle offset between the texture image and the factory texture image is too large, which affects the accuracy of texture detection. For this, it can be determined that the texture detection fails, and an acquisition page is issued to the user terminal. Specifically, an acquisition page configured with a mask image of the factory texture image can be generated, a detection result indicating that the texture detection fails is generated, the acquisition page is configured in the detection result, and the detection result is issued to the user terminal. The mask image may be a watermark or a mosaic covering a layer above the layer where the factory texture image is located, so as to prevent the user from saving the factory texture image for texture detection, which would affect the accuracy of texture detection. In an optional implementation provided in this embodiment, after the area overlap ratio calculation, the method further includes:
if the region overlap ratio is smaller than the overlap ratio threshold, determining that detection fails, and generating an acquisition page provided with a mask image of the factory texture image;
and generating texture detection results which do not pass the texture detection and carry the acquisition page.
Specifically, under the condition that the calculated region overlap ratio is smaller than the overlap ratio threshold, determining that texture detection fails, generating an acquisition page configured with mask images of factory texture thumbnails, and generating a texture detection result which does not pass the texture detection and carries the acquisition page, so that a user can continuously acquire the texture images through the acquisition page and the mask images of the factory texture thumbnails.
For example, if the calculated region overlap ratio is smaller than the overlap ratio threshold, determining that texture detection fails, generating an acquisition page configured with a factory texture thumbnail covering the watermark, and generating a texture detection result that the texture detection fails and carries the acquisition page.
Still further, under the condition that the calculated region overlap ratio is smaller than the overlap ratio threshold, the user may perform a second texture image acquisition based on the acquisition page. For this, image combination may be performed on the first uploaded texture image and the second acquired and uploaded texture image; for example, an area with high image quality in the first uploaded texture image is extracted and combined with the second acquired and uploaded texture image to obtain a combined texture image with higher image quality, and texture detection is performed based on the combined texture image. In an optional implementation provided in this embodiment, after performing texture detection based on the texture features and the factory texture features, obtaining a texture detection result and returning it to the user terminal, the method further includes:
Acquiring a second texture image of the ceramic product acquired based on the acquisition page, and carrying out image combination on the second texture image and the texture image to obtain a combined texture image;
and extracting features of the combined texture image to obtain combined texture features, detecting textures based on the combined texture features and the factory texture features, and returning an obtained secondary texture detection result to the user terminal.
Specifically, after the acquisition page is sent to the user terminal, the user interacts with the acquisition page through the user terminal to acquire and upload a second texture image of the ceramic product; image combination is performed on the second texture image and the texture image through an image processing algorithm to obtain a combined texture image, feature extraction is performed on the combined texture image to obtain combined texture features, texture detection is performed based on the combined texture features and the factory texture features, and the obtained secondary texture detection result is returned to the user terminal.
For example, the texture image comprises a left half top view of the porcelain bowl, a user performs interactive acquisition with an acquisition page through a user terminal and uploads a right half top view of the porcelain bowl, the left half top view of the porcelain bowl and the right half top view of the porcelain bowl are combined to obtain a complete top view of the porcelain bowl, the complete top view of the porcelain bowl is subjected to feature extraction, texture detection is performed based on the extracted features and factory texture features, and an obtained secondary detection result is returned to the user terminal.
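The left/right merge in the porcelain-bowl example can be sketched as horizontal concatenation of the two half top views. This assumes both halves were captured at the same resolution and are already aligned; a real image-combination step would also need registration and blending:

```python
import numpy as np

def merge_halves(left_half: np.ndarray, right_half: np.ndarray) -> np.ndarray:
    """Concatenate a left half and a right half into one combined texture image."""
    if left_half.shape[0] != right_half.shape[0]:
        raise ValueError("halves must share the same height")
    return np.hstack([left_half, right_half])

left = np.full((4, 3), 1, dtype=np.uint8)   # stand-in for the left half top view
right = np.full((4, 3), 2, dtype=np.uint8)  # stand-in for the right half top view
combined = merge_halves(left, right)        # combined top view, shape (4, 6)
```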
It should be noted that, the extraction mode of the merged texture feature and the detection mode of the secondary texture detection may be implemented by adopting the texture feature extraction mode and the texture detection mode provided above. Such as: image segmentation is carried out on the combined texture image, a target image area corresponding to the ceramic product is obtained based on a segmentation result, and texture feature extraction is carried out on the target image area to obtain texture features; for another example, extracting target texture features corresponding to the target image area from the factory texture features, and performing texture matching calculation on the combined texture features and the target texture features; if the texture matching degree obtained through calculation is larger than a matching degree threshold value, determining that the texture detection of the ceramic product passes; if the texture matching degree obtained through calculation is smaller than or equal to the matching degree threshold value, determining that the texture detection of the ceramic product is not passed.
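The texture matching calculation itself is not pinned down by the text. One plausible reading, sketched here purely as an assumption, is a cosine similarity between the two feature vectors compared against the matching degree threshold:

```python
import numpy as np

def texture_matching_degree(feat: np.ndarray, target_feat: np.ndarray) -> float:
    """Cosine similarity between two texture feature vectors, in [-1, 1]."""
    denom = np.linalg.norm(feat) * np.linalg.norm(target_feat)
    return float(np.dot(feat, target_feat) / denom) if denom else 0.0

MATCH_THRESHOLD = 0.9  # hypothetical matching degree threshold

feat = np.array([1.0, 2.0, 3.0])         # stand-in for extracted texture features
target_feat = np.array([1.0, 2.0, 3.1])  # stand-in for the target factory features
passed = texture_matching_degree(feat, target_feat) > MATCH_THRESHOLD
```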
In addition, the texture detection processing based on the texture features and the factory texture features of the factory texture image bound by the identification information can also be performed through a texture detection model, so as to improve the texture detection efficiency and further improve the detection accuracy when the data volume is larger. Optionally, the texture detection operation performed based on the texture features and the factory texture features of the factory texture image bound by the identification information is executed by a texture detection model. Specifically, the texture image and the factory texture image can be input into the texture detection model for feature extraction, texture detection processing is performed on the extracted texture features and factory texture features, and a texture detection result is output. Optionally, the input of the texture detection model includes the texture image and the factory texture image, and the output includes the texture detection result.
In practical application, in order to ensure the detection accuracy and detection precision of the texture detection model, the texture detection model can be trained in advance. A sample image set can be constructed, sample images in the sample image set are input into the texture detection model to be trained for texture detection, and parameter adjustment is performed on the model to be trained according to the output result. The sample image set may comprise positive sample images and negative sample images, and training of the texture detection model may be performed on a cloud server or performed online. In an optional implementation provided in this embodiment, the texture detection model is obtained by training in the following manner:
constructing a sample image set consisting of a positive sample image and a negative sample image;
inputting a sample image pair consisting of a positive sample image and a negative sample image in the sample image set into the model to be trained for texture detection;
and calculating an adversarial loss based on the detection result, and performing parameter adjustment on the model to be trained according to the adversarial loss, so as to obtain the texture detection model after training is completed.
Specifically, a sample image set consisting of positive sample images and negative sample images is constructed, sample image pairs consisting of a positive sample image and a negative sample image in the sample image set are input into the texture detection model to be trained for texture detection, and a detection result is output; an adversarial loss is calculated based on the detection result, parameter adjustment is performed on the texture detection model to be trained according to the adversarial loss, and training is repeated in this manner until the adversarial loss converges, at which point training of the texture detection model is completed and a trained texture detection model is obtained.
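The embodiment names the loss only as a countermeasure (adversarial) loss computed over positive/negative sample pairs, without giving a formula. A margin-based pairwise loss is one common choice for such pairs; the toy NumPy sketch below illustrates the "adjust parameters until the loss converges" loop under that assumption (a real texture detection model would be a deep network trained in an ML framework):

```python
import numpy as np

def pair_loss(pos_score: float, neg_score: float, margin: float = 0.5) -> float:
    """Margin loss: penalize when the negative pair scores within `margin`
    of the positive pair (one reading of the patent's pairwise loss)."""
    return max(0.0, margin - (pos_score - neg_score))

def train_step(w: np.ndarray, pos: np.ndarray, neg: np.ndarray, lr: float = 0.1):
    """One parameter-adjustment step on a toy linear scoring model."""
    pos_score, neg_score = float(w @ pos), float(w @ neg)
    if pair_loss(pos_score, neg_score) > 0.0:  # adjust only while the loss is nonzero
        w = w + lr * (pos - neg)               # push the two scores apart
    return w, pair_loss(float(w @ pos), float(w @ neg))

w = np.zeros(3)
pos_sample = np.array([1.0, 0.5, 0.0])  # stand-in for a positive sample feature
neg_sample = np.array([0.0, 0.5, 1.0])  # stand-in for a negative sample feature
for _ in range(20):                     # iterate until the loss converges
    w, loss = train_step(w, pos_sample, neg_sample)
```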
In addition, in practical application, in order to improve accuracy and effectiveness of texture detection, different texture detection models can be trained for ceramic products starting from different shape classifications, texture characteristics are obtained by extracting characteristics of texture images of corresponding shape classifications through the texture detection models corresponding to the different shape classifications, and under the corresponding shape classifications, texture detection is carried out on factory texture characteristics of factory texture images bound based on the texture characteristics and identification information.
Specifically, after the shape classification of the ceramic product corresponding to the first texture image is identified and the corresponding image acquisition parameters are determined, an image acquisition prompt carrying the image acquisition parameters is generated and displayed through the detection page. After the texture image of the ceramic product acquired by the user terminal is received, the model parameters of the corresponding reference texture detection model are read according to the shape classification of the ceramic product, the read model parameters corresponding to the shape classification are loaded into the reference texture detection model to obtain the texture detection model, feature extraction is performed on the texture image through the texture detection model to obtain texture features, texture detection is performed through the texture detection model based on the texture features and the factory texture features of the factory texture image bound by the identification information, and a texture detection result is output.
In practical application, after the texture detection is passed, in order to further improve the user perception and increase the reliability of the texture detection, production traceability information of the ceramic product, such as delivery time, delivery batch, craftsman number and/or quality inspection number, can be obtained based on the identification information of the ceramic product, and the production traceability information is written into the texture detection result and returned to the user terminal. In an optional implementation provided in this embodiment, after the texture detection is passed, the method further includes:
and acquiring production traceability information of the ceramic product based on the identification information, writing the production traceability information into the texture detection result, and returning the texture detection result to the user terminal.
Specifically, according to the identification information carried in the access request, inquiring production tracing information of the ceramic product bound with the identification information in a database, writing the production tracing information into a texture detection result passing the texture detection, and returning the detection result carrying the production tracing information to the user terminal.
For example, according to the unique identification information of the ceramic product, inquiring all the delivery information of the ceramic product bound with the unique identification information in a database, writing all the delivery information into a detection passing result and returning to the user terminal so as to display the detection passing result and all the delivery information through the user terminal.
In the process of extracting the characteristics of the texture image to obtain the texture characteristics, the surface sub-characteristics of the texture image can be extracted on the basis of extracting the texture sub-characteristics of the texture image, and the extracted texture sub-characteristics and the surface sub-characteristics are used as the texture characteristics of the texture image together; the surface sub-features may be color sub-features, shape sub-features, and/or pattern sub-features of the surface of the ceramic article in the texture image, among others.
Correspondingly, in the process of performing texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information, the texture feature matching degree of the texture sub-features and the factory texture sub-features contained in the factory texture features is calculated, the surface feature matching degree of the surface sub-features and the factory surface sub-features contained in the factory texture features is calculated, and a detection result is determined according to the texture feature matching degree and the surface feature matching degree. Specifically, according to the detection weights of the texture sub-features and the surface sub-features, a weighted sum of the texture feature matching degree and the surface feature matching degree is calculated and taken as the matching degree of the texture features and the factory texture features; if the matching degree is in a preset range, it is determined that the detection is passed; if the matching degree is not in the preset range, it is determined that the detection is not passed.
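The weighted combination of the two matching degrees can be sketched directly; the weights and the preset range below are hypothetical values, not taken from the text:

```python
def combined_matching_degree(texture_match: float, surface_match: float,
                             texture_weight: float = 0.7,
                             surface_weight: float = 0.3) -> float:
    """Weighted sum of the texture and surface feature matching degrees."""
    return texture_weight * texture_match + surface_weight * surface_match

PASS_RANGE = (0.85, 1.0)  # hypothetical preset range for passing detection

degree = combined_matching_degree(0.95, 0.80)  # 0.7*0.95 + 0.3*0.80 = 0.905
passed = PASS_RANGE[0] <= degree <= PASS_RANGE[1]
```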
In addition, it should be noted that, after the receiving the texture image of the ceramic product collected by the user terminal, the process of extracting the features of the texture image to obtain the texture features may be replaced by extracting the texture features of the texture image to obtain the texture features, and extracting the surface features of the texture image to obtain the surface features, where the surface features may be color features, shape features and/or pattern features of the surface of the ceramic product in the texture image;
correspondingly, the process of carrying out texture detection on the factory texture features of the factory texture image bound on the basis of the texture features and the identification information, obtaining a texture detection result and returning the texture detection result to the user terminal can be replaced by: carrying out feature detection on texture features and surface features based on the factory texture images bound by the identification information, obtaining detection results and returning to the user terminal;
specifically, the feature detection of the texture feature and the surface feature based on the factory texture image bound by the identification information comprises the following steps:
performing texture detection on the factory texture features of the factory texture image bound based on the texture features and the identification information, and if the detection fails, determining that the feature detection fails;
If the detection is passed, surface feature detection is performed based on the surface features and the factory surface features of the factory texture image; if this detection is passed, the detection result is determined to be that the feature detection passes; if it fails, the detection result is that the feature detection fails;
Alternatively,
carrying out surface feature detection on the basis of the surface features of the factory texture image bound by the surface features and the identification information, and if the detection fails, determining that the feature detection fails;
if the detection is passed, texture detection is performed based on the texture features and the factory texture features of the factory texture image; if this detection is passed, the detection result is determined to be that the feature detection passes; if it fails, the detection result is that the feature detection fails;
Still alternatively,
texture detection is carried out on the factory texture features of the factory texture image bound on the basis of the texture features and the identification information to obtain texture matching degree, surface feature detection is carried out on the factory surface features of the factory texture image bound on the basis of the surface features and the identification information to obtain surface matching degree, weighted average calculation is carried out on the texture matching degree and the surface matching degree, and a detection result is determined according to a calculation result and returned to the user terminal.
The specific processing procedure of performing texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information is just described in the above provided processing mode, and is not described herein again.
In summary, the one or more texture detection processing methods provided in the embodiments receive an access request submitted by a user terminal based on identification information of a ceramic product, generate a detection page for performing texture detection on the ceramic product and return it to the user terminal, receive a texture image of the ceramic product acquired by the user terminal, extract features from the texture image to obtain texture features, and perform texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information, obtaining a detection result that is returned to the user terminal. The embodiments thereby implement efficient and accurate texture detection processing in cooperation with the user terminal, reducing the probability of purchasing imitation commodities in the ceramic product transaction process, improving the transaction experience of the user, and optimizing the transaction environment;
furthermore, the regional texture image can be acquired and uploaded on the basis of the texture image acquired and uploaded by the user terminal, and the accuracy of texture detection is further improved by establishing a coordinate mapping relation between the regional texture image and the texture image and performing feature detection on texture features of the regional texture image with clearer texture details and factory texture features.
The implementation process of the texture detection processing method provided in the above embodiment may be executed by a server, while the implementation process of the other texture detection processing method provided in the method embodiment below may be executed by a user terminal, and the two cooperate with each other during execution. Therefore, when reading this implementation process, reference may be made to the corresponding content of the embodiment of the other texture detection processing method described below; correspondingly, when reading the embodiment of the other texture detection processing method described below, reference may also be made to the corresponding content of the embodiment above.
The following describes the application of the texture detection processing method provided in this embodiment to a porcelain detection scene as an example, and referring to fig. 8, the texture detection processing method applied to a porcelain detection scene specifically includes the following steps.
Step S806, an access request submitted by the user terminal based on the identification code of the porcelain article is received.
Step S808, responding to the access request, generating a detection page for detecting the texture of the porcelain product and sending the detection page to the user terminal.
Step S814, receiving the texture image of the porcelain product collected by the user terminal, performing edge detection on the texture image, and extracting a target image area corresponding to the porcelain product in the texture image based on the edge detection result.
Step S816, category identification is performed based on the target image area to obtain the category of the porcelain product.
And step S818, determining an image denoising mode according to the category of the porcelain products, and denoising the target image area according to the image denoising mode.
Here, the image denoising method may be to reduce the brightness of the image area to suppress light reflection, to increase the brightness of the image area to reduce shadow, or to remove the noise area.
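Selecting a denoising mode by product category can be read as a simple dispatch. The sketch below uses entirely hypothetical category names and a brightness-scaling stand-in for the actual denoising operations, just to illustrate the shape of this step:

```python
import numpy as np

# Hypothetical mapping from porcelain category to a denoising mode.
DENOISE_MODE_BY_CATEGORY = {
    "glazed_bowl": "reduce_brightness",   # suppress glare / light reflection
    "matte_vase": "increase_brightness",  # lighten shadowed regions
}

def denoise(region: np.ndarray, mode: str) -> np.ndarray:
    """Apply a brightness-scaling stand-in for the chosen denoising mode."""
    if mode == "reduce_brightness":
        return np.clip(region.astype(np.float64) * 0.8, 0, 255).astype(np.uint8)
    if mode == "increase_brightness":
        return np.clip(region.astype(np.float64) * 1.2, 0, 255).astype(np.uint8)
    return region  # unknown mode: leave the target image area unchanged

region = np.full((2, 2), 100, dtype=np.uint8)  # stand-in target image area
mode = DENOISE_MODE_BY_CATEGORY["glazed_bowl"]
out = denoise(region, mode)  # each pixel scaled from 100 down to 80
```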
Step S820, acquiring a factory texture image, determining a factory image area corresponding to the target image area, and performing image correction on the target image area based on the factory image area.
Here, the image correction processing may be rotation processing or cropping processing.
Step S822, inputting the obtained corrected image area into a texture feature extraction model for texture feature extraction, and obtaining texture features.
Step S824, extracting target texture features corresponding to the target image area from the factory texture features of the factory texture image, and performing texture matching calculation on the texture features and the target texture features.
In step S826, it is detected that the calculated texture matching degree is greater than the matching degree threshold, and the texture detection of the ceramic product is determined to pass.
Step S828, the delivery detailed information of the porcelain product is obtained based on the identification information, and the delivery detailed information is written into the texture detection result and returned to the user terminal.
Optionally, step S806 may be replaced by receiving an access request submitted by the user terminal based on the NFC tag of the porcelain;
step S814 may be replaced by receiving the texture image of the porcelain product collected by the user terminal, performing image segmentation on the texture image, and obtaining a target image area of the porcelain product based on the segmentation result;
steps S816 to S818 may be replaced by performing image noise type detection on the target image area to obtain an image noise type; and determining an image denoising mode according to the image noise type, and denoising the target image area according to the image denoising mode.
The steps S806 to S808 and the steps S814 to S828 provided in this embodiment are executed by the server. It should be noted that the steps S806 to S808 and S814 to S828 executed by the server cooperate, during execution, with the steps S802 to S804, S810 to S812 and S830 executed by the user terminal in the following embodiments; therefore, reading this embodiment may refer to the corresponding content of the steps S802 to S804, S810 to S812 and S830 provided in the following method embodiments, and reading the following method embodiments may refer to the corresponding content of the steps S806 to S808 and S814 to S828 provided in this embodiment.
The following describes the texture detection processing method provided in this embodiment further by taking the application of the texture detection processing method provided in this embodiment to a ceramic detection scene as an example, and referring to fig. 9, the texture detection processing method applied to the ceramic detection scene specifically includes the following steps.
Step S906, an access request submitted by the user terminal based on the NFC tag of the ceramic product is received.
Step S908, in response to the access request, a detection page for texture detection of the ceramic product is generated and sent to the user terminal.
Step S914, the shape classification of the ceramic product corresponding to the first texture image submitted by the user terminal is identified, and the image acquisition thumbnail corresponding to the shape classification is determined.
Step S916, generating an image acquisition reminder carrying the image acquisition thumbnail, and displaying the image acquisition reminder through the detection page.
Step S922, receiving the texture image of the ceramic product uploaded by the user terminal, and inputting the factory texture image bound by the texture image and the identification information into a texture detection model for texture detection.
Step S924, outputting a failure detection result.
Step S926, generating an acquisition page configured with a mask image of the factory texture image, generating a texture detection result which does not pass the texture detection and carries the acquisition page, and sending the texture detection result to the user terminal.
Step S932, based on the second texture image of the ceramic product sent by the user terminal, performing image combination on the texture image and the second texture image to obtain a combined texture image.
Step S934, extracting features of the combined texture image to obtain combined texture features, detecting textures based on the combined texture features and factory texture features, and returning the obtained secondary texture detection result to the user terminal.
Optionally, step S906 may be replaced by receiving an access request submitted by the user terminal by scanning the identification code of the ceramic product;
step S922 may be replaced by receiving the texture image of the ceramic product uploaded by the user terminal, extracting the texture feature of the texture image, calculating the feature matching degree of the factory texture feature of the factory texture image bound by the texture feature and the identification information, and performing texture detection based on the feature matching degree obtained by calculation.
The steps S906 to S908, S914 to S916, S922 to S926 and S932 to S934 provided in this embodiment are executed by a server. It should be noted that these server-executed steps cooperate, during execution, with the steps S902 to S904, S910 to S912, S918 to S920, S928 to S930 and S936 executed by the user terminal in the following embodiments; therefore, reading this embodiment may refer to the corresponding content of the steps S902 to S904, S910 to S912, S918 to S920, S928 to S930 and S936 provided in the following method embodiments, and reading the following method embodiments may refer to the corresponding content of the steps S906 to S908, S914 to S916, S922 to S926 and S932 to S934 provided in this embodiment.
One or more embodiments of another texture detection processing method provided in the present specification are as follows:
referring to fig. 10, the texture detection processing method provided in the present embodiment is applicable to a user terminal, and specifically includes steps S1002 to S1010.
Step S1002, analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server.
In practical application, in the process of purchasing ceramic products, users may encounter goods whose genuine packaging has been filled a second time with a different product, or, in an online transaction scenario, receive a ceramic product inconsistent with the one purchased, and users may have difficulty in identifying whether the purchased ceramic product is identical to the ceramic product that left the factory.
In this embodiment, the ceramic product refers to a craft product made of ceramic with different types and/or different patterns of textures on the surface, and may have ornamental and/or practical properties, such as a ceramic bowl, a ceramic dish, a ceramic vase, a ceramic ornament, etc. In addition, ceramic articles may also refer to products produced from a particular region, named by geographic name, namely: ceramic articles in geotag products. Specifically, ceramic products can be classified into ceramic products and porcelain products, and the ceramic products and the porcelain products can be further classified into categories of functions or styles, for example, the porcelain products can be classified into porcelain tableware, porcelain ornaments, porcelain bottles and ceramic tiles according to functions.
In the specific implementation, a user submits an identification information analysis instruction through a user terminal, analyzes the identification information of the ceramic product, generates an access request for detection according to an analysis result obtained by analysis, and submits the access request to a server. After that, the server may generate a detection page for detecting the texture of the ceramic article after receiving the access request, and return the detection page to the user terminal, so that the user terminal can collect and upload the texture image of the ceramic article through the detection page.
In practical application, in order to ensure the uniqueness of the ceramic product, an identification code can be set for the ceramic product when it leaves the factory, encoding the identification information of the ceramic product; a user can scan the identification code of the ceramic product through the user terminal and obtain the identification information of the ceramic product based on the decoding result. The identification code can be set on the package of the ceramic product or on the ceramic product itself, for example, a product two-dimensional code set at the bottom of a porcelain bowl. The identification information can be a unique identification of the ceramic product, which can correspond to production information and/or product information of the ceramic product; the identification information can also comprise fields corresponding to product data of the ceramic product, such as an enterprise product name, a factory code scanning time, a landmark code number and the like. In an optional implementation provided in this embodiment, before the identification information of the ceramic product is analyzed and an access request is generated based on the analysis result and submitted to the server, the method further includes:
And scanning and decoding the identification code of the ceramic product, and obtaining the identification information of the ceramic product based on a decoding result.
Specifically, a user scans an identification code configured on a ceramic product through a user terminal, decodes the identification code based on a decoding component of the user terminal to obtain a decoding result, obtains unique identification information of the ceramic product based on the decoding result, analyzes the unique identification information, and generates an access request based on the analysis result. Optionally, the unique identification information includes a factory number and/or a production number of the ceramic article.
In addition, in order to improve convenience for the user, an NFC tag may also be set when the ceramic product leaves the factory, with the identification information of the ceramic product configured in the NFC tag; the user performs near field communication interaction with the NFC tag of the ceramic product through the near field communication component of the user terminal to obtain the identification information of the ceramic product. The NFC tag may likewise be configured on the package of the ceramic product or on the ceramic product itself. In an optional implementation provided in this embodiment, the method further includes:
And invoking a near field communication component of the user terminal to perform near field communication interaction with a near field communication tag of the ceramic product to obtain identification information of the ceramic product.
Specifically, after the user terminal approaches the NFC label of the ceramic product, the near field communication component is called, near field communication interaction is carried out between the near field communication component and the near field communication label of the ceramic product, and unique identification information of the ceramic product is read.
It should be noted that, the ceramic product may also be configured with an identification code and an NFC tag at the same time, based on which, in the process of uniquely identifying the ceramic product, the identification information of the ceramic product may be obtained by interacting with any one of the identification code and the NFC tag, for example, by scanning the identification code of the ceramic product and decoding the identification code, and the identification information is obtained according to the decoding result; or acquiring the identification information of the ceramic product from the NFC tag by means of near field communication with the NFC tag.
Step S1004, receiving a detection page returned by the server for detecting the texture of the ceramic product.
In specific implementation, after the server generates a detection page for texture detection of the ceramic product and sends the detection page to the user terminal, the user terminal receives the detection page returned by the server and displays it on the screen. Specifically, the detection page is configured with an image acquisition interface and may also be configured with a factory image taken when the ceramic product left the factory, so that the user triggers the image acquisition interface to acquire a texture image of the ceramic product through the user terminal and upload it to the server. As shown in fig. 3, the detection page contains a factory image 301 and an image acquisition interface 302.
In practical applications, textures on the surfaces of different types of ceramic products may be distributed in different places, and some types may require multiple images captured at multiple angles for better subsequent texture detection. For example, a porcelain bowl may only need a top view for texture detection, a porcelain vase may need front and side views, and a ceramic ornament may need three views. For this purpose, after the detection page is received, a pre-detection interface may be invoked to collect an image of the ceramic product and send it to the server, so that the server returns an image acquisition reminder after processing, improving the acquisition efficiency of texture images and reducing repetitive operations. In an optional implementation provided by this embodiment, after the detection page returned by the server for texture detection of the ceramic product is received, the method further includes:
collecting a first texture image of the ceramic product, taking the first texture image as an interface calling input, invoking a pre-detection interface to perform image acquisition pre-detection, and displaying the returned image acquisition reminder through the detection page;
wherein the image acquisition pre-detection includes: identifying the shape classification of the ceramic product corresponding to the first texture image, determining the image acquisition data corresponding to the shape classification, and generating the image acquisition reminder carrying the image acquisition data.
Specifically, after the detection page is received, a first texture image of the ceramic product is collected, the first texture image is used as the interface calling input, the pre-detection interface is invoked to perform image acquisition pre-detection processing, and the returned image acquisition reminder is displayed through the detection page. Here, the pre-detection interface may identify the shape classification of the ceramic product in the uploaded first texture image, determine image acquisition data based on the identification result, and generate an image acquisition reminder carrying the image acquisition data. The image acquisition reminder may be a text reminder, a voice reminder, or an image reminder, and the image acquisition data may be a schematic view for image acquisition or a thumbnail.
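The shape-classification step above can be sketched as a simple lookup from a recognized class to the views that must be captured, following the bowl/vase/ornament examples in the text; the class names, the default view, and the reminder wording are assumptions.

```python
# Illustrative mapping from an assumed shape classification to the views the
# user should capture (bowl -> top view; vase -> front and side; ornament ->
# three views). The classifier producing the class label is out of scope.
ACQUISITION_VIEWS = {
    "bowl":     ["top"],
    "vase":     ["front", "side"],
    "ornament": ["front", "side", "top"],
}

def make_acquisition_reminder(shape_class: str) -> str:
    # Fall back to a single front view for unrecognized classes (assumption).
    views = ACQUISITION_VIEWS.get(shape_class, ["front"])
    return "Please capture the following view(s): " + ", ".join(views)

print(make_acquisition_reminder("vase"))
```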
Step S1006, the texture image of the ceramic product is acquired by triggering the image acquisition interface configured by the detection page.
In specific implementation, after the detection page is received, the image acquisition interface configured on the detection page is triggered to invoke the camera component of the user terminal for image acquisition, and the texture image of the ceramic product is acquired and uploaded to the server.
In this embodiment, image quality detection may also be performed on the acquired texture image during texture image acquisition. Specifically, image parameters of the acquired texture image are extracted, and the image sharpness of the acquired texture image is calculated from the image parameters; if the image sharpness meets the texture image acquisition condition, for example, is greater than a sharpness threshold for texture image acquisition, the detection is determined to have passed, and the currently acquired texture image is uploaded to the server; otherwise, the detection is determined to have failed, and a re-acquisition reminder may be generated.
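One common realization of such a sharpness check is the variance of the Laplacian: blurry images have weak second derivatives and therefore low variance. The sketch below assumes a grayscale image and uses an illustrative threshold of 100, which is not a value given by the text.

```python
# Sharpness check via variance of a 4-neighbour Laplacian (an assumed, common
# focus measure; the threshold value 100.0 is illustrative).
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance of the 4-neighbour Laplacian over the image interior."""
    g = gray.astype(np.float64)
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def passes_sharpness_check(gray: np.ndarray, threshold: float = 100.0) -> bool:
    return laplacian_variance(gray) > threshold

rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, (64, 64)).astype(np.float64)  # high-frequency content
blurry = np.full((64, 64), 128.0)                          # flat image, no detail
print(passes_sharpness_check(sharp), passes_sharpness_check(blurry))
```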
In addition, in the process of texture image acquisition, image pre-detection may also be performed on the acquired texture image, or may be further performed once the image quality detection has passed. The image pre-detection of the acquired texture image may calculate, at the image level, the similarity between the currently acquired texture image and the factory texture image; if the similarity is low, texture detection of the currently acquired texture image against the factory texture image may not pass, so re-acquisition is required. Specifically, the image pre-detection of the acquired texture image includes: calculating the image similarity between the acquired texture image and the factory texture image; if the similarity is greater than a similarity threshold, determining that the image pre-detection has passed, and uploading the currently acquired texture image to the server; otherwise, if the similarity is less than or equal to the similarity threshold, determining that the image pre-detection has failed, and generating a re-acquisition reminder.
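One plausible image-level similarity for this pre-detection step is the Pearson correlation between the acquired image and the factory image (both assumed grayscale and equally sized); the 0.5 threshold below is an assumed value, not one specified by the text.

```python
# Assumed image-level similarity: Pearson correlation over flattened pixels.
import numpy as np

def image_similarity(a: np.ndarray, b: np.ndarray) -> float:
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def pre_detection_passes(captured: np.ndarray, factory: np.ndarray,
                         threshold: float = 0.5) -> bool:
    return image_similarity(captured, factory) > threshold

img = np.arange(100, dtype=np.float64).reshape(10, 10)
print(pre_detection_passes(img, img))        # identical images pass
print(pre_detection_passes(img, img[::-1]))  # reversed gradient fails
```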
Step S1008, uploading the texture image to the server, so as to perform texture detection based on the texture feature of the texture image and the factory texture feature of the factory texture image bound by the identification information.
In specific implementation, the acquired texture image of the ceramic product is uploaded to the server, so that the server performs texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound to the identification information, and returns a texture detection result to the user terminal.
In practical application, in order to improve the efficiency of texture detection and further improve detection accuracy when the data volume is large, the texture detection processing may be performed through a texture detection model. Specifically, the texture image and the factory texture image may be input into the texture detection model for feature extraction, the texture detection processing is performed based on the extracted texture features and the factory texture features, and a texture detection result is output. In an optional implementation provided by this embodiment, the texture detection includes:
Inputting the texture image and the factory texture image into a trained texture detection model to detect texture, and outputting a texture detection result.
Optionally, the texture detection model performs texture detection by performing feature extraction on the texture image to obtain the texture feature and calculating a texture matching degree between the texture feature and the factory texture feature.
Specifically, the texture detection model can perform feature extraction on an input texture image to obtain texture features, perform feature extraction on an input factory texture image to obtain factory texture features, calculate texture matching degrees of the texture features and the factory texture features, perform texture detection processing in a mode of detecting whether the texture matching degrees exceed a matching degree threshold value, determine that texture detection is passed if the texture matching degrees are greater than the matching degree threshold value, and determine that the texture detection is failed if the texture matching degrees are less than or equal to the matching degree threshold value.
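The texture matching degree can, for example, be realized as the cosine similarity between the two extracted feature vectors; the feature extractor itself (e.g. the model backbone) is out of scope here, and the 0.8 threshold is an assumption rather than a value from the text.

```python
# Assumed matching-degree computation: cosine similarity between the texture
# feature vector and the factory texture feature vector, compared against an
# illustrative 0.8 threshold.
import numpy as np

def matching_degree(feat: np.ndarray, factory_feat: np.ndarray) -> float:
    denom = np.linalg.norm(feat) * np.linalg.norm(factory_feat)
    return float(feat @ factory_feat / denom) if denom else 0.0

def texture_detection_passes(feat: np.ndarray, factory_feat: np.ndarray,
                             threshold: float = 0.8) -> bool:
    return matching_degree(feat, factory_feat) > threshold

same = np.array([0.2, 0.5, 0.9])
other = np.array([0.9, -0.4, 0.1])
print(texture_detection_passes(same, same * 2.0))  # scaled copy passes
print(texture_detection_passes(same, other))       # unrelated vector fails
```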
In addition, the texture image acquired and uploaded by the user terminal may contain certain interference areas, such as a background irrelevant to the ceramic product, or noise in the image area corresponding to the ceramic product. For this, in the process of texture detection, texture features may be extracted after the interference areas are removed from the texture image, for example: performing edge detection on the texture image, and extracting the ceramic image area in the texture image based on the edge detection result; performing image segmentation on the ceramic image area to obtain a feature sub-area and an interference sub-area, and removing the interference sub-area from the ceramic image area to obtain a target image area, so as to extract texture features from the target image area.
Further, the texture matching degree calculation may be performed based on the texture feature of the target image area from which the interference area is removed and the factory texture feature of the factory image area corresponding to the target image area in the factory texture image, for example: extracting target texture features corresponding to the target area image from the factory texture features, and performing texture matching calculation on the texture features and the target texture features; if the texture matching degree obtained through calculation is larger than a matching degree threshold value, determining that the texture detection of the ceramic product passes; if the texture matching degree obtained through calculation is smaller than or equal to the matching degree threshold value, determining that the texture detection of the ceramic product is not passed.
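A minimal sketch of such region-restricted matching: a boolean mask marks the interference sub-area, and the matching degree is computed only over the remaining pixels of the captured image and the corresponding area of the factory image. The edge detection and segmentation that would produce the mask are out of scope; the values below are synthetic.

```python
# Assumed region-restricted matching: exclude interference pixels via a
# boolean mask, then compute cosine similarity over the surviving pixels.
import numpy as np

def masked_matching_degree(texture: np.ndarray, factory: np.ndarray,
                           interference_mask: np.ndarray) -> float:
    keep = ~interference_mask
    a = texture[keep].astype(np.float64)
    b = factory[keep].astype(np.float64)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

tex = np.arange(16.0).reshape(4, 4)
fac = tex.copy()
fac[0, 0] = 999.0                 # noise confined to the interference area
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True                 # mark the noisy pixel as interference
print(masked_matching_degree(tex, fac, mask))  # near-perfect match once excluded
```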
Step S1010, receiving a texture detection result of the ceramic product returned by the server.
In specific implementation, after the server completes the texture detection processing based on the texture features and the factory texture features, the texture detection result of the ceramic product returned by the server is received. Specifically, if the detection result is passed, it indicates that the ceramic product in the texture image submitted by the user and the ceramic product in the factory texture image are the same item; if the detection result is failed, it may be that the ceramic product in the texture image submitted by the user is not the same item as the ceramic product in the factory texture image, or that the texture image submitted by the user does not meet the requirements, such as excessive image noise or incomplete texture display.
In specific implementation, in order to avoid misjudgment caused by a texture image submitted by the user that does not meet the requirements, an acquisition page configured with a mask image of the factory texture image may be generated when the submitted texture image does not meet the requirements, so that the user can acquire and upload a texture image of the ceramic product a second time through the acquisition page, with reference to the mask image of the factory texture image in the acquisition page. In an optional implementation provided by this embodiment, receiving the texture detection result of the ceramic product returned by the server includes:
and receiving and displaying a texture detection failure result, returned by the server, that carries the acquisition page.
Optionally, the acquisition page is configured with a mask image of the factory texture image.
Specifically, when the detection result is that the detection is failed due to the fact that the image is not in accordance with the requirement, a collection page provided with a mask image of a factory texture image is received, so that a second texture image of the ceramic product is collected and uploaded to a server through an image collection interface configured by triggering the collection page.
Further, after receiving the acquisition page returned by the server, the user may acquire a second texture image of the ceramic product by triggering the acquisition interface configured on the acquisition page to perform texture detection again, with the masked factory texture image displayed on the acquisition page for the user's reference. In an optional implementation provided by this embodiment, after the acquisition page is received and displayed, the method further includes:
Acquiring a second texture image of the ceramic product by triggering an acquisition interface configured by the acquisition page;
uploading the second texture image and the texture image to the server to perform texture detection based on the combined texture features of the combined texture image after the image combination of the texture image and the second texture image and the factory texture features;
and receiving a secondary texture detection result of the ceramic product returned by the server.
Specifically, a second texture image of the ceramic product is acquired by triggering an acquisition interface configured by an acquisition page, the second texture image and the texture image are uploaded to a server, so that the server performs image merging on the texture image and the second texture image to obtain a merged texture image, texture detection is performed based on merged texture features of the merged texture image and factory texture features of a factory image, and a secondary texture detection result of the ceramic product returned by the server is received after the completion of the texture detection processing of the server.
It should be noted that, the extraction mode of the merged texture feature and the detection mode of the secondary texture detection may be implemented by adopting the texture feature extraction mode and the texture detection mode provided above. Such as: image segmentation is carried out on the combined texture image, a target image area corresponding to the ceramic product is obtained based on a segmentation result, and texture feature extraction is carried out on the target image area to obtain texture features; for another example, extracting target texture features corresponding to the target image area from the factory texture features, and performing texture matching calculation on the combined texture features and the target texture features; if the texture matching degree obtained through calculation is larger than a matching degree threshold value, determining that the texture detection of the ceramic product passes; if the texture matching degree obtained through calculation is smaller than or equal to the matching degree threshold value, determining that the texture detection of the ceramic product is not passed.
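One straightforward reading of the image merging step above is to stack the first and second captures before feature extraction; the choice of horizontal concatenation here is an assumption for illustration, not the merging strategy defined by the text.

```python
# Assumed merging strategy: concatenate the two equally sized grayscale
# captures side by side before feature extraction.
import numpy as np

def merge_texture_images(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Concatenate two equally sized captures along the width."""
    if first.shape != second.shape:
        raise ValueError("captures must have matching shapes before merging")
    return np.concatenate([first, second], axis=1)

a = np.zeros((8, 8))
b = np.ones((8, 8))
merged = merge_texture_images(a, b)
print(merged.shape)  # (8, 16)
```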
In practical application, after the texture detection has passed, in order to further improve the user's perception and increase the credibility of the texture detection, production traceability information of the ceramic product, such as the delivery time, delivery batch, craftsman number and/or quality inspection number, may be obtained based on the identification information of the ceramic product, written into the texture detection result, and returned to the user terminal. For example: if the detection result is passed, the traceability information of the ceramic product contained in the detection result is extracted and displayed.
In summary, the texture detection processing method provided by this embodiment, as applied to a user terminal, parses the identification information of the ceramic product during texture detection, generates an access request based on the parsing result and submits it to the server, receives the detection page returned by the server for texture detection of the ceramic product, acquires the texture image of the ceramic product by triggering the image acquisition interface configured on the detection page, uploads the texture image to the server so that texture detection is performed based on the texture features of the texture image and the factory texture features of the factory texture image bound to the identification information, and receives the texture detection result returned by the server.
Furthermore, a regional texture image may be acquired and uploaded at the user terminal; by establishing a coordinate mapping relationship between the regional texture image and the texture image, and performing feature detection on the texture features of the regional texture image, whose texture details are clearer, against the texture features of the factory ceramic product, the accuracy of texture detection is further improved, so that texture detection can be performed even when the ceramic product is broken, detecting whether the broken ceramic product is identical to the factory ceramic product.
The following describes the application of the texture detection processing method provided in this embodiment to a porcelain detection scene as an example, and referring to fig. 8, the texture detection processing method applied to a porcelain detection scene specifically includes the following steps.
Step S802, the identification code of the porcelain is scanned for decoding, and the identification information of the porcelain is obtained based on the decoding result.
Step S804, analyzing the identification information of the porcelain products, generating an access request based on the analysis result and submitting the access request to the server.
Step S810, receiving a detection page returned by the server for detecting the texture of the porcelain product.
Step S812, the texture image of the porcelain product is acquired and uploaded to the server by triggering the image acquisition interface configured on the detection page, so as to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information.
And step S830, receiving and displaying the texture detection passing result of the porcelain product returned by the server.
The following describes the texture detection processing method provided in this embodiment further by taking the application of the texture detection processing method provided in this embodiment to a ceramic detection scene as an example, and referring to fig. 9, the texture detection processing method applied to the ceramic detection scene specifically includes the following steps.
Step S902, calling a near field communication component to perform near field communication interaction with the NFC tag of the ceramic to obtain identification information of the ceramic.
Step S904, analyzing the identification information of the ceramic product, generating an access request based on the analysis result and submitting the access request to a server.
Step S910, receiving a detection page returned by the server for detecting the texture of the ceramic product, and calling a pre-detection interface to collect a first texture image of the ceramic product.
Step S912, upload the first texture image to the server.
Step S918, receiving an image acquisition prompt sent by the server, and acquiring a texture image of the ceramic product by triggering an image acquisition interface configured by the detection page.
Step S920, uploading the texture image to a server, so as to input the factory texture image bound by the texture image and the identification information into a texture detection model for texture detection.
In step S928, a texture detection failure result carrying the acquisition page sent by the server is received.
In step S930, a second texture image of the ceramic article is acquired through the acquisition page and uploaded to a server.
In step S936, the secondary texture detection passing result sent by the server is received and displayed.
An embodiment of a texture detection processing apparatus provided in the present specification is as follows:
in the foregoing embodiments, a texture detection processing method applied to a server is provided, and a texture detection processing apparatus running on the server is provided correspondingly, which will be described with reference to the accompanying drawings.
Referring to fig. 11, a schematic diagram of an embodiment of a texture detection processing apparatus according to the present embodiment is shown.
Since the apparatus embodiments correspond to the method embodiments, the description is relatively simple, and the relevant portions should be referred to the corresponding descriptions of the method embodiments provided above. The device embodiments described below are merely illustrative.
The embodiment provides a texture detection processing device, which is operated on a server, and the device comprises:
an access request receiving module 1102 configured to receive an access request submitted by a user terminal based on identification information of the ceramic product;
A detection page generation module 1104 configured to generate a detection page for texture detection of the ceramic article and return to the user terminal in response to the access request;
a texture feature extraction module 1106 configured to receive a texture image of the ceramic article acquired by the user terminal, and perform feature extraction on the texture image to obtain texture features;
the texture detection module 1108 is configured to perform texture detection based on the texture feature and the factory texture feature of the factory texture image bound by the identification information, obtain a texture detection result, and return to the user terminal.
Another embodiment of the texture detection processing apparatus provided in the present specification is as follows:
in the foregoing embodiments, a texture detection processing method applied to a user terminal is provided, and correspondingly, a texture detection processing apparatus applied to a user terminal is also provided, which is described below with reference to the accompanying drawings.
Referring to fig. 12, a schematic diagram of an embodiment of a texture detection processing apparatus according to the present embodiment is shown.
Since the apparatus embodiments correspond to the method embodiments, the description is relatively simple, and the relevant portions should be referred to the corresponding descriptions of the method embodiments provided above. The device embodiments described below are merely illustrative.
The present embodiment provides another texture detection processing apparatus, which is operated in a user terminal, and the apparatus includes:
an access request submitting module 1202 configured to parse the identification information of the ceramic product, and generate an access request based on the parsing result and submit the access request to the server;
the detection page receiving module 1204 is configured to receive a detection page returned by the server for performing texture detection on the ceramic product;
a texture image acquisition module 1206 configured to acquire texture images of the ceramic article by triggering an image acquisition interface of the detection page arrangement;
a texture image uploading module 1208 configured to upload the texture image to the server for texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound by the identification information;
the detection result receiving module 1210 is configured to receive a texture detection result of the ceramic product returned by the server.
One embodiment of a server provided in this specification is as follows:
in correspondence to the above-described texture detection processing method, one or more embodiments of the present disclosure further provide a server for executing the above-provided texture detection processing method, and fig. 13 is a schematic structural diagram of a server provided by one or more embodiments of the present disclosure, based on the same technical concept.
The server provided in this embodiment includes:
as shown in fig. 13, the server may have a relatively large difference due to different configurations or performances, and may include one or more processors 1301 and a memory 1302, where the memory 1302 may store one or more storage applications or data. Wherein the memory 1302 may be transient storage or persistent storage. The application programs stored in memory 1302 may include one or more modules (not shown), each of which may include a series of computer-executable instructions in a server. Still further, the processor 1301 may be arranged to communicate with the memory 1302, executing a series of computer executable instructions in the memory 1302 on a server. The server may also include one or more power supplies 1303, one or more wired or wireless network interfaces 1304, one or more input/output interfaces 1305, and the like.
In a particular embodiment, a server includes a memory, and one or more programs, wherein the one or more programs are stored in the memory, and the one or more programs may include one or more modules, and each module may include a series of computer-executable instructions for the server, and configured to be executed by the one or more processors, the one or more programs comprising computer-executable instructions for:
Receiving an access request submitted by a user terminal based on the identification information of the ceramic product;
responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning the detection page to the user terminal;
receiving texture images of the ceramic product acquired by the user terminal, and extracting features of the texture images to obtain texture features;
and carrying out texture detection based on the texture characteristics and the factory texture characteristics of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
An embodiment of a user terminal provided in the present specification is as follows:
in accordance with another texture detection processing method described above, one or more embodiments of the present disclosure further provide a user terminal for performing the another texture detection processing method provided above, and fig. 14 is a schematic structural diagram of a user terminal provided in one or more embodiments of the present disclosure, based on the same technical concept.
The user terminal provided in this embodiment includes:
as shown in fig. 14, the user terminal may have a relatively large difference due to different configurations or capabilities, and may include one or more processors 1401 and a memory 1402, where the memory 1402 may store one or more storage applications or data. Wherein memory 1402 may be a transitory storage or a persistent storage. The application programs stored in memory 1402 may include one or more modules (not shown in the figures), each of which may include a series of computer-executable instructions in the user terminal. Still further, a processor 1401 may be provided in communication with memory 1402 and execute a series of computer executable instructions in memory 1402 on a user terminal. The user terminal may also include one or more power supplies 1403, one or more wired or wireless network interfaces 1404, one or more input/output interfaces 1405, one or more keyboards 1406, etc.
In a particular embodiment, a user terminal includes a memory, and one or more programs, where the one or more programs are stored in the memory, and the one or more programs may include one or more modules, and each module may include a series of computer-executable instructions for the user terminal, and configured to be executed by one or more processors, the one or more programs including computer-executable instructions for:
analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server;
receiving a detection page returned by the server for detecting the texture of the ceramic product;
acquiring texture images of the ceramic product by triggering an image acquisition interface configured by the detection page;
uploading the texture image to the server to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information;
and receiving a texture detection result of the ceramic product returned by the server.
An embodiment of a storage medium provided in the present specification is as follows:
Corresponding to the texture detection processing method described above, one or more embodiments of the present disclosure further provide a storage medium based on the same technical concept.
The storage medium provided in this embodiment is configured to store computer executable instructions that, when executed by a processor, implement the following flow:
receiving an access request submitted by a user terminal based on the identification information of the ceramic product;
responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning the detection page to the user terminal;
receiving texture images of the ceramic product acquired by the user terminal, and extracting features of the texture images to obtain texture features;
and carrying out texture detection based on the texture characteristics and the factory texture characteristics of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
It should be noted that, in the present specification, an embodiment of a storage medium and an embodiment of a texture detection processing method in the present specification are based on the same inventive concept, so that a specific implementation of the embodiment may refer to an implementation of the foregoing corresponding method, and a repetition is omitted.
Another storage medium embodiment provided in this specification is as follows:
Corresponding to the other texture detection processing method described above, one or more embodiments of the present specification further provide another storage medium based on the same technical concept.
The storage medium provided in this embodiment is configured to store computer executable instructions that, when executed by a processor, implement the following flow:
analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server;
receiving a detection page returned by the server for detecting the texture of the ceramic product;
acquiring texture images of the ceramic product by triggering an image acquisition interface configured by the detection page;
uploading the texture image to the server to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information;
and receiving a texture detection result of the ceramic product returned by the server.
It should be noted that the embodiment of the other storage medium and the embodiment of the other texture detection processing method in this specification are based on the same inventive concept; therefore, for the specific implementation of this embodiment, reference may be made to the implementation of the foregoing corresponding method, and repeated description is omitted.
In this specification, the embodiments are described in a progressive manner; for identical or similar parts of the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the apparatus embodiments and the storage medium embodiments are substantially similar to the method embodiments, so their description is relatively brief; for relevant details, refer to the description of the method embodiments.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In the 1990s, an improvement of a technology could clearly be distinguished as an improvement in hardware (for example, an improvement of a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement of a method flow). However, as technology has developed, many of today's improvements of method flows can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized with a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (for example, a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled must likewise be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing a given logic method flow can readily be obtained merely by briefly programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer readable medium storing computer readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer readable program code, it is entirely possible to logically program the method steps so that the controller implements the same functions in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for implementing various functions may also be regarded as structures within the hardware component. Or, the means for implementing various functions may even be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each unit may be implemented in the same piece or pieces of software and/or hardware when implementing the embodiments of the present specification.
One skilled in the relevant art will recognize that one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
One or more embodiments of the present specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing description is by way of example only and is not intended to limit the present disclosure. Various modifications and changes may occur to those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. that fall within the spirit and principles of the present document are intended to be included within the scope of the claims of the present document.

Claims (25)

1. A texture detection processing method applied to a server, the method comprising:
receiving an access request submitted by a user terminal based on the identification information of the ceramic product;
responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning the detection page to the user terminal;
receiving texture images of the ceramic product acquired by the user terminal, and extracting features of the texture images to obtain texture features;
and carrying out texture detection based on the texture characteristics and the factory texture characteristics of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
2. The texture detection processing method according to claim 1, wherein the identification information of the ceramic article is obtained by:
scanning and decoding the identification code of the ceramic product through the user terminal, and obtaining the identification information of the ceramic product based on a decoding result;
or,
and invoking a near field communication component of the user terminal to perform near field communication interaction with a near field communication tag of the ceramic product to obtain identification information of the ceramic product.
3. The texture detection processing method according to claim 1, wherein the feature extraction of the texture image to obtain texture features comprises:
performing edge detection on the texture image, and extracting a target image area corresponding to the ceramic product from the texture image based on an edge detection result;
and extracting texture features from the target image area to obtain the texture features.
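A minimal sketch of the region extraction in claim 3, under the assumption that "edge detection" is a simple finite-difference gradient and the "target image area" is the bounding box of the detected edge pixels; a production system would use a real edge detector such as Canny. All function names are illustrative.

```python
# Hypothetical sketch: cut the target image region out of a texture image
# using edge detection (claim 3). Images are row-major lists of lists.

def edge_map(image, threshold=50):
    # Mark pixels whose forward-difference gradient magnitude exceeds
    # the threshold.
    h, w = len(image), len(image[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            gx = image[y][x + 1] - image[y][x] if x + 1 < w else 0
            gy = image[y + 1][x] - image[y][x] if y + 1 < h else 0
            edges[y][x] = abs(gx) + abs(gy) > threshold
    return edges

def extract_target_region(image, threshold=50):
    # Target image region = bounding box of all edge pixels.
    edges = edge_map(image, threshold)
    ys = [y for y, row in enumerate(edges) for x, e in enumerate(row) if e]
    xs = [x for row in edges for x, e in enumerate(row) if e]
    if not ys:
        return image  # no edges found: fall back to the whole image
    y0, y1, x0, x1 = min(ys), max(ys), min(xs), max(xs)
    return [row[x0:x1 + 1] for row in image[y0:y1 + 1]]
```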
4. The texture detection processing method according to claim 3, wherein after the sub-step of performing edge detection on the texture image and extracting a target image region corresponding to the ceramic article from the texture image based on an edge detection result, and before the sub-step of performing texture feature extraction on the target image region to obtain the texture features, the method further comprises:
performing category identification based on the target image area to obtain the category of the ceramic product, and/or performing image noise type detection on the target image area to obtain an image noise type;
determining an image denoising mode according to the category and/or the image noise type, and denoising the target image area according to the image denoising mode;
correspondingly, the texture features are obtained by extracting the texture features of the denoised target image area.
5. A texture detection method as claimed in claim 3 wherein said texture feature extraction of said target image region comprises:
determining a factory image area corresponding to the target image area in the factory texture image, and carrying out image correction on the target image area based on the factory image area;
and inputting the obtained corrected image area into a texture feature extraction model to extract texture features, and obtaining the texture features.
6. The texture detection processing method according to claim 5, wherein the image correction of the target image area based on the factory image area includes:
determining a coordinate mapping relation between an image unit contained in the target image area and an image unit contained in the factory image area;
and carrying out coordinate correction on the target image area based on the coordinate mapping relation to obtain the corrected image area.
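Claim 6's coordinate correction can be illustrated under the simplifying assumption that the coordinate mapping relation between image units of the target region and the factory region is a pure translation estimated from corresponding points; real systems would typically fit an affine or projective transform (e.g. via OpenCV). All names below are hypothetical.

```python
# Hypothetical sketch of claim 6: estimate a coordinate mapping from
# corresponding image units, then correct the target region's coordinates.

def estimate_translation(target_pts, factory_pts):
    # Least-squares translation between corresponding image-unit coordinates,
    # expressed as an affine tuple (a, b, tx, c, d, ty) with identity rotation.
    n = len(target_pts)
    tx = sum(f[0] - t[0] for t, f in zip(target_pts, factory_pts)) / n
    ty = sum(f[1] - t[1] for t, f in zip(target_pts, factory_pts)) / n
    return (1.0, 0.0, tx, 0.0, 1.0, ty)

def apply_mapping(point, mapping):
    a, b, tx, c, d, ty = mapping
    x, y = point
    return (a * x + b * y + tx, c * x + d * y + ty)

def correct_region(points, mapping):
    # Map every image-unit coordinate of the target region into the factory
    # region's coordinate frame, yielding the corrected image region.
    return [apply_mapping(p, mapping) for p in points]
```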
7. The texture detection processing method according to claim 3, wherein the texture detection based on the texture feature and the factory texture feature of the factory texture image bound by the identification information comprises:
extracting target texture features corresponding to the target image area from the factory texture features, and performing texture matching calculation on the texture features and the target texture features;
if the texture matching degree obtained through calculation is larger than a matching degree threshold value, determining that the texture detection of the ceramic product is passed;
and if the texture matching degree obtained through calculation is smaller than or equal to the matching degree threshold value, determining that the texture detection of the ceramic product is failed.
8. The texture detection processing method according to claim 7, wherein before the sub-steps of extracting a target texture feature corresponding to the target image area from the factory texture features and performing texture matching calculation on the texture features and the target texture features are performed, the method further comprises:
calculating the region overlap ratio of the target image region and a factory image region corresponding to the target image region in the factory texture image;
and if the region overlap ratio is greater than or equal to an overlap ratio threshold, executing the steps of extracting target texture features corresponding to the target image region from the factory texture features, and performing texture matching calculation on the texture features and the target texture features.
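The overlap-ratio gate of claims 8 and 9 can be sketched as follows, assuming regions are axis-aligned rectangles given as (x0, y0, x1, y1) and the "region overlap ratio" is intersection-over-union; neither assumption comes from the patent, and the names are illustrative.

```python
# Hypothetical sketch: overlap-ratio gate before texture matching.

def region_overlap_ratio(a, b):
    # Intersection-over-union of two axis-aligned boxes (x0, y0, x1, y1).
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def overlap_gate(target_region, factory_region, threshold=0.5):
    # True: proceed to texture matching calculation.
    # False: detection fails and re-acquisition is requested (claim 9).
    return region_overlap_ratio(target_region, factory_region) >= threshold
```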
9. The texture detection processing method according to claim 8, further comprising, after the step of calculating the region overlap ratio of the target image region and a factory image region corresponding to the target image region in the factory texture image is performed:
if the region overlap ratio is smaller than the overlap ratio threshold, determining that detection fails, and generating an acquisition page provided with a mask image of the factory texture image;
and generating texture detection results which do not pass the texture detection and carry the acquisition page.
10. The texture detection processing method according to claim 9, wherein after the step of performing texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information, obtaining a texture detection result, and returning it to the user terminal, the method further comprises:
acquiring a second texture image of the ceramic product acquired based on the acquisition page, and carrying out image combination on the second texture image and the texture image to obtain a combined texture image;
and extracting features of the combined texture image to obtain combined texture features, detecting textures based on the combined texture features and the factory texture features, and returning an obtained secondary texture detection result to the user terminal.
11. The texture detection processing method according to claim 1, wherein the feature extraction of the texture image to obtain texture features, and the texture detection based on the texture features and the factory texture features of the factory texture image bound by the identification information, are performed based on a texture detection model;
the texture image and the factory texture image are input into the texture detection model, and the texture detection model outputs the texture detection result.
12. The texture detection method according to claim 11, wherein the texture detection model is obtained by training in the following manner:
constructing a sample image set consisting of a positive sample image and a negative sample image;
inputting a sample image pair consisting of a positive sample image and a negative sample image in the sample image set into a model to be trained for texture detection;
and calculating an adversarial loss based on the detection result, and adjusting parameters of the model to be trained according to the adversarial loss, so as to obtain the trained texture detection model.
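One possible reading of the training procedure in claim 12 (the machine-translated "countermeasures loss" plausibly corresponds to an adversarial or contrastive-style loss over positive/negative sample pairs) is a pairwise margin loss. The scalar "model" below is a deliberately tiny stand-in; the loss shape, learning rate, and margin are assumptions for illustration, not the patent's formulation.

```python
# Hypothetical sketch of one training step on a positive/negative sample pair.

def pairwise_margin_loss(pos_score, neg_score, margin=1.0):
    # Push the detector to score the genuine (positive) texture image higher
    # than the counterfeit (negative) one by at least `margin`.
    return max(0.0, margin - pos_score + neg_score)

def training_step(weight, pos_feature, neg_feature, lr=0.1, margin=1.0):
    # One gradient step for a toy scalar scorer: score = weight * feature.
    loss = pairwise_margin_loss(weight * pos_feature,
                                weight * neg_feature, margin)
    if loss > 0.0:
        grad = -(pos_feature - neg_feature)  # d(loss)/d(weight)
        weight -= lr * grad
    return weight, loss
```

With a real model, `weight` becomes the model's parameter tensor and the gradient comes from automatic differentiation; the parameter-adjustment loop recited in the claim is otherwise the same.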
13. The texture detection processing method according to claim 1, wherein after the detection page is returned to the user terminal, the image acquisition of the ceramic product is performed by triggering an image acquisition interface configured by the detection page;
before the image acquisition interface is triggered, the user terminal performs the following operations by invoking a pre-detection interface:
identifying the shape classification of the ceramic product corresponding to the acquired first texture image, and determining image acquisition data corresponding to the shape classification;
and generating an image acquisition reminder carrying the image acquisition data, and displaying the image acquisition reminder through the detection page.
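The pre-detection step of claim 13 (shape classification selects the acquisition guidance shown to the user) amounts to a classify-then-lookup pattern. The shape classes, the aspect-ratio "classifier", and the guidance strings below are made-up illustrations standing in for the real classification model and acquisition data.

```python
# Hypothetical sketch of claim 13's pre-detection: shape class -> guidance.

ACQUISITION_DATA = {
    "vase": "Photograph the widest part of the body, about 20 cm away, no flash.",
    "plate": "Photograph the glaze from directly above, filling the frame.",
}

def classify_shape(aspect_ratio):
    # Toy stand-in for the shape-classification model: tall objects are vases.
    return "vase" if aspect_ratio > 1.2 else "plate"

def image_acquisition_reminder(aspect_ratio):
    # Build the reminder carrying the image acquisition data for display
    # on the detection page.
    shape = classify_shape(aspect_ratio)
    return {"shape": shape, "reminder": ACQUISITION_DATA[shape]}
```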
14. The texture detection method according to claim 1, wherein if the texture detection result is texture detection passing, the obtaining the texture detection result and returning the texture detection result to the user terminal comprises:
and acquiring tracing information of the ceramic product based on the identification information, writing the tracing information into the texture detection result, and returning the tracing information to the user terminal.
15. A texture detection processing method applied to a user terminal, the method comprising:
analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server;
receiving a detection page returned by the server for detecting the texture of the ceramic product;
acquiring texture images of the ceramic product by triggering an image acquisition interface configured by the detection page;
uploading the texture image to the server to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information;
and receiving a texture detection result of the ceramic product returned by the server.
16. The texture detection method of claim 15, wherein before the analyzing the identification information of the ceramic article and generating the access request based on the analysis result and submitting the access request to the server, the method further comprises:
scanning and decoding the identification code of the ceramic product, and obtaining the identification information of the ceramic product based on a decoding result;
or,
and invoking a near field communication component of the user terminal to perform near field communication interaction with a near field communication tag of the ceramic product to obtain identification information of the ceramic product.
17. The texture detection processing method according to claim 15, wherein after the step of receiving a detection page returned by the server for performing texture detection on the ceramic product, and before the step of acquiring a texture image of the ceramic product by triggering an image acquisition interface configured by the detection page, the method further comprises:
acquiring a first texture image of the ceramic product, invoking a pre-detection interface with the first texture image as interface input to perform image acquisition pre-detection, and displaying, through the detection page, an image acquisition reminder returned by the interface invocation;
wherein the image acquisition pre-detection comprises: identifying the shape classification of the ceramic product corresponding to the first texture image, determining image acquisition data corresponding to the shape classification, and generating the image acquisition reminder carrying the image acquisition data.
18. The texture detection method of claim 15, wherein the receiving the texture detection result of the ceramic article returned by the server comprises:
receiving and displaying a texture detection result, returned by the server, indicating that the texture detection has failed and carrying the acquisition page;
the acquisition page is configured with a mask image of the factory texture image.
19. The texture detection processing method according to claim 18, wherein after the sub-step of receiving and displaying the texture detection result, returned by the server, indicating that the texture detection has failed and carrying the acquisition page, the method further comprises:
acquiring a second texture image of the ceramic product by triggering an acquisition interface configured by the acquisition page;
uploading the second texture image and the texture image to the server, so that texture detection is performed based on the factory texture features and the combined texture features of a combined texture image obtained by combining the texture image and the second texture image;
and receiving a secondary texture detection result of the ceramic product returned by the server.
20. A texture detection processing apparatus, operable on a server, the apparatus comprising:
an access request receiving module configured to receive an access request submitted by a user terminal based on the identification information of the ceramic product;
the detection page generation module is configured to respond to the access request, generate a detection page for detecting the texture of the ceramic product and return the detection page to the user terminal;
the texture feature extraction module is configured to receive the texture image of the ceramic product acquired by the user terminal and perform feature extraction on the texture image to obtain texture features;
the texture detection module is configured to detect textures based on the texture features and the factory texture features of the factory texture image bound by the identification information, obtain texture detection results and return to the user terminal.
21. A texture detection processing apparatus, operable in a user terminal, the apparatus comprising:
the access request submitting module is configured to analyze the identification information of the ceramic product, generate an access request based on the analysis result and submit the access request to the server;
the detection page receiving module is configured to receive a detection page returned by the server for detecting the texture of the ceramic product;
a texture image acquisition module configured to acquire texture images of the ceramic article by triggering an image acquisition interface of the detection page configuration;
a texture image uploading module configured to upload the texture image to the server to perform texture detection based on the texture features of the texture image and the factory texture features of the factory texture image bound by the identification information;
and the detection result receiving module is configured to receive the texture detection result of the ceramic product returned by the server.
22. A server, comprising:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to:
receiving an access request submitted by a user terminal based on the identification information of the ceramic product;
responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning the detection page to the user terminal;
receiving texture images of the ceramic product acquired by the user terminal, and extracting features of the texture images to obtain texture features;
and carrying out texture detection based on the texture characteristics and the factory texture characteristics of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
23. A user terminal, comprising:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to:
analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server;
receiving a detection page returned by the server for detecting the texture of the ceramic product;
acquiring texture images of the ceramic product by triggering an image acquisition interface configured by the detection page;
uploading the texture image to the server to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information;
and receiving a texture detection result of the ceramic product returned by the server.
24. A storage medium storing computer-executable instructions that when executed by a processor implement the following:
receiving an access request submitted by a user terminal based on the identification information of the ceramic product;
responding to the access request, generating a detection page for detecting the texture of the ceramic product and returning the detection page to the user terminal;
receiving texture images of the ceramic product acquired by the user terminal, and extracting features of the texture images to obtain texture features;
and carrying out texture detection based on the texture characteristics and the factory texture characteristics of the factory texture image bound by the identification information, obtaining a texture detection result and returning to the user terminal.
25. A storage medium storing computer-executable instructions that when executed by a processor implement the following:
analyzing the identification information of the ceramic product, generating an access request based on the analysis result, and submitting the access request to a server;
receiving a detection page returned by the server for detecting the texture of the ceramic product;
acquiring texture images of the ceramic product by triggering an image acquisition interface configured by the detection page;
uploading the texture image to the server to perform texture detection based on the texture characteristics of the texture image and the factory texture characteristics of the factory texture image bound by the identification information;
and receiving a texture detection result of the ceramic product returned by the server.
CN202311205483.4A 2023-09-18 2023-09-18 Texture detection processing method and device Active CN116958135B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410182375.8A CN118051099A (en) 2023-09-18 2023-09-18 Texture detection processing method and device
CN202311205483.4A CN116958135B (en) 2023-09-18 2023-09-18 Texture detection processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311205483.4A CN116958135B (en) 2023-09-18 2023-09-18 Texture detection processing method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410182375.8A Division CN118051099A (en) 2023-09-18 2023-09-18 Texture detection processing method and device

Publications (2)

Publication Number Publication Date
CN116958135A true CN116958135A (en) 2023-10-27
CN116958135B CN116958135B (en) 2024-03-08

Family ID=88442806

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410182375.8A Pending CN118051099A (en) 2023-09-18 2023-09-18 Texture detection processing method and device
CN202311205483.4A Active CN116958135B (en) 2023-09-18 2023-09-18 Texture detection processing method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410182375.8A Pending CN118051099A (en) 2023-09-18 2023-09-18 Texture detection processing method and device

Country Status (1)

Country Link
CN (2) CN118051099A (en)

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010219785A (en) * 2009-03-16 2010-09-30 Konica Minolta Business Technologies Inc Image processing system, image processor, and terminal device
US20150154760A1 (en) * 2012-06-22 2015-06-04 Nec Corporation Verification method, verification system, apparatus, verification apparatus, and program
CN104794519A (en) * 2015-05-07 2015-07-22 闫霄龙 Cloud identification system and cloud identification method
CN109308504A (en) * 2018-09-28 2019-02-05 广州科琳电子科技有限公司 Recognition method, device, terminal and system for a texture anti-counterfeiting identification mark
WO2019062693A1 (en) * 2017-09-28 2019-04-04 阿里巴巴集团控股有限公司 Information interaction method, apparatus, and device
CN110503441A (en) * 2019-08-20 2019-11-26 北京微芯边缘计算研究院 Ceramic texture fingerprint feature authentication method and system based on a cloud platform
US20190370987A1 (en) * 2017-03-27 2019-12-05 Shenzhen Institutes Of Advanced Technology Chinese Academy Of Sciences Texture synthesis method, and device for same
CN110866461A (en) * 2019-10-29 2020-03-06 上海***信息技术有限公司 Commodity automatic identification tracing anti-counterfeiting method and system based on texture partition
WO2020082610A1 (en) * 2018-10-23 2020-04-30 深圳壹账通智能科技有限公司 Identity card information verification method and apparatus, device, and computer readable storage medium
CN112465517A (en) * 2019-09-06 2021-03-09 深圳兆日科技股份有限公司 Anti-counterfeiting verification method and device and computer readable storage medium
WO2021109536A1 (en) * 2019-12-03 2021-06-10 南京环印防伪科技有限公司 Item anti-counterfeiting method based on digital watermark matching high-temperature firing process and ceramic anti-counterfeiting application
CN113563111A (en) * 2021-07-06 2021-10-29 东莞市唯美陶瓷工业园有限公司 Ceramic tile with invisible texture and manufacturing method thereof
WO2021223675A1 (en) * 2020-05-06 2021-11-11 支付宝(杭州)信息技术有限公司 Risk inspection
US20210350193A1 (en) * 2019-04-22 2021-11-11 Dongguan City Wonderful Ceramics Industrial Park Co., Ltd. Anti-counterfeiting image code embedded in a decorative pattern of a ceramic tile and anti-counterfeiting method thereof
CN215068286U (en) * 2021-06-10 2021-12-07 福建省中瓷网络科技有限公司 Intelligent hardware applied to production system and applied to ceramic identity recognition
KR20220022922A (en) * 2020-08-19 2022-03-02 한국전력공사 Method of image processing for porcelain insulator surface crack identification
CN114301973A (en) * 2021-12-24 2022-04-08 支付宝(杭州)信息技术有限公司 Information recommendation processing method and device
CN114511775A (en) * 2021-12-31 2022-05-17 北京三快在线科技有限公司 Image detection method and device
CN114971653A (en) * 2022-05-23 2022-08-30 中山大学 Ceramic tracing and identifying method based on block chain and trusted data space
CN115100037A (en) * 2022-06-17 2022-09-23 广东工业大学 Large-breadth tile imaging method and system based on multi-line scanning camera image splicing
CN217767687U (en) * 2022-04-07 2022-11-08 田晶 Miniature random texture anti-counterfeiting mark
CN115453281A (en) * 2022-08-15 2022-12-09 国网江苏省电力有限公司超高压分公司 Degraded insulator identification method and device based on electric field array and medium
CN115953596A (en) * 2022-12-30 2023-04-11 佛山喀视科技有限公司 Texture recognition method, device and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张海煊; 陈云霞; 方涛; 袁文瓒; 汤辉: "Research on the application of 3D scanning technology in the inspection, authentication, restoration and reproduction of ceramic products", China Ceramics (中国陶瓷), No. 09 *
江鹏飞; 袁文瓒; 魏存峰; 陈旭; 阙介民: "Research on the application of X-ray fluoroscopy technology in the authentication and quality traceability of ceramic artworks", China Ceramics (中国陶瓷), No. 10 *

Also Published As

Publication number Publication date
CN116958135B (en) 2024-03-08
CN118051099A (en) 2024-05-17

Similar Documents

Publication Publication Date Title
US10872416B2 (en) Object oriented image editing
US9990557B2 (en) Region selection for image match
TWI746674B (en) Type prediction method, device and electronic equipment for identifying objects in images
US10109051B1 (en) Item recommendation based on feature match
US10133951B1 (en) Fusion of bounding regions
WO2018014828A1 (en) Method and system for recognizing location information in two-dimensional code
CN110675487B (en) Three-dimensional face modeling and recognition method and device based on multi-angle two-dimensional face
US9436883B2 (en) Collaborative text detection and recognition
US9424461B1 (en) Object recognition for three-dimensional bodies
US8965117B1 (en) Image pre-processing for reducing consumption of resources
KR101469398B1 (en) Text-based 3d augmented reality
US9270899B1 (en) Segmentation approaches for object recognition
EP2831807A1 (en) User-guided object identification
CN111008935B (en) Face image enhancement method, device, system and storage medium
CN109274891B (en) Image processing method, device and storage medium thereof
US9519355B2 (en) Mobile device event control with digital images
US20230214913A1 (en) Product cards provided by augmented reality content generators
US20230215118A1 (en) Api to provide product cards generated by augmented reality content generators
US20230214912A1 (en) Dynamically presenting augmented reality content generators based on domains
CN116958135B (en) Texture detection processing method and device
Shieh et al. Fast facial detection by depth map analysis
Pereira et al. Mirar: Mobile image recognition based augmented reality framework
CN117078962B (en) Data chaining method and device based on texture acquisition
CN116978003B (en) Texture detection processing method and device for food material commodity
CN116994007B (en) Commodity texture detection processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant