CN112365515A - Edge detection method, device and equipment based on dense sensing network - Google Patents


Info

Publication number
CN112365515A
CN112365515A
Authority
CN
China
Prior art keywords
dense
network
edge
sub
block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011197809.XA
Other languages
Chinese (zh)
Inventor
李天驰
孙悦
王帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dianmao Technology Co Ltd
Original Assignee
Shenzhen Dianmao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dianmao Technology Co Ltd filed Critical Shenzhen Dianmao Technology Co Ltd
Priority to CN202011197809.XA priority Critical patent/CN112365515A/en
Publication of CN112365515A publication Critical patent/CN112365515A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/10 — Segmentation; Edge detection
    • G06T 7/13 — Edge detection
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 — Computing arrangements based on biological models
    • G06N 3/02 — Neural networks
    • G06N 3/04 — Architecture, e.g. interconnection topology
    • G06N 3/045 — Combinations of networks
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10024 — Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an edge detection method, apparatus, and device based on a dense sensing network, wherein the method comprises the following steps: constructing a dense sensing network; acquiring an RGB image to be recognized and inputting it into the dense sensing network for feature extraction to generate feature maps; up-sampling the feature maps to generate edge images; and fusing all edge images of the target to be detected to generate a target edge information image. By avoiding the loss of deep-layer edge information, the embodiment of the invention prevents the problem of incomplete edge information and thereby improves the edge detection effect.

Description

Edge detection method, device and equipment based on dense sensing network
Technical Field
The invention relates to the field of Internet technologies, and in particular to an edge detection method, apparatus, and device based on a dense sensing network.
Background
Edge detection is a classical computer vision task that recurs in modern applications such as image-to-image translation and photo-to-sketch conversion. Likewise, in medical image analysis and remote sensing, many core activities depend on edge detectors. Despite considerable effort, edge detection remains open to improvement. Since the Sobel operator, many edge detectors have been proposed, and classical techniques such as Canny remain in use today. More recently, in the deep-learning era, edge detectors based on Convolutional Neural Networks (CNNs), such as DeepEdge, HED, RCF, and BDCN, have been proposed. These models can predict edge maps from a given image with strong performance, a success that mainly stems from combining multi-scale CNN features with training regularization techniques. However, these methods still have drawbacks: when edge information is incomplete, training becomes difficult and the results fall short of ideal.
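As context for the classical operators cited above, here is a minimal NumPy sketch of Sobel gradient-magnitude edge detection. It is illustrative only — the function name and the naive direct-convolution loop are this sketch's own choices, not part of the invention:

```python
import numpy as np

def sobel_edges(img):
    # Classic Sobel kernels for horizontal and vertical gradients.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    # Gradient magnitude: strong response at intensity steps.
    return np.hypot(gx, gy)
```

On a vertical step image the response peaks along the step and is zero in uniform regions, which is exactly the behavior the CNN-based detectors above learn to refine.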
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
In view of the foregoing deficiencies of the prior art, an object of the present invention is to provide an edge detection method, apparatus, and device based on a dense sensing network, so as to solve the technical problems in the prior art that incomplete edge information makes training difficult, yields unsatisfactory results, and leaves edge detection incomplete.
The technical scheme of the invention is as follows:
an edge detection method based on a dense-sensing network, the method comprising:
constructing a dense sensing network;
acquiring an RGB image to be identified, inputting the RGB image into a dense sensing network for feature extraction, and generating a feature map;
sampling the feature map to generate an edge image;
and fusing all edge images of the target to be detected to generate a target edge information image.
Further, the dense sensing network comprises a dense initial network and an up-sampling module,
the constructing of the dense sensing network comprises the following steps:
constructing a network structure of a dense initial network;
and constructing an up-sampling module according to the network structure of the dense initial network.
Further preferably, the acquiring an RGB image to be recognized, inputting the RGB image into a dense sensing network for feature extraction, and generating a feature map includes:
and acquiring an RGB image to be identified, inputting the RGB image into a dense initial network for feature extraction, and generating a feature map.
Further preferably, the sampling of the feature map to generate an edge image includes:
and inputting the feature map into an up-sampling module for sampling to generate an edge image.
Preferably, the constructing of the dense sensing network includes:
an encoder is constructed by stacking two convolution layers of a predetermined size and applying batch normalization.
Further, the constructing of the up-sampling module according to the network structure of the dense initial network includes:
constructing conditionally stacked sub-blocks according to the network structure of the dense initial network, the sub-block types including a first sub-block and a second sub-block;
constructing the first sub-block and the second sub-block;
presetting the input of the first sub-block to be the output of the dense initial network or the output of the second sub-block.
Further, the constructing the first sub-block includes:
the first sub-block is preset to have two layers, wherein the first layer is a convolution layer and the second layer is a deconvolution layer.
Another embodiment of the present invention provides an edge detection apparatus based on a dense sensing network, including:
the network construction module is used for constructing a dense sensing network;
the characteristic extraction module is used for acquiring an RGB image to be identified, inputting the RGB image into a dense sensing network for characteristic extraction, and generating a characteristic diagram;
the sampling module is used for sampling the feature map to generate an edge image;
and the image fusion module is used for fusing all edge images of the target to be detected to generate a target edge information image.
Another embodiment of the present invention provides an edge detection device based on a dense sensing network, the device comprising at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the edge detection method based on a dense sensing network described above.
Yet another embodiment of the present invention provides a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the above-described edge detection method based on a dense sensing network.
Advantageous effects: the embodiment of the invention avoids the loss of deep-layer edge information, so that the problem of incomplete edge information does not occur, thereby improving the edge detection effect.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flowchart illustrating a preferred embodiment of an edge detection method based on a dense sensing network according to the present invention;
FIG. 2 is a functional block diagram of an edge detection apparatus based on a dense sensing network according to a preferred embodiment of the present invention;
fig. 3 is a schematic diagram of a hardware structure of an edge detection device based on a dense sensing network according to a preferred embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and effects of the present invention clearer and clearer, the present invention is described in further detail below. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. Embodiments of the present invention will be described below with reference to the accompanying drawings.
The embodiment of the invention provides an edge detection method based on a dense sensing network. Referring to fig. 1, fig. 1 is a flowchart illustrating an edge detection method based on a dense sensing network according to a preferred embodiment of the present invention. As shown in fig. 1, it includes the steps of:
s100, constructing a dense sensing network;
s200, acquiring an RGB image to be recognized, inputting the RGB image into a dense sensing network for feature extraction, and generating a feature map;
step S300, sampling the feature map to generate an edge image;
and S400, fusing all edge images of the target to be detected to generate a target edge information image.
In a specific implementation, the method is mainly used to detect the edges of human bodies and other objects in live-streamed online classes. The invention provides an edge detection method based on a dense sensing network. The method is built on an innovative dense sensing network composed mainly of learned filters, which receives an input image and predicts an edge feature map at the same resolution. When an RGB image is input to the network, features are extracted, the feature map of each block is up-sampled, and thin edge feature maps are generated, avoiding the loss of deep edges. The problem of incomplete edge information is thereby avoided, and the edge detection effect is improved.
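The four steps S100–S400 can be sketched as a toy pipeline. Every function below is a hypothetical stand-in (a stub builder, a channel-mean "feature extractor", identity up-sampling, average fusion), not the patent's actual network:

```python
import numpy as np

def build_dense_network():
    # S100 stand-in: the real step builds the dense initial network
    # and the up-sampling module; here it is only a stub marker.
    return {"built": True}

def extract_features(net, rgb):
    # S200 stand-in: pretend each of three "blocks" emits one
    # single-channel feature map (here just the channel mean).
    return [rgb.mean(axis=2) for _ in range(3)]

def upsample(feature_map):
    # S300 stand-in: identity instead of a learned up-sampling block.
    return feature_map

def fuse(edge_maps):
    # S400: pixel-wise average of all intermediate edge maps.
    return np.mean(np.stack(edge_maps), axis=0)

net = build_dense_network()
rgb = np.full((8, 8, 3), 0.5)  # dummy "RGB image to be recognized"
edges = fuse([upsample(f) for f in extract_features(net, rgb)])
```

The point of the sketch is only the data flow: one feature map per block, each up-sampled to full resolution, then fused into one edge information image.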
Further, the dense sensing network includes a dense initial network and an up-sampling module,
the constructing of the dense sensing network comprises the following steps:
constructing a network structure of a dense initial network;
and constructing an up-sampling module according to the network structure of the dense initial network.
In a specific implementation, the network comprises a dense initial network and an up-sampling module, whose main blocks are connected through 1×1 convolution blocks. Each main block is composed of sub-blocks that are densely connected to the output of the previous main block. The output of each main block is fed into an up-sampling block, which generates an intermediate edge map; together these maps form a scale-space volume used to construct the final fused edge map.
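The dense connectivity idea — each sub-block consuming the concatenation of all earlier outputs — can be illustrated with a toy NumPy block. The "convolution" is replaced by a simple scaling so the sketch stays self-contained; the function name and channel arithmetic are this sketch's own assumptions:

```python
import numpy as np

def dense_block(x, n_sub=3):
    # Every sub-block receives the channel-wise concatenation of all
    # earlier outputs (the dense connection). A real network would
    # apply a convolution here; this sketch just scales by 0.5.
    outputs = [x]
    for _ in range(n_sub):
        dense_input = np.concatenate(outputs, axis=-1)
        outputs.append(dense_input * 0.5)
    return outputs[-1]
```

Because each sub-block sees everything before it, the channel count grows with depth (1 → 1 → 2 → 4 in this toy), which is the mechanism that keeps early edge information reachable by deep layers.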
Further, acquiring an RGB image to be recognized, inputting the RGB image into a dense sensing network for feature extraction, and generating a feature map, wherein the feature map comprises:
and acquiring an RGB image to be identified, inputting the RGB image into a dense initial network for feature extraction, and generating a feature map.
In a specific implementation, the RGB image is fed into the dense initial network for feature extraction; the dense initial network largely avoids the loss of edge information.
Further, sampling the feature map to generate an edge image, including:
and inputting the feature map into an up-sampling module for sampling to generate an edge image.
In a specific implementation, the extracted feature maps are fed into the up-sampling module to generate edge images, and finally the edge images generated by the up-sampling module are fused to generate the final edge information image.
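A minimal sketch of the fusion step, assuming (as one plausible choice — the patent does not specify the operator) that the intermediate edge maps are averaged pixel-wise:

```python
import numpy as np

def fuse_edge_maps(edge_maps):
    # Stack the intermediate edge maps and average them pixel-wise,
    # clamping the fused result to the valid [0, 1] range.
    stacked = np.stack(edge_maps, axis=0)
    return np.clip(stacked.mean(axis=0), 0.0, 1.0)
```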
Further, the constructing of the dense sensing network includes:
an encoder is constructed by stacking two convolution layers of a predetermined size and applying batch normalization.
In a specific implementation, the dense initial network comprises an encoder with six main modules, and the results generated by the six modules are fused to obtain the final result. Each block (shown in blue in the figures) is formed by stacking two convolution layers with a 3×3 kernel, followed by batch normalization with ReLU (Rectified Linear Unit) as the activation function (only the last convolution in the last sub-block is not activated). Max pooling uses a 3×3 kernel with stride 2. Because the architecture follows multi-scale learning like HED, followed by an up-sampling process, deeper convolution layers would lose important edge features; therefore, starting at the third block, the outputs of the sub-blocks are averaged over their edge connections.
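The 3×3, stride-2 max pooling described above can be written directly in NumPy (a naive loop version over a 2-D feature map, for illustration only):

```python
import numpy as np

def max_pool(x, k=3, stride=2):
    # Naive k x k max pooling with the given stride over a 2-D map.
    h, w = x.shape
    out_h = (h - k) // stride + 1
    out_w = (w - k) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = window.max()
    return out
```

For a 5×5 input this yields a 2×2 output, each entry the maximum of its 3×3 window — the down-sampling that the subsequent up-sampling module must undo.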
Further, the constructing of the up-sampling module according to the network structure of the dense initial network includes:
constructing conditionally stacked sub-blocks according to the network structure of the dense initial network, the sub-block types including a first sub-block and a second sub-block;
constructing the first sub-block and the second sub-block;
presetting the input of the first sub-block to be the output of the dense initial network or the output of the second sub-block.
In a specific implementation, the up-sampling module is composed of conditionally stacked sub-blocks. There are two types of sub-block: sub-block 1 is fed by the dense initial network or by sub-block 2, and is used only when the scale difference between the feature map and the ground truth equals 2; sub-block 2 is used when the scale difference is greater than 2, and is applied iteratively until the scale difference equals 2. The structure of sub-block 1 is a 1×1 convolution, a ReLU activation function, and an s×s deconvolution, where s equals the input feature-map size. The last convolution layer has no activation function. Sub-block 2 is arranged like sub-block 1, except for the number of filters.
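The conditional choice between the two sub-block types can be sketched as a small planner. It assumes, as one reading of the passage, that each application of sub-block 2 doubles the resolution and therefore halves the scale difference (a power-of-two gap):

```python
def plan_upsampling(scale_diff):
    # Apply sub-block 2 while the gap between feature-map scale and
    # ground-truth scale exceeds 2 (each application is assumed to
    # double the resolution, halving the gap), then finish with
    # sub-block 1 once the gap equals 2.
    blocks = []
    while scale_diff > 2:
        blocks.append("sub-block 2")
        scale_diff //= 2
    if scale_diff == 2:
        blocks.append("sub-block 1")
    return blocks
```

A feature map 8× smaller than the ground truth would thus pass through sub-block 2 twice and sub-block 1 once; a map 2× smaller goes straight to sub-block 1.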
Further, constructing the first sub-block includes:
the first sub-block is preset to have two layers, wherein the first layer is a convolution layer and the second layer is a deconvolution layer.
In a specific implementation, each sub-block has two layers, one a convolution and the other a deconvolution. Taking the first sub-block with two layers as an example, the up-sampling in the first layer can be performed by bilinear interpolation, sub-pixel convolution, or transposed convolution. Specifically, the loss function is as follows:
L(W, w) = Σ_n δ_n · l_n(W, w_n)
l_n = −β Σ_{j∈Y+} log σ(y_j = 1 | X; W, w_n) − (1 − β) Σ_{j∈Y−} log σ(y_j = 0 | X; W, w_n)
where W is the set of all network parameters, w_n are the parameters corresponding to the n-th scale level, and δ_n is the weight of each scale level; β = |Y−| / (|Y+| + |Y−|) and 1 − β = |Y+| / (|Y+| + |Y−|), where Y+ and Y− denote the edge and non-edge pixels in the ground truth (GT), respectively.
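A NumPy sketch of a class-balanced (HED-style) cross-entropy of this kind, with β computed from the ground truth and per-scale weights δ. The exact form is an assumption reconstructed from the surrounding description, not the patent's verbatim loss:

```python
import numpy as np

def weighted_bce(pred, gt, eps=1e-7):
    # Class-balanced cross-entropy: beta = |Y-| / (|Y+| + |Y-|) weights
    # the rare edge pixels, 1 - beta the abundant non-edge pixels.
    y_pos = gt == 1
    y_neg = gt == 0
    beta = y_neg.sum() / gt.size
    p = np.clip(pred, eps, 1.0 - eps)
    loss = -(beta * np.log(p[y_pos]).sum()
             + (1.0 - beta) * np.log(1.0 - p[y_neg]).sum())
    return loss / gt.size

def total_loss(preds, gt, deltas):
    # L = sum_n delta_n * l_n: one weighted term per scale level.
    return sum(d * weighted_bce(p, gt) for d, p in zip(deltas, preds))
```

Because edge pixels are far outnumbered by non-edge pixels, β > 1 − β, so mistakes on edges are penalized more — the balancing that makes multi-scale edge training feasible.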
The embodiment of the invention provides an edge detection method based on a dense sensing network. The method is built on an innovative dense sensing network composed mainly of learned filters, which receives an input image and predicts an edge feature map at the same resolution. The network of the invention can be seen as two sub-networks: a dense initial network and an up-sampling region. When an RGB image is input to the network, the feature map of each block is fed into the up-sampling region, which generates thin edge feature maps and avoids the loss of deep edges. The problem of incomplete edge information is thereby avoided, and the edge detection effect is improved.
It should be noted that, a certain order does not necessarily exist between the above steps, and those skilled in the art can understand, according to the description of the embodiments of the present invention, that in different embodiments, the above steps may have different execution orders, that is, may be executed in parallel, may also be executed interchangeably, and the like.
Another embodiment of the present invention provides an edge detection apparatus based on a dense sensing network, as shown in fig. 2, the apparatus 1 includes:
the network construction module 11 is used for constructing a dense sensing network;
the feature extraction module 12 is configured to acquire an RGB image to be identified, input the RGB image into a dense sensing network to perform feature extraction, and generate a feature map;
the sampling module 13 is configured to sample the feature map to generate an edge image;
and the image fusion module 14 is configured to fuse all edge images of the target to be detected to generate a target edge information image.
The specific implementation is shown in the method embodiment, and is not described herein again.
Another embodiment of the present invention provides an edge detection device based on a dense sensing network, as shown in fig. 3, the device 10 includes:
one or more processors 110 and a memory 120, where one processor 110 is illustrated in fig. 3, the processor 110 and the memory 120 may be connected by a bus or other means, and the connection by the bus is illustrated in fig. 3.
Processor 110 is operative to implement various control logic of the apparatus 10, and may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a single-chip microcomputer, an ARM (Acorn RISC Machine) processor or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. Also, the processor 110 may be any conventional processor, microprocessor, or state machine. Processor 110 may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The memory 120 is a non-volatile computer-readable storage medium, and can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions corresponding to the edge detection method based on dense sensing network in the embodiment of the present invention. The processor 110 executes various functional applications and data processing of the device 10, i.e. implements the edge detection method based on dense-aware network in the above-described method embodiments, by running non-volatile software programs, instructions and units stored in the memory 120.
The memory 120 may include a storage program area and a storage data area, wherein the storage program area may store an application program required for operating the device, at least one function; the storage data area may store data created according to the use of the device 10, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, memory 120 optionally includes memory located remotely from processor 110, which may be connected to device 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more units are stored in the memory 120, and when executed by the one or more processors 110, perform the edge detection method based on a dense sensing network in any of the above-described method embodiments, e.g., perform the above-described method steps S100 to S400 in fig. 1.
Embodiments of the present invention provide a non-transitory computer-readable storage medium storing computer-executable instructions for execution by one or more processors, e.g., to perform method steps S100-S400 of fig. 1 described above.
By way of example, non-volatile storage media can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), SynchLink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The disclosed memory components or memory of the operating environment described herein are intended to comprise one or more of these and/or any other suitable types of memory.
Another embodiment of the present invention provides a computer program product comprising a computer program stored on a non-volatile computer-readable storage medium, the computer program comprising program instructions which, when executed by a processor, cause the processor to perform the edge detection method based on a dense sensing network of the above-described method embodiment, e.g., the method steps S100 to S400 in fig. 1 described above.
The above-described embodiments are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a general hardware platform, or by hardware alone. Based on this understanding, the essence of the above technical solutions, or the part contributing to the related art, can be embodied in the form of a software product stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disk, including several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods of the various embodiments or some parts thereof.
Conditional language such as "can," "might," or "may" is generally intended to convey that a particular embodiment can include (yet other embodiments do not include) particular features, elements, and/or operations, among others, unless specifically stated otherwise or otherwise understood within the context as used. Thus, such conditional language is also generally intended to imply that features, elements, and/or operations are in any way required for one or more embodiments or that one or more embodiments must include logic for deciding, with or without input or prompting, whether such features, elements, and/or operations are included or are to be performed in any particular embodiment.
What has been described herein in the specification and drawings includes examples that can provide a dense-aware network-based edge detection method and apparatus. It will, of course, not be possible to describe every conceivable combination of components and/or methodologies for purposes of describing the various features of the disclosure, but it can be appreciated that many further combinations and permutations of the disclosed features are possible. It is therefore evident that various modifications can be made to the disclosure without departing from the scope or spirit thereof. In addition, or in the alternative, other embodiments of the disclosure may be apparent from consideration of the specification and drawings and from practice of the disclosure as presented herein. It is intended that the examples set forth in this specification and the drawings be considered in all respects as illustrative and not restrictive. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (10)

1. An edge detection method based on a dense sensing network is characterized by comprising the following steps:
constructing a dense sensing network;
acquiring an RGB image to be identified, inputting the RGB image into a dense sensing network for feature extraction, and generating a feature map;
sampling the feature map to generate an edge image;
and fusing all edge images of the target to be detected to generate a target edge information image.
2. The edge detection method based on the dense sensing network as claimed in claim 1, wherein the dense sensing network comprises a dense initial network and an up-sampling module,
the constructing of the dense sensing network comprises the following steps:
constructing a network structure of a dense initial network;
and constructing an up-sampling module according to the network structure of the dense initial network.
3. The edge detection method based on the dense sensing network as claimed in claim 2, wherein the obtaining of the RGB image to be recognized and the inputting of the RGB image into the dense sensing network for feature extraction to generate the feature map comprises:
and acquiring an RGB image to be identified, inputting the RGB image into a dense initial network for feature extraction, and generating a feature map.
4. The edge detection method based on the dense sensing network as claimed in claim 3, wherein the sampling the feature map to generate the edge image comprises:
and inputting the feature map into an up-sampling module for sampling to generate an edge image.
5. The edge detection method based on the dense sensing network as claimed in claim 4, wherein the constructing the dense sensing network comprises:
an encoder is constructed by stacking two convolution layers of a predetermined size and applying batch normalization.
6. The edge detection method based on the dense sensing network as claimed in claim 5, wherein the constructing of the up-sampling module according to the network structure of the dense initial network comprises:
constructing conditionally stacked sub-blocks according to the network structure of the dense initial network, the sub-block types including a first sub-block and a second sub-block;
constructing the first sub-block and the second sub-block;
presetting the input of the first sub-block to be the output of the dense initial network or the output of the second sub-block.
7. The edge detection method based on the dense sensing network as claimed in claim 6, wherein the constructing the first sub-block comprises:
the first sub-block is preset to have two layers, wherein the first layer is a convolution layer and the second layer is a deconvolution layer.
8. An edge detection device based on a dense sensing network, the device comprising:
the network construction module is used for constructing a dense sensing network;
the characteristic extraction module is used for acquiring an RGB image to be identified, inputting the RGB image into a dense sensing network for characteristic extraction, and generating a characteristic diagram;
the sampling module is used for sampling the feature map to generate an edge image;
and the image fusion module is used for fusing all edge images of the target to be detected to generate a target edge information image.
9. An edge detection device based on a dense sensing network, the device comprising at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the edge detection method based on a dense sensing network of any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the edge detection method based on a dense sensing network of any one of claims 1-7.
CN202011197809.XA 2020-10-30 2020-10-30 Edge detection method, device and equipment based on dense sensing network Pending CN112365515A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011197809.XA CN112365515A (en) 2020-10-30 2020-10-30 Edge detection method, device and equipment based on dense sensing network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011197809.XA CN112365515A (en) 2020-10-30 2020-10-30 Edge detection method, device and equipment based on dense sensing network

Publications (1)

Publication Number Publication Date
CN112365515A true CN112365515A (en) 2021-02-12

Family

ID=74513206

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011197809.XA Pending CN112365515A (en) 2020-10-30 2020-10-30 Edge detection method, device and equipment based on dense sensing network

Country Status (1)

Country Link
CN (1) CN112365515A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991465A (en) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 Camera calibration method and device, electronic equipment and computer readable medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN109522966A (en) * 2018-11-28 2019-03-26 中山大学 A kind of object detection method based on intensive connection convolutional neural networks
CN110852316A (en) * 2019-11-07 2020-02-28 中山大学 Image tampering detection and positioning method adopting convolution network with dense structure
US20200143526A1 (en) * 2017-08-23 2020-05-07 Boe Technology Group Co., Ltd. Image processing method and device
CN111191662A (en) * 2019-12-31 2020-05-22 网易(杭州)网络有限公司 Image feature extraction method, device, equipment, medium and object matching method

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US20200143526A1 (en) * 2017-08-23 2020-05-07 Boe Technology Group Co., Ltd. Image processing method and device
CN109522966A (en) * 2018-11-28 2019-03-26 中山大学 A kind of object detection method based on intensive connection convolutional neural networks
CN110852316A (en) * 2019-11-07 2020-02-28 中山大学 Image tampering detection and positioning method adopting convolution network with dense structure
CN111191662A (en) * 2019-12-31 2020-05-22 网易(杭州)网络有限公司 Image feature extraction method, device, equipment, medium and object matching method

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN112991465A (en) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 Camera calibration method and device, electronic equipment and computer readable medium

Similar Documents

Publication Publication Date Title
WO2021073493A1 (en) Image processing method and device, neural network training method, image processing method of combined neural network model, construction method of combined neural network model, neural network processor and storage medium
EP3923233A1 (en) Image denoising method and apparatus
CN110570426B (en) Image co-registration and segmentation using deep learning
EP3767524A1 (en) Method and apparatus for detecting object
CN111476719A (en) Image processing method, image processing device, computer equipment and storage medium
CN112446834A (en) Image enhancement method and device
CN107886082B (en) Method and device for detecting mathematical formulas in images, computer equipment and storage medium
Gong et al. Combining sparse representation and local rank constraint for single image super resolution
WO2021017006A1 (en) Image processing method and apparatus, neural network and training method, and storage medium
CN111210465A (en) Image registration method and device, computer equipment and readable storage medium
CN109711381A (en) Target identification method, device and the computer equipment of remote sensing images
CN107392316B (en) Network training method and device, computing equipment and computer storage medium
CN112364738A (en) Human body posture estimation method, device, system and medium based on deep learning
CN112365515A (en) Edge detection method, device and equipment based on dense sensing network
CN112541900B (en) Detection method and device based on convolutional neural network, computer equipment and storage medium
CN111833363B (en) Image edge and saliency detection method and device
KR102427884B1 (en) Apparatus and method for training object detection model
CN112465847A (en) Edge detection method, device and equipment based on clear boundary prediction
CN114581916A (en) Image-based character recognition method, device and equipment combining RPA and AI
CN113255756B (en) Image fusion method and device, electronic equipment and storage medium
JP6017005B2 (en) Image search apparatus, image search method and program
CN113223008A (en) Fundus image segmentation method and system based on multi-scale guide attention network
DE102021108527A1 (en) NEURON NETWORK DEVICE FOR OPERATING A NEURON NETWORK, METHOD FOR OPERATING A NEURON NETWORK DEVICE AND APPLICATION PROCESSOR INCLUDING A NEURON NETWORK DEVICE
CN113269812A (en) Image prediction model training and application method, device, equipment and storage medium
CN111815631B (en) Model generation method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination