CN112435306A - G banding chromosome HDR image reconstruction method - Google Patents

G banding chromosome HDR image reconstruction method

Info

Publication number
CN112435306A
CN112435306A (application CN202011312768.4A)
Authority
CN
China
Prior art keywords
network
image
reconstruction
hdr image
ldr
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011312768.4A
Other languages
Chinese (zh)
Inventor
崔玉峰 (Cui Yufeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Beion Pharmaceutical Technology Co ltd
Original Assignee
Shanghai Beion Pharmaceutical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Beion Pharmaceutical Technology Co ltd filed Critical Shanghai Beion Pharmaceutical Technology Co ltd
Priority to CN202011312768.4A
Publication of CN112435306A
Withdrawn


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/008 Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing and discloses a G-banding chromosome HDR image reconstruction method comprising steps for training a dequantization network model, a linearization network model and a detail reconstruction network model, and a step of reconstructing an HDR image using the trained models. The invention designs these three neural network models and forms a training set from actual G-banding chromosome HDR images and their corresponding real LDR images, completing the training of all three models. The trained models then apply three steps to a real LDR image, namely dequantization, linearization, and detail reconstruction of overexposed regions, finally completing the task of reconstructing the G-banding chromosome HDR image. Because the method reconstructs the HDR image from only a single LDR image, the G-banding chromosome features become more distinct, which facilitates subsequent analysis; the reconstructed HDR image is free of ghosting artifacts and of higher quality.

Description

G banding chromosome HDR image reconstruction method
Technical Field
The invention relates to the technical field of image processing, and in particular to a G-banding chromosome HDR image reconstruction method.
Background
The quality of a G-banding chromosome image strongly affects the outcome of chromosome G-banding karyotype analysis. In general, a higher-quality G-banding chromosome image provides more chromosome structure and texture information and therefore has greater clinical application value.
To obtain higher-quality G-banding chromosome images, more effective slide-preparation methods are generally used to smear G-banding chromosome samples, and high-resolution chromosome preparations reveal clearly more chromosome detail than medium- or low-resolution ones. With advances in digital imaging, G-banding chromosome samples captured with a digital camera can be processed more conveniently and efficiently. However, camera parameters limit the dynamic range of the captured sample image, so chromosome detail is easily lost where the sample is too bright or too dark, degrading preparation quality.
High dynamic range (HDR) G-banding chromosome images generally contain more chromosome detail than low dynamic range (LDR) images. The most common techniques for creating HDR images either merge a set of LDR images captured at multiple exposures, which performs well in static scenes but often produces ghosting in dynamic scenes or with handheld cameras and requires LDR image alignment and post-processing to minimize artifacts; or use a convolutional neural network to fuse multiple aligned or misaligned LDR images into an HDR image, although the LDR-to-HDR mapping is difficult to learn directly because HDR pixel values (32 bits) vary far more than LDR pixel values (8 bits). All of these methods depend on capturing a group of LDR images of the same sample at multiple exposures and then synthesizing an HDR image from that group; this easily introduces chromosome artifacts into the synthesized HDR image, and such methods cannot reconstruct an HDR image from a single LDR image.
Disclosure of Invention
Aiming at the prior-art defects that HDR image reconstruction cannot be completed from a single LDR image and that artifacts are introduced, the invention provides a G-banding chromosome HDR image reconstruction method that reconstructs a high dynamic range image from a single low dynamic range G-banding chromosome image while avoiding the chromosome artifacts produced when an HDR image is synthesized from multiple exposures.
To solve this technical problem, the invention adopts the following technical scheme:
the G display band chromosome HDR image reconstruction method comprises the training steps of three models, namely a dequantization network, a linearization network and a detail reconstruction network, and the step of reconstructing an HDR image by using the trained models; the steps of reconstructing the HDR image using the trained model are as follows:
s1: firstly, reading a real LDR image I, adopting a trained dequantization network to reduce artifacts generated in a quantization process, and generating a dequantized LDR image
Figure BDA0002790329940000021
S2: de-quantizing LDR image by using Sobel filter
Figure BDA0002790329940000022
Then inputting the extracted edge and histogram features into a trained linearization network to predict K principal component analysis weights, constructing a camera response inverse function, and finally obtaining a linearized LDR image by using the camera response inverse function
Figure BDA0002790329940000023
S3: linearization LDR image by adopting trained detail reconstruction network
Figure BDA0002790329940000024
The part with over exposure is subjected to detail reconstruction, and a residual image predicted by a detail reconstruction network and a linearized LDR image are reconstructed by using a soft mask image of the over exposure part
Figure BDA0002790329940000025
Smooth fusion is carried out, and finally a reconstructed HDR image is obtained
Figure BDA0002790329940000026
And finishing the reconstruction work of the G banding chromosome HDR image.
Further, the training steps of the three models, the dequantization network, the linearization network and the detail reconstruction network, are as follows:

S01: fuse the output of the dequantization network onto the input LDR image $I$ to generate a dequantized LDR image $\hat{I}_{deq}$, and train the model of the dequantization network by minimizing a loss between $\hat{I}_{deq}$ and the corresponding LDR image $I_{deq}$;

S02: extract edge and histogram features from the dequantized LDR image $\hat{I}_{deq}$ using a Sobel filter;

S03: input the extracted edge and histogram features into the linearization network, which generates K PCA weights to reconstruct the inverse camera response function; finally apply the inverse camera response function to obtain the linearized LDR image $\hat{I}_{lin}$; define a linearization network loss function $L_{lin}$ and an ICRF reconstruction loss function $L_{crf}$, and train the model of the linearization network by optimizing $L_{lin} + \lambda_{crf} L_{crf}$;

S04: add a ReLU layer at the end of the detail reconstruction network so that the network predicts positive residuals, and train the model of the detail reconstruction network by minimizing the loss function $L_{hal}$;

S05: jointly fine-tune the dequantization network, the linearization network and the detail reconstruction network together by minimizing the loss function $L_{total}$.
Further, in step S01, the dequantizing network adopts a 6-level U-Net network structure, each level of the U-Net network structure is composed of two convolution layers activated by the leakage ReLU function (α ═ 0.1), and the Tanh activation layer is used to normalize the result output by the dequantizing network into [ -1.0,1.0 ].
Further, in step S01, the loss function $L_{deq}$ of the dequantization network is

$$L_{deq} = \left\| \hat{I}_{deq} - I_{deq} \right\|_2^2$$

where $I$ is obtained from the input LDR image using dynamic range clipping and non-linear mapping.
Further, in step S02, extracting the edge and histogram features yields 6 feature maps $h$, each obtained by the following formula:

$$h_{i,j,c,b} = \max\left(0,\; 1 - \left| d_{i,j,c,b} \right| \cdot B \right)$$

where $i, j$ denote the pixel position in the horizontal and vertical directions, $c$ is the index of the color channel, $b \in \{1, \dots, B\}$ indexes the histogram bins, and $d_{i,j,c,b}$ is the intensity distance from the $(i,j)$-th pixel in channel $c$ to the center of the $b$-th bin.
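The sketch below shows one way to compute these features in PyTorch. The uniformly spaced bin centers and the standard Sobel kernels are assumptions; the patent only states that a Sobel filter and B histogram bins are used.

```python
import torch
import torch.nn.functional as F

def soft_histogram_features(img, num_bins=6):
    """Soft histogram maps h = max(0, 1 - |d| * B), where d is the distance
    from each pixel intensity to a bin center (centers assumed uniform in
    [0, 1]). img: (N, C, H, W) in [0, 1]; returns (N, C, H, W, B)."""
    B = num_bins
    centers = torch.linspace(0.5 / B, 1 - 0.5 / B, B,
                             dtype=img.dtype, device=img.device)
    d = img.unsqueeze(-1) - centers
    return torch.clamp(1.0 - d.abs() * B, min=0.0)

def sobel_edges(img):
    """Per-channel Sobel gradients in x and y: (N, C, H, W) -> (N, 2C, H, W)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
    k = torch.stack([kx, kx.t()]).unsqueeze(1).to(img)   # (2, 1, 3, 3)
    n, c, h, w = img.shape
    g = F.conv2d(img.reshape(n * c, 1, h, w), k, padding=1)
    return g.reshape(n, c * 2, h, w)
```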
Further, in step S03, ResNet-18 is used as the backbone network of the linearization network.
Further, in step S03, the linearization network loss function $L_{lin}$ is

$$L_{lin} = \left\| \hat{I}_{lin} - I_c \right\|_2^2$$

and the ICRF reconstruction loss function $L_{crf}$ is

$$L_{crf} = \left\| \hat{g} - g \right\|_2^2$$

where $I_c$ represents the linearized image obtained from the real HDR image by dynamic range clipping, $\hat{I}_{lin}$ represents the linearized image produced through the linearization network and the reconstructed inverse camera response function, $g$ represents the real inverse camera response function, and $\hat{g}$ represents the reconstructed inverse camera response function.
Further, in step S03, the weight $\lambda_{crf}$ is set to 0.1.
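In code, the combined objective of step S03 can be written as below; the mean-squared-error form of the two L2 losses is an assumption consistent with the formulas above.

```python
import torch

def linearization_objective(I_lin_hat, I_c, g_hat, g, lambda_crf=0.1):
    """Combined objective of step S03: L_lin + lambda_crf * L_crf."""
    L_lin = torch.mean((I_lin_hat - I_c) ** 2)   # image-space loss
    L_crf = torch.mean((g_hat - g) ** 2)         # ICRF loss on 1024-vectors
    return L_lin + lambda_crf * L_crf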
Further, in step S04, the loss function $L_{hal}$ is

$$L_{hal} = \left\| \log\left(\hat{H}\right) - \log\left(H\right) \right\|_2^2$$

where $\hat{H}$ is the detail-reconstructed HDR image and $H$ denotes the real HDR image.
Further, in step S05, the loss function $L_{total}$ is

$$L_{total} = \lambda_{deq} L_{deq} + \lambda_{lin} L_{lin} + \lambda_{crf} L_{crf} + \lambda_{hal} L_{hal}$$

where $\lambda_{deq} = 1$, $\lambda_{lin} = 10$, $\lambda_{crf} = 1$, and $\lambda_{hal} = 1$.
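Written out, the joint objective is a direct weighted sum; the helper below simply applies the weights given above as defaults.

```python
def total_objective(L_deq, L_lin, L_crf, L_hal,
                    lam_deq=1.0, lam_lin=10.0, lam_crf=1.0, lam_hal=1.0):
    """Weighted sum used for the joint fine-tuning of step S05."""
    return (lam_deq * L_deq + lam_lin * L_lin
            + lam_crf * L_crf + lam_hal * L_hal)
```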
By adopting the above technical scheme, the invention achieves the following notable technical effects:
three neural network models of a dequantization network, a linearization network and a detail reconstruction network are designed, and a training set is formed by utilizing an actual G display chromosome HDR image and a corresponding real LDR image, so that the training of the three neural network models is completed. And then, carrying out three steps of dequantization, linearization and detail reconstruction of an excessive exposure area on the real LDR image by using the trained model, and finally completing the task of reconstructing the G display chromosome HDR image. The reconstruction of the HDR image can be completed only through one LDR image without depending on the collection of image information from a plurality of LDR images, the HDR image is reconstructed by directly utilizing a deep learning method to carry out reverse reasoning on the forming process of a chromosome sample image, the G display chromosome feature is more obvious, the reconstructed G display chromosome image can display more details of chromosome forms and textures, the subsequent analysis is facilitated, the problem that chromosome artifacts are easily generated when a plurality of LDR images are synthesized into the HDR image is avoided, and the quality of the image is higher.
Given an input LDR image, the dequantization network recovers details lost to quantization; it reduces the quantization artifacts in the input LDR image and suppresses visual artifacts (e.g., banding) in underexposed regions, greatly enhancing the edge features and internal texture features of G-banding chromosomes in the image.
The purpose of linearization is to estimate the camera response function (CRF) so as to recover the linear illumination underlying a non-linear LDR image. Although each camera has a different CRF, every CRF is monotonically increasing and maps the minimum and maximum input values to the minimum and maximum output values, respectively. Because the CRF is a one-to-one mapping, the inverse camera response function (ICRF) is designed to have the same properties, and the non-linear LDR image is converted into a linear LDR image using the ICRF and the linearization network. Based on an empirical ICRF model, the linearization network can estimate a more accurate ICRF using cues from edges, intensity histograms, and the monotonically increasing constraint. Basis vectors are extracted from a set of real CRFs by principal component analysis (PCA) to build the empirical model; a deep learning model is trained to estimate the basis-vector weights from a single input image; the ICRF is reconstructed from the PCA weight parameters; and the constraints are imposed to refine the ICRF, which greatly improves the quality of the reconstructed HDR image. A sketch of this PCA-based ICRF reconstruction is given below.
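The following NumPy sketch illustrates the PCA-based empirical model: fitting basis vectors to a collection of real ICRFs discretized to 1024 samples, then reconstructing an ICRF from predicted weights under the monotonicity and endpoint constraints. The SVD-based fitting and the projection used to enforce the constraints are assumptions; the patent does not specify how the constraints are imposed.

```python
import numpy as np

def fit_icrf_basis(icrf_samples, K=11):
    """PCA over a set of real inverse CRFs, each a 1024-vector:
    returns the mean curve g0 and the top-K basis vectors."""
    g0 = icrf_samples.mean(axis=0)
    _, _, Vt = np.linalg.svd(icrf_samples - g0, full_matrices=False)
    return g0, Vt[:K]                        # (1024,), (K, 1024)

def reconstruct_icrf(weights, g0, basis):
    """g_hat = g0 + sum_k w_k * b_k, then projected onto the constraints
    described above: monotonically increasing, g(0) = 0, g(1) = 1."""
    g = g0 + weights @ basis
    g = np.maximum.accumulate(g)             # enforce monotonicity
    g = (g - g[0]) / (g[-1] - g[0] + 1e-8)   # pin endpoints to 0 and 1
    return g
```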
To recover the content lost to dynamic range clipping during G-banding chromosome imaging, a detail reconstruction network is trained, after the image has been dequantized and linearized, to predict the missing details in overexposed regions.
By explicitly modeling the inverse of each LDR image formation step, the difficulty of training a single network to reconstruct an HDR image is greatly reduced.
Finally, the whole model is fine-tuned jointly, starting from the previously trained sub-models, to reduce error accumulation and improve generalization to real input images.
The effectiveness of the method was evaluated experimentally on multiple G-banding chromosome image datasets and real HDR images. Extensive quantitative and qualitative evaluation and user studies show that the model outperforms other G-banding chromosome HDR image reconstruction algorithms.
Drawings
FIG. 1 shows the process of reconstructing a G-banding chromosome HDR image;
FIG. 2 is a flowchart of G-banding chromosome HDR image reconstruction;
FIG. 3 shows the structure of the dequantization network of an embodiment of the invention;
FIG. 4 shows the structure of the linearization network of an embodiment of the invention;
FIG. 5 shows the structure of the detail reconstruction network of an embodiment of the invention;
FIG. 6 shows an input G-banding chromosome LDR image I of an embodiment of the invention;
FIG. 7 shows the result of G-banding chromosome HDR image reconstruction.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Examples
A G-banding chromosome HDR image reconstruction method is disclosed, in which the LDR image is a G-banding chromosome LDR image and the HDR image is a G-banding chromosome HDR image. As shown in FIG. 1, the method comprises a step of training three sub-networks, a dequantization network, a linearization network and a detail reconstruction network, and a step of reconstructing the HDR image with the trained models.
The model training steps are as follows:
S01: the dequantization network adopts a 6-level U-Net structure in which each level consists of two convolution layers activated by a leaky ReLU function (α = 0.1); a Tanh activation layer normalizes the dequantization network output to [-1.0, 1.0]. Finally, the output of the dequantization network is fused onto the input LDR image $I$ (a real image) to generate the dequantized LDR image $\hat{I}_{deq}$, and the model of the dequantization network is trained by minimizing an L2 loss between $\hat{I}_{deq}$ and the corresponding real LDR image $I_{deq}$, where $I$ is obtained from the input real LDR image using dynamic range clipping and non-linear mapping. The loss function $L_{deq}$ of the dequantization network is as follows:

$$L_{deq} = \left\| \hat{I}_{deq} - I_{deq} \right\|_2^2$$

The pixel values of the LDR image generated after the non-linear mapping are quantized to 8 bits, which tends to underexpose the LDR image and introduce errors in smooth-gradient regions. These errors often appear as scattered noise or banding artifacts, especially where the gradient changes smoothly. Although this noise is barely visible in the non-linear LDR image, the tone mapping operation used to display HDR images often amplifies it into noticeable artifacts. Because dequantizing a digital image typically produces scattered noise or contour distortion in smooth regions, the dequantization network is trained to reduce the quantization artifacts in the input LDR image and the visual artifacts (e.g., banding) in underexposed regions, greatly enhancing the edge and internal texture features of G-banding chromosomes in the LDR image. The structure of the dequantization network is shown in FIG. 3. The following sketch illustrates the quantization step this network is trained to invert.
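For intuition, this small NumPy example simulates the forward 8-bit quantization described above: a smooth intensity ramp collapses onto 256 discrete levels, which is exactly the banding effect the dequantization network learns to remove.

```python
import numpy as np

def quantize_8bit(img):
    """Simulate the 8-bit quantization of the LDR pipeline: a float image
    in [0, 1] is rounded to 256 levels, introducing banding in
    smooth-gradient regions."""
    return np.round(img * 255.0) / 255.0

# A smooth horizontal ramp shows the effect: after quantization it
# contains only 256 distinct values, i.e. visible "bands".
ramp = np.linspace(0.0, 1.0, 4096)[None, :].repeat(64, axis=0)
banded = quantize_8bit(ramp)
print(np.unique(banded).size)   # 256 distinct levels
```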
S02: by setting 1024 pairs at [0,1 ]]The pixel points in between are uniformly sampled to discretize the designed Inverse Camera Response Function (ICRF), so that the function can be represented by a vector g with the size of 1024, and the vector g can be approximately equal to a K linear combination PCA bias vector, wherein K is 11. Using Sobel filter from dequantized LDR image
Figure BDA0002790329940000061
Extracting edge and histogram features from the image (the image is a nonlinear LDR image) to obtain 6 feature maps h, wherein each feature map is obtained by the following formula, i and j respectively represent the positions of pixels in the horizontal direction and the vertical direction, c represents the index of a color channel, B belongs to {1, ·, B } represents a histogram pixel, and d is the intensity distance from the (i, j) th pixel point to the center of the B th pixel point in the c channel on the image.
Figure BDA0002790329940000062
Figure BDA0002790329940000063
S03: ResNet-18 is adopted as the backbone of the designed linearization network; the structure of the linearization network is shown in FIG. 4. The extracted edge and histogram features are input into the linearization network; a global average pooling layer is added after the last convolution layer to extract global features, and two fully connected layers then generate the K PCA weights used to reconstruct the inverse camera response function. The linearization network loss function $L_{lin}$ and the ICRF reconstruction loss function $L_{crf}$ are defined as follows, where $I_c$ represents the linearized image obtained from the real HDR image by dynamic range clipping, $\hat{I}_{lin}$ represents the linearized image produced through the linearization network and the reconstructed inverse camera response function, $g$ represents the real inverse camera response function, and $\hat{g}$ represents the reconstructed inverse camera response function:

$$L_{lin} = \left\| \hat{I}_{lin} - I_c \right\|_2^2$$

$$L_{crf} = \left\| \hat{g} - g \right\|_2^2$$

The model of the designed linearization network is trained by optimizing $L_{lin} + \lambda_{crf} L_{crf}$, with the weight $\lambda_{crf}$ set to 0.1. A sketch of this network head follows.
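A minimal PyTorch sketch of such a linearization head: a ResNet-18 backbone whose global-average-pooled output feeds two fully connected layers emitting K = 11 PCA weights. The input channel count (image plus Sobel and histogram feature maps) and the hidden width are assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class LinearizationNet(nn.Module):
    """ResNet-18 backbone, global average pooling, then two fully
    connected layers that output K = 11 PCA weights. The default input
    of 27 channels (3 image + 6 Sobel + 18 histogram maps) matches the
    feature sketches above and is an assumption."""
    def __init__(self, in_ch=27, K=11, hidden=256):
        super().__init__()
        net = resnet18(weights=None)
        # adapt the first convolution to the stacked feature input
        net.conv1 = nn.Conv2d(in_ch, 64, 7, stride=2, padding=3, bias=False)
        net.fc = nn.Identity()            # keep the 512-d pooled feature
        self.backbone = net
        self.head = nn.Sequential(
            nn.Linear(512, hidden), nn.ReLU(), nn.Linear(hidden, K))

    def forward(self, feats):
        return self.head(self.backbone(feats))   # (N, K) PCA weights
```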
S04: training details reconstruction network (by C)-1(.) to pre-do soMeasuring the missing details in the overexposed area, the detail reconstruction network is mainly designed by adopting a codec architecture of jump connection, and the structure of the detail reconstruction network is shown in fig. 5. Since the missing values in the overexposed region should always be larger than the existing pixel values, a ReLU layer is added at the end of the detail reconstruction network to build the neural network and predict the positive residual. Detail reconstructed HDR image
Figure BDA0002790329940000071
As shown in the following formula, wherein
Figure BDA0002790329940000072
Representing a linearized image generated via a linearized network, alpha represents a soft mask map of the overexposed portion, and by smoothly blending the residual with the existing pixel values around the overexposed region, by minimizing the log-L2 loss function (loss function L)hal) To train the model of the designed detailed reconstruction network, the loss function LhalAs shown below, where H denotes a real HDR image.
Figure BDA0002790329940000073
Figure BDA0002790329940000074
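The blending and loss of this step can be sketched as follows; the saturation threshold τ used to form the soft mask α and the ε guarding the logarithm are assumptions, since the patent only names a soft mask of the overexposed part and a log-L2 loss.

```python
import torch

def blend_hdr(I_lin_hat, residual, tau=0.95):
    """Smoothly fuse the positive residual into the overexposed part of
    the linearized image. The soft mask alpha rises from 0 to 1 as the
    maximum channel value approaches saturation (tau is an assumption)."""
    peak, _ = I_lin_hat.max(dim=1, keepdim=True)            # (N, 1, H, W)
    alpha = torch.clamp((peak - tau) / (1.0 - tau), 0.0, 1.0)
    return I_lin_hat + alpha * residual

def hal_loss(H_hat, H, eps=1e-6):
    """log-L2 loss between reconstructed and real HDR images; eps guards
    the logarithm and is an assumption."""
    return torch.mean((torch.log(H_hat + eps) - torch.log(H + eps)) ** 2)
```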
S05: and constructing a training set by using the G display chromosome LDR image and corresponding real data, and respectively and independently training three sub-networks of a dequantization network, a linearization network and a detail reconstruction network. After convergence of the three sub-networks, finally by minimizing the loss function LtotalTo make combined fine-tuning of the entire deep neural network together. Wherein the loss function LtotalAs follows. Setting of lambdadep=1,λlin=10,λcrf=1,λhalAnd (1) reducing the accumulation of errors among sub-networks by joint training, and further improving the performance of the G-banding chromosome HDR image reconstruction model. By the method toThe training model can greatly reduce the difficulty of model convergence, and can improve the accuracy of reconstruction of the G display band chromosome HDR image after the model convergence.
Ltotal=λdeqLdeqlinLlincrfLcrfhalLhal
Referring to FIG. 1 and FIG. 2, HDR reconstruction of a G-banding chromosome LDR image with the trained models comprises the following steps:

S1: first, read a real LDR image $I$ and apply the trained dequantization network to reduce the artifacts produced by quantization, generating the dequantized LDR image $\hat{I}_{deq}$; part of an input G-banding chromosome LDR image $I$ is shown in FIG. 6;

S2: extract edge and histogram features from the dequantized LDR image $\hat{I}_{deq}$ (a non-linear image) using a Sobel filter, input the extracted features into the trained linearization network to predict the K principal component analysis (PCA) weights, construct the inverse camera response function (ICRF), and finally apply the inverse camera response function to obtain the linearized LDR image $\hat{I}_{lin}$;

S3: apply the trained detail reconstruction network to reconstruct the details of the overexposed part of the linearized LDR image $\hat{I}_{lin}$, smoothly fuse the residual image predicted by the detail reconstruction network with $\hat{I}_{lin}$ using a soft mask of the overexposed part, and finally obtain the reconstructed HDR image $\hat{H}$, completing the reconstruction of the G-banding chromosome HDR image; part of the reconstructed G-banding chromosome HDR image $\hat{H}$ is shown in FIG. 7.

The full inference chain is sketched below.
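Putting the three stages together, inference on a single LDR image can be sketched as follows; all function and model names refer to the illustrative sketches above, and the table-lookup application of the 1024-sample ICRF is an assumption.

```python
import torch

@torch.no_grad()
def reconstruct_hdr(I, deq_net, lin_net, hal_net, g0, basis):
    """S1-S3 chained on a single LDR image I of shape (N, 3, H, W) in
    [0, 1]. deq_net, lin_net, hal_net, sobel_edges,
    soft_histogram_features, reconstruct_icrf, blend_hdr, g0 and basis
    all refer to the sketches above; their interfaces are assumptions."""
    # S1: dequantization
    I_deq = deq_net(I)

    # S2: edge + histogram features -> K PCA weights -> ICRF
    h = soft_histogram_features(I_deq)                    # (N, C, H, W, B)
    n, c, hh, ww, b = h.shape
    h = h.permute(0, 1, 4, 2, 3).reshape(n, c * b, hh, ww)
    feats = torch.cat([I_deq, sobel_edges(I_deq), h], dim=1)
    w = lin_net(feats)[0]                                 # (K,) weights
    g = torch.from_numpy(
        reconstruct_icrf(w.cpu().numpy(), g0, basis)).to(I)
    # apply the ICRF by table lookup on 1024 discrete levels
    idx = (I_deq.clamp(0, 1) * 1023).long()
    I_lin = g[idx]

    # S3: positive residual, blended with a soft overexposure mask
    residual = hal_net(I_lin)
    return blend_hdr(I_lin, residual)
```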
In this embodiment, three neural network models, a dequantization network, a linearization network and a detail reconstruction network, are designed, and a training set formed from actual G-banding chromosome HDR images and their corresponding real LDR images is used to complete the training of all three models. The trained models then apply dequantization, linearization, and detail reconstruction of overexposed regions to a real LDR image, finally completing the G-banding chromosome HDR image reconstruction task. Because this design reconstructs the HDR image from only a single LDR image, the G-banding chromosome features become more distinct, subsequent analysis is easier, the reconstructed HDR image is free of ghosting artifacts, and image quality is higher.
In summary, the above embodiment is only a preferred embodiment of the present invention, and all equivalent changes and modifications made within the scope of the claims of the present invention shall be covered by the claims of the present invention.

Claims (10)

  1. A G-banding chromosome HDR image reconstruction method, characterized by comprising steps of training three models, namely a dequantization network, a linearization network and a detail reconstruction network, and a step of reconstructing an HDR image using the trained models; the steps of reconstructing the HDR image using the trained models are as follows:
    S1: first, read a real LDR image $I$ and apply the trained dequantization network to reduce the artifacts produced by quantization, generating a dequantized LDR image $\hat{I}_{deq}$;
    S2: extract edge and histogram features from the dequantized LDR image $\hat{I}_{deq}$ using a Sobel filter, input the extracted edge and histogram features into the trained linearization network to predict K principal component analysis weights, construct the inverse camera response function, and finally apply the inverse camera response function to obtain a linearized LDR image $\hat{I}_{lin}$;
    S3: apply the trained detail reconstruction network to reconstruct the details of the overexposed part of the linearized LDR image $\hat{I}_{lin}$, smoothly fuse the residual image predicted by the detail reconstruction network with the linearized LDR image $\hat{I}_{lin}$ using a soft mask of the overexposed part, and finally obtain the reconstructed HDR image $\hat{H}$, completing the reconstruction of the G-banding chromosome HDR image.
  2. The G-banding chromosome HDR image reconstruction method according to claim 1, characterized in that the training steps of the three models, the dequantization network, the linearization network and the detail reconstruction network, are as follows:
    S01: fuse the output of the dequantization network onto the input LDR image $I$ to generate a dequantized LDR image $\hat{I}_{deq}$, and train the model of the dequantization network by minimizing a loss between $\hat{I}_{deq}$ and the corresponding LDR image $I_{deq}$;
    S02: extract edge and histogram features from the dequantized LDR image $\hat{I}_{deq}$ using a Sobel filter;
    S03: input the extracted edge and histogram features into the linearization network, which generates K PCA weights to reconstruct the inverse camera response function; finally apply the inverse camera response function to obtain the linearized LDR image $\hat{I}_{lin}$; define a linearization network loss function $L_{lin}$ and an ICRF reconstruction loss function $L_{crf}$, and train the model of the linearization network by optimizing $L_{lin} + \lambda_{crf} L_{crf}$;
    S04: add a ReLU layer at the end of the detail reconstruction network so that the network predicts positive residuals, and train the model of the detail reconstruction network by minimizing the loss function $L_{hal}$;
    S05: jointly fine-tune the dequantization network, the linearization network and the detail reconstruction network together by minimizing the loss function $L_{total}$.
  3. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S01, the dequantization network adopts a 6-level U-Net structure, each level of which consists of two convolution layers activated by a leaky ReLU function (α = 0.1), and a Tanh activation layer normalizes the dequantization network output to [-1.0, 1.0].
  4. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S01, the loss function $L_{deq}$ of the dequantization network is $L_{deq} = \left\| \hat{I}_{deq} - I_{deq} \right\|_2^2$, where $I$ is obtained from the input LDR image using dynamic range clipping and non-linear mapping.
  5. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S02, extracting the edge and histogram features yields 6 feature maps $h$, each obtained by the formula $h_{i,j,c,b} = \max\left(0,\, 1 - \left| d_{i,j,c,b} \right| \cdot B \right)$, where $i, j$ denote the pixel position in the horizontal and vertical directions, $c$ is the index of the color channel, $b \in \{1, \dots, B\}$ indexes the histogram bins, and $d_{i,j,c,b}$ is the intensity distance from the $(i,j)$-th pixel in channel $c$ to the center of the $b$-th bin.
  6. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S03, ResNet-18 is used as the backbone network of the linearization network.
  7. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S03, the linearization network loss function $L_{lin}$ is $L_{lin} = \left\| \hat{I}_{lin} - I_c \right\|_2^2$ and the ICRF reconstruction loss function $L_{crf}$ is $L_{crf} = \left\| \hat{g} - g \right\|_2^2$, where $I_c$ represents the linearized image obtained from the real HDR image by dynamic range clipping, $g$ represents the real inverse camera response function, and $\hat{g}$ represents the reconstructed inverse camera response function.
  8. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S03, the weight $\lambda_{crf}$ is set to 0.1.
  9. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S04, the loss function $L_{hal}$ is $L_{hal} = \left\| \log\left(\hat{H}\right) - \log\left(H\right) \right\|_2^2$, where $\hat{H}$ is the detail-reconstructed HDR image and $H$ denotes the real HDR image.
  10. The G-banding chromosome HDR image reconstruction method according to claim 2, characterized in that in step S05, the loss function $L_{total}$ is $L_{total} = \lambda_{deq} L_{deq} + \lambda_{lin} L_{lin} + \lambda_{crf} L_{crf} + \lambda_{hal} L_{hal}$, where $\lambda_{deq} = 1$, $\lambda_{lin} = 10$, $\lambda_{crf} = 1$, and $\lambda_{hal} = 1$.
CN202011312768.4A 2020-11-20 2020-11-20 G banding chromosome HDR image reconstruction method Withdrawn CN112435306A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011312768.4A CN112435306A (en) 2020-11-20 2020-11-20 G banding chromosome HDR image reconstruction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011312768.4A CN112435306A (en) 2020-11-20 2020-11-20 G banding chromosome HDR image reconstruction method

Publications (1)

Publication Number Publication Date
CN112435306A true CN112435306A (en) 2021-03-02

Family

ID=74694596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011312768.4A Withdrawn CN112435306A (en) 2020-11-20 2020-11-20 G banding chromosome HDR image reconstruction method

Country Status (1)

Country Link
CN (1) CN112435306A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344773A (en) * 2021-06-02 2021-09-03 电子科技大学 Single picture reconstruction HDR method based on multi-level dual feedback
CN113808661A (en) * 2021-09-18 2021-12-17 山东财经大学 Chromosome three-dimensional structure reconstruction method and device
US20220210416A1 (en) * 2020-12-29 2022-06-30 Tencent America LLC End-to-end neural compression with deep reinforcement learning
CN115297254A (en) * 2022-07-04 2022-11-04 北京航空航天大学 Portable high-dynamic imaging fusion system under high-radiation condition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180260942A1 (en) * 2017-03-09 2018-09-13 Thomson Licensing Method for inverse tone mapping of an image with visual effects
CN109447907A (en) * 2018-09-20 2019-03-08 宁波大学 A kind of single image Enhancement Method based on full convolutional neural networks
CN111242883A (en) * 2020-01-10 2020-06-05 西安电子科技大学 Dynamic scene HDR reconstruction method based on deep learning
CN111292264A (en) * 2020-01-21 2020-06-16 武汉大学 Image high dynamic range reconstruction method based on deep learning

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180260942A1 (en) * 2017-03-09 2018-09-13 Thomson Licensing Method for inverse tone mapping of an image with visual effects
CN109447907A (en) * 2018-09-20 2019-03-08 宁波大学 A kind of single image Enhancement Method based on full convolutional neural networks
CN111242883A (en) * 2020-01-10 2020-06-05 西安电子科技大学 Dynamic scene HDR reconstruction method based on deep learning
CN111292264A (en) * 2020-01-21 2020-06-16 武汉大学 Image high dynamic range reconstruction method based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yu-Lun Liu et al: "Single-Image HDR Reconstruction by Learning to Reverse the Camera Pipeline", arXiv:2004.01179v1 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220210416A1 (en) * 2020-12-29 2022-06-30 Tencent America LLC End-to-end neural compression with deep reinforcement learning
US11917154B2 (en) * 2020-12-29 2024-02-27 Tencent America LLC End-to-end neural compression with deep reinforcement learning
CN113344773A (en) * 2021-06-02 2021-09-03 电子科技大学 Single picture reconstruction HDR method based on multi-level dual feedback
CN113344773B (en) * 2021-06-02 2022-05-06 电子科技大学 Single picture reconstruction HDR method based on multi-level dual feedback
CN113808661A (en) * 2021-09-18 2021-12-17 山东财经大学 Chromosome three-dimensional structure reconstruction method and device
CN115297254A (en) * 2022-07-04 2022-11-04 北京航空航天大学 Portable high-dynamic imaging fusion system under high-radiation condition
CN115297254B (en) * 2022-07-04 2024-03-29 北京航空航天大学 Portable high dynamic imaging fusion system under high radiation condition

Similar Documents

Publication Publication Date Title
CN112435306A (en) G banding chromosome HDR image reconstruction method
Brooks et al. Unprocessing images for learned raw denoising
CN112734650B (en) Virtual multi-exposure fusion based uneven illumination image enhancement method
Liang et al. Cameranet: A two-stage framework for effective camera isp learning
Faraji et al. CCD noise removal in digital images
CN113822830B (en) Multi-exposure image fusion method based on depth perception enhancement
WO2022000397A1 (en) Low-illumination image enhancement method and apparatus, and computer device
Huang et al. Towards low light enhancement with raw images
WO2022022494A1 (en) Cbd-net-based medical endoscopic image denoising method
CN111932471B (en) Double-path exposure degree fusion network model and method for low-illumination image enhancement
CN113096029A (en) High dynamic range image generation method based on multi-branch codec neural network
CN111105376B (en) Single-exposure high-dynamic-range image generation method based on double-branch neural network
CN110225260B (en) Three-dimensional high dynamic range imaging method based on generation countermeasure network
CN114862698B (en) Channel-guided real overexposure image correction method and device
CN114998141B (en) Space environment high dynamic range imaging method based on multi-branch network
Karadeniz et al. Burst photography for learning to enhance extremely dark images
Feng et al. Low-light image enhancement algorithm based on an atmospheric physical model
CN116563183A (en) High dynamic range image reconstruction method and system based on single RAW image
Tan et al. Low-light image enhancement with geometrical sparse representation
Ye et al. Single exposure high dynamic range image reconstruction based on deep dual-branch network
CN116433548A (en) Hyperspectral and panchromatic image fusion method based on multistage information extraction
Siddiqui et al. Hierarchical color correction for camera cell phone images
CN114638764B (en) Multi-exposure image fusion method and system based on artificial intelligence
CN116563133A (en) Low-illumination color image enhancement method based on simulated exposure and multi-scale fusion
CN115147311A (en) Image enhancement method based on HSV and AM-RetinexNet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210302