CN117876263B - Astronomical image processing method and device - Google Patents


Info

Publication number
CN117876263B (application CN202410287648.5A)
Authority
CN
China
Prior art keywords
astronomical, image, training, image processing, processing model
Prior art date
Legal status (assumption, not a legal conclusion)
Active
Application number
CN202410287648.5A
Other languages
Chinese (zh)
Other versions
CN117876263A (en)
Inventor
倪书磊
邱逸盛
陈云川
宋子豪
陈昊
蒋雪健
陈华曦
Current Assignee (listed assignees may be inaccurate)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Zhejiang Lab
Priority to CN202410287648.5A
Publication of CN117876263A
Application granted
Publication of CN117876263B
Legal status: Active

Classifications

    • G06N3/0455 Auto-encoder networks; encoder-decoder networks
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06N3/0475 Generative networks
    • G06N3/094 Adversarial learning
    • G06T2207/20081 Training; learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The specification discloses an astronomical image processing method and device, wherein an observation image obtained by an astronomical observation device such as an astronomical telescope is used as a training sample. A deep learning network is constructed based on prior physical information of the astronomical observation device, and the astronomical image processing model is trained using the training samples and the deep learning network. The method inputs an image to be processed into the astronomical image processing model to obtain a corresponding deconvolution image. Through this scheme, the prior physical information of the astronomical observation device is introduced into the training process of the astronomical image processing model. In this way, the difficulty of the deconvolution task for the beam effect or point spread function of the astronomical observation device is significantly reduced, and the efficiency of astronomical image processing is improved.

Description

Astronomical image processing method and device
Technical Field
The present disclosure relates to the field of astronomical computer vision, and in particular, to a method and apparatus for astronomical image processing.
Background
Astronomical image deconvolution is a complex and inherently ill-posed problem in the astronomical observation field. It has been studied extensively for decades and is considered a classical problem in computational inverse imaging; it has also attracted considerable attention in the field of image processing. In radio astronomy, eliminating the beam effect produced by a telescope is critical to obtaining accurate and precise images. The inherent beam effects of astronomical telescopes inevitably distort the measured data or the image of the observed object. These effects may come from imperfections in the telescope imaging system. The presence of beam effects can cause blurring or spatial warping of the image, severely affecting its sharpness and resolution. Therefore, it is increasingly important to eliminate these effects effectively and improve image quality, enabling scientists to conduct detailed studies of the observed objects.
When solving the deconvolution problem, conventional algorithms typically seek the optimal solution by inferring the convolution kernel. The deconvolution task becomes more difficult if the provided convolution kernel is complex or the measurement is inaccurate. Some deconvolution methods attempt to constrain the solution by introducing various image priors, such as dark channel priors and gradient priors. However, these methods have limited ability to accurately model sharp image features and generate artifact-free output.
With the rapid development and application of deep learning methods in astronomy, they have become effective tools for solving such inverse problems. For example, convolutional neural networks (CNNs) have been widely studied for image deblurring, and many CNN-based methods have been explored. However, current deep learning methods depend heavily on the scale of the supervised dataset; if a sufficiently large supervised dataset cannot be acquired to train an image processing model for deconvolution, the efficiency and accuracy of astronomical image deconvolution are greatly reduced.
Disclosure of Invention
The present disclosure provides a method and apparatus for astronomical image processing, so as to partially solve the above-mentioned problems in the prior art.
The technical scheme adopted in the specification is as follows:
The specification provides an astronomical image processing method, comprising the following steps:
acquiring astronomical observation images as training samples through astronomical observation equipment;
Acquiring prior physical information of the astronomical observation equipment, and constructing a deep learning network corresponding to the astronomical observation equipment according to the prior physical information of the astronomical observation equipment;
Training an astronomical image processing model to be trained according to the training sample and a deep learning network corresponding to astronomical observation equipment to obtain a trained astronomical image processing model;
when an image processing request is received, inputting the image to be processed acquired through the astronomical observation equipment into the astronomical image processing model after training is completed, and obtaining a deconvolution image corresponding to the image to be processed.
Optionally, training the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation device, which specifically includes:
Inputting the training sample into an astronomical image processing model to be trained to obtain an intermediate prediction image corresponding to the training sample;
inputting the intermediate predicted image corresponding to the training sample into a deep learning network corresponding to astronomical observation equipment to obtain a predicted image corresponding to the training sample;
And training the astronomical image processing model to be trained by taking the minimization of the difference between the training sample and the predicted image as a training target, so as to obtain the astronomical image processing model after training.
Optionally, inputting the intermediate predicted image corresponding to the training sample to a deep learning network corresponding to astronomical observation equipment to obtain a predicted image corresponding to the training sample, which specifically includes:
determining a convolution kernel of a deep learning network corresponding to astronomical observation equipment according to prior physical information of the astronomical observation equipment;
and carrying out convolution operation on the intermediate predicted image corresponding to the training sample and the convolution kernel of the deep learning network corresponding to the astronomical observation equipment through a fast Fourier transform algorithm to obtain the predicted image corresponding to the training sample.
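The FFT-based convolution step can be sketched as follows. This is a minimal illustration of the convolution theorem with circular (periodic) boundaries; the function name `fft_convolve2d` and the assumption that the kernel is already zero-padded to the image size and origin-centred are mine, not specified in the patent.

```python
import numpy as np

def fft_convolve2d(image, kernel):
    """Circular 2-D convolution via the FFT convolution theorem:
    a pointwise product in the frequency domain replaces the O(N^2)
    spatial convolution, as a fixed (non-trainable) layer would do."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)))
```

A delta kernel at the origin leaves the image unchanged, and a shifted delta rolls the image, which is a quick sanity check that the frequency-domain product really implements convolution.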
Optionally, training the astronomical image processing model to be trained with minimization of the difference between the training sample and the predicted image as the training target specifically includes:
obtaining loss according to the difference between the training sample and the predicted image and a logarithmic hyperbolic cosine function;
And training the astronomical image processing model to be trained by taking the minimization of the loss as a training target.
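The logarithmic hyperbolic cosine loss mentioned above can be sketched as follows, using the numerically stable identity log(cosh(d)) = |d| + log1p(exp(-2|d|)) - log(2) to avoid overflow for large residuals; the function name is illustrative.

```python
import numpy as np

def log_cosh_loss(pred, target):
    """Mean log-cosh of the residual: quadratic near zero (like L2),
    linear for large residuals (like L1), hence robust to outliers."""
    d = np.abs(pred - target)
    return np.mean(d + np.log1p(np.exp(-2.0 * d)) - np.log(2.0))
```

The quadratic-near-zero, linear-in-the-tails shape is a plausible reason to prefer log-cosh here: bright artifacts or cosmic-ray-like outliers in an observation do not dominate the training signal the way they would under a squared error.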
Optionally, the astronomical image processing model is constructed from any one of the following regression networks: an autoencoder, a generative adversarial network, U-Net, or Vision Transformer.
Optionally, the a priori physical information of the astronomical observation device includes information of beam effects or information of a point spread function during imaging of the astronomical observation device.
Optionally, the method further comprises:
acquiring an astronomical dataset comprising a plurality of reference deconvoluted images and a plurality of reference physical information;
sequentially aiming at each reference deconvolution image in the astronomical data set, and constructing each test pair according to the reference deconvolution image and any one of a plurality of reference physical information included in the astronomical data set;
For each test pair, determining a simulated observation image of the test pair according to the reference deconvolution image of the test pair and the reference physical information of the test pair;
Inputting the simulated observation image of the test pair into the astronomical image processing model after training to obtain a predicted deconvolution image corresponding to the simulated observation image of the test pair;
If the difference between the predicted deconvolution image corresponding to the simulated observation image of the test pair and the reference deconvolution image of the test pair is smaller than a preset difference threshold value, establishing a corresponding relation between the reference deconvolution image of the test pair and the reference physical information of the test pair.
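The test-pair procedure above can be sketched as follows. This is a simplified stand-in, not the patent's implementation: `model` is any callable playing the role of the trained astronomical image processing model, and `fft_conv`, `build_correspondences`, and the mean-absolute-difference threshold are my illustrative choices.

```python
import numpy as np

def fft_conv(img, kernel):
    # circular 2-D convolution via FFT (kernel padded to image size)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))

def build_correspondences(ref_images, ref_psfs, model, threshold):
    """For each (reference deconvolution image, candidate PSF) test pair,
    simulate an observation, deconvolve it with the model, and record
    the pair when the prediction is close enough to the reference."""
    pairs = []
    for i, x_ref in enumerate(ref_images):
        for j, k in enumerate(ref_psfs):
            y_sim = fft_conv(x_ref, k)   # simulated observation
            x_pred = model(y_sim)        # trained-model stand-in
            if np.mean(np.abs(x_pred - x_ref)) < threshold:
                pairs.append((i, j))
    return pairs
```

With an identity stand-in for the model, only the PSF that barely blurs the reference survives the threshold, which mirrors how the procedure associates each reference image with compatible physical information.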
Optionally, before inputting the image to be processed acquired by the astronomical observation device into the trained astronomical image processing model, the method further includes:
Acquiring an astronomical verification image as a verification sample through the astronomical observation equipment;
Inputting the verification sample into the trained astronomical image processing model to obtain a deconvolution image corresponding to the verification sample;
Inputting the deconvolution image corresponding to the verification sample into a deep learning network corresponding to the astronomical observation equipment to obtain a convolution image corresponding to the verification sample;
and evaluating the astronomical image processing model after training according to the verification sample and the difference between the convolution images corresponding to the verification sample.
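The verification step above can be sketched as a re-convolution check. Again the names (`fft_conv`, `evaluate_model`) and the mean-absolute-error metric are my assumptions; `model` is any callable standing in for the trained astronomical image processing model.

```python
import numpy as np

def fft_conv(img, kernel):
    # circular 2-D convolution via FFT (kernel padded to image size)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))

def evaluate_model(model, verification_samples, psf):
    """Self-supervised evaluation: deconvolve each verification sample
    with the model, re-convolve the result with the device PSF, and
    report the mean absolute difference from the original sample."""
    errors = []
    for v in verification_samples:
        reconvolved = fft_conv(model(v), psf)
        errors.append(np.mean(np.abs(reconvolved - v)))
    return float(np.mean(errors))
```

Note this metric needs no ground-truth sharp images, consistent with the method's self-supervised design: a good model's deconvolution, once re-blurred by the device PSF, should reproduce the observation.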
Optionally, the method further comprises:
preprocessing an image acquired by the astronomical observation device, wherein the preprocessing comprises denoising, artifact removal and image contrast enhancement.
The present specification provides an astronomical image processing device, including:
the training sample determining module is used for acquiring astronomical observation images through astronomical observation equipment to serve as training samples;
the deep learning network determining module is used for acquiring prior physical information of the astronomical observation equipment and constructing a deep learning network corresponding to the astronomical observation equipment according to the prior physical information of the astronomical observation equipment;
The training module is used for training the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation equipment to obtain a trained astronomical image processing model;
And the image processing module is used for inputting the image to be processed acquired by the astronomical observation equipment into the astronomical image processing model after training is completed when an image processing request is received, and obtaining a deconvolution image corresponding to the image to be processed.
At least one of the technical solutions adopted in the present specification can achieve the following beneficial effects:
In the astronomical image processing method provided in this specification, astronomical observation images acquired by an astronomical observation device are used as training samples, a deep learning network is constructed from the prior physical information of the observation device, an astronomical image processing model is trained from the training samples and the deep learning network, and an image to be processed acquired by the astronomical observation device is input into the astronomical image processing model to obtain the corresponding deconvolution image. Through this scheme, the prior physical information of the astronomical observation device is introduced into the training process of the astronomical image processing model, which greatly reduces the difficulty of the deconvolution task for the beam effect or point spread function of the astronomical observation device and improves the efficiency of astronomical image processing.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate exemplary embodiments of the present specification and, together with their description, serve to explain the specification without unduly limiting it. In the drawings:
FIG. 1 is a schematic flow chart of an astronomical image processing method in the present specification;
FIG. 2 is a schematic illustration of an astronomical image according to the present disclosure;
FIG. 3 is a schematic diagram of a training process of astronomical image processing models in the present specification;
FIG. 4 is a schematic flow chart of an astronomical image processing method in the present specification;
FIG. 5 is a schematic flow chart of an astronomical image processing method in the present specification;
FIG. 6 is a schematic diagram of an astronomical image processing device provided in the present specification;
Fig. 7 is a schematic view of the electronic device corresponding to fig. 1 provided in the present specification.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present specification more apparent, the technical solutions of the present specification will be clearly and completely described below with reference to specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
In addition, all the actions for acquiring signals, information or data in the present specification are performed under the condition of conforming to the corresponding data protection rule policy of the place and obtaining the authorization given by the corresponding device owner.
The features of the following examples and embodiments may be combined with each other without any conflict.
As previously mentioned, in the field of radio astronomy, the elimination of beam effects produced by astronomical telescopes is critical to achieving accurate and precise images.
Side lobes accompany the point spread function (PSF) or beam effect of almost all telescopes, especially interferometer-based telescopes. These side lobes result from factors such as optical aberrations, diffraction, and imperfections in the imaging system. Side lobes can present challenges in image processing and analysis tasks, as they introduce undesirable artifacts that affect the overall quality of the image. For example, in deconvolution algorithms, side lobes may be erroneously identified as true image features during reconstruction, resulting in errors and loss of detail.
The inherent beam effects of astronomical telescopes inevitably distort the measured data or the image of the observed object. These effects may come from imperfections in the telescope imaging system or atmospheric conditions. The presence of beam effects can cause blurring or spatial warping of the image, severely affecting the sharpness and resolution of the image.
Given that a single input image may correspond to multiple potentially sharp images, image deconvolution is difficult and tricky. Therefore, it is increasingly important to eliminate these effects effectively and improve image quality, enabling scientists to conduct detailed studies of the observed objects. It should be appreciated that side lobes cannot be completely eliminated, because they are an inherent feature of the imaging system and the image formation process. However, through careful algorithm design and calibration, the effect of side lobes can be greatly reduced, improving image quality and yielding more accurate analysis results.
When solving the deconvolution problem, conventional algorithms typically seek the optimal solution by inferring the convolution kernel. Image deconvolution amounts to solving a highly nonlinear and ill-determined optimization problem, which makes successful deconvolution extremely challenging. The task becomes more difficult if the provided convolution kernel is complex or the measurement is inaccurate. Some deconvolution methods attempt to constrain the solution by introducing various image priors, such as dark channel priors and gradient priors. However, these methods have limited ability to accurately model sharp image features and generate artifact-free output. With the rapid development and application of deep learning methods in astronomy, they have become effective tools for solving such inverse problems because they can handle nonlinearities and large amounts of data. For example, convolutional neural networks (CNNs) have been widely studied for image deblurring, and many CNN-based methods have been explored. Various studies have established links between traditional optimization-based approaches and neural network architectures. Furthermore, new separable structures have been introduced as a reliable means of supporting powerful artifact-free deconvolution, and supervised networks have subsequently been established. While these approaches produce impressive results, they rely heavily on supervised datasets and on deeper and wider architectures to improve performance, which poses challenges for their use in practical applications.
Based on the above, the description provides an astronomical image processing method, which introduces prior physical information of astronomical observation equipment into the training process of an astronomical image processing model, so that the difficulty of deconvolution tasks aiming at beam effects or point spread functions of the astronomical observation equipment is greatly relieved, and the astronomical image processing efficiency is improved.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an astronomical image processing method provided in the present specification.
S100: and acquiring astronomical observation images serving as training samples through astronomical observation equipment.
The astronomical image processing method provided in the embodiments of this specification can be executed by an electronic device such as a server used for image processing. In addition, the method involves a trained astronomical image processing model; the electronic device that trains this model and the electronic device that executes the method may be the same or different, and this specification does not limit this.
In actual astronomical observation, the captured image is the original image convolved at the pixel level with the point spread function (PSF) or beam effect, resulting in blurring of the image of the astronomical object captured by the astronomical observation device. As shown in fig. 2, the left image is the original image of the celestial object, the middle image is the PSF or beam effect of the astronomical observation device, and the right image is the blurred image actually captured by the astronomical observation device. The specific formula can be as follows:
y = x * k
wherein y is the blurred image captured by the astronomical observation device, x is the corresponding original (sharp) image, k is the PSF or beam effect of the astronomical observation device, and * denotes two-dimensional convolution.
As can be seen, the astronomical observation image captured by the astronomical observation device is actually the result of convolving the corresponding original image with the PSF or beam of the astronomical observation device; in order to improve the clarity of the astronomical image, a deconvolution operation is required on the captured image. In this specification, in order to effectively eliminate the influence of the PSF and beam distortion, an astronomical image processing model is trained to deconvolve astronomical images. To train this model, corresponding training samples are first acquired.
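The forward model above can be illustrated numerically. The following toy sketch uses a 1-D signal rather than a real 2-D astronomical image, and the point source and beam values are invented for illustration: convolving a sharp point source with a normalized beam spreads its flux and lowers its peak, which is exactly the blurring that the deconvolution model must undo.

```python
import numpy as np

# 1-D illustration of the forward model y = x * k: a sharp signal
# convolved with a normalized toy "beam" becomes a blurred observation.
x = np.zeros(32)
x[16] = 1.0                          # point source (sharp image)
k = np.array([0.25, 0.5, 0.25])      # toy beam / PSF, sums to 1
y = np.convolve(x, k, mode="same")   # blurred observation
```

Because the beam sums to one, the total flux is preserved while the peak drops from 1.0 to 0.5, i.e. the point source is spread over neighboring pixels.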
In one or more embodiments of the present specification, an observation image acquired by an astronomical observation device is used as a training sample, that is, as an input to an astronomical image processing model to be trained. The astronomical image processing model obtained through training can deconvolute any astronomical observation image obtained by astronomical observation equipment in practical application.
In addition, in this specification the astronomical observation device may be, for example, an astronomical telescope; the astronomical image processing method provided here may of course also be applied to other imaging devices, such as microscopes and motion cameras.
In addition, optionally, in practical applications the size of the beam or PSF of the astronomical observation device may be comparable to the size of the astronomical observation image it acquires. In order to preserve the large-scale blurring effect caused by convolution, the complete astronomical observation image is selected as the training sample in this specification, which optimizes the use of valuable features at different scales in image deblurring and deconvolution.
S102: and acquiring prior physical information of the astronomical observation equipment, and constructing a deep learning network corresponding to the astronomical observation equipment according to the prior physical information of the astronomical observation equipment.
Specifically, as shown in fig. 3, a deep learning network constructed based on the prior physical information of the astronomical observation device is deployed after the output of the astronomical image processing model. This deep learning network introduces valuable prior knowledge about the images of the astronomical observation device into the training process of the astronomical image processing model, improving the accuracy of image recovery and the compensation of scale effects in the deconvolution task.
The prior physical information of the astronomical observation device includes information on the beam effect and/or information on the point spread function during imaging. This prior physical information characterizes how a point source in an astronomical observation image acquired by the device is spread or blurred by the imaging device. Introducing it into the training process allows the trained astronomical image processing model to better understand and compensate for the blurring effect, improving the accuracy and efficiency of deconvolving images acquired by the astronomical observation device.
The prior physical information of the astronomical observation device can be obtained through actual calibration or simulation: the device is used directly to capture images of a known point source, i.e., to acquire the impulse response of the astronomical observation device to that source, and the prior physical information is obtained by analyzing the captured point-source image. For some astronomical observation devices, the prior physical information can also be generated by simulation software.
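The calibration step described above can be sketched as follows. This is a simplified illustration, not the patent's procedure: the function name `estimate_psf` and the single-source, constant-background assumptions are mine, and real calibration would average many point sources and model the background more carefully.

```python
import numpy as np

def estimate_psf(point_source_image, background=0.0):
    """Estimate a PSF from a captured image of an isolated point source:
    subtract the background, clip negatives, normalize to unit flux.
    Simplified sketch; real calibration averages many sources."""
    psf = np.clip(point_source_image - background, 0.0, None)
    total = psf.sum()
    if total == 0:
        raise ValueError("no flux above background")
    return psf / total
```

The unit-flux normalization matters: a PSF that sums to one preserves total flux under convolution, so the deep learning network built from it only redistributes intensity rather than scaling it.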
S104: and training the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation equipment to obtain the astronomical image processing model after training.
Further, fig. 3 shows the serial structure of the astronomical image processing model to be trained and the deep learning network. In the iterative training process, a training sample is input into the astronomical image processing model to be trained to obtain the predicted deconvolution image it outputs. This predicted deconvolution image is then input into the deep learning network corresponding to the astronomical observation device, where the prior physical information contained in the network is used to convolve the predicted deconvolution image with the PSF or beam effect of the astronomical observation device, yielding the predicted image corresponding to the training sample. This predicted image should be a blurred image; therefore, in this specification, a loss function is determined from the difference between the predicted image output by the deep learning network and the training sample, and is used to train the astronomical image processing model.
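The self-supervised objective described above can be sketched numerically. As a deliberate simplification (my assumption, not the patent's design), the learnable network is replaced by directly optimized pixel values `x_hat`, so the physics-informed loop fits in a few lines: predict a deconvolved image, push it through the fixed convolution layer, and descend on the log-cosh mismatch with the observation. The names `fft_conv` and `self_supervised_deconv` are illustrative.

```python
import numpy as np

def fft_conv(img, kernel):
    # circular 2-D convolution via FFT (kernel padded to image size,
    # origin-centred), i.e. the fixed deep-learning-network layer
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))

def self_supervised_deconv(y, k, steps=200, lr=0.5):
    """Minimise log-cosh(conv(x_hat, k) - y) by gradient descent.

    x_hat stands in for the network's predicted deconvolution image;
    the gradient of the summed log-cosh loss through the fixed
    convolution layer is a correlation with the kernel (FFT conjugate)."""
    x_hat = y.copy()                  # initialise from the observation
    for _ in range(steps):
        r = fft_conv(x_hat, k) - y    # residual of the forward model
        g = np.tanh(r)                # d/dr log(cosh(r))
        grad = np.real(np.fft.ifft2(np.fft.fft2(g) * np.conj(np.fft.fft2(k))))
        x_hat -= lr * grad
    return x_hat
```

No labels are needed at any point: the observation itself supervises the training, which is exactly why the scheme avoids large supervised datasets.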
According to the training process based on the astronomical image processing model, in one or more embodiments of the present disclosure, the astronomical image processing model is actually obtained based on self-supervision model training, and the labels of the training samples do not need to be determined, so that the acquisition difficulty of the training set is greatly reduced, the scale of the training set is enlarged, and the training efficiency and the training accuracy of the astronomical image processing model are improved.
It is understood that in this specification, the parameters of the deep learning network have been fixed to the PSF or beam effect of the astronomical observation device, and thus, in S104, the trained object is only an astronomical image processing model, and the deep learning network is not included.
In this specification, the astronomical image processing model may be constructed from any regression network, such as an autoencoder, a generative adversarial network, U-Net, or Vision Transformer; the specific model structure is not limited here.
S106: when an image processing request is received, inputting an image to be processed acquired through the astronomical observation equipment into the astronomical image processing model after training is completed, and obtaining a deconvolution image corresponding to the image to be processed.
After the astronomical image processing model is trained, it can be deployed. When image deconvolution using the trained model is needed, the electronic device on which it is deployed receives an image processing request and, by parsing the request, obtains the image to be processed acquired by the astronomical observation device, i.e., the image on which the deconvolution operation is to be performed. The image to be processed is input into the trained astronomical image processing model, which outputs the corresponding deconvolution image.
It can be seen that the deep learning network containing the prior physical information of the astronomical observation device is only applied to the training process of the astronomical image processing model, and does not need to be deployed after the astronomical image processing model is trained.
In addition, since an astronomical observation image acquired by an astronomical observation device is used as the training sample, together with a deep learning network built from the prior physical information of that device, the trained astronomical image processing model learns to deconvolve the observation images captured by that particular device. In general, therefore, in practical applications the image to be processed by the trained model should also be acquired by the same astronomical observation device. That is, astronomical image processing models corresponding to different astronomical observation devices can be trained, each using the observation images acquired by one device as training samples together with the deep learning network built from that device's prior physical information, so that the images to be processed acquired by different devices are each deconvolved by the matching model.
In the astronomical image processing method provided in this specification, an observation image acquired by an astronomical observation device is used as a training sample, a deep learning network is constructed according to the prior physical information of the astronomical observation device, the astronomical image processing model is trained according to the training sample and the deep learning network, the image to be processed acquired by the designated device is input into the astronomical image processing model, and the deconvolved image corresponding to the image to be processed is obtained.
Therefore, with this scheme, the prior physical information of the astronomical observation device is introduced into the training process of the astronomical image processing model, which greatly reduces the difficulty of deconvolution tasks targeting the beam effect or point spread function of the astronomical observation device, and improves astronomical image processing efficiency.
In one or more embodiments of the present disclosure, in S104, the training manner of the astronomical image processing model may be as follows, as shown in fig. 4:
S200: and inputting the training sample into an astronomical image processing model to be trained to obtain an intermediate prediction image corresponding to the training sample.
As described above, the astronomical image processing model is used for deconvolving the training samples, so that the intermediate predicted image corresponding to the training samples output by the astronomical image processing model is actually the predicted deconvoluted image corresponding to the training samples.
S202: and inputting the intermediate predicted image corresponding to the training sample into a deep learning network corresponding to astronomical observation equipment to obtain a predicted image corresponding to the training sample.
The deep learning network corresponding to the astronomical observation device is constructed based on the prior physical information of that device, and the convolution kernel added in the last layer of the network is determined from that prior physical information. Different astronomical observation devices have different prior physical information, and therefore the convolution kernels of their deep learning networks also differ.
As described above, the a priori physical information of the astronomical observation device is one or more of the information of the point spread function PSF and the information of the beam effect of the astronomical observation device, and therefore, in this specification, the convolution kernel of the deep learning network is fixed as the PSF or the beam of the astronomical observation device.
The deep learning network corresponding to the astronomical observation device performs a convolution between the intermediate predicted image and its convolution kernel; that is, the predicted deconvolved image corresponding to the training sample is convolved with the kernel, and the resulting predicted image is in fact a predicted blurred image.
In an alternative embodiment of the present specification, since the PSF or beam of the astronomical observation device usually has a large size, and in order to preserve the original resolution of the astronomical observation image (training sample) and of the PSF, no patch-splitting technique is adopted; instead, the convolution is computed directly over the full image, which requires the deep learning network to use a very large convolution kernel (e.g. 2048×2048) that significantly hinders the learning process. To solve this problem, this specification adopts a convolution method based on the fast Fourier transform (FFT). S202 therefore specifically comprises: determining, according to the prior physical information of the astronomical observation device, the convolution kernel of the deep learning network corresponding to the device; and performing the convolution between the intermediate predicted image corresponding to the training sample and that convolution kernel through a fast Fourier transform algorithm, to obtain the predicted image corresponding to the training sample.
The convolution operation based on the fast Fourier transform algorithm can greatly reduce the computational complexity, improve the convolution efficiency and further improve the training efficiency of the astronomical image processing model.
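As a loose illustration of this FFT-based convolution step, the numpy sketch below multiplies the image and a centred PSF kernel in the frequency domain; the function name and the centred-kernel convention are assumptions of this sketch, not details fixed by the specification.

```python
import numpy as np

def fft_convolve(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Circularly convolve an image with a (possibly large) PSF kernel via FFT.

    A pointwise product in the Fourier domain replaces the O(N^2 * K^2)
    direct convolution with O(N^2 log N) work, which is what makes a
    full-size fixed PSF kernel (e.g. 2048 x 2048) tractable in training.
    """
    # Embed the kernel in an image-sized array and roll it so that the
    # kernel centre sits at the origin, matching a centred-PSF convention.
    kh, kw = kernel.shape
    padded = np.zeros_like(image, dtype=float)
    padded[:kh, :kw] = kernel
    padded = np.roll(padded, shift=(-(kh // 2), -(kw // 2)), axis=(0, 1))
    # Multiplication in the frequency domain == circular convolution.
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))
```

Convolving with a delta kernel returns the image unchanged, and a normalized box kernel preserves the total flux, which are quick sanity checks for such an implementation.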
S204: and training the astronomical image processing model to be trained by taking the minimization of the difference between the training sample and the predicted image as a training target, so as to obtain a trained astronomical image processing model.
Specifically, a loss function is determined according to the difference between the training sample and the predicted image, and the astronomical image processing model to be trained is trained with minimization of this loss function as the training target. Any existing type of loss function may be used, such as MAE (L1 norm), MSE (L2 norm), Huber, or the logarithmic hyperbolic cosine (Log-Cosh) function. In practical applications, each astronomical observation image may correspond to a star system, making every pixel of the image critical for data analysis; considering the robustness of the Log-Cosh loss function to outliers and its second-order differentiability, it is preferred in this specification. Log-Cosh computes the logarithm of the hyperbolic cosine of the prediction error: it behaves like MSE for small errors and like MAE for large errors, while remaining twice differentiable everywhere. The Huber loss, by contrast, is not twice differentiable everywhere. MAE loss is the mean of absolute errors, but because it considers only the average absolute distance between predictions and targets, it cannot emphasize badly wrong predictions; MSE loss emphasizes large errors through squaring, which lets outliers dominate the metric. Therefore, the Log-Cosh function, with its stronger robustness to outliers, is selected as the loss function in this specification.
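A minimal numpy sketch of such a Log-Cosh loss is shown below; the function name is illustrative, and the `logaddexp` trick is simply one numerically stable way to compute log(cosh(x)) without overflow for large errors.

```python
import numpy as np

def log_cosh_loss(pred: np.ndarray, target: np.ndarray) -> float:
    """Mean log-cosh of the prediction error.

    For small errors, log(cosh(x)) ~ x**2 / 2 (MSE-like); for large errors,
    it ~ |x| - log(2) (MAE-like); and it is twice differentiable everywhere,
    which is the robustness/smoothness trade-off discussed above.
    """
    err = pred - target
    # np.logaddexp(err, -err) == log(e**err + e**-err) == log(2*cosh(err)),
    # so subtracting log(2) yields log(cosh(err)) without overflow.
    return float(np.mean(np.logaddexp(err, -err) - np.log(2.0)))
```

Evaluating it on a tiny error recovers the quadratic regime, and on a large error the linear regime, matching the behaviour described above.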
In an alternative embodiment of the present disclosure, after the training of the astronomical image processing model is completed, the trained astronomical image processing model may also be verified and evaluated to ensure that the astronomical image processing model put into practical use meets the accuracy requirement in the actual scene. The specific scheme is as follows:
the first step: and acquiring an astronomical verification image serving as a verification sample through the astronomical observation device.
Specifically, after the astronomical image processing model is trained, astronomical images are again acquired through the astronomical observation device and used as astronomical verification images, that is, as verification samples for the trained astronomical image processing model. The verification samples obtained in this step are in fact similar to the training samples used when training the model, and will not be described again here.
And a second step of: inputting the verification sample into a trained astronomical image processing model to obtain the trained astronomical image processing model, and obtaining a deconvolution image corresponding to the verification sample.
The training-completed astronomical image processing model has the function of deconvoluting the blurred image (the verification sample) captured by astronomical observation equipment, so that the verification sample is input into the training-completed astronomical image processing model, and the image output by the model is the deconvolution image corresponding to the verification sample.
And a third step of: and inputting the deconvolution image corresponding to the verification sample into a deep learning network corresponding to the astronomical observation device to obtain a convolution image corresponding to the verification sample.
In order to evaluate the deconvolution accuracy of the trained astronomical image processing model, the deconvolved image corresponding to the verification sample output by the model can be input into the deep learning network corresponding to the astronomical observation device. As described above, that deep learning network is determined based on the PSF or beam effect of the astronomical observation device; processing the deconvolved image of the verification sample through it therefore convolves the image with the device's PSF or beam, which should restore the blurred image acquired by the device. The resulting blurred image is the convolved image corresponding to the verification sample.
Fourth step: and evaluating the astronomical image processing model after training according to the verification sample and the difference between the convolution images corresponding to the verification sample.
Since the convolution kernel of the deep learning network of the astronomical observation device is fixed to the PSF or beam effect of the device, if the difference between the verification sample and its corresponding convolved image obtained through the deep learning network is small, the deconvolved image produced for the verification sample by the trained astronomical image processing model has high accuracy; if the difference is large, the deconvolved image has low accuracy. Therefore, the trained astronomical image processing model is evaluated based on the difference between the verification sample and its corresponding convolved image, and whether the model needs to be retrained is judged based on the evaluation result.
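The second through fourth verification steps can be condensed into a short numpy sketch; the function name, the use of a same-size centred PSF, and the mean-absolute-error metric are illustrative assumptions, since the specification does not fix a particular difference measure.

```python
import numpy as np

def evaluate_deconvolution(blurred: np.ndarray,
                           deconvolved: np.ndarray,
                           psf: np.ndarray) -> float:
    """Re-convolve the model's deconvolved output with the fixed PSF and
    return the mean absolute residual against the original observation.

    A small residual suggests the deconvolution is self-consistent with the
    device's PSF or beam; a large one suggests the model needs retraining.
    """
    # Same-size circular convolution via FFT; the PSF is assumed centred
    # and the same shape as the image in this simplified sketch.
    reblurred = np.real(np.fft.ifft2(np.fft.fft2(deconvolved) *
                                     np.fft.fft2(np.fft.ifftshift(psf))))
    return float(np.mean(np.abs(reblurred - blurred)))
```

With an identity PSF (a centred delta), a perfect "deconvolution" reproduces the observation exactly and the residual is zero, which is a quick sanity check.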
In an alternative embodiment of the present specification, to prevent poor-quality images acquired by the astronomical observation device from affecting the training of the astronomical image processing model, its evaluation, and the deconvolution of the image to be processed, the images acquired by the device (including training samples, verification samples, and images to be processed) may each be preprocessed. Preprocessing may include denoising, artifact removal, and image contrast enhancement. In practical applications, however, the preprocessing scheme can be chosen flexibly for the specific application scenario and is not limited to the above.
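As a loose illustration of such preprocessing, the sketch below clips outlier pixels (a crude form of noise/artifact suppression) and stretches the contrast to [0, 1] with numpy; the percentile defaults are assumed values, and a real pipeline would use device-specific denoising and artifact removal.

```python
import numpy as np

def preprocess(image: np.ndarray,
               low_pct: float = 1.0,
               high_pct: float = 99.0) -> np.ndarray:
    """Clip extreme pixel values and linearly stretch contrast to [0, 1].

    Clipping at the low/high percentiles suppresses isolated hot pixels and
    deep dropouts; the subsequent normalization is a simple contrast
    enhancement. Percentile bounds are illustrative defaults.
    """
    lo, hi = np.percentile(image, [low_pct, high_pct])
    clipped = np.clip(image, lo, hi)
    # Small epsilon guards against division by zero for flat images.
    return (clipped - lo) / (hi - lo + 1e-12)
```

The output always lies in [0, 1] regardless of the raw detector range, so the same model input scaling applies across training, verification, and inference images.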
In an optional embodiment of the present disclosure, after the astronomical image processing model is trained, an astronomical dataset may further be obtained to test the trained model, and when the test result meets a preset condition, the trained model is put into practical use. The reference deconvolved images contained in the astronomical dataset are in fact clear, already-deconvolved images, whereas the input of the trained astronomical image processing model is an actual observation image, which is inherently blurred. Reference physical information of astronomical observation devices must therefore also be obtained from the astronomical dataset; a convolution kernel is determined from the PSF or beam indicated by the reference physical information, and this kernel is convolved with a reference deconvolved image in the dataset to obtain a reference observation image, on which the trained astronomical image processing model is then tested.
However, the astronomical dataset may not store the correspondence between the reference deconvolved images and the reference physical information, so a correct reference observation image may not be obtainable directly. Therefore, in an alternative embodiment of the present disclosure, a reliable correspondence between the reference deconvolved images and the reference physical information can be established based on the trained astronomical image processing model, with the specific scheme as follows, as shown in fig. 5:
S300: an astronomical dataset is acquired, the astronomical dataset including a plurality of reference deconvoluted images and a plurality of reference physical information.
Specifically, the astronomical dataset stores a plurality of reference deconvolved images and a plurality of pieces of reference physical information, but may not store the correspondence between them. That is, although the dataset stores the prior physical information of the astronomical observation devices used to acquire the convolved images corresponding to the reference deconvolved images, no correspondence between the two is recorded. Therefore, based on the astronomical image processing model trained in this specification, the correspondence between the reference deconvolved images and the reference physical information is established.
S302: and constructing each test pair according to the reference deconvolution image and any one of a plurality of reference physical information included in the astronomical data set aiming at each reference deconvolution image in the astronomical data set.
Specifically, the plurality of reference deconvolved images and the plurality of pieces of reference physical information are paired in all combinations to obtain a plurality of test pairs, where each test pair comprises one reference deconvolved image and one piece of reference physical information. That is, the reference deconvolved images and the reference physical information are traversed, and every reference deconvolved image is combined with every piece of reference physical information to construct a test pair.
For example, if the reference deconvolved images are X1, X2, and X3, and the reference physical information is Y1 and Y2, then the test pairs are: (X1, Y1), (X1, Y2), (X2, Y1), (X2, Y2), (X3, Y1), (X3, Y2).
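The exhaustive pairing of S302 is exactly a Cartesian product; the sketch below shows it with Python's standard library, where the string identifiers stand in for the actual reference images and physical-information entries.

```python
from itertools import product

# Illustrative identifiers for the dataset entries; the actual dataset
# layout is an assumption of this sketch.
deconv_images = ["X1", "X2", "X3"]
physical_info = ["Y1", "Y2"]

# Every (reference deconvolved image, reference physical info) combination
# becomes one candidate test pair (step S302).
test_pairs = list(product(deconv_images, physical_info))
```

With three images and two pieces of physical information this yields the six pairs listed in the example above.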
S304: for each test pair, determining a simulated observation image of the test pair according to the reference deconvolved image of the test pair and the reference physical information of the test pair.
Specifically, the reference deconvolved images stored in the astronomical dataset are in fact clear images that have already undergone deconvolution, and the reference physical information indicates the PSF or beam of an astronomical observation device. In this step, therefore, a convolution kernel is determined from the reference physical information in the test pair, the kernel is convolved with the reference deconvolved image of the test pair, and the resulting convolved image is taken as the simulated observation image of the test pair; this simulated observation image is in fact a blurred image.
S306: and inputting the simulated observation image of the test pair into the astronomical image processing model after training, and obtaining a predicted deconvolution image corresponding to the simulated observation image of the test pair.
In the specification, the trained astronomical image processing model is used for performing deconvolution operation on the blurred image to obtain a clear image, so that the simulated observation image of the test pair obtained in the step S304 is input into the trained astronomical image processing model to obtain a predicted deconvolution image corresponding to the simulated observation image of the test pair, namely the clear image corresponding to the simulated observation image predicted by the astronomical image processing model.
S308: if the difference between the predicted deconvolution image corresponding to the simulated observation image of the test pair and the reference deconvolution image of the test pair is smaller than a preset difference threshold value, establishing a corresponding relation between the reference deconvolution image of the test pair and the reference physical information of the test pair.
If the clear image predicted by the astronomical image processing model for the simulated observation image has high similarity to the reference deconvolved image in the test pair, this indicates that convolution with the PSF or beam of the astronomical observation device represented by the reference physical information in the test pair approximates the inverse of the deconvolution that produced the reference deconvolved image; therefore, the correspondence between the reference deconvolved image of the test pair and the reference physical information of the test pair can be established.
Conversely, if the clear image predicted by the astronomical image processing model for the simulated observation image has low similarity to the reference deconvolved image in the test pair, the PSF or beam represented by the reference physical information does not match the inverse of the deconvolution corresponding to the reference deconvolved image, so the correspondence between the reference deconvolved image of the test pair and the reference physical information of the test pair cannot be established.
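Steps S300 to S308 can be condensed into the following illustrative sketch. Here `simulate` (convolution with the PSF or beam indicated by the reference physical information) and `deconvolve` (the trained astronomical image processing model) are stand-in callables, and the mean absolute error is an assumed difference measure; none of these names come from the specification itself.

```python
import numpy as np
from itertools import product

def build_correspondence(ref_images, ref_infos, simulate, deconvolve,
                         threshold):
    """Keep only the (image, info) test pairs whose simulated observation,
    once deconvolved by the trained model, matches the reference
    deconvolved image within `threshold` (mean absolute error).

    `simulate(image, info)` and `deconvolve(observation)` stand in for the
    PSF convolution and the trained model respectively.
    """
    matches = []
    for img, info in product(ref_images, ref_infos):
        predicted = deconvolve(simulate(img, info))
        if np.mean(np.abs(predicted - img)) < threshold:
            matches.append((img, info))
    return matches
```

With toy stand-ins where the "model" exactly inverts one of the candidate "PSFs", only the pairs carrying that physical information survive the threshold, which is the correspondence-building behaviour described above.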
The preset difference threshold in S308 may be manually set in advance according to a priori experience, and the specific value of the preset difference threshold is not limited in this specification.
The above is the astronomical image processing method provided in one or more embodiments of the present disclosure. Based on the same concept, this specification further provides a corresponding astronomical image processing device, as shown in fig. 6.
Fig. 6 is a schematic diagram of an astronomical image processing device provided in the present specification, specifically including:
a training sample determining module 400, configured to obtain an observation image through an astronomical observation device as a training sample;
the deep learning network determining module 402 is configured to obtain prior physical information of the astronomical observation device, and construct a deep learning network corresponding to the astronomical observation device according to the prior physical information of the astronomical observation device;
the training module 404 is configured to train the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation device, so as to obtain a trained astronomical image processing model;
And the image processing module 406 is configured to input, when an image processing request is received, an image to be processed acquired through the astronomical observation device into the astronomical image processing model after training is completed, and obtain a deconvoluted image corresponding to the image to be processed.
Optionally, the training module 404 is specifically configured to input the training sample into an astronomical image processing model to be trained, so as to obtain an intermediate prediction image corresponding to the training sample; inputting the intermediate predicted image corresponding to the training sample into a deep learning network corresponding to the astronomical observation equipment to obtain a predicted image corresponding to the training sample; and training the astronomical image processing model to be trained by taking the minimization of the difference between the training sample and the predicted image as a training target, so as to obtain a trained astronomical image processing model.
Optionally, the training module 404 is specifically configured to determine, according to prior physical information of the astronomical observation device, a convolution kernel of a deep learning network corresponding to the astronomical observation device; and carrying out convolution operation on the intermediate predicted image corresponding to the training sample and the convolution kernel of the deep learning network corresponding to the astronomical observation equipment through a fast Fourier transform algorithm to obtain the predicted image corresponding to the training sample.
Optionally, the training module 404 is specifically configured to obtain a loss according to a difference between the training sample and the predicted image and a logarithmic hyperbolic cosine function; and training the astronomical image processing model to be trained by taking the minimization of the loss as a training target.
Optionally, the astronomical image processing model is constructed from any one regression network among an autoencoder, a generative adversarial network, a U-Net, and a Vision Transformer.
Optionally, the a priori physical information of the astronomical observation device includes information of beam effects or information of a point spread function during imaging of the astronomical observation device.
Optionally, the method further comprises:
The correspondence determining module 408 is specifically configured to obtain an astronomical dataset, where the astronomical dataset includes a plurality of reference deconvoluted images and a plurality of reference physical information; sequentially aiming at each reference deconvolution image in the astronomical data set, and constructing each test pair according to the reference deconvolution image and any one of a plurality of reference physical information included in the astronomical data set; for each test pair, determining a simulated observation image of the test pair according to the reference deconvolution image of the test pair and the reference physical information of the test pair; inputting the simulated observation image of the test pair into the astronomical image processing model after training to obtain a predicted deconvolution image corresponding to the simulated observation image of the test pair; if the difference between the predicted deconvolution image corresponding to the simulated observation image of the test pair and the reference deconvolution image of the test pair is smaller than a preset difference threshold value, establishing a corresponding relation between the reference deconvolution image of the test pair and the reference physical information of the test pair.
Optionally, the apparatus further comprises:
The verification module 410 is specifically configured to obtain an astronomical verification image through the astronomical observation device as a verification sample; input the verification sample into the trained astronomical image processing model to obtain a deconvolved image corresponding to the verification sample; input the deconvolved image corresponding to the verification sample into the deep learning network corresponding to the astronomical observation device to obtain a convolved image corresponding to the verification sample; and evaluate the trained astronomical image processing model according to the difference between the verification sample and the convolved image corresponding to the verification sample.
Optionally, the apparatus further comprises:
The preprocessing module 412 is specifically configured to perform preprocessing on the image acquired by the astronomical observation device, where the preprocessing includes denoising, artifact removal, and image contrast enhancement.
The present specification also provides a computer-readable storage medium storing a computer program operable to execute the astronomical image processing method shown in fig. 1 described above.
The present specification also provides a schematic structural diagram of the electronic device shown in fig. 7. At the hardware level, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile storage, as shown in fig. 7, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into the memory and then runs it to implement the astronomical image processing method shown in fig. 1. Of course, this specification does not exclude other implementations, such as logic devices or combinations of hardware and software; that is, the execution subject of the following processing flow is not limited to logic units, but may also be hardware or logic devices.
In the 1990s, an improvement of a technology could be clearly distinguished as a hardware improvement (e.g., an improvement of a circuit structure such as a diode, a transistor, or a switch) or a software improvement (an improvement of a method flow). However, with the development of technology, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD) (e.g., a field programmable gate array (FPGA)) is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, this programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled must be written in a specific programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller implements the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included within it for performing various functions may also be regarded as structures within the hardware component. Or even, the means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the system embodiments, since they are substantially similar to the method embodiments, the description is relatively simple; for relevant details, reference may be made to the corresponding description of the method embodiments.
The foregoing is merely exemplary of the present disclosure and is not intended to limit the disclosure. Various modifications and alterations to this specification will become apparent to those skilled in the art. Any modifications, equivalent substitutions, improvements, or the like, which are within the spirit and principles of the present description, are intended to be included within the scope of the claims of the present description.

Claims (7)

1. An astronomical image processing method, characterized by comprising:
acquiring astronomical observation images as training samples through astronomical observation equipment;
Acquiring prior physical information of the astronomical observation equipment, and constructing a deep learning network corresponding to the astronomical observation equipment according to the prior physical information of the astronomical observation equipment;
Training an astronomical image processing model to be trained according to the training sample and a deep learning network corresponding to astronomical observation equipment to obtain a trained astronomical image processing model;
When an image processing request is received, inputting an image to be processed acquired through the astronomical observation equipment into the astronomical image processing model after training is completed, and obtaining a deconvolution image corresponding to the image to be processed;
wherein training the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation equipment specifically comprises:
Inputting the training sample into an astronomical image processing model to be trained to obtain an intermediate prediction image corresponding to the training sample;
inputting the intermediate predicted image corresponding to the training sample into a deep learning network corresponding to astronomical observation equipment to obtain a predicted image corresponding to the training sample;
training an astronomical image processing model to be trained by taking the minimization of the difference between the training sample and the predicted image as a training target, so as to obtain a trained astronomical image processing model;
wherein inputting the intermediate predicted image corresponding to the training sample into the deep learning network corresponding to the astronomical observation equipment to obtain the predicted image corresponding to the training sample specifically comprises:
determining a convolution kernel of a deep learning network corresponding to astronomical observation equipment according to prior physical information of the astronomical observation equipment;
Performing convolution operation on the intermediate predicted image corresponding to the training sample and the convolution kernel of the deep learning network corresponding to the astronomical observation equipment through a fast Fourier transform algorithm to obtain a predicted image corresponding to the training sample;
wherein training the astronomical image processing model to be trained with minimization of the difference between the training sample and the predicted image as the training target specifically comprises:
obtaining a loss according to the difference between the training sample and the predicted image and a logarithmic hyperbolic cosine function;
And training the astronomical image processing model to be trained by taking the minimization of the loss as a training target.
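The training objective in claim 1 pairs an FFT-based convolution of the model's intermediate prediction with a PSF-derived kernel and a logarithmic hyperbolic cosine loss against the observed training sample. A minimal NumPy sketch of these two building blocks follows; the function names and the numerically stable log-cosh formulation are illustrative choices of ours, not part of the patent:

```python
import numpy as np

def fft_convolve(image, psf_kernel):
    """Convolve an intermediate predicted image with the instrument's
    PSF-derived kernel via the fast Fourier transform."""
    # Zero-pad the kernel to the image shape, multiply spectra, invert.
    return np.real(np.fft.ifft2(
        np.fft.fft2(image) * np.fft.fft2(psf_kernel, s=image.shape)))

def log_cosh_loss(observed, predicted):
    """Logarithmic hyperbolic cosine loss between the training sample
    (observed image) and the re-convolved prediction."""
    diff = predicted - observed
    # log(cosh(x)) computed stably as |x| + log1p(exp(-2|x|)) - log(2)
    return np.mean(np.abs(diff)
                   + np.log1p(np.exp(-2.0 * np.abs(diff)))
                   - np.log(2.0))
```

A gradient step would then minimize `log_cosh_loss(training_sample, fft_convolve(intermediate_prediction, kernel))`, so the model learns the deconvolution without ever seeing a ground-truth deconvolved image.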
2. The method of claim 1, wherein the astronomical image processing model is constructed from a regression network of any one of an autoencoder, a generative adversarial network, U-Net, or Vision Transformer.
3. The method of claim 1, wherein the a priori physical information of the astronomical observation device includes information of beam effects or information of a point spread function during imaging of the astronomical observation device.
4. The method of claim 1, wherein the method further comprises:
acquiring an astronomical dataset comprising a plurality of reference deconvoluted images and a plurality of reference physical information;
for each reference deconvolution image in the astronomical dataset in turn, constructing test pairs according to the reference deconvolution image and each of the plurality of reference physical information included in the astronomical dataset;
For each test pair, determining a simulated observation image of the test pair according to the reference deconvolution image of the test pair and the reference physical information of the test pair;
Inputting the simulated observation image of the test pair into the astronomical image processing model after training to obtain a predicted deconvolution image corresponding to the simulated observation image of the test pair;
If the difference between the predicted deconvolution image corresponding to the simulated observation image of the test pair and the reference deconvolution image of the test pair is smaller than a preset difference threshold value, establishing a corresponding relation between the reference deconvolution image of the test pair and the reference physical information of the test pair.
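The test-pair procedure of claim 4 can be sketched as follows: simulate an observation by convolving a reference deconvolved image with a candidate kernel, run the trained model on it, and accept the pairing when the recovered image is close enough to the reference. This is an illustrative sketch only; `build_test_pairs`, `match_psf`, and the mean-absolute-difference criterion are our assumptions, not the patent's:

```python
import numpy as np

def build_test_pairs(reference_images, reference_psfs):
    """Pair every reference deconvolved image with every candidate
    PSF kernel from the dataset (claim 4's test pairs)."""
    return [(img, psf) for img in reference_images for psf in reference_psfs]

def match_psf(ref_image, psf, model_fn, threshold):
    """Simulate an observation of ref_image under psf, deconvolve it
    with the trained model (model_fn), and report whether the result
    is within `threshold` of the reference."""
    simulated = np.real(np.fft.ifft2(
        np.fft.fft2(ref_image) * np.fft.fft2(psf, s=ref_image.shape)))
    predicted = model_fn(simulated)
    return np.mean(np.abs(predicted - ref_image)) < threshold
```

When `match_psf` returns true, the correspondence between that reference image and that physical information would be recorded, as claim 4 describes.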
5. The method of claim 1, wherein prior to inputting the image to be processed acquired by the astronomical observation device into the trained astronomical image processing model, the method further comprises:
Acquiring an astronomical verification image as a verification sample through the astronomical observation equipment;
inputting the verification sample into the trained astronomical image processing model to obtain a deconvolution image corresponding to the verification sample;
Inputting the deconvolution image corresponding to the verification sample into a deep learning network corresponding to the astronomical observation equipment to obtain a convolution image corresponding to the verification sample;
and evaluating the trained astronomical image processing model according to the difference between the verification sample and the convolution image corresponding to the verification sample.
6. The method of any one of claims 1-5, further comprising:
preprocessing an image acquired by the astronomical observation device, wherein the preprocessing comprises denoising, artifact removal and image contrast enhancement.
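The preprocessing of claim 6 (denoising, artifact removal, contrast enhancement) admits many implementations; one plain-NumPy sketch is a median-filter denoise, a sigma-clip of outlier artifact pixels, and a min-max contrast stretch. The specific filter size, clipping threshold, and stretch are our assumptions for illustration:

```python
import numpy as np

def preprocess(image, clip_sigma=5.0):
    # Denoise with a simple 3x3 median filter (pure-NumPy version).
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    stacked = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    denoised = np.median(stacked, axis=0)
    # Suppress artifact pixels beyond clip_sigma standard deviations.
    mu, sigma = denoised.mean(), denoised.std()
    cleaned = np.clip(denoised, mu - clip_sigma * sigma,
                      mu + clip_sigma * sigma)
    # Min-max contrast stretch to [0, 1].
    lo, hi = cleaned.min(), cleaned.max()
    return (cleaned - lo) / (hi - lo) if hi > lo else cleaned
```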
7. An astronomical image processing device, characterized by comprising:
the training sample determining module is used for acquiring astronomical observation images through astronomical observation equipment to serve as training samples;
the deep learning network determining module is used for acquiring prior physical information of the astronomical observation equipment and constructing a deep learning network corresponding to the astronomical observation equipment according to the prior physical information of the astronomical observation equipment;
The training module is used for training the astronomical image processing model to be trained according to the training sample and the deep learning network corresponding to the astronomical observation equipment to obtain a trained astronomical image processing model;
The image processing module is used for inputting the image to be processed acquired by the astronomical observation equipment into the astronomical image processing model after training is completed when an image processing request is received, and obtaining a deconvolution image corresponding to the image to be processed;
the training module is specifically configured to input the training sample into an astronomical image processing model to be trained, so as to obtain an intermediate prediction image corresponding to the training sample; inputting the intermediate predicted image corresponding to the training sample into a deep learning network corresponding to astronomical observation equipment to obtain a predicted image corresponding to the training sample; training an astronomical image processing model to be trained by taking the minimization of the difference between the training sample and the predicted image as a training target, so as to obtain a trained astronomical image processing model;
The training module is specifically configured to determine a convolution kernel of a deep learning network corresponding to the astronomical observation device according to prior physical information of the astronomical observation device; performing convolution operation on the intermediate predicted image corresponding to the training sample and the convolution kernel of the deep learning network corresponding to the astronomical observation equipment through a fast Fourier transform algorithm to obtain a predicted image corresponding to the training sample;
The training module is specifically configured to obtain a loss according to a difference between the training sample and the predicted image and a logarithmic hyperbolic cosine function; and training the astronomical image processing model to be trained by taking the minimization of the loss as a training target.
CN202410287648.5A 2024-03-13 2024-03-13 Astronomical image processing method and device Active CN117876263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410287648.5A CN117876263B (en) 2024-03-13 2024-03-13 Astronomical image processing method and device


Publications (2)

Publication Number Publication Date
CN117876263A CN117876263A (en) 2024-04-12
CN117876263B true CN117876263B (en) 2024-05-17

Family

ID=90592106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410287648.5A Active CN117876263B (en) 2024-03-13 2024-03-13 Astronomical image processing method and device

Country Status (1)

Country Link
CN (1) CN117876263B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110458778A (en) * 2019-08-08 2019-11-15 深圳市灵明光子科技有限公司 A kind of depth image denoising method, device and storage medium
WO2019233166A1 (en) * 2018-06-04 2019-12-12 杭州海康威视数字技术股份有限公司 Surface defect detection method and apparatus, and electronic device
CN115909011A (en) * 2022-12-27 2023-04-04 中国科学院国家天文台南京天文光学技术研究所 Astronomical image automatic classification method based on improved SE-inclusion-v 3 network model
CN116342984A (en) * 2023-05-31 2023-06-27 之江实验室 Model training method, image processing method and image processing device
CN117178302A (en) * 2021-02-19 2023-12-05 迪普赛尔公司 Systems and methods for cell analysis

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200064444A1 (en) * 2015-07-17 2020-02-27 Origin Wireless, Inc. Method, apparatus, and system for human identification based on human radio biometric information


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Literature review on blind deconvolution for image restoration; Zhang Lei; Chang Liyuan; Fan Dongmei; Science & Technology Information; 2012-07-23 (21); full text *

Also Published As

Publication number Publication date
CN117876263A (en) 2024-04-12

Similar Documents

Publication Publication Date Title
Molina On the hierarchical Bayesian approach to image restoration: applications to astronomical images
CN108550125B (en) Optical distortion correction method based on deep learning
CN111476719A (en) Image processing method, image processing device, computer equipment and storage medium
US20110267485A1 (en) Range measurement using a coded aperture
CN110766768A (en) Magnetic resonance image reconstruction method, device, equipment and medium
CN113962877B (en) Pixel distortion correction method, correction device and terminal
JP2019537736A (en) System and method for object detection in holographic lens-free imaging with convolution dictionary learning and coding
Kim et al. Blind deconvolution of 3D fluorescence microscopy using depth‐variant asymmetric PSF
CN113466839B (en) Side-scan sonar sea bottom line detection method and device
CN116912923B (en) Image recognition model training method and device
CN117876263B (en) Astronomical image processing method and device
CN112116700A (en) Monocular view-based three-dimensional reconstruction method and device
CN115205224B (en) Adaptive feature enhanced multisource fusion visual detection method, device and medium
CN116452662A (en) Method and device for extracting pixel coordinates of optical fiber center and electronic equipment
CN116010850A (en) Method and device for identifying rapid radio storm, storage medium and electronic equipment
CN113875228B (en) Video frame inserting method and device and computer readable storage medium
KR102680384B1 (en) Method and apparatus for correction of aberration
CN116543246A (en) Training method of image denoising model, image denoising method, device and equipment
CN112017113B (en) Image processing method and device, model training method and device, equipment and medium
CN117593619B (en) Image processing method, device, electronic equipment and storage medium
Howard et al. CoordGate: Efficiently Computing Spatially-Varying Convolutions in Convolutional Neural Networks
CN116630631B (en) Image segmentation method and device, electronic equipment and storage medium
Zhou et al. DR-UNet: dynamic residual U-Net for blind correction of optical degradation
CN114155366B (en) Dynamic cabinet image recognition model training method and device, electronic equipment and medium
CN116912518B (en) Image multi-scale feature processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant