CN112164001A - Digital microscope image rapid splicing and fusing method - Google Patents


Info

Publication number
CN112164001A
Authority
CN
China
Prior art keywords
image
images
value
splicing
registration
Prior art date
Legal status
Granted
Application number
CN202011053664.6A
Other languages
Chinese (zh)
Other versions
CN112164001B (en
Inventor
Zuo Chao (左超)
Zhang Xiaolei (张晓磊)
Shen Detong (沈德同)
Current Assignee
Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Original Assignee
Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Priority to CN202011053664.6A
Publication of CN112164001A
Application granted
Publication of CN112164001B
Active legal status
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10056: Microscopic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Microscopes, Condensers (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method for rapidly splicing and fusing digital microscope images, comprising the following steps: acquiring microscopic images from a multi-modal imaging system, preprocessing the images, registering the images, and fusing them to obtain the final stitched image. The optimized phase-correlation-based registration algorithm of the invention reduces the probability of registration error during image registration, effectively improves registration accuracy, and provides a stable basis for large-field-of-view image splicing; splicing traces are effectively eliminated so that the overlapping area transitions smoothly, which is key to seamless splicing of large-field-of-view images.

Description

Digital microscope image rapid splicing and fusing method
Technical Field
The invention relates to the technical field of optical microscopic imaging and image splicing, in particular to a method for quickly splicing and fusing digital microscope images.
Background
Since its invention, the microscope has played an important role in research and applications across many disciplines, and technological development places ever higher demands on it. At present, under microscope observation a large number of mutually overlapping high-resolution images are collected; splicing them into an ultrahigh-resolution, large-field-of-view image by hand, both accurately and quickly, is extremely time-consuming. Because existing imaging systems are limited by the numerical aperture of the optical system, it is difficult for them to combine a large depth of field with high resolution, so acquiring high-resolution, large-depth-of-field images remains a problem. In the application and research of medical microscopic images there is a growing preference for images with both a large field of view and high resolution; however, the larger the magnification of an optical microscope objective, the smaller the field of view, and this contradiction means that at high resolution only a small field of view of the sample under test can be acquired at one time. By splicing multiple high-resolution images, the image stitching technique can both guarantee the integrity of the observed field of view and meet the requirement of observing any detail of the image at high resolution.
Through the continuous efforts of researchers in recent years, image splicing based on computational microscopy imaging systems has promoted the development of remote medical diagnosis. During remote diagnosis, medical images are transmitted over the internet and a doctor at the remote end can magnify their details, which requires large-field-of-view, high-resolution medical images for diagnosis and treatment. The purpose of image splicing is to combine a series of images with overlapping borders into a single image with an ultra-wide viewing angle.
In image splicing, registration and fusion are the two core modules of the technique; they depend on each other, and registration is the basis of fusion. Image registration uses the content of the images to find the transformation relating the overlapping areas of two images, then aligns the two overlapping images according to that relationship. Registration is the core of image stitching: the choice of registration algorithm directly determines whether splicing succeeds and how fast it runs. Registration yields the relative position of two consecutive images with an overlapping area, and fusion then uses this relative position to combine them into a large-field-of-view image. Because of differences in illumination, noise, and other factors between the two pictures, a visible seam may appear after splicing, and the quality of the spliced image is improved at this stage: from the multiple transformation relations found, the most suitable one is selected so that the spliced image looks more realistic. However, every algorithm must trade off computational accuracy, real-time performance, and robustness to interference; improving both the accuracy and the speed of image splicing is currently a key problem in the field of image processing.
In the study of image stitching, researchers have proposed many stitching algorithms for different application fields. The development of image splicing technology in China started later than abroad, and the earliest domestic research was likewise applied in the field of national defense; a 'virtual reality' system based on panoramic image splicing was realized at the National University of Defense Technology. In 1997, an automatic image stitching technique was proposed that uses cross-correlation coefficients to evaluate the similarity between images. At the beginning of the 21st century, as image stitching spread to more application fields and the accuracy requirements rose, a reference normalization method based on the RANSAC fundamental matrix was proposed, providing an effective solution to the mismatching problem in image registration. In 2010, a registration algorithm based on feature extraction was proposed that obtains the maximally correlated feature points of an image by bidirectional search, improving registration accuracy and speed to a certain extent. Through these continuous efforts, image stitching has achieved rich scientific results, and its ever wider range of applications will surely drive the technology in a broader direction.
Disclosure of Invention
Aiming at the problem that a large field of view and high resolution cannot be achieved simultaneously in microscopic imaging, the invention provides a digital microscope image rapid splicing and fusing method.
The technical scheme of the invention is as follows: a digital microscope image rapid splicing and fusing method comprises the following steps:
Step one, acquiring images from a multi-modal imaging system:
A stepping motor drives the objective lens to move, acquiring a plurality of image sequences under different fields of view, which are transmitted to a computer, completing the acquisition of the microscopic images; acquired images of adjacent fields of view must keep a partially overlapping area. Taking an acquired image I₁(x,y) as the base image, an image whose field of view is adjacent to the base image is selected as the image to be matched I₂(x,y), where I₁(x,y) and I₂(x,y) denote the gray-scale distribution functions of the base image and the image to be matched, respectively, and (x,y) denotes the coordinates of any point of the image;
step two, preprocessing the acquired image: filtering the acquired microscopic image by adopting Gaussian filtering to reduce the noise of the spliced image;
Step three, registering the images: image registration is performed on the preprocessed images with an optimization algorithm based on phase correlation:
Step 3.1, calculating the relative displacement between the base image I₁(x,y) and the image to be matched I₂(x,y): the image to be matched I₂(x,y) is displaced relative to the base image I₁(x,y) by Δx in the X direction and Δy in the Y direction, giving the relationship:

I₂(x, y) = I₁(x + Δx, y + Δy)
Step 3.2, Fourier transforming the images I₁(x,y) and I₂(x,y) gives F₁(ξ,η) and F₂(ξ,η), which satisfy:

F₂(ξ, η) = F₁(ξ, η) · e^(j2π(ξΔx + ηΔy))

where (ξ,η) are the frequency-domain coordinates and F₁(ξ,η) and F₂(ξ,η) are the frequency-domain distribution functions of the base image and the image to be matched;
Step 3.3, calculating the cross-power spectrum of the two:

C(ξ, η) = F₁(ξ, η)F₂*(ξ, η) / |F₁(ξ, η)F₂*(ξ, η)| = e^(−j2π(ξΔx + ηΔy))
where F₂*(ξ,η) is the complex conjugate of F₂(ξ,η). Applying the inverse Fourier transform to the cross-power spectrum yields:

σ(x − Δx, y − Δy) = δ(x − Δx, y − Δy)

where σ(x − Δx, y − Δy) is the impulse (displacement) function between the base image I₁(x,y) and the image to be matched I₂(x,y) after the inverse Fourier transform. When σ takes its peak value, i.e. when the phase correlation function equals 1, the corresponding coordinates are exactly (Δx, Δy); the coordinates of the peak of the impulse function are therefore the relative displacement between the base image and the image to be matched, from which the overlapping areas a(x,y) and b(x,y) of the two images are obtained;
Step 3.4, calculating the proportion s of the overlapping area to the original image; a threshold is set manually, and when s reaches the threshold, the next operation is performed;
Step 3.5, evaluating the similarity of the overlapping areas of the two images based on the idea of template registration, to further ensure the accuracy of the overlapping area: when the evaluation result is greater than a preset threshold, the registration is accurate; otherwise, the images are re-acquired and registered again;
Step four, fusing the images: the registered images are fused with a gradual-in and gradual-out optimization algorithm to complete the splicing of the microscopic images:
Step 4.1, fusing the overlapping area of the registered base image I₁(x,y) and image to be matched I₂(x,y) with the gradual-in and gradual-out algorithm; before the fusion, the gray values of the images are adjusted to eliminate the splicing seam the gradual-in and gradual-out fusion algorithm would otherwise produce, reducing the brightness difference between the images;
Step 4.2, after the brightness of the overlapping area is adjusted, the pixel values of the overlapping parts of the two images are added with certain weights to realize a smooth transition, and the images are fused with the gradual-in and gradual-out fusion algorithm.
Preferably, the multi-modal imaging system in the first step is composed of a color camera, a multi-modal microscope, a stepping motor and a computer.
Preferably, the second step is specifically: the base image I₁(x,y) and the image to be matched I₂(x,y) acquired from the multi-modal imaging system are preprocessed by convolution with a kernel of window size 3 × 3.
Preferably, the threshold in step 3.4 is taken to be 0.7.
Preferably, step 3.5 is specifically: the overlap regions are down-sampled by a factor of 0.4; let the down-sampled overlap images be aₛ and bₛ, of size mₛ × nₛ. Similarity analysis is performed on the down-sampled images with a normalized cross-correlation algorithm based on image gray levels; when the calculated result is greater than the threshold 0.85, the overlapping areas a(x,y) and b(x,y) of the two images are considered the same; otherwise the overlapping areas differ too much, the images must be re-acquired, and registration is repeated according to the above steps.
Preferably, step 4.1 is specifically: the gray value of an image point A on the left side of the center line of the overlapping region is read, and the mean gray value of the pixels in the 4-neighborhood of the corresponding point B on the right side of the center line is calculated; the gray value of A is compared with the 4-neighborhood mean of B: if A's value is greater than B's 4-neighborhood mean, half of the difference is subtracted from A's gray value, otherwise half of the difference is added; if the difference is within 4 gray levels, no processing is done and A's gray value is unchanged; after these steps, the next pixel in the left region is judged, and the steps are repeated until the whole region has been corrected once.
Preferably, step 4.2 is specifically: let the weight factors of the two images be ω₁ and ω₂, with ω₁ + ω₂ = 1; let the step size of the gradual process be 1/width, where width is the width of the fusion area of the two images; starting from an initial value of 1, ω decreases by 1/width from the left side to the right side of the fusion area until it reaches 0, completing the traversal of the fusion area and realizing a gradual transition of ω from 1 to 0; the image obtained after fusion is expressed as:

G(x, y) = f(x, y)                        when (x, y) lies in the left image only
G(x, y) = ω·f(x, y) + (1 − ω)·g(x, y)    when (x, y) lies in the fusion area
G(x, y) = g(x, y)                        when (x, y) lies in the right image only

where G(x,y) is the image obtained after fusion, and f(x,y) and g(x,y) are the images to the left and right of the center of the fused portion.
Compared with the traditional method, the method has the following advantages: (1) compared with the common registration algorithm based on phase correlation, the optimized registration algorithm based on phase correlation provided by the invention can reduce the probability of registration error when carrying out image registration, effectively improves the registration accuracy and provides a stable registration condition for large-field image splicing. (2) Compared with the common weighted fusion algorithm, the optimized gradual-in and gradual-out fusion algorithm provided by the invention can effectively eliminate splicing traces, so that the overlapped area achieves the effect of smooth transition, and the method plays a key role in seamless splicing of large-view images.
Drawings
Fig. 1 is a flowchart of a digital microscope image rapid stitching fusion method according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a multi-modality imaging system acquisition in an embodiment of the invention.
Fig. 3 is a flowchart of an optimization algorithm based on phase correlation in step three according to an embodiment of the present invention.
FIG. 4 is a schematic process diagram of the fade-in and fade-out fusion algorithm in step four according to the embodiment of the present invention.
Fig. 5 is a mosaic of the images of the scale leaves of the onion obtained by the phase correlation method.
Fig. 6 is the splicing result of onion scale leaf sample images obtained by the optimization algorithm according to the embodiment of the invention.
Fig. 7 is the splicing result of pumpkin stem longitudinal section sample images obtained with the weighted fusion algorithm.
Fig. 8 is the splicing result of pumpkin stem longitudinal section sample images obtained by the optimization algorithm according to the embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. The flow of the method for rapidly splicing and fusing digital microscope images in this embodiment is shown in fig. 1, and the specific content is as follows.
Firstly, acquiring an image from a multi-modal imaging system.
The multi-modality imaging system used in this embodiment mainly comprises a color camera, a multi-modal microscope, a stepping motor and a computer, as shown in fig. 2. The stepping motor drives the objective lens to move, acquiring a plurality of image sequences under different fields of view. After the color camera captures the image formed at the image plane, the captured image is transmitted to a computer via USB, completing the acquisition of the microscopic images; acquired images of adjacent fields of view must keep a partially overlapping area. Taking an acquired image I₁(x,y) as the base image, an image to be matched I₂(x,y) whose field of view is adjacent to the base image is selected, where I₁(x,y) and I₂(x,y) denote the gray-scale distribution functions of the base image and the image to be matched, respectively, and (x,y) denotes the coordinates of any point of the image.
And step two, preprocessing the image.
To remove redundant noise mixed into the imaging information by uneven brightness, dust in the air caused by imperfect sealing of the multi-modal imaging microscope's light-source section, and other factors, the images acquired by the multi-modal imaging system are denoised. Commonly used image denoising methods include median filtering, mean filtering, and Gaussian filtering; to protect the information of the original image, this embodiment filters the acquired microscopic images with Gaussian filtering. Gaussian filtering is a smoothing denoising algorithm that reduces the noise of the original image by convolving the image with a Gaussian filter function. For an image I(x,y), the two-dimensional Gaussian filter function G is:

G(i, j) = (1 / (2πσ²)) · e^(−((i − x)² + (j − y)²) / (2σ²))

where (x,y) is the center position of the convolution kernel, (i,j) is the position of each pixel within the kernel, and σ is the standard deviation of the Gaussian: the larger σ is, the more pronounced the smoothing effect on the image, and vice versa. Commonly used window sizes are 3 × 3 and 5 × 5. This embodiment convolves the acquired base image I₁(x,y) and image to be matched I₂(x,y) with a kernel of window size 3 × 3 to reduce the noise of the spliced image to a certain extent.
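The Gaussian smoothing of this step can be sketched in NumPy as follows; the function names and the edge-padding choice are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def gaussian_kernel(size=3, sigma=1.0):
    # Build a normalized 2-D Gaussian kernel centered on the window.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_filter(img, size=3, sigma=1.0):
    # Convolve with edge padding so the output keeps the input shape.
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    return out
```

A 3 × 3 kernel with a small σ, as used in this embodiment, suppresses noise while sacrificing little sharpness; a larger σ would smooth more aggressively.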
And thirdly, performing image registration on the image by adopting an optimization algorithm based on phase correlation, wherein the specific steps are shown in fig. 3.
Step 3.1, calculating the relative displacement between the base image and the image to be matched: the image to be matched I₂(x,y) is displaced relative to the base image I₁(x,y) by Δx in the X direction and Δy in the Y direction, giving the relationship:

I₂(x, y) = I₁(x + Δx, y + Δy)
Step 3.2, Fourier transforming the images I₁(x,y) and I₂(x,y) gives F₁(ξ,η) and F₂(ξ,η), which satisfy:

F₂(ξ, η) = F₁(ξ, η) · e^(j2π(ξΔx + ηΔy))

where (ξ,η) are the frequency-domain coordinates and F₁(ξ,η) and F₂(ξ,η) are the frequency-domain distribution functions of the base image and the image to be matched.
Step 3.3, calculating the cross-power spectrum of the two:

C(ξ, η) = F₁(ξ, η)F₂*(ξ, η) / |F₁(ξ, η)F₂*(ξ, η)| = e^(−j2π(ξΔx + ηΔy))
where F₂*(ξ,η) is the complex conjugate of F₂(ξ,η). Applying the inverse Fourier transform to the cross-power spectrum yields:

σ(x − Δx, y − Δy) = δ(x − Δx, y − Δy)

where σ(x − Δx, y − Δy) is the corresponding displacement function between the base image I₁(x,y) and the image to be matched I₂(x,y).
According to this result, when the inverse Fourier transform of the cross-power spectrum σ(x − Δx, y − Δy) takes its peak value, i.e. when the phase correlation function equals 1, the corresponding coordinates are exactly (Δx, Δy). The coordinates of the peak of the impulse function are therefore the relative displacement between the base image I₁(x,y) and the image to be matched I₂(x,y), and the overlapping areas a(x,y) and b(x,y) of the two images are obtained from this displacement.
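Steps 3.1 to 3.3 above can be sketched with NumPy's FFT; the epsilon in the normalization and the wrap-around handling of negative displacements are our own assumptions:

```python
import numpy as np

def phase_correlate(base, moving):
    # Fourier transform both images, form the normalized cross-power
    # spectrum F1 * conj(F2) / |F1 * conj(F2)|, and invert it; the peak
    # of the resulting impulse sits at the displacement (dx, dy).
    F1 = np.fft.fft2(base)
    F2 = np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep phase only (epsilon avoids /0)
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past N/2 correspond to negative displacements (FFT wrap-around).
    h, w = base.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dx, dy
```

With the shift convention I₂(x, y) = I₁(x + Δx, y + Δy) used above, the peak of the inverse-transformed cross-power spectrum lands directly at (Δx, Δy).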
Step 3.4, the proportion s of the overlapping area to the original image can now be calculated. A threshold η₁ is set manually; when s reaches η₁, the next operation is performed. A large number of experiments show that η₁ is preferably 0.7.
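For a pure translation (Δx, Δy) between two equal-size views, the overlap proportion s of step 3.4 reduces to a product of the overlap extents along each axis; this small sketch (with an illustrative function name) shows the threshold gate:

```python
def overlap_ratio(shape, dx, dy):
    # Fraction s of the image area shared by two equal-size views whose
    # relative displacement is (dx, dy); splicing proceeds only when s
    # reaches the threshold (eta_1 = 0.7 in this embodiment).
    h, w = shape
    return max(0, w - abs(dx)) * max(0, h - abs(dy)) / float(w * h)
```

The gate then reads: proceed to step 3.5 only if `overlap_ratio(img.shape, dx, dy) >= 0.7`.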
Step 3.5, to further ensure the accuracy of the overlapping area, the similarity of the overlapping areas of the two images is evaluated based on the idea of template registration. When the evaluation result is greater than a threshold η₂, the registration is accurate; otherwise, the images are re-acquired for registration.
The evaluation method is as follows: the overlap regions are down-sampled by a factor of 0.4; let the down-sampled overlap images be aₛ and bₛ, of size mₛ × nₛ. Similarity analysis is performed on the down-sampled images with a normalized cross-correlation algorithm based on image gray levels. When the calculated result is greater than the threshold η₂ (preferably 0.85), the overlapping areas a(x,y) and b(x,y) of the two images are considered the same; otherwise the overlapping areas differ too much, the images are re-acquired, and registration is repeated according to the above steps.
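The evaluation can be sketched as follows; plain striding stands in for the 0.4× down-sampling, which is a simplifying assumption on our part, and the function names are illustrative:

```python
import numpy as np

def ncc(a, b):
    # Normalized cross-correlation of two equal-size gray patches, in [-1, 1].
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def overlap_is_valid(a, b, scale=0.4, thresh=0.85):
    # Down-sample both overlap patches (striding approximates the 0.4x
    # resampling of the text), then accept when NCC exceeds eta_2.
    step = max(1, round(1 / scale))
    return ncc(a[::step, ::step], b[::step, ::step]) > thresh
```

A `False` result corresponds to the "re-acquire and register again" branch of step 3.5.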
And step four, fusing the images by adopting a gradual-in and gradual-out optimization algorithm.
Step 4.1, the overlapping regions of the registered base image I₁(x,y) and image to be matched I₂(x,y) are fused with the gradual-in and gradual-out algorithm; however, to eliminate the splicing seam this fusion algorithm produces, the gray values of the images are first adjusted to reduce the brightness difference between them, as follows:
First, the gray value of an image point A on the left side of the center line of the overlapping region is read, and the mean gray value of the pixels in the 4-neighborhood of the corresponding point B on the right side of the center line is calculated. The gray value of A is compared with the 4-neighborhood mean of B: if A's value is greater than B's 4-neighborhood mean, half of the difference is subtracted from A's gray value, otherwise half of the difference is added; if the difference is within 4 gray levels, no processing is done and A's gray value is unchanged. After these steps, the next pixel in the left region is judged, and the steps are repeated until the whole region has been corrected once.
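A sketch of this brightness correction, where the pixel mirrored across the centerline is our reading of the "corresponding point B" (the text does not spell the mapping out), and the function name is illustrative:

```python
import numpy as np

def equalize_left_of_centerline(overlap, tol=4.0):
    # For each pixel A left of the overlap centerline, compare its gray
    # value with the mean of the 4-neighborhood of B, the pixel mirrored
    # across the centerline; move A halfway toward that mean when they
    # differ by more than tol gray levels (small differences are left alone).
    out = overlap.astype(float).copy()
    h, w = out.shape
    mid = w // 2
    for y in range(1, h - 1):
        for x in range(1, mid):
            bx = mid + (mid - x)                 # assumed mirror of x across mid
            if bx >= w - 1:
                continue                         # B's neighborhood would leave the image
            nb_mean = (out[y - 1, bx] + out[y + 1, bx]
                       + out[y, bx - 1] + out[y, bx + 1]) / 4.0
            diff = out[y, x] - nb_mean
            if abs(diff) > tol:
                out[y, x] -= diff / 2.0          # halve the brightness gap
    return out
```

Halving the gap rather than closing it entirely keeps the left half's texture while pulling its brightness toward the right half, which is what makes the subsequent blend seam-free.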
Step 4.2, after the brightness of the overlapping area is adjusted, the images are fused with the gradual-in and gradual-out fusion algorithm, which adds the pixel values of the overlapping parts of the two images with certain weights to realize a smooth transition; the process is shown schematically in fig. 4:
Let the weight factors of the two images be ω₁ and ω₂, with ω₁ + ω₂ = 1; let the step size of the gradual process be 1/width, where width is the width of the fusion area of the two images; starting from an initial value of 1, ω decreases by 1/width from the left side to the right side of the fusion area until it reaches 0, completing the traversal of the fusion area and realizing a gradual transition of ω from 1 to 0. The image obtained after fusion is expressed as:

G(x, y) = f(x, y)                        when (x, y) lies in the left image only
G(x, y) = ω·f(x, y) + (1 − ω)·g(x, y)    when (x, y) lies in the fusion area
G(x, y) = g(x, y)                        when (x, y) lies in the right image only

where G(x,y) is the image obtained after fusion, and f(x,y) and g(x,y) are the images to the left and right of the center of the fused portion.
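The gradual-in and gradual-out blend over the fusion area can be sketched as follows; `np.linspace` stands in for the 1/width stepping of ω described above, and the function name is illustrative:

```python
import numpy as np

def fade_blend(f, g):
    # omega ramps linearly from 1 to 0 across the overlap width, so that
    # G = omega * f + (1 - omega) * g hands off smoothly from the left
    # image f to the right image g.
    width = f.shape[1]
    omega = np.linspace(1.0, 0.0, width)
    return omega[None, :] * f + (1.0 - omega[None, :]) * g
```

The leftmost column of the result equals f, the rightmost equals g, and every column in between is a convex combination of the two, so no hard seam remains.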
This embodiment provides a method for rapidly splicing and fusing digital microscope images: microscopic images are first acquired from a multi-modal imaging system, the acquired images are preprocessed with a 3 × 3 convolution kernel, the images are then registered with an optimization algorithm based on phase correlation, and finally the fusion of the microscopic images is realized with the gradual-in and gradual-out optimization algorithm. Experiments verify the two key aspects of image splicing: registration accuracy and fusion effect.
(1) Registration effect: fig. 5 and 6 are stitched images of an onion scale leaf sample, where fig. 5 is the stitched image obtained by the plain phase correlation method, fig. 6 is the stitched image obtained by the optimization algorithm based on phase correlation proposed in this embodiment, and the dashed frames in fig. 5 and 6 mark the areas requiring registration. Comparison shows that before optimization a registration error occurs in the onion scale leaf sample pictures, so the dashed frames in the splicing result cannot be aligned, as shown in fig. 5. Fig. 6 shows that the optimized phase correlation algorithm achieves effective registration: all regions to be registered in the stitched image obtain the ideal splicing effect.
(2) Fusion effect: fig. 7 and 8 are stitching results of pumpkin stem longitudinal section sample images; fig. 7 is the stitched image obtained with the common weighted fusion algorithm, and fig. 8 is the stitched image obtained with the gradual-in and gradual-out optimization algorithm proposed in this embodiment. Comparison shows a faint splicing trace inside the dashed frame of fig. 7, while the splicing trace in the dashed frame of fig. 8 is effectively eliminated and the overlapping area transitions smoothly, which is key to seamless splicing of large-field-of-view images.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. A digital microscope image rapid splicing and fusing method is characterized by comprising the following steps:
Step one, acquiring images from a multi-modal imaging system:
A stepping motor drives the objective lens to move, acquiring a plurality of image sequences under different fields of view, which are transmitted to the computer, completing the acquisition of the microscopic images; acquired images of adjacent fields of view must keep a partially overlapping area; taking an acquired image I₁(x,y) as the base image, an image whose field of view is adjacent to the base image is selected as the image to be matched I₂(x,y), where I₁(x,y) and I₂(x,y) denote the gray-scale distribution functions of the base image and the image to be matched, respectively, and (x,y) denotes the coordinates of any point of the image;
step two, preprocessing the acquired images: Gaussian filtering is applied to the acquired microscopic images to reduce noise in the stitched image;
step three, registering the images: the preprocessed images are registered using an optimization algorithm based on phase correlation:
step 3.1, calculate the relative displacement between the base image I1(x, y) and the image to be matched I2(x, y); the image to be matched I2(x, y) is displaced from the base image I1(x, y) by a certain amount in a certain direction; let the displacement in the X direction be Δx and the displacement in the Y direction be Δy; the relationship between the two is then:
I2(x, y) = I1(x − Δx, y − Δy)
step 3.2, apply the Fourier transform to the images I1(x, y) and I2(x, y), obtaining F1(ξ, η) and F2(ξ, η); the two then satisfy the following relationship:
F2(ξ, η) = F1(ξ, η) · e^(−j2π(ξΔx + ηΔy))
where (ξ, η) denotes the frequency-domain coordinates, and F1(ξ, η) and F2(ξ, η) denote the frequency-domain distribution functions of the base image and the image to be matched;
step 3.3, compute the cross-power spectrum of the two using the cross-power spectrum formula, obtaining:
F1(ξ, η) F2*(ξ, η) / |F1(ξ, η) F2*(ξ, η)| = e^(j2π(ξΔx + ηΔy))
where F2*(ξ, η) is the complex conjugate of F2(ξ, η); performing the inverse Fourier transform on the cross-power spectrum of the two gives:
σ(x − Δx, y − Δy) = F^(−1)[ e^(j2π(ξΔx + ηΔy)) ] = δ(x − Δx, y − Δy)
where σ(x − Δx, y − Δy) is the impulse function obtained by inverse Fourier transform of the cross-power spectrum of the base image I1(x, y) and the image to be matched I2(x, y); when σ(x − Δx, y − Δy) takes its peak value, i.e. when the phase-correlation function equals 1, the corresponding coordinates are exactly (Δx, Δy); the coordinates of the peak of the impulse function give the relative displacement between the base image and the image to be matched, from which the overlap regions a(x, y) and b(x, y) of the two images are obtained;
step 3.4, calculate the proportion s of the overlap region to the original image; a threshold is set manually, and when the proportion s of the overlap region reaches the threshold, the next operation is performed;
step 3.5, evaluate the similarity of the overlap regions of the two images using a template-registration approach to further verify the accuracy of the overlap; when the evaluation result is greater than a preset threshold, the registration is accurate; otherwise, the images are reacquired and registered again;
step four, fusing the images: the registered images are fused with the optimized gradual-in gradual-out algorithm to complete the stitching of the microscopic images:
step 4.1, the overlap region of the registered base image I1(x, y) and image to be matched I2(x, y) is fused with the gradual-in gradual-out algorithm; before fusion, the gray values of the images are adjusted to reduce the brightness difference between the images and thereby eliminate the stitching seam that the gradual-in gradual-out fusion algorithm would otherwise produce;
step 4.2, after the brightness of the overlap region has been adjusted, the pixel values of the overlapping parts of the two images are summed with appropriate weights to achieve a smooth transition, and the images are fused with the gradual-in gradual-out fusion algorithm.
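The phase-correlation registration of steps 3.1-3.3 can be sketched with NumPy's FFT. This is a minimal illustration, not the patent's implementation: the function name and the synthetic test are invented, and the conjugation order in the cross-power spectrum is chosen so that a shift applied with np.roll is recovered with positive sign (the sign convention differs between presentations).

```python
import numpy as np

def phase_correlation_shift(base, moving, eps=1e-9):
    """Estimate the (dy, dx) translation of `moving` relative to `base`
    from the peak of the inverse-transformed cross-power spectrum."""
    F1 = np.fft.fft2(base)
    F2 = np.fft.fft2(moving)
    cross = np.conj(F1) * F2
    cross /= (np.abs(cross) + eps)       # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real      # impulse peaked at the displacement
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past half the image size encode negative (wrapped) shifts.
    return tuple(int(p) if p <= n // 2 else int(p) - n
                 for p, n in zip(peak, corr.shape))

# Synthetic check: shift an image circularly by (5, 12) and recover it.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
moved = np.roll(img, shift=(5, 12), axis=(0, 1))
print(phase_correlation_shift(img, moved))  # -> (5, 12)
```

Real microscope tiles overlap only partially rather than circularly, so in practice the peak is sharpest when the overlap is large, which is exactly why step 3.4 checks the overlap proportion before proceeding.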
2. The method for rapidly splicing and fusing digital microscope images as claimed in claim 1, wherein the multi-modal imaging system in the first step is composed of a color camera, a multi-modal microscope, a stepping motor and a computer.
3. The digital microscope image rapid splicing and fusing method as claimed in claim 1, characterized in that the second step specifically comprises: the base image I1(x, y) and the image to be matched I2(x, y) acquired from the multi-modal imaging system are preprocessed by convolution with a kernel of window size 3 × 3.
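The 3 × 3 convolution preprocessing of claim 3 can be sketched as follows; the specific kernel weights (a common discrete Gaussian) and the edge-replication padding are assumptions, since the claim fixes only the window size.

```python
import numpy as np

# A common 3x3 discrete Gaussian kernel (weights sum to 1); the exact
# weights are an assumption, the patent specifies only the 3x3 window.
KERNEL = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float) / 16.0

def gaussian_smooth(img):
    """Denoise a grayscale image by 3x3 convolution with replicated edges."""
    padded = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):                 # accumulate the 9 shifted windows
        for dx in range(3):
            out += KERNEL[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

flat = np.full((8, 8), 100.0)
print(gaussian_smooth(flat)[0, 0])  # a constant image is unchanged: 100.0
```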
4. The method for rapidly splicing and fusing the digital microscope images as claimed in claim 1, wherein the threshold value in the step 3.4 is 0.7.
5. The digital microscope image rapid splicing and fusing method as claimed in claim 1, characterized in that the step 3.5 specifically comprises: the overlap region is downsampled by a factor of 0.4; let the downsampled overlap images be a_s and b_s, of image size m_s × n_s; the similarity of the downsampled images is analyzed with a normalized cross-correlation algorithm based on image gray levels; when the computed result is greater than the threshold 0.85, the overlap regions a(x, y) and b(x, y) of the two images are deemed identical; otherwise the overlap regions differ too much, the images must be reacquired, and registration is repeated according to the above steps.
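Claim 5's downsampled similarity check might look like the following sketch; the nearest-neighbor decimation is a coarse stand-in for whatever 0.4× resampling the patent intends, and all names are illustrative.

```python
import numpy as np

def ncc(a, b):
    """Gray-level normalized cross-correlation of two equal-size patches."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def overlaps_consistent(a, b, scale=0.4, threshold=0.85):
    """Decimate both overlap images (coarse stand-in for 0.4x resampling)
    and accept the registration when their NCC exceeds `threshold`."""
    step = max(1, round(1 / scale))
    a_s, b_s = a[::step, ::step], b[::step, ::step]
    return ncc(a_s, b_s) > threshold

rng = np.random.default_rng(1)
patch = rng.random((50, 60))
print(overlaps_consistent(patch, patch))                 # identical -> True
print(overlaps_consistent(patch, rng.random((50, 60))))  # unrelated -> False
```

Downsampling before the correlation keeps this verification cheap, which matches the "rapid" goal of the method.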
6. The digital microscope image rapid splicing and fusing method as claimed in claim 1, characterized in that the step 4.1 specifically comprises: read the gray value of a point A on the left side of the center line of the overlap region; compute the mean gray value of the pixels in the 4-neighborhood of the corresponding point B on the right side of the center line; and compare the gray value of point A with the 4-neighborhood mean of point B. If the value of point A is greater than the 4-neighborhood mean of point B, subtract half the difference from the gray value of point A; otherwise add half the difference; if the difference is within 4 gray levels, do nothing and leave the gray value of point A unchanged. After these steps, proceed to the next pixel in the left region and repeat, until the whole region has been corrected once.
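The per-pixel brightness correction of claim 6 can be illustrated on a single column of values; extracting the 4-neighborhood means of the B points is assumed to happen upstream, and the function name is invented.

```python
import numpy as np

def equalize_left_column(left_vals, right_patch_means, tol=4):
    """Claim 6 sketch: for each left-side gray value A and the 4-neighborhood
    mean of its right-side counterpart B, move A halfway toward the mean
    unless the two already agree within `tol` gray levels."""
    out = left_vals.astype(float).copy()
    for i, (a, m) in enumerate(zip(left_vals.astype(float), right_patch_means)):
        diff = a - m
        if abs(diff) <= tol:
            continue                 # within 4 gray levels: leave unchanged
        out[i] = a - diff / 2.0      # subtracts half if A > mean, adds half otherwise
    return out

a_vals = np.array([100.0, 120.0, 118.0])
b_means = np.array([110.0, 100.0, 117.0])
print(equalize_left_column(a_vals, b_means))  # [105. 110. 118.]
```

Note that `a - diff / 2` covers both branches of the claim in one expression: when A exceeds the mean the (positive) half-difference is subtracted, and when A is below it the half-difference is effectively added.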
7. The digital microscope image rapid splicing and fusing method as claimed in claim 1, characterized in that the step 4.2 specifically comprises: assume the weight factors of the two images are ω1 and ω2, with ω1 + ω2 = 1; the initial value of ω1 is set to 1, width denotes the width of the fusion region of the two images, and the step size of the fade process is ω = 1/width; ω1 decreases by this step size from the left side to the right side of the fusion region until it reaches 0, completing the traversal of the fusion region and realizing the gradual transition of ω1 from 1 to 0; the mathematical expression of the image obtained after fusion is:
G(x, y) = f(x, y),                    (x, y) in the left image only
G(x, y) = ω1·f(x, y) + ω2·g(x, y),   (x, y) in the fusion region
G(x, y) = g(x, y),                    (x, y) in the right image only
where G(x, y) denotes the image obtained after fusion, and f(x, y) and g(x, y) denote the images to the left and right of the center line of the fused portion.
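The gradual-in gradual-out blend of claim 7 can be sketched as below; the fixed-width overlap at the images' facing edges and the use of np.linspace for the 1/width weight stepping are simplifying assumptions, and the function name is invented.

```python
import numpy as np

def fade_fuse(f, g, width):
    """Blend the `width`-pixel overlap between left image f and right image g
    with weight w1 stepping from 1 down to 0 across the fusion region."""
    left_only = f[:, :-width]              # region covered only by f
    right_only = g[:, width:]              # region covered only by g
    w1 = np.linspace(1.0, 0.0, width)      # w1 + w2 = 1 at every column
    overlap = w1 * f[:, -width:] + (1.0 - w1) * g[:, :width]
    return np.hstack([left_only, overlap, right_only])

# Two constant tiles: the fused overlap ramps smoothly from 50 to 150.
f = np.full((4, 10), 50.0)
g = np.full((4, 10), 150.0)
G = fade_fuse(f, g, width=6)
print(G.shape)              # (4, 14)
print(G[0, 4], G[0, 9])     # 50.0 at the overlap's left edge, 150.0 at its right
```

Because the weights sum to 1 everywhere, a brightness step between the tiles turns into a linear ramp, which is what removes the visible seam after the claim-6 gray-level adjustment.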
CN202011053664.6A 2020-09-29 2020-09-29 Digital microscope image rapid splicing and fusion method Active CN112164001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011053664.6A CN112164001B (en) 2020-09-29 2020-09-29 Digital microscope image rapid splicing and fusion method


Publications (2)

Publication Number Publication Date
CN112164001A true CN112164001A (en) 2021-01-01
CN112164001B CN112164001B (en) 2024-06-07

Family

ID=73860740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011053664.6A Active CN112164001B (en) 2020-09-29 2020-09-29 Digital microscope image rapid splicing and fusion method

Country Status (1)

Country Link
CN (1) CN112164001B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114708206A (en) * 2022-03-24 2022-07-05 成都飞机工业(集团)有限责任公司 Method, device, equipment and medium for identifying placing position of autoclave molding tool
CN116542857A (en) * 2023-06-28 2023-08-04 南京凯视迈科技有限公司 Multi-image self-adaptive splicing method based on large similarity
CN116630164A (en) * 2023-07-21 2023-08-22 中国人民解放军国防科技大学 Real-time splicing method for massive microscopic images
CN116978005A (en) * 2023-09-22 2023-10-31 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation

Citations (7)

Publication number Priority date Publication date Assignee Title
WO2009108050A1 (en) * 2008-02-27 2009-09-03 Aleksey Nikolaevich Simonov Image reconstructor
CN101840570A (en) * 2010-04-16 2010-09-22 广东工业大学 Fast image splicing method
CN107093166A (en) * 2017-04-01 2017-08-25 华东师范大学 The seamless joint method of low coincidence factor micro-image
CN109658393A (en) * 2018-12-06 2019-04-19 代黎明 Eye fundus image joining method and system
CN110232673A (en) * 2019-05-30 2019-09-13 电子科技大学 A kind of quick steady image split-joint method based on medical micro-imaging
CN110517213A (en) * 2019-08-22 2019-11-29 杭州图谱光电科技有限公司 A kind of real time field depth continuation method based on laplacian pyramid of microscope
US20200150266A1 (en) * 2017-07-14 2020-05-14 Northwestern University Synthetic Apertures for Long-Range, Sub-Diffraction Limited Visible Imaging Using Fourier Ptychography


Non-Patent Citations (4)

Title
JIASONG S: "Single-shot quantitative phase microscopy based on color-multiplexed Fourier ptychography", OPTICS LETTERS, vol. 43, no. 14, pages 3365 - 3368 *
LI HUI; ZUO CHAO: "Three dimensional micro surface measurement system based on stereomicroscope", JOURNAL OF APPLIED OPTICS, vol. 38, no. 2, pages 270 - 6 *
MIN-SHAN J: "Rapid microscope auto-focus method for uneven surfaces based on image fusion", MICROSCOPY RESEARCH AND TECHNIQUE, vol. 82, no. 9, pages 1621 - 1627 *
XU CHAO; NIE SHILIANG: "Image stitching algorithm based on SURF and an improved gradual-in gradual-out method", DIGITAL TECHNOLOGY AND APPLICATION, no. 12, pages 133 *


Similar Documents

Publication Publication Date Title
CN112164001A (en) Digital microscope image rapid splicing and fusing method
Pandey et al. Image mosaicing: A deeper insight
WO2021120407A1 (en) Parallax image stitching and visualization method based on multiple pairs of binocular cameras
CN105631851B (en) Depth map generation
CN104599258B (en) A kind of image split-joint method based on anisotropic character descriptor
CN111626936B (en) Quick panoramic stitching method and system for microscopic images
US9897792B2 (en) Method and system for extended depth of field calculation for microscopic images
CN108257089B (en) A method of the big visual field video panorama splicing based on iteration closest approach
Behrens et al. Real-time image composition of bladder mosaics in fluorescence endoscopy
CN110322485B (en) Rapid image registration method of heterogeneous multi-camera imaging system
WO2014183385A1 (en) Terminal and image processing method therefor
CN111626927B (en) Binocular image super-resolution method, system and device adopting parallax constraint
EP2926558B1 (en) A method and system for extended depth of field calculation for microscopic images
CN106157246A (en) A kind of full automatic quick cylinder panoramic image joining method
CN111145220B (en) Tunnel target track tracking method based on visual information
WO2021012520A1 (en) Three-dimensional mra medical image splicing method and apparatus, and electronic device and computer-readable storage medium
Pulli et al. Mobile panoramic imaging system
CN116152068A (en) Splicing method for solar panel images
Ali et al. Anisotropic motion estimation on edge preserving Riesz wavelets for robust video mosaicing
CN103700082B (en) Image split-joint method based on dual quaterion relative orientation
CN112330613A (en) Method and system for evaluating quality of cytopathology digital image
CN108401104B (en) Dual-focus camera digital zooming method based on frequency band repair and super-resolution
Tseng et al. Depth image super-resolution via multi-frame registration and deep learning
Weibel et al. Contrast-enhancing seam detection and blending using graph cuts
Huang et al. Crack detection of masonry structure based on thermal and visible image fusion and semantic segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant