CN112998735A - System and method for reconstructing image by scanning device - Google Patents


Info

Publication number: CN112998735A
Authority: CN (China)
Prior art keywords: detector, time, flight, response, representing
Prior art date
Legal status: Granted (legal status is an assumption, not a legal conclusion)
Application number: CN202110216943.8A
Other languages: Chinese (zh)
Other versions: CN112998735B (en)
Inventors: 郭铭浩, 洪翔, 昝云龙, 黄秋, 赵指向
Current Assignee: Zhongpai S&t Shenzhen Co ltd; Shanghai Jiaotong University
Original Assignee: Zhongpai S&t Shenzhen Co ltd; Shanghai Jiaotong University
Priority date
Filing date
Publication date
Application filed by Zhongpai S&t Shenzhen Co ltd and Shanghai Jiaotong University
Priority to CN202110216943.8A
Publication of CN112998735A
Application granted; publication of CN112998735B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/037 Emission tomography
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/58 Testing, adjusting or calibrating thereof
    • A61B6/582 Calibration
    • A61B6/583 Calibration using calibration phantoms
    • A61B6/585 Calibration of detector units


Abstract

The invention provides a system and a method for reconstructing an image by a scanning device. The method comprises the following steps: placing a phantom in a preset area of a scanning field of view, and obtaining the relative position relationship between the phantom and the center of the scanning field of view; defining a response line, and calculating the intersection length of the response line and the phantom; collecting coincidence events, and counting the number of coincidence events included in the response line to calculate the mean value of the time-of-flight difference of the response line; correcting the mean value of the time-of-flight difference to obtain a mean correction value of the time-of-flight difference; establishing a calculation model; using the calculation model to obtain the time resolution of each detector; and using the time resolution of each detector to reconstruct the image. By reconstructing with per-detector time resolutions, the invention can improve the quality of the reconstructed image.

Description

System and method for reconstructing image by scanning device
Technical Field
The present invention relates to the field of medical imaging technologies, and in particular, to a system and a method for reconstructing an image by a scanning device.
Background
The time-of-flight positron emission tomography (TOF-PET) scanner is an advanced functional imaging tool in nuclear medicine, and its application prospects are highly valued by researchers and manufacturers of nuclear medicine imaging equipment. Its imaging principle is as follows: before scanning, a tracer containing a radionuclide is injected into the organism; the tracer undergoes β+ decay in the organism and produces positrons; each positron annihilates upon meeting an electron in the body, producing a pair of gamma photons traveling in opposite directions with the same energy; detectors surrounding the examined body detect these photon pairs, and the information is stored in the form of coincidence events; through a series of electronic responses, the signals are input to a computer, which generates an image reflecting the distribution of the tracer in the body via a corresponding image reconstruction algorithm. TOF-PET has a time measurement function that can determine the position and intensity of the radionuclide distribution within a coincidence time window: using the difference in arrival times at the detectors of the two 511 keV gamma photons produced by positron annihilation, and the speed of light, it locates the likely position of the annihilation event on a Line of Response (LOR). This can improve the imaging quality of a PET scanner, reduce the tracer dosage, and shorten the scanning time.
In the conventional technology, a fixed system time resolution is used in the TOF-PET image reconstruction algorithm; that is, all coincidence events use the same time resolution to estimate the position where annihilation may have occurred, ignoring factors such as differences in scintillation crystal material or size, circuit design, system installation, or the clock characteristics of elements in the TOF-PET system. If the time resolutions of all the detectors are assumed to be the same, the actual time-of-flight measurement accuracy of a coincidence event is overestimated or underestimated, and the image reconstruction quality is affected.
In the reconstruction algorithm of TOF-PET, the probability density of the possible annihilation positions is estimated from the time-of-flight difference of a coincidence event and is generally assumed to satisfy a normal distribution: the point on the response line obtained from the time-of-flight difference serves as the mean of the normal distribution, and the time resolution of the response line (in the conventional technology, the time resolution of all response lines is considered the same) serves as the full width at half maximum of the normal distribution (i.e., 2.355 times the standard deviation). Unlike conventional (non-TOF) PET reconstruction algorithms, in which all positions along the line of response are considered equally likely annihilation sites, TOF-PET reconstruction can improve the imaging quality of the PET scanner, reduce drug usage, and shorten scan times. Obtaining an accurate time resolution for each line of response is the basis on which TOF-PET reconstruction algorithms can reconstruct accurately.
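The 2.355 factor above is the standard Gaussian ratio between full width at half maximum and standard deviation, 2√(2 ln 2). A minimal sketch of the conversion (the helper names are illustrative, not from the patent):

```python
import math

def fwhm_to_sigma(fwhm: float) -> float:
    """Convert a Gaussian full width at half maximum (FWHM) to the
    standard deviation: FWHM = 2*sqrt(2*ln 2)*sigma, approximately 2.355*sigma."""
    return fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

def sigma_to_fwhm(sigma: float) -> float:
    """Inverse conversion, used when quoting a time resolution as FWHM."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma
```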
Furthermore, the time resolution of the system is regarded by skilled artisans and researchers as an important indicator for evaluating the performance of a TOF-PET system, for example in the NEMA (National Electrical Manufacturers Association) standard in the United States. In the conventional technology, the overall time resolution of a TOF-PET system is usually obtained by experimental means. This has limitations; for example, a small number of detectors with significant time measurement errors cannot be accurately evaluated. In addition, the time measurement accuracy of different detectors differs to some degree, and evaluating the differences between detectors is also an important aspect of measuring the time measurement performance of the system.
Disclosure of Invention
In view of the above-mentioned drawbacks of the prior art, the present invention provides a system and a method for reconstructing an image by a scanning device, which calculate the time resolution of each detector and then reconstruct the image according to those time resolutions, thereby accounting for the influence of detector time resolution on the reconstructed image and improving the quality of the reconstructed image.
To achieve the above and other objects, the present invention provides a method for reconstructing an image by a scanning device, comprising:
placing a phantom in a preset area of a scanning field of view, and obtaining the relative position relationship between the phantom and the center of the scanning field of view, wherein the phantom is located inside a detector ring in the scanning field of view, the detector ring comprises a plurality of detectors at different positions, and each detector has the same or a different time resolution;
defining a response line, and calculating the intersection length of the response line and the phantom, wherein a connecting line between a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the time resolution of the response line is related to the time resolution of the first detector and the time resolution of the second detector;
collecting coincidence events, and counting the number of coincidence events included in the response line to calculate the mean value of the time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
correcting the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
establishing a calculation model according to the mean value correction value of the flight time difference, the number of the coincidence events, the intersection length and the time resolution of the response line;
using the computational model to obtain a temporal resolution of each of the detectors;
obtaining the temporal resolution of the lines of response using the temporal resolution of each of the detectors to reconstruct an image.
Further, the temporal resolution of any two of the detectors may be the same or different.
Further, the time resolution of the response line satisfies the following formula:

$$\sigma = \sqrt{\sigma_1^2 + \sigma_2^2}$$

where $\sigma$ represents the time standard deviation of the line of response, and $\sigma_1$ and $\sigma_2$ represent the time standard deviations of the first detector and the second detector, respectively.
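Because the two detectors' timing errors are independent, their standard deviations combine in quadrature. A minimal sketch (the helper name is illustrative, not from the patent):

```python
import math

def lor_time_sigma(sigma1: float, sigma2: float) -> float:
    """Time standard deviation of a line of response, combining the two
    detectors' independent timing standard deviations in quadrature."""
    return math.sqrt(sigma1 ** 2 + sigma2 ** 2)
```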
Further, the step of correcting the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference includes:
correcting the time at which the first ray is detected by the first detector according to the distance between the first detector and the phantom, to obtain a time correction value for the first ray detected by the first detector;
correcting the time at which the second ray is detected by the second detector according to the distance between the second detector and the phantom, to obtain a time correction value for the second ray detected by the second detector;
calculating a corrected value for the time of flight difference for each of the coincident events;
and obtaining a corrected value of the mean value of the time-of-flight differences according to the corrected value of the time-of-flight differences and the number of coincidence events.
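The correction steps above can be sketched as follows, assuming detection times in nanoseconds and detector-to-phantom distances in millimetres (units and variable names are illustrative assumptions):

```python
C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def mean_tof_correction(events, d1_mm, d2_mm):
    """Subtract each detector's photon travel time to the phantom from its
    detection time, form the corrected time-of-flight difference for each
    coincidence event, and average over all events on the response line.
    events: list of (t1_ns, t2_ns) detection-time pairs."""
    corrected = []
    for t1, t2 in events:
        t1c = t1 - d1_mm / C_MM_PER_NS   # corrected first-detector time
        t2c = t2 - d2_mm / C_MM_PER_NS   # corrected second-detector time
        corrected.append(t1c - t2c)
    return sum(corrected) / len(corrected)
```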
Further, the standard deviation of the time-of-flight differences positively correlates to the intersection length and the standard deviation of the time-of-flight differences positively correlates to the time resolution of the line of response, the mean of the time-of-flight differences tending towards the expectation of the time-of-flight differences.
Further, the time at which the first ray is detected by the first detector is corrected according to the distance from the first detector to the phantom, so as to obtain a corrected detection time for the first ray, defined as $T_{1c}$:

$$T_{1c} = T_1 - d_{13}/c$$

where $T_{1c}$ represents the time correction value of the first ray detected by the first detector, $T_1$ represents the time at which the first ray was detected by the first detector, $d_{13}$ represents the distance from the first detector to the phantom, and $c$ is the speed of light.
Further, the calculation model is

$$\overline{T}_c \sim \mathrm{Norm}\!\left(0,\; \frac{\sigma_1^2 + \sigma_2^2 + R^2/(3c^2)}{N}\right)$$

where $\sim$ denotes that the mean correction value $\overline{T}_c$ of the time-of-flight difference satisfies the normal distribution (expressed by Norm), $N$ represents the number of coincidence events, $R$ represents the intersection length, $\sigma_1$ and $\sigma_2$ represent the time standard deviations of the first detector and the second detector, and $c$ represents the speed of light.
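The variance term of this model can be checked by simulation: a sketch assuming the annihilation position is uniformly distributed over the intersection length R and each detector contributes independent Gaussian timing jitter (units of millimetres and nanoseconds, and all names, are illustrative assumptions):

```python
import random

C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def simulate_tof_variance(n, sigma1_ns, sigma2_ns, r_mm, seed=0):
    """Monte-Carlo estimate of the variance of the corrected per-event
    time-of-flight difference: the annihilation point is drawn uniformly
    over the intersection length R, and each detector adds Gaussian
    timing jitter."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        x = rng.uniform(-r_mm / 2, r_mm / 2)          # offset along the LOR
        t1 = -x / C_MM_PER_NS + rng.gauss(0.0, sigma1_ns)
        t2 = x / C_MM_PER_NS + rng.gauss(0.0, sigma2_ns)
        samples.append(t1 - t2)
    mean = sum(samples) / n
    return sum((s - mean) ** 2 for s in samples) / n

def model_variance(sigma1_ns, sigma2_ns, r_mm):
    """Predicted single-event variance: sigma1^2 + sigma2^2 + R^2/(3 c^2)."""
    return sigma1_ns ** 2 + sigma2_ns ** 2 + r_mm ** 2 / (3 * C_MM_PER_NS ** 2)
```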
Further, using the calculation model, the time resolution of each detector is obtained from the formula

$$\{\hat{\sigma}\} = \arg\min_{\{\sigma\}} \sum_i \left[\ln\frac{\sigma_{1,i}^2 + \sigma_{2,i}^2 + R_i^2/(3c^2)}{N_i} + \frac{N_i\,\overline{T}_{c,i}^{\,2}}{\sigma_{1,i}^2 + \sigma_{2,i}^2 + R_i^2/(3c^2)}\right]$$

where $\overline{T}_{c,i}$ represents the mean correction value of the time-of-flight difference of the $i$-th response line, $\sum_i$ represents summing over all the response lines, $N_i$ represents the number of coincidence events, $R_i$ represents the intersection length, $\sigma_{1,i}$ and $\sigma_{2,i}$ represent the time standard deviations of the first detector and the second detector of that line, $c$ represents the speed of light, and $\sigma$ represents the time standard deviation of each detector;

wherein the time resolution of the detector is $\mathrm{FWHM} = 2.355\,\sigma$.
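In the simplest case, once the geometric term is accounted for, each response line provides a measurement of $\sigma_1^2 + \sigma_2^2$, and the per-detector variances can be recovered by least squares. A toy sketch with three detectors, using a Gauss-Seidel iteration (an assumed illustrative solver, not the patent's):

```python
def solve_detector_variances(lor_var, iters=200):
    """Recover per-detector timing variances sigma_d^2 from per-LOR
    variance measurements v_ij = sigma_i^2 + sigma_j^2 by a simple
    Gauss-Seidel least-squares iteration.
    lor_var: dict mapping a detector-id pair (i, j) to its measured v_ij."""
    detectors = sorted({d for pair in lor_var for d in pair})
    s = {d: 0.0 for d in detectors}
    for _ in range(iters):
        for d in detectors:
            # For every LOR containing detector d, subtract the partner's
            # current variance estimate and average the results.
            terms = [v - s[j if i == d else i]
                     for (i, j), v in lor_var.items() if d in (i, j)]
            s[d] = sum(terms) / len(terms)
    return s
```

With three detectors and all three pairwise LORs measured, the system is fully determined and the iteration converges to the exact per-detector variances.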
Further, the step of obtaining the time resolution of the lines of response using the time resolution of each of the detectors to reconstruct an image includes:
calculating the time resolution of the response line from the time resolution of the first detector and the time resolution of the second detector, expressed by the following formula:

$$\sigma = \sqrt{\sigma_1^2 + \sigma_2^2}$$

where $\sigma$ represents the time standard deviation of the line of response, and $\sigma_1$ and $\sigma_2$ represent the time standard deviations of the first detector and the second detector, respectively;
estimating the position at which the annihilation event occurred according to the time-of-flight difference $T$ of each coincidence event, which satisfies the following probability distribution:

$$P(\vec{x}) = \mathrm{Norm}\!\left(\vec{x};\ \vec{x}_0,\ (c\sigma/2)^2\right)$$

where $\vec{x}$ is a point on the line of response and $P(\vec{x})$ indicates the probability density of annihilation occurring there; from the time-of-flight difference $T$, the first detector position $\vec{x}_1$, and the second detector position $\vec{x}_2$, the center $\vec{x}_0$ of all possible annihilation positions is calculated by the following formula:

$$\vec{x}_0 = \frac{\vec{x}_1 + \vec{x}_2}{2} + \frac{cT}{2}\,\hat{u}, \qquad \hat{u} = \frac{\vec{x}_2 - \vec{x}_1}{\left|\vec{x}_2 - \vec{x}_1\right|}$$

where $\hat{u}$ represents the unit vector pointing from the first detector position to the second detector position, $|\cdot|$ represents the vector length, and $c$ represents the speed of light;
and obtaining the probability distribution of the position of the annihilation event according to the time-of-flight difference $T$ of each coincidence event so as to reconstruct the image.
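The center and Gaussian weighting described above can be sketched as follows, with 2-D detector positions in millimetres and times in nanoseconds (units, sign convention T = t1 - t2, and function names are illustrative assumptions):

```python
import math

C_MM_PER_NS = 299.792458  # speed of light in mm/ns

def annihilation_center(x1, x2, tof_diff_ns):
    """Most likely annihilation point on the LOR between detector
    positions x1 and x2, given the time-of-flight difference T = t1 - t2.
    A positive T shifts the center toward the second detector."""
    ux, uy = x2[0] - x1[0], x2[1] - x1[1]
    norm = math.hypot(ux, uy)
    ux, uy = ux / norm, uy / norm                       # unit vector x1 -> x2
    mx, my = (x1[0] + x2[0]) / 2, (x1[1] + x2[1]) / 2   # LOR midpoint
    shift = C_MM_PER_NS * tof_diff_ns / 2
    return (mx + shift * ux, my + shift * uy)

def tof_weight(x, center, lor_sigma_ns):
    """Gaussian probability density of annihilation at point x on the LOR;
    the spatial standard deviation is c * sigma / 2."""
    s = C_MM_PER_NS * lor_sigma_ns / 2
    d = math.hypot(x[0] - center[0], x[1] - center[1])
    return math.exp(-d * d / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
```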
Further, the present invention also provides a system for reconstructing an image by a scanning device, comprising:
the phantom position acquisition unit, configured to acquire the relative position relationship of the phantom with respect to the center of a scanning field of view, wherein the phantom is located inside a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
the response line acquisition unit is used for acquiring a response line and calculating the intersection length of the response line and the phantom, wherein a connecting line between the first detector and the second detector is defined as the response line, and the response line penetrates through the phantom; a temporal resolution of the line of response is related to the temporal resolution of the first detector and the temporal resolution of the second detector;
the acquisition unit, configured to collect coincidence events and count the number of coincidence events included in the response line so as to calculate the mean value of the time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
the correction unit, configured to correct the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
the model establishing unit is used for establishing a calculation model according to the mean value correction value of the flight time difference, the number of the coincidence events, the intersection length and the time resolution of the response line;
a model processing unit for using the computational model to obtain a temporal resolution of each of the detectors;
a reconstruction unit for obtaining the temporal resolution of the lines of response according to the temporal resolution of each of the detectors to reconstruct an image.
In summary, the present invention provides a system and a method for reconstructing an image by a scanning device. A phantom is first placed in a preset region of a scanning field of view and the relative position relationship between the phantom and the center of the scanning field of view is obtained; a response line and coincidence events are then defined, the intersection length of the response line and the phantom is calculated, and the number of coincidence events included in the response line is counted. The intersection of the first ray with the first detector is defined as a first intersection point and the intersection of the first ray with the phantom as a third intersection point, so that the time at which the first ray is detected by the first detector can be corrected according to the distance between the first and third intersection points; the time at which the second ray is detected by the second detector is corrected likewise. The time-of-flight difference of each coincidence event can thus be corrected, yielding the corrected time-of-flight differences and hence the mean correction value of the time-of-flight difference. A calculation model is then established from the time resolution of the response line, the number of coincidence events, the intersection length, and the mean correction value of the time-of-flight difference; the model is processed to obtain the time resolution of each detector, and the reconstruction unit then reconstructs an image from the time resolution of each detector.
According to the invention, by measuring the time resolution of each detector, when the time resolution of the first detector and/or the second detector is poor, those detectors can be avoided, so that the quality of the reconstructed image, i.e., the accuracy of the reconstructed TOF-PET image, can be improved.
Drawings
FIG. 1: structural diagram of a TOF-PET apparatus according to the invention.
FIG. 2: schematic cross-sectional view of a detector ring mounted on the gantry of FIG. 1.
FIG. 3: schematic longitudinal cross-sectional view of the detector ring of FIG. 2.
FIG. 4: diagram explaining the principle of the TOF-PET reconstruction method used by the reconstruction unit of FIG. 1.
FIG. 5: flowchart of the method for reconstructing an image by a scanning device according to the invention.
FIG. 6: simplified schematic diagram of the phantom and detector ring.
FIG. 7: schematic diagram of obtaining the response line.
FIG. 8: schematic diagram of collecting coincidence events.
FIG. 9: diagram of the system for reconstructing an image by a scanning device according to the invention.
FIG. 10: schematic diagram of an electronic device of the invention.
FIG. 11: schematic diagram of a computer storage medium of the invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided in the present embodiment are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
As shown in fig. 1, the present embodiment proposes a Positron Emission Tomography (PET) apparatus, which may be a TOF (Time Of Flight) -PET apparatus. The TOF-PET apparatus 10 includes a control unit 11, a gantry 12, a signal processing unit 13, a coincidence counting unit 14, a storage unit 15, a reconstruction unit 16, a display unit 17, and an operation unit 18.
As shown in fig. 2-3, fig. 2 shows a schematic cross-sectional view of a detector ring 100 mounted on a gantry 12, and fig. 3 is a cross-sectional view taken along line A-A of fig. 2. Gantry 12 has a plurality of detector rings 100 arranged along a circumferential central axis Z. The detector ring 100 has a plurality of detectors 110 arranged on a circumference around the central axis Z. An image Field Of View (FOV) is formed in the opening of the detector ring 100. The bed plate 140 on which the phantom 200 is placed is inserted into the opening of the detector ring 100 so that the portion of the phantom 200 to be imaged enters the FOV. The phantom 200 is placed on the bed plate 140 so that its body axis coincides with the central axis Z. A drug labeled with a radioisotope is injected into the phantom 200 for PET imaging. The detector 110 detects pairs of annihilation gamma rays emitted from inside the phantom 200 and generates a pulse-like electric signal according to the amount of light detected from the pair of annihilation gamma rays.
As shown in FIG. 2, the detector 110 has a plurality of scintillators 120 and a plurality of photomultiplier tubes 130. The scintillator 120 receives gamma rays resulting from pair-wise annihilation by the radioisotopes within the phantom 200, producing scintillation light. Each scintillator is configured such that a long axis direction of each scintillator substantially coincides with a radial direction of the detector ring. The photomultiplier tube 130 is disposed on one end portion of the scintillator 120 with respect to a radial direction orthogonal to the central axis Z. An optical waveguide (Light Guide) (not shown) may or may not be provided between the scintillator 120 and the photomultiplier tube 130. The plurality of scintillators 120 and the plurality of photomultiplier tubes 130 included in the detector ring 100 are arranged in concentric circles (concentric cylinders). Scintillation light generated in the scintillator 120 propagates within the scintillator 120 and toward the photomultiplier tube 130. The photomultiplier tube 130 generates a pulse-like electric signal according to the amount of scintillation light. The generated electric signal is supplied to the signal processing section 13 as shown in fig. 1.
As shown in fig. 1, the signal processing unit 13 generates a single photon event from the electric signal from the photomultiplier tube 130. Specifically, the signal processing unit 13 performs detection time measurement processing, position calculation processing, and energy calculation processing. In the detection timing measurement process, the signal processing section 13 measures the detection timing of the gamma ray by the detector 110. Specifically, the signal processing unit 13 monitors the peak value of the electric signal from the photomultiplier tube 130. Then, the signal processing unit 13 measures, as the detection time, a time at which the peak value of the electric signal exceeds a preset threshold value. That is, the signal processing unit 13 electrically detects annihilation γ -rays by detecting that the intensity of the electric signal exceeds a threshold value. In the position calculation process, the signal processing unit 13 calculates the incident position of annihilation gamma-rays based on the electric signals from the photomultiplier tube 130. The incidence position of the annihilation gamma-ray corresponds to the position coordinates of the scintillator 120 on which the annihilation gamma-ray is incident. In the energy calculation process, the signal processing unit 13 calculates an energy value of annihilation gamma-rays incident on the scintillator 120 based on the electric signal from the photomultiplier tube 130. Detection time data, position coordinate data, and energy value data relating to the single photon events are associated together. The combination of the energy value data, the position coordinate data, and the detection time instant data relating to the single photon event is referred to as single photon event data. The single photon event data is generated sequentially each time an annihilation gamma ray is detected. 
The generated single photon event data is supplied to the coincidence counting section 14, and the time data of the single photon event data is corrected by the time shift amount of the corresponding detector.
As shown in fig. 1, the coincidence counting unit 14 performs coincidence counting processing on single-photon event data relating to a plurality of single events. Specifically, the coincidence counting unit 14 repeatedly identifies, from the repeatedly supplied single photon event data, event data for 2 single photon events contained within a predetermined time range. The time range is set to, for example, about 1 ns to 15 ns. The pair of single photon events is presumed to be due to a pair of annihilation gamma rays generated from the same annihilation site. Paired single photon events are referred to as coincidence events. The line connecting the pair of detectors 110 (more specifically, the scintillators 120) that detect the pair of annihilation gamma rays is called the LOR (Line Of Response). In this way, the coincidence counting section 14 counts coincidence events for each LOR. Event data relating to a pair of events constituting the LOR (hereinafter referred to as coincidence event data) is stored in the storage unit 15.
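A minimal sketch of the coincidence-window pairing described above, scanning time-sorted single-photon events (the 10 ns default window and the tuple layout are assumptions for illustration):

```python
def find_coincidences(events, window_ns=10.0):
    """Scan time-sorted single-photon events and pair two consecutive
    events from different detectors whose detection times differ by no
    more than the coincidence time window.
    events: list of (time_ns, detector_id) tuples, sorted by time."""
    pairs = []
    i = 0
    while i + 1 < len(events):
        t0, d0 = events[i]
        t1, d1 = events[i + 1]
        if t1 - t0 <= window_ns and d0 != d1:
            pairs.append(((t0, d0), (t1, d1)))
            i += 2  # both photons consumed by this coincidence
        else:
            i += 1
    return pairs
```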
As shown in fig. 1, the reconstruction unit 16 reconstructs image data representing the spatial distribution of the concentration of the radioisotope in the subject from coincidence event data on a plurality of coincidence events. The reconstruction unit 16 executes a reconstruction method (hereinafter referred to as a TOF-PET reconstruction method) using a difference in detection times of a pair of annihilation gamma rays. In the TOF-PET reconstruction method, the probability of existence of pair-wise annihilation points in each pixel on the LOR differs depending on the difference in detection time of a coincidence event.
Fig. 4 is a diagram for explaining the principle of the TOF-PET reconstruction method. As shown in fig. 4, suppose a pair of annihilation gamma rays is detected by the first detector 111 at detection time $t_1$ and by the second detector 112 at detection time $t_2$. The difference between the distance from the annihilation position to the second detector 112 and the distance from the annihilation position to the first detector 111 can then be calculated by the following equation:

$$\Delta d = c\,(t_2 - t_1) \tag{1}$$
the reconstruction unit 16 calculates the position of the pair of annihilation points on the LOR for each coincidence event using equation (1). The position of the LOR is calculated by the reconstruction unit 16 from 2 detected positions of 2 events constituting the LOR. When calculating the position of the pair of annihilation points, the reconstruction unit 16 sets a weight corresponding to the existence probability of the pair of annihilation points for each pixel on the LOR. The weight of the target pixel is set to become smaller as the distance from the pair of annihilation points becomes longer. The better the temporal resolution, the higher the accuracy of the calculated distribution of the pair annihilation points. Therefore, the better the temporal resolution, the higher the weight of the pixels of the calculated pair annihilation point relative to the other pixels. The reconstruction unit 16 reconstructs image data from the coincidence event data using the weights set in this manner. For example, the reconstruction unit 16 generates PET projection data representing the position and the count number of LORs from the coincidence event data. Then, the reconstruction unit 16 generates image data by TOF-PET reconstruction from the generated projection data. The reconstructed image data is supplied to the storage unit 15. In this way, TOF-PET reconstruction methods make use of the difference in detection times of coincidence events and can improve the signal-to-noise ratio compared to reconstruction methods that do not make use of the difference in detection times. That is, in the TOF-PET reconstruction method, the time resolution is an important parameter.
As shown in fig. 1, the display section 17 displays an image corresponding to the image data on the display device. As the display device, a CRT display, a liquid crystal display, an organic EL display, a plasma display, or the like can be suitably used. The operation unit 18 receives various commands and information inputs from an operator via an input device. As the input device, a keyboard, a mouse, various buttons, a touch panel, and the like can be suitably used.
As shown in fig. 3, gantry 12 is provided with a plurality of detector rings 100 arranged along the Z-axis. In fig. 3, 3 detector rings 100 are shown for illustration. Each detector ring 100 has a plurality of scintillator rings 121 arranged along the Z-axis. The scintillator ring 121 is constituted by a plurality of scintillators 120 arrayed in a substantially circumferential direction around the Z-axis. In fig. 3, 3 scintillator rings 121 are shown for each detector ring 100 for illustration. Hereinafter, the number of all scintillator rings 121 included in the plurality of detector rings arranged along the central axis Z is referred to as a column number. In fig. 3, the number of columns of the scintillator ring 121, that is, the number of columns of the scintillators 120 is 9. In addition, all of the scintillators 120 contained within the gantry 12 are referred to generally as a scintillator pack 360. The number of columns of the scintillator ring 121 (the number of columns of the scintillators 120), the number of detectors 110 in the detector ring 100, and the number of scintillators 120 in the detectors 110 are not limited to those shown in fig. 3.
In the detector 110, the plurality of scintillators 120 are arranged, for example, two-dimensionally. The scintillator 120 according to the present embodiment may be formed of any type of scintillator material, for example NaI (sodium iodide), BGO (bismuth germanate), LSO (lutetium oxyorthosilicate, to which a certain amount of cerium may be added), LaBr3:Ce, or LYSO (a mixed crystal of LSO and yttrium silicate). Lutetium-based crystals are often used as the material of the scintillator 120. In addition to the above materials, the scintillator 120 may also be formed of, for example, a gallium-based crystal or a garnet-based crystal.
Before describing particular embodiments, specific terms or concepts related to embodiments of the present invention are explained herein:
Line of response: the line connecting the two crystal bars at which the pair of gamma photons is detected is called the Line Of Response (LOR).
Coincidence event: a pair of coincidence events is considered to have occurred when two 511 keV gamma photons are detected within a predetermined coincidence time window (e.g., on the order of 1-15 ns).
Coincidence time window: the time duration set for the time difference between the arrivals of the two gamma photons at the detectors.
Scatter coincidence: refers to a pair of gamma photons produced by annihilation radiation of which at least one is scattered by tissue before arrival but is nevertheless detected within the coincidence time window.
Random coincidence: a false coincidence, i.e., a coincidence event in which two gamma photons that have no temporal or spatial correlation are erroneously detected within the coincidence time window.
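To make the coincidence-window concept concrete, the sketch below (illustrative only; the function name and the 10 ns default window are assumptions, not part of the patent) pairs single detections whose arrival times fall within a coincidence time window:

```python
def find_coincidences(detections, window_ns=10.0):
    """Pair up single detections whose arrival times differ by no more than
    the coincidence time window.  `detections` is a list of
    (time_ns, detector_id) tuples; returns the paired detector ids."""
    events = sorted(detections)  # order by arrival time
    pairs = []
    for i in range(len(events) - 1):
        t1, d1 = events[i]
        t2, d2 = events[i + 1]
        # different detectors, arrival times within the window
        if d1 != d2 and (t2 - t1) <= window_ns:
            pairs.append((d1, d2))
    return pairs

# Detections at 0 ns and 4 ns fall inside the window and form a coincidence;
# the detection at 120 ns remains unpaired.
pairs = find_coincidences([(0.0, 3), (4.0, 9), (120.0, 5)], window_ns=10.0)
```

A real coincidence processor would additionally apply an energy window around 511 keV before pairing; that step is omitted here.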
In some embodiments, a line source phantom is used to estimate the temporal resolution of the system. For example, in the NEMA standard, the temporal resolution of the system is measured as the full width at half maximum, over all coincidences near the line source phantom, of the error between the measured time-of-flight differences and their theoretical values. The temporal resolution thus obtained is a single value that ignores the variability of the time-measurement performance of the different detectors. Moreover, such a time resolution cannot accurately describe the measurement accuracy of the time-of-flight difference on an individual line of response, which in turn degrades the quality of the reconstructed image.
The present embodiment provides a system and a method for reconstructing an image with a scanning device, which obtain the time resolution of each detector in a TOF-PET device, so that detectors with poorer time resolution can be screened out and the accuracy of TOF-PET image reconstruction improved.
As shown in fig. 5, the present embodiment provides a method for reconstructing an image by a scanning device, including:
S1: placing a phantom in a preset area of the scanning field of view and obtaining the relative positional relationship between the phantom and the center of the scanning field of view, wherein the phantom is positioned in a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
S2: defining a line of response and calculating the intersection length of the line of response with the phantom, wherein the line connecting a first detector and a second detector is defined as the line of response, the line of response passes through the phantom, and the time resolution of the line of response is related to the time resolution of the first detector and the time resolution of the second detector;
S3: collecting coincidence events and counting the number of coincidence events included in the line of response to calculate the mean of the time-of-flight differences of the line of response, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset coincidence time window;
S4: correcting the mean of the time-of-flight differences according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a corrected mean of the time-of-flight differences;
S5: establishing a calculation model from the corrected mean of the time-of-flight differences, the number of coincidence events, the intersection length, and the time resolution of the line of response;
S6: using the calculation model to obtain the time resolution of each of the detectors;
S7: using the time resolution of each of the detectors, obtaining the time resolution of the lines of response and reconstructing an image.
As shown in fig. 2 and 6, before the scanning device 10 is used, it is calibrated in step S1. For example, the phantom 200 is first placed in the detector ring 100; the phantom 200 may be any one of a line radiation source, a uniform barrel radiation source, and a uniform hollow-barrel radiation source. The phantom 200 is positioned within the scan field of view and parallel to the central axis of the cylindrical gantry 20. In this embodiment, the axial field-of-view length of the detector ring 100 is greater than the axial length of the phantom 200, so data can be acquired at axial locations in a continuous or discrete manner. Likewise, when the cross-section (or axial length) of the phantom 200 is significantly smaller than that of the detector ring 100, data may be collected at the cross-sectional location in a continuous or discrete manner. In this embodiment, the detector rings 100 are used to detect radiation such that each detector within each detector ring 100 collects no fewer than 10 coincidence events; as the cross-sectional size of a detector grows, the number of coincidence events it collects also increases. As can be seen in fig. 6, the detector ring 100 includes a plurality of detectors at different positions. Furthermore, the time-measurement capabilities (and hence the time resolutions of the lines of response) of the plurality of detectors included in the detector ring 100 also vary.
As shown in fig. 6, in the present embodiment the phantom 200 is located at a predetermined position in the detector ring 100; the predetermined position may coincide with the center of the detector ring 100 or may be at some distance from it. After the phantom 200 is placed in the detector ring 100, the relative position of the phantom 200 with respect to the detector ring 100, including the inclination angle of the phantom 200, may also be determined by image reconstruction or external measurement. In this embodiment, the size of the phantom 200 and its position within the detector ring 100 may be used as parameters for subsequent modeling. It should be noted that in the present embodiment the phantom 200 is, for example, a barrel-shaped solid uniform radiation source whose radiation dose is uniformly distributed.
As shown in fig. 7, in step S2 the present embodiment defines the line connecting the first detector 111 and the second detector 112 as a line of response; for example, the first response line 101, the second response line 102, and the third response line 103 are shown in fig. 7. The first response line 101 does not intersect the phantom 200, while the second response line 102 and the third response line 103 do. That is, the second response line 102 and the third response line 103 can arise from true coincidence events, whereas the first response line 101 can only arise from random or scatter coincidence events; therefore the second response line 102 and the third response line 103 are taken as objects of study and the first response line 101 is not, which improves the validity of the modeling. It should be noted that when a phantom having a certain volume, such as a barrel source, is used, this embodiment also rejects lines of response that are tangent to the phantom 200 or intersect it over only a very short length (e.g., less than 2 cm). Since the relative positional relationship between the phantom 200 and the detector ring 100 has been obtained, the intersection lengths of the second response line 102 and the third response line 103 with the phantom 200 can be calculated.
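For a circular phantom cross-section, the intersection length can be computed in closed form. The following sketch (2D transaxial plane, idealized geometry assumed for illustration) returns the chord length where a line of response crosses the phantom, returning 0 when the line misses or is tangent, which matches the screening rule above:

```python
import math

def intersection_length(p1, p2, center, radius):
    """Length of the chord where the line through detector positions p1 and
    p2 (a line of response) crosses a circular phantom cross-section;
    0.0 if the line misses or is tangent to the phantom."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    seg = math.hypot(dx, dy)
    # perpendicular distance from the circle center to the infinite line
    dist = abs(dy * (center[0] - x1) - dx * (center[1] - y1)) / seg
    if dist >= radius:
        return 0.0
    return 2.0 * math.sqrt(radius * radius - dist * dist)

# LOR through the center of a 100 mm radius phantom: chord = diameter (200 mm)
full = intersection_length((-400, 0), (400, 0), (0, 0), 100.0)
# LOR offset 60 mm from the center: chord = 2*sqrt(100^2 - 60^2) = 160 mm
off = intersection_length((-400, 60), (400, 60), (0, 0), 100.0)
```

A tilted barrel phantom would require the 3D line-cylinder analogue of this computation, but the principle is the same.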
As shown in fig. 8, in step S3 the present embodiment takes the second response line 102 as an example, that is, the number of coincidence events included in the second response line 102 is obtained. When a positron is emitted from a first radiation point 201 in the phantom 200, it encounters a nearby electron and annihilates with it, generating a pair of oppositely directed gamma photons of equal energy that can be detected by the detector ring 100. In the present embodiment, the first radiation point 201 emits such a pair: a first ray L1 and a second ray L2 with opposite directions and the same energy. Within a preset coincidence time window, the first ray L1 is detected by the first detector 111 and the second ray L2 by the second detector 112; for example, if the first ray L1 and the second ray L2 are detected within 10 ns, a pair of coincidence events is considered to have occurred. Similarly, the second radiation point 202 also emits a pair of oppositely directed rays L1 and L2 of the same energy, and when the first ray L1 is detected by the first detector 111 and the second ray L2 by the second detector 112 within the preset coincidence time window, a coincidence event is likewise defined. Since a plurality of radiation points may lie along the segment A3A4, the number of coincidence events included in a line of response can be obtained and is defined as the number of coincidence events.
As shown in fig. 8, the present embodiment is described taking the first ray L1 and the second ray L2 emitted from the first radiation point 201 as an example. When the first detector 111 detects the first ray L1 and the second detector 112 detects the second ray L2, the arrival times of the two rays can be obtained. In this embodiment, the time at which the first ray L1 reaches the first detector 111 is defined as a first time T1, and the time at which the second ray L2 reaches the second detector 112 is defined as a second time T2; the time-of-flight difference of the coincidence event is then the difference between the first time T1 and the second time T2. In this way the time-of-flight difference of each coincidence event included in the line of response can be obtained, and, given the number of coincidence events, the mean of the time-of-flight differences of the line of response can be obtained as well. For example, if the number of coincidence events is N and the time-of-flight differences of the individual events are T1, T2, T3, ..., TN, then the mean of the time-of-flight differences of the line of response is (T1 + T2 + T3 + ... + TN)/N. In this embodiment, the standard deviation of the time-of-flight differences is positively correlated with both the intersection length and the time resolution of the line of response, while the mean of the time-of-flight differences tends toward the expected time-of-flight difference.
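The averaging just described can be sketched as follows (an illustrative helper, times in arbitrary units):

```python
def tof_difference_mean(arrivals):
    """Mean time-of-flight difference over the coincidence events of one
    line of response.  `arrivals` holds (t1, t2) detection-time pairs, where
    t1 is the arrival at the first detector and t2 at the second."""
    diffs = [t1 - t2 for t1, t2 in arrivals]   # per-event TOF differences
    return sum(diffs) / len(diffs)             # (T1 + T2 + ... + TN) / N

m = tof_difference_mean([(5.0, 4.0), (3.0, 3.5), (6.0, 5.5)])
```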
As shown in FIG. 8, in step S4 the first ray L1 intersects the first detector 111 at a first intersection A1 and intersects the boundary of the phantom 200 at a third intersection A3. The second ray L2 intersects the second detector 112 at a second intersection A2 and intersects the boundary of the phantom 200 at a fourth intersection A4. The mean of the time-of-flight differences may thus be corrected based on the distance of the first detector 111 from the phantom 200 and the distance of the second detector 112 from the phantom 200, yielding a corrected mean of the time-of-flight differences. The embodiment takes the correction of the first time T1 according to the distance between the first detector 111 and the phantom 200 as an example. Since the relative positional relationship of the phantom 200 and the detector ring 100 has been obtained, the length of the segment A1A3 can be obtained, and the first time T1 can be corrected according to this distance.
In this embodiment, the correction formula for the first time T1 is:

T1c = T1 - d13/c

where T1c denotes the corrected value of the time at which the first ray L1 was detected by the first detector 111 (the first time correction value), T1 denotes the time at which the first ray L1 was detected by the first detector 111, d13 denotes the distance between the first detector 111 and the phantom 200 (the length of segment A1A3), and c is the speed of light.
Similarly, the corrected value T2c of the time at which the second ray L2 is detected by the second detector 112 (the second time correction value) can be obtained, so that the corrected time-of-flight difference of the coincidence event, equal to the difference between the first time correction value and the second time correction value, is:

Tc = T1c - T2c

where Tc is the corrected time-of-flight difference of the coincidence event.
The present embodiment can thus obtain the corrected mean of the time-of-flight differences from the number of coincidence events and the corrected time-of-flight difference of each coincidence event, namely:

T̄c = (ΣTc)/N

where T̄c denotes the corrected mean of the time-of-flight differences of the coincidence events, N is the number of coincidence events, and ΣTc denotes the sum of the corrected time-of-flight differences of all coincidence events corresponding to the line of response.
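The correction T1c = T1 - d13/c and the averaging it feeds into can be sketched as follows (units of mm and ns assumed for illustration):

```python
C_MM_PER_NS = 299.8  # speed of light (assumed units: mm per ns)

def corrected_tof_mean(events):
    """Mean corrected time-of-flight difference for one line of response.
    Each event is (t1, t2, d13, d24): the two detection times and the
    detector-to-phantom-boundary distances along the two rays."""
    corrected = []
    for t1, t2, d13, d24 in events:
        t1c = t1 - d13 / C_MM_PER_NS   # T1c = T1 - d13/c
        t2c = t2 - d24 / C_MM_PER_NS   # T2c = T2 - d24/c
        corrected.append(t1c - t2c)    # Tc = T1c - T2c
    return sum(corrected) / len(corrected)

# Symmetric geometry and simultaneous detection: the corrected mean is 0.
m = corrected_tof_mean([(10.0, 10.0, 299.8, 299.8)])
```

After this correction, residual time-of-flight differences reflect only the annihilation position within the phantom and the detectors' timing errors, which is what the later model exploits.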
It should be noted that the corrected time-of-flight differences follow a distribution given by the convolution of a normal distribution with a uniform distribution: the standard deviation of the normal component is proportional to the time resolution of the line of response, and the width of the uniform component is proportional to the intersection length.
As shown in FIG. 8, in this embodiment, because the line of response passes through the phantom 200, i.e., the line of response intersects the phantom 200, the segment connecting the third intersection A3 and the fourth intersection A4 gives the length of the intersection of the line of response with the phantom 200. The intersection length may be used as a parameter for the subsequent modeling. It should be noted that the intersection length is related to the position of the phantom 200 and may change when the phantom 200 is tilted. In this embodiment, the length of the segment A3A4 is taken as the intersection length of the phantom 200 and the line of response.
As shown in FIG. 6, after the lines of response have been determined, in steps S5-S6 the present embodiment takes the time standard deviation σ1 of the first detector 111, the time standard deviation σ2 of the second detector 112, the corrected mean T̄c of the time-of-flight differences, and the intersection length R as modeling parameters and establishes a calculation model, namely:

T̄c ~ Norm(0, (σ1² + σ2² + R²/(3c²)) / N)

where ~ indicates that T̄c satisfies the normal distribution (expressed by Norm), T̄c represents the corrected mean of the time-of-flight differences, N represents the number of coincidence events, R represents the intersection length, σ1 and σ2 represent the time standard deviations of the first detector and the second detector, and c represents the speed of light; the term R²/(3c²) is the variance contributed by the uniform component, i.e., by the unknown annihilation position along the intersection. Note that in the present embodiment the time resolution of the line of response is related to the time resolution of the first detector 111 and the time resolution of the second detector 112.
The time resolution of each of the detectors is then obtained from the calculation model by solving an optimization problem:

σ̂ = argmin over σ of Σi [ N·T̄c² / (σ1² + σ2² + R²/(3c²)) + ln(σ1² + σ2² + R²/(3c²)) ]

where T̄c represents the corrected mean of the time-of-flight differences, Σi represents summation over all lines of response (the quantities T̄c, N, R, σ1, and σ2 being those of the i-th line), N represents the number of coincidence events, R represents the intersection length, σ1 and σ2 represent the time standard deviations of the first detector and the second detector of that line, c represents the speed of light, and σ represents the collection of time standard deviations of all the detectors; the objective is the negative log-likelihood of the model above. The solving process can be given by the following iterative procedure:
a) Initialize the time standard deviation of each detector as

σ1^(0) = σ2^(0) = ... = σn^(0) = σsys / √2

where k = 0 is the iteration number, σ1, ..., σn are the time standard deviations of the detectors to be calculated, n is the number of detectors, and σsys denotes a preset time standard deviation of the system (the factor 1/√2 follows from σsys² = σi² + σj² when all detectors are assumed identical).
b) For iteration k+1, the update can be performed in batches; one batching method is to group the detectors by PET system module:
a. Update the time standard deviation of each detector in module i by selecting the terms of the optimization expression related to that detector's time standard deviation, using the results of iteration k+1 for modules j smaller than i and the results of iteration k for modules j larger than i.
b. Differentiate the optimization expression with respect to the time standard deviation of each detector and take the value at which the derivative is 0 as the result of iteration k+1; the following two points should be noted:
i. The optimization expression is convex with respect to the time standard deviation of each detector if the following inequality is satisfied:

σ1² + σ2² + R²/(3c²) < 2·N·T̄c²

where the quantities are those of the corresponding line of response. To ensure that the optimization expression is convex with respect to the time standard deviation of each detector, it suffices to select, when screening the lines of response, only those for which this inequality holds.
ii. The two detectors in each term of the optimization expression belong to different modules. In fact, when screening coincidence events, a coincidence event can be deemed invalid if the detectors at its two ends belong to the same module.
c. i = i + 1
d. Repeating a. -c until the time standard deviations of the detectors on all modules are updated.
c) k = k + 1
d) Repeating b) -c) until the amount of change of each detector is less than a threshold value, or a maximum number of iterations is reached.
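A much-simplified sketch of this iterative solution is given below. It assumes the geometric (uniform) variance term has already been subtracted from each line's empirical variance, so that each line of response contributes a measurement modeled as σi² + σj², and it replaces the module-batched derivative update with plain coordinate descent on illustrative data; it is not the patent's exact procedure:

```python
def solve_detector_sigmas(lines, n_detectors, n_iter=50):
    """Coordinate-descent sketch: each line is (i, j, var_t), the empirical
    variance of its corrected time-of-flight differences with the geometric
    term removed, modeled as sigma_i^2 + sigma_j^2.  Returns per-detector
    sigma estimates."""
    var = [1.0] * n_detectors  # initial guess for each sigma^2
    for _ in range(n_iter):
        for d in range(n_detectors):
            # residual variance attributed to detector d on each of its lines
            terms = [vt - var[j if i == d else i]
                     for (i, j, vt) in lines if d in (i, j)]
            if terms:
                var[d] = max(1e-12, sum(terms) / len(terms))
    return [v ** 0.5 for v in var]

# Three detectors with true sigmas (1, 2, 2); the pairwise line variances
# are 1+4, 1+4, and 4+4.
lines = [(0, 1, 5.0), (0, 2, 5.0), (1, 2, 8.0)]
sig = solve_detector_sigmas(lines, 3)
```

With enough lines of response per detector, each sweep pulls every detector's variance toward the value consistent with its partners, mirroring the batched update of steps a)-d) above.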
After the time standard deviation σ of each of the detectors is obtained, the time resolution of each of the detectors can further be obtained as

FWHM = 2·√(2·ln 2)·σ ≈ 2.355·σ

i.e., the full width at half maximum of the corresponding normal distribution.
The time resolution of the line of response is calculated from the time resolution of the first detector and the time resolution of the second detector and is expressed by the following formula:

σLOR = √(σ1² + σ2²)

where σLOR represents the time standard deviation of the line of response, and σ1 and σ2 respectively represent the time standard deviation of the first detector 111 and the time standard deviation of the second detector 112 obtained from the calculation model.
As shown in fig. 5 and fig. 6, in step S7, after the time standard deviation σ of each line of response is obtained, the image may be reconstructed by the following steps, exemplified here by the maximum-likelihood expectation-maximization method (MLEM); the time resolutions may equally be used in other TOF-PET reconstruction methods.
a) Initialize the image f^(0) = 1, an all-ones vector, where f denotes the image, k = 0 denotes the iteration number, A(·, σ) denotes the projection operation, and A^T(·, σ) denotes the back-projection operation. It should be noted that in the projection and back-projection operations of the present embodiment the time resolution differs from one line of response to another, meaning that the standard deviation of the probability distribution (usually a normal distribution) used to estimate where annihilation may have occurred differs between lines of response and is determined by the time resolutions of the detectors at the two ends of each line.
b)

f^(k+1) = ( f^(k) / A^T(1, σ) ) ⊙ A^T( g / A(f^(k), σ), σ )

where g represents the measured data, 1 represents the all-ones vector, ⊙ denotes element-wise multiplication, and the divisions are element-wise.
c) k = k + 1
d) Repeating steps b) -c) until the amount of image change is less than a threshold value, or a maximum number of iterations is reached.
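A toy version of the MLEM loop in steps a)-d) is sketched below; a small dense matrix stands in for the TOF projector A(·, σ), so the per-line time resolutions are implicit in the matrix entries (illustrative only):

```python
def mlem(A, g, n_iter=20):
    """Plain MLEM sketch of the update in step b): A is a dense system
    matrix (lines of response x pixels) standing in for A(., sigma);
    g is the measured count per line of response."""
    n_lor, n_pix = len(A), len(A[0])
    f = [1.0] * n_pix  # a) initialize the image to all ones
    # sensitivity image A^T 1 (column sums)
    sens = [sum(A[i][p] for i in range(n_lor)) for p in range(n_pix)]
    for _ in range(n_iter):
        # forward projection A f, then measured/estimated ratio per LOR
        proj = [sum(A[i][p] * f[p] for p in range(n_pix)) for i in range(n_lor)]
        ratio = [g[i] / proj[i] if proj[i] > 0 else 0.0 for i in range(n_lor)]
        # back-project the ratio and apply the multiplicative update
        back = [sum(A[i][p] * ratio[i] for i in range(n_lor)) for p in range(n_pix)]
        f = [f[p] * back[p] / sens[p] if sens[p] > 0 else 0.0 for p in range(n_pix)]
    return f

# Two pixels observed by two non-overlapping LORs: MLEM recovers the counts.
A = [[1.0, 0.0], [0.0, 1.0]]
f = mlem(A, [30.0, 70.0])
```

In a real TOF system the rows of A would additionally carry the Gaussian position weights derived from each line's time resolution.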
In addition, the time resolution of each detector can be used as an index for judging the time measurement accuracy of a TOF-PET system, which is specifically as follows:
a) Obtain the time standard deviation σ of each of the detectors; the time resolution of detector i is then

FWHMi = 2·√(2·ln 2)·σi
b) Take the mean and the standard deviation of the FWHMi values as evaluation indices, where the mean is an index of the time-measurement performance of the TOF-PET system and the standard deviation is an index of time-measurement consistency; both should be as small as possible.
c) Collect statistics of the outliers among the FWHMi values, taking as the outlier range, for example, values beyond plus or minus three standard deviations from the mean. The time measurements of the detectors corresponding to such outliers may be anomalous.
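The evaluation indices in a)-c) can be sketched as follows (the sigma-to-FWHM factor 2√(2 ln 2) is the standard conversion for a normal distribution; the data values are illustrative):

```python
import math

FWHM = 2.0 * math.sqrt(2.0 * math.log(2.0))  # ~2.355: sigma -> FWHM

def timing_report(sigmas):
    """Evaluation indices from per-detector time standard deviations:
    mean FWHM (timing performance), FWHM spread (consistency), and the
    indices of detectors whose FWHM lies outside mean +/- 3*std
    (possible timing faults)."""
    res = [FWHM * s for s in sigmas]
    mean = sum(res) / len(res)
    std = math.sqrt(sum((r - mean) ** 2 for r in res) / len(res))
    outliers = [i for i, r in enumerate(res) if abs(r - mean) > 3.0 * std]
    return mean, std, outliers

# 30 well-behaved detectors (sigma 100 ps) plus one bad one (sigma 400 ps):
# the bad detector is flagged as an outlier.
mean, std, bad = timing_report([100.0] * 30 + [400.0])
```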
As shown in fig. 7 to 8, the present embodiment can obtain the time resolution of a line of response by calculating the time resolution of the first detector 111 and the time resolution of the second detector 112, and can then reconstruct an image using the time resolutions of the lines of response. In this embodiment, after the time resolution of the first detector 111 and the time resolution of the second detector 112 are calculated, detectors may be selected accordingly, which mitigates the degradation of the reconstructed image caused by poor time resolution, that is, improves the accuracy of the reconstructed image.
As shown in fig. 9, the present embodiment further provides a system 300 for reconstructing an image by a scanning device, where the system 300 for reconstructing an image by a scanning device includes a phantom position obtaining unit 301, a response line obtaining unit 302, an acquiring unit 303, a model establishing unit 304, a model processing unit 305, and a reconstructing unit 306. The temporal correction method of the system 300 for reconstructing an image by a scanning device can be seen from the above description.
As shown in fig. 8 to 9, in the present embodiment, the phantom position acquiring unit 301 is configured to acquire the position of the phantom 200 in the scanning device, specifically, the phantom position acquiring unit 301 is configured to acquire the position relationship of the phantom 200 in the scanning field, where the phantom 200 is located in the detector ring 100, for example. A response line acquiring unit 302, configured to acquire a response line and calculate an intersection length of the response line and the phantom 200, where a connection line between the first detector 111 and the second detector 112 is defined as the response line, and the response line passes through the phantom 200; the temporal resolution of the lines of response is related to the temporal resolution of the first detector 111 and the temporal resolution of the second detector 112. The acquisition unit 303 is configured to acquire coincidence events, and count the number of coincidence events included in the response line to calculate a mean value of time differences of flight of the response line, where when the phantom emits a first ray and a second ray in opposite directions, and the first ray and the second ray are detected in a preset time coincidence window, the coincidence events are defined;
as shown in fig. 7 to 9, a model establishing unit 304 for establishing a calculation model according to the mean correction value of the time-of-flight difference, the number of coincidence events, the intersection length, and the time resolution of the response line; a model processing unit 305 for using said calculation model to obtain the temporal resolution of each of said detectors. A reconstruction unit 306 for obtaining the temporal resolution of the lines of response according to the temporal resolution of each of the detectors to reconstruct an image. The system for reconstructing the image by the scanning device provided by the embodiment can improve the accuracy of the reconstructed TOF-PET image.
As shown in fig. 10, the present embodiment further provides an electronic device, which includes a processor 50 and a memory 60, where the memory 60 stores program instructions, and the processor 50 executes the program instructions to implement the method for reconstructing an image by a scanning apparatus. The Processor 50 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC) or other programmable logic device, discrete gate or transistor logic device, discrete hardware component; the Memory 60 may include a Random Access Memory (RAM), and may also include a Non-Volatile Memory (Non-Volatile Memory), such as at least one disk Memory. The Memory 60 may also be an internal Memory of Random Access Memory (RAM) type, and the processor 50 and the Memory 60 may be integrated into one or more independent circuits or hardware, such as: application Specific Integrated Circuit (ASIC). It should be noted that the computer program in the memory 60 can be implemented in the form of software functional units and stored in a computer readable storage medium when the computer program is sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention.
As shown in fig. 11, this embodiment also proposes a computer-readable storage medium 701, where the computer-readable storage medium 701 stores computer instructions 70, and the computer instructions 70 are used for causing the computer to execute the above-mentioned method for implementing the image reconstruction by the scanning device. The computer-readable storage medium 701 may be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or a propagation medium. The computer-readable storage medium 701 may also include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a Random Access Memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Optical disks may include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-RW), and DVD.
In summary, the present invention provides a system and a method for reconstructing an image with a scanning device. A phantom is first placed in a preset area of the scanning field of view and the relative positional relationship between the phantom and the center of the scanning field of view is obtained; lines of response and coincidence events are then defined, the intersection length of each line of response with the phantom is calculated, and the number of coincidence events included in each line of response is counted. The intersection of the first ray with the first detector is defined as a first intersection and its intersection with the phantom as a third intersection, so that the time at which the first ray is detected by the first detector can be corrected according to the distance between the first intersection and the third intersection; the time at which the second ray is detected by the second detector is corrected likewise. The time-of-flight difference of each coincidence event can thereby be corrected, yielding the corrected time-of-flight differences and their mean. A calculation model is then established from the time resolution of the line of response, the number of coincidence events, the intersection length, and the corrected mean of the time-of-flight differences; the model is solved to obtain the time resolution of each detector, and an image is then reconstructed from the time resolutions of the detectors using a reconstruction unit.
According to the invention, by measuring the time resolution of each detector, when the time resolution of the first detector and/or the time resolution of the second detector are/is poor, the first detector and/or the second detector can be avoided, so that the quality of the reconstructed image can be improved, namely the accuracy of the reconstructed TOF-PET image can be improved.
Reference throughout this specification to "one embodiment", "an embodiment", or "a specific embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment, and not necessarily all embodiments, of the present invention. Thus, respective appearances of the phrases "in one embodiment", "in an embodiment", or "in a specific embodiment" in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any specific embodiment of the present invention may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments of the invention described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the present invention.
It will also be appreciated that one or more of the elements shown in the figures can be implemented in a more separated or more integrated manner, or even removed where inoperative in certain circumstances or added where useful, in accordance with a particular application.
Additionally, any reference arrows in the drawings/figures should be considered only as exemplary, and not limiting, unless otherwise expressly specified. Further, as used herein, the term "or" is generally intended to mean "and/or" unless otherwise indicated. Where the terminology leaves it unclear whether elements may be separated or combined, combinations of components or steps are also considered to have been described.
As used in the description herein and throughout the claims that follow, "a", "an", and "the" include plural references unless otherwise indicated. Also, as used in the description herein and throughout the claims that follow, unless otherwise indicated, the meaning of "in" includes "in" and "on".
The above description of illustrated embodiments of the invention, including what is described in the abstract of the specification, is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the present invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the present invention in light of the foregoing description of illustrated embodiments of the present invention and are to be included within the spirit and scope of the present invention.
The systems and methods have been described herein in general terms as the details aid in understanding the invention. Furthermore, various specific details have been given to provide a general understanding of the embodiments of the invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, and/or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the invention.

Claims (10)

1. A method for reconstructing an image by a scanning device, comprising:
placing a phantom in a preset area of a scanning field of view, and obtaining the relative position relationship between the phantom and the center of the scanning field of view, wherein the phantom is positioned in a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
defining a response line, and calculating the intersection length of the response line with the phantom, wherein the line connecting a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the temporal resolution of the response line is related to the temporal resolution of the first detector and the temporal resolution of the second detector;
collecting coincidence events, and counting the number of coincidence events included in the response line to calculate the mean value of the time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
correcting the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
establishing a computational model according to the mean correction value of the time-of-flight difference, the number of coincidence events, the intersection length, and the temporal resolution of the response line;
using the computational model to obtain the temporal resolution of each of the detectors;
obtaining the temporal resolution of the response line using the temporal resolution of each of the detectors to reconstruct an image.
2. The method of reconstructing an image by a scanning device according to claim 1, wherein the temporal resolutions of any two of the detectors are the same or different.
3. The method of reconstructing an image by a scanning device according to claim 1, wherein the temporal resolution of the response line satisfies the following formula:
σ_LOR = sqrt(σ1² + σ2²)
wherein σ_LOR represents the temporal resolution of the response line, and σ1 and σ2 represent the temporal resolution of the first detector and the temporal resolution of the second detector, respectively.
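The quadrature relation of claim 3 can also be inverted: given a measured line resolution and one detector's resolution, the other detector's follows. A minimal sketch with illustrative function names (not from the patent):

```python
import math

def lor_sigma(sigma1, sigma2):
    # sigma_LOR = sqrt(sigma1^2 + sigma2^2), per claim 3.
    return math.hypot(sigma1, sigma2)

def other_detector_sigma(lor, sigma1):
    # Invert the quadrature sum to recover the second detector's timing std.
    if lor < sigma1:
        raise ValueError("line resolution cannot be below either detector's")
    return math.sqrt(lor**2 - sigma1**2)
```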
4. The method of reconstructing an image by a scanning device according to claim 1, wherein the step of correcting the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference comprises:
correcting the time at which the first ray is detected by the first detector according to the distance between the first detector and the phantom to obtain a corrected value of the time at which the first ray is detected by the first detector;
correcting the time at which the second ray is detected by the second detector according to the distance between the second detector and the phantom to obtain a corrected value of the time at which the second ray is detected by the second detector;
calculating a corrected value of the time-of-flight difference for each of the coincidence events;
and obtaining the mean correction value of the time-of-flight difference according to the corrected values of the time-of-flight difference and the number of coincidence events.
5. The method of reconstructing an image by a scanning device according to claim 4, wherein the standard deviation of the time-of-flight differences is positively correlated with the intersection length and with the temporal resolution of the response line, and the mean of the time-of-flight differences tends towards the expectation of the time-of-flight difference.
6. The method of reconstructing an image according to claim 4, wherein the time at which the first ray is detected by the first detector is corrected according to the distance from the first detector to the phantom to obtain a corrected value of that time, defined as T1c:
T1c = T1 − d13/c
wherein T1c represents the corrected value of the time at which the first ray is detected by the first detector, T1 represents the time at which the first ray is detected by the first detector, d13 represents the distance from the first detector to the phantom, and c is the speed of light.
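The correction of claim 6 simply subtracts the photon's travel time over the detector-to-phantom distance. A minimal sketch; the name `d24` for the second detector's surface distance is an assumption mirroring `d13`:

```python
C = 299_792_458.0  # speed of light, m/s

def corrected_detection_time(t, d):
    # T1c = T1 - d13/c: remove the travel time over distance d (m) from time t (s).
    return t - d / C

def corrected_tof_difference(t1, d13, t2, d24):
    # Corrected time-of-flight difference of one coincidence event.
    return corrected_detection_time(t1, d13) - corrected_detection_time(t2, d24)
```

With d ≈ 0.3 m the correction is about 1 ns, i.e., comparable to typical TOF-PET timing scales, which is why the surface-distance correction matters.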
7. The method of reconstructing an image by a scanning device according to claim 1, wherein the computational model is
mean(ΔTc) ~ Norm(0, (σ1² + σ2² + R²/(3c²))/N)
wherein mean(ΔTc) represents the mean correction value of the time-of-flight difference and satisfies the stated normal distribution (denoted Norm), N represents the number of coincidence events, R represents the intersection length, σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, and c represents the speed of light.
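Under this model (my reading of the equation image: detector jitter plus the spread of a uniform annihilation position over the intersection length R, whose time-domain variance is R²/(3c²), averaged over N events), the variance of the mean corrected time-of-flight difference can be computed directly. A hedged sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def mean_tof_diff_variance(sigma1, sigma2, R, N):
    # Per-event variance: detector jitters plus the uniform-position term
    # Var(2x/c) = (2/c)^2 * R^2/12 = R^2/(3 c^2), x uniform over length R.
    per_event = sigma1**2 + sigma2**2 + R**2 / (3.0 * C**2)
    # Averaging N independent events divides the variance by N.
    return per_event / N
```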
8. The method of reconstructing an image by a scanning device according to claim 1, wherein the computational model is used to obtain the temporal resolution of each of the detectors by the formula
{σ} = argmax Σi ln Norm( mean(ΔTc)i ; 0, (σ1,i² + σ2,i² + Ri²/(3c²))/Ni )
wherein mean(ΔTc)i represents the mean correction value of the time-of-flight difference of the i-th response line, Σi represents summation over all the response lines, Ni represents the number of coincidence events of the i-th response line, Ri represents its intersection length, σ1,i and σ2,i represent the time standard deviation of its first detector and the time standard deviation of its second detector, c represents the speed of light, and σ represents the time standard deviation of each of the detectors;
wherein the temporal resolution of each detector is its full width at half maximum, FWHM = 2·sqrt(2·ln 2)·σ ≈ 2.355σ.
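One way to "process the computational model" of claim 8 is a least-squares moment fit: each response line contributes one linear equation in the unknown per-detector variances. This is a sketch of that idea, not necessarily the patent's exact maximum-likelihood procedure; the tuple layout `(i, j, mean_dtc, N, R)` is my assumption:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def estimate_detector_sigmas(lors, n_detectors):
    """lors: iterable of (i, j, mean_dtc, N, R) per response line, where
    mean_dtc is the mean corrected TOF difference (s), N the event count,
    and R the intersection length (m).  Solves, in least squares,
    N * mean_dtc**2 ≈ v_i + v_j + R**2/(3 c**2) for the per-detector
    variances v_k = sigma_k**2, then returns the sigmas."""
    lors = list(lors)
    A = np.zeros((len(lors), n_detectors))
    b = np.empty(len(lors))
    for row, (i, j, mean_dtc, N, R) in enumerate(lors):
        A[row, i] = 1.0
        A[row, j] = 1.0
        b[row] = N * mean_dtc**2 - R**2 / (3.0 * C**2)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.sqrt(np.clip(v, 0.0, None))  # clip guards against negative noise
```

With at least as many independent response lines as detectors, the system is overdetermined and the fit averages out per-line noise.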
9. A method for reconstructing an image by a scanning device according to claim 1, wherein said temporal resolution of said lines of response is obtained using said temporal resolution of each of said detectors to reconstruct an image by:
calculating the temporal resolution of the response line according to the temporal resolution of the first detector and the temporal resolution of the second detector, expressed by the following formula:
σ = sqrt(σ1² + σ2²)
wherein σ represents the time standard deviation of the response line, and σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, respectively;
the position of the occurrence of the annihilation event is estimated according to the time difference of flight T of each coincidence event and satisfies the following probability distribution
Figure FDA0002953457890000032
Wherein
Figure FDA0002953457890000033
Is a point on the line of response that,
Figure FDA0002953457890000034
indicating the probability density of their annihilation occurring,
Figure FDA0002953457890000035
representing the first detector position as a function of the time-of-flight difference
Figure FDA0002953457890000036
The second detector position
Figure FDA0002953457890000037
The centers of all the possible annihilation positions obtained are calculated by the following formula
Figure FDA0002953457890000038
Wherein
Figure FDA0002953457890000039
Representing a unit vector pointing from the first detector position to the second detector position, | · | representing a vector length, c representing a speed of light;
and obtaining the probability distribution of the position of the occurrence of the annihilation event according to the time-of-flight difference T of each coincidence event so as to reconstruct the image.
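The center formula of claim 9 places the most likely annihilation point at the midpoint of the response line, shifted by c·T/2 along the line. A sketch, assuming the sign convention T = t1 − t2 (so a positive T shifts the event toward the second detector):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def annihilation_center(p1, p2, T):
    # x0 = (x1 + x2)/2 + (c*T/2) * n, with n the unit vector from p1 to p2.
    # If the photon toward p1 arrives later (T > 0), the event lies nearer p2.
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    n = (p2 - p1) / np.linalg.norm(p2 - p1)
    return (p1 + p2) / 2.0 + (C * T / 2.0) * n
```

The Gaussian kernel of claim 9 is then centered at this point with spatial standard deviation c·σ/2, so a better line timing resolution σ localizes the event more tightly along the line.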
10. A system for reconstructing an image by a scanning device, comprising:
a phantom position acquisition unit for acquiring the relative position of a phantom with respect to the center of a scanning field of view, wherein the phantom is positioned in a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
a response line acquisition unit for acquiring a response line and calculating the intersection length of the response line with the phantom, wherein the line connecting a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the temporal resolution of the response line is related to the temporal resolution of the first detector and the temporal resolution of the second detector;
an acquisition unit for collecting coincidence events and counting the number of coincidence events included in the response line to calculate the mean value of the time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
a correction unit for correcting the mean value of the time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
a model establishing unit for establishing a computational model according to the mean correction value of the time-of-flight difference, the number of coincidence events, the intersection length, and the temporal resolution of the response line;
a model processing unit for using the computational model to obtain the temporal resolution of each of the detectors;
a reconstruction unit for obtaining the temporal resolution of the response line according to the temporal resolution of each of the detectors to reconstruct an image.
CN202110216943.8A 2021-02-26 2021-02-26 System and method for reconstructing image by scanning device Active CN112998735B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110216943.8A CN112998735B (en) 2021-02-26 2021-02-26 System and method for reconstructing image by scanning device


Publications (2)

Publication Number Publication Date
CN112998735A true CN112998735A (en) 2021-06-22
CN112998735B CN112998735B (en) 2022-09-02

Family

ID=76386333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110216943.8A Active CN112998735B (en) 2021-02-26 2021-02-26 System and method for reconstructing image by scanning device

Country Status (1)

Country Link
CN (1) CN112998735B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006020678A (en) * 2004-07-06 2006-01-26 Hitachi Ltd X-ray ct apparatus
CN104183012A (en) * 2013-10-31 2014-12-03 上海联影医疗科技有限公司 PET (Polyethylene terephthalate) three-dimensional image reconstruction method and device
CN105496436A (en) * 2015-11-28 2016-04-20 上海联影医疗科技有限公司 Time correction method and device used for PET device
CN106539591A (en) * 2015-09-21 2017-03-29 上海联影医疗科技有限公司 PET flight time state quality detection methods and PET scan device
CN107137101A (en) * 2017-04-24 2017-09-08 沈阳东软医疗***有限公司 A kind of time calibrating method and device


Non-Patent Citations (1)

Title
K. Dulski et al.: "A method for time calibration of PET systems using fixed β+ radioactive source", Acta Physica Polonica B *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN114594107A (en) * 2022-05-09 2022-06-07 武汉精立电子技术有限公司 Optimization method and application of scanning path and detection method of surface of semiconductor material
CN114594107B (en) * 2022-05-09 2022-08-16 武汉精立电子技术有限公司 Optimization method and application of scanning path and detection method of surface of semiconductor material


Similar Documents

Publication Publication Date Title
CN106539591B (en) PET flight time state quality detection method and PET scanning device
WO2017050180A1 (en) System and method for calibrating a pet scanner
US9029786B2 (en) Nuclear medicine imaging apparatus, and nuclear medicine imaging method
US10215864B2 (en) System and method to improve image quality of emission tomography when using advanced radionuclides
US11510636B2 (en) System and method for positron emission tomography
JP2005315887A (en) Method and system for normalizing positron emitting tomography system
JP5845487B2 (en) Method for absolute measurement of radioactivity of positron decay nuclides that emit gamma rays, method for determining the detection efficiency of a radiation detector assembly, and method for calibrating a radiation measurement apparatus
Efthimiou et al. TOF-PET image reconstruction with multiple timing kernels applied on Cherenkov radiation in BGO
JP6125309B2 (en) Random coincidence counting estimation method and random coincidence counting estimation apparatus
US7129497B2 (en) Method and system for normalization of a positron emission tomography system
JP6054050B2 (en) Nuclear medicine imaging method, nuclear medicine imaging apparatus and storage medium
CN112998735B (en) System and method for reconstructing image by scanning device
US11231508B2 (en) Gamma camera dead time determination in real time using long lived radioisotopes
EP2902806B1 (en) Nuclear medicine diagnostic device and medical data processing device
CN112998737B (en) Time offset correction system and method for scanning device
Guérin et al. Realistic PET Monte Carlo simulation with pixelated block detectors, light sharing, random coincidences and dead-time modeling
Stolin et al. Preclinical positron emission tomography scanner based on a monolithic annulus of scintillator: initial design study
Surti et al. PET instrumentation
CN112998736B (en) Time correction system and time correction method of scanning device
Kijewski Positron emission tomography (PET) and single-photon emission computed tomography (SPECT) physics
JP7001176B2 (en) Data processing methods, programs, data processing equipment and positron emission tomographic imaging equipment
US20220343566A1 (en) Methods and systems for reconstructing a positron emission tomography image
Vandenberghe PET Systems
Sharp et al. Positron emission tomography
Tayefi Ardebili Evaluation of the NEMA characteristics for the Modular J-PET scanner

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant