CN113520416A - Method and system for generating two-dimensional image of object - Google Patents


Info

Publication number
CN113520416A
CN113520416A (application CN202010316478.0A)
Authority
CN
China
Prior art keywords
dimensional image
projection data
generating
image
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010316478.0A
Other languages
Chinese (zh)
Inventor
张宇
牛杰
张娜
冯娟
王汉禹
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202010316478.0A
Publication of CN113520416A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211: Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B 6/5223: Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/50: Clinical applications
    • A61B 6/502: Clinical applications involving diagnosis of breast, i.e. mammography

Abstract

The embodiments of the present application disclose a method and a system for generating a two-dimensional image of an object. The method comprises the following steps: acquiring projection data of the object, wherein the projection data are generated by exposing the object with X-ray sources of an array X-ray source from at least two projection angles relative to the object, the array X-ray source comprising a field emission X-ray source; and generating a two-dimensional image of the object from the projection data. Field emission X-ray sources are used for three-dimensional imaging, and the three-dimensional imaging data can be fused to generate a two-dimensional image with improved resolution, thereby improving the accuracy of a physician's diagnosis based on the fused two-dimensional image.

Description

Method and system for generating two-dimensional image of object
Technical Field
The present application relates to the field of medical imaging technologies, and in particular, to a method and a system for generating a two-dimensional image of an object based on a field emission X-ray source.
Background
Conventional mammography X-ray imaging products generally use a single hot-cathode X-ray source; to perform multi-view X-ray scanning, the X-ray source is fixed on a rotating gantry and moved along an arc. Motion artifacts caused by this mechanical motion, together with the time delay inherent in the thermionic emission mechanism, reduce the spatial resolution of the scanned image and prolong the scanning time. Motion artifacts generated during acquisition degrade the quality of the three-dimensional tomographic image, which in turn reduces a physician's accuracy when diagnosing a condition from the three-dimensional tomographic image and a two-dimensional image fused from it. In addition, rotational scanning can cause discomfort to the patient. Therefore, there is a need for an efficient method and system for generating a two-dimensional image of the breast that improve image quality and diagnostic accuracy.
Disclosure of Invention
The purpose of the present application is to provide a method and a system for generating a two-dimensional image of an object that improve the quality of the two-dimensional image, thereby improving the accuracy of a physician's diagnosis based on the two-dimensional image.
One of the embodiments of the present application provides a method for generating a two-dimensional image of an object. The method comprises the following steps: acquiring projection data of the object, wherein the projection data are generated by exposing the object with X-ray sources of an array X-ray source from at least two different projection angles relative to the object, the array X-ray source comprising a field emission X-ray source; and generating a two-dimensional image of the object from the projection data.
Drawings
The present application is further described by way of exemplary embodiments, which are explained in detail with reference to the accompanying drawings. These embodiments are not limiting; in the figures, like numerals refer to like structures or operations, wherein:
FIG. 1 is a schematic diagram of an application scenario of an image generation system according to some embodiments of the present application;
FIG. 2 is a block diagram of an image generation system according to some embodiments of the present application; and
FIG. 3 is an exemplary flow chart illustrating the generation of a two-dimensional image of an object according to some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are merely examples or embodiments of the present application, based on which a person skilled in the art may also apply the present application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
FIG. 1 is a schematic diagram of an application scenario of an image generation system according to some embodiments of the present application.
As shown in FIG. 1, the image generation system 100 may include an array X-ray source 110, a detector 120, a controller 130, a processor 140, a memory 150, and a display 160. The image generation system is used to generate a two-dimensional image of the object 112, wherein the object 112 may comprise a part of the patient to be examined (e.g. a breast to be examined).
The array X-ray source 110 may be composed of a plurality (e.g., three or more) of X-ray sources. For example, the array X-ray source 110 may include a linear-array X-ray source consisting of two or more X-ray sources arranged along a line (e.g., a straight line, a polyline, or a curve). As another example, the array X-ray source 110 may include an area-array X-ray source formed by two or more X-ray sources arranged in a plane (e.g., in a matrix). As yet another example, the array X-ray source 110 may be composed of both a linear-array X-ray source and an area-array X-ray source. In some embodiments, the X-ray sources in the array X-ray source 110 may include field emission X-ray sources. A field emission X-ray source uses a field emission cathode as its electron source and generates an electron beam by field electron emission: under an applied electric field, the potential barrier at the cathode surface becomes lower and narrower, so that a large number of electrons in the emitter tunnel through the surface barrier by the quantum tunneling effect and escape, forming field electron emission in vacuum. Drawn by the anode (i.e., the target), the emitted electrons bombard the target and generate X-rays. The target and the field emission cathode may be arranged opposite each other, both in vacuum. In some embodiments, the target may comprise tungsten, molybdenum, copper, rhodium, silver, aluminum, or the like. A field emission cathode has a low operating temperature, low power consumption, and no time delay. Compared with an X-ray source that uses a hot cathode as the electron source, a field emission X-ray source can improve the spatial resolution of the scanned image. The field emission X-ray sources in the array X-ray source 110 may be exposed simultaneously, in tandem, or sequentially to emit X-rays toward the object 112.
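The dependence of field electron emission on the applied field, described above, is commonly modeled by the Fowler-Nordheim relation. The following is an illustrative sketch only (the patent does not specify any emission model); the constants are the standard first and second Fowler-Nordheim constants, and the function name is hypothetical:

```python
import math

def fn_current_density(field_v_per_m, work_function_ev):
    """Approximate Fowler-Nordheim emission current density J (A/m^2),
    in the elementary form J = (a * F^2 / phi) * exp(-b * phi^1.5 / F).
    Illustrative only; real cathodes need correction factors."""
    a = 1.541434e-6   # first Fowler-Nordheim constant, A eV V^-2
    b = 6.830890e9    # second Fowler-Nordheim constant, eV^-1.5 V m^-1
    phi = work_function_ev
    f = field_v_per_m
    return (a * f ** 2 / phi) * math.exp(-b * phi ** 1.5 / f)
```

The strong (exponential) growth of emission with field strength is what lets a cold cathode deliver usable current without heating, and hence without the thermionic time delay mentioned above.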
The detector 120 may detect X-rays that have passed through the object 112 to acquire projection data (also referred to as exposure data). Taking a breast to be examined as an example of the object 112, the image generation system 100 may further include a compression paddle (not shown in FIG. 1) disposed between the array X-ray source 110 and the breast. The compression paddle may be used to compress the breast (e.g., between the compression paddle and the detector 120) to reduce the thickness of the region to be imaged and make it thin and uniform, and it may separate overlapping soft tissue in the breast so as to facilitate the acquisition of projection data by the detector 120.
The controller 130 may be used to control the exposure of the X-ray sources in the array X-ray source 110. For example, the controller 130 may control two or more X-ray sources to be exposed synchronously. As another example, the controller 130 may control the order in which two or more X-ray sources are exposed, or the time interval between their exposures. In some embodiments, the controller 130 may control at least three X-ray sources to perform at least two exposures, to ensure that any point of the imaging region is exposed by at least one of the X-ray sources. In some embodiments, the controller 130 may also be used to control the detector 120 to transmit the acquired projection data to the processor 140 and/or the memory 150.
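The synchronous-versus-sequential exposure control described above can be sketched as a small scheduler. This is a minimal illustration, not the patent's controller; the class and method names are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ExposureScheduler:
    """Toy scheduler for an array of X-ray sources: records, for each
    source id, the time offset (ms) at which its exposure starts."""
    interval_ms: float = 100.0
    log: List[Tuple[int, float]] = field(default_factory=list)

    def expose_simultaneously(self, source_ids):
        # Synchronous exposure: all sources start at t = 0.
        for sid in source_ids:
            self.log.append((sid, 0.0))

    def expose_sequentially(self, source_ids):
        # Sequential exposure: sources fire in order, separated by interval_ms.
        t = 0.0
        for sid in source_ids:
            self.log.append((sid, t))
            t += self.interval_ms
```

For example, `expose_sequentially([1, 2, 3])` with the default interval records start times 0, 100, and 200 ms.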
In some embodiments, the controller 130 may be in communication with the processor 140. The exposure intensity and/or exposure time of the radiation source for each exposure may be determined based on information about the object 112 and/or the patient, and/or on the requirements of the acquisition. For example, different exposure intensities and different exposure times may be used for different regions of the object 112 (e.g., the breast to be examined). For a thicker region, the exposure intensity may be increased and/or the exposure time extended as appropriate to improve the sharpness of the captured image. As another example, if the patient's radiation exposure is to be minimized, the exposure intensity may be reduced and/or the exposure time shortened as appropriate while still ensuring that the image is clear.
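One way to read the trade-off above is as a simple parameter-selection rule: scale intensity and time with region thickness, then scale both back if a dose ceiling would be exceeded. The heuristic below is purely illustrative (the patent specifies no formula); the reference thickness and function name are assumptions:

```python
def choose_exposure(thickness_mm, max_dose=None,
                    base_intensity=1.0, base_time_ms=100.0):
    """Illustrative heuristic: thicker regions get proportionally more
    intensity and time; if intensity * time (a dose proxy) exceeds
    max_dose, both are scaled back equally to meet the ceiling."""
    scale = thickness_mm / 40.0  # 40 mm taken as a nominal reference thickness
    intensity = base_intensity * scale
    time_ms = base_time_ms * scale
    if max_dose is not None:
        dose = intensity * time_ms
        if dose > max_dose:
            k = (max_dose / dose) ** 0.5
            intensity *= k
            time_ms *= k
    return intensity, time_ms
```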
The processor 140 may be used to process data and/or information obtained from the detector 120, the memory 150, and/or the like. For example, the processor 140 may acquire projection images of the object 112 from the detector 120. The processor 140 may generate a three-dimensional image and/or a two-dimensional image of the object 112 from the projection data. In some embodiments, processor 140 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processor 140 may be local or remote. In some embodiments, the processor 140 may be implemented on a cloud platform.
In some embodiments, the processor 140 may also assist in determining the image acquisition task. Determining the image acquisition task may include acquiring information about the object 112 and/or the patient, and/or about the requirements of the acquisition, for example, age, height, weight, medical history, body type, thickness of the imaging site, bone joint point information, and diagnostic requirements. This information may be obtained by direct entry, by retrieving the patient's profile from a database, or by other means. In some embodiments, the patient's posture information may additionally be acquired from image information captured by a 3D camera (also referred to as a depth camera).
Memory 150 may store data and/or instructions. In some embodiments, the memory 150 may store data acquired from the detector 120 and/or the processor 140. For example, the memory 150 may store the projection images detected by the detector 120. As another example, the memory 150 may store three-dimensional images and/or two-dimensional images of the object 112 generated by the processor 140 for processing or display. In some embodiments, memory 150 may store data and/or instructions executable by processor 140 or for performing the exemplary methods described herein. In some embodiments, the memory 150 may be part of the processor 140.
The display 160 may be used to display a three-dimensional image and/or a two-dimensional image of the object 112. For example, the display 160 may simultaneously display a three-dimensional image and a two-dimensional image of the object 112 for review by a physician. In some embodiments, the display may be part of the processor 140.
In some embodiments, the image generation system 100 may also include a network (not shown). The network may be a Local Area Network (LAN), a Wide Area Network (WAN), a public network, a private network, a proprietary network, the Public Switched Telephone Network (PSTN), the internet, a virtual network, a metropolitan area network, a telephone network, etc., or a combination thereof. In some embodiments, communication among the array X-ray source 110, the detector 120, the controller 130, the processor 140, and the memory 150 may be achieved through a wired connection, a wireless connection, or a combination of the two.
FIG. 2 is a block diagram of an image generation system according to some embodiments of the present application.
As shown in fig. 2, the image generation system 100 includes a processor 140, which may include a projection data acquisition module 210 and an image generation module 220.
In some embodiments, the projection data acquisition module 210 may be configured to acquire projection data of the object 112 (e.g., a breast to be examined), wherein the projection data are generated by exposing the object 112 with X-ray sources of the array X-ray source 110 from at least two different projection angles relative to the object 112. In some embodiments, the array X-ray source 110 may be composed of a plurality of field emission X-ray sources arranged in a linear or planar array. The detector 120 may detect X-rays that have passed through the object 112, generating a plurality of (at least two) projection data corresponding to the plurality of (at least two) field emission X-ray sources. In some embodiments, the projection data may be stored in the memory 150, and the plurality of projection images of the object 112 may be acquired from the memory 150. In some embodiments, the projection data may be obtained directly from the detector 120. Related details are described elsewhere in the present application and are not repeated here.
In some embodiments, the image generation module 220 may be configured to generate a two-dimensional image of the object 112 (e.g., a breast to be examined) from the projection data. For example, the image generation module 220 may generate a three-dimensional image of the object 112 from the plurality of projection data, and then generate a two-dimensional image of the object 112 from the three-dimensional image using an image fusion technique. The image fusion technique may include one or a combination of maximum intensity projection, multi-scale analysis, wavelet transform, and the like. Related details are described elsewhere in the present application and are not repeated here.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the modules in the processor 140 may be connected or in communication with each other via a wired connection or a wireless connection. As another example, the functions of the processor 140 and its modules of the image generation system 100 may be implemented on a terminal (e.g., a mobile terminal). Also for example, processor 140 may include a memory module to implement memory functions.
It should be noted that the above description of the system and its modules is merely for convenience of description and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, modules may be combined arbitrarily or connected to other modules as sub-systems without departing from those principles. For example, the projection data acquisition module 210 and the image generation module 220 disclosed in FIG. 2 may be two different modules in one system, a single module may implement the functions of both, or any one module may be split into two or more units. Likewise, the modules may share one memory module, or each module may have its own memory module. Such variations are within the scope of the present application.
FIG. 3 is an exemplary flow chart illustrating the generation of a two-dimensional image of an object according to some embodiments of the present application.
In some embodiments, the process 300 may be implemented in the image generation system 100 shown in FIG. 1. For example, the flow 300 may be stored in a storage medium (e.g., the memory 150 or a storage module of the processor 140) in the form of instructions and may be invoked and/or executed by the processor 140 (or one or more modules in the processor 140 shown in FIG. 2). The operations of the flow 300 presented below are intended for illustrative purposes. In some embodiments, the flow 300 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order in which the operations of the flow 300 are illustrated in FIG. 3 and described below is not intended to be limiting. For convenience of description, the process 300 is described by taking a breast to be examined as an example of the object 112; the object 112 may, however, be any other part of a patient to be examined.
In step 310, projection data (e.g., projection images) of the object 112 (e.g., the breast to be examined) are acquired. The projection data are generated by exposing the object 112 with X-ray sources of the array X-ray source 110 from at least two different projection angles relative to the object 112, the array X-ray source comprising field emission X-ray sources.
In some embodiments, a single field emission X-ray source can cover the breast to be examined, and its power is sufficient for breast imaging. In this case, the array X-ray source 110 may be composed of a plurality of field emission X-ray sources arranged in a linear array or an area array. The field emission X-ray sources are exposed in a certain sequence, or simultaneously, and emit X-rays toward the breast from different angles. The detector 120 may detect the X-rays that have passed through the breast, generating a plurality of projection data corresponding to the plurality of field emission X-ray sources. Each of the plurality of projection data may partially or completely cover the entire breast.
In some embodiments, limited by factors such as the cold-cathode exposure mechanism and physical characteristics, a single field emission X-ray source may not cover the entire breast to be examined. In this case, the array X-ray source 110 may be composed of a plurality of field emission X-ray sources arranged as an area array, or as an area array combined with a linear array. The plurality of field emission X-ray sources comprises a plurality of groups of sources, each group corresponding to one of a plurality of regions of the breast. Each group comprises field emission X-ray sources at at least two different projection angles relative to the corresponding region. The field emission X-ray sources in each group are exposed in a certain sequence and emit X-rays from different angles toward the corresponding region of the breast. The detector 120 may detect the X-rays that pass through the corresponding region and generate a set of projection data for that region (comprising at least two projection data). The projection data in each set may cover the corresponding region of the breast. For example, the breast may comprise N regions; the plurality of field emission X-ray sources then includes at least N groups of sources, one per region. For each of the N regions, a set of projection data is generated; that is, the N regions correspond to N sets of projection data, which together constitute the plurality of projection data of the object 112. In other words, for each of the plurality of regions of the breast, projection data from the array X-ray source 110 at at least two different projection angles relative to that region can be obtained.
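The per-region bookkeeping above (N regions, each with a set of at least two projections) can be sketched as a small helper. This is an illustrative data-structure sketch only; the record format and function name are assumptions, not part of the patent:

```python
from collections import defaultdict

def group_projections(records):
    """records: iterable of (region_id, source_id, projection) tuples.
    Returns {region_id: [projection, ...]}, checking that every region
    has projection data from at least two angles, as the text requires."""
    sets = defaultdict(list)
    for region_id, source_id, projection in records:
        sets[region_id].append(projection)
    for region_id, projs in sets.items():
        if len(projs) < 2:
            raise ValueError(f"region {region_id} needs >= 2 projection angles")
    return dict(sets)
```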
In some embodiments, the plurality of projection data may be stored in the memory 150 and the plurality of projection data of the object 112 may be retrieved from the memory 150. In some embodiments, the plurality of projection data may be obtained directly from the detector 120.
In step 320, a two-dimensional image of the object 112 (e.g., the breast to be examined) is generated from the projection data.
In some embodiments, a three-dimensional reconstruction may be performed based on the projection data to generate a three-dimensional image of the object 112. For example, distortion adjustment, color adjustment, and/or gray-scale adjustment may first be performed according to the positional relationship between the array X-ray source and the imaging region, and a three-dimensional reconstruction technique may then be applied to the plurality of projection data in the spatial or time domain to generate a three-dimensional image of the breast to be examined. Exemplary three-dimensional reconstruction techniques may include one or a combination of filtered back projection, maximum likelihood, iterative, algebraic, minimum likelihood, Fourier transform, convolutional back projection, and the like. If each of the plurality of projection data covers the whole breast, applying a three-dimensional reconstruction technique to the plurality of projection data directly yields a complete three-dimensional image of the breast.
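Of the reconstruction families listed above, the algebraic one is the easiest to show in a few lines. Below is a minimal Kaczmarz-style algebraic reconstruction technique (ART) sketch on a toy 2x2 "image" whose rays are its row and column sums; it is illustrative only and not the patent's reconstruction method:

```python
import numpy as np

def art_reconstruct(A, p, n_sweeps=200, relax=1.0):
    """Kaczmarz-style ART: each row of A describes one projection ray,
    p holds the measured ray sums; the estimate is repeatedly projected
    onto each ray's hyperplane until it satisfies all measurements."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, p_i in zip(A, p):
            x = x + relax * (p_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy 2x2 "image" [[1, 2], [3, 4]] flattened to 4 pixels; the four rays
# are the two row sums and the two column sums.
A = np.array([[1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.]])
truth = np.array([1., 2., 3., 4.])
recon = art_reconstruct(A, A @ truth)
```

Starting from zero, Kaczmarz iterations converge to the minimum-norm solution consistent with the measurements, which for this toy system recovers the original image.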
If the plurality of projection data of the breast comprises multiple sets of projection data, the projection data in each set cover only the corresponding region of the breast; that is, each set of projection data comes from X-ray sources of the array X-ray source 110 at at least two different projection angles relative to one of the plurality of regions of the breast. In some embodiments, a three-dimensional reconstruction technique may be applied to each set of projection data to obtain a three-dimensional image of the corresponding region. The three-dimensional images corresponding to the different regions are then stitched to obtain a complete three-dimensional image of the breast. Specifically, the regional images may be stitched through operations such as image preprocessing, image registration, image fusion, and boundary smoothing. For example, the three-dimensional image of one region may be selected as a reference image, and the three-dimensional images of the other regions arranged according to the positional relationship between the array X-ray source and the corresponding imaging regions; the non-overlapping and overlapping portions of the regional images are extracted, the overlapping portions are fused, and the fused overlapping portions are stitched with the non-overlapping portions to obtain a complete three-dimensional image of the breast.
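The extract-fuse-stitch step above can be sketched for the simplest case of two reconstructed blocks that share some slices along one axis. This is a minimal sketch assuming the blocks are already registered and overlap by a known number of slices, with averaging as the fusion rule; none of this is specified by the patent:

```python
import numpy as np

def stitch_volumes(vol_a, vol_b, overlap):
    """Stitch two registered 3-D blocks along axis 0: the `overlap` shared
    slices are fused by averaging, non-overlapping slices are kept as-is."""
    fused = 0.5 * (vol_a[-overlap:] + vol_b[:overlap])
    return np.concatenate([vol_a[:-overlap], fused, vol_b[overlap:]], axis=0)
```

For example, stitching a 5-slice block and a 4-slice block that share 2 slices yields a 7-slice volume.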
In some embodiments, a three-dimensional reconstruction may be performed based on projection data (i.e., sets of projection data) of multiple regions of the breast to be detected, directly generating a complete three-dimensional image of the breast to be detected.
In some embodiments, a two-dimensional image of the object 112 may be generated by applying an image fusion technique to the three-dimensional image of the object 112. For example, the three-dimensional image of the breast may comprise a plurality of slice images of the breast; applying an image fusion technique to the plurality of slice images may generate a two-dimensional image of the object 112. The image fusion techniques may include one or a combination of data-level (or pixel-level) image fusion, feature-level image fusion, decision-level image fusion, and the like. For example, the image fusion technique may include one or a combination of maximum intensity projection, multi-scale analysis, wavelet transform, and the like.
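Of the fusion techniques listed, maximum intensity projection is the simplest to illustrate: each pixel of the fused 2-D image takes the largest value found at that pixel across the slice stack. A minimal sketch (function name is an assumption):

```python
import numpy as np

def mip_fuse(slice_stack):
    """Fuse a stack of tomographic slice images into a single 2-D image
    by maximum intensity projection: per-pixel maximum across slices."""
    return np.max(np.asarray(slice_stack), axis=0)
```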
In some embodiments, the two-dimensional image of the breast may be generated directly from the plurality of projection images of the breast. For example, the plurality of projection images may be input to a trained machine learning model, which outputs the two-dimensional image of the breast. The trained machine learning model is trained on sample images of sample breasts. The sample images include a plurality of projection images of a sample breast obtained with the field emission technique and a two-dimensional image of the sample breast obtained with a conventional two-dimensional or three-dimensional imaging technique, such as full-field digital mammography (FFDM) or digital breast tomosynthesis (DBT).
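The patent leaves the model unspecified. As a deliberately simple stand-in for the idea of learning a projections-to-image mapping from (projection, target) training pairs, the sketch below fits one scalar weight per projection angle by least squares; a real model would be far more expressive, and every name here is hypothetical:

```python
import numpy as np

def fit_fusion_weights(projection_sets, targets):
    """Fit one scalar weight per projection angle so that the weighted sum
    of projections approximates the target 2-D image, via least squares
    over training pairs. A toy linear stand-in for a trained model."""
    rows, rhs = [], []
    for projs, target in zip(projection_sets, targets):
        # Each pair contributes one equation per pixel: P @ w ~ target
        rows.append(np.stack([p.ravel() for p in projs], axis=1))
        rhs.append(target.ravel())
    w, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)
    return w

def apply_fusion(w, projs):
    """Produce the fused 2-D image from new projections and fitted weights."""
    return sum(wi * p for wi, p in zip(w, projs))
```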
It should be noted that the above description of the flow 300 is only for illustration and explanation and does not limit the applicable scope of the present application. Various modifications and changes to the flow 300 will be apparent to those skilled in the art in light of this disclosure; such modifications and variations remain within the scope of the present application. For example, after step 320, the process 300 may include an operation of sending the two-dimensional image of the object 112 to the display 160 for review by a physician. The two-dimensional image and the three-dimensional image of the object 112 may be displayed simultaneously or on separate pages. As another example, before step 320, the process 300 may include receiving a request to generate the fused two-dimensional image.
The image generation system 100 and method of the embodiments of the present application may bring beneficial effects including, but not limited to, the following: (1) because the breast does not need to be scanned rotationally, the scanning time is shortened, the patient's apprehension is eased, motion artifacts are reduced, and the quality of the breast image is improved, which facilitates accurate diagnosis of breast diseases and breast screening; (2) field emission X-ray sources are used for three-dimensional imaging, and the three-dimensional imaging data can be fused to generate a two-dimensional image with improved resolution, thereby improving the accuracy of a physician's diagnosis based on the fused two-dimensional image. It should be noted that different embodiments may produce different advantages; a given embodiment may produce any one or combination of the above advantages, or any other advantage.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that the claimed subject matter requires more features than are expressly recited in the claims. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (11)

1. A method for generating a two-dimensional image of an object, comprising:
acquiring projection data of the object, wherein the projection data is generated by exposing the object with X-ray sources, in an array X-ray source, that are located at at least two different projection angles relative to the object, and the array X-ray source comprises a field emission X-ray source;
generating a two-dimensional image of the object from the projection data.
2. A method for generating a two-dimensional image of an object as recited in claim 1, wherein the array X-ray source comprises one or more of a linear array source and an area array source.
3. The method for generating a two-dimensional image of an object as recited in claim 1, wherein the generating a two-dimensional image of the object from the projection data comprises:
applying an image fusion technique to the projection data to generate the two-dimensional image of the object.
4. A method for generating a two-dimensional image of an object as recited in claim 3, wherein said applying an image fusion technique to the projection data to generate the two-dimensional image of the object comprises:
generating a three-dimensional image of the object from the projection data;
generating a two-dimensional image of the object based on the three-dimensional image of the object.
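By way of illustration only (the claims do not prescribe a particular algorithm), one common way to derive a two-dimensional image from a reconstructed three-dimensional image is the maximum intensity projection mentioned in claim 8. The following numpy sketch uses hypothetical array shapes and values:

```python
import numpy as np

def mip_from_volume(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Collapse a 3-D volume into a 2-D image by taking, along each ray
    through the volume, the maximum voxel intensity."""
    return volume.max(axis=axis)

# Toy volume: zeros with two bright voxels at different depths.
vol = np.zeros((4, 3, 3))
vol[0, 1, 1] = 1.0
vol[2, 0, 2] = 0.5
img = mip_from_volume(vol, axis=0)  # 2-D image of shape (3, 3)
```

Both bright voxels survive the projection regardless of the slice they lie in, which is why MIP is often used to make high-contrast structures visible in a single 2-D view.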
5. The method for generating a two-dimensional image of an object as recited in claim 4, wherein the object comprises a plurality of regions, the acquiring projection data of the object comprising:
for each of the plurality of regions, acquiring projection data generated by X-ray sources in the array X-ray source that are located at at least two different projection angles with respect to the region.
6. The method for generating a two-dimensional image of an object as recited in claim 5, wherein said generating a three-dimensional image of the object from the projection data comprises:
for each of the plurality of regions, generating a three-dimensional image of the region from the projection data acquired at the at least two different projection angles with respect to the region;
stitching the three-dimensional images of the plurality of regions to generate a three-dimensional image of the object.
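A minimal sketch of the stitching step in claim 6, assuming the sub-volumes overlap by a known number of slices along one axis and the overlapping slices are blended by simple averaging (the claim itself does not fix a blending rule; both assumptions are illustrative):

```python
import numpy as np

def stitch_volumes(vol_a: np.ndarray, vol_b: np.ndarray,
                   overlap: int, axis: int = 0) -> np.ndarray:
    """Stitch two 3-D sub-volumes that share `overlap` slices along `axis`,
    averaging the shared slices to avoid a visible seam."""
    a = np.moveaxis(vol_a, axis, 0)
    b = np.moveaxis(vol_b, axis, 0)
    blended = 0.5 * (a[-overlap:] + b[:overlap])
    stitched = np.concatenate([a[:-overlap], blended, b[overlap:]], axis=0)
    return np.moveaxis(stitched, 0, axis)

# Hypothetical sub-volumes: 4 slices of 1.0 and 3 slices of 3.0, 1 shared slice.
a = np.ones((4, 2, 2))
b = np.full((3, 2, 2), 3.0)
v = stitch_volumes(a, b, overlap=1)  # shape (6, 2, 2); shared slice becomes 2.0
```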
7. The method for generating a two-dimensional image of an object as recited in claim 5, wherein said generating a three-dimensional image of the object from the projection data comprises:
performing three-dimensional reconstruction based on the projection data of the plurality of regions to generate the three-dimensional image of the object.
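The three-dimensional reconstruction recited in claim 7 can take many forms. As a non-limiting illustration, a toy shift-and-add backprojection, a classical tomosynthesis reconstruction, is sketched below; the per-plane pixel shifts are hypothetical and would in practice follow from the source and detector geometry:

```python
import numpy as np

def shift_and_add(projections, shifts_per_plane):
    """Reconstruct tomosynthesis planes by shift-and-add backprojection:
    for each plane, shift every projection by that plane's offset
    (in pixels along the detector row axis) and average."""
    n = len(projections)
    planes = []
    for shifts in shifts_per_plane:  # one pixel shift per projection
        acc = np.zeros_like(projections[0], dtype=float)
        for proj, s in zip(projections, shifts):
            acc += np.roll(proj, s, axis=1)
        planes.append(acc / n)
    return np.stack(planes)

# Two 1x5 projections of a point feature seen from two angles.
p0 = np.zeros((1, 5)); p0[0, 2] = 1.0
p1 = np.zeros((1, 5)); p1[0, 3] = 1.0
# Plane 0 shifts realign the feature; plane 1 leaves it smeared.
planes = shift_and_add([p0, p1], [[0, -1], [0, 0]])
```

In the plane whose shifts match the feature's depth, the feature adds coherently (value 1.0); in other planes it is spread out at half intensity, which is the depth-discrimination effect tomosynthesis relies on.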
8. A method for generating a two-dimensional image of an object as recited in claim 3, wherein the image fusion technique comprises one or more of maximum intensity projection, multi-scale analysis, and wavelet transform.
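As a concrete, non-limiting illustration of the wavelet-transform fusion option in claim 8, the numpy-only sketch below performs a single-level 2-D Haar decomposition of two images (assumed to have even dimensions), averages the low-frequency bands, keeps the stronger of the two detail coefficients, and reconstructs the fused image:

```python
import numpy as np

def haar2d(x):
    """Single-level 2-D Haar decomposition of an even-sized image."""
    a = (x[0::2] + x[1::2]) / 2          # row-pair averages
    d = (x[0::2] - x[1::2]) / 2          # row-pair details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2   # low-low (approximation)
    lh = (a[:, 0::2] - a[:, 1::2]) / 2   # horizontal detail
    hl = (d[:, 0::2] + d[:, 1::2]) / 2   # vertical detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 2   # diagonal detail
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d (perfect reconstruction)."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

def wavelet_fuse(img1, img2):
    """Fuse two images: average the approximations, keep the detail
    coefficient with the larger magnitude at each position."""
    c1, c2 = haar2d(img1), haar2d(img2)
    ll = (c1[0] + c2[0]) / 2
    details = [np.where(np.abs(a) >= np.abs(b), a, b)
               for a, b in zip(c1[1:], c2[1:])]
    return ihaar2d(ll, *details)
```

The max-magnitude rule for detail bands is one common fusion choice; multi-scale variants repeat the decomposition on the approximation band.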
9. A system for generating an image, the system comprising:
a projection data acquisition module configured to acquire projection data of an object, wherein the projection data is generated by exposing the object with X-ray sources in an array X-ray source that are positioned at at least two different projection angles relative to the object, and the array X-ray source comprises a field emission X-ray source;
an image generation module configured to generate a two-dimensional image of the object from the projection data.
10. An apparatus for generating an image, comprising a processor, wherein the processor is configured to perform the method for generating a two-dimensional image of an object according to any one of claims 1 to 8.
11. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method for generating a two-dimensional image of an object according to any one of claims 1 to 8.
CN202010316478.0A 2020-04-21 2020-04-21 Method and system for generating two-dimensional image of object Pending CN113520416A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010316478.0A CN113520416A (en) 2020-04-21 2020-04-21 Method and system for generating two-dimensional image of object


Publications (1)

Publication Number Publication Date
CN113520416A true CN113520416A (en) 2021-10-22

Family

ID=78123785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010316478.0A Pending CN113520416A (en) 2020-04-21 2020-04-21 Method and system for generating two-dimensional image of object

Country Status (1)

Country Link
CN (1) CN113520416A (en)


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101641589A (en) * 2008-01-15 2010-02-03 西门子公司 Method and device for producing a tomosynthetic 3d x-ray image
CN101842052A (en) * 2007-07-19 2010-09-22 The University of North Carolina at Chapel Hill Stationary x-ray digital breast tomosynthesis systems and related methods
US20110243418A1 (en) * 2010-03-30 2011-10-06 Kabushiki Kaisha Toshiba MRI mammography with facilitated comparison to other mammography images
US20120057672A1 (en) * 2010-09-07 2012-03-08 Samsung Electronics Co., Ltd. Apparatus and method for imaging breast
DE102010062541A1 (en) * 2010-12-07 2012-06-14 Siemens Aktiengesellschaft Mammography apparatus for performing tomosynthesis measurement of examination object e.g. breast of patient, has X-ray emitters that are arranged, in order to incident X-ray radiation on examination object
CN102579067A (en) * 2011-01-03 2012-07-18 通用电气公司 Method for assisted positioning of an organ on a platform of a medical imaging system
CN106132302A (en) * 2013-11-06 2016-11-16 射线科学有限公司 X-ray imaging equipment including multiple x-ray sources
CN106651982A (en) * 2016-12-16 2017-05-10 西安交通大学 CT (Computed Tomography) image reconstruction method based on array X ray source and detector
CN109310384A (en) * 2016-05-13 2019-02-05 皇家飞利浦有限公司 The system and method for multi-beam X-ray exposure for 4D imaging
CN109561865A (en) * 2016-08-08 2019-04-02 Adaptix Ltd. Method and system for reconstructing a three-dimensional image by spatio-temporal superposition of X-rays
CN109589127A (en) * 2018-10-29 2019-04-09 深圳先进技术研究院 CT scan headend equipment, system, method and storage medium
CN110621233A (en) * 2017-03-30 2019-12-27 豪洛捷公司 System and method for synthesizing low-dimensional image data from high-dimensional image data using object mesh enhancement
US20200008759A1 (en) * 2018-07-03 2020-01-09 Fujifilm Corporation Image display device, image display method, and image display program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gao Hongwei et al., "Electronic Manufacturing Equipment Technology", Xidian University Press, pages 238-239 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023072266A1 (en) * 2021-10-29 2023-05-04 Shanghai United Imaging Healthcare Co., Ltd. Methods, systems and computer storage mediums for image processing
CN116934757A (en) * 2023-09-18 2023-10-24 电子科技大学 Method, equipment and storage medium for lung nodule false positive pruning
CN116934757B (en) * 2023-09-18 2023-11-21 电子科技大学 Method, equipment and storage medium for lung nodule false positive pruning

Similar Documents

Publication Publication Date Title
US10307129B2 (en) Apparatus and method for reconstructing tomography images using motion information
US8385621B2 (en) Method for reconstruction images and reconstruction system for reconstructing images
US7978886B2 (en) System and method for anatomy based reconstruction
US10143433B2 (en) Computed tomography apparatus and method of reconstructing a computed tomography image by the computed tomography apparatus
KR101665513B1 (en) Computer tomography apparatus and method for reconstructing a computer tomography image thereof
US20220313176A1 (en) Artificial Intelligence Training with Multiple Pulsed X-ray Source-in-motion Tomosynthesis Imaging System
KR20160119540A (en) Tomography apparatus and method for processing a tomography image thereof
CN113520416A (en) Method and system for generating two-dimensional image of object
CN111528879A (en) Method and system for acquiring medical image
US20200226800A1 (en) Tomographic imaging apparatus and method of generating tomographic image
KR20170087320A (en) Tomography apparatus and method for reconstructing a tomography image thereof
JP5329204B2 (en) X-ray CT system
US6728331B1 (en) Method and system for trauma application of CT imaging
KR102387403B1 (en) Projection data correction method for truncation artifact reduction
CN114305469A (en) Low-dose digital breast tomography method and device and breast imaging equipment
CN111583354A (en) Training method for medical image processing unit and medical image motion estimation method
CN110730977B (en) Low dose imaging method and device
CN111973209A (en) Dynamic perspective method and system of C-shaped arm equipment
US20110150176A1 (en) Image processing with computer aided detection and/or diagnosis
US20230115941A1 (en) X-ray diagnostic apparatus and medical information processing method
CN117594197A (en) Preview generation method and device and electronic equipment
CN116704058A (en) Dual-source CT image processing method and system
CN113520415A (en) X-ray image acquisition method and system
CN117338315A (en) Photon counting CT imaging method and system
CN115100132A (en) Method and apparatus for analyzing tomosynthesis image, computer device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination