CN111973209A - Dynamic perspective method and system of C-shaped arm equipment - Google Patents

Info

Publication number
CN111973209A
Authority
CN
China
Prior art keywords
energy
subject
shooting
image
perspective data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010957835.1A
Other languages
Chinese (zh)
Inventor
向军
王振玮
王伟懿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202010957835.1A
Publication of CN111973209A
Priority to PCT/CN2021/118006 (WO2022053049A1)
Priority to EP21866101.5A (EP4201331A4)
Priority to US18/182,286 (US20230230243A1)
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/48 Diagnostic techniques
    • A61B6/482 Diagnostic techniques involving multiple energy imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/56 Details of data transmission or power supply, e.g. use of slip rings

Abstract

The embodiments of the present application disclose a dynamic perspective (fluoroscopy) method and system for a C-arm device. The method includes: photographing a subject in a shooting period, during which first perspective data of a radiation source irradiating the subject at a first energy and second perspective data of the radiation source irradiating the subject at a second energy different from the first energy are acquired; photographing the subject in a plurality of consecutive shooting periods; and displaying a dynamic image of the subject based on the first perspective data and the second perspective data acquired in each of the plurality of consecutive shooting periods.

Description

Dynamic perspective method and system of C-shaped arm equipment
Technical Field
The present application relates to the technical field of medical imaging equipment, and in particular to a dynamic perspective method and system for a C-arm device.
Background
A radiation device (e.g., a DSA device, a DR device, an X-ray machine, a mammography machine, etc.) photographs and/or treats a subject by emitting radiation (e.g., X-rays). During surgery or diagnosis, if medical staff can dynamically observe the changes and movement of the lesion and/or of each tissue and organ of the subject, they can reach a diagnosis more quickly and accurately or perform clinical operations more effectively.
Therefore, there is a need to provide a dynamic fluoroscopy method and system to better assist diagnosis/treatment.
Disclosure of Invention
One of the embodiments of the present application provides a dynamic perspective method for a C-arm device, including: shooting a subject in a shooting period, acquiring first perspective data of a radiation source irradiating the subject at a first energy in the shooting process, and acquiring second perspective data of the radiation source irradiating the subject at a second energy different from the first energy; performing the photographing on the subject in a plurality of consecutive photographing periods; displaying a dynamic image of the subject based on the first perspective data and the second perspective data acquired in each of the plurality of continuous photographing periods.
One of the embodiments of the present application provides a dynamic perspective system of a C-arm device, including a shooting module and a display module; the shooting module is used for shooting a subject in a shooting period, and acquiring first perspective data of a ray source irradiating the subject under first energy and second perspective data of the ray source irradiating the subject under second energy different from the first energy in the shooting process; the shooting module is further used for shooting the subject in a plurality of continuous shooting periods; the display module is configured to display a dynamic image of the subject according to the first perspective data and the second perspective data acquired in each of the plurality of consecutive shooting periods.
One of the embodiments of the present application further provides an apparatus for dynamic perspective of a C-arm device, where the apparatus includes at least one processor and at least one storage device, where the storage device is configured to store instructions, and when the instructions are executed by the at least one processor, the apparatus implements the dynamic perspective method according to any of the embodiments of the present application.
One of the embodiments of the present application further provides a C-arm imaging system, which includes a radiation source, a detector, a memory, and a display, where the radiation source includes a bulb (X-ray tube), a high-voltage generator, and a high-voltage control module. The high-voltage control module controls the high-voltage generator to switch back and forth between a first energy and a second energy; driven by the high-voltage generator, the radiation source emits rays toward the subject in a shooting period, and the detector acquires first perspective data of the subject under irradiation at the first energy and second perspective data under irradiation at the second energy. The memory stores the first perspective data and the second perspective data acquired in each of a plurality of consecutive shooting periods; the memory also stores an image processing unit, which performs subtraction processing on the first perspective data and the second perspective data in each shooting period to obtain an image of the subject for each shooting period, and thereby a dynamic image over the plurality of shooting periods. The display is used for displaying the dynamic image of the subject over the plurality of shooting periods.
One of the embodiments of the present application further provides a computer-readable storage medium, where the storage medium stores computer instructions, and after a computer reads the computer instructions in the storage medium, the computer executes the dynamic perspective method according to any embodiment of the present application.
Drawings
The present application will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a dynamic perspective system of a C-arm device according to some embodiments of the present application;
FIG. 2 is an exemplary flow diagram of a dynamic perspective method of a C-arm device shown in accordance with some embodiments of the present application;
FIG. 3 is an exemplary block diagram of a dynamic perspective system of a C-arm device shown in accordance with some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are only examples or embodiments of the application; based on them, a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system," "device," and/or "module" as used herein is a way of distinguishing different components, elements, parts, or assemblies at different levels. However, these words may be replaced by other expressions that accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a," "an," and "the" may also include plural forms unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed exactly in the order shown. Rather, various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to the processes, or one or more steps may be removed from them.
Fig. 1 is a schematic view of an application scenario of a dynamic perspective system of a C-arm device according to some embodiments of the present application. The dynamic perspective system 100 can include a perspective device 110, a network 120, at least one terminal 130, a processing device 140, and a storage device 150. The various components of the system 100 may be interconnected by a network 120. For example, the see-through device 110 and the at least one terminal 130 may be connected or in communication via the network 120.
The fluoroscopy device 110 may include a digital subtraction angiography (DSA) device, a digital radiography (DR) device, a computed radiography (CR) device, a digital fluoroscopy (DF) device, a CT scanner, a magnetic resonance scanner, a mammography machine, a C-arm device, etc. In some embodiments, the fluoroscopy device 110 may comprise a gantry, a detector, a detection area, a scanning bed, and a radiation source. The gantry may be used to support the detector and the radiation source. The scanning bed is used to position the subject for scanning. The subject may include a patient, a phantom, or another scanned object. The radiation source may emit X-rays toward the subject to irradiate it, and the detector may receive the X-rays. By photographing (i.e., irradiating) the subject over a plurality of shooting periods, the fluoroscopy device 110 may acquire fluoroscopy data for those periods to generate (or reconstruct) a dynamic image of the subject.
The network 120 may include any suitable network capable of facilitating information and/or data exchange for the dynamic perspective system 100. In some embodiments, at least one component of the dynamic perspective system 100 (e.g., the perspective device 110, the processing device 140, the storage device 150, the at least one terminal 130) may exchange information and/or data with at least one other component of the system over the network 120. For example, the processing device 140 may obtain an image of the subject from the fluoroscopy device 110 via the network 120. The network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN)), a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, switches, etc., or any combination thereof. For example, the network 120 may include a wireline network, a fiber optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, the like, or any combination thereof. In some embodiments, the network 120 may include at least one network access point. For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or internet exchange points, through which at least one component of the dynamic perspective system 100 may connect to the network 120 to exchange data and/or information.
At least one terminal 130 may be in communication and/or connection with the see-through device 110, the processing device 140, and/or the storage device 150. For example, the first perspective data and the second perspective data acquired by the processing device 140 may be stored in the storage device 150. In some embodiments, at least one terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, and the like, or any combination thereof. For example, mobile device 131 may include a mobile joystick, a Personal Digital Assistant (PDA), a smart phone, or the like, or any combination thereof. In some embodiments, at least one terminal 130 may include a display that may be used to display information related to the dynamic fluoroscopy process (e.g., a dynamic image of the subject).
In some embodiments, at least one terminal 130 may include an input device, an output device, and the like. The input device may be selected from keyboard input, touch screen (e.g., with tactile or haptic feedback) input, voice input, eye tracking input, gesture tracking input, brain monitoring system input, image input, video input, or any other similar input mechanism. Input information received via the input device may be transmitted, for example, via a bus, to the processing device 140 for further processing. Other types of input devices may include cursor control devices, such as a mouse, a trackball, or cursor direction keys, among others. In some embodiments, an operator (e.g., a technician or physician) may input instructions reflecting the user-selected category of dynamic images via an input device. Output devices may include a display, speakers, printer, or the like, or any combination thereof. The output device may be used to output a moving image or the like determined by the processing device 140. In some embodiments, at least one terminal 130 may be part of the processing device 140.
The processing device 140 may process data and/or information obtained from the perspective device 110, the storage device 150, the at least one terminal 130, or other components of the dynamic perspective system 100. For example, the processing device 140 may obtain fluoroscopy data of the subject from the fluoroscopy device 110. In some embodiments, the processing device 140 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data from the see-through device 110, the storage device 150, and/or the at least one terminal 130 via the network 120. As another example, the processing device 140 may be directly connected to the see-through device 110, the at least one terminal 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
Storage device 150 may store data, instructions, and/or any other information. Such as a history shooting protocol, etc. In some embodiments, the storage device 150 may store data obtained from the see-through device 110, the at least one terminal 130, and/or the processing device 140. In some embodiments, storage device 150 may store data and/or instructions that are used by processing device 140 to perform or use to perform the exemplary methods described in this application. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the storage device 150 may be implemented on a cloud platform.
In some embodiments, a storage device 150 may be connected to the network 120 to communicate with at least one other component (e.g., the processing device 140, at least one terminal 130) in the dynamic perspective system 100. At least one component in the dynamic perspective system 100 may access data (e.g., perspective data) stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be part of the processing device 140.
It should be noted that the foregoing description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other characteristics of the example embodiments described herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the storage device 150 may be a data storage device deployed on a cloud computing platform, such as a public cloud, a private cloud, a community cloud, a hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present application.
In some embodiments, the present application also relates to a C-arm imaging system. The C-arm imaging system may include a radiation source, a detector, a memory, and a display, the radiation source including a bulb (X-ray tube), a high-voltage generator, and a high-voltage control module. In some embodiments, the high-voltage control module may control the high-voltage generator to switch back and forth between a first energy and a second energy; driven by the high-voltage generator, the radiation source may emit rays toward the subject in a shooting period, and the detector may acquire first perspective data of the subject under irradiation at the first energy and second perspective data under irradiation at the second energy. The memory may store the first perspective data and the second perspective data acquired in each of a plurality of consecutive shooting periods. The memory also stores an image processing unit, which performs subtraction processing on the first perspective data and the second perspective data in each shooting period to obtain an image of the subject for each shooting period, and thereby a dynamic image over the plurality of shooting periods. The display may be used to display the dynamic image of the subject over the plurality of shooting periods.
In some embodiments, medical staff need to photograph the subject frequently to understand the subject's physical condition; however, a single picture can only reflect the condition of the lesion and/or each tissue and organ at a particular moment, which often fails to meet their needs. In particular, when a clinical operation such as surgery or diagnosis is performed on a subject, medical staff need to know in real time the changes or movement of the whole subject, of a part of it (e.g., a bone, soft tissue, etc.), or of a certain lesion, and therefore need to photograph the subject continuously over one or more shooting periods using the fluoroscopy device 110 with a dynamic fluoroscopy function to acquire a dynamic fluoroscopy image. However, in some embodiments, a fluoroscopy device with a fluoroscopy function can only irradiate the subject at one energy to obtain a fluoroscopy image, so the photoelectric absorption effect and the Compton scattering effect cannot be exploited effectively to obtain the images required by medical staff, which reduces the efficiency of diagnosis and clinical operations.
Therefore, some embodiments of the present application provide a dynamic perspective method in which the radiation source irradiates the subject in turn at different energies, so that perspective data of the subject irradiated at each energy are obtained; a dynamic perspective image of the subject required by the medical staff is then derived based on the photoelectric absorption effect and the Compton scattering effect, effectively improving the efficiency of clinical operations and better assisting medical staff in diagnosis/treatment.
FIG. 2 is an exemplary flow diagram of a dynamic perspective method shown in accordance with some embodiments of the present application. In particular, the dynamic perspective method 200 may be performed by the dynamic perspective system 100 (e.g., the processing device 140). For example, the dynamic perspective method 200 may be stored in a storage device (e.g., storage device 150) in the form of a program or instructions that, when executed by the dynamic perspective system 100 (e.g., processing device 140), may implement the dynamic perspective method 200.
Step 210, a subject is photographed in a photographing period, first perspective data of the radiation source irradiating the subject at a first energy is acquired in the photographing process, and second perspective data of the radiation source irradiating the subject at a second energy different from the first energy is acquired. In some embodiments, step 210 may be performed by the capture module 310.
The subject may refer to the object positioned under the fluoroscopy device 110 to be irradiated by the radiation source. In some embodiments, the subject may be a person, e.g., a patient or a physical-examination subject, a portion of whom (e.g., head, chest) is photographed. In some embodiments, the subject may also be a specific organ, tissue, or site.
A shooting cycle may be understood as a period of time during which the radiation source photographs a subject with radiation of a first energy and of a second energy, respectively. In some embodiments, the imaging of the subject at the first energy and the imaging at the second energy may be continuous, i.e., the second-energy shooting may be performed immediately after the first-energy shooting is completed. For convenience of description, the radiation source photographing (i.e., irradiating) a subject at the first energy may also be referred to as first-energy shooting, and photographing at the second energy as second-energy shooting. In some embodiments, the high-voltage generator may be controlled by the high-voltage control module to switch between the first energy and the second energy, thereby enabling switching between first-energy shooting and second-energy shooting. In some embodiments, there may be a time interval between the first-energy shooting and the second-energy shooting. In some embodiments, the shooting period may be 1/25 second, 1/50 second, or the like.
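The alternation within one shooting period can be sketched as a minimal control loop. This is only an illustration: the `set_tube_energy` and `expose` callables are hypothetical stand-ins for the high-voltage control module and the detector readout, which the text does not specify as an interface.

```python
from dataclasses import dataclass

@dataclass
class ShootingPeriod:
    """One shooting period: a first-energy and a second-energy exposure."""
    first_kev: float
    second_kev: float

def run_period(period, set_tube_energy, expose):
    """Drive one shooting period: switch the (hypothetical) high-voltage
    generator to each energy in turn and read out one frame of
    perspective data at each setting."""
    set_tube_energy(period.first_kev)   # high-voltage control module -> first energy
    first_data = expose()               # first perspective data
    set_tube_energy(period.second_kev)  # switch to the second energy
    second_data = expose()              # second perspective data
    return first_data, second_data
```

Repeating `run_period` back to back corresponds to the plurality of consecutive shooting periods described in the method.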
In some embodiments, the fluoroscopy device 110 may comprise a plurality of radiation sources (e.g. a first radiation source and a second radiation source), each radiation source emitting a different energy of radiation. For example, a first radiation source may emit first radiation at a first energy and a second radiation source may emit second radiation at a second energy, the first energy being lower or higher than the second energy. In some embodiments, the fluoroscopy device 110 may also have only one radiation source, which may emit radiation of different energies under the control of the high voltage control module.
In some embodiments, the energy difference between the first energy and the second energy may be 5 keV to 120 keV. In some preferred embodiments, the energy difference may be 10 keV to 90 keV; in others, 20 keV to 100 keV. In some embodiments, the available energy range of the first energy and that of the second energy may partially overlap or be spaced apart. For example, the available energy range of the first energy may be 50 keV to 100 keV and that of the second energy 70 keV to 130 keV; or the available energy range of the first energy may be 60 keV to 90 keV and that of the second energy 100 keV to 120 keV. In some embodiments, the first energy may be 70 keV and the second energy 120 keV.
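As a small illustration of these constraints, a helper (hypothetical; not part of the described system) can check that a chosen pair of tube energies differs by an amount within the broad 5-120 keV range:

```python
def valid_energy_pair(first_kev: float, second_kev: float,
                      min_diff: float = 5.0, max_diff: float = 120.0) -> bool:
    """Return True if the two tube energies differ by an amount within
    the broad range given in the text (5 keV to 120 keV by default)."""
    diff = abs(second_kev - first_kev)
    return min_diff <= diff <= max_diff
```

For the example pair in the text (70 keV and 120 keV) the difference is 50 keV, which satisfies the broad range as well as both preferred ranges.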
In some embodiments, the processing device 140 may acquire perspective data of the subject under irradiation by rays (also known as beams) of different energies. For example, the processing device 140 may acquire first perspective data of the subject under irradiation with radiation of the first energy and second perspective data under irradiation with radiation of the second energy. In some embodiments, the fluoroscopy data may refer to the data detected by the detector of the fluoroscopy device 110 after the radiation passes through the irradiated subject, and the memory of the fluoroscopy device 110 may store the first and second fluoroscopy data acquired in each of the plurality of consecutive shooting periods. Further, the processing device 140 or the image processing unit in the memory may process the perspective data (e.g., by subtraction) to obtain images for the plurality of shooting periods, and thereby a dynamic image spanning those periods. In some embodiments, the perspective data may also be understood as an image reconstructed from the data detected by the detector; the image may be reconstructed using a semi-reconstruction method, a segmented reconstruction method, or the like. In some embodiments, the processing device 140 may store the perspective data for subsequent processing to achieve dynamic perspective.
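The per-period processing just described can be sketched as follows. `process_pair` stands in for whatever per-period step is used (e.g., a subtraction function), and stacking one frame per period into an array is an illustrative assumption about the data layout, not a detail given in the text.

```python
import numpy as np

def build_dynamic_image(per_period_data, process_pair):
    """Apply the per-period processing step to the (first, second)
    perspective data of each consecutive shooting period and stack the
    results into a dynamic image with one frame per period."""
    frames = [process_pair(first, second) for first, second in per_period_data]
    return np.stack(frames)  # shape: (num_periods, height, width)
```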
It should be noted that the present application does not limit the shooting steps in one shooting cycle. The dynamic fluoroscopy system 100 may first irradiate the subject with the radiation source at a first energy to obtain first fluoroscopy data, and then irradiate the subject with the radiation source at a second energy to obtain second fluoroscopy data. Alternatively, the imaging system 100 may first irradiate the subject at the second energy using the radiation source and then irradiate the subject at the first energy using the radiation source. The specific shooting step can be selected according to actual conditions.
Step 220, performing the photographing on the subject in a plurality of consecutive photographing periods. In some embodiments, step 220 may be performed by the capture module 310.
In some embodiments, the plurality of consecutive photographing periods may include a first photographing period, a second photographing period, ..., and an Nth photographing period, and each photographing period is completed before the next one begins. In some embodiments, a photographing period may take the start time of a shooting (e.g., the first-energy shooting) as its start and the completion time of a shooting (e.g., the second-energy shooting) as its end. In this case, after the radiation source finishes irradiating the subject at the second energy in the current period, the next period may begin directly, i.e., the radiation source irradiates the subject at the first energy. In some embodiments, a photographing period may also include a certain interval before the first-energy shooting begins and/or after the second-energy shooting completes. In a specific embodiment, a photographing period may take the start time of the first-energy shooting as its start; after the first-energy shooting is completed, the second-energy shooting may be performed after a first time interval, and the period may end, with the next period beginning, after a second time interval following the completion of the second-energy shooting. In some embodiments, the first and second time intervals may be equal. In a specific embodiment, the photographing period is related to the frame rate; for example, a DSA device may capture 50 fluoroscopic images continuously in 1 second, in which case two consecutive frames form one photographing period, i.e., 0.04 second.
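The frame-rate relationship in the last sentence is simple arithmetic: with two consecutive frames (one per energy) forming a period, the period length is twice the frame interval.

```python
def shooting_period_seconds(frames_per_second: float) -> float:
    """Two consecutive frames (one first-energy, one second-energy)
    form one shooting period, so the period is 2 / frame rate."""
    return 2.0 / frames_per_second
```

For the DSA example of 50 fluoroscopic images per second this gives 0.04 s, matching the text.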
Step 230, displaying a dynamic image of the subject according to the first perspective data and the second perspective data acquired in each of the plurality of continuous shooting periods. In some embodiments, step 230 may be performed by display module 320.
In some embodiments, for each shooting period, the display module 320 may generate an image of the subject (e.g., a target image) based on the first and second perspective data acquired in that period. The target images of the plurality of consecutive shooting periods together constitute a dynamic image of the subject, which can reflect the changes in the subject (e.g., a lesion, a tissue, and/or an organ) over those periods.
In some embodiments, the category of the dynamic image (or target image) of the subject may include a soft tissue image, a bone image, and/or a composite image including at least soft tissue and bone. In some embodiments, a soft tissue image may refer to an image that displays only a soft tissue portion. A bone image may refer to an image that displays only a bone portion. A composite image may refer to an image that displays at least a portion of soft tissue and at least a portion of bone. In some embodiments, the composite image may be integrated from the soft tissue image and the bone image according to a particular weight. In some embodiments, the weights occupied by the soft tissue image and the bone image in the composite image may be determined at the discretion of the medical staff, for example, based on information input by the medical staff via an input device.
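A minimal sketch of the weighted integration described above, assuming the soft-tissue and bone images are same-shaped grayscale arrays and the weight is a scalar supplied by the operator (the function name and the convention that the weight applies to the soft-tissue image are illustrative assumptions, not the disclosed implementation):

```python
import numpy as np

def composite_image(soft_tissue: np.ndarray, bone: np.ndarray, weight: float) -> np.ndarray:
    """Blend the soft-tissue and bone images; `weight` is the fraction
    assigned to the soft-tissue image (e.g., taken from operator input),
    with the remainder assigned to the bone image."""
    if not 0.0 <= weight <= 1.0:
        raise ValueError("weight must lie in [0, 1]")
    return weight * soft_tissue + (1.0 - weight) * bone
```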
In some embodiments, the processing device 140 (e.g., the display module 320) may determine and display a dynamic image of the subject using a dual-energy subtraction technique based on the first perspective data and the second perspective data acquired in each of the plurality of consecutive photographing periods. Specifically, the intensity of the photoelectric absorption effect is positively correlated with the relative atomic mass of the irradiated substance (e.g., the soft tissue or bone parts of the subject), and photoelectric absorption is the main mode by which high-density substances such as calcium, bone, and iodine contrast agent attenuate X-ray photon energy. The Compton scattering effect is independent of the relative atomic mass of the irradiated substance, is a function of the electron density of the body tissues and organs of the subject, and occurs mainly in soft tissue. The dual-energy subtraction technique determines a target image (such as a soft tissue image, a bone image, or a composite image) of the subject by exploiting the difference in how bone and soft tissue attenuate X-ray photon energy and the difference in the photoelectric absorption of substances with different atomic weights. This difference in attenuation and absorption is apparent for X-ray beams of different energies, whereas beam energy has a negligible effect on the intensity of the Compton scattering effect. The display module 320 may process the first perspective data and the second perspective data using the dual-energy subtraction technique by selectively removing, or partially removing, the attenuation information of bone or soft tissue, thereby obtaining and displaying a soft tissue image, a bone image, or a composite image on a display (e.g., the terminal 130). In some embodiments, an image processing unit in the memory may also process the first perspective data and the second perspective data.
In some embodiments, the first perspective data may include a first projection image, and the second perspective data may include a second projection image. The processing device 140 may determine the gray level I_l(x, y) of each pixel in the first projection image and the gray level I_h(x, y) of each pixel in the second projection image. Based on the gray level of each pixel in the first projection image, the gray level of each pixel in the second projection image, and the subtraction parameter ω, the processing device 140 may determine the gray level of each pixel in the target image according to formula (1), where the subtraction parameter ω can be set according to different parts of the subject.
I_des(x, y) = I_l(x, y) / I_h(x, y)^ω    (1)
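Formula (1) can be sketched as follows, assuming the two projection images are positive-valued grayscale arrays and ω is a scalar subtraction parameter chosen per anatomical region (an illustrative reading of the formula, not the disclosed implementation):

```python
import numpy as np

def dual_energy_subtraction(low: np.ndarray, high: np.ndarray, omega: float) -> np.ndarray:
    """Per-pixel evaluation of formula (1):
    I_des(x, y) = I_l(x, y) / I_h(x, y)**omega.
    Taking logarithms, ln I_des = ln I_l - omega * ln I_h, i.e., a
    weighted log-subtraction of the low- and high-energy images."""
    return low / np.power(high, omega)
```

Because the formula is a ratio of exponentiated intensities, varying ω shifts how much of the high-energy (bone-dominated) attenuation is cancelled, which is why ω is set per body part.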
In other embodiments, the processing device 140 may further determine the first target image by performing an overlay process on the first projection image and the second projection image; and determining a second target image by performing reverse color processing on the first target image. The first target image and the second target image may be different types of dynamic images, for example, the first target image may be a bone image and the second target image may be a soft tissue image.
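The overlay-then-invert sequence described above can be sketched for 8-bit grayscale images as follows; averaging is assumed as one simple form of superposition, and the function names are illustrative (the disclosure does not specify the exact overlay and inverse-color operations):

```python
import numpy as np

def first_target_image(low: np.ndarray, high: np.ndarray) -> np.ndarray:
    """Overlay the two 8-bit projection images by averaging them,
    widening to uint16 first to avoid overflow."""
    return ((low.astype(np.uint16) + high.astype(np.uint16)) // 2).astype(np.uint8)

def second_target_image(first: np.ndarray) -> np.ndarray:
    """Inverse-color (photographic negative) of an 8-bit image."""
    return 255 - first
```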
In some embodiments, the processing device 140 may obtain an instruction input by a user (e.g., a medical care provider, a technician); the instruction may reflect the category of dynamic image the user selects to display, and the processing device 140 may then determine and display a dynamic image of the subject based on the instruction. The medical staff can select the category of dynamic image generated by the fluoroscopy device 110 according to actual needs. For example, when performing surgery/diagnosis on a bone part, the medical staff can select to generate and display a bone image; when performing surgery/diagnosis on a soft tissue part, the medical staff can select to generate and display a soft tissue image; and when both the bone and soft tissue parts need to be viewed, the medical staff can select to generate and display a composite image. In some embodiments, the instruction may include a selection instruction, a voice instruction, a text instruction, or the like. A voice instruction may refer to voice information input by the medical staff through an input device, for example, "display bone image". A text instruction may refer to text information entered by the medical staff via an input device, for example, "display soft tissue image" entered on the input device. A selection instruction may refer to an instruction or a selection item displayed on an interface of the input device that can be selected by the medical staff, for example, selection item one: "display bone image"; selection item two: "display soft tissue image"; selection item three: "display composite image".
FIG. 3 is an exemplary block diagram of a dynamic perspective system shown in accordance with some embodiments of the present application. As shown in fig. 3, the dynamic perspective system 300 may include a photographing module 310 and a display module 320. In some embodiments, the dynamic perspective system 300 may be implemented by the dynamic perspective system 100 (e.g., the processing device 140) shown in fig. 1.
The photographing module 310 may be configured to photograph a subject during a photographing period, acquiring, during the photographing, first perspective data of a radiation source irradiating the subject at a first energy and second perspective data of the radiation source irradiating the subject at a second energy different from the first energy. In some embodiments, the photographing module 310 may be further configured to perform the photographing of the subject over a plurality of consecutive photographing periods.
The display module 320 may be configured to display a dynamic image of the subject according to the first perspective data and the second perspective data acquired in each of the plurality of consecutive photographing periods.
In other embodiments of the present application, a dynamic perspective apparatus is provided, comprising at least one processing device 140 and at least one storage device 150; the at least one storage device 150 is configured to store computer instructions, and the at least one processing device 140 is configured to execute at least some of the computer instructions to implement the dynamic perspective method 200 as described above.
In still other embodiments of the present application, a computer-readable storage medium is provided that stores computer instructions that, when read by a computer (e.g., the processing device 140), cause the processing device 140 to perform the dynamic perspective method 200 as described above.
It should be noted that the above description of the dynamic perspective system and its devices/modules is only for convenience of description and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of devices/modules, or connection of a subsystem with other devices/modules, may be implemented without departing from such teachings. For example, the photographing module 310 and the display module 320 disclosed in fig. 3 may be different modules in one apparatus (e.g., the processing device 140), or one module may implement the functions of two or more of the modules described above. For example, the photographing module 310 and the display module 320 may be two modules, or one module may have both the photographing function and the function of displaying a dynamic image. For another example, each module may have its own storage module, or the modules may share one storage module. For another example, the photographing module 310 may include a first photographing sub-module and a second photographing sub-module, wherein the first photographing sub-module may be configured to acquire the first perspective data of the radiation source irradiating the subject at the first energy during the photographing process, and the second photographing sub-module may be configured to acquire the second perspective data of the radiation source irradiating the subject at the second energy. Such modifications are intended to be included within the scope of the present application.
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) helping the medical staff quickly understand the dynamic changes of the focus and/or each tissue and organ of the subject over a plurality of shooting periods; (2) the required image category can be acquired using the dual-energy subtraction technique. It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this application are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (12)

1. A method of dynamic fluoroscopy for a C-arm device, comprising:
shooting a subject in a shooting period, acquiring first perspective data of a radiation source irradiating the subject at a first energy in the shooting process, and acquiring second perspective data of the radiation source irradiating the subject at a second energy different from the first energy;
performing the photographing on the subject in a plurality of consecutive photographing periods;
displaying a dynamic image of the subject based on the first perspective data and the second perspective data acquired in each of the plurality of continuous photographing periods.
2. The method of claim 1, wherein the available energy range of the first energy and the available energy range of the second energy are partially overlapping or spaced apart.
3. The method of claim 1, wherein the energy difference between the first energy and the second energy is 20 keV to 100 keV.
4. The method of claim 1, wherein the dynamic image of the subject comprises an image of soft tissue, an image of bone, and/or a composite image comprising at least soft tissue and bone.
5. The method of claim 4, wherein the composite image is integrated from the soft tissue image and the bone image according to a particular weight.
6. The method of claim 1, wherein the displaying the dynamic image of the subject from the first perspective data and the second perspective data acquired in each of the plurality of consecutive photographing periods comprises:
subtracting the first perspective data and the second perspective data acquired in each of the plurality of continuous shooting periods to obtain an image of each shooting period;
determining and displaying a dynamic image of the subject based on a plurality of images of the plurality of consecutive photographing periods.
7. The method of claim 1, further comprising:
acquiring an instruction input by a user, wherein the instruction reflects the dynamic image category selected by the user;
the displaying the dynamic image of the subject includes: displaying a dynamic image of the subject based on the instruction.
8. A dynamic perspective system of a C-arm device is characterized by comprising a shooting module and a display module;
the shooting module is used for shooting a subject in a shooting period, and acquiring first perspective data of a ray source irradiating the subject under first energy and second perspective data of the ray source irradiating the subject under second energy different from the first energy in the shooting process;
the shooting module is further used for shooting the subject in a plurality of continuous shooting periods;
the display module is configured to display a dynamic image of the subject according to the first perspective data and the second perspective data acquired in each of the plurality of consecutive shooting periods.
9. An apparatus for dynamic perspective of a C-arm device, the apparatus comprising at least one processor and at least one memory device for storing instructions which, when executed by the at least one processor, implement the method of any one of claims 1 to 7.
10. A C-shaped arm imaging system, characterized by comprising a radiation source, a detector, a memory, and a display, wherein the radiation source comprises an X-ray tube, a high voltage generator, and a high voltage control module; wherein:
the high voltage control module controls the high voltage generator to switch back and forth between a first energy and a second energy;
the radiation source emits rays to a subject in a shooting period under the drive of the high voltage generator, and the detector acquires first perspective data of the subject under irradiation by first-energy rays and second perspective data of the subject under irradiation by second-energy rays;
the memory stores the first perspective data and the second perspective data acquired in each of a plurality of consecutive photographing periods;
the memory is also stored with an image processing unit, and the image processing unit performs subtraction processing on the first perspective data and the second perspective data in each shooting period to obtain an image of the subject in each shooting period so as to obtain dynamic images in a plurality of shooting periods;
and the display is configured to display dynamic images of the subject in the plurality of shooting periods.
11. The system of claim 10, wherein the C-arm imaging system comprises a DSA device or a mobile C-arm device.
12. A computer readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202010957835.1A 2020-09-11 2020-09-11 Dynamic perspective method and system of C-shaped arm equipment Pending CN111973209A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202010957835.1A CN111973209A (en) 2020-09-11 2020-09-11 Dynamic perspective method and system of C-shaped arm equipment
PCT/CN2021/118006 WO2022053049A1 (en) 2020-09-11 2021-09-13 Dynamic perspective method, apparatus and system for c-shaped arm equipment
EP21866101.5A EP4201331A4 (en) 2020-09-11 2021-09-13 Dynamic perspective method, apparatus and system for c-shaped arm equipment
US18/182,286 US20230230243A1 (en) 2020-09-11 2023-03-10 Methods, devices, and systems for dynamic fluoroscopy of c-shaped arm devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010957835.1A CN111973209A (en) 2020-09-11 2020-09-11 Dynamic perspective method and system of C-shaped arm equipment

Publications (1)

Publication Number Publication Date
CN111973209A true CN111973209A (en) 2020-11-24

Family

ID=73449341

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010957835.1A Pending CN111973209A (en) 2020-09-11 2020-09-11 Dynamic perspective method and system of C-shaped arm equipment

Country Status (1)

Country Link
CN (1) CN111973209A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022053049A1 (en) * 2020-09-11 2022-03-17 上海联影医疗科技股份有限公司 Dynamic perspective method, apparatus and system for c-shaped arm equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1794898A (en) * 2004-12-22 2006-06-28 西门子公司 X-ray system having a first and a second X-ray array
CN101803930A (en) * 2009-02-17 2010-08-18 通用电气公司 Radiological imaging method and device
US20140243579A1 (en) * 2013-02-27 2014-08-28 Loyola University Chicago Dual-energy image suppression method
JP2016220843A (en) * 2015-05-28 2016-12-28 株式会社東芝 Dynamic body tracking device and dynamic body tracking method
CN107019522A (en) * 2015-12-04 2017-08-08 西门子保健有限责任公司 Method, X-ray apparatus and computer program that image is supported are provided operator



Similar Documents

Publication Publication Date Title
JP3697233B2 (en) Radiation image processing method and radiation image processing apparatus
US7283614B2 (en) X-ray diagnosis apparatus and method for creating image data
JP5269298B2 (en) X-ray diagnostic equipment
JP5491914B2 (en) Image display apparatus and X-ray diagnostic apparatus
JP2012050848A (en) Radiographic imaging control apparatus using multi radiation generating apparatus
CN103860185B (en) Determining of multipotency spirogram picture
EP2457512A1 (en) System and method for including and correcting subject orientation data in digital radiographic images
CN111528879A (en) Method and system for acquiring medical image
JP2007014755A (en) Medical image displaying device, medical image generating program and x-ray computerized tomographic device
JP7046538B2 (en) X-ray computer tomography equipment, contrast agent injector and management equipment
JP2012120758A (en) X-ray diagnostic apparatus
CN111973209A (en) Dynamic perspective method and system of C-shaped arm equipment
US20160073998A1 (en) X-ray diagnostic apparatus
JP6466057B2 (en) Medical diagnostic imaging equipment
JP2018020112A (en) X-ray CT apparatus
JP5238296B2 (en) X-ray apparatus and rotational imaging method
JP2009050383A (en) Image processing system, x-ray diagnostic apparatus, its image processing program, and image reconstruction apparatus
JP6956514B2 (en) X-ray CT device and medical information management device
JP5591555B2 (en) X-ray diagnostic apparatus, image processing apparatus, and X-ray diagnostic system
JP5331044B2 (en) Radiation imaging system
JP4976880B2 (en) X-ray apparatus and X-ray image creation method
WO2022053049A1 (en) Dynamic perspective method, apparatus and system for c-shaped arm equipment
JP5727653B2 (en) Radiation diagnostic apparatus and X-ray computed tomography apparatus
JP2007151607A (en) X-ray diagnostic apparatus
JP6855173B2 (en) X-ray CT device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination