CN113647967A - Control method, device and system of medical scanning equipment - Google Patents


Info

Publication number: CN113647967A
Application number: CN202111052132.5A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 宋舒杰, 杨昌玮
Original and current assignee: Shanghai United Imaging Healthcare Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Prior art keywords: scanning, scanned person, scanned, person, height
Related filings: PCT/CN2022/117823 (published as WO2023036243A1); EP22866704.4A (published as EP4329618A1)

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/055: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; measuring using microwaves or radio waves, involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/037: Emission tomography
    • A61B 6/0407: Supports, e.g. tables or beds, for the body or parts of the body
    • A61B 6/54: Control of apparatus or devices for radiation diagnosis
    • A61B 8/54: Control of the diagnostic device (diagnosis using ultrasonic, sonic or infrasonic waves)

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present specification relates to a method, an apparatus, and a system for controlling a medical scanning device. The method comprises: acquiring a plurality of first images of a scanned person through an image acquisition device; determining posture information of the scanned person based on the plurality of first images; automatically acquiring a scanning protocol corresponding to the scanned person based on the identity information of the scanned person; pre-starting the medical scanning device based on the scanning protocol; automatically adjusting the height of the examination bed based on the posture information of the scanned person; after the scanned person gets on the bed, acquiring a plurality of second images of the scanned person on the examination bed through the image acquisition device, and determining the actual positioning information of the scanned person based on the plurality of second images; and, when a comparison of the actual positioning information with the preset positioning information in the scanning protocol satisfies a preset condition, moving the scanned person to a scanning position for scanning.

Description

Control method, device and system of medical scanning equipment
Technical Field
The present application relates to the field of medical devices, and in particular, to a method, an apparatus, and a system for controlling a medical scanning device.
Background
Medical imaging with medical scanning devices is widely used for various treatments and/or diagnoses. Some medical scanning devices require lengthy preparation before the scanning (imaging) process. When a scanned person enters the scanning room, an operator (medical staff) must lower the examination bed to a suitable height so that the scanned person can easily get onto it, and must then position the bed according to the body part to be examined so that the scanned image contains the information of that part. After the scan is finished, the operator manually lowers the bed to a suitable height again so that the scanned person can easily get off. Thus, throughout the scanning workflow, an operator is needed to help complete all of the above work, which imposes a high workload on the operator; moreover, a large amount of time is spent adjusting the scanning bed, making the whole scanning process inefficient.
Therefore, it is necessary to provide a method to improve the working efficiency of the above scanning process.
Disclosure of Invention
In a first aspect of the present application, there is provided a control method of a medical scanning apparatus, comprising: acquiring a plurality of first images of a scanned person through an image acquisition device arranged in a scanning room; determining posture information of the scanned person based on the plurality of first images of the scanned person; automatically acquiring a scanning protocol corresponding to the scanned person based on the identity information of the scanned person, the scanning protocol including at least preset positioning information; pre-starting the medical scanning device based on the scanning protocol; automatically adjusting the height of the examination bed based on the posture information of the scanned person; after the scanned person gets on the bed, acquiring a plurality of second images of the scanned person on the examination bed through the image acquisition device; determining actual positioning information of the scanned person based on the plurality of second images; and, when a comparison of the actual positioning information with the preset positioning information in the scanning protocol satisfies a preset condition, moving the scanned person to a scanning position for scanning.
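The steps of the first aspect can be sketched as follows. Every name, the (x, y) couch coordinates, the centimetre units, the 0.42 height ratio, and the tolerance value below are illustrative assumptions for the sketch, not details claimed by the application:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ScanProtocol:
    """Minimal stand-in for a scanning protocol record that carries the
    preset positioning information mentioned in the first aspect."""
    preset_position_cm: Tuple[float, float]  # target (x, y) on the couch
    tolerance_cm: float                      # allowed deviation per axis

def couch_height_for(standing_height_cm: float) -> float:
    # Boarding height chosen in proportion to the detected standing height;
    # the 0.42 ratio is a made-up example value.
    return 0.42 * standing_height_cm

def positioning_ok(actual_cm, preset_cm, tolerance_cm):
    # Accept the positioning when every coordinate deviates from the preset
    # value by no more than the tolerance (one possible "preset condition").
    return all(abs(a - p) <= tolerance_cm for a, p in zip(actual_cm, preset_cm))

protocol = ScanProtocol(preset_position_cm=(0.0, 100.0), tolerance_cm=2.0)
height = couch_height_for(170.0)  # posture info taken from the first images
ready = positioning_ok((0.5, 101.0), protocol.preset_position_cm, protocol.tolerance_cm)
```

When `ready` holds, the couch would be moved to the scanning position; otherwise the positioning would be adjusted and re-checked against the second images.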
In some embodiments, the method further comprises: after the scan is finished, automatically returning the examination bed to the height at which the scanned person got on.
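The auto-return behaviour in this embodiment amounts to remembering the boarding height and restoring it after the scan; a minimal sketch, with a class name, API, and centimetre values invented for illustration:

```python
class CouchHeightController:
    """Remembers the height used when the scanned person boarded and
    restores it after the scan finishes."""

    def __init__(self, scan_height_cm: float):
        self.scan_height_cm = scan_height_cm  # height used during scanning
        self.boarding_height_cm = None
        self.height_cm = None

    def board(self, boarding_height_cm: float) -> None:
        # Lower the couch so the scanned person can get on, and remember it.
        self.boarding_height_cm = boarding_height_cm
        self.height_cm = boarding_height_cm

    def start_scan(self) -> None:
        self.height_cm = self.scan_height_cm

    def finish_scan(self) -> None:
        # Automatically return to the boarding height so the person can get off.
        self.height_cm = self.boarding_height_cm
```

For example, boarding at 55 cm and scanning at 90 cm, `finish_scan()` brings the couch back to 55 cm without operator intervention.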
In a second aspect of the present application, there is provided a control system of a medical scanning apparatus, comprising: a first image acquisition module, configured to acquire a plurality of first images of a scanned person entering a scanning room through an image acquisition device arranged in the scanning room; a posture information acquisition module, configured to determine the posture information of the scanned person based on the plurality of first images of the scanned person; a scanning protocol acquisition module, configured to automatically acquire a scanning protocol corresponding to the scanned person based on the identity information of the scanned person, the scanning protocol including at least preset positioning information; a device pre-start module, configured to pre-start the medical scanning device based on the scanning protocol; a height adjustment module, configured to determine the height of the examination bed based on the posture information of the scanned person; a second image acquisition module, configured to acquire a plurality of second images of the scanned person on the examination bed through the image acquisition device after the scanned person gets on the bed; an actual positioning information determination module, configured to determine the actual positioning information of the scanned person based on the plurality of second images; and a scanning execution module, configured to move the scanned person to a scanning position for scanning when a comparison of the actual positioning information with the preset positioning information in the scanning protocol satisfies a preset condition.
In a third aspect of the application, a control apparatus of a medical scanning device is provided, the apparatus comprising at least one processor and at least one memory; the at least one memory is configured to store computer instructions; the at least one processor is configured to execute at least some of the computer instructions to implement the automatic control method of a medical scanning apparatus as described above.
In a fourth aspect of the present application, a computer-readable storage medium for medical scanning device control is provided, the storage medium storing computer instructions which, when executed by a processor, implement the automatic control method of a medical scanning device as described above.
In a fifth aspect of the application, an automatically controlled medical scanning device is provided, comprising: a scanner, configured to perform a medical scan to acquire scan data of a scan area of the scanned person; an image acquisition device, configured to acquire a plurality of first images and a plurality of second images of the scanned person; an examination bed control mechanism, configured to move the examination bed between the boarding position and the scanning position and to automatically adjust the height of the examination bed; and a processor, configured to implement the automatic control method of a medical scanning apparatus as described above.
In the method, an image acquisition device captures the posture information of the scanned person, and the scanning bed (i.e., the examination bed) is raised or lowered automatically; the same image acquisition device captures the actual positioning information of the scanned person, and the patient's positioning is adjusted by comparing it with the preset positioning information in the scanning protocol; in addition, the identity information of the scanned person is acquired and used to retrieve the corresponding scanning protocol and automatically pre-start the medical scanning equipment. Together these measures reduce the time of the whole scanning process and enable it to run quickly.
Drawings
The present description will be further described by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a diagram of an exemplary application scenario of a scanning system shown in accordance with some embodiments of the present description;
FIG. 2 is a schematic diagram of a spatial arrangement of a medical scanning apparatus according to some embodiments of the present description;
FIG. 3 is a schematic diagram of exemplary hardware and/or software components of a computing device that may implement a processing device, according to some embodiments of the present description;
FIG. 4 is a block diagram of a system for automatic control of a medical scanning device, according to some embodiments of the present disclosure;
FIG. 5 is an exemplary flowchart of a method for automatic control of a medical scanning device, according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only examples or embodiments of the application; based on them, a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules in a system according to embodiments of the present application, any number of different modules may be used and run on a client and/or server. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
Flowcharts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
The automatic control method of a medical scanning device referred to in the present application is applicable in medical or industrial settings, e.g., for disease treatment (such as radiotherapy), disease diagnosis (such as medical imaging), and the like. In some embodiments, the technical solutions disclosed in the present application can be applied to various medical devices, such as a magnetic resonance (MR) device, a computed tomography (CT) device, a positron emission tomography (PET) device, an X-ray device, an ultrasound device, a radiation therapy (RT) device, and multi-modality devices combining the above, such as a PET-MR device, a PET-CT device, and the like. The following description takes a computed tomography imaging apparatus (CT scanner) as an illustrative example and is not intended to limit the scope of the present application.
Fig. 1 is a diagram of an exemplary application scenario of a scanning system in accordance with some embodiments of the present description.
As shown in fig. 1, the scanning system 100 may include a medical scanning device 110 (illustratively, a CT scanner in fig. 1), a network 120, a terminal 130, a processing device 140, a storage device 150, and an image acquisition apparatus 160. The components of the scanning system 100 may be connected in various ways. By way of example only, as shown in fig. 1, the medical scanning device 110 may be connected to the processing device 140 through the network 120. As another example, the medical scanning device 110 may be directly connected to the processing device 140 (as indicated by the double-headed arrow in the dashed line connecting the medical scanning device 110 and the processing device 140). As another example, storage device 150 may be connected to processing device 140 directly or through network 120. Also for example, image capture device 160 may be connected to processing device 140 via network 120. As yet another example, a terminal device (e.g., cell phone 130-1, tablet 130-2, computer 130-3, etc.) may be connected directly to processing device 140 (as indicated by the double-headed arrow in the dashed line connecting terminal 130 and processing device 140) or through network 120.
The medical scanning device 110 may scan a subject and/or generate data about the subject. In this application, the scanned object may also be referred to as a target object, a target, or a detected object, and these terms may be used interchangeably. In some embodiments, the scanned person may be a living being, such as a patient or an animal. When the person needs to be scanned, he or she may be placed on the couch 116, which moves along the longitudinal direction of the medical scanning apparatus 110 into the scanning region 115. After the scanned person enters the scanning region 115, the scanning device 110 performs an X-ray tube exposure based on a preset scanning protocol, emitting radiation (e.g., an X-ray beam) that irradiates the target object to obtain a corresponding medical image.
Network 120 may include any suitable network that can facilitate the exchange of information and/or data for the scanning system. In some embodiments, one or more components of the scanning system (e.g., the medical scanning device 110, the terminal 130, the processing device 140, the storage device 150, or the image acquisition apparatus 160) may exchange information and/or data with one or more other components of the scanning system via the network 120. For example, the processing device 140 may acquire medical images of the scanned person from the medical scanning device 110 via the network 120. In some embodiments, the network 120 may be a wired network or a wireless network, or the like, or any combination thereof. The network 120 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., an Ethernet network), a wireless network (e.g., a Wi-Fi network, a Li-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. By way of example only, network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near-field communication (NFC) network, an ultra-wideband (UWB) network, a mobile communication (1G, 2G, 3G, 4G, 5G) network, a narrowband Internet of things (NB-IoT) network, infrared communication, or the like, or any combination thereof. In some embodiments, network 120 may include one or more network access points. For example, network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the scanning system may connect to network 120 to exchange data and/or information.
The terminal 130 includes a mobile device 131, a tablet computer 132, a notebook computer 133, etc., or any combination thereof. In some embodiments, the terminal 130 may interact with other components in the scanning system over a network. For example, the terminal 130 may send one or more control instructions to the medical scanning device 110 to control the couch 116 to carry the scanned subject into the scanning region 115. Also for example, the terminal 130 may receive data transmitted by the scanning device 110. In some embodiments, the terminal 130 may receive information and/or instructions entered by a user (e.g., a user of the scanning device 110, such as a physician) and transmit the received information and/or instructions to the medical scanning device 110 or the processing device 140 via the network 120. In some embodiments, the terminal 130 may be part of the processing device 140. The terminal 130 and the processing device 140 may be integrated as a control means, e.g. an operation table, of the medical scanning device 110. In some embodiments, terminal 130 may be omitted.
The processing device 140 may process data and/or information obtained from the medical scanning device 110, the terminal 130, the storage device 150, and/or the image acquisition apparatus 160. For example, the processing device 140 may acquire medical image information of the scanned person. As another example, the processing device 140 may correct the relevant parameters of the medical scanning device 110 based on the data acquired above. For another example, the processing device 140 may determine the posture information of the scanned person based on the plurality of first images of the scanned person acquired by the image acquisition device 160, and further determine the height of the examination table. For example, the processing device 140 may determine the actual positioning information of the scanned person based on the plurality of second images of the scanned person on the examination table acquired by the image acquisition device 160.
In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data stored in or obtained by the medical scanning device 110, the terminal 130, and/or the storage device 150 via the network 120. As another example, the processing device 140 may be directly connected to the medical scanning device 110 (as indicated by the double-headed arrow in the dashed line connecting the processing device 140 and the medical scanning device 110 in FIG. 1), the terminal 130 (as indicated by the double-headed arrow in the dashed line connecting the processing device 140 and the terminal 130 in FIG. 1), the storage device 150, and/or the image acquisition device 160 to access stored or acquired information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tiered cloud, and the like, or any combination thereof. In some embodiments, the processing device 140 may be implemented on a computing device 200 having one or more of the components shown in FIG. 3 of the present application.
Storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data acquired from the medical scanning device 110, the terminal 130, and/or the processing device 140. For example, the storage device 150 may store motion information of a target object previously specified by a user (e.g., a doctor or an imaging technician). In some embodiments, storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform the exemplary methods described herein. For example, the storage device 150 may store instructions for the processing device 140 to perform the methods illustrated in the flowcharts. In some embodiments, storage device 150 may include a mass storage device, a removable storage device, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid-state drives, and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tiered cloud, and the like, or any combination thereof.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the scanning system (e.g., the medical scanning device 110, the processing device 140, the terminal 130, the image acquisition apparatus 160, etc.). One or more components of the scanning system may access data or instructions stored in storage device 150 via network 120. In some embodiments, the storage device 150 may be directly connected to or in communication with one or more components of the scanning system (e.g., the medical scanning apparatus 110, the processing apparatus 140, the terminal 130, the image acquisition apparatus 160, etc.). In some embodiments, the storage device 150 may be part of the processing device 140.
The image acquisition device 160 is used to acquire a plurality of first images of a scanned person entering the scanning room. In some embodiments, the image acquisition device 160 may also be used to acquire a plurality of second images of the scanned person on the examination table 116. In some embodiments, the image acquisition device 160 may be a 3D camera, such as a ToF (Time of Flight) camera, a structured-light camera, a binocular camera, or a LiDAR camera; a 3D camera can reconstruct a three-dimensional image of the scanned person by computing information such as the position and depth of an object.
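The position-plus-depth reconstruction a 3D camera performs reduces, per pixel, to standard pinhole back-projection. A minimal sketch; the intrinsic parameters (fx, fy, cx, cy) below are made-up example values, not those of any camera described in the application:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection: a pixel (u, v) with a measured depth maps
    to a 3D point in camera coordinates using the focal lengths (fx, fy)
    and the principal point (cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point lies on the optical axis at its depth:
pt = backproject(320.0, 240.0, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
# pt == (0.0, 0.0, 2.0)
```

Applying this to every depth pixel yields the point cloud from which posture information (e.g., standing height) can be derived.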
Fig. 2 is a schematic diagram of a spatial arrangement of a medical scanning apparatus according to some embodiments of the present description.
Referring to FIG. 2, reference numeral 101 denotes a scanning room (the ceiling and one side wall are omitted to show the room's internal structure), in which a scanning device 110 (including a bed 116 and a scanning area 115) and a processing device 140 are disposed. For descriptions of the scanning device 110 and the processing device 140, refer to the related description of FIG. 1; they are not repeated here.
In some embodiments, image capture device 160 may be placed on a wall (as shown at 160-1 in FIG. 2). In some embodiments, image capture device 160 may also be disposed on the ceiling of the scanning room. The image capturing device 160 is disposed on the ceiling to obtain a wider viewing angle, thereby preventing the viewing angle from being blocked by other mechanisms or objects.
In some embodiments, the image capture device may comprise any type of camera including, but not limited to, still cameras, video cameras, high speed video cameras, 3D depth video cameras, infrared video cameras, and the like. For example, the image capturing device may be provided with a camera, which may be used to capture video/pictures of the person being scanned as they enter the scanning room. Preferably, the image capturing device 160 may include a CMOS camera, a CCD camera, or the like. Illustratively, the image capturing device 160 may be a 720P, 1080P, 2K, 4K high-definition camera module.
In some embodiments, an infrared camera device may be provided on the image capturing device. At this time, the obtained image of the scanned person is the corresponding infrared image. The infrared camera device can identify the body state of the scanned person through the body surface temperature of the scanned person. Compared with an optical image pickup device, the infrared image pickup device is less affected by factors such as shooting background, shooting light and the like, and therefore has higher precision.
In some embodiments, the image capture device 160 may contain only one camera. In this case, the camera may be disposed directly above the examination table 116. In some embodiments, the image capture device 160 may include at least two cameras (e.g., 2, 3, 4, 5, or 6). Providing a plurality of cameras effectively enlarges the field of view of the image capture device 160 and avoids inaccurate image acquisition caused by a single camera's blind spots. In some embodiments, two or more cameras may be used to acquire three-dimensional images of the scanned person. For example, two or more cameras may transmit their acquired images to a processing device (e.g., the processing device 140), and the processor may convert the two-dimensional planar images into a three-dimensional image of the scanned person using image stitching, image coordinate transformation, and the like.
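The coordinate-transformation step described above can be sketched as fusing per-camera point clouds into one common frame using each camera's known pose. The pose values and data layout below are illustrative assumptions, not part of the described system:

```python
def rigid_transform(point, rotation, translation):
    # Apply R @ p + t, with rotation as a 3x3 nested list and translation as
    # a 3-tuple, mapping a camera-frame point into the common room frame.
    x, y, z = point
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z + translation[i]
        for i in range(3)
    )

def merge_views(clouds_with_poses):
    """Fuse per-camera point clouds into one common coordinate frame using
    each camera's known pose (rotation, translation), a minimal stand-in
    for the stitching of several camera views."""
    merged = []
    for cloud, (rotation, translation) in clouds_with_poses:
        merged.extend(rigid_transform(p, rotation, translation) for p in cloud)
    return merged
```

With calibrated poses, points seen by different cameras land in one consistent 3D model of the scanned person, from which blind spots of any single view are filled by the others.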
It should be noted that, on the basis of the present application, those skilled in the art can make various reasonable changes to the technical solution of the present application. For example, the number of cameras in the image capturing device 160, the arrangement of the cameras, and/or the positions of the cameras may be specifically set according to actual needs. For example, the arrangement of the cameras may include, but is not limited to, an array arrangement, a ring arrangement, a topology arrangement, and the like. Such variations are intended to be within the scope of the present application.
Fig. 3 is a schematic diagram of exemplary hardware and/or software components of a computing device that may implement a processing device according to some embodiments of the present description.
As shown in FIG. 3, computing device 200 may include a processor 210, a memory 220, input/output (I/O) 230, and communication ports 240. In some embodiments, the computing device 200 may be disposed in the processing device 140.
The processor 210 may execute computer instructions (program code) and perform the functions of the processing device 140 in accordance with the techniques described herein. Computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions that perform the particular functions described herein. For example, the processor 210 may retrieve motion data or scan data from the storage device 150 and/or the terminal 130. In some embodiments, processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit (MCU), a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device, any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
For illustrative purposes only, only one processor is depicted in the computing device 200. However, it should be noted that the computing device 200 in the present application may also comprise multiple processors, and thus operations of the methods described herein that are performed by one processor may also be performed jointly or separately by multiple processors. For example, if operations A and B are both performed by the processor of the computing device 200, it should be understood that operations A and B may also be performed jointly or separately by two different processors in the computing device 200 (e.g., a first processor performing operation A and a second processor performing operation B, or the first and second processors jointly performing operations A and B).
For example only, the processor 210 may receive instructions that follow a scanning protocol for imaging/scanning an object. For example, the processor 210 may instruct the couch 116 of the scanning device 110 to move the scanned person into the scanning region 115. As another example, the processor 210 may provide control signals to control the rotational speed and positioning of the gantry within the scanning region 115. As yet another example, the processor 210 may provide control signals to switch the bulb (X-ray tube) in the scanning region 115 on and off. In some embodiments, the processor 210 may acquire scan data of a target object in a scanning region of an imaging device.
In some embodiments, the processor 210 may determine posture information of the scanned person based on the plurality of first images of the scanned person. In some embodiments, the processor 210 may also determine the height of the examination table based on the posture information of the scanned person. In some embodiments, the processor 210 may further determine the scanning posture of the scanned person and the positional relationship between the scanned person and the examination table based on the plurality of second images, and determine the actual positioning information of the scanned person accordingly. Further, the processor 210 may compare the actual positioning information with the preset positioning information in the scanning protocol. A corresponding description of how the processor 210 determines the height of the examination table and the positioning information of the scanned person may be found in the related description of fig. 5 and is not repeated here.
The memory 220 may store data/information obtained from the medical scanning device 110, the terminal 130, the storage device 150, or any other component of the scanning system 100. In some embodiments, the memory 220 may include a mass storage device, a removable storage device, a volatile read-write memory, a read-only memory (ROM), or the like, or any combination thereof. For example, the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, and the like. The removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, and the like. The volatile read-write memory may include a random access memory (RAM). Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitance RAM (Z-RAM), and the like. Exemplary ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disc ROM (CD-ROM), a digital versatile disc ROM, and the like. In some embodiments, the memory 220 may store one or more programs and/or instructions to perform the exemplary methods described herein. For example, the memory 220 may store a program for the processing device 140 for determining the posture information of the subject based on the acquired image data (e.g., the first images). As another example, the memory 220 may store a program for controlling the couch 116 to move in a stepwise or continuous motion according to a predetermined trajectory and at a predetermined speed.
The I/O (input/output) interface 230 may input or output signals, data, or information. In some embodiments, the I/O 230 may enable a user to interact with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a trackball, etc., or a combination thereof. Exemplary output devices may include a display device, a speaker, a printer, a projector, etc., or any combination thereof. Exemplary display devices may include a liquid crystal display (LCD), a light-emitting diode (LED) based display, a flat panel display, a curved display, a television device, a cathode ray tube (CRT), and the like, or any combination thereof.
For example only, a user (e.g., an operator) may input data related to an object (e.g., a patient) being imaged/scanned via the I/O 230. The data relating to the object may include identity information (e.g., name, age, gender, medical history, contact information, physical examination results, etc.) and/or information on the scanning protocol to be performed. The user (e.g., the operator) may also input parameters required for the operation of the medical scanning device 110, such as image contrast and/or ratio, a region of interest (ROI), slice thickness, image type, etc., or any combination thereof. The I/O 230 may also display a scanned image generated based on the sampled data.
The communication port 240 may be connected to a network (e.g., network 120) to facilitate data communication. The communication port 240 may establish a connection between the processing device 140 and the scanning device 110, the terminal 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination of both, which enables data transmission and reception. The wired connection may include an electrical cable, an optical cable, a telephone line, etc., or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, 5G, etc.), the like, or combinations thereof. In some embodiments, the communication port 240 may be a standardized communication port, such as RS232, RS485, and the like. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with digital imaging and communications in medicine (DICOM) protocol.
Fig. 4 is a block diagram of a system for automatic control of a medical scanning apparatus according to some embodiments of the present disclosure.
In some embodiments, the system 400 is disposed on the processing device 140, and more specifically on the processor 210. The system 400 comprises:
a first image acquiring module 405, configured to acquire, through an image acquisition device disposed in the scanning room, a plurality of first images of a scanned person entering the scanning room;
a posture information acquiring module 410, configured to determine the posture information of the scanned person based on the plurality of first images of the scanned person;
an identity information confirming module 420, configured to confirm the identity information of the scanned person;
a scanning protocol obtaining module 425, configured to automatically obtain a scanning protocol corresponding to the scanned person based on the identity information of the scanned person, where the scanning protocol at least includes preset positioning information;
a device pre-start module 430 for pre-starting the medical scanning device based on the scanning protocol;
a height adjusting module 440 for automatically adjusting the height of the examining table based on the posture information of the scanned person;
a second image acquiring module 450, configured to acquire, through the image acquiring device, a plurality of second images of the scanned person on the examination table;
an actual positioning information determining module 460, configured to determine actual positioning information of the scanned person based on the plurality of second images;
the matching degree obtaining module 470 is configured to compare the actual positioning information with preset positioning information in the scanning protocol to obtain a matching degree;
and a scanning execution module 480, configured to send the scanned person to the scanning position for scanning when the comparison between the actual positioning information and the preset positioning information in the scanning protocol satisfies a preset condition.
In some embodiments, when the comparison between the actual positioning information and the preset positioning information in the scanning protocol does not satisfy the preset condition, the actual positioning of the scanned person on the examination bed, that is, the scanning posture of the scanned person and/or the positional relationship between the scanned person and the examination bed, is adjusted so that the preset condition is satisfied.
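The patent does not define how the matching degree between the actual and preset positioning information is computed. The following Python sketch is one hypothetical realization, treating the matching degree as the fraction of positioning parameters that fall within a per-parameter tolerance of the preset values; the parameter names and the default 10 mm tolerance are illustrative, not taken from the source.

```python
def positioning_match(actual, preset, tol=None):
    """Fraction of positioning parameters within tolerance of the preset.

    actual/preset: dicts of parameter name -> value (e.g., coordinates in mm).
    tol: optional dict of per-parameter tolerances; defaults to 10.0 per key.
    """
    tol = tol or {}
    hits = sum(
        1 for k in preset
        # a parameter missing from `actual` counts as a mismatch
        if abs(actual.get(k, float("inf")) - preset[k]) <= tol.get(k, 10.0)
    )
    return hits / len(preset)


def meets_preset_condition(actual, preset, min_match=0.9):
    """Preset condition modeled as a minimum matching degree."""
    return positioning_match(actual, preset) >= min_match
```

With this sketch, a patient whose head landmark is 3 mm off matches, while one 50 mm off does not, so the scanning execution module would request repositioning in the latter case.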
In some embodiments, the identity information validation module 420 is further configured to: acquiring a face image of a scanned person based on a plurality of first images of the scanned person; identity information of the scanned person is determined based on the face image.
In some embodiments, when the scanned person has a marker disposed thereon, the identity information confirmation module 420 is further configured to: the identity information of the scanned person is confirmed through the marker arranged on the scanned person.
In some embodiments, the device pre-start module 430 is further configured to: adjust the gantry rotational speed of the medical scanning device to the rotational speed corresponding to the scanning protocol, and/or preheat the anode of the bulb of the medical scanning device to a working state, and/or adjust the components of the medical scanning device to the positioning state corresponding to the scanning protocol.
In some embodiments, the height adjustment module 440 is configured to: when the scanned person is identified as walking into the scanning room, calculate the leg height of the scanned person and lower the examination bed at the boarding position to below the leg height of the scanned person; when the scanned person is identified as entering the scanning room in a wheelchair, acquire the height of the wheelchair and lower the examination bed at the boarding position to below the height of the wheelchair; or, when the scanned person is identified as entering the scanning room on a stretcher, acquire the height of the stretcher and lower the examination bed at the boarding position to below the height of the stretcher.
In some embodiments, the height adjustment module 440 is configured to: after the scanning is finished, automatically return the examination table to the height at which the scanned person got onto the bed.
In some embodiments, the system 400 further comprises a trajectory simulation module 490, the trajectory simulation module 490 being configured to: before the scanned person is sent to the scanning position, simulate the motion trajectory of the examination table from the boarding position to the scanning position based on the positional relationship between the scanned person and the examination table, and determine whether there is a risk of interference (collision) between the scanned person and the scanning device.
In some embodiments, when the comparison between the actual positioning information and the preset positioning information in the scanning protocol satisfies the preset condition, the scanning execution module 480 is configured to: determine the required scanning dose based on the actual positioning information of the scanned person; and, when the scanning dose is confirmed to meet the requirement, send the scanned person to the scanning position for scanning.
In some embodiments, when the comparison between the actual positioning information and the preset positioning information in the scanning protocol satisfies the preset condition, the scanning execution module 480 is configured to: send the scanned person to the scanning position; and scan the scanned person based on an exposure instruction.
It should be appreciated that the system and its modules in one or more implementations of the present description may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of the two. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, a CD-ROM or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also, for example, by software executed by various types of processors, or by a combination of such hardware circuits and software (e.g., firmware).
It should be noted that the above description of the processing device and its modules is merely for convenience of description and is not intended to limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings.
Fig. 5 is an exemplary flow chart of a method of automatic control of a medical scanning apparatus, shown in accordance with some embodiments of the present description.
In some embodiments, the flow 500 may be performed by the processing device 140. In some embodiments, the process 500 may be performed by the processor 210 in the computing device 200.
In step 510, a plurality of first images of a scanned person are acquired by an image acquisition device. In some embodiments, step 510 may be performed by the first image acquisition module 405.
In some embodiments, the image capture device (such as image capture device 160 shown in fig. 1 and 2) may be a device with data acquisition, storage, and/or transmission capabilities. The first image acquisition module 405 may be in communication with an image acquisition device to acquire a plurality of first images of a scanned person.
Taking the implementation scenario shown in fig. 2 as an example, after the scanned person enters the scanning room 101, the image capturing device (e.g., 160-1 in fig. 2) may capture a plurality of first images of the scanned person entering the scanning room. In some implementations, the image capturing device captures a continuous video of the scanned person entering the scanning room. In this case, the first image acquiring module 405 may segment the continuous video into a plurality of still images (i.e., obtain the plurality of first images). In some embodiments, the multiple frames of still images may be acquired at a constant time interval. For example, the time interval between any two adjacent still images may be set to 1/24 s (that is, 24 still images are obtained at equal time intervals within 1 second). In some embodiments, the time interval may also be set according to actual needs, for example, to 1/25 s, 1/48 s, 1/60 s, or other values, corresponding to 25, 48, and 60 frames obtained at equal time intervals within 1 second, respectively. Sampling the video at a user-defined interval also makes it possible to discard frames of poor image quality (e.g., overexposed frames or frames with ghosting), so that the retained still images maintain high quality and the posture information extracted from them in subsequent steps is more accurate.
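The fixed-interval sampling described above can be sketched in Python as follows; the function name and parameters are illustrative (the patent does not name an API), and in practice the indices would be used to select frames from the captured video stream.

```python
def sample_frame_indices(duration_s, fps, interval_s):
    """Return indices of frames sampled at a fixed time interval.

    duration_s: length of the captured video in seconds
    fps: frames per second of the source video
    interval_s: desired spacing between sampled still images
    """
    step = max(1, round(fps * interval_s))  # frames between samples
    total = int(duration_s * fps)           # total frames in the clip
    return list(range(0, total, step))
```

For example, at 24 fps with a 1/24 s interval every frame is kept, while a 1/2 s interval keeps every twelfth frame; frames flagged as overexposed or ghosted could then be dropped from the selected set.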
Step 520, determining the posture information of the scanned person based on the plurality of first images of the scanned person. In some embodiments, step 520 may be performed by the posture information acquisition module 410.
In some embodiments, the posture information acquiring module 410 may obtain the contour of the scanned person by applying grayscale transformation, color-distortion correction, and the like to the plurality of first images obtained in step 510, and then determine the posture information of the scanned person by contour comparison or similar techniques.
In some embodiments, the posture information acquiring module 410 may further determine feature points of the scanned person based on the plurality of first images and determine the posture information from them. The posture information may include: sitting, standing, supine, and lying on the side. Extracting the position coordinates of the feature points from the plurality of first images may be implemented using an algorithm such as OpenPose, Hourglass Network, or Mask R-CNN. Taking the leg joints of the scanned person as an example, the position coordinates of each leg joint (e.g., the knee joint, the ankle joint, and the hip joint) in each still image can be obtained based on such an algorithm. As another example, feature point information of the torso of the scanned person may be acquired in the same way.
Further, the posture information acquiring module 410 may determine whether the scanned person is in a lying posture based on the feature point information of the torso (for example, when the feature points on the torso are distributed horizontally or nearly horizontally, the scanned person is lying on a bed).
Further, when determining that the torso of the scanned person is upright, the posture information acquiring module 410 may determine whether the scanned person is in a sitting posture (e.g., sitting in a wheelchair) or a standing posture based on the leg joints. Specifically, the module may connect the knee joint with the ankle joint and the knee joint with the hip joint based on the joint coordinates, and determine the posture from the included angle between the two line segments (the included angle is taken in the range of 0° to 90°); hereinafter, the line between the knee joint and the ankle joint is referred to as segment A, and the line between the knee joint and the hip joint as segment B. For example, when the included angle between segment A and segment B is greater than a predetermined threshold (e.g., 45°), the scanned person may be considered to be in a sitting posture. Conversely, when the angle between segment A and segment B is small (close to 0°), the scanned person may be considered to be in a standing posture.
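The segment-angle test above can be expressed as a short Python sketch; the joint coordinates, the 45° threshold, and the function names are illustrative assumptions, with the angle folded into 0° to 90° as the text describes.

```python
import math


def leg_angle_deg(hip, knee, ankle):
    """Folded angle (0-90 deg) between knee->ankle and knee->hip segments."""
    a = (ankle[0] - knee[0], ankle[1] - knee[1])  # segment A: knee to ankle
    b = (hip[0] - knee[0], hip[1] - knee[1])      # segment B: knee to hip
    dot = a[0] * b[0] + a[1] * b[1]
    cos_ang = dot / (math.hypot(*a) * math.hypot(*b))
    ang = math.degrees(math.acos(max(-1.0, min(1.0, cos_ang))))
    return min(ang, 180.0 - ang)  # fold into the 0-90 deg range


def classify_leg_posture(hip, knee, ankle, threshold_deg=45.0):
    """'sitting' if the folded angle exceeds the threshold, else 'standing'."""
    if leg_angle_deg(hip, knee, ankle) > threshold_deg:
        return "sitting"
    return "standing"
```

With a straight leg (hip, knee, and ankle collinear) the folded angle is near 0° and the posture is classified as standing; with the thigh horizontal and the shank vertical the angle is near 90°, classified as sitting.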
Because the acquired still images are affected by factors such as the light source, the shooting background, and the lighting, the extracted feature-point coordinates often contain a certain amount of noise. The noise includes, but is not limited to, one or more of Gaussian noise, non-Gaussian noise, systematic noise, and the like. In some embodiments, the position coordinates of the feature points may be denoised to obtain denoised coordinates for each feature point. Specifically, Kalman filtering may be used to filter the noise. Kalman filtering is an algorithm that uses the system state equation together with the observed input and output data to produce an optimal estimate of the system state. In some embodiments, the Kalman filtering approach includes, but is not limited to, one or more of a linear Kalman filter, an extended Kalman filter, a progressive extended Kalman filter, an unscented Kalman filter, and the like.
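As an illustration of the linear case, the following minimal one-dimensional Kalman filter smooths a single coordinate series under a static-position model; the noise variances `q` and `r` are illustrative defaults, and a real implementation would track 2D/3D coordinates with a motion model.

```python
def kalman_smooth(measurements, q=1e-3, r=0.5):
    """One-dimensional linear Kalman filter over a coordinate series.

    q: process noise variance, r: measurement noise variance.
    Returns the filtered estimate for each measurement.
    """
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    out = []
    for z in measurements:
        p = p + q                 # predict (static-position model)
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with measurement z
        p = (1.0 - k) * p
        out.append(x)
    return out
```

A constant input passes through unchanged, while an isolated jump in a joint coordinate is pulled back toward the running estimate, which is the desired smoothing behavior for noisy feature points.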
In step 530, the scanning protocol corresponding to the scanned person is determined. In some embodiments, step 530 may be performed by the scan protocol acquisition module 425.
In some embodiments, the scanning protocol acquiring module 425 may automatically acquire the scanning protocol corresponding to the scanned person based on the identity information of the scanned person. The scan protocol may include information related to scan parameters associated with the scan and/or image reconstruction parameters associated with the scan. For the sake of brevity, a CT scan is exemplified in the following description. The scan parameters may include bulb voltage (kV), bulb current (mA), total exposure time, scan type (e.g., helical scan, axial scan), scan field of view (FOV), pitch, gantry tilt angle, scan gantry rotation time, slice thickness, and the like. The image reconstruction parameters may include a reconstruction FOV, a reconstruction slice thickness, a reconstruction interval for image reconstruction, a Window Width (WW), a Window Level (WL), a reconstruction matrix, a reconstruction algorithm (e.g., a filtered backprojection algorithm, a fan-beam reconstruction algorithm, an iterative reconstruction algorithm, etc.). In some embodiments, the scan protocol may be provided by an operator (e.g., a doctor or technician).
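The scan parameters listed above can be grouped into a simple container; the following dataclass is an illustrative sketch (the field names, defaults, and units are assumptions, not values taken from the patent).

```python
from dataclasses import dataclass, field


@dataclass
class ScanProtocol:
    """Illustrative container for the CT scan parameters listed above."""
    tube_voltage_kv: float = 120.0    # bulb (X-ray tube) voltage, kV
    tube_current_ma: float = 200.0    # bulb current, mA
    scan_type: str = "helical"        # e.g., "helical" or "axial"
    fov_mm: float = 500.0             # scan field of view
    pitch: float = 1.0
    gantry_tilt_deg: float = 0.0
    rotation_time_s: float = 0.5      # scan gantry rotation time
    slice_thickness_mm: float = 5.0
    preset_positioning: dict = field(default_factory=dict)
```

A protocol retrieved for a given patient would then simply be an instance of such a structure, with `preset_positioning` holding the preset positioning information compared against in later steps.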
In some embodiments, the identity information confirming module 420 may acquire a face image of the scanned person based on a plurality of first images of the scanned person, and then determine the identity information of the scanned person based on the face image.
In some embodiments, the scanned person is provided with a marker. The marker may be, for example, a two-dimensional code, a wrist information code, an RFID tag, an IC card, or the like. For example, the scanned person may present the marker (e.g., the wrist information code) when entering the scanning room, and the identity information confirmation module 420 may obtain the identity information of the scanned person based on the marker.
In some alternative embodiments, the marker may also be a biometric feature, including but not limited to one of a fingerprint, a palm print, an interpupillary distance, a voiceprint, facial features, finger bones, or the skull, and any combination thereof. These biometric features are registered in a database in advance; after the biometric feature of the scanned person (e.g., the pupils of the scanned person) is acquired, the identity information confirmation module 420 matches it against the biometric features stored in the database, thereby obtaining the identity information of the scanned person.
Step 540, pre-starting the medical scanning device based on the scanning protocol. In some embodiments, step 540 may be performed by device pre-boot module 430.
In some embodiments, the device pre-start module 430 may pre-start the medical scanning device based on the corresponding scanning protocol. In some embodiments, the device pre-start module 430 may adjust the gantry rotational speed of the medical scanning device (e.g., the medical scanning device 110) to the rotational speed corresponding to the scanning protocol. In some embodiments, the device pre-start module 430 may preheat the bulb anode of the medical scanning device to a working state, for example, to the bulb voltage and bulb current values corresponding to the scanning protocol. In some embodiments, the device pre-start module 430 may adjust the components of the scanning device to the positioning state corresponding to the scanning protocol (e.g., rotating to the corresponding gantry tilt angle, pitch, scanning field of view, etc.).
In step 550, the height of the examination table is automatically adjusted based on the posture information of the scanned person. In some embodiments, step 550 may be performed by height adjustment module 440.
The height adjustment module 440 may determine the height of the examination couch at the boarding position based on the posture information of the scanned person determined in step 520. It will be appreciated that a suitable height facilitates the movement of the scanned person onto the examination table.
In some embodiments, when the scanned person is identified as walking into the scanning room, the height adjustment module 440 may calculate the leg height of the scanned person based on the posture information obtained in step 520 and lower the examination table at the boarding position to below the leg height, for example, 100 mm below it. For example, when the posture information is extracted from the feature points, the height adjustment module 440 may obtain the hip-joint height of the scanned person and adjust the height of the examination table to below the hip-joint height, making it easier for the scanned person to get onto the bed.
In some embodiments, when the scanned person is identified as entering the scanning room in a wheelchair, the height of the wheelchair is acquired and the examination bed at the boarding position is lowered to below the height of the wheelchair. In some embodiments, the height of the wheelchair may be a set standard height (e.g., all wheelchairs in the hospital have the same height); when the posture information acquired in step 520 indicates a wheelchair, the height adjustment module 440 may directly lower the examination bed at the boarding position to below the standard wheelchair height. In some embodiments, the height adjustment module 440 may also determine the height of the wheelchair from the posture information of the scanned person. Specifically, the module obtains the height of a joint point of the scanned person (e.g., the knee-joint height), takes the knee-joint height as the wheelchair height, and then adjusts the examination table to below the knee-joint height of the scanned person.
In some embodiments, when the scanned person is identified as entering the scanning room on a stretcher, the height of the stretcher is acquired and the examination bed at the boarding position is lowered to below the height of the stretcher. In some embodiments, the height of the stretcher may be a set standard height (e.g., all stretchers in the hospital have the same height); when the posture information acquired in step 520 indicates a stretcher, the height adjustment module 440 may directly lower the examination bed at the boarding position to below the standard stretcher height. In some embodiments, the height adjustment module 440 may also determine the stretcher height from the posture information of the scanned person. Specifically, the module obtains the joint heights of the scanned person (e.g., the average over the leg, torso, and head joints), takes that average as the stretcher height, and adjusts the examination table to below it.
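The three entry modes above share the same pattern: lower the couch below a mode-specific reference height. A minimal sketch, assuming a configurable clearance (the 100 mm default comes from the walking example in the text; the clearances for the wheelchair and stretcher cases are unspecified in the source):

```python
def target_couch_height(entry_mode, reference_height_mm, clearance_mm=100.0):
    """Couch height (mm) at the boarding position for a given entry mode.

    entry_mode: 'walking', 'wheelchair', or 'stretcher'
    reference_height_mm: hip-joint height when walking; otherwise the
    wheelchair or stretcher surface height (standard or measured).
    """
    if entry_mode not in ("walking", "wheelchair", "stretcher"):
        raise ValueError("unknown entry mode: " + entry_mode)
    # In every mode the couch is lowered below the reference height.
    return reference_height_mm - clearance_mm
```

For example, a walking patient with a 900 mm hip-joint height would have the couch lowered to 800 mm; a stretcher at 700 mm with a 50 mm clearance would give 650 mm.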
After the examination table is lowered to the corresponding height, the scanned person gets onto the bed (lies on the examination table), and the subsequent steps are carried out.
Step 560: after the scanned person gets onto the bed, a plurality of second images of the scanned person on the examination bed are acquired through the image capturing device. In some embodiments, step 560 may be performed by the second image acquiring module 450.
After the scanned person gets onto the bed, the second image acquiring module 450 acquires a plurality of second images of the scanned person on the examination bed. In some embodiments, the second image acquiring module 450 may be in communication with the image capturing device. After the positioning posture of the scanned person on the examination table is confirmed, the second image acquiring module 450 may acquire the plurality of second images through the image capturing device. In some embodiments, the plurality of second images may be obtained by segmenting a continuous video (e.g., obtaining a plurality of still images frame by frame). For a description of obtaining still images by segmenting continuous video, reference may be made to the corresponding description of step 510, which is not repeated here.
Step 570: determine the actual positioning information of the scanned person based on the plurality of second images. In some embodiments, step 570 may be performed by the actual positioning information determining module 460.
In some embodiments, the actual positioning information determining module 460 may determine the scanning posture of the scanned person and the positional relationship between the scanned person and the examination table based on the plurality of second images, thereby determining the actual positioning information of the scanned person. In some embodiments, the module may obtain the contour of the scanned person through grayscale transformation, color-distortion correction, and the like applied to the plurality of second images, and then determine the scanning posture and the positional relationship by contour comparison or similar techniques. In some embodiments, the scanning posture of the scanned person may include, but is not limited to: supine, lying on the side, hands raised, and the like. In some embodiments, the positional relationship between the scanned person and the table may be measured in a coordinate system. For example, a coordinate system may be established with the examination table as the reference, and each feature point or contour point of the scanned person may be calibrated to obtain its coordinates.
Further, the actual positioning information determining module 460 may obtain the scanning posture of the scanned person and the positional relationship between the scanned person and the examination table from three-dimensional contour data of the scanned person, that is, data reflecting the body-shape contour of the scanned person. In some embodiments, the module may use a 3D camera in the image capturing device to accurately measure the distance from the camera to each point in the scanned image, obtain the three-dimensional spatial coordinates of each point, and then build a model from these coordinates to obtain the three-dimensional contour data (i.e., a three-dimensional model) of the scanned person. In other alternative embodiments, the module may instead acquire two-dimensional image data of the scanned person with a plurality of 2D cameras and perform three-dimensional reconstruction from the two-dimensional images to obtain the three-dimensional contour data.
It can be understood that the three-dimensional contour data acquired by the 3D camera or the 2D cameras includes the relative distance and relative angle between each point on the scanned person and the camera. Since the position (distance and angle) of the camera relative to the examination table is also known, the relative position of the scanned person with respect to the examination table can be obtained by processing these data.
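The step from camera-relative points to table-relative points is a rigid transform. A minimal sketch, assuming the fixed camera pose (rotation R, translation t) has been obtained by a one-time calibration:

```python
import numpy as np

def camera_to_table(points_cam, R, t):
    """Rigid transform from camera frame to examination-table frame,
    p_table = R @ p_cam + t, applied row-wise to an (N, 3) array."""
    points_cam = np.asarray(points_cam, dtype=float)
    return points_cam @ np.asarray(R, dtype=float).T + np.asarray(t, dtype=float)
```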
In some embodiments, the actual positioning information determining module 460 may further determine feature points of the scanned person based on the plurality of second images, thereby determining the scanning posture of the scanned person and the position relationship between the scanned person and the examination table. Extracting the position coordinates of the feature points from the plurality of second images may be implemented using an algorithm such as OpenPose, Hourglass Network, Mask R-CNN, or the like. For more description of obtaining the scanning posture based on the feature points, reference may be made to the related description of step 520, which is not repeated here.
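Once 2-D feature points (keypoints) are available from a detector such as OpenPose, a scanning posture can be inferred with simple geometric rules. The sketch below is a toy illustration only; the keypoint names and the single rule are assumptions, not the application's actual classifier.

```python
def classify_posture(kp):
    """Toy posture rule on 2-D keypoints (dict: name -> (x, y), with the
    image y-axis pointing downward): a wrist above the head suggests a
    'hands lifted' posture. Keypoint names are illustrative."""
    head_y = kp["head"][1]
    if kp["left_wrist"][1] < head_y or kp["right_wrist"][1] < head_y:
        return "hands lifted"
    return "arms down"
```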
Further, the actual positioning information determining module 460 may calculate the actual positioning information based on the scanning posture of the scanned person and the position relationship between the scanned person and the examination table. The actual positioning information reflects the area of the scanned person that will actually be scanned, given the current scanning posture and the current position relationship between the scanned person and the examination table. In other words, the actual positioning information determining module 460 may estimate the current imaging part (the actually scanned area) from the scanning posture of the scanned person and the position relationship between the scanned person and the examination couch.
In some embodiments, further, the matching degree obtaining module 470 may obtain the matching degree by comparing the obtained actual positioning information with the preset positioning information in the scanning protocol. It can be understood that the preset positioning information in the scanning protocol reflects the target region of the scanned person that needs to be scanned and imaged (e.g., the body part corresponding to a lesion site). For example, in a chest scan, the chest is the target region. In some embodiments, the preset positioning information in the scanning protocol is typically a body part that has been determined by a physician before imaging. The actual positioning information reflects the body part that would be exposed given the scanned person's current position and scanning posture. In some embodiments, the actual positioning information may deviate from the preset positioning information due to a position deviation of the scanned person lying on the examination table (for example, the scanning area corresponding to the preset positioning information is the chest, but the scanning area corresponding to the actual positioning information is the abdomen).
In some embodiments, the matching degree obtaining module 470 may compare the actual positioning information with the preset positioning information in the scanning protocol to obtain the matching degree. The matching degree can be characterized in various ways. For example, it may be expressed as the degree of similarity between the two pieces of positioning information: the position matching degree may reflect how similar the actual positioning information is to the preset positioning information in the scanning protocol, measured, for example, by the deviation of each coordinate between the two pieces of positioning information or by a statistic of those deviations (e.g., the average coordinate deviation). As another example, the position matching degree may be expressed as a similarity level, e.g., levels 1, 2, and 3, where a higher level indicates greater similarity. As yet another example, the position matching degree may reflect the size of the region shared by the actual positioning information and the preset positioning information in the scanning protocol, e.g., an overlap of 80%, 70%, or 35% between the regions corresponding to the two pieces of positioning information. It can be understood that when the overlap is 80%, 80% of the region defined by the actual positioning information coincides with that defined by the preset positioning information in the scanning protocol.
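The "size of the same region" variant of the matching degree can be sketched, for a one-dimensional scan range along the couch axis, as intersection length over union length. Representing a region as a (start, end) pair is an assumption made here for illustration.

```python
def region_overlap(actual, preset):
    """Matching degree of two 1-D scan ranges (start_mm, end_mm) along
    the couch axis, computed as intersection length / union length."""
    inter = max(0.0, min(actual[1], preset[1]) - max(actual[0], preset[0]))
    union = max(actual[1], preset[1]) - min(actual[0], preset[0])
    return inter / union if union > 0 else 1.0
```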
In some embodiments, the matching degree obtaining module 470 may also compare the actual positioning information with the preset positioning information in the scanning protocol using a machine learning model, in particular a convolutional neural network. In the convolutional neural network, the input actual positioning information and the preset positioning information in the scanning protocol may each be represented by a picture matrix, and the output may be the similarity between the two pieces of positioning information. For example, the rows of the picture matrix may correspond to the length of the region described by the positioning information, the columns may correspond to its width, and each element may correspond to a pixel value (e.g., a gray level) of the region. In some embodiments of the present application, the input of the convolutional neural network is the pair of picture matrices corresponding to the actual positioning information and the preset positioning information in the scanning protocol, and the output is the degree of similarity between them. The convolutional neural network can thereby construct a mapping between the actual positioning information and the preset positioning information in the scanning protocol and yield a more accurate comparison result.
Step 590, when the actual positioning information is compared with the preset positioning information in the scanning protocol and meets the preset condition, the scanned person is sent to the scanning position for scanning. In some embodiments, step 590 may be performed by the scan execution module 480.
In some embodiments, the scanning dose of the scanned person is preset in the scanning protocol. For example, the scan dose may have been stored in the scanning protocol in advance and is recalled directly at execution time.
In some embodiments, the scanning dose may also be determined in real time by the scanning posture of the scanned person and the actual scanning area, i.e. based on the actual positioning information. In this scenario, the scan execution module 480 may obtain the relative position between the imaging part and the gantry corresponding to the actual positioning information, and the thickness, width, and height of the imaged part, so as to automatically adjust the appropriate scan dose.
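As a purely illustrative sketch of adapting dose to the imaged part's dimensions: a common rule of thumb scales tube current-time with body thickness. Real automatic exposure control is vendor-specific and regulated; the doubling rule and all parameters below are assumptions, not values from this application.

```python
def scale_mas(base_mas, thickness_cm, ref_cm=20.0, halving_cm=4.0):
    """Illustrative rule of thumb only: scale tube current-time (mAs)
    by a factor of 2 for every `halving_cm` of body thickness beyond a
    reference thickness. NOT a clinical dose model."""
    return base_mas * 2.0 ** ((thickness_cm - ref_cm) / halving_cm)
```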
In some embodiments, the scan execution module 480 may automatically load the determined scan dose into the scanning protocol for confirmation by the operator. In some embodiments, after the scanning dose is confirmed to meet the requirement, the system can control the examination table to automatically send the scanned person to the scanning position for scanning.
In some embodiments, after the scanned person is sent to the scanning position, the scan execution module 480 may illuminate the exposure button for the operator to press. In some embodiments, the scan execution module 480 may automatically scan the scanned person based on an exposure instruction (e.g., an exposure button is pressed by an operator).
In some embodiments, the scan execution module 480 may retrieve historical photographing protocols and determine, based on the scanning area corresponding to the actual positioning information of the scanned person, whether a matching photographing protocol exists. If a historical photographing protocol exists whose photographed part is the same as that corresponding to the actual positioning information and whose posture information differs from the current posture information within a preset threshold range, it is determined that a photographing protocol matching the actual positioning information exists among the historical photographing protocols; otherwise, it is determined that no such matching photographing protocol exists.
Specifically, after acquiring the scanning area (e.g., the photographed part and its size data) corresponding to the actual positioning information of the scanned person, the scan execution module 480 may search the stored historical photographing protocols using a search keyword (e.g., chest or chest X-ray) and filter out several historical photographing protocols matching the keyword as candidate photographing protocols. The scan execution module 480 may then compare the posture information of the historical scanned person in each candidate photographing protocol with the posture information of the current scanned person, and take the candidate photographing protocol whose comparison result falls within the threshold range as the current photographing protocol. In some embodiments, the search keyword may be one or more parameters included in the photographing protocol, for example, the positioning information of the scanned person in the photographing protocol (e.g., standing, side-lying, or supine photographing) or the photographed part (e.g., head, chest, etc.).
In some embodiments, the scanning area corresponding to the actual positioning information may be compared with the area corresponding to a historical photographing protocol in various respects; for example, the height of the scanned person may be compared, as may the thickness, width, and height of the photographed part. When the difference exceeds the threshold, the scan execution module 480 determines that the candidate photographing protocol does not match: for example, if the thickness of the photographed part (such as the chest) is 30 cm while the thickness in the candidate photographing protocol is 40 cm or 20 cm, the difference exceeds the threshold (e.g., 3 cm, 5 cm, or 7 cm), and the candidate is rejected. When the difference is within the threshold, the scan execution module 480 determines that the candidate photographing protocol matches: for example, if the thickness of the photographed part is 30 cm and the thickness in the candidate photographing protocol is 35 cm or 25 cm, the difference is within the threshold (e.g., 5 cm or 7 cm), and the candidate is accepted.
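The threshold comparison described above can be sketched as follows. The dimension keys and the single shared tolerance are assumptions made for illustration; a real implementation could use per-dimension tolerances.

```python
def find_matching_protocol(current, candidates, tol_cm=5.0):
    """Return the first candidate photographing protocol whose
    photographed-part dimensions all lie within tol_cm of the current
    measurement, or None when no candidate matches."""
    for cand in candidates:
        if all(abs(current[k] - cand[k]) <= tol_cm
               for k in ("thickness", "width", "height")):
            return cand
    return None
```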
In some embodiments, after the scanning is completed, for example, after the operator presses the completion button, the scan execution module 480 may control the examination table to automatically return to the position at which the scanned person got on the table, so that the scanned person can get off. In some embodiments, after the scan is completed, the examination table may be automatically adjusted to the height at which the scanned person got on.
In some embodiments, before performing step 590, the process 500 may further include step 580: simulating the motion trajectory of the examination table from the bed-on position to the scanning position. In some embodiments, step 580 may be performed by the trajectory simulation module 490.
The trajectory simulation module 490 may simulate the motion trajectory of the examination table from the bed-on position to the scanning position based on the preset scanning position, the bed-on position, and the position relationship between the scanned person and the examination table. Specifically, an interference region may be generated from the examination couch and its motion trajectory from the bed-on position to the scanning position, and whether the scanned person would interfere with other parts of the scanning apparatus (such as the gantry) may be determined based on this interference region. If no interference is predicted, step 590 may be performed; otherwise, the scanning posture of the scanned person needs to be adjusted to eliminate the interference risk.
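One minimal way to express the interference check is a one-dimensional sketch: at every sampled couch height along the simulated travel, the tallest point of the patient must clear the lower edge of the gantry bore. Real systems sweep a full three-dimensional interference region against all moving parts; the arguments below are illustrative.

```python
def trajectory_is_clear(profile_heights_m, couch_heights_m, bore_floor_m):
    """1-D interference check along a simulated travel: at each sampled
    couch height, the maximum of the patient profile (heights above the
    couch surface) must stay below the lower edge of the gantry bore."""
    top = max(profile_heights_m)
    return all(h + top < bore_floor_m for h in couch_heights_m)
```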
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) capturing the posture information of a scanned person with an image acquisition device and automatically raising or lowering the scanning bed; (2) calling the corresponding scanning protocol based on the identity information to pre-start the medical scanning device, thereby shortening the overall scanning workflow. It should be noted that different embodiments may produce different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantage, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this application are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python, conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), in a cloud computing environment, or as a service such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Numerals describing quantities of components, attributes, and the like are used in some embodiments; it should be understood that such numerals used in the description of the embodiments are, in some instances, modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any application history documents that are inconsistent with or in conflict with the contents of this application, and any documents that would limit the broadest scope of the claims of this application (whether now or later appended). It is noted that if the descriptions, definitions, and/or use of terms in material accompanying this application are inconsistent with or contrary to those stated in this application, the descriptions, definitions, and/or use of terms in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (15)

1. A method of controlling a medical scanning apparatus, comprising:
acquiring a plurality of first images of a scanned person through an image acquisition device arranged in a scanning room;
determining posture information of the scanned person based on the plurality of first images of the scanned person;
automatically acquiring a scanning protocol corresponding to the scanned person based on the identity information of the scanned person, wherein the scanning protocol at least comprises preset positioning information;
pre-starting the medical scanning device based on the scanning protocol;
automatically adjusting the height of the examination bed based on the posture information of the scanned person;
after the scanned person gets on the bed, acquiring a plurality of second images of the scanned person on the examination bed through the image acquisition device;
determining actual positioning information of the scanned person based on the plurality of second images;
and when the actual positioning information is compared with the preset positioning information in the scanning protocol and meets the preset condition, sending the scanned person to a scanning position for scanning.
2. The method of claim 1, further comprising: after the scanning is finished, automatically adjusting the examination table to the height at which the scanned person got on.
3. The method according to claim 1, wherein when the actual positioning information is compared with the preset positioning information in the scanning protocol and does not satisfy the preset condition, the actual positioning of the scanned person on the examination table is adjusted.
4. The method of claim 1, further comprising:
determining identity information of the scanned person based on the plurality of first images of the scanned person;
alternatively,
and determining the identity information of the scanned person through the marker arranged on the scanned person.
5. The method of claim 1,
the scanning protocol further comprises a preset gantry rotation speed of the medical scanning device, preheating of the X-ray tube anode, and the positioning state of each component of the scanning device;
the pre-booting the medical scanning device based on the scanning protocol includes:
adjusting the gantry rotation speed of the medical scanning device to the rotation speed corresponding to the scanning protocol, and/or adjusting the preheating of the X-ray tube anode of the medical scanning device to a working state, and/or adjusting the positioning state of each component of the medical scanning device to the positioning state corresponding to the scanning protocol.
6. The method of claim 1, wherein automatically adjusting the height of the examination table based on the posture information of the subject comprises:
when the scanned person is identified to walk into the scanning room, calculating the leg height of the scanned person, and reducing the height of the examination bed corresponding to the bed-in position to be lower than the leg height of the scanned person;
alternatively,
when the scanned person is recognized to enter the scanning room by taking the wheelchair, acquiring the height of the wheelchair, and reducing the height of the examination bed corresponding to the bed-entering position to be below the height of the wheelchair;
alternatively,
when the fact that the scanned person enters the scanning room by taking the stretcher is recognized, the height of the stretcher is obtained, and the height of the examination bed corresponding to the bed getting position is reduced to be lower than the height of the stretcher.
7. The method of claim 1, wherein prior to sending the scanned person to a scanning location for scanning, the method further comprises:
simulating the motion track of the examination bed from the bed-on position of the scanned person to the scanning position, and determining whether the scanned person and the medical scanning equipment are in interference risk.
8. The method according to claim 1, wherein when the actual positioning information is compared with the preset positioning information in the scanning protocol and then meets a preset condition, the step of sending the scanned person to a scanning position for scanning comprises:
determining a required scan dose based on the actual setup information;
and after confirming that the scanning dose meets the requirement, sending the scanned person to a scanning position for scanning.
9. The method of claim 1, further comprising:
when the actual positioning information is compared with preset positioning information in the scanning protocol and meets a preset condition, sending the scanned person to a scanning position;
and scanning the scanned person based on the exposure instruction.
10. A control system for a medical scanning apparatus, comprising:
the first image acquisition module is used for acquiring a plurality of first images of a scanned person through an image acquisition device arranged in a scanning room;
the posture information acquisition module is used for determining the posture information of the scanned person based on the plurality of first images of the scanned person;
the scanning protocol acquisition module is used for automatically acquiring a scanning protocol corresponding to the scanned person based on the identity information of the scanned person, wherein the scanning protocol at least comprises preset positioning information;
a device pre-start module for pre-starting the medical scanning device based on the scanning protocol;
the height adjusting module is used for automatically adjusting the height of the examination bed based on the posture information of the scanned person;
a second image acquisition module, configured to acquire, by the image acquisition device, a plurality of second images of the scanned person on the examination couch after the scanned person gets on the couch;
an actual positioning information determining module for determining actual positioning information of the scanned person based on the plurality of second images;
and the scanning execution module is used for sending the scanned person to a scanning position for scanning when the actual positioning information is compared with the preset positioning information in the scanning protocol and then meets a preset condition.
11. A control apparatus for a medical scanning device, the apparatus comprising at least one processor and at least one memory;
the at least one memory is for storing computer instructions;
the at least one processor is configured to execute at least some of the computer instructions to implement the method of any of claims 1 to 9.
12. A computer-readable storage medium for medical scanning device control, characterized in that the storage medium stores computer instructions which, when executed by a processor, implement the method of any one of claims 1 to 9.
13. An automatically controlled medical scanning device, comprising:
a scanner for performing a medical scan to acquire scan data of a scan area of a scanned subject;
the image acquisition device is used for acquiring a plurality of first images and a plurality of second images of the scanned person;
the examination bed control mechanism is used for moving the examination bed between the upper bed position and the scanning position and automatically adjusting the height of the examination bed;
a processor for implementing the method of any one of claims 1 to 9.
14. The automatically controlled medical scanning device of claim 13, wherein the image acquisition arrangement comprises a plurality of cameras, at least one of the plurality of cameras being arranged on top of the scan room.
15. The automatically controlled medical scanning device of claim 13, wherein the scanner is an electronic computed tomography scanner.
CN202111052132.5A 2021-09-08 2021-09-08 Control method, device and system of medical scanning equipment Pending CN113647967A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202111052132.5A CN113647967A (en) 2021-09-08 2021-09-08 Control method, device and system of medical scanning equipment
PCT/CN2022/117823 WO2023036243A1 (en) 2021-09-08 2022-09-08 Medical devices, methods and systems for monitoring the medical devices
EP22866704.4A EP4329618A1 (en) 2021-09-08 2022-09-08 Medical devices, methods and systems for monitoring the medical devices


Publications (1)

Publication Number Publication Date
CN113647967A true CN113647967A (en) 2021-11-16

Family

ID=78483011

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111052132.5A Pending CN113647967A (en) 2021-09-08 2021-09-08 Control method, device and system of medical scanning equipment


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115474958A (en) * 2022-09-15 2022-12-16 瑞石心禾(河北)医疗科技有限公司 Method and system for guiding automatic positioning of examination bed in bimodal medical imaging
WO2022262871A1 (en) * 2021-06-18 2022-12-22 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for medical imaging
CN115713526A (en) * 2022-11-28 2023-02-24 南方医科大学珠江医院 Image quality control system based on artificial intelligence
WO2023036243A1 (en) * 2021-09-08 2023-03-16 Shanghai United Imaging Healthcare Co., Ltd. Medical devices, methods and systems for monitoring the medical devices
WO2023141800A1 (en) * 2022-01-26 2023-08-03 Warsaw Orthopedic, Inc. Mobile x-ray positioning system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008005958A (en) * 2006-06-28 2008-01-17 Shimadzu Corp Medical image diagnostic apparatus
CN103607955A (en) * 2012-06-12 2014-02-26 Toshiba Corp Diagnostic imaging apparatus, X-ray computed tomography apparatus, medical bed device and bed control method
US20160175177A1 (en) * 2014-12-17 2016-06-23 General Electric Company System and method for adjusting the height of a patient support table based upon sensed patient height
US20170220709A1 (en) * 2016-02-03 2017-08-03 Varian Medical Systems, Inc. System and method for collision avoidance in medical systems
CN107315923A (en) * 2017-08-14 2017-11-03 Shanghai United Imaging Healthcare Co., Ltd. System and method for adjusting a medical device
CN109464155A (en) * 2018-12-29 2019-03-15 Shanghai United Imaging Healthcare Co., Ltd. Medical scanning positioning method
CN110353711A (en) * 2019-07-19 2019-10-22 Jiangsu CareRay Digital Medical Technology Co., Ltd. AI-based X-ray imaging analysis method and device, and readable storage medium
CN111035405A (en) * 2020-01-02 2020-04-21 Third People's Hospital of Baiyin City Automatic system for CT image diagnosis
CN212037549U (en) * 2020-03-11 2020-12-01 Shanghai United Imaging Healthcare Co., Ltd. Medical imaging system
CN112085846A (en) * 2019-06-14 2020-12-15 GE Precision Healthcare LLC Method and system for generating a 3D point cloud of an object in an imaging system
CN112741643A (en) * 2020-12-31 2021-05-04 Suzhou Boying Medical Technology Co., Ltd. CT system capable of automatic positioning and scanning, and positioning and scanning method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG, Yanxiang et al.: "Interactive Spatial Augmented Reality Technology for Stage Performance", 31 August 2018, University of Science and Technology of China Press, pages: 114 - 115 *
LIN, Qiang et al.: "Behavior Recognition and Intelligent Computing", 30 November 2016, Xidian University Press, pages: 45 - 47 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022262871A1 (en) * 2021-06-18 2022-12-22 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for medical imaging
WO2023036243A1 (en) * 2021-09-08 2023-03-16 Shanghai United Imaging Healthcare Co., Ltd. Medical devices, methods and systems for monitoring the medical devices
WO2023141800A1 (en) * 2022-01-26 2023-08-03 Warsaw Orthopedic, Inc. Mobile x-ray positioning system
CN115474958A (en) * 2022-09-15 2022-12-16 瑞石心禾(河北)医疗科技有限公司 Method and system for guiding automatic positioning of examination bed in bimodal medical imaging
CN115474958B (en) * 2022-09-15 2023-09-08 瑞石心禾(河北)医疗科技有限公司 Method and system for guiding automatic positioning of examination bed in bimodal medical imaging
CN115713526A (en) * 2022-11-28 2023-02-24 南方医科大学珠江医院 Image quality control system based on artificial intelligence

Similar Documents

Publication Publication Date Title
US11253171B2 (en) System and method for patient positioning
EP3669942B1 (en) Systems and methods for determining a region of interest of a subject
US11576645B2 (en) Systems and methods for scanning a patient in an imaging system
CN113647967A (en) Control method, device and system of medical scanning equipment
US20220084245A1 (en) Systems and methods for positioning an object
WO2022032455A1 (en) Imaging systems and methods
US11576578B2 (en) Systems and methods for scanning a patient in an imaging system
CN112022191B (en) Positioning method and system
WO2022105813A1 (en) Systems and methods for subject positioning
CN113397578A (en) Imaging system and method
US20220353409A1 (en) Imaging systems and methods
CN113081013B (en) Spacer scanning method, device and system
WO2023036243A1 (en) Medical devices, methods and systems for monitoring the medical devices
US20240212836A1 (en) Medical devices, methods and systems for monitoring the medical devices
US20240021299A1 (en) Medical systems and methods for movable medical devices
CN113436236B (en) Image processing method and system
WO2022022723A1 (en) Method and system for determining parameter related to medical operation
WO2024067629A1 (en) Methods, systems, and mediums for scanning
US20230398376A1 (en) Methods and systems for radiation therapy guidance
CN112043299A (en) Control method and system of medical equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination