CN111724883A - Medical data processing method, apparatus, system, and storage medium - Google Patents

Medical data processing method, apparatus, system, and storage medium

Info

Publication number
CN111724883A
Authority
CN
China
Prior art keywords
dimensional
diseased part
data
dimensional data
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910203165.1A
Other languages
Chinese (zh)
Inventor
杨宇
刘帅
董晓滨
王冉冉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN201910203165.1A priority Critical patent/CN111724883A/en
Publication of CN111724883A publication Critical patent/CN111724883A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

Landscapes

  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Urology & Nephrology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The invention provides a medical data processing method, device, system, and storage medium. The method comprises the following steps: acquiring first three-dimensional data of a diseased part of a patient, where the first three-dimensional data is three-dimensional image data obtained by converting pre-scanned two-dimensional image data of the diseased part; acquiring the pose relationship of the first three-dimensional data relative to the diseased part; and presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relationship. Because the first three-dimensional data of the diseased part is displayed on the real diseased part and can be interacted with in real time, the diseased position can be accurately located, which is of vital importance to the precision and safety of surgery.

Description

Medical data processing method, apparatus, system, and storage medium
Technical Field
The present invention relates to the field of Augmented Reality (AR), and in particular, to a method, device, system, and storage medium for processing medical data.
Background
With the continuous development of medical technology, auxiliary medical means can greatly improve the precision of diagnosis and surgery. In particular, before some operations, medical data such as Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) images must be consulted to accurately locate the surgical site and thereby reduce trauma.
In the prior art, gentian violet is generally used for preoperative marking, including landmark positioning and chord-length positioning, the surgical site being marked with gentian violet according to an evaluation result. Alternatively, the position of certain subcortical nerve structures is determined with a special device, such as a brain stereotaxic apparatus, using a three-dimensional coordinate system defined by marks or other reference points outside the skull, so that the nerve structures can be directionally stimulated, lesioned, injected with drugs, or used for potential recording without direct-view exposure.
Neither of the above schemes achieves accurate marking of the surgical site, and therefore neither can effectively reduce the patient's surgical trauma.
Disclosure of Invention
To overcome the problem in the prior art that the surgical site cannot be accurately marked, the present invention provides a medical data processing method, device, system, and storage medium.
In a first aspect, the present invention provides a medical data processing method applied to an augmented reality AR device, the method including:
acquiring first three-dimensional data of a diseased part of a patient, wherein the first three-dimensional data is three-dimensional image data obtained by converting pre-scanned two-dimensional image data of the diseased part;
acquiring the pose relation of the first three-dimensional data relative to the diseased part;
and presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relation.
In a specific implementation manner, the acquiring the pose relationship of the first three-dimensional data with respect to the diseased part includes:
acquiring second three-dimensional data of the diseased part, wherein the second three-dimensional data is three-dimensional image data of a blood vessel of the diseased part obtained by adopting an infrared technology;
and registering the first three-dimensional data and the second three-dimensional data to obtain the pose relation of the first three-dimensional data relative to the diseased part.
In another specific implementation manner, the acquiring the pose relationship of the first three-dimensional data with respect to the diseased part includes:
extracting a first three-dimensional coordinate of each mark point from the two-dimensional image data;
acquiring a second three-dimensional coordinate of each marker point of the diseased part relative to the AR equipment;
and registering according to the first three-dimensional coordinate and the second three-dimensional coordinate of each mark point to obtain the pose relation of the first three-dimensional data relative to the diseased part.
Further, the acquiring the first three-dimensional data of the affected part of the patient comprises:
receiving the first three-dimensional data sent by external equipment;
alternatively,
receiving two-dimensional image data of the diseased part sent by external equipment;
and converting the two-dimensional image data according to a density value extraction technology or a volume rendering technology to obtain the first three-dimensional data.
In a specific implementation manner, the acquiring second three-dimensional data of the diseased part includes:
and shooting the diseased part irradiated by infrared rays in real time through an infrared camera to obtain second three-dimensional data.
Specifically, the acquiring the second three-dimensional coordinate of each marker point of the diseased part relative to the AR device includes:
detecting, by an infrared camera, a second three-dimensional coordinate of each landmark point relative to the AR device in real-time.
In a second aspect, the present invention provides an augmented reality apparatus, comprising:
the system comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring first three-dimensional data of a diseased part of a patient, and the first three-dimensional data is three-dimensional image data obtained by converting two-dimensional image data of the diseased part which is scanned in advance;
the acquisition module is further used for acquiring the pose relation of the first three-dimensional data relative to the diseased part;
and the processing module is used for presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relation.
In a specific implementation manner, the obtaining module is specifically configured to obtain second three-dimensional data of the diseased part, where the second three-dimensional data is three-dimensional image data of a blood vessel of the diseased part obtained by using an infrared technology;
and registering the first three-dimensional data and the second three-dimensional data to obtain the pose relation of the first three-dimensional data relative to the diseased part.
In another specific implementation manner, the two-dimensional image data includes images of a plurality of landmark points set at the affected part, and the acquiring module is specifically configured to:
extracting a first three-dimensional coordinate of each mark point from the two-dimensional image data;
acquiring a second three-dimensional coordinate of each marker point of the diseased part relative to the AR equipment;
and registering according to the first three-dimensional coordinate and the second three-dimensional coordinate of each mark point to obtain the pose relation of the first three-dimensional data relative to the diseased part.
Further, the obtaining module is specifically configured to receive the first three-dimensional data sent by an external device;
alternatively,
receiving two-dimensional image data of the diseased part sent by external equipment;
and converting the two-dimensional image data according to a density value extraction technology or a volume rendering technology to obtain the first three-dimensional data.
In a specific implementation manner, the acquiring module is specifically configured to shoot the diseased part irradiated by infrared rays in real time through an infrared camera to obtain the second three-dimensional data.
Specifically, the acquisition module is specifically configured to detect, in real time, a second three-dimensional coordinate of each marker point relative to the AR device by using an infrared camera.
In a third aspect, the present invention provides an augmented reality device, at least comprising: an infrared camera, a display, a processor, a memory, and a computer program;
the display is to present an augmented reality image;
the computer program is stored in the memory, and the processor executes the computer program to implement the medical data processing method according to any one of claims 1 to 7.
In a fourth aspect, the present invention provides a medical data registration system comprising: an external device and the augmented reality device of claim 13; the external equipment is connected with the augmented reality equipment, and the external equipment is used for scanning two-dimensional image data of a diseased part of a patient.
In a specific implementation, the system further includes: a plurality of marker points;
each landmark point is recognizable by an external device and the augmented reality device of claim 13.
In a fifth aspect, the present invention provides a computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for implementing the medical data processing method according to any one of claims 1 to 7.
According to the medical data processing method, device, system, and storage medium provided by the invention, three-dimensional image data of a diseased part is obtained, the pose relationship of that data relative to the patient's diseased part is determined, and the data is finally presented on the diseased part in an augmented reality mode according to the pose relationship. In this way, a virtual image of the diseased part is seamlessly integrated with the patient's real diseased part and can interact with it in real time, so the diseased position can be accurately located, which plays a vital role in the precision and safety of surgery.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the following drawings show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a medical data processing system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an AR glasses according to an embodiment of the present invention;
fig. 3a is a schematic structural diagram of a first positioning apparatus according to an embodiment of the present invention;
fig. 3b is a schematic structural diagram of a second positioning apparatus according to an embodiment of the present invention;
fig. 3c is a schematic structural diagram of a third positioning apparatus according to an embodiment of the present invention;
fig. 4a is a schematic structural diagram of the front of a marker point according to an embodiment of the present invention;
fig. 4b is a schematic structural diagram of the side of a marker point according to an embodiment of the present invention;
Fig. 5 is a schematic flowchart of a first embodiment of a medical data processing method according to the present invention;
fig. 6 is a schematic flowchart of a second embodiment of a medical data processing method according to the present invention;
FIG. 7 is a schematic view of an infrared emitting device for irradiating a diseased part according to an embodiment of the present invention;
fig. 8 is a schematic flowchart of a third embodiment of a medical data processing method according to the present invention;
fig. 9 is a schematic structural diagram of Markers according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an AR device according to an embodiment of the present invention;
fig. 11 is a schematic diagram of a hardware structure of an AR device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," and the like in the description, claims, and drawings of the embodiments of the application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments described herein can, for example, be practised in sequences other than those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
Augmented Reality (AR) is a technology that seamlessly integrates real-world and virtual-world information: physical information (visual, auditory, gustatory, tactile, and so on) that would otherwise be difficult to experience within a certain time and space of the real world is simulated by computer and other technologies and then superimposed, so that virtual information is applied to the real world and perceived by human senses, achieving a sensory experience beyond reality. The real environment and virtual objects are superimposed in real time on the same picture or in the same space and exist simultaneously. AR thus presents real-world information and virtual information at the same time, with the two complementing and overlaying each other. In visual augmented reality, a user sees computer graphics composited onto the surrounding real world through an AR device. AR devices are particularly valuable in the medical field. The present scheme combines infrared blood-vessel imaging, binocular-camera three-dimensional detection, Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) three-dimensional reconstruction, multi-marker-point positioning, and other techniques to project three-dimensional image data of a diseased part onto the patient's real diseased part through the AR device, so that the diseased position and the surgical site can be accurately located and surgical safety improved.
The medical data processing system provided by the present invention is explained below.
Fig. 1 is a schematic diagram of a medical data processing system according to an embodiment of the present invention. As shown in fig. 1, the medical data processing system 10 at least includes an external device 11 and an AR device 12, and the external device 11 is electrically connected to the AR device 12. It will be appreciated that in an actual medical data processing system, there may be one or more of the external device 11 and the AR device 12, and that fig. 1 is merely one example.
In fig. 1, the external device 11 may be a medical imaging device, and may be, for example, a CT device, or an MRI device, or a Positron Emission Tomography (PET) device, or the like.
The AR device 12 may specifically be a wearable device with AR function (e.g., AR glasses), a mobile phone, a tablet, a computer, or any other intelligent terminal device with AR function. In the embodiment of the present application, the AR device has an interface for communicating with an external device and a cloud server. In this scheme, the AR device integrates an infrared camera.
In a specific implementation, fig. 2 is a schematic structural diagram of AR glasses according to an embodiment of the present invention. As shown in fig. 2, the AR glasses 120 include a Camera 121 and a Camera 122, which may be two identical cameras, also referred to as infrared cameras.
In a specific implementation manner, the medical data processing system provided by the scheme may further include a personal computer PC or a cloud server.
In a specific implementation, the medical data processing system of the present solution may further include a plurality of marker points, each of which can be recognized by both the external device 11 and the AR device 12. It should be understood that the marker points are intended to be attached to the patient's diseased part.
Referring to figs. 3a-3c, reference numeral 1 denotes a marker point. Multiple marker points can be attached to the diseased part one by one, or assembled into a positioning apparatus of a certain structure, such as an annular headband or a net cover, that is clamped or attached to the diseased part. The solution does not restrict the structure: the ring structure shown in fig. 3a, the half-ring structure shown in fig. 3b, and the fork-type structure shown in fig. 3c are merely examples.
Taking a CT device as the external device, refer to figs. 4a and 4b, which are schematic structural diagrams of the front and side of a marker point, respectively, according to an embodiment of the present invention. Each marker point consists of two parts: a CT marker point (CT Marker) 1 and an infrared marker point 2, where the infrared marker point is a component that can emit infrared light, such as a small infrared lamp. Together the two form a marker (Marker) that can be recognized both by the CT device and by an AR device equipped with an infrared camera. This embodiment does not restrict the form of the marker points or of their CT and infrared components; figs. 4a and 4b are only examples.
It should be understood that when the external device is an MRI device, the marker points should include MRI marker points recognizable by the MRI device. Similarly, when the external device is another medical imaging device, the marker points include corresponding components recognizable by that device.
On the basis of the above system embodiments, the present solution is explained below by means of several method embodiments.
Fig. 5 is a schematic flow chart of a first embodiment of a medical data processing method according to an embodiment of the present invention, and as shown in fig. 5, the medical data processing method includes:
s101: first three-dimensional data of a diseased portion of a patient is acquired.
In this scenario, the three-dimensional data is also referred to as a three-dimensional location, a three-dimensional model, or a three-dimensional image.
The diseased part is scanned by a CT, MRI, or other medical imaging device, and three-dimensional reconstruction is performed on the two-dimensional data obtained by the scan, for example by a volume rendering or surface rendering technique, to obtain three-dimensional image data of the diseased part, i.e., the first three-dimensional data.
It should be understood that the manner of acquiring the first three-dimensional data includes at least the following two ways:
in a first mode, the AR device receives first three-dimensional data sent by an external device. Specifically, the external device receives two-dimensional data for scanning a diseased part provided by CT/MRI or other medical imaging devices, performs three-dimensional reconstruction on the two-dimensional data to obtain first three-dimensional data, and sends the first three-dimensional data to the AR device. The external device may be a personal computer PC, or a cloud server, or other processor devices.
In the second mode, the AR device receives two-dimensional data for scanning a diseased part provided by CT/MRI or other medical imaging devices, and performs three-dimensional reconstruction on the two-dimensional image data according to a surface rendering technique or a volume rendering technique to obtain first three-dimensional data.
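For illustration only (this is not part of the claimed invention), the density-value extraction route from two-dimensional slices to first three-dimensional data can be sketched as follows: stacked slices are treated as a voxel volume, and voxels above a density threshold are kept as a three-dimensional point cloud. The array layout, voxel spacing, and threshold value below are assumptions, not taken from the patent.

```python
import numpy as np

def slices_to_point_cloud(slices, spacing=(1.0, 1.0, 1.0), threshold=300):
    """Convert a stack of 2D scan slices into a 3D point cloud by keeping
    voxels whose density exceeds a threshold (e.g. dense tissue in CT).

    slices    -- array of shape (num_slices, height, width) of density values
    spacing   -- physical voxel size (dz, dy, dx); scanner-dependent
    threshold -- density cutoff; the value here is illustrative only
    """
    volume = np.asarray(slices, dtype=float)
    # Indices of all voxels above the density threshold, in (z, y, x) order.
    idx = np.argwhere(volume > threshold)
    # Scale index coordinates by the physical voxel spacing.
    return idx * np.asarray(spacing)

# Tiny synthetic example: two 4x4 slices with one dense voxel each.
demo = np.zeros((2, 4, 4))
demo[0, 1, 2] = 500
demo[1, 3, 0] = 800
cloud = slices_to_point_cloud(demo, spacing=(2.0, 1.0, 1.0))
```

In practice the threshold would be chosen per tissue type, and surface or volume rendering would then be applied to the extracted data.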
S102: and acquiring the pose relation of the first three-dimensional data relative to the diseased part.
In this scheme, it should be understood that the pose relationship is the relative position and posture of two sets of three-dimensional data in the coordinate system.
In this step, the three-dimensional image of the diseased part obtained by scanning and reconstruction in step S101 needs to be combined with the real-world diseased part: a real-time image of the diseased part can be captured by the infrared camera of the AR device and registered with the first three-dimensional data, so as to obtain the real-time pose relationship of the first three-dimensional data relative to the diseased part.
Optionally, the three-dimensional data registration method may be a point-set-to-point-set registration method (PSTPS), the Iterative Closest Point method (ICP), a registration method based on point-line-surface geometric feature constraints (GFC), an overall registration method for multiple image data (MVS), or the like; this scheme imposes no restriction on the choice.
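Of the registration methods just listed, the iterative closest point method admits a minimal sketch. This is an illustrative Python version, not the claimed implementation; the point clouds, transform, and iteration count are synthetic assumptions.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst,
    assuming src[i] corresponds to dst[i] (SVD-based Kabsch method)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflection solutions
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP: alternate brute-force nearest-neighbour
    matching with a rigid least-squares update, then report the overall
    transform from the original src onto dst."""
    cur = src.copy()
    for _ in range(iters):
        dists = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[dists.argmin(axis=1)]   # closest dst point for each
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return best_rigid_transform(src, cur)

# Demo: a small grid "model" observed after a known rotation and shift.
model = np.array([[x, y, z] for x in range(3)
                  for y in range(3) for z in range(3)], dtype=float)
a = 0.05                                      # small rotation about z
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
t_true = np.array([0.05, -0.02, 0.03])
scene = model @ Rz.T + t_true
R_est, t_est = icp(model, scene)
```

Production systems would use spatial indexing for the nearest-neighbour step and robust outlier handling; the brute-force loop here only shows the structure of the algorithm.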
S103: and presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relation.
According to the pose relationship obtained in step S102, the first three-dimensional data can be presented on the diseased part in an augmented reality mode, achieving the visual effect of projecting the first three-dimensional data onto the diseased part.
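The presentation step can be understood as applying the registered pose, written as a 4x4 homogeneous matrix, to the model's vertices before rendering. A minimal sketch follows; the pose values are illustrative only, not from the invention.

```python
import numpy as np

def apply_pose(vertices, pose):
    """Map model vertices (n, 3) into the patient/world frame using a
    4x4 homogeneous pose matrix [R | t; 0 0 0 1]."""
    n = len(vertices)
    homo = np.hstack([vertices, np.ones((n, 1))])   # to homogeneous coords
    return (homo @ pose.T)[:, :3]                   # back to 3D

# Example pose: 90-degree rotation about z plus a translation.
pose = np.array([[0.0, -1.0, 0.0, 10.0],
                 [1.0,  0.0, 0.0,  5.0],
                 [0.0,  0.0, 1.0,  0.0],
                 [0.0,  0.0, 0.0,  1.0]])
verts = np.array([[1.0, 0.0, 0.0]])
placed = apply_pose(verts, pose)
```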
In this scheme, it should be understood that the first three-dimensional data is three-dimensional image data of the diseased part and can reflect the specific diseased position. Presenting it on the patient's real diseased part greatly improves the accuracy of localization and provides assurance for a doctor performing an operation, especially a minimally invasive one.
Optionally, the augmented reality content can be displayed on the AR device's own display or on an external display, and the display mode may be on-screen display or projection; this scheme does not restrict the display mode.
According to the medical data processing method provided by this embodiment, the first three-dimensional data of the patient's diseased part and its pose relationship to the diseased part are acquired, and the data is presented on the diseased part in an augmented reality mode according to that relationship. The first three-dimensional data is thus shown on the real diseased part and can be interacted with in real time, so the diseased position can be accurately located, which plays an important role in the precision and safety of surgery.
In the specific implementation of the three-dimensional data registration, computing every frame in real time may cause frame delay. To avoid delay during registration, the following solutions are provided:
1) Combining with the existing tracking techniques of AR devices: after the first frame of three-dimensional data is registered, pose tracking can rely on the AR device's own tracking data. That is, the three-dimensional registration result is combined with the device's tracking data, so registration need not be computed for every frame (it can be computed once every several frames and fused with the tracking data in between).
Tracking methods of AR devices include binocular-camera tracking, time-of-flight (TOF) tracking, structured-light tracking, and the like; this scheme imposes no restriction on the method.
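The every-few-frames idea above can be sketched as follows: between full registrations, the model's pose in the device frame is updated purely from the headset tracker, assuming the model stays fixed in the world (patient) frame. The function names and pose conventions are assumptions made for illustration.

```python
import numpy as np

def translation(t):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def fuse_poses(model_in_device_at_reg, device_pose_at_reg, device_pose_now):
    """Predict the model's current pose in the device frame from tracking.

    model_in_device_at_reg -- model->device pose from the last registration
    device_pose_at_reg     -- device->world pose reported by the tracker then
    device_pose_now        -- device->world pose reported by the tracker now
    The model is fixed in the world (patient) frame, so its current
    device-frame pose is inv(T_now) @ T_reg @ model_in_device_at_reg.
    """
    return (np.linalg.inv(device_pose_now)
            @ device_pose_at_reg
            @ model_in_device_at_reg)

# Demo: the device moved 1 unit along +x since registration, so the model
# appears shifted 1 unit along -x in the device frame.
fused = fuse_poses(np.eye(4), np.eye(4), translation([1.0, 0.0, 0.0]))
```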
2) Improving computational efficiency through hardware.
Based on the above embodiments, the acquisition of the pose relationship of the first three-dimensional data relative to the diseased part in the medical data processing method provided by the invention can be divided into the following two specific implementations:
the first method is as follows:
the diseased part is irradiated by infrared rays in real time, and three-dimensional image data, namely second three-dimensional data, of the blood vessel of the diseased part is received through an infrared binocular camera. And registering the received three-dimensional image data of the blood vessel of the diseased part with the three-dimensional image data of the diseased part obtained through scanning and three-dimensional reconstruction to obtain the pose relationship between the three-dimensional image data and the three-dimensional image data of the diseased part, and displaying the three-dimensional image data of the diseased part on the diseased part through a display device of the AR equipment according to the obtained pose relationship.
In a specific implementation, fig. 6 is a schematic flow chart of a second embodiment of the medical data processing method according to an embodiment of the present invention. As shown in fig. 6, this description takes a CT device as the imaging device, but this does not mean the solution is inapplicable to other imaging devices.
S201: and acquiring second three-dimensional data by real-time infrared three-dimensional detection.
Referring to fig. 7, which is a schematic view of an infrared emitting device irradiating a diseased part according to an embodiment of the present invention, the infrared emitting device may be an infrared lamp; it may be an independent device or integrated into the AR device, and its irradiation range can be adjusted according to the specific position and size of the diseased part.
The infrared rays irradiate the skin of the diseased part, making the subcutaneous blood vessels visible to the infrared binocular camera; real-time three-dimensional data of the vessels (also called a three-dimensional position, model, or image), i.e., the second three-dimensional data, is then obtained from the stereo relationship between the binocular images.
Optionally, the three-dimensional data can be computed from the binocular images using the open-source OpenCV library.
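The stereo geometry that such a binocular reconstruction relies on can be sketched as follows (in OpenCV, a stereo matcher produces the disparity map and `cv2.reprojectImageTo3D` performs an equivalent back-projection). The focal length and baseline values below are illustrative assumptions only.

```python
import numpy as np

def disparity_to_points(disparity, focal_px, baseline_m, cx, cy):
    """Back-project a disparity map into 3D points using the standard
    rectified-stereo relations:
        Z = f * B / d,  X = (u - cx) * Z / f,  Y = (v - cy) * Z / f
    """
    d = np.asarray(disparity, dtype=float)
    v, u = np.indices(d.shape)              # pixel row/column coordinates
    with np.errstate(divide="ignore"):
        Z = focal_px * baseline_m / d       # depth; inf where disparity is 0
    X = (u - cx) * Z / focal_px
    Y = (v - cy) * Z / focal_px
    return np.dstack([X, Y, Z])

# Single-pixel check: disparity 10 px, focal length 500 px, baseline 0.1 m.
pts = disparity_to_points(np.full((1, 1), 10.0), 500.0, 0.1, cx=0.0, cy=0.0)
```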
S202: and extracting three-dimensional data of the blood vessel from the three-dimensional data after the CT scanning and the three-dimensional reconstruction.
Three-dimensional data of the diseased part after CT scanning and three-dimensional reconstruction, i.e., the first three-dimensional data, also called the CT three-dimensional model, is obtained, and the three-dimensional data of the blood vessels is extracted from it. Ways to extract the vessel data include, but are not limited to, density-value extraction or volume rendering (through a transfer function).
S203: and acquiring the real-time pose relation of the CT three-dimensional model relative to the patient.
And registering according to the blood vessel three-dimensional data in the CT three-dimensional model and second three-dimensional data acquired by the infrared binocular camera in real time to obtain a pose relationship between the blood vessel three-dimensional data and the second three-dimensional data, namely the real-time pose relationship of the CT three-dimensional model relative to the patient.
The three-dimensional data registration method may be a point-set-to-point-set registration method (PSTPS), the iterative closest point method (ICP), a registration method based on point/line/surface geometric feature constraints (GFC), a global registration method for multiple image data (MVS), or the like; this solution does not restrict the choice.
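As an illustration of one option from this list, a minimal point-to-point ICP in numpy (a sketch only, not the patent's mandated method):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (the Kabsch/SVD solution used inside each ICP iteration)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=10):
    """Minimal point-to-point ICP: match every src point to its nearest
    dst point, solve for the rigid transform, apply it, and repeat."""
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        R, t = best_rigid_transform(cur, dst[d2.argmin(1)])
        cur = cur @ R.T + t
    return best_rigid_transform(src, cur)

# Synthetic check: the 8 corners of a box, rotated 3 deg about z and shifted.
src = np.array([[x, y, z] for x in (0, 1) for y in (0, 2) for z in (0, 0.5)], float)
a = np.deg2rad(3)
R_true = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
t_true = np.array([0.02, -0.01, 0.015])
R, t = icp(src, src @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # -> True True
```

The brute-force nearest-neighbor search keeps the sketch short; for the dense vessel point clouds of S203, a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace it.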
S204: augmented reality effect image presentation.
This step is consistent with the display process described in the above scheme, and is not described here again.
The second method is as follows:
In this solution, it is first necessary to paste a plurality of marker points (Markers) on the diseased part, or to clamp and/or paste a positioning device composed of a plurality of marker points (for the specific form and application, refer to the marker points and positioning device provided in the embodiment of the medical data processing system). The diseased part is scanned by an imaging device such as CT/MRI to obtain two-dimensional image data of the diseased part; a first three-dimensional coordinate of each marker point is extracted from the two-dimensional image data; a second three-dimensional coordinate of each marker point relative to the AR device is collected in real time by an infrared camera of the AR device; and the pose relationship of the first three-dimensional data relative to the diseased part is obtained by registration according to the first and second three-dimensional coordinates of each marker point.
In a specific implementation manner, fig. 8 is a schematic flow chart of a third embodiment of the medical data processing method according to the present invention. This solution takes a CT device as the imaging device as an example to describe a specific technical solution in the implementation process; this does not mean that the solution is inapplicable to other imaging devices.
In the prior art, CT marker points (CT Markers) are sometimes used to locate the diseased part during CT scanning. In this solution, a CT marker point and an infrared marker point together form one Marker, and a plurality of Markers are arranged in a certain structure, as shown in fig. 9.
As shown in fig. 8, the present solution mainly includes the following steps:
s301: and detecting the real-time infrared multi-point structure, and acquiring a second three-dimensional coordinate of each mark point relative to the AR equipment.
Each mark point is detected in real time through an infrared camera, and the infrared camera can be integrated in AR equipment, can be an infrared binocular camera and can also be an infrared monocular camera. Obtaining the coordinates of each landmark Point relative to the AR device may include, but is not limited to, using a (passive-n-Point, PnP) algorithm.
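PnP recovers the camera pose from 2D-3D correspondences, which is how a monocular infrared camera suffices here. The numpy sketch below (illustrative, not from the disclosure; the intrinsics and marker layout are made up) shows the forward projection model that PnP inverts; in practice one would call OpenCV's `cv2.solvePnP` with the known marker geometry and their detected pixel positions:

```python
import numpy as np

def project(points_3d, R, t, K):
    """Pinhole forward projection. The PnP problem inverts this map:
    given the known marker geometry and their observed pixels, recover
    the camera pose (R, t) -- e.g. with cv2.solvePnP."""
    cam = points_3d @ R.T + t          # world -> camera frame
    uvw = cam @ K.T                    # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # illustrative intrinsics
markers = np.array([[0.0, 0, 1], [0.1, 0, 1], [0, 0.1, 1.2]])
uv = project(markers, np.eye(3), np.zeros(3), K)
print(uv[0])  # marker on the optical axis -> principal point, [320. 240.]
```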
S302: the coordinates of each marker point are extracted from the two-dimensional data of the CT scan.
Two-dimensional image data of a diseased part is obtained through CT scanning, a two-dimensional image of a patient is converted into three-dimensional graphic image data through a volume drawing technology or a surface drawing technology, meanwhile, each mark point in the two-dimensional image data is extracted, and a three-dimensional coordinate, namely a first three-dimensional coordinate, of each mark point in a three-dimensional reconstruction scene is obtained.
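A hedged sketch of how the marker extraction in S302 might be done (the threshold value and the scipy-based approach are assumptions, not from the disclosure): radio-opaque markers image far brighter than tissue, so a high HU threshold plus connected-component labelling yields one centroid per marker.

```python
import numpy as np
from scipy import ndimage

def marker_centroids(volume_hu, thresh=2000.0):
    """Locate radio-opaque marker centroids in a reconstructed CT volume.

    Voxels above a high HU threshold are grouped into connected
    components; each component's centre of mass is one marker's
    voxel coordinate. The threshold is illustrative only.
    """
    mask = volume_hu > thresh
    labels, n = ndimage.label(mask)
    return np.array(ndimage.center_of_mass(mask, labels, list(range(1, n + 1))))

# Toy volume with two single-voxel "markers".
vol = np.zeros((10, 10, 10))
vol[2, 3, 4] = 3000
vol[7, 7, 1] = 3000
centroids = marker_centroids(vol)
print(centroids)
```

Multiplying the voxel coordinates by the slice spacing and pixel pitch converts them to millimetres in the reconstruction frame, giving the first three-dimensional coordinates.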
S303: and carrying out accompanying registration according to the first three-dimensional coordinate and the second three-dimensional coordinate to obtain the pose relation of the first three-dimensional data relative to the diseased part.
And each mark point in the AR glasses coordinate system corresponds to each mark point in the CT reconstruction coordinate system one by one, and the coordinate relation between the two mark points can be obtained by utilizing a three-dimensional data registration algorithm, so that the pose relation of the first three-dimensional data relative to the diseased part is obtained.
The three-dimensional data registration method may be a point-set-to-point-set registration method (PSTPS), the iterative closest point method (ICP), a registration method based on point/line/surface geometric feature constraints (GFC), a global registration method for multiple image data (MVS), or the like; this solution does not restrict the choice.
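Because S303 already provides one-to-one marker correspondences, the registration can also be solved in closed form, with no iterative matching. A minimal Kabsch/SVD sketch (illustrative only; the marker coordinates are made up):

```python
import numpy as np

def register_markers(ct_pts, ar_pts):
    """Closed-form rigid registration (Kabsch/SVD solution).

    Since the marker points correspond one-to-one across the CT
    reconstruction frame and the AR glasses frame, the R, t minimizing
    ||R*ct + t - ar|| follow directly from an SVD of the
    cross-covariance matrix.
    """
    c_ct, c_ar = ct_pts.mean(0), ar_pts.mean(0)
    H = (ct_pts - c_ct).T @ (ar_pts - c_ar)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, c_ar - R @ c_ct

# Four non-coplanar markers under a known 90-degree rotation and shift.
ct = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
ar = ct @ R_true.T + np.array([0.1, 0.2, 0.3])
R, t = register_markers(ct, ar)
print(np.allclose(R, R_true), np.allclose(t, [0.1, 0.2, 0.3]))  # -> True True
```

At least three non-collinear markers are needed for the rotation to be determined, which is one reason the Markers are arranged in a fixed multi-point structure.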
S304: augmented reality effect image presentation.
This step is consistent with the display process described in the above scheme, and is not described here again.
Fig. 10 is a schematic structural diagram of an AR device according to an embodiment of the present invention, and as shown in fig. 10, the AR device 100 includes:
the acquisition module 101: the system comprises a data acquisition module, a data acquisition module and a data processing module, wherein the data acquisition module is used for acquiring first three-dimensional data of an affected part of a patient, and the first three-dimensional data is three-dimensional image data obtained by converting two-dimensional image data of the affected part scanned in advance;
the acquisition module is further used for acquiring the pose relation of the first three-dimensional data relative to the diseased part;
the processing module 102: and the first three-dimensional data is used for presenting the diseased part in an augmented reality mode according to the pose relation.
The AR device provided in this embodiment is used to implement the technical solution in any of the foregoing method embodiments; its implementation principle and technical effect are similar. The three-dimensional image data of the diseased part is presented at the real diseased part in an augmented reality manner, realizing real-time interaction with the three-dimensional image data, which plays an important role in the accuracy and safety of surgical treatment.
On the basis of the embodiment shown in fig. 10, the acquiring module 101 is specifically configured to acquire second three-dimensional data of the diseased part, where the second three-dimensional data is three-dimensional image data of a blood vessel of the diseased part obtained by using an infrared technology; and registering the first three-dimensional data and the second three-dimensional data to obtain the pose relation of the first three-dimensional data relative to the diseased part.
In a specific implementation manner, the two-dimensional image data includes images of a plurality of marker points set at the affected part, and the obtaining module 101 is specifically configured to:
extracting a first three-dimensional coordinate of each mark point from the two-dimensional image data;
acquiring a second three-dimensional coordinate of each marker point of the diseased part relative to the AR equipment;
and registering according to the first three-dimensional coordinate and the second three-dimensional coordinate of each mark point to obtain the pose relation of the first three-dimensional data relative to the diseased part.
Further, the obtaining module 101 is specifically configured to receive the first three-dimensional data sent by an external device;
or,
receiving two-dimensional image data of the diseased part sent by external equipment;
and converting the two-dimensional image data according to a surface rendering technology or a volume rendering technology to obtain the first three-dimensional data.
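As a sketch of what the volume-rendering conversion involves (illustrative only, not from the disclosure): the core of volume rendering is compositing voxel colors and opacities, assigned by a transfer function, along each viewing ray. The minimal numpy example below composites front-to-back along one axis; a real renderer casts arbitrary rays, but the accumulation math is the same.

```python
import numpy as np

def volume_render_axis(volume, transfer):
    """Minimal front-to-back emission/absorption compositing along axis 0.

    `transfer` maps voxel intensity to (color, opacity) -- the
    transfer-function role mentioned in the text for highlighting
    e.g. blood vessels during volume rendering.
    """
    color, alpha = transfer(volume)        # per-voxel emission and opacity
    out = np.zeros(volume.shape[1:])
    trans = np.ones(volume.shape[1:])      # accumulated transparency
    for slice_c, slice_a in zip(color, alpha):
        out += trans * slice_a * slice_c   # add this slice's contribution
        trans *= (1 - slice_a)             # light absorbed so far
    return out

# A fully opaque bright voxel in the front slice hides everything behind it.
vol = np.zeros((3, 2, 2))
vol[0, 0, 0] = 1.0
transfer = lambda v: (v, (v > 0.5).astype(float))  # binary transfer function
img = volume_render_axis(vol, transfer)
print(img[0, 0])  # -> 1.0
```

Surface rendering instead extracts an explicit mesh (e.g. by marching cubes at a chosen iso-density) and rasterizes it; both routes produce the first three-dimensional data from the CT slices.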
In a specific implementation manner, the acquiring module 101 is specifically configured to shoot the affected part irradiated by infrared rays in real time through an infrared camera, so as to obtain the second three-dimensional data.
Specifically, the acquiring module 101 is specifically configured to detect a second three-dimensional coordinate of each marker point relative to the AR device in real time through an infrared camera.
The AR device provided by any implementation manner is configured to execute the technical solution in any method embodiment, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 11 is a schematic diagram of a hardware structure of an AR device according to an embodiment of the present invention. As shown in fig. 11, the AR apparatus 200 includes:
an infrared camera 201, a display 202, a processor 203, a memory 204 and a computer program;
the display 202 is for presenting an augmented reality image;
the computer program is stored in the memory 204 and executed by the processor 203 to implement the medical data processing method described in any of the method embodiments.
Fig. 11 shows a simplified design of the AR device; the embodiments of the present invention do not limit the number of processors or memories in the AR device, and fig. 11 takes one of each only as an example.
Alternatively, the memory 204 may be separate or integrated with the processor 203.
When the memory 204 is provided separately, the terminal device further includes a bus for connecting the memory 204 and the processor 203.
The embodiment of the present invention further provides a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the medical data processing method described in any of the foregoing method embodiments is implemented.
Those of ordinary skill in the art will understand that all or a portion of the steps of the above method embodiments may be implemented by hardware associated with program instructions. The program may be stored in a computer-readable storage medium; when executed, it performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (16)

1. A medical data processing method is applied to an Augmented Reality (AR) device, and comprises the following steps:
acquiring first three-dimensional data of a diseased part of a patient, wherein the first three-dimensional data is three-dimensional image data obtained by converting pre-scanned two-dimensional image data of the diseased part;
acquiring the pose relation of the first three-dimensional data relative to the diseased part;
and presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relation.
2. The method of claim 1, wherein the acquiring the pose relationship of the first three-dimensional data with respect to the diseased part comprises:
acquiring second three-dimensional data of the diseased part, wherein the second three-dimensional data is three-dimensional image data of a blood vessel of the diseased part obtained by adopting an infrared technology;
and registering the first three-dimensional data and the second three-dimensional data to obtain the pose relation of the first three-dimensional data relative to the diseased part.
3. The method according to claim 1, wherein the two-dimensional image data includes images of a plurality of marker points provided at the diseased part, and the acquiring the pose relationship of the first three-dimensional data with respect to the diseased part includes:
extracting a first three-dimensional coordinate of each mark point from the two-dimensional image data;
acquiring a second three-dimensional coordinate of each marker point of the diseased part relative to the AR equipment;
and registering according to the first three-dimensional coordinate and the second three-dimensional coordinate of each mark point to obtain the pose relation of the first three-dimensional data relative to the diseased part.
4. The method of any one of claims 1 to 3, wherein the obtaining of the first three-dimensional data of the diseased portion of the patient comprises:
receiving the first three-dimensional data sent by external equipment;
or,
receiving two-dimensional image data of the diseased part sent by external equipment;
and converting the two-dimensional image data according to a surface rendering technology or a volume rendering technology to obtain the first three-dimensional data.
5. The method of claim 2, wherein said acquiring second three-dimensional data of said diseased portion comprises:
and shooting the diseased part irradiated by infrared rays in real time through an infrared camera to obtain second three-dimensional data.
6. The method of claim 3, wherein said obtaining second three-dimensional coordinates of each landmark point of the diseased portion relative to the AR device comprises:
detecting, by an infrared camera, a second three-dimensional coordinate of each landmark point relative to the AR device in real-time.
7. An augmented reality device, comprising:
the system comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring first three-dimensional data of a diseased part of a patient, and the first three-dimensional data is three-dimensional image data obtained by converting two-dimensional image data of the diseased part which is scanned in advance;
the acquisition module is further used for acquiring the pose relation of the first three-dimensional data relative to the diseased part;
and the processing module is used for presenting the first three-dimensional data on the diseased part in an augmented reality mode according to the pose relation.
8. The apparatus according to claim 7, wherein the acquiring module is specifically configured to acquire second three-dimensional data of the diseased part, the second three-dimensional data being three-dimensional image data of a blood vessel of the diseased part obtained by using an infrared technology;
and registering the first three-dimensional data and the second three-dimensional data to obtain the pose relation of the first three-dimensional data relative to the diseased part.
9. The apparatus according to claim 7, wherein the two-dimensional image data includes images of a plurality of landmark points set at the affected part, and the acquiring module is specifically configured to:
extracting a first three-dimensional coordinate of each mark point from the two-dimensional image data;
acquiring a second three-dimensional coordinate of each marker point of the diseased part relative to the AR equipment;
and registering according to the first three-dimensional coordinate and the second three-dimensional coordinate of each mark point to obtain the pose relation of the first three-dimensional data relative to the diseased part.
10. The device according to any one of claims 7 to 9, wherein the obtaining module is specifically configured to receive the first three-dimensional data sent by an external device;
or,
receiving two-dimensional image data of the diseased part sent by external equipment;
and converting the two-dimensional image data according to a density value extraction technology or a volume rendering technology to obtain the first three-dimensional data.
11. The apparatus according to claim 8, wherein the acquiring module is specifically configured to capture the affected part irradiated with infrared rays in real time by an infrared camera to obtain the second three-dimensional data.
12. The device of claim 9, wherein the acquisition module is specifically configured to detect the second three-dimensional coordinates of each landmark point relative to the AR device in real-time via an infrared camera.
13. An augmented reality device, comprising at least: an infrared camera, a display, a processor, a memory, and a computer program;
the display is to present an augmented reality image;
the computer program is stored in the memory, and the processor executes the computer program to implement the medical data processing method according to any one of claims 1 to 6.
14. A medical data processing system, comprising: an external device and the augmented reality device of claim 13; the external equipment is connected with the augmented reality equipment, and the external equipment is used for scanning two-dimensional image data of a diseased part of a patient.
15. The system of claim 14, further comprising: a plurality of marker points;
each landmark point is recognizable by an external device and the augmented reality device of claim 13.
16. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for implementing the medical data processing method according to any one of claims 1 to 6.
CN201910203165.1A 2019-03-18 2019-03-18 Medical data processing method, apparatus, system, and storage medium Pending CN111724883A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910203165.1A CN111724883A (en) 2019-03-18 2019-03-18 Medical data processing method, apparatus, system, and storage medium


Publications (1)

Publication Number Publication Date
CN111724883A true CN111724883A (en) 2020-09-29

Family

ID=72563216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910203165.1A Pending CN111724883A (en) 2019-03-18 2019-03-18 Medical data processing method, apparatus, system, and storage medium

Country Status (1)

Country Link
CN (1) CN111724883A (en)

Similar Documents

Publication Publication Date Title
JP2023175709A (en) Registration for spatial tracking system and augmented reality display
RU2740259C2 (en) Ultrasonic imaging sensor positioning
US11759261B2 (en) Augmented reality pre-registration
US20220405935A1 (en) Augmented reality patient positioning using an atlas
Wang et al. Video see‐through augmented reality for oral and maxillofacial surgery
CN106687046B (en) Guidance system for positioning a patient for medical imaging
US20230016227A1 (en) Medical augmented reality navigation
US10359916B2 (en) Virtual object display device, method, program, and system
KR20230116969A (en) Aligning Image Data of a Patient with Actual Views of the Patient Using an Optical Code Affixed to the Patient
US10360730B2 (en) Augmented reality providing system and method, information processing device, and program
US11961193B2 (en) Method for controlling a display, computer program and mixed reality display device
Gsaxner et al. Markerless image-to-face registration for untethered augmented reality in head and neck surgery
JP2019519257A (en) System and method for image processing to generate three-dimensional (3D) views of anatomical parts
US11340708B2 (en) Gesture control of medical displays
Shan et al. Augmented reality based brain tumor 3D visualization
NL2022371B1 (en) Method and assembly for spatial mapping of a model of a surgical tool onto a spatial location of the surgical tool, as well as a surgical tool
KR102056436B1 (en) Medical navigation system and the method thereof
EP3242602B1 (en) Ultrasound imaging apparatus and method for segmenting anatomical objects
EP3273409A1 (en) Image processing apparatus and image processing method
JP2015159945A (en) Image display apparatus, method, and program
CN109215104B (en) Brain structure image display method and device for transcranial stimulation treatment
CN108430376B (en) Providing a projection data set
CN111724883A (en) Medical data processing method, apparatus, system, and storage medium
CN115176283A (en) Augmented reality positioning medical views
US10049480B2 (en) Image alignment device, method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination