CN112107366A - Mixed reality ultrasonic navigation system - Google Patents

Mixed reality ultrasonic navigation system

Info

Publication number
CN112107366A
Authority
CN
China
Prior art keywords
mixed reality
point
tracker
tracking
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010715410.XA
Other languages
Chinese (zh)
Other versions
CN112107366B (en)
Inventor
张嘉伟
陈亮
韩曼曼
赵泉洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Jinser Medical Information Technology Co ltd
Original Assignee
Changzhou Jinser Medical Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Jinser Medical Information Technology Co., Ltd.
Priority to CN202010715410.XA
Publication of CN112107366A
Application granted
Publication of CN112107366B
Legal status: Active

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 — Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20 — Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/101 — Computer-aided simulation of surgical operations
    • A61B 2034/102 — Modelling of surgical devices, implants or prosthesis
    • A61B 2034/2046 — Tracking techniques
    • A61B 2034/2063 — Acoustic tracking systems, e.g. using ultrasound

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to a mixed reality ultrasonic navigation system comprising a mixed reality device, a tracking device, a hybrid tracker and an ultrasonic probe, the hybrid tracker being mounted on the ultrasonic probe. The mixed reality device receives a three-dimensional simulation model and projects it onto the actual spatial position of the patient. Intraoperative ultrasound and preoperative CT images are fused in real time over the patient's operation area in the real spatial scene, so that the doctor's visual field does not need to leave the operation area; ultrasound provides auxiliary positioning, realizing a radiation-free mixed reality ultrasound navigation method.

Description

Mixed reality ultrasonic navigation system
Technical Field
The invention relates to the technical field of medical instruments, in particular to a mixed reality ultrasonic navigation system.
Background
There are two main types of conventional surgical navigation systems. The first performs intraoperative navigation relying only on preoperative CT and MRI images; because elastic human tissues and organs deform on contact with surgical instruments, such systems cannot track and reflect these changes in real time, so the preoperative medical image data cannot be completely matched to the actual intraoperative situation. The second performs image-guided navigation by registering and fusing the preoperative CT and MRI images with real-time intraoperative medical images.
In existing practice, the affected area is imaged before the operation with medical imaging equipment such as CT, MRI, B-mode ultrasound or X-ray, a 2D planar image is displayed on a screen, and the doctor mentally reconstructs the spatial position and structure of the affected area from the 2D image and operates accordingly. In addition, conventional navigation systems require the doctor to watch a computer screen during the operation to observe the navigation result, and sometimes require continuous X-ray scanning, which introduces radiation dose and causes harm to both doctor and patient.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a mixed reality ultrasonic navigation system that fuses intraoperative ultrasound and preoperative CT images in real time over the patient's operation area in the real spatial scene, so that the doctor's visual field does not need to leave the operation area.
The technical scheme for realizing the purpose of the invention is as follows: a mixed reality ultrasound navigation system has an ultrasound probe, a hybrid tracker, a tracking device and a mixed reality device worn on the head of the doctor; the hybrid tracker is arranged on the ultrasonic probe; the mixed reality device is used for receiving the three-dimensional simulation model and projecting it onto the actual spatial position of the patient; the method comprises the following specific steps:
(1) performing preoperative CT medical image scanning on a patient to obtain a preoperative CT image sequence;
(2) fixing the position of the tracking device, and obtaining the spatial orientation of the ultrasonic imaging under the coordinate system w of the tracking device, specifically: start the tracking device and track the orientation of the ultrasonic probe through the hybrid tracker, obtaining the orientation transformation matrix T_w^p (here T_b^a denotes the homogeneous transformation matrix that maps coordinates from system a into system b; w is the tracking device, p the hybrid tracker, us the ultrasound image); meanwhile, the ultrasonic probe scans the registration module, and from the marker points of the registration module detected by the ultrasonic probe, combined with the orientation of the hybrid tracker, the transformation matrix T_p^us between the ultrasound probe carrying the hybrid tracker and the desired imaging position is calculated, thereby obtaining the transformation matrix T_w^us between the tracking device and the ultrasonic imaging, wherein T_w^us = T_w^p · T_p^us;
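As a worked illustration of the transform chain in step (2) — a minimal sketch in which the variable names and all numeric values are hypothetical, not taken from the patent's formula images — 4×4 homogeneous matrices compose by matrix multiplication, and T_w^us = T_w^p · T_p^us maps ultrasound-image coordinates into the tracking-device frame:

```python
import numpy as np

def make_transform(rotation_deg_z=0.0, translation=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous transform from a rotation about z and a translation."""
    theta = np.radians(rotation_deg_z)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = translation
    return T

# Hypothetical poses: hybrid tracker p in the tracking-device frame w, and
# ultrasound image frame us relative to the tracker p.
T_w_p = make_transform(rotation_deg_z=30.0, translation=(100.0, 50.0, 0.0))
T_p_us = make_transform(rotation_deg_z=0.0, translation=(0.0, -20.0, -5.0))

# Step (2): the ultrasound image pose in the tracking-device frame.
T_w_us = T_w_p @ T_p_us

# A point given in the ultrasound image frame maps into w by left-multiplication.
point_us = np.array([0.0, -10.0, -15.0, 1.0])
point_w = T_w_us @ point_us
print(point_w)
```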
(3) continuously scanning the patient with the ultrasonic probe in the tracking state, from top to bottom, to obtain an intraoperative ultrasound image sequence referenced to the origin and coordinate system w of the tracking device;
(4) selecting at least three corresponding feature points in the intraoperative ultrasound image sequence and the preoperative CT image sequence, and calculating the registration transformation matrix T_us^ct from the preoperative CT sequence to the intraoperative ultrasound sequence, thereby realizing the fusion of the preoperative CT image sequence and the intraoperative ultrasound image sequence;
(5) transmitting the spatial information obtained by the tracking device to the mixed reality device; based on the coordinate transformation matrix T_ms^w from the tracking device to the mixed reality device (ms) and the transformation matrix T_w^us between the tracking device and the ultrasonic imaging, the mixed reality device calculates the intraoperative ultrasound image orientation matrix T_ms^us with the mixed reality device as the coordinate center, wherein T_ms^us = T_ms^w · T_w^us, so that the wearer of the mixed reality device sees the fused image of intraoperative ultrasound and preoperative CT in real time in the real scene, with the image located at the actual position of the patient.
The registration module comprises a tracker and a series of marker points whose positions are fixed relative to the registration module's tracker; that is, from the orientation matrix T_w^c of this tracker (coordinate system c), the orientations of the marker points can be derived.
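A minimal sketch of this idea, assuming hypothetical marker offsets and a hypothetical tracker pose: because the markers are rigid with respect to the registration module's tracker, one matrix multiplication per marker yields their positions in the tracking-device frame.

```python
import numpy as np

# Hypothetical pose of the registration-module tracker (frame c) in the
# tracking-device frame w, as measured by the tracking device.
T_w_c = np.eye(4)
T_w_c[:3, 3] = (200.0, 0.0, 80.0)

# Marker offsets expressed in the tracker's own frame c (hypothetical values;
# in the patent they are fixed by the module geometry h, d, v).
markers_c = {
    "a": np.array([15.0, 0.0, 30.0, 1.0]),
    "b": np.array([15.0, 0.0, 60.0, 1.0]),
    "c": np.array([15.0, -10.0, 45.0, 1.0]),
    "d": np.array([15.0, -20.0, 45.0, 1.0]),
}

# Because the markers are rigidly fixed to the tracker, their positions in w
# follow directly from the tracker's orientation matrix.
markers_w = {name: T_w_c @ p for name, p in markers_c.items()}
print(markers_w["a"])
```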
The step (2) in the above technical scheme is specifically:
A. firstly, obtaining the spatial positions (a_p, b_p, c_p, d_p) of the detectable marker points of the ultrasonic probe at the desired position of the ultrasound image;
B. Acquiring the position of a corresponding detectable mark point in the original ultrasonic image tracking position;
C. deducing the spatial position of the detectable mark point in the original tracking position of the ultrasonic image;
D. calculating the transformation matrix T_p^us between the ultrasound probe equipped with the hybrid tracker and the desired imaging location.
The step A of the above technical scheme is specifically as follows: record the left point of the first row of the registration module's marker points as point a and the right point of the first row as point b; the point of the second row is c; the point of the third row is d. The four points a, b, c, d lie in one plane parallel to the z_c y_c plane of the tracker on the registration module, and it is known that the distance between the z_c y_c plane and the plane of points a, b, c, d is h, the distance from the tracker to point a and from point a to point b is d, and the spacing between the first and second rows and between the second and third rows is v; the orientation transformation matrix T_w^c of the tracker on the registration module is known. Hence the positions of the four points in the registration-module frame c follow from this geometry, and their positions under the coordinate system p of the hybrid tracker are derived as a_p = (T_w^p)^{-1} · T_w^c · a_c, and likewise b_p, c_p and d_p, where (T_w^p)^{-1} is the inverse matrix of T_w^p;
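A sketch of the step-A computation under assumed geometry (the poses, and the way h, d, v enter the marker coordinates, are placeholders rather than the patent's exact layout); the key relation is a_p = (T_w^p)^{-1} · T_w^c · a_c:

```python
import numpy as np

def transform(rot_z_deg, t):
    """4x4 homogeneous transform: rotation about z plus translation."""
    th = np.radians(rot_z_deg)
    c, s = np.cos(th), np.sin(th)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

# Poses measured by the tracking device (hypothetical values).
T_w_c = transform(180.0, (250.0, 0.0, 100.0))   # registration-module tracker
T_w_p = transform(0.0,   (180.0, 0.0, 60.0))    # hybrid tracker on the probe

# Marker a expressed in the registration-module frame c; the exact layout in
# terms of h, d, v is an assumption here, not the patent's stated geometry.
h, d, v = 10.0, 20.0, 15.0
a_c = np.array([h, 0.0, d, 1.0])

# Position of marker a in the hybrid-tracker frame p.
a_p = np.linalg.inv(T_w_p) @ T_w_c @ a_c
print(a_p)
```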
The step B specifically comprises: identifying, in the image, the imaging pixel positions of the four marker points a', b', c', d' in the original tracking position of the ultrasound image; then, combining the physical length and width represented by each pixel of the ultrasound image, obtaining the distances from each imaged marker to the upper boundary and the left boundary of the image, namely Va', Ha'; Vb', Hb'; Vc', Hc'; Vd', Hd';
The step C is specifically as follows: since the real-time tracking position of the ultrasound image is set at the hybrid tracker on the ultrasound probe, the image at the original tracking position lies on the z_p y_p plane of the hybrid tracker, with its upper-left corner at the origin of the hybrid tracker's own coordinates; under the coordinate system p, the spatial positions of the four marker points a', b', c', d' are therefore given by their distances to the image edges, i.e. the position of point a' is a'_p = [0, -Va', -Ha', 1]^T; the position of point b' is b'_p = [0, -Vb', -Hb', 1]^T; the position of point c' is c'_p = [0, -Vc', -Hc', 1]^T; the position of point d' is d'_p = [0, -Vd', -Hd', 1]^T;
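A sketch of steps B and C with an assumed pixel spacing (the spacing values and pixel indices are hypothetical); it converts detected pixel positions into the [0, -V, -H, 1]^T convention used above:

```python
import numpy as np

# Hypothetical ultrasound image geometry: physical size of one pixel (mm).
pixel_height_mm = 0.2   # vertical spacing
pixel_width_mm = 0.2    # horizontal spacing

def marker_position_p(row, col):
    """Steps B/C: pixel indices of a detected marker -> homogeneous position in
    the hybrid-tracker frame p, following the [0, -V, -H, 1] convention."""
    V = row * pixel_height_mm   # distance to the upper image boundary
    H = col * pixel_width_mm    # distance to the left image boundary
    return np.array([0.0, -V, -H, 1.0])

# Hypothetical detected pixel positions of markers a', b', c', d'.
detected = {"a'": (120, 80), "b'": (120, 180), "c'": (195, 130), "d'": (270, 130)}
positions_p = {name: marker_position_p(r, c) for name, (r, c) in detected.items()}
print(positions_p["a'"])
```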
The step D is specifically as follows: from the one-to-one correspondence between (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p), the transformation matrix T_p^us is calculated; the correspondence is a_p = T_p^us · a'_p, b_p = T_p^us · b'_p, c_p = T_p^us · c'_p, d_p = T_p^us · d'_p. Since the four marker points (a'_p, b'_p, c'_p, d'_p) are mapped onto (a_p, b_p, c_p, d_p) by a rigid transformation, T_p^us contains only translation and rotation operations and no scaling.
In step D of the above technical solution, the transformation matrix T_p^us is solved as follows:
e) first, translate and rotate (a'_p, b'_p, c'_p, d'_p) as a whole so that point a'_p coincides with point a_p;
f) then traverse rotations about the x, y and z axes respectively, so that the combined error between the transformed (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p) is minimal;
g) then fine-tune the translation to further reduce the combined error;
h) finally obtain T_p^us.
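A rough sketch of the procedure in steps e)–h), written under assumptions (the grid ranges, step sizes and the choice of rotation pivot are not specified in the patent); a standard closed-form alternative would be an SVD-based rigid fit, but this sketch follows the translate / rotation-search / translation-fine-tune scheme described here:

```python
import itertools
import numpy as np

def rot_xyz(rx, ry, rz):
    """Rotation matrix from Euler angles (radians) about x, y, z."""
    cx, sx, cy, sy, cz, sz = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry), np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def fit_rigid(src, dst, angle_step_deg=2.0, trim_step=0.5):
    """Estimate a rigid 4x4 transform T with dst ~= T @ src (points as rows,
    homogeneous or 3-D), following the translate / rotation-search /
    translation-fine-tune scheme of the text."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    # e) translate the source set so its first point coincides with dst[0]
    t = dst[0, :3] - src[0, :3]
    pivot = dst[0, :3]
    # f) coarse search over rotations about x, y, z (applied about the pivot)
    angles = np.radians(np.arange(-10.0, 10.0 + 1e-9, angle_step_deg))
    best_err, R = np.inf, np.eye(3)
    for rx, ry, rz in itertools.product(angles, repeat=3):
        cand_R = rot_xyz(rx, ry, rz)
        moved = (cand_R @ (src[:, :3] + t - pivot).T).T + pivot
        err = np.sum((moved - dst[:, :3]) ** 2)
        if err < best_err:
            best_err, R = err, cand_R
    # g) fine-tune the translation with small offsets along each axis
    for d in itertools.product(np.arange(-2.0, 2.0 + 1e-9, trim_step), repeat=3):
        cand_t = t + np.array(d)
        moved = (R @ (src[:, :3] + cand_t - pivot).T).T + pivot
        err = np.sum((moved - dst[:, :3]) ** 2)
        if err < best_err:
            best_err, t = err, cand_t
    # h) assemble the homogeneous matrix: x -> R (x + t - pivot) + pivot
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = R @ (t - pivot) + pivot
    return T
```

In this sketch, calling fit_rigid with the four rows (a'_p, b'_p, c'_p, d'_p) as src and (a_p, b_p, c_p, d_p) as dst would play the role of estimating T_p^us.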
The step (4) in the above technical scheme is specifically:
a. selecting more than 3 marker points in the preoperative CT image sequence to obtain their preoperative spatial positions (a_ct, b_ct, c_ct, ...);
b. selecting, in the intraoperative ultrasound image sequence, the points corresponding to the same anatomical structures as (a_ct, b_ct, c_ct, ...), obtaining their spatial positions (a_us, b_us, c_us, ...);
c. first translating and rotating (a_ct, b_ct, c_ct, ...) as a whole so that point a_ct coincides with point a_us;
d. then traversing rotations about the x, y and z axes respectively, so that the combined error between the transformed point set and (a_us, b_us, c_us, ...) is minimized;
e. then fine-tuning the translation to further reduce the combined error, obtaining T_us^ct.
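Once T_us^ct has been estimated, fusing the two series amounts to mapping preoperative CT coordinates into the intraoperative ultrasound frame; a minimal sketch with hypothetical points and a hypothetical (purely translational) registration result:

```python
import numpy as np

# Hypothetical registration result from step (4): a pure translation here.
T_us_ct = np.eye(4)
T_us_ct[:3, 3] = (4.0, -35.0, 12.0)

# A contour picked in the preoperative CT sequence (homogeneous coordinates).
lesion_ct = np.array([
    [10.0, 22.0, 41.0, 1.0],
    [11.5, 22.0, 40.0, 1.0],
    [13.0, 23.0, 39.5, 1.0],
])

# Mapped into the intraoperative ultrasound frame, where it can be overlaid
# on the live ultrasound slices.
lesion_us = (T_us_ct @ lesion_ct.T).T
print(lesion_us)
```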
In the step (5) of the above technical scheme, the coordinate transformation matrix T_ms^w between the mixed reality device and the tracking device is obtained as follows: at the same moment, the mixed reality device and the tracking device both identify and track the hybrid tracker, obtaining the orientation matrices T_ms^p and T_w^p respectively, from which T_ms^w = T_ms^p · (T_w^p)^{-1}, where (T_w^p)^{-1} is the inverse matrix of T_w^p; T_ms^w is a fixed value C, i.e. it does not change with changes in the position of the tracked object.
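A sketch of this hand-off (all poses hypothetical): when both devices observe the same hybrid tracker at the same instant, the fixed device-to-device transform C = T_ms^w = T_ms^p · (T_w^p)^{-1} follows by one inversion and one multiplication:

```python
import numpy as np

def transform(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Simultaneous observations of the same hybrid tracker p (hypothetical poses).
Rz180 = np.diag([-1.0, -1.0, 1.0])
T_ms_p = transform(Rz180, (0.5, 0.0, -2.0))      # as seen by the mixed reality device
T_w_p = transform(np.eye(3), (1.0, 0.0, 1.5))    # as seen by the tracking device

# Fixed device-to-device transform C = T_ms^w = T_ms^p · (T_w^p)^-1.
T_ms_w = T_ms_p @ np.linalg.inv(T_w_p)

# Sanity check: re-projecting the tracking-device observation reproduces the
# mixed-reality observation of the same tracker.
assert np.allclose(T_ms_w @ T_w_p, T_ms_p)
print(T_ms_w)
```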
A medical instrument mixed reality ultrasonic navigation system, using the above method, tracks a medical instrument (d) equipped with an instrument tracker during the operation and calculates, under the mixed reality device coordinate system ms, the coordinate matrix T_ms^d of the instrument tracker on the medical instrument (d) in the operation area. The technical scheme is as follows: with the orientation matrix T_w^d of the instrument tracker on the medical instrument (d) detected with reference to the tracking device coordinates, the coordinate transformation matrix of the medical instrument (d) relative to the mixed reality device is T_ms^d = T_ms^w · T_w^d.
After the technical scheme is adopted, the invention has the following positive effects:
(1) By means of the mixed reality device, the tracking device, the tracker and ultrasonic scanning, the invention fuses intraoperative ultrasound and preoperative CT images in real time over the patient's operation area in the real spatial scene, so that the doctor's visual field does not need to leave the operation area; ultrasound is used for auxiliary positioning, realizing a radiation-free mixed reality ultrasound navigation method.
(2) The invention integrates the fusion of intraoperative ultrasound and preoperative CT into a three-dimensional surgical navigation system; the operator can adjust the trajectory of a medical instrument according to a pre-planned three-dimensional surgical path to achieve accurate surgical treatment, which reduces the surgical risk caused by errors in traditional experience-based positioning and improves surgical accuracy, making the procedure more efficient, more intuitive, safer and more functional than traditional surgery.
Drawings
In order that the present disclosure may be more readily and clearly understood, reference is now made to the following detailed description of the present disclosure taken in conjunction with the accompanying drawings, in which
FIG. 1 is a schematic diagram of a use scenario of the present invention;
FIG. 2 is a schematic diagram of coordinate transformation among the hybrid tracker, the mixed reality device and the tracking device according to the present invention;
FIG. 3 is a schematic diagram of an original tracking position of an ultrasound image and a desired position of an ultrasound image according to the present invention;
FIG. 4 is a schematic view of a scanning trajectory of an ultrasonic probe according to the present invention;
FIG. 5 is a schematic diagram illustrating registration transformation between an intraoperative ultrasound image sequence and a preoperative CT image sequence in accordance with the present invention;
FIG. 6 is a schematic view of embodiment 2 of the present invention;
FIG. 7 is a schematic spatial relationship diagram according to embodiment 3 of the present invention;
FIG. 8 is a schematic view of example 3 of the present invention;
fig. 9 is a schematic diagram of registration conversion between an intraoperative ultrasound image sequence and a preoperative CT image sequence according to embodiment 3 of the present invention.
Detailed Description
(example 1)
Referring to fig. 1-9, the present invention has an ultrasound probe, a hybrid tracker, a tracking device, and a mixed reality device worn on the head of the doctor; the hybrid tracker is arranged on the ultrasonic probe; the mixed reality device is used for receiving the three-dimensional simulation model and projecting it onto the actual spatial position of the patient; the method comprises the following specific steps:
(1) performing preoperative CT medical image scanning on a patient to obtain a preoperative CT image sequence;
(2) fixing the position of the tracking device, and obtaining the spatial orientation of the ultrasonic imaging under the coordinate system w of the tracking device, specifically: turn on the tracking device and use the hybrid tracker to track the orientation of the ultrasonic probe, obtaining the orientation transformation matrix T_w^p; meanwhile, the ultrasonic probe scans the registration module, and the transformation matrix T_p^us between the ultrasound probe provided with the hybrid tracker and the expected imaging position is calculated according to the marker points of the registration module detected by the ultrasonic probe, combined with the orientation of the hybrid tracker; thereby the transformation matrix between the tracking device and the ultrasonic imaging is calculated as T_w^us = T_w^p · T_p^us;
(3) continuously scanning the patient with the ultrasonic probe in the tracking state, from top to bottom, to obtain an intraoperative ultrasound image sequence referenced to the origin and coordinate system w of the tracking device, see fig. 4;
(4) selecting at least three corresponding feature points in the intraoperative ultrasound image sequence and the preoperative CT image sequence, and calculating the registration transformation matrix T_us^ct from the preoperative CT sequence to the intraoperative ultrasound sequence, so as to realize the fusion of the preoperative CT image sequence and the intraoperative ultrasound image sequence, as shown in fig. 5; the us coordinate system coincides with the w coordinate system of the tracking device, so the registration transformation matrix T_us^ct can also be written as T_w^ct;
(5) See fig. 2, the spatial information obtained by the tracking device is transmitted to the mixed reality device, and the mixed reality device is based on a coordinate transformation matrix from the tracking device to the mixed reality device
Figure BDA0002597972560000069
And a transformation matrix between the tracking device and the ultrasound imaging
Figure BDA00025979725600000610
Calculating an intraoperative ultrasound image orientation matrix with mixed reality equipment as a coordinate center
Figure BDA00025979725600000611
Wherein
Figure BDA00025979725600000612
The method realizes that a mixed reality device wearer sees a fusion image of intraoperative ultrasound and preoperative CT in real time in a real scene, and the image position is at the actual position of a patient.
The coordinate origin of the mixed reality device in the invention is determined at the time the device is started; movement of the device does not change the position of the origin, and the mixed reality device has SLAM spatial positioning capability, so once T_ms^w has been determined its value does not change thereafter.
The registration module comprises a tracker and a series of marker points, namely the four marker points a, b, c and d, whose positions are fixed relative to the registration module's tracker; that is, from the orientation matrix T_w^c of this tracker, the orientations of the marker points can be derived.
The step (2) is specifically as follows:
A. firstly, obtaining the spatial positions (a_p, b_p, c_p, d_p) of the detectable marker points of the ultrasonic probe at the desired position of the ultrasound image, specifically: record the left point of the first row of the registration module's marker points as point a and the right point of the first row as point b; the point of the second row is c; the point of the third row is d. The four points a, b, c, d lie in one plane parallel to the z_c y_c plane of the tracker on the registration module, and it is known that the distance between the z_c y_c plane and the plane of points a, b, c, d is h, the distance from the tracker to point a and from point a to point b is d, and the spacing between the first and second rows and between the second and third rows is v; the orientation transformation matrix T_w^c of the tracker on the registration module is known. Hence the positions of the four points in the registration-module frame c follow from this geometry, and their positions under the coordinate system p of the hybrid tracker are derived as a_p = (T_w^p)^{-1} · T_w^c · a_c, and likewise b_p, c_p and d_p, where (T_w^p)^{-1} is the inverse matrix of T_w^p;
B. acquiring the positions of the corresponding detectable marker points in the original tracking position of the ultrasound image, specifically: identifying, in the image, the imaging pixel positions of the four marker points a', b', c', d' in the original tracking position of the ultrasound image; then, combining the physical length and width represented by each pixel of the ultrasound image, obtaining the distances from each imaged marker to the upper boundary and the left boundary of the image, namely Va', Ha'; Vb', Hb'; Vc', Hc'; Vd', Hd';
C. deducing the spatial positions of the detectable marker points in the original tracking position of the ultrasound image, specifically: since the real-time tracking position of the ultrasound image is set at the hybrid tracker on the ultrasound probe, the image at the original tracking position lies on the z_p y_p plane of the hybrid tracker, with its upper-left corner at the origin of the hybrid tracker's own coordinates; under the coordinate system p, the spatial positions of the four marker points a', b', c', d' are given by their distances to the image edges, i.e. the position of point a' is a'_p = [0, -Va', -Ha', 1]^T; the position of point b' is b'_p = [0, -Vb', -Hb', 1]^T; the position of point c' is c'_p = [0, -Vc', -Hc', 1]^T; the position of point d' is d'_p = [0, -Vd', -Hd', 1]^T;
D. calculating the transformation matrix T_p^us between the ultrasound probe equipped with the hybrid tracker and the desired imaging location, specifically: from the one-to-one correspondence between (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p), the transformation matrix T_p^us is calculated; the correspondence is a_p = T_p^us · a'_p, b_p = T_p^us · b'_p, c_p = T_p^us · c'_p, d_p = T_p^us · d'_p. Since the four marker points (a'_p, b'_p, c'_p, d'_p) are mapped onto (a_p, b_p, c_p, d_p) by a rigid transformation, T_p^us contains only translation and rotation operations and no scaling.
The transformation matrix T_p^us is solved as follows: first, translate and rotate (a'_p, b'_p, c'_p, d'_p) as a whole so that point a'_p coincides with point a_p; then traverse rotations about the x, y and z axes respectively, so that the combined error between the transformed (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p) is minimal; then fine-tune the translation to further reduce the combined error; finally obtain T_p^us.
The step (4) is specifically as follows:
a. selecting more than 3 marker points in the preoperative CT image sequence to obtain their preoperative spatial positions (a_ct, b_ct, c_ct, ...);
b. selecting, in the intraoperative ultrasound image sequence, the points corresponding to the same anatomical structures as (a_ct, b_ct, c_ct, ...), obtaining their spatial positions (a_us, b_us, c_us, ...);
c. first translating and rotating (a_ct, b_ct, c_ct, ...) as a whole so that point a_ct coincides with point a_us;
d. then traversing rotations about the x, y and z axes respectively, so that the combined error between the transformed point set and (a_us, b_us, c_us, ...) is minimized;
e. then fine-tuning the translation to further reduce the combined error, obtaining T_us^ct.
In the step (5), referring to fig. 2, the coordinate transformation matrix T_ms^w between the mixed reality device and the tracking device is obtained as follows: at the same moment, the mixed reality device and the tracking device both identify and track the hybrid tracker, obtaining the orientation matrices T_ms^p and T_w^p respectively; since T_ms^p = T_ms^w · T_w^p, it follows that T_ms^w = T_ms^p · (T_w^p)^{-1}, where (T_w^p)^{-1} is the inverse matrix of T_w^p; T_ms^w is a fixed value C, i.e. it does not change with changes in the position of the tracked object.
(example 2)
Referring to fig. 6, a medical instrument mixed reality ultrasound navigation system, using the method of the mixed reality ultrasound navigation system of embodiment 1, tracks the medical instrument (d) equipped with an instrument tracker during the operation, and calculates, under the mixed reality device coordinate system ms, the coordinate matrix T_ms^d of the instrument tracker on the medical instrument (d) in the operation area, specifically: the orientation matrix T_w^d of the instrument tracker on the medical instrument (d) is detected with reference to the tracking device coordinates; since T_ms^w is known and fixed, the coordinate transformation matrix of the medical instrument (d) relative to the mixed reality device is obtained as T_ms^d = T_ms^w · T_w^d. If a subsequent calculation is needed for the medical instrument, for example when the distance from the tip of the medical instrument to the center of the instrument tracker is l, the orientation matrix of the tip is obtained from T_ms^d by translating the distance l along the instrument axis.
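A sketch of the tip calculation mentioned above; the assumption that the instrument axis is the tracker's local x axis is mine, not stated in the patent, as is the numeric pose:

```python
import numpy as np

def tip_pose(T_ms_d, tip_distance_l, axis=np.array([1.0, 0.0, 0.0])):
    """Pose of the instrument tip in the mixed-reality frame ms, obtained by
    translating the instrument-tracker pose T_ms^d by l along an assumed
    local instrument axis."""
    offset = np.eye(4)
    offset[:3, 3] = tip_distance_l * axis
    return T_ms_d @ offset

# Hypothetical instrument-tracker pose in the mixed reality frame.
T_ms_d = np.eye(4)
T_ms_d[:3, 3] = (0.2, -0.1, -1.0)

print(tip_pose(T_ms_d, tip_distance_l=0.15))
```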
(example 3)
In order to facilitate understanding, the mixed reality device and the tracking device are placed in a certain simple space with the tracked object.
An ultrasound probe with a hybrid tracker is placed in the position shown in fig. 7; the z_w x_w plane, the z_ms x_ms plane and the z_p x_p plane completely overlap. At this time T_ms^w includes a 180-degree rotation about z, and T_w^p is known, so their composition can be evaluated directly. The hybrid tracker is mounted exactly parallel to the imaging plane of the ultrasound probe, so T_p^us contains only a translation operation, and during the registration scan the z_p x_p plane and the z_c x_c plane completely overlap.
Fig. 8 is a schematic diagram; the spatial relationship is based on fig. 7. The orientation of the tracker on the registration module includes a 180-degree rotation about z, and the spatial position of point a in coordinate system p then follows as a_p = (T_w^p)^{-1} · T_w^c · a_c. Meanwhile, the spatial position of point a' under the coordinate system p is obviously a'_p = [0, -Va', -Ha', 1]^T. Because the placement positions are ideal, T_p^us can be calculated simply by translating point a onto point a'; this already gives the minimum-error transformation, and no further rotation is needed to solve for the optimal transformation.
According to the correspondence a_p = T_p^us · a'_p, and since Ha' - d - L3 is almost equal to 0, T_p^us reduces to a simple translation. So far we have the spatial orientation of the ultrasonic imaging under the coordinate system w of the tracking device as obtained by the tracking device, namely T_w^us = T_w^p · T_p^us; for example, given the current orientation T_w^p of the ultrasound probe, T_w^us follows directly.
Then the ultrasound probe is used to scan the patient continuously from top to bottom to obtain an ultrasound image sequence; abstractly, the ultrasound sequence images and the CT sequence images can each be viewed as lying in a cuboid with a spatial position attribute, as shown in fig. 9.
It is assumed here that the patient's intraoperative position is with the head toward the tracking device, i.e. the foot-to-head direction is consistent with z_w and the front-to-back direction is consistent with y_w, and the three feature points selected in the intraoperative ultrasound sequence lie in the z_w x_w plane; that is, the sequence of intraoperative ultrasound images in fig. 9 shows the patient lying on his side.
The three preoperative feature points lie on the z_CT x_CT plane at y_CT = L; call these three points a_CT, b_CT and c_CT, and the corresponding three intraoperative points a_w, b_w and c_w. So a_CT = [x_cta, L, z_cta, 1]^T, b_CT = [x_ctb, L, z_ctb, 1]^T, c_CT = [x_ctc, L, z_ctc, 1]^T; a_w = [x_wa, 0, z_wa, 1]^T, b_w = [x_wb, 0, z_wb, 1]^T, c_w = [x_wc, 0, z_wc, 1]^T. The position information of these points can be acquired.
Since the patient position directions of the preoperative CT and the intraoperative ultrasound are consistent, the registration matrix T_us^ct only needs to translate point a_CT onto point a_w; this already gives the minimum-error transformation, and no further rotation is needed to solve for the optimal transformation. According to a_w = T_us^ct · a_CT, T_us^ct is therefore a pure translation by (x_wa - x_cta, -L, z_wa - z_cta).
As a test, we simulate shifting the point marked in red by 5 in the y_CT direction (indicating a deviation toward the back of the patient), i.e. a new point position newA_CT = [x_cta, L+5, z_cta, 1]^T is obtained. From the registration transformation matrix, its intraoperative position is [x_wa, 5, z_wa, 1]^T, i.e. the point is also offset by 5 toward the back of the patient during surgery, in line with the preoperative direction, which is correct.
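The same check expressed in code, with hypothetical coordinates (only the y offset matters for the test):

```python
import numpy as np

# Hypothetical preoperative/intraoperative coordinates for point a.
L = 40.0
x_cta, z_cta = 12.0, 55.0
x_wa, z_wa = 30.0, 70.0

# Pure-translation registration T_us^ct mapping a_CT onto a_w.
T_us_ct = np.eye(4)
T_us_ct[:3, 3] = (x_wa - x_cta, -L, z_wa - z_cta)

# Test point shifted by 5 toward the patient's back in the CT frame.
newA_ct = np.array([x_cta, L + 5.0, z_cta, 1.0])
newA_us = T_us_ct @ newA_ct
print(newA_us)   # y component is 5: offset by 5 toward the back intraoperatively
```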
After the registration process is completed, we have already fused preoperative CT and intraoperative ultrasound based on the coordinate system w.
Finally, the spatial information obtained by the tracking device is transmitted to the mixed reality device; using T_ms^w, i.e. the coordinate transformation matrix between the tracking device and the mixed reality device, the mixed reality device calculates the actual position of the object as it sees it, so that the virtual object can be superimposed at the correct position in real space.
Here, taking the previous fig. 7 as an example, it is assumed that the ultrasonic probe is moved rightward by 3 and downward by 2. We then calculate where in space the real-time ultrasound image should be placed from the perspective of the mixed reality device.
At this time T_w^us is known, and T_ms^w is also known; because T_ms^us = T_ms^w · T_w^us, the orientation matrix of the intraoperative ultrasound image in the mixed reality device coordinate system follows directly. It can be seen that, from the mixed reality device, the intraoperative ultrasound image is placed correctly, with -n-h-q-2 on one axis and -L1-L2-3 on the other, and without any rotation.
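A numeric re-run of this last composition with hypothetical stand-ins for the fig. 7 lengths n, h, q, L1, L2 (so the printed values are illustrative only, not the patent's): the probe displacement enters T_w^us as a pure translation, and T_ms^us = T_ms^w · T_w^us places the image accordingly.

```python
import numpy as np

# Hypothetical stand-ins for the geometric lengths read off fig. 7.
n, h, q, L1, L2 = 1.0, 0.5, 0.8, 2.0, 1.5

# Device-to-device transform: 180 degrees about z plus a translation (hypothetical).
T_ms_w = np.eye(4)
T_ms_w[:3, :3] = np.diag([-1.0, -1.0, 1.0])
T_ms_w[:3, 3] = (-(n + h + q), 0.0, -(L1 + L2))

# Ultrasound image pose in w: a pure translation in this idealized layout,
# updated after moving the probe rightward by 3 and downward by 2.
T_w_us = np.eye(4)
T_w_us[:3, 3] = (2.0, 0.0, -3.0)

# Image pose as seen by the mixed reality device.
T_ms_us = T_ms_w @ T_w_us
print(T_ms_us[:3, 3])   # only the translation changes; the rotation block is inherited from T_ms^w
```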
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. A mixed reality ultrasound navigation system, characterized by: having an ultrasound probe, a hybrid tracker, a tracking device, and a mixed reality device worn on the head of the doctor; the hybrid tracker is arranged on the ultrasonic probe; the mixed reality device is used for receiving the three-dimensional simulation model and projecting it onto the actual spatial position of the patient; the method comprises the following specific steps:
(1) performing preoperative CT medical image scanning on the patient to obtain a preoperative CT image sequence, and importing it into the mixed reality device;
(2) fixing the position of the tracking device, and obtaining the spatial orientation of the ultrasonic imaging under the coordinate system w of the tracking device, specifically: starting the tracking device and tracking the orientation of the ultrasonic probe through the hybrid tracker, obtaining the orientation transformation matrix T_w^p (T_b^a denotes the homogeneous transformation matrix mapping coordinates from system a into system b); meanwhile, the ultrasonic probe scans the registration module, and the transformation matrix T_p^us between the ultrasound probe provided with the hybrid tracker and the expected imaging position is calculated according to the marker points of the registration module detected by the ultrasonic probe, combined with the orientation of the hybrid tracker; thereby the transformation matrix between the tracking device and the ultrasonic imaging is calculated as T_w^us = T_w^p · T_p^us;
(3) continuously scanning the patient with the ultrasonic probe in the tracking state, from top to bottom, to obtain an intraoperative ultrasound image sequence referenced to the origin and coordinate system w of the tracking device;
(4) selecting at least three corresponding feature points in the intraoperative ultrasound image sequence and the preoperative CT image sequence, and calculating the registration transformation matrix T_us^ct from the preoperative CT sequence to the intraoperative ultrasound sequence, thereby realizing the fusion of the preoperative CT image sequence and the intraoperative ultrasound image sequence;
(5) transmitting the spatial information obtained by the tracking device to the mixed reality device; based on the coordinate transformation matrix T_ms^w from the tracking device to the mixed reality device and the transformation matrix T_w^us between the tracking device and the ultrasonic imaging, the mixed reality device calculates the intraoperative ultrasound image orientation matrix T_ms^us with the mixed reality device as the coordinate center, wherein T_ms^us = T_ms^w · T_w^us, so that the wearer of the mixed reality device sees the fused image of intraoperative ultrasound and preoperative CT in real time in the real scene, with the image located at the actual position of the patient.
2. The mixed reality ultrasound navigation system of claim 1, wherein the registration module comprises a tracker and a series of marker points whose positions are fixed relative to the tracker of the registration module, i.e. from the orientation matrix T_w^c of the tracker the orientations of these marker points are derived.
3. The mixed reality ultrasound navigation system according to claim 2, wherein the step (2) is specifically:
A. firstly, obtaining the spatial positions (a_p, b_p, c_p, d_p) of the detectable marker points of the ultrasonic probe at the desired position of the ultrasound image;
B. acquiring the positions of the corresponding detectable marker points in the original tracking position of the ultrasound image;
C. deducing the spatial positions of the detectable marker points in the original tracking position of the ultrasound image;
D. calculating the transformation matrix T_p^us between the ultrasound probe equipped with the hybrid tracker and the desired imaging location.
4. The mixed reality ultrasound navigation system according to claim 3, wherein the step A is specifically: recording the left point of the first row of the registration module's marker points as point a and the right point of the first row as point b; the point of the second row is c; the point of the third row is d; the four points a, b, c, d lie in one plane parallel to the z_c y_c plane of the tracker on the registration module, the distance between the z_c y_c plane and the plane of points a, b, c, d is h, the distance from the tracker to point a and from point a to point b is d, the spacing between the first and second rows and between the second and third rows is v, and the orientation transformation matrix T_w^c of the tracker on the registration module is known; hence the positions of the four points in the registration-module frame c follow from this geometry, and their positions under the coordinate system p of the hybrid tracker are derived as a_p = (T_w^p)^{-1} · T_w^c · a_c, and likewise b_p, c_p and d_p, where (T_w^p)^{-1} is the inverse matrix of T_w^p;
the step B specifically comprises: identifying, in the image, the imaging pixel positions of the four marker points a', b', c', d' in the original tracking position of the ultrasound image; then, combining the physical length and width represented by each pixel of the ultrasound image, obtaining the distances from each imaged marker to the upper boundary and the left boundary of the image, namely Va', Ha'; Vb', Hb'; Vc', Hc'; Vd', Hd';
the step C is specifically as follows: since the real-time tracking position of the ultrasound image is set at the hybrid tracker on the ultrasound probe, the image at the original tracking position lies on the z_p y_p plane of the hybrid tracker, with its upper-left corner at the origin of the hybrid tracker's own coordinates; under the coordinate system p, the spatial positions of the four marker points a', b', c', d' are given by their distances to the image edges, i.e. the position of point a' is a'_p = [0, -Va', -Ha', 1]^T; the position of point b' is b'_p = [0, -Vb', -Hb', 1]^T; the position of point c' is c'_p = [0, -Vc', -Hc', 1]^T; the position of point d' is d'_p = [0, -Vd', -Hd', 1]^T;
the step D is specifically as follows: from the one-to-one correspondence between (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p), the transformation matrix T_p^us is calculated; the correspondence is a_p = T_p^us · a'_p, b_p = T_p^us · b'_p, c_p = T_p^us · c'_p, d_p = T_p^us · d'_p; since the four marker points (a'_p, b'_p, c'_p, d'_p) are mapped onto (a_p, b_p, c_p, d_p) by a rigid transformation, T_p^us contains only translation and rotation operations and no scaling.
5. The mixed reality ultrasound navigation system of claim 4, wherein in step D the transformation matrix T_p^us is solved as follows:
a) first, translating and rotating (a'_p, b'_p, c'_p, d'_p) as a whole so that point a'_p coincides with point a_p;
b) then traversing rotations about the x, y and z axes respectively, so that the combined error between the transformed (a'_p, b'_p, c'_p, d'_p) and (a_p, b_p, c_p, d_p) is minimal;
c) then fine-tuning the translation to further reduce the combined error;
d) finally obtaining T_p^us.
6. The mixed reality ultrasound navigation system according to claim 3, wherein the step (4) is specifically:
a. selecting more than 3 marker points in the preoperative CT image sequence to obtain their preoperative spatial positions (a_ct, b_ct, c_ct, ...);
b. selecting, in the intraoperative ultrasound image sequence, the points corresponding to the same anatomical structures as (a_ct, b_ct, c_ct, ...), obtaining their spatial positions (a_us, b_us, c_us, ...);
c. first translating and rotating (a_ct, b_ct, c_ct, ...) as a whole so that point a_ct coincides with point a_us;
d. then traversing rotations about the x, y and z axes respectively, so that the combined error between the transformed point set and (a_us, b_us, c_us, ...) is minimized;
e. then fine-tuning the translation to further reduce the combined error, obtaining T_us^ct.
7. The mixed reality ultrasound navigation system of claim 3, wherein in step (5) the coordinate transformation matrix T_ms^w between the mixed reality device and the tracking device is obtained as follows: at the same moment, the mixed reality device and the tracking device both identify and track the hybrid tracker, obtaining the orientation matrices T_ms^p and T_w^p respectively, from which T_ms^w = T_ms^p · (T_w^p)^{-1}, where (T_w^p)^{-1} is the inverse matrix of T_w^p; T_ms^w is a fixed value C, i.e. it does not change with changes in the position of the tracked object.
8. A medical instrument mixed reality ultrasound navigation system, characterized in that a medical instrument (d) equipped with an instrument tracker is tracked during the operation by using the mixed reality ultrasound navigation system according to any one of claims 1 to 7, and the coordinate matrix T_ms^d of the instrument tracker on the medical instrument (d) in the operation area is calculated and obtained under the mixed reality device coordinate system ms.
9. The medical instrument mixed reality ultrasound navigation system of claim 8, specifically being: the orientation matrix T_w^d of the instrument tracker on the medical instrument (d) is detected with reference to the tracking device coordinates, and the coordinate transformation matrix of the medical instrument (d) relative to the mixed reality device is obtained as T_ms^d = T_ms^w · T_w^d.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010715410.XA CN112107366B (en) 2020-07-23 2020-07-23 Mixed reality ultrasonic navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010715410.XA CN112107366B (en) 2020-07-23 2020-07-23 Mixed reality ultrasonic navigation system

Publications (2)

Publication Number Publication Date
CN112107366A 2020-12-22
CN112107366B (en) 2021-08-10

Family

ID=73799450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010715410.XA Active CN112107366B (en) 2020-07-23 2020-07-23 Mixed reality ultrasonic navigation system

Country Status (1)

Country Link
CN (1) CN112107366B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113041083A (en) * 2021-04-22 2021-06-29 江苏瑞影医疗科技有限公司 Holographic projection operation console applied to QMR technology

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004109600A1 (en) * 2003-06-05 2004-12-16 Philips Intellectual Property & Standards Gmbh Adaptive image interpolation for volume rendering
CN102319117A (en) * 2011-06-16 2012-01-18 上海交通大学医学院附属瑞金医院 Arterial intervention implant implanting system capable of fusing real-time ultrasonic information based on magnetic navigation
CN103211655A (en) * 2013-04-11 2013-07-24 深圳先进技术研究院 Navigation system and navigation method of orthopedic operation
US20150067599A1 (en) * 2013-09-05 2015-03-05 General Electric Company Smart and early workflow for quick vessel network detection
CN106846496A (en) * 2017-01-19 2017-06-13 杭州古珀医疗科技有限公司 DICOM images based on mixed reality technology check system and operating method
CN107340871A (en) * 2017-07-25 2017-11-10 深识全球创新科技(北京)有限公司 The devices and methods therefor and purposes of integrated gesture identification and ultrasonic wave touch feedback
CN107536643A (en) * 2017-08-18 2018-01-05 北京航空航天大学 A kind of augmented reality operation guiding system of Healing in Anterior Cruciate Ligament Reconstruction
CN110427102A (en) * 2019-07-09 2019-11-08 河北经贸大学 A kind of mixed reality realization system
CN110537980A (en) * 2019-09-24 2019-12-06 上海理工大学 puncture surgery navigation method based on motion capture and mixed reality technology
CN111420391A (en) * 2020-03-04 2020-07-17 青岛小鸟看看科技有限公司 Head-mounted display system and space positioning method thereof


Also Published As

Publication number Publication date
CN112107366B (en) 2021-08-10

Similar Documents

Publication Publication Date Title
JP7429120B2 (en) Non-vascular percutaneous procedure system and method for holographic image guidance
US20220354580A1 (en) Surgical navigation system, computer for performing surgical navigation method, and storage medium
JP4822634B2 (en) A method for obtaining coordinate transformation for guidance of an object
CN103040525B (en) A kind of multimode medical image operation piloting method and system
US11364004B2 (en) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US6996430B1 (en) Method and system for displaying cross-sectional images of a body
US6546279B1 (en) Computer controlled guidance of a biopsy needle
US6782287B2 (en) Method and apparatus for tracking a medical instrument based on image registration
US11759272B2 (en) System and method for registration between coordinate systems and navigation
US10912537B2 (en) Image registration and guidance using concurrent X-plane imaging
Navab et al. Merging visible and invisible: Two camera-augmented mobile C-arm (CAMC) applications
JP2966089B2 (en) Interactive device for local surgery inside heterogeneous tissue
US20150133770A1 (en) System and method for abdominal surface matching using pseudo-features
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20080123910A1 (en) Method and system for providing accuracy evaluation of image guided surgery
US20060036162A1 (en) Method and apparatus for guiding a medical instrument to a subsurface target site in a patient
CN109416841A (en) Surgical guide of the method and application this method of Imaging enhanced validity in wearable glasses
Zeng et al. A surgical robot with augmented reality visualization for stereoelectroencephalography electrode implantation
US20240164848A1 (en) System and Method for Registration Between Coordinate Systems and Navigation
WO2008035271A2 (en) Device for registering a 3d model
Mirota et al. High-accuracy 3D image-based registration of endoscopic video to C-arm cone-beam CT for image-guided skull base surgery
CN112107366B (en) Mixed reality ultrasonic navigation system
CN113229937A (en) Method and system for realizing surgical navigation by using real-time structured light technology
Uddin et al. Three-dimensional computer-aided endoscopic sinus surgery
US11950951B2 (en) Systems and methods for C-arm fluoroscope camera pose refinement with secondary movement compensation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant