CN114782504A - Tissue organ image space registration method - Google Patents

Tissue organ image space registration method

Info

Publication number
CN114782504A
Authority
CN
China
Prior art keywords
point
point cloud
cloud data
data
scanning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210363101.XA
Other languages
Chinese (zh)
Inventor
张博
陈奎
姚宇航
***
梁晓宁
邱继伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Amite Intelligent Medical Technology Co ltd
Original Assignee
Wuxi Amite Intelligent Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Amite Intelligent Medical Technology Co ltd filed Critical Wuxi Amite Intelligent Medical Technology Co ltd
Priority to CN202210363101.XA
Publication of CN114782504A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/032 Transmission computed tomography [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tesselation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Pulmonology (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a tissue organ image space registration method, which comprises the following steps: step 1, respectively obtaining point cloud data P, Q from CT scanning and ultrasonic images; step 2, respectively preprocessing the point cloud data P, Q obtained in step 1; and step 3, carrying out tissue organ image space registration based on the point cloud data P, Q obtained in step 2.

Description

Tissue organ image space registration method
Technical Field
The invention relates to the technical field of image processing, in particular to a tissue organ image space registration method.
Background
A doctor needs to refer to the information displayed by CT when operating on tissues and organs. In a precise minimally invasive surgical environment, the preoperative CT image and the real-time ultrasonic image are registered to obtain the current position of the tissue and to guide the surgical instrument, so as to provide navigation, obstacle avoidance, automatic tracking and other functions.
Among tissue organs, the liver is soft and deforms easily. Preoperative CT data can be reconstructed into a three-dimensional model, but during the actual operation doctors pay more attention to real-time changes of organs such as the liver; in other words, what matters most clinically is how well the locally reconstructed CT image matches the real-time ultrasonic image. Direct manual registration is time-consuming and labor-intensive, its accuracy is low, and it can hardly meet the practical requirements of local liver puncture.
Therefore, the invention provides a tissue organ image space registration method.
Disclosure of Invention
In order to achieve the purpose of the invention, the following technical scheme is adopted:
a tissue organ image spatial registration method, comprising the steps of:
step 1, respectively obtaining point cloud data P, Q from CT scanning and ultrasonic images;
step 2, respectively preprocessing the point cloud data P, Q obtained in the step 1;
and step 3, carrying out tissue organ image space registration based on the point cloud data P, Q obtained in step 2.
In the tissue organ image space registration method, step 1 comprises: scanning the body by CT to obtain the original point cloud P; obtaining the target point cloud Q from the ultrasonic image; that is, the patient is scanned with CT equipment and the original point cloud data P is output, and the human body is scanned with an ultrasonic probe, the obtained human body data is three-dimensionally reconstructed, and the target point cloud data Q is output.
In the tissue organ image space registration method, step 2 includes: 2.1 point cloud data simplification; 2.2 point cloud noise removal.
In the tissue organ image space registration method, step 2.1 includes:
2.1.1 coarse sampling
Fit the surface of the point cloud data to a curved surface in the global coordinate system, and for any point Pi in the point cloud data solve the corresponding normal vector N_i; another point Pj in the neighborhood of Pi likewise has a normal vector N_j. The included angle between the two is θ:

θ = arccos( (N_i · N_j) / (|N_i| · |N_j|) )

The θ value is used to judge how strongly the curved surface formed by the point Pi and its neighborhood varies: when θ is smaller, the variation amplitude is smaller, the trend is flatter and the feature value is weaker, and part of the neighboring points with weak feature values are removed by a sampling algorithm; when θ is larger, the feature value is stronger and the neighboring point data are retained. This completes the coarse sampling;
2.1.2 Fine sampling
Perform fine sampling on the data with strong feature values: for any point Pi in these data, establish a local coordinate system L, let the normal vector of the point Pi be N_i, and take N_i as the Z axis of the local coordinate system. The formula for converting the local coordinate system to the global coordinate system is:

W = L·R·T

R = [ m_x  m_y  m_z  0 ]
    [ n_x  n_y  n_z  0 ]
    [ p_x  p_y  p_z  0 ]
    [ 0    0    0    1 ]

T = [ 1    0    0    0 ]
    [ 0    1    0    0 ]
    [ 0    0    1    0 ]
    [ f_x  f_y  f_z  1 ]

where m_i, n_i, p_i (i = x, y, z) are the coordinate values in the global coordinate system W of the unit vectors of the X, Y, Z axes of the local coordinate system L, and f_i (i = x, y, z) are the coordinate values of the coordinate origin of the local coordinate system in the global coordinate system.

Let there be M neighboring points around Pi, let Pj be the j-th neighboring point of Pi, and let its normal vector be M_j. The normal curvature K_j at the point Pi is estimated from the osculating circle through Pi and Pj:

[formula given as an image in the original: K_j expressed in terms of α, β and |PiPj|]

where α is the supplementary angle between the position vector of the neighboring point Pj and the Z axis of the local coordinate system, β is the angle between the normal vector of the neighboring point Pj and the Z axis of the local coordinate system, and |PiPj| is the Euclidean distance between the point Pi and the neighboring point;

K_j is approximated by:

[formula given as an image in the original: K_j approximated from the local coordinates (x_j, y_j, z_j) of Pj and the local coordinates M_j = (n_x,j, n_y,j, n_z,j) of its normal vector]

Set a threshold E; the data point Pj is kept when its normal curvature is larger than E and removed when its normal curvature is smaller than E;
2.1.3 Down-sampling
The points with weak feature values retained in step 2.1.1 and the points obtained in step 2.1.2 are down-sampled:
A three-dimensional voxel grid coordinate system is established from the point cloud data, and the points in each voxel are approximately represented by the center of gravity of the voxel, so that all points in a voxel are finally represented by one center-of-gravity point, giving the simplified point cloud data.
In the tissue organ image space registration method, step 2.2 includes:
denoising the simplified point cloud data P1, Q1 obtained in step 2.1 with an improved guided filter, comprising the following steps:
(1) for any points p_i, q_i in the point cloud data P1, Q1 respectively, establish a linear model

q_i = A_k·p_i + b_k

where A_k is a rotation matrix and b_k is a translation matrix;

(2) minimize the reconstruction residual of the linear model

E(A_k, b_k) = Σ_{i∈N(k)} ( ‖A_k·p_i + b_k − q_i‖² + ε‖A_k‖² )

(3) from the above, it can be obtained that

A_k = Σ_i·(Σ_i + ε·I)⁻¹
b_k = μ_i − A_k·μ_i

where μ_i is the mean vector of the neighboring points around p_i, q_i, Σ_i is the 3×3 covariance matrix of the neighboring points around p_i, q_i, I is the identity matrix and ε is the regularization parameter of the filter; the obtained A_k is the noise filtering matrix, and after the noise is filtered out the corresponding denoised point cloud data are P2, Q2.
In the tissue organ image space registration method, step 3 comprises:
3.1 registration iteration
The conversion relationship between the original point cloud P and the target point cloud Q is: Q2 = P2·H, where H is a mapping matrix:

H = [ R_3×3  V_3×1 ]
    [ T_1×3  S     ]

R represents the rotation amount, T represents the translation amount, V represents the projection transformation corresponding to the coordinate axes, S is the overall scale factor, and here V is the zero vector;
3.2 Solve the Euclidean distances with an improved Kd-Tree-based algorithm to obtain the corresponding point pairs of P and Q: the data points are partitioned in the k-dimensional space to build a balanced binary tree data structure, and the search is performed within the constructed Kd-Tree:

1) search downward from the root node of the tree: at each level, compare the query point X with the current node on that node's splitting dimension i; if X is smaller, move to the left subtree of the node, otherwise move to the right subtree, taking the node just visited as the current best point whenever it is closer to X than the best point found so far; repeat until the last layer of the binary tree is reached;

2) backtrack along the search path of the previous step: at each node on the path, check whether its other subtree has not yet been compared and could still contain a closer point; if so, compare that node with the current best point, update the current best point if it is closer, and descend into that subtree in the same way; repeat until the root node of the binary tree is reached;

3) backtracking in this way traverses the relevant part of the data point space, and when the search returns to the root node, the current best point is the true closest point;
3.3 Solve the optimal operators R and T by singular value decomposition

For the corresponding point pairs q_i, p_i determined in step 3.2, use singular value decomposition to solve for the optimal R and T so that the objective function G(R, T) is minimized:

G(R, T) = Σ_{i=1..N} ‖q_i − (R·p_i + T)‖²

where R is the rotation operator, T is the translation operator, and N is the number of corresponding point pairs, with N ≤ Np and N ≤ Nq, where Np is the number of points in the noise-removed original point cloud P2 and Nq is the number of points in the noise-removed target point cloud Q2;
alternatively, the normalized minimum of the sum of squared distances is selected as a new objective function; the improved objective function, i.e. the normalized sum of squared distances, is:

G'(R, T) = (1/N)·Σ_{i=1..N} ‖q_i − (R·p_i + T)‖²
3.4 Update the target point cloud according to the R, T obtained in step 3.3 and compare for iteration

Update the coordinate information of the point cloud P with the rotation parameter R and the translation parameter T, and judge whether the objective function in 3.3 is satisfied; if so, the result converges; if not, continue with steps 3.1-3.3 until the result is satisfied, at which point the matching is successful.
Drawings
FIG. 1 is a flow chart of a method for spatial registration of tissue and organ images;
FIG. 2 is a schematic flow chart of a spatial registration procedure;
FIG. 3 is a schematic diagram of nearest point query using the Kd-Tree algorithm.
Detailed Description
The following detailed description of the present invention will be made with reference to the accompanying drawings 1-3.
As shown in fig. 1, the tissue and organ image spatial registration method of the present invention comprises the following steps:
step 1, point cloud data P, Q is obtained from CT scanning and ultrasonic images;
step 2, respectively preprocessing the point cloud data P, Q obtained in the step 1;
and step 3, carrying out tissue organ image space registration based on the point cloud data P, Q obtained in step 2.
Specifically, the method comprises the following steps:
Step 1, obtaining the point cloud data P, Q from the CT scan and the ultrasonic image
The body is scanned by CT to obtain the original point cloud P, and the target point cloud Q is obtained from the ultrasonic image. The point cloud data P, Q are obtained by three-dimensional reconstruction of the CT scanning information and the ultrasonic image information respectively: the patient is scanned with CT equipment, with the part to be operated on scanned more finely, and the original point cloud P is output; the human body is scanned with a handheld ultrasonic probe, again scanning the part to be operated on more precisely so as to capture deformation that may occur during the operation, the obtained data are three-dimensionally reconstructed, and the target point cloud data Q are output.
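For illustration only, the sketch below shows one way the point clouds P and Q of step 1 could be produced in code; the open3d library, the mesh file inputs and the sampled point count are assumptions and not part of the patented method:

    import numpy as np
    import open3d as o3d  # assumed tooling; any mesh/point-cloud library would work

    def load_point_clouds(ct_surface_path, us_surface_path, n_points=50000):
        """Turn the CT-derived and ultrasound-derived surfaces (assumed to be
        available as mesh files) into the point clouds P and Q as Nx3 arrays."""
        ct_mesh = o3d.io.read_triangle_mesh(ct_surface_path)
        us_mesh = o3d.io.read_triangle_mesh(us_surface_path)
        P = np.asarray(ct_mesh.sample_points_uniformly(n_points).points)  # original point cloud P
        Q = np.asarray(us_mesh.sample_points_uniformly(n_points).points)  # target point cloud Q
        return P, Q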
Step 2, preprocessing point cloud data P, Q
2.1 Simplification of the point cloud data
The point cloud data P, Q obtained in step 1 contains a large amount of information, but also contains a large amount of noise information and redundant data, which have no optimization effect on the final registration result and are not beneficial to data storage, transmission and calculation. The point cloud data P, Q therefore needs to be preprocessed separately as follows to reduce the data before registration.
2.1.1 coarse sampling
First, in a selected coordinate system W (called the global coordinate system) (X_whole, Y_whole, Z_whole), the surface of the point cloud data is fitted to a curved surface, and for any point Pi in the point cloud data the corresponding normal vector N_i can be obtained; another point Pj in the neighborhood of Pi likewise has a normal vector N_j. Their included angle is θ:

θ = arccos( (N_i · N_j) / (|N_i| · |N_j|) )

The θ value is used to judge how strongly the curved surface formed by the point Pi and its neighborhood varies: when θ is smaller, the variation amplitude is smaller, the trend is flatter and the feature value is weaker, and part of the neighboring points with weak feature values are removed by a sampling algorithm; when θ is larger, the feature value is considered stronger, the neighboring point data are retained, and the coarse sampling is complete.
The removal of some points with weak feature values by the sampling algorithm may be performed as follows:
An average sampling algorithm is adopted: a certain sampling step length D is set; every time the step D is traversed, one point is taken as a sampling point and retained, and the remaining points that are not taken are removed.
2.1.2 Fine sampling
The data with strong feature values need to be further fine-sampled, and the contribution of curvature needs to be considered during fine sampling.
The fine sampling method is as follows: for any point Pi in the data with strong feature values, establish a local coordinate system L (X_local, Y_local, Z_local), let the normal vector of the point Pi be N_i, and take the normal vector N_i as the Z axis of the local coordinate system. The formula for converting the local coordinate system to the global coordinate system is:

W = L·R·T

R = [ m_x  m_y  m_z  0 ]
    [ n_x  n_y  n_z  0 ]
    [ p_x  p_y  p_z  0 ]
    [ 0    0    0    1 ]

T = [ 1    0    0    0 ]
    [ 0    1    0    0 ]
    [ 0    0    1    0 ]
    [ f_x  f_y  f_z  1 ]

where m_i, n_i, p_i (i = x, y, z) are the coordinate values in the global coordinate system W of the unit vectors of the X, Y, Z axes of the local coordinate system L, and f_i (i = x, y, z) are the coordinate values of the coordinate origin of the local coordinate system in the global coordinate system.

Let there be M neighboring points around Pi, let Pj be the j-th neighboring point of Pi, and let its normal vector be M_j. The normal curvature K_j at the point Pi is estimated from the osculating circle through Pi and Pj:

[formula given as an image in the original: K_j expressed in terms of α, β and |PiPj|]

where α is the supplementary angle between the position vector of the neighboring point Pj (the vector of the neighboring point Pj in the global coordinate system) and the Z axis of the local coordinate system, β is the angle between the normal vector of the neighboring point Pj and the Z axis of the local coordinate system, and |PiPj| is the Euclidean distance between the point Pi and the neighboring point Pj. To reduce the amount of computation, K_j can be approximated by:

[formula given as an image in the original: K_j approximated from the local coordinates (x_j, y_j, z_j) of Pj and the local coordinates M_j = (n_x,j, n_y,j, n_z,j) of its normal vector]
And setting a threshold value E, keeping the data point Pj when the normal curvature is larger than E, and removing the data point Pj when the normal curvature is smaller than E.
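Because the curvature formulas above survive only as images, the following sketch substitutes a covariance-based surface-variation estimate for the patent's osculating-circle normal curvature; the neighborhood size k, the threshold E and the helper names are assumptions:

    import numpy as np
    from scipy.spatial import cKDTree

    def fine_sample(points, feature_idx, k=10, threshold_e=0.05):
        """Keep feature points whose local curvature estimate exceeds threshold E.
        Substitute estimate: surface variation l0/(l0+l1+l2) from the eigenvalues
        of the k-neighborhood covariance matrix."""
        tree = cKDTree(points)
        kept = []
        for i in feature_idx:
            _, neigh = tree.query(points[i], k=k + 1)
            cov = np.cov(points[neigh].T)                 # 3x3 neighborhood covariance
            evals = np.sort(np.linalg.eigvalsh(cov))      # ascending eigenvalues
            curvature = evals[0] / max(evals.sum(), 1e-12)
            if curvature > threshold_e:
                kept.append(i)
        return np.asarray(kept, dtype=int)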
2.1.3 Down-sampling
The points with weak feature values retained in step 2.1.1 and the points obtained in step 2.1.2 are down-sampled:
When down-sampling, a three-dimensional voxel grid coordinate system is established from the point cloud data, and the points in each voxel are approximately represented by the center of gravity of the voxel, so that all points in a voxel are finally represented by one center-of-gravity point, giving the simplified point cloud data P1, Q1.
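A sketch of this voxel-grid down-sampling (the voxel size is an illustrative assumption):

    import numpy as np

    def voxel_downsample(points, voxel_size=2.0):
        """Replace all points that fall in the same voxel by their center of gravity."""
        keys = np.floor(points / voxel_size).astype(np.int64)
        _, inverse = np.unique(keys, axis=0, return_inverse=True)
        inverse = inverse.ravel()
        counts = np.bincount(inverse).astype(float)
        centroids = np.zeros((len(counts), 3))
        for dim in range(3):
            centroids[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
        return centroids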
2.2 Point cloud noise removal
Noise removal is carried out on the simplified point cloud data P1, Q1 obtained in step 2.1:
using an improved guiding filter to carry out filtering denoising, wherein the denoising steps are as follows:
(1) For any points p_i, q_i in the point cloud data P1, Q1 respectively, establish a linear model

q_i = A_k·p_i + b_k

where A_k is a 3×3 matrix and b_k is a 3×1 matrix.

(2) Minimize the reconstruction residual of the linear model:

E(A_k, b_k) = Σ_{i∈N(k)} ( ‖A_k·p_i + b_k − q_i‖² + ε‖A_k‖² )

(3) From the above, it can be obtained that

A_k = Σ_i·(Σ_i + ε·I)⁻¹
b_k = μ_i − A_k·μ_i

where μ_i is the mean vector of the neighboring points around p_i, q_i, Σ_i is the 3×3 covariance matrix of the neighboring points around p_i, q_i, I is the identity matrix and ε is the regularization parameter of the filter; the obtained A_k is the noise filtering matrix, and after the noise is filtered out the corresponding denoised point cloud data are P2, Q2.
Step 3, tissue organ image space registration.
As shown in fig. 2, the tissue-organ image spatial registration comprises the following steps:
3.1 Point cloud data registration
The point cloud registration uses the ICP algorithm, whose essence is to solve for the parameters of a rigid-body transformation based on the least-squares principle. Ordinary point cloud matching is treated as a rigid-body transformation. When an organ deforms slightly because of breathing or motion, the points with distinct feature values in the organ's point cloud data become particularly important: these points are taken as key points, and the point cloud formed by the key points is regarded as a rigid body when deformation occurs. The two denoised point clouds P2, Q2 can therefore be related by translation and rotation, and the process of solving for the most suitable translation and rotation operators is the registration process.
The conversion relationship between the original point cloud P and the target point cloud Q is: Q2 = P2·H

where H is a mapping matrix, which can be represented as:

H = [ R_3×3  V_3×1 ]
    [ T_1×3  S     ]

R represents the rotation amount, T the translation amount, V the projection transformation corresponding to the coordinate axes, and S the overall scale factor. When the coordinate projections of the two point clouds are the same and their volumes are similar, V is the zero vector and the scale factor S equals 1; when the coordinate projections and volumes differ, V and S change accordingly: the value of V can be solved from the coordinate transformation, and S is the volume ratio of the corresponding point cloud data. The registration process is the process of solving for the optimal operators R, T.
3.2 An improved Kd-Tree-based algorithm computes the Euclidean distances to obtain the nearest corresponding point pairs of P, Q.

The correspondence between the noise-removed original point cloud P2 and the noise-removed target point cloud Q2 is determined: data point association is carried out on P2 and Q2 to obtain a number of corresponding point pairs.
In the traditional ICP registration algorithm, the Euclidean distance of every point needs to be calculated, and this exhaustive calculation consumes a large amount of time and resources during the iteration. Let the point clouds P2, Q2 contain M_P and M_Q points respectively; the time complexity of such an algorithm is then O(M_P·M_Q). To address this problem, the improved Kd-Tree-based algorithm of the invention computes the Euclidean distances with the time complexity reduced to O(log2 M_Q) per query point.
As shown in FIG. 3, the data points are partitioned in the k-dimensional space to construct a balanced binary tree (Kd-Tree), and the search is performed within the constructed Kd-Tree. Taking a query point X as an example, the search proceeds as follows.
1. Search downward from the root node of the tree: at each level, compare the query point X with the current node on that node's splitting dimension i; if X is smaller, move to the left subtree of the node, otherwise move to the right subtree, taking the node just visited as the current best point whenever it is closer to X than the best point found so far; repeat until the last layer of the binary tree is reached.
2. Backtrack along the search path of the previous step: at each node on the path, check whether its other subtree has not yet been compared and could still contain a closer point; if so, compare that node with the current best point, update the current best point if it is closer, and descend into that subtree in the same way; repeat until the root node of the binary tree is reached.
3. Backtracking in this way traverses the relevant part of the data point space, and when the search returns to the root node, the current best point is the true closest point.
It should be noted that a threshold E_PQmax needs to be set during the query process to eliminate wrong closest-point matches (i.e. point pairs whose distance is greater than or equal to the threshold are deleted); only point pairs whose distance is less than E_PQmax remain in the final data.
In this way the corresponding point pairs p_i, q_i of P2 and Q2 are obtained; let their number be N, where N is a positive integer.
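A sketch of the thresholded nearest-neighbor correspondence search; scipy's cKDTree stands in for the Kd-Tree described above, and max_dist plays the role of E_PQmax:

    import numpy as np
    from scipy.spatial import cKDTree

    def find_correspondences(source, target, max_dist):
        """Nearest-neighbor pairs from source (P2) to target (Q2) via a k-d tree;
        pairs farther apart than max_dist are rejected."""
        tree = cKDTree(target)
        dist, idx = tree.query(source, k=1)
        mask = dist < max_dist
        return source[mask], target[idx[mask]]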
3.3 singular value decomposition method for solving optimal operators R and T
For the corresponding point pairs q_i, p_i determined in step 3.2, use singular value decomposition to solve for the optimal R and T so that the objective function G(R, T) is minimized:

G(R, T) = Σ_{i=1..N} ‖q_i − (R·p_i + T)‖²

where R is the rotation operator, T is the translation operator, and N is the number of corresponding point pairs, with N ≤ Np and N ≤ Nq, where Np is the number of points in the noise-removed original point cloud P2 and Nq is the number of points in the noise-removed target point cloud Q2; the objective function represents the sum of the squared distances between the corresponding points.
In order to make the data in the corresponding point pairs comparable while preserving the relative relationship between the data, the objective function is processed further: the normalized minimum of the sum of squared distances is selected as the new objective function, which also considerably improves the accuracy and reliability of the registration. The improved objective function, i.e. the normalized sum of squared distances, is:

G'(R, T) = (1/N)·Σ_{i=1..N} ‖q_i − (R·p_i + T)‖²
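A compact sketch of the SVD solution for R and T (the Kabsch method); the function name and the reflection guard are implementation choices, not taken from the patent:

    import numpy as np

    def solve_rigid_transform(p, q):
        """Least-squares R, T minimizing sum ||q_i - (R p_i + T)||^2 via SVD."""
        mu_p, mu_q = p.mean(axis=0), q.mean(axis=0)
        H = (p - mu_p).T @ (q - mu_q)                  # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        T = mu_q - R @ mu_p
        return R, T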
3.4 updating the target point cloud according to R, T obtained in step 3.3 and comparing for iteration
The coordinate information of the point cloud P is updated with the rotation parameter R and the translation parameter T, and it is then judged whether the objective function in 3.3 is satisfied; if so, the result converges; if not, steps 3.1-3.3 are continued until the result is satisfied, at which point the matching is considered successful.
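A sketch of the iteration in steps 3.1-3.4, reusing the hypothetical find_correspondences and solve_rigid_transform helpers from the sketches above; max_iter, tol and the use of the mean squared distance as the convergence test are assumptions:

    import numpy as np

    def icp(source, target, max_dist, max_iter=50, tol=1e-6):
        """Iterate correspondence search and the SVD solve until the normalized
        squared-distance objective stops decreasing."""
        R_total, T_total = np.eye(3), np.zeros(3)
        current = source.copy()
        prev_err = np.inf
        for _ in range(max_iter):
            p, q = find_correspondences(current, target, max_dist)
            if len(p) == 0:
                break
            R, T = solve_rigid_transform(p, q)
            current = current @ R.T + T                      # apply the update to the cloud
            R_total, T_total = R @ R_total, R @ T_total + T  # accumulate the total transform
            err = np.mean(np.sum((q - (p @ R.T + T)) ** 2, axis=1))
            if abs(prev_err - err) < tol:
                break
            prev_err = err
        return R_total, T_total, current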
According to the invention, CT data and intraoperative ultrasonic data can be combined in real time; registering them as point cloud data unifies them in a certain sense, which plays a key role in helping doctors make correct decisions in response to real-time changes in the patient's condition.

Claims (2)

1. A tissue organ image spatial registration method, characterized by comprising the steps of:
step 1, respectively obtaining point cloud data P, Q from CT scanning and ultrasonic images;
step 2, respectively preprocessing the point cloud data P, Q obtained in the step 1;
and step 3, carrying out tissue organ image space registration based on the point cloud data P, Q obtained in step 2.
2. The tissue organ image spatial registration method according to claim 1, wherein step 1 comprises: scanning a body through CT to obtain an original point cloud P; obtaining a target point cloud Q by using the ultrasonic image; scanning a patient by utilizing CT equipment, and then outputting original point cloud data P; scanning a human body by using an ultrasonic probe, performing three-dimensional reconstruction on the obtained human body data, and outputting target point cloud data Q.
CN202210363101.XA 2022-04-07 2022-04-07 Tissue organ image space registration method Pending CN114782504A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210363101.XA CN114782504A (en) 2022-04-07 2022-04-07 Tissue organ image space registration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210363101.XA CN114782504A (en) 2022-04-07 2022-04-07 Tissue organ image space registration method

Publications (1)

Publication Number Publication Date
CN114782504A true CN114782504A (en) 2022-07-22

Family

ID=82426681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210363101.XA Pending CN114782504A (en) 2022-04-07 2022-04-07 Tissue organ image space registration method

Country Status (1)

Country Link
CN (1) CN114782504A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115861392A (en) * 2023-01-13 2023-03-28 无锡艾米特智能医疗科技有限公司 Soft tissue registration method based on image data processing


Similar Documents

Publication Publication Date Title
US9785858B2 (en) Method and system for hierarchical parsing and semantic navigation of full body computed tomography data
US8358819B2 (en) System and methods for image segmentation in N-dimensional space
CN107871325B (en) Image non-rigid registration method based on Log-Euclidean covariance matrix descriptor
CN114119549A (en) Multi-modal medical image three-dimensional point cloud registration optimization method
CN113112609A (en) Navigation method and system for lung biopsy bronchoscope
CN113570627B (en) Training method of deep learning segmentation network and medical image segmentation method
CN116580068B (en) Multi-mode medical registration method based on point cloud registration
CN115578320A (en) Full-automatic space registration method and system for orthopedic surgery robot
CN114792326A (en) Surgical navigation point cloud segmentation and registration method based on structured light
CN111127488B (en) Method for automatically constructing patient anatomical structure model based on statistical shape model
CN114782504A (en) Tissue organ image space registration method
CN114066953A (en) Three-dimensional multi-modal image deformable registration method for rigid target
CN114202566A (en) Glue path guiding and positioning method based on shape coarse registration and ICP point cloud fine registration
CN116650115A (en) Orthopedic surgery navigation registration method based on UWB mark points
CN115272429A (en) Feature point-based image registration method, system and computer-readable storage medium
CN111260704A (en) Vascular structure 3D/2D rigid registration method and device based on heuristic tree search
CN116612166A (en) Registration fusion algorithm for multi-mode images
WO2022183851A1 (en) Lung lobe segmentation method based on digital human technology
CN116363181A (en) Feature-based CT image and ultrasonic image liver registration method
CN116327362A (en) Navigation method, device, medium and electronic equipment in magnetic probe auxiliary bronchus operation
CN113160417B (en) Multi-organ three-dimensional reconstruction control method based on urinary system
Krawczyk et al. YOLO and morphing-based method for 3D individualised bone model creation
CN113345112B (en) Long bone fracture surface point cloud preprocessing and registering method
CN113256693A (en) Multi-view registration method based on K-means and normal distribution transformation
Erdt et al. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination