CN114027975A - CT three-dimensional visualization system of puncture surgical robot - Google Patents

CT three-dimensional visualization system of puncture surgical robot

Info

Publication number
CN114027975A
CN114027975A
Authority
CN
China
Prior art keywords
robot
dimensional
puncture
image
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111178490.0A
Other languages
Chinese (zh)
Inventor
欧阳春
甘中学
牛福永
张宏达
管宇翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fudan University
Original Assignee
Fudan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fudan University filed Critical Fudan University
Priority to CN202111178490.0A
Publication of CN114027975A
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061 - Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/062 - Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body, using magnetic field
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2065 - Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Robotics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The invention belongs to the technical field of medical equipment, and in particular relates to a CT three-dimensional visualization system of a puncture surgical robot. The system comprises an ultrasonic instrument system, a space positioning system, an image acquisition system and a computer system. The ultrasonic instrument system is used for acquiring real-time B-mode ultrasound images; the space positioning system adopts an electromagnetic positioning system and is used for determining the spatial position of the puncture needle; the image acquisition system is used for transferring the two-dimensional images into the computer so that a three-dimensional image can be reconstructed; the computer system is used for receiving the data of the other systems and performing three-dimensional reconstruction and visualization. Because it relies on electromagnetic positioning, the system offers strong positioning capability, strong anti-interference capability and a compact size; it achieves good intraoperative accuracy, guarantees high precision for the subsequent three-dimensional visualization, and enables accurate path planning during the puncture operation.

Description

CT three-dimensional visualization system of puncture surgical robot
Technical Field
The invention belongs to the technical field of medical equipment, and particularly relates to a CT three-dimensional visualization system of a puncture surgery robot.
Background
Puncture surgery is increasingly favored by doctors and patients because of advantages such as small trauma, mild pain, few surgical complications and fast recovery. Clinically, puncture procedures are mostly guided by ultrasound images, but manual puncture is often limited by low accuracy and low precision. The image-guided puncture robot therefore urgently needs to be popularized and applied, and this image-guided mode places higher demands on the three-dimensional images it constructs. Consequently, a CT three-dimensional visualization system for a puncture surgical robot with high image accuracy and high positioning accuracy is needed.
Disclosure of Invention
The invention aims to provide a CT three-dimensional visualization system of a puncture surgery robot, which has high image precision and high positioning precision.
The invention provides a CT three-dimensional visualization system of a puncture surgical robot, which comprises an ultrasonic instrument system, a space positioning system, an image acquisition system and a computer system; wherein:
the ultrasonic instrument system is used for acquiring real-time B-mode ultrasound images;
the space positioning system adopts electromagnetic positioning to determine the space positioning of the puncture needle;
the image acquisition system is used for transferring the two-dimensional images into the computer so that a three-dimensional image can be reconstructed;
and the computer system is used for receiving the data of the other systems and carrying out three-dimensional reconstruction and visualization on the data.
Further, the space positioning system comprises an electromagnetic transmitter, an electromagnetic receiver and an electronic unit; the electromagnetic transmitter and the electromagnetic receiver are respectively connected with the electronic unit and used for transmitting and receiving pose data.
Further, the electromagnetic receiver comprises an ultrasonic probe electromagnetic receiver and a puncture needle tail end electromagnetic receiver; the ultrasonic probe electromagnetic receiver is used for determining the spatial position of a pixel in a two-dimensional ultrasonic image in a three-dimensional lattice; the electromagnetic receiver at the tail end of the puncture needle is used for monitoring the pose of the puncture needle.
The electromagnetic transmitter is fixed relative to the surgical robot base and the surgical bed, and the electromagnetic receiver is respectively fixed on the ultrasonic probe and the tail end of the puncture needle so as to transmit and receive signals.
Furthermore, the computer system comprises a software system running on a computer; the software system comprises a 2D ultrasound image and position acquisition module, a 2D ultrasound image preprocessing and feature point extraction module, a voxel gray body calculation and three-dimensional crystal visualization module, a puncture robot motion parameter calculation module and a puncture robot motion control module. By acquiring 2D images of the lesion together with their pose information and building a three-dimensional model of the lesion with a three-dimensional reconstruction technique, the software makes it convenient for a doctor to plan the operation on the reconstructed three-dimensional model of the lesion and to specify a suitable needle insertion route.
The 2D ultrasound image and position acquisition module: in a freehand three-dimensional ultrasound system based on electromagnetic positioning, the software must acquire the 2D ultrasound images of the lesion area and, at the same time, the pose data corresponding to each image, so that a three-dimensional model of the lesion can be established.
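As an illustration only (the patent does not specify the acquisition hardware or its interface), the following minimal Python sketch shows one way such a module could pair each 2D ultrasound frame with a time-matched electromagnetic pose sample; `grab_frame()` and `read_pose()` are hypothetical placeholders for the frame-grabber and tracker callbacks, and `max_skew` is an assumed synchronization tolerance.

```python
import time
from dataclasses import dataclass

import numpy as np


@dataclass
class TrackedFrame:
    """One 2D ultrasound image together with the probe pose at acquisition time."""
    image: np.ndarray       # H x W gray-scale B-mode image
    pose: np.ndarray        # 4x4 homogeneous transform: probe receiver -> transmitter
    timestamp: float        # acquisition time in seconds


def acquire_tracked_frames(grab_frame, read_pose, n_frames, max_skew=0.02):
    """Collect n_frames whose image and pose samples are close in time.

    grab_frame() -> (image, t_img) and read_pose() -> (pose, t_pose) are
    hypothetical device callbacks; max_skew is the allowed time difference (s).
    """
    frames = []
    while len(frames) < n_frames:
        image, t_img = grab_frame()
        pose, t_pose = read_pose()
        if abs(t_img - t_pose) <= max_skew:         # keep only well-synchronized pairs
            frames.append(TrackedFrame(image, pose, 0.5 * (t_img + t_pose)))
        else:
            time.sleep(0.001)                       # let the two streams realign
    return frames
```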
The 2D ultrasound image preprocessing and feature point extraction module: because a B-mode ultrasound image contains many regions of useless information, this module keeps the high-density, information-rich regions of the image and discards the useless regions in order to reduce the computational load of the three-dimensional reconstruction. When the ultrasound probe is calibrated, 5 bright-spot regions must be extracted from the image, and the software determines the coordinates of these 5 points in the image coordinate system as the feature points.
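The patent does not disclose the extraction algorithm itself; as a minimal sketch, assuming the calibration phantom appears as five bright blobs on an otherwise dark B-mode image, the centroids could be located with simple thresholding and connected-component labelling (the percentile threshold and blob-size ranking are assumptions of this illustration):

```python
import numpy as np
from scipy import ndimage


def extract_bright_spots(image, n_spots=5):
    """Return the (row, col) centroids of the n_spots brightest blobs in a B-mode image.

    A sketch only: a fixed percentile threshold and connected-component
    labelling stand in for whatever preprocessing the original system uses.
    """
    threshold = np.percentile(image, 99.5)            # keep only the brightest pixels
    mask = image > threshold
    labels, n_labels = ndimage.label(mask)            # connected bright regions
    if n_labels < n_spots:
        raise ValueError("fewer bright regions than expected calibration points")
    sizes = ndimage.sum(mask, labels, index=range(1, n_labels + 1))
    keep = np.argsort(sizes)[-n_spots:] + 1           # the n_spots largest regions
    centroids = ndimage.center_of_mass(image, labels, index=keep.tolist())
    return np.array(centroids)                        # pixel coordinates of the feature points
```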
The voxel gray body calculation and three-dimensional crystal visualization module: the module first calculates the coordinates of the pixels of each two-dimensional image within the three-dimensional lattice, using the probe calibration result and the coordinate transformation between the electromagnetic transmitter and the three-dimensional image. It then computes the voxel gray values from the spatial relationship between voxels and pixels in the three-dimensional lattice. Finally, the voxel-filled three-dimensional lattice is displayed with a three-dimensional visualization technique, and the doctor can perform interactive operations such as sectioning and measurement on the lattice in order to select the optimal needle insertion path.
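A minimal Python sketch of this pixel-to-voxel mapping is given below, assuming isotropic voxels, a 4x4 probe calibration matrix `T_image_to_receiver` and a registration matrix `T_transmitter_to_volume` (both names are placeholders of this illustration); averaging the pixels that fall into each voxel is only one of several possible gray-filling strategies.

```python
import numpy as np


def fill_voxel_volume(frames, T_image_to_receiver, T_transmitter_to_volume,
                      volume_shape, voxel_size_mm):
    """Bin 2D pixels into the 3D lattice and average the gray values per voxel.

    frames: iterable of objects with .image (H x W gray values) and .pose
            (4x4 receiver -> transmitter transform), e.g. the TrackedFrame above.
    T_image_to_receiver: 4x4 calibration matrix mapping homogeneous pixel
            coordinates (u, v, 0, 1) into the receiver frame (includes pixel size).
    T_transmitter_to_volume: 4x4 transform placing the transmitter frame in the lattice.
    voxel_size_mm: isotropic voxel edge length (scalar, assumed).
    """
    accum = np.zeros(volume_shape, dtype=np.float64)
    count = np.zeros(volume_shape, dtype=np.int64)

    for f in frames:
        h, w = f.image.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))          # pixel grid, shape (h, w)
        pix = np.stack([u.ravel(), v.ravel(),
                        np.zeros(u.size), np.ones(u.size)])     # homogeneous pixel coords
        # transform chain: image -> probe receiver -> transmitter -> volume lattice
        T = T_transmitter_to_volume @ f.pose @ T_image_to_receiver
        pts_mm = (T @ pix)[:3]
        idx = np.round(pts_mm / voxel_size_mm).astype(int)      # nearest voxel index
        inside = np.all((idx >= 0) & (idx < np.array(volume_shape)[:, None]), axis=0)
        ii, jj, kk = idx[:, inside]
        np.add.at(accum, (ii, jj, kk), f.image.ravel()[inside])
        np.add.at(count, (ii, jj, kk), 1)

    gray = np.zeros(volume_shape, dtype=np.float32)
    np.divide(accum, count, out=gray, where=count > 0)          # mean gray per visited voxel
    return gray
```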
The puncture robot motion parameter calculation module: the module first maps the needle insertion path from the three-dimensional image into the robot space through a coordinate transformation, and then calculates the motion parameters of the robot from the geometric relationship among the robot, the needle insertion path and the puncture needle: three translation amounts of the arm, two rotation angles of the wrist, and the needle insertion depth.
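The actual robot kinematics are not disclosed in the patent; the sketch below, under the assumption of a wrist whose two rotation angles are a yaw about the base z axis and a pitch of the needle axis, shows how the three translations, the two angles and the insertion depth could be derived once the entry and target points are mapped into the robot frame. All parameter names are illustrative.

```python
import numpy as np


def needle_motion_parameters(entry_img, target_img, T_image_to_robot, tip_home_robot):
    """Map an entry/target pair into robot space and derive the motion parameters.

    A sketch under assumed kinematics: the arm translates the needle tip to the
    entry point, the wrist orients the needle axis with a yaw (about z) and a
    pitch, and the needle is then advanced along its axis by the insertion depth.
    entry_img/target_img are 3-vectors in the image (lattice) frame;
    T_image_to_robot is the 4x4 registration matrix; tip_home_robot is the
    needle-tip position before the arm translation.
    """
    def to_robot(p):
        return (T_image_to_robot @ np.append(p, 1.0))[:3]

    entry, target = to_robot(entry_img), to_robot(target_img)

    translation = entry - tip_home_robot            # three arm translation amounts
    direction = target - entry
    depth = float(np.linalg.norm(direction))        # needle insertion depth
    d = direction / depth

    yaw = float(np.arctan2(d[1], d[0]))             # rotation about the robot z axis
    pitch = float(np.arctan2(-d[2], np.hypot(d[0], d[1])))   # elevation of the needle axis

    return translation, (yaw, pitch), depth
```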
The puncture robot motion control module: based on the motion planning of the robot, this module implements the motion functions required by the robotic puncture operation. The robot motion is divided, in order, into arm translation, wrist rotation, needle insertion and needle withdrawal. The arm translation is realized by dragging the robot with an external force, similar to clinical manual puncture. To keep the needle tip stationary while the wrist adjusts the needle orientation, the software realizes a fixed-point posture adjustment function through a needle-tip displacement compensation algorithm (see "Research on a puncture surgical robot assistance system based on three-dimensional ultrasound images"). Safety and soft-tissue deformation must be considered during robotic needle insertion, so the software applies fuzzy control of the needle insertion speed based on force/position feedback. After tissue sampling or treatment is completed, the robot quickly withdraws the puncture needle to a safe position along the insertion path.
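The fuzzy controller itself is not given in the patent; the following deliberately simplified stand-in only illustrates the idea of modulating the commanded insertion speed from the measured axial force and the remaining insertion distance, with all thresholds chosen arbitrarily for the sketch:

```python
import numpy as np


def insertion_speed(force_n, depth_mm, planned_depth_mm,
                    v_max=5.0, v_min=0.2, f_soft=2.0, f_hard=8.0):
    """Simplified stand-in for the force/position-feedback speed regulation.

    The patented system uses fuzzy control; here two membership-like weights,
    one from the axial puncture force (N) and one from the remaining insertion
    distance (mm), scale the commanded speed (mm/s) between v_min and v_max.
    All thresholds are illustrative assumptions.
    """
    force_w = float(np.clip((f_hard - force_n) / (f_hard - f_soft), 0.0, 1.0))
    remaining = max(planned_depth_mm - depth_mm, 0.0)
    if remaining == 0.0:
        return 0.0                                  # planned depth reached: stop feeding
    distance_w = float(np.clip(remaining / planned_depth_mm, 0.0, 1.0))
    w = min(force_w, distance_w)                    # conservative combination of the two cues
    return v_min + (v_max - v_min) * w
```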
In the puncture robot motion parameter calculation module, the desired position of the robot can be expressed in the Laplace domain. First, the impedance-control position model used by the method is established as:
Fh = Md·(Ẍd - Ẍc) + Bd·(Ẋd - Ẋc) + Kd·(Xd - Xc);
wherein: Xc denotes the current position, Xd the desired position, Md the virtual inertia matrix of the robot, Bd the virtual damping matrix of the robot, and Kd the virtual stiffness matrix of the robot. In this model, Md, Bd and Kd are the impedance characteristic coefficients of the robot and are all diagonal matrices. The virtual inertia matrix Md mainly influences impacts and motion phases with large velocity changes; the virtual damping matrix Bd mainly influences external disturbances and fast position changes; the virtual stiffness matrix Kd mainly influences motion near low-speed or stationary states.
The desired position of the robot can then be expressed in the Laplace domain as:
ΔX(s) = Fh(s) / (Md·s² + Bd·s + Kd) = Fh(s)·H(s);  (1)
wherein ΔX(s) is the Laplace transform of ΔX = Xd - Xc, Fh(s) is the Laplace transform of Fh, s is the Laplace variable, Fh is the external force measured by the six-dimensional force sensor, and H(s) = 1/(Md·s² + Bd·s + Kd) is the transfer function from the measured force to the position correction.
From this analysis, the compliant position controller in the robot joint space is:
(Equation (2), the joint-space compliant position control law, is reproduced only as an image, BDA0003296364220000031, in the original text.)
wherein, f is the external acting force obtained by the six-dimensional force sensor.
An expression of velocity and acceleration is obtained by using a backward difference method:
ΔX(k) = a0·ΔF(k) + a1·Δx(k-1) + a2·Δx(k-2);  (3)
wherein:
(Equation (4), which gives the coefficients a0, a1 and a2 of recursion (3), is reproduced only as an image, BDA0003296364220000032, in the original text.)
wherein: fh is Md(Xd-Xc)+Bd(Xd-Xc)+Kd(Xd-Xc);
Xc represents the current position, Xd represents the desired position, Md represents the virtual inertial matrix of the robot, Bd represents the virtual damping matrix of the robot, Kd represents the virtual stiffness matrix of the robot, and T is the sampling time.
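Equations (2) and (4) appear only as images in the original text, so the coefficients used below are re-derived, as an assumption, by substituting the backward differences Δẋ(k) = (Δx(k) - Δx(k-1))/T and Δẍ(k) = (Δx(k) - 2Δx(k-1) + Δx(k-2))/T² into the impedance model; the per-axis Python sketch then iterates recursion (3).

```python
import numpy as np


def admittance_coefficients(md, bd, kd, T):
    """Backward-difference discretization of md*ΔẌ + bd*ΔẊ + kd*ΔX = ΔF (one axis).

    Re-derived here as an assumption, because equation (4) is only reproduced
    as an image in the original text; md, bd, kd are the diagonal inertia,
    damping and stiffness entries for one axis and T is the sampling time.
    """
    den = md + bd * T + kd * T**2
    a0 = T**2 / den
    a1 = (2.0 * md + bd * T) / den
    a2 = -md / den
    return a0, a1, a2


def compliant_position_step(delta_f, x_hist, coeffs):
    """One step of ΔX(k) = a0·ΔF(k) + a1·Δx(k-1) + a2·Δx(k-2) for one axis."""
    a0, a1, a2 = coeffs
    dx_k = a0 * delta_f + a1 * x_hist[-1] + a2 * x_hist[-2]
    x_hist.append(dx_k)
    return dx_k


# Example: compliant response of one translational axis to a constant 5 N hand force.
coeffs = admittance_coefficients(md=2.0, bd=40.0, kd=500.0, T=0.002)
hist = [0.0, 0.0]                      # Δx(k-2), Δx(k-1)
for _ in range(5):
    compliant_position_step(5.0, hist, coeffs)
print(hist[2:])                        # offsets growing toward Fh/Kd = 0.01 m
```

With these coefficients the recursion settles at ΔX = Fh/Kd for a constant force, which is the compliant steady state expected from the impedance model.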
Furthermore, the three-dimensional crystal visualization module adopts the Ray-Casting algorithm. Ray-Casting belongs to the volume rendering algorithms; it is a light-transmission method and yields high-quality, clear images. Its basic principle follows the visual imaging mechanism: first an idealized physical model is constructed (each voxel is regarded as a particle that can project, emit and reflect light); then each voxel is assigned a specific color value (for a gray-scale image, a gray value, also called light intensity) and an opacity according to the illumination model, the shading model and the medium properties of the voxel; next, from every pixel on the screen a ray is cast along the chosen viewing direction; the ray passes through the three-dimensional data field and intersects a number of voxels, several equidistant or non-equidistant sampling points are selected on the ray, and the color value and opacity of all sampling points along the ray are obtained by interpolation (nearest-neighbor or trilinear interpolation). See "Research on a puncture surgical robot assistance system based on three-dimensional ultrasound images".
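For the trilinear case mentioned above, a minimal sampling helper might look as follows; the index convention and the requirement that the sample point lie strictly inside the lattice are restrictions of this sketch, not of the patented system.

```python
import numpy as np


def trilinear_sample(volume, point):
    """Trilinearly interpolate a gray value at a fractional lattice position.

    volume is a 3D array of voxel gray values; each coordinate of point must
    satisfy 0 <= coordinate < size - 1 along its axis (a sketch-only restriction).
    """
    x, y, z = point
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    dx, dy, dz = x - x0, y - y0, z - z0

    c = volume[x0:x0 + 2, y0:y0 + 2, z0:z0 + 2].astype(float)   # 2x2x2 neighbourhood
    c = c[0] * (1 - dx) + c[1] * dx      # blend along the first axis
    c = c[0] * (1 - dy) + c[1] * dy      # then the second
    return c[0] * (1 - dz) + c[1] * dz   # then the third
```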
The invention has the beneficial effects that:
Thanks to electromagnetic positioning, the system has strong positioning capability, strong anti-interference capability and a compact size. It achieves good intraoperative accuracy, provides a solid accuracy guarantee for the subsequent three-dimensional visualization so that the final three-dimensional result is more accurate, and enables accurate path planning during the puncture operation.
Drawings
Fig. 1 is a structural diagram of a CT three-dimensional visualization system of a puncture surgical robot according to the present invention.
Fig. 2 is a data acquisition flow diagram of an electromagnetic positioning system of a CT three-dimensional visualization system of a puncture surgical robot according to the present invention.
Fig. 3 is a three-dimensional visualization step illustration of a CT three-dimensional visualization system of a puncture surgical robot according to the present invention.
Detailed Description
The present invention will be further described in detail with reference to the following specific examples:
the invention aims to provide a CT three-dimensional visualization software system of a puncture surgery robot, which has high image precision and high positioning precision.
As shown in fig. 1, in order to ensure high image precision and high positioning precision during use, the invention relates to a CT three-dimensional visualization software system for a puncture surgical robot, comprising:
the system comprises an ultrasonic instrument system, a space positioning system, an image acquisition system and a computer system;
the ultrasonic instrument system is used for acquiring an implemented B-mode ultrasonic image; the space positioning system adopts an electromagnetic positioning system and is used for determining the space positioning of the puncture needle; the image acquisition system is used for acquiring a two-dimensional image into a computer so as to reconstruct a three-dimensional image; the computer is used for receiving data of the rest systems, and reconstructing and visualizing the data in three dimensions.
The advantage of the invention is that, through electromagnetic positioning, the positioning capability is strong, the anti-interference capability is strong and the device is compact; good intraoperative accuracy can be obtained, the subsequent three-dimensional visualization is given a solid accuracy guarantee so that the final three-dimensional result is more accurate, and accurate path planning can be obtained in the puncture operation.
Further, as shown in fig. 2, the electromagnetic positioning system includes an electromagnetic transmitter, an electromagnetic receiver and an electronic unit; the electromagnetic transmitter and the electromagnetic receiver are respectively connected with the electronic unit and used for transmitting and receiving pose data.
Further, the electromagnetic receiver comprises an ultrasonic probe electromagnetic receiver and a puncture needle tail end electromagnetic receiver; the ultrasonic probe electromagnetic receiver is used for determining the spatial position of a pixel in a two-dimensional ultrasonic image in a three-dimensional lattice; the electromagnetic receiver at the tail end of the puncture needle is used for monitoring the pose of the puncture needle. The transmitter is fixed relative to the surgical robot base and the operating bed, and the receiver is respectively fixed on the ultrasonic probe and the tail end of the puncture needle, so as to transmit and receive signals.
As shown in fig. 3, the computer system further includes a 2D ultrasound image and position acquisition module, a 2D ultrasound image preprocessing and feature point extraction module, a voxel gray body calculation and three-dimensional crystal visualization module, a puncture robot motion parameter calculation module, and a puncture robot motion control module. By acquiring the 2D image and the pose information of the focus and establishing the three-dimensional model of the focus by using the three-dimensional reconstruction technology, a doctor can conveniently perform surgical planning on the three-dimensional reconstruction model of the focus and designate a proper needle insertion route.
In actual operation, the 2D ultrasound image and position acquisition module acquires the 2D ultrasound images of the lesion area and the pose data corresponding to each image so that a three-dimensional image of the lesion can be established, while the 2D ultrasound image preprocessing and feature point extraction module removes the useless information regions of the B-mode images to reduce the computational load of the three-dimensional reconstruction. The voxel gray body calculation and three-dimensional crystal visualization module calculates the coordinates of the pixels of the two-dimensional images within the three-dimensional lattice from the probe calibration result and the coordinate transformation between the electromagnetic transmitter and the three-dimensional image, then computes the voxel gray values from the spatial relationship between voxels and pixels in the three-dimensional lattice, and finally displays the voxel-filled three-dimensional lattice through a three-dimensional visualization technique, allowing the doctor to perform interactive operations such as sectioning and measurement on the lattice in order to select the optimal needle insertion path.
Further, the three-dimensional crystal visualization module adopts the Ray-Casting algorithm. Ray-Casting belongs to the volume rendering algorithms; it is a light-transmission method and yields high-quality, clear images.
In actual operation, the ray-casting method first constructs an idealized physical model according to the visual imaging mechanism. Each voxel is then assigned a specific color value and opacity according to the illumination model, the shading model and the medium properties of the voxel. Next, from each pixel on the screen a ray is emitted along the chosen viewing direction; the ray passes through the three-dimensional data field and intersects a number of voxels. Several equidistant or non-equidistant sampling points are selected on the ray (the intersections of the ray with the voxels can be taken as sampling points), and the color value and opacity of all sampling points along the ray are obtained by interpolation. Finally, the color values and opacities of all sampling points on the ray are composited and accumulated in front-to-back or back-to-front order; ray propagation stops when the accumulated opacity reaches 1 or the ray leaves the three-dimensional data field, and the color value synthesized at that moment is taken as the color value of the corresponding screen pixel.
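A minimal sketch of this front-to-back accumulation with early ray termination is shown below; the linear "window" transfer function, the opacity scale and the nearest-neighbour sampling are illustrative assumptions, not the method claimed in the patent.

```python
import numpy as np


def composite_ray(volume, origin, direction, step=0.5, max_steps=2000,
                  window=(20.0, 200.0)):
    """Front-to-back composition of one ray through a gray-value volume.

    A toy transfer function maps gray values inside `window` linearly to an
    opacity per sample; accumulation stops early once the accumulated opacity
    is close to 1 or the ray leaves the three-dimensional data field.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    p = np.asarray(origin, dtype=float)
    upper = np.asarray(volume.shape) - 1

    color_acc, alpha_acc = 0.0, 0.0
    for _ in range(max_steps):
        if np.any(p < 0) or np.any(p > upper):
            break                                    # ray has left the data field
        g = float(volume[tuple(np.round(p).astype(int))])   # nearest-neighbour sample
        lo, hi = window
        alpha = 0.05 * float(np.clip((g - lo) / (hi - lo), 0.0, 1.0))
        # front-to-back compositing of color and opacity
        color_acc += (1.0 - alpha_acc) * alpha * g
        alpha_acc += (1.0 - alpha_acc) * alpha
        if alpha_acc >= 0.99:                        # early ray termination
            break
        p = p + step * d
    return color_acc, alpha_acc
```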
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and features disclosed herein.

Claims (8)

1. A CT three-dimensional visualization system of a puncture surgical robot, characterized by comprising an ultrasonic instrument system, a space positioning system, an image acquisition system and a computer system; wherein:
the ultrasonic instrument system is used for acquiring real-time B-mode ultrasound images;
the space positioning system adopts electromagnetic positioning to determine the space positioning of the puncture needle; comprises an electromagnetic transmitter, an electromagnetic receiver and an electronic unit; the electromagnetic transmitter and the electromagnetic receiver are respectively connected with the electronic unit and used for transmitting and receiving pose data;
the electromagnetic receiver comprises an ultrasonic probe electromagnetic receiver and a puncture needle tail end electromagnetic receiver; the ultrasonic probe electromagnetic receiver is used for determining the spatial position of a pixel in a two-dimensional ultrasonic image in a three-dimensional lattice; the electromagnetic receiver at the tail end of the puncture needle is used for monitoring the pose of the puncture needle;
the electromagnetic transmitter is fixed relative to the surgical robot base and the surgical bed, and the electromagnetic receiver is respectively fixed on the ultrasonic probe and the tail end of the puncture needle so as to transmit and receive signals;
the image acquisition system is used for acquiring the two-dimensional image into a computer so as to reconstruct a three-dimensional image;
the computer system is used for receiving data of other systems and carrying out three-dimensional reconstruction and visualization on the data; the system comprises a software system running on a computer, wherein the software system comprises a 2D ultrasonic image and position acquisition module thereof, a 2D ultrasonic image preprocessing and feature point extraction module, a voxel gray body calculation and three-dimensional crystal visualization module, a puncture robot motion parameter calculation module and a puncture robot motion control module; by acquiring the 2D image and the pose information of the focus and establishing the three-dimensional model of the focus by using the three-dimensional reconstruction technology, a doctor can conveniently perform surgical planning on the three-dimensional reconstruction model of the focus and designate a proper needle insertion route.
2. The CT three-dimensional visualization system for the paracentesis surgical robot according to claim 1, wherein the 2D ultrasound image and the position acquisition module thereof are configured to simultaneously acquire the 2D ultrasound image of the lesion area and the pose data corresponding to each image in a freehand three-dimensional ultrasound system based on electromagnetic positioning so as to build a three-dimensional model of the lesion.
3. The CT three-dimensional visualization system of the puncture surgical robot according to claim 2, wherein in the 2D ultrasonic image preprocessing and feature point extracting module, high-density important information area images are reserved and useless area images are removed through preprocessing; and extracting 5 bright spot areas on the image when the ultrasonic probe is calibrated, and determining the coordinates of the 5 points in the image coordinate system as the characteristic points.
4. The CT three-dimensional visualization system of the paracentesis surgical robot according to claim 2, wherein the voxel gray body calculation and three-dimensional crystal visualization module calculates coordinates of pixel points in the two-dimensional image in a three-dimensional lattice according to a probe calibration result and a coordinate transformation relation between the electromagnetic emitter and the three-dimensional image, calculates voxel gray scale according to a spatial position relation between voxels and pixels in the three-dimensional lattice, displays the three-dimensional lattice filled with the voxels through a three-dimensional visualization technology, and allows a doctor to perform sectioning, measurement and other interactive operations on the three-dimensional lattice so as to select an optimal needle advancing path.
5. The CT three-dimensional visualization system of the puncture surgical robot as claimed in claim 4, wherein the puncture robot motion parameter calculation module firstly maps the needle insertion path in the three-dimensional image to the robot space through coordinate transformation, and then calculates the motion parameters of the robot according to the geometric relationship between the robot and the needle insertion path and the puncture needle, and the motion parameters include: three translation amounts of the arm, two rotation angles of the wrist, and the depth of the needle insertion.
6. The CT three-dimensional visualization system of the puncture surgical robot as claimed in claim 5, wherein the puncture robot motion control module realizes, based on the motion planning of the robot, the motion functions required by the robotic puncture operation; the robot motion is divided, in order, into arm translation, wrist rotation, needle insertion and needle withdrawal; the arm translation is realized by dragging the robot with an external force, similar to clinical manual puncture; to keep the needle tip stationary while the wrist adjusts the needle orientation, a fixed-point posture adjustment function is realized through a needle-tip displacement compensation algorithm; safety and soft-tissue deformation must be considered during robotic needle insertion, and the software module applies fuzzy control of the needle insertion speed based on force/position feedback; after tissue sampling or treatment is completed, the robot rapidly withdraws the puncture needle to a safe position along the insertion path.
7. The CT three-dimensional visualization system of the puncture surgical robot as claimed in claim 6, wherein the motion parameter calculation module of the puncture robot represents the required position of the robot in the Laplace domain as follows:
first, the impedance control position model used for the method is established as follows:
Fh = Md·(Ẍd - Ẍc) + Bd·(Ẋd - Ẋc) + Kd·(Xd - Xc);
wherein: Xc denotes the current position, Xd the desired position, Md the virtual inertia matrix of the robot, Bd the virtual damping matrix of the robot, and Kd the virtual stiffness matrix of the robot; in this model, Md, Bd and Kd are the impedance characteristic coefficients of the robot and are all diagonal matrices; the virtual inertia matrix Md mainly influences impacts and motion phases with large velocity changes; the virtual damping matrix Bd mainly influences external disturbances and fast position changes; the virtual stiffness matrix Kd mainly influences motion near low-speed or stationary states;
the robot desired position is then expressed in the laplace domain as:
ΔX(s) = Fh(s) / (Md·s² + Bd·s + Kd) = Fh(s)·H(s);  (1)
wherein ΔX(s) is the Laplace transform of ΔX = Xd - Xc, Fh(s) is the Laplace transform of Fh, s is the Laplace variable, Fh is the external force measured by the six-dimensional force sensor, and H(s) = 1/(Md·s² + Bd·s + Kd) is the transfer function from the measured force to the position correction;
the controller for the compliant position of the robot joint space comprises:
(Equation (2), the joint-space compliant position control law, is reproduced only as an image, FDA0003296364210000021, in the original text.)
wherein f is an external acting force obtained by the six-dimensional force sensor;
an expression of velocity and acceleration is obtained by using a backward difference method:
ΔX(k) = a0·ΔF(k) + a1·Δx(k-1) + a2·Δx(k-2);  (3)
wherein:
(Equation (4), which gives the coefficients a0, a1 and a2 of recursion (3), is reproduced only as an image, FDA0003296364210000022, in the original text.)
wherein: Fh = Md·(Ẍd - Ẍc) + Bd·(Ẋd - Ẋc) + Kd·(Xd - Xc);
Xc represents the current position, Xd represents the desired position, Md represents the virtual inertial matrix of the robot, Bd represents the virtual damping matrix of the robot, Kd represents the virtual stiffness matrix of the robot, and T is the sampling time.
8. The CT three-dimensional visualization system of the puncture surgical robot as claimed in claim 7, wherein the three-dimensional crystal visualization module adopts the Ray-Casting algorithm, whose basic principle follows the visual imaging mechanism: an idealized physical model is first constructed, that is, each voxel is regarded as a particle capable of projecting, emitting and reflecting light; then each voxel is assigned a specific color value and opacity according to the illumination model, the shading model and the medium properties of the voxel; then, starting from each pixel on the screen, a ray is emitted along the chosen viewing direction; the ray passes through the three-dimensional data field and intersects a plurality of voxels, a plurality of equidistant or non-equidistant sampling points are selected on the ray, and the color value and opacity of all the sampling points on the ray are obtained by interpolation.
CN202111178490.0A 2021-10-10 2021-10-10 CT three-dimensional visualization system of puncture surgical robot Pending CN114027975A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111178490.0A CN114027975A (en) 2021-10-10 2021-10-10 CT three-dimensional visualization system of puncture surgical robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111178490.0A CN114027975A (en) 2021-10-10 2021-10-10 CT three-dimensional visualization system of puncture surgical robot

Publications (1)

Publication Number Publication Date
CN114027975A true CN114027975A (en) 2022-02-11

Family

ID=80141039

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111178490.0A Pending CN114027975A (en) 2021-10-10 2021-10-10 CT three-dimensional visualization system of puncture surgical robot

Country Status (1)

Country Link
CN (1) CN114027975A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2521554A1 (en) * 2005-09-28 2007-03-28 Canadian Space Agency Robust impedance-matching of manipulators interacting with unknown environments
CN107553495A (en) * 2017-09-27 2018-01-09 北京理工大学 One kind rotation puies forward robot cervical vertebra joint control device and control method
CN108272502A (en) * 2017-12-29 2018-07-13 战跃福 A kind of ablation needle guiding operating method and system of CT three-dimensional imagings guiding
CN108420529A (en) * 2018-03-26 2018-08-21 上海交通大学 The surgical navigational emulation mode guided based on image in magnetic tracking and art
CN111603205A (en) * 2020-03-23 2020-09-01 苏州新医智越机器人科技有限公司 Three-dimensional image reconstruction and positioning analysis system used in CT (computed tomography) cabin of puncture surgical robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
孙银山 (Sun Yinshan): "Research on a puncture surgical robot assistance system based on three-dimensional ultrasound images", Doctoral Dissertations, Information Science and Technology series, pages 19-93 *

Similar Documents

Publication Publication Date Title
US11696746B2 (en) Ultrasound imaging system having automatic image presentation
US11464575B2 (en) Systems, methods, apparatuses, and computer-readable media for image guided surgery
US7945310B2 (en) Surgical instrument path computation and display for endoluminal surgery
US8248413B2 (en) Visual navigation system for endoscopic surgery
US8248414B2 (en) Multi-dimensional navigation of endoscopic video
US7824328B2 (en) Method and apparatus for tracking a surgical instrument during surgery
CN106999146B (en) Ultrasound imaging system with automatic image rendering
US5526812A (en) Display system for enhancing visualization of body structures during medical procedures
US20080071141A1 (en) Method and apparatus for measuring attributes of an anatomical feature during a medical procedure
US8079957B2 (en) Synchronized three or four-dimensional medical ultrasound imaging and measurements
JP6873647B2 (en) Ultrasonic diagnostic equipment and ultrasonic diagnostic support program
US20140142426A1 (en) Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
JP2001218765A (en) Method and system for visualizing object
CN103356155A (en) Virtual endoscope assisted cavity lesion examination system
IL293957A (en) 2d pathfinder visualization
JP5498185B2 (en) Ultrasonic diagnostic apparatus and ultrasonic image display program
CN114845655A (en) 3D path detection visualization
CN114027975A (en) CT three-dimensional visualization system of puncture surgical robot
CN116528752A (en) Automatic segmentation and registration system and method
Shahidi et al. Volumetric image guidance via a stereotactic endoscope
US20230248441A1 (en) Extended-reality visualization of endovascular navigation
WO2020217860A1 (en) Diagnostic assistance device and diagnostic assistance method
US11941754B2 (en) System and method for generating three dimensional geometric models of anatomical regions
CN113645907A (en) Diagnosis support device, diagnosis support system, and diagnosis support method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20220211