CN114224489A - Trajectory tracking system for surgical robot and tracking method using the same

Trajectory tracking system for surgical robot and tracking method using the same

Info

Publication number
CN114224489A
CN114224489A (application CN202111520421.3A)
Authority
CN
China
Prior art keywords
surgical
marker
dimensional
binocular camera
surgical instrument
Prior art date
Legal status
Granted
Application number
CN202111520421.3A
Other languages
Chinese (zh)
Other versions
CN114224489B (en)
Inventor
吴法
卞若帆
梁炎超
Current Assignee
Zhejiang Deshang Yunxing Medical Technology Co ltd
Original Assignee
Zhejiang Deshang Yunxing Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Deshang Yunxing Medical Technology Co ltd filed Critical Zhejiang Deshang Yunxing Medical Technology Co ltd
Priority to CN202111520421.3A
Publication of CN114224489A
Application granted
Publication of CN114224489B
Active legal status (Current)
Anticipated expiration

Classifications

    • A61B — Diagnosis; Surgery; Identification (Human necessities — Medical or veterinary science; Hygiene)
    • A61B34/30 Surgical robots
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to trajectory tracking technology for surgical robots and aims to provide a trajectory tracking system for a surgical robot and a tracking method using the system. The system comprises a plurality of markers, each sheet-shaped with a checkerboard pattern drawn on its surface; a target object comprising a surgical instrument and a surgical site; a binocular camera device; and a processing unit connected to the binocular camera device by a signal line, which processes the images acquired by the binocular camera device, calculates the position and spatial relationship of each marker, and obtains the three-dimensional coordinate position and spatial relationship of the target object after matching against the target object model. By combining relatively miniature markers with a binocular camera device, the invention achieves high-precision tracking without changing how surgical instruments and the surgical site are marked or altering their appearance. The method reduces the impact on the clinical workflow, enables the marking and positioning of any surgical instrument and surgical site, and is less constrained by the operating space.

Description

Trajectory tracking system for surgical robot and tracking method using the same
Technical Field
The present invention relates to trajectory tracking technology for surgical robots, and more particularly to a trajectory tracking system that acquires the spatial position information and posture information of a target object by tracking markers attached to a surgical instrument and a surgical site, and to a tracking method using the same.
Background
To reduce the workload of physicians and improve the accuracy of tumor ablation procedures, the industry has been developing ablation surgical robots. When performing surgery with an ablation surgical robot, minimizing surgical risk and improving surgical precision requires a navigator that accurately tracks and detects the spatial position information and posture information of the surgical instrument and of the markers at the surgical site, and can then guide the surgical instrument accurately to the affected part of the patient.
To this end, a tracking system generally needs to be constructed, comprising: markers for marking the surgical instrument and the surgical site; an optical positioning system for recording the optical information of the markers; and a processor for processing the relationships between the markers, obtaining the three-dimensional coordinates and posture information of the marked surgical instrument and surgical site, calculating the spatial relationship between the surgical instrument and the surgical site, and guiding the subsequent motion plan of the surgical robot.
In the field of optical marker tracking, Northern Digital Inc. (NDI) is the technology leader; its optical locator systems and the companion Track software are frequently used with surgical robots. The optical tools used with such locators carry optical spheres as markers: a marker group typically consists of several (four) spheres, occupies roughly 10 cm × 10 cm × 1 cm, and is attached externally to the surgical instrument and the surgical site to mark posture and position. For conventional ablation instruments and surgical sites, however, such a marker group is too large: it affects the actual surgical outcome and disrupts the clinical workflow, which limits its application.
Disclosure of Invention
The invention aims to overcome the above deficiencies of the prior art and provides a trajectory tracking system for a surgical robot and a tracking method using the same.
In order to solve the technical problem, the solution of the invention is as follows:
a trajectory tracking system for a surgical robot is provided, comprising: the marker is in a sheet shape, a checkerboard pattern is drawn on the surface of the marker, and the marker is used for being attached to different surfaces of a target object needing to determine spatial position information and posture information; the target includes a surgical instrument and a surgical site; the binocular camera equipment is used for recording the live information of each marker; and the processing unit is connected with the binocular camera equipment through a signal line and is used for processing the images acquired by the binocular camera equipment, calculating the position and the spatial relationship of each marker and acquiring the three-dimensional coordinate position and the spatial relationship of the target after matching with the target model.
Preferably, the checkerboard pattern on the marker surface comprises at least a 2 × 3 array of alternating black and white squares, each square having a side length of 0.45-0.75 mm.
Preferably, there are at least 4 markers on each target object.
Preferably, the marker is an adhesive-backed sticker with a checkerboard pattern printed on its surface, stuck directly onto the surface of the target object or onto an attachment on that surface; or the marker is a metal or plastic sheet with a checkerboard pattern printed on its surface, fixed to the surface of the target object by snap-fitting; or the marker is a coating printed or sprayed onto the surface of the surgical instrument, the coating carrying a checkerboard pattern.
Preferably, the binocular camera device has a resolution of at least 65 million (6500W) pixels.
Preferably, the processing unit is a computer; or a processor with computing power.
Preferably, the system further comprises an operating lamp serving as a light source, with an illumination intensity of more than 10000 lux.
The invention further provides a tracking method using the above trajectory tracking system, comprising the following steps:
(1) attaching markers on different surfaces of the surgical instrument and the surgical site, and placing the surgical instrument and the surgical site at the calibration positions;
(2) capturing images of the surgical instrument and the surgical site with the binocular camera device, processing the acquired images with the processing unit, calculating the three-dimensional coordinate positions and spatial relationship of the markers at the calibration position, and modeling on this basis to obtain target object models of the surgical instrument and the surgical site at the calibration position;
(3) during operation of the surgical robot, acquiring real-time images of the surgical instrument and the surgical site with the binocular camera device; processing the acquired images with the processing unit to calculate the real-time three-dimensional coordinate position and spatial relationship of each marker; and, after matching and comparison against the target object model, obtaining the real-time three-dimensional coordinate position and spatial relationship of the surgical instrument;
(4) feeding the calculation result obtained in step (3) into the control system of the surgical robot as trajectory data of the surgical instrument, and then tracking and regulating the operation of the surgical robot according to a preset surgical plan.
Preferably, the three-dimensional coordinate position and spatial relationship of each marker are determined as follows:
(1) calculating the three-dimensional posture information of a marker using the projection rule of a three-dimensional image onto a two-dimensional plane; the posture information consists of the three-dimensional vectors formed by the long side and the short side of the marker and the line perpendicular to the marker plane, so that each captured and identified marker is represented by a unique set of three-dimensional vector features;
(2) converting the three-dimensional vector representation of a marker acquired by the imaging unit of one camera into the imaging unit of the other camera using the external reference matrix obtained by calibrating the binocular camera device; if the three-dimensional vectors are equal, the marker is considered successfully matched, and the coordinates of the marker in world coordinates are then calculated using the binocular camera projection rule;
(3) forming a three-dimensional point cloud from the calculated world coordinates of some of the markers;
(4) obtaining the target object models of the surgical instrument and the surgical site at the calibration position based on the three-dimensional point cloud; or matching and comparing the three-dimensional point cloud obtained in real time against the target object model to obtain the real-time coordinate information and posture information of the surgical instrument.
Description of the inventive principles:
In the present invention, relatively small "micro" markers are attached to different surfaces of the target objects (surgical instrument and surgical site); images are acquired by the binocular camera device, and the processing unit calibrates, calculates, matches and compares the position and spatial relationship of each marker, thereby achieving a high-precision tracking effect.
In the invention, imaging, acquisition of three-dimensional coordinate positions and posture information, and modeling are first performed on the target object in the calibration state, and the modeled set of micro marker points is taken as the basic model; during the actual operation, the binocular camera device then acquires images of the markers, and the three-dimensional coordinate set of the marker points is matched against the basic model to calculate the real-time three-dimensional coordinate position and posture information of the target object for use by the surgical robot.
Compared with the prior art, the invention has the beneficial effects that:
1. compared with a luminous small ball with larger volume and an optical position finder system, the invention adopts the combination of a relatively miniature marker and binocular camera equipment. The tracking effect with higher precision can be achieved without changing the marking mode of the appearance of the surgical instruments and surgical positions.
2. The invention can calculate the three-dimensional coordinate position and posture information of the target objects (surgical instrument and surgical site). It therefore reduces the impact on the clinical workflow, enables the marking and positioning of any surgical instrument and surgical site, and is less constrained by the operating space.
Drawings
FIG. 1 is a diagram of a tracking system of an embodiment of the present invention.
FIG. 2 is an example illustration of markers attached to a target object.
FIG. 3 is a block diagram of the steps for calculating three-dimensional coordinate position and pose information of a target object.
Reference numerals: 1 surgical instrument; 2 surgical site; 3 marker; 4 binocular camera device; 5 processing unit; 6 operating lamp; 31-34 individual markers.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in more detail with reference to the accompanying drawings.
In the present invention, the target objects include a surgical instrument 1 and a surgical site 2. The surgical instrument 1 is a medical instrument, such as an ablation needle or a puncture needle, attached to the end of a manipulator of the surgical robot. During surgery, the non-operated area of the body is covered by a mask, and the uncovered area is the surgical site 2 (i.e., the local region of the patient's body to be operated on), so the surgical site 2 can be determined using separate markers. According to the real-time changes in the three-dimensional coordinate position and posture information of the surgical instrument 1 and the surgical site 2, and in combination with a preset surgical plan, the surgical robot adjusts the motion plan of its manipulator to complete the operation. Depending on the computing power and control-precision requirements of the system, a mask pre-coated with markers at the factory may also be selected, and the positioning precision of the trajectory tracking is improved by increasing the number of auxiliary positioning markers. In the invention, posture information refers to the rotational transformation between the target object and the basic model position, and the three-dimensional coordinate position refers to the translational transformation between the target object and the basic model position. The specific control method of the surgical robot belongs to the prior art and is not part of the technical solution of the invention, so it is not described in detail here.
The following detailed description of implementations of the invention refers to the accompanying drawings and detailed description.
As shown in FIGS. 1-2, a trajectory tracking system for a surgical robot comprises: markers 3, a binocular camera device 4, a processing unit 5, and an operating lamp 6.
The marker 3 carries a checkerboard pattern for machine vision recognition, which can be imaged clearly by a high-precision binocular camera. For example, the checkerboard pattern is a 4 × 5 array of alternating black and white squares, each square having a side length of 0.45-0.75 mm.
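As an illustration of how such a checkerboard marker can be located in a camera image, the following sketch uses OpenCV's standard chessboard detection; it is an assumed example for explanation only and not part of the patent, with the pattern size taken from the 4 × 5 example above (a 4 × 5 grid of squares has 3 × 4 inner corners).

    import cv2

    def detect_marker_corners(gray_image, pattern_size=(3, 4)):
        # pattern_size counts inner corners: a 4 x 5 grid of squares has 3 x 4.
        found, corners = cv2.findChessboardCorners(
            gray_image, pattern_size,
            flags=cv2.CALIB_CB_ADAPTIVE_THRESH | cv2.CALIB_CB_NORMALIZE_IMAGE)
        if not found:
            return None
        # Refine to subpixel accuracy, which matters for millimetre-scale markers.
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
        corners = cv2.cornerSubPix(gray_image, corners, (5, 5), (-1, -1), criteria)
        return corners.reshape(-1, 2)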
There are many possible implementations of the marker 3, and the appropriate mode should be chosen for the surgical instrument 1 and the surgical site 2 according to the actual situation. For example:
the first method is as follows: the markers 3 are sticky papers with gum printed with checkerboard patterns on the surface, and can be respectively adhered on different surfaces of the surgical instrument 1 and the surgical site 2, and the number and the fixing mode can be selected according to actual needs.
Mode 2: the marker 3 is a metal or plastic sheet with a checkerboard pattern printed on its surface; it is fixed to the surface of the surgical instrument 1 by snap-fitting or sticking, or to the surface of the surgical site 2 by sticking.
Mode 3: the markers 3 are coatings printed or sprayed onto the surface of the surgical instrument 1 and carrying a checkerboard pattern, usually applied when the product leaves the factory. For older products without such markers, the same effect can be achieved by attaching stickers afterwards.
Mode 4: the markers 3 are coatings printed or sprayed onto the surface of the mask and carrying a checkerboard pattern, usually supplied with the mask product. For older products without such markers, the same effect can be achieved by attaching stickers afterwards.
In general, the target objects in FIG. 1 include a surgical instrument 1 and a surgical site 2, each of which has different surfaces to which markers 3 can be attached. When the target object is observed from different angles, the positional relationships of the observed markers differ and do not repeat. Only one viewing angle is shown in FIG. 2 for illustration: at this angle, the ablation needle has a plate-like trailing end, and 10 markers are attached in a scattered manner to the surface of that trailing end.
The binocular camera device 4 may be a high-precision binocular camera with a resolution of at least 65 million (6500W) pixels and ultra-low-dispersion lenses. The two cameras are arranged at a fixed angle so that they image the same area and record live information about each marker on the target objects within that area.
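The external reference matrix between the two cameras, which the tracking method below relies on, can be obtained by a standard stereo calibration. The sketch below is one possible way to do this with OpenCV, assuming the per-camera intrinsics are already known; the function and variable names are illustrative, not prescribed by the patent.

    import cv2

    def calibrate_stereo_pair(obj_points, img_points_left, img_points_right,
                              K_left, dist_left, K_right, dist_right, image_size):
        # Estimate the rotation R and translation T that map coordinates in the
        # left-camera frame into the right-camera frame (the external reference
        # matrix used later for marker matching and triangulation).
        rms, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
            obj_points, img_points_left, img_points_right,
            K_left, dist_left, K_right, dist_right, image_size,
            flags=cv2.CALIB_FIX_INTRINSIC)
        return R, T, rms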
The processing unit 5 is connected with the binocular camera equipment through a signal line and is used for processing images acquired by the binocular camera equipment, calculating the position and spatial relationship of each marker 3 and acquiring the three-dimensional coordinate position and spatial relationship of a target after matching with a target model. The processing unit 5 may be a computer or a processor with computing capabilities.
The operating lamp 6 serves as a light source with an illumination intensity of more than 10000 lux, providing the illumination required for real-time imaging by the binocular camera device 4.
In the invention, the tracking method using the trajectory tracking system comprises the following steps:
(1) attaching markers 3 on the surfaces of the surgical instrument 1 and the surgical position 2, and placing the surgical instrument 1 and the surgical position 2 at the calibration positions;
(2) the binocular camera device 4 captures images of the surgical instrument 1 and the surgical site 2; the processing unit 5 processes the acquired images, calculates the three-dimensional coordinate position and spatial relationship of each marker 3 at the calibration position, and obtains, on this basis, target object models of the surgical instrument 1 and the surgical site 2 at the calibration position;
the construction mode of the target object model in the step can adopt the existing three-dimensional modeling method or point cloud matching method and the like. The obtained target object model can be built in a robot control program after modeling is completed so as to be repeatedly used for a plurality of times, so that modeling does not need to be performed before trajectory tracking is performed each time. Of course, if the calibration position is changed greatly, the step should be repeated to update the object model to avoid errors.
(3) During operation of the surgical robot, real-time images of the surgical instrument 1 and the surgical site 2 are acquired by the binocular camera device 4; the processing unit 5 processes the acquired images and calculates the real-time three-dimensional coordinate position and spatial relationship of each marker 3; after matching and comparison against the target object model, the real-time three-dimensional coordinate position and spatial relationship of the surgical instrument 1 are obtained;
(4) The calculation result obtained in step (3) is fed into the control system of the surgical robot as trajectory data of the surgical instrument 1, and the operation of the surgical robot is then tracked and regulated according to the preset surgical plan; a sketch of this per-frame loop is given below.
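Put together, one iteration of the real-time loop of steps (3) and (4) could look like the sketch below. The helpers detect_markers, triangulate_markers, register_to_model and robot.update_trajectory are hypothetical placeholders standing in for the operations described above, not an actual API.

    def tracking_loop(left_cam, right_cam, target_model, robot):
        # One illustrative iteration of the real-time tracking loop.
        img_l = left_cam.capture()             # real-time left image
        img_r = right_cam.capture()            # real-time right image

        # Detect every checkerboard marker visible in each view.
        markers_l = detect_markers(img_l)
        markers_r = detect_markers(img_r)

        # Match markers across the two views and compute world coordinates.
        points_3d = triangulate_markers(markers_l, markers_r)

        # Compare the live point cloud with the calibrated target object model
        # to recover the instrument's real-time position and posture.
        R, t = register_to_model(points_3d, target_model)

        # Feed the result to the surgical robot's control system as trajectory
        # data; the robot adjusts its motion plan against the preset plan.
        robot.update_trajectory(R, t)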
For convenience of explanation, the method of determining the three-dimensional coordinate position and spatial relationship of each marker is described below, taking the point set formed by the individual markers 31-34 in FIG. 2 as a representative example:
(3.1) At a given position, the two cameras of the binocular camera device 4 simultaneously capture the individual markers 31-34. The three-dimensional posture information of markers 31-34 is calculated using the projection rule of the three-dimensional image onto the two-dimensional plane; the posture information consists of the three-dimensional vectors formed by the long side and the short side of each marker and the line perpendicular to the marker plane, so that each captured and identified marker 31-34 is represented by a unique set of three-dimensional vector features;
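One way to realize step (3.1) for a single view is to solve a planar pose problem from the marker's detected corners. The sketch below uses OpenCV's solvePnP and assumes a planar 4 × 5 checkerboard of known square size; the choice of which rotation column corresponds to the long side, the short side and the normal is an assumption for illustration, not the patent's exact computation.

    import cv2
    import numpy as np

    def marker_pose_vectors(corners_2d, K, dist,
                            square_size=0.0006, pattern_size=(3, 4)):
        # Planar model of the inner corners (z = 0), in the standard OpenCV order.
        cols, rows = pattern_size
        obj = np.zeros((rows * cols, 3), np.float32)
        obj[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size

        ok, rvec, _ = cv2.solvePnP(obj, corners_2d.astype(np.float32), K, dist)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)             # marker frame -> camera frame
        long_side = R[:, 1]                    # assumed long-side direction
        short_side = R[:, 0]                   # assumed short-side direction
        normal = R[:, 2]                       # perpendicular to the marker plane
        return long_side, short_side, normal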
(3.2) The three-dimensional vector representation of markers 31-34 acquired by the imaging unit of one camera is converted into the imaging unit of the other camera using the external reference matrix obtained by calibrating the binocular camera device; if the three-dimensional vectors are equal, the marker is considered successfully matched, and the coordinates of markers 31-34 in world coordinates are then calculated using the binocular camera projection rule.
the world coordinates are calculated as follows:
Figure BDA0003405894320000061
wherein IntL,IntRThe internal reference matrix of the left camera and the right camera; sL,sRThe left and right camera scale factors;
Figure BDA0003405894320000062
Figure BDA0003405894320000063
pixel coordinates of the left camera and the right camera for the marker individuals 31-34; rLRRTLTLRepresenting the left and right camera matrices, respectively.
Figure BDA0003405894320000064
T=TR-RTLFor the external reference matrix, only s is shown in the above representationL,sR,PwIs unknown.
Then order s by calculationLIs (A'. A)-1A first element of A'. T, wherein
Figure BDA0003405894320000065
The world coordinates take the coordinate origin of one of the cameras as the origin and the xy axis of the camera as the xy axis, so that the world coordinates of the marker
Figure BDA0003405894320000066
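Written out in NumPy, the least-squares step above becomes the short routine below. It is a sketch of the computation implied by the reconstructed formulas, with the left camera taken as the world origin; variable names mirror the symbols in the text.

    import numpy as np

    def triangulate_marker(p_left, p_right, Int_L, Int_R, R, T):
        # p_left, p_right: (u, v) pixel coordinates of the same marker in the
        # left and right images; Int_L, Int_R: internal reference matrices;
        # R, T: external reference matrix (left-camera frame -> right-camera frame).
        hl = np.array([p_left[0], p_left[1], 1.0])
        hr = np.array([p_right[0], p_right[1], 1.0])

        ray_l = np.linalg.inv(Int_L) @ hl      # back-projected left ray
        ray_r = np.linalg.inv(Int_R) @ hr      # back-projected right ray

        # s_R * ray_r = R @ (s_L * ray_l) + T   =>   A @ [s_L, s_R]^T = T
        A = np.column_stack((-R @ ray_l, ray_r))
        s = np.linalg.inv(A.T @ A) @ A.T @ np.asarray(T).reshape(3)
        s_L = s[0]                             # first element, as in the text

        return s_L * ray_l                     # P_w = s_L * Int_L^(-1) * p_L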
(3.3) A three-dimensional point cloud is formed from the calculated world coordinates of some of the markers.
after the world coordinates of the marker individuals 31-34 are calculated, a three-dimensional point cloud consisting of the marker individuals 31-34 can be obtained. And matching and comparing the basic model by using the three-dimensional point cloud to obtain the coordinate information and the posture information of the target object.
(3.4) In the same way, the target object models of the surgical instrument and the surgical site at the calibration position are obtained from the three-dimensional point cloud; or the three-dimensional point cloud obtained by real-time calculation is matched and compared against the target object model to obtain the real-time coordinate information and posture information of the surgical instrument.
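Matching the live marker point cloud against the basic model amounts to estimating a rigid transform whose rotation is the posture information and whose translation is the three-dimensional coordinate position. The sketch below uses the SVD-based Kabsch method and assumes the live points are already paired with their corresponding model points; the patent itself does not prescribe a particular matching algorithm.

    import numpy as np

    def register_point_clouds(live_pts, model_pts):
        # Estimate R, t such that R @ model_pts[i] + t ~= live_pts[i];
        # live_pts and model_pts are paired (N, 3) arrays of marker coordinates.
        live_c = live_pts - live_pts.mean(axis=0)
        model_c = model_pts - model_pts.mean(axis=0)

        H = model_c.T @ live_c                 # 3 x 3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                     # proper rotation (det = +1)
        t = live_pts.mean(axis=0) - R @ model_pts.mean(axis=0)
        return R, t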

Claims (9)

1. A trajectory tracking system for a surgical robot, comprising:
markers, each sheet-shaped and having a checkerboard pattern drawn on its surface, for attachment to different surfaces of a target object whose spatial position information and posture information are to be determined, the target object including a surgical instrument and a surgical site;
a binocular camera device for recording live images of each marker; and
a processing unit connected to the binocular camera device by a signal line, for processing the images acquired by the binocular camera device, calculating the position and spatial relationship of each marker, and obtaining the three-dimensional coordinate position and spatial relationship of the target object after matching against the target object model.
2. The trajectory tracking system of claim 1, wherein the checkerboard pattern on the marker surface comprises at least a 2 × 3 array of alternating black and white squares, each square having a side length of 0.45-0.75 mm.
3. The trajectory tracking system of claim 1, wherein there are at least 4 markers on each target.
4. The trajectory tracking system of claim 1, wherein the marker is a sticker with a checkerboard pattern printed on the surface and with an adhesive, and is directly stuck on the surface of the target or an attachment on the surface of the target; or the marker is a metal sheet or a plastic sheet with a checkerboard pattern printed on the surface, and is fixed on the surface of the target object in a buckling mode; alternatively, the marker is a coating printed or sprayed on the surface of the surgical instrument, the coating having a checkerboard pattern.
5. The trajectory tracking system of claim 1, wherein the binocular camera device has a resolution of at least 65 million (6500W) pixels.
6. The trajectory tracking system of claim 1, wherein the processing unit is a computer; or a processor with computing power.
7. The trajectory tracking system according to any one of claims 1 to 6, further comprising a surgical lamp as a light source, wherein the illumination intensity is 10000lux or more.
8. A tracking method using the trajectory tracking system of claim 1, comprising the steps of:
(1) attaching markers on different surfaces of the surgical instrument and the surgical site, and placing the surgical instrument and the surgical site at the calibration positions;
(2) capturing images of the surgical instrument and the surgical site with the binocular camera device, processing the acquired images with the processing unit, calculating the three-dimensional coordinate positions and spatial relationship of the markers at the calibration position, and modeling on this basis to obtain target object models of the surgical instrument and the surgical site at the calibration position;
(3) during operation of the surgical robot, acquiring real-time images of the surgical instrument and the surgical site with the binocular camera device; processing the acquired images with the processing unit to calculate the real-time three-dimensional coordinate position and spatial relationship of each marker; and, after matching and comparison against the target object model, obtaining the real-time three-dimensional coordinate position and spatial relationship of the surgical instrument;
(4) feeding the calculation result obtained in step (3) into the control system of the surgical robot as trajectory data of the surgical instrument, and then tracking and regulating the operation of the surgical robot according to a preset surgical plan.
9. The method of claim 8, wherein the three-dimensional coordinate position and spatial relationship of each marker are determined as follows:
(1) calculating the three-dimensional posture information of a marker using the projection rule of a three-dimensional image onto a two-dimensional plane; the posture information consists of the three-dimensional vectors formed by the long side and the short side of the marker and the line perpendicular to the marker plane, so that each captured and identified marker is represented by a unique set of three-dimensional vector features;
(2) converting the three-dimensional vector representation of a marker acquired by the imaging unit of one camera into the imaging unit of the other camera using the external reference matrix obtained by calibrating the binocular camera device; if the three-dimensional vectors are equal, the marker is considered successfully matched, and the coordinates of the marker in world coordinates are then calculated using the binocular camera projection rule;
(3) forming a three-dimensional point cloud from the calculated world coordinates of some of the markers;
(4) obtaining the target object models of the surgical instrument and the surgical site at the calibration position based on the three-dimensional point cloud; or matching and comparing the three-dimensional point cloud obtained in real time against the target object model to obtain the real-time coordinate information and posture information of the surgical instrument.
CN202111520421.3A 2021-12-12 2021-12-12 Track tracking system for surgical robot and tracking method using same Active CN114224489B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111520421.3A CN114224489B (en) 2021-12-12 2021-12-12 Track tracking system for surgical robot and tracking method using same

Publications (2)

Publication Number Publication Date
CN114224489A (en) 2022-03-25
CN114224489B CN114224489B (en) 2024-02-13

Family

ID=80755286

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111520421.3A Active CN114224489B (en) 2021-12-12 2021-12-12 Track tracking system for surgical robot and tracking method using same

Country Status (1)

Country Link
CN (1) CN114224489B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089293A (en) * 2022-07-04 2022-09-23 山东大学 Calibration method for spinal endoscopic surgical robot
WO2023231098A1 (en) * 2022-05-30 2023-12-07 清华大学 Target tracking method and system, and robot

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160175615A1 (en) * 2014-12-18 2016-06-23 Kabushiki Kaisha Toshiba Apparatus, method, and program for movable part tracking and treatment
CN107179322A (en) * 2017-06-15 2017-09-19 长安大学 A kind of bridge bottom crack detection method based on binocular vision
US20200169673A1 (en) * 2017-08-16 2020-05-28 Covidien Lp Synthesizing spatially-aware transitions between multiple camera viewpoints during minimally invasive surgery
CN109833092A (en) * 2017-11-29 2019-06-04 上海复拓知达医疗科技有限公司 Internal navigation system and method
CN108154552A (en) * 2017-12-26 2018-06-12 中国科学院深圳先进技术研究院 A kind of stereo laparoscope method for reconstructing three-dimensional model and device
CN113347937A (en) * 2019-01-25 2021-09-03 伯恩森斯韦伯斯特(以色列)有限责任公司 Registration of frame of reference
CN109903313A (en) * 2019-02-28 2019-06-18 中国人民解放军国防科技大学 Real-time pose tracking method based on target three-dimensional model
CN111388087A (en) * 2020-04-26 2020-07-10 深圳市鑫君特智能医疗器械有限公司 Surgical navigation system, computer and storage medium for performing surgical navigation method
CN113034700A (en) * 2021-03-05 2021-06-25 广东工业大学 Anterior cruciate ligament reconstruction surgery navigation method and system based on mobile terminal
CN113693723A (en) * 2021-08-05 2021-11-26 北京大学 Cross-modal navigation positioning system and method for oral and throat surgery

Also Published As

Publication number Publication date
CN114224489B (en) 2024-02-13

Similar Documents

Publication Publication Date Title
CN109974584B (en) Calibration system and calibration method for auxiliary laser osteotomy robot
CN114041875B (en) Integrated operation positioning navigation system
CN107468350B (en) Special calibrator for three-dimensional image, operation positioning system and positioning method
CN114224489B (en) Track tracking system for surgical robot and tracking method using same
CN111012506B (en) Robot-assisted puncture surgery end tool center calibration method based on stereoscopic vision
US20160000518A1 (en) Tracking apparatus for tracking an object with respect to a body
CN109938837B (en) Optical tracking system and optical tracking method
JP2008224626A (en) Information processor, method for processing information, and calibration tool
CN109297413A (en) A kind of large-size cylinder body Structural visual measurement method
CN114668534B (en) Intraoperative implantation precision detection system and method for dental implant surgery
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN114523471B (en) Error detection method based on association identification and robot system
CN112168357A (en) System and method for constructing spatial positioning model of C-arm machine
CN113870329A (en) Medical image registration system and method for surgical navigation
TWI708591B (en) Three-dimensional real-time positioning method for orthopedic surgery
CN114536399B (en) Error detection method based on multiple pose identifications and robot system
CN212256370U (en) Optical motion capture system
CN112998856B (en) Three-dimensional real-time positioning method
CN111862170A (en) Optical motion capture system and method
CN114536331B (en) Method for determining external stress of deformable mechanical arm based on association identification and robot system
TWI735390B (en) Method for real-time positioning compensation of image positioning system and image positioning system capable of real-time positioning compensation
CN113855240A (en) Medical image registration system and method based on magnetic navigation
Yu et al. Vision-based method of kinematic calibration and image tracking of position and posture for 3-RPS parallel robot
CN215899873U (en) Positioning scale for X-ray imaging operation
CN113100967B (en) Wearable surgical tool positioning device and positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant