CN113303840A - Operation navigation positioning system with help of endoscope - Google Patents

Operation navigation positioning system with help of endoscope

Info

Publication number
CN113303840A
Authority
CN
China
Prior art keywords
surgical
endoscope
tracer
positioning system
planning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110528999.7A
Other languages
Chinese (zh)
Inventor
敖英芳
张辛
张维军
吕名扬
丁国成
彭聪
杨刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Peking University Third Hospital
Tinavi Medical Technologies Co Ltd
Beijing Tinavi Medical Technology Co Ltd
Original Assignee
Peking University Third Hospital
Tinavi Medical Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Peking University Third Hospital, Tinavi Medical Technologies Co Ltd filed Critical Peking University Third Hospital
Priority to CN202110528999.7A priority Critical patent/CN113303840A/en
Publication of CN113303840A publication Critical patent/CN113303840A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/00234 Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02 Prostheses implantable into the body
    • A61F2/08 Muscles; Tendons; Ligaments
    • A61F2/0805 Implements for inserting tendons or ligaments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pathology (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Rehabilitation Therapy (AREA)
  • Rheumatology (AREA)
  • Cardiology (AREA)
  • Transplantation (AREA)
  • Vascular Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The application provides a surgical navigation and positioning system aided by an endoscope, and relates to the field of medical robots. The system comprises: a surgical robot; a guiding device tracking system; a scanning imaging device; a surgical tracking system; an endoscope; a planning probe having a contact tip and a positioning portion, for picking up a surgical site under the guidance of the endoscope; a human-computer interaction device for displaying the scanned image of the surgical object, the surgical-site image acquired by the endoscope, and the surgical position picked up by the planning probe; and an upper controller in communication connection with the surgical robot, the guiding device tracking system, the scanning imaging device, the surgical tracking system, the endoscope, the planning probe, and the human-computer interaction device. The system can position accurately and prevent surgical failure caused by deviation of the positioning location.

Description

Operation navigation positioning system with help of endoscope
Technical Field
The application relates to the technical field of medical treatment, in particular to a surgical navigation and positioning system with the help of an endoscope.
Background
Injury to the anterior cruciate ligament of the knee is one of the most common and serious sports injuries: a torn anterior cruciate ligament destabilizes the knee joint, and improper treatment leads to serious dysfunction of the knee. Repair and reconstruction of anterior cruciate ligament injuries has long been an important research subject in orthopedics and sports medicine.
Anterior cruciate ligament reconstruction is an effective treatment for anterior cruciate ligament rupture. In the procedure, a hole (ligament tunnel) is drilled at the ligament attachment point on the femur and on the tibia, and an artificial ligament or a graft of other material is passed through and fixed in the two tunnels to replace the torn ligament, thereby stabilizing the knee joint. One of the surgical difficulties is correctly locating the ligament insertion points (the points where the ligament attaches to the bone) and the ligament tunnels. An incorrectly placed insertion point causes impingement and friction between the reconstructed ligament and the surrounding bone, and the ligament is repeatedly over-stretched during knee motion, which shortens its service life and compromises the surgical outcome.
In the conventional approach to anterior cruciate ligament reconstruction, the surgeon plans the insertion points and tunnel positions from statistical data, which introduces considerable uncertainty because the plan cannot be fully adapted to the individual anatomy of the patient.
Another approach plans the ligament insertion points and tunnels on preoperative 3D images, projects them onto intraoperative 2D images, and uses a robot for path navigation. This approach is complicated to operate and has limited accuracy.
A third approach uses a mechanical ligament insertion-point aimer, but it relies on manual operation, cannot show how the tunnel is distributed within the bone, and suffers from insufficient precision and inconvenient handling.
Disclosure of Invention
The application aims to provide a surgical navigation and positioning system that uses an endoscope to accurately plan the surgical position and thereby achieve accurate surgical positioning.
Features and advantages of the present application will become apparent from the detailed description below, or may be learned in part through practice of the application.
According to an aspect of the present application, there is provided a surgical navigation and positioning system aided by an endoscope, comprising:
a surgical robot comprising a robotic arm;
a guiding device fixed at the operating end of the mechanical arm;
a guide device tracking system for acquiring position data of the guide device;
scanning imaging means for scanning a surgical object to generate a scanned image;
a surgical tracking system for tracking and acquiring position data of the surgical object;
an endoscope for acquiring an image of a surgical site;
a planning probe having a contact tip and a positioning portion for picking up a surgical site under the guidance of the endoscope;
a human-computer interaction device for displaying the scanned image of the surgical object, the surgical-site image acquired by the endoscope, and the surgical position picked up by the planning probe;
an upper controller in communication connection with the surgical robot, the guiding device tracking system, the scanning imaging device, the surgical tracking system, the endoscope, the planning probe, and the human-computer interaction device;
the upper controller is configured to:
determining a surgical corridor based on the surgical location picked up by the planning probe;
controlling the robotic arm such that the guide device is positioned in the surgical corridor.
According to some embodiments, the surgical tracking system comprises:
a surgical tracer for securing to a surgical subject;
a navigation camera for acquiring spatial location information of the surgical tracer.
According to some embodiments, the upper controller is further configured to: register the scanned image of the surgical object with the surgical tracer.
According to some embodiments, the guiding device tracking system comprises:
a guiding device tracer arranged on the mechanical arm;
the navigation camera is used for acquiring the space position information of the tracer of the guiding device.
According to some embodiments, the surgical navigational positioning system further comprises: an endoscope tracer disposed at the endoscope for positioning the endoscope to register the endoscope with a scanned image of the surgical object.
According to some embodiments, the upper controller is further configured to: fuse and display the endoscope image with the scanned image of the surgical object.
According to some embodiments, the upper controller is further configured to: record the spatial coordinates of the surgical site picked up by the planning probe.
According to some embodiments, the surgical navigational positioning system further comprises: an endoscope holding device for keeping the endoscope stably aligned with the surgical site.
According to some embodiments, the surgical navigational positioning system further comprises: a surgical object fixing device for fixing the surgical object on the operating table.
According to some embodiments, the system further comprises: a calibrator to assist in registering the scanned image of the surgical object with the surgical tracer.
According to some embodiments, the endoscope and the planning probe are of an integrated structure.
According to some embodiments, the surgical navigational positioning system is used in cruciate ligament reconstruction surgery; the surgical tracer comprises a first tracer and a second tracer arranged on the femur and the tibia, respectively; and the planning probe is used for picking up a ligament insertion point as an exit point and/or an entry point of the surgical channel.
According to some embodiments, the planning probe is used to pick up feature points that determine the surgical corridor.
According to some embodiments, the upper controller is further configured to: determine, in response to a user operation, an entry point of the surgical channel on the scanned image of the surgical object; or automatically plan an entry point of the surgical channel according to the exit point of the surgical channel picked up by the planning probe.
According to an exemplary embodiment, the endoscope, the scanned image, and the planning probe are used together to accurately pick up the surgical site and plan the channel position, improving positioning accuracy while remaining convenient to operate. CT mainly images bone rather than soft tissue, and operating on ligaments without visualizing the soft tissue carries risk. Because the endoscope can track and display soft tissue, the surgical navigation and positioning system according to the exemplary embodiment overcomes the problems of general navigation systems, in which ligament insertion points cannot be accurately picked up from CT or X-ray fluoroscopy images, while MRI images are expensive and inconvenient to use.
According to some example embodiments, accurate positioning of the surgical robot can be achieved, and operation failure caused by channel deviation is prevented.
According to some example embodiments, the automatic positioning of the surgical robot can be realized, and the operation difficulty of a doctor is reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1A shows a schematic component diagram of a surgical navigational positioning system for cruciate ligament reconstruction surgery according to an example embodiment of the present application.
FIG. 1B shows a block diagram of a surgical navigational positioning system with an endoscope according to an exemplary embodiment of the present application.
Fig. 2 shows a schematic diagram of a planning probe according to an embodiment of the present application.
Fig. 3 shows a schematic surgical corridor planning according to an embodiment of the present application.
Fig. 4 illustrates a guide apparatus structure diagram according to an exemplary embodiment.
FIG. 5 shows a flow chart of a positioning method of a surgical navigational positioning system with an endoscope according to an exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various components, these components should not be limited by these terms. These terms are used to distinguish one element from another. Thus, a first component discussed below may be termed a second component without departing from the teachings of the present concepts. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be appreciated by those skilled in the art that the drawings are merely schematic representations of exemplary embodiments, and that the blocks or processes shown in the drawings are not necessarily required to practice the present application and are, therefore, not intended to limit the scope of the present application.
Endoscopic anterior/posterior cruciate ligament reconstruction is currently a common surgical method: ligament injury can be seen clearly under the endoscope and the ligament insertion points can be explored. However, the endoscope has a narrow field of view that cannot contain all of the required anatomical features, so the surgeon must design the anterior cruciate ligament insertion points and tunnel positions from professional experience. The efficacy of this method therefore depends heavily on the surgeon's expertise. In addition, the ligament tunnel is established with a dedicated mechanical positioning device operated by hand, and since several people must cooperate during the operation, the method suffers from low precision and inconvenient handling.
The application therefore provides a surgical navigation and positioning system aided by an endoscope, which uses the endoscope, scanned images, and a planning probe to accurately pick up the surgical position, and which can use a robot to assist in positioning and improve surgical precision.
Arthroscopy was the first minimally invasive technique used in orthopedics. Since its clinical introduction it has greatly improved the diagnosis rate of joint diseases, and many intra-articular procedures that are difficult to perform by conventional open surgery can now be completed. An arthroscope is a rod-like optical instrument, about 5 mm in diameter, used to observe the internal structure of a joint; it is an endoscope for diagnosing and treating joint diseases. An endoscope is an inspection instrument that integrates traditional optics, ergonomics, precision machinery, modern electronics, mathematics, and software, and comprises an image sensor, optical lenses, a light source, and a mechanical assembly. It can enter the body through a natural orifice or through a small surgical incision. Because an endoscope can show lesions that X-ray imaging cannot display, it is very useful for diagnosing the patient's condition. The image of the examined site is captured through a cold light source lens, optical fibers, an image transmission system, and a display system, converted into a digital signal, and then stored and reproduced on the instrument's display screen.
The following describes the technical solution according to the embodiment of the present application in detail mainly by taking cruciate ligament reconstruction surgery as an example.
FIG. 1A shows a component schematic view of a surgical navigational positioning system for cruciate ligament reconstruction surgical procedures, according to an example embodiment of the present application; FIG. 1B shows a system block diagram.
Referring to fig. 1A and 1B, a surgical navigation and positioning system with an endoscope according to an example embodiment may include a surgical robot 105, a guide 112, a guide tracking system 111, a scanning imaging device 113, a surgical tracking system 107, an endoscope 110, a planning probe 109, a human-machine interaction device 103, and a superordinate controller 101.
As shown in fig. 1A, the surgical robot 105 may include a robotic arm 1050, and the guide 112 for the surgical tool may be secured to the operative end of the robotic arm 1050. With the widespread adoption of minimally invasive surgery in recent years and the increasing demands on the positioning precision of instruments and implants, surgical robots are increasingly used, for example to assist with surgical positioning or even to complete operations automatically. According to an exemplary embodiment, after the surgical approach is determined, the robotic arm 1050 may be controlled so that the guide 112 is positioned at the surgical site, allowing manual or automated/semi-automated surgical operations to follow. For example, in cruciate ligament reconstruction surgery, the guide 112 may guide surgical tools to drill the ligament tunnels in the femur and tibia.
The guiding device tracking system 111 may be an optical tracking device or a magnetic navigation tracking device for acquiring position data of the guiding device. According to an example embodiment, the guiding device tracking system 111 may include a navigation camera 116 and a guiding device tracer disposed on the robotic arm 1050. The guiding device tracer may comprise an infrared tracer or a reflective-ball tracer. The navigation camera 116 includes an optical sensor that receives signals from the tracer elements of the guiding device tracer, converts them into positioning information for the guiding device, and sends this information to the upper controller 101. The upper controller 101 can thus determine the position of the robotic arm as a basis for controlling its motion path.
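The following minimal sketch illustrates this data flow. It is an illustration under assumptions only: the camera call (get_pose) and the controller methods are hypothetical placeholders, not an API defined by this application; the sketch merely shows how a tracer pose reported in the navigation-camera frame might be packed into a homogeneous transform and forwarded to the upper controller.

```python
# Minimal sketch only; the camera and controller interfaces are hypothetical
# placeholders, not an API defined in this application.
from dataclasses import dataclass
import numpy as np

@dataclass
class TracerPose:
    """Rigid pose of one tracer reported in the navigation-camera frame."""
    rotation: np.ndarray     # 3x3 rotation matrix
    translation: np.ndarray  # 3-vector, in millimetres

def to_homogeneous(pose: TracerPose) -> np.ndarray:
    """Pack a TracerPose into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = pose.rotation
    T[:3, 3] = pose.translation
    return T

def forward_guide_poses(camera, controller, tracer_id="guide_tracer"):
    """Relay each guiding-device tracer pose from the camera to the upper controller."""
    while controller.is_running():            # hypothetical controller flag
        pose = camera.get_pose(tracer_id)     # hypothetical camera call
        if pose is not None:                  # the tracer may be occluded
            controller.update_guide_pose(to_homogeneous(pose))
```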
The scanning imaging device 113 is used to scan a surgical object to generate a scanned image. The scanning imaging device 113 may be a CT or CBCT scanner, but the present application is not limited thereto. The surgical object may be the body part of the patient to be operated on, such as the knee joint (which may include part or all of the femur and tibia). According to the embodiment of the application, a scanned image of the knee joint is obtained through the scanning imaging device 113, and a mapping between the image and the patient space is established, so that the robotic arm 1050 can subsequently be controlled to move to the planned surgical position for performing the operation.
The surgical tracking system 107 may be an optical tracking device or a magnetic navigation tracking device for acquiring position data of the surgical object. According to an example embodiment, the surgical tracking system 107 may include the aforementioned navigation camera 116 and surgical tracers 1070 and 1071, which may be secured to the surgical object. The surgical tracers 1070 and 1071 may include infrared tracers or reflective-ball tracers. Referring to fig. 1A, in an exemplary embodiment, surgical tracers 1070 and 1071 may be disposed on the femur 117 and tibia 119, respectively. The navigation camera 116 may receive signals from the surgical tracers 1070 and 1071 to gather position information of the lower limb and may send this information to the upper controller 101. The upper controller 101 can thus determine the position of the lower limb as a basis for planning the surgical path of the robotic arm.
In a medical image-guided positioning or surgical navigation system, the image coordinate system, the positioning device coordinate system, and/or the patient coordinate system must be related to one another to complete image registration. For example, in the preoperative three-dimensional image acquisition and intraoperative registration method, a three-dimensional image of the surgical object is acquired before the operation, and registration is achieved by using a spatial coordinate measuring device to measure anatomical feature points on the patient and pairing them with the corresponding feature points in the image.
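As an illustration of this paired-point step, the sketch below computes the least-squares rigid transform between corresponding image-space and patient-space points with the standard SVD (Kabsch) solution and reports the fiducial registration error. This is a common way to implement such a registration, offered here only as a sketch; the application does not prescribe a particular algorithm.

```python
import numpy as np

def paired_point_registration(image_pts: np.ndarray, patient_pts: np.ndarray):
    """Least-squares rigid transform mapping image-space points to patient space.

    image_pts, patient_pts: (N, 3) arrays of corresponding points, N >= 3.
    Returns (R, t) such that patient ~= R @ image + t (Kabsch / SVD method).
    """
    ci = image_pts.mean(axis=0)
    cp = patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # avoid a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ ci
    return R, t

def fre(image_pts, patient_pts, R, t):
    """Mean fiducial registration error, a quick quality check on the fit."""
    return float(np.linalg.norm(patient_pts - (image_pts @ R.T + t), axis=1).mean())
```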
According to an example embodiment, a calibrator 115 may be used for image registration. The calibrator 115 may be mounted in a fixed position relative to the patient's body, for example on the robotic arm, on the guiding device tracer, or on a surgical tracer. According to some embodiments, the calibrator and the tracer may be a combined two-in-one structure.
During image acquisition, the calibrator 115 may be placed in the field of view (FOV) of the scanning imaging device so that, when the three-dimensional medical image is formed, the specifically distributed marker points on the calibrator appear in the image. Spatial positioning can then be computed from the distribution of these marker points in the three-dimensional image, achieving automatic registration of the image space, the robot space, and the patient space, on which surgical planning and positioning navigation are then based.
According to an exemplary embodiment, for example in a cruciate ligament reconstruction procedure, the surgical tracers 1070 and 1071 may be fixed to the femur and tibia of the affected limb, respectively, while the knee joint remains movable. The scanning imaging device 113 performs CT/CBCT scanning of the femur, tibia, and knee joint, and the knee joint is kept still from scanning until image registration is complete. The images are registered with the surgical tracer 1070 via the CT/CBCT images, the surgical tracer 1070, and the calibrator 115; with the surgical tracer 1071 via the CT/CBCT images, the surgical tracer 1071, and the calibrator 115; and with the robotic arm via the CT/CBCT images, the guiding device tracer, and the calibrator 115. After registration is complete, the femoral position is tracked by surgical tracer 1070, the tibial position by surgical tracer 1071, and the position of the guiding device on the robotic arm by the guiding device tracer.
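Once registration is complete, a planned point given in the scanned image can be carried into the current scene by composing the static registration transform with the live tracer pose reported by the navigation camera. The sketch below shows this composition; the transform names and frame conventions are illustrative assumptions, since the application only states that registration and tracking are combined.

```python
import numpy as np

def compose(*transforms: np.ndarray) -> np.ndarray:
    """Chain 4x4 homogeneous transforms left to right."""
    T = np.eye(4)
    for X in transforms:
        T = T @ X
    return T

def image_point_in_camera_frame(p_image,          # 3-vector in CT/CBCT image space
                                T_tracer_image,   # registration: image frame -> bone tracer frame
                                T_camera_tracer): # live pose: bone tracer frame -> camera frame
    """Map a planned image-space point into the current navigation-camera frame.

    The transform names are illustrative; the point follows the registered bone
    because T_camera_tracer is updated continuously from the tracked tracer.
    """
    T_camera_image = compose(T_camera_tracer, T_tracer_image)
    p = T_camera_image @ np.append(np.asarray(p_image, float), 1.0)
    return p[:3]
```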
An endoscope 110 (e.g., an arthroscope) is used to acquire images of the surgical site. In cruciate ligament reconstruction surgery, an arthroscope allows the ligament injury to be seen and the ligament insertion points to be explored. However, the endoscope has a narrow field of view that cannot contain all of the required anatomical features, so the success of the operation depends on the surgeon's expertise and skill. In the embodiment of the application, the endoscope is used together with the scanned image, so the surgeon can see more information and determine the ligament insertion point more accurately.
According to some embodiments, the system may also include an endoscope tracking system. Similar to the surgical tracking system 107, the endoscope tracking system may include the aforementioned navigation camera 116 and an endoscope tracer 114 mounted on the endoscope. The navigation camera 116 may receive signals from the endoscope tracer 114 to gather position information for the endoscope and may send this information to the upper controller 101 to register the endoscope with the scanned image of the surgical object.
According to some embodiments, the endoscope image and the scanned medical image (CT/CBCT image) can be displayed in fusion, that is, the two images are put into correspondence and shown together on the same device, which makes it convenient for the surgeon to observe the surgical site and determine the surgical position, for example the surgical channel of a cruciate ligament reconstruction.
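One simple way to realize such a fused display is to project registered CT/CBCT-space points (for example a planned insertion point) into the endoscope image with a pinhole camera model. The sketch below assumes the endoscope intrinsics have been calibrated and ignores lens distortion, which real endoscopes exhibit strongly; it is an assumption-laden illustration, not the fusion method specified by this application.

```python
import numpy as np

def project_to_endoscope(p_ct, T_endo_ct, K):
    """Project a CT/CBCT-space point into endoscope pixel coordinates.

    p_ct:      3-vector in the scanned-image frame.
    T_endo_ct: 4x4 transform from the scanned-image frame to the endoscope
               camera frame, derived from the endoscope tracer registration
               (name and convention are illustrative assumptions).
    K:         3x3 pinhole intrinsic matrix of the endoscope (assumed calibrated).
    """
    p = T_endo_ct @ np.append(np.asarray(p_ct, float), 1.0)
    if p[2] <= 0:
        return None                      # point lies behind the endoscope
    uv = K @ (p[:3] / p[2])              # perspective division, then intrinsics
    return uv[:2]                        # pixel (u, v) to overlay on the video
```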
Optionally, the system may further comprise an endoscopic holding device for holding the endoscope in stable alignment with the surgical site.
According to an exemplary embodiment, the endoscope holding device can be a passive arm that can be locked in any pose to keep the endoscope stably aligned with the surgical site. While the channel is being drilled, the drill bit can be monitored through the endoscope to prevent damage to the tissues in and around the joint once the drill breaks through the bone surface.
According to some embodiments, the system may further comprise a surgical object fixation device for fixing the surgical object to the surgical table.
The planning probe 109 has a contact tip and a positioning portion and is used to pick up the surgical site under the guidance of the endoscope. For example, the planning probe 109 may pick up the feature points that define the surgical corridor, such as the exit point (ligament insertion point) and/or the entry point of the corridor. According to an example embodiment, the upper controller 101 acquires the spatial coordinates of each point picked up by the planning probe 109.
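A common way to obtain such coordinates is to apply a fixed tip offset to the live pose of the probe tracer and then map the tip into image coordinates for recording. The tip offset is usually found by a pivot calibration, which is an assumption here rather than a step stated in the application; the sketch below is illustrative only.

```python
import numpy as np

def probe_tip_in_camera(T_camera_probe: np.ndarray, tip_offset: np.ndarray) -> np.ndarray:
    """Current position of the planning-probe tip in the navigation-camera frame.

    T_camera_probe: 4x4 pose of the probe tracer reported by the navigation camera.
    tip_offset:     3-vector from the tracer origin to the contact tip, expressed
                    in the tracer's own frame (typically from a pivot calibration;
                    assumed here, not a step spelled out in this application).
    """
    return (T_camera_probe @ np.append(np.asarray(tip_offset, float), 1.0))[:3]

def record_pick(points: list, T_camera_probe, tip_offset, T_image_camera):
    """On a confirm event, store the picked tip position in image coordinates."""
    tip_cam = probe_tip_in_camera(T_camera_probe, tip_offset)
    tip_img = (T_image_camera @ np.append(tip_cam, 1.0))[:3]
    points.append(tip_img)
    return tip_img
```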
The human-machine interaction device 103 is used to display a scanned image of the surgical object, an image of the surgical site acquired through the endoscope, and the surgical site picked up by the planning probe 109, as will be described later with reference to fig. 3.
The upper controller 101 may be communicatively connected to the human-computer interaction device 103, the surgical robot 105, the surgical tracking system 107, the guiding device tracking system 111, the scanning imaging device 113, the planning probe 109, and the endoscope tracer 114; it may receive information transmitted by the human-computer interaction device 103, the surgical tracking system 107, the planning probe 109, and the endoscope tracer 114, and transmit related information or instructions to the human-computer interaction device 103, the surgical robot 105, and the surgical tracking system 107.
In some embodiments, the upper controller 101 may control the guiding device tracking system 111, the calibrator 115, and the like to be enabled.
According to an example embodiment, upper controller 101 may determine a surgical corridor based on the surgical location picked up by the planning probe 109.
According to some embodiments, for example in cruciate ligament reconstruction surgery, the planning probe 109 picks up a ligament insertion point as the exit point of the surgical tunnel and also picks up the entry point of the tunnel. The upper controller 101 can then determine the surgical channel from the exit point and the entry point.
According to other embodiments, for example, in cruciate ligament reconstruction surgery, the planning probe 109 may pick up only ligament insertion points as the exit points of the surgical corridor, and the upper controller 101 may then determine the entry points of the surgical corridor on the scanned image of the surgical object in response to the doctor's operation, thereby determining the surgical corridor.
According to other embodiments, for example, in the cruciate ligament reconstruction surgery, the planning probe 109 may pick up only the ligament insertion point as the exit point of the surgical tunnel, and the upper controller 101 may automatically plan the entry point of the surgical tunnel according to the exit point and the scan image, thereby determining the surgical tunnel.
After the surgical corridor is determined, the upper controller 101 may control the robotic arm 1050 to position the guiding device in the surgical corridor for the subsequent operation. For example, the upper controller 101 controls the robotic arm 1050 to move so that the guiding device at the end of the arm is aligned with the channel position, and the operator then drills the channel under the guidance of the robotic arm to place and fix the artificial ligament.
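For illustration, the sketch below turns an entry point and an exit point into a corridor axis and a 4x4 target pose whose z-axis lies along the corridor. The stand-off distance and the choice of the remaining axes are arbitrary assumptions; the application does not prescribe how the guide pose is parameterized.

```python
import numpy as np

def corridor_from_points(entry, exit_):
    """Surgical corridor as an axis through the entry and exit points."""
    entry = np.asarray(entry, float)
    exit_ = np.asarray(exit_, float)
    axis = exit_ - entry
    length = float(np.linalg.norm(axis))
    return entry, axis / length, length            # origin, unit direction, length

def guide_target_pose(entry, exit_, standoff_mm=20.0):
    """4x4 target pose for the guiding device: z-axis along the corridor, with the
    guide held a small stand-off back from the entry point (value illustrative)."""
    origin, z, _ = corridor_from_points(entry, exit_)
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, z)) > 0.9:                  # avoid a near-parallel reference
        ref = np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                             # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z
    T[:3, 3] = origin - standoff_mm * z
    return T
```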
Fig. 2 shows a planning probe diagram that may be used in a system according to an example embodiment.
As shown in fig. 2, planning probe 109 may include a straight needle 1 and a tracer assembly 3. The straight needle 1 may comprise a straight needle body 21 and a straight needle tip 22. The straight needle body 21 may further include a connecting portion 211 and a holding portion 212.
The tracer assembly 3 may comprise at least three tracer units 31 and a tracer support 32. Each tracer unit 31 is embedded in the tracer support 32 through an insertion hole. The tracer units 31 are recognized by the navigation camera (the optical tracking device), which thereby determines the probe position. According to an example embodiment, the planning probe can interact with the upper controller: a confirmation button may be provided on the planning probe, and when it is pressed the upper controller records the current position coordinates of the probe, for example through surgical planning and control software running on the upper controller.
According to some embodiments, under endoscopic guidance, the planning probe is used to pick up the exit point (ligament insertion point) and the entry point of the surgical channel, respectively.
According to some embodiments, the endoscope and the planning probe may be of unitary construction.
Planning probes are well known to those skilled in the art and will not be described in detail here. It will be readily appreciated that the planning probe usable in the present application is not limited to the type shown in fig. 2; any other suitable surgical planning probe may be employed.
Fig. 3 illustrates an image of surgical corridor planning according to an exemplary embodiment.
Referring to fig. 3, for example, in cruciate ligament reconstruction surgery, the exit and entry points of the planned surgical tunnel are shown on the scan image. In addition, the scanning image also comprises an image of the calibrator.
According to some embodiments, the entry point of the surgical channel may be determined on the scanned image of the surgical object in response to a user operation on the human-machine interface.
According to further embodiments, an entry point of the surgical tunnel may be automatically planned according to an exit point of the surgical tunnel picked up by the planning probe.
FIG. 4 illustrates a guide structure diagram that may be used with a system according to an exemplary embodiment.
The guiding device can be fixed at the operation end of the mechanical arm and used for guiding a surgical tool to perform a surgical operation.
Referring to fig. 4, according to some embodiments, the guide device may include a robot arm connection portion 401, a guide connection portion 403, and a guide portion 405.
The mechanical arm connecting portion 401 can be quickly mounted on or removed from the mechanical arm during an operation, and accurate, repeatable mounting is achieved through V-shaped surface positioning and screw clamping.
The guide portion 405 may have one or two portions that mate with a sleeve; through this mating, accurate positioning of the surgical path is achieved.
The guide link 403 connects the arm link 401 and the guide 405.
It will be readily appreciated that the guide arrangement shown in figure 4 is exemplary only and not limiting to the present application.
Fig. 5 shows a flow chart of a navigation positioning method of an endoscope-assisted cruciate ligament reconstruction procedure according to an example embodiment of the present application.
Referring to fig. 5, at S501, a first tracer and a second tracer are mounted on the femur and tibia of the patient, respectively, and a CT image spanning the knee joint is scanned. The patient's lower limb can be fixed to the operating table with the fixing device to prevent it from shaking, so that the subsequent endoscopic surgical steps can be completed more accurately.
At S503, the images are registered and a mapping between the image space and the patient space is established. For example, after a CT image containing the two relatively movable parts has been acquired, the spatial positional relationship between each part and its tracer (the first tracer and the second tracer, respectively) is calculated.
At S505, under the endoscope, the exit point (ligament insertion point) and/or the entry point of the surgical channel is picked up with the planning probe, and the spatial coordinates of each picked point are recorded. The upper controller then automatically generates the planned channel position from the spatial coordinates of the exit point and/or the entry point, for example through surgical planning and control software configured on the upper controller.
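A minimal sketch of this automatic planning step under strong assumptions is given below: the entry point is simply placed a chosen tunnel length along a chosen direction from the picked exit point. The actual planning logic, including how the direction and length are constrained, is not detailed in this application; the parameters here stand in for surgeon- or template-derived values.

```python
import numpy as np

def plan_entry_point(exit_point, tunnel_direction, tunnel_length_mm):
    """Illustrative automatic planning: offset the entry point from the picked
    exit point (ligament insertion point) along a chosen tunnel direction.

    tunnel_direction and tunnel_length_mm are assumed inputs standing in for
    the planning constraints that the software would actually apply."""
    d = np.asarray(tunnel_direction, float)
    d = d / np.linalg.norm(d)
    entry = np.asarray(exit_point, float) + tunnel_length_mm * d
    return entry, np.asarray(exit_point, float)    # the planned channel endpoints
```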
At S507, the mechanical arm is controlled to move according to the planned channel position so that the guiding device at the end of the arm is aligned with the channel; the operator then drills the channel under the guidance of the robotic arm.
The technical solution of the present application has been described above mainly by taking an endoscope-assisted cruciate ligament reconstruction procedure as an example. It is easy to understand that the system and method according to the embodiments of the present application can also be used for other endoscope-based soft tissue operations: the probe picks up a calibration point of the surgical object under endoscopic monitoring (the point may, for example, lie on a soft tissue region of the surgical object), the spatial coordinates of that point are obtained, and the upper controller controls the robotic arm so that the guiding device is positioned at the calibration point of the surgical object.
Through the description of the example embodiments, those skilled in the art will readily appreciate that the technical solutions according to the embodiments of the present application have at least one or more of the following advantages.
According to some embodiments, ligament insertion points are picked up accurately under the endoscope and the channel position is planned accurately. This solves the problems of general navigation systems, in which soft tissue is not visible in CT or X-ray fluoroscopy images so ligament insertion points cannot be picked up accurately, while MRI images are expensive and inconvenient to use.
According to some embodiments, ligament insertion points are accurately picked up through the probe, so that accurate positioning of the robot is realized, and operation failure caused by channel deviation is prevented.
According to some embodiments, the automatic positioning is realized through a surgical robot, so that the operation difficulty of a doctor is reduced.
It is clear to a person skilled in the art that the solution of the present application can be implemented by means of software and/or hardware. The terms "unit" and "module" in this specification refer to software and/or hardware that can perform a specific function independently or in cooperation with other components, where the hardware may be, for example, a Field-Programmable Gate Array (FPGA), an Integrated Circuit (IC), or the like.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The embodiments of the present application have been described and illustrated in detail above. It should be clearly understood that this application describes how to make and use particular examples, but the application is not limited to any details of these examples. Rather, these principles can be applied to many other embodiments based on the teachings of the present disclosure.
Exemplary embodiments of the present application are specifically illustrated and described above. It is to be understood that the application is not limited to the details of construction, arrangement, or method of implementation described herein; on the contrary, the intention is to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (13)

1. A surgical navigational positioning system with the help of an endoscope, the system comprising:
a surgical robot comprising a robotic arm;
a guiding device fixed at the operating end of the mechanical arm;
a guide device tracking system for acquiring position data of the guide device;
scanning imaging means for scanning a surgical object to generate a scanned image;
a surgical tracking system for acquiring positional data of the surgical object;
an endoscope for acquiring an image of a surgical site;
a planning probe having a contact tip and a positioning portion for picking up a surgical site under the guidance of the endoscope;
a human-computer interaction device for displaying the scanned image of the surgical object, the surgical-site image acquired by the endoscope, and the surgical position picked up by the planning probe;
an upper controller in communication connection with the surgical robot, the guiding device tracking system, the scanning imaging device, the surgical tracking system, the endoscope, the planning probe, and the human-computer interaction device;
the upper controller is configured to:
determining a surgical corridor based on the surgical location picked up by the planning probe;
controlling the robotic arm such that the guide device is positioned in the surgical corridor.
2. The surgical navigational positioning system of claim 1, wherein the surgical tracking system comprises:
a surgical tracer for securing to a surgical subject;
a navigation camera for acquiring spatial position information of the surgical tracer and sending it to the upper controller.
3. The surgical navigational positioning system of claim 2, wherein the guide device tracking system comprises:
a guiding device tracer arranged on the mechanical arm;
a navigation camera for acquiring spatial position information of the guiding device tracer and sending it to the upper controller.
4. The surgical navigational positioning system of claim 2, further comprising an endoscope tracking system comprising:
an endoscope tracer disposed at the endoscope;
a navigation camera for acquiring spatial position information of the endoscope tracer and sending it to the upper controller.
5. The surgical navigational positioning system of claim 4, wherein the superior controller is further configured to:
and fusing and displaying the image of the endoscope and the scanned image of the operation object.
6. The surgical navigational positioning system of claim 1, wherein the superior controller is further configured to:
the spatial coordinates of the surgical site picked up by the planning probe are recorded.
7. The surgical navigational positioning system of claim 1, further comprising:
an endoscope holding device for keeping the endoscope stably aligned with the surgical site.
8. The surgical navigational positioning system of claim 1, further comprising:
a surgical object fixing device for fixing the surgical object on the operating table.
9. The surgical navigational positioning system of claim 4, further comprising:
a calibrator disposed at the mechanical arm, the guiding device tracer, or the surgical tracer.
10. The surgical navigational positioning system of claim 1, wherein the endoscope is a unitary structure with the planning probe.
11. The surgical navigational positioning system of claim 1, wherein the planning probe is configured to pick up feature points that define the surgical corridor.
12. The surgical navigational positioning system of claim 2, wherein:
the system is used for cruciate ligament reconstruction surgery;
the surgical tracer comprises a first tracer and a second tracer arranged on the femur and the tibia, respectively;
the planning probe is used for picking up a ligament insertion point as an exit point and/or an entry point of the surgical channel.
13. The surgical navigational positioning system of claim 12, wherein the superior controller is further configured to:
determine, in response to a user operation, an entry point of the surgical channel on the scanned image of the surgical object;
or automatically plan an entry point of the surgical channel according to the exit point of the surgical channel picked up by the planning probe.
CN202110528999.7A 2021-05-14 2021-05-14 Operation navigation positioning system with help of endoscope Pending CN113303840A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110528999.7A CN113303840A (en) 2021-05-14 2021-05-14 Operation navigation positioning system with help of endoscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110528999.7A CN113303840A (en) 2021-05-14 2021-05-14 Operation navigation positioning system with help of endoscope

Publications (1)

Publication Number Publication Date
CN113303840A (en) 2021-08-27

Family

ID=77373307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110528999.7A Pending CN113303840A (en) 2021-05-14 2021-05-14 Operation navigation positioning system with help of endoscope

Country Status (1)

Country Link
CN (1) CN113303840A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113855286A (en) * 2021-09-24 2021-12-31 四川锋准机器人科技有限公司 Implant robot navigation system and method
CN113855286B (en) * 2021-09-24 2023-01-10 四川锋准机器人科技有限公司 Implant robot navigation system and method
CN114767031A (en) * 2022-03-31 2022-07-22 常州朗合医疗器械有限公司 Endoscope apparatus, position guide apparatus of endoscope, system, method, and computer-readable storage medium
CN114767031B (en) * 2022-03-31 2024-03-08 常州朗合医疗器械有限公司 Endoscope apparatus, position guidance apparatus, system, method, and computer-readable storage medium for endoscope
CN116459013A (en) * 2023-04-24 2023-07-21 北京微链道爱科技有限公司 Control method based on 3D visual recognition and cooperative robot
CN116459013B (en) * 2023-04-24 2024-03-22 北京微链道爱科技有限公司 Collaborative robot based on 3D visual recognition
CN116549114A (en) * 2023-05-12 2023-08-08 北京长木谷医疗科技股份有限公司 Intelligent perception interaction device and system of surgical robot
CN116549114B (en) * 2023-05-12 2024-04-02 北京长木谷医疗科技股份有限公司 Intelligent perception interaction device and system of surgical robot

Similar Documents

Publication Publication Date Title
CN113303840A (en) Operation navigation positioning system with help of endoscope
US10786307B2 (en) Patient-matched surgical component and methods of use
US20220031404A1 (en) System and method for verifying calibration of a surgical system
US20220354580A1 (en) Surgical navigation system, computer for performing surgical navigation method, and storage medium
US8934961B2 (en) Trackable diagnostic scope apparatus and methods of use
US9248001B2 (en) Computer assisted orthopedic surgery system for ligament reconstruction
US6725082B2 (en) System and method for ligament graft placement
JP5190510B2 (en) Multifunctional robotized platform for neurosurgery and position adjustment method
US6112113A (en) Image-guided surgery system
US20070073136A1 (en) Bone milling with image guided surgery
US20090183740A1 (en) Patella tracking method and apparatus for use in surgical navigation
US20070038059A1 (en) Implant and instrument morphing
JP2021528212A (en) Knee surgery methods and devices using inertial sensors
CN107124867A (en) Shape for orthopedic navigation is sensed
JP7367976B2 (en) Detection system and method for automatic detection of surgical instruments
CN116096305B (en) Tunnel position determining system and method for anterior/posterior cruciate ligament reconstruction
US20220323164A1 (en) Method For Stylus And Hand Gesture Based Image Guided Surgery
CN116234489A (en) Markless navigation system
CN212630809U (en) Knee joint ligament rebuilds intelligent control system
CN217525216U (en) Operation navigation positioning system with help of endoscope
US20220296083A1 (en) Arthroscopy method and device
CN112826567A (en) Mechanical arm-assisted medial patellofemoral ligament femoral side positioning operation system and method
JP2001204739A (en) Microscopic medical operation support system
WO2022048554A1 (en) Posteromedial structure, posterolateral structure, and medial patellofemoral ligament reconstruction positioning system and method
US20220331014A1 (en) Endoscope with procedure guidance

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination