US20200170721A1 - Robotized system for femoroacetabular impingement resurfacing - Google Patents
- Publication number
- US20200170721A1 (U.S. application Ser. No. 16/786,522)
- Authority
- US
- United States
- Prior art keywords
- osteophyte
- resurfacing
- bone
- model
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/00234—Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1739—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
- A61B17/1764—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the knee
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1739—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
- A61B17/1742—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip
- A61B17/1746—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip for the acetabulum
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1739—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
- A61B17/1742—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip
- A61B17/175—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip for preparing the femur for hip prosthesis insertion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
Definitions
- the present application relates to computer-assisted orthopedic surgery involving robotized apparatuses.
- Computer-assisted surgery has been developed to help an operator in altering bones, and in positioning and orienting implants to a desired orientation.
- optical navigation typically requires the use of a navigation system, which adds operative time.
- the optical navigation is bound to line-of-sight constraints that hamper the normal surgical flow.
- C-arm validation requires the use of bulky equipment, and the C-arm validation is not cost-effective.
- FIG. 1 is a schematic view of a robotized surgery system, in accordance with some embodiments.
- FIG. 2 is an exemplary perspective view of a foot support of the robotized surgery system of FIG. 1 , in accordance with some embodiments.
- FIG. 3A is a perspective schematic view of femoroacetabular impingement (FAI) conditions on the pelvis, in accordance with some embodiments.
- FIG. 3B is a perspective schematic view of FAI conditions on the femoral head and neck, in accordance with some embodiments.
- FIG. 4 is a block diagram of a FAI resurfacing controller used with the robotized surgery system of FIG. 1 , in accordance with some embodiments.
- FIG. 5 illustrates a flow chart showing a robotized surgery system technique for FAI resurfacing, in accordance with some embodiments.
- FIG. 6 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
- the present disclosure describes a robotic system for resurfacing bones, and in particular, for detecting and resurfacing one or more femoroacetabular impingements (FAIs).
- a FAI resurfacing controller may be used to perform this detecting and resurfacing of FAIs.
- the FAI resurfacing controller may include a bone model generator to receive bone imaging and to generate a model of at least one osteophyte and of a surface of a native bone surrounding the at least one osteophyte.
- the FAI resurfacing controller may include an osteophyte identifier to set a virtual 3D boundary surface between native bone surface and the at least one osteophyte.
- the FAI resurfacing controller may include a resurfacing navigator to generate and output a navigation file.
- the navigation file may include the model with the 3D boundary surface between native bone surface and the at least one osteophyte.
- the navigation file may also include patient-specific numerical control data for resurfacing the bone to remove the at least one osteophyte.
- the robotized surgery system 10 may include a robot arm 20 , a foot support 30 , a thigh support 40 , a robotized surgery controller 50 , a FAI resurfacing controller 60 , and a supplementary tracking apparatus 70 .
- the robot arm 20 is the working end of the system 10 , and is used to perform bone alterations as planned by an operator and as controlled by the robotized surgery controller 50 .
- the robot arm 20 is positioned to access the hip joint of the patient for performing FAI resurfacing.
- the foot support 30 supports the foot and lower leg of the patient, in such a way that it is only selectively movable for adjustment to the patient's position and morphology.
- the thigh support 40 supports the thigh and upper leg of the patient, again in such a way that it is only optionally movable.
- the thigh support 40 may assist in keeping the hip joint fixed during FAI resurfacing, and should hence be positioned so as not to impede the movements of the robot arm 20 .
- the robotized surgery controller 50 controls the robot arm 20 .
- the FAI resurfacing controller 60 outputs data used to drive the robot arm 20 in performing the FAI resurfacing.
- the tracking apparatus 70 may optionally be used to track the robot arm 20 and the patient limbs.
- the robot arm 20 may stand from a base 21, for instance in a fixed relation relative to the operating room (OR) table supporting the patient. Indeed, the positioning of the robot arm 20 relative to the patient is a determinative factor in the precision of the surgical procedure, whereby the foot support 30 and thigh support 40 may assist in keeping the operated limb fixed in the illustrated {X Y Z} coordinate system.
- the robot arm 20 has a plurality of joints 22 and links 23 , of any appropriate form, to support a tool head 24 that interfaces with the patient.
- the arm 20 is shown being a serial mechanism, arranged for the tool head 24 to be displaceable in sufficient degrees of freedom (DOF).
- the robot arm 20 provides 6-DOF movement of the tool head 24: translations along the {X Y Z} axes of the coordinate system, and rotations in pitch, roll, and yaw. Fewer or additional DOFs may be present.
- the joints 22 are powered for the robot arm 20 to move as controlled by the controller 50 in the six DOFs. Therefore, the powering of the joints 22 is such that the tool head 24 of the robot arm 20 may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities.
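The restriction of the tool head to motion along a plane, described above, can be illustrated by projecting each commanded displacement onto the plane. This is a hedged sketch of the geometric idea only, not the controller's actual motion-control implementation:

```python
import math

def project_to_plane(displacement, plane_normal):
    """Project a commanded 3D displacement onto a plane by removing
    its component along the plane normal, so the tool head stays
    constrained to in-plane motion."""
    norm = math.sqrt(sum(c * c for c in plane_normal))
    n = [c / norm for c in plane_normal]          # unit normal
    dot = sum(d * c for d, c in zip(displacement, n))
    return [d - dot * c for d, c in zip(displacement, n)]

# Example: constrain motion to the X-Y plane (normal along Z).
move = project_to_plane([1.0, 2.0, 3.0], [0.0, 0.0, 1.0])
# move == [1.0, 2.0, 0.0]: the out-of-plane Z component is removed.
```

The same projection, applied with a single direction vector instead of a plane, yields the one-translation-DOF restriction also mentioned above.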
- Various types of robot arms 20 may be used, such as those described in U.S. patent application Ser. No. 11/610,728, incorporated herein by reference.
- the tool head 24 may also comprise a chuck or like tool interface, typically actuatable in rotation.
- the tool head 24 supports a burr 26 A, such as may be used during FAI resurfacing.
- the tool head 24 may support other surgical tools, such as a registration pointer, a reamer, a cannula, a reciprocating saw, or another surgical tool.
- the various tools may be interchanged, whether with human assistance, or as an automated process.
- the installation of a tool in the tool head 24 may then require some calibration to position the installed tool in the {X Y Z} coordinate system of the robot arm 20 .
- Various surgical procedures may be performed when tool head 24 is used with a cannula.
- the robot arm 20 may be used to perform a robotically assisted arthroscopy procedure.
- a shaver, burr, suture applicator, or another surgical instrument may be operated through the cannula.
- the controller 50 may target the cannula using sensors within the robot arm 20 , such as sensors used to detect position or rotation of various components of the robot arm 20 .
- the controller 50 may target the cannula using one or more cameras mounted on the tracking apparatus 70 .
- the tracking apparatus 70 can include at least two cameras coupled to a computing system that utilizes images captured from the cameras to triangulate positions of tracked objects.
- the tracking apparatus 70 can include an arthroscopic camera and a computing device that receives image data and performs image processing operations to segment information out of the image data related to osteophyte removal.
- the tracking apparatus 70 may also be used to verify removal of one or more osteophytes.
- a primary source of error during surgical procedures involving removal of osteophytes is the failure to completely remove the osteophyte, which may be due to the difficulty in determining how much of an osteophyte has been removed.
- the controller 50 may use the tracking apparatus 70 and robot arm 20 to determine how much of the osteophyte has been removed and whether the osteophyte removal process is complete.
- a feedback mechanism may be used to indicate when the osteophyte has been removed.
- the feedback may include a green light, an audible feedback, a tactile feedback, or other feedback.
- the determination of whether the osteophyte has been removed may be based on an image from the tracking apparatus 70 , based on comparing the burr 26 A location against a 3D model, based on manual leg manipulation and tracking range of movement through the tracking apparatus 70 , or based on another osteophyte removal confirmation.
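One of the confirmation options above, comparing the tracked burr 26A location against the 3D model, could be sketched as follows. The point-sampling scheme and function names are illustrative assumptions, not the system's actual method:

```python
import math

def removal_fraction(osteophyte_points, burr_positions, burr_radius):
    """Estimate how much of an osteophyte has been removed by checking
    which sampled model points ever fell within the burr's reach.
    osteophyte_points: 3D points sampled from the osteophyte model.
    burr_positions: tracked burr-tip positions over the procedure."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    removed = sum(
        1 for p in osteophyte_points
        if any(dist(p, b) <= burr_radius for b in burr_positions)
    )
    return removed / len(osteophyte_points)

# Toy example: two of three sampled points lie within burr reach.
frac = removal_fraction(
    [(0, 0, 0), (1, 0, 0), (5, 5, 5)],
    [(0, 0, 0.5), (1, 0, 0.5)],
    burr_radius=1.0,
)
# frac == 2/3: the point at (5, 5, 5) was never reached.
```

A controller could trigger the green-light or audible feedback described above once this fraction reaches a chosen threshold.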
- the controller 50 may drive the robot arm 20 in performing the surgical procedure based on the planning achieved pre-operatively.
- the robotized surgery controller 50 runs various modules, in the form of algorithms, code, non-transient executable instructions, etc., to operate the system 10 in the manner described herein.
- the controller 50 may include a robot driver module, where the robot driver module is tasked with powering or controlling the various joints of the robot arm 20 . Force feedback may be provided by the robot arm 20 to avoid damaging the soft tissue or surrounding environment.
- the robotized surgery controller 50 may have a processor unit to control movement of the robot arm 20 .
- System 10 may include an interface 90 to provide information to the operator.
- the interface 90 may include a display, a wireless portable electronic device (e.g., phone, tablet), a speaker for audio guidance, an LED display, or other type of interface.
- the controller 50 may be used to drive the robot arm 20 to avoid various predetermined soft tissues.
- the controller 50 may use the tracking apparatus 70 to detect a particular soft tissue and drive the robot arm 20 to avoid that soft tissue.
- the controller 50 may identify a safety zone, and may guide the robot arm 20 to enforce a safety zone by avoiding performing surgical procedures within the safety zone during a surgical procedure.
- the safety zone may include surrounding soft tissue, a native bone surface of a patient, or a critical blood vessel (e.g., femoral artery, neck artery).
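Safety-zone enforcement of the kind described above could, as one simplified illustration, model each protected tissue as a sphere and reject tool positions inside it; the spherical model and names are assumptions for illustration only:

```python
import math

def in_safety_zone(tool_pos, zones):
    """Return True if the tool position falls inside any safety zone,
    modeled here as spheres (center, radius) around protected tissue
    such as soft tissue or a critical blood vessel."""
    for center, radius in zones:
        d = math.sqrt(sum((t - c) ** 2 for t, c in zip(tool_pos, center)))
        if d <= radius:
            return True
    return False

# Hypothetical 5 mm zone around a blood vessel at (10, 0, 0).
zones = [((10.0, 0.0, 0.0), 5.0)]
hit = in_safety_zone((12.0, 0.0, 0.0), zones)   # inside the zone
miss = in_safety_zone((0.0, 0.0, 0.0), zones)   # safely away
```

A controller enforcing such zones would halt or redirect the robot arm whenever the check returns True.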
- a generic embodiment of the foot support 30 is shown in FIG. 1 , while one possible implementation of the foot support 30 is shown in greater detail in FIG. 2 .
- the foot support 30 may be displaceable relative to the OR table, to adjust to the patient, with the joints then lockable once a suitable position is reached.
- the mechanism of the foot support 30 may have a slider 31 , moving along the OR table in the X-axis direction. Joints 32 and links 33 may also be part of the mechanism of the foot support 30 , to support a foot interface 34 receiving the patient's foot.
- the thigh support 40 may be displaceable relative to the OR table, to be better positioned as a function of the patient's location on the table, so as not to impede action of the robot arm 20 . Accordingly, the thigh support 40 is shown as including a passive mechanism, with various lockable joints to lock the thigh support 40 in a desired position and orientation.
- the mechanism of the thigh support 40 may have a slider 41 , moving along the OR table in the X-axis direction. Joints 42 and links 43 may also be part of the mechanism of the thigh support 40 , to support a thigh bracket 44 .
- a strap 45 can immobilize the thigh/femur in the thigh support 40 .
- the thigh support 40 may not be necessary in some instances.
- the foot support 30 or the thigh support 40 may assist in keeping the bones fixed relative to the {X Y Z} coordinate system.
- the fixed relation may be required in instances in which no additional tracking is present to assist the actions of the robot arm 20 .
- the tracking apparatus 70 may provide intraoperative tracking information for the robot arm 20 and for the patient bones, in such a way that some movement of the patient is permissible intraoperatively as the movement is calculable and thus known in the {X Y Z} coordinate system.
- the operation of the tracking apparatus 70 may depend on the information within the navigation file C.
- the tracking apparatus 70 may assist in performing the calibration of the patient bone with respect to the robot arm 20 , for subsequent navigation in the {X Y Z} coordinate system.
- the tracking apparatus 70 may include two cameras to provide stereoscopic (e.g., 3D) image data to optically identify and locate retro-reflective references 71 A, 71 B, and 71 C, and to triangulate positions of objects associated with the references. In an embodiment, the reference 71 A is on the tool head 24 of the robot arm 20 such that its tracking allows the controller 50 to calculate the position and/or orientation of the tool head 24 and tool 26 A thereon.
- references 71 B and 71 C may be fixed to the patient bones, such as the tibia for reference 71 B and the femur for reference 71 C.
- references 71 B and 71 C are applied to the patient bones using a brief procedure to provide rapid reference tracking.
- references 71 B and 71 C may include application of a virtual marker (e.g., “painted on”) to an image of the bone, such as using interface 90 .
- the references 71 attached to the patient need not be invasively anchored to the bone, as straps or like attachment means may provide sufficient grasping to prevent movement between the references 71 and the bones, in spite of being attached to soft tissue.
- the references 71 may include a fabric removably and non-invasively attachable to a bone, where references 71 each include a plurality of reference markers distributed on the surface of the fabric.
- the controller 50 continuously updates the position or orientation of the robot arm 20 and patient bones in the {X Y Z} coordinate system using the data from the tracking apparatus 70 .
- Tracking system 70 may include one or more of optical tracking sensors, inertial tracking sensors, or other motion or location sensors.
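The two-camera triangulation described above can be illustrated with the classic midpoint-of-closest-approach construction between two camera rays; this ray-based formulation is an assumption for illustration, not necessarily the apparatus's actual algorithm:

```python
def triangulate(p1, d1, p2, d2):
    """Triangulate a 3D marker position from two camera rays, each given
    as (origin p, direction d), by returning the midpoint of the two
    rays' closest approach."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # near zero if the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * u for p, u in zip(p1, d1)]   # closest point on ray 1
    q2 = [p + s * u for p, u in zip(p2, d2)]   # closest point on ray 2
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two cameras 2 units apart, both sighting a marker at (0, 0, 5).
pt = triangulate((-1, 0, 0), (1, 0, 5), (1, 0, 0), (-1, 0, 5))
# pt == [0.0, 0.0, 5.0]
```

With more than two cameras, the same idea generalizes to a least-squares intersection of all rays.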
- FIG. 3A is a perspective schematic view of FAI conditions on the pelvis D, in accordance with some embodiments.
- FIG. 3B is a perspective schematic view of FAI conditions on the femoral head F 1 and neck F 2 , in accordance with some embodiments.
- the system 10 is used to resurface the femoral head F 1 or neck F 2 , or to resurface the periphery of the acetabulum A 1 in a FAI condition.
- the FAI condition may be caused by one or more osteophytes on the rim of the acetabulum A 1 or femoral head F 1 or neck F 2 .
- the FAI resurfacing controller 60 may receive bone imagery B 1 .
- the bone imagery B 1 may include a computed tomography (CT) scan image, magnetic resonance imaging (MRI) image, or any other radiography imagery.
- a bone model generator module 61 receives the bone imagery B 1 to generate a bone model therefrom.
- the model may be a 3D representation of at least a portion of the surface having osteophytes thereon.
- the 3D representation may be that of a portion of the acetabulum A 1 or of a portion of the femoral head F 1 and neck F 2 .
- the 3D representation may include a portion of the bone surface surrounding the osteophytes, and the osteophytes.
- the osteophyte identifier module 62 may be used to identify an osteophyte.
- the received bone model may be used to identify geometric features of the bone, and the osteophytes may be identified by identifying differences between bone model geometric features and a bone atlas database match.
- one or more femoral or acetabular geometric measurements may be used to identify geometric features of the bone.
- the geometric measurements may include an alpha angle, a lateral center edge angle, a femoral head coverage, a sourcil angle, an acetabular angle, or other femoral or acetabular geometric measurements.
- the alpha angle may be used to characterize the concavity of the anterior femoral head-neck junction, or how big the bump is on the femoral neck.
- the alpha angle is defined as the acute angle between the femoral neck axis and a line between the femoral head center with the point where the head-neck junction cortical surface first meets with a circle superimposed upon an ideally spherical femoral head.
- the alpha angle may be particularly useful in detecting an osteophyte that causes or contributes to a femoral cam impingement.
- the lateral center edge angle may be used to characterize the angular coverage of the femoral head by the weight-bearing zone of the acetabulum.
- the lateral center edge angle is defined as the angle formed by intersection of a vertical line extending through the femoral head center and a line extending through the femoral head center to the lateral sourcil.
- the lateral center edge angle may be particularly useful in detecting an osteophyte that contributes to acetabular dysplasia, acetabular instability, or femoral impingement.
- the femoral head coverage may be used to characterize weight-bearing femoral head coverage, where the femoral head coverage is defined as the percentage coverage of the femoral head by the weight-bearing zone of the acetabulum.
- the acetabular angle may be particularly useful in detecting an osteophyte that contributes to pincer impingement.
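Once the landmarks named above (head center, neck axis, head-neck junction point, lateral sourcil) are extracted from the imagery, the alpha angle and lateral center edge angle reduce to simple 2D vector angles. The landmark coordinates below are hypothetical, and the sketch assumes landmark extraction has already been done:

```python
import math

def angle_deg(u, v):
    """Angle between two 2D vectors, in degrees."""
    dot = u[0] * v[0] + u[1] * v[1]
    return math.degrees(math.acos(dot / (math.hypot(*u) * math.hypot(*v))))

def alpha_angle(head_center, neck_axis_point, junction_point):
    """Alpha angle: between the femoral neck axis and the line from the
    head center to where the cortex first leaves the best-fit circle."""
    neck = [neck_axis_point[0] - head_center[0], neck_axis_point[1] - head_center[1]]
    junc = [junction_point[0] - head_center[0], junction_point[1] - head_center[1]]
    return angle_deg(neck, junc)

def lateral_center_edge_angle(head_center, lateral_sourcil):
    """LCEA: between a vertical line through the head center and the
    line from the head center to the lateral sourcil."""
    edge = [lateral_sourcil[0] - head_center[0], lateral_sourcil[1] - head_center[1]]
    return angle_deg([0.0, 1.0], edge)

# Hypothetical 2D landmark coordinates:
a = alpha_angle((0, 0), (10, 0), (5, 5))                   # ≈ 45 degrees
lcea = lateral_center_edge_angle((0, 0), (5, 5 * math.sqrt(3)))  # ≈ 30 degrees
```

A larger-than-normal alpha angle would then flag a cam-type bump, consistent with its use above for detecting femoral cam impingement.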
- when the osteophyte identifier module 62 analyzes the bone geometry to identify the osteophyte, the geometric features may be used by the resurfacing navigator 63 to achieve a desired geometric goal or to correspond with a preoperatively planned geometric goal.
- the osteophyte identifier module 62 may identify the alpha angle, which may be used by the resurfacing navigator 63 to achieve a desired alpha angle or to correspond with a preoperatively planned alpha angle.
- the osteophyte identifier module 62 may analyze the bone model directly, such as by generating a 3D model based on the bone model and determining an impingement-free range of motion. For example, the osteophyte identifier module 62 may perform 3D reconstruction along the neck of the femur, identify the center of the sphere of the femoral head, and identify the non-spherical portions to determine impingement-free range of motion. Accordingly, the osteophyte identifier module 62 virtually segments the native bone surface from the osteophyte, by defining a 3D boundary surface between the native bone and the osteophyte.
- the 3D boundary surface is affixed to and surrounded by the 3D bone model of the bone model generator module 61 .
- the osteophyte identifier module 62 may alternatively or supplementally require the assistance of an operator.
- the 3D boundary surface based on the bone atlas data B 2 may be a starting point for an operator to perform adjustments to the virtual segments or other virtual boundaries.
- the osteophyte identifier module 62 may provide the bone model to the bone model generator module 61 , along with interactive virtual tools, for an operator to define a 3D boundary surface between the osteophyte and the native bone surface.
- the interactive virtual tool may include a suggested 3D boundary surface based on extensions of the native bone surrounding the osteophytes.
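The suggested 3D boundary surface based on extensions of the native bone could, as one simplified illustration, come from radial deviation against an ideal sphere fit to the femoral head (matching the identification of non-spherical portions described above); the tolerance value and point-classification scheme here are assumptions:

```python
import math

def segment_osteophyte(surface_points, head_center, head_radius, tol=0.5):
    """Split bone-surface points into native vs. osteophyte by radial
    deviation from an ideal sphere fit to the femoral head. Points
    protruding more than `tol` beyond the sphere are labeled osteophyte;
    the rest are treated as native bone."""
    native, osteophyte = [], []
    for p in surface_points:
        r = math.sqrt(sum((a - c) ** 2 for a, c in zip(p, head_center)))
        (osteophyte if r - head_radius > tol else native).append(p)
    return native, osteophyte

# Sphere of radius 25 at the origin; one point bulges out to r = 27.
pts = [(25, 0, 0), (0, 25, 0), (27, 0, 0)]
native, osteo = segment_osteophyte(pts, (0, 0, 0), 25.0, tol=0.5)
# native == [(25, 0, 0), (0, 25, 0)], osteo == [(27, 0, 0)]
```

The threshold crossing between the two point sets would then seed the suggested boundary that an operator adjusts with the interactive virtual tools.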
- the resurfacing navigator 63 may also include a resurfacing path for the robot arm 20 based on a model of the osteophyte, and an identification of the tool that may be used, such as the burr 26 A shown in FIG. 1 .
- the resurfacing path may take the surrounding soft tissue into account so as to be minimally invasive, such as by defining a safety zone to avoid specific soft tissues.
- the resultant navigation file C defines the maneuvers to be performed by the robot arm 20 as directed by the controller 50 of the system 10 .
- the resultant navigation file C may include a patient-specific numerical control data, such as anatomical information specific to the patient to aid in navigating the robot arm 20 .
- the maneuvers may be performed by the robot arm 20 without surgeon intervention.
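The contents of navigation file C described above might be organized along the following lines; the field names and layout are illustrative assumptions, not the actual file format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class NavigationFile:
    """Illustrative layout for navigation file C: the bone model with
    its 3D boundary surface, plus patient-specific numerical control
    data (here, resurfacing-path waypoints) for driving the robot arm."""
    bone_model: List[Point]          # sampled native bone surface
    boundary_surface: List[Point]    # native-bone/osteophyte boundary
    resurfacing_path: List[Point]    # ordered tool waypoints
    tool_id: str = "burr"            # tool identification, e.g. burr 26A
    safety_zones: List[Tuple[Point, float]] = field(default_factory=list)

nav = NavigationFile(
    bone_model=[(0.0, 0.0, 0.0)],
    boundary_surface=[(1.0, 0.0, 0.0)],
    resurfacing_path=[(1.0, 0.0, 0.5), (1.0, 0.0, 0.0)],
)
```

Bundling the model, boundary, tool, and path in one structure mirrors the description above of the file defining the maneuvers the controller 50 directs the robot arm 20 to perform.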
- FIG. 5 illustrates a flow chart showing a robotized surgery system technique 80 for FAI resurfacing, in accordance with some embodiments.
- technique 80 is performed autonomously by a robotized system for femoroacetabular impingement resurfacing.
- the robotized system may include one or more of the components of the robotized surgery system 10 described above, such as robotized surgery controller 50 , robotic arm 20 , tracking system 70 , or other component.
- the robotized surgery controller 50 may include an FAI resurfacing controller 60 , which may include a bone model generator 61 , an osteophyte identifier 62 , and a resurfacing navigator 63 .
- Technique 80 may include receiving 81 a bone imaging data set at a bone model generator 61 .
- the bone imaging data set may include an x-ray image, a computed tomography (CT) scan image, MRI imaging, or any other radiography imagery that can provide sufficient detail to allow identification of osteophytes.
- the bone model generator 61 may generate 82 a resurfacing model.
- the resurfacing model may include at least one osteophyte and a native bone surface surrounding the at least one osteophyte. As discussed above, generating 82 the resurfacing model can optionally include manual intervention through a graphical user interface provided to a surgeon or technician.
- the osteophyte identifier 62 may map 83 a virtual 3D boundary surface based on the resurfacing model.
- the virtual 3D boundary surface may identify an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte.
- the resurfacing navigator 63 may generate 84 a navigation file.
- the navigation file may include the resurfacing model, the virtual 3D boundary surface, and a plurality of patient-specific numerical control data.
- the navigation file can include control vectors used by the surgery controller 50 to direct a cutting tool attached to the robotic arm 20 to remove the identified osteophytes.
- the surgery controller 50 may execute the navigation file to direct the robotic arm in automatically removing 85 the at least one osteophyte from the native bone surface based on the navigation file.
- Removing 85 the at least one osteophyte may include the surgery controller 50 receiving 86 tracking data from a tracking system 70 .
- the surgery controller 50 may further direct the cutting tool attached to the robotic arm 20 based on tracking data received from the tracking system 70 .
- the robotic arm may remove 85 the at least one osteophyte without surgeon intervention.
- the surgery controller 50 may generate or update a 3D model based on tracking data received 86 from the tracking system 70 to verify osteophyte removal.
- the surgery controller 50 may update the 3D model to confirm the current state of the resurfaced bone provides impingement-free range of motion.
- a surgeon may manipulate a joint intraoperatively and provide osteophyte removal confirmation or other feedback to the surgery controller 50 .
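The steps of technique 80 can be sketched end-to-end as a toy pipeline; every helper below is an illustrative stand-in for the corresponding step (82 through 86), not an actual controller API:

```python
def generate_resurfacing_model(imaging):
    # Step 82 stand-in: treat imaging as already-segmented surface points,
    # flagging points above an arbitrary height as osteophyte.
    return {"surface": imaging,
            "osteophyte": [p for p in imaging if p[2] > 1.0]}

def map_boundary_surface(model):
    # Step 83 stand-in: the boundary is the osteophyte footprint here.
    return list(model["osteophyte"])

def generate_navigation_file(model, boundary):
    # Step 84 stand-in: visit each boundary point as a milling waypoint.
    return {"model": model, "path": boundary}

def remove_osteophytes(nav_file):
    # Steps 85-86 stand-in: simulate removal by consuming the planned
    # path, as tracking feedback would confirm each waypoint is reached.
    removed = []
    while nav_file["path"]:
        removed.append(nav_file["path"].pop(0))
    return removed

imaging = [(0.0, 0.0, 0.0), (1.0, 0.0, 2.0)]
model = generate_resurfacing_model(imaging)        # step 82
boundary = map_boundary_surface(model)             # step 83
nav = generate_navigation_file(model, boundary)    # step 84
removed = remove_osteophytes(nav)                  # steps 85-86
# removed == [(1.0, 0.0, 2.0)]: the single protruding point is handled
```

In the actual system, each stand-in corresponds to a module of the FAI resurfacing controller 60 (bone model generator 61, osteophyte identifier 62, resurfacing navigator 63) and to the surgery controller 50 executing the navigation file.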
- FIG. 6 illustrates generally an example of a block diagram of a machine 100 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
- the machine 100 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 100 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
- the machine 100 may be a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
- machine shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations.
- Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or like mechanisms.
- Such mechanisms are tangible entities (e.g., hardware) capable of performing specified operations when operating.
- the hardware may be specifically configured to carry out a specific operation (e.g., hardwired).
- the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation.
- the configuring may occur under the direction of the executions units or a loading mechanism.
- the execution units are communicatively coupled to the computer readable medium when the device is operating.
- the execution units may be configured by a first set of instructions to implement a first set of features at one point in time and reconfigured by a second set of instructions to implement a second set of features.
- Machine 100 may include a hardware processor 102 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 104 and a static memory 106 , some or all of which may communicate with each other via an interlink (e.g., bus) 108 .
- the machine 100 may further include a display unit 110 , an alphanumeric input device 112 (e.g., a keyboard), and a user interface (UI) navigation device 114 (e.g., a mouse).
- the display unit 110 , alphanumeric input device 112 and UI navigation device 114 may be a touch screen display.
- the display unit 110 may include goggles, glasses, an augmented reality (AR) display, a virtual reality (VR) display, or another display component.
- the display unit may be worn on a head of a user and may provide a heads-up-display to the user.
- the alphanumeric input device 112 may include a virtual keyboard (e.g., a keyboard displayed virtually in a VR or AR setting).
- the machine 100 may additionally include a storage device (e.g., drive unit) 116 , a signal generation device 118 (e.g., a speaker), a network interface device 120 , and one or more sensors 121 , such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
- the machine 100 may include an output controller 128 , such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices.
- While the machine readable medium 122 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 124 .
- machine readable medium may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 100 and that cause the machine 100 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
- Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
- machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- the instructions 124 may further be transmitted or received over a communications network 126 using a transmission medium via the network interface device 120 utilizing any one of a number of transfer protocols (e.g., frame relay, Internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
- Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the personal area network family of standards known as Bluetooth® that are promulgated by the Bluetooth Special Interest Group), peer-to-peer (P2P) networks, among others.
- the network interface device 120 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 126 .
- Example 1 is a femoroacetabular impingement resurfacing system comprising: a bone model generator to receive a bone imaging data set and generate a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; an osteophyte identifier to map a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; a resurfacing navigator to generate a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
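The three-stage pipeline of Example 1 (bone model generator, osteophyte identifier, resurfacing navigator) can be sketched in code. This sketch is not part of the patent text; every name in it (ResurfacingModel, map_boundary_surface, the flag-based segmentation of imaging vertices) is a hypothetical stand-in for the real segmentation and meshing steps, and bone surfaces are reduced to plain point lists for illustration:

```python
from dataclasses import dataclass

@dataclass
class ResurfacingModel:
    native_surface: list       # (x, y, z) vertices of the native bone surface
    osteophytes: list          # one vertex list per identified osteophyte

@dataclass
class NavigationFile:
    model: ResurfacingModel
    boundary_surface: list     # vertices of the virtual 3D boundary surface

def generate_resurfacing_model(imaging_data):
    """Bone model generator: split imaging vertices into native bone and
    osteophyte regions.  A vertex is treated as osteophyte if it is flagged
    in the imaging data (a placeholder for real segmentation)."""
    native = [p for p, is_osteo in imaging_data if not is_osteo]
    osteo = [p for p, is_osteo in imaging_data if is_osteo]
    return ResurfacingModel(native_surface=native,
                            osteophytes=[osteo] if osteo else [])

def map_boundary_surface(model):
    """Osteophyte identifier: place the virtual boundary at the midpoint
    between each osteophyte vertex and its nearest native-surface vertex."""
    boundary = []
    for osteo in model.osteophytes:
        for p in osteo:
            nearest = min(model.native_surface,
                          key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))
            boundary.append(tuple((a + b) / 2 for a, b in zip(p, nearest)))
    return boundary

def generate_navigation_file(model, boundary):
    """Resurfacing navigator: bundle the model and boundary surface for
    consumption by the osteophyte removal device."""
    return NavigationFile(model=model, boundary_surface=boundary)
```

In a real system the boundary would be a continuous surface interpolated between segmented regions; the midpoint rule here only illustrates that the boundary sits between native bone and osteophyte.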
- Example 2 the subject matter of Example 1 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and wherein the robotic controller executes the navigation file to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- Example 3 the subject matter of Example 2 optionally includes wherein the navigation file includes a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; and wherein the robotic controller uses the navigation file to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
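The safety-zone enforcement of Example 3 amounts to a geometric containment test before every commanded move. The sketch below is illustrative only and not from the patent: it models each zone as an axis-aligned box, whereas a real controller would test against segmented soft-tissue meshes; the function names are hypothetical.

```python
def in_safety_zone(point, zones):
    """Return True if `point` (x, y, z) falls inside any safety zone.
    Each zone is an axis-aligned box: ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    return any(all(lo <= c <= hi for c, lo, hi in zip(point, zmin, zmax))
               for zmin, zmax in zones)

def gate_motion(target, zones):
    """Controller-side gate: refuse to command a move into a safety zone."""
    if in_safety_zone(target, zones):
        raise ValueError("target pose violates a safety zone")
    return target
```

The gate would run on every interpolated waypoint of the tool path, not just the endpoint, so that the burr cannot sweep through protected tissue between two legal poses.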
- Example 4 the subject matter of any one or more of Examples 1-3 optionally include a tracking apparatus to generate tracking data, wherein the robotic controller processes the tracking data to determine a location of at least one of the osteophyte removal device and the native bone surface.
- Example 5 the subject matter of Example 4 optionally includes wherein the tracking apparatus includes an image capture device to generate image data; and wherein the robotic controller processes the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- Example 6 the subject matter of any one or more of Examples 4-5 optionally include wherein the tracking apparatus includes a robotic tracking arm to position the image capture device; and wherein the robotic controller executes the navigation file to control the robotic tracking arm to improve an image quality of the generated image data.
- Example 7 the subject matter of any one or more of Examples 1-6 optionally include a bone atlas database, wherein the osteophyte identifier is further operable to: compare the bone imaging data set against the bone atlas database to find a closest bone atlas entry; and use the closest bone atlas entry as a model for the native bone surface and to identify the at least one osteophyte.
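The atlas comparison of Example 7 can be illustrated with a nearest-entry search over point clouds. This is a hedged sketch, not the patent's method: real atlas matching would use registration (e.g., iterative closest point) over meshes, while here surfaces are plain point lists and the distance metric, function names, and atlas keys are all hypothetical.

```python
import math

def surface_distance(a, b):
    """Mean nearest-point distance from point cloud a to point cloud b."""
    def nearest(p, cloud):
        return min(math.dist(p, q) for q in cloud)
    return sum(nearest(p, b) for p in a) / len(a)

def closest_atlas_entry(imaging_points, atlas):
    """Return the (name, surface) atlas entry with the smallest mean
    distance to the imaged bone.  Imaged points that sit far from this
    entry can then be flagged as candidate osteophytes."""
    return min(atlas.items(),
               key=lambda kv: surface_distance(imaging_points, kv[1]))
```

Once the closest entry is chosen, the residual distance of each imaged vertex to that entry gives a per-vertex deviation map, and thresholding that map is one plausible way to realize the geometric-feature identification of Example 8.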
- Example 8 the subject matter of Example 7 optionally includes wherein the osteophyte identifier identifying the at least one osteophyte includes identifying geometric features based on the bone imaging data set.
- Example 9 the subject matter of any one or more of Examples 1-8 optionally include wherein the osteophyte identifier is further to: generate a 3D bone model based on the bone imaging data set; and determine an impingement-free range of motion based on the 3D bone model.
- Example 10 the subject matter of Example 9 optionally includes wherein generating the 3D bone model includes: performing a 3D reconstruction along a femoral neck; and identifying a femoral head spherical center.
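Identifying a femoral head spherical center is commonly done by least-squares sphere fitting to surface points of the head. The sketch below is illustrative and not taken from the patent: it linearizes the sphere equation x² + y² + z² = 2ax + 2by + 2cz + d (linear in a, b, c, d) and solves the 4×4 normal equations by Gauss-Jordan elimination.

```python
import math

def fit_sphere(points):
    """Least-squares sphere fit.  Returns (center, radius) for a list of
    (x, y, z) surface points (at least four, non-coplanar)."""
    n = 4
    ata = [[0.0] * n for _ in range(n)]
    atr = [0.0] * n
    # Accumulate normal equations for rows [2x, 2y, 2z, 1], target x^2+y^2+z^2.
    for x, y, z in points:
        row = [2 * x, 2 * y, 2 * z, 1.0]
        t = x * x + y * y + z * z
        for i in range(n):
            atr[i] += row[i] * t
            for j in range(n):
                ata[i][j] += row[i] * row[j]
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atr[col], atr[piv] = atr[piv], atr[col]
        for r in range(n):
            if r != col:
                f = ata[r][col] / ata[col][col]
                for c in range(col, n):
                    ata[r][c] -= f * ata[col][c]
                atr[r] -= f * atr[col]
    a, b, c, d = (atr[i] / ata[i][i] for i in range(n))
    radius = math.sqrt(d + a * a + b * b + c * c)
    return (a, b, c), radius
```

With the fitted center and radius in hand, head asphericity (a cam deformity) can be quantified as the deviation of head-neck junction points from the fitted sphere, which is one plausible input to the impingement-free range-of-motion analysis of Example 9.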
- Example 11 the subject matter of Example 10 optionally includes D bone model.
- Example 12 is a femoroacetabular impingement resurfacing method comprising: performing the following operations on a computing device including a processor and memory, the operations including: receiving a bone imaging data set; generating a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; mapping a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; generating a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and outputting the navigation file for use by an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
- Example 13 the subject matter of Example 12 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and the method further comprises executing the navigation file on the robotic controller to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- Example 14 the subject matter of Example 13 optionally includes the operations further including generating a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; wherein the navigation file includes instructions to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
- Example 15 the subject matter of any one or more of Examples 12-14 optionally include the operations further including determining a location of at least one of the osteophyte removal device and the native bone surface using a tracking system.
- Example 16 the subject matter of Example 15 optionally includes wherein receiving tracking data includes receiving image data from an image capture device; and wherein the operations include processing the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- Example 18 the subject matter of any one or more of Examples 12-17 optionally include the operations further including: comparing the bone imaging data set against a bone atlas database to find a closest bone atlas entry; and using the closest bone atlas entry as a model for the native bone surface and to identify the at least one osteophyte.
- Example 19 the subject matter of Example 18 optionally includes wherein identifying the at least one osteophyte includes identifying geometric features based on the bone imaging data set.
- Example 20 the subject matter of any one or more of Examples 12-19 optionally include wherein mapping the virtual 3D boundary surface further includes: generating a 3D bone model based on the bone imaging data set; and determining an impingement-free range of motion based on the 3D bone model.
- Example 21 the subject matter of Example 20 optionally includes wherein generating the 3D bone model includes: performing a 3D reconstruction along a femoral neck; and identifying a femoral head spherical center.
- Example 22 the subject matter of Example 21 optionally includes D bone model.
- Example 23 is at least one machine-readable storage medium, comprising a plurality of instructions that, responsive to being executed with processor circuitry of a computer-controlled femoroacetabular impingement resurfacing device, cause the device to: receive a bone imaging data set; generate a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; map a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; generate a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and output the navigation file for use by an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
- Example 24 the subject matter of Example 23 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and the instructions further causing the device to execute the navigation file on the robotic controller to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- Example 25 the subject matter of Example 24 optionally includes the instructions further causing the device to generate a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; wherein the navigation file includes instructions to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
- Example 26 the subject matter of any one or more of Examples 23-25 optionally include the instructions further causing the device to determine a location of at least one of the osteophyte removal device and the native bone surface using a tracking system.
- Example 27 the subject matter of Example 26 optionally includes wherein determining the location using a tracking system includes receiving image data from an image capture device within the tracking system; and wherein the determining the location includes processing the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- Example 28 the subject matter of Example 27 optionally includes wherein the tracking system includes a robotic tracking arm to position the image capture device; and the method further comprises executing the navigation file to control the robotic tracking arm to improve an image quality of the received image data.
- Example 134 is at least one non-transitory machine-readable medium including instructions for operation of a robotic arm, which when executed by at least one processor, cause the at least one processor to perform operations of any of the methods of Examples 1-28.
- Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples.
- An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products.
- the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times.
- tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Robotics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Surgical Instruments (AREA)
- Manipulator (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Prostheses (AREA)
Abstract
Systems and methods are described herein for resurfacing bones, and in particular, for detecting and resurfacing one or more femoroacetabular impingements (FAIs). A FAI resurfacing controller may be used to perform this detecting and resurfacing of FAIs. The FAI resurfacing controller may include a bone model generator to receive bone imaging and to generate a model of at least one osteophyte and of a surface of a native bone surrounding the at least one osteophyte. The FAI resurfacing controller may include an osteophyte identifier to set a virtual 3D boundary surface between native bone surface and the at least one osteophyte. The FAI resurfacing controller may include a resurfacing navigator to generate and output a navigation file. The navigation file may include the model with the 3D boundary surface between native bone surface and the at least one osteophyte.
Description
- This application is a divisional of U.S. patent application Ser. No. 15/625,555, filed on Jun. 16, 2017, which claims the benefit of U.S. Provisional Patent Application Ser. No. 62/350,891, filed on Jun. 16, 2016, the benefit of priority of each of which is claimed hereby, and each of which is incorporated by reference herein in its entirety.
- The present application relates to computer-assisted orthopedic surgery involving robotized apparatuses.
- Computer-assisted surgery has been developed to help an operator in altering bones, and in positioning and orienting implants to a desired orientation. Among the various tracking technologies used in computer-assisted surgery, optical navigation, C-arm validation, and manual reference guides have been used. The optical navigation typically requires the use of a navigation system, which adds operative time. Moreover, the optical navigation is bound to line-of-sight constraints that hamper the normal surgical flow. C-arm validation requires the use of bulky equipment, and the C-arm validation is not cost-effective.
- Such tracking technologies often assist manual work performed by an operator or surgeon. While surgeons may have developed an expertise in manipulations performed during surgery, some practitioners prefer the precision and accuracy of robotized surgery.
- FIG. 1 is a schematic view of a robotized surgery system, in accordance with some embodiments.
- FIG. 2 is an exemplary perspective view of a foot support of the robotized surgery system of FIG. 1, in accordance with some embodiments.
- FIG. 3A is a perspective schematic view of femoroacetabular impingement (FAI) conditions on the pelvis, in accordance with some embodiments.
- FIG. 3B is a perspective schematic view of FAI conditions on the femoral head and neck, in accordance with some embodiments.
- FIG. 4 is a block diagram of a FAI resurfacing controller used with the robotized surgery system of FIG. 1, in accordance with some embodiments.
- FIG. 5 illustrates a flow chart showing a robotized surgery system technique for FAI resurfacing, in accordance with some embodiments.
- FIG. 6 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform, in accordance with some embodiments.
- The present disclosure describes a robotic system for resurfacing bones, and in particular, for detecting and resurfacing one or more femoroacetabular impingements (FAIs). A FAI resurfacing controller may be used to perform this detecting and resurfacing of FAIs. The FAI resurfacing controller may include a bone model generator to receive bone imaging and to generate a model of at least one osteophyte and of a surface of a native bone surrounding the at least one osteophyte. The FAI resurfacing controller may include an osteophyte identifier to set a virtual 3D boundary surface between the native bone surface and the at least one osteophyte. The FAI resurfacing controller may include a resurfacing navigator to generate and output a navigation file. The navigation file may include the model with the 3D boundary surface between the native bone surface and the at least one osteophyte. The navigation file may also include patient-specific numerical control data for resurfacing the bone to remove the at least one osteophyte.
- In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
-
FIG. 1 is a schematic view of arobotized surgery system 10, in accordance with some embodiments.Robotized surgery system 10 may be used to perform orthopedic surgery maneuvers on a patient, such as FAI resurfacing, arthroscopy, or other surgical procedures. Therobotized surgery system 10 is shown relative to a patient's leg in a supine decubitus, though the patient may be in lateral decubitus (e.g., to expose the hip joint) or in another position. - The
robotized surgery system 10 may include arobot arm 20, afoot support 30, athigh support 40, arobotized surgery controller 50, aFAI resurfacing controller 60, and asupplementary tracking apparatus 70. Therobot arm 20 is the working end of thesystem 10, and is used to perform bone alterations as planned by an operator and as controlled by therobotized surgery controller 50. Therobot arm 20 is positioned to access the hip joint of the patient for performing FAI resurfacing. Thefoot support 30 supports the foot and lower leg of the patient, in such a way that it is only selectively movable for adjustment to the patient's position and morphology. Thethigh support 40 supports the thigh and upper leg of the patient, again in such a way that it is only optionally movable. Thethigh support 40 may assist in keeping the hip joint fixed during FAI resurfacing, and should hence be positioned so as not to impede the movements of therobot arm 20. Therobotized surgery controller 50 controls therobot arm 20. TheFAI resurfacing controller 60 output data used to drive therobot arm 20 in performing the FAI resurfacing. Thetracking apparatus 70 may optionally be used to track therobot arm 20 and the patient limbs. - The
robot arm 20 may stand from abase 21, for instance in a fixed relation relative to the operating room (OR) table supporting the patient. Indeed, the relative positioning of therobot arm 20 relative to the patient is a determinative factor in the precision of the surgical procedure, whereby the foot support 30 andthigh support 40 may assist in keeping the operated limb fixed in the illustrated {X Y Z} coordinate system. Therobot arm 20 has a plurality ofjoints 22 andlinks 23, of any appropriate form, to support atool head 24 that interfaces with the patient. Thearm 20 is shown being a serial mechanism, arranged for thetool head 24 to be displaceable in sufficient degrees of freedom (DOF). For example, therobot arm 20 controls 6-DOF movements to thetool head 24, {X Y Z} in the coordinate system, and pitch, roll, and yaw. Fewer or additional DOFs may be present. For simplicity, only a generic illustration of thejoints 22 andlinks 23 is provided, but more joints of different types may be present to move thetool head 24 in the manner described above. Thejoints 22 are powered for therobot arm 20 to move as controlled by thecontroller 50 in the six DOFs. Therefore, the powering of thejoints 22 is such that the tool head 24 of therobot arm 20 may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities. Various types ofrobot arms 20 may be used, such as those described in U.S. patent application Ser. No. 11/610,728, incorporated herein by reference. - The
tool head 24 may also comprise a chuck or like tool interface, typically actuatable in rotation. InFIG. 1 , thetool head 24 supports aburr 26A, such as may be used during FAI resurfacing. Thetool head 24 may support other surgical tools, such as a registration pointer, a reamer, a cannula, a reciprocating saw, or another surgical tool. The various tools may be interchanged, whether with human assistance, or as an automated process. The installation of a tool in thetool head 24 may then require some calibration to position the installed tool in the {X Y Z} coordinate system of therobot arm 20. Various surgical procedures may be performed whentool head 24 is used with a cannula. In an example, therobot arm 20 may be used to perform a robotically assisted arthroscopy procedure. A shaver, burr, suture applicator, or another surgical instrument may be operated through the cannula. Thecontroller 50 may target the cannula using sensors within therobot arm 20, such as sensors used to detect position or rotation of various components of therobot arm 20. Thecontroller 50 may target the cannula using one or more cameras mounted on thetracking apparatus 70. -
Tracking apparatus 70 may include various types of tracking systems depending on the particular surgical application. For example, thetracking apparatus 70 may be used to track therobot arm 20 or other components of thesurgery system 10 using one or more image capture devices (e.g., cameras). Thetracking apparatus 70 may include an arthroscopic camera for viewing the surgical site in a minimally invasive manner. Thetracking apparatus 70 may also be used to provide video recognition and tracking-guided surgery procedures. For example, thecontroller 50 may use thetracking apparatus 70 to detect an unhealthy acetabular labrum, place the cannula in a predetermined location, and stitch the acetabular labrum. Thetracking apparatus 70 can also include an associated controller or computing system to processing data received from various sensors (e.g., cameras, etc.) to provide guidance to other components of the system. For example, in an optical tracking scenario, thetracking apparatus 70 can include at least two cameras coupled to a computing system that utilizes images captured from the cameras to triangulate positions of tracked objects. In another example, thetracking apparatus 70 can include an arthroscopic camera and a computing device that receives image data and performs image processing operations to segment information out of the image data related to osteophyte removal. - The
tracking apparatus 70 may also be used to verify removal of one or more osteophytes. A primary source of error during surgical procedures involving removal of osteophytes is the failure to completely remove the osteophyte, which may be due to the difficulty in determining how much of an osteophyte has been removed. Thecontroller 50 may use thetracking apparatus 70 androbot arm 20 to determine how much of the osteophyte has been removed and whether the osteophyte removal process is complete. In an example, a feedback mechanism may be used to indicate when the osteophyte has been removed. The feedback may include a green light, an audible feedback, a tactile feedback, or other feedback. The determination of whether the osteophyte has been removed may be based on an image from thetracking apparatus 70, based on comparing theburr 26A location against a 3D model, based on manual leg manipulation and tracking range of movement through thetracking apparatus 70, or based on another osteophyte removal confirmation. - The
controller 50 may drive therobot arm 20 in performing the surgical procedure based on the planning achieved pre-operatively. Therobotized surgery controller 50 runs various modules, in the form of algorithms, code, non-transient executable instructions, etc., to operate thesystem 10 in the manner described herein. For example, thecontroller 50 may include a robot driver module, where the robot driver module is tasked with powering or controlling the various joints of therobot arm 20. Force feedback may be provided by therobot arm 20 to avoid damaging the soft tissue or surrounding environment. Therobotized surgery controller 50 may have a processor unit to control movement of therobot arm 20.System 10 may include aninterface 90 to provide information to the operator. Theinterface 90 may include a display, a wireless portable electronic device (e.g., phone, tablet), a speaker for audio guidance, an LED display, or other type of interface. - The
controller 50 may be used to drive therobot arm 20 to avoid various predetermined soft tissues. In an embodiment, thecontroller 50 may use thetracking apparatus 70 to detect a particular soft tissue and drive therobot arm 20 to avoid that soft tissue. In an embodiment, thecontroller 50 may identify a safety zone, and may guide therobot arm 20 to enforce a safety zone by avoiding performing surgical procedures within the safety zone during a surgical procedure. In various examples, the safety zone may include surrounding soft tissue, a native bone surface of a patient, or a critical blood vessel (e.g., femoral artery, neck artery). - To preserve the fixed relation between the leg and the coordinate system, a generic embodiment of a
foot support 30 is shown inFIG. 1 , while one possible implementation of thefoot support 30 is shown in greater detail inFIG. 2 . Thefoot support 30 may be displaceable relative to the OR table, to adjust to the patient, with the joints then lockable once a suitable position is reached. The mechanism of thefoot support 30 may have aslider 31, moving along the OR table in the X-axis direction.Joints 32 andlinks 33 may also be part of the mechanism of thefoot support 30, to support afoot interface 34 receiving the patient's foot. - The
thigh support 40 may be displaceable relative to the OR table, to be better positioned as a function of the patient's location on the table, so as not to impede action of the robot arm 20. Accordingly, the thigh support 40 is shown as including a passive mechanism, with various lockable joints to lock the thigh support 40 in a desired position and orientation. The mechanism of the thigh support 40 may have a slider 41, moving along the OR table in the X-axis direction. Joints 42 and links 43 may also be part of the mechanism of the thigh support 40, to support a thigh bracket 44. A strap 45 can immobilize the thigh/femur in the thigh support 40. The thigh support 40 may not be necessary in some instances. - The
foot support 30 or the thigh support 40 may assist in keeping the bones fixed relative to the {X Y Z} coordinate system. For instance, the fixed relation may be required in instances in which no additional tracking is present to assist the actions of the robot arm 20. However, the tracking apparatus 70 may provide intraoperative tracking information for the robot arm 20 and for the patient bones, in such a way that some movement of the patient is permissible intraoperatively as the movement is calculable and thus known in the {X Y Z} coordinate system. - The operation of the
tracking apparatus 70 may depend on the information within the navigation file C. For example, the tracking apparatus 70 may assist in performing the calibration of the patient bone with respect to the robot arm 20, for subsequent navigation in the {X Y Z} coordinate system. The tracking apparatus 70 may include two cameras to provide stereoscopic (e.g., 3D) image data to optically identify and locate retro-reflective references 71A, 71B, and 71C to triangulate positions of objects associated with the references. In an embodiment, the reference 71A is on the tool head 24 of the robot arm 20 such that its tracking allows the controller 50 to calculate the position and/or orientation of the tool head 24 and tool 26A thereon. In an example, reference 71A may be etched on a stable portion of a burr tool 26A. The controller 50 may use information about the position of the tool head 24 and the camera on the tracking apparatus 70 to adjust the camera to optimize the collected images, such as adjusting camera position, camera angle, camera distance from the tool head 24, focal length, or other dynamic camera adjustments. The tracking apparatus 70 may also include a robot tracking arm, and the controller 50 may control the robot tracking arm to adjust the camera position or perform other dynamic camera adjustments. The robot tracking arm may be controlled independently or in conjunction with controlling the robot arm 20. - In addition to
reference 71A on the tool head 24, references 71B and 71C may be fixed to the patient bones, such as the tibia for reference 71B and the femur for reference 71C. In an example, references 71B and 71C are applied to the patient bones using a brief procedure to provide rapid reference tracking. In another example, references 71B and 71C may include application of a virtual marker (e.g., “painted on”) to an image of the bone, such as using interface 90. In FAI resurfacing, it may only be necessary to have the reference 71C, although it may be desired to have another reference on the pelvis as well, depending on the location of the osteophytes. As shown, the references 71 attached to the patient need not be invasively anchored to the bone, as straps or like attachment means may provide sufficient grasping to prevent movement between the references 71 and the bones, in spite of being attached to soft tissue. For example, the references 71 may include a fabric removably and non-invasively attachable to a bone, where references 71 each include a plurality of reference markers distributed on the surface of the fabric. The controller 50 continuously updates the position or orientation of the robot arm 20 and patient bones in the {X Y Z} coordinate system using the data from the tracking apparatus 70. Tracking system 70 may include one or more of optical tracking sensors, inertial tracking sensors, or other motion or location sensors. For example, tracking system 70 may include inertial sensors (e.g., accelerometers, gyroscopes, etc.) that produce tracking data to be used by the controller 50 to update the position or orientation of the robot arm 20 continuously. Other types of tracking technology may also be used. - The calibration may be achieved in the manner described above, with the
robot arm 20 using a registration pointer on the robot arm 20, and with the assistance of the tracking apparatus 70 if present in the robotized surgery system 10. Another calibration approach is to perform radiography of the bones with the references 71 thereon, at the start of the surgical procedure. For example, a C-arm may be used for providing suitable radiographic images. The images are then used for the surface matching with the bone model B of the patient. Because of the presence of the references 71 as fixed to the bones, the intraoperative registration may then not be necessary, as the tracking apparatus 70 tracks the position or orientation of the bones in the {X Y Z} coordinate system after the surface matching between X-ray and bone model is completed. -
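The surface matching described above amounts to a rigid registration of probed bone landmarks to the bone model B, setting the model in the {X Y Z} coordinate system. The following is a minimal sketch, assuming point correspondences between probed and model points are already known; a practical system would establish them iteratively (e.g., with ICP). The function and variable names are illustrative, not from the patent.

```python
import numpy as np

def register_rigid(probed, model):
    """Kabsch-style fit: return rotation R and translation t such that
    probed ≈ model @ R.T + t, i.e., the transform placing the bone model
    in the intraoperative coordinate system."""
    P, M = np.asarray(probed, float), np.asarray(model, float)
    pc, mc = P.mean(axis=0), M.mean(axis=0)
    H = (M - mc).T @ (P - pc)               # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pc - R @ mc
    return R, t
```

With exact correspondences the recovered transform reproduces the probed cloud; with noisy probe points it is the least-squares rigid fit.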
FIG. 2 is an exemplary perspective view of a robotized surgery system foot support 30, in accordance with some embodiments. The foot interface 34 may have an L-shaped body ergonomically shaped to receive the patient's foot. To fix the foot in the foot support 30, different mechanisms may be used, one of which features an ankle clamp 35. The ankle clamp 35 surrounds the rear of the foot interface 34, and laterally supports a pair of malleolus pads 36. The malleolus pads 36 are positioned to be opposite the respective malleoli of the patient, and are displaceable via joints 37, to be brought together and hence clamp onto the malleoli. A strap 38 may also be present to secure the leg in the foot support 30 further, for example by attaching to the patient's shin. As an alternative to the arrangement of FIG. 2, a cast-like boot may be used, or a plurality of straps 38, provided the foot is fixed in the foot support 30. -
FIG. 3A is a perspective schematic view of FAI conditions on the pelvis D, in accordance with some embodiments. FIG. 3B is a perspective schematic view of FAI conditions on the femoral head F1 and neck F2, in accordance with some embodiments. The system 10 is used to resurface the femoral head F1 or neck F2, or to resurface the periphery of the acetabulum A1 in a FAI condition. The FAI condition may be caused by one or more osteophytes on the rim of the acetabulum A1 or femoral head F1 or neck F2. In FIG. 3A, osteophyte O1 is shown built up on the periphery of the acetabulum A1, part of the pelvis D. This acetabular bone growth may be known as a pincer deformity, and may cause a pincer impingement. In FIG. 3B, osteophyte O2 is shown built up at the junction of the femoral head F1 and femoral neck F2, part of the femur. This femoral bone growth may be known as a cam deformity, and may cause a femoral cam impingement. One or both of the pincer deformity and cam deformity may occur, and both may be corrected using the FAI resurfacing techniques described herein. FIGS. 3A and 3B are examples of possible osteophyte locations, but other osteophytes can build up at other locations. -
FIG. 4 is a block diagram of a FAI resurfacing controller used with the robotized surgery system of FIG. 1, in accordance with some embodiments. To drive the robot arm 20 in resurfacing the hip joint, in either or both conditions of FIGS. 3A and 3B, a navigation file C may be created. Referring to FIG. 4, a FAI resurfacing controller is generally shown at 60. The controller 60 may be part of the system 10, for example as part of a set of modules that are in the robotized surgery controller 50. The FAI resurfacing controller 60 may also be a stand-alone processing unit, used in pre-operative planning to prepare a navigation file C. - The
FAI resurfacing controller 60 may receive bone imagery B1. The bone imagery B1 may include a computed tomography (CT) scan image, magnetic resonance imaging (MRI) image, or any other radiography imagery. A bone model generator module 61 receives the bone imagery B1 to generate a bone model therefrom. The model may be a 3D representation of at least a portion of the surface having osteophytes thereon. For example, the 3D representation may be that of a portion of the acetabulum A1 or of a portion of the femoral head F1 and neck F2. The 3D representation may include a portion of the bone surface surrounding the osteophytes, and the osteophytes. - An
osteophyte identifier module 62 receives the bone model, and segments the osteophyte from the native bone surface. Various approaches may be used for the segmentation by the osteophyte identifier module 62. According to an embodiment, the osteophyte identifier module 62 consults a bone atlas database B2. The bone atlas database B2 includes a compilation of different femur or pelvis geometries, for instance also as 3D bone models. The osteophyte identifier module 62 compares the bone model, particularly the native bone surface surrounding the osteophyte, with bone geometries of the database B2 to find closest matches. Once a closest match is identified, the bone models may be overlaid to define a surface of the patient bone covered by osteophytes. - Various geometric features may be used by the
osteophyte identifier module 62 to identify an osteophyte. In an example, the received bone model may be used to identify geometric features of the bone, and the osteophytes may be identified by identifying differences between bone model geometric features and a bone atlas database match. In another example, one or more femoral or acetabular geometric measurements may be used to identify geometric features of the bone. The geometric measurements may include an alpha angle, a lateral center edge angle, a femoral head coverage, a sourcil angle, an acetabular angle, or other femoral or acetabular geometric measurements. In an example, the alpha angle may be used to characterize the concavity of the anterior femoral head-neck junction, or how big the bump is on the femoral neck. The alpha angle is defined as the acute angle between the femoral neck axis and a line between the femoral head center and the point where the head-neck junction cortical surface first meets a circle superimposed upon an ideally spherical femoral head. The alpha angle may be particularly useful in detecting an osteophyte that causes or contributes to a femoral cam impingement. In another example, the lateral center edge angle may be used to characterize the angular coverage of the femoral head by the weight-bearing zone of the acetabulum. The lateral center edge angle is defined as the angle formed by the intersection of a vertical line extending through the femoral head center and a line extending through the femoral head center to the lateral sourcil. The lateral center edge angle may be particularly useful in detecting an osteophyte that contributes to acetabular dysplasia, acetabular instability, or femoral impingement. In another example, the femoral head coverage may be used to characterize weight-bearing femoral head coverage, where the femoral head coverage is defined as the percentage coverage of the femoral head by the weight-bearing zone of the acetabulum.
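The alpha angle and lateral center edge angle definitions above can be expressed compactly. This is a hedged 2D sketch assuming the landmark coordinates (femoral head center, a point on the neck axis, the first point of asphericity, and the lateral sourcil) have already been extracted from the imagery; the function names and the planar simplification are illustrative assumptions.

```python
import math

def angle_between(u, v):
    """Acute angle in degrees between two 2D vectors."""
    dot = u[0] * v[0] + u[1] * v[1]
    cos = abs(dot) / (math.hypot(*u) * math.hypot(*v))
    return math.degrees(math.acos(min(1.0, cos)))

def alpha_angle(head_center, neck_axis_point, asphericity_point):
    """Angle between the neck axis and the line from the head center
    to the point where the head-neck junction leaves the head sphere."""
    neck = (neck_axis_point[0] - head_center[0], neck_axis_point[1] - head_center[1])
    line = (asphericity_point[0] - head_center[0], asphericity_point[1] - head_center[1])
    return angle_between(neck, line)

def lateral_center_edge_angle(head_center, lateral_sourcil):
    """Angle between a vertical line through the head center and the
    line from the head center to the lateral sourcil."""
    line = (lateral_sourcil[0] - head_center[0], lateral_sourcil[1] - head_center[1])
    return angle_between((0.0, 1.0), line)
```

Such measurements would then be compared against a planned value, as the text below describes for the alpha angle.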
The femoral head coverage may be particularly useful in detecting an osteophyte that contributes to acetabular dysplasia or pincer impingement, and may also be used when resurfacing the acetabular rim due to pincer impingement. In another example, the sourcil angle may be used to characterize the angle-dependent coverage of the femoral head by the acetabulum. The sourcil angle (e.g., Tonnis angle) is defined as the angle formed between a horizontal line and a line extending from the medial edge of the sourcil to the lateral edge of the sourcil. The sourcil angle may be particularly useful in detecting an osteophyte that contributes to acetabular dysplasia, acetabular instability, or femoral impingement. In another example, the acetabular angle may be used to characterize the acetabular inclination or opening. The acetabular angle may include the acetabular angle of Sharp, defined as the angle formed between a horizontal line and a line from the teardrop to the lateral acetabulum. The acetabular angle may include the acetabular roof angle of Tonnis, defined as the angle formed by a horizontal line connecting both triradiate cartilages (e.g., the Hilgenreiner line) and a second line connecting the acetabular roofs. The acetabular angle may be particularly useful in detecting an osteophyte that contributes to pincer impingement. Once the osteophyte identifier module 62 analyzes the bone geometry to identify the osteophyte, the geometric features may be used by the resurfacing navigator 63 to achieve a desired geometric goal or to correspond with a preoperatively planned geometric goal. For example, the osteophyte identifier module 62 may identify the alpha angle, which may be used by the resurfacing navigator 63 to achieve a desired alpha angle or to correspond with a preoperatively planned alpha angle. - In an embodiment, the
osteophyte identifier module 62 may analyze the bone model directly, such as by generating a 3D model based on the bone model and determining an impingement-free range of motion. For example, the osteophyte identifier module 62 may perform 3D reconstruction along the neck of the femur, identify the center of the sphere of the femoral head, and identify the non-spherical portions to determine impingement-free range of motion. Accordingly, the osteophyte identifier module 62 virtually segments the native bone surface from the osteophyte, by defining a 3D boundary surface between the native bone and the osteophyte. - In the model generated by the
osteophyte identifier module 62, the 3D boundary surface is affixed to and surrounded by the 3D bone model of the bone model generator module 61. The osteophyte identifier module 62 may alternatively or supplementally require the assistance of an operator. For instance, the 3D boundary surface based on the bone atlas data B2 may be a starting point for an operator to perform adjustments to the virtual segments or other virtual boundaries. As another example, the osteophyte identifier module 62 may provide the bone model to the bone model generator module 61, along with interactive virtual tools, for an operator to define a 3D boundary surface between the osteophyte and the native bone surface. The interactive virtual tool may include a suggested 3D boundary surface based on extensions of the native bone surrounding the osteophytes. - A resurfacing
navigator 63 uses the bone model with the 3D boundary surface to generate the navigation file C. The navigation file C may include the bone model with the 3D boundary surface, with a high enough surface resolution of native bone surface surrounding the 3D boundary surface for an intraoperative registration to be executed by the robot arm 20, in a calibration. The calibration may include the bone model B of the patient, for surface matching to be performed by a registration pointer of the robot arm 20. The robot arm 20 would obtain a cloud of bone landmarks of the exposed bones, to reproduce a 3D surface of the bone. The 3D surface would then be matched to the bone model B of the patient, to set the 3D model in the {X Y Z} coordinate system. If a bone model is not available, a bone model may be generated intraoperatively. For example, a surgical registration pointer may be used to contact a bone surface to register a point, and multiple points may be synthesized to generate a cloud of small surfaces representing an approximate bone model. - The resurfacing
navigator 63 may also include a resurfacing path for the robot arm 20 based on a model of the osteophyte, and an identification of the tool that may be used, such as the burr 26A shown in FIG. 1. The resurfacing path may consider the surrounding soft tissue to be minimally invasive, such as by defining a safety zone to avoid specific soft tissues. The resultant navigation file C defines the maneuvers to be performed by the robot arm 20 as directed by the controller 50 of the system 10. The resultant navigation file C may include patient-specific numerical control data, such as anatomical information specific to the patient to aid in navigating the robot arm 20. The maneuvers may be performed by the robot arm 20 without surgeon intervention. -
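The contents attributed to the navigation file C above (bone model, 3D boundary surface, resurfacing path, tool identification, safety zones, and patient-specific numerical control data) could be bundled as sketched below, with each commanded path point checked against keep-out zones before execution. Every field name and the spherical safety-zone representation are illustrative assumptions, not a format defined by the patent.

```python
import math
from dataclasses import dataclass, field

@dataclass
class NavigationFile:
    bone_model: list                 # 3D surface points of bone model B
    boundary_surface: list           # boundary between native bone and osteophyte
    resurfacing_path: list           # ordered (x, y, z) tool positions
    tool_id: str                     # e.g., an identifier for burr 26A
    safety_zones: list = field(default_factory=list)       # (center, radius) spheres
    numerical_control: dict = field(default_factory=dict)  # patient-specific data

def path_is_safe(nav: NavigationFile) -> bool:
    """True if no commanded path point enters a keep-out sphere."""
    for p in nav.resurfacing_path:
        for center, radius in nav.safety_zones:
            if math.dist(p, center) < radius:
                return False
    return True
```

A controller consuming such a file would re-plan or halt whenever `path_is_safe` fails, rather than proceed into protected tissue.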
FIG. 5 illustrates a flow chart showing a robotized surgery system technique 80 for FAI resurfacing, in accordance with some embodiments. In an embodiment, technique 80 is performed autonomously by a robotized system for femoroacetabular impingement resurfacing. The robotized system may include one or more of the components of the robotized surgery system 10 described above, such as robotized surgery controller 50, robotic arm 20, tracking system 70, or other component. In particular, the robotized surgery controller 50 may include an FAI resurfacing controller 60, which may include a bone model generator 61, an osteophyte identifier 62, and a resurfacing navigator 63. Technique 80 may include receiving 81 a bone imaging data set at a bone model generator 61. The bone imaging data set may include an x-ray image, a computed tomography (CT) scan image, MRI imaging, or any other radiography imagery that can provide sufficient detail to allow identification of osteophytes. The bone model generator 61 may generate 82 a resurfacing model. The resurfacing model may include at least one osteophyte and a native bone surface surrounding the at least one osteophyte. As discussed above, generating 82 the resurfacing model can optionally include manual intervention through a graphical user interface provided to a surgeon or technician. The osteophyte identifier 62 may map 83 a virtual 3D boundary surface based on the resurfacing model. The virtual 3D boundary surface may identify an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte. The resurfacing navigator 63 may generate 84 a navigation file. The navigation file may include the resurfacing model, the virtual 3D boundary surface, and a plurality of patient-specific numerical control data. The navigation file can include control vectors used by the surgery controller 50 to direct a cutting tool attached to the robotic arm 20 to remove the identified osteophytes.
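The numbered operations of technique 80 can be read as a linear pipeline from imaging data to osteophyte removal. The following schematic sketch passes each stage in as a callable standing in for the bone model generator 61, osteophyte identifier 62, resurfacing navigator 63, and surgery controller 50; the structure and names are illustrative, not an implementation from the patent.

```python
def technique_80(bone_imaging_data, generate_model, map_boundary,
                 build_navigation_file, remove_osteophytes):
    """Chain the stages of technique 80; each argument after the first
    is a callable implementing one numbered step."""
    model = generate_model(bone_imaging_data)            # steps 81-82
    boundary = map_boundary(model)                       # step 83
    nav_file = build_navigation_file(model, boundary)    # step 84
    return remove_osteophytes(nav_file)                  # steps 85-86
```

In a real system the final stage would also consume tracking data, as described next.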
The surgery controller 50 may execute the navigation file to direct the robotic arm in automatically removing 85 the at least one osteophyte from the native bone surface based on the navigation file. Removing 85 the at least one osteophyte may include the surgery controller 50 receiving 86 tracking data from a tracking system 70. The surgery controller 50 may further direct the cutting tool attached to the robotic arm 20 based on tracking data received from the tracking system 70. Regardless of whether a tracking device is used, the robotic arm may remove 85 the at least one osteophyte without surgeon intervention. In an embodiment, the surgery controller 50 may generate or update a 3D model based on tracking data received 86 from the tracking system 70 to verify osteophyte removal. For example, the surgery controller 50 may update the 3D model to confirm the current state of the resurfaced bone provides impingement-free range of motion. In addition to the surgery controller 50 verifying osteophyte removal, a surgeon may manipulate a joint intraoperatively and provide osteophyte removal confirmation or other feedback to the surgery controller 50. -
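One way the verification step above could be realized: fit a sphere to tracked points on the resurfaced femoral head and confirm that no remaining point deviates from sphericity by more than a tolerance, echoing the head-sphere analysis described earlier. This is a hedged sketch using an algebraic least-squares sphere fit; the method and all names are illustrative, not the patent's stated algorithm.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit; returns (center, radius).

    Uses |p|^2 = 2 p·c + (r^2 - c·c), which is linear in c and r^2 - c·c.
    """
    P = np.asarray(points, float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = float(np.sqrt(x[3] + center @ center))
    return center, radius

def removal_verified(points, center, radius, tol):
    """True if every tracked surface point lies within tol of the sphere,
    i.e., no residual non-spherical (cam) prominence is detected."""
    dev = np.abs(np.linalg.norm(np.asarray(points, float) - center, axis=1) - radius)
    return bool((dev <= tol).all())
```

A residual bump would show up as a point whose radial deviation exceeds the tolerance, prompting further resurfacing or surgeon review.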
FIG. 6 illustrates generally an example of a block diagram of a machine 100 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform, in accordance with some embodiments. In alternative embodiments, the machine 100 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 100 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. The machine 100 may be a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch, or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations. - Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or like mechanisms. Such mechanisms are tangible entities (e.g., hardware) capable of performing specified operations when operating. In an example, the hardware may be specifically configured to carry out a specific operation (e.g., hardwired). In an example, the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation. The configuring may occur under the direction of the execution units or a loading mechanism. Accordingly, the execution units are communicatively coupled to the computer readable medium when the device is operating.
For example, in operation, the execution units may be configured by a first set of instructions to implement a first set of features at one point in time and reconfigured by a second set of instructions to implement a second set of features.
- Machine (e.g., computer system) 100 may include a hardware processor 102 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a
main memory 104, and a static memory 106, some or all of which may communicate with each other via an interlink (e.g., bus) 108. The machine 100 may further include a display unit 110, an alphanumeric input device 112 (e.g., a keyboard), and a user interface (UI) navigation device 114 (e.g., a mouse). In an example, the display unit 110, alphanumeric input device 112, and UI navigation device 114 may be a touch screen display. The display unit 110 may include goggles, glasses, an augmented reality (AR) display, a virtual reality (VR) display, or another display component. For example, the display unit may be worn on a head of a user and may provide a heads-up-display to the user. The alphanumeric input device 112 may include a virtual keyboard (e.g., a keyboard displayed virtually in a VR or AR setting). - The
machine 100 may additionally include a storage device (e.g., drive unit) 116, a signal generation device 118 (e.g., a speaker), a network interface device 120, and one or more sensors 121, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 100 may include an output controller 128, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices. - The
storage device 116 may include a machine readable medium 122 that is non-transitory, on which is stored one or more sets of data structures or instructions 124 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 124 may also reside, completely or at least partially, within the main memory 104, within the static memory 106, or within the hardware processor 102 during execution thereof by the machine 100. In an example, one or any combination of the hardware processor 102, the main memory 104, the static memory 106, or the storage device 116 may constitute machine readable media. - While the machine
readable medium 122 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 124. - The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the
machine 100 and that cause the machine 100 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. - The
instructions 124 may further be transmitted or received over a communications network 126 using a transmission medium via the network interface device 120 utilizing any one of a number of transfer protocols (e.g., frame relay, Internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the personal area network family of standards known as Bluetooth® that are promulgated by the Bluetooth Special Interest Group), peer-to-peer (P2P) networks, among others. In an example, the network interface device 120 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 126. In an example, the network interface device 120 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 100, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. - Each of these non-limiting examples may stand on its own, or may be combined in various permutations or combinations with one or more of the other examples.
- Example 1 is a femoroacetabular impingement resurfacing system comprising: a bone model generator to receive a bone imaging data set and generate a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; an osteophyte identifier to map a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; a resurfacing navigator to generate a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
- In Example 2, the subject matter of Example 1 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and wherein the robotic controller executes the navigation file to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- In Example 3, the subject matter of Example 2 optionally includes wherein the navigation file includes a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; and wherein the robotic controller uses the navigation file to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
- In Example 4, the subject matter of any one or more of Examples 1-3 optionally include a tracking apparatus to generate tracking data, wherein the robotic controller processes the tracking data to determine a location of at least one of the osteophyte removal device and the native bone surface.
- In Example 5, the subject matter of Example 4 optionally includes wherein the tracking apparatus includes an image capture device to generate image data; and wherein the robotic controller processes the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- In Example 6, the subject matter of any one or more of Examples 4-5 optionally include wherein the tracking apparatus includes a robotic tracking arm to position the image capture device; and wherein the robotic controller executes the navigation file to control the robotic tracking arm to improve an image quality of the generated image data.
- In Example 7, the subject matter of any one or more of Examples 1-6 optionally include a bone atlas database, wherein the osteophyte identifier is further operable to: compare the bone imaging data set against the bone atlas database to find a closest bone atlas entry; and use the closest bone atlas entry as a model for the native bone surface and to identify the at least one osteophyte.
- In Example 8, the subject matter of Example 7 optionally includes wherein the osteophyte identifier identifying the at least one osteophyte includes identifying geometric features based on the bone imaging data set.
- In Example 9, the subject matter of any one or more of Examples 1-8 optionally include wherein the osteophyte identifier is further to: generate a 3D bone model based on the bone imaging data set; and determine an impingement-free range of motion based on the 3D bone model.
- In Example 10, the subject matter of Example 9 optionally includes wherein generating the 3D bone model includes: performing a 3D reconstruction along a femoral neck; and identifying a femoral head spherical center.
- In Example 11, the subject matter of Example 10 optionally includes wherein determining the impingement-free range of motion includes identifying non-spherical portions of the 3D bone model.
- Example 12 is a femoroacetabular impingement resurfacing method comprising: performing the following operations on a computing device including a processor and memory, the operations including: receiving a bone imaging data set; generating a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; mapping a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; generating a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and outputting the navigation file for use by an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
- In Example 13, the subject matter of Example 12 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and the method further comprises executing the navigation file on the robotic controller to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- In Example 14, the subject matter of Example 13 optionally includes the operations further including generating a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; wherein the navigation file includes instructions to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
- In Example 15, the subject matter of any one or more of Examples 12-14 optionally include the operations further including determining a location of at least one of the osteophyte removal device and the native bone surface using a tracking system.
- In Example 16, the subject matter of Example 15 optionally includes wherein receiving tracking data includes receiving image data from an image capture device; and wherein the operations include processing the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- In Example 17, the subject matter of Example 16 optionally includes wherein the tracking system includes a robotic tracking arm to position the image capture device; and the method further comprises executing the navigation file to control the robotic tracking arm to improve an image quality of the received image data.
- In Example 18, the subject matter of any one or more of Examples 12-17 optionally include the operations further including: comparing the bone imaging data set against a bone atlas database to find a closest bone atlas entry; and using the closest bone atlas entry as a model for the native bone surface and to identify the at least one osteophyte.
- In Example 19, the subject matter of Example 18 optionally includes wherein identifying the at least one osteophyte includes identifying geometric features based on the bone imaging data set.
- In Example 20, the subject matter of any one or more of Examples 12-19 optionally include wherein mapping the virtual 3D boundary surface further includes: generating a 3D bone model based on the bone imaging data set; and determining an impingement-free range of motion based on the 3D bone model.
- In Example 21, the subject matter of Example 20 optionally includes wherein generating the 3D bone model includes: performing a 3D reconstruction along a femoral neck; and identifying a femoral head spherical center.
- In Example 22, the subject matter of Example 21 optionally includes D bone model.
- Example 23 is at least one machine-readable storage medium, comprising a plurality of instructions that, responsive to being executed with processor circuitry of a computer-controlled femoroacetabular impingement resurfacing device, cause the device to: receive a bone imaging data set; generate a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte; map a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte; generate a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and output the navigation file for use by an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
- In Example 24, the subject matter of Example 23 optionally includes wherein the osteophyte removal device includes an osteophyte removal tool, a robotic arm, and a robotic controller; and the instructions further causing the device to execute the navigation file on the robotic controller to cause the robotic arm to maneuver the osteophyte removal tool to automatically resurface the native bone surface to remove the at least one osteophyte.
- In Example 25, the subject matter of Example 24 optionally includes the instructions further causing the device to generate a safety zone, the safety zone representing a plurality of virtual boundaries that prevent movement of the osteophyte removal device into a plurality of surrounding soft tissues; wherein the navigation file includes instructions to cause the osteophyte removal device to avoid the safety zone when resurfacing the native bone surface.
- In Example 26, the subject matter of any one or more of Examples 23-25 optionally include the instructions further causing the device to determine a location of at least one of the osteophyte removal device and the native bone surface using a tracking system.
- In Example 27, the subject matter of Example 26 optionally includes wherein determining the location using a tracking system includes receiving image data from an image capture device within the tracking system; and wherein the determining the location includes processing the image data to identify and determine the location of at least one of the osteophyte removal device and the native bone surface.
- In Example 28, the subject matter of Example 27 optionally includes wherein the tracking system includes a robotic tracking arm to position the image capture device; and the instructions further cause the device to execute the navigation file to control the robotic tracking arm to improve an image quality of the received image data.
- Example 134 is at least one non-transitory machine-readable medium including instructions for operation of a robotic arm, which when executed by at least one processor, cause the at least one processor to perform operations of any of the methods of Examples 1-28.
- Example 135 is a method for performing any one of Examples 1-28.
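The atlas-matching step of Examples 7-8 and 18-19 (find the closest bone atlas entry, then use it as a model of the native surface to identify osteophytes) can be sketched as follows. The point-set representation, the mean nearest-neighbor score, and the 2 mm distance threshold are illustrative assumptions, not anything the specification prescribes:

```python
import math

def mean_nn_distance(src, dst):
    """Mean distance from each point in src to its nearest neighbor in dst
    (brute force; adequate for the small point sets used in this sketch)."""
    return sum(min(math.dist(p, q) for q in dst) for p in src) / len(src)

def closest_atlas_entry(patient_points, atlas_entries):
    """Pick the (name, points) atlas entry whose surface best matches the
    patient's bone imaging data set, as in Examples 7 and 18."""
    return min(atlas_entries, key=lambda entry: mean_nn_distance(patient_points, entry[1]))

def flag_osteophytes(patient_points, atlas_points, threshold_mm=2.0):
    """Flag patient points lying farther than threshold_mm from the atlas model
    of the native surface; distance-to-atlas stands in for the geometric
    features of Examples 8 and 19."""
    return [min(math.dist(p, q) for q in atlas_points) > threshold_mm
            for p in patient_points]
```

A toy usage: a scan whose third point bulges 5 mm off the matched atlas surface would be flagged as an osteophyte candidate, while the two conforming points would not.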
Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer-readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Claims (1)
1. A femoroacetabular impingement resurfacing system comprising:
a bone model generator to receive a bone imaging data set and generate a resurfacing model based on the bone imaging data set, the resurfacing model including at least one osteophyte and a native bone surface connected to the at least one osteophyte;
an osteophyte identifier to map a virtual 3D boundary surface based on the resurfacing model, the virtual 3D boundary surface identifying an osteophyte virtual boundary located between the native bone surface and the at least one osteophyte;
a resurfacing navigator to generate a navigation file, the navigation file including the resurfacing model and the virtual 3D boundary surface, the navigation file to provide control instructions to resurface the native bone surface to remove the at least one osteophyte; and
an osteophyte removal device to automatically resurface the native bone surface based on the navigation file.
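The four claimed components (bone model generator, osteophyte identifier, resurfacing navigator, osteophyte removal device) can be sketched as a minimal pipeline. The per-point labels, the dictionary-based navigation file, and the cuttable/protected point sets are hypothetical simplifications chosen for illustration; the claim itself does not fix any data format:

```python
def generate_resurfacing_model(bone_imaging_data):
    """Bone model generator: split the imaging data set into the native bone
    surface and the osteophyte connected to it. The per-point labels stand in
    for real segmentation of the imaging data."""
    return {
        "native_surface": [p for p, label in bone_imaging_data if label == "native"],
        "osteophyte": [p for p, label in bone_imaging_data if label == "osteophyte"],
    }

def map_virtual_boundary(model):
    """Osteophyte identifier: the virtual 3D boundary sits between the native
    surface and the osteophyte, so only osteophyte material may be cut."""
    return {"cuttable": set(model["osteophyte"]),
            "protected": set(model["native_surface"])}

def generate_navigation_file(model, boundary):
    """Resurfacing navigator: bundle the model and boundary with per-point
    control instructions."""
    return {"model": model, "boundary": boundary,
            "instructions": [("cut", p) for p in model["osteophyte"]]}

def run_removal_device(nav_file):
    """Osteophyte removal device: execute only cuts on the cuttable side of the
    virtual boundary; anything on the protected side is skipped."""
    boundary = nav_file["boundary"]
    return [p for op, p in nav_file["instructions"]
            if op == "cut" and p in boundary["cuttable"]
            and p not in boundary["protected"]]
```

Running the pipeline on a toy scan removes only the point labeled as osteophyte, never a native-surface point, which is the safety property the virtual boundary exists to enforce.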
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/786,522 US20200170721A1 (en) | 2016-06-16 | 2020-02-10 | Robotized system for femoroacetabular impingement resurfacing |
US17/668,138 US11672613B2 (en) | 2016-06-16 | 2022-02-09 | Robotized system for femoroacetabular impingement resurfacing |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662350891P | 2016-06-16 | 2016-06-16 | |
US15/625,555 US10582971B2 (en) | 2016-06-16 | 2017-06-16 | Robotized system for femoroacetabular impingement resurfacing |
US16/786,522 US20200170721A1 (en) | 2016-06-16 | 2020-02-10 | Robotized system for femoroacetabular impingement resurfacing |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/625,555 Division US10582971B2 (en) | 2016-06-16 | 2017-06-16 | Robotized system for femoroacetabular impingement resurfacing |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/668,138 Division US11672613B2 (en) | 2016-06-16 | 2022-02-09 | Robotized system for femoroacetabular impingement resurfacing |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200170721A1 true US20200170721A1 (en) | 2020-06-04 |
Family
ID=59254037
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/625,555 Active 2037-11-18 US10582971B2 (en) | 2016-06-16 | 2017-06-16 | Robotized system for femoroacetabular impingement resurfacing |
US16/786,522 Abandoned US20200170721A1 (en) | 2016-06-16 | 2020-02-10 | Robotized system for femoroacetabular impingement resurfacing |
US17/668,138 Active US11672613B2 (en) | 2016-06-16 | 2022-02-09 | Robotized system for femoroacetabular impingement resurfacing |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/625,555 Active 2037-11-18 US10582971B2 (en) | 2016-06-16 | 2017-06-16 | Robotized system for femoroacetabular impingement resurfacing |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/668,138 Active US11672613B2 (en) | 2016-06-16 | 2022-02-09 | Robotized system for femoroacetabular impingement resurfacing |
Country Status (5)
Country | Link |
---|---|
US (3) | US10582971B2 (en) |
EP (1) | EP3471647A1 (en) |
CN (1) | CN109310477B (en) |
CA (2) | CA3113815C (en) |
WO (1) | WO2017218933A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11672613B2 (en) | 2016-06-16 | 2023-06-13 | Medtech S.A. | Robotized system for femoroacetabular impingement resurfacing |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107645924B (en) | 2015-04-15 | 2021-04-20 | 莫比乌斯成像公司 | Integrated medical imaging and surgical robotic system |
CN114469211A (en) | 2016-07-12 | 2022-05-13 | 莫比乌斯成像公司 | Multi-stage dilator and cannula system and method |
US11103990B2 (en) | 2016-09-16 | 2021-08-31 | Mobius Imaging Llc | System and method for mounting a robotic arm in a surgical robotic system |
WO2018075784A1 (en) | 2016-10-21 | 2018-04-26 | Syverson Benjamin | Methods and systems for setting trajectories and target locations for image guided surgery |
EP3531954A4 (en) | 2016-10-25 | 2020-09-16 | Mobius Imaging LLC | Methods and systems for robot-assisted surgery |
AU2017362768A1 (en) | 2016-11-18 | 2019-05-30 | Stryker Corp. | Method and apparatus for treating a joint, including the treatment of CAM-type femoroacetabular impingement in a hip joint and pincer-type femoroacetabular impingement in a hip joint |
CA3055941C (en) | 2017-03-13 | 2023-06-20 | Zimmer, Inc. | Augmented reality diagnosis guidance |
US10682129B2 (en) | 2017-03-23 | 2020-06-16 | Mobius Imaging, Llc | Robotic end effector with adjustable inner diameter |
US11432877B2 (en) | 2017-08-02 | 2022-09-06 | Medtech S.A. | Surgical field camera system that only uses images from cameras with an unobstructed sight line for tracking |
AU2018316251B2 (en) | 2017-08-11 | 2024-05-02 | Mobius Imaging, Llc | Method and apparatus for attaching a reference marker to a patient |
EP3691545A4 (en) | 2017-10-04 | 2022-02-16 | Mobius Imaging, LLC | Systems and methods for performing lateral-access spine surgery |
WO2019071189A2 (en) | 2017-10-05 | 2019-04-11 | GYS Tech, LLC d/b/a Cardan Robotics | Methods and systems for performing computer assisted surgery |
US11039892B2 (en) * | 2017-12-22 | 2021-06-22 | Zimmer, Inc. | Robotically-assisted knee arthroplasty support systems and methods |
US11464569B2 (en) | 2018-01-29 | 2022-10-11 | Stryker Corporation | Systems and methods for pre-operative visualization of a joint |
JP7535517B2 (en) * | 2018-12-27 | 2024-08-16 | マコ サージカル コーポレーション | Systems and methods for surgical planning using soft tissue attachment points - Patents.com |
US11957629B2 (en) * | 2019-02-14 | 2024-04-16 | Stryker Australia Pty Ltd | Systems and methods for assisting surgery |
AU2020315554B2 (en) * | 2019-07-17 | 2023-02-23 | Mako Surgical Corp. | Surgical registration tools, systems, and methods of use in computer-assisted surgery |
EP3828768A1 (en) * | 2019-11-28 | 2021-06-02 | Digital Orthopaedics | Method for modeling a bone |
US11457984B1 (en) | 2019-12-02 | 2022-10-04 | Arc Specialties, Inc. | Surgical process for knee replacement and knee resurfacing process therefor |
WO2021168354A1 (en) * | 2020-02-21 | 2021-08-26 | Stryker Corporation | Systems and methods for visually guiding bone removal during a surgical procedure on a joint |
CN111329588A (en) * | 2020-03-09 | 2020-06-26 | 广东省人民医院(广东省医学科学院) | Mixed virtual reality superposition positioning piece for medical image and mixed virtual reality guiding method |
CN111358563B (en) * | 2020-03-11 | 2021-09-03 | 上海交通大学 | Hip arthroscope auxiliary robot system based on cooperative mechanical arm and control method |
CN111388091A (en) * | 2020-03-17 | 2020-07-10 | 京东方科技集团股份有限公司 | Optical scale and coordinate system registration method |
CN111563901B (en) * | 2020-04-15 | 2023-08-08 | 中国科学院苏州生物医学工程技术研究所 | Hip joint image processing method and system based on magnetic resonance, storage medium and equipment |
CN111590584B (en) * | 2020-05-27 | 2021-12-10 | 京东方科技集团股份有限公司 | Determination method and device of safety limit area, reset method and medical robot |
US11730491B2 (en) | 2020-08-10 | 2023-08-22 | Kunnskap Medical, LLC | Endoscopic image analysis and control component of an endoscopic system |
CN112155733B (en) * | 2020-09-29 | 2022-01-28 | 苏州微创畅行机器人有限公司 | Readable storage medium, bone modeling and registering system and bone surgery system |
US20230100698A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods for Controlling Cooperative Surgical Instruments |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130211531A1 (en) | 2001-05-25 | 2013-08-15 | Conformis, Inc. | Patient-adapted and improved articular implants, designs and related guide tools |
DE10128236A1 (en) | 2001-06-11 | 2002-08-01 | Infineon Technologies Ag | Method for compensating a step-shaped DC interference in a digital baseband signal of a homodyne radio receiver |
FR2871363B1 (en) | 2004-06-15 | 2006-09-01 | Medtech Sa | ROBOTIZED GUIDING DEVICE FOR SURGICAL TOOL |
US20060189864A1 (en) | 2005-01-26 | 2006-08-24 | Francois Paradis | Computer-assisted hip joint resurfacing method and system |
US8548559B2 (en) | 2005-06-17 | 2013-10-01 | Orthosoft, Inc. | Method and apparatus for computer-assisted femoral head resurfacing |
US8623026B2 (en) * | 2006-02-06 | 2014-01-07 | Conformis, Inc. | Patient selectable joint arthroplasty devices and surgical tools incorporating anatomical relief |
US8407067B2 (en) * | 2007-04-17 | 2013-03-26 | Biomet Manufacturing Corp. | Method and apparatus for manufacturing an implant |
US9113971B2 (en) | 2006-02-27 | 2015-08-25 | Biomet Manufacturing, Llc | Femoral acetabular impingement guide |
US7949386B2 (en) | 2006-03-21 | 2011-05-24 | A2 Surgical | Computer-aided osteoplasty surgery system |
US8858563B2 (en) * | 2007-10-30 | 2014-10-14 | Hipco, Inc. | Device and method for hip distention and access |
WO2010132310A1 (en) * | 2009-05-12 | 2010-11-18 | Foundry Newco Xi, Inc. | Methods and devices to treat diseased or injured musculoskeletal tissue |
AU2011266700B2 (en) * | 2010-06-16 | 2015-05-21 | A2 Surgical | Method and system of automatic determination of geometric elements characterizing a bone deformation from 3D image |
US8679125B2 (en) | 2010-09-22 | 2014-03-25 | Biomet Manufacturing, Llc | Robotic guided femoral head reshaping |
US8715289B2 (en) | 2011-04-15 | 2014-05-06 | Biomet Manufacturing, Llc | Patient-specific numerically controlled instrument |
US8764760B2 (en) | 2011-07-01 | 2014-07-01 | Biomet Manufacturing, Llc | Patient-specific bone-cutting guidance instruments and methods |
AU2012289973B2 (en) * | 2011-08-03 | 2017-01-19 | Conformis, Inc. | Automated design, selection, manufacturing and implantation of patient-adapted and improved articular implants, designs and related guide tools |
US9386993B2 (en) | 2011-09-29 | 2016-07-12 | Biomet Manufacturing, Llc | Patient-specific femoroacetabular impingement instruments and methods |
DE102011121708A1 (en) * | 2011-12-20 | 2013-06-20 | Surgiceye Gmbh | Image generation apparatus and method for nuclear imaging |
US20140188240A1 (en) * | 2012-02-07 | 2014-07-03 | Conformis, Inc. | Methods and devices related to patient-adapted hip joint implants |
EP2996589B1 (en) | 2013-03-15 | 2022-01-19 | Howmedica Osteonics Corporation | Generation of a mating surface model for patient specific cutting guide based on anatomical model segmentation |
CN105431102B (en) * | 2013-06-11 | 2018-01-30 | 迷你麦克斯医疗 | System for the processing of the amount of plan of body part |
US9629642B2 (en) | 2014-03-25 | 2017-04-25 | Synvasive Technology, Inc. | Hip resurfacing drill guide device |
EP3471647A1 (en) | 2016-06-16 | 2019-04-24 | Medtech SA | Robotized system for femoroacetabular impingement resurfacing |
2017
- 2017-06-16 EP EP17734206.0A patent/EP3471647A1/en active Pending
- 2017-06-16 CA CA3113815A patent/CA3113815C/en active Active
- 2017-06-16 US US15/625,555 patent/US10582971B2/en active Active
- 2017-06-16 WO PCT/US2017/037940 patent/WO2017218933A1/en unknown
- 2017-06-16 CA CA3027964A patent/CA3027964C/en active Active
- 2017-06-16 CN CN201780036192.3A patent/CN109310477B/en active Active

2020
- 2020-02-10 US US16/786,522 patent/US20200170721A1/en not_active Abandoned

2022
- 2022-02-09 US US17/668,138 patent/US11672613B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US11672613B2 (en) | 2023-06-13 |
CN109310477B (en) | 2021-11-30 |
CA3027964C (en) | 2021-05-25 |
CN109310477A (en) | 2019-02-05 |
WO2017218933A1 (en) | 2017-12-21 |
EP3471647A1 (en) | 2019-04-24 |
US20170360513A1 (en) | 2017-12-21 |
CA3113815C (en) | 2023-09-19 |
US20220265366A1 (en) | 2022-08-25 |
CA3113815A1 (en) | 2017-12-21 |
CA3027964A1 (en) | 2017-12-21 |
US10582971B2 (en) | 2020-03-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11672613B2 (en) | Robotized system for femoroacetabular impingement resurfacing | |
US10194990B2 (en) | Method for augmenting a surgical field with virtual guidance content | |
CN113303907B (en) | System for robot-assisted correction procedure | |
US20210121237A1 (en) | Systems and methods for augmented reality display in navigated surgeries | |
US20200038112A1 (en) | Method for augmenting a surgical field with virtual guidance content | |
CN106913366B (en) | On-tool tracking system and computer-assisted surgery method | |
JP2022535738A (en) | Systems and methods for utilizing augmented reality in surgical procedures | |
JP6144351B2 (en) | System and method for guidance and control of an implant placement device | |
CN111345896B (en) | Osteotomy execution system, positioning, control and simulation execution method and electronic equipment | |
CN111031954A (en) | Sensory enhancement system and method for use in medical procedures | |
CN113796956A (en) | Surgical guidance system for computer-assisted navigation during surgery | |
JP7217780B2 (en) | Instruments for guided orthopedic surgery | |
US20230380905A1 (en) | Method and system for validating bone alterations in computer-assisted surgery | |
JP7331223B2 (en) | Fluoroscopic robot artificial implantation system and method | |
US20200069372A1 (en) | Method and system for navigating a bone model in computer-assisted surgery | |
CN114224508A (en) | Medical image processing method, system, computer device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |