WO2024129771A1 - Controller with a touchpad user interface for operating robotically actuated devices - Google Patents

Controller with a touchpad user interface for operating robotically actuated devices

Info

Publication number
WO2024129771A1
WO2024129771A1 (PCT/US2023/083686)
Authority
WO
WIPO (PCT)
Prior art keywords
touchpad
controller
user
robot
velocity
Prior art date
Application number
PCT/US2023/083686
Other languages
French (fr)
Inventor
Jesse F. D'ALMEIDA
Tayfun E. ERTOP
Robert James Webster
Original Assignee
Vanderbilt University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vanderbilt University
Publication of WO2024129771A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/37Master-slave robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A61B34/74Manipulators with manual electric input means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A61B34/74Manipulators with manual electric input means
    • A61B2034/742Joysticks

Definitions

  • Several different basic types of user interfaces have been previously proposed for use in handheld surgical robots, including joysticks, rollerballs, thumb/finger wheels, articulated parallel linkages, and specialty user interfaces, such as the SpaceMouse®, available commercially from 3DConnexion US of Fremont, California, which offer axial, translational, and rotational input capabilities.
  • Joysticks are appealing since they are readily available commercial products and come in a variety of sizes. The main limitation of these devices is that a joystick can normally provide control inputs for only two degrees-of-freedom, e.g., X and Y control inputs.
  • Bi-directional control in the 3rd degree-of-freedom for a joystick controller requires an additional input, such as a button or trigger, or an additional joystick.
  • Rollerball interfaces have an advantage over joysticks because there are no joint limits, i.e., they can roll unlimited amounts, whereas joystick movements are limited. Rollerballs are, however, like joysticks in that they are inherently a 2D interface. Absent supplemental inputs, neither joysticks nor rollerballs can function alone to control a robot that requires a 3D input.
  • Another option is to simply combine several one degree-of-freedom control inputs on a single interface to control a multiple degree-of-freedom robot.
  • a series of buttons, levers, and rollers can be implemented on a surgical instrument, such as the handle of a laparoscopic device or an endoscope, to control the actuation of motors that effectuate bending, translation, rotation, flexion, etc. of an actuatable tool tip, such as a manipulator tip or a laser fiber.
  • the issue with using several one degree-of-freedom inputs is that it can be challenging for users to mentally map between multiple inputs, each controlled by different fingers, to coordinate motions of the tool tip.
  • SpaceMouse®-type user interfaces can also be used in robotics.
  • the SpaceMouse® user interface combines a six degree-of-freedom control input in a compact frame.
  • the SpaceMouse® is designed to be used with a full-hand grasp with the device resting flat on a surface, such as a tabletop.
  • when mounted on a handheld robot, a full-hand grasp of the SpaceMouse® user interface is not possible, as the user needs to maneuver the robot body itself as the robotic tools are controlled.
  • the SpaceMouse® also made it difficult to control a single degree-of-freedom without affecting the others. For example, a user would struggle with moving purely in the X-direction without also moving in the Y or Z direction(s).
  • a user interface for handheld surgical robots implements a novel three degree-of-freedom touchpad user interface.
  • the touchpad user interface includes a touchpad that encodes two degree-of-freedom movement, such as X-Y movement.
  • a controller for providing three-dimensional control inputs to a robot includes a user interface with a touchpad configured to be touched by a finger of the user while grasping the handle.
  • the touchpad is configured to sense a location where the user’s finger touches the touchpad and to provide X and Y control inputs comprising X-axis data and Y-axis data associated with the location where the user’s finger touches the touchpad.
  • the touchpad is configured to move axially in response to forces applied to the touchpad by the user’s finger.
  • the user interface is configured to provide a Z control input including Z-axis data associated with an axial position of the touchpad.
  • the controller can also include a handle configured to be grasped by the user while actuating the touchpad with a thumb.
  • the touchpad can be configured for a pinching grasp between the thumb and an index finger of the user while the handle is grasped by fingers of the user other than the thumb and index finger.
  • the touchpad can be configured to update the X-axis data and Y-axis data continuously and in real-time so that the associated X and Y control inputs are updated continuously and in real-time.
  • the user interface can be configured so that the X-axis data corresponds to a horizontal left/right position with respect to the robot, the Y-axis data corresponds to a vertical up/down position with respect to the robot, and the Z-axis data corresponds to a depth-wise in/out position with respect to the robot.
  • the touchpad can include a touch surface configured to receive the user’s finger and along which the user’s finger can slide to actuate the X and Y control inputs.
  • the touchpad can be round.
  • the touch surface can be concave.
  • the touch surface can include a surface feature configured to provide a tactile confirmation of the location on the touch surface being touched.
  • the surface feature can be configured to provide tactile indication to the user of at least one of a deadband for the XY input on the touch surface, an outer edge of the touch surface, and a change in scaling of the XY input on the touch surface.
  • the touchpad can be a resistive touchpad or a capacitive touchpad.
  • the controller can also include a biasing member configured to exert a bias on the touchpad to move in a first axial direction toward the user’s finger.
  • the biasing member can be configured so that the bias on the touchpad can be overcome by the user’s finger to move the touchpad in a second axial direction, opposite the first axial direction.
  • the biasing member can include a spring.
  • the controller can also include a force sensor configured to sense a spring force exerted by the spring and provide an output corresponding to the sensed spring force. The controller can be configured to determine the Z-axis data in response to the output from the force sensor.
  • the controller can include a displacement sensor configured to detect the axial position of the touchpad and to provide the Z control input.
  • the displacement sensor can be a force sensor, a resistive displacement sensor, an optical displacement sensor, or a magnetic displacement sensor.
  • a handheld surgical robot can include a robotic element, a robot actuator, and a controller as set forth in any of the aspects set forth above.
  • the user interface of the controller can be configured to provide the X, Y, and Z control inputs to the robot actuator.
  • the robot actuator can be configured to actuate the robotic element in response to the X, Y, and Z control inputs to produce corresponding movements at a tip of the robotic element.
  • the robot actuator can be configured to implement mapping of the X, Y, and Z control inputs from the user interface to produce a 3D velocity-level value.
  • the robot actuator can be configured to implement a resolved rates controller configured to convert the velocity-level value to a joint space value.
  • the mapping can include position-level mapping, velocity-level mapping, or mixed position-level and velocity-level mapping.
  • the diag() function converts the 3x1 scaling vector to a 3x3 matrix.
  • the velocity-level mapping can include a non-linear input scaling in which the velocity for each axis i is changed according to: veli = kv,i * xi, if |xi| < xthresh,i; and veli = αi * kv,i * (xi - sign(xi) * xthresh,i) + kv,i * sign(xi) * xthresh,i, if |xi| ≥ xthresh,i, where kv is a 3x1 scaling vector that represents the maximum velocity possible in a single direction, xi is the user interface input along axis i, xthresh = 0.85 is the threshold at which the non-linear scaling takes place, and αi is the constant used to create the non-linear input profile.
  • Fig.1 illustrates a surgical system that includes a robot, according to an example configuration.
  • Fig.2 is a top-front perspective view of the robot, according to the example configuration.
  • Fig.3 is a top-rear perspective view of the robot, according to the example configuration.
  • Fig.4A is a top-front perspective view of a controller of the robot, according to the example configuration.
  • Fig.4B is the view of Fig.4A showing the controller in the grasp of a user’s hand.
  • Fig.5 is a bottom-front perspective view of the controller, according to the example configuration.
  • Fig.6 is a bottom-rear perspective view of the controller, according to the example configuration.
  • Figs.7 and 8 are left and right side views, respectively, of the controller, according to the example configuration.
  • Figs.9 and 10 are front and rear views, respectively, of the controller, according to the example configuration.
  • Fig.11 is a sectional view taken generally along line 11-11 of Fig.9.
  • Fig.12 is a sectional view taken generally along line 12-12 of Fig.9.
  • Fig.13 is a perspective view illustrating the controller in the grasp of a user’s hand according to an example alternative use scenario.
  • Fig.1 illustrates an operating room environment in which surgery can be performed.
  • the operating room of Fig.1 is outfitted with an example configuration of a system 10 for performing surgery, which includes an apparatus 20 in the form of a robot.
  • the robot 20 is what is referred to herein as a “handheld” robot, meaning that the robot and system 10 are configured to allow for manual manipulation of the position and orientation of the robot as a whole while, at the same time, they are configured to allow for the independent control of their robotic elements.
  • the robot 20 includes an endoscope 22 through which one or more robotically controlled elements 30 extend.
  • the robotic elements 30 can be any articulated structure that can be controlled robotically and are capable of performing a surgical or other medical procedure.
  • the robotic elements 30 can be concentric tube manipulators.
  • the robot 20 is not limited to the endoscopic, concentric tube manipulator configuration of the example configuration(s) illustrated herein.
  • the system 10 can implement a robot of any configuration that can benefit from the features and improvements disclosed herein.
  • the endoscope 22 is secured to and extends from an actuator portion 24 of the robot 20.
  • the actuator 24 encloses and supports actuator elements (not shown), such as encoders, controllers, motors, transmissions, linkages, tracks, sliders, etc. that produce the robotic movement of the robotic elements 30.
  • the actuator elements of the actuator 24 are concealed within a housing.
  • the actuator 24 is configured to produce robotic movements of the robotic elements 30.
  • the robotic movements are translational and rotational movements of the individual tubes of the concentric tube manipulators that make up the robotic elements 30.
  • the translational and rotational movements are controlled and synchronized to produce a desired movement of the tips of the robotic elements and the tools/end effectors supported thereon.
  • the concentric tube manipulators of the robotic elements 30 are small, needle-diameter, tentacle-like robots that include multiple concentric, pre-curved, elastic tubes.
  • These elastic, curved tubes are typically made of a superelastic metal alloy such as a nickel-titanium alloy (“nitinol”) material.
  • the tubes can be rotated about and/or translated along a common longitudinal axis of the concentric tubes. Through relative rotational movement, the rotational positions of the concentric tubes relative to each other can be controlled. Through relative translational movement, the concentric tubes can be retracted into one another and extended from one another. As the pre-curved tubes interact with one another through relative translational and rotational movement, they cause one another to bend and twist, with the tubes collectively assuming a minimum energy conformation.
  • the pre-curvature(s) of the tube(s) for a given manipulator can be selected to provide a desired workspace that the tip can access through the relative rotational and/or translational movements of the tubes.
  • the curved shape of the distal end of the manipulator is controlled via translation and rotation of each tube at a proximal location outside the patient, e.g., at its base inside the robot actuator 24 where the tubes are connected to the actuator elements.
  • the robotic elements 30 are shown in the example configuration as being concentric tube manipulators, they could have any configuration for which 3D control inputs are desired.
  • the robotic elements could be robot arms including one or more linkages that are actuatable to pivot or rotate relative to each other, with tools affixed to a working end of the arm.
  • examples of the tools that can be implemented by the robotic elements 30 include curettes, grippers, surgical lasers, graspers, retractors, scissors, imaging tips, cauterizing tips, ablation tips, morcelators, knives/scalpels, cameras, irrigation ports, suction ports, needles, probes, and tissue manipulators.
  • the robot 20 is supported on a support device 26 that permits the user (i.e., surgeon) to easily maneuver and position the robot 20.
  • the support device 26 can, for example, be a counterbalance arm configured to negate all or a portion of the weight of the robot 20 so that the user can manually manipulate its position and orientation with ease.
  • the support device 26 can also have locking features that allow the user to fix the position of the robot 20 so that the user can focus on operating the robotic elements 30 from a fixed position and orientation of the endoscope 22 from which the robotic elements extend.
  • the system 10 can also include one or more computers 50 connected to the robot 20 for control, programming, recording, or other purposes.
  • the system 10 can further include one or more monitors 52 for providing endoscope and other video, diagnostics, patient vitals, and other information to the user.
  • the robot 20 includes controllers 100 configured to allow for the manual manipulation of the robot as a whole, as well as to provide control inputs to the robot 20 to control robotic movements of the robotic elements 30.
  • the controllers 100 include handle features that facilitate grasping in a comfortable and ergonomic manner so that the position and orientation of the robot 20 can be adjusted via the support device 26.
  • the controllers 100 also include user interface features configured to allow the user to supply control inputs to the robot 20 in a convenient and intuitive manner while grasping the handle features.
  • the robot 20 includes two robotic elements 30, each of which is individually controlled by a corresponding one of two controllers 100 that are provided on the robot 20.
  • the left controller 100 (left as viewed from the rear of the robot 20, see, Fig. 3) is actuatable to produce 3D control inputs along left controller XYZ axes, i.e., XLIN, YLIN, ZLIN.
  • the right controller 100 (right as viewed from the rear of the robot 20) is actuatable to produce 3D control inputs along right controller XYZ axes, i.e., XRIN, YRIN, ZRIN.
  • the control inputs XLIN, YLIN, ZLIN and XRIN, YRIN, ZRIN are fed to the robot actuator 24, which implements a kinematic model that converts the control inputs to actuator commands that will produce movements of the tips/end effectors of the robotic elements 30 that correspond to the control inputs.
  • the XLIN, YLIN, ZLIN control inputs received from the left controller 100 produce corresponding output motions at the tips/end effectors of the associated robotic element 30, i.e., XLOUT, YLOUT, ZLOUT.
  • the XRIN, YRIN, ZRIN control inputs received from the right controller 100 produce corresponding output motions at the tips/end effector of the associated robotic element 30, i.e., XROUT, YROUT, ZROUT.
  • the left and right controllers 100 are illustrated as controlling the operation of two individual robotic elements 30, it will be appreciated that the controllers 100 could be used to control different aspects of a single robotic element.
  • Figs 4-12 illustrate an example configuration of a controller 100 that can be implemented in the robot 20.
  • the controller 100 illustrated in Figs.4-12 corresponds to the controller shown on the right side of the robot 20, as viewed from the rear of the robot in Fig.3.
  • the controllers 100 have side-specific configurations, i.e., a left-side controller configuration and a right-side controller configuration.
  • the left-side and right-side controller configurations are mirror images of each other.
  • the controller 100 includes a handle portion 110 that is ergonomically designed to accommodate the user’s hands, i.e., palms and fingers, in a comfortable position that cuts down on fatigue.
  • the handle portion 110 can include finger receiving surfaces 112 configured to receive and match-up with the shape and form of mating portions (e.g., palms and/or fingers) of the human hand when grasping the handle portion.
  • a similar surface 114 is configured to receive and match-up with the shape and form of the user’s hand in the area of the forefinger and thumb when grasping the handle portion 110.
  • the controller 100 also includes a mounting portion 120 configured to facilitate a physical connection of the controller to the robot 20.
  • the mounting portion 120 can take the form of a mounting block or bracket configured to position the controller 100 at the desired position and orientation relative to the robot.
  • the mounting portion 120 can include openings 122 for receiving fasteners (not shown), which fasten the controller 100 to the robot actuator 24, e.g., to a frame or housing of the robot actuator, as shown in Figs.1-3.
  • the controller 100 also includes a user interface 150 configured to provide three-dimensional (3D) inputs to the robot 20 to control operation of the robotic elements 30.
  • the robot 20 includes two robotically controlled elements 30, each of which is controlled independently of the other via a user interface 150 of a corresponding controller 100.
  • Alternative configurations can be implemented. For example, where the robot 20 includes a single robotic element 30, only a single user interface 150 is necessary. In this instance, the robot 20 can be configured so that only one of the controllers 100 includes a user interface 150. Alternatively, in this scenario, the robot 20 can be outfitted with a controller 100 on each side of the robot 20, and the user can select which controller 100 and user interface 150, left-side or right-side, to use.
  • the user interface 150 includes a touchpad 160 configured to be engaged by the user’s thumb while grasping the handle portion 110 (See, Fig.4B).
  • the user interface 150 is configured to coordinate 3D motion in an intuitive manner, so that the movements of the user’s thumb produce a corresponding movement of the robotic element 30.
  • the user interface 150 tracks the position of the user’s thumb in 3D space and produces an input that corresponds to that tracked movement in three dimensions.
  • Each user interface 150 has a three degree-of-freedom configuration in which movement of the user’s thumb is tracked in three dimensions defined by the X-axis, Y-axis, and Z-axis, i.e., XIN, YIN, ZIN.
  • the touchpad 160 is configured to act as a 2D interface that acquires two degrees-of-freedom of the tracked thumb position, specifically thumb movement in the X-Y directions. Additionally, the touchpad 160 is configured to move as a whole along the Z-axis and to track the position of the user’s thumb along the Z-axis, thereby acquiring the thumb position in the third degree-of- freedom.
  • the touchpad 160 can be of any type conventionally applied in computer and other devices.
  • the touchpad 160 can be a capacitive touchpad or a resistive touchpad. Capacitive touchpads were implemented in current designs, but resistive touchpads can also be used. As shown in the example configuration shown in Figs.1-12, the touchpad 160 can have a generally round, disc-shaped configuration with a diameter selected to coincide, in general, with the reach of a user’s thumb while grasping the handle portion 110. The surface of the touchpad 160 can also have a slight spherically recessed configuration, as shown, for example, in Fig 12. Alternative shapes, such as elliptical, square, rectangular, polygonal, etc., and alternative sizes can, of course, be implemented.
  • a 40 mm diameter TM040040 trackpad, available commercially from GlidePoint™ (Cirque, Utah, USA), was implemented.
  • the touchpad 160 is mounted on a frame 130 that is configured to move linearly along a rail 140 that is fixed to a housing portion 116 located at an upper end of the handle portion 110 of the controller 100.
  • the frame 130 has an opening 132 through which the rail 140 extends.
  • the frame 130 and rail 140 are configured so that the linear travel of the frame over the rail is perpendicular to both the X-axis and Y-axis of the touchpad 160, i.e., along the Z-axis.
  • a spring 142 is configured to bias movement of the frame 130 along the rail 140 so that forward pressing forces (see Fig.4A-4B) applied to the touchpad 160 by the thumb compress the spring. Because of this, when the thumb releases its pressing force, the frame 130 will move rearward due to the bias applied by the spring 142, with the touchpad 160 maintaining contact with the thumb. As a result, the user interface 150 is configured so that the touchpad 160 is maintained in contact with the thumb during use, regardless of its Z-axis position. Accordingly, the user interface 150 can encode all three degree-of-freedom inputs , i.e., XIN, YIN, ZIN, throughout the procedure, capturing the motion of the user's thumb directly in 3D space.
  • a displacement sensor in the form of a force sensor 134 is mounted in the housing 116 and is configured to measure the push force applied to the touchpad 160.
  • the force sensor 134 is configured to sense the force applied to the touchpad 160 through the spring 142. Due to the spring constant of the spring 142, the force applied to the force sensor 134 increases as the spring is compressed and, similarly, is reduced as the spring compression is relieved.
  • the force sensor 134 can be used to encode the Z-axis position of the frame 130/touchpad 160.
  • the force sensor 134 can be a REB7 Universal Sub Miniature Load Cell (available commercially from Loadstar Sensors, Fremont, California, USA).
  • the touchpad 160 is configured to output an X-Y coordinate based on where the thumb touches the pad.
  • the analog signal is converted by an ADC to a digital signal with a range, for example, of 0-1024 (where 0 is one end of the touchpad, 1024 is the opposite end, and 512 is the center) for both the X and Y positions.
  • these values can be remapped to a range of -1 to +1 such that, for each of X and Y, -1 is one side of the touchpad, +1 is the opposite side, and 0 is the center (a code sketch of this normalization and the deadband handling follows this list).
  • the spring 142 is configured so that its spring constant is strong enough to move the frame 130/touchpad 160 while, at the same time, not so strong that displacement is cumbersome for the user.
  • for mapping the 3D input of the user interface 150 to control the robot 20, there are several options: position-level mapping, velocity-level mapping, and mixed mapping, i.e., a mixture of both position-level mapping and velocity-level mapping.
  • Each mapping will send a velocity-level 3D value to a resolved rates controller implemented in the robot 20, e.g., the robot actuator 24, that converts the desired velocity to the joint space.
  • the absolute readings from the touchpad and force sensor are scaled to correspond to a position in 3D for the tip of the robotic element 30.
  • the scaling factors chosen correlate to the maximum ranges to which the tip of the element 30 can travel. This gives a user quicker absolute control, while limiting the robotic movement to that range.
  • vel = k[diag(kp)*xin - (ptip - preference)], where vel is a 3x1 vector of velocity input, kp is a 3x1 scaling vector, xin is a 3x1 vector of user interface input, ptip is the current position of the manipulator tip, preference is a reference position of the manipulator from which movements are based, and k is a proportionality constant for how fast the velocity input matches the desired position.
  • the diag() function converts the 3x1 scaling vector to a 3x3 matrix.
  • the user interface input maps instead directly to a tip velocity of the robotic element 30.
  • the larger analog range of the touchpad 160 and force sensor 134 allow users to better modulate their velocity in 3D space.
  • Pressing the button makes kv,z negative, enabling movement in the negative Z direction. This may, however, be too cumbersome to users and removes the level of intuitive movement for robotic control that is sought for the controller 100, because forward Z direction movement of the touchpad 160 would be associated with forward and backward tip motion, depending on the state of the switch.
  • Another option is to remap the Z input to -1 to +1, such that 0 is the midpoint of the force range. Therefore, to have a velocity of 0 in the Z direction, the user must keep some force on the touchpad 160. A negative Z velocity is achieved by letting up on the force, while pushing in further results in a positive Z velocity.
  • a deadband region is included about the 0 point of the X,Y,Z input.
  • the region in the X-Y directions on the touchpad is 17.5% of the radius, and for the Z direction it is 22% of the maximum force around the 0.
  • a physical indicator, such as a bump from an indentation, can be applied to the frame 130/rail 140 interface to provide a tactile sense of when the spring 142 and the touchpad 160 are at their midpoints of displacement/travel.
  • the touchpad 160 can include surface features or indicia, such as physical indents or etchings, to indicate different regions on the touch surface 162 with a tactile response that the user can feel and associate with the region.
  • These indents or etchings can, for example, be in the form of dots or other shapes such as dashes or a continuous line arranged in a shape that corresponds with the defined region on the touch surface 162. This is shown in Fig.9.
  • circularly arranged indents/etchings 170 can be centered on the touch surface 162 with a radius selected so that the deadband covers the desired region, e.g., 17.5% of the touchpad radius.
  • circularly arranged indents/etchings 172 can alert the user that they are approaching the edge of the touchpad workspace.
  • indents/etchings 174 can extend or be positioned on the touch surface 162 inward from the outer edge of the touchpad 160, e.g., between the indents/etchings 170 and 172, to indicate where on the workspace a change in scaling occurs.
  • This can be implemented, for example, where velocity-level control is implemented (see below).
  • a modification of velocity-level control includes the addition of a non-linear input scaling where, instead of the velocity mapping linearly from 0 to the maximum velocity (in either direction, + or -), it instead increases the scaling near the edge of the range.
  • for each axis i, the scaled velocity is veli = kv,i * xi, if |xi| < xthresh,i; and veli = αi * kv,i * (xi - sign(xi) * xthresh,i) + kv,i * sign(xi) * xthresh,i, if |xi| ≥ xthresh,i, where xthresh = 0.85 is the threshold at which the non-linear scaling takes place and αi is the constant used to create the non-linear input profile.
  • the sign() is the signum function that returns the sign of the input as +1, 0, or -1.
  • Mixed-level control combines aspects of both the position-level and velocity-level controls.
  • for each axis i, the mixed-level mapping uses the position-level mapping, veli = k[kp,i * xi - (ptip,i - preference,i)], if |xi| < xthresh,i, and switches to the velocity-level mapping, veli = kv,i * xi, if |xi| ≥ xthresh,i (a code sketch of this mixed mapping follows this list).
  • the user could, however, actuate the touchpad 160 in an alternative manner, using a finger other than the thumb, such as a forefinger.
  • the user might lock the support device 26 to fix the position/orientation of the robot 20, which relieves the need to steady the robot by grasping the handle portion 110, thus allowing for forefinger actuation of the touchpad 160.
  • the user could actuate the touchpad 160 in another alternative manner through a pinching grasp of the touchpad.
  • the user can grasp the touchpad 160 with a pinching gesture using the thumb and index finger while, at the same time, the remaining fingers not involved in the pinching grasp are used to grasp the handle portion 110.
  • the handle portion 110 is not shown.
  • the support device 26 can be locked to fix the position/orientation of the robot 20 while using the pinching grasp.
  • the thumb is positioned on the front of the touchpad 160 and used to input the sliding XY control inputs.
  • the index finger is positioned on the rear of the touchpad 160 (e.g., centrally as shown or near the bottom, depending on user preference).
  • the thumb effectuates the axial Z control inputs in the forward/pressing/extension direction (e.g., against the spring bias), and the index finger effectuates the Z control inputs in the rearward/pulling/retraction direction (e.g., with the spring bias).
  • the pinching grasp works well because the index finger engagement or grip on the rear of the touchpad 160 allows the thumb to work in concert with the index finger to produce the desired sliding thumb movements on the touchpad in a familiar and intuitive manner, as if the thumb is being slid or rubbed across the index finger itself.
  • the grasp of the touchpad 160 gives the user a better sense of control of the Z direction because they are able to sense and effectuate, positively, movement of the touchpad in the rearward/pulling/retraction direction.
  • the pinching grasp allows the user to actively move the touchpad 160 in both Z directions, as opposed to thumb-only actuation, which relies on the spring maintaining contact between the touchpad and the thumb as the thumb moves alone in the rearward/retraction direction without any input or interaction from the index finger.
  • the displacement of the touchpad 160 in the Z direction can be determined in manners different than the force sensor 134 method described in the example configuration.
  • position sensors such as resistive position sensors, optical position sensors, or magnetic position sensors could be used to sense the displacement of the touchpad 160 in the Z direction and can be used to produce the Z control inputs.
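The bullets above describe remapping the touchpad's raw ADC readings from 0-1024 to -1 to +1 and applying a deadband of 17.5% of the touchpad radius for X-Y and 22% of the maximum force for Z. The following sketch illustrates one way such normalization and deadband handling could be written; it is not taken from the patent, and the function names, the assumption that Z has already been normalized to -1 to +1, and the example values are illustrative only.

    def normalize_axis(raw, raw_max=1024):
        """Map a raw ADC reading in [0, raw_max] to [-1.0, +1.0], with 0 at the center."""
        return (raw - raw_max / 2) / (raw_max / 2)

    def apply_deadband(x, y, z, xy_deadband=0.175, z_deadband=0.22):
        """Zero out inputs near the neutral point.

        The X-Y deadband is radial (a fraction of the touchpad radius); the Z
        deadband is a fraction of the force range about its zero point.
        """
        if (x ** 2 + y ** 2) ** 0.5 < xy_deadband:
            x, y = 0.0, 0.0
        if abs(z) < z_deadband:
            z = 0.0
        return x, y, z

    # Example: a raw X reading of 768 maps to +0.5; a light touch near the center is ignored.
    x_norm = normalize_axis(768)                  # 0.5
    print(apply_deadband(0.05, -0.08, 0.10))      # (0.0, 0.0, 0.0)
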
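The mixed-level bullets above switch between the position-level and velocity-level mappings on a per-axis basis. A minimal sketch of that switching logic is shown below, assuming NumPy, 3-element arrays, and hypothetical names such as mixed_level, k_p, k_v, and x_thresh; the gain values themselves are not specified by the patent.

    import numpy as np

    def mixed_level(x_in, p_tip, p_ref, k_p, k_v, k, x_thresh=0.85):
        """Mixed position/velocity mapping: position-level below the per-axis
        input threshold, velocity-level at or above it."""
        vel = np.empty(3)
        for i in range(3):
            if abs(x_in[i]) < x_thresh:
                vel[i] = k * (k_p[i] * x_in[i] - (p_tip[i] - p_ref[i]))
            else:
                vel[i] = k_v[i] * x_in[i]
        return vel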

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Robotics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

A controller for providing three-dimensional control inputs to a robot includes a user interface with a touchpad configured to be touched by a finger of the user while grasping the handle. The touchpad is configured to sense a location where the user's finger touches the touchpad and to provide X and Y control inputs comprising X-axis data and Y-axis data associated with the location where the user's finger touches the touchpad. The touchpad is configured to move axially in response to forces applied to the touchpad by the user's finger. The user interface is configured to provide a Z control input including Z-axis data associated with an axial position of the touchpad.

Description

CONTROLLER WITH A TOUCHPAD USER INTERFACE FOR OPERATING ROBOTICALLY ACTUATED DEVICES Government Rights [0001] This invention was made with government support under grant number R01 EB026901 awarded by the National Institutes of Health. The government has certain rights in the invention. Related Application [0002] This application claims the benefit of U.S. Provisional Application Serial No. 63/387,050, filed on December 12, 2022, the disclosure of which is hereby incorporated by reference in its entirety. Technical Field [0003] This disclosure relates to user interfaces for operating actuated devices, such as robots. More particularly, this disclosure relates to a touchpad user interface for controlling the operation of handheld surgical robots. Background [0004] In recent years, surgical robotic systems have become increasingly useful tools for physicians. Nearly all surgical robots in use today are teleoperated, and are typically large, complex systems. They are generally remotely controlled from a surgeon console containing complex user interfaces with many degrees-of-freedom. These consoles often have a large footprint to encompass the complex user interface and take up a large space in the operating room. Recently developed handheld robotic surgical systems promise reduced mechanical complexity, smaller size, lower cost, and the potential to integrate into traditional surgical workflows. [0005] Handheld robots are ones that are either held entirely by the surgeon or are supported by a counterbalance arm and used by the surgeon in line with the procedure. Several different basic types of user interfaces have been previously proposed for use in handheld surgical robots, including joysticks, rollerballs, thumb/finger wheels, articulated parallel linkages, and specialty user interfaces, such as the SpaceMouse®, available commercially from 3DConnexion US of Fremont, California, which offer axial, translational, and rotational input capabilities. [0006] Joysticks are appealing since they are readily available commercial products and come in a variety of sizes. The main limitation of these devices is that a joystick can normally provide control inputs for only two degrees-of-freedom, e.g., X and Y control inputs. Bi-directional control in the 3rd degree-of-freedom for a joystick controller requires an additional input, such as a button or trigger, or an additional joystick. [0007] Rollerball interfaces have an advantage over joysticks because there are no joint limits, i.e., they can roll unlimited amounts, whereas joystick movements are limited. Rollerballs are, however, like joysticks in that they are inherently a 2D interface. Absent supplemental inputs, neither joysticks nor rollerballs can function alone to control a robot that requires a 3D input. [0008] Another option is to simply combine several one degree-of-freedom control inputs on a single interface to control a multiple degree-of-freedom robot. For example, a series of buttons, levers, and rollers can be implemented on a surgical instrument, such as the handle of a laparoscopic device or an endoscope, to control the actuation of motors that effectuate bending, translation, rotation, flexion, etc. of an actuatable tool tip, such as a manipulator tip or a laser fiber.
The issue of using several one degree-of-freedom inputs is that it can be challenging for users to mentally map between multiple inputs, each controlled by different fingers, to coordinate motions of the tool tip. [0009] SpaceMouse®-type user interfaces can also be used in robotics. The SpaceMouse® user interface combines a six degree-of-freedom control input in a compact frame. The SpaceMouse® is designed to be used with a full-hand grasp with the device resting flat on a surface, such as a tabletop. When mounted on a handheld robot, a full-hand grasp of the SpaceMouse® user interface is not possible, as the user needs to maneuver the robot body itself as the robotic tools are controlled. The SpaceMouse® also made it difficult to control a single degree-of-freedom without affecting the others. For example, a user would struggle with moving purely in the X-direction without also moving in the Y or Z direction(s). [0010] Yet another option to unify more than two degrees of freedom in a single user interface is an articulated mechanism containing multiple degree-of-freedom linkages attached to the handles. These devices are often integrated as an extension of the laparoscopic/endoscopic tools and move the way that the instruments behave, which cannot translate well to handheld robots. [0011] Although its use in robotic teleoperation has been limited, a touchpad is another user interface type that can be used in handheld systems. The popularity of touchpad-based interfaces in modern electronics indicates a high degree of pre-existing intuition for users, which can be leveraged within user interfaces. One approach is to divide the touchpad surface into regions to control separate degrees-of-freedom of the robot. This approach, however, is not suitable for applications, such as surgical manipulation, that require continuous coordinated movements in 3D. [0012] Extending the sensing capability of touchpads to a 3D input also has been somewhat explored through pressure sensing, such as in some smartphones. However, these 3D methods also have not been applied to robotic control. Additionally, as they only measure pressure, they cannot take advantage of the body’s natural proprioceptive sense through spatial displacement, which is an important factor for user intuition. Summary [0013] A user interface for handheld surgical robots implements a novel three degree-of-freedom touchpad user interface. The touchpad user interface includes a touchpad that encodes two degree-of-freedom movement, such as X-Y movement. The touchpad is mounted on a frame/carriage that is movable along a rail against the bias of a spring that maintains the touchpad in engagement with the user’s finger/thumb. Displacement of the carriage along the rail is used to encode the remaining one of the three degrees-of-freedom, i.e., Z movement. [0014] A controller for providing three-dimensional control inputs to a robot includes a user interface with a touchpad configured to be touched by a finger of the user while grasping the handle. The touchpad is configured to sense a location where the user’s finger touches the touchpad and to provide X and Y control inputs comprising X-axis data and Y-axis data associated with the location where the user’s finger touches the touchpad. The touchpad is configured to move axially in response to forces applied to the touchpad by the user’s finger.
The user interface is configured to provide a Z control input including Z-axis data associated with an axial position of the touchpad. [0015] According to one aspect, the controller can also include a handle configured to be grasped by the user while actuating the touchpad with a thumb. [0016] According to another aspect, alone or in combination with any other aspect, the touchpad can be configured for a pinching grasp between the thumb and an index finger of the user while the handle is grasped by fingers of the user other than the thumb and index finger. [0017] According to another aspect, alone or in combination with any other aspect, the touchpad can be configured to update the X-axis data and Y-axis data continuously and in real-time so that the associated X and Y control inputs are updated continuously and in real-time. [0018] According to another aspect, alone or in combination with any other aspect, the user interface can be configured so that the X-axis data corresponds to a horizontal left/right position with respect to the robot, the Y-axis data corresponds to a vertical up/down position with respect to the robot, and the Z-axis data corresponds to a depth-wise in/out position with respect to the robot. [0019] According to another aspect, alone or in combination with any other aspect, the touchpad can include a touch surface configured to receive the user’s finger and along which the user’s finger can slide to actuate the X and Y control inputs. The touchpad can be round. The touch surface can be concave. [0020] According to another aspect, alone or in combination with any other aspect, the touch surface can include a surface feature configured to provide a tactile confirmation of the location on the touch surface being touched. The surface feature can be configured to provide tactile indication to the user of at least one of a deadband for the XY input on the touch surface, an outer edge of the touch surface, and a change in scaling of the XY input on the touch surface. [0021] According to another aspect, alone or in combination with any other aspect, the touchpad can be a resistive touchpad or a capacitive touchpad. [0022] According to another aspect, alone or in combination with any other aspect, the controller can also include a biasing member configured to exert a bias on the touchpad to move in a first axial direction toward the user’s finger. The biasing member can be configured so that the bias on the touchpad can be overcome by the user’s finger to move the touchpad in a second axial direction, opposite the first axial direction. The biasing member can include a spring. [0023] According to another aspect, alone or in combination with any other aspect, the controller can also include a force sensor configured to sense a spring force exerted by the spring and provide an output corresponding to the sensed spring force. The controller can be configured to determine the Z-axis data in response to the output from the force sensor. [0024] According to another aspect, alone or in combination with any other aspect, the controller can include a displacement sensor configured to detect the axial position of the touchpad and to provide the Z control input. The displacement sensor can be a force sensor, a resistive displacement sensor, an optical displacement sensor, or a magnetic displacement sensor.
[0025] According to another aspect, alone or in combination with any other aspect, a handheld surgical robot can include a robotic element, a robot actuator, and a controller as set forth in any of the aspects set forth above. The user interface of the controller can be configured to provide the X, Y, and Z control inputs to the robot actuator. The robot actuator can be configured to actuate the robotic element in response to the X, Y, and Z control inputs to produce corresponding movements at a tip of the robotic element. [0026] According to another aspect, alone or in combination with any other aspect, the robot actuator can be configured to implement mapping of the X, Y, and Z control inputs from the user interface to produce a 3D velocity-level value. The robot actuator can be configured to implement a resolved rates controller configured to convert the velocity-level value to a joint space value. [0027] According to another aspect, alone or in combination with any other aspect, the mapping can include position-level mapping, velocity-level mapping, or mixed position-level and velocity-level mapping. [0028] According to another aspect, alone or in combination with any other aspect, the mapping can include position-level mapping that converts a desired position from the X, Y, and Z inputs from the user interface to a velocity input to the resolved rates controller according to: vel = k[diag(kp)*xin - (ptip - preference)], where vel is a 3x1 vector of velocity input, kp is a 3x1 scaling vector, xin is a 3x1 vector of user interface input, ptip is a current position of the tip of the robotic element, preference is the reference position of the tip of the robotic element, and k is a proportionality constant for how fast the velocity input matches the desired position. The diag() function converts the 3x1 scaling vector to a 3x3 matrix. [0029] According to another aspect, alone or in combination with any other aspect, the mapping can include velocity-level mapping that converts a desired position from the X, Y, and Z inputs from the user interface to a velocity input to the resolved rates controller according to: vel = diag(kv)*xin, where kv is a 3x1 scaling vector that represents the maximum velocity possible in a single direction, and xin is a 3x1 vector of user interface input. The diag() function converts the 3x1 scaling vector to a 3x3 matrix. [0030] According to another aspect, alone or in combination with any other aspect, the velocity-level mapping can include a non-linear input scaling where the velocity, for each axis i, is changed according to: veli = kv,i * xi, if |xi| < xthresh,i; and veli = αi * kv,i * (xi - sign(xi) * xthresh,i) + kv,i * sign(xi) * xthresh,i, if |xi| ≥ xthresh,i, where kv is a 3x1 scaling vector that represents the maximum velocity possible in a single direction, xi is the user interface input along axis i, xthresh = 0.85 is the threshold at which the non-linear scaling takes place, αi is the constant used to create a non-linear input profile, and sign() is the signum function that returns the sign of the input as +1, 0, or -1.
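To make the mapping options of paragraphs [0028]-[0030] concrete, the following sketch expresses the position-level, velocity-level, and non-linear velocity-level mappings in code. It is an illustration only, assuming NumPy, 3-element arrays, and hypothetical names such as position_level, k_p, and alpha; the patent does not prescribe an implementation or specific gain values.

    import numpy as np

    def position_level(x_in, p_tip, p_ref, k_p, k):
        """Position-level mapping [0028]: vel = k * (diag(k_p) @ x_in - (p_tip - p_ref))."""
        return k * (np.diag(k_p) @ x_in - (p_tip - p_ref))

    def velocity_level(x_in, k_v):
        """Velocity-level mapping [0029]: vel = diag(k_v) @ x_in."""
        return np.diag(k_v) @ x_in

    def velocity_level_nonlinear(x_in, k_v, alpha, x_thresh=0.85):
        """Velocity-level mapping with the non-linear input scaling of [0030].

        Below the threshold the mapping is linear; beyond it the slope grows by
        alpha, so sensitivity increases near the edge of the input range.
        """
        vel = np.empty(3)
        for i in range(3):
            if abs(x_in[i]) < x_thresh:
                vel[i] = k_v[i] * x_in[i]
            else:
                vel[i] = (alpha[i] * k_v[i] * (x_in[i] - np.sign(x_in[i]) * x_thresh)
                          + k_v[i] * np.sign(x_in[i]) * x_thresh)
        return vel
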
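The resolved rates controller mentioned in paragraph [0026] is not spelled out in the patent; a common formulation converts the desired tip velocity to joint velocities through the pseudoinverse of the manipulator Jacobian. The sketch below assumes that standard formulation and hypothetical names (resolved_rates_step, jacobian); the actual controller may differ.

    import numpy as np

    def resolved_rates_step(jacobian, vel_desired, q, dt):
        """One resolved-rates update: map the desired 3D tip velocity to joint
        velocities via the Jacobian pseudoinverse and integrate over one period."""
        q_dot = np.linalg.pinv(jacobian) @ vel_desired
        return q + q_dot * dt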
[0031] According to another aspect, alone or in combination with any other aspect, the mapping comprises mixed-level mapping based on a position-level mapping with a low sensitivity according to: veli = k[kp,i * xi - (ptip,i - preference,i)], if |xi| < xthresh,i; and veli = kv,i * xi, if |xi| ≥ xthresh,i, where, for each axis i, the velocity, veli, will switch from position-level to velocity-level if the input along that axis, xi, exceeds a threshold of xthresh,i. Drawings [0032] Fig.1 illustrates a surgical system that includes a robot, according to an example configuration. [0033] Fig.2 is a top-front perspective view of the robot, according to the example configuration. [0034] Fig.3 is a top-rear perspective view of the robot, according to the example configuration. [0035] Fig.4A is a top-front perspective view of a controller of the robot, according to the example configuration. [0036] Fig.4B is the view of Fig.4A showing the controller in the grasp of a user’s hand. [0037] Fig.5 is a bottom-front perspective view of the controller, according to the example configuration. [0038] Fig.6 is a bottom-rear perspective view of the controller, according to the example configuration. [0039] Figs.7 and 8 are left and right side views, respectively, of the controller, according to the example configuration. [0040] Figs.9 and 10 are front and rear views, respectively, of the controller, according to the example configuration. [0041] Fig.11 is a sectional view taken generally along line 11-11 of Fig.9. [0042] Fig.12 is a sectional view taken generally along line 12-12 of Fig.9. [0043] Fig.13 is a perspective view illustrating the controller in the grasp of a user’s hand according to an example alternative use scenario. Description [0044] Fig.1 illustrates an operating room environment in which surgery can be performed. The operating room of Fig.1 is outfitted with an example configuration of a system 10 for performing surgery, which includes an apparatus 20 in the form of a robot. Particularly, the robot 20 is what is referred to herein as a “handheld” robot, meaning that the robot and system 10 are configured to allow for manual manipulation of the position and orientation of the robot as a whole while, at the same time, they are configured to allow for the independent control of their robotic elements. [0045] In the example configuration, the robot 20 includes an endoscope 22 through which one or more robotically controlled elements 30 extend. The robotic elements 30 can be any articulated structure that can be controlled robotically and are capable of performing a surgical or other medical procedure. In the example configuration of Fig.1, the robotic elements 30 can be concentric tube manipulators. The robot 20 is not limited to the endoscopic, concentric tube manipulator configuration of the example configuration(s) illustrated herein. The system 10 can implement a robot of any configuration that can benefit from the features and improvements disclosed herein. [0046] The endoscope 22 is secured to and extends from an actuator portion 24 of the robot 20. The actuator 24 encloses and supports actuator elements (not shown), such as encoders, controllers, motors, transmissions, linkages, tracks, sliders, etc. that produce the robotic movement of the robotic elements 30. In the example configuration, the actuator elements of the actuator 24 are concealed within a housing. The actuator 24 is configured to produce robotic movements of the robotic elements 30.
In the example configuration, the robotic movements are translational and rotational movements of the individual tubes of the concentric tube manipulators that make up the robotic elements 30. The translational and rotational movements are controlled and synchronized to produce a desired movement of the tips of the robotic elements and the tools/end effectors supported thereon. [0047] The concentric tube manipulators of the robotic elements 30 are small, needle-diameter, tentacle-like robots that include multiple concentric, pre-curved, elastic tubes. These elastic, curved tubes are typically made of a superelastic metal alloy such as a nickel-titanium alloy (“nitinol”) material. Individually or in combination, the tubes can be rotated about and/or translated along a common longitudinal axis of the concentric tubes. Through relative rotational movement, the rotational positions of the concentric tubes relative to each other can be controlled. Through relative translational movement, the concentric tubes can be retracted into one another and extended from one another. [0048] As the pre-curved tubes interact with one another through relative translational and rotational movement, they cause one another to bend and twist, with the tubes collectively assuming a minimum energy conformation. The pre-curvature(s) of the tube(s) for a given manipulator can be selected to provide a desired workspace that the tip can access through the relative rotational and/or translational movements of the tubes. The curved shape of the distal end of the manipulator is controlled via translation and rotation of each tube at a proximal location outside the patient, e.g., at its base inside the robot actuator 24 where the tubes are connected to the actuator elements. [0049] Although the robotic elements 30 are shown in the example configuration as being concentric tube manipulators, they could have any configuration for which 3D control inputs are desired. For example, the robotic elements could be robot arms including one or more linkages that are actuatable to pivot or rotate relative to each other, with tools affixed to a working end of the arm. In any implementation, examples of the tools that can be implemented by the robotic elements 30 include curettes, grippers, surgical lasers, graspers, retractors, scissors, imaging tips, cauterizing tips, ablation tips, morcelators, knives/scalpels, cameras, irrigation ports, suction ports, needles, probes, and tissue manipulators. [0050] The robot 20 is supported on a support device 26 that permits the user (i.e., surgeon) to easily maneuver and position the robot 20. The support device 26 can, for example, be a counterbalance arm configured to negate all or a portion of the weight of the robot 20 so that the user can manually manipulate its position and orientation with ease. The support device 26 can also have locking features that allow the user to fix the position of the robot 20 so that the user can focus on operating the robotic elements 30 from a fixed position and orientation of the endoscope 22 from which the robotic elements extend. [0051] The system 10 can also include one or more computers 50 connected to the robot 20 for control, programming, recording, or other purposes. The system 10 can further include one or more monitors 52 for providing endoscope and other video, diagnostics, patient vitals, and other information to the user.
[0052] Referring to Figs.2 and 3, the robot 20 includes controllers 100 configured to allow for the manual manipulation of the robot as a whole, as well as to provide control inputs to the robot 20 to control robotic movements of the robotic elements 30. The controllers 100 include handle features that facilitate grasping in a comfortable and ergonomic manner so that the position and orientation of the robot 20 can be adjusted via the support device 26. The controllers 100 also include user interface features configured to allow the user to supply control inputs to the robot 20 in a convenient and intuitive manner while grasping the handle features. In the example configuration of Figs.1-3, the robot 20 includes two robotic elements 30, each of which is individually controlled by a corresponding one of two controllers 100 that are provided on the robot 20. [0053] The left controller 100 (left as viewed from the rear of the robot 20; see Fig.3) is actuatable to produce 3D control inputs along left controller XYZ axes, i.e., XLIN, YLIN, ZLIN. The right controller 100 (right as viewed from the rear of the robot 20) is actuatable to produce 3D control inputs along right controller XYZ axes, i.e., XRIN, YRIN, ZRIN. [0054] The control inputs XLIN, YLIN, ZLIN and XRIN, YRIN, ZRIN are fed to the robot actuator 24, which implements a kinematic model that converts the control inputs to actuator commands that will produce movements of the tips/end effectors of the robotic elements 30 that correspond to the control inputs. As such, the XLIN, YLIN, ZLIN control inputs received from the left controller 100 produce corresponding output motions at the tips/end effectors of the associated robotic element 30, i.e., XLOUT, YLOUT, ZLOUT. The XRIN, YRIN, ZRIN control inputs received from the right controller 100 produce corresponding output motions at the tips/end effectors of the associated robotic element 30, i.e., XROUT, YROUT, ZROUT. [0055] Additionally, although the left and right controllers 100 are illustrated as controlling the operation of two individual robotic elements 30, it will be appreciated that the controllers 100 could be used to control different aspects of a single robotic element. For example, one controller 100 could be used to control the position of a robotic element and a tool mounted to the robotic element, while the other controller could be used to control the operation of the tool itself. [0056] Figs.4-12 illustrate an example configuration of a controller 100 that can be implemented in the robot 20. The controller 100 illustrated in Figs.4-12 corresponds to the controller shown on the right side of the robot 20, as viewed from the rear of the robot in Fig.3. Considering this, it will be appreciated that the controllers 100 have side-specific configurations, i.e., a left-side controller configuration and a right-side controller configuration. The left-side and right-side controller configurations are mirror images of each other. Operationally, however, the left-side and right-side configurations are identical. Therefore, only the right-side configuration of the controller is shown and described herein, with the understanding that, aside from their mirror-image features, they are operationally and functionally identical. [0057] Mounting the controllers 100 on opposite sides of the robot 20 creates the sense that the entire robot is a part of the control scheme.
Gross movements of the entire robot 20 are facilitated through the physical grasp of the controllers 100 and the manipulation of the robot position and orientation through the support device 26. At the same time, robotic movements of the robotic elements 30 are controlled by the user interface features of the controllers 100, which are discussed in the following paragraphs.

[0058] The controller 100 includes a handle portion 110 that is ergonomically designed to accommodate the user's hands, i.e., palms and fingers, in a comfortable position that reduces fatigue. As such, as shown in Figs.4A and 4B, the handle portion 110 can include finger-receiving surfaces 112 configured to receive and match up with the shape and form of mating portions (e.g., palms and/or fingers) of the human hand when grasping the handle portion. A similar surface 114 is configured to receive and match up with the shape and form of the user's hand in the area of the forefinger and thumb when grasping the handle portion 110. It is through the grasping of the handle portions 110 that the user can manipulate the gross position of the robot 20 manually, e.g., via the support device 26.

[0059] The controller 100 also includes a mounting portion 120 configured to facilitate a physical connection of the controller to the robot 20. As shown, the mounting portion 120 can take the form of a mounting block or bracket configured to position the controller 100 at the desired position and orientation relative to the robot. In the example configuration, the mounting portion 120 can include openings 122 for receiving fasteners (not shown), which fasten the controller 100 to the robot actuator 24, e.g., to a frame or housing of the robot actuator, as shown in Figs.1-3.

[0060] The controller 100 also includes a user interface 150 configured to provide three-dimensional (3D) inputs to the robot 20 to control operation of the robotic elements 30. In the example configuration, the robot 20 includes two robotically controlled elements 30, each of which is controlled independently of the other via a user interface 150 of a corresponding controller 100. Alternative configurations can be implemented. For example, where the robot 20 includes a single robotic element 30, only a single user interface 150 is necessary. In this instance, the robot 20 can be configured so that only one of the controllers 100 includes a user interface 150. Alternatively, in this scenario, the robot 20 can be outfitted with a controller 100 on each side of the robot 20, and the user can select which controller 100 and user interface 150, left-side or right-side, to use.

[0061] The user interface 150 includes a touchpad 160 configured to be engaged by the user's thumb while grasping the handle portion 110 (see Fig.4B). The user interface 150 is configured to coordinate 3D motion in an intuitive manner, so that the movements of the user's thumb produce a corresponding movement of the robotic element 30. Essentially, the user interface 150 tracks the position of the user's thumb in 3D space and produces an input that corresponds to that tracked movement in three dimensions.

[0062] Each user interface 150 has a three degree-of-freedom configuration in which movement of the user's thumb is tracked in three dimensions defined by the X-axis, Y-axis, and Z-axis, i.e., XIN, YIN, ZIN.
The touchpad 160 is configured to act as a 2D interface that acquires two degrees-of-freedom of the tracked thumb position, specifically thumb movement in the X-Y directions. Additionally, the touchpad 160 is configured to move as a whole along the Z-axis and to track the position of the user's thumb along the Z-axis, thereby acquiring the thumb position in the third degree-of-freedom.

[0063] The touchpad 160 can be of any type conventionally applied in computers and other devices. More specifically, the touchpad 160 can be a capacitive touchpad or a resistive touchpad. Capacitive touchpads were implemented in current designs, but resistive touchpads can also be used. As shown in the example configuration of Figs.1-12, the touchpad 160 can have a generally round, disc-shaped configuration with a diameter selected to coincide, in general, with the reach of a user's thumb while grasping the handle portion 110. The surface of the touchpad 160 can also have a slightly spherically recessed configuration, as shown, for example, in Fig.12. Alternative shapes, such as elliptical, square, rectangular, polygonal, etc., and alternative sizes can, of course, be implemented. In one example configuration, a 40 mm diameter GlidePoint™ TM040040 trackpad, available commercially from Cirque (Utah, USA), was implemented.

[0064] The touchpad 160 is mounted on a frame 130 that is configured to move linearly along a rail 140 that is fixed to a housing portion 116 located at an upper end of the handle portion 110 of the controller 100. The frame 130 has an opening 132 through which the rail 140 extends. The frame 130 and rail 140 are configured so that the linear travel of the frame over the rail is perpendicular to both the X-axis and Y-axis of the touchpad 160, i.e., along the Z-axis.

[0065] A spring 142 is configured to bias movement of the frame 130 along the rail 140 so that forward pressing forces (see Figs.4A-4B) applied to the touchpad 160 by the thumb compress the spring. Because of this, when the thumb releases its pressing force, the frame 130 will move rearward due to the bias applied by the spring 142, with the touchpad 160 maintaining contact with the thumb. As a result, the user interface 150 is configured so that the touchpad 160 is maintained in contact with the thumb during use, regardless of its Z-axis position. Accordingly, the user interface 150 can encode all three degree-of-freedom inputs, i.e., XIN, YIN, ZIN, throughout the procedure, capturing the motion of the user's thumb directly in 3D space.

[0066] As best shown in Figs.11 and 12, a displacement sensor in the form of a force sensor 134 is mounted in the housing 116 and is configured to measure the push force applied to the touchpad 160. In the example configuration, the force sensor 134 is configured to sense the force applied to the touchpad 160 through the spring 142. Due to the spring constant of the spring 142, the force applied to the force sensor 134 increases as the spring is compressed and, similarly, is reduced as the spring compression is relieved. Thus, the force sensor 134 can be used to encode the Z-axis position of the frame 130/touchpad 160. In an example configuration, the force sensor 134 can be a REB7 Universal Sub-Miniature Load Cell (available commercially from Loadstar Sensors, Fremont, California, USA).

[0067] The touchpad 160 is configured to output an X-Y coordinate based on where the thumb touches the pad.
The analog signal is converted by an analog-to-digital converter (ADC) to a digital signal with a range, for example, of 0-1024 (where 0 is one end of the touchpad, 1024 is the opposite end, and 512 is the center) for both the X and Y positions. For convenience, these values can be remapped to a range of -1 to +1 such that, for each of X and Y, -1 is one side of the touchpad, +1 is the opposite side, and 0 is the center.

[0068] The spring 142 is configured so that its spring constant is strong enough to move the frame 130/touchpad 160 while, at the same time, not so strong that displacement is cumbersome for the user. In one example configuration, it was found that a 3D-printed spring with a spring constant of 73 N/m fit these requirements. It was also found that about 25 mm was a comfortable distance for the thumb to move the touchpad 160 linearly (along the Z-axis) when holding the controller 100. This resulted in a maximum force of about 1.8 N. The force input is also remapped so that the maximum force is +1 (full displacement along the Z-axis) and no force is 0 (no displacement along the Z-axis). The remapping of the X, Y, Z values simplifies the implementation of mapping algorithms and scaling factors and makes it easier to implement changes.

[0069] There are several options for mapping the 3D input of the user interface 150 to control the robot 20: position-level mapping, velocity-level mapping, and mixed mapping, i.e., a mixture of both position-level mapping and velocity-level mapping. Each mapping sends a velocity-level 3D value to a resolved rates controller implemented in the robot 20, e.g., in the robot actuator 24, that converts the desired velocity to the joint space.

[0070] For position-level control, the absolute readings from the touchpad and force sensor are scaled to correspond to a position in 3D for the tip of the robotic element 30. With position-level control, the scaling factors chosen correlate to the maximum ranges through which the tip of the element 30 can travel. This gives the user quicker absolute control, while limiting the robotic movement to that range. For example, a scaling factor of 30 × 10⁻³ for the x input would limit the robot to a range of +/- 30 mm in the x-direction in task space.

[0071] To convert the desired position from the user interface to a velocity input to the resolved rates controller, the following equation is used:

vel = k[diag(k_p)·x_in − (p_tip − p_reference)]

where vel is a 3×1 vector of velocity input, k_p is a 3×1 scaling vector, x_in is a 3×1 vector of user interface input, p_tip is the current position of the manipulator tip, p_reference is a reference position of the manipulator from which movements are based, and k is a proportionality constant that determines how fast the velocity input drives the tip toward the desired position. The diag() function converts the 3×1 scaling vector to a 3×3 matrix. The form is essentially that of a PID controller, where the error is between the scaled user interface input and the robot tip position. While only the proportional gain of the PID control is implemented in the example configuration, without the integral and derivative components, either or both of those components can also be incorporated. Note that this algorithm requires that the forward kinematic model of the robot be known, as it uses the current tip position value in p_tip. Also, the value p_reference is set once at the beginning and used as a reference. It was found that k = 1 and k_p = {30, 30, 10} × 10⁻³ worked well as constants.
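By way of a non-limiting illustration, the following Python sketch shows how the raw touchpad and force-sensor readings of paragraphs [0067]-[0068] might be remapped to the -1 to +1 and 0 to +1 ranges and then converted to a velocity command under the position-level mapping of paragraphs [0070]-[0071]. The function names, the use of NumPy, and the code structure are assumptions made for illustration only and are not part of the disclosed system.

```python
import numpy as np

# Illustrative constants drawn from the example configuration (paragraph [0071]).
K = 1.0                               # proportionality constant k
KP = np.diag([30e-3, 30e-3, 10e-3])   # diag(k_p), metres per unit of remapped input
MAX_FORCE = 1.8                       # N, force at full 25 mm spring travel (paragraph [0068])

def remap_inputs(x_adc, y_adc, force):
    """Remap raw 0-1024 touchpad ADC counts and the 0-1.8 N force reading to
    the -1..+1 (X, Y) and 0..+1 (Z) ranges described in paragraphs [0067]-[0068]."""
    x = (x_adc - 512) / 512.0
    y = (y_adc - 512) / 512.0
    z = force / MAX_FORCE
    return np.array([x, y, z])

def position_level_velocity(x_in, p_tip, p_reference):
    """vel = k[diag(k_p)·x_in - (p_tip - p_reference)]  (paragraph [0071])."""
    return K * (KP @ x_in - (p_tip - p_reference))

# Example: thumb near the +X edge with a light press, tip still at the reference.
p_ref = np.zeros(3)
vel = position_level_velocity(remap_inputs(900, 512, 0.5), p_ref.copy(), p_ref)
```

In this sketch the command behaves like the proportional term of the PID-style controller described above: the velocity sent to the resolved rates controller is proportional to the error between the scaled thumb input and the current tip position.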
[0072] For velocity-level control, the user interface input instead maps directly to a tip velocity of the robotic element 30. Here, the larger analog range of the touchpad 160 and force sensor 134 allows users to better modulate their velocity in 3D space. The 3×1 scaling vector, k_v, represents the maximum velocity possible in a single direction, and the diag() function converts the 3×1 scaling vector to a 3×3 matrix:

vel = diag(k_v)·x_in

[0073] Because the Z mapping ranges from 0 to +1, there is no way to move along the negative Z direction. There are, however, two ways this can be achieved. First, an external button, such as a switch on the handle portion 110 of the controller 100 or a foot-actuated pedal, can be implemented. Pressing the button makes k_v,z negative, enabling movement in the negative Z direction. This may, however, be too cumbersome for users and removes the level of intuitive movement for robotic control that is sought for the controller 100, because forward Z-direction movement of the touchpad 160 would be associated with either forward or backward tip motion, depending on the state of the switch.

[0074] Another option is to remap the Z input to -1 to +1, such that 0 is the midpoint of the force range. Therefore, to command a velocity of 0 in the Z direction, the user must keep some force on the touchpad 160. A negative Z velocity is achieved by letting up on the force, while pushing in further results in a positive Z velocity. It was found that k_v = {10, 10, 10} × 10⁻³ worked well as constants. For this mapping, a deadband region is included about the 0 point of the X, Y, Z input. The region in the X-Y directions on the touchpad is 17.5% of the radius, and for the Z direction it is 22% of the maximum force about the 0 point.

[0075] For the deadband in the Z direction, a physical indicator, such as a bump from an indentation, can be applied to the frame 130/rail 140 interface to provide a tactile sense of when the spring 142 and the touchpad 160 are at their midpoints of displacement/travel.

[0076] The touchpad 160 can include surface features or indicia, such as physical indents or etchings, to indicate different regions on the touch surface 162 with a tactile response that the user can feel and associate with the region. These indents or etchings can, for example, be in the form of dots or other shapes such as dashes or a continuous line arranged in a shape that corresponds with the defined region on the touch surface 162. This is shown in Fig.9.

[0077] Referring to Fig.9, to indicate or delineate the central deadband radius on the touchpad 160, circularly arranged indents/etchings 170 (dots/dashes/continuous line) can be centered on the touch surface 162 with a radius selected so that the deadband covers the desired region, e.g., 17.5% of the touchpad radius. Additionally, to delineate the outer edge of the touch surface 162 of the pad 160, circularly arranged indents/etchings 172 (dots/dashes/continuous line) can alert the user that they are approaching the edge of the touchpad workspace. Finally, circularly arranged indents/etchings 174 (dots/dashes/continuous line) can extend or be positioned on the touch surface 162 inward from the outer edge of the touchpad 160, e.g., between the indents/etchings 170 and 172, to indicate where on the workspace a change in scaling occurs. This can be implemented, for example, where velocity-level control is implemented (see below).
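As a non-limiting sketch of the velocity-level mapping of paragraphs [0072]-[0074], the following Python fragment maps the -1 to +1 inputs directly to tip velocities, remaps the Z input so that the midpoint of the force range corresponds to zero velocity, and applies the deadband regions described above. The function name, the fixed alpha-free linear form, and the use of NumPy are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

KV = np.array([10e-3, 10e-3, 10e-3])   # max velocity per axis, from paragraph [0074]
XY_DEADBAND = 0.175                     # 17.5% of the touchpad radius
Z_DEADBAND = 0.22                       # 22% of the maximum force, about the midpoint

def velocity_level_velocity(x, y, z_force):
    """Return vel = diag(k_v)·x_in with the deadbands of paragraph [0074].

    x, y are the remapped -1..+1 touchpad readings; z_force is the 0..+1
    force reading, remapped to -1..+1 about the force midpoint so that
    easing off the touchpad commands a negative Z velocity."""
    z = 2.0 * z_force - 1.0                 # force midpoint -> zero Z velocity

    # Radial deadband for X-Y, separate deadband about zero for Z.
    if np.hypot(x, y) < XY_DEADBAND:
        x, y = 0.0, 0.0
    if abs(z) < Z_DEADBAND:
        z = 0.0

    return KV * np.array([x, y, z])

# Example: thumb resting near the centre with the spring near mid-travel
# commands zero velocity on all three axes.
assert np.allclose(velocity_level_velocity(0.05, -0.05, 0.55), 0.0)
```

The deadband is what lets the user hold the tip still: small drifts of the thumb about the centre of the pad, or small variations about the force midpoint, produce no commanded motion.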
[0078] A modification of velocity-level control includes the addition of a non-linear input scaling where, instead of the velocity mapping linearly from 0 to the maximum velocity (in either direction, + or -), the scaling instead increases near the edge of the input range. For example, the result can be that vel_i changes to the following:

vel_i = k_v,i·x_i,  if |x_i| < x_thresh,i

vel_i = α_i·k_v,i·(x_i − sign(x_i)·x_thresh,i) + k_v,i·sign(x_i)·x_thresh,i,  if |x_i| ≥ x_thresh,i

where x_thresh,i is the threshold at which the non-linear scaling takes effect and α_i is the constant used to create a non-linear input profile. sign() is the signum function, which returns the sign of the input as +1, 0, or -1.

[0079] Mixed-level control combines aspects of both the position-level and velocity-level controls. Mixed-level control is based on a position-level mapping with a low sensitivity. As the user reaches either extreme of the input range of the user interface 150, the input switches to a velocity-level mapping and continues at a velocity in that direction, as shown:

vel_i = k[k_p,i·x_i − (p_tip,i − p_reference,i)],  if |x_i| < x_thresh,i

vel_i = k_v,i·x_i,  if |x_i| ≥ x_thresh,i

where, for each axis i, the velocity vel_i switches from position-level to velocity-level if the input along that axis, x_i, exceeds the threshold x_thresh,i. This methodology provides the intuition of position-level control without the small-range constraint that position-level control can impose. In one example implementation, it was found that k = 5, k_p = {3.5, 3.5, 10} × 10⁻³, and k_v = {10, 10, 10} × 10⁻³ worked well as constants.

[0080] The above description is given by way of example, and not limitation. In view of the above disclosure, one skilled in the art could devise variations that are within the scope and spirit of the invention disclosed herein. For example, the user interface 150 implemented via the controller 100 has been described where the user grasps the handle portion 110 with their hand/fingers while actuating the touchpad 160 with their thumb. The user could, however, actuate the touchpad 160 in an alternative manner, using a finger other than the thumb, such as a forefinger. In this scenario, the user might lock the support device 26 to fix the position/orientation of the robot 20, which relieves the need to steady the robot by grasping the handle portion 110, thus allowing for forefinger actuation of the touchpad 160.

[0081] As another example, as shown in Fig.13, the user could actuate the touchpad 160 in another alternative manner through a pinching grasp of the touchpad. The user can grasp the touchpad 160 with a pinching gesture using the thumb and index finger while, at the same time, the remaining fingers not involved in the pinching grasp are used to grasp the handle portion 110. In Fig.13, however, for purposes of showing the pinching grasp with clarity, the handle portion 110 is not shown. Optionally, the support device 26 can be locked to fix the position/orientation of the robot 20 while using the pinching grasp.

[0082] As shown in Fig.13, the thumb is positioned on the front of the touchpad 160 and used to input the sliding X-Y control inputs. The index finger is positioned on the rear of the touchpad 160 (e.g., centrally as shown or near the bottom, depending on user preference).
In this manner, the thumb effectuates the axial Z control inputs in the forward/pressing/extension direction (e.g., against the spring bias), and the index finger effectuates the Z control inputs in the rearward/pulling/retraction direction (e.g., with the spring bias).

[0083] It has been found that the pinching grasp works well because the index finger engagement or grip on the rear of the touchpad 160 allows the thumb to work in concert with the index finger to produce the desired sliding thumb movements on the touchpad in a familiar and intuitive manner, as if the thumb is being slid or rubbed across the index finger itself. At the same time, the grasp of the touchpad 160 gives the user a better sense of control of the Z direction because they are able to sense and effectuate, positively, movement of the touchpad in the rearward/pulling/retraction direction. In other words, the pinching grasp allows the user to actively move the touchpad 160 in both Z directions, as opposed to thumb-only actuation, which relies on the spring maintaining contact between the touchpad and the thumb as the thumb moves alone in the rearward/retraction direction without any input or interaction from the index finger.

[0084] As another example, the displacement of the touchpad 160 in the Z direction can be determined in manners different from the force sensor 134 method described in the example configuration. For instance, position sensors such as resistive position sensors, optical position sensors, or magnetic position sensors could be used to sense the displacement of the touchpad 160 in the Z direction and to produce the Z control inputs.

[0085] Further, the various features of the embodiments disclosed herein can be used alone or in varying combinations with each other and are not intended to be limited to the specific combination described herein. Thus, the scope of the claims is not to be limited by the illustrated embodiments.
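To make the non-linear input scaling of paragraph [0078] and the mixed-level mapping of paragraph [0079] concrete, the following single-axis Python sketch implements both piecewise rules. The constants follow the example values given in the description, the 0.85 threshold follows claim 21 below, and the value of α is an assumption, since the disclosure leaves α_i open; the function names are likewise illustrative only.

```python
import numpy as np

def nonlinear_velocity(x, kv=10e-3, x_thresh=0.85, alpha=3.0):
    """Non-linear velocity scaling of paragraph [0078] for one axis.
    alpha = 3.0 is an assumed value; the disclosure leaves alpha_i unspecified."""
    if abs(x) < x_thresh:
        return kv * x
    # Steeper slope beyond the threshold, continuous at |x| = x_thresh.
    return alpha * kv * (x - np.sign(x) * x_thresh) + kv * np.sign(x) * x_thresh

def mixed_level_velocity(x, p_tip, p_ref,
                         k=5.0, kp=3.5e-3, kv=10e-3, x_thresh=0.85):
    """Mixed-level mapping of paragraph [0079] for one axis: low-sensitivity
    position-level control inside the threshold, velocity-level beyond it."""
    if abs(x) < x_thresh:
        return k * (kp * x - (p_tip - p_ref))
    return kv * x

# Example: a small deflection behaves like a position command toward a nearby
# target, while a full deflection (x = 1.0) continues at the maximum velocity.
print(mixed_level_velocity(0.3, p_tip=0.0, p_ref=0.0))   # position-level regime
print(mixed_level_velocity(1.0, p_tip=0.0, p_ref=0.0))   # velocity-level regime
```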

Claims

We claim:

1. A controller for providing three-dimensional control inputs to a robot, comprising: a user interface comprising a touchpad configured to be touched by a finger of the user while grasping the handle, the touchpad being configured to sense a location where the user's finger touches the touchpad and to provide X and Y control inputs comprising X-axis data and Y-axis data associated with the location where the user's finger touches the touchpad; wherein the touchpad is configured to move axially in response to forces applied to the touchpad by the user's finger, wherein the user interface is configured to provide a Z control input comprising Z-axis data associated with an axial position of the touchpad.

2. The controller recited in claim 1, further comprising a handle configured to be grasped by the user while actuating the touchpad with a thumb.

3. The controller recited in claim 2, wherein the touchpad is configured for a pinching grasp between the thumb and an index finger of the user while the handle is grasped by fingers of the user other than the thumb and index finger.

4. The controller recited in claim 1, wherein the touchpad is configured to update the X-axis data and Y-axis data continuously and in real-time so that the associated X and Y control inputs are updated continuously and in real-time.

5. The controller recited in claim 1, wherein the user interface is configured so that the X-axis data corresponds to a horizontal left/right position with respect to the robot, the Y-axis data corresponds to a vertical up/down position with respect to the robot, and the Z-axis data corresponds to a depth-wise in/out position with respect to the robot.

6. The controller recited in claim 1, wherein the touchpad comprises a touch surface configured to receive the user's finger and along which the user's finger can slide to actuate the X and Y control inputs.

7. The controller recited in claim 6, wherein the touchpad is round and the touch surface is concave.

8. The controller recited in claim 6, wherein the touch surface comprises a surface feature configured to provide a tactile confirmation of the location on the touch surface being touched.

9. The controller recited in claim 8, wherein the surface feature is configured to provide tactile indication to the user of at least one of a deadband for the XY input on the touch surface, an outer edge of the touch surface, and a change in scaling of the XY input on the touch surface.

10. The controller recited in claim 1, wherein the touchpad comprises a resistive touchpad or a capacitive touchpad.

11. The controller recited in claim 1, further comprising a biasing member configured to exert a bias on the touchpad to move in a first axial direction toward the user's finger, wherein the biasing member is configured so that the bias on the touchpad can be overcome by the user's finger to move the touchpad in a second axial direction, opposite the first axial direction.

12. The controller recited in claim 11, wherein the biasing member comprises a spring.

13. The controller recited in claim 12, further comprising a force sensor configured to sense a spring force exerted by the spring and provide an output corresponding to the sensed spring force, wherein the controller is configured to determine the Z-axis data in response to the output from the force sensor.
14. The controller recited in claim 1, further comprising a displacement sensor configured to detect the axial position of the touchpad and to provide the Z control input.

15. The controller recited in claim 14, wherein the displacement sensor comprises a force sensor, a resistive sensor, an optical sensor, or a magnetic sensor.

16. A handheld surgical robot comprising: a robotic element; a robot actuator; and the controller as recited in claim 1, wherein the user interface of the controller is configured to provide the X, Y, and Z control inputs to the robot actuator, the robot actuator being configured to actuate the robotic element in response to the X, Y, and Z control inputs to produce corresponding movements at a tip of the robotic element.

17. The surgical robot recited in claim 16, wherein the robot actuator is configured to implement mapping of the X, Y, and Z control inputs from the user interface to produce a 3D velocity-level value, the robot actuator being configured to implement a resolved rates controller configured to convert the velocity-level value to a joint space value.

18. The surgical robot recited in claim 17, wherein the mapping comprises position-level mapping, velocity-level mapping, or mixed position-level and velocity-level mapping.

19. The surgical robot recited in claim 17, wherein the mapping comprises position-level mapping that converts a desired position from the X, Y, and Z inputs from the user interface to a velocity input to the resolved rates controller according to:

vel = k[diag(k_p)·x_in − (p_tip − p_reference)]

where vel is a 3×1 vector of velocity input, k_p is a 3×1 scaling vector, x_in is a 3×1 vector of user interface input, p_tip is a current position of the tip of the robotic element, p_reference is the reference position of the tip of the robotic element, k is a proportionality constant for how fast the velocity input matches the desired position, and the diag() function converts the 3×1 scaling vector to a 3×3 matrix.

20. The surgical robot recited in claim 17, wherein the mapping comprises velocity-level mapping that converts a desired position from the X, Y, and Z inputs from the user interface to a velocity input to the resolved rates controller according to:

vel = diag(k_v)·x_in

where k_v is a 3×1 scaling vector that represents the maximum velocity possible in a single direction, x_in is a 3×1 vector of user interface input, and the diag() function converts the 3×1 scaling vector to a 3×3 matrix.

21. The surgical robot recited in claim 20, wherein the velocity-level mapping further comprises a non-linear input scaling in which the scaling is increased according to:

vel_i = k_v,i·x_i,  if |x_i| < x_thresh,i

vel_i = α_i·k_v,i·(x_i − sign(x_i)·x_thresh,i) + k_v,i·sign(x_i)·x_thresh,i,  if |x_i| ≥ x_thresh,i

where k_v is a 3×1 scaling vector that represents the maximum velocity possible in a single direction, x_i is the user interface input along axis i, x_thresh,i = 0.85 is the threshold at which the non-linear scaling takes place, α_i is the constant used to create a non-linear input profile, and sign() is the signum function that returns the sign of the input as +1, 0, or -1.
22. The surgical robot recited in claim 17, wherein the mapping comprises mixed-level mapping based on a position-level mapping with a low sensitivity according to:

vel_i = k[k_p,i·x_i − (p_tip,i − p_reference,i)],  if |x_i| < x_thresh,i

vel_i = k_v,i·x_i,  if |x_i| ≥ x_thresh,i

where, for each axis i, the velocity vel_i will switch from position-level to velocity-level if the input along that axis, x_i, exceeds a threshold of x_thresh,i.
PCT/US2023/083686 2022-12-12 2023-12-12 Controller with a touchpad user interface for operating robotically actuated devices WO2024129771A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263387050P 2022-12-12 2022-12-12
US63/387,050 2022-12-12

Publications (1)

Publication Number Publication Date
WO2024129771A1 true WO2024129771A1 (en) 2024-06-20

Family

ID=89723199

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/083686 WO2024129771A1 (en) 2022-12-12 2023-12-12 Controller with a touchpad user interface for operating robotically actuated devices

Country Status (1)

Country Link
WO (1) WO2024129771A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040054254A1 (en) * 2002-09-13 2004-03-18 Kiyoshi Miyake Endoscope apparatus
US20190018567A1 (en) * 2017-07-11 2019-01-17 Logitech Europe S.A. Input device for vr/ar applications
GB2606672A (en) * 2018-02-23 2022-11-16 Cmr Surgical Ltd Camera control


Similar Documents

Publication Publication Date Title
US20230301738A1 (en) Master control device and methods therefor
CN110604618B (en) User interface device with clamping link
US20200275985A1 (en) Master control device with multi-finger grip and methods therefor
US20210298855A1 (en) Master control device with finger grip sensing and methods therefor
CA3064408A1 (en) Handle assemblies for robotic surgical systems
US20210330407A1 (en) Surgical robot systems comprising robotic telemanipulators and integrated laparoscopy
EP3903712A1 (en) Surgical robot and method for setting pivot position
CN113194870B (en) User interface device, main control console of surgical robot device, and operation method thereof
JP7427815B2 (en) User interface device with grip links
JP7498444B2 (en) Surgical support robot and surgical support robot system
CN112107368A (en) Surgeon input device for minimally invasive surgery
JP7344927B2 (en) surgical support robot
WO2024129771A1 (en) Controller with a touchpad user interface for operating robotically actuated devices
WO2020209165A1 (en) Surgical operation system and method for controlling surgical operation system
CN113558773B (en) Surgical auxiliary robot
JP7068379B2 (en) Surgery support robot
US12023122B2 (en) Ungrounded master control devices and methods of use
KR20230125797A (en) Physician Input Device for Concentric Canal Surgery Robot
WO2023192465A1 (en) User interface interaction elements with associated degrees of freedom of motion