WO2023171263A1 - Control device and medical robot - Google Patents

Control device and medical robot

Info

Publication number
WO2023171263A1
WO2023171263A1 (PCT/JP2023/005129)
Authority
WO
WIPO (PCT)
Prior art keywords
physical interface
operator
robot arm
control device
robot
Prior art date
Application number
PCT/JP2023/005129
Other languages
French (fr)
Japanese (ja)
Inventor
淳 新井
隆弘 柘植
桐郎 増井
岳夫 稲垣
景 戸松
Original Assignee
ソニーグループ株式会社
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2023171263A1 publication Critical patent/WO2023171263A1/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots

Definitions

  • the present disclosure relates to a control device and a medical robot.
  • Medical robots that autonomously perform operations using surgical instruments are known. For example, in endoscopic surgery, an endoscope that operates autonomously is used to photograph the inside of a patient's abdominal cavity, and the photographed images are displayed on a display. By performing surgery while looking at the captured images displayed on the display, the operator (surgeon) can concentrate both hands on operating the surgical instruments. A hands-free means of operation is therefore required for such medical robots.
  • Patent Document 1 discloses a technique in which a physical interface for operating a medical robot is attached to a surgical instrument.
  • An object of the present disclosure is to provide a control device and a medical robot that can control the operation of a medical robot in a hands-free manner and manage risks in hands-free operation.
  • a control device according to the present disclosure includes a physical interface that can be operated by a surgeon with a body part other than his or her own hand, and controls the operation of a medical robot in accordance with the operation of the surgeon on the physical interface.
  • FIG. 1A is a schematic diagram showing an example of the arrangement of medical robots according to existing technology.
  • FIG. 1B is a schematic diagram showing an example of the arrangement of medical robots according to existing technology.
  • FIG. 1C is a schematic diagram showing an example of the arrangement of medical robots according to existing technology.
  • FIG. 2 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to existing technology.
  • FIG. 3 is a schematic diagram showing an example of a physical interface according to an embodiment.
  • FIG. 4 is a schematic diagram for explaining the relationship between a robot arm and a surgical bed.
  • FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface according to the embodiment.
  • FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to the embodiment.
  • FIG. 7 is a functional block diagram of an example for explaining the functions of the robot arm system according to the embodiment.
  • FIG. 8 is a schematic diagram for explaining an example of effective arrangement of physical interfaces according to the embodiment.
  • FIG. 9 is a schematic diagram for explaining an example of effective arrangement of physical interfaces according to the embodiment.
  • FIG. 10 is a schematic diagram showing an example of operation mode transition of a robot arm according to a first application example of the embodiment.
  • FIG. 11 is a schematic diagram showing an example of operation mode transition of the robot arm according to a second application example of the embodiment.
  • FIG. 12 is a schematic diagram showing an example of an emergency stop button provided on a robot arm.
  • FIG. 13 is a schematic diagram showing an example of the arrangement of physical interfaces according to a first modification of the embodiment.
  • FIG. 14 is a schematic diagram showing an example of a physical interface according to a first example of the first modification of the embodiment.
  • FIG. 15 is a schematic diagram showing an example of a physical interface according to a second example of the first modification of the embodiment.
  • FIG. 16 is a schematic diagram showing an example of a physical interface according to a third example of the first modification of the embodiment.
  • FIG. 17 is a schematic diagram showing an example of the arrangement of physical interfaces according to a second modification of the embodiment.
  • FIG. 18 is a schematic diagram showing an example of a physical interface according to a first example of the second modification of the embodiment.
  • FIG. 19 is a schematic diagram showing an example of a physical interface according to a second example of the second modification of the embodiment.
  • a control device for controlling the operation of a medical robot, particularly a robot arm that assists a surgeon in surgery.
  • a control device includes a physical interface configured to be operable by a surgeon with a body part other than his/her own hand.
  • a physical interface is an interface that has a physical substance and is used to convert an operation by an operator (in this case, a surgeon) into an electrical signal.
  • FIGS. 1A to 1C are schematic diagrams showing examples of the arrangement of medical robots according to existing technology.
  • the medical robot is a robot arm that assists a surgeon during surgery.
  • FIG. 1A is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • FIG. 1B is an overhead view of the state shown in FIG. 1A from diagonally above and behind the operator 102.
  • FIG. 1C is an overhead view of the state shown in FIG. 1A from the opposite side of FIG. 1B.
  • the surgical bed 100 is provided with bed rails 110 on the side.
  • bed rails 110 are provided on both sides of the surgical bed 100 for each movable region.
  • the surgical bed 100 is held at a predetermined height from the floor by a pedestal 140.
  • the robot arm 120 includes a plurality of joints and an arm that connects the joints. By driving a plurality of joints in a predetermined manner, the robot arm 120 can freely change its posture within the movable range of each joint.
  • the robot arm 120 is installed on a trolley 130 and is used as a floor-standing device.
  • the surgical bed 100 is capable of changing the inclination for each movable region
  • the robot arm 120 operates independently of the inclination of each movable region of the surgical bed 100.
  • FIG. 2 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000 according to existing technology.
  • FIG. 2 shows a surgeon 102 performing surgery on a patient 101 on a surgical bed 100 using an endoscopic surgery system 5000. Further, in FIG. 2, the surgical bed 100 is shown as seen from the feet or head side of the patient 101, and bed rails 110 are shown on both sides thereof. For example, instruments used in surgery are attached to the bed rail 110 using clamps or the like.
  • an endoscopic surgery system 5000 includes an endoscope 5001, other surgical instruments 5017, a robot arm 120 that is a medical robot supporting the endoscope 5001, and a rack 5037 containing various devices for endoscopic surgery.
  • the rack 5037 is mounted on the trolley 130.
  • in endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, tubular opening instruments called trocars 5025a to 5025d are punctured into the abdominal wall. Then, the lens barrel 5003 of the endoscope 5001 and the other surgical instruments 5017 are inserted into the body cavity of the patient 101 through the trocars 5025a to 5025d.
  • a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 101.
  • the energy treatment tool 5021 is a treatment tool that performs incision and exfoliation of tissue, sealing of blood vessels, etc. using high frequency current or ultrasonic vibration.
  • the surgical tool 5017 shown in FIG. 2 is just an example, and various surgical tools commonly used in endoscopic surgery, such as a lever or a retractor, may be used as the surgical tool 5017.
  • An image of the surgical site inside the body cavity of the patient 101 taken by the endoscope 5001 is displayed on the display device 5041.
  • the surgeon 102 uses the energy treatment instrument 5021 and forceps 5023 to perform a treatment such as cutting off the affected area while viewing the image of the surgical site displayed on the display device 5041 in real time.
  • the pneumoperitoneum tube 5019, the energy treatment instrument 5021, and the forceps 5023 are supported by the operator 102, an assistant, or the like during the surgery.
  • Robot arm 120 includes an arm portion 5031 extending from base portion 5029.
  • the arm portion 5031 includes joint portions 5033a, 5033b, 5033c, and links 5035a, 5035b, and is driven by control from an arm control device 5045.
  • Endoscope 5001 is supported by arm portion 5031, and its position and/or posture is controlled. Thereby, the endoscope 5001 can be stably fixed in position.
  • the arm control device 5045 is capable of autonomously controlling the operation of the robot arm 120, for example, based on a model learned by machine learning.
  • the position of the endoscope indicates the position of the endoscope in space, and can be expressed as three-dimensional coordinates such as coordinates (x, y, z), for example.
  • the posture of the endoscope indicates the direction in which the endoscope faces, and can be expressed as a three-dimensional vector, for example.
  • the endoscope 5001 will be briefly described.
  • the endoscope 5001 includes a lens barrel 5003 whose distal end is inserted into the body cavity of the patient 101 over a predetermined length, and a camera head 5005 connected to the proximal end of the lens barrel 5003.
  • in the illustrated example, the endoscope 5001 is configured as a so-called rigid scope having a rigid lens barrel 5003, but the endoscope 5001 may instead be configured as a so-called flexible scope having a flexible lens barrel 5003.
  • An opening into which an objective lens is fitted is provided at the tip of the lens barrel 5003.
  • a light source device 5043 mounted on the rack 5037 is connected to the endoscope 5001, and light generated by the light source device 5043 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 5003, and is irradiated toward the observation target in the body cavity of the patient 101 via the objective lens.
  • the endoscope 5001 may be a forward-viewing scope, an oblique-viewing scope, or a side-viewing scope.
  • An optical system and an image sensor are provided inside the camera head 5005, and reflected light (observation light) from an observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU) 5039.
  • the camera head 5005 is equipped with a function of adjusting magnification and focal length by appropriately driving its optical system.
  • the camera head 5005 may be provided with a plurality of image sensors, for example, in order to support stereoscopic viewing (3D display).
  • a plurality of relay optical systems are provided inside the lens barrel 5003 in order to guide observation light to each of the plurality of image sensors.
  • the rack 5037 includes a CCU 5039, a light source device 5043, an arm control device 5045, an input device 5047, a treatment tool control device 5049, a pneumoperitoneum device 5051, a recorder 5053, and a printer 5055.
  • the CCU 5039 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 performs various image processing, such as development processing (demosaic processing), on the image signal received from the camera head 5005 in order to display an image based on the image signal. The CCU 5039 provides the image signal subjected to the image processing to the display device 5041. Further, the CCU 5039 transmits a control signal to the camera head 5005 to control its driving.
  • the control signal may include information regarding imaging conditions such as magnification and focal length.
  • the display device 5041 displays an image based on an image signal subjected to image processing by the CCU 5039, under control from the CCU 5039. If the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels), and/or 3D display, a display device 5041 capable of the corresponding high-resolution display and/or 3D display may be used. For high-resolution imaging such as 4K or 8K, a more immersive feeling can be obtained by using a display device 5041 with a size of 55 inches or more. Furthermore, a plurality of display devices 5041 having different resolutions and sizes may be provided depending on the purpose.
  • the light source device 5043 includes a light emitting element such as an LED (light emitting diode) and a drive circuit for driving the light emitting element, and supplies irradiation light to the endoscope 5001 when photographing the surgical site.
  • the arm control device 5045 includes a processor such as a CPU, and operates according to a predetermined program to control the drive of the arm portion 5031 of the robot arm 120 according to a predetermined control method.
  • the input device 5047 is an input interface for the endoscopic surgery system 5000.
  • the user can input various information and instructions to the endoscopic surgery system 5000 via the input device 5047.
  • the user inputs various information regarding the surgery, such as patient's physical information and information about the surgical technique, via the input device 5047.
  • the user may issue an instruction to drive the arm portion 5031 or an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 5001 via the input device 5047.
  • further, instructions such as an instruction to drive the energy treatment instrument 5021 are input via the input device 5047.
  • the type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices.
  • input devices such as a mouse, keyboard, touch panel, switch, lever, joystick, etc. can be applied.
  • as the input device 5047, it is also possible to use a mixture of multiple types of input devices.
  • a foot switch 5057 that is placed at the feet of an operator (for example, the operator 102) and is operated by the feet of the operator can also be applied as the input device 5047.
  • the touch panel may be provided on the display surface of the display device 5041.
  • the input device 5047 is not limited to the above example.
  • the input device 5047 can be a device worn by the user, such as a glasses-type wearable device or a head mounted display (HMD).
  • the input device 5047 can perform various inputs according to the user's gestures and line of sight detected by the devices worn by these users.
  • the input device 5047 can include a camera that can detect the user's movements. In this case, the input device 5047 can perform various inputs according to the user's gestures and line of sight detected from the video captured by the camera. Furthermore, the input device 5047 can include a microphone that can pick up the user's voice. In this case, the input device 5047 can perform voice recognition based on the voice picked up by the microphone, analyze the voice of the speaker (for example, the surgeon 102), and input various operations using voice.
  • since the input device 5047 is configured to be able to input various information without contact, a user (for example, the operator 102) who belongs to a clean area can operate equipment belonging to an unclean area without contact. Further, since the user can operate the device without taking his or her hand off the surgical tool being held, the user's convenience is improved.
  • the treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for cauterizing tissue, incising, sealing blood vessels, etc.
  • the pneumoperitoneum device 5051 injects gas into the body cavity of the patient 101 through the pneumoperitoneum tube 5019 to inflate the body cavity, in order to secure a field of view for the endoscope 5001 and a working space for the operator 102.
  • the recorder 5053 is a device that can record various information regarding surgery.
  • the printer 5055 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • in addition to the existing foot switch 5057 operated by the operator 102 with his or her foot, a further foot pedal could be used as a means of switching the operation mode of the robot arm 120. However, adding such a pedal may reduce the operability of the robot arm 120 and of medical equipment other than the robot arm 120.
  • a control device for controlling the robot arm 120 according to the present disclosure includes a physical interface that can be operated by a surgeon with a body part other than his or her own hand.
  • the operator can control the operation of the medical robot hands-free at any time, and risk management during hands-free operation of the medical robot becomes possible. Further, by using the control device according to the present disclosure, it is possible to eliminate the need for physical interface replacement work that accompanies replacement of surgical tools during surgery.
  • FIG. 3 is a schematic diagram showing an example of a physical interface according to the embodiment.
  • the physical interface 10 according to the embodiment is removably attached to the bed rail 110 of the surgical bed 100 by, for example, a clamp (not shown), and is configured as a switch that is activated by being pressed in the direction shown in the figure.
  • the output of the physical interface 10 is transmitted to the arm control device 5045 via the cable 11, for example.
  • Arm control device 5045 can control the operation of the robot arm 120 in response to signals transmitted from the physical interface 10.
  • the physical interface 10 is configured to be attachable at any position on the bed rail 110 of the surgical bed 100.
  • the physical interface 10 is removably attached to the bed rail 110 using a clamp or the like.
  • the physical interface 10 is preferably attached to the bed rail 110 of the surgical bed 100 at a position where it does not interfere with other equipment.
  • a plurality of physical interfaces 10 may be installed for one robot arm 120.
  • each of the plurality of physical interfaces 10 may instruct one robot arm 120 to perform the same operation.
  • thereby, control via the physical interface 10 can be easily realized.
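  • As an illustration of this multiple-interface arrangement, the following is a minimal Python sketch, not taken from the disclosure, of how the outputs of several physical interfaces 10 could be combined so that any one of them instructs the same operation on a single robot arm 120; the names SwitchState and combine_switch_signals are hypothetical.

```python
# Illustrative sketch: combining several physical-interface switch signals so
# that any one of them triggers the same operation on a single robot arm.
# All names (SwitchState, combine_switch_signals) are hypothetical.
from dataclasses import dataclass

@dataclass
class SwitchState:
    interface_id: str
    pressed: bool

def combine_switch_signals(states: list[SwitchState]) -> bool:
    """Logical OR: the shared operation is requested if any interface is on."""
    return any(s.pressed for s in states)

if __name__ == "__main__":
    states = [SwitchState("10a", False), SwitchState("10b", True)]
    print(combine_switch_signals(states))  # True: interface 10b is pressed
```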
  • the physical interface 10 is attached to the surgical bed 100 (bed rail 110) at a position where the surgeon 102 can operate it with a body part other than his or her own hand, more specifically, at a position below the waist 1023 of the surgeon 102.
  • Section (b) of FIG. 3 shows an example of the mounting position of the physical interface 10 according to the embodiment. In the example of section (b) in FIG. 3, the physical interface 10 is attached at a position where it can be operated with a body part from the knee 1021 to the thigh 1022.
  • the surgeon 102 can thereby operate the physical interface 10, in a manner different from the foot switch 5057, without using his or her hands, that is, while handling the surgical instrument with both hands. Therefore, the surgeon 102 can smoothly operate the robot arm 120 even when both hands are occupied with operating the surgical instrument. That is, the surgeon 102 can smoothly operate the robot arm 120 hands-free.
  • the operability for the operator 102 is improved.
  • by attaching the physical interface 10 to the bed rail 110, there is no need to attach or detach the physical interface 10 when replacing a surgical instrument, as would occur if the physical interface 10 were attached to a surgical instrument or to the hand. Furthermore, since the physical interface 10 is attached to the bed rail 110, it can be kept cleanly separated at low cost by placing it inside an existing covering cloth or covering it with a simple drape, which is economical.
  • FIG. 4 is a schematic diagram for explaining the relationship between the robot arm 120 and the surgical bed 100.
  • the robot arm 120 has a floor-standing configuration installed on a trolley 130, and operates independently of the surgical bed 100.
  • the angle of a part of the surgical bed 100 may be changed depending on, for example, the condition of the surgery. Therefore, if the inclination of the surgical bed 100 is changed during surgery while the lens barrel 5003 of the endoscope 5001 is inserted into the body cavity of the patient 101 by the robot arm 120, the lens barrel 5003 will move undesirably within the body cavity.
  • the physical interface 10 may include a tilt sensor that detects the inclination of the physical interface 10. Tilt information indicating the tilt detected by the tilt sensor is transmitted to the arm control device 5045 via the cable 11.
  • the physical interface 10 since the physical interface 10 according to the embodiment is attached to the bed rail 110 of the surgical bed 100, it is possible to detect the inclination of the surgical bed 100 at the attached position.
  • the physical interface 10 is attached to the surgical bed 100 at a position corresponding to the affected area to be operated on, and by controlling the operation of the robot arm 120 according to the output of the tilt sensor included in the physical interface 10, the robot arm 120 can be made to follow the inclination of the surgical bed 100.
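  • The following is a minimal Python sketch, under a simplified 2D assumption, of how an arm's target point could be made to follow a measured bed inclination by rotating it about the bed's tilt pivot; the function name and coordinate convention are illustrative assumptions, not the disclosed control method.

```python
# Illustrative sketch: making a floor-standing arm follow the bed's inclination
# using the tilt reported by a sensor in the bed-rail-mounted interface.
# The 2D simplification and all names are assumptions for illustration.
import math

def follow_bed_tilt(target_xz: tuple[float, float],
                    pivot_xz: tuple[float, float],
                    tilt_deg: float) -> tuple[float, float]:
    """Rotate the arm's target point about the bed's tilt pivot by the
    measured bed inclination, so the tool keeps its pose relative to the bed."""
    t = math.radians(tilt_deg)
    dx = target_xz[0] - pivot_xz[0]
    dz = target_xz[1] - pivot_xz[1]
    return (pivot_xz[0] + dx * math.cos(t) - dz * math.sin(t),
            pivot_xz[1] + dx * math.sin(t) + dz * math.cos(t))

# e.g. a bed section tilted by 10 degrees: recompute the tool target accordingly
print(follow_bed_tilt((0.30, 0.10), (0.0, 0.0), 10.0))
```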
  • the physical interface 10 is provided with a light emitting section so as to have a lighting function.
  • FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface 10 according to the embodiment.
  • a patient 101 is covered with a cloth 20 except for the affected area, and a surgeon 102 is shown handling surgical tools with his left hand 102hL and right hand 102hR and performing surgery.
  • the bed rail 110 is covered with a covering cloth 20.
  • the physical interface 10 attached to the bed rail 110 (not shown) is located inside the covering cloth 20.
  • the physical interface 10 is covered with a transparent drape to keep itself clean. Therefore, it may be difficult for the operator 102 to directly visually recognize the physical interface 10. Therefore, the physical interface 10 according to the embodiment is provided with a light emitting section 12 to have an illumination function, and the light emitting section 12 is made to emit light at all times during surgery, for example.
  • the light emitting unit 12 is provided at a position where the light emitted from the physical interface 10 can be easily recognized by the operator 102 when the physical interface 10 is attached to the bed rail 110, for example. Further, it is preferable that the light emitting unit 12 emits light with such intensity that the emitted light passes through the covering cloth 20 or the drape to some extent and is easily recognized by the operator 102. Note that the intensity of the light emitted by the light emitting unit 12 is preferably such that the operator 102 does not feel glare when passing through the covering cloth 20 or drape.
  • FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000a according to the embodiment.
  • an endoscopic surgery system 5000a has a physical interface 10 added to the endoscopic surgery system 5000 according to the existing technology described using FIG. 2.
  • the physical interface 10 is removably attached to the bed rail 110, as described above.
  • the physical interface 10 is wired via the cable 11 to an arm control device 5045a corresponding to the arm control device 5045 in FIG. 2.
  • the physical interface 10 can also be connected to the arm control device 5045a by wireless communication without using the cable 11. However, considering the response to the operation of the physical interface 10 and risks such as communication errors, it is preferable that the physical interface 10 is connected to the arm control device 5045a by wire.
  • FIG. 7 is an example functional block diagram for explaining the functions of the robot arm system according to the embodiment.
  • the robot arm system 50 includes a robot arm control section 500 and a robot arm 120.
  • Robot arm control section 500 is included in the arm control device 5045a shown in FIG. 6.
  • the physical interface (I/F) 10 includes a switch (SW) section 13, a tilt sensor 14, and a light emitting section 12.
  • the switch unit 13 may have a configuration including, for example, a microswitch and an output circuit that outputs a signal according to an operation on the microswitch.
  • a signal output from the output circuit (referred to as a switch signal) is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control section 500.
  • the switch signal may be, for example, a signal that simply indicates on and off. Further, for example, the switch signal may be a signal that combines a synchronization pattern and a pattern indicating on and off.
  • the physical interface 10 is not limited to this, and may simply be a switch that opens/closes a circuit in response to an operation. In this case, for example, the robot arm control unit 500 needs to constantly supply signals to the physical interface 10.
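  • The following hypothetical Python sketch illustrates the framed signal format mentioned above (a synchronization pattern combined with an on/off flag); the byte values are assumptions. One benefit of such framing is that the controller can distinguish a switched-off interface from a disconnected or noisy line.

```python
# Illustrative sketch of a framed switch signal combining a fixed
# synchronization pattern with an on/off flag, as one possible format.
SYNC = b"\xAA\x55"  # hypothetical sync pattern

def encode_switch_frame(on: bool) -> bytes:
    return SYNC + (b"\x01" if on else b"\x00")

def decode_switch_frame(frame: bytes) -> bool | None:
    """Return the switch state, or None if the sync pattern is invalid
    (e.g. a broken cable or noise), which the controller can treat as a fault."""
    if not frame.startswith(SYNC) or len(frame) != 3:
        return None
    return frame[2] == 0x01

assert decode_switch_frame(encode_switch_frame(True)) is True
assert decode_switch_frame(b"\x00\x00\x00") is None
```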
  • the tilt sensor 14 detects the tilt of the physical interface 10 and outputs tilt information indicating the detected tilt. For example, the tilt sensor 14 detects the angle of the physical interface 10 with respect to the direction of gravity, and outputs the detected angle as tilt information. As the tilt sensor 14, for example, a gyro sensor that detects tilt based on angular acceleration of three axes can be used.
  • the tilt information output from the tilt sensor 14 is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control unit 500.
  • the light emitting unit 12 can use an LED (Light Emitting Diode) as a light emitting element.
  • the light emitting unit 12 may be controlled to turn its light emission on and off by, for example, a switch provided on the physical interface 10.
  • the present invention is not limited to this, and the robot arm control unit 500 may control on/off of the light emission of the light emitting unit 12.
  • the light emitting element applied to the light emitting unit 12 is not limited to an LED as long as it can achieve the purpose of being easily recognized by the operator 102 through the covering cloth 20 or drape.
  • the light emitting section 12 may have a dimming function.
  • the instruction recognizer 60 includes, for example, a voice recognizer, analyzes a voice signal based on the voice picked up by the microphone 61, and obtains instruction information indicating a voice instruction.
  • the instruction recognizer 60 can recognize, for example, an utterance by the surgeon 102 to control the operation of the robot arm 120, and can acquire instruction information for instructing the operation of the robot arm 120.
  • the instruction recognizer 60 may include a line of sight recognizer, and may recognize the line of sight (for example, the direction of the eyeball) based on the image captured by the camera 62 and acquire instruction information indicating instructions based on the line of sight.
  • the instruction recognizer 60 can recognize, for example, the line of sight of the operator 102 to control the operation of the robot arm 120, and can acquire instruction information for controlling the robot arm 120.
  • the instruction recognizer 60 may have both a voice recognizer and a line of sight recognizer, or may have either one of them.
  • the robot arm 120 includes a joint section 121 and a drive control section 122.
  • the joint section 121 includes a joint information detection section 1210 and a joint section drive section 1211.
  • Drive control section 122 generates a drive control signal for driving joint section 121 based on command value information supplied from command value generation section 513, which will be described later.
  • the joint drive unit 1211 drives the joint 121 according to a drive control signal generated by the drive control unit 122.
  • the joint information detection section 1210 detects the state of the joint section 121 using a sensor or the like and acquires joint information. Joint information detection section 1210 passes the acquired joint information to state acquisition section 510 and command value generation section 513, which will be described later.
  • the robot arm 120 is shown to include one joint 121 for the sake of explanation, but in reality, the robot arm 120 includes a plurality of joints 121.
  • the robot arm control section 500 includes a state acquisition section 510, a calculation condition determination section 511, a force calculation section 512, and a command value generation section 513.
  • the state acquisition unit 510 acquires the switch signal output from the switch unit 13 of the physical interface 10 and the tilt information output from the tilt sensor 14. Further, the state acquisition section 510 acquires each piece of joint information output from the joint information detection section 1210 of each joint section 121. The state acquisition unit 510 passes the acquired switch signal, tilt information, and each piece of joint information to the calculation condition determination unit 511.
  • the calculation condition determination unit 511 acquires the switch signal, tilt information, and each piece of joint information passed from the state acquisition unit 510, and also acquires the instruction information output from the instruction recognizer 60. The calculation condition determining unit 511 determines how the robot arm 120 should behave based on each piece of information and each signal acquired. The calculation condition determination unit 511 passes information indicating the determined behavior of the robot arm 120 to the force calculation unit 512.
  • the force calculation unit 512 has a model regarding the movement of the robot arm 120, which has been learned, for example, by machine learning.
  • the force calculation unit 512 applies information indicating the behavior of the robot arm 120 passed from the calculation condition determination unit 511 to the model, and predicts the movement of the robot arm 120.
  • the force calculation section 512 passes robot motion information indicating the predicted motion of the robot arm 120 to the command value generation section 513.
  • the command value generation unit 513 is further provided with joint information from each joint 121 in the robot arm 120.
  • the command value generation unit 513 generates a command value for instructing the drive of each joint 121 of the robot arm 120 based on the joint information passed from each joint 121 and the robot motion information passed from the force calculation unit 512.
  • the command value generation unit 513 passes each generated command value to the drive control unit 122.
  • the drive control section 122 generates each drive control signal for driving each joint section 121 according to each command value passed from the command value generation section 513.
  • the robot arm 120 executes a predetermined operation by driving each joint 121 according to each drive control signal generated by the drive control unit 122.
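  • The following Python skeleton summarizes the control flow described above, from state acquisition through calculation-condition determination, motion prediction, and command value generation; it is illustrative only, all function bodies are stubs, and none of it represents the disclosed algorithms.

```python
# Illustrative skeleton of the described control flow: state acquisition,
# calculation-condition determination, force (motion) calculation, and command
# value generation feeding the joint drive control. Function bodies are stubs.
def state_acquisition(switch_signal, tilt_info, joint_info):
    return {"switch": switch_signal, "tilt": tilt_info, "joints": joint_info}

def determine_behavior(state, instruction_info):
    # decide how the arm should behave from the switch, tilt, joints, and the
    # recognized instruction (voice / line of sight)
    return {"behavior": instruction_info if state["switch"] else "hold"}

def predict_motion(behavior):
    # stand-in for the learned model that predicts the arm's movement
    return {"motion": behavior["behavior"]}

def generate_command_values(motion, joint_info):
    # one command value per joint; here simply echoed for illustration
    return [{"joint": j, "cmd": motion["motion"]} for j in joint_info]

state = state_acquisition(True, {"angle_deg": 5.0}, ["5033a", "5033b", "5033c"])
behavior = determine_behavior(state, "move_to_view")
commands = generate_command_values(predict_motion(behavior), state["joints"])
print(commands)
```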
  • in this way, a control device according to the embodiment includes the physical interface 10 and the robot arm control unit 500 that controls the operation of the robot arm 120 according to the output of the physical interface 10, and is configured to control the operation of the medical robot.
  • FIG. 8 is a schematic diagram showing an example of arrangement when one physical interface 10 according to the embodiment is arranged.
  • Section (a) of FIG. 8 is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • section (b) in the same figure is an overhead view of the state of section (a) in the same figure from diagonally above and behind the operator 102.
  • when one physical interface 10 is arranged, the physical interface 10 is arranged on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the surgeon 102.
  • when the operator 102 performs surgery on the patient 101 lying on the surgical bed 100, the operator 102 operates the physical interface 10 with the lower back or a body part below the waist; the physical interface 10 is therefore located at a position corresponding to the body part used for the operation.
  • the physical interface 10 is placed at a position corresponding to the thigh or knee of the left leg of the operator 102. Since the physical interface 10 is detachably attached to the bed rail 110, the arrangement position can be easily adjusted.
  • FIG. 9 is a schematic diagram showing an arrangement example when a plurality of physical interfaces 10 are arranged.
  • in this example, two physical interfaces 10a and 10b are used; the physical interface 10a is operated by the surgeon 102, and the physical interface 10b is operated by the assistant 103 who assists the surgeon 102 in the surgery.
  • the robot arm system 50 performs the same control on each output of these physical interfaces 10a and 10b.
  • Section (a) of FIG. 9 is a top view of a patient 101 lying on a surgical bed 100 and an operator 102 standing next to the surgical bed 100 for a surgical procedure.
  • the assistant 103 stands in a position facing the surgeon 102 with the surgical bed 100 in between.
  • Section (b) in the same figure is an overhead view of the state of section (a) in the same figure from diagonally above and behind the assistant 103.
  • the arrangement position of the physical interface 10a operated by the surgeon 102 is the same as the position described using FIG. 8. Furthermore, the position of the physical interface 10b operated by the assistant 103 is basically the same as the positional relationship between the surgeon 102 and the physical interface 10a. Specifically, the physical interface 10b may be placed on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the assistant 103.
  • the first application example is an example in which control of the robot arm 120 is enabled or disabled according to an instruction recognized by the instruction recognizer 60 by operating the physical interface 10 according to the embodiment.
  • the instruction recognizer 60 acquires instruction information corresponding to the sound picked up by the microphone 61.
  • FIG. 10 is a schematic diagram showing an example of the operation mode transition of the robot arm 120 according to the first application example of the embodiment.
  • the voice recognition state 200 includes an autonomous control mode 211 that instructs autonomous control of the robot arm 120, and a UI operation mode 212 in which the robot arm 120 is manually operated using, for example, a user interface (UI) provided by the input device 5047.
  • in response to the physical interface (IF) operation 220, the calculation condition determination unit 511 determines, based on the switch signal output from the physical interface 10 and passed from the state acquisition unit 510, whether control of the robot arm 120 according to the voice recognition function of the instruction recognizer 60 is enabled or disabled.
  • if the calculation condition determining unit 511 determines that the control is enabled, it sets the voice recognition state 200 to a voice reception start state 210 in which voice reception is started. In the voice reception start state 210, the calculation condition determining unit 511 switches the operation mode of the robot arm 120 between the autonomous control mode 211 and the UI operation mode 212 according to the voice recognized by the instruction recognizer 60.
  • if the calculation condition determination unit 511 determines that the control is disabled, it stops the control and transitions the operation mode of the robot arm 120 to the arm direct operation mode 230 in response to an operation on the arm direct operation button 231.
  • the arm direct operation button 231 is, for example, an operator provided on the robot arm 120 to operate the robot arm 120 without being controlled by the arm control device 5045a.
  • enabling and disabling control of the robot arm 120 according to the voice recognition function is controlled according to a predetermined activation word uttered by the operator 102 or the like (for example, utterance of "start voice").
  • in the first application example, the utterance of the activation word by the operator 102 or the like is replaced by an operation of the physical interface 10. Therefore, it is not necessary to speak the activation word, and operability can be improved. That is, according to the first application example of the embodiment, the success rate of activation related to voice control of the operation of the robot arm 120 can be made 100%.
  • the calculation condition determining unit 511 may also control whether the operation on the physical interface 10 is treated as a momentary type, in which the state is on only while the operation (for example, pressing) is being performed, or an alternate type, in which the state is toggled between on and off each time the operation is performed.
  • in the momentary case, the calculation condition determining unit 511 enables control according to the voice recognition function of the instruction recognizer 60 while the operator 102 is pressing the switch unit 13 of the physical interface 10, for example, and controls the voice recognition state 200 to the voice reception start state 210. If the switch unit 13 of the physical interface 10 is not pressed, the calculation condition determining unit 511 disables and stops the control according to the voice recognition function, and changes the operation mode of the robot arm 120 to the arm direct operation mode 230.
  • in the alternate case, the calculation condition determination unit 511 switches between enabling and disabling the control according to the voice recognition function every time the surgeon 102 presses the switch unit 13 of the physical interface 10, for example.
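  • The following Python sketch illustrates the momentary and alternate interpretations described above; the class name VoiceControlGate is a hypothetical stand-in for logic inside the calculation condition determining unit 511, not the disclosed implementation.

```python
# Illustrative sketch of the two switch interpretations described above:
# momentary (enabled only while pressed) and alternate (toggled per press).
class VoiceControlGate:
    def __init__(self, mode: str = "momentary"):
        assert mode in ("momentary", "alternate")
        self.mode = mode
        self.enabled = False

    def on_switch_event(self, pressed: bool) -> None:
        if self.mode == "momentary":
            self.enabled = pressed            # on only while held
        elif pressed:                         # alternate: toggle on each press
            self.enabled = not self.enabled

gate = VoiceControlGate("alternate")
gate.on_switch_event(True)   # press -> voice control enabled
gate.on_switch_event(False)  # release -> still enabled
gate.on_switch_event(True)   # press again -> disabled
print(gate.enabled)  # False
```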
  • the second application example is an example in which, by operating the physical interface 10 according to the embodiment, the instruction recognized by the instruction recognizer 60 is ignored and the robot arm 120 can be manually operated. As in the first application example described above, it is assumed that the instruction recognizer 60 acquires instruction information according to the sound picked up by the microphone 61.
  • FIG. 11 is a schematic diagram showing an example of the operation mode transition of the robot arm 120 according to the second application example of the embodiment.
  • in the second application example, the calculation condition determining unit 511 transitions the voice recognition state 200 shown in FIG. 11 as follows. Once the calculation condition determining unit 511 enters the voice reception start state 210 in the voice recognition state 200, it maintains the voice reception start state 210 regardless of whether or not there is any operation on the physical interface 10. Further, the calculation condition determination unit 511 cancels the voice reception start state 210 in response to a predetermined end word 240b (for example, utterance of "end of voice"), stops the control according to the voice recognition function, and changes the operation mode of the robot arm 120 to the arm direct operation mode 230.
  • a stop button 250 is provided as a hardware (HW) button.
  • the stop button 250 is an operator for stopping the operation of the robot arm 120 and transitioning the operation mode to the arm direct operation mode 230 in accordance with the operation.
  • This stop button 250 functions as risk management for the operation of the robot arm 120 in the case where a problem occurs in voice recognition.
  • the physical interface 10 is made to function as this stop button 250.
  • for example, suppose that the instruction recognizer 60 fails to recognize the voice of the operator 102 instructing "stop the robot", and the robot arm 120 does not stop in response to the voice.
  • the surgeon 102 can stop the operation of the robot arm 120 by operating the physical interface 10. Therefore, according to the second application example of the embodiment, the voice recognition function can be enabled at all times with risk management in place, and operability can be improved.
  • the operator 102 will want to stop the operation of the robot arm 120 with as small a delay as possible. However, when the stop of the robot arm 120 is controlled in response to voice recognition, the instruction recognizer 60 may require a longer time than usual for the voice recognition processing.
  • in contrast, the switch signal output from the physical interface 10 is a signal that simply indicates on and off. Therefore, according to the second application example of the embodiment, the process of stopping the robot arm 120 can be executed faster than when the voice recognition function is used.
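  • As a hypothetical sketch of this risk-management path, the following Python fragment checks the on/off switch signal before any voice recognition result, so a stop request never waits on recognition latency; the function name and command strings are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: the on/off switch signal is checked before any voice
# result, so a stop request never waits on speech-recognition latency.
def control_step(switch_pressed: bool, voice_command: str | None) -> str:
    if switch_pressed:
        return "STOP"                    # immediate, no recognition needed
    if voice_command == "stop the robot":
        return "STOP"                    # voice path, subject to recognition delay
    return voice_command or "CONTINUE"

print(control_step(True, None))                  # STOP via physical interface
print(control_step(False, "move camera left"))   # voice command passes through
```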
  • the third application example is an example in which the physical interface 10 according to the embodiment is used as an emergency stop button to emergency stop the robot arm 120.
  • the robot arm 120 is generally provided with an emergency stop button on its main body to completely stop its operation.
  • FIG. 12 is a schematic diagram showing an example of the emergency stop button 123 provided on the robot arm 120.
  • the emergency stop button 123 is provided directly to the robot arm 120. For example, when the operator 102 detects an abnormality in the operation of the robot arm 120, by operating the emergency stop button 123, the operator 102 can stop the operation of the robot arm 120 with an extremely small delay.
  • Section (b) of FIG. 12 schematically shows an example in which the emergency stop button 123 is placed at a position far from the operator 102.
  • the robot arm 120 is covered with a vinyl cover 21, and the emergency stop button 123 is provided at the middle portion of the robot arm 120.
  • in order to operate the emergency stop button 123, the operator 102 needs to extend his or her left hand 102hL from the operator's position across the upper part of the patient 101.
  • in this case, the operator 102 must interrupt the operation of the surgical instrument to operate the emergency stop button 123. Moreover, when extending the hand, the vinyl cover 21 may get in the way. Therefore, a delay may occur from the time when it becomes necessary to stop the operation of the robot arm 120 until the emergency stop button 123 is actually operated. Further, in some cases, an assistant or staff member other than the surgeon 102 may be forced to operate the emergency stop button 123.
  • by using the physical interface 10 as an emergency stop button, the operator 102 can stop the operation of the robot arm 120 without releasing the surgical instrument, even when both hands are occupied with operating it. That is, in the third application example of the embodiment, the surgeon 102 can stop the operation of the robot arm 120 hands-free.
  • FIG. 13 is a schematic diagram showing an example of the arrangement of physical interfaces according to the first modification of the embodiment.
  • a physical interface 10c according to a first modification of the embodiment is placed, for example, at the feet of a surgeon 102. It is preferable that the physical interface 10c be placed in a position where it does not interfere with the foot switch 5057 or the like.
  • the physical interface 10c has a configuration for detecting an operation using light. For example, whether or not the foot or the like of the operator 102 is inserted into the physical interface 10c may be determined based on distance measurement using reflection of light, or based on the detection result of whether or not the light is blocked.
  • by using the physical interface 10c, which detects operations using light, the physical interface 10c can be operated without contact and with an operation method different from that of the foot switch 5057. Furthermore, by installing the physical interface 10c at the feet of the surgeon 102 or the like, the physical interface 10c can be operated even when both hands of the surgeon 102 are occupied with operating the surgical instrument. That is, in the first modification of the embodiment, the surgeon 102 can operate the physical interface 10c hands-free to control the operation of the robot arm 120.
  • FIG. 14 is a schematic diagram showing an example of a physical interface 10c-1 according to a first example of a first modification of the embodiment.
  • the physical interface 10c-1 has, for example, a structure in which one of each surface of a rectangular parallelepiped is an opening.
  • a distance measuring device 16 is provided on a surface 15, which is, for example, the top surface of an opening in a rectangular parallelepiped.
  • the distance measuring device 16 includes, for example, a light emitting section and a light receiving section, and the light emitting section irradiates the inside of the opening with light.
  • the distance measuring device 16 measures the distance from the distance measuring device 16 to an object based on the timing at which the emitted light 160 is emitted from the light emitting section and the timing at which the reflected light 161 from the object (not shown) is received by the light receiving section.
  • the distance measuring device 16 determines whether or not there is an operation on the physical interface 10c-1 based on the distance measurement result. The determination result by the distance measuring device 16 is transmitted to the arm control device 5045a via the cable 11.
  • the distance measured when nothing is inserted into the opening of the physical interface 10c-1 is set as the initial value. If distance measurement is performed with, for example, the foot (toe) of the operator 102 inserted into the opening, the emitted light 160 is reflected by the inserted foot, so a distance shorter than the initial value is obtained as the distance measurement result.
  • the state acquisition unit 510 can detect an operation on the physical interface 10c-1 based on the distance measurement result.
  • the physical interface 10c-1 is shown as having a rectangular parallelepiped shape, but it is not limited to this shape.
  • in the above, the operation on the physical interface 10c-1 is detected by distance measurement using light, but detection is not limited to this example.
  • a light emitting section is provided on the surface 15, and a light receiving section that receives light emitted from the light emitting section is provided on a surface opposite to the surface 15 at a position corresponding to the light emitting section.
  • the status acquisition unit 510 may detect an operation on the physical interface 10c-1 based on whether the light emitted from the light emitting unit is received by the light receiving unit.
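  • The following Python sketch illustrates the reflection-based detection described above, assuming a baseline distance captured with the opening empty and a noise margin; the class name and numeric values are illustrative assumptions, not disclosed parameters.

```python
# Illustrative sketch of reflection-based detection: a baseline distance is
# captured with the opening empty, and any measurement sufficiently shorter
# than the baseline is treated as an operation (e.g. an inserted foot).
class OpticalFootDetector:
    def __init__(self, baseline_mm: float, margin_mm: float = 30.0):
        self.baseline_mm = baseline_mm   # distance with nothing inserted
        self.margin_mm = margin_mm       # tolerance against measurement noise

    def is_operated(self, measured_mm: float) -> bool:
        return measured_mm < self.baseline_mm - self.margin_mm

det = OpticalFootDetector(baseline_mm=250.0)
print(det.is_operated(248.0))  # False: within the noise margin
print(det.is_operated(120.0))  # True: a foot is reflecting the light
```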
  • FIG. 15 is a schematic diagram showing an example of a physical interface 10c-2 according to a second example of the first modification of the embodiment.
  • the physical interface 10c-2 has a wider opening in the horizontal direction than the physical interface 10c-1 shown in FIG. 14, and has a structure in which the opening side of the side walls laterally adjacent to the opening is notched. Further, in the example of FIG. 15, the physical interface 10c-2 is provided with a plurality of distance measuring devices 16a to 16c, each including a light emitting section and a light receiving section, on its top surface 17.
  • each distance measuring device 16a to 16c determines whether or not there is an operation on the physical interface 10c-2 based on its distance measurement result, similarly to the first example of the first modification of the embodiment described above.
  • the determination results from each distance measuring device 16a to 16c are logically summed, for example, and transmitted to the arm control device 5045a via the cable 11.
  • the surgeon 102 can cause the physical interface 10c-2 to detect the operation by sliding the foot (tip of the foot) in the lateral direction.
  • FIG. 16 is a schematic diagram showing an example of a physical interface 10c-3 according to a third example of the first modification of the embodiment.
  • the physical interface 10c-3 includes a light emitting device 17-1 that emits a plurality of light beams 162, and a light receiving device 17-2 that receives each light beam 162 emitted from the light emitting device 17-1. It has a light curtain structure.
  • the light emitting device 17-1 and the light receiving device 17-2 are installed on the floor surface corresponding to the side surface of the surgical bed 100. At this time, it is preferable to make the distance between the light emitting device 17-1 and the light receiving device 17-2 somewhat wide.
  • when at least one of the light beams 162 is blocked, the light receiving device 17-2 may determine that a light-blocking object (for example, the tip of the foot of the operator 102) is present.
  • the determination result of the light receiving device 17-2 is transmitted to the arm control device 5045a via the cable 11.
  • the state acquisition unit 510 can detect an operation on the physical interface 10c-3 based on this determination result.
  • FIG. 17 is a schematic diagram showing an example of the arrangement of physical interfaces according to the second modification of the embodiment.
  • a physical interface 10d according to a second modification of the embodiment is placed, for example, on the floor at the feet of the surgeon 102. It is preferable that the physical interface 10d be placed in a position where it does not interfere with the foot switch 5057 or the like.
  • the state acquisition unit 510 can detect an operation on the physical interface 10d, for example, based on a detection result that the surgeon 102 has put his weight on the physical interface 10d.
  • by installing the physical interface 10d, which detects operations using a pressure-sensitive sensor, at the feet of the surgeon 102, the physical interface 10d can be operated even when both hands of the surgeon 102 are occupied with operating the surgical instrument. That is, by applying the second modification of the embodiment, the surgeon 102 can operate the physical interface 10d hands-free and control the operation of the robot arm 120.
  • the physical interface 10d can be configured to have a certain area, making it easy to operate.
  • FIG. 18 is a schematic diagram showing an example of the physical interface 10d-1 according to the first example of the second modification of the embodiment.
  • the physical interface 10d-1 shown in FIG. 18 is sized so that its pressure sensitive range is limited to one person's operation.
  • the physical interface 10d-1 is preferably placed, for example, at the feet of the surgeon 102 when performing surgery.
  • the physical interface 10d-1 detects, for example, that the surgeon 102 has applied his/her weight.
  • the detection result of the physical interface 10d-1 is transmitted to the status acquisition unit 510 via the cable 11.
  • the status acquisition unit 510 can detect an operation on the physical interface 10d-1 based on the detection result sent from the physical interface 10d-1.
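  • The following Python sketch illustrates one way such pressure-based detection could be thresholded, assuming an operation is recognized when the measured pressure rises sufficiently above the resting level; the class name and values are illustrative assumptions, not the disclosed implementation.

```python
# Illustrative sketch of pressure-sensitive detection: an operation is
# recognized when the measured pressure rises by more than a threshold above
# the resting level (e.g. the surgeon shifting weight onto the mat).
class PressureMatDetector:
    def __init__(self, resting_kpa: float, delta_threshold_kpa: float = 15.0):
        self.resting_kpa = resting_kpa                 # pressure at rest
        self.delta_threshold_kpa = delta_threshold_kpa # rise needed to trigger

    def is_operated(self, measured_kpa: float) -> bool:
        return (measured_kpa - self.resting_kpa) > self.delta_threshold_kpa

mat = PressureMatDetector(resting_kpa=5.0)
print(mat.is_operated(8.0))   # False: incidental contact
print(mat.is_operated(40.0))  # True: weight applied -> operation detected
```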
  • FIG. 19 is a schematic diagram showing an example of a physical interface 10d-2 according to a second example of a second modification of the embodiment.
  • the physical interface 10d-2 shown in FIG. 19 has a larger pressure sensitive range than the example shown in FIG. 18, and is sized to be operable by multiple people. It is preferable that the physical interface 10d-2 be placed, for example, over the entire area on one or both sides of the surgical bed 100 while avoiding interference with other equipment.
  • the present technology can also have the following configuration.
  • (1) A control device including a physical interface configured so that an operator can operate it with a body part other than his or her own hand, the control device controlling the operation of a medical robot according to the operator's operation on the physical interface.
  • (2) The control device according to (1) above, wherein the physical interface is configured so that the operator can operate it with his or her own waist or a body part below the waist.
  • (3) The control device according to (1) or (2) above, wherein the physical interface is an operation interface for operating a robot arm, as the medical robot, that assists the operator in surgery.
  • (4) The control device according to any one of (1) to (3) above, wherein the physical interface includes at least one operation unit that can be operated by the operator.
  • (5) The control device according to any one of (1) to (4) above, wherein the physical interface is configured to be attachable to a rail of a surgical bed.
  • (6) The control device, wherein the physical interface includes a tilt detection unit that detects a tilt of the surgical bed.
  • (7) The control device, wherein the physical interface includes a lighting unit that irradiates light to the outside.
  • (8) The control device according to any one of (1) to (7) above, wherein a plurality of the physical interfaces are installed for one medical robot.
  • (9) The control device according to (8) above, wherein each of the plurality of physical interfaces controls the same operation with respect to the one medical robot according to the operation.
  • (10) The control device, wherein the physical interface is arranged at a position corresponding to the operator's feet, and detects the operation by the operator's feet using light.
  • (11) The control device according to (1) above, wherein the physical interface includes a pressure-sensitive sensor that detects pressure and is disposed at a position corresponding to the operator's feet, the physical interface detecting the operation by the operator's feet based on a change in the pressure detected by the pressure-sensitive sensor.
  • (12) A medical robot including: a physical interface configured so that an operator can operate it with a body part other than his or her own hands; a robot arm that assists the operator in surgery; and a control unit that controls the operation of the robot arm according to the operator's operation on the physical interface.
  • (13) The medical robot according to (12) above, wherein the control unit controls the operation of the robot arm according to a result of voice recognition, and switches between enabling and disabling control of the operation of the robot arm according to the result of the voice recognition in accordance with the operator's operation on the physical interface.
  • (14) The medical robot according to (13) above, wherein the control unit switches between enabling and disabling the voice recognition by a voice recognition unit according to the operator's operation on the physical interface.
  • (15) The medical robot according to (13) above, wherein the control unit stops the operation of the robot arm in response to the operator's operation on the physical interface while the voice recognition by the voice recognition unit is enabled.
  • (16) The medical robot according to any one of (12) to (15) above, wherein the control unit performs an emergency stop of the operation of the robot arm in response to the operator's operation on the physical interface.
Reference Signs List
50 Robot arm system
60 Instruction recognizer
61 Microphone
100 Surgical bed
101 Patient
102 Operator
103 Assistant
110 Bed rail
120 Robot arm
121 Joint part
122 Drive control unit
123 Emergency stop button
160 Emitted light
161 Reflected light
162 Light beam
200 Voice recognition state
210 Voice reception start state
211 Autonomous control mode
212 UI operation mode
220 Physical interface operation
230 Arm direct operation mode
231 Arm direct operation button
250 Stop button
500 Robot arm control unit
510 Status acquisition unit
511 Calculation condition determination unit
512 Force calculation unit
513 Command value generation unit
1210 Joint information detection unit
1211 Joint drive unit
5003 Lens barrel
5045, 5045a Arm control device
5057 Foot switch


Abstract

The control device according to the present disclosure is provided with a physical interface (10) that is adapted to be operable by a practitioner with a body part other than a hand of the practitioner, in which the operation of a medical robot is controlled in accordance with the operation of the physical interface by the practitioner. The medical robot according to the present disclosure includes: a physical interface that is adapted to be operable by a practitioner with a body part other than a hand of the practitioner; a robot arm (120) that assists a surgery by the practitioner; and a control unit (500) that controls the operation of the robot arm in accordance with the operation of the physical interface by the practitioner.

Description

Control device and medical robot
The present disclosure relates to a control device and a medical robot.
Medical robots that autonomously perform operations using surgical instruments are known. For example, in endoscopic surgery, an endoscope that operates autonomously is used to image the inside of a patient's abdominal cavity, and the captured images are displayed on a display. By performing the surgery while looking at the captured images displayed on the display, the operator (surgeon) can devote both hands to operating the surgical instruments. A hands-free operation means is required as an operation means for such a medical robot.
Conventionally, in surgery using electrically driven surgical tools such as electric scalpels, a foot pedal has commonly been used as a means for the surgeon to operate a surgical tool while concentrating on the surgical procedure with both hands occupied by it. Accordingly, the use of a foot pedal is also being considered as an operating means for medical robots. Furthermore, Patent Document 1 discloses a technique in which a physical interface for operating a medical robot is attached to a surgical instrument.
Patent Document 1: JP-T-2018-532527 (published Japanese translation of a PCT application)
Even in ordinary surgery where no medical robot is introduced, two or more foot pedals (that is, for both feet) are often placed at the surgeon's feet for operating other equipment. Therefore, adding a further foot pedal for operating the medical robot may reduce the operability of both the medical devices other than the medical robot and the medical robot itself.
Furthermore, with the configuration described in Patent Document 1, it is difficult to ensure operability for a normal surgical procedure. That is, in the configuration described in Patent Document 1, although the physical interface for operating the medical robot is attached to the surgical instrument, when performing complicated operations on the medical robot, it is difficult for the surgeon to accurately control the operation of the medical robot without temporarily stopping the surgical procedure.
Furthermore, in the configuration described in Patent Document 1, there is a risk that the surgeon's fingers may touch the physical interface during a surgical procedure and cause the medical robot to malfunction. Moreover, in actual surgery, the instruments used by the surgeon are generally replaced frequently, so the physical interface for operating the medical robot would have to be repeatedly detached and reattached.
An object of the present disclosure is to provide a control device and a medical robot that can control the operation of a medical robot in a hands-free manner and that enable risk management in hands-free operation.
A control device according to the present disclosure includes a physical interface configured to be operable by a surgeon with a body part other than his or her own hands, and controls the operation of a medical robot in accordance with the surgeon's operation on the physical interface.
FIG. 1A is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
FIG. 1B is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
FIG. 1C is a schematic diagram showing an example of the arrangement of a medical robot according to existing technology.
FIG. 2 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to existing technology.
FIG. 3 is a schematic diagram showing an example of a physical interface according to an embodiment.
FIG. 4 is a schematic diagram for explaining the relationship between a robot arm and a surgical bed.
FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface according to the embodiment.
FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system according to the embodiment.
FIG. 7 is an example functional block diagram for explaining the functions of a robot arm system according to the embodiment.
FIG. 8 is a schematic diagram for explaining an example of effective arrangement of the physical interface according to the embodiment.
FIG. 9 is a schematic diagram for explaining an example of effective arrangement of the physical interface according to the embodiment.
FIG. 10 is a schematic diagram showing an example of operation mode transitions of the robot arm according to a first application example of the embodiment.
FIG. 11 is a schematic diagram showing an example of operation mode transitions of the robot arm according to a second application example of the embodiment.
FIG. 12 is a schematic diagram showing an example of an emergency stop button provided on the robot arm.
FIG. 13 is a schematic diagram showing an arrangement example of a physical interface according to a first modification of the embodiment.
FIG. 14 is a schematic diagram showing an example of a physical interface according to a first example of the first modification of the embodiment.
FIG. 15 is a schematic diagram showing an example of a physical interface according to a second example of the first modification of the embodiment.
FIG. 16 is a schematic diagram showing an example of a physical interface according to a third example of the first modification of the embodiment.
FIG. 17 is a schematic diagram showing an arrangement example of a physical interface according to a second modification of the embodiment.
FIG. 18 is a schematic diagram showing an example of a physical interface according to a first example of the second modification of the embodiment.
FIG. 19 is a schematic diagram showing an example of a physical interface according to a second example of the second modification of the embodiment.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. Note that, in the following embodiments, the same parts are denoted by the same reference numerals, and redundant description will be omitted.
 以下、本開示の実施形態について、下記の順序に従って説明する。
1.本開示に係る技術の概要
 1-1.既存技術について
 1-2.本開示に係る技術
2.実施形態
  2-1-1.実施形態に係る物理インタフェース例
  2-1-2.実施形態に係る医療システムの構成例
  2-1-3.実施形態に適用可能な物理インタフェースの配置例
 2-2.実施形態の第1の適用例
 2-3.実施形態の第2の適用例
 2-4.実施形態の第3の適用例
3.実施形態の第1の変形例
4.実施形態の第2の変形例
Hereinafter, embodiments of the present disclosure will be described in the following order.
1. Overview of technology related to the present disclosure 1-1. Regarding existing technology 1-2. Technology related to the present disclosure 2. Embodiment 2-1-1. Physical interface example according to embodiment 2-1-2. Configuration example of medical system according to embodiment 2-1-3. Example of arrangement of physical interfaces applicable to embodiment 2-2. First application example of embodiment 2-3. Second application example of embodiment 2-4. Third application example 3 of the embodiment. First modification of the embodiment 4. Second modification of the embodiment
(1. Overview of technology related to the present disclosure)

The present disclosure relates to a control device for controlling the operation of a medical robot, particularly a robot arm that assists a surgeon's work in surgery. The control device according to the present disclosure includes a physical interface configured to be operable by the surgeon with a body part other than his or her own hands. A physical interface is an interface having a physical substance, for converting an action by an operator (in this case, the surgeon) into an electrical signal.
(1-1. Regarding existing technology)

Prior to describing the present disclosure, in order to facilitate understanding, the configuration of a medical system using a medical robot according to existing technology will be described.
FIGS. 1A to 1C are schematic diagrams showing examples of the arrangement of a medical robot according to existing technology. In the following, it is assumed that the medical robot is a robot arm that assists a surgeon during surgery.
FIG. 1A is a top view of a patient 101 lying on a surgical bed 100 with an operator 102 standing beside the surgical bed 100 for a surgical procedure. FIG. 1B is an overhead view of the state in FIG. 1A from diagonally above and behind the operator 102, and FIG. 1C is an overhead view of the state in FIG. 1A from the side opposite to FIG. 1B.
The surgical bed 100 is provided with bed rails 110 on its sides. In the example of FIGS. 1A to 1C, the bed rails 110 are provided on both sides of the surgical bed 100, one for each movable region. Further, as shown in FIGS. 1B and 1C, the surgical bed 100 is held at a predetermined height from the floor by a pedestal 140.
The robot arm 120 includes a plurality of joints and arm portions connecting the joints. By driving the plurality of joints in a predetermined manner, the robot arm 120 can freely change its posture within the movable range of each joint.
In the illustrated example, the robot arm 120 is mounted on a trolley 130 and used as a floor-standing device. Note that, although the surgical bed 100 can change the inclination of each movable region, the robot arm 120 operates independently of the inclination of each movable region of the surgical bed 100.
Next, as an application example of a medical robot based on existing technology, an endoscopic surgery system for performing surgery using an endoscope will be described. FIG. 2 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000 according to existing technology.
FIG. 2 shows a surgeon 102 performing surgery on a patient 101 on the surgical bed 100 using the endoscopic surgery system 5000. In FIG. 2, the surgical bed 100 is shown as viewed from the foot or head side of the patient 101, with bed rails 110 provided on both of its sides. Instruments used in the surgery, for example, are attached to the bed rails 110 using clamps or the like.
In FIG. 2, the endoscopic surgery system 5000 includes an endoscope 5001, other surgical instruments 5017, a robot arm 120 that is a medical robot supporting the endoscope 5001, and a rack 5037 containing various devices for endoscopic surgery. The rack 5037 is mounted on a trolley 130.
In endoscopic surgery, instead of cutting the abdominal wall to open the abdomen, a plurality of cylindrical opening instruments called trocars 5025a to 5025d are punctured into the abdominal wall. Then, a lens barrel 5003 of the endoscope 5001 and the other surgical instruments 5017 are inserted into the body cavity of the patient 101 through the trocars 5025a to 5025d.
In the example of FIG. 2, a pneumoperitoneum tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 101 as the other surgical instruments 5017. The energy treatment tool 5021 is a treatment tool that performs incision and exfoliation of tissue, sealing of blood vessels, and the like using high-frequency current or ultrasonic vibration. However, the surgical instruments 5017 shown in FIG. 2 are merely examples, and various surgical instruments commonly used in endoscopic surgery, such as tweezers and retractors, may be used as the surgical instruments 5017.
An image of the surgical site inside the body cavity of the patient 101 captured by the endoscope 5001 is displayed on a display device 5041. While viewing the image of the surgical site displayed on the display device 5041 in real time, the surgeon 102 uses the energy treatment tool 5021 and the forceps 5023 to perform treatment such as excising the affected area. Note that, although not shown, the pneumoperitoneum tube 5019, the energy treatment tool 5021, and the forceps 5023 are supported by the surgeon 102, an assistant, or the like during the surgery.
(Robot arm)

The robot arm 120 includes an arm portion 5031 extending from a base portion 5029. In the example of FIG. 2, the arm portion 5031 includes joint portions 5033a, 5033b, and 5033c and links 5035a and 5035b, and is driven under the control of an arm control device 5045. The arm portion 5031 supports the endoscope 5001 and controls its position and/or posture. This makes it possible to stably fix the position of the endoscope 5001. The arm control device 5045 can autonomously control the operation of the robot arm 120 based on, for example, a model learned by machine learning.
Note that the position of the endoscope refers to the position of the endoscope in space and can be expressed, for example, as three-dimensional coordinates (x, y, z). The posture of the endoscope refers to the direction in which the endoscope faces and can be expressed, for example, as a three-dimensional vector.
(Endoscope)

The endoscope 5001 will be briefly described. The endoscope 5001 includes the lens barrel 5003, a region of which extends a predetermined length from its distal end and is inserted into the body cavity of the patient 101, and a camera head 5005 connected to the proximal end of the lens barrel 5003. In the illustrated example, the endoscope 5001 is configured as a so-called rigid scope having a rigid lens barrel 5003; however, the endoscope 5001 may also be configured as a so-called flexible scope having a flexible lens barrel 5003.
An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 5003. A light source device 5043 mounted on the rack 5037 is connected to the endoscope 5001, and light generated by the light source device 5043 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 5003 and is emitted through the objective lens toward the observation target in the body cavity of the patient 101. Note that the endoscope 5001 may be a forward-viewing scope, an oblique-viewing scope, or a side-viewing scope.
An optical system and an image sensor are provided inside the camera head 5005, and reflected light (observation light) from the observation target is focused onto the image sensor by the optical system. The observation light is photoelectrically converted by the image sensor to generate an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image. The image signal is transmitted as RAW data to a camera control unit (CCU) 5039. Note that the camera head 5005 has a function of adjusting the magnification and focal length by appropriately driving its optical system.
Note that the camera head 5005 may be provided with a plurality of image sensors, for example, to support stereoscopic viewing (3D display). In this case, a plurality of relay optical systems are provided inside the lens barrel 5003 to guide the observation light to each of the plurality of image sensors.
(Various devices included in the rack)

In the example of FIG. 2, the rack 5037 is equipped with the CCU 5039, the light source device 5043, the arm control device 5045, an input device 5047, a treatment tool control device 5049, a pneumoperitoneum device 5051, a recorder 5053, and a printer 5055.
The CCU 5039 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 subjects the image signal received from the camera head 5005 to various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing). The CCU 5039 provides the processed image signal to the display device 5041. The CCU 5039 also transmits a control signal to the camera head 5005 to control its driving. The control signal may include information regarding imaging conditions such as magnification and focal length.
Under the control of the CCU 5039, the display device 5041 displays an image based on the image signal processed by the CCU 5039. When the endoscope 5001 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels) and/or supports 3D display, a display device capable of high-resolution display and/or 3D display may be used as the display device 5041 accordingly. When high-resolution imaging such as 4K or 8K is supported, using a display device 5041 with a size of 55 inches or more gives a more immersive experience. Furthermore, a plurality of display devices 5041 with different resolutions and sizes may be provided depending on the purpose.
The light source device 5043 includes a light-emitting element such as an LED (light emitting diode) and a drive circuit for driving it, and supplies the endoscope 5001 with irradiation light for imaging the surgical site.
The arm control device 5045 includes a processor such as a CPU and operates according to a predetermined program to control the driving of the arm portion 5031 of the robot arm 120 according to a predetermined control method.
The input device 5047 is an input interface for the endoscopic surgery system 5000. Through the input device 5047, a user can input various kinds of information and instructions to the endoscopic surgery system 5000. For example, the user inputs various kinds of information regarding the surgery, such as the patient's physical information and information about the surgical procedure, via the input device 5047. The user also inputs, via the input device 5047, for example, an instruction to drive the arm portion 5031, an instruction to change the imaging conditions of the endoscope 5001 (type of irradiation light, magnification, focal length, etc.), an instruction to drive the energy treatment tool 5021, and the like.
The type of the input device 5047 is not limited, and the input device 5047 may be any of various known input devices. As the input device 5047, for example, a mouse, a keyboard, a touch panel, a switch, a lever, a joystick, or other input devices can be used, and a plurality of types of input devices may be used in combination. A foot switch 5057 that is placed at the feet of an operator (for example, the surgeon 102) and operated by the operator's foot can also be used as the input device 5047. When a touch panel is used as the input device 5047, the touch panel may be provided on the display surface of the display device 5041.
The input device 5047 is not limited to the above examples. For example, a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), can be used as the input device 5047. In this case, the input device 5047 can accept various inputs according to the user's gestures and line of sight detected by the worn device.
The input device 5047 may also include a camera capable of detecting the user's movements. In this case, the input device 5047 can accept various inputs according to the user's gestures and line of sight detected from video captured by the camera. Furthermore, the input device 5047 may include a microphone capable of picking up the user's voice. In this case, the input device 5047 can perform voice recognition on the sound picked up by the microphone, analyze the voice of the speaker (for example, the surgeon 102), and accept various operation inputs by voice.
By configuring the input device 5047 so that various kinds of information can be input without contact in this way, a user belonging to the clean area (for example, the surgeon 102) can operate equipment belonging to the unclean area without contact. In addition, the user can operate equipment without releasing the surgical tool in hand, which improves the user's convenience.
The treatment tool control device 5049 controls the driving of the energy treatment tool 5021 for tissue cauterization, incision, blood vessel sealing, and the like. The pneumoperitoneum device 5051 sends gas into the body cavity of the patient 101 through the pneumoperitoneum tube 5019 to inflate the body cavity in order to secure a field of view for the endoscope 5001 and a working space for the surgeon 102. The recorder 5053 is a device capable of recording various kinds of information regarding the surgery. The printer 5055 is a device capable of printing various kinds of information regarding the surgery in various formats such as text, images, and graphs.
(1-2. Technology related to the present disclosure)

In the configuration described above, consider a case where the surgeon 102 switches the operation mode of the robot arm 120 from an autonomous operation mode to a manual operation mode and operates the robot arm 120 manually (for example, stops the operation of the robot arm 120). In this case, the endoscopic surgery system 5000 needs to be provided with switching means for switching the operation mode of the robot arm 120 hands-free.
As such switching means, it is conceivable to add a foot switch operated by the foot of the surgeon 102 in addition to the existing foot switches 5057. However, when two or more foot switches 5057 (that is, for both feet) are already placed at the feet of the surgeon 102 for operating other equipment, adding yet another foot pedal as means for switching the operation mode of the robot arm 120 may reduce the operability of both the medical devices other than the robot arm 120 and the robot arm 120 itself.
Further, as means for operating a medical robot while both of the surgeon's hands are occupied with operating surgical tools, a method of attaching a physical interface to the surgeon's hand or to a surgical tool operated by the surgeon is conceivable (for example, Patent Document 1). However, with this method, it is difficult to ensure operability for normal surgery. Furthermore, in actual surgery, instruments used by the surgeon are generally replaced frequently, so the physical interface for operating the medical robot would have to be repeatedly detached and reattached.
The control device according to the present disclosure for controlling the robot arm 120 includes a physical interface configured to be operable by the surgeon with a body part other than his or her own hands. By using the control device according to the present disclosure, the surgeon can control the operation of the medical robot hands-free at any time, and risk management during hands-free operation of the medical robot becomes possible. In addition, by using the control device according to the present disclosure, the work of replacing the physical interface that would otherwise accompany the replacement of surgical tools during surgery becomes unnecessary.
(2. Embodiment)

Embodiments according to the present disclosure will be described.
(2-1-1. Physical interface example according to the embodiment)

FIG. 3 is a schematic diagram showing an example of the physical interface according to the embodiment. As shown in section (a) of FIG. 3, the physical interface 10 according to the embodiment is removably attached to a bed rail 110 of the surgical bed 100 by, for example, a clamp (not shown), and is configured as a switch that is actuated by being pressed toward the bed rail 110. The output of the physical interface 10 is transmitted to the arm control device 5045 via, for example, a cable 11. The arm control device 5045 can control the operation of the robot arm 120 in accordance with the signal transmitted from the physical interface 10.
By installing the physical interface 10 in front of the surgeon 102, that is, in the direction in which the surgeon 102 faces the surgical bed 100 during surgery, the operability when the surgeon 102 operates the physical interface 10 can be maximized. The physical interface 10 is configured to be attachable at any position on a bed rail 110 of the surgical bed 100; for example, the physical interface 10 is removably attached to the bed rail 110 with a clamp or the like. Note that the physical interface 10 is preferably attached at a position where it does not interfere with other equipment attached to the bed rails 110 of the surgical bed 100.
Furthermore, a plurality of physical interfaces 10 may be installed for one robot arm 120. In this case, each of the plurality of physical interfaces 10 may instruct the one robot arm 120 to perform the same operation. By installing a plurality of physical interfaces 10 for one robot arm 120 in this way, control via the physical interfaces 10 can be realized even when a plurality of surgeons 102 are in charge of one surgery at the same time.
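Since every physical interface 10 installed for the one robot arm 120 commands the same operation, the receiving side can combine their outputs symmetrically. A minimal sketch under that assumption (the function name is illustrative, not from the disclosure):

```python
def combined_operation(interface_states: list[bool]) -> bool:
    # Any one of the interfaces installed for the single robot arm may
    # trigger the (identical) operation, so a logical OR suffices.
    return any(interface_states)

# e.g. two surgeons, each with an interface on their side of the bed
assert combined_operation([False, True]) is True
```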
The physical interface 10 according to the embodiment is attached to the surgical bed 100 (bed rail 110) at a position where the surgeon 102 can operate it with a body part other than his or her own hands, more specifically, with a body part at or below the waist 1023 of the surgeon 102. Section (b) of FIG. 3 shows an example of the attachment position of the physical interface 10 according to the embodiment. In the example of section (b) of FIG. 3, the physical interface 10 is attached at a position where it can be operated with a body part from the knee 1021 to the thigh 1022.
By attaching the physical interface 10 at such a position, the surgeon 102 can operate the physical interface 10 without using the hands, that is, while handling surgical tools with both hands, in a manner different from the foot switch 5057. Therefore, the surgeon 102 can smoothly operate the robot arm 120 even when both hands are occupied with operating surgical tools. In other words, the surgeon 102 can smoothly operate the robot arm 120 hands-free.
Further, by providing the physical interface 10 with a single switch and limiting the operation via the physical interface 10 to one type of operation, such as on/off control, operability for the surgeon 102 is improved.
Furthermore, attaching the physical interface 10 to the bed rail 110 eliminates the attachment and detachment of the physical interface 10 with every change of surgical tool, which would occur if the physical interface 10 were attached to a surgical tool or to the surgeon's hand. Moreover, since the physical interface 10 is attached to the bed rail 110, clean separation can be achieved at low cost, for example by placing the physical interface 10 inside the existing covering cloth or covering it with a simple drape, which is economically advantageous for medical practice.
(Tilt detection by the physical interface)

FIG. 4 is a schematic diagram for explaining the relationship between the robot arm 120 and the surgical bed 100.
In the example of FIG. 4, the robot arm 120 has a floor-standing configuration installed on the trolley 130 and operates independently of the surgical bed 100. On the other hand, during surgery, the angle of, for example, a part of the surgical bed 100 may be changed depending on the progress of the surgery. Therefore, if the inclination of the surgical bed 100 is changed during surgery while the lens barrel 5003 of the endoscope 5001 is inserted into the body cavity of the patient 101 by the robot arm 120, the lens barrel 5003 will make an undesirable movement within the body cavity of the patient 101.
To cope with such changes in the inclination of the surgical bed 100 during surgery, the physical interface 10 according to the embodiment may include a tilt sensor that detects the inclination of the physical interface 10. Tilt information indicating the inclination detected by the tilt sensor is transmitted to the arm control device 5045 via the cable 11.
As shown in FIG. 4, since the physical interface 10 according to the embodiment is attached to a bed rail 110 of the surgical bed 100, it can detect the inclination of the surgical bed 100 at the attached position. By attaching the physical interface 10 at a position on the surgical bed 100 corresponding to the site to be operated on, and controlling the operation of the robot arm 120 according to the output of the tilt sensor included in the physical interface 10, the operation of the robot arm 120 can be made to follow the inclination of the surgical bed 100.
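The disclosure does not give the computation by which the arm follows the bed's tilt. As an illustrative sketch only, assuming that the change in tilt is a pure rotation about the bed's long axis and using the rail-mounted interface's position (near the surgical site) as the pivot, the arm's target point could be re-expressed as follows; all names and the single-axis assumption are not from the disclosure:

```python
import math

def follow_bed_tilt(target_xyz, pivot_xyz, delta_tilt_rad):
    """Rotate the arm's target point about the interface's mounting point.

    target_xyz:     current tool target in room coordinates (x, y, z)
    pivot_xyz:      position of the rail-mounted physical interface,
                    used here as the rotation pivot
    delta_tilt_rad: change in bed tilt from the tilt sensor, assumed to
                    be a rotation purely about the bed's long (x) axis
    """
    x, y, z = (t - p for t, p in zip(target_xyz, pivot_xyz))
    c, s = math.cos(delta_tilt_rad), math.sin(delta_tilt_rad)
    y2, z2 = c * y - s * z, s * y + c * z   # rotate in the y-z plane
    return (x + pivot_xyz[0], y2 + pivot_xyz[1], z2 + pivot_xyz[2])
```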
Because the inclination of the surgical bed 100 is detected as a function of the physical interface 10, the operation of the robot arm 120 can be linked with the movement of the surgical bed 100 without collaboration with, for example, the manufacturer that provides the surgical bed 100. In addition, risk management becomes possible by monitoring the movement of the surgical bed 100 on the robot arm 120 (arm control device 5045) side.
For example, with a robot arm attached to a bed rail 110, the relative positional relationship between the patient 101 and that robot arm does not change even when the inclination of the surgical bed 100 changes. In that case, risk management against changes in the inclination of the surgical bed 100 is unnecessary.
On the other hand, with a floor-standing robot arm 120 as shown in FIG. 4, the relative positional relationship between the patient 101 and the robot arm 120 changes when the inclination of the surgical bed 100 changes. Therefore, when a floor-standing robot arm 120 is used, unless the robot arm 120 changes its posture and position in accordance with the movement of the surgical bed 100, the lens barrel 5003 or the like attached to the robot arm 120 may, as described above, move undesirably within the body cavity of the patient 101, with a risk of harming the patient 101. The physical interface 10 according to the embodiment can reduce this risk by incorporating the tilt sensor and monitoring the tilt state of the surgical bed 100.
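One hedged sketch of such monitoring on the arm control side is a watchdog that requests a protective stop whenever the reported bed tilt changes faster than a limit the arm can safely track; the class name and the rate limit are illustrative assumptions, not values from the disclosure:

```python
class BedTiltWatchdog:
    """Requests a protective stop if the bed tilt changes too quickly."""

    def __init__(self, max_rate_rad_s=0.05):
        self.max_rate = max_rate_rad_s   # illustrative safe-tracking limit
        self._last = None                # (timestamp_s, tilt_rad)

    def check(self, timestamp_s: float, tilt_rad: float) -> bool:
        """Feed one tilt report; returns True if the arm should be stopped."""
        stop = False
        if self._last is not None:
            dt = timestamp_s - self._last[0]
            if dt > 0 and abs(tilt_rad - self._last[1]) / dt > self.max_rate:
                stop = True
        self._last = (timestamp_s, tilt_rad)
        return stop
```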
(Illumination function of the physical interface)

As described above, when the physical interface 10 is attached to a bed rail 110, the physical interface 10 needs to be covered with a drape for clean separation or with the patient's covering cloth. When the physical interface 10 is covered with a drape or covering cloth, the visibility of the physical interface 10 to the surgeon 102 may deteriorate.
Therefore, in the embodiment, the physical interface 10 is provided with a light-emitting unit to give it an illumination function.
FIG. 5 is a schematic diagram for explaining the illumination function of the physical interface 10 according to the embodiment. In the example of FIG. 5, the patient 101 is covered with a covering cloth 20 except for the affected area, and the surgeon 102 is performing surgery while handling surgical tools with the left hand 102hL and the right hand 102hR.
The bed rails 110 are covered with the covering cloth 20. In the example of FIG. 5, the physical interface 10 attached to a bed rail 110 (not shown) is placed inside the covering cloth 20. Alternatively, when the physical interface 10 is placed outside the covering cloth 20, the physical interface 10 is covered with a transparent drape to keep it clean. For these reasons, it may be difficult for the surgeon 102 to see the physical interface 10 directly. Therefore, the physical interface 10 according to the embodiment is provided with a light-emitting unit 12 to give it an illumination function, and the light-emitting unit 12 is made to emit light, for example, at all times during surgery.
The light-emitting unit 12 is preferably provided at a position on the physical interface 10 where the surgeon 102 can easily recognize its light emission, for example with the physical interface 10 attached to the bed rail 110. It is also preferable that the light-emitting unit 12 emit light of such intensity that the emitted light passes through the covering cloth 20 or the drape to some extent and can be easily recognized by the surgeon 102. Note that the intensity of the light emitted by the light-emitting unit 12 is preferably set so that the surgeon 102 does not feel glare through the covering cloth 20 or the drape.
(2-1-2. Configuration example of a medical system according to the embodiment)

Next, a configuration example of the medical system according to the embodiment will be described.
FIG. 6 is a diagram schematically showing an example of the configuration of an endoscopic surgery system 5000a according to the embodiment. In FIG. 6, the endoscopic surgery system 5000a is obtained by adding the physical interface 10 to the endoscopic surgery system 5000 according to the existing technology described with reference to FIG. 2. As described above, the physical interface 10 is removably attached to a bed rail 110. The physical interface 10 is connected by wire, via the cable 11, to an arm control device 5045a corresponding to the arm control device 5045 in FIG. 2.
Note that the physical interface 10 can also be connected to the arm control device 5045a by wireless communication without using the cable 11. However, considering the responsiveness to operations on the physical interface 10 and risks such as communication errors, the physical interface 10 is preferably connected to the arm control device 5045a by wire.
FIG. 7 is an example functional block diagram for explaining the functions of the robot arm system according to the embodiment.
In FIG. 7, a robot arm system 50 includes a robot arm control unit 500 and the robot arm 120. The robot arm control unit 500 is included in the arm control device 5045a shown in FIG. 6. The physical interface (I/F) 10 includes a switch (SW) unit 13, a tilt sensor 14, and the light-emitting unit 12.
In the physical interface 10, the switch unit 13 may be configured to include, for example, a microswitch and an output circuit that outputs a signal according to an operation on the microswitch. The signal output from the output circuit (referred to as a switch signal) is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control unit 500.
The switch signal may be, for example, a signal that simply indicates on and off. Alternatively, the switch signal may be a signal that combines a synchronization pattern with a pattern indicating on and off.
The physical interface 10 is not limited to this and may simply be a switch that opens and closes a circuit in response to an operation. In this case, for example, the robot arm control unit 500 needs to constantly supply a signal to the physical interface 10.
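As an illustration of the framed switch-signal variant mentioned above, the following sketch places the switch state behind a synchronization byte and a checksum so the receiver can distinguish a corrupted frame from a genuine "off"; the frame layout, the sync value, and the function names are invented for illustration and are not specified in the disclosure:

```python
SYNC = 0xA5  # illustrative synchronization byte, not from the disclosure

def encode_switch_frame(pressed: bool) -> bytes:
    state = 0x01 if pressed else 0x00
    checksum = (SYNC + state) & 0xFF
    return bytes([SYNC, state, checksum])

def decode_switch_frame(frame: bytes):
    """Returns True/False for a valid frame, or None if the frame is bad,
    letting the receiver treat corruption differently from 'off'."""
    if len(frame) != 3 or frame[0] != SYNC:
        return None
    if (frame[0] + frame[1]) & 0xFF != frame[2]:
        return None
    return frame[1] == 0x01
```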
The tilt sensor 14 detects the inclination of the physical interface 10 and outputs tilt information indicating the detected inclination. For example, the tilt sensor 14 detects the angle of the physical interface 10 with respect to the direction of gravity and outputs the detected angle as tilt information. As the tilt sensor 14, for example, a gyro sensor that detects inclination based on angular acceleration about three axes can be used. The tilt information output from the tilt sensor 14 is transmitted to the arm control device 5045a via the cable 11 and received by the robot arm control unit 500.
The light-emitting unit 12 can use an LED (Light Emitting Diode) as its light-emitting element. The light emission of the light-emitting unit 12 may be turned on and off by, for example, a switch provided on the physical interface 10; alternatively, the on/off of the light emission may be controlled from the robot arm control unit 500. The light-emitting element used in the light-emitting unit 12 is not limited to an LED, as long as the purpose of being easily recognizable by the surgeon 102 through the covering cloth 20 or the drape can be achieved. The light-emitting unit 12 may also have a dimming function.
In FIG. 7, the instruction recognizer 60 includes, for example, a voice recognizer, analyzes a voice signal based on the voice picked up by a microphone 61, and acquires instruction information indicating a voice instruction. In this case, the instruction recognizer 60 can recognize, for example, an utterance by the surgeon 102 for controlling the operation of the robot arm 120 and acquire instruction information for instructing the operation of the robot arm 120.
The instruction recognizer 60 is not limited to this; for example, it may include a gaze recognizer that recognizes the line of sight (for example, the direction of the eyeballs) based on an image captured by a camera 62 and acquires instruction information indicating an instruction by gaze. In this case, the instruction recognizer 60 can recognize, for example, the line of sight of the surgeon 102 for controlling the operation of the robot arm 120 and acquire instruction information for controlling the robot arm 120.
The instruction recognizer 60 may have both a voice recognizer and a gaze recognizer, or only one of them.
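Configurations (13) and (14) listed earlier combine these inputs: operation of the physical interface toggles whether recognized voice instructions are allowed to drive the robot arm. A minimal sketch of that gating, in which the phrases, command names, and the toggle-per-press behavior are all illustrative assumptions rather than the disclosed implementation:

```python
class GatedInstructionRecognizer:
    """Maps recognized phrases to arm commands, but only while voice
    control has been enabled via the physical interface (cf. (13), (14))."""

    COMMANDS = {"zoom in": "ZOOM_IN", "zoom out": "ZOOM_OUT", "stop": "STOP"}

    def __init__(self):
        self.voice_enabled = False

    def on_physical_interface(self, pressed: bool):
        if pressed:                      # each press toggles enable/disable
            self.voice_enabled = not self.voice_enabled

    def on_recognized_text(self, text: str):
        if not self.voice_enabled:
            return None                  # recognized speech is ignored
        return self.COMMANDS.get(text.strip().lower())
```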
The robot arm 120 includes joint parts 121 and a drive control unit 122. Each joint part 121 includes a joint information detection unit 1210 and a joint drive unit 1211. The drive control unit 122 generates a drive control signal for driving the joint part 121 based on command value information supplied from a command value generation unit 513, which will be described later. The joint drive unit 1211 drives the joint part 121 according to the drive control signal generated by the drive control unit 122. The joint information detection unit 1210 detects the state of the joint part 121 using sensors or the like and acquires joint information. The joint information detection unit 1210 passes the acquired joint information to a status acquisition unit 510 and the command value generation unit 513, which will be described later.
Note that, in FIG. 7, the robot arm 120 is shown as including one joint part 121 for the sake of explanation, but in reality the robot arm 120 includes a plurality of joint parts 121.
The robot arm control unit 500 includes the status acquisition unit 510, a calculation condition determination unit 511, a force calculation unit 512, and the command value generation unit 513.
The status acquisition unit 510 acquires the switch signal output from the switch unit 13 of the physical interface 10 and the tilt information output from the tilt sensor 14. The status acquisition unit 510 also acquires the joint information output from the joint information detection unit 1210 of each joint part 121. The status acquisition unit 510 passes the acquired switch signal, tilt information, and joint information to the calculation condition determination unit 511.
The calculation condition determination unit 511 acquires the switch signal, tilt information, and joint information passed from the status acquisition unit 510, and also acquires the instruction information output from the instruction recognizer 60. Based on the acquired information and signals, the calculation condition determination unit 511 determines how the robot arm 120 should behave. The calculation condition determination unit 511 passes information indicating the determined behavior of the robot arm 120 to the force calculation unit 512.
The force calculation section 512 has a model of the motion of the robot arm 120 that has been trained, for example, by machine learning. The force calculation section 512 applies the information indicating the behavior of the robot arm 120 passed from the calculation condition determination section 511 to this model and predicts the motion of the robot arm 120. The force calculation section 512 passes robot motion information indicating the predicted motion of the robot arm 120 to the command value generation section 513.
The command value generation section 513 is also supplied with joint information from each joint section 121 of the robot arm 120. Based on the joint information passed from each joint section 121 and the robot motion information passed from the force calculation section 512, the command value generation section 513 generates command values instructing the driving of each joint section 121 of the robot arm 120, and passes the generated command values to the drive control section 122.
The drive control section 122 generates a drive control signal for driving each joint section 121 in accordance with each command value passed from the command value generation section 513. The robot arm 120 executes a predetermined operation as each joint section 121 is driven in accordance with the drive control signals generated by the drive control section 122.
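The flow from state acquisition to joint drive described above can be summarized, purely as an illustrative sketch, in the following Python pseudostructure. The class and reference numerals mirror the embodiment, but the method signatures, the placeholder policy in decide_behavior, and the contents of the data passed between the sections are assumptions made for illustration, not the actual implementation.

```python
class Joint:
    """Stand-in for a joint section 121 with its detector 1210."""

    def __init__(self):
        self.angle = 0.0

    def detect(self) -> float:
        # 1210: return the sensed joint state (here, just an angle).
        return self.angle


def decide_behavior(switch, tilt, joint_info, instruction):
    # 511: placeholder policy -- act on instructions only while the
    # switch signal enables them; otherwise hold the current pose.
    if switch and instruction is not None:
        return {"command": instruction, "tilt": tilt, "joints": joint_info}
    return {"command": "hold", "tilt": tilt, "joints": joint_info}


class RobotArmControl500:
    """One control cycle: 510 (state acquisition) -> 511 (condition
    determination) -> 512 (force calculation model) -> 513 (command
    values) -> 122 (drive control)."""

    def __init__(self, physical_interface, joints, model, drive_control):
        self.pif = physical_interface        # physical interface 10
        self.joints = joints                 # joint sections 121
        self.model = model                   # learned motion model (512)
        self.drive_control = drive_control   # drive control section 122

    def cycle(self, instruction):
        # 510: gather switch signal, tilt information and joint states.
        switch = self.pif.switch_signal()
        tilt = self.pif.tilt_info()
        joint_info = [j.detect() for j in self.joints]

        # 511: decide how the arm should behave.
        behavior = decide_behavior(switch, tilt, joint_info, instruction)

        # 512: apply the behavior to the learned model to predict motion
        # (assumed here to return one target value per joint).
        motion = self.model.predict(behavior)

        # 513: combine predicted motion with joint states into per-joint
        # command values; 122 then turns them into drive signals.
        commands = [motion[i] - q for i, q in enumerate(joint_info)]
        self.drive_control.apply(commands)
```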
In this way, the physical interface 10 according to the embodiment and the robot arm control section 500 that controls the operation of the robot arm 120 in accordance with the output of the physical interface 10 together constitute a control device that controls the operation of a medical robot in accordance with the operation of the operator 102 on the physical interface 10.
(2-1-3. Example arrangement of physical interfaces applicable to the embodiment)
Next, examples of effective arrangements of the physical interface 10 according to the embodiment will be described with reference to FIG. 8 and FIG. 9.
FIG. 8 is a schematic diagram showing an example arrangement in which a single physical interface 10 according to the embodiment is provided. Section (a) of FIG. 8 is a top view of a patient 101 lying on a surgical bed 100, with an operator 102 standing beside the surgical bed 100 for a surgical procedure. Section (b) of the figure shows the state of section (a) viewed obliquely from above and behind the operator 102.
In the example of FIG. 8, when a single physical interface 10 is provided, the physical interface 10 is placed on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the operator 102.
More specifically, the physical interface 10 is preferably placed at a position corresponding to the body part, among the waist of the operator 102 and the body parts below the waist, that the operator 102 uses to operate the physical interface 10 while performing surgery on the patient 101 lying on the surgical bed 100. In the example of FIG. 8, the physical interface 10 is placed at a position corresponding to the thigh or knee of the left leg of the operator 102. Since the physical interface 10 is detachably attached to the bed rail 110, its position can be easily adjusted.
FIG. 9 is a schematic diagram showing an example arrangement in which a plurality of physical interfaces 10 are provided. Here, two physical interfaces 10a and 10b are used; the physical interface 10a is assumed to be operated by the operator 102, and the physical interface 10b by an assistant 103 who assists the operator 102 in the surgery. The robot arm system 50 performs the same control in response to the output of either of the physical interfaces 10a and 10b.
Section (a) of FIG. 9 is a top view of the patient 101 lying on the surgical bed 100, with the operator 102 standing beside the surgical bed 100 for a surgical procedure. In this example, the assistant 103 is assumed to stand facing the operator 102 across the surgical bed 100. Section (b) of the figure shows the state of section (a) viewed obliquely from above and behind the assistant 103.
The physical interface 10a operated by the operator 102 is placed as described with reference to FIG. 8. The placement of the physical interface 10b operated by the assistant 103 basically follows the same positional relationship as that between the operator 102 and the physical interface 10a. Specifically, the physical interface 10b may be placed on the bed rail 110 of the surgical bed 100 at a position corresponding to the standing position of the assistant 103.
Note that the placement of the physical interface 10 shown in FIG. 8 and the placements of the physical interfaces 10a and 10b shown in FIG. 9 may be determined as appropriate according to the positions of other instruments used in the surgery, the operating habits of the operator 102 and the assistant 103, and so on.
(2-2. First application example of the embodiment)
Next, a first application example of the physical interface 10 according to the embodiment will be described. In the first application example, operating the physical interface 10 according to the embodiment enables or disables control of the robot arm 120 in accordance with instructions recognized by the instruction recognizer 60. Here, the description assumes that the instruction recognizer 60 acquires instruction information corresponding to the voice picked up by the microphone 61.
FIG. 10 is a schematic diagram showing an example of operation mode transitions of the robot arm 120 according to the first application example of the embodiment.
In FIG. 10, the voice recognition state 200 includes an autonomous control mode 211, in which autonomous control of the robot arm 120 is instructed, and a UI operation mode 212, in which the robot arm 120 is operated manually using, for example, a user interface (UI) provided by the input device 5047.
Based on the switch signal output from the physical interface 10 in response to a physical interface (IF) operation 220 and passed from the state acquisition section 510, the calculation condition determination section 511 determines whether to enable or disable control of the robot arm 120 based on the voice recognition function of the instruction recognizer 60.
When the calculation condition determination section 511 determines that this control is to be enabled, it sets the voice recognition state 200 to a voice reception start state 210 in which reception of voice begins. In the voice reception start state 210, the calculation condition determination section 511 switches the operation mode of the robot arm 120 between the autonomous control mode 211 and the UI operation mode 212 according to the voice recognized by the instruction recognizer 60.
On the other hand, when the calculation condition determination section 511 determines that this control is to be disabled, it stops the control and causes the operation mode of the robot arm 120 to transition to an arm direct operation mode 230 in accordance with an operation on an arm direct operation button 231. The arm direct operation button 231 is, for example, an operator provided on the robot arm 120 for operating the robot arm 120 without going through the control of the arm control device 5045a.
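The mode transitions of FIG. 10 can be pictured as a small state machine. The following is a minimal sketch under stated assumptions: the enumeration values map onto modes 211, 212, and 230, and the command strings that select between the autonomous and UI modes are hypothetical, since the actual vocabulary is not specified here.

```python
from enum import Enum, auto


class ArmMode(Enum):
    AUTONOMOUS = auto()        # autonomous control mode 211
    UI_OPERATION = auto()      # UI operation mode 212
    DIRECT_OPERATION = auto()  # arm direct operation mode 230


class ModeManager:
    """Sketch of the FIG. 10 transitions: the physical interface
    enables/disables voice control; while enabled, recognized voice
    commands switch between modes 211 and 212."""

    def __init__(self):
        self.voice_enabled = False
        self.mode = ArmMode.DIRECT_OPERATION

    def on_physical_interface(self, enabled: bool):
        self.voice_enabled = enabled
        if not enabled:
            # Voice control disabled: fall back to direct operation.
            self.mode = ArmMode.DIRECT_OPERATION

    def on_voice_command(self, command: str):
        if not self.voice_enabled:
            return  # recognized speech is ignored while disabled
        if command == "autonomous":
            self.mode = ArmMode.AUTONOMOUS
        elif command == "ui":
            self.mode = ArmMode.UI_OPERATION
```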
In the existing technology, enabling and disabling of control of the robot arm 120 based on the voice recognition function is controlled by a predetermined activation word uttered by the operator 102 or the like (for example, saying "start voice"). In contrast, in the first application example of the embodiment, the activation word uttered by the operator 102 or the like is replaced by an operation on the physical interface 10. Uttering an activation word therefore becomes unnecessary, improving operability. That is, according to the first application example of the embodiment, the success rate of activation for voice control of the operation of the robot arm 120 can be brought to 100%.
Note that the calculation condition determination section 511 may treat the operation on the physical interface 10 either as a momentary type, which is on only while the interface is being operated (for example, pressed), or as an alternate type, which toggles between on and off each time the interface is operated.
More specifically, under momentary-type control, the calculation condition determination section 511 enables control based on the voice recognition function of the instruction recognizer 60 while, for example, the operator 102 is pressing the switch section 13 of the physical interface 10, setting the voice recognition state 200 to the voice reception start state 210. When the switch section 13 of the physical interface 10 is not pressed, the calculation condition determination section 511 disables and stops control based on the voice recognition function and causes the operation mode of the robot arm 120 to transition to the arm direct operation mode 230.
Under alternate-type control, on the other hand, the calculation condition determination section 511 toggles between enabling and disabling control based on the voice recognition function each time, for example, the operator 102 presses the switch section 13 of the physical interface 10.
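The difference between the two interpretations of the switch signal can be expressed compactly. The sketch below is an assumption-laden illustration: raw_pressed stands for the on/off level of the switch signal, and the rising-edge detection is one common way to realize the alternate (toggle) behavior.

```python
class MomentarySwitch:
    """Enabled exactly while the switch is held down."""

    def update(self, raw_pressed: bool) -> bool:
        return raw_pressed


class AlternateSwitch:
    """Toggles enabled/disabled on each new press (rising edge)."""

    def __init__(self):
        self.enabled = False
        self.last_pressed = False

    def update(self, raw_pressed: bool) -> bool:
        if raw_pressed and not self.last_pressed:
            self.enabled = not self.enabled
        self.last_pressed = raw_pressed
        return self.enabled
```

In either case, the value returned by update() would drive the enable/disable decision of the calculation condition determination section 511 on each control cycle.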
(2-3. Second application example of the embodiment)
Next, a second application example of the physical interface 10 according to the embodiment will be described. In the second application example, operating the physical interface 10 according to the embodiment makes it possible to operate the robot arm 120 manually, overriding the instructions recognized by the instruction recognizer 60. As in the first application example described above, the description assumes that the instruction recognizer 60 acquires instruction information corresponding to the voice picked up by the microphone 61.
FIG. 11 is a schematic diagram showing an example of operation mode transitions of the robot arm 120 according to the second application example of the embodiment.
In the second application example, in the voice recognition state 200 shown in FIG. 11, the calculation condition determination section 511 enters the voice reception start state 210 in response to a predetermined start word 240a (for example, saying "start voice"). Once in the voice reception start state 210 within the voice recognition state 200, the calculation condition determination section 511 maintains the voice reception start state 210 at all times, regardless of whether or not the physical interface 10 is operated. The calculation condition determination section 511 also cancels the voice reception start state 210 in response to a predetermined end word 240b (for example, saying "end voice"), stops control based on the voice recognition function, and causes the operation mode of the robot arm 120 to transition to the arm direct operation mode 230.
Here, in the example of FIG. 11, a stop button 250 is provided as a hardware (HW) button. The stop button 250 is an operator for stopping the operation of the robot arm 120 in response to an operation and causing the operation mode to transition to the arm direct operation mode 230. This stop button 250 functions as risk management for the operation of the robot arm 120 in cases such as when a failure occurs in voice recognition.
In the second application example, the physical interface 10 is made to function as this stop button 250.
For example, in voice recognition by the instruction recognizer 60, a case may occur in which recognition of the utterance by the operator 102 instructing "stop robot" fails and the robot arm 120 does not stop in response to the utterance. In such a case, the operator 102 can stop the operation of the robot arm 120 by operating the physical interface 10. Therefore, according to the second application example of the embodiment, the voice recognition function can be kept enabled at all times with risk management in place, improving operability.
As another example, the operator 102 may want to stop the operation of the robot arm 120 with less delay than stop control of the robot arm 120 based on voice recognition allows. For example, depending on the condition of the voice picked up by the microphone 61, the instruction recognizer 60 may require a longer time than usual for voice recognition processing. In contrast, the switch signal output from the physical interface 10 is a signal that simply indicates on and off. Therefore, according to the second application example of the embodiment, the stop processing of the robot arm 120 can be executed faster than when the voice recognition function is used.
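The latency argument can be made concrete: the switch signal can be checked on every control tick, before any speech processing, so a stop request never waits for the recognizer. A minimal sketch, with hypothetical names (control_tick, arm.stop, arm.execute) that are not part of the embodiment:

```python
def control_tick(pif_switch_pressed: bool, recognizer, audio_frame, arm):
    # The hardware stop path is evaluated first and unconditionally:
    # a simple on/off signal, with no recognition latency involved.
    if pif_switch_pressed:
        arm.stop()
        return

    # Only then is the (potentially slow) voice path consulted.
    command = recognizer.recognize(audio_frame)
    if command == "stop robot":
        arm.stop()
    elif command is not None:
        arm.execute(command)
```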
(2-4. Third application example of the embodiment)
Next, a third application example of the physical interface 10 according to the embodiment will be described. In the third application example, the physical interface 10 according to the embodiment is used as an emergency stop button that brings the robot arm 120 to an emergency stop.
The robot arm 120 is generally provided with an emergency stop button on its main body for completely stopping its own operation. FIG. 12 is a schematic diagram showing an example of an emergency stop button 123 provided on the robot arm 120. In the example of section (a) of FIG. 12, the emergency stop button 123 is provided directly on the robot arm 120. By operating this emergency stop button 123, for example when an abnormality is observed in the operation of the robot arm 120, the operator 102 can stop the operation of the robot arm 120 with an extremely small delay.
When the emergency stop button 123 is provided on the main body of the robot arm 120, however, it may be located far from the operator 102. Section (b) of FIG. 12 schematically shows an example in which the emergency stop button 123 is located far from the operator 102.
In the example of section (b) of FIG. 12, the robot arm 120 is covered with a vinyl cover 21, and the emergency stop button 123 is provided on a middle portion of the robot arm 120. To operate the emergency stop button 123, the operator 102 must reach with the left hand 102hL from the operator's position across the upper part of the patient 101. When both hands are occupied with operating surgical instruments, the operator 102 must interrupt the operation of the instruments to operate the emergency stop button 123. The vinyl cover 21 may also get in the way when operating the emergency stop button 123. A delay may therefore occur between the moment it becomes necessary to stop the operation of the robot arm 120 and the moment the emergency stop button 123 is actually operated. In some cases, an assistant or staff member other than the operator 102 may even be forced to operate the emergency stop button 123.
In the third application example of the embodiment, since the physical interface 10 is used as the emergency stop button 123, the operator 102 can stop the operation of the robot arm 120 without interrupting the operation of the surgical instruments, even when both hands are occupied with operating them. That is, in the third application example of the embodiment, the operator 102 can stop the operation of the robot arm 120 hands-free.
(3. First modification of the embodiment)
Next, a first modification of the embodiment of the present disclosure will be described. In the first modification, a configuration that detects operations using light is applied as the physical interface 10.
FIG. 13 is a schematic diagram showing an example arrangement of the physical interface according to the first modification of the embodiment. In FIG. 13, a physical interface 10c according to the first modification of the embodiment is placed, for example, at the feet of the operator 102. The physical interface 10c is preferably placed at a position where it does not interfere with the foot switch 5057 or the like.
As configurations for detecting an operation using light, a method is conceivable in which, for example, whether or not the foot or the like of the operator 102 has been inserted into the physical interface 10c is determined based on distance measurement using reflected light, or on a detection result indicating whether or not the light has been blocked.
By using the physical interface 10c, which detects operations using light, the physical interface 10c can be operated without contact and with an operation method different from that of the foot switch 5057. Furthermore, by installing the physical interface 10c at the feet of the operator 102 or the like, the physical interface 10c can be operated even when both hands of the operator 102 are occupied with operating surgical instruments. That is, in the first modification of the embodiment, the operator 102 can operate the physical interface 10c hands-free and control the operation of the robot arm 120.
(First example of the first modification of the embodiment)
FIG. 14 is a schematic diagram showing an example of a physical interface 10c-1 according to the first example of the first modification of the embodiment.
In FIG. 14, the physical interface 10c-1 has, for example, a structure in which one face of a rectangular parallelepiped is an opening. In the illustrated example, a distance measuring device 16 is provided on a face 15 adjoining the opening, for example its top face. The distance measuring device 16 includes, for example, a light emitting section and a light receiving section, and the light emitting section irradiates the interior of the opening with light. The distance measuring device 16 measures the distance from the distance measuring device 16 to a target object (not shown) based on the timing at which the emitted light 160 is emitted from the light emitting section and the timing at which the reflected light 161 from the target object is received by the light receiving section.
The distance measuring device 16 determines whether or not the physical interface 10c-1 has been operated based on the distance measurement result. The determination result by the distance measuring device 16 is transmitted to the arm control device 5045a via the cable 11.
For example, the distance measured when nothing is inserted into the opening of the physical interface 10c-1 is taken as an initial value. When distance measurement is performed with, for example, the foot (toe) of the operator 102 inserted into the opening, the emitted light 160 is reflected by the inserted foot, so the measured distance is shorter than the initial value. In the robot arm control section 500, the state acquisition section 510 can detect an operation on the physical interface 10c-1 based on this distance measurement result.
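One plausible realization of this initial-value comparison is sketched below. The class name and the noise margin are illustrative assumptions, not values from the embodiment.

```python
class LightOperationDetector:
    """Sketch for 10c-1: an operation is detected when the measured
    distance drops clearly below the empty-opening baseline."""

    def __init__(self, baseline_mm: float, margin_mm: float = 20.0):
        self.baseline_mm = baseline_mm  # distance with nothing inserted
        self.margin_mm = margin_mm      # assumed margin against noise

    def operated(self, measured_mm: float) -> bool:
        return measured_mm < self.baseline_mm - self.margin_mm
```

The baseline could, for instance, be captured at start-up by averaging a few measurements of the empty opening.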
Note that in the example of FIG. 14, the physical interface 10c-1 is shown as having the shape of a rectangular parallelepiped for ease of explanation, but the shape is not limited to this.
Furthermore, although the operation on the physical interface 10c-1 is detected here by distance measurement using light, this is not limited to this example. Taking FIG. 14 as an example, a light emitting section may be provided on the face 15, and a light receiving section that receives the light emitted from the light emitting section may be provided on the face opposite the face 15, at a position corresponding to the light emitting section. The state acquisition section 510 may then detect an operation on the physical interface 10c-1 based on whether or not the light emitted from the light emitting section is received by the light receiving section.
(Second example of the first modification of the embodiment)
FIG. 15 is a schematic diagram showing an example of a physical interface 10c-2 according to the second example of the first modification of the embodiment.
In FIG. 15, compared with the physical interface 10c-1 shown in FIG. 14, the physical interface 10c-2 has an opening that is wider in the lateral direction, and the side walls laterally adjoining the opening are cut away on the opening side. In the example of FIG. 15, the physical interface 10c-2 is provided with a plurality of distance measuring devices 16a to 16c, each including a light emitting section and a light receiving section, on its top face 17.
Each of the distance measuring devices 16a to 16c determines whether or not the physical interface 10c-2 has been operated based on its distance measurement result, as in the first example of the first modification of the embodiment described above. The determination results of the distance measuring devices 16a to 16c are, for example, combined by a logical OR and transmitted to the arm control device 5045a via the cable 11.
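The logical OR of the three determination results might look as follows. This is a sketch; it assumes detector objects with an operated() method, like the LightOperationDetector sketched above.

```python
def any_sensor_operated(detectors, measurements_mm) -> bool:
    # True if any of the rangefinders 16a-16c sees an inserted foot.
    return any(
        d.operated(m) for d, m in zip(detectors, measurements_mm)
    )
```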
According to the configuration of FIG. 15, the operator 102 can, for example, cause the physical interface 10c-2 to detect an operation by sliding the foot (toe) in the lateral direction.
(Third example of the first modification of the embodiment)
FIG. 16 is a schematic diagram showing an example of a physical interface 10c-3 according to the third example of the first modification of the embodiment.
In FIG. 16, the physical interface 10c-3 has the structure of a so-called light curtain, including a light emitting device 17-1 that emits a plurality of light beams 162 and a light receiving device 17-2 that receives each of the light beams 162 emitted from the light emitting device 17-1. The light emitting device 17-1 and the light receiving device 17-2 are installed on the floor along the side of the surgical bed 100. A reasonably wide spacing between the light emitting device 17-1 and the light receiving device 17-2 is preferable.
When at least one of the light beams 162 emitted from the light emitting device 17-1 is not received, the light receiving device 17-2 may determine that a light-blocking object (for example, the toe of the operator 102) is present between the light emitting device 17-1 and the light receiving device 17-2. The determination result of the light receiving device 17-2 is transmitted to the arm control device 5045a via the cable 11. In the robot arm control section 500, the state acquisition section 510 can detect an operation on the physical interface 10c-3 based on this determination result.
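The at-least-one-beam-missing judgment reduces to a single expression. The sketch below assumes the receiver 17-2 exposes, per scan, a boolean per beam; the function name is hypothetical.

```python
def curtain_blocked(beam_received: list[bool]) -> bool:
    """Sketch for 10c-3: beam_received[i] is True when beam i emitted
    by 17-1 arrives at the receiver 17-2. Any missing beam is treated
    as a light-blocking object (e.g. a foot) inside the curtain."""
    return not all(beam_received)
```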
According to the configuration shown in FIG. 16, a wide light irradiation range can be secured, so a person other than the operator 102 (the assistant 103, other staff, and so on) can also operate the physical interface 10c-3.
(4. Second modification of the embodiment)
Next, a second modification of the embodiment of the present disclosure will be described. In the second modification, a configuration that detects operations using a pressure-sensitive sensor, which outputs a signal corresponding to pressure, is applied as the physical interface 10.
FIG. 17 is a schematic diagram showing an example arrangement of the physical interface according to the second modification of the embodiment. In FIG. 17, a physical interface 10d according to the second modification of the embodiment is placed, for example, on the floor at the feet of the operator 102. The physical interface 10d is preferably placed at a position where it does not interfere with the foot switch 5057 or the like. The state acquisition section 510 can detect an operation on the physical interface 10d based, for example, on a detection result indicating that the operator 102 has put weight on the physical interface 10d.
By installing the physical interface 10d, which detects operations using a pressure-sensitive sensor, at the feet of the operator 102 or the like, the physical interface 10d can be operated even when both hands of the operator 102 are occupied with operating surgical instruments. That is, by applying the second modification of the embodiment, the operator 102 can operate the physical interface 10d hands-free and control the operation of the robot arm 120.
Furthermore, whereas the foot switch 5057 requires stepping on a predetermined position, the physical interface 10d can be configured to have a certain amount of area, making it easy to operate.
(First example of the second modification of the embodiment)
FIG. 18 is a schematic diagram showing an example of a physical interface 10d-1 according to the first example of the second modification of the embodiment. The physical interface 10d-1 shown in FIG. 18 is sized so that its pressure-sensitive area is limited to operation by one person. The physical interface 10d-1 is preferably placed, for example, at the feet of the operator 102 during surgery.
The physical interface 10d-1 detects, for example, that the operator 102 has put weight on it. The detection result of the physical interface 10d-1 is transmitted to the state acquisition section 510 via the cable 11. The state acquisition section 510 can detect an operation on the physical interface 10d-1 based on the detection result transmitted from the physical interface 10d-1.
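Weight detection from the pressure-sensitive sensor could be thresholded with hysteresis so that small shifts in stance do not toggle the state. The sketch below is one possible realization; the threshold values are illustrative assumptions, not values from the embodiment.

```python
class PressureOperationDetector:
    """Sketch for 10d-1: 'operated' while enough weight is applied,
    with hysteresis between the press and release thresholds."""

    def __init__(self, press_n: float = 150.0, release_n: float = 80.0):
        self.press_n = press_n      # force needed to turn on (newtons)
        self.release_n = release_n  # force below which it turns off
        self.operated = False

    def update(self, force_n: float) -> bool:
        if not self.operated and force_n >= self.press_n:
            self.operated = True
        elif self.operated and force_n <= self.release_n:
            self.operated = False
        return self.operated
```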
(Second example of the second modification of the embodiment)
FIG. 19 is a schematic diagram showing an example of a physical interface 10d-2 according to the second example of the second modification of the embodiment. The physical interface 10d-2 shown in FIG. 19 has a larger pressure-sensitive area than the example of FIG. 18, sized so that it can be operated by multiple people. The physical interface 10d-2 is preferably placed, for example, over the entire length of one or both sides of the surgical bed 100, avoiding interference with other equipment.
By applying the second example of the second modification of the embodiment, a person other than the operator 102 (the assistant 103, other staff, and so on) can also operate the physical interface 10d-2.
Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
Note that the present technology can also have the following configurations.
(1)
A control device comprising a physical interface configured to be operable by an operator with a body part other than the operator's own hands,
the control device controlling the operation of a medical robot in accordance with the operator's operation on the physical interface.
(2)
The control device according to (1) above, wherein the physical interface is configured to be operable by the operator with the operator's own waist or a body part below the waist.
(3)
The control device according to (1) or (2) above, wherein the physical interface is an operation interface for operating a robot arm, as the medical robot, that assists the operator in surgery.
(4)
The control device according to any one of (1) to (3) above, wherein the physical interface includes at least one operation section operable by the operator.
(5)
The control device according to any one of (1) to (4) above, wherein the physical interface is configured to be attachable to a rail of a surgical bed.
(6)
The control device according to (5) above, wherein the physical interface includes a tilt detection section that detects a tilt of the surgical bed.
(7)
The control device according to (5) or (6) above, wherein the physical interface includes a lighting section that emits light to the outside.
(8)
The control device according to any one of (1) to (7) above, wherein a plurality of the physical interfaces are installed for one medical robot.
(9)
The control device according to (8) above, wherein each of the plurality of physical interfaces controls, in response to an operation, the same operation of the one medical robot.
(10)
The control device according to (1) above, wherein the physical interface is placed at a position corresponding to the feet of the operator and detects the operation by the operator's foot using light.
(11)
The control device according to (1) above, wherein the physical interface includes a pressure-sensitive sensor that detects pressure, is placed at a position corresponding to the feet of the operator, and detects the operation by the operator's foot based on a change in the pressure detected by the pressure-sensitive sensor.
(12)
A medical robot comprising:
a physical interface configured to be operable by an operator with a body part other than the operator's own hands;
a robot arm that assists the operator in surgery; and
a control section that controls the operation of the robot arm in accordance with the operator's operation on the physical interface.
(13)
The medical robot according to (12) above, further comprising a voice recognition section that performs voice recognition based on the voice of the operator, wherein the control section controls the operation of the robot arm in accordance with a result of the voice recognition, and switches between enabling and disabling control of the operation of the robot arm based on the result of the voice recognition in accordance with the operator's operation on the physical interface.
(14)
The medical robot according to (13) above, wherein the control section switches between enabling and disabling the voice recognition by the voice recognition section in accordance with the operator's operation on the physical interface.
(15)
The medical robot according to (13) above, wherein the control section stops the operation of the robot arm while keeping the voice recognition by the voice recognition section enabled, in accordance with the operator's operation on the physical interface.
(16)
The medical robot according to any one of (12) to (15) above, wherein the control section brings the operation of the robot arm to an emergency stop in accordance with the operator's operation on the physical interface.
10, 10a, 10b, 10c, 10c-1, 10c-2, 10c-3, 10d, 10d-1, 10d-2 Physical interface
11 Cable
12 Light emitting section
13 Switch section
14 Tilt sensor
16, 16a, 16b, 16c Distance measuring device
17-1 Light emitting device
17-2 Light receiving device
50 Robot arm system
60 Instruction recognizer
61 Microphone
100 Surgical bed
101 Patient
102 Operator
103 Assistant
110 Bed rail
120 Robot arm
121 Joint section
122 Drive control section
123 Emergency stop button
160 Emitted light
161 Reflected light
162 Light beam
200 Voice recognition state
210 Voice reception start state
211 Autonomous control mode
212 UI operation mode
220 Physical interface operation
230 Arm direct operation mode
231 Arm direct operation button
250 Stop button
500 Robot arm control section
510 State acquisition section
511 Calculation condition determination section
512 Force calculation section
513 Command value generation section
1210 Joint information detection section
1211 Joint drive section
5003 Lens barrel
5045, 5045a Arm control device
5057 Foot switch

Claims (16)

  1.  A control device comprising a physical interface configured to be operable by an operator with a body part other than the operator's own hands,
      the control device controlling the operation of a medical robot in accordance with the operator's operation on the physical interface.
  2.  The control device according to claim 1, wherein the physical interface is configured to be operable by the operator with the operator's own waist or a body part below the waist.
  3.  The control device according to claim 1, wherein the physical interface is an operation interface for operating a robot arm, as the medical robot, that assists the operator in surgery.
  4.  The control device according to claim 1, wherein the physical interface includes at least one operation section operable by the operator.
  5.  The control device according to claim 1, wherein the physical interface is configured to be attachable to a rail of a surgical bed.
  6.  The control device according to claim 5, wherein the physical interface includes a tilt detection section that detects a tilt of the surgical bed.
  7.  The control device according to claim 5, wherein the physical interface includes a lighting section that emits light to the outside.
  8.  The control device according to claim 1, wherein a plurality of the physical interfaces are installed for one medical robot.
  9.  The control device according to claim 8, wherein each of the plurality of physical interfaces controls, in response to an operation, the same operation of the one medical robot.
  10.  The control device according to claim 1, wherein the physical interface is placed at a position corresponding to the feet of the operator and detects the operation by the operator's foot using light.
  11.  The control device according to claim 1, wherein the physical interface includes a pressure-sensitive sensor that detects pressure, is placed at a position corresponding to the feet of the operator, and detects the operation by the operator's foot based on a change in the pressure detected by the pressure-sensitive sensor.
  12.  A medical robot comprising:
      a physical interface configured to be operable by an operator with a body part other than the operator's own hands;
      a robot arm that assists the operator in surgery; and
      a control section that controls the operation of the robot arm in accordance with the operator's operation on the physical interface.
  13.  The medical robot according to claim 12, further comprising a voice recognition section that performs voice recognition based on the voice of the operator, wherein the control section controls the operation of the robot arm in accordance with a result of the voice recognition, and switches between enabling and disabling control of the operation of the robot arm based on the result of the voice recognition in accordance with the operator's operation on the physical interface.
  14.  The medical robot according to claim 13, wherein the control section switches between enabling and disabling the voice recognition by the voice recognition section in accordance with the operator's operation on the physical interface.
  15.  The medical robot according to claim 13, wherein the control section stops the operation of the robot arm while keeping the voice recognition by the voice recognition section enabled, in accordance with the operator's operation on the physical interface.
  16.  The medical robot according to claim 12, wherein the control section brings the operation of the robot arm to an emergency stop in accordance with the operator's operation on the physical interface.
PCT/JP2023/005129 2022-03-10 2023-02-15 Control device and medical robot WO2023171263A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022037472 2022-03-10
JP2022-037472 2022-03-10

Publications (1)

Publication Number Publication Date
WO2023171263A1 true WO2023171263A1 (en) 2023-09-14

Family

ID=87936774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/005129 WO2023171263A1 (en) 2022-03-10 2023-02-15 Control device and medical robot

Country Status (1)

Country Link
WO (1) WO2023171263A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018159155A1 (en) * 2017-02-28 2018-09-07 ソニー株式会社 Medical observation system, control device, and control method
JP2019134914A (en) * 2018-02-05 2019-08-15 ミーレ カンパニー インク. Master console for surgical robot
JP2021502195A (en) * 2017-11-09 2021-01-28 クアンタム サージカル Robotic equipment for minimally invasive medical interventions in soft tissues
JP2021019949A (en) * 2019-07-29 2021-02-18 株式会社メディカロイド Surgery system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23766443

Country of ref document: EP

Kind code of ref document: A1