WO2023032074A1 - Manipulator system and method for inferring shape of manipulator - Google Patents


Info

Publication number
WO2023032074A1
WO2023032074A1 (PCT/JP2021/032120)
Authority
WO
WIPO (PCT)
Prior art keywords
time step
value
manipulator
detection value
estimated value
Prior art date
Application number
PCT/JP2021/032120
Other languages
French (fr)
Japanese (ja)
Inventor
英幸 佐藤 (Hideyuki Sato)
Original Assignee
Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority date
Filing date
Publication date
Application filed by Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority to PCT/JP2021/032120
Publication of WO2023032074A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 18/00: Arms
    • B25J 18/06: Arms flexible
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02: Sensing devices

Definitions

  • the present invention relates to a manipulator system, a manipulator shape estimation method, and the like.
  • Patent Document 1 discloses a technique for estimating the shape of the bending portion of an endoscope, which is an example of a manipulator, by sensing the wire traction amount of the endoscope.
  • however, Patent Document 1 does not take circumstances such as disturbance factors into consideration, and its estimation of the curved shape of the manipulator cannot be said to be sufficient.
  • One aspect of the present disclosure relates to a manipulator system including: a manipulator including a bending portion; an operation unit through which a user performs an operation input to drive the bending portion; a first sensor that is provided in the operation unit and outputs a first detection value related to the operation input amount of the operation input; a second sensor that is provided at the tip of the manipulator and outputs a second detection value related to the motion of the manipulator; and a processing unit including at least one processor. The processing unit receives the first detection value and performs a first estimation process of the shape of the bending portion based on the first detection value to generate a first estimated value indicating the shape of the bending portion, and receives the second detection value and performs a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value to generate a second estimated value indicating the shape of the bending portion.
  • Another aspect of the present disclosure relates to a method for estimating the shape of a manipulator, including: receiving a first detection value from a first sensor that is provided in an operation unit through which a user performs an operation input to drive a bending portion of the manipulator and that outputs the first detection value related to the operation input amount of the operation input; receiving a second detection value from a second sensor that is provided at the tip of the manipulator and that outputs the second detection value related to the motion of the manipulator; generating a first estimated value indicating the shape of the bending portion by performing a first estimation process of the shape of the bending portion based on the first detection value; and generating a second estimated value indicating the shape of the bending portion by performing a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value.
  • FIG. 1 is a block diagram for explaining a configuration example of a manipulator system.
  • FIG. 2 is a diagram schematically explaining an example of a manipulator and an operation unit.
  • FIG. 3 is a diagram explaining an example of wires.
  • A diagram explaining the robustness of bending angle estimation.
  • Diagrams for explaining the flow of processing according to the embodiment.
  • A flowchart for explaining a processing example of the embodiment.
  • Another flowchart for explaining a processing example of the embodiment.
  • A diagram for explaining another example of the flow of bending angle estimation processing.
  • A block diagram illustrating another configuration example of the manipulator system.
  • A diagram for explaining another example of the flow of bending angle estimation processing.
  • A block diagram illustrating another configuration example of the manipulator system.
  • A diagram for explaining another example of the flow of bending angle estimation processing.
  • Flowcharts for explaining other examples of processing according to the embodiment.
  • FIG. 1 is a block diagram illustrating a configuration example of a manipulator system 10 of this embodiment.
  • the manipulator system 10 includes a manipulator 100, an operation unit 200, and a processing unit 400.
  • Operation unit 200 includes a first sensor 310.
  • Manipulator 100 includes a second sensor 320.
  • the configuration of the manipulator system 10 is not limited to that shown in FIG. 1, and various modifications such as adding other components are possible.
  • the manipulator 100 can be configured to include the operation unit 200.
  • FIG. 2 is a diagram schematically illustrating the manipulator 100 and the operation unit 200 of this embodiment.
  • the manipulator 100 of this embodiment includes a curved portion.
  • the manipulator 100 in FIG. 2 includes an outer sheath 110, a plurality of bending pieces 120, and a distal end portion 130 connected to the distal ends of the bending pieces 120.
  • the bending piece 120 is a short cylindrical member made of metal, and the number thereof is not particularly limited.
  • the plurality of bending pieces 120 and the distal end portion 130 are each connected by rotatable connecting portions 140.
  • the manipulator 100 of FIG. 2 can thus function as a bending portion by virtue of this multi-joint structure.
  • the manipulator 100 is not limited to that shown in FIG. 2, and various modifications such as addition of other components are possible.
  • the manipulator 100 may be provided, on its proximal end side, with a flexible portion that bends passively under an external force instead of the bending pieces 120.
  • the flexible section may for example consist of a flexible annular member that bends passively with an external force.
  • the distal end portion 130 may include all or part of a treatment tool, an illumination device, and an imaging device.
  • the illumination device and imaging device included in the distal end portion 130 can be controlled from the outside by passing an optical fiber connected to the illumination device and a cable connected to the imaging device through a cavity (not shown) in the operation unit 200 and the bending pieces 120.
  • the manipulator 100 of the present embodiment can be used, for example, as an endoluminal device such as a medical endoscope or a catheter, as a medical manipulator such as a surgical support robot arm, or as an industrial manipulator such as an industrial endoscope or an industrial robot arm.
  • the specific structures of the bending piece 120, the distal end portion 130, the connecting portion 140, etc. are known, and detailed description thereof will be omitted.
  • the A axis, the UD axis and the LR axis are illustrated in FIGS. 2 and 3 as three mutually orthogonal axes.
  • the direction along the A axis is called the A-axis direction, which is the direction along the longitudinal direction of the manipulator 100.
  • the direction in which the distal end side of the manipulator 100 is inserted into a body cavity, for example, is direction A1
  • the direction in which the manipulator 100 is pulled out is direction A2.
  • the direction along the UD axis is called the UD axis direction
  • the direction along the LR axis is called the LR axis direction.
  • the LR axis, UD axis, and A axis can also be called the X axis, Y axis, and Z axis, respectively.
  • the term "perpendicular" includes not only intersecting at 90° but also intersecting at an angle slightly inclined from 90°.
  • although FIG. 2 shows the manipulator 100 configured only with connecting portions 140 for bending in the UD direction, the present invention is not limited to this; connecting portions 140 that bend in the LR direction may be added as appropriate, or a mechanism for twisting the manipulator 100 may be added. Note that to twist means to rotate the manipulator 100 about the A axis.
  • the operation unit 200 is used by the user to perform an operation input in order to drive the bending portion.
  • the pair of wires 160 can be moved in opposite directions by a wire drive mechanism (not shown) based on an operation input from the operation unit 200 to bend the manipulator 100 in a desired direction.
  • although FIG. 2 shows the manipulator 100 being bent upward in the UD direction by the two wires 160, it may also be bent in the LR direction.
  • for example, as shown in FIG. 3, the manipulator 100 may be configured to include four wires 160: an upper bending wire 160u, a lower bending wire 160d, a left bending wire 160l, and a right bending wire 160r.
  • when the user operates the angle knob 200A of the operation unit 200, the upper bending wire 160u and the lower bending wire 160d move in opposite directions and a specific bending piece 120 rotates, so that the manipulator 100 can be bent in a desired direction along the UD axis.
  • similarly, when the user operates the angle knob 200B of the operation unit 200, the left bending wire 160l and the right bending wire 160r move in opposite directions and a predetermined bending piece 120 rotates, so that the manipulator 100 can be bent in a desired direction along the LR axis.
  • specific examples of the operation unit 200 are not limited to the angle knobs 200A and 200B, nor is the operation limited to manual operation.
  • the wire drive mechanism (not shown) can be realized by a known method, so detailed description thereof will be omitted.
  • the first sensor 310 is provided in the operation unit 200 and outputs a first detection value regarding the operation input amount. Specifically, the first sensor 310 outputs the acquired first detection value to the processing unit 400, which will be described later, through wireless communication or wired communication.
  • the wireless communication here is communication according to wireless communication standards such as Bluetooth (registered trademark) and Wi-Fi (Wireless Fidelity (registered trademark)), but may be other wireless communication standards.
  • the term "wired communication” as used herein refers to communication conforming to a wired communication standard such as USB (Universal Serial Bus), but may be another wired communication standard. For example, by connecting a cable (not shown in FIG. 2) to the processing unit 400 (not shown in FIG. 2), the first sensor 310 can output the first detected value through wired communication.
  • the second sensor 320 is provided at the distal end portion 130 of the manipulator 100 and outputs a second detection value regarding the motion of the manipulator 100. Specifically, similarly to the first sensor 310, the second sensor 320 outputs the acquired second detection value to the processing unit 400, which will be described later, through the above-described wireless communication or wired communication.
  • a wired connection between the second sensor 320 provided at the tip of the manipulator 100 and the processing unit 400 outside the manipulator 100 can be realized by passing a cable connected to the second sensor 320 through the hollow portion of the bending pieces 120 described above.
  • the processing unit 400 receives the first detection value output by the first sensor 310 and the second detection value output by the second sensor 320.
  • the processing unit 400 is configured by the following hardware.
  • the hardware may include circuitry for processing digital signals and/or circuitry for processing analog signals.
  • the hardware can consist of one or more circuit devices or one or more circuit elements mounted on a circuit board.
  • the one or more circuit devices are, for example, ICs (Integrated Circuits), FPGAs (field-programmable gate arrays), or the like.
  • the one or more circuit elements are, for example, resistors, capacitors, and the like.
  • processing unit 400 is realized by including at least one of the following processors.
  • Processing unit 400 includes a memory that stores information and a processor that operates on the information stored in the memory.
  • the information is, for example, programs and various data.
  • a processor includes hardware.
  • Various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a DSP (Digital Signal Processor) can be used as the processor.
  • the memory may be a semiconductor memory such as SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory), a register, a magnetic storage device such as an HDD (Hard Disk Drive), or an optical storage device such as an optical disc device.
  • the memory stores computer-readable instructions, and the instructions are executed by the processor to implement a part or all of the functions of the units of the processing unit 400 as processes.
  • the instruction here may be an instruction set that constitutes a program, or an instruction that instructs a hardware circuit of a processor to perform an operation.
  • all or part of each part of the processing unit 400 can be realized by cloud computing, and each process described later with reference to FIG. 7 and the like can be performed on cloud computing.
  • the processing unit 400 estimates the shape of the bending portion of the manipulator 100.
  • for example, Japanese Unexamined Patent Application Publication No. 2002-100000 discloses a method of estimating the shape of an endoscope using the traction amount of the wires 160 and the operation input amount of the operation unit 200. By using such a method, the processing unit 400 can estimate the bending angle of the bending portion of the manipulator 100 based on the first detection value from the first sensor 310 described above.
  • however, simply applying the method disclosed in Patent Document 1 or the like does not take into account disturbance factors that cannot be controlled from the input device, so the bending angle of the bending portion of the manipulator 100 cannot be estimated accurately.
  • the actual bending angle of the bending portion of the manipulator 100 changes over time as indicated by B1 as the user operates the operation unit 200 .
  • on the other hand, when the bending angle is estimated based only on the first detection value, the estimated result is as shown in B2, which differs from the behavior shown in B1.
  • as indicated by C1, a difference occurs between the actual bending angle value and the estimated bending angle value at the start of estimation. This is because the shape of the manipulator 100 at the start of estimation differs from that assumed, and thus the shape of the wires 160 also differs.
  • as indicated by C2, there is a difference in the slope of the bending angle change.
  • this is because the shape of the wires 160 also differs depending on the shape of the manipulator 100, which affects the responsiveness to operation inputs on the operation unit 200. Also, as indicated by C3, there is a difference in the timing at which the bending angle changes. This is because even when the input operation on the operation unit 200 is started, bending or deterioration of the wires 160 delays the start of the shape change of the bending portion more than expected.
  • estimating the bending angle of the bending portion based only on the first detection value from the first sensor 310 has the problem that the accuracy of the estimated angle is low.
  • on the other hand, when the bending angle is estimated based only on the second detection value from the second sensor 320, the precision of the estimation may be low. This is because when the tip of the manipulator 100 is thin, the second sensor 320 mounted on it is itself very small, so the second detection value acquired by the second sensor 320 contains a lot of noise.
  • low accuracy of estimation here means, specifically, that the estimated bending angle value deviates greatly from the actual bending angle value, for example.
  • low precision of estimation here means, specifically, that the estimated bending angle values vary greatly.
  • for example, as shown in FIG. 5, assume that the actual bending angle is the value indicated by E at the timing when a predetermined period t0 has elapsed from the start of estimation, and that the bending angle is estimated based on the first detection value and on the second detection value.
  • suppose the bending angles of the bending portion estimated based on the first detection value are the values indicated by F1, F2, F3, and F4, and the bending angles estimated based on the second detection value are the values indicated by G1, G2, G3, and G4. Since estimation based on the first detection value is less accurate than estimation based on the second detection value, the values F1 to F4 are farther from the value E than the values G1 to G4. On the other hand, since estimation based on the second detection value is less precise than estimation based on the first detection value, the variation of G1 to G4 is larger than that of F1 to F4. That is, estimates based on the first detection values acquired by the first sensor 310 belong to the data category indicated by F1 to F4 in FIG. 5, and estimates based on the second detection values acquired by the second sensor 320 belong to the data category indicated by G1 to G4 in FIG. 5.
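The accuracy/precision distinction illustrated by E, F1 to F4, and G1 to G4 can be sketched numerically; the sample values below are hypothetical and chosen only to reproduce the qualitative picture of FIG. 5.

```python
# Hypothetical samples illustrating FIG. 5: E is the actual bending angle,
# F1..F4 are first-detection-value estimates (precise but inaccurate),
# G1..G4 are second-detection-value estimates (accurate but imprecise).
E = 30.0  # actual bending angle in degrees (assumed for illustration)

F = [24.8, 25.0, 25.1, 24.9]  # tight cluster far from E: low accuracy, high precision
G = [27.0, 33.0, 28.5, 31.5]  # centered near E, widely spread: high accuracy, low precision

def bias(samples, truth):
    """Accuracy error: distance of the sample mean from the true value."""
    return abs(sum(samples) / len(samples) - truth)

def spread(samples):
    """Precision error: range of the samples."""
    return max(samples) - min(samples)

# F is more precise (smaller spread) but less accurate (larger bias) than G.
assert spread(F) < spread(G)
assert bias(F, E) > bias(G, E)
```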
  • the processing unit 400 of the present embodiment includes a first estimation processing section 410 that executes a first estimation process for the shape of the bending portion of the manipulator 100 and a second estimation processing section 420 that executes a second estimation process for the shape of the bending portion of the manipulator 100.
  • the processing unit 400 of the present embodiment executes a first estimation process of the shape of the curved portion of the manipulator 100 and a second estimation process of the shape of the curved portion of the manipulator 100 .
  • first estimation processing section 410 performs first estimation processing based on the first detection value received from first sensor 310 to generate a first estimation value.
  • for example, an arithmetic model formula for calculating the bending angle of the bending portion based on the operation input amount of the operation unit 200 is stored in a storage unit (not shown), and the first estimation processing section 410 can realize the first estimation process by estimating the bending angle of the bending portion based on the received first detection value and the corresponding arithmetic model formula. The details of the arithmetic model formula will be described later.
  • the first estimation process may instead use an arithmetic model that calculates the bending angle of the bending portion based on the traction amount of the wires 160, or may directly use a stored relationship between the operation input amount and the bending angle instead of a mathematical formula; various modifications are possible.
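As a rough sketch of the first estimation process, the conversion from operation input amount to bending angle might look as follows; the linear coefficients and table breakpoints are illustrative assumptions, not the patent's actual arithmetic model.

```python
# Sketch of the first estimation process: convert the operation input amount
# (knob rotation angle) into a bending-angle estimate. The linear model and
# its coefficients are illustrative assumptions.
def first_estimation(knob_angle_deg, gain=2.0, offset=0.0):
    """First estimated value: bending angle from the operation input amount."""
    return gain * knob_angle_deg + offset

# Table-based variant, since the text notes a stored input/angle relationship
# may replace the formula (breakpoints are hypothetical):
TABLE = [(0.0, 0.0), (30.0, 55.0), (60.0, 120.0)]  # (knob angle, bending angle)

def first_estimation_table(knob_angle_deg):
    """Piecewise-linear interpolation over a stored input/angle relationship."""
    x0, y0 = TABLE[0]
    for x1, y1 in TABLE[1:]:
        if knob_angle_deg <= x1:
            t = (knob_angle_deg - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
        x0, y0 = x1, y1
    return TABLE[-1][1]  # clamp beyond the last breakpoint
```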
  • the second estimation processing unit 420 performs the second estimation process based on the second detection value received from the second sensor 320 and the first estimation value to generate the second estimation value. More specifically, for example, the second estimation processing unit 420 performs a process of generating a second estimated value by correcting the first estimated value using the second detected value, a specific method of which will be described later.
  • as described above, the manipulator system 10 of the present embodiment includes a manipulator 100 including a bending portion, an operation unit 200 through which a user performs an operation input to drive the bending portion, a first sensor 310, a second sensor 320, and a processing unit 400 including at least one processor.
  • the first sensor 310 is provided in the operation unit 200 and outputs a first detection value regarding the amount of operation input of the operation input.
  • the second sensor 320 is provided at the tip of the manipulator 100 and outputs a second detection value regarding the motion of the manipulator 100.
  • the processing unit 400 receives the first detection value and performs a first estimation process of the shape of the bending portion based on the first detection value to generate a first estimated value indicating the shape of the bending portion.
  • with this configuration, the processing unit 400 can obtain the first detection value related to the input operation from the first sensor 310 and the second detection value related to the output motion of the tip of the manipulator 100 from the second sensor 320. The processing unit 400 can thereby estimate the curved shape of the bending portion of the manipulator 100 using both the first detection value and the second detection value. Furthermore, since the processing unit 400 performs the first estimation process and the second estimation process, it not only generates the first estimated value by the first estimation process based on the first detection value, but also generates the second estimated value by the second estimation process based on the first estimated value and the second detection value.
  • in other words, by combining the first detection value, which is more precise than the second detection value, with the second detection value, which is more accurate than the first detection value, a bending angle closer to the actual bending angle of the bending portion of the manipulator 100 can be estimated.
  • the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the method for estimating the shape of the manipulator 100 according to the present embodiment includes receiving the first detection value from the first sensor 310, which is provided in the operation unit 200 through which the user performs an operation input to drive the bending portion of the manipulator 100 and which outputs the first detection value related to the operation input amount of the operation input.
  • the shape estimation method of the manipulator 100 also includes receiving the second detection value from the second sensor 320, which is provided at the tip of the manipulator 100 and which outputs the second detection value regarding the movement of the manipulator 100.
  • the shape estimation method of the manipulator 100 further includes generating the first estimated value indicating the shape of the bending portion by executing the first estimation process of the shape of the bending portion based on the first detection value, and generating the second estimated value indicating the shape of the bending portion by executing the second estimation process of the shape of the bending portion based on the first estimated value and the second detection value. By doing so, effects similar to those described above can be obtained.
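One time step of the two-stage method above can be sketched as follows; the model gain and the blend factor (standing in for the Kalman-gain correction described later) are hypothetical.

```python
# High-level sketch of one time step of the shape estimation method:
# receive both detection values, run the first estimation, then correct it
# with the second detection value. All numeric details are illustrative.
def estimate_shape_step(first_detection, second_detection,
                        model_gain=2.0, blend=0.3):
    # First estimation process: operation input amount -> bending angle.
    first_estimate = model_gain * first_detection
    # Second estimation process: correct the first estimate toward the angle
    # implied by the tip sensor (a simple blend stands in for the
    # Kalman-gain correction described later in the document).
    second_estimate = first_estimate + blend * (second_detection - first_estimate)
    return first_estimate, second_estimate
```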
  • the second estimation processing unit 420 may perform the second estimation processing further using other information such as specific design information.
  • the specific design information is information based on the design information of the manipulator 100, and is, for example, information indicating the respective displacements and orientations of the bending piece 120, the connecting portion 140, etc. that constitute the manipulator 100 according to the bending angle. By doing so, the second estimation processing unit 420 can more appropriately perform the second estimation processing.
  • the series of processes described above with reference to FIG. 6 is performed at regular time intervals. This can be realized, for example, by storing a program corresponding to the processing examples of the flowcharts shown in FIGS. 7 and 8 and executing it with the processing unit 400.
  • specifically, the processing unit 400 first performs a manipulator system startup process (step S100): the user turns on the power of the hardware constituting the processing unit 400, and processing for activating an application program related to the method described later is performed. After that, a process of determining whether or not an operation to end the manipulator system 10 has been performed is carried out (step S200). When the end operation has been performed, end processing (step S210) is performed, and the processing of the entire manipulator system 10 ends.
  • until the end operation is performed, the process of step S200 is repeated, and during this time interrupt processing is performed.
  • the interrupt processing here is, for example, bending angle estimation processing (step S300) performed by a timer interrupt with a predetermined cycle, as shown in FIG. 8.
  • the series of processes shown in FIG. 6 is performed in the bending angle estimation processing (step S300) of one time step. The same applies to the modifications described later with reference to FIGS. 10, 12, 14, and the like.
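The startup/end-check/interrupt flow can be sketched as follows, with the timer interrupt emulated by a per-iteration tick; the step names follow the flowcharts, and everything else is illustrative.

```python
# Sketch of the control flow of the flowcharts: startup (S100), a loop that
# checks for an end operation (S200), periodic bending-angle estimation
# (S300) standing in for the timer interrupt, and shutdown (S210).
# The tick period and end condition are illustrative.
def run_manipulator_system(end_after_ticks=5):
    log = []
    log.append("S100 startup")
    tick = 0
    while True:
        # S200: has the user performed the end operation?
        if tick >= end_after_ticks:
            log.append("S210 end processing")
            break
        # Timer interrupt with a predetermined cycle: S300 runs once per tick.
        log.append(f"S300 bending angle estimation (time step {tick})")
        tick += 1
    return log
```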
  • k is a time step in discrete processing and is also called time or timing.
  • the first detection value or the like at the previous time step may be used.
  • the (k-1)-th time step may be called the first time step, and the k-th time step may be called the second time step. That is, the second time step is a time step after the first time step.
  • the first sensor 310 will be specifically described.
  • for example, when the operation unit 200 is the angle knobs 200A and 200B of FIG. 2, the first sensor 310 is a sensor that can grasp the amount of rotation of the angle knobs 200A and 200B.
  • the first sensor 310 may be a linear encoder that measures the amount of movement of a direct-acting member such as the wire 160, as long as the operation input amount of the operation input can be grasped.
  • the measurement method of the first sensor 310 may be an optical method or a magnetic method, and is not particularly limited.
  • the second sensor 320 will be specifically described.
  • the second sensor 320 is, for example, a two-axis or three-axis acceleration sensor, and outputs detected acceleration data to the processing unit 400.
  • the second sensor 320 may integrate the detected acceleration data with an integrator (not shown) and output the result to the processing unit 400 as velocity data.
  • the two axes here are the aforementioned X axis and Y axis
  • the three axes are the X axis, Y axis, and Z axis.
  • the second sensor 320 may be a 2-axis or 3-axis gyro sensor, and may output the detected angular acceleration data or angular velocity data to the processing unit 400 described later by wireless or wired communication.
  • the second sensor 320 may be a motion sensor including a 2-axis or 3-axis acceleration sensor and a 2-axis or 3-axis gyro sensor.
  • the motion sensor may refer to either one of an acceleration sensor and a gyro sensor.
  • a motion sensor is also called an inertial measurement unit (IMU).
  • although FIG. 2 illustrates the second sensor 320 as being provided at the distal end portion 130, which is the tip of the manipulator 100, it may be provided at any part that moves in accordance with the motion of the bending portion.
  • the second sensor 320 may also be, for example, a magnetic position sensor; its location is not limited as long as at least one portion of the position measurement arrangement is at a part that does not move according to the motion of the bending portion and at least another portion is at a part that moves according to the motion of the bending portion.
  • an imaging device may be used instead of the second sensor 320. For example, by measuring the movement amount and rotation angle of the imaging device between imaging times from the amount of movement of feature points in the images and the distance between the feature points and the imaging device estimated from the images, the imaging device can realize the same function as the second sensor 320.
  • the second sensor 320 is at least one of an acceleration sensor, an angular velocity sensor, a position sensor, and an imaging device arranged on the tip side of the manipulator 100.
  • the second detection value is a detection value based on at least one of the position, displacement, velocity, acceleration, angle, and angular velocity of the tip of the manipulator 100 .
  • in the following, an example will be described in which the second sensor 320 is an acceleration sensor and the second detection value is a detection value based on acceleration, but the method can be similarly applied to the other sensors described above.
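As a sketch of why a tip-mounted acceleration sensor carries shape information: when the tip is at rest, the sensor measures only gravity, so the inclination of the tip follows from the measured components. The axis assignment below (A axis along the tip) is an assumption for illustration.

```python
import math

# Sketch of how a tip-mounted acceleration sensor yields a bending-related
# detection value: at rest, the sensor measures only gravity, and the tilt
# of the tip relative to the gravity vector follows from the measured
# components. Axis assignment is an assumption.
def tip_tilt_deg(acc_a, acc_ud):
    """Tilt of the tip axis from vertical, from A-axis and UD-axis accelerations."""
    return math.degrees(math.atan2(acc_ud, acc_a))

# Example: gravity split equally between the two axes gives a 45-degree tilt.
angle = tip_tilt_deg(9.81 / math.sqrt(2), 9.81 / math.sqrt(2))
```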
  • for example, when the operation unit 200 is the angle knobs 200A and 200B shown in FIG. 2, the first detection value indicating the operation input amount of the operation unit 200 can be expressed as the rotation angles of the angle knobs 200A and 200B.
  • the bending angle of the bending portion of the manipulator 100 can be expressed by a formula that considers both the bending in the UD direction and the bending in the LR direction described above with reference to FIG. 2.
  • a first estimated value generated by the first estimation process can be represented by the following equation (3).
  • the coefficients used in Equation (3) are stored in advance in, for example, a storage unit (not shown) in the processing unit 400, but may be stored in an external server, a storage unit in cloud computing, or the like.
  • the first estimation processing section 410 of the processing unit 400 performs the first estimation process based on the first detection value and generates the first estimated value. That is, in the manipulator system 10 of the present embodiment, in the first estimation process, the processing unit 400 converts the first detection value into the first estimated value using an arithmetic model in which the relationship between the operation input and the shape of the bending portion is modeled. By doing so, the bending angle of the bending portion of the manipulator 100, which is difficult to visually confirm directly, can be estimated based on the operation input amount that the user can control.
  • the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the method for estimating the shape of the manipulator 100 according to the present embodiment includes converting the first detection value into the first estimated value in the first estimation process using an arithmetic model in which the relationship between the operation input and the shape of the bending portion is modeled. By doing so, an effect similar to that described above can be obtained.
  • the observation equation can be represented by Equation (5).
  • the relationship between the acceleration in the coordinate system of the distal end portion 130 and the gravitational acceleration in the world coordinate system can be expressed by Equation (6) using a rotation matrix.
  • the rotation matrix is derived from the curvature angle.
  • since the manipulator 100 has a multi-joint structure, if it is composed of i connecting parts 140, the homogeneous transformation matrix from the world coordinate system to the coordinate system of the distal end portion 130 can be expressed as Equation (7) using the Denavit-Hartenberg notation.
  • m is the coordinate of the root position of the curved portion.
  • the homogeneous transformation matrix can be expressed by Equation (8).
  • the rotation matrix from the world coordinate system to the distal end portion 130 can be expressed by Equation (9).
  • equation (10) can be derived from equations (6) and (9).
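  • the chain from rotation matrix to expected tip acceleration can be sketched as below. This is not the patent's exact Equations (6)–(10): a single rotation about one axis stands in for the product of i Denavit-Hartenberg transforms, and the gravity vector and axis choice are illustrative assumptions.

```python
# Illustrative sketch: for a planar single-segment bend of angle theta,
# build the rotation matrix from the world frame to the distal-end frame
# and predict the static acceleration the tip accelerometer should
# observe under gravity (a simplified stand-in for Eq. (6)-(10)).
import numpy as np

G_WORLD = np.array([0.0, 0.0, -9.81])  # gravity in the world frame (assumed axes)

def rotation_world_to_tip(theta: float) -> np.ndarray:
    """Rotation about the y-axis by the bending angle theta (rad) --
    a simplified stand-in for the product of DH transforms in Eq. (7)-(9)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def predicted_tip_acceleration(theta: float) -> np.ndarray:
    """Gravity expressed in the tip frame (cf. Eq. (6)): a_tip = R^T g."""
    return rotation_world_to_tip(theta).T @ G_WORLD

print(predicted_tip_acceleration(np.pi / 2))
```

  • a real implementation would compose one DH transform per connecting part 140 and extract the 3x3 rotation block of the resulting homogeneous matrix.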
  • since the acceleration at the distal end portion 130 is the second detection value obtained by the second sensor 320 as described above, the observation equation of the extended Kalman filter can be expressed by Equation (11).
  • next, the time update formulas of the extended Kalman filter are given. Differentiating the function on the right side of Equation (1) with respect to x yields Equation (12).
  • similarly, differentiating the right side of Equation (11) with respect to x yields Equation (13).
  • the second estimated value is represented by a relational expression based on the first estimated value, the Kalman gain, and the second detected value.
  • the accuracy of the first detection value is low, so the accuracy of the first estimated value is also low, while the second detection value is low in precision but high in accuracy. Therefore, according to Equation (18), the second estimation processing unit 420 repeatedly corrects the first estimated value so as to approach the actual bending angle and outputs the corrected value as the second estimated value.
  • the processing unit 400 estimates the second estimated value by the second estimation process using the Kalman filter. By estimating the second estimated value from the high-precision first estimated value and the high-accuracy second detection value, the bending angle can be estimated closer to the true value than when it is estimated from only one of the first detection value and the second detection value.
  • the method of this embodiment may be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method of the manipulator 100 of this embodiment includes estimating the second estimated value by the second estimation process using the Kalman filter. By doing so, an effect similar to that described above can be obtained.
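  • the correction step described above can be sketched in scalar form. This is a minimal illustration of a Kalman correction in the spirit of Equation (18), not the patent's multi-dimensional filter; the noise variances and the linearized observation Jacobian `h` are assumed values.

```python
# Minimal scalar sketch of the second estimation process: the first
# estimated value (from the operation input) is corrected toward the
# observation derived from the second detection value via a Kalman gain.
# All noise variances here are illustrative assumptions.

def ekf_update(x_pred, p_pred, z, h=1.0, r=0.5):
    """One Kalman correction step.
    x_pred: first estimated value (a priori state)
    p_pred: a priori error covariance
    z:      observation derived from the second sensor 320
    h:      linearized observation Jacobian (cf. Eq. (13))
    r:      observation noise variance"""
    k = p_pred * h / (h * p_pred * h + r)      # Kalman gain
    x_new = x_pred + k * (z - h * x_pred)      # second estimated value
    p_new = (1.0 - k * h) * p_pred             # updated covariance
    return x_new, p_new, k

x, p, k = ekf_update(x_pred=30.0, p_pred=4.0, z=33.0)
print(x, p, k)
```

  • in the actual extended Kalman filter the Jacobians of Equations (12) and (13) replace the scalar `h`, but the gain-weighted blend of prediction and observation is the same.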
  • the first estimated value and the second estimated value are expressed in the dimension of the operation input amount of the operation unit 200, that is, the angle. That is, in the manipulator system 10 of this embodiment, the first estimated value and the second estimated value are the bending angles of the bending portion.
  • the method of the present embodiment is not limited to the above, and various modifications are possible.
  • the first detection value or the like obtained at the previous time step may be used instead of the first detection value or the like obtained at the current time step.
  • the first estimation processing unit 410 may perform the first estimation process using, for example, the difference value between the first detection value detected at the previous time step and the first detection value detected at the current time step. Further, as shown in FIG. 10, the first estimation processing section 410 may perform the first estimation process using the second estimated value generated at the previous time step together with the aforementioned difference value.
  • the first estimation processing unit 410 calculates the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimation value at the first time step. Based on this, a first estimation process may be performed at the second time step to generate a first estimated value.
  • the time update formulas for the second estimation process performed by the second estimation processing unit 420 in FIG. 10 can also be obtained by rewriting the above-described formulas (14) to (19) in the same manner, so their description is omitted.
  • the processing unit 400 receives the first detection value and the second detection value at a first time step and at a second time step, which is a time step after the first time step, and generates a first estimated value and a second estimated value. Further, the processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. Further, the processing unit 400 generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step.
  • the first estimated value generated by the first estimation process can be a more accurate value.
  • the difference value here is the amount of change in the first estimated value between time steps, and corresponds to the slope of the behavior shown in B2 in FIG. That is, by continuing to execute the bending angle estimation process (step S300) of FIG. 6 according to the method of the present embodiment, the first estimation process continues to be executed based on the corrected second estimated value. As a result, the difference value is also corrected, so the difference in the degree of inclination indicated by C2 in FIG. 4 can be reduced.
  • the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the shape estimation method of the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value at the first time step and at the second time step, which is the time step after the first time step, and generating the first estimated value and the second estimated value.
  • the method for estimating the shape of manipulator 100 is based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. generating a first estimate by performing a first estimation process at a second timestep.
  • the shape estimation method of the manipulator 100 includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step. By doing so, an effect similar to that described above can be obtained.
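  • the difference-based first estimation described above can be sketched as follows. The gain `A_GAIN` and the function name are assumptions for illustration; the point is only that the corrected second estimated value of the previous step is advanced by the model-scaled increment of the first detection value.

```python
# Hedged sketch of the modified first estimation process: at the second
# time step, the first estimated value is formed from the second estimated
# value of the previous time step plus the model-scaled difference of the
# first detection values. The gain below is an assumed model coefficient.

A_GAIN = 0.85  # assumed coefficient relating knob angle to bending angle

def first_estimation_diff(x2_prev: float, u_prev: float, u_now: float) -> float:
    """x1(t2) = x2(t1) + a * (u(t2) - u(t1)):
    advance the previous corrected estimate by the operation increment."""
    return x2_prev + A_GAIN * (u_now - u_prev)

# previous corrected estimate 25.0 deg, knob moved from 30 deg to 40 deg
print(first_estimation_diff(25.0, 30.0, 40.0))
```

  • because each step restarts from the corrected second estimated value, errors in the difference model do not accumulate across time steps.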
  • the processing unit 400 may further include a predetermined processing section 425 as shown in FIG. 11, for example.
  • the predetermined processing unit 425 includes a noise removal filter that removes noise from the second detection value.
  • the predetermined processing unit 425 may include a low-pass filter. By doing so, for example, when the second detection value includes noise caused by a high-frequency treatment device arranged at the distal end portion 130, the second detection value after the predetermined processing can be made free of that noise.
  • the predetermined processing unit 425 may include a high-pass filter. By doing so, for example, when the second detection value contains drift noise, the second detection value after the predetermined processing can be made not to contain the drift noise.
  • the predetermined processing unit 425 may be a Kalman filter.
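  • as one concrete option for the noise-removal filter above, a first-order IIR low-pass filter could be applied to the stream of second detection values. The smoothing factor is an illustrative choice, not a value from this disclosure.

```python
# Simple sketch of a low-pass option for the predetermined processing
# unit 425: a first-order IIR filter over the second detection values.

def low_pass(samples, alpha=0.2):
    """y[n] = alpha * x[n] + (1 - alpha) * y[n-1]."""
    y = samples[0]
    out = [y]
    for x in samples[1:]:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out

noisy = [0.0, 10.0, 0.0, 10.0, 0.0]  # high-frequency noise around 5.0
print(low_pass(noisy))
```

  • a high-pass variant for drift removal can be built the same way by subtracting the low-pass output from the raw signal.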
  • a predetermined process may be performed at the time step based on the second detection value after the predetermined process at the previous time step and the second detection value at the time step.
  • the processing unit 400 may generate the second detection value after the predetermined processing by executing the predetermined processing at the second time step based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step.
  • the second estimation processing section 420 may perform the second estimation processing based on the second detection value after the predetermined processing and the above-described first estimation value at the time step.
  • the processing unit 400 receives the first detection value and the second detection value at the first time step and at the second time step, and generates the first estimated value and the second estimated value. Further, the processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. Further, the processing unit 400 generates the second detection value after the predetermined processing by executing the predetermined processing based on the second detection values at the first time step and the second time step.
  • the processing unit 400 generates the second detection value after the predetermined processing by executing the predetermined processing at the second time step based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step. Also, the processing unit 400 generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. By doing so, the second estimation processing unit 420 can perform the second estimation process based on the second detection value from which noise and the like have been removed by the predetermined processing. Thereby, the second estimation processing unit 420 can generate a more appropriately estimated second estimated value.
  • the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the shape estimation method of the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value at the first time step and at the second time step, which is the time step after the first time step, and generating the first estimated value and the second estimated value.
  • the method for estimating the shape of manipulator 100 is based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. generating a first estimate by performing a first estimation process at a second timestep.
  • the method for estimating the shape of the manipulator 100 includes generating the second detection value after the predetermined processing by executing the predetermined processing based on the second detection values at the first time step and the second time step. Further, the shape estimation method of the manipulator 100 includes generating the second detection value after the predetermined processing by executing the predetermined processing at the second time step based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step. Further, the shape estimation method of the manipulator 100 includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. By doing so, an effect similar to that described above can be obtained.
  • the processing unit 400 may further include a third estimation processing section 430, as shown in the block diagram of FIG. 13, for example.
  • the third estimation processing unit 430 performs a third estimation process to generate a third estimated value, based on the second detection value after the predetermined processing and the second estimated value at the previous time step, and the difference value at the current time step.
  • the processing unit 400 performs the third estimation process to generate the third estimated value. That is, the third estimated value is the second detection value at the current time step estimated based on the operation input amount between the time steps, taking into account the noise-removed value at the previous time step.
  • the third estimation processing section 430 may perform the third estimation processing further using the aforementioned specific design information.
  • the predetermined processing unit 425 in FIG. 14 performs the predetermined processing based on the second detection value at the current time step and the aforementioned third estimated value. Specifically, the second detection value and the third estimated value at the current time step are passed through a linear Kalman filter to remove noise. That is, the second detection value after the predetermined processing in the example of FIG. 14 is the estimated value obtained using the linear Kalman filter. As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 generates the second detection value after the predetermined processing by performing the predetermined processing using the Kalman filter. Also, the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100. That is, the method for estimating the shape of the manipulator 100 of the present embodiment includes generating the second detection value after the predetermined processing by the predetermined processing using the Kalman filter.
  • the second estimation processing unit 420 performs the second estimation process based on the second detection value after the predetermined processing at that time step and the first estimated value at that time step. That is, the processing unit 400 generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. In other words, the second estimation processing unit 420 performs the second estimation process at the second time step based on the first estimated value and the second detection value estimated by the linear Kalman filter.
  • the processing of the third estimation processing section 430 and the predetermined processing section 425 in FIG. 14 will be described in more detail.
  • the third estimated value of the second time step is obtained by using the second estimated value at the first time step, the difference value at the second time step, and the second detected value after predetermined processing at the first time step, as follows: It can be represented by the formula (21).
  • the state equation of the linear Kalman filter can be expressed by Equation (22) using Equation (21).
  • the observation equation in this case can be represented by Equation (23).
  • the time update formulas in the predetermined processing using the linear Kalman filter can be expressed by the following formulas (24), (25), (26), (27), (28), and (29).
  • the noise removal filter included in the predetermined processing unit 425 may cause changes such as amplitude attenuation and phase delay in the second detection value after the predetermined processing compared to the second detection value before the predetermined processing. For example, if a low-pass filter is used as the noise removal filter and its cut-off frequency is lowered to increase the S/N of the second detection value, the amplitude of the original signal, which is not noise, may be attenuated.
  • the linear Kalman filter is applied to the third estimated value, which is the second detected value estimated from the input information at the current time step, and the second detected value, which is the actual observed value.
  • the second detection value can be estimated without being affected by amplitude attenuation or phase delay. That is, by estimating the true value of the second detection value using the linear Kalman filter, it is possible to extract the original signal that is not noise while removing noise from the second detection value. As a result, the precision of the second detection value, which has high accuracy but low precision, can be improved. Thereby, the second estimation processing unit 420 can perform the second estimation process more appropriately by using the second detection value with improved precision.
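  • the linear-Kalman-filter denoising described above can be sketched in scalar form. This is not the patent's exact Equations (21)–(29): the model-predicted third estimated value serves as the prediction and the raw second detection value as the observation, and the noise variances `q` and `r` are assumed values.

```python
# Scalar sketch of the predetermined processing with a linear Kalman
# filter: fuse the third estimated value (second detection value
# predicted from the operation input) with the raw sensor sample.
# Noise variances q and r are illustrative assumptions.

def kf_denoise(x3_pred, p_prev, z_raw, q=0.01, r=1.0):
    """Return the second detection value after the predetermined
    processing, plus the updated error covariance."""
    p_pred = p_prev + q                       # time update
    k = p_pred / (p_pred + r)                 # Kalman gain
    x_post = x3_pred + k * (z_raw - x3_pred)  # measurement update
    p_post = (1.0 - k) * p_pred
    return x_post, p_post

x, p = kf_denoise(x3_pred=9.8, p_prev=0.04, z_raw=10.3)
print(x, p)
```

  • unlike a fixed low-pass filter, the blend here tracks the model prediction, so sharp true motions are not attenuated or delayed the way they would be by a narrow passband.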
  • an output value from an encoder, an acceleration sensor, a gyro sensor, or the like that measures the axial rotation operation amount of the manipulator 100, or an output value from a position sensor or linear scale that measures the axial displacement operation amount, may also be used for the third estimation process. This is because these output values are input data that affect the second detection value acquired by the second sensor 320.
  • the processing unit 400 receives the first detection value and the second detection value at the first time step and at the second time step, and generates the first estimated value and the second estimated value. Further, the processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. Further, the processing unit 400 generates the second detection value after the predetermined processing by executing the predetermined processing based on the second detection values at the first time step and the second time step.
  • the processing unit 400 generates the third estimated value by performing the third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step.
  • the processing unit 400 generates the second detection value after the predetermined processing at the second time step by performing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step.
  • the processing unit 400 generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  • the processing unit 400 can perform a more appropriate second estimation process based on the second detection value after the predetermined process.
  • the method of the present embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the shape estimation method of the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value at the first time step and at the second time step, which is the time step after the first time step, and generating the first estimated value and the second estimated value.
  • the method for estimating the shape of manipulator 100 is based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. generating a first estimate by performing a first estimation process at a second timestep.
  • the method for estimating the shape of the manipulator 100 includes generating the second detection value after the predetermined processing by executing the predetermined processing based on the second detection values at the first time step and the second time step. Further, the shape estimation method of the manipulator 100 includes generating the third estimated value by performing the third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step. Further, the shape estimation method of the manipulator 100 includes generating the second detection value after the predetermined processing at the second time step by performing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step.
  • the shape estimation method of the manipulator 100 includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  • the above-described bending angle estimation process (step S300) may not be executed while the bending angle is not being adjusted.
  • the processing unit 400 executes the interrupt prohibition process (step S110) after executing the manipulator system start-up process (step S100) described above with reference to FIG. More specifically, the processing unit 400 performs processing for prohibiting timer interrupt processing including the aforementioned bending angle estimation processing (step S300). Then, the processing unit 400 performs a process (step S120) for determining whether or not there is an input from the user.
  • in step S200, a process of determining whether or not an operation to end the manipulator system 10 has been performed is executed.
  • when the processing unit 400 determines that the operation to end the manipulator system 10 has not been performed (NO in step S200), the processing of step S120 is executed again. In other words, after the manipulator system 10 is activated, an infinite loop of steps S120 and S200 is formed as long as there is no input from the user. Not executing the bending angle estimation process (step S300) means that the first estimation process and the second estimation process are not performed.
  • when the processing unit 400 determines that there is an input from the user (YES in step S120), it performs an interrupt permission process (step S130). More specifically, the processing unit 400 performs processing for permitting timer interrupt processing, including the aforementioned bending angle estimation processing (step S300).
  • the process of step S120 can be realized by, for example, a process of determining whether or not the first detection value acquired by the first sensor 310 is equal to or greater than a certain threshold, but is not limited to this.
  • the processing unit 400 may include an imaging device (not shown) and determine whether the user is performing an input operation based on the user's captured image. The processing unit 400 may also determine whether the user is performing an input operation based on, for example, a motion tracker worn by the user.
  • when the user does not perform an operation input, the processing unit 400 does not perform the first estimation process and the second estimation process. Normally, when the user does not operate, the position of the tip portion 130 of the manipulator 100 and the like do not change, so the second estimated value does not need to be updated.
  • however, when the manipulator 100 is a medical manipulator inserted into a body cavity, for example, peristalsis of the large intestine may move the distal end portion 130 of the manipulator 100 even if the user does not perform an input operation on the operation unit 200, changing the second detection value. In that case, the second estimated value may be updated even though the user does not operate the operation unit 200.
  • in this embodiment, the first estimation process and the second estimation process are not performed when the user does not operate the operation unit 200, so that such unexpected fluctuations in the second estimated value can be prevented.
  • the method of this embodiment may be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method of the manipulator 100 of this embodiment includes not performing the first estimation process and the second estimation process when the user does not perform an operation input. By doing so, an effect similar to that described above can be obtained.
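  • the input-gated flow above (steps S110–S300) can be sketched as follows. The threshold value and the callback shape are placeholder assumptions; the patent only states that estimation may be gated on whether the first detection value indicates a user input.

```python
# Hypothetical sketch of gating the bending angle estimation on user
# input: estimation (step S300) runs only while step S120 detects input.
# The threshold is an assumed placeholder value.

INPUT_THRESHOLD = 0.5  # assumed threshold on the first detection value

def should_estimate(first_detection_value: float) -> bool:
    """Step S120: treat the user as operating only when the magnitude of
    the first detection value reaches the threshold."""
    return abs(first_detection_value) >= INPUT_THRESHOLD

def control_step(first_detection_value, estimate_fn):
    """Run the bending angle estimation only when input is detected;
    otherwise skip, leaving the previous second estimated value as-is."""
    if should_estimate(first_detection_value):
        return estimate_fn()
    return None  # estimation skipped; second estimated value not updated

print(control_step(0.1, lambda: "estimated"))  # below threshold: skipped
print(control_step(0.9, lambda: "estimated"))  # input detected: estimation runs
```

  • the same gate could instead be driven by an imaging device or a motion tracker observing the user, as noted above.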
  • when the processing unit 400 determines that there is no input from the user (NO in step S120), the process of step S200 may be performed after performing the contact state notification process (step S140).
  • step S140 is a process of notifying, using a display unit (not shown) or the like, that the tip portion 130 of the manipulator 100 is determined to be in contact with the external environment such as the inner wall of the large intestine when, for example, the amount of change in the second detection value exceeds a certain threshold.
  • the second estimated value may be generated separately in this case. This makes it possible to obtain information about the state of the bending portion when the distal end portion 130 of the manipulator 100 is in contact with the external environment.
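  • the contact check of step S140 can be sketched with a simple change-threshold test. The threshold value is an assumed placeholder; the disclosure only specifies that contact is flagged when the change in the second detection value exceeds some threshold.

```python
# Illustrative version of step S140: flag contact with the external
# environment when the change in the second detection value between
# time steps exceeds a threshold. The threshold is an assumed value.

CONTACT_THRESHOLD = 2.0  # assumed change threshold (sensor units)

def contact_detected(z_prev: float, z_now: float) -> bool:
    """Return True when the tip 130 is judged to be touching, e.g.,
    the inner wall of the large intestine."""
    return abs(z_now - z_prev) > CONTACT_THRESHOLD

print(contact_detected(9.8, 9.9))   # small change: no contact flagged
print(contact_detected(9.8, 13.0))  # large jump: contact notified
```

  • in practice the raw second detection value would first pass through the predetermined processing so that sensor noise does not trigger false contact notifications.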
  • the estimation model update process (step S202) is, for example, a process of updating the coefficients on the right side of Equation (3).
  • the coefficient on the right side of the equation (3) is stored in advance in a storage unit (not shown), and the first estimated value is generated based on the coefficient.
  • while the manipulator system 10 is in use, the second estimation process and the like are executed, and the first estimated value is corrected to the second estimated value. Therefore, the coefficients on the right side of Equation (3) are updated based on the second estimated value at the end of use of the manipulator system 10.
  • the processing unit 400 updates the estimation model of the first estimation process when the user finishes using the manipulator 100.
  • the first estimation process can be performed more appropriately the next time the manipulator system 10 is used.
  • as the manipulator 100 is used, parts such as the wire 160 deteriorate and the responsiveness from the operation unit 200 changes, and such deterioration limits are difficult to manage.
  • the user can grasp the degree of deterioration of the parts of the manipulator 100 through changes in the coefficients of Equation (3).
  • the method of this embodiment may be implemented as a shape estimation method for the manipulator 100.
  • the method for estimating the shape of the manipulator 100 includes converting the first detection value into the first estimated value using an arithmetic model in which the relationship between the operation input and the shape of the bending portion is modeled in the first estimation process, and updating the estimation model of the first estimation process when the user finishes using the manipulator 100. By doing so, an effect similar to that described above can be obtained.
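  • the coefficient update of step S202 can be sketched as a least-squares refit of the model gain against the corrected second estimated values collected during the session. The least-squares form and the sample values are illustrative assumptions, not the patent's specific update rule.

```python
# Hedged sketch of the estimation model update (step S202): refit the
# gain of the arithmetic model so the first estimation matches the
# session's corrected second estimated values. A single-coefficient
# least-squares fit stands in for updating Equation (3)'s coefficients.

def update_gain(inputs, second_estimates):
    """Least-squares gain a minimizing sum (x2 - a*u)^2 over the session."""
    num = sum(u * x2 for u, x2 in zip(inputs, second_estimates))
    den = sum(u * u for u in inputs)
    return num / den

knob_angles = [10.0, 20.0, 30.0]       # first detection values (assumed)
corrected = [8.0, 16.0, 24.0]          # second estimated values (assumed)
print(update_gain(knob_angles, corrected))
```

  • comparing the refitted gain against its factory value also gives a simple indicator of wire deterioration, as discussed below for the out-of-range notification.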
  • the estimation model update process (step S202) may be added after the bending angle estimation process (step S300) in FIG. 7 and performed at each time step.
  • for example, when a coefficient of Equation (3) falls outside a predetermined range, a process of notifying that effect using a predetermined notification means may be added. This allows the user to understand that the state of the parts constituting the manipulator 100 is not normal.
  • the predetermined notification means is, for example, displaying a display on a display unit (not shown) to urge replacement of the manipulator 100.
  • alternatively, the captured image from the image pickup device arranged at the tip portion 130 of the manipulator 100 or the like may not be displayed, or a predetermined alarm may be sounded from an audio output device (not shown). Also, changes in the coefficients of Equation (3) may be collected as big data on a predetermined server through a wired or wireless network.
  • the first estimation processing section 410 of the processing unit 400 may include a high-pass filter, and the second estimation processing section 420 may include a low-pass filter, thereby realizing a complementary filter.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Robotics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Mechanical Engineering (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Manipulator (AREA)

Abstract

A manipulator system (10) comprises: a manipulator (100) which includes a curved portion; an operation unit (200) with which a user conducts operation input for driving the curved portion; a first sensor (310); a second sensor (320); and a process unit (400) which includes at least one processor. The first sensor is provided in the operation unit, and outputs a first detection value which is related to the operation input amount of the operation input. The second sensor is provided to the end of the manipulator, and outputs a second detection value which is related to an action of the manipulator. The process unit receives the first detection value, and executes a first inference process for inferring the shape of the curved portion on the basis of the first detection value to generate a first inference value which indicates the shape of the curved portion. The process unit receives the second detection value, and executes a second inference process for inferring the shape of the curved portion on the basis of the first inference value and the second detection value to generate a second inference value which indicates the shape of the curved portion.

Description

Manipulator system and shape estimation method for manipulator
 The present invention relates to a manipulator system, a manipulator shape estimation method, and the like.
 Conventionally, manipulators and systems including manipulators have been used in the medical and industrial fields. In order for the operator to easily operate a manipulator, there is a need to grasp the bending shape of the manipulator's tip. When a manipulator with a built-in imaging device is inserted into, for example, the body, its own bending shape does not appear on the imaging screen, so the operator who operates the manipulator cannot directly see the bending shape of the manipulator. Moreover, even when an imaging device separate from the manipulator is simultaneously inserted into the body, distortion and the like prevent the operator from accurately recognizing the bending shape of the manipulator through the imaging screen. Patent Literature 1 discloses a technique for predicting the shape of the bending portion of an endoscope, which is an example of a manipulator, by sensing the amount of wire traction of the endoscope.
JP 2013-172905 A
 However, because there are disturbance factors that cannot be controlled on the input device side of the manipulator, estimation of the bending shape based on sensing on the input device side lacks robustness: comparing the actual change in the bending shape with the predicted change reveals differences in responsiveness, delays in operation, and the like. Patent Literature 1 does not take such circumstances into account, so its prediction of the bending shape of the manipulator cannot be said to be sufficient.
 One aspect of the present disclosure relates to a manipulator system including: a manipulator including a bending portion; an operation unit through which a user performs an operation input for driving the bending portion; a first sensor that is provided in the operation unit and outputs a first detection value related to an operation input amount of the operation input; a second sensor that is provided at the tip of the manipulator and outputs a second detection value related to an action of the manipulator; and a processing unit including at least one processor. The processing unit receives the first detection value and executes a first estimation process of the shape of the bending portion based on the first detection value, thereby generating a first estimated value indicating the shape of the bending portion, and receives the second detection value and executes a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value, thereby generating a second estimated value indicating the shape of the bending portion.
 Another aspect of the present disclosure relates to a shape estimation method for a manipulator, including: receiving a first detection value from a first sensor that is provided in an operation unit through which a user performs an operation input for driving a bending portion of the manipulator and that outputs the first detection value related to an operation input amount of the operation input; receiving a second detection value from a second sensor that is provided at the tip of the manipulator and outputs the second detection value related to an action of the manipulator; generating a first estimated value indicating the shape of the bending portion by executing a first estimation process of the shape of the bending portion based on the first detection value; and generating a second estimated value indicating the shape of the bending portion by executing a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value.
FIG. 1 is a block diagram illustrating a configuration example of a manipulator system.
FIG. 2 is a diagram schematically illustrating an example of a manipulator and an operation unit.
FIG. 3 is a diagram illustrating an example of wires.
FIG. 4 is a diagram illustrating the robustness of bending angle estimation.
FIG. 5 is a diagram illustrating the difference between precision and accuracy.
FIG. 6 is a diagram illustrating the flow of processing of the present embodiment.
FIG. 7 is a flowchart illustrating a processing example of the present embodiment.
FIG. 8 is another flowchart illustrating a processing example of the present embodiment.
FIG. 9 is another diagram illustrating an example of the flow of the bending angle estimation process.
FIG. 10 is a diagram illustrating another example of the flow of the bending angle estimation process.
FIG. 11 is a block diagram illustrating another configuration example of the manipulator system.
FIG. 12 is a diagram illustrating another example of the flow of the bending angle estimation process.
FIG. 13 is a block diagram illustrating another configuration example of the manipulator system.
FIG. 14 is a diagram illustrating another example of the flow of the bending angle estimation process.
FIG. 15 is a flowchart illustrating another processing example of the present embodiment.
FIG. 16 is a flowchart illustrating another processing example of the present embodiment.
FIG. 17 is a flowchart illustrating another processing example of the present embodiment.
 The present embodiment will be described below. Note that the embodiment described below does not unduly limit the content described in the claims. Moreover, not all of the configurations described in the present embodiment are necessarily essential constituent elements of the present disclosure.
 A configuration example of the manipulator system 10 of the present embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 is a block diagram illustrating a configuration example of the manipulator system 10 of the present embodiment. The manipulator system 10 includes a manipulator 100, an operation unit 200, and a processing unit 400. The operation unit 200 includes a first sensor 310. The manipulator 100 includes a second sensor 320. The configuration of the manipulator system 10 is not limited to that shown in FIG. 1, and various modifications, such as adding other components, are possible. For example, the manipulator 100 may be configured to include the operation unit 200.
 FIG. 2 is a diagram schematically illustrating the manipulator 100 and the operation unit 200 of the present embodiment. The manipulator 100 of the present embodiment includes a bending portion. For example, the manipulator 100 in FIG. 2 includes an outer sheath 110, a plurality of bending pieces 120, and a distal end portion 130 connected to the distal end of the bending pieces 120. Each bending piece 120 is a short cylindrical member made of metal, and the number of bending pieces is not particularly limited. The bending pieces 120 and the distal end portion 130 are each connected by rotatable connecting portions 140. In other words, the manipulator 100 of FIG. 2 realizes the function of a bending portion through a multi-joint structure. The manipulator 100 is not limited to the structure shown in FIG. 2, and various modifications, such as adding other components, are possible. For example, although not shown, the proximal side of the manipulator 100 may include a flexible portion that has no bending pieces 120 and bends passively under an external force. The flexible portion may, for example, be composed of flexible annular members that bend passively under an external force. Also, although not shown, the distal end portion 130 may include a treatment tool, an illumination device, or an imaging device, or all or some of these. For example, by passing an optical fiber connected to the illumination device and a cable connected to the imaging device through cavities (not shown) in the operation unit 200 and the bending pieces 120, the illumination device and the imaging device included in the distal end portion 130 can be controlled from outside. In this way, the manipulator 100 of the present embodiment can be used not only as a medical manipulator, such as an endoluminal device, e.g., a medical endoscope or a catheter, or a surgical support robot arm, but also as an industrial manipulator, such as an industrial endoscope or an industrial robot arm. The specific structures of the bending pieces 120, the distal end portion 130, the connecting portions 140, and the like are publicly known, and detailed description thereof is omitted.
 For convenience of explanation, the A axis, the UD axis, and the LR axis are illustrated in FIGS. 2 and 3 as three mutually orthogonal axes. The direction along the A axis is called the A-axis direction and is the direction along the longitudinal direction of the manipulator 100. The direction in which the distal end side of the manipulator 100 is inserted into, for example, a body cavity is the A1 direction, and the direction in which the manipulator 100 is withdrawn is the A2 direction. The direction along the UD axis is called the UD-axis direction, and the direction along the LR axis is called the LR-axis direction. The LR axis, the UD axis, and the A axis can also be called the X axis, the Y axis, and the Z axis, respectively. Note that "orthogonal" includes not only intersecting at 90° but also intersecting at an angle slightly inclined from 90°. Although FIG. 2 shows the manipulator 100 as being configured only with connecting portions 140 for bending in the UD direction, the present invention is not limited to this; connecting portions 140 for bending in the LR direction may be added as appropriate, and a mechanism for twisting the manipulator 100 may also be added. Note that twisting means rotating the manipulator 100 about the A axis.
 The operation unit 200 is a unit through which a user performs an operation input to drive the bending portion. For example, based on an operation input to the operation unit 200, a wire drive mechanism (not shown) moves a pair of wires 160 in opposite directions, so that the manipulator 100 can be bent in a desired direction. Although FIG. 2 shows the manipulator 100 being bent upward in the UD direction by two wires 160, it may also be configured to bend in the LR direction. Specifically, as shown in FIG. 3, the manipulator 100 may be configured to include four wires 160: an upper bending wire 160u, a lower bending wire 160d, a left bending wire 160l, and a right bending wire 160r. In the example shown in FIG. 2, when the user operates the angle knob 200A, which is the operation unit 200, the upper bending wire 160u and the lower bending wire 160d move in opposite directions and a specific bending piece 120 rotates, so that the manipulator 100 can be bent in a desired direction along the UD axis. Similarly, when the user operates the angle knob 200B, which is the operation unit 200, the left bending wire 160l and the right bending wire 160r move in opposite directions and a predetermined bending piece 120 rotates, so that the manipulator 100 can be bent in a desired direction along the LR axis. However, specific examples of the operation unit 200 are not limited to the angle knobs 200A and 200B, and are not limited to manually operated ones. The wire drive mechanism (not shown) can be realized by a known method, so detailed description thereof is omitted.
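As a rough illustration of the antagonistic wire drive described above, the following sketch maps a knob rotation to equal and opposite displacements of one wire pair. The pulley radius, the function, and the variable names are illustrative assumptions and are not taken from the specification.

```python
def wire_displacements(knob_angle_rad, pulley_radius_mm=10.0):
    """Return per-wire displacements in mm for one antagonistic wire pair.

    A positive value means the wire is paid out; a negative value means it is
    pulled in. Turning the knob winds one wire onto the pulley by the arc
    length and releases the other by the same amount.
    """
    stroke = pulley_radius_mm * knob_angle_rad  # arc length wound onto the knob pulley
    return {"wire_160u": +stroke, "wire_160d": -stroke}

d = wire_displacements(0.2)  # knob turned by 0.2 rad
```

Because the two wires always move by equal magnitudes in opposite directions, the total wire length routed through the sheath stays constant, which is the property that lets the pair drive the bending pieces.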
 The first sensor 310 is provided in the operation unit 200 and outputs a first detection value related to the operation input amount of the operation input. Specifically, the first sensor 310 outputs the acquired first detection value to the processing unit 400, described later, by wireless communication or wired communication. The wireless communication here is, for example, communication conforming to a wireless communication standard such as Bluetooth (registered trademark) or Wi-Fi (Wireless Fidelity, registered trademark), but other wireless communication standards may be used. The wired communication here is, for example, communication conforming to a wired communication standard such as USB (Universal Serial Bus), but other wired communication standards may be used. For example, by connecting a cable (not shown in FIG. 2) to the processing unit 400 (also not shown in FIG. 2), the first sensor 310 can output the first detection value by wired communication.
 The second sensor 320 is provided at the distal end portion 130, which is the tip of the manipulator 100, and outputs a second detection value related to the action of the manipulator 100. Specifically, like the first sensor 310, the second sensor 320 outputs the acquired second detection value to the processing unit 400, described later, by the wireless or wired communication described above. For example, by passing a cable connected to the second sensor 320 through the cavities in the bending pieces 120 described above, a wired connection can be realized between the second sensor 320 provided at the tip of the manipulator 100 and the processing unit 400 outside the manipulator 100.
 The processing unit 400 receives the first detection value output by the first sensor 310 and the second detection value output by the second sensor 320. The processing unit 400 is configured by the following hardware. The hardware can include at least one of a circuit that processes digital signals and a circuit that processes analog signals. For example, the hardware can be composed of one or more circuit devices mounted on a circuit board, or of one or more circuit elements. The one or more circuit devices are, for example, ICs (Integrated Circuits) or FPGAs (field-programmable gate arrays). The one or more circuit elements are, for example, resistors or capacitors.
 The processing unit 400 is also realized by including at least one processor as follows. The processing unit 400 includes a memory that stores information and a processor that operates based on the information stored in the memory. The information is, for example, programs and various data. The processor includes hardware. Various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a DSP (Digital Signal Processor) can be used. The memory may be a semiconductor memory such as an SRAM (Static Random Access Memory) or a DRAM (Dynamic Random Access Memory), a register, a magnetic storage device such as an HDD (Hard Disk Drive), or an optical storage device such as an optical disc device. For example, the memory stores computer-readable instructions, and when the instructions are executed by the processor, some or all of the functions of the units of the processing unit 400 are realized as processes. The instructions here may be instructions of an instruction set constituting a program, or instructions that direct the hardware circuits of the processor to operate. Furthermore, all or part of each unit of the processing unit 400 may be realized by cloud computing, and each process described later with reference to FIG. 7 and the like may be performed on cloud computing.
 The processing unit 400 also estimates the shape of the bending portion of the manipulator 100. Conventionally, methods of estimating the shape of an endoscope using the traction amount of the wires 160 or the operation input amount of the operation unit 200 have been disclosed in Patent Literature 1 and the like. It would therefore seem that, by using these methods, the processing unit 400 could estimate the bending angle of the bending portion of the manipulator 100 based on the first detection value from the first sensor 310 described above. However, simply applying the method disclosed in Patent Literature 1 and the like does not take into account disturbance factors that cannot be controlled on the input device side, so the bending angle of the bending portion of the manipulator 100 cannot be estimated accurately.
 This will be explained specifically with reference to FIG. 4. Suppose that, as the user operates the operation unit 200, the actual bending angle of the bending portion of the manipulator 100 changes over time as indicated by B1. When the bending angle of the bending portion of the manipulator 100 is estimated by the conventional method, the estimated result is as indicated by B2, which differs from the behavior indicated by B1. For example, as indicated by C1, there is a difference between the actual bending angle value and the estimated bending angle value at the start of estimation. This is because the shape of the manipulator 100 at the start of estimation differs, and accordingly the shape of the wires 160 also differs. As indicated by C2, there is a difference in the slope of the bending angle change. As before, this is because the shape of the wires 160 varies with the shape of the manipulator 100, which affects the responsiveness to the operation input of the operation unit 200. As indicated by C3, there is a difference in the timing at which the bending angle starts to change. This is because, even when an input operation on the operation unit 200 is started, slack or deterioration of the wires 160 delays the start of the shape change of the bending portion more than expected.
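The three discrepancies C1 to C3 can be pictured with a small numerical sketch: an initial offset (C1), a reduced response gain (C2), and a delayed onset (C3) applied to the commanded trajectory. All numeric values and the disturbance model itself are illustrative assumptions, not taken from FIG. 4.

```python
def actual_angle(command, t, offset=5.0, gain=0.8, delay=3):
    """Disturbed response: delayed onset (C3), reduced gain (C2), initial offset (C1)."""
    delayed = command[max(0, t - delay)]  # the bend follows the command a few steps late
    return offset + gain * delayed

# Commanded bending angle: flat, then a ramp (illustrative trajectory).
command = [0.0] * 5 + [float(a) for a in range(0, 50, 5)]
predicted = command  # an input-side-only prediction simply tracks the command
actual = [actual_angle(command, t) for t in range(len(command))]
```

Comparing `predicted` with `actual` reproduces the qualitative picture of FIG. 4: the two curves disagree at the start, rise with different slopes, and begin changing at different times.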
 Thus, estimating the bending angle of the bending portion based only on the first detection value from the first sensor 310 has the problem that the accuracy of the estimated angle is low. On the other hand, if the bending angle of the bending portion is estimated based only on the second detection value from the second sensor 320, the precision of the estimation may be low. This is because, when the tip of the manipulator 100 is thin, the mounted second sensor 320 itself must also be very small, so the second detection value acquired by the second sensor 320 contains a large amount of noise.
 Here, low accuracy of estimation, which can also be called low correctness of estimation, specifically means, for example, that the estimated bending angle value deviates greatly from the actual bending angle value. Low precision of estimation, which can also be called low reproducibility of estimation, specifically means, for example, that when the same bending angle is estimated multiple times, the estimated bending angle values vary widely. For example, as shown in FIG. 5, suppose that at the timing when a predetermined period t0 has elapsed from the estimation start time, the actual bending angle is the value indicated by E, and that estimation of the bending angle based on the first detection value and estimation based on the second detection value are each performed four times. In this case, the bending angles of the bending portion estimated based on the first detection value are the values indicated by F1, F2, F3, and F4, and the bending angles estimated based on the second detection value are the values indicated by G1, G2, G3, and G4. Since the estimation based on the first detection value is less accurate than the estimation based on the second detection value, the values F1 to F4 are farther from the value E than the values G1 to G4. On the other hand, since the estimation based on the second detection value is less precise than the estimation based on the first detection value, the values G1 to G4 vary more widely than the values F1 to F4. In other words, the first detection value acquired by the first sensor 310 belongs to the category of data indicated by F1 to F4 in FIG. 5, and the second detection value acquired by the second sensor 320 belongs to the category of data indicated by G1 to G4 in FIG. 5.
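The distinction drawn above can be made quantitative: accuracy corresponds to the bias of the mean estimate from the true value E, and precision corresponds to the spread of repeated estimates. The sample values standing in for F1 to F4 and G1 to G4 below are illustrative, not read from FIG. 5.

```python
import statistics

E = 30.0                            # true bending angle
first = [22.0, 23.0, 22.5, 23.5]    # stand-ins for F1-F4: tight spread, large bias
second = [27.0, 34.0, 26.0, 33.0]   # stand-ins for G1-G4: small bias, wide spread

def bias(samples, true_value):
    """Accuracy metric: how far the mean estimate sits from the true value."""
    return abs(statistics.mean(samples) - true_value)

# The first-sensor estimates are precise but inaccurate; the second-sensor
# estimates are accurate on average but imprecise.
assert bias(first, E) > bias(second, E)
assert statistics.stdev(first) < statistics.stdev(second)
```

This is the usual bias-versus-variance picture: fusing the two sources, as the embodiment does below, aims to keep the low variance of the first source and the low bias of the second.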
 Therefore, as shown in FIG. 1, the processing unit 400 of the present embodiment includes a first estimation processing section 410 that executes a first estimation process of the shape of the bending portion of the manipulator 100, and a second estimation processing section 420 that executes a second estimation process of the shape of the bending portion of the manipulator 100. In other words, the processing unit 400 of the present embodiment executes both a first estimation process and a second estimation process of the shape of the bending portion of the manipulator 100.
 More specifically, as shown in FIG. 6, the first estimation processing section 410 executes the first estimation process based on the first detection value received from the first sensor 310 and generates a first estimated value. For example, an equation of an arithmetic model that calculates the bending angle of the bending portion from the operation input amount of the operation unit 200 is stored in a storage section (not shown), and the first estimation processing section 410 realizes the first estimation process by estimating the bending angle of the bending portion based on the received first detection value and the equation of the arithmetic model. Details of the equation of the arithmetic model will be described later. Besides the above, the first estimation process may use an arithmetic model that calculates the bending angle of the bending portion based on the traction amount of the wires 160, or may use a table indicating the relationship between the operation input amount and the bending angle instead of an equation; various modifications are possible.
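A minimal sketch of the first estimation process follows, under the assumption of a linear arithmetic model and, for the table-based variant, piecewise-linear interpolation. The coefficients and table entries are illustrative placeholders; the specification's actual model equation is described later.

```python
def first_estimate_linear(input_amount, k=1.5, b=0.0):
    """Arithmetic-model variant: bend angle as a linear function of the input amount."""
    return k * input_amount + b

# Table variant: (operation input amount, bend angle) pairs, interpolated between entries.
TABLE = [(0.0, 0.0), (30.0, 40.0), (60.0, 90.0)]

def first_estimate_table(input_amount):
    """Look up the bend angle by piecewise-linear interpolation over TABLE."""
    for (x0, y0), (x1, y1) in zip(TABLE, TABLE[1:]):
        if x0 <= input_amount <= x1:
            return y0 + (y1 - y0) * (input_amount - x0) / (x1 - x0)
    raise ValueError("operation input amount outside table range")
```

Either variant turns the first detection value into a first estimated value of the bending angle; the table form is convenient when the knob-to-angle relationship is measured empirically rather than modeled.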
 The second estimation processing section 420 then executes the second estimation process based on the second detection value received from the second sensor 320 and the first estimated value, and generates a second estimated value. More specifically, for example, the second estimation processing section 420 performs a process of generating the second estimated value by correcting the first estimated value using the second detection value; a specific method will be described later.
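One plausible way to realize such a correction is sketched below as a complementary filter that blends the drift-prone first estimated value with the noisy tip-side second detection value. The blend weight is an assumed tuning parameter; the specification's concrete correction method is described later, so this is only an illustrative stand-in.

```python
def second_estimate(first_est, second_detect, weight=0.2):
    """Correct the input-side estimate with the tip-side measurement.

    weight = 0 trusts the first estimated value entirely; weight = 1 trusts
    the second detection value entirely. A small weight keeps the precision
    of the first estimate while pulling its bias toward the measurement.
    """
    return (1.0 - weight) * first_est + weight * second_detect
```

With a small weight, repeated estimates inherit most of the low variance of the first estimate while the tip measurement gradually removes its offset, matching the precision/accuracy trade-off discussed with FIG. 5.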
 As described above, the manipulator system 10 of the present embodiment includes the manipulator 100 including a bending portion, the operation unit 200 through which a user performs an operation input to drive the bending portion, the first sensor 310, the second sensor 320, and the processing unit 400 including at least one processor. The first sensor 310 is provided in the operation unit 200 and outputs a first detection value related to the operation input amount of the operation input. The second sensor 320 is provided at the tip of the manipulator 100 and outputs a second detection value related to the action of the manipulator 100. The processing unit 400 receives the first detection value and executes a first estimation process of the shape of the bending portion based on the first detection value, thereby generating a first estimated value indicating the shape of the bending portion. The processing unit 400 also receives the second detection value and executes a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value, thereby generating a second estimated value indicating the shape of the bending portion.
 Since the manipulator system 10 of the present embodiment thus includes the first sensor 310, the second sensor 320, and the processing unit 400, the processing unit 400 can acquire the first detection value related to the input operation from the first sensor 310 and the second detection value related to the output action of the tip of the manipulator 100. This allows the processing unit 400 to estimate the bending shape of the bending portion of the manipulator 100 using both the first detection value and the second detection value. Furthermore, since the processing unit 400 can execute the first estimation process and the second estimation process, it can not only generate the first estimated value by the first estimation process based on the first detection value, but also generate the second estimated value by the second estimation process based on that first estimated value. A bending angle closer to the actual bending angle of the bending portion of the manipulator 100 can thereby be estimated. Specifically, by using the first detection value, whose precision is higher than that of the second detection value, together with the second detection value, whose accuracy is higher than that of the first detection value, the bending angle can be estimated with both high precision and high accuracy.
 The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method for the manipulator 100 of the present embodiment includes receiving the first detection value from the first sensor 310, which is provided in the operation unit 200 through which the user performs an operation input to drive the bending portion of the manipulator 100 and which outputs the first detection value related to the operation input amount of the operation input. The shape estimation method also includes receiving the second detection value from the second sensor 320, which is provided at the tip of the manipulator 100 and outputs the second detection value related to the action of the manipulator 100. The shape estimation method further includes generating a first estimated value indicating the shape of the bending portion by executing a first estimation process of the shape of the bending portion based on the first detection value, and generating a second estimated value indicating the shape of the bending portion by executing a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value. In this way, the same effects as described above can be obtained.
The second estimation processing unit 420 may also perform the second estimation process using additional information such as specific design information. The specific design information is information based on the design of the manipulator 100, for example, information indicating the displacement and orientation of each of the bending pieces 120, the connecting portions 140, and so on that constitute the manipulator 100 as a function of the bending angle. In this way, the second estimation processing unit 420 can execute the second estimation process more appropriately.
Next, the method of the present embodiment will be described in more detail with reference to FIGS. 7, 8, and 9. The series of processes described above with reference to FIG. 6 is performed at regular time intervals. This can be realized, for example, by storing a program for the processing examples of the flowcharts shown in FIGS. 7 and 8 in a storage unit (not shown) in the processing unit 400. The processing unit 400 first performs a manipulator system startup process (step S100). Specifically, the user turns on the power of the hardware constituting the processing unit 400 and starts an application program for the method described later. After that, the processing unit 400 determines whether an operation to end the manipulator system 10 has been performed (step S200). If the end operation has been performed (YES in step S200), the processing unit 400 performs a manipulator system end process (step S210), and the processing of the entire manipulator system 10 ends. If the end operation has not been performed (NO in step S200), the process of step S200 is repeated while interrupt processing is performed. The interrupt processing here means, specifically, that the bending angle estimation process (step S300) is executed by a timer interrupt with a predetermined period, as shown for example in FIG. 8. In the present embodiment, the series of processes shown in FIG. 6 is performed in the bending angle estimation process (step S300) of one time step. The same applies to the modifications described later with reference to FIGS. 10, 12, 14, and so on.
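The control flow of FIGS. 7 and 8 can be sketched as follows. This is a minimal illustration assuming Python, with a simple fixed-period loop standing in for a real timer interrupt; the function names are hypothetical, not taken from the patent.

```python
import time

def bending_angle_estimation(step, u, z):
    """Placeholder for the per-step estimation (step S300).

    In the patent's flow this would run the first and second estimation
    processes for one time step; here it just records its inputs."""
    return {"step": step, "input": u, "accel": z}

def run_system(num_steps, period_s=0.0):
    """Simplified main loop: the timer interrupt is emulated by calling
    the estimation once per fixed period until a stop condition is met."""
    results = []
    for k in range(1, num_steps + 1):       # k = 1, 2, 3, ... time steps
        results.append(bending_angle_estimation(k, u=0.0, z=0.0))
        time.sleep(period_s)                # stands in for the timer period
    return results

# Example: three time steps of the estimation loop
out = run_system(3)
```

In the actual system the loop would run until the end operation (step S200) is detected, rather than for a fixed number of steps.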
Based on the above, the bending angle estimation process (step S300) at the k-th time step (k = 1, 2, 3, ...) can also be represented as shown in FIG. 9. In FIG. 9, k is a time step in discrete processing and is also called a time or timing. Other modifications are described below by adding components to FIG. 9 as appropriate, and in those descriptions the first detection value and similar quantities from the immediately preceding time step may be used. For convenience of explanation, the (k-1)-th time step may hereinafter be called the first time step, and the k-th time step the second time step. That is, the second time step is a time step after the first time step.
The first sensor 310 will now be described in detail. When the operation unit 200 comprises the angle knobs 200A and 200B of FIG. 2, for example, the first sensor 310 can be realized by attaching an angle sensor or a rotary encoder to the rotation shaft of the operation unit 200, so that the rotation amounts of the angle knobs 200A and 200B can be obtained as the first detection value, i.e., the operation input amount of the bending operation. As long as the operation input amount of the operation input can be obtained, the first sensor 310 may instead be, for example, a linear encoder that measures the movement amount of a linearly moving member such as the wire 160. The measurement principle of the first sensor 310 may be optical or magnetic and is not particularly limited.
The second sensor 320 will now be described in detail. The second sensor 320 is, for example, a two-axis or three-axis acceleration sensor, and outputs the detected acceleration data to the processing unit 400. The second sensor 320 may integrate the detected acceleration data with an integrator (not shown) and output the result to the processing unit 400 as velocity data. The two axes here are the X axis and the Y axis described above, and the three axes are the X, Y, and Z axes; the same applies hereinafter. The second sensor 320 may also be a two-axis or three-axis gyro sensor, and may output the detected angular acceleration data or angular velocity data to the processing unit 400, described later, by wireless or wired communication. Alternatively, the second sensor 320 may be a motion sensor that includes a two-axis or three-axis acceleration sensor and a two-axis or three-axis gyro sensor. Note that "motion sensor" may also refer to only one of an acceleration sensor and a gyro sensor, and a motion sensor is also called an inertial measurement unit (IMU). Although FIG. 2 shows the second sensor 320 provided at the distal end portion 130, which is the tip of the manipulator 100, it need not be at the distal end portion 130 as long as it is at a part that moves in accordance with the motion of the bending portion. The second sensor 320 may also be, for example, a magnetic position sensor; in that case, at least one of the measured positions must be at a part that does not move with the motion of the bending portion, and at least one other must be at a part that does move with it, but the exact locations are otherwise unrestricted. An imaging device may also be used instead of the second sensor 320. For example, by measuring the amount and angle of motion of the imaging device between imaging times from the movement of feature points in the images captured at those times and from the distance between the feature points and the imaging device estimated from the images, the imaging device can realize the same function as the second sensor 320.
As described above, in the manipulator system 10 of the present embodiment, the second sensor 320 is at least one of an acceleration sensor, an angular velocity sensor, a position sensor, and an imaging device arranged on the tip side of the manipulator 100, and the second detection value is a detection value based on at least one of the position, displacement, velocity, acceleration, angle, and angular velocity of the tip of the manipulator 100. The following description assumes that the second sensor 320 is an acceleration sensor and that the second detection value is based on acceleration, but the methods described below can also be applied when other sensors are used.
Next, a specific example of the first detection value and the first estimation process will be described using mathematical expressions. For convenience, the symbols appearing in each expression are explained together with that expression, and symbols that have already appeared are not explained again. When the operation unit 200 comprises the angle knobs 200A and 200B of FIG. 2, the rotation angles of the angle knobs 200A and 200B serve as the operation input amount, so the first detection value indicating the operation input amount of the operation unit 200 can be expressed by Equation (1) below.

[Equation (1): image not reproduced]
The bending angle of the bending portion of the manipulator 100 can be expressed by Equation (2) below, taking into account not only the bending in the UD direction and the RL direction described above with reference to FIG. 2, but also the torsion angle about the A direction of FIG. 2.

[Equation (2): image not reproduced]
Assuming that, in the bending portion of the manipulator 100, the bending angle in the UD direction changes linearly with the rotation angle of the angle knob 200A and the bending angle in the RL direction changes linearly with the rotation angle of the angle knob 200B, the first estimated value generated by the first estimation process can be expressed by Equation (3) below. The coefficients used in Equation (3) are stored in advance in, for example, a storage unit (not shown) in the processing unit 400, but may instead be stored in an external server, a storage unit in cloud computing, or the like.

[Equation (3): image not reproduced]
By formulating a computational model as above, the first estimation processing unit 410 of the processing unit 400 performs the first estimation process based on the first detection value and generates the first estimated value. That is, in the manipulator system 10 of the present embodiment, in the first estimation process, the processing unit 400 converts the first detection value into the first estimated value using a computational model in which the relationship between the operation input and the shape of the bending portion is modeled. In this way, the bending angle of the bending portion of the manipulator 100, which is difficult to see directly, can be estimated from the operation input amount that the user can control.
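The linear computational model of the first estimation process can be sketched as follows. This is a minimal illustration assuming Python; the gain coefficients `k_ud` and `k_rl` are hypothetical stand-ins for the coefficients of Equation (3) that would be stored in the processing unit's storage.

```python
def first_estimation(knob_ud_deg, knob_rl_deg, k_ud=1.0, k_rl=1.0):
    """First estimation process (cf. Equation (3)): assume the bending
    angle of the bending portion changes linearly with the angle-knob
    rotation. k_ud and k_rl are hypothetical linear coefficients."""
    theta_ud = k_ud * knob_ud_deg   # UD bending angle from knob 200A
    theta_rl = k_rl * knob_rl_deg   # RL bending angle from knob 200B
    return theta_ud, theta_rl

# 30 degrees on each knob with unit gains gives 30 degrees of bend each way
print(first_estimation(30.0, 30.0))  # -> (30.0, 30.0)
```

In the actual system the coefficients would be calibrated so that the model matches the wire-driven mechanics of the bending portion.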
The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method for the manipulator 100 of the present embodiment includes, in the first estimation process, converting the first detection value into the first estimated value using a computational model in which the relationship between the operation input and the shape of the bending portion is modeled. In this way, the same effects as described above can be obtained.
Next, the second estimation process will be described, taking as an example a method of generating the second estimated value using an extended Kalman filter. Since the basics and block diagrams of the extended Kalman filter are well known, their description is omitted. In the present embodiment, with the actual bending angle as the state variable, the state equation can be expressed by Equation (4).

[Equation (4): image not reproduced]
The observation equation can be expressed by Equation (5).

[Equation (5): image not reproduced]
Here, since the translational acceleration in the coordinate system of the distal end portion 130 is sufficiently smaller than the gravitational acceleration, the relationship between the acceleration in the coordinate system of the distal end portion 130 and the gravitational acceleration in the world coordinate system can be expressed by Equation (6) using a rotation matrix.

[Equation (6): image not reproduced]
The rotation matrix is now derived from the bending angle. As described above with reference to FIG. 2, the manipulator 100 has a multi-joint structure; if it is composed of i connecting portions 140, the homogeneous transformation matrix from the world coordinate system to the coordinate system of the distal end portion 130 can be written as Equation (7) using the Denavit-Hartenberg notation, where m is the coordinate of the root position of the bending portion.

[Equation (7): image not reproduced]
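The chained Denavit-Hartenberg transform of Equation (7) can be sketched as follows. This is a minimal illustration assuming Python and the standard DH link matrix; the link parameters passed in are hypothetical and do not reflect the patent's actual joint layout.

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Single Denavit-Hartenberg link transform (cf. Equations (7)-(8))."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def chain_transform(dh_params):
    """Compose the link transforms from the bending-portion root to the
    tip, as Equation (7) does for a manipulator with i joints."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for theta, d, a, alpha in dh_params:
        T = mat_mul(T, dh_matrix(theta, d, a, alpha))
    return T
```

With the joint angles filled in from the estimated bending angle, the upper-left 3x3 block of the resulting matrix is the rotation used in Equation (9).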
The homogeneous transformation matrix can be expressed by Equation (8).

[Equation (8): image not reproduced]
The rotation matrix from the world coordinate system to the distal end portion 130 can be expressed by Equation (9).

[Equation (9): image not reproduced]
From Equations (6) and (9), Equation (10) can be derived.

[Equation (10): image not reproduced]
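The relation of Equation (10), which predicts the tip accelerometer reading from the bending-dependent rotation matrix, can be sketched as follows. This is a minimal illustration assuming Python, a Z-up world frame, and a single hypothetical bend about the X axis; the actual system uses the rotation obtained from the full joint chain.

```python
import math

def rot_x(phi):
    """Rotation about the X axis by phi (a stand-in for a pure UD bend)."""
    c, s = math.cos(phi), math.sin(phi)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def expected_tip_acceleration(R, g=9.81):
    """Cf. Equation (10): with translational acceleration negligible, the
    tip accelerometer should read the world gravity vector rotated into
    the tip frame, i.e. R^T * g_world, where R maps tip frame to world."""
    g_world = [0.0, 0.0, -g]
    # multiply g_world by the transpose of R
    return [sum(R[j][i] * g_world[j] for j in range(3)) for i in range(3)]

# With no bend, gravity stays on the tip's -Z axis
a0 = expected_tip_acceleration(rot_x(0.0))
# After a 90-degree bend about X, gravity appears on the tip's -Y axis
a90 = expected_tip_acceleration(rot_x(math.pi / 2))
```

This predicted reading plays the role of the observation model h(x) in the extended Kalman filter below.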
Since the acceleration at the distal end portion 130 is the second detection value acquired by the second sensor 320 as described above, the observation equation of the extended Kalman filter can be expressed by Equation (11).

[Equation (11): image not reproduced]
Next, consider the time update equations of the extended Kalman filter. Differentiating the function on the right side of the state equation (4) with respect to x yields Equation (12).

[Equation (12): image not reproduced]
Similarly, differentiating the right side of Equation (11) with respect to x yields Equation (13).

[Equation (13): image not reproduced]
From the above, the time update equations of the extended Kalman filter can be summarized as Equations (14) through (19).

[Equations (14)-(19): images not reproduced]
Equation (18) expresses the second estimated value as a relational expression based on the first estimated value, the Kalman gain, and the second detection value. As described above, because the first detection value has low accuracy, the first estimated value also has low accuracy, whereas the second detection value has low precision but high accuracy. Therefore, based on Equation (18), the second estimation processing unit 420 repeatedly corrects the first estimated value toward the actual bending angle and outputs the result as the second estimated value.
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 estimates the second estimated value by the second estimation process using a Kalman filter. Because the second estimated value is estimated using the highly precise first estimated value and the highly accurate second detection value, a bending angle closer to the true value can be estimated than when it is estimated from only one of the first detection value and the second detection value. The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100; that is, the shape estimation method for the manipulator 100 of the present embodiment includes estimating the second estimated value by the second estimation process using a Kalman filter. In this way, the same effects as described above can be obtained.
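One predict-correct cycle corresponding to Equations (14) through (19) can be sketched as follows. This is a minimal one-dimensional illustration assuming Python, with scalar stand-ins for the state, covariance, and Jacobians rather than the multi-axis quantities of the actual system, and with hypothetical models f and h.

```python
def ekf_step(x_est, P, u, z, f, h, F, H, Q, R):
    """One extended-Kalman-filter cycle in the spirit of Equations
    (14)-(19): predict with the operation-input model f (the first
    estimation), then correct the prediction with the tip measurement z.

    All quantities are scalars for readability; F and H are the
    derivatives of f and h at the current estimate."""
    # --- time update: the prediction plays the role of the first estimate
    x_pred = f(x_est, u)                    # predicted bending angle
    P_pred = F * P * F + Q                  # predicted error covariance
    # --- measurement update: yields the second estimated value
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * (z - h(x_pred))    # corrected estimate
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Hypothetical 1-D example: the state follows the knob input directly,
# and the sensor observes the angle itself.
f = lambda x, u: u            # first estimation: angle tracks the input
h = lambda x: x               # observation model
x, P = 0.0, 1.0
x, P = ekf_step(x, P, u=10.0, z=9.0, f=f, h=h, F=1.0, H=1.0, Q=0.1, R=0.1)
```

After one cycle the estimate lies between the input-based prediction (10.0) and the measurement (9.0), weighted by the Kalman gain, which is exactly the correction behavior described for Equation (18).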
Comparing both sides of Equations (3) and (18) above shows that the first estimated value and the second estimated value are expressed in the same dimension as the operation input amount of the operation unit 200, namely an angle. That is, in the manipulator system 10 of the present embodiment, the first estimated value and the second estimated value are bending angles of the bending portion.
The method of the present embodiment is not limited to the above, and various modifications are possible. For example, in addition to the first detection value and other quantities acquired at the current time step, those acquired at the immediately preceding time step may also be used. Specifically, the first estimation processing unit 410 may perform the first estimation process using, for example, the difference between the first detection value detected at the previous time step and the first detection value detected at the current time step. Furthermore, as shown in FIG. 10, the first estimation processing unit 410 may perform the first estimation process using the second estimated value generated at the previous time step together with this difference value. In other words, the first estimation processing unit 410 may perform the first estimation process at the second time step and generate the first estimated value based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step.
The expression for the first estimated value generated by the first estimation processing unit 410 in FIG. 10 can be obtained by rewriting Equation (3) above as Equation (20) below.

[Equation (20): image not reproduced]
The time update equations for the second estimation process performed by the second estimation processing unit 420 in FIG. 10 can likewise be obtained by rewriting Equations (14) through (19) above, but their description is omitted.
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that is after the first time step. The processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The processing unit 400 then generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step. In this way, the first estimation process at the second time step is performed based on the second estimated value generated at the first time step, i.e., the bending angle of the bending portion estimated at the first time step, and the bending angle changed by the above difference value, so the first estimated value generated by the first estimation process can be made more accurate.
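The difference-based first estimation just described (cf. Equation (20)) can be sketched as follows. This is a minimal illustration assuming Python, with a hypothetical unit gain relating the change in operation input to the change in bending angle.

```python
def first_estimation_incremental(theta2_prev, u_prev, u_curr, k_gain=1.0):
    """Equation (20)-style first estimation: start from the corrected
    second estimated value of the previous time step and add the bend
    implied by the change in operation input between the two steps.

    k_gain is a hypothetical coefficient, as in Equation (3)."""
    delta = u_curr - u_prev               # difference value of the input
    return theta2_prev + k_gain * delta   # first estimated value at step k

# Previous corrected estimate 12 deg; knob moved from 30 deg to 35 deg
print(first_estimation_incremental(12.0, 30.0, 35.0))  # -> 17.0
```

Because each step starts from the corrected second estimated value rather than from the raw input alone, errors in the input-to-angle model do not accumulate across time steps.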
The difference value here is the amount of change in the first estimated value between time steps and corresponds to the slope of the behavior indicated by B2 in FIG. 4. That is, as the bending angle estimation process (step S300) of FIG. 6 continues to be executed by the method of the present embodiment, the first estimation process continues to be executed based on the corrected second estimated value. Since the difference value is thereby also corrected, the difference in slope indicated by C2 in FIG. 4 can be reduced.
The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method for the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that is after the first time step. The shape estimation method also includes generating the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The shape estimation method further includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step. In this way, the same effects as described above can be obtained.
The method of the present embodiment is not limited to the above; for example, as shown in FIG. 11, the processing unit 400 may further include a predetermined processing unit 425. The predetermined processing unit 425 includes a noise removal filter that removes noise from the second detection value. Specifically, the predetermined processing unit 425 may include, for example, a low-pass filter. In this way, when the second detection value contains noise caused by, for example, a high-frequency treatment tool arranged at the distal end portion 130, the second detection value after the predetermined processing can be made free of that noise. The predetermined processing unit 425 may instead include a high-pass filter, so that when the second detection value contains drift noise, the second detection value after the predetermined processing can be made free of that drift noise. The predetermined processing unit 425 may also be a Kalman filter.
Furthermore, as shown in FIG. 12, the predetermined processing at the current time step may be performed based on the second detection value after the predetermined processing at the previous time step and the second detection value at the current time step. In other words, the processing unit 400 may generate the second detection value after the predetermined processing by performing the predetermined processing at the second time step based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step. The second estimation processing unit 420 may then perform the second estimation process at that time step based on the second detection value after the predetermined processing and the first estimated value described above.
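The recursive predetermined processing just described can be sketched as a first-order low-pass filter. This is a minimal illustration assuming Python; `alpha` is a hypothetical smoothing coefficient, and the filtered value at each time step is formed from the previous filtered value and the current raw value, exactly the recursion above.

```python
def lowpass_step(y_prev, x_curr, alpha=0.9):
    """One step of a first-order low-pass filter: combine the second
    detection value after the predetermined processing at the previous
    time step (y_prev) with the raw second detection value at the
    current time step (x_curr)."""
    return alpha * y_prev + (1.0 - alpha) * x_curr

# A constant input is approached gradually, so short spikes are attenuated
y = 0.0
for x in [1.0, 1.0, 1.0, 1.0]:
    y = lowpass_step(y, x)
```

A high-pass variant for removing drift, or a Kalman filter as mentioned above, would replace only this per-step update while keeping the same recursive structure.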
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that is after the first time step. The processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The processing unit 400 also generates the second detection value after the predetermined processing by executing the predetermined processing based on the second detection value at both the first time step and the second time step; at the second time step, it executes the predetermined processing based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step. The processing unit 400 then generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. In this way, the second estimation processing unit 420 can perform the second estimation process based on a second detection value from which noise and the like have been removed by the predetermined processing, and can therefore generate a more appropriately estimated second estimated value.
The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method for the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that is after the first time step. The shape estimation method also includes generating the first estimated value by executing the first estimation process at the second time step based on the difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The shape estimation method further includes generating the second detection value after the predetermined processing by executing the predetermined processing based on the second detection value at both the first time step and the second time step, and, at the second time step, executing the predetermined processing based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step. The shape estimation method then includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. In this way, the same effects as described above can be obtained.
The method of the present embodiment is not limited to the above. For example, as shown in the block diagram of FIG. 13, the processing unit 400 may further include a third estimation processing unit 430. Specifically, as shown in FIG. 14, the third estimation processing unit 430 performs a third estimation process based on the second detection value after the predetermined processing and the second estimated value at the previous time step, together with the difference value at the current time step, to generate a third estimated value. In other words, the processing unit 400 performs the third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step, thereby generating the third estimated value. That is, the third estimated value is a prediction of the second detection value at the current time step, inferred from the input operation amount between time steps while also taking the noise-removed value at the previous time step into account. As shown in FIG. 14, the third estimation processing unit 430 may further use the aforementioned specific design information in the third estimation process.
The predetermined processing unit 425 in FIG. 14 then performs the predetermined processing based on the second detection value at the current time step and the aforementioned third estimated value. Specifically, it applies a linear Kalman filter to the second detection value and the third estimated value at that time step to remove noise. That is, in the example of FIG. 14, the second detection value after the predetermined processing is an estimate obtained with the linear Kalman filter. As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 generates the second detection value after the predetermined processing by performing the predetermined processing using a Kalman filter. The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100; that is, the shape estimation method of the manipulator 100 of the present embodiment includes generating the second detection value after the predetermined processing by the predetermined processing using a Kalman filter.
The second estimation processing unit 420 then performs the second estimation process based on the second detection value after the predetermined processing at that time step and the first estimated value at that time step. That is, the processing unit 400 performs the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step, thereby generating the second estimated value. In other words, at the second time step, the second estimation processing unit 420 executes the second estimation process based on the first estimated value and the second detection value estimated by the linear Kalman filter.
The processing of the third estimation processing unit 430 and the predetermined processing unit 425 in FIG. 14 will now be described in more detail. The third estimated value at the second time step can be expressed by the following equation (21), using the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step.
[Equation (21)]
Using equation (21), the state equation of the linear Kalman filter can be expressed as equation (22).
[Equation (22)]
The observation equation in this case can be expressed as equation (23).
[Equation (23)]
Accordingly, the time-update formulas in the predetermined processing using the linear Kalman filter can be expressed by the following equations (24) through (29).
[Equation (24)]
[Equation (25)]
[Equation (26)]
[Equation (27)]
[Equation (28)]
[Equation (29)]
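In the published text, equations (21) through (29) appear only as image placeholders. For orientation, the standard textbook form of a discrete-time linear Kalman filter — state equation, observation equation, and the prediction/update recursion — is sketched below. The symbols A, B, C, Q, R, K, and P are the conventional ones and are assumptions here, not necessarily the notation of the patent's own equations.

```latex
% Standard discrete-time linear Kalman filter (textbook form); the
% symbols are conventional and are NOT necessarily the notation of the
% patent's equations (21)-(29), which appear only as images.
\begin{align*}
  \text{state equation:}       \quad & x_k = A x_{k-1} + B u_k + w_k,
                                       \quad w_k \sim \mathcal{N}(0, Q) \\
  \text{observation equation:} \quad & y_k = C x_k + v_k,
                                       \quad v_k \sim \mathcal{N}(0, R) \\
  \text{prediction:}           \quad & \hat{x}_k^- = A \hat{x}_{k-1} + B u_k \\
                                     & P_k^- = A P_{k-1} A^\top + Q \\
  \text{gain:}                 \quad & K_k = P_k^- C^\top
                                       \bigl( C P_k^- C^\top + R \bigr)^{-1} \\
  \text{update:}               \quad & \hat{x}_k = \hat{x}_k^-
                                       + K_k \bigl( y_k - C \hat{x}_k^- \bigr) \\
                                     & P_k = \bigl( I - K_k C \bigr) P_k^-
\end{align*}
```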
In the method described above with reference to FIG. 12, for example, the noise removal filter included in the predetermined processing unit 425 may cause changes in the second detection value after the predetermined processing, such as amplitude attenuation and phase delay, compared with the second detection value before the predetermined processing. For example, if a low-pass filter is used as the noise removal filter and its cutoff frequency is lowered to raise the S/N of the second detection value, the original, non-noise signal may also be attenuated. In the method of FIG. 14, on the other hand, the linear Kalman filter is applied to the third estimated value, which is the second detection value predicted from the input information at the current time step, and to the second detection value, which is the actual observation, so the second detection value can be estimated without being affected by amplitude attenuation or phase delay. That is, by estimating the true value of the second detection value with the linear Kalman filter, noise can be removed from the second detection value while the original, non-noise signal is preserved. This improves the precision of the second detection value, which is accurate but imprecise, and the second estimation processing unit 420 can therefore perform the second estimation process more appropriately by using this higher-precision second detection value.
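The contrast drawn above — prediction from the input information fused with the raw observation, rather than a plain low-pass filter — can be sketched in a few lines. This is a minimal scalar illustration under assumed gains and noise variances, not the patent's implementation; the patent's equation (21) also uses the previous second estimated value, which this one-state sketch folds into `prev_filtered`.

```python
# Minimal scalar sketch of the Fig. 14 pipeline: a third estimate
# predicts the tip-sensor reading from the operation-input difference,
# then a one-dimensional linear Kalman step fuses it with the noisy
# measurement. All names, gains, and noise variances are hypothetical.

def third_estimate(prev_filtered, diff_value, gain=1.0):
    """Predict the current second detection value from the previous
    noise-removed value and the inter-step operation difference."""
    return prev_filtered + gain * diff_value

def kalman_update(predicted, measured, p_prior, q=1e-3, r=1e-2):
    """One scalar linear Kalman step (the 'predetermined processing')."""
    p_minus = p_prior + q                 # a priori covariance
    k = p_minus / (p_minus + r)           # Kalman gain
    filtered = predicted + k * (measured - predicted)
    p_post = (1.0 - k) * p_minus          # a posteriori covariance
    return filtered, p_post

# One time step: predict from the input, then correct with the raw
# (accurate but imprecise) second detection value.
pred = third_estimate(prev_filtered=10.0, diff_value=0.5)
filtered, p = kalman_update(pred, measured=10.8, p_prior=1.0)
print(round(pred, 3), round(filtered, 3))
```

Because the prediction carries the full-bandwidth input information, the fused value follows fast motions without the phase lag a low-pass filter would introduce.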
The values that can be used in the third estimation process are not limited to the above. For example, output values from an encoder, acceleration sensor, or gyro sensor that measures the rotational operation amount about the A axis of the manipulator 100, or output values from a position sensor or linear scale that measures the displacement operation amount along the A axis, may also be used in the third estimation process, because these output values are input data that affect the second detection value acquired by the second sensor 320.
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that follows the first time step. The processing unit 400 generates the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The processing unit 400 generates the second detection value after the predetermined processing by executing the predetermined processing based on the second detection value at the first time step and at the second time step. The processing unit 400 generates the third estimated value by executing the third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step. The processing unit 400 generates the second detection value after the predetermined processing at the second time step by executing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step. The processing unit 400 then generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. In this way, the second detection value after the predetermined processing can be kept free of amplitude attenuation, phase delay, and similar effects, and the processing unit 400 can perform a more appropriate second estimation process based on it.
The method of the present embodiment may be implemented as a shape estimation method for the manipulator 100. That is, the shape estimation method of the manipulator 100 of the present embodiment includes receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that follows the first time step. The shape estimation method includes generating the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and the second estimated value at the first time step. The shape estimation method includes generating the second detection value after the predetermined processing by executing the predetermined processing based on the second detection value at the first time step and at the second time step. The shape estimation method includes generating the third estimated value by executing the third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step. The shape estimation method includes generating the second detection value after the predetermined processing at the second time step by executing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step. The shape estimation method includes generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step. In this way, the same effects as described above can be obtained.
The method of the present embodiment is not limited to the above. For example, as in the processing example shown in the flowchart of FIG. 15, the processing unit 400 may add a process for determining whether the user is actually performing an operation input on the operation unit 200, and may refrain from executing the aforementioned bending angle estimation process (step S300) while the user is not operating the operation unit 200. For example, after executing the manipulator system start-up process (step S100) described above with reference to FIG. 7, the processing unit 400 executes an interrupt prohibition process (step S110); more specifically, it prohibits the timer interrupt processing that includes the bending angle estimation process (step S300). The processing unit 400 then determines whether there is an input from the user (step S120). If it determines that there is no input from the user (NO in step S120), it determines, as in FIG. 7, whether an operation to end the manipulator system 10 has been performed (step S200). If the processing unit 400 determines that the operation to end the manipulator system 10 has not been performed (NO in step S200), it executes the process of step S120 again. In other words, after the manipulator system 10 is started, steps S120 and S200 form a loop that continues as long as there is no input from the user. While the bending angle estimation process (step S300) is not executed, the first estimation process and the second estimation process are not performed.
On the other hand, if the processing unit 400 determines that there is an input from the user (YES in step S120), it performs an interrupt permission process (step S130); more specifically, it permits the timer interrupt processing that includes the bending angle estimation process (step S300). The process of step S120 can be realized, for example, by determining whether the first detection value acquired by the first sensor 310 is equal to or greater than a certain threshold, but is not limited to this. For example, the processing unit 400 may include an imaging device (not shown) and determine whether the user is performing an input operation based on a captured image of the user, or it may make this determination based on, for example, a motion tracker worn by the user.
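The threshold-based realization of step S120 mentioned above can be sketched as follows. The threshold and the sample values are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the input check of step S120: the bending-angle
# estimation (the timer-interrupt processing) is enabled only while the
# first detection value exceeds a threshold.

INPUT_THRESHOLD = 0.05  # hypothetical dead-band on the operation input

def estimation_enabled(first_detection_value, threshold=INPUT_THRESHOLD):
    """Treat an operation input as present only above the threshold."""
    return abs(first_detection_value) >= threshold

samples = [0.0, 0.01, 0.2, -0.3, 0.02]
enabled = [estimation_enabled(v) for v in samples]
print(enabled)
```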
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 does not perform the first estimation process and the second estimation process when the user is not performing an operation input. Normally, when no input operation is performed on the operation unit 200, the position of the distal end portion 130 of the manipulator 100 does not change, so the same values are input to the first estimation processing unit 410, the second estimation processing unit 420, and so on, and the second estimated value is not updated. However, when the manipulator 100 is, for example, a medical manipulator inserted into a body cavity, peristalsis of the large intestine or the like can move the distal end portion 130 of the manipulator 100 and change the second detection value even though the user is not operating the operation unit 200. This can cause the inconvenience of the second estimated value being updated even though the user is not operating the operation unit 200. By applying the method of the present embodiment, the first estimation process and the second estimation process are not performed when the user is not operating the operation unit 200, so unexpected fluctuation of the second estimated value can be prevented. The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100; that is, the shape estimation method of the manipulator 100 of the present embodiment includes not performing the first estimation process and the second estimation process when the user is not performing an operation input. In this way, the same effects as described above can be obtained.
Alternatively, as in the processing example shown in the flowchart of FIG. 16, when the processing unit 400 determines that there is no input from the user (NO in step S120), it may perform a contact state notification process (step S140) before performing the process of step S200. The contact state notification process (step S140) notifies the user, using a display unit (not shown) or the like, that the distal end portion 130 of the manipulator 100 is in contact with the external environment, such as the inner wall of the large intestine, when, for example, the amount of change in the second detection value exceeds a certain threshold. When the distal end portion 130 of the manipulator 100 contacts the external environment, a separate second estimated value after the contact may also be generated based on the second estimated value at the time the input operation on the operation unit 200 ended and the second detection value after the contact. This makes it possible to obtain information on the state of the bending portion while the distal end portion 130 of the manipulator 100 is in contact with the external environment.
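The contact check behind step S140 reduces to a simple change-of-value test. This is a hypothetical sketch; the threshold value is an illustrative assumption.

```python
# Hypothetical sketch of the contact check of step S140: while no user
# input is present, a change in the second detection value beyond a
# threshold is treated as tip contact with the environment (e.g. the
# intestinal wall moving the tip through peristalsis).

def tip_in_contact(prev_second_det, second_det, threshold=0.5):
    """Report contact when the tip moved although no input was given."""
    return abs(second_det - prev_second_det) > threshold

print(tip_in_contact(10.0, 10.8), tip_in_contact(10.0, 10.2))
```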
The method of the present embodiment is not limited to the above. For example, as shown in the flowchart of FIG. 17, when the operation to end the manipulator system 10 is performed (YES in step S200), the processing unit 400 may execute an estimation model update process (step S202). The estimation model update process (step S202) is, for example, a process of updating the coefficients on the right-hand side of equation (3). As described above, before the manipulator system 10 is used, the coefficients on the right-hand side of equation (3) are stored in advance in a storage unit (not shown), and the first estimated value is generated based on those coefficients; as the manipulator system 10 is used, the second estimation process and related processes are executed, and the first estimated value is corrected to the second estimated value. The coefficients on the right-hand side of equation (3) are therefore updated based on the second estimated value at the end of use of the manipulator system 10.
As described above, in the manipulator system 10 of the present embodiment, the processing unit 400 updates the estimation model of the first estimation process when the user finishes using the manipulator 100. In this way, the first estimation process can be performed more appropriately the next time the manipulator system 10 is used. Moreover, as the manipulator 100 continues to be used, parts such as the wire 160 deteriorate and the responsiveness to the operation unit 200 changes, so the coefficients of equation (3) change over time, and the limits of this deterioration are difficult to manage directly. By applying the method of the present embodiment, the user can grasp the degree of deterioration of the parts of the manipulator 100 through changes in the coefficients of equation (3). The method of the present embodiment may also be implemented as a shape estimation method for the manipulator 100; that is, the shape estimation method of the manipulator 100 of the present embodiment includes, in the first estimation process, converting the first detection value into the first estimated value using an arithmetic model in which the relationship between the operation input and the shape of the bending portion is modeled, and updating the estimation model of the first estimation process when the user finishes using the manipulator 100. In this way, the same effects as described above can be obtained.
The method of the present embodiment can also be modified in various ways. For example, the estimation model update process (step S202) may be added after the bending angle estimation process (step S300) in FIG. 7 and performed at every time step. A process may also be added that, when an updated coefficient falls outside a predetermined range, reports this using a predetermined notification means. Because the coefficient of equation (3) has left the predetermined range, the user can recognize that the parts constituting the manipulator 100 are not in a normal state, which helps prevent accidents during use of the manipulator 100. The predetermined notification means may be, for example, a display on a display unit (not shown) prompting replacement of the manipulator 100, suppression of the display of the captured image transmitted from the imaging element arranged at the distal end portion 130 of the manipulator 100, or a predetermined alarm sounded from an audio output device (not shown). Changes in the coefficients of equation (3) may also be collected as big data on a predetermined server through a wired or wireless network.
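The coefficient update and out-of-range check described above can be sketched as follows. The linear model form, learning rate, and bounds are illustrative assumptions; the patent's equation (3) and its coefficients are not reproduced in this text.

```python
# Hypothetical sketch of the estimation model update (step S202) and the
# out-of-range check: fit the coefficient to the corrected second
# estimates, then flag values outside an assumed normal range.

def update_coefficient(old_coeff, first_values, second_estimates, lr=0.1):
    """Nudge the model coefficient so that coeff * first_value better
    matches the corrected second estimates (one least-squares step)."""
    num = sum(f * s for f, s in zip(first_values, second_estimates))
    den = sum(f * f for f in first_values) or 1.0
    fitted = num / den
    return old_coeff + lr * (fitted - old_coeff)

def coefficient_in_range(coeff, lo=0.8, hi=1.2):
    """An out-of-range coefficient suggests worn parts (e.g. wire 160)."""
    return lo <= coeff <= hi

coeff = update_coefficient(1.0, [1.0, 2.0, 3.0], [1.1, 2.2, 3.3])
print(round(coeff, 3), coefficient_in_range(coeff))
```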
Although the above description has used a Kalman filter in the second estimation process, a complementary filter, for example, may be used instead. For example, the first detection value acquired by the first sensor 310 is reliable in its high-frequency components, while the second detection value acquired by the second sensor 320 is reliable in its low-frequency components. A complementary filter may therefore be realized by having the first estimation processing unit 410 of the processing unit 400 include a high-pass filter and the second estimation processing unit 420 include a low-pass filter.
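A first-order complementary filter of the kind just described can be sketched in one line of arithmetic: the operation-side increment passes through the high-frequency path, the tip-sensor reading through the matching low-frequency path, and the two sum to a full-band estimate. The blend coefficient alpha is an illustrative assumption.

```python
# Hypothetical sketch of the complementary-filter alternative: high-pass
# the operation-input path (first-sensor side), low-pass the tip-sensor
# path (second-sensor side), and sum the two.

def complementary_fuse(prev_angle, input_delta, sensor_angle, alpha=0.98):
    """Blend the integrated input increment with the sensor reading."""
    return alpha * (prev_angle + input_delta) + (1.0 - alpha) * sensor_angle

# With no operation input, the fused angle drifts toward the
# low-frequency tip-sensor reading.
angle = 0.0
for _ in range(300):
    angle = complementary_fuse(angle, 0.0, 30.0)
print(round(angle, 2))
```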
Although the present embodiment has been described in detail above, those skilled in the art will readily understand that many modifications are possible without substantially departing from the novel matters and effects of the present embodiment. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. For example, a term that appears at least once in the specification or drawings together with a different, broader, or synonymous term can be replaced with that different term anywhere in the specification or drawings. All combinations of the present embodiment and its modifications are also included in the scope of the present disclosure. The configurations and operations of the manipulator system and the manipulator shape estimation method are likewise not limited to those described in the present embodiment, and various modifications are possible.
DESCRIPTION OF SYMBOLS: 10: manipulator system; 100: manipulator; 110: outer sheath; 120: bending piece; 130: distal end portion; 140: connecting portion; 160: wire; 160r: right bending wire; 160l: left bending wire; 160u: upper bending wire; 160d: lower bending wire; 200: operation unit; 200A, 200B: angle knobs; 310: first sensor; 320: second sensor; 400: processing unit; 410: first estimation processing unit; 420: second estimation processing unit; 425: predetermined processing unit; 430: third estimation processing unit

Claims (19)

  1.  A manipulator system comprising:
     a manipulator including a bending portion;
     an operation unit through which a user performs an operation input to drive the bending portion;
     a first sensor provided in the operation unit and configured to output a first detection value related to an operation input amount of the operation input;
     a second sensor provided at a distal end of the manipulator and configured to output a second detection value related to an operation of the manipulator; and
     a processing unit including at least one processor,
     the processing unit being configured to:
     receive the first detection value;
     generate a first estimated value indicating a shape of the bending portion by executing a first estimation process of the shape of the bending portion based on the first detection value;
     receive the second detection value; and
     generate a second estimated value indicating the shape of the bending portion by executing a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value.
  2.  The manipulator system according to claim 1, wherein
     the processing unit estimates the second estimated value by the second estimation process using a Kalman filter.
  3.  The manipulator system according to claim 1, wherein
     the first estimated value and the second estimated value are bending angles of the bending portion.
  4.  The manipulator system according to claim 1, wherein
     the second detection value is a detection value based on at least one of a position, displacement, velocity, acceleration, angle, and angular velocity of the distal end of the manipulator.
  5.  In claim 1,
     the second sensor includes at least one of an acceleration sensor, an angular velocity sensor, a position sensor, and an imaging device arranged on the tip side of the manipulator.
  6.  In claim 1,
     in the first estimation process, the processing unit converts the first detection value into the first estimated value using a computational model in which the relationship between the operation input and the shape of the bending portion is modeled.
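The computational model of claim 6 maps the operation input amount (for example, knob rotation at the operation unit) to a bending-portion shape. A minimal sketch of one such model follows, assuming a fitted quadratic; the coefficients are illustrative placeholders, not values from the patent, which does not disclose a specific model form.

```python
def first_estimation(input_amount, coeffs=(0.0, 0.9, -0.002)):
    """Convert the first detection value (operation input amount) into the
    first estimated value (bending angle, deg) via a quadratic model.
    The nonlinear term stands in for effects such as wire slack or friction."""
    c0, c1, c2 = coeffs
    return c0 + c1 * input_amount + c2 * input_amount ** 2

# 40 units of knob rotation -> about 32.8 deg of bend under these assumed coefficients
angle = first_estimation(40.0)
```

In practice such coefficients would be identified from calibration data for a specific manipulator, and claim 12 notes that the model may be updated when the user finishes using the device.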
  7.  In claim 1,
     the processing unit:
     receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generates the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step; and
     generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step.
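The recursive scheme of claim 7 can be sketched as a single per-time-step update: the first estimation advances the previous fused estimate by the change in the operation input (the difference value), and the second estimation fuses the result with the current tip measurement. The linear model, the fixed gains, and the scalar state are illustrative assumptions, not the patent's disclosed method.

```python
def step(prev_estimate, d1_prev, d1_curr, z_curr, model_gain=0.9, alpha=0.5):
    """One time step of the claim-7 recursion (hedged sketch).

    prev_estimate: second estimated value at the first (previous) time step
    d1_prev/d1_curr: first detection value at the first/second time step
    z_curr: second detection value at the second (current) time step
    """
    diff = d1_curr - d1_prev                       # difference value
    first_est = prev_estimate + model_gain * diff  # first estimation process
    second_est = first_est + alpha * (z_curr - first_est)  # second estimation process
    return second_est

# Operation input moves 10 -> 14, tip sensor reads 33 deg,
# previous fused estimate was 30 deg: the new fused estimate is 33.3 deg.
est = step(30.0, 10.0, 14.0, 33.0)
```

Using the *difference* of the operation input, rather than its absolute value, lets the recursion correct accumulated model error with each fused estimate instead of re-deriving the shape from scratch.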
  8.  In claim 1,
     the processing unit:
     receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generates the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step;
     generates, at the first time step and the second time step, the second detection value after predetermined processing by executing the predetermined processing based on the second detection value;
     generates the second detection value after the predetermined processing by executing the predetermined processing at the second time step based on the second detection value after the predetermined processing at the first time step and the second detection value at the second time step; and
     generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  9.  In claim 1,
     the processing unit:
     receives the first detection value and the second detection value, and generates the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generates the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step;
     generates, at the first time step and the second time step, the second detection value after predetermined processing by executing the predetermined processing based on the second detection value;
     generates a third estimated value by executing a third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step;
     generates the second detection value after the predetermined processing at the second time step by executing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step; and
     generates the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  10.  In claim 8 or 9,
     the processing unit generates the second detection value after the predetermined processing by the predetermined processing using a Kalman filter.
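The "predetermined processing" of claims 8 to 10 smooths the raw second detection value across time steps with a Kalman filter before it enters the second estimation process, carrying the filtered value from the first time step into the second. As a hedged sketch, the simplest case reduces to a steady-state (fixed-gain) Kalman update; the gain value and function name are illustrative assumptions.

```python
def predetermined_processing(z_filtered_prev, z_raw_curr, gain=0.3):
    """Steady-state Kalman update on the second detection value:
    blend the filtered value carried over from the previous time step
    with the raw sensor reading at the current time step."""
    return z_filtered_prev + gain * (z_raw_curr - z_filtered_prev)

# Filtered value 30.0 at the first time step, raw reading 40.0 at the second
# time step: the post-processing value becomes 33.0.
z_filt = predetermined_processing(30.0, 40.0)
```

A full time-varying Kalman filter would also propagate a variance alongside the value, as in the claim-2 sketch; the fixed gain here is the limit that filter converges to when the noise statistics are constant.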
  11.  In claim 1,
     the processing unit does not perform the first estimation process and the second estimation process when the user is not performing the operation input.
  12.  In claim 1,
     the processing unit:
     in the first estimation process, converts the first detection value into the first estimated value using a computational model in which the relationship between the operation input and the shape of the bending portion is modeled; and
     updates the computational model when the user finishes using the manipulator.
  13.  A shape estimation method for a manipulator, comprising:
     receiving a first detection value from a first sensor that is provided in an operation unit through which a user performs an operation input to drive a bending portion of the manipulator and that outputs the first detection value related to an operation input amount of the operation input;
     receiving a second detection value from a second sensor that is provided at a tip of the manipulator and that outputs the second detection value related to motion of the manipulator;
     generating a first estimated value indicating a shape of the bending portion by executing a first estimation process of the shape of the bending portion based on the first detection value; and
     generating a second estimated value indicating the shape of the bending portion by executing a second estimation process of the shape of the bending portion based on the first estimated value and the second detection value.
  14.  In claim 13,
     the method further comprising estimating the second estimated value by the second estimation process using a Kalman filter.
  15.  In claim 13,
     the method further comprising, in the first estimation process, converting the first detection value into the first estimated value using a computational model in which the relationship between the operation input and the shape of the bending portion is modeled.
  16.  In claim 13,
     the method further comprising:
     receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generating the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step; and
     generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value at the second time step.
  17.  In claim 13,
     the method further comprising:
     receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generating the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step;
     generating, at the first time step and the second time step, the second detection value after predetermined processing by executing the predetermined processing based on the second detection value;
     generating the second detection value after the predetermined processing by executing the predetermined processing at the second time step based on the second detection value at the first time step and the second detection value at the second time step; and
     generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  18.  In claim 13,
     the method further comprising:
     receiving the first detection value and the second detection value, and generating the first estimated value and the second estimated value, at a first time step and at a second time step that is a time step after the first time step;
     generating the first estimated value by executing the first estimation process at the second time step based on a difference value, which is the difference between the first detection value at the first time step and the first detection value at the second time step, and on the second estimated value at the first time step;
     generating, at the first time step and the second time step, the second detection value after predetermined processing by executing the predetermined processing based on the second detection value;
     generating a third estimated value by executing a third estimation process at the second time step based on the second estimated value at the first time step, the difference value at the second time step, and the second detection value after the predetermined processing at the first time step;
     generating the second detection value after the predetermined processing at the second time step by executing the predetermined processing based on the second detection value at the second time step and the third estimated value at the second time step; and
     generating the second estimated value by executing the second estimation process at the second time step based on the first estimated value at the second time step and the second detection value after the predetermined processing at the second time step.
  19.  In claim 17 or 18,
     the method further comprising generating the second detection value after the predetermined processing by the predetermined processing using a Kalman filter.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/032120 WO2023032074A1 (en) 2021-09-01 2021-09-01 Manipulator system and method for inferring shape of manipulator


Publications (1)

Publication Number Publication Date
WO2023032074A1

Family

ID=85410993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/032120 WO2023032074A1 (en) 2021-09-01 2021-09-01 Manipulator system and method for inferring shape of manipulator

Country Status (1)

Country Link
WO (1) WO2023032074A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009131406A (en) * 2007-11-29 2009-06-18 Olympus Medical Systems Corp Endoscope system
JP2013172905A (en) * 2012-02-27 2013-09-05 Olympus Corp Tubular insertion system
JP2018521331A (en) * 2015-06-17 2018-08-02 ザ・チャールズ・スターク・ドレイパー・ラボラトリー・インコーポレイテッド System and method for determining shape and / or position
JP2021502173A (en) * 2017-11-10 2021-01-28 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Systems and methods for controlling robot manipulators or related tools
WO2021166103A1 (en) * 2020-02-19 2021-08-26 オリンパス株式会社 Endoscopic system, lumen structure calculating device, and method for creating lumen structure information

Similar Documents

Publication Publication Date Title
JP5242865B1 (en) Insert section shape estimation device
JP6112300B2 (en) Master-slave robot control device and control method, master-slave robot, and control program
JP4897122B2 (en) Endoscope shape detection apparatus and endoscope shape detection method
Enayati et al. A quaternion-based unscented Kalman filter for robust optical/inertial motion tracking in computer-assisted surgery
JP4509988B2 (en) Operation device control apparatus, operation device control method, program, and operation device
JP5594940B2 (en) Method and apparatus for detecting a substantially invariant axis of rotation
US8523765B2 (en) Medical control apparatus
US20180088685A1 (en) Attitude detecting device
Lin et al. Development of the wireless ultra-miniaturized inertial measurement unit WB-4: Preliminary performance evaluation
JP2008077425A (en) Information processor, information processing method, and program
Olivares et al. Using frequency analysis to improve the precision of human body posture algorithms based on Kalman filters
JP2014517747A (en) Medical master / slave device for minimally invasive surgery
CN113108790B (en) Robot IMU angle measurement method and device, computer equipment and storage medium
JP2015179002A (en) Attitude estimation method, attitude estimation device and program
WO2023032074A1 (en) Manipulator system and method for inferring shape of manipulator
JP4897121B2 (en) Endoscope shape detection apparatus and endoscope shape detection method
CN117224241B (en) Control method and related device of soft endoscope control system
JP7406390B2 (en) Calibration device and its control method
CN115919250A (en) Human dynamic joint angle measuring system
Llorach et al. Position estimation with a low-cost inertial measurement unit
CN117091592A (en) Gesture resolving method, gesture resolving device, and computer storage medium
CN115844380A (en) Quaternion-based lower limb posture detection method and system
JP2006038650A (en) Posture measuring method, posture controller, azimuth meter and computer program
JP2007011460A (en) Method for simulating displacement of object, device for simulating displacement of object, and inner force sense presentation device
Rahman et al. Range of motion measurement using single inertial measurement unit sensor: a validation and comparative study of sensor fusion techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21955964

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE