WO2022027129A1 - Robot control stick, computer-implemented method for controlling a robot portion, and teach pendant system - Google Patents

Robot control stick, computer-implemented method for controlling a robot portion, and teach pendant system Download PDF

Info

Publication number
WO2022027129A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
orientation
elongated member
plane
signal
Prior art date
Application number
PCT/CA2021/051067
Other languages
French (fr)
Inventor
Yan Drolet-Mihelic
Alexander SELIVANOV
Original Assignee
Stickôbot Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stickôbot Inc. filed Critical Stickôbot Inc.
Publication of WO2022027129A1


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/06Control stands, e.g. consoles, switchboards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0081Programme-controlled manipulators with master teach-in means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/427Teaching successive positions by tracking the position of a joystick or handle to control the positioning servo of the tool head, master-slave control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39438Direct programming at the console
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39443Portable, adapted to handpalm, with joystick, function keys, display

Definitions

  • ROBOT CONTROL STICK, COMPUTER-IMPLEMENTED METHOD FOR CONTROLLING A ROBOT PORTION, AND TEACH PENDANT SYSTEM
  • the improvements generally relate to robotized systems, and more particularly relate to teaching movements to robots using teach pendant units.
  • a teach pendant unit is a type of controller used for teaching movement(s) that a robot portion is to perform.
  • the teach pendant unit can be used for teaching a robotized articulated arm manipulating a tool a series of successive steps to process a workpiece.
  • the teach pendant unit is generally manually operated by an operator who controls the movement(s) of the robotized articulated arm using keys and/or buttons, while watching the movement of the tool relative to the workpiece.
  • the robot control stick can be used to constrain the movement of the robot portion within a two-dimensional plane that would otherwise be in a three-dimensional space. Accordingly, when detecting a given user input, the robot control stick can fix a base orientation corresponding to the current orientation of the robot control stick, which can in turn define a robot plane orthogonal to the base orientation. Once the robot plane has been defined, pivoting or otherwise rotating the robot control stick will cause the robot portion to move within a two-dimensional path which is confined to the robot plane, thereby facilitating the control of the robot portion in at least some circumstances.
  • a robot control system for controlling a robot portion of a robot, the robot control system comprising a robot control stick having an elongated member, a user input sensor mounted to said elongated member and activatable by a user to generate a user input, an orientation sensor mounted to said elongated member and configured for generating an orientation signal indicative of an orientation of said elongated member in a coordinate system of the robot; a communication module configured to generate a signal including at least a base orientation signal generated by the orientation sensor upon generation of the user input; a computer communicatively coupled to receive the signal, the computer having software stored in a non-volatile memory of the computer and configured to, when executed by a processor of the computer, perform the steps of: defining one of a robot axis and a robot plane based on the base orientation signal, and subsequently to said defining one of the robot axis and the robot plane, controlling the movement of the robot portion based on the signal, including confining the movement of the robot portion to the corresponding one of the robot axis and the robot plane.
  • a robot control system for controlling a robot portion of a robot, the robot control system comprising: a robot control stick having an elongated member, a user input sensor mounted to said elongated member and activatable by a user to generate a user input, an orientation sensor mounted to said elongated member and configured for measuring an orientation of said elongated member in a coordinate system of the robot; a communication module configured to generate a control stick signal including at least orientation values measured by the orientation sensor upon generation of the user input; a computer communicatively coupled to receive the control stick signal, the computer having software stored in a non-volatile memory of the computer and configured to, when executed by a processor of the computer, perform the steps of: defining one of a robot axis and a robot plane based on the measured orientation values, and subsequently to said defining one of the robot axis and the robot plane, controlling a movement of the robot portion based on the control stick signal, including confining the movement of the robot portion to the corresponding one of the robot axis and the robot plane.
  • a robot control stick for controlling a robot portion, the robot control stick comprising: an elongated member; a user input sensor mounted to said elongated member and generating a user input upon activation; and an orientation sensor mounted to said elongated member and communicatively coupled to said user input sensor, said orientation sensor generating, upon receiving said user input, a base orientation signal indicative of an orientation of said elongated member relative to a three-dimensional coordinate system, and further generating, once said base orientation has been generated, an orientation variation signal indicative of a variation of an orientation of said elongated member relative to said three-dimensional coordinate system, said orientation variation signal being convertible into a two-dimensional path confined to a robot plane orthogonal to said base orientation, said robot portion being controllable to move within said robot plane along said two-dimensional path.
  • a computer-implemented method for controlling a robot portion using a robot control stick having an orientation sensor and a user input sensor, the computer-implemented method comprising: upon receiving a user input at a given moment in time, and using an orientation sensor, generating a base orientation signal indicative of a base orientation of said orientation sensor relative to a three-dimensional coordinate system at said given moment in time, and determining a base orientation based on said base orientation signal; subsequent to said determining, generating an orientation variation signal indicative of a variation of said orientation of said orientation sensor; converting said orientation variation signal into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
  • a teach pendant system for a robot portion movable in a three-dimensional coordinate system, the teach pendant system comprising: an elongated member having an orientation sensor generating an orientation signal indicative of an orientation of said elongated member relative to said three-dimensional coordinate system; and a teach pendant unit communicatively coupled to said orientation sensor and to said robot portion, the teach pendant unit having a processor and a memory having instructions that when executed by the processor perform the steps of: upon receiving a user input at a given moment in time, determining a base orientation of said elongated member at said moment in time based on said orientation signal; subsequent to said determining, monitoring an orientation of said elongated member based on said orientation signal; converting said monitored orientation into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
  • Fig. 1 is an oblique view of an example teach pendant system for controlling a robot portion, shown with a robot control stick and a controller, in accordance with one or more embodiments;
  • Fig. 1A is a sectional view taken along section 1A-1A of Fig. 1;
  • Fig. 1B is a sectional view taken along section 1B-1B of Fig. 1;
  • Fig. 2 is an oblique view of the teach pendant system of Fig. 1, shown with a teach pendant unit and a robot control stick oriented along a base orientation, in accordance with one or more embodiments;
  • Fig. 3 is an oblique view of the teach pendant system of Fig. 1, shown with the robot control stick rotated by a given angle relative to the base orientation of Fig. 2, in accordance with one or more embodiments;
  • Fig. 4 is a side elevation view of the robot control stick of Fig. 1, showing a side button, in accordance with one or more embodiments;
  • Fig. 4A is a top plan view of the robot control stick of Fig. 1, showing top buttons, in accordance with one or more embodiments;
  • Fig. 5 is a schematic view of an example of a computing device of the controller of Fig. 1, in accordance with one or more embodiments;
  • Fig. 6 is a schematic view of an example of a software application of the controller of Fig. 1, in accordance with one or more embodiments;
  • Fig. 7 is a flow chart of an example method of controlling a robot portion using a robot control stick, in accordance with one or more embodiments.
  • Fig. 8 is an oblique view of the teach pendant system of Fig. 1, shown with the robot control stick rotated by a given angle past the base orientation of Fig. 2, in accordance with one or more embodiments.
  • Fig. 1 shows a teach pendant system 100 used to control a robot portion 102 within a coordinate system.
  • the robot portion 102 is provided in the form of a robotized articulated arm 103.
  • the robotized articulated arm 103 has an end equipped with a robotized plier 105.
  • robots can have different actuating structures and robot portions can take different forms without departing from the present disclosure.
  • the teach pendant system 100 can be used to control any portion thereof depending on the embodiment.
  • the robotized articulated arm 103 in this example can be used to process a workpiece in an industrial application
  • the robot portion may be adapted and used to perform steps in any other industry including, but not limited to, the aerospace industry, the consumer product industry, the disaster response industry, the drone industry, the education industry, the exoskeleton industry, the humanoid industry, the military industry, the security industry, the research industry, the self-driving car industry, the telepresence industry, the underwater industry, or any combination thereof.
  • a teach pendant system such as described herein may advantageously be used in any such alternate embodiment.
  • the teach pendant system 100 has a robot control stick 106.
  • the robot control stick 106 has an elongated member 107.
  • the elongated member 107 may have a pen-like shape so as to be easily manipulable with a single hand.
  • the pen-like shape of the elongated member 107 can be used to indicate a direction of movement of the robot portion 102 within a robot plane 104, in some embodiments.
  • the elongated member 107 extends between a first end 116 and a second end 118, with a cylindrical-like body extending linearly between the first 116 and second ends 118.
  • the robot control stick has one or more user input sensors 108 such as side button(s) 120, top button(s) 122 and the like.
  • the button(s) can be click button(s), touch button(s) and/or force button(s) depending on the embodiment, to name some examples.
  • the robot control stick 106 has one or more orientation sensors 110 which are configured to generate an orientation signal indicative of an orientation of the elongated member 107 relative to a coordinate system 124 (x, y, z).
  • the robot control stick 106 can be configured to generate the orientation signal continuously, at a given frequency, or on demand (e.g. based on user input), for instance.
  • examples of the orientation sensor can include, but are not limited to, accelerometer(s), gyroscope(s), magnetometer(s), GPS sensor(s), camera(s), inclinometer(s) and the like.
  • the orientation sensor 110 can be provided in the form of an orientation sensor assembly having a three-axis accelerometer, a three-axis gyroscope and/or a three-axis magnetometer generating measurement signals indicative of a corresponding measurand in all three dimensions.
  • a three-axis accelerometer may be the simplest implementation in some embodiments.
  • a given orientation signal, which will be referred to herein as a base orientation signal, can be generated by the orientation sensor 110 at a moment in time when a user triggers a given user input.
  • the given user input can be the pressing and holding of a given button in one example, such as the side button 120 or one of the top buttons 122 of the robot control stick 106 shown in Figs. 4 and 4A for instance.
  • an indication that the given user input has been triggered can be communicated, preferably wirelessly, to a computer such as a computer 126 integrated within the teach pendant unit 114 for instance, together with the base orientation signal.
  • the software running on the computer 126 can trigger different modes of operation at this stage.
  • the software running on the computer 126 can define a robot plane 104 relative to the base orientation signal.
  • the base orientation signal can be indicative of a base orientation 128, which corresponds to the orientation of a stick axis 130 extending along the length of the robot control stick 106 for instance.
  • the robot plane 104 can be defined virtually in the coordinate system 124 of the robot, normal to the orientation of the stick axis 130 at the time when the first user input is received.
  • the robot plane 104 can then serve as a reference for controlling the movement of the robot portion 102.
  • the software can lock the movement of the robot portion 102 within the robot plane 104 for as long as the first user input is maintained, e.g. for as long as the user continues to press the button, and subsequent movement of the robot control stick 106 such as displacement relative to the robot plane 104 or subsequent inclination of the stick axis 130 relative to the orientation of the stick axis 130 at the time when the robot plane 104 is defined can be detected, for instance, based on variations of the orientation signal over time, and be used to control the movement of the robot portion 102 within the robot plane 104.
  • the orientation sensor 110 can continue to generate an orientation signal subsequently to the generation of the base orientation 128 signal, and the following signal can be compared to the initial orientation and interpreted as an orientation variation signal (and/or data) indicative of a variation of the orientation of the elongated member 107.
  • the orientation variation signal can be indicative of a plurality of orientation values at a corresponding plurality of moments in time.
  • the orientation variation signal can have a given frequency, i.e., a given number of orientation values within a given time duration unit.
  • the orientation variation signal thereby monitors the inclination of the elongated member 107 as it is manipulated by an operator during a teaching sequence, for instance.
  • moving the stick axis 130 in a plane parallel to the robot plane 104 while keeping the stick axis 130 perpendicular to the robot plane 104 can be detected by the orientation sensor 110 and used as a variation signal to trigger displacement of the robot portion 102 within the robot plane 104.
  • the user input sensor 108 has to be activated for a time duration which is greater than a corresponding time duration threshold (e.g., a few ms) in order to determine the base orientation 128.
  • accidental and momentary activation of the user input sensor 108 may not redefine the base orientation 128. It is noted that a base orientation, such as the one currently shown in Fig. 1 for instance, may be waived and re-determined upon a user activating the user input sensor 108 during a period of time that is greater than the corresponding time duration threshold.
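As a hedged, minimal illustration of the hold-to-set behaviour described in the two bullets above (not the patent's actual firmware), the Python sketch below only latches a base orientation once the button has been held past a time threshold; the class name, threshold value and helper names are assumptions.

```python
import time

HOLD_THRESHOLD_S = 0.05  # assumed threshold; the text only requires more than "a few ms"

class BaseOrientationLatch:
    """(Re)define the base orientation only once the user input sensor has been
    held longer than a time duration threshold, so accidental or momentary
    presses do not reset it. Minimal sketch; names are assumptions."""

    def __init__(self, threshold_s=HOLD_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.base_orientation = None
        self._pressed_since = None
        self._latched = False

    def update(self, button_pressed, current_orientation, now=None):
        now = time.monotonic() if now is None else now
        if not button_pressed:
            self._pressed_since = None   # a momentary press never reaches the threshold
            self._latched = False
            return self.base_orientation
        if self._pressed_since is None:
            self._pressed_since = now
        if not self._latched and now - self._pressed_since >= self.threshold_s:
            self.base_orientation = current_orientation   # re-determine the base orientation
            self._latched = True
        return self.base_orientation

latch = BaseOrientationLatch()
latch.update(True, (0.0, 0.0, 1.0), now=0.00)          # press begins, nothing latched yet
print(latch.update(True, (0.0, 0.0, 1.0), now=0.06))   # held past threshold -> (0.0, 0.0, 1.0)
```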
  • the software running on the computer 126 may already have one or more predefined planes in a coordinate system 124 of the robot, such as three orthogonal planes corresponding to a three dimensional coordinate system for instance.
  • the software can perform a match between a plane which is normal to the stick axis 130 at the time the user input is received, which can be determined based on a corresponding signal from the orientation sensor 110 for instance, and one of the predefined planes which is most closely aligned with the former normal plane, and define that matched predefined plane as the robot plane 104.
  • the robot plane 104 is defined indirectly, based on a closest match between the normal plane and one of the predefined virtual planes in the coordinate system 124 of the robot portion 102.
  • subsequent movement of the robot control stick 106 can be used to control the movement of the robot portion 102 relative to the robot plane 104.
  • the technique can thus be used to confine the subsequent movement of the robot portion 102 to a robot plane 104 in a manner similar to how this latter operation was described in the first example, with the distinction that the robot plane 104, in this case, is not defined solely by the orientation of the stick axis 130 of the robot control stick 106, but rather on a best match basis between the orientation of the stick axis 130 and the predefined virtual planes.
  • both the first example and the second example can be provided for in the software, and the choice of defining the robot plane 104 on the first example or on the second example can be based on the exact nature of the user input which is received, such as depending on which button has been pressed on the robot control stick 106, for instance. In other embodiments, only the process of the first example or of the second example can be provided for by the software.
  • the combination of a corresponding user input and of a corresponding base orientation signal can be used to define a robot axis.
  • the software running on the computer 126 can define a robot axis as corresponding to the base orientation 128 signal, in other words, on the basis of the orientation of the stick axis 130 at the time when the corresponding input is triggered. Once a robot axis has been defined, the movement of the robot portion 102 can be confined to the robot axis for instance.
  • a subsequent user input such as movement of the robot control stick 106 in free space or variation of inclination thereof as detected by the orientation sensor 110 for instance, can be used to move the robot portion 102 and/or change its orientation relative to the robot axis.
  • the subsequent user input can be the maintaining of a button in a pressed state after the initial user input, which can form the basis of a command to move the robot portion 102 at a predetermined speed along the robot axis for as long as the button remains in the pressed state, for instance.
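A rough sketch of the third example above, under assumed names and units: the robot axis is taken from the base orientation, and the robot portion is advanced along it at a predetermined speed on every control tick for as long as the button stays pressed. This is illustrative only, not the patent's software.

```python
import numpy as np

def jog_along_axis(robot_position, robot_axis, speed, dt, button_pressed):
    """Advance the robot portion along the locked robot axis at a predetermined
    speed while the button remains pressed. Sketch only; a real controller
    would typically stream this as velocity set-points to the robot."""
    if not button_pressed:
        return np.asarray(robot_position, dtype=float)
    axis = np.asarray(robot_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.asarray(robot_position, dtype=float) + speed * dt * axis

# Example: one 10 ms control tick along an axis defined by the base orientation.
pos = jog_along_axis([0.5, 0.0, 0.3], robot_axis=[0.0, 0.0, 1.0],
                     speed=0.05, dt=0.01, button_pressed=True)
print(pos)   # -> [0.5, 0.0, 0.3005]
```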
  • the software running on the computer 126 may already have one or more predefined axes in a coordinate system of the robot, such as three orthogonal axes (x, y, z) corresponding to a three dimensional Cartesian coordinate system for instance.
  • the software can perform a match between the orientation of the stick axis 130 at the time the user input is received, which can be determined based on a corresponding signal from the orientation sensor 110 for instance, and one of the predefined axes which is most closely aligned with the stick axis 130 orientation, and define that matched predefined axis as the robot axis.
  • instead of defining the robot axis directly from the stick axis 130 based on the signal from the orientation sensor 110, the robot axis can be defined indirectly, based on a closest match between the stick axis 130 and one of the predefined virtual axes in the coordinate system of the robot.
  • subsequent movement of the robot control stick 106 can be used to control the movement of the robot portion 102 relative to the robot axis.
  • the technique can thus be used to confine the subsequent movement of the robot portion 102 to the robot axis in a manner similar to how this latter operation was described in the third example, with the distinction that the robot axis, in this case, is not defined solely by the orientation of the stick axis 130 of the robot control stick 106, but rather on a best match basis between the orientation of the robot control stick and the predefined virtual axes.
  • all of the above examples can be provided for in the software, and the choice of which technique is retained for defining the robot plane 104 or robot axis to which movement is thereafter confined is left to the user, and can be based on user input, such as depending on which button has been pressed on the robot control stick 106, for instance.
  • only the process of the third example or of the fourth example can be provided for by the software.
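For the fourth example, one straightforward way to realise the "best match" between the stick axis 130 and predefined axes is to pick the predefined axis with the largest absolute dot product. The sketch below is an assumption-laden illustration, not the actual implementation; the axis set and names are placeholders.

```python
import numpy as np

PREDEFINED_AXES = {
    "x": np.array([1.0, 0.0, 0.0]),
    "y": np.array([0.0, 1.0, 0.0]),
    "z": np.array([0.0, 0.0, 1.0]),
}

def best_match_axis(stick_axis, axes=PREDEFINED_AXES):
    """Return the predefined robot axis most closely aligned with the stick
    axis. The sign is kept so that pointing the stick 'down' selects the
    negative direction of the matched axis. Illustrative only."""
    s = np.asarray(stick_axis, dtype=float)
    s = s / np.linalg.norm(s)
    name = max(axes, key=lambda k: abs(np.dot(axes[k], s)))
    direction = axes[name] * np.sign(np.dot(axes[name], s))
    return name, direction

print(best_match_axis([0.2, 0.9, -0.1]))   # -> ("y", array([0., 1., 0.]))
```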
  • the teach pendant system 100 also has a controller 112 which can be a computer and provide the software functionalities described above.
  • the controller 112 can be part of the robot control stick 106, part of a teach pendant unit 114, or part of both, depending on the embodiment. In some other embodiments, the controller 112 can be external to the robot control stick 106 or to the teach pendant unit 114. In these embodiments, the controller 112 is in communicative coupling with the robot control stick 106, the teach pendant unit 114 and/or to the robot portion 102. As such, it is noted that the controller 112 can be part of an electronic device such as a mobile phone, an electronic tablet and the like to control the robot portion 102 in a remote fashion.
  • the controller 112 is part of the teach pendant unit 114 and, more specifically, of the computer 126 of the teach pendant unit 114. It is understood that the controller 112 can be a computer within the teach pendant unit 114 while being separate from the computer 126 previously disclosed. It is further understood that the controller 112 can be separate from the teach pendant unit 114 without departing from the present disclosure.
  • the robot control stick 106 can have a communication module 132 (Fig. 4) which can communicate the base orientation signal and the orientation variation signal generated by the orientation sensor 110 to the teach pendant unit 114 for processing by the controller 112 enclosed therein.
  • the robot control stick 106 can have a microcontroller integrated thereto, in some embodiments.
  • the teach pendant unit 114 has a frame 134, a user interface including a display screen 136, a keyboard 138 having keys, and buttons such as an emergency stop button.
  • the teach pendant unit 114 is communicatively coupled to the robot control stick 106 so as to interact with one another. For instance, activating a button 120, 122 on the robot control stick 106 can cause the orientation sensor 110 to generate a base orientation 128 signal, which can in turn be communicated to the teach pendant unit 114 for processing.
  • the coupling between the teach pendant unit 114 and the robot control stick 106 can be partially or wholly wireless in some embodiments whereas the coupling may be wired in some other embodiments.
  • the controller 112 has a processor and a memory having instructions that when executed by the processor perform some steps of determining the base orientation 128 and the robot plane 104, and controlling the robot portion 102 only within the two-dimensional path 140 within the robot plane 104.
  • the robot portion 102 can be controlled by moving the robot control stick 106. It is noted that controlling a robot portion 102 to move within a three-dimensional space when movement in only a two-dimensional plane is necessary can be challenging. Indeed, should the robot control stick 106 be moved slightly off-plane, it would cause the robot portion 102 to move in a way that may be detrimental to the teaching process. It can thus be advantageous to fix a plane within which the robot portion 102 is to be moved during a teaching sequence. This plane may then be modified on the go using the robot control stick 106.
  • the orientation sensor 110 generates, upon receiving a user input via one of the user input sensors 108, a base orientation signal indicative of a base orientation 128 of the elongated member 107 relative to a three-dimensional coordinate system 124 (x, y, z).
  • the controller 112 can thereby determine a base orientation 128 of the elongated member 107 at the given moment in time based on the orientation signal generated by the orientation sensor 110.
  • the orientation sensor 110 can further generate, once the base orientation 128 has been generated, an orientation variation signal indicative of a variation of an orientation of the elongated member 107 relative to the three-dimensional coordinate system 124 (x, y, z).
  • the orientation variation signal is convertible into a two-dimensional path confined to a robot plane 104 which is orthogonal to the base orientation 128. Then, the robot portion 102 can be conveniently controlled to move within the robot plane 104 along the two-dimensional path 140.
  • any movement of the robot control stick 106 once the base orientation 128 has been determined will result in movement of the robot portion 102 within the robot plane 104. For instance, as shown in Fig. 3, when the robot control stick 106 is pivoted 142 about an axis by a given angle 146, orientation variation signal will be generated by the orientation sensor 110.
  • the monitored orientation can thereby be converted into an upward movement 144 of the robot portion 102 along the robot plane 104 previously determined. It is noted that an upward movement of the robot control stick 106 can instead lead to a downward movement, a sideways movement, or any other type of displacement of the robot portion 102 without departing from the present disclosure.
  • a speed at which the robot portion 102 is to be moved can be proportional to the angle 146 that is formed between the orientation of the stick axis 130 of the robot control stick 106 and the base orientation 128 and/or the robot plane 104 when the robot portion 102 starts displacing. More specifically, the speed can be proportional to a projection of a current orientation vector of the robot control stick 106 on the robot plane 104. In these embodiments, the speed can be determined by multiplying that projection by a constant. The constant can be stored in a memory of the controller 112, for instance. In some other embodiments, the speed at which the robot portion 102 is to be moved varies linearly as a function of the angle 146 between the stick axis 130 and the robot plane 104.
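The proportional-speed variant described above can be sketched as follows: the commanded velocity is the projection of the current stick direction onto the robot plane, scaled by a constant gain. The gain value and function names here are assumptions for illustration only.

```python
import numpy as np

SPEED_GAIN = 0.4   # assumed constant stored by the controller, in (m/s) per unit of projection

def commanded_velocity(stick_direction, plane_normal, gain=SPEED_GAIN):
    """Return an in-plane velocity proportional to the projection of the current
    stick-axis direction onto the robot plane: a small tilt away from the base
    orientation commands a slow motion, a large tilt a faster one. Sketch only."""
    s = np.asarray(stick_direction, dtype=float)
    s = s / np.linalg.norm(s)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    in_plane = s - np.dot(s, n) * n      # projection of the stick direction onto the plane
    return gain * in_plane               # velocity vector lying within the robot plane

# Example: stick tilted roughly 17 degrees away from the plane normal (z).
print(commanded_velocity([0.3, 0.0, 1.0], [0.0, 0.0, 1.0]))
```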
  • the robot control stick 106 has a cylindrical-like body extending between the first end 116 and the second end 118.
  • the first end 116 has a truncated cone shape ending at a rounded tip 148, whereas the second end 118 terminates in a semi-rounded tip.
  • the rounded tip 148 of the first end 116 can be removably received at a corresponding portion of the teach pendant unit 114.
  • the rounded tip 148 may be positioned at a receiving portion of the teach pendant unit 114, which will prevent it from moving as the remainder of the robot control stick is pivoted or otherwise rotated to control the robot portion 102.
  • the elongated member 107 can be otherwise removably attachable to the teach pendant unit 114, for instance to avoid losing it when it is not in use by an operator.
  • side buttons 120 are provided on each lateral side of the cylindrical-like body, in addition to a series of top buttons 122 distributed on a top side of the cylindrical-like body.
  • These buttons 120, 122 can differ in shape and functionality. For instance, one of these buttons can be activated to set the base orientation 128. In some other embodiments, another one of these buttons can be activated to toggle between two different parts of a robot portion to control, e.g., to toggle between a robotized articulated arm and a robotized plier.
  • the robot control stick 106 may have a wired or wireless communication module 132 being communicatively coupled to the orientation sensor 110 and to the user input sensors 108. As such, the base orientation signal, the orientation variation signal and the user inputs can be communicated to the controller 112 for further processing.
  • the controller may determine which one of the first 116 and second ends 118 of the cylindrical-like body points upwards or downwards towards a ground reference. As such, the controller 112 can recognize, at any given time, that the robot control stick 106 is inverted, for instance, and adjust the control signal accordingly.
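As an illustration of how such an inversion check could work with an accelerometer-based orientation sensor (an assumption; the patent does not prescribe the sensor logic), the sign of the measured "up" direction along the stick axis tells which end points upward:

```python
import numpy as np

def stick_is_inverted(accel_xyz, axis_in_sensor=(0.0, 0.0, 1.0)):
    """Decide whether the stick is held upside down by checking the sign of the
    measured 'up' direction along the stick axis. Assumes a resting accelerometer
    whose reading points up in the sensor frame, and a stick axis aligned with
    the sensor z axis by default; which physical end maps to +z is an assumption."""
    up = np.asarray(accel_xyz, dtype=float)
    up = up / np.linalg.norm(up)
    along_axis = float(np.dot(up, np.asarray(axis_in_sensor, dtype=float)))
    return along_axis < 0.0   # negative: the end mapped to sensor +z points down

print(stick_is_inverted([0.1, 0.2, -9.7]))   # -> True, the stick is held upside down
```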
  • the controller 112 can be provided as a combination of hardware and software components.
  • the hardware components can be implemented in the form of a computing device 500, an example of which is described with reference to Fig. 5.
  • the software components of the controller 112 can be implemented in the form of a software application 600, an example of which is described with reference to Fig. 6.
  • the computing device 500 can have a processor 502, a memory 504, and an I/O interface 506. Instructions 508 for setting the robot plane and converting any movement of the robot control stick within the robot plane thereafter, and other method steps, can be stored on the memory 504 and accessible by the processor 502.
  • the processor 502 can be, for example, a general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, a programmable read-only memory (PROM), or any combination thereof.
  • the memory 504 can include a suitable combination of any type of computer-readable memory that is located either internally or externally such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CD-ROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like.
  • Each I/O interface 506 enables the computing device 500 to interconnect with one or more input devices, such as orientation sensor(s), user interface sensor(s) such as button(s), or with one or more output devices such as a user interface, a display screen, and the like.
  • Each I/O interface 506 enables the controller 112 to communicate with other components, to exchange data with other components, to access and connect to network resources, to serve applications, and perform other computing applications by connecting to a network (or multiple networks) capable of carrying data including the Internet, Ethernet, plain old telephone service (POTS) line, public switched telephone network (PSTN), integrated services digital network (ISDN), digital subscriber line (DSL), coaxial cable, fiber optics, satellite, mobile, wireless (e.g. Wi-Fi, WiMAX), SS7 signaling network, fixed line, local area network, wide area network, and others, including any combination of these.
  • the software application 600 is configured to receive at least a base orientation signal 606 and an orientation variation signal 608 from the orientation sensor(s) 110 of the robot control stick 106.
  • the software application 600 is further configured to receive user inputs 610 from the user input sensor(s) 108 of the robot control stick 106.
  • using a base orientation determination module 602, a base orientation signal, which is indicative of a base orientation 128 of the robot control stick 106 as one of the user input sensors 108 of the robot control stick 106 is activated, is received and used to determine the base orientation 612.
  • any monitored orientation variation signal 608 is converted, using a two-dimensional path determination module 604, into a two-dimensional path 140 confined within a robot plane 104 which is orthogonal to the base orientation 128.
  • the software application 600 is stored on the memory 504 and accessible by the processor 502 of the computing device 500.
  • Fig. 7 shows a flow chart of a method 700 of controlling a robot portion using a robot control stick 106 with an orientation sensor 110 and a user input sensor 108. Reference to the robot control stick 106 of Fig. 1 is made in the following paragraphs for ease of reading.
  • a base orientation signal is generated using an orientation sensor 110.
  • the generated orientation signal is indicative of a base orientation 128 of the orientation sensor 110 relative to the three-dimensional coordinate system 124.
  • This step can be performed upon receiving a corresponding user input via the user input sensors 108 for instance.
  • the base orientation signal can serve as a basis for determining a base orientation 128, i.e., coordinates of the base orientation 128 within the three-dimensional coordinate system 124 (x, y, z).
  • an orientation variation signal can be generated.
  • the orientation variation signal can be indicative of a variation of an orientation of the robot control stick 106 over time as it is manipulated by an operator.
  • the orientation variation can thereby be used to monitor the pivoting or rotation of the robot control stick 106 over time.
  • the orientation variation signal is converted into a two-dimensional path 140 confined to a robot plane 104 of the three-dimensional coordinate system 124 (x, y, z).
  • the robot plane 104 is set to be orthogonal to the base orientation 128. This relationship between the base orientation 128 and the robot plane 104 is perpendicular in the embodiment disclosed herein. However, at least some other embodiments can be contemplated which use any other suitable type of relationship between the base orientation 128 and the robot plane 104.
  • the robot plane 104 can be set to be obliquely disposed relative to the base orientation 128. In any case, it is contemplated that the orthogonal relationship between the base orientation 128 and the robot plane 104 is more intuitive, and can thus be more easily mastered by skilled operators.
  • a control signal is generated to control the robot portion 102 to move along the two-dimensional path 140 confined within the robot plane 104.
  • the control signal can be transmitted over a wired link, over a wireless link, or both, to control the robot portion 102.
  • the base orientation 128, the orientation variation signal and the control signal are stored on a memory of the teach pendant system 100. In this way, the robot portion 102 may be moved along the two-dimensional path 140 as desired, even without the need for a teach pendant system 100.
  • an operator who desires to move the robot portion 102 within a specific plane 104 can orient the robot control stick 106 in an orientation which is perpendicular to that specific plane and activate a corresponding one of the user input sensors 108.
  • any movement of the robot control stick 106 as monitored by the orientation sensor 110 can translate into two-dimensional movement through a two-dimensional path 140 within the previously set robot plane 104.
  • the method 700 can have a step of rotating the robot plane 104 to a given angle along a given direction upon determining that the orientation sensor 110 has been rotated in the given direction by more than a right angle, e.g., ninety degrees, up to the given angle, as depicted in Fig. 8.
  • Fig. 8 shows the previous robot plane 104a used for the displacement of the robot portion 102, which was orthogonal to the base orientation, and which has then been rotated by a given angle along the orientation of the robot control stick 106 to provide the rotated robot plane 104b.
  • This method step can allow the robot plane 104 to be modified on the go without necessarily having to reset it using the user input sensors 108, for instance.
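One possible reading of this on-the-go plane rotation, sketched below with illustrative names: once the stick has been tilted past a right angle from the base orientation, the plane normal is rotated in the same tipping direction by the excess angle. This is an assumed interpretation of Fig. 8, not a confirmed implementation.

```python
import numpy as np

def rotate_about_axis(v, axis, angle_rad):
    """Rodrigues' rotation of vector v about a unit axis by angle_rad."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(angle_rad)
            + np.cross(axis, v) * np.sin(angle_rad)
            + axis * np.dot(axis, v) * (1.0 - np.cos(angle_rad)))

def updated_plane_normal(base_normal, stick_direction):
    """If the stick has been tilted past a right angle from the base orientation,
    tilt the robot plane in the same direction by the excess angle; otherwise the
    plane is left unchanged. Illustrative sketch only."""
    n = np.asarray(base_normal, dtype=float); n = n / np.linalg.norm(n)
    s = np.asarray(stick_direction, dtype=float); s = s / np.linalg.norm(s)
    tilt = np.arccos(np.clip(np.dot(n, s), -1.0, 1.0))
    if tilt <= np.pi / 2:
        return n                      # within a right angle: plane unchanged
    axis = np.cross(n, s)             # direction in which the stick was tipped
    if np.linalg.norm(axis) < 1e-9:
        return -n                     # stick fully reversed: flip the plane normal
    return rotate_about_axis(n, axis, tilt - np.pi / 2)

# Example: stick tipped 120 degrees from the base orientation rotates the plane by 30 degrees.
print(updated_plane_normal([0, 0, 1],
                           [np.sin(np.radians(120)), 0, np.cos(np.radians(120))]))  # ~[0.5, 0, 0.866]
```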
  • the determination of the base orientation 128 and of the perpendicular robot plane 104 can include a step of snapping the base orientation 128, and corresponding robot plane 104, to one of a number of reference orientations and planes that are incrementally spaced-apart from one another.
  • the angle increment on the base orientation and/or on the robot plane can be at least 0.5 degrees, preferably at least 1 degree and most preferably at least 5 degrees.
  • Such a step of snapping can be useful when it is determined that, for a given application, the robot portion is to move only along the predetermined reference orientations and planes.
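A minimal sketch of the snapping step, assuming the reference orientations are realised by rounding the spherical angles of the base orientation to the nearest increment (the increment value and names are placeholders):

```python
import math

def snap_angle(angle_deg, increment_deg=5.0):
    """Round an angle to the nearest multiple of the increment (e.g. 0.5, 1 or 5 degrees)."""
    return round(angle_deg / increment_deg) * increment_deg

def snap_orientation(direction, increment_deg=5.0):
    """Snap a base-orientation direction to the nearest of a set of reference
    orientations spaced by a fixed angular increment, realised here by snapping
    its spherical angles. Illustrative sketch of the 'snapping' step only."""
    x, y, z = direction
    r = math.sqrt(x * x + y * y + z * z)
    tilt = math.degrees(math.acos(z / r))          # angle from the z axis
    azimuth = math.degrees(math.atan2(y, x))       # angle around the z axis
    tilt_s, az_s = snap_angle(tilt, increment_deg), snap_angle(azimuth, increment_deg)
    return (math.sin(math.radians(tilt_s)) * math.cos(math.radians(az_s)),
            math.sin(math.radians(tilt_s)) * math.sin(math.radians(az_s)),
            math.cos(math.radians(tilt_s)))

print(snap_orientation((0.01, 0.0, 0.9999)))   # nearly vertical -> snaps exactly to (0.0, 0.0, 1.0)
```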
  • the orientation as generated by the orientation sensor 110 can be registered in a first coordinate system which may differ from a second coordinate system of the robot portion 102.
  • the method can include a step of registering the base orientation and the monitored orientation in the second coordinate system of the robot portion, thereby registering the first and second coordinate systems to one another. In some embodiments, this step of registering may be performed using a calibration procedure.
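The registration step is only hinted at in the text; one common way to obtain the fixed rotation between the stick's coordinate system and the robot's coordinate system from a short calibration routine is a least-squares (Kabsch/SVD) alignment of a few paired directions, sketched below as an assumed approach:

```python
import numpy as np

def registration_from_pairs(directions_sensor, directions_robot):
    """Estimate the fixed rotation mapping directions measured in the stick's
    coordinate system onto the robot's coordinate system, from calibration pairs
    (e.g. the operator points the stick along known robot axes). This is the
    classic Kabsch/SVD alignment, used here as an assumed calibration approach."""
    a = np.asarray(directions_sensor, dtype=float)
    b = np.asarray(directions_robot, dtype=float)
    u, _, vt = np.linalg.svd(a.T @ b)
    d = np.sign(np.linalg.det(u @ vt))             # guard against a reflection
    return (u @ np.diag([1.0, 1.0, d]) @ vt).T     # rotation R with R @ sensor_dir ≈ robot_dir

# Example: the sensor frame is the robot frame rotated by 90 degrees about z.
sensor = [[0, 1, 0], [-1, 0, 0], [0, 0, 1]]
robot = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
R = registration_from_pairs(sensor, robot)
print(np.round(R @ np.array([0, 1, 0]), 3))        # sensor y axis maps onto robot x -> [1. 0. 0.]
```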

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The control stick can be an integral part of a teach pendant system for controlling a robot portion movable in a three-dimensional coordinate system. The control stick can have an elongated member, a user input sensor mounted to said elongated member and generating a user input upon activation, and an orientation sensor mounted to said elongated member and communicatively coupled to said user input sensor.

Description

ROBOT CONTROL STICK, COMPUTER-IMPLEMENTED METHOD FOR CONTROLLING A ROBOT PORTION, AND TEACH PENDANT
SYSTEM
FIELD
[0001] The improvements generally relate to robotized systems, and more particularly relate to teaching movements to robots using teach pendant units.
BACKGROUND
[0002] A teach pendant unit is a type of controller used for teaching movement(s) that a robot portion is to perform. For instance, the teach pendant unit can be used for teaching a robotized articulated arm manipulating a tool a series of successive steps to process a workpiece. In this situation, the teach pendant unit is generally manually operated by an operator who controls the movement(s) of the robotized articulated arm using keys and/or buttons, while watching the movement of the tool relative to the workpiece. Although existing teach pendant units have been found to be satisfactory to a certain degree, there remains room for improvement.
SUMMARY
[0003] It was found there was a need for providing a robot control stick, and associated teach pendant system, which can be used to ease manipulations of the robot portion during use. More specifically, the robot control stick can be used to constrain the movement of the robot portion within a two-dimensional plane that would otherwise be in a three-dimensional space. Accordingly, when detecting a given user input, the robot control stick can fix a base orientation corresponding to the current orientation of the robot control stick, which can in turn define a robot plane orthogonal to the base orientation. Once the robot plane has been defined, pivoting or otherwise rotating the robot control stick will cause the robot portion to move within a two-dimensional path which is confined to the robot plane, thereby facilitating the control of the robot portion in at least some circumstances.
[0004] In accordance with one aspect, there is provided a robot control system for controlling a robot portion of a robot, the robot control system comprising a robot control stick having an elongated member, a user input sensor mounted to said elongated member and activatable by a user to generate a user input, an orientation sensor mounted to said elongated member and configured for generating an orientation signal indicative of an orientation of said elongated member in a coordinate system of the robot; a communication module configured to generate a signal including at least a base orientation signal generated by the orientation sensor upon generation of the user input; a computer communicatively coupled to receive the signal, the computer having software stored in a non-volatile memory of the computer and configured to, when executed by a processor of the computer, perform the steps of: defining one of a robot axis and a robot plane based on the base orientation signal, and subsequently to said defining one of the robot axis and the robot plane, controlling the movement of the robot portion based on the signal, including confining the movement of the robot portion to the corresponding one of the robot axis and the robot plane.
[0005] In accordance with another aspect, there is provided a robot control system for controlling a robot portion of a robot, the robot control system comprising: a robot control stick having an elongated member, a user input sensor mounted to said elongated member and activatable by a user to generate a user input, an orientation sensor mounted to said elongated member and configured for measuring an orientation of said elongated member in a coordinate system of the robot; a communication module configured to generate a control stick signal including at least orientation values measured by the orientation sensor upon generation of the user input; a computer communicatively coupled to receive the control stick signal, the computer having software stored in a non-volatile memory of the computer and configured to, when executed by a processor of the computer, perform the steps of: defining one of a robot axis and a robot plane based on the measured orientation values, and subsequently to said defining one of the robot axis and the robot plane, controlling a movement of the robot portion based on the control stick signal, including confining the movement of the robot portion to the corresponding one of the robot axis and the robot plane.
[0006] In accordance with another aspect, there is provided a robot control stick for controlling a robot portion, the robot control stick comprising: an elongated member; a user input sensor mounted to said elongated member and generating a user input upon activation; and an orientation sensor mounted to said elongated member and communicatively coupled to said user input sensor, said orientation sensor generating, upon receiving said user input, a base orientation signal indicative of an orientation of said elongated member relative to a three-dimensional coordinate system, and further generating, once said base orientation has been generated, an orientation variation signal indicative of a variation of an orientation of said elongated member relative to said three-dimensional coordinate system, said orientation variation signal being convertible into a two-dimensional path confined to a robot plane orthogonal to said base orientation, said robot portion being controllable to move within said robot plane along said two-dimensional path.
[0007] In accordance with another aspect, there is provided a computer-implemented method for controlling a robot portion using a robot control stick having an orientation sensor and a user input sensor, said computer-implemented method comprising: upon receiving a user input at a given moment in time, and using an orientation sensor, generating a base orientation signal indicative of a base orientation of said orientation sensor relative to a three-dimensional coordinate system at said given moment in time, and determining a base orientation based on said base orientation signal; subsequent to said determining, generating an orientation variation signal indicative of a variation of said orientation of said orientation sensor; converting said orientation variation signal into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
[0008] In accordance with yet another aspect, there is provided a teach pendant system for a robot portion movable in a three-dimensional coordinate system, the teach pendant system comprising: an elongated member having an orientation sensor generating an orientation signal indicative of an orientation of said elongated member relative to said three-dimensional coordinate system; and a teach pendant unit communicatively coupled to said orientation sensor and to said robot portion, the teach pendant unit having a processor and a memory having instructions that when executed by the processor perform the steps of: upon receiving a user input at a given moment in time, determining a base orientation of said elongated member at said moment in time based on said orientation signal; subsequent to said determining, monitoring an orientation of said elongated member based on said orientation signal; converting said monitored orientation into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
[0009] Many further features and combinations thereof concerning the present improvements will appear to those skilled in the art following a reading of the instant disclosure.
DESCRIPTION OF THE FIGURES
[0010] In the figures,
[0011] Fig. 1 is an oblique view of an example teach pendant system for controlling a robot portion, shown with a robot control stick and a controller, in accordance with one or more embodiments;
[0012] Fig. 1A is a sectional view taken along section 1A-1A of Fig. 1;
[0013] Fig. 1B is a sectional view taken along section 1B-1B of Fig. 1;
[0014] Fig. 2 is an oblique view of the teach pendant system of Fig. 1, shown with a teach pendant unit and a robot control stick oriented along a base orientation, in accordance with one or more embodiments;
[0015] Fig. 3 is an oblique view of the teach pendant system of Fig. 1, shown with the robot control stick rotated by a given angle relative to the base orientation of Fig. 2, in accordance with one or more embodiments;
[0016] Fig. 4 is a side elevation view of the robot control stick of Fig. 1, showing a side button, in accordance with one or more embodiments;
[0017] Fig. 4A is a top plan view of the robot control stick of Fig. 1, showing top buttons, in accordance with one or more embodiments;
[0018] Fig. 5 is a schematic view of an example of a computing device of the controller of Fig. 1, in accordance with one or more embodiments;
[0019] Fig. 6 is a schematic view of an example of a software application of the controller of Fig. 1, in accordance with one or more embodiments;
[0020] Fig. 7 is a flow chart of an example method of controlling a robot portion using a robot control stick, in accordance with one or more embodiments; and
[0021] Fig. 8 is an oblique view of the teach pendant system of Fig. 1, shown with the robot control stick rotated by a given angle past the base orientation of Fig. 2, in accordance with one or more embodiments.
DETAILED DESCRIPTION
[0022] Fig. 1 shows a teach pendant system 100 used to control a robot portion 102 within a coordinate system. In this specific example, the robot portion 102 is provided in the form of a robotized articulated arm 103. As depicted, the robotized articulated arm 103 has an end equipped with a robotized plier 105. In alternate embodiments, robots can have different actuating structures and robot portions can take different forms without departing from the present disclosure. As can be understood, the teach pendant system 100 can be used to control any portion thereof depending on the embodiment.
[0023] Although the robotized articulated arm 103 in this example can be used to process a workpiece in an industrial application, in alternate embodiments, the robot portion may be adapted and used to perform steps in any other industry including, but not limited to, the aerospace industry, the consumer product industry, the disaster response industry, the drone industry, the education industry, the exoskeleton industry, the humanoid industry, the military industry, the security industry, the research industry, the self-driving car industry, the telepresence industry, the underwater industry, or any combination thereof. A teach pendant system such as described herein may advantageously be used in any such alternate embodiment.
[0024] Still referring to Fig. 1, the teach pendant system 100 has a robot control stick 106. The robot control stick 106 has an elongated member 107. For instance, the elongated member 107 may have a pen-like shape so as to be easily manipulable with a single hand. The pen-like shape of the elongated member 107 can be used to indicate a direction of movement of the robot portion 102 within a robot plane 104, in some embodiments. As perhaps best seen in Figs. 4 and 4A, the elongated member 107 extends between a first end 116 and a second end 118, with a cylindrical-like body extending linearly between the first 116 and second ends 118.
[0025] As shown, the robot control stick has one or more user input sensors 108 such as side button(s) 120, top button(s) 122 and the like. The button(s) can be click button(s), touch button(s) and/or force button(s) depending on the embodiment, to name some examples.
[0026] The robot control stick 106 has one or more orientation sensors 110 which are configured to generate an orientation signal indicative of an orientation of the elongated member 107 relative to a coordinate system 124 (x, y, z). Depending on the embodiment, the robot control stick 106 can be configured to generate the orientation signal continuously, at a given frequency, or on demand (e.g. based on user input), for instance. Examples of the orientation sensor can include, but are not limited to, accelerometer(s), gyroscope(s), magnetometer(s), GPS sensor(s), camera(s), inclinometer(s) and the like. For instance, in some embodiments where the coordinate system 124 is a three-dimensional coordinate system, such as a three-dimensional Cartesian coordinate system (x, y, z), the orientation sensor 110 can be provided in the form of an orientation sensor assembly having a three-axis accelerometer, a three-axis gyroscope and/or a three-axis magnetometer generating measurement signals indicative of a corresponding measurand in all three dimensions. A three-axis accelerometer may be the simplest implementation in some embodiments.
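As a rough illustration of why a three-axis accelerometer can be the simplest implementation (an assumption-level sketch, not the patent's firmware): with the stick at rest and its axis taken along the sensor's z axis, a single accelerometer sample gives the tilt of the stick axis from the vertical, although yaw about the vertical remains unobservable without a gyroscope or magnetometer.

```python
import numpy as np

def stick_tilt_from_accelerometer(accel_xyz):
    """Estimate how far the stick axis is tilted from vertical using one static
    accelerometer sample.

    Illustrative assumptions: the stick is at rest (the accelerometer measures
    only the reaction to gravity, i.e. a vector pointing 'up' in the sensor
    frame), and the stick axis coincides with the sensor's z axis."""
    up = np.asarray(accel_xyz, dtype=float)
    up = up / np.linalg.norm(up)                  # unit "up" direction in the sensor frame
    cos_tilt = float(np.clip(up[2], -1.0, 1.0))   # dot(up, sensor z axis)
    return np.degrees(np.arccos(cos_tilt)), up

# Example: stick held almost upright.
tilt_deg, up_dir = stick_tilt_from_accelerometer([0.3, 0.1, 9.75])
print(f"stick tilted {tilt_deg:.1f} degrees from vertical")
```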
[0027] In some embodiments, a given orientation signal, which will be referred to herein as a base orientation signal, can be generated by the orientation sensor 110 at a moment in time when a user triggers a given user input. The given user input can be the pressing and holding of a given button in one example, such as the side button 120 or one of the top buttons 122 of the robot control stick 106 shown in Figs. 4 and 4A for instance. If the robot control stick 106 is used in a teach pendant environment, an indication that the given user input has been triggered can be communicated, preferably wirelessly, to a computer such as a computer 126 integrated within the teach pendant unit 114 for instance, together with the base orientation signal. Depending on the embodiment, and potentially on the choice of user input, the software running on the computer 126 can trigger different modes of operation at this stage.
[0028] In a first example, if a first user input is received, the software running on the computer 126 can define a robot plane 104 relative to the base orientation signal. The base orientation signal can be indicative of a base orientation 128, which corresponds to the orientation of a stick axis 130 extending along the length of the robot control stick 106 for instance. The robot plane 104 can be defined virtually in the coordinate system 124 of the robot, normal to the orientation of the stick axis 130 at the time when the first user input is received. The robot plane 104 can then serve as a reference for controlling the movement of the robot portion 102. For instance, the software can lock the movement of the robot portion 102 within the robot plane 104 for as long as the first user input is maintained, e.g. for as long as the user continues to press the button. Subsequent movement of the robot control stick 106, such as displacement relative to the robot plane 104 or subsequent inclination of the stick axis 130 relative to the orientation of the stick axis 130 at the time when the robot plane 104 is defined, can then be detected, for instance based on variations of the orientation signal over time, and be used to control the movement of the robot portion 102 within the robot plane 104.
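The first example can be pictured with a short, hedged sketch (illustrative names, not the actual software of the computer 126): the robot plane is represented by its unit normal, taken from the base orientation, and any commanded displacement is confined to the plane by removing its out-of-plane component.

```python
import numpy as np

def define_robot_plane(base_orientation):
    """Return the unit normal of a robot plane defined orthogonal to the base
    orientation (the stick-axis direction captured when the user input is
    received). Illustrative sketch only."""
    n = np.asarray(base_orientation, dtype=float)
    return n / np.linalg.norm(n)

def confine_to_plane(displacement, plane_normal):
    """Project a commanded 3-D displacement onto the robot plane so that the
    robot portion only ever moves within that plane."""
    d = np.asarray(displacement, dtype=float)
    return d - np.dot(d, plane_normal) * plane_normal   # remove the out-of-plane component

# Example: a base orientation along z locks motion into the x-y plane.
normal = define_robot_plane([0.0, 0.0, 1.0])
print(confine_to_plane([0.2, 0.1, 0.5], normal))        # -> [0.2, 0.1, 0.0]
```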
[0029] Indeed, in some embodiments, the orientation sensor 110 can continue to generate an orientation signal subsequent to the generation of the base orientation signal, and this subsequent signal can be compared to the base orientation 128 and interpreted as an orientation variation signal (and/or data) indicative of a variation of the orientation of the elongated member 107. The orientation variation signal can be indicative of a plurality of orientation values at a corresponding plurality of moments in time. For instance, the orientation variation signal can have a given frequency, i.e., a given number of orientation values within a given time duration unit. As such, the orientation variation signal monitors the inclination of the elongated member 107 as it is manipulated by an operator during a teaching sequence, for instance. Similarly, moving the stick axis 130 in a plane parallel to the robot plane 104 while keeping the stick axis 130 perpendicular to the robot plane 104 can be detected by the orientation sensor 110 and used as a variation signal to trigger displacement of the robot portion 102 within the robot plane 104. In some embodiments, it may be relevant to control both movement and inclination of the robot portion 102 relative to the robot plane 104, which can be achieved intuitively by monitoring both movement and inclination of the stick axis 130 in a plane parallel to the robot plane 104. It is noted that in some embodiments the user input sensor 108 has to be activated for a time duration which is greater than a corresponding time duration threshold (e.g., a few ms) in order to determine the base orientation 128. In these embodiments, accidental and momentary activation of the user input sensor 108 may not redefine the base orientation 128. It is also noted that a base orientation, such as the one shown in Fig. 1 for instance, may be discarded and re-determined upon a user activating the user input sensor 108 for a period of time that is greater than the corresponding time duration threshold.
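The activation-duration check described above can be pictured with the following hedged sketch; the threshold value, the class name and the simple latch behaviour are assumptions for illustration, not values or logic taken from the disclosure.

```python
# Hedged sketch: the base orientation 128 is only (re)captured when the user input
# stays active longer than a small threshold, so accidental taps never latch it.
import time

HOLD_THRESHOLD_S = 0.05  # "a few ms" in the text; 50 ms used here purely for illustration

class BaseOrientationLatch:
    def __init__(self):
        self._pressed_at = None
        self._latched = False
        self.base_orientation = None   # last valid base orientation 128

    def update(self, button_pressed, current_orientation):
        now = time.monotonic()
        if button_pressed:
            if self._pressed_at is None:
                self._pressed_at, self._latched = now, False   # button just went down
            elif not self._latched and now - self._pressed_at >= HOLD_THRESHOLD_S:
                self.base_orientation = current_orientation    # held long enough: (re)capture
                self._latched = True
        else:
            self._pressed_at = None   # a momentary, accidental press never reaches the threshold
        return self.base_orientation
```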
[0030] In a second example, the software running on the computer 126 may already have one or more predefined planes in a coordinate system 124 of the robot, such as three orthogonal planes corresponding to a three-dimensional coordinate system for instance. In this second example, it can be considered more convenient for the software to use the signal from the robot control stick 106 to select one of the predefined planes. For instance, if a corresponding user input is received, the software can perform a match between a plane which is normal to the stick axis 130 at the time the user input is received, which can be determined based on a corresponding signal from the orientation sensor 110 for instance, and the one of the predefined planes which is most closely aligned with that normal plane, and define that matched predefined plane as the robot plane 104. Thus, instead of defining the robot plane 104 directly from the normal plane based on the orientation of the stick axis 130 as measured by the orientation sensor 110, the robot plane 104 is defined indirectly, based on a closest match between the normal plane and one of the predefined virtual planes in the coordinate system 124 of the robot portion 102. Once the robot plane 104 has been defined in this second example, subsequent movement of the robot control stick 106, such as displacement and/or inclination, can be used to control the movement of the robot portion 102 relative to the robot plane 104. The technique can thus be used to confine the subsequent movement of the robot portion 102 to a robot plane 104 in a manner similar to how this latter operation was described in the first example, with the distinction that the robot plane 104, in this case, is not defined solely by the orientation of the stick axis 130 of the robot control stick 106, but rather on a best match basis between the orientation of the stick axis 130 and the predefined virtual planes.

[0031] In some embodiments, both the first example and the second example can be provided for in the software, and the choice of defining the robot plane 104 according to the first example or to the second example can be based on the exact nature of the user input which is received, such as depending on which button has been pressed on the robot control stick 106, for instance. In other embodiments, only the process of the first example or of the second example can be provided for by the software.
[0032] In yet a third example, rather than being used to define a robot plane 104, the combination of a corresponding user input and of a corresponding base orientation signal can be used to define a robot axis. For instance, in the third example, the software running on the computer 126 can define a robot axis as corresponding to the base orientation 128, in other words on the basis of the orientation of the stick axis 130 at the time when the corresponding input is triggered. Once a robot axis has been defined, the movement of the robot portion 102 can be confined to the robot axis for instance. A subsequent user input, such as movement of the robot control stick 106 in free space or variation of inclination thereof as detected by the orientation sensor 110 for instance, can be used to move the robot portion 102 and/or change its orientation relative to the robot axis. In a specific embodiment, the subsequent user input can be the maintaining of a button in a pressed state after the initial user input, which can form the basis of a command to move the robot portion 102 at a predetermined speed along the robot axis for as long as the button remains in the pressed state, for instance.
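A sketch of this third example follows, under stated assumptions: the robot axis is the stick axis captured at the moment of the user input, and holding the button jogs the tool along that axis at a fixed speed. The function name, the speed value and the control-cycle period are illustrative only.

```python
# Hedged sketch of the third example: jog along the robot axis at a predetermined
# speed while the button remains pressed.
import numpy as np

JOG_SPEED = 0.05  # m/s, an assumed predetermined speed

def jog_along_axis(position, robot_axis, dt, button_held, speed=JOG_SPEED):
    """Advance the tool position along the robot axis while the button is held."""
    axis = np.asarray(robot_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    if not button_held:
        return position
    return position + speed * dt * axis

p = np.array([0.4, 0.0, 0.3])
for _ in range(10):                       # ten 10 ms control cycles with the button held
    p = jog_along_axis(p, [0.0, 0.0, 1.0], 0.01, True)
print(p)                                  # tool has climbed 5 mm along the robot axis
```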
[0033] In yet a fourth example, the software running on the computer 126 may already have one or more predefined axes in a coordinate system of the robot, such as three orthogonal axes (x, y, z) corresponding to a three-dimensional Cartesian coordinate system for instance. In this fourth example, it can be considered more convenient for the software to use the signal from the robot control stick 106 to select one of the predefined axes. For instance, if a corresponding user input is received, the software can perform a match between the orientation of the stick axis 130 at the time the user input is received, which can be determined based on a corresponding signal from the orientation sensor 110 for instance, and the one of the predefined axes which is most closely aligned with the stick axis 130 orientation, and define that matched predefined axis as the robot axis. Thus, instead of defining the robot axis directly from the stick axis 130 based on the signal from the orientation sensor 110, the robot axis can be defined indirectly, based on a closest match between the stick axis 130 and one of the predefined virtual axes in the coordinate system of the robot. Once the robot axis has been defined in this fourth example, subsequent movement of the robot control stick 106, such as displacement and/or inclination, can be used to control the movement of the robot portion 102 relative to the robot axis. The technique can thus be used to confine the subsequent movement of the robot portion 102 to the robot axis in a manner similar to how this latter operation was described in the third example, with the distinction that the robot axis, in this case, is not defined solely by the orientation of the stick axis 130 of the robot control stick 106, but rather on a best match basis between the orientation of the robot control stick 106 and the predefined virtual axes.
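The "closest match" step used in the second and fourth examples can be illustrated with the following minimal sketch: the measured stick direction is compared with predefined directions (axis directions, or plane normals in the plane case) and the best-aligned candidate is selected. The candidate set and names below are assumptions for illustration.

```python
# Hedged sketch of the closest-match selection among predefined axes or plane normals.
import numpy as np

PREDEFINED_DIRECTIONS = {
    "x": np.array([1.0, 0.0, 0.0]),
    "y": np.array([0.0, 1.0, 0.0]),
    "z": np.array([0.0, 0.0, 1.0]),
}

def closest_predefined(stick_direction, candidates=PREDEFINED_DIRECTIONS):
    """Return the candidate most closely aligned with the stick axis 130."""
    d = np.asarray(stick_direction, dtype=float)
    d /= np.linalg.norm(d)
    # |dot| is used so a stick pointing "down" still matches the vertical axis / horizontal plane
    return max(candidates, key=lambda k: abs(np.dot(d, candidates[k])))

print(closest_predefined([0.1, -0.05, 0.92]))  # -> "z": snaps to the z axis (or x-y plane)
```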
[0034] In some embodiments, all of the above examples can be provided for in the software, and the choice of which technique is retained for defining the robot plane 104 or robot axis to which movement is thereafter confined is left to the user, and can be based on user input, such as depending on which button has been pressed on the robot control stick 106, for instance. In other embodiments, only the process of the third example or of the fourth example can be provided for by the software.
[0035] In an embodiment implemented in a teach pendant context, the teach pendant system 100 also has a controller 112 which can be a computer and provide the software functionalities described above. The controller 112 can be part of the robot control stick 106, part of a teach pendant unit 114, or part of both, depending on the embodiment. In some other embodiments, the controller 112 can be external to the robot control stick 106 or to the teach pendant unit 114. In these embodiments, the controller 112 is in communicative coupling with the robot control stick 106, the teach pendant unit 114 and/or the robot portion 102. As such, it is noted that the controller 112 can be part of an electronic device such as a mobile phone, an electronic tablet and the like to control the robot portion 102 in a remote fashion. For instance, in the illustrated embodiment, the controller 112 is part of the teach pendant unit 114 and, more specifically, of the computer 126 of the teach pendant unit 114. It is understood that the controller 112 can be a computer within the teach pendant unit 114 while being separate from the computer 126 previously disclosed. It is further understood that the controller 112 can be separate from the teach pendant unit 114 without departing from the present disclosure. In these embodiments, the robot control stick 106 can have a communication module 132 (Fig. 4) which can communicate the base orientation signal and the orientation variation signal generated by the orientation sensor 110 to the teach pendant unit 114 for processing by the controller 112 enclosed therein. The robot control stick 106 can have a microcontroller integrated thereto, in some embodiments.
[0036] In this specific example, the teach pendant unit 114 has a frame 134, a user interface including a display screen 136, a keyboard 138 having keys, and buttons such as an emergency stop button. In some embodiments, the teach pendant unit 114 is communicatively coupled to the robot control stick 106 so as to interact with one another. For instance, activating a button 120, 122 on the robot control stick 106 can cause the orientation sensor 110 to generate a base orientation 128 signal, which can in turn be communicated to the teach pendant unit 114 for processing. The coupling between the teach pendant unit 114 and the robot control stick 106 can be partially or wholly wireless in some embodiments whereas the coupling may be wired in some other embodiments.
[0037] As further described below, the controller 112 has a processor and a memory having instructions that when executed by the processor perform the steps of determining the base orientation 128 and the robot plane 104, and controlling the robot portion 102 to move only along the two-dimensional path 140 within the robot plane 104.
[0038] The robot portion 102 can be controlled by moving the robot control stick 106. It is noted that controlling a robot portion 102 to move within a three-dimensional space when movement in only a two-dimensional plane is necessary can be challenging. Indeed, should the robot control stick 106 be moved slightly off-plane, it would cause the robot portion 102 to move in a way that may be detrimental to the teaching process. It can thus be advantageous to fix a plane within which the robot portion 102 is to be moved during a teaching sequence. This plane may then be modified on the go using the robot control stick 106.
[0039] To do so, the orientation sensor 110 generates, upon receiving a user input via one of the user input sensors 108, a base orientation signal indicative of a base orientation 128 of the elongated member 107 relative to a three-dimensional coordinate system 124 (x, y, z). The controller 112 can thereby determine a base orientation 128 of the elongated member 107 at the given moment in time based on the orientation signal generated by the orientation sensor 110.
[0040] The orientation sensor 110 can further generate, once the base orientation 128 has been generated, an orientation variation signal indicative of a variation of an orientation of the elongated member 107 relative to the three-dimensional coordinate system 124 (x, y, z). As described below, the orientation variation signal is convertible into a two-dimensional path confined to a robot plane 104 which is orthogonal to the base orientation 128. Then, the robot portion 102 can be conveniently controlled to move within the robot plane 104 along the two-dimensional path 140.
[0041] Once the base orientation 128 and orthogonal robot plane 104 have been determined, it is expected that as the elongated member 107 is pivoted or otherwise rotated within the three-dimensional coordinate system 124 (x, y, z), any movement of the elongated member 107 can translate into movement of the robot portion 102 that is confined within the robot plane 104. As such, referring now to Figs. 1A and 1B, if the robot control stick 106 is pivoted upwards with respect to the figure orientation, the robot portion 102 is correspondingly moved in the upwards direction within the robot plane 104; if the robot control stick 106 is pivoted downwards with respect to the figure orientation, the robot portion 102 is correspondingly moved in the downwards direction within the robot plane 104, and so forth. For instance, as shown in Fig. 2, an operator manipulates the robot control stick 106 so as to orient the robot control stick 106 along a desired orientation. When it is satisfactorily oriented, one of the buttons 120, 122 (Figs. 4 and 4A) of the robot control stick 106 is activated, which causes an orientation signal indicative of a base orientation 128 of the elongated member 107 relative to a three-dimensional coordinate system 124 (x, y, z) to be generated by the orientation sensor 110, which can then be used to determine an orthogonal robot plane 104. As discussed above, any movement of the robot control stick 106 once the base orientation 128 has been determined will result in movement of the robot portion 102 within the robot plane 104. For instance, as shown in Fig. 3, when the robot control stick 106 is pivoted 142 about an axis by a given angle 146, an orientation variation signal will be generated by the orientation sensor 110. The monitored orientation can thereby be converted into an upward movement 144 of the robot portion 102 along the robot plane 104 previously determined. It is noted that an upward movement of the robot control stick 106 can alternatively be mapped to a downward movement, a sideways movement, or any other type of displacement of the robot portion 102 without departing from the present disclosure.
[0042] It is noted that a speed at which the robot portion 102 is to be moved can be proportional to the angle 146 that is formed between the orientation of the stick axis 130 of the robot control stick 106 and the base orientation 128 and/or the robot plane 104 when the robot portion 102 starts displacing. More specifically, the speed can be proportional to a projection of a current orientation vector of the robot control stick 106 on the robot plane 104. In these embodiments, the speed can be determined by multiplying that projection by a constant. The constant can be stored on a memory of the controller 112, for instance. In some other embodiments, the speed at which the robot portion 102 is to be moved varies linearly as a function of the angle 146 between the stick axis 130 and the robot plane 104.
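A minimal sketch combining paragraphs [0041] and [0042] follows, with assumed names and an assumed gain value: the current stick axis is projected onto the robot plane; the projected direction sets the direction of travel within the plane, and its magnitude, scaled by a stored constant, sets the speed.

```python
# Hedged sketch: in-plane velocity of the robot portion 102 for the current stick tilt.
import numpy as np

SPEED_GAIN = 0.2  # assumed constant stored in the controller memory

def in_plane_velocity(stick_axis, plane_normal, gain=SPEED_GAIN):
    """Velocity within the robot plane 104 derived from the current stick orientation."""
    a = np.asarray(stick_axis, dtype=float)
    a /= np.linalg.norm(a)
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    projection = a - np.dot(a, n) * n      # component of the stick axis lying in the plane
    return gain * projection               # larger tilt angle 146 -> larger projection -> faster

# Stick tilted slightly away from the plane normal: slow motion along +x within the plane.
print(in_plane_velocity([0.17, 0.0, 0.985], [0.0, 0.0, 1.0]))
```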
[0043] As shown in Figs. 4A and 4B, the robot control stick 106 has a cylindrical-like body extending between the first end 116 and the second end 118. In this embodiment, the first end 116 has a truncated cone shape ending at a rounded tip 148, whereas the second end 118 terminates in a semi-rounded tip. In some embodiments, the rounded tip 148 of the first end 116 can be removably received at a corresponding portion of the teach pendant unit 114. For instance, the rounded tip 148 may be positioned at a receiving portion of the teach pendant unit 114, which will prevent it from moving as the remainder of the robot control stick 106 is pivoted or otherwise rotated to control the robot portion 102. In some embodiments, the elongated member 107 can be otherwise removably attachable to the teach pendant unit 114, for instance to avoid losing it when it is not in use by an operator. As shown in this specific embodiment, side buttons 120 are provided on each lateral side of the cylindrical-like body, in addition to a series of top buttons 122 distributed on a top side of the cylindrical-like body. These buttons 120, 122 can differ in shape and functionality. For instance, one of these buttons can be activated to set the base orientation 128. In some other embodiments, another one of these buttons can be activated to toggle between two different parts of a robot portion to control, e.g., to toggle between a robotized articulated arm and a robotized plier. It is noted that in embodiments where the controller 112 is found elsewhere than within the robot control stick 106, the robot control stick 106 may have a wired or wireless communication module 132 communicatively coupled to the orientation sensor 110 and to the user input sensors 108. As such, the base orientation signal, the orientation variation signal and the user inputs can be communicated to the controller 112 for further processing.
[0044] Using accelerometers within the robot control stick 106, the controller 112 may determine which one of the first end 116 and the second end 118 of the cylindrical-like body points upwards or downwards towards a ground reference. As such, the controller 112 can recognize, at any given time, that the robot control stick 106 is inverted, for instance, and adjust the control signal accordingly.
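This inversion check can be pictured with the following hedged sketch: with the sensor axes fixed to the elongated member, the sign of the gravity component measured along the stick axis indicates which end points towards the ground. The sign convention used here is an assumption.

```python
# Hedged sketch of the inversion check based on the accelerometer reading along the stick axis.
def stick_is_inverted(accel_along_stick_axis):
    """True when the first end 116 points upwards instead of the second end 118."""
    return accel_along_stick_axis < 0.0   # gravity measured in the opposite sense

print(stick_is_inverted(9.6))   # normal grip
print(stick_is_inverted(-9.6))  # stick held upside down -> control signal remapped accordingly
```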
[0045] The controller 112 can be provided as a combination of hardware and software components. The hardware components can be implemented in the form of a computing device 500, an example of which is described with reference to Fig. 5. Moreover, the software components of the controller 112 can be implemented in the form of a software application 600, an example of which is described with reference to Fig. 6.
[0046] Referring to Fig. 5, the computing device 500 can have a processor 502, a memory 504, and an I/O interface 506. Instructions 508 for setting the robot plane and for converting any subsequent movement of the robot control stick into movement within the robot plane, along with other method steps, can be stored on the memory 504 and accessible by the processor 502.
[0047] The processor 502 can be, for example, a general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an integrated circuit, a field programmable gate array (FPGA), a reconfigurable processor, a programmable read-only memory (PROM), or any combination thereof.
[0048] The memory 504 can include a suitable combination of any type of computer-readable memory that is located either internally or externally such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CD-ROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like.

[0049] Each I/O interface 506 enables the computing device 500 to interconnect with one or more input devices, such as orientation sensor(s), user interface sensor(s) such as button(s), or with one or more output devices such as a user interface, a display screen, a remote network or a memory system.
[0050] Each I/O interface 506 enables the controller 112 to communicate with other components, to exchange data with other components, to access and connect to network resources, to serve applications, and perform other computing applications by connecting to a network (or multiple networks) capable of carrying data including the Internet, Ethernet, plain old telephone service (POTS) line, public switched telephone network (PSTN), integrated services digital network (ISDN), digital subscriber line (DSL), coaxial cable, fiber optics, satellite, mobile, wireless (e.g. Wi-Fi, WiMAX), SS7 signaling network, fixed line, local area network, wide area network, and others, including any combination of these.
[0051] Referring now to Fig. 6, the software application 600 is configured to receive at least a base orientation signal 606 and an orientation variation signal 608 from the orientation sensor(s) 110 of the robot control stick 106. The software application 600 is further configured to receive user inputs 610 from the user input sensor(s) 108 of the robot control stick 106. Using a base orientation determination module 602, the base orientation signal 606, which is indicative of a base orientation 128 of the robot control stick 106 as one of the user input sensors 108 of the robot control stick 106 is activated, is received and used to determine the base orientation 612. Then, any monitored orientation variation signal 608 is converted, using a two-dimensional path determination module 604, into a two-dimensional path 140 confined within a robot plane 104 which is orthogonal to the base orientation 128. The software application 600 then leads to the emission of a control signal 614 to displace the robot portion 102. In some embodiments, the software application 600 is stored on the memory 504 and accessible by the processor 502 of the computing device 500.
[0052] The computing device 500 and the software application 600 described above are meant to be examples only. Other suitable embodiments of the controller 112 can also be provided, as will be apparent to the skilled reader.

[0053] Fig. 7 shows a flow chart of a method 700 of controlling a robot portion using a robot control stick 106 with an orientation sensor 110 and a user input sensor 108. Reference to the robot control stick 106 of Fig. 1 is made in the following paragraphs for ease of reading.
[0054] At step 702, a base orientation signal is generated using an orientation sensor 110. The generated orientation signal is indicative of a base orientation 128 of the orientation sensor 110 relative to the three-dimensional coordinate system 124. This step can be performed upon receiving a corresponding user input via the user input sensor 108 for instance. The base orientation signal can serve as a basis for determining a base orientation 128, i.e., coordinates of the base orientation 128 within the three-dimensional coordinate system 124 (x, y, z).
[0055] At step 704, subsequent to the step 702, an orientation variation signal can be generated. The orientation variation signal can be indicative of a variation of an orientation of the robot control stick 106 over time as it is manipulated by an operator. The orientation variation can thereby be used to monitor the pivoting or rotation of the robot control stick 106 over time.
[0056] At step 706, the orientation variation signal is converted into a two-dimensional path 140 confined to a robot plane 104 of the three-dimensional coordinate system 124 (x, y, z). As discussed above, the robot plane 104 is set to be orthogonal to the base orientation 128. This relationship between the base orientation 128 and the robot plane 104 is perpendicular in the embodiment disclosed herein. However, at least some other embodiments can be contemplated which use any other suitable type of relationship between the base orientation 128 and the robot plane 104. For instance, the robot plane 104 can be set to be obliquely disposed relative to the base orientation 128. In any case, it is contemplated that the orthogonal relationship between the base orientation 128 and the robot plane 104 is more intuitive, and can thus be more easily mastered by skilled operators.
[0057] At step 708, a control signal is generated to control the robot portion 102 to move along the two-dimensional path 140 confined within the robot plane 104. The control signal can be transmitted over a wired link, over a wireless link, or both, to control the robot portion 102. In some embodiments, the base orientation 128, the orientation variation signal and the control signal are stored on a memory of the teach pendant system 100. In this way, the robot portion 102 may be moved along the two-dimensional path 140 as desired, even without the need for a teach pendant system 100.
[0058] As such, an operator who desires to move the robot portion 102 within a specific plane 104 can orient the robot control stick 106 in an orientation which is perpendicular to that specific plane and activate a corresponding one of the user input sensors 108. Once the base orientation 128 and the robot plane 104 have been set, any movement of the robot control stick 106 as monitored by the orientation sensor 110 can translate into two-dimensional movement through a two-dimensional path 140 within the previously set robot plane 104.
[0059] In some embodiments, the method 700 can have a step of rotating the robot plane 104 to a given angle along a given direction upon determining that the orientation sensor 110 has been rotated in the given direction by more than a right angle, e.g., ninety degrees, up to the given angle, as depicted in Fig. 8. Fig. 8 shows the previous robot plane 104a used for the displacement of the robot portion 102, which was orthogonal to the base orientation, and which has then been rotated by a given angle along the orientation of the robot control stick 106 to provide the rotated robot plane 104b. This method step can allow the robot plane 104 to be modified on the go without necessarily having to reset it using the user input sensors 108, for instance.
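By way of illustration, the plane rotation of Fig. 8 can be sketched as a rotation of the robot plane normal about the direction of rotation of the stick, here using Rodrigues' rotation formula; the function name and the example angle are assumptions, not values from the disclosure.

```python
# Hedged sketch: rotate the robot plane 104a into a rotated plane 104b by rotating
# its normal by a given angle about a given direction (Rodrigues' formula).
import numpy as np

def rotate_plane_normal(normal, rotation_axis, angle_rad):
    """Rotate the robot plane normal by angle_rad about rotation_axis."""
    n = np.asarray(normal, dtype=float)
    k = np.asarray(rotation_axis, dtype=float)
    k /= np.linalg.norm(k)
    return (n * np.cos(angle_rad)
            + np.cross(k, n) * np.sin(angle_rad)
            + k * np.dot(k, n) * (1.0 - np.cos(angle_rad)))

# Rotate a horizontal plane (normal along +z) by 30 degrees about the x axis.
print(rotate_plane_normal([0, 0, 1], [1, 0, 0], np.deg2rad(30.0)))
```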
[0060] In some embodiments, the determination of the base orientation 128 and of the perpendicular robot plane 104 can include a step of snapping the base orientation 128, and the corresponding robot plane 104, to one of a number of reference orientations and planes that are incrementally spaced apart from one another. For instance, the angle increment on the base orientation and/or on the robot plane can be at least 0.5 degrees, preferably at least 1 degree and most preferably at least 5 degrees. Such a step of snapping can be useful when it is determined that, for a given application, the robot portion is to move only along the predetermined reference orientations and planes.
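A minimal sketch of this snapping step follows, assuming the base orientation is expressed as azimuth and elevation angles that are rounded to the nearest increment; the 5-degree increment and the angle parametrisation are illustrative choices only.

```python
# Hedged sketch: snap a measured base orientation to the nearest incrementally spaced
# reference orientation.
import numpy as np

ANGLE_INCREMENT_DEG = 5.0

def snap_orientation(direction, increment_deg=ANGLE_INCREMENT_DEG):
    """Snap a unit direction to the nearest reference orientation on the angular grid."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    azimuth = np.degrees(np.arctan2(d[1], d[0]))
    elevation = np.degrees(np.arcsin(np.clip(d[2], -1.0, 1.0)))
    az = np.radians(round(azimuth / increment_deg) * increment_deg)
    el = np.radians(round(elevation / increment_deg) * increment_deg)
    return np.array([np.cos(el) * np.cos(az), np.cos(el) * np.sin(az), np.sin(el)])

print(snap_orientation([0.03, 0.01, 0.999]))  # snaps to the vertical reference orientation
```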
[0061] In some embodiments, the orientation as generated by the orientation sensor 110 can be registered in a first coordinate system which may differ from a second coordinate system of the robot portion 102. As such, in these embodiments, the method can include a step of registering the base orientation and the monitored orientation in the second coordinate system of the robot portion, thereby registering the first and second coordinate systems to one another. In some embodiments, this step of registering may be performed using a calibration procedure. [0062] As can be understood, the examples described above and illustrated are intended to be exemplary only. The scope is indicated by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A robot control system for controlling a robot portion of a robot, the robot control system comprising: a robot control stick having: an elongated member, a user input sensor mounted to said elongated member and activatable by a user to generate a user input, an orientation sensor mounted to said elongated member and configured for measuring an orientation of said elongated member in a coordinate system of the robot; a communication module configured to generate a control stick signal including at least orientation values measured by the orientation sensor upon generation of the user input; a computer communicatively coupled to receive the control stick signal, the computer having software stored in a non-volatile memory of the computer and configured to, when executed by a processor of the computer, perform the steps of: defining one of a robot axis and a robot plane based on the measured orientation values, subsequently to said defining one of the robot axis and the robot plane, controlling a movement of the robot portion based on the control stick signal, including confining the movement of the robot portion to the corresponding one of the robot axis and the robot plane.
2. The robot control system of claim 1, wherein the control stick signal generated by the communication module and received by the computer is one of a wired signal and wireless signal.

3. The robot control system of claim 1, wherein the control stick signal includes an orientation variation signal indicative of a movement of the elongated member in the coordinate system of the robot.
4. The robot control system of claim 3, wherein said controlling the movement of the robot portion is based on the orientation variation signal.
5. The robot control system of any one of claims 3 to 4, wherein said movement of the elongated member is at least a change of the orientation of said elongated member.
6. The robot control system of claim 1, wherein the computer defines the robot axis and can further define a movement direction along the robot axis based on the orientation of the robot control stick.

7. The robot control system of claim 1, wherein said movement of the robot portion is controlled by the user input.
8. The robot control system of claim 7, wherein the movement of the robot portion is at a predetermined speed.
9. The robot control system of claim 1, wherein the computer defines the robot axis as corresponding to the orientation of the elongated member specified by the measured orientation values.

10. The robot control system of claim 1, wherein the computer defines the robot axis based on a closest match basis between a predefined orientation in the non-volatile memory of the computer and the orientation of the elongated member indicated by the measured orientation values.

11. The robot control system of claim 1, wherein the computer defines the robot plane as corresponding to a plane normal to the orientation of the elongated member specified by the measured orientation values.

12. The robot control system of claim 1, wherein the computer defines the robot plane based on a closest match basis between a predefined plane in the non-volatile memory of the computer and a plane normal to the orientation of the elongated member specified by the measured orientation values.
13. A robot control stick for controlling a robot portion, the robot control stick comprising: an elongated member; a user input sensor mounted to said elongated member and activatable by a user to generate a user input; an orientation sensor mounted to said elongated member and configured for generating a base orientation signal indicative of a base orientation of said elongated member within a three-dimensional coordinate system, and further generating, once said base orientation has been generated, an orientation variation signal indicative of a variation of an orientation of said elongated member within said three-dimensional coordinate system relative to the base orientation, said orientation variation signal being convertible into a two-dimensional path confined to a robot plane orthogonal to said base orientation, said robot portion being controllable to move within said robot plane along said two-dimensional path.
14. The robot control stick of claim 13 further comprising a controller communicatively coupled to said orientation sensor and to said robot portion, the controller having a processor and a memory having instructions that when executed by the processor perform the steps of controlling said robot portion based on said base orientation signal and said orientation variation signal.
15. The robot control stick of claim 14 wherein said controller is configured to perform the steps of converting said orientation variation signal into said two-dimensional path confined to said robot plane, and generating a control signal controlling said robot portion to move within said robot plane along said two-dimensional path.
16. The robot control stick of claim 14 wherein said controller is mounted to the elongated member.
17. The robot control stick of claim 14 wherein said orientation sensor and said user input sensor are communicatively coupled to said controller via a wireless communication link.
18. The robot control stick of claim 14 wherein said controller is part of a teach pendant unit.
19. The robot control stick of claim 13 wherein said elongated member has a pen-like shape.
20. The robot control stick of claim 13 wherein said user input sensor has at least a button.
21. The robot control stick of claim 13 wherein said orientation sensor has at least an accelerometer.
22. A computer-implemented method for controlling a robot portion using a robot control stick having an orientation sensor and a user input sensor, said computer-implemented method comprising: upon receiving a user input at a given moment in time, and using an orientation sensor, generating a base orientation signal indicative of an orientation of said orientation sensor within a three-dimensional coordinate system at said given moment in time, and determining a base orientation based on said base orientation signal; subsequent to said determining, generating an orientation variation signal indicative of a variation of said orientation of said orientation sensor; converting said orientation variation signal into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
23. The computer-implemented method of claim 22 wherein, upon determining that said orientation sensor has been rotated in a given direction by more than a right angle up to a given angle, rotating said robot plane to said given angle along said given direction.
24. The computer-implemented method of claim 22 wherein said determining said base orientation includes snapping said base orientation to one of a plurality of predefined reference orientations incrementally spaced-apart from one another.
25. The computer-implemented method of claim 22 wherein said orientation of the orientation sensor is registered in a first coordinate system, the method further comprising registering said base orientation signal and said orientation variation signal in a second coordinate system of said robot portion.
26. A teach pendant system for a robot portion movable in a three-dimensional coordinate system, the teach pendant system comprising: an elongated member having an orientation sensor generating an orientation signal indicative of an orientation of said elongated member relative to said three-dimensional coordinate system; and a teach pendant unit communicatively coupled to said orientation sensor and to said robot portion, the teach pendant unit having a processor and a memory having instructions that when executed by the processor perform the steps of: upon receiving a user input at a given moment in time, determining a base orientation of said elongated member at said moment in time based on said orientation signal; subsequent to said determining, monitoring an orientation of said elongated member based on said orientation signal; converting said monitored orientation into a two-dimensional path confined to a robot plane of said three-dimensional coordinate system, with said robot plane being orthogonal to said base orientation; and generating a control signal controlling the robot portion to move within said robot plane along said two-dimensional path.
27. The teach pendant system of claim 26 wherein said elongated member has a user input sensor activatable by a user to generate said user input.
28. The teach pendant system of claim 27 wherein said user input sensor is a button exposed on a surface of said elongated member.
29. The teach pendant system of claim 26 wherein said elongated member extends between a first end and a second end, at least one of said first end and said second end receivable into a corresponding portion of said teach pendant unit at least during said monitoring.
30. The teach pendant system of claim 26 wherein said elongated member is removably attachable to said teach pendant system.
31. The teach pendant system of claim 26 wherein said orientation sensor has at least an accelerometer.
PCT/CA2021/051067 2020-08-03 2021-07-29 Robot control stick, computer-implemented method for controlling a robot portion, and teach pendant system WO2022027129A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063060203P 2020-08-03 2020-08-03
US63/060,203 2020-08-03

Publications (1)

Publication Number Publication Date
WO2022027129A1 true WO2022027129A1 (en) 2022-02-10

Family

ID=80118953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2021/051067 WO2022027129A1 (en) 2020-08-03 2021-07-29 Robot control stick, computer-implemented method for controlling a robot portion, and teach pendant system

Country Status (1)

Country Link
WO (1) WO2022027129A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6019606A (en) * 1996-11-08 2000-02-01 Toyoda Koki Kabushiki Kaisha Robot teaching machine
WO2013033747A1 (en) * 2011-09-06 2013-03-14 Keba Ag Method, control system and movement presetting means for programming or presetting movements or processes by an industrial robot
US20150321351A1 (en) * 2014-05-08 2015-11-12 Chetan Kapoor Intuitive Motion Coordinate System for Controlling an Industrial Robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21852480

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21852480

Country of ref document: EP

Kind code of ref document: A1