WO2021250923A1 - Système de robot, dispositif de commande et procédé de commande - Google Patents

Robot system, control device, and control method (Système de robot, dispositif de commande et procédé de commande)

Info

Publication number
WO2021250923A1
WO2021250923A1 (PCT/JP2021/001844, JP2021001844W)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
unit
trajectory
robot system
error
Prior art date
Application number
PCT/JP2021/001844
Other languages
English (en)
Japanese (ja)
Inventor
真彰 前田
祐市 桜井
直宏 林
Original Assignee
株式会社日立産機システム (Hitachi Industrial Equipment Systems Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立産機システム (Hitachi Industrial Equipment Systems Co., Ltd.)
Publication of WO2021250923A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/19Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/406Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
    • G05B19/4069Simulating machining process on screen
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/4093Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by part programming, e.g. entry of geometrical information as taken from a technical drawing, combining this with machining and material information to obtain control information, named part programme, for the NC machine
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/4155Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme

Definitions

  • The present invention relates to a robot system, a control device, and a control method.
  • The present application claims priority from Japanese Patent Application No. 2020-099821, filed on June 9, 2020; for designated countries where incorporation by reference is permitted, the content described in that application is incorporated into this application by reference.
  • In the prior art, a manufacturing method is described that comprises a first step in which a first workpiece gripped by the grip portion of a robot is placed on a mounting surface, a second step of determining the position and posture of a second workpiece with respect to the robot, and a third step of moving the second workpiece to the position and posture determined in the second step and causing the robot to perform the assembling operation.
  • In this method, the deviation between the planned trajectory of the hand and the trajectory during actual operation is measured, the position and posture of the second workpiece are determined based on that deviation and the shape of the second workpiece, and the assembly is said to be achievable by moving the second workpiece to the determined position and posture.
  • However, the planned assembly work may not be accomplished merely by adjusting the position and posture of the second workpiece.
  • The present invention has been made in view of the above, and aims to suppress failure of the assembly work caused by the deviation in gripping position and the deviation in posture that occur when the workpiece is actually gripped.
  • The present application includes a plurality of means for solving at least a part of the above problems; examples are as follows.
  • For example, the robot system is a robot system including a robot that executes workpiece assembly work and a control device that controls the robot, wherein the control device includes: a motion planning unit that determines a trajectory from the start point to the end point of the robot's motion and generates trajectory information; a gripping adjustment unit that, based on the trajectory information, estimates an error from the planned value for at least one of the gripping position, posture, and gripping force of the robot when the robot actually grips the workpiece; a simulation execution unit that adjusts at least one of the gripping position, posture, and gripping force of the robot in the direction of eliminating the estimated error and then executes a motion simulation of the assembly work to newly generate a trajectory; and a re-grasping determination unit that determines whether the assembly work is possible based on whether a trajectory was generated by the simulation execution unit.
  • According to the present invention, it is possible to suppress failure of the assembly work due to the deviation in gripping position and the deviation in posture that occur when the workpiece is actually gripped.
  • FIG. 1 is a diagram showing a configuration example of a robot system according to the first embodiment of the present invention.
  • FIG. 2 is a diagram for explaining an example of processing by the motion planning unit.
  • FIG. 3 is a diagram showing a configuration example of the grip adjustment unit.
  • FIGS. 4(A) and 4(B) are diagrams for explaining an example of processing by the simulation execution unit and the re-grasping determination unit:
  • FIG. 4(A) is a diagram showing a case where a trajectory can be generated because the interference region is small, and
  • FIG. 4(B) is a diagram showing a case where a trajectory cannot be generated because the interference region is large.
  • FIG. 5 is a flowchart illustrating an example of control processing by the robot system of FIG. 1.
  • FIG. 6 is a diagram showing a configuration example of a robot system according to a second embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating an example of control processing by the robot system of FIG.
  • FIG. 8 is a diagram showing a configuration example of a robot system according to a third embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating an example of control processing by the robot system of FIG.
  • FIG. 1 shows a configuration example of the robot system 11 according to the first embodiment of the present invention.
  • The robot system 11 includes a control device 20, a controller 30, and a robot 40.
  • The control device 20 controls the operation of the robot 40 via the controller 30.
  • The control device 20 is a general computer, such as a personal computer, equipped with a processor such as a CPU (Central Processing Unit), memory such as SDRAM (Synchronous Dynamic Random Access Memory), storage such as an HDD (Hard Disk Drive) or SSD (Solid State Drive), a communication module, an input device, and a display device.
  • The control device 20 includes a calculation unit 21, a storage unit 22, a display unit 23, an input unit 24, and a communication unit 25.
  • The calculation unit 21 is realized by the processor of the computer.
  • The calculation unit 21 has the functional blocks of a motion planning unit 211, a gripping adjustment unit 212, a simulation execution unit 213, a re-grasping determination unit 214, and a display control unit 218. These functional blocks are realized by the processor of the computer executing a predetermined program.
  • The motion planning unit 211 determines a trajectory along which the robot 40 grips the workpiece and performs the assembly work, based on the robot configuration information 221, the trajectory start point/end point information 222, the interfering object configuration information 223, and the gripping object information 224 stored in the storage unit 22 (all described later), and stores the trajectory information 225 in the storage unit 22.
  • FIG. 2 is a diagram for explaining an example of processing by the motion planning unit 211, and shows a coordinate space (hereinafter referred to as a configuration space) with each joint angle of the robot 40 as a space axis.
  • The configuration space of FIG. 2 shows a case where the robot 40 has two degrees of freedom.
  • The motion planning unit 211 calculates, by motion simulation based on the robot configuration information 221, the interfering object configuration information 223, and the gripping object information 224, the position of the interference region in which peripheral structures that may interfere with the motion of the robot 40 exist.
  • The motion planning unit 211 generates a plurality of trajectory candidates from the start point S, representing the state in which the workpiece is gripped by the robot 40, to the end point G, representing the state in which the workpiece has been assembled, via one or more waypoints (not shown), and selects from among the candidates a trajectory P that does not pass through the interference region. Since many algorithms are known for generating trajectory candidates and selecting the trajectory P, these may be used.
  • The trajectory information generated by the motion planning unit 211 includes information representing the positions of the start point S, the one or more waypoints, and the end point G of the motion, and the posture of the robot 40 at each point.
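  • The trajectory generation described above can be sketched as a minimal configuration-space search (added for illustration only; the circular interference regions, the single-waypoint search, and all names are assumptions, not details from this publication):

```python
import math
import random

def in_interference(q, obstacles):
    """True if joint configuration q lies inside any circular interference region."""
    return any(math.dist(q, c) < r for c, r in obstacles)

def segment_clear(a, b, obstacles, steps=20):
    """Check a straight segment in configuration space by dense sampling."""
    return all(
        not in_interference(
            (a[0] + (b[0] - a[0]) * t / steps, a[1] + (b[1] - a[1]) * t / steps),
            obstacles,
        )
        for t in range(steps + 1)
    )

def plan(start, goal, obstacles, tries=200):
    """Try the direct segment first, then single random waypoints."""
    random.seed(0)  # deterministic for the example
    if segment_clear(start, goal, obstacles):
        return [start, goal]
    for _ in range(tries):
        w = (random.uniform(-3.14, 3.14), random.uniform(-3.14, 3.14))
        if segment_clear(start, w, obstacles) and segment_clear(w, goal, obstacles):
            return [start, w, goal]
    return None  # no trajectory found: assembly judged impossible

obstacles = [((0.0, 0.0), 0.5)]  # one interference region in joint space
path = plan((-1.0, -1.0), (1.0, 1.0), obstacles)
```

A real planner would use an established sampling-based algorithm over the full joint space; this toy version only shows the structure of generating candidates and rejecting those that cross the interference region.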
  • The gripping adjustment unit 212 adjusts the gripping operation that the hand unit 42 of the robot 40 performs on the workpiece.
  • FIG. 3 shows a detailed configuration example of the gripping adjustment unit 212.
  • The gripping adjustment unit 212 includes a feature amount extraction unit 51, an error estimation unit 52, and a gripping force adjusting unit 53.
  • The feature amount extraction unit 51 acquires sensor data representing the detection results of the tactile sensor 43 and the visual sensor 44 when the hand unit 42 grips the workpiece, and extracts feature amounts.
  • Based on the extracted feature amounts, the error estimation unit 52 estimates, for at least one of the gripping position, gripping posture, and gripping force of the workpiece by the hand unit 42, the error between the planned value and the value when actually gripping.
  • The gripping position is represented, for example, by a position vector in Cartesian coordinates when the hand unit 42 grips the workpiece.
  • The gripping posture is represented by rotations (roll, pitch, yaw) about the orthogonal coordinate axes of the hand unit 42 when gripping the workpiece.
  • The gripping force is represented by the rotation angle of the joint that controls the opening/closing width of the hand unit 42.
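  • As a hedged illustration of how the three quantities above might be represented in code (the class and field names are invented for this sketch, not taken from the publication):

```python
from dataclasses import dataclass

@dataclass
class GraspState:
    position: tuple    # gripping position: (x, y, z) position vector in Cartesian coordinates
    posture: tuple     # gripping posture: (roll, pitch, yaw) about the coordinate axes
    grip_force: float  # gripping force: rotation angle of the hand's open/close joint

def grasp_error(planned, actual):
    """Component-wise error between the planned grasp and the actual grasp."""
    return GraspState(
        tuple(a - p for a, p in zip(actual.position, planned.position)),
        tuple(a - p for a, p in zip(actual.posture, planned.posture)),
        actual.grip_force - planned.grip_force,
    )

planned = GraspState((0.10, 0.20, 0.05), (0.0, 0.0, 1.57), 0.30)
actual = GraspState((0.11, 0.20, 0.05), (0.0, 0.1, 1.57), 0.25)
err = grasp_error(planned, actual)
```

The error structure computed here is what the error estimation unit would output, and what the later adjustment tries to drive to zero.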
  • The gripping force adjusting unit 53 adjusts the gripping force of the gripping operation performed on the workpiece by the hand unit 42 of the robot 40.
  • The simulation execution unit 213 adjusts at least one of the current position, posture, and gripping force of the hand unit 42 in the direction of eliminating the error estimated by the error estimation unit 52. Specifically, it moves the gripping position of the workpiece by the hand unit 42 and changes the posture and the gripping force. This adjustment for eliminating the error may be performed by actually operating the hand unit 42, or as a simulation without operating it. Further, the simulation execution unit 213 executes the same simulation as the motion planning unit 211 on the premise that the hand unit 42 has been adjusted, and searches for a trajectory from the start point to the end point of the assembly work.
  • The re-grasping determination unit 214 determines, based on the simulation result of the simulation execution unit 213, whether the assembly work can be performed by re-grasping after adjusting at least one of the position, posture, and gripping force of the hand unit 42. When it determines that the assembly work is possible by re-grasping, the re-grasping determination unit 214 generates a control target value for adjusting the hand unit 42 in the direction of eliminating the error. This control target value is transmitted to the controller 30 by the communication unit 25.
  • FIG. 4 is a diagram for explaining an example of processing by the simulation execution unit 213 and the re-grasping determination unit 214, and shows the configuration space of the robot 40.
  • FIG. 4(A) shows the trajectory in the configuration space after adjusting the hand unit 42 in the direction of eliminating the error, in a case where it is determined that assembly is possible by re-grasping.
  • The start point S represents the combination of the joint angles of the robot 40 after the adjustment.
  • The end point G is at the same position as at the time of planning by the motion planning unit 211.
  • FIG. 4(A) shows a case where the change in the interference region is small because the adjustment amount is relatively small, so the trajectory P from the start point S to the end point G can be generated without passing through the interference region.
  • FIG. 4(B) shows a case where the change in the interference region is large because the adjustment amount is relatively large, so a trajectory from the start point S to the end point G cannot be generated without passing through the interference region.
  • The display control unit 218 controls various displays on the display unit 23.
  • The storage unit 22 is realized by the memory and storage of the computer.
  • The storage unit 22 stores the robot configuration information 221, the trajectory start point/end point information 222, the interfering object configuration information 223, the gripping object information 224, the trajectory information 225, and the trained model 226.
  • The robot configuration information 221 is information including the shapes and connection relations of the arm unit 41 and the hand unit 42 constituting the robot 40; specifically, it is 3D CAD (Computer Aided Design) data, URDF (Unified Robot Description Format) data, or the like of the robot 40.
  • The robot configuration information 221 is stored in the storage unit 22 in advance.
  • The trajectory start point/end point information 222 is information representing the coordinates and postures of the start point, the waypoints, and the end point of the motion of the robot 40.
  • The trajectory start point/end point information 222 is, for example, input by the user in advance and stored in the storage unit 22.
  • The interfering object configuration information 223 is information on structures existing around the robot 40 that may interfere with its operation; specifically, it is, for example, 3D CAD data of jigs used for assembly, stands around the robot 40, and the like. The interfering object configuration information 223 is stored in the storage unit 22 in advance.
  • The gripping object information 224 is information representing the shape of the workpiece to be gripped; specifically, it is 3D CAD data of the workpiece or the like.
  • The gripping object information 224 is stored in the storage unit 22 in advance.
  • The trajectory information 225 is information representing the time-series changes in the combination of the joint angles of the robot 40.
  • The trajectory information 225 is generated by the motion planning unit 211 and stored in the storage unit 22.
  • The trained model 226 is a machine learning model such as a neural network.
  • For past gripping motions on a given workpiece, the trained model 226 has learned the relationship between the feature amounts of the sensor data from the tactile sensor 43 and the visual sensor 44 and the errors, between planning and actual operation, in the gripping position, posture, and gripping force when the hand unit 42 grips the workpiece.
  • The trained model 226 takes the feature amounts of the sensor data as input, and outputs the errors between planning and actual operation in the gripping position, posture, and gripping force.
  • The trained model 226 is stored in the storage unit 22 in advance.
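  • The input/output behavior described for the trained model 226 can be mimicked with a stub (the dimensions and the linear map below are invented placeholders; the actual model is a learned one such as a neural network):

```python
import random

random.seed(0)
# Stub weights standing in for the trained model's parameters: a linear map from
# an 8-dimensional sensor-feature vector to a 7-dimensional error vector
# (3 position components + 3 posture components + 1 gripping-force component).
W = [[random.gauss(0.0, 0.01) for _ in range(8)] for _ in range(7)]

def estimate_error(features):
    """Feature amounts of the sensor data in; plan-vs-actual grasp errors out."""
    err = [sum(w * f for w, f in zip(row, features)) for row in W]
    return {"position": err[0:3], "posture": err[3:6], "grip_force": err[6]}

error = estimate_error([0.5] * 8)
```

Only the interface matters for the surrounding text: sensor features in, structured grasp errors out.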
  • The display unit 23 is realized by the display device of the computer.
  • The display unit 23 displays, for example, a GUI (Graphical User Interface) screen that allows the user to make various inputs, shows the operating status of the robot 40, or notifies the user that the robot 40 cannot execute the assembly work as planned.
  • The input unit 24 is realized by the input device of the computer.
  • The input unit 24 receives various input operations from the user.
  • The communication unit 25 is realized by the communication module of the computer.
  • The communication unit 25 connects to the controller 30 via a network (not shown), such as the Internet or a mobile phone communication network, and exchanges various data.
  • The controller 30 executes the assembly work by operating the arm unit 41 and the hand unit 42 constituting the robot 40 according to control inputs from the control device 20.
  • The robot 40 has an arm unit 41, a hand unit 42, a tactile sensor 43, and a visual sensor 44.
  • The robot 40 operates the arm unit 41 and the hand unit 42 according to control from the controller 30.
  • The arm unit 41 is a structure that supports the hand unit 42, and is the movable part of the robot 40.
  • The hand unit 42 is an end effector such as a gripper.
  • The tactile sensor 43 is installed on the hand unit 42.
  • The tactile sensor 43 detects the shape, stress, and the like of the contact surface between the gripping surface of the hand unit 42 and the workpiece.
  • The visual sensor 44 is installed at an arbitrary position on the robot 40.
  • The visual sensor 44 is composed of, for example, a camera, and detects workpieces, interfering objects, and the like existing around the robot 40.
  • The visual sensor 44 may be installed somewhere other than on the robot 40 (for example, on the ceiling) as long as it can detect workpieces, interfering objects, and the like existing around the robot 40.
  • FIG. 5 is a flowchart illustrating an example of robot control processing by the robot system 11.
  • The robot control process is started, for example, in response to a predetermined operation by the user.
  • The motion planning unit 211 determines a trajectory for gripping the workpiece and performing the assembly work, based on the robot configuration information 221, the trajectory start point/end point information 222, the interfering object configuration information 223, and the gripping object information 224 stored in the storage unit 22, and generates the trajectory information 225 and stores it in the storage unit 22 (step S1).
  • The gripping adjustment unit 212 refers to the trajectory information 225 and causes the robot 40 to actually grip the workpiece (step S2). Specifically, it controls the controller 30 via the communication unit 25 and causes the robot 40 to execute the gripping operation.
  • The feature amount extraction unit 51 of the gripping adjustment unit 212 acquires from the controller 30 the sensor data detected by the tactile sensor 43 and the visual sensor 44 while the robot 40 is executing the gripping operation, and extracts the feature amounts (step S3).
  • By inputting the extracted feature amounts to the trained model 226, the error estimation unit 52 estimates the errors between planning and actual operation in the gripping position, gripping posture, and gripping force of the workpiece by the hand unit 42 (step S4).
  • The simulation execution unit 213 executes the same simulation as the motion planning unit 211 on the premise that the hand unit 42 is adjusted in the direction of eliminating the estimated errors, and searches for a trajectory that reaches the end point from the changed start point without passing through the interference region (step S5).
  • Based on the simulation result of the simulation execution unit 213, the re-grasping determination unit 214 determines whether the assembly work is possible, according to whether a trajectory could be found (step S6).
  • When it is determined that the assembly work is possible (YES in step S6), the re-grasping determination unit 214 generates a control target value for adjusting the hand unit 42 in the direction of eliminating the errors and transmits it to the controller 30 via the communication unit 25, thereby causing the robot 40 to execute the workpiece gripping operation again (step S7).
  • The gripping adjustment unit 212 then controls the controller 30 via the communication unit 25 and operates the robot 40 along the trajectory that was found in step S5 and for which the assembly work was determined to be possible in step S6 (step S8). This completes the robot control process.
  • When the re-grasping determination unit 214 determines in step S6 that the assembly work is impossible because no trajectory could be found (NO in step S6), the display control unit 218 displays an alert screen on the display unit 23 indicating that the assembly work cannot be executed (step S9). This completes the robot control process.
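  • The flow of steps S1 to S9 can be condensed into a short sketch (the function parameters stand in for the units described above; all names are illustrative, not from the publication):

```python
def control_process(plan, grip, sense, estimate, simulate, regrasp, execute, alert):
    """One pass of the first embodiment's control flow (FIG. 5), sketched."""
    trajectory = plan()               # S1: plan a trajectory, store trajectory info
    grip(trajectory)                  # S2: actually grip the workpiece
    features = sense()                # S3: extract features from the sensor data
    error = estimate(features)        # S4: estimate the plan-vs-actual error
    new_trajectory = simulate(error)  # S5: re-plan assuming an adjusted hand
    if new_trajectory is not None:    # S6: was a trajectory generated?
        regrasp(error)                # S7: grip again, eliminating the error
        execute(new_trajectory)       # S8: run the assembly motion
        return True
    alert()                           # S9: display the alert, assembly impossible
    return False

# Exercise with stubs in which the simulation succeeds, so assembly proceeds.
log = []
ok = control_process(
    plan=lambda: "P0",
    grip=lambda t: log.append(("grip", t)),
    sense=lambda: [0.1],
    estimate=lambda f: {"pos": f[0]},
    simulate=lambda e: "P1",
    regrasp=lambda e: log.append(("regrasp", e)),
    execute=lambda t: log.append(("execute", t)),
    alert=lambda: log.append(("alert",)),
)
```

The branch on `new_trajectory` is the single decision point of this embodiment: re-grasp and execute, or alert and stop.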
  • FIG. 6 shows a configuration example of the robot system 12 according to the second embodiment of the present invention.
  • The robot system 12 adds a threshold value determination unit 215 as a functional block of the calculation unit 21 of the robot system 11 (FIG. 1), and adds error threshold value information 227 to the information stored in the storage unit 22.
  • Of the components of the robot system 12, those common to the robot system 11 are given the same reference numerals, and their description is omitted.
  • The threshold value determination unit 215 refers to the error threshold value information 227, which indicates tolerance thresholds for the gripping position, gripping posture, and gripping force set for each workpiece, and determines whether the error estimated by the error estimation unit 52 is equal to or less than the thresholds.
  • FIG. 7 is a flowchart illustrating an example of robot control processing by the robot system 12.
  • The robot control process is started, for example, in response to a predetermined operation by the user. Since the processes of steps S11 to S14 are the same as steps S1 to S4 of the robot control process by the robot system 11 (FIG. 5), their description is omitted.
  • Next, the threshold value determination unit 215 refers to the error threshold value information 227 and determines whether the estimated error is equal to or less than the threshold (step S15).
  • When the estimated error exceeds the threshold (NO in step S15), the simulation execution unit 213 executes the same simulation as the motion planning unit 211 on the premise that the hand unit 42 is adjusted in the direction of eliminating the estimated error, and searches for a trajectory from the changed start point to the end point that does not pass through the interference region (step S16).
  • Based on the simulation result of the simulation execution unit 213, the re-grasping determination unit 214 determines whether the assembly work is possible, according to whether a trajectory could be found (step S17).
  • When it is determined that the assembly work is possible (YES in step S17), the re-grasping determination unit 214 generates a control target value for adjusting the hand unit 42 in the direction of eliminating the error and causes the robot 40 to execute the workpiece gripping operation again (step S18). After that, the process returns to step S13, and step S13 and the subsequent steps are repeated.
  • When the re-grasping determination unit 214 determines in step S17 that the assembly work is impossible because no trajectory could be found (NO in step S17), the display control unit 218 displays an alert screen on the display unit 23 indicating that the assembly work cannot be executed (step S19). This completes the robot control process.
  • When the threshold value determination unit 215 determines that the estimated error is equal to or less than the threshold (YES in step S15), no further error adjustment is required, so the gripping adjustment unit 212 controls the controller 30 via the communication unit 25 to operate the robot 40 along the trajectory and perform the assembly work.
  • According to the robot control process by the robot system 12, the re-grasping operation and the trajectory search are repeated until the estimated error becomes equal to or less than the threshold, except when no trajectory can be found; compared with the robot system 11, the success rate of the assembly work can therefore be further improved.
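  • The second embodiment's retry loop (steps S13 to S18) can be sketched as follows (a simplification using an invented scalar error; in the publication the error covers position, posture, and gripping force):

```python
def control_with_threshold(grip_and_estimate, threshold, replan, regrasp, execute, alert):
    """Loop until the estimated error is tolerable or no trajectory can be found."""
    while True:
        error = grip_and_estimate()  # S13-S14: grip, sense, estimate the error
        if abs(error) <= threshold:  # S15: within the per-workpiece tolerance?
            execute()                # no further adjustment required
            return True
        if replan(error) is None:    # S16-S17: trajectory search failed
            alert()                  # S19: assembly impossible
            return False
        regrasp(error)               # S18: grip again, then re-check the error

# Stub in which each re-grasp reduces the error; tolerance is reached on the third check.
errors = iter([0.4, 0.2, 0.05])
done = []
ok = control_with_threshold(
    grip_and_estimate=lambda: next(errors),
    threshold=0.1,
    replan=lambda e: "trajectory",
    regrasp=lambda e: None,
    execute=lambda: done.append("execute"),
    alert=lambda: done.append("alert"),
)
```

Unlike the single-pass flow of the first embodiment, this loop keeps re-grasping until the error falls within tolerance, which is why the text expects a higher assembly success rate.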
  • FIG. 8 shows a configuration example of the robot system 13 according to the third embodiment of the present invention.
  • The robot system 13 adds an interfering object identification unit 216 as a functional block of the calculation unit 21 of the robot system 11 (FIG. 1).
  • Of the components of the robot system 13, those common to the robot system 11 are given the same reference numerals, and their description is omitted.
  • When the re-grasping determination unit 214 determines that the assembly work is impossible because no trajectory could be found after re-grasping, the interfering object identification unit 216 identifies the interfering object that prevents the trajectory from being found and outputs that information to the display control unit 218.
  • The display control unit 218 causes the display unit 23 to display the position, shape, and the like of the interfering object, for example, as a character string or an image.
  • FIG. 9 is a flowchart illustrating an example of robot control processing by the robot system 13.
  • The robot control process is started, for example, in response to a predetermined operation by the user. Since the processes of steps S31 to S38 are the same as steps S1 to S8 of the robot control process by the robot system 11 (FIG. 5), their description is omitted.
  • When the re-grasping determination unit 214 determines in step S36 that the assembly work is impossible (NO in step S36), the interfering object identification unit 216 identifies the interfering object that prevents the trajectory from being found and outputs that information to the display control unit 218, and the display control unit 218 causes the display unit 23 to display the position, shape, and the like of the interfering object, for example, as a character string or an image (step S39).
  • Subsequently, the display control unit 218 causes the display unit 23 to display an alert screen indicating that the assembly work cannot be executed (step S40). This completes the robot control process.
  • According to the robot control process by the robot system 13, the same operations and effects as those of the robot control process by the robot system 11 (FIG. 5) can be obtained. Further, since the user can identify the interfering object obstructing the trajectory, the user can more easily plan countermeasures such as moving the interfering object.
  • The present invention is not limited to the above-described embodiments, and various modifications are possible.
  • The above embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to those including all the described configurations.
  • For example, the interfering object identification unit 216 may be added to the robot system 12 (FIG. 6).
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example by designing some or all of them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software, by a processor interpreting and executing a program that realizes each function. Information such as the programs, tables, and files realizing each function can be placed in memory, in a recording device such as a hard disk or SSD, or on a recording medium such as an IC card, SD card, or DVD.
  • The control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines in a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Geometry (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The present invention prevents failure of an assembly operation due to a holding-position deviation or an orientation deviation that occurs when a workpiece is actually held. A robot system is characterized in that a control device comprises: a motion planning unit that determines a trajectory from a start point to an end point of a motion to be executed by a robot, and produces trajectory information; a holding adjustment unit that estimates an error, relative to what was planned, in at least one of the holding position, orientation, and holding force of the robot when the robot actually holds a workpiece on the basis of the trajectory information; a simulation execution unit that adjusts at least one of the holding position, orientation, and holding force of the robot in a direction that eliminates the estimated error, and then executes a motion simulation of the assembly work to produce a new trajectory; and a re-holding determination unit that determines whether or not the assembly work is possible, on the basis of whether or not a trajectory was produced by the simulation execution unit.
PCT/JP2021/001844 2020-06-09 2021-01-20 Robot system, control device, and control method WO2021250923A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-099821 2020-06-09
JP2020099821A JP7479205B2 (ja) 2020-06-09 2020-06-09 Robot system, control device, and control method

Publications (1)

Publication Number Publication Date
WO2021250923A1 true WO2021250923A1 (fr) 2021-12-16

Family

ID=78847192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/001844 WO2021250923A1 (fr) 2020-06-09 2021-01-20 Robot system, control device, and control method

Country Status (2)

Country Link
JP (1) JP7479205B2 (fr)
WO (1) WO2021250923A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW202413237A (zh) * 2022-07-29 2024-04-01 日商遠程連接股份有限公司 Merchandise moving device, control method of merchandise moving device, and computer program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009045678A (ja) * 2007-08-16 2009-03-05 Yaskawa Electric Corp ロボットの作業成否判定方法およびロボットシステム
US20150239127A1 (en) * 2014-02-25 2015-08-27 Gm Global Technology Operations Llc. Visual debugging of robotic tasks
JP2017144498A (ja) * 2016-02-15 2017-08-24 キヤノン株式会社 情報処理装置、情報処理装置の制御方法およびプログラム
JP2017177283A (ja) * 2016-03-30 2017-10-05 セイコーエプソン株式会社 ロボット制御装置、ロボットおよびシミュレーション装置
JP2018126796A (ja) * 2017-02-06 2018-08-16 セイコーエプソン株式会社 制御装置、ロボットおよびロボットシステム
JP2019171501A (ja) * 2018-03-27 2019-10-10 日本電産株式会社 ロボットの干渉判定装置、ロボットの干渉判定方法、プログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2255930A1 (fr) 2009-05-27 2010-12-01 Leica Geosystems AG Method and system for the highly precise positioning of at least one object in a final position in space


Also Published As

Publication number Publication date
JP7479205B2 (ja) 2024-05-08
JP2021192944A (ja) 2021-12-23

Similar Documents

Publication Publication Date Title
US11161247B2 (en) Robot trajectory generation method, robot trajectory generation apparatus, storage medium, and manufacturing method
Abu-Dakka et al. Adaptation of manipulation skills in physical contact with the environment to reference force profiles
US9387589B2 (en) Visual debugging of robotic tasks
Kruse et al. A sensor-based dual-arm tele-robotic system
Bagnell et al. An integrated system for autonomous robotics manipulation
Pastor et al. Towards associative skill memories
JP5686775B2 (ja) Method for dynamic optimization of a robot control interface
Palmer et al. Real-time method for tip following navigation of continuum snake arm robots
CN110573308A (zh) Mixed reality assisted spatial programming of robot systems
Sayour et al. Autonomous robotic manipulation: real‐time, deep‐learning approach for grasping of unknown objects
Felip et al. Manipulation primitives: A paradigm for abstraction and execution of grasping and manipulation tasks
WO2021097166A1 (fr) Tactile dexterity and control
JP2022176917A (ja) Method for controlling a robot device
JP6322949B2 (ja) Robot control device, robot system, robot, robot control method, and robot control program
WO2021250923A1 (fr) Robot system, control device, and control method
JP2015071207A (ja) Robot hand and control method thereof
Süberkrüb et al. Feel the tension: Manipulation of deformable linear objects in environments with fixtures using force information
Wang et al. Learning robotic insertion tasks from human demonstration
Guanglong et al. Human–manipulator interface using hybrid sensors with Kalman filters and adaptive multi-space transformation
TWI781708B (zh) Learning device, learning method, learning program, control device, control method, and control program
Du et al. Human‐Manipulator Interface Using Particle Filter
JP7504398B2 (ja) Trajectory generation device, trajectory generation method, and trajectory generation program
KR20220086971A (ko) Method and device for tracking hand joints
JP7159525B2 (ja) Robot control device, learning device, and robot control system
US20210197374A1 (en) Composability framework for robotic control system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21821151

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21821151

Country of ref document: EP

Kind code of ref document: A1