WO2022195680A1 - Robot work teaching device and work teaching method - Google Patents
Robot work teaching device and work teaching method
- Publication number: WO2022195680A1 (PCT application PCT/JP2021/010391)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- teaching
- robot
- detection unit
- work
- teacher
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36401—Record play back, teach position and record it then play back
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40391—Human to robot skill transfer
Definitions
- the present invention relates to a robot work teaching device and a work teaching method for teaching human work motions to a robot.
- a teaching tool having a button for specifying a teaching position of the robot is placed at the position to which the teacher wants to move the robot, and the teaching position is specified by pressing the button there.
- the position and orientation of the teaching tool at that moment are measured by a stereo camera and determined as a teaching position of the robot; the determined teaching positions are then interpolated to generate a teaching program for operating the robot.
- Patent Document 2 describes a robot teaching system equipped with an acquisition device that acquires work information including image information.
- the process of determining whether the current process has ended normally before moving to the next process must also be described in the robot's teaching program. For example, at the end of a process, the state of the work object is recognized from an image; if it matches a target state stored in advance, the process is judged to have ended normally, and otherwise it is judged to have ended abnormally.
- however, the description of such a teaching program must be input item by item by the teacher, separately from the teaching of the robot's operating positions described in Patent Document 1. As the work to be executed by the robot becomes more complex, the teacher's input and debugging work increase, raising the man-hours needed to develop a teaching program.
- an object of the present invention is therefore to provide a robot work teaching device and a work teaching method that generate a teaching program for causing a robot to perform a series of tasks by detecting, in synchronization with measurement, that the teacher has grasped or operated a grasped object and has confirmed the work state.
- the present invention provides a work teaching device for a robot that teaches a robot a work to be done by a teacher.
- a teaching pose measuring unit for measuring a teaching pose
- a positioning detection unit for detecting that an object to be moved by the teacher has been positioned
- a gripping motion detection unit for detecting that the object has been gripped by the teacher.
- a teaching program generation unit that receives signals from the teaching pose measurement unit, the positioning detection unit, the grasping motion detection unit, the function operation detection unit, and the work state confirmation motion detection unit and generates a robot teaching program divided for each motion of the teacher; and a teaching program execution unit that executes the teaching program generated by the teaching program generation unit.
- the present invention provides a method for teaching a work of a robot for teaching a work by a teacher to a robot.
- in synchronization with the measurement of the teaching pose, the positioning detection unit detects that the object moved by the teacher has been positioned;
- the grasping motion detection unit detects that the object has been grasped by the teacher;
- the function operation detection unit detects that the teacher has operated a function of the object;
- the work state confirmation motion detection unit detects that the teacher has confirmed the work state of the object; and
- the teaching program generation unit receives signals from the teaching pose measurement unit, the positioning detection unit, the grasping motion detection unit, the function operation detection unit, and the work state confirmation motion detection unit, generates a robot teaching program divided for each motion of the teacher, and the teaching program execution unit executes the teaching program generated by the teaching program generation unit.
- according to the present invention, when converting human work into robot work, a teaching program for causing the robot to execute a series of tasks can be generated simply by the teacher demonstrating the work as usual, using the work object or a grasped object such as a work tool. The generated program includes not only the description of the robot's movement path but also the grasping of the object to be grasped by the robot, the functional operation of the grasped object, and the work state confirmation. This facilitates the teaching of the robot's work and improves the development efficiency of teaching programs.
- FIG. 1 is a block diagram showing the overall configuration of a robot work teaching device 1 according to an embodiment of the present invention.
- FIG. 2 is a plan view showing an example in which cameras 201 to 204 for measuring teaching poses are arranged with respect to the workbench 10.
- FIG. 3(a) is a perspective view of a micropipette 310 to which a marker plate 311 for teaching pose measurement is attached; FIG. 3(b) is a perspective view of a test tube 320 to which a marker plate 321 for teaching pose measurement is attached.
- FIG. 4(a) is a perspective view of a teaching glove 410 incorporating pressure-sensitive switches 411 to 416 for detecting the positioning, gripping, and function operation of a grasped object by the teacher 11; FIG. 4(b) is a perspective view of the teaching glove 410 being worn while gripping a micropipette 310.
- FIG. 5 is a perspective view of the teacher 11 and surroundings, showing an example using a foot switch 501 for inputting the positioning, gripping, and function operations of a grasped object by the teacher 11 and a microphone 502 for inputting the teacher's voice.
- FIG. 6 is a flow chart showing the procedure for dispensing a reagent into a test tube 320 using a micropipette 310.
- FIG. 7 is a front view of the workbench 10 on which a chip box 702 and reagent bottles 704 are placed.
- FIG. 8 is a flow chart showing the processing procedure of the teaching program generation unit 106.
- FIG. 9 is a flow chart showing the continuation of the processing procedure of the teaching program generation unit 106 shown in FIG. 8.
- FIG. 10(a) is a table showing an example of teaching pose partial sequence data generated by the teaching program generation unit 106; FIG. 10(b) is a table showing an example of robot joint displacement sequence data generated from that partial sequence.
- FIG. 11 is a diagram showing an example of a robot teaching program generated by the teaching program generation unit 106, expressed as a sequential function chart.
- FIG. 12 is a diagram showing an example of editing the sequential function chart of FIG. 11 with the teaching program editing unit 107.
- FIG. 13 is a diagram showing an example in which the sequential function chart of FIG. 12 is further edited with the teaching program editing unit 107.
- FIG. 14(a) shows the command sequence 1401 described as the action of step s2 in the sequential function charts of FIGS. 11 to 13; FIG. 14(b) shows the transition condition expression 1402 described as the processing of transition t201 on the output side of step s2; FIG. 14(c) shows the command sequence 1403 described as the action of step s4; FIG. 14(d) shows the transition condition expression 1404 described as the processing of transition t401 on the output side of step s4; FIG. 14(e) shows the command sequence 1405 described as the processing of step s6; FIG. 14(f) shows the transition condition expression 1406 described as the processing of transition t601 on the output side of step s6.
- FIG. 15(a) is a front view of a screen 1501 for editing the processing of step s4 of the sequential function charts of FIGS. 11 to 13 by operating the teaching program editing unit 107; FIG. 15(b) is a front view of a screen 1503 for editing the transition condition expression of transition t401.
- FIG. 16 is a flow chart showing the processing procedure of the teaching program execution unit 108.
- a robot work teaching device according to an embodiment generates a teaching program that enables a robot to reproduce the motions of a grasped object simply by the teacher demonstrating a series of tasks.
- the device measures the position and orientation of the grasped object held by the teacher in time series as teaching poses and, in synchronization with this measurement, detects the positioning, gripping, function operation, and work state confirmation of the grasped object.
- at each timing at which positioning of the grasped object is detected, the time series (sequence) of teaching poses is divided and stored as a teaching pose partial sequence.
- at each timing at which gripping of the grasped object, operation of a function of the grasped object, or confirmation of the work state is detected, the corresponding action to be executed is stored.
- the teaching pose partial sequences are converted into sequences of robot joint displacements such that the position and orientation of the grasped object gripped by the robot match each of the teaching poses in the partial sequence; commands to be executed are generated and ordered as taught by the teacher, and commands for gripping the grasped object, operating its functions, and confirming the work state are written into the robot teaching program so as to be executed at the timings at which those actions were detected.
- by executing the robot teaching program generated in this way, the robot performs the series of tasks in the same manner as the tasks demonstrated by the teacher.
- FIG. 1 is a schematic diagram illustrating the overall configuration of a robot work teaching device 1 according to an embodiment of the present invention.
- the robot work teaching device 1 comprises cameras 201 to 204 for measuring the three-dimensional position and orientation of a grasped object held by the teacher 11, a right marker plate 311 and a left marker plate 321 attached to the grasped objects, various devices (described later) for detecting the positioning, gripping motion, function operation, and work state confirmation of a grasped object held by the teacher 11, and a computer 100 that executes the arithmetic processing units related to work teaching.
- the teacher 11 works toward the workbench 10 in order to teach the robot 12.
- the teacher 11 dispenses a reagent into a test tube 320, which is a working object, held with the left hand, using a micropipette 310, which is a working tool, held with the right hand.
- the object to be grasped by the teacher 11 will be referred to as a first object to be grasped.
- a right marker plate 311 for motion capture is attached to the micropipette 310 .
- a left marker plate 321 for motion capture is attached to the test tube 320 .
- each of the right marker plate 311 and the left marker plate 321 is attached with a reflective marker for motion capture in a unique arrangement.
- the right marker plate 311 and the left marker plate 321 are recognized separately, and the position and orientation of the marker coordinate system set at the center of each plate are measured.
- the micropipette 310 and the test tube 320 are examples of objects to be grasped by the teacher 11, and the objects to be grasped are not limited to these.
- the robot 12 is installed facing a workbench 10' similar to the workbench 10 and is taught to dispense reagents into a test tube 320' held in its left hand using a micropipette 310' held in its right hand.
- the object to be grasped by these robots 12 will be referred to as a second object to be grasped.
- a micropipette 310 ′ and a test tube 320 ′ correspond to the micropipette 310 and the test tube 320 (first grasped object) grasped by the teacher 11 .
- the micropipette 310' and test tube 320' can have the same shapes as the micropipette 310 and test tube 320, but need not be exactly the same: as long as differences in shape can be recognized, they may be of the same type while differing in shape, material, and other properties.
- the robot 12 is a dual-armed robot having a left arm 121 and a right arm 122, but is not limited to this.
- the robot may be a combination of two robots, a single-arm robot with a right arm and a single-arm robot with a left arm.
- the two robots may be different types of robots. It is also possible to reproduce the work of the teacher with two arms by a robot having three or more arms.
- the robot 12 is not limited to a model simulating a human body as shown in the drawing, and is not limited to a specific shape as long as it can perform the desired motion.
- the teaching pose measurement unit 101 measures, from the images acquired by the cameras 201 to 204, the three-dimensional position and orientation of the objects (the micropipette 310 and the test tube 320) gripped by the teacher 11. Specifically, the teaching pose measurement unit 101 images the right marker plate 311 and the left marker plate 321 with the cameras 201, 202, 203, and 204 and calculates the three-dimensional positions and orientations of the right marker plate 311 and the left marker plate 321 in time series.
- the three-dimensional position and orientation of an object will be referred to as "pose”.
- the "pose" is data including not only the position of the object to be grasped, but also its tilt and rotation.
- the positioning detection unit 102 detects, in synchronization with the measurement of the pose of the grasped object (hereinafter, the teaching pose) by the teaching pose measurement unit 101 during the teaching work by the teacher 11, that the grasped object has been positioned. Specifically, positioning of the grasped object is detected either when the pose measured by the teaching pose measurement unit 101 does not change for a certain period of time, or from a signal received from any of the pressure-sensitive switches 413 to 416 (see FIG. 4) worn on the fingers of the teacher 11 (described later), the foot switch 501 (see FIG. 5) operated at the teacher's feet, or the microphone 502 (see FIG. 5) for inputting the teacher's voice.
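The first detection criterion above (the measured pose not changing for a certain period of time) could be sketched as follows. The window length and the position/rotation tolerances are illustrative assumptions, not values given in this description:

```python
import math

def is_positioned(pose_window, pos_tol=0.002, rot_tol=0.01):
    """Return True when every pose in the window stays within small
    position (m) and rotation (rad) tolerances of the first sample,
    i.e. the grasped object has stopped moving.

    Each pose is (x, y, z, roll, pitch, yaw) in the workbench frame.
    """
    x0, y0, z0, r0, p0, w0 = pose_window[0]
    for x, y, z, r, p, w in pose_window[1:]:
        if math.dist((x, y, z), (x0, y0, z0)) > pos_tol:
            return False
        if max(abs(r - r0), abs(p - p0), abs(w - w0)) > rot_tol:
            return False
    return True
```

The caller would feed this the most recent samples from the teaching pose measurement unit (for example, the last 0.5 s of data at the motion-capture rate) and report positioning once it returns True.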
- the gripping motion detection unit 103 detects the gripping motion of the grasped object in synchronization with the measurement of the teaching pose by the teaching pose measurement unit 101 during the teaching work by the teacher 11. Specifically, it receives a signal from any of the pressure-sensitive switches 413 to 416 (see FIG. 4) worn on the fingers of the teacher 11 (described later), the foot switch 501 (see FIG. 5) operated by the teacher 11, or the microphone 502 (see FIG. 5) for inputting the teacher's voice, and detects the gripping operation (gripping/releasing) of the grasped object.
- the function operation detection unit 104 detects that a function of the grasped object has been operated, in synchronization with the measurement of the teaching pose during the teaching work by the teacher 11. Specifically, it receives a signal from either of the pressure-sensitive switches 411 and 412 (see FIG. 4) worn on the fingers of the teacher 11 (described later), the foot switch 501 (see FIG. 5) operated by the teacher 11, or the microphone 502 (see FIG. 5) for inputting the teacher's voice, and detects that an operation of a function of the grasped object (for example, pressing the suction/discharge button 313 of the micropipette 310) has been performed.
- the work state confirmation motion detection unit 105 detects that the work state has been confirmed during the teaching work by the teacher 11, in synchronization with the measurement of the teaching pose. Specifically, it receives a signal from any of the pressure-sensitive switches 413 to 416 (see FIG. 4) worn on the fingers of the teacher 11 (described later), the foot switch 501 (see FIG. 5) operated by the teacher 11, or the microphone 502 (see FIG. 5) for inputting the teacher's voice, and detects that the work state has been confirmed (for example, that the work state is to be confirmed by the camera image recognition function provided in the robot 12).
- the teaching program generation unit 106 divides the time-series data (sequence) of teaching poses at each timing at which positioning of the object grasped by the teacher 11 (the first grasped object) is detected during the teaching work, and stores each segment as a teaching pose partial sequence. At the timings at which a gripping motion of the first grasped object or a function operation of the first grasped object is detected, these are stored as actions on the first grasped object, and at the timing at which work state confirmation is detected, it is stored as a work state confirmation action.
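Dividing the teaching pose sequence at the detected positioning timings could be sketched as below; the data layout (a time-stamped pose list and a sorted list of detection instants) is an assumption for illustration:

```python
def split_into_subsequences(pose_seq, positioning_times):
    """Divide a time-stamped teaching-pose sequence at the instants
    where positioning of the grasped object was detected.

    pose_seq: list of (t, pose) tuples sorted by time t.
    positioning_times: sorted list of detection instants.
    Returns one teaching pose partial sequence per segment.
    """
    subsequences, current = [], []
    cuts = list(positioning_times)
    for t, pose in pose_seq:
        current.append((t, pose))
        # Close the current partial sequence at each positioning event.
        if cuts and t >= cuts[0]:
            subsequences.append(current)
            current = []
            cuts.pop(0)
    if current:  # keep any trailing samples after the last event
        subsequences.append(current)
    return subsequences
```

Each resulting partial sequence corresponds to one motion of the teacher between consecutive positioning events, matching the per-motion division of the generated teaching program.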
- the teaching pose partial sequences are then converted into sequences of joint displacements of the robot 12 such that the pose of the object gripped by the robot 12 (the second grasped object) matches each of the teaching poses in the partial sequence, and commands to be executed are generated and ordered as taught by the teacher 11. Commands for actions on the first grasped object are generated so as to be executed at the timings at which those actions were detected, and a command for confirming the work state is generated so as to be executed at the timing at which work state confirmation was detected. In this way, the teaching program for the robot 12 is generated.
- the teaching program editing unit 107 is a tool that allows the teacher 11 to edit the teaching program for the robot 12 generated by the teaching program generation unit 106, for example by making corrections and additions, through a graphical user interface described later.
- the teaching program execution unit 108 sequentially interprets the commands described in the teaching program for the robot 12 generated by the teaching program generation unit 106, or edited by the teaching program editing unit 107, outputs joint drive commands for executing the motions, drive commands for the hands provided at the tips of the robot 12, and the like, and controls each joint axis, the hands, and so on of the robot 12.
- FIG. 2 shows an example of the layout of the cameras 201 to 204 connected to the teaching pose measuring unit 101 shown in FIG.
- the cameras 201 to 204 are arranged on the opposite side of the workbench 10 from the teacher 11, who works facing the workbench 10, in order to measure the poses of the grasped objects (such as the micropipette 310 and the test tube 320 shown in FIG. 1) held by the teacher 11.
- the fields of view 210 of the cameras 201 to 204 are set so as to overlap one another on the workbench 10, so that the entire work area on the workbench 10 is covered and the poses of the work tools and work objects serving as grasped objects are imaged.
- a workbench coordinate system 2100 (ΣW in the drawing) is set on the workbench 10 as the reference coordinate system for motion capture, and the poses of the grasped objects (work tools and work objects) measured by motion capture are expressed as poses in the workbench coordinate system 2100.
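Expressing a marker pose measured in a camera frame as a pose in the workbench coordinate system 2100 amounts to composing homogeneous transforms. A minimal sketch, assuming the camera-to-workbench transform is known from a calibration step that is not described in this text:

```python
def mat_mul(a, b):
    """Product of two 4x4 homogeneous transform matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def to_workbench(T_workbench_camera, T_camera_marker):
    """Express a marker pose measured in the camera frame in the
    workbench coordinate system (Sigma-W):

        T_W_marker = T_W_camera @ T_camera_marker

    Both arguments are 4x4 homogeneous transforms (rotation + translation).
    """
    return mat_mul(T_workbench_camera, T_camera_marker)
```

With several cameras, each camera's measurement would be mapped through its own calibrated transform so that all teaching poses end up in the single reference frame ΣW.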
- FIG. 3(a) shows the right marker plate 311 attached to the micropipette 310, and FIG. 3(b) shows the left marker plate 321 attached to the test tube 320. The micropipette 310 has a suction/discharge button 313.
- the micropipette 310 is equipped with an attachment 312 for attaching a right marker plate 311 .
- an attachment 322 for attaching a left marker plate 321 is attached to the test tube 320, as shown in FIG. 3(b).
- the right marker plate 311 is attached so that one axis (for example, the X axis) of the right marker plate coordinate system 3100 (ΣRM in the drawing) set in the right marker plate 311 is substantially aligned with the vertical direction of the micropipette 310.
- similarly, the left marker plate 321 is attached so that one axis (for example, the X axis) of the left marker plate coordinate system set in the left marker plate 321 substantially matches the vertical direction of the test tube 320.
- a right marker plate coordinate system 3100 is defined as a coordinate system whose origin is the center of the right marker plate 311 when registering the arrangement pattern of the reflective markers 311a to 311d.
- the reflective markers 321a to 321d shown in FIG. 3(b) are arranged asymmetrically in the horizontal and vertical directions on the substrate of the left marker plate 321.
- because each plate has a unique arrangement pattern, the teaching pose measurement unit 101 can easily distinguish the right marker plate 311 from the left marker plate 321 based on these patterns.
- it is also possible to make the size and color of the reflective markers 311a to 311d in FIG. 3(a) different from those of the reflective markers 321a to 321d in FIG. 3(b), and to change the number of reflective markers on the right marker plate 311 in FIG. 3(a) and the left marker plate 321 in FIG. 3(b).
- FIG. 4 shows the teaching glove 410 incorporating the pressure-sensitive switches 411 to 416 used as detection devices for the detection units 102 to 105 shown in FIG. 1.
- FIG. 4(a) shows an example of the arrangement of the pressure-sensitive switches 411 to 416 in the teaching glove 410, and FIG. 4(b) shows the glove being worn while gripping the micropipette 310 and pressing its suction/discharge button 313.
- the pressure-sensitive switches 411 and 412 are arranged at the thumb of the teacher 11 and can detect pressing of the suction/discharge button 313 of the micropipette 310, while the pressure-sensitive switches 413 to 416 can detect that the teacher 11 has gripped the micropipette 310.
- the signals from the pressure-sensitive switches 411 to 416 are input to the signal processing unit 417 (signal 4120), and the results of noise filtering and threshold determination processing are output to the computer 100 (signal 4170).
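The noise filtering and threshold determination performed by the signal processing unit 417 could be sketched as a moving average followed by a comparison. The window size and threshold below are illustrative assumptions, not values stated in the text:

```python
def detect_press(samples, threshold=0.5, window=3):
    """Simple moving-average noise filtering followed by threshold
    determination on a pressure-sensitive switch signal.

    samples: raw sensor readings (signal 4120, normalized 0..1).
    Returns a per-sample list of True/False press decisions
    (corresponding to the processed signal 4170).
    """
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        avg = sum(samples[lo:i + 1]) / (i + 1 - lo)  # moving average
        out.append(avg > threshold)                  # threshold test
    return out
```

The averaging suppresses single-sample spikes, so a press is only reported once the filtered value crosses the threshold.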
- FIG. 5 is a diagram showing an example using the foot switch 501 as a detection device for the detection units 102 to 105 shown in FIG. 1.
- a plurality of foot switches 501 are provided. For example, when the teacher 11 grips the micropipette 310 or operates the suction/discharge button 313 during work teaching, pressing the foot switch 501 assigned to that gripping motion or function operation at the corresponding timing signals that the motion or operation has been performed.
- the microphone 502 picks up the voice of the teacher 11 during work teaching; by uttering the phrase assigned to a gripping motion or function operation, the teacher signals that the motion or operation has been performed.
- the teaching glove 410 shown in FIG. 4 and the foot switch 501 and microphone 502 shown in FIG. 5 can each be used as detection devices connected to the detection units 102 to 105 shown in FIG. 1, and these devices can also be used in combination according to the content of the work teaching, the work environment, and so on.
- FIG. 6 is a flow chart showing the procedure of dispensing the reagent into the test tube 320 using the micropipette 310 .
- the teacher 11 starts work (S601)
- the left hand moves to the gripping position of the test tube 320 placed on the test tube stand (S602), and grips the test tube 320 (S603).
- here, it is confirmed that the test tube 320 is normally gripped (S604).
- the right hand is moved to the gripping position of the micropipette 310 placed on the pipette stand (S605), and the micropipette 310 is gripped (S606).
- here, it is confirmed that the micropipette 310 is normally gripped (S607).
- the left-hand test tube 320 is moved to the standby position (S608).
- the micropipette 310 on the right hand is moved to the tip attachment position, and the tip is attached (S609). Here, it is confirmed visually that the chip is normally attached (S610).
- the micropipette 310 is moved to the reagent aspiration position within the reagent bottle (S611), and the aspiration/discharge button 313 of the micropipette 310 is pressed to aspirate the reagent into the chip (S612).
- here, it is confirmed that the reagent has been normally aspirated into the chip (S613).
- next, the tip of the micropipette 310 is moved to the discharge position in the test tube 320 (S614), and the suction/discharge button 313 of the micropipette 310 is pressed to discharge the reagent in the tip into the test tube 320 (S615).
- here, it is confirmed that the reagent has been discharged into the test tube 320 normally (S616). Note that the subsequent steps of the work are omitted in FIG. 6.
- the above shows the work procedure performed by the teacher 11, but the robot 12 also follows exactly the same procedure using the micropipette 310' and the test tube 320'.
- in the case of the robot 12, the work state is confirmed by the image recognition function of the cameras connected to the robot 12. Note that such a work state can also be confirmed by means other than camera image recognition: the gripping state of the test tube 320' can likewise be checked, and the state of reagent aspiration and discharge can be checked by detecting the liquid level with a photoelectric sensor.
- FIG. 7 is a diagram schematically showing an example of the movement path of the micropipette 310 held in the right hand when the teacher 11 performs the dispensing work using the micropipette 310 on the workbench 10, and of how this path is divided into partial sequences of teaching poses.
- the right hand is moved to the gripping position (represented as P1 in the figure) of the micropipette 310 placed on the pipette stand 701, and the micropipette 310 is gripped here (represented as Grasp in the figure).
- here, it is confirmed that the micropipette 310 is gripped normally (indicated as Check 1 in the figure).
- the gripping motion of the micropipette 310 is detected by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410 shown in FIG. 4, by pressing the foot switch 501 shown in FIG. 5, or by voice input through the microphone 502 (for example, uttering "Grasp").
- the confirmation of the gripping state of the micropipette 310 is likewise detected by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Check grasp pipet").
- the movement path 710 is composed of a sequence of points P1, P2, .... These points are time-series teaching-pose data, measured at the motion-capture sampling interval, obtained by measuring in three-dimensional space the movement of the micropipette 310 held by the teacher 11 while the teacher 11 demonstrates the work. The point-sequence data forming the movement path 710 is a partial sequence of teaching poses.
- positioning at the positioning point Pi is detected by the motion stopping at Pi for a certain period of time, by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Stop here").
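The first of these detection methods, a dwell in the measured motion, can be sketched as follows. The 50 ms sample period matches the motion-capture interval described later; the dwell time and distance threshold are illustrative assumptions, not values given in this disclosure.

```python
import math

def detect_positioning(samples, dwell_ms=500, period_ms=50, eps=0.002):
    """Return the sample index at which the grasped object is considered
    positioned: its position stays within `eps` of a candidate point for
    at least `dwell_ms`.  Returns None when no dwell is found."""
    need = dwell_ms // period_ms            # consecutive samples required
    run_start = 0                           # start of the current dwell candidate
    for i in range(1, len(samples)):
        dx = [a - b for a, b in zip(samples[i], samples[run_start])]
        if math.sqrt(sum(d * d for d in dx)) > eps:
            run_start = i                   # still moving: restart the dwell timer
        elif i - run_start + 1 >= need:
            return run_start                # stationary long enough: positioned
    return None
```

Applied to a pose stream, the returned index marks the point Pi at which the teaching pose sequence would be divided.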
- the operation of confirming the attached state of the tip 703 on the micropipette 310 is detected by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Check set tip").
- the micropipette 310 with the tip 703 attached is moved along the movement path 711 to the reagent aspiration position (Pj in the figure) in the reagent bottle 704 and positioned there.
- the aspiration/discharge button 313 of the micropipette 310 is pressed to aspirate the reagent 705 into the tip 703 (Aspirate in the figure). Furthermore, it is confirmed that the reagent 705 has been aspirated into the tip 703 normally (Check 3 in the figure).
- the movement path 711 is composed of the sequence of points Pi, Pi+1, ..., Pj from the movement start point Pi to the positioning point Pj. Positioning at Pj is likewise detected by the motion stopping for a certain period of time, by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502.
- pressing the aspiration/discharge button 313 of the micropipette 310 at the positioning point Pj, that is, the functional operation of the micropipette 310, is detected by pressing one of the pressure-sensitive switches 411 and 412 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Aspirate").
- the confirmation that the reagent 705 has been aspirated into the tip 703 normally is likewise detected by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Check aspirate").
- the micropipette 310 holding the aspirated reagent 705 is moved along the movement path 712 to the reagent discharge position (Pk in the figure) in the test tube 320 and positioned there (it is assumed that the test tube 320 is already gripped in the left hand and waiting at the standby position). Here, the aspiration/discharge button 313 of the micropipette 310 is pressed to discharge the reagent 705 into the test tube 320 (Dispense in the figure). Furthermore, it is confirmed that the reagent 705 has been discharged into the test tube 320 normally (Check 4 in the figure).
- the movement path 712 is composed of the sequence of points Pj, Pj+1, ..., Pk from the movement start point Pj to the positioning point Pk. Positioning at Pk is likewise detected by the motion stopping for a certain period of time, by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502.
- pressing the aspiration/discharge button 313 of the micropipette 310 at the positioning point Pk, that is, the functional operation of the micropipette 310, is detected by pressing one of the pressure-sensitive switches 411 and 412 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Dispense").
- the operation of confirming the discharge state into the test tube 320 is likewise detected by pressing any of the pressure-sensitive switches 413 to 416 of the teaching glove 410, by pressing the foot switch 501, or by voice input through the microphone 502 (for example, uttering "Check dispense").
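The repeated pattern above, the same event signalled by a glove switch, a foot switch, or an utterance, can be sketched as a small classifier. The voice phrases follow the examples in the text; the event names and the mapping itself are illustrative assumptions, not an API defined in this disclosure.

```python
# Voice phrases follow the examples in the text; the event names and the
# mapping itself are illustrative assumptions.
VOICE_EVENTS = {
    "grasp": "GRASP",
    "stop here": "POSITIONED",
    "aspirate": "FUNCTION_OP",
    "dispense": "FUNCTION_OP",
    "check grasp pipet": "CONFIRM",
    "check set tip": "CONFIRM",
    "check aspirate": "CONFIRM",
    "check dispense": "CONFIRM",
}

def classify_input(source, payload=None):
    """Map a raw detection-device signal to a teaching event.

    `source` is one of 'pressure_switch', 'foot_switch', 'microphone';
    `payload` is the switch number or the recognized utterance."""
    if source == "pressure_switch":
        # switches 411/412 signal functional operations; 413-416 serve
        # positioning, gripping, and confirmation (context-dependent)
        return "FUNCTION_OP" if payload in (411, 412) else "GRASP_OR_CONFIRM"
    if source == "foot_switch":
        return "GRASP_OR_CONFIRM"
    if source == "microphone":
        return VOICE_EVENTS.get(payload.lower(), "UNKNOWN")
    raise ValueError(f"unknown detection device: {source}")
```

Because the switches 413 to 416 and the foot switch are shared between gripping, positioning, and confirmation, the classifier returns an ambiguous label for them; disambiguation would come from the teaching context.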
- FIGS. 8 and 9 are flowcharts showing the processing procedure of the teaching program generation unit 106.
- in the teaching program generation unit 106, when the teacher 11 starts teaching by demonstrating the work (S801), the teaching pose measurement unit 101 measures the pose of the object grasped by the teacher 11 as a teaching pose (S802).
- the poses of the objects grasped by the right hand and the left hand of the teacher 11 are each measured as teaching poses.
- when the grasped-object positioning detection unit 102 detects positioning of the object grasped by either the right or the left hand (YES in S803), the time-series data (sequence) of teaching poses measured so far for that hand is divided off and stored as a partial sequence of teaching poses of the hand whose positioning was detected (S804). If positioning of the object grasped by either hand has not been detected (NO in S803), teaching pose measurement continues (S802).
- next, a command is generated (S805) that converts the partial sequence of teaching poses into a sequence of joint displacements of the robot 12, such that the pose of the object grasped by the robot 12 becomes the same as the teaching pose, and executes it.
- when the gripping motion detection unit 103 detects a gripping motion of the object by the hand whose positioning was detected (YES in S806), this is stored as a gripping motion of the grasped object in the teaching pose partial sequence (S807). On the other hand, when no gripping motion is detected (NO in S806), the process proceeds to S809.
- next, a command for performing the gripping motion of the grasped object is generated (S808). Further, when the functional operation detection unit 104 detects a functional operation of the grasped object by the hand whose positioning was detected (YES in S809), this is stored as a functional operation of the grasped object in the teaching pose partial sequence (S810). On the other hand, when no functional operation of the grasped object is detected (NO in S809), the process proceeds to S812.
- next, a command for performing the functional operation of the grasped object is generated (S811). Further, when the work state confirmation motion detection unit 105 detects a work state confirmation motion by the hand whose positioning was detected (YES in S812), this is stored as a work state confirmation motion in the teaching pose partial sequence (S813). On the other hand, when no confirmation motion is detected (NO in S812), the process proceeds to S815.
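The accumulation-and-split logic of S802 to S813 can be sketched as a fold over a stream of detection events. The event kinds and the segment dictionary below are illustrative names, not identifiers from this disclosure.

```python
def generate_teaching_segments(events):
    """Fold a stream of (kind, data) detection events into per-positioning
    segments, mirroring S802-S813: pose samples accumulate until a
    POSITIONED event splits them off, and grasp / function / confirm
    events attach to the segment just closed."""
    segments, current = [], []
    for kind, data in events:
        if kind == "POSE":                       # S802: one teaching pose sample
            current.append(data)
        elif kind == "POSITIONED":               # S803-S804: split the sequence
            segments.append({"poses": current, "grasp": False,
                             "function": None, "confirm": None})
            current = []
        elif kind == "GRASP" and segments:       # S806-S807
            segments[-1]["grasp"] = True
        elif kind == "FUNCTION" and segments:    # S809-S810
            segments[-1]["function"] = data
        elif kind == "CONFIRM" and segments:     # S812-S813
            segments[-1]["confirm"] = data
    return segments
```

Each resulting segment corresponds to one unit motion delimited by a positioning point, from which the per-step commands of the teaching program would be generated.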
- FIG. 10 is a diagram showing an example of the teaching pose partial sequence generated by the teaching program generation unit 106 and the robot joint displacement sequence data generated from this.
- FIG. 10(a) shows an example of teaching pose partial sequence data 1001, in which the pose of the object grasped by the teacher 11 is expressed as a sequence of three-dimensional positions and orientations (quaternions).
- FIG. 10(b) shows an example of joint displacement sequence data 1002 of the robot 12, which is represented here as a sequence of joint angles when the arm of the robot 12 has a 7-axis configuration.
- the teaching pose partial sequence data 1001 and the joint displacement sequence data 1002 are each managed as data for one hand of the teacher 11 and the robot 12, respectively. When the work is a two-handed operation, these data are generated for both the right hand and the left hand.
- the teaching pose partial sequence data 1001 and the joint displacement sequence data 1002 of the robot 12 are stored as time-series data in which the three-dimensional position and orientation data and the joint angle data at each time are arranged in chronological order. These data are given serial numbers (indicated by No. in the figure), and each serial number corresponds to a time.
- each time corresponds to one update at the sampling period of the motion capture described above; for example, the time advances in increments of 50 milliseconds.
- the position of a teaching pose is represented in a Cartesian coordinate system (X, Y, Z), but it can also be represented in another coordinate system, such as a cylindrical coordinate system.
- although the orientations of the teaching poses are represented by quaternions (Rx, Ry, Rz, Rw), they can also be represented by roll-pitch-yaw angles, Euler angles, and the like.
- FIG. 10(b) shows an example in which the arm of the robot 12 has a seven-axis configuration (the names of the axes are J1 to J7), but other axis configurations may be used.
- although the joint displacements of J1 to J7 are represented by angle data of rotary axes, joint displacements other than these, such as positions of translational axes, can also be represented.
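The two record types of FIG. 10 can be sketched as follows. The field names, units, and the serial-number-to-time rule starting at No. 1 are illustrative assumptions consistent with the description above.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TeachingPose:
    """One row of the teaching pose partial sequence data 1001 (FIG. 10(a))."""
    no: int          # serial number; one row per motion-capture tick
    x: float         # Cartesian position (units illustrative)
    y: float
    z: float
    rx: float        # orientation quaternion (Rx, Ry, Rz, Rw)
    ry: float
    rz: float
    rw: float

    def is_unit_quaternion(self, tol: float = 1e-6) -> bool:
        # a valid orientation quaternion should have norm 1
        return abs(self.rx**2 + self.ry**2 + self.rz**2 + self.rw**2 - 1.0) < tol

@dataclass
class JointSample:
    """One row of the joint displacement sequence data 1002 (FIG. 10(b))."""
    no: int
    joints: Tuple[float, ...]   # (J1 .. J7) joint angles for a 7-axis arm

def time_of(no: int, period_ms: int = 50) -> int:
    """Assuming serial numbers start at 1 and advance one 50 ms tick each."""
    return (no - 1) * period_ms
```

Matching rows of the two tables share a serial number, so a pose row and its converted joint row refer to the same instant.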
- FIG. 11 is a diagram showing an example of a robot teaching program generated by the teaching program generation unit 106 expressed in the form of a sequential function chart.
- first, an initial step s1 is generated, and the step name "Initial" is described in it.
- at the same time, a transition t101 on the output side of the initial step s1 is generated, and a comment "Ready" is described in it.
- a link connecting the initial step s1 and the transition t101 is generated. Note that the initial step s1 has no action to execute and the transition t101 on its output side has no transition condition, so when execution of the program is started, control shifts unconditionally to the next step.
- the initial step s1 is internally given the step number 1, and the transition t101 is given the transition number 101 (indicating the first transition on the output side of step s1). Assignment of step numbers and transition numbers is the same for subsequent steps and transitions.
- as described in the processing procedure of FIG. 8, when positioning of either hand of the teacher 11 is detected, the next step s2 is generated and the step name "Step 2" is described in it. At the same time, a transition t201 on the output side of step s2 is generated, and a comment "Succeeded" is described in it. Furthermore, a link is generated between step s2 and transition t201. Note that the step names and comments automatically described here can be corrected later by the teaching program editing unit 107.
- when step s2 is generated, as the action to be executed there, first a command is generated that converts the partial sequence of teaching poses of the hand whose positioning was detected (corresponding to FIG. 10(a)) into a sequence of joint displacements of the robot 12 (corresponding to FIG. 10(b)), such that the pose of the object grasped by the robot 12 becomes the same as the teaching pose, and executes it. At this time, if a gripping motion by the hand whose positioning was detected has been detected, a command for performing the gripping motion of the grasped object is generated. Further, if a work state confirmation motion by the hand whose positioning was detected has been detected, a command for confirming the work state is generated.
- as the transition condition of transition t201, when a work state confirmation operation was performed, a conditional expression judging the success or failure of the confirmation result is described. When no work state confirmation operation was performed, normal completion of the robot's joint displacement sequence, that is, of the motion up to the robot's positioning, is described as the transition condition of transition t201.
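The step/transition/link bookkeeping described above can be sketched as a small container. The class and its method names are assumptions for illustration; only the numbering convention (t101 as the first transition on the output side of step s1) and the unconditional initial transition come from the description.

```python
class SFC:
    """Minimal sequential-function-chart container (illustrative sketch):
    numbered steps with action command strings, numbered transitions with
    condition expressions, and links giving the step/transition wiring."""
    def __init__(self):
        self.steps, self.transitions, self.links = {}, {}, []

    def add_step(self, num, name, actions=()):
        self.steps[num] = {"name": name, "actions": list(actions)}

    def add_transition(self, num, comment, cond="True"):
        # numbering convention: t101 is the first transition on the
        # output side of step s1
        self.transitions[num] = {"comment": comment, "cond": cond}

    def link(self, src, dst):
        self.links.append((src, dst))

# the automatically generated chart head: an empty initial step whose
# output transition fires unconditionally
sfc = SFC()
sfc.add_step(1, "Initial")
sfc.add_transition(101, "Ready")
sfc.link(("step", 1), ("transition", 101))
```

Subsequent steps such as s2 would be appended in the same way, with their command strings as actions and their success conditions as transition expressions.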
- FIG. 12 shows an example of the sequential function chart shown in FIG. 11 edited by the teaching program editing unit 107.
- the step names are corrected to names representing the work content (an example of English notation is shown).
- steps corresponding to the steps shown in FIG. 11 are indicated by the same numbers and marked with * .
- the work contents of steps with the same step numbers in FIGS. 11 and 12 are the same. That is, FIG. 12 is a program representing the following work.
- the test tube 320' is gripped by the hand of the left arm 121 and the micropipette 310' by the hand of the right arm 122, and they are taken out from their respective stands (s2 * ).
- the test tube 320' gripped by the left arm 121 is moved to the standby position (s3 * ).
- the tip 703 in the tip box 702 is attached to the micropipette 310' gripped by the hand of the right arm 122 (s4 * ).
- the reagent 705 in the reagent bottle 704 is aspirated into the tip 703 of the micropipette 310' (s5 * ), the tip is inserted to the discharge position in the test tube 320' gripped by the hand of the left arm 121 (s6 * ), and the reagent 705 is discharged (s7 * ).
- the micropipette 310' is withdrawn from the test tube 320' (s8 * ) and the tip 703 is removed from the micropipette 310' (s9 * ).
- the test tube 320' gripped by the hand of the left arm 121 is shaken to stir the reagent 705 therein (s10 * ).
- the test tube 320' and the micropipette 310' are placed on their respective stands (s11 * ), completing the series of operations (s12 * ).
- FIG. 13 is a diagram showing an example in which the sequential function chart shown in FIG. 12 is further edited.
- in the step of attaching the tip 703 to the micropipette 310' (s4 * ), the attached state of the tip 703 is confirmed by camera image recognition; if the attachment of the tip 703 succeeds (t401), the process proceeds to the next step, and an error-handling process is added such that, if the attachment of the tip 703 fails (t402), the tip attachment operation is performed again (s13 * ).
- it is also possible to add, as a transition condition, for example a conditional expression that determines the type of the reagent 705, and, as an action, for example a command that executes an operation increasing the number of times the test tube 320' is shaken.
- FIG. 14 is a diagram showing an example of processing descriptions of steps and transitions in the sequential function charts shown in FIGS.
- FIG. 14(a) shows a command string 1401 described as an action of step s2 in FIGS. 11-13.
- "larm_move (“traj_larm_s2")" in the command string 1401 is a partial sequence of teaching poses of the left arm 121 of the robot 12 divided as the motion of step s2 (s2 * ) (automatically stored under the name "traj_larm_s2"). is executed).
- rarm_move("traj_rarm_s2" is a command to execute a partial sequence of teaching poses for the right arm 122 of the robot 12 (automatically stored under the name “traj_rarm_s2").
- "larm_grasp()" is a command for closing the hand of the left arm 121 of the robot 12 to grasp an object (here, the test tube 320').
- "rarm_grasp()" is a command for closing the hand of the right arm 122 of the robot 12 to grasp an object (here, the micropipette 310').
- "larm_check("get_tube")" is a command for confirming by image recognition whether the hand of the left arm 121 is gripping the test tube 320' normally.
- "rarm_check("get_pipet")" is a command for confirming by image recognition whether the hand of the right arm 122 is gripping the micropipette 310' normally.
- FIG. 14(b) shows a transition conditional expression 1402 described as processing of transition t201 on the output side of step s2 (s2 * ).
- this is a conditional expression for judging the success or failure of the work state confirmation result. Note that these conditional expressions are generated automatically from the contents of the command string 1401.
- FIG. 14(c) shows a command string 1403 described as the processing of step s4 (s4 * ), together with its transition conditional expression 1404. The processing contents are similar to those of the command string 1401 and the transition conditional expression 1402; "rarm_check("set_tip")" is a command for confirming whether the tip 703 is correctly attached to the micropipette 310' gripped by the hand of the right arm 122.
- FIG. 14(e) shows a command string 1405 described as the processing of step s6 (s6 * ), together with its transition conditional expression 1406.
- "rarm_move("traj_rarm_s6")" in the command string 1405 is a teaching pose partial sequence of the right arm 122 of the robot 12 divided as the motion of step s5 (s5 * ) (automatically stored under the name “traj_rarm_s6"). is executed).
- FIGS. 15(a) and 15(b) are diagrams showing an example of the display and input of the screens constituting the teaching program editing unit 107.
- the screen constituting the teaching program editing unit 107 may be provided in the computer 100, or may be provided at a location away from the computer 100 (not shown).
- FIG. 15(a) shows a screen 1501 for editing the processing of step s4 (s4 * ) of the sequential function charts shown in FIGS.
- a step edit window 1502 is displayed, in which the name 1502a of step s4 is edited (in the example in the figure, "Step 4" is rewritten to "Set tip") and the command string 1502b to be executed in this step s4 * is edited.
- FIG. 15(b) shows a screen 1503 for editing the transition conditional expression of transition t401.
- a transition edit window 1504 is displayed, where the comment 1504a of the transition t401 is edited and the conditional expression 1504b of the transition condition to be determined by this transition t401 is edited. (in this example, the automatically generated description is adopted as is).
- FIG. 16 is a flow chart showing the processing procedure of the teaching program execution unit 108.
- when the teaching program execution unit 108 starts executing the teaching program (S1601), first, the step number of the step following the initial step is obtained from the link connection relations of the sequential function chart that constitutes the teaching program, and is substituted into the internally held current step number (S1602). Next, the various commands described in the step indicated by the current step number are executed. That is, a command is executed that generates and executes a sequence of joint displacements of the robot 12 such that the pose of the object grasped by the hand of the robot 12 becomes the same as the teaching pose (S1603).
- next, a command for gripping the object to be grasped by the hand of the robot 12 is executed (S1604), and a command for operating a function of the object grasped by the hand of the robot 12 is executed (S1605). Further, a command for confirming the work state of the robot 12 is executed (S1606).
- when the conditional expression described as the transition condition holds at any transition on the output side of the current step (YES in S1607), the step number of the next step is obtained from the link connection relations of the sequential function chart and substituted into the current step number (S1608). If the transition condition does not hold at any transition on the output side of the current step (NO in S1607), the unit waits until one of them holds.
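The execution loop of S1601 to S1608 can be sketched as a walk over the chart's link relations. The chart representation below is an illustrative assumption, and unlike a real controller, which would block at S1607 until some transition condition becomes true, this sketch assumes one is already satisfied.

```python
def execute_sfc(chart, start, run_step, eval_cond):
    """Walk a sequential function chart as in S1601-S1608.

    `chart` maps a step number to the list of (transition_id, next_step)
    pairs on its output side; terminal steps map to an empty list."""
    visited = []
    current = start                                  # S1602: first real step
    while current in chart:
        run_step(current)                            # S1603-S1606: step commands
        visited.append(current)
        outs = chart[current]
        if not outs:
            break                                    # no output transition: done
        _tid, nxt = next((t, n) for t, n in outs if eval_cond(t))  # S1607
        current = nxt                                # S1608: advance step number
    return visited
```

With an error-handling branch such as t402 in FIG. 13, `eval_cond` would select the retry transition instead, sending execution back through the tip attachment step.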
- as described above, according to this embodiment, when converting human work into robot work, a teaching program that makes the robot execute the series of tasks can be generated simply by the teacher demonstrating the work as usual with the grasped objects used in the work; the program includes not only a description of the robot's movement paths but also the gripping of the objects to be grasped by the robot, the functional operation of those objects, and work state confirmation. Compared with the conventional approach of measuring the motion of the grasped object and then separately programming the operation contents and the work state confirmation processing, it becomes easier to teach the robot the work, and the development efficiency of teaching programs can be improved.
- the present invention is not limited to the above-described embodiments, and includes various modifications.
- the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
- part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- SYMBOLS: 1... Robot work teaching device, 100... Computer, 10, 10'... Workbench, 11... Teacher, 12... Robot, 201 to 204... Camera, 310, 310'... Micropipette, 320, 320'... Test tube, 311, 321... Marker plate, 210... Field of view of camera, 2100... Workbench coordinate system, 311a to 311d, 321a to 321d... Reflective marker, 312, 322... Attachment, 3100, 3200... Marker plate coordinate system, 410...
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
Description
Claims (15)
- A robot work teaching device for teaching a robot work performed by a teacher, the device comprising: a teaching pose measurement unit that measures teaching poses, which are the positions and orientations of an object grasped by the teacher; a positioning detection unit that detects that the object being moved by the teacher has been positioned; a gripping motion detection unit that detects that the object has been grasped by the teacher; a functional operation detection unit that detects that the teacher has operated a function possessed by the object; a work state confirmation motion detection unit that detects that the teacher has confirmed the work state of the object; a teaching program generation unit that receives signals from the teaching pose measurement unit, the positioning detection unit, the gripping motion detection unit, the functional operation detection unit, and the work state confirmation motion detection unit and generates a teaching program for the robot divided for each motion of the teacher; and a teaching program execution unit that executes the teaching program generated by the teaching program generation unit.
- The robot work teaching device according to claim 1, further comprising a display unit, wherein the display unit has: a first area that displays the teaching program of the robot generated by the teaching program generation unit in sequential function chart form; a second area that displays a command string executed in a step selected from the teaching program displayed in sequential function chart form in the first area; and a third area that displays a conditional expression judged at a transition selected from the teaching program displayed in sequential function chart form in the first area.
- The robot work teaching device according to claim 1, wherein the detection by the positioning detection unit that the object being moved by the teacher has been positioned, the detection by the gripping motion detection unit that the object has been grasped by the teacher, the detection by the functional operation detection unit that the teacher has operated a function possessed by the object, and the detection by the work state confirmation motion detection unit that the teacher has confirmed the work state of the object are performed in synchronization with the measurement of the teaching poses of the teacher by the teaching pose measurement unit.
- The robot work teaching device according to claim 1, further comprising: an imaging unit that images the motion of the teacher from a plurality of directions; an operation detection unit that detects the teacher's positioning of the object, gripping motion of the object, or operation of a function possessed by the object; and an input unit with which the teacher inputs that positioning of the object, a gripping motion of the object, or an operation of a function possessed by the object has been performed, wherein the teaching pose measurement unit, the positioning detection unit, the gripping motion detection unit, the functional operation detection unit, and the work state confirmation motion detection unit receive and process signals from the imaging unit, the operation detection unit, or the input unit and output the processed signals to the teaching program generation unit.
- The robot work teaching device according to claim 4, wherein the teaching program generation unit generates the teaching program of the robot divided at each timing at which it receives a signal from the positioning detection unit, the operation detection unit, or the input unit.
- The robot work teaching device according to claim 4, wherein the positioning detection unit detects that the object being moved by the teacher has been positioned either on the basis that the motion of the teacher imaged by the imaging unit does not change for a certain period of time, or on receiving a signal detected by the operation detection unit or a signal input by the teacher to the input unit.
- The robot work teaching device according to claim 4, wherein the operation detection unit includes pressure-sensitive switches worn on the fingers of the teacher, the input unit comprises either a foot switch operated at the teacher's feet or a microphone that inputs the teacher's voice, and the gripping motion detection unit detects that a gripping motion of the object has been performed by the teacher on receiving an output signal from the pressure-sensitive switches, the foot switch, or the microphone.
- The robot work teaching device according to claim 4, wherein the operation detection unit includes pressure-sensitive switches worn on the fingers of the teacher, the input unit comprises either a foot switch operated at the teacher's feet or a microphone that inputs the teacher's voice, and the functional operation detection unit detects that an operation of a function possessed by the object has been performed by the teacher on receiving an output signal from the pressure-sensitive switches, the foot switch, or the microphone.
- The robot work teaching device according to claim 4, wherein the operation detection unit includes pressure-sensitive switches worn on the fingers of the teacher, the input unit comprises either a foot switch operated at the teacher's feet or a microphone that inputs the teacher's voice, and the work state confirmation motion detection unit detects that confirmation of the work state of the object has been performed by the teacher on receiving an output signal from the pressure-sensitive switches, the foot switch, or the microphone.
- The robot work teaching device according to claim 1, wherein the teaching program of the robot divided for each motion of the teacher generated by the teaching program generation unit is divided into unit motions delimited by positioning points of the grasped object, and is a teaching program described so as to generate motions of the robot such that the teaching poses included in each unit motion coincide with the position and orientation of an object of the same kind as the object grasped by the robot.
- A robot work teaching method for teaching a robot work performed by a teacher, the method comprising: measuring, with a teaching pose measurement unit, teaching poses, which are the positions and orientations of an object grasped by the teacher; detecting, with a positioning detection unit and in synchronization with the measurement of the teaching poses, that the object being moved by the teacher has been positioned; detecting, with a gripping motion detection unit and in synchronization with the measurement of the teaching poses, that the object has been grasped by the teacher; detecting, with a functional operation detection unit and in synchronization with the measurement of the teaching poses, that the teacher has operated a function possessed by the object; detecting, with a work state confirmation motion detection unit and in synchronization with the measurement of the teaching poses, that the teacher has confirmed the work state of the object; generating, with a teaching program generation unit that receives signals from the teaching pose measurement unit, the positioning detection unit, the gripping motion detection unit, the functional operation detection unit, and the work state confirmation motion detection unit, a teaching program for the robot divided for each motion of the teacher; and executing, with a teaching program execution unit, the teaching program generated by the teaching program generation unit.
- The robot work teaching method according to claim 11, wherein, in the step of generating the teaching program, the sequence of teaching poses is divided at the timing at which the positioning detection unit detects that the object being moved by the teacher has been positioned, the divided sequence is stored as a partial sequence of teaching poses, and the signal with which the gripping motion detection unit detected that the object was grasped by the teacher, the signal with which the functional operation detection unit detected that the teacher operated a function possessed by the object, and the signal with which the work state confirmation motion detection unit detected that the teacher confirmed the work state of the object are stored in association with the stored partial sequence of teaching poses.
- The robot work teaching method according to claim 12, wherein, in the step of generating the teaching program, a teaching program for the robot is generated that is described so as to convert the partial sequence into a sequence of joint displacements of the robot and execute it, such that the teaching poses included in the partial sequence of teaching poses coincide with the position and orientation of an object of the same kind as the object grasped by the robot.
- The robot work teaching method according to claim 12, wherein, in the step of generating the teaching program, the generated teaching program of the robot is described in the form of a sequential function chart; a command that converts the partial sequence of teaching poses into a sequence of joint displacements of the robot and executes it, a command for the motion the robot performs based on the gripping motion of the object by the teacher detected by the gripping motion detection unit, a command for the motion the robot performs based on the operation of a function possessed by the object by the teacher detected by the functional operation detection unit, and a command for the motion the robot performs based on the confirmation of the work state of the object by the teacher detected by the work state confirmation motion detection unit are created; the process of executing the created commands is defined as a step of the sequential function chart; and the sequential function chart is generated by linking these steps in the order of the partial sequences of teaching poses.
- The robot work teaching method according to claim 14, wherein the success or failure of the confirmation result of the work state is described at a transition of the sequential function chart as the transition condition between the steps.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112021006786.9T DE112021006786T5 (de) | 2021-03-15 | 2021-03-15 | Tätigkeitsanlerneinrichtung und tätigkeitsanlernverfahren für roboter |
PCT/JP2021/010391 WO2022195680A1 (ja) | 2021-03-15 | 2021-03-15 | Robot work teaching device and work teaching method |
JP2023506404A JPWO2022195680A1 (ja) | 2021-03-15 | 2021-03-15 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/010391 WO2022195680A1 (ja) | 2021-03-15 | 2021-03-15 | Robot work teaching device and work teaching method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022195680A1 true WO2022195680A1 (ja) | 2022-09-22 |
Family
ID=83320073
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/010391 WO2022195680A1 (ja) | 2021-03-15 | 2021-03-15 | Robot work teaching device and work teaching method |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2022195680A1 (ja) |
DE (1) | DE112021006786T5 (ja) |
WO (1) | WO2022195680A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN116087968A (zh) * | 2023-01-20 | 2023-05-09 | 松下神视电子(苏州)有限公司 | Sensor
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2012228757A (ja) * | 2011-04-27 | 2012-11-22 | Seiko Epson Corp | Robot teaching method, robot teaching device, and program |
- JP2015054378A (ja) * | 2013-09-13 | 2015-03-23 | セイコーエプソン株式会社 | Information processing device, robot, scenario information generation method, and program |
- JP2015071206A (ja) * | 2013-10-03 | 2015-04-16 | セイコーエプソン株式会社 | Control device, robot, teaching data generation method, and program |
- JP2018167334A (ja) * | 2017-03-29 | 2018-11-01 | セイコーエプソン株式会社 | Teaching device and teaching method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP5549749B1 (ja) | 2013-01-16 | 2014-07-16 | 株式会社安川電機 | Robot teaching system, method for generating robot teaching program, and teaching tool |
- WO2019064752A1 (ja) | 2017-09-28 | 2019-04-04 | 日本電産株式会社 | Robot teaching system, robot teaching method, control device, and computer program |
- 2021-03-15 DE DE112021006786.9T patent/DE112021006786T5/de active Pending
- 2021-03-15 WO PCT/JP2021/010391 patent/WO2022195680A1/ja active Application Filing
- 2021-03-15 JP JP2023506404A patent/JPWO2022195680A1/ja active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN116087968A (zh) * | 2023-01-20 | 2023-05-09 | 松下神视电子(苏州)有限公司 | Sensor |
- CN116087968B (zh) * | 2023-01-20 | 2024-04-30 | 松下神视电子(苏州)有限公司 | Sensor |
Also Published As
Publication number | Publication date |
---|---|
DE112021006786T5 (de) | 2023-11-09 |
JPWO2022195680A1 (ja) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11573140B2 (en) | Force/torque sensor, apparatus and method for robot teaching and operation | |
- JP7467041B2 (ja) | Information processing device, information processing method, and system | |
- JP6880982B2 (ja) | Control device and robot system | |
- CN113056351B (zh) | External input device, robot system, control method therefor, and recording medium | |
EP3272473B1 (en) | Teaching device and method for generating control information | |
EP2923806A1 (en) | Robot control device, robot, robotic system, teaching method, and program | |
US20180178388A1 (en) | Control apparatus, robot and robot system | |
- WO2011065035A1 (ja) | Method for creating robot teaching data and robot teaching system | |
EP2460628A2 (en) | Robot apparatus and gripping method for use in robot apparatus | |
- WO2011065034A1 (ja) | Method for controlling robot motion and robot system | |
US20180178389A1 (en) | Control apparatus, robot and robot system | |
- JP6902369B2 (ja) | Presentation device, presentation method and program, and work system | |
- JP2018167334A (ja) | Teaching device and teaching method | |
- JP7049069B2 (ja) | Robot system and control method for robot system | |
- CN110262664A (zh) | Intelligent interactive glove with cognitive capability | |
- WO2022195680A1 (ja) | Robot work teaching device and work teaching method | |
- JP2023506050A (ja) | Hand-held device, system, and method for training at least one movement and at least one activity of a machine | |
- CN115338855A (zh) | Dual-arm robot assembly system | |
- JP6625266B1 (ja) | Robot control device | |
- JP7452619B2 (ja) | Control device, control method, and program | |
- JP7483420B2 (ja) | Robot system, control device, information processing device, control method, information processing method, article manufacturing method, program, and recording medium | |
- WO2017085811A1 (ja) | Teaching device and method for generating control information | |
- WO2016151667A1 (ja) | Teaching device and method for generating control information | |
- JP7454046B2 (ja) | Robot teaching device and work teaching method | |
- JP2022060003A (ja) | Information processing device, control method for information processing device, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21931435 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023506404 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18280798 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112021006786 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21931435 Country of ref document: EP Kind code of ref document: A1 |