US20140074288A1 - Pickup device capable of determining holding position and posture of robot based on selection condition - Google Patents
- Publication number: US20140074288A1 (application US 14/025,427)
- Authority: US (United States)
- Prior art keywords: posture, robot, holding position, target object, holding
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B25J9/1612 — Programme controls characterised by the hand, wrist, grip control
- B25J9/1679 — Programme controls characterised by the tasks executed
- G05B19/4083 — Adapting programme, configuration (numerical control characterised by data handling or data format)
- G05B2219/37555 — Camera detects orientation, position workpiece, points of workpiece
- G05B2219/39476 — Orient hand relative to object
- G05B2219/40053 — Pick 3-D object from pile of objects
Definitions
- the present invention relates to a pickup device for picking up a target object by controlling a robot based on a result of measurement of positions and postures of objects by using a sensor.
- a known pickup device for picking up an object by a robot is designed to have a predetermined reference position and posture relative to an object to be held, in which the robot is able to hold the object. Based on a position and posture of the object measured by a sensor and on the reference position and posture, a position and posture of the robot are calculated in order to hold the object. In such a pickup device, an object in any position and posture can be picked up by the robot.
- JP-B-3782679 discloses a related art that involves, in order to calculate a position and posture of a robot for picking up an object, predetermining a range of positions and postures of the robot relative to an object, in which the robot is able to grasp the object, and calculating a position and posture of the robot in which a tool of the robot does not interfere with a storage box for accommodating objects, so as to ensure that there is no interference between the storage box and the tool.
- JP-A-2012-055999 discloses a related art that involves measuring positions and postures of a plurality of objects by a sensor, calculating priority as an indicator to reduce cycle time for picking up the respective objects or as an indicator to stably convey the objects, and determining a target object to be picked up according to the priority, in order to reduce cycle time of a pickup process or to provide stable conveyance of the objects.
- a tool attached to the tip or arm of the robot, or the robot body itself, may assume a posture which tends to cause interference with an obstacle in the periphery of the robot.
- JP-B-3782679 discloses a related art in which candidates for a position and posture of the robot which fall within a range of positions and postures of the robot set by an operator are calculated in a predetermined order, and the candidates are sequentially subjected to judgment as to whether or not they cause interference. When it is judged that there is no interference, the position and posture of the robot are then confirmed. Thus, even if there is potentially another position and posture within the predetermined range which may be more desirable in terms of reducing time for movement of the robot or the like, such a position and posture are not always selected. Therefore, there is still room for improvement when it comes to reducing cycle time of the pickup process.
- JP-A-2012-055999 discloses a related art in which priority indicative of which object should be picked up is calculated for the individual objects, and an object is selected according to the priority. However, if objects are picked up successively from a pile of objects, as the pickup process proceeds, more and more objects with lower priority are left behind. If this is the case, an expected effect such as reduction of cycle time and stabilization of a conveyance process can no longer be achieved. Accordingly, there is a problem that the related art disclosed in JP-A-2012-055999 cannot constantly produce an expected result.
- a pickup device for picking up a target object from a plurality of objects, comprising: a robot equipped with a tool adapted to hold the target object; a sensor for measuring positions and postures of the plurality of objects; a reference holding position and posture storing unit for storing a reference holding position and posture which serve as a reference for a position and posture of the robot relative to the target object when the robot holds the target object by the tool; a holding position and posture modification range storing unit for storing a holding position and posture modification range which corresponds to a range of modification which can be applied to the reference holding position and posture so as to hold the target object by the tool; a holding position and posture calculating unit for calculating a holding position and posture of the robot in which the robot can hold the target object by the tool, based on the position and posture of the target object measured by the sensor and on the reference holding position and posture; a selection condition storing unit for storing at least one selection condition, based on which priority of the holding position and posture of the robot can be determined
- FIG. 1 is a schematic view illustrating an overall configuration of a pickup device according to a first embodiment of the present invention
- FIG. 2 is a perspective view illustrating a shape of an object picked up by the pickup device according to the first embodiment
- FIG. 3 is a perspective view illustrating a positional relationship between a target object and a hand when a robot is in a holding position and posture in the pickup device according to the first embodiment
- FIG. 4 shows an example in which time required to move the robot to a holding position and posture increases in the pickup device according to the first embodiment
- FIG. 5 shows an example in which time required to move the robot to a holding position and posture decreases in the pickup device according to the first embodiment
- FIG. 6A shows a process for calculating a holding position and posture of the robot based on a selection condition in the pickup device according to the first embodiment
- FIG. 6B shows a process for calculating a holding position and posture of the robot based on a selection condition in the pickup device according to the first embodiment
- FIG. 7 shows a flow chart of an exemplary process performed by the pickup device according to the first embodiment
- FIG. 8A is a schematic view illustrating a configuration of a robot controller of a pickup device according to a second embodiment of the present invention.
- FIG. 8B shows a flow chart of an exemplary process performed by the pickup device according to the second embodiment
- FIG. 9 shows a process for calculating a holding position and posture of a robot based on a selection condition in a pickup device according to a third embodiment of the present invention.
- FIG. 10 illustrates a holding position and posture modification range in a pickup device according to a fourth embodiment of the present invention.
- FIG. 11 illustrates a selection condition in the pickup device according to the fourth embodiment.
- FIG. 1 is a schematic view illustrating a configuration of a pickup device 10 for picking up an object according to a first embodiment of the present invention.
- the pickup device 10 includes a camera 11 , a robot 12 , and a robot controller 13 connected to the robot 12 so as to control the robot 12 .
- the robot 12 has a hand 14 attached to a tip 12 a of the robot 12 .
- the robot controller 13 has a hardware configuration including a processor, a ROM, a RAM, a non-volatile RAM, an input means operated by an operator, a display device for displaying various information, an I/O interface, and a controller for controlling a servo motor provided at each joint of the robot 12 .
- the robot controller 13 also has a configuration including a reference holding position and posture storing unit, a holding position and posture modification range storing unit, a holding position and posture calculating unit, a selection condition storing unit, and a holding position and posture selecting unit.
- a plurality of objects 16 are piled up within a container 15 which opens in its upper portion.
- the pickup device 10 is designed to measure positions and postures of objects 16 by taking an image thereof by using the camera 11 , and to calculate, by the robot controller 13 , a position and posture of the robot 12 corresponding to the position and posture of a target object 16 a which should be picked up.
- the pickup device 10 is also designed to move the robot 12 to the calculated position and posture, and to pick up the target object 16 a with the hand 14 holding the target object 16 a.
- although the camera 11 is used as a sensor for measuring positions and postures of objects 16 in a three-dimensional space according to the first embodiment, any other type of sensor capable of measuring the positions and postures of the objects 16 may be used instead.
- a position and posture of an object 16 can be measured by detecting four points on the object 16 on the same plane, whose relative positions are predetermined, from an image taken by the camera 11 .
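The patent does not specify the pose-estimation algorithm. As an illustrative sketch, assuming the sensor ultimately yields 3-D coordinates for the four detected points, the object's position and posture can be recovered as the rigid transform that best maps the predetermined model points onto the measured points (the Kabsch algorithm; `pose_from_points` is a hypothetical helper name, not from the patent):

```python
import numpy as np

def pose_from_points(model_pts, measured_pts):
    """Recover the rigid transform (R, t) that maps the predetermined model
    points onto the measured points, via the Kabsch least-squares fit."""
    model_pts = np.asarray(model_pts, dtype=float)
    measured_pts = np.asarray(measured_pts, dtype=float)
    cm = model_pts.mean(axis=0)           # centroid of the model points
    cs = measured_pts.mean(axis=0)        # centroid of the measured points
    H = (model_pts - cm).T @ (measured_pts - cs)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R, t
```

Four coplanar, non-collinear points are enough to determine the transform uniquely, which matches the patent's requirement that the points lie on the same plane with a predetermined relative relationship.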
- the camera 11 may also be attached to the tip 12 a of the robot 12 .
- FIGS. 2 and 3 show the hand 14 and the object 16 according to the first embodiment.
- the object 16 has a substantially cylindrical shape, part of which is cut out, with a circular hole 21 at its center.
- the hand 14 is a tool provided at the tip 12 a of the robot 12 .
- the hand 14 includes a flange 20 attached to the tip 12 a , and two claws 18 extending parallel to each other from the flange 20 .
- the claws 18 are configured to have a gap therebetween which can be adjusted by a chuck.
- the hand 14 is intended to hold an object 16 by inserting tip portions 18 a of the claws 18 into the hole 21 of the object 16 , and then widening the gap between the claws 18 , so as to press outwardly from the inside of the object 16 .
- the object 16 is illustrated only by way of example; any other type of object may be used, as long as it has a shape that can be held by a tool attached to the tip 12 a of the robot 12 .
- the hand 14 may also have other configurations capable of holding the object 16 , not limited to the chuck as illustrated, but including any known holding means such as a suction nozzle, an attractive magnet or an adhesive pad.
- the position and posture of the hand 14 vary depending on the position and posture of the robot 12 controlled by the robot controller 13 .
- a position and posture of the robot 12 at the time of holding a target object 16 a which should be picked up from the container 15 by using the hand 14 (hereinafter referred to as a “holding position and posture”) can be calculated as follows.
- at a preparation stage, an object 16 is placed in a position and posture which serve as a reference.
- the position and posture of the object 16 are measured by the camera 11 , and the measured position and posture are stored as a reference object position and posture Wn.
- the robot 12 is then moved to a position and posture where the robot 12 can hold the object 16 , and the position and posture of the robot 12 are stored as a reference holding position and posture Rn, which serve as a reference of the holding position and posture of the robot 12 .
- the reference object position and posture Wn and the reference holding position and posture Rn are stored in the robot controller 13 .
- the robot controller 13 includes a reference object position and posture storing unit designed to store the reference object position and posture Wn, and a reference holding position and posture storing unit designed to store the reference holding position and posture Rn.
- at a pickup stage, a position and posture Wa of the object 16 are measured by the camera 11 , and then a holding position and posture Ra of the robot 12 relative to the object 16 are calculated from Wa, the reference object position and posture Wn, and the reference holding position and posture Rn.
- in this way, the holding position and posture of the robot relative to the object 16 in any position and posture can be calculated.
- Calculation of the holding position and posture can be performed by the robot controller 13 .
- the robot controller 13 includes a holding position and posture calculating unit.
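The composition performed by the holding position and posture calculating unit can be sketched as follows. The extract does not reproduce the patent's own formula, so the convention Ra = Wa · Wn⁻¹ · Rn (the hand keeps the same pose relative to the object as at the reference) is an assumption, and the function names are illustrative:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def holding_pose(Wa, Wn, Rn):
    """Holding position/posture Ra for an object measured at Wa, given the
    reference object pose Wn and reference holding pose Rn (all 4x4).
    inv(Wn) @ Rn is the hand pose expressed in the object's frame; applying
    it to the measured object pose Wa yields the new holding pose."""
    return Wa @ np.linalg.inv(Wn) @ Rn
```

A sanity check of the convention: if the object has not moved (Wa = Wn), the result is exactly the taught reference pose Rn; if the object has moved by some transform T, the holding pose moves by the same T.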
- the holding position and posture modification range is a range in which the reference holding position and posture can be modified.
- the robot 12 can hold the target object 16 a by the hand 14 , not only in the reference holding position and posture relative to the target object 16 a , but also in any position and posture within the holding position and posture modification range.
- the target object 16 a is held accordingly by teaching the reference holding position and posture of the robot 12 so that the hand 14 is in a position and posture relative to the target object 16 a as illustrated in FIG. 3 .
- the target object 16 a can be also held even when the hand 14 is rotated around a central axis 31 of the hole 21 of the target object 16 a .
- the holding position and posture modification range is set so that the hand 14 can rotate around the central axis 31 of the hole 21 .
- Such a holding position and posture modification range is stored in the holding position and posture modification range storing unit of the robot controller 13 .
- the holding position and posture modification range depends on the shape of the object 16 and of the hand 14 . Therefore, the holding position and posture modification range is not limited to the illustrated example of rotational movement around a particular axis, but may also include translational movement, rotational movement around a plurality of axes different from each other, or a combination thereof.
- FIG. 4 shows an example in which time required to move a robot to a holding position and posture increases in the pickup device 10 .
- the position and posture of the robot 12 shown in FIG. 1 are considered to be a waiting position and posture before the robot 12 is positioned in a holding position and posture.
- FIG. 4 shows the robot 12 after the robot 12 is moved to the holding position and posture.
- when the direction of the hand 14 changes significantly during the movement of the robot 12 from the waiting position and posture to the holding position and posture, the tip 12 a of the robot 12 must be rotated around the axis of the wrist to a great extent.
- because the tip portions 18 a of the claws 18 of the hand 14 are offset from the tip 12 a of the robot 12 , the tip 12 a is located distant from the tip portions 18 a . The hand 14 therefore cannot be moved quickly, and more time is required to move the robot 12 to the holding position and posture.
- the holding position and posture of the robot 12 should be changed from a posture shown in FIG. 4 to a posture shown in FIG. 5 by rotating the hand 14 around the central axis 31 of the hole 21 of the target object 16 a .
- FIG. 5 shows the posture of the robot 12 obtained by rotating the robot 12 shown in FIG. 4 around the central axis 31 of the hole 21 of the target object 16 a by 180 degrees.
- a holding position and posture are selected according to priority in order to ensure that the holding position and posture are approximate to the waiting position and posture. Therefore, time required to move the robot 12 from the waiting position and posture to the holding position and posture can be reduced.
- the selection condition gives priority to the holding position and posture having a posture approximate to that of the waiting position and posture.
- free rotation around the central axis 31 of the hole 21 of the object 16 is determined as a holding position and posture modification range, and the holding position and posture having a posture approximate to that of the waiting position and posture is preferentially selected in accordance with the selection condition.
- the holding position and posture selecting unit of the robot controller 13 is activated to select a holding position and posture of the robot 12 based on the selection condition, as described below.
- the selection condition is stored by the selection condition storing unit of the robot controller 13 , and can be read out by the robot controller 13 as necessary.
- the selection condition may be a predetermined condition or any condition input by an operator during an operation of the pickup device.
- FIGS. 6A and 6B show a process for calculating a holding position and posture of the robot 12 based on a selection condition in the pickup device 10 according to the first embodiment.
- FIG. 6A shows the hand 14 in a reference holding position and posture relative to the target object 16 a
- FIG. 6B shows the hand 14 in a selection condition posture.
- the selection condition posture has the same posture as that of the waiting position and posture. The difference between these two postures can be quantitatively represented as an amount of rotation around an axis necessary to change one posture to the other.
- the difference between the selection condition posture and the reference holding position and posture can be minimized, when the robot 12 in the selection condition posture is rotated around a line perpendicular to the central axis 31 , so as to have a posture in which a directional vector 63 extending in the direction of the central axis 31 seen from the hand 14 of the robot 12 matches a directional vector 64 extending parallel to the central axis 31 when the robot 12 is in the reference holding position and posture. Therefore, in the first embodiment, such a position and posture are selected as the holding position and posture of the robot 12 .
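The "amount of rotation around an axis necessary to change one posture to the other" can be computed from the relative rotation between the two postures. A minimal sketch, assuming postures are represented as 3×3 rotation matrices (the function name is illustrative):

```python
import numpy as np

def rotation_between(Ra, Rb):
    """Angle (radians) of the single rotation taking posture Ra to posture Rb.
    The relative rotation is Rb @ Ra^T; its angle follows from the trace
    identity trace(R) = 1 + 2*cos(angle)."""
    R = Rb @ Ra.T
    # Clamp to [-1, 1] to absorb floating-point noise before arccos.
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(c)
```

This metric is zero when the two postures coincide and grows monotonically with the rotation needed, which is what the selection condition uses to rank candidates against the waiting posture.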
- This calculating process is a non-limiting, exemplary process for calculating the holding position and posture of the robot 12 performed in accordance with the selection condition.
- since the calculation process for calculating the holding position and posture of the robot 12 depends on the selection condition and the holding position and posture modification range, it must be adjusted in accordance with the given conditions.
- FIG. 7 shows a flow chart of an exemplary process performed by the pickup device 10 according to the first embodiment. An operation of the pickup device 10 will be described below with reference to the flow chart in FIG. 7 and other relevant drawings.
- the process shown in FIG. 7 is initiated with a start command for starting a pickup process of the object 16 , for example, in response to an operator activating an operational switch, which is not shown in the drawings.
- positions and postures of a plurality of objects 16 piled up within the container 15 are measured (step S 101 ).
- the positions and postures of the respective objects 16 are determined by taking an image of the objects 16 with a camera 11 attached to the tip 12 a of the robot 12 or to the support stand 17 , for example, and by processing the image information obtained by the camera 11 .
- a target object 16 a which should be picked up is selected (step S 102 ).
- the target object 16 a is successively selected according to priority for the selection, which is calculated for the respective objects 16 based on the positions and postures of the objects 16 measured at step S 101 .
- a holding position and posture of the robot 12 corresponding to the position and posture of the target object 16 a obtained at step S 101 is calculated (step S 103 ).
- the holding position and posture of the robot 12 can be calculated based on the reference holding position and posture of the robot 12 , on the reference object position and posture, and on the position and posture of the object, as described above.
- the reference holding position and posture as well as the reference object position and posture are obtained at the preparation stage for the pickup device 10 , as described above. Therefore, when the process at step S 103 is performed, the reference holding position and posture and the reference object position and posture stored by the storing unit of the robot controller 13 are read out therefrom.
- the holding position and posture modification range and the selection condition are then read out from the holding position and posture modification range storing unit and the selection condition storing unit, respectively (step S 104 ).
- the holding position and posture modification range corresponds to rotation around the central axis 31 of the hole 21 of the object 16 in the example of the object 16 and the hand 14 shown in FIGS. 2 and 3 , as described above.
- the selection condition gives priority, for example, to the holding position and posture having a posture approximate to that of the waiting position and posture, as described above in relation to FIGS. 6A and 6B .
- a holding position and posture are selected (step S 105 ).
- the holding position and posture selected at step S 105 are sent to the robot controller 13 .
- the robot controller 13 then generates a control command in order to move the robot 12 to the holding position and posture.
- in a second embodiment, the robot controller 13 ′ includes, in addition to the configuration according to the first embodiment, a judging unit for judging whether or not there is interference between a tool attached to the tip 12 a or the arm of the robot 12 , such as the hand 14 , and an obstacle which exists in the periphery of an operation area of the robot 12 .
- a plurality of candidates for a position and posture within a range in which the robot 12 can hold the object 16 are generated, and among the candidates, only candidates which are judged by the judging unit as causing no interference are selected according to the selection condition.
- shape data of the tool attached to the tip 12 a or the arm of the robot 12 , and shape data of obstacles in the periphery of the operation area of the robot 12 , such as the container 15 , are stored at a preparation stage, so that the judging unit of the robot controller 13 ′ can judge whether or not there is interference between the tool and an obstacle.
- These shape data are stored by the shape data storing unit of the robot controller 13 ′ as CAD data, for example.
- FIG. 8A is a schematic view illustrating a configuration of the robot controller 13 ′ for the pickup device according to the second embodiment.
- FIG. 8B shows a flow chart of an exemplary process performed by the pickup device according to the second embodiment. An operation of the pickup device 10 will be described below with reference to the flowchart in FIG. 8B and other relevant drawings.
- steps S 201 through S 204 are the same as steps S 101 through S 104 in the first embodiment.
- an image of objects 16 is taken by using the camera 11 at step S 201 , so as to measure positions and postures of the respective objects 16 .
- a target object 16 a which should be picked up is selected among the objects 16 whose positions and postures are measured.
- a holding position and posture of the robot 12 are calculated at step S 203 .
- a holding position and posture modification range and a selection condition are read out at step S 204 .
- a plurality of candidates for a holding position and posture of the robot 12 falling within the holding position and posture modification range read out at step S 204 are generated (step S 205 ).
- the candidates for a holding position and posture can be obtained, for example, by rotating the hand 14 around the central axis 31 by every predefined angle in a stepwise manner.
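This stepwise candidate generation can be sketched as follows, assuming postures are represented as 3×3 rotation matrices and using Rodrigues' formula for the rotation about the central axis (the function names and the 30-degree step are illustrative, not from the patent):

```python
import numpy as np

def axis_angle_matrix(axis, angle):
    """Rotation matrix for a rotation of `angle` radians about unit `axis`
    (Rodrigues' formula)."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def candidate_postures(R_ref, axis, step_deg=30):
    """Enumerate candidate hand postures by rotating the reference posture
    R_ref about `axis` (e.g. the central axis 31) in fixed angular steps
    over a full turn."""
    return [axis_angle_matrix(axis, np.radians(a)) @ R_ref
            for a in range(0, 360, step_deg)]
```

A finer step yields more candidates at higher cost; the patent leaves the step size as a predefined angle.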
- at step S 206 , one of the candidates for a holding position and posture having the highest priority specified by the selection condition is selected. Then, it is judged at step S 207 whether or not there is interference with an obstacle in the periphery when the robot 12 is in the position and posture corresponding to the candidate selected at step S 206 .
- the selection condition prioritizes a holding position and posture which can be obtained from the waiting position and posture by a smaller amount of changes in posture, in a similar way as the first embodiment.
- the amount of changes in posture may be quantitatively expressed, for example, by an amount of rotation around a certain axis.
- a transformation matrix between the respective candidates for a holding position and posture and the waiting position and posture is calculated, and based on the transformation matrix, an amount of rotation for the respective cases can be calculated.
- the judgment as to whether or not there is interference is carried out for the candidate for a holding position and posture which requires the minimum amount of rotation.
- the judgment as to whether or not there is interference is performed based on the position and posture of the robot 12 corresponding to the candidate under judgment, on the shape data of the tool attached to the tip 12 a or the arm of the robot 12 , and on the shape data of the obstacle existing in the periphery of the operation area of the robot 12 .
- when it is judged at step S 207 that there is no interference, the candidate under judgment is selected as the holding position and posture of the robot 12 (step S 208 ). The result of the selection at step S 208 is sent to the robot controller 13 ′. The robot controller 13 ′ then creates a control command to move the robot 12 to the selected holding position and posture.
- on the other hand, when it is judged at step S 207 that there will be interference, the position and posture corresponding to the judged candidate are removed from the candidates for a holding position and posture (step S 209 ). The process then returns to step S 206 , and the next candidate for a holding position and posture having the highest priority specified by the selection condition is selected.
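Steps S205 through S209 amount to iterating over candidates in priority order until one passes the interference check. A minimal sketch, with the priority metric and the collision test left as caller-supplied callables, since the patent delegates them to the selection condition and to the stored shape data (e.g. CAD models):

```python
def select_holding_pose(candidates, priority, collides):
    """Return the highest-priority candidate that causes no interference.
    `priority(c)` returns a sortable score (lower = higher priority, e.g.
    rotation amount from the waiting posture); `collides(c)` returns True
    when the tool or arm would interfere with a peripheral obstacle."""
    for cand in sorted(candidates, key=priority):
        if not collides(cand):
            return cand
    return None  # every candidate interferes: no holding pose for this object
```

Because candidates are examined strictly in priority order, the first non-interfering candidate found is also the best admissible one, which is the property the second embodiment relies on.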
- the third embodiment uses a selection condition different from those used in the first and second embodiments. Specifically, in the pickup device according to the third embodiment, priority is given by the selection condition to a position and posture which provide a greater distance between a point fixed at the tip 12 a of the robot 12 and a fixed plane independent of the robot 12 .
- the third embodiment is advantageous in the following case, for example.
- the tip 12 a of the robot 12 might come in contact with the object 16 situated close to the target object 16 a , in the process of movement of the robot 12 to a holding position and posture in order to hold the target object 16 a .
- Such an incident tends to occur when a flange 20 of the tip 12 a of the robot 12 is in a lower position, for example, as shown in FIG. 4 . Therefore, in order to avoid this problem, a selection condition is applied such that priority is given to a holding position and posture of the robot 12 which allow the flange 20 of the tip 12 a of the robot 12 to be in a higher position.
- FIG. 9 shows a process for calculating a position and posture of the robot 12 based on a selection condition.
- a designated site 91 on an upper surface of the flange 20 is set as a target which should be in a higher position.
- a holding position and posture of the robot 12 are selected from the holding position and posture modification range, so that a distance between the designated site 91 fixed at the tip 12 a of the robot 12 and a floor surface which is considered to be a fixed plane independent of the robot 12 can be maximized.
- a holding position and posture having the highest priority specified by the selection condition can be identified by determining a holding position and posture of the robot 12 for producing the maximum scalar product of the two directional vectors 93 and 94 .
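As an illustrative sketch (not part of the disclosure), the maximum-scalar-product rule can be implemented by ranking candidate postures by the dot product between a directional vector fixed at the robot tip (vector 93) and a fixed reference direction such as the floor normal (vector 94); the candidate data and function names below are hypothetical:

```python
import math

def dot(a, b):
    # Scalar product of two 3-D vectors.
    return sum(x * y for x, y in zip(a, b))

def select_highest_priority(candidates, fixed_direction=(0.0, 0.0, 1.0)):
    """Return the candidate whose directional vector is best aligned with the
    fixed direction, i.e. the candidate with the maximum scalar product."""
    return max(candidates, key=lambda v: dot(v, fixed_direction))

# Hypothetical tip-direction vectors for three candidate postures:
candidates = [(0.0, 0.0, -1.0),                       # pointing straight down
              (math.sqrt(0.5), 0.0, math.sqrt(0.5)),  # tilted 45 degrees
              (0.0, 0.0, 1.0)]                        # pointing straight up
best = select_highest_priority(candidates)
```

Because the reference direction here is vertical, the selected candidate is the one keeping the designated site highest above the floor plane.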
- priority may also be specified in different ways, for example, in accordance with a distance between points, a distance between a point and a line, or an angle defined between lines.
- the pickup device in the fourth embodiment provides the robot 12 with multiple degrees of freedom in movement within the holding position and posture modification range, and therefore, the fourth embodiment differs from the first, second and third embodiments in that a plurality of selection conditions are set.
- the pickup device operates to pick up an object 101 having an elongated cylindrical portion 103 and a flange portion 105 .
- the tip 12 a of the robot 12 is provided with a hand 102 for holding the object 101 .
- the hand 102 includes a flange 104 attached to the tip 12 a of the robot 12 , and two claws 106 extending perpendicularly from the bottom face of the flange 104 and parallel to each other.
- the hand 102 is designed to hold the object 101 between the two claws 106 by adjusting a distance between the claws 106 by way of a chuck.
- the target object 101 a can still be held by the hand 102 even if a holding position of the hand 102 is moved in the direction of the central axis 111 of the target object 101 a by a distance X.
- This holding position and posture modification range will be referred to as a “first holding position and posture modification range,” in order to distinguish it from a second holding position and posture modification range, which will be described below.
- the second holding position and posture modification range is defined by an angle ⁇ around a line 114 perpendicular to a central axis 111 of the target object 101 a and to a central axis 113 of the hand 102 .
- the object 101 can still be held by the hand 102 even if the hand 102 is rotated around the line 114 within a certain range of angle θ.
- FIG. 11 illustrates a selection condition in the pickup device according to the fourth embodiment.
- a selection condition gives priority to the holding position and posture of the robot 12 shown in FIG. 11 .
- the selection condition gives priority to the state where the claws 106 are in a higher position, and the claws 106 are oriented vertically downward without being slanted relative to a vertical line when the target object 101 a is held by the hand 102 . If the claws 106 are in a lower position or slanted relative to the vertical line, there is a risk of contact between other objects 101 situated near the target object 101 a and the hand 102 attached to the tip 12 a or the arm of the robot 12 , as described above in relation to the third embodiment.
- a holding position and posture of the robot 12 for holding a target object 101 a can be determined within the holding position and posture modification range, according to the selection condition in the following way.
- a holding position and posture of the robot 12 which bring the claws 106 to a highest possible position within the holding position and posture modification range are determined, based on the first holding position and posture modification range and on a posture of the target object 101 a .
- the posture of the robot 12 resulting in the claws 106 being slanted to the minimum extent can be determined by obtaining a posture of the robot 12 in which a scalar product of a directional vector extending in the direction of the central axis 113 of the hand 102 and a directional vector extending perpendicular to the floor surface has the maximum value.
- the posture of the robot 12 for providing the scalar product of the maximum value can be obtained in the same way as the third embodiment.
- a holding position and posture of the robot 12 are selected among a plurality of candidates for a holding position and posture, so as to ensure that there is no interference between the tool attached to the tip 12 a or the arm of the robot 12 and an obstacle in the periphery of an operational area of the robot 12 , a plurality of candidates for a holding position and posture distant from one another are generated for the first and second holding position and posture modification ranges, respectively, and a combination thereof can be used as candidates for a holding position and posture.
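A minimal sketch of such candidate generation, assuming the first range is a translation of up to x_max along the object axis and the second a rotation of up to ±theta_max around the line 114 (all names, step counts, and the toy interference predicate are hypothetical):

```python
def generate_candidates(x_max, theta_max, n_offsets=3, n_angles=3):
    """Discretize the first modification range (translation 0..x_max along the
    object axis) and the second (rotation -theta_max..+theta_max around the
    perpendicular line), and combine them into candidate modifications."""
    offsets = [i * x_max / (n_offsets - 1) for i in range(n_offsets)]
    angles = [-theta_max + i * 2.0 * theta_max / (n_angles - 1)
              for i in range(n_angles)]
    return [(d, a) for d in offsets for a in angles]

def feasible(candidates, interferes):
    """Keep only candidates that the (externally supplied) interference
    judgment reports as collision-free."""
    return [c for c in candidates if not interferes(c)]

cands = generate_candidates(x_max=0.02, theta_max=10.0)  # 3 x 3 = 9 pairs
safe = feasible(cands, interferes=lambda c: c[1] > 5.0)  # toy predicate
```

The surviving combinations are then ranked by the selection condition, exactly as in the single-range case.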
- the priority specified by the selection condition is obtained by summing a height of the claws 106 for holding the target object 101 a multiplied by a weighting factor and an angle defined between the two directional vectors multiplied by a weighting factor.
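The weighted sum described above might be sketched as follows; the weighting factors and the sign convention (penalizing the tilt angle so that higher, less slanted postures score higher) are assumptions made for illustration:

```python
def priority(claw_height, vector_angle, w_height=1.0, w_angle=-0.5):
    """Weighted-sum priority: the claw height and the angle between the
    hand-axis vector and the vertical, each multiplied by a weighting
    factor.  The negative angle weight is an assumed convention."""
    return w_height * claw_height + w_angle * vector_angle

# Hypothetical candidates as (claw height [m], tilt angle [deg]) pairs:
cands = [(0.30, 15.0), (0.25, 0.0), (0.32, 30.0)]
best = max(cands, key=lambda c: priority(*c))
```

With these weights the untilted posture wins even though it is not the highest, which is the kind of trade-off the operator can steer by adjusting the factors.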
- the calculation process of the priority is not limited to the above example; any other process which reduces the time required for movement of the robot 12 and realizes a stable pickup process may also be employed.
- the priority may be calculated in a predetermined way, or in a way selected by an operator as necessary.
- an optimal position and posture of the robot are selected from among the possible holding positions and postures of the robot, in accordance with the priority specified by the selection condition.
- an increase in time required for a pickup process and any deviation from the movable range of the robot can be avoided. Movement of the robot which could result in contact with an obstacle may also be avoided as necessary. Since the holding position and posture of the robot are calculated for each object, reduction of the cycle time and stabilization of the system are ensured at all times.
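Taken together, the per-object selection described in these embodiments can be summarized by a small, hypothetical sketch: generate candidates, discard those judged to interfere, and return the remaining candidate with the highest priority:

```python
def select_holding_posture(candidates, interferes, priority):
    """Per-object selection: drop candidates that would interfere, then
    return the remaining candidate with the highest priority, or None if
    every candidate interferes.  All three arguments are stand-ins for the
    judging unit, selection condition, and candidate generator."""
    safe = [c for c in candidates if not interferes(c)]
    return max(safe, key=priority) if safe else None

# Toy example: candidates are rotation angles about the hole axis.
angles = [0, 90, 180, 270]
chosen = select_holding_posture(
    angles,
    interferes=lambda a: a == 180,       # pretend 180 degrees collides
    priority=lambda a: -abs(a - 90))     # prefer postures near 90 degrees
```

Returning None when every candidate interferes corresponds to the case where the target object must be skipped and another object selected.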
Description
- 1. Field of the Invention
- The present invention relates to a pickup device for picking up a target object by controlling a robot based on a result of measurement of positions and postures of objects by using a sensor.
- 2. Description of the Related Art
- A known pickup device for picking up an object by a robot is designed to have a predetermined reference position and posture relative to an object to be held, in which the robot is able to hold the object. Based on a position and posture of the object measured by a sensor and on the reference position and posture, a position and posture of the robot are calculated in order to hold the object. In such a pickup device, an object in any position and posture can be picked up by the robot.
- JP-B-3782679 discloses a related art that involves, in order to calculate a position and posture of a robot for picking up an object, predetermining a range of positions and postures of the robot relative to an object, in which the robot is able to grasp the object, and calculating a position and posture of the robot in which a tool of the robot does not interfere with a storage box for accommodating objects, so as to ensure that there is no interference between the storage box and the tool.
- JP-A-2012-055999 discloses a related art that involves measuring positions and postures of a plurality of objects by a sensor, calculating priority as indicator to reduce cycle time for picking up the respective objects or priority as an indicator to stably convey the objects, and determining a target object to be picked up according to the priority, in order to reduce cycle time of a pickup process or to provide stable conveyance of the objects.
- In the conventional pickup devices, depending on a position and posture of the object, there arise the following problems:
- (1) that a position and posture of a robot for holding a measured object may be deviated from a range of movement of the robot;
- (2) that it may require much longer time to move the robot to a position and posture for picking up the object; and/or
- (3) that a tool attached to a tip of the robot or to an arm of the robot, or a robot body may have a posture which tends to cause interference with an obstacle in the periphery of the robot.
- JP-B-3782679 discloses a related art in which candidates for a position and posture of the robot which fall within a range of positions and postures of the robot set by an operator are calculated in a predetermined order, and the candidates are sequentially subjected to judgment as to whether or not they cause interference. When it is judged that there is no interference, the position and posture of the robot are then confirmed. Thus, even if there is potentially another position and posture within the predetermined range which are more desirable in terms of reducing time for movement of the robot or the like, such a position and posture are not always selected. Therefore, there is still room for improvement when it comes to reducing the cycle time of the pickup process.
- JP-A-2012-055999 discloses a related art in which priority indicative of which object should be picked up is calculated for the individual objects, and an object is selected according to the priority. However, if objects are picked up successively from a pile of objects, as the pickup process proceeds, more and more objects with lower priority are left behind. If this is the case, an expected effect such as reduction of cycle time and stabilization of a conveyance process can be no longer achieved. Accordingly, there is a problem that the related art disclosed in JP-A-2012-055999 cannot constantly produce an expected result.
- According to the present disclosure, a pickup device for picking up a target object from a plurality of objects, comprising: a robot equipped with a tool adapted to hold the target object; a sensor for measuring positions and postures of the plurality of objects; a reference holding position and posture storing unit for storing a reference holding position and posture which serve as a reference for a position and posture of the robot relative to the target object when the robot holds the target object by the tool; a holding position and posture modification range storing unit for storing a holding position and posture modification range which corresponds to a range of modification which can be applied to the reference holding position and posture so as to hold the target object by the tool; a holding position and posture calculating unit for calculating a holding position and posture of the robot in which the robot can hold the target object by the tool, based on the position and posture of the target object measured by the sensor and on the reference holding position and posture; a selection condition storing unit for storing at least one selection condition, based on which priority of the holding position and posture of the robot can be determined; and a holding position and posture selecting unit for selecting one of the holding positions and postures of the robot in accordance with the priority determined by the selection condition, the holding positions and postures of the robot being obtained from the holding position and posture calculated by the holding position and posture calculating unit and from the holding position and posture modification range, is provided.
- According to the present disclosure, a pickup device for picking up a target object from a plurality of objects, comprising: a robot equipped with a tool adapted to hold the target object; a sensor for measuring positions and postures of the plurality of objects; a reference holding position and posture storing unit for storing a reference holding position and posture which serve as a reference for a position and posture of the robot relative to the target object when the robot holds the target object by the tool; a holding position and posture modification range storing unit for storing a holding position and posture modification range which corresponds to a range of modification which can be applied to the reference holding position and posture so as to hold the target object by the tool; a holding position and posture calculating unit for calculating a holding position and posture of the robot in which the robot can hold the target object by the tool, based on the position and posture of the target object measured by the sensor and on the reference holding position and posture; a selection condition storing unit for storing at least one selection condition, based on which priority of the holding position and posture of the robot can be determined; a shape data storing unit for storing a shape data of the tool and a shape data of an obstacle which exists in the periphery of the robot; a judging unit for judging as to whether there is interference between the tool and the obstacle, based on the position and posture of the robot, on the shape data of the tool and on the shape data of the obstacle; and a holding position and posture selecting unit for selecting one of the holding positions and postures of the robot in accordance with the priority determined by the selection condition, the holding positions and postures of the robot being judged by the judging unit that there is no interference, the holding positions and postures of the robot being obtained from 
the holding position and posture calculated by the holding position and posture calculating unit and from the holding position and posture modification range, is also provided.
- These and other objects, features and advantages of the present invention will become more apparent in light of the detailed description of exemplary embodiments thereof as illustrated by the drawings.
- FIG. 1 is a schematic view illustrating an overall configuration of a pickup device according to a first embodiment of the present invention;
- FIG. 2 is a perspective view illustrating a shape of an object picked up by the pickup device according to the first embodiment;
- FIG. 3 is a perspective view illustrating a positional relationship between a target object and a hand when a robot is in a holding position and posture in the pickup device according to the first embodiment;
- FIG. 4 shows an example in which time required to move the robot to a holding position and posture increases in the pickup device according to the first embodiment;
- FIG. 5 shows an example in which time required to move the robot to a holding position and posture decreases in the pickup device according to the first embodiment;
- FIG. 6A shows a process for calculating a holding position and posture of the robot based on a selection condition in the pickup device according to the first embodiment;
- FIG. 6B shows a process for calculating a holding position and posture of the robot based on a selection condition in the pickup device according to the first embodiment;
- FIG. 7 shows a flow chart of an exemplary process performed by the pickup device according to the first embodiment;
- FIG. 8A is a schematic view illustrating a configuration of a robot controller of a pickup device according to a second embodiment of the present invention;
- FIG. 8B shows a flow chart of an exemplary process performed by the pickup device according to the second embodiment;
- FIG. 9 shows a process for calculating a holding position and posture of a robot based on a selection condition in a pickup device according to a third embodiment of the present invention;
- FIG. 10 illustrates a holding position and posture modification range in a pickup device according to a fourth embodiment of the present invention; and
- FIG. 11 illustrates a selection condition in the pickup device according to the fourth embodiment. -
FIG. 1 is a schematic view illustrating a configuration of a pickup device 10 for picking up an object according to a first embodiment of the present invention. The pickup device 10 includes a camera 11, a robot 12, and a robot controller 13 connected to the robot 12 so as to control the robot 12. The robot 12 has a hand 14 attached to a tip 12 a of the robot 12. Although not illustrated, the robot controller 13 has a hardware configuration including a processor, a ROM, a RAM, a non-volatile RAM, an input means operated by an operator, a display device for displaying various information, an I/O interface, and a controller for controlling a servo motor provided at each joint of the robot 12. The robot controller 13 also has a configuration including a reference holding position and posture storing unit, a holding position and posture modification range storing unit, a holding position and posture calculating unit, a selection condition storing unit, and a holding position and posture selecting unit. - A plurality of
objects 16 are piled up within a container 15 which opens in its upper portion. The pickup device 10 is designed to measure positions and postures of objects 16 by taking an image thereof by using the camera 11, and to calculate, by the robot controller 13, a position and posture of the robot 12 corresponding to the position and posture of a target object 16 a which should be picked up. The pickup device 10 is also designed to move the robot 12 to the calculated position and posture, and to pick up the target object 16 a with the hand 14 holding the target object 16 a . - Although the
camera 11 is used as a sensor for measuring positions and postures of objects 16 in a three-dimensional space according to the first embodiment, any other type of sensor capable of measuring positions and postures of the objects 16 may also be used. When the camera 11 is used, a position and posture of an object 16 can be measured by detecting four points on the object 16 on the same plane, whose relative relationship between one another is predetermined, from an image region taken by the camera 11. As opposed to the illustrated embodiment in which the camera 11 is fixed to a support stand 17, the camera 11 may also be attached to the tip 12 a of the robot 12. -
FIGS. 2 and 3 show the hand 14 and the object 16 according to the first embodiment. The object 16 has a substantially cylindrical shape, part of which is cut out, with a circular hole 21 at its center. The hand 14 is a tool provided at the tip 12 a of the robot 12. The hand 14 includes a flange 20 attached to the tip 12 a , and two claws 18 extending parallel to each other from the flange 20. The claws 18 are configured to have a gap therebetween which can be adjusted by a chuck. The hand 14 is intended to hold an object 16 by inserting tip portions 18 a of the claws 18 into the hole 21 of the object 16, and then widening the gap between the claws 18, so as to provide pressing force outwardly from the inside of the object 16. The object 16 is illustrated only by way of example; any other type of object may also be used, as long as it has a shape which can be held by various tools attached to the tip 12 a of the robot 12. The hand 14 may also have other configurations capable of holding the object 16, not limited to the chuck as illustrated, but including any known holding means such as a suction nozzle, an attractive magnet or an adhesive pad. A position and posture of the hand 14 may vary, but depend on a position and posture of the robot 12 controlled by the robot controller 13. - A position and posture of the
robot 12 at the time of holding a target object 16 a which should be picked up from the container 15 by using the hand 14 (hereinafter referred to as a “holding position and posture”) can be calculated as follows. In a preparation stage, an object 16 is placed in a position and posture which serve as a reference. The position and posture of the object 16 are measured by the camera 11, and the measured position and posture are stored as a reference object position and posture Wn. The robot 12 is then moved to a position and posture where the robot 12 can hold the object 16, and the position and posture of the robot 12 are stored as a reference holding position and posture Rn, which serve as a reference of the holding position and posture of the robot 12. The reference object position and posture Wn and the reference holding position and posture Rn are stored in the robot controller 13. Accordingly, the robot controller 13 includes a reference object position and posture storing unit designed to store the reference object position and posture Wn, and a reference holding position and posture storing unit designed to store the reference holding position and posture Rn. - In an operation stage, a position and posture Wa of the
object 16 are measured by the camera 11, and then a position and posture Ra of the robot 12 relative to the object 16 are calculated by the following formula: -
Ra=Wa×inv(Wn)×Rn, - where inv (Wn) represents an inverse matrix of Wn.
- In this way, the holding position and posture of the robot relative to the
object 16 in any position and posture can be calculated. Calculation of the holding position and posture can be performed by therobot controller 13. Thus, therobot controller 13 includes a holding position and posture calculating unit. - Referring to
FIG. 3 again, a holding position and posture modification range will be described. The holding position and posture modification range is a range in which the reference holding position and posture can be modified. Thus, the robot 12 can hold the target object 16 a by the hand 14 not only in the reference holding position and posture relative to the target object 16 a , but also in any position and posture within the holding position and posture modification range. - In the first embodiment, the
target object 16 a is held accordingly by teaching the reference holding position and posture of the robot 12 so that the hand 14 is in a position and posture relative to the target object 16 a as illustrated in FIG. 3 . However, in the case of the illustrated hand 14 and target object 16 a with the hole 21 of a circular shape, the target object 16 a can also be held even when the hand 14 is rotated around a central axis 31 of the hole 21 of the target object 16 a . In order to take advantage of this, according to the first embodiment, the holding position and posture modification range is set so that the hand 14 can rotate around the central axis 31 of the hole 21. Such a holding position and posture modification range is stored in the holding position and posture modification range storing unit of the robot controller 13. The holding position and posture modification range depends on the shape of the object 16 and of the hand 14. Therefore, the holding position and posture modification range is not limited to the illustrated example of rotational movement around a particular axis, but may include translational movement, rotational movement around a plurality of axes different from each other, or a combination thereof. -
FIG. 4 shows an example in which time required to move a robot to a holding position and posture increases in the pickup device 10. For convenience, the position and posture of the robot 12 shown in FIG. 1 are considered to be a waiting position and posture before the robot 12 is positioned in a holding position and posture. On the other hand, FIG. 4 shows the robot 12 after the robot 12 is moved to the holding position and posture. As can be seen by comparing FIGS. 1 and 4 , in the case where a direction of the hand 14 is significantly changed in the process of the movement of the robot 12 from the waiting position and posture to the holding position and posture, it is necessary to rotate the tip 12 a of the robot 12 around an axis of the wrist to a great extent. In addition, since the tip portions 18 a of the claws 18 of the hand 14 are offset from the tip 12 a of the robot 12, the tip 12 a of the robot 12 is located distant from the tip portions 18 a of the claws 18. As a result, the hand 14 cannot be moved quickly, and the time required to move the robot 12 to the holding position and posture increases. - In order to reduce the time required to move the
robot 12 from a waiting position and posture to a holding position and posture, the holding position and posture of the robot 12 should be changed from the posture shown in FIG. 4 to the posture shown in FIG. 5 by rotating the hand 14 around the central axis 31 of the hole 21 of the target object 16 a . FIG. 5 shows the posture of the robot 12 obtained by rotating the robot 12 shown in FIG. 4 around the central axis 31 of the hole 21 of the target object 16 a by 180 degrees. In this way, a holding position and posture are selected according to priority in order to ensure that the holding position and posture are approximate to the waiting position and posture. Therefore, time required to move the robot 12 from the waiting position and posture to the holding position and posture can be reduced. - According to the first embodiment, in order to reduce the time required to move the
robot 12 to a holding position and posture, the selection condition gives priority to the holding position and posture having a posture approximate to that of the waiting position and posture. In the first embodiment, free rotation around the central axis 31 of the hole 21 of the object 16 is determined as a holding position and posture modification range, and the holding position and posture having a posture approximate to that of the waiting position and posture is preferentially selected in accordance with the selection condition. In this case, the holding position and posture selecting unit of the robot controller 13 is activated to select a holding position and posture of the robot 12 based on the selection condition, as described below. The selection condition is stored by the selection condition storing unit of the robot controller 13, and can be read out by the robot controller 13 as necessary. The selection condition may be a predetermined condition or any condition input by an operator during an operation of the pickup device. -
FIGS. 6A and 6B show a process for calculating a holding position and posture of the robot 12 based on a selection condition in the pickup device 10 according to the first embodiment. FIG. 6A shows the hand 14 in a reference holding position and posture relative to the target object 16 a , while FIG. 6B shows the hand 14 in a selection condition posture. In the present embodiment, the selection condition posture has the same posture as that of the waiting position and posture. The difference between these two postures can be quantitatively represented as an amount of rotation around an axis necessary to change one posture to the other. The difference between the selection condition posture and the reference holding position and posture can be minimized when the robot 12 in the selection condition posture is rotated around a line perpendicular to the central axis 31, so as to have a posture in which a directional vector 63 extending in the direction of the central axis 31 seen from the hand 14 of the robot 12 matches a directional vector 64 extending parallel to the central axis 31 when the robot 12 is in the reference holding position and posture. Therefore, in the first embodiment, such a position and posture are selected as the holding position and posture of the robot 12. This calculating process is a non-limiting, exemplary process for calculating the holding position and posture of the robot 12 performed in accordance with the selection condition. In addition, since the calculation process for calculating the holding position and posture of the robot 12 depends on the selection condition and the holding position and posture modification range, the calculation process must be adjusted in accordance with a given condition. -
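In the same spirit, when the modification range is a free rotation around the central axis 31, the candidate that stays closest to the waiting posture can be found by minimizing the residual rotation angle; a hedged sketch with hypothetical names and a discretized candidate set:

```python
def angular_difference(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def closest_to_waiting(candidate_angles, waiting_angle):
    """Among candidate rotations about the hole's central axis, pick the
    one requiring the least additional rotation from the waiting posture."""
    return min(candidate_angles,
               key=lambda a: angular_difference(a, waiting_angle))

best = closest_to_waiting([0.0, 90.0, 180.0, 270.0], waiting_angle=350.0)
```

Taking the difference modulo 360 degrees matters here: a candidate at 0 degrees is only 10 degrees away from a waiting angle of 350 degrees, not 350.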
FIG. 7 shows a flow chart of an exemplary process performed by the pickup device 10 according to the first embodiment. An operation of the pickup device 10 will be described below with reference to the flow chart in FIG. 7 and other relevant drawings. - The process shown in
FIG. 7 is initiated with a start command for starting a pickup process of the object 16, for example, in response to an operator activating an operational switch, which is not shown in the drawings. First, positions and postures of a plurality of objects 16 piled up within the container 15 are measured (step S101). At step S101, the positions and postures of the respective objects 16 are determined by taking an image of the objects 16 with a camera 11 attached to the tip 12 a of the robot 12 or to the support stand 17, for example, and by processing the image information obtained by the camera 11. - Then, among the
objects 16 whose positions and postures are measured by the camera 11 at step S101, a target object 16 a which should be picked up is selected (step S102). Preferably, the target object 16 a is successively selected according to priority for the selection, which is calculated for the respective objects 16 based on the positions and postures of the objects 16 measured at step S101. - Then, a holding position and posture of the
robot 12 corresponding to the position and posture of the target object 16 a obtained at step S101 is calculated (step S103). The holding position and posture of the robot 12 can be calculated based on the reference holding position and posture of the robot 12, on the reference object position and posture, and on the position and posture of the object, as described above. The reference holding position and posture as well as the reference object position and posture are obtained at the preparation stage for the pickup device 10, as described above. Therefore, when the process at step S103 is performed, the reference holding position and posture and the reference object position and posture stored by the storing unit of the robot controller 13 are read out therefrom. - The holding position and posture modification range and the selection condition are then read out from the holding position and posture modification range storing unit and the selection condition storing unit, respectively (step S104). The holding position and posture modification range corresponds to rotation around the
central axis 31 of the hole 21 of the object 16 in the example of the object 16 and the hand 14 shown in FIGS. 2 and 3 , as described above. The selection condition gives priority, for example, to the holding position and posture having a posture approximate to that of the waiting position and posture, as described above in relation to FIGS. 6A and 6B . - Then, based on the holding position and posture calculated at step S103, and on the holding position and posture modification range and the selection condition read out at step S104, a holding position and posture are selected (step S105). The holding position and posture selected at step S105 are sent to the
robot controller 13. The robot controller 13 then generates a control command in order to move the robot 12 to the holding position and posture. - A pickup device according to a second embodiment of the present invention will be described. In the following explanation, the same or corresponding constituent elements are designated with the same reference numerals. In the pickup device according to the second embodiment, the
robot controller 13′ includes, in addition to the configuration according to the first embodiment, a judging unit for judging whether or not there is interference between a tool attached to the tip 12 a or the arm of the robot 12, such as the hand 14, and an obstacle which exists in the periphery of an operation area of the robot 12. Unlike the first embodiment, a plurality of candidates for a position and posture within a range in which the robot 12 can hold the object 16 are generated, and among the candidates, only candidates which are judged by the judging unit as causing no interference are selected according to the selection condition. In the following explanation directed to the second embodiment, only matters different from the first embodiment will be described. - In the second embodiment, a shape data of the tool attached to the
tip 12 a or the arm of the robot 12 and shape data of the obstacle in the periphery of an operation area of the robot 12, such as the container 15, are stored in a preparation stage, in order for the judging unit of the robot controller 13′ to judge whether or not there is interference between the tool and the obstacle. These shape data are stored by the shape data storing unit of the robot controller 13′ as CAD data, for example. FIG. 8A is a schematic view illustrating a configuration of the robot controller 13′ for the pickup device according to the second embodiment. -
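The interference judgment based on these shape data can be sketched in simplified form. Assuming, purely for illustration, that the tool and the obstacle are each approximated by an axis-aligned bounding box in the robot base frame (the patent itself leaves the geometric test to the stored CAD data), the judgment reduces to an interval-overlap check on each axis:

```python
def boxes_interfere(box_a, box_b):
    """Assumed stand-in for the judging unit: each box is given as
    ((xmin, ymin, zmin), (xmax, ymax, zmax)) in the robot base frame.
    Two boxes interfere iff their intervals overlap on all three axes."""
    lo_a, hi_a = box_a
    lo_b, hi_b = box_b
    return all(lo_a[i] <= hi_b[i] and lo_b[i] <= hi_a[i] for i in range(3))
```

A practical implementation would test the stored CAD meshes with a collision-checking library; the bounding-box test above serves only as a conservative first pass.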
FIG. 8B shows a flowchart of an exemplary process performed by the pickup device according to the second embodiment. An operation of the pickup device 10 will be described below with reference to the flowchart in FIG. 8B and other relevant drawings. - The processes at steps S201 through S204 are the same as steps S101 through S104 in the first embodiment. Specifically, an image of
objects 16 is taken by using the camera 11 at step S201, so as to measure positions and postures of the respective objects 16. At step S202, a target object 16 a which should be picked up is selected among the objects 16 whose positions and postures are measured. In accordance with a predefined formula, a holding position and posture of the robot 12 are calculated at step S203. Then, a holding position and posture modification range and a selection condition are read out at step S204. A plurality of candidates for a holding position and posture of the robot 12 falling within the holding position and posture modification range read out at step S204 are generated (step S205). For example, in the same way as the first embodiment, when the object 16 and the hand 14 shown in FIGS. 2 and 3 are applied, free rotation around the central axis 31 of the hole 21 of the object 16 is set as a modification range in which the robot 12 can hold the object 16. In this case, the candidates for a holding position and posture can be obtained, for example, by rotating the hand 14 around the central axis 31 by a predefined angle in a stepwise manner. - Next, one of the candidates for a holding position and posture having the highest priority specified by the selection condition is selected (step S206). Then, it is judged at step S207 whether or not there is interference with an obstacle in the periphery when the
robot 12 is in a position and posture corresponding to the candidate for a holding position and posture which has been selected at step S206. In the second embodiment, the selection condition prioritizes a holding position and posture which can be reached from the waiting position and posture by a smaller amount of change in posture, in a similar way to the first embodiment. The amount of change in posture may be quantitatively expressed, for example, by an amount of rotation around a certain axis. For example, a transformation matrix between each candidate for a holding position and posture and the waiting position and posture is calculated, and based on the transformation matrix, an amount of rotation for each case can be calculated. Specifically, at step S207, the judgment as to whether or not there is interference is carried out for the candidate for a holding position and posture which requires the minimum amount of rotation. The judgment as to whether or not there is interference is performed based on a position and posture of the robot 12 corresponding to the candidate for a holding position and posture to be judged, on the shape data of the tool attached to the tip 12 a or the arm of the robot 12, and on the shape data of the obstacle existing in the periphery of an operation area of the robot 12. - When it is judged at step S207 that there is no interference, the candidate, which is the subject of the judgment, is selected as a holding position and posture of the robot 12 (step S208). The result of selection at step S208 is sent out to the
robot controller 13′. The robot controller 13′ then functions to create a control command in order to move the robot 12 to the holding position and posture which have been selected. On the other hand, when it is judged at step S207 that there will be interference, the position and posture corresponding to the candidate which has been judged are removed from the candidates for a holding position and posture (step S209). Then, the process returns to step S206, and the next candidate for a holding position and posture having the highest priority specified by the selection condition is selected. - Next, a pickup device according to a third embodiment of the present invention will be described. The third embodiment uses a selection condition different from those used in the first and second embodiments. Specifically, in the pickup device according to the third embodiment, priority is given by the selection condition to a position and posture which provide a greater distance between a point fixed at the
tip 12 a of the robot 12 and a fixed plane independent of the robot 12. In the following explanation directed to the third embodiment, only matters different from the first and second embodiments will be described. - The third embodiment is advantageous in the following case, for example. In the case where an
object 16 which cannot be recognized in an image taken by the camera 11 exists near the target object 16 a to be picked up, and the former object 16 and the latter target object 16 a are situated at heights close to each other, the tip 12 a of the robot 12 might come into contact with the object 16 situated close to the target object 16 a in the process of movement of the robot 12 to a holding position and posture in order to hold the target object 16 a. Such an incident tends to occur when a flange 20 of the tip 12 a of the robot 12 is in a lower position, for example, as shown in FIG. 4. Therefore, in order to avoid this problem, a selection condition is applied such that priority is given to a holding position and posture of the robot 12 which allow the flange 20 of the tip 12 a of the robot 12 to be in a higher position. -
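Such a selection can be sketched as follows, under representational assumptions not stated in the patent: each candidate is a 4x4 homogeneous transform of the flange 20 in a base frame whose z axis points up, the floor is the plane z = 0, and the designated point on the flange is given in flange coordinates.

```python
import numpy as np

def pick_highest(candidate_poses, site_in_flange):
    """Assumed sketch of the third embodiment's selection condition:
    among candidate flange poses (4x4 transforms, base frame, z up),
    choose the one that places the designated flange point highest,
    i.e. at the greatest distance from the floor plane z = 0."""
    p = np.append(np.asarray(site_in_flange, dtype=float), 1.0)
    heights = [(T @ p)[2] for T in candidate_poses]
    return int(np.argmax(heights))
```

The same scheme generalizes to any priority expressible as a scalar function of the candidate pose.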
FIG. 9 shows a process for calculating a position and posture of the robot 12 based on the selection condition. In the following example, a designated site 91 on an upper surface of the flange 20 is set as a target which should be in a higher position. In this case, in order to obtain a holding position and posture of the robot 12 that allow the designated site 91 to be in the highest possible position, a holding position and posture of the robot 12 are selected from the holding position and posture modification range so that a distance between the designated site 91 fixed at the tip 12 a of the robot 12 and a floor surface, which is considered to be a fixed plane independent of the robot 12, is maximized. In the case where a perpendicular line 95 drawn from the designated site 91 to the central axis 31 of the hole 21 of the object 16 has a distal end 92, the designated site 91 is in the highest position when a scalar product of a directional vector 93 extending from the distal end 92 to the designated site 91 along the perpendicular line 95 and a directional vector 94 extending perpendicular to the floor surface has the maximum value. Accordingly, a holding position and posture having the highest priority specified by the selection condition can be identified by determining a holding position and posture of the robot 12 that produce the maximum scalar product of the two directional vectors 93 and 94. - As opposed to the exemplary third embodiment, in which priority is given to a position and posture of the
robot 12 that provide a greater distance between a point fixed at the tip 12 a of the robot 12 and a fixed plane independent of the robot 12, priority may also be specified in different ways, for example, in accordance with a distance between points, a distance between a point and a line, or an angle defined between lines. - Next, a pickup device according to a fourth embodiment of the present invention will be described. The pickup device in the fourth embodiment provides the
robot 12 with multiple degrees of freedom in movement within the holding position and posture modification range, and therefore, the fourth embodiment differs from the first, second and third embodiments in that a plurality of selection conditions are set. In the following explanation directed to the fourth embodiment, only matters different from the first, second and third embodiments will be described. - In the fourth embodiment, as shown in
FIG. 10, the pickup device operates to pick up an object 101 having a cylindrical portion 103 with an elongated cylindrical shape and a flange portion 105. The tip 12 a of the robot 12 is provided with a hand 102 for holding the object 101. The hand 102 includes a flange 104 attached to the tip 12 a of the robot 12, and two claws 106 extending perpendicularly from the bottom face of the flange 104 and parallel to each other. The hand 102 is designed to hold the object 101 between the two claws 106 by adjusting a distance between the claws 106 by way of a chuck. - When a reference holding position and posture of the robot is set so that the
target object 101 a and the hand 102 are in a positional relationship relative to each other as shown in FIG. 10, the target object 101 a can still be held by the hand 102 even if a holding position of the hand 102 is moved in the direction of the central axis 111 of the target object 101 a by a distance X. This holding position and posture modification range will be referred to as a “first holding position and posture modification range,” in order to distinguish it from a second holding position and posture modification range, which will be described below. The second holding position and posture modification range is defined by an angle θ around a line 114 perpendicular to the central axis 111 of the target object 101 a and to a central axis 113 of the hand 102. Thus, the object 101 can still be held by the hand 102 even if the hand 102 is rotated around the line 114 within a certain range of angle θ. -
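Assuming both modification ranges are discretized (the step counts below are illustrative, not taken from the patent), the candidates combining the first range (offsets along the central axis 111 up to the distance X) with the second range (tilts about the line 114 within ±θ) can be enumerated as parameter pairs:

```python
import numpy as np
from itertools import product

def combined_candidates(x_max, theta_max, n_x=5, n_theta=5):
    """Parameter grid over the first and second holding position and
    posture modification ranges: axial offsets in [0, x_max] combined
    with tilt angles in [-theta_max, +theta_max] (radians). Applying
    each pair to a reference holding pose yields a candidate pose."""
    offsets = np.linspace(0.0, x_max, n_x)
    tilts = np.linspace(-theta_max, theta_max, n_theta)
    return list(product(offsets, tilts))
```

The grid resolution trades computation time against how finely the modification ranges are explored.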
FIG. 11 illustrates a selection condition in the pickup device according to the fourth embodiment. In the fourth embodiment, a selection condition gives priority to the holding position and posture of the robot 12 shown in FIG. 11. Specifically, the selection condition gives priority to the state where the claws 106 are in a higher position and are oriented vertically downward, without being slanted relative to a vertical line, when the target object 101 a is held by the hand 102. If the claws 106 are in a lower position or slanted relative to the vertical line, there is a risk of contact between other objects 101 situated near the target object 101 a and the hand 102 attached to the tip 12 a or the arm of the robot 12, as described above in relation to the third embodiment. However, according to the fourth embodiment, priority is given as much as possible to the state where the claws 106 are in a higher position without being slanted. Therefore, the hand 102 can be prevented from unexpectedly coming into contact with other objects 101 near the target object 101 a. - A holding position and posture of the
robot 12 for holding a target object 101 a can be determined within the holding position and posture modification range, according to the selection condition, in the following way. A holding position and posture of the robot 12 which bring the claws 106 to the highest possible position within the holding position and posture modification range are determined, based on the first holding position and posture modification range and on a posture of the target object 101 a. Depending on the inclination of the object 101 along the central axis 111, which can be measured by the camera 11, it is determined which end of the distance X of the first holding position and posture modification range brings the claws 106 to a higher position when holding the target object 101 a. In addition, the posture of the robot 12 resulting in the claws 106 being slanted to the minimum extent can be determined by obtaining a posture of the robot 12 in which a scalar product of a directional vector extending in the direction of the central axis 113 of the hand 102 and a directional vector extending perpendicular to the floor surface has the maximum value. The posture of the robot 12 providing the scalar product of the maximum value can be obtained in the same way as in the third embodiment. - Similarly to the second embodiment, in the case where a holding position and posture of the
robot 12 are selected among a plurality of candidates for a holding position and posture, so as to ensure that there is no interference between the tool attached to the tip 12 a or the arm of the robot 12 and an obstacle in the periphery of an operational area of the robot 12, a plurality of candidates for a holding position and posture distant from one another are generated for the first and second holding position and posture modification ranges, respectively, and their combinations can be used as candidates for a holding position and posture. For each candidate obtained in the above-described way, the height of the claws 106 when holding the target object 101 a is calculated, and an angle defined between a directional vector extending in the direction of the central axis 113 of the hand 102 and a directional vector extending perpendicular to the floor surface is calculated. Based on the result of the calculation, the priority of the candidates for a holding position and posture is further calculated, and then the candidates having higher priority are successively judged as to whether or not there is interference. The candidate for a holding position and posture which has been judged as causing no interference is selected as a holding position and posture of the robot. The priority specified by the selection condition is obtained by summing the height of the claws 106 for holding the target object 101 a multiplied by a weighting factor and the angle defined between the two directional vectors multiplied by a weighting factor. However, the calculation process of the priority is not limited to the above example; any other process which allows the time required for movement of the robot 12 to be reduced and realizes a stable pickup process may also be employed. In addition, the calculating process of the priority may be carried out in a predetermined way, or may be selected as necessary by an operator.
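The weighted-sum priority described above can be sketched as follows; the weight values and the sign convention (penalizing tilt so that upright, higher candidates rank first) are illustrative assumptions, not values given in the patent:

```python
def priority(claw_height, tilt_angle, w_height=1.0, w_tilt=-1.0):
    """Priority of a candidate: the claw height and the angle between the
    hand's central-axis vector and the vertical, each multiplied by a
    weighting factor. The tilt weight is negative (an assumption) so that
    higher, less-slanted candidates receive higher priority."""
    return w_height * claw_height + w_tilt * tilt_angle

def ranked(candidates):
    """Indices of (height, angle) candidate pairs in descending priority;
    interference is then checked in this order, accepting the first
    candidate judged free of interference."""
    return sorted(range(len(candidates)),
                  key=lambda i: priority(*candidates[i]),
                  reverse=True)
```

Tuning the two weights shifts the balance between lifting the claws higher and keeping them vertical.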
- The above embodiments are described for illustrative purposes only, and therefore the present invention is not limited to a particular configuration and/or function according to the above embodiments and variants thereof. Constituent elements of the above embodiments and variants thereof can be replaced with alternatives which are obvious to a person skilled in the art, while substantially maintaining the identity of the present invention. Thus, embodiments including such alternative constituent elements also fall within the technical scope and the spirit of the present invention. Further, any combination of one or more of the above embodiments and variants thereof is included in the present disclosure.
- With the pickup device according to the present invention, an optimal position and posture of the robot are selected among the possible holding positions and postures of the robot, in accordance with the priority specified by the selection condition. Thus, an increase in the time required for a pickup process and any deviation from the movable range of the robot can be avoided. Movement of the robot which could result in contact with an obstacle may also be avoided as necessary. Since the calculation of the holding position and posture of the robot is performed for each object, the cycle time can be reliably reduced and the system stabilized.
- Although the invention has been shown and described with exemplary embodiments thereof, it should be understood by a person skilled in the art that the foregoing and various other changes, omissions and additions may be made therein and thereto without departing from the spirit and scope of the invention.
Claims (8)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-201958 | 2012-09-13 | ||
JP2012201958A JP5620445B2 (en) | 2012-09-13 | 2012-09-13 | Article takeout device for determining holding position and posture of robot based on selection condition |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140074288A1 true US20140074288A1 (en) | 2014-03-13 |
US9050722B2 US9050722B2 (en) | 2015-06-09 |
Family
ID=50153419
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/025,427 Active 2033-10-09 US9050722B2 (en) | 2012-09-13 | 2013-09-12 | Pickup device capable of determining holding position and posture of robot based on selection condition |
Country Status (4)
Country | Link |
---|---|
US (1) | US9050722B2 (en) |
JP (1) | JP5620445B2 (en) |
CN (1) | CN103659811B (en) |
DE (1) | DE102013014873B4 (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070282485A1 (en) * | 2006-06-06 | 2007-12-06 | Fanuc Ltd | Robot simulation apparatus |
US20090285664A1 (en) * | 2008-05-13 | 2009-11-19 | Samsung Electronics Co., Ltd | Robot, robot hand, and method of controlling robot hand |
US20100103106A1 (en) * | 2007-07-11 | 2010-04-29 | Hsien-Hsiang Chui | Intelligent robotic interface input device |
US20100298974A1 (en) * | 2009-05-19 | 2010-11-25 | Kabushiki Kaisha Yaskawa Denki | Robot and conveying system |
US20110280472A1 (en) * | 2010-05-14 | 2011-11-17 | Wallack Aaron S | System and method for robust calibration between a machine vision system and a robot |
US20120022827A1 (en) * | 2009-04-09 | 2012-01-26 | Andreas Hertgens | Method for automatic measurement and for teaching-in of location positions of objects within a substrate processing system by means of sensor carriers and associated sensor carrier |
US20120053724A1 (en) * | 2010-08-31 | 2012-03-01 | Kabushiki Kaisha Yaskawa Denki | Robot system |
US20120216384A1 (en) * | 2011-02-25 | 2012-08-30 | Durr Ecoclean, Inc. | Manufacturing facility with robotic carrier and method of manufacturing |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3300682B2 (en) * | 1999-04-08 | 2002-07-08 | ファナック株式会社 | Robot device with image processing function |
JP3782679B2 (en) * | 2001-05-09 | 2006-06-07 | ファナック株式会社 | Interference avoidance device |
JP3805310B2 (en) * | 2003-01-30 | 2006-08-02 | ファナック株式会社 | Work take-out device |
JP3930490B2 (en) * | 2004-04-23 | 2007-06-13 | ファナック株式会社 | Article take-out device |
JP4226623B2 (en) * | 2006-09-29 | 2009-02-18 | ファナック株式会社 | Work picking device |
DE102007060653A1 (en) * | 2007-12-15 | 2009-06-18 | Abb Ag | Position determination of an object |
JP5685027B2 (en) | 2010-09-07 | 2015-03-18 | キヤノン株式会社 | Information processing apparatus, object gripping system, robot system, information processing method, object gripping method, and program |
-
2012
- 2012-09-13 JP JP2012201958A patent/JP5620445B2/en active Active
-
2013
- 2013-09-06 DE DE102013014873.1A patent/DE102013014873B4/en active Active
- 2013-09-12 US US14/025,427 patent/US9050722B2/en active Active
- 2013-09-12 CN CN201310415173.5A patent/CN103659811B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN103659811A (en) | 2014-03-26 |
JP2014054715A (en) | 2014-03-27 |
DE102013014873B4 (en) | 2015-08-20 |
US9050722B2 (en) | 2015-06-09 |
DE102013014873A1 (en) | 2014-03-13 |
JP5620445B2 (en) | 2014-11-05 |
CN103659811B (en) | 2015-09-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FANUC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SATOU, TAIGA;REEL/FRAME:031195/0793 Effective date: 20130829 |
|
AS | Assignment |
Owner name: FANUC CORPORATION, JAPAN Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE TITLE OF THE APPLICATION PREVIOUSLY RECORDED ON REEL 031195 FRAME 0793. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:SATOU, TAIGA;REEL/FRAME:031957/0155 Effective date: 20130829 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |