WO2017037931A1 - Processing trajectory editing device, robot, article processing system, and article manufacturing method - Google Patents

Processing trajectory editing device, robot, article processing system, and article manufacturing method

Info

Publication number
WO2017037931A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing
article
unit
trajectory
height
Prior art date
Application number
PCT/JP2015/075141
Other languages
English (en)
Japanese (ja)
Inventor
元文 木下
邦廣 平岡
康幸 池田
Original Assignee
株式会社安川電機
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社安川電機 filed Critical 株式会社安川電機
Priority to PCT/JP2015/075141 priority Critical patent/WO2017037931A1/fr
Priority to JP2017537168A priority patent/JP6531829B2/ja
Priority to CN201580082836.3A priority patent/CN107921632B/zh
Publication of WO2017037931A1 publication Critical patent/WO2017037931A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the present invention relates to a processing trajectory editing device, a robot, an article processing system, and an article manufacturing method.
  • Patent Document 1 discloses a system in which, in a coating step for an article, a robot is controlled based on a processing trajectory so that application processing is performed on the article along that trajectory.
  • The present invention has been made in view of the above, and an object thereof is to provide a processing trajectory editing apparatus, a robot, an article processing system, and an article manufacturing method capable of improving the user interface related to setting of a processing trajectory.
  • A processing trajectory editing apparatus includes a photographing unit that photographs an article, a display unit that displays a processing trajectory for performing processing on the article superimposed on a photographed image of the article, and an editing operation accepting unit that accepts an editing operation on the processing trajectory. The processing trajectory displayed on the display unit is associated with the physical coordinates of the article, and that processing trajectory, associated with the physical coordinates of the article, can be edited.
  • The processing trajectory editing apparatus may further include a processing trajectory generation unit that generates the processing trajectory based on the captured image.
  • the display unit may display the processing trajectory generated by the processing trajectory generation unit so as to overlap the captured image.
  • the editing operation accepting unit may accept an editing operation on the processing trajectory generated by the processing trajectory generating unit.
  • the processing for the article may be performed by moving a processing unit that performs the processing for the article along the processing trajectory.
  • The processing trajectory editing apparatus may further include a height measuring unit that measures the height of the article at points on the processing trajectory, and a height information generation unit that generates height information related to the height of the processing unit when the processing unit is moved along the processing trajectory.
  • the display unit may also display the height information.
  • the edit operation accepting unit may accept an edit operation for the height information.
  • When the processing trajectory is edited, the height measuring unit may measure the height of the article at points on the edited processing trajectory.
  • the height information generation unit may regenerate the height information based on a measurement result of the height measurement unit performed after editing the processing trajectory.
  • the height measuring unit may measure a height from a predetermined reference height as the height of the article at a point on the processing locus.
  • the height information generation unit may generate information indicating the height of the processing unit from the reference height as the height information.
  • the processing for the article may be performed by moving a processing unit that performs the processing for the article along the processing trajectory.
  • the processing trajectory editing apparatus may further include an inclination information generation unit that generates inclination information related to the inclination of the processing unit when the processing unit is moved along the processing trajectory.
  • the display unit may also display the tilt information.
  • the edit operation accepting unit may accept an edit operation for the tilt information.
  • the processing for the article may be performed by moving a processing unit that performs the processing for the article along the processing trajectory.
  • the processing trajectory editing apparatus may further include a rotation information generation unit that generates rotation information related to rotation of the processing unit when the processing unit is moved along the processing trajectory.
  • the display unit may also display the rotation information.
  • the edit operation accepting unit may accept an edit operation for the rotation information.
  • a selection operation receiving unit that receives a selection operation as to whether or not to rotate the processing unit when the processing unit is moved along the processing locus may be further included.
  • A robot according to the present invention includes a processing unit that performs processing on an article, and the processing unit is controlled so as to move along a processing trajectory edited by any of the processing trajectory editing apparatuses described above.
  • An article processing system includes a photographing unit that photographs an article; a display unit that displays a processing trajectory for processing the article, associated with the physical coordinates of the article, superimposed on a photographed image of the article; an editing operation receiving unit that receives an editing operation on the processing trajectory; a processing unit that performs the processing on the article; a moving mechanism that moves the processing unit; and a control unit that controls the moving mechanism so that the processing unit moves along the processing trajectory, causing the processing unit to perform the processing on the article.
  • An article manufacturing method includes a photographing step of photographing an article with a photographing unit, a step of displaying a processing trajectory for processing the article, associated with the physical coordinates of the article, superimposed on the photographed image of the article, a step of accepting an editing operation on the processing trajectory, and a step of performing the processing on the article along the edited processing trajectory.
  • According to the present invention, the user can edit the processing trajectory associated with the physical coordinates of an article, used when performing processing on the article, while viewing a captured image of the article.
  • the article processing system is a system in which processing (work) on an article is performed by a robot.
  • processing on an article includes, for example, processing (application, welding, etc.) or measurement (measurement of surface roughness, etc.) on the article.
  • FIG. 1 is a perspective view showing an example of the appearance of an article processing system according to an embodiment of the present invention.
  • A system for applying cream to the upper surface of a piece of bread is illustrated here.
  • a three-dimensional orthogonal coordinate system (XYZ coordinate system) including the Z axis with the vertical upward direction as the positive direction is illustrated.
  • the article processing system 1 includes a transport lane 10, a backlight 20, a photographing unit 30, and a robot 40.
  • the transport lane 10 transports the article 50 to be processed by the robot 40 to the work place PW.
  • the work place PW is a place where the processing for the article 50 is performed by the robot 40.
  • the article 50 is placed on the transport lane 10 with the processing surface, which is the surface to be processed by the robot 40, facing upward.
  • A substantially spindle-shaped piece of bread with an uneven surface is placed on the transport lane 10 as the article 50, with its upper surface (processing surface) facing upward.
  • The transport lane 10 also transports the article 50 away from the work place PW.
  • the backlight 20 is installed below the work place PW in the transport lane 10.
  • The imaging unit 30 is installed above the work place PW (that is, above the backlight 20) of the transport lane 10.
  • the photographing unit 30 is used for photographing the article 50 conveyed to the work place PW.
  • a jib crane is used to fix the imaging unit 30, but the fixing method of the imaging unit 30 is not limited to this example.
  • the imaging unit 30 may be fixed to the ceiling.
  • FIG. 2 is a diagram for explaining the transport lane 10, the backlight 20, and the photographing unit 30 in more detail.
  • FIG. 2 shows the article 50 as viewed in the state where it has been conveyed to the work place PW.
  • the article 50 is placed on the transport lane 10 with the processing surface 51 (upper surface in the example shown in FIG. 2) facing upward.
  • the portion of the transport lane 10 where the article 50 is placed is formed of a material that can transmit light, and light from the backlight 20 is transmitted. For this reason, the article 50 conveyed to the work place PW is illuminated by the light from the backlight 20, and the shadow of the article 50 illuminated by the backlight 20 is photographed by the photographing unit 30.
  • the robot 40 is a robot for performing processing on the article 50.
  • the robot 40 is disposed at a position facing the article 50 conveyed to the work place PW.
  • FIG. 3 is a diagram for explaining an example of the configuration of the robot 40.
  • the robot 40 is a so-called multi-joint type (multi-axis) robot.
  • a robot having at least six degrees of freedom is used as the robot 40.
  • The robot 40 includes a base portion 41, a turning base portion 42, a first arm portion 43, a second arm portion 44, a processing unit 45, and a height measuring unit 46.
  • the base part 41 is a support base part fixed to the floor surface or the like.
  • the turning base part 42 is connected to the base part 41.
  • The robot 40 includes a servo motor for turning the turning base portion 42 about the S axis, an axis corresponding to the normal direction of the surface to which the base portion 41 is fixed; the turning base portion 42 is thus connected to the base portion 41 so as to be able to turn about the S axis (see arrow A1 in FIG. 3).
  • the first arm portion 43 is connected to the turning base portion 42.
  • The robot 40 includes a servo motor for rotating the first arm portion 43 around the L axis, an axis substantially perpendicular to the S axis; the first arm portion 43 is connected to the turning base portion 42 so as to be rotatable around the L axis (see arrow A2 in FIG. 3).
  • the second arm portion 44 is connected to the distal end portion of the first arm portion 43.
  • The robot 40 includes a servo motor for rotating the second arm portion 44 around the U axis, an axis substantially parallel to the L axis (that is, substantially perpendicular to the S axis); the second arm portion 44 is connected to the distal end portion of the first arm portion 43 so as to be rotatable around the U axis (see FIG. 3).
  • the robot 40 also includes a servo motor for twisting the tip of the second arm 44 around the R axis, which is an axis corresponding to the extending direction (longitudinal direction) of the second arm 44.
  • the tip of the second arm portion 44 is provided to be able to twist around the R axis (see arrow A4 in FIG. 3).
  • the processing unit 45 performs processing on the article 50.
  • In this embodiment, the article 50 is a piece of bread, and the processing unit 45 is an application unit that applies cream to the bread.
  • the discharge port of the cream is provided at the tip of the processing unit 45 (application unit), and the processing unit 45 is configured to discharge the cream from the discharge port.
  • the processing unit 45 is connected to the tip of the second arm unit 44.
  • The robot 40 includes a servo motor for rotating the processing unit 45 about the B axis, an axis substantially perpendicular to the extending direction (longitudinal direction) of the processing unit 45; the processing unit 45 is connected to the tip of the second arm portion 44 so as to be rotatable about the B axis (see arrow A5 in FIG. 3).
  • The robot 40 also includes a servo motor for twisting the processing unit 45 about the T axis, an axis corresponding to the extending direction (longitudinal direction) of the processing unit 45; the processing unit 45 is thus provided so as to be able to twist about the T axis (see arrow A6 in FIG. 3).
  • the robot 40 plays a role as a moving mechanism that moves the processing unit 45.
  • a height measuring unit 46 is provided along with the processing unit 45 at the tip of the second arm unit 44.
  • The height measuring unit 46 measures the height of (that is, the distance to) an object.
  • the height measurement unit 46 is a height measurement sensor, and measures the height from the object by emitting light toward the object and receiving light reflected by the object.
  • the height measurement unit 46 is provided near the tip of the processing unit 45 and is used to measure the height from the object to the tip of the processing unit 45.
  • the article processing system 1 has two operation modes.
  • The first operation mode is an operation mode in which control data of the processing unit 45 for performing processing on the article 50 can be automatically generated, edited, and registered (hereinafter referred to as the “registration mode”).
  • In the registration mode, for example, the following data is registered as control data for the processing unit 45: the processing trajectory, height information related to the height of the processing unit 45 when moving the processing unit 45 along the processing trajectory, inclination information, and rotation information.
  • the second operation mode is an operation mode in which processing on the article 50 is performed by the robot 40 based on the registered control data (hereinafter referred to as “processing execution mode”).
  • the article processing system 1 first uses the registration mode to automatically generate, edit, and register control data of the processing unit 45 when performing processing for the article 50.
  • the processing for the article 50 is performed using the processing execution mode.
  • the article processing system 1 includes a processing trajectory editing device that is a device responsible for the registration mode.
  • the processing locus editing apparatus is a computer including, for example, a microprocessor, a main storage unit, an auxiliary storage unit, an operation unit (such as a mouse, a keyboard, or a touch panel), and a display unit.
  • the processing locus editing apparatus can control the transport lane 10, the backlight 20, the photographing unit 30, and the robot 40.
  • the article processing system 1 includes a robot control device that is a device responsible for the processing execution mode.
  • the robot control device is, for example, a computer including a microprocessor, a main storage unit, and an auxiliary storage unit.
  • the robot control device can control the transport lane 10, the backlight 20, the imaging unit 30, and the robot 40.
  • the robot control device and the processing trajectory editing device may be realized by a single computer.
  • FIG. 4 is a functional block diagram showing functions provided in the processing trajectory editing apparatus and the robot control apparatus.
  • The processing trajectory editing device 60 includes a processing trajectory generation unit 61, a height information generation unit 62, an inclination information generation unit 63, a rotation information generation unit 64, an operation reception unit 65, and a display unit 66.
  • the operation reception unit 65 includes an editing operation reception unit 65A and a selection operation reception unit 65B. These functional blocks (excluding the display unit 66) are realized by the microprocessor of the processing locus editing device 60.
  • the robot control device 70 includes a control unit 71.
  • The control unit 71 is realized by the microprocessor of the robot control device 70.
  • the article processing system 1 also includes a storage unit 80.
  • The storage unit 80 may be realized by the main storage unit or auxiliary storage unit of the processing trajectory editing device 60, by the main storage unit or auxiliary storage unit of the robot control device 70, or by the main storage unit or auxiliary storage unit of another device.
  • the registration mode is for automatically generating / editing / registering the control data of the processing unit 45 when the processing for the article 50 is performed.
  • the operator activates the registration mode by performing a predetermined operation using the operation unit of the processing trajectory editing device 60.
  • processing for automatically generating / editing / registering the control data of the processing unit 45 is executed by the processing trajectory editing device 60.
  • FIG. 5 is a flowchart showing an example of processing executed by the processing trajectory editing device 60 at this time.
  • the processing trajectory editing device 60 displays an automatic generation screen for automatically generating control data of the processing unit 45 on the display unit (S10).
  • FIG. 6 shows an example of the automatic generation screen.
  • the automatic generation screen 90 includes a plurality of forms for accepting designation of various types of information necessary for generating control data of the processing unit 45.
  • the automatic generation screen 90 includes an article form 91.
  • the article form 91 is a form for accepting designation of article identification information (for example, an article name or an article code) that uniquely identifies the article 50.
  • the automatic generation screen 90 includes a trajectory type form 92, a trajectory width form 93, a trajectory interval form 94, and an outer periphery offset form 95.
  • the trajectory type form 92 is a form for accepting designation of the type of processing trajectory.
  • a plurality of types of processing trajectories having different forms (shapes, etc.) are prepared, and the trajectory type form 92 can designate any one of the plurality of types of processing trajectories.
  • the trajectory width form 93 is a form for accepting designation of the width of the processing trajectory. “The width of the processing locus” means the thickness of the line of the processing locus.
  • the trajectory interval form 94 is a form for accepting designation of a processing trajectory interval.
  • The “interval of the processing trajectory” means how far apart one trajectory portion of the processing trajectory and the adjacent trajectory portion closest to it are set from each other.
  • the outer periphery offset form 95 is a form for accepting designation of an offset from the outer periphery of the article 50 to the processing locus. In other words, the outer periphery offset form 95 is a form for accepting designation of how much the processing locus is set from the outer periphery of the article 50.
  • In this embodiment, the operator specifies the type of cream application trajectory, the application width (the thickness of the applied cream), the application interval, and the outer periphery offset in these forms.
  • the automatic generation screen 90 includes a height form 96.
  • the height form 96 is a form for accepting designation of the height of the processing unit 45 when performing processing on the article 50.
  • the height form 96 is a form for accepting designation of the height of the processing unit 45 from the processing surface of the article 50 (that is, how far the processing unit 45 is separated from the processing surface of the article 50).
  • the automatic generation screen 90 includes a generation button 97 and a cancel button 98.
  • The generation button 97 is a button for accepting an instruction to automatically generate the control data of the processing unit 45.
  • the cancel button 98 is a button for accepting an instruction to stop the automatic generation of control data of the processing unit 45.
  • the worker selects the generation button 97 in a state where the article 50 is located at the work place PW.
  • When the generation button 97 is selected, the article 50 is photographed by the photographing unit 30 (S11). That is, the shadow of the article 50 illuminated by the backlight 20 is photographed by the photographing unit 30 under the control of the processing trajectory editing device 60. The captured image is supplied to the processing trajectory editing device 60.
  • After the execution of step S11, the processing trajectory generation unit 61 generates a processing trajectory for performing processing on the article 50, based on the image captured in step S11 (S12).
  • the processing locus generation unit 61 recognizes the outline of the processing surface 51 of the article 50 in the photographed image. As described above, since the shadow of the article 50 illuminated by the backlight 20 is reflected in the photographed image, the processing locus generation unit 61 recognizes the outline of the shadow as the outline of the processing surface 51 of the article 50.
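  • Although the disclosure does not name any image-processing library or algorithm for this outline recognition, the step can be pictured with a short sketch. The following Python example is a minimal, hypothetical illustration assuming an OpenCV-based silhouette extraction; the function name, the Otsu threshold, and the use of OpenCV itself are assumptions, not part of the patent.

```python
# Hypothetical sketch of the outline recognition in step S12, assuming OpenCV.
import cv2
import numpy as np

def extract_outline(captured_image: np.ndarray) -> np.ndarray:
    """Return the outline of the backlit shadow of the article 50.

    captured_image: grayscale image from the photographing unit 30, in which
    the article appears as a dark silhouette against the backlight 20.
    """
    # Separate the dark silhouette from the bright, backlit background.
    _, mask = cv2.threshold(captured_image, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Keep the largest contour, taken here as the processing surface 51.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)
    return outline.reshape(-1, 2)   # N x 2 array of (Sx, Sy) pixel coordinates
```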
  • FIG. 7A shows an example of the outline of the processing surface 51 of the article 50 in the photographed image.
  • FIG. 7B shows an example of the reference axis.
  • one end position P1 and the other end position P2 in the longitudinal direction of the article 50 are used as feature points.
  • The Lx axis, corresponding to the direction from the one end position P1 toward the other end position P2, the origin Lo at the midpoint between the one end position P1 and the other end position P2, and the Ly axis orthogonal to the Lx axis are set as the reference axes.
  • In the captured image 100, an SxSy coordinate system is also set, with the upper-left vertex as the origin So, the left-right direction as the Sx axis, and the up-down direction as the Sy axis, as shown in FIG. 7B. Further, in the article processing system 1, the correspondence between positions in the captured image 100 and positions on the transport lane 10 is stored by performing calibration in advance. That is, coordinates in the SxSy coordinate system are associated with coordinates (physical coordinates) in real space.
  • Since the coordinates in the LxLy coordinate system are associated with the coordinates in the SxSy coordinate system, the coordinates in the LxLy coordinate system are also associated with coordinates (physical coordinates) in real space. For this reason, the article processing system 1 can determine which coordinate in real space a coordinate in the LxLy or SxSy coordinate system corresponds to.
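  • The chain of coordinate systems described above (LxLy reference axes derived from the feature points P1 and P2, the SxSy image coordinates, and the calibrated physical coordinates on the transport lane 10) can be sketched as follows. This is an illustrative Python example; the choice of P1 and P2 as the two farthest-apart outline points and the affine form of the calibration are assumptions.

```python
# Illustrative sketch of the coordinate chain LxLy -> SxSy -> physical
# coordinates. The affine calibration model (calib_A, calib_b) is an assumption.
import numpy as np

def build_lxly_frame(outline_sxsy: np.ndarray):
    """Derive the origin Lo and unit axes Lx, Ly from the outline (SxSy pixels)."""
    # Feature points P1 and P2: taken here as the two outline points farthest
    # apart, i.e. the ends of the article in its longitudinal direction.
    d = np.linalg.norm(outline_sxsy[:, None, :] - outline_sxsy[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    p1, p2 = outline_sxsy[i].astype(float), outline_sxsy[j].astype(float)
    origin = (p1 + p2) / 2.0                      # Lo: midpoint of P1 and P2
    lx = (p2 - p1) / np.linalg.norm(p2 - p1)      # Lx: direction from P1 toward P2
    ly = np.array([-lx[1], lx[0]])                # Ly: orthogonal to Lx
    return origin, lx, ly

def lxly_to_physical(pt_lxly, origin, lx, ly, calib_A, calib_b):
    """Map an LxLy point to physical coordinates via the SxSy image frame.

    calib_A (2x2) and calib_b (2,) represent the calibration, performed in
    advance, that associates pixel positions with positions on the transport lane 10.
    """
    pt_sxsy = origin + pt_lxly[0] * lx + pt_lxly[1] * ly
    return calib_A @ pt_sxsy + calib_b
```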
  • Based on the outline 101 of the article 50 obtained as described above, the processing trajectory generation unit 61 generates a processing trajectory for performing processing on the article 50.
  • FIG. 7C shows an example of the processing locus generated in step S12.
  • the processing trajectory 110 is indicated by a dotted line.
  • the processing trajectory 110 is generated in the outline 101 of the article 50 in the captured image 100. That is, the processing trajectory is generated on a two-dimensional plane.
  • the processing trajectory 110 is a one-stroke trajectory from the start point PS to the end point PE.
  • The start point PS is set at a predetermined position inside the outline 101 of the article 50, and the processing trajectory 110 starting from the start point PS is generated based on the type and values specified in the trajectory type form 92, the trajectory width form 93, the trajectory interval form 94, and the outer periphery offset form 95 of the automatic generation screen 90.
  • the processing trajectory 110 is indicated by the LxLy coordinate system.
  • the coordinates in the LxLy coordinate system are associated with the coordinates (physical coordinates) in the real space.
  • Therefore, the processing trajectory 110 is associated with real-space coordinates (that is, with the physical coordinates of the article 50), and the article processing system 1 can determine which position in real space each point on the processing trajectory 110 corresponds to.
  • The processing trajectory 110 illustrated in FIG. 7C is only an example of the trajectory generated in step S12; various other processing trajectories may be generated in step S12.
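  • The patent does not disclose the generation algorithm itself, only that a one-stroke trajectory is produced inside the outline 101 from the trajectory type, width, interval, and outer periphery offset. As an illustration only, a simple zigzag fill could look like the following sketch (shapely is an assumed dependency and the scan-line approach is one possible choice, not the disclosed method).

```python
# Hypothetical zigzag ("one-stroke") trajectory generation for step S12.
import numpy as np
from shapely.geometry import Polygon, LineString

def generate_zigzag_trajectory(outline_lxly, interval, outer_offset):
    """Return a one-stroke list of (Lx, Ly) points covering the processing surface."""
    region = Polygon(outline_lxly).buffer(-outer_offset)   # inset by the outer periphery offset
    if region.is_empty:
        return []
    min_x, min_y, max_x, max_y = region.bounds
    trajectory, left_to_right = [], True
    for y in np.arange(min_y, max_y, interval):
        scan = LineString([(min_x - 1.0, y), (max_x + 1.0, y)]).intersection(region)
        if scan.is_empty:
            continue
        parts = [scan] if scan.geom_type == "LineString" else list(getattr(scan, "geoms", []))
        xs = [x for part in parts if part.geom_type == "LineString" for x, _ in part.coords]
        if not xs:
            continue
        row = [(min(xs), y), (max(xs), y)]            # one pass across this scan line
        trajectory.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right             # alternate direction: one stroke
    return trajectory   # trajectory[0] is the start point PS, trajectory[-1] the end point PE
```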
  • After step S12, measurement by the height measuring unit 46 is performed (S13).
  • the processing trajectory editing device 60 controls the robot 40 to move the processing unit 45 on the article 50 along the processing trajectory generated in step S12.
  • During this movement, the height (position in the Z-axis direction) of the processing unit 45 is kept constant, and the orientation of the processing unit 45 is maintained in a predetermined state (for example, a state in which the cream discharge port faces downward).
  • Since the processing unit 45 is moved here only for the purpose of measurement by the height measuring unit 46, the processing (application of cream) by the processing unit 45 is not performed. While the processing unit 45 is moved in this way, measurement by the height measuring unit 46 is performed: the height measuring unit 46 measures the height of the processing surface 51 of the article 50 at points on the processing trajectory.
  • the measurement result obtained by the height measuring unit 46 is supplied to the processing trajectory editing device 60.
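  • The measurement pass of step S13 can be pictured with the following hypothetical sketch; the `robot` object and its methods are assumed interfaces, not part of the disclosure.

```python
# Hypothetical sketch of step S13: move the processing unit 45 along the
# trajectory at a constant height Ha and record the sensor reading Hb at each point.
def measure_surface_heights(robot, trajectory_physical, measuring_height_ha):
    readings = []
    for (x, y) in trajectory_physical:
        # Keep the height and orientation of the processing unit constant.
        robot.move_processing_unit(x=x, y=y, z=measuring_height_ha,
                                   tilt=0.0, rotation=0.0)
        readings.append(robot.height_sensor.read())   # Hb: distance down to the processing surface 51
    return readings
```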
  • After the execution of step S13, the height information generation unit 62 generates height information related to the height of the processing unit 45 when processing is performed along the processing trajectory generated in step S12 (S14). This height information is generated based on the measurement results obtained in step S13.
  • Assume that the height of the processing unit 45 (height measuring unit 46) during the measurement by the height measuring unit 46 is “Ha”, and that the measurement result obtained by the height measuring unit 46 at a certain point on the processing trajectory is “Hb”. “Ha” indicates the height from the reference surface (for example, the transport lane 10) to the processing unit 45, and “Hb” indicates the height from the processing surface 51 of the article 50 at that point on the processing trajectory to the processing unit 45. From these values, the height information (He) of the processing unit 45 for processing that point is generated.
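  • The relations referred to later as formulas (1) and (2) do not survive in this text. A plausible reconstruction from the definitions of Ha and Hb is shown below, where Hc (the height of the processing surface 51 above the reference surface) and Hd (the clearance from the processing surface specified on the height form 96) are editorial symbol names, not taken from the patent.

```latex
H_c = H_a - H_b                        \quad (1)
H_e = H_c + H_d = (H_a - H_b) + H_d    \quad (2)
```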
  • After the execution of step S14, the inclination information generation unit 63 generates inclination information related to the inclination of the processing unit 45 when processing is performed along the processing trajectory generated in step S12 (S15).
  • the inclination of the processing unit 45 is, for example, the inclination of the processing unit 45 with respect to the normal direction (or vertical direction) of the processing surface 51 of the article 50.
  • This inclination is represented, for example, by an angle between the normal direction (or vertical direction) of the processing surface 51 of the article 50 and the extending direction of the processing unit 45, and this angle corresponds to an example of “inclination information”.
  • the inclination information generation unit 63 generates inclination information by setting the inclination of the processing unit 45 at each point on the processing locus to a predetermined value.
  • For example, the inclination information generation unit 63 may set the inclination of the processing unit 45 at every point to a predetermined constant value (for example, zero), or may set the inclination of the processing unit 45 at each point to a value defined for each point or each section on the processing trajectory.
  • the inclination information generation unit 63 may generate inclination information by determining the inclination of the processing unit 45 at each point on the processing trajectory based on the measurement result obtained in step S13. In this way, the inclination of the processing unit 45 at each point may be set in accordance with a change in height on the processing locus (that is, unevenness on the processing locus).
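  • As one possible reading of setting the inclination in accordance with the unevenness along the trajectory, the following sketch derives a tilt angle at each point from the measured height profile; the exact rule is left open by the patent and this is only an illustration.

```python
# Illustrative derivation of inclination information from the height profile
# measured in step S13: tilt the processing unit 45 to follow the local slope.
import math

def generate_tilt_info(trajectory_xy, surface_heights):
    """Return a tilt angle (degrees from vertical) for every trajectory point."""
    tilts = []
    for i in range(len(trajectory_xy)):
        j = min(i + 1, len(trajectory_xy) - 1)
        k = max(i - 1, 0)
        dx = trajectory_xy[j][0] - trajectory_xy[k][0]
        dy = trajectory_xy[j][1] - trajectory_xy[k][1]
        dz = surface_heights[j] - surface_heights[k]
        run = math.hypot(dx, dy)
        # Angle between the local surface normal and the vertical direction.
        tilts.append(math.degrees(math.atan2(dz, run)) if run > 0 else 0.0)
    return tilts
```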
  • After the execution of step S15, the rotation information generation unit 64 generates rotation information related to the rotation of the processing unit 45 when processing is performed along the processing trajectory generated in step S12 (S16).
  • the “rotation of the processing unit 45” is, for example, rotation (twisting) of the processing unit 45 around the T axis.
  • This rotation is represented, for example, by a rotation angle from the standard state, and this rotation angle corresponds to an example of “rotation information”.
  • the rotation information generation unit 64 generates rotation information by setting the rotation angle of the processing unit 45 at each point on the processing trajectory to a predetermined value.
  • For example, the rotation information generation unit 64 may set the rotation angle of the processing unit 45 at every point to a predetermined constant value (for example, zero), or may set the rotation angle of the processing unit 45 at each point to a value determined for each point or each section on the processing trajectory.
  • Alternatively, the rotation information generation unit 64 may generate the rotation information by determining the rotation angle of the processing unit 45 at each point on the processing trajectory based on the measurement results obtained in step S13. In this way, the rotation angle of the processing unit 45 at each point may be set in accordance with the change in height along the processing trajectory (that is, the unevenness along the processing trajectory).
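  • The rotation information of step S16 has the same shape as the inclination information; a minimal sketch, assuming a constant twist angle about the T axis per point, is:

```python
# Minimal sketch of step S16: one rotation angle (about the T axis) per teaching
# point. A height-dependent rule, as the text allows, would mirror
# generate_tilt_info above.
def generate_rotation_info(trajectory_xy, angle_per_point=0.0):
    """Return a rotation angle (degrees about the T axis) for every point."""
    return [angle_per_point] * len(trajectory_xy)
```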
  • After step S16, the operation receiving unit 65 displays on the display unit 66 an editing screen for editing the processing trajectory, height information, inclination information, and rotation information generated in steps S12 and S14 to S16 (S17).
  • Fig. 8 shows an example of the edit screen.
  • the processing trajectory 110 generated in step S12 is displayed so as to be superimposed on the captured image 100 of the article 50.
  • the editing operation accepting unit 65A accepts an editing operation for the processing locus 110 displayed on the editing screen 120.
  • a plurality of teaching points 121 are displayed on the processing locus 110 on the editing screen 120. These teaching points 121 are set at predetermined intervals, for example.
  • a cursor 122 is displayed on the editing screen 120. The cursor 122 moves in the captured image 100 in accordance with an operation using an operation unit (such as a mouse) performed by an operator.
  • the operator can change the processing locus 110 by moving the teaching point 121.
  • the operator selects a teaching point 121 to be edited by using the cursor 122 and moves the teaching point 121 by a drag and drop operation.
  • the processing locus 110 is changed by moving the teaching point 121.
  • When a teaching point A is moved, it moves while maintaining its connections with the adjacent teaching points B and C. For this reason, as the teaching point A moves, the trajectory portion connecting teaching points A and B and the trajectory portion connecting teaching points A and C are changed.
  • a rectangular object 123 is displayed on the editing screen 120.
  • An operator can move a plurality of teaching points 121 together using the object 123. Specifically, when the operator drags the object 123 so that it overlaps with the plurality of teaching points 121, the plurality of teaching points 121 move so as to be pushed by the object 123. In this case, the processing trajectory 110 is changed by moving the plurality of teaching points 121.
  • the shape of the object 123 is not limited to a rectangle, and may be another shape (for example, a circle). Further, the operator may be able to change the shape or size of the object 123. That is, the shape or size of the object 123 may be changed according to the changing operation of the operator.
  • the operator may be able to delete the teaching point 121.
  • When a teaching point A is deleted, the teaching points B and C adjacent to it become directly connected. That is, along with the deletion of the teaching point A, the teaching point B and the teaching point C are newly connected to each other.
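  • The three editing operations described above (dragging a teaching point, pushing several teaching points with the rectangular object 123, and deleting a teaching point) can be summarised with the following minimal sketch, in which the trajectory is held as an ordered list of teaching points so that neighbouring points remain connected automatically; all function names and the push-by-translation behaviour are assumptions.

```python
# Minimal sketch of the teaching-point edits accepted by the editing screen 120.
def move_teaching_point(points, index, new_xy):
    """Drag and drop: teaching point A moves while its links to B and C are kept."""
    points[index] = new_xy

def push_teaching_points(points, rect_min, rect_max, push_vector):
    """Move every teaching point currently covered by the rectangular object 123."""
    (x0, y0), (x1, y1) = rect_min, rect_max
    for i, (x, y) in enumerate(points):
        if x0 <= x <= x1 and y0 <= y <= y1:
            points[i] = (x + push_vector[0], y + push_vector[1])

def delete_teaching_point(points, index):
    """Deleting teaching point A directly connects its neighbours B and C."""
    del points[index]
```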
  • the edit screen 120 includes coordinate forms 126 and 127.
  • the coordinate forms 126 and 127 are forms for accepting editing with respect to the coordinates of the teaching point 121.
  • the coordinate value of the teaching point 121 is displayed on the coordinate forms 126 and 127.
  • the operator can also move the teaching point 121 by editing the coordinate values displayed on the coordinate forms 126 and 127.
  • the processing locus 110 is changed by the movement of the teaching point 121.
  • the coordinate value of the SxSy coordinate system may be displayed as the coordinate value of the teaching point 121, or the coordinate value of the LxLy coordinate system may be displayed.
  • the editing screen 120 includes a height form 128, a tilt form 129, and a rotating form 130. These forms are forms for accepting edits to height information, tilt information, and rotation information.
  • the height information, the tilt information, and the rotation information of the processing unit 45 at the teaching point 121 are the height form 128, the tilt form 129, and the rotation form 130, respectively. Is displayed.
  • the editing operation accepting unit 65A accepts editing operations for the height information, the tilt information, and the rotation information displayed on these forms.
  • the height information displayed on the height form 128 is the height information generated in step S14.
  • the inclination information displayed on the inclination form 129 is the inclination information generated in step S15
  • the rotation information displayed on the rotation form 130 is the rotation information generated in step S16.
  • The height form 128 may display the height from the processing surface 51 of the article 50, as in the height form 96 of the automatic generation screen 90. In this way, the operator may be able to specify the height information of the processing unit 45 as a height from the processing surface 51 of the article 50. In this case, when the value of the height form 128 is edited by the operator, the height information (He) of the processing unit 45 is set based on the edited value using the above formulas (1) and (2). Note that the height form 128 may instead display the height from the transport lane 10.
  • the edit screen 120 includes a regeneration button 131.
  • The regeneration button 131 is a button for regenerating the height information, inclination information, and rotation information of the processing unit 45. For example, when the operator selects the regeneration button 131 after editing the processing trajectory 110, steps S13 to S16 are re-executed based on the edited processing trajectory 110, and the height information, inclination information, and rotation information are regenerated.
  • the edit screen 120 includes a rotation presence / absence form 125.
  • the rotation presence / absence form 125 is for accepting selection as to whether or not to rotate the processing unit 45 when performing processing along the processing locus.
  • the “rotation of the processing unit 45” is, for example, rotation (twisting) of the processing unit 45 around the T axis.
  • the selection operation accepting unit 65B accepts, via the rotation presence / absence form 125, a selection as to whether or not to rotate the processing unit 45 when performing processing along the processing trajectory.
  • By checking the rotation presence/absence form 125, the worker can set the processing unit 45 to rotate when processing is performed along the processing trajectory; by leaving the form unchecked, the worker can set the processing unit 45 not to rotate during the processing. Note that when the rotation presence/absence form 125 is not checked, the rotation information edited on the rotation form 130 is ignored.
  • the edit screen 120 includes an article form 124.
  • the operator can change the item identification information displayed on the item form 124.
  • the edit screen 120 includes a registration button 132 and a cancel button 133.
  • the cancel button 133 is a button for canceling registration of control data in the processing unit 45.
  • the registration button 132 is a button for executing control data registration of the processing unit 45.
  • When the registration button 132 is selected, the processing trajectory editing device 60 registers in the storage unit 80 the processing trajectory, height information, inclination information, and rotation information edited on the editing screen 120 as the control data of the processing unit 45 for performing processing on the article 50 (S18). The control data is stored in association with the article identification information specified on the article form 91 of the automatic generation screen 90 (or on the article form 124 of the editing screen 120). The trajectory width information specified on the trajectory width form 93 of the automatic generation screen 90 and the rotation presence/absence information of the processing unit 45 specified on the rotation presence/absence form 125 of the editing screen 120 are also registered in the storage unit 80 as part of the control data. The control data registered in this way is used in the processing execution mode described below.
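  • The control data registered in step S18 can be pictured as a record stored under the article identification information, for example as in the following sketch; the field names and the use of a Python dataclass are assumptions.

```python
# Illustrative sketch of the control data held in the storage unit 80.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ControlData:
    trajectory: List[Tuple[float, float]]    # teaching points in LxLy coordinates
    heights: List[float]                     # height information He per point
    tilts: List[float]                       # inclination information per point
    rotations: List[float]                   # rotation information per point
    trajectory_width: float                  # value of the trajectory width form 93
    rotate_enabled: bool                     # value of the rotation presence/absence form 125

storage_unit_80: Dict[str, ControlData] = {}

def register_control_data(article_id: str, data: ControlData) -> None:
    """Step S18: store the edited control data under the article identification information."""
    storage_unit_80[article_id] = data
```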
  • the processing execution mode is an operation mode in which the robot 40 performs processing on the article 50 based on the control data registered in the registration mode.
  • the worker activates the process execution mode by performing a predetermined operation using the operation unit of the robot control device 70.
  • When the processing execution mode is activated, a screen (not shown) for specifying the article identification information of the article to be processed is displayed, and after the article identification information is specified, the processing for the article 50 is performed.
  • FIG. 9 is a flowchart showing an example of control executed by the robot control device 70 at this time.
  • the article 50 conveyed to the work place PW is photographed by the photographing unit 30 (S20). That is, the shadow of the article 50 illuminated by the backlight 20 is photographed by the photographing unit 30 based on the control by the robot control device 70.
  • the captured image of the imaging unit 30 is supplied to the robot control device 70.
  • After step S20, the robot control device 70 recognizes the outline of the processing surface of the article 50 in the captured image (S21). This process is the same as the process executed in step S12 of FIG. 5.
  • After the execution of step S21, the control unit 71 acquires the processing trajectory, height information, inclination information, rotation information, and so on used when processing the article 50 (S22).
  • Specifically, the control unit 71 acquires from the storage unit 80 the control data of the processing unit 45 stored in association with the article identification information of the article 50. Since this control data includes the processing trajectory, height information, inclination information, and rotation information, these are acquired as the processing trajectory, height information, inclination information, and rotation information for performing processing on the article 50. The control data also includes the trajectory width information and the rotation presence/absence information of the processing unit 45, so these are acquired in step S22 as well.
  • After the execution of step S22, the control unit 71 performs processing on the article 50 based on the processing trajectory, height information, inclination information, rotation information, and so on acquired in step S22 (S23).
  • control unit 71 controls the processing unit 45 to perform processing on the article 50 by controlling the robot 40 so that the processing unit 45 moves according to the processing trajectory acquired in step S22.
  • the processing trajectory acquired in step S22 is applied so as to match the LxLy coordinate system of the outline 101 recognized in step S21.
  • the control unit 71 controls the robot 40 so that the processing unit 45 applies the cream to the bread along the processing trajectory. At this time, the control unit 71 sets the application width of the cream to the locus width set in step S22.
  • The control unit 71 also controls the robot 40 so that the height of the processing unit 45, as it moves along the processing trajectory acquired in step S22, becomes the height corresponding to the height information acquired in step S22.
  • Similarly, the control unit 71 controls the robot 40 (for example, the rotation of the processing unit 45 around the B axis) so that the inclination of the processing unit 45, as it moves along the processing trajectory acquired in step S22, follows the inclination information acquired in step S22.
  • control unit 71 determines whether or not to rotate the processing unit 45 based on the rotation presence / absence information of the processing unit 45 acquired in step S22.
  • When the processing unit 45 is to be rotated, the control unit 71 controls the robot 40 (for example, the rotation of the processing unit 45 around the T axis) based on the rotation information acquired in step S22, so that the processing unit 45 rotates as it moves along the processing trajectory acquired in step S22.
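  • Putting steps S20 to S23 together, the execution loop can be sketched as follows, reusing the hypothetical helpers from the sketches above (extract_outline, build_lxly_frame, lxly_to_physical, storage_unit_80); the `robot` interface and its calibration attributes are likewise assumptions.

```python
# Hypothetical sketch of the processing execution mode (steps S20 to S23).
def execute_processing(article_id, captured_image, robot, storage=None):
    storage = storage if storage is not None else storage_unit_80
    outline = extract_outline(captured_image)            # S21: recognise the outline 101
    origin, lx, ly = build_lxly_frame(outline)           # LxLy frame of the article on the lane
    data = storage[article_id]                           # S22: acquire the registered control data
    robot.set_application_width(data.trajectory_width)
    for point, height, tilt, rotation in zip(data.trajectory, data.heights,
                                             data.tilts, data.rotations):
        # Map the stored LxLy teaching point onto the physical coordinates of this article.
        target = lxly_to_physical(point, origin, lx, ly, robot.calib_A, robot.calib_b)
        robot.move_processing_unit(x=target[0], y=target[1], z=height, tilt=tilt,
                                   rotation=rotation if data.rotate_enabled else 0.0)
        robot.apply()                                     # S23: perform the processing (apply the cream)
```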
  • In this way, step S23 is completed, and the processing for the article 50 is completed. When the processing for the article 50 is completed, the article 50 is transported away from the work place PW. Then, a new article 50 is conveyed to the work place PW, and the processing for the new article 50 is performed. In this way, processing is performed on the articles 50 one after another, and processed articles 50 are manufactured.
  • As described above, in the article processing system 1, the processing trajectory 110 associated with the physical coordinates of the article 50 is displayed superimposed on the captured image 100 of the article 50, and the processing trajectory 110 displayed in this way can be edited by the operator. For this reason, according to the article processing system 1, the operator can edit the processing trajectory 110 associated with the physical coordinates of the article 50 while viewing the captured image 100 of the article 50. As a result, the operability related to the setting of the processing trajectory can be improved.
  • Further, in the article processing system 1, the processing trajectory 110 associated with the physical coordinates of the article 50 is automatically generated (see step S12 in FIG. 5), and the operator can edit the automatically generated processing trajectory 110. For this reason, according to the article processing system 1, the time and effort required to specify the processing trajectory 110 can be reduced, while still ensuring that the operator can change the processing trajectory 110 as necessary.
  • In the article processing system 1, the height information of the processing unit 45 for performing processing on the article 50 is automatically generated (see step S14 in FIG. 5), and the automatically generated height information can be edited on the editing screen 120. For this reason, according to the article processing system 1, the time and effort required to specify the height information of the processing unit 45 can be reduced, while still ensuring that the operator can change the height information as necessary.
  • Likewise, the inclination information of the processing unit 45 for performing processing on the article 50 is automatically generated (see step S15 in FIG. 5), and the operator can edit the automatically generated inclination information on the editing screen 120. For this reason, according to the article processing system 1, the time and effort required to specify the inclination information of the processing unit 45 can be reduced, while still ensuring that the operator can change the inclination information as necessary.
  • The rotation information of the processing unit 45 for performing processing on the article 50 is also automatically generated (see step S16 in FIG. 5), and the operator can edit the automatically generated rotation information on the editing screen 120. For this reason, according to the article processing system 1, the time and effort required to specify the rotation information of the processing unit 45 can be reduced, while still ensuring that the operator can change the rotation information as necessary.
  • In the article processing system 1, the operator can select whether or not the processing unit 45 rotates when processing is performed on the article 50.
  • In the article processing system 1, the height information, inclination information, and rotation information of the processing unit 45 are regenerated (updated) based on the edited processing trajectory 110. For this reason, according to the article processing system 1, when the operator edits the processing trajectory 110, the trouble of having to edit the height information, inclination information, and rotation information of the processing unit 45 to match the edited processing trajectory 110 can be eliminated.
  • the present invention is not limited to the embodiment described above.
  • The operator may be able to specify the number of teaching points 121 set on the processing trajectory 110.
  • the interval between the teaching points 121 may be set according to the number of teaching points 121.
  • For example, step S23 of FIG. 9 may be executed together with step S18 (or instead of step S18).
  • the processing for the article 50 may be immediately performed based on the edited processing locus or the like.
  • the robot 40 is used as a moving mechanism for moving the processing unit 45, but another mechanism may be used as the moving mechanism.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

A processing trajectory editing device (60) comprises a photographing unit (30), a display unit (66), and an editing operation accepting unit (65A). The photographing unit (30) photographs an article. The display unit (66) displays a processing trajectory for processing the article superimposed on a photographed image of the article. The editing operation accepting unit (65A) accepts editing operations on the processing trajectory. The processing trajectory displayed by the display unit (66) is associated with the physical coordinates of the article, and the processing trajectory editing device (60) makes it possible to edit the processing trajectory that has been associated with the physical coordinates of the article.
PCT/JP2015/075141 2015-09-03 2015-09-03 Dispositif de modification de trajectoire de traitement, robot, système de traitement d'article, et procédé de production d'article WO2017037931A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2015/075141 WO2017037931A1 (fr) 2015-09-03 2015-09-03 Dispositif de modification de trajectoire de traitement, robot, système de traitement d'article, et procédé de production d'article
JP2017537168A JP6531829B2 (ja) 2015-09-03 2015-09-03 処理軌跡編集装置、ロボット、物品処理システム、及び物品製造方法
CN201580082836.3A CN107921632B (zh) 2015-09-03 2015-09-03 处理轨迹编辑装置、机器人、物品处理***以及物品制造方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/075141 WO2017037931A1 (fr) 2015-09-03 2015-09-03 Dispositif de modification de trajectoire de traitement, robot, système de traitement d'article, et procédé de production d'article

Publications (1)

Publication Number Publication Date
WO2017037931A1 true WO2017037931A1 (fr) 2017-03-09

Family

ID=58186862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/075141 WO2017037931A1 (fr) 2015-09-03 2015-09-03 Dispositif de modification de trajectoire de traitement, robot, système de traitement d'article, et procédé de production d'article

Country Status (3)

Country Link
JP (1) JP6531829B2 (fr)
CN (1) CN107921632B (fr)
WO (1) WO2017037931A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114055456A (zh) * 2020-07-29 2022-02-18 山东若比邻机器人股份有限公司 加工设备及其控制方法、电子设备及计算机可读存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2081930A (en) * 1980-08-13 1982-02-24 Wickman Automated Assembly Ltd Work handling apparatus
JPH06269719A (ja) * 1993-03-16 1994-09-27 Citizen Watch Co Ltd 粘性材料塗布装置
WO1997010080A1 (fr) * 1995-09-14 1997-03-20 Kabushiki Kaisha Yaskawa Denki Unite d'enseignement pour robots
JPH10202161A (ja) * 1997-01-17 1998-08-04 Pfu Ltd ロボットによる液体精密塗布方法および液体塗布装置
JPH11320475A (ja) * 1998-05-19 1999-11-24 Daihatsu Motor Co Ltd ロボットアーム
JP4190698B2 (ja) * 2000-04-24 2008-12-03 セイコーインスツル株式会社 部品組立装置
DE10048749A1 (de) * 2000-09-29 2002-04-11 Josef Schucker Anordnung zum Aufbringen von Klebstoff auf ein Werkstück
KR100431644B1 (ko) * 2001-10-18 2004-05-17 한국과학기술원 그래픽유저인터페이스를 구비한 힐사이드라스팅시스템
JP2003159009A (ja) * 2001-11-28 2003-06-03 Seiko Epson Corp 食品加工方法及び食品加工装置
JP4014514B2 (ja) * 2003-02-21 2007-11-28 本田技研工業株式会社 保護層形成材の塗布装置
JP2004268153A (ja) * 2003-03-05 2004-09-30 Sankyo Seiki Mfg Co Ltd 産業用ロボット
JP2011110621A (ja) * 2009-11-24 2011-06-09 Toyota Industries Corp ロボットの教示データを作成する方法およびロボット教示システム
JP5549749B1 (ja) * 2013-01-16 2014-07-16 株式会社安川電機 ロボット教示システム、ロボット教示プログラムの生成方法および教示ツール

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61279480A (ja) * 1985-06-04 1986-12-10 株式会社不二越 ロボツトの作業点教示方法
JPH07308878A (ja) * 1994-03-16 1995-11-28 Tokico Ltd ロボットの教示装置
JPH09174468A (ja) * 1995-12-26 1997-07-08 Nec Corp ロボット教示装置
JPH11207670A (ja) * 1998-01-21 1999-08-03 Kawasaki Heavy Ind Ltd 産業ロボットのティーチング方法とその装置
JP2001060108A (ja) * 1999-06-18 2001-03-06 Agency Of Ind Science & Technol ロボット動作教示装置および動作教示方法
JP2014083610A (ja) * 2012-10-19 2014-05-12 Yaskawa Electric Corp ロボットシステムおよび加工品の製造方法

Also Published As

Publication number Publication date
JPWO2017037931A1 (ja) 2018-04-26
JP6531829B2 (ja) 2019-06-19
CN107921632A (zh) 2018-04-17
CN107921632B (zh) 2021-09-24

Similar Documents

Publication Publication Date Title
JP7490349B2 (ja) 入力装置、入力装置の制御方法、ロボットシステム、ロボットシステムを用いた物品の製造方法、制御プログラム及び記録媒体
US9311608B2 (en) Teaching system and teaching method
JP4844453B2 (ja) ロボットの教示装置及び教示方法
US11173601B2 (en) Teaching device for performing robot teaching operations and teaching method
JP5154616B2 (ja) オフラインティーチング方法
CN110977931A (zh) 使用了增强现实和混合现实的机器人控制装置及显示装置
JP6311421B2 (ja) ティーチングシステム、ロボットシステムおよびティーチング方法
JP5815761B2 (ja) 視覚センサのデータ作成システム及び検出シミュレーションシステム
US9186792B2 (en) Teaching system, teaching method and robot system
KR20180038479A (ko) 로봇시스템
US20170087717A1 (en) Offline teaching device
EP3354418B1 (fr) Procédé et dispositif de commande de robot
US20190077016A1 (en) Programming device for welding robot and programming method for welding robot
JP7155516B2 (ja) 建設機械
WO2017037931A1 (fr) Dispositif de modification de trajectoire de traitement, robot, système de traitement d'article, et procédé de production d'article
JP5813931B2 (ja) 教示データの修正システム
JP2020056277A (ja) 建築作業装置および建築作業方法
JP7120894B2 (ja) 3次元モデル作成装置、加工シミュレーション装置、工具経路自動生成装置
JP5978890B2 (ja) ロボットの動作プログラム修正装置
JP2015100874A (ja) ロボットシステム
JP2006072673A (ja) 溶接ロボットのポジショナ設定方法
JP2019115950A (ja) ロボット制御装置、ロボットおよびロボットシステム
JP2017113815A (ja) 対象物をロボットを用いて把持するロボットシステムの画像表示方法
JP2019081236A (ja) シミュレーション装置、制御装置およびロボット
US20210173546A1 (en) Information processing device and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15903050

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017537168

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15903050

Country of ref document: EP

Kind code of ref document: A1