CN117798956A - Operating device, robot system, manufacturing method, control method, and recording medium - Google Patents


Info

Publication number
CN117798956A
Authority
CN
China
Prior art keywords
robot
state
virtual
operated
operation device
Prior art date
Legal status
Pending
Application number
CN202311243881.5A
Other languages
Chinese (zh)
Inventor
今村成吾 (Imamura Seigo)
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Priority claimed from JP2023110814A (published as JP2024052515A)
Application filed by Canon Inc filed Critical Canon Inc
Publication of CN117798956A

Landscapes

  • Manipulator (AREA)

Abstract

The present disclosure relates to an operating device, a robot system, a manufacturing method, a control method, and a recording medium. The operating device is configured to communicate with a robot controller that controls a robot, acquire the pressed state of an enable switch, display a virtual robot corresponding to the robot in a virtual environment, and operate the robot or the virtual robot through operation of the operating device. The operating device includes a determination unit configured to determine the pressed state of the enable switch, and a processing unit configured to issue a switching instruction to switch between a state in which the robot is operated and a state in which the virtual robot is operated, according to the pressed state of the enable switch determined by the determination unit.

Description

Operating device, robot system, manufacturing method, control method, and recording medium
Technical Field
The present disclosure relates to an operating device.
Background
As hardware capabilities have improved, the number of teach pendants (TPs) equipped with a simulator function has increased. A TP with a simulator function needs to switch between a simulation mode and an actual machine mode. A mode change switch, such as a button on a graphical user interface (GUI) or a physical switch, is used to switch the TP between the simulation mode and the actual machine mode. For example, Japanese Patent Application Laid-Open No. 2001-255906 discusses a configuration in which a mode change switch for switching between a normal mode and a simulation mode is provided.
Disclosure of Invention
According to one aspect of the present disclosure, an operating device configured to communicate with a robot controller for controlling a robot, acquire a pressed state of an enable switch, display a virtual robot corresponding to the robot in a virtual environment, and operate the robot or the virtual robot by an operation of the operating device includes a determination unit configured to determine the pressed state of the enable switch, and a processing unit configured to issue a switching instruction to switch between a first state in which the robot is operated and a second state in which the virtual robot is operated, according to the pressed state of the enable switch determined by the determination unit.
Other features of the present disclosure will become apparent from the following description of the embodiments with reference to the accompanying drawings.
Drawings
Fig. 1 is a diagram illustrating a robotic system.
Fig. 2 is a configuration diagram illustrating a teach pendant.
Fig. 3 is a diagram illustrating a modified example of the teach pendant.
Fig. 4 is a diagram illustrating a relationship between a robot model and an actual robot.
Fig. 5 is a block diagram illustrating a control unit.
Fig. 6 is a block diagram illustrating a teach pendant.
Fig. 7 is a flowchart illustrating an operation flow of the robot.
Figs. 8A and 8B are, respectively, a diagram illustrating a teach pendant provided with a mode change switch and a flowchart illustrating its operation flow.
Fig. 9 is a diagram illustrating a state when the user presses the enable switch.
Fig. 10 is a diagram illustrating a robot before and after a posture change.
Fig. 11 is a schematic diagram illustrating operation confirmation by the operation confirmation unit.
Fig. 12 is a schematic diagram illustrating voice confirmation.
Detailed Description
In the configuration discussed in Japanese Patent Application Laid-Open No. 2001-255906, the user needs to press the enable switch to operate the robot, which is the actual machine. More specifically, after switching to the normal mode with the mode change switch, the user must additionally press the enable switch to operate the actual machine. At this time, if the user forgets to switch the mode and tries to operate the actual robot by pressing the enable switch alone, the user cannot operate the actual robot.
Therefore, in the case where a mode switch is required in addition to operating the enable switch, the operability of the teach pendant (TP) may be reduced.
The present disclosure provides a technique for improving the operability of the TP.
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. However, the embodiments described hereinafter are merely examples of the present disclosure, and the present disclosure is not limited to them. Further, common configurations will be described with cross-references to a plurality of drawings, and descriptions of configurations assigned a common reference numeral are omitted as appropriate. Units having the same name may be distinguished by an ordinal prefix (e.g., "first unit" or "second unit").
Fig. 1 is a diagram illustrating a robot system 100 according to a first embodiment. The robot system 100 includes a robot 101 serving as an actual robot, a control device 102 for controlling the robot 101, and a Teach Pendant (TP) 103 for giving an operation instruction to the robot 101 via the control device 102. The robot 101 is used for application purposes such as article manufacturing and article transport. The robot 101 is positioned and deployed on, for example, a platform or ground (not illustrated).
In the present embodiment, the robot main body performs an operation of conveying a gripped workpiece and assembling it with another workpiece. In this way, an industrial product or article can be manufactured. The trajectory data may be calculated by a simulator.
The control device 102 controls the robot 101 based on the operation information about the robot 101. The control device 102 acquires operation information about the TP 103 to generate operation information about the robot 101.
The robot 101 is a manipulator. The robot 101 is fixed on a platform. For example, around the robot 101 are placed a tray on which a workpiece serving as an object to be conveyed rests, and another workpiece serving as an object to be assembled. The workpiece is grasped by the robot 101, conveyed to the position of the other workpiece, and assembled with it. In this way, an article is manufactured.
The robot 101 and the control device 102 are communicably connected via a wire. The robot 101 includes a robot arm and a robot hand as an example of an end effector.
The robot arm is a vertical articulated robot arm. The robot hand is supported by the robot arm and is attached to a preset portion of the robot arm, for example, its front end. The robot hand is configured to grasp a workpiece.
The robot arm according to the present embodiment includes a plurality of links connected by a plurality of rotatably driven joints (e.g., six joints). The base portion of the robot arm is secured to the platform. Each joint of the robot arm is provided with a motor serving as a driving source for driving the joint, a speed reduction unit, and an encoder serving as a position detection unit for detecting the rotation angle of the motor. The arrangement position and output manner of the encoder are not particularly limited.
The robot hand is attached to the front end of the robot arm. The robot 101 can take different postures by driving the joints of the robot arm.
Fig. 2 is a configuration diagram illustrating the TP 103 according to the present embodiment. The TP 103 includes a computer and functions as a simulator in addition to serving as an instruction device. In the present embodiment, the TP 103 generates instruction data by computer simulation (i.e., offline teaching), so that the operation of the robot 101 can be confirmed in advance by performing the simulation. The instruction data generated by the TP 103 is output to the control device 102. The method of outputting the instruction data to the control device 102 is not particularly limited. For example, the instruction data may be output to the control device 102 through wired or wireless communication, or via a storage device (not illustrated).
The TP 103 is provided with an enable switch (hereinafter also referred to as the ESW) 104. The ESW 104 is a switch provided for operating the robot 101. While the ESW 104 is being pressed, the robot 101 is controlled to be movable by a user instruction given via the TP 103. On the other hand, while the ESW 104 is not pressed, the robot 101 is controlled so as not to be moved by a user instruction given via the TP 103. The ESW 104 is a three-position switch having a first stage, a second stage, and a third stage. In the present embodiment, among the three stages, the ESW 104 is off in the first and third stages (non-pressed state, second state), and on in the second stage (pressed state, first state). The ESW 104 may be externally attached to the TP 103, and may be any type of switch as long as the TP 103 can acquire its pressed state. In teaching, the robot 101 can be operated while the ESW 104 is on. The first, second, and third stages of the ESW 104 correspond to the non-depressed, half-depressed, and fully depressed states of the ESW 104, respectively. If the ESW 104 turns off while the robot 101 is being operated, the current supplied to the motor serving as the driving source of the robot 101 is cut off. In this way, the robot 101 can be stopped immediately even if an unexpected operation of the robot 101 occurs or an unintended user operation is given, ensuring the safety of the user.
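The three-position behavior described above can be sketched in code as follows. This is a minimal illustration only; the enum and function names are hypothetical, not taken from the disclosure:

```python
from enum import Enum

class EswStage(Enum):
    FIRST = 1   # not pressed
    SECOND = 2  # half pressed
    THIRD = 3   # fully pressed

def esw_is_on(stage: EswStage) -> bool:
    # In the described embodiment, the switch is ON only in the second
    # (half-pressed) stage; releasing it or pressing it through to the
    # third stage turns it OFF.
    return stage is EswStage.SECOND

def motor_current_enabled(stage: EswStage) -> bool:
    # When the ESW goes off during operation, the motor current is cut
    # so the robot stops immediately.
    return esw_is_on(stage)
```

The point of the sketch is that a single continuous press keeps the switch in the second stage, so panic reactions in either direction (release or squeeze) both cut the motor current.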
As illustrated in Fig. 3, the ESW 104 may be a device 109 that is separate from the TP 103 rather than disposed on it. The device 109 includes the ESW 104 and an emergency stop button, and exists independently of the TP 103. With this configuration, it is possible to reduce the risk of the robot 101 being operated via the TP 103 while the user erroneously presses the ESW 104, which may occur when the user uses a TP 103 on which the ESW 104 is provided. In this way, the safety of the user can be further improved. In this case, the TP 103 may be wirelessly connected to the control device 102.
In the present embodiment, the ESW 104 is described as being off in the first and third stages and on in the second stage, but different functions may be allocated to the respective stages. For example, after the ESW 104 is turned on in the second stage, when the ESW 104 changes to the third stage, the robot 101 may remain operable even if the ESW 104 is not kept pressed. In this case, if the user wishes to switch to the simulation mode, the user can return the ESW 104 to the first stage by pressing it again to set it to the off state. Alternatively, by switching the ESW 104 to the second stage, the pose of the virtual robot 106 can be matched with the pose of the robot 101, and by switching the ESW 104 to the third stage, the pose of the robot 101 can be matched with the pose of the virtual robot 106.
The TP 103 is provided with a display unit 105, and a virtual environment in which the virtual robot 106 is displayed can be shown on the display unit 105. The display unit 105 is formed by stacking a so-called touch panel on a display.
In the present disclosure, the robot 101 is a robot that can operate in an actual machine mode, and the virtual robot 106 is a robot that is displayed on the display unit 105 of the TP 103 and can operate in a simulation mode.
TP 103 is provided with an emergency stop button. In the case where the robot 101 is about to collide with an object around the robot 101 when the robot 101 is changing posture based on the instructed operation, the robot 101 stops when the user presses the emergency stop button. In this way, the risk of collision when using the robot 101 can be reduced.
Fig. 4 is a diagram illustrating a robot model 120 and a robot model 121 displayed in the virtual environment. One of the robot models 120 and 121 indicates the robot being simulated, and the other displays the pose of the robot 101 in the virtual environment. For example, the robot model 120 indicates the robot being simulated, and the robot model 121 displays the pose of the robot 101. One of the two models can be displayed semi-transparently to improve visibility for the user; for example, the robot model 121 indicating the pose of the robot 101 can be displayed semi-transparently in the virtual environment.
Fig. 5 is a block diagram illustrating the configuration of a control unit of the control device 102. The control unit includes a computer and includes a calculation unit 301. The calculation unit 301 includes a central processing unit (CPU) or an application-specific integrated circuit (ASIC). Alternatively, the calculation unit 301 includes a field-programmable gate array (FPGA). The control unit includes a read-only memory (ROM) 302 and a main storage unit 303, such as a random access memory (RAM), as storage units. The ROM 302 stores basic programs such as a basic input/output system (BIOS). The main storage unit 303 temporarily stores various data, such as the calculation results of the calculation unit 301. The control unit also includes an auxiliary storage device 304, such as a hard disk drive (HDD) or a solid state drive (SSD), as a storage unit.
The auxiliary storage device 304 stores the results of calculation processing by the calculation unit 301, externally acquired data, and the like. The control unit includes an interface (IF) 305.
The ROM 302, the main storage unit 303, the auxiliary storage device 304, and the IF 305 are connected to the calculation unit 301 via a bus 310. An operating device (such as a console panel or the TP 103) and a display device (such as a display or a lamp) may be connected to the IF 305. The IF 305 may include an information input unit for inputting information into the control unit. The operating device is connected to the information input unit of the control unit via a wired and/or wireless connection, and can thereby control the control unit. The IF 305 may also include an information output unit for outputting information to be displayed on a display device serving as a display unit. The information output unit may include a graphics controller and a microcomputer. Since the display device is connected to the control unit in this way, the display device can display information.
The computing unit 301 executes various types of processing for the operation of the robot 101 based on the program 320 stored in the auxiliary storage device 304. In this way, the calculation unit 301 issues an instruction to move the robot 101 to a desired position, i.e., transform the robot 101 into a desired pose. The calculation unit 301 outputs data on the position instruction to the servo control unit 251 of the robot 101 via the bus 310 and the IF 305 at predetermined intervals.
When the robot 101 changes its pose, the calculation unit 301 instructs the robot 101 to move to a desired position. The servo control unit 251 outputs a current to a joint unit provided in the robot 101 based on a position instruction to the robot 101 to control driving of an electric motor such as the servo motor 211. The calculation unit 301 can control the posture change of the robot 101 by controlling the driving of the electric motor.
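The periodic output of position commands to the servo control unit can be sketched as follows. The function names and the 10 ms interval are assumptions for illustration; the disclosure only states that commands are output at predetermined intervals:

```python
import time

def run_command_loop(get_target_position, send_to_servo, period_s=0.01, steps=5):
    """Emit a position command at a fixed interval, as the calculation
    unit does toward the servo control unit via the bus and interface."""
    for _ in range(steps):
        # Each cycle: compute (or look up) the current target and hand
        # it to the servo controller, which converts it to motor current.
        send_to_servo(get_target_position())
        time.sleep(period_s)
```

A caller would pass a trajectory interpolator as `get_target_position` and the servo interface as `send_to_servo`; both names are hypothetical.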
The control unit may change the posture of the robot 101 while changing the response characteristic to the external force applied to the robot 101. In this way, the robot 101 can be controlled while reducing the load on a motor such as the servo motor 211.
The control unit may include an object information processing unit 312, a data generation unit 313, a contact determination unit 314, and a posture change control unit 315. In the control unit, an object information processing unit 312, a contact determination unit 314, a data generation unit 313, and a posture change control unit 315 are connected to the IF 305. At least any one of the object information processing unit 312, the contact determining unit 314, and the posture change control unit 315 may be provided in the robot 101. For example, the data generation unit 313 may be provided in the joint unit.
The data generation unit 313 generates data (analog or digital) from signals detected and output by a sensor for detecting contact with the robot 101, and outputs the generated data in a form that can be processed by the contact determination unit 314. The data generated by the data generation unit 313 represents a value corresponding to the physical quantity detected by the sensor. The data generation unit 313 may also be referred to as a contact detection unit.
The contact determination unit 314 may determine the presence of an object in contact with the robot 101 by analyzing the contact information output from the data generation unit 313, which is provided in the control unit or externally.
The robot 101 or the control unit may be provided with an object detection sensor 311. The object detection sensor 311 is a sensor capable of detecting objects existing around the robot 101. Various sensors such as an image sensor, a distance measurement sensor, an ultrasonic sensor, and a radar sensor may be used as the object detection sensor 311. The distance measurement sensor may be a time of flight (ToF) type sensor. The object detection sensor 311 is connected to, for example, the IF 305.
The object information processing unit 312 analyzes the object information output from the object detection sensor 311.
Through this analysis, the object information processing unit 312 may determine an object existing within the movable range of the robot 101 or may calculate a distance to the object.
The posture change control unit 315 controls the posture change based on the data about the object existing within the movable range of the robot 101 output from the object information processing unit 312 and the data about the existence of the contact object output from the contact determination unit 314.
The robot system 100 according to the present embodiment may detect contact of an object with the robot 101 using a sensor for detecting contact. The control unit may control the robot 101 based on the detection of the contact. The robot 101 may use the object detection sensor 311 to detect the presence of objects around the robot 101. The control unit may control the robot 101 based on the detection of the presence of the object. The detected object may be displayed on the display unit 105 of the TP 103.
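How the posture change control unit 315 could combine its two inputs (contact presence from the contact determination unit 314 and object proximity from the object information processing unit 312) might look like the following. The threshold value and all names are illustrative assumptions, not taken from the disclosure:

```python
def should_halt_motion(contact_detected: bool,
                       nearest_object_distance_m: float,
                       safety_margin_m: float = 0.05) -> bool:
    # Halt the posture change if contact with the robot is detected, or
    # if an object detected by the object detection sensor is closer
    # than the safety margin (0.05 m here is an assumed example value).
    return contact_detected or nearest_object_distance_m < safety_margin_m
```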
Fig. 6 is a block diagram illustrating the configuration of the TP 103. The calculation unit 401, ROM 402, main storage unit 403, auxiliary storage unit 404, IF 405, and bus 410 are similar to the calculation unit 301, ROM 302, main storage unit 303, auxiliary storage device 304, IF 305, and bus 310 described with reference to Fig. 5, and descriptions of these units are omitted. In the case where the control device 102 includes the calculation unit 401, the ROM 402, the main storage unit 403, and the auxiliary storage unit 404, the TP 103 does not need to include these units.
The display unit 105, the determination unit 130, and the operation instruction unit 132 of TP 103 are connected to the IF 405. The display unit 105 includes a model display unit 107 and an operation confirmation unit 110, the model display unit 107 being configured to display the virtual robot 106 in the virtual environment, and the operation confirmation unit 110 being configured to confirm whether to issue an operation instruction to the virtual robot 106 or the robot 101. The model display unit 107 acquires the state of the robot 101, and displays a model of the robot 101. The pose of the robot 101 and the pose of the model of the robot 101 are the same.
The determination unit 130 is connected to an ESW state acquisition unit 131, and the ESW state acquisition unit 131 acquires the pressed state of the ESW 104. The determination unit 130 determines whether to operate the robot 101 or the virtual robot 106 based on the pressed state of the ESW 104 acquired by the ESW state acquisition unit 131. In the present embodiment, the ESW state acquisition unit 131 is connected to the control device 102 via the IF 406, but if the ESW 104 is directly connected to the TP 103, the ESW state acquisition unit 131 does not need to be connected to the control device 102 via the IF 406.
The operation instruction unit 132 acquires the state of the simulation model and gives the target position to the operation transmission unit 133 in order to operate the robot 101. The operation transmission unit 133 transmits the target position to the control device 102 via the IF 406 so as to operate the robot 101 to take the same posture as that of the virtual robot 106. The IF 405 and IF 406 include communication devices supporting Ethernet, Universal Serial Bus (USB), and the like.
In fig. 6, the TP 103 is described as being connected to the control device 102 via the IF 406, but the TP 103 may be formed by software in the control device 102.
Fig. 7 is a flowchart illustrating a series of operations of the robot 101 according to the present embodiment. In step S1, the ESW state acquisition unit 131 acquires the pressed state of the ESW 104, and the determination unit 130 determines whether the ESW 104 is turned on. In step S1, in the case where the determination unit 130 determines that the ESW 104 is on (yes in step S1), the process proceeds to step S2. In step S2, the mode is an actual machine mode in which the robot 101 is operated, and a jog (jog) operation of the actual robot 101 is performed based on an operation input of a user. Subsequently, the process ends.
In step S1, in the case where the determination unit 130 determines that the ESW 104 is off (NO in step S1), the process proceeds to step S3. In step S3, the mode is the simulation mode, and a jog operation of the virtual robot 106 is performed based on the user's operation input. If the ESW 104 is pressed during the operation after step S3, the process proceeds to step S4 and then to step S5. In step S5, an operation check is performed.
In the operation check performed in step S5, the user selects one of "match the pose of the virtual robot 106 simulated in the virtual environment with the pose of the robot 101", "start the jog operation of the robot 101", or "cancel the instruction of the ESW 104 and continue the jog operation of the virtual robot 106". In the case where the user selects "match the pose of the virtual robot 106 simulated in the virtual environment with the pose of the robot 101" (NO in step S5), the process proceeds to step S6. In step S6, the pose of the virtual robot 106 is matched with the pose of the robot 101. In the case where the user selects "start the jog operation of the robot 101" (YES in step S5), the process proceeds to step S2. In step S2, the jog operation of the robot 101 is started. In step S5, in the case where the user cancels the instruction of the ESW 104 (CANCEL in step S5), the process returns to step S3. In step S3, the jog operation of the virtual robot 106 is continued. Although not illustrated in Fig. 7, the pose of the virtual robot 106 may be synchronized with the pose of the robot 101. In step S4, the mode need not be changed when the ESW 104 transitions from the third stage to the second stage.
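The decision flow of Fig. 7 can be summarized in code. This is a sketch with hypothetical names; the `s5_choice` argument stands in for the user's answer in the operation check of step S5:

```python
def decide_jog_target(esw_on: bool, in_simulation: bool, s5_choice=None) -> str:
    """Return the action the TP takes for one pass of the Fig. 7 flow.

    s5_choice: 'jog_real' (YES -> S2), 'sync_virtual' (NO -> S6),
    or 'cancel' (-> back to S3)."""
    if not esw_on:
        return "jog_virtual"            # S3: simulation mode
    if not in_simulation:
        return "jog_real"               # S1 YES -> S2: actual machine mode
    # ESW pressed while jogging the virtual robot: S4 -> S5 operation check
    if s5_choice == "jog_real":
        return "jog_real"               # S5 YES -> S2
    if s5_choice == "sync_virtual":
        return "match_virtual_to_real"  # S5 NO -> S6
    return "jog_virtual"                # CANCEL -> back to S3
```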
As described above, the TP 103 includes the operation instruction unit 132 and the operation transmission unit 133, and the operation instruction unit 132 is configured to switch between the jog operation of the robot 101 and the jog operation of the virtual robot 106 according to the pressed state of the ESW 104. Conventionally, in the case where a mode change switch is provided in addition to the ESW 104, in order to move the robot 101 from the state of moving the virtual robot 106, the user needs to switch the mode with the mode change switch and then press the ESW 104. In contrast, according to the present embodiment, the ESW 104 can be used as the trigger to switch between the mode of moving the robot 101 and the mode of moving the virtual robot 106. In this way, the complexity of the user's mode change operation can be reduced and the operability of the TP 103 can be improved.
The state in which the ESW 104 is pressed is a state indicating that the user wants to operate the robot 101. Therefore, by switching modes using the ESW 104 as a trigger, operability of the robot 101 can be improved while ensuring safety of the user.
As shown in Fig. 8A, a mode change switch 108 may be provided in addition to the ESW 104, and the switch corresponding to the purpose may be used. For example, as shown in Fig. 8B, switching from the simulation mode to the actual machine mode (S11→S12 and S22→S23→S24) is performed by the ESW 104. Switching from the actual machine mode to the simulation mode (S12→S13→S14) can be performed by the mode change switch 108. In this case, when the mode change is made through the ESW 104, the mode change switch 108 is configured to automatically switch to the mode of moving the robot 101 as well.
In this way, since the switching to the simulation mode is performed by the mode change switch 108, the risk of operating the robot 101 while mistakenly believing that the mode is the simulation mode can be reduced. Since the switching to the actual machine mode is performed by the ESW 104, the operability of the robot 101 can be improved. As described above, switching from the simulation mode to the actual machine mode is performed using the ESW 104, and switching from the actual machine mode to the simulation mode is performed using the mode change switch 108, which is different from the ESW 104. In this way, the operability of the robot 101 can be improved while ensuring the safety of the user.
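The asymmetric switching of Figs. 8A and 8B, where the ESW drives simulation-to-actual and the mode change switch 108 drives actual-to-simulation, could be modeled as follows (a hypothetical sketch; the class and method names are not from the disclosure):

```python
class ModeManager:
    SIMULATION = "simulation"
    ACTUAL = "actual"

    def __init__(self):
        self.mode = self.SIMULATION

    def on_esw_pressed(self):
        # Pressing the ESW switches into actual machine mode; the mode
        # change switch state is updated automatically to match.
        self.mode = self.ACTUAL

    def on_mode_change_switch(self):
        # The dedicated switch returns to simulation mode, reducing the
        # risk of operating the real robot by mistake.
        self.mode = self.SIMULATION
```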
A modified embodiment capable of improving the operability of the TP 103 will be described. The control of the series of operations shown in fig. 7 is performed by the calculation unit 401 of the TP 103, but may be performed by the calculation unit 301 of the control unit.
In the case where the ESW 104 turns on while the posture of the robot 101 differs from the posture of the virtual robot 106, whether to issue an operation instruction to the robot 101 can be confirmed. The confirmation may be made using a pop-up display, as described below, or by sound or speech.
As shown in fig. 9, when ESW 104 turns on, the pose of virtual robot 106 may be set to automatically match the pose of robot 101. When ESW 104 changes from on to off, the pose of virtual robot 106 may be synchronized with the pose of robot 101.
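The automatic pose matching on ESW transitions described above can be sketched as follows. The names are hypothetical, and a pose is represented here simply as a tuple of joint angles:

```python
def update_virtual_pose(esw_was_on: bool, esw_is_on: bool,
                        real_pose: tuple, virtual_pose: tuple) -> tuple:
    # On either edge (off -> on, or on -> off) the virtual robot's pose
    # is snapped to the real robot's pose; otherwise it is unchanged.
    if esw_was_on != esw_is_on:
        return real_pose
    return virtual_pose
```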
Further, as shown in Fig. 10, the pose of the robot 101 before the change may be displayed as a dotted line in the virtual environment, and the pose of the robot 101 during the change may also be displayed in the virtual environment. In this case, when the ESW 104 is released, the previous pose of the robot 101 displayed as a dotted line may be deleted automatically, or a pop-up window, described later, may be displayed to ask whether to delete the previous pose of the robot 101.
As shown in Fig. 11, the TP 103 may perform the operation confirmation by, for example, displaying a confirmation pop-up window on the display unit 105. The confirmation pop-up window is controlled by the operation confirmation unit 110 shown in Fig. 6 and includes a button 110A, a button 110B, and a button 110C. When the button 110A is pressed, the pose of the robot 101 is synchronized with the pose of the virtual robot 106 in the simulation. When the button 110B is pressed, the pose of the virtual robot 106 in the virtual environment is matched with the pose of the robot 101. When the button 110C is pressed, the operation instruction to the robot 101 is canceled, and the jog operation in the simulation is continued. The confirmation window is not limited to one for issuing an operation instruction to either the robot 101 or the virtual robot 106, and the user may set the confirmation window arbitrarily.
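Dispatching the three buttons of the confirmation pop-up could look like this (a sketch; the action names are illustrative, only the button labels come from the disclosure):

```python
def handle_confirmation_button(button: str) -> str:
    # 110A: synchronize the real robot's pose with the simulated pose.
    # 110B: match the virtual robot's pose to the real robot's pose.
    # 110C: cancel the instruction and continue jogging in simulation.
    actions = {
        "110A": "move_real_to_virtual_pose",
        "110B": "move_virtual_to_real_pose",
        "110C": "continue_virtual_jog",
    }
    return actions[button]
```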
As illustrated in Fig. 12, the microphone 112 and the speaker 111 installed in the TP 103 may be used to issue an operation instruction to the robot 101 or to the virtual robot 106 in the virtual environment. For the operation instruction, the content described above with reference to Fig. 11, i.e., "Do you want to move the actual machine to the position of the model?" or "Do you want to enter the position of the actual machine into the model?", is presented, and the instructed operation is performed. The actual machine indicates the robot 101. As described above, by making the confirmation via on-screen display or voice, the user can be prompted when the mode is about to be changed, and can operate without erroneous operation or mistake.
The above-described embodiments may be appropriately modified and changed without departing from the spirit and scope of the technical idea. For example, the various embodiments and modifications thereof may be combined and implemented. Furthermore, some elements of at least one embodiment may be removed or replaced.
New elements may be added to at least one embodiment. The disclosure includes not only what is explicitly described in the disclosure, but also all what can be understood from the disclosure and the drawings of the disclosure.
All the above-described processing procedures performed by the control unit are specifically executed by the calculation unit 301. Accordingly, the present disclosure may be configured to read and execute a software program stored in a recording medium, which can realize the above-described functions. In this case, the program read from the recording medium itself implements the functions of each of the embodiments described above, and the program and the recording medium on which the program is recorded constitute the present disclosure.
In each of the embodiments, a case is described in which the computer-readable recording medium is a ROM, a RAM, or a flash ROM and the program is stored therein. However, the present disclosure is not limited to this form. The program for implementing the present disclosure may be stored in any recording medium as long as the medium is computer-readable. For example, an HDD (or SSD), an external storage device, or a recording disk may be used as the recording medium for supplying the control program.
In the above-described embodiments, a multi-joint robot arm having a plurality of joints is described, but the number of joints is not limited to the number described in the embodiments. A vertical multi-axis configuration is exemplified as the form of the robot arm, but the above configuration may equally be implemented with different kinds of robot arms, such as the horizontal multi-joint type, the parallel link type, or the orthogonal robot type.
The above-described embodiments may be applied to any machine that can automatically perform operations of expansion and contraction, flexion and extension, vertical movement, horizontal movement, or rotational movement, or a combination of these operations, based on information stored in a storage device provided in the control device 102.
The present disclosure may also be realized by supplying a program that realizes one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and by having one or more processors in a computer of the system or apparatus read out and execute the program. The present disclosure may also be implemented by circuitry (e.g., an application specific integrated circuit (ASIC)) that realizes one or more of the functions.
The present disclosure is not limited to the above-described embodiments, and may be modified and changed within the technical concept according to the present disclosure. The effects described in the embodiments are merely examples of preferred effects resulting from the present disclosure, and the effects of the present disclosure are not limited to the effects described in the embodiments of the present disclosure. The various embodiments and modified examples described above can be combined and implemented.
OTHER EMBODIMENTS
The embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a "non-transitory computer-readable storage medium") to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., an application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., a central processing unit (CPU), a micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), a digital versatile disc (DVD), or a Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
The embodiments of the present invention can also be realized by a method in which software (a program) that performs the functions of the above embodiments is supplied to a system or apparatus via a network or various storage media, and a computer (or a central processing unit (CPU), a micro processing unit (MPU), or the like) of the system or apparatus reads out and executes the program.
While the present disclosure has been described with reference to the embodiments, it is to be understood that the present disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (21)

1. An operation device configured to communicate with a robot controller for controlling a robot, acquire a pressed state of an enable switch, display a virtual robot corresponding to the robot in a virtual environment, and operate the robot or the virtual robot by an operation of the operation device, the operation device comprising:
a determination unit configured to determine a pressed state of the enable switch; and
a processing unit configured to issue a switching instruction to switch between a state in which the robot is operated and a state in which the virtual robot is operated, according to the determination of the pressed state of the enable switch by the determination unit.
2. The operation device according to claim 1,
wherein the robot is operated when the enable switch is turned on, and
wherein the virtual robot is operated when the enable switch is turned off.
3. The operation device according to claim 1, wherein in a case where the pose of the virtual robot and the pose of the robot are different, the operation device confirms whether or not to issue the operation instruction when the enable switch is pressed.
4. The operation device according to claim 1, wherein a confirmation pop-up window is displayed on a screen of the operation device to confirm whether or not to issue an operation instruction to any one of the robot and the virtual robot.
5. The operation device according to claim 1, wherein the operation instruction is issued to any one of the robot and the virtual robot by voice or sound.
6. The operation device according to claim 1, wherein the model of the robot is displayed in the virtual environment when the virtual robot is operated.
7. The operation device according to claim 6, wherein the model of the robot is displayed in a transparent manner.
8. The operation device according to claim 1, wherein the pose of the virtual robot is displayed in synchronization with the pose of the robot when the robot is operated.
9. The operation device according to claim 1, wherein the pose of the virtual robot is synchronized with the pose of the robot when the enable switch is switched from on to off.
10. The operation device according to claim 1, wherein the virtual environment is displayed on a screen of the operation device.
11. The operation device according to claim 1, wherein the enable switch is a three-position switch.
12. The operation device according to claim 1, wherein the enable switch is provided as a stand-alone device.
13. The operation device according to claim 1, further comprising a mode change switch that is different from the enable switch and is configured to switch between the state in which the robot is operated and the state in which the virtual robot is operated,
wherein the processing unit switches from the state in which the virtual robot is operated to the state in which the robot is operated by the enable switch, and switches from the state in which the robot is operated to the state in which the virtual robot is operated by the mode change switch.
14. The operation device according to claim 1, wherein the pose of the robot at a point in time when the robot starts moving is displayed with a dotted line while the robot is operated.
15. The operation device according to claim 14, wherein the dotted line and the pose of the moving robot are displayed while the robot is operated.
16. A robot system comprising:
the operation device according to any one of claims 1 to 15; and
the robot.
17. A manufacturing method of manufacturing an article using the robot system according to claim 16.
18. A manufacturing method comprising:
manufacturing an article cooperatively by a robot operated by the operation device according to any one of claims 1 to 15 and a manufacturer of the article.
19. A control method of an operation device configured to operate a robot, the operation device including a determination unit configured to determine a pressed state of an enable switch and being configured to operate the robot or a virtual robot by an operation of the operation device, the control method comprising:
displaying a virtual robot corresponding to the robot in the virtual environment; and
the state is switched between a state in which the robot is operated and a state in which the virtual robot is operated, according to the determination of the pressing state of the enable switch determined by the determination unit.
20. A control method of a robot system, comprising controlling the robot in the robot system according to claim 16 in accordance with an instruction issued by the operation device.
21. A non-transitory computer-readable recording medium recording a control program for causing a computer to execute the control method according to claim 19.
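The switching behaviour recited in claims 1, 2, 9, and 11 can be illustrated with a short sketch. All class, method, and attribute names below are assumptions for illustration, not part of the claims: a three-position enable switch directs operations to the real robot only in its middle ("on") position, and the virtual robot's pose is synchronized with the real robot's when the switch goes from on to off.

```python
# Illustrative model (names are hypothetical) of the claimed switching logic.

class OperationDevice:
    def __init__(self):
        self.enable_on = False           # True only in the middle position
        self.target = "virtual"          # which robot receives operations (claim 2)
        self.robot_pose = [0.0, 0.0]     # pose of the real robot 101
        self.virtual_pose = [0.0, 0.0]   # pose of the virtual robot 106

    def set_enable_switch(self, position):
        """position: 'released', 'on' (middle), or 'fully_pressed'.

        A three-position switch (claim 11) enables only in the middle
        position; both releasing it and gripping it hard disable the robot.
        """
        on = (position == "on")
        if self.enable_on and not on:
            # on -> off transition: sync the virtual pose to the real robot (claim 9)
            self.virtual_pose = list(self.robot_pose)
        self.enable_on = on
        self.target = "robot" if on else "virtual"  # claim 2

    def jog(self, delta):
        # Apply a jog operation to whichever robot is currently the target.
        pose = self.robot_pose if self.target == "robot" else self.virtual_pose
        pose[0] += delta[0]
        pose[1] += delta[1]
```

With this model, jogging while the switch is released moves only the virtual robot, and any real-robot motion performed while enabled is mirrored into the model the moment the operator releases the switch.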
CN202311243881.5A 2022-09-30 2023-09-25 Operating device, robot system, manufacturing method, control method, and recording medium Pending CN117798956A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2022-157704 2022-09-30
JP2023110814A JP2024052515A (en) 2022-09-30 2023-07-05 Operation device, robot system, manufacturing method, control method, control program, and recording medium
JP2023-110814 2023-07-05

Publications (1)

Publication Number Publication Date
CN117798956A 2024-04-02

Family

ID=90418743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311243881.5A Pending CN117798956A (en) 2022-09-30 2023-09-25 Operating device, robot system, manufacturing method, control method, and recording medium

Country Status (1)

Country Link
CN (1) CN117798956A (en)


Legal Events

Date Code Title Description
PB01 Publication