US20010010539A1 - Apparatus for correcting movement path of a robot and a method therefor - Google Patents

Info

Publication number
US20010010539A1
US20010010539A1 (application US08/836,059)
Authority
US
United States
Prior art keywords
tool
robot
shape data
camera
shape
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US08/836,059
Other versions
US6414711B2 (en)
Inventor
Taro Arimatsu
Kazuhiko Akiyama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Assigned to FANUC LTD. (Assignors: AKIYAMA, KAZUHIKO; ARIMATSU, TARO)
Publication of US20010010539A1
Application granted
Publication of US6414711B2
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/18: Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B 19/42: Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B 19/404: Numerical control [NC] characterised by control arrangements for compensation, e.g. for backlash, overshoot, tool offset, tool wear, temperature, machine construction errors, load, inertia
    • G05B 19/408: Numerical control [NC] characterised by data handling or data format, e.g. reading, buffering or conversion of data
    • G05B 19/4083: Adapting programme, configuration
    • G05B 2219/00: Program-control systems
    • G05B 2219/30: Nc systems
    • G05B 2219/36: Nc in input of data, input key till input tape
    • G05B 2219/36416: Adapt teached position as function of deviation 3-D, 2-D position of end effector, tool
    • G05B 2219/37: Measurements
    • G05B 2219/37567: 3-D vision, stereo vision, with two cameras
    • G05B 2219/40: Robotics, robotics mapping to robotics vision
    • G05B 2219/40589: Recognize shape, contour of tool
    • G05B 2219/40611: Camera to monitor endpoint, end effector position
    • G05B 2219/45: Nc applications
    • G05B 2219/45083: Manipulators, robot

Definitions

  • The present invention relates to an apparatus and a method for automatically correcting deviation from a taught path due to deformation of a tool mounted on the wrist end of a robot during an operation by means of the robot.
  • In general, a tool that is mounted on the wrist end of a robot for arc welding or sealing has an elongate shape, and is more apt to be bent, even by a light impact, than tools for spot welding or handling. If programs for the robot are executed with the tool bent, the welding or sealing naturally cannot be effected along the taught path.
  • The object of the present invention is to provide a tool shape correcting method and apparatus for a robot, in which the shape of a tool mounted on the wrist end of the robot is recognized before the robot starts operation, whereby any deviation from a taught path attributable to deformation of the tool can be corrected automatically.
  • According to the present invention, a tool mounted on the wrist end of a robot and a camera for photographing the tool are first located in predetermined relative positions; images picked up by the camera are then fetched and processed; shape data of the tool are obtained from the processed image data; the extent of deformation of the tool is detected by comparing the obtained tool shape data with reference tool shape data; and a taught path for the robot is corrected in accordance with the extent of deformation.
  • The extent of deformation of the tool is thus detected before the operation by the robot is executed, and the taught path can be automatically corrected depending on the extent of deformation, so that accurate operation can be performed even if the tool is deformed.
  • FIG. 1 is a block diagram schematically illustrating an apparatus, according to the present invention, for detecting deformation of a tool and correcting a program-taught path in accordance with the extent of the deformation;
  • FIG. 2 is a diagram illustrating the positional relations between the tool in a predetermined photographing position and the cameras;
  • FIG. 3 is a diagram showing the hardware configuration of an image processing apparatus constituting the apparatus shown in FIG. 1;
  • FIG. 4 is a diagram showing the hardware configuration of a robot control apparatus constituting the apparatus shown in FIG. 1;
  • FIG. 5 is a flowchart showing processing in the image processing apparatus of FIG. 3; and
  • FIG. 6 is a flowchart showing processing in the robot control apparatus of FIG. 4.
  • Referring first to FIG. 1, an outline of an apparatus for detecting deformation of a tool and correcting a program-taught path in accordance with the extent of the deformation will be described.
  • A robot 1 is fitted with the tool 2 on its arm end.
  • The tool 2 carries out an operation (hereinafter referred to as the main operation), such as arc welding, on a workpiece 3 (e.g., a vehicle body frame).
  • The robot 1 is controlled by means of a robot control apparatus 30.
  • Before the main operation is started, the tool 2 is moved to a predetermined photographing position P1 by the robot control apparatus 30, and is made to take a predetermined posture in which its tool center point is in alignment with the position P1.
  • Two cameras 41 and 42 photograph the tool in this predetermined position and posture: one camera 41 photographs the front face of the tool 2, while the other camera 42 photographs a side face of the tool 2.
  • These cameras 41 and 42 photograph the shape of the tool 2 in response to a command from an image processing apparatus 20.
  • This image processing apparatus 20 comprises tool shape recognizing means 5, error value computing means 6, and reference shape data storage means 7.
  • The tool shape recognizing means 5 recognizes the shape of the tool 2 from the images picked up by the cameras 41 and 42.
  • The error value computing means 6 compares the current tool shape recognized by the tool shape recognizing means 5 with the reference shape data previously stored in the reference shape data storage means 7, and computes the deviation (error value, i.e., the extent of deformation of the tool) of the current tool shape from the reference shape data.
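The comparison carried out by the error value computing means 6 amounts to a parameter-by-parameter subtraction of reference values from measured values. A minimal Python sketch, assuming the tool shape has been reduced to named length and angle parameters (the dict representation and parameter names are illustrative, not from the patent):

```python
def compute_error(measured, reference):
    """Deviation of the measured tool shape from the reference data.

    Both arguments map shape-parameter names (e.g. "length_mm",
    "bend_x_deg") to values; the result maps each name to
    measured minus reference, i.e. the extent of deformation.
    """
    return {name: measured[name] - reference[name] for name in reference}
```

For an undeformed tool every deviation is zero; a bent tool shows up as nonzero angle terms.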
  • The error value computed by the error value computing means 6 is fed to movement control means 8 in the robot control apparatus 30.
  • The movement control means 8 corrects the programmed path in accordance with the error value data fed from the error value computing means 6, and, in the main operation, controls the movement of the robot 1 according to the corrected path.
  • The two cameras 41 and 42 are set in the vicinity of the predetermined photographing position P1, and both face this photographing position P1.
  • The direction in which the one camera 41 faces the photographing position P1 is perpendicular to the direction in which the other camera 42 faces it. A machine coordinate system having three rectangular axes (X-, Y-, and Z-axes) is set in the manner shown in FIG. 2.
  • The one camera 41 is located so that its optical axis extends parallel to the Y-axis, and picks up an image of the front face of the object of photographing (tool 2), that is, an XZ-plane image.
  • The other camera 42 is located so that its optical axis extends parallel to the X-axis, and picks up an image of the side face of the object of photographing, that is, a YZ-plane image.
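Since the two optical axes are mutually perpendicular, camera 41 supplies (x, z) coordinates and camera 42 supplies (y, z) coordinates of the same feature points. A sketch of how the two projections might be merged into three-dimensional data, assuming calibrated image coordinates; the pairing-by-height heuristic and the function name are illustrative, not from the patent:

```python
def reconstruct_3d(front_xz, side_yz, z_tol=1.0):
    """Merge front-view (x, z) and side-view (y, z) feature points
    into (x, y, z) points by matching their z coordinates.

    front_xz: points from camera 41 (optical axis parallel to Y)
    side_yz:  points from camera 42 (optical axis parallel to X)
    Points are paired when their heights agree within z_tol.
    """
    points = []
    for x, zf in front_xz:
        # pick the side-view point whose height best matches this one
        y, zs = min(side_yz, key=lambda p: abs(p[1] - zf))
        if abs(zs - zf) <= z_tol:
            points.append((x, y, (zf + zs) / 2.0))
    return points
```

A tool tip seen at (12, 250) in the front view and (5, 250) in the side view thus becomes the single 3-D point (12, 5, 250).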
  • A processor 21 of the image processing apparatus 20 is connected with a ROM 22, a RAM 23, a nonvolatile memory 24, a communication processor 25, an image processor 26, a frame memory 27, and interfaces (INT) 28 and 29 through a bus 20a.
  • The ROM 22 is loaded with a basic program for operating the image processing apparatus 20.
  • The RAM 23 stores data for temporary processing and calculation.
  • The nonvolatile memory 24 is composed of a CMOS memory or the like that is backed up by a power source, and stores data that must be retained even after the image processing apparatus 20 is cut off from the power supply.
  • The data stored in this nonvolatile memory 24 include the reference shape data of the tool 2 in the predetermined photographing position P1 (the reference shape data storage means 7 shown in FIG. 1 thus corresponds specifically to the nonvolatile memory 24 in the image processing apparatus 20 of FIG. 3).
  • The reference shape data are based on parameters, namely, the three-dimensional lengths and angles of a distal end portion 2a of the tool 2 fixed in a given direction at the predetermined photographing position P1, as viewed from the cameras 41 and 42.
  • The communication processor 25 delivers data to and receives data from the robot control apparatus 30 through a communication line.
  • The image processor 26 reads the image data picked up by the cameras 41 and 42 through the interface (INT) 28, temporarily loads them into the frame memory 27, and carries out profiling and the like on the image data in the frame memory 27.
  • Data such as the photographing conditions of the cameras 41 and 42 and the state of operation are fed to a monitor (not shown) through the interface (INT) 29, and displayed thereon.
  • The processor 21 detects the shape of the tool 2, in terms of its three-dimensional lengths and angles, on the basis of the profiled image data, compares it with the reference shape data in the nonvolatile memory 24, and computes the error value.
  • The computed error value is fed to the robot control apparatus 30 through the communication processor 25. The processor 21 also determines whether or not the error value is within a preset allowable range, and informs the robot control apparatus 30 of the result.
  • The robot control apparatus 30 is provided with a processor board 31.
  • A processor 31a of the processor board 31 is connected with a ROM 31b and a RAM 31c through a bus 39.
  • The processor 31a controls the whole robot control apparatus 30 in accordance with system programs stored in the ROM 31b.
  • The RAM 31c is loaded with various data, and also with operation programs for the robot 1 and correction programs for the taught path, which will be mentioned later.
  • Part of the RAM 31c is formed as a nonvolatile memory, and the operation programs and correction programs are stored in this nonvolatile portion.
  • The bus 39 is further connected with a digital servo control circuit 32, a serial port 34, a digital I/O 35, an analog I/O 37, and a large-capacity memory 38.
  • The digital servo control circuit 32 drives servomotors 51, 52, 53, 54, 55, and 56 through servo amplifiers in accordance with commands from the processor 31a of the processor board 31.
  • These servomotors 51 to 56 are built into the robot 1, and operate its individual axes.
  • The serial port 34 is connected to a teaching control panel 57 with display and to other external apparatuses 58 through RS-232C.
  • The serial port 34 is also connected to the image processing apparatus 20 (i.e., the external apparatuses 58 include the image processing apparatus 20).
  • The teaching control panel 57 with display is used to input the taught path of the tool 2 and the like.
  • The serial port is also connected to a CRT 36a, which displays coordinate positions, control screens, etc.
  • A console panel 36b is connected to the digital I/O 35.
  • The analog I/O 37 is connected to a power supply unit of a laser apparatus, and the welding voltage is commanded through the analog I/O 37.
  • The large-capacity memory 38 is loaded with taught data and the like.
  • This processing is started upon receiving information from the robot control apparatus 30 to the effect that the robot 1 has been moved by the robot control apparatus 30 so that the tool 2, the object of photographing, is at the predetermined photographing position P1.
  • When the processing is started, the CPU 21 first outputs a photographing command to the first camera 41, reads the XZ-plane image data of the photographed tool 2 through the interface 28, and temporarily stores the data in the frame memory 27 (Step S1). Then, the CPU 21 delivers a command requiring detection of the object to the image processor 26. In response to this command, the image processor 26 subjects the image data to profiling and the like, and measures the shape of the tool 2 (i.e., the length, angle, and position of the tool 2) projected on the XZ-plane (Step S2).
  • Likewise, the YZ-plane image data of the tool 2 picked up by the second camera 42 are read through the interface 28 and temporarily loaded into the frame memory 27 (Step S3). Then, the image data undergo profiling and the like, and the shape of the tool 2 projected on the YZ-plane is measured (Step S4).
  • Thereupon, three-dimensional shape data of the tool 2 in the XYZ coordinate space are obtained from the shape of the tool 2 on the XZ-plane obtained in Step S2 and the shape on the YZ-plane obtained in Step S4 (Step S5).
  • The reference three-dimensional shape data of the tool 2 are then fetched from the nonvolatile memory 24 and compared with the three-dimensional shape data obtained in Step S5, whereupon the errors, or deviations, of the three-dimensional shape data of the tool 2 currently attached to the robot from the reference values are obtained (Step S6).
  • In Step S7, it is determined whether or not the obtained errors are within the allowable range. If they are, information to that effect is transmitted to the robot control apparatus 30 (Step S9), and this processing is terminated. If they are out of the allowable range, the error data and information to that effect are delivered to the robot control apparatus 30 (Step S8), and this processing is terminated.
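Steps S7 to S9 reduce to a tolerance check followed by a report to the robot control apparatus. A Python sketch, with the communication processor modelled as a plain callable (`send` and the message keys are stand-ins, not an interface defined by the patent):

```python
def report_errors(errors, allowance, send):
    """Steps S7-S9 (sketch): compare each error against its allowable
    magnitude and inform the robot control apparatus of the result.

    errors:    per-parameter deviations obtained in Step S6
    allowance: per-parameter allowable magnitudes
    send:      callable standing in for the communication link
    """
    ok = all(abs(v) <= allowance[k] for k, v in errors.items())  # Step S7
    if ok:
        send({"status": "within_allowance"})                     # Step S9
    else:
        send({"status": "out_of_allowance", "errors": errors})   # Step S8
    return ok
```

On the controller side, the "within_allowance" message lets the main operation start unchanged, while the "out_of_allowance" message carries the error data needed for path correction.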
  • In the processing of FIG. 6, the robot control apparatus 30 first controls the robot 1 so that the tool 2, the object of photographing, is moved to the predetermined photographing position P1 (Step T1).
  • On receiving the corresponding signal, the image processing apparatus 20 starts the processing according to the flowchart of FIG. 5.
  • After the tool 2 has moved to the photographing position P1, the robot control apparatus stands by (Step T2) until either the information that the errors are within the allowable range or the information that the errors are out of the allowable range is received from the image processing apparatus 20 (see Steps S9 and S8 of FIG. 5).
  • If the information that the errors are within the allowable range is received, execution of the main operation is started in accordance with the path taught by the program (Step T6).
  • If, instead, the information that the errors are out of the allowable range and the relevant error data are received from the image processing apparatus 20, the operator is informed, through a display on the CRT 36a or the like, that the taught movement path must be corrected because of deformation of the tool (Step T4); the program-taught path of the tool 2 is then corrected on the basis of the delivered error data (Step T5), and execution of the main operation is started by actuating the robot in accordance with the corrected taught path (Step T6).
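The patent does not spell out the correction formula used in Step T5. If the measured deformation reduces to a displacement of the tool center point, one simple scheme, sketched below, is to shift every taught point by the opposite of that displacement so the bent tip still reaches the originally taught positions (a simplification that ignores changes in tool orientation):

```python
def correct_taught_path(path, tip_error):
    """Step T5 (sketch): offset each taught point to cancel the
    measured displacement of the tool center point.

    path:      list of taught (x, y, z) points
    tip_error: (dx, dy, dz) deviation of the tool tip from its
               reference position, as measured from the images
    """
    dx, dy, dz = tip_error
    # moving the wrist by the negative of the error cancels the
    # tip displacement at every taught point
    return [(x - dx, y - dy, z - dz) for (x, y, z) in path]
```

A fuller implementation would also compensate bend angles by rotating the tool frame, which this linear offset does not attempt.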
  • As described above, the shape of the tool 2 is measured based on the images of the tool 2 picked up by the cameras 41 and 42, and the taught path is corrected on the basis of the differences between the measured shape and the reference shape data.
  • Thus, the deformation of the tool 2 can be detected, and the path can be automatically corrected depending on the extent of the deformation.
  • The machining line need not be stopped in order to correct the path to cope with deformation of the tool, so that manufacturing efficiency is improved and normal operation can be carried out at all times.
  • Since the operator need not enter the job site to make the correction, safety can also be maintained.
  • This embodiment is designed so that the taught path is corrected when the deformation of the tool goes beyond the allowable range; however, it may be arranged so that, when the deformation of the tool is too far beyond the allowable range, the operation of the robot 1 is stopped and the tool is replaced.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

A tool (2) mounted on the wrist end of a robot (1) is placed in a predetermined photographing position (P1), and the front and side faces of the tool (2) are photographed by cameras (41, 42) to obtain three-dimensional shape data of the tool (2). The extent of deformation of the tool (2) is detected by comparing the obtained shape data and reference shape data, and a programmed movement path for the robot is corrected depending on the extent of deformation.

Description

    TECHNICAL FIELD
  • The present invention relates to an apparatus and a method for automatically correcting deviation from a taught path due to deformation of a tool mounted on the wrist end of a robot during an operation by means of the robot. [0001]
  • BACKGROUND ART
  • In general, a tool that is mounted on the wrist end of a robot for arc welding or sealing has an elongate shape, and is more apt to be bent, even by a light impact, than tools for spot welding or handling. If programs for the robot are executed with the tool bent, the welding or sealing naturally cannot be effected along the taught path. [0002]
  • Conventionally, if the tool is bent during operation, it is replaced in its entirety or straightened as far as possible, and errors resulting from insufficient correction are dealt with by correcting the programmed path in accordance with the degree of the errors. [0003]
  • In order to carry out such a repair operation, however, the working line must be stopped for a long time, and the adjustment requires much time and labor, so that manufacturing efficiency is lowered. [0004]
  • Moreover, if the operation is performed along a wrong path without the attendance of an operator, defective products may be manufactured. [0005]
  • DISCLOSURE OF THE INVENTION
  • The object of the present invention is to provide a tool shape correcting method and apparatus for a robot, in which the shape of a tool mounted on the wrist end of the robot is recognized before the robot starts operation, whereby any deviation from a taught path attributable to deformation of the tool can be corrected automatically. [0006]
  • In order to achieve the above object, according to the present invention, a tool mounted on the wrist end of a robot and a camera for photographing the tool are first located in predetermined relative positions; images picked up by the camera are then fetched and processed; shape data of the tool are obtained from the processed image data; the extent of deformation of the tool is detected by comparing the obtained tool shape data with reference tool shape data; and a taught path for the robot is corrected in accordance with the extent of deformation. [0007]
  • According to the present invention, having the features described above, the extent of deformation of the tool is detected before the operation by the robot is executed, and the taught path can be automatically corrected depending on the extent of deformation, so that accurate operation can be performed even if the tool is deformed. [0008]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram schematically illustrating an apparatus, according to the present invention, for detecting deformation of a tool and correcting a program-taught path in accordance with the extent of the deformation; [0009]
  • FIG. 2 is a diagram illustrating the positional relations between the tool in a predetermined photographing position and the cameras; [0010]
  • FIG. 3 is a diagram showing the hardware configuration of an image processing apparatus constituting the apparatus shown in FIG. 1; [0011]
  • FIG. 4 is a diagram showing the hardware configuration of a robot control apparatus constituting the apparatus shown in FIG. 1; [0012]
  • FIG. 5 is a flowchart showing processing in the image processing apparatus of FIG. 3; and [0013]
  • FIG. 6 is a flowchart showing processing in the robot control apparatus of FIG. 4. [0014]
  • BEST MODE OF CARRYING OUT THE INVENTION
  • Referring first to FIG. 1, an outline of an apparatus for detecting deformation of a tool and correcting a program taught path in accordance with the extent of the deformation will be described. [0015]
  • A robot 1 is fitted with the tool 2 on its arm end. The tool 2 carries out an operation (hereinafter referred to as the main operation), such as arc welding, on a workpiece 3 (e.g., a vehicle body frame). The robot 1 is controlled by means of a robot control apparatus 30. [0016]
  • Before the main operation is started, the tool 2 is moved to a predetermined photographing position P1 by the robot control apparatus 30. As a result, the tool 2 is made to take a predetermined posture in which its tool center point is in alignment with the position P1. Two cameras 41 and 42 photograph the tool in the predetermined position P1 and the predetermined posture. One camera 41 photographs the front face of the tool 2, while the other camera 42 photographs a side face of the tool 2. [0017]
  • These cameras 41 and 42 photograph the shape of the tool 2 in response to a command from an image processing apparatus 20. This image processing apparatus 20 comprises tool shape recognizing means 5, error value computing means 6, and reference shape data storage means 7. The tool shape recognizing means 5 recognizes the shape of the tool 2 from the images picked up by the cameras 41 and 42. The error value computing means 6 compares the current tool shape recognized by the tool shape recognizing means 5 with the reference shape data previously stored in the reference shape data storage means 7, and computes the deviation (error value, i.e., the extent of deformation of the tool) of the current tool shape from the reference shape data. [0018]
  • The error value computed by the error value computing means 6 is fed to movement control means 8 in the robot control apparatus 30. The movement control means 8 corrects the programmed path in accordance with the error value data fed from the error value computing means 6, and, in the main operation, controls the movement of the robot 1 according to the corrected path. [0019]
  • Referring further to FIG. 2, the positional relations between the tool 2 and the cameras 41 and 42 in FIG. 1 will be described. [0020]
  • The two cameras 41 and 42 are set in the vicinity of the predetermined photographing position P1, and both face this photographing position P1. The direction in which the one camera 41 faces the photographing position P1 is perpendicular to the direction in which the other camera 42 faces it. If a machine coordinate system having three rectangular axes (X-, Y-, and Z-axes) is set in the manner shown in FIG. 2, the one camera 41 is located so that its optical axis extends parallel to the Y-axis, and picks up an image of the front face of the object of photographing (tool 2), that is, an XZ-plane image, while the other camera 42 is located so that its optical axis extends parallel to the X-axis, and picks up an image of the side face of the object of photographing, that is, a YZ-plane image. Thus, image data in two directions are obtained for the object of photographing (tool 2) by means of the two cameras 41 and 42, so that three-dimensional data for the object of photographing can be obtained by synthesizing those image data. [0021]
  • Referring to FIG. 3, the hardware configuration of the image processing apparatus 20 will be described. [0022]
  • A processor 21 of the image processing apparatus 20 is connected with a ROM 22, a RAM 23, a nonvolatile memory 24, a communication processor 25, an image processor 26, a frame memory 27, and interfaces (INT) 28 and 29 through a bus 20a. [0023]
  • The ROM 22 is loaded with a basic program for operating the image processing apparatus 20. The RAM 23 stores data for temporary processing and calculation. The nonvolatile memory 24 is composed of a CMOS memory or the like that is backed up by a power source, and stores data that must be retained even after the image processing apparatus 20 is cut off from the power supply. The data stored in this nonvolatile memory 24 include the reference shape data of the tool 2 in the predetermined photographing position P1 (and therefore, the reference shape data storage means 7 shown in FIG. 1 corresponds specifically to the nonvolatile memory 24 in the image processing apparatus 20 of FIG. 3). The reference shape data are based on parameters, namely, the three-dimensional lengths and angles of a distal end portion 2a of the tool 2 fixed in a given direction at the predetermined photographing position P1, as viewed from the cameras 41 and 42. [0024]
  • The communication processor 25 delivers data to and receives data from the robot control apparatus 30 through a communication line. The image processor 26 reads the image data picked up by the cameras 41 and 42 through the interface (INT) 28, temporarily loads them into the frame memory 27, and carries out profiling and the like on the image data in the frame memory 27. Data such as the photographing conditions of the cameras 41 and 42 and the state of operation are fed to a monitor (not shown) through the interface (INT) 29, and displayed thereon. [0025]
  • The processor 21 detects the shape of the tool 2, in terms of its three-dimensional lengths and angles, on the basis of the profiled image data, compares it with the reference shape data in the nonvolatile memory 24, and computes the error value. The computed error value is fed to the robot control apparatus 30 through the communication processor 25. Also, the processor 21 determines whether or not the error value is within a preset allowable range, and informs the robot control apparatus 30 of the result. [0026]
  • Referring to FIG. 4, the hardware configuration of the robot control apparatus 30 will be described. [0027]
  • The robot control apparatus 30 is provided with a processor board 31. A processor 31a of the processor board 31 is connected with a ROM 31b and a RAM 31c through a bus 39. The processor 31a controls the whole robot control apparatus 30 in accordance with system programs stored in the ROM 31b. The RAM 31c is loaded with various data, and also with operation programs for the robot 1 and correction programs for the taught path, which will be mentioned later. Part of the RAM 31c is formed as a nonvolatile memory, and the operation programs and correction programs are stored in this nonvolatile portion. [0028]
  • Further, the bus 39 is connected with a digital servo control circuit 32, a serial port 34, a digital I/O 35, an analog I/O 37, and a large-capacity memory 38. [0029]
  • The digital servo control circuit 32 drives servomotors 51, 52, 53, 54, 55, and 56 through servo amplifiers in accordance with commands from the processor 31a of the processor board 31. These servomotors 51 to 56 are built into the robot 1, and operate its individual axes. [0030]
  • The serial port 34 is connected to a teaching control panel 57 with display and to other external apparatuses 58 through RS-232C. The serial port 34 is also connected to the image processing apparatus 20 (i.e., the external apparatuses 58 include the image processing apparatus 20). The teaching control panel 57 with display is used to input the taught path of the tool 2 and the like. Moreover, the serial port is connected to a CRT 36a, which displays coordinate positions, control screens, etc. A console panel 36b is connected to the digital I/O 35. The analog I/O 37 is connected to a power supply unit of a laser apparatus, and the welding voltage is commanded through the analog I/O 37. The large-capacity memory 38 is loaded with taught data and the like. [0031]
  • Referring now to the flowcharts of FIGS. 5 and 6, procedures to be executed by the [0032] image processing apparatus 20 and the robot control apparatus 30 having the above hardware configuration and constituting the correcting apparatus for the robot movement path will be described.
  • Referring first to the flowchart of FIG. 5, the procedures of processing by the [0033] CPU 21 of the image processing apparatus 20 will be described.
  • This processing is started upon receipt of information from the [0034] robot control apparatus 30 to the effect that the robot 1 has been moved by the robot control apparatus 30 so that the tool 2, the object of photographing, is at the predetermined photographing position P1.
  • When the processing is started, the [0035] CPU 21 first outputs a photographing command to the first camera 41, reads the XZ-plane image data of the photographed tool 2 through the interface 27, and temporarily stores the data in the frame memory 27 (Step S1). Then, the CPU 21 delivers a command requiring detection of the object to the image processor 26. In response to this command, the image processor 26 subjects the image data to profiling and the like, and measures the shape of the tool 2 (i.e., the length, angle, and position of the tool 2) projected on the XZ-plane (Step S2).
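The patent does not specify how "profiling and the like" yields the length, angle, and position of the projected tool. One minimal stand-in, shown below under that assumption, takes the silhouette pixels of a binary projection image and derives the centroid (position), the principal-axis angle, and the extent along that axis (length) from image moments:

```python
import numpy as np

def measure_projection(img):
    """Measure the tool silhouette in one binary projection image.

    Returns (length, angle_deg, (cx, cy)): the extent of the silhouette
    along its principal axis, that axis's angle, and the centroid.
    A moment-based sketch, not the patent's (unspecified) method.
    """
    ys, xs = np.nonzero(img)                    # pixel coordinates of the tool
    cx, cy = xs.mean(), ys.mean()               # centroid = tool position
    cov = np.cov(np.stack([xs - cx, ys - cy]))  # 2x2 scatter of the silhouette
    evals, evecs = np.linalg.eigh(cov)
    major = evecs[:, np.argmax(evals)]          # principal (longest) axis
    angle = float(np.degrees(np.arctan2(major[1], major[0])))
    proj = (xs - cx) * major[0] + (ys - cy) * major[1]
    length = float(proj.max() - proj.min())     # extent along the major axis
    return length, angle, (float(cx), float(cy))

# Synthetic 10x50 projection with a straight horizontal bar as the "tool".
example = np.zeros((10, 50))
example[5, 10:40] = 1
length, angle, (cx, cy) = measure_projection(example)
```

The same routine would be run once per camera view (Steps S2 and S4), giving one 2D shape measurement per projection plane.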
  • Likewise, the YZ-plane image data of the [0036] tool 2 picked up by the second camera 42 are read through the interface 27 and temporarily loaded into the frame memory 27 (Step S3). Then, the image data are subjected to profiling and the like, and the shape of the tool 2 projected on the YZ-plane is measured (Step S4).
  • Thereupon, three-dimensional shape data of the [0037] tool 2 in the XYZ coordinate space are obtained from the shape of the tool 2 on the XZ-plane obtained in Step S2 and the shape of the tool 2 on the YZ-plane obtained in Step S4 (Step S5). Then, the reference three-dimensional shape data of the tool 2 are fetched from the nonvolatile memory 24 and compared with the three-dimensional shape data of the tool 2 obtained in Step S5, whereupon the errors, or deviations from the reference values, of the three-dimensional shape data of the tool 2 currently attached to the robot are obtained (Step S6).
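Since the XZ view supplies x and z and the YZ view supplies y and z for each feature point, the two orthogonal projections can be fused into XYZ coordinates directly; a sketch under the assumption that point correspondence between the views is already established (the patent does not detail this step):

```python
def fuse_projections(xz_points, yz_points):
    """Combine matched feature points from the XZ view (camera 41) and the
    YZ view (camera 42) into XYZ coordinates. z appears in both views, so
    the two measurements are averaged. Correspondence between views is
    assumed to be established already.
    """
    return [(x, y, (z1 + z2) / 2.0)
            for (x, z1), (y, z2) in zip(xz_points, yz_points)]

# Hypothetical tool-tip measurement: (x, z) from the XZ view, (y, z) from YZ.
tip = fuse_projections([(10.0, 55.2)], [(-3.0, 54.8)])[0]
```

Averaging z is one simple way to reconcile the redundant measurement; weighting by per-camera confidence would be an equally valid choice.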
  • Then, it is determined whether or not the obtained errors are within the allowable range (Step S[0038] 7). If the errors are concluded to be within the allowable range, information to the effect that the errors are within the allowable range is transmitted to the robot control apparatus 30 (Step S9) to terminate this processing. If the obtained errors are concluded to be out of the allowable range, the error data and information to the effect that the errors are out of the allowable range are delivered to the robot control apparatus 30 (Step S8) to terminate this processing.
  • Referring now to the flowchart of FIG. 6, the procedures of processing by the [0039] CPU 31 a of the robot control apparatus 30 will be described. This processing is executed every time one stage of the main operation by the robot 1 is finished (i.e., before the next main operation is started).
  • The [0040] robot 1 is controlled so that the tool 2, the object of photographing, is moved to the predetermined photographing position P1 (Step T1). When the tool 2 has reached the photographing position P1, information to that effect is transmitted to the image processing apparatus 20. On receiving this signal, the image processing apparatus 20 starts processing according to the flowchart of FIG. 5.
  • After the [0041] tool 2 has moved to the photographing position P1, the robot control apparatus stands by (Step T2) until information to the effect that the errors are within the allowable range, or to the effect that they are out of the allowable range, is received from the image processing apparatus 20 (see Steps S9 and S8 of FIG. 5). When the information to the effect that the errors are within the allowable range is received from the image processing apparatus 20, execution of the main operation is started in accordance with the path taught by the program (Step T6).
  • On the other hand, when the information to the effect that the errors are out of the allowable range and the relevant error data are received from the [0042] image processing apparatus 20, an operator is informed, through a display on the CRT 36 a or the like, that the taught movement path must be corrected due to deformation of the tool (Step T4), and the program-taught path of the tool 2 is then corrected on the basis of the delivered error data (Step T5). Execution of the main operation is started by actuating the robot in accordance with the corrected taught path (Step T6).
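The correction in Step T5 can be sketched as applying the measured tool-tip deviation to every taught point. A pure translation is a deliberate simplification (and the numbers below are illustrative): the patent corrects the taught path from the error data without committing to a specific correction model.

```python
def correct_taught_path(taught_points, tip_error):
    """Shift every taught point opposite to the measured tool-tip deviation,
    so the deformed tip still reaches the originally taught positions.
    A translation-only sketch; a bent tool may also need an orientation
    correction, which is omitted here.
    """
    dx, dy, dz = tip_error
    return [(x - dx, y - dy, z - dz) for (x, y, z) in taught_points]

taught = [(100.0, 0.0, 50.0), (120.0, 0.0, 50.0)]
# Tip found bent 1.5 mm in -Y and 0.4 mm in +Z (illustrative numbers).
corrected = correct_taught_path(taught, tip_error=(0.0, -1.5, 0.4))
```

With this convention, a tip bent 1.5 mm in -Y moves every taught point 1.5 mm in +Y, so the displaced tip lands where the original path intended.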
  • Thus, in the present embodiment, the shape of the [0043] tool 2 is measured based on the images of the tool 2 picked up by the cameras 41 and 42, and the taught path is corrected on the basis of the differences between the measured shape and the reference shape data. Accordingly, even when the operator is not at the site of operation, deformation of the tool 2 can be detected, and the path can be automatically corrected depending on the extent of the deformation. Since the machining line need not be stopped in order to correct the path to cope with the deformation of the tool, manufacturing efficiency is improved, and normal operation can be carried out at all times. Moreover, since the operator need not enter the job site for the correction, safety can also be maintained.
  • Furthermore, in the present embodiment, the taught path is corrected when the deformation of the tool goes beyond the allowable range; however, the embodiment may be arranged so that the tool is replaced after stopping the operation of the [0044] robot 1 when the deformation of the tool is far beyond the allowable range.

Claims (8)

1. An apparatus for correcting movement path of a robot, comprising:
a camera for photographing a tool mounted on the wrist end of a robot;
an image processing apparatus for obtaining the shape data of the tool by processing images of said tool picked up by said camera; and
a robot control apparatus for correcting taught path in accordance with the shape data of the tool obtained by said image processing apparatus and controlling the robot to make it move along the corrected taught path.
2. An apparatus for correcting movement path of a robot according to claim 1, wherein said robot control apparatus has a function to control the robot to move the tool to a predetermined photographing position and transmit information to the effect that said photographing position is reached by the tool to said image processing apparatus, while said image processing apparatus has a function to command the camera to photograph said tool on receiving the information to that effect from the robot.
3. An apparatus for correcting movement path of a robot according to claim 1, wherein said image processing apparatus obtains three-dimensional shape data of the tool by receiving an image of one side face of the tool picked up by the camera and an image of another side face of the same tool perpendicular to the one side face.
4. An apparatus for correcting movement path of a robot according to claim 1, wherein said taught path is corrected in accordance with the shape data of the tool obtained by said image processing apparatus only when the difference between the tool shape and a reference tool shape as a result of comparison thereof is not smaller than a predetermined value.
5. An apparatus for correcting movement path of a robot according to claim 1, wherein said image processing apparatus includes tool shape recognizing means for recognizing the shape of the tool from the images picked up by the camera, reference shape data storage means for storing reference shape data of said tool, and error value computing means for computing error values between the tool shape recognized by said tool shape recognizing means and the reference shape data stored in said reference shape data storage means.
6. A movement path correcting method for a robot, comprising steps of:
locating a tool mounted on the wrist end of a robot and a camera for photographing said tool in predetermined relative positions;
fetching and processing images through the camera to obtain shape data of the tool from the processed image data;
comparing the obtained tool shape data with reference tool shape data to detect the deformation of the tool; and
correcting a path taught by a program in accordance with the result of the detection.
7. A movement path correcting method for a robot according to claim 6, wherein said camera is placed in a fixed position, and said robot is controlled so that the tool is brought to a photographing position for the camera.
8. A movement path correcting method for a robot according to claim 6, wherein said camera photographs the front and side faces of the tool to obtain three-dimensional shape data of the tool from the two image data.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP228716/1995 1995-09-06
JP7-228716 1995-09-06
JP7228716A JPH0970780A (en) 1995-09-06 1995-09-06 Tool shape correcting method of robot
PCT/JP1996/002548 WO1997009154A1 (en) 1995-09-06 1996-09-06 Apparatus and method for correcting a travelling route for a robot

Publications (2)

Publication Number Publication Date
US20010010539A1 true US20010010539A1 (en) 2001-08-02
US6414711B2 US6414711B2 (en) 2002-07-02

Family

ID=16880701

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/836,059 Expired - Lifetime US6414711B2 (en) 1995-09-06 1996-09-06 Apparatus for correcting movement path of a robot and a method therefor

Country Status (5)

Country Link
US (1) US6414711B2 (en)
EP (1) EP0796704B1 (en)
JP (1) JPH0970780A (en)
DE (1) DE69603066T2 (en)
WO (1) WO1997009154A1 (en)


Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19826395A1 (en) * 1998-06-12 1999-12-23 Amatec Gmbh Method for capturing and compensating for kinematic changes in a robot
US7420588B2 (en) * 1999-06-09 2008-09-02 Mitutoyo Corporation Measuring method, measuring system and storage medium
JP3556589B2 (en) * 2000-09-20 2004-08-18 ファナック株式会社 Position and orientation recognition device
US6738507B2 (en) * 2001-01-09 2004-05-18 Ford Global Technologies, Llc Apparatus and method for correlating part design geometry, manufacturing tool geometry, and manufactured part geometry
DE10164944B4 (en) * 2001-10-15 2013-03-28 Hermann, Dr.-Ing. Tropf Apparatus and method for correcting the movement of gripping and machining tools
US8010180B2 (en) 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
JP2004174662A (en) * 2002-11-27 2004-06-24 Fanuc Ltd Operation state analysis device for robot
DE10351669B4 (en) * 2003-11-05 2012-09-13 Kuka Laboratories Gmbh Method and device for controlling a handling device relative to an object
DE102004026185A1 (en) * 2004-05-28 2005-12-22 Kuka Roboter Gmbh Method and apparatus for operating a machine, such as a multi-axis industrial robot
CN1319704C (en) * 2004-10-21 2007-06-06 上海交通大学 Servo binocular vision sensors on welding robot
JP4087841B2 (en) * 2004-12-21 2008-05-21 ファナック株式会社 Robot controller
DE102005047489A1 (en) * 2005-10-04 2007-04-05 Ford Global Technologies, LLC, Dearborn Robot`s operation sequence and motion sequence programming method for motor vehicle manufacturing industry, involves additionally representing point corresponding to currently represented operation and motion conditions of robot in program
JP5665333B2 (en) * 2010-03-10 2015-02-04 キヤノン株式会社 Information processing apparatus and information processing apparatus control method
US9921712B2 (en) 2010-12-29 2018-03-20 Mako Surgical Corp. System and method for providing substantially stable control of a surgical tool
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
JP2012223839A (en) * 2011-04-15 2012-11-15 Yaskawa Electric Corp Robot system, and method for operating robot system
JP5561260B2 (en) * 2011-09-15 2014-07-30 株式会社安川電機 Robot system and imaging method
JP5975685B2 (en) 2012-03-09 2016-08-23 キヤノン株式会社 Information processing apparatus and information processing method
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
US9820818B2 (en) 2012-08-03 2017-11-21 Stryker Corporation System and method for controlling a surgical manipulator based on implant parameters
KR102668586B1 (en) 2012-08-03 2024-05-28 스트리커 코포레이션 Systems and methods for robotic surgery
CN105025835B (en) 2013-03-13 2018-03-02 史赛克公司 System for arranging objects in an operating room in preparation for a surgical procedure
AU2014248758B2 (en) 2013-03-13 2018-04-12 Stryker Corporation System for establishing virtual constraint boundaries
DE102013215430B4 (en) * 2013-08-06 2016-07-14 Lufthansa Technik Ag processing device
JP2015226963A (en) * 2014-06-02 2015-12-17 セイコーエプソン株式会社 Robot, robot system, control device, and control method
JP6459227B2 (en) 2014-06-02 2019-01-30 セイコーエプソン株式会社 Robot system
RU2017106913A (en) * 2014-09-17 2018-10-18 Нуово Пиньоне СРЛ CONTROL OF GEOMETRIC PARAMETERS AND OPTIMAL FITTING OF TOOLS FOR ELECTROEROSION PROCESSING
US10016892B2 (en) * 2015-07-23 2018-07-10 X Development Llc System and method for determining tool offsets
WO2017117369A1 (en) 2015-12-31 2017-07-06 Stryker Corporation System and methods for performing surgery on a patient at a target site defined by a virtual object
JP6434943B2 (en) * 2016-09-20 2018-12-05 本田技研工業株式会社 Assembly equipment
EP3554414A1 (en) 2016-12-16 2019-10-23 MAKO Surgical Corp. Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site
JP6499716B2 (en) * 2017-05-26 2019-04-10 ファナック株式会社 Shape recognition apparatus, shape recognition method, and program
JP7441707B2 (en) 2020-03-31 2024-03-01 株式会社ユーシン精機 Attachment three-dimensional shape measurement method
CN114653558B (en) * 2022-05-25 2022-08-02 苏州柳溪机电工程有限公司 Water blowing system for coating production line

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5773410A (en) * 1980-10-23 1982-05-08 Fanuc Ltd Numerical control system
US4613942A (en) * 1982-02-19 1986-09-23 Chen Richard M Orientation and control system for robots
JPS59180605A (en) 1983-03-31 1984-10-13 Hitachi Ltd Device for converting working data of robot
JPS60193013A (en) * 1984-03-15 1985-10-01 Hitachi Ltd Controller for robot equipped with visual sensor
JPS6126106A (en) * 1984-07-16 1986-02-05 Fanuc Ltd Correcting system of position of tool
JPS61173878A (en) 1985-01-30 1986-08-05 株式会社日立製作所 Individual-difference corresponding teach data correction system of robot
EP0205628B1 (en) * 1985-06-19 1990-06-13 International Business Machines Corporation Method for identifying three-dimensional objects using two-dimensional images
JPH0679325B2 (en) * 1985-10-11 1994-10-05 株式会社日立製作所 Position and orientation determination method
US4675502A (en) * 1985-12-23 1987-06-23 General Electric Company Real time tracking control for taught path robots
US5579444A (en) * 1987-08-28 1996-11-26 Axiom Bildverarbeitungssysteme Gmbh Adaptive vision-based controller
DE3903133A1 (en) * 1988-02-04 1989-08-31 Amada Co WORKPIECE WORKABILITY DETECTION METHOD AND METHOD FOR MACHINING A WORKPIECE BY MEANS OF A CHIP MACHINING MACHINE USING THIS METHOD
JPH0213804A (en) * 1988-07-01 1990-01-18 Fanuc Ltd Nominal setting system for vision sensor
US4985846A (en) * 1989-05-11 1991-01-15 Fallon Patrick J Acoustical/optical bin picking system
JPH0736989B2 (en) * 1990-01-19 1995-04-26 トキコ株式会社 Control method for industrial robot
US5086401A (en) * 1990-05-11 1992-02-04 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
US5255199A (en) 1990-12-14 1993-10-19 Martin Marietta Energy Systems, Inc. Cutting tool form compensaton system and method
JP2663726B2 (en) * 1991-01-08 1997-10-15 株式会社デンソー Multi-layer condition inspection equipment
JP2779072B2 (en) * 1991-01-28 1998-07-23 ファナック株式会社 Robot teaching method
JPH04269607A (en) * 1991-02-25 1992-09-25 Mitsui Eng & Shipbuild Co Ltd Apparatus for measuring size of substance
US5319443A (en) * 1991-03-07 1994-06-07 Fanuc Ltd Detected position correcting method
JPH04343178A (en) * 1991-05-20 1992-11-30 Sony Corp Image processor
DE4120746A1 (en) 1991-06-24 1993-01-14 Guenter Heilig AUTOMATIC TOOL MEASUREMENT
US5380978A (en) * 1991-07-12 1995-01-10 Pryor; Timothy R. Method and apparatus for assembly of car bodies and other 3-dimensional objects
US5577130A (en) * 1991-08-05 1996-11-19 Philips Electronics North America Method and apparatus for determining the distance between an image and an object
JP3230826B2 (en) 1991-10-16 2001-11-19 ファナック株式会社 Spot welding gun position correction method and robot welding device
JPH06143166A (en) 1992-11-06 1994-05-24 Meidensha Corp Method for correcting displacement in position of articulated robot
JPH0755427A (en) 1993-08-12 1995-03-03 Nec Corp Lead position detector
JP3665353B2 (en) * 1993-09-14 2005-06-29 ファナック株式会社 3D position correction amount acquisition method of robot teaching position data and robot system
US5572102A (en) * 1995-02-28 1996-11-05 Budd Canada Inc. Method and apparatus for vision control of welding robots
EP0812662B1 (en) * 1995-12-27 2008-01-23 Fanuc Ltd Composite sensor robot system
US5959425A (en) * 1998-10-15 1999-09-28 Fanuc Robotics North America, Inc. Vision guided automatic robotic path teaching method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8000837B2 (en) 2004-10-05 2011-08-16 J&L Group International, Llc Programmable load forming system, components thereof, and methods of use
CN103302666A (en) * 2012-03-09 2013-09-18 佳能株式会社 Information processing apparatus and information processing method
US9026234B2 (en) 2012-03-09 2015-05-05 Canon Kabushiki Kaisha Information processing apparatus and information processing method
US9156162B2 (en) 2012-03-09 2015-10-13 Canon Kabushiki Kaisha Information processing apparatus and information processing method
CN107000224A (en) * 2014-12-22 2017-08-01 川崎重工业株式会社 The deformation detection method of arm-and-hand system and end effector
JP2017170599A (en) * 2016-03-25 2017-09-28 ファナック株式会社 Positioning device using robot
US10525598B2 (en) 2016-03-25 2020-01-07 Fanuc Corporation Positioning system using robot
US11476138B2 (en) 2016-09-28 2022-10-18 Kawasaki Jukogyo Kabushiki Kaisha Diagnostic system of substrate transfer hand
WO2019071133A1 (en) * 2017-10-06 2019-04-11 Advanced Solutions Life Sciences, Llc End effector calibration assemblies, systems, and methods
US11260531B2 (en) * 2017-10-06 2022-03-01 Advanced Solutions Life Sciences, Llc End effector calibration assemblies, systems, and methods

Also Published As

Publication number Publication date
US6414711B2 (en) 2002-07-02
DE69603066D1 (en) 1999-08-05
EP0796704A1 (en) 1997-09-24
DE69603066T2 (en) 1999-10-21
JPH0970780A (en) 1997-03-18
WO1997009154A1 (en) 1997-03-13
EP0796704B1 (en) 1999-06-30
EP0796704A4 (en) 1997-10-15

Similar Documents

Publication Publication Date Title
US6414711B2 (en) Apparatus for correcting movement path of a robot and a method therefor
US9050728B2 (en) Apparatus and method for measuring tool center point position of robot
EP1555508B1 (en) Measuring system
US4380696A (en) Method and apparatus for manipulator welding apparatus with vision correction for workpiece sensing
US7386367B2 (en) Workpiece conveying apparatus
US7359817B2 (en) Method of and device for re-calibrating three-dimensional visual sensor in robot system
JP7153085B2 (en) ROBOT CALIBRATION SYSTEM AND ROBOT CALIBRATION METHOD
US7376488B2 (en) Taught position modification device
EP1607194B1 (en) Robot system comprising a plurality of robots provided with means for calibrating their relative position
US11267142B2 (en) Imaging device including vision sensor capturing image of workpiece
EP0884141B1 (en) Force control robot system with visual sensor for inserting work
JP2005138223A (en) Positional data correcting device for robot
US20080154428A1 (en) Device, method, program and recording medium for robot offline programming
EP1512499A2 (en) Robot having a camera mounted at the distal end of its arm and method for operating such a robot
JPH0784631A (en) Method for correcting robot teaching program
CN112549052B (en) Control device for robot device for adjusting position of robot-supported component
US11679508B2 (en) Robot device controller for controlling position of robot
JP3442140B2 (en) Position measurement device and position deviation correction device using three-dimensional visual sensor
JPH08132373A (en) Coordinate system coupling method in robot-sensor system
JPH1097311A (en) Correction and setting method for robot tool coordinate system and end effector used for the method
JPH01247285A (en) Method for calibration of work locating device
EP1314510B1 (en) Method of welding three-dimensional structure and apparatus for use in such method
WO2024111062A1 (en) Control device and computer-readable recording medium
JPH05301195A (en) Camera position slippage detecting method in visual sensor
JPH05337785A (en) Grinding path correcting device of grinder robot

Legal Events

Date Code Title Description
AS Assignment

Owner name: FANUC LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ARIMATSU, TARO;AKIYAMA, KAZUHIKO;REEL/FRAME:008610/0496

Effective date: 19970424

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12