WO2021059341A1 - Angle transfer error measuring device - Google Patents

Angle transfer error measuring device

Info

Publication number
WO2021059341A1
Authority
WO
WIPO (PCT)
Prior art keywords
angle
arm
mark
transmission error
camera
Prior art date
Application number
PCT/JP2019/037339
Other languages
French (fr)
Japanese (ja)
Inventor
武史 藤城
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to PCT/JP2019/037339
Priority to JP2021548010A (JP7152614B2)
Publication of WO2021059341A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • This specification discloses an angle transmission error measuring device.
  • the visual sensor control device measures the position of the mark member by imaging the mark member with the camera at predetermined time intervals while the robot control device rotates the arm over the entire rotatable angle range, converts the measured position into rotation angle data, and notifies the robot control device.
  • the measurement angle recording unit of the robot control device records the rotation angle notified from the visual sensor control device as measurement angle data.
  • the motor rotation angle recording unit records the position data detected by the encoder at the same times as the visual sensor control device, converts the position data at each predetermined time into the output-side angle of the speed reducer, and records the result as motor rotation angle data for each predetermined time.
  • the transmission error correction amount calculation unit calculates the transmission error correction amount based on the difference between the measurement angle data and the motor rotation angle data at each predetermined time.
  • When the angle of the arm is measured by processing an image captured by the camera, it is desirable to increase the resolution of the image in order to secure sufficient measurement accuracy.
  • However, increasing the resolution of the image may narrow the field of view of the camera. In this case, since the angle range of the arm within which the mark can be imaged is limited, the angle measurement range is narrowed.
  • a main object of the present disclosure is to provide an angle transmission error measuring device capable of measuring an angle transmission error generated between an input side and an output side of a speed reducer with good accuracy in a wider range.
  • This disclosure has taken the following measures to achieve the above-mentioned main purpose.
  • the angle transmission error measuring device of the present disclosure is a device that is used for a horizontal articulated robot having an arm that can be driven horizontally by power transmitted from a motor via a speed reducer, and that measures the angle transmission error occurring between the input side and the output side of the speed reducer. The device includes a camera attached to the arm, an encoder that detects the angle of the arm, and a mark member on which a first mark enters the field of view of the camera when the angle of the arm is in a first angle range and a second mark enters the field of view of the camera when the angle of the arm is in a second angle range.
  • the first and second marks are provided so that both the first and second marks are in the field of view of the camera when the angle of the arm is in the intermediate angle range between the first and second angle ranges.
  • The device also includes a control device that gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range; images the mark member with the camera and acquires the detection angle of the arm detected by the encoder each time the arm is moved; recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm; recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm; and obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm. This is the gist of the disclosure.
  • the angle transmission error measuring device of the present disclosure includes a camera attached to the arm, an encoder for detecting the angle of the arm, a mark member, and a control device.
  • On the mark member, the first mark enters the field of view of the camera when the angle of the arm is in the first angle range, and the second mark enters the field of view of the camera when the angle of the arm is in the second angle range.
  • The first and second marks are provided so that both marks are in the field of view of the camera when the angle of the arm is in the intermediate angle range between the first and second angle ranges.
  • the control device gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, and each time the arm moves, images the mark member with the camera and acquires the detection angle of the arm detected by the encoder. Subsequently, the control device performs image processing that recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, and recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm. The control device then obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm.
  • With this configuration, the mark member can be imaged by the camera attached to the arm from the first angle range through the intermediate angle range to the second angle range, so that by performing image processing on the captured images the angle of the arm can be recognized over a wider range. Further, because the image processing uses images captured by a camera with a relatively narrow field of view, the resolution of the images can easily be increased, and the angle of the arm can be measured with high accuracy. As a result, the angle transmission error occurring between the input side and the output side of the speed reducer can be measured with good accuracy over a wider range.
  • FIG. 1 is an external perspective view of the work robot 10.
  • FIG. 2 is a side view of the work robot 10.
  • FIG. 3 is a block diagram showing an electrical connection relationship between the robot body 20 and the control device 70.
  • In FIGS. 1 and 2, the front-rear direction is the X-axis direction, the left-right direction is the Y-axis direction, and the up-down direction is the Z-axis direction.
  • the work robot 10 is configured as a horizontal articulated robot that performs predetermined work on a workpiece placed in the work area A.
  • the work robot 10 includes a robot body 20 (see FIGS. 1 to 3) and a control device 70 (see FIG. 3) that controls the robot body 20.
  • the robot body 20 includes a base 22, a first arm 24, a second arm 26, a tip shaft 28, a first arm drive unit 30, a second arm drive unit 40, and a tip shaft drive unit 50.
  • the base 22 is fixed to the support base 12.
  • the first arm 24 is rotatably connected to the base 22 in a horizontal plane via the first joint shaft 24a (J1 shaft).
  • the second arm 26 is rotatably connected to the first arm 24 in a horizontal plane via the second joint shaft 26a (J2 shaft).
  • the tip shaft 28 is connected to the tip of the second arm 26 so as to be able to move up and down with respect to the second arm 26.
  • Various tools for performing work on the workpiece can be attached to the tip shaft 28.
  • the first arm drive unit 30 includes a motor 32, a speed reducer 34, and an encoder 36.
  • the rotating shaft of the motor 32 is connected to the first joint shaft 24a via a speed reducer 34.
  • By driving the motor 32, the first arm drive unit 30 rotationally drives the first arm 24 about the first joint shaft 24a with the torque transmitted to the first joint shaft 24a via the speed reducer 34.
  • the speed reducer 34 is configured as, for example, a strain wave gearing speed reducer.
  • a strain wave gearing reducer (not shown) includes a wave generator, a flexspline, and a circular spline.
  • the wave generator is composed of an elliptical cam and a bearing fitted on the outer circumference thereof.
  • the flexspline is a thin-walled cup-shaped elastic body, and teeth are formed on the outer periphery of the opening thereof.
  • the circular spline is a ring-shaped rigid body, and teeth are formed on the inner circumference thereof so as to have two more teeth than the number of teeth of the flexspline.
  • the speed reducer 34 transmits power by rotating the wave generator with the flexspline fixed, thereby sequentially moving the meshing position with the circular spline while elastically deforming the flexspline.
  • the encoder 36 is attached to the rotating shaft of the motor 32 and is configured as a rotary encoder that detects the amount of rotational displacement of the motor 32.
  • the second arm drive unit 40 includes a motor 42, a speed reducer 44, and an encoder 46, similarly to the first arm drive unit 30.
  • the rotation shaft of the motor 42 is connected to the second joint shaft 26a via the speed reducer 44.
  • By driving the motor 42, the second arm drive unit 40 rotationally drives the second arm 26 about the second joint shaft 26a with the torque transmitted to the second joint shaft 26a via the speed reducer 44.
  • the speed reducer 44 is composed of, for example, a strain wave gearing speed reducer similar to the speed reducer 34 of the first arm drive unit 30.
  • the encoder 46 is attached to the rotating shaft of the motor 42 and is configured as a rotary encoder that detects the amount of rotational displacement of the motor 42.
  • the tip shaft drive unit 50 includes a motor 52 and an encoder 56.
  • the motor 52 is connected to a ball screw mechanism (not shown), and moves the tip shaft 28 up and down by driving the ball screw mechanism.
  • the tip shaft drive unit 50 can raise and lower the tool mounted on the tip shaft 28 by raising and lowering the tip shaft 28.
  • the encoder 56 is configured as a linear encoder that detects the elevating position of the tip shaft 28.
  • the control device 70 includes a CPU 71, a ROM 72, a RAM 73, and an input / output interface (not shown). Position signals and the like from the encoders 36, 46, and 56 are input to the control device 70 via the input / output interface. From the control device 70, drive signals to the motors 32, 42, 52 and the like are output via the input / output interface.
  • In operation, the control device 70 of the work robot 10 first acquires the target position of the tool mounted on the tip shaft 28. Subsequently, the control device 70 calculates the target rotation angle of the first joint shaft 24a of the first arm 24 and the target rotation angle of the second joint shaft 26a of the second arm 26 for moving the tool to the target position. The control device 70 then controls the motors 32 and 42 so that the rotation angle of the first joint shaft 24a detected by the encoder 36 matches its target rotation angle and the rotation angle of the second joint shaft 26a detected by the encoder 46 matches its target rotation angle.
  • the rotating shaft of the motor 32 is connected to the first joint shaft 24a via the speed reducer 34 (strain wave gearing speed reducer). Further, the rotating shaft of the motor 42 is connected to the second joint shaft 26a via a speed reducer 44 (strain wave gearing speed reducer).
  • In the speed reducers 34 and 44, due to machining errors and assembly errors of their components, there is a difference (an angle transmission error) between the rotation angle that the output should theoretically take when an arbitrary rotation angle is given as the input and the rotation angle that the output actually takes.
  • When the speed reducers 34 and 44 include an angle transmission error, as shown in FIG. 4, periodic speed fluctuations occur in the first and second joint shafts 24a and 26a even if the motors 32 and 42 rotate at a constant speed, which makes it difficult to move the tip shaft 28 (tool) to the target position with high accuracy. Therefore, the work robot 10 of the present embodiment measures the angle transmission error in advance in order to correct the angle transmission error included in the speed reducers 34 and 44.
  • the angle transmission error is measured by mounting the camera 60 on the tip shaft 28 as a tool, installing in the work area A the flat plate-shaped mark member 61 on which the first mark M1 and the second mark M2 are placed at positions separated by a predetermined distance in the X-axis direction, and then causing the control device 70 to execute the angle transmission error measurement process.
  • the angle transmission error measurement process will be described.
  • the camera 60, the mark member 61, and the control device 70 that controls the work robot 10 and inputs and processes the image captured by the camera 60 correspond to the angle transmission error measuring device of the present disclosure.
  • FIG. 5 is a flowchart showing an example of the angle transmission error measurement process executed by the CPU 71 of the control device 70.
  • In the angle transmission error measurement process, the CPU 71 of the control device 70 first sets the target joint axis, that is, the joint axis whose angle transmission error is to be measured, from among the first joint axis 24a and the second joint axis 26a (S100), and drives and controls the corresponding motor so that the angle of the set target joint axis becomes the measurement start angle (S110).
  • the CPU 71 drives and controls the motor 32 when the target joint axis is the first joint axis 24a, and drives and controls the motor 42 when the target joint axis is the second joint axis 26a.
  • the measurement start angle is a predetermined rotation angle within the rotation angle range of the target joint axis in which the first mark M1 is included in the field of view of the camera 60 and the second mark M2 is not included.
  • the CPU 71 then initializes the variable n to the value 1 (S120), images the mark member 61 with the camera 60 (S130), and performs image processing on the captured image (S140). The image processing is performed by executing the image processing routine of FIG. 6.
  • Here, the description of the angle transmission error measurement process is interrupted to describe the image processing.
  • In the image processing of FIG. 6, the CPU 71 first performs a recognition process that recognizes the first and second marks M1 and M2 from the captured image (S300). This process can be performed, for example, by preparing template images of the first mark M1 and the second mark M2 in advance and searching the captured image for regions similar to the template images. Based on the result of the recognition process, the CPU 71 determines whether only the first mark M1 was recognized from the captured image (S310) and whether only the second mark M2 was recognized (S320).
  • When the CPU 71 determines that only the first mark M1 was recognized, it derives the imaging position (x1(n), y1(n)) of the first mark M1 based on the recognition result (S330) and ends the image processing.
  • When the CPU 71 determines that only the second mark M2 was recognized, it derives the imaging position (x2(n), y2(n)) of the second mark M2 based on the recognition result (S340) and ends the image processing.
  • When the CPU 71 determines that both the first mark M1 and the second mark M2 were recognized ("NO" in S310 and "NO" in S320), it derives the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) of the first mark M1 and the second mark M2, respectively, based on the recognition result (S350) and ends the image processing.
  • the imaging position (x1(n), y1(n)) indicates the position of the camera 60 when the first mark M1 is imaged for the n-th time after the measurement of the target joint axis is started. Similarly, the imaging position (x2(n), y2(n)) indicates the position of the camera 60 when the second mark M2 is imaged for the n-th time after the measurement of the target joint axis is started.
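The patent describes the recognition step (S300) only as a search for regions similar to prepared template images. The sketch below illustrates that idea; it is a minimal, assumed implementation in which the use of OpenCV's `cv2.matchTemplate`, the acceptance threshold, and the function names are illustrative choices, not part of the disclosure.

```python
# Illustrative sketch of the mark recognition step (S300-S350): template
# matching over the captured image. Threshold and helper names are assumptions.
import cv2

MATCH_THRESHOLD = 0.8  # assumed acceptance score for a template match


def find_mark(image, template):
    """Return the (x, y) centre of the best template match, or None."""
    result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < MATCH_THRESHOLD:
        return None  # the mark is presumably outside the field of view
    h, w = template.shape[:2]
    return (max_loc[0] + w / 2.0, max_loc[1] + h / 2.0)


def recognize_marks(image, template_m1, template_m2):
    """Sketch of S310-S350: imaging positions of M1 and/or M2, if visible."""
    pos_m1 = find_mark(image, template_m1)  # (x1(n), y1(n)) when M1 is seen
    pos_m2 = find_mark(image, template_m2)  # (x2(n), y2(n)) when M2 is seen
    return pos_m1, pos_m2
```

In the patent the derived imaging position is interpreted as the position of the camera 60; converting pixel coordinates into that position would additionally require the camera's calibration, which this sketch leaves out.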
  • Returning to the angle transmission error measurement process, the CPU 71 rotates the target joint axis by a predetermined angle (S150) and images the mark member 61 with the camera 60 (S160). Subsequently, the CPU 71 acquires, from the corresponding encoder, the amount of rotational displacement of the target joint axis from the previous imaging to the current imaging as the input angle θi(n) (S170). The CPU 71 then calculates the theoretical output angle θo1(n) from the input angle θi(n) by the following equation (1) (S180). In equation (1), "γ" denotes the reduction ratio of the speed reducer provided on the target joint axis. The theoretical output angle θo1(n) is the output angle theoretically output from the speed reducer when the input angle is given to it as an input.
  • ⁇ o1 (n) ⁇ i (n) / ⁇ (1)
  • Next, the CPU 71 performs the same image processing as in step S140 on the image captured in step S160, and calculates one or both of the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) of the first mark M1 and the second mark M2 (S190). Subsequently, the CPU 71 calculates the amount of rotational displacement, about the target joint axis, of the same mark (the first mark M1 or the second mark M2) from the previously calculated imaging position to the currently calculated imaging position, and sets the calculated amount of rotational displacement as the actual output angle θo2(n) (S200).
  • the amount of rotational displacement can easily be obtained from the xy coordinates of the imaging position calculated last time, the xy coordinates of the imaging position calculated this time, and the known xy coordinates of the target joint axis. The CPU 71 then calculates the angle transmission error Δθ(n) for the variable n by the following equation (2), obtained by subtracting the actual output angle θo2(n) from the theoretical output angle θo1(n) (S210), and stores the calculated angle transmission error Δθ(n) in the RAM 73 (S220).
  • ⁇ (n) ⁇ o1 (n) - ⁇ o2 (n) (2)
  • After the CPU 71 calculates and stores the angle transmission error Δθ(n) for the variable n, it determines whether the angle of the target joint axis has reached the measurement end angle (S230).
  • the measurement end angle is a predetermined rotation angle within the rotation angle range of the target joint axis in which the second mark M2 is included in the field of view of the camera 60 and the first mark M1 is not included.
  • When the CPU 71 determines that the measurement end angle has not yet been reached, the variable n is incremented by 1 (S240), and the process returns to step S150.
  • In this way, the CPU 71 rotates the target joint axis by the predetermined angle at a time from the measurement start angle to the measurement end angle, and each time the target joint axis rotates, images the marks (one or both of the first mark M1 and the second mark M2) with the camera 60 and acquires the input angle θi(n) of the target joint axis from the corresponding encoder. The CPU 71 then obtains the theoretical output angle θo1(n) from the input angle θi(n), obtains the actual output angle θo2(n) of the target joint axis from the positions of the marks recognized in the captured images, and calculates the angle transmission error Δθ(n) for each rotation of the target joint axis by taking the difference between the theoretical output angle θo1(n) and the actual output angle θo2(n). The overall loop is sketched below.
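The measurement loop (S130 through S240) can be summarised by the outline below. The callables passed in stand for the robot motion, encoder read-out, and mark-based angle measurement steps of FIG. 5; they are hypothetical placeholders, not interfaces defined in the patent.

```python
# Outline of the measurement loop of FIG. 5. Each callable is a hypothetical
# stand-in: rotate_joint() performs S150, read_encoder_delta() returns the
# input angle theta_i(n) of S170, and capture_mark_angle() returns the mark's
# absolute angle about the target joint axis derived from the captured image
# (S160, S190-S200).
def measure_angle_transmission_error(rotate_joint, read_encoder_delta,
                                     capture_mark_angle, gamma, n_steps):
    errors = []
    prev_angle = capture_mark_angle()          # first imaging (S130-S140)
    for _ in range(n_steps):
        rotate_joint()                         # S150: rotate by the step angle
        curr_angle = capture_mark_angle()      # S160, S190
        theta_i = read_encoder_delta()         # S170: encoder input angle
        theta_o1 = theta_i / gamma             # S180: equation (1)
        theta_o2 = curr_angle - prev_angle     # S200: actual output angle
        errors.append(theta_o1 - theta_o2)     # S210-S220: equation (2)
        prev_angle = curr_angle
    return errors
```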
  • FIGS. 7A to 7E are explanatory views showing how the angle transmission error of the first joint (J1) is measured.
  • While the angle of the first joint axis 24a is within the first angle range including the measurement start angle, only the first mark M1 enters the field of view B of the camera 60, so only the first mark M1 is imaged by the camera 60, and the imaging position (x1(n), y1(n)) is derived from the captured image of the first mark M1 (see FIGS. 7A and 7B).
  • When the angle of the first joint axis 24a is within the intermediate angle range, both the first mark M1 and the second mark M2 enter the field of view of the camera 60, both marks are imaged by the camera 60, and the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) are derived from the captured images of the first mark M1 and the second mark M2 (see FIG. 7C).
  • When the angle of the first joint axis 24a reaches the second angle range including the measurement end angle, the first mark M1 leaves the field of view of the camera 60 and only the second mark M2 enters it. Only the second mark M2 is imaged by the camera 60, and the imaging position (x2(n), y2(n)) is derived from the captured image of the second mark M2 (see FIGS. 7D and 7E).
  • the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) are plotted on concentric circles of different radii centered on the target joint axis.
  • Because the marks are imaged each time the first joint axis 24a is rotated by the predetermined angle, the actual output angle θo2(n) calculated from the change in the imaging position (x1(n), y1(n)) of the first mark M1 can be made continuous with the actual output angle θo2(n) calculated from the change in the imaging position (x2(n), y2(n)) of the second mark M2.
  • As a result, the angle transmission error Δθ(n) can be measured continuously over a wider range. That is, a continuous actual output angle θo2(n) can be obtained from the measurement start angle to the measurement end angle (see FIG. 9), as illustrated in the sketch below.
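FIG. 9 shows a continuous actual output angle from the measurement start angle to the measurement end angle. The patent does not spell out how the M1-based and M2-based values are joined, but one straightforward reading, assumed here, is that the per-step displacements of whichever mark is visible in two consecutive images are simply accumulated:

```python
# Assumed accumulation of per-step actual output angles into one continuous
# series, switching from mark M1 to mark M2 inside the intermediate range.
def continuous_actual_output_angle(delta_m1, delta_m2):
    """delta_m1[n] / delta_m2[n]: rotational displacement of M1 / M2 between
    imaging n-1 and imaging n, or None when that mark was not visible in both
    images. Returns the cumulative actual output angle after each step."""
    total, series = 0.0, [0.0]
    for d1, d2 in zip(delta_m1, delta_m2):
        step = d1 if d1 is not None else d2  # prefer M1; M2 takes over later
        total += step
        series.append(total)
    return series
```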
  • the resolution of the image is determined by the number of pixels of the camera 60 and the size of its field of view.
  • Because a camera 60 with a relatively narrow field of view is used, the resolution of the image can be increased and the marks can be recognized with high accuracy.
  • Consequently, the angle transmission error can be measured with good accuracy while a sufficient measurement range is ensured. A rough numeric illustration follows.
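As a hedged illustration of why a narrow field of view helps (all values below are assumptions, not figures from the patent): the length covered by one pixel is the field-of-view width divided by the pixel count, and dividing that length by the mark's distance from the joint axis gives the corresponding angular resolution.

```python
# Illustrative resolution estimate with assumed numbers (not from the patent).
import math

fov_mm = 10.0       # assumed field-of-view width of the narrow-view camera
pixels = 2000       # assumed pixel count across that width
radius_mm = 300.0   # assumed distance from the target joint axis to the mark

mm_per_pixel = fov_mm / pixels                        # 0.005 mm per pixel
angular_res_deg = math.degrees(mm_per_pixel / radius_mm)
print(f"{mm_per_pixel:.3f} mm/px -> about {angular_res_deg:.5f} deg per pixel")
# Halving the field of view (with the same sensor) halves both figures.
```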
  • When the CPU 71 determines that the angle of the target joint axis has reached the measurement end angle, it determines that the measurement of the target joint axis has been completed and then determines whether any unmeasured joint axis, for which the angle transmission error has not yet been measured, remains (S250). When the CPU 71 determines that an unmeasured joint axis remains, it returns to step S100, sets the unmeasured joint axis as a new target joint axis, and repeats the process. On the other hand, when the CPU 71 determines that no unmeasured joint axis remains, it ends the angle transmission error measurement process.
  • FIGS. 10A to 10E are explanatory views showing how the angle transmission error of the second joint (J2) is measured.
  • While the angle of the second joint axis 26a is within the first angle range including the measurement start angle, only the first mark M1 enters the field of view B of the camera 60, so only the first mark M1 is imaged by the camera 60, and the imaging position (x1(n), y1(n)) is derived from the captured image of the first mark M1 (see FIGS. 10A and 10B).
  • When the angle of the second joint axis 26a is within the intermediate angle range, both the first mark M1 and the second mark M2 enter the field of view of the camera 60, both marks are imaged by the camera 60, and the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) are derived from the captured images of the first mark M1 and the second mark M2 (see FIG. 10C).
  • When the angle of the second joint axis 26a reaches the second angle range including the measurement end angle, the first mark M1 leaves the field of view of the camera 60 and only the second mark M2 enters it. Only the second mark M2 is imaged by the camera 60, and the imaging position (x2(n), y2(n)) is derived from the captured image of the second mark M2 (see FIGS. 10D and 10E).
  • the angle transmission error can be measured individually for each of the first joint shaft 24a and the second joint shaft 26a.
  • Here, the motors 32 and 42 correspond to the motor, the speed reducers 34 and 44 correspond to the speed reducer, the first arm 24 and the second arm 26 correspond to the arm, the work robot 10 corresponds to the horizontal articulated robot, the camera 60 corresponds to the camera, the encoders 36 and 46 correspond to the encoder, the first mark M1 corresponds to the first mark, the second mark M2 corresponds to the second mark, the mark member 61 corresponds to the mark member, and the control device 70 corresponds to the control device.
  • the first joint shaft 24a and the second joint shaft 26a correspond to the plurality of joints.
  • In the embodiment described above, the speed reducers 34 and 44 are configured as strain wave gearing speed reducers. However, the speed reducers 34 and 44 are not limited to this and may be configured as other geared speed reducers, such as planetary gear speed reducers. Even in this case, an angle transmission error occurs in the speed reducer due to tooth pitch errors and the like, so the accuracy of the robot can be improved by measuring the angle transmission error.
  • In the embodiment described above, the mark member 61 includes two marks (the first mark M1 and the second mark M2). However, the mark member 61 may include three or more marks. For example, when the mark member includes the first mark M1, the second mark M2, and a third mark M3, and the arm is moved from the first angle range through the second angle range to a third angle range, the third mark M3 may be arranged so that it is in the field of view of the camera 60 when the angle of the arm is in the third angle range, and so that both the second and third marks M2 and M3 are in the field of view of the camera 60 when the angle of the arm is in the intermediate angle range between the second and third angle ranges. As a result, the measurement range of the angle transmission error can be further expanded.
  • In the embodiment described above, the work robot 10 includes two joint axes (the first joint axis 24a and the second joint axis 26a). However, the number of joint axes may be one, or three or more.
  • As described above, the angle transmission error measuring device of the present disclosure is an angle transmission error measuring device that is used for a horizontal articulated robot having an arm that can be driven horizontally by power transmitted from a motor via a speed reducer, and that measures the angle transmission error occurring between the input side and the output side of the speed reducer. The gist of the device is that it includes: a camera attached to the arm; an encoder that detects the angle of the arm; a mark member provided with a first mark and a second mark such that the first mark is in the field of view of the camera when the angle of the arm is in a first angle range, the second mark is in the field of view of the camera when the angle of the arm is in a second angle range, and both the first and second marks are in the field of view of the camera when the angle of the arm is in an intermediate angle range between the first and second angle ranges; and a control device that gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, images the mark member with the camera and acquires the detection angle of the arm detected by the encoder each time the arm is moved, recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm, and obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm.
  • the angle transmission error measuring device of the present disclosure includes a camera attached to the arm, an encoder for detecting the angle of the arm, a mark member, and a control device.
  • On the mark member, the first mark enters the field of view of the camera when the angle of the arm is in the first angle range, and the second mark enters the field of view of the camera when the angle of the arm is in the second angle range.
  • The first and second marks are provided so that both marks are in the field of view of the camera when the angle of the arm is in the intermediate angle range between the first and second angle ranges.
  • the control device gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, and each time the arm moves, images the mark member with the camera and acquires the detection angle of the arm detected by the encoder. Subsequently, the control device performs image processing that recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, and recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm. The control device then obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm.
  • With this configuration, the mark member can be imaged by the camera attached to the arm from the first angle range through the intermediate angle range to the second angle range, so that by performing image processing on the captured images the angle of the arm can be recognized over a wider range. Further, because the image processing uses images captured by a camera with a relatively narrow field of view, the resolution of the images can easily be increased, and the angle of the arm can be measured with high accuracy. As a result, the angle transmission error occurring between the input side and the output side of the speed reducer can be measured with good accuracy over a wider range.
  • In the angle transmission error measuring device of the present disclosure, the horizontal articulated robot may have a plurality of joints each of which can be driven horizontally by power transmitted from a motor via a speed reducer, and the control device may operate the plurality of joints individually to obtain the angle transmission error for each joint. In this way, the angle transmission error measuring device of the present disclosure can be similarly applied to a horizontal articulated robot having a plurality of joints.
  • the speed reducer may be a strain wave gearing speed reducer.
  • This disclosure can be used in the manufacturing industry of angle transmission error measuring devices and the like.
  • Reference signs: 10 work robot, 12 support base, 20 robot body, 22 base, 24 first arm, 24a first joint shaft, 26 second arm, 26a second joint shaft, 28 tip shaft, 30 first arm drive unit, 32 motor, 34 speed reducer, 36 encoder, 40 second arm drive unit, 42 motor, 44 speed reducer, 46 encoder, 50 tip shaft drive unit, 52 motor, 56 encoder, 60 camera, 61 mark member, 70 control device, 71 CPU, 72 ROM, 73 RAM, A work area, M1 first mark, M2 second mark.

Abstract

According to the present invention, a mark member is configured such that a first mark enters the field of view of a camera when the angle of an arm is in a first angle range, a second mark enters the field of view of the camera when the angle of the arm is in a second angle range, and both the first and second marks enter the field of view of the camera when the angle of the arm is in an intermediate angle range. The control device gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, captures an image of the mark member with the camera each time the arm moves, and acquires the detection angle of the arm detected by an encoder. Subsequently, the control device performs an image process for recognizing the first mark on the basis of the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, and recognizing the second mark on the basis of the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm. Then, an angle transmission error is obtained on the basis of the actual angle and the detection angle for each movement of the arm.

Description

Angle transmission error measuring device
This specification discloses an angle transmission error measuring device.
Conventionally, for a robot in which a robot arm is driven by power transmitted from a servomotor to a drive shaft via a speed reducer, an angle transmission error measuring method has been proposed that measures the angle transmission error occurring between the angle on the input side of the speed reducer and the angle on the output side of the speed reducer (see, for example, Patent Document 1). The robot is controlled by a robot control device. The robot control device has a measurement angle recording unit, a motor rotation angle recording unit, and a transmission error correction amount calculation unit. A mark member is attached to the robot arm. In addition, a position measuring means having a camera and a visual sensor control device communicably connected to the robot control device is installed externally. While the robot control device rotates the arm over the entire rotatable angle range, the visual sensor control device images the mark member with the camera at predetermined time intervals to measure the position of the mark member, converts the measured position into rotation angle data, and notifies the robot control device. The measurement angle recording unit of the robot control device records the rotation angle notified from the visual sensor control device as measurement angle data. The motor rotation angle recording unit records the position data detected by the encoder at the same times as the visual sensor control device, converts the position data at each predetermined time into the output-side angle of the speed reducer, and records the result as motor rotation angle data for each predetermined time. The transmission error correction amount calculation unit calculates the transmission error correction amount based on the difference between the measurement angle data and the motor rotation angle data at each predetermined time.
Patent Document 1: Japanese Unexamined Patent Publication No. 2010-120110
Here, when the angle of the arm is measured by processing an image captured by the camera, it is desirable to increase the resolution of the image in order to secure sufficient measurement accuracy. However, increasing the resolution of the image may narrow the field of view of the camera. In this case, since the angle range of the arm within which the mark can be imaged is limited, the angle measurement range is narrowed.
A main object of the present disclosure is to provide an angle transmission error measuring device capable of measuring the angle transmission error occurring between the input side and the output side of a speed reducer with good accuracy over a wider range.
The present disclosure employs the following means to achieve the above main object.
The angle transmission error measuring device of the present disclosure is
an angle transmission error measuring device that is used for a horizontal articulated robot having an arm drivable horizontally by power transmitted from a motor via a speed reducer and that measures an angle transmission error occurring between the input side and the output side of the speed reducer, the device comprising:
a camera attached to the arm;
an encoder that detects the angle of the arm;
a mark member provided with a first mark and a second mark such that the first mark enters the field of view of the camera when the angle of the arm is in a first angle range, the second mark enters the field of view of the camera when the angle of the arm is in a second angle range, and both the first and second marks enter the field of view of the camera when the angle of the arm is in an intermediate angle range between the first and second angle ranges; and
a control device that gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, images the mark member with the camera and acquires the detection angle of the arm detected by the encoder each time the arm is moved, recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm, and obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm.
This is the gist of the disclosure.
The angle transmission error measuring device of the present disclosure includes a camera attached to the arm, an encoder that detects the angle of the arm, a mark member, and a control device. On the mark member, the first mark enters the field of view of the camera when the angle of the arm is in the first angle range, and the second mark enters the field of view of the camera when the angle of the arm is in the second angle range; the first and second marks are provided so that both marks are in the field of view of the camera when the angle of the arm is in the intermediate angle range between the first and second angle ranges. The control device gradually moves the angle of the arm over the first angle range, the intermediate angle range, and the second angle range, and each time the arm moves, images the mark member with the camera and acquires the detection angle of the arm detected by the encoder. Subsequently, the control device performs image processing that recognizes the first mark based on the images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, and recognizes the second mark based on the images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm. The control device then obtains the angle transmission error based on the actual angle and the detection angle for each movement of the arm. With this configuration, the mark member can be imaged by the camera attached to the arm from the first angle range through the intermediate angle range to the second angle range, so that by performing image processing on the captured images the angle of the arm can be recognized over a wider range. Further, because the image processing uses images captured by a camera with a relatively narrow field of view, the resolution of the images can easily be increased, and the angle of the arm can be measured with high accuracy. As a result, the angle transmission error occurring between the input side and the output side of the speed reducer can be measured with good accuracy over a wider range.
FIG. 1 is an external perspective view of the work robot 10.
FIG. 2 is a side view of the work robot 10.
FIG. 3 is a block diagram showing the electrical connection relationship between the robot body 20 and the control device 70.
FIG. 4 is an explanatory diagram showing an example of the angle transmission error.
FIG. 5 is a flowchart showing an example of the angle transmission error measurement process.
FIG. 6 is a flowchart showing an example of the image processing.
FIGS. 7A to 7E are explanatory diagrams showing how the angle transmission error of the first joint (J1) is measured.
FIG. 8 is an explanatory diagram showing an image processing result.
FIG. 9 is an explanatory diagram showing the actual output angle.
FIGS. 10A to 10E are explanatory diagrams showing how the angle transmission error of the second joint (J2) is measured.
Next, a mode for carrying out the present disclosure will be described with reference to the drawings.
FIG. 1 is an external perspective view of the work robot 10. FIG. 2 is a side view of the work robot 10. FIG. 3 is a block diagram showing the electrical connection relationship between the robot body 20 and the control device 70. In FIGS. 1 and 2, the front-rear direction is the X-axis direction, the left-right direction is the Y-axis direction, and the up-down direction is the Z-axis direction.
The work robot 10 is configured as a horizontal articulated robot that performs predetermined work on a workpiece placed in the work area A. The work robot 10 includes a robot body 20 (see FIGS. 1 to 3) and a control device 70 (see FIG. 3) that controls the robot body 20. As shown in FIGS. 1 and 2, the robot body 20 includes a base 22, a first arm 24, a second arm 26, a tip shaft 28, a first arm drive unit 30, a second arm drive unit 40, and a tip shaft drive unit 50.
The base 22 is fixed to the support base 12. The first arm 24 is connected to the base 22 via the first joint shaft 24a (J1 axis) so as to be rotatable in a horizontal plane. The second arm 26 is connected to the first arm 24 via the second joint shaft 26a (J2 axis) so as to be rotatable in a horizontal plane. The tip shaft 28 is connected to the tip of the second arm 26 so as to be able to move up and down with respect to the second arm 26. Various tools for performing work on the workpiece can be attached to the tip shaft 28.
As shown in FIG. 2, the first arm drive unit 30 includes a motor 32, a speed reducer 34, and an encoder 36. The rotating shaft of the motor 32 is connected to the first joint shaft 24a via the speed reducer 34. By driving the motor 32, the first arm drive unit 30 rotationally drives the first arm 24 about the first joint shaft 24a with the torque transmitted to the first joint shaft 24a via the speed reducer 34. The speed reducer 34 is configured as, for example, a strain wave gearing speed reducer. The strain wave gearing speed reducer, although not shown, has a wave generator, a flexspline, and a circular spline. The wave generator is composed of an elliptical cam and a bearing fitted on its outer circumference. The flexspline is a thin-walled cup-shaped elastic body, and teeth are formed on the outer periphery of its opening. The circular spline is a ring-shaped rigid body, and teeth are formed on its inner circumference so as to number two more than the teeth of the flexspline. The speed reducer 34 transmits power by rotating the wave generator with the flexspline fixed, so that the meshing position with the circular spline moves sequentially while the flexspline is elastically deformed. The encoder 36 is attached to the rotating shaft of the motor 32 and is configured as a rotary encoder that detects the amount of rotational displacement of the motor 32.
As shown in FIG. 2, the second arm drive unit 40 includes a motor 42, a speed reducer 44, and an encoder 46, similarly to the first arm drive unit 30. The rotating shaft of the motor 42 is connected to the second joint shaft 26a via the speed reducer 44. By driving the motor 42, the second arm drive unit 40 rotationally drives the second arm 26 about the second joint shaft 26a with the torque transmitted to the second joint shaft 26a via the speed reducer 44. The speed reducer 44 is composed of, for example, a strain wave gearing speed reducer similar to the speed reducer 34 of the first arm drive unit 30. The encoder 46 is attached to the rotating shaft of the motor 42 and is configured as a rotary encoder that detects the amount of rotational displacement of the motor 42.
As shown in FIG. 3, the tip shaft drive unit 50 includes a motor 52 and an encoder 56. The motor 52 is connected to a ball screw mechanism (not shown) and moves the tip shaft 28 up and down by driving the ball screw mechanism. By raising and lowering the tip shaft 28, the tip shaft drive unit 50 can raise and lower the tool mounted on the tip shaft 28. The encoder 56 is configured as a linear encoder that detects the lifting position of the tip shaft 28.
As shown in FIG. 3, the control device 70 includes a CPU 71, a ROM 72, a RAM 73, and an input/output interface (not shown). Position signals and the like from the encoders 36, 46, and 56 are input to the control device 70 via the input/output interface. Drive signals and the like to the motors 32, 42, and 52 are output from the control device 70 via the input/output interface.
The operation of the work robot 10 configured in this way will now be described. The control device 70 of the work robot 10 first acquires the target position of the tool mounted on the tip shaft 28. Subsequently, the control device 70 calculates the target rotation angle of the first joint shaft 24a of the first arm 24 and the target rotation angle of the second joint shaft 26a of the second arm 26 for moving the tool to the target position. The control device 70 then controls the motors 32 and 42 so that the rotation angle of the first joint shaft 24a detected by the encoder 36 matches its target rotation angle and the rotation angle of the second joint shaft 26a detected by the encoder 46 matches its target rotation angle.
Here, as described above, the rotating shaft of the motor 32 is connected to the first joint shaft 24a via the speed reducer 34 (a strain wave gearing speed reducer), and the rotating shaft of the motor 42 is connected to the second joint shaft 26a via the speed reducer 44 (a strain wave gearing speed reducer). In the speed reducers 34 and 44, due to machining errors and assembly errors of their components, there is a difference (an angle transmission error) between the rotation angle that the output should theoretically take when an arbitrary rotation angle is given as the input and the rotation angle that the output actually takes. When the speed reducers 34 and 44 include an angle transmission error, as shown in FIG. 4, periodic speed fluctuations occur in the first and second joint shafts 24a and 26a even if the motors 32 and 42 rotate at a constant speed, which makes it difficult to move the tip shaft 28 (tool) to the target position with high accuracy. Therefore, the work robot 10 of the present embodiment measures the angle transmission error in advance in order to correct the angle transmission error included in the speed reducers 34 and 44.
As shown in FIG. 1, the angle transmission error is measured by mounting the camera 60 on the tip shaft 28 as a tool, installing in the work area A the flat plate-shaped mark member 61 on which the first mark M1 and the second mark M2 are placed at positions separated by a predetermined distance in the X-axis direction, and then causing the control device 70 to execute the angle transmission error measurement process. The angle transmission error measurement process is described below. Here, the camera 60, the mark member 61, and the control device 70, which controls the work robot 10 and inputs and processes the images captured by the camera 60, correspond to the angle transmission error measuring device of the present disclosure.
FIG. 5 is a flowchart showing an example of the angle transmission error measurement process executed by the CPU 71 of the control device 70. In the angle transmission error measurement process, the CPU 71 of the control device 70 first sets the target joint axis, that is, the joint axis whose angle transmission error is to be measured, from among the first joint axis 24a and the second joint axis 26a (S100), and drives and controls the corresponding motor so that the angle of the set target joint axis becomes the measurement start angle (S110). The CPU 71 drives and controls the motor 32 when the target joint axis is the first joint axis 24a, and drives and controls the motor 42 when the target joint axis is the second joint axis 26a. In this embodiment, the measurement start angle is a predetermined rotation angle within the rotation angle range of the target joint axis in which the first mark M1 is within the field of view of the camera 60 and the second mark M2 is not. Subsequently, the CPU 71 initializes the variable n to the value 1 (S120). The CPU 71 then images the mark member 61 with the camera 60 (S130) and performs image processing on the captured image (S140). The image processing is performed by executing the image processing routine of FIG. 6. The description of the angle transmission error measurement process is interrupted here to describe the image processing.
 In the image processing of FIG. 6, the CPU 71 first performs a recognition process of recognizing the first and second marks M1 and M2 in the captured image (S300). This process can be performed, for example, by preparing template images of the first mark M1 and the second mark M2 in advance and searching the captured image for regions similar to the template images. As a result of the recognition process, the CPU 71 determines whether only the first mark M1 was recognized in the captured image (S310) and whether only the second mark M2 was recognized (S320). When the CPU 71 determines that only the first mark M1 was recognized, it derives the imaging position (x1(n), y1(n)) of the first mark M1 based on the recognition result (S330) and ends the image processing. When the CPU 71 determines that only the second mark M2 was recognized, it derives the imaging position (x2(n), y2(n)) of the second mark M2 based on the recognition result (S340) and ends the image processing. When the CPU 71 determines that both the first mark M1 and the second mark M2 were recognized ("NO" in S310 and "NO" in S320), it derives the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) of the first mark M1 and the second mark M2 based on the recognition result (S350) and ends the image processing. The imaging position (x1(n), y1(n)) indicates the position of the camera 60 at the n-th imaging of the first mark M1 after the measurement of the target joint shaft is started. Likewise, the imaging position (x2(n), y2(n)) indicates the position of the camera 60 at the n-th imaging of the second mark M2 after the measurement of the target joint shaft is started.
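 As an informal illustration of the recognition step (S300) described above, the following sketch locates each mark by template matching and reports its centre in pixel coordinates. The use of OpenCV, the 0.8 score threshold, and all function and variable names are assumptions made for illustration; the patent itself only states that template images are searched for within the captured image.

    # A minimal sketch of mark recognition by template matching (S300-S350).
    # OpenCV usage, the 0.8 threshold, and all names are illustrative assumptions.
    import cv2

    def find_mark(image_gray, template_gray, threshold=0.8):
        # Returns the (x, y) centre of the best match, or None when the score
        # is below the threshold (i.e. the mark is outside the field of view).
        result = cv2.matchTemplate(image_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < threshold:
            return None
        h, w = template_gray.shape[:2]
        return (max_loc[0] + w / 2.0, max_loc[1] + h / 2.0)

    def recognize_marks(image_gray, template_m1, template_m2):
        # Mirrors the branching of S310-S350: either, both, or neither mark
        # may be visible; a mark that is not found is reported as None.
        return {"M1": find_mark(image_gray, template_m1),
                "M2": find_mark(image_gray, template_m2)}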
 Returning to the angle transmission error measurement process, the CPU 71 rotates the target joint shaft by a predetermined angle (S150) and images the mark member 61 with the camera 60 (S160). Subsequently, the CPU 71 acquires, from the corresponding encoder, the amount of rotational displacement of the target joint shaft from the previous imaging to the current imaging as the input angle θi(n) (S170). The CPU 71 then calculates the theoretical output angle θo1(n) from the input angle θi(n) by the following equation (1) (S180). In equation (1), "γ" denotes the reduction ratio of the speed reducer provided for the target joint shaft. The theoretical output angle θo1(n) is the output angle that the speed reducer should theoretically produce when the input angle is applied to it as an input.
 θo1(n) = θi(n) / γ   (1)
 Next, the CPU 71 performs the same image processing as in step S140 on the image captured in step S160 to calculate one or both of the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) of the first mark M1 and the second mark M2 (S190). Subsequently, for whichever of the first mark M1 and the second mark M2 is common to the previous and current images, the CPU 71 calculates the amount of rotational displacement about the target joint shaft from the previously calculated imaging position to the currently calculated imaging position, and sets the calculated rotational displacement as the actual output angle θo2(n) (S200). The amount of rotational displacement can easily be obtained from the xy coordinates of the previously calculated imaging position, the xy coordinates of the currently calculated imaging position, and the known xy coordinates of the target joint shaft. The CPU 71 then calculates the angle transmission error ε(n) for the variable n by the following equation (2), which subtracts the actual output angle θo2(n) from the theoretical output angle θo1(n) (S210), and stores the calculated angle transmission error ε(n) in the RAM 73 (S220).
 ε(n) = θo1(n) - θo2(n)   (2)
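 The rotational displacement used as the actual output angle θo2(n) in step S200 can be written compactly as the signed angle swept about the joint-shaft centre between the previous and current imaging positions of the same mark. The following sketch expresses this with atan2; the function name and the assumption that all positions are given in a common xy coordinate frame are mine, not the patent's.

    # A minimal sketch of step S200: the signed angle (in radians) swept about
    # the joint-shaft centre between two imaging positions of the same mark.
    # Names and the common xy coordinate frame are illustrative assumptions.
    import math

    def actual_output_angle(prev_pos, curr_pos, joint_center):
        ax, ay = prev_pos[0] - joint_center[0], prev_pos[1] - joint_center[1]
        bx, by = curr_pos[0] - joint_center[0], curr_pos[1] - joint_center[1]
        # atan2(cross, dot) gives the signed angle between the two radius vectors
        return math.atan2(ax * by - ay * bx, ax * bx + ay * by)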
 After calculating and storing the angle transmission error ε(n) for the variable n, the CPU 71 determines whether the angle of the target joint shaft has reached the measurement end angle (S230). In the present embodiment, the measurement end angle is a predetermined rotation angle within the rotation angle range of the target joint shaft in which the second mark M2 is in the field of view of the camera 60 and the first mark M1 is not. When the CPU 71 determines that the angle of the target joint shaft has not reached the measurement end angle, it increments the variable n by 1 (S240) and returns to step S150. In this way, the CPU 71 rotates the target joint shaft by the predetermined angle at a time from the measurement start angle to the measurement end angle and, for each rotation of the target joint shaft, images the mark (one or both of the first mark M1 and the second mark M2) with the camera 60 and acquires the input angle θi(n) of the target joint shaft from the corresponding encoder. The CPU 71 then obtains the theoretical output angle θo1(n) from the input angle θi(n), obtains the actual output angle θo2(n) of the target joint shaft from the position of the mark recognized in the captured image, and calculates the angle transmission error ε(n) for each rotation of the target joint shaft by taking the difference between the theoretical output angle θo1(n) and the actual output angle θo2(n).
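 Combining equations (1) and (2) with the per-step imaging, the loop from S150 to S240 can be sketched as below. The callables rotate_joint, capture_image, read_encoder_delta, and at_end_angle are placeholders assumed for illustration, the helpers recognize_marks and actual_output_angle are the sketches given above, and angles are assumed to be kept in consistent units (for example, radians on both the input and output sides).

    # A minimal sketch of the measurement loop (S150-S240); the robot/camera
    # callables are assumed placeholders supplied by the caller.
    def measure_transmission_error(rotate_joint, capture_image, read_encoder_delta,
                                   at_end_angle, templates, joint_center, gamma):
        errors = []
        prev_marks = recognize_marks(capture_image(), *templates)
        while not at_end_angle():
            rotate_joint()                                   # S150: step by a fixed angle
            image = capture_image()                          # S160
            theta_i = read_encoder_delta()                   # S170: input angle since last step
            theta_o1 = theta_i / gamma                       # S180: equation (1)
            marks = recognize_marks(image, *templates)       # S190
            theta_o2 = None
            for name in ("M1", "M2"):                        # S200: same mark seen in both frames
                prev, curr = prev_marks.get(name), marks.get(name)
                if prev is not None and curr is not None:
                    theta_o2 = actual_output_angle(prev, curr, joint_center)
                    break
            if theta_o2 is not None:
                errors.append(theta_o1 - theta_o2)           # S210: equation (2)
            prev_marks = marks
        return errors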
 FIGS. 7A to 7E are explanatory views showing how the angle transmission error of the first joint (J1) is measured. As illustrated, when the angle of the first joint shaft 24a is within the first angle range, which includes the measurement start angle, only the first mark M1 is in the field of view B of the camera 60; only the first mark M1 is imaged by the camera 60, and the imaging position (x1(n), y1(n)) is derived from the captured image of the first mark M1 (see FIGS. 7A and 7B). When the angle of the first joint shaft 24a passes beyond the first angle range into the intermediate angle range, both the first mark M1 and the second mark M2 come into the field of view of the camera 60; both marks are imaged by the camera 60, and the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) are derived from the captured image of the first mark M1 and the second mark M2 (see FIG. 7C). Further, when the angle of the first joint shaft 24a reaches the second angle range, which includes the measurement end angle, the first mark M1 moves out of the field of view of the camera 60 and only the second mark M2 is in the field of view; only the second mark M2 is imaged by the camera 60, and the imaging position (x2(n), y2(n)) is derived from the captured image of the second mark M2 (see FIGS. 7D and 7E). As described above, since the first mark M1 and the second mark M2 are provided at positions separated by a predetermined distance, the imaging positions (x1(n), y1(n)) and the imaging positions (x2(n), y2(n)) are plotted on concentric circles of different radii centered on the target joint shaft, as shown in FIG. 8.
 By providing an overlap section in which both the first mark M1 and the second mark M2 are in the field of view of the camera 60 in this way, continuity can be established between the actual output angle θo2(n) calculated from the change in the imaging position (x1(n), y1(n)) of the first mark M1 each time the first joint shaft 24a rotates by the predetermined angle and the actual output angle θo2(n) calculated from the change in the imaging position (x2(n), y2(n)) of the second mark M2. As a result, even when the field of view of the camera 60 is narrow, a continuous angle transmission error ε(n) can be measured over a wider range. That is, a continuous actual output angle θo2(n) can be obtained from the measurement start angle to the measurement end angle (see FIG. 9). Here, the resolution of the image is determined by the number of pixels of the camera 60 and the size of the field of view. In the present embodiment, a camera 60 with a comparatively narrow field of view is used so that the image resolution is high and the marks can be recognized accurately. This makes it possible to measure the angle transmission error with good accuracy while securing a sufficient measurement range.
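 One simple way to exploit the overlap section when post-processing the data is to estimate, from the steps in which both marks were visible, a constant offset that aligns the M2-based cumulative angle series with the M1-based series and then splice the two. The sketch below uses a mean offset; this particular stitching rule and the parallel-list data layout are my own illustrative choices rather than details stated in the patent.

    # A minimal sketch of splicing the M1-based and M2-based cumulative actual
    # output angles using the overlap section; the mean-offset rule and the
    # parallel-list layout are illustrative assumptions.
    def stitch_angle_series(angles_m1, angles_m2, overlap_steps):
        # angles_m1[k] / angles_m2[k]: cumulative actual output angle at step k,
        # or None where that mark was not visible; overlap_steps lists the k
        # at which both marks were visible.
        offsets = [angles_m1[k] - angles_m2[k] for k in overlap_steps]
        offset = sum(offsets) / len(offsets)
        return [a1 if a1 is not None else a2 + offset
                for a1, a2 in zip(angles_m1, angles_m2)]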
 When the CPU 71 determines that the angle of the target joint shaft has reached the measurement end angle, it judges that the measurement for that target joint shaft is complete and determines whether any joint shaft remains for which the angle transmission error has not yet been measured (S250). If an unmeasured joint shaft remains, the CPU 71 returns to step S100, sets that unmeasured joint shaft as a new target joint shaft, and repeats the process. If no unmeasured joint shaft remains, the CPU 71 ends the angle transmission error measurement process.
 FIGS. 10A to 10E are explanatory views showing how the angle transmission error of the second joint (J2) is measured. As illustrated, when the angle of the second joint shaft 26a is within the first angle range, which includes the measurement start angle, only the first mark M1 is in the field of view B of the camera 60; only the first mark M1 is imaged, and the imaging position (x1(n), y1(n)) is derived from the captured image of the first mark M1 (see FIGS. 10A and 10B). When the angle of the second joint shaft 26a passes beyond the first angle range into the intermediate angle range, both the first mark M1 and the second mark M2 come into the field of view of the camera 60; both marks are imaged, and the imaging positions (x1(n), y1(n)) and (x2(n), y2(n)) are derived from the captured image of the first mark M1 and the second mark M2 (see FIG. 10C). Further, when the angle of the second joint shaft 26a reaches the second angle range, which includes the measurement end angle, the first mark M1 moves out of the field of view of the camera 60 and only the second mark M2 is in the field of view; only the second mark M2 is imaged, and the imaging position (x2(n), y2(n)) is derived from the captured image of the second mark M2 (see FIGS. 10D and 10E). In this way, the angle transmission error can be measured individually for each of the first joint shaft 24a and the second joint shaft 26a.
 Here, the correspondence between the main elements of the embodiment and the main elements of the present disclosure described in the claims will be explained. That is, the motors 32 and 42 correspond to the motor, the speed reducers 34 and 44 correspond to the speed reducer, the first arm 24 and the second arm 26 correspond to the arm, the work robot 10 corresponds to the horizontal joint robot, the camera 60 corresponds to the camera, the encoders 36 and 46 correspond to the encoder, the first mark M1 corresponds to the first mark, the second mark M2 corresponds to the second mark, the mark member 61 corresponds to the mark member, and the control device 70 corresponds to the control device. Further, the first joint shaft 24a and the second joint shaft 26a correspond to the plurality of joints.
 It should be noted that the present disclosure is not limited to the above-described embodiment and can, of course, be implemented in various forms within the technical scope of the present disclosure.
 For example, in the above-described embodiment, the speed reducers 34 and 44 are configured as strain wave gearing speed reducers. However, the speed reducers 34 and 44 are not limited to this and may be configured as other gear speed reducers such as planetary gear speed reducers. Even in this case, an angle transmission error arises in the speed reducer due to tooth pitch errors and the like, so measuring the angle transmission error can improve the accuracy of the robot.
 In the above-described embodiment, the mark member 61 has two marks (the first mark M1 and the second mark M2). However, the mark member 61 may have three or more marks. For example, when the mark member has a first mark M1, a second mark M2, and a third mark M3 and the arm is moved from the first angle range through the second angle range to a third angle range, the third mark M3 may be arranged so that it is in the field of view of the camera 60 when the angle of the arm is in the third angle range, and so that both the second and third marks M2 and M3 are in the field of view of the camera 60 when the angle of the arm is in an intermediate angle range between the second and third angle ranges. This makes it possible to further widen the measurement range of the angle transmission error.
 In the above-described embodiment, the work robot 10 has two joint shafts (the first joint shaft 24a and the second joint shaft 26a). However, the number of joint shafts may be one, or may be three or more.
 As described above, the angle transmission error measuring device of the present disclosure is used with a horizontal joint robot having an arm that can be driven horizontally by power transmitted from a motor via a speed reducer, and measures the angle transmission error occurring between the input side and the output side of the speed reducer. The gist of the device is that it includes: a camera attached to the arm; an encoder that detects the angle of the arm; a mark member provided with a first mark and a second mark such that the first mark is in the field of view of the camera when the angle of the arm is in a first angle range, the second mark is in the field of view of the camera when the angle of the arm is in a second angle range, and both the first and second marks are in the field of view of the camera when the angle of the arm is in an intermediate angle range between the first and second angle ranges; and a control device that moves the angle of the arm little by little over the first angle range, the intermediate angle range, and the second angle range, images the mark member with the camera and acquires the detected angle of the arm from the encoder each time the arm is moved, recognizes the first mark in images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, recognizes the second mark in images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm, and obtains the angle transmission error based on the actual angle and the detected angle for each movement of the arm.
 This angle transmission error measuring device of the present disclosure includes the camera attached to the arm, the encoder that detects the angle of the arm, the mark member, and the control device. The first and second marks are provided on the mark member such that the first mark is in the field of view of the camera when the angle of the arm is in the first angle range, the second mark is in the field of view of the camera when the angle of the arm is in the second angle range, and both marks are in the field of view of the camera when the angle of the arm is in the intermediate angle range between the first and second angle ranges. The control device moves the angle of the arm little by little over the first angle range, the intermediate angle range, and the second angle range and, each time the arm is moved, images the mark member with the camera and acquires the detected angle of the arm from the encoder. The control device then performs image processing that recognizes the first mark in images captured in the first angle range and the intermediate angle range to obtain the actual angle of the arm, and recognizes the second mark in images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm. The control device obtains the angle transmission error based on the actual angle and the detected angle for each movement of the arm. Since the mark member can be imaged by the camera attached to the arm from the first angle range through the intermediate angle range to the second angle range, processing the captured images makes it possible to recognize the angle of the arm over a wider range. In addition, since the image resolution can easily be increased by performing the image processing on images captured by a camera with a comparatively narrow field of view, the angle of the arm can be measured with high accuracy. As a result, an angle transmission error measuring device is obtained that can measure the angle transmission error occurring between the input side and the output side of the speed reducer with good accuracy over a wider range.
 In such an angle transmission error measuring device of the present disclosure, the horizontal joint robot may have a plurality of joints each of which can be driven horizontally by power transmitted from a motor via a speed reducer, and the control device may operate each of the plurality of joints individually to obtain the angle transmission error for each joint. In this way, the angle transmission error measuring device of the present disclosure can likewise be applied to a horizontal articulated robot.
 Further, in the angle transmission error measuring device of the present disclosure, the speed reducer may be a strain wave gearing speed reducer.
 The present disclosure is applicable to, for example, the manufacturing industry of angle transmission error measuring devices.
 10 work robot, 12 support base, 20 robot body, 22 base, 24 first arm, 24a first joint shaft, 26 second arm, 26a second joint shaft, 28 tip shaft, 30 first arm drive unit, 32 motor, 34 speed reducer, 36 encoder, 40 second arm drive unit, 42 motor, 44 speed reducer, 46 encoder, 50 tip shaft drive unit, 52 motor, 56 encoder, 60 camera, 61 mark member, A work area, M1 first mark, M2 second mark, 70 control device, 71 CPU, 72 ROM, 73 RAM.

Claims (3)

  1.  An angle transmission error measuring device used with a horizontal joint robot having an arm that can be driven horizontally by power transmitted from a motor via a speed reducer, the angle transmission error measuring device measuring an angle transmission error occurring between an input side and an output side of the speed reducer and comprising:
     a camera attached to the arm;
     an encoder that detects an angle of the arm;
     a mark member provided with a first mark and a second mark such that the first mark enters a field of view of the camera when the angle of the arm is in a first angle range, the second mark enters the field of view of the camera when the angle of the arm is in a second angle range, and both the first and second marks enter the field of view of the camera when the angle of the arm is in an intermediate angle range between the first and second angle ranges; and
     a control device that moves the angle of the arm little by little over the first angle range, the intermediate angle range, and the second angle range, images the mark member with the camera and acquires the detected angle of the arm from the encoder each time the arm is moved, recognizes the first mark based on images captured in the first angle range and the intermediate angle range to obtain an actual angle of the arm, recognizes the second mark based on images captured in the second angle range and the intermediate angle range to obtain the actual angle of the arm, and obtains the angle transmission error based on the actual angle and the detected angle for each movement of the arm.
  2.  The angle transmission error measuring device according to claim 1, wherein
     the horizontal joint robot has a plurality of joints each of which can be driven horizontally by power transmitted from a motor via a speed reducer, and
     the control device operates each of the plurality of joints individually to obtain the angle transmission error for each joint.
  3.  The angle transmission error measuring device according to claim 1 or 2, wherein
     the speed reducer is a strain wave gearing speed reducer.
PCT/JP2019/037339 2019-09-24 2019-09-24 Angle transfer error measuring device WO2021059341A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/037339 WO2021059341A1 (en) 2019-09-24 2019-09-24 Angle transfer error measuring device
JP2021548010A JP7152614B2 (en) 2019-09-24 2019-09-24 Angular transmission error measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/037339 WO2021059341A1 (en) 2019-09-24 2019-09-24 Angle transfer error measuring device

Publications (1)

Publication Number Publication Date
WO2021059341A1 true WO2021059341A1 (en) 2021-04-01

Family

ID=75165207

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/037339 WO2021059341A1 (en) 2019-09-24 2019-09-24 Angle transfer error measuring device

Country Status (2)

Country Link
JP (1) JP7152614B2 (en)
WO (1) WO2021059341A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114043527A (en) * 2021-11-22 2022-02-15 成都飞机工业(集团)有限责任公司 Single joint positioning precision calibration method of joint robot

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7441716B2 (en) 2020-04-27 2024-03-01 日立Geニュークリア・エナジー株式会社 Work system and work control device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1080882A (en) * 1996-09-06 1998-03-31 Fujitsu Ltd Coordinate transformation parameter measuring device for robot
JP2010120110A (en) * 2008-11-19 2010-06-03 Daihen Corp Method of calculating transmission error correcting amount of speed reducer, and robot control device
JP2018120306A (en) * 2017-01-23 2018-08-02 セイコーエプソン株式会社 Encoder, robot and printer
JP2018202608A (en) * 2018-09-28 2018-12-27 キヤノン株式会社 Robot device, control method of robot device, program, and recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0721505B2 (en) * 1987-12-15 1995-03-08 帝人製機株式会社 Angle error measuring device
JP4000685B2 (en) 1998-10-30 2007-10-31 ソニー株式会社 Rotation angle measuring device
JP4513435B2 (en) 2003-07-16 2010-07-28 東京エレクトロン株式会社 Transport device
JP2019143979A (en) 2018-02-15 2019-08-29 セイコーエプソン株式会社 Encoder unit, angle detection method, and robot

Also Published As

Publication number Publication date
JP7152614B2 (en) 2022-10-12
JPWO2021059341A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
JP3961408B2 (en) Assembly method and apparatus
US9517560B2 (en) Robot system and calibration method of the robot system
KR102091917B1 (en) Gear mechanism assembly and assembly method
JP4112538B2 (en) Robot kinematic calibration apparatus and method
JP2016168651A (en) Robot controlling method, robot apparatus, program, and recording medium
WO2021059341A1 (en) Angle transfer error measuring device
US20180161984A1 (en) Control device, robot, and robot system
JP6816495B2 (en) Robot deflection correction method, robot control device
CN111745623B (en) Five-degree-of-freedom hybrid robot tail end pose error detection and compensation system and method
JP2004195621A (en) Three-dimensional measuring device
JP2018094648A (en) Control device, robot, and robot system
JP2020078859A (en) Robot device
US20240058949A1 (en) Robot, drive unit for a robot and positioning method
US20180161983A1 (en) Control device, robot, and robot system
EP1886771B1 (en) Rotation center point calculating method, rotation axis calculating method, program creating method, operation method, and robot apparatus
JP2014240106A (en) Robot, robot control device, and driving method of robot
JP6568172B2 (en) ROBOT CONTROL DEVICE, MEASUREMENT SYSTEM, AND CALIBRATION METHOD FOR CALIBRATION
US9270209B2 (en) Servo apparatus, and controlling method of servo apparatus
JPH07100781A (en) Articulated robot
JP2010120110A (en) Method of calculating transmission error correcting amount of speed reducer, and robot control device
JP7267688B2 (en) Robot system, robot arm control method, article manufacturing method, driving device, and driving device control method
JPH07100782A (en) Method and device for estimating life of industrial robot
US11759955B2 (en) Calibration method
KR101681076B1 (en) Method of correcting tracking error of scara robot
CN109807935B (en) Industrial robot arm strain detection device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19947004

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021548010

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19947004

Country of ref document: EP

Kind code of ref document: A1