CN112584984A - Auxiliary metrology position coordinate determination system including alignment sensors for robots


Info

Publication number
CN112584984A
Authority
CN
China
Prior art keywords
scale
imaging
alignment
end tool
robot
Legal status
Pending
Application number
CN201980053434.9A
Other languages
Chinese (zh)
Inventor
K. Atherton
M. Nahum
C. E. Emtman
Current Assignee
Mitutoyo Corp
Original Assignee
Mitutoyo Corp
Priority claimed from US 16/104,033 (US10751883B2)
Priority claimed from US 16/146,640 (US10871366B2)
Application filed by Mitutoyo Corp
Publication of CN112584984A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A robot is used in conjunction with an auxiliary metrology position coordinate determination system. The robot's "robot accuracy" (e.g., for controlling and sensing the end tool position of an end tool mounted near the distal end of its movable arm structure) is based on position sensors included in the robot. The auxiliary system includes an imaging structure, an XY scale, an alignment sensor for sensing alignment/misalignment between them, an image triggering portion, and a processing portion. One of the XY scale or the imaging structure is coupled to the movable arm structure, and the other is coupled to a stationary element (e.g., a frame above the robot). The imaging structure acquires images of the XY scale at a known alignment/misalignment, and these images are used to determine metrology position coordinates representing the end tool position with a level of accuracy better than the robot accuracy.

Description

Auxiliary metrology position coordinate determination system including alignment sensors for robots
Cross Reference to Related Applications
This application is a continuation-in-part of U.S. patent application No. 16/146,640, filed September 28, 2018, which is a continuation-in-part of U.S. patent application No. 16/104,033, filed August 16, 2018, the disclosures of which are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates to robotic systems, and more particularly, to a system for determining end tool position coordinates of a robot.
Background
Robotic systems are increasingly being used in manufacturing and other processes. Various types of robots that may be used include articulated robots, Selectively Compliant Articulated Robot Arm (SCARA) robots, Cartesian robots, cylindrical robots, spherical robots, and the like. As an example of components that may be included in a robot, a SCARA robot system (which may be, for example, a type of articulated robot system) may generally have a base, with a first arm rotatably connected to the base and a second arm rotatably connected to an end of the first arm. In various configurations, an end tool may be connected to an end of the second arm (e.g., for performing certain work and/or inspection operations). Such a system may include position sensors (e.g., rotary encoders) for determining/controlling the positioning of the arms and the positioning of the corresponding end tool. In various embodiments, such a system may have a positioning accuracy of about 100 microns, due to limitations imposed by certain factors (e.g., rotary encoder performance in combination with the mechanical stability of the robotic system, etc.).
U.S. patent No. 4,725,965, which is incorporated herein by reference in its entirety, discloses certain calibration techniques for improving the accuracy of SCARA systems. As described in the '965 patent, a technique is provided for calibrating a SCARA-type robot that includes a first rotatable arm and a second rotatable arm carrying an end tool. The calibration technique is related to the fact that the SCARA robot can be controlled using a kinematic model which, when accurate, allows the arms to be placed in first and second angular configurations in which the end tool carried by the second arm remains in the same position. To calibrate the kinematic model, the arms are placed in the first configuration to position the end tool above a fixed datum point. The arms are then placed in the second angular configuration to nominally position the end tool again in alignment with the datum point. When the arms are switched from the first angular configuration to the second angular configuration, an error in the kinematic model is calculated from the shift of the end tool position relative to the datum point. The kinematic model is then compensated for the calculated error. These steps are repeated until the error approaches zero, at which point the kinematic model of the SCARA robot is considered to be calibrated.
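The iterative scheme just described can be summarized compactly. The sketch below (in Python) is only an illustration of the general loop attributed above to the '965 patent; the helper callables place_arms, measure_offset_from_datum, and compensate_model are hypothetical placeholders, and the tolerance and iteration limit are arbitrary assumptions.

```python
# Illustrative sketch only: the iterative calibration loop described for the '965
# technique. All helper callables are hypothetical placeholders, not part of the patent.

def calibrate_kinematic_model(model, place_arms, measure_offset_from_datum,
                              compensate_model, tolerance=1e-6, max_iterations=50):
    """Repeat the first/second-configuration moves until the apparent end tool
    shift relative to the fixed datum point approaches zero."""
    for _ in range(max_iterations):
        place_arms(model, "first")               # end tool nominally over the datum point
        place_arms(model, "second")              # nominally the same end tool position
        dx, dy = measure_offset_from_datum()     # e.g., offset seen by the camera
        if (dx * dx + dy * dy) ** 0.5 < tolerance:
            break                                # model is considered calibrated
        model = compensate_model(model, dx, dy)  # fold the computed error into the model
    return model
```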
As further described in the '965 patent, the calibration techniques may include the use of certain cameras. For example, in one embodiment, the datum point may be the center of the viewing area of a stationary television camera (i.e., on the ground below the end tool), and when the arms are switched from the first configuration to the second configuration, the output signal of the camera may be processed to determine the offset of the end tool position from the center of the viewing area of the camera. In another embodiment, the second arm may carry a camera, and the technique may begin with placing the arms in a first angular configuration in which a predetermined interior angle is formed between the arms so as to center the camera carried by the second arm directly over a fixed datum point. The arms are then placed in a second angular configuration in which an interior angle equal to the predetermined interior angle is formed between the arms, so as to nominally re-center the camera above the datum point. Then, when the arms are switched from the first angular configuration to the second angular configuration, the output signal of the camera is processed to determine the shift of the position of the datum point as seen by the camera. The error in the known position of the camera is then determined from the offset of the datum point position seen by the camera. These steps are then repeated as part of the calibration process until the error approaches zero.
While techniques such as those described in the '965 patent may be used to calibrate a robotic system, in certain applications utilizing such techniques may be less than ideal (e.g., they may require a significant amount of time and/or may not provide a desired level of accuracy for all possible orientations of the robot during certain operations, etc.). A robotic system that provides improvements with respect to these issues (e.g., for increasing the reliability, repeatability, speed, etc. of position determination during workpiece measurement and other processes) would be desirable.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
An auxiliary metrology position coordinate determination system is provided for use in conjunction with a robot that is part of a robotic system. The robot (e.g., an articulated robot, a SCARA robot, a Cartesian robot, a cylindrical robot, a spherical robot, etc.) includes a movable arm structure and a motion control system. The movable arm structure includes an end tool mounting structure located near a distal end of the movable arm structure. The robot is configured to move the movable arm structure so as to move at least a portion of an end tool mounted to the end tool mounting structure in at least two dimensions within an end tool workspace. The motion control system is configured to control a measurement point position or an end tool position of the end tool, based at least in part on sensing and controlling the position of the movable arm structure using at least one position sensor (e.g., a rotary encoder, a linear encoder, etc.) included in the robot, with a level of accuracy defined as the robot accuracy.
The auxiliary metrology position coordinate determination system includes a first imaging structure, an XY scale, an operational alignment subsystem having at least one alignment sensor, an image triggering portion, and a metrology position coordinate processing portion. The first imaging structure includes a first camera and has an optical axis. In various embodiments, the operational alignment subsystem may further include an operational alignment actuator structure, as described in more detail below. The XY scale includes a nominally planar substrate and a plurality of respective imageable features distributed over the substrate, where the respective imageable features are located at respective known XY scale coordinates on the XY scale. A scale plane may be defined to nominally coincide with the planar substrate of the XY scale, and a direction perpendicular to the scale plane may be defined as the scale imaging axis direction. The alignment sensor is located proximate to the first camera and is mounted in a rigid configuration relative to the first camera, and the alignment sensor is configured to provide an alignment signal indicative of the scale imaging axis direction. The image triggering portion is configured to input at least one input signal related to the measurement point position of the end tool or the end tool position, determine the timing of a first imaging trigger signal based on the at least one input signal, and output the first imaging trigger signal to the first imaging structure. The first imaging structure is configured to acquire a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal. The metrology position coordinate processing portion is configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale and the associated respective known XY scale coordinate position. In various embodiments, the XY scale may be an incremental scale or an absolute scale.
In various embodiments in which the operational alignment subsystem includes an operational alignment actuator structure, the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale or the first imaging structure is coupled to, or is part of, the operational alignment actuator structure. The other of the XY scale or the first imaging structure is coupled to a fixed element near the robot. The fixed one of the XY scale or the first imaging structure defines a first reference position.
In such an implementation, the robotic system is configured to operate the operational alignment subsystem and the operational alignment actuator structure to adjust the alignment of the movable one of the XY scale or the first imaging structure, based on the alignment signal provided by the alignment sensor, so as to provide an operating configuration of the auxiliary metrology position coordinate determination system in which the XY scale and the first imaging structure are arranged such that the optical axis of the first imaging structure is parallel to the scale imaging axis direction and the scale plane is within the focus range of the first imaging structure along the scale imaging axis direction.
In such an embodiment, the auxiliary metrology position coordinate determination system is configured such that, when the movable one and the fixed one of the XY scale or the first imaging structure are arranged in the operating configuration and the movable arm structure is positioned with the XY scale in the field of view of the first imaging structure, the metrology position coordinate processing portion is operable to determine metrology position coordinates that give the relative position between the movable one of the XY scale or the first imaging structure and the first reference position, with a level of accuracy better than the robot accuracy, based on determining the image position of the identified at least one respective imageable feature in the acquired image. The determined metrology position coordinates give the measurement point position of the end tool or the end tool position at the image acquisition time with a level of accuracy better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
In such embodiments, the operational alignment actuator structure may comprise at least a first rotary element that rotates about a first rotation axis, which is nominally parallel to the scale plane if the XY scale is the movable one, and nominally perpendicular to the optical axis if the first imaging structure is the movable one. The operational alignment actuator structure may further comprise at least a second rotary element that rotates about a second rotation axis that is nominally orthogonal to the first rotation axis. According to the convention used herein, two axes oriented such that the dot product of their direction vectors is zero will be understood to be orthogonal, regardless of whether they intersect. In some such embodiments, the first and second rotary elements may be included in the movable arm structure. In other such embodiments, the first and second rotary elements may be included in a discrete operational alignment actuator structure located near the distal end of the movable arm structure.
In various embodiments in which the operational alignment subsystem does not include an operational alignment actuator structure, the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale or the first imaging structure is coupled to the movable arm structure. The other of the XY scale or the first imaging structure is coupled to a fixed element near the robot. The fixed one of the XY scale or the first imaging structure defines a first reference position.
In such embodiments, the robotic system is configured to provide at least a nominal operating configuration of the auxiliary metrology position coordinate determination system, in which at least one of the XY scale or the first imaging structure is arranged such that the optical axis of the first imaging structure is nominally parallel to the scale imaging axis direction (e.g., based on the robot accuracy) and such that the scale plane is within the focus range of the first imaging structure along the scale imaging axis direction. The robotic system is further configured to operate the operational alignment subsystem to determine a residual misalignment between the optical axis and the scale imaging axis direction as indicated by the alignment signal provided by the alignment sensor (e.g., with an accuracy better than the robot accuracy).
In such embodiments, the auxiliary metrology position coordinate determination system may be configured such that, when the movable one and the fixed one of the XY scale or the first imaging structure are arranged in the nominal operating configuration and the movable arm structure is positioned with the XY scale in the field of view of the first imaging structure, the system is operable to acquire a digital image of the XY scale at the image acquisition time and to determine a corresponding residual misalignment. Based on the image position of the identified at least one respective imageable feature in the acquired image and on the corresponding residual misalignment, the auxiliary metrology position coordinate determination system may then determine a first set of metrology position coordinates indicative of the relative position between the movable one of the XY scale or the first imaging structure and the first reference position, with a level of accuracy better than the robot accuracy, at least for a vector component of the first set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction. The metrology position coordinate processing portion may further be configured to determine, based on the first set of metrology position coordinates and the corresponding residual misalignment, a second set of metrology position coordinates that gives the measurement point position of the end tool or the end tool position at the image acquisition time, with a level of accuracy better than the robot accuracy, at least for a vector component of the second set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
In various embodiments, the alignment sensor may be configured to output an alignment beam to the XY scale and receive a reflected alignment beam therefrom on a position sensitive detector of the alignment sensor and provide an alignment signal based on at least one output from the position sensitive detector.
In various embodiments, the movable one of the XY scale or the first imaging structure is configured to be in a rigid relationship with at least one of the end tool mounting structure or an end tool mounted to the end tool mounting structure.
In various embodiments, the auxiliary metrology position coordinate determination system is configured to determine the metrology position coordinates of the measurement point position of the end tool or the end tool position at the image acquisition time based on the determined metrology position coordinates that are indicative of the relative position of the movable one of the XY scale or the first imaging structure, and on a known coordinate position offset between the measurement point position of the end tool or the end tool position and the movable one of the XY scale or the first imaging structure.
In various embodiments, the robot is configured to move the movable one of the end tool and the XY scale or the first imaging structure in a plane parallel to the scale plane while the auxiliary metrology position coordinate determination system is in the operating configuration.
In various embodiments, the robotic system may operate in either a robot position coordinate mode or an auxiliary metrology position coordinate mode. The robot position coordinate mode may correspond to an independent and/or standard mode of operation of the robot (e.g., a mode in which the robot operates independently, such as when the auxiliary metrology position coordinate determination system is not active or is not provided). In the robot position coordinate mode, the robot movements and the corresponding measurement point position or end tool position of the end tool are controlled and determined with the level of accuracy defined as the robot accuracy (i.e., using the position sensors included in the robot). Conversely, in the auxiliary metrology position coordinate mode, metrology position coordinates may be determined that give the measurement point position of the end tool or the end tool position at the image acquisition time with a level of accuracy better than the robot accuracy (e.g., better than the accuracy of the position sensors included in the robot), at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction. In various embodiments, the determined position information (e.g., the determined metrology position coordinates giving the relative position, the determined metrology position coordinates of the end tool position or of the measurement point position of the end tool, and/or other related determined position information) may then be used to perform a specified function (e.g., as part of a workpiece measurement, positioning control of the robot, etc.).
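As a purely illustrative summary of the two modes, the following sketch dispatches between an encoder-based position and a scale-image-based position. The enumeration and function names are assumptions made for this example and are not identifiers from the disclosure.

```python
# Illustrative only: mode dispatch between the two position-reporting modes described above.

from enum import Enum, auto

class PositionMode(Enum):
    ROBOT_POSITION_COORDINATE = auto()       # standalone robot, encoder-based ("robot accuracy")
    AUXILIARY_METROLOGY_COORDINATE = auto()  # scale-image-based, better than robot accuracy

def end_tool_xy(mode, robot_encoder_xy, metrology_xy):
    """Return the (x, y) end tool position according to the active mode."""
    if mode is PositionMode.ROBOT_POSITION_COORDINATE:
        return robot_encoder_xy    # determined with robot accuracy
    return metrology_xy            # determined from an XY scale image
```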
Drawings
FIG. 1 is a block diagram of a first exemplary embodiment of a robotic system including an articulated robot and an auxiliary metrology position coordinate determination system including a first exemplary embodiment of an operational alignment subsystem in accordance with the principles disclosed herein;
FIG. 2A is an isometric view of a second exemplary embodiment of a robotic system similar to that of FIG. 1, with a first imaging structure and alignment sensors of an operational alignment subsystem coupled to a fixed element;
FIG. 2B is an isometric view of the robotic system of FIG. 2A, showing certain errors that may be indicated by the alignment sensors;
FIG. 3A is an isometric view of a third exemplary embodiment of a robotic system with an XY scale coupled to a stationary element and a first imaging structure and alignment sensor of an operational alignment subsystem coupled to a moving element;
FIG. 3B is an isometric view of the robotic system of FIG. 3A, showing certain errors that may be indicated by the alignment sensors;
FIG. 4 is an isometric view of an exemplary embodiment of an incremental XY scale;
FIG. 5 is an isometric view of an exemplary embodiment of an absolute XY scale;
FIG. 6 is a flow diagram showing a first exemplary embodiment of a routine for operating a robotic system including a robot and an auxiliary metrology position coordinate determination system disclosed herein;
FIG. 7 is a flow diagram showing a first exemplary embodiment of a routine for determining an end tool position, in which a robot position sensor may be used during a first portion of a motion timing and a relative position determined by the auxiliary metrology position coordinate determination system may be used during a second portion of the motion timing;
FIG. 8 is a block diagram of a fourth exemplary embodiment of a robotic system including an articulated robot and an auxiliary metrology position coordinate determination system including a second exemplary embodiment of an operational alignment subsystem in accordance with the principles disclosed herein;
FIG. 9 is an isometric view of a portion of a fifth exemplary embodiment of a robotic system similar to that of FIG. 8, including an articulated robot, in which a first imaging structure and an alignment sensor of an operational alignment subsystem are coupled to a stationary element, the alignment sensor controlling the operational alignment of an XY scale located on a moving element;
FIG. 10 is an isometric view of a portion of a sixth exemplary embodiment of a robotic system including an articulated robot, with a first imaging structure and an alignment sensor of an operational alignment subsystem coupled to a moving element, the alignment sensor controlling operational alignment of the first imaging structure relative to an XY scale located on a stationary element;
FIG. 11 is a flow diagram showing a second exemplary embodiment of a routine for operating a robotic system including a robot and an auxiliary metrology position coordinate determination system as disclosed herein;
FIG. 12 is an isometric view showing a portion of an embodiment of a robotic system similar to that of FIGS. 2A and 2B, in which an XY scale and the alignment sensor and alignment actuator structure of an operational alignment subsystem are coupled to the moving element, and the alignment sensor and alignment actuator are used to control the operational alignment of the XY scale relative to an imaging structure located on the stationary element;
FIG. 13 is an isometric view showing a portion of an embodiment of a robotic system similar to that of FIGS. 2A and 2B, in which an imaging structure and the alignment sensor and alignment actuator structure of an operational alignment subsystem are coupled to the moving element, and the alignment sensor and alignment actuator are used to control the operational alignment of the imaging structure relative to an XY scale located on the stationary element; and
FIG. 14 is a diagram of a first exemplary structure of an alignment sensor that may be used in various embodiments of an operating alignment subsystem, according to principles disclosed herein.
Detailed Description
Fig. 1 is a block diagram of a first exemplary embodiment of a robotic system 100, the robotic system 100 including an articulated robot 110 and an auxiliary metrology position coordinate determination system 150. The auxiliary metrology position coordinate determination system 150 is shown as including a first exemplary embodiment of an operational alignment subsystem OAS that includes at least one alignment sensor ASen connected to operational alignment subsystem processing circuitry/routines 190, as described in more detail below.
The articulated robot 110 includes first and second arm portions 120 and 130, first and second rotary joints 125 and 135, position sensors SEN1 and SEN2, an end tool configuration ETCN, and a robot motion control and processing system 140. The first arm portion 120 is mounted to the first rotary joint 125 at a proximal end PE1 of the first arm portion 120. The first rotary joint 125 (e.g., at the upper end of a supporting base BSE) has a rotation axis RA1 aligned along the z-axis direction, such that the first arm portion 120 moves about the first rotary joint 125 in an x-y plane perpendicular to the z axis. The second rotary joint 135 is located at the distal end DE1 of the first arm portion 120. The rotation axis RA2 of the second rotary joint 135 is nominally aligned along the z-axis direction. The second arm portion 130 is mounted to the second rotary joint 135 at a proximal end PE2 of the second arm portion 130, such that the second arm portion 130 moves about the second rotary joint 135 in an x-y plane nominally perpendicular to the z axis. In various embodiments, the position sensors SEN1 and SEN2 (e.g., rotary encoders) may be used to determine the angular positions (i.e., in the x-y plane) of the first and second arm portions 120 and 130 about the first and second rotary joints 125 and 135, respectively.
In various embodiments, the end tool configuration ETCN may include a Z-motion mechanism ZMM, a Z-arm ZARM, a position sensor SEN3, and an end tool coupling portion ETCP that couples to the end tool ETL. In various embodiments, the end tool ETL may include an end tool sensing portion ETSN and an end tool stylus ETST with a measurement point MP (e.g., for contacting a surface of a workpiece WP). The Z-motion mechanism ZMM is located near the distal end DE2 of the second arm portion 130. The Z-motion mechanism ZMM (e.g., a linear actuator) is configured to move the Z-arm ZARM up and down in the z-axis direction. In some embodiments, the Z-arm ZARM may also be configured to rotate about an axis parallel to the z-axis direction. In any case, the end tool ETL is coupled at the end tool coupling portion ETCP and has a corresponding end tool position ETP with corresponding coordinates (e.g., x, y, and z coordinates). In various embodiments, the end tool position ETP may correspond to, or be proximate to, the distal end DE3 of the Z-arm ZARM (e.g., at or near the end tool coupling portion ETCP).
The motion control system 140 is configured to control the end tool position ETP of the end tool ETL with a level of accuracy defined as the robot accuracy. More specifically, the motion control system 140 is generally configured to control the x and y coordinates of the end tool position ETP with the robot accuracy, based at least in part on sensing and controlling the angular positions (i.e., in the x-y plane) of the first and second arm portions 120 and 130 about the first and second rotary joints 125 and 135, respectively, using the position sensors SEN1 and SEN2. In various embodiments, the motion control and processing system 140 may include first and second rotary joint control and sensing portions 141 and 142, which may receive signals from the position sensors SEN1 and SEN2, respectively, for sensing the angular positions of the first and second arm portions 120 and 130, and/or may provide control signals (e.g., to motors, etc.) in the first and second rotary joints 125 and 135 for rotating the first and second arm portions 120 and 130.
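As a concrete illustration of how the x and y coordinates of the end tool position follow from the sensed angular positions, the sketch below evaluates standard planar two-link forward kinematics for the two rotary joints of fig. 1. The arm lengths and angle conventions are assumptions for this example; the actual geometry, offsets, and calibration terms of a real robot are not captured here.

```python
import math

def end_tool_xy_from_joint_angles(a1_rad, a2_rad, len_arm1, len_arm2):
    """Planar forward kinematics for a two-rotary-joint arm (illustrative assumptions).
    a1_rad: rotation of the first arm about RA1, measured from the x axis (assumed).
    a2_rad: rotation of the second arm about RA2, measured relative to the first arm (assumed).
    Returns the nominal (x, y) of the end tool axis, i.e., to within the robot accuracy."""
    x = len_arm1 * math.cos(a1_rad) + len_arm2 * math.cos(a1_rad + a2_rad)
    y = len_arm1 * math.sin(a1_rad) + len_arm2 * math.sin(a1_rad + a2_rad)
    return x, y
```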
Further, the motion control system 140 is generally configured to control the z coordinate of the end tool position ETP with the robot accuracy, based at least in part on sensing and controlling the linear position (i.e., along the z axis) of the Z-arm ZARM using the Z-motion mechanism ZMM and the position sensor SEN3. In various embodiments, the motion control and processing system 140 may include a Z-motion mechanism control and sensing portion 143, which may receive a signal from the position sensor SEN3 for sensing the linear position of the Z-arm ZARM, and/or may provide a control signal to the Z-motion mechanism ZMM (e.g., a linear actuator) to control the z position of the Z-arm ZARM.
The motion control and processing system 140 may also receive signals from the end tool sensing portion ETSN. In various embodiments, the end tool sensing portion ETSN may include circuitry and/or structure associated with the operation of the end tool ETL for sensing a workpiece WP. As will be described in greater detail below, in various embodiments the end tool ETL (e.g., a contact probe, a scanning probe, a camera, etc.) may be used to contact or otherwise sense surface locations/positions/points on the workpiece WP, for which various corresponding signals may be received, determined, and/or processed by the end tool sensing portion ETSN, which may provide corresponding signals to the motion control and processing system 140. In various embodiments, the motion control and processing system 140 may include an end tool control and sensing portion 144, which may provide control signals to and/or receive sensing signals from the end tool sensing portion ETSN. In various embodiments, the end tool control and sensing portion 144 and the end tool sensing portion ETSN may be merged and/or indistinguishable. In various embodiments, the first and second rotary joint control and sensing portions 141 and 142, the Z-motion mechanism control and sensing portion 143, and the end tool control and sensing portion 144 may all provide outputs to and/or receive control signals from a robot position processing portion 145, which may be part of the robot motion control and processing system 140 that controls and/or determines the overall positioning of the articulated robot 110 and the corresponding end tool position ETP.
In various embodiments, the auxiliary metrology position coordinate determination system 150 can be included in the articulated robot 110 or otherwise added to the articulated robot 110 (e.g., as part of a retrofit structure for adding to an existing articulated robot 110, etc.). In general, the auxiliary metrology position coordinate determination system 150 may be used to provide an improved level of accuracy for the determination of the end tool position ETP. More specifically, as will be described in greater detail below, the auxiliary metrology position coordinate determination system 150 may be used to determine a relative position that gives the metrology position coordinates of the end tool position ETP with a level of accuracy that is better than the robot accuracy, at least for x and y metrology position coordinates in an x-y plane perpendicular to the z-axis.
As shown in fig. 1, the auxiliary metrology position coordinate determination system 150 includes a first imaging structure 160, an XY scale 170, an image trigger portion 181, and a metrology position coordinate processing portion 185. The first imaging structure 160 is coupled to a stationary element STE. In various embodiments, the stationary element STE may comprise a frame arranged over at least part of the operable workspace OPV of the articulated robot 110, wherein the first imaging structure 160 is fixed to the frame above a portion of the operable workspace OPV. In various embodiments, the stationary element STE may include one or more structural support elements SSP (e.g., extending from a floor, a ceiling, etc.) for maintaining the stationary element STE in a fixed position (e.g., having a fixed position and/or orientation) relative to the articulated robot 110.
The first imaging structure 160 includes a first camera CAM1 and has an optical axis OA1 aligned parallel to the z axis (e.g., nominally aligned based on the robot accuracy, or better aligned based on the alignment sensor signals). The first imaging structure 160 has an effective focus range REFP along its optical axis OA1. In various embodiments, the range REFP may be bounded by first and second effective focus positions EFP1 and EFP2, which will be described in more detail below. At a given time, the first imaging structure 160 has an effective focus position EFP that falls within the range REFP. In embodiments using a variable focal length (VFL) lens, the range REFP may correspond to the focus range of the VFL lens.
In various embodiments, the VFL lens used may be a tunable acoustic gradient index of refraction (TAG) lens. With respect to the general operation of such a TAG lens, in various embodiments a lens controller (e.g., as included in the first imaging structure control and image processing portion 180) may rapidly adjust or modulate the optical power of the TAG lens periodically, to achieve a high-speed periodic modulation of 250 kHz, 70 kHz, 30 kHz, or the like (i.e., at the TAG lens resonant frequency). In such a configuration, the effective focus position EFP of the first imaging structure 160 may be moved rapidly within the range REFP (e.g., an autofocus search range). The effective focus position EFP1 (or EFPmax) may correspond to the maximum optical power of the TAG lens, and the effective focus position EFP2 (or EFPmin) may correspond to the maximum negative optical power of the TAG lens. In various embodiments, the middle of the range REFP may be designated EFPnom, and may correspond to zero optical power of the TAG lens.
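The periodic focus sweep of such a TAG lens can be modeled, to first order, as a sinusoidal oscillation of the effective focus position between EFPmin and EFPmax at the lens resonant frequency. The sketch below expresses that simplified model; the exact mapping from optical power to focus distance depends on the rest of the imaging optics and is assumed linear here.

```python
import math

def effective_focus_position(t_seconds, efp_max, efp_min, resonant_hz=70e3):
    """First-order model of the TAG-lens focus sweep: EFP oscillates about EFPnom
    (the middle of range REFP, corresponding to zero optical power) between the two
    extremes EFPmax and EFPmin at the lens resonant frequency."""
    efp_nom = (efp_max + efp_min) / 2.0
    amplitude = (efp_max - efp_min) / 2.0
    return efp_nom + amplitude * math.sin(2.0 * math.pi * resonant_hz * t_seconds)
```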
In various embodiments, such a VFL lens (e.g., a TAG lens) and corresponding range REFP may be advantageously selected such that the structure can limit or eliminate the need for macro-mechanical adjustment of the first imaging structure 160 and/or adjustment of the distance between components in order to change the effective focal position EFP. For example, in embodiments where an unknown amount of tilt or "droop" may occur at the distal end DE2 of the second arm portion 130 (e.g., due to the weight and/or particular orientation of the first and second arm portions 120 and 130, etc.), the precise focus distance from the first imaging structure 160 to the XY scale 170 may be unknown and/or may vary with different orientations of the arms. In such a configuration, it may be desirable to use a VFL lens capable of scanning or otherwise adjusting the effective focal position EFP to determine and accurately focus on the XY scale 170.
In various embodiments, the XY scale 170 includes a nominally planar substrate SUB (as shown in fig. 4). The scale plane may be defined to nominally coincide with the planar substrate SUB, and a direction perpendicular to the scale plane may be defined as the scale imaging axis direction SIA. In the illustrated embodiment, the XY scale 170 is aligned in an operative configuration, wherein the scale imaging axis direction SIA is at least nominally aligned with (i.e., parallel to) the optical axis OA1 of the first imaging structure 160.
In some embodiments, the operational alignment actuator structure AAct of the operational alignment subsystem OAS may be omitted (or not used), and the scale imaging axis direction SIA is simply nominally aligned with the optical axis OA1 and/or the z-axis for one or more poses of the articulated robot 110 (e.g., based on robot precision). It should be appreciated that such alignment is "passive" or open loop and may suffer from small alignment errors associated with small sag/tilt misalignment angles MisAng (e.g., as shown in fig. 2B) based on or due to various poses, robot accuracy, and/or unavoidable robot deformations caused by gravity for the articulated robot 110 shown in fig. 1. In accordance with conventions used herein, such small alignment errors may be considered to fall within the definition of a "nominal" operational configuration and/or the definition of a "nominal" alignment in the various embodiments outlined herein.
However, in other embodiments, an operational alignment actuator structure AAct (e.g., the discrete operational alignment actuator structure AAct shown in fig. 1) is included in the operational alignment subsystem OAS and used. In such embodiments, based on signals from the alignment sensor (which has a known and/or stable alignment relative to the optical axis OA1), the operational alignment actuator structure AAct may actively align the scale imaging axis direction SIA with the optical axis OA1 for one or more poses of the articulated robot 110, at any desired time during operation of the articulated robot 110. It should be appreciated that such alignment is active or closed loop, and that small alignment errors corresponding to the small sag/tilt misalignment angle MisAng described above may be actively corrected at any desired time during operation of the articulated robot 110. In the illustrated embodiment, the alignment error may be actively corrected by using an alignment control signal ACont, generated in the alignment control portion 192 based on the alignment signal ASig provided by the alignment sensor ASen, to control the discrete operational alignment actuator structure AAct to adjust the alignment of the movable XY scale 170. This provides an operating configuration of the XY scale 170 and the first imaging structure 160 in which the optical axis OA1 and the scale imaging axis direction SIA are arranged parallel to one another, as indicated by the alignment signal ASig.
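A minimal sketch of the closed-loop behavior described above is given below, assuming the alignment signal can be reduced to a two-component misalignment estimate and that the operational alignment actuator accepts small relative tip/tilt commands. The proportional control law, gains, and function names are illustrative assumptions, not the specific control scheme of the disclosure.

```python
def align_scale_to_optical_axis(read_alignment_signal, command_tip_tilt,
                                gain=0.5, tolerance_rad=1e-5, max_steps=100):
    """Closed-loop alignment sketch: drive the operational alignment actuator until
    the misalignment indicated by the alignment sensor is nulled (within tolerance).
    read_alignment_signal() returns (mis_x_rad, mis_y_rad); command_tip_tilt(dx, dy)
    applies a small relative tip/tilt correction to the movable element."""
    for _ in range(max_steps):
        mis_x, mis_y = read_alignment_signal()          # residual misalignment MisAng
        if max(abs(mis_x), abs(mis_y)) < tolerance_rad:
            return True                                 # operating configuration reached
        command_tip_tilt(-gain * mis_x, -gain * mis_y)  # simple proportional correction
    return False
```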
As previously mentioned, in the embodiment shown in fig. 1 the operational alignment subsystem OAS includes the alignment sensor ASen, the discrete operational alignment actuator AAct, and the operational alignment subsystem processing circuitry/routines 190. The operational alignment subsystem processing circuitry/routines 190 include at least an alignment signal processing portion 191, which may provide primary signal conditioning and/or correction for the alignment signal ASig of the alignment sensor ASen, and/or provide analysis that determines a misalignment angle/vector or residual misalignment angle/vector corresponding to the alignment signal ASig, as described in more detail below.
In embodiments in which the operational alignment subsystem OAS includes some form of operational alignment actuator AAct, the operational alignment subsystem processing circuitry/routines 190 may further include an alignment control portion 192, which is generally configured to adjust the alignment of the movable one of the XY scale or the first imaging structure, based on the alignment signal ASig provided by the alignment sensor ASen, so as to provide an operating configuration of the XY scale and the first imaging structure in which the optical axis of the first imaging structure (e.g., OA1) and the scale imaging axis direction SIA are arranged parallel to one another, as indicated by the alignment signal ASig (e.g., as described above).
It should be understood that the architecture of the operational alignment subsystem processing circuitry/routines 190 shown in FIG. 1 and outlined above is exemplary only and not limiting. In various embodiments, various portions of the operational alignment subsystem processing circuitry/routines 190 may be located outside of the external control system ECS (e.g., in the alignment sensor ASen), or may be merged with and/or indistinguishable from other portions (e.g., the portions 185 and/or 187) of the auxiliary metrology position coordinate determination system 150. In some embodiments, the operational alignment subsystem processing circuitry/routines 190 may exchange position and/or alignment information and/or control signals with the robot motion control and processing system 140, as indicated by the dashed lines 193, in order to implement various operating principles or features disclosed herein. The foregoing and other aspects of various operational alignment subsystems OAS according to the principles disclosed herein will be described in greater detail below with reference to the accompanying figures.
XY scale 170 may include a plurality of corresponding imageable features distributed over substrate SUB. The respective imageable features are located at respective known x and y scale coordinates on the XY scale 170. In various embodiments, the XY scale 170 can be an incremental scale or an absolute scale, as described below with reference to fig. 4 and 5.
In various embodiments, the image triggering portion 181 and/or the metrology position coordinate processing portion 185 may be included as part of an external control system ECS (e.g., as part of an external computer, etc.). The image triggering portion 181 may be included as part of the first imaging structure control and image processing portion 180. In various embodiments, the image triggering portion 181 is configured to input at least one input signal related to the end tool position ETP, to determine the timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure 160. In various embodiments, the first imaging structure 160 is configured to acquire a digital image of the XY scale 170 at an image acquisition time in response to receiving the first imaging trigger signal. In various embodiments, the metrology position coordinate processing portion 185 is configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale 170 and the associated respective known XY scale coordinate position. In various embodiments, the external control system ECS may further include a standard robot position coordinate mode portion 147 and an auxiliary metrology position coordinate mode portion 187 for implementing the respective modes, which will be described in more detail below.
In various embodiments, the first imaging structure 160 may include components (e.g., subcircuits, routines, etc.) that periodically (e.g., at set timing intervals) activate image integration of the camera CAM1, wherein the first imaging trigger signal may activate a strobe (flash) timing or other mechanism to effectively freeze motion and accordingly determine the exposure within the integration period. In such embodiments, if the first imaging trigger signal is not received during an integration period, the resulting image may be discarded, whereas if the first imaging trigger signal is received during the integration period, the resulting image may be saved and/or otherwise processed/analyzed to determine the relative position, as will be described in more detail below.
In various embodiments, different types of end tools ETL may provide different types of outputs that can be used for the image trigger portion 181. For example, in embodiments where the end tool ETL is a contact probe for measuring a workpiece and outputting a contact signal when it contacts the workpiece, the image trigger portion 181 may be configured to input the contact signal, or a signal derived therefrom, as the at least one input signal on which the timing of the first imaging trigger signal is determined. As another example, in embodiments where the end tool ETL is a scanning probe for measuring a workpiece and providing respective workpiece measurement sample data corresponding to respective sample timing signals, the image trigger portion 181 may be configured to input the respective sample timing signals or signals derived therefrom as the at least one input signal. As another example, in embodiments where the end tool ETL is a camera for providing respective workpiece measurement images corresponding to respective workpiece image acquisition signals, the image trigger portion 181 may be configured to input the workpiece image acquisition signals or signals derived therefrom as the at least one input signal.
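The triggering relationship described above can be illustrated with a small sketch for the contact-probe case: when the contact signal (or a signal derived from it) arrives, the image trigger portion issues the first imaging trigger signal so that the scale image is acquired essentially at the instant of contact. The class and callback names are assumptions made for this example.

```python
import time

class ImageTriggerPortionSketch:
    """Illustrative model of the image trigger portion: an input signal related to
    the end tool (e.g., a touch/contact signal from a contact probe) determines the
    timing of the first imaging trigger signal sent to the first imaging structure."""

    def __init__(self, trigger_first_imaging_structure):
        # Callback that causes the first camera to acquire an image of the XY scale.
        self._trigger = trigger_first_imaging_structure

    def on_input_signal(self):
        # The timing of the first imaging trigger signal is derived directly from the
        # incoming input signal here; a derived/conditioned signal could equally be used.
        acquisition_time = time.monotonic()
        self._trigger()               # output the first imaging trigger signal
        return acquisition_time       # image acquisition time, for later processing
```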
In the exemplary embodiment of fig. 1, the auxiliary metrology position coordinate determination system 150 is configured with the XY scale 170 coupled to the second arm portion 130 proximate to the distal end DE2 of the second arm portion 130, and with the first imaging structure 160 coupled to the stationary element STE (e.g., a frame disposed above the articulated robot 110) and defining a first reference position REF1. In an alternative embodiment (e.g., as will be described in more detail below with reference to figs. 3A and 3B), the auxiliary metrology position coordinate determination system may be configured with the first imaging structure 160 coupled to the second arm portion 130 proximate to the distal end DE2 of the second arm portion 130, and with the XY scale 170 coupled to the stationary element STE and defining the first reference position REF1.
In either case, as will be described in more detail below, the position of the XY scale 170 along the z axis is within the focus range of the first imaging structure 160 (e.g., the focus position may be adjusted by a VFL lens or other means), and the auxiliary metrology position coordinate determination system 150 is configured such that the metrology position coordinate processing portion 185 is operable to determine the relative position (e.g., including x and y coordinates) between the movable one of the XY scale 170 or the first imaging structure 160 and the first reference position REF1, with a level of accuracy better than the robot accuracy, based on determining the image position of the identified at least one respective imageable feature in the acquired image. The determined relative position gives the metrology position coordinates of the end tool position ETP at the image acquisition time with a level of accuracy better than the robot accuracy, at least for the x and y metrology position coordinates in an x-y plane perpendicular to the z axis. In various embodiments, the auxiliary metrology position coordinate determination system 150 may be configured to determine the metrology position coordinates of the end tool position ETP at the image acquisition time based on the determined relative position and a known coordinate position offset (i.e., x and y coordinate offsets) between the end tool position ETP and the movable one of the XY scale 170 or the first imaging structure 160. It should be appreciated that such a system may have certain advantages over various alternative systems. For example, in various embodiments, a system such as that disclosed herein may be smaller and/or less expensive than alternative systems that utilize techniques such as laser trackers or photogrammetry for tracking robot motion/position, and it may also have higher accuracy in some embodiments. The disclosed system also does not occupy or obscure any portion of the operable workspace OPV, unlike alternative systems that may include a scale or fiducial on the ground or on a platform, or otherwise in the same area (e.g., the operable workspace) in which workpieces may be worked on and/or inspected, etc.
Figs. 2A and 2B are isometric views of a second exemplary embodiment of a robotic system 200 similar to the robotic system 100 of fig. 1, in which the first imaging structure 160 and the alignment sensor ASen of the operational alignment subsystem OAS are coupled to a stationary element STE (e.g., the stationary element STE of fig. 1). Fig. 2B is an isometric view of the robotic system of fig. 2A, showing certain errors that the alignment sensor ASen may indicate.
It is to be appreciated that certain numbered components (e.g., 1XX or 2XX) of fig. 2A and 2B can correspond to and/or have similar operation as corresponding components (e.g., 1XX) of fig. 1 that are the same or similarly numbered, and can be understood as being similar or identical thereto and can be understood by analogy, as described below. Numbering schemes indicating elements having similar and/or identical design and/or function are also applicable to various other figures herein including corresponding components of the same or similar numbering. In some instances, reference numerals have been omitted from subsequent figures that are significantly similar or identical elements to avoid visual clutter and to more clearly show and emphasize new or different elements introduced in those subsequent figures. Such similar or identical elements may be recognized in the various figures and may be understood by analogy with the previous description unless described or the context indicates otherwise.
In the configuration of figs. 2A and 2B, the operational alignment actuator structure shown in fig. 1 is omitted, and the XY scale 170 is coupled to the second arm portion 130 near the distal end DE2 of the second arm portion 130. In various embodiments, the stationary element STE to which the first imaging structure 160 is coupled may comprise a frame disposed above the articulated robot 110, as described above with reference to fig. 1. In various embodiments, different reference axes and lines may be specified for referencing particular motions, coordinates, and angles of the components of the articulated robot 110. For example, the first and second arm portions 120 and 130 may each have a designated horizontal centerline, CL1 and CL2 respectively, passing through the center of the respective arm portion. An angle A1 may be specified between the centerline CL1 of the first arm portion 120 and the x-z plane (e.g., in terms of the amount of rotation of the first rotary joint 125 about the first rotation axis RA1). An angle A2 may be specified between the horizontal centerline CL1 of the first arm portion 120 and the horizontal centerline CL2 of the second arm portion 130 (e.g., in terms of the amount of rotation of the second rotary joint 135 about the second rotation axis RA2).
In various embodiments, the end tool configuration ETCN may be coupled to the second arm portion 130 near the distal end DE2 of the second arm portion 130, and may be designated as having an end tool axis EA of the end tool ETL that nominally intersects the centerline CL2 of the second arm portion 130, wherein the end tool axis EA may generally be assumed to be parallel to the rotation axis RA2 and to the z axis. In various embodiments, the end tool axis EA passes through the end tool position ETP and has a known coordinate position offset (i.e., for the x and y coordinates) relative to the XY scale 170. Accordingly, there may be a known coordinate position offset between the end tool position ETP and the XY scale 170. For example, the XY scale 170 may have a designated reference point (e.g., at the center or an edge of the XY scale 170) that has a known coordinate position offset (e.g., a known distance) in the x-y plane relative to the end tool axis EA and, correspondingly, to the end tool position ETP. In various embodiments, such a known coordinate position offset may be expressed in terms of a known x offset and a known y offset.
In various embodiments, a known coordinate position offset between end tool position ETP and XY scale 170 may be used as part of the process of determining the metrology position coordinates of end tool position ETP. More specifically, as described above, the auxiliary metrology position coordinate determination system 150 may be configured such that, based on determining the image position of at least one respective imageable feature identified in the acquired image (i.e., of the XY scale 170), the metrology position coordinate processing portion 185 is operative to determine the relative position between the XY scale 170 and the first reference position REF1 (i.e., defined by the fixed first imaging structure 160). The auxiliary metrology position coordinate determination system 150 may also be configured to determine metrology position coordinates of the end tool position ETP based on the determined relative position and the known coordinate position offset between the end tool position ETP and the movable XY scale 170. In one particular example embodiment, a known coordinate position offset (e.g., represented by a known x offset and a known y offset) may be added to or otherwise combined with the determined relative position in order to determine the metrology position coordinates of the end tool position ETP.
As one particular example position coordinate structure, XY scale 170 may be specified to have a reference position (e.g., origin position) at X0, Y0, Z0 (e.g., they may have values of 0,0, 0 for origin position). In such a configuration, the reference position REF1 (i.e., as defined by the fixed first imaging structure 160) may be at relative coordinates of X1, Y1, Z1, and the center of the corresponding field of view FOV1 (e.g., corresponding to the acquired image) may be at relative coordinates of X1, Y1, Z0. The position of the tip tool axis EA in the X-Y plane extending from the XY scale 170 may be specified as having relative coordinates of X2, Y2, Z0. The end tool position ETP may be specified as having coordinates of X2, Y2, Z2. In various embodiments, the end tool ETL may have a measurement point MP (e.g., at the end of the end tool stylus ETST for contacting the workpiece), which may be designated as having coordinates X3, Y3, Z3. In embodiments where the measurement point MP of the end tool ETL does not vary in the X or Y direction relative to the rest of the end tool, the X3 coordinate and the Y3 coordinate may be equal to the X2 coordinate and the Y2 coordinate, respectively.
In one particular exemplary embodiment, the acquired image may be analyzed by the metrology position coordinate processing portion 185 to determine the relative position (e.g., to determine the X1, Y1 coordinates corresponding to the center of the field of view FOV1 of the fixed first imaging structure 160). Such a determination may be made in accordance with standard camera/scale image processing techniques (e.g., for determining the position of a camera relative to a scale). Various examples of such techniques are disclosed in U.S. patent Nos. 6,781,694; 6,937,349; 5,798,947; 6,222,940; and 6,640,008, each of which is incorporated herein by reference in its entirety. In various embodiments, such techniques may be used to determine the position of a field of view (e.g., corresponding to the position of the camera) within a scale range (e.g., within the XY scale 170). In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 170 and the associated respective known XY scale coordinate position. Such a determination may correspond to determining the relative position between the XY scale 170 and the first reference position REF1 (i.e., as defined by the fixed first imaging structure 160). The relative X2, Y2 coordinates (i.e., the coordinates of the end tool position ETP) may then be determined from the known coordinate position offset between the end tool position ETP and the XY scale 170 (e.g., by adding the x and y position offset values to X1 and Y1 to determine X2 and Y2).
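A simplified numeric sketch of the offset arithmetic described above follows: given the known XY scale coordinates of an imageable feature identified in the acquired image, the pixel position of that feature, and the image scale, the coordinates (X1, Y1) of the field-of-view center are recovered, and the known x/y offset between the XY scale and the end tool axis then gives (X2, Y2). Image rotation, distortion, and sub-pixel interpolation used by real camera/scale image processing are omitted, and alignment of the image axes with the scale axes is assumed.

```python
def end_tool_xy_from_scale_image(feature_scale_xy, feature_pixel_xy,
                                 image_center_pixel_xy, mm_per_pixel,
                                 known_offset_xy):
    """Return (X2, Y2): the end tool position in XY scale coordinates (illustrative).
    feature_scale_xy:      known (x, y) scale coordinates of the identified feature
    feature_pixel_xy:      (col, row) of that feature in the acquired image
    image_center_pixel_xy: (col, row) of the image / field-of-view center
    mm_per_pixel:          image scale (assumed equal in x and y, axes aligned)
    known_offset_xy:       calibrated offset between the XY scale and the end tool axis EA
    """
    # Field-of-view center (X1, Y1) in scale coordinates.
    x1 = feature_scale_xy[0] - (feature_pixel_xy[0] - image_center_pixel_xy[0]) * mm_per_pixel
    y1 = feature_scale_xy[1] - (feature_pixel_xy[1] - image_center_pixel_xy[1]) * mm_per_pixel
    # Apply the known coordinate position offset to obtain the end tool position (X2, Y2).
    x2 = x1 + known_offset_xy[0]
    y2 = y1 + known_offset_xy[1]
    return x2, y2
```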
As previously mentioned, fig. 2A and 2B show an embodiment in which the first imaging structure 160 and the alignment sensor ASen of the operational alignment subsystem OAS are coupled to the stationary element STE and the XY scale 170 is coupled to the movable second arm portion 130. The alignment sensor ASen is located near the first camera CAM1 and is mounted in a rigid configuration relative to the first camera CAM1 and the first imaging structure 160, according to known methods. In such a rigid configuration, it is desirable to have the alignment beam ABeam output by the alignment sensor ASen aligned parallel, or nearly parallel, to the optical axis OA1 of the first imaging structure 160. In that case, when the alignment beam ABeam is aligned perpendicular to the XY scale, the scale imaging axis SIA is aligned parallel to the optical axis OA1, as required to establish the desired operational configuration in accordance with the principles disclosed and claimed herein. However, if the alignment beam ABeam and the optical axis OA1 are only nearly parallel, the net result is a constant offset error, which may be insignificant or may be compensated for in various applications.
For purposes of explanation, fig. 2A shows an idealized view of a nominal operating configuration which, according to the conventions previously explained herein, is a passive or open-loop alignment configuration: the scale imaging axis SIA (defined perpendicular to the XY scale 170) and the optical axis OA1 are set to be nominally parallel to one another during setup, and the system thereafter operates in that configuration. In the idealized case shown, the various arms of the robotic system 200 do not sag or twist significantly, so that once the desired operational configuration (in which the scale imaging axis SIA defined perpendicular to the XY scale 170 is aligned parallel to the optical axis OA1) is established, it is maintained, without any residual misalignment MisAng due to sag or twist or the like, in the position shown and in other positions of the robotic system 200.
The alignment sensor ASen may be of any type suitable for determining a direction perpendicular to the XY scale, within a limited range of residual misalignment MisAng (as shown in fig. 2B) with respect to a nominal or reference direction. In the illustrated embodiment, the desired operational configuration corresponds to the reflected alignment beam ABeamR being reflected from the alignment reflective feature ARF on the XY scale 170 back to the alignment sensor ASen along the same path as the output alignment beam ABeam, so as to land on the detector of the alignment sensor ASen at a null or reference position. This produces a null or reference signal value for the alignment signal ASig outlined previously, which indicates that there is no residual misalignment (MisAng = 0). An exemplary configuration that may be used for the alignment sensor ASen is further described below with reference to fig. 14. Exemplary configurations that may be used for the XY scale 170 and the alignment reflective feature ARF are further described below with reference to fig. 4 and 5.
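As a purely illustrative aside, and not a description of the specific alignment sensor of fig. 14, the sketch below shows one common autocollimator-style way a null-type sensor could convert a reflected-spot displacement on its detector into small misalignment angle components; the focal length, units, and function names are assumptions.

```python
import math

def residual_misalignment_rad(spot_dx_mm: float, spot_dy_mm: float,
                              focal_length_mm: float) -> tuple[float, float]:
    """Return (MisAngX, MisAngY) in radians for a reflected beam whose spot is
    displaced (dx, dy) from the null/reference position on the detector.
    For an autocollimator-style sensor, a surface tilt of theta deflects the
    return beam by 2*theta, so the spot shift is approximately 2*theta*f for
    small angles (this sensing model is an assumption here)."""
    mis_ang_x = math.atan2(spot_dx_mm, 2.0 * focal_length_mm)
    mis_ang_y = math.atan2(spot_dy_mm, 2.0 * focal_length_mm)
    return mis_ang_x, mis_ang_y

# A spot displaced 20 um with a 100 mm focal length corresponds to ~100 urad of tilt.
print(residual_misalignment_rad(0.020, 0.0, 100.0))
```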
It will be appreciated that the movable XY scale 170 is held in a rigid relationship with respect to the end tool mounting structure ETMC, the end tool ETL (e.g., the end tool position ETP), and the measurement point MP of the end tool ETL. Thus, the coordinate offset between the XY scale 170 and the end tool position ETP and/or the measurement point MP is constant and can be calibrated. Additionally, it should be appreciated that within a limited range of residual misalignment MisAng, which may be quantitatively indicated by the alignment sensor ASen, the residual misalignment (e.g., misalignment angle or vector) of the end tool ETL may be known, and a corresponding misalignment or error of the end tool position ETP and/or the measurement point MP may be determined and at least partially corrected or compensated based on the indicated residual misalignment, as described in more detail with reference to fig. 2B.
Fig. 2B illustrates the same configuration as shown in fig. 2A, except that it shows the non-ideal or actual case in which the various arms of the robotic system 200 have significant amounts of sag or twist (e.g., on the order of tens or hundreds of microns). According to the conventions outlined previously, the configuration may still be described as providing a nominal operating configuration that provides positioning and measurement results within an expected or specified nominal range of robot accuracy (e.g., as has been expected and/or tolerated in various prior art robotic systems).
In the actual situation shown in fig. 2B, each arm of the robotic system 200 has significant sag and/or twist, which causes the XY scale 170 to deflect by a corresponding residual misalignment MisAng (e.g., a residual misalignment angle). For many practical robotic systems, the angle of this residual misalignment MisAng is small, and due to the residual misalignment MisAng the XY scale 170 may be significantly displaced in the Z direction without being significantly displaced in the X and Y directions. Thus, in comparison to fig. 2A, the optical axis OA1 is shown intersecting the XY scale at the same X and Y coordinates (X1, Y1), but at a different Z coordinate (Z0' instead of Z0).
However, it can be observed that the measurement point position MP and the end tool position ETP of the "touch probe" end tool ETL are significantly displaced in the Z direction, and are also significantly displaced in the X and Y directions, due to the interaction of the angle of the residual misalignment MisAng with their respective offsets LoffETP and LoffMP relative to the scale plane of the XY scale 170. It can be seen that the offsets LoffETP and LoffMP are along a direction perpendicular to the scale plane (and/or along the nominal Z axis). The offsets may be known by design or by calibration. One of ordinary skill in the art will recognize that the coordinate displacement or error (X3'-X3) of the measurement point position MP can be approximated as sin(MisAngX)*LoffMP, and (Y3'-Y3) can be approximated as sin(MisAngY)*LoffMP, where MisAngX and MisAngY are the angular components of the residual misalignment MisAng in the XZ and YZ planes, respectively. Similarly, the coordinate displacements or errors of the end tool position ETP (X2'-X2) and (Y2'-Y2) may be approximated as sin(MisAngX)*LoffETP and sin(MisAngY)*LoffETP, respectively. These coordinate displacements or errors, determined based on the residual misalignment MisAng, may be used to at least partially correct or compensate a set of metrology position coordinates that gives the end tool position ETP or the measurement point position MP of the end tool when residual misalignment is present as described above, at least for the vector component of the set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction (e.g., the X and Y coordinates). In some embodiments, the coordinate displacements or errors associated with (Z2'-Z2) and/or (Z3'-Z3) may be approximated based on the residual misalignment MisAng indicated by the alignment sensor ASen and the known geometries, orientations, and mechanical properties (e.g., beam properties) of the various arms and bearings of the robotic system 200. In such embodiments, errors that may occur in the Z coordinate of a set of metrology position coordinates may also be at least partially corrected or compensated based on the residual misalignment indicated by the alignment sensor ASen. In some embodiments, the magnitude of the residual misalignment MisAng may be evaluated during operation of the operational alignment subsystem OAS, and if the magnitude exceeds a predetermined threshold (e.g., related to an error limit), operations related to compensating or correcting the translation or displacement of the XY scale 170 across the field of view FOV1 may be performed. For example, based on the residual misalignment MisAng indicated by the alignment sensor ASen and the known geometries, orientations, and mechanical properties (e.g., beam properties) of the various arms and bearings of the robotic system 200, the X and Y displacements of the XY scale 170 due to the large residual misalignment can be approximated. In such embodiments, errors that may occur in the X and Y image position coordinates of the imageable features of the XY scale 170, and/or in the corresponding set of metrology position coordinates, may be at least partially corrected or compensated based on calculations related to the residual misalignment indicated by the alignment sensor ASen.
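The small-angle approximation above can be expressed as a brief sketch; this is illustrative only, the names are assumptions, and whether a correction term is added or subtracted depends on the sign convention chosen for MisAng.

```python
import math

def offset_point_error(mis_ang_x_rad: float, mis_ang_y_rad: float,
                       loff: float) -> tuple[float, float]:
    """Approximate (dX, dY) displacement of a point offset by `loff` from the scale
    plane (e.g., LoffETP or LoffMP), caused by residual misalignment components
    MisAngX and MisAngY: dX ~ sin(MisAngX)*Loff, dY ~ sin(MisAngY)*Loff."""
    return math.sin(mis_ang_x_rad) * loff, math.sin(mis_ang_y_rad) * loff

def corrected_xy(x: float, y: float, mis_ang_x_rad: float, mis_ang_y_rad: float,
                 loff: float) -> tuple[float, float]:
    """Remove the misalignment-induced displacement from nominally determined
    coordinates of the end tool position ETP or measurement point MP (the sign
    used here is a convention chosen only for illustration)."""
    dx, dy = offset_point_error(mis_ang_x_rad, mis_ang_y_rad, loff)
    return x - dx, y - dy

# Example: MisAngX = 0.5 mrad with a 150 mm offset gives roughly 75 um of X error.
print(offset_point_error(0.0005, 0.0, 150.0))  # (~0.075, 0.0) in mm
```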
It will be appreciated that the decision whether to include such an X and Y image position correction in a set of metrology position coordinates indicating the relative position between the movable one of the XY scale or the first imaging structure and the first reference position may be made based on the magnitude of the residual misalignment and the accuracy desired in the particular application.
Fig. 3A and 3B are isometric views of a third exemplary embodiment of a robotic system 300, similar to the robotic system 100 of fig. 1 and the robotic system 200 of fig. 2A and 2B, except that the first imaging structure 160 and the alignment sensor ASen of the operational alignment subsystem OAS are coupled to the movable second arm portion 130 proximate the distal end DE2 of the second arm portion 130, and the XY scale 170 is coupled to the stationary element STE and defines the first reference position REF1. In particular, fig. 3B is an isometric view of the robotic system of fig. 3A, showing certain errors that the alignment sensor ASen may indicate. Similar to the configuration of fig. 2A and 2B, the operational alignment actuator structure shown in fig. 1 is omitted.
As previously noted, certain numbered components (e.g., 3XX) of fig. 3A and 3B may correspond to, and may be similar or identical to, the same or similarly numbered corresponding components (e.g., 1XX, 2XX) of fig. 1, 2A and 2B, and may be understood by analogy, as described below. In some cases, elements that are significantly similar or identical in later figures may have their reference numerals omitted to avoid visual clutter, but they may be understood as outlined above.
Fig. 3A can be understood in many respects by analogy with the previous description of fig. 2A, so only the significant differences are emphasized below. With respect to fig. 3A, the first imaging structure 160 may have a specified reference point (e.g., at the center of the effective lens position of the first imaging structure 160), which is shown with coordinates (X1, Y1, Z1).
Similar to fig. 2A, for purposes of explanation, fig. 3A shows an idealized view of a nominal operating configuration which, according to the same conventions previously described with reference to fig. 2A, is a passive or open-loop alignment configuration in which the scale imaging axis SIA and the optical axis OA1 are set to be nominally parallel to one another, and the system thereafter operates in that configuration. In the idealized case illustrated, the various arms of the robotic system 300 do not sag or twist significantly, and the desired nominal operating configuration is maintained. In the illustrated embodiment, the desired operational configuration corresponds to the reflected alignment beam ABeamR being reflected from the alignment reflective feature ARF on the XY scale 170 back to the alignment sensor ASen along the same path as the output alignment beam ABeam, producing a null or reference signal value for the alignment signal ASig as outlined previously, which indicates that there is no residual misalignment (MisAng = 0), as previously described with reference to fig. 2A.
It should be appreciated that the movable first imaging structure 160 and the alignment sensor ASen are held in a rigid relationship with respect to each other and with respect to the end tool mounting structure ETMC, the end tool ETL (e.g., the end tool position ETP), and the measurement point MP of the end tool ETL. Accordingly, the coordinate offset between the specified reference point of the first imaging structure 160 and the end tool position ETP and/or the measurement point MP is constant and can be calibrated. Additionally, it should be appreciated that within a limited range of residual misalignment MisAng, which may be quantitatively indicated by the alignment sensor ASen, the residual misalignment (e.g., misalignment angle or vector) of the end tool ETL may be known, and the corresponding misalignment or error of the end tool position ETP and/or the measurement point MP may be determined and at least partially corrected or compensated based on the indicated residual misalignment, which may be understood by analogy with the determinations and corrections outlined above with reference to fig. 2B.
Fig. 3B illustrates the same configuration as shown in fig. 3A, except that it shows the non-ideal or actual case in which the various arms of the robotic system 300 have significant amounts of sag or twist (e.g., on the order of tens or hundreds of microns). According to the conventions outlined previously, the configuration may still be described as providing a nominal operating configuration that provides positioning and measurement results within an expected or specified nominal range of robot accuracy (e.g., as has been expected and/or tolerated in various prior art robotic systems). Fig. 3B can be understood in many respects by analogy with the previous description of fig. 2B, so only the significant differences are emphasized below.
With respect to fig. 3B, additional errors may be generated due to the residual misalignment MisAng that are not present in the configurations shown in fig. 2A and 2B. In particular, the field of view FOV1 of the first imaging structure 160 will translate from its desired or reference alignment position (e.g., as shown in fig. 3A) across the XY scale 170 according to the residual misalignment MisAng, resulting in a FOV misalignment error. It should be appreciated that the apparent position of the first imaging structure 160 relative to the XY scale 170 (at least in terms of its X, Y coordinates) is typically inferred based on determining the image position of at least one respective identifiable feature of the XY scale 170 in the acquired image. In the absence of information about the residual misalignment MisAng, the above-described FOV misalignment error is typically undetectable, and it manifests as a corresponding error in the determined image position of the at least one respective identifiable feature of the XY scale 170. For example, fig. 3A shows that when MisAng = 0, a scale feature located along the optical axis OA1 in the acquired image has the same X and Y position coordinates (X1, Y1) as the position coordinates (X1, Y1) of the specified reference point of the first imaging structure 160. In contrast, due to the residual misalignment error shown in fig. 3B, a scale feature located along the optical axis OA1 in the acquired image has "translated" X and Y position coordinates (X1', Y1') that are different from the actual position coordinates (X1, Y1) of the specified reference point of the first imaging structure 160, which results in a corresponding error (the FOV misalignment error) when estimating or determining the position of the first imaging structure based on the acquired image.
However, within a limited range of residual misalignment MisAng, which may be quantitatively indicated by the alignment sensor ASen, the residual misalignment (e.g., misalignment angle or vector) may be known, and the corresponding FOV misalignment error may be determined and at least partially corrected or compensated based on the indicated residual misalignment MisAng. One of ordinary skill in the art will recognize that the FOV misalignment error in the X direction (X1'-X1) may be approximated as sin(MisAngX)*ID, and the FOV misalignment error in the Y direction (Y1'-Y1) may be approximated as sin(MisAngY)*ID, where MisAngX and MisAngY are the angular components of the residual misalignment MisAng in the XZ and YZ planes, respectively. It will be appreciated that such a determined FOV misalignment displacement or error based on the residual misalignment MisAng may be used to at least partially correct or compensate the image position error and/or the resulting set of metrology position coordinates, at least for the vector component of the set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction (e.g., the X and Y coordinates).
It will be appreciated that the image position correction outlined immediately above may be combined with the previously outlined corrections relating to the end tool position ETP and/or the measurement point MP (due to their respective offsets and the residual misalignment error) to determine a set of metrology position coordinates that gives the end tool position ETP or the measurement point position of the end tool at the image acquisition time with a level of accuracy better than the robot accuracy, at least for the vector component of the second set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction. In some embodiments, the coordinate displacements or errors associated with (Z2'-Z2) and/or (Z3'-Z3) may be approximated based on the residual misalignment indicated by the alignment sensor ASen and the known geometries, orientations, and mechanical properties (e.g., beam properties) of the various arms and bearings of the robotic system 300. In such embodiments, errors that may occur in the Z coordinate of a set of metrology position coordinates may also be at least partially corrected or compensated based on the residual misalignment indicated by the alignment sensor ASen.
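A compact illustrative sketch of combining the two corrections (the FOV misalignment error, proportional to the distance ID used in the approximation above, and the offset error, proportional to LoffETP) follows; it is not the claimed implementation, and the sign convention, parameter names, and values are assumptions.

```python
import math

def fov_error(mis_ang_x: float, mis_ang_y: float, imaging_distance_id: float):
    """(X1'-X1, Y1'-Y1) ~ (sin(MisAngX)*ID, sin(MisAngY)*ID)."""
    return (math.sin(mis_ang_x) * imaging_distance_id,
            math.sin(mis_ang_y) * imaging_distance_id)

def offset_error(mis_ang_x: float, mis_ang_y: float, loff_etp: float):
    """(X2'-X2, Y2'-Y2) ~ (sin(MisAngX)*LoffETP, sin(MisAngY)*LoffETP)."""
    return math.sin(mis_ang_x) * loff_etp, math.sin(mis_ang_y) * loff_etp

def corrected_end_tool_xy(raw_x: float, raw_y: float,
                          mis_ang_x: float, mis_ang_y: float,
                          imaging_distance_id: float, loff_etp: float):
    """Apply both corrections to coordinates derived from the acquired image; whether
    each term is added or subtracted depends on the chosen sign convention for MisAng."""
    fx, fy = fov_error(mis_ang_x, mis_ang_y, imaging_distance_id)
    ox, oy = offset_error(mis_ang_x, mis_ang_y, loff_etp)
    return raw_x - fx - ox, raw_y - fy - oy

# Example: 0.3 mrad tilt, ID = 400 mm, LoffETP = 150 mm -> ~0.165 mm total X correction.
print(corrected_end_tool_xy(100.0, 50.0, 0.0003, 0.0, 400.0, 150.0))
```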
The various configurations and operations outlined above with reference to fig. 2A, 2B, 3A and 3B (in which the operational alignment subsystem OAS does not include the operational alignment actuator structure AAct) may be summarized as follows. The robotic system 200, 300 comprises a robot comprising a movable arm structure MAC and a motion control system. The movable arm structure MAC includes an end tool mounting structure ETMC located near a distal end of the movable arm structure MAC, and the robot is configured to move the movable arm structure MAC so as to move at least a portion of an end tool ETL mounted to the end tool mounting structure ETMC along at least two dimensions in an end tool workspace. The motion control system is configured to control the end tool position ETP or the measurement point position MP of the end tool ETL with a level of accuracy defined as the robot accuracy, based at least in part on sensing and controlling the position of the movable arm structure MAC using at least one position sensor SEN included in the robot. The robotic system further comprises an auxiliary metrology position coordinate determination system 150 comprising a first imaging structure 160, an XY scale 170, an operational alignment subsystem OAS, an image triggering portion 181, and a metrology position coordinate processing portion 185. The first imaging structure 160 includes a first camera CAM1 and has an optical axis OA1. As described in more detail below, the XY scale 170 comprises a nominally planar substrate and a plurality of respective imageable features distributed on the substrate, wherein the respective imageable features are located at respective known XY scale coordinates on the XY scale, the scale plane is defined to nominally coincide with the planar substrate of the XY scale, and the direction perpendicular to the scale plane is defined as the scale imaging axis direction SIA. The operational alignment subsystem OAS comprises at least one alignment sensor ASen, wherein the alignment sensor ASen is located near the first camera CAM1 and is mounted in a rigid configuration with respect to the first camera CAM1, and the alignment sensor ASen is configured to provide an alignment signal ASig indicative of the scale imaging axis direction SIA. The image triggering portion is configured to input at least one input signal related to the end tool position ETP or the measurement point position MP of the end tool ETL, to determine the timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure 160, wherein the first imaging structure 160 is configured to acquire a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal. The metrology position coordinate processing portion is configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale and the associated respective known XY scale coordinate position.
In such a configuration, the auxiliary metrology position coordinate determination system 150 is configured with one of the XY scale 170 or the first imaging structure 160 and alignment sensor ASen coupled to the movable arm structure MAC, and the other of the XY scale 170 or the first imaging structure 160 and alignment sensor ASen coupled to a stationary element STE near the robot, wherein the fixed one of the XY scale 170 or the first imaging structure 160 defines a first reference position. The robotic system is configured to provide at least a nominal operating configuration of the auxiliary metrology position coordinate determination system 150, in which the XY scale 170 and the first imaging structure 160 are arranged such that the optical axis OA1 of the first imaging structure 160 is nominally parallel to the scale imaging axis direction SIA and the scale plane is located within the focus range of the first imaging structure 160 along the scale imaging axis direction SIA. The robotic system is also configured to operate the operational alignment subsystem OAS to determine the residual misalignment MisAng between the optical axis OA1 and the scale imaging axis SIA, as indicated by the alignment signal ASig provided by the alignment sensor ASen. The auxiliary metrology position coordinate determination system 150 is further configured such that, when the movable one of the XY scale 170 or the first imaging structure 160 and the fixed one of the XY scale 170 or the first imaging structure 160 are arranged in the nominal operating configuration and the movable arm structure MAC is positioned such that the XY scale 170 is within the field of view FOV1 of the first imaging structure 160, it is operated to acquire a digital image of the XY scale 170 at the image acquisition time and to determine a corresponding residual misalignment MisAng. Based on the image position of the identified at least one respective imageable feature in the acquired image and the corresponding residual misalignment MisAng, the auxiliary metrology position coordinate determination system may then determine a first set of metrology position coordinates that gives the relative position between the movable one of the XY scale 170 or the first imaging structure 160 and the first reference position with a level of accuracy better than the robot accuracy, at least for the vector component of the first set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis. The auxiliary metrology position coordinate determination system 150 is further operable to determine, based on the first set of metrology position coordinates and the corresponding residual misalignment MisAng, a second set of metrology position coordinates that gives the end tool position ETP or the measurement point position MP of the end tool ETL at the image acquisition time with a level of accuracy better than the robot accuracy, at least for the vector component of the second set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction SIA.
Figure 4 is an isometric view of an exemplary embodiment of an incremental XY scale 170A. As shown in fig. 4, the incremental XY scale 170A includes an array of evenly spaced incremental imageable features IIF. In various embodiments, the incremental XY scale 170A may have a periodicity of less than 100 microns (e.g., the periodic spacings XSP1 and YSP1 between the incremental imageable features IIF along the respective x and y axes may each be less than 100 microns). In various embodiments, the position information determined using the incremental XY scale 170A may have an accuracy of at least 10 microns. Compared to the robot accuracy, which may be approximately 100 microns or more in some embodiments, the accuracy determined using such an XY scale 170A may thus be better by a factor of about 10. In one particular exemplary embodiment, the incremental XY scale 170A may have an even finer periodicity of approximately 10 microns, for which an accuracy of approximately 1 micron may be achieved if the magnification of the first imaging structure 160 is approximately 1x and interpolation is performed at a factor of 10x. Such a configuration improves the accuracy by a factor of approximately 100 over a robot accuracy of approximately 100 microns.
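The accuracy estimate in the preceding example can be reproduced with a trivially simplified budget; this sketch assumes that scale period, magnification, and interpolation are the only limiting factors and ignores imaging, distortion, and algorithmic error sources, so the function and numbers are illustrative only.

```python
def approx_resolution_um(scale_period_um: float, interpolation_factor: float,
                         magnification: float = 1.0) -> float:
    """Very rough resolution estimate: a finer scale period, higher magnification,
    and stronger interpolation all improve the achievable resolution."""
    return scale_period_um / (magnification * interpolation_factor)

robot_accuracy_um = 100.0
res_um = approx_resolution_um(scale_period_um=10.0, interpolation_factor=10.0)
print(res_um)                      # 1.0 micron
print(robot_accuracy_um / res_um)  # ~100x better than the robot accuracy
```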
In various embodiments, the position of the field of view FOV of the first imaging structure 160 within the incremental XY scale 170A may provide an indication of the relative position between the XY scale 170A and the first reference position REF 1. In various embodiments, the first imaging structure 160 can be used in conjunction with an incremental XY scale 170A as part of a camera/scale image processing structure. For example, the metrological position coordinates processing portion 185 may determine the relative incremental position between XY scale 170A and first reference position REF1 based on the position of the field of view FOV within incremental XY scale 170A, as indicated by the portion of XY scale 170A in the captured image, and as is known in the camera/scale image processing art (e.g., as described in the previously incorporated references). In various implementations, the incremental XY scale 170A can have various dimensions relative to the field of view FOV (e.g., the incremental XY scale 170A can be at least 4 times, 10 times, 20 times, etc. larger than the field of view FOV).
In various embodiments, the incremental positions indicated by the XY scale 170A can be combined with position information from the articulated robot 110 to determine a relatively precise and/or absolute position. For example, the sensors SEN1 and SEN2 (e.g., rotary encoders) of the articulated robot 110 may indicate the end tool position ETP with robot accuracy, and the incremental position indicated by the XY scale 170A may be used to further refine the determined end tool position ETP to have an accuracy better than the robot accuracy. In one such configuration, the metrology position coordinate processing portion 185 can be configured to identify one or more respective imageable features IIF included in the acquired image of the XY scale 170A based on the image positions of the one or more imageable features IIF in the acquired image and based on articulated robot position data derived from the motion control system 140 corresponding to the image acquisition time.
In such a configuration, the respective imageable features IIF of the XY scale 170A can comprise a set of similar imageable features IIF distributed over the substrate such that they are spaced apart from one another at regular intervals by a distance that is greater than the maximum position error allowed within the robot accuracy. As shown in fig. 4, the spacing of the imageable features IIF (e.g., at the intervals XSP1 and YSP1) is greater than the maximum position error MPE, which is represented by a circle around a representative imageable feature IIF. It should be appreciated that in such a configuration, the robot accuracy used for position determination is sufficient to determine the position to within less than the spacing between the imageable features IIF. More specifically, in various embodiments, a single imageable feature IIF on the XY scale 170A (that is, an imageable feature located at known x and y metrology position coordinates on the XY scale 170A according to the uniform spacing of the scale) can thus be identified from the articulated robot position data with sufficient accuracy that no two imageable features IIF can be confused with one another. In such a configuration, the position of that single imageable feature IIF in the acquired image may then be used to further refine the end tool position ETP with an accuracy better than the robot accuracy, at least for the x and y metrology position coordinates of the end tool position ETP in the x-y plane perpendicular to the z axis.
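The feature-identification step just described can be sketched as follows: the robot's coarse position (accurate to within the maximum position error MPE, which is smaller than the scale pitch) selects which incremental feature is being imaged, and the image then refines the position. The function names, pitch values, and offsets below are illustrative assumptions, not the patented algorithm.

```python
def identify_incremental_feature(coarse_x_um: float, coarse_y_um: float,
                                 pitch_x_um: float, pitch_y_um: float) -> tuple[int, int]:
    """Return the (column, row) index of the nearest incremental imageable feature,
    assuming the coarse (robot-accuracy) estimate is within half a pitch of the truth."""
    return round(coarse_x_um / pitch_x_um), round(coarse_y_um / pitch_y_um)

def refined_position_um(feature_index: tuple[int, int],
                        pitch_x_um: float, pitch_y_um: float,
                        image_offset_x_um: float, image_offset_y_um: float) -> tuple[float, float]:
    """Combine the identified feature's known scale coordinates with the sub-pitch
    offset measured in the acquired image."""
    col, row = feature_index
    return col * pitch_x_um + image_offset_x_um, row * pitch_y_um + image_offset_y_um

idx = identify_incremental_feature(1003.0, 2498.0, pitch_x_um=100.0, pitch_y_um=100.0)
print(idx)                                                 # (10, 25)
print(refined_position_um(idx, 100.0, 100.0, 3.2, -1.7))   # (1003.2, 2498.3)
```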
As described above with respect to fig. 2, in one particular exemplary embodiment, the XY scale 170A may be specified to have a reference position (e.g., an origin position) at X0, Y0, Z0 (e.g., which may have values of 0, 0, 0 for the origin position). In such a configuration, the reference position REF1 (i.e., as defined by the fixed first imaging structure 160) may be at relative coordinates X1, Y1, Z1, and the center of the respective field of view FOV (e.g., as captured in the acquired image) may be at relative coordinates X1, Y1, Z0. The position of the end tool axis EA in the X-Y plane extending from the XY scale 170 may be specified as having relative coordinates X2, Y2, Z0. The end tool position ETP may be specified as having coordinates X2, Y2, Z2.
In operation, the acquired images may be analyzed by the metrology position coordinate processing portion 185 to determine the X1, Y1 coordinates corresponding to the center of the field of view FOV of the fixed first imaging structure 160. In various embodiments, such a determination may be made in accordance with standard camera/scale image processing techniques for determining the position of a field of view (e.g., corresponding to the position of the camera) within a scale range (e.g., within the XY scale 170A). It will be appreciated that the reference position/origin position X0, Y0, Z0 need not be located within the field of view FOV in order to make such a determination in accordance with standard camera/scale image processing techniques (i.e., the relative position may be determined from the scale information at any position along the XY scale 170A, as provided in part by the scale elements comprising the evenly spaced incremental imageable features IIF). In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 170 and the associated respective known XY scale coordinate position. Such a determination may correspond to determining the relative position between the XY scale 170 and the first reference position REF1 (i.e., as defined by the fixed first imaging structure 160). The relative X2, Y2 coordinates (i.e., the coordinates of the end tool position ETP) may then be determined from the known coordinate position offset between the end tool position ETP and the XY scale 170 (e.g., adding the X and Y position offset values to X1 and Y1 to determine X2 and Y2).
A specific example of combining position information from the articulated robot 110 with the incremental position information indicated by the XY scale 170A to determine a relatively precise and/or absolute position is as follows. As shown in fig. 4, the acquired image may indicate that the center of the field of view FOV is in the middle of four incremental imageable features IIF, but may not indicate which four particular incremental imageable features IIF of the XY scale 170A are included in the image. The position information from the articulated robot 110 can be sufficiently accurate to identify which four particular incremental imageable features IIF of the XY scale 170A are included (e.g., based in part on the principles described above, by which the imageable features IIF are spaced apart by more than the maximum position error, represented by the representative circular area MPE, so that each imageable feature IIF can be uniquely identified). The acquired image can then be analyzed by the metrology position coordinate processing portion 185 to determine exactly where the center of the field of view (i.e., at coordinates X1, Y1, Z0) falls within that portion of the XY scale (i.e., the portion including the four particular incremental imageable features IIF). The process may then continue as described above (e.g., for determining the X2 and Y2 coordinates of the end tool position ETP accordingly).
For use of an XY scale 170A or the like with the alignment sensor ASen, the alignment beam ABeam is reflected from an alignment reflective feature ARF on a surface parallel to the scale plane, as shown in fig. 2A-3B. In various embodiments, the reflective features ARF may be imageable features IIF, and the alignment beam ABeam may be positioned to reflect from such features (based on moving the robot to achieve the reflection, for which robot accuracy is sufficient). It should be understood that the alignment sensor need not operate at every location on the XY scale 170A, and need not operate continuously. For example, the alignment beam ABeam may be operated intermittently to avoid related "noise" in the acquired scale images.
For optimum performance of the alignment sensor, it may be desirable for the alignment reflective feature ARF to be larger than the spot size of the alignment beam ABeam. If this conflicts with the desired size of the imageable features IIF, additional larger features ARF can be provided at different positions on the XY scale, as schematically represented by the optional alignment reflective feature ARF shown in fig. 4. It should be appreciated that in other embodiments, the alignment beam ABeam may have a wavelength that is not visible to the first imaging structure 160 (e.g., based on camera sensitivity or wavelength filtering), and the XY scale 170A may include a reflective layer for that particular wavelength at any location, which allows the alignment sensor ASen to operate at any location on the XY scale 170A and/or to operate continuously.
Figure 5 is an isometric view of an exemplary embodiment of an absolute XY scale 170B. For the use of the XY scale 170B or the like with the alignment sensor ASen, it should be understood that the considerations are the same as those previously outlined with reference to fig. 4. In the example of fig. 5, similar to the incremental XY scale 170A, the absolute XY scale 170B includes an array of evenly spaced incremental imageable features IIF, and also includes a set of absolute imageable features AIF having uniquely identifiable patterns (e.g., 16-bit patterns). In operation, the position of the field of view FOV of the first imaging structure 160 within the absolute XY scale 170B (i.e., as included in the captured image) provides an indication of the absolute position between the XY scale 170B and the first reference position REF1. In the embodiment of fig. 5, the set of absolute imageable features AIF is distributed over the substrate SUB such that the features are spaced apart (e.g., by the spacings XSP2 and YSP2) by a distance that is less than the distance across the field of view FOV of the first imaging structure 160 (i.e., such that at least one absolute imageable feature AIF will always be included in the field of view). In operation, the metrology position coordinate processing portion 185 is configured to identify at least one respective absolute imageable feature AIF included in the acquired image of the XY scale 170B based on the uniquely identifiable pattern of the respective absolute imageable feature AIF. It should be appreciated that, at least for the x and y metrology position coordinates of the end tool position ETP in the x-y plane perpendicular to the z axis, such an embodiment is capable of independently determining an absolute position representing the end tool position ETP with an accuracy better than the robot accuracy (e.g., in contrast to the incremental XY scale 170A, which may need to be combined with position information from the articulated robot 110 to determine an absolute position).
A specific example of using the absolute imageable features AIF to determine a relatively precise and absolute position is as follows. As shown in fig. 5, the acquired image may indicate that the center of the field of view FOV is in the middle of a plurality of incremental imageable features IIF. The position information from the two included absolute imageable features AIF indicates which portion of the XY scale 170B is included in the image, and the included incremental imageable features IIF of the XY scale 170B can accordingly also be identified. Thus, the acquired image can be analyzed by the metrology position coordinate processing portion 185 to determine exactly where the center of the field of view (i.e., at coordinates X1, Y1, Z0) falls within that portion of the XY scale (i.e., the portion including both the absolute imageable features AIF and the incremental imageable features IIF). The process may then continue as described above (e.g., for determining the X2 and Y2 coordinates of the end tool position ETP accordingly).
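For illustration only, the sketch below shows the general flavor of absolute position determination with such a scale: a decoded absolute feature AIF yields known coarse scale coordinates directly (no robot data needed), and the offset measured in the image refines them. The 16-bit code values, lookup table, and names are assumptions, not the patented decoding scheme.

```python
# Hypothetical mapping from decoded AIF code words to known scale coordinates (um).
AIF_LOOKUP_UM = {
    0x3A5C: (12000.0, 8000.0),
    0x3A5D: (12000.0, 8500.0),
}

def absolute_position_um(decoded_code: int,
                         offset_from_aif_x_um: float,
                         offset_from_aif_y_um: float) -> tuple[float, float]:
    """Known AIF scale coordinates plus the offset of the field-of-view center from
    that AIF, as measured in the acquired image."""
    aif_x, aif_y = AIF_LOOKUP_UM[decoded_code]
    return aif_x + offset_from_aif_x_um, aif_y + offset_from_aif_y_um

print(absolute_position_um(0x3A5C, 137.4, -52.1))  # (12137.4, 7947.9)
```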
FIG. 6 is a flow diagram illustrating an exemplary embodiment of a routine 600 for operating a robotic system including a robot and an auxiliary metrology position coordinate determination system that includes an operational alignment subsystem without an operational alignment actuator structure. As shown in fig. 6, at decision block 610, a determination is made whether the robotic system is to operate in an auxiliary metrology position coordinate mode. In various embodiments, the selection and/or activation of the auxiliary metrology position coordinate mode or the standard robot position coordinate mode may be made by a user and/or may be made automatically by the system in response to certain operations and/or instructions. For example, in one embodiment, the auxiliary metrology position coordinate mode may be entered (e.g., automatically or at the user's option) when the articulated robot is moved to a particular location (e.g., moving the end tool from a general area where assembly or other operations are performed to a more specific area where workpiece inspection operations are typically performed and where the auxiliary metrology position coordinate mode will be utilized). In various embodiments, such modes may be implemented by an external control system ECS (e.g., the external control system ECS of fig. 1 utilizing the standard robot position coordinate mode portion 147 and the auxiliary metrology position coordinate mode portion 187). In various embodiments, a hybrid mode may operate independently of or as part of the auxiliary metrology position coordinate mode, and/or may be implemented as switching between the modes, as will be described in more detail below with reference to FIG. 7.
If it is determined at decision block 610 that the robotic system is not to operate in the auxiliary metrology position coordinate mode, the routine proceeds to block 615, where the robotic system is operated in the standard robot position coordinate mode. As part of the standard robot position coordinate mode, the position sensors (e.g., rotary encoders) of the articulated robot are used to control and determine the movements of the articulated robot and the corresponding end tool position with robot accuracy (e.g., which is based at least in part on the accuracy of the position sensors of the articulated robot). As described above, the first and second rotary encoders may indicate the positions of the first and second arm portions with lower accuracy than the position information determined using the XY scale. In general, the robot position coordinate mode may correspond to an independent and/or standard mode of operation of the articulated robot (e.g., a mode in which the articulated robot operates independently, such as when the auxiliary metrology position coordinate determination system is inactive or is not provided).
If the robotic system is to operate in the auxiliary metrology position coordinate mode, the routine proceeds to block 620, where the robot and the auxiliary metrology position coordinate determination system are configured to provide a "nominal" operating configuration of the auxiliary metrology position coordinate determination system. The "nominal" operating configuration has been previously defined in accordance with the conventions used herein. The scale plane is defined to nominally coincide with the planar substrate of the XY scale, and the direction perpendicular to the scale plane is defined as the scale imaging axis direction. In the "nominal" operating configuration, at least one of the XY scale or the first imaging structure is arranged such that the optical axis of the first imaging structure is nominally parallel to the scale imaging axis direction and the scale plane is located within the focus range of the first imaging structure along the scale imaging axis direction.
At block 630, at least one input signal is received (i.e., at the image trigger portion), the input signal being related to a measurement point position or an end tool position of an end tool of the articulated robot. The timing of the first imaging trigger signal is determined based on the at least one input signal and the first imaging trigger signal is output to the first imaging structure. The first imaging structure acquires a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal.
At block 635, the operational alignment subsystem is operated (e.g., by the robotic system) to determine the residual misalignment between the optical axis and the scale imaging axis, as indicated by the alignment signal provided by the alignment sensor, corresponding to the acquired digital image. In various embodiments, what constitutes a residual misalignment corresponding to the acquired digital image may depend on the situation. For optimum accuracy, it may be desirable to determine the residual misalignment with the movable arm structure of the robot in the same (or nearly the same) position and/or pose as during the operations of block 630. However, if the robot arm structure is sufficiently rigid, and/or the position and/or pose used during the operations of block 635 is close to that used during the operations of block 630, and/or the accuracy requirements in a particular situation are less stringent, the operations of blocks 635 and 630 may be performed at different positions and/or times, and the residual misalignment determined at block 635 may still substantially correspond to the digital image acquired at block 630.
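One way such a correspondence decision could be automated is sketched below; the record fields, thresholds, and the simple time/pose test are illustrative assumptions only, not a requirement of the routine.

```python
from dataclasses import dataclass

@dataclass
class MisalignmentSample:
    timestamp_s: float
    joint_angles_rad: tuple   # robot pose when the alignment signal was sampled
    mis_ang_x_rad: float
    mis_ang_y_rad: float

def corresponds_to_image(sample: MisalignmentSample,
                         image_time_s: float,
                         image_joint_angles_rad: tuple,
                         max_dt_s: float = 0.05,
                         max_dpose_rad: float = 0.002) -> bool:
    """Treat a residual-misalignment sample as corresponding to an acquired image if it
    was taken close enough in time and with a sufficiently similar arm pose."""
    dt_ok = abs(sample.timestamp_s - image_time_s) <= max_dt_s
    dpose_ok = all(abs(a - b) <= max_dpose_rad
                   for a, b in zip(sample.joint_angles_rad, image_joint_angles_rad))
    return dt_ok and dpose_ok

sample = MisalignmentSample(10.003, (0.10, 1.22, -0.35), 0.0004, -0.0001)
print(corresponds_to_image(sample, 10.010, (0.10, 1.221, -0.35)))  # True
```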
At block 640, the acquired image is received (e.g., in a metrological position coordinates processing portion) and at least one respective imageable feature included in the acquired image of the XY scale and an associated respective known XY scale coordinate position are identified.
At block 650, a first set of metrology position coordinates indicative of the relative position between the movable one of the XY scale or the first imaging structure and the first reference position is determined with an accuracy better than the robot accuracy, at least for the vector component of the first set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction. The first set of metrology position coordinates is determined based on the image position of the identified at least one respective imageable feature in the acquired image and the corresponding residual misalignment (e.g., as outlined above with reference to fig. 2A, 2B, 3A, and/or 3B).
At optional block 655, a second set of metrology position coordinates may be determined based on the first set of metrology position coordinates and the corresponding residual misalignment, the second set of metrology position coordinates giving the measurement point position or end tool position of the end tool at the image acquisition time with a level of accuracy better than the robot accuracy, at least for the vector component of the second set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction. It should be appreciated that the first set of metrology coordinates determined at block 650 indicates a local reference position of the movable one of the XY scale or the first imaging structure. The operations of block 655 relate to correcting errors arising from the effect of the residual misalignment on the known offset between the measurement point position or end tool position of the end tool and the reference position of the movable one of the XY scale or the first imaging structure, as highlighted in the description of fig. 2A and 2B. After the operations of block 650 (or 655), the routine may end.
Alternatively, after the operations of block 650 (or 655), the routine may be partially or fully repeated. For example, the determined position information (e.g., from block 655) may correspond to, or otherwise be used to determine, a first surface position on a workpiece, and the routine may be repeated to determine a second surface position on the workpiece (e.g., as part of measuring the workpiece, such as measuring a feature of the workpiece). During repetition of the routine, in various embodiments, the operations at block 620 need not be repeated. In any case, the first and second determined metrology position coordinates, representing the first and second relative positions and/or associated position information determined by repeating the routine 600, may be used to determine a dimension of the workpiece corresponding to the distance between the first and second surface locations on the workpiece, which correspond to the respective end tool positions or measurement point positions of the end tool when contacting the respective first and second surface locations at the respective image acquisition times. It should be appreciated that, rather than using the position sensors of the robot (e.g., rotary encoders, linear encoders, etc.) to determine the first and second surface positions on the workpiece with robot accuracy, the techniques described herein may be utilized to determine more accurate position information. More specifically, the determination of the first and second surface positions (i.e., corresponding to first and second determined metrology position coordinates, which in turn correspond to respective first and second positions on the XY scale, the precise distance between which can be determined with the accuracy of the XY scale using the techniques described above) allows the corresponding dimension of the workpiece (e.g., of a workpiece feature) between the first and second surface positions to be determined with high accuracy.
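A trivial illustrative sketch of the dimension measurement just described follows: two surface positions determined via the XY scale give a feature dimension as the distance between them. The coordinate values are hypothetical.

```python
import math

def workpiece_dimension_um(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    """Distance in the XY plane between two measured surface positions."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

first_surface = (1003.2, 2498.3)    # from the first iteration of routine 600
second_surface = (1503.2, 2198.3)   # from the second iteration
print(workpiece_dimension_um(first_surface, second_surface))  # ~583.1 microns
```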
FIG. 7 is a flow diagram illustrating one exemplary embodiment of a routine 700 for determining end tool position, where different techniques may be used during different portions of the movement timing. Typically, during the motion timing, one or more arms of the articulated robot move from a first rotational position to a second rotational position (e.g., this may include rotating the arms from a first rotational orientation to a second rotational orientation about a rotational joint). As shown in FIG. 7, at decision block 710, a determination is made whether a hybrid mode is to be used to determine the end tool position during the motion timing. In various embodiments, the hybrid mode may also represent a process that includes switching between an auxiliary metrology position coordinate mode and a standard robot position coordinate mode. If the hybrid mode is not used, the routine continues to block 720 where the articulated robot's position sensor (e.g., rotary encoder) is used only to determine the end tool position during the motion timing.
If a hybrid mode is to be used, the routine proceeds to block 730, and during a first portion of the motion timing, a position sensor included in the articulated robot is used to determine the end tool position. During such operations, the relative position of the auxiliary metrology position coordinate determination system may not be determined and/or used to determine the end tool position. At block 740, the end tool position is determined using the determined relative position of the auxiliary metric position coordinate determination system during a second portion of the motion timing that occurs after the first portion of the motion timing. It will be appreciated that such operation enables the system to perform an initial/fast/coarse movement of the end tool position during a first portion of the movement timing and a more precise final/slower/fine movement of the end tool position during a second portion of the movement timing.
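To make the hand-off idea concrete, the following sketch servos on encoder feedback during the first portion of the motion and on scale-derived feedback during the second portion; the simulated plant, gains, tolerances, and interfaces are illustrative assumptions, not the patented control scheme.

```python
def hybrid_move(target, read_encoders, read_scale, move_by,
                handoff_tol=0.200, final_tol=0.002, gain=0.8, max_steps=1000):
    """Coarse/fine positioning; units are millimetres in this example."""
    # First portion of the motion timing: coarse motion using robot position sensors.
    for _ in range(max_steps):
        x, y = read_encoders()
        ex, ey = target[0] - x, target[1] - y
        if max(abs(ex), abs(ey)) <= handoff_tol:
            break
        move_by(gain * ex, gain * ey)
    # Second portion of the motion timing: fine motion using the scale-based position.
    for _ in range(max_steps):
        x, y = read_scale()
        ex, ey = target[0] - x, target[1] - y
        if max(abs(ex), abs(ey)) <= final_tol:
            break
        move_by(gain * ex, gain * ey)

# Minimal simulated robot: the encoders carry a fixed bias, the scale does not.
true_pos = [0.0, 0.0]
encoder_bias = (0.05, -0.03)
read_encoders = lambda: (true_pos[0] + encoder_bias[0], true_pos[1] + encoder_bias[1])
read_scale = lambda: tuple(true_pos)
def move_by(dx, dy):
    true_pos[0] += dx
    true_pos[1] += dy

hybrid_move((25.0, 10.0), read_encoders, read_scale, move_by)
print(true_pos)  # converges to approximately (25.0, 10.0)
```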
Fig. 8 is a block diagram of a fourth exemplary embodiment of a robotic system 800 including a robot 810 and an auxiliary metrology position coordinate determination system 850. The auxiliary metrology position coordinate determination system 850 is shown as comprising a second exemplary embodiment of the operational alignment subsystem OAS, comprising an alignment sensor ASen and an operational alignment actuator structure AAct that comprises the movable arm structure MAC' of the robotic system 800 and elements included in the robot motion control and processing system 840, as described in more detail below. The alignment sensor ASen and the operational alignment actuator structure AAct are connected to (or interoperate with) operational alignment subsystem processing circuitry/routines 890.
The robot 810 (e.g., an articulated robot) includes a movable arm structure MAC' and a robot motion control and processing system 840. The auxiliary metrology position coordinate determination system 850 includes a first imaging structure 860, an XY scale 870, an image trigger portion 881, and a metrology position coordinate processing portion 885. In the configuration of fig. 8, the XY scale 870 is coupled to the movable arm structure MAC'. As will be described in more detail below, the first imaging structure 860 has a first optical axis OA1, which may be parallel to the scale imaging axis direction SIA when in the operational configuration.
In the example of fig. 8, the movable arm structure MAC' includes a lower base portion BSE', arm portions 821-825, and motion mechanisms 831-835. As will be described in greater detail below and further illustrated in fig. 9, each arm portion 821-825 may have a respective proximal end PE1-PE5 and a respective distal end DE1-DE5. In various embodiments, some or all of the arm portions 821-825 may be coupled to respective motion mechanisms 831-835. In the example of fig. 8, some or all of the motion mechanisms 831-835 may include rotary joints, linear actuators, and/or corresponding motors. In various embodiments, position sensors SEN1'-SEN5' (e.g., rotary encoders, linear encoders, etc.) may be used to determine the positions (e.g., angular orientations, linear positions, etc.) of the respective arm portions 821-825.
In various embodiments, the movable arm structure MAC' may have a portion (e.g., the fifth arm portion 825) designated as a terminal portion. In the example configuration of fig. 8, the end tool mounting structure ETMC is located near (e.g., at) the distal end DE5 of the fifth arm portion 825 (e.g., designated as the terminal portion), which distal end DE5 corresponds to the distal end of the movable arm structure MAC'. In various embodiments, the XY scale 870 may be coupled to the movable arm structure MAC' so as to be proximate the distal end of the movable arm structure MAC'. In the embodiment of fig. 8, the XY scale 870 is coupled to the fifth arm portion 825 at a position near the distal end of the movable arm structure MAC'. In some embodiments according to the principles disclosed herein, the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale (e.g., 870) or the first imaging structure (e.g., 860) is coupled to the operational alignment actuator structure AAct, which is coupled to or is part of the movable arm structure (e.g., MAC'). The embodiments shown in fig. 8, 9 and 10 correspond to this description in that the operational alignment actuator structure AAct comprises a first rotation element 835/825 and a second rotation element 834/824 included in the movable arm structure MAC'.
In various embodiments, the end tool mounting structure ETMC may include various elements for attaching and retaining the end tool ETL near the distal end of the movable arm structure MAC'. For example, in various embodiments, the end tool mounting structure ETMC may include an automatic joint connection, a magnetic coupling portion, and/or other coupling elements known in the art for mounting the end tool ETL to a corresponding element. The end tool mounting structure ETMC may also include electrical connections (e.g., a power connection, one or more signal lines, etc.) for providing power and/or transmitting signals to and/or from at least a portion of the end tool ETL (e.g., to and from the end tool sensing portion ETSN).
In various embodiments, the end tool ETL may include an end tool sensing portion ETSN and an end tool stylus ETST having a measurement point MP (e.g., for contacting a surface of a workpiece WP). The fifth motion mechanism 835 is located near the distal end DE4 of the fourth arm portion 824. In various embodiments, the fifth motion mechanism 835 (e.g., a rotary joint with a corresponding motor) may be configured to rotate the fifth arm portion 825 about the rotation axis RA5. In any case, the end tool ETL is mounted to (e.g., coupled to) the end tool mounting structure ETMC and has a corresponding end tool position ETP with corresponding metrology position coordinates (e.g., x, y, and z coordinates). In various embodiments, the end tool position ETP may correspond to, or be proximate to, the position of the end tool mounting structure ETMC (e.g., at or proximate to the distal end DE5 of the fifth arm portion 825, which may correspond to the distal end of the movable arm structure MAC').
The motion control system 840 is configured to control the end tool position ETP of the end tool ETL with a level of accuracy defined as the robot accuracy. More specifically, the motion control system 840 is generally configured to control the metrology position coordinates (e.g., x, y, and z coordinates) of the end tool position ETP with robot accuracy, based at least in part on sensing and controlling the positions of the arm portions 821-825 using the position sensors SEN1'-SEN5'. In various embodiments, the motion control and processing system 840 may include motion mechanism control and sensing portions 841-845 that may respectively receive signals from the respective position sensors SEN1'-SEN5' for sensing the positions (e.g., angular positions, linear positions, etc.) of the respective arm portions 821-825, and/or may provide control signals to the respective motion mechanisms 831-835 (e.g., including rotary joints, linear actuators, motors, etc.) for moving the respective arm portions 821-825.
The motion control and processing system 840 may also receive signals from the end tool sensing portion ETSN. In various embodiments, the end tool sensing portion ETSN may include circuitry and/or structure associated with the operation of the end tool ETL for sensing a workpiece WP. As will be described in greater detail below, in various embodiments the end tool ETL (e.g., a contact probe, a scanning probe, a camera, etc.) may be used to contact or otherwise sense a surface location/position/point on the workpiece WP, for which the end tool sensing portion ETSN may receive, determine, and/or process various corresponding signals and may provide corresponding signals to the motion control and processing system 840. In various embodiments, the motion control and processing system 840 may include an end tool control and sensing portion 846 that may provide control signals to and/or receive sensing signals from the end tool sensing portion ETSN. In various embodiments, the end tool control and sensing portion 846 and the end tool sensing portion ETSN may be merged and/or indistinguishable. In various embodiments, both the motion mechanism control and sensing portions 841-845 and the end tool control and sensing portion 846 may provide outputs to and/or receive control signals from the robot position processing portion 847, which may control and/or determine the overall positioning of the movable arm structure MAC' of the robot 810 and the corresponding end tool position ETP (as part of the robot motion control and processing system 840).
In various embodiments, the auxiliary metrology position coordinate determination system 850 may be included with the robot 810 or otherwise added to the robot 810 (e.g., as part of a retrofit configuration for addition to an existing robot 810, etc.). In general, the auxiliary metrology position coordinate determination system 850 may be used to provide an improved level of accuracy for the determination of the end tool position ETP. More specifically, as will be described in more detail below, the auxiliary metrology position coordinate determination system 850 may be used to determine metrology position coordinates that are indicative of the end tool position ETP with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction SIA. In various embodiments (e.g., in which the scale imaging axis direction SIA and the end tool stylus ETST are parallel to the z axis), this may correspond to a level of accuracy that is better than the robot accuracy at least for the x and y metrology position coordinates in an x-y plane that is perpendicular to the z axis.
As shown in fig. 8, the first imaging structure 860 is coupled to a stationary element STE proximate to the robot 810. In various embodiments, the stationary element STE may comprise a frame arranged above at least a portion of the end tool workspace ETWV', with the first imaging structure 860 fixed to the frame above a portion of the end tool workspace ETWV'. In various embodiments, the stationary element STE may include one or more structural support elements SSP (e.g., extending from a floor, a ceiling, etc.) for maintaining the stationary element STE in a fixed position (e.g., with a fixed position and/or orientation) relative to the robot 810.
In various embodiments, the end tool workspace ETWV' comprises a volume within which at least a portion of at least one of the end tool ETL and/or the XY scale 870 may be moved. In the example of fig. 8, the end tool workspace ETWV' is illustrated as comprising a volume within which the measurement point MP of the end tool ETL may be moved when inspecting a workpiece. As an alternative example, an end tool workspace may instead comprise a volume within which the XY scale 870 may be moved when the end tool ETL is moved for inspecting a workpiece. In various embodiments, the robot 810 is configured to move the movable arm structure MAC' so as to move at least a portion of the end tool ETL (e.g., the measurement point MP of the end tool ETL) that is mounted to the end tool mounting configuration ETMC along at least two dimensions (e.g., the x and y dimensions) in the end tool workspace ETWV'. In the example of fig. 8, the portion of the end tool ETL (e.g., the measurement point MP of the end tool ETL) may be moved by the robot 810 in three dimensions (e.g., the x, y, and z dimensions).
The first imaging structure 860 includes a first camera CAM1 and has an optical axis OA1. In an operational configuration of the auxiliary metrology position coordinate determination system 850, the optical axis OA1 of the first imaging structure 860 is parallel to the scale imaging axis direction SIA. The first imaging structure 860 has an effective focus range REFP along its optical axis OA1. In various embodiments, the range REFP may be bounded by first and second effective focus positions EFP1 and EFP2, which will be described in more detail below. At a given time, the first imaging structure 860 has an effective focus position EFP that falls within the range REFP. In embodiments in which a variable focal length (VFL) lens is used, the range REFP may correspond to the focus range of the VFL lens.
In various embodiments, the VFL lens used may be a tunable acoustic gradient index of refraction (TAG) lens. With respect to the general operation of such a TAG lens, in various embodiments a lens controller (e.g., as included in the first imaging structure control and processing portion 880) may rapidly adjust or modulate the optical power of the TAG lens periodically, to achieve a high-speed VFL lens capable of periodic modulation at 250 kHz, 70 kHz, 30 kHz, or the like (i.e., at a TAG lens resonant frequency). In such a configuration, the effective focus position EFP of the first imaging structure 860 may be moved (e.g., rapidly) within the range REFP (e.g., an autofocus search range). The effective focus position EFP1 (or EFPmax) may correspond to a maximum optical power of the TAG lens, and the effective focus position EFP2 (or EFPmin) may correspond to a maximum negative optical power of the TAG lens. In various embodiments, the middle of the range REFP may be designated EFPnom, and may correspond to zero optical power of the TAG lens.
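For illustration only, the periodic relationship between the TAG lens modulation and the resulting effective focus position EFP may be sketched as follows. The sinusoidal model, the 70 kHz resonant frequency, and the numeric focus values are assumptions chosen for the example and are not specified by this disclosure.

```python
import math

def effective_focus_position(t_s, f_resonant_hz=70e3,
                             efp_nom_mm=150.0, efp_half_range_mm=5.0):
    """Illustrative model: the TAG lens optical power is modulated sinusoidally
    at its resonant frequency, so the effective focus position EFP sweeps the
    range REFP (EFPmin to EFPmax about EFPnom) once per modulation period.
    All numeric values are hypothetical."""
    return efp_nom_mm + efp_half_range_mm * math.sin(2.0 * math.pi * f_resonant_hz * t_s)

# Sample the focus sweep at quarter-period steps of a 70 kHz modulation cycle.
period_s = 1.0 / 70e3
for i in range(5):
    t = i * period_s / 4.0
    print(f"t = {t * 1e6:5.2f} us -> EFP = {effective_focus_position(t):8.3f} mm")
```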
In various embodiments, such a VFL lens (e.g., a TAG lens) and/or a corresponding range REFP may be advantageously selected such that the configuration limits or eliminates the need for macroscopic mechanical adjustments of the first imaging structure 860 and/or adjustment of distances between components (in order to change the effective focus position EFP). For example, in embodiments in which an unknown amount of tilt or "sag" may occur proximate to the distal end DE5 of the fifth arm portion 825 (e.g., corresponding to the distal end of the movable arm structure MAC'), for example due to the weight and/or particular orientations of the arm portions 821-825, etc., the precise focus distance from the first imaging structure 860 to the XY scale 870 may be unknown and/or may vary with different orientations of the arm portions, etc. It should also be appreciated that, in the example configuration of fig. 8, the distance between the XY scale 870 and the first imaging structure 860 may generally vary in accordance with the normal operation of the movable arm structure MAC', which may move the end tool position ETP to different positions/distances along the scale imaging axis direction SIA relative to the first imaging structure 860 (e.g., as part of operations for scanning the surface of a workpiece WP, etc.). In such configurations, it may be desirable to utilize a VFL lens that can scan or otherwise adjust the effective focus position EFP in order to determine and accurately focus at the position of the XY scale 870. In various embodiments, such techniques utilizing a VFL lens may be used in combination with other focus adjustment techniques (e.g., in combination with a variable objective lens, which may also be included in the first imaging structure 860, etc.).
As previously mentioned, in the embodiment shown in fig. 8 the operational alignment subsystem OAS includes the alignment sensor ASen, the operational alignment actuator configuration AAct (which includes elements of the movable arm structure MAC'), and the operational alignment subsystem processing circuits/routines 890. The operational alignment subsystem processing circuits/routines 890 include an alignment signal processing portion 891, which may provide primary signal conditioning and/or correction for the alignment signal Asig of the alignment sensor ASen, and/or analysis for determining a misalignment angle/vector or residual misalignment angle/vector corresponding to the alignment signal Asig, as described in more detail below. The operational alignment subsystem processing circuits/routines 890 also include an alignment control portion 892 that is generally configured to adjust the alignment of the movable one of the XY scale or the first imaging structure, based on the alignment signal Asig provided by the alignment sensor ASen, to provide an operational configuration of the XY scale and the first imaging structure in which the optical axis (e.g., OA1) of the first imaging structure and the scale imaging axis direction SIA are arranged to be parallel, as indicated by the alignment signal Asig, as previously described.
It should be understood that the architecture of the operational alignment subsystem processing circuits/routines 890 shown in fig. 8 and outlined above is exemplary only and not limiting. In various embodiments, various portions of the operational alignment subsystem processing circuits/routines 890 may be located outside the external control system ECS (e.g., in the operational alignment sensor ASen), or may be merged with and/or indistinguishable from other portions of the auxiliary metrology position coordinate determination system 850 (e.g., the portions 885 and/or 887). In the illustrated embodiment, the operational alignment subsystem processing circuits/routines 890 exchange position and/or alignment information and/or control signals with the robot motion control and processing system 840, as indicated by the dashed line 893, in order to implement various operating principles or features disclosed herein. The various elements and operations outlined above will be described in more detail below.
In various embodiments, the XY scale 870 may be identical to the XY scale 170 described above with reference to figs. 4 and 5, or may otherwise be configured in accordance with the principles disclosed herein. In various embodiments, the robotic system 800 is operable to provide an operational configuration of the auxiliary metrology position coordinate determination system 850, as described in more detail below with reference to figs. 9 and 10. In the operational configuration of the auxiliary metrology position coordinate determination system 850, the movable XY scale 870 is arranged with the scale imaging axis direction SIA parallel to the optical axis OA1 of the first imaging structure 860, as indicated by the alignment signal Asig of the alignment sensor ASen, and with the scale plane located within the focus range REFP of the first imaging structure 860 along the scale imaging axis direction SIA. It will be appreciated that, in order to place the auxiliary metrology position coordinate determination system 850 in the operational configuration having the characteristics described above, various adjustments may be made to the positions/orientations of the arm portions 821-825 of the movable arm structure MAC', based on the alignment signal Asig of the alignment sensor ASen, using a set of actuators regarded as the operational alignment actuator configuration AAct of the operational alignment subsystem OAS, as described below with reference to fig. 9. In other words, the robotic system 800 is configured to operate the operational alignment subsystem OAS and the operational alignment actuator configuration AAct to adjust the alignment of the movable one of the XY scale 870 or the first imaging structure 860, based on the alignment signal provided by the alignment sensor ASen, to provide the operational configuration of the auxiliary metrology position coordinate determination system 850, in which the XY scale 870 and the first imaging structure 860 are arranged with the optical axis OA1 of the first imaging structure 860 parallel to the scale imaging axis direction SIA, as indicated by the alignment signal Asig.
In various embodiments, the image triggering portion 881 and/or the metrology position coordinate processing portion 885 may be included as part of an external control system ECS' (e.g., as part of an external computer, etc.). In various embodiments, the image triggering portion 881 may be included as part of the first imaging structure control and processing portion 880. In various embodiments, the image triggering portion 881 is configured to input at least one input signal that is related to the end tool position ETP, to determine the timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure 860. In various embodiments, the first imaging structure 860 is configured to acquire a digital image of the XY scale 870 at an image acquisition time in response to receiving the first imaging trigger signal. In various embodiments, the metrology position coordinate processing portion 885 is configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale 870 and the related respective known XY scale coordinate location. In various embodiments, the external control system ECS' may also include a standard robot position coordinates mode portion 849 and an auxiliary metrology position coordinates mode portion 887 for implementing corresponding modes, as will be described in more detail below.
In various embodiments, the first imaging structure 860 may include components (e.g., subcircuits, routines, etc.) that periodically (e.g., at a set timing interval) activate image integration of the camera CAM1, for which a first imaging trigger signal from the image triggering portion 881 may activate a strobe timing or other mechanism to effectively freeze motion and correspondingly determine an exposure within the integration period. In such implementations, if no first imaging trigger signal is received during an integration period, the resulting image may be discarded, whereas if a first imaging trigger signal is received during the integration period, the resulting image may be saved and/or otherwise processed/analyzed to determine the metrology position coordinates, as will be described in more detail below.
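A minimal sketch of the keep-or-discard logic described in the preceding paragraph is given below; the class and function names are hypothetical, and the timing values are arbitrary examples.

```python
from dataclasses import dataclass
from typing import Sequence

@dataclass
class IntegrationPeriod:
    start_s: float  # start of the camera integration window
    end_s: float    # end of the camera integration window

def keep_image(period: IntegrationPeriod, trigger_times_s: Sequence[float]) -> bool:
    """Return True if a first imaging trigger signal arrived during the
    integration period, in which case the resulting image is saved and
    processed; otherwise the image is discarded."""
    return any(period.start_s <= t <= period.end_s for t in trigger_times_s)

window = IntegrationPeriod(start_s=0.0100, end_s=0.0110)   # 1 ms integration window
print(keep_image(window, [0.0104]))  # True  -> save/process the image
print(keep_image(window, [0.0150]))  # False -> discard the image
```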
In various embodiments, different types of end tools ETL may provide different types of outputs that may be used with respect to the image triggering portion 881. For example, in embodiments in which the end tool ETL is a contact probe that is used for measuring a workpiece and that outputs a contact signal when it contacts the workpiece (e.g., when the measurement point MP contacts the workpiece), the image triggering portion 881 may be configured to input that contact signal, or a signal derived therefrom, as the at least one input signal on which the timing of the first imaging trigger signal is based. In various embodiments in which the end tool ETL is a contact probe, a central axis of the contact probe may be oriented along the scale imaging axis direction SIA (e.g., with the central axis of the contact probe corresponding to the end tool axis EA). As another example, in embodiments in which the end tool ETL is a scanning probe that is used for measuring a workpiece and that provides respective workpiece measurement sample data corresponding to a respective sample timing signal, the image triggering portion 881 may be configured to input that respective sample timing signal, or a signal derived therefrom, as the at least one input signal. As another example, in embodiments in which the end tool ETL is a camera that is used to provide a respective workpiece measurement image corresponding to a respective workpiece image acquisition signal, the image triggering portion 881 may be configured to input that workpiece image acquisition signal, or a signal derived therefrom, as the at least one input signal.
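The three end tool cases described above may be summarized in a small dispatch sketch; the type strings and signal names below are hypothetical placeholders, not identifiers from this disclosure.

```python
def trigger_input_signal(end_tool_type: str, available_signals: dict):
    """Select the signal that the image triggering portion uses as its
    at-least-one input signal, per the end tool type described above.
    Keys and signal names are illustrative only."""
    source_by_type = {
        "contact_probe":  "contact_signal",            # emitted when the probe contacts the workpiece
        "scanning_probe": "sample_timing_signal",      # one pulse per workpiece measurement sample
        "camera":         "workpiece_image_acquisition_signal",
    }
    return available_signals[source_by_type[end_tool_type]]

signals = {"contact_signal": 0.0123, "sample_timing_signal": None,
           "workpiece_image_acquisition_signal": None}
print(trigger_input_signal("contact_probe", signals))
```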
In the exemplary embodiment of fig. 8, the auxiliary metrology position coordinate determination system 850 is configured with the XY scale 870 coupled to the operational alignment actuator structure AAct, which is coupled to or part of the movable arm structure (e.g., MAC'). The embodiments shown in figs. 8, 9, and 10 correspond to this description, in that the operational alignment actuator structure AAct comprises a first rotary element 835/825 and a second rotary element 834/824 included in the movable arm structure MAC'. Furthermore, the first imaging structure 860 and the alignment sensor ASen are coupled to the stationary element STE (e.g., a frame arranged above and proximate to the robot 810) and define a first reference position REF1. In an alternative embodiment (e.g., as described below with reference to fig. 10), the auxiliary metrology position coordinate determination system may be configured with the first imaging structure 860 and the alignment sensor ASen coupled to an operational alignment actuator structure AAct, which is coupled to or part of the movable arm structure (e.g., MAC') proximate to the distal end of the movable arm structure MAC', and with the XY scale 870 coupled to the stationary element STE and defining the first reference position REF1.
In either case, as will be described in more detail below, the auxiliary metrology position coordinate determination system 850 may be configured such that the movable one of the XY scale 870 or the first imaging structure 860 and the stationary one of the XY scale 870 or the first imaging structure 860 are arranged in an operational configuration, based on the alignment signal Asig from (or as indicated by) the alignment sensor ASen, with the movable arm structure MAC' positioned such that the XY scale 870 lies within the field of view and the focus range of the first imaging structure 860. Based on determining the image position of the identified at least one respective imageable feature in the acquired image, the metrology position coordinate processing portion 885 is then operated to determine metrology position coordinates that are indicative of the relative position between the movable one of the XY scale 870 or the first imaging structure 860 and the first reference position REF1, with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction SIA. The determined metrology position coordinates are indicative of the end tool position ETP and/or the measurement point position MP of the end tool at the image acquisition time, with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction SIA. In various embodiments, the auxiliary metrology position coordinate determination system 850 may be configured to determine the metrology position coordinates of the end tool position ETP and/or the measurement point position MP of the end tool at the image acquisition time, based on the determined metrology position coordinates that are indicative of the relative position of the movable one of the XY scale 870 or the first imaging structure 860 and a known coordinate position offset between the end tool position ETP and/or the measurement point position MP of the end tool and the movable one of the XY scale 870 or the first imaging structure 860.
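As a rough numerical illustration of the determination just described, the sketch below combines a relative position recovered from an acquired scale image with a known coordinate position offset to obtain the end tool (or measurement point) coordinates. The helper name and all coordinate values are hypothetical.

```python
def end_tool_metrology_coordinates(relative_xyz_mm, known_offset_xyz_mm):
    """Combine the metrology position coordinates indicative of the relative
    position between the movable element and the first reference position REF1
    with a known coordinate position offset to the end tool position ETP (or
    measurement point MP).  Values are illustrative only."""
    return tuple(r + o for r, o in zip(relative_xyz_mm, known_offset_xyz_mm))

relative_position = (412.037, -58.214, 0.0)    # e.g., derived from the acquired scale image
known_offset      = (15.000, 0.000, -85.000)   # e.g., calibrated offset to the measurement point
print(end_tool_metrology_coordinates(relative_position, known_offset))
```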
It should be appreciated that robotic systems such as those shown in figs. 1 and 8 may have certain advantages over various alternative systems. For example, in various embodiments, systems such as those disclosed herein may be smaller and/or less expensive than alternative systems that use technologies such as laser trackers or camera-based measurements for tracking robot motion/position, and may also have higher accuracy in some embodiments. The disclosed systems also do not take up or obscure any portion of the end tool workspace, unlike alternative systems that may include a scale or fiducial on the floor or on a stage, or otherwise in the same area (e.g., in or near the end tool workspace) in which workpieces may otherwise be processed and/or inspected, etc.
Figs. 9 and 10 are similar in that they both illustrate embodiments that use active alignment to eliminate the X and Y errors previously outlined with reference to figs. 2B and 3B. Active alignment substantially eliminates these errors. The active alignment is a "closed loop" alignment process based on the signal Asig from the alignment sensor ASen. It should be understood that sag and twist are present in the movable arm structure MAC' shown in these figures (e.g., similar to that shown in figs. 2B and 3B), but the sag and twist are not illustrated, to avoid visual clutter. It should be appreciated that, in accordance with the principles disclosed herein, due to the active alignment based on the alignment sensor signal, the effects of sag and twist may be compensated or eliminated (at least with respect to the X and Y coordinates) in order to achieve the desired operational configuration. The active alignment may be performed manually or automatically at least once at a desired time during operation of the robotic system 900 (or 1000), in order to establish the desired operational configuration, or intermittently at any desired times (e.g., as the robot pose changes), or frequently or continuously in some implementations.
Regarding the Z errors caused by sag and twist in the movable arm structure MAC' during the operational configuration, the Z errors may be at least approximately corrected as previously described with reference to figs. 2B and 3B. For example, it will be appreciated that the magnitudes of the adjustments of the operational alignment actuators AAct necessary to correct the sag and twist in order to achieve the operational configuration may be known and recorded in the operational alignment subsystem circuits/routines 890 (or 190). Alternatively, the magnitude of the residual misalignment MisAng at the operational alignment actuators AAct in a known or reference state prior to such adjustments may be known and recorded. In either case, the sag/twist misalignment (e.g., a sag/twist misalignment angle or vector) proximate to the alignment sensor ASen, the distal end of the movable arm structure MAC', and/or the end tool ETL may be at least approximately known or inferred.
In some embodiments, coordinate displacements or errors associated with the Z2 and/or Z3 coordinates shown in figs. 9 and 10 may be approximated based on the sag/twist misalignment determined as outlined above, in combination with the known geometry and orientation of the various arms and bearings of the robotic system 900 (or 1000) and their mechanical characteristics (e.g., beam deflection characteristics). In such embodiments, it should be appreciated that the sag/twist misalignment determination outlined above may ultimately be traceable to and based on the residual misalignment indicated by the alignment sensor ASen. Thus, according to one type of description, in such embodiments, errors that may arise in the Z coordinates of a set of metrology position coordinates may be at least partially corrected or compensated based on the sag/twist misalignment determined as outlined above, which is ultimately based on the residual misalignment indicated by the alignment sensor ASen.
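Purely as an illustration of such an approximation, the small sketch below converts a residual sag/twist misalignment angle and an assumed lever-arm length into an approximate Z coordinate correction. The small-angle model and all numeric values are assumptions, not a method specified by this disclosure.

```python
import math

def approximate_z_error_mm(misalignment_deg: float, lever_arm_mm: float) -> float:
    """Small-angle approximation of the Z displacement at the end of a lever
    arm (e.g., from the alignment sensor to the measurement point) produced
    by a residual sag/twist misalignment angle.  Illustrative model only."""
    return lever_arm_mm * math.sin(math.radians(misalignment_deg))

# Example: a 0.05 degree residual misalignment over a 120 mm lever arm.
dz = approximate_z_error_mm(0.05, 120.0)
print(f"approximate Z correction: {dz:.4f} mm")   # ~0.1047 mm
```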
In the embodiments shown in figs. 9 and 10, the operational alignment actuator configuration AAct comprises the motion mechanisms 834 and 835, which are included in the movable arm structure MAC' and are used in the operational alignment subsystem OAS. In such embodiments, the scale imaging axis direction SIA may be actively aligned with the optical axis OA1 for one or more poses of the articulated robot 810, at any desired time during operation of the articulated robot 810. It will be appreciated that this alignment is active (i.e., established in a "closed loop" manner based on the alignment signal from the alignment sensor ASen), and that small alignment errors, corresponding to the small sag/tilt misalignment angle MisAng as previously outlined herein, may be actively corrected at any desired time during operation of the articulated robot 810. In the illustrated embodiments, the alignment errors may be actively corrected by using alignment control signals generated in the alignment control portion 892 to control the operational alignment actuator configuration AAct to adjust the alignment of the movable XY scale 870, based on the alignment signal Asig provided by the alignment sensor ASen, to provide the operational configuration of the XY scale 870 and the first imaging structure 860 in which the optical axis OA1 and the scale imaging axis direction SIA are arranged in parallel, as indicated by the alignment signal Asig. For example, the operational alignment subsystem processing circuits/routines 890 may exchange certain position and/or alignment information and/or control signals with the robot motion control and processing system 840, as indicated by the dashed line 893, in order to implement various operating principles or features disclosed herein. Many of the aspects and features illustrated in figs. 9 and 10 may generally be understood based on the foregoing explanations and descriptions of similar aspects and features in the various previous figures. Certain other aspects and features will be described in more detail below.
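The "closed loop" adjustment described above may be sketched as a simple iterative control loop that commands the two actuator rotations until the alignment signal indicates parallelism. The proportional-control form, the two-component misalignment signal, and all names and numbers below are assumptions for illustration only.

```python
def align_closed_loop(read_misalignment, adjust_actuators,
                      gain=0.8, tolerance_deg=0.001, max_iterations=50):
    """Iteratively command the operational alignment actuator configuration
    (e.g., rotations about RA4 and RA5) until the alignment sensor reports a
    residual misalignment below tolerance.  Illustrative sketch only.

    read_misalignment() -> (tilt_about_ra4_deg, tilt_about_ra5_deg)
    adjust_actuators(d_ra4_deg, d_ra5_deg) applies incremental corrections.
    """
    for _ in range(max_iterations):
        m4, m5 = read_misalignment()
        if abs(m4) < tolerance_deg and abs(m5) < tolerance_deg:
            return True   # operational configuration achieved
        adjust_actuators(-gain * m4, -gain * m5)
    return False

# Simulated hardware for the example: commanded corrections are applied perfectly.
state = {"ra4": 0.30, "ra5": -0.12}   # initial residual misalignment, degrees
ok = align_closed_loop(lambda: (state["ra4"], state["ra5"]),
                       lambda d4, d5: state.update(ra4=state["ra4"] + d4,
                                                   ra5=state["ra5"] + d5))
print(ok, state)
```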
Fig. 9 is a perspective view of a portion of a fifth exemplary embodiment of a robotic system 900 that is similar to the robotic system 800 of fig. 8, in which the first imaging structure 860 and the alignment sensor ASen of the operational alignment subsystem OAS are coupled to the stationary element STE, and the alignment sensor ASen is used to control the operational alignment of the XY scale 870 located on the moving element. It should be appreciated that, similarly to the numbering scheme described above, certain named or numbered components of fig. 9 (e.g., 8XX' or 9XX) may correspond to and/or have the same and/or similar operations as identically or similarly named or numbered counterpart components of fig. 8 or other figures (e.g., 8XX), and may be understood to be similar or identical thereto and may otherwise be understood by analogy thereto, as described below. As noted above, this naming and numbering scheme, indicating elements having analogous and/or identical design and/or function, is generally applicable throughout the various figures of the present application (e.g., figs. 1-5, 8, 9, and 10).
In the configuration of fig. 9, the stationary element STE to which the first imaging structure 860 is coupled may comprise a frame arranged above the robot 810. The movable arm structure MAC' includes the arm portions 821-825. In the embodiment shown in fig. 9, the operational alignment actuator configuration AAct comprises the motion mechanisms 834 and 835, which are included in the movable arm structure MAC' and are used in the operational alignment subsystem OAS. The XY scale 870 is coupled to the arm portion or bracket 825, and thereby to the operational alignment actuator configuration AAct, and thereby to the remainder of the movable arm structure MAC'. In other configurations, other coupling configurations may be used for coupling the XY scale 870 to the movable arm structure MAC'. In various embodiments, the position and/or orientation of the XY scale 870 as coupled to the movable arm structure MAC' may be adjustable, but may also be temporarily locked or otherwise fixed at a given position/orientation (e.g., for a series of measurements, etc.). In any case, in the operational configuration of the auxiliary metrology position coordinate determination system 850, the first imaging structure 860 may be arranged with its optical axis OA1 parallel to the scale imaging axis direction SIA, and with the scale plane located within the focus range of the first imaging structure 860 along the scale imaging axis direction SIA.
The robot 810 will only be briefly described here, since it may be of a known type. As shown in fig. 9, the first arm portion 821 (e.g., an upper base portion) is mounted to the first motion mechanism 831 (e.g., a rotary joint) at a proximal end PE1 of the first arm portion 821. The first motion mechanism 831 is located at an upper end of the lower supporting base BSE' and has a rotation axis RA1' aligned along the z axis. It will be appreciated that the optical axis OA1 and the scale imaging axis direction SIA may be aligned with the z axis, such that the first arm portion 821 rotates in a plane that is perpendicular to the optical axis OA1 and the scale imaging axis direction SIA when they are arranged in the desired operational configuration. A position sensor SEN1' (e.g., a rotary encoder) may be used to determine the angular position (e.g., the angular orientation) of the first arm portion 821. The second motion mechanism 832 is located proximate to the distal end DE1 of the first arm portion 821. The second motion mechanism 832 has a rotation axis RA2' that is nominally perpendicular to the z axis. The second arm portion 822 is mounted to the second motion mechanism 832 at a proximal end PE2 of the second arm portion 822, such that the second arm portion 822 moves about the rotation axis RA2'. A position sensor SEN2' may be used to determine the angular position A2' of the second arm portion 822. The third motion mechanism 833 is located at the distal end DE2 of the second arm portion 822. The third motion mechanism 833 has a rotation axis RA3' that is nominally perpendicular to the z axis. The third arm portion 823 is mounted to the third motion mechanism 833 at a proximal end PE3 of the third arm portion 823, such that the third arm portion 823 moves about the rotation axis RA3'. A position sensor SEN3' may be used to determine the angular position A3' of the third arm portion 823. The fourth motion mechanism 834 is located at the distal end DE3 of the third arm portion 823. The fourth motion mechanism 834 has a rotation axis RA4 that is nominally perpendicular to the z axis. The fourth arm portion 824 is mounted to the fourth motion mechanism 834 at a proximal end PE4 of the fourth arm portion 824, such that the fourth arm portion 824 rotates about the rotation axis RA4. A position sensor SEN4 may be used to determine the angular position (e.g., in a plane parallel to the z axis) of the fourth arm portion 824. The fifth motion mechanism 835 may be located at the distal end DE4 of the fourth arm portion 824 and has a rotation axis RA5, which in various embodiments may be nominally perpendicular to the rotation axis RA4. The fifth arm portion 825 (e.g., a bracket) is mounted to the fifth motion mechanism 835 at a proximal end PE5 of the fifth arm portion 825, such that the fifth arm portion 825 rotates about the rotation axis RA5. A position sensor SEN5 may be used to determine the angular position of the fifth arm portion 825 and/or the XY scale 870 about the rotation axis RA5. In some embodiments it may be desirable to arrange the scale plane of the XY scale 870 parallel to the rotation axis RA5, as described in more detail below. As shown in fig. 9, the second and third arm portions 822 and 823 may have designated center lines CL2' and CL3, respectively, that extend along the centers of the respective arm portions. An angle A2' (which may correspond, for example, to an amount of rotation of the second motion mechanism 832) may be designated as occurring between the center line CL2' of the second arm portion 822 and a plane (e.g., parallel to the scale plane in the operational configuration, which may be in an x-y plane when the optical axis OA1 is parallel to the z axis).
An angle A3' may be designated as occurring between the center line CL2' of the second arm portion 822 and the center line CL3 of the third arm portion 823 (e.g., in accordance with an amount of rotation of the third motion mechanism 833 about the third rotation axis RA3'). It should be understood that other arm portions may similarly have corresponding reference lines and/or axes, etc., for referencing certain movements, coordinates, and angles of the components of the movable arm structure MAC'.
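For illustration of how such angles and arm lengths relate to a position of a distal portion of the arm, a highly simplified planar forward-kinematics sketch is given below. The two-link model, the link lengths, and the sign conventions are assumptions, and the sketch ignores sag, twist, and the remaining joints.

```python
import math

def planar_arm_endpoint(a2_deg, a3_deg, len2_mm, len3_mm):
    """Simplified two-link sketch in the vertical plane containing the second
    and third arm portions: A2' is measured between the second-arm center line
    CL2' and the horizontal (x-y) plane, and A3' between CL2' and the third-arm
    center line CL3.  Returns (radial_mm, z_mm) of the third arm's distal end
    relative to the second motion mechanism.  All values and conventions are
    illustrative assumptions only."""
    a2 = math.radians(a2_deg)
    a23 = math.radians(a2_deg - a3_deg)   # assumed orientation of CL3 relative to horizontal
    radial = len2_mm * math.cos(a2) + len3_mm * math.cos(a23)
    z = len2_mm * math.sin(a2) + len3_mm * math.sin(a23)
    return radial, z

print(planar_arm_endpoint(a2_deg=30.0, a3_deg=45.0, len2_mm=400.0, len3_mm=350.0))
```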
In various embodiments, the movable XY scale 870 (e.g., as shown in figs. 8 and 9) may be described as being coupled to a central sub-portion of the movable arm structure MAC' (e.g., including the arm portion 823 and at least some elements proximal thereto) by a distal sub-portion that comprises a rotary element that rotates about the rotation axis RA4 (e.g., the arm portion 824 coupled to the motion mechanism 834) and a rotary element that rotates about the rotation axis RA5 (e.g., the bracket or arm portion 825 coupled to the motion mechanism 835). As previously noted, the motion mechanisms 834 and 835 may be considered to be the operational alignment actuator structure AAct and may be used to provide the desired operational alignment outlined above and described in more detail below. In the exemplary operational alignment actuator configuration shown in fig. 9, the rotation axis RA5 is nominally parallel to the scale plane of the XY scale 870 and nominally perpendicular to the scale imaging axis direction SIA. The rotation axis RA4 is nominally perpendicular to the rotation axis RA5. According to convention as used herein, two axes oriented such that the dot product of their direction vectors is zero are understood to be orthogonal or perpendicular, whether or not they intersect. It will be appreciated that this arrangement of rotation axes allows simple and convenient motion control and sensor processing, but it is exemplary only and not limiting. Although in this embodiment the rotary elements of the operational alignment actuator structure AAct are included in the movable arm structure MAC', in other embodiments the rotary elements (generally, first and second rotary elements) may be included in a discrete operational alignment actuator structure located proximate to the distal end of a movable arm structure MAC (e.g., as described below with reference to figs. 12 and 13).
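The orthogonality convention stated above (a zero dot product of the direction vectors, whether or not the axes intersect) may be expressed directly; the vectors below are arbitrary examples.

```python
def axes_orthogonal(dir_a, dir_b, tol=1e-9):
    """Two axes are treated as orthogonal/perpendicular when the dot product
    of their direction vectors is zero, whether or not the axes intersect."""
    return abs(sum(a * b for a, b in zip(dir_a, dir_b))) < tol

print(axes_orthogonal((0.0, 1.0, 0.0), (1.0, 0.0, 0.0)))  # True  (e.g., RA5 vs. RA4)
print(axes_orthogonal((0.0, 1.0, 0.0), (0.0, 1.0, 1.0)))  # False
```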
In the embodiments shown in figs. 9 (and 10), the rotation axis RA4 is parallel to one or more other rotation axes of the movable arm structure MAC' (e.g., RA2, RA3). It should be appreciated that, in this case, if the rotation axis RA4 is counter-rotated in a direction opposite and equal to the angular rotations about the parallel rotation axes in the movable arm structure MAC', then, if desired, the desired operational configuration of the auxiliary metrology position coordinate determination system 850 may be maintained throughout various movements or positions of the robot 810. More generally, the XY scale 870 may be locked or adjusted/rotated to different fixed orientations/positions relative to the movable arm structure MAC', in order to achieve a desired orientation/position for a particular measurement.
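The counter-rotation condition described above (RA4 rotated opposite and equal to the net angular rotation about the parallel axes) may be sketched as follows; the joint-angle values are arbitrary examples.

```python
def counter_rotation_deg(parallel_joint_rotations_deg):
    """Command for the rotation about RA4 that is opposite and equal to the
    summed angular rotation about the parallel axes (e.g., RA2, RA3) in the
    movable arm structure, so that a previously established operational
    configuration is nominally maintained.  Illustrative sketch only."""
    return -sum(parallel_joint_rotations_deg)

# Example: RA2 rotates +12 degrees and RA3 rotates -5 degrees.
print(counter_rotation_deg([12.0, -5.0]))   # -7.0 degrees commanded about RA4
```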
In various embodiments, the end tool ETL may be mounted (e.g., coupled) to the end tool mounting configuration ETMC proximate to the distal end DE5 of the fifth arm portion 825. The end tool ETL may be designated as having an end tool axis EA (e.g., passing through the middle of and/or along a central axis of the end tool stylus ETST). In the illustrated embodiment, the end tool axis EA passes through the end tool position ETP, has a known coordinate position offset from the XY scale 870 (e.g., including a z coordinate position offset component, as indicated by the offset LoffMP), and is parallel to the scale imaging axis direction SIA in the operational configuration (e.g., such that the end tool ETL with the stylus ETST is oriented parallel to the scale imaging axis direction SIA). As previously outlined with reference to figs. 2A-2B, there may be a known coordinate position offset between the end tool position ETP and the XY scale 870. For example, the XY scale 870 may have a designated reference point (e.g., at a center or an edge of the XY scale 870) that has a known coordinate position offset (e.g., a known distance in a plane parallel to the scale plane, or otherwise) from the end tool axis EA (e.g., and correspondingly from the end tool position ETP). In various embodiments, such a known coordinate position offset may be expressed in terms of known offset components.
As previously described, the known coordinate position offset between the end tool position ETP and the XY scale 870 may be used as part of the process for determining the metrology position coordinates of the end tool position ETP. More specifically, as described above, the auxiliary metrology position coordinate determination system 850 may be configured such that the metrology position coordinate processing portion 885 determines metrology position coordinates that are indicative of the relative position between the XY scale 870 and the first reference position REF1 (i.e., as defined by the stationary first imaging structure 860), based on determining the image position of the identified at least one respective imageable feature (i.e., of the XY scale 870) in the acquired image. The auxiliary metrology position coordinate determination system 850 may further be configured to determine the metrology position coordinates of the end tool position ETP and/or the measurement point position MP of the end tool, based on the determined metrology position coordinates that are indicative of the relative position (i.e., between the XY scale 870 and the first reference position REF1) and a known coordinate position offset between the end tool position ETP and/or the measurement point position MP of the end tool and the movable XY scale 870. In one specific example embodiment, the known coordinate position offset (e.g., expressed in terms of known offset components, such as a known x offset, a known y offset, and a known z offset) may be added to or otherwise combined with the determined metrology position coordinates that are indicative of the relative position (i.e., between the XY scale 870 and the first reference position REF1), in order to determine the metrology position coordinates of the measurement point position MP and/or the end tool position ETP of the end tool ETL.
As one particular example position coordinate configuration, in embodiments in which the scale imaging axis direction SIA is parallel to the z axis in the operational configuration, the XY scale 870 may be designated as having an origin at X0, Y0, Z0 (e.g., the origin position at the center of the scale may have scale coordinate values of 0, 0, 0). The reference position REF1 (i.e., as defined by the stationary first imaging structure 860) may be at metrology coordinates X1, Y1, Z1, and the center of the corresponding field of view FOV1 (e.g., corresponding to the acquired image) may be at metrology coordinates X1, Y1, Z0. The location at which the end tool axis EA intersects an x-y plane extending from the XY scale 870 may be designated as having relative metrology position coordinates X2, Y2. The end tool position ETP may be designated as having metrology position coordinates X2, Y2, Z2. In various embodiments, the end tool ETL may have a measurement point MP (e.g., at the end of an end tool stylus ETST, for contacting a workpiece), which may be designated as having metrology position coordinates X3, Y3, Z3. In embodiments in which the measurement point MP of the end tool ETL does not vary in the x or y directions relative to the rest of the end tool, and the end tool axis EA is parallel to the z axis in the operational configuration, the X3 and Y3 coordinates may be equal to the X2 and Y2 coordinates, respectively.
In a particular exemplary embodiment, the acquired image may be analyzed by the metrology position coordinate processing portion 885 to determine the scale coordinates corresponding to the metrology position coordinates X1, Y1, which correspond to the center of the field of view FOV1 of the stationary first imaging structure 860. Such a determination may be made in accordance with standard camera/scale image processing techniques (e.g., for determining the location of a camera relative to a scale). Various examples of such techniques are described in U.S. patent nos. 6,781,694; 6,937,349; 5,798,947; 6,222,940; and 6,640,008, each of which is incorporated herein by reference in its entirety. In various embodiments, such techniques may be used to determine the location of the field of view FOV1 (e.g., corresponding to the position of the camera) within the scale range (e.g., within the XY scale 870), as described above with reference to figs. 4 and 5. In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 870 and the related respective known XY scale coordinate location. Such a determination may correspond to determining the metrology position coordinates that are indicative of the relative position between the XY scale 870 and the first reference position REF1 (i.e., as defined by the stationary first imaging structure 860). The relative coordinates of the end tool position ETP and/or the measurement point position MP of the end tool ETL may then be determined in accordance with the known coordinate position offset between the end tool position ETP and/or the measurement point position MP and the XY scale 870 (e.g., by adding the known x, y, and z position offset values to X1, Y1, and Z0 in order to determine X2, Y2, Z2 and/or X3, Y3, Z3).
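One simplified version of such a camera/scale image-processing determination is sketched below: a single imageable feature with a known XY scale coordinate is located in the acquired image, and the scale coordinates imaged at the field-of-view center are inferred from its pixel offset and the imaging magnification. The pixel-pitch model (no rotation or distortion) and every numeric value are illustrative assumptions, not the techniques of the incorporated patents.

```python
def fov_center_scale_coords(feature_scale_xy_mm, feature_pixel_xy,
                            image_size_px=(2048, 2048), mm_per_pixel=0.005):
    """Infer the XY scale coordinates imaged at the center of the camera field
    of view FOV1 from one identified imageable feature whose known scale
    coordinate position and measured pixel position are given.  Assumes a
    simple scaled, axis-aligned mapping; illustrative only."""
    cx_px = image_size_px[0] / 2.0
    cy_px = image_size_px[1] / 2.0
    dx_mm = (cx_px - feature_pixel_xy[0]) * mm_per_pixel
    dy_mm = (cy_px - feature_pixel_xy[1]) * mm_per_pixel
    return (feature_scale_xy_mm[0] + dx_mm, feature_scale_xy_mm[1] + dy_mm)

# Example: a feature known to lie at scale position (250.000, 175.000) mm is
# detected at pixel (900, 1200) in the acquired image.
x1, y1 = fov_center_scale_coords((250.000, 175.000), (900.0, 1200.0))
print(f"X1 = {x1:.3f} mm, Y1 = {y1:.3f} mm")
```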
Summarizing the operation of the embodiment shown in fig. 9, in the desired operational configuration provided as described above, the residual misalignment MisAng is reduced to zero, based on and as indicated by the alignment sensor ASen. As a result, errors that depend on the residual misalignment MisAng (e.g., as previously outlined with reference to figs. 2A-3B) are substantially prevented and do not require correction or compensation. For example, various offset and/or misalignment errors between the various components may be determined and/or stored as calibration data and used as outlined herein, without requiring the additional corrections or compensation that would otherwise arise from a non-zero residual misalignment.
Fig. 10 is an isometric view of a portion of a sixth exemplary embodiment of a robotic system 1000 that is similar to the robotic system 900 of fig. 9, except that the first imaging structure 860 and the alignment sensor ASen of the operational alignment subsystem OAS are coupled to a moving element of the movable arm structure MAC', and the alignment sensor ASen is used to control the operational alignment of the first imaging structure 860 relative to the XY scale 870 located on the stationary element STE.
In the configuration of fig. 10, the stationary element STE to which the XY scale 870 is coupled may comprise a frame arranged above the robot 810. In the embodiment shown in fig. 10, the operational alignment actuator configuration AAct comprises the motion mechanisms 834 and 835, which are included in the movable arm structure MAC' and are used in the operational alignment subsystem OAS, as previously described with reference to fig. 9. The first imaging structure 860 and the alignment sensor ASen are coupled to the arm portion or bracket 825, and thereby to the operational alignment actuator configuration AAct, and thereby to the remainder of the movable arm structure MAC'. In other configurations, other coupling configurations may be used for coupling the first imaging structure 860 to the movable arm structure MAC'. In various embodiments, the position and/or orientation of the first imaging structure 860 as coupled to the movable arm structure MAC' may be adjustable, but may also be temporarily locked or otherwise fixed at a given position/orientation (e.g., for a series of measurements, etc.). In any case, in the operational configuration of the auxiliary metrology position coordinate determination system 850, the first imaging structure 860 may be arranged with its optical axis OA1 parallel to the scale imaging axis direction SIA, and with the scale plane located within the focus range of the first imaging structure 860 along the scale imaging axis direction SIA.
The robot 810 may be substantially as previously described with reference to fig. 9. In various embodiments, the movable first imaging structure 860 (e.g., as shown in fig. 10) may be described as being coupled to a central sub-portion of the movable arm structure MAC' (e.g., including the arm portion 823 and at least some elements proximal thereto) by a distal sub-portion that comprises a rotary element that rotates about the rotation axis RA4 (e.g., the arm portion 824 coupled to the motion mechanism 834) and a rotary element that rotates about the rotation axis RA5 (e.g., the bracket or arm portion 825 coupled to the motion mechanism 835). As previously noted, the motion mechanisms 834 and 835 may be considered to be the operational alignment actuator structure AAct and may be used to provide the desired operational alignment outlined above and described in more detail below. In some embodiments it may be desirable to arrange the optical axis OA1 of the first imaging structure 860 perpendicular to the rotation axis RA5, for example as shown in fig. 10 and described in more detail below. The rotation axis RA4 is nominally perpendicular to the rotation axis RA5. According to convention as used herein, two axes oriented such that the dot product of their direction vectors is zero are understood to be orthogonal or perpendicular, whether or not they intersect. It will be appreciated that this arrangement of rotation axes allows simple and convenient motion control and sensor processing, but it is exemplary only and not limiting. Although in this embodiment the rotary elements of the operational alignment actuator structure AAct are included in the movable arm structure MAC', in other embodiments the rotary elements (generally, first and second rotary elements) may be included in a discrete operational alignment actuator structure located proximate to the distal end of a movable arm structure MAC (e.g., as described below with reference to figs. 12 and 13).
In the embodiment shown in fig. 10, the rotation axis RA4 is parallel to one or more other rotation axes of the movable arm structure MAC' (e.g., RA2, RA3). It should be appreciated that, in this case, if the rotation axis RA4 is counter-rotated in a direction opposite and equal to the angular rotations about the parallel rotation axes in the movable arm structure MAC', then, if desired, the desired operational configuration of the auxiliary metrology position coordinate determination system 850 may be maintained throughout various movements or positions of the robot 810. More generally, the first imaging structure 860 may be locked or adjusted/rotated to different fixed orientations/positions relative to the movable arm structure MAC', in order to achieve a desired orientation/position for a particular measurement.
In various embodiments, the end tool ETL may be mounted (e.g., coupled) proximate to the distal end DE5 of the fifth arm portion 825, in a configuration having functional features and characteristics substantially as previously described with reference to fig. 9 (except for minor differences in various offset dimensions and the like). As previously outlined with reference to figs. 3A-3B, there may be a known coordinate position offset (e.g., including a z coordinate position offset component, as indicated by the offset LoffMP) between the end tool position ETP and the first imaging structure 860. For example, the first imaging structure 860 may have a designated reference point (e.g., at the center of its lens, marked at the metrology coordinate position X1, Y1, Z1) that has a known coordinate position offset relative to the end tool position ETP. In various embodiments, such a known coordinate position offset may be expressed in terms of known offset components.
As previously described, the known coordinate position offset between the end tool position ETP and the first imaging structure 860 may be used as part of the process for determining the metrology position coordinates of the end tool position ETP. More specifically, as described above, the auxiliary metrology position coordinate determination system 850 may be configured such that the metrology position coordinate processing portion 885 determines metrology position coordinates that are indicative of the relative position between the first imaging structure 860 and the first reference position REF1 (i.e., as defined by the stationary XY scale 870), based on determining the image position of the identified at least one respective imageable feature (i.e., of the XY scale 870) in the acquired image. The auxiliary metrology position coordinate determination system 850 may further be configured to determine the metrology position coordinates of the end tool position ETP and/or the measurement point position MP, based on the determined metrology position coordinates that are indicative of the relative position (i.e., between the first imaging structure 860 and the first reference position REF1) and a known coordinate position offset between the end tool position ETP and/or the measurement point position MP of the end tool and the movable first imaging structure 860. In one specific example embodiment, the known coordinate position offset (e.g., expressed in terms of known offset components, such as a known x offset, a known y offset, and a known z offset) may be added to or otherwise combined with the determined metrology position coordinates that are indicative of the relative position (i.e., between the first imaging structure 860 and the first reference position REF1), in order to determine the metrology position coordinates of the measurement point position MP and/or the end tool position ETP of the end tool ETL.
As one particular example position coordinate configuration, in embodiments in which the optical axis OA1 is parallel to the scale imaging axis direction SIA and the z axis in the operational configuration, the XY scale 870 may be designated as having an origin at the reference position REF1, at coordinates X0, Y0, Z0. The origin position at the center of the scale may have scale coordinate values of 0, 0, 0, such that the scale coordinate system and the metrology coordinate system coincide in this particular embodiment. A position along the optical axis of the first imaging structure 860 at the center of the field of view FOV1 (e.g., corresponding to the acquired image) may be located at metrology coordinates X1, Y1, Z0. The reference point of the first imaging structure 860 may then be understood to be at metrology coordinates X1, Y1, Z1 in the desired operational configuration. The end tool position ETP may be designated as having metrology position coordinates X2, Y2, Z2. In various embodiments, the end tool ETL may have a measurement point MP (e.g., at the end of an end tool stylus ETST, for contacting a workpiece), which may be designated as having metrology position coordinates X3, Y3, Z3. In embodiments in which the measurement point MP and the end tool position ETP of the end tool ETL do not vary in the x or y directions relative to the reference point of the first imaging structure 860, the X2, Y2 and X3, Y3 coordinates may be equal to the X1, Y1 coordinates, respectively.
In a particular exemplary embodiment, the acquired image may be analyzed by the metrology position coordinate processing portion 885 to determine the scale coordinates corresponding to the metrology position coordinates X1, Y1, which correspond to the center of the field of view FOV1 of the movable first imaging structure 860. Such a determination may be made in accordance with standard camera/scale image processing techniques (e.g., for determining the location of a camera relative to a scale). Various examples of such techniques are described in U.S. patent nos. 6,781,694; 6,937,349; 5,798,947; 6,222,940; and 6,640,008, each of which is incorporated herein by reference in its entirety. In various embodiments, such techniques may be used to determine the location of the field of view FOV1 (e.g., corresponding to the position of the camera) within the scale range (e.g., within the XY scale 870), as described above with reference to figs. 4 and 5. In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 870 and the related respective known XY scale coordinate location. Such a determination may correspond to determining the metrology position coordinates that are indicative of the relative position between the first imaging structure 860 and the first reference position REF1 (i.e., as defined by the stationary XY scale 870). The relative x and y coordinates of the end tool position ETP and/or the measurement point position MP of the end tool ETL may then be determined in accordance with the known coordinate position offset between the end tool position ETP and/or the measurement point position MP and the first imaging structure 860 (e.g., by adding the known x, y, and z position offset values to X1, Y1, and Z1 in order to determine X2, Y2, Z2 and/or X3, Y3, Z3).
Summarizing the operation of the embodiment shown in fig. 10, in the desired operational configuration provided as described above, the residual misalignment MisAng is reduced to zero, based on and as indicated by the alignment sensor ASen. As a result, errors that depend on the residual misalignment MisAng (e.g., as previously outlined with reference to figs. 2A-3B) are substantially prevented and do not require correction or compensation. For example, various offset and/or misalignment errors between the various components may be determined and/or stored as calibration data and used as outlined herein, without requiring the additional corrections or compensation that would otherwise arise from a non-zero residual misalignment.
The embodiments described above with reference to figs. 9 and 10 use an operational alignment actuator configuration AAct that is included in the movable arm structure MAC'. It will be appreciated that, if desired (for example, if an associated movable arm structure does not already include suitable actuators usable by the operational alignment subsystem OAS), the operational alignment subsystem OAS may instead be provided with a discrete set of operational alignment actuators AAct. Such a discrete set of operational alignment actuators may operate substantially as outlined above with reference to figs. 9 and 10, and may provide similar benefits in accordance with the principles disclosed herein. Figs. 12 and 13 show such embodiments.
Fig. 12 is an isometric view showing a portion of an embodiment of a robotic system 1200 that is similar to that of figs. 2A and 2B, in which the XY scale 870, the alignment sensor ASen, and the operational alignment actuator structure AAct of the operational alignment subsystem OAS are coupled to the moving element, and the alignment sensor ASen and the operational alignment actuator structure AAct are used to control the operational alignment of the XY scale 870 relative to the imaging structure 860 located on the stationary element STE. The features and operations associated with the operational alignment subsystem OAS are similar to those outlined above with reference to fig. 9. It should be appreciated that, similarly to the numbering scheme described above, certain named or numbered components of fig. 12 (e.g., 1XX, 8XX') may correspond to and/or have the same and/or similar operations as identically or similarly named or numbered counterpart components of figs. 2A, 2B, and 9 or other figures (e.g., 1XX, 8XX), and may be understood to be similar or identical thereto and may otherwise be understood by analogy thereto, as described below. As noted above, this naming and numbering scheme, indicating elements having analogous and/or identical design and/or function, is generally applicable throughout the various figures of the present application (e.g., figs. 1-5, 8, 9, 10, 12, and 13).
In the configuration of fig. 12, the stationary element STE to which the first imaging structure 860 is coupled may comprise a frame arranged above the robot 810. The movable arm structure MAC may be the same as that shown in fig. 2A. In the embodiment shown in fig. 12, the operational alignment actuator structure AAct includes the discrete motion mechanisms 834' and 835', which may be regarded as part of the operational alignment subsystem OAS. For example, the operational alignment actuator structure AAct may be coupled to an arm portion or bracket 825, which may be coupled to the end tool ETL and the XY scale 870 in an end tool configuration ETCN that is mechanically and electrically connected to the movable arm structure MAC at the end tool mounting configuration ETMC. In the illustrated embodiment, the discrete motion mechanisms 834' and 835' need only provide motion over relatively small angular ranges about the rotation axes RA4' and RA5', respectively. They may therefore be integrated in a compact, low-profile two-axis actuator, as schematically illustrated in fig. 12. Teachings relevant to such actuators are disclosed, for example, in U.S. patent nos. 5,583,691 and 9,323,025, each of which is incorporated herein by reference in its entirety. It should be appreciated that the control and position signals from the discrete motion mechanisms 834' and 835' may be combined with the control and position signals of a robotic system that includes a movable arm structure MAC, as previously described and/or as known in the art.
In various embodiments, the position and/or orientation of the XY scale 870, as coupled to the operational alignment actuator structure AAct and thereby to the movable arm structure MAC, may be adjustable, but may also be temporarily locked or otherwise fixed at a given position/orientation (e.g., for a series of measurements, etc.). In any case, in the operational configuration of the auxiliary metrology position coordinate determination system 850, the first imaging structure 860 may be arranged with its optical axis OA1 parallel to the scale imaging axis direction SIA, and with the scale plane located within the focus range of the first imaging structure 860 along the scale imaging axis direction SIA.
As shown in fig. 12, the motion mechanism 834' has an axis of rotation RA4, and in some embodiments, the axis of rotation RA4 can be nominally perpendicular to the Z-axis (e.g., without significant sag or twist). The motion mechanism 834 'may include an actuator portion AP824 (e.g., a plate) coupled to the motion mechanism 834' such that the actuator portion AP824 rotates about a rotational axis RA 4. A position sensor of the motion mechanism 834' may be used to determine the angular position (e.g., in a plane parallel to the Z-axis) of the actuator portion AP 824.
The motion mechanism 835' may be coupled to the actuator portion AP824 and has an axis of rotation RA5, which in various embodiments may be nominally perpendicular to the axis of rotation RA4. The motion mechanism 835' may include an actuator portion AP825 (e.g., a plate) coupled to the motion mechanism 835' such that the actuator portion AP825 rotates about the rotational axis RA5. An arm portion 825' (e.g., a bracket) may be mounted to the actuator portion AP825. A position sensor of the motion mechanism 835' may be used to determine the angular position of the actuator portion AP825, the arm portion 825', and/or the XY scale 870 about the rotational axis RA5. In some embodiments, it may be desirable to arrange the scale plane of the XY scale 870 parallel to the axis of rotation RA5.
In various embodiments, the movable XY scale 870 may be described as being coupled to a central sub-portion of the movable arm structure MAC (e.g., including the arm 130 and at least some elements adjacent thereto) by a discrete operational alignment actuator structure AAct, which includes a rotating element that rotates about the rotational axis RA4 (e.g., the actuator portion AP824 coupled to the motion mechanism 834') and a rotating element that rotates about the rotational axis RA5 (e.g., the actuator portion AP825 coupled to the motion mechanism 835' and/or the bracket or arm portion 825'). In the exemplary operational alignment actuator structure AAct shown in fig. 12, the axis of rotation RA5 is nominally parallel to the scale plane of the XY scale 870 and nominally perpendicular to the scale imaging axis direction SIA. The axis of rotation RA4 is nominally perpendicular to the axis of rotation RA5. It will be appreciated that this arrangement of the axes of rotation allows for simple and convenient motion control and sensor processing, but it is merely exemplary and not limiting.
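As a brief illustration of this two-axis arrangement, the following minimal sketch (in Python, with hypothetical function names and an assumed assignment of RA4 and RA5 to local frame axes) composes the two measured actuator rotations to estimate the resulting orientation of the scale normal, i.e., the scale imaging axis direction SIA:

```python
import numpy as np

def rot_about_ra4(angle_rad):
    # Rotation about RA4, modeled here as the X axis of a local frame (an assumption).
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])

def rot_about_ra5(angle_rad):
    # Rotation about RA5, modeled here as the Y axis of the same local frame.
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def scale_imaging_axis_direction(angle_ra4, angle_ra5):
    """Estimate the scale normal (SIA direction) produced by rotating a nominal
    +Z normal first about RA5 and then about RA4, using the angular positions
    reported by the position sensors of motion mechanisms 834' and 835'."""
    nominal_normal = np.array([0.0, 0.0, 1.0])
    return rot_about_ra4(angle_ra4) @ rot_about_ra5(angle_ra5) @ nominal_normal
```

Such an estimate could be compared with the direction indicated by the alignment signal ASig when choosing corrective rotations, although the actual control approach is not limited to this form.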
In the embodiment shown in fig. 12, the movable arm structure MAC is configured such that once the desired operating configuration of the auxiliary metrology position coordinate determination system 850 is established by using the alignment sensor ASen and the operational alignment actuator structure AAct, the desired operating configuration may be nominally maintained during various movements or positions of the robot 810, if desired. If it is desired to ensure the best possible alignment in the operating configuration, the operational alignment may be adjusted at any desired time by adjusting the position of the operational alignment actuator structure AAct to provide the desired operational alignment, based on and as indicated by the alignment sensor ASen.
As previously described, a known coordinate position offset between the measurement point position MP and/or the end tool position ETP of the end tool and the XY scale 870 may be used as part of the process of determining the metrology position coordinates of the end tool position ETP. More specifically, as described above, the auxiliary metrology position coordinate determination system 850 may be configured such that the metrology position coordinate processing portion 885 determines metrology position coordinates that give the relative position between the XY scale 870 and the first reference position REF1 (i.e., defined by the fixed first imaging structure 860) based on determining the image position of the identified at least one respective imageable feature (i.e., of the XY scale 870) in the acquired image. The auxiliary metrology position coordinate determination system 850 may also be configured to determine metrology position coordinates for the measurement point position MP and/or the end tool position ETP of the end tool based on the determined metrology position coordinates that give the relative position (i.e., between the XY scale 870 and the first reference position REF1) and the known coordinate position offset between the measurement point position MP and/or the end tool position ETP of the end tool and the movable XY scale 870. In one particular example embodiment, the known coordinate position offset (e.g., expressed in known offset components, such as a known x-offset, a known y-offset and a known z-offset) may be added to or otherwise combined with the determined metrology position coordinates that give the relative position (i.e., between the XY scale 870 and the first reference position REF1) in order to determine the metrology position coordinates of the measurement point position MP and/or the end tool position ETP of the end tool ETL.
In a particular exemplary embodiment, the acquired images may be analyzed by the metrology position coordinate processing section 885 to determine scale coordinates corresponding to metrology position coordinates X1, Y1, which correspond to the center of the field of view FOV1 of the fixed first imaging structure 860. Such a determination may be made in accordance with standard camera/scale image processing techniques (e.g., for determining the position of a camera relative to a scale). In various embodiments, such techniques may be used to determine the position of the field of view FOV1 (e.g., corresponding to the position of the camera) within the scale range (e.g., within the XY scale 870), as described above with reference to fig. 4 and 5. In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 870 and an associated respective known XY scale coordinate position. Such a determination may correspond to determining metrology position coordinates that give the relative position between the XY scale 870 and the first reference position REF1 (i.e., defined by the fixed first imaging structure 860). The relative X2, Y2 coordinates (i.e., the coordinates of the end tool position ETP) may then be determined from the known coordinate position offset between the measurement point position MP of the end tool ETL and/or the end tool position ETP and the XY scale 870 (e.g., adding the known X, Y and Z position offset values to X1, Y1 and Z0 to determine X2, Y2, Z2 and/or X3, Y3, Z3).
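For illustration only, the following minimal Python sketch (with hypothetical names and values) shows how scale-derived coordinates for the field-of-view center could be combined with a known coordinate position offset to obtain coordinates for the end tool position, as described above:

```python
def end_tool_coordinates(scale_fov_center, known_offset):
    """Combine metrology position coordinates (X1, Y1, Z0) determined from the
    acquired XY scale image with a known coordinate position offset
    (x-offset, y-offset, z-offset) between the XY scale and the end tool
    position ETP, yielding (X2, Y2, Z2)."""
    x1, y1, z0 = scale_fov_center
    dx, dy, dz = known_offset
    return (x1 + dx, y1 + dy, z0 + dz)

# Hypothetical example values (mm): FOV1 center at scale coordinates
# (125.004, 87.502) with nominal Z0 = 0, and a calibrated offset of
# (10.000, -4.500, -35.250) to the end tool point.
print(end_tool_coordinates((125.004, 87.502, 0.0), (10.000, -4.500, -35.250)))
# -> (135.004, 83.002, -35.25)
```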
Summarizing the operation of the embodiment shown in fig. 12, in the desired operating configuration provided as described above, the residual misalignment MisAng is reduced to zero, as indicated by the alignment sensor ASen. Thus, errors that depend on the residual misalignment MisAng (e.g., as previously outlined with reference to FIGS. 2A-3B) are substantially prevented and do not require correction or compensation. For example, various offsets and/or misalignment errors between various components may be determined and/or saved as calibration data and used as outlined herein, without requiring the additional correction or compensation that would otherwise result from a non-zero residual misalignment.
Fig. 13 is an isometric view showing a portion of an embodiment of a robotic system 1300 similar to that of fig. 12, 3A and 3B, wherein an imaging structure 860 and an alignment sensor ASen and an alignment actuator structure AAct of an operational alignment subsystem OAS are coupled to the movable element, and the alignment sensor ASen and the alignment actuator structure AAct are used to control the operational alignment of the imaging structure 860 relative to an XY scale 870 located on the stationary element STE.
The features and operations associated with the operational alignment subsystem OAS are similar to those outlined above with reference to fig. 10 and 12. It should be appreciated that, consistent with the numbering scheme described above, certain named or numbered components of fig. 13 (e.g., 1XX, 8XX') may correspond to and/or operate the same as or similarly to identically or similarly named or numbered components of fig. 3A, 3B, 10 and 12 (e.g., 1XX, 8XX) or other figures, and may be understood by analogy thereto, except as otherwise described below.
In the configuration of fig. 13, the fixed element STE coupled with the XY scale 870 may include a frame disposed above the robot 810. The movable arm structure MAC may be the same as that shown in fig. 2A or 3B. In the embodiment shown in fig. 13, the operational alignment actuator structure AAct includes discrete motion mechanisms 834' and 835', which may be part of the operational alignment subsystem OAS. For example, the operational alignment actuator structure AAct may be coupled to an arm or bracket 825, which arm or bracket 825 may be coupled to the first imaging structure 860 and an end tool ETL in an end tool structure ETCN that is mechanically and electrically connected to the movable arm structure MAC at the end tool mounting structure ETMC. In the illustrated embodiment, the discrete motion mechanisms 834' and 835' need only provide motion over a relatively small angular range about the rotational axes RA4' and RA5', respectively. They may therefore be implemented as a low-profile, compact two-axis actuator, as schematically shown in fig. 13. Teachings relating to such actuators are disclosed in, for example, U.S. patent nos. 5,583,691 and 9,323,025, the entire contents of which are incorporated herein by reference. It should be appreciated that the control and position signals from the discrete motion mechanisms 834' and 835' may be combined with the control and position signals of a robotic system that includes a movable arm structure MAC, as previously described and/or as known in the art.
In various embodiments, the position and/or orientation of the first imaging structure 860 (as coupled to the operational alignment actuator structure AAct and thus to the movable arm structure MAC) may be adjustable, but it may also be temporarily locked or otherwise fixed at a given position/orientation (e.g., for a series of measurements, etc.). In any case, in an operating configuration of the auxiliary metrology position coordinate determination system 850, the first imaging structure 860 may be arranged such that the optical axis OA1 of the first imaging structure 860 is parallel to the scale imaging axis direction SIA, and the scale plane is located within the focus range of the first imaging structure 860 along the scale imaging axis direction SIA.
As shown in fig. 13, the motion mechanism 834' has an axis of rotation RA4, and in some embodiments, the axis of rotation RA4 can be nominally perpendicular to the Z-axis (e.g., without significant sag or twist). The motion mechanism 834' may include an actuator portion AP824 (e.g., a plate) coupled to the motion mechanism 834' such that the actuator portion AP824 rotates about the rotational axis RA4. A position sensor of the motion mechanism 834' may be used to determine the angular position (e.g., in a plane parallel to the Z-axis) of the actuator portion AP824.
The motion mechanism 835' may be coupled to the actuator portion AP824 and has an axis of rotation RA5, which in various embodiments may be nominally perpendicular to the axis of rotation RA4. The motion mechanism 835' may include an actuator portion AP825 (e.g., a plate) coupled to the motion mechanism 835' such that the actuator portion AP825 rotates about the rotational axis RA5. An arm portion 825' (e.g., a bracket) may be mounted to the actuator portion AP825. A position sensor of the motion mechanism 835' may be used to determine the angular position of the actuator portion AP825, the arm portion 825', and/or the first imaging structure 860 about the rotational axis RA5. In some embodiments, it may be desirable to arrange the optical axis OA1 of the first imaging structure 860 perpendicular to the rotational axis RA5.
In various embodiments, the movable first imaging structure 860 may be described as being coupled to a central sub-portion of the movable arm structure MAC (e.g., including the arm 130 and at least some elements adjacent thereto) by a discrete operational alignment actuator structure AAct, which includes a rotating element (e.g., the actuator portion AP824 coupled to the motion mechanism 834') that rotates about the rotation axis RA4 and a rotating element (e.g., the actuator portion AP825 coupled to the motion mechanism 835' and/or the bracket or arm portion 825') that rotates about the rotation axis RA5. In the exemplary operational alignment actuator structure AAct shown in fig. 13, the axis of rotation RA5 is nominally perpendicular to and aligned with the optical axis OA1 of the first imaging structure 860. The axis of rotation RA4 is nominally perpendicular to the axis of rotation RA5. It will be appreciated that this arrangement of the axes of rotation allows for simple and convenient motion control and sensor processing, but it is merely exemplary and not limiting.
In the embodiment shown in fig. 13, the movable arm structure MAC is configured such that once the desired operating configuration of the auxiliary metrology position coordinate determination system 850 is established by using the alignment sensor ASen and the operational alignment actuator structure AAct, the desired operating configuration may be nominally maintained during various movements or positions of the robot 810, if desired. If it is desired to ensure the best possible alignment in the operating configuration, the operational alignment may be adjusted at any desired time by adjusting the position of the operational alignment actuator structure AAct to provide the desired operational alignment, based on and as indicated by the alignment sensor ASen.
As previously described, a known coordinate position offset between the measurement point position MP and/or the end tool position ETP of the end tool and the first imaging structure 860 may be used as part of the process of determining the metrology position coordinates of the end tool position ETP. More specifically, as described above, the auxiliary metrology position coordinate determination system 850 may be configured such that the metrology position coordinate processing portion 885 determines metrology position coordinates that give the relative position between the first imaging structure 860 and the first reference position REF1 (i.e., defined by the fixed XY scale 870) based on determining the image position of the identified at least one respective imageable feature (i.e., of the XY scale 870) in the acquired image. The auxiliary metrology position coordinate determination system 850 may also be configured to determine metrology position coordinates for the measurement point position MP and/or the end tool position ETP of the end tool based on the determined metrology position coordinates that give the relative position (i.e., between the first imaging structure 860 and the first reference position REF1) and the known coordinate position offset between the measurement point position MP and/or the end tool position ETP of the end tool and the movable first imaging structure 860. In one particular example embodiment, the known coordinate position offset (e.g., expressed in known offset components, such as a known x-offset, a known y-offset and a known z-offset) may be added to or otherwise combined with the determined metrology position coordinates that give the relative position (i.e., between the first imaging structure 860 and the first reference position REF1) in order to determine the metrology position coordinates of the measurement point position MP and/or the end tool position ETP of the end tool ETL.
In a particular exemplary embodiment, the acquired images may be analyzed by the metrology position coordinate processing section 885 to determine scale coordinates corresponding to metrology position coordinates X1, Y1, which correspond to the center of the field of view FOV1 of the first imaging structure 860. Such a determination may be made in accordance with standard camera/scale image processing techniques (e.g., for determining the position of a camera relative to a scale). In various embodiments, such techniques may be used to determine the position of the field of view FOV1 (e.g., corresponding to the position of the camera) within the scale range (e.g., within the XY scale 870), as described above with reference to fig. 4 and 5. In various embodiments, such a determination may include identifying at least one respective imageable feature included in the acquired image of the XY scale 870 and an associated respective known XY scale coordinate position. Such a determination may correspond to determining metrology position coordinates that give the relative position between the first imaging structure 860 and the first reference position REF1 (i.e., defined by the fixed XY scale 870). The relative X2, Y2 and/or X3, Y3 coordinates may then be determined from the known coordinate position offset between the measurement point position MP of the end tool ETL and/or the end tool position ETP and the first imaging structure 860 (e.g., adding the known X, Y and Z position offset values to X1, Y1 and Z0 to determine X2, Y2, Z2 and/or X3, Y3, Z3).
Summarizing the operation of the embodiment shown in fig. 13, in the desired operating configuration provided as described above, the residual misalignment MisAng is reduced to zero, as indicated by the alignment sensor ASen. Thus, errors that depend on the residual misalignment MisAng (e.g., as previously outlined with reference to FIGS. 2A-3B) are substantially prevented and do not require correction or compensation. For example, various offsets and/or misalignment errors between various components may be determined and/or saved as calibration data and used as outlined herein, without requiring the additional correction or compensation that would otherwise result from a non-zero residual misalignment.
Fig. 11 is a flow diagram illustrating an exemplary embodiment of a routine 1100 for operating a robotic system including a robot and an auxiliary metrology position coordinate determination system with an operational alignment subsystem that includes an operational alignment actuator structure AAct. As shown in fig. 11, at decision block 1110, a determination is made as to whether the robotic system is to operate in an auxiliary metrology position coordinate mode. In various embodiments, the selection and/or activation of the auxiliary metrology position coordinate mode or the standard robot position coordinate mode may be made by a user and/or may be made automatically by the system in response to certain operations and/or instructions. For example, in one embodiment, the auxiliary metrology position coordinate mode may be entered (e.g., automatically or at the user's option) when the robot moves to a particular location (e.g., to move the end tool from a general area where assembly or other operations are performed to a more specific area where workpiece inspection operations are typically performed and/or where the auxiliary metrology position coordinate mode is used). In various embodiments, the mode may be implemented by an external control system (e.g., the external control system ECS of FIG. 1, which utilizes the standard robot position coordinate mode portion 147 and the auxiliary metrology position coordinate mode portion 187, or the external control system ECS' of FIG. 8, which utilizes the standard robot position coordinate mode portion 849 and the auxiliary metrology position coordinate mode portion 887). In various embodiments, a hybrid mode may operate independently or as part of the auxiliary metrology position coordinate mode, and/or may be implemented as switching between modes, as described previously with reference to FIG. 7.
If at decision block 1110 it is determined that the robotic system is not to operate in the auxiliary metrology position coordinate mode, the routine proceeds to block 1115, where the robotic system is operated in the standard robot position coordinate mode. As part of the standard robot position coordinate mode, the position sensors of the robot (e.g., rotary encoders, linear encoders, etc.) are used to control and determine robot motion and the corresponding end tool position or measurement point position of the end tool with robot accuracy (e.g., which is based at least in part on the accuracy of the position sensors of the robot). As previously described, the position sensors of the robot may indicate the position of the movable arm structure MAC or MAC' (e.g., the position of the arm) with a lower accuracy than the position information determined using the XY scale. In general, the robot position coordinate mode may correspond to an independent and/or standard mode of operation of the robot (e.g., a mode in which the robot operates independently, such as when the auxiliary metrology position coordinate determination system is not activated or provided).
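As a purely illustrative sketch of the standard robot position coordinate mode (not an implementation of the described system), the following Python snippet computes an end tool position from joint angles reported by rotary encoders for an assumed two-link planar arm; the link lengths and angles are hypothetical:

```python
import math

def planar_end_tool_position(theta1_rad, theta2_rad, l1_mm, l2_mm):
    """Minimal 2-link planar forward kinematics: the end tool XY position is
    computed from the joint angles reported by the robot's rotary encoders.
    The resulting accuracy is limited by encoder resolution, calibration and
    mechanical tolerances, i.e., the 'robot accuracy' referred to above."""
    x = l1_mm * math.cos(theta1_rad) + l2_mm * math.cos(theta1_rad + theta2_rad)
    y = l1_mm * math.sin(theta1_rad) + l2_mm * math.sin(theta1_rad + theta2_rad)
    return x, y

# Hypothetical example: 400 mm and 300 mm links at 30 and 45 degrees.
print(planar_end_tool_position(math.radians(30), math.radians(45), 400.0, 300.0))
```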
If the robotic system is to operate in the auxiliary metrology position coordinate mode, the routine proceeds to block 1120, where the robot and the auxiliary metrology position coordinate determination system are configured to operate the operational alignment subsystem and the operational alignment actuator structure to adjust the alignment of the movable one of the XY scale or the first imaging structure, based on the alignment signal provided by the alignment sensor, to provide the operating configuration of the auxiliary metrology position coordinate determination system. The scale plane is defined to be nominally coincident with the planar substrate of the XY scale, and the direction perpendicular to the scale plane is defined as the scale imaging axis direction. In the operating configuration, the XY scale and the first imaging structure are arranged such that the optical axis of the first imaging structure is parallel to the direction of the scale imaging axis direction indicated by the alignment signal, and the scale plane is located within the focus range of the first imaging structure along the scale imaging axis direction.
As previously mentioned, in various embodiments, this process for providing the operating configuration may include various position adjustments using the operational alignment actuator structure AAct, which may include discrete actuators of the operational alignment subsystem OAS and/or actuators included in the movable arm structure MAC or MAC'. As a specific example, in the embodiments of fig. 8, 9 and 10, the fourth and fifth motion mechanisms 834 and 835 may be operated to rotate the fourth and fifth arm portions 824 and 825, and thereby rotate the XY scale 870, so as to cause the scale imaging axis direction SIA to be parallel to the optical axis OA1, as indicated by the alignment signal of the alignment sensor ASen. In some embodiments, such adjustments may be made automatically, or may be guided by a user or an inspection program, etc. In various embodiments, various adjustments may be made to the first imaging structure 860 (e.g., the magnification and/or focus range, etc. may be adjusted) in order to bring the scale plane within the focus range of the first imaging structure 860 along the scale imaging axis direction SIA.
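One possible form of such an automatic adjustment is sketched below in Python under stated assumptions: the alignment sensor is assumed to expose a read() method returning two misalignment angles in radians, and each motion mechanism a move_by() method accepting a relative rotation; these interfaces are hypothetical, and the simple proportional loop is only an illustration of the adjustment described above:

```python
def establish_operating_configuration(alignment_sensor, mechanism_ra4, mechanism_ra5,
                                      gain=0.5, tolerance_rad=5e-5, max_iterations=50):
    """Iteratively reduce the residual misalignment reported by the alignment
    sensor (e.g., a two-axis autocollimator signal) by nudging the RA4 and RA5
    motion mechanisms, until the scale imaging axis direction is nominally
    parallel to the optical axis OA1 within the given tolerance."""
    for _ in range(max_iterations):
        theta_ra4, theta_ra5 = alignment_sensor.read()  # misalignment angles (rad)
        if abs(theta_ra4) < tolerance_rad and abs(theta_ra5) < tolerance_rad:
            return True  # operating configuration established
        mechanism_ra4.move_by(-gain * theta_ra4)
        mechanism_ra5.move_by(-gain * theta_ra5)
    return False  # did not converge within max_iterations
```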
At block 1130, at least one input signal is received (e.g., at an image trigger portion, such as image trigger portion 181 or 881, etc.) that is related to a measurement point position or an end tool position of an end tool of the robot. The timing of the first imaging trigger signal is determined based on the at least one input signal and the first imaging trigger signal is output to the first imaging structure. The first imaging structure acquires a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal. In various embodiments, different types of end tools may provide different types of outputs that may be used for at least one input signal. For example, in embodiments where the end tool is a contact probe for measuring a workpiece and outputting a contact signal when it contacts the workpiece, the contact signal, or a signal derived therefrom, may be input as at least one input signal to determine the timing of the first imaging trigger signal based on the at least one input signal. As another example, in embodiments where the end tool is a scanning probe for measuring a workpiece and providing respective workpiece measurement sample data corresponding to respective sample timing signals, the respective sample timing signals, or signals derived therefrom, may be input as the at least one input signal. As another example, in embodiments where the end tool is a camera for providing respective workpiece measurement images corresponding to respective workpiece image acquisition signals, the workpiece image acquisition signals, or signals derived therefrom, may be input as the at least one input signal.
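As an illustration of how an image trigger portion might derive the first imaging trigger signal from one of the end tool signals listed above, consider the following minimal Python sketch; the camera and probe interfaces are assumptions, not an actual API of the described system:

```python
import time

class ImageTriggerPortion:
    """Minimal sketch: any of the end-tool outputs described above (a touch-probe
    contact signal, a scanning-probe sample timing signal, or a workpiece image
    acquisition signal) can serve as the input signal that times the trigger."""

    def __init__(self, first_imaging_structure):
        self.camera = first_imaging_structure  # assumed to expose acquire_image()

    def on_input_signal(self, input_signal_time=None):
        # The timing of the first imaging trigger signal is derived from the
        # input signal; here the trigger is issued immediately upon receipt.
        trigger_time = input_signal_time if input_signal_time is not None else time.monotonic()
        scale_image = self.camera.acquire_image()  # digital image of the XY scale
        return trigger_time, scale_image
```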
At block 1140, the acquired image is received (e.g., at a metrology position coordinate processing section, such as metrology position coordinate processing section 185 or 885, etc.), and at least one respective imageable feature included in the acquired image of the XY scale and an associated respective known XY scale coordinate position are identified. At block 1150, metrology position coordinates that give the relative position between the movable one of the XY scale or the first imaging structure and the first reference position are determined with a level of accuracy that is better than the robot accuracy, based on determining the image position of the identified at least one respective imageable feature in the acquired image. The determined metrology position coordinates give the end tool position at the time of image acquisition with a level of accuracy better than the robot accuracy, at least for a vector component of the metrology position coordinates that is transverse or perpendicular to the scale imaging axis direction. At block 1160, the determined position information (e.g., the determined metrology position coordinates that give the relative position, the determined metrology position coordinates of the measurement point position of the end tool or of the end tool position, and/or other relevant determined position information) is used for a specified function (e.g., workpiece measurement, positioning control of the movable arm structure of the robot, etc.). After the operations of block 1160, the routine may proceed to point A, where in various embodiments the routine may end. Alternatively, after the operations of block 1160, the routine may be partially or fully repeated. For example, the determined position information (e.g., from block 1160) may correspond to, or otherwise be used to determine, a first surface position on the workpiece, and the routine may be repeated so that a second surface position on the workpiece may then be determined (e.g., as part of a measurement of the workpiece, such as measuring a feature of the workpiece). Whether the operations at block 1120 (which adjust the alignment of the movable one of the XY scale or the first imaging structure based on the alignment signal provided by the alignment sensor) are to be repeated may depend on the particular circumstances in which the routine is repeated. For optimal accuracy, it may be desirable to establish the operational alignment outlined at block 1120 with the movable arm structure of the robot in the same (or nearly the same) position and/or pose that will be used during the operations of blocks 1130 and/or 1140. However, if the movable arm structure of the robot is sufficiently rigid, and/or the position and/or pose used at the second surface position is close to that used at the first surface position, and/or the accuracy requirements in certain circumstances are less stringent, then repeating the operations at block 1120 may be omitted in some such circumstances, if desired.
In any case, the first and second determined metrology position coordinates that give the first and second relative positions, and/or the associated position information determined by repeating the routine 1100, are used to determine a dimension of the workpiece that corresponds to the distance between the first and second surface locations on the workpiece, which correspond to the respective end tool positions or measurement point positions of the end tool when contacting the respective first and second surface locations on the workpiece at the respective image acquisition times. It should be appreciated that, rather than using the position sensors of the robot (e.g., rotary encoders, linear encoders, etc.) to determine the first and second surface positions on the workpiece with robot accuracy, more accurate position information may be determined utilizing the techniques described herein. More specifically, the determination of the first and second surface positions (i.e., corresponding to first and second determined metrology position coordinates that correspond to respective first and second positions on the XY scale, between which the precise distance may be determined with the accuracy of the XY scale using the techniques described above) allows the respective dimension of the workpiece (e.g., of a workpiece feature) between the first and second surface positions to be determined with high accuracy.
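A minimal sketch of this dimension determination, assuming two already-determined metrology position coordinate pairs in millimetres (hypothetical values), is given below:

```python
import math

def workpiece_dimension(first_surface_position, second_surface_position):
    """Distance between two determined metrology position coordinates
    corresponding to the first and second surface locations on the workpiece,
    e.g., the measured width of a workpiece feature."""
    return math.dist(first_surface_position, second_surface_position)

# Hypothetical coordinates (mm) from two repetitions of routine 1100.
print(workpiece_dimension((120.001, 80.000), (145.403, 80.002)))  # ~25.402 mm
```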
It is noted that the various techniques described above differ from techniques that utilize fiducials or other reference marks, for which the same fiducial or reference mark must appear in each image. In contrast, position information for the XY scale 170 or 870 may be determined over the entire extent of the XY scale 170 or 870 and, correspondingly, for any portion of the XY scale 170 or 870 contained in the image corresponding to the field of view FOV or FOV1 of the imaging structure 160 or 860.
It will be appreciated that the routine outlined above with reference to fig. 6 may be used with a corresponding operational alignment subsystem OAS (e.g., as described with reference to fig. 1-3B), or the routine outlined above with reference to fig. 11 may be used with a corresponding operational alignment subsystem OAS (e.g., as described above with reference to fig. 8-10 and/or 12-13), to implement the hybrid mode described above with reference to fig. 7. In particular, such a routine and corresponding operational alignment subsystem may be used in the operations of block 740 of fig. 7.
Fig. 14 is a diagram 1400 of a first exemplary configuration of an alignment sensor ASen that may be used in the operational alignment subsystem OAS in various embodiments according to the principles disclosed herein. It may be characterized as a laser autocollimator that uses an alignment beam to detect the residual misalignment MisAng of the XY scale 170 or 870, etc. As shown in fig. 14, the alignment sensor ASen may generally include a laser beam source LD (e.g., a laser diode and a lens), a polarizing beam splitter PBS, a quarter wave plate PX matched to the wavelength of the laser beam source LD, an objective lens L, and a position sensitive detector PSD. The light emitting point of the light source LD is located at the focal point of the objective lens L. The polarizing beam splitter PBS is located on the optical path of the light beam emitted by the light source LD, at a distance "b" from the light source LD. The polarizing beam splitter PBS polarizes the light in the alignment beam. As previously described, the quarter wave plate PX rotates the polarization of the light in the alignment beam according to known principles, and the alignment beam ABeam is then output to the XY scale 170 (870) through the objective lens L.
As previously described, XY scale 170(870) reflects alignment beam ABeam as reflected alignment beam ABeamR. As shown, the reflected alignment beam ABeamR returns through the objective lens L and the quarter wave plate PX and is reflected from the polarizing beam splitter PBS to the position sensor PSD. The position detector PSD is spaced from the polarizing beam splitter PBS by a distance "b".
It should be appreciated that if the XY scale has a residual misalignment angle θ relative to the alignment beam ABeam (shown as the residual misalignment MisAng in the various figures described previously herein), the reflected alignment beam ABeamR is reflected at an angle of 2θ. Thus, it will be appreciated that the displacement or position "d" of the reflected alignment beam ABeamR focused on the position detector PSD will follow the relationship

θ = d / (2f)

for the illustrated configuration, where f (= a + b) is the focal length of the objective lens L.
It should be understood that although the foregoing description describes misalignment detection in one plane, the same detector and the same detection principles may be applied in both planes when the position detector PSD has two sensitive axes and corresponding output signals. For example, in various embodiments, the position detector PSD may be a quadrant detector of a known type, which may provide differential signals of a known type giving the displacement or position "d" of the reflected alignment beam ABeamR along the respective "X" and "Y" axes. Such signals may be considered the alignment signal ASig outlined previously for the alignment sensor ASen.
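For illustration, a short Python sketch applying the relationship θ = d/(2f) to both detector axes is given below; the numerical values are hypothetical:

```python
def residual_misalignment_angles(d_x_mm, d_y_mm, focal_length_mm):
    """Residual misalignment angles (radians) about the two sensitive axes of
    the position detector PSD, using theta = d / (2 f), where d is the
    displacement of the reflected alignment beam ABeamR on the detector and
    f = a + b is the focal length of the objective lens L."""
    return d_x_mm / (2.0 * focal_length_mm), d_y_mm / (2.0 * focal_length_mm)

# Hypothetical example: a 0.020 mm spot displacement on each axis with a
# 50 mm focal length corresponds to 0.0002 rad (0.2 mrad) about each axis.
print(residual_misalignment_angles(0.020, 0.020, 50.0))
```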
It should be understood that although the component name "XY scale" has been used in this disclosure with reference to components 170, 170A, 170B, 870, etc., this component name is exemplary only and not limiting. It is referred to as an "XY scale" with reference to a Cartesian coordinate system, and its description includes a nominally planar substrate (e.g., arranged nominally perpendicular to the scale imaging axis direction, which may be parallel to the z axis in some embodiments). More generally, however, the component name XY scale should be understood to refer to any reference scale that includes features or markings corresponding to known two-dimensional coordinates (e.g., precise and/or precisely calibrated positions in two dimensions) on that reference scale, so long as the scale is capable of operating as disclosed herein. For example, such scale features may be expressed and/or marked in a Cartesian coordinate system on the reference scale, or in a polar coordinate system, or in any other convenient coordinate system. Further, such features may be distributed uniformly or non-uniformly throughout the operational scale area, and may include graduated or ungraduated scale markings, so long as such features correspond to known two-dimensional coordinates on the scale and are capable of operating as disclosed herein.
It should be understood that although the robotic systems and corresponding movable arm structures disclosed and illustrated herein are generally shown and described with reference to a certain number of arm portions (e.g., 3 arms, 5 arms, etc.), such systems are not limited thereto. In various embodiments, the robotic system may include fewer or more arm portions than those described and/or claimed herein, if desired.
It will be appreciated that the XY scale or reference scale and the camera for imaging the scale may rotate relative to each other, depending on the motion and/or position of the robotic system. It is to be understood that methods known in the art (e.g., methods disclosed in the incorporated references) can be used to accurately determine any such relative rotation and/or perform any desired coordinate transformations, and/or to analyze the relative positions of the camera and the scale in accordance with the principles disclosed herein, regardless of such relative rotation. It should be understood that in various embodiments, the metrology position coordinates referred to herein may take into account any such relative rotation. Further, it should be understood that in some embodiments, the metrology position coordinates referred to herein may include a set of coordinates that includes an accurate determination and/or indication of any such relative rotation, if desired.
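As a simple illustration of such a coordinate transformation (a sketch only, with an assumed sign convention), an (x, y) offset measured in the camera/image frame can be rotated into the XY scale frame when the relative rotation between the camera and the scale is known:

```python
import math

def image_offset_to_scale_frame(offset_xy_mm, relative_rotation_rad):
    """Rotate an (x, y) offset measured in the camera/image frame into the
    XY scale frame, given the relative rotation between the camera and the
    scale (e.g., as determined by known image processing methods)."""
    x, y = offset_xy_mm
    c, s = math.cos(relative_rotation_rad), math.sin(relative_rotation_rad)
    return (c * x - s * y, s * x + c * y)

# Hypothetical example: a 1.000 mm offset along the image X axis with a
# 90-degree relative rotation maps to the scale Y axis.
print(image_offset_to_scale_frame((1.000, 0.000), math.radians(90.0)))
```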
While preferred embodiments of the disclosure have been shown and described, many variations in the arrangements of the features and sequences of operations shown and described will be apparent to those skilled in the art based on this disclosure. Various alternatives may be used to implement the principles disclosed herein. Furthermore, the various embodiments described above may be combined to provide further implementations. All U.S. patents and U.S. patent applications mentioned in this specification are herein incorporated by reference in their entirety. Aspects of the implementations can be modified, if necessary, to employ concepts of the various patents and applications to provide yet further implementations.
These and other changes can be made to the embodiments in light of the above detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled.

Claims (23)

1. A robotic system, comprising:
a robot, comprising:
a movable arm structure, wherein the movable arm structure comprises an end tool mounting structure located near a distal end of the movable arm structure, and the robot is configured to move the movable arm structure so as to move at least a portion of an end tool mounted to the end tool mounting structure along at least two dimensions in the end tool workspace; and
a motion control system configured to control a measurement point position or an end tool position of the end tool based at least in part on sensing and controlling a position of the movable arm structure using at least one position sensor included in the robot to a level of accuracy defined as robot accuracy; and is
Wherein:
the robotic system further includes an auxiliary metrology position coordinate determination system comprising:
a first imaging structure having a first camera, the first imaging structure having an optical axis;
an XY scale comprising a nominally planar substrate and a plurality of respective imageable features distributed on the substrate, wherein the respective imageable features are located at respective known XY scale coordinates on the XY scale, a scale plane is defined as being nominally coincident with the planar substrate of the XY scale, and a direction perpendicular to the scale plane is defined as a scale imaging axis direction;
an operational alignment subsystem OAS comprising at least one alignment sensor ASen and an operational alignment actuator structure, wherein the alignment sensor is located in the vicinity of the first camera and is mounted in a rigid structure with respect to the first camera, and the alignment sensor is configured to provide an alignment signal ASig representing the scale imaging axis direction;
an image trigger part configured to input at least one input signal related to a measurement point position of an end tool or an end tool position, and to determine a timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure, wherein the first imaging structure is configured to acquire a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal, and
a metrological position coordinate processing part configured to input the acquired image and identify at least one respective imageable feature included in the acquired XY scale image and an associated respective known XY scale coordinate position, and
wherein:
the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale or the first imaging structure is coupled to an operational alignment actuator structure coupled to or part of the movable arm structure, and the other of the XY scale or the first imaging structure is coupled to a fixed element in the vicinity of the robot, wherein the fixed one of the XY scale or the first imaging structure defines a first reference position;
the robotic system is configured to operate the operational alignment subsystem and the operational alignment actuator structure to adjust an alignment of the movable one of the XY scale or the first imaging structure based on the alignment signal provided by the alignment sensor to provide an operational structure of the auxiliary metrological position coordinate determination system, wherein in the operational structure of the auxiliary metrological position coordinate determination system the XY scale and the first imaging structure are arranged such that an optical axis of the first imaging structure is parallel to a direction of the scale imaging axis direction indicated by the alignment signal and the scale plane is within a focus range of the first imaging structure along the scale imaging axis direction;
the auxiliary metrological position coordinate determination system is configured such that, when the movable one of the XY scale or first imaging structure and the fixed one of the XY scale or first imaging structure are arranged in the operational structure and the movable arm structure is positioned such that the XY scale is in the field of view of the first imaging structure, the metrological position coordinate processing portion is operable to determine metrological position coordinates based on determining the image position of the identified at least one respective imageable feature in the acquired image, the metrological position coordinates giving the relative position between the movable one of the XY scale or first imaging structure and the first reference position with a level of accuracy better than the robot accuracy; and
the determined metrology position coordinates give the measurement point position of the end tool or the end tool position at the time of image acquisition with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
2. A robotic system according to claim 1, wherein the operational alignment actuator structure comprises at least a first rotary element that rotates about a first axis of rotation that is nominally parallel to the scale plane if the XY scale is the movable one and nominally perpendicular to the optical axis if the first imaging structure is the movable one.
3. The robotic system of claim 2, wherein the operational alignment actuator structure includes at least a second rotating element that rotates about a second axis of rotation nominally perpendicular to the first axis of rotation.
4. The robotic system of claim 3, wherein the first and second rotating elements are included in the movable arm structure.
5. The robotic system of claim 3, wherein the first and second rotating elements are included in discrete operational alignment actuator structures located near a distal end of the movable arm structure.
6. The robotic system of claim 2, wherein the hub portion includes at least a first hub portion rotating element that rotates about an axis of rotation nominally parallel to the first axis of rotation.
7. The robotic system of claim 2, wherein, for the distal sub-portion coupling the movable one of the XY scale or the first imaging structure to the central sub-portion, a distal sub-portion axis of rotation nominally perpendicular to the scale plane is excluded if the XY scale is the movable one, and a distal sub-portion axis of rotation nominally parallel to the optical axis is excluded if the first imaging structure is the movable one.
8. The robotic system of claim 2, wherein the distal terminal portion includes a bracket that couples the movable of the XY scale or the first imaging structure to the first rotating element.
9. The robotic system of claim 1, wherein the movable one of the XY scale or the first imaging structure is configured in a rigid relationship with at least one of the end tool mounting structure and an end tool mounted to the end tool mounting structure.
10. The robotic system of claim 1, wherein the alignment sensor is configured to output an alignment beam to the XY scale and receive a reflected alignment beam therefrom on a position sensitive detector of the alignment sensor and provide the alignment signal based on at least one output from the position sensitive detector.
11. A robotic system according to claim 1, wherein the robot is configured to move the movable one of the XY scale or the first imaging structure in a plane parallel to the scale plane while the auxiliary metrology position coordinate determination system is in the operative configuration.
12. The robotic system of claim 1, wherein:
when the end tool is a contact probe for measuring a workpiece and outputting a contact signal when it contacts the workpiece, the image triggering section is configured to input the contact signal or a signal derived therefrom as at least one input signal thereof; or
When the end tool is a scanning probe for measuring a workpiece and providing respective workpiece measurement sample data corresponding to respective sample timing signals, the image triggering portion is configured to input the respective sample timing signals or signals derived therefrom as at least one input signal thereof; or
When the end tool is a camera for providing a respective workpiece measurement image corresponding to a respective workpiece image acquisition signal, the image triggering portion is configured to input the workpiece image acquisition signal or a signal derived therefrom as at least one input signal thereof.
13. The robotic system of claim 1, wherein the auxiliary metrology position coordinate determination system is configured to determine the metrology position coordinates of the measurement point position or the end tool position of the end tool at the image acquisition time based on the determined metrology position coordinates representing the relative position of the movable one of the XY scale or the first imaging structure and a known coordinate position offset between the measurement point position or the end tool position of the end tool and the movable one of the XY scale or the first imaging structure.
14. The robotic system of claim 1, wherein the first imaging structure and the alignment sensor are coupled to the movable arm structure and the XY scale is coupled to the stationary element.
15. The robotic system of claim 14, wherein the fixed element comprises a frame disposed over at least a portion of the end tool workspace, and the XY scale is fixed to the frame over a portion of the end tool workspace.
16. The robotic system of claim 1, wherein:
the respective imageable features of the XY scale comprise a set of imageable features having a uniquely identifiable pattern, wherein the set of imageable features are distributed on the substrate such that they are spaced apart by a distance that is less than the distance across the field of view of the first imaging structure, and the metrology position coordinate processing portion is configured to identify at least one respective imageable feature included in the acquired XY scale image based on its uniquely identifiable pattern; or
The metrology position coordinate processing section is configured to identify at least one respective imageable feature included in the captured image of the XY scale based on its image position in the captured image and based on robot position data derived from the motion control system corresponding to the image capture time, wherein the respective imageable feature of the XY scale comprises a set of similar imageable features distributed over the substrate such that they are spaced apart from each other by a distance greater than a maximum position error allowed within robot accuracy.
17. A method for operating an auxiliary metrology position coordinate determination system for use with a robot,
the robot includes:
a movable arm structure, wherein the movable arm structure comprises an end tool mounting structure located near a distal end of the movable arm structure, and the robot is configured to move the movable arm structure so as to move at least a portion of an end tool mounted to the end tool mounting structure along at least two dimensions in the end tool workspace; and
a motion control system configured to control a measurement point position or an end tool position of the end tool based at least in part on sensing and controlling a position of the movable arm structure using at least one position sensor included in the robot to a level of accuracy defined as robot accuracy;
the auxiliary metrology position coordinate determination system comprises:
a first imaging structure comprising a first camera, the first imaging structure having an optical axis;
an XY scale comprising a nominally planar substrate and a plurality of respective imageable features distributed on the substrate, wherein the respective imageable features are located at respective known XY scale coordinates on the XY scale, a scale plane is defined as being nominally coincident with the planar substrate of the XY scale, and a direction perpendicular to the scale plane is defined as a scale imaging axis direction;
an operational alignment subsystem OAS comprising at least one alignment sensor ASen and an operational alignment actuator structure, wherein the alignment sensor is located in the vicinity of the first camera and is mounted in a rigid structure relative to the first camera and the alignment sensor is configured to provide an alignment signal indicative of the scale imaging axis direction;
an image triggering section; and
a measurement position coordinate processing section for measuring the position of the object,
wherein:
the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale or the first imaging structure is coupled to an operational alignment actuator structure coupled to or part of the movable arm structure, and the other of the XY scale or the first imaging structure is coupled to a fixed element in the vicinity of the robot, wherein the fixed one of the XY scale or the first imaging structure defines a first reference position;
the robotic system is configured to operate the operational alignment subsystem and the operational alignment actuator structure to adjust an alignment of the movable one of the XY scale or the first imaging structure based on an alignment signal provided by the alignment sensor to provide an operational structure of the auxiliary metrological position coordinate determination system, wherein in the operational structure of the auxiliary metrological position coordinate determination system the XY scale and the first imaging structure are arranged such that an optical axis of the first imaging structure is parallel to a direction of the scale imaging axis direction indicated by the alignment signal and the scale plane is within a focus range of the first imaging structure along the scale imaging axis direction;
the method comprises the following steps:
operating the operational alignment subsystem and the operational alignment actuator structure to provide an operational structure of the auxiliary metrological position coordinate determination system, the optical axis of the first imaging structure being parallel to the direction of the scale imaging axis direction indicated by the alignment signal and the scale plane being within the focus range of the first imaging structure along the scale imaging axis direction;
receiving at the image trigger portion at least one input signal relating to a measurement point position or an end tool position of the end tool, determining a timing of a first imaging trigger signal based on the at least one input signal, and outputting the first imaging trigger signal to the first imaging structure, wherein the first imaging structure acquires a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal, and the auxiliary metrology position coordinate determination system is at least nominally in the operating configuration when the digital image is acquired;
receiving the acquired image at the metrological position coordinates processing portion and identifying at least one respective imageable feature included in the acquired XY scale image and an associated respective known XY scale coordinate position; and
based on determining an image position of the identified at least one respective imageable feature in the acquired image, determining metrology position coordinates that give the relative position between the movable one of the XY scale or the first imaging structure and the first reference position, wherein the determined metrology position coordinates give the measurement point position or the end tool position of the end tool at the time of image acquisition with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
18. The method of claim 17, further comprising measuring a feature of the workpiece using the determined metrology position coordinates that give the relative position.
19. The method of claim 18, wherein the relative position is a first relative position corresponding to a first surface position on a workpiece, and further comprising:
receiving at the image trigger portion at least one second input signal relating to a measurement point position of the end tool or an end tool position, and determining a timing of a second imaging trigger signal based on the at least one second input signal, and outputting the second imaging trigger signal to the first imaging structure, wherein the first imaging structure acquires a second digital image of the XY scale at a second image acquisition time in response to receiving the second imaging trigger signal, and the auxiliary metrology position coordinate determination system is at least nominally in the operating configuration when acquiring the second digital image;
receiving the second acquired image at the metrological position coordinates processing portion and identifying at least one second respective imageable feature included in the second acquired XY scale image and an associated respective second known XY scale coordinate position;
determining metrology position coordinates based on determining a second image position of the identified at least one second respective imageable feature in the second acquired image, the metrology position coordinates giving a second relative position between the movable one of the XY scale or the first imaging structure and the first reference position, wherein the determined metrology position coordinates are indicative of the measurement point position or the end tool position of the end tool at the second image acquisition time with a level of accuracy that is better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction, and the second relative position is different from the first relative position and corresponds to a second surface position on the workpiece different from the first surface position; and
using the determined metrology position coordinates that give the first and second relative positions to determine a dimension of the workpiece corresponding to a distance between the first and second surface locations on the workpiece.
20. An auxiliary metrology position coordinate determination system for use with a robot, the robot comprising:
a movable arm structure, wherein the movable arm structure comprises an end tool mounting structure located near a distal end of the movable arm structure, and the robot is configured to move the movable arm structure so as to move at least a portion of an end tool mounted to the end tool mounting structure along at least two dimensions in the end tool workspace; and
a motion control system configured to control a measurement point position or an end tool position of the end tool based at least in part on sensing and controlling a position of the movable arm structure using at least one position sensor included in the robot to a level of accuracy defined as robot accuracy;
the auxiliary metrology position coordinate determination system comprising:
a first imaging structure having a first camera, the first imaging structure having an optical axis;
an XY scale comprising a nominally planar substrate and a plurality of respective imageable features distributed on the substrate, wherein the respective imageable features are located at respective known XY scale coordinates on the XY scale, a scale plane is defined as being nominally coincident with the planar substrate of the XY scale, and a direction perpendicular to the scale plane is defined as a scale imaging axis direction;
an operational alignment subsystem comprising at least one alignment sensor and an operational alignment actuator structure, wherein the alignment sensor is located proximate to the first camera and is rigidly mounted relative to the first camera, and the alignment sensor is configured to provide an alignment signal indicative of the scale imaging axis direction;
an image trigger portion configured to input at least one input signal related to the measurement point position of the end tool or the end tool position, to determine a timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure, wherein the first imaging structure is configured to acquire a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal; and
a metrology position coordinate processing portion configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale, and an associated respective known XY scale coordinate position;
wherein the auxiliary metrology position coordinate determination system is configured such that, when it is operably connected to the robot:
the movable one of the XY scale or the first imaging structure is coupled to the operational alignment actuator structure, which is coupled to or is part of the movable arm structure, and the other of the XY scale or the first imaging structure is coupled to a fixed element in the vicinity of the robot, wherein the fixed one of the XY scale or the first imaging structure defines a first reference position;
the robot is configured to operate the operational alignment subsystem and the operational alignment actuator structure to adjust an alignment of the movable one of the XY scale or the first imaging structure, based on the alignment signal provided by the alignment sensor, so as to provide an operating configuration of the auxiliary metrology position coordinate determination system, wherein in the operating configuration the XY scale and the first imaging structure are arranged such that the optical axis of the first imaging structure is parallel to the scale imaging axis direction indicated by the alignment signal and the scale plane is within the focus range of the first imaging structure along the scale imaging axis direction;
the auxiliary metrology position coordinate determination system is configured such that, when the movable one of the XY scale or the first imaging structure and the fixed one of the XY scale or the first imaging structure are arranged in the operating configuration and the movable arm structure is positioned such that the XY scale is in the field of view of the first imaging structure, the metrology position coordinate processing portion is operable to determine, based on the image position of the identified at least one respective imageable feature in the acquired image, metrology position coordinates that give the relative position between the movable one of the XY scale or the first imaging structure and the first reference position, with a level of accuracy better than the robot accuracy; and
the determined metrology position coordinates are indicative of the measurement point position of the end tool or the end tool position at the image acquisition time, with a level of accuracy better than the robot accuracy, at least for a vector component of the metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
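Claim 20 recites adjusting the movable element with the operational alignment actuator structure until the alignment signal indicates that the optical axis is parallel to the scale imaging axis direction. The following is a minimal proportional-control sketch of that idea; the two-axis actuator interface, gain, tolerance, and convergence behaviour are assumptions rather than details taken from the claims.

```python
import numpy as np

def align_optical_axis(read_alignment_signal, command_actuator,
                       gain=0.5, tolerance_rad=1e-4, max_iterations=50):
    """Iteratively tilt the movable element until the misalignment reported by
    the alignment sensor is within tolerance.

    read_alignment_signal -- returns (tilt_x, tilt_y) misalignment angles in radians
    command_actuator      -- applies an incremental (tilt_x, tilt_y) tilt correction
    """
    for _ in range(max_iterations):
        misalignment = np.asarray(read_alignment_signal(), dtype=float)
        if np.linalg.norm(misalignment) < tolerance_rad:
            return True   # operating configuration reached
        # Proportional correction opposing the sensed misalignment.
        command_actuator(-gain * misalignment)
    return False          # did not converge within the iteration budget

# Simulated usage: the "actuator" here simply accumulates the commanded tilt.
state = np.array([0.004, -0.002])      # initial misalignment, radians
def read_signal():
    return state
def apply_tilt(delta):
    state[:] = state + delta
print(align_optical_axis(read_signal, apply_tilt))   # True
```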
21. A robotic system, comprising:
a robot, comprising:
a movable arm structure, wherein the movable arm structure comprises an end tool mounting structure located near a distal end of the movable arm structure, and the robot is configured to move the movable arm structure so as to move at least a portion of an end tool mounted to the end tool mounting structure along at least two dimensions in the end tool workspace; and
a motion control system configured to control a measurement point position or an end tool position of the end tool based at least in part on sensing and controlling a position of the movable arm structure using at least one position sensor included in the robot to a level of accuracy defined as robot accuracy; and
wherein:
the robotic system further includes an auxiliary metrology position coordinate determination system comprising:
a first imaging structure having a first camera, the first imaging structure having an optical axis;
an XY scale comprising a nominally planar substrate and a plurality of respective imageable features distributed on the substrate, wherein the respective imageable features are located at respective known XY scale coordinates on the XY scale, a scale plane is defined as being nominally coincident with the planar substrate of the XY scale, and a direction perpendicular to the scale plane is defined as a scale imaging axis direction;
an operational alignment subsystem (OAS) comprising at least one alignment sensor (ASen) and an operational alignment actuator structure, wherein the alignment sensor is located proximate to the first camera and is rigidly mounted relative to the first camera, and the alignment sensor is configured to provide an alignment signal (Asig) indicative of the scale imaging axis direction;
an image trigger portion configured to input at least one input signal related to the measurement point position of the end tool or the end tool position, to determine a timing of a first imaging trigger signal based on the at least one input signal, and to output the first imaging trigger signal to the first imaging structure, wherein the first imaging structure is configured to acquire a digital image of the XY scale at an image acquisition time in response to receiving the first imaging trigger signal; and
a metrology position coordinate processing portion configured to input the acquired image and to identify at least one respective imageable feature included in the acquired image of the XY scale and an associated respective known XY scale coordinate position; and
wherein:
the auxiliary metrology position coordinate determination system is configured such that the movable one of the XY scale or the first imaging structure is coupled to the movable arm structure and the other of the XY scale or the first imaging structure is coupled to a fixed element in the vicinity of the robot, wherein the fixed one of the XY scale or the first imaging structure defines a first reference position;
the robotic system is configured to provide at least a nominal operating configuration of the auxiliary metrology position coordinate determination system, wherein in the nominal operating configuration the XY scale and the first imaging structure are arranged such that the optical axis of the first imaging structure is nominally parallel to the scale imaging axis direction and such that the scale plane lies within the focus range of the first imaging structure along the scale imaging axis direction;
the robotic system is configured to operate the operational alignment subsystem to determine a residual misalignment between the optical axis and the scale imaging axis direction, as indicated by the alignment signal provided by the alignment sensor;
the auxiliary metrology position coordinate determination system is configured such that, when the movable one of the XY scale or the first imaging structure and the fixed one of the XY scale or the first imaging structure are arranged in the nominal operating configuration and the movable arm structure is positioned such that the XY scale is in the field of view of the first imaging structure, the metrology position coordinate processing portion is operable to acquire a digital image of the XY scale at an image acquisition time, to determine the corresponding residual misalignment, and to determine, based on the image position of the identified at least one respective imageable feature in the acquired image and on the corresponding residual misalignment, a first set of metrology position coordinates that gives the relative position between the movable one of the XY scale or the first imaging structure and the first reference position, with a level of accuracy better than the robot accuracy, at least for a vector component of the first set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
22. The robotic system of claim 21, wherein the metrology position coordinate processing portion is further configured to determine, based on the first set of metrology position coordinates and the corresponding residual misalignment, a second set of metrology position coordinates that is indicative of the measurement point position or the end tool position of the end tool at the image acquisition time, with a level of accuracy better than the robot accuracy, at least for a vector component of the second set of metrology position coordinates that is at least one of transverse or perpendicular to the scale imaging axis direction.
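One plausible reading of the compensation in claim 22 is a first-order geometric correction: a residual tilt of the optical axis relative to the scale imaging axis direction shifts the apparent feature positions roughly in proportion to the tilt and the working distance. The sketch below applies such a correction to the first set of metrology position coordinates; the small-angle model and the working-distance parameter are assumptions for illustration only.

```python
import numpy as np

def compensate_residual_misalignment(first_set_xy, residual_tilt_rad, working_distance_mm):
    """First-order correction of the first set of metrology position coordinates.

    first_set_xy        -- (X, Y) coordinates determined from the acquired image, in mm
    residual_tilt_rad   -- (tilt_x, tilt_y) residual misalignment angles, in radians
    working_distance_mm -- distance from the first imaging structure to the scale
                           plane along the scale imaging axis direction
    """
    first_set_xy = np.asarray(first_set_xy, dtype=float)
    tilt = np.asarray(residual_tilt_rad, dtype=float)
    # Small-angle approximation: lateral shift ~ working distance x tilt angle.
    lateral_shift = working_distance_mm * np.tan(tilt)
    return first_set_xy - lateral_shift

# 0.5 mrad of residual tilt at a 150 mm working distance shifts the apparent
# position by about 75 um, which is removed here.
print(compensate_residual_misalignment((119.95, 85.00), (0.0005, 0.0), 150.0))
```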
23. The robotic system of claim 21, wherein the alignment sensor is configured to output an alignment beam to the XY scale and receive a reflected alignment beam therefrom on a position sensitive detector of the alignment sensor and provide the alignment signal based on at least one output from the position sensitive detector.
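The alignment sensor of claim 23 reflects a beam off the XY scale onto a position sensitive detector (PSD), so the spot displacement encodes the tilt between the optical axis and the scale normal. A minimal sketch of that geometry, assuming an autocollimator-style arrangement in which the reflected beam deviates by twice the surface tilt and is focused by a lens of known focal length onto the PSD; the parameter names are illustrative.

```python
import numpy as np

def misalignment_from_psd(spot_xy_mm, focal_length_mm):
    """Convert a PSD spot displacement into (tilt_x, tilt_y) misalignment angles.

    spot_xy_mm      -- spot displacement from the PSD null position, in mm
    focal_length_mm -- focal length of the lens focusing the reflected beam onto the PSD

    In an autocollimation geometry the reflected beam is deviated by twice the
    tilt between the scale normal and the optical axis, so the angle recovered
    from the spot position is halved.
    """
    spot = np.asarray(spot_xy_mm, dtype=float)
    return np.arctan2(spot, focal_length_mm) / 2.0

# A 30 um spot displacement with a 100 mm focal length corresponds to ~0.15 mrad of tilt.
print(misalignment_from_psd((0.030, 0.0), 100.0))
```

The angles returned here are the residual misalignment used in the sketch after claim 22, so the two fragments can be chained to go from raw PSD readings to compensated metrology position coordinates.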
CN201980053434.9A 2018-08-16 2019-08-15 Auxiliary metrology position coordinate determination system including alignment sensors for robots Pending CN112584984A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US16/104,033 2018-08-16
US16/104,033 US10751883B2 (en) 2018-08-16 2018-08-16 Robot system with supplementary metrology position coordinates determination system
US16/146,640 2018-09-28
US16/146,640 US10871366B2 (en) 2018-08-16 2018-09-28 Supplementary metrology position coordinates determination system for use with a robot
US201862785129P 2018-12-26 2018-12-26
US62/785,129 2018-12-26
PCT/US2019/046702 WO2020037147A1 (en) 2018-08-16 2019-08-15 Supplementary metrology position coordinates determination system including an alignment sensor for use with a robot

Publications (1)

Publication Number Publication Date
CN112584984A true CN112584984A (en) 2021-03-30

Family

ID=69524884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980053434.9A Pending CN112584984A (en) 2018-08-16 2019-08-15 Auxiliary metrology position coordinate determination system including alignment sensors for robots

Country Status (4)

Country Link
JP (1) JP7431216B2 (en)
CN (1) CN112584984A (en)
DE (1) DE112019004129T5 (en)
WO (1) WO2020037147A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113847894B (en) * 2021-09-23 2024-03-29 深圳市人工智能与机器人研究院 Robot multi-positioning system coordinate unifying method and system

Citations (11)

Publication number Priority date Publication date Assignee Title
US4753569A (en) * 1982-12-28 1988-06-28 Diffracto, Ltd. Robot calibration
EP0329531A1 (en) * 1988-02-18 1989-08-23 Telemecanique Method and device for estimating the parameters of the geometric model of a manipulator
JPH0996506A (en) * 1995-09-29 1997-04-08 Ricoh Co Ltd Adjusting method for position by three-dimensional visual sensor and recognition apparatus for three-dimensional image
US20100153058A1 (en) * 2008-12-16 2010-06-17 Phillip John Crothers Geometric inspection of machined objects
CN104457566A (en) * 2014-11-10 2015-03-25 西北工业大学 Spatial positioning method not needing teaching robot system
US20170113351A1 (en) * 2015-10-21 2017-04-27 Fanuc Corporation Calibration system and calibration method calibrating mechanical parameters of wrist part of robot
CN107088892A (en) * 2017-04-01 2017-08-25 西安交通大学 A kind of industrial robot motion accuracy checking method based on binocular vision
US20170274534A1 (en) * 2016-03-25 2017-09-28 Fanuc Corporation Positioning system using robot
US20180001478A1 (en) * 2016-06-29 2018-01-04 Applied Materials, Inc. Methods and systems providing misalignment correction in robots
CN110834320A (en) * 2018-08-16 2020-02-25 株式会社三丰 Auxiliary measurement position coordinate determination system for use with a robot
CN110834322A (en) * 2018-08-16 2020-02-25 株式会社三丰 Robot system with auxiliary measuring position coordinate determination system

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
JP2816089B2 (en) * 1993-12-22 1998-10-27 松下電工株式会社 Robot path correction method
US5768759A (en) * 1996-11-19 1998-06-23 Zevatech, Inc. Method and apparatus for reflective in-flight component registration
JP2003117861A (en) 2001-10-15 2003-04-23 Denso Corp Position correcting system of robot
JP4267005B2 (en) * 2006-07-03 2009-05-27 ファナック株式会社 Measuring apparatus and calibration method
JP5849403B2 (en) 2011-02-15 2016-01-27 セイコーエプソン株式会社 Robot controller, robot, and robot system
JP6468741B2 (en) * 2013-07-22 2019-02-13 キヤノン株式会社 Robot system and robot system calibration method
US20160243703A1 (en) * 2015-02-19 2016-08-25 Isios Gmbh Arrangement and method for the model-based calibration of a robot in a working space
JP6622765B2 (en) * 2017-08-22 2019-12-18 ファナック株式会社 Robot system

Also Published As

Publication number Publication date
WO2020037147A1 (en) 2020-02-20
DE112019004129T5 (en) 2021-05-12
JP7431216B2 (en) 2024-02-14
JP2021534983A (en) 2021-12-16

Similar Documents

Publication Publication Date Title
US10751883B2 (en) Robot system with supplementary metrology position coordinates determination system
US10871366B2 (en) Supplementary metrology position coordinates determination system for use with a robot
US10913156B2 (en) Robot system with end tool metrology position coordinates determination system
CN109141223B (en) PSD-based laser interferometer light path efficient and accurate calibration method
Fan et al. A 6-degree-of-freedom measurement system for the accuracy of XY stages
US6067165A (en) Position calibrating method for optical measuring apparatus
US11745354B2 (en) Supplementary metrology position coordinates determination system including an alignment sensor for use with a robot
US11002529B2 (en) Robot system with supplementary metrology position determination system
US11754934B2 (en) Projection exposure apparatus for semiconductor lithography having an optical element with sensor reference and method for aligning the sensor reference
CN113091653B (en) Device and method for measuring angle freedom degree error of linear guide rail based on pentaprism
JP7431216B2 (en) Supplementary metrology position coordinate determination system including alignment sensors used with robots
KR102492492B1 (en) Charged particle beam apparatus and sample alignment method of charged particle beam apparatus
JP4500729B2 (en) Surface shape measuring device
JP6706164B2 (en) Alignment apparatus, exposure apparatus, and alignment method
US20230204340A1 (en) Metrology system with position and orientation tracking utilizing light beams
US20240069178A1 (en) Method for determining a current position and/or orientation of a laser radar relative to an object to be measured
US11442427B2 (en) Multiaxis machining device and compensation method thereof
CN114367973A (en) Robot system with supplemental metering position determination system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination