TW202102347A - Calibration method of vision-guided robot arm only needing to specify a positioning mark in the calibration target to perform calibration - Google Patents

Calibration method of vision-guided robot arm only needing to specify a positioning mark in the calibration target to perform calibration

Info

Publication number
TW202102347A
Authority
TW
Taiwan
Prior art keywords: image, coordinate, calibration, coordinate system, axis
Application number
TW108123805A
Other languages
Chinese (zh)
Other versions
TWI699264B (en)
Inventor
洪興隆
黃眉瑜
高偉勛
賴傳釗
Original Assignee
上銀科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上銀科技股份有限公司
Priority to TW108123805A
Application granted
Publication of TWI699264B
Publication of TW202102347A

Landscapes

  • Manipulator (AREA)

Abstract

A calibration method for a vision-guided robot arm is applied to a robot arm and comprises the following steps: (A) setting the operating conditions; (B) placing the calibration target; (C) moving the TCP (tool center point); (D) moving an image sensor; (E) performing image analysis of the positioning mark; (F) calibrating image distance against real distance; (G) calculating the image correction data; (H) calculating the compensation amount of the image sensor coordinate system. As a result, the vision-guided robot arm calibration method provided by the present invention is not limited to a specific calibration target, such as a dot matrix; it only needs a designated positioning mark within the calibration target to perform calibration, which saves calibration time. In addition, determining coordinate positions through image analysis reduces the visual errors caused by human judgment.

Description

Calibration method of vision-guided robot arm

The present invention relates to robot arm calibration, and particularly to a calibration method for a vision-guided robot arm.

Conventionally, a vision-guided robot arm is a robot arm whose end effector carries an image sensor, such as a charge-coupled device (CCD), effectively giving the arm eyes. Once the image sensor has located the workpiece, the robot arm controller moves the end effector to the workpiece position to pick up or place an object.

However, before the pick-and-place operation described above can be performed, the robot arm must first undergo a vision-guided calibration procedure so that the controller can store the coordinate offset between the end effector and the lens of the image sensor.

In traditional vision-guided robot arm calibration, the calibration target is a dot-matrix pattern. Because a dot matrix is a regular, non-directional pattern, the user must designate three feature points on it in sequence. The operator then moves the robot arm to an appropriate height so that the camera can capture a complete image of the dot matrix; this position is the image calibration point. The user enters the image coordinates of the three feature points into the image processing software, together with the real-world center-to-center distance between the dots, and the software computes the transformation from the image coordinate system to the real-world coordinate system, thereby defining the real-world coordinate system inside the image processing software.

After the foregoing calibration procedure, the operator must also move the robot arm so that the tool working point of the arm's work tool visits the three feature points in sequence, recording the robot arm coordinates at each feature point. When calibration is complete, the robot arm controller automatically computes and defines the arm's base coordinate system from these recorded coordinates. The base coordinate system of the robot arm then coincides with the real-world coordinate system defined in the image processing software. Therefore, when the image processing software analyzes an image and converts the result into real-world coordinates of an object, the coordinates can be sent directly to the robot arm without additional conversion.

However, because traditional vision-guided robot arm calibration relies entirely on manual operation, the procedure is time-consuming and error-prone. In addition, whether the tool working point has been moved correctly to each feature point depends on visual confirmation by the operator, so different operators may produce different calibration results, introducing visual error.

As related art, US Patent No. 6,812,665 describes an offline relative calibration method that compensates for the error between the robot arm's tool center point (TCP) and the workpiece to create an accurate machining path. However, the robot arm must know the shape parameters of a standard workpiece in advance for standard-parameter calibration, and during online operation a force-feedback or displacement sensor is used to measure and compensate the error between the current workpiece and the standard workpiece parameters.

US Patent No. 7,019,825 describes a hand-eye calibration method in which a camera mounted at the end of the robot arm acquires at least two images of a workpiece. The arm is moved to acquire the images, and the rotation and translation between the arm and the camera are computed through a projection-invariant description. However, because at least two workpiece images are required for the projection-invariant computation, the imaged workpiece must contain sufficient edge information; otherwise the transformation requires time-consuming optimization and may not yield good results.

US Patent Publication No. 2005/0225278 A1 provides a measurement system that determines how the robot arm should move so that the position of the tool center point on a light-receiving surface moves to a predetermined point on that surface; the robot is moved to the determined position and the arm position is stored, thereby determining the position of the tool center point relative to the robot's tool mounting surface. In this approach, image calibration requires the robot arm to drive the center point of the calibration tool to the center point of the displayed image, which then serves as the basis for computing the common coordinate system. Manual point alignment by the operator is therefore cumbersome and time-consuming.

The main purpose of the present invention is to provide a calibration method for a vision-guided robot arm that saves calibration time and reduces error.

Accordingly, the calibration method for a vision-guided robot arm provided by the present invention is applied to a robot arm having a base. The end of the robot arm has a flange surface, and the robot arm is electrically connected to a controller that can input, output, store, process, and display data. The controller pre-stores a base coordinate system and a flange coordinate system. The base coordinate system is a coordinate space formed by mutually perpendicular X, Y, and Z axes and has a base coordinate origin; the robot arm has a working range. The flange coordinate system is a coordinate space formed by mutually perpendicular X1, Y1, and Z1 axes and has a flange coordinate origin. A work tool is installed on the flange surface and has a work tool center point. The controller defines a work tool coordinate system, a coordinate space formed by mutually perpendicular X2, Y2, and Z2 axes, whose origin is located at the work tool center point. An image sensor is installed on the flange surface and electrically connected to the controller; it contains an image sensing chip with an image sensing plane. The controller defines an image sensor first coordinate system, a coordinate space formed by mutually perpendicular X3, Y3, and Z3 axes, whose X3Y3 plane formed by the X3 and Y3 axes must be parallel to the image sensing plane of the image sensing chip and which has an image sensor first coordinate origin. The user can operate the controller to select the flange coordinate system, the work tool coordinate system, or the image sensor first coordinate system as the current coordinate system, i.e., the coordinate system currently in use. The method is characterized by the following steps. A) Setting the operating conditions: set, in the controller, a calibration height, a first calibration coordinate point, a second calibration coordinate point, a third calibration coordinate point, and a fourth calibration coordinate point in the base coordinate system. B) Placing the calibration target: place a calibration target, which has a positioning mark, within the working range of the robot arm. C) Moving the work tool center point: select the work tool coordinate system as the current coordinate system and operate the robot arm to move the work tool so that the work tool center point is moved onto the positioning mark; the controller stores the current position coordinate in the base coordinate system. D) Moving the image sensor: select the image sensor first coordinate system as the current coordinate system and add the calibration height; the controller moves the robot arm so that the image sensor first coordinate origin is moved to a calibration reference position coordinate directly above the positioning mark, differing only in the Z-axis coordinate by the calibration height. E) Positioning mark image analysis: the image sensor captures a positioning image containing the positioning mark; the controller, through image analysis software, defines a positioning image center, analyzes the positioning image, and obtains the position of the positioning mark relative to the positioning image center, yielding a positioning mark image coordinate. F) Calibrating image distance against real distance: operate the robot arm to move the image sensor so that the image sensor first coordinate origin is moved in turn to the first through fourth calibration coordinate points; at each point the image sensor captures a first, second, third, and fourth image respectively, and the controller analyzes them through the image analysis software to obtain a first, second, third, and fourth corrected image coordinate of the positioning mark in the respective images. G) Calculating the image correction data: from the known coordinate values of the first through fourth calibration coordinate points in the base coordinate system and from the first through fourth corrected image coordinates, the image correction data can be computed; the image correction data describes the conversion between distances in the image and distances in the real world. H) Calculating the compensation amount of the image sensor coordinate system: using the positioning mark image coordinate and the image correction data, compute a compensation amount for the image sensor first coordinate system that compensates the error between a position in the image sensor's image and the position of the work tool.
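
The eight steps can be summarized as a short control-flow outline. The sketch below is only illustrative: every object and function name (controller, arm, sensor, analyzer, move_sensor_origin_to, and so on) is an assumption, since the patent defines no programming interface, and fit_affine is sketched in the step G) example further below.

```python
import numpy as np

def calibrate(controller, arm, sensor, analyzer):
    # A) operating conditions: calibration height Zcal and four coplanar points P1..P4
    z_cal = 120.0                                        # assumed value, in mm
    cal_points = np.array([[350., -40.], [410., -40.],
                           [410.,  20.], [350.,  20.]])  # X, Y in the base coordinate system

    # B) a calibration target with one positioning mark lies in the working range
    # C) jog the tool center point onto the mark and store the current position Psp
    psp = controller.current_position()

    # D) switch to the image sensor first coordinate system and rise by Zcal
    pcp = psp + np.array([0.0, 0.0, z_cal])
    arm.move_sensor_origin_to(pcp)

    # E) capture the positioning image; Xcs is the mark offset from the image center
    xcs = analyzer.mark_offset_from_center(sensor.capture())

    # F) visit P1..P4 at the same height and record the mark's image coordinates
    image_coords = []
    for p in cal_points:
        arm.move_sensor_origin_to([p[0], p[1], pcp[2]])
        image_coords.append(analyzer.mark_offset_from_center(sensor.capture()))

    # G) fit the image correction data (affine transform, see the step G) example)
    T = fit_affine(np.array(image_coords), cal_points)

    # H) convert Xcs into the image sensor first coordinate system compensation amount
    return np.append(xcs, 1.0) @ T - np.array([0.0, 0.0, 1.0]) @ T
```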

With the method provided above, the vision-guided robot arm calibration method of the present invention is not limited to a specific calibration target such as a dot matrix; it only requires a designated positioning mark within the calibration target, which saves calibration time. In addition, determining coordinate positions through image analysis also reduces the visual error caused by human judgment.

It is worth mentioning that in step A) the Z-axis components of the first through fourth calibration coordinate points are identical, i.e., the points lie at the same height.

In addition, in the vision-guided robot arm calibration method of claim 1, four or more calibration coordinate points are required. However, the more points used for calibration, the larger the amount of computation, the longer the computation time, and the higher the computation cost, so an appropriate number of calibration points should be chosen; in this embodiment, four-point calibration is used.

In addition, in step G) the image correction data is computed as follows. Let the coordinates of the first through fourth calibration coordinate points be $(X_i, Y_i)$, $i = 1, \ldots, 4$, and let the corresponding first through fourth corrected image coordinates be $(x_i, y_i)$, $i = 1, \ldots, 4$. Written in matrix form:

$$P = \begin{bmatrix} X_1 & Y_1 \\ X_2 & Y_2 \\ X_3 & Y_3 \\ X_4 & Y_4 \end{bmatrix}, \qquad C = \begin{bmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \\ x_4 & y_4 & 1 \end{bmatrix}$$

The matrix $P$ is formed from the first through fourth calibration coordinate points in the base coordinate system, and the matrix $C$ is formed from the first through fourth corrected image coordinates in image space. They are related by

$$P = C\,T$$

where the matrix $T$ is the affine transformation matrix between the two planar coordinate systems. By computing the Moore-Penrose pseudo-inverse $C^{+}$ of the matrix $C$, the matrix $T$ can be obtained:

$$T = C^{+} P$$

The generalized inverse $C^{+}$ can be solved by singular value decomposition (SVD), and the matrix $T$ is the image correction data, describing the conversion between distances in the image and distances in the real world.

Furthermore, in step H) the image sensor first coordinate system compensation amount can be set in the controller to generate a sensor second coordinate system.

To describe the technical features of the present invention in detail, the following preferred embodiment is described with reference to the drawings, in which:

As shown in Figs. 1-4, a calibration method for a vision-guided robot arm provided by a preferred embodiment of the present invention is applied to a robot arm 10, which is a six-axis robot arm having a base 11. The end of the robot arm 10 has a flange surface 12 for attaching objects. The robot arm 10 is electrically connected to a controller 13, which can input, output, store, process, and display data. When the robot arm 10 leaves the factory, the controller 13 pre-stores a base coordinate system and a flange coordinate system. The base coordinate system is a coordinate space formed by mutually perpendicular X, Y, and Z axes and has a base coordinate origin; in this embodiment the origin is located at the base 11, but it is not limited thereto and may be placed elsewhere. The robot arm 10 has a working range in the base coordinate system. The flange coordinate system is a coordinate space formed by mutually perpendicular X1, Y1, and Z1 axes and has a flange coordinate origin; in this embodiment the flange coordinate origin is located at the geometric center of the flange surface 12. The relationship between the flange coordinate system and the base coordinate system is expressed by (x1, y1, z1, a1, b1, c1), where
x1: the distance of the flange coordinate system's X1 axis from the base coordinate system's X axis
y1: the distance of the flange coordinate system's Y1 axis from the base coordinate system's Y axis
z1: the distance of the flange coordinate system's Z1 axis from the base coordinate system's Z axis
a1: the rotation angle of the flange coordinate system's X1 axis about the base coordinate system's X axis
b1: the rotation angle of the flange coordinate system's Y1 axis about the base coordinate system's Y axis
c1: the rotation angle of the flange coordinate system's Z1 axis about the base coordinate system's Z axis
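
The six parameters relating two coordinate systems can be assembled into a homogeneous transform. The sketch below assumes the three angles are rotations about the reference frame's X, Y, and Z axes applied in that order; the patent does not state the rotation convention, so this is only one plausible reading, and the numeric values at the end are invented for illustration.

```python
import numpy as np

def pose_to_matrix(x, y, z, a, b, c):
    """Build a 4x4 homogeneous transform from a translation (x, y, z) and
    rotation angles a, b, c (degrees) about the reference frame's X, Y, Z axes.
    The X-then-Y-then-Z order is an assumption, not specified by the patent."""
    a, b, c = np.radians([a, b, c])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(a), -np.sin(a)],
                   [0, np.sin(a),  np.cos(a)]])
    ry = np.array([[ np.cos(b), 0, np.sin(b)],
                   [0, 1, 0],
                   [-np.sin(b), 0, np.cos(b)]])
    rz = np.array([[np.cos(c), -np.sin(c), 0],
                   [np.sin(c),  np.cos(c), 0],
                   [0, 0, 1]])
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx          # apply the X, then Y, then Z rotation
    t[:3, 3] = [x, y, z]
    return t

# Example: a flange frame offset from the base frame and rotated 90 degrees about Z
flange_in_base = pose_to_matrix(400.0, 0.0, 500.0, 0.0, 0.0, 90.0)  # assumed values
```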

A work tool 15 is installed on the flange surface 12; in this embodiment the work tool 15 is, for example, a suction cup, but it is not limited thereto. The work tool 15 has a tool center point (TCP). The user sets a work tool coordinate system in the controller 13; the work tool coordinate system is a coordinate space formed by mutually perpendicular X2, Y2, and Z2 axes and has a work tool coordinate origin located at the work tool center point TCP. The relationship between the work tool coordinate system and the flange coordinate system is expressed by (x2, y2, z2, a2, b2, c2), where
x2: the distance of the work tool coordinate system's X2 axis from the flange coordinate system's X1 axis
y2: the distance of the work tool coordinate system's Y2 axis from the flange coordinate system's Y1 axis
z2: the distance of the work tool coordinate system's Z2 axis from the flange coordinate system's Z1 axis
a2: the rotation angle of the work tool coordinate system's X2 axis about the flange coordinate system's X1 axis
b2: the rotation angle of the work tool coordinate system's Y2 axis about the flange coordinate system's Y1 axis
c2: the rotation angle of the work tool coordinate system's Z2 axis about the flange coordinate system's Z1 axis

An image sensor 17, in this embodiment a charge-coupled device (CCD), is installed on the flange surface 12 and electrically connected to the controller 13; the image sensor 17 is used to capture images. It should be noted that the image sensor 17 contains an image sensing chip 171, which has an image sensing plane 171a. The user sets an image sensor first coordinate system in the controller 13, a coordinate space formed by mutually perpendicular X3, Y3, and Z3 axes; the X3Y3 plane formed by the X3 and Y3 axes must be parallel to the image sensing plane 171a of the image sensing chip 171. The image sensor first coordinate system has an image sensor first coordinate origin; in this embodiment the origin lies on the image sensing plane 171a. The relationship between the image sensor first coordinate system and the flange coordinate system is expressed by (x3, y3, z3, a3, b3, c3), where
x3: the distance of the image sensor first coordinate system's X3 axis from the flange coordinate system's X1 axis
y3: the distance of the image sensor first coordinate system's Y3 axis from the flange coordinate system's Y1 axis
z3: the distance of the image sensor first coordinate system's Z3 axis from the flange coordinate system's Z1 axis
a3: the rotation angle of the image sensor first coordinate system's X3 axis about the flange coordinate system's X1 axis
b3: the rotation angle of the image sensor first coordinate system's Y3 axis about the flange coordinate system's Y1 axis
c3: the rotation angle of the image sensor first coordinate system's Z3 axis about the flange coordinate system's Z1 axis

It should also be noted that the user can operate the controller 13 to select the flange coordinate system, the work tool coordinate system, or the image sensor first coordinate system as a current coordinate system, which is the coordinate system currently in use. The user specifies a position point in the base coordinate system, and after the current coordinate system has been selected, the controller 13 moves the origin of the current coordinate system to that position point and makes the current coordinate system's X1Y1 plane, X2Y2 plane, or X3Y3 plane parallel to the XY plane of the base coordinate system. For example, when the user selects the work tool coordinate system as the current coordinate system, the controller 13 controls the robot arm 10 so that the work tool coordinate origin moves to the position point and the X2Y2 plane formed by the X2 and Y2 axes of the tool coordinate system is parallel to the XY plane formed by the X and Y axes of the base coordinate system. Likewise, when the user selects the image sensor first coordinate system as the current coordinate system, the controller 13 controls the robot arm 10 so that the image sensor first coordinate origin moves to the position point and the X3Y3 plane formed by the X3 and Y3 axes of the image sensor first coordinate system is parallel to the XY plane formed by the X and Y axes of the base coordinate system.

As shown in Fig. 3, the calibration method for a vision-guided robot arm provided by the present invention comprises the following steps:

A) Setting the operating conditions

The user sets, in the controller 13, a calibration height Zcal, a first calibration coordinate point P1, a second calibration coordinate point P2, a third calibration coordinate point P3, and a fourth calibration coordinate point P4 in the base coordinate system. It should be noted that the Z-axis components of the first through fourth calibration coordinate points P1-P4 are identical, i.e., the points lie at the same height.
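
As a concrete illustration of step A), the calibration height and the four coplanar calibration points might be stored as below; the numeric values are invented for the example, and the only structural requirement from the text is that the points share the same Z component.

```python
import numpy as np

z_cal = 120.0  # calibration height Zcal, in mm (assumed value)

# P1..P4 in the base coordinate system; only the X and Y components differ.
cal_points = np.array([
    [350.0, -40.0, 200.0],   # P1
    [410.0, -40.0, 200.0],   # P2
    [410.0,  20.0, 200.0],   # P3
    [350.0,  20.0, 200.0],   # P4
])
assert np.all(cal_points[:, 2] == cal_points[0, 2])  # same height, as step A) requires
```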

B) Placing the calibration target

The user places a calibration target 18 within the working range of the robot arm 10. The calibration target 18 has a positioning mark 181; in this embodiment the positioning mark 181 is a dot, but it is not limited to a dot.

C) Moving the work tool center point

Select the work tool coordinate system as the current coordinate system and operate the robot arm 10 to move the work tool 15 so that the work tool center point TCP is moved onto the positioning mark 181. The controller 13 stores a current position coordinate Psp in the base coordinate system.

D) Moving the image sensor

Select the image sensor first coordinate system as the current coordinate system and add the calibration height Zcal. The controller 13 controls the robot arm 10 to move the image sensor 17 so that the image sensor first coordinate origin is moved to a calibration reference position coordinate Pcp located above the positioning mark 181. In the base coordinate system, the calibration reference position coordinate Pcp differs from the current position coordinate Psp only in the Z-axis value, by the calibration height Zcal; the X-axis and Y-axis components are identical.
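
Step D) therefore changes only the Z component of the stored position: Pcp is Psp raised by the calibration height. A one-line sketch with assumed values:

```python
import numpy as np

psp = np.array([380.0, -10.0, 200.0])    # current position Psp from step C) (assumed values)
z_cal = 120.0                            # calibration height Zcal
pcp = psp + np.array([0.0, 0.0, z_cal])  # Pcp: same X and Y as Psp, Z raised by Zcal
```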

E) Positioning mark image analysis

The image sensor 17 captures a positioning image, which is an image containing the positioning mark 181. The controller 13 uses image analysis software to set a positioning image center in the positioning image and to analyze the positioning image; in this embodiment the positioning image center is the geometric center of the positioning image, but it is not limited thereto. Through the image analysis software, the position of the positioning mark in the positioning image relative to the positioning image center is obtained, giving the controller 13 a positioning mark image coordinate Xcs.

In addition, the image analysis software mentioned above is ordinary, commercially available image analysis software used to identify an object in an image and determine its coordinate position within the image, so it is not described further here.
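
Since the patent leaves the analysis software unspecified, one possible sketch of step E) uses OpenCV to locate a dot-shaped mark and report its offset from the image center, which is what the positioning mark image coordinate Xcs represents. The thresholding, the "largest blob" selection, and the OpenCV 4.x return signatures below are assumptions; any detector that yields the mark centroid would serve.

```python
import cv2
import numpy as np

def mark_offset_from_center(gray_image):
    """Return the (x, y) offset, in pixels, of the largest dark blob's centroid
    from the image center. Assumes a dark mark on a bright background."""
    _, binary = cv2.threshold(gray_image, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)      # OpenCV 4.x signature
    mark = max(contours, key=cv2.contourArea)                    # assume the mark is the largest blob
    m = cv2.moments(mark)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]            # mark centroid in pixels
    h, w = gray_image.shape[:2]
    return np.array([cx - w / 2.0, cy - h / 2.0])                # offset from the image center

# Synthetic check: a dark dot drawn off-center yields a nonzero offset (about [30, 20])
img = np.full((480, 640), 255, np.uint8)
cv2.circle(img, (350, 260), 12, 0, -1)
xcs = mark_offset_from_center(img)
```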

F) Calibrating image distance against real distance

Operate the robot arm 10 to move the image sensor 17 so that the image sensor first coordinate origin is moved to the first through fourth calibration coordinate points P1-P4. When the image sensor first coordinate origin is at the first through fourth calibration coordinate points P1-P4, the image sensor 17 captures a first image, a second image, a third image, and a fourth image, respectively. The controller 13 analyzes the first through fourth images through the image analysis software and obtains a first corrected image coordinate Xc1, a second corrected image coordinate Xc2, a third corrected image coordinate Xc3, and a fourth corrected image coordinate Xc4 of the positioning mark 181 in the respective images.

G) Calculating the image correction data

The coordinate values of the first through fourth calibration coordinate points P1-P4 in the base coordinate system (real space) are known, as are the first corrected image coordinate Xc1, the second corrected image coordinate Xc2, the third corrected image coordinate Xc3, and the fourth corrected image coordinate Xc4 of the positioning mark 181 in the first through fourth images (image space). From these, the relationship between distances in the image and distances in real space (the base coordinate system) can be calculated, yielding the image correction data. Through the image correction data, the conversion between distances in the image and distances in the real world is known.

It should be noted that this embodiment uses four-point calibration as an example, but the number of points is not limited to four; any number of four or more may be used. The more coordinate points used for calibration, the larger the amount of computation, the longer the computation time, and the higher the computation cost, so an appropriate number of calibration points should be chosen; in this embodiment, four-point calibration is used.

The image correction data in this embodiment is computed as follows, but the method is not limited thereto.

Let the coordinates of the first through fourth calibration coordinate points P1-P4 be $(X_i, Y_i)$, $i = 1, \ldots, 4$, and let the corresponding first through fourth corrected image coordinates be $(x_i, y_i)$, $i = 1, \ldots, 4$. Written in matrix form:

$$P = \begin{bmatrix} X_1 & Y_1 \\ X_2 & Y_2 \\ X_3 & Y_3 \\ X_4 & Y_4 \end{bmatrix}, \qquad C = \begin{bmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \\ x_4 & y_4 & 1 \end{bmatrix}$$

The matrix $P$ is formed from the first through fourth calibration coordinate points P1-P4 in the base coordinate system, and the matrix $C$ is formed from the first through fourth corrected image coordinates in image space. They are related by

$$P = C\,T$$

where the matrix $T$ is the affine transformation matrix between the two planar coordinate systems. By computing the Moore-Penrose pseudo-inverse $C^{+}$ of the matrix $C$, the matrix $T$ can be obtained:

$$T = C^{+} P$$

The generalized inverse $C^{+}$ can be solved by singular value decomposition (SVD). The matrix $T$ is the image correction data, describing the conversion between distances in the image and distances in the real world.
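
A minimal numerical sketch of step G), assuming the image correction data is the affine matrix T fitted through the Moore-Penrose pseudo-inverse (numpy.linalg.pinv computes it via SVD internally). The corrected image coordinates and calibration points below are invented purely for illustration.

```python
import numpy as np

def fit_affine(image_xy, world_xy):
    """Fit the 3x2 affine matrix T such that [u, v, 1] @ T ~= [X, Y],
    using the Moore-Penrose pseudo-inverse (computed by SVD inside pinv)."""
    c = np.hstack([image_xy, np.ones((len(image_xy), 1))])  # image coordinates, augmented with 1
    return np.linalg.pinv(c) @ world_xy                     # T = C+ P

# Corrected image coordinates Xc1..Xc4 (pixels) and calibration points P1..P4 (mm); assumed values.
image_xy = np.array([[-120.0, -80.0], [120.0, -80.0], [120.0, 80.0], [-120.0, 80.0]])
world_xy = np.array([[350.0, -40.0], [410.0, -40.0], [410.0, 20.0], [350.0, 20.0]])
T = fit_affine(image_xy, world_xy)       # the image correction data

# Check: mapping the image center back to the base coordinate system
print(np.array([0.0, 0.0, 1.0]) @ T)     # roughly the center of P1..P4, i.e. [380., -10.]
```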

H) Calculating the image sensor first coordinate system compensation amount

Using the positioning mark image coordinate Xcs and the image correction data, a compensation amount for the image sensor first coordinate system is calculated.

Ideally, since the X2Y2 plane formed by the X2 and Y2 axes of the tool coordinate system and the X3Y3 plane formed by the X3 and Y3 axes of the image sensor first coordinate system are both parallel to the XY plane formed by the X and Y axes of the base coordinate system, and the calibration reference position coordinate Pcp differs from the current position coordinate Psp only by the calibration height Zcal with no difference in the X-axis or Y-axis components, then, if the transformation between the tool coordinate system and the image sensor first coordinate system were ideal, the positioning mark in the positioning image would lie exactly at the positioning image center; equivalently, the position of the positioning mark 181 in the work tool coordinate system would coincide with the image center in the image sensor coordinate system. In that case, once the image correction data (the ratio between distances in the image and distances in the real world) is known, the user could intuitively operate the controller 13 to control the robot arm 10 and the work tool 15 from the images captured by the image sensor 17 together with the image correction data.

In practice, however, there is an error between the position of the positioning mark 181 in the image and the image center, and an image compensation amount is needed to compensate it. Since the positioning mark image coordinate Xcs is precisely the coordinate of the positioning mark 181 in the positioning image with the positioning image center as the origin, the coordinate value Xcs can be converted into the image compensation amount, which expresses the error that must be compensated in the image when converting between the tool coordinate system and the image sensor first coordinate system. To control the work tool intuitively, centered on the positioning mark 181, through the images captured by the image sensor 17, it suffices to add the image compensation amount to the images captured by the image sensor 17 so that the positioning mark in the frame lies at the center of the frame, letting the user operate the work tool intuitively from the sensor's view. As for the controller 13, the image sensor first coordinate system compensation amount is required to control the movement of the work tool and to compensate the error between a position in the image of the image sensor 17 and the position of the work tool.
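
Continuing the sketch, step H) converts the positioning mark image coordinate Xcs into the image sensor first coordinate system compensation amount by applying the image correction data T; under the assumptions used in the earlier examples this reduces to a single matrix product, with the translation part of T cancelling out.

```python
import numpy as np

# T from the step G) example and the mark offset Xcs from the step E) example
# (all values assumed for illustration).
T = np.array([[0.25, 0.0], [0.0, 0.375], [380.0, -10.0]])
xcs = np.array([30.0, 20.0])             # positioning mark offset from the image center, pixels

world_of_mark = np.append(xcs, 1.0) @ T            # where the mark actually lies in base coordinates
world_of_center = np.array([0.0, 0.0, 1.0]) @ T    # where the image center points
compensation = world_of_mark - world_of_center     # image sensor first coordinate system offset
print(compensation)                                # [7.5, 7.5] mm for these assumed numbers
```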

It is worth mentioning that the image sensor first coordinate system compensation amount can also be set in the controller 13 to generate a sensor second coordinate system. In this way the compensation amount need not be added to every image captured by the image sensor 17; instead, whenever the robot arm 10 moves the image sensor 17, the image sensor first coordinate system compensation amount is applied directly to the moved position of the sensor 17, which is convenient for the user.

With the method provided above, the vision-guided robot arm calibration method of the present invention is not limited to a specific calibration target such as a dot matrix; it only requires a designated positioning mark within the calibration target, which saves calibration time. In addition, determining coordinate positions through image analysis also reduces the visual error caused by human judgment.

10: robot arm
11: base
12: flange surface
13: controller
15: work tool
17: image sensor
171: image sensing chip
171a: image sensing plane
18: calibration target
181: positioning mark
P1: first calibration coordinate point
P2: second calibration coordinate point
P3: third calibration coordinate point
P4: fourth calibration coordinate point
Psp: current position coordinate
Pcp: calibration reference position coordinate
TCP: work tool center point
Xcs: positioning mark image coordinate
Xc1: first corrected image coordinate
Xc2: second corrected image coordinate
Xc3: third corrected image coordinate
Xc4: fourth corrected image coordinate
Zcal: calibration height
<Base coordinate system> X axis, Y axis, Z axis
<Flange coordinate system> X1 axis, Y1 axis, Z1 axis
<Work tool coordinate system> X2 axis, Y2 axis, Z2 axis
<Image sensor first coordinate system> X3 axis, Y3 axis, Z3 axis

Fig. 1 is a schematic diagram of the system of a preferred embodiment of the present invention, showing the robot arm. Fig. 2 is a schematic diagram of the calibration target of the preferred embodiment. Fig. 3 is a flow chart of the preferred embodiment. Fig. 4 is a schematic diagram of an image captured by the image sensor of the preferred embodiment, showing that the image contains the calibration target, the positioning mark, and the image center.

10: robot arm
11: base
12: flange surface
13: controller
15: work tool
17: image sensor
18: calibration target
181: positioning mark
<Base coordinate system> X axis, Y axis, Z axis
<Flange coordinate system> X1 axis, Y1 axis, Z1 axis
<Work tool coordinate system> X2 axis, Y2 axis, Z2 axis
<Image sensor first coordinate system> X3 axis, Y3 axis, Z3 axis

Claims (5)

1. A calibration method for a vision-guided robot arm, applied to a robot arm having a base, wherein: the end of the robot arm has a flange surface; the robot arm is electrically connected to a controller that can input, output, store, process, and display data; the controller pre-stores a base coordinate system and a flange coordinate system; the base coordinate system is a coordinate space formed by mutually perpendicular X, Y, and Z axes and has a base coordinate origin; the robot arm has a working range; the flange coordinate system is a coordinate space formed by mutually perpendicular X1, Y1, and Z1 axes and has a flange coordinate origin; a work tool is installed on the flange surface and has a work tool center point; the controller defines a work tool coordinate system, a coordinate space formed by mutually perpendicular X2, Y2, and Z2 axes, having a work tool coordinate origin located at the work tool center point; an image sensor is installed on the flange surface and electrically connected to the controller; the image sensor contains an image sensing chip having an image sensing plane; the controller defines an image sensor first coordinate system, a coordinate space formed by mutually perpendicular X3, Y3, and Z3 axes, whose X3Y3 plane formed by the X3 and Y3 axes must be parallel to the image sensing plane of the image sensing chip and which has an image sensor first coordinate origin; and a user can operate the controller to select the flange coordinate system, the work tool coordinate system, or the image sensor first coordinate system as a current coordinate system, the current coordinate system being the coordinate system currently in use; the vision-guided robot arm calibration method comprising the following steps:
A) setting the operating conditions: setting, in the controller, a calibration height, a first calibration coordinate point, a second calibration coordinate point, a third calibration coordinate point, and a fourth calibration coordinate point in the base coordinate system;
B) placing the calibration target: placing a calibration target, which has a positioning mark, within the working range of the robot arm;
C) moving the work tool center point: selecting the work tool coordinate system as the current coordinate system and operating the robot arm to move the work tool so that the work tool center point is moved onto the positioning mark, the controller storing a current position coordinate in the base coordinate system;
D) moving the image sensor: selecting the image sensor first coordinate system as the current coordinate system and adding the calibration height, the controller controlling the robot arm to move the image sensor so that the image sensor first coordinate origin is moved to a calibration reference position coordinate located above the positioning mark and differing only in the Z-axis coordinate by the calibration height;
E) positioning mark image analysis: the image sensor capturing a positioning image, which is an image containing the positioning mark; the controller, through image analysis software, setting a positioning image center in the positioning image and analyzing the positioning image; and obtaining, through the image analysis software, the position of the positioning mark in the positioning image relative to the positioning image center, so that the controller obtains a positioning mark image coordinate;
F) calibrating image distance against real distance: operating the robot arm to move the image sensor so that the image sensor first coordinate origin is moved to the first through fourth calibration coordinate points; the image sensor capturing a first image, a second image, a third image, and a fourth image when the image sensor first coordinate origin is at the first through fourth calibration coordinate points, respectively; and the controller analyzing the first through fourth images through the image analysis software to obtain a first corrected image coordinate, a second corrected image coordinate, a third corrected image coordinate, and a fourth corrected image coordinate of the positioning mark in the respective images;
G) calculating the image correction data: from the known coordinate values of the first through fourth calibration coordinate points in the base coordinate system and from the first through fourth corrected image coordinates, calculating image correction data that describes the conversion between distances in the image and distances in the real world; and
H) calculating the compensation amount of the image sensor coordinate system: using the positioning mark image coordinate and the image correction data, calculating an image sensor first coordinate system compensation amount that compensates the error between a position in the image sensor's image and the position of the work tool.

2. The vision-guided robot arm calibration method as claimed in claim 1, wherein in step A) the Z-axis components of the first through fourth calibration coordinate points are identical, i.e., the points lie at the same height.

3. The vision-guided robot arm calibration method as claimed in claim 1, wherein four or more calibration coordinate points are required.

4. The vision-guided robot arm calibration method as claimed in claim 1, wherein in step G) the image correction data is computed as follows: let the coordinates of the first through fourth calibration coordinate points be $(X_i, Y_i)$, $i = 1, \ldots, 4$, and the corresponding first through fourth corrected image coordinates be $(x_i, y_i)$, $i = 1, \ldots, 4$, written in matrix form as

$$P = \begin{bmatrix} X_1 & Y_1 \\ X_2 & Y_2 \\ X_3 & Y_3 \\ X_4 & Y_4 \end{bmatrix}, \qquad C = \begin{bmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \\ x_4 & y_4 & 1 \end{bmatrix}$$

where the matrix $P$ is formed from the first through fourth calibration coordinate points in the base coordinate system and the matrix $C$ is formed from the first through fourth corrected image coordinates in image space, the two matrices being related by

$$P = C\,T$$

with the matrix $T$ being the affine transformation matrix between the two planar coordinate systems; the matrix $T$ is obtained by computing the Moore-Penrose pseudo-inverse $C^{+}$ of the matrix $C$:

$$T = C^{+} P$$

the generalized inverse $C^{+}$ can be solved by singular value decomposition (SVD), and the matrix $T$ is the image correction data, describing the conversion between distances in the image and distances in the real world.

5. The vision-guided robot arm calibration method as claimed in claim 1, wherein in step H) the image sensor first coordinate system compensation amount is set in the controller to generate a sensor second coordinate system.
TW108123805A 2019-07-05 2019-07-05 Correction method of vision guided robotic arm TWI699264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW108123805A TWI699264B (en) 2019-07-05 2019-07-05 Correction method of vision guided robotic arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW108123805A TWI699264B (en) 2019-07-05 2019-07-05 Correction method of vision guided robotic arm

Publications (2)

Publication Number Publication Date
TWI699264B (en) 2020-07-21
TW202102347A 2021-01-16

Family

ID=72602131

Family Applications (1)

Application Number Title Priority Date Filing Date
TW108123805A TWI699264B (en) 2019-07-05 2019-07-05 Correction method of vision guided robotic arm

Country Status (1)

Country Link
TW (1) TWI699264B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI776694B (en) * 2021-09-30 2022-09-01 台達電子工業股份有限公司 Automatic robot arm system and method of coordinating robot arm and computer vision thereof
US11958200B2 (en) 2021-09-30 2024-04-16 Delta Electronics, Inc. Automatic robotic arm system and coordinating method for robotic arm and computer vision thereof

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI748626B (en) * 2020-08-31 2021-12-01 財團法人工業技術研究院 Calibration method of tool center point, teaching method for mechanical arm and robot arm system using the same
TWI724977B (en) * 2020-09-29 2021-04-11 台達電子工業股份有限公司 Calibration apparatus and calibration method for coordinate system of robotic arm

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9393694B2 (en) * 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
TWI408037B (en) * 2010-12-03 2013-09-11 Ind Tech Res Inst A position method and a calibrating method for the robot arm
DE102016116702B4 (en) * 2015-09-14 2019-01-24 Fanuc Corporation Measuring system for calibrating the mechanical parameters of a robot
TWI693990B (en) * 2017-07-13 2020-05-21 達明機器人股份有限公司 Device and method for calibrating end-effector of robot arm


Also Published As

Publication number Publication date
TWI699264B (en) 2020-07-21
