TWI776694B - Automatic robot arm system and method of coordinating robot arm and computer vision thereof - Google Patents


Info

Publication number
TWI776694B
Authority
TW
Taiwan
Prior art keywords
robotic arm
calibration
image
target
control
Prior art date
Application number
TW110136446A
Other languages
Chinese (zh)
Other versions
TW202315721A (en)
Inventor
陳鴻欣
余家潤
黃啟銘
張晉綸
張耿寧
Original Assignee
台達電子工業股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 台達電子工業股份有限公司 filed Critical 台達電子工業股份有限公司
Priority to TW110136446A priority Critical patent/TWI776694B/en
Application granted granted Critical
Publication of TWI776694B publication Critical patent/TWI776694B/en
Publication of TW202315721A publication Critical patent/TW202315721A/en

Landscapes

  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

An automatic robot arm system and a method of coordinating a robot arm and computer vision thereof are disclosed. A beam-splitting mirror splits the incident light into visible light and ranging light and guides them respectively to an image capturing device and an optical ranging device on different reference axes. In a calibration mode, a transformation relationship is computed based on the calibration postures and the corresponding calibration images. In a work mode, a mechanical space coordinate is computed based on the captured work image and the transformation relationship, and the robot arm is controlled to move based on the mechanical space coordinate. The present disclosure makes the capturing optical axis overlap the flange axis of the robot arm and improves the coordination between the robot arm and the computer vision.

Description

自動化機械手臂系統與機械手臂與其電腦視覺之間的協調方法 Automated robotic arm system and method of coordinating the robotic arm and computer vision thereof

本發明係與機械手臂有關，特別是有關於自動化機械手臂系統與其機械手臂與電腦視覺之間的協調方法。 The present invention relates to robotic arms, and in particular to an automated robotic arm system and a method of coordinating its robotic arm and computer vision.

現有的機械手臂系統中,是使用相機來拍攝工作對象的影像,透過影像分析來決定工作對象的位置,並控制機械手臂移動至所決定的位置來對工作對象進行動作。 In the existing robotic arm system, a camera is used to capture an image of the working object, the position of the working object is determined through image analysis, and the robotic arm is controlled to move to the determined position to act on the working object.

然而,現有的機械手臂系統缺點在於,若將機械手臂與相機採一上一下的同軸配置,則會嚴重縮減機械手臂的可加工範圍,並嚴重限制相機的體積上限。 However, the disadvantage of the existing robotic arm system is that if the robotic arm and the camera are coaxially configured one above the other, the processing range of the robotic arm will be severely reduced, and the upper limit of the camera volume will be severely limited.

若將機械手臂與相機採非同軸設置,則機械手臂的法蘭軸與相機的拍攝光軸之間存在偏置量。前述偏置量會導致以拍攝影像為基準的視覺空間與機械手臂的機械空間之間存在隨機誤差,而使得電腦視覺無法精準地控制機械手臂。 If the manipulator and the camera are set non-coaxially, there will be an offset between the flange axis of the manipulator and the shooting optical axis of the camera. The aforementioned offset will cause random errors between the visual space based on the captured image and the mechanical space of the robotic arm, making it impossible for computer vision to accurately control the robotic arm.

是以,現有機械手臂系統存在上述問題,而亟待更有效的方案被提出。 Therefore, the existing robotic arm systems have the above problems, and more effective solutions are urgently needed.

本發明之主要目的，係在於提供一種自動化機械手臂系統與其機械手臂與電腦視覺之間的協調方法，可使拍攝光軸與法蘭軸疊合，量測目標距離，並協調機械手臂與電腦視覺。 The main purpose of the present invention is to provide an automated robotic arm system and a method of coordinating its robotic arm and computer vision, so that the capturing optical axis and the flange axis can be superimposed, the target distance can be measured, and the robotic arm and the computer vision can be coordinated.

於一實施例中，一種機械手臂與電腦視覺之間的協調方法，包含：於一校正模式下，基於一光學測距裝置所量測的一目標距離控制一機械手臂於一影像擷取裝置的一有效拍攝範圍內移動為多個校正姿態，並透過該影像擷取裝置於該多個校正姿態分別拍攝多個校正影像，其中一分光鏡將可見光導引至設置於該機械手臂的一法蘭軸外的該影像擷取裝置，並將測距光導引至該光學測距裝置，該光學測距裝置的測距軸平行或重疊該法蘭軸；基於該多個校正姿態與該多個校正影像計算該影像擷取裝置的視覺空間與該機械手臂的機械空間之間的一轉換關係；於一工作模式下，透過該影像擷取裝置拍攝一工作影像，並基於該工作影像及該轉換關係決定執行工作的一機械空間座標；及，控制該機械手臂移動至該機械空間座標。 In one embodiment, a method of coordinating a robotic arm and computer vision includes: in a calibration mode, controlling a robotic arm, based on a target distance measured by an optical ranging device, to move into a plurality of calibration postures within an effective shooting range of an image capturing device, and capturing a plurality of calibration images in the plurality of calibration postures through the image capturing device, wherein a beam splitter guides visible light to the image capturing device, which is disposed outside a flange axis of the robotic arm, and guides ranging light to the optical ranging device, a ranging axis of the optical ranging device being parallel to or overlapping the flange axis; computing, based on the plurality of calibration postures and the plurality of calibration images, a transformation relationship between the visual space of the image capturing device and the mechanical space of the robotic arm; in a work mode, capturing a work image through the image capturing device, and determining, based on the work image and the transformation relationship, a mechanical space coordinate at which work is to be performed; and controlling the robotic arm to move to the mechanical space coordinate.

於一實施例中，一種自動化機械手臂系統，包含一機械手臂、一影像擷取裝置、一光學測距裝置、一光路結構及控制裝置。該機械手臂用以於一立體空間中移動。該影像擷取裝置設置於該機械手臂的法蘭軸外，並用以拍攝影像。該光學測距裝置，設置於該機械手臂上，用以量測一目標距離，該光學測距裝置的測距軸平行或重疊該法蘭軸。該光路結構包含一分光鏡，該分光鏡用以將可見光導引至該影像擷取裝置並將測距光導引至該光學測距裝置。該控制裝置連接該機械手臂、該影像擷取裝置及該光學測距裝置，該控制裝置被設定為於一校正模式下，基於該目標距離控制該機械手臂於該影像擷取裝置的一有效拍攝距離內移動為多個校正姿態，並控制該影像擷取裝置於該多個校正姿態分別拍攝多個校正影像，基於該多個校正姿態與該多個校正影像計算該影像擷取裝置的視覺空間與該機械手臂的機械空間之間的一轉換關係。該控制裝置被設定為於一工作模式下，控制該影像擷取裝置拍攝一工作影像，基於該工作影像及該轉換關係決定執行工作的一機械空間座標，並控制該機械手臂移動至該機械空間座標。 In one embodiment, an automated robotic arm system includes a robotic arm, an image capturing device, an optical ranging device, an optical path structure and a control device. The robotic arm is used to move in a three-dimensional space. The image capturing device is disposed outside the flange axis of the robotic arm and is used to capture images. The optical ranging device is disposed on the robotic arm to measure a target distance, and the ranging axis of the optical ranging device is parallel to or overlaps the flange axis. The optical path structure includes a beam splitter that guides visible light to the image capturing device and guides ranging light to the optical ranging device. The control device is connected to the robotic arm, the image capturing device and the optical ranging device. The control device is configured, in a calibration mode, to control the robotic arm to move into a plurality of calibration postures within an effective shooting distance of the image capturing device based on the target distance, to control the image capturing device to capture a plurality of calibration images in the plurality of calibration postures, and to compute a transformation relationship between the visual space of the image capturing device and the mechanical space of the robotic arm based on the plurality of calibration postures and the plurality of calibration images. The control device is configured, in a work mode, to control the image capturing device to capture a work image, to determine a mechanical space coordinate at which work is to be performed based on the work image and the transformation relationship, and to control the robotic arm to move to the mechanical space coordinate.

本發明可疊合拍攝光軸與機械手臂的法蘭軸,並提升機械手臂與電腦視覺之間的協調性。 The invention can superimpose the shooting optical axis and the flange axis of the mechanical arm, and improve the coordination between the mechanical arm and computer vision.

1:機械手臂系統 1: Robotic arm system

10:機械手臂 10: Robotic arm

11:端效器 11: End effector

12:相機 12: Camera

13:目標 13: Target

140:法蘭軸 140: Flange shaft

141:拍攝光軸 141: Shooting Optical Axis

15:目標影像 15: Target image

16:範圍 16: Range

17:轉動後的範圍 17: Range after rotation

18:軸心 18: Axis

2:自動化機械手臂系統 2: Automated robotic arm system

20:控制裝置 20: Control device

21:影像擷取裝置 21: Image capture device

210:感光元件 210: Photosensitive element

211:鏡頭 211: Lens

22:光學測距裝置 22: Optical ranging device

220:光發射器 220: Light Emitter

221:光接收器 221: Optical Receiver

23:機械手臂 23: Robotic Arm

230-233:關節 230-233: Joints

24:光路結構 24: Optical path structure

240:分光鏡 240: Beamsplitter

241:反射鏡 241: Reflector

25:儲存裝置 25: Storage device

250:電腦程式 250: Computer program

251:有效拍攝距離 251: Effective shooting distance

252:轉換關係 252: Conversion relationship

26:目標 26: Target

30:控制電腦 30: Control computer

31:機械手臂控制器 31: Robotic arm controller

32:周邊裝置 32: Peripherals

40:拍攝控制模組 40: Shooting control module

41:測距控制模組 41: Ranging control module

42:手臂控制模組 42: Arm control module

43:校正控制模組 43: Calibration control module

44:工作控制模組 44: Work Control Module

45:轉換處理模組 45: Conversion processing module

46:影像分析模組 46: Image Analysis Module

50:光源 50: light source

51:目標 51: Target

52:治具 52: Jig

53:法蘭軸 53: Flange shaft

54:工作裝置 54: Working device

55:安裝基座 55: Install the base

60:影像 60: Image

600-601:位置 600-601: Location

α 1:轉動角度 α 1: Rotation angle

d1:偏置量 d1: offset

h1:目標距離 h1: target distance

P1、P2:姿態 P1, P2: Attitude

V1:變化量 V1: Variation

S10-S16:協調步驟 S10-S16: Coordination steps

S20-S25:校正步驟 S20-S25: Calibration steps

S30-S33:工作步驟 S30-S33: Working steps

圖1為本發明一實施例之自動化機械手臂系統的架構圖。 FIG. 1 is a structural diagram of an automated robotic arm system according to an embodiment of the present invention.

圖2為本發明一實施例之自動化機械手臂系統的部分架構圖。 FIG. 2 is a partial structural diagram of an automated robotic arm system according to an embodiment of the present invention.

圖3為本發明一實施例之控制裝置的架構圖。 FIG. 3 is a structural diagram of a control device according to an embodiment of the present invention.

圖4為本發明一實施例之協調方法的流程圖。 FIG. 4 is a flowchart of a coordination method according to an embodiment of the present invention.

圖5為本發明一實施例之協調方法的部分流程圖。 FIG. 5 is a partial flowchart of a coordination method according to an embodiment of the present invention.

圖6為本發明一實施例之協調方法的部分流程圖。 FIG. 6 is a partial flowchart of a coordination method according to an embodiment of the present invention.

圖7為本發明一實施例的自動化機械手臂系統的設置示意圖。 FIG. 7 is a schematic diagram of the arrangement of an automated robotic arm system according to an embodiment of the present invention.

圖8為本發明一實施例的校正模式的第一示意圖。 FIG. 8 is a first schematic diagram of a calibration mode according to an embodiment of the present invention.

圖9為本發明一實施例的校正模式的第二示意圖。 FIG. 9 is a second schematic diagram of a calibration mode according to an embodiment of the present invention.

圖10為圖8所拍攝的第一校正影像的示意圖。 FIG. 10 is a schematic diagram of the first corrected image captured in FIG. 8 .

圖11為圖9所拍攝的第二校正影像的示意圖。 FIG. 11 is a schematic diagram of the second corrected image captured in FIG. 9 .

圖12為現有的機械手臂系統的設置示意圖。 FIG. 12 is a schematic diagram of the arrangement of a conventional robotic arm system.

圖13為現有的機械手臂系統的視野範圍的示意圖。 FIG. 13 is a schematic diagram of the field of view of a conventional robotic arm system.

茲就本發明之一較佳實施例,配合圖式,詳細說明如後。 Hereinafter, a preferred embodiment of the present invention will be described in detail in conjunction with the drawings.

請參閱圖12與圖13,圖12為現有的機械手臂系統的設置示意圖,圖13為現有的機械手臂系統的視野範圍的示意圖。 Please refer to FIG. 12 and FIG. 13 , FIG. 12 is a schematic diagram of the arrangement of a conventional robotic arm system, and FIG. 13 is a schematic diagram of a visual field of the conventional robotic arm system.

如圖12所示,機械手臂系統1的相機12與機械手臂10是採用不同軸配置。由於不同軸配置,相機12的拍攝光軸141與機械手臂10的法蘭軸140之間存在偏置量d1,上述偏置量d1會造成視覺空間與機械空間定位上的隨機誤差。 As shown in FIG. 12 , the camera 12 of the robot arm system 1 and the robot arm 10 are arranged on different axes. Due to the non-axial configuration, there is an offset d1 between the shooting optical axis 141 of the camera 12 and the flange axis 140 of the robot arm 10 . The offset d1 will cause random errors in positioning in the visual space and the mechanical space.

端效器11是直接設置於機械手臂10的末端。當機械手臂10移動端效器11(即改變姿態)時,掛載在機械手臂上的相機12的視野範圍也會隨之變動,而可以不同角度拍攝目標13。 The end effector 11 is directly disposed at the end of the robotic arm 10 . When the robot arm 10 moves the end effector 11 (ie, changes the posture), the field of view of the camera 12 mounted on the robot arm also changes accordingly, and the target 13 can be photographed at different angles.

如圖13所示,當機械手臂10以法蘭軸140作為軸心18進行角度α 1的旋轉運動時,由於端效器11並沒有水平方向的移動,實際上仍可以對目標13進行加工。 As shown in FIG. 13 , when the robot arm 10 rotates at an angle α 1 with the flange shaft 140 as the axis 18 , since the end effector 11 does not move in the horizontal direction, the target 13 can actually be processed.

然而，轉動後的相機12的視野範圍會從範圍16變為轉動後的範圍17，而使得目標13的目標影像15脫離相機12的視野範圍，這會造成機械手臂系統1無法對目標13進行視覺空間定位。 However, the field of view of the rotated camera 12 changes from the range 16 to the rotated range 17, so that the target image 15 of the target 13 leaves the field of view of the camera 12, which makes the robotic arm system 1 unable to perform visual-space positioning of the target 13.

為解決上述不同軸配置所造成的問題，本發明提出一種自動化機械手臂系統與其機械手臂與電腦視覺之間的協調方法，可透過新穎的光路結構(特別是分光鏡)來讓拍攝光軸的入射端貼合至法蘭軸，藉以達成手眼同軸的效果。 To solve the problems caused by the above non-coaxial configuration, the present invention proposes an automated robotic arm system and a method of coordinating its robotic arm and computer vision, in which a novel optical path structure (in particular, a beam splitter) aligns the incident end of the capturing optical axis with the flange axis, achieving a hand-eye coaxial effect.

並且,本發明由於可以實現手眼同軸,可以消除不同軸配置所產生的偏置量問題,而可避免目標物超出視野範圍。 Moreover, since the present invention can realize the coaxiality of the hand and the eye, the offset problem caused by different axis configurations can be eliminated, and the target object can be prevented from exceeding the field of view.

並且，本發明由於採用不同軸配置，可大幅提升機械手臂的可加工範圍，並大幅提升相機的體積上限。 In addition, since the present invention adopts a non-coaxial configuration, the workable range of the robotic arm can be greatly increased, and the upper limit of the camera volume can be greatly raised.

並且，本發明還可透過光學測距輔助視覺空間與機械間的定位校正，來提升機械手臂與電腦視覺的協調。 In addition, the present invention can also use optical ranging to assist the positioning calibration between the visual space and the mechanical space, thereby improving the coordination between the robotic arm and computer vision.

請參閱圖1,為本發明一實施例之自動化機械手臂系統的架構圖。本發明的自動化機械手臂系統2主要包含影像擷取裝置21、光學測距裝置22、機械手臂23、儲存裝置25及連接上述裝置的控制裝置20。 Please refer to FIG. 1 , which is a structural diagram of an automated robotic arm system according to an embodiment of the present invention. The automated robotic arm system 2 of the present invention mainly includes an image capturing device 21 , an optical ranging device 22 , a robotic arm 23 , a storage device 25 and a control device 20 connected to the above-mentioned devices.

影像擷取裝置21,例如是RGB攝影機等彩色攝影機,用來對工作區域的目標進行拍攝,來獲得包含目標的彩色影像(如後述之校正影像與工作影像)。前述彩色影像主要是用來執行電腦視覺分析,並提供運算結果來作為機械手臂23的運動參考。 The image capturing device 21 is, for example, a color camera such as an RGB camera, and is used for photographing objects in the work area to obtain color images including the objects (such as the correction image and the working image to be described later). The aforementioned color images are mainly used to perform computer vision analysis, and provide calculation results as a motion reference for the robotic arm 23 .

機械手臂23，用以於立體空間中移動所掛載裝置，來實現對不同位置執行量測(掛載光學測距裝置22)、拍攝(掛載影像擷取裝置21)、加工(掛載工作裝置54)等工作。 The robotic arm 23 is used to move the mounted devices in the three-dimensional space, so as to perform work at different positions, such as measurement (with the mounted optical ranging device 22), photographing (with the mounted image capturing device 21) and processing (with the mounted working device 54).

機械手臂23的末端設定有虛擬的法蘭軸(例如是機械手臂23的移動基準點),其末端的空間位置可基於法蘭軸來計算確定。前述法蘭軸的計算為機械手臂23控制領域的現有技術,於此不再贅述。 The end of the robot arm 23 is set with a virtual flange axis (for example, the movement reference point of the robot arm 23 ), and the spatial position of the end can be calculated and determined based on the flange axis. The calculation of the aforementioned flange axis is a prior art in the field of control of the robot arm 23 , and details are not described herein again.

於本發明中，影像擷取裝置21是設置於機械手臂23的法蘭軸外，藉以增加機械手臂23的可加工範圍(由法蘭軸的可移動範圍決定)，並提升影像擷取裝置21的可允許體積上限，即可以採用體積較大效能較強的攝影機，且配線限制較為寬鬆。 In the present invention, the image capturing device 21 is disposed outside the flange axis of the robotic arm 23, so as to increase the workable range of the robotic arm 23 (determined by the movable range of the flange axis) and to raise the upper limit of the allowable volume of the image capturing device 21; that is, a larger camera with higher performance can be used, and the wiring restrictions are relatively loose.

於一實施例中,當機械手臂23的末端掛載工作裝置54(如圖8與圖9所示)時,透過機械手臂23的運動,工作裝置54可對不同位置執行加工。透過搭載不同的工作裝置54,本發明可實現不同應用。 In one embodiment, when the end of the robotic arm 23 is mounted with a working device 54 (as shown in FIGS. 8 and 9 ), the working device 54 can perform processing on different positions through the movement of the robotic arm 23 . By carrying different working devices 54, the present invention can realize different applications.

於一實施例中,工作裝置54可連接控制裝置20並受其控制來執行自動化動作。 In one embodiment, the working device 54 can be connected to and controlled by the control device 20 to perform automated actions.

舉例來說，當工作裝置54為夾取端效器、焊接加熱器、標記工具、研磨工具、組裝端效器、塗膠工具及/或鎖固工具時，前述自動化動作可為對應的夾取動作(例如是夾取或吸取電子元件)、焊接動作(例如是控制雷射焊頭加熱)、標記動作(例如是以烙印、噴塗等方式進行標記)、研磨動作(例如是執行切削、研磨等)、組裝動作(例如是依指定組裝方式將多個目標執行拼接、疊合等)、塗膠動作(例如是塗膠、點膠等)及/或鎖固動作(例如是鎖螺絲、螺母)。 For example, when the working device 54 is a gripping end effector, a welding heater, a marking tool, a grinding tool, an assembling end effector, a gluing tool and/or a locking tool, the aforementioned automated action may be the corresponding gripping action (eg, gripping or suctioning an electronic component), welding action (eg, controlling the heating of a laser welding head), marking action (eg, marking by branding, spraying, etc.), grinding action (eg, performing cutting, grinding, etc.), assembling action (eg, splicing or stacking multiple targets according to a specified assembly method), gluing action (eg, gluing, dispensing, etc.) and/or locking action (eg, fastening screws or nuts).

光學測距裝置22,例如是紅外線測距儀,用以透過光學手段量測光學測距裝置22與目標之間的目標距離。 The optical ranging device 22 , such as an infrared range finder, is used to measure the target distance between the optical ranging device 22 and the target through optical means.

於一實施例中,前述量測是使目標位於虛擬的測距軸上,並透過三角定位法來獲得朝測距軸的平行方向進行量測。 In one embodiment, the aforementioned measurement is performed by placing the target on a virtual distance measuring axis, and obtaining the measurement in a direction parallel to the distance measuring axis through a triangulation method.

於一實施例中,光學測距裝置22是設置於機械手臂23的末端(或接近末端),而可以量測末端與目標之間的距離。 In one embodiment, the optical distance measuring device 22 is disposed at the end (or near the end) of the robotic arm 23, and can measure the distance between the end and the target.

於一實施例中,光學測距裝置22的測距軸可平行或重疊機械手臂23的法蘭軸,藉以使所量測的目標距離是對應法蘭軸中機械手臂23末端與正下方的目標之間的深度值。 In one embodiment, the distance measuring axis of the optical distance measuring device 22 can be parallel to or overlap the flange axis of the robot arm 23, so that the measured target distance corresponds to the end of the robot arm 23 and the target directly below the flange axis. depth value in between.

光路結構24,設置於機械手臂23的末端(或接近末端)來接收入射光(從目標發出或反射的光),將入射光分為可見光與測距光,並分別導引至影像擷取裝置21與光學測距裝置22。 The optical path structure 24 is disposed at the end (or near the end) of the robotic arm 23 to receive incident light (light emitted or reflected from the target), divide the incident light into visible light and ranging light, and guide them to the image capturing device respectively 21 and the optical ranging device 22.

具體而言，光路結構24包含分光鏡240(如圖7-9，例如是光學稜鏡)，分光鏡240可將入射光分離為不同波長的光線(原理為不同波長的光線具有不同折射率)，例如是將入射光分為可見光與紅外線(測距光)。於分離後，前述可見光可透過可見光路導引(可設置反射鏡或透鏡或直接射入)至影像擷取裝置21的鏡頭211與感光元件210(如圖7)，前述測距光可透過測距光路導引(可設置反射鏡或透鏡或直接射入)至光學測距裝置22的光接收器221。藉此，光路結構24可以在入射端實現法蘭軸、測距軸、拍攝光軸(例如是拍攝視野的中心點或其他基準點)的同軸配置，並允許影像擷取裝置21設置於法蘭軸(與測距光軸)外。 Specifically, the optical path structure 24 includes a beam splitter 240 (see FIGS. 7-9; for example, an optical prism). The beam splitter 240 can separate the incident light into light of different wavelengths (the principle being that light of different wavelengths has different refractive indices), for example splitting the incident light into visible light and infrared light (ranging light). After separation, the visible light can be guided through a visible light path (with a reflector or a lens, or entering directly) to the lens 211 and the photosensitive element 210 of the image capturing device 21 (see FIG. 7), and the ranging light can be guided through a ranging light path (with a reflector or a lens, or entering directly) to the light receiver 221 of the optical ranging device 22. Thereby, the optical path structure 24 realizes, at the incident end, a coaxial arrangement of the flange axis, the ranging axis and the capturing optical axis (for example, the center point of the shooting field of view or another reference point), while allowing the image capturing device 21 to be disposed outside the flange axis (and the ranging optical axis).

儲存裝置25,如磁碟硬碟、固態硬碟、ROM、RAM、EEPROM、快閃記憶體或多種儲存媒體的任意組合,用來儲存資料,例如儲存有效拍攝距離251與轉換關係252。 The storage device 25 , such as a hard disk, a solid-state drive, ROM, RAM, EEPROM, flash memory, or any combination of various storage media, is used to store data, such as the effective shooting distance 251 and the conversion relationship 252 .

控制裝置20,用來控制自動化機械手臂系統2,例如控制校正模式與工作模式。 The control device 20 is used to control the automated robotic arm system 2, for example, to control the calibration mode and the working mode.

請參閱圖2,為本發明一實施例之自動化機械手臂系統的部分架構圖。於本實施例中,控制裝置20可包含控制電腦30與機械手臂控制器。 Please refer to FIG. 2 , which is a partial structural diagram of an automated robotic arm system according to an embodiment of the present invention. In this embodiment, the control device 20 may include a control computer 30 and a robotic arm controller.

機械手臂控制器31,連接機械手臂23,用來基於所收到的手臂控制命令來控制機械手臂移動。 The robotic arm controller 31 is connected to the robotic arm 23 for controlling the movement of the robotic arm based on the received arm control commands.

於一實施例中,機械手臂23包含用來提供多個自由度的多個關節230-233(如圖8至圖9),各關節230-233由伺服馬達來控制旋轉角度,藉此,機械手臂23可於多個自由度中進行運動。 In one embodiment, the robotic arm 23 includes a plurality of joints 230-233 (as shown in FIG. 8 to FIG. 9 ) for providing multiple degrees of freedom. The arm 23 can move in multiple degrees of freedom.

手臂控制命令可指示機械手臂23移動的目的地(機械空間座標),機械手臂控制器31可將手臂控制命令轉換為對應的姿態座標(如各關節230-233的旋轉角度),並控制各關節230-233轉動來擺出手臂控制命令所對應的姿態。 The arm control command can indicate the moving destination (mechanical space coordinate) of the robot arm 23, and the robot arm controller 31 can convert the arm control command into the corresponding posture coordinates (such as the rotation angle of each joint 230-233), and control each joint 230-233 turns to pose corresponding to arm control commands.
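To make the command flow concrete, the following is a minimal Python sketch of how a mechanical-space target could be packaged as an arm control command and handed to a controller wrapper. The class names, fields and the placeholder inverse-kinematics step are all assumptions for illustration, not part of the patent or any vendor API.

```python
from dataclasses import dataclass

@dataclass
class ArmCommand:
    """Hypothetical arm control command carrying a mechanical-space target pose."""
    x_mm: float
    y_mm: float
    z_mm: float
    rx_deg: float = 0.0
    ry_deg: float = 0.0
    rz_deg: float = 0.0

class ArmController:
    """Stand-in for the robot arm controller 31.

    A real controller would run inverse kinematics to turn the target pose
    into rotation angles for joints 230-233 and drive the servo motors;
    here that conversion is only a placeholder.
    """
    def send(self, cmd: ArmCommand) -> None:
        joint_angles = self._inverse_kinematics(cmd)   # vendor-specific in practice
        print(f"driving joints to {joint_angles}")

    def _inverse_kinematics(self, cmd: ArmCommand):
        # Placeholder: a real implementation depends on the arm geometry.
        return (0.0, 0.0, 0.0, 0.0)

ArmController().send(ArmCommand(x_mm=120.0, y_mm=45.0, z_mm=300.0))
```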

控制電腦30，例如為工業電腦或個人電腦，連接(例如是透過工業網路或其他區域網路)機械手臂控制器31、影像擷取裝置21、光學測距裝置22及儲存裝置25，並對這些裝置進行控制。舉例來說，控制電腦30可透過發出前述手臂控制命令至機械手臂控制器31來控制機械手臂23。 The control computer 30, for example an industrial computer or a personal computer, is connected (eg, through an industrial network or another local area network) to the robotic arm controller 31, the image capturing device 21, the optical ranging device 22 and the storage device 25, and controls these devices. For example, the control computer 30 can control the robotic arm 23 by issuing the aforementioned arm control command to the robotic arm controller 31.

於一實施例中,控制電腦30還連接周邊裝置32,如通訊介面(用來連接網路)、人機介面(用來與用戶互動)、電源設備(用來提供電力)等。 In one embodiment, the control computer 30 is also connected to peripheral devices 32 , such as a communication interface (used to connect to a network), a human-machine interface (used to interact with a user), a power supply device (used to provide power), and the like.

請參閱圖7,為本發明一實施例的自動化機械手臂系統的設置示意圖。 Please refer to FIG. 7 , which is a schematic diagram of the arrangement of an automated robotic arm system according to an embodiment of the present invention.

如圖7所示,自動化機械手臂系統2包含安裝基座55。安裝基座55連接機械手臂23的末端,而可於一立體空間中被機械手臂23移動。 As shown in FIG. 7 , the automated robotic arm system 2 includes a mounting base 55 . The mounting base 55 is connected to the end of the robotic arm 23 and can be moved by the robotic arm 23 in a three-dimensional space.

並且,影像擷取裝置21、光學測距裝置22及光路結構24都設置在安裝基座55。 In addition, the image capturing device 21 , the optical distance measuring device 22 and the optical path structure 24 are all disposed on the mounting base 55 .

於一實施例中，安裝基座55可設置有一或多個光源50(如環形光源)，光源50用來對工作區域(尤其是目標51及治具52)進行照明，使得影像擷取裝置21可以獲得亮度較佳的目標影像，而大幅降低環境亮度變化影響。 In one embodiment, the mounting base 55 may be provided with one or more light sources 50 (eg, ring light sources). The light sources 50 illuminate the work area (especially the target 51 and the jig 52), so that the image capturing device 21 can obtain a target image with better brightness, greatly reducing the influence of changes in ambient brightness.

於一實施例中,光路結構24可包含分光鏡240與反射鏡241。反射鏡241用來反射分光鏡240所分離出的可見光至影像擷取裝置21的鏡頭211與感光元件210。透過分光鏡240與反射鏡241,影像擷取裝置21的拍攝光軸可貼合機械手臂23的法蘭軸53。並且,光學測距裝置22的測距光軸可平行或貼合法蘭軸53。 In one embodiment, the light path structure 24 may include a beam splitter 240 and a reflector 241 . The reflector 241 is used for reflecting the visible light separated by the beam splitter 240 to the lens 211 and the photosensitive element 210 of the image capturing device 21 . Through the beam splitter 240 and the reflecting mirror 241 , the photographing optical axis of the image capturing device 21 can fit the flange axis 53 of the robotic arm 23 . In addition, the distance measuring optical axis of the optical distance measuring device 22 can be parallel to or attached to the flange axis 53 .

於一實施例中，分光鏡240可為長通分色鏡(longpass dichroic mirror)，並具有80%以上(例如97%)的可見光反射率與75%以上(例如92%)的紅外線穿透率，例如是允許波長在730nm以上(例如750nm)的光線穿透，並反射波長為300nm-730nm(例如450nm-490nm)的光線。 In one embodiment, the beam splitter 240 may be a longpass dichroic mirror having a visible light reflectivity of more than 80% (eg, 97%) and an infrared transmittance of more than 75% (eg, 92%); for example, it allows light with a wavelength above 730 nm (eg, 750 nm) to pass through and reflects light with a wavelength of 300 nm-730 nm (eg, 450 nm-490 nm).

於一實施例中,光學測距裝置22包含光發射器220、光接收器221與連接上述裝置的測距控制器(圖未標示)。光發射器220與光接收器221的中點的垂直線即為測距光軸(圖7中,測距光軸與法蘭軸53貼合)。 In one embodiment, the optical ranging device 22 includes an optical transmitter 220, an optical receiver 221, and a ranging controller (not shown) connected to the above-mentioned devices. The vertical line between the midpoints of the light transmitter 220 and the light receiver 221 is the distance measuring optical axis (in FIG. 7 , the distance measuring optical axis is attached to the flange axis 53 ).

光發射器220用以朝目標51發射測距光(測距紅外線)，測距光打在目標51後會反射至分光鏡240，並於穿透分光鏡240後到達光接收器221。測距控制器(如微控制器或SoC)被設定來基於測距光的發射-接收時間差、光傳播速度及光發射器220與光接收器221之間的距離執行三角定位來計算目標距離(即目標51的深度值)。 The light emitter 220 emits ranging light (ranging infrared light) toward the target 51. After hitting the target 51, the ranging light is reflected toward the beam splitter 240 and, after passing through the beam splitter 240, reaches the light receiver 221. The ranging controller (such as a microcontroller or an SoC) is configured to perform triangulation based on the transmit-receive time difference of the ranging light, the light propagation speed and the distance between the light emitter 220 and the light receiver 221, so as to calculate the target distance (that is, the depth value of the target 51).
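As a rough illustration of this distance computation, the sketch below treats the measurement as a round-trip time-of-flight reading corrected by the emitter-receiver baseline. This is only one possible reading of the description; all symbols and numbers are assumptions for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def target_distance(t_emit, t_receive, baseline_m):
    """Estimate the distance h1 along the ranging axis.

    t_emit / t_receive: emission and reception timestamps in seconds.
    baseline_m: assumed distance between the light emitter 220 and receiver 221.
    The round-trip time gives the emitter-target-receiver path length; with the
    target on the ranging axis the geometry is an isosceles triangle whose
    height is the target distance.
    """
    path = C * (t_receive - t_emit)      # emitter -> target -> receiver
    one_leg = path / 2.0                 # emitter -> target (approximation)
    half_base = baseline_m / 2.0
    return math.sqrt(max(one_leg**2 - half_base**2, 0.0))

# Example: a 4 ns round trip with a 2 cm baseline gives roughly 0.6 m.
print(round(target_distance(0.0, 4e-9, 0.02), 3))
```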

請參閱圖3,為本發明一實施例之控制裝置的架構圖。控制裝置20可包含模組40-46。模組40-46分別被設定來產生執行本發明之不同功能。 Please refer to FIG. 3 , which is a structural diagram of a control device according to an embodiment of the present invention. Control device 20 may include modules 40-46. Modules 40-46 are respectively configured to generate different functions for carrying out the present invention.

拍攝控制模組40,用來控制影像擷取裝置21,如控制拍攝動作、控制對焦動作、取得影像資料、執行所設定之影像處理等。 The shooting control module 40 is used for controlling the image capturing device 21, such as controlling the shooting action, controlling the focusing action, acquiring the image data, executing the set image processing, and the like.

測距控制模組41,用來控制光學測距裝置22,如控制執行量測、取得量測資料(目標距離)、執行量測校正等。 The ranging control module 41 is used to control the optical ranging device 22 , such as controlling the execution of measurement, acquisition of measurement data (target distance), and execution of measurement correction.

手臂控制模組42,用以透過發出手臂控制命令至機械手臂控制器31來控制機械手臂23的姿態,並可取得機械手臂23的目前位置。 The arm control module 42 is used to control the posture of the robotic arm 23 by sending arm control commands to the robotic arm controller 31 , and can obtain the current position of the robotic arm 23 .

校正控制模組43,用以執行校正模式。 The calibration control module 43 is used to execute the calibration mode.

工作控制模組44,用以執行工作模式。 The work control module 44 is used to execute the work mode.

轉換處理模組45,用以計算視覺空間至機械空間的座標轉換與機械空間至視覺空間的座標轉換。 The conversion processing module 45 is used for calculating the coordinate conversion from the visual space to the mechanical space and the coordinate conversion from the mechanical space to the visual space.

影像分析模組46,用以對目標影像執行影像分析與處理。 The image analysis module 46 is used for performing image analysis and processing on the target image.

前述模組40-46是相互連接(可為電性連接與資訊連接),並可為硬體模組(例如是電子電路模組、積體電路模組、SoC等等)、軟體模組(例如是韌體、作業系統或應用程式)或軟硬體模組混搭,不加以限定。 The aforementioned modules 40-46 are connected to each other (which may be electrical connection and information connection), and may be hardware modules (such as electronic circuit modules, integrated circuit modules, SoCs, etc.), software modules ( For example, firmware, operating system or application) or a mix of software and hardware modules, which is not limited.

再者，當前述模組40-46為軟體模組(例如是韌體、作業系統或應用程式)時，儲存裝置25可包含非暫態電腦可讀取記錄媒體(圖未標示)，前述非暫態電腦可讀取記錄媒體儲存有電腦程式250，電腦程式250記錄有電腦可執行之程式碼，當控制裝置20執行前述程式碼後，可實做對應模組40-46之功能。 Furthermore, when the aforementioned modules 40-46 are software modules (eg, firmware, an operating system or an application), the storage device 25 may include a non-transitory computer-readable recording medium (not shown). The non-transitory computer-readable recording medium stores a computer program 250, and the computer program 250 records computer-executable program code; after the control device 20 executes the program code, the functions of the corresponding modules 40-46 can be implemented.

於一實施例中,前述模組40-46可設置在控制電腦30。舉例來說,儲存裝置25可包含控制電腦30的儲存器,前述儲存器儲存有電腦程式250,控制電腦30的處理器可以執行電腦程式250來實做對應模組40-46之功能。 In one embodiment, the aforementioned modules 40 - 46 may be disposed in the control computer 30 . For example, the storage device 25 may include a storage of the control computer 30, and the aforementioned storage stores a computer program 250. The processor of the control computer 30 may execute the computer program 250 to implement the functions of the corresponding modules 40-46.

請參閱圖4,為本發明一實施例之協調方法的流程圖。本實施例的機械手臂與電腦視覺之間的協調方法包含校正步驟S10-S12與工作模式S13-S16。 Please refer to FIG. 4 , which is a flowchart of a coordination method according to an embodiment of the present invention. The method for coordinating between the robotic arm and computer vision in this embodiment includes calibration steps S10-S12 and working modes S13-S16.

步驟S10:控制電腦30透過校正控制模組43進入校正模式以執行機械空間與視覺空間之間的協調與校正。 Step S10: The control computer 30 enters the calibration mode through the calibration control module 43 to perform coordination and calibration between the mechanical space and the visual space.

舉例來說,控制電腦30可於接受用戶的開始校正操作或收到校正命令時進入校正模式。 For example, the control computer 30 may enter the calibration mode when it accepts the user's start of the calibration operation or receives the calibration command.

步驟S11:控制電腦30控制機械手臂23移動,並取得當前的目標距離,依據當前的目標距離判斷機械手臂23(的末端)是否進入影像擷取裝置21的有效拍攝範圍。 Step S11 : the control computer 30 controls the robotic arm 23 to move, obtains the current target distance, and determines whether (the end) of the robotic arm 23 enters the effective shooting range of the image capture device 21 according to the current target distance.

若進入有效拍攝範圍，則控制機械手臂23於有效拍攝範圍中依序擺出多個校正姿態，並於擺出各校正姿態時拍攝至少一校正影像，藉以獲得分別對應多個校正姿態的多個校正影像。 If it has entered the effective shooting range, the robotic arm 23 is controlled to assume a plurality of calibration postures in sequence within the effective shooting range, and at least one calibration image is captured in each calibration posture, so as to obtain a plurality of calibration images respectively corresponding to the plurality of calibration postures.

步驟S12:控制電腦30透過轉換處理模組45基於多個校正姿態與多個校正影像計算影像擷取裝置21的視覺空間與機械手臂23的機械空間之間的轉換關係。 Step S12 : The control computer 30 calculates the conversion relationship between the visual space of the image capturing device 21 and the mechanical space of the robot arm 23 based on the plurality of calibration postures and the plurality of calibration images through the conversion processing module 45 .

於一實施例中，控制電腦30可於各校正影像中識別校正目標的視覺空間座標，計算校正目標於多個校正影像的多個視覺空間座標的變化，計算多個校正姿態所對應的多個機械空間座標的變化，並基於上述機械空間座標的變化及視覺空間座標的變化來計算視覺空間與機械空間之間的轉換關係。 In one embodiment, the control computer 30 can identify the visual-space coordinate of the calibration target in each calibration image, calculate the changes of the visual-space coordinates of the calibration target across the plurality of calibration images, calculate the changes of the mechanical-space coordinates corresponding to the plurality of calibration postures, and calculate the conversion relationship between the visual space and the mechanical space based on the changes of the mechanical-space coordinates and the changes of the visual-space coordinates.

In one embodiment, let ${}^{C}P$ denote a visual-space coordinate (a point expressed in the camera frame of the image capturing device 21), ${}^{W}P$ the corresponding mechanical-space coordinate (the same point expressed in the mechanical space, i.e., the world frame of the robotic arm 23), ${}^{W}T_{E}$ the current posture of the flange axis in the mechanical space, and ${}^{E}T_{C}$ the transformation relationship between the camera and the flange axis. The mathematical relationship among the visual space, the mechanical space and the transformation relationship is:

$${}^{W}P = {}^{W}T_{E}\;{}^{E}T_{C}\;{}^{C}P$$

In one embodiment, the transformation relationship ${}^{E}T_{C}$ can be computed in the following way.

The image capturing device 21 captures the feature $f$ of the calibration target (for example, a checkerboard) several times, to obtain the relationship ${}^{C}T_{f,i}$ between the image capturing device 21 and the feature under a plurality of different calibration postures, and at the same time obtains the expression ${}^{W}T_{E,i}$ of each current calibration posture (W being the mechanical-space coordinate system, such as the world coordinate system). Since the feature is fixed in the mechanical-space coordinate system as ${}^{W}T_{f}$, the relationship among them can be expressed as

$${}^{W}T_{f} = {}^{W}T_{E,i}\;{}^{E}T_{C}\;{}^{C}T_{f,i},\quad i = 1 \ldots N.$$

Since ${}^{W}T_{E,i}$ and ${}^{C}T_{f,i}$, $i = 1 \ldots N$, are all known, the equation can be optimized over multiple pieces of calibration data to obtain the solution ${}^{E}T_{C}$ with the smallest error term; that is, the more calibration data there are, the more accurate the transformation relationship ${}^{E}T_{C}$ becomes.

步驟S13:控制電腦30透過工作控制模組44進入工作模式以執行工作。 Step S13 : the control computer 30 enters the work mode through the work control module 44 to perform work.

舉例來說,控制電腦30可於接受用戶的開始工作操作或收到工作命令時進入工作模式。 For example, the control computer 30 can enter the work mode when it accepts the user's start-to-work operation or receives a work command.

步驟S14:控制電腦30控制機械手臂23移動,並取得當前的目標距離,依據當前的目標距離判斷機械手臂23(的末端)是否進入影像擷取裝置21的有效拍攝範圍。 Step S14 : the control computer 30 controls the robot arm 23 to move, obtains the current target distance, and determines whether (the end) of the robot arm 23 enters the effective shooting range of the image capture device 21 according to the current target distance.

若進入有效拍攝範圍,則控制電腦30控制影像擷取裝置21拍攝工作目標來獲得工作影像,透過影像分析模組46來執行工作相關影像分析處理,並於工作影像中決定要進行加工的位置(視覺空間座標)。接著,控制電腦30透過轉換處理模組45使用轉換關係來將視覺空間座標轉換為機械空間座標。並且,控制電腦30控制機械手臂23移動至機械空間座標。 If it enters the effective shooting range, the control computer 30 controls the image capture device 21 to shoot the work target to obtain the work image, and executes the work-related image analysis processing through the image analysis module 46, and determines the position to be processed in the work image ( visual space coordinates). Next, the control computer 30 converts the visual space coordinates into mechanical space coordinates by using the conversion relationship through the conversion processing module 45 . In addition, the control computer 30 controls the robot arm 23 to move to the mechanical space coordinates.

步驟S14:控制電腦30控制機械手臂23移動至機械空間座標。 Step S14 : the control computer 30 controls the robot arm 23 to move to the mechanical space coordinates.

於一實施例中，控制電腦30可進一步控制工作裝置54於機械空間座標執行自動化動作，例如夾取動作、焊接動作、標記動作、研磨動作、組裝動作、塗膠動作及/或鎖固動作。 In one embodiment, the control computer 30 may further control the working device 54 to perform an automated action at the mechanical space coordinate, such as a gripping action, a welding action, a marking action, a grinding action, an assembling action, a gluing action and/or a locking action.

本發明可對機械手臂與電腦視覺進行校正,而可以提升機器人的手眼協調。 The invention can correct the mechanical arm and computer vision, and can improve the hand-eye coordination of the robot.

請同時參閱圖4與圖5,圖5為本發明一實施例之協調方法的部分流程圖。相較於圖4的協調方法,本實施例的協調方法的步驟S11更包含步驟S20-S24。 Please refer to FIG. 4 and FIG. 5 at the same time. FIG. 5 is a partial flowchart of a coordination method according to an embodiment of the present invention. Compared with the coordination method of FIG. 4 , step S11 of the coordination method of this embodiment further includes steps S20 - S24 .

步驟S20:控制電腦30取得影像擷取裝置21的有效拍攝距離(如圖1所示的有效拍攝距離251),並基於有效拍攝距離設定有效拍攝範圍。 Step S20 : the control computer 30 obtains the effective shooting distance of the image capturing device 21 (the effective shooting distance 251 shown in FIG. 1 ), and sets the effective shooting range based on the effective shooting distance.

前述有效拍攝距離可例如為取得影像擷取裝置21的最大或最小對焦距離,並且有效拍攝範圍可例如為影像擷取裝置21的對焦範圍。 The aforementioned effective shooting distance may be, for example, the maximum or minimum focusing distance of the image capturing device 21 , and the effective shooting range may be, for example, the focusing range of the image capturing device 21 .

於一實施例中,若有效拍攝距離為50公分,則控制電腦30可將0-50公分設定為有效拍攝範圍,或將25-50公分設定為有效拍攝範圍,或將25-75公分設定為有效拍攝範圍,不加以限定。 In one embodiment, if the effective shooting distance is 50 cm, the control computer 30 can set 0-50 cm as the effective shooting range, or set 25-50 cm as the effective shooting range, or set 25-75 cm as the effective shooting range. The effective shooting range is not limited.
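A minimal sketch of this range-setting step is shown below; the three mapping modes simply mirror the examples in the preceding paragraph and are not prescribed by the patent.

```python
def make_effective_range(effective_distance_cm, mode="zero_to_max"):
    """Derive an effective shooting range (cm) from the effective shooting distance."""
    if mode == "zero_to_max":
        return (0.0, effective_distance_cm)                          # e.g. 0-50 cm
    if mode == "half_to_max":
        return (effective_distance_cm / 2, effective_distance_cm)    # e.g. 25-50 cm
    if mode == "centered":
        return (effective_distance_cm / 2, effective_distance_cm * 1.5)  # e.g. 25-75 cm
    raise ValueError(mode)

def in_effective_range(target_distance_cm, rng):
    low, high = rng
    return low <= target_distance_cm <= high

rng = make_effective_range(50.0, mode="half_to_max")   # (25.0, 50.0)
print(in_effective_range(37.5, rng))                   # True
```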

再者，當影像擷取裝置21與拍攝目標是落入前述有效拍攝距離或有效拍攝範圍時，影像擷取裝置21可以正確地對拍攝目標進行聚焦，而可以拍攝到清晰的目標影像；當影像擷取裝置21與拍攝目標是不在有效拍攝範圍內時，影像擷取裝置21無法正確地聚焦，而會產生模糊的目標影像。 Furthermore, when the distance between the image capturing device 21 and the shooting target falls within the aforementioned effective shooting distance or effective shooting range, the image capturing device 21 can focus correctly on the shooting target and capture a clear target image; when the image capturing device 21 and the shooting target are not within the effective shooting range, the image capturing device 21 cannot focus correctly and produces a blurred target image.

步驟S21:控制電腦30控制機械手臂23移動,持續量測目標距離,直到基於目標距離判斷進入有效拍攝範圍內。 Step S21 : the control computer 30 controls the movement of the robotic arm 23 , and continues to measure the target distance until it is judged based on the target distance to enter the effective shooting range.

步驟S22：控制電腦30持續量測當前的目標距離，並控制機械手臂23於有效拍攝範圍內移動並擺出不同的測焦姿態，並拍攝各測焦姿態的測焦影像。 Step S22: the control computer 30 continuously measures the current target distance, controls the robotic arm 23 to move within the effective shooting range and assume different focus-measurement postures, and captures a focus-measurement image for each focus-measurement posture.

於一實施例中，前述多個測焦姿態是於不同的目標距離所擺出，即控制電腦30是於有效拍攝範圍內不斷變換機械手臂23的高度(如從遠離目標到接近目標)，來獲得不同高度的測焦影像。 In one embodiment, the aforementioned focus-measurement postures are assumed at different target distances; that is, the control computer 30 continuously changes the height of the robotic arm 23 within the effective shooting range (eg, from far away from the target to close to the target) to obtain focus-measurement images at different heights.

步驟S23:控制電腦30透過影像分析模組46對多個測焦影像與對應的多個目標距離執行對焦分析來決定基準姿態及基準距離。 Step S23 : The control computer 30 determines the reference posture and the reference distance by performing focus analysis on the plurality of focus images and the corresponding target distances through the image analysis module 46 .

於一實施例中，前述對焦分析包含於多個測焦影像中選擇一或多個準焦的測焦影像(即清晰影像)，並基於拍攝這些測焦影像的測焦姿態來決定基準姿態(如這些測焦姿態的中心或重心)，並基於拍攝這些測焦影像的目標距離來決定基準距離(如平均值)。 In one embodiment, the aforementioned focus analysis includes selecting one or more in-focus focus-measurement images (ie, clear images) from the plurality of focus-measurement images, determining the reference posture based on the focus-measurement postures in which these images were captured (eg, the center or centroid of these postures), and determining the reference distance based on the target distances at which these images were captured (eg, their average).

於一實施例中，前述對焦分析可藉由分析多個影像的邊緣特徵、梯度大小等，來決定最清晰的測焦影像，取得能取得最清晰的測焦影像的測焦姿態與目標距離，並作為基準姿態及基準距離。 In one embodiment, the aforementioned focus analysis may determine the sharpest focus-measurement image by analyzing the edge features, gradient magnitudes and the like of the plurality of images, obtain the focus-measurement posture and target distance that yield the sharpest focus-measurement image, and use them as the reference posture and reference distance.
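One common way to score sharpness from edge and gradient information is the variance of the Laplacian; the sketch below uses that measure to pick the reference posture and reference distance. This is one possible realization under that assumption, not the patent's specific algorithm.

```python
import cv2

def sharpness(image_bgr):
    """Sharpness score: variance of the Laplacian (a gradient-based measure)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def pick_reference(samples):
    """samples: list of (focus_image, focus_posture, target_distance).

    Returns the posture and distance of the sharpest focus-measurement image,
    to be used as the reference posture and reference distance.
    """
    best = max(samples, key=lambda s: sharpness(s[0]))
    _, reference_posture, reference_distance = best
    return reference_posture, reference_distance
```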

步驟S24:控制電腦30基於基準姿態及基準距離控制機械手臂23移動為校正姿態,並於此校正姿態下拍攝對應的校正影像。 Step S24 : The control computer 30 controls the robot arm 23 to move to a calibration posture based on the reference posture and the reference distance, and shoots a corresponding calibration image in the calibration posture.

於一實施例中,各校正姿態的目標距離是等於或接近基準距離,並是基於基準姿態進行變化,例如在相同高度平面上旋轉或位移機械手臂23的末端。 In one embodiment, the target distance of each calibration posture is equal to or close to the reference distance, and is changed based on the reference posture, such as rotating or displacing the end of the robotic arm 23 on the same height plane.
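For illustration, the following sketch generates calibration postures by rotating and displacing the arm end on the same height plane around the reference posture; the radius, step count and tuple layout are arbitrary assumptions for this example.

```python
import math

def calibration_postures(reference_xyz_mm, reference_rz_deg,
                         radius_mm=20.0, steps=8):
    """Yield calibration postures around a reference posture.

    Each posture keeps the same height (and hence roughly the same target
    distance as the reference distance) and varies the in-plane position
    and the rotation about the flange axis.
    """
    x0, y0, z0 = reference_xyz_mm
    for k in range(steps):
        angle = 2.0 * math.pi * k / steps
        x = x0 + radius_mm * math.cos(angle)
        y = y0 + radius_mm * math.sin(angle)
        rz = reference_rz_deg + 360.0 * k / steps
        yield (x, y, z0, rz)

for posture in calibration_postures((100.0, 50.0, 250.0), 0.0):
    print(posture)
```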

步驟S25：控制電腦30透過校正控制模組43判斷是否預設的停止收集條件滿足，以判斷是否所收集的校正資料已足夠，例如滿足預設的筆數，如10筆、50筆或100筆等，不加以限定。 Step S25: the control computer 30 determines, through the calibration control module 43, whether a preset stop-collection condition is satisfied, so as to determine whether the collected calibration data are sufficient, for example whether a preset number of records, such as 10, 50 or 100, has been reached, without limitation.

若停止收集條件滿足，則結束收集校正資料；否則，再次執行步驟S24，以獲得不同校正姿態下拍攝的校正影像。 If the stop-collection condition is satisfied, the collection of calibration data ends; otherwise, step S24 is performed again to obtain calibration images captured in different calibration postures.

藉此,本發明可連續地改變機械手臂的旋轉與位移來擺出不同的校正姿態,並拍攝各校正姿態的校正影像,直到收集到足夠的校正資料。 In this way, the present invention can continuously change the rotation and displacement of the robotic arm to assume different calibration postures, and shoot calibration images of each calibration posture until sufficient calibration data are collected.

於一實施例中,於所收集的多個校正姿態中,至少兩個校正姿態所在平面是跟治具52平行。 In one embodiment, among the collected calibration poses, at least two calibration poses are located on a plane parallel to the jig 52 .

於一實施例中,於所收集的多個校正姿態中,至少兩個校正姿態在不同的目標距離,即不同高度。 In one embodiment, among the collected calibration poses, at least two calibration poses are at different target distances, ie, different heights.

本發明由於拍攝光軸與法蘭軸貼合，所計算出來的轉換關係可以更為準確。 In the present invention, since the capturing optical axis is aligned with the flange axis, the computed conversion relationship can be more accurate.

請同時參閱圖4與圖6,圖6為本發明一實施例之協調方法的部分流程圖。相較於圖4的協調方法,本實施例的協調方法的步驟S14更包含步驟S30-S33。 Please refer to FIG. 4 and FIG. 6 at the same time. FIG. 6 is a partial flowchart of a coordination method according to an embodiment of the present invention. Compared with the coordination method of FIG. 4 , step S14 of the coordination method of this embodiment further includes steps S30 - S33 .

步驟S30:控制電腦30控制機械手臂23移動(如持續朝工作目標接近),持續取得目標距離,並基於目標距離判斷機械手臂23是否進入有效拍攝範圍(如目標距離是否小於有效拍攝距離251)。 Step S30: The control computer 30 controls the robotic arm 23 to move (such as continuously approaching the work target), continuously obtains the target distance, and determines whether the robotic arm 23 enters the effective shooting range based on the target distance (such as whether the target distance is less than the effective shooting distance 251).

步驟S31:控制電腦30於機械手臂23(包含影像擷取裝置21)進入有效拍攝範圍後,對工作目標進行拍攝來獲得工作影像。 Step S31 : After the robotic arm 23 (including the image capture device 21 ) enters the effective shooting range, the control computer 30 shoots the work target to obtain the work image.
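Steps S30 and S31 together form a simple approach-then-capture loop; a hedged sketch is given below, where the `arm`, `rangefinder` and `camera` wrappers and their method names are hypothetical placeholders for the robotic arm 23, the optical ranging device 22 and the image capturing device 21.

```python
def approach_and_capture(arm, rangefinder, camera,
                         effective_distance_mm=500.0, step_mm=10.0):
    """Move toward the work target until it is inside the effective shooting
    range, then capture and return the work image.
    """
    while True:
        distance = rangefinder.read_mm()          # current target distance
        if distance <= effective_distance_mm:     # inside the effective range
            return camera.capture()               # work image
        arm.move_down(step_mm)                    # keep approaching the target
```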

步驟S32:控制電腦30透過影像分析模組46對工作影像中執行影像分析。 Step S32 : the control computer 30 performs image analysis on the working image through the image analysis module 46 .

於一實施例中,前述影像分析可包含於工作影像中識別工作目標,並基於工作目標在視覺空間的位置執行工作分析來決定需要執行工作的視覺空間座標。 In one embodiment, the aforementioned image analysis may include identifying the work target in the work image, and performing the work analysis based on the position of the work target in the visual space to determine the coordinates of the visual space where the work needs to be performed.

於一實施例中，前述工作分析可以是瑕疵檢測處理(例如檢測元件瑕疵)、量測處理(例如量測元件面積或長度)、分類篩檢處理(例如對元件進行辨識與分類)與元件定位處理(例如決定元件的抓取點、組裝點、焊接點等)。 In one embodiment, the aforementioned work analysis may be defect detection processing (eg, detecting component defects), measurement processing (eg, measuring a component's area or length), classification and screening processing (eg, identifying and classifying components) or component positioning processing (eg, determining a component's grab point, assembly point, solder point, etc.).
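As one concrete example of component positioning processing, the sketch below finds a grab point as the centroid of the largest thresholded contour in the work image; the real work analysis may differ considerably, so this is only an illustrative choice.

```python
import cv2

def locate_component(work_image_bgr):
    """Return the pixel coordinate (u, v) of a component's grab point, or None."""
    gray = cv2.cvtColor(work_image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # centroid in pixels
```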

步驟S33:控制電腦30基於轉換關係252轉換執行工作的視覺空間座標為執行工作的機械空間座標。 Step S33 : The control computer 30 converts the visual space coordinates of the execution work to the mechanical space coordinates of the execution work based on the conversion relationship 252 .

於一實施例中,控制電腦30可進一步依據工作裝置54與法蘭軸的位置差,對機械空間座標進行補償,來獲得補償後的機械空間座標。並且,控制電腦30可基於補償後的機械空間座標產生手臂控制命令,並發送手臂控制命令至機械手臂控制器31來控制機械手臂將工作裝置54移動至執行工作的機械空間座標。 In one embodiment, the control computer 30 may further compensate the mechanical space coordinates according to the position difference between the working device 54 and the flange shaft, so as to obtain the compensated mechanical space coordinates. In addition, the control computer 30 can generate arm control commands based on the compensated mechanical space coordinates, and send the arm control commands to the robot arm controller 31 to control the robot arm to move the working device 54 to the mechanical space coordinates where the work is performed.
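A possible form of this conversion-plus-compensation step is sketched below: the pixel is back-projected with the measured target distance, transformed into the mechanical space through the current flange pose and the calibrated transformation, and then offset by the fixed tool displacement. The intrinsic matrix, the frame conventions and the assumption that the offset is expressed in the base frame are all illustrative choices, not taken from the patent.

```python
import numpy as np

def visual_to_mechanical(uv, depth_mm, K, W_T_E, E_T_C, tool_offset_mm):
    """Convert an image coordinate to a compensated mechanical-space coordinate.

    uv: (u, v) pixel coordinate of the work point in the work image.
    depth_mm: target distance from the optical ranging device.
    K: 3x3 camera intrinsic matrix (assumed known from camera calibration).
    W_T_E: 4x4 current flange pose in the robot base frame.
    E_T_C: 4x4 camera-to-flange transform (the calibrated relationship).
    tool_offset_mm: 3-vector from the flange axis to the working device 54,
        assumed here to be expressed in the base frame.
    """
    u, v = uv
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # Back-project the pixel to a 3D point in the camera frame.
    p_cam = np.array([(u - cx) * depth_mm / fx,
                      (v - cy) * depth_mm / fy,
                      depth_mm, 1.0])
    p_world = W_T_E @ E_T_C @ p_cam            # point in the mechanical space
    # Compensate for the fixed offset between the flange and the tool.
    return p_world[:3] - np.asarray(tool_offset_mm)
```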

藉此,本發明可透過電腦視覺來自動執行加工作業。 In this way, the present invention can automatically perform processing operations through computer vision.

請參閱圖8至圖11,圖8為本發明一實施例的校正模式的第一示意圖,圖9為本發明一實施例的校正模式的第二示意圖,圖10為圖8所拍攝的第一校正影像的示意圖,圖11為圖9所拍攝的第二校正影像的示意圖。 Please refer to FIG. 8 to FIG. 11 , FIG. 8 is a first schematic diagram of a calibration mode according to an embodiment of the present invention, FIG. 9 is a second schematic diagram of a calibration mode according to an embodiment of the present invention, and FIG. 10 is a first schematic diagram taken in FIG. 8 A schematic diagram of the corrected image, FIG. 11 is a schematic diagram of the second corrected image captured in FIG. 9 .

於本實施例中,光路結構僅包含分光鏡240,分光鏡240分離出的可見光是直接射入影像擷取裝置21,影像擷取裝置21的鏡頭朝向是與法蘭軸垂直。 In this embodiment, the optical path structure only includes the beam splitter 240 , the visible light separated by the beam splitter 240 is directly incident on the image capture device 21 , and the lens of the image capture device 21 is oriented perpendicular to the flange axis.

此外,工作裝置54是設置於安裝基座55底部,且位於法蘭軸之外,藉以避免干擾入射光的射入。 In addition, the working device 54 is disposed at the bottom of the mounting base 55 and is located outside the flange axis, so as to avoid disturbing the incident light.

再者,上述設置方式中,工作裝置54與法蘭軸的距離是固定的,這使得控制裝置20可以由法蘭軸快速且準確地計算工作裝置54目前的空間位置。 Furthermore, in the above arrangement, the distance between the working device 54 and the flange shaft is fixed, which enables the control device 20 to quickly and accurately calculate the current spatial position of the working device 54 from the flange shaft.

機械手臂23於末端移動至有效拍攝距離h1內後，可透過調整關節230-233來擺出如圖8所示的第一個校正姿態P1，並透過影像擷取裝置21來拍攝如圖10所示的第一張校正影像。 After the end of the robotic arm 23 moves to within the effective shooting distance h1, the joints 230-233 can be adjusted to assume the first calibration posture P1 shown in FIG. 8, and the first calibration image shown in FIG. 10 is captured through the image capturing device 21.

接著，機械手臂23可透過調整關節232、233來擺出如圖9所示的不同的第二個校正姿態P2，並透過影像擷取裝置21來拍攝如圖11所示的第二張校正影像。 Next, the robotic arm 23 can adjust the joints 232 and 233 to assume a different, second calibration posture P2 shown in FIG. 9, and the second calibration image shown in FIG. 11 is captured through the image capturing device 21.

如圖10所示,第一校正姿態P1下,第一張校正影像的目標60的特徵(於此為中心點)是位於視覺空間的位置600。 As shown in FIG. 10 , in the first calibration posture P1 , the feature of the target 60 in the first calibration image (here, the center point) is a position 600 in the visual space.

如圖11所示，於變換至第二校正姿態P2後，第二張校正影像的目標60的特徵移動至視覺空間的位置601。 As shown in FIG. 11, after the transformation to the second calibration posture P2, the feature of the target 60 in the second calibration image moves to the position 601 in the visual space.

接著，計算第一校正姿態P1與第二校正姿態P2之間的機械空間座標變化量，並計算位置600至位置601的視覺空間變化量V1，將兩組變化量進行關聯即可獲得視覺空間與機械空間之間的轉換關係，而完成校正。 Next, the change in mechanical-space coordinates between the first calibration posture P1 and the second calibration posture P2 is calculated, and the visual-space change V1 from the position 600 to the position 601 is calculated; correlating the two sets of changes yields the conversion relationship between the visual space and the mechanical space, completing the calibration.
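A toy numeric version of this two-posture correlation is shown below; the displacement values are invented purely for illustration.

```python
import math

# Mechanical-space change between calibration postures P1 and P2 (mm), and the
# corresponding visual-space change of the target feature from position 600 to
# position 601 (pixels). Values are invented for illustration only.
d_mech_mm = (12.0, 0.0)
d_vis_px = (0.0, -96.0)

scale_mm_per_px = math.hypot(*d_mech_mm) / math.hypot(*d_vis_px)
rotation_deg = math.degrees(math.atan2(d_mech_mm[1], d_mech_mm[0])
                            - math.atan2(d_vis_px[1], d_vis_px[0]))

print(f"{scale_mm_per_px:.3f} mm per pixel, axes rotated {rotation_deg:.1f} deg")
# With more posture/image pairs, these relations generalize to the full
# transformation relationship computed in the calibration mode.
```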

以上所述僅為本發明之較佳具體實例，非因此即侷限本發明之申請專利範圍，故舉凡運用本發明內容所為之等效變化，均同理皆包含於本發明之範圍內，合予陳明。 The above description is merely a preferred specific example of the present invention and does not thereby limit the scope of the claims of the present invention; therefore, all equivalent changes made by applying the content of the present invention are likewise included within the scope of the present invention, which is hereby stated.

S10-S16:協調步驟 S10-S16: Coordination steps

Claims (20)

一種機械手臂與電腦視覺之間的協調方法,包括:a)於一校正模式下,基於一光學測距裝置所量測的一目標距離控制一機械手臂於一影像擷取裝置的一有效拍攝範圍內移動為多個校正姿態,並透過該影像擷取裝置於該多個校正姿態分別拍攝多個校正影像,其中一分光鏡將可見光導引至設置於該機械手臂的一法蘭軸外的該影像擷取裝置,並將測距光導引至該光學測距裝置,該光學測距裝置的測距軸平行或重疊該法蘭軸;b)基於該多個校正姿態與該多個校正影像計算該影像擷取裝置的視覺空間與該機械手臂的機械空間之間的一轉換關係;c)於一工作模式下,透過該影像擷取裝置拍攝一工作影像,並基於該工作影像及該轉換關係決定執行工作的一機械空間座標;及d)控制該機械手臂移動至該機械空間座標。 A method for coordinating between a robotic arm and computer vision, comprising: a) in a calibration mode, controlling a robotic arm in an effective shooting range of an image capture device based on a target distance measured by an optical ranging device The inner movement is a plurality of calibration postures, and a plurality of calibration images are respectively captured in the plurality of calibration postures through the image capturing device, wherein a beam splitter guides the visible light to the outer flange shaft disposed on the mechanical arm. an image capturing device, and guiding the ranging light to the optical ranging device, the ranging axis of the optical ranging device is parallel to or overlapping the flange axis; b) based on the plurality of calibration postures and the plurality of calibration images Calculate a conversion relationship between the visual space of the image capture device and the mechanical space of the robotic arm; c) in a working mode, capture a working image through the image capture device, and based on the working image and the conversion The relationship determines a mechanical space coordinate for performing work; and d) controls the robotic arm to move to the mechanical space coordinate. 如請求項1所述之機械手臂與電腦視覺之間的協調方法,其中該步驟a)包括:a1)取得該影像擷取裝置的一有效拍攝距離,並基於該有效拍攝距離設定該有效拍攝範圍;a2)控制該機械手臂於該有效拍攝範圍內分別為不同的該目標距離的多個測焦姿態,並於該多個測焦姿態分別拍攝多個測焦影像;及a3)對該多個測焦影像與對應的該多個目標距離執行一對焦分析來決定一基準姿態及一基準距離。 The method for coordinating between a robotic arm and computer vision as claimed in claim 1, wherein the step a) comprises: a1) obtaining an effective shooting distance of the image capture device, and setting the effective shooting range based on the effective shooting distance A2) control this mechanical arm to be respectively a plurality of focus measuring attitudes of different this target distance in this effective shooting range, and shoot a plurality of focus measuring images respectively at this multiple focus measuring attitudes; and a3) to this multiple focus measuring attitude A focus analysis is performed on the focus measurement image and the corresponding target distances to determine a reference attitude and a reference distance. 如請求項2所述之機械手臂與電腦視覺之間的協調方法,其中該對焦分析包括:e1)於該多個測焦影像中選擇至少一準焦的該測焦影像;及e2)基於拍攝到清晰的該測焦影像的該測焦姿態與該目標距離決定該基準姿態及該基準距離。 The method for coordinating between a robotic arm and computer vision as claimed in claim 2, wherein the focus analysis comprises: e1) selecting at least one focus-focused image among the plurality of focus-measurement images; and e2) based on shooting The focus measurement attitude and the target distance to the clear focus measurement image determine the reference attitude and the reference distance. 如請求項1所述之機械手臂與電腦視覺之間的協調方法,其中該步驟a)包括:a4)基於一基準姿態及一基準距離連續地控制該機械手臂移動為不同的該校正姿態,並分別拍攝多個該校正影像,直到一停止收集條件滿足。 The method for coordinating between a robotic arm and computer vision as claimed in claim 1, wherein the step a) comprises: a4) continuously controlling the robotic arm to move into different calibration postures based on a reference posture and a reference distance, and A plurality of the corrected images are taken separately until a stop collection condition is satisfied. 
如請求項1所述之機械手臂與電腦視覺之間的協調方法,其中該步驟b)包括:b1)於各該校正影像中識別一校正目標的一視覺空間座標;b2)計算該校正目標於該多個校正影像的該多個視覺空間座標的變化;b3)計算該多個機械空間座標的變化;及b4)基於該多個機械空間座標的變化及該多個視覺空間座標的變化計算視覺空間與機械空間之間的該轉換關係。 The method for coordinating between a robotic arm and computer vision as claimed in claim 1, wherein the step b) comprises: b1) identifying a visual space coordinate of a calibration target in each of the calibration images; b2) calculating the calibration target in Changes in the plurality of visual space coordinates of the plurality of corrected images; b3) calculating the changes in the plurality of mechanical space coordinates; and b4) calculating the changes in the plurality of visual space coordinates based on the changes in the plurality of mechanical space coordinates and the plurality of visual space coordinates This transformation relationship between space and mechanical space. 如請求項1所述之機械手臂與電腦視覺之間的協調方法,其中該步驟c)包括:c1)於該機械手臂進入該有效拍攝範圍時拍攝該工作影像;c2)於該工作影像中識別一工作目標,並基於該工作目標的位置執行一工作分析來決定執行工作的一視覺空間座標;及 c3)基於該轉換關係轉換執行工作的該視覺空間座標為執行工作的該機械空間座標。 The method for coordinating between a robotic arm and computer vision according to claim 1, wherein the step c) comprises: c1) shooting the working image when the robotic arm enters the effective shooting range; c2) identifying the working image in the working image a job target, and performing a job analysis based on the position of the job target to determine a visual space coordinate for performing the job; and c3) Convert the visual space coordinates of the execution work to the mechanical space coordinates of the execution work based on the conversion relationship. 如請求項6所述之機械手臂與電腦視覺之間的協調方法,其中該工作分析包括瑕疵檢測處理、量測處理、分類篩檢處理與元件定位處理的至少其中之一。 The method for coordinating between a robotic arm and computer vision as claimed in claim 6, wherein the work analysis includes at least one of defect detection processing, measurement processing, sorting and screening processing, and component positioning processing. 如請求項1所述之機械手臂與電腦視覺之間的協調方法,更包括:f)控制該機械手臂於該機械空間座標執行一自動化動作。 The method for coordinating between a robotic arm and computer vision according to claim 1, further comprising: f) controlling the robotic arm to execute an automated action on the mechanical space coordinates. 如請求項8所述之機械手臂與電腦視覺之間的協調方法,其中該自動化動作包括夾取動作、焊接動作、標記動作、研磨動作、組裝動作、塗膠動作與鎖固動作的至少其中之一。 The method for coordinating between a robotic arm and computer vision as claimed in claim 8, wherein the automated action includes at least one of a gripping action, a welding action, a marking action, a grinding action, an assembling action, a gluing action and a locking action one. 如請求項1所述之機械手臂與電腦視覺之間的協調方法,其中該分光鏡具有80%以上的可見光反射率與75%以上的紅外線穿透率。 The method for coordinating between a robotic arm and computer vision as claimed in claim 1, wherein the beam splitter has a visible light reflectivity of over 80% and an infrared transmittance of over 75%. 
11. An automated robotic arm system, comprising: a robotic arm for moving in a three-dimensional space; an image capturing device disposed off a flange axis of the robotic arm and used for capturing images; an optical ranging device disposed on the robotic arm for measuring a target distance, wherein a ranging axis of the optical ranging device is parallel to or overlaps the flange axis; an optical path structure comprising a beam splitter for guiding visible light to the image capturing device and guiding ranging light to the optical ranging device; and a control device connected to the robotic arm, the image capturing device, and the optical ranging device; wherein the control device is configured, in a calibration mode, to control the robotic arm, based on the target distance, to move into a plurality of calibration postures within an effective shooting range of the image capturing device, to control the image capturing device to capture a plurality of calibration images at the plurality of calibration postures, and to compute a transformation relationship between a visual space of the image capturing device and a mechanical space of the robotic arm based on the plurality of calibration postures and the plurality of calibration images; wherein the control device is configured, in a work mode, to control the image capturing device to capture a work image, to determine a mechanical space coordinate for performing work based on the work image and the transformation relationship, and to control the robotic arm to move to the mechanical space coordinate.
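A minimal controller skeleton reflecting the two operating modes recited in claim 11 might look as follows. The camera, rangefinder, and arm objects, the callback arguments, the range check, and the planar simplification are all assumptions made for illustration; the patent does not specify this structure.

```python
class ArmVisionController:
    """Sketch of the control device: calibration mode collects posture/image
    pairs and fits the transformation; work mode applies it."""

    def __init__(self, camera, rangefinder, arm):
        self.camera = camera
        self.rangefinder = rangefinder
        self.arm = arm
        self.T = None  # visual-space -> mechanical-space transformation

    def calibrate(self, calibration_postures, effective_range, locate_target, fit):
        """Calibration mode: visit each posture, capture an image only when the
        measured target distance lies inside the effective shooting range,
        then fit the visual/mechanical transformation."""
        near, far = effective_range
        visual_pts, mechanical_pts = [], []
        for posture in calibration_postures:
            self.arm.move_to(posture)
            if not (near <= self.rangefinder.distance() <= far):
                continue                                 # outside effective range
            image = self.camera.capture()
            visual_pts.append(locate_target(image))      # calibration target in pixels
            mechanical_pts.append(posture[:2])           # planar simplification
        self.T = fit(visual_pts, mechanical_pts)

    def work(self, locate_target, to_mechanical):
        """Work mode: capture a work image, derive the mechanical coordinate,
        and move the arm there."""
        image = self.camera.capture()
        mechanical_xy = to_mechanical(locate_target(image), self.T)
        self.arm.move_to(mechanical_xy)
```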
12. The automated robotic arm system of claim 11, wherein the control device comprises: a robotic arm controller connected to the robotic arm for controlling the robotic arm to move based on a received arm control command; and a control computer connected to the robotic arm controller, the image capturing device, and the optical ranging device for issuing the arm control command; wherein the control computer is configured to obtain an effective shooting distance of the image capturing device, set the effective shooting range based on the effective shooting distance, control the robotic arm to move, within the effective shooting range, into a plurality of focus-measurement postures respectively corresponding to different values of the target distance, and capture a plurality of focus-measurement images at the plurality of focus-measurement postures; wherein the control computer is configured to perform a focus analysis on the plurality of focus-measurement images and the corresponding target distances to determine a reference posture and a reference distance; wherein the control computer is configured to continuously control the robotic arm to move into different calibration postures based on the reference posture and the reference distance, and to capture a plurality of the calibration images, until a stop-collection condition is satisfied. 13. The automated robotic arm system of claim 12, wherein the control computer is configured to select at least one in-focus image from the plurality of focus-measurement images, and to determine the reference posture and the reference distance based on the focus-measurement posture and the target distance at which the in-focus focus-measurement image was captured. 14. The automated robotic arm system of claim 11, further comprising a mounting base on which the image capturing device, the optical ranging device, and the optical path structure are disposed; wherein an end of the robotic arm is connected to the mounting base and is used to move the mounting base in the three-dimensional space; wherein the control device is configured to identify a visual space coordinate of a calibration target in each of the calibration images, compute changes of the plurality of visual space coordinates of the calibration target across the plurality of calibration images, compute changes of the plurality of mechanical space coordinates, and compute the transformation relationship between the visual space and the mechanical space based on the changes of the plurality of mechanical space coordinates and the changes of the plurality of visual space coordinates.
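Claims 12 and 13 leave the focus analysis itself unspecified. One common sharpness measure is the variance of the Laplacian; the sketch below uses it to pick the best-focused focus-measurement image and return the posture and target distance at which it was taken. OpenCV is assumed to be available here and is not named by the patent.

```python
import cv2
import numpy as np

def sharpness(image_bgr):
    """Variance of the Laplacian: higher values indicate a sharper image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def pick_reference(focus_images, postures, target_distances):
    """Select the best-focused focus-measurement image and return the
    corresponding reference posture and reference distance."""
    scores = [sharpness(img) for img in focus_images]
    best = int(np.argmax(scores))
    return postures[best], target_distances[best]
```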
15. The automated robotic arm system of claim 11, wherein the robotic arm comprises a plurality of joints for providing a plurality of degrees of freedom, and the robotic arm is controlled to move in the plurality of degrees of freedom by adjusting rotation angles of the plurality of joints based on the mechanical space coordinate; wherein the control device is configured to capture the work image when the robotic arm enters the effective shooting range, identify a work target in the work image, perform a work analysis based on a position of the work target to determine a visual space coordinate for performing work, and convert the visual space coordinate for performing work into the mechanical space coordinate for performing work based on the transformation relationship; wherein the work analysis comprises at least one of defect detection processing, measurement processing, sorting and screening processing, and component positioning processing. 16. The automated robotic arm system of claim 11, further comprising a working device connected to the control device; wherein the control device is configured to control the working device to perform an automated action when the robotic arm moves to the mechanical space coordinate. 17. The automated robotic arm system of claim 16, wherein the working device comprises at least one of a gripping end effector, a welding heater, a marking tool, a grinding tool, an assembling end effector, a gluing tool, and a locking tool; wherein the automated action comprises at least one of a gripping action, a welding action, a marking action, a grinding action, an assembling action, a gluing action, and a locking action. 18. The automated robotic arm system of claim 11, wherein the image capturing device comprises a color camera; wherein the optical ranging device comprises an infrared rangefinder; wherein the beam splitter is a longpass dichroic mirror having a visible light reflectance of 80% or more and an infrared transmittance of 75% or more. 19. The automated robotic arm system of claim 11, wherein the optical ranging device comprises: a light emitter for emitting the ranging light toward a target; a light receiver for receiving the reflected ranging light; and a ranging controller connected to the light emitter and the light receiver, the ranging controller being configured to compute the target distance based on a transmit-receive time difference of the ranging light, a light propagation speed, and a distance between the light emitter and the light receiver.
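Claim 19 computes the target distance from the transmit-receive time difference, the light propagation speed, and the emitter-receiver spacing. A simplified round-trip time-of-flight calculation is sketched below; the way the emitter-receiver spacing enters the correction is an assumption, since the claim does not give the exact formula.

```python
LIGHT_SPEED = 299_792_458.0  # propagation speed in air, m/s (approximate)

def target_distance(time_difference_s, emitter_receiver_spacing_m=0.0):
    """Estimate the one-way target distance from a round-trip time of flight.

    The ranging light travels to the target and back, so half the round-trip
    path is taken; the emitter-receiver spacing is folded in as a simple
    half-baseline correction purely for illustration.
    """
    round_trip = LIGHT_SPEED * time_difference_s
    return (round_trip - emitter_receiver_spacing_m) / 2.0

# A 6.67 ns time difference corresponds to roughly 1 m to the target.
print(target_distance(6.67e-9))
```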
20. The automated robotic arm system of claim 11, wherein the optical path structure further comprises a reflecting mirror for reflecting the visible light reflected by the beam splitter to the image capturing device.
TW110136446A 2021-09-30 2021-09-30 Automatic robot arm system and method of coordinating robot arm and computer vision thereof TWI776694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW110136446A TWI776694B (en) 2021-09-30 2021-09-30 Automatic robot arm system and method of coordinating robot arm and computer vision thereof

Publications (2)

Publication Number Publication Date
TWI776694B true TWI776694B (en) 2022-09-01
TW202315721A TW202315721A (en) 2023-04-16

Family

ID=84958061

Family Applications (1)

Application Number Title Priority Date Filing Date
TW110136446A TWI776694B (en) 2021-09-30 2021-09-30 Automatic robot arm system and method of coordinating robot arm and computer vision thereof

Country Status (1)

Country Link
TW (1) TWI776694B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011009814A1 (en) * 2010-03-05 2011-09-08 Fanuc Corporation Robotic system with visual sensor
TW202102347A (en) * 2019-07-05 2021-01-16 上銀科技股份有限公司 Calibration method of vision-guided robot arm only needing to specify a positioning mark in the calibration target to perform calibration
TWI724977B (en) * 2020-09-29 2021-04-11 台達電子工業股份有限公司 Calibration apparatus and calibration method for coordinate system of robotic arm
TW202124110A (en) * 2019-12-18 2021-07-01 財團法人工業技術研究院 Automated calibration system and method for workpiece coordinate frame of a robot

Also Published As

Publication number Publication date
TW202315721A (en) 2023-04-16

Similar Documents

Publication Publication Date Title
JP5290324B2 (en) Method and system for accurately positioning at least one object in a final pose in space
US9604363B2 (en) Object pickup device and method for picking up object
JP5911934B2 (en) Contour measurement device and robot system
US10805546B2 (en) Image processing system, image processing device, and image processing program
TWI670153B (en) Robot and robot system
JP6429473B2 (en) Robot system, robot system calibration method, program, and computer-readable recording medium
JP5377758B2 (en) Method and system for accurately positioning at least one object in a final pose in space
JP6855492B2 (en) Robot system, robot system control device, and robot system control method
JP6504274B2 (en) Three-dimensional shape data and texture information generation system, imaging control program, three-dimensional shape data and texture information generation method, and information recording medium
JP2009269110A (en) Assembly equipment
JP2002090113A (en) Position and attiude recognizing device
WO2020252632A1 (en) Coordinate system calibration method, device, and computer readable medium
US20150085108A1 (en) Lasergrammetry system and methods
JP6565175B2 (en) Robot and robot system
JP7191309B2 (en) Automatic Guidance, Positioning and Real-time Correction Method for Laser Projection Marking Using Camera
JP2019089180A (en) Robot and robot system
JP6973233B2 (en) Image processing system, image processing device and image processing program
JP7263501B2 (en) Automated robotic arm system and method of cooperation between the robotic arm and computer vision
TWI776694B (en) Automatic robot arm system and method of coordinating robot arm and computer vision thereof
US20180231474A1 (en) Apparatus and method for generating operation program of inspection system
TW201447289A (en) Testing apparatus and method for testing optical-electrical lens
JPH0545117A (en) Optical method for measuring three-dimensional position
TW202222519A (en) Robot system, robot arm, end effector, and adapter
JP2013007588A (en) Defect detection device and its method
CN106153012B (en) The spatial attitude parameter measurement method of specified target and its application

Legal Events

Date Code Title Description
GD4A Issue of patent certificate for granted invention patent