JPS5988298A - Method of guiding body - Google Patents

Method of guiding body

Info

Publication number
JPS5988298A
Authority
JP
Japan
Prior art keywords
light
light source
guide
dimensional
working end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP19925082A
Other languages
Japanese (ja)
Other versions
JPH048191B2 (en)
Inventor
隆 伊藤
上田 澄広
平山 真明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kawasaki Heavy Industries Ltd
Kawasaki Motors Ltd
Original Assignee
Kawasaki Heavy Industries Ltd
Kawasaki Jukogyo KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kawasaki Heavy Industries Ltd, Kawasaki Jukogyo KK filed Critical Kawasaki Heavy Industries Ltd
Priority to JP19925082A priority Critical patent/JPS5988298A/en
Publication of JPS5988298A publication Critical patent/JPS5988298A/en
Publication of JPH048191B2 publication Critical patent/JPH048191B2/ja
Granted legal-status Critical Current

Links

Abstract

(57) [Abstract] This publication contains application data filed before electronic filing, so no abstract data is recorded.

Description

DETAILED DESCRIPTION OF THE INVENTION

The present invention relates to an optical method for guiding an object such as the working end of a work machine.

Conventionally, the working end of a work machine has been guided by operating push buttons provided on an operation panel or by operating a rod-shaped joystick. Such operations do not correspond to human senses, and guiding the working end to the many desired positions and attitudes takes considerable time.

It is an object of the present invention to provide a method of guiding an object, such as a working end, in a manner that corresponds to human senses.

FIG. 1 is a simplified block diagram of one embodiment of the present invention. A guide sensor 2, composed of two two-dimensional light spot position detectors, is fixed to a working end 50 that performs welding or the like on a work machine 1 such as an industrial robot. A plate-like body having a guide surface 3a is attached to a guide rod 3 held by an operator 51.

FIG. 2 is a simplified perspective view of the guide sensor 2 and the guide surface 3a. The guide sensor 2 has two light spot position detectors 52 and 53 arranged in the x-axis direction. On the guide surface 3a, three point light sources P1, P2, P3 are arranged at the vertices of an imaginary equilateral triangle. Let G be the centroid of the imaginary equilateral triangle formed by the light sources P1, P2, P3, and let (Δx, Δy, Δz) be the coordinates of this centroid. The light sources P1, P2, P3 are realized by, for example, light-emitting diodes, and are lit one at a time in repeated sequence. For example, the light source P1 is turned on and then off, then the light source P2 is turned on; after the light source P2 is turned off, the light source P3 is turned on; and after the light source P3 is turned off, the light source P1 is turned on again.

The two-dimensional light spot position detectors 52, 53 and the light sources P1, P2, P3 are connected to a processing device 4 realized by a microcomputer or the like. The output from the processing device 4 is given to a control device 5, which thereby drives the drive means of the work machine 1 to move the working end 50 and the guide sensor 2. The working end 50, as the object, is thus moved so as to maintain a predetermined position and attitude with respect to the guide surface 3a.

FIG. 3 is a simplified diagram of the guide sensor 2 viewed in the x-y plane. Let Pi (i = 1, 2, 3) denote one of the light sources P1, P2, P3. When the light source Pi is lit, its images are formed on detection elements 6 and 7 by lenses 8 and 10. The imaging positions Qi1 and Qi2 are the intersections of the straight lines connecting the light source Pi with the principal points 9 and 11 of the lenses 8 and 10, and the light-receiving surfaces of the detection elements 6 and 7. Let D be the distance between the principal points 9 and 11 of the lenses 8 and 10, let F be the distance in the y-axis direction between the principal points 9, 11 and the light-receiving surfaces of the detection elements 6, 7, and let hi1 and hi2 be the offsets of the imaging positions on the light-receiving surfaces from the straight lines, parallel to the y-axis, passing through the principal points 9 and 11. The x and y components of the light source Pi are then obtained from equations (1) and (2):

xi = (hi1 - hi2)/(hi1 + hi2) × D/2 …(1)
yi = F/(hi1 + hi2) × D …(2)

Referring to FIG. 4, a simplified diagram of the guide sensor 2 viewed in the y-z plane, the z component of the light source Pi is obtained, as in equation (3), by detecting the offset vi of the imaging position Qi2 from the y-axis on the light-receiving surface of the detection element 7 of the two-dimensional light spot position detector 53:

zi = vi/(hi1 + hi2) × D …(3)

The rotation angles α, β, γ representing the attitude of the guide sensor 2, and hence of the working end 50, are shown in FIG. 5(a), FIG. 5(b) and FIG. 5(c), respectively.
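Equations (1) to (3) are a standard two-detector triangulation: the sum hi1 + hi2 acts as the disparity that encodes depth, and the difference and vertical offset recover the lateral components. A minimal sketch of the computation follows; variable names follow the text, while the numeric values are illustrative only and not taken from the patent.

```python
def triangulate(hi1, hi2, vi, D, F):
    """Recover the 3D position of light source Pi from the image offsets
    hi1, hi2 (x-direction, one per detector) and vi (z-direction),
    per equations (1)-(3): D is the baseline between the lens principal
    points, F the lens-to-detector distance."""
    s = hi1 + hi2                     # disparity sum, inversely related to depth
    xi = (hi1 - hi2) / s * D / 2      # equation (1)
    yi = F / s * D                    # equation (2)
    zi = vi / s * D                   # equation (3)
    return xi, yi, zi

# Illustrative values: equal offsets place the source on the sensor axis.
x, y, z = triangulate(hi1=0.002, hi2=0.002, vi=0.001, D=0.1, F=0.05)
```

Note that the depth yi grows without bound as hi1 + hi2 approaches zero, which is why the guide surface must stay within the working range of the sensor.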

The method of detecting the attitude of the guide surface 3a when the light sources P1, P2, P3 are arranged at the vertices of an equilateral triangle as described above will now be explained with reference to FIG. 6. Let point 18 be the midpoint of the light sources P1 and P2, and let (xi, yi, zi) be the three-dimensional spatial position coordinates of each light source Pi. Points 21 and 22 are the projections of the light source P3 and of point 18 onto the y = 0 plane, that is, the x-z plane. The angle between the straight line through points 21 and 22 and the z-axis coincides with the rotation angle Δα about the x-axis. The three-dimensional spatial positions of the three light sources P1, P2, P3 and the rotation angle Δα are related by equation (4):

tanΔα = [x3 - (x1 + x2)/2] / [z3 - (z1 + z2)/2] …(4)

Points 19 and 20 are the projections of the light source P3 and of point 18 onto the x = 0 plane, that is, the y-z plane. The angle between the straight line through points 19 and 20 and the z-axis coincides with the rotation angle Δβ about the y-axis. Points 23 and 24 are the projections of the light sources P1 and P2 onto the z = 0 plane, that is, the x-y plane, and the angle between the straight line through points 23 and 24 and the x-axis coincides with the rotation angle Δγ about the z-axis. The rotation angles Δβ, Δγ and the three-dimensional spatial coordinates of the three light sources P1, P2, P3 are related by equations (5) and (6):

tanΔβ = [y3 - (y1 + y2)/2] / [z3 - (z1 + z2)/2] …(5)
tanΔγ = (y1 - y2)/(x1 - x2) …(6)

The position coordinates (Δx, Δy, Δz) of the centroid G of the equilateral triangle formed by the three light sources P1, P2, P3 are given by equations (7) to (9):

Δx = (x1 + x2 + x3)/3 …(7)
Δy = (y1 + y2 + y3)/3 …(8)
Δz = (z1 + z2 + z3)/3 …(9)

FIG. 7 is an overall block diagram of one embodiment of the present invention.
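Equations (4) to (9) reduce the three recovered source positions to an attitude and a centroid. The following sketch implements that reduction; the axis and sign conventions are inferred from the text and should be checked against FIGS. 5 and 6.

```python
import math

def attitude_and_centroid(p1, p2, p3):
    """p1, p2, p3: (x, y, z) positions of the light sources.
    Returns the rotation angles (d_alpha, d_beta, d_gamma) per
    equations (4)-(6) and the centroid (dx, dy, dz) per equations
    (7)-(9)."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = p1, p2, p3
    mx, my, mz = (x1 + x2) / 2, (y1 + y2) / 2, (z1 + z2) / 2  # point 18
    d_alpha = math.atan2(x3 - mx, z3 - mz)   # equation (4)
    d_beta = math.atan2(y3 - my, z3 - mz)    # equation (5)
    d_gamma = math.atan2(y1 - y2, x1 - x2)   # equation (6)
    centroid = tuple((a + b + c) / 3 for a, b, c in zip(p1, p2, p3))
    return (d_alpha, d_beta, d_gamma), centroid
```

Using atan2 instead of a bare tangent ratio avoids division by zero when a projected line is parallel to the reference axis.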

The detection elements 6 and 7 included in the two-dimensional light spot position detectors 52 and 53 of the guide sensor 2 have the same construction. The detection element 6 is a photodiode in which the surface of a high-resistance semiconductor is formed as a uniform resistance layer, with two pairs of electrodes 25a, 26a and 27a, 28a arranged on the sides of an imaginary square; the electrodes of each pair face each other across the surface resistance layer. The current generated by incident light is divided among the electrodes 25a, 26a, 27a, 28a and is input to a processing circuit 33 via lines 29a to 32a. The corresponding parts of the other detection element 7 are denoted by the suffix b.

The coordinates of the imaging positions Qi1 and Qi2 explained above in connection with FIG. 3 and FIG. 4 are determined as follows. Corresponding to the position of the imaging position Qi1, Qi2 in a coordinate system consisting of a v-axis parallel to the z-axis direction and an h-axis parallel to the x-axis direction, the currents I1, I2, I3, I4 given to the processing circuit 33 from the lines 29 to 32 are each determined. The light-receiving surfaces of the detection elements 6 and 7 are parallel to the v-h plane. From the currents I1, I2, I3, I4, the coordinates (vi1, hi1) of the imaging position Qi1 and the coordinates (vi2, hi2) of Qi2 are determined by equations (10) and (11):

hij = (I1 - I2)/(I1 + I2) …(10)
vij = (I3 - I4)/(I3 + I4) …(11)

Here i = 1, 2, 3 corresponds to the light sources P1, P2, P3, and j = 1, 2 corresponds to the imaging positions Qi1, Qi2.
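Equations (10) and (11) are the usual readout for a two-dimensional position-sensitive detector: each coordinate is the normalized difference of the currents collected by one opposing pair of electrodes. A sketch, with currents in arbitrary units:

```python
def spot_position(i1, i2, i3, i4):
    """Return the (h, v) image coordinates of a light spot on the
    detection element from the four electrode currents, per equations
    (10) and (11). The result is dimensionless in [-1, 1]; scaling to
    length units depends on the size of the resistance layer."""
    h = (i1 - i2) / (i1 + i2)   # equation (10)
    v = (i3 - i4) / (i3 + i4)   # equation (11)
    return h, v

# A spot centred along h splits I1 and I2 equally; an off-centre spot
# along v skews I3 and I4.
h, v = spot_position(1.0, 1.0, 1.5, 0.5)
```

Because the coordinates are ratios of currents, they are insensitive to the overall brightness of the light source, which is what makes the sequential-flashing scheme practical.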

The calculations of equations (10) and (11) are performed by the processing circuit 33, and signals representing the coordinate values hi1, vi1, hi2, vi2 are input to a processing circuit 54 via signal lines 34a, 35a, 34b, 35b.
The processing circuit 54 calculates the coordinate values Δx, Δy, Δz of the centroid G and the rotation angles Δα, Δβ, Δγ representing the attitude, based on equations (4) to (9). To avoid a collision between the guide surface and the guide sensor, a value set in a constant setting circuit 56 is subtracted from Δy by an arithmetic unit 55. The output of the arithmetic unit 55 and the values Δx, Δz, Δα, Δβ, Δγ are input to a circuit 63 that performs a coordinate transformation to absolute coordinates.

The circuit 63 transforms the coordinate system of the guide sensor 2 into the absolute coordinate system of the work machine 1. The output of the circuit 63 therefore represents the deviation by which the guide sensor 2, and hence the working end 50, should be moved in the absolute coordinate system of the work machine 1. Denote these deviations by Δxa, Δya, Δza, Δαa, Δβa, Δγa. These deviation values are added, by arithmetic units 57 to 62, to the outputs of a coordinate conversion circuit 64. The outputs of the arithmetic units 57 to 62 are the target values xr, yr, zr, αr, βr, γr in the absolute coordinate system of the work machine 1 to which the guide sensor 2, and hence the working end 50, should be moved. The coordinate conversion circuit 64 derives values representing the current position and attitude of the guide sensor 2, and hence of the working end 50, in the absolute coordinate system. From a processing circuit 65, signals θ1r to θ6r are given to control devices 66 to 71 for each of the plural axes, for example six axes, of the work machine 1. The control device 66 to 71 for each axis includes a control element Hc, such as proportional, integral and differential control, a circuit having a transfer function Ga that includes the drive means of that axis, and an arithmetic unit E that performs feedback. In this way the drive means of each axis is driven, and the working end 50, to which the two-dimensional light spot position detectors are fixed, takes a position and attitude corresponding to the guide surface 3a. The outputs θ1 to θ6 of the per-axis control devices 66 to 71 represent the positions of the axes, and these signals are given to the coordinate conversion circuit 64, from which it derives, as described above, the signals representing the current position and attitude of the guide sensor 2, and hence of the working end 50, in the absolute coordinate system.
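The signal flow of FIG. 7 amounts to a per-cycle update: transform the sensed deviation into machine coordinates, add it to the current absolute pose to form the target, and let a per-axis feedback controller track that target. A much-simplified sketch follows, with a proportional controller standing in for the PID element Hc and the coordinate transforms of circuits 63 and 64 collapsed to identity:

```python
def control_cycle(current_pose, deviation, gain=0.5):
    """One greatly simplified cycle of the FIG. 7 loop for six axes
    (x, y, z, alpha, beta, gamma). The arithmetic units 57-62 form
    target = current + deviation; each axis then moves a fraction
    `gain` of its remaining error, standing in for the Hc/Ga/E loop."""
    target = [c + d for c, d in zip(current_pose, deviation)]
    return [c + gain * (t - c) for c, t in zip(current_pose, target)]

pose = [0.0] * 6
pose = control_cycle(pose, [1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
```

Iterating the cycle drives the residual deviation toward zero, which is exactly the condition that the working end holds the predetermined pose relative to the guide surface.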

FIG. 8 is a flowchart explaining the operation in which the operator 51 manipulates the guide surface 3a to guide the working end 50 of the work machine 1. From step n1 the operation moves to step n2, where the processing device 4 turns on the light source P1. In step n3, the values h11, h12 and v12 are read from the detection elements 6 and 7 of the guide sensor 2; the value v11 is not needed in the present invention and so need not be read. In step n4 the light source P1 is turned off. In the next step n5, the three-dimensional spatial coordinates (x1, y1, z1) of the light source P1 are calculated by equations (1) to (3). In step n6, the light source P2 is turned on and, by detecting its light, the corresponding three-dimensional spatial coordinates (x2, y2, z2) are calculated. In step n7, the light source P3 is turned on and its three-dimensional spatial coordinates (x3, y3, z3) are calculated. In step n8, the attitude angles (Δα, Δβ, Δγ) are calculated from equations (4) to (6). The operation then moves to step n9, where the position of the guide surface 3a, that is, the position (Δx, Δy, Δz) of the centroid G of the light sources P1, P2, P3, is calculated from equations (7) to (9). In step n10, the deviation of the detected position and attitude of the guide surface 3a from the reference position and attitude is calculated; this result is derived from the circuit 63. In step n11, the control device 5 drives the working end 50 of the work machine 1 into the predetermined position and attitude with respect to the guide surface 3a. In this way the guide surface 3a and the working end 50 are driven so that the predetermined relationship of position and attitude between them is maintained. The operator can therefore move the working end 50 by manipulating the guide surface 3a in accordance with human senses.
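Steps n2 to n9 above amount to time-multiplexing the three LEDs through a single measurement pipeline. The cycle can be sketched as follows, inlining the triangulation of equations (1) to (3); the functions turn_on, turn_off and read_offsets are hypothetical stand-ins for the hardware interface, not names from the patent:

```python
def measure_cycle(turn_on, turn_off, read_offsets, D, F):
    """Steps n2-n7 of FIG. 8: light each source in turn, read its image
    offsets (hi1, hi2, vi), and triangulate its 3D position. Returns a
    list of three (x, y, z) tuples, one per light source."""
    points = []
    for i in (1, 2, 3):
        turn_on(i)                      # steps n2, n6, n7
        hi1, hi2, vi = read_offsets()   # step n3
        turn_off(i)                     # step n4
        s = hi1 + hi2
        points.append(((hi1 - hi2) / s * D / 2,   # equation (1), step n5
                       F / s * D,                 # equation (2)
                       vi / s * D))               # equation (3)
    return points

# Simulated hardware: every source appears at the same centred spot.
pts = measure_cycle(lambda i: None, lambda i: None,
                    lambda: (0.002, 0.002, 0.0), D=0.1, F=0.05)
```

Lighting one source at a time is what lets two position-sensitive detectors, each of which reports only a single spot, distinguish the three sources without any image processing.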

The number of light sources is not limited to three; four or more may be provided. For example, point light sources may be arranged at the vertices of an imaginary square.

As described above, according to the present invention, an object such as a working end can be guided by manipulating the guide surface, so the operation of guiding the object in accordance with human senses is easy and workability is improved.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an overall system diagram of one embodiment of the present invention; FIG. 2 is a perspective view of the guide sensor 2 and the guide surface 3a; FIG. 3 is a simplified diagram of the guide sensor 2 in the x-y plane; FIG. 4 is a simplified diagram of the guide sensor 2 in the y-z plane; FIG. 5 is a diagram showing the attitude angles (Δα, Δβ, Δγ); FIG. 6 is a diagram for explaining the method of calculating the attitude angles from the coordinates of the light sources P1, P2, P3; FIG. 7 is a block diagram showing the specific construction of one embodiment of the present invention; and FIG. 8 is a flowchart for explaining the operation of that embodiment.

1: work machine; 2: guide sensor; 3a: guide surface; 4: processing device; 5: control device; 6, 7: detection elements; 52, 53: two-dimensional light spot position detectors.

Agent: Patent Attorney 西教圭一部

Claims (1)

[Claims] A method of guiding an object, characterized in that the light sources on a guide surface provided with three or more point light sources are lit in sequence; the light is received by two two-dimensional light spot position detectors mounted on the object; the three-dimensional positions of the light sources are detected on the basis of the outputs of the two-dimensional light spot position detectors; the position and attitude of the guide surface are detected from those three-dimensional positions; and the object is guided by drive means so as to maintain a predetermined position and attitude with respect to the guide surface.
JP19925082A 1982-11-12 1982-11-12 Method of guiding body Granted JPS5988298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP19925082A JPS5988298A (en) 1982-11-12 1982-11-12 Method of guiding body

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP19925082A JPS5988298A (en) 1982-11-12 1982-11-12 Method of guiding body

Publications (2)

Publication Number Publication Date
JPS5988298A true JPS5988298A (en) 1984-05-22
JPH048191B2 JPH048191B2 (en) 1992-02-14

Family

ID=16404662

Family Applications (1)

Application Number Title Priority Date Filing Date
JP19925082A Granted JPS5988298A (en) 1982-11-12 1982-11-12 Method of guiding body

Country Status (1)

Country Link
JP (1) JPS5988298A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60136806A (en) * 1983-12-26 1985-07-20 Agency Of Ind Science & Technol Device for instructing work of robot
JPS62123889U (en) * 1986-01-29 1987-08-06
JPS62254206A (en) * 1986-04-28 1987-11-06 Fuji Electric Co Ltd Deciding device for plane direction

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58137590A (en) * 1982-02-10 1983-08-16 三菱電機株式会社 Robot instruction device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58137590A (en) * 1982-02-10 1983-08-16 三菱電機株式会社 Robot instruction device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60136806A (en) * 1983-12-26 1985-07-20 Agency Of Ind Science & Technol Device for instructing work of robot
JPH0310125B2 (en) * 1983-12-26 1991-02-13 Kogyo Gijutsuin
JPS62123889U (en) * 1986-01-29 1987-08-06
JPS62254206A (en) * 1986-04-28 1987-11-06 Fuji Electric Co Ltd Deciding device for plane direction

Also Published As

Publication number Publication date
JPH048191B2 (en) 1992-02-14

Similar Documents

Publication Publication Date Title
Milgram et al. Telerobotic control using augmented reality
US20190036337A1 (en) System for robotic 3d printing
EP3342542A1 (en) Industrial remote control robot system
CN110977931A (en) Robot control device and display device using augmented reality and mixed reality
Farkhatdinov et al. A user study of command strategies for mobile robot teleoperation
CN102581445A (en) Visual real-time deviation rectifying system and visual real-time deviation rectifying method for robot
CN101896321A (en) Determining the position of an object
Burrell et al. Towards a cooperative robotic system for autonomous pipe cutting in nuclear decommissioning
Lim et al. Internet-based teleoperation of a mobile robot with force-reflection
Matsuda et al. Control system for object transportation by a mobile robot with manipulator combined with manual operation and autonomous control
CN108145702B (en) Device for setting a boundary surface and method for setting a boundary surface
JPS5988298A (en) Method of guiding body
WO2021117868A1 (en) Robot system and method for forming three-dimensional model of workpiece
JPS60136806A (en) Device for instructing work of robot
Fahantidis et al. Robot handling of flat textile materials
Kobayashi et al. Motion capture with inertial measurement units for hand/arm robot teleoperation
Kosuge et al. Decentralized coordinated motion control of manipulators with vision and force sensors
Tongloy et al. An image-based visual servo control system based on an eye-in-hand monocular camera for autonomous robotic grasping
Vogel et al. A projection-based sensor system for ensuring safety while grasping and transporting objects by an industrial robot
JP3021202B2 (en) Robot position and orientation guidance method
JPS61274852A (en) Non-contact curved surface copying sensor
Hanh et al. Implement contour following task of objects with unknown geometric models by using combination of two visual servoing techniques
JPS625408A (en) Method for controlling joint-type robot
JPH1177568A (en) Teaching assisting method and device
Brunner et al. Programming robots via learning by showing in a virtual environment