WO2018173192A1 - Method for determining parallelism of an articulated robot and device for adjusting the inclination of an articulated robot - Google Patents

Method for determining parallelism of an articulated robot and device for adjusting the inclination of an articulated robot

Info

Publication number
WO2018173192A1
WO2018173192A1 (PCT/JP2017/011704)
Authority
WO
WIPO (PCT)
Prior art keywords
parallelism
articulated robot
robot
pattern
articulated
Prior art date
Application number
PCT/JP2017/011704
Other languages
English (en)
Japanese (ja)
Inventor
崇 種池
信夫 大石
雅登 鈴木
識 西山
慎也 三浦
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji
Priority to JP2019506828A priority Critical patent/JP6807450B2/ja
Priority to PCT/JP2017/011704 priority patent/WO2018173192A1/fr
Publication of WO2018173192A1 publication Critical patent/WO2018173192A1/fr

Links

Images

Definitions

  • This specification discloses a method for determining the degree of parallelism of an articulated robot and an inclination adjusting device for the articulated robot.
  • In Patent Document 1, a wafer positioned on an alignment plate (magnetic plate) is photographed with a camera mounted on a robot arm, and the parallelism of the wafer with respect to the magnetic plate is calculated based on the captured image of the wafer.
  • Specifically, the parallelism is calculated by the control device as follows. The control device images, with the camera, two alignment marks attached to the wafer. It then calculates a focus shift at each alignment mark based on the captured image of each mark, and calculates the parallelism of the wafer with respect to the magnetic plate from the focus shifts. Finally, based on the calculated parallelism, the control device controls the actuator of each joint of the robot arm to tilt the wafer so that it becomes parallel to the magnetic plate.
  • Patent Document 1 detects a focus shift from a captured image of a mark. Consequently, either a dedicated detection device is required, or the processing becomes complicated in order to ensure sufficient detection accuracy. The same problem can arise when determining the parallelism with respect to the reference plane of an articulated robot.
  • The main object of the present disclosure is to determine the parallelism of an articulated robot with respect to its reference plane more easily.
  • The gist of the present disclosure is a parallelism determination method for an articulated robot that determines the parallelism with respect to a reference plane of the articulated robot, in which a plate having a predetermined pattern is placed on a work table of the articulated robot, the plate is imaged by a camera mounted on the arm of the robot, the interval of the pattern recognized in the captured image of the plate is measured at a plurality of locations, and the parallelism is determined based on a comparison of the measured intervals.
  • Since the parallelism determination method of the present disclosure measures the interval of the pattern in the captured image at a plurality of locations and determines the parallelism by comparing the measured intervals, the parallelism with respect to the reference plane of the articulated robot can be determined more easily.
  • FIG. 1 is a configuration diagram showing an outline of the configuration of the robot 20.
  • FIGS. 2A and 2B are explanatory diagrams showing the movable range of the robot 20.
  • FIG. 3 is a block diagram showing the electrical connection relationship among the robot 20, the robot control device 70, and the image processing device 80.
  • FIG. 4 is an explanatory diagram showing an example of the inclination adjustment process.
  • FIG. 5 is an explanatory diagram showing how the jig plate is imaged.
  • FIGS. 6A and 6B are explanatory diagrams showing the upper surface pattern of the jig plate P.
  • FIGS. 7A to 8B are explanatory diagrams showing the imaging of the jig plate P and the resulting captured images.
  • FIG. 9 is an explanatory diagram showing the size H of the image sensor, the focal length F, the field of view FOV, and the working distance WD.
  • FIG. 11 is an explanatory diagram showing the working distances WD1 and WD2 at the A end and the B end of the field of view when the camera 24 is inclined with respect to the object.
  • FIG. 12 is an explanatory diagram showing the dot center distances La1 and La2 at the A end and the dot center distances Lb1 and Lb2 at the B end.
  • FIG. 1 is a configuration diagram showing an outline of the configuration of the robot 20.
  • 2A and 2B are explanatory diagrams showing the movable range of the robot 20.
  • FIG. 3 is a block diagram showing an electrical connection relationship among the robot 20, the robot control device 70, and the image processing device 80.
  • The robot 20 is controlled by the robot control device 70 (see FIG. 3) and performs predetermined work on a workpiece (work object) transported by the workpiece transfer device 12 (see FIG. 3).
  • Examples of the predetermined work include a pick-up operation for picking up a workpiece, a place operation for placing the workpiece at a predetermined position, and an assembly operation for assembling the workpiece at a predetermined position.
  • As shown in FIG. 1, the robot 20 includes a 5-axis vertical articulated arm (hereinafter referred to as the arm) 22.
  • the arm 22 has six links (first to sixth links 31 to 36) and five joints (first to fifth joints 41 to 45) that connect the links so as to be rotatable or pivotable.
  • Each joint (first to fifth joints 41 to 45) is provided with a motor (servo motor) 51 to 55 for driving the joint and an encoder (rotary encoder) 61 to 65 for detecting the rotational position of the corresponding motor.
  • a work tool as an end effector can be attached to and detached from the distal end link (sixth link 36) of the arm 22.
  • the work tool include an electromagnetic chuck, a mechanical chuck, and a suction nozzle.
  • the tool attached to the tip link is appropriately selected according to the shape and material of the work to be worked.
  • a camera 24 is attached to the tip end portion (fifth link 35) of the arm 22.
  • the camera 24 is for capturing an image of the workpiece in order to recognize the position and posture of the workpiece.
  • The arm 22 of the present embodiment configured in this way can move in a three-dimensional space, with the X axis taken as the front-rear direction as viewed from the front of the robot 20, the Z axis as the vertical direction of the robot 20 (the direction of the rotation axis of the first joint 41), and the Y axis as the direction orthogonal to the X axis and the Z axis, and can also move in the rotational direction (RC) around the Z axis. Furthermore, as shown in FIG. 2B, the arm 22 can move in the rotational direction (RB) around the Y axis.
  • the robot control device 70 is configured as a microprocessor centered on the CPU 71, and includes a ROM 72, an HDD 73, a RAM 74, an input / output interface (not shown), a communication interface (not shown), and the like in addition to the CPU 71.
  • the HDD 73 stores an operation program of the robot 20 and the like. Detection signals from the encoders 61 to 65 and the like are input to the robot control device 70. Further, the robot control device 70 outputs control signals to the motors 51 to 55, the workpiece transfer device 12, and the like.
  • The robot control device 70 drives and controls the motors 51 to 55 of the robot 20 to move the work tool attached to the tip link (sixth link 36) of the arm 22 toward the workpiece and to perform the predetermined work on the workpiece with the work tool. Specifically, the robot control device 70 acquires from the image processing device 80 the target position (X, Y, Z) and target posture (RB, RC) of the work tool for performing work on the workpiece. Subsequently, the robot control device 70 coordinate-converts the acquired target position (X, Y, Z) and target posture (RB, RC) into the target position (target angle) of each joint of the arm 22 using known DH parameters or the like. Then, the robot control device 70 drives and controls the corresponding motors 51 to 55 so that the position (angle) of each joint coincides with the coordinate-converted target position (target angle), and drives and controls the work tool so that the work is performed on the workpiece.
  • an image processing device 80 is communicably connected to the robot control device 70.
  • the image processing apparatus 80 is configured as a microprocessor centered on a CPU 81, and includes a ROM 82, an HDD 83, a RAM 84, an input / output interface (not shown), a communication interface (not shown), and the like in addition to the CPU 81.
  • the image processing apparatus 80 receives an image signal from the camera 24, an input signal from the input apparatus 85, and the like.
  • the image processing device 80 outputs a drive signal to the camera 24, an output signal to the output device 86, and the like.
  • the input device 85 is an input device on which an operator performs an input operation, such as a keyboard and a mouse.
  • the output device 86 is a display device for displaying various information such as a liquid crystal display.
  • the image processing device 80 is communicably connected to the robot control device 70 and exchanges control signals and data with each other.
  • FIG. 4 is an explanatory diagram illustrating an example of the inclination adjustment process.
  • the CPU 81 of the image processing apparatus 80 executes the following processes of S110 to S160.
  • FIG. 6A and 6B are explanatory views showing the upper surface pattern of the jig plate P.
  • As shown in FIG. 6A, a pattern in which circular dots (marks) are arranged in a matrix is formed on the upper surface of the jig plate P.
  • Alternatively, as shown in FIG. 6B, a checkered pattern in which squares of two colors (white and black) are arranged alternately may be formed on the upper surface of the jig plate P.
  • the CPU 81 of the image processing apparatus 80 images the jig plate P with the camera 24 mounted on the arm 22 of the robot 20 (S110). This process is performed by transmitting a control signal including the imaging position of the camera 24 corresponding to the installation position of the jig plate P to the robot controller 70.
  • In response, the robot control device 70 drives and controls the motors 51 to 55 so that the camera 24 is positioned above the jig plate P, as shown in FIG. 5.
  • the CPU 81 performs lens distortion correction for correcting lens aberration (distortion aberration) included in the captured image of the jig plate P using a known distortion correction algorithm (S120).
  • In the lens distortion correction, for example, the position of each pixel in the image is associated with a distortion amount and stored in the ROM 82 as distortion amount mapping data, and the position of each pixel in the captured image is corrected by the corresponding distortion amount.
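As an illustration, the mapping-based correction described above could look like the following sketch. The array layout, the map values, and the function name are hypothetical, not taken from the patent; the text only specifies that each pixel position is paired with a stored distortion amount.

```python
import numpy as np

def correct_distortion(points, distortion_map):
    """Shift each measured (x, y) pixel position by the distortion
    amount stored for its pixel location in the mapping data."""
    corrected = []
    for x, y in points:
        dx, dy = distortion_map[int(round(y)), int(round(x))]
        corrected.append((x - dx, y - dy))
    return corrected

# Hypothetical mapping data: a (height, width, 2) array holding the
# distortion (dx, dy) of every pixel; here a uniform 0.5 px shift in x.
dmap = np.zeros((480, 640, 2))
dmap[..., 0] = 0.5

print(correct_distortion([(100.0, 200.0)], dmap))  # [(99.5, 200.0)]
```

A production implementation would build the map once from a calibration measurement and apply it to the whole image rather than to individual points.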
  • the CPU 81 separates the background color (white) and the mark color (black) from the captured image after the lens distortion correction, and extracts the outlines of the four corner marks of the jig plate P (S130).
  • Next, the CPU 81 measures the distance between the centers of the two dots at the A end on one side in the X direction of the robot 20 (the longitudinal direction of the robot 20) (dot center distance La) and the distance between the centers of the two dots at the B end on the other side (dot center distance Lb) by counting the number of pixels included between the dot centers (S140).
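The pixel-counting measurement of S140 can be approximated by taking the centroid of each dot in a binarized image and measuring the Euclidean distance between centroids in pixels. The synthetic two-dot image below is purely illustrative.

```python
import numpy as np

def dot_center(mask):
    """Centroid (x, y), in pixels, of the True pixels of one dot."""
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def center_distance(mask_a, mask_b):
    """Distance in pixels between the centers of two dots."""
    ax, ay = dot_center(mask_a)
    bx, by = dot_center(mask_b)
    return float(np.hypot(bx - ax, by - ay))

# Synthetic binarized image: two 3x3 dots whose centers are 10 px apart.
mask_a = np.zeros((20, 20), dtype=bool)
mask_b = np.zeros((20, 20), dtype=bool)
mask_a[5:8, 2:5] = True    # dot A, center at (3, 6)
mask_b[5:8, 12:15] = True  # dot B, center at (13, 6)

print(center_distance(mask_a, mask_b))  # 10.0
```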
  • The CPU 81 then derives the parallelism θ (the inclination in the rotational direction around the Y axis) with respect to the reference surface (the upper surface of the work table 11) of the robot 20, based on the ratio (La/Lb) of the dot center distance La at the A end to the dot center distance Lb at the B end (S150).
  • FIG. 7A is an explanatory diagram showing a state of imaging the jig plate P when the camera 24 is parallel to the jig plate P.
  • FIG. 7B is an explanatory diagram showing a captured image of the jig plate P captured from the position of FIG. 7A.
  • FIG. 8A is an explanatory diagram showing a state of imaging the jig plate P when the camera 24 is not parallel to the jig plate P.
  • FIG. 8B is an explanatory diagram showing a captured image of the jig plate P captured from the position of FIG. 8A.
  • As shown in FIG. 7A, when the camera 24 is parallel to the jig plate P, all marks appear at the same size and at the same interval in the captured image of the jig plate P, as shown in FIG. 7B.
  • As shown in FIG. 8A, when the camera 24 is tilted relative to the jig plate P around the Y axis, the captured image of the jig plate P contains a region in which the marks appear large and at wide intervals and a region in which the marks appear small and at narrow intervals, as shown in FIG. 8B. The larger the relative inclination of the camera 24 with respect to the jig plate P, the larger these differences in mark size and interval become.
  • Therefore, by measuring the interval of the marks appearing in the captured image of the jig plate P at a plurality of locations and comparing them, the CPU 81 can obtain the relative inclination between the camera 24 and the jig plate P, that is, the parallelism θ (tilt) with respect to the reference plane of the robot 20.
  • The processing in S150 is performed by experimentally obtaining in advance the relationship between the ratio (La/Lb) of the dot center distance La at the A end to the dot center distance Lb at the B end and the parallelism θ, storing that relationship as a map, and deriving the corresponding parallelism θ from the map.
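The map lookup of S150 amounts to interpolating in a precomputed ratio-to-angle table. The calibration values below are invented for illustration; a real table would be measured experimentally, as the text describes.

```python
import numpy as np

# Hypothetical calibration map obtained experimentally in advance:
# ratio La/Lb of the dot center distances versus parallelism θ [deg].
ratios = np.array([0.90, 0.95, 1.00, 1.05, 1.10])
thetas = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

def parallelism_from_ratio(la, lb):
    """Derive θ by linear interpolation in the calibration map."""
    return float(np.interp(la / lb, ratios, thetas))

print(parallelism_from_ratio(100.0, 100.0))  # 0.0 (camera parallel)
```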
  • The processes of S110 to S150 described above correspond to the parallelism determination process of determining the parallelism (tilt) with respect to the reference plane of the robot 20.
  • Then, the CPU 81 sets the tilt correction value of the robot 20 based on the derived parallelism θ (S160) and ends the tilt adjustment process.
  • The process of S160 is performed by setting an offset angle ΔRB for the target posture (RB) of the work tool so that the inclination in the rotational direction around the Y axis is offset in the reverse direction.
  • Thereafter, the robot control device 70 controls the robot 20 based on the target posture (RB) of the work tool offset by the offset angle ΔRB.
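In code, the reverse-direction offset of S160 reduces to subtracting the measured tilt from the RB target. The sign convention and function name below are assumptions for illustration only.

```python
def corrected_target_rb(target_rb, parallelism_theta):
    """Offset the RB component of the target posture in the reverse
    direction of the measured tilt θ so that the work tool ends up
    parallel to the reference plane (sign convention assumed)."""
    offset_rb = -parallelism_theta   # ΔRB cancels the measured tilt
    return target_rb + offset_rb

print(corrected_target_rb(0.0, 1.5))  # -1.5
```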
  • In the present embodiment, the camera 24 corresponds to a camera, and the CPU 81 of the image processing device 80 corresponds to a control device.
  • the CPU 81 of the image processing apparatus 80 images the jig plate P having a pattern (matrix mark) formed on the upper surface (imaging surface) with the camera 24. Subsequently, the CPU 81 measures a dot center distance La between two non-diagonal marks among the four corner marks in the captured image of the jig plate P and a dot center distance Lb between the remaining two marks. Then, the CPU 81 derives the parallelism ⁇ of the robot 20 based on the measured ratio (La / Lb) between the dot center distances. Thereby, the parallelism with respect to the reference plane of the articulated robot can be more easily determined without using a dial gauge or the like.
  • the CPU 81 performs lens distortion correction on the captured image of the jig plate P, and measures the dot center distances La and Lb based on the captured image after the lens distortion correction. Thereby, the CPU 81 can more accurately measure the distances La and Lb between the dot centers, and can more appropriately determine the parallelism.
  • Furthermore, the CPU 81 sets an inclination correction value of the robot 20 based on the derived parallelism θ. Thereby, the inclination with respect to the reference plane of the robot 20 can be adjusted by a simpler method.
  • In the embodiment above, the CPU 81 derives the parallelism θ with respect to the reference plane of the robot 20 from the map, based on the ratio (La/Lb) of the dot center distance La at the A end to the dot center distance Lb at the B end.
  • Alternatively, the CPU 81 may calculate the parallelism θ using a predetermined calculation formula.
  • FIG. 9 is an explanatory diagram showing the size H, focal length F, field of view FOV, and working distance WD of the image sensor.
  • Let the size of the image sensor 24b be H [mm], the focal length, i.e., the distance from the image sensor 24b to the lens 24a, be F [mm], the working distance, i.e., the distance from the lens 24a to the imaging object, be WD [mm], and the field of view, i.e., the imaging range at the working distance, be FOV [mm].
  • From the geometric relationship in equation (1), the field of view FOV is calculated by the following equation (2).
  • The resolution RES indicates the size per pixel in the captured image and is calculated by the following equation (3), using the field of view FOV in the X direction and the number of pixels PN in the X direction of the image sensor 24b.
  • Substituting equation (2) into equation (3), the resolution RES is expressed by the following equation (4), from which it can be seen that the resolution RES is proportional to the working distance WD.
  • α in equation (4) is the resolution per unit length of working distance (also referred to as the unit resolution), a constant determined by the specifications of the image sensor 24b (number of pixels, sensor size) and the individual lens (focal length, magnification).
  • F × FOV = H × WD … (1)
  • FOV = (H / F) × WD … (2)
  • RES = FOV / PN … (3)
  • RES = (H / (F × PN)) × WD = α × WD … (4)
  • The unit resolution α is calculated by the following equation (5): α = H / (F × PN) … (5)
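Equations (2) to (5) can be checked numerically. The sensor size, focal length, pixel count, and working distance below are made-up values, not taken from the patent.

```python
def field_of_view(H, F, WD):
    """Eq. (2): FOV = (H / F) * WD, all lengths in mm."""
    return (H / F) * WD

def resolution(H, F, WD, PN):
    """Eq. (3)/(4): RES = FOV / PN = alpha * WD [mm/pix]."""
    return field_of_view(H, F, WD) / PN

# Hypothetical optics: 8 mm sensor, 16 mm lens, 1000 px, WD = 200 mm.
H, F, PN, WD = 8.0, 16.0, 1000, 200.0
alpha = H / (F * PN)                 # Eq. (5): unit resolution
print(field_of_view(H, F, WD))       # 100.0 mm field of view
print(resolution(H, F, WD, PN))      # 0.1 mm per pixel
print(alpha * WD)                    # ≈ 0.1, consistent with Eq. (4)
```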
  • FIG. 11 is an explanatory diagram showing the working distance WD1 at the left end (A end) of the field of view and the working distance WD2 at the right end (B end) of the field of view when the camera 24 is inclined by a predetermined angle θ with respect to the object.
  • FIG. 12 is an explanatory diagram showing the dot center distances La1 and La2 at the A end and the dot center distances Lb1 and Lb2 at the B end.
  • When the camera 24 is tilted by the angle θ, the working distance WD1 at the left end (A end) of the field of view becomes shorter than the original working distance WD, and the working distance WD2 at the right end (B end) becomes longer than the original working distance WD. Accordingly, in the captured image of the camera 24, objects appear large at the A end on the near side and small at the B end on the far side.
  • Here, ΔWD = WD2 − WD1 … (6)
  • Since the distance L between the A-end and B-end measurement points is known, the inclination θ of the camera 24 can be calculated by detecting ΔWD, using the relationship tan θ = ΔWD / L.
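The relation between ΔWD and θ is simple trigonometry, tan θ = ΔWD / L. A numeric sketch, with invented values:

```python
import math

def camera_tilt_deg(wd1, wd2, L):
    """Tilt θ of the camera from the working-distance difference
    across the field of view: tan θ = (WD2 - WD1) / L, where L is
    the known distance between the A-end and B-end measurement points."""
    return math.degrees(math.atan((wd2 - wd1) / L))

# Hypothetical: WD grows by 3.5 mm over L = 100 mm -> about 2 degrees.
print(round(camera_tilt_deg(200.0, 203.5, 100.0), 2))  # 2.0
```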
  • Equation (4) can be transformed into the following equation (7), which gives the working distance from the measured resolution: WD = RES / α … (7)
  • The resolution RES1 at the A end and the resolution RES2 at the B end can be calculated by processing the captured image of FIG. 12. That is, the resolution RES1 at the A end is calculated by the following equation (8), using the average of the dot center distances La1 and La2 [pix] between two adjacent dots including the dot at the center of the A-end edge in FIG. 12 and the specified center-to-center distance DP [mm] of two adjacent dots: RES1 = DP / ((La1 + La2) / 2) … (8)
  • the dot center distances La1 and La2 [pix] can be measured by counting the number of pixels included between the centers of adjacent dots appearing in the captured image.
  • Similarly, the resolution RES2 at the B end is calculated by the following equation (9), using the average of the dot center distances Lb1 and Lb2 [pix] of two adjacent dots including the dot at the center of the B-end edge in FIG. 12 and the specified center-to-center distance DP [mm]: RES2 = DP / ((Lb1 + Lb2) / 2) … (9)
  • Note that each of the resolutions RES1 and RES2 can also be calculated using the specified distance between any two dots in the dot group at the A end or the B end, respectively, and the number of pixels included between those two dots in the captured image.
  • By converting the resolutions RES1 and RES2 into the working distances WD1 and WD2 with equation (7) and applying the relationship above, the inclination θ of the camera 24, that is, the parallelism θ with respect to the reference plane of the robot 20, can be calculated.
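Putting equations (7) to (9) together: the local resolutions at the A and B ends give local working distances via WD = RES / α, and the working-distance difference yields θ. All numeric inputs below are hypothetical.

```python
import math

def parallelism_from_dots(la1, la2, lb1, lb2, DP, alpha, L):
    """Eqs. (8)/(9): RES = DP / (mean measured dot pitch [pix]);
    Eq. (7): WD = RES / alpha; then tan θ = (WD2 - WD1) / L."""
    res1 = DP / ((la1 + la2) / 2.0)   # A-end resolution [mm/pix]
    res2 = DP / ((lb1 + lb2) / 2.0)   # B-end resolution [mm/pix]
    wd1 = res1 / alpha                # A-end working distance [mm]
    wd2 = res2 / alpha                # B-end working distance [mm]
    return math.degrees(math.atan((wd2 - wd1) / L))

# Hypothetical measurement: 5 mm dot pitch, alpha = 0.0005, L = 100 mm;
# dots at the near A end appear larger (51 px) than at the far B end (49 px).
theta = parallelism_from_dots(51.0, 51.0, 49.0, 49.0, 5.0, 0.0005, 100.0)
print(round(theta, 2))
```

Note that the larger pixel pitch at the A end produces a smaller RES1 and hence a shorter WD1, matching the text's observation that the near side appears magnified.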
  • a matrix dot pattern or a checkered pattern is formed on the upper surface (imaging surface) of the jig plate P.
  • However, the pattern formed on the jig plate P may be any pattern having at least four marks arranged in a square or rectangular shape.
  • More generally, any pattern may be used as long as the parallelism of the camera 24 with respect to the jig plate P, that is, the parallelism with respect to the reference plane of the robot 20, can be determined by measuring the interval of the pattern appearing in the captured image by the camera 24 at a plurality of locations.
  • the robot 20 includes the 5-axis vertical articulated arm 22.
  • the articulated arm is not limited to 5 axes, and may have, for example, 6 axes or more.
  • A 6-axis articulated arm can move in the three-dimensional space of X (front-rear direction of the robot), Y (left-right direction), and Z (up-down direction), and can move in the rotational direction (RA) around the X axis, the rotational direction (RB) around the Y axis, and the rotational direction (RC) around the Z axis.
  • In this case, the CPU of the image processing apparatus measures the distance between the dot centers of the two marks on one side in the Y direction, among the four corner marks included in the captured image of the jig plate by the camera, and the distance between the dot centers of the two marks on the other side. Thereby, the CPU of the image processing apparatus can determine the inclination (parallelism) in the rotational direction around the X axis with respect to the reference plane of the robot, based on the measured distances between the dot centers.
  • The invention of the present disclosure has been described above in the form of an inclination adjustment device, but it may also be embodied in the form of a parallelism determination method.
  • This disclosure can be used in the robot manufacturing industry.

Landscapes

  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for determining the parallelism of an articulated robot, designed to determine the parallelism of an articulated robot with respect to a reference surface. This parallelism determination method places a plate having a predetermined pattern of marks on a work table for an articulated robot and captures an image of the plate with a camera mounted on the arm of the articulated robot. The method then measures a plurality of distances between marks as recognized in the captured image of the plate, and determines the parallelism of the articulated robot based on a comparison of the measured plurality of distances between marks.
PCT/JP2017/011704 2017-03-23 2017-03-23 Procédé de détermination de parallélisme de robot articulé et dispositif de réglage d'inclinaison de robot articulé WO2018173192A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019506828A JP6807450B2 (ja) 2017-03-23 2017-03-23 多関節ロボットの平行度判定方法および多関節ロボットの傾き調整装置
PCT/JP2017/011704 WO2018173192A1 (fr) 2017-03-23 2017-03-23 Procédé de détermination de parallélisme de robot articulé et dispositif de réglage d'inclinaison de robot articulé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/011704 WO2018173192A1 (fr) 2017-03-23 2017-03-23 Procédé de détermination de parallélisme de robot articulé et dispositif de réglage d'inclinaison de robot articulé

Publications (1)

Publication Number Publication Date
WO2018173192A1 true WO2018173192A1 (fr) 2018-09-27

Family

ID=63585110

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011704 WO2018173192A1 (fr) 2017-03-23 2017-03-23 Procédé de détermination de parallélisme de robot articulé et dispositif de réglage d'inclinaison de robot articulé

Country Status (2)

Country Link
JP (1) JP6807450B2 (fr)
WO (1) WO2018173192A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110187259A (zh) * 2019-06-10 2019-08-30 德淮半导体有限公司 一种防止晶圆测试中针痕偏移的调整***以及调整方法
WO2021152744A1 (fr) * 2020-01-29 2021-08-05 株式会社Fuji Dispositif de commande, procédé de commande, dispositif de traitement d'informations et procédé de traitement d'informations
WO2024089852A1 (fr) * 2022-10-27 2024-05-02 株式会社Fuji Dispositif de commande, système de robot et procédé de commande

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1011146A (ja) * 1996-06-25 1998-01-16 Shinko Electric Co Ltd 移動体の停止姿勢補正装置
JP2002340529A (ja) * 2001-05-16 2002-11-27 Nippon Koei Power Systems Co Ltd デジタルカメラを用いた構造物の亀裂変位計測方法
JP2003030628A (ja) * 2001-07-18 2003-01-31 Fujitsu Ltd 相対位置計測装置


Also Published As

Publication number Publication date
JP6807450B2 (ja) 2021-01-06
JPWO2018173192A1 (ja) 2019-11-21

Similar Documents

Publication Publication Date Title
CN107053167B (zh) 控制装置、机器人以及机器人***
US11267142B2 (en) Imaging device including vision sensor capturing image of workpiece
JP7153085B2 (ja) ロボットキャリブレーションシステム及びロボットキャリブレーション方法
JP4267005B2 (ja) 計測装置及びキャリブレーション方法
JP5815761B2 (ja) 視覚センサのデータ作成システム及び検出シミュレーションシステム
KR102230321B1 (ko) 레이저 가공 장치
JP2013231702A (ja) 画像計測装置、画像計測方法及び画像計測プログラム
WO2018173192A1 (fr) Procédé de détermination de parallélisme de robot articulé et dispositif de réglage d'inclinaison de robot articulé
JP6661027B2 (ja) 作業ロボット
JP2012101306A (ja) ロボットの校正装置および校正方法
JP6965422B2 (ja) カメラの平行度判定方法
KR102422990B1 (ko) 스캔을 이용한 로봇의 캘리브레이션 시스템 및 방법
CN116930187A (zh) 车身漆面缺陷视像检测方法与视像检测***
EP3895855A1 (fr) Système de commande robotisée, et procédé de commande robotisée
CN111716340B (zh) 3d相机与机械手臂坐标***的校正装置及方法
JP6578671B2 (ja) ロボット、ロボットの制御方法、及びロボットの制御装置
CN111062989B (zh) 一种高精度二维摄像头与机器人手眼标定的方法及***
JP7105223B2 (ja) ロボットシステム
WO2022254613A1 (fr) Procédé de correction d'écart de position d'un appareil photo et dispositif robot
WO2022124232A1 (fr) Système de traitement d'image et procédé de traitement d'image
JPH05204423A (ja) 視覚装置付きロボット装置における座標較正方法
WO2022014043A1 (fr) Procédé de mesure d'écart de position pour caméra
JP5481123B2 (ja) 部品の搭載装置、部品の搭載方法
WO2024048491A1 (fr) Système de robot et procédé permettant de commander un système de robot
CN116803628A (zh) 对象物的检测方法和检测装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17901398

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019506828

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17901398

Country of ref document: EP

Kind code of ref document: A1