CN111037559B - Quick calibration method and device for position of material tray of machine and storage medium - Google Patents


Info

Publication number
CN111037559B
CN111037559B (application CN201911354411.XA)
Authority
CN
China
Prior art keywords
motor
camera
tray
suction nozzle
photographing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911354411.XA
Other languages
Chinese (zh)
Other versions
CN111037559A (en)
Inventor
张长元
曾纪光
张天皓
夏勇俊
高增禄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Colibri Technologies Co ltd
Original Assignee
Shenzhen Colibri Technologies Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Colibri Technologies Co ltd filed Critical Shenzhen Colibri Technologies Co ltd
Priority to CN201911354411.XA priority Critical patent/CN111037559B/en
Publication of CN111037559A publication Critical patent/CN111037559A/en
Application granted granted Critical
Publication of CN111037559B publication Critical patent/CN111037559B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Supply And Installment Of Electrical Components (AREA)

Abstract

The embodiment of the application discloses a quick calibration method and device for the position of a machine tray, wherein the method comprises the following steps: when the motor is at an initial position, determining a reference photographing point position and the fixed photographing positions of the tray through the camera; when the motor is at the reference photographing point position, determining a reference pixel coordinate through the camera; determining a reference position of the suction nozzle; acquiring the single-pixel precision of the current camera after calibration; when the motor is at a fixed photographing position, obtaining the pixel coordinates of the identification points at different positions of different trays through the camera; and calculating the position of each tray. Implementing the embodiment of the application saves labor, improves accuracy, and prevents trays from being placed incorrectly.

Description

Method and device for quickly calibrating position of material tray of machine and storage medium
Technical Field
The application relates to the technical field of machine placement position calibration, in particular to a method and a device for quickly calibrating a position of a machine tray and a storage medium.
Background
In the automation industry, pick-and-place positioning is a critical step that directly influences the running stability of the machine and the accuracy of the final assembly or detection result. At present, many automatic machines still rely on manual alignment at the initial calibration stage to determine a starting point and then extrapolate the remaining positions. This approach first consumes manpower and time, and second cannot guarantee that the manually identified points meet the required accuracy.
Content of application
An object of the embodiments of the present application is to provide a method and an apparatus for quickly calibrating a tray position of a machine, and a storage medium, so as to save labor consumption, improve accuracy, and prevent a tray from being placed incorrectly.
In order to achieve the above object, in a first aspect, an embodiment of the present application provides a method for quickly calibrating a position of a machine tray, which is applicable to a tray system of a machine, where the tray system includes a motor, a suction nozzle, a camera, and at least one tray, and the suction nozzle and the camera are installed on the motor. The method comprises the following steps:
when the motor is at an initial position, determining a reference photographing point position and a fixed photographing position of the material tray through the camera;
when the motor is positioned at the reference photographing point position, determining a reference pixel coordinate through the camera;
sending an instruction to the motor, enabling the motor to drive the suction nozzle to move to a preset reference position, and taking the current position of the motor as the reference position of the suction nozzle;
acquiring single-pixel precision after the current camera is calibrated;
when the motor is located at the fixed photographing position, the pixel coordinates of the identification points at different positions of different trays are obtained through the camera;
and calculating the position of each tray according to the reference photographing point position, the reference pixel coordinate, the suction nozzle reference position, the single pixel precision and the identification point pixel coordinate.
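The steps above can be sketched as a pure-Python driver loop. The Motor and Camera classes below are stubs standing in for real hardware, and every name here is an illustrative assumption rather than an API from the patent; only the arithmetic follows the formulas given later in the description.

```python
class Motor:
    """Stub motion axis: stores the last commanded (x, y) position."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def move_to(self, x, y):
        self.x, self.y = x, y

    def position(self):
        return (self.x, self.y)


class Camera:
    """Stub camera: returns a canned pixel detection for each photo position."""
    def __init__(self, detections):
        self.detections = detections  # {(x, y): (xp, yp, Rp)}

    def detect(self, motor):
        return self.detections[motor.position()]


def calibrate(motor, camera, ref_photo_pos, nozzle_ref, pixel_pitch, photo_positions):
    """Photograph the reference point, then each fixed position, and convert
    every identified pixel coordinate into a nozzle coordinate."""
    # S101/S102: photograph at the reference point -> reference pixel coordinate
    motor.move_to(*ref_photo_pos)
    ref_pixel = camera.detect(motor)
    # S105/S106: photograph at each fixed position; apply formulas (1) and (2)
    nozzle_coords = []
    for pos in photo_positions:
        motor.move_to(*pos)
        xp, yp, Rp = camera.detect(motor)
        x = (pos[0] - ref_photo_pos[0]) + (xp - ref_pixel[0]) * pixel_pitch + nozzle_ref[0]
        y = (pos[1] - ref_photo_pos[1]) + (yp - ref_pixel[1]) * pixel_pitch + nozzle_ref[1]
        R = (Rp - ref_pixel[2]) + nozzle_ref[2]
        nozzle_coords.append((x, y, R))
    return nozzle_coords
```

With a detection that is 10 pixels right of the reference at pitch 0.01, a motor offset of (10, 5) and a nozzle reference of (50, 50, 0), the computed nozzle position is (60.1, 55.0, 1.0).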
As a specific implementation manner of this application, when the motor is at the initial position, determining the reference photographing point position and the fixed photographing positions of the tray through the camera specifically includes:
when the motor is at an initial position, acquiring a plurality of first pictures through the camera;
and determining the position of the reference photographing point and three fixed photographing positions of the material tray according to the plurality of first pictures and the preset reference position.
As a specific implementation manner of the present application, when the motor is located at the reference photographing point position, determining a reference pixel coordinate by the camera specifically includes:
when the motor is positioned at the reference photographing point position, acquiring a second picture through the camera;
and carrying out template identification on the second picture to obtain a reference pixel coordinate.
As a specific implementation manner of this application, when the motor is at the fixed photographing position, obtaining the identification point pixel coordinates at different positions of different trays through the camera specifically includes:
when the motor is located at the fixed photographing position, photographing is carried out through the camera and template recognition is carried out through a preset template, so that the pixel coordinates of the recognition points at different positions of different trays are obtained.
Wherein the template identification is: and establishing a characteristic shape of the object to be identified as a template, and finding the object to be identified in the picture through similarity matching.
As a specific implementation manner of the present application, calculating the position of each tray according to the reference photographing point position, the reference pixel coordinate, the suction nozzle reference position, the single-pixel precision, and the identification point pixel coordinates specifically includes:
calculating the position coordinate of the suction nozzle according to the position of the reference photographing point, the reference pixel coordinate, the reference position of the suction nozzle and the single pixel precision;
and calculating the position of each material tray according to the position coordinates of the suction nozzle.
Wherein, calculating the suction nozzle position coordinate specifically includes:
when the next identification point is moved, after the identification is carried out through the template, the current photographing position (x, y) of the motor is obtained, and the current pixel coordinate (x) of the identification point p ,y p ,R p ) And obtaining the position coordinates of the suction nozzle according to the formulas (1) and (2) as follows:
x 1 =(x-x 0 )+(x p -x p0 )*P+x n1
y 1 =(y-y 0 )+(y p -y p0 )*P+y n1
R 1 =(R p -R p0 )+R n1
equations (1) and (2) are respectively: nozzle position X, Y = motor offset position X, Y + picture offset position X, Y + nozzle reference position X, Y; the suction nozzle angle R = the picture offset angle R + the suction nozzle reference angle R;
wherein (x) 0 ,y 0 ) Motor coordinate (x) of position of taking picture point as reference p0 ,y p0 ,R p0 ) Pixel coordinates of a reference shot point position, (x) n1 ,y n1 ,R n1 ) Is the motor coordinate of the suction nozzle reference position, and P is single pixel precision.
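Formulas (1) and (2) can be transcribed directly into a small helper. The function name and signature are illustrative assumptions; the variable names mirror the patent's symbols.

```python
def nozzle_position(photo_xy, ref_photo_xy, pixel, ref_pixel, nozzle_ref, P):
    """Nozzle pose = motor offset + pixel offset * pitch + nozzle reference.

    photo_xy     -- motor position (x, y) when the photo was taken
    ref_photo_xy -- motor coordinate (x_0, y_0) of the reference photo point
    pixel        -- identified pixel coordinate (x_p, y_p, R_p)
    ref_pixel    -- reference pixel coordinate (x_p0, y_p0, R_p0)
    nozzle_ref   -- nozzle reference position (x_n1, y_n1, R_n1)
    P            -- calibrated single-pixel precision (length per pixel)
    """
    x1 = (photo_xy[0] - ref_photo_xy[0]) + (pixel[0] - ref_pixel[0]) * P + nozzle_ref[0]
    y1 = (photo_xy[1] - ref_photo_xy[1]) + (pixel[1] - ref_pixel[1]) * P + nozzle_ref[1]
    # The angle needs no motor term: rotation is measured in the image alone
    R1 = (pixel[2] - ref_pixel[2]) + nozzle_ref[2]
    return x1, y1, R1
```

For example, a photo at (10, 5) whose identification point sits 10 pixels from the reference at pitch 0.01 yields a nozzle pose of (60.1, 55.0, 1.0) for a nozzle reference of (50, 50, 0).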
Calculating each tray position from the nozzle position coordinates specifically includes:
calculating the slot coordinates and the tray angle of the tray from the nozzle position coordinates, where the slot coordinate formulas are:
x_ij = x_O1 + D_Xx * i + D_Yx * j;
y_ij = y_O1 + D_Yy * j + D_Xy * i;
and the tray angle formula is: R = (R_O1 + R_Y1 + R_X1) / 3;
where (x_O1, y_O1, R_O1), (x_Y1, y_Y1, R_Y1), (x_X1, y_X1, R_X1) are the nozzle coordinates at the three calibration positions on each tray; D_Yy is the Y-coordinate difference between two adjacent slots in the Y direction and D_Yx the X-coordinate difference in the Y direction; D_Xy is the Y-coordinate difference between two adjacent slots in the X direction and D_Xx the X-coordinate difference in the X direction; and R is the tray angle.
In a second aspect, an embodiment of the present application further provides a quick calibration apparatus for a machine tray position, including a processor, an input device, an output device, and a memory, which are connected to each other, where the memory is used to store a computer program including program instructions, and the processor is configured to call the program instructions to execute the method of the first aspect.
In a third aspect, embodiments of the present application further provide a computer-readable storage medium having a computer program, where the computer program includes program instructions, and the program instructions, when executed by a processor, cause the processor to execute the method of the first aspect.
The embodiment of the application has the following advantages:
1. Labor is saved. Each calibration no longer requires an operator to cooperate; the machine can verify its pick-and-place of the materials by itself.
2. Accuracy is improved. Manual point correction relies on feel and visual inspection and cannot be accurately quantified, whereas this scheme achieves fully data-based parameter calibration.
3. Trays are prevented from being placed incorrectly. If a position point cannot be detected during automatic calibration, the process fails, which indirectly indicates that the tray position deviates greatly; this prevents abnormalities and improves the controllability of the machine.
Drawings
In order to more clearly illustrate the technical solutions of the present application and of the prior art, the drawings required in the detailed description are briefly introduced below.
FIG. 1 is a schematic flow chart of a method for quickly calibrating a tray position of a machine according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the fixed photographing position calibration of the tray;
FIG. 3 is a schematic view of template identification;
fig. 4 is a schematic structural diagram of a quick calibration device for a machine tray provided by an embodiment of the application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The inventive concept of the present application is as follows: to solve the technical problems in the prior art (see the background section of the present application), a debugging method is provided for automatically calibrating the pick-and-place positions of an automatic machine. After the program confirms a reference coordinate, the necessary coordinate information is obtained by photographing with the camera, and the position data of all trays are then obtained by calculation, enabling blind pick-and-place of test material, saving labor and calibration time, and improving efficiency and accuracy. The programs (software) referred to in the present application include both camera software and control software. As will be understood from the following description, the camera software and the control software may be integrated into one piece of hardware, so the executing body in the following method embodiments can be understood as a single machine tray position quick calibration device.
In addition, before the tray position is quickly calibrated, the machine is provided with at least one tray for placing materials, a suction nozzle for picking up materials, an industrial camera (for photographing and positioning), and a motor (for moving in real time to take and place materials). The camera and the suction nozzle are fixed on the moving motor and can rotate freely to match the material-taking directions of different trays, and the camera is calibrated before use.
on the basis, as shown in fig. 1, the method for quickly calibrating the position of the material tray of the machine comprises the following steps:
s101, when the motor is located at an initial position, the camera determines the position of a reference photographing point and the fixed photographing position of the tray.
Specifically, the motor is jogged manually through the control software while the camera is kept in a continuous photographing state, and the real-time pictures are observed until the defined reference position appears in the picture; the motor position at that moment is the required reference photographing point position (reference photographing position). In the same way, the fixed photographing point positions (SlotO, SlotY, SlotX) of each tray are found, as shown in fig. 2.
And S102, when the motor is positioned at the reference photographing point position, determining a reference pixel coordinate through the camera.
Specifically, the control software moves the motor to the reference photographing point position to take a picture; through the camera software and the corresponding template (template identification is introduced with the figure below), the pixel coordinate of the identification point (the reference pixel coordinate) is obtained after the template is identified.
As shown in fig. 3, a partial feature shape of the object to be recognized is established as the template, and the object is found in the picture through similarity matching; the square area in the picture is the recognition result.
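The similarity matching described above can be illustrated with a brute-force normalized cross-correlation search. This is a minimal pure-Python sketch of the idea, not the patent's actual camera-software implementation, and the image and template sizes below are invented for the example; production systems would typically use a vision library's template matcher.

```python
import math

def match_template(image, template):
    """Slide the template over the image and return the top-left (row, col)
    of the window with the highest normalized cross-correlation score."""
    th, tw = len(template), len(template[0])
    ih, iw = len(image), len(image[0])
    tmean = sum(map(sum, template)) / (th * tw)
    t = [[v - tmean for v in row] for row in template]
    tnorm = math.sqrt(sum(v * v for row in t for v in row))
    best, best_score = (0, 0), float("-inf")
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            win = [row[c:c + tw] for row in image[r:r + th]]
            wmean = sum(map(sum, win)) / (th * tw)
            w = [[v - wmean for v in row] for row in win]
            wnorm = math.sqrt(sum(v * v for row in w for v in row))
            if wnorm == 0 or tnorm == 0:
                continue  # flat window: no similarity defined
            score = sum(wv * tv for wr, tr in zip(w, t)
                        for wv, tv in zip(wr, tr)) / (wnorm * tnorm)
            if score > best_score:
                best, best_score = (r, c), score
    return best

# Synthetic example: embed a bright 5x5 square in a dark image and recover
# its location with a template whose bright region is offset by (1, 1).
img = [[0.0] * 40 for _ in range(40)]
for r in range(12, 17):
    for c in range(20, 25):
        img[r][c] = 1.0
tmpl = [[0.0] * 7 for _ in range(7)]
for r in range(1, 6):
    for c in range(1, 6):
        tmpl[r][c] = 1.0
print(match_template(img, tmpl))  # -> (11, 19)
```

The perfectly aligned window matches the template pattern exactly, so its correlation score is 1.0 and it wins over every partial overlap.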
S103, sending an instruction to the motor, enabling the motor to drive the suction nozzle to move to a preset reference position, and taking the current position of the motor as the reference position of the suction nozzle.
Specifically, the control software controls the motor to move, the suction nozzle is moved to the reference position, and the obtained current motor position is the suction nozzle reference position.
And S104, acquiring the single-pixel precision of the current camera after calibration.
Specifically, the single-pixel precision P after the current camera calibration is obtained through the camera software.
And S105, when the motor is positioned at the fixed photographing position, obtaining the pixel coordinates of the identification points at different positions of different trays through the camera.
And S106, calculating the position of each tray according to the reference photographing point position, the reference pixel coordinate, the suction nozzle reference position, the single pixel precision and the identification point pixel coordinate.
Specifically, the software moves the motor to the fixed photographing point positions SlotO, SlotY, and SlotX of each tray in turn; after photographing and template recognition through the camera software and the corresponding templates, the identification point pixel coordinates at the three positions of each tray are obtained, and all slot positions of each tray are then calculated with the formulas, giving the position of the whole tray.
Wherein, the calculation part comprises the following contents:
1. Calculating a single position coordinate
Following the calibration process, the marked reference position serves as our datum, giving the reference photographing position (x_0, y_0), the reference pixel coordinate (x_p0, y_p0, R_p0), the nozzle reference position (x_n1, y_n1, R_n1), and the single-pixel precision P.
When we move to the next identification point and confirm it through template identification, we obtain the motor position (the current photographing position (x, y)) and the pixel coordinate of the identification point (the current pixel coordinate (x_p, y_p, R_p)), from which the current nozzle position can be deduced by applying the formulas: nozzle position X, Y = motor offset X, Y + picture offset X, Y + nozzle reference X, Y; nozzle angle R = picture offset angle R + nozzle reference angle R:
current nozzle coordinate x_1 = (x − x_0) + (x_p − x_p0) * P + x_n1; y_1 = (y − y_0) + (y_p − y_p0) * P + y_n1;
R_1 = (R_p − R_p0) + R_n1.
2. Calculating the coordinates of the whole tray
Referring to fig. 2 and using the single-position calculation above, the motor is moved and photographs are taken at SlotO, SlotY, and SlotX in turn, yielding the nozzle coordinates at the three positions: (x_O1, y_O1, R_O1), (x_Y1, y_Y1, R_Y1), (x_X1, y_X1, R_X1). As we know, the tray has A columns of slots in the Y direction and B rows of slots in the X direction, i.e. A × B slots in total. In the Y direction, the Y-coordinate difference between two adjacent slots is D_Yy and the X-coordinate difference is D_Yx; in the X direction, the Y-coordinate difference between two adjacent slots is D_Xy and the X-coordinate difference is D_Xx:
D_Yy = (y_Y1 − y_O1) / A; D_Yx = (x_Y1 − x_O1) / A;
D_Xy = (y_X1 − y_O1) / B; D_Xx = (x_X1 − x_O1) / B.
So the coordinate (x_ij, y_ij) of the slot in row i and column j is calculated as:
x_ij = x_O1 + D_Xx * i + D_Yx * j; y_ij = y_O1 + D_Yy * j + D_Xy * i.
Because all slot angles of a tray are identical, averaging reduces the calculation error:
tray angle R = (R_O1 + R_Y1 + R_X1) / 3.
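The pitch and slot formulas can be transcribed as a small helper that reconstructs the whole grid from the three photographed positions. The function name and return shape are illustrative assumptions; the arithmetic follows the formulas in the text.

```python
def tray_grid(slot_o, slot_y, slot_x, A, B):
    """Reconstruct a tray's slot grid from three measured nozzle coordinates.

    slot_o, slot_y, slot_x -- nozzle coordinates (x, y, R) photographed at
                              SlotO, SlotY, and SlotX respectively
    A, B -- slot counts in the Y and X directions
    Returns (slot_xy, angle), where slot_xy(i, j) gives the coordinate of
    the slot in row i, column j.
    """
    xo, yo, Ro = slot_o
    # Per-slot pitch along each direction; the cross terms (D_Yx, D_Xy)
    # capture any skew/rotation of the tray on the machine bed
    D_Yy = (slot_y[1] - yo) / A   # Y step between adjacent slots in Y
    D_Yx = (slot_y[0] - xo) / A   # X drift per slot in Y
    D_Xy = (slot_x[1] - yo) / B   # Y drift per slot in X
    D_Xx = (slot_x[0] - xo) / B   # X step between adjacent slots in X

    def slot_xy(i, j):
        return (xo + D_Xx * i + D_Yx * j, yo + D_Yy * j + D_Xy * i)

    # All slots share one angle, so averaging the three reduces error
    angle = (Ro + slot_y[2] + slot_x[2]) / 3
    return slot_xy, angle
```

For an unrotated tray with SlotO at the origin, SlotY at (0, 10) over A = 5 slots and SlotX at (20, 0) over B = 4 slots, the slot in row 2, column 3 lands at (10, 6).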
The implementation of the method of the embodiment of the application has the following advantages:
1. Labor is saved. Each calibration no longer requires an operator to cooperate; the machine can verify its pick-and-place of the materials by itself.
2. Accuracy is improved. Manual point correction relies on feel and visual inspection and cannot be accurately quantified, whereas this scheme achieves fully data-based parameter calibration.
3. Trays are prevented from being placed incorrectly. If a position point cannot be detected during automatic calibration, the process fails, which indirectly indicates that the tray position deviates greatly; this prevents abnormalities and improves the controllability of the machine.
As can be seen from the above description, the key of the technical solution of the present application lies in how the initial targeting position information is acquired and calculated. Three-point position information in X/Y/R (that is, the horizontal, vertical, and angular directions) is obtained from the pictures of the three selected positions, and from it the possible lateral/vertical and rotational displacement of the tray can be calculated and compensated, so the accuracy of the pick-and-place positions is guaranteed regardless of the tray's state.
Based on the same inventive concept, the embodiment of the application also provides a quick calibration device for the machine tray. As shown in fig. 4, the apparatus may include: one or more processors 101, one or more input devices 102, one or more output devices 103, and memory 104, the processors 101, input devices 102, output devices 103, and memory 104 being interconnected by a bus 105. The memory 104 is used for storing a computer program comprising program instructions, the processor 101 being configured for invoking the program instructions for performing the methods of the above-described method embodiment parts.
It should be understood that, in the embodiment of the present application, the processor 101 may be a Central Processing Unit (CPU); the processor may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 102 may include a keyboard, etc., and the output device 103 may include a display (LCD, etc.), speakers, etc.
The memory 104 may include read-only memory and random access memory, and provides instructions and data to the processor 101. A portion of the memory 104 may also include non-volatile random access memory. For example, the memory 104 may also store device type information.
In a specific implementation, the processor 101, the input device 102, and the output device 103 described in this embodiment may execute the implementations described in the embodiments of the machine tray position quick calibration method provided in this application, and details are not described herein again.
Further, an embodiment of the present application also provides a readable storage medium storing a computer program, where the computer program includes program instructions that, when executed by a processor, implement the quick calibration method for the machine tray position described above.
The computer-readable storage medium may be an internal storage unit of the system according to any of the foregoing embodiments, for example, a hard disk or a memory of the system. It may also be an external storage device of the system, such as a plug-in hard drive, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card (Flash Card) provided on the system. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the system. It is used to store the computer program and the other programs and data required by the system, and may also be used to temporarily store data that has been or will be output.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (7)

1. A quick calibration method for a machine tray position, applicable to a machine tray system, the tray system comprising a motor, a suction nozzle, a camera, and at least one tray, the suction nozzle and the camera being mounted on the motor, characterized by comprising:
when the motor is at an initial position, determining a reference photographing point position and a fixed photographing position of the tray through the camera;
when the motor is positioned at the reference photographing point position, determining a reference pixel coordinate through the camera; the reference pixel coordinate is a recognition point pixel coordinate after the recognition object is recognized through camera software and a template; the template is a part of characteristic shape of the identification object;
sending an instruction to the motor, enabling the motor to drive the suction nozzle to move to a preset reference position, and taking the current position of the motor as the reference position of the suction nozzle;
acquiring single-pixel precision after the current camera is calibrated;
when the motor is located at the fixed photographing position, obtaining the pixel coordinates of the identification points at different positions of different trays through the camera;
calculating the position of each tray according to the reference photographing point position, the reference pixel coordinate, the suction nozzle reference position, the single pixel precision and the identification point pixel coordinate;
the step of calculating the position of each tray specifically comprises the following steps:
when moving to the next recognition point, after recognition through the template, obtaining the current photographing position (x, y) of the motor and the current pixel coordinate (x_p, y_p, R_p) of the recognition point, and obtaining the nozzle position coordinates according to formulas (1) and (2) as:
x_1 = (x − x_0) + (x_p − x_p0) * P + x_n1;
y_1 = (y − y_0) + (y_p − y_p0) * P + y_n1;
R_1 = (R_p − R_p0) + R_n1;
calculating the slot coordinates and the tray angle of the tray from the nozzle position coordinates, where the slot coordinate formulas are:
x_ij = x_O1 + D_Xx * i + D_Yx * j;
y_ij = y_O1 + D_Yy * j + D_Xy * i;
and the tray angle formula is: R = (R_O1 + R_Y1 + R_X1) / 3;
formulas (1) and (2) being, respectively: nozzle position X, Y = motor offset X, Y + picture offset X, Y + nozzle reference position X, Y; and nozzle angle R = picture offset angle R + nozzle reference angle R;
wherein (x_0, y_0) is the motor coordinate of the reference photographing point, (x_p0, y_p0, R_p0) is the pixel coordinate at the reference photographing point, (x_n1, y_n1, R_n1) is the motor coordinate of the nozzle reference position, and P is the single-pixel precision;
(x_O1, y_O1, R_O1), (x_Y1, y_Y1, R_Y1), (x_X1, y_X1, R_X1) are the nozzle coordinates at the three positions on each tray; D_Yy is the Y-coordinate difference between two adjacent slots in the Y direction and D_Yx the X-coordinate difference in the Y direction; D_Xy is the Y-coordinate difference between two adjacent slots in the X direction and D_Xx the X-coordinate difference in the X direction; and R is the tray angle.
2. The method according to claim 1, wherein when the motor is in the initial position, determining a reference photographing point position and a fixed photographing position of the tray by the camera specifically comprises:
when the motor is at an initial position, acquiring a plurality of first pictures through the camera;
and determining the position of the reference photographing point and three fixed photographing positions of the material tray according to the plurality of first pictures and the preset reference position.
3. The method of claim 2, wherein determining, by the camera, the reference pixel coordinates when the motor is at the reference picture-taking point position comprises:
when the motor is positioned at the reference photographing point position, a second picture is obtained through the camera;
and carrying out template identification on the second picture to obtain a reference pixel coordinate.
4. The method according to claim 1, wherein when the motor is located at the fixed photographing position, obtaining pixel coordinates of the identification points at different positions of different trays by the camera specifically comprises:
when the motor is located at the fixed photographing position, photographing is carried out through the camera and template recognition is carried out through a preset template, so that the pixel coordinates of the recognition points at different positions of different trays are obtained.
5. The method of claim 3 or 4, wherein the template is identified as: and establishing a characteristic shape of the object to be identified as a template, and finding the object to be identified in the picture through similarity matching.
6. A machine tray position quick calibration device, comprising a processor, an input device, an output device and a memory, wherein the processor, the input device, the output device and the memory are connected with each other, wherein the memory is used for storing a computer program, the computer program comprises program instructions, and the processor is configured to call the program instructions to execute the method according to any one of claims 1 to 5.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1-5.
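The correction step the claims build toward can be summarized as: measure where an identification point actually lands in the image, compare it with the reference pixel coordinates, and shift the motor's taught position accordingly. A minimal sketch, assuming a known millimetres-per-pixel scale and camera axes parallel to the motor axes (both are assumptions; the patent does not state its coordinate transform):

```python
def pixel_to_motor_offset(pixel, reference_pixel, mm_per_pixel):
    """Deviation of a detected identification point from the reference
    pixel coordinates, converted to a motor offset in millimetres."""
    dx = (pixel[0] - reference_pixel[0]) * mm_per_pixel
    dy = (pixel[1] - reference_pixel[1]) * mm_per_pixel
    return dx, dy

def calibrated_position(nominal_xy, pixel, reference_pixel, mm_per_pixel):
    """Nominal motor target corrected by the observed pixel deviation."""
    dx, dy = pixel_to_motor_offset(pixel, reference_pixel, mm_per_pixel)
    return nominal_xy[0] + dx, nominal_xy[1] + dy

# Example: the point was found 10 px right of and 10 px below the
# reference at 0.05 mm/px, so each motor axis shifts by 0.5 mm.
print(calibrated_position((100.0, 200.0), (330, 250), (320, 240), 0.05))
# → (100.5, 200.5)
```

Repeating this for the identification points at the three fixed photographing positions of each tray would yield per-tray corrections without re-teaching every pick position by hand.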
CN201911354411.XA 2019-12-25 2019-12-25 Quick calibration method and device for position of material tray of machine and storage medium Active CN111037559B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911354411.XA CN111037559B (en) 2019-12-25 2019-12-25 Quick calibration method and device for position of material tray of machine and storage medium


Publications (2)

Publication Number Publication Date
CN111037559A CN111037559A (en) 2020-04-21
CN111037559B true CN111037559B (en) 2023-03-10

Family

ID=70239584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911354411.XA Active CN111037559B (en) 2019-12-25 2019-12-25 Quick calibration method and device for position of material tray of machine and storage medium

Country Status (1)

Country Link
CN (1) CN111037559B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111906783A (en) * 2020-07-17 2020-11-10 深圳市华成工业控制股份有限公司 Method and device for accurately determining storage of quadrilateral material tray and robot
CN115205379B (en) * 2022-05-17 2023-09-12 深圳市腾盛精密装备股份有限公司 Multi-point calibration method of cutting separator and related equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104848784A (en) * 2014-02-19 2015-08-19 江苏腾世机电有限公司 Position offset correction method and system of chip mounter suction nozzle units
JP2017071033A (en) * 2015-10-09 2017-04-13 キヤノン株式会社 Working reference object, working reference object manufacturing method, robot arm adjusting method, vision system, robot apparatus, and indicator member
CN106670763A (en) * 2017-01-10 2017-05-17 荣旗工业科技(苏州)有限公司 Calculating method of high-precision automatic assembly machine
CN109483531A (en) * 2018-10-26 2019-03-19 江苏大学 It is a kind of to pinpoint the NI Vision Builder for Automated Inspection and method for picking and placing FPC plate for manipulator
CN110143055A (en) * 2018-05-22 2019-08-20 广东聚华印刷显示技术有限公司 The bearing calibration of ink droplet drops positional shift, device and system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
EP3259908B1 (en) * 2015-02-18 2021-07-14 Siemens Healthcare Diagnostics Inc. Image-based tray alignment and tube slot localization in a vision system


Non-Patent Citations (1)

Title
Development of vision-guided robot technology for tray pick-and-place on production lines; Zhang Shengxiang; China Master's Theses Full-text Database; 2019-12-16; full text *

Also Published As

Publication number Publication date
CN111037559A (en) 2020-04-21

Similar Documents

Publication Publication Date Title
JP6527178B2 (en) Vision sensor calibration device, method and program
CN110570477B (en) Method, device and storage medium for calibrating relative attitude of camera and rotating shaft
KR102276259B1 (en) Calibration and operation of vision-based manipulation systems
US20220026194A1 (en) Method and apparatus for determining pose information of a robot, device and medium
Zhang et al. Camera calibration with lens distortion from low-rank textures
CN111037559B (en) Quick calibration method and device for position of material tray of machine and storage medium
KR970005616B1 (en) Automatic calibration method
CN107931012B (en) Method for extracting dispensing path and dispensing system
EP3712853A1 (en) Positioning method and system, and suitable robot
TW201706091A (en) System and method for tying together machine vision coordinate spaces in a guided assembly environment
WO2022052404A1 (en) Memory alignment and insertion method and system based on machine vision, device, and storage medium
CN110163912B (en) Two-dimensional code pose calibration method, device and system
CN106570907B (en) Camera calibration method and device
US20150262415A1 (en) Image processing device, system, image processing method, and image processing program
EP3665898A1 (en) System and method for recalibrating a projector system
CN108662974B (en) Dual-camera-based dispensing positioning method and device
CN104463833A (en) Method and system for calibrating camera parameters of one-dimensional area array camera set
WO2021046767A1 (en) Autonomous robot tooling system, control system, control method, and storage medium
CN113172636B (en) Automatic hand-eye calibration method and device and storage medium
CN106507656B (en) Method for correcting element carrier and automatic assembling machine
CN112950528A (en) Certificate posture determining method, model training method, device, server and medium
US20200041262A1 (en) Method for performing calibration by using measured data without assumed calibration model and three-dimensional scanner calibration system for performing same
CN110248148B (en) Method and device for determining positioning parameters
WO2016194078A1 (en) Information processing apparatus, calibration method, and calibration processing program
CN101082488B (en) Image split joint method for long-distance telemetering measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant