CN108161931A - Vision-based automatic workpiece recognition and intelligent grasping system - Google Patents

Vision-based automatic workpiece recognition and intelligent grasping system

Info

Publication number
CN108161931A
Authority
CN
China
Prior art keywords
robot
module
workpiece
image
vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611116892.7A
Other languages
Chinese (zh)
Inventor
覃争鸣
梁鹏
周健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Original Assignee
Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Yingbo Intelligent Technology Co., Ltd.
Priority to CN201611116892.7A
Publication of CN108161931A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/02: Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J 9/04: Programme-controlled manipulators characterised by movement of the arms, by rotating at least one arm, excluding the head movement itself, e.g. cylindrical coordinate type or polar coordinate type
    • B25J 9/046: Revolute coordinate type
    • B25J 9/16: Programme controls
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664: Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a vision-based automatic workpiece recognition and intelligent grasping system. The system comprises an image acquisition module, an industrial PC module and a robot module; the image acquisition module is connected with the industrial PC module, and the industrial PC module is connected with the robot module. By establishing a parameterized model of the grasping system, the scheme of the invention provides an algorithm for converting image coordinates into robot coordinates; using the image pre-processing methods provided by specific software, secondary development is carried out in a designated environment to realize the two basic functions of target localization and robot control, and the robot is finally controlled to complete the grasping of the target workpiece.

Description

Vision-based automatic workpiece recognition and intelligent grasping system
Technical field
The invention belongs to the field of machine vision positioning and relates to a vision-based automatic workpiece recognition and intelligent grasping system.
Background technology
Workpiece recognition and grasping are important applications of industrial robots on production lines. At present, most industrial robots on production lines execute predetermined instruction actions programmed by prior teaching or off-line programming; once the working environment or the target object changes, the robot cannot adapt to these changes in time and the grasp fails. This working mode therefore greatly limits the flexibility and working efficiency of industrial robots.
Machine vision is fast and non-contact. Introducing machine vision technology into the industrial robot field, so that the robot is guided by vision to perform tasks such as grasping and handling, is of great significance for improving the automation level of production lines and widening the range of robot applications.
The invention patent application with publication number CN105905560A discloses a fully automatic control system for dynamic grasping and storage and its control method. The system uses one PLC controller to control several manipulators while coordinating material transport and the movement of the material tray, realizing dynamic grasping of material. However, the invention requires a PLC controller as well as uninterrupted operation and control based on the photographing region, so the required cost is relatively high and the system implementation is comparatively complex.
The paper "Target recognition and localization with binocular stereo vision" (CAAI Transactions on Intelligent Systems, 2011, 6(4): 303-311, Shang Qian, Ruan Qiuqi, Li Xiaoli) realizes target recognition and localization using binocular stereo vision. The binocular stereo vision system mainly comprises four modules: camera calibration, image segmentation, stereo matching and 3D ranging. Stereo matching is the most critical step in binocular visual localization, but accurate stereo matching of the target region is difficult, and inaccurate stereo matching directly biases the acquired depth information; moreover, real-time performance is the biggest challenge faced by binocular and multi-camera localization systems.
The invention patent with publication number CN104369188B discloses a workpiece grasping device and method based on machine vision and an ultrasonic sensor. The patent acquires workpiece contour images with a monocular camera and measures distance with an ultrasonic sensor to realize grasping of the workpiece by the robot. However, the device is composed of hardware such as a camera, sensors, a liquid-crystal display and a PLC, so the equipment requires more hardware and its cost is higher.
Summary of the invention
The purpose of the present invention is to provide a vision-based automatic workpiece recognition and intelligent grasping system. By establishing a parameterized model of the grasping system, an algorithm for converting image coordinates into robot coordinates is provided; using the image pre-processing methods provided by specific software, secondary development is carried out in a designated environment to realize the two basic functions of target localization and robot control, and the robot is finally controlled to complete the grasping of the target workpiece. This effectively solves the problem that an existing robot cannot adapt in time to changes in the working environment or the target object, which leads to operation failure, and thereby meets the requirements of a flexible manufacturing system.
In order to solve the above technical problem, the present invention adopts the following technical scheme: a vision-based automatic workpiece recognition and intelligent grasping system, the system comprising an image acquisition module, an industrial PC module and a robot module; wherein the image acquisition module is connected with the industrial PC module, and the industrial PC module is connected with the robot module.
Further, the image acquisition module is composed of a camera, a lens, a light source and the workpiece, and is used to rapidly acquire low-noise, high-precision image data.
Further, the industrial PC module uses an Advantech (Taiwan) industrial control computer; it receives the image information acquired by the image acquisition module, completes workpiece recognition with an image processing algorithm, and converts the result into robot control signals that control the actual position of the robot end effector.
Further, the robot module is composed of a drive device and the robot body, and performs the corresponding operation according to the received control instructions.
Compared with the prior art, the present invention has the following advantageous effects:
By establishing a parameterized model of the grasping system, the scheme of the invention provides an algorithm for converting image coordinates into robot coordinates; using the image pre-processing methods provided by specific software, secondary development is carried out in a designated environment to realize the two basic functions of target localization and robot control, and the robot is finally controlled to complete the grasping of the target workpiece.
Description of the drawings
Fig. 1 is a structural block diagram of the vision-based automatic workpiece recognition and intelligent grasping system.
Fig. 2 is a schematic diagram of the vision-based automatic workpiece recognition and intelligent grasping system.
Fig. 3 is the pinhole imaging model.
Fig. 4 is the parameterized model of the vision-based automatic workpiece recognition and intelligent grasping system.
Specific embodiment
The present invention is described in further detail and more completely below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.
Referring to Fig. 1, a vision-based automatic workpiece recognition and intelligent grasping system of the present invention comprises an image acquisition module, an industrial PC module and a robot module; the image acquisition module is connected with the industrial PC module, and the industrial PC module is connected with the robot module.
(1) The image acquisition module is composed of a camera, a lens, a light source and the workpiece, and is used to rapidly acquire low-noise, high-precision image data. Specifically:
The camera is a Basler acA2500-14gm industrial CCD camera that communicates with the computer over Gigabit Ethernet; the camera is mounted directly above the conveyor belt.
The lens is a Computar (Japan) M0814-MP2 fixed-focal-length lens with a focal length of 8 mm and a maximum imaging size of 8.8 mm x 6.6 mm, which meets the design requirement.
The light source is a CCS LED ring light; the LED illumination system has a fast response time and can produce high-quality, high-contrast images.
(2) The industrial PC module uses an Advantech (Taiwan) industrial control computer; it receives the image information acquired by the CCD camera, completes workpiece recognition with an image processing algorithm, and converts the result into robot control signals that control the actual position of the robot end effector.
(3) The robot module is composed of a drive device and the robot body, and performs the corresponding operation according to the received control instructions. The robot is an ABB (Switzerland) IRB 120 robot with six rotary joints driven by AC servo motors; its end repeatability is 0.01 mm, it is simple to control and convenient to program, and it is well suited to grasping operations on a production line.
Referring to Fig. 2, the camera of this system is fixedly mounted above the conveyor belt, which runs continuously, and the workpiece enters the camera's field of view from one end of the belt. A timer is set so that the camera is triggered to acquire one frame every 0.5 s; the image size is 640 x 480. The centroid position of the workpiece is determined by template matching. From the displacement of the workpiece along the direction of motion between two frames and the time interval between those frames, the speed of the workpiece can be calculated. A Kalman filter predicts the position of the workpiece in the following clock cycle, and the motion trajectory of the robot is planned accordingly so that, when the workpiece moves to the grasping station, the pose of the robot end effector coincides with the pose of the target. Finally, through the inverse kinematics of the robot, the pose information is converted into the joint angle and angular control information known to the industrial robot, thereby realizing accurate vision-guided grasping of the workpiece.
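As a concrete illustration of the speed-estimation step just described, the following sketch computes the belt speed from two successive centroid positions and the time at which the workpiece will reach the grasp station. It is not taken from the patent; the function and variable names (and the 600 mm station position) are illustrative assumptions.

```python
# Minimal sketch of the conveyor speed estimation described above.
# Positions are workpiece centroids expressed in belt coordinates (mm).
def estimate_speed(c_prev: float, c_curr: float, dt: float) -> float:
    """Belt speed (mm/s) from two centroid positions and the frame interval dt (s)."""
    return (c_curr - c_prev) / dt

def time_to_station(c_curr: float, speed: float, station_x: float) -> float:
    """Time (s) until a workpiece moving at `speed` reaches the grasp station."""
    return (station_x - c_curr) / speed

# Example with the 0.5 s frame interval given in the text:
c0, c1 = 120.0, 180.0                 # centroid positions in two consecutive frames
v = estimate_speed(c0, c1, 0.5)       # 120 mm/s
t = time_to_station(c1, v, 600.0)     # 3.5 s until the (assumed) station at x = 600 mm
```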
Here, the upper limit is the position of the workpiece at the moment it first enters the robot's grasping region; to reduce the robot's waiting time, the upper limit should be set as far forward as possible. The lower limit is the position at the moment the workpiece leaves the grasping region. The workpiece must be grasped by the robot within the grasping region; otherwise this grasping task fails and the robot abandons tracking of that workpiece.
I. Establishing the parameterized model of the grasping system
1. Camera calibration
Referring to Fig. 3 (the pinhole imaging model is taken from the book "Robot Vision Measurement and Control", Xu De, Tan Min, Li Yuan, Beijing: National Defense Industry Press, 2011), a coordinate system is established at the optical center of the camera, with the Z axis along the optical axis and the X axis along the direction of increasing horizontal image coordinate. In the camera coordinate system O_C-xyz, let the point p have coordinates (x, y, z) and let its projection onto the image plane be P with image coordinates (X, Y, Z), where Z = f and f is the focal length of the camera.
The pinhole imaging principle then gives the proportional relationship of formula (1).
From the CCD imaging principle, the picture on the imaging plane is converted into a digital image by sampling: an image point (X, Y) on the imaging plane is converted into a pixel point (u, v). Denoting by (u0, v0) the image coordinates of the intersection of the optical axis with the imaging plane, formula (2) holds.
In the formula, dx and dy are the physical sizes of a pixel in the X and Y directions respectively, and sx = 1/dx, sy = 1/dy are the sampling frequencies in the X and Y directions, i.e. the number of pixels per unit length.
Substituting formula (2) into formula (1) and rewriting the result in matrix form gives formula (3).
In the formula, fx = f·sx and fy = f·sy are defined as the equivalent focal lengths in the X and Y directions. The four parameters fx, fy, u0 and v0 depend only on the internal structure of the camera and are therefore called the intrinsic parameters of the camera. The extrinsic parameters of the camera describe the world coordinate system in the camera coordinate system: the expression of the world coordinate system O-XwYwZw in the camera coordinate system O-xyz forms the extrinsic parameter matrix of the camera, formula (4).
The relationship between world coordinates and image coordinates is thus established through the camera coordinate system; substituting formula (4) into formula (3) gives formula (5).
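The formula images numbered (1) to (5) above are not reproduced in this text. Assuming the standard pinhole-camera derivation that the surrounding definitions describe, they would take the following form:

$$X = \frac{f\,x}{z},\qquad Y = \frac{f\,y}{z} \tag{1}$$

$$u = \frac{X}{d_x} + u_0 = s_x X + u_0,\qquad v = \frac{Y}{d_y} + v_0 = s_y Y + v_0 \tag{2}$$

$$z\begin{bmatrix}u\\ v\\ 1\end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix}x\\ y\\ z\end{bmatrix} \tag{3}$$

$$\begin{bmatrix}x\\ y\\ z\\ 1\end{bmatrix} = \begin{bmatrix} R & t\\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix}x_w\\ y_w\\ z_w\\ 1\end{bmatrix} \tag{4}$$

$$z\begin{bmatrix}u\\ v\\ 1\end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0\\ 0 & f_y & v_0 & 0\\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t\\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix}x_w\\ y_w\\ z_w\\ 1\end{bmatrix} \tag{5}$$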
2. Hand-eye coordinate calibration
In this system the camera and the robot are mounted at opposite ends of the conveyor belt, so the relative pose between the workpiece and the robot cannot be determined with a traditional hand-eye calibration method. Two reference coordinate frames, ref1 and ref2, are therefore established on the conveyor belt: ref1 is established within the camera's field of view and ref2 within the workspace of the robot. Referring to Fig. 4, the intrinsic and extrinsic parameters of the camera are calibrated with the planar-target method, and the reference frame ref1 is established from one of the calibration-board images, giving the relative pose ^{cam}H_{ref1} between ref1 and the camera coordinate system cam. Between ref1 and ref2 there is only a translation in the X direction, with pose relation ^{ref1}H_{ref2}. The pose relation ^{base}H_{ref2} between the reference frame ref2 and the robot base coordinate system is calibrated with a sighting method similar to the calibration of a workpiece coordinate system.
Through these two reference frames the relationship between the camera coordinate system cam and the robot base coordinate system base is established as ^{base}H_{cam} = ^{base}H_{ref2} · ^{ref2}H_{ref1} · (^{cam}H_{ref1})^{-1}. The pose ^{cam}H_{obj} is obtained by locating the target, so the pose transformation matrix of the target workpiece in the robot base coordinate system is ^{base}H_{obj} = ^{base}H_{cam} · ^{cam}H_{obj}. The link between the target workpiece and the robot is thereby established.
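The transform chain can be written compactly with 4x4 homogeneous matrices. The sketch below follows the two relations just given; the matrix values are placeholders, not calibration results from the patent.

```python
# Composing the calibrated transforms into base_H_obj, as described above.
import numpy as np

def compose(*transforms):
    """Chain homogeneous transforms by matrix multiplication."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

base_H_ref2 = np.eye(4)                               # robot base <- ref2 (placeholder)
ref2_H_ref1 = np.eye(4); ref2_H_ref1[0, 3] = 0.8      # ref2 <- ref1: pure X translation
cam_H_ref1  = np.eye(4)                               # camera <- ref1 (placeholder)
cam_H_obj   = np.eye(4); cam_H_obj[:3, 3] = [0.1, 0.05, 0.6]  # camera <- located target

# base_H_cam = base_H_ref2 * ref2_H_ref1 * inv(cam_H_ref1)
base_H_cam = compose(base_H_ref2, ref2_H_ref1, np.linalg.inv(cam_H_ref1))
# Target pose in the robot base frame: base_H_obj = base_H_cam * cam_H_obj
base_H_obj = base_H_cam @ cam_H_obj
```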
II. Template matching algorithm based on grayscale correlation
Feature extraction and template matching are important links in moving target tracking. The contour, shape, gray values and color histogram of the target image can all serve as criteria for template matching, and several geometric or grayscale features can be combined to track the target. After the target features are extracted, a suitable search and matching algorithm is selected to locate the target. To satisfy the high real-time requirement during operation, the image processing algorithm used must be fast enough and sufficiently robust to illumination changes and environmental factors.
Common template matching algorithms fall mainly into two classes: those based on grayscale correlation and those based on geometric features. Grayscale-correlation template matching uses the image gray values directly as the feature for matching; such algorithms are mature, simple in principle and relatively easy to implement. The present invention therefore adopts a grayscale-correlation template matching algorithm.
The similarity criterion uses the sum of squared differences of the gray values of all pixels between the template image and the image to be searched, i.e. the SSD algorithm. If the template size is M x N pixels, the similarity function between the template and the image to be matched is given below.
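The SSD similarity function itself does not appear in this text; in the standard form consistent with the symbols defined in the next paragraph it would read:

$$D(i,j) = \sum_{m=1}^{M}\sum_{n=1}^{N}\bigl[S(i+m,\,j+n) - T(m,\,n)\bigr]^{2}$$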
In the formula, T(m, n) and S(i+m, j+n) are the gray values of the template image at coordinate (m, n) and of the image to be searched at coordinate (i+m, j+n) respectively. By computing the value of the similarity function at every position, it is determined whether the image contains a target identical or similar to the template. Normalizing the above expression gives the normalized cross-correlation function NCC used for template matching.
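The NCC formula is likewise not reproduced; its standard form, which yields values between 0 and 1 as stated below, is presumably:

$$NCC(i,j) = \frac{\sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)\,S(i+m,j+n)}{\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)^{2}}\;\sqrt{\sum_{m=1}^{M}\sum_{n=1}^{N} S(i+m,j+n)^{2}}}$$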
The magnitude of the NCC coefficient represents the degree of match between the template and the image to be searched at position (i, j); its value lies between 0 and 1, and NCC = 1 indicates that an instance identical to the template has been found in the searched image. After the whole image has been searched, the position with the maximum NCC is the matched target. Using this similarity measure, the template image is translated and rotated to find one or more instances of the template in the searched image, and the position coordinates and rotation angle of each instance are determined, providing reliable information for the subsequent grasp planning of the robot.
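As an illustration of the NCC matching step, the sketch below uses OpenCV's matchTemplate with the normalized cross-correlation criterion; the file names are hypothetical, and the rotation search mentioned above would simply repeat the call for rotated copies of the template.

```python
# Normalized cross-correlation template matching for one frame.
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)                # 640 x 480 camera image
template = cv2.imread("workpiece_template.png", cv2.IMREAD_GRAYSCALE)

# cv2.TM_CCORR_NORMED corresponds to the NCC criterion described above.
response = cv2.matchTemplate(frame, template, cv2.TM_CCORR_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)

h, w = template.shape
center = (max_loc[0] + w // 2, max_loc[1] + h // 2)   # centroid of the best match
print(f"best NCC = {max_val:.3f} at image position {center}")
```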
III. Moving target tracking algorithm
1. Kalman filtering
The position of the target on the conveyor belt at the moment the image is captured is obtained through camera calibration and template matching, but the target keeps moving on the conveyor, and the robot also needs a certain time interval to execute the grasping motion. Therefore, only by predicting in advance the position where the target will appear, and making the robot move so that it reaches that predicted position at the same time as the target, can the workpiece be grasped. Here the prediction of the target pose is performed by a Kalman filter (see the paper "Basic principle and application of Kalman filtering", Peng Dingcong, Software Guide, 2009, 11(8): 32-34). The Kalman filter is a linear filter: the estimate of the current state can be computed from the state estimate at the previous instant and the observation of the current state alone, so no history of observations or estimates is needed. Kalman filters are widely used in visual tracking systems; they can accurately estimate the future state of a moving object and thus guide the robot to complete the dynamic grasping task.
2. Establishing the motion model
The workpiece on the conveyor belt generally moves in a straight line at constant speed. Let the motion state parameters of the target be its position and velocity at a given instant. During tracking, since the time interval between adjacent frames is short, the change in the target's motion state is small and the target can be assumed to move uniformly within the unit interval, so the velocity parameter is sufficient to reflect the motion trend of the target. Here a four-dimensional variable defines the system state, x_k = (xs_k, ys_k, xv_k, yv_k), whose components are the position and velocity of the target in the x and y directions respectively.
For this system, the system model is therefore established as given below, where dt is the time interval between t_{k-1} and t_k.
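The state equation is not reproduced in this text; for the constant-velocity model just described it would take the standard form

$$x_k = A\,x_{k-1} + w_{k-1},\qquad A = \begin{bmatrix} 1 & 0 & dt & 0\\ 0 & 1 & 0 & dt\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix}$$

where w_{k-1} denotes the process noise.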
Only the position of the target in the image can be observed, so the observation model z_k is as follows.
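The observation equation likewise does not appear in this text; assuming the position-only measurement just described, it reads

$$z_k = H\,x_k + v_k,\qquad H = \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 \end{bmatrix}$$

where v_k denotes the measurement noise.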
The position (xs_0, ys_0) of the workpiece at the capture moment is detected by template matching. The initial velocity (xv_0, 0) of the workpiece is obtained from the first two frames that contain the workpiece, by dividing the displacement of the workpiece centre along the direction of motion by the time taken to shoot those two frames. The Kalman filter is then initialized with the initial system state x_0 = (xs_0, ys_0, xv_0, 0) and the initial error covariance matrix P_0 = 10·eye(4), where eye(4) denotes the 4x4 identity matrix, and the time of the current image is recorded. Before template matching is performed on the next frame, the time interval dt between the two frames is computed and substituted into the state prediction equation to obtain the estimate of the current motion state; the region centred on the predicted position is used as the ROI (region of interest) for this round of template matching, the best match of the template is found in the ROI to obtain (x_1, y_1), and the time of the current image is recorded. The observation vector z_1 = (x_1, y_1) is substituted into the state update equation to update the filter state, giving the estimated position and velocity of the moving target at each moment. Using the Kalman filter to predict the probable position of the workpiece in the image avoids search-matching over the whole image, greatly speeds up template matching and improves the real-time performance of the system. Through this predict-and-correct process, the Kalman filter estimates the position of the target after a certain time Δt (at moment t_2); the robot's motion trajectory and speed are planned accordingly, and the generated control instructions are sent through the control cabinet so that the robot completes this grasping action.
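A minimal sketch of such a constant-velocity Kalman tracker, here built with OpenCV's cv2.KalmanFilter; the noise covariances and the example numbers are illustrative assumptions, while P0 = 10·eye(4) and the 0.5 s frame interval follow the text above.

```python
import cv2
import numpy as np

dt = 0.5  # frame interval (the camera is triggered every 0.5 s)

kf = cv2.KalmanFilter(4, 2)           # state (xs, ys, xv, yv), measurement (xs, ys)
kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                [0, 1, 0, dt],
                                [0, 0, 1,  0],
                                [0, 0, 0,  1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)      # assumed
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)  # assumed
kf.errorCovPost = 10.0 * np.eye(4, dtype=np.float32)         # P0 = 10*eye(4), as in the text

# Initial state from the first two detections: position (xs0, ys0), velocity (xv0, 0).
xs0, ys0, xv0 = 120.0, 240.0, 60.0
kf.statePost = np.array([[xs0], [ys0], [xv0], [0.0]], dtype=np.float32)

# Per frame: predict to centre the matching ROI, then correct with the matched position.
prediction = kf.predict()                       # predicted (xs, ys, xv, yv)
x1, y1 = 150.0, 240.0                           # position returned by template matching
kf.correct(np.array([[x1], [y1]], dtype=np.float32))
```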
The above are merely preferred embodiments of the present invention and are not intended to limit it; various modifications and changes may be made to the present invention by those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (4)

1. A vision-based automatic workpiece recognition and intelligent grasping system, characterized in that the system comprises an image acquisition module, an industrial PC module and a robot module; wherein the image acquisition module is connected with the industrial PC module, and the industrial PC module is connected with the robot module.
2. The vision-based automatic workpiece recognition and intelligent grasping system according to claim 1, characterized in that the image acquisition module is composed of a camera, a lens, a light source and the workpiece, and is used to rapidly acquire low-noise, high-precision image data.
3. The vision-based automatic workpiece recognition and intelligent grasping system according to claim 1, characterized in that the industrial PC module uses an Advantech (Taiwan) industrial control computer, which receives the image information acquired by the image acquisition module, completes workpiece recognition with an image processing algorithm, and converts the result into robot control signals that control the actual position of the robot end effector.
4. The vision-based automatic workpiece recognition and intelligent grasping system according to claim 1, characterized in that the robot module is composed of a drive device and the robot body, and performs the corresponding operation according to the received control instructions.
CN201611116892.7A 2016-12-07 2016-12-07 Vision-based automatic workpiece recognition and intelligent grasping system Pending CN108161931A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611116892.7A CN108161931A (en) 2016-12-07 2016-12-07 Vision-based automatic workpiece recognition and intelligent grasping system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611116892.7A CN108161931A (en) 2016-12-07 2016-12-07 Vision-based automatic workpiece recognition and intelligent grasping system

Publications (1)

Publication Number Publication Date
CN108161931A true CN108161931A (en) 2018-06-15

Family

ID=62526636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611116892.7A Pending CN108161931A (en) 2016-12-07 2016-12-07 Vision-based automatic workpiece recognition and intelligent grasping system

Country Status (1)

Country Link
CN (1) CN108161931A (en)


Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107942949A (en) * 2017-03-31 2018-04-20 沈机(上海)智能***研发设计有限公司 A kind of lathe vision positioning method and system, lathe
CN107942949B (en) * 2017-03-31 2019-01-25 沈机(上海)智能***研发设计有限公司 A kind of lathe vision positioning method and system, lathe
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN111989540A (en) * 2018-07-13 2020-11-24 深圳配天智能技术研究院有限公司 Workpiece tracking method and system and robot
CN111989540B (en) * 2018-07-13 2022-04-15 深圳配天智能技术研究院有限公司 Workpiece tracking method and system and robot
CN108709621A (en) * 2018-08-02 2018-10-26 河北工业大学 A kind of special-shaped workpiece detection grabbing device based on supersonic array
CN108709621B (en) * 2018-08-02 2024-04-26 河北工业大学 Abnormal workpiece detection grabbing device based on ultrasonic array
CN108972556A (en) * 2018-08-14 2018-12-11 广东工业大学 Conducting wire grasping system and method on small and special electric machine production line under complex illumination environment
CN108972556B (en) * 2018-08-14 2021-07-09 广东工业大学 Wire grabbing system and method in complex illumination environment on micro special motor production line
CN109015653A (en) * 2018-08-30 2018-12-18 黄河科技学院 Grab control method, device, storage medium and electronic equipment
CN109048918A (en) * 2018-09-25 2018-12-21 华南理工大学 A kind of visual guide method of wheelchair arm robot
CN109048918B (en) * 2018-09-25 2022-02-22 华南理工大学 Visual guide method for wheelchair mechanical arm robot
CN109454501A (en) * 2018-10-19 2019-03-12 江苏智测计量技术有限公司 A kind of lathe on-line monitoring system
CN109454638A (en) * A robot grasping system based on visual guidance
CN110640739A (en) * 2018-11-07 2020-01-03 宁波赛朗科技有限公司 Grabbing industrial robot with center position recognition function
CN111216099A (en) * 2018-11-27 2020-06-02 发那科株式会社 Robot system and coordinate conversion method
CN109623821A (en) * 2018-12-26 2019-04-16 深圳市越疆科技有限公司 The visual guide method of mechanical hand crawl article
CN109623821B (en) * 2018-12-26 2022-04-01 日照市越疆智能科技有限公司 Visual guide method for grabbing articles by mechanical arm
CN109911549A (en) * 2019-01-25 2019-06-21 东华大学 A kind of the Robotic Dynamic tracking grasping system and method for fragile goods
US20220130066A1 (en) * 2019-01-25 2022-04-28 Sony Interactive Entertainment Inc. Robot controlling system
CN109848998A (en) * 2019-03-29 2019-06-07 砚山永盛杰科技有限公司 One kind being used for 3C industry vision four axis flexible robot
CN110480685A (en) * 2019-05-15 2019-11-22 青岛科技大学 A kind of Agricultural vehicle wheel automatic production line vision manipulator
CN110102490B (en) * 2019-05-23 2021-06-01 北京阿丘机器人科技有限公司 Assembly line parcel sorting device based on vision technology and electronic equipment
CN110102490A (en) * Assembly line parcel sorting device based on vision technology and electronic equipment
CN110509273B (en) * 2019-08-16 2022-05-06 天津职业技术师范大学(中国职业培训指导教师进修中心) Robot manipulator detection and grabbing method based on visual deep learning features
CN110509273A (en) * Robot manipulator detection and grabbing method based on visual deep learning features
CN114390963A (en) * 2019-09-06 2022-04-22 罗伯特·博世有限公司 Calibration method and device for industrial robot, three-dimensional environment modeling method and device, computer storage medium and industrial robot operating platform
WO2021042376A1 (en) * 2019-09-06 2021-03-11 罗伯特·博世有限公司 Calibration method and apparatus for industrial robot, three-dimensional environment modeling method and device for industrial robot, computer storage medium, and industrial robot operating platform
CN112296999A (en) * 2019-11-12 2021-02-02 太原科技大学 Irregular workpiece machining path generation method based on machine vision
CN112296999B (en) * 2019-11-12 2022-07-08 太原科技大学 Irregular workpiece machining path generation method based on machine vision
CN111112885A (en) * 2019-11-26 2020-05-08 福尼斯智能装备(珠海)有限公司 Welding system with vision system for feeding and discharging workpieces and self-adaptive positioning of welding seams
CN110980061A (en) * 2019-12-12 2020-04-10 重庆铁马专用车有限公司 Novel intelligence major possession refuse treatment system
CN111447366A (en) * 2020-04-27 2020-07-24 Oppo(重庆)智能科技有限公司 Transportation method, transportation device, electronic device, and computer-readable storage medium
CN111815718A (en) * 2020-07-20 2020-10-23 四川长虹电器股份有限公司 Method for quickly switching stations of industrial screw robot based on vision
CN113043334A (en) * 2021-02-23 2021-06-29 上海埃奇机器人技术有限公司 Robot-based photovoltaic cell string positioning method
CN113808197A (en) * 2021-09-17 2021-12-17 山西大学 Automatic workpiece grabbing system and method based on machine learning
CN114347033A (en) * 2022-01-27 2022-04-15 达闼机器人有限公司 Robot article grabbing method and device, robot and storage medium
WO2023143408A1 (en) * 2022-01-27 2023-08-03 达闼机器人股份有限公司 Article grabbing method for robot, device, robot, program, and storage medium
CN114347033B (en) * 2022-01-27 2023-12-08 达闼机器人股份有限公司 Robot character grabbing method and device, robot and storage medium
CN114454177A (en) * 2022-03-15 2022-05-10 浙江工业大学 Robot tail end position compensation method based on binocular stereo vision

Similar Documents

Publication Publication Date Title
CN108161931A (en) Vision-based automatic workpiece recognition and intelligent grasping system
CN109308693B (en) Single-binocular vision system for target detection and pose measurement constructed by one PTZ camera
US11117262B2 (en) Intelligent robots
CN107618030B (en) Robot dynamic tracking grabbing method and system based on vision
US8244402B2 (en) Visual perception system and method for a humanoid robot
CN101274432B (en) Apparatus for picking up objects
CN107471218B (en) Binocular vision-based hand-eye coordination method for double-arm robot
CN103895042A (en) Industrial robot workpiece positioning grabbing method and system based on visual guidance
CN110211180A (en) A kind of autonomous grasping means of mechanical arm based on deep learning
CN108898634B (en) Method for accurately positioning embroidery machine target needle eye based on binocular camera parallax
CN108399639A (en) Fast automatic crawl based on deep learning and arrangement method
CN111645074A (en) Robot grabbing and positioning method
CN109454638A (en) A robot grasping system based on visual guidance
CN113276106B (en) Climbing robot space positioning method and space positioning system
CN109940626B (en) Control method of eyebrow drawing robot system based on robot vision
CN113267452A (en) Engine cylinder surface defect detection method and system based on machine vision
Fanello et al. 3D stereo estimation and fully automated learning of eye-hand coordination in humanoid robots
CN112775959A (en) Method and system for determining grabbing pose of manipulator and storage medium
Gratal et al. Visual servoing on unknown objects
Hsu et al. Development of a faster classification system for metal parts using machine vision under different lighting environments
Mišeikis et al. Two-stage transfer learning for heterogeneous robot detection and 3d joint position estimation in a 2d camera image using cnn
Kragic et al. Model based techniques for robotic servoing and grasping
Gao et al. An automatic assembling system for sealing rings based on machine vision
Li et al. Workpiece intelligent identification and positioning system based on binocular machine vision
CN116206189A (en) Curved surface graphic identification code and identification method thereof

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180615