CN105562361A - Independent sorting method of fabric sorting robot - Google Patents

Independent sorting method of fabric sorting robot

Info

Publication number
CN105562361A
CN105562361A (application CN201510979762.5A)
Authority
CN
China
Prior art keywords
theta
fabric
formula
robot
sorting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510979762.5A
Other languages
Chinese (zh)
Other versions
CN105562361B (en)
Inventor
王晓华
李鹏飞
张蕾
***
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Polytechnic University
Original Assignee
Xian Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Polytechnic University filed Critical Xian Polytechnic University
Priority to CN201510979762.5A priority Critical patent/CN105562361B/en
Publication of CN105562361A publication Critical patent/CN105562361A/en
Application granted granted Critical
Publication of CN105562361B publication Critical patent/CN105562361B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B07: SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C: POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00: Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; sorting by manually actuated devices, e.g. switches
    • B07C5/34: Sorting according to other particular properties
    • B07C5/342: Sorting according to optical properties, e.g. colour
    • B07C5/3422: Sorting according to optical properties using video scanning devices, e.g. TV-cameras
    • B07C5/36: Sorting apparatus characterised by the means used for distribution
    • B07C5/361: Processing or control devices therefor, e.g. escort memory
    • B07C5/362: Separating or distributor mechanisms

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an autonomous sorting method for a fabric-sorting robot. Digital images of fabric on an assembly line, acquired by the robot, serve as the objects of study. After colour-space conversion, the H and S colour components are combined in weighted form and Otsu's adaptive threshold method is applied to segment the image; the fabric contours are extracted with wavelet edge detection and compensated by morphological operations; and the actual fabric positions are obtained from the camera calibration results. The same weighted H/S form is used as the feature for a K-means clustering classifier driven by a minimum-variance criterion. Each class of fabric is assigned a placement position; the robot inversely solves the kinematic parameters of every joint from the placement position and the kinematic equations, and finally executes the grasping action. The method requires no manual intervention and sorts fabric automatically, efficiently, accurately, stably and durably.

Description

An autonomous sorting method for a fabric-sorting robot
Technical field
The invention belongs to the field of machine vision and autonomous robotic fabric classification and sorting, and specifically relates to an autonomous sorting method for a fabric-sorting robot.
Background technology
As the national economy enters a "new normal", the traditional development model of continuous resource consumption and labour input is hard to sustain, and the internal driving force of the textile and garment industry must keep strengthening. With the gradual disappearance of the demographic dividend, the maturing of robot technology, and the spread of "Industry 4.0" across industries, replacing workers with robots to reduce labour costs will ultimately become the realistic choice of textile and garment enterprises. Raising the industry's level of informatisation through "machine substitution", actively using information technology to crack the problem of low efficiency and high consumption, and transforming enterprises from labour-intensive to technology-intensive is the only path to their sustainable development. Domestic enterprises have achieved some results in developing general-purpose industrial robots for single stations or integrated processes such as welding, spraying and handling, but these cannot meet the special requirements that textile and garment enterprises place on robots. Labour-intensive enterprises can hardly afford expensive imported robots; developing relatively low-cost, reliable and practical single-station or process-integrated robots that match the actual "machine substitution" needs of textile and garment enterprises is an effective way to ease the cost burden. Since 2014, single-station layout robots, process-integrated textile robots and printing-and-dyeing robots have entered development in Qingdao, Shishi, Shaoxing and elsewhere, but none has yet been put into service. Developing high-performance robot control systems and application systems with independent Chinese intellectual property rights is therefore extremely urgent.
Because manual operators suffer from fatigue, using a machine-vision-based industrial robot for fabric sorting is not only efficient and accurate but also stable and durable, and thus highly advantageous. At present, the vision systems of foreign robot companies, such as ABB's TrueView system, the Schuster-Präzision drill-bit sorting system and FANUC's M-1iA high-speed picking robot, are all designed for sorting rigid objects, and no mature commercial sorting system exists domestically. Existing sorting systems were all developed for rigid objects: they generally classify by shape and identify targets by simple geometric template matching, which does not adapt well to real conditions. For sorting flexible articles such as fabric, no mature method or system has yet been released at home or abroad.
Summary of the invention
The object of the invention is to provide an autonomous sorting method for a fabric-sorting robot, solving the prior-art problem that flexible fabric cannot be sorted autonomously.
The technical solution adopted by the invention is an autonomous sorting method for a fabric-sorting robot, carried out according to the following steps:
Step 1: calibrate the binocular cameras mounted above the robot.
Step 2: acquire fabric image signals with the binocular cameras mounted above the robot and transmit them online in real time to the image processing unit.
Step 3: the image processing unit obtains the transmitted images in real time, converts the RGB colour images to HSV, and applies Otsu's automatic threshold method to segment each piece of fabric.
Step 4: extract the boundary contours of the fabric images from step 3 with wavelet edge detection, and apply morphological methods for contour compensation, obtaining a segmented image based on colour segmentation and wavelet edge detection.
Step 5: determine the actual grasping position of the fabric from the calibration of step 1 and the image edge information of step 4.
Step 6: construct the fabric classifier.
Step 7: according to the classification results of the fabric classifier, the robot grasps fabrics of different colours and places them at different positions.
The invention is further characterised as follows:
The calibration in step 1 uses the classical Zhang Zhengyou calibration method.
When converting the RGB colour images to HSV in step 3, to distinguish fabrics of similar colour, the histograms of the H and S colour components are extracted from the H and S channels and combined in a two-component weighted form, where the weight of the H component is 0.2 to 0.4, the weight of the S component is 0.6 to 0.8, and the two weights sum to 1.
Step 5 determines the actual grasping position of the fabric as follows: from the binocular calibration results and the positional relationship between the two cameras, the depth information of the image is obtained by stereo matching; combined with the edge information detected by the wavelet and morphological methods, the three-dimensional position of the edge in the real world is determined; considering actual grasping conditions, the concrete grasping position is taken 8 to 15 pixels inward from the rightmost edge.
The fabric classifier in step 6 uses the unsupervised K-means clustering method, with minimisation of the variance function as its criterion, to obtain the maximum-likelihood estimate of the mean of the fabric image colour feature and divide the fabric into i classes.
In step 7, according to the classification results of the fabric classifier, fabrics of different colours are grasped and placed at different positions. The concrete grasping position of each colour of fabric is the target position of the robot end-effector; this position is substituted into the robot kinematic equations and solved inversely to obtain the kinematic parameters of each joint of the four-degree-of-freedom robot, which are passed to the motion control card so that the robot completes the corresponding action. The kinematic parameters are computed as follows:
The robot forward kinematic equation is given by formula (1):
T_4^0 = T_1^0(θ1)·T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (1)
where each T is a homogeneous transform in its joint variable, θi (i = 1, 2, 4) is the angle of rotation from X_(i-1) to X_i about the Z_(i-1) axis, and d3 is the joint offset, i.e. the distance from X_2 to X_3 measured along the Z_3 axis.
Given the position and attitude of the robot gripper tip, the corresponding joint angles are solved so that the joint motors can be driven and the gripper attitude meets the grasping requirement. The inverse kinematics has multiple solutions; the concrete solution used by the invention is as follows:
a. Solve joint variable θ1:
To separate variables, pre-multiply both sides of formula (1) by [T_1^0(θ1)]^(-1), obtaining formula (2):
[T_1^0(θ1)]^(-1)·T_4^0 = T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (2)
Substituting the concrete data of the four-degree-of-freedom robot gives formula (3):
[ cosθ1  sinθ1  0  -l1]   [nx ox ax px]   [cosθ2  -sinθ2  0  l2·cosθ2]   [cosθ4  -sinθ4  0  0]
[-sinθ1  cosθ1  0   0 ] · [ny oy ay py] = [sinθ2   cosθ2  0  l2·sinθ2] · [sinθ4   cosθ4  0  0]
[   0      0    1   0 ]   [nz oz az pz]   [  0       0    1    -d3   ]   [  0       0    1  0]
[   0      0    0   1 ]   [ 0  0  0  1]   [  0       0    0     1   ]   [  0       0    0  1]  (3)
where l1 and l2 are the lengths of robot links 1 and 2, n, o and a are the normal, orientation and approach vectors, and p is the position vector of the tool frame origin.
Equating the elements in row 1, column 4 and in row 2, column 4 of the matrices on both sides of formula (3) gives formula (4):
cosθ1·px + sinθ1·py - l1 = cosθ2·l2
-sinθ1·px + cosθ1·py = sinθ2·l2;  (4)
From formula (4), θ1 is obtained as formula (5):
θ1 = arctan2(py, px) ± arccos A;  (5)
where A = (l1² - l2² + r²)/(2·l1·r) and r = √(px² + py²).
b. Solve joint variable θ2:
From formula (4), formula (6) follows:
θ2 = arctan2(-sinθ1·px + cosθ1·py, cosθ1·px + sinθ1·py - l1);  (6)
c. Solve joint variable d3:
Equating the elements in row 3, column 4 of the matrices on both sides of formula (3) gives formula (7):
d3 = -(pz + d4);  (7)
where d4 is the joint offset corresponding to the distance from X_3 to X_4 measured along the Z_4 axis.
d. Solve joint variable θ4:
Equating the elements in row 2, column 1 of the matrices on both sides of formula (3) gives formula (8):
-sinθ1·nx + cosθ1·ny = sinθ2·cosθ4 + cosθ2·sinθ4 = sin(θ2 + θ4);  (8)
From formula (8), θ4 is obtained as formula (9):
θ4 = arcsin(-sinθ1·nx + cosθ1·ny) - θ2.  (9)
The beneficial effects of the invention are:
1. The robot can recognise the colour of fabric online, fuse colour with image edge information for image segmentation, and extract the target fabric.
2. The classifier design enables the robot to classify the fabrics on the industrial floor automatically.
3. The robot can locate the fabric on the assembly line and, according to the classification results, grasp it and place it at the appropriate position.
Brief description of the drawings
Fig. 1 is a flow chart of the autonomous sorting method of the fabric-sorting robot of the invention;
Fig. 2 is a flow chart of the fabric classifier of the fabric-sorting robot of the invention;
Fig. 3 is a flow chart of the autonomous sorting process of the fabric-sorting robot of the invention.
Detailed description of the invention
The invention is described in detail below with reference to the drawings and specific embodiments.
The invention takes as its example a four-degree-of-freedom robot whose control system consists of a PC and a motion control card.
The autonomous sorting method of the fabric-sorting robot of the invention, whose flow is shown in Fig. 1, proceeds according to the following steps:
Step 1: first calibrate the binocular cameras mounted above the robot. Calibration uses the classical Zhang Zhengyou method: from the images acquired by the binocular vision system, six points are taken in each image, from which the intrinsic and extrinsic camera parameters can be computed.
Step 2: acquire fabric image signals with the binocular cameras mounted above the robot and transmit them online in real time to the image processing unit on the PC.
Step 3: the image processing unit on the PC obtains the transmitted digital images in real time and converts the RGB colour images to HSV. To distinguish fabrics of similar colour, the histograms of the H and S colour components are extracted from the H and S channels and combined in a two-component weighted form, where the weight of the H component is 0.2 to 0.4 and the weight of the S component is 0.6 to 0.8; Otsu's adaptive threshold method is then applied to separate background from target.
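As an illustration, step 3's weighted H/S feature and Otsu threshold can be sketched in Python with NumPy alone. The function names are hypothetical, and the weights 0.3/0.7 are merely one choice inside the stated 0.2-0.4 and 0.6-0.8 ranges:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's adaptive threshold: maximise the between-class variance of a 1-D feature."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    omega = np.cumsum(p)                      # class-0 probability up to bin k
    mu = np.cumsum(p * np.arange(bins))       # class-0 cumulative mean (bin indices)
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    k = int(np.nanargmax(sigma_b))            # bin maximising between-class variance
    return edges[k + 1]

def segment_fabric(h, s, w_h=0.3, w_s=0.7):
    """Weighted H/S feature (w_h + w_s = 1), thresholded by Otsu's method."""
    feature = w_h * h + w_s * s
    t = otsu_threshold(feature.ravel())
    return feature > t                        # True = fabric, False = background
```

On a synthetic two-tone image the mask cleanly separates the fabric region from the background.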
Step 4: extract the boundary contours of the fabric images from step 3 with wavelet edge detection, and apply morphological methods for contour compensation.
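A minimal sketch of step 4 under simplifying assumptions: single-level Haar wavelet detail coefficients stand in for the patent's wavelet edge detector, and a 3×3 binary closing stands in for the morphological contour compensation. All names and the threshold value are illustrative:

```python
import numpy as np

def haar_edges(img, thresh=20.0):
    """Single-level Haar wavelet detail coefficients (neighbour differences) as edges."""
    gx = np.abs(np.diff(img.astype(float), axis=1, prepend=img[:, :1]))
    gy = np.abs(np.diff(img.astype(float), axis=0, prepend=img[:1, :]))
    return np.hypot(gx, gy) > thresh

def close_binary(mask, it=1):
    """3x3 morphological closing (dilate then erode) to compensate broken contours."""
    def shift_stack(m):
        p = np.pad(m, 1)                      # zero (False) border
        return np.stack([p[1 + dy:1 + dy + m.shape[0], 1 + dx:1 + dx + m.shape[1]]
                         for dy in (-1, 0, 1) for dx in (-1, 0, 1)])
    for _ in range(it):
        mask = shift_stack(mask).any(axis=0)  # dilation
    for _ in range(it):
        mask = shift_stack(mask).all(axis=0)  # erosion
    return mask
```

Closing a one-pixel gap in a contour line shows the compensation effect: the gap pixel is filled after dilation followed by erosion.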
Step 5: determine the actual grasping position of the fabric from the calibration of step 1 and the image edge information of step 4. From the binocular calibration results and the positional relationship between the two cameras, the depth information of the image is obtained by stereo matching; combined with the edge information detected by the wavelet and morphological methods, the three-dimensional position of the edge in the real world is determined; considering actual grasping conditions, the concrete grasping position is taken 8 to 15 pixels inward from the rightmost edge.
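Step 5's "8 to 15 pixels inward from the rightmost edge" rule can be illustrated on a binary fabric mask. This sketch deliberately ignores the stereo-matched depth and works purely in image coordinates; the helper name and the default inset of 10 pixels are assumptions:

```python
import numpy as np

def grasp_point(mask, inset=10):
    """Pick the rightmost fabric pixel and step `inset` pixels inward (to the left),
    following the 8-15 pixel rule of step 5. Returns (row, col) or None."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                           # no fabric in view
    j = np.argmax(xs)                         # first occurrence of the rightmost column
    row, col = int(ys[j]), int(xs[j])
    return row, max(col - inset, 0)           # move inward along the row
```

In the full method this 2-D point would then be back-projected to a 3-D grasp pose using the stereo depth.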
Step 6: construct the fabric classifier. The classifier uses the unsupervised K-means clustering method, with minimisation of the variance function as its criterion, to obtain the maximum-likelihood estimate of the mean of the fabric image colour feature and divide the fabric into i classes; its flow chart is shown in Fig. 2.
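Step 6's classifier can be sketched as plain k-means on the one-dimensional weighted colour feature: the mean-update step is the maximum-likelihood estimate of each class mean, and the iteration monotonically decreases the within-class variance, matching the stated criterion. This is a generic implementation, not the patent's exact one:

```python
import numpy as np

def kmeans_1d(features, k, iters=50, seed=0):
    """Unsupervised k-means on a 1-D colour feature; returns (labels, centers)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(features, dtype=float).ravel()
    centers = rng.choice(x, size=k, replace=False)       # random initial centres
    for _ in range(iters):
        # assignment step: nearest centre minimises within-class variance
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # update step: ML estimate of each class mean
        new = np.array([x[labels == i].mean() if np.any(labels == i) else centers[i]
                        for i in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

With two well-separated colour clusters the labels split exactly along the clusters, which is the behaviour the fabric classifier relies on.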
Step 7: according to the classification results of the fabric classifier, the robot grasps fabrics of different colours and places them at different positions; this sorting process is shown in Fig. 3. The concrete grasping position of each colour of fabric is the target position of the robot end-effector; this position is substituted into the robot kinematic equations and solved inversely to obtain the kinematic parameters of each joint of the four-degree-of-freedom robot, which are passed to the motion control card so that the robot completes the corresponding action. The kinematic parameters are computed as follows:
The robot forward kinematic equation is given by formula (1):
T_4^0 = T_1^0(θ1)·T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (1)
where each T is a homogeneous transform in its joint variable, θi (i = 1, 2, 4) is the angle of rotation from X_(i-1) to X_i about the Z_(i-1) axis, and d3 is the joint offset, i.e. the distance from X_2 to X_3 measured along the Z_3 axis.
Given the position and attitude of the robot gripper tip, the corresponding joint angles are solved so that the joint motors can be driven and the gripper attitude meets the grasping requirement. The inverse kinematics has multiple solutions; the concrete solution used by the invention is as follows:
a. Solve joint variable θ1:
To separate variables, pre-multiply both sides of formula (1) by [T_1^0(θ1)]^(-1), obtaining formula (2):
[T_1^0(θ1)]^(-1)·T_4^0 = T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (2)
Substituting the concrete data of the four-degree-of-freedom robot gives formula (3):
[ cosθ1  sinθ1  0  -l1]   [nx ox ax px]   [cosθ2  -sinθ2  0  l2·cosθ2]   [cosθ4  -sinθ4  0  0]
[-sinθ1  cosθ1  0   0 ] · [ny oy ay py] = [sinθ2   cosθ2  0  l2·sinθ2] · [sinθ4   cosθ4  0  0]
[   0      0    1   0 ]   [nz oz az pz]   [  0       0    1    -d3   ]   [  0       0    1  0]
[   0      0    0   1 ]   [ 0  0  0  1]   [  0       0    0     1   ]   [  0       0    0  1]  (3)
where l1 and l2 are the lengths of robot links 1 and 2, n, o and a are the normal, orientation and approach vectors, and p is the position vector of the tool frame origin.
Equating the elements in row 1, column 4 and in row 2, column 4 of the matrices on both sides of formula (3) gives formula (4):
cosθ1·px + sinθ1·py - l1 = cosθ2·l2
-sinθ1·px + cosθ1·py = sinθ2·l2;  (4)
From formula (4), θ1 is obtained as formula (5):
θ1 = arctan2(py, px) ± arccos A;  (5)
where A = (l1² - l2² + r²)/(2·l1·r) and r = √(px² + py²).
b. Solve joint variable θ2:
From formula (4), formula (6) follows:
θ2 = arctan2(-sinθ1·px + cosθ1·py, cosθ1·px + sinθ1·py - l1);  (6)
c. Solve joint variable d3:
Equating the elements in row 3, column 4 of the matrices on both sides of formula (3) gives formula (7):
d3 = -(pz + d4);  (7)
where d4 is the joint offset corresponding to the distance from X_3 to X_4 measured along the Z_4 axis.
d. Solve joint variable θ4:
Equating the elements in row 2, column 1 of the matrices on both sides of formula (3) gives formula (8):
-sinθ1·nx + cosθ1·ny = sinθ2·cosθ4 + cosθ2·sinθ4 = sin(θ2 + θ4);  (8)
From formula (8), θ4 is obtained as formula (9):
θ4 = arcsin(-sinθ1·nx + cosθ1·ny) - θ2.  (9)
The above steps yield the inversely solved kinematic parameters of the robot.
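The closed-form solution (5)-(9) can be transcribed directly and checked numerically against the forward kinematics. In this sketch the function and parameter names are assumptions, and the ± branch of formula (5) is exposed as an `elbow` argument, reflecting the multiple inverse-kinematic solutions:

```python
import math

def inverse_kinematics(px, py, pz, nx, ny, l1, l2, d4, elbow=1):
    """Closed-form inverse kinematics for the 4-DOF robot, formulas (5)-(9).
    elbow = +1 / -1 selects between the two theta1 branches of formula (5)."""
    r = math.hypot(px, py)
    A = (l1 ** 2 - l2 ** 2 + r ** 2) / (2.0 * l1 * r)          # coefficient in (5)
    theta1 = math.atan2(py, px) + elbow * math.acos(max(-1.0, min(1.0, A)))
    s2 = (-math.sin(theta1) * px + math.cos(theta1) * py) / l2  # from formula (4)
    c2 = (math.cos(theta1) * px + math.sin(theta1) * py - l1) / l2
    theta2 = math.atan2(s2, c2)                                 # formula (6)
    d3 = -(pz + d4)                                             # formula (7)
    theta4 = math.asin(-math.sin(theta1) * nx
                       + math.cos(theta1) * ny) - theta2        # formula (9)
    return theta1, theta2, d3, theta4
```

A quick consistency check: build a pose from chosen joint values with the planar forward kinematics (px = l1·cosθ1 + l2·cos(θ1+θ2), similarly for py, with n the first column of the accumulated rotation), then confirm the solver recovers the joint values.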

Claims (6)

1. An autonomous sorting method for a fabric-sorting robot, characterised in that it proceeds according to the following steps:
Step 1: calibrate the binocular cameras mounted above the robot;
Step 2: acquire fabric image signals with the binocular cameras mounted above the robot and transmit them online in real time to the image processing unit;
Step 3: the image processing unit obtains the transmitted images in real time, converts the RGB colour images to HSV, and applies Otsu's automatic threshold method to segment each piece of fabric;
Step 4: extract the boundary contours of the fabric images from step 3 with wavelet edge detection, and apply morphological methods for contour compensation, obtaining a segmented image based on colour segmentation and wavelet edge detection;
Step 5: determine the actual grasping position of the fabric from the calibration of step 1 and the image edge information of step 4;
Step 6: construct the fabric classifier;
Step 7: according to the classification results of the fabric classifier, the robot grasps fabrics of different colours and places them at different positions.
2. The autonomous sorting method for a fabric-sorting robot according to claim 1, characterised in that the calibration in step 1 uses the classical Zhang Zhengyou calibration method.
3. The autonomous sorting method for a fabric-sorting robot according to claim 1, characterised in that, when the RGB colour images are converted to HSV in step 3, the histograms of the H and S colour components are extracted from the H and S channels and combined in a two-component weighted form, where the weight of the H component is 0.2 to 0.4, the weight of the S component is 0.6 to 0.8, and the two weights sum to 1.
4. The autonomous sorting method for a fabric-sorting robot according to claim 1, characterised in that step 5 determines the actual grasping position of the fabric as follows: from the binocular calibration results and the positional relationship between the two cameras, the depth information of the image is obtained by stereo matching; combined with the edge information detected by the wavelet and morphological methods, the three-dimensional position of the edge in the real world is determined; the concrete grasping position is taken 8 to 15 pixels inward from the rightmost edge.
5. The autonomous sorting method for a fabric-sorting robot according to claim 1, characterised in that the fabric classifier in step 6 uses the unsupervised K-means clustering method, with minimisation of the variance function as its criterion, to obtain the maximum-likelihood estimate of the mean of the fabric image colour feature and divide the fabric into i classes.
6. The autonomous sorting method for a fabric-sorting robot according to claim 1, characterised in that, in step 7, the robot grasps fabrics of different colours according to the classification results of the fabric classifier and places them at different positions; the concrete grasping position of each colour of fabric is the target position of the robot end-effector; this position is substituted into the robot kinematic equations and solved inversely to obtain the kinematic parameters of each joint of the four-degree-of-freedom robot, which are passed to the motion control card so that the robot completes the corresponding action, specifically:
The kinematic parameters are computed as follows:
The robot forward kinematic equation is given by formula (1):
T_4^0 = T_1^0(θ1)·T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (1)
where each T is a homogeneous transform in its joint variable, θi (i = 1, 2, 4) is the angle of rotation from X_(i-1) to X_i about the Z_(i-1) axis, and d3 is the joint offset, i.e. the distance from X_2 to X_3 measured along the Z_3 axis.
Given the position and attitude of the robot gripper tip, the corresponding joint angles are solved so that the joint motors can be driven and the gripper attitude meets the grasping requirement; the concrete solution of the inverse kinematics is as follows:
a. Solve joint variable θ1:
To separate variables, pre-multiply both sides of formula (1) by [T_1^0(θ1)]^(-1), obtaining formula (2):
[T_1^0(θ1)]^(-1)·T_4^0 = T_2^1(θ2)·T_3^2(d3)·T_4^3(θ4);  (2)
Substituting the concrete data of the four-degree-of-freedom robot gives formula (3):
[ cosθ1  sinθ1  0  -l1]   [nx ox ax px]   [cosθ2  -sinθ2  0  l2·cosθ2]   [cosθ4  -sinθ4  0  0]
[-sinθ1  cosθ1  0   0 ] · [ny oy ay py] = [sinθ2   cosθ2  0  l2·sinθ2] · [sinθ4   cosθ4  0  0]
[   0      0    1   0 ]   [nz oz az pz]   [  0       0    1    -d3   ]   [  0       0    1  0]
[   0      0    0   1 ]   [ 0  0  0  1]   [  0       0    0     1   ]   [  0       0    0  1]  (3)
where l1 and l2 are the lengths of robot links 1 and 2, n, o and a are the normal, orientation and approach vectors, and p is the position vector of the tool frame origin.
Equating the elements in row 1, column 4 and in row 2, column 4 of the matrices on both sides of formula (3) gives formula (4):
cosθ1·px + sinθ1·py - l1 = cosθ2·l2
-sinθ1·px + cosθ1·py = sinθ2·l2;  (4)
From formula (4), θ1 is obtained as formula (5):
θ1 = arctan2(py, px) ± arccos A;  (5)
where A = (l1² - l2² + r²)/(2·l1·r) and r = √(px² + py²).
b. Solve joint variable θ2:
From formula (4), formula (6) follows:
θ2 = arctan2(-sinθ1·px + cosθ1·py, cosθ1·px + sinθ1·py - l1);  (6)
c. Solve joint variable d3:
Equating the elements in row 3, column 4 of the matrices on both sides of formula (3) gives formula (7):
d3 = -(pz + d4);  (7)
where d4 is the joint offset corresponding to the distance from X_3 to X_4 measured along the Z_4 axis.
d. Solve joint variable θ4:
Equating the elements in row 2, column 1 of the matrices on both sides of formula (3) gives formula (8):
-sinθ1·nx + cosθ1·ny = sinθ2·cosθ4 + cosθ2·sinθ4 = sin(θ2 + θ4);  (8)
From formula (8), θ4 is obtained as formula (9):
θ4 = arcsin(-sinθ1·nx + cosθ1·ny) - θ2.  (9)
CN201510979762.5A 2015-12-23 2015-12-23 Autonomous sorting method for a fabric-sorting robot Expired - Fee Related CN105562361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510979762.5A CN105562361B (en) 2015-12-23 2015-12-23 Autonomous sorting method for a fabric-sorting robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510979762.5A CN105562361B (en) 2015-12-23 2015-12-23 Autonomous sorting method for a fabric-sorting robot

Publications (2)

Publication Number Publication Date
CN105562361A true CN105562361A (en) 2016-05-11
CN105562361B CN105562361B (en) 2017-12-22

Family

ID=55873306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510979762.5A Expired - Fee Related CN105562361B (en) 2015-12-23 2015-12-23 Autonomous sorting method for a fabric-sorting robot

Country Status (1)

Country Link
CN (1) CN105562361B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006239602A (en) * 2005-03-04 2006-09-14 Hirosaki Univ Grading method for fruit and vegetable
CN101890409A (en) * 2010-07-16 2010-11-24 合肥安晶龙电子有限公司 Method for sorting substance particles based on colour space conversion
CN102706274A (en) * 2012-04-25 2012-10-03 复旦大学 System for accurately positioning mechanical part by machine vision in industrially-structured scene
CN104056789A (en) * 2013-03-19 2014-09-24 青岛农业大学 Carrot defect image quantitative detection method and carrot sorting apparatus
CN104511436A (en) * 2013-09-28 2015-04-15 沈阳新松机器人自动化股份有限公司 Express sorting method and system based on robot visual servo technology

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHAO Yankai et al., "Robot target localization and manipulator control based on binocular vision", Computer Technology and Development (《计算机技术与发展》) *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106000917A (en) * 2016-07-15 2016-10-12 安徽东锦资源再生科技有限公司 Intelligent color sorting device and method for waste fiber products
CN106245125A (en) * 2016-09-12 2016-12-21 安徽新创智能科技有限公司 Waste and old medicated clothing environment-friendly high-efficiency regenerative system and renovation process
CN107138432A (en) * 2017-04-05 2017-09-08 杭州迦智科技有限公司 Non-rigid object method for sorting and device
CN107138432B (en) * 2017-04-05 2020-03-13 杭州迦智科技有限公司 Method and apparatus for sorting non-rigid objects
CN107784634A (en) * 2017-09-06 2018-03-09 Template-matching-based method for recognizing bird's nests on power transmission line towers
CN107649406A (en) * 2017-09-30 2018-02-02 Efficient binocular-vision multi-material picking system and method
CN107671013A (en) * 2017-11-23 2018-02-09 Large-size material rejection process based on color sorting technology
CN108038861A (en) * 2017-11-30 2018-05-15 Multi-robot cooperative sorting method, system and device
CN108205324A (en) * 2018-01-03 2018-06-26 Intelligent road cleaning device
CN108217045A (en) * 2018-01-03 2018-06-29 Intelligent robot for racking and unracking physical equipment in data centers
EP3530778A1 (en) * 2018-01-22 2019-08-28 Novetex Textiles Limited System and method for recycling fibers from textiles waste
CN108607819A (en) * 2018-04-25 2018-10-02 重庆邮电大学 Material sorting system and method
CN108764062A (en) * 2018-05-07 2018-11-06 Vision-based garment cut-piece recognition method
CN108764062B (en) * 2018-05-07 2022-02-25 Vision-based garment cut-piece recognition method
CN108648208A (en) * 2018-05-14 2018-10-12 Embedded cut-piece hanging robot control system and robot control method
CN109513629A (en) * 2018-11-14 2019-03-26 Package sorting method, device and computer-readable storage medium
CN109513629B (en) * 2018-11-14 2021-06-11 深圳蓝胖子机器智能有限公司 Method, device and computer readable storage medium for sorting packages
CN110174065A (en) * 2019-06-17 2019-08-27 Non-destructive fruit size detection method based on orthogonal binocular machine vision
CN113487659A (en) * 2021-07-14 2021-10-08 浙江大学 Image registration method, device, equipment and storage medium
CN113487659B (en) * 2021-07-14 2023-10-20 浙江大学 Image registration method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN105562361B (en) 2017-12-22

Similar Documents

Publication Publication Date Title
CN105562361A (en) Independent sorting method of fabric sorting robot
Kumar et al. Review of lane detection and tracking algorithms in advanced driver assistance system
WO2015010451A1 (en) Method for road detection from one image
CN104537676B (en) Gradual image segmentation method based on online learning
CN106251353A (en) Method and system for recognizing and detecting weak-texture workpieces and their three-dimensional poses
CN106530297A (en) Object grabbing region positioning method based on point cloud registration
CN104732536A (en) Sub-pixel edge detection method based on improved morphology
CN113643280B (en) Computer vision-based plate sorting system and method
CN104134209A (en) Feature extraction and matching method and feature extraction and matching system in visual navigation
CN105787519A (en) Tree species classification method based on vein detection
CN104299246B (en) Production line article part motion detection and tracking based on video
CN102567703A (en) Hand motion identification information processing method based on classification characteristic
CN104200461A (en) Remote sensing image registration method based on mutual-information block selection and SIFT (scale-invariant feature transform) features
CN104298996A (en) Underwater active vision tracking method applied to bionic robot fish
Guan et al. A 2D human body model dressed in eigen clothing
CN106504262A (en) Intelligent positioning method for small tiles based on multi-feature fusion
CN112926503B (en) Automatic generation method of grabbing data set based on rectangular fitting
Zhang et al. An adaptive vision navigation algorithm in agricultural IoT system for smart agricultural robots
CN108109154A (en) New workpiece positioning and data acquisition method
CN109389165A (en) Transformer oil level gauge recognition method based on inspection robot
CN103886324B (en) Scale adaptive target tracking method based on log likelihood image
CN115147448A (en) Image enhancement and feature extraction method for automatic welding
CN104992448B (en) Automatic positioning method for damage-free robotic grape picking
CN103761523A (en) Automatic identification and tracking method for airborne remote sensing video in specific man-made area
CN107516315A (en) Machine-vision-based monitoring method for roadheader slag discharge

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171222

Termination date: 20201223