CN108858193A - Mechanical arm grasping method and system - Google Patents
Mechanical arm grasping method and system
- Publication number
- CN108858193A (application CN201810736694.3A)
- Authority
- CN
- China
- Prior art keywords
- mechanical arm
- grasping
- point
- grasping method
- constraint
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The present invention provides a mechanical arm grasping method and a system implementing the method. The method comprises the following steps: S1, acquiring point cloud information of the surface of an object to be grasped (the surface visible in the camera's field of view); S2, further processing the point cloud data and extracting, by a grasp planning algorithm, feasible grasp points that satisfy the constraint conditions; S3, taking the grasp points as input to the mechanical arm's inverse kinematics and issuing motion control instructions to the mechanical arm and the two-finger parallel gripper; S4, the mechanical arm executes the motion control instructions and moves to the designated position, and then, following the motion timing in the instructions, the two-finger parallel gripper opens and closes to complete the grasping task. The beneficial effect of the present invention is that robust mechanical arm grasping is achieved with a single visual sensor even when the shape of the object is uncertain.
Description
Technical field
The present invention relates to the field of robotics, and in particular to a mechanical arm grasping method and system for objects of uncertain shape.
Background art
With the rise of artificial intelligence, robots play an increasingly important role in all walks of life. For a robot, grasping is an essential skill for entering the real world, for example sorting objects in the logistics industry or assembling parts on an industrial production line. However, many uncertainty problems in robotic grasping still require further study. How to handle this uncertainty so as to improve the grasp success rate is therefore a question well worth researching.
In general, the uncertainty in the grasping process mainly comprises uncertainty in the shape of the object to be grasped, uncertainty in its pose, uncertainty in the manipulator's contact points, and uncertainty in the object's mass. In practical applications, the main uncertainty faced during grasping comes from the shape of the object to be grasped. Causes of shape uncertainty include: insufficient illumination during grasping, which makes it difficult to identify the target object accurately; insufficient precision of the observation camera, or the object lying outside its valid sensing range; the observation camera seeing only part of the object's surface; and the object being transparent or translucent or having a reflective surface. Transparent mineral water bottles, pen containers with incomplete latticed surfaces, and easily deformed plush toys are all objects that are difficult to perceive.
There are two common methods for handling objects of uncertain shape in mechanical arm grasping. The first adds, besides the video camera, one or more other sensors (for example touch sensors, force sensors, or laser sensors) to feed back more information about the object, making up for the shape error introduced by a single camera, and finally completes the grasping task by controlling a multi-degree-of-freedom manipulator. The second applies machine learning to mechanical arm grasping: the mass of data obtained from a sufficient number of grasping experiments serves as a training set of feasible grasp configurations for the mechanical arm and manipulator, yielding a grasp model learned from experimental data. When the point cloud obtained from the camera is sufficiently complete, that point cloud is used as the test input of the grasp model, the corresponding grasp parameters are retrieved, and the mechanical arm is driven to complete the grasping task.
However, the shortcomings of both approaches are obvious. The first method acquires more object information by adding sensors and ultimately relies on a multi-degree-of-freedom manipulator, which greatly increases cost and makes it unsuitable for industrial production and daily life. Also for cost reasons, most industrial grasping operations are solved with special-purpose equipment, such as conveyor belts with sorting devices, and grasping is usually completed with a two-finger gripper or a suction cup. The second method obtains a grasp model by training on data from a large number of experiments; collecting that much data requires a long time and enough grasp repetitions that the service life of the mechanical arm is substantially reduced.
Summary of the invention
The object of the present invention is to provide a mechanical arm grasping method and system for objects of uncertain shape.
To this end, the present invention provides a mechanical arm grasping method comprising the following steps: S1, acquiring point cloud information of the surface of the object to be grasped (the surface visible in the camera's field of view); S2, further processing the point cloud data and extracting, by a grasp planning algorithm, feasible grasp points that satisfy the constraint conditions; S3, taking the grasp points as input to the mechanical arm's inverse kinematics and issuing motion control instructions to the mechanical arm and the two-finger parallel gripper; S4, the mechanical arm executes the motion control instructions and moves to the designated position, and then, following the motion timing in the instructions, the two-finger parallel gripper opens and closes to complete the grasping task. Step S2 comprises: S2a, computing the average coordinates of all data points as the centroid of the object; S2b, computing the coordinates of all data points relative to the centroid; S2c, substituting all coordinates into the preset constraint conditions to obtain the set of data points that satisfy them; S2d, Gaussian-filtering the coordinates of all points that satisfy the constraints to obtain the correlation coefficient between every two data points; S2e, sorting the filtered results in descending order of uncertainty and choosing the pair of grasp points with the smallest uncertainty.
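As an illustrative sketch only (not the patent's actual implementation), steps S2a-S2e can be expressed as follows, with the constraint test of S2c and the kernel covariance of S2d supplied as placeholder callables:

```python
import numpy as np
from itertools import combinations

def select_grasp_pair(points, satisfies_constraints, cov):
    """Sketch of step S2; `satisfies_constraints` and `cov` stand in for
    the constraint conditions and kernel covariance described in the text."""
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)                 # S2a: centroid = mean coordinate
    rel = points - centroid                        # S2b: coordinates relative to the centroid
    feasible = [i for i in range(len(points))
                if satisfies_constraints(rel[i])]  # S2c: constraint filtering
    # S2d: correlation/uncertainty score for every pair of feasible points
    scored = [((i, j), cov(rel[i], rel[j]))
              for i, j in combinations(feasible, 2)]
    # S2e: sort by uncertainty in descending order, take the smallest pair
    scored.sort(key=lambda s: s[1], reverse=True)
    return scored[-1][0]
```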
Preferably, in an embodiment of the present invention, the communication functions between the RGB-D observation camera, the mechanical arm, and the two-finger manipulator are initially configured.
Further preferably, in an embodiment of the present invention, in S2d the correlation between two data points is represented by a covariance matrix, and the concrete computation is carried out with kernel functions.
Further preferably, in an embodiment of the present invention, the covariance is computed as a linear combination of kernel functions:

cov(xi, xj) = ω·kG(xi, xj) + (1 − ω)·kT(xi, xj)

where ω is any positive number between 0 and 1 and xi, xj are distance values.

Gaussian kernel function: kG(xi, xj) = exp(−||xi − xj||² / (2σ²)),

thin-plate kernel function: kT(xi, xj) = 2||xi − xj||³ − 3||xi − xj||².

The result of cov(xi, xj) serves as the measure of shape uncertainty in the present invention, i.e. the quantification of the uncertainty.
Further preferably, in an embodiment of the present invention, step S2 further comprises: in step S3, after a pair of feasible grasp points is obtained, according to the calibrated pose relation between the Kinect camera, the mechanical arm, and the two-finger manipulator, the coordinates of the feasible pair of grasp points are converted into the pose instruction to which the mechanical arm should move and the control instructions for when the two-finger manipulator should open and close, which are sent to the mechanical arm and the two-finger manipulator respectively.
Further preferably, in an embodiment of the present invention, the constraint conditions in step S2 comprise: the hand constraint, the grasp stability constraint, and the grasped-object physical constraint.
Further preferably, in an embodiment of the present invention, the hand constraint arises from the limits of each manipulator's own mechanical structure; the grasp stability constraint requires that the friction generated by the grasp can support the gravity of the grasped object and that the object is not touched prematurely.
Further preferably, in an embodiment of the present invention, the feasible grasp points satisfying the above constraint conditions are sorted by shape uncertainty from low to high, and only the pair of grasp points with the lowest shape uncertainty is taken.
Further preferably, in an embodiment of the present invention, in step S2d, during the Gaussian filtering, each three-dimensional data point (x, y, z) is mapped by a kernel function to the four-dimensional space (d, nx, ny, nz), where d denotes the distance between two data points, and the shape uncertainty is quantified on this basis.
The present invention also provides a mechanical arm grasping system comprising an RGB-D observation camera, a central controller, a mechanical arm, and a two-finger parallel gripper, wherein a program stored in the central controller controls the RGB-D observation camera, the mechanical arm, and the two-finger parallel gripper to execute the aforementioned method.
Compared with the prior art, the beneficial effect of the present invention is that robust mechanical arm grasping can be achieved with a single visual sensor (the RGB-D observation camera) even when the shape of the object is uncertain.
Brief description of the drawings
Fig. 1 is a schematic diagram of the grasping system of an embodiment of the present invention;
Fig. 2 is the basic flow chart of an embodiment of the present invention;
Fig. 3 is a schematic diagram of the variables related to grasping in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the coordinate systems of an embodiment of the present invention.
Reference numerals: 1, fixed RGB-D observation camera; 2, assembled mechanical arm; 3, two-finger parallel manipulator; 4, object to be grasped; 5, supporting surface on which the object to be grasped rests; 6, central controller; 7, debugging interface of the central controller; 8, data line between the observation camera and the controller; 9, data line between the mechanical arm and the controller.
Specific embodiment
The present invention is described in further detail below with reference to specific embodiments and the accompanying drawings, in which identical reference numerals denote identical components unless otherwise stated. It should be emphasized that the following description is merely exemplary and is not intended to limit the scope of the invention or its applications.
As shown in Fig. 1, the mechanical arm grasping system of this embodiment comprises an RGB-D observation camera 1, a central controller (desktop computer) 6 running the Ubuntu operating system (a Linux operating system), a mechanical arm 2, and a two-finger parallel gripper 3. In Fig. 1, 8 and 9 denote the data lines between the camera and the controller and between the mechanical arm and the controller respectively; 7 is the debugging interface of the central controller, which facilitates interaction; 4 is the object to be grasped; and 5 is the supporting surface on which it rests.
The mechanical arm grasping method of this embodiment comprises an offline planning part and an online execution part. Offline planning mainly comprises grasp planning for the mechanical arm and configuration of the manipulator. Grasp planning comprises two parts: modeling the object surface and extracting the features of feasible grasp points. The manipulator configuration is mainly the posture of the manipulator during grasping.
The RGB-D observation camera 1 acquires the point cloud of the surface of the object to be grasped (the surface visible in the camera's field of view), and the point cloud data (the three-dimensional coordinates and normal vector of each data point) are saved in the obj file format for later processing in the central controller 6.
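The obj format mentioned above is plain text; a minimal reader for the per-point data the embodiment stores (vertex coordinates and normal vectors) might look as follows. This is an illustrative sketch, and real obj files can contain further record types that are simply ignored here.

```python
def load_obj_pointcloud(path):
    """Read 'v x y z' vertex lines and 'vn nx ny nz' normal lines from an
    obj file, returning two parallel lists of 3-tuples of floats."""
    verts, normals = [], []
    with open(path) as f:
        for line in f:
            tok = line.split()
            if tok and tok[0] == "v":
                verts.append(tuple(map(float, tok[1:4])))
            elif tok and tok[0] == "vn":
                normals.append(tuple(map(float, tok[1:4])))
    return verts, normals
```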
The central controller 6 reads the obj file containing the point cloud data, further processes the data in ROS (Robot Operating System), extracts by the grasp planning algorithm the feasible grasp points that satisfy the constraint conditions, and, taking those grasp points as input to the mechanical arm's inverse kinematics, issues motion control instructions from ROS to the mechanical arm 2 and the two-finger parallel gripper 3.
The mechanical arm executes the motion instructions issued by the central controller 6 after the inverse kinematics have been solved, moves to the designated position, and then, following the motion timing in the instructions, the two-finger parallel gripper opens and closes to complete the grasping task.
Point cloud processing software (ROS Indigo) is installed on the central controller 6. The surface point cloud of the visible part of the object, acquired by the RGB-D observation camera, serves as raw data; the data processing, including extracting the three-dimensional coordinates and normal vector of each data point, is performed on the central computer with ROS, software that runs on a Linux operating system (Ubuntu in this embodiment).
Workflow:
The basic procedure is shown in Fig. 2.
Step 1. Run ROS Indigo on the Ubuntu system of the central controller (desktop computer), and initially configure the communication functions between the RGB-D observation camera, the mechanical arm, and the two-finger manipulator.
Step 2. Place the object to be grasped within the field of view of the RGB-D observation camera. The central controller (desktop computer) issues an instruction so that the RGB-D observation camera acquires information about the surface of the object to be grasped (mainly the three-dimensional coordinates of the surface points and their corresponding normal vectors; of course, only the surface visible in the camera's field of view can be captured).
Step 3. The RGB-D observation camera transmits the acquired data to the central controller (desktop computer), which processes it inside ROS. This part finally yields the coordinates of the feasible grasp points for the two-finger manipulator.
The main work in step 3 comprises:
a. Computing the average coordinates of all data points as the centroid of the object.
b. Computing the coordinates of all data points relative to the centroid.
c. Substituting all coordinates into the preset constraint conditions to obtain the set of data points that satisfy them.
d. Gaussian-filtering the coordinates of all points that satisfy the constraints (the Gaussian filtering process substitutes all data points into the corresponding kernel functions) to obtain the correlation coefficient between every two data points.
The correlation coefficient is explained as follows. Two adjacent points in the object's point cloud are considered correlated, following a Gaussian distribution. The correlation between two data points can be represented by a covariance matrix, and the concrete computation uses kernel functions. Since a linear combination of two kernel functions is still a kernel function, the present invention innovatively uses the linear combination

cov(xi, xj) = ω·kG(xi, xj) + (1 − ω)·kT(xi, xj)

where ω is any positive number between 0 and 1 and xi, xj are distance values.

Gaussian kernel function: kG(xi, xj) = exp(−||xi − xj||² / (2σ²)).

Thin-plate kernel function: kT(xi, xj) = 2||xi − xj||³ − 3||xi − xj||².

The result of cov(xi, xj) serves as the measure of shape uncertainty in the present invention, i.e. the quantification of the uncertainty.

The filtered results are then sorted in descending order of uncertainty, and the pair of grasp points with the smallest uncertainty is chosen (because the system uses a two-finger parallel gripper, only two grasp points can be used, as the contact points between the gripper and the object to be grasped).
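Under the assumption that σ is a free bandwidth parameter (the patent does not fix its value) and that ω is chosen in (0, 1), the kernel combination above can be written directly as:

```python
import numpy as np

def k_gauss(xi, xj, sigma=1.0):
    # Gaussian kernel; sigma is an assumed bandwidth, not a patent value
    return float(np.exp(-np.linalg.norm(xi - xj) ** 2 / (2.0 * sigma ** 2)))

def k_thinplate(xi, xj):
    # thin-plate kernel as given: 2||xi - xj||^3 - 3||xi - xj||^2
    r = float(np.linalg.norm(xi - xj))
    return 2.0 * r ** 3 - 3.0 * r ** 2

def cov(xi, xj, w=0.5):
    # linear combination of the two kernels, with w in (0, 1)
    return w * k_gauss(xi, xj) + (1.0 - w) * k_thinplate(xi, xj)
```

At zero distance the Gaussian term is 1 and the thin-plate term is 0, so cov reduces to w, which is a convenient sanity check on the combination.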
Step 4. After a pair of feasible grasp points is obtained, ROS converts, according to the calibrated pose relation between the Kinect camera, the UR5 mechanical arm, and the two-finger manipulator, the coordinates of the feasible pair of grasp points into the pose instruction to which the mechanical arm should move and the control instructions for when the two-finger manipulator should open and close, and sends them to the mechanical arm and the two-finger manipulator respectively.
Step 5. After receiving the motion control instructions from the central controller (desktop computer), the mechanical arm responds first, moving to the designated spatial position and adjusting the end-effector attitude accordingly so that the two-finger manipulator can grasp conveniently. After the mechanical arm has completed its own control instructions, the two-finger manipulator begins to execute its control instructions, opening its two fingers and then clamping the target object.
The concepts mentioned in steps 1-5 above are described in detail below.
(1) Constraint conditions: three kinds of constraint conditions arise when grasping an object of uncertain shape, namely the hand constraint, the grasp stability constraint, and the grasped-object physical constraint.
Fig. 3 marks the variables related to grasping. C1 and C2 are a pair of feasible grasp points satisfying every constraint condition; n1 and n2 are the normal vectors at C1 and C2 respectively; g1 and g2 denote the points on the parallel gripper that coincide with the feasible grasp points, so that g1g2 indicates the direction of the grasp; and W denotes the opening width of the parallel gripper.
a. The hand constraint arises because each manipulator's own mechanical structure makes certain grasping motions impossible. In this embodiment, we limit the distance between a pair of feasible grasp points to no more than the maximum opening distance of the ROBOTIQ two-finger parallel gripper. In addition, since the two-finger manipulator grasps in parallel, we require the grasp direction of the parallel gripper to be parallel to the normal vectors of the selected feasible grasp points. This prevents oblique grasps, in which the object cannot be held firmly and the grasp fails.
b. The grasp stability constraint. The precondition of a stable grasp is that the friction generated by the grasp can support the gravity of the grasped object; we therefore require the friction angle of the gripper at the grasp points to be greater than the angle between the normal vectors of the two feasible grasp points. In addition, if one of the feasible grasp points lies in a concavity, the gripper is likely to touch the object prematurely as it approaches, tipping the object over and causing the grasp to fail.
c. The grasped-object physical constraint: obviously, the feasible grasp points must lie on the surface of the object to be grasped.
In addition, more than one pair of feasible grasp points may satisfy the above constraint conditions, so we sort the feasible grasp points satisfying all of the above constraints by shape uncertainty from low to high and take only the pair with the lowest shape uncertainty.
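For illustration only, the hand and stability constraints above might be tested as follows. The maximum opening width, the parallelism tolerance, and the friction angle are assumed parameters (the patent names only the ROBOTIQ gripper's maximum opening), and the normals are assumed to be unit vectors pointing out of the surface:

```python
import numpy as np

def hand_constraint(c1, c2, n1, n2, max_open, parallel_tol_deg=5.0):
    """Hand constraint: the pair must fit inside the gripper's maximum
    opening, and the grasp axis c1->c2 must be (anti)parallel to both
    surface normals within an assumed angular tolerance."""
    axis = np.asarray(c2, float) - np.asarray(c1, float)
    width = np.linalg.norm(axis)
    if width == 0.0 or width > max_open:
        return False
    axis /= width
    tol = np.cos(np.radians(parallel_tol_deg))
    return abs(np.dot(axis, n1)) >= tol and abs(np.dot(axis, n2)) >= tol

def stability_constraint(n1, n2, friction_angle_deg):
    """Grasp stability: the friction angle at the contacts must exceed the
    angle by which the two normals deviate from a perfectly antipodal grasp
    (one reading of 'the angle between the normal vectors')."""
    cosdev = np.clip(np.dot(-np.asarray(n1, float), np.asarray(n2, float)),
                     -1.0, 1.0)
    deviation_deg = np.degrees(np.arccos(cosdev))
    return friction_angle_deg > deviation_deg
```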
(2) Gaussian filtering uses the linear combination of a kernel function with robustness (the Gaussian kernel) and a kernel function that adapts very well to smooth continuous surfaces (the thin-plate kernel), so that the present invention can adapt to as many grasped objects as possible and eliminate noise (noise is objectively present when acquiring surface information, due to the precision of the RGB-D observation camera itself and to the environment; the biggest problem it brings is that errors accumulate as they propagate).
(3) A kernel function defines a mapping from a low-dimensional space to a high-dimensional one. In the present invention, to rebuild the grasp model of the object surface from the acquired point cloud data, we use a kernel function to map each three-dimensional data point (x, y, z) to the four-dimensional space (d, nx, ny, nz) (detailed below), where d denotes the distance between two data points, and quantify the shape uncertainty on this basis.
The distance d is further described as follows. We average the three-dimensional coordinate values of all observed point cloud data and regard the result as the centroid position of the target object. We define the distance of the centroid position itself as −1, and use this as the measure of how spread out the point cloud of the observable object is. The correlation of two data points is then measured by cov(di, dj), and the whole set of measured values reflects the shape uncertainty of the entire object. (Note: cov(di, dj) here is the same as cov(xi, xj) above.)
Mapping to four-dimensional space: it should be pointed out here that each specific data point in the point cloud has both its spatial coordinates (x, y, z) and, being a surface point of the object, its normal vector information (nx, ny, nz). This keeps the data complete and in one-to-one correspondence during processing, and also provides accurate normal vector values for use as constraint criteria below.
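As a sketch of this mapping, taking d as each point's distance to the point cloud centroid (one reading of the description above):

```python
import numpy as np

def map_to_4d(points, normals):
    """Map each surface point (x, y, z) with normal (nx, ny, nz) to the
    4-D representation (d, nx, ny, nz), where d is the distance from the
    point to the centroid of the observed point cloud."""
    points = np.asarray(points, dtype=float)
    normals = np.asarray(normals, dtype=float)
    centroid = points.mean(axis=0)
    d = np.linalg.norm(points - centroid, axis=1)
    return np.column_stack([d, normals])
```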
(4) The calibration process between the RGB-D observation camera, the mechanical arm, and the two-finger manipulator: calibration determines the relative relation (relative position and attitude) between the camera coordinate system and the robot. In the present invention the camera is mounted "eye-to-hand", so the camera and the base of the mechanical arm are fixed relative to each other, which reduces systematic error.
In Fig. 4, frame B is the fixed coordinate system of the mechanical arm, frame E is the coordinate system of the two-finger parallel gripper, frame C is the coordinate system of the RGB-D observation camera, and frame D is the coordinate system of the calibration board (fixed to the end of the mechanical arm). Calibration consists of driving the mechanical arm to several different spatial positions (chosen so that the calibration board is visible), so that the RGB-D observation camera can see the calibration board throughout the process. Since Fig. 4 emphasizes the camera calibration process, other objects such as the master controller are omitted from the drawing.
During this process the spatial pose between frame D and frame E, i.e. between the camera coordinate system and the end-effector coordinate system, is fixed; that is, the pose matrix T2 never changes. Hence the pose matrix of the RGB-D observation camera in the mechanical arm's fixed coordinate system is T4 = T1 × T2 × T3, where T1-T4 are linkage coordinate transformations describing the pose changes of the mechanical arm in space. T1 is the spatial pose transformation matrix of joint 1 of the mechanical arm relative to frame B, i.e. the arm's fixed coordinate system at the base; T2 is the spatial pose transformation matrix relative to T1; and frame E, which describes the spatial pose of the two-finger end-effector, is the spatial pose transformation relative to the matrix T4.
(The calibration board is moved twice, and all spatial positions of the observation camera are computed for the two processes separately. Because the camera in the present invention is fixed, the product of the whole linkage transformation chain is the same each time.) We therefore have T1 × T2 × T3 = T1′ × T2 × T3′, from which the specific value for frame B can be obtained, and in turn the pose relation T4 of the RGB-D observation camera in the mechanical arm's fixed coordinate system can be solved.
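The transform chain T4 = T1 × T2 × T3 is an ordinary product of 4×4 homogeneous pose matrices; a minimal sketch:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous pose matrix from a 3x3 rotation R
    and a translation vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_in_base(T1, T2, T3):
    # T4 = T1 x T2 x T3: camera pose in the arm's fixed base frame
    return T1 @ T2 @ T3
```

With pure translations the chain simply adds the offsets, which is an easy sanity check on the convention.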
The method used in the present invention applies a linear combination of kernel functions, based on Gaussian processes, to reconstruct the surface of an object of uncertain shape, extracts feasible grasp points under the corresponding constraint conditions, and uses them as input to the mechanical arm control system to complete the grasping task. This embodiment offers: 1, robust grasping; 2, good handling of the shape uncertainty encountered during grasping. The method needs little data and uses only one video camera, so its cost is low and it has good practicability.
Mechanical arm grasping involves many fields, including mechanics, control, computing, and artificial intelligence. The present invention can therefore be applied in areas such as office automation, vending, logistics and warehouse handling, individual workshop machining, robot teaching, and industrial production, and can complete tasks such as logistics sorting and carrying, component assembly, car surface polishing, 3D printing, laser engraving, and circuit board welding.
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and the specific implementation of the invention is not to be considered limited to these descriptions. Those skilled in the art to which the present invention belongs may make several equivalent substitutions or obvious modifications of identical performance or use without departing from the concept of the invention, and all such substitutions and modifications should be considered to fall within the protection scope of the present invention.
Claims (10)
1. A mechanical arm grasping method, characterized by comprising the following steps:
S1, acquiring point cloud information of the surface of an object to be grasped (the surface visible in the camera's field of view);
S2, further processing the point cloud data and extracting, by a grasp planning algorithm, feasible grasp points that satisfy constraint conditions;
S3, taking the grasp points as input to the mechanical arm's inverse kinematics and issuing motion control instructions to the mechanical arm and a two-finger parallel gripper;
S4, the mechanical arm executing the motion control instructions and moving to the designated position, and then, following the motion timing in the motion control instructions, the two-finger parallel gripper opening and closing to complete the grasping task;
wherein step S2 comprises:
S2a, computing the average coordinates of all data points as the centroid of the object;
S2b, computing the coordinates of all data points relative to the centroid;
S2c, substituting all coordinates into preset constraint conditions to obtain the set of data points that satisfy them;
S2d, Gaussian-filtering the coordinates of all points that satisfy the constraints to obtain the correlation coefficient between every two data points;
S2e, sorting the filtered results in descending order of uncertainty and choosing the pair of grasp points with the smallest uncertainty.
2. The mechanical arm grasping method according to claim 1, characterized in that the communication functions between the RGB-D observation camera, the mechanical arm, and the two-finger manipulator are initially configured.
3. The mechanical arm grasping method according to claim 2, characterized in that, in S2d, the correlation between two data points is represented by a covariance matrix, and the concrete computation is carried out with kernel functions.
4. The mechanical arm grasping method according to claim 3, characterized in that the covariance is computed as a linear combination of kernel functions:
cov(xi, xj) = ω·kG(xi, xj) + (1 − ω)·kT(xi, xj)
where ω is any positive number between 0 and 1 and xi, xj are distance values,
Gaussian kernel function: kG(xi, xj) = exp(−||xi − xj||² / (2σ²)),
thin-plate kernel function: kT(xi, xj) = 2||xi − xj||³ − 3||xi − xj||²,
and the result of cov(xi, xj) serves as the measure of shape uncertainty of the invention, i.e. the quantification of the uncertainty.
5. The mechanical arm grasping method according to claim 3, characterized in that step S2 further comprises: in step S3, after a pair of feasible grasp points is obtained, according to the calibrated pose relation between the Kinect camera, the mechanical arm, and the two-finger manipulator, converting the coordinates of the feasible pair of grasp points into the pose instruction to which the mechanical arm should move and the control instructions for when the two-finger manipulator should open and close, and sending them to the mechanical arm and the two-finger manipulator respectively.
6. The mechanical arm grasping method according to claim 1, characterized in that the constraint conditions in step S2 comprise: a hand constraint, a grasp stability constraint, and a grasped-object physical constraint.
7. The mechanical arm grasping method according to claim 6, characterized in that the hand constraint arises from the limits of each manipulator's own mechanical structure, and the grasp stability constraint requires that the friction generated by the grasp can support the gravity of the grasped object and that the object is not touched prematurely.
8. The mechanical arm grasping method according to claim 6, characterized in that the feasible grasp points satisfying the constraint conditions are sorted by shape uncertainty from low to high, and only the pair of grasp points with the lowest shape uncertainty is taken.
9. The mechanical arm grasping method according to claim 8, characterized in that, in step S2d, during the Gaussian filtering, each three-dimensional data point (x, y, z) is mapped by a kernel function to the four-dimensional space (d, nx, ny, nz), where d denotes the distance between two data points, and the shape uncertainty is quantified on this basis.
10. A mechanical arm grasping system, characterized by comprising an RGB-D observation camera, a central controller, a mechanical arm, and a two-finger parallel gripper, wherein a program stored in the central controller controls the RGB-D observation camera, the mechanical arm, and the two-finger parallel gripper to execute the method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810736694.3A CN108858193B (en) | 2018-07-06 | 2018-07-06 | Mechanical arm grabbing method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108858193A true CN108858193A (en) | 2018-11-23 |
CN108858193B CN108858193B (en) | 2020-07-03 |
Family
ID=64299559
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810736694.3A Active CN108858193B (en) | 2018-07-06 | 2018-07-06 | Mechanical arm grabbing method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108858193B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3876234B2 (en) * | 2003-06-17 | 2007-01-31 | ファナック株式会社 | Connector gripping device, connector inspection system and connector connection system equipped with the same |
CN102527643A (en) * | 2010-12-31 | 2012-07-04 | 东莞理工学院 | Sorting manipulator structure and product sorting system |
CN104048607A (en) * | 2014-06-27 | 2014-09-17 | 上海朗煜电子科技有限公司 | Visual identification and grabbing method of mechanical arms |
WO2015119838A2 (en) * | 2014-02-04 | 2015-08-13 | Microsoft Technology Licensing, Llc | Controlling a robot in the presence of a moving object |
JP2017061032A (en) * | 2011-03-23 | 2017-03-30 | エスアールアイ インターナショナルSRI International | High performance remote manipulator system |
US20170326728A1 (en) * | 2016-05-11 | 2017-11-16 | X Development Llc | Generating a grasp pose for grasping of an object by a grasping end effector of a robot |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110271000A (en) * | 2019-06-18 | 2019-09-24 | 清华大学深圳研究生院 | An object grasping method based on elliptical surface contact |
CN110103231A (en) * | 2019-06-18 | 2019-08-09 | 王保山 | An accurate grasping method and system for a mechanical arm |
CN110271000B (en) * | 2019-06-18 | 2020-09-22 | 清华大学深圳研究生院 | Object grabbing method based on elliptical surface contact |
CN111784218A (en) * | 2019-08-15 | 2020-10-16 | 北京京东乾石科技有限公司 | Method and apparatus for processing information |
CN110580725A (en) * | 2019-09-12 | 2019-12-17 | 浙江大学滨海产业技术研究院 | Box sorting method and system based on RGB-D camera |
CN110842984A (en) * | 2019-11-22 | 2020-02-28 | 江苏铁锚玻璃股份有限公司 | Power mechanical arm with radiation resistance and high-precision positioning operation |
CN111112885A (en) * | 2019-11-26 | 2020-05-08 | 福尼斯智能装备(珠海)有限公司 | Welding system with vision system for feeding and discharging workpieces and self-adaptive positioning of welding seams |
CN111216124B (en) * | 2019-12-02 | 2020-11-06 | 广东技术师范大学 | Robot vision guiding method and device based on integration of global vision and local vision |
CN111216124A (en) * | 2019-12-02 | 2020-06-02 | 广东技术师范大学 | Robot vision guiding method and device based on integration of global vision and local vision |
CN112589795A (en) * | 2020-12-04 | 2021-04-02 | 中山大学 | Vacuum chuck mechanical arm grabbing method based on uncertainty multi-frame fusion |
CN113305847A (en) * | 2021-06-10 | 2021-08-27 | 上海大学 | Building 3D printing mobile mechanical arm station planning method and system |
CN113500017A (en) * | 2021-07-16 | 2021-10-15 | 上海交通大学烟台信息技术研究院 | Intelligent system and method for sorting materials in unstructured scene |
CN113500017B (en) * | 2021-07-16 | 2023-08-25 | 上海交通大学烟台信息技术研究院 | Intelligent system and method for sorting materials in unstructured scene |
CN113771045A (en) * | 2021-10-15 | 2021-12-10 | 广东工业大学 | Vision-guided high-adaptability positioning and grabbing method for middle frame of right-angle robot mobile phone |
CN117549338A (en) * | 2024-01-09 | 2024-02-13 | 北京李尔现代坦迪斯汽车***有限公司 | Grabbing robot for automobile cushion production workshop |
CN117549338B (en) * | 2024-01-09 | 2024-03-29 | 北京李尔现代坦迪斯汽车***有限公司 | Grabbing robot for automobile cushion production workshop |
Also Published As
Publication number | Publication date |
---|---|
CN108858193B (en) | 2020-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108858193A (en) | Mechanical arm grasping method and system | |
CN108453743B (en) | Mechanical arm grabbing method | |
CN107160364B (en) | Industrial robot teaching system and method based on machine vision | |
Morales et al. | Integrated grasp planning and visual object localization for a humanoid robot with five-fingered hands | |
CN108972494A (en) | A humanoid manipulator grasping control system and its data processing method | |
CN105598987B (en) | Determination of a gripping space for an object by means of a robot | |
Eppner et al. | Grasping unknown objects by exploiting shape adaptability and environmental constraints | |
Kita et al. | Clothes handling based on recognition by strategic observation | |
Suzuki et al. | Grasping of unknown objects on a planar surface using a single depth image | |
CN110605711B (en) | Method, device and system for controlling cooperative robot to grab object | |
CN110271000A (en) | An object grasping method based on elliptical surface contact | |
Jiang et al. | Learning hardware agnostic grasps for a universal jamming gripper | |
Nagata et al. | Picking up an indicated object in a complex environment | |
Mavrakis et al. | Task-relevant grasp selection: A joint solution to planning grasps and manipulative motion trajectories | |
Tsarouchi et al. | Vision system for robotic handling of randomly placed objects | |
Bierbaum et al. | Grasp affordances from multi-fingered tactile exploration using dynamic potential fields | |
CN117103277A (en) | Mechanical arm sensing method based on multi-mode data fusion | |
Schiebener et al. | Discovery, segmentation and reactive grasping of unknown objects | |
CN113894774A (en) | Robot grabbing control method and device, storage medium and robot | |
CN113681565A (en) | Man-machine cooperation method and device for realizing article transfer between robots | |
Lin et al. | Vision based object grasping of industrial manipulator | |
TW201914782A (en) | Holding position and posture instruction apparatus, holding position and posture instruction method, and robot system | |
CN115861780B (en) | Robot arm detection grabbing method based on YOLO-GGCNN | |
Lopez et al. | Taichi algorithm: Human-like arm data generation applied on non-anthropomorphic robotic manipulators for demonstration | |
Kawasaki et al. | Virtual robot teaching for humanoid hand robot using muti-fingered haptic interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||