CN109015653A - Grasping control method, device, storage medium and electronic equipment - Google Patents

Grasping control method, device, storage medium and electronic equipment

Info

Publication number
CN109015653A
Authority
CN
China
Prior art keywords
depth
mechanical arm
value
field
grasping mechanism
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811003915.2A
Other languages
Chinese (zh)
Inventor
李小亮
李慧
王宁
贾洁
肜瑶
王鸿运
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huanghe Science and Technology College
Original Assignee
Huanghe Science and Technology College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huanghe Science and Technology College filed Critical Huanghe Science and Technology College
Priority to CN201811003915.2A priority Critical patent/CN109015653A/en
Publication of CN109015653A publication Critical patent/CN109015653A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the present invention provides a grasping control method, device, storage medium and electronic equipment. The grasping control method comprises the following steps: when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by a depth camera arranged on a mechanical arm; obtaining, from the depth photo, the values of a plurality of preset characteristic parameters; querying a pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo; and controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object. The present invention has the beneficial effects of improving grasping efficiency, flexibility and accuracy.

Description

Grasping control method, device, storage medium and electronic equipment
Technical field
The present invention relates to the field of communication technology, and in particular to a grasping control method, device, storage medium and electronic equipment.
Background technique
Existing industrial robots can only grasp objects at fixed positions; once the position of an object changes, they cannot adapt their own control program to grasp it.
Therefore, the prior art is deficient and needs to be improved.
Summary of the invention
The embodiments of the present application provide a grasping control method, device, storage medium and electronic equipment, which can improve the flexibility of object grasping.
The embodiments of the present application provide a grasping control method, comprising the following steps:
when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by a depth camera arranged on a mechanical arm;
obtaining, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo;
querying a pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm;
obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo;
controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
In the grasping control method of the present invention, before the step of querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the depth camera of the mechanical arm, the method further comprises: determining the information of the object to be grasped according to the depth photo;
the step of querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the depth camera of the mechanical arm comprises:
querying the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group;
querying, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm.
In the grasping control method of the present invention, the step of obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo comprises:
extracting the object from the depth photo to obtain a depth map of the object;
obtaining the distance between the object and the grasping mechanism according to the depth information in the depth map.
In the grasping control method of the present invention, a torque sensor is provided at the grasping mechanism at the front end of the mechanical arm; the step of controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value so as to grasp the object comprises:
generating a control signal according to the relative positional relationship, the distance value and the information of the object to be grasped, the control signal being used to control the mechanical arm to move to the corresponding position and to control the grasping mechanism to grasp the object to be grasped with a corresponding grasping force.
In the grasping control method of the present invention, the depth camera is arranged at the center of the grasping mechanism.
A grasping control device, comprising:
a first obtaining module, configured to obtain, when a trigger signal is detected, a depth photo of the object to be grasped collected by a depth camera arranged on a mechanical arm;
a second obtaining module, configured to obtain, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo;
a third obtaining module, configured to query a pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm;
a fourth obtaining module, configured to obtain the distance between the object and the grasping mechanism according to the depth information carried by the depth photo;
a control module, configured to control the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
In the grasping control device of the present invention, the device further comprises: a judgment module, configured to determine the information of the object to be grasped according to the depth photo;
the third obtaining module comprises:
a first query unit, configured to query the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group;
a second query unit, configured to query, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm.
In the grasping control device of the present invention, the fourth obtaining module is configured to extract the object from the depth photo to obtain a depth map of the object, and to obtain the distance between the object and the grasping mechanism according to the depth information in the depth map.
A storage medium, in which a computer program is stored; when the computer program is run on a computer, the computer is caused to execute the method described in any of the above embodiments.
Electronic equipment, comprising a processor and a memory; a computer program is stored in the memory, and the processor is configured to execute the method described in any of the above embodiments by calling the computer program stored in the memory.
From the above, in the present invention, when a trigger signal is detected, a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm is obtained; the values of a plurality of preset characteristic parameters are obtained from the depth photo; a pre-established model database is queried according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; the distance between the object and the grasping mechanism is obtained according to the depth information carried by the depth photo; and the mechanical arm is controlled, according to the relative positional relationship and the distance value, to move the grasping mechanism to the corresponding position so as to grasp the object. The robot can therefore still grasp the object accurately when the position of the object changes, which has the beneficial effect of improving the accuracy and flexibility of grasping.
Detailed description of the invention
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can also be obtained from these drawings without creative effort.
Fig. 1 is a flow chart of the grasping control method in some embodiments of the invention.
Fig. 2 is a structure chart of the grasping control device in some embodiments of the invention.
Fig. 3 is the structure chart of the electronic equipment in some embodiments of the invention.
Specific embodiment
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the accompanying drawings are exemplary, are only used to explain the present application, and should not be construed as limiting the present application.
In the description of the present application, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise", are based on the orientations or positional relationships shown in the drawings, are only intended to facilitate and simplify the description of the present application, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present application. In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of the technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "plurality" means two or more, unless specifically defined otherwise.
In the description of the present application, it should be noted that, unless otherwise expressly specified and limited, the terms "installed", "connected to" and "connected" should be understood in a broad sense; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection, an electrical connection or mutual communication; it may be a direct connection or an indirect connection through an intermediate medium; and it may be an internal communication between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present application can be understood according to the specific circumstances.
In the present application, unless otherwise expressly specified and limited, a first feature being "on" or "under" a second feature may include the first and second features being in direct contact, or the first and second features being in contact not directly but through another feature between them. Moreover, a first feature being "on", "above" or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the level of the first feature is higher than that of the second feature. A first feature being "under", "below" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the level of the first feature is lower than that of the second feature.
The following disclosure provides many different embodiments or examples for realizing different structures of the present application. In order to simplify the disclosure of the present application, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present application. In addition, the present application may repeat reference numerals and/or reference letters in different examples; such repetition is for the purpose of simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, the present application provides examples of various specific processes and materials, but those of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials.
The terms "first", "second", "third" and the like (if present) in the description and claims of the present application and in the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that objects so described are interchangeable where appropriate. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, terminal or system that comprises a series of steps, or a series of modules or units, is not necessarily limited to those steps, modules or units that are expressly listed, but may also include steps, modules or units that are not expressly listed or that are inherent to such process, method, apparatus, terminal or system.
Referring to Fig. 1, Fig. 1 is a flow chart of a grasping control method in some embodiments of the present invention. The method comprises the following steps:
S101: when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm.
When the trigger signal is detected, the light source module needs to be turned on to provide fill light, so that the depth camera captures a brighter and clearer depth photo. The depth photo contains not only information such as the color and shape of the object, but also the depth information of the object, and this depth information can be used to obtain the distance value. The depth camera is arranged at the center of the grasping mechanism.
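As an illustration of this step, a minimal Python sketch is given below; `trigger`, `light_source` and `depth_camera` are hypothetical driver objects standing in for whatever hardware interfaces are actually used, not APIs defined by this application.

```python
def acquire_depth_photo(trigger, light_source, depth_camera):
    """Wait for a trigger signal, turn on the fill light, and capture one depth photo.

    `trigger`, `light_source` and `depth_camera` are hypothetical driver objects;
    the real interfaces depend on the hardware used.
    """
    trigger.wait()                      # block until the trigger signal is detected
    light_source.on()                   # fill light so the photo is brighter and clearer
    try:
        photo = depth_camera.capture()  # returns a color image plus per-pixel depth
    finally:
        light_source.off()
    return photo
```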
S102: obtaining, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo.
The characteristic parameters may be size parameters such as the length, width and height of the object, shape parameters, color parameters and the like. Since different objects often have different features, in order to extract the characteristic parameters that best represent the features of the object, the information of the object must be taken into account when the characteristic parameters are preset; that is, a plurality of parameters are selected as characteristic parameters according to the preset object information. When the object information is known, the characteristic parameters to be extracted from the depth photo can be selected directly from that information; once it has been determined which characteristic parameter values are to be extracted, the values of those characteristic parameters can be extracted from the depth photo quickly.
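The following Python sketch shows one possible way to measure such characteristic parameter values from the color part of the depth photo using OpenCV; the particular features (pixel width and height, aspect ratio, mean color) and the Otsu-threshold segmentation are illustrative choices, not parameters prescribed by this application.

```python
import cv2
import numpy as np

def extract_feature_values(color_img, feature_names):
    """Extract preset characteristic parameter values (size, shape, color) from the
    color part of the depth photo.  Rough sketch: segment the largest object, then
    measure it; `feature_names` selects which values are returned."""
    gray = cv2.cvtColor(color_img, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    obj = max(contours, key=cv2.contourArea)           # assume the object is the largest blob

    x, y, w, h = cv2.boundingRect(obj)
    mask = np.zeros(gray.shape, dtype=np.uint8)
    cv2.drawContours(mask, [obj], -1, 255, -1)
    mean_color = cv2.mean(color_img, mask=mask)[:3]    # BGR color parameter

    all_features = {
        "width_px": w, "height_px": h,                 # size parameters (in pixels)
        "aspect_ratio": w / float(h),                  # a simple shape parameter
        "mean_b": mean_color[0], "mean_g": mean_color[1], "mean_r": mean_color[2],
    }
    return {name: all_features[name] for name in feature_names}
```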
In some embodiments, when the object to be grasped is an unknown object, before step S102 is executed, the following also needs to be executed: obtaining the information of the object to be grasped according to the depth photo. Image recognition technology can be used to identify the information of the object, or the information can be entered manually.
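A minimal sketch of this branch, assuming a hypothetical `classifier` object with a `predict` method for the image-recognition case and manual entry as the fallback:

```python
def identify_object_info(color_img, classifier=None):
    """Determine the information (type) of the object to be grasped.
    `classifier` is a hypothetical image-recognition model with a `predict`
    method; when none is available, fall back to manual entry."""
    if classifier is not None:
        return classifier.predict(color_img)   # e.g. "bottle", "box", ...
    return input("Enter the type of the object to be grasped: ")
```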
S103: querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm.
In this step, the model database is the model database established by the robot learning method in the above embodiment. To improve efficiency, in some embodiments, step S103 includes:
S1031: querying the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group; S1032: querying, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm.
The relative positional relationship includes the distance value and the relative bearing relationship between the depth camera of the mechanical arm, or the grasping mechanism at the front end of the mechanical arm, and the object to be grasped.
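One way such a two-stage lookup could be organised is sketched below; the dictionary-based `model_db` layout and the Euclidean nearest-neighbour match over feature values are assumptions made for illustration, not the database structure defined by this application.

```python
import math

def query_relative_position(model_db, object_info, feature_values):
    """Two-stage lookup sketch: first select the relationship model group for this
    object type (S1031), then pick the relationship model whose stored feature
    values are closest to the measured ones (S1032).  `model_db` is assumed to map
    an object type to a list of {"features": {...}, "pose": {...}} entries built
    beforehand by the robot learning method."""
    model_group = model_db[object_info]            # S1031: relationship model group

    def feature_distance(model):                   # similarity in feature space
        return math.sqrt(sum(
            (model["features"][k] - feature_values[k]) ** 2 for k in feature_values))

    best = min(model_group, key=feature_distance)  # S1032: closest relationship model
    return best["pose"]                            # e.g. {"bearing_deg": ..., "offset_m": ...}
```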
S104: obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo.
In this step, the object is first extracted from the depth photo to obtain the depth map of the object; the distance between the object and the grasping mechanism is then obtained according to the depth information in the depth map. How to calculate a distance value from depth information is an existing algorithm. However, since there may be a positional difference between the depth camera and the grasping mechanism, data compensation is required.
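The following sketch shows the distance computation with the data compensation mentioned above; the median over the masked depth pixels and the fixed, pre-calibrated `camera_to_gripper_offset_m` are illustrative assumptions.

```python
import numpy as np

def object_distance(depth_map, object_mask, camera_to_gripper_offset_m=0.0):
    """Estimate the distance between the object and the grasping mechanism.
    `depth_map` is the per-pixel depth of the photo (in metres) and `object_mask`
    is the binary mask of the extracted object.  The offset term is the data
    compensation mentioned above: the depth camera and the gripper are not at
    exactly the same position, so a fixed, pre-calibrated offset (an assumed
    value here) is added to the camera-measured distance."""
    object_depths = depth_map[object_mask > 0]
    object_depths = object_depths[object_depths > 0]   # drop invalid readings
    camera_distance = float(np.median(object_depths))  # median is robust to outliers
    return camera_distance + camera_to_gripper_offset_m
```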
S105: controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
In this step, a control signal is generated according to the relative positional relationship and the distance value. The control signal contains the angle and distance data by which the mechanical arm needs to move or rotate, so that the mechanical arm can move the grasping mechanism to the object to be grasped.
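A simplified sketch of turning the relative positional relationship and the distance value into a motion command is shown below; `arm.rotate` and `arm.advance` are hypothetical driver calls, and a real controller would perform inverse kinematics rather than this two-step motion.

```python
def move_gripper_to_object(arm, relative_pose, distance_m):
    """Turn the relative positional relationship and distance value into a motion
    command.  `arm` is a hypothetical robot-arm driver; `relative_pose` carries
    the bearing of the object relative to the gripper as obtained from the model
    database."""
    control_signal = {
        "rotate_deg": relative_pose["bearing_deg"],  # angle the arm must rotate
        "advance_m": distance_m,                     # distance the gripper must travel
    }
    arm.rotate(control_signal["rotate_deg"])
    arm.advance(control_signal["advance_m"])
    return control_signal
```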
In some embodiments, a torque sensor is provided at the grasping mechanism at the front end of the mechanical arm. Step S105 then includes: generating a control signal according to the relative positional relationship, the distance value and the information of the object to be grasped, the control signal being used to control the mechanical arm to move to the corresponding position and to control the grasping mechanism to grasp the object to be grasped with a corresponding grasping force. During grasping, the torque sensor feeds data back to the electronic equipment in real time. Because different types of objects differ in strength and weight, the grasping force to be applied also differs; this both avoids damaging the object and ensures that the object does not fall.
Specifically, the step of generating a control signal according to the relative positional relationship and the information of the object to be grasped includes: obtaining the weight information, shape information and material information of the object according to the information of the object; calculating a specified grasping force according to the weight information, shape information and material information of the object;
generating movement control parameter information according to the relative positional relationship, and generating a control signal according to the movement control parameter information and the specified grasping force, so that the mechanical arm moves the grasping mechanism to the corresponding position and grasps the object with the specified grasping force.
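The force-controlled grasp could be sketched as follows; the force table indexed by material and weight class, and the `gripper`/`torque_sensor` interfaces, are illustrative assumptions rather than values or APIs given by this application.

```python
def grasp_with_force_feedback(gripper, torque_sensor, object_info):
    """Sketch of the force-controlled grasp.  The target grasping force is chosen
    from the object's material and weight class (the table below is illustrative,
    not from the patent), and the torque sensor readings are fed back in real time
    so the gripper neither damages nor drops the object."""
    # illustrative force table, in newtons, indexed by (material, weight class)
    force_table_n = {("glass", "light"): 5.0, ("metal", "heavy"): 40.0}
    target_force = force_table_n.get(
        (object_info["material"], object_info["weight_class"]), 15.0)

    gripper.close()
    while torque_sensor.read() < target_force:   # tighten until the target force is
        gripper.tighten(step=0.5)                # reached, using live torque feedback
    return target_force
```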
From the above, in the present invention, when a trigger signal is detected, a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm is obtained; the values of a plurality of preset characteristic parameters are obtained from the depth photo; a pre-established model database is queried according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; the distance between the object and the grasping mechanism is obtained according to the depth information carried by the depth photo; and the mechanical arm is controlled, according to the relative positional relationship and the distance value, to move the grasping mechanism to the corresponding position so as to grasp the object. The robot can therefore still grasp the object accurately when the position of the object changes, which has the beneficial effect of improving the accuracy and flexibility of grasping.
Referring to Fig. 2, Fig. 2 is a structure chart of a grasping control device in some embodiments of the present invention. The device comprises: a first obtaining module 201, a second obtaining module 202, a third obtaining module 203, a fourth obtaining module 204 and a control module 205.
The first obtaining module 201 is configured to obtain, when a trigger signal is detected, a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm. When the trigger signal is detected, the light source module needs to be turned on to provide fill light, so that the depth camera captures a brighter and clearer depth photo.
The second obtaining module 202 is configured to obtain, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo. The characteristic parameters may be size parameters such as the length, width and height of the object, shape parameters, color parameters and the like. Since different objects often have different features, in order to extract the characteristic parameters that best represent the features of the object, the information of the object must be taken into account when the characteristic parameters are preset; that is, a plurality of parameters are selected as characteristic parameters according to the preset object type information. When the object information is known, the characteristic parameters to be extracted from the depth photo can be selected directly from that information; once it has been determined which characteristic parameter values are to be extracted, the values of those characteristic parameters can be extracted from the depth photo quickly.
The third obtaining module 203 is configured to query the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the depth camera of the mechanical arm. The model database is the model database established by the robot learning method in the above embodiment. To improve efficiency, in some embodiments, the third obtaining module 203 includes: a first query unit, configured to query the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group; and a second query unit, configured to query, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm. The relative positional relationship includes the distance value and the relative bearing relationship between the depth camera of the mechanical arm, or the grasping mechanism at the front end of the mechanical arm, and the object to be grasped.
The fourth obtaining module 204 is configured to obtain the distance between the object and the grasping mechanism according to the depth information carried by the depth photo. The object is first extracted from the depth photo to obtain the depth map of the object; the distance between the object and the grasping mechanism is then obtained according to the depth information in the depth map. How to calculate a distance value from depth information is an existing algorithm. However, since there may be a positional difference between the depth camera and the grasping mechanism, data compensation is required.
The control module 205 is configured to control the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object. A control signal is generated according to the relative positional relationship and the distance value; the control signal contains the angle and distance data by which the mechanical arm needs to move or rotate, so that the mechanical arm can move the grasping mechanism to the object to be grasped.
In some embodiments, a torque sensor is provided at the grasping mechanism at the front end of the mechanical arm. The control module 205 is then configured to: generate a control signal according to the relative positional relationship and the information of the object to be grasped, the control signal being used to control the mechanical arm to move to the corresponding position and to control the grasping mechanism to grasp the object to be grasped with a corresponding grasping force. During grasping, the torque sensor feeds data back to the electronic equipment in real time. Because different types of objects differ in strength and weight, the grasping force to be applied also differs; this both avoids damaging the object and ensures that the object does not fall.
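As a structural illustration of how these modules cooperate, the device can be viewed as a composition of five callables, as in the following sketch; the constructor arguments are placeholders for concrete implementations such as the functions sketched earlier, not classes defined by this application.

```python
class GraspControlDevice:
    """Composition sketch of the grasping control device: each module from the
    description above is supplied as a callable at construction time."""

    def __init__(self, acquire, extract, query, measure, control):
        self.first_obtain = acquire    # depth photo acquisition
        self.second_obtain = extract   # characteristic parameter values
        self.third_obtain = query      # model-database lookup
        self.fourth_obtain = measure   # distance from depth information
        self.control = control         # arm / gripper motion and grasp

    def run_once(self, trigger_context):
        photo = self.first_obtain(trigger_context)
        features = self.second_obtain(photo)
        pose = self.third_obtain(features)
        distance = self.fourth_obtain(photo)
        return self.control(pose, distance)
```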
From the above, in the present invention, when a trigger signal is detected, a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm is obtained; the values of a plurality of preset characteristic parameters are obtained from the depth photo; a pre-established model database is queried according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; the distance between the object and the grasping mechanism is obtained according to the depth information carried by the depth photo; and the mechanical arm is controlled, according to the relative positional relationship and the distance value, to move the grasping mechanism to the corresponding position so as to grasp the object. The robot can therefore still grasp the object accurately when the position of the object changes, which has the beneficial effect of improving the accuracy and flexibility of grasping.
The embodiment of the present invention also provides a storage medium in which a computer program is stored. When the computer program is run on a computer, the computer executes the grasping control method described in any of the above embodiments, so as to realize the following functions: when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm; obtaining, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo; querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo; and controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
Referring to Fig. 3, the embodiment of the present invention also provides electronic equipment. The electronic equipment may be a smartphone, a tablet computer or a similar device. As shown in the figure, the electronic equipment 300 includes a processor 301 and a memory 302, and the processor 301 is electrically connected to the memory 302. The processor 301 is the control center of the electronic equipment 300; it uses various interfaces and lines to connect the various parts of the whole terminal, and executes the various functions of the terminal and processes data by running or calling the computer program stored in the memory 302 and calling the data stored in the memory 302, so as to monitor the terminal as a whole.
In this embodiment, the processor 301 in the electronic equipment 300 loads the instructions corresponding to the processes of one or more computer programs into the memory 302 according to the following steps, and runs the computer program stored in the memory 302, thereby realizing various functions: when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm; obtaining, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo; querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo; and controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
The memory 302 can be used to store computer programs and data. The computer programs stored in the memory 302 contain instructions executable by the processor and may form various functional modules. The processor 301 executes the grasping control method described above by calling the computer programs stored in the memory 302.
From the above, in the present invention, when a trigger signal is detected, a depth photo of the object to be grasped collected by the depth camera arranged on the mechanical arm is obtained; the values of a plurality of preset characteristic parameters are obtained from the depth photo; a pre-established model database is queried according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm; the distance between the object and the grasping mechanism is obtained according to the depth information carried by the depth photo; and the mechanical arm is controlled, according to the relative positional relationship and the distance value, to move the grasping mechanism to the corresponding position so as to grasp the object. The robot can therefore still grasp the object accurately when the position of the object changes, which has the beneficial effect of improving the accuracy and flexibility of grasping.
It should be noted that those of ordinary skill in the art will understand that all or part of the steps in the various methods of the above embodiments can be completed by instructing relevant hardware through a program, and the program may be stored in a computer-readable storage medium, which may include, but is not limited to, a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disk, and the like.
The robot learning method, grasping control method, device, storage medium and electronic equipment provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method of the present application and its core idea. Meanwhile, for those skilled in the art, there will be changes in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A grasping control method, characterized by comprising the following steps:
when a trigger signal is detected, obtaining a depth photo of the object to be grasped collected by a depth camera arranged on a mechanical arm;
obtaining, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo;
querying a pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm;
obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo;
controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
2. The grasping control method according to claim 1, characterized in that before the step of querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the depth camera of the mechanical arm, the method further comprises: determining the information of the object to be grasped according to the depth photo;
the step of querying the pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the depth camera of the mechanical arm comprises:
querying the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group;
querying, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm.
3. The grasping control method according to claim 2, characterized in that the step of obtaining the distance between the object and the grasping mechanism according to the depth information carried by the depth photo comprises:
extracting the object from the depth photo to obtain a depth map of the object;
obtaining the distance between the object and the grasping mechanism according to the depth information in the depth map.
4. The grasping control method according to claim 3, wherein a torque sensor is provided at the grasping mechanism at the front end of the mechanical arm; characterized in that the step of controlling the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value so as to grasp the object comprises:
generating a control signal according to the relative positional relationship, the distance value and the information of the object to be grasped, the control signal being used to control the mechanical arm to move to the corresponding position and to control the grasping mechanism to grasp the object to be grasped with a corresponding grasping force.
5. The grasping control method according to claim 1, characterized in that the depth camera is arranged at the center of the grasping mechanism.
6. A grasping control device, characterized by comprising:
a first obtaining module, configured to obtain, when a trigger signal is detected, a depth photo of the object to be grasped collected by a depth camera arranged on a mechanical arm;
a second obtaining module, configured to obtain, from the depth photo, the values of a plurality of preset characteristic parameters in the depth photo;
a third obtaining module, configured to query a pre-established model database according to the plurality of characteristic parameter values to obtain the relative positional relationship between the object and the grasping mechanism of the mechanical arm;
a fourth obtaining module, configured to obtain the distance between the object and the grasping mechanism according to the depth information carried by the depth photo;
a control module, configured to control the mechanical arm to move the grasping mechanism to the corresponding position according to the relative positional relationship and the distance value, so as to grasp the object.
7. The grasping control device according to claim 6, characterized by further comprising: a judgment module, configured to determine the information of the object to be grasped according to the depth photo;
the third obtaining module comprises:
a first query unit, configured to query the pre-established model database according to the information of the object to be grasped to obtain a corresponding relationship model group;
a second query unit, configured to query, from the plurality of relationship models in the relationship model group according to the plurality of characteristic parameter values, the relative positional relationship between the object to be grasped and the depth camera of the mechanical arm.
8. The grasping control device according to claim 6, characterized in that the fourth obtaining module is configured to extract the object from the depth photo to obtain a depth map of the object, and to obtain the distance between the object and the grasping mechanism according to the depth information in the depth map.
9. A storage medium, characterized in that a computer program is stored in the storage medium, and when the computer program is run on a computer, the computer is caused to execute the method according to any one of claims 1 to 5.
10. Electronic equipment, characterized by comprising a processor and a memory, wherein a computer program is stored in the memory, and the processor is configured to execute the method according to any one of claims 1 to 5 by calling the computer program stored in the memory.
CN201811003915.2A 2018-08-30 2018-08-30 Grasping control method, device, storage medium and electronic equipment Pending CN109015653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811003915.2A CN109015653A (en) 2018-08-30 2018-08-30 Grasping control method, device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811003915.2A CN109015653A (en) 2018-08-30 2018-08-30 Grasping control method, device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN109015653A true CN109015653A (en) 2018-12-18

Family

ID=64625721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811003915.2A Pending CN109015653A (en) 2018-08-30 2018-08-30 Grasping control method, device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109015653A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07319525A (en) * 1994-05-25 1995-12-08 Nippondenso Co Ltd High-speed picking device for piled parts
CN103706568A (en) * 2013-11-26 2014-04-09 中国船舶重工集团公司第七一六研究所 System and method for machine vision-based robot sorting
CN105014677A (en) * 2015-07-07 2015-11-04 西安交通大学 Visual mechanical arm control device and method based on Camshift visual tracking and D-H modeling algorithms
CN108161931A (en) * 2016-12-07 2018-06-15 广州映博智能科技有限公司 The workpiece automatic identification of view-based access control model and intelligent grabbing system
CN106826815A (en) * 2016-12-21 2017-06-13 江苏物联网研究发展中心 Target object method of the identification with positioning based on coloured image and depth image
CN107263468A (en) * 2017-05-23 2017-10-20 陕西科技大学 A kind of SCARA robotic asssembly methods of utilization digital image processing techniques


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181218