CN109760054A - Robot autonomous learning system and robot control method - Google Patents

Robot autonomous learning system and robot control method

Info

Publication number
CN109760054A
CN109760054A (application CN201910088467.9A)
Authority
CN
China
Prior art keywords
workpiece
preset
information
robotic arm
autonomous learning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910088467.9A
Other languages
Chinese (zh)
Inventor
冉祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Liangjiang Micro Chain Intelligent Technology Co Ltd
Original Assignee
Chongqing Liangjiang Micro Chain Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Liangjiang Micro Chain Intelligent Technology Co Ltd filed Critical Chongqing Liangjiang Micro Chain Intelligent Technology Co Ltd
Priority to CN201910088467.9A priority Critical patent/CN109760054A/en
Publication of CN109760054A publication Critical patent/CN109760054A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

An embodiment of the present invention provides a robot autonomous learning system and a robot control method. The robot autonomous learning system comprises: an information acquisition unit for obtaining and storing a video of a preset workpiece, together with the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot; an arithmetic unit which, from the data obtained by the information acquisition unit, autonomously learns to recognize the preset workpiece and to simulate the robotic-arm grasping posture; and an output unit which, from the calculation results of the arithmetic unit, forms and exports a model database of the correspondence between robotic-arm grasping postures and preset workpieces, the model database being exported in a portable form. By inputting videos of preset workpieces and simulating the robotic-arm grasping posture on a computer or industrial PC, the invention carries out autonomous learning, improves the learning efficiency of vision-guided robotic arms and saves resources.

Description

Robot autonomous learning system and robot control method
Technical field
The present application relates to the field of communications technology, and in particular to a robot autonomous learning system and a robot control method.
Background technique
Existing industrial robots have become essential equipment for production automation and are applied in every industry of industrialized production. Before being deployed on an automated production line, a current robot system needs to learn the workpieces it is to grasp. In particular, existing vision-guided robotic-arm systems must be trained by technicians before being put into use: sample pictures of the workpiece to be grasped are shot and parameter conditions are set for learning. The more sample pictures and the more learning passes, the more accurately the vision-guided robotic arm grasps. During this process, the generated machine-learning program is stored in the industrial computer system of that robotic arm. Such a learning process has two problems:
(1) Each robot system has to be trained individually before it can be put into production; the machine-learning program stored in the robot's industrial computer system has poor versatility and cannot be transplanted for reuse.
(2) Repeatedly shooting sample pictures involves a great deal of repetitive work and is relatively inefficient.
The prior art is therefore deficient and needs to be improved.
Summary of the invention
The embodiments of the present application provide a robot autonomous learning system and a robot control method. According to an input workpiece video, the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot, a computer or industrial PC simulates the robotic-arm grasping posture and carries out autonomous learning, which saves resources and improves learning efficiency.
An embodiment of the present application provides a robot autonomous learning system, comprising:
an information acquisition unit for obtaining and storing a video of a preset workpiece together with the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot;
an arithmetic unit which, from the obtained data, autonomously learns to recognize the preset workpiece and to simulate the robotic-arm grasping posture;
an output unit which, from the calculation results of the arithmetic unit, forms and exports a model database of the correspondence between robotic-arm grasping postures and preset workpieces, the model database being exported in a portable form.
In the robot autonomous learning system of the present invention, the relative positional relationship includes the distance value and the relative bearing relationship between the front end of the robotic arm and the preset workpiece.
In the robot autonomous learning system of the present invention, the information acquisition unit is also used to obtain information about the preset workpiece;
the output unit establishes relationship model groups according to the calculation results of the arithmetic unit and the information about the preset workpiece, and forms and exports the model database of the correspondence between robotic-arm grasping postures and preset workpieces.
In the robot autonomous learning system of the present invention, the information acquisition unit obtains the video of the preset workpiece by the following steps:
the video comprises multiple frames; the preset workpiece is identified in the multiple frames to obtain screening pictures;
multiple characteristic parameters are set and the values of those characteristic parameters in the screening pictures are obtained.
In the robot autonomous learning system of the present invention, the model database comprises relationship models established from the information about the preset workpiece, the characteristic parameter values of each screening picture, the corresponding relative positional relationship and the simulated robotic-arm grasping posture.
In the robot autonomous learning system of the present invention, identifying the preset workpiece in the multiple frames and obtaining screening pictures comprises the following steps:
preprocessing the acquired frames to remove noise;
obtaining foreground targets using background subtraction while eliminating false foregrounds, and removing shadows using a colour-space method and the direction of the shadows;
setting a movable frame and its movement within the frames and obtaining the image pixels inside the movable frame; identifying the preset workpiece from the image pixels in the movable frame by means of preset characteristic parameters.
A robot control method, characterized by comprising the following steps:
when a workpiece-grasping signal is detected, obtaining the real-time video of the workpiece to be grasped collected by the camera at the front end of the robotic arm;
querying the model database exported in advance according to the real-time video to obtain the relative positional relationship between the workpiece and the front end of the robotic arm, and the robotic-arm grasping posture;
generating a control signal according to the relative positional relationship, the control signal being used to control the robotic arm to grasp the workpiece to be grasped.
By simulating the robotic-arm grasping posture on a computer or industrial PC according to the input workpiece video, the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot, the present invention carries out autonomous learning, improves the learning efficiency of vision-guided robotic arms and saves resources. In addition, by transplanting the model database generated through simulated learning to a robotic-arm system of the same kind and model, fast and accurate grasping of a workpiece to be grasped can be achieved simply by shooting a real-time video of that workpiece, without repeatedly shooting sample pictures, which improves the training efficiency of the robot.
Detailed description of the invention
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a block diagram of the robot autonomous learning system in some embodiments of the invention.
Fig. 2 is a flowchart of the information acquisition unit obtaining the video information of the preset workpiece in some embodiments of the invention.
Fig. 3 is a flowchart of the screening-picture method in some embodiments of the invention.
Fig. 4 is a flowchart of the robot control method in some embodiments of the invention.
Reference numerals: 100 - information acquisition unit, 200 - arithmetic unit, 300 - output unit.
Specific embodiment
Embodiments of the present application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary; they are intended only to explain the present application and should not be understood as limiting it.
In the description of the present application, it is to be understood that orientation or positional terms such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description and do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be understood as limiting the present application. In addition, the terms "first" and "second" are used only for descriptive purposes and should not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features concerned. Accordingly, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "plurality" means two or more unless specifically defined otherwise.
In the description of the present application, it should be noted that, unless otherwise expressly specified and limited, the terms "installation", "connected" and "connection" are to be understood broadly; for example, they may denote a fixed connection, a detachable connection or an integral connection; a mechanical connection, an electrical connection or mutual communication; a direct connection or an indirect connection through an intermediary; or the internal connection between two elements or an interaction between two elements. For those of ordinary skill in the art, the specific meaning of the above terms in this application can be understood according to the particular circumstances.
In this application, unless otherwise expressly specified and limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are not in direct contact but are in contact through another feature between them. Moreover, the first feature being "over", "above" or "on top of" the second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. The first feature being "under", "below" or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for realizing the different structures of the present application. To simplify the disclosure, the components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the present application. In addition, the present application may repeat reference numerals and/or reference letters in different examples; such repetition is for the purposes of simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, the present application provides examples of various specific processes and materials, but those of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials.
The terms "first", "second", "third" and the like (if present) in the description, the claims and the above drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or precedence. It should be understood that objects so described are interchangeable where appropriate. In addition, the terms "comprising" and "having", and any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, apparatus, terminal or system that comprises a series of steps, or a series of modules or units, is not necessarily limited to the steps, modules or units explicitly listed, and may also include steps, modules or units that are not explicitly listed or that are inherent to the process, method, apparatus, terminal or system.
Referring to Fig. 1, Fig. 1 shows the robot autonomous learning system in some embodiments of the invention, comprising:
an information acquisition unit 100 for obtaining and storing a video of a preset workpiece together with the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot.
In practical applications, any camera can be used to shoot the video of the preset workpiece, and the shooting positions can be chosen so that the camera faces the workpiece directly and from different angles. When the video of the preset workpiece is obtained, the robotic-arm information, that is, the name, model and so on of the robotic arm, is obtained at the same time. It can be obtained by manual input; alternatively, the information of robotic arms of different kinds and models can be stored in a database in advance, and each item of robotic-arm information obtained by querying that database. Of course, the method is not limited to this.
The relative positional relationship includes the distance value and the relative bearing relationship between the front end of the robotic arm, or the grasping mechanism at the front end of the robotic arm, and the preset workpiece.
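By way of illustration only (not part of the original disclosure), the following minimal Python sketch shows one way the stored capture record could be organized; all field names, types and example values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RelativePosition:
    """Hypothetical record of the relative position between the gripper
    (front end of the robotic arm) and the preset workpiece."""
    distance_mm: float     # straight-line distance value
    azimuth_deg: float     # horizontal component of the relative bearing
    elevation_deg: float   # vertical component of the relative bearing

@dataclass
class CaptureRecord:
    """What the information acquisition unit could store per training video."""
    video_path: str                     # stored preset-workpiece video
    arm_model: str                      # robotic-arm information (name / model)
    relative_position: RelativePosition

record = CaptureRecord("preset_workpiece_01.mp4", "ARM-MODEL-X",
                       RelativePosition(320.0, 45.0, 10.0))
print(record)
```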
an arithmetic unit 200 which, from the data obtained by the information acquisition unit 100, autonomously learns to recognize the preset workpiece and to simulate the robotic-arm grasping posture.
In practical applications, the arithmetic unit is a computer, an industrial PC or a cloud processor, and the machine-learning method may use algorithms from the prior art. The arithmetic unit simulates, by modelling, the robotic-arm grasping posture and the position and depth information of the preset workpiece in three-dimensional space.
an output unit 300 which, from the calculation results of the arithmetic unit 200, forms and exports the model database of the correspondence between robotic-arm grasping postures and preset workpieces, the model database being exported in a portable form.
In practical applications, the arithmetic unit 200 feeds the result of each optimization calculation to the output unit 300, which forms the model database of the correspondence between robotic-arm grasping postures and preset workpieces. The exported model database has a general data format and can run in the industrial computer system or cloud processor of a robot or robotic arm of the same kind and model as the one used for training. The model database contains relationship models between the different positions and angles of the preset workpiece and the corresponding robotic-arm grasping postures. The data can use a relational data model, which organizes data in the form of record groups or data tables so that the relationships between the various entities and attributes can be stored and transformed conveniently; it is neither hierarchical nor pointer-based, and is a very effective way of organizing the relationships between spatial data and attribute data.
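As an illustrative sketch only, such a portable model database could be realized as a single relational table, for example with SQLite; the table name, column names and example values below are assumptions, not a format defined by the patent.

```python
import sqlite3

# Hypothetical relational layout: each row links one observed position/angle of
# a preset workpiece (plus its characteristic parameter values) to a simulated
# grasping posture.
conn = sqlite3.connect("grasp_models.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS grasp_model (
        workpiece_id   TEXT,
        feature_values TEXT,   -- serialized characteristic parameter values
        distance_mm    REAL,   -- relative distance at capture time
        azimuth_deg    REAL,   -- relative bearing at capture time
        grasp_posture  TEXT    -- serialized joint angles / gripper pose
    )
""")
conn.execute(
    "INSERT INTO grasp_model VALUES (?, ?, ?, ?, ?)",
    ("bolt_M8", '{"length_mm": 40, "color": "silver"}', 320.0, 45.0,
     '{"joints_rad": [0.1, -1.2, 1.5, 0.0, 1.6, 0.0]}'),
)
conn.commit()
conn.close()
```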
In some embodiments, the information acquisition unit is also used to obtain information about the preset workpiece.
Specifically, the relative positional relationship and the information about the preset workpiece comprise the information of the workpiece or its weight information, shape information and material information; according to the weight information, shape information and material information of the workpiece, the arithmetic unit calculates the grasping force with which the robotic arm grasps the preset workpiece.
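A hedged sketch of how such a grasp-force calculation might look; the friction coefficients, the two-jaw contact model and the safety factor are assumptions introduced purely for illustration, not values given in the patent.

```python
FRICTION_BY_MATERIAL = {"steel": 0.45, "plastic": 0.35, "rubber": 0.90}  # assumed values

def required_grasp_force(weight_kg: float, material: str,
                         contact_faces: int = 2, safety_factor: float = 1.5) -> float:
    """Minimum normal force (N) per jaw so that friction supports the workpiece weight."""
    g = 9.81
    mu = FRICTION_BY_MATERIAL.get(material, 0.30)
    return safety_factor * weight_kg * g / (mu * contact_faces)

print(round(required_grasp_force(0.8, "steel"), 1), "N")   # -> 13.1 N
```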
The output unit 300 establishes relationship model groups according to the calculation results of the arithmetic unit 200 and the information about the preset workpiece, and forms and exports the model database of the correspondence between robotic-arm grasping postures and preset workpieces.
In other preferred embodiments, as shown in Fig. 2, the information acquisition unit 100 obtains the video information of the preset workpiece by the following steps:
S101: the video comprises multiple frames; the preset workpiece is identified in the multiple frames to obtain screening pictures.
In this step, the screening pictures containing the preset workpiece are identified among the multiple frames; picture-recognition algorithms from the prior art can be used, and the details are not repeated here.
S102: multiple characteristic parameters are set and the values of those characteristic parameters in the screening pictures are obtained.
In this step, the characteristic parameters can be size parameters such as the length, width and height of the workpiece, shape parameters, colour parameters and so on. Since different workpieces often have different features, the information about the preset workpiece must be consulted when setting the characteristic parameters in order to extract the parameters that best reveal the features of the workpiece; this information is stored in the database of the information acquisition unit 100, and multiple parameters are selected as characteristic parameters according to it. Where the workpiece information is already known, the characteristic parameters to be extracted from the screening pictures can be selected directly from that information; once the characteristic parameters to be extracted have been determined, their values can be extracted quickly from the screening pictures.
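Purely as an illustration (assuming OpenCV, and assuming that the largest connected blob in a screening picture is the workpiece), characteristic parameter values such as bounding-box size, aspect ratio and mean colour could be extracted as follows; the choice of parameters is not prescribed by the patent.

```python
import cv2
import numpy as np

def feature_values(screening_picture: np.ndarray) -> dict:
    """Extract illustrative characteristic parameter values from one screening picture."""
    gray = cv2.cvtColor(screening_picture, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {}
    workpiece = max(contours, key=cv2.contourArea)   # assume the largest blob is the workpiece
    x, y, w, h = cv2.boundingRect(workpiece)
    mean_bgr = cv2.mean(screening_picture, mask=mask)[:3]
    return {"width_px": w, "height_px": h, "aspect_ratio": w / h, "mean_bgr": mean_bgr}
```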
In some preferred embodiments, the model database comprises relationship models established from the information about the preset workpiece, the characteristic parameter values of each screening picture, the corresponding relative positional relationship and the simulated robotic-arm grasping posture. The model database is the model database established in the above embodiments by the robot learning method.
In some preferred embodiments, referring to Fig. 3, step S101, identifying the preset workpiece in the multiple frames and obtaining screening pictures, comprises the following steps:
S1011: preprocess the acquired frames to remove noise.
In this step, a frame-differencing technique can be used to roughly distinguish the contours of the foreground and background of the frames and to remove noise.
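A minimal sketch of such a differencing step, assuming OpenCV; the blur kernel and threshold are illustrative choices.

```python
import cv2

def rough_foreground(prev_frame, frame):
    """Roughly separate moving foreground from background by frame differencing."""
    diff = cv2.absdiff(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    diff = cv2.GaussianBlur(diff, (5, 5), 0)            # suppress pixel noise
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    return mask
```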
S1012: obtain foreground targets using background subtraction while eliminating false foregrounds, and remove shadows using a colour-space method and the direction of the shadows.
In this step, a convolution with a high-pass filter template is applied to the foreground targets to find the boundary contours of the image objects; the image objects are separated according to the continuity and closure of the contours, and false foregrounds are eliminated.
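For illustration, OpenCV's MOG2 background subtractor (with its built-in shadow marking) can stand in for the background-subtraction and shadow-removal steps described above; the minimum-area test is an assumed heuristic for discarding false foregrounds.

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

def foreground_contours(frame, min_area=500):
    """Foreground contours with shadows removed and small false foregrounds discarded."""
    mask = subtractor.apply(frame)
    mask[mask == 127] = 0                               # MOG2 marks detected shadows as 127
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]
```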
S1013: set a movable frame and its movement within the frames and obtain the image pixels inside the movable frame; identify the preset workpiece from the image pixels in the movable frame by means of the preset characteristic parameters.
In this step, a convolution is computed between the image pixels inside the movable frame and the preset characteristic parameters to obtain the screening pictures.
Specifically, the characteristic parameters can be size parameters such as the length, width and height of the workpiece, shape parameters, colour parameters and so on. Since different workpieces often have different features, the information about the preset workpiece must be consulted when setting the characteristic parameters in order to extract the parameters that best reveal the features of the workpiece, and multiple parameters are selected as characteristic parameters according to that information.
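One possible reading of the movable-frame matching, sketched here as normalized cross-correlation template matching with OpenCV; treating the preset characteristic parameters as an image template of the workpiece is an assumption made only for this example.

```python
import cv2

def find_workpiece(frame_region, template, threshold=0.8):
    """Locate the preset-workpiece template inside the movable-frame region."""
    result = cv2.matchTemplate(frame_region, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    if score < threshold:
        return None                                     # preset workpiece not recognised here
    h, w = template.shape[:2]
    return (*top_left, w, h)                            # x, y, width, height within the region
```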
Referring to Fig. 4, Fig. 4 is a flowchart of a robot control method in some embodiments of the invention.
The method comprises the following steps:
S201: when a workpiece-grasping signal is detected, obtain the real-time video of the workpiece to be grasped collected by the camera at the front end of the robotic arm.
When the workpiece-grasping signal is detected, a light-source module can be turned on to provide supplementary lighting.
S202: query the previously transplanted model database according to the real-time video to obtain the relative positional relationship between the workpiece and the front end of the robotic arm, and the robotic-arm grasping posture.
In this step, the model database is the model database established in the above embodiments by the robot learning method.
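Continuing the illustrative SQLite layout sketched earlier, a query of the transplanted model database might look as follows; the matching key and tolerance are assumptions made for this example only.

```python
import sqlite3

def lookup_grasp(conn: sqlite3.Connection, workpiece_id: str,
                 distance_mm: float, tol_mm: float = 20.0):
    """Return the stored (distance, azimuth, grasp posture) closest to the
    observed distance for this workpiece, or None if nothing matches."""
    return conn.execute(
        """SELECT distance_mm, azimuth_deg, grasp_posture
           FROM grasp_model
           WHERE workpiece_id = ? AND ABS(distance_mm - ?) <= ?
           ORDER BY ABS(distance_mm - ?) LIMIT 1""",
        (workpiece_id, distance_mm, tol_mm, distance_mm),
    ).fetchone()
```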
S203: generate a control signal according to the relative positional relationship, the control signal being used to control the robotic arm to grasp the workpiece to be grasped.
In this step, the control signal includes the angle and distance data by which the robotic arm needs to move or rotate, together with its grasping posture; the control signal causes the robotic arm to move the grasping mechanism to the workpiece to be grasped and complete the grasp. Specifically, the step of generating the control signal according to the relative positional relationship and the information about the workpiece to be grasped includes: obtaining the information of the workpiece, or its weight information, shape information and material information, and calculating a specified grasping force from the weight information, shape information and material information of the workpiece.
A movement control parameter is generated according to the relative positional relationship, and the control signal is generated from the movement control parameter and the specified grasping force, so that the robotic arm moves the grasping mechanism to the corresponding position and grasps the workpiece with the specified grasping force.
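Finally, a minimal sketch of the assembled control signal; the fields, units and the joint-angle representation of the grasping posture are illustrative assumptions, not a format defined by the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ControlSignal:
    move_distance_mm: float       # how far the gripper must travel
    rotate_deg: float             # how far the arm must rotate toward the workpiece
    grasp_posture: List[float]    # e.g. target joint angles from the model database
    grasp_force_n: float          # the specified grasping force

def build_control_signal(distance_mm: float, azimuth_deg: float,
                         grasp_posture: List[float], grasp_force_n: float) -> ControlSignal:
    return ControlSignal(distance_mm, azimuth_deg, grasp_posture, grasp_force_n)

signal = build_control_signal(320.0, 45.0, [0.1, -1.2, 1.5, 0.0, 1.6, 0.0], 26.0)
print(signal)
```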
It should be noted that those of ordinary skill in the art will understand that all or part of the steps in the various methods of the above embodiments can be completed by instructing the relevant hardware through a program, which can be stored in a computer-readable storage medium; the storage medium can include, but is not limited to, read-only memory (ROM), random access memory (RAM), magnetic disks or optical discs.
The robot autonomous learning system and robot control method provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementation of the present application; the explanation of the above embodiments is intended only to help understand the methods of the present application and their core idea. At the same time, those skilled in the art may make changes to the specific implementation and scope of application in accordance with the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (7)

1. A robot autonomous learning system, characterized by comprising:
an information acquisition unit for obtaining and storing a video of a preset workpiece together with the robotic-arm information and the relative positional relationship between the front end of the robotic arm and the preset workpiece at the time the video was shot;
an arithmetic unit which, from the data obtained by the information acquisition unit, autonomously learns to recognize the preset workpiece and to simulate the robotic-arm grasping posture;
an output unit which, from the calculation results of the arithmetic unit, forms and exports a model database of the correspondence between robotic-arm grasping postures and preset workpieces, the model database being exported in a portable form.
2. The robot autonomous learning system according to claim 1, characterized in that the relative positional relationship includes the distance value and the relative bearing relationship between the front end of the robotic arm and the preset workpiece.
3. The robot autonomous learning system according to claim 1, characterized in that the information acquisition unit is also used to obtain information about the preset workpiece;
the output unit establishes relationship model groups according to the calculation results of the arithmetic unit and the information about the preset workpiece, and forms and exports the model database of the correspondence between robotic-arm grasping postures and preset workpieces.
4. The robot autonomous learning system according to claim 1, characterized in that the information acquisition unit obtains the video information of the preset workpiece by the following steps:
the video comprises multiple frames; the preset workpiece is identified in the multiple frames to obtain screening pictures;
multiple characteristic parameters are set and the values of those characteristic parameters in the screening pictures are obtained.
5. The robot autonomous learning system according to any one of claims 1 to 4, characterized in that the model database comprises relationship models established from the information about the preset workpiece, the characteristic parameter values of each screening picture, the corresponding relative positional relationship and the simulated robotic-arm grasping posture.
6. The robot autonomous learning system according to claim 4, characterized in that identifying the preset workpiece in the multiple frames and obtaining screening pictures comprises the following steps:
preprocessing the acquired frames to remove noise;
obtaining foreground targets using background subtraction while eliminating false foregrounds, and removing shadows using a colour-space method and the direction of the shadows;
setting a movable frame and its movement within the frames and obtaining the image pixels inside the movable frame;
identifying the preset workpiece from the image pixels in the movable frame by means of preset characteristic parameters.
7. A robot control method, characterized by comprising the following steps:
when a workpiece-grasping signal is detected, obtaining the real-time video of the workpiece to be grasped collected by the camera at the front end of the robotic arm;
querying the previously transplanted model database according to the real-time video to obtain the relative positional relationship between the workpiece and the front end of the robotic arm, and the robotic-arm grasping posture;
generating a control signal according to the relative positional relationship, the control signal being used to control the robotic arm to grasp the workpiece to be grasped.
CN201910088467.9A 2019-01-30 2019-01-30 Robot autonomous learning system and robot control method Pending CN109760054A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910088467.9A CN109760054A (en) 2019-01-30 2019-01-30 Robot autonomous learning system and robot control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910088467.9A CN109760054A (en) 2019-01-30 2019-01-30 Robot autonomous learning system and robot control method

Publications (1)

Publication Number Publication Date
CN109760054A true CN109760054A (en) 2019-05-17

Family

ID=66454677

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910088467.9A Pending CN109760054A (en) 2019-01-30 2019-01-30 Robot autonomous learning system and robot control method

Country Status (1)

Country Link
CN (1) CN109760054A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751229A (en) * 2019-10-30 2020-02-04 珠海格力智能装备有限公司 Visual inspection system and method
CN112008717A (en) * 2019-05-30 2020-12-01 松下i-PRO传感解决方案株式会社 Camera and robot system
CN112720499A (en) * 2020-12-30 2021-04-30 深兰人工智能芯片研究院(江苏)有限公司 Control method and device for manipulator, pickup device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203506A (en) * 2016-07-11 2016-12-07 上海凌科智能科技有限公司 Pedestrian detection method based on deep learning technology
CN108107886A (en) * 2017-11-29 2018-06-01 珠海格力电器股份有限公司 Travel control method and device for a sweeping robot, and sweeping robot
CN108406767A (en) * 2018-02-13 2018-08-17 华南理工大学 Robot autonomous learning method towards man-machine collaboration
US20180272527A1 (en) * 2017-03-24 2018-09-27 International Business Machines Corporation Self-assembling robotics for disaster applications
CN109159119A (en) * 2018-09-05 2019-01-08 张军强 Method for controlling robot, device, storage medium and electronic equipment
CN109188902A (en) * 2018-08-08 2019-01-11 重庆两江微链智能科技有限公司 Robot learning method, control method, device, storage medium and master control device
CN109218446A (en) * 2018-10-26 2019-01-15 湖北大学 A kind of Internet of Things intelligent robot autonomous learning cloud computing device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203506A (en) * 2016-07-11 2016-12-07 上海凌科智能科技有限公司 Pedestrian detection method based on deep learning technology
US20180272527A1 (en) * 2017-03-24 2018-09-27 International Business Machines Corporation Self-assembling robotics for disaster applications
CN108107886A (en) * 2017-11-29 2018-06-01 珠海格力电器股份有限公司 Travel control method and device for a sweeping robot, and sweeping robot
CN108406767A (en) * 2018-02-13 2018-08-17 华南理工大学 Robot autonomous learning method towards man-machine collaboration
CN109188902A (en) * 2018-08-08 2019-01-11 重庆两江微链智能科技有限公司 Robot learning method, control method, device, storage medium and master control device
CN109159119A (en) * 2018-09-05 2019-01-08 张军强 Method for controlling robot, device, storage medium and electronic equipment
CN109218446A (en) * 2018-10-26 2019-01-15 湖北大学 A kind of Internet of Things intelligent robot autonomous learning cloud computing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Barnes (US): "Ada Programming" (《ADA程序设计》), Higher Education Press (高等教育出版社), 31 August 1990 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112008717A (en) * 2019-05-30 2020-12-01 松下i-PRO传感解决方案株式会社 Camera and robot system
US11813740B2 (en) 2019-05-30 2023-11-14 i-PRO Co., Ltd. Camera and robot system
CN110751229A (en) * 2019-10-30 2020-02-04 珠海格力智能装备有限公司 Visual inspection system and method
CN112720499A (en) * 2020-12-30 2021-04-30 深兰人工智能芯片研究院(江苏)有限公司 Control method and device for manipulator, pickup device and storage medium
CN112720499B (en) * 2020-12-30 2022-06-17 深兰智能科技(上海)有限公司 Control method and device for manipulator, pickup device and storage medium

Similar Documents

Publication Publication Date Title
CN109760054A (en) Robot autonomous learning system and robot control method
CN111275063B (en) Robot intelligent grabbing control method and system based on 3D vision
KR102023588B1 (en) Deep machine learning method and device for robot gripping
CN109800864B (en) Robot active learning method based on image input
Aleksander et al. Artificial vision for robots
CN104842361A (en) Robotic system with 3d box location functionality
CN109409327B (en) RRU module object pose detection method based on end-to-end deep neural network
CN109444146A (en) Defect detection method, device and equipment for industrial process products
CN110852241B (en) Small target detection method applied to nursing robot
CN111160261A (en) Sample image labeling method and device for automatic sales counter and storage medium
CN206568190U (en) Depth camera calibration device for the industrial robot grasping field
CN109159119A (en) Method for controlling robot, device, storage medium and electronic equipment
CN115816460A (en) Manipulator grabbing method based on deep learning target detection and image segmentation
CN106020489A (en) Industrial-design simulation system
CN114612786A (en) Obstacle detection method, mobile robot and machine-readable storage medium
CN110782484A (en) Unmanned aerial vehicle video personnel identification and tracking method
CN112288809B (en) Robot grabbing detection method for multi-object complex scene
JPH07291450A (en) Intelligent palletizing system
CN114187312A (en) Target object grabbing method, device, system, storage medium and equipment
CN114872055B (en) SCARA robot assembly control method and system
CN109188902A (en) Robot learning method, control method, device, storage medium and master control device
CN110271001A (en) Robot recognition method, control method, device, storage medium and master control device
CN206416179U (en) Moving target tracking, positioning and grasping system based on binocular vision
CN116529760A (en) Grabbing control method, grabbing control device, electronic equipment and storage medium
CN113771029A (en) Robot operating system and method based on video incremental learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190517