CN112847346A - Manipulator control method, device, equipment and computer readable storage medium - Google Patents

Manipulator control method, device, equipment and computer readable storage medium

Info

Publication number
CN112847346A
Authority
CN
China
Prior art keywords
information
grabbed
point
manipulator
grabbing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011613066.XA
Other languages
Chinese (zh)
Other versions
CN112847346B (en)
Inventor
陈海波
李宗剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenlan Intelligent Technology Shanghai Co ltd
Original Assignee
DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DeepBlue AI Chips Research Institute Jiangsu Co Ltd filed Critical DeepBlue AI Chips Research Institute Jiangsu Co Ltd
Priority to CN202011613066.XA priority Critical patent/CN112847346B/en
Publication of CN112847346A publication Critical patent/CN112847346A/en
Application granted granted Critical
Publication of CN112847346B publication Critical patent/CN112847346B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application provides a manipulator control method, a manipulator control device, an electronic device and a computer readable storage medium, wherein the method comprises the following steps: acquiring image information of an object to be grabbed, which is acquired by a camera, and acquiring characteristic information of the object according to the image information; calculating and analyzing the information of the point to be grabbed of the object according to the characteristic information; acquiring a grabbing component type corresponding to the object to be grabbed according to the information of the point to be grabbed; and controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed so as to drive the object to move. The application can obtain the characteristic information of the object from its image information, then calculate and analyze that information to obtain the information of the point to be grabbed, determine a suitable grabbing component type according to the information of the point to be grabbed, and finally grab the object with the suitable grabbing component on the manipulator, thereby avoiding damage to the object caused by selecting the wrong grabbing component or grabbing point.

Description

Manipulator control method, device, equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of manipulator control technologies, and in particular, to a manipulator control method, device, and apparatus, and a computer-readable storage medium.
Background
A manipulator is an automatic device that imitates some of the motion functions of the human hand and arm to grasp and transport objects or operate tools according to a fixed program. It can be programmed to carry out a variety of expected operations, and its construction and performance combine the advantages of the human body and of machines. The manipulator was the earliest industrial and modern robot to appear; it can replace heavy human labor to realize the mechanization and automation of production, and it can operate in harmful environments to protect personal safety. Manipulators are therefore widely used in machinery manufacturing, metallurgy, electronics, light industry, atomic energy and other sectors. In the prior art, however, when an existing manipulator clamps a product that has an opening or a hollow interior, the manipulator cannot identify this and the clamping force is too high, so the product is easily damaged; the product cannot be clamped well for use, and the product is very likely to roll over about its center of gravity during grabbing.
Accordingly, there is a need to ameliorate one or more of the problems with the related art solutions described above.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The application aims to provide a manipulator control method, device and equipment and a computer-readable storage medium, which can calculate the information of the grabbing point of an object from the acquired image information of the object.
The purpose of the application is achieved by the following technical solutions:
in a first aspect, the present application provides a manipulator control method, including:
acquiring image information of an object to be grabbed, which is acquired by a camera, and acquiring characteristic information of the object according to the image information;
calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information;
acquiring the type of a grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed;
and controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed so as to drive the object to move.
The advantage of this technical solution is that the characteristic information of the object can be obtained from the image information of the object and then calculated and analyzed to obtain the information of the point to be grabbed of the object; a suitable grabbing component type is determined according to the information of the point to be grabbed, and finally the manipulator grabs the object with the suitable grabbing component, thereby avoiding damage to the object caused by selecting the wrong grabbing component or grabbing point.
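As a concrete illustration of the four steps above, a minimal runnable Python sketch is given below. Every function body, constant and name in it is an illustrative assumption standing in for the corresponding step; the disclosure does not prescribe these algorithms, and the later embodiments refine each step individually.

from dataclasses import dataclass


@dataclass
class GraspPoint:
    position: tuple   # (x, y, z) coordinates of the point to be grabbed, in mm
    force_n: float    # grabbing force estimated for the object, in newtons


def extract_features(image):
    # Step 1: derive characteristic information (contour, opening, material) from the image.
    return {"material": "ABS", "contour_area_mm2": 2500.0}


def analyze_grasp_point(features):
    # Step 2: calculate and analyze the point to be grabbed and the required force.
    return GraspPoint(position=(120.0, 40.0, 15.0), force_n=8.0)


def select_gripper_type(grasp_point):
    # Step 3: map the grasp-point information to a pre-stored grabbing component type.
    return "clamp" if grasp_point.force_n > 5.0 else "suction"


def execute_grasp(gripper_type, grasp_point):
    # Step 4: command the manipulator (stubbed here as a console message).
    print(f"Grab at {grasp_point.position} with '{gripper_type}' using {grasp_point.force_n} N")


if __name__ == "__main__":
    image = None                                # stands in for a frame captured by the camera
    features = extract_features(image)
    grasp_point = analyze_grasp_point(features)
    gripper = select_gripper_type(grasp_point)
    execute_grasp(gripper, grasp_point)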
In some optional embodiments, the acquiring image information of the object to be grabbed, which is acquired by the camera, includes:
receiving image information of the object shot by the camera under different illumination changes; or,
receiving the image information of the object shot by the camera at multiple angles.
The benefit of this technical solution is that by collecting pictures of the object under different illumination change conditions, or photographing the object from multiple angles, the contour information of the object, or the opening contour information of the object, can be determined.
In some optional embodiments, the characteristic information of the object is opening profile information of the object;
the acquiring the characteristic information of the object according to the image information includes:
identifying shadow characteristic information of the object under different illumination changes;
and determining the opening outline information of the object according to the shadow characteristic information.
The benefit of this technical solution is that the object is placed under different illumination changes; if the object has an opening, the position of the opening presents a dark region under the illumination. A picture of the object under this illumination condition is then taken, and the characteristic information in the picture information is extracted to construct the outline information of the opening. In this way, the orientation and size of the opening of the object can be obtained accurately.
In some optional embodiments, the characteristic information of the object includes at least one of:
opening profile information of the object;
actual contour information of the object;
material information of the object.
The benefit of this technical solution is that the opening outline information, the actual outline information and the material information of the object can be extracted from the picture information of the object, providing a basis for subsequently determining the center of gravity and the density of the object.
In some optional embodiments, the characteristic information of the object is material information of the object;
the calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information comprises the following steps:
acquiring material information of the object in the characteristic information, and acquiring density information of the object corresponding to the material information;
constructing an actual space model of the object to calculate volume information of the object, and calculating the mass of the object by combining the density information;
and positioning the gravity center position of the object, and determining the position information and the grabbing force information of the point to be grabbed by combining the opening profile information of the object.
The benefit of this technical solution is that the density information of the object is obtained from its material information, an actual spatial model of the object is constructed to obtain its volume information, the mass of the object is then calculated, and the position of the center of gravity is located, so that the position information of the point to be grabbed and the grabbing force information are determined. In this way, the center of gravity of the object, the position of the point to be grabbed and the grabbing force of the manipulator can be obtained accurately.
In some optional embodiments, obtaining the type of the grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed includes:
and comparing the information of the point to be grabbed with the type of the grabbing component of the pre-stored manipulator, and selecting the corresponding type of the grabbing component according to the position information and the grabbing strength information of the point to be grabbed.
The technical scheme has the advantages that the corresponding grabbing component type is selected according to the calculated position information and force information of the point to be grabbed, the grabbing component most suitable for the object can be selected in the mode, and therefore the situation that the object is damaged in the grabbing process is avoided to a certain extent.
In some optional embodiments, the controlling the robot to grasp the point to be grasped of the object by using the grasping component corresponding to the grasping component type according to the information of the point to be grasped includes:
acquiring first direction information of the gravity center of the object;
and controlling the manipulator to grab the point to be grabbed of the object according to the information of the point to be grabbed and the information of the first direction in which the center of gravity of the object is located, so that the direction of the resultant extraction force coincides with the first direction in which the center of gravity of the object is located.
The benefit of this technical solution is that the direction of the resultant extraction force of the manipulator coincides with the direction in which the center of gravity of the object is located; with this arrangement the object does not roll over while the manipulator grabs and transports it, so damage to the object is avoided.
In some optional embodiments, the method further comprises:
receiving internal characteristic information of the object under multiple angles, which is sent by ray equipment;
calculating and analyzing the grabbing force information of the points to be grabbed of the object according to the internal characteristic information;
the step of controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed comprises the following steps:
and controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed and the grabbing force information thereof.
The benefit of this technical solution is that whether the middle of the object is hollow can be learned through the ray equipment, the mass of the hollow object can be calculated, and the information of the point to be grabbed and the grabbing force of such an object can thus be calculated and analyzed. In this way, the manipulator is prevented from damaging a hollow object by applying excessive force.
In a second aspect, the present application provides a robot control device, comprising:
the image acquisition module is used for acquiring image information of an object to be grabbed, which is acquired by the camera, and acquiring characteristic information of the object according to the image information;
the calculation module is used for calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information;
the type acquisition module is used for acquiring the type of the grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed;
and the control module is used for controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed so as to drive the object to move.
In some optional embodiments, the image acquisition module comprises:
the first receiving unit is used for receiving the image information of the object shot by the camera under different illumination changes; or,
the second receiving unit is used for receiving the image information of the object shot by the camera at multiple angles.
In some optional embodiments, the characteristic information of the object is opening profile information of the object;
the image acquisition module includes:
the identification unit is used for identifying shadow characteristic information of the object under different illumination changes;
and the determining unit is used for determining the opening outline information of the object according to the shadow characteristic information.
In some optional embodiments, the characteristic information of the object includes at least one of:
opening profile information of the object;
actual contour information of the object;
material information of the object.
In some optional embodiments, the characteristic information of the object is material information of the object;
the calculation module comprises:
an obtaining unit, configured to obtain material information of the object in the feature information, and obtain density information of the object corresponding to the material information;
the construction unit is used for constructing an actual space model of the object so as to calculate the volume information of the object and calculate the mass of the object by combining the density information;
and the positioning unit is used for positioning the gravity center position of the object and confirming the position information and the grabbing force information of the point to be grabbed by combining the opening profile information of the object.
In some optional embodiments, the type obtaining module is configured to compare the information of the point to be grabbed with a pre-stored type of a manipulator grabbing component, so as to select a corresponding type of the grabbing component according to the position information and the grabbing strength information of the point to be grabbed.
In some optional embodiments, the control module comprises:
the direction obtaining unit is used for obtaining first direction information of the gravity center of the object;
and the manipulator control unit is used for controlling the manipulator to grab the point to be grabbed of the object according to the information of the point to be grabbed and the information of the first direction in which the gravity center of the object is located, so that the direction of the resultant extraction force coincides with the first direction in which the gravity center of the object is located.
In some optional embodiments, the apparatus further comprises a force module, the force module comprising:
the internal receiving unit is used for receiving the internal characteristic information of the object under multiple angles, which is sent by ray equipment;
the force calculation unit is used for calculating and analyzing the grabbing force information of the points to be grabbed of the object according to the internal characteristic information;
and the control module is used for controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed and the grabbing force information thereof.
In a third aspect, the present application provides a grasping apparatus, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of any one of the above methods when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the methods described above.
Drawings
The present application is further described below with reference to the drawings and examples.
Fig. 1 is a schematic flowchart of a method for controlling a manipulator according to an embodiment of the present disclosure;
FIG. 2 is a schematic flowchart of acquiring image information according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of acquiring image information according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of obtaining feature information according to an embodiment of the present application;
fig. 5 is a schematic flowchart of calculating information of points to be grabbed according to an embodiment of the present disclosure;
FIG. 6 is a schematic flow chart illustrating obtaining types of grasping elements according to an embodiment of the present disclosure;
FIG. 7 is a schematic flow chart illustrating a robot gripping an object according to an embodiment of the present disclosure;
fig. 8 is a schematic flowchart of a method for controlling a manipulator according to an embodiment of the present disclosure;
FIG. 9 is a schematic flow chart illustrating a robot gripping an object according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a manipulator control device according to an embodiment of the present disclosure;
FIG. 11 is a schematic structural diagram of an image acquisition module provided in an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an image acquisition module provided in an embodiment of the present application;
FIG. 13 is a schematic structural diagram of a computing module provided in an embodiment of the present application;
FIG. 14 is a schematic structural diagram of a control module provided in an embodiment of the present application;
fig. 15 is a schematic structural diagram of a manipulator control device according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a force module provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of a grasping apparatus according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of a program product for implementing a robot control method according to an embodiment of the present application.
Detailed Description
The present application is further described with reference to the accompanying drawings and the detailed description, and it should be noted that, in the present application, the embodiments or technical features described below may be arbitrarily combined to form a new embodiment without conflict.
Referring to fig. 1, an embodiment of the present application provides a manipulator control method, which includes steps S101 to S104.
Step S101: acquiring image information of an object to be grabbed, which is acquired by a camera, and acquiring characteristic information of the object according to the image information. Specifically, by extracting the image information collected by the camera, the characteristic information of the object in the image information can be extracted, and the characteristic information can be material information and contour information of the object, and is not limited specifically.
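For illustration only, a minimal Python sketch of this feature-extraction step is given below. It assumes OpenCV is available and that the object separates from the background with a simple Otsu threshold; the function name and thresholds are assumptions, not part of the disclosure.

import cv2


def extract_object_contour(image_bgr):
    # Convert to grayscale, smooth, and segment the object from the background.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea)    # outer contour of the object
    return {"contour": outline, "area_px": cv2.contourArea(outline)}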
In a specific implementation, referring to fig. 2 and 3, the step S101 may include steps S201 to S202.
Step S201: and receiving the image information of the object shot by the camera under different illumination changes. Specifically, the object image shot by the camera under different illumination change conditions may contain opening shadow information of the object, and the opening contour information and the position information of the object are determined accordingly, so as to further determine the information of the grasping point of the object.
Step S202: receiving the image information of the object shot by the camera at multiple angles. Specifically, by collecting pictures of the object under different illumination change conditions, or photographing the object from multiple angles, the contour information of the object, or the opening contour information of the object, can be determined. By this method, the opening position of the object can be quickly identified, and the outline information and the orientation information of the opening of the object can be acquired.
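A possible way to collect such an image set is sketched below, assuming a standard camera readable through OpenCV and that the illumination or viewing angle is changed externally between captures (by an operator or a separate controller); the condition labels are illustrative.

import cv2


def capture_image_set(conditions=("light_left", "light_right", "light_top")):
    cam = cv2.VideoCapture(0)                # first attached camera
    images = {}
    try:
        for label in conditions:
            input(f"Set illumination/angle '{label}', then press Enter...")
            ok, frame = cam.read()
            if not ok:
                raise RuntimeError("camera read failed")
            images[label] = frame            # BGR frame kept for later shadow analysis
    finally:
        cam.release()
    return images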
In a specific implementation, referring to fig. 4, the characteristic information of the object is opening profile information of the object; the step S101 may include steps S301 to S302.
Step S301: identifying shadow characteristic information of the object under different illumination changes. Specifically, by lighting the object from the side, the position of the opening of the object is cast into shadow, and this shadow characteristic information can be acquired to determine the outline and orientation information of the opening.
Step S302: determining the opening outline information of the object according to the shadow characteristic information. Specifically, the object is placed under different illumination changes; if the object has an opening, the position of the opening presents a dark region under the illumination. A picture of the object under this illumination condition is then taken, and the characteristic information in the picture information is extracted to construct the outline information of the opening. In this way, the orientation and size of the opening of the object can be obtained accurately.
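One way to implement this shadow-based detection is sketched below; it assumes that regions which stay dark under every illumination direction belong to the opening rather than to a cast shadow from a single light, and the intensity threshold is an assumption.

import cv2


def opening_contour(images, dark_thresh=40):
    # images: frames of the same object pose taken under different illumination directions
    persistent_dark = None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
        # Keep only the pixels that stay dark in every illumination condition.
        persistent_dark = mask if persistent_dark is None else cv2.bitwise_and(persistent_dark, mask)
    if persistent_dark is None:
        return None                              # no images supplied
    contours, _ = cv2.findContours(persistent_dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                              # no opening detected
    return max(contours, key=cv2.contourArea)    # largest persistent dark region as the opening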
Step S102: and calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information.
In a specific implementation, referring to fig. 5, the characteristic information of the object is material information of the object, and the step S102 may include steps S401 to S403.
Step S401: and acquiring material information of the object in the characteristic information, and acquiring density information of the object corresponding to the material information. Specifically, the characteristic information of the object and the pre-stored characteristic information of the object can be compared to obtain the material information of the corresponding object, so that the density information of the object can be obtained.
Step S402: and constructing an actual space model of the object to calculate volume information of the object, and calculating the mass of the object by combining the density information. Specifically, an actual space model of the object is constructed through the obtained actual contour information of the object, so that volume information of the object is obtained, and the mass of the object is obtained according to the obtained density information.
Step S403: positioning the gravity center position of the object, and determining the position information and the grabbing force information of the point to be grabbed by combining the opening profile information of the object. Specifically, the density information of the object is obtained from its material information, an actual space model of the object is constructed to obtain its volume information, the mass of the object is then calculated, and the gravity center position of the object is located, so that the position information of the point to be grabbed and the grabbing force information are determined. In this way, the gravity center position of the object, the position of the point to be grabbed and the grabbing force of the manipulator can be obtained accurately.
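A simplified numeric sketch of steps S401 to S403 follows. The density table, the safety factor and the rule of grabbing at the center of gravity are assumptions introduced for illustration; the disclosure only requires deriving density, volume, mass, center of gravity and grabbing force in this order.

DENSITY_KG_PER_M3 = {"ABS": 1050.0, "aluminium": 2700.0, "glass": 2500.0}
GRAVITY = 9.81  # m/s^2


def grasp_point_from_model(material, volume_m3, centre_of_gravity_mm, safety_factor=1.5):
    density = DENSITY_KG_PER_M3[material]          # step S401: density from material
    mass_kg = density * volume_m3                  # step S402: mass from model volume
    force_n = safety_factor * mass_kg * GRAVITY    # step S403: force needed to lift safely
    # Grab at the centre of gravity; a real implementation would shift the point
    # off any detected opening using the opening profile information.
    return {"position_mm": centre_of_gravity_mm, "force_n": force_n, "mass_kg": mass_kg}

For example, under these assumptions a 0.002 m^3 ABS part has a mass of about 2.1 kg and would be grabbed with roughly 31 N.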
Step S103: and acquiring the type of the grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed. Specifically, the different gripping points may adopt different types of gripping components, such as hooking, adsorbing, or clamping to grip the object.
In a specific implementation, referring to fig. 6, the type of the grabbing component corresponding to the object to be grabbed is obtained according to the information of the point to be grabbed, and the step S103 may include step S501.
Step S501: and comparing the information of the point to be grabbed with the type of the grabbing component of the pre-stored manipulator, and selecting the corresponding type of the grabbing component according to the position information and the grabbing strength information of the point to be grabbed. Specifically, the corresponding grabbing component type is selected according to the calculated position information and force information of the point to be grabbed, and the grabbing component most suitable for the object can be selected in the mode, so that the condition that the object is damaged in the grabbing process is avoided to a certain extent.
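For illustration, step S501 can be sketched as a lookup over pre-stored gripper types, each carrying an assumed force limit and surface requirement; the table values below are not taken from the disclosure.

PRESTORED_GRIPPERS = [
    {"name": "suction", "max_force_n": 15.0,  "needs_flat_surface": True},
    {"name": "clamp",   "max_force_n": 80.0,  "needs_flat_surface": False},
    {"name": "hook",    "max_force_n": 200.0, "needs_flat_surface": False},
]


def select_gripper(grasp_info, surface_is_flat):
    for gripper in PRESTORED_GRIPPERS:
        if grasp_info["force_n"] > gripper["max_force_n"]:
            continue                      # this gripper cannot apply enough force
        if gripper["needs_flat_surface"] and not surface_is_flat:
            continue                      # suction needs a sealed, flat contact area
        return gripper["name"]
    raise ValueError("no pre-stored gripper type matches the grasp-point information")

Calling select_gripper({"force_n": 8.0}, surface_is_flat=True) would, under these assumed limits, return "suction".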
Step S104: controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed, so as to drive the object to move. Specifically, after the information of the point to be grabbed of the object and the grabbing component used by the manipulator are confirmed, the manipulator is controlled to move so as to grab the object accurately and avoid damaging the object.
Therefore, the characteristic information of the object can be obtained according to the image information of the object, then the characteristic information of the object is calculated and analyzed, so that the information of the point to be grabbed of the object is obtained, the proper grabbing component type is determined according to the information of the point to be grabbed, and finally the proper grabbing component is used for grabbing the object through the manipulator, so that the situation that the object is damaged due to the fact that the grabbing component and the grabbing point are selected incorrectly is avoided.
In a specific implementation, the characteristic information of the object includes at least one of: opening profile information of the object; actual contour information of the object; material information of the object.
Therefore, the opening outline information, the actual outline information and the material information of the object can be extracted from the picture information of the object, providing a basis for subsequently determining the center of gravity and the density of the object.
In a specific implementation, referring to fig. 7, the step S104 may include steps S601 to S602.
Step S601: acquiring first direction information of the gravity center of the object. Specifically, the position of the center of gravity of the object can be determined according to the mass information of the object and the constructed spatial model information, and the information of the first direction in which the center of gravity is located is obtained; this first direction coincides with the direction of the resultant extraction force applied to the object by the manipulator.
Step S602: controlling the manipulator to grab the point to be grabbed of the object according to the information of the point to be grabbed and the information of the first direction in which the gravity center of the object is located, so that the direction of the resultant extraction force coincides with the first direction. Specifically, making the direction of the resultant extraction force of the manipulator coincide with the direction in which the center of gravity of the object is located prevents the object from rolling over while the manipulator grabs and transports it, so damage to the object is avoided.
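A minimal check of this condition is sketched below, assuming gravity acts along the negative z axis of the robot frame; the tolerance values are illustrative assumptions.

import numpy as np


def lift_is_balanced(grasp_point_mm, centre_of_gravity_mm, resultant_force_n,
                     angle_tol_deg=2.0, offset_tol_mm=3.0):
    up = np.array([0.0, 0.0, 1.0])                      # opposite to gravity
    f = np.asarray(resultant_force_n, dtype=float)
    f_dir = f / np.linalg.norm(f)
    # Angle between the resultant extraction force and the upward vertical.
    angle_deg = np.degrees(np.arccos(np.clip(np.dot(f_dir, up), -1.0, 1.0)))
    # Horizontal offset between the grabbing point and the centre of gravity.
    offset_mm = np.linalg.norm(np.asarray(grasp_point_mm[:2]) - np.asarray(centre_of_gravity_mm[:2]))
    return angle_deg <= angle_tol_deg and offset_mm <= offset_tol_mm

If either tolerance is exceeded, the grabbing point or the gripper pose would be adjusted before the object is lifted.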
In a specific implementation, referring to fig. 8, the method may further include steps S105 to S106.
Step S105: and receiving the internal characteristic information of the object under multiple angles, which is sent by ray equipment. Specifically, the characteristic information of the interior of the object can be detected through the ray equipment, if the object is of a hollow structure, the characteristic information can be detected by the ray equipment, and corresponding data are sent, so that a model of the hollow part of the object can be built, and the volume of the hollow part and the gravity center of the object can be calculated.
Step S106: and calculating and analyzing the grabbing force information of the points to be grabbed of the object according to the internal characteristic information. Specifically, the mass and the gravity center position of the object can be obtained by calculating and analyzing the internal features, so that information of the point to be grasped of the object, such as force information, can be calculated.
In a specific implementation, referring to fig. 9, the step S104 may include a step S701.
Step S701: controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed and the grabbing force information thereof. Specifically, whether the middle of the object is hollow can be learned through the ray equipment, the mass of the hollow object can be calculated, and the information of the point to be grabbed and the grabbing force of such an object can thus be calculated and analyzed. In this way, the manipulator is prevented from damaging a hollow object by applying excessive force.
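As a hedged sketch of the hollow-object case, assuming the cavity volume has already been reconstructed from the radiographic images, the grabbing force can be reduced as follows; the safety factor is an assumption.

GRAVITY = 9.81  # m/s^2


def hollow_object_force(outer_volume_m3, cavity_volume_m3, density_kg_m3, safety_factor=1.2):
    # Subtract the cavity so the force matches the real, lighter object.
    solid_volume_m3 = max(outer_volume_m3 - cavity_volume_m3, 0.0)
    mass_kg = density_kg_m3 * solid_volume_m3
    return safety_factor * mass_kg * GRAVITY     # grabbing/lifting force in newtons

Using the reduced mass keeps the grabbing force proportional to the actual weight instead of the weight a solid object of the same outline would have.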
Referring to fig. 10, an embodiment of the present application further provides a manipulator control device, and a specific implementation manner of the manipulator control device is consistent with the implementation manner and the achieved technical effect described in the embodiment of the foregoing method, and details are not repeated.
The device comprises: the image acquisition module 401 is configured to acquire image information of an object to be captured, which is acquired by a camera, and acquire feature information of the object according to the image information; a calculating module 402, configured to calculate and analyze information of a to-be-grabbed point of the object according to the feature information; a type obtaining module 403, configured to obtain, according to the information of the point to be grabbed, a type of the grabbing component corresponding to the object to be grabbed; and the control module 404 is configured to control the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed, so as to drive the object to move.
Referring to fig. 11, in an implementation, the image acquisition module 401 includes: the first receiving unit 4011 is configured to receive image information of the object, which is captured by the camera under different illumination changes; or, the second receiving unit 4012 is configured to receive image information of the object, which is shot by the camera at multiple angles.
Referring to fig. 12, in an implementation, the characteristic information of the object is opening profile information of the object; the image acquisition module 401 includes: the identification unit 4013 is configured to identify shadow feature information of the object under different illumination changes; a determining unit 4014, configured to determine opening contour information of the object according to the shadow feature information.
In a specific implementation, the characteristic information of the object includes at least one of: opening profile information of the object; actual contour information of the object; material information of the object.
Referring to fig. 13, in an implementation, the characteristic information of the object is material information of the object; the calculation module 402 includes: an obtaining unit 4021, configured to obtain material information of the object in the feature information, and obtain density information of the object corresponding to the material information; a building unit 4022, configured to build an actual spatial model of the object, to calculate volume information of the object, and to calculate a mass of the object by combining the density information; the positioning unit 4023 is configured to position the center of gravity of the object, and determine position information and grasping force information of a point to be grasped by combining the opening profile information of the object.
In a specific implementation, the type obtaining module 403 is configured to compare the information of the point to be grabbed with a pre-stored type of a grabbing component of the manipulator, so as to select a corresponding type of the grabbing component according to the position information and the grabbing strength information of the point to be grabbed.
Referring to fig. 14, in an implementation, the control module 404 includes: a direction obtaining unit 4041, configured to obtain first direction information where the center of gravity of the object is located; and the manipulator control unit 4042 is configured to control the manipulator to grasp the point to be grasped of the object according to the information of the point to be grasped and the information of the first direction in which the center of gravity of the object is located, and extract a resultant force direction to coincide with the first direction in which the center of gravity of the object is located.
Referring to fig. 15 and 16, in an implementation, the apparatus further includes a force module 405, where the force module 405 includes: an internal receiving unit 4051, configured to receive internal feature information of the object from a ray device at multiple angles; the force calculation unit 4052 is configured to calculate and analyze the grabbing force information of the point to be grabbed of the object according to the internal feature information; the control module 404 is configured to control the manipulator to grasp the point to be grasped of the object by using the grasping component corresponding to the grasping component type according to the information of the point to be grasped and the grasping force information thereof.
Referring to fig. 17, an embodiment of the present application further provides an electronic device 200, where the electronic device 200 includes at least one memory 210, at least one processor 220, and a bus 230 connecting different platform systems.
The memory 210 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 211 and/or cache memory 212, and may further include Read Only Memory (ROM) 213.
The memory 210 further stores a computer program, and the computer program can be executed by the processor 220, so that the processor 220 executes the steps of the robot control method in the embodiment of the present application. Memory 210 may also include a program/utility 214 having a set (at least one) of program modules 215, including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Accordingly, processor 220 may execute the computer programs described above, as well as may execute programs/utilities 214.
Bus 230 may be a local bus representing one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or any other type of bus structure.
The electronic device 200 may also communicate with one or more external devices 240, such as a keyboard, pointing device, Bluetooth device, etc., and may also communicate with one or more devices capable of interacting with the electronic device 200, and/or with any devices (e.g., routers, modems, etc.) that enable the electronic device 200 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 250. Also, the electronic device 200 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 260. The network adapter 260 may communicate with other modules of the electronic device 200 via the bus 230. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 200, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the present application further provides a computer-readable storage medium, which is used for storing a computer program, and when the computer program is executed, the steps of the robot control method in the embodiment of the present application are implemented. Fig. 18 shows a program product 300 provided by the present embodiment for implementing the method, which may employ a portable compact disc read only memory (CD-ROM) and include program codes, and may be run on a terminal device, such as a personal computer. However, the program product 300 of the present invention is not so limited, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. Program product 300 may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
The foregoing description and drawings are only for purposes of illustrating the preferred embodiments of the present application and are not intended to limit the present application, which is, therefore, to the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present application.

Claims (18)

1. A method for controlling a robot, the method comprising:
acquiring image information of an object to be grabbed, which is acquired by a camera, and acquiring characteristic information of the object according to the image information;
calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information;
acquiring the type of a grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed;
and controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed so as to drive the object to move.
2. The manipulator control method according to claim 1, wherein the acquiring image information of the object to be grasped acquired by the camera includes:
receiving image information of the object shot by the camera under different illumination changes; or,
receiving the image information of the object shot by the camera at multiple angles.
3. The robot control method according to claim 2, wherein the characteristic information of the object is opening profile information of the object;
the acquiring the characteristic information of the object according to the image information includes:
identifying shadow characteristic information of the object under different illumination changes;
and determining the opening outline information of the object according to the shadow characteristic information.
4. The robot control method according to claim 1, wherein the characteristic information of the object includes at least one of:
opening profile information of the object;
actual contour information of the object;
material information of the object.
5. The robot control method according to claim 1, wherein the characteristic information of the object is material information of the object;
the calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information comprises the following steps:
acquiring material information of the object in the characteristic information, and acquiring density information of the object corresponding to the material information;
constructing an actual space model of the object to calculate volume information of the object, and calculating the mass of the object by combining the density information;
and positioning the gravity center position of the object, and determining the position information and the grabbing force information of the point to be grabbed by combining the opening profile information of the object.
6. The manipulator control method according to claim 1, wherein obtaining a grasping assembly type corresponding to the object to be grasped according to the information of the point to be grasped comprises:
and comparing the information of the point to be grabbed with the type of the grabbing component of the pre-stored manipulator, and selecting the corresponding type of the grabbing component according to the position information and the grabbing strength information of the point to be grabbed.
7. The manipulator control method according to claim 1, wherein the controlling the manipulator to grasp the point to be grasped of the object using the grasping assembly corresponding to the grasping assembly type according to the information of the point to be grasped comprises:
acquiring first direction information of the gravity center of the object;
and controlling the manipulator to grab the point to be grabbed of the object according to the information of the point to be grabbed and the information of the first direction of the gravity center of the object, and extracting the direction of resultant force to coincide with the first direction of the gravity center of the object.
8. The robot control method according to claim 1, characterized by further comprising:
receiving internal characteristic information of the object under multiple angles, which is sent by ray equipment;
calculating and analyzing the grabbing force information of the points to be grabbed of the object according to the internal characteristic information;
the step of controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed comprises the following steps:
and controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the point information to be grabbed and the grabbing force information thereof.
9. A manipulator control device, characterized in that the device comprises:
the image acquisition module is used for acquiring image information of an object to be grabbed, which is acquired by the camera, and acquiring characteristic information of the object according to the image information;
the calculation module is used for calculating and analyzing the information of the points to be grabbed of the object according to the characteristic information;
the type acquisition module is used for acquiring the type of the grabbing component corresponding to the object to be grabbed according to the information of the point to be grabbed;
and the control module is used for controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed so as to drive the object to move.
10. The robot control apparatus of claim 9, wherein the image acquisition module comprises:
the first receiving unit is used for receiving the image information of the object shot by the camera under different illumination changes; or,
the second receiving unit is used for receiving the image information of the object shot by the camera at multiple angles.
11. The robot control device according to claim 10, wherein the characteristic information of the object is opening profile information of the object;
the image acquisition module includes:
the identification unit is used for identifying shadow characteristic information of the object under different illumination changes;
and the determining unit is used for determining the opening outline information of the object according to the shadow characteristic information.
12. The robot control apparatus of claim 9, wherein the characteristic information of the object includes at least one of:
opening profile information of the object;
actual contour information of the object;
material information of the object.
13. The robot control device according to claim 9, wherein the characteristic information of the object is material information of the object;
the calculation module comprises:
an obtaining unit, configured to obtain material information of the object in the feature information, and obtain density information of the object corresponding to the material information;
the construction unit is used for constructing an actual space model of the object so as to calculate the volume information of the object and calculate the mass of the object by combining the density information;
and the positioning unit is used for positioning the gravity center position of the object and confirming the position information and the grabbing force information of the point to be grabbed by combining the opening profile information of the object.
14. The manipulator control device according to claim 9, wherein the type obtaining module is configured to compare the information of the point to be grabbed with a pre-stored manipulator grabbing component type, so as to select a corresponding grabbing component type according to the position information and grabbing strength information of the point to be grabbed.
15. The robot control device of claim 9, wherein the control module comprises:
the direction obtaining unit is used for obtaining first direction information of the gravity center of the object;
and the manipulator control unit is used for controlling the manipulator to grab the point to be grabbed of the object according to the information of the point to be grabbed and the information of the first direction of the gravity center of the object, and extracting the direction of resultant force to be superposed with the first direction of the gravity center of the object.
16. The manipulator control device of claim 9, further comprising a force module, the force module comprising:
the internal receiving unit is used for receiving the internal characteristic information of the object under multiple angles, which is sent by ray equipment;
the force calculation unit is used for calculating and analyzing the grabbing force information of the points to be grabbed of the object according to the internal characteristic information;
and the control module is used for controlling the manipulator to use the grabbing component corresponding to the grabbing component type to grab the point to be grabbed of the object according to the information of the point to be grabbed and the grabbing force information thereof.
17. Gripping apparatus, characterized in that the gripping apparatus comprises a memory and a processor, the memory storing a computer program, the processor realizing the steps of the method according to any of the claims 1-8 when executing the computer program.
18. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
CN202011613066.XA 2020-12-30 2020-12-30 Method, device and equipment for controlling mechanical arm and computer readable storage medium Active CN112847346B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011613066.XA CN112847346B (en) 2020-12-30 2020-12-30 Method, device and equipment for controlling mechanical arm and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011613066.XA CN112847346B (en) 2020-12-30 2020-12-30 Method, device and equipment for controlling mechanical arm and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112847346A true CN112847346A (en) 2021-05-28
CN112847346B CN112847346B (en) 2023-06-23

Family

ID=75998585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011613066.XA Active CN112847346B (en) 2020-12-30 2020-12-30 Method, device and equipment for controlling mechanical arm and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112847346B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013083818A1 (en) * 2011-12-09 2013-06-13 Commissariat A L'energie Atomique Et Aux Energies Alternatives Control method for controlling a robot and control system employing such a method
CN103753585A (en) * 2014-01-10 2014-04-30 南通大学 Method for intelligently adjusting manipulator and grasping force on basis of visual image analysis
CN104156726A (en) * 2014-08-19 2014-11-19 大连理工大学 Workpiece recognition method based on geometric shape feature and device thereof
CN108115688A (en) * 2017-12-29 2018-06-05 深圳市越疆科技有限公司 Crawl control method, system and the mechanical arm of a kind of mechanical arm
CN108453735A (en) * 2018-03-15 2018-08-28 河南大学 A kind of grasping means based on friction nanometer power generator bionic mechanical hand, device
CN110834335A (en) * 2019-11-21 2020-02-25 广东弓叶科技有限公司 Object clamping method and sorting equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
牟洪波 et al.: "Research on Wood Defect Detection Based on BP and RBF Neural Networks", 31 May 2011, Harbin Engineering University Press *
黄德双: "Theory and Application of Modern Information Technology", 31 August 2002, China Science and Technology Press *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113753562A (en) * 2021-08-24 2021-12-07 深圳市长荣科机电设备有限公司 Carrying method, system and device based on linear motor and storage medium
CN114851208A (en) * 2022-06-16 2022-08-05 梅卡曼德(北京)机器人科技有限公司 Object gripping method and system for gripping an object
CN114851208B (en) * 2022-06-16 2024-02-02 梅卡曼德(北京)机器人科技有限公司 Object gripping method and system for gripping an object

Also Published As

Publication number Publication date
CN112847346B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
CN112847346B (en) Method, device and equipment for controlling mechanical arm and computer readable storage medium
CN109726763B (en) Information asset identification method, device, equipment and medium
CN112115927B (en) Intelligent machine room equipment identification method and system based on deep learning
CN109685075A (en) A kind of power equipment recognition methods based on image, apparatus and system
CN108876857B (en) Method, system, device and storage medium for positioning unmanned vehicle
CN114093052A (en) Intelligent inspection method and system suitable for machine room management
CN111462109A (en) Defect detection method, device and equipment for strain clamp and storage medium
CN112621765A (en) Automatic equipment assembly control method and device based on manipulator
CN111882610A (en) Method for grabbing target object by service robot based on elliptical cone artificial potential field
CN109101967A (en) The recongnition of objects and localization method, terminal and storage medium of view-based access control model
CN115781673A (en) Part grabbing method, device, equipment and medium
CN115409808A (en) Weld joint recognition method and device, welding robot and storage medium
CN112847348A (en) Manipulator control method, manipulator control device, pickup apparatus, and computer-readable storage medium
CN115409809A (en) Weld joint recognition method and device, welding robot and storage medium
CN113468048B (en) System testing method, device, equipment and computer readable storage medium
CN112734700A (en) Workpiece detection method, workpiece detection device, electronic equipment and storage medium
CN113888024A (en) Operation monitoring method and device, electronic equipment and storage medium
US8639398B2 (en) Apparatus and method for automatically generating satellite operation procedure parameters
CN112975950B (en) Remote operation system and remote operation method
CN112785556A (en) Reinspection method, reinspection device, electronic equipment and computer-readable storage medium
CN112059983A (en) Method, device and computer readable medium for assembling workpiece
CN108108895A (en) Method, system, equipment and the storage medium of task status dynamic management and control
CN112720496A (en) Control method and device for manipulator, pickup device and storage medium
CN111859370A (en) Method, apparatus, electronic device and computer-readable storage medium for identifying service
CN112784687B (en) Control method, device and equipment of manipulator and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220330

Address after: Building C, No.888, Huanhu West 2nd Road, Lingang New District, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: Shenlan Intelligent Technology (Shanghai) Co.,Ltd.

Address before: 213000 No.103, building 4, Chuangyan port, Changzhou science and Education City, No.18, middle Changwu Road, Wujin District, Changzhou City, Jiangsu Province

Applicant before: SHENLAN ARTIFICIAL INTELLIGENCE CHIP RESEARCH INSTITUTE (JIANGSU) Co.,Ltd.

GR01 Patent grant