CN114770504A - Robot control method, robot control device, robot, and storage medium - Google Patents

Robot control method, robot control device, robot, and storage medium

Info

Publication number
CN114770504A
Authority
CN
China
Prior art keywords
palm
user
delivered
robot
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210447069.3A
Other languages
Chinese (zh)
Other versions
CN114770504B (en)
Inventor
夏舸
梁朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202210447069.3A priority Critical patent/CN114770504B/en
Publication of CN114770504A publication Critical patent/CN114770504A/en
Application granted granted Critical
Publication of CN114770504B publication Critical patent/CN114770504B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot control method and device, a robot, and a storage medium. The robot is provided with a storage bin in which a mechanical arm is arranged, and the method comprises the following steps: after the robot reaches the delivery destination of an article to be delivered held in the storage bin, carrying the grasped article out of the bin through the mechanical arm; detecting, through a sensor arranged in the mechanical arm, whether a user's palm is present in the environment outside the bin; when a palm is determined to be present, acquiring its position through the sensor; and controlling the mechanical arm to move to that position so as to place the grasped article on the user's palm. The robot thus delivers the article directly into the user's hand: the user only needs to extend a hand to receive it and no longer has to reach into the storage bin, which simplifies the user's pickup process, makes the delivery robot more intelligent, and improves the user's pickup experience.

Description

Robot control method, robot control device, robot, and storage medium
Technical Field
The present invention relates to the field of robot technology, and in particular, to a robot control method, apparatus, robot, and storage medium.
Background
At present, as robot technology matures and develops, robots are increasingly used in various fields to perform tasks in place of manual labor. For example, in hotels and restaurants, robots can provide intelligent goods-delivery services to customers. However, when such a robot delivers an article, the user typically has to reach into the robot's storage bin to retrieve it: after identity verification, the user must bend down and stretch a hand into the bin, which is cumbersome, time-consuming, and tiring.
Disclosure of Invention
The main purpose of the present invention is to provide a robot control method, a robot control device, a robot, and a storage medium, so as to address the technical problem of making a robot more intelligent when it delivers articles.
In order to achieve the above object, the present invention provides a robot control method, applied to a robot, the robot being provided with a storage bin in which a mechanical arm is arranged, the method comprising:
after the robot reaches the delivery destination of the article to be delivered in the storage bin, carrying the grasped article to be delivered out of the storage bin through the mechanical arm;
detecting, through a sensor arranged in the mechanical arm, whether a user's palm is present in the environment outside the storage bin;
when it is determined that the user's palm is present, acquiring the position of the user's palm through the sensor;
and controlling the mechanical arm to move to the position of the user's palm, so as to place the grasped article to be delivered on the user's palm.
Optionally, the detecting whether the palm of the user exists in the environment outside the storage bin through a sensor arranged in the mechanical arm includes:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting contour features of environmental objects from the sensor data, and comparing them with preset palm contour features;
and if the contour features of an environmental object match the preset palm contour features, determining that a user's palm is present in the environment outside the storage bin.
Optionally, after the robot reaches the delivery destination of the article to be delivered in the storage bin and before the grasped article to be delivered is carried out of the storage bin by the mechanical arm, the method further includes:
after the article to be delivered is determined, grasping and securing the article to be delivered in the storage bin through the mechanical arm.
Optionally, the grasping and securing the article to be delivered in the storage bin by the mechanical arm includes:
scanning with a graphic code scanning device arranged in the mechanical arm and determining the position of the article to be delivered in the storage bin according to the scanned graphic code of the article, or communicating with each article in the storage bin through a near field communication device arranged in the mechanical arm and determining the position of the article to be delivered according to the communication result;
and grasping and securing the article to be delivered after controlling the mechanical arm to move to its position in the storage bin.
Optionally, before detecting whether a user's palm is present in the environment outside the storage bin through a sensor disposed in the mechanical arm, the method further includes:
detecting, by the sensor, whether a pickup object is present;
if it is determined that the pickup object is present, detecting the height characteristic of the pickup object through the sensor;
and controlling the mechanical arm to carry the article to be delivered and hover at the height position corresponding to the height characteristic.
Optionally, when the sensor includes a camera device, the detecting whether the pickup object is present through the sensor includes:
acquiring environmental image data through the camera device and recognizing it to obtain identity verification information;
matching the identity verification information with the identity information of the pickup object corresponding to the article to be delivered;
and when the identity verification information matches the identity information of the pickup object, determining that the pickup object is present.
Optionally, the controlling the mechanical arm to move to the position of the user's palm so as to place the grasped article to be delivered on the user's palm includes:
controlling the mechanical arm to move to the position of the user's palm;
detecting the pressure value at each contact point through electronic skin arranged on the surface where the grasping part of the mechanical arm contacts the article to be delivered;
and when the detected pressure values conform to a preset pressure distribution state, controlling the mechanical arm to release the article to be delivered, so as to place it on the user's palm.
In order to achieve the above object, the present invention further provides a robot control device deployed in a robot, the robot being provided with a storage bin in which a mechanical arm is arranged, the device including:
the grasping module, used for carrying the grasped article to be delivered out of the storage bin through the mechanical arm after the robot reaches the delivery destination of the article to be delivered in the storage bin;
the detection module is used for detecting whether a palm of a user exists in the environment outside the storage bin through a sensor arranged in the mechanical arm;
the acquisition module is used for acquiring the position of the palm of the user through the sensor when the palm of the user is determined to exist;
and the delivery module is used for controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed object to be delivered on the palm of the user.
To achieve the above object, the present invention also provides a robot comprising: a memory, a processor and a robot control program stored on the memory and executable on the processor, the robot control program, when executed by the processor, implementing the steps of the robot control method as described above.
Furthermore, to achieve the above object, the present invention also proposes a computer-readable storage medium having stored thereon a robot control program, which when executed by a processor, implements the steps of the robot control method as described above.
According to the invention, the robot is provided with a storage bin in which a mechanical arm is arranged. After the robot reaches the delivery destination corresponding to the article to be delivered, the mechanical arm grasps the article in the bin and carries it out; a sensor arranged in the mechanical arm detects whether a user's palm is present in the external environment; when a palm is determined to be present, its position is acquired through the sensor; and the mechanical arm is controlled to move to that position so as to place the grasped article on the user's palm. The robot can thus deliver the article directly into the user's hand: the user only needs to extend a hand to receive it and does not have to reach into the storage bin, which simplifies the user's pickup process, makes the delivery robot more intelligent, and improves the user's pickup experience.
Drawings
Fig. 1 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a robot control method according to a first embodiment of the present invention;
fig. 3 is a schematic view of a scenario in which a robot arm grabs an article to be delivered according to an embodiment of the present invention;
fig. 4 is a functional block diagram of a robot control device according to a preferred embodiment of the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present invention.
It should be noted that, the device in the embodiment of the present invention may be a device with data processing capability, such as a smart phone, a personal computer, and a server, and the device may be deployed in a mobile robot, which is not limited herein. The robot is provided with a storage bin, and a mechanical arm is arranged in the storage bin.
As shown in fig. 1, the apparatus (terminal or robot) may include: a processor 1001, such as a CPU, a memory 1002, and a communication bus 1003. The communication bus 1003 is used to implement connection communication among these components. The memory 1002 may be a high-speed RAM memory or a non-volatile memory (e.g., a disk memory). The memory 1002 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration of the apparatus shown in fig. 1 is not intended to be limiting of the apparatus and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1002, which is a kind of computer storage medium, may include therein an operating system and a robot control program. An operating system is a program that manages and controls the hardware and software resources of a device, supporting the operation of robot control programs, as well as other software or programs. In the device shown in fig. 1, a processor 1001 may be used to invoke a robot control program stored in a memory 1002 and perform the following operations:
after the robot reaches the delivery destination of the article to be delivered in the storage bin, carrying the grasped article to be delivered out of the storage bin through the mechanical arm;
detecting, through a sensor arranged in the mechanical arm, whether a user's palm is present in the environment outside the storage bin;
when it is determined that the user's palm is present, acquiring the position of the user's palm through the sensor;
and controlling the mechanical arm to move to the position of the user's palm, so as to place the grasped article to be delivered on the user's palm.
Further, the detecting whether a user's palm is present in the environment outside the storage bin through a sensor disposed in the mechanical arm includes:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting contour features of environmental objects from the sensor data, and comparing them with preset palm contour features;
and if the contour features of an environmental object match the preset palm contour features, determining that a user's palm is present in the environment outside the storage bin.
Further, after the robot reaches the delivery destination of the article to be delivered in the storage bin and before the grasped article to be delivered is carried out of the storage bin by the mechanical arm, the processor 1001 may be further configured to call the robot control program stored in the memory 1002 to perform the following operations:
after the article to be delivered is determined, grasping and securing the article to be delivered in the storage bin through the mechanical arm.
Further, the grasping and securing the article to be delivered in the storage bin through the mechanical arm includes:
scanning with a graphic code scanning device arranged in the mechanical arm and determining the position of the article to be delivered in the storage bin according to the scanned graphic code of the article, or communicating with each article in the storage bin through a near field communication device arranged in the mechanical arm and determining the position of the article to be delivered according to the communication result;
and grasping and securing the article to be delivered after controlling the mechanical arm to move to its position in the storage bin.
Further, before detecting whether the user's palm is present in the environment outside the storage bin through the sensor disposed in the mechanical arm, the processor 1001 may be further configured to call the robot control program stored in the memory 1002 to perform the following operations:
detecting, by the sensor, whether a pickup object is present;
if it is determined that the pickup object is present, detecting the height characteristic of the pickup object through the sensor;
and controlling the mechanical arm to carry the article to be delivered and hover at the height position corresponding to the height characteristic.
Further, when the sensor includes a camera device, the detecting whether the pickup object is present through the sensor includes:
acquiring environmental image data through the camera device and recognizing it to obtain identity verification information;
matching the identity verification information with the identity information of the pickup object corresponding to the article to be delivered;
and when the identity verification information matches the identity information of the pickup object, determining that the pickup object is present.
Further, the controlling the mechanical arm to move to the position of the user's palm so as to place the grasped article to be delivered on the user's palm includes:
controlling the mechanical arm to move to the position of the user's palm;
detecting the pressure value at each contact point through electronic skin arranged on the surface where the grasping part of the mechanical arm contacts the article to be delivered;
and when the detected pressure values conform to a preset pressure distribution state, controlling the mechanical arm to release the article to be delivered, so as to place it on the user's palm.
Based on the above structure, various embodiments of a robot control method are proposed.
Referring to fig. 2, fig. 2 is a flowchart illustrating a robot control method according to a first embodiment of the present invention.
Although a logical order is shown in the flowchart, in some cases the steps may be performed in an order different from that shown or described here. The execution subject of each embodiment of the robot control method may be a robot; the robot may be a conventional robot controlled by an automatic control program, and the embodiments do not limit its type or implementation details. In this embodiment, the robot control method includes the following steps S10 to S40:
step S10, after the delivery destination of the article to be delivered in the storage bin is reached, the grabbed article to be delivered is sent out of the storage bin through the mechanical arm;
in this embodiment, the robot is provided with the storing storehouse for place the article that need the delivery, sets up the arm in the storing storehouse for snatch and send out the storehouse with the article in the storing storehouse outside. In a specific embodiment, the storage bin can be provided with or without a bin door, the bin door can be arranged for improving the safety of goods delivery, and the robot controls the bin door to be opened or closed according to scene requirements.
For brevity, the article that needs to be delivered is hereinafter referred to as the article to be delivered. In a specific embodiment, the article may be placed into the robot's storage bin manually by a worker, or picked up automatically by the robot in an automatic pickup area; this embodiment does not limit the manner.
The robot delivers according to the delivery destination of the article to be delivered; after reaching the destination, the robot controls the mechanical arm to grasp the article in the storage bin and carry it out. For example, in an embodiment, the target pose (position and orientation) of the mechanical arm may be determined according to the specific scene; the control parameters of each component of the arm are then calculated from the arm's current pose, the target pose, and the arm's control algorithm, and each component is driven accordingly to adjust the arm from the current pose to the target pose.
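The patent does not fix a particular arm control algorithm. As a minimal illustration of the current-pose-to-target-pose adjustment described above, the following Python sketch interpolates the arm's joint configuration toward a target configuration; the joint values and step size are hypothetical.

```python
import numpy as np

def plan_joint_trajectory(current_q, target_q, max_step=0.05):
    # Linearly interpolate joint angles (radians) from the current pose
    # toward the target pose, capping each per-joint step at max_step.
    # Returns the intermediate joint vectors to send to the drives.
    current_q = np.asarray(current_q, dtype=float)
    target_q = np.asarray(target_q, dtype=float)
    n_steps = max(int(np.ceil(np.max(np.abs(target_q - current_q)) / max_step)), 1)
    return [current_q + (target_q - current_q) * k / n_steps
            for k in range(1, n_steps + 1)]

# Hypothetical 4-DOF arm extending out of the storage bin
waypoints = plan_joint_trajectory(current_q=[0.0, 0.2, -0.1, 0.0],
                                  target_q=[0.6, 0.9, -0.4, 0.3])
```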
Further, in an embodiment, when the robot delivers several articles at once, it may maintain a correspondence between each article to be delivered and its delivery destination. In a specific embodiment, given a known article, its destination can be looked up from the correspondence and the robot then navigates there to deliver it; conversely, given a known destination, the articles bound to it can be looked up and handed over to the customer once the robot arrives.
In this embodiment, the manner in which the robot obtains the correspondence between articles and delivery destinations is not limited. For example, in one embodiment, when a worker places an article into the storage bin, the worker may enter its delivery destination; the robot recognizes the placed article through a recognition device in the bin and binds the entered destination to it. In another embodiment, when the robot picks up goods in an automatic pickup area, the mechanical arm may be controlled to extend out of the storage bin; a camera in the arm scans a graphic code attached to the article and decodes its delivery destination, or a near field communication device in the arm communicates with one in the article to obtain the destination. The robot then binds the article to the destination and grasps the article into the storage bin with the mechanical arm.
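As a minimal sketch of such a correspondence table (the article IDs and destinations are hypothetical), both lookup directions described above reduce to simple queries:

```python
# Hypothetical registry binding each recognized article ID to the delivery
# destination entered by the worker or decoded from the article's graphic
# code / NFC tag.
delivery_bindings: dict[str, str] = {}

def bind_article(article_id: str, destination: str) -> None:
    delivery_bindings[article_id] = destination

def destination_for(article_id: str) -> str:
    # Known article -> where to navigate
    return delivery_bindings[article_id]

def articles_for(destination: str) -> list[str]:
    # Known destination -> which articles to hand over on arrival
    return [a for a, d in delivery_bindings.items() if d == destination]

bind_article("pkg-0042", "room-1203")
assert destination_for("pkg-0042") == "room-1203"
```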
It should be noted that, in a specific embodiment, the robot may distinguish articles by their unique numbers, or a camera in the robot may capture images of the articles and distinguish them by image features. When articles are distinguished by unique codes, the code may be carried in the graphic code attached to the article, preset in the article's near field communication device, or entered into the robot by a worker.
It should also be noted that the robot may grasp the article to be delivered and carry it out of the storage bin only after reaching the delivery destination, or it may pre-grasp the article inside the bin before arrival so that the article can be carried out quickly once the destination is reached.
Step S20, detecting whether a user palm exists in the environment outside the storage bin through a sensor arranged in the mechanical arm;
the mechanical arm can be provided with a sensor for sensing the external environment, the sensor can specifically adopt a camera device, a radar sensor, an infrared sensor, a thermal imaging sensor and the like, and certainly, various sensors can be arranged at the same time, which is not limited in the embodiment.
The robot detects whether a user's palm is present in the environment outside the storage bin through the sensor arranged in the mechanical arm, specifically by analyzing the data collected by the sensor. In particular embodiments, the analysis differs with the type of sensor. In general, data may be collected in advance by the sensor while a user's palm is present in the environment; the collected data is analyzed to extract and store data features related to the palm, for example contour features. When the robot later needs to determine whether a palm is present, it compares the features extracted from the current sensor data against the stored palm features; if they match, it determines that a user's palm is present.
Step S30, when the palm of the user is determined to exist, acquiring the position of the palm of the user through the sensor;
when the user palm exists in the external environment, the robot can acquire the position of the user palm through the sensor, and the position can be represented by data such as the distance and the direction of the user palm relative to the mechanical arm. In a specific embodiment, when the data collected by the sensor is analyzed to determine that the palm of the user exists, the position of the palm of the user can be determined according to the data in the analysis process, or when the palm of the user is determined to exist, the position of the palm of the user can be obtained by analyzing the data collected by the sensor.
In a specific embodiment, the method of determining the palm position from the sensor data also differs with the sensor type. Specifically, when the features extracted from the sensor data match the pre-stored palm features, the palm position is determined from the position information carried by the sensor data corresponding to the matching features.
It will be appreciated that when a user's palm is determined to be present in the external environment, this indicates that the user intends to reach out and receive the article to be delivered.
Further, in an embodiment, when it is determined that no user's palm is present in the external environment, the robot may adjust the position of the mechanical arm or its own overall orientation, and then collect data again through the sensor in the arm to determine whether a palm is present.
Step S40: controlling the mechanical arm to move to the position of the user's palm so as to place the grasped article to be delivered on the user's palm.
After determining the position of the user's palm, the robot controls the mechanical arm to move there so as to place the grasped article to be delivered on the palm. For the specific embodiment of moving the mechanical arm to the palm position, refer to the embodiment of controlling the arm to carry the article out of the storage bin, which is not repeated here.
In particular embodiments, the mechanical arm may release the article once it detects that the user is holding it, thereby handing the article over to the user. For example, when a confirmation instruction entered by the user is detected (through a touch screen, a touch key, voice, or the like), it may be determined that the user is holding the article to be delivered.
It can be understood that in this embodiment a storage bin is arranged on the robot and a mechanical arm is arranged in the bin. After the robot reaches the delivery destination corresponding to the article to be delivered, the arm grasps the article and carries it out of the bin; a sensor in the arm detects whether a user's palm is present in the external environment; when a palm is determined to be present, its position is acquired through the sensor; and the arm is controlled to move to that position so as to place the grasped article on the palm. The robot thus delivers the article directly into the user's hand: the user only needs to extend a hand to receive it and does not have to reach into the storage bin, which simplifies the pickup process, makes the delivery robot more intelligent, and improves the pickup experience.
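Tying steps S10 to S40 together, a high-level orchestration could look like the sketch below. Every primitive it calls (navigate, grasp, detect_palm, and so on) is a hypothetical placeholder for the mechanisms described in this embodiment, not an interface defined by the patent.

```python
def deliver(robot, article_id, destination):
    # S10: drive to the destination, grasp the article, carry it out
    robot.navigate(destination)
    robot.arm.grasp(article_id)
    robot.arm.extend_out_of_bin()

    # S20/S30: look for a palm; re-orient and retry if none is found
    palm_position = robot.arm.sensor.detect_palm()
    while palm_position is None:
        robot.adjust_heading()
        palm_position = robot.arm.sensor.detect_palm()

    # S40: move to the palm and release once the user is holding the article
    robot.arm.move_to(palm_position)
    robot.arm.release_when_held()
```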
Further, based on the first embodiment described above, a second embodiment of the robot control method according to the present invention is proposed, in this embodiment, the step S20 of detecting whether the palm of the user exists in the environment outside the storage compartment by a sensor provided in the robot arm includes steps S201 to S203:
step S201, acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
in this embodiment, a sensor is provided in the robot arm, and the sensor may be a depth camera and/or a radar sensor. The image data shot by the depth camera comprises the positions of the pixel points and the distance between the pixel points and the camera relative to the camera, and the point cloud data measured by the radar sensor comprises the distance between the pixel points and the radar sensor in space and the positions of the pixel points and the distance between the pixel points and the radar sensor relative to the radar sensor.
The robot collects data through the sensor, referred to as sensor data; the sensor data includes image data when the sensor includes a depth camera, and point cloud data when the sensor includes a radar sensor.
Step S202, extracting contour features of an environment object from the sensor data, and comparing the contour features of the environment object with preset contour features of a palm;
the robot may extract contour features (hereinafter, referred to as environment contour features) from the sensor data to distinguish them. There are many ways to extract contour features from the image data of the depth camera, and this embodiment is not limited thereto. There are many methods for extracting the contour features from the point cloud data of the radar sensor, and the method is not limited in this embodiment. The robot may have palm profile features preset therein. The palm profile feature may be obtained by analyzing sensor data collected by the sensor when a palm of a user exists in an external environment. The robot compares the contour features of the environmental object with the contour features of the palm.
Step S203, if the environmental object outline characteristics are consistent with the palm outline characteristics in comparison, determining that a user palm exists in the environment outside the storage bin.
If an environmental contour feature matches the palm contour feature, it can be determined that a user's palm is present in the environment outside the storage bin. It should be noted that, because obstacles other than the palm may exist outside the bin, the environmental contour features also include the contours of those obstacles; it suffices that part of the environmental contour features matches the palm contour feature to determine that a palm is present.
Further, in a specific embodiment, when no environmental contour feature matches the palm contour feature, it may be determined that no user's palm is present outside the storage bin, or another detection method, such as image matching, may additionally be used to detect the palm.
Further, in an embodiment, when an environmental contour feature matches the palm contour feature, the matching part can be extracted from the environmental contour features, the distance and bearing information corresponding to that part can be read from the sensor data to determine the distance and bearing of the palm relative to the sensor, and these can be converted into the distance and bearing of the palm relative to the mechanical arm, giving the position of the user's palm.
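As an illustration of the depth-camera variant, the sketch below segments a depth image into candidate contours and compares each one against a pre-recorded palm contour using OpenCV's Hu-moment shape matching; the template file, depth band, minimum area, and match threshold are all assumed values.

```python
import cv2
import numpy as np

PALM_TEMPLATE = np.load("palm_contour.npy")  # assumed pre-recorded palm contour
MATCH_THRESHOLD = 0.15                       # assumed tuning value

def find_palm(depth_mm, near_mm=300, far_mm=1200):
    # Keep only points inside the arm's reachable depth band
    mask = ((depth_mm > near_mm) & (depth_mm < far_mm)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 500:  # skip small blobs and noise
            continue
        # Hu-moment comparison: a lower score means a closer shape match
        score = cv2.matchShapes(c, PALM_TEMPLATE, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < MATCH_THRESHOLD:
            x, y, w, h = cv2.boundingRect(c)
            cx, cy = x + w // 2, y + h // 2
            dist_mm = float(depth_mm[cy, cx])  # range read back from the depth map
            return cx, cy, dist_mm             # pixel centre plus distance
    return None
```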
Further, in an embodiment, the robot may instead use an ordinary camera: it captures an image, matches it against preset user palm images, and determines from the matching result whether a user's palm is present outside the storage bin. The user palm images are images of palms taken in advance, of different people, from different angles, and at different distances. The captured image is matched against each palm image; if it matches one of them, a palm is determined to be present, otherwise not. Further, when the match succeeds, the matching sub-image can be extracted from the captured image; the bearing of the palm relative to the mechanical arm can be converted from the position of the sub-image within the whole image, and the distance from the arm can be converted from the area the sub-image occupies, giving the position of the user's palm.
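A sketch of this ordinary-camera variant using multi-scale template matching; the field of view, the area-to-distance calibration constants, and the acceptance threshold are assumed, and the conversions mirror the position-to-bearing and area-to-distance reasoning above.

```python
import cv2
import numpy as np

HFOV_DEG = 60.0                    # assumed horizontal field of view
REF_AREA, REF_DIST = 9000.0, 0.5   # assumed calibration: palm area (px^2) at 0.5 m
MATCH_THRESHOLD = 0.7              # assumed acceptance threshold

def locate_palm(frame_gray, palm_template):
    best = None
    for scale in np.linspace(0.5, 1.5, 11):   # search over apparent sizes
        tmpl = cv2.resize(palm_template, None, fx=scale, fy=scale)
        if tmpl.shape[0] >= frame_gray.shape[0] or tmpl.shape[1] >= frame_gray.shape[1]:
            continue
        res = cv2.matchTemplate(frame_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        if best is None or max_val > best[0]:
            best = (max_val, max_loc, tmpl.shape[:2])
    if best is None or best[0] < MATCH_THRESHOLD:
        return None
    _, (x, y), (th, tw) = best
    cx = x + tw / 2.0
    # Horizontal offset from the image centre -> bearing relative to the camera
    bearing_deg = (cx / frame_gray.shape[1] - 0.5) * HFOV_DEG
    # Pinhole model: apparent area falls off with the square of distance
    distance_m = REF_DIST * np.sqrt(REF_AREA / (tw * th))
    return bearing_deg, distance_m
```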
Further, in an embodiment, in step S10, after the robot reaches the delivery destination of the article to be delivered in the storage bin and before the grasped article is carried out of the bin by the mechanical arm, the method further includes:
and step S50, after the articles to be delivered are determined, the articles to be delivered are grabbed and fixed from the storage bin through the mechanical arm.
After the robot determines the article to be delivered, it can grasp and secure the article in the storage bin through the mechanical arm, achieving a pre-grasping effect. While the robot travels, the arm keeps the article secured, ensuring that the article is not affected by jolting during the robot's movement and remains safe throughout the delivery.
Further, in an embodiment, step S50, grasping and securing the article to be delivered in the storage bin by the mechanical arm after the article is determined, includes:
step S501, scanning is carried out through a graphic code scanning device arranged in the mechanical arm, the position of the article to be delivered in the storage bin is determined according to the scanned graphic code of the article to be delivered, or the position of the article to be delivered in the storage bin is determined according to the communication result through a near field communication device arranged in the mechanical arm and communication with each article in the storage bin;
in this embodiment, when a plurality of articles are placed in the storage bin, the mechanical arm needs to identify the articles to be dispensed and grab the articles. In one embodiment, the graphic code can be given to the article by sticking or other methods, the graphic code carries identification information of the article for distinguishing other articles, and a graphic code scanning device can be arranged in the mechanical arm; the graphic code scanning device is started in the storage bin through the mechanical arm for scanning, which graphic code is the graphic code of the article to be delivered is determined according to the information carried in the scanned graphic code, and the position of the article to be delivered in the storage bin can be determined according to the position of the graphic code.
In another embodiment, a near field communication device may be placed in each article and in the mechanical arm, with each article's device carrying the article's identification information. The arm communicates inside the storage bin through its near field communication device, determines which signal comes from the article to be delivered, and determines the article's position in the bin from the origin of that signal.
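For the graphic-code variant, a sketch using OpenCV's built-in QR detector; it assumes the article's identification information is the QR payload and uses the code's pixel position in the compartment image as a proxy for the article's location in the bin.

```python
import cv2

detector = cv2.QRCodeDetector()

def locate_article_by_code(bin_image, wanted_id):
    # Detect and decode every QR code visible in the storage-bin image
    ok, payloads, points, _ = detector.detectAndDecodeMulti(bin_image)
    if not ok:
        return None
    for payload, quad in zip(payloads, points):
        if payload == wanted_id:        # ID embedded in the graphic code
            cx, cy = quad.mean(axis=0)  # centre of the code's four corners
            return float(cx), float(cy)
    return None
```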
Step S502: the mechanical arm is controlled to move to the position of the article to be delivered in the storage bin and then grasp and secure the article.
After determining the position of the article to be delivered in the storage bin, the robot controls the mechanical arm to move there and grasp and secure the article.
Further, in an embodiment, the robot may capture images of the articles in the storage bin through a camera arranged in the bin or on the mechanical arm and display them on its screen so that a user can select an article in the image; the robot treats the selected article as the article to be delivered and converts the article's position in the bin from its position in the image.
Further, based on the first and/or second embodiment, a third embodiment of the robot control method of the present invention is provided. In this embodiment, before step S20 of detecting whether a user's palm is present in the environment outside the storage bin through the sensor in the mechanical arm, the method further includes steps S60 to S80:
step S60, detecting whether or not there is a pickup object by the sensor;
the robot may detect the presence or absence of the pickup object by a sensor provided in the robot arm. In the specific embodiment, the condition for judging whether the goods taking object exists can be set according to the requirement, and when different judging conditions are set, the adopted detection method is different. For example, in an embodiment, when it is set that only a user appearing outside the storage compartment is about to be a pickup object, whether a person exists in an environment outside the storage compartment may be determined by analyzing data collected by the sensor, and the specific detection manner may refer to the specific embodiment of detecting whether a palm of the user exists according to the sensor, which is not described herein again.
Step S70: if the pickup object is determined to be present, the height characteristic of the pickup object is detected through the sensor;
if the goods taking object is determined to exist, the robot can detect the height characteristic of the goods taking object through the sensor. The height characteristics can be expressed by the height value of the pickup object or other data capable of representing the height of the pickup object, for example, people of different ages have different heights, and the height characteristics can also be expressed by the age of the pickup object.
When the height characteristic is expressed by a height value, the embodiment of detecting it through the sensor can refer to that of detecting the palm position: after the distance and bearing of the pickup object's head and feet relative to the mechanical arm are detected, the height value can be obtained by conversion.
When the height characteristic is expressed by age, the sensor can be a camera device, and a classification of the pickup object's age can be obtained by classifying the image data captured by the camera.
Step S80: controlling the mechanical arm to carry the article to be delivered and hover at the height position corresponding to the height characteristic.
After detecting the pickup object's height characteristic, the robot controls the mechanical arm to carry the article to be delivered and hover at the height position corresponding to that characteristic. Height positions corresponding to various characteristics, expressed relative to the arm's coordinate system, may be preset in the robot and can be chosen so that the arm hovers at about chest height of the pickup object. By hovering the article at a height matched to the pickup object, the robot makes its intention of handing over the article obvious, and the arm can then locate the user's palm and complete the delivery quickly.
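A sketch of the height-to-hover mapping; the chest-level ratio, the reachable band, and the age table are assumed tuning values rather than figures from the patent.

```python
CHEST_RATIO = 0.72  # assumed fraction of body height for a comfortable handover

def hover_height(person_height_m, min_h=0.6, max_h=1.4):
    # Clamp the chest-level estimate to the arm's reachable band (metres)
    return min(max(person_height_m * CHEST_RATIO, min_h), max_h)

def hover_height_by_age(age_years):
    # Coarse fallback when only an age class is available (assumed table)
    if age_years < 12:
        return 0.8
    if age_years < 60:
        return 1.2
    return 1.0
```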
Further, in one embodiment, step S60, detecting whether the pickup object is present through the sensor, includes:
step S601, acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
in the present embodiment, the sensor provided in the robot arm may include a camera. The robot may acquire image data (hereinafter, referred to as environment image data) by using an imaging device, and identify the environment image data to obtain authentication information. The identity authentication information may be face information obtained through recognition, or pickup information in a verification graphic code presented by a user through recognition, and is not limited in this embodiment.
Step S602: matching the identity verification information with the identity information of the pickup object corresponding to the article to be delivered;
the robot may obtain the identification information of the pickup object of the article to be delivered in advance, specifically, the identification information may be input by a worker, or may be obtained by identifying a graphic code of the article to be delivered by the robot, or obtained in other manners, which is not limited in this embodiment.
The robot matches the identity verification information against the identity information of the article's pickup object to determine whether the verification information belongs to the pickup object.
Step S603: when the identity verification information matches the identity information of the pickup object, determining that the pickup object is present.
When the identity verification information matches the identity information of the pickup object, the robot determines that the pickup object is present and can proceed with the subsequent delivery of the article.
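As one possible realization of the face-information match, a sketch comparing face embeddings by cosine similarity; the embedding extractor (any face-recognition model) and the threshold are assumed, since the patent does not prescribe a recognition method.

```python
import numpy as np

SIM_THRESHOLD = 0.6  # assumed cosine-similarity acceptance threshold

def is_pickup_object(probe_embedding, enrolled_embedding):
    # Compare the embedding extracted from the camera frame against the
    # embedding enrolled for the article's pickup object
    a = probe_embedding / np.linalg.norm(probe_embedding)
    b = enrolled_embedding / np.linalg.norm(enrolled_embedding)
    return float(np.dot(a, b)) >= SIM_THRESHOLD
```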
Further, in one embodiment, the robot may determine that the pickup object is absent when the verification information does not match. In that case, the robot may adjust the position of the mechanical arm or its own orientation, and analyze image data captured from other angles by the camera in the arm to determine whether the pickup object is present.
By acquiring identity verification information, matching it against the pickup object's identity, and delivering only when they match, the accuracy and safety of delivery are improved and misdelivery is avoided.
Further, in an embodiment, step S40, controlling the mechanical arm to move to the position of the user's palm so as to place the grasped article to be delivered on the user's palm, includes:
step S401, controlling the mechanical arm to move to the position where the palm of the user is located;
s402, detecting the pressure value of each contact point through the electronic skin arranged on the contact surface of the mechanical arm grabbing part and the object to be dispensed;
in this embodiment, the electronic skin may be provided on a contact surface between the grasping member of the robot arm and the article, and whether or not the article to be dispensed is dragged by a person may be determined based on the pressure value of each contact point detected by the electronic skin.
The electronic skin can be implemented with multiple pressure sensors, each detecting the pressure applied to it. After the robot moves the mechanical arm to the position of the user's palm, it detects the pressure value at each contact point through the electronic skin.
Step S403, when it is detected that each pressure value meets a preset pressure value distribution state, controlling the mechanical arm to release the article to be delivered, so as to place the grasped article to be delivered on the palm of the user.
The robot checks whether the pressure values at the contact points conform to a preset pressure distribution state. The preset state specifies the pressure value, or range of values, each contact point should exhibit while the article is being pulled; it can be obtained in a laboratory phase by having the arm grasp an article while a person or test tool pulls on it, measuring the contact-point pressures under that condition, and setting the preset distribution from the measurements.
When the detected pressure values conform to the preset distribution, the robot controls the arm to release the article, placing it in the user's palm so the user can take it away.
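A sketch of the pressure-distribution check and the resulting release; the contact-point names and pressure bands are assumed placeholders for the lab-calibrated preset state described above.

```python
import time

# Assumed reference: per-contact-point pressure bands (kPa) measured in the
# lab while a tester tugged on a grasped article.
EXPECTED_RANGE = {"fingertip_left": (12.0, 30.0),
                  "fingertip_right": (12.0, 30.0),
                  "palm_base": (4.0, 15.0)}

def user_is_holding(readings):
    # True when every electronic-skin contact point falls inside its
    # preset "being pulled" pressure band
    return all(lo <= readings.get(name, 0.0) <= hi
               for name, (lo, hi) in EXPECTED_RANGE.items())

def release_when_held(arm, poll_skin):
    # Poll the electronic skin and let go once the pull pattern appears
    while not user_is_holding(poll_skin()):
        time.sleep(0.02)
    arm.open_gripper()
```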
In addition, an embodiment of the present invention further provides a robot control device deployed in a robot, the robot being provided with a storage bin in which a mechanical arm is arranged; with reference to fig. 4, the device includes:
the grasping module 10, used for carrying the grasped article to be delivered out of the storage bin through the mechanical arm after the robot reaches the delivery destination of the article to be delivered in the storage bin;
the detection module 20 is configured to detect whether a palm of a user exists in the environment outside the storage bin through a sensor arranged in the mechanical arm;
an obtaining module 30, configured to obtain, through the sensor, a position of the user palm when it is determined that the user palm exists;
and the delivery module 40 is used for controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed object to be delivered on the palm of the user.
Further, the detection module 20 is further configured to:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting contour features of environmental objects from the sensor data, and comparing them with preset palm contour features;
and if the contour features of an environmental object match the preset palm contour features, determining that a user's palm is present in the environment outside the storage bin.
Further, the grabbing module 10 is further configured to:
after the article to be delivered is determined, grasping and securing the article to be delivered in the storage bin through the mechanical arm.
Further, the grasping module 10 is further configured to:
scanning with a graphic code scanning device arranged in the mechanical arm and determining the position of the article to be delivered in the storage bin according to the scanned graphic code of the article, or communicating with each article in the storage bin through a near field communication device arranged in the mechanical arm and determining the position of the article to be delivered according to the communication result;
and grasping and securing the article to be delivered after controlling the mechanical arm to move to its position in the storage bin.
Further, the detection module 20 is further configured to:
detecting, by the sensor, whether a pickup object is present;
if it is determined that the pickup object is present, detecting the height characteristic of the pickup object through the sensor;
the delivery module 40 is further configured to control the mechanical arm to carry the article to be delivered and hover at the height position corresponding to the height characteristic.
Further, when the sensor comprises an image capturing device, the detecting module 20 is further configured to:
acquiring environmental image data through the camera device and recognizing it to obtain identity verification information;
matching the identity verification information with the identity information of the pickup object corresponding to the article to be delivered;
and when the identity verification information matches the identity information of the pickup object, determining that the pickup object is present.
Further, the delivery module 40 is further configured to:
controlling the mechanical arm to move to the position of the palm of the user;
detecting the pressure value at each contact point through electronic skin arranged on the surface where the grasping part of the mechanical arm contacts the article to be delivered;
and when the detected pressure values conform to a preset pressure distribution state, controlling the mechanical arm to release the article to be delivered, so as to place it on the user's palm.
The specific embodiments of the robot control device of the present invention are substantially the same as the embodiments of the robot control method described above and are not repeated here.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium on which a robot control program is stored; when executed by a processor, the robot control program implements the steps of the robot control method described above.
The embodiments of the robot and the computer-readable storage medium of the present invention can refer to the embodiments of the robot control method of the present invention, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are only for description, and do not represent the advantages and disadvantages of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present invention or portions thereof contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all equivalent structures or equivalent processes performed by the present invention or directly or indirectly applied to other related technical fields are also included in the scope of the present invention.

Claims (10)

1. A robot control method, applied to a robot, characterized in that the robot is provided with a storage bin, a mechanical arm is arranged in the storage bin, and the method comprises the following steps:
after the robot reaches the delivery destination of the article to be delivered in the storage bin, carrying the grasped article to be delivered out of the storage bin through the mechanical arm;
detecting, through a sensor arranged in the mechanical arm, whether a user's palm is present in the environment outside the storage bin;
when it is determined that the user's palm is present, acquiring the position of the user's palm through the sensor;
and controlling the mechanical arm to move to the position of the user's palm, so as to place the grasped article to be delivered on the user's palm.
2. The robot control method of claim 1, wherein the detecting whether a user's palm is present in the environment outside the storage bin through a sensor disposed in the mechanical arm comprises:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting contour features of environmental objects from the sensor data, and comparing the contour features of the environmental objects with preset palm contour features;
and if the contour features of an environmental object match the preset palm contour features, determining that the palm of the user exists in the environment outside the storage bin.
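As an illustration of the contour comparison in claim 2 (again, not part of the claims), the sketch below applies OpenCV shape matching to a depth frame; the threshold value and dissimilarity bound are assumed values, and the reference palm contour is presumed to have been extracted offline.

    import cv2
    import numpy as np

    def palm_present(depth_frame: np.ndarray, palm_contour,
                     max_dissimilarity: float = 0.15) -> bool:
        # Segment foreground objects from an 8-bit depth image.
        _, mask = cv2.threshold(depth_frame, 50, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            # cv2.matchShapes returns 0 for identical shapes, so a small
            # score means the environmental contour resembles the palm.
            score = cv2.matchShapes(contour, palm_contour,
                                    cv2.CONTOURS_MATCH_I1, 0.0)
            if score < max_dissimilarity:
                return True
        return False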
3. The robot control method according to claim 1, wherein before the grabbed article to be delivered is carried out of the storage bin through the mechanical arm after reaching the delivery destination of the article to be delivered in the storage bin, the method further comprises:
after the article to be delivered is determined, grabbing and fixing the article to be delivered from the storage bin through the mechanical arm.
4. The robot control method according to claim 3, wherein the grabbing and fixing of the article to be delivered from the storage bin through the mechanical arm comprises:
scanning through a graphic code scanning device arranged in the mechanical arm and determining the position of the article to be delivered in the storage bin according to the scanned graphic code of the article to be delivered; or communicating with each article in the storage bin through a near field communication device arranged in the mechanical arm and determining the position of the article to be delivered in the storage bin according to the communication result;
and controlling the mechanical arm to move to the position of the article to be delivered in the storage bin and then grabbing and fixing the article to be delivered.
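For the graphic-code branch of claim 4, a sketch along the following lines is conceivable (illustrative only); it assumes the pyzbar library for decoding, and the slot-position table is a hypothetical stand-in for the bin layout.

    from pyzbar import pyzbar

    def locate_article(bin_image, article_code: str, slot_positions: dict):
        # Decode every graphic code visible in a camera image of the bin.
        for code in pyzbar.decode(bin_image):
            if code.data.decode("utf-8") == article_code:
                # code.rect is the code's pixel region; a real system
                # would map this region to a physical bin position.
                return slot_positions.get((code.rect.left, code.rect.top))
        return None  # article not found by scanning

The near field communication branch would follow the same pattern, replacing the decode loop with per-slot NFC reads and matching on the returned tag identifier.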
5. The robot control method according to claim 1, wherein before detecting whether a palm of a user exists in the environment outside the storage bin through the sensor arranged in the mechanical arm, the method further comprises:
detecting, by the sensor, whether a pickup object is present;
if it is determined that the pickup object is present, detecting a height feature of the pickup object through the sensor;
and controlling the mechanical arm to carry the article to be delivered and hover at a height position corresponding to the height feature.
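A worked example of the height adaptation in claim 5 (illustrative only; the ratio and clamping range are assumptions, not values from the patent):

    def hover_height(person_height_m: float, ratio: float = 0.6,
                     min_h: float = 0.5, max_h: float = 1.4) -> float:
        # Hover the article at a comfortable fraction of the detected
        # person height, clamped to the arm's reachable range.
        return min(max(person_height_m * ratio, min_h), max_h)

For a detected height of 1.75 m this yields a hover height of 1.05 m.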
6. The robot control method according to claim 5, wherein, when the sensor comprises a camera device, the detecting, by the sensor, whether the pickup object is present comprises:
acquiring environment image data through the camera device, and recognizing the environment image data to obtain identity verification information;
matching the identity verification information with the identity information of the pickup object corresponding to the article to be delivered;
and when the identity verification information matches the identity information of the pickup object, determining that the pickup object is present.
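One way to realize the identity check of claim 6 is face matching. The sketch below (illustrative only) uses the open-source face_recognition library; the enrolled encoding is assumed to come from the order record of the article to be delivered.

    import face_recognition

    def pickup_object_present(camera_frame, enrolled_encoding) -> bool:
        # Extract identity verification information (a face encoding)
        # from the environment image data.
        encodings = face_recognition.face_encodings(camera_frame)
        if not encodings:
            return False  # no face visible in the environment image
        # Match it against the identity information of the pickup object.
        match = face_recognition.compare_faces([enrolled_encoding],
                                               encodings[0])
        return bool(match[0])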
7. The robot control method according to any one of claims 1 to 6, wherein the controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed article to be delivered on the palm of the user comprises:
controlling the mechanical arm to move to the position of the palm of the user;
detecting the pressure value at each contact point through an electronic skin arranged on the contact surface between the grabbing part of the mechanical arm and the article to be delivered;
and when it is detected that the pressure values conform to a preset pressure value distribution state, controlling the mechanical arm to release the article to be delivered so as to place the grabbed article to be delivered on the palm of the user.
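The release condition of claim 7 can be pictured as follows (illustrative only; the tolerance is an assumed value, and the preset distribution would be calibrated per article):

    def should_release(pressures, preset_distribution,
                       tolerance: float = 0.5) -> bool:
        # The electronic skin reports one pressure value per contact
        # point; release only when every reading is within tolerance
        # (in the same units, e.g. newtons) of the preset distribution,
        # indicating that the user's palm is bearing the article's weight.
        if len(pressures) != len(preset_distribution):
            return False
        return all(abs(p - q) <= tolerance
                   for p, q in zip(pressures, preset_distribution))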
8. A robot control device, the device being disposed in a robot, characterized in that the robot is provided with a storage bin, a mechanical arm is arranged in the storage bin, and the device comprises:
a grabbing module, configured to carry the grabbed article to be delivered out of the storage bin through the mechanical arm after the delivery destination of the article to be delivered in the storage bin is reached;
a detection module, configured to detect whether a palm of a user exists in the environment outside the storage bin through a sensor arranged in the mechanical arm;
an acquisition module, configured to acquire the position of the palm of the user through the sensor when it is determined that the palm of the user exists;
and a delivery module, configured to control the mechanical arm to move to the position of the palm of the user so as to place the grabbed article to be delivered on the palm of the user.
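Illustratively (not part of the claims), the module decomposition of claim 8 could be organized as below; all class and method names are hypothetical.

    class RobotControlDevice:
        # Each attribute mirrors one module of claim 8.
        def __init__(self, grabbing, detection, acquisition, delivery):
            self.grabbing_module = grabbing        # carries the article out of the bin
            self.detection_module = detection      # detects a palm outside the bin
            self.acquisition_module = acquisition  # reads the palm position
            self.delivery_module = delivery        # moves the arm and releases

        def run(self):
            self.grabbing_module.carry_out_of_bin()
            if self.detection_module.palm_exists():
                position = self.acquisition_module.palm_position()
                self.delivery_module.place_at(position)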
9. A robot, characterized in that the robot comprises: a memory, a processor, and a robot control program stored in the memory and executable on the processor, wherein the robot control program, when executed by the processor, implements the steps of the robot control method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a robot control program is stored thereon, and the robot control program, when executed by a processor, implements the steps of the robot control method according to any one of claims 1 to 7.
CN202210447069.3A 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium Active CN114770504B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210447069.3A CN114770504B (en) 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN114770504A true CN114770504A (en) 2022-07-22
CN114770504B CN114770504B (en) 2024-01-30

Family

ID=82433904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210447069.3A Active CN114770504B (en) 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN114770504B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070152619A1 (en) * 2005-12-12 2007-07-05 Honda Motor Co., Ltd. Autonomous mobile robot and goods carrying method of using the same
US20080051933A1 (en) * 2006-08-23 2008-02-28 Provision Interactive Technologies, Inc Vending machine having aerial display system
CN201044114Y (en) * 2006-08-23 2008-04-02 浦比俊引特艾克堤夫科技公司 Automatic sale machine with midair display system
JP2013146389A (en) * 2012-01-19 2013-08-01 Panasonic Corp Hand drying apparatus
US20160089782A1 (en) * 2014-09-30 2016-03-31 Toyota Jidosha Kabushiki Kaisha Robotic handover system natural for humans
CA2997849A1 (en) * 2017-03-09 2018-09-09 Memic Innovative Surgery Ltd. Control console for surgical device with mechanical arms
CN113867398A (en) * 2017-04-28 2021-12-31 深圳市大疆创新科技有限公司 Control method for palm landing of unmanned aerial vehicle and unmanned aerial vehicle
WO2018127880A2 (en) * 2018-03-14 2018-07-12 Logic Studio Method and apparatus for giving and receiving objects
CN108711086A (en) * 2018-05-09 2018-10-26 连云港伍江数码科技有限公司 Man-machine interaction method, device, article-storage device and storage medium in article-storage device
CN110271800A (en) * 2019-03-14 2019-09-24 金树玉 A kind of cargo collator and its working method for intelligent storage
CN111993978A (en) * 2020-07-14 2020-11-27 嘉善新石器智牛科技有限公司 Automatic driving vending cart and man-machine interaction method thereof
CN114004329A (en) * 2020-07-28 2022-02-01 辉达公司 Machine learning control of object hand-off
CN112036644A (en) * 2020-09-01 2020-12-04 北京京东振世信息技术有限公司 Method and apparatus for distributing courier boxes
CN112276956A (en) * 2020-10-30 2021-01-29 北京市商汤科技开发有限公司 Article distribution method, device and equipment and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115922712A (en) * 2022-12-02 2023-04-07 深圳优地科技有限公司 Robot distribution method and robot

Also Published As

Publication number Publication date
CN114770504B (en) 2024-01-30

Similar Documents

Publication Publication Date Title
EP2585256B1 (en) Method for the selection of physical objects in a robot system
JP6529302B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
JP3834297B2 (en) Image processing device
CN111590611B (en) Article classification and recovery method based on multi-mode active perception
US7734062B2 (en) Action recognition apparatus and apparatus for recognizing attitude of object
US11232589B2 (en) Object recognition device and object recognition method
KR102081139B1 (en) Object peaking system, object detecting device and method thereof
JP2013541775A (en) Filtering method of target object image in robot system
CN114770504B (en) Robot control method, device, robot and storage medium
US9361695B2 (en) Method of recognizing a position of a workpiece from a photographed image
CN114029243A (en) Soft object grabbing and identifying method for sorting robot hand
US10562187B2 (en) Robot, robot control device, and robot system
CN113927601B (en) Method and system for realizing precise picking of mechanical arm based on visual recognition
CN111783509A (en) Automatic settlement method, device, system and storage medium
CN110026976A Robot capable of taking an elevator up and down, and method for sending and receiving articles using the robot
CN112368724A (en) Learning device, learning system, and learning method
CN111476840B (en) Target positioning method, device, equipment and computer readable storage medium
JP2016146188A (en) Information processor, information processing method and computer program
JP6041710B2 (en) Image recognition method
KR102317041B1 (en) Gripping System of object and method thereof
CN113284129B (en) 3D bounding box-based press box detection method and device
KR101669850B1 (en) Sensor Calibration Method and Electronic Device and Marker Board for the same
CN113842209A (en) Ultrasound apparatus control method, ultrasound apparatus, and computer-readable storage medium
CN112022003A (en) Sweeping robot, control method and device thereof, and computer-readable storage medium
JP6026365B2 (en) Image recognition method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant