CN116237930A - Goods taking guiding method and device and electronic equipment - Google Patents

Goods taking guiding method and device and electronic equipment

Info

Publication number
CN116237930A
CN116237930A (application CN202211671962.0A)
Authority
CN
China
Prior art keywords
goods
target
received
mode
goods taking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211671962.0A
Other languages
Chinese (zh)
Inventor
赵敏
周竹青
孟宪鹏
赵慧斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202211671962.0A priority Critical patent/CN116237930A/en
Publication of CN116237930A publication Critical patent/CN116237930A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083Shipping

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Development Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)

Abstract

The disclosure provides a goods taking guiding method, a goods taking guiding device, and electronic equipment, relating to the technical field of artificial intelligence, and in particular to the technical fields of the Internet of Things, computer vision, natural language processing, and speech. The specific implementation scheme is as follows: when the robot arrives within a preset range of the address to be delivered, it interacts with the object to be received corresponding to that address to obtain a target goods taking mode selected by the object to be received from at least one goods taking mode, together with the target goods taking object under that mode; the target goods taking object is then guided, according to the target goods taking mode, to take the goods to be delivered corresponding to the address. In this way, when it is inconvenient for the object to be received to take the goods, an assistant object or a help-seeking object can perform the pickup, ensuring that the goods are delivered successfully, avoiding staff having to arrange a second delivery, and improving delivery efficiency.

Description

Goods taking guiding method and device and electronic equipment
Technical Field
The disclosure relates to the technical field of artificial intelligence, in particular to the technical fields of internet of things, computer vision, natural language processing and voice, and particularly relates to a goods taking guiding method, a goods taking guiding device and electronic equipment.
Background
At present, after the robot arrives at the address to be delivered, it notifies the object to be received to take the goods. During pickup, for example, when the verification code entered by the object to be received is correct, the hatch door containing the goods is opened so that the object to be received can take them. If the goods are not taken, the robot sends multiple notifications; if they are still not taken after repeated notifications, the robot returns along its original path.
In this scheme, if the object to be received does not take the goods, the robot must send multiple notifications and wait a long time; after it returns along the original path, staff must arrange a second delivery, so delivery efficiency is poor.
Disclosure of Invention
The disclosure provides a goods taking guiding method, a goods taking guiding device and electronic equipment.
According to an aspect of the present disclosure, there is provided a goods taking guiding method applied to a robot, the method including: when the robot arrives within a preset range of the address to be delivered, interacting with the object to be received corresponding to the address to be delivered, and acquiring a target goods taking mode selected by the object to be received from at least one goods taking mode, and the target goods taking object under the target goods taking mode; and guiding the target goods taking object to take the goods to be delivered corresponding to the address to be delivered according to the target goods taking mode.
According to another aspect of the present disclosure, there is provided a goods taking guiding device applied to a robot, the device including: an acquisition module, used to interact with the object to be received corresponding to the address to be delivered when the robot arrives within a preset range of that address, and to acquire a target goods taking mode selected by the object to be received from at least one goods taking mode, and the target goods taking object under the target goods taking mode; and a guiding module, used to guide the target goods taking object to take the goods to be delivered corresponding to the address to be delivered according to the target goods taking mode.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the goods taking guiding method set forth above in the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the goods taking guiding method set forth above in the present disclosure.
According to another aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the steps of the goods taking guiding method set forth above in the present disclosure.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic diagram according to a first embodiment of the present disclosure;
FIG. 2 is a schematic diagram according to a second embodiment of the present disclosure;
FIG. 3 is a schematic diagram according to a third embodiment of the present disclosure;
FIG. 4 is a block diagram of an electronic device used to implement the goods taking guiding method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
At present, after the robot arrives at the address to be delivered, it notifies the object to be received to take the goods. During pickup, for example, when the verification code entered by the object to be received is correct, the hatch door containing the goods is opened so that the object to be received can take them. If the goods are not taken, the robot sends multiple notifications; if they are still not taken after repeated notifications, the robot returns along its original path.
In this scheme, if the object to be received does not take the goods, the robot must send multiple notifications and wait a long time; after it returns along the original path, staff must arrange a second delivery, so delivery efficiency is poor.
In view of the above problems, the present disclosure provides a method, an apparatus, and an electronic device for guiding goods taking.
Fig. 1 is a schematic diagram of a first embodiment of the present disclosure, and it should be noted that the pick-up guiding method of the embodiments of the present disclosure may be applied to a pick-up guiding device, which may be configured in an electronic apparatus, so that the electronic apparatus may perform a pick-up guiding function.
The electronic device may be any device with computing capability, for example, may be a hardware device that has various operating systems, touch screens and/or display screens, such as a robot, or the like, and is capable of moving autonomously. In the following embodiments, an execution body will be described as an example of a robot.
As shown in fig. 1, the goods taking guiding method may include the following steps:
Step 101, when the robot arrives within the preset range of the address to be delivered, interacting with the object to be received corresponding to the address to be delivered, to obtain a target goods taking mode selected by the object to be received from at least one goods taking mode, and the target goods taking object under the target goods taking mode.
In embodiments of the present disclosure, the goods taking mode may include at least one of the following: pickup by the object to be received, pickup by an assistant object, and pickup by a help-seeking object. Pickup by the object to be received means the object to be received performs the pickup itself. Pickup by an assistant object means the object to be received finds an assistant object, who performs the pickup. Pickup by a help-seeking object means the robot searches for a help-seeking object who can help take the goods, and that object performs the pickup.
The provision of multiple goods taking modes means that, when it is inconvenient for the object to be received to take the goods, an assistant object can be found to take them, or the robot can find a help-seeking object to take them, improving pickup efficiency and hence delivery efficiency.
In one example of the embodiment of the present disclosure, the process by which the robot interacts with the object to be received to obtain the target goods taking mode and target goods taking object may be, for example: the robot obtains at least one goods taking mode and the selectable goods taking objects under each mode; it sends these to the terminal device used by the object to be received; the object to be received selects the target goods taking mode and the target goods taking object, and sends the selection back to the robot through the terminal device.
In another example, the process may be, for example: the robot queries the object to be received and obtains a specified goods taking mode and a specified goods taking object sent by the object to be received through its terminal device;
the robot then treats the specified goods taking mode as the target goods taking mode and the specified goods taking object as the target goods taking object.
The terminal device used by the object to be received may include at least one of the following: a mobile phone, a landline phone, instant messaging (IM) software on a mobile phone, short message software on a mobile phone, a smart speaker, and the like; the present disclosure is not limited thereto, and the device may be set according to actual needs.
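The interaction flow above can be sketched as follows. This is an illustrative model only: the mode names, the message shape, and the `send_to_terminal`/`receive_selection` callbacks are assumptions for the example, not part of the disclosure.

```python
# Illustrative sketch of the mode-selection interaction (step 101).
# Mode names and message shapes are assumptions, not the patent's API.

PICKUP_MODES = {
    "self": ["recipient"],        # pickup by the object to be received
    "assistant": ["assistant"],   # assistant found by the object to be received
    "help_seeking": [],           # help-seeking object found later by the robot
}

def choose_pickup(send_to_terminal, receive_selection):
    """Offer the available modes/objects and return the recipient's choice."""
    send_to_terminal({"modes": dict(PICKUP_MODES)})
    selection = receive_selection()  # e.g. {"mode": "self", "object": "recipient"}
    if selection is None or selection.get("mode") not in PICKUP_MODES:
        return None                  # treated as "no target mode selected"
    return selection["mode"], selection.get("object")
```

A caller would wire `send_to_terminal` and `receive_selection` to the actual terminal channel (IM, SMS, smart speaker, and so on).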
Step 102, guiding the target goods taking object to take the goods to be delivered corresponding to the address to be delivered according to the target goods taking mode.
In the embodiment of the disclosure, the robot may store a corresponding guiding strategy for each goods taking mode. Accordingly, the robot can acquire the guiding strategy corresponding to the target goods taking mode and, according to that strategy, guide the target goods taking object to take the goods to be delivered corresponding to the address to be delivered.
The guiding strategy is used to guide the target goods taking object, through voice or text, to move to the robot's position; to guide the target goods taking object to input a verification code, or to move its face into the face scanning area so that face information can be scanned and acquired; and, when the input verification code or the face information is correct, to guide the target goods taking object to take the goods out of the cabin.
The target goods taking object may be the object to be received, an assistant object found by the object to be received, a help-seeking object found by the robot, and the like.
In the embodiment of the present disclosure, in the case where the target goods taking mode is pickup by the object to be received or pickup by an assistant object, in one example, the process of the robot performing step 102 may be, for example: acquiring the verification code input by the target goods taking object; and providing the goods to be delivered to the target goods taking object when the input verification code is consistent with the verification code of the object to be received. In this example, the object to be received may provide the verification code to the target goods taking object in advance.
In the case where the target goods taking mode is pickup by the object to be received or pickup by an assistant object, in another example, the process of the robot performing step 102 may be, for example: scanning to obtain the face information of the target goods taking object; and providing the goods to be delivered to the target goods taking object when the face information is consistent with the pre-stored face information of the object to be received or of the assistant object. In this example, the object to be received may send its own face information and the face information of the assistant object to the robot in advance.
When the target goods taking mode is pickup by the object to be received or by an assistant object, the pickup is completed by having the object to be received or the assistant object input the verification code, or by having the robot scan their face information; this improves pickup efficiency and thus delivery efficiency.
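The verification logic of this step can be sketched as below. This is a minimal illustration: exact string comparison stands in for real verification-code checking and face matching, and the function and parameter names are assumptions.

```python
# Illustrative sketch of the step-102 gate: the hatch opens only when the
# entered code matches the recipient's code, or the scanned face matches a
# pre-registered one. String equality is a stand-in for real verification.

def may_open_hatch(entered_code=None, scanned_face=None,
                   recipient_code="", registered_faces=()):
    """Return True if either credential check passes."""
    if entered_code is not None and entered_code == recipient_code:
        return True
    if scanned_face is not None and scanned_face in registered_faces:
        return True
    return False
```

The same gate serves both examples: the pre-shared verification code and the pre-registered face information (of the object to be received or the assistant object) are the two accepted credentials.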
In the embodiment of the present disclosure, in the case where the target goods taking mode is pickup by a help-seeking object, in one example, the process of the robot performing step 102 may be, for example: acquiring the position of the target goods taking object; moving to that position and sending the verification code of the object to be received to the terminal device used by the target goods taking object; acquiring the verification code input by the target goods taking object; and providing the goods to be delivered to the target goods taking object when the input verification code is consistent with the verification code of the object to be received.
In the case where the target goods taking mode is pickup by a help-seeking object, in another example, the process of the robot performing step 102 may be, for example: acquiring the position of the target goods taking object; moving to that position and scanning to acquire the face information of the target goods taking object; and providing the goods to be delivered to the target goods taking object when the face information is consistent with the pre-stored face information of the help-seeking object.
When the target goods taking mode is pickup by a help-seeking object, the pickup is completed by having the target goods taking object input the verification code or by having the robot scan the help-seeking object's face information; this allows the pickup to proceed even when it is inconvenient for the object to be received to take the goods, improving pickup efficiency and thus delivery efficiency.
In the case where the robot is provided with a mechanical arm, the process of providing the goods to be delivered to the target goods taking object may be, for example: obtaining a target position specified by the target goods taking object, and controlling the mechanical arm to place the goods to be delivered at the target position. With the mechanical arm, the robot can automatically place the goods at the target position when it is inconvenient for the target goods taking object to take them or when the goods are too heavy, improving both delivery efficiency and the pickup experience of the target goods taking object.
In an embodiment of the present disclosure, the robot may further perform the following process: if the object to be received does not select a target goods taking mode, for example selects later delivery, or gives no feedback during the interaction, the robot can stop the goods taking guiding process and return the goods to be delivered to the original delivery site.
The object to be received giving no feedback during the interaction includes, for example: no response to the provided content is received, the provided content cannot be successfully sent to the object to be received, communication with the terminal device used by the object to be received is interrupted, and the like.
After stopping the goods taking guiding process, or while stopping it, the robot may notify the object to be received by voice or text that the process has been stopped. The voice or text content may be, for example: "Because the goods were not taken for a long time, the robot has returned along its original path. If necessary, please contact the relevant staff."
When the object to be received does not select a target goods taking mode, returning the goods to be delivered to the original delivery site avoids long waits and reduces waiting time, so the robot can save time to deliver other goods, improving the robot's working efficiency.
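The stop-and-return behaviour can be sketched as a bounded retry loop. The attempt limit, return values, and notification text are assumptions for illustration, not values from the disclosure.

```python
# Illustrative sketch of the abort path: ask for a selection a limited number
# of times; if none arrives (modelled as None), stop guiding and return the
# goods to the delivery site with a voice/text notification.

def run_guidance(get_selection, max_attempts=3):
    """Ask up to max_attempts times; abort if no valid selection arrives."""
    for _ in range(max_attempts):
        selection = get_selection()  # None models "no feedback"
        if selection is not None:
            return ("guide", selection)
    # No feedback after repeated attempts: notify, then return to the site.
    return ("return_to_site",
            "Because the goods were not taken for a long time, the robot "
            "has returned along its original path.")
```

Bounding the attempts is what lets the robot free itself up for other deliveries instead of waiting indefinitely.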
According to the goods taking guiding method of the embodiment of the present disclosure, when the robot arrives within the preset range of the address to be delivered, it interacts with the object to be received corresponding to that address, obtains the target goods taking mode selected by the object to be received from at least one goods taking mode, together with the target goods taking object under that mode, and guides the target goods taking object, according to the target goods taking mode, to take the goods to be delivered corresponding to the address to be delivered. In this way, when it is inconvenient for the object to be received to take the goods, an assistant object or a help-seeking object can perform the pickup, ensuring that the goods are delivered successfully, avoiding staff having to arrange a second delivery, and improving delivery efficiency.
To obtain the target goods taking mode selected by the object to be received from at least one goods taking mode, and the target goods taking object under that mode, the robot may perform multiple interactions. As shown in fig. 2, fig. 2 is a schematic diagram of a second embodiment according to the present disclosure, and the embodiment shown in fig. 2 may include the following steps:
Step 201, in the case of arriving within the preset range of the address to be delivered, providing at least two pickup options to the object to be received, where the pickup options indicate whether the object to be received will take the goods itself.
In the embodiment of the present disclosure, upon arriving within the preset range of the address to be delivered, the robot may provide at least two pickup options to the object to be received, for example "can take the goods before time XX" and "cannot take the goods before time XX". The at least two pickup options may take the form of, for example, a query sentence plus candidate answers. For example, the query sentence may be "Your items have been sent to the gate by the robot. Can you take them before time XX?", with the two candidate answers "yes" and "no". The candidate answer "yes" indicates that the object to be received will take the goods itself; the candidate answer "no" indicates that it will not.
Step 202, obtaining the pickup option selected by the object to be received.
If the robot does not obtain a pickup option selected by the object to be received within a certain period, it may provide the at least two pickup options to the object to be received again; if no selection is obtained after re-providing the options several times, the robot determines that the object to be received has not selected a target goods taking mode.
Step 203, in the case where the selected pickup option indicates that the object to be received will take the goods itself, determining the target goods taking mode to be pickup by the object to be received, and the target goods taking object to be the object to be received.
When the selected pickup option indicates pickup by the object to be received, a single interaction between the robot and the object to be received lets the robot know that the target goods taking mode is pickup by the object to be received. The robot can then wait for the object to be received to input the verification code, or wait to scan its face information, so that after only a short wait the object to be received can take the goods, improving delivery efficiency.
If the target goods taking mode is pickup by the object to be received and, after waiting for a certain period, the robot determines that the goods have not been taken, the robot can interact with the object to be received again to re-determine the target goods taking mode and target goods taking object, or to arrange later delivery.
Step 204, in the case where the selected pickup option indicates that the object to be received will not take the goods itself, providing candidate goods taking modes to the object to be received, the candidate modes being the goods taking modes other than pickup by the object to be received.
In the embodiment of the disclosure, when the goods taking modes include pickup by the object to be received, pickup by an assistant object, and pickup by a help-seeking object, the candidate goods taking modes may be, for example, pickup by an assistant object and pickup by a help-seeking object.
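The decision logic of steps 201 through 207 can be sketched as a small mapping from the recipient's answers to a target goods taking mode; the option and mode names here are assumptions for the example.

```python
# Illustrative decision logic: the first answer says whether the object to
# be received will pick up itself; if not, a follow-up choice picks one of
# the candidate modes. None models "later delivery / no valid choice".

CANDIDATE_MODES = ("assistant", "help_seeking")

def resolve_mode(can_take_self, chosen_candidate=None):
    """Map the recipient's answers to a target goods taking mode."""
    if can_take_self:
        return "self"                # step 203: recipient picks up itself
    if chosen_candidate in CANDIDATE_MODES:
        return chosen_candidate      # steps 206/207: assistant or help-seeking
    return None                      # e.g. later delivery was selected
```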
In addition, the robot can also provide the option of later delivery for the object to be received, so that the object to be received can be conveniently selected when the object does not want to receive goods or does not need to receive goods, the long-time waiting of the robot or the repeated interactive inquiry of the robot is avoided, and the working efficiency of the robot is improved.
Step 205, obtaining the goods taking mode selected by the object to be received.
Step 206, in the case where the selected goods taking mode is pickup by an assistant object, determining the target goods taking mode to be pickup by an assistant object, and the target goods taking object to be the assistant object of the object to be received.
When the selected goods taking mode is pickup by an assistant object, two interactions between the robot and the object to be received let the robot know that the target goods taking mode is pickup by an assistant object. The robot can then wait for the assistant object to input the verification code, or wait to scan the assistant object's face information, ensuring after only a short wait that the assistant object can take the goods, improving delivery efficiency and the robot's working efficiency.
Step 207, in the case where the selected goods taking mode is pickup by a help-seeking object, acquiring a trusted object list and providing it to the object to be received.
When the selected goods taking mode is pickup by a help-seeking object, three interactions between the robot and the object to be received let the robot know that the target goods taking mode is pickup by a help-seeking object. The robot can then move near the help-seeking object and guide it to input the verification code, or scan its face information, ensuring that the goods to be delivered are delivered successfully and that the object to be received obtains them, improving delivery efficiency.
In the embodiment of the present disclosure, when the selected goods taking mode is pickup by a help-seeking object, the robot may obtain the trusted object list by, for example: acquiring at least one object around the address to be delivered; selecting trusted objects from the at least one object; and generating the trusted object list from the selected trusted objects.
The robot may obtain at least one object around the address to be delivered by, for example: determining an area centered on the address to be delivered and within a certain distance range; performing video capture and recognition of the objects located in that area; and acquiring the face information, identity information, and the like of at least one object in the area.
Because the robot generates the trusted object list and provides it to the object to be received, the object to be received can conveniently select a trusted object from the list for pickup and conveniently retrieve the goods from that trusted object afterwards, further improving delivery efficiency.
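The nearby-object acquisition step can be sketched as a simple radius filter; the fixed radius and flat 2-D coordinates are assumptions made for the example, not part of the disclosure.

```python
# Illustrative sketch of collecting candidate objects around the delivery
# address: keep only detections within a fixed radius of the address.

def nearby_objects(address_xy, detections, radius=10.0):
    """Return detections whose Euclidean distance to the address <= radius."""
    ax, ay = address_xy
    return [d for d in detections
            if ((d["x"] - ax) ** 2 + (d["y"] - ay) ** 2) ** 0.5 <= radius]
```

In practice the detections would come from the video capture and recognition step, each carrying face and identity information alongside its position.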
In the embodiment of the disclosure, the robot may select trusted objects from the at least one object by, for example: determining the scene information of the address to be delivered and the trusted reference items corresponding to that scene information; and selecting trusted objects from the at least one object based on each object's content on the reference items.
Because the robot selects trusted objects and generates the trusted object list according to each object's content on the reference items, the objects in the list are ones the object to be received can accept and from whom the goods can conveniently be retrieved, further improving delivery efficiency.
In an embodiment of the present disclosure, the scene information includes at least one of: office scenes, accommodation scenes; the trusted parameter items corresponding to the office scene comprise at least one of the following: whether the face data of the object is stored, the distance between the position of the object and the address to be distributed and the current state of the object; the trusted reference item corresponding to the accommodation scene includes at least one of: the distance between the position of the object and the address to be distributed, whether the object wears uniform or not, and whether the object is a staff of the accommodation point or not.
The current state of the object may be, for example, busy, sleeping, or paying attention to the robot. The robot may set a weight for each trusted reference item, determine a value corresponding to each trusted reference item according to its content, combine the values and weights into a credibility score for each object, sort the objects by credibility, and select the trusted objects according to the sorting result.
By setting different trusted reference items for different scenes, suitable trusted objects can be selected for each scene, the object to be received can conveniently select a trusted object to pick up the goods, the goods can conveniently be obtained from the trusted object afterwards, and delivery efficiency is further improved.
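The weight-and-value scoring described above can be sketched as follows. The item names, weights, and per-item values here are illustrative assumptions; the disclosure only specifies that each trusted reference item contributes a weighted value to an overall credibility score.

```python
def trust_score(obj, weights):
    """Combine the value of each trusted reference item with its weight
    into a single credibility score for one candidate object."""
    return sum(weights[item] * obj.get(item, 0.0) for item in weights)

def rank_trusted(objects, weights, top_k=3):
    """Sort candidate objects by credibility and keep the top_k most trusted."""
    return sorted(objects, key=lambda o: trust_score(o, weights), reverse=True)[:top_k]

# Hypothetical office-scene items: face data on file, proximity, not busy.
office_weights = {"face_on_file": 0.5, "proximity": 0.3, "not_busy": 0.2}
candidates = [
    {"name": "A", "face_on_file": 1.0, "proximity": 0.8, "not_busy": 0.0},
    {"name": "B", "face_on_file": 0.0, "proximity": 1.0, "not_busy": 1.0},
]
print([c["name"] for c in rank_trusted(candidates, office_weights)])  # → ['A', 'B']
```

Candidate A scores 0.74 against B's 0.50 under these assumed weights, so A would head the trusted object list shown to the object to be received.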
In step 208, in the case that the object to be received selects one of the trusted objects in the trusted object list, the target pickup mode is determined to be help-seeking object pickup, and the target pickup object is determined to be the selected trusted object.
In an embodiment of the present disclosure, the robot may further perform the following process: in the case that the selected pickup mode is help-seeking object pickup, if the trusted object list is not obtained, or the object to be received does not select a trusted object from the trusted object list, the robot provides the pickup mode of helper object pickup to the object to be received; and in the case that the object to be received selects helper object pickup, the target pickup mode is determined to be helper object pickup, and the target pickup object is determined to be a helper object of the object to be received.
In the case that no trusted object exists, or the object to be received does not select a trusted object, the robot can interact with the object to be received to determine whether the object to be received wishes to have a helper object pick up the goods instead, which can further improve the delivery success rate of the goods to be delivered and the experience of the object to be received with the robot.
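The mode-resolution and fallback logic of steps 204 through 208 can be sketched as follows. The string labels for the modes are hypothetical; the disclosure describes the decision flow, not a concrete encoding.

```python
def resolve_pickup(selected_mode, trusted_list, chosen_trusted, accepts_helper):
    """Resolve (target_pickup_mode, target_pickup_object) following the
    interaction flow above; mode labels are hypothetical."""
    if selected_mode == "recipient":
        # The object to be received picks up the goods itself.
        return "recipient", "recipient"
    if selected_mode == "helper":
        # A helper designated by the object to be received picks up the goods.
        return "helper", "helper of recipient"
    if selected_mode == "help_seeking":
        if trusted_list and chosen_trusted in trusted_list:
            # A trusted object chosen from the trusted object list picks up.
            return "help_seeking", chosen_trusted
        # No trusted object list, or no trusted object was chosen:
        # fall back to offering helper object pickup.
        if accepts_helper:
            return "helper", "helper of recipient"
    # No target pickup mode selected: stop guiding, return to the delivery site.
    return None, None

print(resolve_pickup("help_seeking", [], None, True))  # → ('helper', 'helper of recipient')
```

The final `(None, None)` branch corresponds to the case, described later, where the robot stops the pickup guiding process and returns the goods to the original delivery site.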
Step 209, guiding the target pick-up object to pick up the goods to be delivered corresponding to the address to be delivered according to the target pick-up mode.
It should be noted that, for details of step 209, reference may be made to step 102 in the embodiment shown in fig. 1, and details will not be described here.
According to the pickup guiding method of this embodiment, in the case of arriving within the preset range of the address to be delivered, at least two pickup options are provided to the object to be received, the pickup options indicating whether the object to be received picks up the goods itself; the pickup option selected by the object to be received is acquired; in the case that the selected pickup option indicates that the object to be received picks up the goods, the target pickup mode is determined to be pickup by the object to be received, and the target pickup object is determined to be the object to be received; in the case that the selected pickup option indicates that the object to be received does not pick up the goods, candidate pickup modes are provided to the object to be received, the candidate pickup modes being pickup modes other than pickup by the object to be received; the pickup mode selected by the object to be received is acquired; in the case that the selected pickup mode is helper object pickup, the target pickup mode is determined to be helper object pickup, and the target pickup object is determined to be a helper object of the object to be received; in the case that the selected pickup mode is help-seeking object pickup, a trusted object list is obtained and provided to the object to be received; in the case that the object to be received selects one of the trusted objects in the trusted object list, the target pickup mode is determined to be help-seeking object pickup, and the target pickup object is determined to be the selected trusted object; and the target pickup object is guided, according to the target pickup mode, to pick up the goods to be delivered corresponding to the address to be delivered. In this way, when it is inconvenient for the object to be received to pick up the goods, the goods can be picked up by a helper object or a help-seeking object, ensuring that the goods are delivered successfully, avoiding a worker having to arrange a second delivery, and improving delivery efficiency.
In order to implement the above embodiments, the present disclosure further provides a pickup guiding device. As shown in fig. 3, fig. 3 is a schematic diagram according to a third embodiment of the present disclosure. The pickup guiding device 30 may include an acquisition module 301 and a guiding module 302.
The acquiring module 301 is configured to, in the case of arriving within a preset range of the address to be delivered, interact with the object to be received corresponding to the address to be delivered, and acquire a target pickup mode selected by the object to be received from at least one pickup mode, and a target pickup object under the target pickup mode; and the guiding module 302 is configured to guide the target pickup object to pick up the goods to be delivered corresponding to the address to be delivered according to the target pickup mode.
As one possible implementation manner of the embodiment of the present disclosure, the pickup mode includes at least one of the following: pickup by the object to be received, pickup by a helper object, and pickup by a help-seeking object.
As one possible implementation of the embodiment of the present disclosure, the obtaining module 301 includes: a first providing unit, a first acquiring unit, and a first determining unit; the first providing unit is used for providing at least two goods taking options for the object to be received, wherein the goods taking options are used for indicating whether the object to be received is taken; the first acquiring unit is used for acquiring a goods taking option selected by the object to be received; the first determining unit is configured to determine that the target pickup mode is to pick up the object to be picked up and determine that the target pickup object is the object to be picked up when the selected pickup option indicates that the object to be picked up is to pick up.
As one possible implementation of the embodiment of the present disclosure, the obtaining module 301 further includes: a second providing unit, a second acquiring unit, and a second determining unit; the second providing unit is configured to provide a candidate pickup mode for the object to be received when the selected pickup option indicates that the object to be received is not picked, where the candidate pickup mode is another pickup mode except for the object to be received; the second obtaining unit is used for obtaining a goods taking mode selected by the object to be received; the second determining unit is configured to determine that the target delivery mode is delivery of the helper object and determine that the target delivery object is the helper object of the object to be received when the selected delivery mode is delivery of the helper object.
As one possible implementation of the embodiment of the present disclosure, the obtaining module 301 further includes: a third acquisition unit and a third determination unit; the third obtaining unit is configured to obtain a trusted object list in the case that the selected pickup mode is help-seeking object pickup, and provide the trusted object list to the object to be received; the third determining unit is configured to determine, in the case that the object to be received selects one of the trusted objects in the trusted object list, that the target pickup mode is help-seeking object pickup, and determine that the target pickup object is the selected trusted object.
As one possible implementation of the embodiment of the present disclosure, the obtaining module 301 further includes: a third providing unit and a fourth determining unit; the third providing unit is configured to, in the case that the selected pickup mode is help-seeking object pickup, if the trusted object list is not obtained, or if the object to be received does not select a trusted object from the trusted object list, provide the pickup mode of helper object pickup to the object to be received; the fourth determining unit is configured to determine, in the case that the object to be received selects helper object pickup, that the target pickup mode is helper object pickup, and determine that the target pickup object is a helper object of the object to be received.
As one possible implementation manner of the embodiment of the present disclosure, the third obtaining unit is specifically configured to obtain at least one object around the address to be dispatched; selecting a trusted object from at least one of the objects; and generating the trusted object list according to the selected trusted objects.
As a possible implementation manner of the embodiment of the present disclosure, the third obtaining unit is specifically further configured to determine scene information of the address to be delivered and the trusted reference items corresponding to the scene information, and to select a trusted object from the at least one object according to the content of the at least one object on the trusted reference items.
As one possible implementation of the embodiments of the present disclosure, the scene information includes at least one of the following: an office scene and an accommodation scene; the trusted reference items corresponding to the office scene include at least one of the following: whether face data of the object is on file, the distance between the position of the object and the address to be delivered, and the current state of the object; the trusted reference items corresponding to the accommodation scene include at least one of the following: the distance between the position of the object and the address to be delivered, whether the object wears a uniform, and whether the object is a member of staff of the accommodation site.
As a possible implementation manner of the embodiment of the present disclosure, the guiding module 302 is specifically configured to acquire the verification code input by the target pickup object in the case that the target pickup mode is pickup by the object to be received or pickup by a helper object, and to provide the goods to be delivered to the target pickup object in the case that the input verification code is consistent with the verification code of the object to be received.
As a possible implementation manner of the embodiment of the present disclosure, the guiding module 302 is specifically configured to obtain, in a case where the target pickup mode is a help-seeking object pickup, a location of the target pickup object; moving to the position of the target goods taking object, and sending the verification code of the object to be received to terminal equipment used by the target goods taking object; acquiring a verification code input by the target goods taking object; and providing the goods to be distributed for the target goods taking object under the condition that the input verification code is consistent with the verification code of the goods to be received.
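The verification-code gate shared by both of these implementations can be sketched as follows. This is a minimal illustration; the code format and the whitespace handling are assumptions, not part of the disclosure.

```python
def verify_and_release(entered_code, recipient_code):
    """Release the goods only when the code entered by the target pickup
    object matches the verification code of the object to be received."""
    return entered_code.strip() == recipient_code.strip()

print(verify_and_release("8421", "8421"))  # → True
print(verify_and_release("0000", "8421"))  # → False
```

In the help-seeking case the robot would first move to the target pickup object and send `recipient_code` to that object's terminal device, then apply the same check before handing over the goods.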
As one possible implementation manner of the embodiment of the present disclosure, the robot is provided with a mechanical arm, and the guiding module 302 is specifically configured to obtain a target position provided by the target pickup object; and controlling the mechanical arm to place the goods to be distributed to the target position.
As one possible implementation manner of the embodiment of the present disclosure, the guiding module 302 is further configured to stop the pickup guiding process and return to the original delivery site of the to-be-delivered cargo when the to-be-delivered object does not select the target pickup mode.
According to the pickup guiding device of this embodiment, in the case of arriving within the preset range of the address to be delivered, the device interacts with the object to be received corresponding to the address to be delivered, and acquires the target pickup mode selected by the object to be received from at least one pickup mode and the target pickup object under the target pickup mode; the target pickup object is then guided, according to the target pickup mode, to pick up the goods to be delivered corresponding to the address to be delivered. In this way, when it is inconvenient for the object to be received to pick up the goods, the goods can be picked up by a helper object or a help-seeking object, ensuring that the goods are delivered successfully, avoiding a worker having to arrange a second delivery, and improving delivery efficiency.
In the technical solution of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of the user's personal information are all performed on the premise of obtaining the user's consent, comply with the provisions of relevant laws and regulations, and do not violate public order and good customs.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 4 illustrates a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the apparatus 400 includes a computing unit 401 that can perform various suitable actions and processes according to a computer program stored in a Read Only Memory (ROM) 402 or a computer program loaded from a storage unit 408 into a Random Access Memory (RAM) 403. In RAM 403, various programs and data required for the operation of device 400 may also be stored. The computing unit 401, ROM 402, and RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Various components in device 400 are connected to I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, etc.; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408, such as a magnetic disk, optical disk, etc.; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 401 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 401 performs the various methods and processes described above, such as the pick guidance method. For example, in some embodiments, the pick guidance method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into RAM 403 and executed by computing unit 401, one or more of the steps of the pick-guide method described above may be performed. Alternatively, in other embodiments, the computing unit 401 may be configured to perform the pick-guide method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs, where the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (29)

1. A pick-and-guide method, applied to a robot, the method comprising:
in the case of arriving within a preset range of an address to be delivered, interacting with an object to be received corresponding to the address to be delivered, and acquiring a target pickup mode selected by the object to be received from at least one pickup mode, and a target pickup object under the target pickup mode;
and guiding the target goods taking object to take goods to be delivered corresponding to the address to be delivered according to the target goods taking mode.
2. The method of claim 1, wherein the pickup mode comprises at least one of: pickup by the object to be received, pickup by a helper object, and pickup by a help-seeking object.
3. The method according to claim 1 or 2, wherein the acquiring the target pickup mode selected by the object to be received from at least one pickup mode, and the target pickup object under the target pickup mode, comprises:
providing at least two pick options to the object to be received, wherein the pick options are used for indicating whether the object to be received picks up goods;
acquiring a goods taking option selected by the object to be received;
and under the condition that the selected goods taking option indicates that the goods are taken by the goods to be received, determining that the target goods taking mode is the goods to be received, and determining that the target goods taking object is the goods to be received.
4. The method of claim 3, wherein the acquiring the target pickup mode selected by the object to be received from at least one pickup mode, and the target pickup object under the target pickup mode, further comprises:
providing a candidate goods taking mode for the object to be received under the condition that the selected goods taking option indicates that the object to be received is not picked, wherein the candidate goods taking mode is other goods taking modes except for the object to be received;
acquiring a goods taking mode selected by the object to be received;
and under the condition that the selected goods taking mode is the goods taking mode of the helper object, determining the target goods taking mode to be the goods taking mode of the helper object, and determining the target goods taking object to be the helper object of the object to be received.
5. The method of claim 4, wherein the obtaining the target pickup mode selected by the object to be received from at least one pickup mode, and the target pickup object under the target pickup mode, further comprises:
in the case that the selected pickup mode is help-seeking object pickup, obtaining a trusted object list, and providing the trusted object list to the object to be received;
and in the case that the object to be received selects one of the trusted objects in the trusted object list, determining that the target pickup mode is help-seeking object pickup, and determining that the target pickup object is the selected trusted object.
6. The method of claim 4, wherein the obtaining the target pickup mode selected by the object to be received from at least one pickup mode, and the target pickup object under the target pickup mode, further comprises:
in the case that the selected pickup mode is help-seeking object pickup, if the trusted object list is not obtained, or the object to be received does not select a trusted object from the trusted object list, providing the pickup mode of helper object pickup to the object to be received;
and in the case that the object to be received selects helper object pickup, determining that the target pickup mode is helper object pickup, and determining that the target pickup object is a helper object of the object to be received.
7. The method of claim 5, wherein the obtaining a list of trusted objects comprises:
acquiring at least one object around the address to be distributed;
selecting a trusted object from at least one of the objects;
and generating the trusted object list according to the selected trusted objects.
8. The method of claim 7, wherein the selecting a trusted object from at least one of the objects comprises:
determining scene information of the address to be delivered and trusted reference items corresponding to the scene information;
and selecting a trusted object from at least one of the objects according to the content of the at least one of the objects on the trusted reference items.
9. The method of claim 8, wherein the scene information comprises at least one of: office scenes, accommodation scenes;
the trusted reference items corresponding to the office scene comprise at least one of the following: whether face data of the object is on file, the distance between the position of the object and the address to be delivered, and the current state of the object;
the trusted reference item corresponding to the accommodation scene comprises at least one of the following: the distance between the position of the object and the address to be distributed, whether the object wears uniform, and whether the object is a staff of the accommodation point.
10. The method of claim 1, wherein the guiding the target pick-up object to pick up the to-be-delivered goods corresponding to the to-be-delivered address according to the target pick-up mode includes:
in the case that the target pickup mode is pickup by the object to be received or pickup by a helper object, acquiring a verification code input by the target pickup object;
and providing the goods to be distributed for the target goods taking object under the condition that the input verification code is consistent with the verification code of the goods to be received.
11. The method of claim 1, wherein the guiding the target pick-up object to pick up the to-be-delivered goods corresponding to the to-be-delivered address according to the target pick-up mode includes:
in the case that the target pickup mode is help-seeking object pickup, acquiring the position of the target pickup object;
moving to the position of the target goods taking object, and sending the verification code of the object to be received to terminal equipment used by the target goods taking object;
acquiring a verification code input by the target goods taking object;
and providing the goods to be distributed for the target goods taking object under the condition that the input verification code is consistent with the verification code of the goods to be received.
12. The method of claim 10 or 11, wherein the robot is provided with a mechanical arm, and the providing the goods to be delivered to the target pickup object comprises:
acquiring a target position provided by the target goods taking object;
and controlling the mechanical arm to place the goods to be distributed to the target position.
13. The method of claim 1, wherein the method further comprises:
and stopping the goods taking guiding process and returning to the original delivery site of the goods to be delivered under the condition that the object to be received does not select the target goods taking mode.
14. A pickup guiding device, applied to a robot, the device comprising:
an acquisition module, configured to, in the case of arriving within a preset range of an address to be delivered, interact with an object to be received corresponding to the address to be delivered, and acquire a target pickup mode selected by the object to be received from at least one pickup mode, and a target pickup object under the target pickup mode;
and the guiding module is used for guiding the target goods taking object to take goods to be delivered corresponding to the address to be delivered according to the target goods taking mode.
15. The apparatus of claim 14, wherein the pickup mode comprises at least one of: pickup by the object to be received, pickup by a helper object, and pickup by a help-seeking object.
16. The apparatus of claim 14 or 15, wherein the acquisition module comprises: a first providing unit, a first acquiring unit, and a first determining unit;
the first providing unit is used for providing at least two goods taking options for the object to be received, wherein the goods taking options are used for indicating whether the object to be received is taken;
the first acquiring unit is used for acquiring a goods taking option selected by the object to be received;
the first determining unit is configured to determine that the target pickup mode is to pick up the object to be picked up and determine that the target pickup object is the object to be picked up when the selected pickup option indicates that the object to be picked up is to pick up.
17. The apparatus of claim 16, wherein the acquisition module further comprises: a second providing unit, a second acquiring unit, and a second determining unit;
the second providing unit is configured to provide a candidate pickup mode for the object to be received when the selected pickup option indicates that the object to be received is not picked, where the candidate pickup mode is another pickup mode except for the object to be received;
the second obtaining unit is used for obtaining a goods taking mode selected by the object to be received;
the second determining unit is configured to determine, in a case where the selected goods taking mode is pick-up by an entrusted object, that the target goods taking mode is pick-up by an entrusted object and that the target goods taking object is the entrusted object of the object to be received.
18. The apparatus of claim 17, wherein the acquisition module further comprises: a third acquisition unit and a third determination unit;
the third obtaining unit is configured to acquire a trusted object list in a case where the selected goods taking mode is pick-up by an entrusted object, and to provide the trusted object list to the object to be received;
the third determining unit is configured to determine, in a case where the object to be received selects one of the trusted objects in the trusted object list, that the target goods taking mode is pick-up by an entrusted object and that the target goods taking object is the selected trusted object.
19. The apparatus of claim 17, wherein the acquisition module further comprises: a third providing unit and a fourth determining unit;
the third providing unit is configured to, in a case where the selected goods taking mode is pick-up by an entrusted object, provide the goods taking mode of pick-up by a helper object to the object to be received if the trusted object list is not obtained, or if the object to be received does not select a trusted object from the trusted object list;
the fourth determining unit is configured to determine, in a case where the object to be received selects the mode of pick-up by a helper object, that the target goods taking mode is pick-up by a helper object and that the target goods taking object is the helper object of the object to be received.
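The selection and fallback flow recited in claims 16 through 19 can be sketched as follows. This is a minimal illustrative sketch only, not the patented implementation; the function name, constants, and parameters are hypothetical and do not appear in the claims.

```python
# Hypothetical sketch of the pick-up mode selection flow (claims 16-19).
SELF_PICKUP = "self"            # the object to be received picks up personally
ENTRUSTED_PICKUP = "entrusted"  # a trusted/entrusted object picks up
HELPER_PICKUP = "helper"        # fallback: a helper object picks up

def select_target_pickup(will_self_pick, trusted_list=None, chosen_trusted=None,
                         accepts_helper=False, designated_helper=None):
    """Return (target_mode, target_object), or (None, None) if nothing is selected."""
    if will_self_pick:
        # Claim 16: the option indicates the recipient will pick up personally.
        return SELF_PICKUP, "recipient"
    if trusted_list and chosen_trusted and chosen_trusted in trusted_list:
        # Claim 18: the recipient selected a trusted object from the list.
        return ENTRUSTED_PICKUP, chosen_trusted
    if accepts_helper and designated_helper:
        # Claim 19: no trusted list (or none selected) -> helper pick-up mode.
        return HELPER_PICKUP, designated_helper
    # Claim 26: no target mode selected -> the guidance process is aborted.
    return None, None
```

A usage note: the caller would gather `will_self_pick` and the other flags through the interaction step of claim 14 before invoking the selector.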
20. The apparatus of claim 18, wherein the third acquisition unit is configured to,
acquire at least one object around the address to be delivered;
select trusted objects from the at least one object;
and generate the trusted object list according to the selected trusted objects.
21. The apparatus of claim 20, wherein the third acquisition unit is further configured to,
determine scene information of the address to be delivered and the trusted reference items corresponding to the scene information;
and select the trusted objects from the at least one object according to the values of the at least one object on the trusted reference items.
22. The apparatus of claim 21, wherein the scene information comprises at least one of: an office scene and an accommodation scene;
the trusted reference items corresponding to the office scene comprise at least one of: whether face data of the object is stored, the distance between the position of the object and the address to be delivered, and the current state of the object;
the trusted reference items corresponding to the accommodation scene comprise at least one of: the distance between the position of the object and the address to be delivered, whether the object wears a uniform, and whether the object is a staff member of the accommodation point.
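The trusted-list generation of claims 20 through 22 can be sketched as a scene-conditioned filter over nearby objects. This is an illustrative sketch under invented assumptions: the field names, the 50 m distance threshold, and the "idle" state check are hypothetical and are not specified in the claims.

```python
# Hypothetical sketch of trusted-object list generation (claims 20-22):
# score nearby objects against the scene's trusted reference items.
def build_trusted_list(nearby_objects, scene):
    trusted = []
    for obj in nearby_objects:
        if scene == "office":
            # Office reference items: stored face data, distance, current state.
            ok = (obj.get("face_on_file", False)
                  and obj.get("distance_m", float("inf")) <= 50   # threshold invented
                  and obj.get("state") == "idle")
        elif scene == "accommodation":
            # Accommodation reference items: distance, uniform, staff membership.
            ok = (obj.get("distance_m", float("inf")) <= 50
                  and obj.get("in_uniform", False)
                  and obj.get("is_staff", False))
        else:
            ok = False
        if ok:
            trusted.append(obj["name"])
    return trusted
```

The resulting list would then be provided to the object to be received as in claim 18.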
23. The apparatus according to claim 14, wherein the guiding module is specifically configured to:
acquire, in a case where the target goods taking mode is pick-up by the object to be received or pick-up by an entrusted object, a verification code input by the target goods taking object;
and provide the goods to be delivered to the target goods taking object in a case where the input verification code is consistent with the verification code of the object to be received.
24. The apparatus according to claim 14, wherein the guiding module is specifically configured to:
acquire, in a case where the target goods taking mode is pick-up by a helper object, the position of the target goods taking object;
move to the position of the target goods taking object, and send the verification code of the object to be received to a terminal device used by the target goods taking object;
acquire a verification code input by the target goods taking object;
and provide the goods to be delivered to the target goods taking object in a case where the input verification code is consistent with the verification code of the object to be received.
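The verification-code handover of claims 23 and 24 can be sketched as follows. This is an illustrative sketch only; the `robot` interface (`locate_target`, `move_to`, `send_code_to_terminal`, `read_entered_code`, `release_goods`) is a hypothetical stand-in, not an API from the patent.

```python
# Hypothetical sketch of the handover check (claims 23-24). In self or
# entrusted pick-up the person enters the recipient's code at the robot;
# in helper pick-up the robot first moves to the helper and pushes the
# code to the helper's terminal device before reading it back.
def hand_over(mode, robot, recipient_code):
    if mode == "helper":
        position = robot.locate_target()          # claim 24: find the helper
        robot.move_to(position)
        robot.send_code_to_terminal(recipient_code)
    entered = robot.read_entered_code()           # claims 23-24: code entry
    if entered == recipient_code:
        robot.release_goods()                     # codes match -> hand over
        return True
    return False
```

On a match, the placement step of claim 25 (robotic arm placing the goods at a target position) would follow.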
25. The apparatus according to claim 23 or 24, wherein the robot is provided with a robotic arm, and the guiding module is specifically configured to:
acquire a target position provided by the target goods taking object;
and control the robotic arm to place the goods to be delivered at the target position.
26. The apparatus of claim 14, wherein the guiding module is further configured to stop the goods taking guiding process and return the goods to be delivered to their original delivery site in a case where the object to be received does not select a target goods taking mode.
27. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-13.
28. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-13.
29. A computer program product comprising a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1-13.
CN202211671962.0A 2022-12-23 2022-12-23 Goods taking guiding method and device and electronic equipment Pending CN116237930A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211671962.0A CN116237930A (en) 2022-12-23 2022-12-23 Goods taking guiding method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211671962.0A CN116237930A (en) 2022-12-23 2022-12-23 Goods taking guiding method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN116237930A true CN116237930A (en) 2023-06-09

Family

ID=86625147

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211671962.0A Pending CN116237930A (en) 2022-12-23 2022-12-23 Goods taking guiding method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116237930A (en)

Similar Documents

Publication Publication Date Title
CN109039671B (en) Group message display method, device, terminal and storage medium
US9363300B2 (en) Systems and methods for voice communication
US20210097262A1 (en) Method and device for sending information
CN114227698B (en) Control method, device, equipment and storage medium for robot
CN110766348B (en) Method and device for combining goods picking tasks
CN111906780A (en) Article distribution method, robot and medium
WO2023130748A1 (en) Task processing method and apparatus for unmanned vehicle
CN110727775B (en) Method and apparatus for processing information
CN115481227A (en) Man-machine interaction dialogue method, device and equipment
CN116841506B (en) Program code generation method and device, and model training method and device
CN116450917B (en) Information searching method and device, electronic equipment and medium
CN113055523A (en) Crank call interception method and device, electronic equipment and storage medium
CN116237930A (en) Goods taking guiding method and device and electronic equipment
CN115879469B (en) Text data processing method, model training method, device and medium
CN117112065A (en) Large model plug-in calling method, device, equipment and medium
WO2023173684A1 (en) Distribution method and device
CN114722171B (en) Multi-round dialogue processing method and device, electronic equipment and storage medium
CN112925623B (en) Task processing method, device, electronic equipment and medium
CN109002498A (en) Interactive method, device, equipment and storage medium
CN113360590A (en) Method and device for updating point of interest information, electronic equipment and storage medium
CN113114851B (en) Incoming call intelligent voice reply method and device, electronic equipment and storage medium
CN113554062A (en) Training method, device and storage medium of multi-classification model
CN113470645B (en) Call processing method, device, equipment and storage medium
CN114416937B (en) Man-machine interaction method, device, equipment, storage medium and computer program product
CN113552879B (en) Control method and device of self-mobile device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination