CN117245642B - Robot control method, device and storage medium - Google Patents

Robot control method, device and storage medium

Info

Publication number
CN117245642B
CN117245642B (application CN202211559264.1A)
Authority
CN
China
Prior art keywords
control
robot
target
control instruction
controlling
Prior art date
Legal status
Active
Application number
CN202211559264.1A
Other languages
Chinese (zh)
Other versions
CN117245642A (en)
Inventor
杜坤 (Du Kun)
Current Assignee
Beijing Xiaomi Robot Technology Co ltd
Original Assignee
Beijing Xiaomi Robot Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Robot Technology Co., Ltd.
Priority to CN202211559264.1A
Publication of CN117245642A
Application granted
Publication of CN117245642B


Landscapes

  • Manipulator (AREA)

Abstract

The disclosure relates to a robot control method, device, and storage medium. In response to a user's triggering operation on a target application interface on a terminal, a target control triggered by the user is determined, where the target application is used to control the motion of the robot; a target control instruction is determined according to the triggering operation and the target control, where the target control instruction includes motion parameters of the robot; and the robot is controlled to move according to the motion parameters specified in the target control instruction.

Description

Robot control method, device and storage medium
Technical Field
The present disclosure relates to the technical field of robots, and in particular to a robot control method, control device, and storage medium.
Background
With the continuous development of science and technology, people's material and cultural needs keep growing, and robots are gradually entering everyday life. As a bionic quadruped robot, the robot dog brings great convenience and fun to people's lives. For example, a robot dog can present a variety of action postures according to control instructions, such as lying down, front flips, bowing, rolling, four-legged hopping, and three-legged walking, and it can also follow a user automatically. To make these functions convenient for the user to control, it is therefore important to choose a suitable control method for the robot dog.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a control method, apparatus, and storage medium for a robot.
According to a first aspect of an embodiment of the present disclosure, there is provided a robot control method, including: in response to a triggering operation by a user on a target application interface on a terminal, determining a target control triggered by the user, where the target application is used to control the motion of the robot;
determining a target control instruction according to the triggering operation and the target control, where the target control instruction includes motion parameters of the robot;
and controlling the robot to move according to the motion parameters specified in the target control instruction.
Optionally, the triggering operation includes a sliding operation, and determining the target control instruction according to the triggering operation and the target control includes:
acquiring a current trigger position corresponding to the sliding operation when the target control is a first preset control for controlling horizontal movement of the robot;
determining the motion parameters according to the current trigger position and a preset maximum motion parameter threshold;
and determining the target control instruction according to the motion parameters, the control type of the target control, and the sliding direction of the sliding operation.
Optionally, the target control instruction includes a first control instruction or a second control instruction, the motion parameters include a motion acceleration, and the first preset control includes a first control or a second control, where the first control is used to control the robot to move in a first direction, the second control is used to control the robot to translate in a second direction, and the first direction intersects the second direction. Determining the target control instruction according to the motion parameters, the control type of the target control, and the sliding direction of the sliding operation includes:
when the control type is the first control and the sliding direction is a preset direction, generating the first control instruction according to the motion acceleration and the sliding direction, where the first control instruction is used to control the robot to move in the first direction at the motion acceleration;
and when the control type is the second control, generating the second control instruction according to the motion acceleration and the sliding direction, where the second control instruction is used to control the robot to move in the second direction at the motion acceleration.
Optionally, the target control instruction includes a third control instruction, the motion parameters include a turning angular velocity, and the first control is further used to control the robot to turn; determining the target control instruction according to the motion parameters, the control type of the target control, and the sliding direction of the sliding operation includes:
when the control type is the first control and the sliding direction is not the preset direction, generating the third control instruction according to the turning angular velocity and the sliding direction, where the third control instruction is used to control the robot to turn at the turning angular velocity.
Optionally, the method further comprises:
when the target control is a second preset control for controlling the robot to execute a preset gesture action, determining the target gesture action selected by the user according to the triggering operation;
and generating a fourth control instruction according to the target gesture action, where the fourth control instruction is used to control the robot to execute the target gesture action.
Optionally, a third preset control is further provided on the target application interface, and the method further includes:
in response to a triggering operation by the user on the third preset control, displaying preset basic operation information of the robot.
Optionally, controlling the robot to move according to the motion parameters specified in the target control instruction includes:
sending the target control instruction to the robot through a gRPC (Google Remote Procedure Call) network communication module, so that the robot controls its movement according to the motion parameters in the target control instruction.
According to a second aspect of the embodiments of the present disclosure, there is provided a control device of a robot, including:
a first determining module configured to determine, in response to a triggering operation by a user on a target application interface on a terminal, a target control triggered by the user, where the target application is used to control the motion of the robot;
a second determining module configured to determine a target control instruction according to the triggering operation and the target control, the target control instruction including motion parameters of the robot;
and a control module configured to control the robot to move according to the motion parameters specified in the target control instruction.
According to a third aspect of the embodiments of the present disclosure, there is provided a control device of a robot, including:
a processor;
a memory for storing processor-executable instructions;
Wherein the processor is configured to perform the steps of the method of controlling a robot provided in the first aspect of the present disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method of controlling a robot provided by the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: the target control instruction is determined according to the user's triggering operation on the target application interface and the triggered target control. Because the target control instruction contains the motion parameters, the robot can be controlled not only to move in response to the target control instruction but also to move according to those motion parameters, which enriches the functions available for controlling the robot through a terminal application and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a control method of a robot according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating a control method of a robot according to the embodiment shown in fig. 1.
FIG. 3 is a schematic diagram of a remote control interface on a target application, according to an example embodiment.
Fig. 4 is a flowchart illustrating a control method of a robot according to the embodiment shown in fig. 1.
Fig. 5 is a block diagram illustrating a control apparatus of a robot according to an exemplary embodiment.
Fig. 6 is a block diagram of a control device of a robot according to the embodiment shown in fig. 5.
Fig. 7 is a block diagram illustrating a control apparatus for a robot according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
It should be noted that all actions of acquiring signals, information, or data in the present application are performed in compliance with the applicable data protection laws and policies of the relevant country and with authorization from the owner of the corresponding device.
The method and device of the present disclosure can be applied to scenarios in which instructions are issued to a bionic quadruped robot (such as a robot dog). Conventionally, control instructions are sent to the robot dog mainly through a dedicated remote control device, so the user must carry that device at all times, which is inconvenient. The inventor therefore first considered implementing instruction control of the robot dog through simulated wheel controls in a mobile phone APP: for example, a left wheel and a right wheel can be arranged on the APP interface, where the left wheel controls the robot dog to move forward, backward, left, and right, and the right wheel controls the turning function.
In order to solve the above technical problems, the present disclosure provides a control method, a device and a storage medium for a robot, and the following detailed description of specific embodiments of the present disclosure is given with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating a robot control method according to an exemplary embodiment. The method may be applied to a terminal on which a target application is installed; the target application is communicatively connected to the robot to be controlled and may be used for motion control of the robot. As shown in fig. 1, the method includes the following steps.
In step S11, in response to a triggering operation of a user on a target application program interface on a terminal, determining a target control triggered by the user, where the target application program is used for performing motion control on the robot.
In an actual control scenario, a user may open the target application on the terminal and then connect to the robot to be controlled through the target application (for example, over a Wi-Fi or Bluetooth connection) to enter a remote control interface of the target application. Different controls are arranged on the remote control interface, each used to make the robot execute a different action or function.
The triggering operation may include a sliding operation or a clicking operation on a target control, etc.
In step S12, a target control command is determined according to the trigger operation and the target control, where the target control command includes a motion parameter of the robot.
For example, the target application interface (i.e., the remote control interface mentioned above) may include various controls such as wheel controls, instruction operation controls, and a basic information display control, where the wheel controls may further include a left wheel control and a right wheel control, and the instruction operation controls may further include motion controls and gesture controls. Thus, assuming the target control triggered by the user is the left wheel control, the corresponding target control instruction may include an instruction for controlling the robot to move forward or backward, or an instruction for controlling the robot to turn. As another example, assuming the user triggers a gesture control among the instruction operation controls, the corresponding target control instruction may include an instruction for controlling the robot to move in the preset gesture corresponding to that control (such as four-legged hopping or three-legged walking). The above examples merely illustrate target controls and target control instructions and do not limit the present disclosure.
The motion parameters may include the robot's motion acceleration, its angular velocity when turning, and the like.
In step S13, the robot is controlled to move according to the movement parameter according to the target control command.
In one possible implementation, to enable communication between the terminal and the robot, gRPC network communication modules may be provided on both the terminal and the robot. In this step, the target control instruction may then be sent to the robot through the gRPC network communication module; a system control module on the robot executes the target control instruction and thereby controls the robot to move according to the motion parameters.
It should be noted that, compared with HTTP-based communication, gRPC communication saves transmission bandwidth, offers a higher transmission rate, reduces the number of connections required, and improves the efficiency of data transmission.
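As an illustration of the data flow only: the patent specifies gRPC transport but does not disclose a message schema, so every field name below is an assumption. The target control instruction can be modeled as a small message that the terminal serializes and the robot's system control module deserializes; JSON stands in here for the protobuf encoding a real gRPC service would use.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TargetControlInstruction:
    # Hypothetical fields; the patent does not define the message layout.
    control_type: str        # e.g. "left_wheel", "right_wheel", "gesture"
    direction: str           # e.g. "forward", "backward", "left", "right"
    acceleration: float      # motion acceleration, 0.0 if unused
    angular_velocity: float  # turning angular velocity, 0.0 if unused

def encode_instruction(instr: TargetControlInstruction) -> bytes:
    """Serialize the instruction on the terminal side (JSON stands in for protobuf)."""
    return json.dumps(asdict(instr)).encode("utf-8")

def decode_instruction(payload: bytes) -> TargetControlInstruction:
    """Deserialize on the robot side before handing off to the system control module."""
    return TargetControlInstruction(**json.loads(payload.decode("utf-8")))
```

In a real implementation the encode/decode pair would be replaced by a generated gRPC stub and service; the round trip shown here only illustrates that the motion parameters travel inside the instruction itself.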
With this method, the target control instruction can be determined according to the user's triggering operation on the target application interface and the triggered target control. Because the target control instruction contains the motion parameters, the robot can be controlled to move according to those parameters while executing the target control instruction, which enriches the functions for controlling the robot through a terminal application and improves the user experience.
Fig. 2 is a flowchart of a control method of a robot according to the embodiment shown in fig. 1, and as shown in fig. 2, step S12 includes the following sub-steps:
In step S121, if the target control is a first preset control for controlling the horizontal movement of the robot, the current trigger position corresponding to the sliding operation is obtained.
Here, horizontal movement means that the robot's height remains unchanged while its position changes within the same horizontal plane. Horizontal movement in the present disclosure may include the robot moving forward, moving backward, traversing left or right, and turning. The current trigger position may be a specific point on the target control touched by the user at the current moment, or the rotation angle relative to a preset direction as the user's finger slides on the target control. In one implementation, the current trigger position may be sampled periodically at a preset interval.
By way of example, fig. 3 is a schematic diagram of a remote control interface of the target application. As shown in fig. 3, a left wheel and a right wheel are arranged on the remote control interface. In one possible implementation, the first preset control may include the left wheel or the right wheel: the user can control the robot to move forward or backward by operating the left wheel, and control the robot to traverse left or right by operating the right wheel (traversing here means that the robot keeps its current orientation while moving sideways). Specifically, to control forward or backward movement with the left wheel, the float at the center of the left wheel can be dragged upward or downward with a finger; to control left or right traversal with the right wheel, the float at the center of the right wheel can be dragged leftward or rightward. In addition, the user can control the robot to turn through the left wheel, for example by dragging the center float of the left wheel in an arc away from the preset direction. The preset direction can be understood as the sliding direction used to control forward or backward movement: for example, if dragging the left-wheel center float straight up makes the robot move forward and dragging it straight down makes the robot move backward, then straight up and straight down are the preset directions. In this way, the terminal may sample the position of the user's finger during sliding at a preset interval (for example, every 20 ms). This is merely an example and does not limit the present disclosure.
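The geometry of a sampled touch point on a wheel can be sketched as follows. This is a minimal illustration, not code from the patent: it assumes screen coordinates (y increasing downward), that the preset direction is straight up, and that the touch point and wheel (buoy) center are given directly.

```python
import math

def trigger_geometry(touch_x: float, touch_y: float,
                     center_x: float, center_y: float):
    """Return (r, deviation_deg): the distance of the touch point from the
    wheel center, and its angular deviation from the 'straight up' preset
    direction, in degrees (0 = straight up, 90 = horizontal)."""
    dx = touch_x - center_x
    dy = touch_y - center_y
    r = math.hypot(dx, dy)
    # atan2(dx, -dy) measures the angle from straight up in screen coordinates.
    deviation_deg = abs(math.degrees(math.atan2(dx, -dy)))
    return r, deviation_deg
```

These two quantities, r and the deviation angle, are exactly the inputs the following steps scale against the preset maximum motion parameter thresholds.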
It should be noted that, when controlling horizontal movement through the left wheel or the right wheel shown in fig. 3, the user's sliding operation resembles turning a car's steering wheel. This gives the user a stronger sense of driving and better matches the user's actual operating habits, resulting in a better operating experience.
In step S122, the motion parameter is determined according to the current trigger position and a preset maximum motion parameter threshold.
The motion parameters may include a motion acceleration or a turning angular velocity: the motion acceleration applies when the robot moves forward, backward, or traverses left or right, and the turning angular velocity is the rotational speed while the robot turns. When the motion acceleration is being determined, the preset maximum motion parameter threshold may include the robot's maximum running acceleration or maximum running speed; when the turning angular velocity is being determined, the maximum motion parameter threshold may include a maximum angular velocity and a maximum rotation angle.
As described above, the first preset control may include the left wheel or the right wheel shown in fig. 3. When calculating the robot's motion acceleration at the current moment, the distance between the current finger trigger point and the wheel center may be computed from the current trigger position, and the motion acceleration then computed from that distance and the maximum running acceleration. When calculating the robot's turning angular velocity at the current moment, the deviation angle between the finger trigger point and the preset direction may be determined from the current trigger position, and the turning angular velocity then computed from that deviation angle, the maximum angular velocity, and the maximum rotation angle.
For example, let the maximum running acceleration be maxA and the wheel radius be R, and let r be the distance between the current finger trigger point and the wheel center computed from the current trigger position; the robot's motion acceleration is then a = (r / R) × maxA. Similarly, let the maximum angular velocity be maxW and the maximum rotation angle be X; if the deviation angle between the user's current finger trigger point and the preset direction is x, the robot's turning angular velocity is w = (x / X) × maxW. The above examples are merely illustrative and do not limit the present disclosure.
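The two scaling formulas, a = (r / R) × maxA and w = (x / X) × maxW, can be written out directly. Clamping r and x to their maxima is an added assumption, not stated in the text, to keep the outputs within the preset thresholds:

```python
def motion_acceleration(r: float, R: float, max_a: float) -> float:
    """a = (r / R) * maxA: scale acceleration by how far the float is dragged
    from the wheel center (r), relative to the wheel radius (R)."""
    r = min(r, R)  # assumed clamp: never exceed the wheel radius
    return (r / R) * max_a

def turning_angular_velocity(x: float, X: float, max_w: float) -> float:
    """w = (x / X) * maxW: scale angular velocity by the deviation angle (x)
    relative to the maximum rotation angle (X)."""
    x = min(abs(x), X)  # assumed clamp: never exceed the maximum rotation angle
    return (x / X) * max_w
```

Dragging the float halfway out thus yields half the maximum acceleration, and a deviation of half the maximum rotation angle yields half the maximum angular velocity.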
In step S123, the target control command is determined according to the motion parameter, the control type of the target control, and the sliding direction of the sliding operation.
Here, the target control instruction may include a first control instruction for controlling the robot to move in a first direction at the motion acceleration, or a second control instruction for controlling the robot to move in a second direction at the motion acceleration. The upper left corner of fig. 3 shows the first direction and the second direction, which intersect: moving in the first direction may include controlling the robot to move forward or backward, and moving in the second direction may include controlling the robot to traverse left or right.
In this step, when the control type is the first control and the sliding direction is the preset direction, the first control instruction may be generated according to the motion acceleration and the sliding direction. After the sliding direction is determined to be the preset direction, it is further determined whether the slide is upward or downward along that direction: an upward slide controls the robot to move forward at the motion acceleration, and a downward slide controls the robot to move backward at the motion acceleration; in either case the first control instruction is generated accordingly. The first control may be, for example, the left wheel shown in fig. 3.
Likewise, when the control type is the second control, the second control instruction is generated according to the motion acceleration and the sliding direction, which again requires determining the sliding direction: a leftward slide controls the robot to traverse left, and a rightward slide controls the robot to traverse right; in either case the second control instruction is generated accordingly. The second control may be, for example, the right wheel shown in fig. 3.
In addition, the target control instruction in the present disclosure may further include a third control instruction for controlling the robot to turn at the turning angular velocity; in this scenario the motion parameter is the turning angular velocity, and the robot's turning may also be controlled through the first control. Therefore, in this step, when the control type is the first control and the sliding direction is not the preset direction but an arc slide away from it, the third control instruction for controlling the robot to turn at the turning angular velocity may be generated according to the turning angular velocity and the sliding direction.
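The mapping from control type and sliding direction to the first, second, or third control instruction described in the steps above can be summarized in a small dispatch function. All identifier names here are illustrative, not from the patent:

```python
def classify_instruction(control_type: str, sliding_direction: str,
                         preset_directions=("up", "down")) -> str:
    """Decide which control instruction to generate from the triggered
    control type and the sliding direction of the sliding operation."""
    if control_type == "first_control":        # e.g. the left wheel
        if sliding_direction in preset_directions:
            return "first_instruction"          # move forward/backward (first direction)
        return "third_instruction"              # arc slide off the preset direction: turn
    if control_type == "second_control":        # e.g. the right wheel
        return "second_instruction"             # traverse left/right (second direction)
    raise ValueError(f"unknown control type: {control_type}")
```

The generated instruction would then carry the motion acceleration (first and second instructions) or the turning angular velocity (third instruction) computed in the previous step.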
Based on the steps shown in fig. 1 and fig. 2, a target control instruction can be determined according to the user's triggering operation on the target application interface and the triggered target control. Because the target control instruction contains motion parameters, the robot can be controlled to move according to those parameters while executing the instruction, which enriches the functions for controlling the robot through a terminal application. In addition, a second preset control is also provided on the target application interface, through which the user can make the robot execute a selected target gesture action.
Fig. 4 is a flowchart of a control method of a robot according to the embodiment shown in fig. 1, and as shown in fig. 4, the method further includes the steps of:
In step S14, if the target control is a second preset control for controlling the robot to execute a preset gesture action, the target gesture action selected by the user is determined according to the triggering operation.
The preset gesture actions may include lying down, standing up, back flips, front flips, bowing, rolling, dog-style walking, 90-degree jumps, and the like, as well as gaits such as four-legged hopping and three-legged walking. Different preset gesture actions correspond to different second preset controls, which may include, for example, the "down", "gesture", "action", and "follow" controls in the remote control interface shown in fig. 3.
In this step, the terminal may determine, according to a second preset control triggered by the user, the target gesture corresponding to the control.
In step S15, a fourth control instruction is generated according to the target gesture, where the fourth control instruction is used to control the robot to execute the target gesture.
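Steps S14 and S15 amount to a lookup from the triggered second preset control to a gesture action, followed by packaging the fourth control instruction. The control ids, action names, and the instruction's dictionary shape below are assumptions for illustration only:

```python
# Hypothetical binding of second preset controls to gesture actions; the
# action names follow the examples given in the text.
GESTURE_BY_CONTROL = {
    "down_button": "lie_down",
    "flip_button": "back_flip",
    "bow_button": "bow",
    "follow_button": "auto_follow",
}

def fourth_control_instruction(triggered_control: str) -> dict:
    """Step S14: resolve the target gesture action from the triggered control;
    step S15: build the fourth control instruction that carries it."""
    action = GESTURE_BY_CONTROL.get(triggered_control)
    if action is None:
        raise KeyError(f"no gesture action bound to control: {triggered_control}")
    return {"instruction": "fourth", "gesture_action": action}
```

The robot's system control module would receive this instruction and execute the named gesture action.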
In this way, based on the robot control method provided by the present disclosure, the target application on the terminal can control not only the robot's basic movement functions but also its various gesture actions, providing richer functional control. For example, when the user triggers the second preset control corresponding to "back flip" on the remote control interface of the target application, the robot dog can be controlled to perform a back flip.
A third preset control may also be arranged on the terminal's target application interface and used to display the robot's preset basic operation information, which may include the robot's running speed, operating mode, network connection state, remaining battery level, and the like.
Accordingly, in one embodiment of the present disclosure, as shown in fig. 4, the method further comprises the steps of:
in step S16, in response to a triggering operation of the user on the third preset control, preset basic operation information of the robot is displayed.
The triggering operation here may include a sliding operation or a clicking operation on the third preset control, and so on.
The robot control method provided by the present disclosure can achieve not only the function control described above but also emergency-stop control of the robot through the target application on the terminal. For example, a fourth preset control may be arranged on the target application interface, so that in a sudden or unexpected situation the user can trigger the fourth preset control to make the robot perform an emergency stop.
Fig. 5 is a block diagram illustrating a control apparatus of a robot according to an exemplary embodiment, as shown in fig. 5, the apparatus including:
A first determining module 501, configured to determine a target control triggered by a user in response to a triggering operation of the user on a target application interface on a terminal, where the target application is used for performing motion control on the robot;
A second determining module 502 configured to determine a target control instruction according to the trigger operation and the target control, the target control instruction including a motion parameter of the robot;
a first control module 503 configured to control the robot to move according to the motion parameters specified in the target control instruction.
Optionally, the triggering operation includes a sliding operation, and the second determining module 502 is configured to: obtain a current trigger position corresponding to the sliding operation when the target control is a first preset control for controlling horizontal movement of the robot; determine the motion parameters according to the current trigger position and a preset maximum motion parameter threshold; and determine the target control instruction according to the motion parameters, the control type of the target control, and the sliding direction of the sliding operation.
Optionally, the target control instruction includes a first control instruction or a second control instruction, and the motion parameter includes a motion acceleration. The first preset control includes a first control or a second control, the first control being used for controlling the robot to move in a first direction and the second control being used for controlling the robot to translate in a second direction, where the first direction intersects the second direction. The second determining module 502 is configured to: when the control type is the first control and the sliding direction is a preset direction, generate the first control instruction according to the motion acceleration and the sliding direction, the first control instruction being used to control the robot to move in the first direction at the motion acceleration; and, when the control type is the second control, generate the second control instruction according to the motion acceleration and the sliding direction, the second control instruction being used to control the robot to move in the second direction at the motion acceleration.
Optionally, the target control instruction includes a third control instruction, the motion parameter includes a turning angular speed, and the first control is further used for controlling the robot to turn. The second determining module 502 is configured to generate, when the control type is the first control and the sliding direction is not the preset direction, the third control instruction according to the turning angular speed and the sliding direction, the third control instruction being used to control the robot to turn at the turning angular speed.
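The instruction-determination logic described in the three paragraphs above can be sketched as follows. This is a minimal illustration under stated assumptions: the trigger position is normalized to the control's range, and the record type, field names, and direction strings are hypothetical — the patent specifies only the decision structure, not concrete data types.

```python
from dataclasses import dataclass

# Hypothetical instruction record; the patent only requires that the
# instruction carry a motion parameter (acceleration or angular speed).
@dataclass
class ControlInstruction:
    kind: str         # "first" (move), "second" (translate), "third" (turn)
    parameter: float  # motion acceleration or turning angular speed
    direction: str    # sliding direction, e.g. "up", "down", "left", "right"

def motion_parameter(trigger_pos: float, max_threshold: float) -> float:
    """Scale the current trigger position (normalized to 0..1 of the
    control's travel) by the preset maximum motion parameter threshold."""
    return max(0.0, min(trigger_pos, 1.0)) * max_threshold

def build_instruction(control_type: str, slide_dir: str, trigger_pos: float,
                      max_accel: float, max_turn_rate: float) -> ControlInstruction:
    # First control + preset (here: vertical) slide -> move in the first direction.
    if control_type == "first" and slide_dir in ("up", "down"):
        return ControlInstruction("first", motion_parameter(trigger_pos, max_accel), slide_dir)
    # First control + non-preset slide -> turn at the derived angular speed.
    if control_type == "first":
        return ControlInstruction("third", motion_parameter(trigger_pos, max_turn_rate), slide_dir)
    # Second control -> translate in the second direction.
    return ControlInstruction("second", motion_parameter(trigger_pos, max_accel), slide_dir)
```

For example, a half-travel upward slide on the first control would yield a first control instruction whose acceleration is half the preset maximum, while a sideways slide on the same control would yield a third (turning) instruction instead.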
Optionally, Fig. 6 is a block diagram of a control apparatus of a robot according to the embodiment shown in Fig. 5. As shown in Fig. 6, the apparatus further includes:
A second control module 504, configured to determine, according to the triggering operation, the target gesture action selected by the user when the target control is a second preset control for controlling the robot to perform a preset gesture action; and to generate a fourth control instruction according to the target gesture action, the fourth control instruction being used to control the robot to execute the target gesture action.
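A minimal sketch of this gesture branch is shown below. The catalogue of preset gesture actions and the dict-shaped instruction are hypothetical — the patent names no concrete gestures and no message format.

```python
# Hypothetical catalogue of preset gesture actions; the patent does not
# name any concrete gestures, these are placeholders for illustration.
PRESET_GESTURES = {"wave", "bow", "nod"}

def build_fourth_instruction(selected_gesture: str) -> dict:
    """Generate the fourth control instruction that tells the robot to
    execute the target gesture action selected by the user."""
    if selected_gesture not in PRESET_GESTURES:
        raise ValueError(f"unknown gesture action: {selected_gesture}")
    return {"type": "fourth", "gesture": selected_gesture}
```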
Optionally, a third preset control is further deployed on the interface of the target application program, as shown in fig. 6, and the apparatus further includes:
A display module 505, configured to display preset basic operation information of the robot in response to a triggering operation of the third preset control by the user.
Optionally, the first control module 503 is configured to send the target control instruction to the robot through a remote procedure call (gRPC) network communication module, so that the robot moves according to the motion parameters in accordance with the target control instruction.
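This dispatch path can be illustrated as below. A real implementation would call a method on a stub generated from the robot's protobuf service definition; since that definition is not part of the disclosure, the channel here is a stand-in that simply records the serialized instruction, and the JSON encoding is an assumption for illustration.

```python
import json

class FakeChannel:
    """Stand-in for the gRPC network communication module; a real client
    would invoke a method on a stub generated from the robot's .proto
    service instead of appending to a list."""
    def __init__(self):
        self.sent = []

    def send(self, payload: bytes) -> None:
        self.sent.append(payload)

def dispatch_instruction(channel, instruction: dict) -> None:
    # Serialize the target control instruction and hand it to the
    # communication module; on receipt, the robot executes the motion
    # described by the contained motion parameters.
    channel.send(json.dumps(instruction).encode("utf-8"))
```

Keeping serialization and transport behind a single `dispatch_instruction` entry point mirrors the apparatus structure above, where the first control module owns the sending step regardless of which instruction type was generated.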
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
The present disclosure also provides a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the control method of the robot provided by the present disclosure.
Fig. 7 is a block diagram illustrating a control apparatus 700 for a robot according to an exemplary embodiment. For example, apparatus 700 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 7, an apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the apparatus 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the apparatus 700. Examples of such data include instructions for any application or method operating on the apparatus 700, contact data, phonebook data, messages, pictures, videos, and the like. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 700.
The multimedia component 708 includes a screen that provides an output interface between the apparatus 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 700 is in an operational mode, such as a photographing mode or a video mode. Each of the front-facing and rear-facing cameras may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The input/output interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an on/off state of the device 700, a relative positioning of the components, such as a display and keypad of the device 700, a change in position of the device 700 or a component of the device 700, the presence or absence of user contact with the device 700, an orientation or acceleration/deceleration of the device 700, and a change in temperature of the device 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the apparatus 700 and other devices. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the control method of the robot described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 704, including instructions executable by processor 720 of apparatus 700 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned control method of a robot when being executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (7)

1. A control method of a robot, comprising:
in response to a triggering operation of a user on a target application interface on a terminal, determining a target control triggered by the user, wherein the target application is used for controlling movement of the robot;
Determining a target control instruction according to the triggering operation and the target control, wherein the target control instruction comprises the motion parameters of the robot; the target control instruction comprises a first control instruction or a second control instruction, and the motion parameter comprises motion acceleration or turning angular velocity;
controlling the robot to move according to the motion parameters according to the target control instruction;
The triggering operation comprises a sliding operation, and the determining a target control instruction according to the triggering operation and the target control comprises the following steps:
Acquiring a current trigger position corresponding to the sliding operation under the condition that the target control is a first preset control for controlling the horizontal movement of the robot; determining the motion parameters according to the current trigger position and a preset maximum motion parameter threshold; determining the target control instruction according to the motion parameter, the control type of the target control and the sliding direction of the sliding operation; the first preset control comprises a first control or a second control, the first control is used for controlling the robot to move according to a first direction, and the second control is used for controlling the robot to translate according to a second direction; the first direction intersects the second direction;
the determining the target control instruction according to the motion parameter, the control type of the target control and the sliding direction of the sliding operation includes:
when the control type is the first control and the sliding direction is a preset direction, generating a first control instruction according to the motion acceleration and the sliding direction, wherein the first control instruction is used for controlling the robot to move in the first direction according to the motion acceleration;
Generating a second control instruction according to the motion acceleration and the sliding direction under the condition that the control type is the second control, wherein the second control instruction is used for controlling the robot to move in the second direction according to the motion acceleration;
the first control is further used for controlling the robot to turn; and when the control type is the first control and the sliding direction is not the preset direction, generating a third control instruction according to the turning angular speed and the sliding direction, wherein the third control instruction is used for controlling the robot to turn according to the turning angular speed, the target control instruction comprising the third control instruction.
2. The method according to claim 1, wherein the method further comprises:
Under the condition that the target control is a second preset control for controlling the robot to execute preset gesture actions, determining the target gesture actions selected by the user according to the triggering operation;
and generating a fourth control instruction according to the target gesture, wherein the fourth control instruction is used for controlling the robot to execute the target gesture.
3. The method of claim 1, wherein a third preset control is further deployed on the interface of the target application, the method further comprising:
and responding to the triggering operation of the user on the third preset control, and displaying the preset basic operation information of the robot.
4. A method according to any one of claims 1-3, wherein said controlling the robot to move according to the movement parameters in accordance with the target control instructions comprises:
sending the target control instruction to the robot through a remote procedure call (gRPC) network communication module, so that the robot moves according to the motion parameters in accordance with the target control instruction.
5. A control device for a robot, comprising:
The first determining module is configured to respond to triggering operation of a user on a target application program interface on a terminal, determine a target control triggered by the user, and the target application program is used for controlling the movement of the robot;
A second determining module configured to determine a target control instruction according to the trigger operation and the target control, the target control instruction including a motion parameter of the robot; the target control instruction comprises a first control instruction or a second control instruction, and the motion parameter comprises motion acceleration or turning angular velocity;
the control module is configured to control the robot to move according to the motion parameters according to the target control instruction;
The triggering operation comprises a sliding operation, and the second determining module is configured to acquire a current triggering position corresponding to the sliding operation under the condition that the target control is a first preset control for controlling the horizontal movement of the robot; determining the motion parameters according to the current trigger position and a preset maximum motion parameter threshold; determining the target control instruction according to the motion parameter, the control type of the target control and the sliding direction of the sliding operation; the first preset control comprises a first control or a second control, the first control is used for controlling the robot to move according to a first direction, and the second control is used for controlling the robot to translate according to a second direction; the first direction intersects the second direction;
The second determining module is further configured to: when the control type is the first control and the sliding direction is a preset direction, generate the first control instruction according to the motion acceleration and the sliding direction, wherein the first control instruction is used for controlling the robot to move in the first direction according to the motion acceleration; when the control type is the second control, generate the second control instruction according to the motion acceleration and the sliding direction, wherein the second control instruction is used for controlling the robot to move in the second direction according to the motion acceleration; the first control is further used for controlling the robot to turn; and when the control type is the first control and the sliding direction is not the preset direction, generate the third control instruction according to the turning angular speed and the sliding direction, wherein the third control instruction is used for controlling the robot to turn according to the turning angular speed, the target control instruction comprising the third control instruction.
6. A control device for a robot, comprising:
A processor;
A memory for storing processor-executable instructions;
Wherein the processor is configured to perform the steps of the method of any of claims 1-4.
7. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the method of any of claims 1-4.
CN202211559264.1A 2022-12-06 2022-12-06 Robot control method, device and storage medium Active CN117245642B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211559264.1A CN117245642B (en) 2022-12-06 2022-12-06 Robot control method, device and storage medium

Publications (2)

Publication Number Publication Date
CN117245642A (en) 2023-12-19
CN117245642B (en) 2024-06-11

Family

ID=89130066


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106625670A (en) * 2016-12-26 2017-05-10 迈赫机器人自动化股份有限公司 Control system and method of multifunctional man-machine interaction humanoid teaching robot
CN110154016A (en) * 2018-08-09 2019-08-23 腾讯科技(深圳)有限公司 Robot control method, device, storage medium and computer equipment
CN110580843A (en) * 2018-09-05 2019-12-17 南京科青信息科技有限公司 robot control method, device, equipment and readable medium
CN111760275A (en) * 2020-07-08 2020-10-13 网易(杭州)网络有限公司 Game control method and device and electronic equipment
CN112558955A (en) * 2019-09-10 2021-03-26 广州途道信息科技有限公司 Robot programming and control method, readable storage medium, and computing device
CN112731936A (en) * 2020-12-28 2021-04-30 上海有个机器人有限公司 Method, device, medium and intelligent terminal for scanning remote-controlled robot




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant