CN112518747A - Robot control method, robot control device and wearable device - Google Patents

Robot control method, robot control device and wearable device

Info

Publication number
CN112518747A
Authority
CN (China)
Prior art keywords
gesture, executed, robot, preset, target robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011348892.6A
Other languages
Chinese (zh)
Inventor
彭克刚
雷雄
钟永
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd
Priority to CN202011348892.6A
Publication of CN112518747A

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/006 Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a robot control method, a robot control apparatus, a wearable device and a computer-readable storage medium. The method includes: detecting, according to a sensor carried in the wearable device, whether a user has made a gesture to be executed through the wearable device; when it is determined that the user has made a gesture to be executed through the wearable device, judging whether the gesture to be executed is legal according to a preset gesture database, in which at least one piece of preset gesture data is stored; if the gesture to be executed is legal, judging whether it is valid according to a preset gesture condition; and if the gesture to be executed is valid, controlling a target robot to execute the action corresponding to the gesture, the target robot being the robot that has established a communication connection with the wearable device. This scheme lets a user control a robot through a wearable device, simplifying the control flow and improving the robot's operability and flexibility.

Description

Robot control method, robot control device and wearable device
Technical Field
The application belongs to the technical field of intelligent-device interaction, and in particular relates to a robot control method, a robot control apparatus, a wearable device and a computer-readable storage medium.
Background
With the development of artificial intelligence, robots now provide convenience in many aspects of daily life. Currently, however, most robots are still controlled through buttons. For a low-end robot, the manufacturer typically maps the robot's functions to physical keys on the robot body, so the user controls the robot by pressing those keys; for a high-end robot, to achieve better interaction, the manufacturer usually develops a companion application, so the user controls the robot by pressing virtual keys or operating a console on a mobile device (e.g., a smartphone) running that application. Both control methods are cumbersome, which leaves most current robots with poor operability and flexibility.
Disclosure of Invention
The application provides a robot control method, a robot control apparatus, a wearable device and a computer-readable storage medium, so that a user can control a robot through the wearable device, simplifying the control flow and improving the robot's operability and flexibility.
In a first aspect, the present application provides a robot control method applied to a wearable device, including:
detecting, according to a sensor carried in the wearable device, whether a user has made a gesture to be executed through the wearable device;
when it is determined that the user has made the gesture to be executed through the wearable device, judging whether the gesture to be executed is legal according to a preset gesture database, wherein at least one piece of preset gesture data is stored in the gesture database;
if the gesture to be executed is legal, judging whether the gesture to be executed is valid according to a preset gesture condition; and
if the gesture to be executed is valid, controlling a target robot to execute the action corresponding to the gesture to be executed, wherein the target robot is a robot that has established a communication connection with the wearable device.
In a second aspect, the present application provides a robot control apparatus applied to a wearable device, including:
a detection unit, configured to detect, according to a sensor carried in the wearable device, whether a user has made a gesture to be executed through the wearable device;
a first judging unit, configured to judge, when it is determined that the user has made the gesture to be executed through the wearable device, whether the gesture to be executed is legal according to a preset gesture database, wherein at least one piece of preset gesture data is stored in the gesture database;
a second judging unit, configured to judge, if the gesture to be executed is legal, whether the gesture to be executed is valid according to a preset gesture condition; and
a control unit, configured to control, if the gesture to be executed is valid, a target robot to execute the action corresponding to the gesture to be executed, wherein the target robot is a robot that has established a communication connection with the wearable device.
In a third aspect, the present application provides a wearable device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, performs the steps of the method according to the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
Compared with the prior art, the application has the following beneficial effects. A sensor carried in the wearable device detects whether the user has made a gesture to be executed for controlling the robot. If so, the legality of the gesture is judged first, which recognizes the gesture; its validity is then judged on the premise that it is legal, which avoids misoperation by the user. Finally, when the gesture is valid, the target robot that has established a communication connection with the wearable device is controlled to execute the corresponding action. In this way the user controls the robot through the wearable device, which simplifies the control flow and improves the robot's operability and flexibility. The beneficial effects of the second to fifth aspects follow from the description of the first aspect and are not repeated here.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described here show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an implementation of a robot control method provided in an embodiment of the present application;
Fig. 2-1 is an exemplary diagram of an arm-up gesture provided by an embodiment of the present application;
Fig. 2-2 is an exemplary diagram of an arm-down gesture provided by an embodiment of the present application;
Fig. 2-3 is an exemplary diagram of an arm left-rotation gesture provided by an embodiment of the present application;
Fig. 2-4 is an exemplary diagram of an arm right-rotation gesture provided by an embodiment of the present application;
Fig. 2-5 is an exemplary diagram of an upward hand-swing gesture provided by an embodiment of the present application;
Fig. 2-6 is an exemplary diagram of a downward hand-swing gesture provided by an embodiment of the present application;
Fig. 2-7 is an exemplary diagram of an arm left-movement gesture provided by an embodiment of the present application;
Fig. 2-8 is an exemplary diagram of an arm right-movement gesture provided by an embodiment of the present application;
FIG. 3 is an exemplary diagram of a three-axis coordinate system provided by an embodiment of the present application;
fig. 4 is a block diagram of a robot control device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a wearable device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution proposed in the present application, the following description will be given by way of specific examples.
A robot control method provided in an embodiment of the present application is described below. The method is applied to a wearable device worn on the user's wrist, for example a smart watch or a smart band; the device type is not limited here. In the following description the wearable device is taken to be a smart watch. Referring to Fig. 1, the robot control method includes:
Step 101: detecting, according to a sensor carried in the wearable device, whether the user has made a gesture to be executed through the wearable device.
in the embodiment of the application, before a user controls a robot through a smart watch, a communication connection between the smart watch and the robot waiting to be controlled needs to be established. Illustratively, the smart watch may establish a communication connection with the robot through bluetooth; alternatively, the smart watch may establish a communication connection with the robot through a wireless local area network (e.g., Wi-Fi), and the manner in which the robot establishes a communication connection with the smart watch is not limited herein. For convenience of explanation, in the embodiments of the present application, a robot that establishes a communication connection with a smart watch is referred to as a target robot.
The smart watch carries various sensors for detecting its own posture. After the communication connection with the target robot is established, these sensors detect whether the user makes a gesture to be executed through the smart watch. To detect gestures, a default state of the smart watch (the state in which it waits to control the robot, i.e., the initial state of every gesture) is defined as follows: the dial of the smart watch is perpendicular to the horizontal plane, and the two edges of the dial that connect to the strap are parallel to the horizontal plane. For example, when the user wears the smart watch on the left wrist, lifts the left forearm forward and makes a fist with the web of the thumb facing up and the knuckles facing inward, the smart watch is in the default state. Note that, since absolute parallelism and absolute perpendicularity are hard to achieve in real life, a deviation-angle threshold is applied when judging whether the smart watch is in the default state: the watch is considered to be in the default state as long as the angle between the dial and the vertical is below the corresponding threshold and the angle between the strap-connecting edges of the dial and the horizontal is below the corresponding threshold.
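The default-state check above can be sketched as follows. This is an illustrative assumption, not code from the patent: the 10° threshold, the function name and the parameter names are all hypothetical, and the two deviation angles are assumed to be computed elsewhere from the sensor readings.

```python
# Hypothetical sketch: is the watch in the default (waiting) state?
# A deviation-angle threshold tolerates the fact that absolute vertical/
# horizontal alignment is unrealistic. The 10-degree value is an assumption.
DEVIATION_THRESHOLD_DEG = 10.0

def is_default_state(dial_vertical_dev_deg, band_horizontal_dev_deg,
                     threshold=DEVIATION_THRESHOLD_DEG):
    """Return True when both deviation angles are within the threshold.

    dial_vertical_dev_deg: angle between the dial plane and the vertical.
    band_horizontal_dev_deg: angle between the strap-connecting edges of
    the dial and the horizontal plane.
    """
    return (abs(dial_vertical_dev_deg) < threshold
            and abs(band_horizontal_dev_deg) < threshold)
```

Gesture detection would only begin from readings for which this check holds, since every gesture starts from the default state.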
In some embodiments, considering that different robots support different functions, gestures are divided into two categories. The first category is basic gestures: universal gestures used to control any robot to execute common movement actions, such as moving forward, moving backward, turning left or turning right. The second category is trick gestures, which only control certain special robots: a trick gesture makes a robot of a preset type execute a preset action, and different preset types support different preset actions. For example, a wheeled robot may support left shift, right shift or drift, while a humanoid robot may support jumping, dancing and so on; the actions are not limited here.
In some embodiments, the arm-down gesture, arm-up gesture, arm left-rotation gesture and arm right-rotation gesture may be set as basic gestures. Fig. 2-1 shows an arm-up gesture: from the default state, the user lifts the arm with the elbow or shoulder as the pivot; this gesture controls the robot to move backward. Fig. 2-2 shows an arm-down gesture: from the default state, the user lowers the arm with the shoulder as the pivot; this gesture controls the robot to move forward. Fig. 2-3 shows an arm left-rotation gesture: from the default state, the user rotates the wrist to the left so that, on the left hand wearing the smart watch, the fist faces up and the back of the hand faces down; this gesture controls the robot to turn left. Fig. 2-4 shows an arm right-rotation gesture: from the default state, the user rotates the wrist to the right so that the fist faces down and the back of the hand faces up; this gesture controls the robot to turn right.
In some embodiments, the arm left-movement gesture, arm right-movement gesture, upward hand-swing gesture and downward hand-swing gesture may be set as trick gestures. Fig. 2-5 shows an upward hand-swing gesture: from the default state, the user swings the arm up quickly and widely with the shoulder as the pivot. It resembles the arm-up gesture, except that the movement amplitude, acceleration and speed of the arm are generally larger. Similarly, Fig. 2-6 shows a downward hand-swing gesture: from the default state, the user swings the arm down quickly and widely with the shoulder as the pivot; it resembles the arm-down gesture, again with larger amplitude, acceleration and speed. Fig. 2-7 shows an arm left-movement gesture: from the default state, the arm moves to the left. Fig. 2-8 shows an arm right-movement gesture: from the default state, the arm moves to the right.
Note that the same trick gesture may trigger different preset actions on robots of different preset types. For example, when the upward hand-swing gesture controls a wheeled robot, the wheeled robot performs a leftward drift; when the same gesture controls a humanoid robot, the humanoid robot dances.
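The per-robot-type dispatch described above can be sketched as a lookup table. Only the two pairings actually stated in the text (upward hand-swing: wheeled drifts left, humanoid dances) are taken from the patent; the downward hand-swing entries and all identifier names are illustrative assumptions.

```python
# Hypothetical mapping from trick gesture to per-robot-type preset action.
# "hand_swing_up" pairings come from the text; "hand_swing_down" pairings
# are assumed for illustration only.
TRICK_ACTION_MAP = {
    "hand_swing_up": {"wheeled": "drift_left", "humanoid": "dance"},
    "hand_swing_down": {"wheeled": "drift_right", "humanoid": "jump"},
}

def resolve_trick_action(gesture, robot_type):
    """Return the preset action this gesture triggers on this robot type,
    or None when the robot type does not support the trick gesture."""
    return TRICK_ACTION_MAP.get(gesture, {}).get(robot_type)
```

Basic gestures need no such table, since their movement actions are universal across robot types.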
In some embodiments, whether the user has made a gesture to be executed through the wearable device may be detected by an accelerometer sensor, an orientation sensor or a linear accelerometer sensor carried by the wearable device. Considering the characteristics of the basic and trick gestures: the accelerometer detects whether the user has made a gesture belonging to the basic gestures, i.e., any one of the arm-down, arm-up, arm left-rotation and arm right-rotation gestures; the orientation sensor detects whether the gesture to be executed is an arm left-movement or arm right-movement gesture; and the linear accelerometer detects whether the gesture to be executed is an upward or downward hand-swing gesture.
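The sensor-to-gesture assignment in the paragraph above amounts to a small dispatch table, sketched here under assumed names (the gesture and sensor identifiers are not from the patent):

```python
# Which sensor is responsible for detecting which gesture, per the scheme:
# accelerometer -> the four basic gestures,
# orientation sensor -> the two arm-movement trick gestures,
# linear accelerometer -> the two hand-swing trick gestures.
SENSOR_FOR_GESTURE = {
    "arm_down": "accelerometer",
    "arm_up": "accelerometer",
    "arm_rotate_left": "accelerometer",
    "arm_rotate_right": "accelerometer",
    "arm_move_left": "orientation",
    "arm_move_right": "orientation",
    "hand_swing_up": "linear_accelerometer",
    "hand_swing_down": "linear_accelerometer",
}

def sensor_for(gesture):
    """Return the sensor that detects the given preset gesture."""
    return SENSOR_FOR_GESTURE[gesture]
```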
Step 102: when it is determined that the user has made the gesture to be executed through the wearable device, judging whether the gesture to be executed is legal according to a preset gesture database.
in this embodiment of the application, at least one preset gesture data is stored in the gesture database, where the gesture data may be a change state of sensor data corresponding to each preset gesture, and/or a change state of a gesture of the smart watch corresponding to each preset gesture. In order to verify the validity of the gesture to be executed, whether gesture data corresponding to the gesture to be executed exists in the gesture database or not can be detected, and when the gesture data corresponding to the gesture to be executed exists in the gesture database, the gesture to be executed can be determined to be legal. Establishing a three-axis coordinate system by taking the center of the intelligent watch as an original point, the direction perpendicular to the watchband on the plane of the dial plate of the intelligent watch as an x-axis, the direction perpendicular to the x-axis on the plane of the dial plate of the intelligent watch as a y-axis and the direction perpendicular to the dial plate of the intelligent watch as a z-axis based on a right-hand rule; it can be understood that the direction of three o 'clock in the dial plate of intelligent wrist-watch is the positive direction of x axle, and the direction of twelve o' clock in the dial plate is the positive direction of y axle, and perpendicular dial plate upward direction is the positive direction of z axle. Referring to fig. 3, fig. 3 shows an example of the three-axis coordinate system. Under the constructed three-axis coordinate system, the explanation is made for each preset gesture:
for a hand gesture with the arm down, the stored corresponding gesture data in the gesture database is: the smart watch is inclined in the vertical direction, and the first acceleration is less than 0, wherein the first acceleration is the acceleration of the smart watch along the x axis, and the acceleration comprises gravity and has the unit of meter per second2(i.e., m/s)2). Acquiring a first acceleration of the smart watch in real time by using an accelerometer sensor of the smart watch; and determining whether the intelligent watch is inclined in the vertical direction or not according to the change value of the first acceleration. If the first acceleration is less than 0, and the intelligent watch is determined to be inclined in the vertical directionAnd the gesture to be recognized is considered to be legal, and the gesture to be recognized is likely to be a gesture with a downward arm.
For the arm-up gesture, the gesture data stored in the database is: the smart watch tilts in the vertical direction, and the first acceleration is greater than 0. The accelerometer acquires the first acceleration in real time, and whether the smart watch tilts in the vertical direction is determined from the change in that value. If the first acceleration is greater than 0 and the smart watch is determined to tilt in the vertical direction, the gesture to be executed is considered legal and is likely an arm-up gesture.
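The two vertical-direction legality checks can be sketched together: the sign of the x-axis ("first") acceleration selects between arm-down and arm-up, and the watch must additionally have been detected as tilting vertically. A minimal sketch under assumed names; how the tilt is derived from the acceleration's change is left abstract:

```python
# Hypothetical legality check for the arm-up / arm-down gestures.
# first_acceleration: x-axis acceleration including gravity, in m/s^2.
# tilted_vertically: result of the tilt detection from the value's change.
def classify_vertical_gesture(first_acceleration, tilted_vertically):
    """Return a candidate gesture name, or None when the reading is not legal."""
    if not tilted_vertically:
        return None  # no vertical tilt: neither gesture is legal
    if first_acceleration < 0:
        return "arm_down"
    if first_acceleration > 0:
        return "arm_up"
    return None
```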
For the arm left-rotation gesture, the gesture data stored in the database is: the smart watch tilts in the horizontal direction, and the second acceleration is less than 0, where the second acceleration is the acceleration of the smart watch along the z-axis, includes gravity, and is measured in m/s². The accelerometer acquires the second acceleration in real time, and whether the smart watch tilts in the horizontal direction is determined from the change in that value. If the second acceleration is less than 0 and the smart watch is determined to tilt in the horizontal direction, the gesture to be executed is considered legal and is likely an arm left-rotation gesture.
For the arm right-rotation gesture, the gesture data stored in the database is: the smart watch tilts in the horizontal direction, and the second acceleration is greater than 0. The accelerometer acquires the second acceleration in real time, and whether the smart watch tilts in the horizontal direction is determined from the change in that value. If the second acceleration is greater than 0 and the smart watch is determined to tilt in the horizontal direction, the gesture to be executed is considered legal and is likely an arm right-rotation gesture.
For the arm left-movement gesture, the gesture data stored in the database is: the azimuth of the smart watch is offset to the left relative to the initial position. The orientation sensor acquires the azimuth of the smart watch in real time, and the position of the smart watch in the default state is taken as the initial position. If the azimuth is detected to be offset to the left relative to the initial position, the gesture to be executed is considered legal and is likely an arm left-movement gesture.
For the arm right-movement gesture, the gesture data stored in the database is: the azimuth of the smart watch is offset to the right relative to the initial position. The orientation sensor acquires the azimuth in real time; if the azimuth is detected to be offset to the right relative to the initial position, the gesture to be executed is considered legal and is likely an arm right-movement gesture.
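The two azimuth-based legality checks can be sketched as a comparison against the azimuth captured in the default state. The sign convention (negative delta means a leftward offset) and all names are assumptions; azimuth wrap-around at 0°/360° is ignored for brevity:

```python
# Hypothetical legality check for the arm-movement gestures, based on the
# orientation sensor's azimuth readings (degrees). initial_azimuth_deg is
# the azimuth recorded while the watch was in the default state.
def classify_azimuth_gesture(initial_azimuth_deg, current_azimuth_deg):
    """Return a candidate gesture name, or None when the azimuth is unchanged."""
    delta = current_azimuth_deg - initial_azimuth_deg
    if delta < 0:
        return "arm_move_left"   # azimuth offset to the left (assumed sign)
    if delta > 0:
        return "arm_move_right"  # azimuth offset to the right (assumed sign)
    return None
```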
For the upward and downward hand-swing gestures, the gesture data stored in the database is: the third acceleration of the smart watch is greater than a preset first threshold, where the third acceleration is the acceleration of the smart watch along the y-axis, excludes gravity, and is measured in m/s². The first threshold may be 10 m/s² or another value; it is not limited here. The linear accelerometer acquires the third acceleration in real time; when it exceeds the first threshold, the gesture to be executed is considered legal and is likely an upward or downward hand-swing gesture.
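This trigger condition reduces to a single comparison, sketched below under assumed names; the 10 m/s² value is the example threshold from the text:

```python
# Hypothetical legality trigger for the hand-swing gestures: the y-axis
# ("third") linear acceleration, excluding gravity, must exceed the first
# threshold before either hand-swing gesture is considered legal.
FIRST_THRESHOLD = 10.0  # m/s^2, example value from the text

def hand_swing_candidate(third_acceleration, threshold=FIRST_THRESHOLD):
    """True when the reading may be an upward or downward hand-swing gesture."""
    return third_acceleration > threshold
```

Whether the swing was upward or downward is decided later, by the validity check in step 103.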
Step 103: if the gesture to be executed is legal, judging whether the gesture to be executed is valid according to a preset gesture condition.
in the embodiment of the application, it is considered that it is difficult for the user to keep the arm absolutely still, and therefore, when the user wears the smart watch, the gesture may be recognized by mistake due to the small-amplitude swing of the arm. In order to reduce the occurrence of such a situation, for each preset gesture, the embodiment of the application further sets a corresponding gesture condition to distinguish whether the gesture currently detected by the smart watch is a real gesture to be recognized or a gesture made by the user without intention. The following description is made for each preset gesture:
for the gesture of arm down, the gesture conditions are as follows: the tilt angle of the smart watch in the vertical direction exceeds a preset first tilt angle threshold (e.g., 15 °). That is, only if the first acceleration of the smart watch is less than 0 and the inclination angle of the smart watch in the vertical direction exceeds the first inclination angle threshold, it is determined that the gesture to be recognized is legal and effective, and the gesture with the arm facing downward can execute the subsequent steps to realize the control of the target robot.
For the gesture of the arm upward, the gesture conditions are as follows: the tilt angle of the smart watch in the vertical direction exceeds a preset first tilt angle threshold (e.g., 15 °). That is, only if the first acceleration of the smart watch is greater than 0 and the inclination angle of the smart watch in the vertical direction exceeds the first inclination angle threshold, it is determined that the gesture to be recognized is legal and effective, and the gesture with the arm facing upward can execute subsequent steps to realize the control of the target robot.
For the arm left-rotation gesture, the gesture condition is: the tilt angle of the smart watch in the horizontal direction exceeds a preset second tilt-angle threshold (e.g., 15°). That is, only when the second acceleration is less than 0 and the horizontal tilt angle exceeds the second tilt-angle threshold is the gesture judged to be both legal and valid as an arm left-rotation gesture, after which the subsequent steps control the target robot.
For the arm right-rotation gesture, the gesture condition uses the same threshold: only when the second acceleration is greater than 0 and the horizontal tilt angle exceeds the second tilt-angle threshold is the gesture judged to be both legal and valid as an arm right-rotation gesture, after which the subsequent steps control the target robot.
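The legality and validity checks for the four basic gestures can be combined into one sketch: the acceleration's sign picks the gesture, and the tilt angle must clear the 15° example threshold before the gesture counts as valid. All names and the packaging into one function are assumptions:

```python
# Hypothetical combined legality + validity check for the basic gestures.
# axis: "vertical" pairs the first (x-axis) acceleration with arm up/down;
# "horizontal" pairs the second (z-axis) acceleration with arm rotation.
TILT_THRESHOLD_DEG = 15.0  # example first/second tilt-angle threshold

def validate_basic_gesture(accel, tilt_deg, axis, threshold=TILT_THRESHOLD_DEG):
    """Return the validated gesture name, or None for an unintended swing."""
    if tilt_deg <= threshold:
        return None  # legal but not valid: likely a small unintended swing
    if axis == "vertical":
        return "arm_down" if accel < 0 else "arm_up" if accel > 0 else None
    if axis == "horizontal":
        return ("arm_rotate_left" if accel < 0
                else "arm_rotate_right" if accel > 0 else None)
    return None
```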
For the gesture of moving the arm to the left, the gesture conditions are as follows: the azimuth of the smart watch is offset to the left relative to the initial position by an angle exceeding a preset offset angle threshold (e.g., 30 °). That is, only when the angle of the azimuth angle of the smart watch, which is offset leftward relative to the initial position, exceeds the offset angle threshold, it is determined that the gesture to be recognized is legal and effective, and is a gesture in which the arm moves leftward, and at this time, the subsequent steps can be executed, so that the target robot can be controlled.
For the gesture of moving the arm to the right, the gesture conditions are as follows: the azimuth of the smart watch is offset to the right relative to the initial position by an angle exceeding a preset offset angle threshold (e.g., 30 °). That is, only when the angle of the azimuth angle of the smart watch, which is shifted to the right relative to the initial position, exceeds the offset angle threshold, it is determined that the gesture to be recognized is legal and effective, and the gesture is a gesture in which the arm is shifted to the right, and at this time, the subsequent steps can be executed, so that the target robot can be controlled.
For the gesture of retracting the hand upwards, the gesture conditions are as follows: the third acceleration of the smart watch exceeds a preset first threshold (e.g., 10 m/s²); starting from the nth millisecond thereafter (e.g., 500 milliseconds), m (e.g., 10) third accelerations are continuously acquired, and the sum of the m third accelerations is greater than a preset second threshold, where the second threshold is not less than 0.
For the gesture of swinging the hand downwards, the gesture conditions are as follows: the third acceleration of the smart watch exceeds the preset first threshold (e.g., 10 m/s²); starting from the nth millisecond thereafter (e.g., 500 milliseconds), m (e.g., 10) third accelerations are continuously acquired, and the sum of the m third accelerations is smaller than a preset third threshold, where the third threshold is not greater than 0.
That is, when the third acceleration of the smart watch exceeds the first threshold, the watch enters the trick-gesture recognition state; after n milliseconds, it continuously acquires m third accelerations and calculates their sum. If the sum is greater than the second threshold, the gesture is recognized as retracting the hand upwards; if the sum is smaller than the third threshold, it is recognized as swinging the hand downwards. The subsequent steps can then be executed to control the target robot.
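The threshold checks above can be summarized in code. The following Python sketch assumes signed acceleration, tilt, and azimuth-offset readings from the watch's sensors and uses the example threshold values from the text (15°, 30°, 10 m/s², sum thresholds of 0); the function names, gesture names, and the sign convention for the azimuth offset are illustrative, not taken from the original.

```python
# Example threshold values from the text; all of them are configurable.
FIRST_TILT_DEG = 15.0    # vertical tilt threshold (arm up / arm down)
SECOND_TILT_DEG = 15.0   # horizontal tilt threshold (arm left / right rotation)
OFFSET_DEG = 30.0        # azimuth offset threshold (arm move left / right)
SECOND_THRESHOLD = 0.0   # sum of m accelerations > this: hand retracted upwards
THIRD_THRESHOLD = 0.0    # sum of m accelerations < this: hand swung downwards

def classify_basic_gesture(a1, a2, v_tilt, h_tilt, azimuth_offset):
    """Return a basic gesture name, or None when no condition is met.
    a1/a2: first/second acceleration; azimuth_offset: signed, left < 0."""
    if a1 < 0 and v_tilt > FIRST_TILT_DEG:
        return "arm_down"
    if a1 > 0 and v_tilt > FIRST_TILT_DEG:
        return "arm_up"
    if a2 < 0 and h_tilt > SECOND_TILT_DEG:
        return "arm_left_rotation"
    if a2 > 0 and h_tilt > SECOND_TILT_DEG:
        return "arm_right_rotation"
    if azimuth_offset < -OFFSET_DEG:
        return "arm_move_left"
    if azimuth_offset > OFFSET_DEG:
        return "arm_move_right"
    return None

def classify_trick_gesture(a3_samples):
    """a3_samples: the m third accelerations collected n ms after the third
    acceleration first exceeded the preset first threshold (e.g., 10 m/s²)."""
    total = sum(a3_samples)
    if total > SECOND_THRESHOLD:
        return "hand_retract_up"
    if total < THIRD_THRESHOLD:
        return "hand_swing_down"
    return None
```

Any reading that satisfies none of the conditions yields `None`, matching the rule that an illegal or invalid gesture leaves the target robot still.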
It should be noted that if the gesture to be executed is determined to be illegal, or is determined to be invalid, it will not affect the target robot; that is, the target robot will remain still.
In step 104, if the gesture to be executed is valid, the target robot is controlled to execute the action corresponding to the gesture to be executed.
In the embodiment of the application, on the premise that the gesture to be executed is legal, once the gesture is confirmed to be valid, the target robot can be controlled according to the gesture to be executed, so that the target robot executes the corresponding action.
In some embodiments, gestures fall into two gesture types: basic gestures and trick gestures. Basic gestures are universal, in that almost all robots can be controlled by them, whereas a trick gesture can only control certain special robots. To improve control efficiency, the following operations can therefore be carried out for the gesture to be executed. After the gesture to be executed is determined to be valid, its gesture type is acquired. If the gesture to be executed is a basic gesture, then, given the universality of such gestures, the target robot can be directly controlled to execute the moving action corresponding to the gesture. If the gesture to be executed is a trick gesture, it is detected whether the target robot belongs to a preset type, that is, whether the target robot supports trick gestures; only when the target robot belongs to the preset type is it controlled to execute the preset action corresponding to the gesture to be executed.
For example, assume that the target robot is a tracked robot, which does not support trick gestures; that is, the target robot can perform only forward, backward, left-turn, and right-turn motions. When the gesture to be executed sent by the user through the smart watch is the arm-down gesture, it is determined to be a basic gesture whose corresponding action is moving forward, so the target robot can be controlled to move forward. When the gesture to be executed sent by the user through the smart watch is the gesture of retracting the hand upwards, it is determined to be a trick gesture; since the tracked robot does not belong to the preset type and does not support trick gestures, the target robot does not respond and remains still.
As a further example, assume that the target robot is a humanoid robot, which supports trick gestures; that is, besides the forward, backward, left-turn, and right-turn actions, the target robot supports various special preset actions. By way of example only, the preset action corresponding to the gesture of retracting the hand upwards is a dancing action, the preset action corresponding to the gesture of swinging the hand downwards is a martial-arts action, the preset action corresponding to moving the arm leftwards is a leftward jump, and the preset action corresponding to moving the arm rightwards is a rightward jump. When the gesture to be executed sent by the user through the smart watch is the arm-down gesture, it is determined to be a basic gesture whose corresponding action is moving forward, so the target robot can be controlled to move forward. When the gesture to be executed sent by the user through the smart watch is the gesture of retracting the hand upwards, it is determined to be a trick gesture; since the target robot is a humanoid robot, which belongs to the preset type and supports trick gestures, the target robot responds to the gesture, specifically by performing the dancing action.
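The basic/trick dispatch in the two examples above might look like the following sketch. Only the arm-down-to-forward mapping and the four trick mappings are stated in the text; the remaining basic-gesture mappings and the robot type names ("tracked", "humanoid") are assumptions made for illustration.

```python
BASIC_GESTURES = {
    "arm_down": "forward",              # stated in the text
    "arm_up": "backward",               # assumed mapping
    "arm_left_rotation": "turn_left",   # assumed mapping
    "arm_right_rotation": "turn_right", # assumed mapping
}
TRICK_GESTURES = {
    "hand_retract_up": "dance",
    "hand_swing_down": "martial_arts",
    "arm_move_left": "jump_left",
    "arm_move_right": "jump_right",
}

def action_for(gesture, robot_type, trick_capable_types=("humanoid",)):
    """Return the action the target robot should perform; None means the
    robot stays still (trick gesture on a non-preset-type robot, or an
    unknown gesture)."""
    if gesture in BASIC_GESTURES:
        return BASIC_GESTURES[gesture]       # universal across robot types
    if gesture in TRICK_GESTURES and robot_type in trick_capable_types:
        return TRICK_GESTURES[gesture]       # preset-type robots only
    return None
```

A tracked robot therefore gets `None` for any trick gesture and simply keeps still, while a humanoid robot resolves the same gesture to its preset special action.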
In some embodiments, only the gesture recognition operation may be performed on the smart watch side. Specifically, after the gesture to be executed is determined to be legal and valid, the gesture name (or other corresponding gesture data) of the gesture to be executed is sent to the target robot; after receiving the gesture name sent by the smart watch, the target robot generates a control instruction corresponding to the gesture name and sends the control instruction to its lower computer, so as to control the motor, the steering engine and/or other components of the target robot to execute the action corresponding to the control instruction.
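A minimal sketch of this split: the watch ships only the gesture name, and the robot side resolves it into an instruction for the lower computer. The payload shape, names, and instruction strings are hypothetical stand-ins for the real protocol.

```python
# Hypothetical gesture-name-to-instruction map held on the robot side.
GESTURE_TO_INSTRUCTION = {
    "arm_down": "MOVE_FORWARD",
    "hand_retract_up": "DANCE",
}

def watch_send(gesture_name):
    """Smart-watch side: ship only the recognized gesture name."""
    return {"gesture": gesture_name}

def robot_receive(payload):
    """Robot side: map the gesture name to a control instruction for the
    lower computer (motor/steering-engine command); None if unknown."""
    return GESTURE_TO_INSTRUCTION.get(payload.get("gesture"))
```

The design choice here is that the watch stays robot-agnostic: only the robot needs to know how a gesture name translates into concrete motor commands.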
In some embodiments, both the gesture recognition operation and the instruction parsing operation may be performed directly on the smart watch side. Specifically, after the gesture to be executed is determined to be legal and valid, a corresponding control instruction is generated based on the gesture to be executed and sent to the target robot; the target robot transmits the received control instruction to its lower computer, so as to control the motor, the steering engine and/or other components of the target robot to execute the action corresponding to the control instruction. In this case, a configuration table may be stored in the smart watch in advance, in which the gestures supported by each type of robot and the control instruction corresponding to each gesture when controlling different robots are configured. By way of example only, the following table gives one example of this configuration table:
[Table: the gestures supported by each robot type and the control instruction corresponding to each gesture; reproduced as an image in the original publication]
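Since the configuration table is only available as an image, here is one possible shape for it, following the description above: per robot type, the supported gestures and the control instruction each maps to. The robot type names, gesture names, and instruction byte values are invented for illustration.

```python
# Hypothetical pre-stored configuration table on the smart watch.
CONFIG_TABLE = {
    "tracked": {
        "arm_down": b"\x01",            # forward
        "arm_up": b"\x02",              # backward
        "arm_left_rotation": b"\x03",   # turn left
        "arm_right_rotation": b"\x04",  # turn right
    },
    "humanoid": {
        "arm_down": b"\x01",
        "arm_up": b"\x02",
        "arm_left_rotation": b"\x03",
        "arm_right_rotation": b"\x04",
        "hand_retract_up": b"\x10",     # dance
        "hand_swing_down": b"\x11",     # martial arts
    },
}

def build_instruction(robot_type, gesture):
    """Look up the control instruction to send to the target robot; None
    means the gesture is unsupported by this robot type and is ignored."""
    return CONFIG_TABLE.get(robot_type, {}).get(gesture)
```

With this table, the watch alone can refuse a trick gesture for a robot type that does not support it, before anything is transmitted.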
In some embodiments, it is considered that the target robot may move after performing the action corresponding to the gesture to be executed, and that the user may also walk while operating the target robot. To prevent the movement of the target robot and/or the walking of the user from breaking the communication connection between the target robot and the smart watch, the distance between the two can be acquired in real time; once the distance reaches a preset maximum distance threshold, the target robot is controlled to stop moving. The maximum distance threshold is set according to the way the communication connection between the target robot and the smart watch is established; for example, if the smart watch establishes the communication connection with the target robot through Bluetooth, the maximum distance threshold may be set based on the maximum communicable distance of Bluetooth. Further, after the target robot stops moving, the user can input a robot return instruction on the smart watch; the smart watch forwards the robot return instruction to the target robot, so as to control the target robot to move toward the smart watch until the distance between them falls to a preset minimum distance threshold (that is, until the target robot returns to the user's current position).
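The stop-and-return flow can be sketched as a small state machine. The class name, state names, and polling interface are illustrative; as described above, the maximum threshold would come from the link type (e.g., the Bluetooth communicable range).

```python
class DistanceGuard:
    """Tracks the robot's real-time distance from the watch and decides
    when to stop it and when a commanded return has completed."""

    def __init__(self, max_dist, min_dist):
        self.max_dist = max_dist  # e.g., derived from Bluetooth range
        self.min_dist = min_dist  # "back at the user" distance
        self.state = "moving"

    def update(self, distance):
        """Feed one real-time distance sample; returns the current state."""
        if self.state == "moving" and distance >= self.max_dist:
            self.state = "stopped"      # halt before the connection drops
        elif self.state == "returning" and distance <= self.min_dist:
            self.state = "arrived"      # back at the user's position
        return self.state

    def request_return(self):
        """User entered the robot return instruction on the watch."""
        if self.state == "stopped":
            self.state = "returning"
```

One polling loop on the watch would call `update()` with each new distance reading and forward `request_return()` when the user taps the return command.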
As can be seen from the above, in the embodiment of the application, a sensor carried in a wearable device is used to detect whether a user sends, through the wearable device, a gesture to be executed for controlling a robot. If so, the legality of the gesture to be executed is judged first and the gesture is recognized; on the premise that the gesture is legal, its validity is judged, which avoids misoperation by the user. Finally, when the gesture to be executed is valid, a target robot in communication connection with the wearable device is controlled to execute the action corresponding to the gesture. The user can thus control the robot through the wearable device, which simplifies the control flow and improves the maneuverability and flexibility of the robot. Furthermore, the application provides both universal (basic) gestures and trick gestures: through the universal gestures, the user can operate different robots with the same gestures according to the same gesture specification, reducing the learning cost; through the trick gestures, some special robots can be controlled to display special actions, enhancing interest.
Corresponding to the robot control method proposed above, an embodiment of the present application provides a robot control device integrated in a wearable device. Referring to fig. 4, the robot control device 400 in the embodiment of the present application includes:
a detection unit 401, configured to detect whether a user sends a gesture to be executed through the wearable device according to a sensor mounted in the wearable device;
a first determining unit 402, configured to determine whether the gesture to be performed is legal according to a preset gesture database when it is determined that the user sends the gesture to be performed through the wearable device, where at least one preset gesture data is stored in the gesture database;
a second determining unit 403, configured to determine whether the gesture to be executed is valid according to a preset gesture condition if the gesture to be executed is legal;
a control unit 404, configured to control a target robot to execute an action corresponding to the gesture to be executed if the gesture to be executed is valid, where the target robot is a robot that establishes a communication connection with the wearable device.
Optionally, the control unit 404 includes:
the gesture type obtaining subunit is configured to obtain a gesture type of the gesture to be executed if the gesture to be executed is valid;
and the action control subunit is used for controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed.
Optionally, the gesture types include a basic gesture, and the basic gesture is used for controlling the robot to execute a moving action; the action control subunit includes:
the first control subunit is used for controlling the target robot to execute the moving action corresponding to the gesture to be executed if the gesture to be executed belongs to the basic gesture;
Optionally, the gesture types include a trick gesture, the trick gesture is used to control a preset type of robot to execute a preset action, and there are differences between the preset actions that robots of different preset types can execute; the action control subunit includes:
a robot type detecting unit, configured to detect whether the target robot belongs to the preset type if the gesture to be executed belongs to a trick gesture;
and the second control subunit is used for controlling the target robot to execute the preset action corresponding to the gesture to be executed if the target robot belongs to the preset type.
Optionally, the detecting unit 401 is specifically configured to detect whether a user sends a gesture to be executed through the wearable device according to an accelerometer sensor, a direction sensor, or a linear accelerometer sensor mounted on the wearable device.
Optionally, the first determining unit 402 includes:
a database gesture detection subunit, configured to detect whether the gesture to be executed exists in the gesture database;
and the gesture legality determining subunit is used for determining that the gesture to be executed is legal if the gesture to be executed exists in the gesture database.
Optionally, the robot control device 400 further includes:
a unit configured to control the target robot to remain still if the gesture to be executed is illegal or invalid.
Optionally, the robot control device 400 further includes:
a distance acquisition unit, configured to acquire a distance between the target robot and the wearable device in real time;
The control unit 404 is further configured to control the target robot to stop moving if the distance reaches a preset maximum distance threshold, where the maximum distance threshold is set according to the type of communication connection established between the target robot and the wearable device.
Optionally, the control unit 404 is further configured to, after the target robot is controlled to stop moving, if a robot return instruction input by the user is received, control the target robot to move to the position of the wearable device until the distance reaches a preset minimum distance threshold.
As can be seen from the above, in the embodiment of the application, a sensor carried in a wearable device is used to detect whether a user sends, through the wearable device, a gesture to be executed for controlling a robot. If so, the legality of the gesture to be executed is judged first and the gesture is recognized; on the premise that the gesture is legal, its validity is judged, which avoids misoperation by the user. Finally, when the gesture to be executed is valid, a target robot in communication connection with the wearable device is controlled to execute the action corresponding to the gesture. The user can thus control the robot through the wearable device, which simplifies the control flow and improves the maneuverability and flexibility of the robot. Furthermore, the application provides both universal (basic) gestures and trick gestures: through the universal gestures, the user can operate different robots with the same gestures according to the same gesture specification, reducing the learning cost; through the trick gestures, some special robots can be controlled to display special actions, enhancing interest.
An embodiment of the present application further provides a wearable device. Referring to fig. 5, the wearable device 5 in the embodiment of the present application includes: a memory 501, one or more processors 502 (only one is shown in fig. 5), and a computer program stored in the memory 501 and executable on the processors. The memory 501 is used for storing software programs and units, and the processor 502 executes various functional applications and data processing by running the software programs and units stored in the memory 501. Specifically, the processor 502 implements the following steps by running the computer program stored in the memory 501:
detecting whether a user sends out a gesture to be executed through the wearable equipment or not according to a sensor carried in the wearable equipment;
when the user is determined to send the gesture to be executed through the wearable device, judging whether the gesture to be executed is legal or not according to a preset gesture database, wherein at least one preset gesture data is stored in the gesture database;
if the gesture to be executed is legal, judging whether the gesture to be executed is effective or not according to a preset gesture condition;
and if the gesture to be executed is effective, controlling a target robot to execute the action corresponding to the gesture to be executed, wherein the target robot is a robot which establishes communication connection with the wearable device.
In a second possible embodiment based on the first possible embodiment, the controlling the target robot to execute the action corresponding to the gesture to be executed if the gesture to be executed is valid includes:
if the gesture to be executed is effective, acquiring the gesture type of the gesture to be executed;
and controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed.
In a third possible implementation manner provided as a basis for the second possible implementation manner, the gesture types include a basic gesture, and the basic gesture is used for controlling the robot to execute a moving action;
correspondingly, the controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed includes:
and if the gesture to be executed belongs to the basic gesture, controlling the target robot to execute the moving action corresponding to the gesture to be executed.
In a fourth possible implementation manner provided on the basis of the second possible implementation manner, the gesture types include a trick gesture, the trick gesture is used to control a robot of a preset type to execute a preset action, and there is a difference between preset actions that can be executed by robots of different preset types;
correspondingly, the controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed includes:
if the gesture to be executed belongs to a trick gesture, detecting whether the target robot belongs to the preset type;
and if the target robot belongs to the preset type, controlling the target robot to execute a preset action corresponding to the gesture to be executed.
In a fifth possible embodiment based on the first possible embodiment, the detecting whether the user issues a gesture to be executed by the wearable device according to the sensor mounted in the wearable device includes:
according to an accelerometer sensor, a direction sensor or a linear accelerometer sensor carried by the wearable device, whether a user sends a gesture to be executed through the wearable device is detected.
In a sixth possible implementation manner provided on the basis of the first possible implementation manner, the determining whether the gesture to be executed is legal according to a preset gesture database includes:
detecting whether the gesture to be executed exists in the gesture database;
and if the gesture to be executed exists in the gesture database, determining that the gesture to be executed is legal.
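In this implementation, the legality check reduces to membership in the pre-stored gesture database. A minimal sketch, with illustrative gesture names:

```python
# Hypothetical contents of the pre-stored gesture database.
GESTURE_DATABASE = {
    "arm_down", "arm_up",
    "arm_left_rotation", "arm_right_rotation",
    "arm_move_left", "arm_move_right",
    "hand_retract_up", "hand_swing_down",
}

def is_legal(gesture_name):
    """A gesture absent from the database is rejected before any
    validity (gesture condition) check is performed."""
    return gesture_name in GESTURE_DATABASE
```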
In a seventh possible implementation manner provided on the basis of the first possible implementation manner, the second possible implementation manner, the third possible implementation manner, the fourth possible implementation manner, the fifth possible implementation manner, or the sixth possible implementation manner, the processor 502 implements the following steps by running the computer program stored in the memory 501:
and if the gesture to be executed is illegal or invalid, controlling the target robot to keep still.
In an eighth possible implementation manner provided on the basis of the first possible implementation manner, the second possible implementation manner, the third possible implementation manner, the fourth possible implementation manner, the fifth possible implementation manner, or the sixth possible implementation manner, the processor 502 implements the following steps by running the computer program stored in the memory 501:
acquiring the distance between the target robot and the wearable equipment in real time;
and controlling the target robot to stop moving if the distance reaches a preset maximum distance threshold, wherein the maximum distance threshold is set according to the type of communication connection established between the target robot and the wearable device.
In a ninth possible implementation manner provided on the basis of the eighth possible implementation manner, after the controlling the target robot to stop moving, the processor 502 further implements the following steps when executing the computer program stored in the memory 501:
and if the robot return instruction input by the user is received, controlling the target robot to move to the position of the wearable equipment until the distance reaches a preset minimum distance threshold value.
It should be understood that, in the embodiments of the present application, the processor 502 may be a central processing unit (CPU), and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
Memory 501 may include both read-only memory and random access memory and provides instructions and data to processor 502. Some or all of the memory 501 may also include non-volatile random access memory. For example, the memory 501 may also store device class information.
As can be seen from the above, in the embodiment of the application, a sensor carried in a wearable device is used to detect whether a user sends, through the wearable device, a gesture to be executed for controlling a robot. If so, the legality of the gesture to be executed is judged first and the gesture is recognized; on the premise that the gesture is legal, its validity is judged, which avoids misoperation by the user. Finally, when the gesture to be executed is valid, a target robot in communication connection with the wearable device is controlled to execute the action corresponding to the gesture. The user can thus control the robot through the wearable device, which simplifies the control flow and improves the maneuverability and flexibility of the robot. Furthermore, the application provides both universal (basic) gestures and trick gestures: through the universal gestures, the user can operate different robots with the same gestures according to the same gesture specification, reducing the learning cost; through the trick gestures, some special robots can be controlled to display special actions, enhancing interest.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art would appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented by electronic hardware or by combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementations should not be considered as going beyond the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described modules or units is only one logical functional division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such an understanding, all or part of the flow of the methods in the embodiments described above may be implemented by a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the method embodiments described above. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable storage medium may include: any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer-readable memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable storage medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (12)

1. A robot control method, applied to a wearable device, comprising:
detecting whether a user sends out a gesture to be executed through the wearable equipment or not according to a sensor carried in the wearable equipment;
when the user is determined to send the gesture to be executed through the wearable device, judging whether the gesture to be executed is legal or not according to a preset gesture database, wherein at least one preset gesture data is stored in the gesture database;
if the gesture to be executed is legal, judging whether the gesture to be executed is effective or not according to a preset gesture condition;
and if the gesture to be executed is effective, controlling a target robot to execute the action corresponding to the gesture to be executed, wherein the target robot is a robot which establishes communication connection with the wearable equipment.
2. The robot control method according to claim 1, wherein if the gesture to be executed is valid, controlling the target robot to execute the action corresponding to the gesture to be executed comprises:
if the gesture to be executed is effective, acquiring the gesture type of the gesture to be executed;
and controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed.
3. The robot control method according to claim 2, wherein the gesture types comprise a basic gesture used for controlling the robot to perform a movement action;
correspondingly, the controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed comprises:
if the gesture to be executed belongs to the basic gesture, controlling the target robot to execute the movement action corresponding to the gesture to be executed.
4. The robot control method according to claim 2, wherein the gesture types comprise a trick gesture used for controlling a robot of a preset type to perform a preset action, and the preset actions that robots of different preset types can perform differ;
correspondingly, the controlling the target robot to execute the action corresponding to the gesture to be executed according to the gesture type of the gesture to be executed comprises:
if the gesture to be executed belongs to the trick gesture, detecting whether the target robot belongs to the preset type; and
if the target robot belongs to the preset type, controlling the target robot to execute the preset action corresponding to the gesture to be executed.
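Claims 3 and 4 distinguish basic gestures (movement, any robot) from trick gestures (only robots of a preset type). A minimal dispatch sketch, assuming illustrative type labels such as "base", "trick", and "humanoid" that the patent itself does not name:

```python
def dispatch(gesture_type, gesture, robot_type, supported_types):
    """Dispatch by gesture type as in claims 3-4 (illustrative names).

    Basic gestures map to a movement action for any robot; trick
    gestures run only on robots whose preset type supports them.
    """
    if gesture_type == "base":
        return ("move", gesture)
    if gesture_type == "trick":
        if robot_type in supported_types:   # preset-type check (claim 4)
            return ("trick", gesture)
        return ("ignore", gesture)          # robot cannot perform this trick
    raise ValueError(f"unknown gesture type: {gesture_type}")
```

For example, a "dance" trick gesture would be ignored by a wheeled robot if only humanoid robots are configured as the preset type.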
5. The robot control method according to claim 1, wherein the detecting, according to a sensor mounted in the wearable device, whether a user makes a gesture to be executed through the wearable device comprises:
detecting, according to an accelerometer sensor, an orientation sensor, or a linear accelerometer sensor mounted in the wearable device, whether the user makes a gesture to be executed through the wearable device.
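As a rough illustration of claim 5's sensor-based detection: a wrist gesture typically shows up as a spike in accelerometer magnitude. The function name and the threshold value (in m/s²) below are assumptions for the sketch, not values from the patent.

```python
def detect_shake(acc_samples, threshold=15.0):
    """Flag a candidate gesture from raw accelerometer readings.

    acc_samples: sequence of (x, y, z) accelerations in m/s^2
    threshold:   assumed magnitude above which a gesture is suspected
                 (gravity alone contributes about 9.8 m/s^2)
    """
    peak = max((x * x + y * y + z * z) ** 0.5 for x, y, z in acc_samples)
    return peak >= threshold
```

A real implementation would feed the flagged window of samples into gesture matching rather than acting on the peak alone.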
6. The robot control method according to claim 1, wherein the judging whether the gesture to be executed is legal according to a preset gesture database comprises:
detecting whether gesture data corresponding to the gesture to be executed exists in the gesture database; and
if the gesture data corresponding to the gesture to be executed exists in the gesture database, determining that the gesture to be executed is legal.
7. The robot control method according to any one of claims 1 to 6, further comprising:
if the gesture to be executed is illegal, or if the gesture to be executed is invalid, controlling the target robot to remain stationary.
8. The robot control method according to any one of claims 1 to 6, further comprising:
acquiring a distance between the target robot and the wearable device in real time; and
if the distance reaches a preset maximum distance threshold, controlling the target robot to stop moving, wherein the preset maximum distance threshold is set according to the manner in which the communication connection between the target robot and the wearable device is established.
9. The robot control method according to claim 8, wherein after the controlling the target robot to stop moving, the robot control method further comprises:
if a robot return instruction input by the user is received, controlling the target robot to move toward the position of the wearable device until the distance reaches a preset minimum distance threshold.
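The distance supervision of claims 8 and 9 reduces to a small decision rule: stop at the maximum range, and while a return instruction is active, come back until the minimum distance is reached. The function and threshold names below are illustrative, not from the patent.

```python
def supervise_distance(distance, max_dist, min_dist, returning):
    """Decide the robot's next command from its current distance.

    distance:  current distance between robot and wearable device
    max_dist:  preset maximum distance threshold (claim 8)
    min_dist:  preset minimum distance threshold (claim 9)
    returning: True while a user return instruction is active
    """
    if returning:
        # Claim 9: move toward the wearer until the minimum distance.
        return "arrived" if distance <= min_dist else "return_to_wearer"
    # Claim 8: stop once the maximum range is reached.
    return "stop" if distance >= max_dist else "continue"
```

In practice the distance might come from Bluetooth RSSI or a similar range estimate, which is why claim 8 ties the threshold to the connection type.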
10. A robot control device, applied to a wearable device, comprising:
a detection unit, configured to detect, according to a sensor mounted in the wearable device, whether a user makes a gesture to be executed through the wearable device;
a first judging unit, configured to judge, when it is determined that the user makes the gesture to be executed through the wearable device, whether the gesture to be executed is legal according to a preset gesture database, wherein at least one piece of preset gesture data is stored in the gesture database;
a second judging unit, configured to judge, if the gesture to be executed is legal, whether the gesture to be executed is valid according to a preset gesture condition; and
a control unit, configured to control, if the gesture to be executed is valid, a target robot to execute an action corresponding to the gesture to be executed, wherein the target robot is a robot that has established a communication connection with the wearable device.
11. A wearable device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method according to any one of claims 1 to 9.
12. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 9.
CN202011348892.6A 2020-11-26 2020-11-26 Robot control method, robot control device and wearable equipment Pending CN112518747A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011348892.6A CN112518747A (en) 2020-11-26 2020-11-26 Robot control method, robot control device and wearable equipment

Publications (1)

Publication Number Publication Date
CN112518747A (en) 2021-03-19

Family

ID=74993867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011348892.6A Pending CN112518747A (en) 2020-11-26 2020-11-26 Robot control method, robot control device and wearable equipment

Country Status (1)

Country Link
CN (1) CN112518747A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140371906A1 (en) * 2013-06-13 2014-12-18 GM Global Technology Operations LLC Method and Apparatus for Controlling a Robotic Device via Wearable Sensors
CN106547204A (en) * 2016-10-18 2017-03-29 上海斐讯数据通信技术有限公司 A kind of method of intelligent watch and its bright aobvious screen
CN108466263A (en) * 2018-01-29 2018-08-31 青岛真时科技有限公司 A kind of robot control method and device
KR20190027726A (en) * 2017-09-07 2019-03-15 한양대학교 산학협력단 Terminal control method usign gesture
CN110861110A (en) * 2019-11-08 2020-03-06 珠海市一微半导体有限公司 Control method of walking robot, walking robot and chip
CN111203874A (en) * 2019-12-26 2020-05-29 深圳市优必选科技股份有限公司 Robot control method, device, electronic device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Hangzhou Association for Science and Technology et al.: "Collection of Outstanding Works of the 31st Hangzhou Youth Science and Technology Innovation Competition", 30 November 2017, Zhejiang Gongshang University Press, pages 20-22 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112691002A (en) * 2021-03-24 2021-04-23 上海傅利叶智能科技有限公司 Control method and device based on gesture interaction rehabilitation robot and rehabilitation robot
CN112691002B (en) * 2021-03-24 2021-06-29 上海傅利叶智能科技有限公司 Control device based on gesture interaction rehabilitation robot and rehabilitation robot
CN117523679A (en) * 2024-01-08 2024-02-06 成都运达科技股份有限公司 Driver gesture recognition method, system and storage medium

Similar Documents

Publication Publication Date Title
KR101729721B1 (en) Portable electronic device and method for controlling operation thereof based on user motion
CN112518747A (en) Robot control method, robot control device and wearable equipment
CN106339070B (en) Display control method and mobile terminal
US20120154275A1 (en) User controlled device for sending control signals to an electric appliance, in particular user controlled pointing device such as mouse or joystick, with 3d-motion detection
EP1586978A2 (en) Portable device with action shortcut function
US11422609B2 (en) Electronic device and method for controlling operation of display in same
US9753539B2 (en) Method, device, system and non-transitory computer-readable recording medium for providing user interface
WO2018099043A1 (en) Terminal behavior triggering method and terminal
US10725550B2 (en) Methods and apparatus for recognition of a plurality of gestures using roll pitch yaw data
CN104898827A (en) Somatosensory interaction method applying somatosensory interaction system
US20230325006A1 (en) Wearable device and method for detecting motion gesture of wearable device
US9686638B2 (en) Input device having Bluetooth module and operation method therefor
CN113031840A (en) False triggering prevention method and device for wrist-worn device, electronic device and storage medium
CN103200304A (en) System and method for controlling mobile terminal intelligent cursor
Gouthaman et al. Gesture detection system using smart watch based motion sensors
CN112783318A (en) Human-computer interaction system and human-computer interaction method
US20220253198A1 (en) Image processing device, image processing method, and recording medium
JP7155242B2 (en) Personal digital assistant
CN113051538B (en) Information unlocking method and electronic equipment
CN210142314U (en) Intelligent control device
CN108322600B (en) Network access method and mobile terminal
US20180253213A1 (en) Intelligent Interaction Method, Device, and System
EP4217828A1 (en) Guiding fingerprint sensing via user feedback
CN115079684A (en) Feedback method of robot and robot
WO2015084082A1 (en) Method, device and system for recognizing posture or motion, and non-temporary computer-readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210319