CN115816456A - System and method for controlling a dexterous hand of a humanoid robot based on Raspberry Pi - Google Patents


Info

Publication number: CN115816456A
Authority: CN (China)
Prior art keywords: layer, actuator, data, action, module
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202211583996.4A
Other languages: Chinese (zh)
Inventors: Wang Lei (王磊), Li Long (李龙)
Current Assignee: Shanghai Qingyun Robot Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Shanghai Qingyun Robot Co., Ltd.
Application filed by Shanghai Qingyun Robot Co., Ltd.
Priority to CN202211583996.4A
Publication of CN115816456A

Classifications

    • Y02P 90/02: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]


Abstract

The invention discloses a Raspberry Pi-based humanoid robot dexterous hand control system and method, wherein the method comprises the following steps. Step 1: the acquisition layer (1) collects human arm electromyographic information, processes it, and transmits the processed information to the transmission layer (2). Step 2: the transmission layer performs data processing on the electromyographic information and transmits it to the control layer (3). Step 3: the control layer analyzes the electromyographic information, selects a matched action model, and transmits the action model to the execution layer (4). Step 4: the execution layer outputs the action result corresponding to the action model to the physical layer (5) and outputs data feedback information to the control layer. Step 5: the physical layer decomposes and executes the action result, while feeding palm physical data back to the execution layer. The invention develops a humanoid robot dexterous hand control system based on the Raspberry Pi and realizes overall control of the humanoid robot's palm.

Description

System and method for controlling a dexterous hand of a humanoid robot based on Raspberry Pi
Technical Field
The invention relates to a robot control system and method, and in particular to a Raspberry Pi-based humanoid robot dexterous hand control system and method.
Background
A humanoid robot is designed and manufactured to imitate the shape and behavior of a human. It has a human-like appearance, can adapt to human living and working environments, can replace humans in completing various operations, and can extend human capabilities in many respects; it is widely applied in fields such as service, medical treatment, education and entertainment. The development of humanoid robots remains limited, because humans do not yet thoroughly understand themselves and existing robotics and artificial-intelligence techniques are still immature.
The Raspberry Pi is a credit-card-sized microcomputer originally designed for teaching computer programming. It is small in size yet provides the basic functions of a full PC. A robot control system designed around the Raspberry Pi can therefore make the robot correspondingly small, light and agile in action. However, no Raspberry Pi-based control system for a humanoid physical robot is currently available on the market, and building the control-system architecture of a complete robot is difficult. It is therefore necessary to provide a dexterous hand control system and method for a humanoid physical robot developed on the Raspberry Pi, so as to realize overall control of the humanoid robot's palm.
Disclosure of Invention
The invention aims to provide a Raspberry Pi-based humanoid robot dexterous hand control system and control method, in which the dexterous hand control system is developed on the Raspberry Pi to realize overall control of the humanoid robot's palm.
The invention is realized by the following steps:
a humanoid robot dexterous hand control system based on a raspberry pie is characterized in that a humanoid robot dexterous hand control system framework is built based on the raspberry pie and comprises an acquisition layer, a transmission layer, a control layer, an execution layer and a physical layer; the output of collection layer is connected with the input of transmission layer, and the output of transmission layer is connected with the input of control layer, and the output of control layer is connected with the input of execution layer, and the feedback end of execution layer is connected with the input of control layer, and the output of execution layer is connected with the input of physical layer, and the feedback end of physical layer is connected with the input of execution layer.
A control method for the Raspberry Pi-based humanoid robot dexterous hand control system comprises the following steps:
Step 1: the acquisition layer collects human arm electromyographic information and, after data processing, transmits it to the transmission layer;
Step 2: the transmission layer receives the human arm electromyographic information transmitted by the acquisition layer, processes it, and transmits the processed information to the control layer;
Step 3: the control layer receives the human arm electromyographic information transmitted by the transmission layer, analyzes it, selects a matched action model, and transmits the action model to the execution layer;
Step 4: the execution layer receives the action model transmitted by the control layer, outputs the action result corresponding to the action model to the physical layer, and outputs data feedback information to the control layer;
Step 5: the physical layer receives the action result transmitted by the execution layer, decomposes and executes it, and feeds palm physical data back to the execution layer.
The acquisition layer comprises a human arm electromyographic information acquisition module and a data filtering module, and step 1 comprises the following sub-steps:
Step 1.1: the acquisition layer monitors for human arm electromyographic information; if such information is detected, step 1.2 is executed, otherwise monitoring continues;
Step 1.2: the human arm electromyographic information acquisition module acquires raw human arm electromyographic data and outputs it to the data filtering module;
Step 1.3: the data filtering module filters the raw electromyographic data and removes abnormal data;
Step 1.4: the data filtering module outputs the filtered raw electromyographic data to the transmission layer.
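The patent gives no code; a minimal Python sketch of sub-steps 1.1-1.4 could look as follows. The names `source`, `transport` and `is_valid` are illustrative assumptions (queue-like buffers and the abnormal-data predicate of the data filtering module), not identifiers from the original.

```python
import queue

def acquisition_step(source, is_valid, transport):
    """One pass of the acquisition layer (sub-steps 1.1-1.4); all names
    here are assumptions for illustration only."""
    if source.empty():                       # step 1.1: nothing monitored yet
        return False                         # keep monitoring
    raw = source.get()                       # step 1.2: raw EMG samples
    clean = [s for s in raw if is_valid(s)]  # step 1.3: drop abnormal data
    transport.put(clean)                     # step 1.4: hand off to transport
    return True

src, dst = queue.Queue(), queue.Queue()
src.put([0.2, 2.4, 0.9, -0.1])
handled = acquisition_step(src, lambda s: 0.0 <= s <= 1.5, dst)
```

In practice the loop would run continuously, which matches the "continue monitoring" branch of step 1.1.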
The transmission layer comprises a data formatting module, a protocol encapsulation module and a bus transmission module; step 2 comprises the following sub-steps:
Step 2.1: the transmission layer monitors for the electromyographic information processed by the acquisition layer; if such information is detected, step 2.2 is executed, otherwise monitoring continues;
Step 2.2: the data formatting module performs data formatting on the processed electromyographic information;
Step 2.3: the protocol encapsulation module performs protocol encapsulation on the formatted electromyographic information;
Step 2.4: the bus transmission module transmits the protocol-encapsulated electromyographic information to the control layer.
The electromyographic information protocol data structure of the data formatting module is M = [id1, ti, k], where M is the protocol data structure, id1 is the serial number of the electromyographic sensor, ti is the collection time point of the arm electromyographic information, and k is the reading of the electromyographic sensor.
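The record M = [id1, ti, k] can be sketched as a small Python structure; the class and field names below are assumptions for illustration, only the field order follows the description.

```python
from dataclasses import dataclass

@dataclass
class EmgSample:
    sensor_id: int  # id1: number of the EMG sensor (one of the six electrodes)
    t_ms: int       # ti: collection time point of the arm EMG information
    value: int      # k: numerical value read from the EMG sensor

    def to_record(self):
        """Return the record in the M = [id1, ti, k] list form."""
        return [self.sensor_id, self.t_ms, self.value]

m = EmgSample(sensor_id=2, t_ms=1500, value=731).to_record()
```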
The control layer comprises an electromyographic data receiving module, an electromyographic data analysis module, an electromyographic data feedback analysis module, an action algorithm calculation module and an action library training module; step 3 comprises the following sub-steps:
Step 3.1: the control layer monitors for the electromyographic information processed by the transmission layer; if such information is detected, step 3.2 is executed, otherwise monitoring continues;
Step 3.2: the electromyographic data receiving module receives the electromyographic information processed by the transmission layer, performs data-protocol parsing on it, and transmits the parsed information to the electromyographic data analysis module;
Step 3.3: the electromyographic data analysis module performs high-pass and low-pass filtering on the electromyographic information to obtain arm electromyographic data, and transmits the data to the action algorithm calculation module;
Step 3.4: a bioelectric reflection model built from a human biological electromyographic model is embedded in the action algorithm calculation module; the arm electromyographic data is input into this model, the model outputs action data, and the action algorithm calculation module transmits the action data to the action library training module;
the electromyographic data analysis module also receives, through the data feedback analysis module, the feedback information output by the execution layer, and performs high-order filtering and data screening on the electromyographic information combined with the feedback information to obtain the arm electromyographic data;
Step 3.5: the action library training module stores pre-trained action models, selects the action model matching the action data, and transmits it to the execution layer.
The execution layer comprises an action execution module and a data feedback module, and step 4 comprises the following sub-steps:
Step 4.1: the execution layer monitors for the action model transmitted by the control layer; if the action model is detected, step 4.2 is executed, otherwise monitoring continues;
Step 4.2: the action execution module receives the action model transmitted by the control layer, resolves it into an action result, and outputs the action result to the physical layer;
Step 4.3: the data feedback module outputs the feedback information to the electromyographic data analysis module of the control layer.
The meta-motion protocol data structure of the action result is D = [id, ti, d], where D is the meta-motion protocol data structure, id is the actuator number in the physical layer, ti is the time point at which the actuator reaches the current angle, and d is the current angle value of the actuator.
The physical layer comprises a wrist actuator, a thumb actuator, an index finger actuator, a middle finger actuator, a ring finger actuator and a little finger actuator. The wrist actuator comprises a first bending actuator, a first inner-outer actuator and a first left-right actuator; the thumb actuator comprises a second bending actuator and a second inner-outer actuator; the index finger actuator comprises a third bending actuator and a second left-right actuator; the middle finger actuator is a fourth bending actuator, the ring finger actuator is a fifth bending actuator, and the little finger actuator is a sixth bending actuator.
The wrist actuator, thumb actuator, index finger actuator, middle finger actuator, ring finger actuator and little finger actuator have the following meta-actions, respectively: wrist meta-action E1, thumb meta-action E2, index finger meta-action E3, middle finger meta-action E4, ring finger meta-action E5 and little finger meta-action E6. The palm motion data structure is E = [E1, E2, E3, E4, E5, E6], where E is the palm motion.
the method comprises the steps that palm motions are realized through execution of an individual actuator and execution of a combined actuator, when the individual actuator executes, a meta-motion execution data structure of a single actuator is B = [ id, ti, d, v, a ], wherein B is the meta-motion execution data structure of the single actuator, id is an actuator number, ti is a termination time point of the actuator running to a current angle, d is a current angle of the actuator, v is a current speed of the actuator, and a is a current acceleration of the actuator;
when the joint actuator executes, the five-finger action execution data structure is C = [ i, bn ], wherein C is the respective joint action of the five fingers, i is the number of the five fingers, and Bn is the current information of one or more actuators contained in each finger;
the overall meta-motion data structure of a dexterous hand is N = [ ti, cn ], where N is the joint motion, ti is the time series of the joint motion, and Cn is the current information of the five fingers.
Step 5 comprises the following sub-steps:
Step 5.1: the physical layer monitors for the action result transmitted by the execution layer; if the action result is detected, step 5.2 is executed, otherwise monitoring continues;
Step 5.2: the physical layer decomposes the action result into meta-actions of the five fingers and the wrist, which are executed respectively by the wrist actuator, thumb actuator, index finger actuator, middle finger actuator, ring finger actuator and little finger actuator;
Step 5.3: the physical layer collects the palm physical data and feeds it back to the execution layer.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention designs and develops a humanoid robot dexterous hand control system based on the Raspberry Pi, filling the current gap of Raspberry Pi-based dexterous hand control systems for humanoid robots and realizing overall control of the humanoid robot's dexterous hand.
2. The invention decomposes the collected human arm electromyographic information into meta-actions of the wrist and the five fingers; the wrist and finger actuators respectively execute meta-actions such as bending, inner-outer motion and left-right motion. The execution data of each meta-action includes action time points, angles, speeds and accelerations, and closed-loop control is combined to achieve flexible, fine control of the whole palm, so that the dexterous hand of the humanoid robot can accurately reproduce human-like actions and perform corresponding operations in place of a human.
Drawings
FIG. 1 is a block diagram of the Raspberry Pi-based humanoid robot dexterous hand control system of the present invention;
FIG. 2 is a flow chart of the Raspberry Pi-based humanoid robot dexterous hand control method;
FIG. 3 is a flow chart of step 1 of the control method of the present invention;
FIG. 4 is a flow chart of step 2 of the control method of the present invention;
FIG. 5 is a flow chart of step 3 of the control method of the present invention;
FIG. 6 is a flow chart of step 4 of the control method of the present invention;
FIG. 7 is a flow chart of step 5 of the control method of the present invention;
FIG. 8 is a diagram of the operating states of the dexterous hand in the control method;
FIG. 9 is an allocation diagram of the physical-layer actuators of the control system of the present invention;
FIG. 10 is a flow chart of the operation of the human arm electromyographic information acquisition module in the control system of the present invention;
FIG. 11 is a flow chart of the operation of the data filtering module in the control system of the present invention;
FIG. 12 is a flow chart of the operation of the electromyographic data analysis module in the control system.
In the figures: acquisition layer 1, human arm electromyographic information acquisition module 11, data filtering module 12, transmission layer 2, data formatting module 21, protocol encapsulation module 22, bus transmission module 23, control layer 3, electromyographic data receiving module 31, electromyographic data analysis module 32, electromyographic data feedback analysis module 33, action algorithm calculation module 34, action library training module 35, execution layer 4, action execution module 41, data feedback module 42, physical layer 5, wrist actuator 51, first bending actuator 511, first inner-outer actuator 512, first left-right actuator 513, thumb actuator 52, second bending actuator 521, second inner-outer actuator 522, index finger actuator 53, third bending actuator 531, second left-right actuator 532, middle finger actuator 54, ring finger actuator 55, little finger actuator 56.
Detailed Description
The invention is further described with reference to the following figures and specific examples.
Referring to FIG. 1, in the Raspberry Pi-based humanoid robot dexterous hand control system, the control system architecture of the dexterous hand is built on the Raspberry Pi and comprises an acquisition layer 1, a transmission layer 2, a control layer 3, an execution layer 4 and a physical layer 5. The output of the acquisition layer 1 is connected with the input of the transmission layer 2, the output of the transmission layer 2 with the input of the control layer 3, and the output of the control layer 3 with the input of the execution layer 4; the feedback end of the execution layer 4 is connected with the input of the control layer 3, the output of the execution layer 4 with the input of the physical layer 5, and the feedback end of the physical layer 5 with the input of the execution layer 4.
Referring to FIGS. 1 and 2, a Raspberry Pi-based humanoid robot dexterous hand control method includes the following steps:
Step 1: the acquisition layer 1 collects human arm electromyographic information, processes it, and transmits it to the transmission layer 2.
Referring to FIGS. 1 and 3, the acquisition layer 1 includes a human arm electromyographic information acquisition module 11 and a data filtering module 12, and step 1 includes the following sub-steps:
Step 1.1: the acquisition layer 1 monitors for human arm electromyographic information; if such information is detected, step 1.2 is executed, otherwise monitoring continues.
Step 1.2: the human arm electromyographic information acquisition module 11 acquires raw electromyographic data through the electromyographic sensor group and outputs it to the data filtering module 12.
Referring to FIG. 10, the human arm electromyographic information acquisition module 11 preferably includes six medical dry electrodes, leads, a strap, an STM32 processor, an output line, a power supply and a switch. The six dry electrodes are fixed at the wrist and finger joints of the arm by the strap, in close contact with the skin surface. When the switch is turned on, the power supply energizes the six dry electrodes, which acquire wrist and finger motion information (that is, the raw human arm electromyographic data) and transmit it through the leads to the STM32 processor; the STM32 processor outputs the raw data through the output line.
Step 1.3: the data filtering module 12 performs filtering processing on the myoelectricity original data of the human arm and filters abnormal data.
Referring to fig. 11, preferably, the data filtering module 12 includes a capacitor resistor and an inductor, and performs high-pass filtering and low-pass filtering on the myoelectric raw data of the human arm.
Preferably, abnormal data are data significantly higher or lower than the normal human electromyographic signal, i.e. data outside the normal range of 0-1.5 mV.
Normal human electromyographic data are characterized as follows: the human electromyographic signal is a one-dimensional time sequence of action potentials. It can be represented as an alternating signal whose amplitude, generally proportional to muscle contraction strength, lies in the range 0-1.5 mV; it can also be characterized by frequency, with a range of 0-500 Hz and its main energy concentrated at 20-150 Hz.
Human electromyographic signals precede the corresponding limb movement by 30-150 ms, so the collected raw arm electromyographic data can be used to anticipate the movement.
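The amplitude criterion above can be turned into the abnormal-data test of step 1.3; this is a minimal sketch, and the function and constant names are assumptions, not from the patent.

```python
EMG_MIN_MV, EMG_MAX_MV = 0.0, 1.5  # normal human EMG amplitude range (mV)

def is_valid_emg(sample_mv):
    """Abnormal-data test of step 1.3: keep only samples inside the
    normal 0-1.5 mV amplitude range quoted above."""
    return EMG_MIN_MV <= sample_mv <= EMG_MAX_MV

raw = [0.2, 1.1, 2.4, -0.3, 0.9]
kept = [s for s in raw if is_valid_emg(s)]
```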
Step 1.4: the data filtering module 12 outputs the filtered myoelectric original data of the human arm to the transmission layer 2.
Step 2: the transmission layer 2 receives the human arm electromyographic information transmitted by the acquisition layer 1, processes the human arm electromyographic information and transmits the processed information to the control layer 3.
Referring to FIGS. 1 and 4, the transmission layer 2 includes a data formatting module 21, a protocol encapsulation module 22 and a bus transmission module 23; step 2 includes the following sub-steps:
Step 2.1: the transmission layer 2 monitors for the electromyographic information processed by the acquisition layer 1; if such information is detected, step 2.2 is executed, otherwise monitoring continues.
Step 2.2: the data formatting module 21 performs data formatting on the processed electromyographic information.
The electromyographic information protocol data structure of the data formatting module 21 is M = [id1, ti, k], where M is the protocol data structure, id1 is the serial number of the electromyographic sensor, ti is the collection time point of the arm electromyographic information, and k is the reading of the electromyographic sensor.
The electromyographic information received by the data formatting module 21 is floating-point data, which the module amplifies into integer data. Preferably the information is amplified 1000-fold, though the factor can also be chosen according to actual requirements; the integer form facilitates later calculation.
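The float-to-integer formatting step can be sketched in one line; the 1000x factor is the one suggested above, while the function name and rounding choice are assumptions.

```python
SCALE = 1000  # amplification factor suggested in the description

def format_value(mv):
    """Data formatting step: amplify a floating-point EMG reading
    and round it to integer data for later calculation."""
    return round(mv * SCALE)
```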
Step 2.3: the protocol encapsulation module 22 performs protocol encapsulation on the human arm electromyography information after data formatting processing.
The encapsulated protocol includes: an actuator number id, a termination time point ti, data k, and a check value sum. And the processed data is packaged through protocol packaging, so that the data is conveniently packaged and sent, and the transmission efficiency is improved.
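The patent names the four fields but not their layout; the sketch below assumes a little-endian byte layout and a 16-bit additive checksum purely for illustration.

```python
import struct

FRAME = "<BHhH"  # id (u8), ti (u16), k (i16), checksum (u16); layout assumed

def encapsulate(dev_id, ti, k):
    """Pack one record with the four fields the description names:
    number id, termination time point ti, data k and check value sum.
    The byte layout and checksum scheme are assumptions."""
    checksum = (dev_id + ti + k) & 0xFFFF
    return struct.pack(FRAME, dev_id, ti, k, checksum)

def parse(frame):
    """Inverse operation, as performed by the receiving module (step 3.2)."""
    dev_id, ti, k, checksum = struct.unpack(FRAME, frame)
    if checksum != (dev_id + ti + k) & 0xFFFF:
        raise ValueError("check value mismatch")
    return dev_id, ti, k

frame = encapsulate(2, 1500, 731)
```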
Step 2.4: the bus transmission module 23 transmits the protocol-encapsulated human arm myoelectric information to the control layer 3 through a bus.
The Bus (Bus) is a standardized way of exchanging data (data) between computer components, i.e. providing data transfer and control logic for each component in a general way.
Step 3: the control layer 3 receives the electromyographic information transmitted by the transmission layer 2, analyzes it and selects a matched action model, and transmits the action model to the execution layer 4.
Referring to FIGS. 1 and 5, the control layer 3 includes an electromyographic data receiving module 31, an electromyographic data analysis module 32, an electromyographic data feedback analysis module 33, an action algorithm calculation module 34 and an action library training module 35; step 3 includes the following sub-steps:
Step 3.1: the control layer 3 monitors for the electromyographic information processed by the transmission layer 2; if such information is detected, step 3.2 is executed, otherwise monitoring continues.
Step 3.2: the electromyographic data receiving module 31 receives the processed electromyographic information, performs data-protocol parsing on it, and transmits the parsed information to the electromyographic data analysis module 32.
Preferably, the electromyographic data receiving module 31 may adopt a serial-port receiver. After receiving the electromyographic information, the module parses the protocol stream, i.e. decodes the received information according to the encapsulation protocol above (the actuator number id, the time point ti, the data k and the check value sum) to obtain each individual data item.
Step 3.3: the electromyographic data analysis module 32 performs high-pass filtering and low-pass filtering on the electromyographic information of the human arm to obtain electromyographic data of the human arm, and transmits the electromyographic data of the human arm to the action algorithm calculation module 34.
The high-pass filter allows high-frequency or alternating-current components in the signals to pass through, and the filter inhibits low-frequency or direct-current components and is used for screening high-frequency data in the myoelectric information of the human arm. The low-pass filter allows low-frequency or direct-current components in the signals to pass through, and the filter inhibits high-frequency components or interference and noise and is used for screening low-frequency data in the electromyographic information of the human arm.
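The filter pair described above can be sketched with a first-order IIR filter and its complement; this is only a minimal stand-in for the analog filtering the patent describes, and the smoothing factor `alpha` is an assumed value.

```python
def low_pass(samples, alpha=0.2):
    """First-order IIR low-pass: passes low-frequency/DC content.
    `alpha` is an assumed smoothing factor, not a value from the patent."""
    out, y = [], samples[0]
    for s in samples:
        y += alpha * (s - y)
        out.append(y)
    return out

def high_pass(samples, alpha=0.2):
    """Complementary high-pass: the signal minus its low-pass part,
    suppressing low-frequency/DC content."""
    return [s - l for s, l in zip(samples, low_pass(samples, alpha))]

flat = [1.0] * 8  # a pure DC signal passes the low-pass unchanged
hp = high_pass(flat)
```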
The electromyographic data analysis module 32 receives, through the data feedback analysis module 33, the feedback information output by the data feedback module 42 of the execution layer 4, and performs high-order filtering and data screening on the electromyographic information combined with the feedback information to obtain the arm electromyographic data.
Referring to FIG. 12, the electromyographic data analysis module 32 includes an electromyographic data input module, a screening module and a result output module. The screening process is as follows: the input module receives the simply processed (i.e. high-pass and low-pass filtered) arm electromyographic data; high and low thresholds are set; the screening module removes data exceeding the high threshold or falling below the low threshold and can separate the electromyographic data of several channels; finally, the result output module outputs the screened data.
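The per-channel threshold screening of FIG. 12 can be sketched as below; representing the channels as a dict and the function name are assumptions for illustration.

```python
def screen_channels(channels, low, high):
    """Screening-module sketch (FIG. 12): per channel, discard samples
    above the high threshold or below the low threshold, keeping the
    channels separated."""
    return {ch: [s for s in xs if low <= s <= high]
            for ch, xs in channels.items()}

screened = screen_channels({1: [0.1, 1.9, 0.4], 2: [-0.2, 0.8]},
                           low=0.0, high=1.5)
```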
Step 3.4: a bioelectricity reflection model is built in the action algorithm calculation module 34 according to the bioelectricity electromyography model of the human body, electromyography data of the human body arm is input into the bioelectricity reflection model of the action algorithm calculation module 34, the bioelectricity reflection model outputs action data, and the action algorithm calculation module 34 transmits the action data to the action library training module 35.
The human body biological myoelectric model respectively carries out modeling on six paths of myoelectric information under different actions according to the collected six paths of myoelectric information by one y [ i ] = a [ i ] x [ i ] (0 < = i < = 6), and the training method of the model comprises the following steps: and calculating and averaging the a [ i ] under the specific action to obtain the bioelectricity reflection model. Modeling and training of the bioelectrical reflex model can be realized based on neural networks such as RNN and CNN in the prior art.
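The stated training rule (compute a[i] under a specific action and average) can be sketched for one channel as follows; the function name and sample values are assumptions.

```python
def train_gain(xs, ys):
    """Training sketch for one channel of y[i] = a[i] * x[i]:
    compute a[i] = y/x over the samples recorded for a specific
    action and average, as the description states."""
    ratios = [y / x for x, y in zip(xs, ys) if x != 0]
    return sum(ratios) / len(ratios)

a1 = train_gain([1.0, 2.0, 4.0], [2.0, 4.0, 8.0])
```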
Step 3.5: the action library training module 35 stores a pre-trained action model, and the action library training module 35 selects an action model matching the action data according to the action data and transmits the action model to the execution layer 4.
The action library training module 35 consists of a six-channel myoelectric input module, a factor calculation module and a result output module. Its working process is as follows: the six-channel myoelectric input module receives the six channels of myoelectric data; the matching factor b is calculated using the algorithm of the action algorithm calculation module 34 from step 3.4; when 0.5 ≤ b ≤ 1, the action is considered matched and the action library training module 35 outputs 1. The output of the action library training module 35 is therefore 0 or 1, where 0 denotes a mismatch and 1 a match.
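The matching decision can be sketched as below. The patent gives the decision rule (match when 0.5 ≤ b ≤ 1) but not the formula for b; here b is assumed, purely for illustration, to be the mean per-channel agreement between the model prediction a[i]·x[i] and the measured value y[i]:

```python
def matching_factor(a, x, y):
    """Assumed agreement score in [0, 1]: per channel, the ratio of the
    smaller to the larger of prediction a[i]*x[i] and measurement y[i],
    averaged over channels."""
    ratios = []
    for i in a:
        predicted = a[i] * x[i]
        ratios.append(min(predicted, y[i]) / max(predicted, y[i]))
    return sum(ratios) / len(ratios)

def match_output(b):
    """Binary output of module 35: 1 (match) when 0.5 <= b <= 1, else 0."""
    return 1 if 0.5 <= b <= 1.0 else 0
```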
Step 4: The execution layer 4 receives the action model transmitted by the control layer 3, outputs the action result corresponding to the action model to the physical layer 5, and outputs data feedback information to the control layer 3.
Referring to fig. 1 and fig. 6, the execution layer 4 includes an action execution module 41 and a data feedback module 42, and step 4 includes the following sub-steps:
step 4.1: the execution layer 4 monitors the action model transmitted by the control layer 3, if the execution layer 4 monitors the action model, the step 4.2 is executed, and if the execution layer 4 does not monitor the action model, the monitoring is continued.
Step 4.2: the action execution module 41 receives the action model transmitted from the control layer 3, and the action execution module 41 analyzes the action model into an action result and outputs the action result to the physical layer 5.
The action execution module 41 comprises thumb, index-finger, middle-finger, ring-finger and little-finger execution modules. Each single execution module consists of an action result input module and an action execution output module, i.e. the motor or servo low-level drive. The workflow of a single execution module is as follows: after the action result input module receives an action result, the action execution output module outputs the corresponding hardware signal. Each execution module outputs the hardware signal for its own finger, so that the thumb, index finger, middle finger, ring finger and little finger can be controlled independently.
The meta-action protocol data structure of the action result is D = [id, ti, d], where D is the meta-action protocol data structure, id is the actuator number in the physical layer 5, ti is the time point at which the actuator reaches the current angle, and d is the current angle value of the actuator. The actuators of the physical layer 5 may be servos or motors.
Step 4.3: the data feedback module 42 outputs the feedback information to the electromyographic data analysis module 32 of the control layer 3.
The feedback information of the data feedback module 42 indicates whether the actions of the thumb, index finger, middle finger, ring finger and little finger have been completed. During action training of the bioelectric reflex model, this feedback information can serve as a flag for checking whether the action training is complete, which facilitates training and optimization of the model.
Step 5: The physical layer 5 receives the action result transmitted by the execution layer 4, decomposes the action result and executes it through the actuators, and simultaneously feeds the palm physical data back to the execution layer 4.
Referring to fig. 9, the physical layer 5 includes a wrist actuator 51, a thumb actuator 52, an index finger actuator 53, a middle finger actuator 54, a ring finger actuator 55 and a little finger actuator 56. The wrist actuator 51 comprises a first bending actuator 511, a first inside-outside actuator 512 and a first left-right actuator 513; the thumb actuator 52 includes a second bending actuator 521 and a second inside-outside actuator 522; the index finger actuator 53 includes a third bending actuator 531 and a second left-right actuator 532; the middle finger actuator 54 is a fourth bending actuator, the ring finger actuator 55 is a fifth bending actuator, and the little finger actuator 56 is a sixth bending actuator.
The actuators at the joints of the palm are distributed according to their functions. The meta actions of the wrist actuator 51, thumb actuator 52, index finger actuator 53, middle finger actuator 54, ring finger actuator 55 and little finger actuator 56 are, respectively: the wrist meta action E1, thumb meta action E2, index finger meta action E3, middle finger meta action E4, ring finger meta action E5 and little finger meta action E6. The palm action data structure is formulated according to the execution data structure of the meta actions; the palm actions are realized through individual-actuator execution and combined-actuator execution, so as to achieve the overall palm action. The palm action data structure is E = [E1, E2, E3, E4, E5, E6], where E is a palm action.
When the individual actuator executes, the meta-motion execution data structure of the single actuator is B = [ id, ti, d, v, a ], wherein B is the meta-motion execution data structure of the single actuator, id is an actuator number, ti is a termination time point of the actuator running to a current angle, d is a current angle of the actuator, v is a current speed of the actuator, and a is a current acceleration of the actuator.
When the combined actuators execute, the five-finger action execution data structure is C = [i, Bn], where C is the combined action of a finger, i is the finger number (thumb to little finger are numbered 1 to 5 in sequence), and Bn is the current information of the one or more actuators contained in that finger.
The whole meta-motion data structure of the dexterous hand is N = [ ti, cn ], wherein N is a joint motion, ti is a time sequence of the joint motion, and Cn is current information of five fingers.
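The four data structures above can be represented directly, for instance as named tuples; field names follow the patent's definitions, while the example values (actuator number, angles, speeds) are arbitrary illustrations:

```python
from collections import namedtuple

D = namedtuple("D", ["id", "ti", "d"])            # meta-action protocol: actuator id, time point, angle
B = namedtuple("B", ["id", "ti", "d", "v", "a"])  # single-actuator execution: plus speed and acceleration
C = namedtuple("C", ["i", "Bn"])                  # per-finger combination: finger number 1..5, actuator list
N = namedtuple("N", ["ti", "Cn"])                 # whole hand: time sequence, five-finger information

# Example: the index finger (i = 2) bending one actuator to 30 degrees.
b = B(id=7, ti=0.5, d=30.0, v=10.0, a=0.0)
c = C(i=2, Bn=[b])
n = N(ti=0.0, Cn=[c])
```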
Referring to fig. 8, the working states of the dexterous hand include the Ready, Running, Stop, Error and Status states.
Ready (ready state): the current process of the humanoid robot dexterous hand control system is ready. From the ready state the system can enter the running state or the query state; the only input state of the ready state is the stop state, and the ready state maintains itself indefinitely without any input.
Running (running state): the current process of the humanoid robot dexterous hand control system is running. From the running state the system can only move forward to the stop state; the only input state of the running state is the ready state, and the running state maintains itself until the next state is entered.
Stop (stop state): the current process of the humanoid robot dexterous hand control system has stopped running. From the stop state the system can only move forward to the ready state; the input state of the stop state can be the query state or the running state.
Error (abnormal state): the current process of the humanoid robot dexterous hand control system is in an abnormal state. From the abnormal state the system can only move forward to the stop state; the input state of the abnormal state is the running state.
Status (query state): the current process of the humanoid robot dexterous hand control system is querying the system state. From the query state the system can only move forward to the stop state; the only input state of the query state is the ready state.
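The five working states can be encoded as a transition table. The forward edges follow the descriptions above; self-transitions model the self-maintaining states, and the Running → Error edge is inferred from the Error state's stated input rather than given explicitly:

```python
# Legal next-states for each working state of the dexterous hand.
TRANSITIONS = {
    "Ready": {"Ready", "Running", "Status"},
    "Running": {"Running", "Stop", "Error"},
    "Stop": {"Ready"},
    "Error": {"Stop"},
    "Status": {"Stop"},
}

def step(state, target):
    """Return the new state if the transition is legal, else raise."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target
```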
Referring to fig. 1 and 7, the step 5 includes the following sub-steps:
step 5.1: the physical layer 5 monitors the action result transmitted by the execution layer 4, if the physical layer 5 monitors the action result, step 5.2 is executed, if the physical layer 5 does not monitor the action result, the monitoring is continued.
Step 5.2: the physical layer 5 decomposes the action result into five finger and wrist meta actions, and executes the corresponding meta actions through a wrist actuator 51, a thumb actuator 52, an index finger actuator 53, a middle finger actuator 54, a ring finger actuator 55 and a small finger actuator 56 respectively.
Preferably, the wrist actuator 51, thumb actuator 52, index finger actuator 53, middle finger actuator 54, ring finger actuator 55 and little finger actuator 56 can adopt motors or servos; after the action result is received and decomposed into meta actions, the motors or servos are driven to rotate according to the meta actions.
According to its meta actions, the wrist actuator 51 can execute bending, left-right and inside-outside motions; the thumb actuator 52, index finger actuator 53, middle finger actuator 54, ring finger actuator 55 and little finger actuator 56 can execute bending motions; in addition, the thumb actuator 52 can execute inside-outside motions and the index finger actuator 53 can execute left-right motions. Motion control of the whole dexterous hand is thus realized through different combinations of meta actions.
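The per-actuator capabilities described above amount to a small capability table; the actuator and motion names below are English glosses, not identifiers from the patent:

```python
# Which meta motions each actuator group supports, per the description.
CAPABILITIES = {
    "wrist": {"bend", "left_right", "inside_outside"},
    "thumb": {"bend", "inside_outside"},
    "index": {"bend", "left_right"},
    "middle": {"bend"},
    "ring": {"bend"},
    "little": {"bend"},
}

def can_execute(actuator, meta_motion):
    """True if the named actuator group supports the given meta motion."""
    return meta_motion in CAPABILITIES.get(actuator, set())
```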
Step 5.3: the physical layer 5 takes the palm physical data and feeds it back to the execution layer 4.
The palm physical data mainly comprise the angle data of the motors or servos, which can be acquired from the wrist actuator 51, thumb actuator 52, index finger actuator 53, middle finger actuator 54, ring finger actuator 55 and little finger actuator 56 through serial-port communication.
The physical layer 5 continuously feeds back the collected palm physical data to the execution layer 4, and then the data feedback module 42 of the execution layer 4 continuously feeds back the collected palm physical data to the data feedback analysis module 33 of the control layer 3, so that the control layer 3 can accurately judge whether the physical palm action is executed in place.
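One way the control layer could judge "executed in place" from the fed-back angle data is a simple per-actuator tolerance check; the tolerance value is an assumption for illustration, not specified in the patent:

```python
def executed_in_place(target_angles, feedback_angles, tol_deg=2.0):
    """Compare commanded vs fed-back actuator angles (dicts keyed by
    actuator id); the action counts as executed in place when every
    actuator is within the angular tolerance."""
    return all(abs(target_angles[i] - feedback_angles[i]) <= tol_deg
               for i in target_angles)
```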
The present invention is not limited to the above embodiments, and any modifications, equivalent replacements, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A Raspberry Pi-based humanoid robot dexterous hand control system, characterized in that: a control system architecture of the humanoid robot dexterous hand is built on a Raspberry Pi and comprises an acquisition layer (1), a transmission layer (2), a control layer (3), an execution layer (4) and a physical layer (5); the output of the acquisition layer (1) is connected with the input of the transmission layer (2), the output of the transmission layer (2) is connected with the input of the control layer (3), the output of the control layer (3) is connected with the input of the execution layer (4), the feedback end of the execution layer (4) is connected with the input of the control layer (3), the output of the execution layer (4) is connected with the input of the physical layer (5), and the feedback end of the physical layer (5) is connected with the input of the execution layer (4).
2. A method for controlling the Raspberry Pi-based humanoid robot dexterous hand control system of claim 1, characterized in that it comprises the following steps:
step 1: the collecting layer (1) collects the myoelectric information of the human arm, processes the myoelectric information of the human arm and transmits the processed myoelectric information to the transmission layer (2);
step 2: the transmission layer (2) receives the myoelectric information of the human arm transmitted by the acquisition layer (1), processes the myoelectric information of the human arm and transmits the processed myoelectric information of the human arm to the control layer (3);
step 3: the control layer (3) receives the myoelectric information of the human arm transmitted by the transmission layer (2), analyzes the myoelectric information of the human arm and selects a matched action model, and the control layer (3) transmits the action model to the execution layer (4);
step 4: the execution layer (4) receives the action model transmitted by the control layer (3), outputs an action result corresponding to the action model to the physical layer (5), and outputs data feedback information to the control layer (3);
step 5: the physical layer (5) receives the action result transmitted by the execution layer (4), decomposes and executes the action result, and simultaneously feeds the palm physical data back to the execution layer (4).
3. The control method according to claim 2, wherein: the collecting layer (1) comprises a human arm electromyographic information collecting module (11) and a data filtering module (12), and the step 1 comprises the following steps:
step 1.1: monitoring myoelectric information of a human arm by the acquisition layer (1), executing the step 1.2 if the myoelectric information of the human arm is monitored by the acquisition layer (1), and continuing monitoring if the myoelectric information of the human arm is not monitored by the acquisition layer (1);
step 1.2: the human arm myoelectric information acquisition module (11) acquires the raw human arm myoelectric data and transmits the raw data to the data filtering module (12);
step 1.3: the data filtering module (12) is used for filtering myoelectricity original data of the human arm and filtering abnormal data;
step 1.4: the data filtering module (12) outputs the filtered myoelectricity original data of the human arm to the transmission layer (2).
4. The control method according to claim 2, wherein: the transmission layer (2) comprises a data formatting module (21), a protocol encapsulation module (22) and a bus transmission module (23); the step 2 comprises the following sub-steps:
step 2.1: the transmission layer (2) monitors the myoelectric information of the human arm after the data processing is carried out by the acquisition layer (1), if the transmission layer (2) monitors the myoelectric information of the human arm, the step 2.2 is executed, and if the transmission layer (2) does not monitor the myoelectric information of the human arm, the monitoring is continued;
step 2.2: the data formatting module (21) carries out data formatting processing on the human arm electromyographic information after data processing;
step 2.3: the protocol encapsulation module (22) carries out protocol encapsulation on the human arm electromyographic information after data formatting processing;
step 2.4: the bus transmission module (23) transmits the human arm electromyogram information after protocol encapsulation to the control layer (3).
5. The control method according to claim 4, wherein: the electromyographic information protocol data structure of the data formatting module (21) is M = [ id1, ti, k ], wherein M is the electromyographic information protocol data structure, id1 is the serial number of the electromyographic sensor, ti is the collection time point of the electromyographic information of the human arm, and k is the numerical value of the electromyographic sensor.
6. The control method according to claim 2, wherein: the control layer (3) comprises an electromyographic data receiving module (31), an electromyographic data analyzing module (32), an electromyographic data feedback analyzing module (33), an action algorithm calculating module (34) and an action library training module (35); the step 3 comprises the following sub-steps:
step 3.1: the control layer (3) monitors myoelectric information of the human arm after data processing is carried out by the transmission layer (2), if the control layer (3) monitors the myoelectric information of the human arm, the step 3.2 is executed, and if the control layer (3) does not monitor the myoelectric information of the human arm, the monitoring is continued;
step 3.2: the electromyographic data receiving module (31) receives the human arm electromyographic information subjected to data processing by the transmission layer (2), and performs data protocol analysis on the human arm electromyographic information, and the electromyographic data receiving module (31) transmits the analyzed human arm electromyographic information to the electromyographic data analyzing module (32);
step 3.3: the electromyographic data analysis module (32) carries out high-pass filtering and low-pass filtering on the electromyographic information of the human arm to obtain the electromyographic data of the human arm, and transmits the electromyographic data of the human arm to the action algorithm calculation module (34);
step 3.4: a bioelectricity reflection model is built in the action algorithm calculation module (34) according to a human body bioelectricity electromyography model, human body arm electromyography data are input into the bioelectricity reflection model of the action algorithm calculation module (34), the bioelectricity reflection model outputs action data, and the action algorithm calculation module (34) transmits the action data to the action library training module (35);
the electromyographic data analysis module (32) receives feedback information output by the execution layer (4) through the data feedback analysis module (33), and the electromyographic data analysis module (32) performs high-order filtering and data screening by combining the electromyographic information of the human arm and the feedback information to obtain the electromyographic data of the human arm;
step 3.5: the action library training module (35) stores pre-trained action models, and the action library training module (35) selects action models matched with the action data according to the action data and transmits the action models to the execution layer (4).
7. The control method according to claim 2, wherein: the execution layer (4) comprises an action execution module (41) and a data feedback module (42), and the step 4 comprises the following sub-steps:
step 4.1: the execution layer (4) monitors the action model transmitted by the control layer (3), if the execution layer (4) monitors the action model, the step 4.2 is executed, and if the execution layer (4) does not monitor the action model, the monitoring is continued;
step 4.2: the action execution module (41) receives the action model transmitted by the control layer (3), and the action execution module (41) analyzes the action model into an action result and outputs the action result to the physical layer (5);
step 4.3: the data feedback module (42) outputs the feedback information to a myoelectric data analysis module (32) of the control layer (3).
8. The control method according to claim 7, wherein: the meta-action protocol data structure of the action result is D = [id, ti, d], wherein D is the meta-action protocol data structure, id is the actuator number in the physical layer (5), ti is the time point at which the actuator reaches the current angle, and d is the current angle value of the actuator.
9. The control method according to claim 2, wherein: the physical layer (5) comprises a wrist actuator (51), a thumb actuator (52), an index finger actuator (53), a middle finger actuator (54), a ring finger actuator (55) and a small finger actuator (56); wherein, the wrist actuator (51) comprises a first bending actuator (511), a first inner and outer actuator (512) and a first left and right actuator (513); the thumb actuator (52) comprises a second bending actuator (521) and a second inner and outer actuator (522); the index finger actuator (53) comprises a third bending actuator (531) and a second left-right actuator (532); the middle finger actuator (54) is a fourth bending actuator, the ring finger actuator (55) is a fifth bending actuator, and the little finger actuator (56) is a sixth bending actuator;
the primary actions of the wrist actuator (51), the thumb actuator (52), the index finger actuator (53), the middle finger actuator (54), the ring finger actuator (55) and the little finger actuator (56) are respectively as follows: a wrist element action E1, a thumb element action E2, an index finger element action E3, a middle finger element action E4, a ring finger element action E5 and a little finger element action E6; the palm motion data structure is: e = [ E1, E2, E3, E4, E5, E6], wherein E is a palm motion;
the method comprises the steps that palm motions are realized through execution of an individual actuator and execution of a combined actuator, when the individual actuator executes, a meta-motion execution data structure of a single actuator is B = [ id, ti, d, v, a ], wherein B is the meta-motion execution data structure of the single actuator, id is an actuator number, ti is a termination time point of the actuator running to a current angle, d is a current angle of the actuator, v is a current speed of the actuator, and a is a current acceleration of the actuator;
when the combined actuators execute, the five-finger action execution data structure is C = [i, Bn], wherein C is the combined action of a finger, i is the finger number, and Bn is the current information of the one or more actuators contained in that finger;
the whole meta-motion data structure of the dexterous hand is N = [ ti, cn ], wherein N is a joint motion, ti is a time sequence of the joint motion, and Cn is current information of five fingers.
10. The control method according to claim 2, wherein: the step 5 comprises the following sub-steps:
step 5.1: the physical layer (5) monitors the action result transmitted by the execution layer (4), if the physical layer (5) monitors the action result, the step 5.2 is executed, and if the physical layer (5) does not monitor the action result, the monitoring is continued;
step 5.2: the physical layer (5) decomposes the action result into five finger and wrist element actions, and corresponding element actions are respectively executed through a wrist actuator (51), a thumb actuator (52), an index finger actuator (53), a middle finger actuator (54), a ring finger actuator (55) and a little finger actuator (56);
step 5.3: the physical layer (5) acquires palm physical data and feeds it back to the execution layer (4).
CN202211583996.4A 2022-12-09 2022-12-09 System and method for controlling dexterous hand of humanoid robot based on raspberry pie Pending CN115816456A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211583996.4A CN115816456A (en) 2022-12-09 2022-12-09 System and method for controlling dexterous hand of humanoid robot based on raspberry pie


Publications (1)

Publication Number Publication Date
CN115816456A true CN115816456A (en) 2023-03-21

Family

ID=85546201




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination