CN118058882A - Forearm artificial limb control method, device, equipment and medium - Google Patents

Forearm artificial limb control method, device, equipment and medium

Info

Publication number
CN118058882A
CN118058882A
Authority
CN
China
Prior art keywords
target
target object
forearm prosthesis
forearm
prosthesis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211485274.5A
Other languages
Chinese (zh)
Inventor
郑悦
李向新
刘岩
谭迎宵
张浩诗
田岚
林宛华
蒋续钢
李光林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN202211485274.5A priority Critical patent/CN118058882A/en
Priority to PCT/CN2023/133472 priority patent/WO2024109858A1/en
Publication of CN118058882A publication Critical patent/CN118058882A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F - FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 - Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 - Prostheses not implantable in the body
    • A61F2/68 - Operating or control means
    • A61F2/70 - Operating or control means electrical
    • A61F2/72 - Bioelectric control, e.g. myoelectric

Landscapes

  • Health & Medical Sciences (AREA)
  • Cardiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Transplantation (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Prostheses (AREA)

Abstract

The embodiments of the invention disclose a forearm prosthesis control method, device, equipment and medium. The method comprises the following steps: acquiring a target object image captured by an image acquisition functional module arranged on the palm of a target forearm prosthesis, and performing pose analysis on the target object according to the target object image; acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to the motion intention corresponding to the electromyographic signal; and, during the motion control process, determining control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters. This technical scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.

Description

Forearm artificial limb control method, device, equipment and medium
Technical Field
The embodiments of the present invention relate to the technical field of prosthesis control, and in particular to a forearm prosthesis control method, device, equipment and medium.
Background
Existing prosthesis control is based on the relative position between the user's arm and the target object together with electromyographic signals. However, continuously using electromyographic signals as the control signal of the prosthesis aggravates the user's muscle fatigue, and electromyographic signal recognition is easily disturbed by the environment, so real-time control of the prosthesis by electromyographic signals alone degrades the stability of prosthesis control.
Disclosure of Invention
The invention provides a forearm prosthesis control method, device, equipment and medium, which improve the stability of prosthesis control, reduce the user's cognitive burden and muscle fatigue when using the prosthesis, and complete the manipulation of a target object efficiently and accurately.
According to an aspect of the present invention, there is provided a method of controlling a forearm prosthesis, the method comprising:
Acquiring a target object image captured by an image acquisition functional module arranged on the palm of a target forearm prosthesis, and performing pose analysis on the target object according to the target object image;
acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal;
in the motion control process, determining control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters.
According to another aspect of the present invention, there is provided a forearm prosthesis control device comprising:
The image acquisition module is used for acquiring the target object image acquired by the image acquisition function module of the palm of the target forearm prosthesis and analyzing the pose of the target object according to the target object image;
The signal identification module is used for acquiring and identifying an electromyographic signal associated with the target forearm prosthesis and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal;
And the parameter control module is used for determining control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object in the motion control process and controlling the target forearm prosthesis according to the control parameters.
According to another aspect of the present invention, there is provided an electronic device including:
at least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of controlling a forearm prosthesis according to any of the embodiments of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a method of controlling a forearm prosthesis according to any of the embodiments of the invention.
According to the technical scheme of the embodiments of the invention, a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis is acquired, and pose analysis is performed on the target object according to the target object image; an electromyographic signal associated with the target forearm prosthesis is acquired and identified, and the motion control process of the target forearm prosthesis is started or stopped according to the motion intention corresponding to the electromyographic signal; during the motion control process, control parameters for controlling the target forearm prosthesis are determined according to the real-time pose analysis result of the target object, and the target forearm prosthesis is controlled according to the control parameters. This scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for controlling a forearm prosthesis according to an embodiment of the invention;
FIG. 2 is a flow chart of another method for controlling a forearm prosthesis according to an embodiment of the invention;
FIG. 3 is a flow chart of yet another method for controlling a forearm prosthesis provided by an embodiment of the invention;
FIG. 4 is a block diagram of a forearm prosthesis control system;
FIG. 5 is a schematic view of a forearm prosthesis;
FIG. 6 is a flow chart of a method of controlling a forearm prosthesis;
FIG. 7 is a block diagram of a control device for a forearm prosthesis according to an embodiment of the invention;
fig. 8 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "target," "initial," "first," and "second," etc. in the description and claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a flowchart of a method for controlling a forearm prosthesis according to an embodiment of the invention, where the embodiment is applicable to a scenario of controlling a forearm prosthesis, and is more applicable to a scenario of implementing control of a forearm prosthesis based on electromyographic signals and visual signals. The method may be performed by a forearm prosthesis control device, which may be implemented in hardware and/or software, or may be configured in an electronic device.
As shown in fig. 1, the forearm prosthesis control method includes the steps of:
S110, acquiring a target object image acquired by an image acquisition functional module arranged on the palm of the target forearm prosthesis, and analyzing the pose of the target object according to the target object image.
The target forearm prosthesis comprises a prosthetic palm with tactile sensing capability for perceiving contact force.
The palm of the target forearm prosthesis is also provided with an image acquisition functional module, for example a high-definition camera embedded in the palm, which is used to capture images of the target object in real time. Image-based forearm prosthesis control in the prior art places the camera on the back of the hand, on the arm, or on the head. However, when a back-of-hand camera captures the target object image, its viewpoint is constrained by the prosthesis posture: during grasping, the back of the hand faces away from the target object, so the target object is difficult to keep within the camera's field of view at all times. When the camera is mounted on the arm or the head, the target object may be occluded by the forearm prosthesis itself. Mounting the camera in the palm keeps it facing the target object as the forearm prosthesis approaches, so target object images can be captured continuously in real time.
Further, performing pose analysis on the target object according to the target object image includes:
First, a target object in a target object image is identified, and contour position information of the target object in the target object image is determined.
The contour position information is information for determining the position of the target object in the target object image, such as the outer edge contour image or the coordinates of the outer edge contour line of the target object in the target object image.
It will be appreciated that the target object image first needs to be preprocessed, for example by image enhancement, smoothing and filtering, to remove interference and noise; the target object may then be identified using a target detection algorithm. For example, Canny, R-CNN, SPP-Net, Faster R-CNN, R-FCN, Mask R-CNN, FPN, LocNet, YOLO, or SSD algorithms may be used to perform image segmentation based on the geometric and statistical characteristics of the target object in order to identify it.
Specifically, the neural network model may be pre-trained based on contour position information of the target object in the manually marked target object image, and the target object image is input into the pre-trained neural network model to obtain contour position information of the target object in the target object image.
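As an illustration of the classical edge-detection route mentioned above, the following sketch extracts the target contour with Gaussian smoothing and Canny edge detection. It assumes OpenCV is available and that the largest contour in view belongs to the target object; the function name and thresholds are illustrative rather than values taken from the patent.

```python
import cv2

def extract_target_contour(image_bgr, canny_low=50, canny_high=150):
    """Return the pixel contour of the (assumed single) target object, or None."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    smoothed = cv2.GaussianBlur(gray, (5, 5), 0)        # remove interference and noise
    edges = cv2.Canny(smoothed, canny_low, canny_high)  # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    # Assume the target object corresponds to the largest detected contour.
    return max(contours, key=cv2.contourArea)
```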
Then, the centroid coordinates and the inclination angle of the target object are determined according to the contour position information.
Further, determining the centroid coordinates and the tilt angle of the target object according to the contour position information includes:
first, the pixel value of a pixel belonging to the contour region of the target object is set to 1 and the pixel value of a pixel not belonging to the contour region of the target object is set to 0 according to the contour position information.
Specifically, the two-dimensional image of the target object is converted into a grayscale image and then binarized according to the contour position information: the pixel value of a pixel belonging to the contour region of the target object is set to 1 (white), and the pixel value of a pixel not belonging to the contour region is set to 0 (black).
Then, a first-order moment calculation is performed according to the pixel value and coordinate value of each pixel in the contour region to determine the centroid coordinates.
Specifically, let V(i, j) denote the pixel value at point (i, j) in the coordinates of the contour region of the target object, where I is the number of pixels in the abscissa direction and J is the number of pixels in the ordinate direction. The sum of the pixel values of the white region within the contour region is $M_{00} = \sum_{i=1}^{I}\sum_{j=1}^{J} V(i,j)$; the sum of the products of the abscissa of each white pixel and its pixel value is $M_{10} = \sum_{i=1}^{I}\sum_{j=1}^{J} i\,V(i,j)$; and the sum of the products of the ordinate of each white pixel and its pixel value is $M_{01} = \sum_{i=1}^{I}\sum_{j=1}^{J} j\,V(i,j)$. Thus the centroid coordinates $(x_c, y_c)$ are given by $x_c = M_{10}/M_{00}$ and $y_c = M_{01}/M_{00}$.
Finally, a second-order moment calculation is performed according to the centroid coordinates and the pixel values and coordinate values of the pixels in the contour region to determine the inclination angle.
The tilt angle is the tilt angle of the target object relative to the target forearm prosthesis.
Specifically, the sum of the products of the abscissa, the ordinate and the pixel value of each white pixel within the contour region is $M_{11} = \sum_{i=1}^{I}\sum_{j=1}^{J} i\,j\,V(i,j)$; the sum of the products of the squared abscissa and the pixel value is $M_{20} = \sum_{i=1}^{I}\sum_{j=1}^{J} i^{2}\,V(i,j)$; and the sum of the products of the squared ordinate and the pixel value is $M_{02} = \sum_{i=1}^{I}\sum_{j=1}^{J} j^{2}\,V(i,j)$. Thus the inclination angle is $\theta = \frac{1}{2}\arctan\!\left(\frac{2\mu_{11}}{\mu_{20}-\mu_{02}}\right)$, where the central moments are $\mu_{11} = M_{11}/M_{00} - x_c y_c$, $\mu_{20} = M_{20}/M_{00} - x_c^{2}$, and $\mu_{02} = M_{02}/M_{00} - y_c^{2}$.
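The moment computations above can be written compactly with NumPy, as in the sketch below. The binary mask, the array orientation (rows as ordinate, columns as abscissa) and the function name are illustrative assumptions; the half-arctangent orientation formula matches the second-order-moment expression given above.

```python
import numpy as np

def centroid_and_tilt(mask):
    """mask: 2-D array, 1 inside the target contour region (white), 0 elsewhere (black)."""
    V = mask.astype(np.float64)
    ys, xs = np.mgrid[0:V.shape[0], 0:V.shape[1]]    # ordinate (row) and abscissa (column) indices
    m00 = V.sum()                                    # sum of pixel values of the white region
    xc = (xs * V).sum() / m00                        # first-order moments -> centroid
    yc = (ys * V).sum() / m00
    mu20 = (xs ** 2 * V).sum() / m00 - xc ** 2       # second-order central moments
    mu02 = (ys ** 2 * V).sum() / m00 - yc ** 2
    mu11 = (xs * ys * V).sum() / m00 - xc * yc
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)  # inclination angle (radians)
    return (xc, yc), theta
```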
And finally, determining the position information of the target object relative to the target forearm prosthesis according to the centroid coordinates and the parameter information of the image acquisition functional module, and taking the position information and the inclination angle as pose analysis results.
Specifically, the coordinates of the target object relative to the image acquisition functional module are obtained by combining the centroid coordinates with the parameter information of the image acquisition functional module, i.e. the correspondence between pixel coordinates in the captured image and physical coordinates; these coordinates, together with the inclination angle, are taken as the pose analysis result of the current target object.
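A minimal back-projection sketch under a pinhole camera model is given below. The intrinsic parameters fx, fy, cx, cy stand for the camera parameter information mentioned above, and the depth value is an assumption: the patent does not specify how the distance to the target object is obtained (it could come, for instance, from a known object size or an additional range sensor).

```python
import numpy as np

def pixel_to_camera_position(xc, yc, depth, fx, fy, cx, cy):
    """Back-project the centroid pixel (xc, yc) at an assumed depth (metres)
    into coordinates relative to the palm camera."""
    X = (xc - cx) / fx * depth
    Y = (yc - cy) / fy * depth
    return np.array([X, Y, depth])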
S120, acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal.
The electromyographic signal is a one-dimensional time-series signal originating from motor neurons in the spinal cord of the central nervous system; it is the sum of the action potentials (APs) emitted by the motor units contacted by the electrodes. Electromyographic signals carry information about the mode and intensity of muscle contraction and can therefore serve as a source of control information.
In this embodiment, the user's electromyographic signal and the visual signal provided by the camera mounted on the forearm prosthesis are combined as control sources in a cooperative control scheme: the electromyographic signal is used only as the start and stop signal for prosthesis control. Once prosthesis control has started, the position and posture of the target object are identified from the visual signal provided by the camera, and the forearm prosthesis is controlled according to the pose analysis result of the identified target object.
Specifically, the human electromyographic signal serves as the start or stop signal for prosthesis control. When the electromyographic signal is recognized as the preset "fist clenching" action, control of the forearm prosthesis is started so that the forearm prosthesis moves; when the electromyographic signal is recognized as "hand opening", the control process of the forearm prosthesis is terminated and the movement of the forearm prosthesis stops.
Further, the correspondence between electromyographic signals and intention recognition can be established as follows. First, electrodes are attached to the muscle surface of the user's forearm stump, and the user is asked to perform the "hand opening" and "grasping" actions with the stump; the electromyographic signals corresponding to the two actions are acquired separately and preprocessed by signal amplification, envelope extraction, mean filtering, notch filtering or band-pass filtering, etc., to reduce power-frequency interference and noise. Then, features are extracted from the preprocessed electromyographic signals; the extraction parameters can be set as required, and the features may be extracted, for example, by principal component analysis, support vector machine recursive feature elimination (SVM-RFE), or a correlation coefficient map (also called a heat map). Finally, the electromyographic signals are classified according to the extracted features, for example by threshold-based classification, amplitude-coding-based classification, or hierarchical control decisions, and a corresponding criterion or model describing the correspondence between electromyographic feature classes and hand movement intentions is established; this criterion or model is used to determine the movement intention corresponding to the current electromyographic signal and to start or stop the motion control process of the target forearm prosthesis.
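The following sketch illustrates one such pipeline: band-pass filtering, sliding-window segmentation, a mean-absolute-value (MAV) feature, and a single-threshold two-class decision. The filter band, window length and the MAV feature are illustrative choices under the threshold-based classification mentioned above, not values fixed by the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_emg(emg, fs=1000.0, band=(20.0, 450.0)):
    """Band-pass filter one EMG channel to reduce interference and noise."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def window_features(emg, fs=1000.0, win_s=0.2, step_s=0.05):
    """Cut the signal into overlapping windows and compute the MAV of each window."""
    win, step = int(win_s * fs), int(step_s * fs)
    return np.array([np.mean(np.abs(emg[s:s + win]))
                     for s in range(0, len(emg) - win + 1, step)])

def classify_intent(mav, threshold):
    """Two-class threshold decision: above the threshold -> 'start' (fist clenching),
    otherwise -> 'stop' (hand opening)."""
    return "start" if mav >= threshold else "stop"
```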
And S130, in the motion control process, determining control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters.
Specifically, the current electromyographic signal is input into the model describing the correspondence between electromyographic feature classes and movement intentions, or the movement intention corresponding to the current electromyographic signal is obtained from the corresponding criterion. During motion control of the target forearm prosthesis, the control parameters corresponding to the current pose analysis result of the target object are determined according to the correspondence between pose analysis results and control parameters of the target forearm prosthesis, and the target forearm prosthesis is driven with these control parameters, thereby realizing autonomous control of the forearm prosthesis.
According to the technical scheme of this embodiment, a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis is acquired, and pose analysis is performed on the target object according to the target object image; an electromyographic signal associated with the target forearm prosthesis is acquired and identified, and the motion control process of the target forearm prosthesis is started or stopped according to the motion intention corresponding to the electromyographic signal; during the motion control process, control parameters for controlling the target forearm prosthesis are determined according to the real-time pose analysis result of the target object, and the target forearm prosthesis is controlled according to the control parameters. This scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.
Fig. 2 is a flowchart of another forearm prosthesis control method according to an embodiment of the invention. This embodiment belongs to the same inventive concept as the forearm prosthesis control method of the embodiment described above and, on that basis, further describes how the electromyographic signal associated with the target forearm prosthesis is acquired and identified and how the motion control process of the target forearm prosthesis is started or stopped according to the motion intention corresponding to the electromyographic signal. The method can be performed by a forearm prosthesis control device, which can be implemented in software and/or hardware and integrated in an electronic apparatus with application development functions.
As shown in fig. 2, the forearm prosthesis control method includes the steps of:
S210, acquiring a target object image acquired by an image acquisition functional module arranged on the palm of the target forearm prosthesis, and analyzing the pose of the target object according to the target object image.
S220, acquiring an electromyographic signal and extracting preset signal characteristics in the electromyographic signal.
First, a needle electrode or microneedle electrode is attached to the muscle surface of the user's forearm stump (it may also be attached to the surface of the contralateral healthy limb or another part of the body) to acquire electromyographic signals. Each channel of the electromyographic signal is then segmented in real time with window functions, and the signal features under each window are extracted.
S230, matching the preset signal characteristics with first reference signal characteristics corresponding to the first movement intention and second reference signal characteristics corresponding to the second movement intention respectively.
It can be understood that different movement intentions correspond to different signal features; classification is performed on the signal features of the electromyographic signals, and different feature classes correspond to different movement intentions. For example, the electromyographic signals may be divided into two classes based on a threshold, the two classes corresponding respectively to starting and stopping control of the target forearm prosthesis. For example, the first movement intention is fist clenching, and the first reference signal feature is a feature of the electromyographic signal corresponding to the first movement intention that exceeds the corresponding threshold; the second movement intention is hand opening, and the second reference signal feature is a feature of the electromyographic signal corresponding to the second movement intention that does not exceed the corresponding threshold.
And S240, when the preset signal characteristics are matched with the first reference signal characteristics, starting the motion control process of the target forearm prosthesis, and when the preset signal characteristics are matched with the second reference signal characteristics, stopping the motion control process of the target forearm prosthesis.
It can be understood that when the signal feature extracted from the electromyographic signal matches the first reference signal feature, indicating that the movement intention corresponding to the electromyographic signal is fist clenching, the motion control process of the target forearm prosthesis is started; when the extracted signal feature matches the second reference signal feature, indicating that the movement intention is hand opening, the motion control process of the target forearm prosthesis is stopped.
Further, the determining process of the first reference signal characteristic and the second reference signal characteristic includes:
First, the electromyographic signal recorded when the movement intention is fist clenching is acquired as the first movement intention electromyographic signal, and the preset signal feature is extracted from it to obtain the first reference signal feature.
Optionally, a needle electrode or microneedle electrode is attached to the muscle surface of the user's forearm stump, and the user is asked to perform the fist clenching action with the stump while the electromyographic signal is recorded as the first movement intention electromyographic signal, i.e. the electromyographic signal corresponding to starting control. A feature is then extracted from this signal, for example the mean amplitude of the electromyographic signal, and used as the first reference signal feature.
Then, the electromyographic signal recorded when the movement intention is hand opening is acquired as the second movement intention electromyographic signal, and the preset signal feature is extracted from it to obtain the second reference signal feature.
Optionally, a needle electrode or microneedle electrode is attached to the muscle surface of the user's forearm stump, and the user is asked to perform the hand opening action with the stump while the electromyographic signal is recorded as the second movement intention electromyographic signal, i.e. the electromyographic signal corresponding to stopping control. A feature is then extracted from this signal, for example the mean amplitude of the electromyographic signal, and used as the second reference signal feature.
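A short calibration sketch of this step is shown below: the mean amplitude of a recorded fist-clenching segment and of a recorded hand-opening segment give the two reference features, and a decision threshold is placed between them. The midpoint rule is an assumption; the patent only requires that the two reference features be distinguishable.

```python
import numpy as np

def reference_features(fist_emg, open_emg):
    """fist_emg, open_emg: preprocessed EMG segments recorded during calibration."""
    first_ref = np.mean(np.abs(fist_emg))       # first reference feature (fist clenching)
    second_ref = np.mean(np.abs(open_emg))      # second reference feature (hand opening)
    threshold = 0.5 * (first_ref + second_ref)  # assumed midpoint decision threshold
    return first_ref, second_ref, threshold
```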
S250, in the motion control process, determining control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters.
According to the technical scheme of this embodiment, a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis is acquired, and pose analysis is performed on the target object according to the target object image; an electromyographic signal is acquired and its preset signal feature is extracted; the preset signal feature is matched against the first reference signal feature corresponding to the first movement intention and the second reference signal feature corresponding to the second movement intention; the motion control process of the target forearm prosthesis is started when the preset signal feature matches the first reference signal feature, and stopped when it matches the second reference signal feature. During the motion control process, control parameters for controlling the target forearm prosthesis are determined according to the real-time pose analysis result of the target object, and the target forearm prosthesis is controlled according to the control parameters. By classifying the electromyographic signals and recognizing the movement intention, this scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, further improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.
Fig. 3 is a flowchart of yet another forearm prosthesis control method according to an embodiment of the invention. This embodiment belongs to the same inventive concept as the forearm prosthesis control methods of the embodiments described above and, on that basis, further describes how control parameters for controlling the target forearm prosthesis are determined according to the real-time pose analysis result of the target object during motion control and how the target forearm prosthesis is controlled according to those parameters. The method can be performed by a forearm prosthesis control device, which can be implemented in software and/or hardware and integrated in an electronic apparatus with application development functions.
As shown in fig. 3, the forearm prosthesis control method includes the steps of:
s310, acquiring a target object image acquired by an image acquisition functional module arranged on the palm of the target forearm prosthesis, and analyzing the pose of the target object according to the target object image.
S320, acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal.
S330, when the motion control process is started, the target forearm prosthesis is controlled to move to a preset initial pose.
When the motion control process is started, the forearm prosthesis is controlled so that its end-effector prosthetic hand automatically moves to an initial pose, for example an open-hand pose, ready to grasp the target object.
And S340, taking the position information in the real-time pose analysis result as the target forearm prosthesis movement target position, and controlling the target forearm prosthesis to move.
Specifically, position information in a real-time pose analysis result is used as a target position of a target forearm artificial limb, the forearm artificial limb is controlled to enable the direction of the forearm artificial limb to face the direction of a target object, a user moves the forearm artificial limb to approach the target object, and the prosthetic hand is controlled to finish grabbing the target object.
Further, in the motion control process, determining a control parameter for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameter, and further comprising:
First, the inclination angle in the real-time pose analysis result is taken as the target angle to which the palm posture of the target forearm prosthesis is to be adjusted.
Then, the posture angle of the palm of the target forearm prosthesis is adjusted according to the target angle.
Specifically, the inclination angle of the target object relative to the target forearm prosthesis in the real-time pose analysis result is taken as the target angle for adjusting the palm posture of the target forearm prosthesis, and the posture angle of the palm of the user's forearm prosthesis is adjusted until it reaches this target angle.
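The sketch below shows one way these two mappings could be expressed as control parameters: the position from the pose result becomes the reaching target, and the inclination angle becomes the wrist/palm attitude set-point. The proportional wrist command and the dictionary structure are illustrative assumptions; the patent only states which quantity maps to which parameter.

```python
import numpy as np

def control_parameters(target_position, tilt_angle, current_wrist_angle, gain=1.0):
    """target_position: target position relative to the palm camera (metres).
    tilt_angle, current_wrist_angle: radians."""
    wrist_error = tilt_angle - current_wrist_angle
    return {
        "target_position": np.asarray(target_position),                   # reaching target
        "wrist_angle_command": current_wrist_angle + gain * wrist_error,  # attitude set-point
    }
```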
In a specific embodiment, FIG. 4 is a block diagram of a forearm prosthesis control system, as shown in FIG. 4, with intent recognition module 310, object recognition module 320, forearm prosthesis 330, and forearm prosthesis control module 340.
The intent recognition module 310 includes a myoelectricity sensing sub-module 311, a myoelectricity acquisition sub-module 312, and a myoelectricity signal processing sub-module 313.
The target object recognition module 320 includes a camera 321 and a host computer 322.
The forearm prosthesis 330 includes a multi-fingered prosthetic hand and a prosthetic wrist joint, where both the multi-fingered prosthetic hand and the prosthetic wrist joint include force and position sensors.
The forearm prosthesis control module 340 includes a communication sub-module 341, a control sub-module 342, a sensing information acquisition sub-module 343, and a driving sub-module 344.
The myoelectricity sensing sub-module 311 is used for being attached to the muscle surface of a user.
The myoelectricity acquisition sub-module 312 is used for acquiring myoelectricity signals of a user in real time.
The electromyographic signal processing sub-module 313 performs processing such as filtering and feature extraction on the obtained electromyographic signal.
The camera 321 is mounted on the palm of the multi-fingered prosthetic hand of the forearm prosthesis 330 and acquires target object images in real time; the acquired images are transmitted to the host computer 322, which receives them, identifies the target object, performs pose analysis on it, and sends the real-time pose analysis result to the control sub-module 342 in the forearm prosthesis control module 340.
The communication sub-module 341 is configured to receive the motion intention of the user and the real-time pose of the target object relative to the forearm prosthesis sent by the intention recognition module 310 and the target object recognition module 320.
The control sub-module 342 is configured to: determine, according to the real-time pose analysis result sent by the communication sub-module 341, whether to control the forearm prosthesis 330 to manipulate the target object; judge, according to the position information received by the sensing information acquisition sub-module 343, whether the manipulation has been completed; and, if it has not, have the driving sub-module 344 drive the forearm prosthesis 330 to manipulate the target object.
FIG. 5 is a schematic view of a forearm prosthesis, as shown in FIG. 5, including a camera 66 and myoelectric electrodes 99, the camera 66 being used to acquire images of a target object in real time; the myoelectricity electrode 99 is used to attach to the muscle surface of the user, and acquire the myoelectricity signal of the user in real time.
FIG. 6 is a flow chart of a method of controlling a forearm prosthesis, one particular method of controlling a forearm prosthesis as shown in FIG. 6 including the steps of:
s410, acquiring a target object image and preprocessing: images of the target object are acquired in real time by the camera 66 placed at the palm of the forearm prosthesis 330, with interference and noise removed by image enhancement and filtering methods.
S420, target object identification: edge detection is performed on the target object with a Canny operator to detect the contour of the target object, and the corresponding two-dimensional image within the contour is converted into a binary image.
S430, calculating the centroid and inclination angle of the target object: the centroid of the target object is calculated from the first-order moment of the binary image, and the inclination angle from the second-order moment.
S440, calculating the pose of the target object relative to the prosthesis: coordinate transformation is performed according to the centroid pixel coordinates obtained in step S430 and the parameters of the camera 66, giving the pose analysis result of the target object relative to the camera 66 and the forearm prosthesis 330.
S450, identifying the movement intention: needle electrodes or microneedle electrodes are used to acquire the user's surface electromyographic signals, from which features are extracted and the movement intention is identified as the control signal for operating the forearm prosthesis 330. First, surface myoelectric electrodes are attached to the user's forearm stump, the user contracts the muscles while the electromyographic signal is recorded, and the signal is filtered to reduce power-frequency interference and noise. Then, each channel of the electromyographic signal is segmented in real time with window functions, and the features under each window are extracted. Finally, the electromyographic signals are divided into two classes according to their features, corresponding respectively to the start and stop signals of the forearm prosthesis control system.
S460, autonomous control and grasping by the forearm prosthesis: control parameters for controlling the forearm prosthesis 330 are determined according to the real-time pose analysis result of the target object, the forearm prosthesis 330 is controlled according to these parameters, and the control flow is started or stopped by the signals from step S450. When autonomous control of the forearm prosthesis 330 starts, the prosthetic hand automatically moves to the initial pose in preparation for grasping and the forearm prosthesis 330 is oriented toward the target object; the user moves the forearm prosthesis 330 toward the target object while its pose is adjusted in real time, and the prosthetic hand is controlled to complete the manipulation of the target object. When autonomous control stops, the forearm prosthesis stops moving.
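Putting S410 to S460 together, the sketch below shows the overall loop, reusing the helper functions from the earlier sketches: the electromyographic decision gates autonomous control, and while control is active the target pose is re-estimated from every new palm-camera frame and converted into prosthesis commands. The objects camera, emg_source and prosthesis and the helper estimate_pose are illustrative placeholders for the hardware interfaces and the S410 to S440 vision pipeline; none of them is an API defined by the patent.

```python
import numpy as np

def control_loop(camera, emg_source, prosthesis, estimate_pose, threshold):
    active = False                                   # whether autonomous control is running
    while True:
        # S450: movement-intention recognition from the latest EMG window
        emg_window = preprocess_emg(emg_source.read_window())
        intent = classify_intent(np.mean(np.abs(emg_window)), threshold)
        if intent == "start" and not active:
            active = True
            prosthesis.move_to_initial_pose()        # open hand, prepare to grasp
        elif intent == "stop" and active:
            active = False
            prosthesis.stop()
        if not active:
            continue
        # S410-S440: image acquisition and pose analysis of the target object
        pose = estimate_pose(camera.grab_frame())    # -> (position, tilt_angle) or None
        if pose is None:
            continue
        position, tilt = pose
        # S460: derive and send the control parameters
        prosthesis.send_command(control_parameters(position, tilt, prosthesis.wrist_angle()))
```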
According to the technical scheme of this embodiment, a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis is acquired, and pose analysis is performed on the target object according to the target object image; an electromyographic signal associated with the target forearm prosthesis is acquired and identified, and the motion control process of the target forearm prosthesis is started or stopped according to the motion intention corresponding to the electromyographic signal; when the motion control process is started, the target forearm prosthesis is controlled to move to the preset initial pose, and the position information in the real-time pose analysis result is taken as the target position to which the target forearm prosthesis moves. By controlling the target forearm prosthesis on the basis of the real-time pose analysis result, this scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, further improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.
Fig. 7 is a block diagram of a control device for a forearm prosthesis according to an embodiment of the invention, where the embodiment is applicable to a scenario of control of a forearm prosthesis, and is more applicable to a scenario of control of a forearm prosthesis based on electromyographic signals and visual signals. The apparatus may be implemented in hardware and/or software, and integrated into a computer device having application development functionality.
As shown in fig. 7, the forearm prosthesis control device includes: an image acquisition module 710, a signal identification module 720, and a parameter control module 730.
The image acquisition module 710 is configured to acquire a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis and to perform pose analysis on the target object according to the target object image; the signal recognition module 720 is configured to acquire and identify an electromyographic signal associated with the target forearm prosthesis and to start or stop the motion control process of the target forearm prosthesis according to the motion intention corresponding to the electromyographic signal; and the parameter control module 730 is configured to determine, during motion control, control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object and to control the target forearm prosthesis according to the control parameters.
According to the technical scheme of this embodiment, a target object image captured by the image acquisition functional module arranged on the palm of the target forearm prosthesis is acquired, and pose analysis is performed on the target object according to the target object image; an electromyographic signal associated with the target forearm prosthesis is acquired and identified, and the motion control process of the target forearm prosthesis is started or stopped according to the motion intention corresponding to the electromyographic signal; during the motion control process, control parameters for controlling the target forearm prosthesis are determined according to the real-time pose analysis result of the target object, and the target forearm prosthesis is controlled according to the control parameters. This scheme solves the problem of poor control stability that arises when electromyographic signals alone are used to control the prosthesis in real time, further improves the stability of prosthesis control, reduces the user's cognitive burden and muscle fatigue when using the prosthesis, and completes the manipulation of the target object efficiently and accurately.
Optionally, the image acquisition module 710 is further configured to:
identifying a target object in the target object image, and determining contour position information of the target object in the target object image;
determining the centroid coordinates and the inclination angle of the target object according to the contour position information;
And determining the position information of the target object relative to the target forearm prosthesis according to the centroid coordinates and the parameter information of the image acquisition functional module, and taking the position information and the inclination angle as pose analysis results.
Optionally, the image acquisition module 710 is further configured to:
setting, according to the contour position information, the pixel values of pixels within the contour region of the target object to 1 and the pixel values of pixels outside the contour region to 0;
performing a first-order moment calculation according to the pixel value and coordinate value of each pixel in the contour region to determine the centroid coordinates;
and performing a second-order moment calculation according to the centroid coordinates and the pixel values and coordinate values of the pixels in the contour region to determine the inclination angle.
Optionally, the signal identifying module 720 is further configured to:
acquiring an electromyographic signal and extracting preset signal characteristics in the electromyographic signal;
matching the preset signal characteristics with first reference signal characteristics corresponding to the first movement intention and second reference signal characteristics corresponding to the second movement intention respectively;
And when the preset signal characteristic is matched with the first reference signal characteristic, starting the motion control process of the target forearm prosthesis, and when the preset signal characteristic is matched with the second reference signal characteristic, stopping the motion control process of the target forearm prosthesis.
Optionally, the signal identifying module 720 is further configured to:
Acquiring an electromyographic signal when the exercise intention is making a fist as a first exercise intention electromyographic signal, and extracting preset signal characteristics from the first exercise intention electromyographic signal to obtain first reference signal characteristics;
and acquiring an electromyographic signal of which the movement intention is that the hand is opened as a second movement intention electromyographic signal, and extracting preset signal characteristics from the second movement intention electromyographic signal to obtain second reference signal characteristics.
Optionally, the parameter control module 730 is further configured to:
when the motion control process is started, controlling the target forearm prosthesis to move to a preset initial pose;
And taking the position information in the real-time pose analysis result as the target position of the target forearm prosthesis, and controlling the target forearm prosthesis to move.
Optionally, the parameter control module 730 is further configured to:
taking the inclination angle in the real-time pose analysis result as the target angle to which the palm posture of the target forearm prosthesis is to be adjusted;
and adjusting the posture angle of the palm of the target forearm artificial limb according to the target angle.
The forearm artificial limb control device provided by the embodiment of the invention can execute the forearm artificial limb control method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Fig. 8 is a block diagram of an electronic device according to an embodiment of the present invention. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, or other appropriate computers. The electronic device may also represent various forms of mobile apparatus, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), or other similar computing devices. The components shown herein, their connection relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 8, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard or a mouse; an output unit 17 such as various types of displays or speakers, etc.; a storage unit 18 such as a magnetic disk or an optical disk; and a communication unit 19 such as a network card, modem or wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), any suitable processor, controller or microcontroller, and the like. The processor 11 performs the various methods and processes described above, such as the forearm prosthesis control method.
In some embodiments, the forearm prosthesis control method may be implemented as a computer program tangibly embodied on a computer readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. One or more of the steps of the forearm prosthesis control method described above may be performed when the computer program is loaded into RAM 13 and executed by processor 11. Alternatively, in other embodiments, the processor 11 may be configured to perform the forearm prosthesis control method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above can be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine or partly on the machine, partly on the machine and partly on a remote machine or entirely on the remote machine or server as a stand-alone software package.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (Cathode Ray Tube) or LCD (Liquid Crystal Display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
It should be appreciated that steps may be reordered, added, or deleted in the various forms of the flows shown above. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; the present invention is not limited in this respect.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of controlling a forearm prosthesis, comprising:
acquiring a target object image captured by an image acquisition functional module arranged on the palm of a target forearm prosthesis, and performing pose analysis on a target object according to the target object image;
acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal;
and in the motion control process, determining control parameters for controlling the target forearm prosthesis according to a real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters.
2. The method of claim 1, wherein the performing pose analysis on the target object according to the target object image comprises:
identifying the target object in the target object image, and determining contour position information of the target object in the target object image;
determining centroid coordinates and an inclination angle of the target object according to the contour position information;
and determining position information of the target object relative to the target forearm prosthesis according to the centroid coordinates and parameter information of the image acquisition functional module, and taking the position information and the inclination angle as a pose analysis result.
3. The method of claim 2, wherein the determining centroid coordinates and an inclination angle of the target object according to the contour position information comprises:
setting the pixel value of each pixel point inside the contour area of the target object to 1 according to the contour position information, and setting the pixel value of each pixel point outside the contour area to 0;
performing a first-order moment calculation according to the pixel value and the coordinate value of each pixel point in the contour area to determine the centroid coordinates;
and performing a second-order moment calculation according to the centroid coordinates and the pixel values and coordinate values of the pixel points in the contour area to determine the inclination angle.
4. The method of claim 1, wherein the acquiring and identifying an electromyographic signal associated with the target forearm prosthesis, and starting or stopping a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal, comprises:
acquiring the electromyographic signal and extracting a preset signal characteristic from the electromyographic signal;
matching the preset signal characteristic with a first reference signal characteristic corresponding to a first movement intention and with a second reference signal characteristic corresponding to a second movement intention, respectively;
and when the preset signal characteristic matches the first reference signal characteristic, starting the motion control process of the target forearm prosthesis, and when the preset signal characteristic matches the second reference signal characteristic, stopping the motion control process of the target forearm prosthesis.
5. The method of claim 4, wherein the determining of the first reference signal characteristic and the second reference signal characteristic comprises:
acquiring an electromyographic signal recorded when the movement intention is making a fist as a first movement intention electromyographic signal, and extracting the preset signal characteristic from the first movement intention electromyographic signal to obtain the first reference signal characteristic;
and acquiring an electromyographic signal recorded when the movement intention is opening the hand as a second movement intention electromyographic signal, and extracting the preset signal characteristic from the second movement intention electromyographic signal to obtain the second reference signal characteristic.
6. The method of claim 1, wherein the determining, in the motion control process, control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters, comprises:
when the motion control process is started, controlling the target forearm prosthesis to move to a preset initial pose;
and taking the position information in the real-time pose analysis result as a target position for movement of the target forearm prosthesis, and controlling the target forearm prosthesis to move.
7. The method of claim 6, wherein the determining, in the motion control process, control parameters for controlling the target forearm prosthesis according to the real-time pose analysis result of the target object, and controlling the target forearm prosthesis according to the control parameters, further comprises:
taking the inclination angle in the real-time pose analysis result as a target angle for adjusting the palm posture of the target forearm prosthesis;
and adjusting the posture angle of the palm of the target forearm prosthesis according to the target angle.
8. A forearm prosthesis control device comprising:
an image acquisition module, configured to acquire a target object image captured by an image acquisition functional module arranged on the palm of a target forearm prosthesis, and to perform pose analysis on a target object according to the target object image;
a signal identification module, configured to acquire and identify an electromyographic signal associated with the target forearm prosthesis, and to start or stop a motion control process of the target forearm prosthesis according to a motion intention corresponding to the electromyographic signal;
and a parameter control module, configured to determine, in the motion control process, control parameters for controlling the target forearm prosthesis according to a real-time pose analysis result of the target object, and to control the target forearm prosthesis according to the control parameters.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the forearm prosthesis control method according to any of claims 1-7.
10. A computer readable storage medium, characterized in that it stores computer instructions for causing a processor to execute the forearm prosthesis control method according to any of claims 1-7.
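
Claims 2 and 3 compute the target object's centroid from first-order image moments and its inclination angle from second-order moments of the contour region. The following Python sketch illustrates that calculation under the assumption that a binary mask (pixel value 1 inside the contour, 0 outside, as in claim 3) has already been produced; the function name and the use of NumPy are illustrative choices, not part of the patent.

import numpy as np

def pose_from_mask(mask):
    """Return (cx, cy, tilt_rad) for a binary mask with 1 inside the contour and 0 outside."""
    ys, xs = np.nonzero(mask)            # coordinates of the pixels inside the contour
    if xs.size == 0:
        raise ValueError("empty contour mask")
    # First-order moments give the centroid (claim 3, second step).
    cx, cy = xs.mean(), ys.mean()
    # Second-order central moments give the orientation (claim 3, third step).
    mu20 = np.mean((xs - cx) ** 2)
    mu02 = np.mean((ys - cy) ** 2)
    mu11 = np.mean((xs - cx) * (ys - cy))
    tilt = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, tilt

The pixel-space centroid would then be converted into a position relative to the prosthesis using the parameter information of the image acquisition functional module (e.g., focal length and mounting offset), as claim 2 describes; that conversion depends on the specific camera model and is not shown here.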
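
Claims 4 and 5 start or stop the motion control process by matching a preset signal characteristic extracted from the electromyographic signal against reference characteristics recorded for a fist-making intention (start) and a hand-opening intention (stop). The sketch below uses a common set of time-domain EMG features and a nearest-reference rule purely as an illustrative stand-in; the patent does not specify which features or matching rule are used, so the function names and the threshold are assumptions.

import numpy as np

def emg_features(window):
    """Per-channel mean absolute value, waveform length and zero-crossing count.
    window has shape (samples, channels)."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(window[:-1] * window[1:] < 0, axis=0)
    return np.concatenate([mav, wl, zc])

def classify_intent(window, fist_ref, open_ref, threshold):
    """Return 'start', 'stop' or None by distance to the calibration references."""
    feats = emg_features(window)
    d_fist = np.linalg.norm(feats - fist_ref)
    d_open = np.linalg.norm(feats - open_ref)
    if min(d_fist, d_open) > threshold:
        return None                      # no confident match, keep the current state
    return "start" if d_fist < d_open else "stop"

As claim 5 describes, fist_ref and open_ref would be obtained by applying the same feature extraction to electromyographic signals recorded while the user makes a fist and opens the hand, respectively.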
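
Claims 1, 6 and 7 together define a closed loop: the prosthesis remains idle until a start intention is recognized, moves to a preset initial pose, and then repeatedly converts the real-time pose analysis result into a movement target position (claim 6) and a palm posture angle (claim 7) until a stop intention arrives. The loop below is a minimal sketch of that behaviour; the camera, EMG and prosthesis interfaces are hypothetical placeholders rather than APIs defined in the patent.

import time

def control_loop(camera, emg_source, prosthesis, period=0.05):
    """Hypothetical closed-loop controller combining the steps of claims 1, 6 and 7."""
    active = False
    while True:
        intent = emg_source.read_intent()          # 'start', 'stop' or None
        if intent == "start" and not active:
            prosthesis.move_to_initial_pose()      # preset initial pose (claim 6)
            active = True
        elif intent == "stop" and active:
            active = False                         # motion control process ends
        if active:
            image = camera.capture()
            position, tilt = camera.analyze_pose(image)   # real-time pose analysis result
            prosthesis.set_target_position(position)      # claim 6: movement target position
            prosthesis.set_palm_angle(tilt)               # claim 7: palm posture angle
        time.sleep(period)
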
CN202211485274.5A 2022-11-24 2022-11-24 Forearm artificial limb control method, device, equipment and medium Pending CN118058882A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211485274.5A CN118058882A (en) 2022-11-24 2022-11-24 Forearm artificial limb control method, device, equipment and medium
PCT/CN2023/133472 WO2024109858A1 (en) 2022-11-24 2023-11-22 Forearm prosthesis control method and apparatus, device, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211485274.5A CN118058882A (en) 2022-11-24 2022-11-24 Forearm artificial limb control method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN118058882A true CN118058882A (en) 2024-05-24

Family

ID=91104637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211485274.5A Pending CN118058882A (en) 2022-11-24 2022-11-24 Forearm artificial limb control method, device, equipment and medium

Country Status (2)

Country Link
CN (1) CN118058882A (en)
WO (1) WO2024109858A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1065123C (en) * 1994-12-16 2001-05-02 清华大学 Controlling device for grip of artifical hand
JPH09304019A (en) * 1996-05-15 1997-11-28 Ishikawajima Harima Heavy Ind Co Ltd Position and attitude detecting method of object
CN100546553C (en) * 2007-05-18 2009-10-07 天津大学 Adopt the prosthetic hand and the control method thereof of myoelectricity and brain electricity Collaborative Control
US9089966B2 (en) * 2010-11-17 2015-07-28 Mitsubishi Electric Corporation Workpiece pick-up apparatus
CN103519924B (en) * 2013-10-22 2015-12-02 深圳先进技术研究院 Intelligent artificial hand system
CN108453735B (en) * 2018-03-15 2021-03-02 河南大学 Grabbing method and device based on bionic manipulator of friction nano generator

Also Published As

Publication number Publication date
WO2024109858A1 (en) 2024-05-30

Similar Documents

Publication Publication Date Title
KR100847136B1 (en) Method and Apparatus for Shoulder-line detection and Gesture spotting detection
US9460339B2 (en) Combined color image and depth processing
CN107885327B (en) Fingertip detection method based on Kinect depth information
CN108595008B (en) Human-computer interaction method based on eye movement control
WO2019242330A1 (en) Monitoring method, recognition method, related apparatus, and system
CN106981077B (en) Infrared image and visible light image registration method based on DCE and LSS
US7734062B2 (en) Action recognition apparatus and apparatus for recognizing attitude of object
EP1856470A2 (en) Detecting and tracking objects in images
CN101635031B (en) Method for extracting and identifying small sample character contour feature
WO2022252642A1 (en) Behavior posture detection method and apparatus based on video image, and device and medium
CN111428731A (en) Multi-class target identification and positioning method, device and equipment based on machine vision
CN111367415B (en) Equipment control method and device, computer equipment and medium
EP3249576A1 (en) Biometric information processing device, biometric information processing method and biometric information processing program
KR20120006819A (en) Gaze detection method and system adopting the same
CN114463244A (en) Vision robot grabbing system and control method thereof
CN113034526B (en) Grabbing method, grabbing device and robot
CN113544738B (en) Portable acquisition device for anthropometric data and method for collecting anthropometric data
CN118058882A (en) Forearm artificial limb control method, device, equipment and medium
CN115951783A (en) Computer man-machine interaction method based on gesture recognition
Shashidhara et al. A novel approach to circular edge detection for iris image segmentation
Aydi et al. A fast and accurate eyelids and eyelashes detection approach for iris segmentation
Yang et al. Target position and posture recognition based on RGB-D images for autonomous grasping robot arm manipulation
Jamaludin et al. Adaptive initial contour and partly-normalization algorithm for iris segmentation of blurry iris images
CN111914585A (en) Iris identification method and system
Mozumder et al. Iris segmentation using adaptive histogram equalization and median filtering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination