CN108392207A - Action recognition method based on posture labels - Google Patents

Action recognition method based on posture labels

Info

Publication number
CN108392207A
CN108392207A (application CN201810133363.0A)
Authority
CN
China
Prior art keywords
label
posture
key node
posture label
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810133363.0A
Other languages
Chinese (zh)
Other versions
CN108392207B (en)
Inventor
徐嘉晨
张晓云
刘小通
周建益
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwest University
Original Assignee
Northwest University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwest University
Priority to CN201810133363.0A
Publication of CN108392207A
Application granted
Publication of CN108392207B
Expired - Fee Related


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 — Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 — Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1126 — Measuring movement using a particular sensing technique
    • A61B5/1128 — Measuring movement using image analysis
    • A61B5/72 — Signal processing specially adapted for physiological signals or for diagnostic purposes
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/22 — Matching criteria, e.g. proximity measures

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Physiology (AREA)
  • Veterinary Medicine (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an action recognition method based on posture labels. The method abstracts action recognition as posture recognition: each posture is abstracted into a posture label based on the relative positions of key skeleton nodes, and the action a person performs is identified by comparing how the person's posture changes over a period of time. The method reduces the difficulty of building a template library, greatly lowers the computation and time required for action recognition, and improves the generality of action recognition across different individuals. It has important application value in human-computer interaction, virtual reality, video surveillance, and sports training assistance.

Description

Action recognition method based on posture labels
Technical field
The invention belongs to the technical field of action recognition and relates to an action recognition method based on posture labels.
Background technology
Action recognition has been a research hotspot in recent years. Results produced in the action recognition field have been applied to areas such as public security, research on human daily behaviour, human-computer interaction and virtual reality, and have produced considerable benefits. Traditional action recognition analyses images (including video and series of photographs) directly with image-processing techniques, going through steps such as image segmentation, feature extraction, motion-feature extraction and motion-feature classification to finally achieve action recognition. Although existing action recognition methods have made great progress, problems remain: the computational load is huge; a motion-feature library is difficult to build and requires material entered by professionals; and accuracy drops considerably when recognising people whose body shape and height differ from those in the recorded material.
Summary of the invention
In view of the problems in the prior art, the object of the present invention is to provide an action recognition method based on posture labels, which solves the problems of heavy computation, difficult template-library construction and poor template-library generality in existing action recognition technology.
To achieve the above object, the present invention adopts the following technical solution:
A method for decomposing an action into posture labels, comprising the following steps:
Step 1: use a skeleton tracking device to obtain the position data of the key nodes of the human body at each moment; the position data of the key nodes are expressed in the coordinate system of the skeleton tracking device. The key nodes include at least: HEAD, SHOULDER CENTER, SPINE, HIP CENTER, SHOULDER RIGHT, SHOULDER LEFT, ELBOW RIGHT, ELBOW LEFT, WRIST RIGHT, WRIST LEFT, HAND RIGHT, HAND LEFT, HIP RIGHT, HIP LEFT, KNEE RIGHT, KNEE LEFT, ANKLE RIGHT, ANKLE LEFT, FOOT RIGHT and FOOT LEFT.
Step 2: convert the position data of the key nodes at each moment obtained in step 1 into position data in a morphological coordinate system. The morphological coordinate system takes the facing direction of the human body as the positive Z axis, the morphological upward direction of the human body as the positive Y axis, the person's left as the positive X axis, and the key node HIP CENTER as the origin.
Step 3: from the position data of the key nodes in the morphological coordinate system at each moment obtained in step 2, compute the posture label at each moment. The posture label comprises a main-body posture label GL_body, a left-forelimb posture label GL_lf, a right-forelimb posture label GL_rf, a left-hindlimb posture label GL_lb and a right-hindlimb posture label GL_rb.
Optionally, the main-body posture label GL_body in step 3 is obtained as follows:
Take the coordinate of X_F, Y_F and Z_F with the largest absolute value and look up the GL_body value of the interval to which that coordinate belongs; this is the value of the main-body posture label GL_body. The formula used is as follows:
where X_F, Y_F and Z_F are the coordinates of the unit vector F on the three axes; the unit vector F is obtained from the vector formed by the key nodes HEAD and HIP CENTER in the skeleton tracking device coordinate system.
The left-forelimb posture label GL_lf, right-forelimb posture label GL_rf, left-hindlimb posture label GL_lb and right-hindlimb posture label GL_rb in step 3 are obtained as follows:
Each of these four posture labels involves three key nodes, denoted key node 1, key node 2 and key node 3. For the left-forelimb posture label GL_lf the three key nodes are ELBOW LEFT, WRIST LEFT and HAND LEFT; for the right-forelimb posture label GL_rf they are ELBOW RIGHT, WRIST RIGHT and HAND RIGHT; for the left-hindlimb posture label GL_lb they are KNEE LEFT, ANKLE LEFT and FOOT LEFT; for the right-hindlimb posture label GL_rb they are KNEE RIGHT, ANKLE RIGHT and FOOT RIGHT.
The positions of the three key nodes in the morphological coordinate system are denoted (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3). Each of the four posture labels comprises a height label G1, an orientation label G2 and a curl label G3.
The height label G1 is obtained as follows:
G1 = round((g1 + g2 + g3) / 3), where
n = 1, 2, 3, Y_H is the Y coordinate of key node HEAD in the morphological coordinate system, and Y_HC is the Y coordinate of key node SHOULDER CENTER in the morphological coordinate system.
The orientation label G2 is obtained as follows:
Count the signs of the X and Z coordinates of key node 1, key node 2 and key node 3, and compute the orientation label G2 using the following formula:
The curl label G3 is obtained as follows:
Based on key node 1, key node 2 and key node 3, a key node 4 is introduced, and the distances D1, D2 and D3 between key node 4 and key nodes 1, 2 and 3 respectively are computed. For the left-forelimb posture label GL_lf, key node 4 is SHOULDER LEFT; for the right-forelimb posture label GL_rf, key node 4 is SHOULDER RIGHT; for the left-hindlimb posture label GL_lb, key node 4 is HIP LEFT; for the right-hindlimb posture label GL_rb, key node 4 is HIP RIGHT.
The value of the curl label G3 is obtained with the following formula:
The present invention also provides a method for building an action template library, comprising the following steps:
Step 1: perform a standard action several times and decompose each performance of the standard action into posture labels at each moment. The posture label of the starting moment is taken as the start-frame posture label and the posture label of the ending moment as the stop-frame posture label. The first performance of the standard action serves as the comparison standard action; the other performances serve as reference standard actions. The start-frame posture label of the comparison standard action is used as the start-frame comparison posture label, and the stop-frame posture label of the first performance is used as the stop-frame comparison posture label.
The posture labels at each moment into which each performance of the standard action is decomposed are the posture labels obtained with the decomposition method described above.
Step 2: compute the start-frame similarity coefficient group as follows:
For each reference standard action, compute the similarity of each attribute between its start-frame posture label and the start-frame comparison posture label, Sl1(A)_n, with the following formula:
Sl1(A)_n = A_n × z1_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where A_n is the initialised similarity coefficient, A_n = 1; n is the index of the attribute, and for n = 1 to 13 the attributes are, in order: the main-body posture label GL_body; the height label G1, orientation label G2 and curl label G3 of the left-forelimb posture label GL_lf; the height label G1, orientation label G2 and curl label G3 of the left-hindlimb posture label GL_lb; the height label G1, orientation label G2 and curl label G3 of the right-forelimb posture label GL_rf; and the height label G1, orientation label G2 and curl label G3 of the right-hindlimb posture label GL_rb. z1_n is the absolute value of the difference between the corresponding attributes of the start-frame posture label of the reference standard action and the start-frame comparison posture label.
For each attribute n, take the second-largest value among the Sl1(A)_n computed for the start-frame posture labels of the several reference standard actions as the similarity coefficient A1_n for that attribute. The similarity coefficients A1_n of all attributes form the start-frame similarity coefficient group Astar = {A1_n, n ∈ Z, n = 1, 2, ..., 13}.
Step 3: compute the stop-frame similarity coefficient group as follows:
For each reference standard action, compute the similarity of each attribute between its stop-frame posture label and the stop-frame comparison posture label, Sl2(A)_n, with the following formula:
Sl2(A)_n = A_n × z2_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z2_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label of the reference standard action and the stop-frame comparison posture label.
For each attribute n, take the second-largest value among the Sl2(A)_n computed for the stop-frame posture labels of the several reference standard actions as the similarity coefficient A2_n for that attribute. The similarity coefficients A2_n of all attributes form the stop-frame similarity coefficient group Astop = {A2_n, n ∈ Z, n = 1, 2, ..., 13}.
Step 4: for several standard actions, obtain the start-frame similarity coefficient group and stop-frame similarity coefficient group of each standard action according to steps 1-3; the start-frame and stop-frame similarity coefficient groups of all the standard actions form the action template library.
The present invention also provides an action recognition method based on posture labels, comprising the following steps:
Step 1: decompose the action to be recognised into posture labels at each moment, using the decomposition method described above.
Step 2: select a standard action in the action template library, and compute the similarity Sl(B)_n of each attribute between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action; the stop-frame posture label is denoted the t-th-frame posture label. The formula used is as follows:
Sl(B)_n = A1_n × z3_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z3_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action.
Compute the overall similarity S(B) between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action with the following formula:
Step 3: if the overall similarity S(B) is greater than the set threshold MAXBLUR, return to step 2; otherwise, perform step 4.
Step 4: compute the similarity Sl(C)_n of each attribute between the posture label of the frame preceding the stop frame and the start-frame posture label of the selected standard action; the preceding frame's posture label is denoted the (t-1)-th-frame posture label. The formula used is as follows:
Sl(C)_n = A2_n × z4_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z4_n is the absolute value of the difference between the corresponding attributes of the preceding frame's posture label and the start-frame posture label of the selected standard action.
Compute the overall similarity S(C) between the preceding frame's posture label and the start-frame posture label of the selected standard action with the following formula:
Step 5: if the overall similarity S(C) is less than the set threshold MAXBLUR, the action to be recognised matches the selected standard action. If the overall similarity S(C) is greater than the set threshold MAXBLUR, return to step 4, replacing the processed object from the (t-1)-th-frame posture label with the (t-2)-th-frame posture label, and so on; if the first-frame posture label is reached and the overall similarity S(C) is still greater than the set threshold MAXBLUR, return to step 2.
Compared with the prior art, the present invention has the following technical effects: the method abstracts action recognition as posture recognition, abstracts each posture into a posture label based on the relative positions of key nodes, and identifies the action a person performs by comparing the change of the person's posture over a period of time. The method reduces the difficulty of building a template library, greatly lowers the computation and time required for action recognition, and improves the generality of action recognition across different individuals. It has important application value in human-computer interaction, virtual reality, video surveillance, and sports training assistance.
The solution of the present invention is explained and illustrated in further detail below with reference to the accompanying drawings and embodiments.
Description of the drawings
Fig. 1 is a schematic diagram of the skeleton tracking device coordinate system of the present invention.
Fig. 2 is a schematic diagram of the positions of the 20 skeleton key nodes obtained by the present invention.
Detailed description of the embodiments
The present invention provides a method for decomposing an action into posture labels, comprising the following steps:
Step 1: obtain the position data of the key nodes of the human body using a skeleton tracking device; the position data of the key nodes are expressed in the coordinate system of the skeleton tracking device. A Kinect may be used as the skeleton tracking device: the Kinect acquires data at a certain frequency, and the position data of the key nodes describe the positions of 20 specific skeleton nodes. The names and indices of the key nodes are as shown in the table:
The skeleton tracking device coordinate system takes the device camera as the origin, the direction the camera faces as the positive Z axis, the direction opposite to gravity as the positive Y axis, and the camera's left as the positive X axis; the unit of length is 1 metre. The skeleton tracking device coordinate system is a static coordinate system.
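As an illustration of the data produced by step 1, the sketch below models one captured frame as a mapping from the 20 key-node names listed above to 3-D positions in the device coordinate system. The node names follow the patent's list (with KNEE/ANKLE in standard spelling); the `Frame` container is a hypothetical structure for illustration, not a Kinect SDK type.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# The 20 skeleton key nodes named in step 1.
KEY_NODES = [
    "HEAD", "SHOULDER_CENTER", "SPINE", "HIP_CENTER",
    "SHOULDER_RIGHT", "SHOULDER_LEFT", "ELBOW_RIGHT", "ELBOW_LEFT",
    "WRIST_RIGHT", "WRIST_LEFT", "HAND_RIGHT", "HAND_LEFT",
    "HIP_RIGHT", "HIP_LEFT", "KNEE_RIGHT", "KNEE_LEFT",
    "ANKLE_RIGHT", "ANKLE_LEFT", "FOOT_RIGHT", "FOOT_LEFT",
]

Vec3 = Tuple[float, float, float]  # (X, Y, Z) in metres, device coordinate system

@dataclass
class Frame:
    """One sample delivered by the skeleton tracking device (hypothetical container)."""
    timestamp: float                # capture time in seconds
    positions: Dict[str, Vec3]      # key-node name -> position in device coordinates

    def __post_init__(self):
        missing = set(KEY_NODES) - set(self.positions)
        if missing:
            raise ValueError(f"frame is missing key nodes: {sorted(missing)}")
```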
Step 2: convert the position data of the key nodes at each moment obtained in step 1 into position data in the morphological coordinate system, using the following formula:
where (x, y, z) = (X − X_HC, Y − Y_HC, Z − Z_HC) is the coordinate, in the skeleton tracking device coordinate system obtained in step 1, of the vector from key node HIP CENTER to any key node NODE; (X, Y, Z) is the position data of key node NODE and (X_HC, Y_HC, Z_HC) is the position data of key node HIP CENTER; α, β and γ are the rotation angles of the axes of the morphological coordinate system relative to the skeleton tracking device coordinate system.
The position data of the key node in the morphological coordinate system are then (x', y', z').
The morphological coordinate system takes the facing direction of the human body as the positive Z axis, the morphological upward direction of the human body as the positive Y axis, the person's left as the positive X axis, and the key node HIP CENTER as the origin.
The morphological upward direction of the human body is defined by starting from the head and moving downward and outward along the body: a position reached earlier is the morphological upper end of a position reached later. For example, when a person stands at attention with the hands hanging naturally, the left shoulder, left elbow and left hand relate as follows: the left shoulder is the morphological upper end of the left elbow, and the left elbow is the morphological upper end of the left hand.
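The conversion formula itself is given as an equation that is not reproduced here, so the sketch below is only a plausible reading of step 2: the point is first translated to the HIP CENTER origin and then rotated about the X, Y and Z axes by α, β and γ in that order. The rotation order and sign conventions are assumptions of this sketch.

```python
import math

def rotate_x(p, a):
    x, y, z = p
    return (x, y * math.cos(a) - z * math.sin(a), y * math.sin(a) + z * math.cos(a))

def rotate_y(p, b):
    x, y, z = p
    return (x * math.cos(b) + z * math.sin(b), y, -x * math.sin(b) + z * math.cos(b))

def rotate_z(p, g):
    x, y, z = p
    return (x * math.cos(g) - y * math.sin(g), x * math.sin(g) + y * math.cos(g), z)

def to_morphological(node_xyz, hip_center_xyz, alpha, beta, gamma):
    """Translate a device-coordinate point to the HIP CENTER origin and rotate it
    into the morphological coordinate system (rotation order X -> Y -> Z assumed)."""
    x, y, z = (node_xyz[i] - hip_center_xyz[i] for i in range(3))
    return rotate_z(rotate_y(rotate_x((x, y, z), alpha), beta), gamma)
```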
Step 3: compute, for each moment, the main-body posture label GL_body, left-forelimb posture label GL_lf, right-forelimb posture label GL_rf, left-hindlimb posture label GL_lb and right-hindlimb posture label GL_rb.
Specifically, in one embodiment, the facing direction and the morphological upward direction of the human body in step 2 are determined as follows:
The position data obtained in step 1 are (X_SR, Y_SR, Z_SR) for key node SHOULDER RIGHT, (X_SL, Y_SL, Z_SL) for key node SHOULDER LEFT and (X_HC, Y_HC, Z_HC) for key node HIP CENTER. These three key nodes determine a plane, which is the plane of the human torso.
The normal vector of the torso plane is computed from these three key nodes.
The vector between key node HEAD and key node HIP CENTER in the skeleton tracking device coordinate system is also computed.
Since the person tracked by the Kinect always faces the device, the normal vector is multiplied by a check value; if the result is positive the normal vector keeps its sign, and if the result is negative its sign is reversed. The direction of the (corrected) normal vector is the facing direction of the human body, and the direction of the HIP CENTER-to-HEAD vector is the morphological upward direction of the human body.
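A minimal sketch of this direction test. It assumes the torso normal is the cross product of the HIP CENTER→SHOULDER LEFT and HIP CENTER→SHOULDER RIGHT vectors and that the sign check is a dot product with the device's optical axis (the person faces the camera, i.e. roughly the negative device Z direction); both assumptions go beyond what the text states explicitly.

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def torso_directions(positions):
    """Return (facing_direction, morphological_up) in device coordinates.

    positions: dict of device-coordinate key-node positions (see Frame above).
    """
    hc = positions["HIP_CENTER"]
    normal = cross(sub(positions["SHOULDER_LEFT"], hc),
                   sub(positions["SHOULDER_RIGHT"], hc))
    # The person always faces the Kinect, whose optical axis is +Z in device
    # coordinates, so the facing direction must have a negative Z component
    # (assumed sign test).
    if dot(normal, (0.0, 0.0, -1.0)) < 0:
        normal = (-normal[0], -normal[1], -normal[2])
    up = sub(positions["HEAD"], hc)   # HIP CENTER -> HEAD is the morphological up
    return normal, up
```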
Specifically, the main-body posture label GL_body is obtained as follows:
Compute the vector between key node HEAD and key node HIP CENTER in the skeleton tracking device coordinate system, and let F be the corresponding unit vector.
Take the coordinate of X_F, Y_F and Z_F with the largest absolute value and look up the GL_body value of the interval to which that coordinate belongs; this is the value of the main-body posture label GL_body. The formula used is as follows:
Because F is a unit vector, its length is 1; when one of X_F, Y_F and Z_F is 0 and the other two are equal in magnitude, those two values are √2/2, so the largest of the three coordinates in absolute value is always greater than √2/2.
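A sketch of the GL_body selection, reusing `sub` and `dot` from the previous sketch. The patent's interval-to-value table is not reproduced above, so the encoding below (one value per signed dominant axis) is purely illustrative, not the patent's actual mapping.

```python
import math

def gl_body(head, hip_center):
    """Main-body posture label from the HIP CENTER -> HEAD unit vector.

    The mapping from dominant signed axis to label value is an assumed
    illustrative encoding, not the patent's actual table.
    """
    f = sub(head, hip_center)
    norm = math.sqrt(dot(f, f))
    xf, yf, zf = (c / norm for c in f)
    # Pick the coordinate with the largest absolute value; because F is a unit
    # vector this maximum exceeds sqrt(2)/2 whenever one coordinate is 0 and
    # the other two are equal in magnitude.
    index, value = max(enumerate((xf, yf, zf)), key=lambda t: abs(t[1]))
    return 2 * index + (0 if value >= 0 else 1)   # illustrative values 0..5
```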
The left-forelimb posture label GL_lf, right-forelimb posture label GL_rf, left-hindlimb posture label GL_lb and right-hindlimb posture label GL_rb are obtained as follows:
Each of these four posture labels involves three key nodes, denoted key node 1, key node 2 and key node 3. For the left-forelimb posture label GL_lf the three key nodes are ELBOW LEFT, WRIST LEFT and HAND LEFT; for the right-forelimb posture label GL_rf they are ELBOW RIGHT, WRIST RIGHT and HAND RIGHT; for the left-hindlimb posture label GL_lb they are KNEE LEFT, ANKLE LEFT and FOOT LEFT; for the right-hindlimb posture label GL_rb they are KNEE RIGHT, ANKLE RIGHT and FOOT RIGHT.
The positions of the three key nodes in the morphological coordinate system are denoted (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3). Each of the four posture labels comprises a height label G1, an orientation label G2 and a curl label G3.
The height label G1 is obtained as follows:
G1 = round((g1 + g2 + g3) / 3); the smaller the value of G1, the closer the limb is to the morphological upper end. Here:
n = 1, 2, 3; Y_H is the Y coordinate of key node HEAD in the morphological coordinate system, Y_HC is the Y coordinate of key node SHOULDER CENTER in the morphological coordinate system, and Y_H > Y_HC.
The orientation label G2 is obtained as follows:
Count the signs of the X and Z coordinates of key node 1, key node 2 and key node 3, and compute the orientation label G2 with the following formula:
The curl label G3 is obtained as follows:
Based on key node 1, key node 2 and key node 3, a key node 4 is introduced, and the distances D1, D2 and D3 between key node 4 and key nodes 1, 2 and 3 respectively are computed. For the left-forelimb posture label GL_lf, key node 4 is SHOULDER LEFT; for the right-forelimb posture label GL_rf, key node 4 is SHOULDER RIGHT; for the left-hindlimb posture label GL_lb, key node 4 is HIP LEFT; for the right-hindlimb posture label GL_rb, key node 4 is HIP RIGHT.
The value of the curl label G3 is obtained with the following formula:
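The exact formulas for g_n, G2 and G3 are not reproduced above, so the sketch below (reusing `sub`, `dot` and `math` from earlier sketches) only illustrates the general shape of the three attributes: g_n bins how far below the head a key node lies in units of the HEAD-SHOULDER CENTER gap, G2 is built from the signs of the X and Z coordinates of the three nodes, and G3 bins the distances D1, D2, D3 from key node 4. The divisors used later (3 for G1 and G2, 5 for GL_body and G3) suggest that G1 and G2 take three values and G3 five, and the sketch follows that reading; every binning rule here is an assumption for illustration, not the patent's actual quantisation.

```python
def height_label(nodes_xyz, y_head, y_shoulder_center):
    """G1: rounded average of per-node height bins g_n (assumed binning, 0..2)."""
    unit = y_head - y_shoulder_center            # Y_H > Y_HC per the text
    g = [min(2, max(0, round((y_head - y) / unit))) for (_, y, _) in nodes_xyz]
    return round(sum(g) / 3)

def orientation_label(nodes_xyz):
    """G2: built from the signs of the X and Z coordinates (assumed encoding, 0..2)."""
    positives = sum(1 for (x, _, z) in nodes_xyz for c in (x, z) if c > 0)
    return min(2, positives // 2)

def curl_label(nodes_xyz, node4_xyz, limb_length):
    """G3: bins the distances D1, D2, D3 from key node 4 (assumed binning, 0..4)."""
    dists = [math.sqrt(dot(sub(p, node4_xyz), sub(p, node4_xyz))) for p in nodes_xyz]
    return min(4, max(0, round(4 * sum(dists) / (3 * limb_length))))
```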
Another aspect of the present invention provides a method for building an action template library, comprising the following steps:
Step 1: perform a standard action several times and, using the decomposition method described above, decompose each performance of the standard action into posture labels at each moment. The posture label of the starting moment is taken as the start-frame posture label and the posture label of the ending moment as the stop-frame posture label. The first performance of the standard action serves as the comparison standard action; the other performances serve as reference standard actions. The start-frame posture label of the comparison standard action is used as the start-frame comparison posture label, and the stop-frame posture label of the first performance is used as the stop-frame comparison posture label.
Step 2: compute the start-frame similarity coefficient group as follows:
For each reference standard action, compute the similarity of each attribute between its start-frame posture label and the start-frame comparison posture label, Sl1(A)_n, with the following formula:
Sl1(A)_n = A_n × z1_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)   (6)
where A_n is the initialised similarity coefficient, A_n = 1; n is the index of the attribute, and for n = 1 to 13 the attributes are, in order: the main-body posture label GL_body; the height label G1, orientation label G2 and curl label G3 of the left-forelimb posture label GL_lf; the height label G1, orientation label G2 and curl label G3 of the left-hindlimb posture label GL_lb; the height label G1, orientation label G2 and curl label G3 of the right-forelimb posture label GL_rf; and the height label G1, orientation label G2 and curl label G3 of the right-hindlimb posture label GL_rb. z1_n is the absolute value of the difference between the corresponding attributes of the start-frame posture label of the reference standard action and the start-frame comparison posture label; for example, z1_1 is the absolute value of the difference between the main-body posture label GL_body in the start-frame posture label of the reference standard action and the main-body posture label GL_body of the start-frame comparison posture label.
For each attribute n, take the second-largest value among the Sl1(A)_n computed for the start-frame posture labels of the several reference standard actions as the similarity coefficient A1_n for that attribute. The similarity coefficients A1_n of all attributes form the start-frame similarity coefficient group Astar = {A1_n, n ∈ Z, n = 1, 2, ..., 13}.
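As an illustration of step 2, the sketch below computes Sl1(A)_n for each reference performance and keeps the second-largest value per attribute. A posture label is represented as a flat sequence of the 13 attribute values in the order listed above (n = 1 at index 0); that flat representation is a convenience of the sketch, not something the patent prescribes.

```python
# Divisor l_n: 5 for GL_body and the four curl labels (n = 1, 4, 7, 10, 13), else 3.
L_N = [5 if n in (1, 4, 7, 10, 13) else 3 for n in range(1, 14)]

def attribute_similarities(label, comparison_label, coefficients):
    """Per-attribute values A_n * |difference| / l_n for two 13-attribute labels."""
    return [coefficients[i] * abs(label[i] - comparison_label[i]) / L_N[i]
            for i in range(13)]

def second_largest(values):
    return sorted(values, reverse=True)[1]

def similarity_coefficient_group(reference_labels, comparison_label):
    """A1_n (or A2_n): second-largest Sl per attribute over the reference performances."""
    a_init = [1.0] * 13
    per_reference = [attribute_similarities(lbl, comparison_label, a_init)
                     for lbl in reference_labels]
    return [second_largest([ref[i] for ref in per_reference]) for i in range(13)]
```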
Step 3: compute the stop-frame similarity coefficient group as follows:
For each reference standard action, compute the similarity of each attribute between its stop-frame posture label and the stop-frame comparison posture label, Sl2(A)_n, with the following formula:
Sl2(A)_n = A_n × z2_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)   (7)
where z2_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label of the reference standard action and the stop-frame comparison posture label; for example, z2_1 is the absolute value of the difference between the main-body posture label GL_body in the stop-frame posture label of the reference standard action and the main-body posture label GL_body of the stop-frame comparison posture label.
For each attribute n, take the second-largest value among the Sl2(A)_n computed for the stop-frame posture labels of the several reference standard actions as the similarity coefficient A2_n for that attribute. The similarity coefficients A2_n of all attributes form the stop-frame similarity coefficient group Astop = {A2_n, n ∈ Z, n = 1, 2, ..., 13}.
Step 4: for several standard actions, obtain the start-frame similarity coefficient group and stop-frame similarity coefficient group of each standard action according to steps 1-3; the start-frame and stop-frame similarity coefficient groups of all the standard actions form the action template library.
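A minimal sketch, under the same representation, of how the resulting action template library might be organised: one entry per standard action holding the start/stop comparison posture labels of the first performance together with the two similarity coefficient groups. It reuses `similarity_coefficient_group` from the previous sketch; the container layout itself is an assumption.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ActionTemplate:
    start_label: tuple          # start-frame comparison posture label (13 attributes)
    stop_label: tuple           # stop-frame comparison posture label (13 attributes)
    a_start: List[float]        # Astar = {A1_n}
    a_stop: List[float]         # Astop = {A2_n}

def build_template_library(recorded_actions) -> Dict[str, ActionTemplate]:
    """recorded_actions: name -> list of (start_label, stop_label) per performance;
    the first performance is the comparison standard action."""
    library = {}
    for name, performances in recorded_actions.items():
        (start_cmp, stop_cmp), references = performances[0], performances[1:]
        library[name] = ActionTemplate(
            start_label=start_cmp,
            stop_label=stop_cmp,
            a_start=similarity_coefficient_group([s for s, _ in references], start_cmp),
            a_stop=similarity_coefficient_group([e for _, e in references], stop_cmp),
        )
    return library
```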
A third aspect of the present invention provides an action recognition method, comprising the following steps:
Step 1: decompose the action to be recognised into posture labels at each moment, using the decomposition method described above.
Step 2: select a standard action in the action template library, and compute the similarity Sl(B)_n of each attribute between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action; the stop-frame posture label is denoted the t-th-frame posture label. The formula used is as follows:
Sl(B)_n = A1_n × z3_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)   (8)
where z3_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action.
Compute the overall similarity S(B) between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action with the following formula:
Step 3: if the overall similarity S(B) is greater than the set threshold MAXBLUR, return to step 2; otherwise, perform step 4.
Step 4: compute the similarity Sl(C)_n of each attribute between the posture label of the frame preceding the stop frame and the start-frame posture label of the selected standard action; the preceding frame's posture label is denoted the (t-1)-th-frame posture label. The formula used is as follows:
Sl(C)_n = A2_n × z4_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)   (10)
where z4_n is the absolute value of the difference between the corresponding attributes of the preceding frame's posture label and the start-frame posture label of the selected standard action.
Compute the overall similarity S(C) between the preceding frame's posture label and the start-frame posture label of the selected standard action with the following formula:
Step 5: if the overall similarity S(C) is less than the set threshold MAXBLUR, the action to be recognised matches the selected standard action. If the overall similarity S(C) is greater than the set threshold MAXBLUR, return to step 4, replacing the processed object from the (t-1)-th-frame posture label with the (t-2)-th-frame posture label, and so on; if the first-frame posture label is reached and the overall similarity S(C) is still greater than the set threshold MAXBLUR, return to step 2. MAXBLUR expresses the fuzziness of the matching algorithm and takes a value of 0.05-0.25.
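A sketch of the matching procedure of steps 2-5, reusing `attribute_similarities` and `ActionTemplate` from the earlier sketches. The formulas for the overall similarities S(B) and S(C) are not reproduced above, so the sketch assumes they are the mean of the 13 per-attribute values, which keeps them on the same scale as the threshold MAXBLUR (0.05-0.25); that aggregation rule, and the choice MAXBLUR = 0.15, are assumptions. The text pairs the coefficients A1_n with the stop-frame comparison and A2_n with the start-frame comparison, and the sketch follows the text.

```python
MAXBLUR = 0.15   # fuzziness of the matching algorithm; the text gives 0.05-0.25

def overall_similarity(per_attribute):
    # Assumed aggregation: mean of the 13 per-attribute values.
    return sum(per_attribute) / len(per_attribute)

def recognise(frames_labels, library):
    """frames_labels: posture labels of the action to recognise, frames 1..t.
    Returns the name of the matching standard action, or None."""
    stop_label = frames_labels[-1]                       # t-th frame
    for name, tpl in library.items():
        # Steps 2/3: compare the last frame with the template's stop frame
        # (the text pairs this comparison with the coefficients A1_n).
        s_b = overall_similarity(
            attribute_similarities(stop_label, tpl.stop_label, tpl.a_start))
        if s_b > MAXBLUR:
            continue                                     # try the next standard action
        # Steps 4/5: walk backwards looking for a frame that matches the start frame
        # (the text pairs this comparison with the coefficients A2_n).
        for label in reversed(frames_labels[:-1]):       # t-1, t-2, ..., 1
            s_c = overall_similarity(
                attribute_similarities(label, tpl.start_label, tpl.a_stop))
            if s_c < MAXBLUR:
                return name                              # action recognised
    return None
```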
Embodiment
Action recognition using a conventional method (test 1):
The equipment used is a single Kinect, and the action to be recognised is a right-hand salute; the template library is built with a conventional method. Tester a is 173 cm tall and weighs 60 kg, tester b is 191 cm tall and weighs 100 kg, and tester c is 181 cm tall and weighs 80 kg. The first 50 samples were recorded by tester a and samples 51-80 by tester b; recording each sample takes about 2 minutes. When recording, the person stands 1.5 m in front of the device and performs the right-hand salute; the sample library and the test points use all 20 skeleton nodes of the Kinect.
During the test, testers are required to stand 1.5 m in front of the device and perform the action as close to the standard as possible; the template library is optimised each time a new sample is recorded. Each person performs the right-hand salute ten times, and the recognition results are counted. The recognition results are shown in Table 1:
Table 1
The test results show that when tester a acts both as the tester and as the sample recorder, the recognition success rate rises noticeably as the number of recorded samples increases, reaching 100% when the number of samples is 50, while the success rates of the other testers remain essentially unchanged. After switching to tester b as the sample recorder, tester b's success rate improves considerably while tester a's success rate actually drops slightly. Tester c, who did not take part in recording, has a relatively low recognition success rate, although it increases as the number of samples grows. Test 1 took 4 hours and 20 minutes in total.
Action recognition using the method of the present invention (test 2):
The equipment used is a single Kinect, and the actions to be recognised are a right-hand salute and waving both hands. The posture tag library is built with the present method using all 20 nodes; building the action-posture library and the template library took 30 minutes in total and covers six actions: standing, left hand raised, right hand raised, both hands raised, left-arm salute and right-arm salute. The right-hand-raised action is close to the right-arm salute, and the both-hands-raised action is a compound action of three postures that satisfies both the left-hand-raised and right-hand-raised requirements; it was added to increase the difficulty of the test. Tester a is 173 cm tall and weighs 60 kg, tester b is 191 cm tall and weighs 100 kg, and tester c is 181 cm tall and weighs 80 kg, the same as in test 1.
During the test, testers are required to stand 1.5 m in front of the device and perform the actions as close to the standard as possible; each person performs the right-hand salute and the both-hands wave ten times each. The whole test requires no update of the template library, so nobody needs to perform additional rounds of actions. The recognition results are counted and shown in Table 2:
Table 2
In the results, one of tester c's right-arm salutes was wrongly recognised as a right hand raised, and one of the both-hands actions was wrongly recognised as a right hand raised; this is related to how the corresponding actions are defined in the action-posture library.
Overall, the recognition success rate is much higher than in test 1, with good success rates for all three testers of different body shapes. The whole test took 1 hour and 10 minutes in total, and the recognised actions are richer and more difficult than in test 1.
This shows that the method has good generality across testers, and that building (designing) the template library is simpler and more convenient.

Claims (4)

1. A method for decomposing an action into posture labels, characterised by comprising the following steps:
Step 1: use a skeleton tracking device to obtain the position data of the key nodes of the human body at each moment; the position data of the key nodes are expressed in the coordinate system of the skeleton tracking device; the key nodes include at least: HEAD, SHOULDER CENTER, SPINE, HIP CENTER, SHOULDER RIGHT, SHOULDER LEFT, ELBOW RIGHT, ELBOW LEFT, WRIST RIGHT, WRIST LEFT, HAND RIGHT, HAND LEFT, HIP RIGHT, HIP LEFT, KNEE RIGHT, KNEE LEFT, ANKLE RIGHT, ANKLE LEFT, FOOT RIGHT and FOOT LEFT;
Step 2: convert the position data of the key nodes at each moment obtained in step 1 into position data in a morphological coordinate system; the morphological coordinate system takes the facing direction of the human body as the positive Z axis, the morphological upward direction of the human body as the positive Y axis, the person's left as the positive X axis, and the key node HIP CENTER as the origin;
Step 3: from the position data of the key nodes in the morphological coordinate system at each moment obtained in step 2, compute the posture label at each moment; the posture label comprises a main-body posture label GL_body, a left-forelimb posture label GL_lf, a right-forelimb posture label GL_rf, a left-hindlimb posture label GL_lb and a right-hindlimb posture label GL_rb.
2. The method for decomposing an action into posture labels according to claim 1, characterised in that the main-body posture label GL_body in step 3 is obtained as follows:
take the coordinate of X_F, Y_F and Z_F with the largest absolute value and look up the GL_body value of the interval to which that coordinate belongs; this is the value of the main-body posture label GL_body; the formula used is as follows:
where X_F, Y_F and Z_F are the coordinates of the unit vector F on the three axes; the unit vector F is obtained from the vector formed by the key nodes HEAD and HIP CENTER in the skeleton tracking device coordinate system;
the left-forelimb posture label GL_lf, right-forelimb posture label GL_rf, left-hindlimb posture label GL_lb and right-hindlimb posture label GL_rb in step 3 are obtained as follows:
each of the four posture labels involves three key nodes, denoted key node 1, key node 2 and key node 3; for the left-forelimb posture label GL_lf the three key nodes are ELBOW LEFT, WRIST LEFT and HAND LEFT; for the right-forelimb posture label GL_rf they are ELBOW RIGHT, WRIST RIGHT and HAND RIGHT; for the left-hindlimb posture label GL_lb they are KNEE LEFT, ANKLE LEFT and FOOT LEFT; for the right-hindlimb posture label GL_rb they are KNEE RIGHT, ANKLE RIGHT and FOOT RIGHT;
the positions of the three key nodes in the morphological coordinate system are denoted (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3); each of the four posture labels comprises a height label G1, an orientation label G2 and a curl label G3;
the height label G1 is obtained as follows:
G1 = round((g1 + g2 + g3) / 3), where
n = 1, 2, 3, Y_H is the Y coordinate of key node HEAD in the morphological coordinate system, and Y_HC is the Y coordinate of key node SHOULDER CENTER in the morphological coordinate system;
the orientation label G2 is obtained as follows:
count the signs of the X and Z coordinates of key node 1, key node 2 and key node 3, and compute the orientation label G2 using the following formula:
the curl label G3 is obtained as follows:
based on key node 1, key node 2 and key node 3, a key node 4 is introduced, and the distances D1, D2 and D3 between key node 4 and key nodes 1, 2 and 3 respectively are computed; for the left-forelimb posture label GL_lf, key node 4 is SHOULDER LEFT; for the right-forelimb posture label GL_rf, key node 4 is SHOULDER RIGHT; for the left-hindlimb posture label GL_lb, key node 4 is HIP LEFT; for the right-hindlimb posture label GL_rb, key node 4 is HIP RIGHT;
the value of the curl label G3 is obtained with the following formula:
3. A method for building an action template library, characterised by comprising the following steps:
Step 1: perform a standard action several times and decompose each performance of the standard action into posture labels at each moment; the posture label of the starting moment is taken as the start-frame posture label and the posture label of the ending moment as the stop-frame posture label; the first performance of the standard action serves as the comparison standard action and the other performances serve as reference standard actions; the start-frame posture label of the comparison standard action is used as the start-frame comparison posture label, and the stop-frame posture label of the first performance is used as the stop-frame comparison posture label;
the posture labels at each moment into which each performance of the standard action is decomposed are the posture labels obtained with the method of claim 1;
Step 2: compute the start-frame similarity coefficient group as follows:
for each reference standard action, compute the similarity of each attribute between its start-frame posture label and the start-frame comparison posture label, Sl1(A)_n, with the following formula:
Sl1(A)_n = A_n × z1_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where A_n is the initialised similarity coefficient, A_n = 1; n is the index of the attribute, and for n = 1 to 13 the attributes are, in order: the main-body posture label GL_body; the height label G1, orientation label G2 and curl label G3 of the left-forelimb posture label GL_lf; the height label G1, orientation label G2 and curl label G3 of the left-hindlimb posture label GL_lb; the height label G1, orientation label G2 and curl label G3 of the right-forelimb posture label GL_rf; and the height label G1, orientation label G2 and curl label G3 of the right-hindlimb posture label GL_rb; z1_n is the absolute value of the difference between the corresponding attributes of the start-frame posture label of the reference standard action and the start-frame comparison posture label;
for each attribute n, take the second-largest value among the Sl1(A)_n computed for the start-frame posture labels of the several reference standard actions as the similarity coefficient A1_n for that attribute; the similarity coefficients A1_n of all attributes form the start-frame similarity coefficient group Astar = {A1_n, n ∈ Z, n = 1, 2, ..., 13};
Step 3: compute the stop-frame similarity coefficient group as follows:
for each reference standard action, compute the similarity of each attribute between its stop-frame posture label and the stop-frame comparison posture label, Sl2(A)_n, with the following formula:
Sl2(A)_n = A_n × z2_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z2_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label of the reference standard action and the stop-frame comparison posture label;
for each attribute n, take the second-largest value among the Sl2(A)_n computed for the stop-frame posture labels of the several reference standard actions as the similarity coefficient A2_n for that attribute; the similarity coefficients A2_n of all attributes form the stop-frame similarity coefficient group Astop = {A2_n, n ∈ Z, n = 1, 2, ..., 13};
Step 4: for several standard actions, obtain the start-frame similarity coefficient group and stop-frame similarity coefficient group of each standard action according to steps 1-3; the start-frame and stop-frame similarity coefficient groups of all the standard actions form the action template library.
4. An action recognition method based on posture labels, characterised by comprising the following steps:
Step 1: decompose the action to be recognised into posture labels at each moment; the posture labels at each moment into which the action to be recognised is decomposed are the posture labels obtained with the method of claim 1;
Step 2: select a standard action in the action template library, and compute the similarity Sl(B)_n of each attribute between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action; the stop-frame posture label is denoted the t-th-frame posture label; the formula used is as follows:
Sl(B)_n = A1_n × z3_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z3_n is the absolute value of the difference between the corresponding attributes of the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action;
compute the overall similarity S(B) between the stop-frame posture label obtained in step 1 and the stop-frame posture label of the selected standard action with the following formula:
Step 3: if the overall similarity S(B) is greater than the set threshold MAXBLUR, return to step 2; otherwise, perform step 4;
Step 4: compute the similarity Sl(C)_n of each attribute between the posture label of the frame preceding the stop frame and the start-frame posture label of the selected standard action; the preceding frame's posture label is denoted the (t-1)-th-frame posture label; the formula used is as follows:
Sl(C)_n = A2_n × z4_n ÷ l_n  (n ∈ Z, n ∈ [1, 13]; l_n = 5 when n = 1, 4, 7, 10 or 13, and l_n = 3 otherwise)
where z4_n is the absolute value of the difference between the corresponding attributes of the preceding frame's posture label and the start-frame posture label of the selected standard action;
compute the overall similarity S(C) between the preceding frame's posture label and the start-frame posture label of the selected standard action with the following formula:
Step 5: if the overall similarity S(C) is less than the set threshold MAXBLUR, the action to be recognised matches the selected standard action; if the overall similarity S(C) is greater than the set threshold MAXBLUR, return to step 4, replacing the processed object from the (t-1)-th-frame posture label with the (t-2)-th-frame posture label, and so on; if the first-frame posture label is reached and the overall similarity S(C) is still greater than the set threshold MAXBLUR, return to step 2.
CN201810133363.0A 2018-02-09 2018-02-09 Gesture tag-based action recognition method Expired - Fee Related CN108392207B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810133363.0A CN108392207B (en) 2018-02-09 2018-02-09 Gesture tag-based action recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810133363.0A CN108392207B (en) 2018-02-09 2018-02-09 Gesture tag-based action recognition method

Publications (2)

Publication Number Publication Date
CN108392207A true CN108392207A (en) 2018-08-14
CN108392207B CN108392207B (en) 2020-12-11

Family

ID=63096010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810133363.0A Expired - Fee Related CN108392207B (en) 2018-02-09 2018-02-09 Gesture tag-based action recognition method

Country Status (1)

Country Link
CN (1) CN108392207B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110215216A (en) * 2019-06-11 2019-09-10 中国科学院自动化研究所 Based on the with different levels Activity recognition method in skeletal joint point subregion, system
CN110309743A (en) * 2019-06-21 2019-10-08 新疆铁道职业技术学院 Human body attitude judgment method and device based on professional standard movement
CN112617819A (en) * 2020-12-21 2021-04-09 西南交通大学 Method and system for recognizing lower limb posture of infant

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11272877A (en) * 1998-03-25 1999-10-08 Namco Ltd Skeleton model data representation
CN103886588A (en) * 2014-02-26 2014-06-25 浙江大学 Feature extraction method of three-dimensional human body posture projection
CN104268138A (en) * 2014-05-15 2015-01-07 西安工业大学 Method for capturing human motion by aid of fused depth images and three-dimensional models
WO2015162158A1 (en) * 2014-04-22 2015-10-29 Université Libre de Bruxelles Human motion tracking
US20150325004A1 (en) * 2013-01-18 2015-11-12 Kabushiki Kaisha Toshiba Motion information processing apparatus and method
CN105243375A (en) * 2015-11-02 2016-01-13 北京科技大学 Motion characteristics extraction method and device
CN105608467A (en) * 2015-12-16 2016-05-25 西北工业大学 Kinect-based non-contact type student physical fitness evaluation method
CN106022213A (en) * 2016-05-04 2016-10-12 北方工业大学 Human body motion recognition method based on three-dimensional bone information
CN106295616A (en) * 2016-08-24 2017-01-04 张斌 Exercise data analyses and comparison method and device
CN106528586A (en) * 2016-05-13 2017-03-22 上海理工大学 Human behavior video identification method
KR101722131B1 (en) * 2015-11-25 2017-03-31 국민대학교 산학협력단 Posture and Space Recognition System of a Human Body Using Multimodal Sensors
CN106874884A (en) * 2017-03-03 2017-06-20 中国民航大学 Human body recognition methods again based on position segmentation
CN107038430A (en) * 2017-05-05 2017-08-11 成都通甲优博科技有限责任公司 A kind of method and its device for constructing human body attitude data sample
CN107115102A (en) * 2017-06-07 2017-09-01 西南科技大学 A kind of osteoarticular function appraisal procedure and device
CN107174255A (en) * 2017-06-15 2017-09-19 西安交通大学 Three-dimensional gait information gathering and analysis method based on Kinect somatosensory technology
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11272877A (en) * 1998-03-25 1999-10-08 Namco Ltd Skeleton model data representation
US20150325004A1 (en) * 2013-01-18 2015-11-12 Kabushiki Kaisha Toshiba Motion information processing apparatus and method
CN103886588A (en) * 2014-02-26 2014-06-25 浙江大学 Feature extraction method of three-dimensional human body posture projection
WO2015162158A1 (en) * 2014-04-22 2015-10-29 Université Libre de Bruxelles Human motion tracking
CN104268138A (en) * 2014-05-15 2015-01-07 西安工业大学 Method for capturing human motion by aid of fused depth images and three-dimensional models
CN105243375A (en) * 2015-11-02 2016-01-13 北京科技大学 Motion characteristics extraction method and device
KR101722131B1 (en) * 2015-11-25 2017-03-31 국민대학교 산학협력단 Posture and Space Recognition System of a Human Body Using Multimodal Sensors
CN105608467A (en) * 2015-12-16 2016-05-25 西北工业大学 Kinect-based non-contact type student physical fitness evaluation method
CN106022213A (en) * 2016-05-04 2016-10-12 北方工业大学 Human body motion recognition method based on three-dimensional bone information
CN106528586A (en) * 2016-05-13 2017-03-22 上海理工大学 Human behavior video identification method
CN106295616A (en) * 2016-08-24 2017-01-04 张斌 Exercise data analyses and comparison method and device
CN106874884A (en) * 2017-03-03 2017-06-20 中国民航大学 Human body recognition methods again based on position segmentation
CN107038430A (en) * 2017-05-05 2017-08-11 成都通甲优博科技有限责任公司 A kind of method and its device for constructing human body attitude data sample
CN107115102A (en) * 2017-06-07 2017-09-01 西南科技大学 A kind of osteoarticular function appraisal procedure and device
CN107174255A (en) * 2017-06-15 2017-09-19 西安交通大学 Three-dimensional gait information gathering and analysis method based on Kinect somatosensory technology
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KE Q , AN S , BENNAMOUN M , ET AL.: "SkeletonNet: Mining Deep Part Features for 3-D Action Recognition", 《IEEE SIGNAL PROCESSING LETTERS》 *
KIM, YEJIN; BAEK, SEONGMIN; BAE, BYUNG-CHULL: "Motion Capture of the Human Body Using Multiple Depth Sensors", 《ETRI JOURNAL》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110215216A (en) * 2019-06-11 2019-09-10 中国科学院自动化研究所 Based on the with different levels Activity recognition method in skeletal joint point subregion, system
CN110215216B (en) * 2019-06-11 2020-08-25 中国科学院自动化研究所 Behavior identification method and system based on skeletal joint point regional and hierarchical level
CN110309743A (en) * 2019-06-21 2019-10-08 新疆铁道职业技术学院 Human body attitude judgment method and device based on professional standard movement
CN112617819A (en) * 2020-12-21 2021-04-09 西南交通大学 Method and system for recognizing lower limb posture of infant

Also Published As

Publication number Publication date
CN108392207B (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN110097639B (en) Three-dimensional human body posture estimation method
CN106600626B (en) Three-dimensional human motion capture method and system
CN108764120B (en) Human body standard action evaluation method
CN105389539B (en) A kind of three-dimension gesture Attitude estimation method and system based on depth data
CN105320944B (en) A kind of human body behavior prediction method based on human skeleton motion information
CN111931804B (en) Human body action automatic scoring method based on RGBD camera
Uddin et al. Human activity recognition using body joint‐angle features and hidden Markov model
CN110705390A (en) Body posture recognition method and device based on LSTM and storage medium
CN109086706A (en) Applied to the action identification method based on segmentation manikin in man-machine collaboration
CN107301370A (en) A kind of body action identification method based on Kinect three-dimensional framework models
CN110348330A (en) Human face posture virtual view generation method based on VAE-ACGAN
CN108392207A (en) A kind of action identification method based on posture label
CN110363867A (en) Virtual dress up system, method, equipment and medium
CN106909890B (en) Human behavior recognition method based on part clustering characteristics
Uddin et al. Human Activity Recognition via 3-D joint angle features and Hidden Markov models
CN111539245B (en) CPR (CPR) technology training evaluation method based on virtual environment
CN108073855A (en) A kind of recognition methods of human face expression and system
CN115797851B (en) Cartoon video processing method and system
CN110135277A (en) A kind of Human bodys' response method based on convolutional neural networks
Hassanpour et al. Visionbased hand gesture recognition for human computer interaction: A review
Hachaj et al. Human actions recognition on multimedia hardware using angle-based and coordinate-based features and multivariate continuous hidden Markov model classifier
CN109993818B (en) Method, device, equipment and medium for synthesizing motion of three-dimensional human body model
CN111539364A (en) Multi-somatosensory human behavior recognition algorithm based on feature fusion and multi-classifier voting
CN111738095B (en) Character recognition method based on skeleton posture
Ding et al. Skeleton‐Based Human Action Recognition via Screw Matrices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201211

Termination date: 20220209