CN106295616B - Exercise data analysis and comparison method and device - Google Patents

Exercise data analysis and comparison method and device

Info

Publication number
CN106295616B
CN106295616B (application CN201610714829.7A)
Authority
CN
China
Prior art keywords
segment
data
frame data
target action
target
Prior art date
Legal status: Active (assumed; not a legal conclusion)
Application number
CN201610714829.7A
Other languages
Chinese (zh)
Other versions
CN106295616A (en)
Inventor
张斌 (Zhang Bin)
Current Assignee: Individual (listing may be inaccurate)
Original Assignee
Individual
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Individual
Priority to CN201610714829.7A
Publication of CN106295616A
Application granted
Publication of CN106295616B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention provides an exercise data analysis and comparison method and device. The method includes: obtaining motion data of a target athlete within a specified time range; extracting the motion data of a target action segment from the athlete's motion data; analyzing the motion data of the target action segment to obtain key part posture frame data for the different body parts in the target action segment; converting the key part posture frame data of each part in the target action segment into spatial action parameters in a human-body coordinate system; using the set of the part posture frame data of each part in the target action segment as the feature vector of the target action segment and matching a corresponding standard action segment in a preset action template library; and comparing the part posture frame data of each part of the target action segment with those of the standard action segment and outputting the comparison result. In this way, the key motion data are converted into spatial action parameters that are cheap to compute and intuitively express the action, which effectively improves the efficiency of data matching.

Description

Exercise data analysis and comparison method and device
Technical field
The present invention relates to the field of data analysis, and in particular to an exercise data analysis and comparison method and device.
Background technique
With the continuous development and maturation of digital technology, rational, digitized analysis of exercise data has become a major criterion for assessing motion state. By comparing a user's action data with standard action data during exercise, the user can clearly understand the deficiencies of his or her movement posture. Comparative analysis of action data is widely applied in fields such as athletic posture correction, personal exercise analysis, athlete training data analysis, and competition data analysis.
Comparative analysis of exercise data generally first models and identifies the action type, and then compares the standard action against the target action. In traditional exercise data analysis and comparison methods, the extraction and modeling of exercise data are complex, the data representation is not intuitive, and establishing the matching relationship between the standard action and the target action is computationally expensive, which seriously limits the efficiency of the analysis and comparison.
Summary of the invention
To overcome the above deficiencies in the prior art, the technical problem to be solved by the present invention is to provide an exercise data analysis and comparison method and device whose data representation is intuitive and whose analytical computation cost is small.
Regarding the method, the present invention provides an exercise data analysis and comparison method, the method comprising:
obtaining motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position;
extracting the motion data of a target action segment from the motion data of the target athlete;
analyzing the motion data of the target action segment to obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
establishing a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, converting the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system;
using the set of the part posture frame data of each part in the target action segment as the feature vector of the target action segment, and matching a corresponding standard action segment in a preset action template library;
comparing the part posture frame data of each part of the target action segment with those of the standard action segment, and outputting the comparison result.
Regarding the device, the present invention provides an exercise data analysis and comparison device, the device comprising:
a target motion acquisition module, configured to obtain motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position;
a target action segment extraction module, configured to extract the motion data of a target action segment from the motion data of the target athlete;
a target key posture frame analysis module, configured to analyze the motion data of the target action segment and obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
a target motion data conversion module, configured to establish a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, convert the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system;
an action matching module, configured to use the set of the spatial action parameters of each part in the target action segment as the feature vector of the target action segment and match a corresponding standard action segment in a preset action template library;
an action comparison module, configured to compare the whole-body posture frame data of the target action segment with those of the standard action segment and output the comparison result.
Compared with the prior art, the present invention has the following advantages:
The present invention provides an exercise data analysis and comparison method and device. A target action segment is extracted by analyzing the motion data of a target athlete; the key part posture frame data in the target action segment are computed and, combined with the body spatial data, converted into spatial action parameters. A standard action segment in the preset action template library is matched by means of the spatial action parameters, and the two segments are compared. In this way, the key motion data are converted into spatial action parameters that are cheap to compute and intuitively express the action, which effectively improves the efficiency of data matching.
Brief description of the drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings needed in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present invention and are therefore not to be regarded as limiting its scope. For those of ordinary skill in the art, other relevant drawings can be obtained from these drawings without creative effort.
Fig. 1 is a structural block diagram of the data processing device provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of the exercise data analysis and comparison method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the division of the human moving parts provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the Euler angles provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the reference directions of the body spatial data provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of the reference directions of the body steering data provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the body spatial parameter directions provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of the reference directions of the part spatial parameters provided by an embodiment of the present invention;
Fig. 9 is a schematic diagram of the difference comparison interface provided by an embodiment of the present invention;
Fig. 10 is a schematic diagram of the synchronization rate analysis interface provided by an embodiment of the present invention;
Fig. 11 is a schematic diagram of the speed comparison interface provided by an embodiment of the present invention;
Fig. 12 is a structural block diagram of the exercise data analysis and comparison device provided by an embodiment of the present invention.
In the above drawings, the names corresponding to the reference numerals are as follows:
Data processing device 100
Processor 130
Memory 120
Communication unit 140
Exercise data analysis and comparison device 110
Target motion acquisition module 111
Target action segment extraction module 112
Target key posture frame analysis module 113
Target motion data conversion module 114
Action matching module 115
Action comparison module 116
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments, as generally described and illustrated in the drawings herein, may be arranged and designed in a variety of different configurations.
Therefore, the following detailed description of the embodiments provided in the accompanying drawings is not intended to limit the claimed scope of the present invention, but merely represents selected embodiments. Based on these embodiments, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined or explained in subsequent drawings.
In the description of the present invention, it should be noted that terms indicating orientation or positional relationships, such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", and "outer", are based on the orientations or positional relationships shown in the drawings, or the orientations in which the product of the invention is usually placed when used. They are used only for convenience and simplicity of description and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation; they therefore cannot be construed as limiting the present invention. In addition, the terms "first", "second", "third", etc. are used only to distinguish descriptions and cannot be understood as indicating or implying relative importance.
Furthermore, terms such as "horizontal", "vertical", and "overhanging" do not require that a component be absolutely horizontal, vertical, or overhanging; it may be slightly inclined. For example, "horizontal" merely means that a direction is closer to horizontal than "vertical" is; it does not mean the structure must be perfectly horizontal.
The exercise data analysis and comparison method provided in this embodiment is applied to a data processing device. In this embodiment, the data processing device 100 may be, but is not limited to, a server, a smartphone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), or a mobile Internet device (MID).
Please refer to Fig. 1, a structural block diagram of the data processing device 100 provided in this embodiment. The data processing device 100 includes an exercise data analysis and comparison device 110, a memory 120, a processor 130, and a communication unit 140.
The memory 120, the processor 130, and the communication unit 140 are electrically connected to one another, directly or indirectly, to enable the transmission and interaction of data; for example, these elements may be electrically connected through one or more communication buses or signal lines. The exercise data analysis and comparison device 110 includes at least one software functional module that may be stored in the memory 120 in the form of software or firmware, or solidified in the operating system (OS) of the data processing device 100. The processor 130 executes the executable modules stored in the memory 120, such as the software functional modules and computer programs included in the exercise data analysis and comparison device 110. The communication unit 140 communicates with other devices to obtain data or files, for example exercise data.
Please refer to Fig. 2, a schematic flowchart of the exercise data analysis and comparison method provided in this embodiment. The method includes the following steps.
Step S110: obtain motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position.
In this embodiment, the motion data consist of the whole-body posture frame data at the different time sampling points within a continuous specified time range. Each whole-body posture frame contains the part posture frame data of multiple different moving parts at the current time sampling point, and further includes the body spatial data at that point, that is,
whole-body posture frame data = { { part posture frame data of the different parts }, body spatial data }
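This nesting can be written down compactly; the following Python sketch is only illustrative (none of the type or field names come from the patent):

    from dataclasses import dataclass

    # One part's posture frame: the Euler angles (theta, phi, psi)
    PartPoseFrame = tuple[float, float, float]

    @dataclass
    class BodySpaceData:
        steering: tuple[float, float, float]   # yaw, pitch, roll (omega, eps, xi)
        position: tuple[float, float, float]   # displacement (x, y, z) of the sacrum

    @dataclass
    class WholePoseFrame:
        time: float                            # time sampling point
        parts: dict[str, PartPoseFrame]        # part name -> part posture frame data
        body_space: BodySpaceData              # whole-body steering and position

    # Motion data over the specified time range: an ordered sequence of frames
    MotionData = list[WholePoseFrame]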
Specifically, please refer to Fig. 3. In this embodiment, the structure of the human body takes the sacrum as origin and is divided into seven systems: the trunk system, the left lower limb system, the right lower limb system, the left upper limb system, the right upper limb system, the left-hand finger system, and the right-hand finger system. For the specific part numbering, please refer to Table 1.
Table 1
Specifically, in this embodiment, the part posture frame data are characterized by Euler angle data. A moving part of the human body is treated as a rigid body connected between two movable joints, where the joint closer to the sacrum along the extension direction of the body part is the parent joint. Please refer to Fig. 4: a spatial coordinate system is established with the parent joint of the body part as origin, and the current posture of the part is characterized by the extension angle θ, the rotation angle φ, and the spin angle ψ of the Euler angles.
That is, part posture frame data = { θt, φt, ψt }
In this embodiment, please refer to Fig. 5, which defines the reference directions of the coordinate axes when the body stands in the standard posture. Please refer to Table 1 for the coordinate-axis directions of the coordinate systems of the different parts, where O is the parent joint of the part, i.e. the coordinate origin, and X, Y, Z are the directions of the three reference axes of the part's Euler angles. The initial angles are the Euler angles of each part when the body stands upright in the standard posture; please refer to Table 2.
Table 2
Specifically, in this embodiment, the body spatial data are characterized by the body steering data (yaw angle ω, pitch angle ε, roll angle ξ) and the body spatial position (x, y, z), that is,
body spatial data = { (ωt, εt, ξt), (xt, yt, zt) }
Please refer to Fig. 6. In this embodiment, with the sacrum in the upright standard posture as origin, the initial positions and directions of the yaw angle ω, the pitch angle ε, and the roll angle ξ are defined as shown in Fig. 6.
In the initial upright standard posture, a coordinate system is established with the facing direction of the face as the x-axis, the right-hand direction as the y-axis, and the vertical upward direction as the z-axis. The body spatial position (xt, yt, zt) represents the spatial displacement of the body at the current time point relative to the initial position; in this embodiment the sacrum is taken as the body's reference point for calculating the body spatial position (xt, yt, zt).
In this embodiment, the three-axis acceleration and/or three-axis magnetic declination data collected by the sensing devices arranged at the joints of the human body are processed by a Kalman filtering algorithm to remove jitter and noise, and the data of the initial state and the current state are then calculated to obtain the part posture frame data and the body spatial data of the different body parts.
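The patent names Kalman filtering for removing jitter and noise from the sensor streams; the following scalar Python sketch shows the idea for a single sensor axis (the random-walk state model and the noise variances q and r are illustrative assumptions, not values from the patent):

    def kalman_smooth(samples, q=1e-3, r=1e-2):
        """Scalar Kalman filter over one sensor axis.

        q: process-noise variance, r: measurement-noise variance
        (both illustrative; the patent does not specify them)."""
        x, p = samples[0], 1.0          # state estimate and its variance
        smoothed = []
        for z in samples:
            p = p + q                   # predict: variance grows by process noise
            k = p / (p + r)             # Kalman gain
            x = x + k * (z - x)         # correct with the new measurement
            p = (1.0 - k) * p           # variance shrinks after the correction
            smoothed.append(x)
        return smoothed

In a full pipeline, a filter of this kind would run per axis of each joint sensor before the posture frames are assembled.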
Step S120: extract the motion data of a target action segment from the motion data of the target athlete.
In practical applications, the motion data within the specified time range may include data recorded before the target athlete has started to move, and motion data from before the movement starts do not need to be compared. This embodiment therefore extracts the target action segment to be analyzed and compared from the motion data within the specified time range. Specifically, step S120 may include the following sub-steps.
Compare, in order, consecutive first, second, and third whole-body posture frame data among the multiple whole-body posture frame data of the target athlete's motion data. That is, among the whole-body posture frame data k(t) of the target athlete's motion data, compare three consecutive whole-body posture frames k(i−1), k(i), and k(i+1).
When the part posture frame data of all corresponding parts are identical between the first and second whole-body posture frame data, and the part posture frame data of any corresponding part differ between the second and third whole-body posture frame data, the frame corresponding to the second whole-body posture frame data is taken as the overall action start frame. That is, if k(i−1) = k(i) and k(i) ≠ k(i+1), the frame corresponding to k(i) is taken as the overall action start frame.
Compare, in order, consecutive fourth, fifth, and sixth whole-body posture frame data among the multiple whole-body posture frame data of the target athlete's motion data. That is, compare three consecutive whole-body posture frames k(j−1), k(j), and k(j+1).
When the part posture frame data of any corresponding part differ between the fourth and fifth whole-body posture frame data, and the part posture frame data of all corresponding parts are identical between the fifth and sixth whole-body posture frame data, the frame corresponding to the fifth whole-body posture frame data is taken as the overall action end frame. That is, if k(j−1) ≠ k(j) and k(j) = k(j+1), the frame corresponding to k(j) is taken as the overall action end frame.
The whole-body posture frame data between the overall action start frame and the overall action end frame are extracted as the motion data of the target action segment, which is recorded as r.
Step S130: analyze the motion data of the target action segment to obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement.
Among the part posture frame data hn(t) of the target action segment, three consecutive part posture frames hn(p−1), hn(p), and hn(p+1) are compared: if hn(p−1) = hn(p) and hn(p) ≠ hn(p+1), then hn(p) is taken as the part action start frame data of that part.
Likewise, three consecutive part posture frames hn(q−1), hn(q), and hn(q+1) are compared: if hn(q−1) ≠ hn(q) and hn(q) = hn(q+1), then hn(q) is taken as the part action end frame data of that part.
The part action start frame data and the part action end frame data of each moving part are taken as the key part posture frame data of that part.
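Both the overall extraction of step S120 (on the whole-body frames k(t)) and the per-part analysis of step S130 (on the part frames hn(t)) rest on the same triple comparison; a minimal Python sketch, assuming only that frames support equality comparison:

    def find_action_bounds(frames):
        """Locate action boundaries with the patent's triple comparison:
        start: first i with frames[i-1] == frames[i] and frames[i] != frames[i+1]
        end:   first later j with frames[j-1] != frames[j] and frames[j] == frames[j+1]"""
        start = end = None
        for i in range(1, len(frames) - 1):
            if start is None and frames[i - 1] == frames[i] != frames[i + 1]:
                start = i
            elif start is not None and frames[i - 1] != frames[i] == frames[i + 1]:
                end = i
                break
        return start, end

Applied to the whole-body posture frames this returns the overall action start and end frames; applied to one part's posture frames it returns that part's action start and end frames.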
Step S140: establish a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, convert the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system.
In this embodiment, the spatial action parameters include part spatial parameters and body spatial parameters.
In this embodiment, the body spatial parameters are characterized by the pitch direction α, the turning direction β, the roll direction γ, and the vertical direction μ.
That is, body spatial parameters = { α, β, γ, μ }, where
pitch direction α ∈ { forward, backward, up, down }
turning direction β ∈ { forward, backward, left, right }
roll direction γ ∈ { left, right, up, down }
vertical direction μ ∈ { up, middle, down }
Specifically, in this embodiment, the body spatial data are converted by calculation into the rotation directions represented by the pitch direction α, the turning direction β, and the roll direction γ, which characterize the body's rotation during the movement.
Fig. 7 is please referred to, when definition human body standard stance is upright, initial human body steering data are in human space data (ω000), it is (ω when the target action segment end time pointttt), then the pitch orientation α, turn directions β, the value of tilted direction γ is rolled as shown in table 3, table 4 and table 5.
β value | range of (ωt − ω0)
forward | −45° < (ωt − ω0) < 45°
left | 45° < (ωt − ω0) < 135°
backward | (ωt − ω0) > 135° or (ωt − ω0) < −135°
right | −135° < (ωt − ω0) < −45°
Table 3
α value | range of (εt − ε0)
forward | −45° < (εt − ε0) < 45°
up | 45° < (εt − ε0) < 135°
backward | (εt − ε0) > 135° or (εt − ε0) < −135°
down | −135° < (εt − ε0) < −45°
Table 4
γ value | range of (ξt − ξ0)
up | −45° < (ξt − ξ0) < 45°
right | 45° < (ξt − ξ0) < 135°
down | (ξt − ξ0) > 135° or (ξt − ξ0) < −135°
left | −135° < (ξt − ξ0) < −45°
Table 5
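Tables 3 to 5 all quantize an angle difference into four 90° sectors; a minimal Python sketch of that quantization (the wrap-around handles the sector that straddles ±180°, and the example angle differences are invented):

    def classify_direction(delta_deg, labels):
        """Map an angle difference onto one of four 90-degree sectors.

        labels order: (within +/-45deg, 45..135deg, beyond +/-135deg, -135..-45deg)."""
        d = (delta_deg + 180.0) % 360.0 - 180.0      # normalize to [-180, 180)
        if -45.0 < d < 45.0:
            return labels[0]
        if 45.0 <= d < 135.0:
            return labels[1]
        if -135.0 < d <= -45.0:
            return labels[3]
        return labels[2]                             # |d| >= 135: wrapped sector

    # Illustrative end-minus-initial steering differences (degrees):
    beta  = classify_direction(160.0, ("forward", "left", "backward", "right"))  # yaw
    alpha = classify_direction(-60.0, ("forward", "up", "backward", "down"))     # pitch
    gamma = classify_direction(30.0,  ("up", "right", "down", "left"))           # roll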
In this embodiment, the initial body spatial position in the body spatial data is defined as (x0, y0, z0), and the position at the end of the target action segment as (xt, yt, zt), where z0 is the height of the sacrum when standing. The values of the vertical direction μ are then as shown in Table 6.
Table 6
In this embodiment, the part spatial parameters are characterized by the part frontal-plane space direction U and the part lateral space direction V.
That is, part spatial parameters = { U, V }, where
U ∈ { center, left, right, upper, upper-left, upper-right, lower, lower-left, lower-right },
V ∈ { front, middle, rear }
Please refer to Fig. 8. U is the part frontal-plane space direction; Fig. 8 divides this plane into nine positions in total, where the spacing from the sacrum to the vertebra prominens is defined as the "middle height" (Wh) and the shoulder width as the "middle width" (Wm). V is the part lateral space direction, with the spatial positions defined as in Fig. 8; the body thickness is defined as the "side width" (Ws).
Specifically, in this embodiment, a human-body coordinate system is established centered on the sacrum, and the space around the body is divided by the frontal-plane space direction U and the lateral space direction V into 27 space segments. The part posture frame data are converted by calculation into the part spatial parameters of these 27 space segments, and the segment in which a part lies characterizes that part's position.
The frontal-plane space direction U and the lateral space direction V are calculated from the part posture frame data; the specific calculation steps are as follows.
Please refer to Table 7. In this embodiment, the connection relationships of the human body are defined as shown in Table 7.
Part | Connection dimension d | Explanation
Sacrum | 0 | Body origin, final reference coordinate
Vertebra prominens | 1 | Connects to the sacrum
Head | 2 | Connects to the vertebra prominens
Right shoulder | 2 | Connects to the vertebra prominens
Left shoulder | 2 | Connects to the vertebra prominens
Right elbow | 3 | Connects to the right shoulder
Left elbow | 3 | Connects to the left shoulder
Right wrist | 4 | Connects to the right elbow
Left wrist | 4 | Connects to the left elbow
Right palm | 5 | Connects to the right wrist
Left palm | 5 | Connects to the left wrist
Left hip | 1 | Connects to the sacrum, unchanged
Right hip | 1 | Connects to the sacrum, unchanged
Left knee | 2 | Connects to the left hip
Right knee | 2 | Connects to the right hip
Right ankle | 3 | Connects to the right knee
Left ankle | 3 | Connects to the left knee
Right sole | 4 | Connects to the right ankle
Left sole | 4 | Connects to the left ankle
Table 7
Here the connection dimension d indicates that the moving part is the d-th connected part counted from the sacrum, the origin of the human-body coordinate system. The coordinate system takes the sacrum in the standard standing posture as origin, with the x-axis in the facing direction, the y-axis in the right-hand direction, and the z-axis vertically upward. Denoting the part spatial parameter as H(Ud, Vd), the calculation formula is as follows:
f(Ud, Vd) = f(xd, yd, zd) = f( f0(xd−1, yd−1, zd−1) + f0(Δxd, Δyd, Δzd) ), d ≥ 1
where (xd, yd, zd) is the coordinate of the current part in the human-body coordinate system D0;
(xd−1, yd−1, zd−1) is the coordinate of the previous part in D0;
xd = xd−1 + Δxd
yd = yd−1 + Δyd
zd = zd−1 + Δzd
f0(Δxd, Δyd, Δzd) denotes the baseline component of the coordinate of the current part d in D0, with the formula: f0(Δxd, Δyd, Δzd) = { Δxd = Ld × cos(ωd), Δyd = Ld × cos(εd), Δzd = Ld × cos(ξd) }, where Ld is the length of the part and (ωd, εd, ξd) are the deflection angles of part d against the three X, Y, Z axes in D0.
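A Python sketch of this chained computation: each link d contributes a baseline component of length Ld along its direction cosines, accumulated outward from the sacrum (all link lengths and angles passed in are illustrative):

    import math

    def baseline_component(L, omega, eps, xi):
        """f0: displacement of one link of length L whose direction angles
        against the X, Y, Z axes of the body frame D0 are (omega, eps, xi)."""
        return (L * math.cos(math.radians(omega)),
                L * math.cos(math.radians(eps)),
                L * math.cos(math.radians(xi)))

    def chain_position(links):
        """Accumulate link components outward from the sacrum (d = 0 origin).

        links: (L_d, omega_d, eps_d, xi_d) tuples ordered by connection
        dimension d = 1, 2, ... along one body system, e.g. sacrum ->
        vertebra prominens -> right shoulder -> right elbow -> right wrist."""
        x = y = z = 0.0
        for L, omega, eps, xi in links:
            dx, dy, dz = baseline_component(L, omega, eps, xi)
            x, y, z = x + dx, y + dy, z + dz
        return (x, y, z)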
f(Ud,Vd)=f (xd,yd,zd) contrast relationship is as shown in table 8 and table 9:
Table 8
V | z-axis coordinate
middle | −Ws/2 ≤ zd ≤ Ws/2
front | zd > Ws/2
rear | zd < −Ws/2
Table 9
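The lateral space direction V of Table 9 maps directly onto the coordinate; a minimal Python sketch (Ws is the "side width" defined above):

    def lateral_bin(zd, Ws):
        """Classify the part lateral space direction V per Table 9."""
        if zd > Ws / 2.0:
            return "front"
        if zd < -Ws / 2.0:
            return "rear"
        return "middle"        # -Ws/2 <= zd <= Ws/2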
In this embodiment, the method also includes step S200.
Step S200: pre-establish the action template library. The specific steps are as follows.
Obtain motion data of a standard athlete within a specified time range.
Extract the motion data of a standard action segment from the motion data of the standard athlete.
Analyze the motion data of the standard action segment to obtain the key part posture frame data of the different body parts in the standard action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement.
Establish a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, convert the key part posture frame data of each part in the standard action segment into spatial action parameters in the human-body coordinate system.
Save the motion data of the standard action segment, and take the set of the spatial action parameters of the main moving parts designated by the user in the standard action segment as the feature vector of the standard action segment.
For the specific methods of extracting and converting the motion data in the above steps, please refer to steps S110 to S140; they are not repeated here.
In this embodiment, corresponding standard action segments can be created for a variety of different action types selected by the user.
To reduce the influence of the motion data of unimportant moving parts on action-type identification, this embodiment takes the set of the spatial action parameters of the main moving parts designated by the user in the standard action segment as the feature vector of the standard action segment. For example, when the standard action segment is a squat, the left calf, left thigh, left hip bone, right hip bone, right thigh, and right calf are chosen as the main moving parts, and the spatial action parameters of these main moving parts form the feature vector of the standard action segment.
Step S150: use the set of the spatial action parameters of each part in the target action segment as the feature vector of the target action segment, and match a corresponding standard action segment in the preset action template library.
According to the main moving parts designated in the feature vector of a standard action segment, the spatial action parameters of the corresponding parts are extracted from the target action segment. For example, if the main moving parts designated in the standard action segment are the left calf, left thigh, left hip bone, right hip bone, right thigh, and right calf, the spatial action parameters of the corresponding parts are chosen from the feature vector of the target action segment.
The matching degree between the spatial action parameters extracted from the target action segment and the spatial action parameters of the standard action segment is calculated and evaluated. The higher the matching degree between the two, the greater the similarity between the target action segment and that standard action segment.
The standard action segment with the greatest matching degree to the target action segment is chosen as the standard action segment used for the motion-analysis comparison, and the part posture frame data of this most similar standard action segment are returned as the data to be compared with the target action segment.
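The patent does not fix a formula for the matching degree between the categorical spatial action parameters; the Python sketch below scores the fraction of parameters that agree on the template's designated main moving parts, which is one plausible reading, and then picks the best template:

    def match_degree(target_params, template_params, main_parts):
        """Fraction of categorical spatial action parameters that agree on
        the main moving parts (the scoring rule is an assumption).

        target_params / template_params: dict part -> dict of parameters,
        e.g. {"alpha": "forward", "U": "upper-left", "V": "front"}."""
        hits = total = 0
        for part in main_parts:
            target = target_params.get(part, {})
            for key, value in template_params.get(part, {}).items():
                total += 1
                hits += int(target.get(key) == value)
        return hits / total if total else 0.0

    def best_template(target_params, library):
        """library: name -> (template_params, main_parts); return the name
        of the standard action segment with the highest matching degree."""
        return max(library, key=lambda name: match_degree(target_params,
                                                          *library[name]))

The part posture frame data of the winning segment are then used as the comparison data of step S160, as described above.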
Step S160: compare the target action segment with the whole-body posture frame data of the standard action segment, and output the comparison result.
Further, in this embodiment, the comparison step may include:
Taking the time span of the standard action segment as reference, normalize the time span of the motion data of the target action segment so that the time span and frame count of the target action segment equal those of the standard action segment.
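A minimal Python sketch of this length normalization, implemented here as linear resampling with numpy (the patent only requires that the time span and frame count end up equal; the interpolation scheme is an assumption):

    import numpy as np

    def normalize_length(target_frames, n_std):
        """Linearly resample an (n_target, k) array of frame values to
        n_std frames so the target segment matches the standard segment."""
        target_frames = np.asarray(target_frames, dtype=float)
        src = np.linspace(0.0, 1.0, num=len(target_frames))
        dst = np.linspace(0.0, 1.0, num=n_std)
        return np.column_stack([np.interp(dst, src, col)
                                for col in target_frames.T])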
Compare the Euler angle data of each part posture frame of the target action segment and the standard action segment at the different time sampling points, and output the differences of the Euler angle data.
Specifically, please refer to Fig. 9, which shows a difference comparison interface. After normalization, the interface displays the curves of each part's Euler angles over time for the target action segment and the standard action segment. The degree of difference between the Euler angle curves of the two segments at the same time points is calculated and presented to the user through graphical statistics.
Further, in this embodiment, the comparison step may also include:
obtaining the key posture segment of the target action segment from the part action start frame data and the part action end frame data of each part in the target action segment;
obtaining the key posture segment of the standard action segment from the part action start frame data and the part action end frame data of each part in the standard action segment;
calculating the number of frames in the overlap time of the key posture segments of corresponding parts of the target action segment and the standard action segment;
calculating the ratio of the number of overlap frames to the total number of frames of the part's action time in the standard action segment to obtain the action synchronization ratio.
Specifically, please refer to Fig. 10, which shows a synchronization rate analysis interface. After normalization, the interface displays the time regions where the key posture segments of the target action overlap those of the standard action. For each part, the ratio of the number of overlap frames to the total number of frames of the standard key posture segment is calculated to obtain the action synchronization ratio, and the synchronization rates are displayed in the interface.
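A minimal Python sketch of the synchronization ratio for one part, with key posture segments given as inclusive (start, end) frame indices after normalization:

    def sync_rate(target_span, standard_span):
        """Action synchronization ratio for one part.

        target_span / standard_span: (start, end) frame indices, inclusive,
        of the part's key posture segment."""
        overlap = (min(target_span[1], standard_span[1])
                   - max(target_span[0], standard_span[0]) + 1)
        total = standard_span[1] - standard_span[0] + 1
        return max(0, overlap) / total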
Further, in this embodiment, the comparison step may also include:
calculating the movement speed of each part of the standard action segment, and obtaining the average and maximum of the movement speed of the standard action segment, the movement speed being characterized by the change of the Euler angles per unit time;
calculating the movement speeds in the target action segment corresponding to the designated main moving parts of the standard action segment, and obtaining the average and maximum of the movement speed of the target action segment;
displaying the movement-speed comparison result of the target action segment and the standard action segment.
Specifically, the time intervals of the whole-body posture frame data of the standard action segment and of the target action segment are handled at their actual time intervals. The Euler angles (θ, φ, ψ) are merged into a single attitude angle η, with the calculation formula:
The corresponding parts of the target action segment and the standard action segment are computed to obtain, for each part, the angular speed Si of the attitude angle η at every frame, with the formula:
Si = |ηi − ηi−1| / |ti − ti−1|
where i denotes the frame number and t the time.
The mean angular speed S̄ of each part's active interval is calculated separately, with the formula:
S̄ = (S1 + S2 + … + Sn) / n
where (1, n) are the frame numbers of the part's active interval.
Specifically, please refer to Fig. 11, which shows a speed comparison interface. The interface marks the time regions covered by the key posture segments and, in the time region covered by each key posture segment, identifies the mean angular speed S̄ and the highest angular speed Smax.
The angular speeds of all parts are added to obtain the composite angular speed of the frame:
SAn = Σm Sm,n
where m denotes the part and n the frame number.
The target action segment and the standard action segment are computed separately to obtain the maximum composite angular speed SAmax of each sequence, and the part and the value of the maximum composite angular speed SAmax are marked at the corresponding frame position.
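The speed comparison can be sketched end to end in Python as follows; note that the merging of (θ, φ, ψ) into η uses a Euclidean norm here, which is an assumption and not necessarily the patent's formula:

    import numpy as np

    def attitude_angle(theta, phi, psi):
        """Merge the Euler angles into one attitude angle eta.
        The Euclidean norm is an illustrative assumption."""
        return float(np.sqrt(theta ** 2 + phi ** 2 + psi ** 2))

    def angular_speeds(eta, t):
        """Per-frame angular speed S_i = |eta_i - eta_{i-1}| / |t_i - t_{i-1}|."""
        eta, t = np.asarray(eta, dtype=float), np.asarray(t, dtype=float)
        return np.abs(np.diff(eta)) / np.abs(np.diff(t))

    def composite_speed(per_part_speeds):
        """SA_n = sum over parts m of S_{m,n}; rows are parts, columns frames.
        Returns the per-frame composite speed and its maximum SA_max."""
        sa = np.asarray(per_part_speeds, dtype=float).sum(axis=0)
        return sa, float(sa.max())

The mean and maximum of the values returned by angular_speeds give S̄ and Smax for one part; composite_speed sums across parts per frame and yields SAmax.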
Further, in this embodiment, the method may also include:
choosing multiple target action segments matched to the same standard action segment;
comparing the whole-body posture frame data of the multiple target action segments, and outputting the comparison result.
Specifically, in this embodiment, the motion data of different target athletes can be compared with one another. When two different target action segments both match the same standard action segment, the two segments can be regarded as the same type of action, and their data can be compared with each other. For the specific comparison steps, please refer to the comparison of a target action segment with a standard action segment; they are not repeated here.
Please refer to Fig. 12. This embodiment also provides an exercise data analysis and comparison device 110, the device including:
a target motion acquisition module 111, configured to obtain motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position;
a target action segment extraction module 112, configured to extract the motion data of a target action segment from the motion data of the target athlete;
a target key posture frame analysis module 113, configured to analyze the motion data of the target action segment and obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
a target motion data conversion module 114, configured to establish a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, convert the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system;
an action matching module 115, configured to use the set of the spatial action parameters of each part in the target action segment as the feature vector of the target action segment and match a corresponding standard action segment in a preset action template library;
an action comparison module 116, configured to compare the whole-body posture frame data of the target action segment with those of the standard action segment and output the comparison result.
It should be understood that the above descriptions and drawings of the display interfaces do not constitute a limitation on the display interfaces; in different embodiments, the number and layout of the function windows included in a display interface can differ.
In conclusion the present invention one provides a kind of exercise data analyses and comparison method and device, mesh is extracted by analysis The target action segment in sporter's exercise data is marked, the key position obtained in the target action segment is calculated Posture frame data, and spatial displacements parameter is converted by the key position posture frame data in conjunction with the human space data. By the spatial displacements parameter, matching criteria acts segment in deliberate action standard operation library, and is compared.Such as This, by the key operations data in exercise data converts that calculation amount is small and the intuitive spatial displacements parameter of expression acts Matching, effectively increases the efficiency of Data Matching.
The foregoing is only a preferred embodiment of the present invention, is not intended to restrict the invention, for the skill of this field For art personnel, the invention may be variously modified and varied.All within the spirits and principles of the present invention, made any to repair Change, equivalent replacement, improvement etc., should all be included in the protection scope of the present invention.

Claims (10)

1. An exercise data analysis and comparison method, characterized in that the method comprises:
obtaining motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position;
extracting the motion data of a target action segment from the motion data of the target athlete;
analyzing the motion data of the target action segment to obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
establishing a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, converting the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system;
using the set of the part posture frame data of each part in the target action segment as the feature vector of the target action segment, and matching a corresponding standard action segment in a preset action template library;
comparing the part posture frame data of each part of the target action segment with those of the standard action segment, and outputting the comparison result.
2. The method according to claim 1, characterized in that:
the part posture frame data include the Euler angle data of each body part, and the part posture frame data and the body spatial data are obtained by calculation from the three-axis acceleration and/or three-axis magnetic declination collected by sensing devices arranged at the joints of the human body.
3. The method according to claim 1, characterized in that the step of extracting the motion data of the target action segment from the motion data of the target athlete comprises:
comparing, in order, consecutive first, second, and third whole-body posture frame data among the multiple whole-body posture frame data of the target athlete's motion data;
when the part posture frame data of all corresponding parts are identical between the first and second whole-body posture frame data, and the part posture frame data of any corresponding part differ between the second and third whole-body posture frame data, taking the frame corresponding to the second whole-body posture frame data as the overall action start frame;
comparing, in order, consecutive fourth, fifth, and sixth whole-body posture frame data among the multiple whole-body posture frame data of the target athlete's motion data;
when the part posture frame data of any corresponding part differ between the fourth and fifth whole-body posture frame data, and the part posture frame data of all corresponding parts are identical between the fifth and sixth whole-body posture frame data, taking the frame corresponding to the fifth whole-body posture frame data as the overall action end frame;
extracting the whole-body posture frame data between the overall action start frame and the overall action end frame as the motion data of the target action segment.
4. The method according to claim 1, characterized in that the method further comprises:
pre-establishing the action template library,
wherein the step of pre-establishing the action template library comprises:
obtaining motion data of a standard athlete within a specified time range;
extracting the motion data of a standard action segment from the motion data of the standard athlete;
analyzing the motion data of the standard action segment to obtain the key part posture frame data of the different body parts in the standard action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
establishing a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, converting the key part posture frame data of each part in the standard action segment into spatial action parameters in the human-body coordinate system;
saving the motion data of the standard action segment, and taking the set of the spatial action parameters of the main moving parts designated by the user in the standard action segment as the feature vector of the standard action segment.
5. The method according to claim 4, characterized in that the step of using the set of the spatial action parameters of each part in the target action segment as the feature vector of the target action segment and matching a corresponding standard action segment in the preset action template library comprises:
extracting the spatial action parameters of the corresponding parts in the target action segment according to the main moving parts designated in the feature vector of the standard action segment;
calculating and evaluating the matching degree between the spatial action parameters extracted from the target action segment and the spatial action parameters of the standard action segment;
choosing the standard action segment with the greatest matching degree to the target action segment as the standard action segment used for the motion-analysis comparison.
6. The method according to claim 5, characterized in that the step of comparing the part posture frame data of each part of the target action segment with those of the standard action segment and outputting the comparison result comprises:
taking the time span of the standard action segment as reference, normalizing the time span of the motion data of the target action segment so that the time span and frame count of the target action segment equal those of the standard action segment;
comparing the Euler angle data of each part posture frame of the target action segment and the standard action segment at the different time sampling points, and outputting respectively the differences of the extension angle θ, the rotation angle φ, and the spin angle ψ in the Euler angle data.
7. The method according to claim 6, characterized in that the step of comparing the part posture frame data of each part of the target action segment with those of the standard action segment and outputting the comparison result further comprises:
obtaining the key posture segment of the target action segment from the part action start frame data and the part action end frame data of each part in the target action segment;
obtaining the key posture segment of the standard action segment from the part action start frame data and the part action end frame data of each part in the standard action segment;
calculating the number of frames in the overlap time of the key posture segments of corresponding parts of the target action segment and the standard action segment;
calculating the ratio of the number of overlap frames to the total number of frames of the part's action time in the standard action segment to obtain the action synchronization ratio.
8. The method according to claim 7, characterized in that the step of comparing the part posture frame data of each part of the target action segment with those of the standard action segment and outputting the comparison result further comprises:
calculating the movement speed of each part of the standard action segment, and obtaining the average and maximum of the movement speed of the standard action segment, the movement speed being characterized by the change of the Euler angles per unit time;
calculating the movement speeds in the target action segment corresponding to the designated main moving parts of the standard action segment, and obtaining the average and maximum of the movement speed of the target action segment;
displaying the movement-speed comparison result of the target action segment and the standard action segment.
9. The method according to claim 1, characterized in that the method further comprises:
choosing multiple target action segments matched to the same standard action segment;
comparing the whole-body posture frame data of the multiple target action segments, and outputting the comparison result.
10. An exercise data analysis and comparison device, characterized in that the device comprises:
a target motion acquisition module, configured to obtain motion data of a target athlete within a specified time range, wherein the motion data include whole-body posture frame data at multiple time sampling points within the specified time range, the whole-body posture frame data include the part posture frame data of the different body parts of the target athlete and body spatial data, and the body spatial data include body steering data and a body spatial position;
a target action segment extraction module, configured to extract the motion data of a target action segment from the motion data of the target athlete;
a target key posture frame analysis module, configured to analyze the motion data of the target action segment and obtain the key part posture frame data of the different body parts in the target action segment, wherein the key part posture frame data include, for a single body part, the part action start frame data at the start of its movement and the part action end frame data at the end of its movement;
a target motion data conversion module, configured to establish a human-body coordinate system centered on a preset body part and, in combination with the body spatial data, convert the key part posture frame data of each part in the target action segment into spatial action parameters in the human-body coordinate system;
an action matching module, configured to use the set of the spatial action parameters of each part in the target action segment as the feature vector of the target action segment, and match a corresponding standard action segment in a preset action template library;
an action comparison module, configured to compare the whole-body posture frame data of the target action segment with those of the standard action segment, and output the comparison result.
CN201610714829.7A 2016-08-24 2016-08-24 Exercise data analysis and comparison method and device Active CN106295616B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610714829.7A CN106295616B (en) 2016-08-24 2016-08-24 Exercise data analysis and comparison method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610714829.7A CN106295616B (en) 2016-08-24 2016-08-24 Exercise data analysis and comparison method and device

Publications (2)

Publication Number Publication Date
CN106295616A CN106295616A (en) 2017-01-04
CN106295616B (en) 2019-04-30

Family

ID=57615103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610714829.7A Active CN106295616B (en) 2016-08-24 2016-08-24 Exercise data analyses and comparison method and device

Country Status (1)

Country Link
CN (1) CN106295616B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108392207B (en) * 2018-02-09 2020-12-11 西北大学 Gesture tag-based action recognition method
CN110516516A (en) * 2018-05-22 2019-11-29 北京京东尚科信息技术有限公司 Robot pose measurement method and device, electronic equipment, storage medium
CN111080589A (en) * 2019-12-05 2020-04-28 广州极泽科技有限公司 Target object matching method, system, device and machine readable medium
CN111091889A (en) * 2019-12-12 2020-05-01 深圳英鸿骏智能科技有限公司 Human body form detection method based on mirror surface display, storage medium and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894278A (en) * 2010-07-16 2010-11-24 西安电子科技大学 Human motion tracing method based on variable structure multi-model
CN103713739A (en) * 2013-12-25 2014-04-09 北京握奇数据***有限公司 Movement data acquisition processing method and system
CN105068654A (en) * 2015-08-14 2015-11-18 济南中景电子科技有限公司 Motion capturing system and method based on CAN bus and inertial sensor
WO2016025460A1 (en) * 2014-08-11 2016-02-18 Icuemotion, Llc Codification and cueing system for human interactions in tennis and other sport and vocational activities

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8284990B2 (en) * 2008-05-21 2012-10-09 Honeywell International Inc. Social network construction based on data association

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894278A (en) * 2010-07-16 2010-11-24 西安电子科技大学 Human motion tracing method based on variable structure multi-model
CN103713739A (en) * 2013-12-25 2014-04-09 北京握奇数据***有限公司 Movement data acquisition processing method and system
WO2016025460A1 (en) * 2014-08-11 2016-02-18 Icuemotion, Llc Codification and cueing system for human interactions in tennis and other sport and vocational activities
CN105068654A (en) * 2015-08-14 2015-11-18 济南中景电子科技有限公司 Motion capturing system and method based on CAN bus and inertial sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on modeling and reuse of human motion capture data (人体运动捕获数据的建模与重用研究); Lan Rongyi (蓝荣祎); China Doctoral Dissertations Full-text Database, Information Science and Technology; 2015-06-15; I138-65

Also Published As

Publication number Publication date
CN106295616A (en) 2017-01-04

Similar Documents

Publication Publication Date Title
CN106295616B (en) Exercise data analysis and comparison method and device
CN111144217B (en) Motion evaluation method based on human body three-dimensional joint point detection
CN103839040B (en) Gesture identification method and device based on depth image
CN106650687B (en) Posture correction method based on depth information and skeleton information
CN102567703B (en) Hand motion identification information processing method based on classification characteristic
US8254627B2 (en) Method for automatically following hand movements in an image sequence
CN109299659A (en) A kind of human posture recognition method and system based on RGB camera and deep learning
US8755569B2 (en) Methods for recognizing pose and action of articulated objects with collection of planes in motion
CN105664462A (en) Auxiliary training system based on human body posture estimation algorithm
CN107169411B (en) A kind of real-time dynamic gesture identification method based on key frame and boundary constraint DTW
CN104200200B (en) Fusion depth information and half-tone information realize the system and method for Gait Recognition
CN108305283A Human body behaviour recognition method and device based on depth camera and basic form
CN111127446B (en) Gait analysis-oriented plantar pressure image partitioning method
CN106778510B (en) Method for matching high-rise building characteristic points in ultrahigh-resolution remote sensing image
CN113065505A (en) Body action rapid identification method and system
CN103955680A (en) Action recognition method and device based on shape context
Shin et al. Hand region extraction and gesture recognition using entropy analysis
CN107132915A (en) A kind of brain-machine interface method based on dynamic brain function network connection
Yin et al. Estimation of the fundamental matrix from uncalibrated stereo hand images for 3D hand gesture recognition
CN103186241B A kind of interactive desktop touch left-hand and right-hand recognition method
CN102156994A (en) Joint positioning method of single-view unmarked human motion tracking
Zhou et al. Finger vein recognition based on stable and discriminative superpixels
CN109255293A Model's catwalk evaluation method based on computer vision
CN113282164A (en) Processing method and device
Kim et al. Hand shape recognition using fingertips

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant