CN110354480A - A golf swing scoring estimation method based on posture comparison - Google Patents

A golf swing scoring estimation method based on posture comparison Download PDF

Info

Publication number
CN110354480A
CN110354480A
Authority
CN
China
Prior art keywords
frame
video
user
stage
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910683497.4A
Other languages
Chinese (zh)
Other versions
CN110354480B (en)
Inventor
刘峰
陈静静
干宗良
高嘉轩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Post and Telecommunication University
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University filed Critical Nanjing Post and Telecommunication University
Priority to CN201910683497.4A priority Critical patent/CN110354480B/en
Publication of CN110354480A publication Critical patent/CN110354480A/en
Application granted granted Critical
Publication of CN110354480B publication Critical patent/CN110354480B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/36 Training appliances or apparatus for special sports for golf
    • A63B69/3667 Golf stance aids, e.g. means for positioning a golfer's feet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647 Visualisation of executed movements
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2102/00 Application of clubs, bats, rackets or the like to the sporting activity; particular sports involving the use of balls and clubs, bats, rackets, or the like
    • A63B2102/32 Golf
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/05 Image processing for measuring physical parameters
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Physical Education & Sports Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a golf swing scoring estimation method based on posture comparison, comprising: registering a template video with a user video; extracting human action features from the user video; comparing the extracted human-body features of the user video with the swing action of the template video, and weighting the comparison results of the different features to produce a score. The golf swing scoring estimation method based on posture comparison provided by the invention can quickly and accurately compute a golf swing score whether the background is simple or the scene is complex, and has good prospects for popularization.

Description

A golf swing scoring estimation method based on posture comparison
Technical field
The invention belongs to the fields of machine vision and image processing technology, and in particular relates to a golf swing scoring estimation method based on posture comparison.
Background technique
In golf, the change of human body posture is a key factor determining the hitting effect; therefore, the analysis of human motion posture is an important component of a golf training system. Existing golf training systems can correct a user's actions, but they cannot provide a reasonable scoring estimate of those actions; the user can only judge whether his or her swing has improved after training from the number of erroneous actions. However, although certain joint actions contain errors, their influence on the hitting effect may be limited; it is therefore unreasonable to judge the quality of one's swing by the number of erroneous actions.
Summary of the invention
The purpose of the present invention is to provide a golf swing scoring estimation method based on posture comparison, so as to solve the problem of reasonably assessing the user's actions.
To solve the above problems, the invention adopts the following technical scheme:
A golf swing scoring estimation method based on posture comparison, comprising the following steps:
1) registering the template video and the user video: using club slope information, extracting frames from the user video that correspond one-to-one with frames of the template video;
2) extracting human action features: using human joint point coordinates, extracting several human limb-segment angles as human action features;
3) comparing the extracted human-body features of the user video with the swing action of the template video, and weighting the comparison results of the different features to obtain a score.
Further, in step 1), template video frames and user video frames are matched using club slope information, specifically comprising the following steps:
(THCx_i, THCy_i) are the coordinates of a point on the club near the hands in the i-th frame of the template video, and (TDCx_i, TDCy_i) are the coordinates of a point on the club near the club head in the i-th frame of the template video; from these two points, the angle TCρ_i formed by the club in the i-th frame of the template video with the horizontal direction in the Cartesian coordinate system is calculated, and the range of TCρ_i is [0, 180);
(UHCx_j, UHCy_j) are the coordinates of a point on the club near the hands in the j-th frame of the user video, and (UDCx_j, UDCy_j) are the coordinates of a point on the club near the club head in the j-th frame of the user video; from these two points, the angle UCρ_j formed by the club in the j-th frame of the user video with the horizontal direction in the Cartesian coordinate system is calculated, and the range of UCρ_j is [0, 180);
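As an illustration of the club-angle computation described above, the sketch below derives an angle in [0, 180) from the two club points of a frame. The patent's displayed formula is not reproduced in this text, so the arctangent-based computation and the function name are assumptions for illustration only.

```python
import math

def club_angle(hand_pt, head_pt):
    """Angle (in degrees) between the club and the horizontal axis, folded into [0, 180).

    hand_pt: (x, y) of the point on the club near the hands, e.g. (THCx_i, THCy_i)
    head_pt: (x, y) of the point on the club near the club head, e.g. (TDCx_i, TDCy_i)
    """
    dx = hand_pt[0] - head_pt[0]
    dy = hand_pt[1] - head_pt[1]
    # atan2 returns (-180, 180]; the fold keeps only the line's inclination in [0, 180)
    return math.degrees(math.atan2(dy, dx)) % 180.0
```

Under this reading, TCρ_i = club_angle((THCx_i, THCy_i), (TDCx_i, TDCy_i)) for the template video, and UCρ_j is obtained analogously for the user video.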
A video of one complete swing is divided into three segments: the upswing stage video (from the start frame to the end of the upswing), the downswing stage video (from the first frame after the end of the upswing to the ball-striking frame), and the follow-through stage video (from the first frame after ball striking to the final frame). The matching method of each stage video is the same, and the video of each of the three stages is referred to as a template stage video. The template video is one complete video containing the upswing, downswing and follow-through stages, whereas a template stage video is an already-segmented video and may be any one of the upswing, downswing or follow-through stage videos. The video frame matching process is divided into three main steps, as follows:
(1) Find the user stage video frame that best matches the 1st frame of the template stage video:
Assume that I is the total frame length of the template stage video and J is the total frame length of the user stage video. First, club direction information is used to judge whether the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are consistent. When the judgment condition THCy_1 < TDCy_1 && UHCy_1 < UDCy_1 or THCy_1 > TDCy_1 && UHCy_1 > UDCy_1 is satisfied, the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions; otherwise, their directions are inconsistent. Here THCy_1 is the y-coordinate of the point on the club near the hands in the 1st frame of the template stage video, TDCy_1 is the y-coordinate of the point on the club near the club head in the 1st frame of the template stage video, UHCy_1 is the y-coordinate of the point on the club near the hands in the 1st frame of the user stage video, and UDCy_1 is the y-coordinate of the point on the club near the club head in the 1st frame of the user stage video. If the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions, assume the club direction in the 1st frame of the template stage video satisfies THCy_1 < TDCy_1; using the club distance formula between a template stage video frame and a user stage video frame, calculate the club distance between the 1st frame of the template stage video and each frame of the user stage video, where the user stage video frames participating in the calculation satisfy UHCy_j < UDCy_j. This yields ΔCρ_{1,1}, ΔCρ_{1,2}, …, ΔCρ_{1,n}, the club distances between the 1st frame of the template stage video and the 1st to n-th frames of the user stage video. Then find, from this group of data, the user stage video frame MinP corresponding to the smallest club distance; the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video. The club distance formula is as follows:
ΔCρ_{i,j} = min[ abs(TCρ_i - UCρ_j), 180 - abs(TCρ_i - UCρ_j) ]
If the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are inconsistent, assume the direction of the 1st frame of the template stage video satisfies THCy_1 < TDCy_1; then discard the user stage video frames whose direction is inconsistent with that of the template stage video, and begin calculating the club distance in the manner described above only from the user stage video frames whose direction is consistent with the 1st frame of the template stage video. This yields ΔCρ_{1,m}, ΔCρ_{1,m+1}, …, ΔCρ_{1,n}, the club distances between the 1st frame of the template stage video and the m-th to n-th frames of the user stage video. Then find, from this group of data, the user stage video frame MinP corresponding to the smallest club distance; the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video.
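A minimal sketch of step (1), assuming the per-frame club endpoints and angles are available as a list of dictionaries; club_distance implements the ΔCρ formula above, and the direction test mirrors the THCy/TDCy comparison. The helper names are illustrative, not the patent's.

```python
def club_distance(t_rho, u_rho):
    """Club distance: min(|TCrho - UCrho|, 180 - |TCrho - UCrho|)."""
    d = abs(t_rho - u_rho)
    return min(d, 180.0 - d)

def find_minp(template_frames, user_frames):
    """Return MinP, the index of the user stage frame best matching template stage frame 1.

    Each frame is a dict with 'hand' and 'head' (x, y) points and 'rho' (club angle in [0, 180)).
    """
    t0 = template_frames[0]
    t_dir_up = t0['hand'][1] < t0['head'][1]      # club direction of template frame 1
    best_j, best_d = None, float('inf')
    for j, u in enumerate(user_frames):
        if (u['hand'][1] < u['head'][1]) != t_dir_up:
            continue                              # discard frames with an inconsistent club direction
        d = club_distance(t0['rho'], u['rho'])
        if d < best_d:
            best_j, best_d = j, d
    return best_j
```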
(2) Calculate the distance matrix between the template stage video and the user stage video, of size I*J.
From step (1), the MinP-th frame of the user stage video matches the 1st frame of the template stage video, so frames 1 to MinP-1 of the user stage video can be considered unable to match the 1st frame of the template stage video. In practice, the club distance between any two videos can never be exactly the same, i.e. the distance ΔCρ_{i,j} > 0 between the template stage video and the user stage video always holds; therefore, for template stage video frames and user stage video frames that cannot be matched, the distance is set to ΔCρ_{i,j} = 0. Distance is the distance matrix between the template stage video and the user stage video, and dis_{i,j} is an element of Distance; each element is the club distance ΔCρ_{i,j} between the i-th frame of the template stage video and the j-th frame of the user stage video.
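A sketch of the I*J distance matrix as read from this paragraph: entries hold the club distance ΔCρ between template frame i and user frame j, while pairs that cannot be matched are set to 0. Only the constraint that the first MinP-1 user frames cannot match the 1st template frame is encoded here, which is an assumption about how the stated convention is applied.

```python
def distance_matrix(template_frames, user_frames, min_p):
    """Build the I x J distance matrix 'Distance' with elements dis[i][j]."""
    I, J = len(template_frames), len(user_frames)
    dis = [[0.0] * J for _ in range(I)]
    for i in range(I):
        for j in range(J):
            if i == 0 and j < min_p:
                dis[i][j] = 0.0    # pairs that cannot be matched keep distance 0
            else:
                dis[i][j] = club_distance(template_frames[i]['rho'],
                                          user_frames[j]['rho'])
    return dis
```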
(3) Calculate the local accumulated distance values between the template stage video and the user stage video, and use the local accumulated distance values to find the user stage video frame j that best matches the i-th frame of the template stage video:
cumuDis_{i,j} is the cumulative distance value between the i-th frame of the template stage video and the j-th frame of the user stage video; the cumulative distance value of the next position (p, q) is determined by the local accumulated distance formula, where the values of p and q are determined by min[dis_{i,j+1}, dis_{i+1,j+1}, dis_{i+1,j}]: when the minimum is dis_{i,j+1}, then p = i and q = j+1; when the minimum is dis_{i+1,j+1}, then p = i+1 and q = j+1; when the minimum is dis_{i+1,j}, then p = i+1 and q = j.
Take the i-th frame of the template stage video and the j-th frame of the user stage video as the initial position, where i and j satisfy i < I and j < J, and at this point cumuDis_{i,j} = dis_{i,j}. Using the local accumulated distance formula, obtain cumuDis_{i,j+1}, …, cumuDis_{i,j+k}, which are the cumulative distance values between the i-th frame of the template stage video and frames j+1 to j+k of the user stage video. When the formula yields cumuDis_{i+1,j+k+1} or cumuDis_{i+1,j+k} as the next cumulative distance value, stop calculating further cumulative distance values. Then, using the distance matrix Distance, take dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, i.e. the distance values between the i-th frame of the template stage video and frames j to j+k of the user stage video, and find the user stage video frame index of the smallest distance value among dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}. Assuming dis_{i,min} is the minimum of dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, then min is the user stage video frame index of the smallest distance value, and the i-th frame of the template stage video is considered to match the min-th frame of the user stage video.
Considering that a golf swing is one continuous and complete swinging process and that the club trajectory of anyone's swing is basically the same, once the i-th frame of the template stage video matches the min-th frame of the user stage video, the (i+1)-th frame of the template stage video cannot also match the min-th frame of the user stage video. Therefore, to find the user stage video frame that matches the (i+1)-th frame of the template stage video, the calculation starts from the (min+1)-th frame of the user stage video. In this way, the user stage video frame j that best matches each i-th frame of the template stage video can be found; the matched template video and user video are then used to compare the human-body features in the videos by action, and the comparison results of the different features are weighted to produce a score.
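The sketch below follows the stepwise matching logic of step (3) as described: starting from a matched position, the local rule is applied along the current template row until it would move to the next row, the smallest plain distance in the scanned span selects the matching user frame, and the search for the next template frame resumes at min+1. The exact local accumulation formula is not reproduced in this text, so carrying the path forward by the minimum of the three candidate distances is a reconstruction and should be treated as an assumption.

```python
def match_frames(dis, start=0):
    """Match each template stage frame to a user stage frame using the local rule of step (3).

    dis: I x J distance matrix from distance_matrix(); start: for the first template frame,
    this would be MinP from step (1). Returns match[i] = user frame index matched to frame i.
    """
    I, J = len(dis), len(dis[0])
    match = []
    j = start
    for i in range(I):
        k = 0
        # extend along row i while the local rule keeps the path in this row
        while i + 1 < I and j + k + 1 < J:
            candidates = {
                (i, j + k + 1): dis[i][j + k + 1],
                (i + 1, j + k + 1): dis[i + 1][j + k + 1],
                (i + 1, j + k): dis[i + 1][j + k],
            }
            p, _ = min(candidates, key=candidates.get)
            if p == i + 1:        # the path moves to the next template frame: stop extending
                break
            k += 1
        span = range(j, min(j + k + 1, J))
        best = min(span, key=lambda jj: dis[i][jj])   # smallest distance in the scanned span
        match.append(best)
        j = min(best + 1, J - 1)                      # next template frame resumes after 'min'
    return match
```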
Let M be the total number of frames of the complete template video and N the total number of frames of the whole user video. The registration algorithm extracts M user video frames in one-to-one correspondence with the template video frames; for both the template video frames and the user video frames, 14 joint points are used and 13 human limb-segment angles are extracted.
S_t is the assessment score of the t-th frame among the M extracted user video frames. In its calculation, n is the number of limb-segment angles, with a value of 13; C is the maximum limb-segment angle value, which in a golf swing is 180, since the range of a limb-segment angle in the Cartesian coordinate system is [0, 180]; the angle difference term is the difference of the i-th limb-segment angle between the user video and the template video in the t-th frame; and W_i is the weight of the i-th limb-segment angle, computed from the weights W_i^T and W_i^U defined below.
W_i^T is the time-accumulated limb-segment angle weight of the template video action, and W_i^U is the time-accumulated limb-segment angle weight of the user action. Both weights W_i^T and W_i^U are obtained from the video actions through the time-accumulated limb-segment angle weight formula, in which t_c is the current frame of the video sequence, n is the number of limb-segment angles (13), the calculation flag of the i-th limb-segment angle in the current frame is jointly determined by the template video frame and the user video frame, and the inter-frame accumulated change of the i-th limb-segment angle in the video up to the current frame is used.
The calculation flag of the i-th limb-segment angle is determined as follows. Since a limb-segment angle is determined by the line between two human joint points, when some joint point coordinate in the template video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current template video frame are set to 0; conversely, when the two joint points related to a limb-segment angle in the template video can be obtained accurately, the flag is set to 1. Similarly, when some joint point coordinate in the user video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current user video frame are set to 0, and conversely the flag is set to 1.
The inter-frame accumulated change of the i-th limb-segment angle up to the current frame t_c is computed from the angle of the i-th limb-segment angle in the current frame of the video and the angle of the i-th limb-segment angle in frame t_c - 1 - k of the video, where k indicates that from frame t_c - 1 - k to frame t_c the calculation flags of the limb-segment angle are set to 0 for k consecutive frames.
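Because the displayed formulas for S_t, W_i, the calculation flags, and the accumulated change are not reproduced in this text, the sketch below is only one plausible reading of the quantities defined above: flags zero out angles whose joints were not detected, per-angle weights are formed from the template and user accumulated changes and normalized, and the per-frame score falls as the weighted angle differences grow relative to C = 180. Every formula in this block is an assumption, not the patent's exact expression.

```python
C = 180.0       # maximum limb-segment angle value
N_ANGLES = 13   # number of limb-segment angles

def accumulated_change(angles, flags, t_c):
    """Inter-frame accumulated change of each limb-segment angle up to frame t_c.

    angles: per-frame lists of 13 angles; flags: same shape, 1 if the angle is computable
    in that frame (template flag and user flag both available), 0 otherwise.
    """
    acc = [0.0] * N_ANGLES
    for i in range(N_ANGLES):
        prev = None
        for t in range(t_c + 1):
            if flags[t][i] == 0:
                continue                              # skip frames whose calculation flag is 0
            if prev is not None:
                acc[i] += abs(angles[t][i] - angles[prev][i])
            prev = t
    return acc

def weights(template_acc, user_acc, flags_tc):
    """Per-angle weights from the template and user accumulated changes (averaged, then normalized)."""
    raw = [flags_tc[i] * (template_acc[i] + user_acc[i]) / 2.0 for i in range(N_ANGLES)]
    total = sum(raw) or 1.0
    return [r / total for r in raw]

def frame_score(template_angles_t, user_angles_t, w):
    """Assessment score S_t of one frame: weighted closeness of the 13 limb-segment angles, scaled to 100."""
    return 100.0 * sum(w[i] * (1.0 - abs(template_angles_t[i] - user_angles_t[i]) / C)
                       for i in range(N_ANGLES))
```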
S is the comprehensive assessment score of the user video, obtained by weighting the per-frame assessment scores, where M is the total number of frames of the complete template video, N is the total number of frames of the whole user video, and w_t is the weight of the assessment score S_t of the t-th frame among the M user video frames; the M user video frames are the M frames extracted from the N frames of the whole user video, and w_t = 1/M.
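The comprehensive score is described as a weighted combination of the per-frame scores with w_t = 1/M, i.e. their average over the M matched frames; a direct sketch of that reading:

```python
def overall_score(frame_scores):
    """Comprehensive assessment score S: mean of the M per-frame scores S_t (w_t = 1/M)."""
    M = len(frame_scores)
    return sum(frame_scores) / M if M else 0.0
```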
Compared with the prior art, the advantageous effects of the invention are:
1. The registration algorithm used in the present invention produces a one-to-one correspondence between template video frames and user video frames. It avoids the situation in existing time warping algorithms where one of the template video or the user video is stretched or compressed, which can finally leave one template video frame corresponding to multiple user video frames, or multiple template video frames corresponding to one user video frame; the algorithm is convenient and simple;
2. The method can effectively carry out single-frame assessment and comprehensive assessment of the golf swing in the user video; it also has good practicability for matching other professional motions and good prospects for popularization.
Description of the drawings
Fig. 1 is a flow diagram of a specific embodiment of the method of the present invention;
Fig. 2 is a schematic diagram of the human joint points of the invention.
Specific embodiment
The invention will be further described below with reference to the accompanying drawings. The following embodiment is only used to illustrate the invention more clearly and does not limit the protection scope of the invention.
Embodiment 1
The application scenario of this embodiment is a person swinging an iron-type golf club to hit a golf ball, and scoring estimation is performed on the golf swing.
A golf swing scoring estimation method based on posture comparison, comprising the following steps:
1) registering the template video and the user video: using club slope information, extracting frames from the user video that correspond one-to-one with frames of the template video;
2) extracting human action features: according to human joint point coordinates, extracting several (13) human limb-segment angles as human action features;
3) comparing the extracted human-body features of the user video with the swing action of the template video, and weighting the comparison results of the different features to obtain a score.
Further, in step 1), template video frames and user video frames are matched using club slope information, specifically comprising the following steps:
(THCx_i, THCy_i) are the coordinates of a point on the club near the hands in the i-th frame of the template video, and (TDCx_i, TDCy_i) are the coordinates of a point on the club near the club head in the i-th frame of the template video; from these two points TCρ_i is calculated, the angle formed by the club in the i-th frame of the template video with the horizontal direction in the Cartesian coordinate system, with range [0, 180).
(UHCx_j, UHCy_j) are the coordinates of a point on the club near the hands in the j-th frame of the user video, and (UDCx_j, UDCy_j) are the coordinates of a point on the club near the club head in the j-th frame of the user video; from these two points UCρ_j is calculated, the angle formed by the club in the j-th frame of the user video with the horizontal direction in the Cartesian coordinate system, with range [0, 180).
A video of one complete swing is divided into three segments: the upswing stage video (from the start frame to the end of the upswing), the downswing stage video (from the first frame after the end of the upswing to the ball-striking frame), and the follow-through stage video (from the first frame after ball striking to the final frame). The matching method of each stage video is the same. The video frame matching process is divided into three main steps, as follows:
(1) Find the user stage video frame that best matches the 1st frame of the template stage video.
Assume that I is the total frame length of the template stage video and J is the total frame length of the user stage video. First, club direction information is used to judge whether the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are consistent. When the judgment condition THCy_1 < TDCy_1 && UHCy_1 < UDCy_1 or THCy_1 > TDCy_1 && UHCy_1 > UDCy_1 is satisfied, the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions; otherwise, their directions are inconsistent. If the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions, assume the club direction in the 1st frame of the template stage video satisfies THCy_1 < TDCy_1; using the club distance formula between a template stage video frame and a user stage video frame, calculate the club distance between the 1st frame of the template stage video and each frame of the user stage video, where the user stage video frames participating in the calculation satisfy UHCy_j < UDCy_j. This yields ΔCρ_{1,1}, ΔCρ_{1,2}, …, ΔCρ_{1,n}, the club distances between the 1st frame of the template stage video and the 1st to n-th frames of the user stage video. Then find, from this group of data, the user stage video frame MinP corresponding to the smallest club distance; the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video. The club distance formula is as follows:
ΔCρ_{i,j} = min[ abs(TCρ_i - UCρ_j), 180 - abs(TCρ_i - UCρ_j) ]
If the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are inconsistent, assume the direction of the 1st frame of the template stage video satisfies THCy_1 < TDCy_1; then discard the user stage video frames whose direction is inconsistent with that of the template stage video, and begin calculating the club distance in the manner described above only from the user stage video frames whose direction is consistent with the 1st frame of the template stage video. This yields ΔCρ_{1,m}, ΔCρ_{1,m+1}, …, ΔCρ_{1,n}, the club distances between the 1st frame of the template stage video and the m-th to n-th frames of the user stage video. Then find, from this group of data, the user stage video frame MinP corresponding to the smallest club distance; the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video.
(2) Calculate the distance matrix between the template stage video and the user stage video, of size I*J:
From step (1), the MinP-th frame of the user stage video matches the 1st frame of the template stage video, so frames 1 to MinP-1 of the user stage video can be considered unable to match the 1st frame of the template stage video. In practice, the club distance between any two videos can never be exactly the same, i.e. ΔCρ_{i,j} > 0 always holds; therefore, for template stage video frames and user stage video frames that cannot be matched, the distance is set to ΔCρ_{i,j} = 0. Distance is the distance matrix between the template stage video and the user stage video, and dis_{i,j} is an element of Distance; each element is the club distance ΔCρ_{i,j} between the i-th frame of the template stage video and the j-th frame of the user stage video.
(3) Calculate the local accumulated distance values between the template stage video and the user stage video, and use the local accumulated distance values to find the user stage video frame j that best matches the i-th frame of the template stage video.
cumuDis_{i,j} is the cumulative distance value between the i-th frame of the template stage video and the j-th frame of the user stage video; the cumulative distance value of the next position (p, q) is determined by the local accumulated distance formula, where the values of p and q are determined by min[dis_{i,j+1}, dis_{i+1,j+1}, dis_{i+1,j}]: when the minimum is dis_{i,j+1}, then p = i and q = j+1; when the minimum is dis_{i+1,j+1}, then p = i+1 and q = j+1; when the minimum is dis_{i+1,j}, then p = i+1 and q = j.
Take the i-th frame of the template stage video and the j-th frame of the user stage video as the initial position, where i and j satisfy i < I and j < J, and at this point cumuDis_{i,j} = dis_{i,j}. Using the local accumulated distance formula, obtain cumuDis_{i,j+1}, …, cumuDis_{i,j+k}, which are the cumulative distance values between the i-th frame of the template stage video and frames j+1 to j+k of the user stage video. When the formula yields cumuDis_{i+1,j+k+1} or cumuDis_{i+1,j+k} as the next cumulative distance value, stop calculating further cumulative distance values. Then, using the distance matrix Distance, take dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, i.e. the distance values between the i-th frame of the template stage video and frames j to j+k of the user stage video, and find the user stage video frame index of the smallest distance value among dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}. Assuming dis_{i,min} is the minimum of dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, then min is the user stage video frame index of the smallest distance value, and the i-th frame of the template stage video is considered to match the min-th frame of the user stage video.
Considering that a golf swing is one continuous and complete swinging process and that the club trajectory of anyone's swing is basically the same, once the i-th frame of the template stage video matches the min-th frame of the user stage video, the (i+1)-th frame of the template stage video cannot also match the min-th frame of the user stage video. Therefore, to find the user stage video frame that matches the (i+1)-th frame of the template stage video, the calculation starts from the (min+1)-th frame of the user stage video. In this way, the user stage video frame j that best matches each i-th frame of the template stage video can be found; the matched template video and user video are then used to compare the human-body features in the videos by action, and the comparison results of the different features are weighted to produce a score.
Let M be the total number of frames of the complete template video and N the total number of frames of the whole user video. The registration algorithm extracts M user video frames in one-to-one correspondence with the template video frames. As shown in Fig. 2, 17 joint points are defined; for both the template video frames and the user video frames, 14 of these joint points are used and 13 human limb-segment angles are extracted.
S_t is the assessment score of the t-th frame among the M extracted user video frames. In its calculation, n is the number of limb-segment angles, with a value of 13; C is the maximum limb-segment angle value, which in a golf swing is 180, the range of a limb-segment angle in the Cartesian coordinate system obviously being [0, 180]; the angle difference term is the difference of the i-th limb-segment angle between the user video and the template video in the t-th frame; and W_i is the weight of the i-th limb-segment angle, computed from the weights W_i^T and W_i^U defined below.
W_i^T is the time-accumulated limb-segment angle weight of the template video action, and W_i^U is the time-accumulated limb-segment angle weight of the user action. Both weights W_i^T and W_i^U are obtained from the video actions through the time-accumulated limb-segment angle weight formula, in which t_c is the current frame of the video sequence, n is the number of limb-segment angles (13), the calculation flag of the i-th limb-segment angle in the current frame is jointly determined by the template video frame and the user video frame, and the inter-frame accumulated change of the i-th limb-segment angle in the video up to the current frame is used.
The calculation flag of the i-th limb-segment angle is determined as follows. Since a limb-segment angle is determined by the line between two human joint points, when some joint point coordinate in the template video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current template video frame are set to 0; conversely, when the two joint points related to a limb-segment angle in the template video can be obtained accurately, the flag is set to 1. Similarly, when some joint point coordinate in the user video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current user video frame are set to 0, and conversely the flag is set to 1.
The inter-frame accumulated change of the i-th limb-segment angle up to the current frame t_c is computed from the angle of the i-th limb-segment angle in the current frame of the video and the angle of the i-th limb-segment angle in frame t_c - 1 - k of the video, where k indicates that from frame t_c - 1 - k to frame t_c the calculation flags of the limb-segment angle are set to 0 for k consecutive frames.
S is the comprehensive assessment score of the user video, obtained by weighting the per-frame assessment scores, where M is the total number of frames of the complete template video, N is the total number of frames of the whole user video, and w_t is the weight of the assessment score S_t of the t-th frame among the M user video frames; the M user video frames are the M frames extracted from the N frames of the whole user video, and w_t = 1/M.
The present method provides a golf swing scoring estimation method based on posture comparison that can quickly and accurately produce a scoring estimate of the user's swing action, whether the background is simple or the scene is complex. For movements with especially high demands on action quality, such as yoga and diving, the scoring formula of the present method still applies, and the method has good prospects for popularization.
The above is only a preferred embodiment of the present invention. It should be noted that, for a person of ordinary skill in the art, several improvements and modifications can be made without departing from the technical principles of the invention, and these improvements and modifications should also be regarded as within the protection scope of the present invention.

Claims (7)

  1. A golf swing scoring estimation method based on posture comparison, characterized by comprising the following steps:
    1) registering the template video and the user video: using club slope information, extracting frames from the user video that correspond one-to-one with frames of the template video;
    2) extracting human action features: using human joint point coordinates, extracting several human limb-segment angles as human action features;
    3) comparing the extracted human-body features of the user video with the swing action of the template video, and weighting the comparison results of the different features to obtain a score.
  2. The golf swing scoring estimation method based on posture comparison according to claim 1, characterized in that, in step 1), registering the template video and the user video comprises the following steps:
    dividing a video of one complete swing into three segments, i.e. an upswing stage video, a downswing stage video, and a follow-through stage video, wherein the matching method of each stage video is the same and the video of each of the three stages is referred to as a template stage video; the video frame matching process is divided into three steps, comprising:
    (1) finding the user stage video frame that best matches the 1st frame of the template stage video;
    using club direction information to judge whether the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are consistent;
    (2) calculating the distance matrix between the template stage video and the user stage video, of size I*J, wherein I is the total frame length of the template stage video and J is the total frame length of the user stage video;
    from step (1), the MinP-th frame of the user stage video matches the 1st frame of the template stage video, and frames 1 to MinP-1 of the user stage video do not match the 1st frame of the template stage video; Distance is the distance matrix between the template stage video and the user stage video, and dis_{i,j} is an element of Distance, each element being the club distance ΔCρ_{i,j} between the i-th frame of the template stage video and the j-th frame of the user stage video;
    the club distance between any two videos can never be completely the same, i.e. the distance ΔCρ_{i,j} > 0 between the template stage video and the user stage video always holds, and for template stage video frames and user stage video frames that cannot be matched, the distance is ΔCρ_{i,j} = 0;
    (3) calculating the local accumulated distance values between the template stage video and the user stage video, and using the local accumulated distance values to find the user stage video frame j that best matches the i-th frame of the template stage video;
    cumuDis_{i,j} is the cumulative distance value between the i-th frame of the template stage video and the j-th frame of the user stage video, and the cumulative distance value of the next position (p, q) is determined by the local accumulated distance formula, wherein the values of p and q are determined by min[dis_{i,j+1}, dis_{i+1,j+1}, dis_{i+1,j}]: when the minimum is dis_{i,j+1}, then p = i and q = j+1; when the minimum is dis_{i+1,j+1}, then p = i+1 and q = j+1; when the minimum is dis_{i+1,j}, then p = i+1 and q = j;
    taking the i-th frame of the template stage video and the j-th frame of the user stage video as the initial position, where i and j satisfy i < I and j < J, and at this point cumuDis_{i,j} = dis_{i,j}; using the local accumulated distance formula to obtain cumuDis_{i,j+1}, …, cumuDis_{i,j+k}, which are the cumulative distance values between the i-th frame of the template stage video and frames j+1 to j+k of the user stage video; when the formula yields cumuDis_{i+1,j+k+1} or cumuDis_{i+1,j+k} as the next cumulative distance value, stopping the calculation of further cumulative distance values; then, using the distance matrix Distance, taking dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, i.e. the distance values between the i-th frame of the template stage video and frames j to j+k of the user stage video, and finding the user stage video frame index of the smallest distance value among dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}; assuming dis_{i,min} is the minimum of dis_{i,j}, dis_{i,j+1}, …, dis_{i,j+k}, then min is the user stage video frame index of the smallest distance value, and the i-th frame of the template stage video is considered to match the min-th frame of the user stage video; when the i-th frame of the template stage video matches the min-th frame of the user stage video, the (i+1)-th frame of the template stage video does not match the min-th frame of the user stage video, and the calculation to find the user stage video frame matching the (i+1)-th frame of the template stage video starts from the (min+1)-th frame of the user stage video.
  3. The golf swing scoring estimation method based on posture comparison according to claim 2, characterized in that, assuming that I is the total frame length of the template stage video and J is the total frame length of the user stage video, when the following judgment condition is satisfied:
    THCy_1 < TDCy_1 && UHCy_1 < UDCy_1 or THCy_1 > TDCy_1 && UHCy_1 > UDCy_1
    the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions; otherwise, the 1st frame of the template stage video and the 1st frame of the user stage video have inconsistent directions; wherein THCy_1 is the y-coordinate of the point on the club near the hands in the 1st frame of the template stage video, TDCy_1 is the y-coordinate of the point on the club near the club head in the 1st frame of the template stage video, UHCy_1 is the y-coordinate of the point on the club near the hands in the 1st frame of the user stage video, and UDCy_1 is the y-coordinate of the point on the club near the club head in the 1st frame of the user stage video;
    if the 1st frame of the template stage video and the 1st frame of the user stage video have consistent directions, assuming the club direction in the 1st frame of the template stage video satisfies THCy_1 < TDCy_1, the club distance formula between a template stage video frame and a user stage video frame is used to calculate the club distance between the 1st frame of the template stage video and each frame of the user stage video, wherein the user stage video frames participating in the calculation satisfy UHCy_j < UDCy_j, yielding ΔCρ_{1,1}, ΔCρ_{1,2}, …, ΔCρ_{1,n}, which are the club distances between the 1st frame of the template stage video and the 1st to n-th frames of the user stage video; the user stage video frame MinP corresponding to the smallest club distance is then found from this group of data, and the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video; the club distance formula is as follows:
    ΔCρ_{i,j} = min[ abs(TCρ_i - UCρ_j), 180 - abs(TCρ_i - UCρ_j) ].
  4. The golf swing scoring estimation method based on posture comparison according to claim 3, characterized in that, if the club directions in the 1st frame of the template stage video and the 1st frame of the user stage video are inconsistent, assuming the direction of the 1st frame of the template stage video is THCy_1 < TDCy_1, the user stage video frames whose direction is inconsistent with that of the template stage video are discarded, and the club distance is calculated in the manner described above starting only from the user stage video frames whose direction is consistent with the 1st frame of the template stage video, yielding ΔCρ_{1,m}, ΔCρ_{1,m+1}, …, ΔCρ_{1,n}, which are the club distances between the 1st frame of the template stage video and the m-th to n-th frames of the user stage video; the user stage video frame MinP corresponding to the smallest club distance is then found from this group of data, and the MinP-th frame of the user stage video is considered to best match the 1st frame of the template stage video.
  5. The golf swing scoring estimation method based on posture comparison according to claim 1, characterized in that comparing the extracted human-body features of the user video with the swing action of the template video, and weighting the comparison results of the different features to obtain a score, comprises the following steps:
    M is the total number of frames of the complete template video and N is the total number of frames of the whole user video; M user video frames are extracted in one-to-one correspondence with the template video frames, and human action features are extracted: for both the template video frames and the user video frames, 14 joint points are used and 13 human limb-segment angles are extracted;
    S_t is the assessment score of the t-th frame among the M extracted user video frames, wherein n is the number of limb-segment angles and n is 13, C is the maximum limb-segment angle value, which in a golf swing is 180, the range of a limb-segment angle in the Cartesian coordinate system being [0, 180]; the angle difference term is the difference of the i-th limb-segment angle between the user video and the template video in the t-th frame, and W_i is the weight of the i-th limb-segment angle, computed as follows:
    wherein W_i^T is the time-accumulated limb-segment angle weight of the template video action and W_i^U is the time-accumulated limb-segment angle weight of the user video action; the weights W_i^T and W_i^U are obtained from the video actions through the time-accumulated limb-segment angle weight formula, wherein t_c is the current frame of the video sequence, the calculation flag of the i-th limb-segment angle in the current frame is jointly determined by the template video frame and the user video frame, and the inter-frame accumulated change of the i-th limb-segment angle in the video up to the current frame is used.
  6. The golf swing scoring estimation method based on posture comparison according to claim 5, characterized in that the calculation flag of the i-th limb-segment angle in the current frame is determined as follows:
    since a limb-segment angle is determined by the line between two human joint points, when some joint point coordinate in the template video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current template video frame are set to 0; conversely, when the two joint points related to a limb-segment angle in the template video can be obtained accurately, the flag is set to 1; similarly, when some joint point coordinate in the user video cannot be obtained accurately, the calculation flags of the limb-segment angles related to that joint point in the current user video frame are set to 0, and conversely the flag is set to 1.
  7. The golf swing scoring estimation method based on posture comparison according to claim 5, characterized in that the inter-frame accumulated change of the i-th limb-segment angle in the video up to the current frame is computed as follows:
    wherein the angle of the i-th limb-segment angle in the current frame of the video and the angle of the i-th limb-segment angle in frame t_c - 1 - k of the video are used, and k indicates that from frame t_c - 1 - k to frame t_c the calculation flags of the limb-segment angle are set to 0 for k consecutive frames;
    S is the comprehensive assessment score of the user video, obtained by weighting the per-frame assessment scores, wherein M is the total number of frames of the complete template video, N is the total number of frames of the whole user video, w_t is the weight of the assessment score S_t of the t-th frame among the M user video frames, the M user video frames are the M frames extracted from the N frames of the whole user video, and w_t = 1/M.
CN201910683497.4A 2019-07-26 2019-07-26 Golf swing action score estimation method based on posture comparison Active CN110354480B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910683497.4A CN110354480B (en) 2019-07-26 2019-07-26 Golf swing action score estimation method based on posture comparison

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910683497.4A CN110354480B (en) 2019-07-26 2019-07-26 Golf swing action score estimation method based on posture comparison

Publications (2)

Publication Number Publication Date
CN110354480A true CN110354480A (en) 2019-10-22
CN110354480B CN110354480B (en) 2021-04-16

Family

ID=68222016

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910683497.4A Active CN110354480B (en) 2019-07-26 2019-07-26 Golf swing action score estimation method based on posture comparison

Country Status (1)

Country Link
CN (1) CN110354480B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113327267A (en) * 2021-07-15 2021-08-31 东南大学 Action evaluation method based on monocular RGB video
WO2021174697A1 (en) * 2020-03-06 2021-09-10 平安科技(深圳)有限公司 Human body posture evaluation method and apparatus, computer device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8371954B1 (en) * 2012-06-04 2013-02-12 Gary Anderson Golf swing instruction tool utilizing a motion training schematic
JP2018099416A (en) * 2016-12-21 2018-06-28 カシオ計算機株式会社 Motion analysis apparatus, motion analysis method and program
CN108392805A (en) * 2018-03-30 2018-08-14 深圳市元征科技股份有限公司 A kind of golf swing action-analysing method and intelligent terminal
CN108564596A (en) * 2018-03-01 2018-09-21 南京邮电大学 A kind of the intelligence comparison analysis system and method for golf video
CN108985227A (en) * 2018-07-16 2018-12-11 杭州电子科技大学 A kind of action description and evaluation method based on space triangular plane characteristic
CN109522937A (en) * 2018-10-23 2019-03-26 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8371954B1 (en) * 2012-06-04 2013-02-12 Gary Anderson Golf swing instruction tool utilizing a motion training schematic
JP2018099416A (en) * 2016-12-21 2018-06-28 カシオ計算機株式会社 Motion analysis apparatus, motion analysis method and program
CN108564596A (en) * 2018-03-01 2018-09-21 南京邮电大学 A kind of the intelligence comparison analysis system and method for golf video
CN108392805A (en) * 2018-03-30 2018-08-14 深圳市元征科技股份有限公司 A kind of golf swing action-analysing method and intelligent terminal
CN108985227A (en) * 2018-07-16 2018-12-11 杭州电子科技大学 A kind of action description and evaluation method based on space triangular plane characteristic
CN109522937A (en) * 2018-10-23 2019-03-26 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021174697A1 (en) * 2020-03-06 2021-09-10 平安科技(深圳)有限公司 Human body posture evaluation method and apparatus, computer device, and storage medium
CN113327267A (en) * 2021-07-15 2021-08-31 东南大学 Action evaluation method based on monocular RGB video

Also Published As

Publication number Publication date
CN110354480B (en) 2021-04-16

Similar Documents

Publication Publication Date Title
CN108764120B (en) Human body standard action evaluation method
CN108564596B (en) Intelligent comparison analysis system and method for golf swing video
US10186041B2 (en) Apparatus and method for analyzing golf motion
CN110163110A (en) A kind of pedestrian&#39;s recognition methods again merged based on transfer learning and depth characteristic
CN104167016B (en) A kind of three-dimensional motion method for reconstructing based on RGB color and depth image
CN102184541B (en) Multi-objective optimized human body motion tracking method
CN105512621A (en) Kinect-based badminton motion guidance system
CN107349594A (en) A kind of action evaluation method of virtual Dance System
CN106600626B (en) Three-dimensional human motion capture method and system
CN110448870B (en) Human body posture training method
CN108597578A (en) A kind of human motion appraisal procedure based on two-dimensional framework sequence
CN103227888B (en) A kind of based on empirical mode decomposition with the video stabilization method of multiple interpretational criteria
CN110354480A (en) A kind of golf swing scoring estimation method compared based on posture
KR102238085B1 (en) Device and method for analyzing motion
CN110298218B (en) Interactive fitness device and interactive fitness system
CN107293175A (en) A kind of locomotive hand signal operation training method based on body-sensing technology
CN105279769A (en) Hierarchical particle filtering tracking method combined with multiple features
CN103839050A (en) ASM positioning algorithm based on feature point expansion and PCA feature extraction
CN110378871A (en) Game charater original painting copy detection method based on posture feature
CN104898971B (en) A kind of mouse pointer control method and system based on Visual Trace Technology
Zhang et al. Intelligent sports performance scoring and analysis system based on deep learning network
CN103198297B (en) Based on the kinematic similarity assessment method of correlativity geometric properties
CN105869153A (en) Non-rigid face image registering method integrated with related block information
CN108596947A (en) A kind of fast-moving target tracking method suitable for RGB-D cameras
Shen et al. Design of OpenPose-based of exercise assistant system with instructor-user synchronization for self-practice dynamic yoga

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant