CN110782967B - Body-building action standard degree assessment method and device - Google Patents
- Publication number
- CN110782967B (application CN201911058207.3A)
- Authority
- CN
- China
- Prior art keywords
- time
- action
- recognition
- standard
- exercise
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention discloses a method and device for evaluating the standard degree of fitness actions. An action recognition time interval is set for each fitness action in a fitness template video, and the intervals are then readjusted so that, for any two adjacent intervals, the recognition end time of the previous interval is prevented from lagging behind the recognition start time of the next interval. A time scoring value is then calculated for each fitness action from its action recognition time interval and its actual action end time. The time scoring value can be used on its own as a standard degree scoring value to evaluate whether the fitness action is standard, or it can be combined with a force scoring value obtained through action recognition. The invention can therefore accurately evaluate the completion time of fitness actions.
Description
Technical Field
The invention relates to the technical field of sports applications, and in particular to a method and device for evaluating the standard degree of fitness actions.
Background
As living standards rise, people pay increasing attention to physical exercise, and various fitness applications have appeared to help them train. Current fitness software plays a fitness template video that the user watches, learns from, and imitates. However, existing fitness software cannot monitor whether the user's fitness actions are standard, and provides no motion-sensing detection. Although some motion-sensing applications, such as motion-sensing games, can detect user movement, they can only detect simple motions (e.g. waving a hand); they cannot measure the time a fitness action takes, cannot recognize complex fitness actions, and therefore cannot guide the user toward a good training effect.
Disclosure of Invention
The invention mainly solves the technical problem of providing a fitness action standard degree evaluation method capable of accurately evaluating the completion time of fitness actions.
In order to solve the above technical problem, one technical scheme adopted by the invention is to provide a fitness action standard degree evaluation method comprising the following steps: acquiring the standard starting time and standard ending time of each fitness action in a fitness template video and setting an action recognition time interval for each fitness action, extending the standard starting time forward by a first predetermined time to serve as the recognition starting time of the interval and extending the standard ending time backward by a second predetermined time to serve as the recognition ending time of the interval; judging whether, for two adjacent action recognition time intervals, the recognition ending time of the previous interval lags behind the recognition starting time of the next interval; if the judgment is yes, adding the recognition ending time and standard ending time of the previous interval and the recognition starting time and standard starting time of the next interval to an adjustment array in ascending order; taking the average of the second and third elements of the adjustment array as a reference time, replacing the recognition ending time of the previous interval with the reference time minus a preset delay time, and replacing the recognition starting time of the next interval with the reference time plus the preset delay time; acquiring the user's fitness data, performing action recognition on each fitness action according to that data to obtain a recognition result, and recording the action ending time of each fitness action; judging whether the action ending time of the current fitness action is later than the standard ending time; if the judgment is yes, taking the ratio of the difference between the recognition ending time and the action ending time to the difference between the recognition ending time and the standard ending time as the time scoring value of the current fitness action, otherwise taking the ratio of the difference between the action ending time and the recognition starting time to the difference between the standard ending time and the recognition starting time as the time scoring value of the current fitness action.
As a preferred embodiment of the present invention, the recognition result includes a force scoring value; the fitness action standard degree evaluation method further comprises: calculating, as the standard degree scoring value, the sum of the product of the time scoring value of the current fitness action and a first weight value and the product of the force scoring value and a second weight value, wherein the sum of the first weight value and the second weight value is 1.
As a preferred embodiment of the present invention, the fitness action standard degree evaluation method further comprises: comparing the standard degree scoring value with a plurality of score intervals divided in advance; and taking the comment corresponding to the score interval in which the standard degree scoring value is located as the evaluation result, wherein the score intervals respectively correspond to different comments.
In order to solve the above technical problem, another technical scheme adopted by the invention is to provide a fitness action standard degree evaluation device comprising: a video analysis module for acquiring the standard starting time and standard ending time of each fitness action in a fitness template video and setting an action recognition time interval for each fitness action, extending the standard starting time forward by a first predetermined time to serve as the recognition starting time of the interval and extending the standard ending time backward by a second predetermined time to serve as the recognition ending time of the interval; a time comparison module for judging whether the recognition ending time of the previous of two adjacent action recognition time intervals lags behind the recognition starting time of the next, and, when the judgment is yes, adding the recognition ending time and standard ending time of the previous interval and the recognition starting time and standard starting time of the next interval to an adjustment array in ascending order; a time setting module for taking the average of the second and third elements of the adjustment array as a reference time, replacing the recognition ending time of the previous interval with the reference time minus a preset delay time, and replacing the recognition starting time of the next interval with the reference time plus the preset delay time; a data acquisition module for acquiring the user's fitness data, performing action recognition on each fitness action according to that data to obtain a recognition result, and recording the action ending time of each fitness action; and an action evaluation module for judging whether the action ending time of the current fitness action is later than the standard ending time, and, when the judgment is yes, taking the ratio of the difference between the recognition ending time and the action ending time to the difference between the recognition ending time and the standard ending time as the time scoring value of the current fitness action, otherwise taking the ratio of the difference between the action ending time and the recognition starting time to the difference between the standard ending time and the recognition starting time as the time scoring value of the current fitness action.
As a preferred embodiment of the present invention, the recognition result includes a force scoring value; the action evaluation module is further used for calculating, as the standard degree scoring value, the sum of the product of the time scoring value of the current fitness action and a first weight value and the product of the force scoring value and a second weight value, wherein the sum of the first weight value and the second weight value is 1.
As a preferred embodiment of the present invention, the action evaluation module is further configured to compare the standard degree scoring value with a plurality of pre-divided score intervals, and to take the comment corresponding to the score interval in which the standard degree scoring value is located as the evaluation result, where the score intervals respectively correspond to different comments.
Unlike the prior art, the invention has the following beneficial effects: an action recognition time interval is set for each fitness action in the fitness template video, and the intervals are readjusted so that the situation in which the recognition ending time of the previous of two adjacent intervals lags behind the recognition starting time of the next is avoided; a time scoring value is then calculated for each fitness action from its action recognition time interval and its action ending time, so that the completion time of the fitness action can be accurately evaluated.
Drawings
Fig. 1 is a flowchart of a fitness action standard evaluation method according to an embodiment of the present invention.
Fig. 2 is a schematic block diagram of a fitness action standard assessment device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a flowchart of a fitness action standard degree evaluation method according to an embodiment of the invention is shown. The method comprises the following steps:
s1: the method comprises the steps of obtaining standard starting time and standard ending time of each body-building action in a body-building template video, setting an action recognition time interval of each body-building action, expanding the standard starting time of each body-building action forwards by a first preset time to serve as recognition starting time of the action recognition time interval, and expanding the standard ending time of each body-building action backwards by a second preset time to serve as recognition ending time of the action recognition time interval.
Wherein the standard starting time and standard ending time of each fitness action are obtained by frame-by-frame analysis of the fitness template video; alternatively, the fitness template video may come pre-annotated with each action's standard starting and ending times. Ideally the user completes each fitness action within its standard starting and ending times, but given human reaction time it is difficult to follow the video in real time while learning from it. Therefore, the standard starting time is extended forward by a first predetermined time to serve as the recognition starting time of the action recognition time interval, and the standard ending time is extended backward by a second predetermined time to serve as the recognition ending time, so that a user's delayed reaction does not cause the action to be missed. The first and second predetermined times may be equal or different, and their specific values can be set according to actual needs.
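The interval construction of step S1 can be sketched in Python. This is an illustrative sketch only, not part of the patent; the function name, the tuple return value, and the millisecond units are assumptions:

```python
def build_recognition_interval(std_start, std_end, first_predetermined, second_predetermined):
    """Extend a standard action window into an action recognition time interval.

    std_start / std_end: standard starting/ending times of the action (assumed ms).
    first_predetermined:  time by which the standard start is extended forward.
    second_predetermined: time by which the standard end is extended backward.
    """
    recognition_start = std_start - first_predetermined  # recognition starting time
    recognition_end = std_end + second_predetermined     # recognition ending time
    return recognition_start, recognition_end

# e.g. an action marked from 3000 ms to 7000 ms with a 500 ms allowance on each side
print(build_recognition_interval(3000, 7000, 500, 500))  # (2500, 7500)
```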
S2: judging whether the recognition end time of the previous action recognition time interval in the two adjacent action recognition time intervals lags behind the recognition start time of the next action recognition time interval.
For fitness actions that follow each other closely in time, the recognition starting time of the next of two adjacent action recognition time intervals may fall before the recognition ending time of the previous interval. For example, a left hook and a right hook in boxing are thrown quickly and in succession, so the right hook's action recognition time interval may begin before the left hook's has ended.
S3: if the time delay is judged, the recognition ending time and the standard ending time of the previous action recognition time interval and the recognition starting time and the standard starting time of the next action recognition time interval are added into the adjustment array in sequence according to ascending order.
Let the recognition ending time and standard ending time of the previous action recognition time interval be perRealZ and perStdZ, and the recognition starting time and standard starting time of the next action recognition time interval be curRealA and curStdA. If perRealZ > curRealA, then perRealZ, perStdZ, curRealA and curStdA are added to the adjustment array in ascending order. In one application scenario, with all times in milliseconds, suppose perStdZ = 1000, perRealZ = 4000, curRealA = 2000 and curStdA = 3000; the adjustment array a is then [perStdZ, curRealA, curStdA, perRealZ] = [1000, 2000, 3000, 4000].
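The overlap check and array construction of steps S2–S3 can be sketched as follows. The variable names follow the patent's perRealZ / perStdZ / curRealA / curStdA; the function itself is an illustrative assumption:

```python
def build_adjust_array(per_real_z, per_std_z, cur_real_a, cur_std_a):
    """Step S2-S3: if the previous interval's recognition ending time (perRealZ)
    lags behind the next interval's recognition starting time (curRealA),
    collect the four times into an adjustment array in ascending order."""
    if per_real_z > cur_real_a:
        return sorted([per_real_z, per_std_z, cur_real_a, cur_std_a])
    return None  # intervals do not overlap; no adjustment needed

# values from the patent's worked example (milliseconds)
print(build_adjust_array(4000, 1000, 2000, 3000))  # [1000, 2000, 3000, 4000]
```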
S4: taking the average value of the sum of the second element and the third element of the adjustment array as the reference time, replacing the identification ending time of the previous action identification time interval with the difference between the reference time and the preset delay time, and replacing the identification starting time of the next action identification time interval with the sum of the reference time and the preset delay time.
Let the reference time be time; then time = (a[1] + a[2]) / 2 = (2000 + 3000) / 2 = 2500. With a preset delay time n, the recognition ending time of the previous action recognition time interval becomes perRealZ = time - n = 2500 - n, and the recognition starting time of the next action recognition time interval becomes curRealA = time + n = 2500 + n. The preset delay time introduces a gap between the recognition of the two fitness actions, avoiding the situation where recognition of the next fitness action starts at the very instant recognition of the previous fitness action ends. The preset delay time n can be set according to actual needs; for example, with n = 10 ms in the scenario above, perRealZ = 2490 and curRealA = 2510.
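Step S4's adjustment can then be sketched as follows, using the patent's worked numbers; the function is again an illustrative assumption:

```python
def resolve_overlap(adjust, delay):
    """Step S4: the reference time is the mean of the adjustment array's second
    and third elements (indices 1 and 2); the previous interval's recognition
    ending time and the next interval's recognition starting time are then
    placed symmetrically around it, separated by the preset delay time."""
    ref = (adjust[1] + adjust[2]) / 2
    per_real_z = ref - delay   # new recognition ending time of previous interval
    cur_real_a = ref + delay   # new recognition starting time of next interval
    return per_real_z, cur_real_a

# with the patent's numbers and a 10 ms preset delay time
print(resolve_overlap([1000, 2000, 3000, 4000], 10))  # (2490.0, 2510.0)
```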
S5: and acquiring exercise data of the user, performing motion recognition on each exercise according to the exercise data to obtain a recognition result, and recording the motion ending time of each exercise.
Wherein a user may wear a wearable device to gather fitness data, the wearable device including, but not limited to, a tri-axial accelerometer and a gyroscope. The process of performing motion recognition for each exercise motion based on exercise data may be implemented in accordance with prior art recognition algorithms.
S6: and judging whether the action ending time of the current body-building action is larger than the standard ending time.
S7: if the judgment is yes, the ratio of the difference between the identification ending time and the action ending time to the difference between the identification ending time and the standard ending time is taken as the time grading value of the current body-building action, otherwise, the ratio of the difference between the action ending time and the identification starting time to the difference between the standard ending time and the identification starting time is taken as the time grading value of the current body-building action.
Let the action ending time be curZ. If curZ ≤ curStdZ, the time scoring value of the current fitness action = (curZ - curRealA) / (curStdZ - curRealA); if curZ > curStdZ, the time scoring value = (curRealZ - curZ) / (curRealZ - curStdZ). In one application scenario, curRealA = 3000, curStdA = 4000, curStdZ = 7000 and curRealZ = 8000. If curZ = 6500, the time scoring value of the current fitness action = (6500 - 3000) / (7000 - 3000) = 0.875; if curZ = 7500, the time scoring value = (8000 - 7500) / (8000 - 7000) = 0.5. The time scoring value may be used on its own as the standard degree scoring value to evaluate whether the fitness action is standard.
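The scoring rule of steps S6–S7 can be sketched as follows. The formulas are the patent's; the function name and argument order are assumptions:

```python
def time_score(cur_z, cur_real_a, cur_std_z, cur_real_z):
    """Steps S6-S7: score how close the action ending time curZ is to the
    standard ending time. Ending on time or early scores on the rising ramp
    from the recognition starting time; ending late scores on the falling
    ramp toward the recognition ending time."""
    if cur_z > cur_std_z:  # finished after the standard ending time
        return (cur_real_z - cur_z) / (cur_real_z - cur_std_z)
    return (cur_z - cur_real_a) / (cur_std_z - cur_real_a)

# the patent's example: curRealA=3000, curStdZ=7000, curRealZ=8000
print(time_score(6500, 3000, 7000, 8000))  # 0.875
print(time_score(7500, 3000, 7000, 8000))  # 0.5
```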
Further, in one possible embodiment, the recognition result includes a force score value, where the force score value is obtained through acceleration data, gyroscope data, and the like acquired by the wearable device. The fitness action standard evaluation method further comprises the following steps:
s8: and calculating the sum of the product of the time scoring value of the current exercise and the first weight value and the product of the force scoring value and the second weight value as a standard degree scoring value.
Wherein the sum of the first weight value and the second weight value is 1. Let the time scoring value be p, the force scoring value be q, the first weight value be α and the second weight value be β; the standard degree scoring value is then s = α × p + β × q, with α + β = 1.
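Step S8's weighted combination can be sketched as follows; the example weights 0.7 / 0.3 are illustrative assumptions, not values from the patent:

```python
def standard_degree_score(time_score_value, force_score_value, alpha):
    """Step S8: standard degree scoring value s = alpha * time + beta * force,
    with beta = 1 - alpha so that the two weight values sum to 1."""
    beta = 1.0 - alpha
    return alpha * time_score_value + beta * force_score_value

# e.g. time scoring value 0.875, force scoring value 0.6, first weight 0.7
print(standard_degree_score(0.875, 0.6, 0.7))  # 0.7*0.875 + 0.3*0.6 ≈ 0.7925
```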
In order to improve the user experience and motivate the user, in this embodiment the fitness action standard degree evaluation method further comprises:
s9: the standard grade value is compared with a plurality of grade intervals divided in advance.
S10: and taking the comments corresponding to the scoring interval in which the standard degree scoring value is positioned as an evaluation result, wherein a plurality of scoring intervals correspond to unused comments respectively.
Specifically, suppose the 0–10 score range is divided into 5 score intervals: [0,2], (2,4], (4,6], (6,8] and (8,10], which correspond to the comments MISS, OK, GOOD, GREAT and PERFECT respectively.
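The interval-to-comment mapping of steps S9–S10 can be sketched as follows. The intervals and comments are the patent's example; the list-of-bounds representation is an assumption:

```python
def comment_for(score):
    """Steps S9-S10: map a 0-10 standard degree scoring value to the comment
    of the score interval it falls in: [0,2] MISS, (2,4] OK, (4,6] GOOD,
    (6,8] GREAT, (8,10] PERFECT."""
    bounds = [(2, "MISS"), (4, "OK"), (6, "GOOD"), (8, "GREAT"), (10, "PERFECT")]
    for upper, comment in bounds:
        if score <= upper:
            return comment
    return "PERFECT"  # clamp anything above 10

print(comment_for(1.5))  # MISS
print(comment_for(7.9))  # GREAT
```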
Referring to fig. 2, a schematic block diagram of a fitness action standard evaluation device according to an embodiment of the present invention is shown. The fitness action standard evaluation device according to the embodiment of the invention comprises a video analysis module 10, a time comparison module 20, a time setting module 30, a data acquisition module 40 and an action evaluation module 50.
The video analysis module 10 is configured to obtain a standard start time and a standard end time of each exercise action in the exercise template video, set an action recognition time interval of each exercise action, extend the standard start time of each exercise action forward by a first predetermined time as a recognition start time of the action recognition time interval, and extend the standard end time of each exercise action backward by a second predetermined time as a recognition end time of the action recognition time interval.
The time comparison module 20 is configured to determine whether the recognition ending time of the previous of two adjacent action recognition time intervals lags behind the recognition starting time of the next, and, when the judgment is yes, to add the recognition ending time and standard ending time of the previous interval and the recognition starting time and standard starting time of the next interval to the adjustment array in ascending order.
The time setting module 30 is configured to take the average of the second and third elements of the adjustment array as the reference time, replace the recognition ending time of the previous action recognition time interval with the reference time minus the preset delay time, and replace the recognition starting time of the next action recognition time interval with the reference time plus the preset delay time.
The data acquisition module 40 is configured to acquire exercise data of a user, perform motion recognition on each exercise according to the exercise data to obtain a recognition result, and record an end time of each exercise.
The motion estimation module 50 is configured to determine whether the motion end time of the current exercise is greater than the standard end time, and if yes, use a ratio of a difference between the recognition end time and the motion end time to a difference between the recognition end time and the standard end time as a time score of the current exercise, otherwise use a ratio of a difference between the motion end time and the recognition start time to a difference between the standard end time and the recognition start time as a time score of the current exercise.
Further, in one possible embodiment, the recognition result includes a force score value, and the action evaluation module 50 is further configured to calculate, as the standard score value, a sum of a product of the time score value of the current exercise action and the first weight value and a product of the force score value and the second weight value. Wherein the sum of the first weight value and the second weight value is 1.
In this embodiment, the action evaluation module 50 is further configured to compare the standard degree scoring value with a plurality of score intervals divided in advance, and to take the comment corresponding to the score interval in which the value is located as the evaluation result, where the score intervals respectively correspond to different comments.
The exercise motion standard evaluation device according to the embodiment of the present invention has the same technical features as the exercise motion standard evaluation method according to the foregoing embodiment, and is not described in detail herein.
By means of the method and the device for evaluating the standard degree of the body-building action, the action recognition time interval is set for each body-building action in the body-building template video, the action recognition time interval of each body-building action is readjusted, the situation that the recognition end time of the previous action recognition time interval lags behind the recognition start time of the next action recognition time interval in the two adjacent action recognition time intervals is avoided, then the time scoring value is calculated for the body-building action according to the action recognition time interval of each body-building action and the action end time of the body-building action, and accordingly the completion time of the body-building action can be evaluated accurately.
The foregoing description is only illustrative of the present invention and is not intended to limit the scope of the invention, and all equivalent structures or equivalent processes or direct or indirect application in other related technical fields are included in the scope of the present invention.
Claims (6)
1. A fitness action standard degree evaluation method, characterized by comprising the following steps:
acquiring standard starting time and standard ending time of each exercise action in the exercise template video, setting an action recognition time interval of each exercise action, expanding the standard starting time of each exercise action forwards by a first preset time to serve as the recognition starting time of the action recognition time interval, and expanding the standard ending time of each exercise action backwards by a second preset time to serve as the recognition ending time of the action recognition time interval;
judging whether the recognition end time of the previous action recognition time interval in the two adjacent action recognition time intervals lags behind the recognition start time of the next action recognition time interval;
if the judgment is yes, adding the recognition ending time and the standard ending time of the previous action recognition time interval and the recognition starting time and the standard starting time of the next action recognition time interval into the adjustment array in ascending order;
taking the average value of the sum of the second element and the third element of the adjustment array as reference time, replacing the identification ending time of the previous action identification time interval with the difference between the reference time and the preset delay time, and replacing the identification starting time of the next action identification time interval with the sum of the reference time and the preset delay time;
acquiring exercise data of a user, performing motion recognition on each exercise according to the exercise data to obtain a recognition result, and recording the motion ending time of each exercise;
judging whether the action ending time of the current fitness action is later than the standard ending time;
if the judgment is yes, the ratio of the difference between the identification ending time and the action ending time to the difference between the identification ending time and the standard ending time is taken as the time grading value of the current body-building action, otherwise, the ratio of the difference between the action ending time and the identification starting time to the difference between the standard ending time and the identification starting time is taken as the time grading value of the current body-building action.
2. The fitness action standard degree evaluation method according to claim 1, wherein the recognition result includes a force scoring value;
the fitness action standard evaluation method further comprises the following steps:
calculating the sum of the product of the time scoring value of the current exercise and the first weight value and the product of the force scoring value and the second weight value as a standard degree scoring value;
wherein the sum of the first weight value and the second weight value is 1.
3. The fitness action standard degree evaluation method of claim 2, further comprising:
comparing the standard degree scoring value with a plurality of scoring intervals divided in advance;
and taking the comment corresponding to the score interval in which the standard degree scoring value is located as the evaluation result, wherein the score intervals respectively correspond to different comments.
4. A fitness activity standardization assessment device, characterized in that the fitness activity standardization assessment device comprises:
the video analysis module is used for acquiring the standard starting time and the standard ending time of each exercise action in the exercise template video, setting an action recognition time interval of each exercise action, expanding the standard starting time of each exercise action forwards by a first preset time to serve as the recognition starting time of the action recognition time interval, and expanding the standard ending time of each exercise action backwards by a second preset time to serve as the recognition ending time of the action recognition time interval;
the time comparison module is used for judging whether the recognition ending time of the previous of two adjacent action recognition time intervals lags behind the recognition starting time of the next, and for adding the recognition ending time and standard ending time of the previous interval and the recognition starting time and standard starting time of the next interval into the adjustment array in ascending order when the judgment result is that it does lag behind;
the time setting module is used for taking the average value of the sum of the second element and the third element of the adjustment array as reference time, replacing the identification ending time of the previous action identification time interval with the difference between the reference time and the preset delay time, and replacing the identification starting time of the next action identification time interval with the sum of the reference time and the preset delay time;
the data acquisition module is used for acquiring exercise data of a user, performing action recognition on each body-building action according to the exercise data to obtain a recognition result, and recording the action end time of each body-building action;
and the action evaluation module is used for judging whether the action end time of the current body-building action is later than the standard end time; when the judgment is yes, taking the ratio of the difference between the recognition end time and the action end time to the difference between the recognition end time and the standard end time as the time scoring value of the current body-building action, and otherwise taking the ratio of the difference between the action end time and the recognition start time to the difference between the standard end time and the recognition start time as the time scoring value of the current body-building action.
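The interval adjustment and time-scoring logic recited in claim 4 can be sketched as follows. This is a hypothetical Python illustration only: the function and field names, and the preset delay value used in the example, are assumptions rather than anything specified by the patent.

```python
# Hypothetical sketch of claim 4's overlap handling and time scoring.
# Times are in seconds; all names and preset values are assumptions.

def adjust_intervals(prev, nxt, delay):
    """prev/nxt: dicts with recog_start, recog_end, std_start, std_end.
    If the previous interval's recognition end time lags behind the next
    interval's recognition start time (the intervals overlap), rebuild
    the shared boundary from the four surrounding times."""
    if prev["recog_end"] > nxt["recog_start"]:
        # The "adjustment array": the four boundary times in ascending order.
        adj = sorted([prev["recog_end"], prev["std_end"],
                      nxt["recog_start"], nxt["std_start"]])
        ref = (adj[1] + adj[2]) / 2      # average of 2nd and 3rd elements
        prev["recog_end"] = ref - delay  # pull the previous interval back
        nxt["recog_start"] = ref + delay # push the next interval forward
    return prev, nxt

def time_score(interval, action_end):
    """Time scoring value per claim 4: equals 1 when the action ends exactly
    at the standard end time, and decays linearly toward 0 at either
    recognition boundary."""
    if action_end > interval["std_end"]:
        return ((interval["recog_end"] - action_end)
                / (interval["recog_end"] - interval["std_end"]))
    return ((action_end - interval["recog_start"])
            / (interval["std_end"] - interval["recog_start"]))
```

Note the effect of the claim's ratio construction: both branches evaluate to 1 at the standard end time and to 0 at the interval's recognition boundaries, so the score directly measures how close the user's end time is to the template's.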
5. The body-building action standard degree assessment device of claim 4, wherein the recognition result comprises a force scoring value;
the action evaluation module is also used for calculating the sum of the product of the time scoring value of the current body-building action and a first weight value and the product of the force scoring value and a second weight value as a standard degree scoring value;
wherein the sum of the first weight value and the second weight value is 1.
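The weighted combination of claim 5 can be sketched in a few lines. The weight values below are assumptions for illustration; the patent only requires that the two weights sum to 1.

```python
# Hypothetical illustration of claim 5's standard degree scoring value.
# The weight split 0.6/0.4 is an assumption, not from the patent.

def standard_degree_score(t_score, f_score, w_time=0.6, w_force=0.4):
    """Weighted sum of the time scoring value and the force scoring value;
    the two weights must sum to 1."""
    assert abs(w_time + w_force - 1.0) < 1e-9
    return t_score * w_time + f_score * w_force
```

Because the weights sum to 1, a perfect time score and a perfect force score yield a standard degree score of exactly 1.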
6. The body-building action standard degree assessment device according to claim 5, wherein the action evaluation module is further configured to compare the standard degree scoring value with a plurality of scoring intervals divided in advance, and to take the comment corresponding to the scoring interval in which the standard degree scoring value is located as the evaluation result, wherein the plurality of scoring intervals respectively correspond to different comments.
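The interval-to-comment mapping of claims 3 and 6 amounts to a table lookup over pre-divided score ranges. The boundary values and comment texts below are purely illustrative assumptions; the patent does not specify them.

```python
import bisect

# Hypothetical pre-divided scoring intervals for claim 6; the edges and
# comment strings are assumptions, not values from the patent.
BOUNDS = [0.6, 0.8, 0.9]   # intervals: [0, 0.6), [0.6, 0.8), [0.8, 0.9), [0.9, 1]
COMMENTS = ["needs work", "fair", "good", "excellent"]

def evaluate(score):
    """Return the comment for the scoring interval containing `score`."""
    return COMMENTS[bisect.bisect_right(BOUNDS, score)]
```

`bisect_right` keeps the lookup O(log n) in the number of intervals and places a score landing exactly on an edge into the higher interval.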
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911058207.3A CN110782967B (en) | 2019-11-01 | 2019-11-01 | Body-building action standard degree assessment method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110782967A CN110782967A (en) | 2020-02-11 |
CN110782967B true CN110782967B (en) | 2023-04-21 |
Family
ID=69388583
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911058207.3A Active CN110782967B (en) | 2019-11-01 | 2019-11-01 | Body-building action standard degree assessment method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110782967B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111523517B (en) * | 2020-05-26 | 2023-08-04 | 北京奇艺世纪科技有限公司 | Action scoring method and device, electronic equipment and readable storage medium |
CN112365954A (en) * | 2020-10-26 | 2021-02-12 | 埃欧健身管理(上海)有限公司 | Method and equipment for dynamically adjusting fitness scheme |
CN113239849B (en) * | 2021-05-27 | 2023-12-19 | 数智引力(厦门)运动科技有限公司 | Body-building action quality assessment method, body-building action quality assessment system, terminal equipment and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6693648B1 (en) * | 2000-11-22 | 2004-02-17 | Campus Crusade For Christ, Inc. | Pointer interactive apparatus |
WO2013122327A1 (en) * | 2012-02-17 | 2013-08-22 | 연세대학교 산학협력단 | Physical-fitness test system using acceleration sensor |
CN104361207A (en) * | 2010-07-14 | 2015-02-18 | 阿迪达斯股份公司 | Location-aware fitness monitoring systems |
CN105184382A (en) * | 2015-07-14 | 2015-12-23 | 成都乐动信息技术有限公司 | Optimization method of trajectory and device |
WO2017156835A1 (en) * | 2016-03-18 | 2017-09-21 | 深圳大学 | Smart method and system for body building posture identification, assessment, warning and intensity estimation |
CN107438398A (en) * | 2015-01-06 | 2017-12-05 | 大卫·伯顿 | Portable wearable monitoring system |
WO2017217567A1 (en) * | 2016-06-15 | 2017-12-21 | (주)그린콤 | Fitness monitoring system |
CN108346456A (en) * | 2018-01-23 | 2018-07-31 | 曲阜师范大学 | A kind of intelligence sport body building management system |
CN110020630A (en) * | 2019-04-11 | 2019-07-16 | 成都乐动信息技术有限公司 | Method, apparatus, storage medium and the electronic equipment of assessment movement completeness |
CN110197721A (en) * | 2019-05-06 | 2019-09-03 | 平安科技(深圳)有限公司 | Tendon condition evaluation method, apparatus and storage medium based on deep learning |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8738323B2 (en) * | 2010-09-30 | 2014-05-27 | Fitbit, Inc. | Methods and systems for metrics analysis and interactive rendering, including events having combined activity and location information |
US20120316011A1 (en) * | 2011-06-10 | 2012-12-13 | Branko Milosevic | Computer Implemented Athletic Training Method |
JP2016034482A (en) * | 2014-07-31 | 2016-03-17 | セイコーエプソン株式会社 | Exercise analysis device, exercise analysis method, exercise analysis program, and exercise analysis system |
JP2017124073A (en) * | 2016-01-15 | 2017-07-20 | セイコーエプソン株式会社 | Electronic apparatus, system, analysis method, analysis program and recording medium |
- 2019-11-01: CN application CN201911058207.3A filed; granted as patent CN110782967B (status: Active)
Non-Patent Citations (4)
Title |
---|
Feichtenhofer, C. et al. Spatiotemporal Residual Networks for Video Action Recognition [arXiv]. Advances in Neural Information Processing Systems. 2016, full text. * |
Xiao F et al. A system for exercise activity recognition and quality evaluation based on green sensing. IEEE Transactions on Emerging Topics in Computing. 2018, vol. 08, no. 03, full text. * |
Liu Gang. Human Action Recognition in Video Surveillance. China Master's Theses Full-text Database, Information Science and Technology. 2013, no. 2013(06), full text. * |
Dong Junfeng. Research on Vision-Based Human Motion Analysis Technology. China Master's Theses Full-text Database, Information Science and Technology. 2015, no. 2015(08), full text. * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110782967B (en) | Body-building action standard degree assessment method and device | |
US10512406B2 (en) | Systems and methods for determining an intensity level of an exercise using photoplethysmogram (PPG) | |
Ross et al. | Match analysis and player characteristics in rugby sevens | |
Aslan et al. | Metabolic demands of match performance in young soccer players | |
US9174084B2 (en) | Automatic exercise segmentation and recognition | |
CN109716444B (en) | Assessment and guidance of athletic performance | |
US8951164B2 (en) | Extending gameplay with physical activity monitoring device | |
EP3340248B1 (en) | A method and an apparatus for determining training status | |
US20140257535A1 (en) | Personal training with physical activity monitoring device | |
CN111477297A (en) | Personal computing device | |
EP3509071B1 (en) | A method for determining injury risk of a person based on physiological data | |
EP3391809A1 (en) | Fitness level prediction device, system and method | |
Thomas et al. | Relationship between velocity and muscular endurance of the upper body | |
WO2022193851A1 (en) | Method and system for recognizing user actions | |
CN106178466A (en) | A kind of body-building expenditure analysis method and system | |
CN117109567A (en) | Riding gesture monitoring method and system for dynamic bicycle movement and wearable riding gesture monitoring equipment | |
US20160143578A1 (en) | System and method for functional state and/or performance assessment and training program adjustment | |
CN114870364B (en) | Exercise machine control method, exercise machine, and storage medium | |
TWI756754B (en) | A method for monitoring an exercise session with multiple schemes | |
Petri et al. | Analysis of anticipatory cues in karate kumite using an in-situ-study | |
CN111401721A (en) | Method and system for evaluating and training target pre-judging thinking | |
Flowers et al. | Using Multi-Sensor Voting for Resilience in VR Biofeedback Games | |
WO2024040547A1 (en) | System, device, and method for monitoring motion | |
Noh et al. | Smart exercise application to improve LEG function and short-term memory through game-like lunge exercises: development and evaluation | |
EP4290429A1 (en) | Learning device and evaluation information output device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||