CN111097142A - Motion capture motion training method and system based on 5G communication - Google Patents

Motion capture motion training method and system based on 5G communication

Info

Publication number
CN111097142A
CN111097142A (application CN201911316981.XA)
Authority
CN
China
Prior art keywords
motion
data
communication
action
human body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911316981.XA
Other languages
Chinese (zh)
Inventor
周湘君
张李京
贺子彬
芦振华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Xishan Yichuang Culture Co ltd
Original Assignee
Wuhan Xishan Yichuang Culture Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Xishan Yichuang Culture Co ltd filed Critical Wuhan Xishan Yichuang Culture Co ltd
Priority to CN201911316981.XA priority Critical patent/CN111097142A/en
Publication of CN111097142A publication Critical patent/CN111097142A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062: Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B2024/0065: Evaluating the fitness, e.g. fitness level or fitness index
    • A63B2024/0068: Comparison to target or threshold, previous performance or not real time comparison to other individuals

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses a motion capture motion training method and system based on 5G communication, wherein the method comprises the following steps: collecting, with wireless motion capture equipment, motion data for the motion project stage selected by the user through a plurality of data acquisition points, and aggregating the motion data via 5G communication; generating real-time action data for a corresponding human body model from the motion data, and obtaining the standard action data of the human body model for that motion project stage; rendering two human body models in a graphical interface, displaying the real-time action and the standard action respectively, and giving corresponding prompts according to the deviation between the real-time action data and the standard action data. The embodiment of the invention has at least the following beneficial effects: the captured real-time action data are compared with the standard action and the user is prompted to correct errors, which ensures a correct posture and improves exercise safety, while the wireless equipment improves portability and 5G communication guarantees real-time data transmission.

Description

Motion capture motion training method and system based on 5G communication
Technical Field
The invention relates to a motion capture technology, in particular to a motion capture motion training method and system based on 5G communication.
Background
The traditional exercise training method usually requires a teacher to participate and give guidance on the spot. In video-based action teaching, students are limited by their own level: problems in their exercise posture usually go unnoticed during practice, which aggravates the risk of injury.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides a motion capture motion training method based on 5G communication, which can correct a student's exercise posture without on-site guidance.
The invention also provides a motion capture motion training system based on 5G communication, which adopts the motion capture motion training method based on 5G communication.
The motion capture motion training method based on 5G communication according to the embodiment of the first aspect of the invention comprises the following steps: S100, collecting, with wireless motion capture equipment, motion data corresponding to the motion project stage selected by the user through a plurality of data acquisition points, and aggregating the motion data via 5G communication; S200, generating real-time action data of a corresponding human body model from the motion data, and obtaining the standard action data of the human body model for that motion project stage; S300, rendering two human body models in a graphical interface, displaying the real-time action and the standard action respectively, and giving corresponding prompts according to the deviation between the real-time action data and the standard action data.
The motion capture motion training method based on 5G communication provided by the embodiment of the invention has at least the following beneficial effects: the captured real-time action data are compared with the standard action and the user is prompted to correct errors, which ensures a correct posture and thereby improves exercise safety. Wireless capture and 5G communication enhance the portability of the motion capture equipment while guaranteeing efficient real-time data transmission, avoiding untimely prompts caused by delay.
According to some embodiments of the invention, the motion data is acquired by an inertial motion capture device, including head pose data and limb pose data.
According to some embodiments of the invention, the standard action data of the human body model is the action data of an instructor, collected in advance.
According to some embodiments of the invention, said step S300 comprises: s310, dividing regions according to four limbs, the head, the trunk and the hands, and respectively calculating deviation values of the real-time action data and the standard action data; s320, highlighting the corresponding part on the human body model showing the real-time action for the area with the maximum deviation value, and giving a corresponding voice prompt.
According to some embodiments of the invention, the step S320 further comprises: and giving a micro-current stimulation prompt to the data acquisition point with the maximum deviation value.
According to a second aspect of the invention, the motion capture motion training system based on 5G communication comprises: a motion capture module, comprising a motion capture helmet, motion capture gloves and a motion capture garment, for collecting the motion data of the motion project stage selected by the user, wherein the motion data comprises head posture data and limb posture data; a communication module for aggregating the action data and sending it to a server via 5G communication, and for receiving the processed human body model action data; an action processing module for processing the action data, generating the corresponding real-time human body model action data and obtaining the preset standard human body model action data corresponding to the motion project stage; and a display module for rendering the action performance of the human body models, including the standard action and the user's real-time action, and displaying error prompts.
The motion capture motion training system based on 5G communication provided by the embodiment of the invention has at least the following beneficial effects: the captured real-time action data are compared with the standard action and the user is prompted to correct errors, which ensures a correct posture and thereby improves exercise safety. Wireless capture and 5G communication enhance the portability of the motion capture equipment while guaranteeing efficient real-time data transmission, avoiding untimely prompts caused by delay.
According to some embodiments of the invention, further comprising: and the data storage module is used for storing the standard human body model motion data.
According to some embodiments of the invention, the system further comprises a voice module for receiving a user command to select the motion project stage and giving a corresponding voice prompt according to the errors in the user's action.
According to some embodiments of the invention, the voice module and the display module are disposed on the helmet.
According to some embodiments of the invention, further comprising: and the micro-current prompting module is used for giving micro-current stimulation prompt to the data acquisition point with the maximum deviation.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart illustrating the main steps of a method according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating the subdivision step of step S300 according to an embodiment of the present invention;
FIG. 3 is a schematic block diagram of a system in accordance with an embodiment of the present invention.
Reference numerals:
the motion capture module 100, the communication module 200, the motion processing module 300, the display module 400, the data storage module 500, the voice module 600, and the micro-current prompt module 700.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
Referring to fig. 1, the main method steps of the embodiment of the present invention are as follows: S100, collecting motion data corresponding to the motion project stage selected by the user with wireless motion capture equipment, and aggregating the motion data via 5G communication; S200, generating real-time action data of the corresponding human body model from the motion data, and obtaining the standard action data of the corresponding human body model for that motion project stage; S300, rendering two human body models in the graphical interface, displaying the real-time action and the standard action respectively, and giving corresponding prompts according to the deviation between the real-time action data and the standard action data. Because the two models show the real-time action and the standard action simultaneously, the comparison is obvious and the posture and action can be corrected.
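One S100 to S300 iteration can be sketched as follows. This is a minimal illustration only, not the patented implementation: the function names, the simplified per-joint angle representation and the 10-degree tolerance are all assumptions.

```python
from typing import Callable, Dict

Pose = Dict[str, float]  # joint name -> joint angle in degrees (simplified)

def training_step(capture: Callable[[], Pose],
                  standard: Pose,
                  tolerance_deg: float = 10.0) -> Dict[str, float]:
    """One iteration of S100 to S300: capture a live pose, compare it
    joint by joint with the standard pose, and return the deviations
    that exceed the tolerance (the basis for the correction prompt)."""
    live = capture()  # S100: read the wireless motion capture equipment
    errors: Dict[str, float] = {}
    for joint, target in standard.items():  # S200/S300: compare with standard
        deviation = abs(live.get(joint, 0.0) - target)
        if deviation > tolerance_deg:
            errors[joint] = deviation
    return errors
```

In practice the capture callback would aggregate sensor packets arriving over the 5G link rather than read locally, but the compare-and-prompt structure is the same.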
In an embodiment of the invention, the motion data is acquired by an inertial motion capture device, and the acquired motion data comprises head postures and limb postures. In other embodiments of the present invention, the head posture may be omitted when the motion project is not sensitive to head posture, for example to the head's orientation relative to the trunk.
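Inertial capture devices commonly report each segment's orientation as a unit quaternion, and a natural deviation metric between a captured posture and a standard posture is the rotation angle between the two orientations. A sketch of that metric (the quaternion convention and the helper name are assumptions, not taken from the patent):

```python
import math
from typing import Tuple

Quat = Tuple[float, float, float, float]  # unit quaternion (w, x, y, z)

def quat_angle_deg(q1: Quat, q2: Quat) -> float:
    """Smallest rotation angle, in degrees, between two orientations.
    The absolute inner product is used so that q and -q, which encode
    the same rotation, compare as identical."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    dot = min(1.0, dot)  # clamp floating-point noise before acos
    return math.degrees(2.0 * math.acos(dot))
```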
In an embodiment of the present invention, the standard action data of the human body model is the action data of an instructor, collected in advance. It is understood that the action data may come from a single instructor; alternatively it may be differentiated by gender, selecting the data of the instructor matching the user's gender, or the user may designate a particular instructor whose data is used.
Referring to fig. 2, in an embodiment of the present invention, S300 comprises: S310, dividing the body into regions (the four limbs, the head, the trunk and the hands) and calculating, for each region, the deviation value between the real-time action data and the standard action data; S320, highlighting, on the human body model showing the real-time action, the part corresponding to the region with the maximum deviation value, and giving a corresponding voice prompt. S320 may further give a micro-current stimulation prompt at the data acquisition point with the maximum deviation value. Because the micro-current stimulation is applied directly at the data acquisition point, the user can intuitively feel the erroneous part without locating it from the display or the prompt, shortening the reaction time. It is understood that micro-current stimulation may be omitted in some embodiments of the present invention.
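The region division and maximum-deviation selection of S310/S320 can be sketched as below. The grouping of data acquisition points into regions, and all names, are hypothetical illustrations; the patent does not specify them.

```python
from typing import Dict

# Hypothetical grouping of data acquisition points into the body regions
# named in S310: the four limbs, the head, the trunk and the hands.
REGIONS = {
    "head":  ["head"],
    "trunk": ["chest", "waist"],
    "limbs": ["left_arm", "right_arm", "left_leg", "right_leg"],
    "hands": ["left_hand", "right_hand"],
}

def worst_region(point_deviation: Dict[str, float]) -> str:
    """S310/S320: average the per-point deviation values inside each
    region and return the region with the largest mean deviation, i.e.
    the one to highlight, announce by voice and, optionally, signal
    with micro-current stimulation."""
    mean_dev = {
        region: sum(point_deviation.get(p, 0.0) for p in points) / len(points)
        for region, points in REGIONS.items()
    }
    return max(mean_dev, key=mean_dev.get)
```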
The system of the embodiment of the invention comprises: a motion capture module 100, comprising a motion capture helmet, motion capture gloves and a motion capture suit, for collecting the motion data of the motion project stage selected by the user, including head posture data and limb posture data; a communication module 200 for aggregating the collected action data and sending it to a server via 5G communication, and for receiving the processed model action data; an action processing module 300 for processing the action data, generating the corresponding real-time human body model action data and obtaining the preset standard human body model action data corresponding to the selected motion project stage; and a display module 400 for rendering the action performance of the human body models, including the standard action and the user's real-time action, and displaying error prompts. In an embodiment of the invention, two human models render the standard action data and the real-time action data respectively, displayed with different color intensities. It will be appreciated that the two models may be displayed in the same screen location or separately.
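The composition of the modules in fig. 3 can be illustrated with a minimal wiring sketch. The class and callback names are assumptions for illustration; a real system would run these stages asynchronously over the 5G link rather than in a single synchronous call.

```python
class TrainingSystem:
    """Minimal wiring of the modules of fig. 3: the capture module (100)
    feeds the communication module (200), whose result is processed by
    the action processing module (300) and then shown by the display
    module (400). All four stages are caller-supplied stand-ins."""

    def __init__(self, capture, communicate, process, display):
        self.capture = capture          # motion capture module 100
        self.communicate = communicate  # communication module 200 (5G link)
        self.process = process          # action processing module 300
        self.display = display          # display module 400

    def tick(self):
        raw = self.capture()
        uploaded = self.communicate(raw)
        real, standard = self.process(uploaded)
        return self.display(real, standard)
```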
In an embodiment of the present invention, the system further comprises a data storage module 500 for storing the pre-collected standard human body model action data. The instructor's action data for the corresponding motion project stage is collected in advance and used as the corresponding standard human body model action data.
In an embodiment of the present invention, the system further comprises a voice module 600 for receiving a user command selecting the motion project stage and giving a corresponding prompt according to the errors in the user's action. In this embodiment, the voice module and the display module are both arranged on the helmet: the voice module comprises the earphone and the voice acquisition unit on the helmet, and the display module is a pair of VR glasses connected to the helmet.
In an embodiment of the present invention, further comprising: and the micro-current prompting module 700 is used for giving a micro-current stimulation prompt to the data acquisition point with the maximum deviation value.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the gist of the present invention.

Claims (10)

1. A motion capture motion training method based on 5G communication is characterized by comprising the following steps:
s100, collecting motion data corresponding to a motion project stage selected by a user through a plurality of data collecting points by using wireless motion capturing equipment, and summarizing the motion data through 5G communication;
s200, generating real-time action data of a corresponding human body model according to the action data, and acquiring standard action data of the human body model corresponding to the motion project stage;
s300, two human body models are rendered in a graphical interface, real-time actions and standard actions are respectively displayed, and corresponding prompts are given according to the deviation of the real-time action data and the standard action data.
2. The 5G communication based motion capture motion training method of claim 1, wherein the motion data is acquired by an inertial motion capture device and comprises head pose data and limb pose data.
3. The motion capture motion training method based on 5G communication of claim 1, wherein the standard motion data of the human body model is motion data of a pre-collected motion guide teacher.
4. The motion capture motion training method based on 5G communication according to claim 1, wherein the step S300 comprises:
s310, dividing regions according to four limbs, the head, the trunk and the hands, and respectively calculating deviation values of the real-time action data and the standard action data;
s320, highlighting the corresponding part of the human body model showing the real-time action for the area with the maximum deviation value, and giving a corresponding voice prompt.
5. The motion capture motion training method based on 5G communication of claim 4, wherein the step S320 further comprises: and giving a micro-current stimulation prompt to the data acquisition point with the maximum deviation value.
6. A motion capture motion training system based on 5G communication, using the method of any of claims 1-5, comprising:
the motion capture module comprises a motion capture helmet, motion capture gloves and a motion capture garment and is used for collecting and capturing motion data of a motion item stage selected by a user, wherein the motion data comprises head posture data and limb posture data;
the communication module is used for summarizing and sending the action data to a server based on 5G communication data and receiving the action data including the processed human body model;
the action processing module is used for processing the action data, generating corresponding real-time human body model action data and acquiring corresponding preset standard human body model action data according to the motion project stage;
and the display module is used for rendering the action performance of the human body model, including standard actions and real-time actions of the user, and giving an error display prompt.
7. The 5G communication-based motion capture motion training system of claim 6, further comprising: a data storage module for storing the standard human body model action data.
8. The 5G communication-based motion capture motion training system of claim 6, further comprising: a voice module for receiving a user command to select the motion project stage and giving a corresponding voice prompt according to the errors in the user's action.
9. The 5G communication-based motion capture motion training system of claim 8, wherein the voice module and the display module are disposed on the helmet.
10. The 5G communication-based motion capture motion training system of claim 6, further comprising: a micro-current prompting module for giving a micro-current stimulation prompt to the data acquisition point with the maximum deviation value.
CN201911316981.XA (filed 2019-12-19, priority 2019-12-19): Motion capture motion training method and system based on 5G communication. Status: Pending. Publication: CN111097142A.

Priority Applications (1)

Application Number: CN201911316981.XA; Priority Date: 2019-12-19; Filing Date: 2019-12-19; Title: Motion capture motion training method and system based on 5G communication


Publications (1)

Publication Number: CN111097142A; Publication Date: 2020-05-05

Family

ID=70422902

Family Applications (1)

Application Number: CN201911316981.XA; Title: Motion capture motion training method and system based on 5G communication; Status: Pending (published as CN111097142A)

Country Status (1)

Country Link
CN (1) CN111097142A (en)


Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3579909B2 * 1993-11-01 2004-10-20 Matsushita Electric Industrial Co., Ltd. Form practice device
CN106110627A (en) * 2016-06-20 2016-11-16 曲大方 Physical culture and Wushu action correction equipment and method
CN106251387A (en) * 2016-07-29 2016-12-21 武汉光之谷文化科技股份有限公司 A kind of imaging system based on motion capture
CN106502388A (en) * 2016-09-26 2017-03-15 惠州Tcl移动通信有限公司 A kind of interactive movement technique and head-wearing type intelligent equipment
KR101716474B1 (en) * 2016-11-22 2017-03-15 (주)윈투스시스템 GIS based CCTV monitoring system
CN106512398A (en) * 2016-12-06 2017-03-22 腾讯科技(深圳)有限公司 Reminding method in virtual scene and related device
CN108519818A (en) * 2018-03-29 2018-09-11 北京小米移动软件有限公司 Information cuing method and device
CN108883335A * 2015-04-14 2018-11-23 John James Daniels Wearable electronic multi-sensory interface for human-to-machine or human-to-human communication
CN109228358A (en) * 2018-10-26 2019-01-18 白明号 A kind of 3D printing intelligent monitor system and application method
CN109260672A (en) * 2018-08-28 2019-01-25 百度在线网络技术(北京)有限公司 Analysis method, device, wearable device and the storage medium of exercise data
EP3474235A1 (en) * 2016-06-16 2019-04-24 Sony Corporation Information processing device, information processing method and storage medium
CN109692003A (en) * 2017-10-20 2019-04-30 深圳市鹰硕技术有限公司 A kind of children's running posture correction training system
US20190160339A1 (en) * 2017-11-29 2019-05-30 Board Of Trustees Of Michigan State University System and apparatus for immersive and interactive machine-based strength training using virtual reality
CN109919034A (en) * 2019-01-31 2019-06-21 厦门大学 A kind of identification of limb action with correct auxiliary training system and method
CN110037715A (en) * 2019-04-25 2019-07-23 广东小天才科技有限公司 Based reminding method, device, wearable device and the storage medium of wearable device
US20190280936A1 (en) * 2018-03-12 2019-09-12 Bank Of America Corporation Iot circuitry modules
CN110298218A (en) * 2018-03-23 2019-10-01 上海形趣信息科技有限公司 Interactive body-building device and interactive body-building system
CN209596409U (en) * 2018-09-07 2019-11-08 深圳阳光整形美容医院 A kind of shaping structure for abdomen
CN110502107A (en) * 2019-07-26 2019-11-26 森博迪(深圳)科技有限公司 Wearable real-time action instructs system and method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112484229A (en) * 2020-11-30 2021-03-12 珠海格力电器股份有限公司 Air conditioner control method and device, electronic equipment and readable storage medium
CN112484229B (en) * 2020-11-30 2022-05-17 珠海格力电器股份有限公司 Air conditioner control method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN109191588B (en) Motion teaching method, motion teaching device, storage medium and electronic equipment
CN107551521B (en) Fitness guidance method and device, intelligent equipment and storage medium
CN109432753B (en) Action correcting method, device, storage medium and electronic equipment
CN111091732B (en) Cardiopulmonary resuscitation (CPR) instructor based on AR technology and guiding method
CN110321786A (en) A kind of human body sitting posture based on deep learning monitors method and system in real time
CN109846485A (en) Computing electronics, the System and method for of human posture's healthcare information are provided
CN109101879A (en) A kind of the posture interactive system and implementation method of VR teaching in VR classroom
CN110490173B (en) Intelligent action scoring system based on 3D somatosensory model
CN106652590A (en) Teaching method, teaching recognizer and teaching system
CN108335747A (en) Cognitive training system
CN107694046A (en) A kind of body building training method, device and computer-readable recording medium
CN113409651B (en) Live broadcast body building method, system, electronic equipment and storage medium
US20220392361A1 (en) Learning system and learning method
KR20140043174A (en) Simulator for horse riding and method for simulation of horse riding
CN111097142A (en) Motion capture motion training method and system based on 5G communication
Echeverria et al. KUMITRON: Artificial intelligence system to monitor karate fights that synchronize aerial images with physiological and inertial signals
CN111223549A (en) Mobile end system and method for disease prevention based on posture correction
CN114005511A (en) Rehabilitation training method and system, training self-service equipment and storage medium
CN113989832A (en) Gesture recognition method and device, terminal equipment and storage medium
KR20180099399A (en) Fitness Center and Sports Facility System Using a Augmented reality virtual trainer
CN113051973A (en) Method and device for posture correction and electronic equipment
CN116704603A (en) Action evaluation correction method and system based on limb key point analysis
CN110458076A (en) A kind of teaching method based on computer vision and system
CN108969864A (en) Depression recovery therapeutic equipment and its application method based on VR technology
JP2007183792A (en) Equipment operability evaluation device, equipment operability evaluation method and equipment operability evaluation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200505