CN105824432A - Motion capturing system - Google Patents
Motion capturing system
- Publication number
- CN105824432A
- Authority
- CN
- China
- Prior art keywords
- motion capture
- order
- collecting unit
- real
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a motion capture system for collecting the motions of an operator to form real-time images. The system comprises a motion capture device that collects motion data corresponding to the movements of the operator wearing it. The motion capture device comprises a plurality of collecting units, a sending unit, a processing unit, a receiving unit, and a control unit. The collecting units are arranged at a plurality of preset positions of the motion capture device, each collecting unit corresponds to a unique position identifier, and each collects the motion data of its preset position in real time. The processing unit receives the motion data sent by the collecting units and forwards it to the sending unit, which outputs the motion data; the receiving unit, connected to the sending unit, receives the motion data; and the control unit builds the 3D posture of the human body from the motion data according to a preset human-body motion model and generates the real-time images from that 3D posture.
Description
Technical field
The present invention relates to the field of posture detection, and in particular to a motion capture system.
Background art
Existing motion capture is mainly used in spaceflight, aviation, and 3D film production; for example, the navigation systems of unmanned aerial vehicles employ motion capture technology. In existing 3D film production, photoelectric sensors are commonly used to collect the movement trajectory of the person being measured: cameras at multiple different angles pick up the reflective markers of the photoelectric sensors, which places high demands on the environment. The shortcomings are that shooting is easily affected by backlighting; the cost is high; and although there is hardware support, image computation places high demands on the hardware and large latency remains, so the narrow range of application makes the approach difficult to popularize. Traditional motion capture sensors are numerous and scattered: during use they must be attached to the body of the person being measured one by one, each sensor at its fixed corresponding position, and attaching a sensor at the wrong position directly corrupts the collected data, so operation is complicated. Moreover, current motion capture sensors commonly collect three-axis or six-axis data, so the precision is low.
Summary of the invention
In view of the above problems with existing motion capture technology, a motion capture system is now provided that aims to be easy to wear, high in acquisition precision, and wide in its range of application.
Concrete technical scheme is as follows:
A motion capture system for collecting the actions of an operator to form corresponding real-time images, comprising:
a motion capture device for collecting motion data corresponding to the movements of the operator wearing the motion capture device, the motion capture device comprising a plurality of collecting units arranged at a plurality of preset positions of the motion capture device, each collecting unit corresponding to a unique position identifier and collecting the motion data of its corresponding preset position in real time;
a sending unit, for outputting the motion data;
a processing unit, connected to the sending unit and the plurality of collecting units, for receiving the motion data sent by the collecting units and forwarding it to the sending unit;
a receiving unit, connected to the sending unit, for receiving the motion data;
a control unit, connected to the receiving unit, for building the 3D posture of the human body from the motion data according to a preset human-body motion model, and generating the real-time images from the 3D posture.
Preferably, the human-body motion model is a multi-rigid-body model.
Preferably, the motion capture device is matched with the multi-rigid-body model: the number of rigid bodies corresponds to the number of collecting units, with a one-to-one correspondence between rigid bodies and collecting units.
Preferably, the center of each rigid body is located at a preset position of the motion capture device, and the collecting unit collects the motion data of the center of that rigid body.
Preferably, a carrier coordinate system is formed in each collecting unit, the origin of the carrier coordinate system being the center of the collecting unit;
the collecting unit comprises:
a three-axis accelerometer, for collecting in real time the three-axis acceleration, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit;
a three-axis gyroscope, for collecting in real time the three-axis angular velocity, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit;
a three-axis magnetometer, for collecting in real time the three-axis magnetic-field components, in the carrier coordinate system, of the rigid body corresponding to the collecting unit;
a control module, connected to the three-axis accelerometer, the three-axis gyroscope, and the three-axis magnetometer, for generating a quaternion in the world coordinate system from the three-axis acceleration, the three-axis angular velocity, and the three-axis magnetic-field components; the motion data comprises the quaternion and the position identifier corresponding to the collecting unit.
Preferably, the sending unit employs a wireless module.
Preferably, the receiving unit employs a wireless module.
Preferably, the control unit comprises:
a modeling module, for building the 3D posture of the human body according to the multi-rigid-body model from the quaternion and corresponding position identifier of each collecting unit;
a synthesis module, connected to the modeling module, for synthesizing the real-time 3D posture into the real-time images.
Preferably, the control unit further comprises:
a display module, connected to the synthesis module, for displaying the real-time images.
Preferably, the motion capture device comprises: headgear, a jacket, trousers, gloves, and shoes.
The beneficial effects of the above technical solution:
In this technical solution, the motion capture device is easy to wear and collects data with high precision; it avoids attaching motion capture sensors one by one and needs no camera equipment for recording images, so its demands on the surrounding environment are low and its range of application is wide. The control unit models the data collected by the motion capture device to generate 3D images in real time, with high efficiency.
Brief description of the drawings
Fig. 1 is a block diagram of an embodiment of the motion capture system of the present invention;
Fig. 2 is an internal block diagram of an embodiment of the collecting unit of the present invention;
Fig. 3 is an internal block diagram of an embodiment of the control unit of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the scope of protection of the present invention.
It should be noted that, where no conflict arises, the embodiments of the present invention and the features in the embodiments may be combined with one another.
The present invention is further described below with reference to the accompanying drawings and specific embodiments, which are not to be taken as limiting the invention.
As shown in Fig. 1, a motion capture system for collecting the actions of an operator to form corresponding real-time images comprises:
a motion capture device for collecting motion data corresponding to the movements of the operator wearing it; the motion capture device includes a plurality of collecting units 1 arranged at a plurality of preset positions of the motion capture device, each collecting unit 1 corresponding to a unique position identifier and collecting the motion data of its corresponding preset position in real time;
a sending unit 3, for outputting the motion data;
a processing unit 2, connected to the sending unit 3 and the plurality of collecting units 1, for receiving the motion data sent by the collecting units 1 and forwarding it to the sending unit 3;
a receiving unit 5, connected to the sending unit 3, for receiving the motion data;
a control unit 4, connected to the receiving unit 5, for building the 3D posture of the human body from the motion data according to a preset human-body motion model, and generating the real-time images from the 3D posture.
Further, data may be transmitted between the receiving unit 5 and the control unit 4 by means of USB communication.
In this embodiment, the motion capture device is easy to wear and collects data with high precision; it avoids the time-consuming process of attaching motion capture sensors one by one and needs no camera equipment for recording images, so its demands on the surrounding environment are low and its range of application is wide. The control unit 4 models the data collected by the motion capture device to generate 3D images in real time, with high efficiency and low cost.
In a preferred embodiment, the human-body motion model is a multi-rigid-body model comprising a plurality of rigid bodies. Taking a model of 16 rigid bodies as an example, the 16 rigid bodies include: a head rigid body, an upper-torso rigid body, a lower-torso rigid body, a pelvis rigid body, a left-upper-arm rigid body, a left-forearm rigid body, a left-hand rigid body, a left-thigh rigid body, a left-shank rigid body, a left-foot rigid body, a right-upper-arm rigid body, a right-forearm rigid body, a right-hand rigid body, a right-thigh rigid body, a right-shank rigid body, and a right-foot rigid body.
In this embodiment, the multi-rigid-body model treats each segment of the human limbs as a rigid body, i.e., an object in which no relative deformation occurs at any internal point. A rigid body is an object whose shape and size remain constant during motion or under applied forces, and whose internal points keep their relative positions. A perfectly rigid body does not actually exist; it is an idealized model, since any object deforms more or less under applied forces. When the degree of deformation is negligibly small relative to the object's own geometric dimensions, the deformation can be ignored when studying the object's motion.
In a preferred embodiment, the motion capture device is matched with the multi-rigid-body model: the number of rigid bodies corresponds to the number of collecting units 1, with a one-to-one correspondence between rigid bodies and collecting units 1.
Further, the center of each rigid body is located at a preset position of the motion capture device, and the collecting unit 1 collects the motion data of the center of that rigid body.
In this embodiment, each collecting unit 1 detects the motion data of its corresponding rigid body in real time and sends it to the processing unit 2 arranged on the motion capture device; the processing unit 2 aggregates the data collected by all the collecting units 1 and sends it through the sending unit 3 and the receiving unit 5 to the control unit 4. The control unit 4 may be a mobile terminal, for example an Android client. According to the structure of the multi-rigid-body model, the connection nodes of the rigid bodies can be divided into child nodes, parent nodes, and a root node. Taking a motion capture garment as an example, the root node is located at the waist, and the absolute position of a child node is determined by the rotation of its parent node.
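The parent/child relationship just described, where a child node's absolute position follows from its parent's rotation, can be sketched as a small forward-kinematics routine. The three-segment chain, the bone lengths, and the function names below are illustrative assumptions, not details given in the patent:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# Hypothetical three-segment chain: waist (root node) -> thigh -> shank.
# Bone offsets are expressed in the parent's frame; the lengths are made up.
SKELETON = {
    "waist": (None, np.zeros(3)),
    "thigh": ("waist", np.array([0.0, -0.45, 0.0])),
    "shank": ("thigh", np.array([0.0, -0.40, 0.0])),
}

def world_positions(quats, root_pos=np.zeros(3)):
    """Absolute joint positions: each child's position is its parent's
    position plus the parent's world-frame rotation applied to the bone
    offset, as the embodiment describes."""
    pos = {}
    for name, (parent, offset) in SKELETON.items():  # insertion order is parent-first
        if parent is None:
            pos[name] = root_pos
        else:
            pos[name] = pos[parent] + quat_rotate(quats[parent], offset)
    return pos
```

With identity quaternions for every segment the chain simply hangs straight down from the root, which is a quick sanity check on the offsets.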
As shown in Fig. 2, in a preferred embodiment, a carrier coordinate system is formed in each collecting unit 1, the origin of the carrier coordinate system being the center of the collecting unit 1;
the collecting unit 1 comprises:
a three-axis accelerometer 11, for collecting in real time the three-axis acceleration, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit 1;
a three-axis gyroscope 12, for collecting in real time the three-axis angular velocity, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit 1;
a three-axis magnetometer 13, for collecting in real time the three-axis magnetic-field components, in the carrier coordinate system, of the rigid body corresponding to the collecting unit 1;
a control module 14, connected to the three-axis accelerometer 11, the three-axis gyroscope 12, and the three-axis magnetometer 13, for generating a quaternion in the world coordinate system from the three-axis acceleration, the three-axis angular velocity, and the three-axis magnetic-field components; the motion data includes the quaternion and the position identifier corresponding to the collecting unit 1.
Further, the carrier coordinate system is the collecting unit 1's own coordinate system.
In this embodiment, the collecting unit 1 collects nine-axis data through the three-axis accelerometer 11, the three-axis gyroscope 12, and the three-axis magnetometer 13, which improves the precision of the collected actions. Each collecting unit 1 is equipped with its own control module 14, which converts the nine-axis data collected by the accelerometer, gyroscope, and magnetometer into a quaternion; this reduces the computational load on the control unit 4 and raises the speed at which the control unit 4 generates the real-time images.
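The patent does not disclose the fusion algorithm the control module uses to turn nine-axis data into a world-frame quaternion. As one illustrative piece of such a computation, the static-case sketch below derives an orientation quaternion from the accelerometer and magnetometer alone (a TRIAD-style construction); a practical implementation would also blend in the gyroscope's angular rates, for example with a Madgwick or Kalman filter:

```python
import numpy as np

def quat_from_accel_mag(accel, mag):
    """Static-case orientation quaternion (w, x, y, z), body frame -> ENU world.

    Gravity fixes the 'up' direction and the magnetic field fixes heading.
    Valid only when the sensor is near-static; a full nine-axis filter
    would also integrate the gyroscope.
    """
    up = accel / np.linalg.norm(accel)   # at rest the accelerometer reads +g along 'up'
    east = np.cross(mag, up)             # field points north-ish, so mag x up = east
    east /= np.linalg.norm(east)
    north = np.cross(up, east)
    R = np.vstack([east, north, up])     # rows: world axes expressed in the body frame
    # Rotation matrix -> quaternion (trace-positive branch only; a robust
    # implementation also handles the other three branches).
    w = 0.5 * np.sqrt(max(1e-12, 1.0 + R[0, 0] + R[1, 1] + R[2, 2]))
    x = (R[2, 1] - R[1, 2]) / (4.0 * w)
    y = (R[0, 2] - R[2, 0]) / (4.0 * w)
    z = (R[1, 0] - R[0, 1]) / (4.0 * w)
    return np.array([w, x, y, z])
```

A level sensor facing magnetic north (gravity along +z, field in the north/down plane) should yield the identity quaternion.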
In a preferred embodiment, the sending unit 3 employs a wireless module, the receiving unit 5 employs a wireless module, and wireless communication is used between the sending unit 3 and the receiving unit 5.
Further, the wireless module may be a 2.4G module operating in the frequency band from 2.400 GHz to 2.4835 GHz; a 2.4G module has the advantages of low cost, high efficiency, low voltage, and small volume.
In this embodiment, the processing unit 2 and the sending unit 3 are both arranged on the motion capture device, so that the data can be sent wirelessly to the receiving unit 5.
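Per the embodiment, each action-data sample carried over this wireless link is a world-frame quaternion plus the collecting unit's position identifier. The patent specifies no wire encoding, so the little-endian layout and function names below are purely an illustrative assumption:

```python
import struct

# Hypothetical wire format for one action-data sample: a 1-byte position
# identifier followed by four float32 quaternion components (w, x, y, z).
SAMPLE_FMT = "<B4f"

def pack_sample(position_id, quat):
    """Serialize one sample, as the processing unit might before sending."""
    return struct.pack(SAMPLE_FMT, position_id, *quat)

def unpack_sample(payload):
    """Deserialize one sample on the receiving-unit side."""
    position_id, w, x, y, z = struct.unpack(SAMPLE_FMT, payload)
    return position_id, (w, x, y, z)
```

At 17 bytes per sample, a 16-unit suit produces well under a kilobyte per frame, comfortably within a 2.4 GHz link budget.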
As shown in Fig. 3, in a preferred embodiment, the control unit 4 comprises:
a modeling module 42, for building the 3D posture of the human body according to the multi-rigid-body model from the quaternion and corresponding position identifier of each collecting unit 1;
a synthesis module 41, connected to the modeling module 42, for synthesizing the real-time 3D posture into the real-time images.
In this embodiment, since the control unit 4 models the motion data in the world coordinate system, the modeling module 42 builds the 3D posture of the human body according to the multi-rigid-body model from the position identifier and corresponding quaternion of each collecting unit 1, and the synthesis module 41 synthesizes the generated 3D posture into the real-time images.
In a preferred embodiment, the control unit 4 further comprises:
a display module 43, connected to the synthesis module 41, for displaying the real-time images.
In this embodiment, the real-time images are displayed in real time by the display module 43.
In a preferred embodiment, the motion capture device comprises: headgear, a jacket, trousers, gloves, and shoes.
Further, the motion capture device may comprise a cap, a one-piece suit, shoes, and gloves.
The above is only a preferred embodiment of the present invention and does not thereby limit the embodiments or the scope of protection of the present invention. Those skilled in the art should appreciate that all equivalent substitutions and obvious variations made using the contents of the description and drawings of the present invention fall within the scope of protection of the present invention.
Claims (10)
1. A motion capture system for collecting the actions of an operator to form real-time images, characterized by comprising:
a motion capture device for collecting motion data corresponding to the movements of the operator wearing the motion capture device, the motion capture device comprising a plurality of collecting units arranged at a plurality of preset positions of the motion capture device, each collecting unit corresponding to a unique position identifier and collecting the motion data of its corresponding preset position in real time;
a sending unit, for outputting the motion data;
a processing unit, connected to the sending unit and the plurality of collecting units, for receiving the motion data sent by the collecting units and forwarding it to the sending unit;
a receiving unit, connected to the sending unit, for receiving the motion data;
a control unit, connected to the receiving unit, for building the 3D posture of the human body from the motion data according to a preset human-body motion model, and generating the real-time images from the 3D posture.
2. The motion capture system of claim 1, characterized in that the human-body motion model is a multi-rigid-body model.
3. The motion capture system of claim 2, characterized in that the motion capture device is matched with the multi-rigid-body model, the number of rigid bodies corresponding to the number of collecting units, with a one-to-one correspondence between rigid bodies and collecting units.
4. The motion capture system of claim 3, characterized in that the center of each rigid body is located at a preset position of the motion capture device, and the collecting unit collects the motion data of the center of that rigid body.
5. The motion capture system of claim 4, characterized in that a carrier coordinate system is formed in each collecting unit, the origin of the carrier coordinate system being the center of the collecting unit;
the collecting unit comprises:
a three-axis accelerometer, for collecting in real time the three-axis acceleration, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit;
a three-axis gyroscope, for collecting in real time the three-axis angular velocity, in the carrier coordinate system, of the rotation of the rigid body corresponding to the collecting unit;
a three-axis magnetometer, for collecting in real time the three-axis magnetic-field components, in the carrier coordinate system, of the rigid body corresponding to the collecting unit;
a control module, connected to the three-axis accelerometer, the three-axis gyroscope, and the three-axis magnetometer, for generating a quaternion in the world coordinate system from the three-axis acceleration, the three-axis angular velocity, and the three-axis magnetic-field components, the motion data comprising the quaternion and the position identifier corresponding to the collecting unit.
6. The motion capture system of claim 1, characterized in that the sending unit employs a wireless module.
7. The motion capture system of claim 1, characterized in that the receiving unit employs a wireless module.
8. The motion capture system of claim 5, characterized in that the control unit comprises:
a modeling module, for building the 3D posture of the human body according to the multi-rigid-body model from the quaternion and corresponding position identifier of each collecting unit;
a synthesis module, connected to the modeling module, for synthesizing the real-time 3D posture into the real-time images.
9. The motion capture system of claim 8, characterized in that the control unit further comprises:
a display module, connected to the synthesis module, for displaying the real-time images.
10. The motion capture system of claim 1, characterized in that the motion capture device comprises: headgear, a jacket, trousers, gloves, and shoes.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610416006.6A CN105824432A (en) | 2016-06-14 | 2016-06-14 | Motion capturing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105824432A true CN105824432A (en) | 2016-08-03 |
Family
ID=56532770
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610416006.6A Pending CN105824432A (en) | 2016-06-14 | 2016-06-14 | Motion capturing system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105824432A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107122048A (en) * | 2017-04-21 | 2017-09-01 | 甘肃省歌舞剧院有限责任公司 | One kind action assessment system |
WO2018035874A1 (en) * | 2016-08-26 | 2018-03-01 | 北京神秘谷数字科技有限公司 | Wearable motion capture device |
CN107885335A (en) * | 2017-11-24 | 2018-04-06 | 西安交通大学 | A kind of motion capture system based on organic flexible fiber strain sensor |
CN108687757A (en) * | 2017-04-06 | 2018-10-23 | 航天时代电子技术股份有限公司 | Robot |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102467749A (en) * | 2010-11-10 | 2012-05-23 | 上海日浦信息技术有限公司 | Three-dimensional virtual human body movement generation method based on key frames and spatiotemporal restrictions |
CN103179692A (en) * | 2011-12-26 | 2013-06-26 | 陈建新 | Human motion tracking system based on Zigbee/ institute of electrical and electronic engineers (IEEE) 802.15.4 |
CN203039726U (en) * | 2012-12-12 | 2013-07-03 | 西安理工大学 | Human body three-dimensional posture identifying system |
CN104197987A (en) * | 2014-09-01 | 2014-12-10 | 北京诺亦腾科技有限公司 | Combined-type motion capturing system |
CN104267815A (en) * | 2014-09-25 | 2015-01-07 | 黑龙江节点动画有限公司 | Motion capturing system and method based on inertia sensor technology |
CN104503589A (en) * | 2015-01-05 | 2015-04-08 | 京东方科技集团股份有限公司 | Somatosensory recognition system and recognition method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104699247B (en) | A kind of virtual reality interactive system and method based on machine vision | |
CN101579238B (en) | Human motion capture three dimensional playback system and method thereof | |
CN106125908A (en) | A kind of motion capture calibration system | |
CN106125909A (en) | A kind of motion capture system for training | |
CN110327048B (en) | Human upper limb posture reconstruction system based on wearable inertial sensor | |
CN208677399U (en) | Intelligent switch joint angle measurement unit and system | |
CN201431466Y (en) | Human motion capture and thee-dimensional representation system | |
CN105824432A (en) | Motion capturing system | |
CN104473618A (en) | Body data acquisition and feedback device and method for virtual reality | |
US11222457B2 (en) | Systems and methods for augmented reality | |
CN106625673A (en) | Narrow space assembly system and assembly method | |
CN106840112A (en) | A kind of space geometry measuring method of utilization free space eye gaze point measurement | |
CN109243575B (en) | Virtual acupuncture method and system based on mobile interaction and augmented reality | |
Chen et al. | Real‐time human motion capture driven by a wireless sensor network | |
CN109846487A (en) | Thigh measuring method for athletic posture and device based on MIMU/sEMG fusion | |
RU121947U1 (en) | TRAFFIC CAPTURE SYSTEM | |
CN106108909A (en) | A kind of human body attitude detection wearable device, system and control method | |
CN110609621B (en) | Gesture calibration method and human motion capture system based on microsensor | |
CN106970705A (en) | Motion capture method, device and electronic equipment | |
CN109343713B (en) | Human body action mapping method based on inertial measurement unit | |
CN108253954A (en) | A kind of human body attitude captures system | |
CN111158482B (en) | Human body motion gesture capturing method and system | |
CN106112997A | Exoskeleton garment | |
CN203630717U (en) | Interaction system based on a plurality of light inertial navigation sensing input devices | |
CN203001878U (en) | Motion tracking device for human body interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20160803 |